dlss nonsense 

i've been saying dlss is shit, haven't i
i mean, it was okay because it didn't do much. and now they've cranked it to 11 and it's Officially Bad to everyone. cool

i was just saying the Method is Bad
(training a model on high resolution recordings of a game, then applying that model to a low resolution render of the same game).
literally nvidia's real-time gaming waifu2x. give me a billion dollars
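for the record, the core trick is just supervised super-resolution: show the model pairs of (low-res render, high-res ground truth), make it learn the mapping, then run it on live low-res frames. here's a toy numpy sketch of that idea — a linear least-squares stand-in, with made-up shapes and names, nothing to do with nvidia's actual network:

```python
# Toy sketch of the DLSS-style idea: learn an upscaler from paired
# low-res / high-res frames, then apply it to a new low-res frame.
# A linear least-squares map stands in for the neural net; every
# name and shape here is illustrative, not any real API.
import numpy as np

rng = np.random.default_rng(0)

def downscale(hi):
    # 2x2 box filter: produces the "low resolution game render"
    return hi.reshape(hi.shape[0] // 2, 2, hi.shape[1] // 2, 2).mean(axis=(1, 3))

# "training data": high-res recordings of the game
train_hi = rng.random((64, 16, 16))
train_lo = np.stack([downscale(f) for f in train_hi])

# flatten each low-res frame and fit one linear map W (the "model")
# that predicts the full high-res frame, via least squares
X = train_lo.reshape(len(train_lo), -1)   # (64 frames, 64 low-res pixels)
Y = train_hi.reshape(len(train_hi), -1)   # (64 frames, 256 high-res pixels)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "inference": upscale a frame the model never saw
test_hi = rng.random((16, 16))
test_lo = downscale(test_hi)
upscaled = (test_lo.reshape(-1) @ W).reshape(16, 16)

# it guesses detail it has statistics for, and smears the rest
err = np.abs(upscaled - test_hi).mean()
```

the complaints below are about exactly this setup: the model can only hallucinate detail that resembles its training data.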

- it is bad because you have to train it on a specific game for it to not look like shit, which is a lot of work that someone will inevitably try to optimize away or generalize, normalizing out every graphical style and renderer quirk

- it is bad because you will always miss some bits in the training data and compress parts away to keep the model a reasonable size, meaning it will look like shit on the less statistically significant parts of the game

- it is bad because it leaves the upsampler 'guessing' a lot of detail that isn't in your initial render. just give me a pixelated version, my brain will do the rest. my eyes can't see much clearer than 1080p anyway

- it is bad because the ultimate business goal of this type of tool is to render your games in a datacenter, stream you a hyper-compressed video feed, and re-hydrate it into the blandest output possible: an optimal sludge of every AAA video game

you will not catch me complaining about pixelated textures and pointy edges lmao. actually i will pay extra for good chunky pixels


@alice Oh, you can tell with DLSS5 in particular that there's no way normal people will run this load. This is to make use of all those idling GeForce Now clusters they've got, in a desperate bid to ride out the AI bubble bursting.

I wish Nvidia a very explode with the rest of the bubble, and I hope they do it while everyone's still mocking them, from players to other game developers.
