The more you generate
The more nvidia saves on silicon wafers
The more it can sell to AI corpos and everyone is happy
Them (nvidia) especially
The less you own and the more leather jackets Jensen buys.
more you fart, the more you shart
"The more frames in the sky, the more people will buy!" - Jensen Huang
They can't solve input latency. It's nice when you can maintain a stable 60 fps and want a higher framerate on a 120 Hz monitor. It's not nice when the base frame rate is 15 and you want 60 fps. The input latency is terrible.
It was always polish for when you already have acceptable input lag, i.e. at 60 fps. I had to explain to a friend why his game at 90 fps with framegen played like shit, and he was very sad to find out IRL.
It's exactly as you've said: something extra on top of already good performance, not a magic trick to reach baseline. And I kind of get your friend's reaction, since Nvidia first advertised it on a Portal RTX demo that ran at 15 fps turned 40-ish, then also started comparing specs of new vs old gen cards WITH frame gen included.
I think the tech wouldn't get as much hate if it were actually advertised as that: motion smoothing meant to fill out enthusiast-level refresh rates, and treated as its own separate thing instead of being included in general GPU performance charts, which feels malicious.
Agreed, it's being used as borderline manipulation for marketing purposes, and it's also making some game developers lazier, since they can rely on DLSS and frame gen as a crutch.
Yeah, Nvidia's frame gen advertising is usually so misleading. "RTX 4090 performance" with the RTX 5070, for example.
They now have an option to apply it dynamically. So you can keep the FPS locked to your monitor's refresh rate or whatever, and instead of having to deal with skipped frames it will fill those in (imo a much better experience).
that doesn't really change its fundamentals, it just shifts "good when it's not needed and bad when it is" from a permanent judgement to a per-scene judgement.
how is dynamic FG relevant when the dude's argument was about 15 fps? it still won't do shit, dynamic or not
Well, ideally the frame rate will be at your monitor's refresh rate when you're in an area without many particles or whatever. Frame gen doesn't impact that (only a 1-3% difference at most). It does, however, allow for more consistent frames.
If you're not reaching that base level of fps, then frame gen isn't to blame. The people working on frame gen/DLSS etc. and the people working on the games are entirely different, and improvement in frame gen/GPU technology doesn't take away from improvement elsewhere.
That 1-3% figure is what you lose in DLSS upscaling going from DLSS 4 to DLSS 4.5. (In some games, like Spider-Man 2, the difference is about 15%.)
The FG overhead in general is more than 1-3%.
For example, CP2077 at 1440p on an RTX 5060:
Native: 60 fps
DLSS FG 2X: 96 fps (48 base, 20% less)
DLSS FG 4X: 152 fps (38 base, 37% less)
Last of Us 2 at 1440p:
Native: 67 fps
DLSS FG 2X: 110 fps (55 base, 18% less)
DLSS FG 4X: 160 fps (40 base, 40% less)
Source: Vex
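Those base numbers are just the output fps divided by the FG factor; a quick sanity check of the percentages, in Python:

```python
# Base fps = output fps / FG factor; loss = drop vs. native fps.
def base_fps_loss(native: float, output: float, factor: int) -> float:
    base = output / factor
    return (native - base) / native * 100

print(round(base_fps_loss(60, 96, 2), 1))   # CP2077 2X:  20.0% below native
print(round(base_fps_loss(60, 152, 4), 1))  # CP2077 4X:  36.7% below native
print(round(base_fps_loss(67, 110, 2), 1))  # TLoU2 2X:   17.9% below native
print(round(base_fps_loss(67, 160, 4), 1))  # TLoU2 4X:   40.3% below native
```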
The loss to base fps increases latency to begin with. Then holding back the latest rendered frame while generating in-between frames also adds latency.
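A rough back-of-the-envelope for both effects together, assuming FG holds back roughly one base frame and ignoring the render queue and display latency:

```python
# Two latency hits: the base frame time grows (FG overhead), and the
# latest real frame is held back while in-between frames are displayed.
def extra_latency_ms(native_fps: float, base_fps: float) -> float:
    native_ft = 1000 / native_fps  # ms per frame without FG
    base_ft = 1000 / base_fps      # ms per frame with FG overhead
    hold_back = base_ft            # assumed: waiting on the next real frame
    return (base_ft + hold_back) - native_ft

print(extra_latency_ms(60, 48))  # ~25 ms added vs. native 60 fps (the 2X case above)
```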
Nice to have when it doesn't matter, makes latency worse when it's already bad. 10/10 feature
Rich get more, poor get less. Typical.
That is very similar to my thoughts as well, don't get the hype for any of it.
Turn based rpg is going to make a comeback lol
Maybe they could do fake input latency too? /s
Gen AI generating extra latency.
Check out reflex 2
Base frame rate needs to be over 100 for it to be decent, anything less than that feels terrible.
It will also generate inputs for you.
With dlss 10, you can just lay back and watch.
Dlss 12 will also have vision capabilities.
So you don't even have to watch your games get played!
It'll just send you the summary of how much fun it has so you can work more.
Because DLSS 12 demands the latest hardware, which will cost $12,999.
per month via subscription only!
Ok and they have always said that you need a good base framerate. It was never going to make 25 fps look like 60.
I use it to turn my 100 fps into 240
and honestly even if you lock ur fps to 30, and use frame gen to get to 120 it still plays better than consoles on a normal tv
If your base FPS is 15 you have more problems tbh. Nothing is gonna save that.
Handhelds would be the target market for fixing that problem. With DLSS/FSR/XeSS you can lower wattage but maintain acceptable performance, thus increasing battery life. All current handhelds only have about 1-2.5 hours battery life, it’s not great.
Believe it or not, it is INTENDED to be used at 60 fps, not 15, not 30. It's just a tool to make use of modern 240 and 480 Hz monitors in AAA games.
Believe it or not, Nvidia is promoting 6x FG at 40 fps.
https://www.youtube.com/watch?v=C2oaGMWo410
https://www.youtube.com/watch?v=4S5TJTvMcqg
Good for them. I was talking about frame interpolation overall, not just DLFG and Nvidia's recommendations.
Reflex 2.0 is being worked on, and currently Reflex performs the same as Radeon Anti-Lag…
sooo it's good as long as you use it as intended? :D
They can: they're basically decoupling the frames from your inputs, so you always have low latency, even lower than native, and filling in the blanks at the corners with AI.
Reflex 2.
Should come later this year
i dont think that's how game engines work
No, he's right. They had a demo for it. It's kind of like black magic: the technology imagines the frame with your input applied before it even gets rendered. It was showcased last year on Nvidia's YouTube channel.
Btw, this sounds like bullshit, but it's a technology used on VR headsets right now.
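The VR trick in miniature: a crude sketch of late reprojection, assuming a simple 2D pixel shift (real timewarp/Frame Warp implementations also inpaint the edges the shift reveals):

```python
import numpy as np

def reproject(last_frame: np.ndarray, dx_px: int, dy_px: int) -> np.ndarray:
    # Shift the last rendered frame by the newest camera delta just
    # before scan-out, so the view tracks input at display rate even
    # when rendering is slower. np.roll wraps pixels around the edges;
    # real implementations fill those disoccluded regions instead.
    return np.roll(last_frame, shift=(dy_px, dx_px), axis=(0, 1))
```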
no, but here's how i think about this, because i have some background in game design, though i'm not an expert.
some game engines already have this: they have a 'simulation' tick where they update physics, inputs, etc. at a fixed rate like 30 or 60 fps (FromSoftware's locked 60 fps physics is a good example), then output the actual graphics at a variable refresh rate
the input, though, is always evaluated at that 30/60 fps tick
if the game can't even hit 15 fps due to lack of hardware, it doesn't matter if it can output 500 fps visually; the internal simulation can only run at 15 fps, which means your input only gets evaluated at 15 fps
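A minimal sketch of that decoupling, the classic fixed-timestep loop (poll_input/simulate/render are placeholder callbacks, not any particular engine's API):

```python
import time

SIM_DT = 1 / 30  # fixed simulation tick: physics and input run at 30 Hz

def game_loop(poll_input, simulate, render):
    previous = time.perf_counter()
    accumulator = 0.0
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Input and physics advance in fixed steps, never faster...
        while accumulator >= SIM_DT:
            simulate(poll_input(), SIM_DT)
            accumulator -= SIM_DT
        # ...while rendering runs at whatever rate the GPU manages.
        render(alpha=accumulator / SIM_DT)  # blend factor between ticks
```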
Yeah, that's how it has been for 15 years. The stuff I'm talking about is a lot more recent; you should really watch the video by Nvidia, I think it's called Reflex 3 (which I find weird, because it's a whole other thing than what Reflex 1 & 2 were). I'm also a game dev, if that makes me sound more legitimate.
ok ill watch it then
Most games aren't like this.
That is horrible input lag just from the base game
okay, but what if this technology is what devs will rely on to make their next Cyberpunk 2077 with ray tracing and extremely real™ graphics go from 15 fps to 250 fps at higher presets?
Complain to the devs then, not Nvidia for making this technology.
Don't buy badly optimized games; demand better performance, refund, and leave a negative review.
How the fuck is it NVIDIA's problem that the devs optimized it badly 😂
If they released 4x better GPUs natively, would you complain to NVIDIA that it's making the devs not optimize? 😂
It's already happening, and gamers are complaining while still buying the games and sucking off the studios
If we're going to play the game of what ifs then we can imagine everything tbh
I mean they're already working on making AI generated games. Coded by AI, rendered by AI, might as well be played with AI.
I think the last one I saw was a Minecraft demo and it was actually pretty convincing for like the first 10 seconds
Seemingly convincing, fundamentally flawed. Current AI in a nutshell.
if anyone has tried running a TTRPG with an LLM, they will know how fucking idiotic the idea of playing a video game with an AI is. The best LLMs can barely run a TTRPG coherently; how will a game work when it's just rules times a billion?
Maybe AI could play the game for me, then summarize how it was
Chatgpt spends all day burning down a rainforest and boiling an ocean. At the end, it spits out the result:
"I played minecraft. It was fun."
"played with AI"
Might be a thing, though. Like a service for multiplayer games: you pay for them so game servers look full. Great for game launches, etc.
It's different from an in-game AI, as it's the same as a noob player: an individual PC, IP, etc.
It might easily be adapted into games made with common game engines with a quick implementation.
Everything that's supposed to be a joke is becoming real, bruh.
And AMD, Nvidia, Intel make AI GPUs, soo hooray
You don't get it. Eventually we might have AI-generated players that even play games for us. We wouldn't even need to have fun, as AI can handle that part for us.
Are we having fun yet human?
The next generation of RTX GPUs will connect directly to your brain and inject the fake frames straight into it to reduce latency and input lag
Great. My nightmares are gonna run at 6x greater speed. But with even harder/smoother edges
Lmfaoo
I mean, you are not wrong, just out of line. NVIDIA's ultimate goal is neural rendering
you mean that tech that looks like it applies make-up to faces in video games where the character wouldn't be wearing make-up? Since, you know, the only data it can use is human faces which are publicly available, so usually flawless Insta filters or highly curated faces.
My favorite part of the demo was that the face in question HAD emotion in the normal render, and then after the RTX Neural Face render, it looked like she had gotten a nice makeover and some botox, with all emotion stripped from her expression.
Because, again, AI cannot create anything unique or new.
Yes that's totally the lesson from the last decade. AI can't improve.
I mean, as it steals more intellectual property and continues to ruin the entire consumer electronics space and the environment, I'm sure the AI slop will get better.
Hopefully the industry collapses before that unfortunate turn of events happens.
I wish all AI companies a happy bankruptcy
Yeah down with the west. Hopefully we can all be slaves to China soon.
China too, but if you believe it's an arms race, you at least admit the only real use you see for AI is in facial recognition and surveillance (which has caused multiple wrongful arrests in the US).
Also, we won't be slaves to China; they are the rising superpower.
We've already ceded major industries to China in the battery market, and the current administration has done so much for China's economy it's not even funny.
Neural rendering is a very exciting tech, but the current path of training AI from existing content is not going to get us there. The closest thing I think we have to the same "vibe" as neural rendering is Gaussian Splatting which can let you create novel views of things. Totally different of course. Neural rendering needs to be at the shader level which means it needs to be trained very differently than how DLSS is trained.
That's why I don't use the tech
I mean it does make games look much smoother...
So does my 15-year-old Sony TV.
Frame interpolation has never played well with gaming.
Not really the same thing. It's using motion vectors from the game engine, so it produces a much better result than basic old interpolation algorithms.
It really does wonders for motion smoothness if you can achieve near 60 FPS before enabling it.
Hey, I dislike the push for pure AI frames, but it works pretty fucking well now. At least in single player games.
For me, not really. I just need enough FPS for my 165 Hz screen, that's it.
Except when it generates artifacts and makes the game feel sluggish due to input lag
At some point he will tell us: "you can also use your imagination for extra fake frames"
I think framegen could work fine for like 120fps+ to 1000fps. You are going to reach a point where unless you are like an ultra pro gamer playing a super twitchy game the extra real frames just won't matter as far as reaction time is concerned. There will be too much lag between brain and hand to make use of them. At that point the frames might as well be fake since all they are going to do is make the game look nicer.
I agree, that's a valid use case (for 2x FG) if you have a very high end system, it is a nice to have.
Sadly MFG is marketed with the 60 class, turning low fps figures into chart-topping numbers for marketing purposes.
If someone has a GPU that can't render at 60+ fps, then what use would he have for 4-6x MFG without a 240 or 360 Hz monitor?
Input delay will take 2 business days to register
Lol!!!
Imagine having a gpu,
or not you poor ass mf
I'll never use Frame Gen. I also avoid using DLSS 99% of the time. Nvidia can shove their AI upscaling up their asses.
I think the upscaling works reasonably well in most situations. The frame gen is a long way from being good enough for me though.
DLSS definitely deserves more credit, 4 and 4.5 have been treats to use. I agree with you on fg though.
If my GPU can manage around 60ish fps natively, then I don't need upscaling.
Frame Generation is utter AI bullshit, no matter how much they hype it up. I'd rather have 40 real frames than 120 of which 3 out of every 4 are AI-generated slop.
Yeah, but when the time comes that it can't manage it natively, DLSS will be there to help. I don't like FG because of the latency, but DLSS doesn't add latency; it only improves fps while looking visually identical to native to the naked eye. Even DLAA is better than TAA in most games these days.
I fail to see why Nvidia gets so much praise. PC gaming was and is defined by low latency; this is a community obsessed with server tick rates and input lag. Yet, suddenly, 'fake frames' are acceptable? It feels insulting as a consumer to see interpolated frames promoted as a standard feature.
For a very long time high fps was a proxy for low latency. They went hand in hand when a decrease in frame (render) time decreased click-to-photon latency.
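The old relationship in one line of arithmetic: latency tracks frame time, and frame time is just the inverse of (real) fps.

```python
# Frame time, the latency-relevant quantity, vs. fps:
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# Doubling *real* fps halves this slice of click-to-photon latency;
# doubling fps with generated frames leaves it untouched.
```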
Well, fps became a marketing term, decoupled from its extended meaning. Just another parameter to optimize for, all else be damned.
Reminds me of the number of VRM phases in MOBOs. Used to be a good indicator of the power delivery's quality until manufacturers noticed and started putting on extra phases or just doubled the inductors for marketing.
Server tick rates are something only competitive multiplayer players really care about, e.g., Counter-Strike, Valorant, etc. They also benefit from DLSS (DLSS SR, which actually improves both FPS and input lag by a mile), as well as Reflex 2 whenever it releases, which will decrease perceived input lag from mouse movements to a couple of milliseconds (which would feel insanely good).
Players who play single-player games prefer different things, though. A single-player game typically cares much more about frame image quality than super-high FPS and low input lag. E.g., 30 ms input lag and 60 to 100 FPS can be acceptable to such players, as the input lag doesn't really give you much competitive edge and is barely noticeable, especially in non-FPS games (or FPS games on controller), and super-high FPS is not worth playing at much lower graphics settings. If you can achieve 200 FPS, you're probably mistuning the game, as it could look A LOT better and render at 100 FPS instead by increasing shadow quality, particle quality, polygon count, etc.
For those players, after achieving 60 FPS, there's no reason not to enable FG and get ~100 FPS with barely any noticeable input lag difference (e.g., 18 ms to 25 ms or similar numbers), because they get much better perceived motion smoothness.
🤣
you know he knows he's garbage because he's always wearing a trash bag.
Might as well watch a walkthrough of the game instead of playing it at some point.
Just one frame at the beginning of the game and one at the end. The game still weighs 500 GB.
But does it matter whether they are fake or not? As long as everything runs smoothly and the quality is good, why would anyone care?
Smoothness, yes. Responsiveness? Not really. The fewer frames the game engine renders, the bigger the gaps between input and game physics updates.
By DLSS 8 we won't need to even download full games anymore.
We just need to download a 200mb zip file with a couple high-res slides and the GPU will infer the models, textures, in-game-physics and mechanics from a prompt in the supplementary txt file.
You might think it's funny, but he said exactly that a couple of years back (I think at the 40xx launch).
“In the future frames won't be rendered. They will be generated.”
How stupid do they think we are? No one asked for this. A "120 fps" game where 75% of the frames are AI-generated will still run like dogshit, because it can't speed up the game code reacting to user input! It's fake, and I trust people will see through this scam.
If the frame isn't real, you can't *really* click it, right? So if you're playing at 9 fps, because Leatherman wants you to, then you see about 120 frames with your eyes, but your brain is telling you, there's some WEEEEEIRD sh~~ going on.
I'm one of the unfortunate few who can notice frame gen artifacts even at 1x, even in slow-paced games. So them pushing this tech is sooo annoying.
Input lag is knocking at your door... with a rocket launcher.
AI will just hallucinate your whole game from the cinematic reveal trailer, and you will be so happy.
The actual game is just a PowerPoint, and the AI is imagining the rest of the game
Yeah we never wanted well-optimized games anyway
Latency reaches 1000 ms with that too. Nvidia introducing: Boost Plus Pro Max+
I won't play any modern game then, haha. There are already enough released games to play through until my last day on this earth, so I'm good, thanks. Also fuck you nRAPEia
Frames will be delivered from the nearest datacenter.
I sincerely hope that nVidia goes bankrupt.
I just commented on a DLSS 4 post and said I only like real frames. People bashed me on it. Wtf? I can't even express my opinion on this platform anymore. Fuck modern society.
You will play at 480p and get 4K output, max
I think I am going to ignore games that require DLSS just to run properly. Like Larian Studios said: for their next game they are going to have to spend more time optimizing so the game can run with less RAM, etc. That should have been a requirement already...
I have been having a lot more fun playing indie games in the past few years anyway. So I think I will just focus more on games that try to run on sensible hardware, rather than ones that are lazy and force you to use framegen to get 60 fps.
so getting 30 fps and enabling FG isn't a win?
It is a win for nvidia's marketing, yes.
I’ve already played the demo of that game. Very cool that anything can happen, very disturbing when it freaks out.
And all with 8 GB of vram!
4gb incoming
How much frame gen is needed to fix stutter in unreal engine tho?
Hurry up and future-proof a future where you use a $400 2GB frame-gen-injected graphics card that requires a subscription, a Facebook login, and a promise against hate speech before using it.
I use framegen for emulating 30 fps console games at 60, which is my native refresh. I use adaptive mode set to my refresh rate, and it fills in the difference with no screen tearing. It works even better for games that can hit 45-50+ frames; it actually feels smooth, because it only needs to generate every couple of frames.
I'm not trying to turn 30 into 120 FPS or 120 into 240.
Another good use for frame generation, outside selling overpriced graphics cards for poorly optimized games, is watching 30 FPS video at 60. Going from 30 to 60 does show some artifacts, but the smoother motion is worth it in my opinion, and it's something I do for most of the shows I watch on streaming that are at 30 fps. It's not gaming, but I use it more often than when I'm playing emulated Switch games.
The technology, how they present it to us, how they frame its use, and its price are all on them.
I think they went too big too fast, and now they need to reel us back in. Lower VRAM, more frame gen, higher prices, deals with developers to monopolize the landscape: that's what's coming.
You won't play the game; the fake frames will play you
and what about when the game doesn't have DLSS?
https://preview.redd.it/qyv3nkpk40cg1.png?width=318&format=png&auto=webp&s=4b504eb26151a9b0709908ca8f4aa515edc87acf
But you will stream all your games over GeForce Now
In the future, instead of a key card, game manufacturers can sell you a prompt card; the player will plug it into the AI and generate the whole fucking game.
At some point, will there be a number where we wouldn't really notice?
The ~~beatings~~ frame generation will continue until ~~morale~~ latency improves.
at what point does the game become the ship of Theseus
I mean you can't deny that fake frames work lol
Reflex 2 can already do 1->4000 frame generation via reprojection, so x1000 isn't even far fetched.
Jensen on RTX 70 Feynman GPUs: from a 1x1 pixel grid you can generate a 16K image. You don't even need a screen to display it!
The more you buy the more you save people!
It's not that hard to understand.
They want to sell you 1200Hz displays by then. It all makes sense.
100x? Why not 10000x?
In before a 2GB RTX 6040 is faster than the 5090 because of 10000x frames and RAM upscaling, playing at 480p upscaled to 16K 😂
As if AMD wouldn't do the exact same thing given a choice
Oh, AMD already does. They follow nvidia's lead, but it's a shit direction nonetheless.
Tbh I won't buy or support anything AI
Nvidia should invent a chip that slows down the user's brain, and then sell you $2k GPUs that run at 5 FPS.
if i can see them then they're no longer fake xd
I legit think they will do this and call it intermittent frame system or something.
DLSS 10: the AI plays for you... all you need to do is watch, like how your big brother gave you the 2P controller when you were young...
They're all fake to start with anyway, so... good?
I don't get it. All frames in a game were always a representation of something, using clever methods for speed and realism, therefore they were always "fake". FSR and DLSS are just the newer clever methods.
Fake as in disconnected from game logic and user input.
...because, e.g., screen space reflections/shadows are real and connected to game logic and user input...
You are making a false analogy by saying generated frames are just a visual effect.
Screen space reflections and shadows are connected to game logic and user input. Try moving the camera, and the effect is updated the next frame immediately.
Fake frames use two or more rasterized frames to fill in the gap in between them. The process cannot react to user input because the last frame it uses as reference is already rendered.
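The data dependency in miniature; a naive blend for illustration only (DLSS FG uses motion vectors rather than plain blending, but it still needs the later frame to exist first):

```python
import numpy as np

def in_between(frame_n: np.ndarray, frame_n1: np.ndarray, t: float = 0.5):
    # frame_n1 is the *future* rendered frame: it must already exist
    # before the generated frame can be shown, so display is delayed,
    # and input arriving between the two real frames can't affect it.
    return (1 - t) * frame_n + t * frame_n1
```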
It's not that simple. FSR and DLSS generated frames are a hybrid system, like screen space effects.
fps is dead, fpm is the new benchmark king, frames per million, meaning true frames per million displayed frames
You'll own no frames and you'll be happy.
Enough frames to run the simulation in Fallout 3's Tranquility Lane, where Jensen is the head of the Chinese army 🤣
Do you feel better with 40 frames?
That would be great if true
If I get thousands of fps and there is no difference to real frames, great.
DLSS 10 will be a cloud subscription, and there will be no more GPUs.
It will be locked behind 60 series cards only and everything under 60 will not get updated AT ALL
I'm stoked to finally be able to run cyberpunk 2077 at 60fps 😂
That look when 4 0 9 0 P e r f o r m a n c e
So... I'm assuming AMD had a lot of exciting announcements and not just a bunch of AI, a refresh and the 9850X3D. Oh wait... It was almost all AI too
Don't be like that, we have to cheer for the second-largest AI company instead of the largest, who ported DLSS 4.5 to the RTX 20 cards
All frames are fake
And some are extra fake
I am going to let you all in some news... all your frames are already fake. For real frames you need to open the window...
You mean dlss frames aren't fake now!?
Faker than fake.
Nothing faker than Jensen Fakir
https://preview.redd.it/71cd5sgloybg1.jpeg?width=960&format=pjpg&auto=webp&s=f5da64f1696c8420112ce2936734f1489dd8377a
Seriously, what is this stupid "fake frames" trend?
None of you can even see the difference in game. You are all basically complaining about free performance.
You simply want to complain.
Most of this 'fake frames' blabbering is coming from people who are happy watching movies and videos compressed by lossy video encoding, because they weren't told that most frames in those videos are 'fake' (by their terminology and logic).
Better FPS brings smoothness and increased clarity. Frame gen saves you from overspending and turning your PC into a heat gun. It's not like Huang stole an ancient recipe for a 4K 1000 FPS (native resolution) GPU from mankind.
Found the target audience for MFG. Who in their right mind needs 6x frame generation? Seriously.
Ah yes, turn my 50 FPS at 30ms latency into 240 FPS at 60ms.
Not only does it add more latency by needing to withhold the next rendered frame to fill in the rest, but it also lowers your base frame rate by putting extra load on your already maxed out gpu, adding another hit to latency.
Who gives a crap about user experience when you can have more eye candy, right?
Isn’t AMD moving towards the same goal? They are going all in on AI.
Yeah, they follow suit; that doesn't make the direction one bit better.
I have a GPU that supports frame gen. I’ve used it. It looks bad, and increases the latency to the point it’s uncomfortable to play the games I want to play. I don’t want to use it for that reason.