I live in Germany, where the 2080 Ti FE costs €1,260. WTF NVIDIA, IT'S A GAMING CARD, NOT A TITAN. This is why we need competition; I will be so happy when Intel joins the market.
Woo boy, saying Intel at the end there was probably the biggest plot twist of 2018. AMD has a far better chance than Intel ever will of making a decent, inexpensive GPU.
Why would you want a 2080 Ti? To boast? There's hardly any game in any resolution that needs more than a 1080 Ti, and for those, a 2070 suffices. What do you want, 4K@144Hz with a monitor that costs 2,700 EUR?
See, that's the thing... I feel like these bumps in frame rate are measured against the original GTX 1080 attempting to run the same demo with RTX on; since it was not designed for ray tracing, obviously its frame rate is much lower. This is the fishy bit... what is the frame rate jump WITHOUT RTX? Guess we will find out soon enough.
They only get those boosts because the shadows are handled by the RT cores, but I don't believe this will see much use, since not everyone will be getting an RTX card, and not everyone minds playing with shadows on medium or low, given their huge performance impact.
Iancreed8592 the performance gain is not going to be huge. It's the fact that you can do more with a GPU that is appealing to me. Significantly faster AI training, much faster renders, a group of new features... GPUs can only get so much faster. And while technically these cards are just a bit faster, in reality they will make games look better. Isn't that what a lot of people want when they get a better graphics card? These new GPUs do just that. People are missing the point of the new architecture: quality over quantity, with better efficiency.
"Look at this guys, the new card is so much better at something that was designed for the new card than the old card! PLEASE be excited!" - nVidia probably
Well, at least someone is pushing for new things. Ray tracing is new, and a few new games support it. Will Call of Duty support it? Probably not, lol, because of their shit engine.
jonathan oxlade ray tracing is anything but new. It’s been used in games for years. Computers have just been unable to handle real time ray tracing, and games have used baked ray tracing instead
Cmdr Flint The games that will implement it (Shadow of the Tomb Raider and Metro) only use it for reflections or shadows. But I think future games will use RTX for more than these GameWorks-style effects.
Definitely not a GameWorks feature. It's built into DirectX. I for one am excited about this technology, and so are a lot of devs. I'll be making full use of this tech, as it is already implemented in Unreal and hopefully Unity soon. It's going to make lighting scenes in realtime way less of a headache and will quickly become the standard in the coming years.
CassetteMelody You said the magic words... "standard in the upcoming years." You're probably a content creator and someone who likes to adopt new tech early. But this RTX generation of GPUs might be just that: first steps for a new tech that will become more and more commonplace. That's the thing with new tech... most people won't be jumping in right from the start. It feels like something I, a gamer who isn't doing any creation, can easily skip for now and get into when it's more commonplace and affordable. Time will tell; maybe that change will happen much faster, but I doubt it. I think this GPU line will be divisive: early adopters will love it, while the majority will pass and wait to see what happens to the tech.
The buzzwords and marketing were brilliant. The whole chat ate a lot of it up. "This is wrong, this is right." "This is old tech, this is new tech." "It just works." "It's lit." (they didn't mean it in that context, but everyone was joking around like they were, and who knows maybe they were) "The best Pascal GPU. The 1080." "8x the performance of Pascal/Volta." (can't remember which he said specifically) "1 Turing is the same as 4x Volta's that're worth $60,000." "Starting at $499." *picture of a 2080 Ti in the background* "Look at this performance scale!" *in RTX-OPS* "It's impossible without RTX." And so much more. Then there was that dancing demo, which is probably an interpretation of Jensen on his way to the bank.
People seem a bit uneducated. When they introduce a family of 3 and say "starting at $499," it was obvious that this was about the 2070. Knowing a bit about NVIDIA's pricing history also helps. I wonder what people expected. A 2070 at $370 and a 2080 at $440? Simply lol.
RyNiuu, well, I was thrown a bit by the "$499" with the RTX 2080 Ti in the background, and I think a lot of people viewing, and even in the audience, were too. I was certain that was incorrect, but that doesn't change the fact that the timing of his statement confused people. Also, the prices on NVIDIA's site the same or next day were way off those shown. He said $999 MSRP, for example, and that's now $1,199 (20% higher). *Huh... it just occurred to me that this may be due to Trump's idiotic trade war. Apparently, as of August 23rd, 2018, a lot of electronics may cost up to 25% more when shipped to the USA from China.
In the not too distant future... "Computer, enhance that image!" said the pale, hunched man in a self-assured, authoritative voice. "Exporting all private user data... contacting Bluffdale... hard drive contents, exported..." "No, Computer, stop. End task!" "...biometric data, exported... browser history, exported... webcam and microphone data, exported... key log and behavioural information, exported..." "Stop! Terminate!" "Terminating user in 5... 4... 3..."
That's technically doable with the RTX series XD Just need to write a program that recognizes voice commands and links them to the DLSS or ray tracing options in the nVidia control panel, and they'd do exactly that XD
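The command-dispatch half of that joke is easy to sketch. A minimal Python toy, with the speech-to-text step stubbed out and the settings toggles as hypothetical placeholders (there is no public NVIDIA control panel API; `enable_dlss` and friends are made up for illustration):

```python
# Toy voice-command dispatcher: maps recognized phrases to actions.
# The real speech-to-text and the NVIDIA control panel hooks are
# stand-ins -- nothing here calls an actual driver API.

def enable_dlss():
    return "DLSS enabled"          # placeholder for a real settings call

def enable_ray_tracing():
    return "Ray tracing enabled"   # placeholder as well

COMMANDS = {
    "computer enhance": enable_dlss,
    "computer trace rays": enable_ray_tracing,
}

def dispatch(phrase):
    """Look up a recognized phrase and run the matching action."""
    action = COMMANDS.get(phrase.lower().strip())
    return action() if action else "Unrecognized command"
```

A speech-recognition library would feed `dispatch` its transcribed text; everything past that point is just a dictionary lookup.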
Candi Soda I mean, most games run at 4K 60fps on the 1080 Ti. The 3070 won't be out for another year and a half, so why wait that long? If you're content with 1080p, or 60fps at 1440p, then yeah, keep it. I also had a 1070 until it fried; it played most games at 4K 30fps maxed, so it's still a beast.
New generations don't have to be more expensive; this is just dumb. The GTX 680, 580, and 480 all launched at $499, and the 2080 is ~$750 on preorder.
Until the majority of games optimize for or build around ray tracing... I think I'll stick with my 1070, even if the 2070 is as powerful as a Titan Xp (or so Jensen says).
Yeah, these are currently a waste of money. Maybe in a year or two, but even then it feels like they're building cards for games which don't yet exist.
It might be as powerful as Titan XP if you add up the FLOPS and integer operations of every type of core, but put that in a game and only the CUDA cores get used so far...so kind of pointless. If Ray Tracing even becomes a thing it'll be with the 3080 series...or 2180....might as well make the naming scheme as dumb as possible.
Going to be a looong time before the majority support it, and likely the next generation of cards will be out before even a relevant minority support ray tracing. No need at all to upgrade from a 1070 until the 7nm shrink, unless you're just rich and playing at 4K (and I'm not a fan of 4K gaming anyway).
Then you'd be fine. In the Infiltrator demo, the 1080 Ti could only do 38fps, while the 2080 Ti can do 72fps... with none of the GameWorks features, which are also pretty awesome.
Then you will get murdered by everyone who sees your shadow around the corner when you can't see theirs, or your reflection in the red car when you can't see theirs. It's not rocket science.
+TheGoreforce So what you are saying is the 1080 Ti got 38fps with everything on and the 2080 Ti got 72fps with everything off?? So 1080 Ti: 38fps on ultra; 2080 Ti: 40-45fps on ultra????
I know, my GTX 1080 Ti isn't exactly capable of doing that (it can maintain 100 fps in most games, though). Hopefully the 2080 Ti will be able to, otherwise there's no point in buying it.
mapesdhs: that level of simulation is pretty compute intensive and takes current supercomputers forever to calculate. Making everything in a room physically correct is too much for current hardware. Especially fluid dynamics, holy shit, fluid dynamics are a bitch to calculate...
I somehow agree with you and Greg. My bet is that the SM cores are identical to Pascal and the Tensor cores are from Volta. The RT cores are the new addition, and I see them as specialized hardware; I don't expect them to do anything other than real-time RT, assuming you use the new GeForce RT library... As for the DLSS thing, well, NNs in any form are still ML algorithms: they try to learn patterns from data they have encountered, but the moment you feed them something new, you're in for a treat... Also, this is new tech in both hardware and software (they mentioned DXR), so it will take a few years to be adopted by the industry. They found a way to do ray tracing in hardware, instead of relying on software and the SIMD computations the GPU cores offer. That's why they are calling it a revolution, imho.
As Greg said, no talk of FPS gains... more power hungry... price is too high. I'm curious to see performance without the AI and RT as well. Sounds like they're really banking on the technology working.
I don't really care about how much faster the new GPUs are going to be. They are going to be slightly faster in normal games, but they will allow users to use their cards for artificial intelligence training along with bumping graphical settings up. This architecture is NOT focused on "oh, you get x% better performance in y game while being z% more efficient." They already did that with Pascal and previous generations; this is about bringing the next generation of graphics to the present day. It's kind of like comparing a plane and a car: both can drive on the ground, but one can fly. You see what I'm saying? This is not about "oh, I want more frames" as much as "look how pretty we can make games look." Also, does the 2080 support SLI? I saw no real mention of SLI (not that it matters, really).
Yes. I think the new cards are underwhelming when it comes to normal games that use CUDA cores instead of all the new shit on the die. That's what they want to hide. Nvidia said a 2070 will be faster than a Titan Xp... but in what? Ray tracing? I'd better hope so. In normal games I doubt it. Guess we'll find out soon.
That $499 MSRP for the 2070, LMAO. Ok, AIB partners are just going to follow the Founders Edition MSRP like last generation. So really $599, $799, and $1,199. Love this GPU monopoly we have going on!
Except for Zotac, which only makes NVIDIA cards, the other board partners are probably getting rather annoyed with AMD for not having anything new before 2019 to let them overclock and bin for higher performance.
EVGA is currently selling the 2080 $50 cheaper than the FE card on Newegg, but even $750 is insane. Such greed from Nvidia! So much for Fordism in today's crony capitalism. :(
Also keep in mind that pretty much all the reviewers are still under NDA that prevents them from talking about performance or any of that stuff just yet.
Maybe you're too young to remember, but this is the same as when the programmable pipeline made its debut. The first full T&L card was the GeForce 256; it claimed to offer twice the speed of its predecessor, the TNT2, but (and this is the important part) only in games using the T&L engine, which at the time was nearly no game, so a Voodoo3 still beat the GF256 in fixed-pipeline games and was a lot cheaper. That said, in the next generation every single game updated to DirectX 7 and used T&L, and suddenly that "underwhelming technology" turned out to be the holy grail we have been using until now. This is the same situation: RTX offers six times the power of a 10XX series in RTX usage. That doesn't translate to an overwhelming speed upgrade in previous games; this is a new tech that must be adopted by developers, and until then it may seem like "a scam," but it's not. This is a new era like we haven't lived through since 1999 with the introduction of the programmable pipeline, and thanks to this, in 2-3 years the jump in graphics quality we're about to see is going to be amazing.
And Voodoo (3dfx) died and was eventually bought for its IP by Nvidia, because it focused purely on brute-force speed improvements to old, existing technology rather than inventing.
It won't be stickied, too reasonable... Honestly, how do people not get this? RTX is a leap forward, even if that won't be reflected in FPS. Until we see actual benchmarks, the only legitimate gripe people have is price... but what do you expect, when AMD is MIA in high-end GPUs?
Yes, but by the time games catch up to this new era, the 20** series cards will be obsolete, having been replaced by the 30** series. So, essentially, anyone that buys a 20** series card is being used to fund development for future cards. While this isn't necessarily a bad thing in regards to ensuring we will have better cards in the future, anyone that jumps into buying these cards right now will spend the next year or so playing new, buggy games with a new, buggy technology that isn't yet properly optimized for the hardware, and by the time new games come out with RT in 2020, these cards will be behind in performance, with newer cards on the horizon within the next year or so after that, essentially making this entire release a public alpha test for the technology. I see a lot of people being unimpressed and upset with performance once they start playing the likes of Metro and Tomb Raider. People care much less about how games look than about how they perform, within reason. I'm the opposite, really. I'm okay with 60FPS as long as the game looks beautiful, and I go out of my way to make sure that it does, even going as far as using ReShade. IMO, Assassin's Creed: Origins' color palette and shader quality are terrible. I made the game look really impressive using ReShade, only losing around 4-5FPS out of my 75FPS monitor cap. So, to be clear, I'm all about how a game looks. But within /reason/. I don't want to be the guinea pig for a multi-billion-dollar corporation. Shouldn't have to.
Daxter Of course they did. Nvidia and their BS marketing, convincing us that we need ray tracing. Not showing any gaming benchmarks. 4K gameplay. NVLink. Nothing. Just wait a month more for now. The premium pre-order price just ain't worth it. But if they can all live up to their price, I'm fine with it.
Daxter They are just making it sound like the new cards are a big jump, but in reality, when the benchmarks arrive, I think the RTX 2080 Ti will be maybe 10% more powerful than the GTX 1080 Ti. It's just like what Apple does with their iPhones: "50% faster," but when benchmarked, the old one scores 200k and the new one 240k, and you can tell that's not 50%.
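For what it's worth, the arithmetic in that comparison checks out. A quick sketch, using the comment's hypothetical scores (not real benchmark results):

```python
# Relative speedup from two benchmark scores. The 200k/240k figures
# are the hypothetical ones from the comment above, not measurements.
def speedup_percent(old_score, new_score):
    return round((new_score / old_score - 1) * 100, 1)

print(speedup_percent(200_000, 240_000))  # 20.0 -- a 20% gain, not 50%
```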
Thank you for making me not feel crazy for having a gut feeling that the RTX cards are going to be fairly underwhelming for gaming in general. (And the dual 8-pins and dual-fan reference cooler give me the feeling it's going to be pretty poor in performance per watt.)
Eva :D That's a bad understanding of it, though. If you have a game with no RTX that just uses CUDA, it will just use the CUDA cores, and power draw will be similar if not lower, giving equal or better perf per watt. Now, if you have an RTX game, a 1080 Ti running RTX through software emulation (yes, you can do that too) might get, say, 20 fps, but the new card with dedicated RT cores might get 80+ fps, since CUDA doesn't handle the RT, so performance per watt goes insane with the 20 series. With dedicated cores doing the lighting and shadows, that can leave even more headroom on the CUDA cores for even more fidelity. We're moving from artistic interpretation to realistic, physics-based rendering; this is the first step in an awesome direction.
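The perf-per-watt point can be made concrete with the comment's own hypothetical numbers (the fps figures are the comment's guesses and the wattages are rough board-power estimates, not measurements):

```python
# Compare frames-per-watt for software-emulated RT on a 1080 Ti vs
# hardware RT on a 20-series card. All numbers are illustrative only.
def fps_per_watt(fps, watts):
    return fps / watts

emulated = fps_per_watt(20, 250)   # 1080 Ti emulating RT in software
hardware = fps_per_watt(80, 260)   # 20-series with dedicated RT cores
print(round(hardware / emulated, 2))  # ~3.85x better perf per watt
```

Even with the new card drawing slightly more power, the dedicated hardware path dominates the efficiency comparison in this scenario.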
Exactly how I feel. All he talked about were made-up terms like GIGA RAYS... he said NOTHING about gaming performance, REALLY!!!! And these cards are already sold out everywhere... It's going to be sad when people find out they could have gotten a discounted 1080 Ti or Titan Xp a month from now for around $400 to $500. People will see; they still have way too much older stock to make it obsolete. Turing will be about 25% faster across the board in games at the MOST, THAT IS IT...
I'm no developer, but why is nobody talking about RTX increasing the efficiency of the development process? I mean, at least the developers there sounded really happy. If it improves efficiency, and they don't have to define lighting characteristics for each pixel, then wouldn't that perhaps (strong emphasis on that) mean easier implementation in current-gen games? And to me, the whole point of graphics innovation is to make a game look and perform better. People talking about just shadows and ignoring real-time reflections are missing the point of the demonstration; developers can define specific materials that reflect light differently. That changes the way games that want to replicate realism look and act. Hell, it even changes gameplay itself... I feel like I'm regurgitating the presentation, but from what I saw, it really made sense to me. I'm actually pretty excited.
It's gonna be amazing since we're switching to PBR completely from now on. Just wait for Sim titles to get this, then people will understand how it affects the realism.
Finally someone said it! I was scrolling through the comments to upvote this. Ray Tracing will mean that developers can focus less on lighting and more on what the gamers want in video games.
RTX will not make the development process more efficient, not at first anyways, since they will still have to use the old lighting techniques until RTX is ubiquitous. After all performance would REALLY suck for non RTX cards if they didn't. The workload will actually increase since they'll have to do 2 different sets of lighting, one for raytracing and one for rasterization. Though the lighting for RT is indeed way simpler to do.
3 years ago the 980 Ti released at an initial price of $649. In 3 years, the initial price of Nvidia's flagship card has nearly doubled. No thanks.
The pricing was a huge turn-off for me, and I hate seeing so many people just accepting the fact that they have to pay $1k for a top-of-the-line gaming card. It just seems ridiculous to me.
Try living in Canada. Things cost even more here thanks to the shitty exchange rate with the US dollar. ... Anyways, no one really needs a high end card, midrange is good enough for most people.
Alexander Yordanov Well, yeah, people love Nvidia. And the 900 and 1000 series are generally good cards for the most part. AMD also gave up on the market for many years before getting back recently. They've only got themselves to blame.
I still own and use an R9 270X and it is still serving my needs and doing what it was designed to do at a much lower price than its nV counterparts. So, yeah...
They didn't even talk about the new "SLI"/NVLink connectors, performance in actual games, why there's a USB Type-C port in the rear IO, or the power consumption.
TDP is roughly the same as the Pascal equivalent for the 2080 Ti, within 10 watts; the 2070 and 2080 are about 20 watts higher. Not bad, really, according to the specs on the store page: 175-185W for the 2070, 215-225W for the 2080, and 250-260W for the 2080 Ti, the higher number being the Founders card TDP (factory OC).
Probably just going to pick up the 2080. The Ti doesn't seem like it's really worth it. However, I'm going to wait till the benchmarks come out. If the 2080 isn't faster than the 1080 Ti, I'm not gonna buy it.
My MSI GTX 1080 Ti FE hits 2038MHz core and 5954MHz mem on the stock air cooler XD, and with this EVGA hybrid kit I bought, hopefully 2150MHz core and 6100MHz mem. PS: ray tracing couldn't be more irrelevant to me. It chafes my anus that I paid MSRP on launch day for my GPU with 11GB of memory, and these new cards top out at 11GB too, albeit of GDDR6... until Nvidia releases a Ti card with at least 2x the VRAM, I won't give a fuck.
People keep comparing this to PhysX. If PhysX were a Microsoft-owned technology and part of DirectX, like DXR is, people might actually have used it for something other than making Batman walk through loads of paper for some reason.
What's even the point of it? If actual games like Tomb Raider can't run at 60fps even at 1080p, how will future games do it? Why are they marketing those cards with that? Some better shadows won't make me give up 4K 60fps, which has a much bigger impact. They should market ray tracing when it can be achieved at those resolutions and framerates.
It's also worth pointing out that according to the dev team behind it, this is a post-launch feature that is still in an early stage of development so is not optimised. They have not given the timeline of when they are expecting ray-tracing to be in a playable state in SotTR to my knowledge.
engels 141 AMD doesn't need RT. If they can deliver a card without RT but 50% faster than Nvidia's at the same price point, I will buy an AMD card. Speed rules. 1080 Ti here.
I'm really hoping the 7nm Radeons are good next year. Vega 64 would be a very strong card if they can fix the scheduling issues that cause many cores to be underutilized or not used at all.
They're marketing to gamers, and all they showed us was shadows and reflections; nothing is fishy, just non-existent. No one who is a gamer or PC enthusiast has ever said "yeah, 4K at 115fps is great, but damn... can I like get some more shadows?" If a 2070 could do The Witcher 3 at 4K 90fps, would that not make everyone jump out of their seat and pre-order? You don't host an event like this to market ray tracing and then in 2 weeks tweet out "oh yeah, btw, the 2070 can do 4K 120fps." If it could do that, they would show it while everyone who cares is watching. Don't get me wrong, I'm positive these will be an improvement. But for $1,200 that thing better TELEPORT me to Novigrad. Not to mention they're using new metrics so that channels like these can't do the math and actually estimate performance. If Jensen can compare the old generation with the new one using the new metrics, he can damn certain do it with the old ones too. Marketing hype to frontload their pockets with pre-orders, until proven otherwise.
Go back and watch the part where they calculated the RTX-OPS; it shows the theoretical performance of each part of the architecture: 14 TF in one part, 14 TF in a second area, over 100 TF of FP16 on another... With the right drivers, which I'm sure they have, performance gains should be leaps above Pascal, utilising different cores for different calculations and improved rendering. Did people not understand it or something? This architecture is years ahead of Pascal... even rendering without RTX on should be loads faster.
Look back at the 10 series announcement video, listen to all those claps and shouts, and compare it to the 20 series... it's like, oh ya... yay... we need those shadows and lighting to game on Red Alert 2.
That's because of the massive leaks of 20xx specs. Nobody was surprised anymore*. Compare that to Apple's iPhone presentations. First one, massive. Today with all the leaks, meh. ______ * In fact, I was thrilled for a moment when they said "All the leaks are wrong" but then they didn't follow up on it, so I was like "meh".
If you are uneducated on the subject, then it is meaningless to you, and judging by the comments, the number of people with that problem is quite big. It's quite funny how a large number of people scream that they only care about FPS but still poop their pants over fully pre-rendered trailers because of how pretty or realistic they look. Or how they "just care about FPS" but still buy a 1080 Ti to play Fortnite instead of buying a GTX 970 and putting everything on low settings. Facepalming at the moment.
Remember T&L? I remember T&L. Without developer support, it's nothing of value, just like ray tracing. I'm pretty sure the RTX 20XX will perform better than the 10XX series, but how much better, and for that jump in price, is it worth the upgrade? My assumption is that the card will perform much better acoustically compared to the old FE cards, with most likely a 25% average framerate uptick. I prefer Nvidia's blower-style fan for my SFF machine, so I don't know about this dual-fan solution, since the clearance between my PSU and the second fan is only about half a centimeter.
@Science Studio Greg, it was super fishy in my opinion as well. He never used the same previous-gen terminology to compare performance... he kept using that "RAYFLOP" whatever term, "78 times faster in rays"... but THEN... THEN he said the 2070 is faster than a GTX Titan Xp as a standalone statement?? So yeah, that kind of tells you everything about the rest.
There are no AIB cards that cost less than the Founders cards online. So you were not inaccurate; people will defend Nvidia. That's normal for fans to do.
@@ThanhLe-im3ce Exactly, so I don't know what he was talking about, or why the cards are $100 more, unless that's the preorder price and they will be $499 on shelves? Who knows.
The fact you speak of it being about "shadows" kind of explains why you feel underwhelmed. You don't appear to understand what this card does, or what the problems with game production and current-gen hardware are. The problem with graphics cards, as they exist today, is that they're chasing an ever more complex and expensive goal. Two years ago Nvidia touted 4K gaming; now you want 4K gaming at 120fps? In two years, what then? Higher frame rates and higher resolutions are already things few people can "see," so why chase them, especially as it becomes an order-of-magnitude problem to solve? Meanwhile, game production costs skyrocket because designers are spending more and more time faking real-world lighting and the physical representation of materials. Some of the better games that have come out in the past couple of years have pre-computed global illumination and have been praised for their art, but normally run slower than other games. It's not hard to see the obvious direction change needed. Accurate representation of lights and materials (if everyone had access to real-time ray tracing) would actually speed up art and level production for games. They look better, but right now, they run slower. This paradigm shift needs to start happening sooner rather than later. It looks like it's happening now, and on a card where the rasterisation aspect IS running faster. Release is only 4 weeks away, benchmarks will happen soon, and people will buy the cards, because PC gamers always want to over-engineer their rigs; whether they can use or see the difference is largely irrelevant. :)
Jensen is fooling the public with those vague terms because he knows no one is gonna throw $999 at better shadows, but then again, the people who are gonna buy will buy.
Idiots bought a Titan X and then had to rebuy a Titan Xp. They didn't even complain about it; those that did were in the minority. I got a 1080 Ti for 660 second-hand before prices went crazy high.
That's really what it comes down to. I watch those demos and find myself thinking they look good enough without ray tracing. So unless there's a solid fps increase, or way better support for the high-resolution, high-refresh-rate VR headsets in development... I think skipping this gen is fine. You KNOW they're going to do a 7nm refresh like a year later anyway, probably with more VRAM and less cut down.
Get 1080 Ti SLI with a bridge; it actually gives decent performance in most triple-A games, about a 50-80% increase. Granted, I don't know how the 2080 Ti performs at 4K; you would think it does extremely well once they put those supercomputers to work creating artificial 4K content, which would probably look nearly as good as native 4K itself. But at this point you can doubt anything Nvidia says, including its already-bloated BS MSRP of $1,000, since that was a straight-up lie.
Fuck GPU power, I'm more worried about the pricing. Who are these GPUs even aimed at, lmao? The 1% that can afford them (looking at you, 2080 Ti)? Good way to fuck up your lineup permanently.
They weren't using RTX in that demo. They didn't say they were nor were they showing the indicator in the corner like they normally did. That was raw performance.
Eternal Cowboy I doubt it. Whatever AMD's weaknesses at the high end, they are very good at the lower end, and they have a lot of experience with consoles at the moment. I suspect console makers don't want to rock the boat too much either, given that the main home consoles transitioned to standard x86 architectures. Using AMD would encourage backwards compatibility, which I'm sure both Sony and MS very much want (the former because they have many great exclusives, the latter because they kinda need all the library they can get).
Not hyped for the new gen cards at all, eh. I can see how working with lighting and shadows might make games look better, but I find myself not looking at shadows......
Serious gamers always turn shadows off; it helps them spot enemies hiding in the dark. Ray tracing would be good for casual gamers and non-competitive games, though.
The lack of competition from team RED made Nvidia treat gamers like idiots. He was talking gibberish to cover up the fact that gaming performance is barely improved, maybe not at all; he was selling ray tracing like there's no tomorrow. To us gamers, that's just one feature among the many we see every day when we go to the advanced video menu in games; we turn some down and some up to keep the FPS as high as possible. Did he ever mention FPS? I don't think so, and that's the only thing we care about, Mr. Jensen Huang.
Why? So that everyone can go out and buy the slower and more expensive Nvidia cards anyway, yet again? AMD isn't coming back to high-end consumer products. They are moving to the professional market, where their products are actually wanted and, most importantly, bought when they are competitive. Best business decision they've made in years.
With ray tracing I can see the unmodelled underside of the tank in games. The pixelated tree leaf in the background casts a shadow. The half-modelled hill in the background casts a bowl shape into my foreground. It brightens up the unmodelled back ends of the incomplete background houses, etc...
I'm still using an R9 390, and while it does 1440p decently, I still have some issues, and I've been interested in an upgrade. As much as I hoped my next card would be an RX Vega... it's probably gonna be an RTX 2070. But I will definitely wait for benchmarks before I order.
Ray tracing = PhysX 2.0. Sadly, AMD/Radeon isn't in the competition anymore; imagine if they were still here today, they would crush this RTX in terms of FPS, without a doubt.
Just look at Reddit and how every user on there is saying "I only care about frame rate," which is... not true. They say this now, but in a year or two, when they actually see ray tracing in an Elder Scrolls game, they will suddenly care about things other than frame rates again. I absolutely hate the narrative that the amazing leap of ray tracing is just "better shadows"; even your own previous video on ray tracing did a very good job of showing how much more it is than that. I'll be honest, at these prices, if the 2070 isn't as or more powerful at pushing frames than the 1080, I'll be severely underwhelmed. But in some games (not shooters or action), I will take 60fps and ray tracing. Horror games would benefit from this. Elder Scrolls, Final Fantasy, maybe even GTA 6. I would turn off ray tracing for anything competitive or really fast-paced.
@Seymor Onion Well, the card has to be finished and released first before any developer can fully develop a game with that amount of graphical power in mind. Without actual hands-on time, developers cannot balance a game to get the proper FPS using real-time ray tracing. Current games are not set up to work with it in a smart way, so yes, it will take a year, maybe two, for games to fully use the potential of the card. Just compare Gears of War 1 and Gears of War 3: the visual difference between those two games is insane, and you're basically saying that the visuals of GOW 1 should have been what GOW 3 looked like.
I'm more interested in seeing how it will shape 3D art. Real-time rendering gives artists more time to iterate and makes polishing easier. Surely it will let them push boundaries by allowing them to better preview things and worry about fewer issues before committing to more calculation-heavy rendering. Real-time rendering will do wonders for hobby/indie/beginner artists as well, once prices eventually come down and this tech becomes more mainstream. They will be able to create today, at a more affordable price, what took a day or more to render yesterday.
At the end of the day, when a company doesn't use the standard terminology with its stakeholders, they have something to hide, so you're probably right in this video.
@-T-X-M- Doesn't matter; the difference between 3GB and 6GB is almost microscopic, with numerous comparison videos to back that up. I would much rather have the 3GB over the 6GB and hold out until 7nm cards come out.
@-T-X-M- I paid $250 for my 1060 6GB because of a promo code. But now I'm pissed to hear that the 2060 will be as powerful as a 1080. That drives me insane.
The new card reminds me of the transition to programmable shaders. Nvidia took the risk of adding programmable shaders to the GeForce 3, kneecapping its potential performance in contemporary games. They could have just added more fixed-function hardware for more fps. It was a great thing they didn't, because developers could then implement support for Shader Model 1 in their games, since the hardware was now in gamers' hands. Unfortunately, by the time games finally did support programmable shaders, the GeForce 3 was too slow and below the minimum system requirements. In addition, NV released the GeForce 4 MX, which was just a GeForce 2 in disguise (fixed-function pipeline, no programmable shaders), which confused gamers and betrayed developers. So if history repeats itself, Turing will kickstart the use of real-time ray tracing, but won't be fast enough to support meaningful RTX implementations, and nVidia will release deceptively named non-RTX cards which will confuse the market (and slow adoption of the new tech by devs).
They are more concerned with marketing the tech to the industry than showing off its gaming performance to their core audience (gamers). It's amazing that real-time ray tracing is here, but games over the years have gotten very good at faking it. Does anyone really need real-time reflections to enhance gameplay? Better FPS, higher resolutions, and physics all have big impacts on how you enjoy a game. I know it's more than reflections, but the dynamic lighting and shadows faked over the years look pretty damn good, and here's the key difference: they run well too. This tech is 5-10 years from shifting how games are built and how they perform. It can't even hit 60 fps at 1080p, and that's not even taking into account aspects of ray tracing beyond reflections. It's a big red flag that no benchmarks against the previous generation were shown...
EXACTLY. I thought there was nothing special except for ray tracing/realism/deep learning and AI, aimed at supercomputers rather than gaming. I'm just hoping I can finally get a GTX 1070 for $150 after everyone falls for the "greatest leap with 10 years of research" hype.
So ray tracing is like HairWorks... it's a luxury item which hits performance hard, maybe cutting FPS by 25% for some shiny reflected light. It's surely not for the majority of gaming users, who would want to turn it off (GTX 1060, RTX 2060 users?), and only a handful of AAA games use it. I don't know why it was stretched out so much. What would be a real breakthrough is 4K 120+Hz HDR ultra-settings gaming. If the RTX series can do that... then fine... else... DISAPPOINTMENT!!
Technologically speaking, this is a huge thing to be proud of; this was literally impossible 1-2 years ago... But I get what you mean... However, DLSS could interest you, because it can make games render at 1080p but look like native 4K on a 4K screen, without the massive performance hit of actually shading all the pixels... potentially giving you more fps with a beautiful, sharp image. And that is also an RTX function.
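The pixel arithmetic behind that claim is easy to sketch. This is a rough illustration only; the 1080p internal resolution is an assumption for the example (Nvidia hasn't specified fixed DLSS internal resolutions here), and it ignores the cost of the upscaling network itself:

```python
# Back-of-the-envelope pixel math for "render at 1080p, present at 4K".
# Assumption (illustrative): DLSS shades a 1920x1080 internal frame and
# upscales it to 3840x2160. Shading work scales roughly with pixel count.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # pixels shaded when rendering 4K natively
internal  = pixels(1920, 1080)   # pixels shaded at the assumed internal res

ratio = native_4k / internal
print(f"Native 4K shades {ratio:.0f}x the pixels of a 1080p internal render")
```

So under these assumed numbers, native 4K shades 4x the pixels, which is where the "more fps without doing all the pixels" pitch comes from.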
I looked up DLSS (Deep Learning Super Sampling) from what you said... it does look interesting, on paper at least. But man, there's one problem with it too: it needs game-developer implementation, so it must be individually implemented for each and every game.
Magma Vol: AMD does the same thing; Vega has tons of baked-in features that developers have to use... But nVidia has the money, resources and investment to actually go out and tell developers to do it. The fact that there is already a list of games that support DLSS, ray tracing, or both, with patches announced post-launch for many current games, is a telling sign that nVidia will push this tech into the mainstream as hard as they can. Partially to drive sales, of course, they are not saints, but it won't be DOA like many of AMD's Vega features, where nobody feels the need to invest time into them. nVidia just up and pays you to do it, so then it becomes worth it.
It's a performance hit now, but there has to be a first generation of everything. If you don't want to be an early adopter, that makes sense, but in a few years people will want it (and still complain).
If you watch the Quadro presentation, it takes about 48 ms for one frame with RTX fully utilized. This means that in its purest form, ray tracing isn't possible for gaming, since high-refresh gaming needs more like 4-8 ms per frame. Great for canned scenes or CAD design though.
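For context on those numbers, frame time and frame rate are just reciprocals of each other, so the quoted figures can be sanity-checked in a couple of lines (a quick sketch; the 48 ms figure is the one quoted from the presentation above):

```python
# Frame-time <-> FPS conversion: fps = 1000 / milliseconds-per-frame.

def fps_from_ms(frame_ms):
    """Frames per second achievable at a given frame time in ms."""
    return 1000.0 / frame_ms

def ms_from_fps(fps):
    """Frame-time budget in ms needed to sustain a given FPS."""
    return 1000.0 / fps

print(fps_from_ms(48))    # ~20.8 fps -- the quoted 48 ms per frame
print(ms_from_fps(60))    # ~16.7 ms budget for 60 fps
print(ms_from_fps(144))   # ~6.9 ms budget for a 144 Hz display
```

So 48 ms per frame is roughly 21 fps, and even plain 60 fps leaves only a ~16.7 ms budget for everything, not just the ray tracing.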
These prices are just insane. I was kind of hyped for the release, but damn, $1,200 for a 2080 Ti is a *lot*. Nvidia just boasting about RTX and "ooh, it is X times more powerful than Pascal" is really fishy. I'd really like to see these puppies benched and their non-RT performance compared. Also waiting on Vega 7nm. Maybe we get something really cool from Team Red?
They did the exact same thing with Maxwell to Pascal. Just now people are salty they are charging so much. Not surprised, given the typical hardware enthusiast.
The price leap is insane. I really feel like they are banking on mining picking up again to try and cash in. Unless the 2070 has vastly better performance than a 1080, I don't see how they can justify their pricing models. I hope it is faster though, seeing as a 1070 beats a 980 Ti in many games; the generational/pricing shift will only make sense if this is the case. I'm almost mad at how good Pascal was; it kind of halted the need for innovation. Here's hoping AMD has something up their sleeve...
I have a 1080 as well. Best card I have ever owned and picked it up for $499 last year right before the mining craze. Just bothers me that the X080 card is not sticking to the same pricing slot.
There is no competition at all. AMD can't catch Nvidia in terms of performance, so Nvidia is slapping the price in your faces. Buy at that price or don't. Simple.
This reminds me of the launch of the GeForce 3. It was phenomenal hardware that delivered the same frame rate, and sometimes even less, than the GeForce 2 Ultra. But the technological leap was a pivot point from the fixed rendering pipeline to programmable shaders. It was an outstanding thing that nobody really understood until the GeForce 4 Ti came along with the leap in performance, with nFinite FX flying. nVIDIA has done it again, making giant leaps in computer graphics.
Yeah, these prices are insanely high (the actual store prices even more so) for something that looks like it'll only be ~20% faster (after more than 2 years). Ray tracing is great and all, but there isn't a game out there that uses it yet, so who cares? By the time games have it, you'll want the next generation of RTX technology, and even then they'll all still work on non-RT cards. I'm thinking people should be buying 1080 Tis at their sale price right now and skip this whole 20XX generation.
At least for the 2080 vs. the 1080, it's only a 15% increase in CUDA cores, with no clock speeds published yet. If clocks were higher they'd have announced them, so we can pretty safely say that games that don't use the Tensor cores and RT won't run more than 15% faster. For the Ti, that's 21% more CUDA cores, but the clocks are actually slower (100MHz less base for the EVGA Ultra Gaming), with boost not published... This might be 15% faster again... DO NOT PRE-ORDER THIS. Wait for reviews.
You aren't factoring in GDDR6 and minor architectural improvements they mentioned. I'm expecting more like 30-35% gain across the board (of announced cards). Still underwhelming so I agree with the sentiment.
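The core-count math in the two comments above is easy to reproduce. A quick sketch, using the announced CUDA-core counts and deliberately ignoring clocks, GDDR6, and architectural changes, so treat it as a rough floor, not a prediction:

```python
# Relative CUDA-core uplift between announced Turing cards and Pascal.
# Core counts are the announced spec-sheet figures; clock speeds and
# architectural improvements are intentionally left out of this estimate.

def uplift(new_cores, old_cores):
    """Percentage increase in core count, e.g. 2944 vs 2560 -> ~15%."""
    return (new_cores / old_cores - 1) * 100

print(f"2080 vs 1080:       +{uplift(2944, 2560):.0f}% CUDA cores")
print(f"2080 Ti vs 1080 Ti: +{uplift(4352, 3584):.0f}% CUDA cores")
```

Which matches the 15% and ~21% figures quoted above; anything beyond that would have to come from clocks, memory bandwidth, or the architecture itself.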
I think the oddest part was how he started with "*All* the leaks are wrong" and then didn't mention a single number from the leaks that was wrong (in fact, it appears all leaks got the core numbers right).
Awesome, I can finally buy a 1080 Ti once the price falls... Not buying the first round of NEW graphics cards; I'll wait for them to get the kinks out.
Firstly, I just want to say: I have seen so many people jumping on Nvidia for not talking about gaming performance, and while I agree they probably should have, I understand why they didn't. They were unveiling a massive new hardware feature that, by the sounds of it, has been a work in progress since 2008, and they have no one to compete with in terms of "regular benchmark metrics", so they talked about their brand-new, never-before-seen thing that will bring real-time ray tracing to further enhance graphics.

1:46 The biggest issue I have with this statement is that ray tracing is so much more than "shadows", which I assume you know, but you said the whole point of ray tracing is accurate shadow interpretation, yet glossed over the BFV demo, which focused mostly on the reflective aspects of ray tracing (i.e., not shadows). Ray tracing in general can be used for so much more than just shadows, and now we can get that in real time.

3:28 Correct, with real-time ray tracing on you will see a significant performance drop, I would assume. This is literally about advancement in graphical fidelity; the first generation of RTX will likely suck at ray-tracing performance, and it won't be until the next generation or the one after that it will start to be worth using.

4:52 Again you mention shadows as being the only benefit.... 5:20 And again... Did you ignore the BFV demo showing the reflective properties afforded by ray tracing? You know, the demo that, iirc, said nothing about shadows but instead focused entirely on reflections? Still only one aspect of what ray tracing can do.

Anyway, in short: they have no one to compete with on performance (thanks, AMD), so they focused on their new hardware features and marketing real-time ray tracing, instead of saying how much faster these cards are than... their own cards.
A German tech site posted a video of the 2080ti running Tomb Raider at 1080p on the show floor. It was averaging around 30-40 fps with RTX.......at 1080p.......ROFL......
You've got it wrong; the whole point of RTX was that his team is bringing tech that was previously only available on supercomputers to the mainstream. If you wanted ray tracing in your movie, you had to pay a lot for it. Now it's on your desktop. Now everyone gets to enjoy it.
The ray tracing happening on the desktop is still nowhere near what supercomputers or render farms do. It is extremely limited, used only for certain effects, and also limited in what it can accelerate. It is nice technology, but it is being vastly overhyped. Many different things can be called "ray tracing".
No, raytracing has been the most important part of 3d Graphics for decades now and them bringing it to real time is a huge achievement that is guaranteed to become a fundamental part of the graphics pipeline going forward. Improving ray tracing is worth just as much as getting better performance without it.
Everyone saying "I don't care about shadows and reflections"... then why defuq do you care about high-end GPUs? These high-end cards are made so you can get the best graphics quality and good fps. If you don't care about shadows 'n' shit, then you should just get a low- or mid-range GPU and turn down the settings. Remember, you don't *need* these cards to play games.
How about fps? That's much more important for competitive multiplayer games. In fact, hardcore gamers usually dial the graphics settings to minimal to max out fps and reduce unnecessary visual effects that interfere with target acquisition.
@438295 No... no... Lighting is more important than textures when it comes to photorealism. And yes, we do need better lighting. This is how graphics advance over the years. Without innovation like this we would still be playing 8-bit 2D games.
3440 x 1440 at 100+ fps. And if the 1080 Ti can do that in most current games while costing as much as 300 bucks less, then the RTX just doesn't have a chance.
Stop flaming AMD about competition; how are they meant to support you if you don't support them? So few consumers actually buy, whether it's Ryzen or Radeon. People keep justifying their 8700K purchases because of 10% more FPS? It's frustrating to see flame towards AMD; if we don't fund them, how are they meant to deliver for us?
Don't be ridiculous; AMD sold every single GPU they could produce to crypto miners over the last year, to the point where gamers couldn't buy any for months.
The time when it mattered was when AMD was legitimately competitive. Both the R9 290 and the 290X comfortably outperformed the 780, and everyone still bought the more expensive, worse-performing 780 because "it's Nvidia so it must be good"; the Nvidia mindshare was still there. AMD lost tons of money putting R&D into a great card (the best on the market) that nobody bought despite a cheaper price. They legitimately can't afford the R&D for a card that will beat Nvidia's best offering, especially with Nvidia's anti-consumer tactics like GPP and Gimpworks, which are designed to cripple the competition instead of improving their own products.
The 8700K is a great all-rounder. I feel that with Ryzen you get really subpar single-core performance; I run an R5 1600 and it's susceptible to frame drops, while a friend's 7700K is consistently solid.
*shrug* I support AMD and put my money where my mouth is... I bought an RX 580 when they came out (it was within 10 bucks of the RX 480, so it was an easy choice) and the card runs well with my R7 1700. I try to avoid Intel's scams and Ngreedia, since both are shady in my eyes. AMD is not fault-free, but it's a drop in the ocean compared to the other two. My PC before that was an Intel 2700K with an AMD 7970, if I recall right.
This presentation was more exciting from the gamedev perspective than from the gamer's perspective. Games looking nice are what would make gamers buy a new graphics card, but this time the GPU came with a nice new feature before any game had anything notable to show using the technology. This is why, as a gamer, you could feel a bit disappointed about this presentation. These are the points I got from it:

- They sacrificed chip space to add Tensor cores and RT cores.
- The traditional raster pipeline got a 2x speedup.
- Compared with a traditional ray-tracing pipeline built on compute shaders in Pascal, they got a 6x speedup in ray tracing with the new cores. This lets this new generation of GPUs do some ray-tracing tricks in the real-time range (16 ~ 45 ms per frame).
- The Tensor cores make any neural-network filtering, like denoising or antialiasing, absurdly fast.

From the gamedev perspective, if you can afford ray tracing, your game's render pipeline simplifies a lot. Your game no longer needs complicated tricks to make lights and shadows look good; they will just look good. This kind of technology unlocks many other techniques that gamers would enjoy, but they are still to come.
Sorta kinda planning to upgrade, and I figured this would be a good jump from my GTX 770, but... while that might be true, if the performance gains aren't huge over a 1080 Ti, it feels like that might be the better choice for now.
This seems more like a populist video trying to create drama where there isn't any. You are getting a real-time ray-tracing-capable single GPU for 499 to use in your gaming computer. Now, I get it if you do not understand the importance of this technical accomplishment, but to try to discredit it is quite unreasonable. Real-time ray tracing is one of the most important factors in achieving a realistic representation of light, reflections and materials in a virtual environment, and I am unsure what you expect Nvidia or any other GPU company to do other than focus on something this important. I think people, especially PC gamers, can sometimes tend to get entitled. You are sitting with a high-quality monitor behind you and an expensive computing unit next to it, and now a company has made real-time ray tracing possible on that PC of yours for a 499 starting price. Get real, man.
Could not agree more. A version of clickbait. We just have to wait for proper tests to come out, within a few weeks or even days, instead of all the "guessing" videos. Check geniwab's comment below; it explains it pretty well. And if a lot of vocal gamers are upset about Nvidia's new card, Nvidia is probably doing it right :)
This is such a stupid comment. People act like "ray tracing" is one thing. It is not. Claybook can run at 4K 60fps on a high end Pascal GPU and it uses ray tracing. Nvidia is just accelerating the BVH structure. Plenty of other approaches. Solving ray tracing is as much of a software problem as a hardware one.
You missed the point of their approach. The whole point of brute-force rendering is to avoid software gimmicks that try to "solve" ray tracing, because until now we have relied on a lot of gimmicky software workarounds aimed at achieving realistic light casting. Real-time brute-force ray tracing is the real way of doing it, or at least that's the goal. Even Claybook relies on software workarounds, and Pascal GPUs do not have a separate allocation on the die for ray-traced rendering, meaning the hardware is not optimized for such scenarios and has to dedicate general performance to it or rely on software gimmicks. For the first time you are getting a consumer entry GPU (499) that is specifically, at the hardware and die level, aimed at rendering this in real time.
Our current "faking" methods for lighting achieve 90% of the effect for less than half the performance hit. Remember how revolutionary PhysX and DX11 were, or better yet, how unimpressive they turned out to be? The RTX 20XX series is going to be bad at ray tracing just like the DX11 cards were at launch (by that I mean not fast enough). I really think you're missing the point he was trying to make with this video. Most people could probably agree this GPU launch seems off compared to previous generations. The lack of any head-to-head results against current cards feels strangely similar to AMD's Vega launch, where they did everything in their power to avoid an apples-to-apples comparison with Nvidia's cards. I personally believe the RTX 20XX series will see the same 15-20% generational boost we're used to. The 2070 = 1080, the 2080 = 1080 Ti, and the 2080 Ti becomes the new king. The same meta as the last three generations' launches.
You cannot do a head-to-head comparison between previous-generation GPUs and RTX GPUs, because the RTX GPUs rely on an entirely different architecture, which, again, is the whole point. One of the two biggest GPU makers, Nvidia, has acknowledged through heavy investment that AI and ray tracing are so crucial to game development that they physically allocated die space to them in the RTX architecture. Thus, they circumvent the need for software gimmicks and significantly improve project workflow and development time, not to mention render times for non-real-time applications. As for fake methods achieving 90%, a percentage you pulled out of thin air, the whole point of progress in the game industry is to reach points in hardware development that allow us to get rid of fake solutions and patchwork. Just think of the resources saved by not having artists and programmers work on fake circumventing methods, and instead having them apply direct solutions.
The problem with the 20-series is the concentration on RTX-OPS. You can't compare speed in RTX-OPS between a card designed to do RTX-OPS and a card that is not; of course the new 20-series will look much faster that way. What we need to see are comparisons between the 1080 Ti and 2080 Ti, for example, in games NOT using RTX-OPS. That's the only way to see if the cards are really faster at all. My guess is that they are just marginally faster, and to really experience the difference in "speed" we need the games to start changing... We won't see any benefit from the new cards until we have games that rely on the Tensor cores and AI to calculate shadows, reflections and GI.

Also a thought... Check the demos they presented... Something seems off. Maybe it's the fact that the ray tracer only takes into account the things chosen to cast shadows; not everything casts shadows. The same goes for reflections: not everything is reflected (see the particles in the big explosion in Battlefield V, for example). Another thing is that the ray tracer is used for reflections while the rasterizer is used for the rest of the graphics. Doesn't that mean the graphics rendered and rasterized directly on screen differ from the graphics in the reflection, since the same object basically gets rendered by two different engines? I believe we might see cases where the reflected image does not match the look of the actual rendered object being reflected (if both are visible at the same time). It will be extremely interesting to see where this goes and where it ends up, but for now it feels much like PhysX and the hype around it when it was released. We'll see... :)
Their Battlefield V demo was seriously broken. Cars looking like they were pasted into the scene, no real shadows underneath. But "hey, reflections!". I guess either the RT support was hastily tacked onto the game or they still have issues somewhere...
They did comment that it will only work on properly made objects. For example if the car is simply a car shaped box on the ground wrapped in a skin to make it look better, then the light will not be able to bounce around under it. It's likely some objects in the scene were made in this way which is why they focused more on the "real" parts of things, like the guns, the tank, the planes, the eyes, and which is why they made it clear if you make objects properly, turning on ray tracing is easy.
That will inevitably happen; they just want to ride the hype train and bait the pre-order culture at the moment. Can't really blame them: if people want some new shiny thing sooo badly that they pay for it before it even exists, they deserve disappointment. TBF, I am excited for the new cards and eagerly await actual reviews.
But the card isn't designed to be significantly faster at everything. The card is specifically designed to be faster with ray tracing which creates a superior graphical experience. You are asking for something that you know wasn't the thing which was focused on when the card was being designed.
My guess is that because the 20-series is so vastly different from the 10-series, there are a lot of issues with existing games that still haven't been addressed, which could create all kinds of frame-rate issues, etc. 20 games coming out with ray tracing does not make me want to upgrade my 1080 Ti.
Is there any point in investing in the RTX line when only a handful of games are going to support this ray-tracing thing within the cards' 2-year life cycle?
It felt very unimpressive. Most of the terminology and lighting talk was on point (to be fair) but overall uninteresting. The demo that had to hold 60fps dipped into the 40s at some points. Lara's demo had a noticeable fps drop during the explosion sequence as well. The flame reflection on the black cab looked unrealistic (to me at least).
Well, I can imagine giving the same talk twice (or thrice, though thrice is quite boring for the presenter), but delivering virtually the same keynote I cannot comprehend.
I always sell my cards to friends or on Letgo or eBay and buy newer ones, but these are not worth it. I always wait 2 years or so before upgrading, but these cards? No, not for me.
I don't buy a powerful GPU for realism; I buy it for max fps in multiplayer games. Gonna wait till others test the card out. Remember the Xbox teraflops marketing? Yeah, still stuck at 30 fps.
Then just buy a GTX 960 and put all the settings on low... you'll have all the FPS you want. The whole point of buying a powerful GPU is to increase realism: real-time lighting, reflections, texture resolution, particle counts and many other things that make the game look more realistic/pretty/impressive, whatever you want to call it.