
Abishia: [...]
not even worth buying for the tiny gain.
[...]
worth is subjective
Depends on the generation. 2080 vs 1080 was not a worthwhile improvement, maybe not even 980 to 2080. The 3080, however, was roughly an 80% jump. Just plan your upgrades accordingly.

Also, evil or not, developers stop optimizing for older hardware pretty quickly, and so does Nvidia with their drivers.
It's not always up to you. I got two games for Christmas last year, and when I got around to installing them... they refused to run! Sometimes newer games demand newer processors. Even if the gain is 0 FPS, if the software refuses to work with your old GPU, you're stuck.

In my case, Deathloop (Arkane) wouldn't run on my old GTX 660 because it couldn't create a DirectX 12 device. Eventually you'll upgrade or choose not to play new games.
rtcvb32: If you really need the gain, though, you adjust your video settings to balance resolution vs. effects vs. speed until you find a middle ground you're happy with.
The only settings people are happy with are max settings, don't even dare mention it.

Anti-aliasing and ambient occlusion usually carry a very high speed penalty; it's not unrealistic to expect a game to run 50 to 100% faster and still look good after tweaking a few settings.
I still have a GTX 1080 and I'm quite happy with it for 1080p gaming. Of course I'm not into the RTX stuff and think regular non-RT global illumination systems still look darn good enough (see RDR2).
tritone: In my case, Deathloop (Arkane) wouldn't run on my old GTX 660 because it couldn't create a DirectX 12 device. Eventually you'll upgrade or choose not to play new games.
Second-generation Maxwell (the 900 series, released in 2014) introduced support for Direct3D 12 Feature Level 12_1, which is all you need to play any modern game with "RTX off" these days. We're still a ways off from that changing, and cards are more likely to hit performance obsolescence before it does, IMHO.
Post edited December 04, 2022 by WinterSnowfall
doh, completely misread
Post edited December 04, 2022 by neumi5694
pds41: It's not as simple as you think. There are lots of scenarios where upgrading makes sense. For example, someone who has a 9 or 10 series; getting a 30 series makes sense (although less so when you consider the 40 series is coming out soon).
Namur: That's exactly what I did: swapped a GTX 970 for a 3060 Ti, and my 2015 machine is good to go until 2025. The €400 I dropped on the card will give me considerably more latitude for the next three years, at which point I'm planning to retire my current PC (but likely not the 3060 Ti; since I plan to stick with 1080p, it may very well go into the new PC).
Yeah - that's a pretty good upgrade and the 3060Ti is great for 1080p. My last upgrade was about 3 years ago - I went from a 6 series (GTX 660) to a 2070S. Provided I stick at 1080p, I think that should last me until the 50 or 60 series comes out (possibly beyond).
Abishia: would you ever upgrade a card, paying another $800, to gain what, 20% more speed?
I've kind of already answered that question, but I'll give it another go. I wouldn't upgrade from a 2080Ti to a 3080Ti, but that's not the buying decision I face. GPUs all offer a different performance-per-pound ratio; as a consumer, you make a decision and buy according to what you can afford and what you value. I'm not driven by the need to have the fastest card but by the best value, so I sit around the 60Ti/70 range. I also keep GPUs running for a number of generations before upgrading. So I wouldn't ever be spending $800 for a "20% performance increase" (although I think putting it down to a single % figure is too simplistic); I'd likely be spending £400-500 for a huge performance increase and a vastly expanded feature set.
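A back-of-the-envelope way to frame that value comparison is FPS per unit of currency. The card tiers and numbers below are entirely made up for illustration, not real benchmark or price figures:

```python
def perf_per_currency(avg_fps: float, price: float) -> float:
    """Average benchmark FPS per unit of currency spent (higher is better value)."""
    return avg_fps / price

# Hypothetical cards with invented FPS and price figures, for illustration only.
cards = {
    "flagship": perf_per_currency(avg_fps=120, price=1200),
    "x070-class": perf_per_currency(avg_fps=100, price=550),
    "x060Ti-class": perf_per_currency(avg_fps=90, price=450),
}

best_value = max(cards, key=cards.get)
print(best_value)  # with these made-up numbers, the mid-range tier wins on value
```

Of course, a single ratio like this captures only raw speed; it ignores the feature set, VRAM and longevity arguments made above.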

As for GPU prices, unfortunately, everything is getting more expensive. All things being equal, in developed economies we generally expect prices to double every 25 years without external factors; for computers, since the 1990s we've been quite lucky that technological advances have also brought reduced costs. Unfortunately, in the last few years the crypto idiots, electric cars, the Wuhan Coronavirus and the Russian invasion of Ukraine have all added upward pressure on prices, especially for computer components.
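Doubling every 25 years sounds dramatic but works out to a modest annual rate; a quick sanity check (plain compound-growth arithmetic, not figures from any official source):

```python
# If prices double over 25 years, the implied annual rate r satisfies
# (1 + r)**25 == 2, i.e. r = 2**(1/25) - 1.
years = 25
annual_rate = 2 ** (1 / years) - 1
print(f"{annual_rate:.2%}")  # roughly 2.8% per year
```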

Ultimately, if a new GPU falls outside what you can afford and what you value, don't buy one, and keep your old one running a few more years.
pds41: Yeah - that's a pretty good upgrade and the 3060Ti is great for 1080p. My last upgrade was about 3 years ago - I went from a 6 series (GTX 660) to a 2070S. Provided I stick at 1080p, I think that should last me until the 50 or 60 series comes out (possibly beyond).
Absolutely, the 2070S is a great card; at 1080p it should carry you nicely for a good few years. I actually considered the 20 series when I was shopping around, but it never happened: prices were crazy considering the series' age, and I never spotted one with a decent enough discount to justify picking one up (the 3060 Ti popped up on my radar at a substantial discount, which is why I jumped on it).

Prices still seem crazy across the board. The 40 series cards are available in stores around here, but prices for the 30 series and below either haven't budged or have actually gone up in some cases.
tritone: It's not always up to you. I got two games for Christmas last year, and when I got around to installing them... they refused to run! Sometimes newer games demand newer processors. Even if the gain is 0 FPS, if the software refuses to work with your old GPU, you're stuck.

In my case, Deathloop (Arkane) wouldn't run on my old GTX 660 because it couldn't create a DirectX 12 device. Eventually you'll upgrade or choose not to play new games.
I'm in a similar position with Total War: Warhammer 3 at the moment. Yes, the 2060 Super I'm using still does the trick. I can choose to run the game normally, which spikes my overclocked card well above 200 watts, running hot at 120% of its power limit; or I can open the bag of tricks, lowering the resolution and capping the FPS, and settle for a less-than-ideal experience.

In any case, this game is only a foretaste of what gaming will be like in the coming two or three years.

So if you're set on playing modern, current-day games, you will have to upgrade sooner or later.

For myself, I've been checking out these 3080 prices as well. We have deals at Megekko, one of Nvidia's official Dutch supply channels, starting at €800.
https://www.megekko.nl/product/0/1015342/PNY-Geforce-RTX-3080-XLR8-Gaming-UPRISING-EPIC-X
Maybe Santa will be nice this year
Dark_art_: The only settings people are happy with are max settings, don't even dare mention it.
Well, that's stupid. Comparing ultra and max especially, I see very little visual difference; between medium and max there's a fair difference, but it's easier to identify things on the screen without the extra lighting effects.

I enjoy starting from the lowest quality and working my way up until I find settings I can be happy with (although I do set models/textures to high, since that's generally a memory issue more than a speed issue). Even on low settings, games should run and look decent.

Dark_art_: Anti-aliasing and ambient occlusion usually carry a very high speed penalty; it's not unrealistic to expect a game to run 50 to 100% faster and still look good after tweaking a few settings.
AA has its uses, but for games I don't find it that useful. As for ambient occlusion, I've tried it in one game, and the penalty was too high versus baked shadows/lighting. (Of course, I run that game at 720p to get 60 FPS.)
Post edited December 05, 2022 by rtcvb32
I only recently upgraded, from my old AMD (formerly ATI) Radeon HD 5770 to an Nvidia RTX 3080.
Without getting into the topics of relative worth/value, or appointing oneself judge of what other people do with their money, here is what I got by replacing a 2060 6GB ($350 in 2019) with a 3080 12GB ($800 in 2022):

In very modded (heavy ENB, lots of textures) 4K Skyrim (LE and SE) and Fallout 4, I went from 24-30 FPS to 75-90 FPS.

Interpret as you will.
Might not be a popular opinion, but I try to leverage my hardware until it breaks (or I feel it is likely to break if I can't afford to be without it for close to a month).

There was a big hardware boom until about 10 years ago, when people had to upgrade their computers continuously. As thrilling as it was for the advances in computing it allowed, I'm glad it's over, because it was also an environmental disaster (and back then a smaller percentage of the world could afford computers... I don't even want to think what it would have been like if it had happened nowadays).

It would be much better for our future if people minimized their hardware usage and leveraged the hardware they've got for a decade or more.

The only time I "upgraded" my GPU was when I got a desktop GPU in 2018 to try to make my laptop work with an external GPU (spoiler alert: it was easy on Windows, hard on Linux). My goal was to get an external GPU working with GPU passthrough on a Windows VM with a Linux host, in order to play Windows games on a Linux laptop without a discrete GPU (it was before the pandemic, I really needed a laptop, but I always buy an all-purpose powerful workstation laptop that I want to use for 7+ years and leverage as much as possible).

Ultimately, I caved and built a Windows machine around the GPU I got (thanks to the non-triviality of the endeavor and time constraints because of work). I consider that an excess on my part. Short of another hardware revolution (which I hope doesn't happen, or at least not at the pace of the last one, where most people felt compelled to replace a computer less than 5 years old), I intend to keep that Windows machine for gaming until at least 2028.
Post edited December 05, 2022 by Magnitus
Where is there?

Anyway...

I went from a 780 to a 1080 Ti and I'm still riding that one out. I could probably upgrade at this point, but with so few new demanding games I'm interested in, I can still postpone it. I instead opted for a higher refresh rate display, so I'm currently sitting at 1440p 240 Hz. For most games, the 1080 Ti is still perfectly capable; the framerate can drop to double digits in some more demanding titles, but it usually sits in 100+ FPS territory. The best value card I've owned so far.

I was planning to upgrade to the 40 series, but at this point, I'll probably put it off as long as possible and build an entirely new PC in a couple of years. Or when something eventually stops working.
Post edited December 06, 2022 by idbeholdME
There are more use cases for GPUs than games, just saying. Speed gains are great, of course, but going from 8 to 16/24/48 GB of VRAM can make a huge difference for productive uses.

I'd agree that the differences between two subsequent generations often aren't really worth the upgrade price, though... I bought a new GPU recently, but if I only needed mine for games I'd still be (mostly) fine with my RX 480.