Shadowstalker16: Games can only be optimized to a certain extent. There will always be some hardware that will not be powerful enough to run a game, especially if the game is a modern 3D game with the latest visual effects. Models and textures, for example, can only scale down so much, and I doubt the devs of any game can make it so that their game will run on something with outdated specifications without compromising the game elsewhere.

What do you think about CDPR releasing unoptimized versions of CP2077? I think if support for weaker hardware goes in that direction, i.e. making another, inferior version of the game entirely from scratch just to run on it, it would be a huge step backwards. It would be like the early console days, where the PS1, N64, PS2 and Xbox versions were all different and the player had to choose his poison in terms of optimization and the definitive version of the game.

In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine that it would depend on the skill of the people coding it and their publisher's will when it comes to what and where they invest their time. If the design philosophy is just to vomit out open, unoptimized worlds, I'd imagine quantity would take priority over (optimization) quality.
I agree with this.
I'm not thinking about "let's optimize our next game so it can run on both potatoes and high-end rigs equally well".
I'm thinking more along the lines of having onboard GPUs as playable targets when designing their next game engine. Asking otherwise would not be practical, I think.
Orkhepaj: But many games are not playable at all with that hardware, and it would make no sense to limit games to that level.
skeletonbow: Indeed, and many games choose to not support Intel integrated video because of that, which makes sense. Personally I think all games would be better off if they did that, as all of the development resources that went into supporting potato onboard graphics could be spent optimizing and improving the game on the actual discrete gaming cards that everyone uses.

And as counter-intuitive as it might be, I say that as someone owning thousands of games who has an outdated 7-year-old AMD GPU and a laptop with low-end (by today's standards) onboard Intel and nvidia GPUs. So my opinion would actually hurt me more than help me, but it'd be better for gaming overall, keeping things moving forward and with higher quality.

I'd be getting new hardware if it was actually possible to get it in 2021... Perhaps they'll end up having to support old potato hardware in another year because new hardware only exists in PDF datasheets and not on store shelves. :)
I remember back when 3DFX with Voodoo, NVidia, ATI and all of those cards were taking off - namely because Intel didn't want to support T&L, because it would've cost both them and consumers/users more money. Why waste resources?

Remember, a lot of Intel's money was made not just off the consumers, but also off PCs for businesses that would NOT need any sort of GPU/iGPU. So, to save them money, why put an iGPU in their business PCs? Why sell iGPUs to users who just won't use them?

Some games just really aren't built around that aging and/or weak hardware, especially when they are supporting open worlds, AI, ray tracing, and other stuff.
thegreyshadow: I agree with this.
I'm not thinking about "let's optimize our next game so it can run on both potatoes and high-end rigs equally well".
I'm thinking more along the lines of having onboard GPUs as playable targets when designing their next game engine. Asking otherwise would not be practical, I think.
Ah, you still don't get it; it is not practical to target such a weak GPU for most AA and AAA games.
There are plenty of games which can run on those, though - you should play them.
Orkhepaj: Stupid crypto should be outlawed. Does it bring anything good for humanity? I don't think so.
Some lucky people got rich, many others got scammed, and many criminals enjoy its benefits.
What counts as "stupid" crypto versus "smart" crypto?

Also, without crypto your purchases would be unencrypted, which means someone who can intercept your network connection (for example, your ISP) could easily get your credit card info and use it to make fraudulent purchases.
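(A minimal sketch of the cryptography being referred to here, assuming Python's standard-library ssl module; the hostname www.gog.com is only an illustrative example, and any HTTPS host would do. It shows the TLS layer that keeps a purchase request unreadable to anyone sitting between you and the store.)

```python
# Sketch of the TLS ("crypto") layer that shields a purchase from
# eavesdroppers such as an ISP. Standard library only.
import socket
import ssl

context = ssl.create_default_context()  # also verifies the server certificate

with socket.create_connection(("www.gog.com", 443)) as raw_sock:
    # After the handshake, everything on the wire is ciphertext;
    # an interceptor sees encrypted bytes, not card numbers.
    with context.wrap_socket(raw_sock, server_hostname="www.gog.com") as tls:
        print("Negotiated:", tls.version(), tls.cipher()[0])
        tls.sendall(b"HEAD / HTTP/1.1\r\nHost: www.gog.com\r\n"
                    b"Connection: close\r\n\r\n")
        print(tls.recv(200).decode(errors="replace"))
```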
Orkhepaj: Stupid crypto should be outlawed. Does it bring anything good for humanity? I don't think so.
Some lucky people got rich, many others got scammed, and many criminals enjoy its benefits.
dtgreene: What counts as "stupid" crypto versus "smart" crypto?

Also, without crypto your purchases would be unencrypted, which means someone who can intercept your network connection (for example, your ISP) could easily get your credit card info and use it to make fraudulent purchases.
What? :D That is clearly not how it works.

Every crypto where Chinese cryptofarms control the integrity is a stupid crypto.
Post edited March 08, 2021 by Orkhepaj
You have an Intel UHD 620?! My potato Intel HD 4400 laughs at you. :D

Seriously now, I guess it's all about market share.
It takes time and resources to optimize code to run on older hardware, and most developers can't be bothered with that if gamers are willing to sell a kidney for the latest fancy GPU.

I'm not getting a better potato or buying a GPU just so I can play games (plenty of retro stuff to keep me busy), but grumpy old people are not a target demographic for any game except Farmville :P

On a side note, you might want to consider an AMD CPU with integrated graphics next time you upgrade. They blow Intel stuff out of the water, even entry-level mobile chips. As long as you don't mind lack of Linux support!
Orkhepaj: Stupid crypto should be outlawed. Does it bring anything good for humanity? I don't think so.
Some lucky people got rich, many others got scammed, and many criminals enjoy its benefits.
dtgreene: What counts as "stupid" crypto versus "smart" crypto?

Also, without crypto your purchases would be unencrypted, which means someone who can intercept your network connection (for example, your ISP) could easily get your credit card info and use it to make fraudulent purchases.
Orkhepaj means crypto as in Crypto-Currency (or crypto-highly-speculative-investments as they would be more accurately named), not crypto as in cryptography.
dtgreene: What counts as "stupid" crypto versus "smart" crypto?

Also, without crypto your purchases would be unencrypted, which means someone who can intercept your network connection (for example, your ISP) could easily get your credit card info and use it to make fraudulent purchases.
pds41: Orkhepaj means crypto as in Crypto-Currency (or crypto-highly-speculative-investments as they would be more accurately named), not crypto as in cryptography.
Yup, I haven't seen anybody use crypto to mean cryptography. Anyone can search for crypto online and see that all of the results are cryptocurrencies.

zwolfy: You have an Intel UHD 620?! My potato Intel HD 4400 laughs at you. :D

Seriously now, I guess it's all about market share.
It takes time and resources to optimize code to run on older hardware, and most developers can't be bothered with that if gamers are willing to sell a kidney for the latest fancy GPU.

I'm not getting a better potato or buying a GPU just so I can play games (plenty of retro stuff to keep me busy), but grumpy old people are not a target demographic for any game except Farmville :P

On a side note, you might want to consider an AMD CPU with integrated graphics next time you upgrade. They blow Intel stuff out of the water, even entry-level mobile chips. As long as you don't mind lack of Linux support!
My Intel HD 4600 is considerably more powerful than my Broadcom VideoCore VI. (I determined this with a program I wrote: it takes a lot longer for the VC6 (in a Raspberry Pi 4) to render the Mandelbrot set than this old Intel chip with integrated graphics, and that was when the Intel system still had cooling issues.)
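(For the curious: dtgreene's actual program isn't shown in the thread and presumably ran on the GPU, e.g. via OpenGL. The following is only a rough stand-in, assuming NumPy, that times the same escape-time Mandelbrot workload on the CPU to show what such a benchmark measures.)

```python
# Time an escape-time Mandelbrot render, the kind of workload the
# post above uses to compare GPUs. CPU-side NumPy stand-in only.
import time
import numpy as np

def mandelbrot(width=1024, height=768, max_iter=256):
    # Complex grid covering the classic view of the set.
    xs = np.linspace(-2.5, 1.0, width)
    ys = np.linspace(-1.25, 1.25, height)
    c = xs[np.newaxis, :] + 1j * ys[:, np.newaxis]
    z = np.zeros_like(c)
    counts = np.zeros(c.shape, dtype=np.int32)  # escape time per pixel
    for _ in range(max_iter):
        mask = np.abs(z) <= 2.0           # points that have not escaped yet
        z[mask] = z[mask] ** 2 + c[mask]  # iterate z = z^2 + c where still bounded
        counts[mask] += 1
    return counts

start = time.perf_counter()
image = mandelbrot()
print(f"Rendered {image.shape[1]}x{image.shape[0]} in {time.perf_counter() - start:.2f}s")
```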

Also, AMD GPUs work fine on Linux, and don't need proprietary drivers unlike nvidia ones; I've even played a few games on the AMD 3500U's integrated graphics.

dtgreene: What counts as "stupid" crypto versus "smart" crypto?

Also, without crypto your purchases would be unencrypted, which means someone who can intercept your network connection (for example, your ISP) could easily get your credit card info and use it to make fraudulent purchases.
pds41: Orkhepaj means crypto as in Crypto-Currency (or crypto-highly-speculative-investments as they would be more accurately named), not crypto as in cryptography.
To me, when I see "crypto" the first thing I think of is cryptography; currency doesn't enter the picture.

Of note, nvidia has decided to deliberately reduce the Ethereum mining speed of their consumer GPUs and create a different product, CMPs, for those who want to mine. While I think cryptocurrency is unethical (due to the environmental cost from using all that power), I also think this "solution" of nvidia's is unethical, as it's basically a form of DRM.
Post edited March 08, 2021 by dtgreene
dtgreene: Of note, nvidia has decided to deliberately reduce the Ethereum mining speed of their consumer GPUs and create a different product, CMPs, for those who want to mine. While I think cryptocurrency is unethical (due to the environmental cost from using all that power), I also think this "solution" of nvidia's is unethical, as it's basically a form of DRM.
How is it DRM? :O I clearly can't see the connection.
dtgreene: Of note, nvidia has decided to deliberately reduce the Ethereum mining speed of their consumer GPUs and create a different product, CMPs, for those who want to mine. While I think cryptocurrency is unethical (due to the environmental cost from using all that power), I also think this "solution" of nvidia's is unethical, as it's basically a form of DRM.
Orkhepaj: How is it DRM? :O I clearly can't see the connection.
It's an artificial limitation on how the user can use the hardware they bought, which I consider to basically be a form of DRM.

(Also, note that this isn't the only arbitrary limitation nvidia has imposed on consumer GPUs; they don't play well with GPU passthrough, for example.)
Orkhepaj: How is it DRM? :O I clearly can't see the connection.
dtgreene: It's an artificial limitation on how the user can use the hardware they bought, which I consider to basically be a form of DRM.

(Also, note that this isn't the only arbitrary limitation nvidia has imposed on consumer GPUs; they don't play well with GPU passthrough, for example.)
Their loss; I'll probably just buy another AMD next time.
zwolfy: On a side note, you might want to consider an AMD CPU with integrated graphics next time you upgrade. They blow Intel stuff out of the water, even entry-level mobile chips. As long as you don't mind lack of Linux support!
Great advice, but Linux support is a must, and preferably with open source drivers.
zwolfy: On a side note, you might want to consider an AMD CPU with integrated graphics next time you upgrade. They blow Intel stuff out of the water, even entry-level mobile chips. As long as you don't mind lack of Linux support!
thegreyshadow: Great advice, but Linux support is a must, and preferably with open source drivers.
Except that AMD integrated graphics works well with the open source drivers on Linux, unlike nvidia's.
thegreyshadow: Great advice, but Linux support is a must, and preferably with open source drivers.
dtgreene: Except that AMD integrated graphics works well with the open source drivers on Linux, unlike nvidia's.
That was my impression too. Seems like my next device will be powered by an AMD CPU/GPU.