lupineshadow: See, here's the problem, I don't drive, and have no interest in cars.

But just out of interest, since I looked it up, why are all the Honda Civics called the same? Surely the 1972 Honda Civic is not anything like the 2021 Honda Civic? Why is the same name used for a very different car?
That's when they do the confusing thing of introducing different package names. DX, SE, LE, II, and so on.

But at the end of the day, there's a big bold badge, and typically other markings that identify it as such; it's a Honda Civic because Honda says so.

W1ldc44t: I mean, you say that, but a world where a 1660 is inferior to a 1080 kind of makes labels totally useless, doesn't it? Nobody can be sure of anything just by comparing names once one such comparison is invalidated.
And it still doesn't readily identify who made it. Is the 3499T made by Company A, B, or C?

Sure, the Dell Optiplex 9010 Mini-Tower isn't exactly a creative name, but it readily identifies an exact model and default specification set.
Post edited December 06, 2022 by Darvond
Numbers are nothing but numbers.
Remember when computer magazines sold the January Edition actually in January?

One then started to sell the 01/20xy edition in December (20xy-1), while all the others were still selling 12/(20xy-1).

Customers thought that edition was more up to date than the others so everyone started renumbering their editions to match the numbers of the other one.

The second Xbox was called the Xbox 360. Why? Because Sony had a "PlayStation 3" in stores; an "Xbox 2" would have looked inferior.

Judging a product by a number in its name is complete bollocks, especially if you compare them between different companies. They only help you to identify the product. The stats and values you have to look up for yourself.
neumi5694: Judging a product by a number in its name is complete bollocks, especially if you compare them between different companies. They only help you to identify the product. The stats and values you have to look up for yourself.
They don't help you identify the product. Nvidia introduced a video card under the same number for a third time recently.
Darvond: They don't help you identify the product. Nvidia introduced a video card under the same number for a third time recently.
The best you can do is look at tech sites that include the "relative performance" chart of your GPU:-

https://www.techpowerup.com/gpu-specs/radeon-pro-wx-3100.c2978

Even some of the slowest GPUs that require no power connector at all (e.g., the 75W GTX 1650 / RX 6400) are more than double the speed of your 65W WX 3100, and they're still available in tiny single-fan ITX versions. A typical "low-end" GPU that's still fairly low wattage, e.g., the 120W RX 6600, will be 4-5x the speed with 2x the VRAM vs what you have now, and will still run cool & quiet under load.
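To put those figures side by side, here's a minimal sketch in Python. The relative speed multipliers are the rough estimates quoted above (not measured benchmark data), and the wattages are the board powers mentioned in the post:

```python
# Perf-per-watt sketch using the rough multipliers quoted above.
# Speed figures are the post's estimates, not benchmark results.

def perf_per_watt(speed: float, watts: float, baseline_watts: float = 65) -> float:
    """Relative performance normalised by power draw vs the 65 W WX 3100."""
    return speed / (watts / baseline_watts)

cards = {
    # name: (board power in watts, speed relative to the WX 3100)
    "Radeon Pro WX 3100": (65, 1.0),
    "GTX 1650": (75, 2.0),   # "more than double", slot-powered
    "RX 6400": (75, 2.0),    # likewise, no power connector needed
    "RX 6600": (120, 4.5),   # "4-5x the speed"
}

for name, (watts, speed) in cards.items():
    print(f"{name}: {speed:.1f}x speed, {perf_per_watt(speed, watts):.2f}x perf/W")
```

Even normalised for power, the newer cards come out well ahead, which is the point of checking a relative-performance chart rather than the model number.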
Post edited December 07, 2022 by AB2012
neumi5694: Judging a product by a number in its name is complete bollocks, especially if you compare them between different companies. They only help you to identify the product. The stats and values you have to look up for yourself.
Darvond: They don't help you identify the product. Nvidia introduced a video card under the same number for a third time recently.
Depends on the context. You won't find a graphics card from 1998 (new) in a store.
Darvond: They don't help you identify the product. Nvidia introduced a video card under the same number for a third time recently.
So if I say "Nvidia", you have just as much useful information as when I say "Nvidia 2060"?
I mean:
With option 1, I think there are many dozens of graphics cards.
With option 2, there are three.

How is that not helping to identify the card?
AB2012: The best you can do is look at tech sites that include the "relative performance" chart of your GPU:-

https://www.techpowerup.com/gpu-specs/radeon-pro-wx-3100.c2978
I would like to add this website as well.

https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

The legacy list is pretty handy if you are looking for something older.


Re. naming schemes, while they are weird, that's not unique to GPUs; CPUs follow the same kind of naming scheme. At least in the x86 space they somehow make sense. Try to get any idea of performance on ARM from the names or numbers, I dare you :D
Post edited December 07, 2022 by Dark_art_
neumi5694: So if I say "Nvidia", you have just as much useful information as when I say "Nvidia 2060"?
I mean:
With option 1, I think there are many dozens of graphics cards.
With option 2, there are three.

How is that not helping to identify the card?
Because, much like Doom (1993) and Doom (2016), we now have to slap an additional qualifier onto them. Not to mention, don't a lot of video cards come with the same specs but different memory amounts?
Darvond: Not to mention, don't a lot of video cards come with the same specs but different memory amounts?
Not only different memory amounts but also performance. I know of at least 4 variants of the Nvidia GT 710, mixing different shader performance/counts and memory capacity/speed.

It's even funnier when OEMs like HP and Dell make their own GPU boards and mix/limit stuff even more.

Should we even start to talk about laptops?
Post edited December 07, 2022 by Dark_art_
I haven't upgraded my GPU/graphics card in years.

Even the most recent games I own still run decently well, and for most of my games, the card I have is already way overpowered, anyway.
And since I have no interest in the newest raytracing fad, etc...I see no need to upgrade.
BreOl72: I haven't upgraded my GPU/graphics card in years.

Even the most recent games I own still run decently well, and for most of my games, the card I have is already way overpowered, anyway.
And since I have no interest in the newest raytracing fad, etc...I see no need to upgrade.
I'm sure it adds a little something, but honestly, improvements in graphics technology long ago passed the point of greatly diminishing returns for me: they'll spend countless man-hours optimizing things that yield only minute degrees of appreciation on my end.

Plus, as I got older, I realised that probably for many kinds of games I enjoy playing, ultra-realistic graphics is actually noise that can take focus away from the gameplay. For example, in strategy games, 3D often means that things like trees will obscure parts of the map.

At first, it was cool in the 90s when I discovered first-person games, but by the early 2000s, I found it a little depressing that practically ALL games at the time, it seemed, were going for ever more realistic 3D graphics without asking whether that was the right decision for the kind of game being made. Aesthetically, I didn't find it that pretty a lot of the time.
Post edited December 07, 2022 by Magnitus
Darvond: Because, much like Doom (1993) and Doom (2016), we now have to slap an additional qualifier onto them. Not to mention, don't a lot of video cards come with the same specs but different memory amounts?
So saying "a game from id Software" is worth just as much as saying "Doom"?
I don't know about you, but if someone says "Doom", I can pretty much exclude Wolfenstein.

Sure, to come to a final conclusion, more information is needed. But it's a first step in the right direction, which usually is considered to be helpful.
Post edited December 07, 2022 by neumi5694
Darvond: Not to mention, don't a lot of video cards come with the same specs but different memory amounts?
They used to in some cases. We're talking 1050 Ti (? iirc) and 570 (or 470, since the 570 was a rebrand) era though, where cards had exact multiples of memory, e.g. 4/8 GB or 2/4 GB. Even then you got same-model cards with different specs, like the 1060, where the 1060 6GB had different core counts etc. from the 3GB model. There are also some with faster binned memory, i.e. higher clock speeds/bandwidth than default; I have a Vega 64 with memory about 20% faster than the reference default, for example. Still a Vega 64.

If you're doing what Nvidia has been doing recently and selling a 12GB '4080' and an 8GB '3060' instead of 16GB and 12GB respectively, other specs have to change too, due to the way memory functions: the bus width (practically) has to match the memory configuration, and that is a physical difference. If it doesn't, you get a repeat of the 970, with 3.5GB of fast VRAM and 0.5GB of s l o w VRAM on the same card, and Nvidia gets sued. Those 4080 (as was) and 3060 cards have other differing specs too, in their cases.

(Really though, the 4080 12GB was clearly what it's now marketed as, a 4070 Ti. The new 3060 8GB is really a 3050 Ti. They don't want to market them that way because higher model number --> more money. AMD has done similar on at least one occasion fairly recently as well, with a 580 model with fewer shaders than the base model; but that card was meant to be just for China.)
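The bus-width point above can be sketched in a few lines. With GDDR6-class memory, each 32-bit memory controller typically drives one memory chip, so total VRAM comes in multiples of (bus width / 32) x chip capacity; this is why cutting a 4080 from 16GB to 12GB meant a physically narrower 192-bit bus:

```python
# Sketch of the bus-width constraint described above: one memory chip
# per 32-bit memory controller, so VRAM = (bus_width / 32) * chip size.

def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    """Total VRAM for a uniform memory configuration."""
    assert bus_width_bits % 32 == 0, "one chip per 32-bit controller"
    return (bus_width_bits // 32) * chip_gb

# RTX 4080 16GB: 256-bit bus with 2 GB chips -> 16 GB
# RTX 4080 12GB: 192-bit bus with 2 GB chips -> 12 GB
print(vram_gb(256, 2))  # 16
print(vram_gb(192, 2))  # 12
```

So the 12GB card isn't just "the same card with less memory": the narrower bus also means less memory bandwidth, which is a real hardware difference.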
Dark_art_: Should we even start to talk about laptop's?
Ye gods, no. I'm already unpleasantly aware of what a terribly confusing whirlwind those are.
Magnitus: Plus, as I got older, I realised that probably for many kinds of games I enjoy playing, ultra-realistic graphics is actually noise that can take focus away from the gameplay. For example, in strategy games, 3D often means that things like trees will obscure parts of the map.
And such nonsense! I have strategy titles that, if left unchecked, I'm sure would easily draw 400-500 watts!!! Like, what are we talking about? I understand it's modern to include everything these days, from ultra-realistic shadows to human-like unit behaviour etc., but I wish they would invest half that budget in AI development, without the need for quantum computing or w/e!!