
timppu: I don't think anyone is against backwards compatibility. PS2 was great because it seemed to have near flawless backwards compatibility to PSOne library.
Actually, as I understand it, the PS2 had the original PS1 hardware built into it, so when it detected a PS1 game it switched to that hardware. So it wasn't nearly flawless, it was flawless, at least as far as I understand it.
timppu: I don't think anyone is against backwards compatibility. PS2 was great because it seemed to have near flawless backwards compatibility to PSOne library.
rtcvb32: Actually, as I understand it, the PS2 had the original PS1 hardware built into it, so when it detected a PS1 game it switched to that hardware. So it wasn't nearly flawless, it was flawless, at least as far as I understand it.
The joypad controller chip in the PS2 is practically a full PS1 as it was cheaper to shrink the hardware than design something new, and as a side effect it's nearly 100% compatible with the old games, memory cards, and peripherals.

The first-edition PS3, on the other hand, included PS2 hardware only to be backwards compatible; keeping it in there served no other function, which is why it was dropped from later revisions to cut production costs.
Post edited March 04, 2016 by Maighstir
rtcvb32: Offhand I don't see a disadvantage. But there's always the possibility that, say, the newer model has 4 more cores, games get made assuming the system has 12 cores instead of 8, and then someone with an older system tries to run one. It doesn't exactly crap out, but it drops to 20fps due to bad assumptions. Or a game starts requiring more memory, and the older system has to fall back on virtual memory/swap space to cover what's missing.

If games HAVE to be made so they're playable on the original hardware, then better hardware can only give better performance. The reverse, however, isn't always true.
hedwards: With the exception of power consumption and thermal dissipation, I'm not sure how any of that's going to work if it doesn't allow people to play games that they couldn't already play. Why would anybody pay for an upgrade?

Nintendo used to do the right thing in that regard with their GB line. The later revisions would play the same games, but with sharper graphics anyway. It's not something I foresee being reasonable with current-generation consoles, though. They were able to do it with the GB line because of the way the games were coded: the improvements didn't require the games themselves to be recoded.
4K capability.
Size reduction.
Bluetooth integration.
4K per eye wireless VR.
Home networking file transfer (for media)
Virtual home server.
New storage capabilities in the future (similar to the switch to FLASH - but maybe something newer and awesomer is in the works)
Vertical and horizontal mounting options
New wireless technologies (including wireless video)
Auto-VPN from mobile and other xbox devices/stream your media from a distance (that may just be software instead of needing hardware)
Faster processors that allow playing two games at once on two TVs from 2 HDMIs on the same system. Hey, one can dream. Or even two HDMIs from one system to play a multiplayer game on 2 TVs. That could be awesome. Heck, 8 TVs.
Hardware to allow 3 screens for surround video

I think that's enough to think about for a minute.
hedwards: With the exception of power consumption and thermal dissipation, I'm not sure how any of that's going to work if it doesn't allow people to play games that they couldn't already play. Why would anybody pay for an upgrade?

Nintendo used to do the right thing in that regard with their GB line. The later revisions would play the same games, but with sharper graphics anyway. It's not something I foresee being reasonable with current-generation consoles, though. They were able to do it with the GB line because of the way the games were coded: the improvements didn't require the games themselves to be recoded.
Tallima: 4K capability.
Size reduction.
Bluetooth integration.
4K per eye wireless VR.
Home networking file transfer (for media)
Virtual home server.
New storage capabilities in the future (similar to the switch to FLASH - but maybe something newer and awesomer is in the works)
Vertical and horizontal mounting options
New wireless technologies (including wireless video)
Auto-VPN from mobile and other xbox devices/stream your media from a distance (that may just be software instead of needing hardware)
Faster processors that allow playing two games at once on two TVs from 2 HDMIs on the same system. Hey, one can dream. Or even two HDMIs from one system to play a multiplayer game on 2 TVs. That could be awesome. Heck, 8 TVs.
Hardware to allow 3 screens for surround video

I think that's enough to think about for a minute.
And those are all things that would require massive restructuring of the hardware and/or create a situation where the owners are having to worry about which revision they've got to see if a program will run on it. That's really not the market that consoles are in. People who don't mind that buy computers.

Also, nobody who isn't rich is going to pay for most of those things. For instance, 4K is for theaters and people producing video, not for consumers. By the time you get your nose close enough to see the pixels, you're already losing sight of the edges of the screen. I remember the first time I saw an HDTV big screen; I had to get pretty damn close before I could make out the pixels, and I've got excellent eyesight.
timppu: Aren't you now talking about something different? I think this was about your XBox being upgradeable (e.g. you could add more RAM to it or replace the CPU/GPU with a faster one), but you seem to talk about newer consoles being backwards compatible.
My interpretation of Microsoft's statement is that it's not going to provide user-installable upgrades, but rather update the console hardware "mid cycle" (what I'm not sure about is how they'd define a generation if they update regularly).
ET3D: My interpretation of Microsoft's statement is that it's not going to provide user-installable upgrades, but rather update the console hardware "mid cycle" (what I'm not sure about is how they'd define a generation if they update regularly).
If that is the case, then they have to figure out the pricing, as I presume there will be opposition to needing to buy new consoles more regularly to play the latest games, especially considering that GPU and CPU progress seems to have slowed down over the years (i.e. each new console generation feels less and less revolutionary, and PC gamers replace hardware less often too).

Anyway, more and more I'm starting to think MS is fantasizing about "XBox" being a streamed service free of platform shackles, one that would also include PC (gaming). These are merely steps toward that goal. But as before, MS might totally reverse its plans in the blink of an eye.
hedwards: Nintendo used to do the right thing in that regard with their GB line. The later revisions would play the same games, but with sharper graphics anyway. It's not something I foresee being reasonable with current-generation consoles, though. They were able to do it with the GB line because of the way the games were coded: the improvements didn't require the games themselves to be recoded.
All in all, the CPU/hardware didn't change except perhaps for the screen. As for the GBC, newer games included palette codes telling it which colors to map the 4-shade grayscale onto, and the GBC itself had built-in palette choices for all previously released games. It's not really that hard to come up with, unless the colors need to change mid-game for some reason.
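That palette scheme can be sketched in a few lines of C. This is illustrative only: the palette values below are invented for the example, not taken from any real GBC title, though the shape (2-bit shades indexing 15-bit colors) matches how the hardware's palettes work.

```c
#include <stdint.h>

/* Illustrative GBC-style palette: each of the original 4 grayscale
   shades (2 bits per pixel) maps to a 15-bit BGR555 color.
   These particular color values are made up for the sketch. */
static const uint16_t palette[4] = {
    0x7FFF, /* shade 0 (lightest) -> white      */
    0x03E0, /* shade 1            -> green      */
    0x0200, /* shade 2            -> dark green */
    0x0000  /* shade 3 (darkest)  -> black      */
};

/* Look up the color for a 2-bit shade; masking keeps the index in range. */
uint16_t colorize(uint8_t shade) {
    return palette[shade & 3];
}
```

The original grayscale game data is untouched; only the lookup table changes, which is why the GBC could ship built-in palettes for old games without any recoding.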

Reminds me a bit of the scheme used by Apple computers, where if you had a black-and-white monitor it would look good, and if it was color it looked good too, using the exact same code. It had to do with how the video was output and encoded to be interpreted, as I recall.

But we're not talking about a difference of screens; we're talking about larger, more impactful changes. Higher CPU speeds could affect how the physics work, making the game easier or harder. More RAM could expose bugs that went totally unnoticed with less memory and tighter memory management. More GPU cores may mean nothing, since code written against the fixed number known at the hardware's release could leave the extra GPU power idle, or become glitchier due to race conditions.

CPU speed could outright break the game. True, this isn't back when we had 20MHz systems, the 66MHz ones came out, and they had to include a Turbo button to limit the CPU so the older programs wouldn't zoom by. It's said that for every 10 lines of code there's a bug, and games and OSes made today are hundreds of millions of lines of code. Even tiny changes could make a huge difference. How much, I'm really not sure. It depends on how much everything relies on standards, standards we aren't told about, be they hardware, software, API, OS, or whatnot. We are totally in the dark.
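One common defense against exactly this failure mode is a fixed timestep: game logic advances in constant ticks no matter how fast the CPU renders frames. A minimal sketch in C (the names are illustrative, not from any real engine):

```c
/* Advance game logic in fixed 1/60 s ticks, independent of frame rate. */
#define STEP (1.0 / 60.0)

/* Consume `elapsed` real seconds from `*accumulator`, stepping the
   simulated position forward one fixed tick at a time. A faster CPU
   just renders more often; the simulation speed stays the same. */
double simulate(double *accumulator, double elapsed,
                double position, double velocity) {
    *accumulator += elapsed;
    while (*accumulator >= STEP) {
        position += velocity * STEP;   /* one fixed physics tick */
        *accumulator -= STEP;
    }
    return position;
}
```

Games that instead advanced physics once per rendered frame are the ones that "zoom by" on faster hardware, which is what the Turbo button papered over.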
Tallima: 4K capability.
Size reduction.
Bluetooth integration.
4K per eye wireless VR.
Home networking file transfer (for media)
Virtual home server.
New storage capabilities in the future (similar to the switch to FLASH - but maybe something newer and awesomer is in the works)
Vertical and horizontal mounting options
New wireless technologies (including wireless video)
Auto-VPN from mobile and other xbox devices/stream your media from a distance (that may just be software instead of needing hardware)
Faster processors that allow playing two games at once on two TVs from 2 HDMIs on the same system. Hey, one can dream. Or even two HDMIs from one system to play a multiplayer game on 2 TVs. That could be awesome. Heck, 8 TVs.
Hardware to allow 3 screens for surround video

I think that's enough to think about for a minute.
hedwards: And those are all things that would require massive restructuring of the hardware and/or create a situation where the owners are having to worry about which revision they've got to see if a program will run on it. That's really not the market that consoles are in. People who don't mind that buy computers.

Also, nobody who isn't rich is going to pay for most of those things. For instance, 4K is for theaters and people producing video, not for consumers. By the time you get your nose close enough to see the pixels, you're already losing sight of the edges of the screen. I remember the first time I saw an HDTV big screen; I had to get pretty damn close before I could make out the pixels, and I've got excellent eyesight.
And as with Kinect, 3MB video RAM cartridges and Nunchuks, MS will be able to clearly distinguish what is runnable on which hardware. XBOX One VR games will require the VR add-on, which you'll know you have because you'll have a VR add-on. The PS4 is doing it and nobody seems to have a problem.

Vertical and horizontal mounting options, home networking capabilities and multi-TV outputs would be simple to add without people going nuts that something's not working. If it's compatible, send different images. If it's not, send the same image to all TVs. Easy peasy.

The point MS is making is that hardware innovations can happen more easily when you have software that is malleable. When software is locked onto the hardware, it's nearly impossible to make changes and keep compatibility.

It's not a dumb idea to retain backwards compatibility while changing mounting, power or size options. It's brilliant to me. And no other console generation has ever pulled it off. Even the PS4 can't do it; they rely on cloud-based computing for it.

So, yes. There are add-ons and changes they can make. But also, it's not just about add-ons. It's about keeping the console running everything very stable with hardware changes to make the system cheaper, stabler, smaller, less power hungry and even possibly to the great fears of everyone, more feature-rich.


P.S. PS4 has 4k and soon VR and it's already finding a market. So this is all marketable stuff.
Post edited March 05, 2016 by Tallima
hedwards: Nintendo used to do the right thing in that regard with their GB line. The later revisions would play the same games, but with sharper graphics anyway. It's not something I foresee being reasonable with current-generation consoles, though. They were able to do it with the GB line because of the way the games were coded: the improvements didn't require the games themselves to be recoded.
rtcvb32: All in all, the CPU/hardware didn't change except perhaps for the screen. As for the GBC, newer games included palette codes telling it which colors to map the 4-shade grayscale onto, and the GBC itself had built-in palette choices for all previously released games. It's not really that hard to come up with, unless the colors need to change mid-game for some reason.

Reminds me a bit of the scheme used by Apple computers, where if you had a black-and-white monitor it would look good, and if it was color it looked good too, using the exact same code. It had to do with how the video was output and encoded to be interpreted, as I recall.

But we're not talking about a difference of screens; we're talking about larger, more impactful changes. Higher CPU speeds could affect how the physics work, making the game easier or harder. More RAM could expose bugs that went totally unnoticed with less memory and tighter memory management. More GPU cores may mean nothing, since code written against the fixed number known at the hardware's release could leave the extra GPU power idle, or become glitchier due to race conditions.

CPU speed could outright break the game. True, this isn't back when we had 20MHz systems, the 66MHz ones came out, and they had to include a Turbo button to limit the CPU so the older programs wouldn't zoom by. It's said that for every 10 lines of code there's a bug, and games and OSes made today are hundreds of millions of lines of code. Even tiny changes could make a huge difference. How much, I'm really not sure. It depends on how much everything relies on standards, standards we aren't told about, be they hardware, software, API, OS, or whatnot. We are totally in the dark.
With larger changes, the compatibility challenges become significantly greater. Why would anybody upgrade if the games didn't require it? And if the games did require it, then they've brought about complications similar to what PC gamers have, without the benefits of using a computer.

I'm sure it's possible to make this work in a way that makes sense, but I can't see this being a profitable route to take. The profitable portion of this was already established in the past. Charge for controllers, HDD and similar.

And you're more or less completely right about the GB. I think it did come with some extra memory or something like that, but the changes visible to developers were limited; the adjustments were mostly in the hardware itself, allowing any GB to play any GB game without issues.
hedwards: And those are all things that would require massive restructuring of the hardware and/or create a situation where the owners are having to worry about which revision they've got to see if a program will run on it. That's really not the market that consoles are in. People who don't mind that buy computers.

Also, nobody who isn't rich is going to pay for most of those things. For instance, 4K is for theaters and people producing video, not for consumers. By the time you get your nose close enough to see the pixels, you're already losing sight of the edges of the screen. I remember the first time I saw an HDTV big screen; I had to get pretty damn close before I could make out the pixels, and I've got excellent eyesight.
Tallima: And as with Kinect, 3MB video RAM cartridges and Nunchuks, MS will be able to clearly distinguish what is runnable on which hardware. XBOX One VR games will require the VR add-on, which you'll know you have because you'll have a VR add-on. The PS4 is doing it and nobody seems to have a problem.

Vertical and horizontal mounting options, home networking capabilities and multi-TV outputs would be simple to add without people going nuts that something's not working. If it's compatible, send different images. If it's not, send the same image to all TVs. Easy peasy.

The point MS is making is that hardware innovations can happen more easily when you have software that is malleable. When software is locked onto the hardware, it's nearly impossible to make changes and keep compatibility.

It's not a dumb idea to retain backwards compatibility while changing mounting, power or size options. It's brilliant to me. And no other console generation has ever pulled it off. Even the PS4 can't do it; they rely on cloud-based computing for it.

So, yes. There are add-ons and changes they can make. But also, it's not just about add-ons. It's about keeping the console running everything very stable with hardware changes to make the system cheaper, stabler, smaller, less power hungry and even possibly to the great fears of everyone, more feature-rich.

P.S. PS4 has 4k and soon VR and it's already finding a market. So this is all marketable stuff.
That sounds terribly confusing. MS might know, but the people buying things would then have to do a lot more research about whether or not a game is going to work with their console. The main benefit of having a console is that you don't have to think about things like compatibility. Any PS3 game should work with any PS3 console. Same goes for the XB360, PS4, XBONE etc.

Introducing those kinds of upgrades just fragments the market and requires customers to do more research before buying games.
Post edited March 05, 2016 by hedwards
hedwards: Introducing those kinds of upgrades just fragments the market and requires customers to do more research before buying games.
The New 3DS is just an upgraded 3DS, but already it's obvious there's a split in who can play which games. In theory you might be able to play them on the old systems, but I wouldn't expect them to be 'playable'.

Let's not forget Sega's add-ons/upgrades, the 32X and Sega CD, and then the Saturn with 'it's not our future' (although that was down to the CEO being a butthead who ruined the company and didn't care).

You know, I'd almost wish for a universe where the PS4/XBone were consoles, but you could switch some settings and turn one into a fully fledged PC, then upgrade it normally after you've owned/used it a while (though once the switch was flipped it wouldn't work as a console anymore). That would be a great way to start gaming, and then change the system into something actually useful when you need it, like writing reports for school, or learning programming, among other things. And of course, free porn :P

It makes me sad how much the raw hardware can't be used for other things; it's outright annoying.
snowkatt: Considering the last time this happened, it was THIS!
https://en.wikipedia.org/wiki/32X

and the most successful add-ons were these two:
https://en.wikipedia.org/wiki/Sega_CD
https://en.wikipedia.org/wiki/Family_Computer_Disk_System
Hmm, seems the OP already references most of what I'm talking about.
Post edited March 05, 2016 by rtcvb32
hedwards: Introducing those kinds of upgrades just fragments the market and requires customers to do more research before buying games.
rtcvb32: The New 3DS is just an upgraded 3DS, but already it's obvious there's a split in who can play which games. In theory you might be able to play them on the old systems, but I wouldn't expect them to be 'playable'.

Let's not forget Sega's add-ons/upgrades, the 32X and Sega CD, and then the Saturn with 'it's not our future' (although that was down to the CEO being a butthead who ruined the company and didn't care).

You know, I'd almost wish for a universe where the PS4/XBone were consoles, but you could switch some settings and turn one into a fully fledged PC, then upgrade it normally after you've owned/used it a while (though once the switch was flipped it wouldn't work as a console anymore). That would be a great way to start gaming, and then change the system into something actually useful when you need it, like writing reports for school, or learning programming, among other things. And of course, free porn :P

It makes me sad how much the raw hardware can't be used for other things; it's outright annoying.
That's rather problematic. Sega managed it just fine because the CD format was completely separate from the Genesis format. The 32X was a bit touchy, but they were able to handle it through branding and with carts that wouldn't physically fit in the older part of the system.

This is one of the advantages that I wish we still had with gaming consoles. You know, being actual consoles rather than locked down computers with proprietary OSes and controllers.

Like I've said, it's definitely something that's possible, I just can't see how they can do it in a way that makes sense.
hedwards: This is one of the advantages that I wish we still had with gaming consoles. You know, being actual consoles rather than locked down computers with proprietary OSes and controllers.
Where the software ran directly on top of the hardware and ran at full speed? Most of it written in assembly language to make it faster...

Yeah, I do miss those days too... It's no wonder I still love all the 16-bit classics of my youth.
hedwards: This is one of the advantages that I wish we still had with gaming consoles. You know, being actual consoles rather than locked down computers with proprietary OSes and controllers.
rtcvb32: Where the software ran directly on top of the hardware and ran at full speed? Most of it written in assembly language to make it faster...

Yeah, I do miss those days too... It's no wonder I still love all the 16-bit classics of my youth.
It depends, they didn't always use assembly. By the GBA era you could write code in C. You would however be writing directly to various registers in order to do things and the hardware itself would do most or all of the heavy lifting.

And yeah, the games would run at essentially full speed as there wouldn't be anything except for the game running and most common actions you might want to do would have a register to speed up the process.
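A sketch of what that register-level style looks like in C, using the GBA's documented mode-3 addresses. The `demo()` routine only does anything meaningful on real GBA hardware or an emulator; the helper functions are plain arithmetic, and the overall shape is illustrative rather than taken from any particular game.

```c
#include <stdint.h>

/* GBA memory-mapped hardware: the display control register and VRAM.
   In mode 3 the screen is a 240x160 array of 15-bit pixels. */
#define REG_DISPCNT (*(volatile uint16_t *)0x04000000)
#define VRAM        ((volatile uint16_t *)0x06000000)
#define SCREEN_W 240

/* Pack 5-bit red/green/blue components into the 15-bit BGR format. */
uint16_t rgb15(int r, int g, int b) {
    return (uint16_t)(r | (g << 5) | (b << 10));
}

/* Linear offset of pixel (x, y) in the mode-3 framebuffer. */
int pixel_index(int x, int y) { return y * SCREEN_W + x; }

void demo(void) {
    REG_DISPCNT = 0x0403;                          /* mode 3, enable BG2 */
    VRAM[pixel_index(120, 80)] = rgb15(31, 0, 0);  /* red pixel mid-screen */
}
```

No OS, no driver, no API call: one store to a register configures the display, and one store to VRAM puts a pixel on screen, which is why these machines felt like "actual consoles".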

I don't personally get the draw of modern consoles as they're effectively just computers with that proprietary stuff. The PS3 even allowed people to install Linux on it for a while.
hedwards: It depends, they didn't always use assembly. By the GBA era you could write code in C. You would however be writing directly to various registers in order to do things and the hardware itself would do most or all of the heavy lifting.

And yeah, the games would run at essentially full speed as there wouldn't be anything except for the game running and most common actions you might want to do would have a register to speed up the process.

I don't personally get the draw of modern consoles as they're effectively just computers with that proprietary stuff. The PS3 even allowed people to install Linux on it for a while.
To my understanding, although you could write it in C, the overhead of function calls was high enough that they stuck with assembly language instead. I'm referring to SNES games more than others. Of course, the fact that it's written in assembly leaves certain interesting behaviors and bugs present, and it can be quite difficult to take full advantage of.


I personally was really annoyed when they took away the option to put Linux on the PS3. Had they left it alone, and had the CELL processor been programmable, it's possible the system might have done better. Truthfully, though, they should have released CELL processors for people to get used to and build applications around for years before pushing them into a console. But alas, logic isn't always corporations' strong point.

Oh well.
Ah yes, the:

Second Edition X-Box ONE Revamp


yes yes, a sad, sad acronym indeed...