The tech industry has predicted the death of consoles for several generations now, and it hasn’t happened. But with the surge of this newfangled cloud computing, many analysts have once again claimed we won’t be seeing another Xbox with local processing. Microsoft’s Phil Spencer firmly disagrees with this sentiment (as do I).
When someone asked Phil Spencer about it on Twitter, his response was: “I don’t. I think local compute will be important for a long time.”
Remember, Microsoft are heavily invested in cloud technologies. They have been pushing their cloud computing service, Windows Azure, for some time, and it powers Microsoft’s Xbox Live infrastructure. The Xbox One console already uses the Xbox Live cloud for some of its titles, for instance the AI in Forza Motorsport or Titanfall. Potentially, huge amounts of AI and other calculations can be offloaded to the cloud, while the local console takes care of the work the player interacts with directly. Other examples would be weather simulation or persistent game worlds. Other uses include digital distribution of content (serving downloads to users) and providing dedicated servers for multiplayer games.
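To make that split concrete, here’s a minimal sketch (in Python, purely for illustration) of a game loop that keeps latency-sensitive work local while asking a cloud service for slower, heavier work – AI planning, say – asynchronously. The function names and timings are hypothetical; this isn’t Microsoft’s actual Xbox Live compute API.

```python
# Hypothetical split between local, latency-sensitive work and offloaded cloud work.
import concurrent.futures
import time

def cloud_compute_ai_plan(world_snapshot):
    """Stand-in for a cloud request: heavyweight AI planning done off-console."""
    time.sleep(0.5)                        # simulated network + server time
    return {"enemy_routes": "placeholder plan"}

def local_frame_update(inputs, ai_plan):
    """Everything the player feels immediately: input, physics, rendering."""
    pass

with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
    pending = pool.submit(cloud_compute_ai_plan, world_snapshot={})
    ai_plan = None
    for frame in range(120):               # ~2 seconds of a 60fps local loop
        if pending.done():
            ai_plan = pending.result()     # fold in cloud results whenever they arrive
            pending = pool.submit(cloud_compute_ai_plan, world_snapshot={})
        local_frame_update(inputs=None, ai_plan=ai_plan)
        time.sleep(1 / 60)                 # the local tick rate never waits on the cloud
```

The point of the sketch is simply that the local loop never blocks on the cloud – it folds results in whenever they arrive, which is why this works for AI and weather but not for the inputs you feel frame to frame.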
Sony have also recently announced their PlayStation Now cloud service. In a nutshell, this service will allow you to stream games in 720p to your local device and play them (using a controller) in the comfort of your own home. Cloud computing in theory has access to a lot more ‘grunt’ than any console, with TFLOPS of performance potentially on tap. But I’m sure we’re all aware of the many drawbacks of the cloud. For certain games, cloud gaming makes some sense. But for games which require fast reactions or frame-precise inputs, the cloud begins to lose some of its appeal.
There are certainly other considerations – one of which is bandwidth. It’s also one of the reasons why both Microsoft and Sony put an optical drive into their respective consoles. While some of us are lucky enough to have no data cap, others aren’t as fortunate. And with Sony’s PlayStation Now service requiring at least a 5Mbps connection, there’ll no doubt be some who’re left out in the cold, either because their connection isn’t fast enough or because they don’t have enough data per month.
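Some rough napkin maths shows why data caps matter here. The 250GB monthly cap below is an assumption I’ve picked for illustration, as is treating the 5Mbps minimum as the sustained streaming rate.

```python
# Rough back-of-the-envelope: how far does a data cap go when streaming games?
# Assumptions (not from the article): a sustained 5 Mbps stream and a 250 GB monthly cap.

STREAM_MBPS = 5            # PlayStation Now's stated minimum connection speed
MONTHLY_CAP_GB = 250       # hypothetical ISP data cap

gb_per_hour = STREAM_MBPS / 8 / 1000 * 3600   # megabits -> megabytes -> gigabytes per hour
hours_per_month = MONTHLY_CAP_GB / gb_per_hour

print(f"~{gb_per_hour:.2f} GB per hour of streaming")          # ~2.25 GB/hour
print(f"~{hours_per_month:.0f} hours before hitting the cap")  # ~111 hours
```

A hundred-odd hours a month sounds generous until you remember the same cap also has to cover Netflix, downloads and everyone else in the house.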
Latency is certainly going to be a factor for a long time. Your input has to travel from your system across the internet to a server somewhere, be processed there, and then the resulting frame has to be compressed, packaged and sent back to you. Things are getting better, but depending on your connection, your distance from the server and other factors, you could be dealing with a few hundred milliseconds of round-trip delay.
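To see where those milliseconds go, here’s an illustrative breakdown of a streamed frame’s round trip. Every stage and figure below is an assumption chosen to show the shape of the problem, not a measurement.

```python
# Illustrative latency budget for one streamed frame. Stage names and millisecond
# figures are assumptions for the sake of the example, not measured data.

latency_budget_ms = {
    "input capture + upload": 15,
    "network to server":      40,
    "server render":          16,   # one frame at ~60fps
    "encode (compress)":      10,
    "network back to client": 40,
    "decode + display":       15,
}

total = sum(latency_budget_ms.values())
for stage, ms in latency_budget_ms.items():
    print(f"{stage:<24} {ms:>4} ms")
print(f"{'total input-to-photon':<24} {total:>4} ms")   # ~136 ms in this sketch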
Then we get into other issues such as frame rate and resolution. The amount of data required to stream 1080p rather than 720p could potentially be double (depending on bit rate), and with 4K televisions looking to become the norm in a few years’ time, the problem becomes a longer-running one. It’s certainly not unsolvable, but it will take a great deal of investment in the network infrastructure of the countries and ISPs who’re responsible for sending data to their customers. That’s not to say cloud gaming isn’t useful on lightweight clients, such as iPads, TV set-top boxes, cell phones and other devices where local processing isn’t really enough to give us the quality of gaming we expect in a living-room environment.
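For a feel of the numbers, the sketch below compares pixel counts at 720p, 1080p and 4K and naively scales an assumed 5Mbps base bitrate with them. Real codecs don’t scale bitrate linearly with pixels, so treat this as an assumption-laden illustration rather than a prediction.

```python
# Rough pixel-count comparison between streaming resolutions, with a naive
# bitrate scaling from an assumed 5 Mbps 720p stream.

BASE_BITRATE_MBPS = 5   # assumed 720p stream, matching PlayStation Now's minimum

resolutions = {
    "720p":     (1280, 720),
    "1080p":    (1920, 1080),
    "4K (UHD)": (3840, 2160),
}

base_pixels = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    ratio = (w * h) / base_pixels
    print(f"{name:<9} {w}x{h:<5} {ratio:>5.2f}x pixels  ~{BASE_BITRATE_MBPS * ratio:.1f} Mbps (naive scaling)")
```

Even on naive scaling, 1080p is 2.25x the pixels of 720p and 4K is 9x – which is why the bandwidth problem doesn’t go away as displays improve.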
Of course, there are other issues too. Sony, Microsoft or Nintendo going purely cloud gaming puts them in the rather interesting position of merely being a service. They’re no longer providing a hardware platform, and it’s a lot easier for a publisher with a lot of clout (I’m looking at you, EA and Ubisoft) to come along and jump in on the act. This is similar to what both of those companies have done on the PC: EA with its Origin service, and Ubisoft with Uplay.
Also, as customers, I’m unsure if it’s actually a good thing for us. I dislike cloud gaming because we don’t own any of the content. It’s a service, and while that works for films, I don’t feel the same way about games. Call me old fashioned if you’d like.
The other major reason is that we’re not really pushing local computing that hard yet. The most powerful of the next-generation consoles, Sony’s PlayStation 4, puts out 1.84 TFLOPS of computing power from its GPU. It sure sounds like a lot, especially compared to the 240 GFLOPS or so that the previous generation’s Xbox 360 managed. But it’s still very little compared to what a high-end PC gaming rig can manage, which is often 5 TFLOPS plus. In other words, in just a few years’ time, a GPU several times more powerful than the PS4’s will be available in exactly the same price range.
Nvidia’s 8800 GTX was released in November 2006 (almost a year prior to the GPU destroyer known as Crysis). In early 2010, Nvidia released the GTX 480. To put that into perspective, the 8800 GTX put out 518 GFLOPS, compared to the GTX 480’s 1.35 TFLOPS. That jump in performance is by no means ‘impressive’ in GPU terms either; it’s actually fairly standard, and there have been much more significant increases in shorter spaces of time. In other words, by the time either company chooses to release another console, it would likely be sporting a GPU with several TFLOPS of compute performance on its own.
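If you want to put rough numbers on that, here’s the napkin maths. The ~3.3-year gap between those two cards and the idea of extrapolating the PS4’s GPU forward at the same rate are my own assumptions, not a hardware roadmap.

```python
# A crude extrapolation from the two GPUs cited above. The 3.3-year gap and the
# forward projection of the PS4's GPU are assumptions; this is napkin maths.

GTX_8800_GFLOPS = 518     # November 2006
GTX_480_GFLOPS  = 1350    # early 2010
YEARS_BETWEEN   = 3.3

ratio = GTX_480_GFLOPS / GTX_8800_GFLOPS
annual_growth = ratio ** (1 / YEARS_BETWEEN)
print(f"{ratio:.1f}x improvement, roughly {(annual_growth - 1) * 100:.0f}% per year")

# Apply the same rate (conservative, by GPU standards) to the PS4's 1.84 TFLOPS GPU
ps4_tflops = 1.84
for years in (3, 5, 7):
    print(f"after {years} years: ~{ps4_tflops * annual_growth ** years:.1f} TFLOPS at a similar price point")
```

Even at that fairly pedestrian ~34% a year, the PS4’s GPU is comfortably outclassed within a console generation – which is the whole argument for local compute still having headroom worth building around.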
So, all in all, this is a very deep and expansive topic. Do let us know your thoughts on our Facebook or Twitter accounts.