It’s no secret that Microsoft wanted the Xbox One to be a cloud-based system that could grow in power as more and more of the computational load was handled by remote servers. It was quite the concept, but there was one major issue: the availability of broadband internet. Microsoft wanted everyone online all the time, which is the only way a cloud-based system can work. Without the internet, your console would just be a fancy-looking brick.
Microsoft was forced to scale back its admittedly novel ideas to make the console more accessible to its user base. While the world wasn’t ready for Microsoft’s original concept, it may well be in the future. The phrase “ahead of its time” comes to mind, and it’s rarely been more apt than here.
It’s honestly for the best, though: the landscape is shifting toward a cloud-based future, but that future won’t arrive this generation. Let’s take a look at the changing field of hard drive technology and how cloud-based gaming could be the best option for the Xbox 2.
The Future of Physical Hard Drives: Where’s the Limit?
The 2014 Magnetism and Magnetic Materials (MMM, and yes, it’s real) Conference, put on by the IEEE Magnetics Society and the American Physical Society, brought with it speakers who discussed new frontiers and concepts in hard drive technology, along with options known as non-volatile memory.
Recent announcements from major players in the industry have spoken of 10-terabyte capacities in upcoming hard drives. That’s 10,000 gigabytes, for those who are counting. It’s an impressive amount of space, but seeing as how current-generation games are pushing 50 gigabytes apiece, you have to wonder what happens when we push toward photorealistic graphics and our games suddenly take up 500 or even 1,000 gigabytes of space. There’s no evidence of games this large yet, but the point still stands.
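To put those figures side by side, here’s a quick sketch of the arithmetic. The capacities and install sizes are the ones quoted above; the function name is just for illustration.

```python
# Rough illustration: how many game installs fit on a drive as sizes grow?
# The 10 TB capacity and the 50 GB / 1,000 GB install sizes come from the
# figures discussed in the article.

def games_per_drive(drive_gb, game_gb):
    """Whole number of installs of game_gb gigabytes that fit on a drive."""
    return drive_gb // game_gb

# A 10 TB (10,000 GB) drive holds 200 of today's ~50 GB games...
print(games_per_drive(10_000, 50))     # -> 200
# ...but only 10 hypothetical 1,000 GB photorealistic games.
print(games_per_drive(10_000, 1_000))  # -> 10
```

Even a generous drive starts to feel small once install sizes climb by an order of magnitude.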
Physical storage has limitations: it depends on speed and capacity. For gaming to truly evolve, we need to lift those limits and allow for anything to be possible. That being said, we’re looking at emerging technologies like shingled magnetic recording (SMR) and hermetically sealed helium-filled disk drives that allow for the aforementioned capacities.
These concepts are already hitting a wall, so the focus at the recent conference was the road map for the future of digital storage. The Advanced Storage Technology Consortium (ASTC) released their road map for something called areal density, which is the amount of storage that can be placed on the surface area of a hard drive.
According to this roadmap (pictured below), a technology called Heat Assisted Magnetic Recording (HAMR) should be available by 2017, which would double the growth rate. Another emerging technology called bit-patterned media (breaking the magnetic media into small regions on the surface) will appear in 2021 and will be combined with SMR to create hard drives with 10 times the capacity of what we have now.
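To see how a faster growth rate compounds into the roadmap’s projection, here’s a toy calculation. The 10 TB starting point is from the announcements above; the ~26% annual growth rate is an assumption chosen only because it yields roughly a tenfold increase over a decade, not a figure from the ASTC roadmap itself.

```python
# Toy projection of drive capacity under a fixed annual growth rate.
# The 10 TB baseline comes from the article; the 26% yearly rate is an
# illustrative assumption that compounds to roughly 10x in ten years.

def projected_capacity_tb(base_tb, annual_growth, years):
    """Compound base_tb forward by annual_growth for the given years."""
    return base_tb * (1 + annual_growth) ** years

# 10 TB compounding at ~26% a year lands near the roadmap's 100 TB by 2025.
print(round(projected_capacity_tb(10, 0.26, 10)))  # -> 101
```

The takeaway is that a doubled growth rate doesn’t just add capacity, it multiplies it year over year.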
In short? 100 terabytes, or 100,000 gigabytes of space, by 2025. These new types of magnetic drives use spin, a property of electrons that results in a magnetic field. These types of magnetic random access memory (MRAM) are a promising replacement for current technologies, which use semiconductors to store data and require regular jolts of electricity to remain active.
The current technology cannot hold memory without those jolts of electrical energy. Magnetic storage will allow the power to be cut without risk of losing data.
Another major breakthrough in the field of digital storage comes in the form of IBM’s Racetrack memory. As the original pioneers of hard drive technology, it’s only fitting that IBM presents this new take on data storage. Racetrack memory works by placing magnetic bits along a nanowire that is 1,000 times finer than a human hair.
Data is placed using the spin of electrons at atomically precise locations on the racetrack and moves at hundreds of miles per hour. Instead of the computer seeking out data, the technology slides these bits along the nanowire and brings the data to the computer, saving both time and energy. Racetrack memory combines the speed of solid state memory with the cost effective pricing of hard drives.
Theoretically, according to a press release from IBM, this technology could hold all of the movies released in a given year with room to spare. It can also run on a single battery for weeks before needing a charge. All of this means that information can be accessed from the drive, by moving it to a specific point and presenting it to the user, in less than a billionth of a second.
So, Racetrack memory seems like a possible candidate, especially if the Xbox 2 is going to be more of a portable system, or possibly one grounded in augmented reality. As it stands, we’re finding new ways to expand on the capacities of our hard drives, but that doesn’t mean cloud-based gaming is out of the picture. Let’s look at what that technology could bring to the table.
Cloud-Based Gaming: Is This The Future?
Most of us gamers are probably familiar with the concept of a cloud. It’s essentially a server, or servers where data is stored, streamed, or computed remotely instead of on the actual console. It has a few different uses, some of which are already being utilized. In many cases, we use cloud storage to backup saved data, but this is only a small piece of what the cloud can do for gaming.
Streaming services like PlayStation Now and OnLive (recently closed and bought out by Sony) have been experimenting with the idea of streaming games to you from the cloud. Think of it like Netflix streaming, but with a game instead. It’s not a perfect process, but it’s something that’s been gaining ground.
The Xbox One utilizes another technique called “cloud computing” to improve user experiences by tapping into Microsoft’s Azure cloud. Right now, only two games use this technique with any significant dedication: Forza 5 and Titanfall. The former uses the cloud to power its “Drivatar” system, which races for you online while you’re away, mimicking the driving habits it has learned from you.
Theoretically, a game’s A.I. could also be offloaded to the cloud to free up more power on the system. This would allow for more complex enemies and more competent decisions on the A.I.’s part during a match or in a massively multiplayer online (MMO) game.
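In rough terms, offloading A.I. like this means the console asks a remote server for a decision and keeps a cheap local fallback for when the connection drops. The sketch below is entirely hypothetical: the endpoint URL, the request format, and the fallback rule are all invented for illustration, not anything Microsoft or Azure actually exposes.

```python
# Hypothetical sketch of offloading an A.I. decision to a cloud service,
# falling back to simple local logic when the server is unreachable.
# The endpoint, payload shape, and rules below are invented for illustration.

import json
import urllib.request

def choose_enemy_action(game_state, timeout=0.05):
    """Ask a (hypothetical) cloud A.I. for a move; fall back locally."""
    try:
        req = urllib.request.Request(
            "https://example.com/ai/decide",  # placeholder endpoint
            data=json.dumps(game_state).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return json.load(resp)["action"]  # richer server-side decision
    except OSError:
        # Offline path: a cheap rule the console can evaluate on its own.
        return "attack" if game_state["player_visible"] else "patrol"

print(choose_enemy_action({"player_visible": True}))
```

The tight timeout matters: a game loop can’t stall waiting on the network, so the local fallback has to be instant.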
Titanfall, an online-only multiplayer game, uses the Azure cloud to handle its dedicated servers. Of course, there’s a lot more the cloud can do than that, but creating a game that relies on cloud computing also requires the game to be online constantly. If it isn’t always online, then the developers have to do even more work to make the game function without the assistance of the cloud.
In an ideal scenario, a game can push some of the more intensive aspects of its moment-to-moment gameplay to the cloud and let the system handle less pressing aspects of the game. Things like dynamic physics, lighting, and other realistic effects can require an immense amount of power from the console to produce.
Ideally, tasks like these could be sent to the cloud and handled remotely with far more powerful hardware. That’s the idea anyway, but it’s not without its caveats. As mentioned earlier, developers would have to account for both online and offline players and adjust the effects accordingly.
A developer who remained anonymous spoke with NeoGAF about an unannounced Xbox One game they’re working on that uses the cloud. During the interview, the developer made some clarifications on what cloud computing can do and what it cannot do:
“You can not boost your games resolution with Azure. And no. You can not create better lighting effects with Azure. But, if you focus on it, you can still boost the overall graphical look of your game by a mile. We are currently creating a game. But in fact, we are kind of creating two-in-one. One with Azure available, and one for offline only. Everything you code, you need to code for two scenarios. This is a ton of work. if online = dynamic grass; if offline=static grass.”
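The “two-in-one” pattern that developer describes, dynamic grass online and static grass offline, amounts to branching every cloud-assisted feature on connectivity. A minimal sketch, with feature names invented for illustration:

```python
# Minimal sketch of the developer's "two games in one" pattern: every
# cloud-assisted feature ships with an offline fallback, effectively
# doubling the code paths. All feature names here are illustrative.

FEATURES = {
    "grass":    {"online": "dynamic grass",        "offline": "static grass"},
    "lighting": {"online": "cloud-assisted bounce", "offline": "baked lightmaps"},
    "ai":       {"online": "server-driven squads",  "offline": "scripted patrols"},
}

def configure_game(cloud_available):
    """Select the online or offline variant of every feature."""
    mode = "online" if cloud_available else "offline"
    return {name: variants[mode] for name, variants in FEATURES.items()}

print(configure_game(False)["grass"])  # -> static grass
```

Every entry in that table is a feature the team has to build, test, and tune twice, which is the “ton of work” the developer is complaining about.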
Assuming we still have a physical box to work with when the Xbox 2 rolls around, cloud computing could be used to better prioritize the hardware it has, but it’s not a game changer in terms of boosting a game’s overall quality. The cloud is about offloading processing tasks, not actual graphics work like resolution or textures.
Essentially, your Xbox still has to make the meal, but with cloud computing, the remote servers can throw in those spices that would otherwise be impossible on the console’s power alone. This could be different, though, if the entire experience didn’t use a box, or a hard drive for that matter.
Two Possible Futures
So, the Xbox 2 could stick with a hard drive as large as 100 terabytes, depending on when the system releases and whether Microsoft allows for hard drive expansion on the next console. With that much space, gamers probably wouldn’t have to worry about storage, and with an always-online console (plus reliable internet in the future), cloud computing or streaming games from the cloud could mean a cheaper and far more powerful console.
Of course, Microsoft could also ditch the hard drive and the console entirely and go with some sort of HoloLens-type device that is portable and relies entirely on the cloud to stream games and distribute the computing between several servers, removing the need for any kind of hard drive whatsoever.
It’s hard to say more until we see how the Xbox One continues to showcase cloud computing. Depending on how that technology is utilized this generation, Microsoft will face a choice between another traditional Xbox and something else entirely that doesn’t need a hard drive. What do you think about this divide? Is it one or the other, or are there options to use both?
Tell us your thoughts in the comments below and weigh in with your predictions!