The ESRAM of the Xbox One is perhaps the single largest issue developers are facing right now when porting games over to the machine. Leaks and developer comments suggest it is not only responsible for the lower native resolution of Xbox One games, but is also making titles harder to develop for the system. Let’s investigate the cause and clear up any confusion over the X1’s ESRAM.
Judging purely on the face value specs of both consoles, Microsoft’s Xbox One and Sony’s Playstation 4 seem very similar, at least on paper: 8GB of RAM, eight low-power AMD Jaguar cores, and a GPU based on AMD’s Radeon GCN architecture. And yet developers have confirmed that several titles run at a lower resolution of 720P on the Xbox One, including Killer Instinct, Call of Duty: Ghosts, Battlefield 4 and Titanfall. This compares to 1080P for CoD: Ghosts and 900P for BF4 on the Playstation 4.
The Xbox One’s ESRAM, according to developers, is the biggest culprit. The 32MB of ESRAM is placed on the main SoC’s die, alongside the CPU and GPU (and other bits of technical wizardry), to make up for the slower DDR3 memory that the Xbox One uses. The Playstation 4 uses 8GB of GDDR5 running at 1375MHZ (5500MHZ effective) on a 256 bit memory bus, giving a total memory bandwidth of 176GB/s. Meanwhile, the Xbox One’s memory system uses DDR3 at 1066MHZ (2133MHZ effective), also on a 256 bit memory bus, providing 68GB/s. These figures are easily calculated: multiply the width of the bus (256 bits) by the effective data rate, then divide by 8 to convert bits to bytes. For instance, the Playstation 4’s GDDR5 memory works out as 5500 (MHZ) * 256 (bus width) / 8 = 176,000MB/s, or 176GB/s.
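The bandwidth arithmetic above can be sketched in a few lines of Python. The effective transfer rates are the figures quoted in this article; the helper name is my own:

```python
# Peak memory bandwidth = (bus width in bits / 8) bytes per transfer
# multiplied by the effective transfer rate (in millions of transfers/s).

def peak_bandwidth_gb_s(bus_width_bits, effective_rate_mts):
    """Return peak bandwidth in decimal GB/s, as memory vendors quote it."""
    bytes_per_transfer = bus_width_bits / 8      # 256-bit bus -> 32 bytes
    mb_per_second = bytes_per_transfer * effective_rate_mts
    return mb_per_second / 1000                  # MB/s -> GB/s (decimal)

ps4_gddr5 = peak_bandwidth_gb_s(256, 5500)   # 176.0 GB/s
xb1_ddr3 = peak_bandwidth_gb_s(256, 2133)    # ~68.3 GB/s
print(ps4_gddr5, xb1_ddr3)
```

Both consoles share the same 256 bit bus width; the entire main-memory bandwidth gap comes from GDDR5’s much higher effective data rate.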
ESRAM is running at around 204GB/s memory bandwidth (this is based on actual numbers according to Microsoft, and takes into account inefficiencies of the ESRAM). The decision to use DDR3 and ESRAM over Sony’s choice of GDDR5 was likely driven by two major motives – cost and design goals. The Xbox One was designed to not be solely a gaming console. Microsoft knew they wanted it to run multiple applications alongside the games, and knew the only way to achieve this was with a lot of memory. 8GB of RAM was pretty much a prerequisite if they wanted to ensure enough memory to run next generation games alongside various applications. Sony could afford to go with GDDR5 memory because they weren’t so focused on multi-tasking on their machine; they therefore targeted 4GB of RAM fairly early. Reportedly, Sony only managed to squeeze 8GB of GDDR5 into the system at practically the last minute, and offset the extra cost by not bundling the camera as standard with the console.
Microsoft couldn’t make this same gamble. Not only were they planning on bundling the Kinect technology with the Xbox One, which raised the cost of the system substantially, but they also couldn’t be sure large enough memory modules would be available to make GDDR5 a viable option. Remember, these choices were set in stone by around 2010, and it was pretty much too late to start backtracking a year or two later.
So why only 32MB of ESRAM?
32MB of ESRAM was unfortunately the most Microsoft could squeeze onto the APU’s die without compromising the number of GCN cores or CPU performance on the Xbox One to the point where having the extra memory would have been a waste of time. Microsoft’s previous console, the Xbox 360, used GDDR3 memory running at 1400MHZ on a 128 bit bus. In addition to this 512MB of GDDR3, the Xbox 360 also had 10MB of eDRAM, providing 32GB/s of bandwidth between the GPU and the eDRAM. This memory gave the Xbox 360 essentially “free” Anti-Aliasing along with other graphical effects.
With the Xbox One, this isn’t the case – and with many gamers expecting and demanding 1080P for their next generation titles, it appears that the 32MB of ESRAM is simply insufficient to meet the needs.
Let’s talk about render targets for 1080P and how much space they take up in memory:
Figures below using deferred rendering “…deferred because no shading is actually performed in the first pass of the vertex and pixel shaders: instead shading is “deferred” until a second pass. On the first pass of a deferred shader, data that is required for shading computation is only gathered. Positions, normals, and materials for each surface are rendered into the geometry buffer (G-buffer) as a series of textures. After this, a pixel shader computes the direct and indirect lighting at each pixel using the information of the texture buffers, in screen space.”
BPP = Bits Per Pixel (see Wikipedia’s article on color depth)
*UPDATE* – I accidentally forgot to divide by 8 in the formula I was using when originally putting everything together for the article. I was using Excel and selected the wrong part of the formula; I apologize for that, I should have double-checked the numbers. I’ve updated with a new explanation.
An image containing 1920 x 1080 pixels:

1920 * 1080 = 2,073,600 pixels

2,073,600 pixels x 24 bits per pixel (the depth most games will strive for) = 49,766,400 bits

49,766,400 ÷ 8 = 6,220,800 bytes

6,220,800 ÷ 1024 ÷ 1024 ≈ 5.93MB, or roughly 6MB as we say.
All above figures are using no Anti-Aliasing.
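The render target calculation above can be reproduced directly. This follows the article’s arithmetic: one 1080p target at 24 bits per pixel, no anti-aliasing:

```python
# Size of a single 1080p render target at 24 bits per pixel.
# No anti-aliasing assumed, matching the article's figures.

width, height = 1920, 1080
bits_per_pixel = 24

pixels = width * height              # 2,073,600 pixels
bits = pixels * bits_per_pixel       # 49,766,400 bits
bytes_total = bits // 8              # 6,220,800 bytes
mib = bytes_total / 1024 / 1024      # ~5.93 MiB, i.e. roughly "6MB"
print(pixels, bytes_total, round(mib, 2))
```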
This problem becomes more profound due to the need to often store multiple pieces of data in the ESRAM, along with the speed at which games need to update and create frame buffers. Remember too that texturing and other operations often take place in ESRAM. The figure only increases if Anti-Aliasing (AA) is used, and developers are already stating that memory management of the ESRAM is a huge issue.
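A deferred renderer like the one quoted earlier makes the squeeze concrete: the G-buffer is several full-resolution targets at once. The layout below (four 32-bit targets plus a 32-bit depth/stencil buffer) is purely an illustrative assumption, not a documented Xbox One configuration:

```python
# Hypothetical 1080p deferred-rendering G-buffer versus 32MB of ESRAM.
# The five 32-bit targets (albedo, normals, material, light accumulation,
# depth/stencil) are an illustrative layout, not a real title's setup.

ESRAM_BYTES = 32 * 1024 * 1024           # 33,554,432 bytes
pixels = 1920 * 1080

targets_bits = [32, 32, 32, 32, 32]
total_bytes = sum(pixels * bits // 8 for bits in targets_bits)

print(total_bytes / 1024 / 1024)         # ~39.55 MiB
print(total_bytes > ESRAM_BYTES)         # already over the 32MB budget
```

Even before anti-aliasing or any texture data, this hypothetical G-buffer alone overflows the 32MB, which is why developers describe ESRAM management as a juggling act.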
There are numerous examples of DDR3 being used for graphics solutions in the PC desktop space even now, although anything higher than the lower mid range will typically shun traditional DDR3 in favor of GDDR5 of some description. Intel’s Haswell IGP (Integrated Graphics Processor) features 128MB of eDRAM, but despite this, the slower DDR3 system memory drastically impacts FPS in games. Comparing other GPU’s, such as the Radeon 7770 GDDR5 against the Radeon 7770 DDR3, shows the DDR3 version running up to 90 percent slower than the GDDR5 version.
Countering the Xbox One’s ESRAM problems
Is there anything games developers can do to get around these issues? Unfortunately, it’s mostly going to require developers to get used to working within the limitations. One problem developers have spoken about is having to manually flush the ESRAM. Microsoft supply the Xbox One with its own memory-compressed render targets, very similar to what was already working on the Xbox 360. They argue that by using these render targets in conjunction with the ESRAM and DDR3, developers can work around the ESRAM’s size. The problem is that in today’s multi-platform environment, especially with the rather commanding lead that the PC and Playstation 4 already have in terms of ease of development, how easy will this be to implement? It’s far easier to just say “well, let’s aim for 720P on the X1”.
In short, only experience and better programming tools will really help to improve matters for developers hoping to release titles higher than 720P. CBOAT (a rather famous leaker of MS news) has said:
“Yes. Ripened tools help. Developer familiarity, too. Same thing occurred with PS3 and Cell last time around.
ESRAM is a handicap for now but a workaround will come.
Hardware power is always behind though. Never changing without a complete hardware update. 900-720p will be the norm. The games will be good though, and that’s what matters. They’ll just struggle more to work on the Xbox One.” – CBOAT
Another concern is that the next generation will also focus heavily on GPGPU compute – that is, the GPU helping with tasks such as physics, AI and other work better suited to it. With so much of the Xbox One’s ESRAM taken up purely by render targets, it will have limited scope to help with compute. It’s likely that the DDR3 memory will be the most important for the X1, and carefully using that bandwidth will matter even more than the ESRAM. The ESRAM has plenty of bandwidth, but with so little space it’ll mean a juggling act: ensuring the data that most needs the higher speed is present within the ESRAM. 1080P at 60FPS is very demanding, even for the Playstation 4. Going from 30 to 60FPS quite literally doubles the workload the GPU must process.
1920 * 1080 (1080P) = 2,073,600 pixels
1600 * 900 (900P) = 1,440,000 pixels – 1080P is 1.44x more
1280 * 720 (720P) = 921,600 pixels – 1080P is 2.25x more
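Those scaling factors fall straight out of the pixel counts:

```python
# Pixel-count ratios between the common target resolutions.

def pixel_ratio(w1, h1, w2, h2):
    """How many times more pixels the first resolution has than the second."""
    return (w1 * h1) / (w2 * h2)

print(1920 * 1080)                                    # 2,073,600 pixels
print(pixel_ratio(1920, 1080, 1600, 900))             # 1.44x vs 900P
print(pixel_ratio(1920, 1080, 1280, 720))             # 2.25x vs 720P
```

Put another way, a GPU rendering at 720P is shading well under half the pixels of a 1080P frame, which is exactly the escape hatch developers are reaching for.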
One must also remember that the ESRAM is by no means the only limitation of the Xbox One compared to the Playstation 4, if we look at the raw GPU specs:
Xbox One GPU:
1.18TFLOPS (available for games) from 12 Compute Units (the X1 has 1.31TFLOPS total, but around 10 percent of the GPU is reserved for the system)
768 shaders
48 Texture Units
16 ROPS
2 ACE with 8 queues each = 16 queues total
Playstation 4 GPU:
1.84TFLOPS from a total of 18 Compute Units (+56% vs the 1.18TFLOPS available for games)
1152 shaders (+50%)
72 Texture Units (+50%)
32 ROPS (+100%)
8 ACE with 8 queues each = 64 queues total (4x)
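The percentage deltas in the lists above can be checked with a quick calculation; the Xbox One baseline figures used are the widely reported 768 shaders and 16 ROPS:

```python
# Percentage advantage of the PS4 GPU over the Xbox One GPU for each spec.

def advantage_pct(ps4_value, xb1_value):
    """Percentage increase of the PS4 figure over the Xbox One figure."""
    return round((ps4_value / xb1_value - 1) * 100)

print(advantage_pct(1.84, 1.18))   # TFLOPS available to games: ~56%
print(advantage_pct(1152, 768))    # shaders: 50%
print(advantage_pct(72, 48))       # texture units: 50%
print(advantage_pct(32, 16))       # ROPS: 100%
print(advantage_pct(64, 16))       # compute queues: 300%, i.e. 4x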
The Playstation 4’s GDDR5 memory is certainly looking easier to use: there’s practically no extra latency, and it requires no extra power. The PS4’s GDDR5 allows developers to simply heap all of their assets into one large pool, which is a good thing not just for compute but for general work. PCs have huge pools of memory on both the video card and the main system; consoles make do with elegant design. But the Xbox One’s design is certainly causing games developers to prepare for extra work.
Microsoft have shown off their tiled resources technology, demonstrating how huge amounts of texture data can fit into a small amount of space using partially resident textures. But while this technique will certainly have its uses, it’ll require games developers to actually use it. And moreover, it’ll still leave issues with the ESRAM’s space and render targets.
While Infinity Ward and other huge studios only being able to squeeze 720P out of the first generation of games isn’t a good sign, remember that knowledge of the console will improve. The issue is that the demands for what developers want to do on the console will also increase. We’ll have to see if the Xbox One’s ESRAM is capable of keeping up with the 1080P resolution that many gamers expect for the next generation.