Well, it would appear Ubisoft have (unofficially) provided an answer as to why we’re unlikely to see Assassin’s Creed Unity running at 1080p on the Playstation 4. Ubisoft claim the reason for the ‘parity’ with the Xbox One comes down to hardware limitations of the consoles’ CPUs. BruiseBear, a user over at NeoGAF, found that a Ubisoft developer had contacted Bombcast (via email) and divulged the reasons behind the issue.
While the source has remained nameless (for obvious reasons), they were checked out by Bombcast, who seem happy enough to believe it’s a genuine Ubisoft team member.
“I’m happy to enlighten you guys because way too much bullshit about 1080p making a difference is being thrown around. If the game is as pretty and fun as ours will be, who cares? Getting this game to 900p was a BITCH. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps. The game was 9fps 9 months ago. We only achieved 900p at 30fps weeks ago. The PS4 couldn’t handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say.” The source begins, rather strongly.
Clearly the person is on the defensive over the whole ‘parity’ argument. Users (end gamers) have accused Ubisoft of purposefully holding back the Playstation 4 version of Assassin’s Creed: Unity for the sake of parity. But, according to this person, that was never the case. Indeed, it would appear it took the team a massive effort to get the title running at even 900p.
“Yes, we have a deal with Microsoft, and yes we don’t want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re talking about a 1 or 2 fps difference between the two consoles,” he continues.
For those who’re wondering, this refers to the various SDK updates from Microsoft which reduced the resources reserved for Kinect. Despite initial rumors that these reservations only impacted the GPU, it would appear upon further investigation that CPU time and memory bandwidth were also being held back for Kinect.
“So yes, locking the framerate is a conscious decision to keep people bullshiting, but that doesn’t seem to have worked in the end. Even if Ubi has deals, the dev team members are proud, and want the best performance out of every console out there. What’s hard is not getting the game to render at this point, it’s making everything else in the game work at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles. This really is about to define a next gen like no other game before. Mordor has next gen system and gameplay, but not graphics like Unity does.”
This one statement alone has caused an internet riot, particularly when paired with the next paragraph.
“The proof comes in that game being cross gen. Our producer (Vincent) saying we’re bound with AI by the CPU is right, but not entirely. Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that’s what we did. I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting. The result is amazing graphically, the depth of field and lighting effects are beyond anything you’ve seen on the market, and even may surpass Infamous and others. Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data,” the statement concludes.
The issue here for many is that it still doesn’t fully explain why there’s a resolution difference between the two machines. Frame rate makes sense, as it can be limited by either the CPU or the GPU. In this case, Ubisoft are claiming to be CPU bound, hence the 30fps cap. And it’s true that the CPU must ‘tell’ the GPU what to do. Quite simply, if the CPU can’t issue enough compute dispatches or draw calls, the GPU is just waiting around.
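To make the CPU-bound argument concrete, here’s a toy frame-time model. Every number below is hypothetical, invented purely for illustration (none of it comes from Ubisoft): a frame can’t complete any faster than the slower of the CPU’s submission work and the GPU’s rendering work.

```python
# Toy model: the frame is gated by whichever of CPU submission
# (draw calls, dispatches) or GPU rendering finishes last.
# All figures are hypothetical, purely for illustration.

def frame_time_ms(draw_calls, cpu_us_per_call, gpu_render_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0  # total CPU submission cost
    return max(cpu_ms, gpu_render_ms)               # slower side sets the pace

# A heavy scene: lots of draw calls, moderate GPU cost.
heavy = frame_time_ms(draw_calls=20_000, cpu_us_per_call=1.8, gpu_render_ms=25.0)
print(f"frame time: {heavy:.1f} ms -> {1000 / heavy:.0f} fps")

# Halving the GPU's work (e.g. dropping resolution) changes nothing
# once the CPU is the bottleneck: the GPU just waits around longer.
lighter = frame_time_ms(draw_calls=20_000, cpu_us_per_call=1.8, gpu_render_ms=12.5)
print(lighter == heavy)  # prints True
```

On this toy model, the only route to a higher frame rate is fewer or cheaper draw calls – exactly the kind of CPU-side optimization the developer describes, and one that rendering at a lower resolution does nothing to help.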
As a side note, if nothing else it’s also a demonstration of how console makers influence game development.
But in terms of resolution, that’s where things become trickier. Typically, resolution is “GPU bound” – in other words, it’s the GPU that cannot physically render more pixels.
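A quick back-of-the-envelope on the pixel counts involved (just the standard resolutions named in this story) shows why the GPU is usually the deciding factor for resolution:

```python
# Pixels shaded per frame at the resolutions discussed in the article.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# The step up from 900p to 1080p means ~44% more pixels every frame.
print(f"1080p vs 900p: {pixels['1080p'] / pixels['900p']:.2f}x")
```

That 44% is almost entirely GPU-side load; it adds next to nothing to the CPU’s draw-call workload, which is why a genuinely CPU-bound game gains so little headroom from dropping it.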
So we’re left in an extremely odd situation, where even a supposed ‘leak’ is being called into question. One of the biggest issues I see is that Ubisoft have mentioned parity a few times; the only thing I feel would really satisfy people at this point is to be told (possibly with graphs showing CPU usage and other debug information) exactly why the CPU is holding the GPU back from rendering at a higher resolution.
It becomes all the stranger when reading over our analysis of Ubisoft’s own GDC 2014 presentation; their own internal benchmarking highlighted a clear and rather massive gulf between the PS4’s and Xbox One’s GPU performance.
One thing which does strike me, given the recent GDC presentation (linked above), is that Ubisoft were demonstrating the Playstation 4 using DX11 code. I do wonder if it’s possible Ubisoft are using the high-level API on the PS4, and not the low-level optimization path. The high-level API on the Playstation 4 is GNMX, which is fairly similar to DX11; GNM is the low-level API. It’s difficult to know for sure, but Ubisoft did say they had to manually manage much of their compute.
For PC folks who’ve ever bought a new set of components and had the new GPU arrive ahead of the CPU / motherboard, you might have given in to the “ohhh, shiny” urge and plugged the GPU into the older system to try it out. Sure enough, you’re able to push the resolution or anti-aliasing up, but certain in-game settings (those which tax the CPU) still badly affect frame rates. Indeed, you might even raise or lower the resolution with little difference in the FRAPS frame counter.
The whole reason is that the CPU can’t keep up with the GPU – so until the rest of the system is upgraded, your frame rates will still suffer.
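That scenario can be sketched in a few lines. The per-frame costs here are hypothetical, chosen only to illustrate the shape of the problem: if GPU cost scales with pixel count but CPU cost doesn’t, a CPU-limited system shows the same frame rate at every resolution.

```python
# Hypothetical costs: GPU time scales with pixels, CPU time is fixed.
CPU_MS = 33.0                # per-frame CPU cost (game logic, submission)
GPU_MS_PER_MEGAPIXEL = 9.0   # GPU shading cost per million pixels

def fps_at(width, height):
    gpu_ms = (width * height / 1e6) * GPU_MS_PER_MEGAPIXEL
    return 1000.0 / max(CPU_MS, gpu_ms)  # the slower unit gates the frame

for name, (w, h) in [("720p", (1280, 720)),
                     ("900p", (1600, 900)),
                     ("1080p", (1920, 1080))]:
    print(f"{name}: {fps_at(w, h):.1f} fps")  # identical at every resolution
```

Only past roughly 3.7 megapixels (somewhere around 1440p in this made-up model) would the GPU overtake the CPU and resolution start to move the frame counter.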
And then there’s a user over at Reddit who points out that three folks from Ubisoft came to visit to provide a lecture and answer questions on games development. Recounting the session, the user said the following:
“The Game Architect said that they aim for 60 fps but due to “limitations”, they have to settle for 30 fps in recent games. He then implied that console makers are pressuring them into doing the same thing on PC.”
We’re now left in a rather unfortunate situation, where even if Ubisoft (or, for that matter, several other AAA developers) told the honest truth, gamers wouldn’t know whether they could believe them.