The Witcher 3 Developer Says DirectX 12 Will Not Help Xbox One Reach 1080p

The Witcher 3: Wild Hunt lead engine programmer Balazs Torok has said the forthcoming DirectX 12 update will not help the Xbox One reach 1080p, something that gamers had been hoping for. Many titles run at a slightly lower resolution on the Microsoft console, Bungie’s Destiny being one of the few that matches the output of the PlayStation 4.

“I think there is a lot of confusion around what and why DX12 will improve. Most games out there can’t go 1080p because the additional load on the shading units would be too much. For all these games DX12 is not going to change anything,” said Torok.

“They might be able to push more triangles to the GPU but they are not going to be able to shade them, which defeats the purpose,” he added.

Stardock Entertainment’s CEO, Brad Wardell, expressed a similar sentiment in April this year, suggesting “The PS4 hardware is substantially better overall,” and that Microsoft’s choice of memory was to blame.

Developers are continually improving the efficiency of their game engines, so the Xbox One may eventually match the PS4, but it looks like DirectX 12 may not be the solution everyone was hoping for. Microsoft recently allowed developers to drop Kinect support within games in exchange for a small boost in power, which should also help in the quest for 1080p.

Source: GamingBolt


26 Comments

  1. Funny, I play games because I enjoy them. I don’t find a slight drop in resolution really affects that enjoyment. I guess other people have different priorities when playing a game.

    • Funny, this logic didn’t seem to apply last generation, when we had to endure 400x zoomed-in comparisons of GTA IV…

      Of course it matters, and this news fails because Sony will also improve, so any gap now will stay, and more likely widen, as developers get to flex the PS4 hardware more once the Xbox parity clauses are torn up.

      DX12 was never going to change things; anyone who works as a developer knows this.

    • In an ideal world, this would prompt a technical discussion in the comments of DX12 and how it reduces the load on the CPU, freeing it up for other tasks. Your comment seems pretty irrelevant to the article and seems to be inviting/baiting an argument.

      • Errr, not what I was going for. I was just saying that I don’t care what hardware a game runs on, or if it has a billion more pixels or whatever. So long as the game is fun (you know, fun, enjoyment, the idea behind most games) then I’m not that fussed. I enjoyed playing Red Dead on PS3 so much; did it matter that it looked better on the 360? Nope, not in the slightest, because the story was great, the gameplay excellent, the sound wonderful… all of which made it FUN!

      • Ok, just thought your comment came across that way a bit. I’m not saying anyone would buy a game without prioritising enjoyment (‘fun’ I would disagree with; horror games, for example, are they ‘fun’?).

        But today’s high end games are technological marvels and surely that warrants interest and debate?

        Like, DX12 should reduce a game’s dependency on CPU speed (through multi-threaded rendering and lower draw-call overhead), potentially allowing more complex games and a more stable framerate. So it really does affect enjoyment.

        However, resolution has almost nothing to do with CPU power, so this developer is right that it won’t directly help going from 720/792/882/900/whatever-p to 1080p; the rough numbers below illustrate why.
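
        To put that in perspective, here’s a quick back-of-the-envelope sketch (plain Python; the widths are just the standard 16:9 framebuffer sizes for each height, an illustration rather than any game’s actual numbers). The shading work the GPU does each frame scales with the pixel count, while the CPU-side cost of submitting draw calls, the part DX12 actually helps with, does not:

        ```python
        # Back-of-the-envelope pixel counts: per-frame shading work scales
        # with resolution; CPU-side draw-call submission cost does not.
        resolutions = {
            "720p": (1280, 720),
            "792p": (1408, 792),
            "882p": (1568, 882),
            "900p": (1600, 900),
            "1080p": (1920, 1080),
        }

        base = resolutions["900p"][0] * resolutions["900p"][1]
        for name, (w, h) in resolutions.items():
            pixels = w * h
            print(f"{name}: {pixels:>9,} pixels per frame "
                  f"({pixels / base:.2f}x the 900p shading load)")
        ```

        Going from 900p to 1080p alone means shading 44% more pixels every frame, and that load lands squarely on the GPU’s shading units.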

  2. This is officially the worst gen of gaming.

    • From a media/community perspective it’s horrid but it’s not the generation’s fault. It’s ours. The shitstorms we see brewing on websites across the world are nothing more than a disgrace when you look at the context of what the industry is – a wonderful place for us to enjoy gaming!

      Sad, isn’t it.

      • I’d say the blame is shared between fans who want to go on about their console being superior, and the writers who publish the articles, stoking the fires. As you say, Mike, gaming is about enjoying yourself; too many have forgotten that!

      • Spot on, fella. Very much why I mentioned media/community. The providers and the readers.

        The stuff I read (thankfully not usually on TSA) is nothing short of contemptible. Weird how the internet can show such profundity and equally be like naughty children playing up and in dire need of a fu***** good slap.

        Please appreciate I don’t go around slapping children! :-)

    • Amen to that, mate. I refuse to buy Destiny on a next-gen console!! Not because the game is rubbish; it’s not, it’s one of the best games I have played in a long time and I love it. But 30 FPS? Pfffttt, no thanks ;) ;)

  3. I guess it depends on how their engine works and how they choose to use the EdRAM. Microsoft made the EdRAM way too small, forcing developers to choose between a higher-resolution framebuffer or being able to use more of the high-speed EdRAM for other things. You can bet that if Destiny manages 1080p on the Xbox One, the resolution of other parts of the picture will go down, like shadows.

    • Yep. Not sure how they’re going to get around it, but maybe there’s something truly clever they can do. My programmer friends (who are all genuinely talented guys, usually running their own companies) mentioned it straight from the off, and most of them aren’t even gamers.

      Words to the effect of “the DDR3 coupled with the eSRAM is going to be a big, big problem for 1080p when matching the PS4 and the PC”. Most devs do seem to know the situation but I’m fascinated to see if there’s a workaround.

    • I feel so sorry for saying this, but it’s eSRAM: Embedded Static Random Access Memory.

      eDRAM is what the Wii U and 360 have.
      The ‘D’ stands for Dynamic.

      eSRAM can hold its data for as long as it’s fed power, whereas the older DRAM requires a constant refresh. SRAM is also much faster.

      But yeah, the problem is capacity, not speed or processing power. 32MB is enough for 1080p if the game is otherwise like a seventh-gen title, hence the Halo Collection running at 1080p and 60FPS (some rough maths on this below). The problem is that when you add the better lighting, shaders and all the wizard’s jizz that makes up a “next gen” game… it falls woefully short.

      It’s weird that Microsoft, the creators of DirectX, failed to see the importance of this bottleneck. Perhaps they planned on a bigger amount, but the added cost was too much, forcing them to reduce it? Still, I assume there are ways to make the best of it (we’ll see next year with Halo 5) and 900p isn’t the end of the world.
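
      To put rough numbers on the capacity point (a back-of-the-envelope sketch; the 4-bytes-per-pixel formats and the four-target G-buffer are illustrative assumptions, not any real engine’s actual layout):

      ```python
      # Rough framebuffer arithmetic against the Xbox One's 32MB of eSRAM.
      # Formats are illustrative assumptions: 4 bytes per pixel for both
      # colour (RGBA8) and depth (D24S8); real engines vary.
      ESRAM_MB = 32

      def target_mb(width, height, bytes_per_pixel=4):
          """Size of one full-screen render target in megabytes."""
          return width * height * bytes_per_pixel / (1024 * 1024)

      for name, (w, h) in {"900p": (1600, 900), "1080p": (1920, 1080)}.items():
          colour_plus_depth = 2 * target_mb(w, h)    # seventh-gen style
          gbuffer_plus_depth = 5 * target_mb(w, h)   # 4 G-buffer targets + depth
          print(f"{name}: colour+depth = {colour_plus_depth:.1f}MB, "
                f"deferred G-buffer+depth = {gbuffer_plus_depth:.1f}MB "
                f"(budget: {ESRAM_MB}MB)")
      ```

      A simple colour-plus-depth setup fits at 1080p (about 16MB), which is roughly the seventh-gen case and why something like the Halo Collection can manage it, but the assumed deferred G-buffer needs around 40MB at 1080p while still squeezing into 32MB at 900p.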

  4. Meh.

    I don’t have an Xbox One, but the differences aren’t drastic. If you can’t see something unless you go searching for it, you have to ask yourself why you’re searching for it at all.

    Just play a game on the platform of your choice, they all look good at this point.

    • What if the differences aren’t drastic yet? Like the XBONE, the PS4 will also be optimized over time. I remember every XB360 multiplat looking far superior compared to the PS3, simply because it was a whole lot more effort to optimize for the PS3.

      • To be honest, I think this will become less of a problem (to most people) as each console’s game library grows.

  5. It’s starting to look like most of the games I think are visually impressive for the new gen don’t reach 1080p on the Xbox One.
    That could explain why I personally thought Destiny’s visuals looked a bit poor; lo and behold, Destiny will be 1080p on the Xbox One.

  6. I find myself (having just turned 41, having started out gaming on a Sinclair ZX81) in August 2014 with BT seemingly unable to supply a stable 8 Meg connection to a house 500 yards from the bloody exchange, MS charging me to use an outdated web browser on the 360 that Gmail tells me will soon be unsupported, and everything new seeming to have massive day one patches and wanting to sell me DLC on PS3/360…

    If I ‘upgrade’ I’ll be forced to pay to play online on either machine (PS4/Xbox One), face bigger patches and media abilities worse than my existing formats (360/PS3), and games will still struggle to reach 1080p/60fps, something I was promised last generation.

    First world issues, but I cannot honestly think of a generation where I’ve been so content to stick with what I’ve got.

    People really need to get over the ‘OMG it won’t be 1080p’ issue, though; Lair was 1080p on the PS3 and that was an utter turd of a game.

    If developers are going to feel pressure to ensure 1080p on the Xbox One comes before everything else, expect it to ‘cost’ in other areas…

    Basically be careful what you wish for.

    The arms race has been odd these past few years. Nintendo fans (I’m an owner of DS, GC, GB etc., and had a SNES and N64) once loved to harp on about the SNES’s Mode 7, better sound chip, more colours etc. over the Mega Drive, but by the time of the Wii/Wii U, suddenly it wasn’t about the most powerful platform…

    MS fans (I own an Xbox and 360) were happy to do the same with the classic Xbox over the PS2 and the 360 over the PS3 (Digital Foundry posts made me shudder at times). Yet now that Sony has finally woken up to the fact that decent hardware needs decent amounts of RAM (the PS1, PS2, PS3 and PSP all suffered from limited RAM), it’s time to accept that this time it’s Microsoft’s choices that are proving a little awkward for developers. But it’s not a game killer…

    The lack of must-have/‘only on…’ games on both PS4 and Xbox One is, for me, proving a far bigger issue than the resolution they run at, though I had hoped this gen might see 1080p as standard for all those who own 1080p TVs. I myself am still happy with my 720p set.

    So no. Removing Kinect, using the cloud, changing to DX12… whatever. While all of that will improve performance, there’s always going to be a trade-off between what a developer would like to do and what the hardware will allow. But then again, if you were looking for cutting-edge visuals each time you bought a multi-platform game, you’d have gone the PC route, which can be upgraded, right?

  7. At the end of the day, if you have only one console, then it only matters how good the game looks on it. Whether another console has a (slight) advantage or not shouldn’t matter, as long as you enjoy the games you have.

    Having said that, the gaming world seems to spend a lot of time boasting about how one machine is better than another. I still remember a friend and I comparing various games: Rescue on Fractalus on the Commodore 64 and Atari 800XL (the C64 version sounded slightly better and loaded much faster, but the Atari version looked obviously better); Dropzone looked better on the XL while the C64 sounded better, but the disk version on the Atari loaded even faster than the turbo tape version on the C64. And then there was Tir Na Nog, an awesome game on both ZX Spectrum and C64, which I recently ruined my teenage memories of when I noticed, while playing it on the Pi, that the hero even had manboobs!

    End conclusion: Play games for the gameplay, and enjoy the machine you have!

    (and sorry for long winded post :P)

  8. Long post, but seriously, BT? Ten days on, so many emails, phone calls and texts, evidence the entire village is affected, and you’re still telling me it’s my property that’s at fault and that there’s likely to be a (standard) £130 charge…

    30 mins to download a 3 Meg patch on 360 last week….

    What resolution a game runs at is the last thing I’m worried about. Those massive patches for PS4/Xbox One games, set against the limits of a sh*te broadband service, rate far higher, personally.

  9. @MuggleMind: Kudos for the A8 references.

    Rescue on Fractalus ran a LOT faster on the A8 (it was written for the hardware, I seem to recall); the A8 had the faster CPU so handled the ‘maths’ far better. POKEY was a great, multi-purpose sound chip, more flexible for sound FX, but the SID on the C64 was far better for music. The A8 version also had an opening sequence not on the C64. The increased speed made for a much ‘better’ game; I picked up the C64 version after moving from the A8 to the C64, found it too sluggish by comparison, and so got tired of it far more quickly.

    As for Dropzone: Archer Maclean said it ran 2.5x faster on the A8 and in 12k less code.

    The C64 won over the A8 in terms of sprites (or PMG, in the A8’s case), the SID chip, turbo loaders on tape and software support(!), but the Atari had 256 colours and the faster CPU. I found I needed both for the killer games on each, and even then the ZX Spectrum blew both away in cases like R-Type, Robocop (Speccy 128K), Chase HQ etc.

    As long as the potential of any hardware is being used fully and a game delivers quality entertainment in spades, the benchmarking stuff really shouldn’t be the issue it’s suddenly become.

    People wanted the next gen NOW, but at £299 or under, which was an impossible task for Sony or MS to deliver without incurring MASSIVE losses. And even if either platform had shipped a much faster CPU etc., by the very nature of PC hardware the PC would overtake it 12+ months down the line as newer CPUs, GPUs etc. emerged.

    Bit surprised people aren’t more up in arms about the cost of online gaming on console, or how it’s now accepted to release games and then patch them on day one or later…

    • It would appear that if you spread out horrendous amounts of money (i.e. per month, etc.) people don’t get nearly as angry about it. *looks at expensive smart phone* Strange, that. We’re so easy to con. :-)

  10. While you lot don’t care, I bet this will be near the top of the most-read pages on TSA this week. There are a lot of readers who never comment and are interested in this sort of thing, which is why I felt it was worth posting.

    • Most of them work in Microsoft damage control, that’s why. They come out and pretend graphics don’t matter… If so, why did you buy a next-generation console?

      • Obviously to play the latest games, whether they look like Mercenary Kings or inFamous.
