
Sony Issues Statement Regarding PS4 RAM - 5GB Available to Developers?

But what about the directly flexible memory?

Sony have issued a statement to Eurogamer clarifying how much memory developers will have access to. Essentially, they’ve explained the differences between “flexible” and “direct” memory, with Eurogamer working out that there is likely 5GB available to developers: 4.5GB of conventional RAM and 512MB of the “flexible memory”.

We would like to clear up a misunderstanding regarding our “direct” and “flexible” memory systems. The article states that “flexible” memory is borrowed from the OS, and must be returned when requested – that’s not actually the case.

The actual true distinction is that:

  • “Direct Memory” is memory allocated under the traditional video game model, so the game controls all aspects of its allocation
  • “Flexible Memory” is memory managed by the PS4 OS on the game’s behalf, and allows games to use some very nice FreeBSD virtual memory functionality. However this memory is 100 per cent the game’s memory, and is never used by the OS, and as it is the game’s memory it should be easy for every developer to use it.

We have no comment to make on the amount of memory reserved by the system or what it is used for.

So, that’s 4.5GB of direct memory, which is purely allocated to the game, and then, according to Eurogamer, the flexible memory is “virtual address space”, of which 512MB is used for the “physical area”, leaving 512MB available.
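As a loose illustration of the distinction being drawn here: “flexible” memory behaves like OS-managed virtual memory, where a large address range is reserved up front and physical pages are only committed as they are touched. A minimal sketch using Python’s anonymous `mmap` — purely an analogy for the concept, not a reflection of Sony’s actual API:

```python
import mmap

# Hypothetical "flexible" budget: 512MB of anonymous, OS-managed memory.
FLEXIBLE = 512 * 1024 * 1024

# An anonymous mapping reserves the address space; the OS commits
# physical pages lazily, only when they are actually written.
buf = mmap.mmap(-1, FLEXIBLE)
buf[0:4096] = b"\xab" * 4096   # touching a page commits it
page = bytes(buf[0:16])        # reads back what was written
buf.close()
```

The only point of the sketch is that “reserved address space” and “committed physical memory” are different quantities, which is why Eurogamer can describe a virtual address space larger than the physical area backing it.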

I’ve just realised that Sony still haven’t said how much RAM is actually used by the OS. Funny that. Maybe “not too much” would be a suitable description.

35 Comments
  1. Danlord
    Member
    Since: Sep 2009

    “Essentially, they’ve explained that there is 5GB available to developers with 4.5GB of conventional RAM and 512MB of the “flexible memory”.”

    In the quote posted, they don’t explain that at all; they only explain the difference between the Direct and Flexible memory, and don’t go to any lengths to confirm the amount of RAM allocated to games.

    Further to that, we know from Jonathan Blow that The Witness is using over 5GB, and we’ve heard from numerous reliable NeoGAF sources (as linked in the other post) that some games are currently using 6GB of RAM.

    Keep in mind, the Eurogamer article is posted by Leadbetter, the author in recent times of some highly questionable posts regarding PS4 and Xbox One.

    Comment posted on 29/07/2013 at 14:21.
    • Sir_Jay
      Member
      Since: Dec 2009

      I completely agree with the comment above. Sony haven’t confirmed or denied anything.

      Comment posted on 29/07/2013 at 14:36.
    • evret
      Member
      Since: Mar 2012

      THIS
      sony only cleared up the misunderstanding about the flexible memory, they DID NOT confirm how much RAM is allocated to the OS or how much is available to games

      Comment posted on 29/07/2013 at 14:39.
      • Blair Inglis
        hisnameisblair
        Since: May 2009

        Yeah, I think I misread Eurogamer’s comment as Sony’s. Updated.

        Comment posted on 29/07/2013 at 14:50.
      • Lymmusic
        Member
        Since: Mar 2010

        i made the same mistake as blair here yesterday…

        Digital Foundry have a fairly good reputation and technical knowledge but have in the past been very Xbox biased, but I think the general tone of their article wasn’t bashing Sony or the PS4. Some people seem to be behaving as though they are.

        I doubt Sony will reveal their OS footprint ever, and the RAM available is huge compared to current gen. The PS4 is going to be a great machine, and unless you are planning on coding on it, I would just stop worrying about RAM. :)

        Comment posted on 29/07/2013 at 15:02.
      • cc_star
        Team TSA: Writer
        Since: Forever

        You take the reporting of Xbox performance advantages, after thorough & factual analysis, in the majority of (not all) multiplatform titles over the course of the generation as Xbox bias? Weird.

        Comment posted on 29/07/2013 at 15:17.
      • Danlord
        Member
        Since: Sep 2009

        I do partially see what LYMMUSIC is getting at in regards to Digital Foundry in some instances – or should I say, more specifically, Leadbetter.

        From recent memory, he has done an article that somehow magically improved the bandwidth of the eSRAM, which NeoGAF commenters have thoroughly shown to be clear exaggeration, fluffing the numbers to attempt to match the PS4’s bandwidth from GDDR5. Then there is the article that claims the PS4 records 7 minutes of gameplay when in actual fact it was 15, justifying it with low-quality audio from a conference when a SCEE speaker at Develop said ‘several’ – understandable, but okay. Then there’s this situation: the PS4’s RAM. I’m holding NeoGAF in high regard as to the validity of their claims, as I know that many have confirmed insider status and have been right numerous times with insider information, and they have claimed that this article is false.

        I guess I shouldn’t paint DF with the same brush, given it’s Leadbetter’s reporting – such as the Final Fantasy XIII ‘face-off’ that was supposed to happen but really was an excuse to explain how the Xbox 360 version should’ve been better. I find that kind of article questionable, because there are barely any occasions where an inferior PS3 version of a game is given the same coverage of how it should’ve been better. There have also been numerous occasions where, when the Xbox 360 version is slightly inferior, it’s put down to the developers, downplaying the PS3 ‘advantages’ in a different light – like the Portal 2 face-off, by (you guessed it) Richard Leadbetter. One author is seriously putting into question the validity of DF articles. The cynicism is spreading further, to whether or not the face-offs are truly representative and whether they are cherry-picking specific moments in the game to further a bias. It’s cynical, but in recent times I feel somewhat justified.

        Sorry for the long comment, but this matter is worth discussing.

        Comment posted on 29/07/2013 at 15:46.
      • Lymmusic
        Member
        Since: Mar 2010

        Nail on the head there danlord… We all know the 360 had marginally better performing multiplats, and we know why, but Leadbetter’s bias was manifest in subtle but tangible ways. Any small victory a PS3 port may have had was often brushed aside, whereas the same technical subtlety was always a deal breaker if it was an otherwise underperforming 360 game. But it’s his opinion, and I read most face-offs knowing it was just that. The tests he did were fun ways of learning technical terms that have helped me understand 3D graphics technologies, but I’d never base a purchase off them.

        But anyway i digress…

        Comment posted on 29/07/2013 at 16:07.
      • JesseDeya
        Member
        Since: Jan 2010

        I tend to agree with Lymmusic and Danlord. You can’t read Leadbetter’s articles, particularly as a collective, without getting a strong feeling he has an axe to grind with Sony. More specifically, he seems to be pretty cosy with Microsoft sources.

        That’s all fine and beaut, but these days I take what he ‘reports’ with a healthy grain of salt.

        Comment posted on 30/07/2013 at 06:24.
    • TSBonyman
      Member
      Since: Dec 2009

      We also know that, historically, development machines have had more RAM available to them than retail machines, so it might turn out that when The Witness is completed it will have been compiled to run in less than 5GB. But I’m just speculating :)

      Comment posted on 29/07/2013 at 14:56.
      • Danlord
        Member
        Since: Sep 2009

        It is true that development kits have an additional overhead for debugging/testing of games, but his game is loaded into RAM, and the allocation thus implies that 5GB+ is dedicated to gaming. This tweet also touches slightly on your comment about it: https://twitter.com/Jonathan_Blow/status/343886658632036353

        Comment posted on 29/07/2013 at 15:11.
      • TSBonyman
        Member
        Since: Dec 2009

        Cool, thanks for clearing that up.

        Comment posted on 29/07/2013 at 15:44.
  2. cc_star
    Team TSA: Writer
    Since: Forever

    3GB-3.5GB used by the OS.
    Xbox One uses 3GB which runs a game OS including its GameDVR recording feature and TV OS simultaneously to allow instant switching between the two.

    Still struggling to see what Sony are doing to use so much; if it really is a 1.5GB OS and 1.5GB reserved for 15mins of game recording, that’s just a bizarre design choice, forcing recording on devs & players.

    While the RAM is plenty early in the gen, especially as most games are cross-gen, as the gen kicks on it will become just as limiting as it is now.

    Hopefully Sony have a new feature to reveal that explains why their OS takes up as big a footprint as Microsoft’s, which, with the features announced so far, does more with the TV stuff alongside the usual game features.

    Comment posted on 29/07/2013 at 14:25.
    • uncharted86
      Member
      Since: Jan 2012

      Yeah, 3-3.5GB for the PS4 OS seems like overkill. Fortunately, Sony insiders at NeoGAF are confirming that it’s a 6/2GB memory split between what game developers have access to and the OS.

      Comment posted on 29/07/2013 at 14:47.
    • HunterGatherer
      Banned
      Since: Jul 2013

      This comment is hidden.

      Comment posted on 29/07/2013 at 14:48.
      • KeRaSh
        Member
        Since: Nov 2009

        This sounds very plausible. If they can shrink the OS and release that RAM to game devs then that would be pretty cool.
        GAF sources say that even the 6/2 situation will not be how it stays in a few years, so this really is just a storm in a teacup.

        If they really do write the 15 minutes of video footage to RAM (which I think is not the case) then it must be to keep the HDD from stressing out under the constant recording.
        They must have had this feature in mind long before the RAM upgrade, and I can’t imagine they’d waste 1.5GB of only 4GB of RAM on such a feature.

        Comment posted on 29/07/2013 at 19:05.
    • Amphlett
      Member
      Since: Jul 2009

      Is the 1.5GB OS and 1.5GB video reservation really such a bizarre design choice?
      Sony’s engineers have been working day and night on this design for a few years; they’ve scrapped the CELL-based bespoke architecture, and this time around they’ve pulled in engineers from development studios to discuss and have direct input. That sounds to me like a decent design process and, without the benefit of seeing all levels of design documentation, I would put a level of trust in their ability and decisions.

      Comment posted on 29/07/2013 at 14:49.
      • uncharted86
        Member
        Since: Jan 2012

        Yes, because they have not, as you say, been working for years with an 8GB system in mind, but rather a 4GB system. With 4GB of memory, an OS of 3GB would be a pretty bad idea. The bump to 8GB only happened shortly before the February reveal.

        Comment posted on 29/07/2013 at 14:52.
      • cc_star
        Team TSA: Writer
        Since: Forever

        If the vid were to take up a good portion of RAM, then yes it would be a bizarre design choice.

        Sure there’s more than plenty of RAM stepping up from PS360 to PS4, but after a couple of years will it still seem like such a lot?

        There can’t be a one-size-fits-all. I’d much rather have seen Sony mandate 30sec ‘brag clips’ (enough to show the decisive kill, crash, puzzle solve or whatever) and provide the tools & OS support for devs to scale that up to whatever they want. If they want to eat into available game RAM by providing 15mins or whatever of game recording then fine, but if a dev needs the RAM space to fill with textures, effects or whatever to help them hit their target framerate, then it should be the dev’s choice.

        Who even needs game recording beyond brag clips, other than reviewers and those who think they’re YouTube famous – we can’t all be YouTube famous, because then no one will be :p

        Comment posted on 29/07/2013 at 15:34.
      • HunterGatherer
        Banned
        Since: Jul 2013

        This comment is hidden.

        Comment posted on 29/07/2013 at 20:18.
      • a inferior race
        I'm special
        Since: Jul 2009

        Yay! Misinformed insults.

        Comment posted on 29/07/2013 at 21:27.
    • TSBonyman
      Member
      Since: Dec 2009

      I expect it must be reserved for something like Gaikai or some other feature they haven’t revealed. I thought at first it might be that they were reserving some processing power for the second wave of titles, but I can’t see an extra 512MB or 1GB making a huge difference. Then again, it might just be enough to allow for improvement.

      Comment posted on 29/07/2013 at 15:05.
  3. OneShotWook
    Member
    Since: Jun 2010

    Makes for a bad headline, but until they go into a lot more detail (or just release it) all we really have are comparisons to current PC software/hardware, which to my knowledge still doesn’t have any fully 64-bit+ games.
    It all seems a bit moot, but I tend to agree with you, CC, that it does appear short-sighted.
    (I’ve not looked at Xbox future tech so cannot comment on that.)

    Comment posted on 29/07/2013 at 14:50.
  4. Bilbo_bobbins
    Member
    Since: Jun 2009

    Hold on, how is this still news? Sony haven’t confirmed anything, from what I know.
    Eurogamer and TSA seem to have been posting similar stories lately, from what I’ve noticed. If it was me I would steer clear of that, because TSA normally stays away from the big sites to bring decent articles, not more speculation.

    We all know as the console matures more memory will be available, not that it matters in the first place.

    Comment posted on 29/07/2013 at 14:58.
    • JesseDeya
      Member
      Since: Jan 2010

      I agree, it’s not news, particularly when you consider the original TSA article on this topic had already been updated to reflect the EG update, AND there has been a subsequent article citing developer tweets and GAF posts debunking the EG ‘report’.

      Comment posted on 30/07/2013 at 06:26.
  5. joeybamboo
    Member
    Since: Nov 2008

    I don’t really think it matters that much to be honest; as long as it plays well and looks good, I don’t really care how much is used by the OS or the game. Can’t believe people actually see this as an issue. Either way it’s still more than the PS3 uses, so it’s win-win.

    Comment posted on 29/07/2013 at 15:06.
  6. Jones81
    Member
    Since: Mar 2010

    Gddr5! Ram! Teraflopalops! Woop woop !

    Comment posted on 29/07/2013 at 15:13.
  7. Dar-Kaus
    Member
    Since: Apr 2009

    Never mind the poxy memory – just Googleplex up the number of data transfers per second and Sony and MS will be quaking in their boots, because it’s just a sodding numbers game to some people.

    Comment posted on 29/07/2013 at 16:09.
  8. cam the man
    Member
    Since: May 2009

    I thought there would be more RAM available to developers, at least 6GB. Still loads more than the PS3.

    Comment posted on 29/07/2013 at 16:26.
  9. stormy
    Member
    Since: Apr 2010

    Why does it need so much RAM for an OS? I just don’t get it…

    Comment posted on 29/07/2013 at 16:51.
    • HunterGatherer
      Banned
      Since: Jul 2013

      This comment is hidden.

      Comment posted on 29/07/2013 at 20:13.
  10. orimisac
    Member
    Since: Apr 2012

    You want to know how much is used by the OS? They answered that when they stated: “some very nice FreeBSD virtual memory functionality. However this memory is 100 per cent the game’s memory, and is never used by the OS, and as it is the game’s memory it should be easy for every developer to use it.”

    Then you go and look at the FreeBSD memory footprint. You’ll have 512MBytes allocated for the OS. Supposing you have a diskless install (meaning everything in a RAM disk) you’ll need about 1.6 to 4.5GBytes. I guess Sony didn’t use XWindow/Gnome/KDE nor several other things (sendmail/apache/MySQL/etc), and I guess Sony does not use a “diskless” FreeBSD (because it comes with an HDD >= 500GBytes and the OS in flash).

    So, where do they use the “disappeared memory”? As was noted in a previous entry, both Sony AND Microsoft stated that they’ll allow for zero-latency switching between games and other games/applications, and Sony announced that it will be possible to play games while downloading them. This can only be accomplished if you have a fracking memory swap/RAM disk area where data is kept safe while swapping applications. How much memory do both Sony AND Microsoft reserve for this? They haven’t stated that yet. IMHO something around 2GBytes. And that would represent the limit of memory that cannot be used by game developers: 512MBytes OS + 512MBytes~1GByte of OS data (including swap) + 2GBytes reserved, i.e. 2 + 1 + 0.5 ≈ 3.5GBytes or so, and 8 − 3.5 ≈ 4.5GBytes. Easy numbers, even for extremely dumb or ill-intentioned people.
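The back-of-the-envelope split in the comment above works out as follows; every figure is the commenter’s guess, not a confirmed specification:

```python
# All values in GB; figures are the commenter's guesses, not specs.
TOTAL = 8.0
OS_CORE = 0.5          # base FreeBSD-style OS footprint
OS_DATA = 1.0          # OS data, including swap bookkeeping
SWITCH_RESERVE = 2.0   # guessed reserve for instant game/app switching

reserved = OS_CORE + OS_DATA + SWITCH_RESERVE
available = TOTAL - reserved
print(reserved, available)   # → 3.5 4.5
```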

    So, this stuff is bad journalism from beginning to end, and it has that (very) bad taste of lobby-directed merchandising, because BOTH incoming consoles make very similar use of memory… and only Sony is being questioned about “memory issues”. Sooo… how much are you earning from Microsoft?

    Anyway, it is always possible to go back. It’s always possible to dismiss the functionality of fast swapping, playing while loading, sharing, etc. It’s always possible to go back to the concept of a PS3 (or XB360) with more memory. Is that what gamers want? To be stuck in the LAST generation’s paradigm forever?

    On the other hand, what do 4.5GBytes (or 5GBytes) of allocated memory mean for games? Just as a measure, current games load in a 512MByte space, on any platform you want to mention (PC, PS3, XB360). 4.5GBytes is 9 times more memory. But it’s more than that: besides the 9 times more memory, there’s a fast swapping capability that allows zero latency when loading chapters. Also, it’s possible to swap between games (meaning, for instance, that it’s possible to play multiplayer CoD and, when your team disbands, switch to Campaign mode with zero latency). Is that a bad use of “platform resources”, as so emphasized by media and blogs written by people who didn’t pass the first year of a Computer Science course? I don’t think so.

    Comment posted on 29/07/2013 at 17:10.
    • seedaripper1973
      Member
      Since: Forever

      *ouch*

      Comment posted on 29/07/2013 at 19:52.
    • bunimomike
      Member
      Since: Jul 2009

      Wonderful reply, fella. If they use the flexible memory as a swap space in the way you mention, then the seamlessness and immediacy will be nothing short of breathtaking. God knows the devs are probably very happy with 4GB, as they don’t have that luxury on the PC, where there’s no fixed hardware spec. Equally, as mentioned by many, Sony can always “give a bit back” as time goes by, and we may see the limit increased by half a gig or so. Even if not, Sony’s battle plan for the next few years is a “best guess” and we’ll all look forward to seeing how that unfolds. :-)

      Comment posted on 29/07/2013 at 21:30.
      • orimisac
        Member
        Since: Apr 2012

        But 4GBytes is plenty of memory for a game. Start at the current standard: 512MBytes (PS3 and XB360). Then double it, double it again, and double it again, and you have 4GBytes – eight times the current memory. So even at 4.5GBytes there’s nine times more memory; we’re talking about roughly the square of three times the current memory.
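The doubling arithmetic above can be checked directly, using the comment’s own figures in MB:

```python
# 512MB is roughly the game-available memory on PS3/360; doubling it
# three times gives 4GB, and 4.5GB is nine times the current figure.
CURRENT = 512                 # MB available to games this gen
doubled = CURRENT * 2 * 2 * 2
ratio = 4608 / CURRENT        # 4.5GB expressed in MB
print(doubled, ratio)         # → 4096 9.0
```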

        Thinking in terms of representation space, this increase in memory allows for (representation) maps 8, 9 or 10 times larger than current ones. That is the increase in complexity gained just by means of the enlarged memory space, and it allows multiple AI engines, much more complex physics engines, etc.

        And that’s good news for PC players too, because now game developers will adopt 4GBytes as the baseline for game development. Perhaps they’ll even adopt fast swapping and integration with apps as a baseline for PC games. And, at least to me, this is really great news.

        Is everything roses? Not really. Why not?

        Because this increase in possible complexity maps to a corresponding increase in development costs. It will be necessary to balance the gains of not having to think in terms of small memory against having to deal with exponential complexity. That’s why I guess the first games won’t differ that much from current-generation games. Investment must be made in developing SDKs that make it possible to deal with the increased complexity (larger maps, multiple AI engines, multiple physics engines, advanced multimedia, more complex graphics, etc).

        Besides, memory is important, but it is necessary to understand the whole architecture to see where other performance gains are possible. Now developers can use up to 6 cores for multithreading and GPU pipelines for vector calculus. I don’t have a comparison between the AMD pipelines and the older Cell + NVIDIA GPU pipelines. I know for sure it’ll be easier to program the AMD pipelines, but IMO I’d prefer NVIDIA, because ATI GPU pipelines are usually programmed via OpenCL, which is not as good as CUDA for extracting performance. But again… as the PC is the baseline for graphics hardware, I just imagine that advanced GPU vector processing will be left aside for non-exclusive games.

        Comment posted on 30/07/2013 at 19:03.
