Testing The First PCIe Gen 5.0 NVMe SSD On Linux Has Been Disappointing


  • #31
    Originally posted by L_A_G View Post
    Looks like I'll have to re-phrase my point again; No. There is a whole host of different use cases where high bandwidth storage is useful. Even just work on larger projects in a compiled language like C++ is far from a niche use. On top of the examples you've brought up there are major things like modern day CAD/CAM work (those are on fairly big data sets these days), any sort of modelling, simulation and data visualization work (which again works on very large datasets) and 3D art/asset creation, just to name the major categories I can think of off the top of my head.
    Useful isn't the same thing as necessary. Some things you mentioned may involve a lot of data but unless you're low on RAM or you're unnecessarily loading way more than you need to, you're not going to be constantly reading/writing data to the disk. Waiting another few seconds to load the data once isn't a big deal. There is only a small handful of workloads where disk performance may impede your workflow.
    All of which are previous generation titles and, as I already explained to you, contain additional versions/LoDs of assets that aren't in the console versions of those games. Just look at the backport of the enhanced PC version of GTA5 to the current gen consoles (Playstation 5 and Xbox Series). It's about 87 GB or twice the size of the previous generation (Playstation 4 and Xbox One) release. You can see a similar, albeit not as extreme, case in the PS4/XB1 versions of Cyberpunk 2077 versus the enhanced PS5/XBS version.
    Most of those are new enough. I'm sure you can find newer examples. And yes, I'm well aware PC versions have additional versions of LoD. I thought I actually said that already but looking back at what I described, I see how I phrased myself poorly.
    Uuuh... Games don't down- or upscale assets on the fly and never have. They contain multiple copies of the same high quality base asset that have been progressively more downscaled in terms of texture quality and polygon count. The few games that can't take advantage of this technique are also generally smaller (in terms of development team and disc space use) indie titles and mobile games. Even fairly heavily zoomed in platformers like the aforementioned Crash Bandicoot 3 and Spyro took advantage of LoDs for objects in the background/distance.
    The only reason you are arguing with me at this point is because you're taking what I say way too literally. I don't and haven't been disagreeing with anything you said there.



    • #32
      Originally posted by schmidtbag View Post
      Useful isn't the same thing as necessary. Some things you mentioned may involve a lot of data but unless you're low on RAM or you're unnecessarily loading way more than you need to, you're not going to be constantly reading/writing data to the disk. Waiting another few seconds to load the data once isn't a big deal. There is only a small handful of workloads where disk performance may impede your workflow.
      So now we're going into what's necessary? Because if we're going by what's technically necessary then there's no need for SSDs altogether. Technically you can do anything off old school mechanical disc drives if you don't care about productivity. Go beyond that and technically you don't need much more than four walls, a roof and basic food.

      This is not about gains of "a few seconds", but instead minutes. The sort of thing that makes people get out of their seat and go for coffee or the bathroom.

      Maybe you're retired or not in the workforce for some other reason, but it seems as if you're badly out of touch here...

      Most of those are new enough. I'm sure you can find newer examples. And yes, I'm well aware PC versions have additional versions of LoD. I thought I actually said that already but looking back at what I described, I see how I phrased myself poorly.

      The only reason you are arguing with me at this point is because you're taking what I say way too literally. I don't and haven't been disagreeing with anything you said there.
      I think it's becoming clear that you're one of those know-it-all types who's basically incapable of admitting you're wrong. You clearly didn't understand/know why PC versions of games tend to get bigger than the console versions once a console generation starts getting on in years, nor that console games also use LoDs the exact same way. The only difference being that console versions don't include the LoDs too detailed and resource intensive for them to use. Now that consoles can use those higher LoDs we get console games that are approaching 100 GB, like the aforementioned enhanced edition of GTA5.

      You're not phrasing yourself poorly; You're using that as an excuse to not admit you didn't know something.



      • #33
        Originally posted by L_A_G View Post
        This is not about gains of "a few seconds", but instead minutes.
        1 minute @ 500 MB/s = 30 GB of data. That's if you load all of it at once. And that's assuming NVMe loads it instantly, which is not the case either, so for the difference to even reach a minute you'd need to be loading more than that in one go.
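
        If you want to play with the numbers, here's the back-of-the-envelope version (a Python sketch that only considers sequential throughput and ignores latency, decompression and everything else that dominates real load times):

        # Naive load-time estimate: size divided by sequential throughput.
        def load_time_seconds(size_gb: float, throughput_mb_s: float) -> float:
            return size_gb * 1000 / throughput_mb_s

        for label, speed in [("SATA SSD", 500), ("PCIe 3.0 NVMe", 3500), ("PCIe 5.0 NVMe", 10000)]:
            print(f"{label:>13}: 30 GB in {load_time_seconds(30, speed):5.1f} s")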

        You're clowning. Buyer's remorse? Justifying your purchase?



        • #34
          Originally posted by Weasel View Post
          1 minute @ 500 MB/s = 30 GB of data. That's if you load all of it at once. And that's assuming NVMe loads it instantly, which is not the case either, so for the difference to even reach a minute you'd need to be loading more than that in one go.
          You're talking theoretical "one big chunk of data" speeds, not the small-chunk reads that most big data sets and projects actually involve on disc. Not only that, I've worked on pointclouds that are over 250 GB on disc as part of developing software that's meant to be able to work with data sets even bigger than that. The kind where "overnight" compute jobs really are overnight jobs that you leave running when you leave at the end of the day (even on Xeon/Threadripper machines with workstation graphics cards).
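
          If you want to see that gap for yourself, here's a rough Python sketch (with a hypothetical "cloud.bin" standing in for a real dataset; drop the page cache between runs if you want honest numbers):

          import os, random, time

          PATH = "cloud.bin"                    # hypothetical stand-in for a real dataset
          CHUNK = 4 * 1024                      # the "lots of small reads" case
          size = os.path.getsize(PATH)

          with open(PATH, "rb") as f:
              t0 = time.perf_counter()
              while f.read(8 * 1024 * 1024):    # one sequential pass, 8 MiB blocks
                  pass
              t1 = time.perf_counter()
              for _ in range(10_000):           # 10k random 4 KiB reads (page cache skews this)
                  f.seek(random.randrange(0, max(1, size - CHUNK)))
                  f.read(CHUNK)
              t2 = time.perf_counter()

          print(f"sequential: {t1 - t0:.2f} s, 10k random 4 KiB reads: {t2 - t1:.2f} s")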

          You're clowning. Buyer's remorse? Justifying your purchase?
          Hardly... Since my first PCIe SSD that I got in 2014 I've bought 4 more. Two for my machines at work, two for the personal use machines that I built in 2017 and late last year respectively.



          • #35
            Originally posted by L_A_G View Post
            So now we're going into what's necessary? Because if we're going by what's technically necessary then there's no need for SSDs altogether. Technically you can do anything off old school mechanical disc drives if you don't care about productivity. Go beyond that and technically you don't need much more than four walls, a roof and basic food.
            Serious question - are you autistic? Do you have buyer's remorse? It was pretty obvious by now that this was about whether high-performance SSDs are worth spending extra for, which many would agree is synonymous with "necessary". Your literalism is really stepping on your toes (no, not literally). It's already been established that even everyday tasks can make use of high-end SSDs, as in, the drives are "useful". Useful is a meaningless word in this context and you know it.
            Anyway, we live in the real world, where most of us have to work in order to survive. Time is money, so yes, there are points where high-speed SSDs are actually necessary. When your workload is bottlenecked by your disk, that's when it really matters. Such scenarios exist, but as I keep saying over and over again: there are only a handful. Waiting a few extra seconds for something to be loaded into RAM where the disk spends the rest of your work day idle does not warrant a high-performance SSD; the time advantage is negligible.
            This is not about gains of "a few seconds", but instead minutes. The sort of thing that makes people get out of their seat and go for coffee or the bathroom.

            Maybe you're retired or not in the workforce for some other reason, but it seems as if you're badly out of touch here...
            Mind providing specifics, or are you just speaking from your own anecdotes, where perhaps you need to reevaluate how you handle your workload?
            If you're really waiting minutes just to load up a project on a modern mid-range or even a [competent] SATA SSD and the disk is the bottleneck, I have to start questioning whether you're handling the data appropriately. We're talking some seriously niche situations if you really need tens of gigabytes of data loaded into RAM all at once. It reminds me of "web developers" who "justify" having 128GB of RAM because they "need" to have 200+ tabs open. I can't fathom any context where that truly makes sense.
            I think it's becoming clear that you're one of those know-it-all types who's basically incapable of admitting you're wrong. You clearly didn't understand/know why PC versions of games tend to get bigger than the console versions once a console generation starts getting on in years, nor that console games also use LoDs the exact same way. The only difference being that console versions don't include the LoDs too detailed and resource intensive for them to use. Now that consoles can use those higher LoDs we get console games that are approaching 100 GB, like the aforementioned enhanced edition of GTA5.
            Uh, I'm the know-it-all? You're the one who is champing at the bit to correct me in moments where I'm not even disagreeing. You're so caught up in your incessant literalism where my sloppy phrasing gets you foaming at the mouth to make petty corrections. Here's the thing: you ain't saying anything that special. I didn't get more specific in my phrasing because most of what you're so excited to "correct" me about is obvious/implied.
            So enough with the red herrings and let's circle back to the original argument: I said games aren't compressed enough. You suggest otherwise, even though Steam alone can prove you wrong about that. You download quite a lot less data from Valve's servers than what the total install size is. Case closed.



            • #36
              Originally posted by schmidtbag View Post
              Serious question - are you autistic? Do you have buyer's remorse? It was pretty obvious by now that this was about whether high-performance SSDs are worth spending extra for, which many would agree is synonymous with "necessary". Your literalism is really stepping on your toes (no, not literally). It's already been established that even everyday tasks can make use of high-end SSDs, as in, the drives are "useful". Useful is a meaningless word in this context and you know it.
              Like I said in my previous post; When I've been using them for 9 years on the machines that I use for serious work, it's clear this isn't buyer's remorse. Similarly, pointing out factual errors in your posts is also hardly "literalism" so your question about autism seems more like projecting than anything.

              Anyway, we live in the real world, where most of us have to work in order to survive. Time is money, so yes, there are points where high-speed SSDs are actually necessary. When your workload is bottlenecked by your disk, that's when it really matters. Such scenarios exist, but as I keep saying over and over again: there are only a handful. Waiting a few extra seconds for something to be loaded into RAM where the disk spends the rest of your work day idle does not warrant a high-performance SSD; the time advantage is negligible.
              To repeat myself again; There's more than "a handful" and I've already listed quite a few just off the top of my head, so your repeated claim of just "a handful" is simply restating a disproven assertion. Over and over again. Seems like my guess as to you being retired or otherwise not gainfully employed was correct.

              Mind providing specifics, or are you just speaking from your own anecdotes, where perhaps you need to reevaluate how you handle your workload?

              If you're really waiting minutes just to load up a project on a modern mid-range or even a [competent] SATA SSD and the disk is the bottleneck, I have to start questioning whether you're handling the data appropriately. We're talking some seriously niche situations if you really need tens of gigabytes of data loaded into RAM all at once. It reminds me of "web developers" who "justify" having 128GB of RAM because they "need" to have 200+ tabs open. I can't fathom any context where that truly makes sense.
              If you've ever worked with big datasets you'll know that those have always been too big to fit into memory all at once and include a lot of swapping stuff in and out all the time. Processing them involves a lot of subsequent and sometimes also concurrent reading and writing data to disc. With that kind of work a fast disc is a major benefit to productivity. There's also the benefit of the work being more enjoyable when you don't need to wait on data.
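
              To make that concrete, the access pattern is roughly this (a minimal Python sketch with hypothetical file names and an arbitrarily picked chunk size, not our actual pipeline): the dataset gets streamed through in blocks because it can't live in RAM, so every pass over it is gated by how fast the disc can feed you.

              # Out-of-core pass over a dataset that doesn't fit in RAM: stream it
              # through in fixed-size blocks and write results back out as you go.
              # "scan.bin" and "filtered.bin" are hypothetical placeholder names.
              CHUNK = 64 * 1024 * 1024  # 64 MiB per read, picked arbitrarily

              def process(block: bytes) -> bytes:
                  # stand-in for the real work (filtering, decimation, etc.)
                  return block

              with open("scan.bin", "rb") as src, open("filtered.bin", "wb") as dst:
                  while block := src.read(CHUNK):
                      dst.write(process(block))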

              Uh, I'm the know-it-all? You're the one who is champing at the bit to correct me in moments where I'm not even disagreeing. You're so caught up in your incessant literalism where my sloppy phrasing gets you foaming at the mouth to make petty corrections. Here's the thing: you ain't saying anything that special. I didn't get more specific in my phrasing because most of what you're so excited to "correct" me about is obvious/implied.
              Again; Pointing out and correcting factual errors in your post is not "literalism". It is just that; Pointing out and correcting factual errors. You for example claimed that console games are better compressed based on the false impression that they use better compression when the truth was that this is because PC versions contain additional data. Something that isn't true now that there's a new console generation and the new generation has the memory and performance to use the LoDs that were previously exclusive to the PC versions.

              So enough with the red herrings and let's circle back to the original argument: I said games aren't compressed enough. You suggest otherwise, even though Steam alone can prove you wrong about that. You download quite a lot less data from Valve's servers than what the total install size is. Case closed.
              Circle back? We never left that point. You claimed that games aren't properly compressed and as an example used the console versions of games whose PC versions contained additional more highly detailed versions of assets. I pointed this out along with proof in the shape of new current generation console versions of games that are the same size as their previously larger PC versions.

              Installers being smaller than their resulting on-disc install is also hardly proof of anything when they're not real time and generally use compression techniques that aren't practical in real time use cases.



              • #37
                Originally posted by L_A_G View Post
                Similarly, pointing out factual errors in your posts is also hardly "literalism" so your question about autism seems more like projecting than anything.
                They're only factual errors when you take what I say literally, hence my point. Most people would understand the gist of what I'm getting at.
                To repeat myself again; There's more than "a handful" and I've already listed quite a few just off the top of my head, so your repeated claim of just "a handful" is simply restating a disproven assertion. Over and over again. Seems like my guess as to you being retired or otherwise not gainfully employed was correct.
                Your examples don't prove a damn thing. Using one of the workloads you mentioned (because the workloads I mentioned I know demand as much bandwidth as you can provide), show me a real life example of a real person where a decent SATA SSD is slow enough to hinder productivity where a PCIe 4.0+ SSD will not. Just one is all I ask. Hypotheticals mean nothing.
                If you've ever worked with big datasets you'll know that those have always been too big to fit into memory all at once and include a lot of swapping stuff in and out all the time. Processing them involves a lot of subsequent and sometimes also concurrent reading and writing data to disc. With that kind of work a fast disc is a major benefit to productivity. There's also the benefit of the work being more enjoyable when you don't need to wait on data.
                Depends on your definition of a big dataset. At my current job, I have a single database table with millions of records that take up over 25GB. It never goes longer than 15 minutes without something touching it, day and night. It's stored on a RAID10 HDD configuration. Obviously, that is a bottleneck, but less than you'd think - the disks aren't always the bottleneck. The other thing to keep in mind is this data is on a server, not my local workstation. Even if it were upgraded to SSDs, my PC doesn't need to handle all that data. Even for the workloads you propose, a server would warrant a high-performance SSD because there's a good chance multiple people are touching the data at the same time, and that overhead will impede productivity. But, the context here is your local workstation. That's where high-performance SSDs start to not make so much sense. There aren't a lot of situations where you have tens of GB of data on your local workstation and where the vast majority of that data would need to be accessed throughout the day.
                Again; Pointing out and correcting factual errors in your post is not "literalism". It is just that; Pointing out and correcting factual errors. You for example claimed that console games are better compressed based on the false impression that they use better compression when the truth was that this is because PC versions contain additional data. Something that isn't true now that there's a new console generation and the new generation has the memory and performance to use the LoDs that were previously exclusive to the PC versions.
                Except there are situations where consoles have more compression. I'm not saying it's common, but it absolutely happens. More importantly, I was saying there is room for more compression.
                Installers being smaller than their resulting on-disc install is also hardly proof of anything when they're not real time and generally use compression techniques that aren't practical in real time use cases.
                Except with PC versions of games, there is room for additional compression and it would most likely improve load times. A while back, I had a squashfs image of UT2004 and pretty much every community addon squeezed down to less than half its original size. On my crappy 2c/2t CPU and 2GB of RAM at the time, it shaved several seconds off load times, despite the fact the CPU and RAM were the bottleneck. Most people nowadays have a CPU with plenty of resources to spare, and for any game with loading screens, the CPU isn't typically working that hard. In the unlikely event the compression demands more CPU and RAM than what is available, it's still improbable that loading would actually take longer.
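
                If you want to sanity-check the idea yourself, here's a rough sketch (Python stdlib zlib as a stand-in for whatever codec a real engine would use, "asset.bin" is a hypothetical placeholder, and you'd want to drop the page cache between runs for honest numbers):

                # Compare reading an uncompressed asset vs. reading a smaller
                # zlib-compressed copy and inflating it in memory. When the disk is
                # the bottleneck, the compressed path can win despite the extra CPU.
                import time, zlib

                with open("asset.bin", "rb") as f:          # hypothetical placeholder file
                    raw = f.read()
                with open("asset.bin.z", "wb") as f:
                    f.write(zlib.compress(raw, 6))

                t0 = time.perf_counter()
                with open("asset.bin", "rb") as f:
                    f.read()
                t1 = time.perf_counter()
                with open("asset.bin.z", "rb") as f:
                    zlib.decompress(f.read())
                t2 = time.perf_counter()

                print(f"raw read: {t1 - t0:.3f} s, compressed read + inflate: {t2 - t1:.3f} s")
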
                Last edited by schmidtbag; 08 March 2023, 02:19 PM.



                • #38
                  Originally posted by schmidtbag View Post
                  They're only factual errors when you take what I say literally, hence my point. Most people would understand the gist of what I'm getting at.
                  Oh, so I'm supposed to be able to read your mind over the internet to know that you didn't mean those things literally? You obviously know what you mean, but when someone else sees you saying something that's factually incorrect, outside of a few narrowly defined contexts, they're simply going to see someone saying something that's factually incorrect.

                  Your examples don't prove a damn thing. Using one of the workloads you mentioned (because the workloads I mentioned I know demand as much bandwidth as you can provide), show me a real life example of a real person where a decent SATA SSD is slow enough to hinder productivity where a PCIe 4.0+ SSD will not. Just one is all I ask. Hypotheticals mean nothing.
                  Productivity is not a binary on/off thing, it's more of a sliding scale. The less time you spend waiting on data the more time you have to spend doing actual work. You also have soft factors like the fact that it's frustrating and de-motivating to have to spend time waiting on your machine. So when your tools help rather than hinder you, you're also going to be more productive through pure motivation.

                  Depends on your definition of a big dataset. At my current job, I have a single database table with millions of records that take up over 25GB. It never goes longer than 15 minutes without something touching it, day and night. It's stored on a RAID10 HDD configuration. Obviously, that is a bottleneck, but less than you'd think - the disks aren't always the bottleneck. The other thing to keep in mind is this data is on a server, not my local workstation. Even if it were upgraded to SSDs, my PC doesn't need to handle all that data. Even for the workloads you propose, a server would warrant a high-performance SSD because there's a good chance multiple people are touching the data at the same time, and that overhead will impede productivity. But, the context here is your local workstation. That's where high-performance SSDs start to not make so much sense. There aren't a lot of situations where you have tens of GB of data on your local workstation and where the vast majority of that data would need to be accessed throughout the day.
                  If you think a 25 GB dataset in 2023 is big, then it's clear you're just out of touch. Plain and simple.

                  For any workstation use in the examples I pointed out that's a plain tiny dataset. Like I pointed out in a post you seem to have ignored, I've got a dataset on hand right now that's 10 times that size and already almost a decade old, and our customers work on datasets much bigger than that. For any of the use cases I've already brought up, just off the top of my head, a dataset that size is what's considered a small dataset.

                  Seems to me like you're just doing desktop support for basic office work and that's obviously not a use case where you need anything more than a thin client. Not only that, it seems difficult for you to imagine any significant number of people doing work that's outside of what you see people doing at work.

                  Except there are situations where consoles have more compression. I'm not saying it's common, but it absolutely happens. More importantly, I was saying there is room for more compression.
                  Except I've already twice explained that this is a misconception on your part. Console games aren't better compressed, they just used to not contain the highest quality versions of assets. After the latest jump they now contain those as well and are the same size as the PC versions of the game.

                  Except with PC versions of games, there is room for additional compression and it would most likely improve load times. A while back, I had a squashfs image of UT2004 and pretty much every community addon squeezed down to less than half its original size. On my crappy 2c/2t CPU and 2GB of RAM at the time, it shaved several seconds off load times, despite the fact the CPU and RAM were the bottleneck. Most people nowadays have a CPU with plenty of resources to spare, and for any game with loading screens, the CPU isn't typically working that hard. In the unlikely event the compression demands more CPU and RAM than what is available, it's still improbable that loading would actually take longer.
                  SquashFS is a read-only file system and not meant, nor practical, for most applications. Games have also tried to avoid loading screens for the last 20 years with increasing success. These days you only really see them once upon loading into the game and after that they're either masked as you're kept in-game or assets are simply streamed in based on your location and heading in the world. You simply don't have moments where the CPU has nothing to do but load data off the disc.
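
                  The streaming pattern is roughly this (a toy Python sketch with made-up asset names and positions, not any particular engine): a background worker keeps pulling in whatever sits ahead of the camera so the game never stops and waits on the disc.

                  # Toy asset streaming: prefetch assets near the player's predicted
                  # position on a background thread so disc reads overlap with gameplay.
                  from concurrent.futures import ThreadPoolExecutor
                  from math import dist

                  ASSETS = {"rock_far.lod0": (120.0, 40.0), "tree_near.lod2": (12.0, 8.0)}  # made up
                  LOADED = {}
                  POOL = ThreadPoolExecutor(max_workers=2)

                  def load(name: str) -> None:
                      LOADED[name] = f"<data for {name}>"  # stand-in for reading off the disc

                  def prefetch(pos, heading, radius=50.0):
                      # look ahead of where the player is going and queue anything nearby
                      ahead = (pos[0] + heading[0] * radius, pos[1] + heading[1] * radius)
                      for name, at in ASSETS.items():
                          if name not in LOADED and dist(at, ahead) < radius:
                              POOL.submit(load, name)

                  prefetch(pos=(0.0, 0.0), heading=(1.0, 0.0))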



                  • #39
                    Originally posted by L_A_G View Post
                    Oh, so I'm supposed to be able to read your mind over the internet to know that you didn't mean those things literally? You obviously know what you mean, but when someone else sees you saying something that's factually incorrect, outside of a few narrowly defined contexts, they're simply going to see someone saying something that's factually incorrect.
                    If you're not autistic, yeah. Seeing as some of my posts had upvotes, surely others knew what I was saying. The thing is, you went into this with an ax to grind, so I'm not sure how it would have turned out any different.
                    Productivity is not a binary on/off thing, it's more of a sliding scale. The less time you spend waiting on data the more time you have to spend doing actual work. You also have soft factors like the fact that it's frustrating and de-motivating to have to spend time waiting on your machine. So when your tools help rather than hinder you, you're also going to be more productive through pure motivation.
                    I agree, but that's why there's a relatively short list of workstation workloads worthy of high-performance SSDs. If your work is slowed down in realtime because your drive is too slow (such as heavy compiling, UHD video scrubbing, massive data collection, managing archives, etc) then faster is definitely better. When you're loading several GB of assets into your tool that you'll be using for the rest of the day, waiting a few extra seconds is a negligible impact and your money would be better spent on a reliable drive than a fast one.
                    Again: there's a point where the size of your data starts to not make sense on a workstation. I don't think we're so much in disagreement about the many potential tasks that demand high disk performance for servers.
                    If you think a 25 GB dataset in 2023 is big, then it's clear you're just out of touch. Plain and simple.
                    For someone who nitpicks about semantics and details, I'm shocked you didn't define "big"; the word is relative. 25GB worth of textures is nothing significant. A single 4K video recording can easily exceed 25GB. Google probably consumes 25GB of disk space on their servers every tenth of a second. So yeah, I'm very well aware of how big a dataset gets, but you're moving goalposts here. In the context of textures, there isn't a good reason to load the entire dataset all at once, so a decent SATA SSD will work fine. I already discussed how UHD video scrubbing demands high-bandwidth data. What Google does is not going to fit on your workstation.
                    So: I'm talking about a single table consisting of nothing but UTF-8 text. That's not huge, gargantuan, or very big. It's just "big". Big enough that you wouldn't normally see it on a workstation, but small enough that if for some reason you were to put it on a workstation, a basic SSD could handle it just fine.
                    The thing is, it doesn't really matter how big your dataset is, what matters most is how much of that data you need to load at any particular time. Those 25GB of text might not be that extreme but it is loaded frequently.
                    For any workstation use in the examples I pointed out that's a plain tiny dataset. Like I pointed out in a post you seem to have ignored, I've got a dataset on hand right now that's 10 times that size and already almost a decade old, and our customers work on datasets much bigger than that. For any of the use cases I've already brought up, just off the top of my head, a dataset that size is what's considered a small dataset.
                    You keep saying things in very general or relative terms. If you want me to stop ignoring things, you need to be specific.
                    Seems to me like you're just doing desktop support for basic office work and that's obviously not a use case where you need anything more than a thin client. Not only that, it seems difficult for you to imagine any significant number of people doing work that's outside of what you see people doing at work.
                    Actually desktop support is pretty much the one thing I almost never do at my job, but I digress.
                    How the hell is it difficult for me to imagine when I gave real-world examples where high-performance SSDs are a necessity in workstations? The only thing you disagree with is how many more different workloads there are, and so far all you've done is propose hypothetical situations. Like really... show me 1 legit professional who is doing CAD work and bottlenecked by a decent SATA SSD.
                    Except I've already twice explained that this is a misconception on your part. Console games aren't better compressed, they just used to not contain the highest quality versions of assets. After the latest jump they now contain those as well and are the same size as the PC versions of the game.
                    Yeah, you can explain it again and it doesn't change the fact there are some games that have less compression on PC. I didn't say it's common practice, I'm saying it happens. That CoD example I brought up earlier was one of them; that's why the game was so huge - from what I recall, they didn't bother to compress the audio.
                    SquashFS is a read-only file system and not meant, nor practical, for most applications. Games have also tried to avoid loading screens for the last 20 years with increasing success. These days you only really see them once upon loading into the game and after that they're either masked as you're kept in-game or assets are simply streamed in based on your location and heading in the world. You simply don't have moments where the CPU has nothing to do but load data off the disc.
                    The fact squashfs was used isn't particularly relevant; btrfs offers some pretty good compression too without being read-only. I only brought up squashfs because it is known to have some pretty good compression for its time. The thing is, pretty much the only game data that is often updated are the binaries or libraries, but they also account for a rather small percentage of the total game. The assets don't change much, and they could be extra compressed without really any penalty.
                    Anyway, I'm glad you brought up the transition away from loading screens, because that really emphasizes my point: throughout those last 20 years, most consoles had relatively slow drives and yet they could keep up with it anyway.
                    Granted, textures are getting a lot bigger and fast. To compensate for this, you either need:
                    A. Faster drives (which you can't guarantee all gamers can have).
                    B. Be a bit more aggressive with predicting what needs to be loaded ahead of time before I/O becomes a bottleneck, which needs more RAM.
                    C. Compress the data more, so that games can load without stuttering on crappy drives, are faster to download/update, and you don't have to constantly uninstall games to play another one (see the sketch below).
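
                    On C, the trade-off is easy to eyeball; here's a rough sketch (stdlib codecs on a synthetic buffer, so the ratios are only illustrative and real game assets will behave differently):

                    # How much smaller the data gets vs. how long it takes to decompress,
                    # across a couple of stdlib codecs/levels. Synthetic input; real assets
                    # (already-compressed textures, audio) will compress far less.
                    import lzma, time, zlib

                    data = bytes(range(256)) * 100_000  # ~25 MB of fairly compressible bytes

                    for label, compress, decompress in [
                        ("zlib level 1", lambda d: zlib.compress(d, 1), zlib.decompress),
                        ("zlib level 9", lambda d: zlib.compress(d, 9), zlib.decompress),
                        ("lzma        ", lzma.compress, lzma.decompress),
                    ]:
                        packed = compress(data)
                        t0 = time.perf_counter()
                        decompress(packed)
                        t1 = time.perf_counter()
                        print(f"{label}: {len(packed) / len(data):6.2%} of original, "
                              f"decompress in {t1 - t0:.3f} s")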



                    • #40
                      Originally posted by schmidtbag View Post
                      If you're not autistic, yeah. Seeing as some of my posts had upvotes, surely others knew what I was saying. The thing is, you went into this with an ax to grind, so I'm not sure how it would have turned out any different.
                      Upvotes aren't really a proof of anything. They're just as likely to be from people who took your posts at face value and didn't know that what you wrote was factually incorrect.

                      I agree, but that's why there's a relatively short list of workstation workloads worthy of high-performance SSDs. If your work is slowed down in realtime because your drive is too slow (such as heavy compiling, UHD video scrubbing, massive data collection, managing archives, etc) then faster is definitely better. When you're loading several GB of assets into your tool that you'll be using for the rest of the day, waiting a few extra seconds is a negligible impact and your money would be better spent on a reliable drive than a fast one.
                      Again; It's not a short list. It may not be something a majority of people working in an office environment do, but there is a long list of professions and tasks where a fast SSD, something that usually costs less than 250€/TB, is well worth the money.

                      People who work on big datasets doing design and visualization also don't just load a big lump in at the start of the workday and then work on it for the rest of the day. They're usually too big to fit into memory all at once anyway, so the application will be swapping data in and out of memory from the disc. Add to that the fact that they're processing this data so they'll also be writing additional data onto the disc as part of that work. For some tasks this may even be more data than the original dataset.

                      Again: there's a point where the size of your data starts to not make sense on a workstation. I don't think we're so much in disagreement about the many potential tasks that demand high disk performance for servers.
                      Maybe, but that's not the case until you start getting into the territory of datasets so big they don't even fit on a single consumer-grade disc. When you start moving into the realm of literally multiple terabytes of data. In video production you usually keep an archive of the footage on the server and then make local copies of that as you're editing a new project. Fast discs on the server mean you can get those copies onto your even faster local discs that much quicker.

                      For someone who nitpicks about semantics and details, I'm shocked you didn't define "big"; the word is relative. 25GB worth of textures is nothing significant. A single 4K video recording can easily exceed 25GB. Google probably consumes 25GB of disk space on their servers every tenth of a second. So yeah, I'm very well aware of how big a dataset gets, but you're moving goalposts here. In the context of textures, there isn't a good reason to load the entire dataset all at once, so a decent SATA SSD will work fine. I already discussed how UHD video scrubbing demands high-bandwidth data. What Google does is not going to fit on your workstation.
                      Once again; Pointing out clear factual errors and correcting misunderstandings when they're used as arguments isn't nitpicking.

                      On topic; No. A 25 GB database is not big by today's standards. Nor is the access rate you described. I'm running something fairly similar on what is essentially the lowest end cloud instance that Google currently provides and it's almost overkill for the job.

                      So: I'm talking about a single table consisting of nothing but UTF-8 text. That's not huge, gargantuan, or very big. It's just "big". Big enough that you wouldn't normally see it on a workstation, but small enough that if for some reason you were to put it on a workstation, a basic SSD could handle it just fine.
                      This is what I've been trying to make you understand; What you consider to be big, really isn't by today's standards. Maybe by the standards of the 1990s. But not by the standards of the 2020s. Which is why I posited that you're simply out of touch with what people do at work today.

                      You keep saying things in very general or relative terms. If you want me to stop ignoring things, you need to be specific.
                      What? Is basic arithmetic too much to ask now?

                      Actually desktop support is pretty much the one thing I almost never do at my job, but I digress.
                      How the hell is it difficult for me to imagine when I gave real-world examples where high-performance SSDs are a necessity in workstations? The only thing you disagree with is how many more different workloads there are, and so far all you've done is propose hypothetical situations. Like really... show me 1 legit professional who is doing CAD work and bottlenecked by a decent SATA SSD.
                      Hypothetical examples? All of the ones I pointed out are very much real. The fact that whatever office you support doesn't do that kind of work doesn't mean that there aren't a lot of those kinds of use cases. The 250GB dataset that I mentioned? That's a LIDAR scan of the Helsinki area tram network commissioned by the city planning department, and it has been used for planning out the maintenance, improvement and expansion of the network.

                      Yeah, you can explain it again and it doesn't change the fact there are some games that have less compression on PC. I didn't say it's common practice, I'm saying it happens. That CoD example I brought up earlier was one of them; that's why the game was so huge - from what I recall, they didn't bother to compress the audio.
                      Audio is ultimately a fairly small part of the total install size on disc. That game was absolutely gargantuan on consoles as well, on account of being a battle royale game with multiple of the genre-typical absolutely massive maps (IIRC made by bodging together multiple maps from previous games). As per usual, most of the difference was made up of data simply not found in the console versions of the game.

                      The fact squashfs was used isn't particularly relevant; btrfs offers some pretty good compression too without being read-only. I only brought up squashfs because it is known to have some pretty good compression for its time. The thing is, pretty much the only game data that is often updated are the binaries or libraries, but they also account for a rather small percentage of the total game. The assets don't change much, and they could be extra compressed without really any penalty.
                      The main point of btrfs is anything but compression. Its forte is stuff like file integrity, fragmentation avoidance and auto-defragmentation, dynamic volume re-sizing and online load balancing. Its compression is primarily native support for things like zlib, which has been pretty much ubiquitous in games for the last 15 years. The devkits of the Playstation 3, Xbox 360 and Wii all had it included as a standard library from the beginning.

                      To put it as simply as I can; Your primary example is a game from before the use of zlib became standard. Your solution; To use what's already an industry standard.

                      Anyway, I'm glad you brought up the transition away from loading screens, because that really emphasizes my point: throughout those last 20 years, most consoles had relatively slow drives and yet they could keep up with it anyway.
                      Granted, textures are getting a lot bigger and fast. To compensate for this, you either need:
                      A. Faster drives (which you can't guarantee all gamers can have).
                      B. Be a bit more aggressive with predicting what needs to be loaded ahead of time before I/O becomes a bottleneck, which needs more RAM.
                      C. Compress the data more, so that games can load without stuttering on crappy drives, are faster to download/update, and you don't have to constantly uninstall games to play another one.
                      A. The Playstation 5 and Xbox Series consoles all have a fast PCIe SSD as standard so you can almost guarantee that people have them.
                      B. It's something Sony's early PC ports of Playstation 5 games have already begun to do. Housemarque's Returnal has 32 GB of RAM as a requirement.
                      C. Zlib has been an industry standard for over 15 years so the kind of gains you're thinking of don't exist beyond stuff like generated assets.

                      Oh and before you start going on about the potential of generated assets and stuff like this 177k on disc demo, I'll have to point out that they always spend more time generating those assets than it takes to read them off a half-decent disc.

