Announcement

Collapse
No announcement yet.

Testing The First PCIe Gen 5.0 NVMe SSD On Linux Has Been Disappointing

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

  • #51
    Originally posted by schmidtbag View Post
    Ah, so that's why databases, web hosting, network storage, render farms, physics simulations, etc are done on workstations!
    /s
    Uuh... None of those are online/real-time usages. You tried to say that something which has been tried on servers, and found to not be practical, should be run on servers and now you're bringing up completely separate server use cases. It's fairly obvious you're just dragging this out trying to be the "last man standing" so you can get the last word in and then pretend like you've "won" the debate.

    Your behavior and line of thinking is bizarre. How do I have an argument "anymore" if I wasn't really disagreeing with you in the first place?
    Your "agreement" is more like this:

    Me: This is why you're wrong
    You: But it means I'm right so we agree
    Me: No, your argument is still wrong

    You have this annoying tendency to look at things in an all-or-nothing perspective. I don't know how many times I have to tell you that not every game has this problem, and that you don't know how every development studio works. But go ahead, feign expertise - it's totally working out for you.
    When something is basically ubiquitous, then it's basically ubiquitous. Your counter-example from 20 years ago and before it became the norm doesn't prove anything. It's just plain outdated. Same thing with the misconception you had about how install sizes for games' PC versions used to be bigger. When you're wrong, you're wrong. Real life doesn't have any participation trophies.

    You have yet to prove how you are all that special. Just because you claim authority, doesn't make you have any. This is the internet, kid - you're nothing here. Having access to a decent workstation and working with 40-250GB chunks of datasets doesn't make you an expert.
    It doesn't make me an expert, but it does mean I can bring up real-life examples to disprove your arguments. Specially when you were claiming that I couldn't bring up a single real world use case. You gambled that I couldn't and you lost. Just accept it. I also had an old "big" dataset just at hand and a test chunk of a bigger dataset in the order of magnitude of over a terabyte.

    In the face of that, your "big" 25 GB database table was simply paltry.

    Exactly: virtual memory is a 1990s thing. Today, there isn't much advantage in it so long as you design your system around your workflow properly. RAM is relatively cheap and abundant. If you depend on virtual memory, you're either being way too cheap, you're doing things wrong/inefficiently or you have a very niche situation where perhaps you only need it very temporarily.
    If it's so abundant, then why is that 25GB database table still not something that you can easily cache? Because it further suggests the whole database would fit into the RAM of my previous decade old workstation with OS and basic applications to spare.

    No... I didn't, and insisting that doesn't make it true, just like insisting you're an expert doesn't make it true. Deliberately ignoring the complete argument does nothing to make you look competent or intelligent. You pick and choose the parts that seem stupid by themselves, but you ignore the rest.
    Again, I'm not capable of reading your mind over the internet. You brought up the size of this one table the fact that you didn't say it was just a small fraction of the whole database or representative of the size tables are so I got the impression this was not representative of the whole database. So I chanced on it and if I was wrong you could've easily corrected me, stating that "Actually, the whole database is 500GB. That was just one table for size."

    However, you didn't...

    So, why should I elaborate on anything else when you have the attention span and memory of a goldfish?
    That's pretty rich coming from you when I had to explain out to you, three times, that PC versions in the recent past less compressed than the console versions of games, they just contained higher quality assets consoles weren't able to use.

    Once again, you are inadvertently agreeing with me. Kinda amusing.
    Nah, the really funny thing here are the mental gymnastics you're going trough when you've simply run out of any substantive arguments and still can't admit you're wrong. Instead telling me "stop arguing with me" etc.

    Now, let's say there are hundreds or thousands of smaller tables rather than one big one. Then, sprinkle in some extra processing where you're not just simply doing select/insert statements. That's what my system looks like.
    You've gone on an on about this 25 GB table and now you're talking about hundreds or even thousands of smaller tables? Is that big table just the result of basically the whole table being cobbled together with join statements?

    Says the one with an authority complex.
    Says someone who's trying to claim they know something they've never worked on better than someone who literally does it for a living...

    And how long do you keep that machine? 1, maybe 2 years? Considering it makes up for most of your salary, that seems rather expensive, relatively speaking. You can't ignore peripherals either; after all, this isn't a server!
    I wrote "sans peripherals" as something that's an additional cost on top of that. The fact that you think businesses throw out equipment that expensive and new goes to show that you really don't know what you're talking about. Those machines, including the discs, usually have an amortization period of at least 4 years.

    It's pretty rare for a company to have a looser fist than a government. Spending 500€ extra so you can see more dots would only be agreeable if your boss was clueless.
    So the same companies who throw out 3000€+ in one or two years are now too tightfisted to spend another 500€ to improve and extend the lifespan of those machines? Make up your mind before you post, not after. Similarly, you think they're ready to spend over 10.000€ on generating a high quality dataset, but too cheap to actually use it properly.

    None of that changes my point; funny how I'm thought to be the strawman.
    You tried to use a straw man me with the point of those examples, I point this out and explain the actual point and now you claim the actual argument doesn't change the straw man? You're really sticking your fingers in your ears aren't you?

    I find your interpretation of my idea confusing; no wonder you think I know nothing.If you like throwing out trendy psychological terms as if it contributes anything: you've got an extreme case of confirmation bias.
    When you make arguments like suggesting we tell off customers for wanting to fully utilize something they've spent a lot of money for it's inevitable you don't look very smart.

    If your software loads in 25GB+ chunks during runtime when (in your case) the data points do not depend on one another, then you're not a good developer.
    I can't wait for you to correct me on how clueless I am about this!
    It's pretty ironic how you complain about me not reading your posts correctly and then go do that exact thing. Projecting much? Because I wrote about reading that 25 GB "in chunks", not all of it in a single 25 GB chunk. Do you not understand the concept of plural forms either? If you absolutely must know, we read in the higher density parts in chunks ranging from a few hundred MB to a few GB depending on the size and density of the dataset.

    There you go again, only quoting the parts that are convenient for you.
    Funny how I still include more of the post than you and up until the last post included your entire post in my quotes.

    Exactly, so it stands to reason that you don't know as much as you think. HuRr DuRr Dunning-Kruger!!! I provided examples throughout to back up my claims. The only thing you have to back up yourself is "trust me, I'm an expert", over-generalized claims, and a personal anecdote. I asked for more sources and you didn't provide them. Surely, it should be easy if you're that certain.
    You provided examples, I explained why they're not valid. Some of them multiple times after you brought them up again after I'd already explained why they're not valid. To counter this I brought up examples of my own from my own line of work as a software engineer with the kind of processing requirements you're claiming are incredibly niche. You haven't even tried to explain why my examples are invalid. You've just dismissed them as "anecdotes" to avoid having to come up with something substantial. This is the first time you've even asked for sources and you've not provided any yourself either.

    This isn't the comments section to some consumer electronics blog. Most of the people on here are software developers, hobbyists and students in the field. We even have major kernel developers posting here on occasion. Over here you regularly run into people that can be described as subject matter experts.

    That's why I said to do compression level 9... Obviously, you won't see any noteworthy difference doing the same exact method, and I never suggested that.
    You don't think they're going to use whatever level of compression they can without hurting performance more than the speedup from disc load times? Because your argument here is basically that game developers are idiots.

    You are as dense as you want your LIDAR points to be.

    Using zlib at level 9 to further compress an already compressed game is a worst-case scenario example that I don't suggest anyone do. The point is, it manages to shave off more than just a couple percent, that shows how much potential there is for more compression. NOT compression using zlib, but methods optimized for specific assets. Y'know, like WebP vs BMP, or FLAC vs WAV. Go with lossy methods and you can save even more.
    Wait, you really think developers aren't using efficient formats for assets before they run the final install trough compression? It seems you know even less than I thought you did. Apologies for that. I need to explain even more to you know because developers absolutely scale down assets with things like texture and vertex compressors well before the heavy debugging, optimization and compression stage that typically starts a few months out from the gold master build going out.

    Seems to me you're a strong case of Dunning-Kruger in this regard.
    A pretty ironic thing for you to say when I realized you were a Dunning-Kruger case even as I thought you knew more than you actually know.
    "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

    Comment


    • #52
      Originally posted by L_A_G View Post
      Uuh... None of those are online/real-time usages. You tried to say that something which has been tried on servers, and found to not be practical, should be run on servers and now you're bringing up completely separate server use cases. It's fairly obvious you're just dragging this out trying to be the "last man standing" so you can get the last word in and then pretend like you've "won" the debate.
      Many of those things I listed are "online", "real-time", "iterative", "low-latency" workflows, unless you somehow have your own definition of them.
      I don't see how you're any different in regards to dragging this on. You're the one who feels like you have to correct every single thing I say, I'm just here for the ride because it's getting a little amusing at this point.
      Your "agreement" is more like this:
      No it's more like this:
      Me: b is the second letter of the alphabet.
      You: No, it's B. And sometimes its Б.
      Me: Hardly any difference, and we're not talking about Cyrillic. Otherwise, what about β?
      You: Dunning-Kruger.
      When something is basically ubiquitous, then it's basically ubiquitous. Your counter-example from 20 years ago and before it became the norm doesn't prove anything. It's just plain outdated. Same thing with the misconception you had about how install sizes for games' PC versions used to be bigger. When you're wrong, you're wrong. Real life doesn't have any participation trophies.
      Being outdated is irrelevant when the point still stands:
      1. Contrary to what you believe, there is in fact more room to further compress games.
      2. People have the CPU cycles and RAM to handle more compression.
      3. Compressing data further allows for faster load times.
      I'm not wrong just because you insist to ignore things that undermine your objective. I never said PC versions used to be bigger. This is precisely why this argument is one-sided - you're arguing with things I never said, and I'm not disagreeing with hardly anything you said. I worry for your mental stability.
      It doesn't make me an expert, but it does mean I can bring up real-life examples to disprove your arguments. Specially when you were claiming that I couldn't bring up a single real world use case. You gambled that I couldn't and you lost. Just accept it. I also had an old "big" dataset just at hand and a test chunk of a bigger dataset in the order of magnitude of over a terabyte.
      How did I lose the gamble? The example you brought up proved me right: the software is either capable of lowering the data resolution for faster load times (and therefore can work on inferior drives), or, it's unnecessarily loading large chunks of data at a time.
      If it's so abundant, then why is that 25GB database table still not something that you can easily cache? Because it further suggests the whole database would fit into the RAM of my previous decade old workstation with OS and basic applications to spare.
      I already explained this to you several times...
      The server needs memory to handle things (which includes non-database workloads) beyond just this 1 table.
      Again, I'm not capable of reading your mind over the internet. You brought up the size of this one table the fact that you didn't say it was just a small fraction of the whole database or representative of the size tables are so I got the impression this was not representative of the whole database. So I chanced on it and if I was wrong you could've easily corrected me, stating that "Actually, the whole database is 500GB. That was just one table for size."
      A couple things:
      1. I know you assume I'm a clueless and living in the early 2000s, but honestly what would be the point of me bringing that up when there are last-gen games with bigger datasets than this table? It should have been obvious that this is not the only thing this server has to work with.
      2. The thing you seem completely incapable of retaining/understanding is that the size of the table is and wasn't relevant to my point. The whole database on this one particular server (of many servers) is around 700GB but the total size makes no difference; 25GB is a big enough dataset that most (but certainly not all) businesses would not be loading the whole thing into RAM. This is especially true for a dataset that increases by thousands of records every day.
      That's pretty rich coming from you when I had to explain out to you, three times, that PC versions in the recent past less compressed than the console versions of games, they just contained higher quality assets consoles weren't able to use.
      You felt the need to continue explaining because you continued to think that I made such a blanket statement when I didn't. I only said that it happens, and contrary to your belief: it does happen.
      You've gone on an on about this 25 GB table and now you're talking about hundreds or even thousands of smaller tables? Is that big table just the result of basically the whole table being cobbled together with join statements?
      Uh... you're the one who keeps bringing it up. And who the hell makes a permanent table consisting of joined other tables? If you're going to find ways to make me look dumb, at least use a realistic scenario.
      Says someone who's trying to claim they know something they've never worked on better than someone who literally does it for a living...
      Doing something for a living doesn't mean you're good at it.
      I wrote "sans peripherals" as something that's an additional cost on top of that. The fact that you think businesses throw out equipment that expensive and new goes to show that you really don't know what you're talking about. Those machines, including the discs, usually have an amortization period of at least 4 years.
      Yeah... and you need those peripherals in order to do your job. That can't be ignored. Otherwise, it's just a server, which according to you, is totally incapable of doing the job.
      12 years ago, I worked as a technician for a fortune 500 company. I was once contracted to dismantle a single rack worth $3 million and was only a couple years old. Other customers would swap out terabytes upon terabytes of fully-functioning hard drives, simply because they reached an arbitrary age. Other customers would literally just give me thousands of dollars worth of fiber optic cables, simply because they didn't need them. Yep, clearly I don't know how often companies replace their equipment. Clearly, I don't have real experience.
      So the same companies who throw out 3000€+ in one or two years are now too tightfisted to spend another 500€ to improve and extend the lifespan of those machines? Make up your mind before you post, not after. Similarly, you think they're ready to spend over 10.000€ on generating a high quality dataset, but too cheap to actually use it properly.
      A tightfisted company wouldn't replace within 2 years, nor would they spend 500€ extra. But, you're the one saying it isn't "big money", though it could be if it were replaced that frequently.
      Shows how little you understand about most industries if you think most companies actually make use of the full quality of their dataset. In the movie and music industry, they often record in a far higher resolution and dynamic range (and often minimal/lossless compression) than they'll ever distribute. That doesn't mean they aren't using it, but they don't need the full level of detail all the time.
      You tried to use a straw man me with the point of those examples, I point this out and explain the actual point and now you claim the actual argument doesn't change the straw man? You're really sticking your fingers in your ears aren't you?
      Oh the irony.
      Also... you need ears to read and write?
      When you make arguments like suggesting we tell off customers for wanting to fully utilize something they've spent a lot of money for it's inevitable you don't look very smart.
      I didn't say that but ok.
      It's pretty ironic how you complain about me not reading your posts correctly and then go do that exact thing. Projecting much? Because I wrote about reading that 25 GB "in chunks", not all of it in a single 25 GB chunk. Do you not understand the concept of plural forms either? If you absolutely must know, we read in the higher density parts in chunks ranging from a few hundred MB to a few GB depending on the size and density of the dataset.
      So.... you're only reading a few GB at most, and you tell me you need a high-end SSD. Right. Thanks for proving my point.
      The thing is... if you read 25GB chunks at a time, that would actually make sense to need a high-end SSD. Again: poor software design if that were true, but at least the need for such disk performance would have totally made sense.
      Funny how I still include more of the post than you and up until the last post included your entire post in my quotes.
      That's because the parts I omit contain no value. I don't need to reply to your precious feelings about me. The parts you omit from me are the basis of my whole point, which is why you continue to feel the need to argue with me.
      You provided examples, I explained why they're not valid. Some of them multiple times after you brought them up again after I'd already explained why they're not valid. To counter this I brought up examples of my own from my own line of work as a software engineer with the kind of processing requirements you're claiming are incredibly niche. You haven't even tried to explain why my examples are invalid. You've just dismissed them as "anecdotes" to avoid having to come up with something substantial. This is the first time you've even asked for sources and you've not provided any yourself either.
      No, you took the parts of my examples you wanted to read, ignored the rest, and explained why your cropped perspective wasn't valid.
      The only thing you've said (pertaining to the subject) that I find invalid is what you deem necessary for high-performance SSDs. Reading a few GB of a roughly 25GB chunk doesn't seem to be something that a decent SATA drive would struggle with, so long as you have a decent buffer.
      This isn't the comments section to some consumer electronics blog. Most of the people on here are software developers, hobbyists and students in the field. We even have major kernel developers posting here on occasion. Over here you regularly run into people that can be described as subject matter experts.
      No... this comments section is comprised mostly of Stallmanite neckbeards. The hobbyists are the loud minority. The students are basically neckbeards in the making. The actual software devs tend to be the only sane, patient, and polite people here.
      You don't think they're going to use whatever level of compression they can without hurting performance more than the speedup from disc load times? Because your argument here is basically that game developers are idiots.
      The game devs use compression methods that the consoles can readily handle. The consoles are somewhat limited in CPU, RAM, and especially cooling capacity. This is particularly true of last-gen. Current-gen consoles distribute the work by using low-cost NVMe drives. Devs aren't incentivized to re-work assets just for PC. So, they're not idiots, but they're not doing the best they could. Considering their tight timelines, I don't blame them.
      Wait, you really think developers aren't using efficient formats for assets before they run the final install trough compression? It seems you know even less than I thought you did. Apologies for that. I need to explain even more to you know because developers absolutely scale down assets with things like texture and vertex compressors well before the heavy debugging, optimization and compression stage that typically starts a few months out from the gold master build going out.
      *sigh* I'm talking about the final released product. It's really surprising how dumb you choose to be.

      Comment


      • #53
        Seriously, do you really not have a life or something? You were typing away at your last reply on a Friday evening when I was out relaxing...

        Originally posted by schmidtbag View Post
        Many of those things I listed are "online", "real-time", "iterative", "low-latency" workflows, unless you somehow have your own definition of them.
        I don't see how you're any different in regards to dragging this on. You're the one who feels like you have to correct every single thing I say, I'm just here for the ride because it's getting a little amusing at this point.
        If the fact that they've tried what you've suggested and stopped because users hated it doesn't even give you pause, then I don't think anyone could say anything to convince you otherwise. You just seem to be a habitual contrarian whose idea of a good time on a Friday evening is arguing with someone on the internet.

        No it's more like this:
        Maybe in your world where everyone can read your mind and know when you're not saying something that's wrong, you're just joking when there's nothing in the context or the way it's written that suggests this...

        Being outdated is irrelevant when the point still stands:
        When an example is outdated it is, by definition, irrelevant in a present day context. No ifs or buts. When an example is wrong because it misses something crucial, like your argument about how previous generation console games being better compressed than their PC versions, then it's just plain wrong. Specially when even the misconception doesn't even apply anymore.

        I'm not wrong just because you insist to ignore things that undermine your objective. I never said PC versions used to be bigger. This is precisely why this argument is one-sided - you're arguing with things I never said, and I'm not disagreeing with hardly anything you said. I worry for your mental stability.
        You explicitly talked about a couple of PS4 versions of games from a few years ago, 2014's Middle Earth: Shadow of Mordor, having smaller install sizes on consoles and used that as proof of PC games having space for additional compression. The fact that you seem to have forgotten this suggests you're either going senile or you're trying to dishonestly act like you never said that.

        How did I lose the gamble? The example you brought up proved me right: the software is either capable of lowering the data resolution for faster load times (and therefore can work on inferior drives), or, it's unnecessarily loading large chunks of data at a time.
        If a paying customer wants and also pays for something, then it's, by definition, necessary. The personal opinions of some internet know-it-all who thinks he knows better than people who use and make this kind of software for a living are irrelevant. The fact that you seem to think that your personal opinions on something you've never used or worked on matter further suggest your parents never let you be wrong and as a result raised an insufferable know-it-all.

        I already explained this to you several times... The server needs memory to handle things (which includes non-database workloads) beyond just this 1 table.
        We weren't talking about server usage here, we were talking about workstation use cases.

        1. I know you assume I'm a clueless and living in the early 2000s, but honestly what would be the point of me bringing that up when there are last-gen games with bigger datasets than this table? It should have been obvious that this is not the only thing this server has to work with.
        You explicitly brought up that 20-year-old Unreal Tournament game twice and its the only example you've brought up so far of a game that genuinely wasn't properly compressed. The console games you brought up on the other hand were a misconception based on the fact that the PC version contained additional higher quality assets.

        2. The thing you seem completely incapable of retaining/understanding is that the size of the table is and wasn't relevant to my point. The whole database on this one particular server (of many servers) is around 700GB but the total size makes no difference; 25GB is a big enough dataset that most (but certainly not all) businesses would not be loading the whole thing into RAM. This is especially true for a dataset that increases by thousands of records every day.
        If the size of the table wasn't relevant, then why did you need to make it explicit? If that 700GB really is the size of a whole dataset (and not something you've made up to avoid being embarrassed by the actual size of the database) then that's all the more reason for having a fast PCIe SSD. Any significant processing of that kind of dataset with optimized software is going to vastly benefit from a fast SSD.

        You felt the need to continue explaining because you continued to think that I made such a blanket statement when I didn't. I only said that it happens, and contrary to your belief: it does happen.
        When you repeat a false statement someone who has told you is wrong they're going to tell you you're wrong again. Repeating a falsehood doesn't make it any less of falsehood.

        Uh... you're the one who keeps bringing it up. And who the hell makes a permanent table consisting of joined other tables? If you're going to find ways to make me look dumb, at least use a realistic scenario.
        I never suggested it's a permanent table, just one cobbled together to make an argument.

        Doing something for a living doesn't mean you're good at it.
        In government maybe, but in the private sector you don't do something you're incompetent at for very long, if at all.

        Yeah... and you need those peripherals in order to do your job. That can't be ignored. Otherwise, it's just a server, which according to you, is totally incapable of doing the job.
        12 years ago, I worked as a technician for a fortune 500 company. I was once contracted to dismantle a single rack worth $3 million and was only a couple years old. Other customers would swap out terabytes upon terabytes of fully-functioning hard drives, simply because they reached an arbitrary age. Other customers would literally just give me thousands of dollars worth of fiber optic cables, simply because they didn't need them. Yep, clearly I don't know how often companies replace their equipment. Clearly, I don't have real experience.
        I mentioned peripherals because my argument was built around an "and up" type price. You need to read what you're replying to before you write or at the very least press the "post" button. Also, fortune 500 companies do not represent businesses as a whole. Most companies are small businesses who have a far better level of control and oversight over their expenditure.

        A tightfisted company wouldn't replace within 2 years, nor would they spend 500€ extra. But, you're the one saying it isn't "big money", though it could be if it were replaced that frequently.
        A company too tightfisted to spend less than 500€ on a significant productivity improvement aren't going to spend the thousands of euros on that workstation, the same amount on the software, or for that matter, a server to do the job. No, they'll either rent the machine or, more probably, outsource the job to an outside company who will have that fast drive to make their expensive employees working on an expensive machine be as productive as possible.

        Oh the irony. Also... you need ears to read and write?
        Says the guy who thinks media companies don't use the high quality "raw" footage for editing and then downsample it to the media its distributed on. Reality however is that they do and when new higher quality media comes out, they use the higher quality un-downsampled originals to produce the copies for that new media. A good example of this the the re-booted Battlestar Galactica. It was filmed and edited in HD, but originally broadcast and sold on DVD in SDTV resolution. Then when blu-ray became popular they too the original HD edits and put them on the format.

        I'm not even going to bother going deeper into your seeming inability to understand a common figurative expression...

        I didn't say that but ok.
        This is you from a few posts ago:

        As for looking through a 4K monitor, seems to me you're just spoiled. "Oh no! I can notice some gaps in the pointcloud! How will I ever do my work!?"
        Cry me a river.​
        Telling a paying customer to "cry me a river" isn't something you want to be doing unless you want to lose the business...

        So.... you're only reading a few GB at most, and you tell me you need a high-end SSD. Right. Thanks for proving my point. The thing is... if you read 25GB chunks at a time, that would actually make sense to need a high-end SSD. Again: poor software design if that were true, but at least the need for such disk performance would have totally made sense.
        Again with the tactical dyslexia? The fact that data is stored in chunks of a few gigabytes doesn't mean that you're going to only to have one or two in memory at any given point, but rather half a dozen or more on a higher density dataset. The higher the density, the less physical area a single fixed size chunk of it will contain. Our software allows users to set what amount of RAM the application can use for that buffer and our customers saturate their 16, 32 or 64GB buffers.

        It seems like you're so incapable of admitting you're wrong you're trying to construct a Catch 22 type argument. If the application reads too much data at once it's un-optimized and if it reads less it doesn't need to read enough that a fast SSD makes sense. Any customer that wants to use that much data is "spoiled" and needs to "cry me a river".

        That's because the parts I omit contain no value. I don't need to reply to your precious feelings about me. The parts you omit from me are the basis of my whole point, which is why you continue to feel the need to argue with me. No, you took the parts of my examples you wanted to read, ignored the rest, and explained why your cropped perspective wasn't valid.
        Right... Up until that one post I hadn't omitted a single word of yours and now suddenly I'm supposedly ignoring "the basis of your point" when you know it's just me and you who are even reading these posts.

        The only thing you've said (pertaining to the subject) that I find invalid is what you deem necessary for high-performance SSDs. Reading a few GB of a roughly 25GB chunk doesn't seem to be something that a decent SATA drive would struggle with, so long as you have a decent buffer.
        Seriously? You complain about me omitting core parts of your post while at the same time omitting my explicit pointing out that I was talking about chunks in plural where you in read big datasets in parts as you process and visualize it with sustained read speeds above what a SATA SSD can handle.

        No... this comments section is comprised mostly of Stallmanite neckbeards. The hobbyists are the loud minority. The students are basically neckbeards in the making. The actual software devs tend to be the only sane, patient, and polite people here.
        The thing about "Stallmanite neckbeards" is that they usually are software developers, IT professionals or hobbyists. Devs who explicitly state who they are in real life also post here as representatives of their employers so they're not going to argue against you if you really insist that "1 + 1 = 3" the way you're doing.

        The game devs use compression methods that the consoles can readily handle. The consoles are somewhat limited in CPU, RAM, and especially cooling capacity. This is particularly true of last-gen. Current-gen consoles distribute the work by using low-cost NVMe drives. Devs aren't incentivized to re-work assets just for PC. So, they're not idiots, but they're not doing the best they could. Considering their tight timelines, I don't blame them.
        Consoles have since the previous gen had a decent amount of extra CPU time available for compression with their "total throughput over per-thread throughput" CPU designs and the 8 core Zen2 current gen consoles even more so. As I told you before, the PS5 has hardware decompression and still gets the same results on the same assets as the Xbox series consoles with software zlib. Those fast SSDs aren't used for less compression, they're used for large and dense worlds with high quality assets and no loading screens.

        It may be a little late now, but if you want to see what those fast SSD's on consoles are for; Check out Epic's The Matrix Awakens demo and how large, dense and seamless its rendition of late 90s New York is when you get to start waking and flying around it.

        *sigh* I'm talking about the final released product. It's really surprising how dumb you choose to be.
        You talked about them not compressing their assets by using things like the compressors that are part of applications like Autodesk Maya they literally use to make all of those assets. If you think game installs are big, that's nothing compared to how big the archival "raw" datasets are.

        But I mean seriously... It seems like there's nothing anyone could ever say to convince you you're wrong. Probably because of that personality, you've got so little going on in your life that you think continuing an argument you've long since lost is a perfectly normal way to spend a Friday evening. Hence it seems time to just walk away and conclude that you're just trying to get the last word in thinking that will actually prove you right and make your Don Quixote-esque quest to prove the world you're right about everything worth it.
        "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

        Comment


        • #54
          Originally posted by L_A_G View Post
          Seriously, do you really not have a life or something? You were typing away at your last reply on a Friday evening when I was out relaxing...
          Y'know there's things called time zones, right? I reply to you when I have downtime at work, like right now. Makes me wonder how much of a life you have considering you're the one particularly frustrated at someone you acknowledge isn't going to give up. I'm not really putting up much of a fight; you're the one who has an axe to grind, a point to prove, a principle that only you care about. I already made my point, which you're free to disagree with. I agree with most of your points, so again, I'm just here for the ride at this point.
          If the fact that they've tried what you've suggested and stopped because users hated it doesn't even give you pause, then I don't think anyone could say anything to convince you otherwise. You just seem to be a habitual contrarian whose idea of a good time on a Friday evening is arguing with someone on the internet.
          Tried what, running heavy workloads on a server? The stuff I mentioned before is commonly done on servers without being much of an issue. But of course, you know all that there is to know, so clearly the 1 or 2 examples you encountered account for all user experiences.
          How am I the habitual contrarian when I hardly disagree with anything you've said?
          Maybe in your world where everyone can read your mind and know when you're not saying something that's wrong, you're just joking when there's nothing in the context or the way it's written that suggests this...
          In my world, people give each other the benefit of the doubt and can extrapolate what they really meant without having to use so many words, or, without assuming they're totally inept.
          When an example is outdated it is, by definition, irrelevant in a present day context. No ifs or buts. When an example is wrong because it misses something crucial, like your argument about how previous generation console games being better compressed than their PC versions, then it's just plain wrong. Specially when even the misconception doesn't even apply anymore.
          Not really, because the principles of the example hold up, and that's the point of having an example.
          Examples also hold up when people actually interpret what was said. I never said previous generation console games were better compressed, period. I just said sometimes (not necessarily often, and certainly not always) the PC version isn't well compressed. The fact it happens at all is a problem.
          You explicitly talked about a couple of PS4 versions of games from a few years ago, 2014's Middle Earth: Shadow of Mordor, having smaller install sizes on consoles and used that as proof of PC games having space for additional compression. The fact that you seem to have forgotten this suggests you're either going senile or you're trying to dishonestly act like you never said that.
          Yes, those are proof of instances where the PC version was notably larger. In some cases, it could be from higher-res textures, in others, it could be from different levels of compression. I wasn't about to dig into the differences, but at the time, you were arguing there were no differences.
          If a paying customer wants and also pays for something, then it's, by definition, necessary. The personal opinions of some internet know-it-all who thinks he knows better than people who use and make this kind of software for a living are irrelevant. The fact that you seem to think that your personal opinions on something you've never used or worked on matter further suggest your parents never let you be wrong and as a result raised an insufferable know-it-all.
          I'd be less of a know-it-all to you if you actually understood what I said, because I didn't say that the complete dataset isn't necessary. In fact, I actually justified why it absolutely is necessary.
          We weren't talking about server usage here, we were talking about workstation use cases.
          The context of that particular table is from a server. In any case, it really doesn't change the underlying point, so stop moving goalposts.
          You explicitly brought up that 20-year-old Unreal Tournament game twice and its the only example you've brought up so far of a game that genuinely wasn't properly compressed. The console games you brought up on the other hand were a misconception based on the fact that the PC version contained additional higher quality assets.
          I didn't use that game as an example of what modern games lack. I used that as an example of how big of a performance difference you get when you highly compress a game. I haven't tried the experiment again, but I have done on-the-fly LZO compression on much newer games, which sometimes shaved off a few GB. However, you know this to be a poor solution. The fact it makes a difference at all suggests there is room for more compression. That's all that matters.
          If the size of the table wasn't relevant, then why did you need to make it explicit? If that 700GB really is the size of a whole dataset (and not something you've made up to avoid being embarrassed by the actual size of the database) then that's all the more reason for having a fast PCIe SSD. Any significant processing of that kind of dataset with optimized software is going to vastly benefit from a fast SSD.
          Kind of a stupid question - if I didn't mention it, you'd have asked me anyway. The 700GB is largely irrelevant, which is why I didn't bring it up. The whole point is that this particular table is a large-enough dataset that you wouldn't normally load entirely into RAM (in addition to the implied other processes of the server), is accessed frequently, and yet, doesn't demand a high-performance SSD.
          When you repeat a false statement someone who has told you is wrong they're going to tell you you're wrong again. Repeating a falsehood doesn't make it any less of falsehood.
          Solid advice you should take.
          I never suggested it's a permanent table, just one cobbled together to make an argument.
          In case you're not fluent in SQL, temporary tables are an actual thing, and only populate in RAM where capable. Therefore, if a table is on a disk, it's not temporary, as in, permanent. Doesn't have to be literally permanent. In any case, it's a stupid concept to make a table stored on disk from a join statement.
          In government maybe, but in the private sector you don't do something you're incompetent at for very long, if at all.
          That's not even a little bit true and you know it. Too often, people BS their capabilities and get by with the bare minimum.
          I mentioned peripherals because my argument was built around an "and up" type price. You need to read what you're replying to before you write or at the very least press the "post" button. Also, fortune 500 companies do not represent businesses as a whole. Most companies are small businesses who have a far better level of control and oversight over their expenditure.
          So let me get this straight:
          I assumed your workstation was "big money", a relative term. You mentioned just the price of the workstation alone as a means to undermine that it isn't all that big, not implying an "and up" type price. But, those peripherals are in fact part of the deal, since it's otherwise not a usable system.
          Small businesses are the ones that are going to be picky about spending 500€ on a drive (or in another perspective, 17% of the whole cost of the PC alone) just so you can see more dots in 4K while navigating the scene. Big companies won't care. Medium sized companies, which I assume yours must be, depend on how they're managed.
          A company too tightfisted to spend less than 500€ on a significant productivity improvement aren't going to spend the thousands of euros on that workstation, the same amount on the software, or for that matter, a server to do the job. No, they'll either rent the machine or, more probably, outsource the job to an outside company who will have that fast drive to make their expensive employees working on an expensive machine be as productive as possible.
          That's the thing though - YOU (not necessarily the customer) seeing those extra dots doesn't seem to be a significant improvement to YOUR productivity. Unless it's you and maybe a couple others with good workstations, that 500€ adds up real fast.
          Says the guy who thinks media companies don't use the high quality "raw" footage for editing and then downsample it to the media its distributed on. Reality however is that they do and when new higher quality media comes out, they use the higher quality un-downsampled originals to produce the copies for that new media. A good example of this the the re-booted Battlestar Galactica. It was filmed and edited in HD, but originally broadcast and sold on DVD in SDTV resolution. Then when blu-ray became popular they too the original HD edits and put them on the format.
          Huh? I didn't say that. I said the media companies do use the full quality un-downsampled originals, when editing. But they certainly do not use that for the media they distribute.
          So once again, we're in agreement, but you're so hellbent on disagreeing with everything I say that you invent stupid crap I never said.
          Telling a paying customer to "cry me a river" isn't something you want to be doing unless you want to lose the business...
          Stop being deliberately obtuse. I'm telling YOU to cry me a river, not your customers. YOU don't need to be working with the full dataset all the time, you just need it when it makes sense to see in higher resolution. It makes perfect sense why the customers want to have the ability to see it all. Seeing as you aren't your customer, you aren't to be concerned with what their PCs can handle. If you are the one presenting the data to the customer from your own PC, while that would warrant a high-performance SSD so they're kept satisfied, that is a rather niche situation. Niche enough that it's nothing more than an anecdote and not worth arguing about.
          Again with the tactical dyslexia? The fact that data is stored in chunks of a few gigabytes doesn't mean that you're going to only to have one or two in memory at any given point, but rather half a dozen or more on a higher density dataset. The higher the density, the less physical area a single fixed size chunk of it will contain. Our software allows users to set what amount of RAM the application can use for that buffer and our customers saturate their 16, 32 or 64GB buffers.
          I understand and have already understood that since my last reply.
          It seems like you're so incapable of admitting you're wrong you're trying to construct a Catch 22 type argument. If the application reads too much data at once it's un-optimized and if it reads less it doesn't need to read enough that a fast SSD makes sense. Any customer that wants to use that much data is "spoiled" and needs to "cry me a river".
          It's really not a hard concept to grasp. We're talking about what is NEEDED, not what is nice-to-have. If your application NEEDS to load 32GB at a time on a workstation, it is poorly optimized. Since it has the option to load 64GB where bandwidth is available, then great.
          While there is obviously a certain point where the data density is low enough to not be productive to work with, there is also a point where more data doesn't contribute anything of significance, especially after a certain distance away. Assuming you stop moving the camera and the software can load in the higher density detail automatically, this is why you just sound like a whiny spoiled brat - you want to see more detail all the time, when it most likely isn't necessary. You remind me of the people who will spend hundreds extra on a CPU that achieves another 15FPS in a game that is already 100FPS higher than what your display can render, because you can't settle for anything that isn't the best.
          Right... Up until that one post I hadn't omitted a single word of yours and now suddenly I'm supposedly ignoring "the basis of your point" when you know it's just me and you who are even reading these posts.
          No, you've regularly been omitting words, because it's a lot easier to make someone look stupid when you twist what they said to fit your narrative.
          Seriously? You complain about me omitting core parts of your post while at the same time omitting my explicit pointing out that I was talking about chunks in plural where you in read big datasets in parts as you process and visualize it with sustained read speeds above what a SATA SSD can handle.
          I ignored nothing, you were vague.
          The thing about "Stallmanite neckbeards" is that they usually are software developers, IT professionals or hobbyists. Devs who explicitly state who they are in real life also post here as representatives of their employers so they're not going to argue against you if you really insist that "1 + 1 = 3" the way you're doing.
          I've seen devs here dabble in some back and forth arguments. Obviously they're a lot more polite about it.
          For what it's worth, I'm insisting 1+1=2, you're insisting 1.0+1.0=2.0
          Consoles have since the previous gen had a decent amount of extra CPU time available for compression with their "total throughput over per-thread throughput" CPU designs and the 8 core Zen2 current gen consoles even more so. As I told you before, the PS5 has hardware decompression and still gets the same results on the same assets as the Xbox series consoles with software zlib. Those fast SSDs aren't used for less compression, they're used for large and dense worlds with high quality assets and no loading screens.
          Right, that's for one layer of compression. Again: it makes sense to use asset-specific optimized compression, which apparently MS is doing with BCPack. Meanwhile, just as I suspected, zlib isn't the only thing devs use:

          You talked about them not compressing their assets by using things like the compressors that are part of applications like Autodesk Maya they literally use to make all of those assets. If you think game installs are big, that's nothing compared to how big the archival "raw" datasets are.
          I said they're not compressing enough. There is room for more. Some assets can get by with lossy compression for a major reduction in resource consumption but a negligible fidelity loss.
          But I mean seriously... It seems like there's nothing anyone could ever say to convince you you're wrong. Probably because of that personality, you've got so little going on in your life that you think continuing an argument you've long since lost is a perfectly normal way to spend a Friday evening. Hence it seems time to just walk away and conclude that you're just trying to get the last word in thinking that will actually prove you right and make your Don Quixote-esque quest to prove the world you're right about everything worth it.
          How would you respond if I said the same to you, besides touting your self-proclaimed expertise?
          Last edited by schmidtbag; 20 March 2023, 01:20 PM.

          Comment


          • #55
            Originally posted by schmidtbag View Post
            Y'know there's things called time zones, right? I reply to you when I have downtime at work, like right now. Makes me wonder how much of a life you have considering you're the one particularly frustrated at someone you acknowledge isn't going to give up. I'm not really putting up much of a fight; you're the one who has an axe to grind, a point to prove, a principle that only you care about. I already made my point, which you're free to disagree with. I agree with most of your points, so again, I'm just here for the ride at this point.
            Right... You admit that you're going to continue this argument indefinitely and I'm the one with an axe to grind? Projecting much?

            Because when they're proven wrong, any normal person will admit that and move on. Only a habitual contrarian will keep arguing after they've lost the argument.

            Tried what, running heavy workloads on a server? The stuff I mentioned before is commonly done on servers without being much of an issue. But of course, you know all that there is to know, so clearly the 1 or 2 examples you encountered account for all user experiences.
            So now you're adding tactical dementia to the mix? We were talking about real-time workstation applications here you senile git.

            How am I the habitual contrarian when I hardly disagree with anything you've said?
            The fact that you're openly admitting you plan on continuing this argument indefinitely is probably the clearest sign. Seems like growing up you "won" arguments in your own head by just moaning on and on until people got sick of talking to you and you never grew out of it.

            In my world, people give each other the benefit of the doubt and can extrapolate what they really meant without having to use so many words, or, without assuming they're totally inept.
            Like I've already told you more than once; The inside of your head has only one resident. You. Any (honest) person will argue with what you actually say/write and only that.

            Not really, because the principles of the example hold up, and that's the point of having an example. Examples also hold up when people actually interpret what was said. I never said previous generation console games were better compressed, period. I just said sometimes (not necessarily often, and certainly not always) the PC version isn't well compressed. The fact it happens at all is a problem.
            Do you not understand how an example works? It's meant to be a representation of something and when it's not representative, then it's an invalid example. An example, by definition, needs to be representative of what it's meant to represent. If it's outdated or was always a misconception, it's it's simply no good and it doesn't prove anything.

            Yes, those are proof of instances where the PC version was notably larger. In some cases, it could be from higher-res textures, in others, it could be from different levels of compression. I wasn't about to dig into the differences, but at the time, you were arguing there were no differences.
            Again; You've yet to demonstrate a SINGLE valid example of a game whose PC version is significantly larger than its console release.

            I'd be less of a know-it-all to you if you actually understood what I said, because I didn't say that the complete dataset isn't necessary. In fact, I actually justified why it absolutely is necessary.
            When "understanding" your argument means that you need to add in parts that were never there, i.e a form of straw man fallacy, nobody (honest) except you is ever going to "understand" you. Make your argument and make it in it's entirety. Nobody (honest) is going to start adding on points and arguments you never made.

            The context of that particular table is from a server. In any case, it really doesn't change the underlying point, so stop moving goalposts.
            Its not moving the goalposts when it's something you yourself brought up as an example...

            I didn't use that game as an example of what modern games lack. I used that as an example of how big of a performance difference you get when you highly compress a game. I haven't tried the experiment again, but I have done on-the-fly LZO compression on much newer games, which sometimes shaved off a few GB. However, you know this to be a poor solution. The fact it makes a difference at all suggests there is room for more compression. That's all that matters.
            Yet you somehow felt the need to bring up this 20-year-old game as if the performance bump provided by decent compression was somehow a revelation. Being kind and assuming you're not just making things up; When 40-50 GB install sizes have been the norm for the almost a decade at this point a couple of GB from additional compression suggests there isn't that much extra room for compression and that you intentionally chose a non-representative example.

            Kind of a stupid question - if I didn't mention it, you'd have asked me anyway. The 700GB is largely irrelevant, which is why I didn't bring it up. The whole point is that this particular table is a large-enough dataset that you wouldn't normally load entirely into RAM (in addition to the implied other processes of the server), is accessed frequently, and yet, doesn't demand a high-performance SSD.
            Yet you somehow felt the need to only mention this one table, further suggesting that the 700 GB figure is either entirely made up or the sum total of data on a data storage server where most of it is user files.

            Solid advice I should take.
            Fixed that for you. The sad part is that you're never going to take it because your parents failed to raise someone with the self-reflection to be able to realize when they're wrong.

            In case you're not fluent in SQL, temporary tables are an actual thing, and only populate in RAM where capable. Therefore, if a table is on a disk, it's not temporary, as in, permanent. Doesn't have to be literally permanent. In any case, it's a stupid concept to make a table stored on disk from a join statement.
            If you're just cobbling everything together just to make a point then it's only natural you don't even bother writing it to disc.

            That's not even a little bit true and you know it. Too often, people BS their capabilities and get by with the bare minimum.
            Like I said; There's a limit to how long you can keep up a lie about your capabilities when when you're employed specifically for those capabilities. You can get your foot in the door, even get hired if they're incompetent and don't test you or fail to notice plagiarism, but eventually they will find out when you fail to perform.

            So let me get this straight: I assumed your workstation was "big money", a relative term. You mentioned just the price of the workstation alone as a means to undermine that it isn't all that big, not implying an "and up" type price. But, those peripherals are in fact part of the deal, since it's otherwise not a usable system.
            You insisted that it's a significant expenditure and I countered by pointing out that compared to the salary of the people who use them and the other tools they're also going to be given it's not a significant expenditure. You getting stuck on me not including the cost of the peripherals not being included in that (because they may not be replaced at the same time as the machine) suggests it either (honestly) went over your head or you're (dishonestly) trying to divert away from the actual point; They're not a major investment.

            Small businesses are the ones that are going to be picky about spending 500€ on a drive (or in another perspective, 17% of the whole cost of the PC alone) just so you can see more dots in 4K while navigating the scene. Big companies won't care. Medium sized companies, which I assume yours must be, depend on how they're managed.
            Small businesses are often going to be just the person using that drive and if that's going to help them make more money for less than half a week's salary, then its a no-brainer. That's also assuming they build their machine themselves when all workstations (and gaming PCs) in that price class already come with one as standard.

            That's the thing though - YOU (not necessarily the customer) seeing those extra dots doesn't seem to be a significant improvement to YOUR productivity. Unless it's you and maybe a couple others with good workstations, that 500€ adds up real fast.
            Like I've told you multiple times already; When you spend 10.000€ or more producing a single dataset you damn well want to use it to it's fullest extent. That's another example of a cost so big the additional cost of a fast SSD is negligible by comparison.

            Huh? I didn't say that. I said the media companies do use the full quality un-downsampled originals, when editing. But they certainly do not use that for the media they distribute.
            When you preface it with this statement;
            You either meant just that or then you said something and then contradicted yourself in the very next sentence. I assumed the less dumb one, i.e the former.

            So once again, we're in agreement, but you're so hellbent on disagreeing with everything I say that you invent stupid crap I never said.
            Says the person openly stating that they plan on continuing this argument indefinitely... Maybe I should write a script that takes your reply every time you do, has ChatGPT generate a response and post it here. That way you can keep arguing with the AI until the day you die. Hell, you may even enjoy cutting out the middleman and just go argue with an AI yourself.

            Stop being deliberately obtuse. I'm telling YOU to cry me a river, not your customers. YOU don't need to be working with the full dataset all the time, you just need it when it makes sense to see in higher resolution. It makes perfect sense why the customers want to have the ability to see it all. Seeing as you aren't your customer, you aren't to be concerned with what their PCs can handle. If you are the one presenting the data to the customer from your own PC, while that would warrant a high-performance SSD so they're kept satisfied, that is a rather niche situation. Niche enough that it's nothing more than an anecdote and not worth arguing about.
            I'd already told you multiple times by that point that I'm a software developer and what do most of us do? Make software for other people. I assumed this was obvious and that I was talking about the software we sell to our customers and what they use it for. Our customers buy our software because they want to use it for things and those things are uses where a fast SSD is a real benefit that more than pays for itself. I was also obviously talking about things we and our customers use this software for. A big part of software development is to test your software and to sufficiently do so requires us to do what our customers will or already do with our software.

            Our internal testing is not niche; It's what our customers will and do use our software for and we have to do it.

            I understand and have already understood that since my last reply.
            Yet you still tried to make the argument that we and our customers aren't going to be taxing our drives with those smaller chunks...

            It's really not a hard concept to grasp. We're talking about what is NEEDED, not what is nice-to-have. If your application NEEDS to load 32GB at a time on a workstation, it is poorly optimized. Since it has the option to load 64GB where bandwidth is available, then great.
            If our customer pays a lot of money to have those high-res scans done, they're going to want to use them to their fullest. If they want something and will also pay for it, then for us then its a need. It's as simple as that. We don't tell them to "cry me a river" or call them a "spoiled brat" when our software can't handle what they want to use it for. No, what we do is improve our software so it can handle what they want to do with it and as part of it, we'll also do just that. We also try to stay head of our customer so that when they decide to up the resolution of their scans or add a new type of processing to their workflow, our software can ready handle that.

            Handling these high-res scans is not something speculative. It's something our customers have already used for over a decade and our software could handle it before they started using these high-res datasets.

            While there is obviously a certain point where the data density is low enough to not be productive to work with, there is also a point where more data doesn't contribute anything of significance, especially after a certain distance away. Assuming you stop moving the camera and the software can load in the higher density detail automatically, this is why you just sound like a whiny spoiled brat - you want to see more detail all the time, when it most likely isn't necessary. You remind me of the people who will spend hundreds extra on a CPU that achieves another 15FPS in a game that is already 100FPS higher than what your display can render, because you can't settle for anything that isn't the best.
            You're doing that "I know better than subject matter experts in a field I've never worked in"-thing again...

            I know you're a know-it-all who thinks he's literally the smartest man on the planet. But it's getting ridiculous the number of times you're going on about how you know SO much more than all these subject matter experts that we have on staff and especially as our customers. It's quite frankly embarrassing to read when you've never actually done anything in or worked in this industry you think you know better than anyone in it.

            No, you've regularly been omitting words, because it's a lot easier to make someone look stupid when you twist what they said to fit your narrative.
            Oh, so now I'm chopping up sentences to make you look stupid? Fully how in that one post that didn't include the entirety of your previous post I dropped a couple of paragraphs where you re-iterated points I'd already proven wrong and went for the "punchline" at the end, re-iterating the disproving of the incorrect points.

            I ignored nothing, you were vague.
            So now using the plural form of a word is being "vague" and I should instead be explicit in stating that I mean something in plural? Is it that hard to admit you misread?

            I've seen devs here dabble in some back and forth arguments. Obviously they're a lot more polite about it.
            Like I said; The devs clearly identified as such know they're acting as representatives so they're going to be on their best behavior. I've had one instance where I've later realized I was actually wrong (something I, unlike you, am able to admit to) and the dev decided to just let it go.

            For what it's worth, I'm insisting 1+1=2, you're insisting 1.0+1.0=2.0
            With you and your "I-know-more-than-subject-matter-experts-in-this-field-I'm-a-tota-novice-in"

            Right, that's for one layer of compression. Again: it makes sense to use asset-specific optimized compression, which apparently MS is doing with BCPack.
            See? I've been telling you time and time again that developers absolutely DO compress their games by quite a lot. With multiple compressors that are a standard part of their toolkits, including asset-specific ones like the compression tools in Autodesk Maya.

            Meanwhile, just as I suspected, zlib isn't the only thing devs use
            One step forward, two steps backwards... I explicitly mentioned the Playstation 5's hardware compressor three times to you. Of course zlib isn't the only tool they use when I bring up multiple other commonly used compressors, explicitly. I brought up zlib because its been part of the standard toolkit of every console that's come our since 2005.

            I said they're not compressing enough. There is room for more. Some assets can get by with lossy compression for a major reduction in resource consumption but a negligible fidelity loss.
            When you, by to your own admission, can only wrangle a couple of gigabytes out of a 40-50 GB game with a high setting, your "enough" is one where you spend more additional time de-compressing assets than you save in reading them from disc. Its textbook "Bending over a dollar to pick up a dime" for crying our loud.

            How would you respond if I said the same to you, besides touting your self-proclaimed expertise?
            If you actually had an argument I couldn't disprove I'd do what any normal person would do and concede the argument. However you've made it abundantly clear that you're never going to concede the argument. Not just with how you keep insisting you know better about what's actually useful than any subject matter expert in a field you've never worked in, but also in how you've explicitly stated that you're never going to concede the argument.

            As I've said before; Being the "last man standing" doesn't mean anything. It takes no skill to keep arguing after you've long since lost the actual argument...
            "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

            Comment


            • #56
              Originally posted by L_A_G View Post
              Because when they're proven wrong, any normal person will admit that and move on. Only a habitual contrarian will keep arguing after they've lost the argument.
              I have acknowledged multiple times where you're either right or where your perspective would make sense if it were actually true. The only thing we disagree on is subjective. That's what's so funny about this, because you're the only one deliberately finding ways to disagree with me, yet you call me the habitual contrarian. All I've been doing is clarifying things that you continue to misinterpret, whether that be because you're the contrarian, because you omit the complete picture (whether deliberate or not), or because you make rash assumptions that only make sense if you assume the worst in someone.
              So now you're adding tactical dementia to the mix? We were talking about real-time workstation applications here you senile git.
              How am I the one with dementia when it was established from the beginning what type of system that dataset was in?
              The fact that you're openly admitting you plan on continuing this argument indefinitely is probably the clearest sign. Seems like growing up you "won" arguments in your own head by just moaning on and on until people got sick of talking to you and you never grew out of it.
              To be a contrarian means I'm going out of my way to disagree. I've really only been disagreeing about 1 particular thing, with the rest just clarifying my perspective. So: I'll gladly admit I'm a stubborn jackass, annoying, and childish in my behavior - this is the internet, I can be whoever I want with minimal consequence, just like you can pretend to be an expert! But I'm not a habitual contrarian; that title belongs to you, since you're the one who from the beginning has been disagreeing with just about everything I've said.
              Go ahead, disagree with me here, and see how you're less of a contrarian than me.
              Do you not understand how an example works? It's meant to be a representation of something and when it's not representative, then it's an invalid example. An example, by definition, needs to be representative of what it's meant to represent. If it's outdated or was always a misconception, it's it's simply no good and it doesn't prove anything.
              It was representative: it showed that with good compression, you can load data quicker. Just because it was an old game, the concept doesn't change. Modern games may be a lot more compressed but the point remains: there is room for more compression.
              Again; You've yet to demonstrate a SINGLE valid example of a game whose PC version is significantly larger than its console release.
              No? I mentioned a handful of examples. There was only 1 example where it was publicly known that the size difference was from a lack of compression.
              When "understanding" your argument means that you need to add in parts that were never there, i.e a form of straw man fallacy, nobody (honest) except you is ever going to "understand" you. Make your argument and make it in it's entirety. Nobody (honest) is going to start adding on points and arguments you never made.
              Those additional points didn't have to be made, because most honest people who actually have a semblance of expertise would understand what's implied.
              What next, I have to tell you the database software, the OS, the hardware, the ISP, etc too? There's a point where additional information is implied; nobody runs a business off a single table, therefore, it was implied there was more.
              Yet you somehow felt the need to bring up this 20-year-old game as if the performance bump provided by decent compression was somehow a revelation. Being kind and assuming you're not just making things up; When 40-50 GB install sizes have been the norm for the almost a decade at this point a couple of GB from additional compression suggests there isn't that much extra room for compression and that you intentionally chose a non-representative example.
              Who said anything about a revelation? I was just using an example of how much compression helps.
              And again: saving a couple GB of 50GB from on-the-fly compression is a worst-case scenario. It's not efficient and it doesn't make sense to do that, but the thing that can't seem to penetrate your thick head is that it shows how much potential there is for further compression.
              Yet you somehow felt the need to only mention this one table, further suggesting that the 700 GB figure is either entirely made up or the sum total of data on a data storage server where most of it is user files.
              I only mentioned the one table because it's the only thing that was relevant. The other ~700GB has nothing to do with my example, at all. All it does is state the obvious: that there is more than 1 table.
              Fixed that for you. The sad part is that you're never going to take it because your parents failed to raise someone with the self-reflection to be able to realize when they're wrong.
              You are bursting at the seams with hypocrisy. I'd be happy knowing I've taken down your reputation here, if only our audience hadn't already left several pages ago. Even in cases where people agree with you, you've dragged this on too long to ever be viewed favorably. I already have a bad rep here so it makes no difference to me. Like I said, I'm here for the ride - we're in the same car, and you're the one driving haha.
              If you're just cobbling everything together just to make a point then it's only natural you don't even bother writing it to disc.
              Or... I don't make tables out of join statements because that's generally stupid and unnecessary.
              Like I said; There's a limit to how long you can keep up a lie about your capabilities when when you're employed specifically for those capabilities. You can get your foot in the door, even get hired if they're incompetent and don't test you or fail to notice plagiarism, but eventually they will find out when you fail to perform.
              Agreed. The thing is, you can stop lying to me about your expertise - I don't really care about your self-proclaimed capabilities.
              You insisted that it's a significant expenditure and I countered by pointing out that compared to the salary of the people who use them and the other tools they're also going to be given it's not a significant expenditure. You getting stuck on me not including the cost of the peripherals not being included in that (because they may not be replaced at the same time as the machine) suggests it either (honestly) went over your head or you're (dishonestly) trying to divert away from the actual point; They're not a major investment.
              Considering how much it compared to the salary figure, I'd say it was in fact a significant expenditure. If you are one of many who have such a system, that makes it even more significant. Peripherals are required to do the job so they can't be ignored.
              Like I've told you multiple times already; When you spend 10.000€ or more producing a single dataset you damn well want to use it to it's fullest extent. That's another example of a cost so big the additional cost of a fast SSD is negligible by comparison.
              I agree with wanting to use it to its fullest extent, but that doesn't mean perpetually. Most industries are aware that it's not economical or necessary to use the complete dataset all the time.
              Says the person openly stating that they plan on continuing this argument indefinitely... Maybe I should write a script that takes your reply every time you do, has ChatGPT generate a response and post it here. That way you can keep arguing with the AI until the day you die. Hell, you may even enjoy cutting out the middleman and just go argue with an AI yourself.
              It would certainly end sooner, because ChatGPT isn't driven by emotions and wouldn't conveniently omit information for the sake of supporting its own argument. ChatGPT, while not always correct, is a more trustworthy source of information.
              Seriously, try it though: ask ChatGPT how necessary high performance NVMe drives are. Ask it what workloads demand them.
              I'd already told you multiple times by that point that I'm a software developer and what do most of us do? Make software for other people. I assumed this was obvious and that I was talking about the software we sell to our customers and what they use it for. Our customers buy our software because they want to use it for things and those things are uses where a fast SSD is a real benefit that more than pays for itself. I was also obviously talking about things we and our customers use this software for. A big part of software development is to test your software and to sufficiently do so requires us to do what our customers will or already do with our software.
              You clarified what your organization does but not specifically what YOU did. Unlike you, I don't make such assumptions. That being said, there's quite a lot you're not saying that makes a lot of difference here.
              There are a variety of reasons why it makes sense to have a single workstation with all the bells and whistles (including a high-performance SSD). If it's just you and maybe 5 other devs, I can see why maybe all of you would have such a system; I'd still argue it's not necessary for you all to have one, but I get it. But, if you are one of dozens of devs, it starts to not make sense, because not all of you need such a system. In fact, you'd be doing your customers a disservice by only testing on the best you can get.
              Furthermore, if your title along with several others is nothing more than just software developer, you're probably not doing work directly for your customers via your software. So, since nobody is requesting YOU to work with the full dataset, it doesn't make sense why YOU need your own full-spec'd workstation.
              Of course, if it came to something like compiling code and you've got yourself oodles of threads then I'd say sure, get that drive - you'll need it there. As I've said before, high-performance SSDs make sense with software development.
              Our internal testing is not niche; It's what our customers will and do use our software for and we have to do it.
              I meant niche compared to all industries that demand a high-performance SSD... I'm sure the amount of customers you have is within 4 digits, and even that seems rather high.
              Yet you still tried to make the argument that we and our customers aren't going to be taxing our drives with those smaller chunks...
              Of course it'll be taxing. What I'm suggesting is that it wouldn't be a crippling bottleneck. A stutter here and there doesn't really warrant paying 5x the price.​
              If our customer pays a lot of money to have those high-res scans done, they're going to want to use them to their fullest. If they want something and will also pay for it, then for us then its a need. It's as simple as that. We don't tell them to "cry me a river" or call them a "spoiled brat" when our software can't handle what they want to use it for. No, what we do is improve our software so it can handle what they want to do with it and as part of it, we'll also do just that. We also try to stay head of our customer so that when they decide to up the resolution of their scans or add a new type of processing to their workflow, our software can ready handle that.
              I agree with all that; the customer can do whatever they want. Doesn't mean it's necessary, and you're not the customer. Like stated before: it's smart to have a single fully-spec'd workstation on-hand, but it doesn't mean YOU need one.
              I know you're a know-it-all who thinks he's literally the smartest man on the planet. But it's getting ridiculous the number of times you're going on about how you know SO much more than all these subject matter experts that we have on staff and especially as our customers. It's quite frankly embarrassing to read when you've never actually done anything in or worked in this industry you think you know better than anyone in it.
              Here's where you start slipping through the cracks:
              You're the only one who keeps trying to convince yourself of your expertise; it should be apparent by now you won't convince me of this.
              You're the one who feels the need to convince me that you're right. Any industry can just keep throwing more data points and claim they need a high-performance SSD to keep up with how big it gets, but most understand how much you can accomplish with less, not necessarily discarding the excess.
              If you were truly an expert and knew with 100% certainty you were right, you wouldn't have to argue with me. But you, as the driver of this car we're in, are taking us to my homeland.
              Welcome, brother. Bask in the embarrassment you have of me, because like it or not - you've got it yourself now.
              So now using the plural form of a word is being "vague" and I should instead be explicit in stating that I mean something in plural? Is it that hard to admit you misread?
              No... the vagueness is how the datasets were read. How are you not keeping up with this? You mentioned data is read in 25GB chunks. You were not specific in how those were loaded, but based on your argument, it seemed like you were suggesting all 25GB were loaded at once. Like I said, that would make sense to need a high-performance SSD, because that's a hell of a lot of data to load at a time (but a stupid design). So, since you corrected me and said it isn't all loaded at once, that's where you undermined your own argument, since you can basically just load what you need with a bit of a buffer to keep up with a lesser SSD.
              Make sense now, or is there something else you're going to misinterpret/forget/omit now?
              Like I said; The devs clearly identified as such know they're acting as representatives so they're going to be on their best behavior. I've had one instance where I've later realized I was actually wrong (something I, unlike you, am able to admit to) and the dev decided to just let it go.
              Hilarious - only once have you ever realized you were wrong.
              On a more serious note, it seems to me you caved only because of authority. I don't believe you actually realized you were wrong.
              In any case, perhaps you should take their lead.
              See? I've been telling you time and time again that developers absolutely DO compress their games by quite a lot. With multiple compressors that are a standard part of their toolkits, including asset-specific ones like the compression tools in Autodesk Maya.
              Yeah... I know that... I never disagreed with that. The point you continue to forget is zlib isn't the only way to compress data, and data can be compressed further.
              One step forward, two steps backwards... I explicitly mentioned the Playstation 5's hardware compressor three times to you. Of course zlib isn't the only tool they use when I bring up multiple other commonly used compressors, explicitly. I brought up zlib because its been part of the standard toolkit of every console that's come our since 2005.
              You can bring up the hardware compressor for a 4th time, if you like. Doesn't really do anything to help your point other than to explain that it reduces CPU overhead, which is really just tangential. PCs don't have it, but they do have better CPUs.
              Funny how you acknowledge there's things other than zlib once I point it out. You seemed pretty insistent to treat it as the only thing games and their assets are compressed with.
              When you, by to your own admission, can only wrangle a couple of gigabytes out of a 40-50 GB game with a high setting, your "enough" is one where you spend more additional time de-compressing assets than you save in reading them from disc. Its textbook "Bending over a dollar to pick up a dime" for crying our loud.
              I already addressed this earlier in the post, as well as previous posts. I'm not repeating myself.
              I am stubborn to a fault but I'm not insane.
              As I've said before; Being the "last man standing" doesn't mean anything. It takes no skill to keep arguing after you've long since lost the actual argument...
              Hubris sure has quite an impact on people's way of thinking. Tell me: what do you gain from this? Like I said, I've already got a bad rep here, nobody is paying attention to this anymore, and you're the only arguing that 1.0 is different than 1.
              See you tomorrow!

              Comment


              • #57
                Originally posted by schmidtbag View Post
                I have acknowledged multiple times where you're either right or where your perspective would make sense if it were actually true. The only thing we disagree on is subjective. That's what's so funny about this, because you're the only one deliberately finding ways to disagree with me, yet you call me the habitual contrarian. All I've been doing is clarifying things that you continue to misinterpret, whether that be because you're the contrarian, because you omit the complete picture (whether deliberate or not), or because you make rash assumptions that only make sense if you assume the worst in someone.
                Your attempt at playing "5d chess" with that "Yes that's true, but it means I'm right" is just a lame attempt to avoid admitting you were wrong when you know you were wrong.

                When I'm replying to exactly what you've written and not some kind of benign straw man I should construct myself is just being honest. If I did that, as soon as I "win" against this benign straw man you're going to start complaining about me constructing a bad straw man and misrepresenting you. So I'm just replying to exactly what you've written. Word for word with nothing omitted.

                How am I the one with dementia when it was established from the beginning what type of system that dataset was in?
                We were always talking about a desktop use type of product. Not sure how you somehow forgot that...

                To be a contrarian means I'm going out of my way to disagree. I've really only been disagreeing about 1 particular thing, with the rest just clarifying my perspective. So: I'll gladly admit I'm a stubborn jackass, annoying, and childish in my behavior - this is the internet, I can be whoever I want with minimal consequence, just like you can pretend to be an expert! But I'm not a habitual contrarian; that title belongs to you, since you're the one who from the beginning has been disagreeing with just about everything I've said.
                Go ahead, disagree with me here, and see how you're less of a contrarian than me.
                Being a contrarian isn't just about material facts. Your "But muh perspective" when you can't dispute the material facts that go against you doesn't change anything. You're just obsessively replying to keep the discussion going. By your own admission I might add. That's exactly what being a contrarian is about. Arguing for the sake of arguing well after you've lost the argument. The fact that you've not only proven yourself clueless on the subject, but admitted to as much doesn't exactly help your case.

                It was representative: it showed that with good compression, you can load data quicker. Just because it was an old game, the concept doesn't change. Modern games may be a lot more compressed but the point remains: there is room for more compression.
                When games are are already pretty compressed it does absolutely nothing except state the obvious. It doesn't help the argument that modern games still have an significant amount of room for compression one bit. It just explains why games these days are a properly compressed.

                No? I mentioned a handful of examples. There was only 1 example where it was publicly known that the size difference was from a lack of compression.
                Yes, you mentioned a handful of examples and the only one that actually had data that wasn't properly compressed was the audio. Something that doesn't take much space and was (IIRC) fixed soon after it was discovered.

                Those additional points didn't have to be made, because most honest people who actually have a semblance of expertise would understand what's implied. What next, I have to tell you the database software, the OS, the hardware, the ISP, etc too? There's a point where additional information is implied; nobody runs a business off a single table, therefore, it was implied there was more.
                If the size of a particular table is representative of the whole database is a relevant point. You didn't say that until I pressed you and the way you're responding suggests that you're now just making stuff up because it isn't representative of the size of the whole database.

                Who said anything about a revelation? I was just using an example of how much compression helps. And again: saving a couple GB of 50GB from on-the-fly compression is a worst-case scenario. It's not efficient and it doesn't make sense to do that, but the thing that can't seem to penetrate your thick head is that it shows how much potential there is for further compression.
                No, what doesn't penetrate your thick skull is that it suggests the exact opposite. It suggests that yes, you can compress data further, but you'll spend more time decompressing that data than you'd have spent reading a couple of gigabytes more from disc. Its simply a case of "Bending over a dollar to pick up a dime".

                I only mentioned the one table because it's the only thing that was relevant. The other ~700GB has nothing to do with my example, at all. All it does is state the obvious: that there is more than 1 table.
                With the context of the whole database being 700 GB, again generously assuming you're not making it all up, that one table's size is the least relevant part when the overarching subject is the matter of fast disc drives. When you're doing major (well optimized) processing of a dataset that big a fast SSD is absolutely going to be a great benefit. There's a reason why the first fast PCIe SSDs were for datacenters. There's a reason why RAID is these days only used for the redundancy.

                You are bursting at the seams with hypocrisy. I'd be happy knowing I've taken down your reputation here, if only our audience hadn't already left several pages ago. Even in cases where people agree with you, you've dragged this on too long to ever be viewed favorably. I already have a bad rep here so it makes no difference to me. Like I said, I'm here for the ride - we're in the same car, and you're the one driving haha.
                I've dragged this on for too long? Says the guy who's openly admitted that he plans to keep dragging this on indefinitely. I explicitly told you that I don't see the world as some moronic popularity contest when you tried to use the fact that you'd gotten a few upvotes as an argument. That not only have I probably gotten more than you in this discussion, they also mean nothing.

                Or... I don't make tables out of join statements because that's generally stupid and unnecessary.
                Exactly... It was stupid for you to try and make that database look bigger than what it actually is.

                Agreed. The thing is, you can stop lying to me about your expertise - I don't really care about your self-proclaimed capabilities.
                Considering I've worked at the same company for 5 years and delivered several components relating to my area of expertise in high performance compute I'd say it's pretty clear I've proven that I'm not a fraud. My master's degree and several additional years of doing HPC on desktop, datacenter and embedded systems for a living further suggest I'm not a fraud.

                Considering how much it compared to the salary figure, I'd say it was in fact a significant expenditure. If you are one of many who have such a system, that makes it even more significant. Peripherals are required to do the job so they can't be ignored.
                A couple of days' salary for someone who's going to be using it for years is hardly a significant expenditure. The cost of it being insignificant even before the cost of peripherals only further proves how insignificant of an expenditure it is. I didn't put a number on the cost of peripherals because they can range from fairly mundane to something that includes a 10.000€ 3D PluraView stereo monitor setup (something I had in my office until a few months ago when a customer liked it so much they bought it off us).

                I agree with wanting to use it to its fullest extent, but that doesn't mean perpetually. Most industries are aware that it's not economical or necessary to use the complete dataset all the time.
                They will absolutely use it to it's fullest extent from when they get it until it becomes outdated. What's not economical is to pay over 10.000€ for something and then not use properly.

                It would certainly end sooner, because ChatGPT isn't driven by emotions and wouldn't conveniently omit information for the sake of supporting its own argument.
                Aah yes... Omitting information you never bothered to type out, that I should have just known and brought up myself rather than let you look stupid by arguing against what you actually did write out. Word-for-word with nothing omitted.

                ChatGPT, while not always correct, is a more trustworthy source of information. Seriously, try it though: ask ChatGPT how necessary high performance NVMe drives are. Ask it what workloads demand them.
                Apparently you don't seem to know much about ChatGPT, how bad it's "hallucination" problem is or that it's just as bad at vetting what is reads as Tay Tweets.

                You clarified what your organization does but not specifically what YOU did. Unlike you, I don't make such assumptions. That being said, there's quite a lot you're not saying that makes a lot of difference here.
                There are a variety of reasons why it makes sense to have a single workstation with all the bells and whistles (including a high-performance SSD). If it's just you and maybe 5 other devs, I can see why maybe all of you would have such a system; I'd still argue it's not necessary for you all to have one, but I get it. But, if you are one of dozens of devs, it starts to not make sense, because not all of you need such a system. In fact, you'd be doing your customers a disservice by only testing on the best you can get.
                Furthermore, if your title along with several others is nothing more than just software developer, you're probably not doing work directly for your customers via your software. So, since nobody is requesting YOU to work with the full dataset, it doesn't make sense why YOU need your own full-spec'd workstation.
                Of course, if it came to something like compiling code and you've got yourself oodles of threads then I'd say sure, get that drive - you'll need it there. As I've said before, high-performance SSDs make sense with software development.
                When I explicitly tell you I'm a software developer at a company that develops a particular kind of software, what do you think I do all day? Make coffee? No. I write and test the software we're in business to make and sell. Furthermore, I need to be able to test the software I also need to be able to test this software doing what our customers are going to be using it for in a realistic manner. Otherwise I'm going to be missing bugs and our paying customers are going to be running headlong into. I also mentioned that I do this on a machine that's 5 years old at this point (on top of a bunch of other machines widely varying in age and capability) so your attempted shot in the dark there just failed.

                We're not talking about software that's used sporadically by random office workers. We're talking about software where if you use it, you use it to make a living because you're a professional at what our software is for. We're not one of those fortune 500 companies where people are strictly developers, QA or some other role. No, every developer does QA. Further QA is done by our support, training and sales people.

                Its funny how you again go back and forth between sentences. First I don't NEED a full workstation with a fast SSD to test the software I've co-authored in a realistic manner doing what it's actually going to be used for by our customers. Then in the next sentence you admit that compiling that code does significantly benefit from a fast SSD.

                I meant niche compared to all industries that demand a high-performance SSD... I'm sure the amount of customers you have is within 4 digits, and even that seems rather high.
                In terms of companies, you're in the right ball park. In terms of users (or rather, active seats/licenses) add another digit.

                However almost all software for professionals is "niche" like this. There's just a lot of different kinds of professionals who have software written for them.

                Of course it'll be taxing. What I'm suggesting is that it wouldn't be a crippling bottleneck. A stutter here and there doesn't really warrant paying 5x the price.​
                Not sure what bargain bins you're going trough where a 2 TB drive costs less than 100€, but when a customer spends at least 3000€ or more on the computer, up to 10.000€ on peripherals, at least 10.000€ on acquiring the dataset and at least 4000€/month on the person doing the work they're not going to suddenly scrimp and bottleneck it all over a few hundred euros. Let's not even go into the cost of the software because the annual license cost of a full suite of software in these professional workloads can easily exceed a further 10.000€ a year.

                I agree with all that; the customer can do whatever they want. Doesn't mean it's necessary, and you're not the customer. Like stated before: it's smart to have a single fully-spec'd workstation on-hand, but it doesn't mean YOU need one.
                You can keep going on about how you know more than us and our customers til the cows come home, but that doesn't change the fact that we're in business to help our customers do what they want to do so and to properly debug those products before we deliver them, we need to do what they're going to do before they do it.

                Here's where you start slipping through the cracks: You're the only one who keeps trying to convince yourself of your expertise; it should be apparent by now you won't convince me of this. You're the one who feels the need to convince me that you're right. Any industry can just keep throwing more data points and claim they need a high-performance SSD to keep up with how big it gets, but most understand how much you can accomplish with less, not necessarily discarding the excess. If you were truly an expert and knew with 100% certainty you were right, you wouldn't have to argue with me. But you, as the driver of this car we're in, are taking us to my homeland.
                Oh so you're the only one allowed to use your real life work experience and what you do there as an argument. I as someone who works with something that fits exactly in where these kinds of drives are used in a professional setting am not allowed to bring up what me, my co-workers and our customers use these drives as part of making a living. Something you readily admit to not doing.

                How silly of me (sarcasm)

                Welcome, brother. Bask in the embarrassment you have of me, because like it or not - you've got it yourself now.
                Aah yes, listen to the advice from someone who's never worked in an industry and readily admits so, but still thinks he's an authority on that industry and thinks he knows better than professionals who've made a living working in that industry for decades. That trying to inform people so they know better is a sure sign of not knowing what you're talking about.

                No... the vagueness is how the datasets were read. How are you not keeping up with this? You mentioned data is read in 25GB chunks. You were not specific in how those were loaded, but based on your argument, it seemed like you were suggesting all 25GB were loaded at once. Like I said, that would make sense to need a high-performance SSD, because that's a hell of a lot of data to load at a time (but a stupid design). So, since you corrected me and said it isn't all loaded at once, that's where you undermined your own argument, since you can basically just load what you need with a bit of a buffer to keep up with a lesser SSD.
                The only one of us talking about single 25 GB bits of datasets was you. You brought up this 25 GB without any further context as to if it was representative of the rest of the database or, as I suspected, just the whole database joined into one table for the sake of an argument. I countered that by pointing our that I have right now, on hand, a dataset 10 times that size, that it's almost 10 years old, a real customer used dataset and small by today's standards.

                You then assumed and accused the software of being badly written because it loads data in 25 GB chunks, to which I pointed out that no, the software reads the data in chunks of a few gigabytes. Which you somehow misread as it only having a single one of those few gigabytes of data in memory at a time, which I correct and further clarified that in normal use the application will have half a dozen or more in memory at a time, swapping in and out as the user moves the camera around.

                Make sense now, or is there something else you're going to misinterpret/forget/omit now?
                You're the only one of us misinterpreting something that wasn't omitted. Not sure how you could misinterpret a simple plural form of a word and then, with a straight face, complain that I wasn't being clear enough, but here we are.

                Hilarious - only once have you ever realized you were wrong.
                Funny how you accuse me of misinterpreting you and then in the very next sentence try to interpret me talking about a time I was wrong as me saying I've only only once admitted I was wrong. Projecting much?

                On a more serious note, it seems to me you caved only because of authority. I don't believe you actually realized you were wrong.
                If I didn't cave to their authority when I was actually talking to them directly, why do you think I'd cave to their authority and realize I was wrong later?

                In any case, perhaps you should take their lead.
                ... or maybe you should take my lead and develop something your parents failed to teach you. Self-reflection and the ability to admit you're wrong.

                Yeah... I know that... I never disagreed with that. The point you continue to forget is zlib isn't the only way to compress data, and data can be compressed further.
                Continue to forget? I've pointed out to you, over and over again, that zlib and other compression software are a ubiquitous part of game developers' toolkits. The first time I mentioned zlib and the Playstation 5's hardware compression was the SAME post.

                You can bring up the hardware compressor for a 4th time, if you like ... Funny how you acknowledge there's things other than zlib once I point it out. You seemed pretty insistent to treat it as the only thing games and their assets are compressed with.
                Again with the saying one thing and then contradicting that the very next sentence or paragraph? Like I said; The first time I mentioned zlib and the PS5's hardware compressor was in the exact same post. Furthermore I also pointed out in that very post that the PCs and Xbox Series consoles just do the same thing in software, achieving the same level of compression.

                I already addressed this earlier in the post, as well as previous posts. I'm not repeating myself.
                I am stubborn to a fault but I'm not insane.
                When you write something completely contradict yourself in the next sentence or paragraph and do that more than once in a single post, then it does bring your mental health into question...

                Hubris sure has quite an impact on people's way of thinking. Tell me: what do you gain from this? Like I said, I've already got a bad rep here, nobody is paying attention to this anymore, and you're the only arguing that 1.0 is different than 1.
                See you tomorrow!
                Maybe it is hubris trying to inform people so they know better. However like I've told you already, I don't view the world as some childish popularity contest. I try to be well informed and to pass this information onto other people.

                Furthermore, your "agreeing" is more along the lines of; "Yes, A is 1 so A + 1 = 4"
                Last edited by L_A_G; 22 March 2023, 12:26 PM.
                "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

                Comment


                • #58
                  Originally posted by L_A_G View Post
                  When I'm replying to exactly what you've written and not some kind of benign straw man I should construct myself is just being honest. If I did that, as soon as I "win" against this benign straw man you're going to start complaining about me constructing a bad straw man and misrepresenting you. So I'm just replying to exactly what you've written. Word for word with nothing omitted.
                  Except you don't, you reply to what you feel like and ignore the rest, which is the sole reason why you keep arguing. That's why I have to keep repeating things that you misquote. Case in point:
                  We were always talking about a desktop use type of product. Not sure how you somehow forgot that...
                  Being a contrarian isn't just about material facts. Your "But muh perspective" when you can't dispute the material facts that go against you doesn't change anything. You're just obsessively replying to keep the discussion going. By your own admission I might add. That's exactly what being a contrarian is about. Arguing for the sake of arguing well after you've lost the argument. The fact that you've not only proven yourself clueless on the subject, but admitted to as much doesn't exactly help your case.
                  The only thing for me to dispute is your interpretation of my perspective. That's the thing you're not getting: we're mostly on the same page, so there's no "material facts" for me to dispute. The only thing we both definitively disagree on is subjective, which isn't going to have material facts.
                  Seems there's a bit of self-relfection here if you feel I'm obsessed. I'm having fun with this; you're the one with an objective.
                  When games are are already pretty compressed it does absolutely nothing except state the obvious. It doesn't help the argument that modern games still have an significant amount of room for compression one bit. It just explains why games these days are a properly compressed.
                  "Pretty compressed" compared to what? You have less evidence to back up your claim there than I have mine. Again: Steam shows how much potential there is, you have no ground to stand on here.
                  Yes, you mentioned a handful of examples and the only one that actually had data that wasn't properly compressed was the audio. Something that doesn't take much space and was (IIRC) fixed soon after it was discovered.
                  I had one example where there was a known reason for the extra size. And yes, the game did shrink by tens of GB once enough people complained. That is quite a lot of space to save, especially if it were strictly audio files.
                  If the size of a particular table is representative of the whole database is a relevant point. You didn't say that until I pressed you and the way you're responding suggests that you're now just making stuff up because it isn't representative of the size of the whole database.
                  That makes no sense at all. The rest of the database could be half the size and the underlying point I was making would remain the same, a point you continue to try to not understand, because doing so would show you're the Dunning-Kruger victim.
                  No, what doesn't penetrate your thick skull is that it suggests the exact opposite. It suggests that yes, you can compress data further, but you'll spend more time decompressing that data than you'd have spent reading a couple of gigabytes more from disc. Its simply a case of "Bending over a dollar to pick up a dime".
                  That is false when you account for optimized compression methods on modern CPUs.
                  With the context of the whole database being 700 GB, again generously assuming you're not making it all up, that one table's size is the least relevant part when the overarching subject is the matter of fast disc drives. When you're doing major (well optimized) processing of a dataset that big a fast SSD is absolutely going to be a great benefit. There's a reason why the first fast PCIe SSDs were for datacenters. There's a reason why RAID is these days only used for the redundancy.
                  And yet you claim you don't ignore things I say. Hilarious.
                  Repeating myself has got too boring.
                  I've dragged this on for too long? Says the guy who's openly admitted that he plans to keep dragging this on indefinitely. I explicitly told you that I don't see the world as some moronic popularity contest when you tried to use the fact that you'd gotten a few upvotes as an argument. That not only have I probably gotten more than you in this discussion, they also mean nothing.
                  Exactly: I won't end this, you're aware of this, so you're the one who is keeping this going.
                  The point of me bringing up the upvotes was to show that I'm not clueless; that other people understood what I was saying, and that you were just being an autistic literalist.
                  Exactly... It was stupid for you to try and make that database look bigger than what it actually is.
                  Except I didn't, hence me initially not pointing out the rest of the database. You have a strange sense of logic.
                  Considering I've worked at the same company for 5 years and delivered several components relating to my area of expertise in high performance compute I'd say it's pretty clear I've proven that I'm not a fraud. My master's degree and several additional years of doing HPC on desktop, datacenter and embedded systems for a living further suggest I'm not a fraud.
                  Ah ok, I'll just take your word for it. After all, you ARE the authority of all that is correct, and should never be questioned! Not believing you would be foolish!
                  A couple of days' salary for someone who's going to be using it for years is hardly a significant expenditure. The cost of it being insignificant even before the cost of peripherals only further proves how insignificant of an expenditure it is. I didn't put a number on the cost of peripherals because they can range from fairly mundane to something that includes a 10.000€ 3D PluraView stereo monitor setup (something I had in my office until a few months ago when a customer liked it so much they bought it off us).
                  Wait so the system is only worth 2 days salary? When you mentioned the salary, I thought you meant monthly. If so, what kind of sad pathetic life do you live where you have all that cash but your time is spent ranting at some random jerk on the internet with a known-futile trajectory? This is the kind of crap why I don't take your "expertise" seriously.
                  In any case, funny how you omitted that monitor; that's certainly big money by anyone's definition.
                  They will absolutely use it to it's fullest extent from when they get it until it becomes outdated. What's not economical is to pay over 10.000€ for something and then not use properly.
                  I agree. I never suggested otherwise, but as someone who replies "exactly" and "word for word" to what I wrote with no omissions, you would have known that.
                  Aah yes... Omitting information you never bothered to type out, that I should have just known and brought up myself rather than let you look stupid by arguing against what you actually did write out. Word-for-word with nothing omitted.
                  Not my fault you have memory and/or reading comprehension issues.
                  Apparently you don't seem to know much about ChatGPT, how bad it's "hallucination" problem is or that it's just as bad at vetting what is reads as
                  Ah, so I guess it must've been trained by a lot of your posts.
                  When I explicitly tell you I'm a software developer at a company that develops a particular kind of software, what do you think I do all day? Make coffee? No. I write and test the software we're in business to make and sell. Furthermore, I need to be able to test the software I also need to be able to test this software doing what our customers are going to be using it for in a realistic manner. Otherwise I'm going to be missing bugs and our paying customers are going to be running headlong into. I also mentioned that I do this on a machine that's 5 years old at this point (on top of a bunch of other machines widely varying in age and capability) so your attempted shot in the dark there just failed.
                  Yeah, none of that changes my point.
                  You never mentioned the machine being 5 years old, you said you worked at the company for 5 years, two completely different things. This just keeps getting better.
                  Its funny how you again go back and forth between sentences. First I don't NEED a full workstation with a fast SSD to test the software I've co-authored in a realistic manner doing what it's actually going to be used for by our customers. Then in the next sentence you admit that compiling that code does significantly benefit from a fast SSD.
                  So there we have it - you've settled the argument:
                  All I was saying the whole time is that there is a short list of types of workloads that require/need a high-performance SSD, compiling code being one of them. You disagreed, saying that your LIDAR data needs one too, but now, you said in full caps, "I don't NEED a full workstation with a fast SSD".
                  How embarrassing for you to go through all this just to admit I was right.
                  Not sure what bargain bins you're going trough where a 2 TB drive costs less than 100€, but when a customer spends at least 3000€ or more on the computer, up to 10.000€ on peripherals, at least 10.000€ on acquiring the dataset and at least 4000€/month on the person doing the work they're not going to suddenly scrimp and bottleneck it all over a few hundred euros. Let's not even go into the cost of the software because the annual license cost of a full suite of software in these professional workloads can easily exceed a further 10.000€ a year.
                  I don't know in what industry everything you just mentioned is stored on some random guy's workstation.
                  You can keep going on about how you know more than us and our customers til the cows come home, but that doesn't change the fact that we're in business to help our customers do what they want to do so and to properly debug those products before we deliver them, we need to do what they're going to do before they do it.
                  Hence me saying it makes sense to have at least 1 top-spec'd workstation.
                  Oh so you're the only one allowed to use your real life work experience and what you do there as an argument. I as someone who works with something that fits exactly in where these kinds of drives are used in a professional setting am not allowed to bring up what me, my co-workers and our customers use these drives as part of making a living. Something you readily admit to not doing.
                  Took me a minute to figure out where you got that from but I get you now. But no - personal anecdotes don't mean much. You're free to (and already have) taken what I say with a grain of salt, but I'm going to do the same to you. By the time I figured out that you were acting emotionally rather than rationally, I stopped bringing up new anecdotes of my own, because I know better now that they mean nothing to you.
                  The only one of us talking about single 25 GB bits of datasets was you. You brought up this 25 GB without any further context as to if it was representative of the rest of the database or, as I suspected, just the whole database joined into one table for the sake of an argument. I countered that by pointing our that I have right now, on hand, a dataset 10 times that size, that it's almost 10 years old, a real customer used dataset and small by today's standards.
                  You didn't specify in context:
                  Phoronix: Testing The First PCIe Gen 5.0 NVMe SSD On Linux Has Been Disappointing This past week saw the first two consumer PCIe 5.0 NVMe solid-state drives released to retail: the Gigabyte AORUS Gen5 10000 and the Inland TD510. I've been testing the Inland TD510 2TB Gen 5 NVMe SSD the past few days. While in simple I/O

                  "Not all in one go, but in a series of big chunks that can be bigger than that 25GB table."
                  Perhaps this is referring to the 40-250GB datasets you also mentioned. In any case, I'm fine admitting of misquoting here; you're pulling a strawman if you continue to focus on that.
                  And for the hundredth time: it doesn't matter how big the dataset is. That's the point you fail to remember. That's why I never brought up the full database, because any true expert who isn't a habitual contrarian would know that nobody has a system that contains nothing but a single 25GB table and therefore is probably going to need to leave some RAM for other data.
                  You then assumed and accused the software of being badly written because it loads data in 25 GB chunks, to which I pointed out that no, the software reads the data in chunks of a few gigabytes. Which you somehow misread as it only having a single one of those few gigabytes of data in memory at a time, which I correct and further clarified that in normal use the application will have half a dozen or more in memory at a time, swapping in and out as the user moves the camera around.
                  Sounds about right to me.
                  You're the only one of us misinterpreting something that wasn't omitted. Not sure how you could misinterpret a simple plural form of a word and then, with a straight face, complain that I wasn't being clear enough, but here we are.
                  The only thing that takes the fun away from all this is how your comprehension issues make this go nowhere. I want drama; your failing comprehension isn't fun.
                  If I didn't cave to their authority when I was actually talking to them directly, why do you think I'd cave to their authority and realize I was wrong later?
                  Because you realized what it would do to your reputation if you kept going. I'm a nobody so there's less risk to continue.
                  ... or maybe you should take my lead and develop something your parents failed to teach you. Self-reflection and the ability to admit you're wrong.
                  Mind explaining how I'm supposed to take your lead when you have not shown any semblance of admitting you're wrong?
                  I sure hope you're not a parent - just standing there yelling at your kid "stop crawling on the floor! Stand up and walk!" while never teaching how to do so.
                  Continue to forget? I've pointed out to you, over and over again, that zlib and other compression software are a ubiquitous part of game developers' toolkits. The first time I mentioned zlib and the Playstation 5's hardware compression was the SAME post.
                  Except as you continue to forget, there's asset-specific compression and both consoles aren't using strictly zlib.
                  When you write something completely contradict yourself in the next sentence or paragraph and do that more than once in a single post, then it does bring your mental health into question...
                  It's not contradictory when you let emotions get the best of your reading comprehension.
                  Maybe it is hubris trying to inform people so they know better. However like I've told you already, I don't view the world as some childish popularity contest. I try to be well informed and to pass this information onto other people.
                  This takes your narcissism to a new level. Please forgive me, I didn't realize you were just being charitable!
                  You might want to take some consideration of the popularity contest, because nobody is reading this (and therefore are not receiving your "information"). Of anyone who is still reading, they certainly are not looking upon you fondly, even if they agree with you. You are compromising your own "noble cause".
                  Last edited by schmidtbag; 22 March 2023, 03:40 PM.

                  Comment


                  • #59
                    This is now well past the point of ridiculous and well into the realm of the plain sad...

                    You claim I'm straw manning you when I'm quoting you verbatim with your posts in their entirety and replying to every single one of your points. I cannot reply to your posts in a more honest and complete way without starting to write my own benign straw men. Something I refuse to do as I don't use straw men. Malicious or benign.

                    You've also begun to contradict yourself, not only in the same post, but in subsequent paragraphs and even sentences. Flipping between acknowledging that I've mentioned something several times and then, in the next paragraph, accusing me of never having even mentioned them until you brought it up. This is insane.

                    Don't even get me started on your continued insistence that you know better than literal subject matter experts with decades of industry experience despite never having worked a day in this industry and your explicit insistence that you're not going to back down on knowing better than these people. That you're going to continue arguing that you know better than anyone in this industry in perpetuity. This isn't even being stubborn, it's something far worse.

                    You're also now regularly (as in at least once in every post several posts in a row) forgetting things that I've brought up multiple times as some kind of gotcha-thing. As an example I've mentioned game developers using several different kinds of compressors and explicitly their use of asset type specific ones like the ones built into Autodesk Maya twice before you actually bothered looking into compression in videogames. Try to remember that you're the one who brought up zlib as something devs should be using and I pointed out that it's been a standard part of game developers' tookits for almost 20 years already. Normally I'd ascribe it to arguing in bad faith, however...

                    It's pretty clear you're not "all there" so to speak. Thus; Continuing this argument is just abusing someone who isn't all there and as someone with a conscience, I don't enjoy abusing people, let alone vulnerable people like that. I should have realised that and stopped two weeks ago, but I let my eagerness to spread the knowledge I have get the better of me.

                    I just feel sorry for you and I don't mean that as an insult. I genuinely do. Get help.
                    "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."

                    Comment


                    • #60
                      Originally posted by L_A_G View Post
                      This is now well past the point of ridiculous and well into the realm of the plain sad...
                      It was ridiculous before it reached page 5.
                      You claim I'm straw manning you when I'm quoting you verbatim with your posts in their entirety and replying to every single one of your points. I cannot reply to your posts in a more honest and complete way without starting to write my own benign straw men. Something I refuse to do as I don't use straw men. Malicious or benign.
                      You take the quote as-is and ignore the rest of the context. That's your problem.
                      You've also begun to contradict yourself, not only in the same post, but in subsequent paragraphs and even sentences. Flipping between acknowledging that I've mentioned something several times and then, in the next paragraph, accusing me of never having even mentioned them until you brought it up. This is insane.
                      Not if you account for the whole context.
                      Don't even get me started on your continued insistence that you know better than literal subject matter experts with decades of industry experience despite never having worked a day in this industry and your explicit insistence that you're not going to back down on knowing better than these people. That you're going to continue arguing that you know better than anyone in this industry in perpetuity. This isn't even being stubborn, it's something far worse.
                      Don't even get me started that you can claim to be anything you want; doesn't make it true.
                      You're also now regularly (as in at least once in every post several posts in a row) forgetting things that I've brought up multiple times as some kind of gotcha-thing. As an example I've mentioned game developers using several different kinds of compressors and explicitly their use of asset type specific ones like the ones built into Autodesk Maya twice before you actually bothered looking into compression in videogames. Try to remember that you're the one who brought up zlib as something devs should be using and I pointed out that it's been a standard part of game developers' tookits for almost 20 years already. Normally I'd ascribe it to arguing in bad faith, however...
                      I never forgot such things, the irony is despite all of that, you forgot I mentioned there's still room for more compression. zlib is a generalist compression method; just because it doesn't yield much, doesn't mean there isn't room for more compression elsewhere.
                      It's pretty clear you're not "all there" so to speak. Thus; Continuing this argument is just abusing someone who isn't all there and as someone with a conscience, I don't enjoy abusing people, let alone vulnerable people like that. I should have realised that and stopped two weeks ago, but I let my eagerness to spread the knowledge I have get the better of me.

                      I just feel sorry for you and I don't mean that as an insult. I genuinely do. Get help.
                      Ah yes, your saintliness continues to grow, while your narcissism and complete inability to reflect upon your own projections shrink.
                      I'm well aware of what I'm doing here. Notice how I never pretended to be someone I'm not; I know I'm being annoying and I never claimed I'm an authority in anything. I know nobody is reading this far in, I know nobody here cares what I have to say, and therefore, I'm not here for some BS "noble cause". You're the one having a crisis over what you've done.

                      It was fun while it lasted. At least you agreed to my underlying point, which is the best part.
                      Last edited by schmidtbag; 23 March 2023, 09:14 AM.

                      Comment

                      Working...
                      X