The Desktop CPU Security Mitigation Impact On Ubuntu 20.04

  • #41
    Originally posted by birdie View Post
    A typical reply from a brainless AMD fan.
    You are starting with an insult. Normally I wouldn't even read further, since you have already lost the argument here, but let's see what you have to say.

    Originally posted by birdie View Post
    Intel, to this day, has never knowingly released CPUs with vulnerabilities only known within the company confines. If they'd done that, it would have been a major PR disaster and even a very expensive class action lawsuit.
    We don't know that. What we know is that Intel has never said anything about them publicly before releasing a vulnerable product. That doesn't mean they didn't do it; it also doesn't mean they did. We can somewhat assume they didn't, but this statement of yours is practically unverifiable. Possible legal or PR consequences are irrelevant here: they are motivation not to do it, not proof of what Intel actually did.

    Originally posted by birdie View Post
    Let's also mention CPUs from the 90s, or even 80s, or even 70s? And how many bugs they had. Why don't you recall the F00F bug or even the FDIV bug for good measure? Oh, wait, I guess you're too young for that. Have you ever designed a single CPU uArch? Doesn't it concern you that the Meltdown bug was found to affect ARM and IBM CPUs as well? Doesn't it concern you that all OoOE CPUs are affected by the Spectre vulnerabilities?
    We are discussing the CPUs featured in the article, and those are not from the 70s, 80s, or 90s, nor ARM or IBM. Personal experience in CPU design does not validate or invalidate arguments; in fact, that is an appeal-to-authority fallacy, so you have lost the argument twice at this point.

    Originally posted by birdie View Post
    Yeah, let's talk about the extremely vocal minority of all the 10 people in the entire world who browse the web with JS disabled.
    Citation needed. Unspecific, figurative language does not help you in a technical discussion.

    Originally posted by birdie View Post
    It's amazing how you give people very sound proven known facts but since they just don't know how to twist and turn them to their favorite company advantage they start talking about everything else other than provide direct counter arguments as to why Intel is/was bad and AMD is good. I've seen bigotry but in case of AMD-fanboys/Intel-haters, it's just egregiously stupid.
    That is almost exactly what you are doing yourself.

    Originally posted by birdie View Post
    Should I remind you how fast AMD started to charge an arm and a leg for their CPUs when they got an advantage for a short while? AMD Athlon 64 FX FX-62 was released for $1031 in 2006. Adjusted for inflation that would be over $1350 today.
    Off topic.



    • #42
      Originally posted by ktecho View Post

      And they seem to forgot that with Intel you get launch day support for Linux, whereas with AMD you're going to be in the best case months until you get all in place, but it's going to probably be one or two years. Lets not forget that they got the LM_SENSORS support in place to get the temperature of latest CPUs months (years?) after launch.

      So even if AMD is a lot faster than Intel, you'll need months/years of wait for it to become that faster
      Thanks.



      • #43
        Originally posted by birdie View Post

        A typical reply from a brainless AMD fan.



        • #44
          Originally posted by birdie View Post
          A typical reply from a brainless AMD fan.
          Nothing does a better job of convincing people than starting off with a grade-school-level insult...

          Seriously though.

          Intel, to this day, has never knowingly released CPUs with vulnerabilities only known within the company confines. If they'd done that, it would have been a major PR disaster and even a very expensive class action lawsuit.
          Well, from what I've heard from people who are more intimately knowledgeable about these things and have attended Intel's technical briefings on new CPUs with new OoOE features, Intel was warned about its corner cutting years ago. Even so, they've cut way more corners than any of their competitors, and as a result have been subject to considerably more of these vulnerabilities than anything put out by their competitors.

          Let's also mention CPUs from the 90s, or even 80s, or even 70s? And how many bugs they had. Why don't you recall the F00F bug or even the FDIV bug for good measure? Oh, wait, I guess you're too young for that. Have you ever designed a single CPU uArch?
          The FDIV bug is a particularly bad example for you to bring up, considering Intel found it in their own internal testing prior to release but decided to ignore it and ship the chip without fixing it. They didn't even have a proper reporting system at the time and would instead stonewall anyone trying to report the bug until their inept response caused a big enough stink that they had to admit its existence and create a proper bug-reporting system.

          Intel's inept response to the bug let the whole thing propagate to the point that people even wrote jokes like: "How do the Republicans increase spending, cut taxes, and balance the budget at the same time? Simple, they run their spreadsheets on Intel Pentium computers!" There's also the anecdote of John Carmack, who ran into it while finishing up the original Quake and got an Intel engineer to come over and have a look at it, only to have this engineer, who knew about the yet-to-be-disclosed silicon bug, tell him to just ignore it.
          Doesn't it concern you that the Meltdown bug was found to affect ARM and IBM CPUs as well? Doesn't it concern you that all OoOE CPUs are affected by the Spectre vulnerabilities?
          Considering that other OoOE processors are only vulnerable to a fraction of all the vulnerabilities that Intel's processors from the last decade-plus have been found to suffer from, talk about them being vulnerable too is just pointless bikeshedding.



          • #45
            Originally posted by ktecho View Post

            Does anyone knows if one can disable mitigations in an EC2 in AWS?
            AFAIK it is not possible, as it opens up security vulnerabilities to the host. I don't think you can disable it for a single VM; it will probably have to be disabled on the hypervisor/host, meaning it will affect all VMs. It might be possible using dedicated hosts (https://aws.amazon.com/ec2/dedicated-hosts/pricing/), but it's damn expensive.



            • #46
              Michael, can you confirm this?

              The embedded SVG does not render the following text in my browser; I have taken it from inspecting the web page source.

              - Ryzen 7 3700X, no mitigations:
                - itlb_multihit: Not affected
                - l1tf: Not affected
                - mds: Not affected
                - meltdown: Not affected
                - spec_store_bypass: Vulnerable
                - spectre_v1: Vulnerable: __user pointer sanitization and usercopy barriers only; no swapgs barriers
                - spectre_v2: Vulnerable, IBPB: disabled, STIBP: disabled
                - tsx_async_abort: Not affected

              It looks like the CPU is still reported as vulnerable: "spec_store_bypass: Vulnerable". That should not be the case; the vulnerability has been fixed in Zen 2. https://www.techpowerup.com/256478/a...for-spectre-v4
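              By the way, there is no need to scrape the SVG: the kernel exposes exactly these status strings as one file per vulnerability under sysfs. A minimal shell sketch (the helper name `print_vulns` is mine):

```shell
#!/bin/sh
# Print "name: status" for each file in a vulnerabilities directory.
# On a real Linux system the directory to pass is
# /sys/devices/system/cpu/vulnerabilities.
print_vulns() {
    for f in "$1"/*; do
        printf '%s: %s\n' "$(basename "$f")" "$(cat "$f")"
    done
}

# Only attempt the real path if it exists (i.e. on Linux with sysfs).
if [ -d /sys/devices/system/cpu/vulnerabilities ]; then
    print_vulns /sys/devices/system/cpu/vulnerabilities
fi
```

              This prints lines like "meltdown: Not affected", matching the strings quoted above.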

              PS: skeevy420, I also wonder how much the compiler mitigations impact the results. Testing has become a PITA!



              • #47
                Originally posted by birdie View Post

                It's amazing how you give people very sound proven known facts but since they just don't know how to twist and turn them to their favorite company advantage they start talking about everything else other than provide direct counter arguments as to why Intel is/was bad and AMD is good. I've seen bigotry but in case of AMD-fanboys/Intel-haters, it's just egregiously stupid.

                How does the fact that some random Intel CPU costs more than the AMD CPU you've purchased relate to anything I've said to demonstrate that most of the flak Intel has received so far in regard to HW vulnerabilties is mostly unwarranted?

                Damn, you just cannot make up excuses to show Intel in bad light. Should I remind you how fast AMD started to charge an arm and a leg for their CPUs when they got an advantage for a short while? AMD Athlon 64 FX FX-62 was released for $1031 in 2006. Adjusted for inflation that would be over $1350 today.
                Your "proven fact" is that there's a 1% difference in the impact of vulnerability mitigations between AMD and Intel. So it's basically irrelevant.

                So I brought some much more relevant data for preferring AMD over Intel in the current generation: performance per cost. And right now AMD is the best choice, by far. And if your counter-argument is some CPU from 2006... well, that's no argument at all. I'm talking about what is available to purchase right now.

                I'm not anyone's fanboy; before getting my current Ryzen 7 3700X I had an Ivy Bridge Intel CPU (i5-3450), because that was the best choice when I bought it in 2013. It's no longer like that, and how long it stays that way depends only on Intel.



                • #48
                  Originally posted by ktecho View Post

                  And they seem to forgot that with Intel you get launch day support for Linux, whereas with AMD you're going to be in the best case months until you get all in place, but it's going to probably be one or two years. Lets not forget that they got the LM_SENSORS support in place to get the temperature of latest CPUs months (years?) after launch.

                  So even if AMD is a lot faster than Intel, you'll need months/years of wait for it to become that faster
                  You also get launch-day vulnerabilities, with mitigations that hurt performance.

                  Free CPU bugs or free performance over time? Tough choice, innit?



                  • #49
                    Originally posted by skeevy420 View Post

                    In 13 more days it'll be 4 months without a cigarette for me. Unfortunately I've replaced smoking with candy and gained weight

                    What kind of sucks is that I saw a thing earlier about how cigarette smokers are less likely to die from COVID if they catch it....fuck me....tried to become healthy all I did was make myself more at risk and picked up the underlying condition of fat ass
                    I totally missed your comment yesterday.

                    First of all, great job kicking the cigarette habit; that stuff is just pure poison. I'll be 35 in a few months. I smoked occasionally during college and a little after, and officially stopped three years ago after leaving a stressful and toxic software engineering job. (Not to mention the commute was two hours round trip, which aided the smoking.)

                    Second, congratulations on the newfound diabetes addiction! Kidding, of course. Until last month I had a ridiculous sweet tooth as well: scarfing down Sour Punch straws (you know, that red/green/blue canister from Costco), and when I wasn't eating those, shoving three Jolly Ranchers in my mouth at a time. The sugar addiction is real; once I stopped and started eating healthy, I no longer craved the sugar.

                    Third and most important, cigarette smokers are absolutely more likely to die from COVID-19; I'm not sure where you heard that. Any smoking or inflammation of the lungs will irritate and exacerbate the virus as it latches on to the ACE2 receptors in our lungs. Smoking, obesity, pre-existing conditions: all are factors in who is more susceptible.

                    P.S: since I stopped smoking pot, I started dreaming again. Weed stops you from entering REM sleep. And let me tell you, the dreams came back and they were intense as all hell. Feels good to dream again.



                    • #50
                      I have an AMD C-60 APU in a machine, and the only sane way to use it is to set mitigations=off.
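                      For anyone wanting to do the same, here is a sketch of checking and setting it, assuming a stock GRUB setup (the persistent steps are in the comments):

```shell
#!/bin/sh
# Check whether the running kernel was already booted with mitigations=off.
if grep -qw 'mitigations=off' /proc/cmdline 2>/dev/null; then
    echo "mitigations already disabled"
fi

# To make it persistent on a GRUB-based distro such as Ubuntu:
#   1. In /etc/default/grub, append mitigations=off to
#      GRUB_CMDLINE_LINUX_DEFAULT, e.g.
#      GRUB_CMDLINE_LINUX_DEFAULT="quiet splash mitigations=off"
#   2. sudo update-grub
#   3. sudo reboot
```

                      After rebooting, /proc/cmdline should contain mitigations=off and the files under /sys/devices/system/cpu/vulnerabilities will report the mitigations as disabled.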

