
AMD Releases HIP RT 2.2 With Multi-Level Instancing


  • #11
    Originally posted by Panix View Post
    Right. Is there any connection to programs like Blender? You still can't use HIP-RT with the program in Linux? So what exactly is this announcement for? You pointed out the facts - that Nvidia could already do this rendering - AMD fanboys here will just say Nvidia had a head start and pays these software companies off to optimize with OptiX or whatever tech Nvidia uses. I have heard all that before.
    It still doesn't change the fact that AMD's progress in these fields is akin to a sloth climbing a tree. Or a snail's pace. Pick your expression.
    But Nvidia had a head start?
    Also keep in mind that the 2.5-year lead Nvidia has does not matter that much for the future.
    Phantom circuit Sequence Reducer Dyslexia



    • #12
      Originally posted by sophisticles View Post
      AMD does not want to make any progress; it's not in their best interest.

      Really?...

      Originally posted by sophisticles View Post
      AMD's approach to CPUs is more cores; they will be releasing a 192C/384T TR. AMD does not want to sell you a $4000 video card that's 12x faster than a comparable CPU/motherboard combo, and they can keep selling faster CPUs with more cores every year.
      AMD is forced to build a 192-core CPU because the cloud-native market already has 192-core ARM CPUs.
      AMD would lose many sales to these 192-core ARM chips.

      Originally posted by sophisticles View Post
      Nvidia is kind of forcing AMD to go through the paces of pretending they care about GPU acceleration.
      The 192-core CPU has nothing to do with the GPU market. Also, the AMD PRO W7900 is not an example of planned obsolescence, because its 48GB of VRAM will be enough for longer than any 24GB-VRAM 4090...


      • #13
        Originally posted by sophisticles View Post
        .
        Sophisticles, you asked me how to go from a Firefox 119 CVE to the LogoFail UEFI/BIOS hack.

        It's a user-rights escalation vulnerability in glibc that gives a task running as a user root rights.

        Well, here is the answer; it's CVE-2023-6246, CVE-2023-6779 and CVE-2023-6780:

        "glibc vulnerability allows root access on Linux

        In addition, further vulnerabilities were discovered in the GNU C library. One of them has probably existed for over 30 years. Four vulnerabilities in glibc endanger countless Linux systems.

        Security researchers at Qualys have discovered several vulnerabilities in the widely used GNU C library (glibc). One of them, registered as CVE-2023-6246, relates to the __vsyslog_internal() function and allows attackers to gain root access by triggering a heap buffer overflow in the default configuration of several popular Linux distributions. The attack is carried out via the frequently used logging functions syslog() and vsyslog().

        According to the researchers' report, this security hole was introduced in version 2.37 of glibc, which was published in August 2022. However, the vulnerable code was later backported to version 2.36 in order to patch another, less serious vulnerability. At least the Linux distributions Debian 12 and 13, Ubuntu 23.04 and 23.10, and Fedora 37 to 39 are proven to be vulnerable. However, CVE-2023-6246 can only be exploited under certain conditions, for example through an unusually long argv[0] or openlog() argument with more than 1,024 bytes. Due to the widespread distribution of the vulnerable library, the effects are still significant.

        Other, less serious vulnerabilities

        Less serious are two other buffer overflow vulnerabilities discovered by Qualys (CVE-2023-6779 and CVE-2023-6780), which also relate to glibc's __vsyslog_internal() function. However, according to the researchers, triggering these would be more difficult and effective exploitation would probably be more complex. The researchers published technical details about all three security holes in a separate security note. Patches are probably now available, as a look at the disclosure timeline at the end of the document shows.

        In addition, the Qualys researchers found a memory corruption vulnerability in the glibc sorting function qsort(), which affects all versions of the library released since September 1992, i.e. within the last 32 years - from version 1.04 to the most recent version 2.38. However, for an application to be vulnerable, it must meet certain criteria when calling the qsort() function. The researchers have not yet been able to find any real examples of vulnerable programs. "The glibc security team clarified that the vulnerability arises from applications that use non-transitive comparison functions that are not compliant with the POSIX and ISO C standards," the researchers said. The problem has been fixed by a recently released update - as part of a refactoring of qsort() that was carried out due to an independent discovery."


        Now you have the complete pathway: Firefox 119 CVE ---> glibc CVE ---> LogoFail UEFI/BIOS.


        • #14
          Originally posted by qarium View Post

          Really?...



          AMD is forced to build a 192-core CPU because the cloud-native market already has 192-core ARM CPUs.
          AMD would lose many sales to these 192-core ARM chips.



          The 192-core CPU has nothing to do with the GPU market. Also, the AMD PRO W7900 is not an example of planned obsolescence, because its 48GB of VRAM will be enough for longer than any 24GB-VRAM 4090...
          LOL - the AMD Pro W7900 sucks in Blender even with all that VRAM - I guess you can use it in Stable Diffusion - but, again, Nvidia is mostly recommended.

          Stable Diffusion is seeing more use for professional content creation work. How do NVIDIA RTX and Radeon PRO cards compare in this workflow?

          (have to use SHARK)


          Some ppl even say (in the comments) that AMD 'has the hardware' but not the support... "... a lot of the AMD AI stuff is still a work in progress." - that, in a nutshell, is the sentiment shared by a lot of ppl. Typical AMD - everything they're doing with GPUs is 'a WIP' - like that other poster said, they're concentrating on their processors and increasing core count.



          • #15
            Originally posted by Panix View Post
            LOL - the AMD Pro W7900 sucks in Blender even with all that VRAM - I guess you can use it in Stable Diffusion - but, again, Nvidia is mostly recommended.
            As I already said, both Nvidia and AMD make so much money in their niches that they do not need to compete to make more turnover and more profit.
            You say the W7900 sucks in Blender, and people buy it anyway.


            Originally posted by Panix View Post
            Stable Diffusion is seeing more use for professional content creation work. How do NVIDIA RTX and Radeon PRO cards compare in this workflow?

            (have to use SHARK)

            Some ppl even say (in the comments) that AMD 'has the hardware' but not the support... "... a lot of the AMD AI stuff is still a work in progress." - that, in a nutshell, is the sentiment shared by a lot of ppl. Typical AMD - everything they're doing with GPUs is 'a WIP' - like that other poster said, they're concentrating on their processors and increasing core count.
            "AMD 'has the hardware' but not the support..."

            It is because Nvidia monopolises its support, while for AMD hardware you can choose from many support companies like Collabora or Red Hat...
            (Of course, all these support companies cost money; the support is not free.)

            "they're concentrating on their processors and increasing core count."

            This has nothing to do with GPUs. They are under heavy pressure from the cloud-native CPU companies; others already have 192-core CPUs,
            and soon there will be 256-core ARM CPUs. If AMD does not increase core count, they are out of business; no one waits for them.

            "everything they're doing with gpus is 'a WIP'"

            I think no one said anything different. Nvidia is the only market leader who does not fight an uphill battle.


            • #16
              Originally posted by qarium View Post

              As I already said, both Nvidia and AMD make so much money in their niches that they do not need to compete to make more turnover and more profit.
              You say the W7900 sucks in Blender, and people buy it anyway.




              "AMD 'has the hardware' but not the support..."

              It is because Nvidia monopolises its support, while for AMD hardware you can choose from many support companies like Collabora or Red Hat...
              (Of course, all these support companies cost money; the support is not free.)

              "they're concentrating on their processors and increasing core count."

              This has nothing to do with GPUs. They are under heavy pressure from the cloud-native CPU companies; others already have 192-core CPUs,
              and soon there will be 256-core ARM CPUs. If AMD does not increase core count, they are out of business; no one waits for them.

              "everything they're doing with gpus is 'a WIP'"

              I think no one said anything different. Nvidia is the only market leader who does not fight an uphill battle.
              Bull crap. Who is buying the W7900? Not Blender users. Ppl will buy the Nvidia GPU - and if it's more expensive, they'll probably buy a used workstation card.

              AMD PRETENDS to support the GPUs - but it's all bs - you can find their bs marketing videos on YouTube. But when gullible ppl buy their cards, they complain it's not working in the forums for the software they're using. I posted some of those 'chats' for you to read.

              My point is their priority is on their CPUs and gaming consoles.

              AMD might fight an 'uphill battle' but then they don't try to compete on price. Look at their 7900 XTX - in several regions - North America, Europe, Australia - they haven't done anything on pricing yet. They're content that these don't sell.



              • #17
                Originally posted by Panix View Post
                Bull crap. Who is buying the W7900? Not Blender users. Ppl will buy the Nvidia GPU - and if it's more expensive, they'll probably buy a used workstation card.

                Just speak for yourself; do not speak for the people who really buy a W7900...

                Just because you personally do not understand why people buy a W7900 does not mean they have no reason to justify it to themselves (not to you).

                Originally posted by Panix View Post
                AMD PRETENDS to support the GPUs - but it's all bs - you can find their bs marketing videos on YouTube. But when gullible ppl buy their cards, they complain it's not working in the forums for the software they're using. I posted some of those 'chats' for you to read.
                The marketing videos are for their customers, and since it looks like you are not a customer, the videos are not for you.

                I'll tell you a secret: I do not watch Nvidia marketing videos on YouTube... surprise.

                Originally posted by Panix View Post

                My point is their priority is on their CPUs and gaming consoles.
                Their CPU division is not connected to the discrete dGPU division.

                Also, gaming consoles are handled by the custom-chip division, which is likewise not connected to the discrete dGPU division.

                Sony and Microsoft do not buy these dGPU chips... they buy custom chips.

                Originally posted by Panix View Post

                AMD might fight an 'uphill battle' but then they don't try to compete on price. Look at their 7900 XTX - in several regions - North America, Europe, Australia - they haven't done anything on pricing yet. They're content that these don't sell.
                "AMD might fight an 'uphill battle' but then they don't try to compete on price."

                I explained this to you a million times, and Nvidia does exactly the same right now: if you find a niche where you can sell more chips at a higher price, the result is you do not need to compete on price.

                This is not something AMD-specific; Nvidia does exactly the same, but they have different niches. Nvidia has CUDA, OptiX and so on; AMD has the open-source, Wayland, and high-security sectors that need auditing of open-source code.

                "They're content that these don't sell."

                Wrong; they sell all the chips they can produce.


                • #18
                  Originally posted by qarium View Post

                  Just speak for yourself; do not speak for the people who really buy a W7900...

                  Just because you personally do not understand why people buy a W7900 does not mean they have no reason to justify it to themselves (not to you).



                  The marketing videos are for their customers, and since it looks like you are not a customer, the videos are not for you.

                  I'll tell you a secret: I do not watch Nvidia marketing videos on YouTube... surprise.



                  Their CPU division is not connected to the discrete dGPU division.

                  Also, gaming consoles are handled by the custom-chip division, which is likewise not connected to the discrete dGPU division.

                  Sony and Microsoft do not buy these dGPU chips... they buy custom chips.



                  "AMD might fight an 'uphill battle' but then they don't try to compete on price."

                  I explained this to you a million times, and Nvidia does exactly the same right now: if you find a niche where you can sell more chips at a higher price, the result is you do not need to compete on price.

                  This is not something AMD-specific; Nvidia does exactly the same, but they have different niches. Nvidia has CUDA, OptiX and so on; AMD has the open-source, Wayland, and high-security sectors that need auditing of open-source code.

                  "They're content that these don't sell."

                  Wrong; they sell all the chips they can produce.
                  Whether they're connected or not is irrelevant - what matters is the investment and focus. Are you trying to become a politician?



                  • #19
                    Originally posted by Panix View Post
                    Whether they're connected or not is irrelevant - what matters is the investment and focus. Are you trying to become a politician?
                    Try to?...

                    "what matters is the investment and focus"

                    If you watch the stock-market price of AMD shares, you can see that they perform very well.


                    • #20
                      Originally posted by sophisticles View Post

                      AMD does not want to make any progress; it's not in their best interest.

                      AMD's approach to CPUs is more cores; they will be releasing a 192C/384T TR. AMD does not want to sell you a $4000 video card that's 12x faster than a comparable CPU/motherboard combo, and they can keep selling faster CPUs with more cores every year.

                      Nvidia is kind of forcing AMD to go through the paces of pretending they care about GPU acceleration.
                      It has nothing to do with TR. First of all, that's not even technically possible: since NVIDIA offers professional GPUs, refusing to participate in this market would merely mean leaving the market, not pushing more TR CPUs. Second of all, AMD's market share in this segment is very low, so they tend to invest their resources in more profitable ones, since they are far more resource-limited than Intel or NVIDIA.

                      Your opinion, as almost always, is pure trash.
