
AMD Ryzen 9 5900X + Ryzen 9 5950X Dominate On Linux


  • #51
    To answer birdie, there is a 5600 coming out to match the 3600. It's better to compare the 3600X to the 5600X price-wise.



    • #52
      Originally posted by f0rmat View Post

      Was the first computer that you programmed on a TRS-80 (commonly called a Trash 80)? I learned BASIC on that.

      And I think you mean 200MHz 286 CPU in 1999, not 1989. In 1989, The Sun SPARC II Workstation had like a 25 or 33 MHz processor. Then I was working primarily on that and a VAX/VMS 9000 with FORTRAN IV.
      Oh, thanks for catching that, f0rmat, I actually meant 20 MHz! Wow, it's just amazing to recall. I even remember having a "Turbo" button I could never use, which I believe was supposed to overclock the CPU by 2 MHz.

      I never had a TRS-80, but I was lucky enough to work for a small company named Aracor that had a primitive mainframe I was allowed to use in off hours. I was even allowed to connect via a 300 baud modem, which really blew my mind. I developed my first BASIC program on it, an AI program named "Carl" that interpreted English to allow recalling and moving objects in an imaginary house.

      I later used what I'd learned to create an 8085 firmware assembly program that allowed a laser/X-ray system to be controlled with English phrases instead of cryptic commands. However, my boss was angry because it required an extra 4K EPROM, and the government wouldn't allow it (Aracor primarily existed on government contracts from DARPA, etc.) because they were afraid of ambiguous commands being executed. Fantastic bygone times.



      • #53
        Originally posted by muncrief View Post
        Oh my goodness f0rmat, we are from the same generation! And I feel you about lack of funds. I had to purchase a 200 MHz 286 CPU in 1989 for a critical project, which fortunately ended up ensuring my future career as an R&D engineer/embedded systems designer, because I couldn't afford a 386 or the newly introduced 486. I would literally sleep at the foot of my computer for 3 to 5 days at a time trying to lay out a Xilinx 3000 FPGA (revolutionary for the time), because they used "simulated annealing" for routing, and it almost always failed. So every hour counted, and when it failed I'd have to immediately get up, adjust the floor plan, and try it again.
        LOL - similar experience here, although using 2000 series FPGAs with a 386.

        Ended up going back to hand placement & routing, although since I was working on graphics cards (for Macs back then) there was a lot of logic that could be reused between layouts - counters, address muxes, memory controllers etc...
        Test signature



        • #54
          Originally posted by vladpetric View Post
          All we need now is inventory
          I was curious about this, so I just checked my local vendor. All stores have 10+ units. Let's see how long they can hold out.



          • #55
            Originally posted by piotrj3 View Post

            Quoting Nvidia, the 3090 IS FOR EDGE CASES. They literally called it the ultimate content creator card. Even in Nvidia's presentation the 3080 is the flagship model, not the 3090.
            Very few content creators NEED 24 GB, and when that is the case, the boss is the one paying for the computer.



            • #56
              Originally posted by muncrief View Post

              Oh, thanks for catching that, f0rmat, I actually meant 20 MHz! Wow, it's just amazing to recall. I even remember having a "Turbo" button I could never use, which I believe was supposed to overclock the CPU by 2 MHz.

              I never had a TRS-80, but I was lucky enough to work for a small company named Aracor that had a primitive mainframe I was allowed to use in off hours. I was even allowed to connect via a 300 baud modem, which really blew my mind. I developed my first BASIC program on it, an AI program named "Carl" that interpreted English to allow recalling and moving objects in an imaginary house.

              I later used what I'd learned to create an 8085 firmware assembly program that allowed a laser/X-ray system to be controlled with English phrases instead of cryptic commands. However, my boss was angry because it required an extra 4K EPROM, and the government wouldn't allow it (Aracor primarily existed on government contracts from DARPA, etc.) because they were afraid of ambiguous commands being executed. Fantastic bygone times.
              DARPA...sh*t. Another term from the past. My school had the Radio Shack TRS-80. I remember in graduate school having a VT-100 terminal with a 300 baud modem. A year later I was upgraded to a VT-220, and then they gave me a 9600 baud modem. Man, I had the fastest ethernet connection EVER.
              GOD is REAL unless declared as an INTEGER.



              • #57
                Something people need to keep in mind is that if AMD had cut the price of these new CPUs in half, the scalpers would have bought them all up and you would be searching eBay for them at insane prices. While I am no fan of price increases, squeezing the scalpers' margins makes it less likely that something like the Nvidia 3000 series launch problems happens. I expect these to drift down a bit after the initial surge passes.



                • #58
                  Originally posted by bridgman View Post

                  LOL - similar experience here, although using 2000 series FPGAs with a 386.

                  Ended up going back to hand placement & routing, although since I was working on graphics cards (for Macs back then) there was a lot of logic that could be reused between layouts - counters, address muxes, memory controllers etc...
                  Aha! So many kindred spirits here today, bridgman.

                  Luckily I got away with tweaking and locking routes, but man, there was a time when I feared it just wasn't going to work.

                  The FPGA was used in conjunction with a Z80 on a digital checkbook I'd invented, so it had an MMU, charging control circuitry, and a lot of other miscellaneous logic to control the paper feeding mechanism, printer, modem, etc. I also had to create circuitry to adjust the CPU clock between 2/4/8 MHz dependent upon the CPU load. I actually ended up creating it successfully by the end of 1990, and it printed out checks in 30 seconds or less and ran for ten hours. If I remember correctly, I benchmarked run time when printing ten checks.

                  I also created an interactive OS in assembly language that I named MIOS (Muncrief Interactive Operating System), because the other embedded OSs of the time were too large and inefficient. I had to develop new ways of creating lists, responding to end user errors, and addressing a plethora of other problems introduced when consumers utilize such a complex system. It was a hell of a lot of fun though, and even though my company ultimately failed, it allowed me to become a highly paid consultant and really not interview much ever again. All I had to do was whip out the checkbook, enter a faux purchase, print out a check, and I was hired!
                  Last edited by muncrief; 05 November 2020, 05:00 PM.



                  • #59
                    Originally posted by bridgman View Post

                    LOL - similar experience here, although using 2000 series FPGAs with a 386.

                    Ended up going back to hand placement & routing, although since I was working on graphics cards (for Macs back then) there was a lot of logic that could be reused between layouts - counters, address muxes, memory controllers etc...
                    Hell...Macs. My first Apple I worked on was an Apple IIe. I remember first using a Macintosh at Vanderbilt University in 1987 while studying solid state physics. I was a physicist back then, and I "programmed" as opposed to "developed." Words and computers change. Writing code for the "counters" was always tedious, but necessary.
                    GOD is REAL unless declared as an INTEGER.



                    • #60
                      holy shit AMD delivered

