The GeForce RTX 2080 Ti Arrives For Linux Benchmarking

  • OMTDesign
    replied
    That's very cool to hear that you got an RTX 2080 Ti.

    I wonder how well these cards perform in AI and machine learning applications. With the addition of all those extra RT and Tensor cores, I am hoping to see Titan V-level performance.
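
    A rough way to probe that, as a minimal sketch rather than a proper benchmark: time a large half-precision matrix multiply, which is exactly the kind of workload the Tensor cores accelerate. This assumes PyTorch built with CUDA support; the matrix size and iteration count are arbitrary placeholders.

    # Hedged sketch: rough FP16 matmul throughput probe, the kind of math
    # the Tensor cores accelerate. Assumes PyTorch built with CUDA support;
    # the matrix size and iteration count are arbitrary placeholders.
    import time
    import torch

    n = 8192
    a = torch.randn(n, n, dtype=torch.float16, device="cuda")
    b = torch.randn(n, n, dtype=torch.float16, device="cuda")

    # Warm up so context creation and kernel selection stay out of the timing.
    for _ in range(3):
        torch.matmul(a, b)
    torch.cuda.synchronize()

    iters = 20
    start = time.perf_counter()
    for _ in range(iters):
        torch.matmul(a, b)
    torch.cuda.synchronize()
    elapsed = time.perf_counter() - start

    # One n x n matmul is roughly 2 * n**3 floating-point operations.
    tflops = 2 * n**3 * iters / elapsed / 1e12
    print(f"~{tflops:.1f} TFLOPS (FP16 matmul)")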

  • dimko
    replied
    Originally posted by Xaero_Vincent
    $1200 was the entire cost of my gaming desktop I built years ago. Miners have sure driven the cost of GPUs up.
    I can still put together a functioning computer for half that price or so, and a good one, including a screen, for $1200.

  • pegasus
    replied
    Originally posted by Michael
    Have any particular inference benchmarks in mind? (Ones that are nice to build and script well.)
    Unfortunately, the only "nice to build" things in this department are provided as Docker images. Building these frameworks manually is an exercise in patience. EasyBuild helps with some, but not yet all, of them.

  • Xaero_Vincent
    replied
    $1200 was the entire cost of my gaming desktop I built years ago. Miners have sure driven the cost of GPUs up.

  • miabrahams
    replied
    Originally posted by pegasus
    If you ignore miners, the largest audience for these cards is the AI crowd. They can afford to pay far more for them than gamers can, but it's still cheaper and more cost-effective for them to go for gaming cards instead of server-class Teslas.

    So if you can do some inference benchmarks, that would be great. The Tesla T4 is supposedly very good at inference at only 75 W, and if this 2080 Ti shows the same numbers, gamers will whine again about availability.
    This vastly underestimates the size of the $100 billion gaming market. Who knows whether things like RT cores are even useful for AI at all? We could well see a split, with server/workstation cards dropping RT hardware to maximize tensor processing performance while consumer cards accrue more and more gaming-specialized features.

  • miabrahams
    replied
    Very interested in the rendering benchmarks (Blender, OctaneBench).

  • tildearrow
    replied
    Michael, does this mean you are canceling your 2080 pre-order?

  • Michael
    replied
    Originally posted by pegasus
    If you ignore miners, the largest audience for these cards is the AI crowd. They can afford to pay far more for them than gamers can, but it's still cheaper and more cost-effective for them to go for gaming cards instead of server-class Teslas.

    So if you can do some inference benchmarks, that would be great. The Tesla T4 is supposedly very good at inference at only 75 W, and if this 2080 Ti shows the same numbers, gamers will whine again about availability.
    Have any particular inference benchmarks in mind? (Ones that are nice to build and script well.) I have had a few test profiles, but they have tended to be messy with loads of dependencies, so I am continuously looking for new tests in that area.
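
    One candidate, as a minimal sketch rather than a polished test profile: time batched forward passes of a stock torchvision model on random input, which builds from just two pip packages and scripts cleanly. The ResNet-50 model and batch size below are placeholder choices, not anything the thread settled on.

    # Hedged sketch: minimal image-classification inference benchmark.
    # Assumes PyTorch and torchvision; ResNet-50 (random weights, which is
    # fine for throughput timing) and the batch size are placeholder choices.
    import time
    import torch
    import torchvision.models as models

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = models.resnet50().eval().to(device)
    batch = torch.randn(64, 3, 224, 224, device=device)

    with torch.no_grad():
        # Warm-up passes keep one-time setup costs out of the timing.
        for _ in range(5):
            model(batch)
        if device == "cuda":
            torch.cuda.synchronize()

        iters = 50
        start = time.perf_counter()
        for _ in range(iters):
            model(batch)
        if device == "cuda":
            torch.cuda.synchronize()
        elapsed = time.perf_counter() - start

    print(f"{iters * batch.shape[0] / elapsed:.1f} images/sec")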

  • pegasus
    replied
    If you ignore miners, the largest audience for these cards is the AI crowd. They can afford to pay far more for them than gamers can, but it's still cheaper and more cost-effective for them to go for gaming cards instead of server-class Teslas.

    So if you can do some inference benchmarks, that would be great. The Tesla T4 is supposedly very good at inference at only 75 W, and if this 2080 Ti shows the same numbers, gamers will whine again about availability.

  • Brisse
    replied
    Apparently they are designed similarly to something Apple would do. In short: they are not made to be taken apart.
