Kubuntu Focus M2 Gen4 Announced With Intel Alder Lake, RTX 30 Graphics

  • Nth_man
    replied
    Originally posted by tildearrow:

    > https://lambdalabs.com/deep-learning/laptops/tensorbook
    What? A business/development laptop with just a one-year warranty?!

    Terrible...
    On lambdalabs.com they say "An extended warranty covering hardware failures and accidental damage with rapid repair and replacement", and the Kubuntu Focus site says an extended warranty is available too.

  • tildearrow
    replied
    Originally posted by WannaBeOCer:

    Why would there be? This is their target:

    RDNA is terrible for ML; it's a gaming architecture, and I don't see any mention of gaming on that laptop. Looks like a much better option than the overpriced Tensorbook from Razer.

    Intel i7-11800H (8 cores, 2.30 GHz), 64 GB memory, 2 x 1 TB NVMe SSD. Data Science & Machine Learning Optimized. TensorFlow, PyTorch, Keras pre-installed. Fast shipping.


    When we look at processors, the 12700H outperforms its 6800H competitor. There isn't a V-Cache mobile chip either.
    What? A business/development laptop with just a one-year warranty?!

    Terrible...

  • Random_Jerk
    replied
    I run Nvidia cards with Plasma on an Optimus laptop (TU117, a 1650-equivalent chip) and also on my desktop with a 3080. The only perceived difference I see in animations is between a 60 Hz monitor and a 144 Hz monitor. KDE X11 with Nvidia seems perfectly smooth and a great experience for me in general, and feels butter smooth the higher the refresh rate is. I haven't observed any of these glitches in at least the last 3-4 releases, since KDE 5.21; since then, it has been pretty on the mark. Haven't tried KDE Wayland though.
    Last edited by Random_Jerk; 29 April 2022, 09:54 PM.
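
    For anyone who wants to verify what their session is actually running at, here is a minimal sketch for checking the active refresh rate, assuming an X11 session with xrandr installed (xrandr marks the active mode with an asterisk):

        # Minimal sketch: print the active X11 mode(s) and refresh rate(s).
        # Assumes `xrandr` is available; the active mode is marked with "*".
        import subprocess

        out = subprocess.run(["xrandr", "--current"],
                             capture_output=True, text=True).stdout
        for line in out.splitlines():
            if "*" in line:  # e.g. "   1920x1080 144.00*+  60.00"
                print(line.strip())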

  • sverris
    replied
    The article says "also ptimized". Actually a neat typo, isn't it?

  • polarathene
    replied
    Originally posted by Danny3:
    And again with Nvidia garbage?
    Originally posted by RejectModernity:
    Oh wow. If you want to get the worst linux experience just buy dual-GPU Nvidia laptop with snap-crap.
    Originally posted by arun54321:
    KDE runs bad on NVidia GPUs - slower animations, bugs and glitches.
    Originally posted by darkbasic:
    Are they serious? Do they really expect a Linux user to willingly use Nvidia shit?
    You guys aren't the demographic. When it comes to ML, quite a lot of it depends on CUDA, especially proprietary software. The other gripes don't matter: you can't do your job properly if you need CUDA and lack the hardware to use it.

    Yeah, there are some workarounds available sometimes, but usually that's not an option with third-party software you want to use, especially the proprietary packages (and you're going to use those for the same reason you'd be using CUDA: they're often the best on the market for what your job requires).
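
    To make the CUDA dependency concrete, here is a minimal sketch with PyTorch (one of the frameworks the Tensorbook listing quotes above); the CPU fallback is illustrative only, since many proprietary tools simply refuse to start without a CUDA device:

        # Minimal sketch: a CUDA-dependent workload in PyTorch.
        import torch

        if torch.cuda.is_available():
            device = torch.device("cuda")
            print(f"Using {torch.cuda.get_device_name(0)}")
        else:
            # Illustrative fallback; for training this is often unusably slow.
            device = torch.device("cpu")
            print("No CUDA device found; falling back to CPU")

        x = torch.randn(1024, 1024, device=device)
        y = x @ x  # runs on the GPU only when a CUDA device is present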

    This is just offering a non-Windows option with a DE that users coming from Windows would, I guess, find more comfortable. macOS is a joke in this area, as Apple won't support Nvidia GPUs anymore, not even via eGPU.

    It would be great if CUDA were not as dominant. As ROCm gets easier and more available (last I checked it was lagging behind product releases a fair bit), and as Intel's dGPU lineup gets competitive with oneAPI, maybe we'll see broader adoption. It's great that some projects aren't CUDA-only anymore, but it's taking longer for proprietary apps to broaden support (and even for some open-source ones like AliceVision Meshroom, which isn't competitive with proprietary offerings even with CUDA).
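
    For what it's worth, the ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda API, so a lot of code ports unchanged; a small sketch for checking which backend a given build targets:

        # Minimal sketch: detect the GPU backend of this PyTorch build.
        # ROCm builds report a HIP version and make torch.cuda work on AMD.
        import torch

        if torch.version.hip is not None:
            print(f"ROCm/HIP build: {torch.version.hip}")
        elif torch.version.cuda is not None:
            print(f"CUDA build: {torch.version.cuda}")
        else:
            print("CPU-only build")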

    ---
    I run Plasma + Nvidia btw, and over the years it has gotten much better. I'll be getting AMD next time, as I don't need CUDA myself anymore (different industry now), and I'd like the better compatibility that comes with Mesa over the proprietary drivers: it removes the workarounds for virgl graphics in VMs, apparently brings better Wayland support (although I think that depends more on Plasma being ready before the GPU choice matters), avoids having to work around HW video decode in browsers, and, I assume, means better suspend/resume support.

    I haven't noticed any slow animations or stutters though, and bugs/glitches in daily work aren't a thing anymore these days (with the exception of Kate, where sometimes a small rectangle of an old frame flashes for a few moments; I don't know if that's Nvidia-specific). But if you need GPU compute for work, you often need CUDA.

  • piotrj3
    replied
    Originally posted by lamka02sk:
    That battery is gonna last 3 hours with this CPU. Ooops
    Considering what they aim for (like machine learning), battery life is not the point. The aim is a workstation you can take in a bag, not a device meant to stay on your lap.

    Not to mention that CPUs nowadays have very low idle power usage, so when you do stuff like web browsing it won't consume much energy.
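
    One rough way to check this on a Linux laptop is to watch the battery discharge rate from sysfs; a minimal sketch, assuming a battery at BAT0 that reports power_now in microwatts (paths and units vary by machine):

        # Minimal sketch: print the battery's current power draw.
        # Assumes /sys/class/power_supply/BAT0/power_now in microwatts;
        # some machines expose current_now/voltage_now instead.
        from pathlib import Path

        microwatts = int(Path("/sys/class/power_supply/BAT0/power_now").read_text())
        print(f"Current draw: {microwatts / 1_000_000:.1f} W")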

  • Nth_man
    replied
    Originally posted by WannaBeOCer:

    I've been running Plasma on my Titan RTX for the past two years without any issues. Maybe if I used Wayland I'd run into those problems. Solus/openSUSE Tumbleweed with Plasma have been fantastic with an Nvidia GPU.
    The previous Kubuntu Focus went to the top:
    [Phoronix review of the previous Kubuntu Focus]

  • lamka02sk
    replied
    That battery is gonna last 3 hours with this CPU. Ooops

  • darkbasic
    replied
    Are they serious? Do they really expect a Linux user to willingly use Nvidia shit?

  • Mitch
    replied
    Originally posted by tildearrow:
    Aaaaand of course, no AMD option.
    Everything on the spec sheet is just right except this one thing. It's weird to live in a world where Intel is so power-hungry and AMD is relatively cool and efficient.
