Intel UHD Graphics 630 "Coffee Lake" On Linux


  • #21
    Originally posted by schmidtbag View Post
    It's not just Intel GPUs, but AMD and Nvidia chips too. For example, I have a socket AM1 Athlon with HD 8400 graphics. The platform as a whole is completely useless for gaming. That's fine - that isn't why I bought it - but there are so few benchmarks out there for video transcoding, and the ones that do exist were run once and never repeated after further driver updates. I've considered getting a 4K display, but I held off because I wasn't totally sure the GPU could handle 4K playback (at the very least, I know for a fact it can't do 4K@60FPS, because the HDMI port doesn't support 4K@60Hz displays).
    I'm with you; it's been driving me crazy for years. Video cards have been capable of encoding video since back in the 9700 All-in-Wonder days, when the Cobra engine could offload about 10% of MPEG-2 encoding from the CPU to the GPU and do IVTC in hardware.

    Nvidia has had dedicated NVENC encoding hardware for years, and I have used it with Pascal and with Maxwell; it works just fine, even on Linux via ffmpeg. Intel has had Quick Sync since Sandy Bridge, and VCE has existed since the ATI 9600 Pro days (I remember using it back then and being impressed with what it offered at the time).
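
    For what it's worth, here is a minimal sketch of that kind of NVENC test on Linux - just ffmpeg driven from a small Python wrapper. The file names, preset and bitrate are placeholders, and it assumes an ffmpeg build with NVENC support plus the proprietary NVIDIA driver:
    Code:
    # Minimal sketch: offload H.264 encoding to the GPU's NVENC block via ffmpeg.
    # Assumes ffmpeg was built with NVENC support and the NVIDIA driver is
    # installed; "input.mkv", the preset and the bitrate are placeholders.
    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "input.mkv",      # source clip to transcode
        "-c:v", "h264_nvenc",   # hardware H.264 encoder instead of libx264
        "-preset", "slow",      # NVENC speed/quality trade-off
        "-b:v", "8M",           # target video bitrate
        "-c:a", "copy",         # pass the audio stream through untouched
        "output_nvenc.mkv",
    ]
    subprocess.run(cmd, check=True)

    ffmpeg prints the fps/speed figures at the end of the run, which is the number these encoder tests usually boil down to.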

    I guarantee you that when the Ryzen APUs come out with integrated Vega, we'll be lucky to see one site do a half-assed test of its hardware encoder, and it will be on Windows via CyberLink's software.

    If AMD got their heads out of their collective asses and released a 6C/12T APU with Vega graphics, I would definitely buy one; if not, I think I'll pick up an i5-8400 and do a proper encoding test myself.



    • #22
      Maybe I'm getting something wrong - but Intel GPUs are not for gamers (except those wanting to play old-school games, which is fine).
      DisplayPort is the professional interface [DP 1.2 has reached 4K@60Hz since Dec. 2009, DP 1.4 even 8K@60Hz with HDR and deep color since March 2016] -
      HDMI is just a toy interface [HDMI 1.4a: 4K@30Hz, dating March 2010; HDMI 2.1: 8K@60Hz, dating Q2/2017 and not yet in use] -
      and concerning plugs and cables, HDMI really causes lots of problems while DP just works (at least for me).
      I have been using a 4K display over DP since 03/2015 on a Haswell i7-4770T bought in 11/2013 - entirely under Linux.
      It works without any problems - and I would really like to get resolutions higher than 4K on the desktop - 5K or even 8K.
      And the system is absolutely silent without any special measures (no water cooling etc.) ... and cool (CPU 27°C, HDD 36°C).
      From my point of view, this is the use case Intel GPUs are perfectly suited for - they are not competing with current 3D cards for gamers.
      So I am not amused that the only change available to me in _4 YEARS_ is using three 4K displays - which would not
      be interesting with 28" or even larger screens.
      Until now no 5K or 8K resolution - and no real progress since Broadwell concerning GPU architecture -
      or have I missed something?
      On the OpenGL side, Haswell may even be capable of reaching OpenGL 4.6 once the drivers are ready.
      So the CPUs that came after may be a little bit faster - but considering the years gone by, Intel has really stayed at that level.
      And the GPU is not the only field where Intel pretends to have made progress while actually stagnating.
      And more seriously, it is not even the only time Linux support for new HW has been lacking at best (like Skylake desktop - not to speak of Skylake laptops; are those heat problems completely solved? Maybe I missed something in that respect ...).
      I do hope Raven Ridge may change things - as a new target for a silent desktop system to get work done, and as a push for Intel development to speed up and not lose even more ground.
      Current CPUs are not worth buying for technical people (i.e. those without interest in saying they use the latest and greatest ... there is a company for that already ...).
      Even a Sandy Bridge system from the end of 2011 is pretty close to what is delivered today - the biggest difference is a remarkable USB speed-up, while no CPU or GPU speed-up is worth mentioning outside of benchmarks (e.g. Haswell at 3840×2160@60Hz and Sandy Bridge at 2560x1440@60Hz feel pretty similar - formerly advertised as being factors apart).
      And ECC RAM is still not an Intel feature for desktop CPUs - right?
      But maybe my use case is special ...



      • #23
        Originally posted by JMB9 View Post
        Current CPUs are not worth buying for technical people (i.e. those without interest in saying they use the latest and greatest ... there is a company for that already ...).
        Even a Sandy Bridge system from the end of 2011 is pretty close to what is delivered today - the biggest difference is a remarkable USB speed-up, while no CPU or GPU speed-up is worth mentioning outside of benchmarks (e.g. Haswell at 3840×2160@60Hz and Sandy Bridge at 2560x1440@60Hz feel pretty similar - formerly advertised as being factors apart).
        And ECC RAM is still not an Intel feature for desktop CPUs - right?
        But maybe my use case is special ...
        I mostly agree. There are nice things that matter, but they aren't enormous. There are a couple you missed, though: Haswell made big improvements in power consumption for notebooks, and AES-NI is something I want. I thought I wanted (and got) transactional memory with Haswell, but a firmware update took it away, and I don't see anything using it anyway.
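
        A hedged little sketch of how to check that on a running system, for anyone curious: Linux exposes AES-NI and TSX as CPU flags ("aes", and "hle"/"rtm") in /proc/cpuinfo, and the TSX flags are what vanish after the firmware/microcode update:
        Code:
        # Check whether the running CPU still exposes AES-NI and TSX to Linux.
        # "aes" is the AES-NI flag; "hle"/"rtm" are the TSX flags that disappear
        # when a microcode/firmware update disables transactional memory.
        def cpu_flags():
            with open("/proc/cpuinfo") as f:
                for line in f:
                    if line.startswith("flags"):
                        return set(line.split(":", 1)[1].split())
            return set()

        flags = cpu_flags()
        print("AES-NI:", "yes" if "aes" in flags else "no")
        print("TSX   :", "yes" if {"hle", "rtm"} & flags else "no")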

        Cheap UltraHD TVs cost less than UHD computer monitors, with (at least) two downsides: they are HDMI-only, and they use 4:2:2 or worse chroma sub-sampling (the specs are usually silent on this). I have one on my desk from the days when HDMI topped out at 30Hz for UHD. I don't think Haswell GPUs will drive it. I'm using a GTX 650 with the nVidia driver, since Nouveau seems to misbehave. It surprises me that the 30Hz refresh and the chroma sub-sampling don't bother me.
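
        In case it helps anyone with a similar TV, here is a rough sketch (assuming an X session and the xrandr tool) of listing which modes and refresh rates an output actually advertises - i.e. whether 3840x2160 is only offered at 30Hz over HDMI:
        Code:
        # Rough sketch: list connected outputs and the modes/refresh rates they
        # advertise under X, e.g. to see whether 2160p is limited to 30Hz on HDMI.
        # Assumes an X session and that the xrandr utility is installed.
        import subprocess

        out = subprocess.run(["xrandr", "--query"], capture_output=True,
                             text=True, check=True)
        for line in out.stdout.splitlines():
            if " connected" in line:
                print(line.split()[0], "-> connected")  # output name, e.g. HDMI-1
            elif line.startswith("   "):                # mode lines are indented
                print("  ", line.strip())               # e.g. "3840x2160 30.00*+"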



        • #24
          Originally posted by Spooktra View Post
          It amazes me that almost every single site that tests a video card completely ignores the "video" part and instead tests gaming. The benefit of Intel's iGPUs is in their decode and encode capabilities: they support hardware encoding of MPEG-2, H.264, HEVC, HEVC 10-bit, VP8, VP9 (<-- since Kaby Lake), AAC and MJPEG, as well as BT.2020 and HDR decode and encode, and VP9 10-bit and 12-bit decoding.

          These are potent chips for encoding video into delivery and archival formats, yet everyone ignores it.

          On a side note, it's too bad that Intel didn't add a relatively big L4 cache like the Iris Pro graphics has; with just 128MB of cache the difference in some games is substantial. Imagine if Intel were to add a 1GB L4.
          Have you managed to decode all of these successfully on Ubuntu or some other Linux flavour with the UHD 630? I'm wondering whether I can get away without a discrete GPU on my new laptop, which will have an i7-8750H, since the only thing I use the GPU for is watching movies.
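
          Not an authoritative answer, but a sketch of how I'd check before relying on it: list the profiles the VA-API driver exposes on the UHD 630, then do a decode-only run through ffmpeg. It assumes vainfo, an ffmpeg built with VAAPI support and the Intel VA-API driver are installed; the sample file name is a placeholder:
          Code:
          # Sketch: check which codecs the Intel iGPU can hardware-decode via
          # VA-API, then do a decode-only run through ffmpeg's VAAPI hwaccel.
          # Assumes vainfo and an ffmpeg with VAAPI support are installed;
          # the sample file name is just a placeholder.
          import subprocess

          # 1) List decode (VAEntrypointVLD) and encode (VAEntrypointEncSlice)
          #    profiles the driver exposes, e.g. VAProfileHEVCMain10 or
          #    VAProfileVP9Profile2.
          subprocess.run(["vainfo"], check=True)

          # 2) Decode a clip entirely on the GPU and discard the frames (-f null),
          #    a quick way to see whether playback would be hardware-assisted.
          subprocess.run([
              "ffmpeg", "-hwaccel", "vaapi",
              "-hwaccel_device", "/dev/dri/renderD128",
              "-i", "sample_hevc_10bit.mkv",
              "-f", "null", "-",
          ], check=True)

          For actual movie watching, mpv with hwdec=vaapi uses the same decode path, so if a clip decodes cleanly here it should play back fine without a discrete GPU.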
