Intel Linux Graphics Driver Preparing NN Integer Mode Scaling

  • DavidC1
    replied
    Originally posted by TheLexMachine View Post
    Their current GPU design is based on an architecture that is about a decade old and was never intended for long-term use or gaming; it was designed for media consumption and Internet use. It will (apparently) be discarded with the soon-to-be-released GPU developed with the engineers Intel poached from AMD and Nvidia over the past six years, which is more in line with current GPUs from those companies.
    What? It's not a decade old. Gen 9 was introduced with Skylake in late 2015.

    Xe doesn't "discard" it either. It'll be much more efficient and faster, but the fundamentals will stay; Xe is also known as Gen 12. The engineers Intel poached will certainly help, but their work will show in low-level details most people can't understand. At a high level it's essentially the same. Tiger Lake is revealed to have 96 EUs.
    Last edited by DavidC1; 04 September 2019, 03:54 AM.

  • tildearrow
    replied
    Originally posted by bearoso View Post

    Emulators have had this feature for a long time, yes. There are also shaders that can scale to non-integer ratios and still maintain nearly the same sharpness without irregular pixel-size aliasing (https://github.com/libretro/slang-sh.../interpolation). The AANN and bandlimiting ones work quite well.
    No, not emulators. I'm talking about hardware scaling.
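    For context, nearest-neighbour (NN) integer scaling just replicates each source pixel into an N x N block, with no blending, which is why pixel art stays sharp. A minimal software sketch of the idea (Intel's version runs in the display hardware, so this is only an illustration):

```python
def nn_integer_scale(pixels, width, height, factor):
    """Nearest-neighbour integer upscale: replicate each source pixel
    into a factor x factor block, so no blending ever occurs."""
    out = []
    for y in range(height * factor):
        row = []
        for x in range(width * factor):
            # Integer division maps each output pixel back to exactly
            # one source pixel; no interpolation between neighbours.
            row.append(pixels[(y // factor) * width + (x // factor)])
        out.append(row)
    return out

# A 2x2 image scaled 2x: each pixel becomes a 2x2 block.
src = ['A', 'B',
       'C', 'D']
scaled = nn_integer_scale(src, 2, 2, 2)
# scaled[0] is ['A', 'A', 'B', 'B']; scaled[3] is ['C', 'C', 'D', 'D']
```

    Because every output pixel is an exact copy of a source pixel, no new colours are introduced and edges stay crisp.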

  • linuxgeex
    replied
    Kudos to Intel for respecting that users might have a preference for the display scaling filter. I had that issue with my 22" portrait monitor next to my 50" TV... to get equally sharp text scaled up, I couldn't use `xrandr --scale` because it was blurry as hell, so I had to write a modeline that doubled the pixels instead, and that is outside the wheelhouse of the average user.
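    For anyone curious, this is the general shape of that workaround (the output name and resolution are placeholders; the exact modeline depends on the panel, so treat this as a sketch, not a tested recipe):

```shell
# The blurry route: xrandr's transform path applies a bilinear filter.
# (DP-1 is a placeholder output name.)
xrandr --output DP-1 --scale 2x2

# The sharp route: add a custom mode by hand. cvt prints CVT timings
# for a given resolution and refresh rate, which you then register
# and select for the output:
cvt 1920 1080 60
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode DP-1 "1920x1080_60.00"
xrandr --output DP-1 --mode "1920x1080_60.00"
```

    A pixel-doubled modeline like the one described above follows the same `--newmode`/`--addmode` steps, just with timings chosen so each logical pixel covers a 2x2 block of physical pixels.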

    OTOH, if you have a device with a 4K display and the included GPU can't afford the cycles to upscale a low-res image to 3x or higher integer factors with an OpenGL nearest-neighbour shader, or if doing so uses a significant amount of power, then you have a pretty spectacularly broken GPU design lol.

    My personal feeling is that this should be something implemented within DRM so that this feature (configurable scaling filters) extends to all display hardware, not just Intel, and then of course XOrg / Wayland etc need to expose that feature via xrandr etc.
    Last edited by linuxgeex; 04 September 2019, 12:12 AM.

  • Guest
    Guest replied
    torsionbar28 - Well, so far Intel has never really cared about competing in high-performance graphics and compute. Their iGPUs are weak on purpose; they mainly care about the ability to run desktop GUI interfaces for the next several years.

    I'm not sure if you've heard, but there are Iris Plus CPUs with iGPUs that are much higher performance than their normal ones. Intel is also working on actual dGPUs and on better GPU performance.

    They're already working on the thing that you want them to do.

  • bearoso
    replied
    Originally posted by tildearrow View Post
    ...what? The SNES already did this first?

    Also, this will not work well for non-pixel art...
    Emulators have had this feature for a long time, yes. There are also shaders that can scale to non-integer ratios and still maintain nearly the same sharpness without irregular pixel-size aliasing (https://github.com/libretro/slang-sh.../interpolation). The AANN and bandlimiting ones work quite well.

  • TheLexMachine
    replied
    Originally posted by torsionbar28 View Post
    Man, Intel really needs to step up their game in iGPUs. Seems like they've been stagnant in this area for years now. The new Ryzen 3200G and 3400G have GPU performance that rivals discrete GPUs from just a few years ago. The 3400G in particular looks on par with an RX 460, pretty darn impressive.
    Their current GPU design is based on an architecture that is about a decade old and was never intended for long-term use or gaming; it was designed for media consumption and Internet use. It will (apparently) be discarded with the soon-to-be-released GPU developed with the engineers Intel poached from AMD and Nvidia over the past six years, which is more in line with current GPUs from those companies.

  • torsionbar28
    replied
    Man, Intel really needs to step up their game in iGPUs. Seems like they've been stagnant in this area for years now. The new Ryzen 3200G and 3400G have GPU performance that rivals discrete GPUs from just a few years ago. The 3400G in particular looks on par with an RX 460, pretty darn impressive.

  • timofonic
    replied
    Can this be useful for ScummVM? It makes it possible to run point-and-click adventure games, RPGs, Interactive Fiction... on more modern systems (Windows, Linux, Mac, Haiku, PS3, Wii, Nintendo Switch, Android, iOS, PSP, PS Vita, OS/2, BSD, webOS, Dreamcast, Pandora, etc.).
    Last edited by timofonic; 03 September 2019, 08:59 PM.

  • AsuMagic
    replied
    Hey, Nvidia did something similar on Windows. It only works on the 2xxx series.

    I wish I was kidding.

  • tildearrow
    replied
    ...what? The SNES already did this first?

    Also, this will not work well for non-pixel art...
    Last edited by tildearrow; 03 September 2019, 04:07 PM.
