I found this: http://www.geekologie.com/2011/05/da...puter-algo.php
It is also discussed here: http://techreport.com/discussions.x/21017
Maybe this technique could be used at a low level (X and/or Wayland), too? What do you think? Would the performance hit be too great? Would anyone like to use it? Would there be too many errors to be useful?
Quoting the Geekologie article:

> Two researchers are developing an algorithm designed specifically to 'de-pixelize' 8-bit (and 16-bit) video game graphics in real-time into smoother, more flowing ones. This. changes. everything. No, no it doesn't -- but it does change the amount of time I'll spend playing NES games on an emulator while I'm supposed to be working.
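For context on what this class of filters does: the researchers' algorithm is more sophisticated (it traces vector outlines), but a much simpler, long-established pixel-art scaler in the same family is Scale2x (also called EPX), which is cheap enough that emulators already run it in real time. Below is a minimal sketch of Scale2x in Python, purely to illustrate the idea; it is not the algorithm from the article, and the function name and plain-list image representation are my own choices.

```python
def scale2x(img):
    """Scale2x (EPX): double a pixel grid while smoothing 'staircase' edges.

    img is a list of rows of hashable pixel values (palette indices,
    RGB tuples, etc.). Returns a new grid twice as wide and tall.
    """
    h, w = len(img), len(img[0])
    out = [[None] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        for x in range(w):
            e = img[y][x]
            # Neighbour lookups, clamped to the centre pixel at the borders.
            b = img[y - 1][x] if y > 0 else e       # above
            d = img[y][x - 1] if x > 0 else e       # left
            f = img[y][x + 1] if x < w - 1 else e   # right
            hh = img[y + 1][x] if y < h - 1 else e  # below
            # Default: plain nearest-neighbour duplication.
            e0 = e1 = e2 = e3 = e
            # On a diagonal edge, pull in the matching neighbour colour.
            if b != hh and d != f:
                e0 = d if d == b else e
                e1 = f if b == f else e
                e2 = d if d == hh else e
                e3 = f if hh == f else e
            out[2 * y][2 * x] = e0
            out[2 * y][2 * x + 1] = e1
            out[2 * y + 1][2 * x] = e2
            out[2 * y + 1][2 * x + 1] = e3
    return out
```

Each input pixel expands to a 2x2 block; flat regions are copied unchanged, while diagonal colour boundaries get "connected" instead of staying blocky. That per-pixel, neighbours-only structure is why filters like this are cheap enough to consider at the compositor level, whereas the vectorizing approach in the article would cost considerably more.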