VESA Releases DisplayPort 1.3, Pushes 32.4 Gbits/sec
-
Originally posted by BSDude: Let me guess? Brood War?
Battlefield 2
Battlefield 2142
Battlefield : Bad Company 2
Call of Duty 2
Call of Duty 4 Modern Warfare
Call of Duty : World at War
Call of Duty : Modern Warfare 2
Call of Duty : Modern Warfare 3
Call of Duty : Black Ops
World in Conflict : Soviet Assault
Company of Heroes
Team Fortress 2
Left 4 Dead 2
etc, etc.
When the right moment arrives, I will buy:
Counter Strike : Global Offensive
...and the WARGAME trilogy.
I also play a lot of mods for the above games (the ones that have mods, of course), and I mod the base games and the mods myself... just for fun.
-
Originally posted by Szzz: 24-bit (8 bits per channel) color is not sufficient; it leads to banding artifacts everywhere, which is awful. We need 10-bit and 12-bit color (30 and 36 bits per pixel) for true colors.
The problem is that many times they are using 12- or 16-bit, which is UTTER C**P.
Even in Window$ drivers, after a fresh install of the OS and the first video driver install for your card, go check your settings (no matter whether it's NVIDIA or AMD)... YEAH, there is a setting that by DEFAULT LIMITS the output range to much less than full 24-bit.
I've made a habit of changing that to full range right away.
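The "limited range" setting mentioned above refers to the video-style 16-235 range that some drivers apply by default over HDMI. A minimal sketch (the function name is mine, not any driver API) of how that compression throws away code values:

```python
# Hypothetical sketch: how "limited" video range compresses an 8-bit
# channel from full range (0-255) down to 16-235.

def full_to_limited(v):
    """Map a full-range 8-bit value (0-255) to limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

# Full range has 256 steps per channel; limited range keeps only 220,
# so distinct full-range values collapse into the same output level.
outputs = {full_to_limited(v) for v in range(256)}
print(len(outputs))  # 220 distinct levels instead of 256
```

Fewer distinct levels per channel means coarser gradients, which is exactly the banding complaint in this thread.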
-
Originally posted by amehaye: I wonder when programmer-friendly 4K monitors will start to show up.
Something with 16:10 rather than 16:9 aspect ratio, i.e. 3840x2400@60Hz would be pretty nice. Make it somewhere between 24"-27" and it will be a winner.
-
Originally posted by AJSB: FALSE. If games, MOVIES, etc. fully used 24-bit, you wouldn't see artifacts; 24-bit is 16.7 million colors, more than the human eye can differentiate.
...
-
Originally posted by log0: Those 16.7 million colors are not optimized for human perception. Our eyes are nonlinear devices and very good at noticing brightness changes. With 24-bit you only have 256 shades per channel, which means you will see banding along smooth gradients, period. If you don't, you should consider consulting an eye specialist.
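The nonlinearity point can be made concrete with a little arithmetic. A minimal sketch, assuming a simple gamma-2.2 display model (an assumption, not anything from the thread): the relative luminance jump between adjacent 8-bit code values is huge near black and tiny near white, and our eyes notice relative changes, which is why banding shows up in dark gradients first.

```python
# Sketch of why 256 shades band: under a gamma-2.2 model (assumed here),
# adjacent code values near black differ far more, relatively, than
# adjacent code values near white.

GAMMA = 2.2

def luminance(code):
    """Relative luminance of an 8-bit code value, simple gamma model."""
    return (code / 255) ** GAMMA

dark_step = luminance(2) / luminance(1)        # ratio between codes 1 and 2
bright_step = luminance(255) / luminance(254)  # ratio between 254 and 255

print(f"dark step ratio:   {dark_step:.2f}")   # ~4.59x luminance jump
print(f"bright step ratio: {bright_step:.4f}") # ~1.0087x, far subtler
```

With 10-bit (1024 shades) the steps shrink by a factor of four, which is the argument for deeper color the quoted post is making.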
-
Originally posted by TheLexMachine: They will likely be over 30 inches and use the new ultra-wide 21:9 aspect ratio. 16:10 4096 x 2180 monitors exist in the media-professional realm, but they are far out of your price range and not made for "regular use".
-
Originally posted by karasu: Oh please, don't be rude... All JPEGs around the web are 24-bit, and no, I don't see any banding. Even if I generate a full-screen PNG with a black-to-white gradient, I can't see any banding (I can, however, see bands on gradients on my shitty smartphone screen, and maybe on some cheap desktop LCDs). So where are images generated with more than 24 bits, outside of niche cases like RAW photos and medical applications? Video games, maybe?
As he pointed out, our eyes are not linear devices. Read: you may not notice the difference between 0x00fe0010 and 0x00ff0011 (different shades of red with a touch of blue [pink]), but you'll certainly and easily tell the difference between 0x00ffffff and 0x00fefefe (different shades of white).
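To make the hex comparison above concrete, here is a small sketch (the helper name is mine) that unpacks both color pairs channel-wise. The per-channel deltas are the same size in both cases; only perception differs, which is the poster's point.

```python
# Unpack the two 0x00RRGGBB color pairs from the post and compare
# their per-channel differences.

def channels(rgb):
    """Split a 0x00RRGGBB integer into (r, g, b) bytes."""
    return (rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF

a, b = channels(0x00FE0010), channels(0x00FF0011)
print([y - x for x, y in zip(a, b)])  # [1, 0, 1]: red and blue move by 1

w, v = channels(0x00FFFFFF), channels(0x00FEFEFE)
print([x - y for x, y in zip(w, v)])  # [1, 1, 1]: same-size step near white
```

Both pairs are single-step neighbors in 24-bit space, yet only the near-white pair is claimed to be easy to distinguish.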