New Sound Drivers Coming In Linux 4.16 Kernel
Originally posted by caligula View Post
For some reason people were happy with CD audio quality (or even lower) for a long time. Mp3 rips were often 22 kHz in the late 1990s. 48 kHz sound cards became more common around 2000 with the SB Live and Intel HD Audio. After that, things changed a lot: first 48, then 96, then 192. Now 384 kHz is standard in the high-end audio world. DAC bit counts have also steadily increased, from 8 to 16 to 24 to 32 bits; high-end gear has 64 bits or more. The same thing happened with video cards. 8 bits per channel was good enough for quite a long time; 24b true color was standardized in 1994. Now, for the first time, normal consumers want 30b, and the latest standards (HDMI, DisplayPort) even support up to 48 bits. It's really surprising that 30b wasn't enough: that is already a jump from 16.7 million colors to 64 x 16.7 million. Instead we need 262,144 times more. Maybe up to 64 bits in the future.
The so-called high-end gear you can buy is often really shoddy equipment, built by people who do not understand electronics and who truly believe that you can enhance the sound by placing Shakti stones on top of the equipment, or by using special cable lifters (to lessen the gravitational force on the cable...).
You cannot compare this with video, since we have not reached "CD quality" with video just yet (mainly because the bandwidth required for proper video is so insanely high).
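As a side note, the color-count arithmetic in the quoted post does check out if you count 2^N distinct values per pixel. A quick C sketch verifying those figures (just the arithmetic, not a claim about what is perceptible):
[CODE]
/* Verifies the color-depth figures from the quoted post:
 * 2^N distinct values per pixel for N = 24, 30, 48. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t c24 = 1ULL << 24;   /* "true color", ~16.7 million */
    uint64_t c30 = 1ULL << 30;
    uint64_t c48 = 1ULL << 48;

    printf("30-bit vs 24-bit: %llu times more colors\n",
           (unsigned long long)(c30 / c24));   /* 64 */
    printf("48-bit vs 30-bit: %llu times more colors\n",
           (unsigned long long)(c48 / c30));   /* 262144 */
    return 0;
}
[/CODE]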
Originally posted by caligula View Post
For some reason people were happy with CD audio quality (or even lower) for a long time. Mp3 rips were often 22 kHz in the late 1990s. 48 kHz sound cards became more common around 2000 with the SB Live and Intel HD Audio. After that, things changed a lot: first 48, then 96, then 192. Now 384 kHz is standard in the high-end audio world. DAC bit counts have also steadily increased, from 8 to 16 to 24 to 32 bits; high-end gear has 64 bits or more. The same thing happened with video cards. 8 bits per channel was good enough for quite a long time; 24b true color was standardized in 1994. Now, for the first time, normal consumers want 30b, and the latest standards (HDMI, DisplayPort) even support up to 48 bits. It's really surprising that 30b wasn't enough: that is already a jump from 16.7 million colors to 64 x 16.7 million. Instead we need 262,144 times more. Maybe up to 64 bits in the future.
I mean, why do you think RGB light controllers embedded in every fucking piece of "gaming" PC hardware are now commonplace? Having software-controlled multicolor LEDs all over the place isn't giving you any edge over an opponent in-game.
Same for mechanical keyboards and ultra-high-res laser mice.
Originally posted by F.Ultra View Post
They might, and that too would be completely useless. Already with 24-bit DACs you have a dynamic range of 144.49dB, which is well above the level that causes immediate and permanent hearing damage, so no music will ever utilize even a fraction of that range. Even a 16-bit DAC, which gives a 96.33dB dynamic range, is capable of reproducing sounds loud enough to cause hearing damage, so even this is actually overkill for a DAC. A 24-bit or 32-bit ADC is a whole different story, but that is not what we are discussing here.
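For reference, the dB figures quoted here are just the ideal-quantizer dynamic range, 20*log10(2^N) for an N-bit converter. A minimal C sketch reproduces them exactly:
[CODE]
/* Ideal dynamic range of an N-bit converter: 20*log10(2^N). */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const int bits[] = { 16, 24, 32 };

    for (int i = 0; i < 3; i++) {
        double dr = bits[i] * 20.0 * log10(2.0);  /* ~6.02 dB per bit */
        printf("%2d-bit: %6.2f dB\n", bits[i], dr);
    }
    return 0;   /* prints 96.33, 144.49 and 192.66 dB */
}
[/CODE]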
Originally posted by caligula View Post
You're probably right. The actual real-world performance might not be good, but the DACs probably still have actual slots for 32 bits. I'm also guessing that in a few years phones will have 64-bit DACs, and 16K graphics on 5" screens.
Originally posted by F.Ultra View Post
It's a press release and by definition fake news ;-). And if you really think that there exists a low-cost chip that can truly create an analogue output signal with a 0.000000001 Volt difference between adjacent codes, then I have a bridge to sell you.
0xFFFFFFFF would have to yield an exact output of 1.736000000 Volts and 0xFFFFFFFE would have to yield an exact output of 1.735999999 Volts from this chip in order for it to be a real 32-bit DAC.
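To put a number on that step size: with the 1.736 V full scale used in the post (an illustrative figure, not a real chip spec), one 32-bit LSB works out to roughly 0.4 nanovolts. A quick C sketch of the arithmetic:
[CODE]
/* One LSB step of an ideal N-bit DAC with a 1.736 V full-scale output
 * (the figure from the post above; not a real chip spec). */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double full_scale = 1.736;  /* volts */
    const int bits[] = { 16, 24, 32 };

    for (int i = 0; i < 3; i++) {
        double lsb = full_scale / pow(2.0, bits[i]);
        printf("%2d-bit LSB: %.12f V\n", bits[i], lsb);
    }
    return 0;   /* 32-bit LSB: 0.000000000404 V */
}
[/CODE]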
Originally posted by caligula View Post
So you think this is fake news?
"This first member of the ESS PRO SABRE series sets a new benchmark in high-end audio by offering the industry's highest dynamic range (DNR) of 140dB. The ES9038PRO also offers impressively low total harmonic distortion plus noise (THD+N) at -122dB in a 32-bit, 8-channel DAC."
0xFFFFFFFF would have to yield an exact output of 1.736000000 Volts and 0xFFFFFFFE would have to yield an exact output of 1.735999999 Volts from this chip in order for it to be a real 32-bit DAC.
Originally posted by starshipeleven View Post
Then what do they represent? Are they just a lie or are they some theoretical max?
Which of course also clearly shows why it's mostly a numbers game, since you cannot hear a difference that small. In fact, the 16 bits for CDs were set by Philips based on in-house studies to figure out the needed dynamic range, and then they added headroom to that.
So there does not exist anyone who can hear the difference between a 16-bit and a 24-bit sample if both are created correctly. The only reason that studios use 24-bit and above is to enable mixing, and so that they do not lose useful resolution when they compress the dynamic range.
The same is true for the frequency: 44.1kHz as used by CDs can truly represent every sound in the 0-22kHz range. However, this also means that if you accidentally record sound that is > 22kHz, it will create digital artefacts that look like lower-frequency material in your digital data, so you must have a filter. Filtering at 22kHz is not easy, since filters (even brick-wall filters) are not infinitely steep, so it's much better to record at 96kHz, apply a normal analogue filter somewhere between 22kHz and 48kHz, and then apply a digital filter when you downsample to CD quality.
The only times that people have heard differences are when they have either not done a proper ABX test (since the golden ears are so sensitive that they must know which equipment is playing in order to hear the difference...) or when the tested equipment was deliberately altering the sound (it is not uncommon for very expensive hi-fi cables to include electronics that alter the sound, aka an equalizer, so that people can claim to hear a difference).
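The artefact described above is aliasing: any tone above the Nyquist frequency fs/2 produces exactly the same samples as a lower-frequency image at fs - f. A minimal C sketch, assuming an unfiltered 30 kHz tone recorded at the CD rate:
[CODE]
/* Aliasing sketch: a 30 kHz tone sampled at 44.1 kHz yields exactly the
 * same samples as a 14.1 kHz tone (fs - f), which is why content above
 * 22.05 kHz must be filtered out BEFORE sampling. */
#include <stdio.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    const double fs = 44100.0;       /* CD sample rate */
    const double f_in = 30000.0;     /* tone above Nyquist (fs/2 = 22050) */
    const double f_img = fs - f_in;  /* aliased image at 14100 Hz */

    for (int n = 0; n < 5; n++) {
        double t = n / fs;
        /* cosine is used so both tones alias onto each other exactly;
         * with sine the image would appear with inverted sign */
        printf("n=%d  30000 Hz: %+.6f  14100 Hz: %+.6f\n",
               n, cos(2.0 * M_PI * f_in * t), cos(2.0 * M_PI * f_img * t));
    }
    return 0;
}
[/CODE]
Both columns print identical values, so once the signal is sampled the two tones are indistinguishable; no filter applied afterwards can undo that.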
For those who wonder what can change: the DMA interfacing and the control of the engine. It can have a DSP or not. The codec might have slight changes. The mixer/audio path might change.
To be clear: the audio interfacing has become more and more retarded since Intel introduced HDA.
Before that, at least on PCs, multiple channels were mixed in hardware. These days the CPU needs to mix everything in software and trash the cache, just because the HDMI output does not have a PCM channel mixer in front of it.
As for the amplifier side and such, I had no problem attaching an I2S codec with an integrated class-D amplifier to a generic I2S DMA engine.
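For a sense of what "mixing in software" means here: every active stream has to be summed, sample by sample and with saturation, by the CPU before it reaches the single PCM output. A minimal sketch of that inner loop (illustrative only, not any particular driver's code):
[CODE]
/* Sum two signed 16-bit PCM streams with saturation -- the job a
 * hardware PCM mixer in front of the output used to handle. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

static int16_t sat16(int32_t v)
{
    if (v > INT16_MAX) return INT16_MAX;
    if (v < INT16_MIN) return INT16_MIN;
    return (int16_t)v;
}

static void mix_s16(int16_t *dst, const int16_t *a, const int16_t *b,
                    size_t frames)
{
    /* every sample of every stream passes through the CPU and its
     * caches instead of being summed by dedicated hardware */
    for (size_t i = 0; i < frames; i++)
        dst[i] = sat16((int32_t)a[i] + (int32_t)b[i]);
}

int main(void)
{
    int16_t a[] = { 1000, 30000, -30000, 0 };
    int16_t b[] = { 2000, 10000, -10000, 5 };
    int16_t out[4];

    mix_s16(out, a, b, 4);
    for (int i = 0; i < 4; i++)
        printf("%d ", out[i]);   /* 3000 32767 -32768 5 */
    printf("\n");
    return 0;
}
[/CODE]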