
New Sound Drivers Coming In Linux 4.16 Kernel


  • phoronix
    started a topic New Sound Drivers Coming In Linux 4.16 Kernel


    Phoronix: New Sound Drivers Coming In Linux 4.16 Kernel

    With longtime SUSE developer Takashi Iwai going on holiday for the next few weeks, he has already sent in the sound driver feature updates targeting the upcoming Linux 4.16 kernel cycle...

    http://www.phoronix.com/scan.php?pag...-Sound-Updates

  • sleeplessclassics
    replied
    Originally posted by F.Ultra View Post

    It's a press release and by definition fake news ;-). And if you really think that there exists a low-cost chip that can truly create an analogue output signal with a 0.000000001 Volt difference then I have a bridge to sell you.
    I know this is not the point here... but this is the best "I have a bridge to sell you" reference I have ever come across, and this is why I am on Phoronix: for the forum. I mean, the articles are amazing as well, but the forum is the best part.



  • caligula
    replied
    Originally posted by F.Ultra View Post
    You cannot compare this with video since we have not reached CD quality with video just yet (mainly because the bandwidth required for proper video is so insanely high).
    That's debatable. I can see a difference between 6-bit TN screens and 8-bit IPS, maybe even 8-bit IPS vs 10-bit OLED, but I very much doubt there's much goodness beyond that. I also can't tell whether a mobile phone has a 4K screen or a 16K one.



  • F.Ultra
    replied
    Originally posted by caligula View Post

    For some reason people were happy with CD audio quality (or even lower) for a long time. MP3 rips were often 22 kHz in the late 1990s. 48 kHz sound cards became more common around 2000 with the SB Live and Intel HD Audio. After that things changed a lot: first 48, then 96, then 192 kHz; now 384 kHz is standard in the high-end audio world. DAC bit counts have also steadily increased, from 8 to 16 to 24 to 32 bits. High-end gear has 64 bits or more. Same thing with video cards: 8 bits per channel was good enough for quite a long time, and 24-bit true color was standardized in 1994. Now for the first time normal consumers want 30-bit, and the latest standards even support up to 48 bits (HDMI, DisplayPort). It's really surprising that 30-bit wasn't enough: that's already 64 × the 16.7 million colors of 24-bit, yet we apparently need 262144 times more. Maybe up to 64 bits in the future.
    That is because, if we do not count the multi-channel vs only-stereo thing, CD audio quality is perfectly fine. You can take any existing 384 kHz 64-bit piece of music, convert it to CD quality, and no one will be able to pick out the original from the CD version at a rate better than pure chance in a proper ABX test.

    The so-called high-end gear that you can buy is often really shoddy equipment built by people who do not understand electronics and who truly believe that you can enhance the sound by placing shakti stones on top of the equipment, or by using special cable lifters (to lessen the gravitational force on the cable...).

    You cannot compare this with video since we have not reached CD quality with video just yet (mainly because the bandwidth required for proper video is so insanely high).
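A proper ABX test, as mentioned above, is simple to run in software: on each trial X is randomly either A or B, the listener says which one they think it is, and you count hits against the ~50% expected from guessing. A minimal sketch (the `guess` callable is a hypothetical stand-in for a real listener's verdict):

```python
import random

def run_abx_trials(trials: int, guess) -> int:
    """Count correct identifications when X is randomly A or B each trial."""
    correct = 0
    for _ in range(trials):
        x = random.choice(["A", "B"])   # hidden assignment for this trial
        if guess() == x:                # listener's verdict for X
            correct += 1
    return correct

# A listener who cannot hear any difference is just guessing:
random.seed(0)
hits = run_abx_trials(100, lambda: random.choice(["A", "B"]))
print(hits)  # expected to hover around 50 out of 100
```

A real listener who genuinely hears a difference will score well above chance; scores near 50/100 are indistinguishable from guessing, which is the point F.Ultra is making.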



  • starshipeleven
    replied
    Originally posted by caligula View Post
    For some reason people were happy with CD audio quality (or even lower) for a long time. MP3 rips were often 22 kHz in the late 1990s. 48 kHz sound cards became more common around 2000 with the SB Live and Intel HD Audio. After that things changed a lot: first 48, then 96, then 192 kHz; now 384 kHz is standard in the high-end audio world. DAC bit counts have also steadily increased, from 8 to 16 to 24 to 32 bits. High-end gear has 64 bits or more. Same thing with video cards: 8 bits per channel was good enough for quite a long time, and 24-bit true color was standardized in 1994. Now for the first time normal consumers want 30-bit, and the latest standards even support up to 48 bits (HDMI, DisplayPort). It's really surprising that 30-bit wasn't enough: that's already 64 × the 16.7 million colors of 24-bit, yet we apparently need 262144 times more. Maybe up to 64 bits in the future.
    Ever heard of marketing?

    I mean, why do you think having RGB light controllers embedded in every fucking piece of "gaming" PC hardware is now commonplace? Having software-controlled multicolor LEDs all over the place isn't giving you any edge over an opponent in-game.

    Same for mechanical keyboards and ultra-high-res laser mice.



  • caligula
    replied
    Originally posted by F.Ultra View Post

    They might, and that too would be completely useless. Already with 24-bit DACs you have a dynamic range of 144.49 dB, which is well above the level that causes immediate and permanent hearing damage, so no music will ever utilize even a fraction of that range. Even 16-bit DACs, which give a 96.33 dB dynamic range, are capable of reproducing sounds loud enough to cause hearing damage, so even this is actually overkill for a DAC. A 24-bit or 32-bit ADC is a whole different story, but that is not what we are discussing here.
    For some reason people were happy with CD audio quality (or even lower) for a long time. MP3 rips were often 22 kHz in the late 1990s. 48 kHz sound cards became more common around 2000 with the SB Live and Intel HD Audio. After that things changed a lot: first 48, then 96, then 192 kHz; now 384 kHz is standard in the high-end audio world. DAC bit counts have also steadily increased, from 8 to 16 to 24 to 32 bits. High-end gear has 64 bits or more. Same thing with video cards: 8 bits per channel was good enough for quite a long time, and 24-bit true color was standardized in 1994. Now for the first time normal consumers want 30-bit, and the latest standards even support up to 48 bits (HDMI, DisplayPort). It's really surprising that 30-bit wasn't enough: that's already 64 × the 16.7 million colors of 24-bit, yet we apparently need 262144 times more. Maybe up to 64 bits in the future.



  • F.Ultra
    replied
    Originally posted by caligula View Post

    You're probably right. The actual real-world performance might not be good, but the DACs probably still have actual slots for 32 bits. I'm also guessing that in a few years phones will have 64-bit DACs, and 16K graphics on 5" screens.
    They might, and that too would be completely useless. Already with 24-bit DACs you have a dynamic range of 144.49 dB, which is well above the level that causes immediate and permanent hearing damage, so no music will ever utilize even a fraction of that range. Even 16-bit DACs, which give a 96.33 dB dynamic range, are capable of reproducing sounds loud enough to cause hearing damage, so even this is actually overkill for a DAC. A 24-bit or 32-bit ADC is a whole different story, but that is not what we are discussing here.
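The dynamic-range figures quoted in the post come straight from the quantization math: an ideal N-bit converter spans 2^N levels, giving 20·log10(2^N) ≈ 6.02·N dB. A quick sketch:

```python
import math

def dac_dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit converter: 20*log10(2^N)."""
    return 20 * math.log10(2 ** bits)

for bits in (16, 24, 32):
    print(f"{bits}-bit: {dac_dynamic_range_db(bits):.2f} dB")
# → 16-bit: 96.33 dB, 24-bit: 144.49 dB, 32-bit: 192.66 dB
```

These match the 96.33 dB and 144.49 dB figures in the thread; real converters fall short of the ideal because of analogue noise and distortion.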



  • caligula
    replied
    Originally posted by F.Ultra View Post

    It's a press release and by definition fake news ;-). And if you really think that there exists a low-cost chip that can truly create an analogue output signal with a 0.000000001 Volt difference then I have a bridge to sell you.

    0xFFFFFFFF would have to yield an exact output of 1.736000000 Volts and 0xFFFFFFFE an exact output of 1.735999999 Volts from this chip in order for it to be a real 32-bit DAC.
    You're probably right. The actual real-world performance might not be good, but the DACs probably still have actual slots for 32 bits. I'm also guessing that in a few years phones will have 64-bit DACs, and 16K graphics on 5" screens.



  • F.Ultra
    replied
    Originally posted by caligula View Post

    So you think this is fake news?
    http://www.marketwired.com/press-rel...ed-2085982.htm

    "This first member of the ESS PRO SABRE series sets a new benchmark in high-end audio by offering the industry's highest dynamic range (DNR) of 140dB. The ES9038PRO also offers impressively low total harmonic distortion plus noise (THD+N) at -122dB in a 32-bit, 8-channel DAC."
    It's a press release and by definition fake news ;-). And if you really think that there exists a low-cost chip that can truly create an analogue output signal with a 0.000000001 Volt difference then I have a bridge to sell you.

    0xFFFFFFFF would have to yield an exact output of 1.736000000 Volts and 0xFFFFFFFE an exact output of 1.735999999 Volts from this chip in order for it to be a real 32-bit DAC.



  • F.Ultra
    replied
    Originally posted by starshipeleven View Post
    Then what do they represent? Are they just a lie or are they some theoretical max?
    They represent best effort. I.e. they have real 32-bit decoders inside, but there is just no way for them to create a true 32-bit output. Line level, which is used for analogue sound transmission between equipment, has a total range of 3.472 V (for professional equipment; consumer line level is slightly less), which means that the one-bit difference this chip would have to create is about 1 nV (0.000000001 V); the noise levels alone should be greater than this for any integrated circuit.

    Which of course also clearly shows why it's mostly a numbers game, since you cannot hear a difference that small. In fact the 16 bits for CDs were set by Philips based on in-house studies to figure out the needed dynamic range, and then they added headroom to that.

    So there does not exist anyone who can hear the difference between a 16-bit and a 24-bit sample if both are created correctly. The only reason studios use 24 bits and above is to enable mixing and so that they do not lose useful resolution when they compress the dynamic range.

    The same is true for the frequency: 44.1 kHz as used by CDs can fully represent every sound in the 0–22 kHz range. However, this also means that if you accidentally record sound above 22 kHz, it will create digital artefacts that look like lower-frequency material in your digital data, so you must have a filter. Filtering at 22 kHz is not easy since filters (even brick-wall filters) are not infinitely steep, so it's much better to record at 96 kHz, apply an analogue filter somewhere between 22 kHz and 48 kHz, and then apply a digital filter when you downsample to CD quality.

    The only times that people have heard differences are when they either have not done a proper ABX test (since the golden ears are so sensitive that they must know which equipment is playing in order to hear the difference...) or when the tested equipment deliberately alters the sound (it is not uncommon for very expensive hi-fi cables to include electronics that alter the sound, i.e. an equalizer, so that people can claim to hear a difference).
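The aliasing effect described above follows directly from the sampling formula: at fs = 44.1 kHz, a 30 kHz tone (above the 22.05 kHz Nyquist limit) produces exactly the same sample values as an inverted 14.1 kHz tone, since sin(2π(fs−f)n/fs) = −sin(2πf·n/fs). A small sketch:

```python
import math

FS = 44100             # CD sampling rate in Hz
F_HIGH = 30000         # tone above the Nyquist limit (FS / 2 = 22050 Hz)
F_ALIAS = FS - F_HIGH  # 14100 Hz: where the tone folds down to

# Sample both tones at the same sample indices n.
high = [math.sin(2 * math.pi * F_HIGH * n / FS) for n in range(1000)]
alias = [-math.sin(2 * math.pi * F_ALIAS * n / FS) for n in range(1000)]

# The sampled waveforms are indistinguishable: the 30 kHz content has
# folded onto 14.1 kHz, which is why an anti-aliasing filter is mandatory.
max_diff = max(abs(a - b) for a, b in zip(high, alias))
print(max_diff)  # on the order of floating-point rounding error
```

Once the samples are identical, no amount of downstream processing can tell the two tones apart, so the filtering has to happen before the ADC.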

