
Why does SPDIF basically suck?



I remember when Dunn and Hawksford first published "Is the AES/EBU/SPDIF Digital Audio Interface Flawed?" about the jitter problems with S/PDIF.

 

Here and in other threads it is often said that the unidirectional nature of S/PDIF is a source of jitter trouble, while the bidirectional nature of USB allows data flow to be controlled and verified by the DAC.

 

For me at least, computer USB audio surpassed transport-delivered S/PDIF/AES-EBU some years ago. I suspect this was all about improvements in implementation and ancillary gear to address noise etc. All I really hope for is that USB continues to be researched, allowing reasonably inexpensive upgrade paths without having to invest in a whole new whiz-bang digital interface and hardware. Just my 2c.

Sound Minds Mind Sound

 

 


S/PDIF can be reclocked and the results improve quite well with a device such as the Mutec MC-3+USB.

That is more money to spend on top of a device like the Devialet or Yggdrasil, but then again money is also needed to treat USB and Ethernet.

 

If primary playback is Redbook, S/PDIF can be mastered and will work well; USB has the upper hand in bandwidth, where it's needed for DSD128 and up.

 

I agree with @Superdad that DAC designers who struggle to make USB interfaces sound reasonable don't deserve patronage.

 

Isn't there a new XMOS interface due soonish?

AS Profile Equipment List        Say NO to MQA

On 5/12/2018 at 8:36 PM, mansr said:

You can do something more elaborate than the standard PLL and achieve much lower jitter, but you must always somehow avoid drift between your local clock and the incoming signal. USB does this by providing a feedback channel whereby the receiver can instruct the sender to slow down or speed up as needed to keep the FIFO at a suitable fill level. S/PDIF is unidirectional, so nothing like this is possible there.

 

8 minutes ago, One and a half said:

S/PDIF can be reclocked and the results improve quite well

 

Are the above statements compatible?

 

 

 

Sound Minds Mind Sound

 

 

2 minutes ago, Audiophile Neuroscience said:

 

 

Are the above statements compatible?

 

 

 

The Mutec discards any clock data in the S/PDIF stream and injects new timing using its own close-to-reference-grade clock. There's still a PLL/PID control, but with at least a better clock to begin with. Just a different technique to overcome a legacy interface.

AS Profile Equipment List        Say NO to MQA

4 hours ago, mansr said:

Really? What was it meant for then, and why has it been used in Philips CD players since the mid 80s?

 

Sorry, was thinking about I2S at the time. Doesn't change my opinion of S/PDIF though. A lot of effort is required at both the transmit and receive ends to make it perform well, and then there are still issues with the cable, etc. There's no denying that, unless generated right in the computer, S/PDIF adds another unnecessary conversion to the chain.

1 minute ago, Superdad said:

Sorry, was thinking about I2S at the time.

They showed up around the same time.

 

1 minute ago, Superdad said:

Doesn't change my opinion of S/PDIF though. A lot of effort is required at both the transmit and receive ends to make it perform well,

Transmitting it isn't hard. The problem for the receiver is recovering the clock without excessive jitter. Early implementations used all transitions for clock recovery, resulting in data-dependent jitter, which is what the J-test signal is designed to expose. Later implementations started using only the preamble of each sample, which is constant, to avoid this issue. USB is infinitely more complex but also far more capable.
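To make the data dependence concrete, here is a rough numpy/scipy sketch (a toy model of my own, not any actual receiver's code): it biphase-mark encodes a few bit patterns, band-limits them the way a cable would, and measures where the threshold crossings land. The oversampling factor, filter order, and cutoff are arbitrary illustrative choices.

```python
import numpy as np
from scipy.signal import butter, lfilter

OVERSAMPLE = 64  # simulation samples per half bit cell

def biphase_mark(bits):
    """Biphase-mark coding: a transition at every cell boundary,
    plus a mid-cell transition for a 1 bit."""
    level, out = 1.0, []
    for b in bits:
        level = -level              # transition at cell start
        out += [level] * OVERSAMPLE
        if b:
            level = -level          # extra mid-cell transition for a 1
        out += [level] * OVERSAMPLE
    return np.array(out)

def crossing_offsets(bits):
    sig = biphase_mark(bits)
    b, a = butter(2, 0.04)          # crude low-pass "cable"
    filt = lfilter(b, a, sig)
    # fractional zero-crossing positions via linear interpolation
    idx = np.flatnonzero(np.signbit(filt[:-1]) != np.signbit(filt[1:]))
    frac = filt[idx] / (filt[idx] - filt[idx + 1])
    pos = idx + frac
    # offset of each crossing from the nearest half-cell grid point
    return (pos + OVERSAMPLE / 2) % OVERSAMPLE - OVERSAMPLE / 2

rng = np.random.default_rng(0)
for name, bits in (("all zeros", np.zeros(64, int)),
                   ("all ones ", np.ones(64, int)),
                   ("random   ", rng.integers(0, 2, 64))):
    off = crossing_offsets(bits)[10:]   # skip filter settling
    print(f"{name}: crossing spread = {off.max() - off.min():.3f} samples")
```

With the uniform patterns the crossing spread is essentially zero; the random pattern shows crossings smeared by intersymbol interference, which is exactly what all-transition clock recovery turns into data-dependent jitter.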

 

1 minute ago, Superdad said:

and then there are still issues with the cable, etc. There's no denying that, unless generated right in the computer, S/PDIF adds another unnecessary conversion to the chain.

Yes, going from USB to S/PDIF to I2S is pointless when you can just as well go to I2S directly.


Digital cables are one of the reasons I want my next DAC to have the UPnP/Roon renderer as part of a single-chassis solution... I've found I really can't stand USB cables; even with the ISO Regen I use back-to-back USPCB connectors. SPDIF cable I splurged on quite a while ago, but I'm sure that if I were willing to spend $$$ on an even better one I'd find differences... all of which could be rendered moot by keeping the endpoint digital D/A function internal to a single, integrated, well-designed unit.

Regards,

Dave

 

Audio system

8 minutes ago, davide256 said:

Digital cables are one of the reasons I want my next DAC to have the UPnP/Roon renderer as part of a single-chassis solution... I've found I really can't stand USB cables; even with the ISO Regen I use back-to-back USPCB connectors. SPDIF cable I splurged on quite a while ago, but I'm sure that if I were willing to spend $$$ on an even better one I'd find differences... all of which could be rendered moot by keeping the endpoint digital D/A function internal to a single, integrated, well-designed unit.

The argument against this is that it's better to keep noisy digital electronics far away from the sensitive analogue parts, such as in a separate box.

13 hours ago, buonassi said:

So the J-test is done using only one tone at 0 dBFS (or close to it) and observing the effects on the rest of the spectrum. Anyone else see the shortfall here besides me? Even if a J-test were run on a sweep it would be better, but still wouldn't really be sufficient. Music isn't just one perfectly oscillating sine wave. And it's reasonable to assume that when the USB chip is presented with data that doesn't repeat as nicely as an encoded sine wave, jitter could increase substantially.

 

Or maybe I'm looking at this through the lens of a layperson, and these tests are really good proxies for real music. But I haven't heard any counterargument yet.

 

 

 

 

Jitter or mis-timing results in noise of various types showing up in the output. It might be sideband tones around each frequency, or just a broad increase in the noise floor, depending on which type of jitter it is. Because of how the digital format and FFTs work, the quarter-sample-rate tone is very good at exposing significant levels of jitter.
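To illustrate the two signatures, here is a toy numpy sketch (my own illustration, not standard test code; the 1 ns jitter amplitude is arbitrary):

```python
import numpy as np

# Sample an 11.025 kHz tone with (a) sinusoidal and (b) random timing
# errors, then look at the FFT. Sinusoidal jitter produces discrete
# sidebands around the tone; random jitter raises the noise floor.

fs, f0, n = 44100, 11025, 1 << 16
t = np.arange(n) / fs
win = np.blackman(n)
freqs = np.fft.rfftfreq(n, 1 / fs)

def worst_spur_db(jitter_s):
    x = np.sin(2 * np.pi * f0 * (t + jitter_s))
    spec = np.abs(np.fft.rfft(x * win))
    db = 20 * np.log10(spec / spec.max() + 1e-15)
    mask = np.abs(freqs - f0) > 500   # look away from the tone itself
    return db[mask].max()

sine_jit = 1e-9 * np.sin(2 * np.pi * 1000 * t)   # 1 ns at 1 kHz
rand_jit = 1e-9 * np.random.default_rng(0).standard_normal(n)

print("sinusoidal jitter, worst sideband: %.1f dB" % worst_spur_db(sine_jit))
print("random jitter, worst noise bin:    %.1f dB" % worst_spur_db(rand_jit))
```

The sinusoidal case puts a discrete sideband pair 1 kHz either side of the tone; the random case spreads the same energy across the whole floor.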

 

As mansr said a few posts ago, originally S/PDIF recovered the clock on each transition. This could cause various bit combinations to show up as noise around the music. The actual full J-test mixed a 689 Hz square wave with an 11,025 Hz tone. The 689 Hz square wave toggled just the least significant bit. This would cause the maximum amount of data-related jitter. That part of the test really doesn't fit with USB-sourced digital, so many call it a J-test when they use a high-level tone at 1/4 the sample rate. It actually is not the original J-test that was developed by Julian Dunn.
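Following that description, a minimal sketch of the 44.1 kHz stimulus (my reading of the paragraph above; the tone level here is arbitrary and the exact levels in Dunn's specification may differ):

```python
import numpy as np

fs = 44100
n = fs  # one second of signal

# fs/4 tone: sampled at 0/90/180/270 degrees it is just the repeating
# pattern 0, +A, 0, -A. The amplitude is an arbitrary illustration.
amp = 2 ** 14
tone = np.tile([0, amp, 0, -amp], n // 4)

# ~689 Hz (fs/64) square wave toggling only the least significant bit
lsb = (np.arange(n) // 32) % 2          # flips every 32 samples
jtest = np.clip(tone + lsb, -2 ** 15, 2 ** 15 - 1).astype(np.int16)

print("LSB square frequency: %.2f Hz" % (fs / 64))   # 689.06 Hz
print("first 8 samples:", jtest[:8])
```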

And always keep in mind: Cognitive biases, like optical illusions, are a sign of a normally functioning brain. We all have them; it's nothing to be ashamed of, but it is something that affects our objective evaluation of reality.

On 2018-05-11 at 11:26 PM, sandyk said:

Well-implemented coax S/PDIF can sound markedly better than most USB implementations, although its bandwidth is inadequate for recent DSD implementations, where there is a limited amount of material available anyway, and not in popular music either.

There is no reason why coax S/PDIF's bandwidth couldn't be markedly improved these days if there were a will to do so.

 

I agree a well-implemented coax can sound better than most USB. To say that SPDIF sucks, sucks ;)

 

Chord uses two coax cables with BNC connectors between the BLU2 and DAVE (up to 705.6/768 kHz). The dCS Vivaldi can play 384 kS/s and DoP64/DoP128 over dual AES.

On 2018-05-13 at 3:43 AM, Superdad said:

Standards are a good thing, but the S/PDIF standard--originally not even meant to be used externally--really ought to just fade away in the computer audio arena.

 

And still you use I2S (which was not made to be used externally), so it's possible (I guess) to improve the design, as has been done with LVDS I2S.

1 minute ago, Ralf11 said:

Another question is how much damage unnecessary conversions really do in the digital domain. Under what circumstances will jitter increase to audible levels, for example?

Jitter doesn't accumulate. A good final stage can undo much damage done earlier, and a poor one will ruin the cleanest of inputs.

14 minutes ago, Ralf11 said:

Another question is how much damage unnecessary conversions really do in the digital domain. Under what circumstances will jitter increase to audible levels, for example?

 

Conversions between different forms of digital encoding have their advantages. One design and protocol can be better at reducing RF noise or at covering long distances; another is better at reducing jitter, and so on. No digital protocol sucks, only the implementation.


There is a bottleneck here. I would not use USB 2.0 or coax to connect the DAC and computer.

I do not know why, but AES sounded better on the system I had that could output and receive both. The future replacements for USB 2.0 are Ethernet, USB-C, and Thunderbolt.

FireWire was better than USB 2.0, and yet it is obsolete.

 

2012 Mac Mini, i5 - 2.5 GHz, 16 GB RAM. SSD,  PM/PV software, Focusrite Clarett 4Pre 4 channel interface. Daysequerra M4.0X Broadcast monitor., My_Ref Evolution rev a , Klipsch La Scala II, Blue Sky Sub 12

Clarett used as ADC for vinyl rips.

Corning Optical Thunderbolt cable used to connect computer to 4Pre. Dac fed by iFi iPower and Noise Trapper isolation transformer. 

1 hour ago, Summit said:

And still you use I2S (which was not made to be used externally), so it’s possible (I guess) to improve the design like has been done with LVDS I2S.

 

Well, I2S internally--right from the Ethernet or USB input board--with master clocking done right.

I2S (via LVDS over HDMI) is rarely done in ideal fashion as the source ends up as the master clock (very few DACs feed master clock out to slave the I2S source).

 

I personally use external I2S at present (modified Singxer SU-1 feeding I2S/DSD over an LVDS/HDMI cable to a Holo Spring L3), because the Crystek CCHD-575 clocks in the Singxer are a lot better than the clocks in the Spring (and/or the USB input of the SU-1 is better).

 

There is nothing preventing an enterprising manufacturer/designer from developing a more ideal--and yet still operating system "sound card driver" compatible--two piece solution with....  (oops.  :ph34r:  shhh... B|).

13 minutes ago, Superdad said:

I2S (via LVDS over HDMI) is rarely done in ideal fashion as the source ends up as the master clock (very few DACs feed master clock out to slave the I2S source).

One reason might be that to have the DAC be the master clock, you'd need to deal with an unknown amount of signal skew. A 1-metre cable would have a minimum of 10 ns round-trip delay, probably twice as much once LVDS interfaces and other logic are included. That's too much to feed directly into most DAC chips, and you'd need a resynchronisation stage to ensure the phase requirements are met. The source would also need to support an external master clock, which not all do.
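Rough numbers behind that (back-of-envelope only; the ~0.66c cable velocity factor and the extra LVDS/logic delay are my assumptions):

```python
# Back-of-envelope for the skew if the DAC were the I2S master clock.

c = 3e8                        # m/s
velocity = 0.66 * c            # assumed cable velocity factor
cable_m = 1.0

one_way = cable_m / velocity   # about 5 ns
lvds_and_logic = 10e-9         # assumed extra delay ("probably twice as much")
round_trip = 2 * one_way + lvds_and_logic

for fs in (44100, 192000, 384000):
    bclk_period = 1 / (64 * fs)         # 64*fs bit clock, typical for I2S
    print(f"fs={fs:6d}: round trip {round_trip * 1e9:4.1f} ns = "
          f"{100 * round_trip / bclk_period:5.1f}% of a bit-clock period")
```

At 384 kHz the assumed round trip eats about half a bit-clock period, which is why a resynchronisation stage would be needed.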

7 hours ago, mansr said:

The problem for the receiver is recovering the clock without excessive jitter.

 

What is confusing for me, as far as I can understand the concepts, is the notion that the clock doesn't have to be *recovered* so long as it is *replaced* by a better one. I recall the fad of external super clocks for this purpose. My naive understanding is the supposition that you can have as much jitter as you like in the incoming signal, provided the buffer/FIFO is accurately reclocked before conversion to analog: problem solved?? This kind of 'argument' was often presented in the past by people who advocated that jitter was a non-issue for "properly" designed DACs.

 

2 hours ago, Summit said:

 

I agree a well-implemented coax can sound better than most USB. To say that SPDIF sucks, sucks ;)

 

 

Depends what you mean by "most USB". Comparing apples with apples, for me, well-implemented USB sounds better than well-implemented coax S/PDIF or AES/EBU or glass ST fibre (the bayonet connection).

Sound Minds Mind Sound

 

 

1 minute ago, Audiophile Neuroscience said:

What is confusing for me, as far as I can understand the concepts, is the notion that the clock doesn't have to be *recovered* so long as it is *replaced* by a better one.

Absent a feedback channel for flow control, the playback rate must be slaved to the sender in order to avoid clock drift.
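A toy sketch of what that feedback loop does in asynchronous USB audio (purely illustrative; the packet size, gain, and FIFO size are my own numbers, not anything from the USB spec):

```python
from collections import deque

# Receiver runs on its own fixed clock and, via the feedback channel,
# tells the sender how many samples to put in each packet, steering
# the FIFO back toward half full.

FIFO_SIZE = 512
TARGET = FIFO_SIZE // 2
NOMINAL = 48                # samples per 1 ms packet at 48 kHz

fifo = deque()
consume_rate = 48.005       # DAC clock slightly fast (about 100 ppm)
acc = 0.0

for packet in range(2000):  # 2 seconds of 1 ms packets
    request = NOMINAL + round(0.1 * (TARGET - len(fifo)))  # feedback
    fifo.extend([0] * max(request, 0))                     # sender obeys
    acc += consume_rate
    take = int(acc)                                        # DAC consumes
    acc -= take
    for _ in range(min(take, len(fifo))):
        fifo.popleft()

print(f"FIFO fill after 2 s: {len(fifo)} (target {TARGET})")
```

The fill settles at the target and the DAC never drops or invents a sample; the sender's delivery rate is steered instead.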

18 minutes ago, mansr said:

Absent a feedback channel for flow control, the playback rate must be slaved to the sender in order to avoid clock drift.

 

So if S/PDIF lacks the feedback channel, it must be slaved to the sender, with its jitter? Alternatively, why can't you just discard the sending clock, forget about drift, and just use a new clock at the receiving end?

Sound Minds Mind Sound

 

 

3 minutes ago, Audiophile Neuroscience said:

So if S/PDIF lacks the feedback channel, it must be slaved to the sender, with its jitter? Alternatively, why can't you just discard the sending clock, forget about drift, and just use a new clock at the receiving end?

If you ignore drift, you'll need to either drop or insert samples whenever the clocks slip by more than a sample period. The S/PDIF spec requires a frequency accuracy of 1000 ppm for the sender. Suppose your local clock is running at a perfect 48 kHz while the sender is at the upper end of the permitted range, that is 48048 Hz. Every second, you'll be receiving 48 samples more than you know what to do with. You have no choice but to discard them, and this causes distortion. Similarly, if the sender is slow, you'll have to somehow pull 48 samples per second out of thin air, again distorting the signal.
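In plain numbers (a quick sketch of those worst-case figures; the 4096-sample FIFO is just an example size):

```python
# Worst-case drift: sender 1000 ppm fast, receiver free-running at 48 kHz.

nominal = 48_000
sender = nominal * (1 + 1000 / 1e6)    # 48048 Hz

surplus = sender - nominal             # 48 extra samples every second
print(f"sender rate: {sender:.0f} Hz")
print(f"surplus: {surplus:.0f} samples/s, one slip every "
      f"{1000 / surplus:.1f} ms")
print(f"time to overflow a 4096-sample FIFO: {4096 / surplus:.0f} s")
```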

3 hours ago, mansr said:

If you ignore drift, you'll need to either drop or insert samples whenever the clocks slip by more than a sample period. The S/PDIF spec requires a frequency accuracy of 1000 ppm for the sender. Suppose your local clock is running at a perfect 48 kHz while the sender is at the upper end of the permitted range, that is 48048 Hz. Every second, you'll be receiving 48 samples more than you know what to do with. You have no choice but to discard them, and this causes distortion.

 

Or you can do what @JohnSwenson did for the unique S/PDIF input of the Bottlehead DAC, which was to use an FPGA instead of a traditional S/PDIF receiver with its jitter-prone PLL. He did a little cleanup of the S/PDIF signal, then sent it into an FPGA for decoding. But the special part was that he used a digitally controlled low-phase-noise clock, with performance close to some of the best fixed-frequency clocks. The FPGA told the variable clock to speed up or slow down so it stayed synchronized to the average data rate of the source. It was a REALLY good S/PDIF input!
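A rough software analogue of that idea (my own sketch; the Bottlehead design itself lives in an FPGA, and nothing here is taken from it beyond the post's description): a slow PI loop steers a digitally controlled oscillator toward the long-term average input rate, so short-term jitter never reaches the conversion clock.

```python
import random

# Steer a digitally controlled oscillator (DCO) to the *average*
# incoming S/PDIF rate: a slow PI loop on the buffer fill absorbs
# short-term jitter. Gains and rates are assumed illustrative values.

NOMINAL = 48000.0
true_rate = 48010.0      # sender's actual average rate (unknown to us)
fifo_fill = 0.0          # buffered samples relative to the target fill
integ = 0.0
KP, KI = 0.02, 0.0005    # deliberately slow loop gains

random.seed(0)
for step in range(600):  # one loop update per second
    arrived = true_rate + random.gauss(0, 20)      # jittery arrivals
    dco = NOMINAL + KP * fifo_fill + KI * integ    # PI sets DCO frequency
    fifo_fill += arrived - dco                     # consumed at DCO rate
    integ += fifo_fill

print(f"DCO frequency: {dco:.1f} Hz (sender average {true_rate} Hz)")
print(f"residual buffer error: {fifo_fill:+.1f} samples")
```

Because the loop gains are tiny, the DCO tracks only the average rate; the per-second arrival jitter is filtered out rather than passed on to the conversion clock.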

