ksamnic

  1. Yes, I totally get this (the need for a good, stable clock), but I didn't think it came into play except when encoding or decoding. The part I am interested in is the stream of bits from the media server to the external DAC. I think it is a pretty straightforward computing job: take an encoded digital file and send it out a port at a constant rate without messing up the bits. No A/D or D/A. I can only think of two things that can go wrong with the streaming (which doesn't mean there are only two things; these are just the ones I can think of):
       1. the hardware and/or software in the data path alters or drops bits, or
       2. the hardware and/or operating system introduces jitter, resulting in data loss or corruption at the receiving end.
     This is why I am looking for quantitative studies. There are many opinions that equipment x or y "sounds better", and I am sure that it does, but the science guy in me wants to know why. Is it because some interfaces lose bits? Is it because some cards unintentionally change bits? Is it because some systems introduce jitter? Is it because some components intentionally "color" the sound? I figure that the way to find out is to compare the actual audio streams, measuring parameters that could account for a change in sound quality (a rough comparison sketch follows these posts). Also, I am pretty sure that someone must have already done this study; it seems like an obvious research project for a computer engineering grad student. Anyway, I will post here if I find out anything.
  2. Are you sure about this (audio files having time)? I understand that there is timing info in the metadata on a CD, but my understanding was that digital audio is just PCM'd audio. There would be a sampling rate etc., and the stream is written out as a series of 1s and 0s (see the WAV header sketch after these posts).
  3. Can anyone point me to the results of an actual test of the output from audio servers, i.e., a test that captures the bits that come out of the servers, compares them to the original CD, and measures any jitter introduced by the interface choice?
     Why? I keep reading a large number of postings on this site (and others) about cards like the Lynx (for example) being bit perfect and sounding better than a cheap on-board S/PDIF or USB output (without a sound card). My feeling (and I am new to all of this) is that if two digital output streams sound different (through the same external DAC and subsequent audio components), then one (or both) of the audio bit streams has been corrupted on the path from the file to the DAC. My instinct is that the fewer components in the path from WAV file to the bits going into my DAC, the better. I have the same question for cables: I figure that digital cables either work (get the bits there) or they don't (they lose or change bits).
     Anyway, I am just looking for any links to actual scientific studies (I tried searching the forums here first but could not find anything quantitative, just opinions). Sketches of how such a capture-and-compare test might work follow below.
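
This is roughly the kind of quantitative check post 1 describes. A minimal sketch, assuming the source track and the captured stream have both been saved as uncompressed WAV files with the same sample rate, bit depth, and channel count; `original.wav` and `captured.wav` are placeholder names, and the capture is assumed to start on the same sample as the source (the third sketch below handles the offset). It is a straightforward, unoptimized comparison, not a definitive test procedure.

```python
import wave

def read_pcm(path):
    """Return (WAV parameters, raw PCM payload bytes) for a file."""
    with wave.open(path, "rb") as w:
        return w.getparams(), w.readframes(w.getnframes())

orig_params, orig = read_pcm("original.wav")   # placeholder filenames
capt_params, capt = read_pcm("captured.wav")

# Both files must share the same format for a byte-wise comparison to mean anything.
assert orig_params.framerate == capt_params.framerate
assert orig_params.nchannels == capt_params.nchannels
assert orig_params.sampwidth == capt_params.sampwidth

frame_size = orig_params.nchannels * orig_params.sampwidth
n_frames = min(len(orig), len(capt)) // frame_size

# Count sample frames that differ; a bit-perfect path should report zero.
bad = sum(
    orig[i * frame_size:(i + 1) * frame_size] != capt[i * frame_size:(i + 1) * frame_size]
    for i in range(n_frames)
)
print(f"compared {n_frames} frames, {bad} differ")
print(f"length difference: {abs(len(orig) - len(capt)) // frame_size} frames")
```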
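On post 2's point about timing: a minimal sketch, using Python's standard-library `wave` module, showing that a WAV/PCM file is just a small header (channel count, bit depth, sample rate) followed by the raw sample words. The time of each sample is implied by its position and the sample rate; it is not stored per sample. `example.wav` is a placeholder filename.

```python
import wave

with wave.open("example.wav", "rb") as wav:
    print("channels:     ", wav.getnchannels())            # e.g. 2 for CD audio
    print("sample width: ", wav.getsampwidth(), "bytes")   # 2 bytes = 16 bit
    print("sample rate:  ", wav.getframerate(), "Hz")      # e.g. 44100
    print("frame count:  ", wav.getnframes())

    # The payload is an uninterrupted run of sample values; there are no
    # per-sample timestamps anywhere in the file.
    raw = wav.readframes(wav.getnframes())
    print("payload bytes:", len(raw))   # nframes * channels * sampwidth
```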
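For the capture-and-compare test post 3 asks about, a real loopback capture almost never starts on exactly the same sample as the source, so the comparison needs an alignment step first. A minimal sketch, assuming 16-bit PCM WAV files at roughly 44.1 kHz (placeholder filenames again) and requiring numpy: it estimates the offset with an FFT-based cross-correlation over the opening seconds, then compares the overlapping frames exactly. Note that it says nothing about jitter; jitter is a timing property of the electrical interface and generally has to be measured at the receiver, not recovered from a captured file.

```python
import wave
import numpy as np

def load_samples(path):
    """Load a 16-bit PCM WAV file into an (n_frames, n_channels) int16 array."""
    with wave.open(path, "rb") as w:
        raw = w.readframes(w.getnframes())
        return np.frombuffer(raw, dtype=np.int16).reshape(-1, w.getnchannels())

orig = load_samples("original.wav")   # placeholder filenames
capt = load_samples("captured.wav")

# Estimate the sample offset from the first channel of the opening ~5 seconds
# (44.1 kHz assumed), using an FFT-based cross-correlation.
n = min(len(orig), len(capt), 5 * 44100)
a = orig[:n, 0].astype(np.float64)
b = capt[:n, 0].astype(np.float64)
size = 2 * n
corr = np.fft.irfft(np.fft.rfft(a, size) * np.conj(np.fft.rfft(b, size)), size)
lag = int(np.argmax(corr))
offset = lag if lag < n else lag - size   # positive: capture begins this many samples into the source

# Align the two streams and compare the overlapping frames exactly.
if offset >= 0:
    o, c = orig[offset:], capt            # capture starts inside the source content
else:
    o, c = orig, capt[-offset:]           # capture has extra leading samples
m = min(len(o), len(c))
mismatches = np.count_nonzero(np.any(o[:m] != c[:m], axis=1))
print(f"estimated offset: {offset} samples")
print(f"{mismatches} of {m} aligned frames differ (0 suggests a bit-perfect path)")
```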