
G_Steven

  1. Thank you Zerung, I think I get it now. In order to use the interface as it was intended, the DAC has to "follow the rules" of the interface even if the "rules" are bad. I guess my question is more philosophical, then: why don't DAC manufacturers read the file directly, like my Alpine car stereo head unit (which truly sucks, by the way) does? I can insert a thumb drive into it and it will play music by reading the file directly; the head unit then becomes responsible for its own timing. In other words, why use the interface in the conventional manner you have described? Why not just use it to transport the file and do the rest of the work internally? Less confused but more disappointed...
  2. Thank you Clay, your description of the problem makes sense as far as it goes, but I still don't understand why the computer has to be in control of the timing. As a thought experiment (reality, actually): I could copy the music to my iPod via USB and play it hours later. Clearly my PC or Mac has no influence on my iPod listening experience unless it sent corrupted data. If Apple can make an iPod so cheaply, complete with its proprietary internal interface and DAC, why can't the audiophile manufacturers re-clock the data by means of a buffer? Still confused...
  3. I would appreciate it if someone could explain to me why the DAC interface matters at all in a well-designed DAC. If I can get a bit-perfect copy across any sort of connection, it would seem the DAC should have the responsibility of buffering the data internally and using it at the EXACT correct moment in time from the buffer, NOT from the interface. I could make the buffer many megabytes in size if I chose, thereby smoothing out several minutes' worth of jitter! Given, then, that each bit has the correct value (and arrives in the right order, of course) and is handed off to the converter at the correct moment from the internal buffer, why should it matter at all what interface is used to transport it? Clearly I'm missing something here...
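For what it's worth, the buffer-then-reclock scheme sketched in the posts above is essentially what asynchronous-mode USB DACs do: the DAC fills a FIFO at the source's jittery pace, then clocks samples out to the converter on its own fixed local clock. A minimal sketch of the idea in Python (the arrival times, period, and prefill depth are made-up illustration values, not any real device's numbers):

```python
def reclock(arrival_times, period, prefill):
    """Clock buffered samples out on a fixed local clock.

    arrival_times: jittery times at which samples reach the DAC's buffer.
    period:        the DAC's own fixed output-clock period.
    prefill:       number of samples buffered before playback starts.
    Returns the (evenly spaced) times each sample goes to the converter.
    """
    # The output clock starts once `prefill` samples are in the buffer.
    start = arrival_times[prefill - 1]
    out = []
    for k, t_arrive in enumerate(arrival_times):
        slot = start + k * period  # set by the DAC's clock alone
        if t_arrive > slot:
            raise RuntimeError(f"buffer underrun: sample {k} arrived late")
        out.append(slot)
    return out

# Unevenly spaced (jittery) arrivals, nominal period 1.0:
arrivals = [0.0, 0.7, 2.4, 2.5, 3.9, 5.1, 5.2, 7.0]
outs = reclock(arrivals, period=1.0, prefill=2)
diffs = [round(b - a, 6) for a, b in zip(outs, outs[1:])]
# diffs is all 1.0: output spacing is exactly one period,
# regardless of how ragged the input timing was.
```

The catch with conventional S/PDIF or adaptive-mode USB is that the source is the clock master, so a finite buffer will eventually over- or under-run unless the DAC can ask the source to adjust its delivery pace, and a feedback channel for exactly that is what asynchronous USB adds.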