
JAVA Alive

  • Posts: 2
  • Country: France
  • Member Title: Newbie
  1. Hi, Yes, in fact the FLAC vs WAV test is already an "A" vs "A" test. ;-) Whoever pretends a difference could exist deeply misunderstands what happens during playback on a computer. The only potential difference is the impact of CPU usage on radio-frequency noise and on the power supply. People who want to address this should first invest in a high-quality motherboard (like the ones dedicated to overclocking) and a high-quality power supply, and, most important, undervolt and underclock the CPU. Underclocking and undervolting do far more to reduce radio-frequency noise and power consumption than using 0.5% of the CPU instead of 1.5% (which is roughly the difference between WAV and FLAC).

     Another point I don't understand is why latency (most of the time, the ASIO buffer size) gets such high consideration. It also has absolutely no impact on hi-fi listening. This factor matters only when you synchronize music from the PC with other sources (like a MIDI synth), or when you listen to and record multiple tracks on the computer. You need very low latency in those cases; otherwise you create time gaps between tracks, which can alter the sound (flanger-like artifacts) and even the tempo if the latency is really high. For hi-fi, this has absolutely no impact, and the buffer should be set to the highest value to avoid dropouts.

     Last point: I have a small diagram showing the signal distortion introduced by a DAC + ADC conversion: Sinus - HostingPics.net (free image hosting). On this diagram (purely theoretical, made in Excel) you can see in red a 10 kHz sine sampled at 44.1 kHz. In blue you see a reconstruction of this signal based on a theoretical resample at 44.1 kHz where the clocks are not synchronized (the clock of the blue signal is delayed by half a sample, which is about an 11.3 µs gap). With this half-sample gap, the blue line is easy to calculate: bluesample(n) = (redsample(n) + redsample(n+1)) / 2. You can see in the picture that the zero crossings of the two curves are not aligned; this is normal, it is the 11.3 µs gap. But if you re-align the zeros, you can easily see by eye that the curves do not match at all. With a 96 kHz sample rate the difference would of course be smaller, but I preferred 44.1 kHz to be more "eye friendly". This means the measurement protocol you use generates, even before considering noise and component limitations, a gap between the original samples and the final samples. As a consequence, you have proven that the difference is BELOW -85 or -90 dB (not that it is equal). I am pretty sure that, in fact, there is no difference, except for the potential impact of CPU usage on radio-frequency noise and the power supply.
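The half-sample averaging formula above can be checked numerically. A minimal sketch (my own illustration, not from the original post), assuming an ideal 10 kHz sine at 44.1 kHz: averaging two adjacent samples yields the midpoint value attenuated by cos(pi * F / FS), so the worst-case per-sample error is about 1 - cos(pi * F / FS) of full scale.

```python
import math

FS = 44_100   # sample rate (Hz)
F = 10_000    # sine frequency (Hz)
N = 1_000     # number of reconstructed samples

# red curve: the original sampled sine
red = [math.sin(2 * math.pi * F * n / FS) for n in range(N + 1)]

# blue curve: half-sample-delayed clock modelled as neighbour averaging,
# i.e. bluesample(n) = (redsample(n) + redsample(n+1)) / 2
blue = [(red[n] + red[n + 1]) / 2 for n in range(N)]

# what an ideal converter would capture at the half-sample instants
ideal = [math.sin(2 * math.pi * F * (n + 0.5) / FS) for n in range(N)]

worst = max(abs(b, ).__abs__() if False else abs(b - i) for b, i in zip(blue, ideal))
print(f"worst-case error:        {worst:.4f} of full scale")
print(f"theory 1 - cos(pi*F/FS): {1 - math.cos(math.pi * F / FS):.4f}")
```

With these numbers, both values come out around 0.243, i.e. a roughly 24% amplitude mismatch between the re-aligned curves, which matches what the diagram shows by eye: the interpolated clock-offset model alone distorts the signal far above any -90 dB floor.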
  2. Hi all, and hi mitchco! My first post here; sorry to bump such an old thread, but it's the post that made me come here ;-) Very interesting articles. One thing still bothers me: why is there a difference (even a small one) between WAV and FLAC, or between JRiver and JPlay, when there should be absolutely no difference? I guess this comes from the measurement protocol: the DAC and ADC are not synchronous, and the de-synchronization plus the compensation algorithm must generate noise (even if its primary goal is quite the opposite). I think it would be interesting to make a JRiver vs JRiver measurement to see if it goes below -90 dB.
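The proposed JRiver vs JRiver null test can be simulated in a few lines. A minimal sketch (my own illustration; the 1 kHz test tone and the use of post 1's half-sample averaging formula to model the asynchronous capture clock are my assumptions): subtract the "captured" signal from the "played" one and report the residual in dB.

```python
import math

FS = 44_100   # sample rate (Hz)
F = 1_000     # test tone (Hz); exactly 1000 cycles fit in one second
N = FS        # one second of samples

played = [math.sin(2 * math.pi * F * n / FS) for n in range(N + 1)]

# capture clock offset by half a sample, modelled as neighbour averaging
captured = [(played[n] + played[n + 1]) / 2 for n in range(N)]

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

residual = [c - p for c, p in zip(captured, played)]
floor_db = 20 * math.log10(rms(residual) / rms(played[:N]))
print(f"null-test floor: {floor_db:.1f} dB")
```

This crude model leaves a residual around -23 dB at 1 kHz even with mathematically perfect converters; a real measurement rig compensates for the clock offset far better than neighbour averaging, but the point stands that the compensation scheme, not the player software, can set the measured floor.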