
lazy

  • Posts: 42
  • Joined
  • Last visited
  • Country: United States

Retained

  • Member Title: Freshman Member


  1. Thanks for the recommendation. In looking at the website, it's impressively verbose, which means I can't determine whether it addresses my issue or not. Simply put, based on your experience, can it (as a default action and automatically) copy files from a source to a destination if the file modification dates are different, regardless of anything else? I only want one-way syncing, and I want all actually changed files synced based on file mod times, even if the file size is the same, as it typically is for padded files with just metadata changes. Put another way, in dealing with MS Office files and the like, Goodsync is fine - Office file sizes change based on moon phases and temperature, among many other variables, as far as I can tell, which makes it easy for a simple app like Goodsync to do the right thing by default. But Office files are rarely padded, while many music files are. Foggie or anyone else: any specific experience with mp3 or FLAC files with just metadata changes using Beyond Compare?
  2. I recently learned (the hard way) that Goodsync with its default settings looks only at file size and file mod time to determine what to sync when its extremely time-consuming MD5-recalculation option is not enabled, and as a result it assumes that files of equal size are unchanged regardless of file mod time. Because of that, when a file is actually different, Goodsync will just copy over the mod time instead of copying the file. Which means any small changes to tag fields in a padded song file will be ignored unless and until one runs a sync with the time-consuming option of comparing all MD5 checksums. Since mp3 and FLAC files typically contain padding, this is not good. It seems Goodsync does not provide a way to change its default action, but it gets worse: in response to such files, Goodsync's recommended action is to ignore file mod times and then wipe out detection of them by overwriting mod times when file sizes are the same. While this default action can be defeated by several manual steps each time one syncs, such a solution is obviously untenable. So does anyone have experience with a syncing app that syncs files based on file mod time alone? (A rough sketch of the behavior I'm after appears after this list.)
  3. Chris, I apologize if there is a dedicated thread for equipment review suggestions and I missed it, but it would be great if CA considered reviewing the JBL M2/I-T5000 combo. A few reasons: - It seems awesome from anecdotal comments here and across the internet - It seems no home or high-end audio mag or site has reviewed this speaker or its combo with the bundled DSP/crossover/amp - The combo seems to hit a sweet spot for the high-end, computer-savvy listeners this site appeals to. Given the apparent quality of this combo, it would be very intriguing to see it compared to your TAD speakers, amp, sub, something like Acourate, an appropriate DAC and active crossover. The fact that you'd need to bundle many things to equal the functionality of the JBL combo is an interesting value proposition in itself... I have no personal or professional ties to any of the firms mentioned above. If you are interested and can interest JBL in a review, then encouraging them to trickle down the combo into (a) smaller, cheaper version(s) would be awesome.
  4. Have you considered a Beep? (thisisbeep.com). Not shipping yet, but in a few weeks they will be...
  5. I guess it's my fault for not being clear. I specifically proposed something completely different than a thread war, yet that seems to be the only thing people can think I'm proposing. Sorry for miscommunicating so badly. I withdraw my suggestion completely.
  6. I guess I'll never agree that good science, thoroughness and a thoughtful approach are problems, so we'll just have to agree to disagree on that. My recommendations were never intended to appeal to those who have already made up their mind on everything and are immune to learning. My recommendations stand.
  7. Amir may or may not have done what he stated he did, whatever exactly he did (he never revealed his exact settings and software app version). But all we really know is that he already anchored on his conclusion prior to his purported resampling effort, so it doesn't take a genius to doubt his objectivity. Again, just another reason to go with a more thoughtful, thorough and objective blog approach rather than post-by-post debates in lieu of that.
  8. esldude, thank you for your efforts - no good deed goes unpunished, I guess. But this thread is playing out the same defects as the avsforum approach to this topic: a "science by forum post" method that just doesn't work. Rather than making science a debate via death-by-forum comments, I still think the topic is best served by a thoughtful blog study like those mitchco and archimago have performed. And then everyone can post away opining on the results. I have no agenda other than trying to get to the most accurate and valid results most efficiently and thoughtfully.
  9. esldude, I agree resampling is one of the most likely causes of audible differences. But it would be great if you could explain a bit more about the files you created when downsampling to 44k and then upsampling to 96k. Wouldn't all content above 22k be eliminated by filters in the 44k version, and thus also not exist when upsampled? Or, if the 44k version did not have low-pass filtering, then it would seem content above 22k would have aliasing issues in the 44k version that would also carry into the upsampled version. Any clarification appreciated. (See the resampling sketch after this list.)
  10. So you really think that my appeal for a Mitchco blog post instead of a forum crapshoot meant I really want the opposite? Can you prove that? Can you quote me on how I meant the opposite of what I proposed? And how is investigating 96k/24 vs 44.1k/16 "dysfunctional"? If your real intent was to scare off any thoughtful look at this, then you may have accomplished that already, but I hope not.
  11. I've tried over the last 2 hours to send a PM to Mitchco, but my reward every time has been a notice that I've never tried to do what I've in fact done more than a dozen times. So thus my public appeal: Mitchco, you may or may not be aware of the rather intense interest surrounding Mark Waldrep's desire to informally test 96k/24 songs with verifiable content above 22k, via avsforum per AVS/AIX High-Resolution Audio Test: Take 2 - AVS Forum. The informal test may or may not have resulted in audible differences, but the ability to investigate that reasonably was severely hampered by the "explore via forum posts" approach and all of the dysfunctional behavior that inevitably results from it. The debate thread discussing the test was just shut down because of that dysfunction, but not before a user suggested that you and/or Archimago would be better positioned to investigate whether there was any audible difference, whether any difference amounted to a preference, and if so the source of that preference for 96k/24 over 44.1k/16 for songs with material content above 22k in the original format. Discussion bogged down over whether any difference was an inherent result of the bit depth and sampling rate difference, or a result of intermodulation distortion in playback systems, downsampling issues, or something else. The downsampling application used (an unknown version of Sonic Solutions' Sonic Process with unknown settings) seemed to have issues that warranted conducting a parallel test with a better downsampler, or at least one with a known version and settings, but those issues were ignored in the noise of the forum free-for-all. Taking up an avsforum poster's suggestion to appeal to you, I think your approach (as well as Archimago's) is well suited to an informal but robust look into anything insightful about Waldrep's songs with verifiable content above 22k. The good news is that I think there will be significant interest in whatever you come up with, and since you post and blog here I think you are already prepared for any knee-jerk objections to scientific methods. In short, I think you, perhaps in cooperation with Archimago, could do an outstanding job of looking into "hi-res" versus redbook versions of Waldrep's files in a way that is far better than a forum free-for-all but not as time-consuming as a peer-reviewed published study, and still get at the heart of the matter. Thanks for your efforts so far, and for considering this.
  12. ISPs are already using time-dependent pricing for home usage? News to me. I've never even seen it in the consumer market, much less extensively. A good overview of the emerging bandwidth usage and pricing considerations is http://www.tc.umn.edu/~ssen/papers/IEEEComMag-preprint.pdf
  13. I haven't read all the comments, so I apologize if this has already been covered, but I'd like to suggest a refinement to the cloud thing. I agree that outside of one's home, an internet-based cloud will likely be the long-term delivery basis. But I think bandwidth cost/limits, security, privacy, control and quality-of-service considerations will probably mean that a "home cloud" will also exist, via a NAS or similar centralized, home-located storage device distinct from any phone or computer, for home entertainment delivery of owned files. The "home cloud" will connect to the "internet cloud" for the purposes of backing up or accessing content not locally stored. With the increasing capabilities of home wireless and networking, I think home storage will be increasingly separated from home PCs and phones, but without taxing or being taxed for using internet bandwidth for files one already owns and wants to retain. Another aspect I think will develop related to this is the need or incentive to locally cache desired internet-located content in low-demand hours so that internet pipe owners can more efficiently manage bandwidth. For example, if you can select what you want cached locally on your (NAS-based) home cloud, you could get a bandwidth credit on your monthly internet bill. In other words, I think it'll be increasingly expensive and inefficient for internet providers to provide bandwidth to handle peak demands, and I thus think these companies will develop incentives to locally cache content to make better use of total internet bandwidth and limit peak service degradation.
  14. I voted "yes". I think it's more intellectually honest to take a formal stand in either direction and mean it and enforce it, rather than waste everyone's time debating it, and my vote reflects what I see the majority of frequent site commenters supporting, so I'll go with that. This is one issue where one can't be half-pregnant. As has been stated thousands of times here, dbt'ers can go to HA, and that makes sense to me. Let me be clearer: my vote isn't based on what I believe or don't believe; it's what I think is most aligned with this website and my belief that dissembling is bad and honesty and clarity are good, whatever your (nonviolent) beliefs.
  15. Can you explain what you mean by "re-up on foobar2000"? Given that they've already revised their project based on feedback in the first 24 hours and Peter's comments, don't hesitate to share any concerns in the HA thread so Peter, Spoon et al. can see and respond. Personally, I see any success of the kickstarter project improving the Windows desktop app rather than hampering it. For example, I'd expect that any mobile offering(s) will result in tweaks to the Windows desktop version to work with the mobile app(s).
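
For post 2 above: a minimal sketch, in Python, of the one-way, mod-time-only copy rule described there. This is not how Goodsync or any other particular app actually works; the directory paths and the timestamp tolerance are assumptions for illustration only.

```python
import os
import shutil

def sync_by_mtime(src_dir, dst_dir, tolerance=2.0):
    """One-way sync: copy a file whenever its modification time differs from
    the destination copy's, regardless of file size.
    `tolerance` (seconds) allows for filesystems that round timestamps."""
    for root, _dirs, files in os.walk(src_dir):
        rel = os.path.relpath(root, src_dir)
        target_root = os.path.join(dst_dir, rel)
        os.makedirs(target_root, exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(target_root, name)
            missing = not os.path.exists(dst)
            if missing or abs(os.path.getmtime(src) - os.path.getmtime(dst)) > tolerance:
                shutil.copy2(src, dst)  # copy2 preserves the source mod time

# Hypothetical usage:
# sync_by_mtime("/music/library", "/backup/library")
```

Because shutil.copy2 carries the source mod time over to the destination, a later run sees matching timestamps and skips the file, so padded mp3/FLAC files with tag-only edits are still picked up on the run after they change.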
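For post 9 above: a minimal sketch of the 96k -> 44.1k -> 96k round trip, assuming SciPy's resample_poly rather than the Sonic Solutions tool discussed in the thread. The polyphase resampler applies an anti-alias/anti-image low-pass filter near 22.05 kHz at each step, so ultrasonic content in the original (the 30 kHz tone below is an invented example) does not survive the round trip; skipping that filter on the downsample is what would instead fold such content back down as aliasing.

```python
import numpy as np
from scipy.signal import resample_poly

fs = 96_000
t = np.arange(fs) / fs  # 1 second at 96 kHz
# Audible 1 kHz tone plus a 30 kHz ultrasonic component
x = np.sin(2 * np.pi * 1_000 * t) + 0.1 * np.sin(2 * np.pi * 30_000 * t)

# 96k -> 44.1k -> 96k; 44100/96000 = 147/320, and each step low-pass filters near 22.05 kHz
x_441 = resample_poly(x, up=147, down=320)
x_back = resample_poly(x_441, up=320, down=147)

def spectrum(sig):
    return np.abs(np.fft.rfft(sig)) / len(sig)

freqs = np.fft.rfftfreq(len(x), 1 / fs)
bin_30k = np.argmin(np.abs(freqs - 30_000))
print("30 kHz level, original:  ", spectrum(x)[bin_30k])
print("30 kHz level, round-trip:", spectrum(x_back)[bin_30k])
```

The second printed level should be essentially zero, which is the point of the question in post 9: whatever was above 22k in the 96k original cannot come back after a properly filtered 44.1k intermediate.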