
Google Drive Cloud-Based Streaming to Roon



I've done this. Get rclone, use rclone mount + cache (cuts down on API calls) and you're golden. I did a 12 TB library on GSuite to a server hosted with OVH running Ubuntu. It works the same on macOS.

There are a ton of guides around about how to set up your containers and mount them as a drive. Get familiar with the really basic command line operations required for rclone so you're not chained to a GUI. I always advocate for GUI support, but with something as simple as rclone, you really have no need for one for basic mounting.

For God's sake though, do two things: encrypt the music going to GDrive (use rclone encrypted containers; it's invisible to you but just shows garbage data on Google's side) and BACK UP LOCALLY.
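Roughly, the first-time setup looks like this. Treat it as a sketch: the remote name (gdrive-crypt) and the paths are just placeholders I'm using for illustration, and the real remotes come out of the interactive config.

# One-time setup: create a Google Drive remote, then a crypt remote layered on top of it
rclone config

# Upload the library through the crypt remote so it lands on Drive encrypted
rclone copy /mnt/music gdrive-crypt: --progress

# Verify the upload (cryptcheck knows how to compare local files against an encrypted remote)
rclone cryptcheck /mnt/music gdrive-crypt:

And keep the local copy; the cloud copy is a convenience, not the backup.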

 

 

14 minutes ago, idesign said:

Thank you for the reply. I downloaded ExpanDrive but need to spend more time attempting to configure it. Have you been able to use ExpanDrive with success? And it seems Google's own "Drive File Stream" app would work, and perhaps they will make it available to all users. 

 

https://www.google.com/drive/download/

I tried this years ago with JRiver but the latency was too annoying. 

 

Using ExpanDrive you can mount many cloud services as “local” drives. 

 

One issue I ran into in the past was a requirement by the cloud storage software that there be a local copy as well. This is where ExpanDrive really helped. 

Founder of Audiophile Style | My Audio Systems

2 hours ago, idesign said:

Thank you for the reply. I downloaded ExpanDrive but need to spend more time attempting to configure it. Have you been able to use ExpanDrive with success? And it seems Google's own "Drive File Stream" app would work, and perhaps they will make it available to all users. 

 

https://www.google.com/drive/download/

Neither is a good solution. The 'File Stream' app sucks, and ExpanDrive can hit API limits. 

I encrypt because I'd rather not have Google know what I'm storing; technically it's a business account, and businesses often encrypt their data. There is zero reason not to. It's entirely transparent for the end user. 

 

 

Just man up and use rclone. 

 

rclone container > rclone encrypted container > rclone cache container is the workflow. 
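In rclone.conf terms that chain looks roughly like this. It's a sketch rather than a drop-in config: the remote names are just examples, and the token and obscured passwords are generated for you by rclone config.

# ~/.config/rclone/rclone.conf (illustrative)

[gdrive]
type = drive
scope = drive
# token = ... (filled in by `rclone config`)

[gdrive-crypt]
type = crypt
remote = gdrive:music
filename_encryption = standard
directory_name_encryption = true
# password / password2 = obscured values generated by `rclone config`

[gdrive-cache]
type = cache
remote = gdrive-crypt:
chunk_size = 10M
chunk_total_size = 512G
info_age = 24h

Everything else only ever talks to gdrive-cache:, so the crypt and Drive layers underneath are invisible day to day.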

 

I've used various setups over the years but my best solution so far is as follows:

A local, high-end server running Linux mounts the cache container (with 512 GB of cache); this is shared via SMB to my clients. 
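As a rough sketch of that mount plus the SMB export (mount point, share name, and cache sizing are placeholders, not my exact setup):

# Mount the cache remote on the server (run it as a service in practice;
# --allow-other needs user_allow_other enabled in /etc/fuse.conf)
rclone mount gdrive-cache: /mnt/music --allow-other --daemon

# /etc/samba/smb.conf -- expose the mount to Roon and other clients
[music]
    path = /mnt/music
    read only = yes
    guest ok = yes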

P.S. For those worried about 'limits', I'm comfortably sitting at 300 TB and counting. 

 

24 minutes ago, evedoesaudiothings said:

Neither is a good solution. The 'File Stream' app sucks, and ExpanDrive can hit API limits. 

I encrypt because I'd rather not have Google know what I'm storing; technically it's a business account, and businesses often encrypt their data. There is zero reason not to. It's entirely transparent for the end user. 

 

 

Just man up and use rclone. 

 

rclone container > rclone encrypted container > rclone cache container is the workflow. 

 

I've used various setups over the years but my best solution so far is as follows:

A local, high-end server running Linux mounts the cache container (with 512 GB of cache); this is shared via SMB to my clients. 

P.S. For those worried about 'limits', I'm comfortably sitting at 300 TB and counting. 

 

There’s no free lunch with encryption. The local CPU will be encrypting the content for no reason. It can be resource intensive compared to zero encryption. 

 

I don’t care if Google knows I listen to Britney Spears. 

 

How would the API limits of ExpanDrive affect the end user in this scenario?

Founder of Audiophile Style | My Audio Systems

27 minutes ago, The Computer Audiophile said:

There’s no free lunch with encryption. The local CPU will be encrypting the content for no reason. It can be resource intensive compared to zero encryption. 

 

I don’t care if Google knows I listen to Britney Spears. 

 

How would the API limits of ExpanDrive affect the end user in this scenario?

It's almost CPU-free; I have no issues on any device, even really low-power Pis. 

It's less about Google knowing and more about covering your butt (with no effort required on your part) in case you use, say, 300 TB and they check in and realize you're only storing copyrighted content. 

Well, say you're scraping your library to cache all the album thumbnails. That's going to rack up a *ton* of API calls. In this case the 10 TB a day limit isn't the issue; it's the sheer number of calls required for the transactions that Roon, or whatever software you use, is requesting. The cache layer can intelligently control the number of simultaneous connections and effectively schedule transactions to drastically lower the possibility of hitting an API limit. All API limits reset after 24 hours. 

 

Current limits are:

Upload: 750 GB a day, per user, per IP

Download: 10 TB a day, per user, per IP
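If you want to be extra gentle with the API, rclone has knobs for that too. A rough sketch; the numbers are just guesses at sensible starting points, not something I've tuned:

# Throttle how hard the mount hits the Drive API:
#   --tpslimit      caps HTTP transactions per second
#   --cache-workers limits parallel connections opened by the cache backend
#   --bwlimit       optional bandwidth cap, relevant to the daily upload/download quotas
rclone mount gdrive-cache: /mnt/music \
  --allow-other --daemon \
  --tpslimit 8 \
  --cache-workers 2 \
  --bwlimit 8M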

1 hour ago, evedoesaudiothings said:

It's almost CPU-free; I have no issues on any device, even really low-power Pis. 

It's less about Google knowing and more about covering your butt (with no effort required on your part) in case you use, say, 300 TB and they check in and realize you're only storing copyrighted content. 

Well, say you're scraping your library to cache all the album thumbnails. That's going to rack up a *ton* of API calls. In this case the 10 TB a day limit isn't the issue; it's the sheer number of calls required for the transactions that Roon, or whatever software you use, is requesting. The cache layer can intelligently control the number of simultaneous connections and effectively schedule transactions to drastically lower the possibility of hitting an API limit. All API limits reset after 24 hours. 

 

Current limits are:

Upload: 750 GB a day, per user, per IP

Download: 10 TB a day, per user, per IP

Thanks for the info. The API issue could be a problem. I may try it just to find out.

Founder of Audiophile Style | My Audio Systems


I’ve read that although Google sets those API limits as defaults, you can request an increase if you’re hitting them every day. 

I actually hit the limit every day myself, but similar to evedoesaudiothings, I’ve been too afraid to ask them to raise my API limits, because I don’t want to raise a flag to Google since I have more than 200 TB of music in my GSuite that I’m paying $10/month to store!!  

I don’t bother encrypting anything though. I mostly use it as a cloud backup to my NAS and for all the stuff that I want to have, but not so much that I want to use up my NAS HDDs for it (I only have 80 TB of storage in my NAS). 

I think the best approach is maybe a smaller version of what I do: keep the bulk of your collection on Google Drive and use Synology's backup/sync software to sync the stuff you want to keep local and listen to the most. 

It works well for me. 

1 hour ago, agladstone said:

I’ve read that although Google sets those API limits as defaults, you can request an increase if you’re hitting them every day. 

I actually hit the limit every day myself, but similar to evedoesaudiothings, I’ve been too afraid to ask them to raise my API limits, because I don’t want to raise a flag to Google since I have more than 200 TB of music in my GSuite that I’m paying $10/month to store!!  

I don’t bother encrypting anything though. I mostly use it as a cloud backup to my NAS and for all the stuff that I want to have, but not so much that I want to use up my NAS HDDs for it (I only have 80 TB of storage in my NAS). 

I think the best approach is maybe a smaller version of what I do: keep the bulk of your collection on Google Drive and use Synology's backup/sync software to sync the stuff you want to keep local and listen to the most. 

It works well for me. 

You may be right. The original thinking behind my post was to find a way to abandon my Synology DS718+ setup and utilize my gigabit internet service, but it looks like the technology to stream a lossless music library from a cloud-based service into Roon or Audirvana+ is still a ways off (at least easily and economically). As a collector of rare and out-of-print classical music, TIDAL and Qobuz will never fulfill my needs. I hope this thread continues and more people contribute their solutions and ideas.  

On 12/22/2018 at 1:28 AM, idesign said:

You may be right. The original thinking behind my post was to find a way to abandon my Synology DS718+ setup and utilize my gigabit internet service, but it looks like the technology to stream a lossless music library from a cloud-based service into Roon or Audirvana+ is still a ways off (at least easily and economically). As a collector of rare and out-of-print classical music, TIDAL and Qobuz will never fulfill my needs. I hope this thread continues and more people contribute their solutions and ideas.  

I think you can still do it. From my perspective, you have a large amount of local storage, and that's awesome. There is definitely a place for cloud storage. Back your shit up; right there, that's worth the 10 bucks a month. You could also use it to dump unimportant or infrequently listened-to music and keep that as a secondary library. There are tons of options. 

 

You may also be fine just using an rclone cache mount with Roon, and backing it with a large cache size on an SSD or HDD. 
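For what it's worth, the cache backend lets you put its chunk store wherever you have the room. A rough sketch; the paths and the 500G figure are placeholders, not a recommendation:

# Keep the cache backend's chunk store on a big SSD/HDD and let it grow large
rclone mount gdrive-cache: /mnt/music \
  --allow-other --daemon \
  --cache-chunk-path /mnt/ssd/rclone-chunks \
  --cache-chunk-total-size 500G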

On 12/21/2018 at 11:54 PM, agladstone said:

I’ve read that although google sets those API limits as defaults, you can request them to increase if you’re hitting them everyday. 

I actually hit the limit everyday myself, but similar to evedoesaudiothings , I’ve been too afraid to inquire about requesting they raise my API limits, because I don’t want to raise a flag to Google since I have more than 200TBs of music in my Gsuite that I’m paying $10/ month to store!!  

I don’t bother encrypting anything though, I mostly use it as a cloud backup to my NAS and for all the stuff that I want to have , but not so much that I want to use up my NAS HDD’s for (I only have 80TB’s of storage in my NAS). 

I think best is to maybe do a smaller version of what I do, keep the bulk of your collection on Google Drive and use Synology backup sync SW to sync with the stuff you want to have local and listen to the most ? 

It works well for me. 

 

 

Man, I am very tempted to set up another GSuite account and see if they'll remove the limits. I have to imagine the increased scrutiny would be a bad thing, though? 
