Hi all,
I currently have my music collection as a bunch of mp3s, oggs and flacs on a fileserver on my home network. That's fine for me playing them back under Linux, but we have several devices at home (including Windows machines and iPods) that don't support either ogg or flac. I also don't really want to copy the files as flac to my mp3 player (my mobile phone) as they take up a lot of space, so at the moment I transcode them to ogg or mp3. Ultimately I'd like to keep everything as flac, as I have lots of space, so I will be gradually re-ripping stuff to flac.
Because disk space is cheap I had a crazy idea: keep a directory of originals on the network and run a periodic script to build a directory of mp3s or oggs from those originals. This would save time transcoding and would support the various devices on the network better. Has anyone heard of a scheme like this, or does anyone know of software designed to do it?
Ideally the update script would:

1. Create/update the corresponding mp3/ogg file whenever a file in the originals directory changes (i.e. I rip a new CD).
2. Update the metadata in the originals directory whenever the metadata in an mp3/ogg directory changes.
3. Support all file formats in the originals directory (as I won't have only flacs there at the start); I suppose hard-linking mp3s/oggs where relevant is a good option here.
I had an idea to somehow use SCons to "build" the mp3s/oggs from the flac files, but that only covers requirement 1. The other option would be to write a custom Python (or similar) script to do it; a rough sketch of what I mean is below.
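To give an idea, this is the sort of thing I have in mind (untested, the paths are made up, and it assumes an oggenc built with FLAC support on the PATH; it only covers requirements 1 and 3, not the metadata write-back in 2):

    #!/usr/bin/env python
    # Rough sketch of the periodic build script.
    import os
    import subprocess

    ORIGINALS = "/srv/music/originals"   # hypothetical path
    BUILT     = "/srv/music/ogg"         # hypothetical path

    for dirpath, dirnames, filenames in os.walk(ORIGINALS):
        rel = os.path.relpath(dirpath, ORIGINALS)
        outdir = os.path.join(BUILT, rel)
        if not os.path.isdir(outdir):
            os.makedirs(outdir)
        for name in filenames:
            src = os.path.join(dirpath, name)
            base, ext = os.path.splitext(name)
            if ext.lower() == ".flac":
                # Requirement 1: (re)build the ogg if the flac is newer.
                dst = os.path.join(outdir, base + ".ogg")
                if not os.path.exists(dst) or \
                   os.path.getmtime(src) > os.path.getmtime(dst):
                    subprocess.check_call(["oggenc", "-q", "5", "-o", dst, src])
            elif ext.lower() in (".mp3", ".ogg"):
                # Requirement 3: originals that are already lossy just
                # get hard-linked into the built tree, no transcode.
                dst = os.path.join(outdir, name)
                if not os.path.exists(dst):
                    os.link(src, dst)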
My question is: does anyone have any suggestions for existing tools that do something similar, or ideas on how to implement what I want?
JD
Hi there, I can't think of anything off the peg, but if you want this sort of hot-folder arrangement then dnotify could kick off a script that does the transcoding etc. I did something similar to process images from my camera.
The catch with dnotify is that I couldn't make it report which file caused the trigger event, so what I've done is set up the script that gets called (in my case bash, but it could be anything) to rename the files after it has processed them, so they can be ignored on the next pass. Alternatively you could have a "drop" folder that is separate from your "processed" folder, so you aren't trying to reprocess old files every time a new one is dropped in.
Be aware, though, that it is very easy to run into issues when multiple files are dropped at once, because your script will be queued once for each new file; there are ways of changing this behavior.
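For what it's worth, here's roughly the shape of my handler, translated from bash into a Python sketch (untested; the paths are hypothetical and the oggenc call is just a stand-in for whatever processing you actually need):

    #!/usr/bin/env python
    # Sketch of the "drop" / "processed" folder arrangement.
    # dnotify (or a cron job) fires this against the drop directory;
    # files get moved out after processing so a later pass ignores them.
    import os
    import shutil
    import subprocess

    DROP      = "/srv/music/drop"        # hypothetical path
    PROCESSED = "/srv/music/processed"   # hypothetical path

    def process_file(src):
        # Stand-in for the real work: transcode to ogg, writing the
        # result into the processed folder.
        base = os.path.splitext(os.path.basename(src))[0]
        dst = os.path.join(PROCESSED, base + ".ogg")
        subprocess.check_call(["oggenc", "-q", "5", "-o", dst, src])

    for name in os.listdir(DROP):
        src = os.path.join(DROP, name)
        if not os.path.isfile(src):
            continue
        process_file(src)
        # Moving (or renaming) the original is what stops it being
        # reprocessed the next time the script fires.
        shutil.move(src, os.path.join(PROCESSED, name))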