On Thu, 2012-08-09 at 16:05 -0400, Nathan Steele wrote:
> OK, I wrote the wget script and it seems to work. I'll try to modify it 
> so it won't care about the date. If I just tell it to get *part%201.wav 
> will that work?

As we say here, "suck it and see". It should, but labelling generated
in Windows environments is sometimes impervious to wildcards.
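One caveat: wget's wildcard globbing only works for FTP, not HTTP, so over HTTP you have to construct the exact filename. A minimal sketch that computes the dated name instead of relying on wildcards; the URL pattern and MMDDYY date format are assumptions taken from the example URL later in this thread, so adjust both to match the real feed:

```shell
#!/bin/sh
# Build each part's URL from today's date rather than a wildcard.
# BASE and the filename pattern are assumptions from the example URL.
BASE="http://www.thechristianrock20.com/CR20shows/NewShow_WAV"
AIRDATE=$(date +%m%d%y)    # e.g. 081212

for PART in 1 2 3 4 5 6; do
  URL="${BASE}/The%20Christian%20Rock%2020%20-%20${AIRDATE}%20-%20Part%20${PART}.wav"
  echo "$URL"
  # Uncomment to actually fetch into the dropbox:
  # wget -O "/home/rd/dropbox/theChristianRockPart${PART}.wav" "$URL"
done
```

If the station posts the files a day early, replace `date +%m%d%y` with `date -d 'next monday' +%m%d%y` (GNU date) or whatever offset suits your schedule.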

> 
> I guess the way to make this work is that each part has to be a single 
> cut in its own cart? Then I can modify the title wget saves it as to 
> include the cart number, and the dropbox can get the cart number from the 
> file name and overwrite the cut each week? I've got several dropboxes 
> running, but this is the first time I've had to do a multipart show. I 
> assume for scheduling purposes it is also best to have each part be its 
> own cart. Seems the best way to me.


RD offers 1001 ways to do this

I would save the file with something that tells you the generation date
or air date so you know the file is the latest one.

Then rdcatch can be used to cart the file, deleting the existing cut,
some suitable time after the download has run.

If you use the dropbox feature in rdadmin you have the potential for a
slow download to be carted when it's part way through, because the
routine that checks whether the file is still being downloaded can be
fooled by a pause. A slow local network can do this.

Belt and braces is to run the wget script at hour 0 and run the rdcatch
event at hour 0 + 1. As long as both are before air time it just
happens.
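A minimal crontab sketch of that timing (the script path and hours are placeholders; the matching rdcatch event is then set up in the RDCatch GUI an hour later):

```shell
# m h dom mon dow  command
0 6 * * 1 /home/rd/bin/get_show.sh >> /var/log/get_show.log 2>&1
# ...then schedule the RDCatch "cart" event for 07:00 the same day,
# giving the download a full hour's head start before it is carted.
```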

OR

You can call rdimport in the script at the end of the download

rdimport --verbose --to-cart=xxxx --delete-cuts [GROUP] /home/rd/dropbox/progTitle.mp3

You can get rdimport to use the filename as the title of the cart and
the cut, so you can see it's the right episode.
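Putting the two together, a hypothetical wrapper script (the cart number 1234, group SHOWS, URL, and paths are all placeholders to substitute): download first, then call rdimport directly, so the half-downloaded-file race never arises.

```shell
#!/bin/sh
# Hypothetical download-then-import wrapper. Cart number, group name,
# URL and destination path are placeholders, not real values.
DEST="/home/rd/dropbox/progTitle.wav"
URL="http://www.example.com/show%20-%20Part%201.wav"
IMPORT="rdimport --verbose --to-cart=1234 --delete-cuts SHOWS $DEST"

if [ "${DRYRUN:-1}" = "1" ]; then
  # Dry run (the default here): just show what would be executed.
  echo "wget -O $DEST $URL"
  echo "$IMPORT"
else
  # Only import if the download completed successfully; --delete-cuts
  # clears last week's audio before the new cut is added.
  wget -O "$DEST" "$URL" && $IMPORT
fi
```

Run it with `DRYRUN=0` from cron once you've checked the printed commands look right.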

rdimport has all the options you might need


regards

Robert

> 
> 
> Thanks again for the help,
> 
> Nathaniel C. Steele
> Assistant Chief Engineer/Technical Director
> WTRM-FM / TheCrossFM
> 
> On 8/8/2012 6:46 PM, Robert Jeffares wrote:
> > I use wget to get around this problem where URLs don't work with Linux
> >
> > this one has more spaces than most
> >
> > wget -O /home/rd/dropbox/theChristianRockPart1.wav http://www.thechristianrock20.com/CR20shows/NewShow_WAV/The%20Christian%20Rock%2020%20-%20081212%20-%20Part%201.wav
> >
> >
> > will put it in your dropbox with a useable label. It works here
> >
> > rdcatch is a bit precious about spaces
> >
> > to get all segments write a script with 6 wget events. With luck some
> > wildcards will get around the date, or you can get the date included in
> > the download label.
> >
> > Then have rdcatch cart it in time for broadcast from the dropbox, have
> > it delete the source, and you won't have a problem with two weeks'
> > progs in the same dropbox.
> >
> > If you want a hand with the script let me know.
> >
> > regards
> >
> > Robert Jeffares
> >
> > On Wed, 2012-08-08 at 18:10 -0400, Fred Gleason wrote:
> >> On Aug 8, 2012, at 17:57, Nathan Steele wrote:
> >>
> >>> I suspect it is the spaces in the filename throwing it off but I tried
> >>> adding %20 and that just added 0 instead of a space...if i use the
> >>> spaces it looks like the correct url in /var/log/messages...
> >> What RD version are you using?  The v2.1.4 update fixed a raft of problems 
> >> involving URLs with 'unusual' characters.  Earlier versions are likely to 
> >> exhibit significant flakiness with URLs containing spaces.
> >>
> >> Cheers!
> >>
> >>
> >> |-------------------------------------------------------------------------|
> >> | Frederick F. Gleason, Jr. |               Chief Developer               |
> >> |                           |               Paravel Systems               |
> >> |-------------------------------------------------------------------------|
> >> |          A room without books is like a body without a soul.            |
> >> |                                         -- Cicero                       |
> >> |-------------------------------------------------------------------------|
> >>
> >> _______________________________________________
> >> Rivendell-dev mailing list
> >> [email protected]
> >> http://lists.rivendellaudio.org/mailman/listinfo/rivendell-dev
> >
> >
> >
> >
> 


