I am looking for a working file output that uses an external lame encoder, so that I can rely on standard Ubuntu packages for MP3 encoding of my stream archive.
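For context, this is roughly how I intend to call it; the playlist path, the archive path pattern and the reopen predicate are only placeholders, not my real script:

radio = mksafe(playlist("/home/marcin/playlist.pls"))

# Archive the stream as MP3. In my real script reopen_when is a
# time-based predicate; { false } is just a stand-in here.
output.file.lame(
  bitrate=128,
  reopen_when={ false },
  "/archive/stream-%Y-%m-%d.mp3",
  radio)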
The definition I wrote looks like this:

def output.file.lame(
  ~id="output.file.lame", ~start=true, ~restart=true, ~append=true,
  ~restart_delay=3, ~description="OCaml Radio!", ~public=true,
  ~dumpfile="", ~lame="lame", ~bitrate=128, ~swap=false,
  ~restart_on_crash=false, ~restart_on_new_track=false,
  ~restart_encoder_delay=3600, ~headers=[], ~perm=438, ~quality=5,
  ~dir_perm=777, ~reopen_when={ false }, location, s)
  # lame's -s option expects the input sample rate in kHz, e.g. 44.1.
  samplerate = get(default=44100, "frame.samplerate")
  samplerate = float_of_int(samplerate) / 1000.
  channels = get(default=2, "frame.channels")
  swap = if swap then "-x" else "" end
  mode =
    if channels == 2 then
      "j"  # Encode in joint stereo.
    else
      "m"
    end
  def lame_p(m)
    "#{lame} -b #{bitrate} -r --bitwidth 16 -s #{samplerate} \
     --signed -q #{quality} -m #{mode} --nores #{swap} -t - -"
  end
  output.file.external(id=id, process=lame_p, start=start, append=append,
                       restart_on_crash=true,
                       restart_encoder_delay=restart_encoder_delay,
                       header=false, perm=perm, dir_perm=dir_perm,
                       reopen_when=reopen_when, location, s)
end

But the output is always resampled to 32 kHz, and the file is never reopened (the reopen_when argument seems to have no effect). What did I do wrong?

Thank you,
Marcin
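P.S. With the defaults above, the process string should expand to roughly

  lame -b 128 -r --bitwidth 16 -s 44.1 --signed -q 5 -m j --nores -t - -

so I would expect lame to be fed 44.1 kHz raw PCM, not 32 kHz.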