I think Kris once did a PUTFILES REXX that implemented one or two of the
common techniques.

When your records are sorted on the output file name, you can use a sipping
pipeline that takes all records for the same output file in each sip.
Something like:
   do forever
      'peekto line'
      parse var line tag .
      'callpipe *: | pick to w1 /== /'tag'/ | >' tag 'output a'
   end
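
Outside CMS Pipelines, the sorted-input version of this is a plain
group-by-key pass. A rough Python sketch of the same idea (assumptions: the
key is the first blank-delimited word, and an in-memory dict stands in for
one output file per key):

```python
import itertools

def split_sorted(lines, name_for):
    """Split pre-sorted records into one bucket per key.

    Each groupby run is one 'sip': a consecutive stretch of records
    sharing the same key, like PICK TO w1 /== /tag/.
    """
    out = {}
    for key, group in itertools.groupby(lines, key=name_for):
        # Overwrites on a repeated key, just as '>' would replace the file;
        # sorted input guarantees each key occurs in one run only.
        out[key] = list(group)
    return out

records = ["a 1", "a 2", "b 3"]
print(split_sorted(records, lambda r: r.split()[0]))
```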

When the records are not sorted (or can't be sorted because you are
processing live data), you could try appending to files, but it gets nasty
to remember which ones to erase first. So the popular approach in that case
is a recursive pipeline that diverts the records for each output file as
each new name is seen. Like this:

   do forever
      'peekto line'
      parse var line name .
      'addpipe (end \) *.input: | p: pick w1 == /'name'/ | >' name 'output a',
             '\ p: | *.input:'
   end
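
For comparison outside Pipelines: the same diversion effect for unsorted
input is to open an output per key lazily, the first time that key appears.
A minimal Python sketch (same assumptions as before: key is the first word,
an in-memory dict stands in for the open output files):

```python
def split_unsorted(lines):
    """Route each record to a per-key bucket, creating the bucket the
    first time its key is seen -- no prior knowledge of key values needed."""
    outputs = {}  # key -> list of records (stand-in for an open output file)
    for line in lines:
        key = line.split()[0]
        outputs.setdefault(key, []).append(line)
    return outputs

print(split_unsorted(["b 1", "a 2", "b 3"]))
```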

Sir Rob the Plumber

On Tue, 12 Oct 2021 at 14:56, Rich Smrcina <[email protected]> wrote:

> Is there a way to split a file into multiple files with the output names
> being a part of the input?
>
> Given something like this:
>
> 1 2 name1 3 4
> 3 4 name2 4 5
> 4 5 name2 5 6
> 5 6 name3 6 7
>
> Ignore the numbers, they just represent other fields. I would like to split
> that into files with the names of the third field. I won't necessarily know
> the values, so a large 'if' type structure won't work.
>
> Thanks!
>
> --
> Rich Smrcina
>
