I don't know why sort is giving you such problems.  There may be something
unusual about your specific input that it wasn't designed to handle (or it
might simply be a latent bug that has never been identified and fixed).

When I need to sort large files, I split(1) them into smaller pieces,
sort(1) the pieces individually, then use sort(1) with the -m option to
merge the sorted pieces into a single large result file.  This has always
worked reliably for me (and because I was raised using 8-bit and 16-bit
computers, I don't have any special expectations that programs should "just
work" when given very large inputs).

Even if you think doing all this is too much bother, try it just once.
You might be able to identify the specific chunk of your input that's
causing the problem, which would help move us all towards a proper solution
(or at least a caveat in the man page).

-ken

On Sun, Mar 15, 2015 at 9:53 AM, sort problem <sortprob...@safe-mail.net>
wrote:

> Whoops. At least I thought it helped. The default sort with "-H" ran
> for 132 minutes, then said: no space left in /home (which had 111 GBytes
> free before the sort command). And btw, the df command reported free
> space as "-18 GBytes", 104%... what? Some kind of reserved space for
> root?
>
>
> Why does it take more than 111 GBytes to "sort -u" ~600 MBytes of
> files? This is nonsense.
>
>
> So the default "sort" command is a big pile of shit when it comes to
> files bigger than 60 MBytes? .. lol
>
> I can send the ~600 MByte txt files compressed if needed...
>
> I was surprised... sort is a very old command..
>
>
> -------- Original Message --------
> From: "sort problem" <sortprob...@safe-mail.net>
> To: andreas.zeilme...@mailbox.org
> Cc: misc@openbsd.org
> Subject: Re: I found a sort bug! - How to sort big files?
> Date: Sat, 14 Mar 2015 08:39:55 -0400
>
> o.m.g. It works.
>
> Why doesn't sort use this by default on files larger than 60 MBytes?
>
> Thanks!
>
> -------- Original Message --------
> From: Andreas Zeilmeier <andreas.zeilme...@mailbox.org>
> Apparently from: owner-misc+m147...@openbsd.org
> To: misc@openbsd.org
> Subject: Re: I found a sort bug! - How to sort big files?
> Date: Sat, 14 Mar 2015 13:16:05 +0100
>
> > On 03/14/15 12:49, sort problem wrote:
> > > Hello,
> > >
> > > ----------
> > > # uname -a
> > > OpenBSD notebook.lan 5.6 GENERIC.MP#333 amd64
> > > #
> > > # du -sh small/
> > > 663M    small/
> > > # ls -lah small/*.txt | wc -l
> > >       43
> > > #
> > > # cd small
> > > # ulimit -n
> > > 10000000
> > > # sysctl | grep -i maxfiles
> > > kern.maxfiles=1000000000
> > > #
> > > # grep open /etc/login.conf
> > >         :openfiles-cur=100000:\
> > >         :openfiles-cur=1280000:\
> > >         :openfiles-cur=512:\
> > > #
> > > # sort -u *.txt -o out
> > > Segmentation fault (core dumped)
> > > #
> > > ----------
> > >
> > > This is after a minute of running. The txt files have UTF-8 chars
> > > too. A line is at most a few tens of characters long. All the txt
> > > files have UNIX EOLs. There is enough storage, enough RAM, enough
> > > CPU. I'm even trying this as the root user. The txt files are about
> > > ~60 000 000 lines... not a big number... a reboot didn't help.
> > >
> > >
> > >
> > > Any ideas how I can use the "sort" command to actually sort? Please
> > > help!
> > >
> > >
> > >
> > > Thanks,
> > >
> > > btw, this happens on other UNIX OSes too, lol... why do we have the
> > > sort command if it doesn't work?
> > >
> >
> > Hi,
> >
> > have you tried the option '-H'?
> > The manpage suggests this for files larger than 60 MB.
> >
> >
> > Regards,
> >
> > Andi
