On Sat, May 14, 2016 at 8:32 AM, Bram Moolenaar <[email protected]> wrote:
> My first reaction is "well, don't do that then".
>
> The code isn't really prepared for extremely long lines.  It reads some
> parts at a time (8K I believe).  Then channel_collapse() concatenates
> each part to the previous, thus you get lots of allocations of
> increasing size.  You could have a go at making this more efficient.
> Also using strlen() many times is a large amount of overhead here.
>
> Anyway, what would you do with a 100 Mbyte response from a channel?
> You probably just want to avoid that.

The use case here is a plug-in that searches arbitrary folders using
ag/ack/grep. Given that I don't control where users will run this thing,
I can't really stop people from doing crazy things (and having a file
consisting of a single 150 Mbyte line is obviously crazy), but I still
don't want to crash or lock up Vim.

I'm going to see if "raw" mode and my own buffering can avoid these edge
cases. I can scan through chunks (which may be around the 8K ballpark
that you mention), and if they are part of a super-long line above some
threshold that I define, just ignore them, or at least not accumulate
the whole thing into my result set.

-Greg
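
P.S. Roughly what I have in mind, as an untested sketch (the function
names, the threshold, and the ag invocation are just placeholders, and
exit handling / the final partial line are left out):

    " Read raw channel output, split it on newlines ourselves, and drop
    " any line that grows past a limit instead of accumulating it.
    let s:partial = ''        " trailing fragment with no newline yet
    let s:results = []        " matches we actually keep
    let s:skipping = 0        " inside an over-long line we gave up on
    let s:max_line = 32768    " arbitrary threshold

    func! s:HandleRaw(channel, msg)
      let s:partial .= a:msg
      let l:lines = split(s:partial, "\n", 1)
      " The last element is an incomplete line; keep it for the next chunk.
      let s:partial = remove(l:lines, -1)
      for l:line in l:lines
        if s:skipping
          " Tail of a line we already decided to drop; resume normally.
          let s:skipping = 0
          continue
        endif
        if len(l:line) <= s:max_line
          call add(s:results, l:line)
        endif
      endfor
      " If the pending fragment is already too long, stop growing it and
      " ignore everything up to the next newline.
      if len(s:partial) > s:max_line
        let s:partial = ''
        let s:skipping = 1
      endif
    endfunc

    let s:job = job_start(['ag', '--nogroup', 'pattern', '/some/dir'],
          \ {'out_mode': 'raw', 'out_cb': function('s:HandleRaw')})

That way memory use stays bounded by the chunk size plus the threshold,
no matter how long a single line in the searched files happens to be.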
