The 'slurp' function on a URL dispatches to java.net.HttpURLConnection,
which is pretty primitive as HTTP clients go. If you need to handle slow
sites or large responses, you'll probably be better off with a real HTTP
client library.
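Since slurp just opens the connection with its defaults, one way to cope with slow sites without pulling in a full client library is to configure the underlying connection yourself before reading. A minimal sketch (slurp-with-timeouts is a hypothetical helper name, and the timeout values are placeholders):

```clojure
;; Sketch: slurp a URL with explicit connect/read timeouts by
;; configuring the underlying java.net.URLConnection first.
;; `slurp-with-timeouts` is a made-up helper, not a core fn.
(defn slurp-with-timeouts
  [url-str connect-ms read-ms]
  (let [conn (.openConnection (java.net.URL. url-str))]
    (.setConnectTimeout conn connect-ms) ; fail fast if no connection
    (.setReadTimeout conn read-ms)       ; fail fast on a stalled read
    (with-open [in (.getInputStream conn)]
      (slurp in))))
```

A read that stalls past the timeout then throws java.net.SocketTimeoutException instead of hanging indefinitely.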
-S
Hello,
I apologize if this isn't the right place to post this.
I've seen slurp exhibit some very strange behavior when querying a
particular URL at certain times of day - a call such as:
(slurp "http://support.clean-mx.de/clean-mx/viruses.php?limit=0,150")
will crash.
Is this using REPLy / a lein2-previewX repl, by chance? I fixed a memory
leak in REPLy yesterday that happened in long-running command scenarios,
and could have caused this behavior
(see https://github.com/technomancy/leiningen/issues/691 for details).
- Colin
On Wednesday, July 18, 2012
I've had this issue when the network I was on was heavily overloaded. IIRC I
switched to using the http lib. It seems like at the very least a graceful
failure from slurp would be an improvement.
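Until something changes in core, a thin wrapper gets you that graceful failure. A sketch (try-slurp is a made-up name, and returning nil on failure is just one possible policy):

```clojure
;; Sketch: wrap slurp so IO failures yield nil instead of an
;; uncaught exception. `try-slurp` is a hypothetical helper.
(defn try-slurp
  [url]
  (try
    (slurp url)
    (catch java.io.IOException _ nil)))
```

You might prefer returning an error map, or rethrowing a domain-specific exception, instead of nil.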
Cheers,
'(Devin Walters)