From: [EMAIL PROTECTED] On Behalf Of Neil Mitchell
Neil Mitchell wrote:
I will be releasing this function as part of a library shortly
Alistair Bayley wrote:
no! The code was merely meant to illustrate how a really basic
HTTP GET might work. It certainly doesn't deal with a lot of the
additional cases, like redirects and resource moves, and
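The kind of really basic GET being described can be sketched in a few lines. This is an illustration only, using the old `Network` API (`connectTo`): the `mkRequest` helper name and the hard-coded host in `main` are inventions for the example, and, as the quote says, redirects and moved resources are not handled at all:

```haskell
module Main where

import Network (connectTo, withSocketsDo, PortID(PortNumber))
import System.IO

-- Build the request text; pure, so it is easy to inspect and test.
mkRequest :: String -> String -> String
mkRequest host path =
    "GET " ++ path ++ " HTTP/1.1\r\n" ++
    "Host: " ++ host ++ "\r\n" ++
    "Connection: close\r\n\r\n"

-- Open a socket to port 80, send the request, and return the raw
-- response (status line, headers and body, as one lazy String).
basicGet :: String -> String -> IO String
basicGet host path = withSocketsDo $ do
    h <- connectTo host (PortNumber 80)
    hSetBuffering h LineBuffering
    hPutStr h (mkRequest host path)
    hGetContents h

main :: IO ()
main = basicGet "www.haskell.org" "/" >>= putStr
```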
Hi
My standard solution was to invoke wget, but a Haskell solution would
be nicer. For my purpose following redirects etc. isn't required, so
thanks very much for your help. I will be releasing this function as
part of a library shortly, so will be giving you credit for your help!
Good
On Jan 29, 2007, at 11:11, Yitzchak Gale wrote:
hi, i have popped in on this thread before to mention my own extension
to Network.HTTP (http://www.b7j0c.org/content/haskell-http.html,
providing get() and head()).
i would like to thank bjorn for his work on Network.HTTP and echo his
observation that this package needs some work and active
clawsie:
right. there's a loose group of people who want to take on the http
library and turn it into a practical, authoritative version, but it's a lot of
work. Starting with the great code already in HAppS is one option too.
So yes, we need to fix it. There's people to do it. Now we just need
social
Hi
So yes, we need to fix it. There's people to do it. Now we just need
social factors to kick in and make it happen!
We really do! The inability to get a file off a website is quite
depressing, given that the hard bit should be designing an API, but
anyone could do that for openURL in
I've fallen off the pace on this thread so this is a composite reply, mainly
to Bjorn, Brad and Yitzchak...
I would also like to express my gratitude for the work that Bjorn, and all the
others involved, have done on the http library. I certainly appreciated
having it available for use.
I
Daniel McAllansmith wrote:
The cheap and cheerful solution might be to invoke cURL.
Or MissingPy.
The bottom line is that URL loading is not the same as
HTTP. It is higher level. While Haskell does have a nice
HTTP library, it does not have a URL loading library
yet as far as I can see from
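The split Yitzchak draws -- URL loading above, HTTP below -- shows up in the very first step: a URL loader has to take the URL apart before any socket is opened. The pure part of that already exists in `Network.URI` (in the network package); a small sketch, where the helper name `hostAndPath` is an invention for illustration:

```haskell
import Network.URI (parseURI, uriAuthority, uriPath, uriRegName)

-- Reduce an absolute http URL to the (host, path) pair a GET request
-- needs.  Anything that does not parse as an absolute URI gives Nothing.
hostAndPath :: String -> Maybe (String, String)
hostAndPath url = do
    uri  <- parseURI url
    auth <- uriAuthority uri
    let path = if null (uriPath uri) then "/" else uriPath uri
    return (uriRegName auth, path)
```

For example, `hostAndPath "http://www.haskell.org/ghc"` gives `Just ("www.haskell.org", "/ghc")`.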
Hi Daniel,
Adding in
hPutStrLn h "Connection: close\r"
or
hPutStrLn h "Connection: keep-alive\r"
as appropriate should sort that (hPutStrLn supplies the trailing \n itself).
Works like a charm.
This is responding with a 302: the resource has been found, but it is temporarily
at another location, indicated in the response's Location header.
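Following that 302 by hand means reading the status line and fishing the Location header out of the response. A pure sketch of just that step -- the function names are inventions for illustration, header names are compared case-insensitively, and only the status codes listed are treated as redirects:

```haskell
import Data.Char (isSpace, toLower)
import Data.List (isPrefixOf)

-- If the raw response text is a redirect, return the Location value.
redirectTarget :: String -> Maybe String
redirectTarget resp =
    case lines resp of
      (status:hdrs)
        | any (`isPrefixOf` status) redirectStatuses ->
            lookupHeader "location" hdrs
      _ -> Nothing
  where
    redirectStatuses =
        ["HTTP/1.1 301", "HTTP/1.1 302", "HTTP/1.0 301", "HTTP/1.0 302"]

-- Case-insensitive lookup over "Name: value" header lines, with any
-- stray \r stripped from the value.
lookupHeader :: String -> [String] -> Maybe String
lookupHeader name hdrs =
    case [v | h <- hdrs, Just v <- [match h]] of
      (v:_) -> Just v
      _     -> Nothing
  where
    match h = case break (== ':') h of
                (k, ':' : v)
                  | map toLower k == name ->
                      Just (dropWhile isSpace (filter (/= '\r') v))
                _ -> Nothing
```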
Hi Alistair,
Is there a simple way to get the contents of a webpage using Haskell on a
Windows box?
This isn't exactly what you want, but it gets you partway there. Not
sure if LineBuffering or NoBuffering is the best option. Line
buffering should be fine for just text output, but if you
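For the request side, either buffering mode can be made safe by flushing explicitly once the headers are done, which sidesteps the LineBuffering-versus-NoBuffering question entirely. A sketch -- the function name is an invention, and it works on any writable Handle, socket or file:

```haskell
import System.IO

-- Write each header line CRLF-terminated, end with the blank line that
-- terminates the header block, and flush, so the request goes out
-- regardless of the handle's buffering mode.
sendHeaders :: Handle -> [String] -> IO ()
sendHeaders h hdrs = do
    mapM_ (\l -> hPutStr h (l ++ "\r\n")) hdrs
    hPutStr h "\r\n"
    hFlush h
```

Because it is polymorphic in the Handle, the same function can be exercised against a file to see exactly what would be sent over the wire.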
On Sunday 28 January 2007 09:14, Neil Mitchell wrote:
Hi Daniel
Note that I haven't tried this, or the rest of Alistair's code, at all, so the
usual 30-day money-back guarantee doesn't apply. It certainly won't handle
redirects.
Thanks, it certainly gets more things, but has a nasty habit of taking
a very long time in Hugs on certain URLs:
On Sunday 28 January 2007 10:53, Neil Mitchell wrote:
research.microsoft.com/,
Looks like IIS is waiting until it receives a Connection header, a bit of a
variation from the spec I
Greg Fitzgerald wrote:
I'd like to write a very simple Haskell script that when given a URL,
looks up the page, and returns a string of HTML. I don't see an HTTP
library in the standard libs, and the one in Hackage requires Windows
machines have GHC and MinGW to be installed and in the PATH.
Alistair, Neil, Brad, Yitzchak, Bjorn,
Thanks all for your help.
-Greg
On 1/19/07, Björn Bringert [EMAIL PROTECTED] wrote:
Greg Fitzgerald wrote:
I'd like to write a very simple Haskell script that when given a URL, looks
up the page, and returns a string of HTML. I don't see an HTTP library in
the standard libs, and the one in Hackage requires Windows machines have GHC
and MinGW to be installed and in the PATH.
Is there a simple way to
Hi,
I've often wondered the same as the above poster. Something like
readWebPage (in the same style as readFile) would be a really handy
function. Do no libraries provide this?
(if not, can one start providing it? MissingH?)
Thanks
Neil
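No library provides a readFile-style readWebPage at this point, but a minimal version can be layered over the pieces already posted in this thread. A sketch only: `readWebPage` is the hypothetical name from the message above, the old `Network` API is assumed, the headers are left in the result, and there is no redirect or error handling:

```haskell
import Network (connectTo, withSocketsDo, PortID(PortNumber))
import Network.URI (parseURI, uriAuthority, uriPath, uriRegName)
import System.IO

-- Hypothetical readFile-style fetch: takes a full http URL and returns
-- the raw response (status line and headers included).  Illustration
-- only: no redirects, no error recovery, port 80 assumed.
readWebPage :: String -> IO String
readWebPage url =
    case parseURI url of
      Nothing  -> fail ("readWebPage: malformed URL: " ++ url)
      Just uri ->
        case uriAuthority uri of
          Nothing   -> fail ("readWebPage: no host in URL: " ++ url)
          Just auth -> withSocketsDo $ do
            let host = uriRegName auth
                path = if null (uriPath uri) then "/" else uriPath uri
            h <- connectTo host (PortNumber 80)
            hSetBuffering h LineBuffering
            hPutStr h ("GET " ++ path ++ " HTTP/1.1\r\n"
                       ++ "Host: " ++ host ++ "\r\n"
                       ++ "Connection: close\r\n\r\n")
            hGetContents h
```

Usage would then mirror readFile: `readWebPage "http://www.haskell.org/" >>= putStr`.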
On 1/18/07, Alistair Bayley [EMAIL PROTECTED] wrote:
Alistair Bayley wrote:
I'd like to write a very simple Haskell script that when given a URL, looks
up the page, and returns a string of HTML. I don't see an HTTP library in
the standard libs...
Neil Mitchell wrote:
MissingH?
MissingPy.
It would be great to have a full-featured native