I seem to recall that ages ago someone posted some ideas (or a hint
maybe?) which included an HTTP retriever written in bash (using its
socket interface).  IIRC it was called bget (or maybe bashget?...). Its
purpose was to provide something akin to wget that is usable right
after completing LFS, so that one can fetch further packages and get
started with BLFS.

I don't have a Linux box handy, but bash's /dev/tcp pseudo-device can
be used to make TCP connections (it isn't a real file; bash handles
the path itself when you redirect to it).

/dev/tcp/localhost/80 will connect to localhost on port 80.
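As a quick sanity check, the same pseudo-device can be used to probe whether a port answers at all. A minimal sketch (localhost:80 is just an example; the path form is /dev/tcp/HOST/PORT):

```shell
# Probe a TCP port using bash's /dev/tcp pseudo-device.
# The exec runs in a subshell so the fd is closed again immediately.
host=localhost port=80
if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
    status=open
else
    status=closed
fi
echo "$host:$port is $status"
```

Either branch is fine; the point is that a failed connect is an ordinary shell error you can test for.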

So (if you're game to write your own script):

exec 3<> /dev/tcp/www.google.com/80 will open a TCP connection to
www.google.com on port 80, readable and writable via file descriptor 3.

Now, the below uses the basic HTTP protocol to send and receive data
over that descriptor (this is all from the top of my head, so checking
the actual RFC and a bit of googling would be in order).

printf "%s HTTP/1.0\r\n" "GET /" >&3
printf "\r\n" >&3

while read -r LINE <&3 ; do
        printf "%s\n" "$LINE"
done

This script connects to google, GETs the home page and outputs the
HTML line by line on the screen.
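Putting those pieces together, a bget-style helper might look like the following. This is a sketch only: bget is just an illustrative name, and there is no error handling, redirect following, or HTTPS.

```shell
# A minimal wget-like fetcher in pure bash (illustrative sketch).
# Opens a TCP connection, sends an HTTP/1.0 GET, prints the raw response.
bget() {
    local host=$1 path=${2:-/}
    exec 3<>"/dev/tcp/$host/80" || return 1
    printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "$path" "$host" >&3
    cat <&3                     # dump headers and body as they arrive
    exec 3>&-                   # close the connection
}
```

Usage would be something like `bget www.google.com / > page.html` (note the response headers still need stripping).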

Now, the real catch here is MULTIPART data, as you will have to parse
this first. (I couldn't tell you how to do this in bash).
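That said, splitting the headers from the body is doable in plain bash: the headers end at the first empty line, and each line arrives CRLF-terminated. A sketch (http_body is a hypothetical helper name; it reads a response on stdin and prints only the body):

```shell
# Strip HTTP response headers, printing only the body.
# Headers end at the first empty line; lines arrive CRLF-terminated.
http_body() {
    local line in_body=0
    while IFS= read -r line; do
        line=${line%$'\r'}              # drop the trailing CR
        if (( in_body )); then
            printf '%s\n' "$line"
        elif [[ -z $line ]]; then
            in_body=1                   # blank line: headers are done
        fi
    done
}
```

Note this is line-oriented, so it is only safe for text responses; binary tarballs would need something byte-safe.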

Anyway, it's something to get you started, if you really want to go this route.

Cheers
Jeeva
--
http://linuxfromscratch.org/mailman/listinfo/blfs-support
FAQ: http://www.linuxfromscratch.org/blfs/faq.html
Unsubscribe: See the above information page