Hi Tomasz!
There are some websites with backslashes instead of slashes in links.
For instance: <a href=photos\julia.jpg>
instead of:   <a href=photos/julia.jpg>
Internet Explorer can repair such addresses.
My own assumption is: It repairs them, because Microsoft
introduced that #censored# way
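If you have already saved such a page, something like this might fix
it locally - an untested sketch; "julia.html" is a made-up file name,
and it blindly rewrites every backslash, so only use it on files where
backslashes occur in links only:
sed 's|\\|/|g' julia.html > fixed.html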
Hi Mike!
on the command line, but if I copy this to, e.g., test.txt and try
wget -i test1.txt
Well, make sure
a) whether you have test.txt or test1.txt
b) you only have URLs in your txt file like http:// - no options
c) you save the txt file with a pure ASCII editor like Notepad
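For example, a minimal test.txt (made-up URLs) would simply be
http://www.example.com/index.html
http://www.example.com/pics/photo.jpg
and then
wget -i test.txt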
Hi Deryck!
As far as I know, wget cannot parse CSS code
(nor JavaScript code, for that matter).
It has been requested often, but so far no one
has tackled this (probably rather huge) task.
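That means a stylesheet reference like this one (made-up example) is
simply invisible to wget during recursive retrieval:
body { background-image: url(images/background.gif); }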
CU
Jens
(just another user)
Hello,
I can make wget copy the necessary CSS files referenced from a webpage
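At least for stylesheets linked directly from the HTML, the
page-requisites option should grab them (untested sketch, made-up URL;
references inside the CSS itself are another matter):
wget -p -k http://www.example.com/page.html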
Hi Jason!
If I understood you correctly, this quote from the manual should help you:
***
Note that these two options [accept and reject based on filenames] do not
affect the downloading of HTML files; Wget must load all the HTMLs to know
where to go at all--recursive retrieval would make no sense otherwise.
***
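So, for example, with something like (made-up URL)
wget -r -A.gif http://www.example.com/
wget still downloads the HTML pages to find the links in them and only
deletes the ones that do not match -A after parsing.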
For me this link does NOT work in
IE 6.0
latest Mozilla
latest Opera
So I tested a bit further.
If you go to the site and reach
http://www.interwetten.com/webclient/start.html
and then use the URL you provide, it works.
A quick check for stored cookies revealed that
two cookies are stored.
So the cookies seem to be the key.
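My guess would be: collect the cookies from the start page first and
then re-use them for the deep link. Untested sketch with the cookie
options of newer wget versions:
wget --save-cookies cookies.txt http://www.interwetten.com/webclient/start.html
wget --load-cookies cookies.txt URL
(with URL being the link you provided)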
Hi!
While I was testing the new wget 1.9 binary for Windows from
http://space.tin.it/computer/hherold/
I noticed that it was slow if a URL specified within -i list.txt
did not exist: it took a long time before wget moved on to the next
URL in the list.
Well, to find out what was happening, I specified -d for debug output.
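Until that is sorted out, capping the wait with the timeout and retry
options might work around it (untested):
wget -T 10 -t 2 -i list.txt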
Hi Samuel!
You The Man!
Pssst! Don't tell everyone! :D
Then I did
wget -nc -x -r -l0 -p -np -t10 -k -nv -A.gif,.htm,.html http://URL
This also worked. Then I began trying to figure out what the
hell was wrong: I added .cfm to the list, which returned an
empty folder; then I tried .shtml
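So at that point the accept list looked something like this (same
command as above, just with .cfm added):
wget -nc -x -r -l0 -p -np -t10 -k -nv -A.gif,.htm,.html,.cfm http://URL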
Hi there!
<td ALIGN=CENTER VALIGN=CENTER WIDTH=120 HEIGHT=120><a
href=66B27885.htm onMouseOver="msover1('Pic1','thumbnails/MO66B27885.jpg');"
onMouseOut="msout1('Pic1','thumbnails/66B27885.jpg');"><img
SRC=thumbnails/66B27885.jpg NAME=Pic1 BORDER=0></a></td>
BTW: it is valign=middle :P
(I detest AllCaps and
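Note that wget does not parse JavaScript, so the MO*.jpg images that
only appear inside those mouseover handlers will never be fetched; as
a crude workaround one could list them explicitly, e.g. (made-up
invocation, URL as placeholder):
wget -x http://URL/thumbnails/MO66B27885.jpg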
Hi Hrvoje!
First, I did/do not mean to offend/attack you,
just in case that my suspicion about you being
pi55ed because of my post is not totally unjustified.
If the HTML code says
<a href=URL yaddayada my-Mother=Shopping%5>goingsupermarket</a>
Why can't wget just ignore everything after
Hi!
Once again I think this does not really belong on the bug list, but there you
go:
I've toyed with the idea of making a flag to allow `-p' span hosts
even when normal download doesn't.
Funny you mention this.
When I first heard about -p (1.7?) I thought that exactly this would
be the default behaviour
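I.e. I expected that something like this (untested, made-up URL)
would fetch the page plus all its requisites, no matter on which host
they sit:
wget -p -H http://www.example.com/page.html
but whether -H affects -p at all seems to depend on the version.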
Hi guys!
Today I found something strange.
First, I am using MS WinME (okok) and Netscape (okok)
I was downloading from audi.com a 3000kB zip file from a page via
right-click
After (!) I had finished downloading I thought: hey, why not use
wGet(GUI) for it.
Smart, huh? One file, already
Hi guys!
Yes, you all are right.
Proxy is the answer. I feel stupid now.
/me goes to bed, maybe that helps! :|
Thanks anyway! :)
Until the next intelligent question :D
CU
Jens
Man, I really hate ads like the following:
--
GMX - The communications platform on the Internet.
Hi Tomas!
I see, but then, how do I exclude things from being downloaded on a per-file basis?
First, let me be a smartass:
Go to
http://www.acronymfinder.com
and look up
RTFM
Then, proceed to the docs of wget.
wget offers download restrictions on
host, directory and file name.
Search in the docs for
-H
-D
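For file names the corresponding switches are -A and -R; a quick
(untested) example that skips all zips during a recursive download:
wget -r -R.zip http://www.example.com/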
-Original Message-
From: Jens Roesner [mailto:[EMAIL PROTECTED]]
Sent: 29 October 2001 23:21
To: Tomas Hjelmberg
Subject: Re: meta noindex
Hi Tomas!
Put
robots = off
in your wgetrc
You cannot use it on the command line if I am not mistaken.
I think it was introduced in 1.7
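(Newer versions can also feed wgetrc commands in on the command line
via the -e switch, so, depending on your build, something like this
might work too:
wget -e robots=off http://www.example.com/
)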
Hi wGetters!
I want to download a subtree of HTML documents from a foreign site
complete with page-requisites for offline viewing.
I.e. all HTML pages from this point downwards, PLUS all the images (etc.)
they refer to -- no matter where they are in the directory tree
This is
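What I have in mind is something like this (untested guess, made-up
URL), using no-parent for the "downwards" part and host-spanning plus
page-requisites for the images:
wget -r -l0 -np -p -H -k http://www.example.com/subtree/index.html
though I fear -H without further -D/-I restrictions may pull in far
more than just the requisites.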
Hi Andreas!
AFAIK wGet has cookie support.
At least the 1.7 I use.
If this does not help you, I did not understand your question.
But I am sure there are smarter guys than me on the list! ;)
CU
Jens
http://www.JensRoesner.de/wGetGUI/
[snip]
Would it make sense to add basic cookie support to
Hi wgetters!
@André
Guys, you don't understand what the OP wants. He needs a
dynamically generated referer, something like
wget --referer 'http://%h/'
where, for each URL downloaded, wget would replace %h by the
hostname.
Well, I understood it this way.
My problem was that I mainly use
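Until something like %h exists, a small shell wrapper could fake it,
one wget call per URL (untested sketch; list.txt holds one URL per
line):
while read url; do
  host=`echo "$url" | sed -e 's|^.*://||' -e 's|/.*||'`
  wget --referer="http://$host/" "$url"
done < list.txt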
Hi wGetters!
I just stumbled over
http://bay4.de/FWget/
Are his changes incorporated into Wget 1.7?
Any opinions on that software?
I think with WinME *yuck* as OS, this is out of the question for me,
but...
CU
Jens
Hi Mengmeng!
Thanks very much, I (obviously) was not aware of that!
I'll see how I can incorporate that (-I/-X/-D/-H) in wGetGUI.
Can I do something like -H -Dhome.nexgo.de -Ibmaj.roesner
http://www.AudiStory.com ?
I'll just give it a try.
Thanks again!
Jens
Hi again!
I am trying to start from
http://members.tripod.de/jroes/test.html
(have a look)
The first link goes to a site I do not want.
The second link goes to a site that should be retrieved.
wget -r -l0 -nh -d -o test.log -H -I/bmaj*/
http://members.tripod.de/jroes/test.html
does not work.
Hi Ian, hi wgetters!
Thanks for your help!
It didn't work for me either, but the following variation did:
wget -r -l0 -nh -d -o test.log -H -I'bmaj*' http://members.tripod.de/jroes/test.html
Hm, did not work for me :( neither in 1.4.5 nor in the newest Windows
binaries version I downloaded from
Hi Ian and wgetters!
Well if you're running it from a DOS-style shell, get rid of the
single quotes I put in there, i.e. try -Ibmaj*
Oh, I guess that was rather stupid of me.
However, the windows version will only work with
-I/bmaj or -Ibmaj.roesner, not with anything like -I/bmaj* or
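So, to sum up what did and did not work for me on Windows (other
builds may behave differently): wildcards in -I seem to be a no-go
there, but plain prefixes are fine, e.g.
wget -r -l0 -H -I/bmaj http://members.tripod.de/jroes/test.html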