Micah,
But most importantly, what will the new name of Wget be?
To: Alan Thomas [EMAIL PROTECTED]; Liz Labbe
[EMAIL PROTECTED]; wget@sunsite.dk
Sent: Friday, February 01, 2008 3:13 PM
Subject: RE: wget running in windows Vista
OK - ignore what I said -
I've been immersed in Vista security these days, so I thought there might
be some issues with wget and UAC.
What version of wget? What edition of Vista?
I have used wget 1.10.2 on Vista before.
Alan
- Original Message -
From: Christopher G. Lewis [EMAIL PROTECTED]
To: Liz Labbe [EMAIL PROTECTED]; wget@sunsite.dk
Sent: Thursday, January
What is wget2? Any plans to move to Java? (Of course, the latter
will not be controversial. :)
Alan
How about:
-- JustGetIt?
-- GoFetch?
-- SuckItUp?
-- Hoover?
-- DragNet?
-- NetSucker?
Sorry for the misunderstanding. Honestly, Java would be a great language
for what wget does. Lots of built-in support for web stuff. However, I was
kidding about that. wget has a ton of great functionality, and I am a
reformed C/C++ programmer (or a recent Java convert). But I love using
Quotation marks around the text containing special characters should
work in Windows batch files.
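As a sketch of why the quotes matter (the URL here is hypothetical; the same idea applies in `cmd.exe` batch files, where `&` is also a command separator):

```shell
# Hypothetical query URL. Unquoted, the shell treats '&' as a command
# separator, so everything after the first '&' is cut off the argument.
URL='http://example.com/search?term=wget&lang=en'

# Quoted, the full string is passed through as a single argument,
# e.g. wget "$URL"
echo "$URL"
# prints http://example.com/search?term=wget&lang=en
```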
- Original Message -
From: Tony Godshall [EMAIL PROTECTED]
To: Uma Shankar [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Sunday, November 11, 2007 12:48 AM
Subject: Re: Need help with wget
features and plans for good
improvements. While it won't yet meet this one need, I will certainly
continue to use it for other purposes.
Thanks, Alan
- Original Message -
From: Micah Cowan [EMAIL PROTECTED]
To: Alan Thomas [EMAIL
command.com
By the way, Josh's and your messages are being put out to the list in
duplicates (at least, that's what I'm seeing on my end).
- Original Message -
From: Micah Cowan [EMAIL PROTECTED]
To: Alan Thomas [EMAIL PROTECTED]
Cc: wget@sunsite.dk
Sent: Thursday, September 06, 2007 9
Please ignore. It needed the \\, like Josh said.
- Original Message -
From: Alan Thomas [EMAIL PROTECTED]
To: Josh Williams [EMAIL PROTECTED]; wget@sunsite.dk
Sent: Thursday, September 06, 2007 9:25 PM
Subject: Re: wget syntax problem ?
Wget does not like my use of the --directory-prefix= option. Anyone know
why?
- Original Message -
From: Josh Williams [EMAIL PROTECTED]
To: Alan Thomas [EMAIL PROTECTED]
Cc: wget@sunsite.dk
Sent: Thursday, September 06, 2007 8:53 PM
Subject: Re: wget syntax problem ?
On 9/6/07, Alan
This seems to work up to and including 259 characters in the filename
(not counting the file extension) on Windows (98).
Alan
- Original Message -
From: Alan Thomas [EMAIL PROTECTED]
To: wget@sunsite.dk
Sent: Monday, April 02
I am using wget 1.10.2 on a Windows 98 machine. I would like to
non-interactively query the U.S. patent database. I am using the following
wget command:
wget --convert-links --directory-prefix=C:\Program Files\wget\perimeter
--no-clobber
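Since the prefix contains a space, the shell splits it into two arguments at "Program Files" unless it is quoted. A sketch of the quoted form (the path is from the command above; the URL is a placeholder):

```shell
# Quoting keeps "C:\Program Files\wget\perimeter" as a single argument.
wget --convert-links --no-clobber --directory-prefix="C:\Program Files\wget\perimeter" "<URL>"
```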
Substituting %3E for > does not work either.
- Original Message -
From: Alan Thomas
To: wget@sunsite.dk
Sent: Saturday, March 31, 2007 3:23 AM
Subject: Special characters in http
I am using wget 1.10.2 on a Windows 98 machine. I would like to
non-interactively query
Steven,
Thanks! Putting quotes around the http request worked.
Alan
- Original Message -
From: Steven M. Schweda [EMAIL PROTECTED]
To: WGET@sunsite.dk
Cc: [EMAIL PROTECTED]
Sent: Saturday, March 31, 2007 8:27 AM
Subject: Re: Special characters in http
From: Alan
Forgive me, as I have no experience with Unix, and this might be the
source of my problem. . . . How do I specify that I only want, e.g., HTML files
that begin with a certain letter from a directory?
Putting /l*.htm at the end of the URL did not work:
Warning: wildcards not
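The truncated warning is wget pointing out that wildcards are only supported for FTP URLs. For HTTP, a common workaround (a sketch, assuming wget 1.10.x flags; the URL is a placeholder) is a shallow recursive fetch with an accept pattern:

```shell
# Recurse one level (-l1), stay below the starting directory (-np), and
# keep only files matching l*.htm. HTML pages may still be downloaded
# temporarily for link extraction, then deleted.
wget -r -l1 -np -A "l*.htm" "http://example.com/somedir/"
```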
to do with the
characters in the filename, which you mentioned.
Thanks, Alan
- Original Message -
From: Steven M. Schweda [EMAIL PROTECTED]
To: WGET@sunsite.dk
Cc: [EMAIL PROTECTED]
Sent: Tuesday, March 13, 2007 1:23 AM
Subject: Re: Question re web link conversions
From: Alan Thomas
I am using the wget command below to get a page from the U.S. Patent
Office. This works fine. However, when I open the resulting local file with
Internet Explorer (IE), click a link in the file (go to another web site) and
the click Back, it goes back to the real web address
Oh. For some reason, I thought this was just a logfile. Thanks,
Alan
- Original Message -
From: Steven M. Schweda [EMAIL PROTECTED]
To: WGET@sunsite.dk
Cc: [EMAIL PROTECTED]
Sent: Saturday, March 10, 2007 11:44 PM
Subject: Re: Naming output file
From: Alan Thomas
I would like the output files from a wget command to be stored in a
subdirectory (output) of the current directory. To do this, I used the
directory-prefix option, but it does not seem to like my syntax. It gives me
the error missing URL. The variations I tried for the option are:
Never mind. The correct syntax is
--directory-prefix=c:\output\
However, the error was due to a Return vice space before the URL. Duh!
Alan
- Original Message -
From: Alan Thomas
To: wget@sunsite.dk
Sent: Saturday, March 10, 2007 9:07 PM
Subject: Syntax
Is there a way to tell wget how to name an output file (i.e., not what it
is named by the site from which I am retrieving)?
Thanks, Alan
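For the record, wget does have an option for exactly this: -O (--output-document) writes the download to a name of your choosing. A minimal sketch with a placeholder URL:

```shell
# Save the page as results.html regardless of the remote file name.
wget -O results.html "http://example.com/generated?id=12345"
```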
I see now that I should not have had the single quotes (') around
country=US.
- Original Message -
From: Tony Lewis
To: 'Alan Thomas' ; wget@sunsite.dk
Sent: Thursday, February 22, 2007 8:59 PM
Subject: RE: php form
The table stuff just affects what's shown
, but rather all of the
data without that filter (from all countries) came back. Also, this data was
in a .php file.
Do you know what I am doing wrong? Thanks, Alan
- Original Message -
From: Tony Lewis
To: 'Alan Thomas' ; wget@sunsite.dk
Sent: Thursday
post to the php?
Thanks, Alan
- Original Message -
From: Tony Lewis
To: 'Alan Thomas' ; wget@sunsite.dk
Sent: Thursday, February 22, 2007 12:53 AM
Subject: RE: php form
Look for <form action="some-web-page" method="XXX" ...>
action tells
There is a database on a web server (to which I have access) that is
accessible via username/password. The only way for users to access the
database is to use a form with search criteria and then press a button that
starts a php script that produces a web page with the results of the
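A hedged sketch of the usual non-interactive approach: read the field names out of the form's HTML and send them with wget's --post-data option (the field names, credentials, and URL below are hypothetical):

```shell
# POST the search fields directly to the script named in the form's
# action attribute; --http-user/--http-password cover the case where
# the site uses HTTP authentication. For method="get" forms, append
# the fields to the URL after a '?' instead.
wget --http-user=USER --http-password=PASS \
  --post-data="country=US&submit=Search" \
  "http://example.com/search.php"
```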
Thanks!
- Original Message -
From: Hrvoje Niksic [EMAIL PROTECTED]
To: Alan Thomas [EMAIL PROTECTED]
Cc: wget@sunsite.dk
Sent: Friday, April 29, 2005 6:16 AM
Subject: Re: local HTML files
Alan Thomas [EMAIL PROTECTED] writes:
Can I somehow give wget an HTML file's local hard disk
Can
I somehow give wget an HTML file's local hard disk location vice a URL and have
it retrieve files at URLs referenced in that HTML file?
Thanks, Alan
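One way to do this (a sketch, assuming wget 1.10.x flags; the file name and base URL are placeholders) is -i plus --force-html, which makes wget read URLs out of a local HTML file:

```shell
# Treat saved-page.html as HTML, extract the links it references, and
# download them; --base supplies the origin for resolving relative links.
wget --force-html --base="http://example.com/" -i saved-page.html
```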
It doesn't seem to download the file when I use the debug option. It just quickly says "finished."
Hrvoje Niksic [EMAIL PROTECTED] wrote:
Alan Thomas <[EMAIL PROTECTED]> writes:
The log file looks like:
17:54:41 URL:https://123.456.89.01/blabla.nsf/HOBART?opeNFRAMESET [565/565] - "123.456.89.01/blabla.nsf/HOBART?opeNFRAMESET.html" [1]
FINISHED --17:54:41--
Downloaded: 565 bytes in 1 files
Hrvoje Niksic [EMAIL PROTECTED] wrote:
Alan Thomas <[EMAIL PROTECTED]&
That's probably it. Is there anything I can do to automatically get the files with wget? Thanks, Alan
- Original Message -
From: "Hrvoje Niksic" [EMAIL PROTECTED]
To: "Alan Thomas" [EMAIL PROTECTED]
Cc: wget@sunsite.dk
Sent: Friday, April 15, 2005 7:23 PM
Subject: [sp
I use Internet Explorer. I disabled Active Scripting and Scripting of Java Applets, but I can still access this page normally (even after a restart).
A website uses frames, and when I view it in
Explorer, it has the URL https://123.456.89.01/blabla.nsf/HOBART?opeNFRAMESET and
a bunch of PDF files in two of the frames.
When I try to recursively download this web site, I don't get the files. I
am using the following command:
wget -nc
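The original command is truncated above; a hedged sketch of a recursive invocation that also descends into the frame sub-pages and keeps the PDFs (the depth and accept list are guesses, not the original flags):

```shell
# Follow links (including <frame src=...>) two levels deep, stay below
# the start URL (-np), and keep only the PDF files (-A pdf).
wget -r -l2 -np -A pdf "https://123.456.89.01/blabla.nsf/HOBART?opeNFRAMESET"
```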
I got the wgetgui program, and used it successfully. The commands were
very much like this one. Thanks, Alan
- Original Message -
From: Technology Freak [EMAIL PROTECTED]
To: Alan Thomas [EMAIL PROTECTED]
Sent: Thursday, April 14, 2005 10:12 AM
Subject: [unclassified] Re