RE: Newbie Question - DNS Failure

2007-01-22 Thread Terry Babbey
 I installed wget on a HP-UX box using the depot package.

Which depot package?  (Anyone can make a depot package.) 
Depot package came from
http://hpux.connect.org.uk/hppd/hpux/Gnu/wget-1.10.2/

Which wget version (wget -V)? 
1.10.2

Built how?
Installed using swinstall

Running on which HP-UX system type?
RP-5405

OS version?
HP-UX B.11.11

 Resolving www.lambton.on.ca... failed: host nor service provided, or
 not known.

   First guess:  You have a DNS problem, not a wget problem.  Can any
other program on the system (Web browser, nslookup, ...) resolve names
any better?
Nslookup and ping work wonderfully. Sorry, I should have mentioned that
the first time.
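[Archive note: nslookup queries the DNS server directly, while wget resolves names through the C library (getaddrinfo()), which follows the resolver order in /etc/nsswitch.conf. A quick sketch to compare the two paths; `getent` is an assumption here — common on modern systems, though an HP-UX 11.11 box may not ship it:]

```shell
# Compare the libc resolver path (the one wget uses) with a direct DNS
# query (the one nslookup uses).  If the first fails while the second
# works, suspect nsswitch/resolver configuration, not DNS itself.
host=localhost   # substitute the real name, e.g. www.lambton.on.ca

# libc path: getaddrinfo(), honouring /etc/nsswitch.conf
if getent hosts "$host"; then echo "libc resolver OK"; else echo "libc resolver FAILED"; fi

# direct DNS query, bypassing nsswitch.conf
if nslookup "$host"; then echo "direct DNS OK"; else echo "direct DNS FAILED"; fi
```

[If the libc path fails while the direct query succeeds, the `hosts:` line in /etc/nsswitch.conf is the first place to look.]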

   Second guess:  If DNS works for everyone else, I'd try building wget
(preferably a current version, 1.10.2) from the source, and see if that
makes any difference.  (Who knows what name resolver is linked in with
the program in the depot?)
Started to try that and got some error messages during the build. I may
need to re-investigate.

   Third guess:  Try the ITRC forum for HP-UX, but you'll probably need
more info than this there, too:

   http://forums1.itrc.hp.com/service/forums/familyhome.do?familyId=117
Thanks, I'll check.



   Steven M. Schweda   [EMAIL PROTECTED]
   382 South Warwick Street        (+1) 651-699-9818
   Saint Paul  MN  55105-2547



Re: Newbie Question - DNS Failure

2007-01-22 Thread Steven M. Schweda
From: Terry Babbey

  Built how?
 Installed using swinstall

   How the depot contents were built probably matters more.

 Second guess:  If DNS works for everyone else, I'd try building wget
  (preferably a current version, 1.10.2) from the source, and see if that
  makes any difference.  [...]
 
 Started to try that and got some error messages during the build. I may
 need to re-investigate.

   As usual, it might help if you showed what you did, and what happened
when you did it.  Data like which compiler (and version) could also be
useful.

   On an HP-UX 11.23 Itanium system, starting with my VMS-compatible kit
(http://antinode.org/dec/sw/wget.html, which shouldn't matter much
here), I seemed to have no problems building using the HP C compiler,
other than getting a bunch of warnings related to socket stuff, which
seem to be harmless.  (Built using CC=cc ./configure and make.)

td176 cc -V
cc: HP C/aC++ B3910B A.06.13 [Nov 27 2006]

And I see no obvious name resolution problems:

td176 ./wget http://www.lambton.on.ca
--23:42:04--  http://www.lambton.on.ca/
   => `index.html'
Resolving www.lambton.on.ca... 192.139.190.140
Connecting to www.lambton.on.ca|192.139.190.140|:80... failed: Connection refused.

td176 ./wget -V
GNU Wget 1.10.2c built on hpux11.23.
[...]

   That's on an HP TestDrive system, which is behind a restrictive
firewall, which, I assume, explains the connection problem.  (At least
it got an IP address for the name.)  And it's not the same OS version,
and who knows which patches have been applied to either system?  And so
on.



   Steven M. Schweda   [EMAIL PROTECTED]
   382 South Warwick Street        (+1) 651-699-9818
   Saint Paul  MN  55105-2547


Re: Newbie Question - DNS Failure

2007-01-20 Thread Steven M. Schweda
From: Terry Babbey

 I installed wget on a HP-UX box using the depot package.

   Great.  Which depot package?  (Anyone can make a depot package.) 
Which wget version (wget -V)?  Built how?  Running on which HP-UX
system type?  OS version?

 Resolving www.lambton.on.ca... failed: host nor service provided, or not
 known.

   First guess:  You have a DNS problem, not a wget problem.  Can any
other program on the system (Web browser, nslookup, ...) resolve names
any better?

   Second guess:  If DNS works for everyone else, I'd try building wget
(preferably a current version, 1.10.2) from the source, and see if that
makes any difference.  (Who knows what name resolver is linked in with
the program in the depot?)

   Third guess:  Try the ITRC forum for HP-UX, but you'll probably need
more info than this there, too:

   http://forums1.itrc.hp.com/service/forums/familyhome.do?familyId=117



   Steven M. Schweda   [EMAIL PROTECTED]
   382 South Warwick Street        (+1) 651-699-9818
   Saint Paul  MN  55105-2547


Newbie Question - DNS Failure

2007-01-19 Thread Terry Babbey
I installed wget on a HP-UX box using the depot package.

 

Now when I run wget it will not resolve DNS queries.

 

wget http://192.139.190.140/ works.

wget http://www.lambton.on.ca/ fails with the following error:

 

# wget http://www.lambton.on.ca

--17:21:22--  http://www.lambton.on.ca/

   => `index.html'

Resolving www.lambton.on.ca... failed: host nor service provided, or not
known.

 

Any help is appreciated.

 

Thanks,

Terry



Terry Babbey - Technical Support Specialist

Information & Educational Technology Department

Lambton College, Sarnia, Ontario, CANADA

 



Re: newbie question

2005-04-14 Thread Jens Rösner
Hi Alan!

As the URL starts with https, it is a secure server. 
You will need to log in to this server in order to download stuff.
See the manual for info how to do that (I have no experience with it).

Good luck
Jens (just another user)


  I am having trouble getting the files I want using a wildcard
 specifier (-A option = accept list).  The following command works fine to
get an
 individual file:
 
 wget

https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/160RDTEN_FY06PB.pdf
 
 However, I cannot get all PDF files this command: 
 
 wget -A *.pdf

https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/
 
 Instead, I get:
 
 Connecting to 164.224.25.30:443 . . . connected.
 HTTP request sent, awaiting response . . . 400 Bad Request
 15:57:52  ERROR 400: Bad Request.
 
I also tried this command without success:
 
 wget

https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/*.pdf
 
 Instead, I get:
 
 HTTP request sent, awaiting response . . . 404 Bad Request
 15:57:52  ERROR 404: Bad Request.
 
  I read through the manual but am still having trouble.  What am I
 doing wrong?
 
 Thanks, Alan
 
 
 



RE: newbie question

2005-04-14 Thread Tony Lewis
Alan Thomas wrote:

 I am having trouble getting the files I want using a wildcard specifier...

There are no options on the command line for what you're attempting to do.

Neither wget nor the server you're contacting understand *.pdf in a URI.
In the case of wget, it is designed to read web pages (HTML files) and then
collect a list of resources that are referenced in those pages, which it
then retrieves. In the case of the web server, it is designed to return
individual objects on request (X.pdf or Y.pdf, but not *.pdf). Some web
servers will return a list of files if you specify a directory, but you
already tried that in your first use case.

Try coming at this from a different direction. If you were going to manually
download every PDF from that directory, how would YOU figure out the names
of each one? Is there a web page that contains a list somewhere? If so,
point wget there.

Hope that helps.

Tony

PS) Jens was mistaken when he said that https requires you to log into the
server. Some servers may require authentication before returning information
over a secure (https) channel, but that is not a given.




Re: newbie question

2005-04-14 Thread Hrvoje Niksic
Alan Thomas [EMAIL PROTECTED] writes:

   I am having trouble getting the files I want using a wildcard
 specifier (-A option = accept list).  The following command works fine to
 get an individual file:
  
 wget
 https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/160RDTEN_FY06PB.pdf
  
 However, I cannot get all PDF files this command:
  
 wget -A *.pdf
 https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/
  
 Instead, I get:
  
 Connecting to 164.224.25.30:443 . . . connected.
 HTTP request sent, awaiting response . . . 400 Bad Request
 15:57:52  ERROR 400: Bad Request.

Does that URL work with a browser?  What version of Wget are you
using?

Using -d will provide a full log of what Wget is doing, as well as the
responses it is getting.  You can mail the log here, but please be
sure it doesn't contain sensitive information (if applicable).  This
list is public and has public archives.

Please note that you also need -r (or even better -r -l1) for -A
to work the way you want it.
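[Archive note: combining the advice above with careful quoting, a corrected command might look like the sketch below. The URL is the one from the thread; the single quotes matter, because `$FILE` and the parentheses would otherwise be mangled by the shell, which could by itself explain a 400/404 response:]

```shell
# -r -l1    : fetch the listing page plus everything it links to, one level deep
# -nd       : (optional extra) don't recreate the server's directory tree locally
# -A '*.pdf': keep only files matching *.pdf, discard the rest after recursion
wget -r -l1 -nd -A '*.pdf' \
  'https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/'
```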


Re: newbie question

2005-04-14 Thread Hrvoje Niksic
Tony Lewis [EMAIL PROTECTED] writes:

 PS) Jens was mistaken when he said that https requires you to log
 into the server. Some servers may require authentication before
 returning information over a secure (https) channel, but that is not
 a given.

That is true.  HTTPS provides encrypted communication between the
client and the server, but it doesn't always imply authentication.


Re: newbie question

2005-04-14 Thread Jens Rösner
Hi! 

Yes, I see now, I misread Alan's original post. 
I thought he would not even be able to download the single .pdf. 
Don't know why, as he clearly said it works getting a single pdf.

Sorry for the confusion! 
Jens

 Tony Lewis [EMAIL PROTECTED] writes:
 
  PS) Jens was mistaken when he said that https requires you to log
  into the server. Some servers may require authentication before
  returning information over a secure (https) channel, but that is not
  a given.
 
 That is true.  HTTPS provides encrypted communication between the
 client and the server, but it doesn't always imply authentication.
 



Re: [unclassified] Re: newbie question

2005-04-14 Thread Alan Thomas
 I got the wgetgui program, and used it successfully.  The commands were
very much like this one.  Thanks, Alan

- Original Message - 
From: Technology Freak [EMAIL PROTECTED]
To: Alan Thomas [EMAIL PROTECTED]
Sent: Thursday, April 14, 2005 10:12 AM
Subject: [unclassified] Re: newbie question


 Alan,

 You could try something like this

 wget -r -d -l1 -H -t1 -nd -N -np -A pdf URL

 On Wed, 13 Apr 2005, Alan Thomas wrote:

  Date: Wed, 13 Apr 2005 16:02:40 -0400
  From: Alan Thomas [EMAIL PROTECTED]
  To: wget@sunsite.dk
  Subject: newbie question
 
  I am having trouble getting the files I want using a wildcard
specifier (-A option = accept list).  The following command works fine to
get an individual file:
 
  wget
https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/160RDTEN_FY06PB.pdf
 
  However, I cannot get all PDF files this command:
 
  wget -A *.pdf
https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/

 --- TekPhreak [EMAIL PROTECTED]
  http://www.tekphreak.com




Re: Newbie question --- using wget for AV def file update - port failure problem

2002-07-22 Thread Steve Bratsberg

that was the fix, thank you
Matt Whimp & Sarah Kemp matt[EMAIL PROTECTED] wrote in message
news:20020720064602.0645af92.matt[EMAIL PROTECTED]...
 On Fri, 19 Jul 2002 11:57:38 -0400
 Steve tapped the following into the keyboard:

  ==> PORT ... Master socket fd 428 bound.
  using port 1342.
 
  -- PORT 192,168,3,159,5,62
 
 
  500 Illegal PORT command.
 
  Invalid PORT.
  Closing fd 432
  Closing fd 428
  Giving up.
 
 
  --
  Steve


 Have you tried the --passive-ftp option?




 -
 Regards

 Matt and Sarah

 Email us on:

 matt[EMAIL PROTECTED]






Re: newbie question

2002-04-13 Thread Hrvoje Niksic

Newer versions of Wget check the server type and adjust the directory
listing parser accordingly.  If I remember correctly, NT directory
listing is now supported.



newbie question

2002-04-12 Thread dbotham

Just when I thought it was safe to start downloading files, I get this:

wget --mirror -v -I/ -X/report,/Software -w1 -gon
ftp://x:[EMAIL PROTECTED]
--11:27:41--  ftp://x:[EMAIL PROTECTED]:21/
   => `64.226.243.208/.listing'
Connecting to 64.226.243.208:21... connected!
Logging in as xx... Logged in!
==> TYPE I ... done.  ==> CWD not needed.
==> PORT ... done.    ==> LIST ... done.

0K - .. ..

11:27:42 (713.53 KB/s) - `64.226.243.208/.listing' saved [16805]

--11:27:43--  ftp://x:[EMAIL PROTECTED]:21/
   => `64.226.243.208/index.html'
==> CWD not required.
==> PORT ... done.    ==> RETR  ...
No such file `'.


FINISHED --11:27:43--
Downloaded: 0 bytes in 0 files


When I look in my local directory the olny thing I see is the '.listing'
file.  Any suggestions?

Thanks,

Dave...




newbie question

2001-05-03 Thread Gavin Burnett

hello.

i'm finding wget really useful for my dial up connection at home, and I
am trying to set it up in work, as there are some web based documents I
would like to store in my home directory. 

I have given the program the following arguments:

wget --background --output-file=/home/burnett/wget.log --tries=30
--server-response --directory-prefix=/home/burnett/download --recursive
--level=0 --domains=redhat.com
http://www.redhat.com/mirrors/LDP/HOWTO/Kernel-HOWTO.html 

however the download fails to run, with the following output in the log
file:

--16:24:11--  http://www.nitpickers.com:80/
   => `/home/burnett/download/www.nitpickers.com/index.html'
Connecting to www.nitpickers.com:80... 
connect: Connection timed out
Retrying.
.
.
.
(program continually retries and fails with the same error)

i've tried enabling and disabling most of the settings (i.e. the proxy,
supplying http passwords, and so forth),
but the problem always stays the same.

any suggestions would be greatly appreciated.

Gavin.