Re: Maybe a bug

2006-07-21 Thread gentoo
On Sat, 22 Jul 2006, eduardo martins wrote:

> hxxp://vp.video.google.com/videodownload?version=0&secureurl=swAAAEyWHDQ1BZdFGnJOurGFQ8XzwUdnC05S7sJSVvYH2QieipSgdZMMjfCy6CMF4XCGLAuqXc6egyRSj4rckwDLEC5i7VNUeJiDFMb-6UzrQcsYT4Y_hWfCGMxVBi9C2AMCuIwO2AmgoQ39OqHp6HglLe905loQ8H5ZMjC4KAB8J4xKeJim-uYnNL1d6RFDhhbXZzj3xRfgOiY5b2-KD10kcEbhP6laPI3wXNJd67SJZvxndSgNmVaBCPAsnUgyZ6XCaw&sigh=8WvmgMeDxcT4FlWINnl7uz3KDk4&begin=0&len=464629&docid=-318185662095345697
> 
> Wget doesn't get it... but Opera and also IE get the file. Is there a way
> to fix it?

*** It works for me with GNU Wget 1.10.2. Paste the error message from wget 
into the list, then someone can help you.


wget -c -Y 0 -O aaa 
"http://vp.video.google.com/videodownload?version=0&secureurl=swAAAEyWHDQ1BZdFGnJOurGFQ8XzwUdnC05S7sJSVvYH2QieipSgdZMMjfCy6CMF4XCGLAuqXc6egyRSj4rckwDLEC5i7VNUeJiDFMb-6UzrQcsYT4Y_hWfCGMxVBi9C2AMCuIwO2AmgoQ39OqHp6HglLe905loQ8H5ZMjC4KAB8J4xKeJim-uYnNL1d6RFDhhbXZzj3xRfgOiY5b2-KD10kcEbhP6laPI3wXNJd67SJZvxndSgNmVaBCPAsnUgyZ6XCaw&sigh=8WvmgMeDxcT4FlWINnl7uz3KDk4&begin=0&len=464629&docid=-318185662095345697";
--08:41:11--  
http://vp.video.google.com/videodownload?version=0&secureurl=swAAAEyWHDQ1BZdFGnJOurGFQ8XzwUdnC05S7sJSVvYH2QieipSgdZMMjfCy6CMF4XCGLAuqXc6egyRSj4rckwDLEC5i7VNUeJiDFMb-6UzrQcsYT4Y_hWfCGMxVBi9C2AMCuIwO2AmgoQ39OqHp6HglLe905loQ8H5ZMjC4KAB8J4xKeJim-uYnNL1d6RFDhhbXZzj3xRfgOiY5b2-KD10kcEbhP6laPI3wXNJd67SJZvxndSgNmVaBCPAsnUgyZ6XCaw&sigh=8WvmgMeDxcT4FlWINnl7uz3KDk4&begin=0&len=464629&docid=-318185662095345697
   => `aaa'
Resolving vp.video.google.com... 66.249.93.181
Connecting to vp.video.google.com|66.249.93.181|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [video/x-msvideo]

    [ <=>                                  ] 258,227      147.43K/s

Re: Wishlist: support the file:/// protocol

2005-12-14 Thread gentoo


On Wed, 14 Dec 2005, Antonio Zerbinati wrote:

> Hi All,
> first of all, keep in mind that rsync already handles local (and remote)
> file/dir transfers best; IMHO rsync is the best solution, ever, for
> copying files when you have shell access.
> You can have a look at here: http://rsync.samba.org/ftp/rsync/rsync.html
> 
> just look at the details of what --progress and --partial do.

*** I agree with this :) Write down some other reasons for wanting the file:// 
protocol in wget; maybe we can find another solution for your file:// needs 
than adding this support to wget.

For local transfers, dd can be used too.
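For instance, a minimal dd sketch with placeholder paths (both file names here are hypothetical); bs just sets the copy block size:

```shell
# Copy a local file with dd; source/dest paths are placeholders.
# bs is the read/write block size (1 MiB is a reasonable default for throughput).
dd if=/path/to/source of=/path/to/dest bs=1M
```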

> Saying that, wget should never be used as an rsync clone, but a raw file:// 
> protocol may still be useful. Think of wget being used by a small program to 
> collect some files; those files can live on the web or be local. Should the 
> program call different tools, wget for the net and cp/rcp/rsync for locally 
> stored files? If wget can handle both, there is no need for that.

*** I think the right solution is to write a small shell script (or function):


download() {
  case "$1" in
    "http://"* | "ftp://"*) wget -c "$1" ;;
    # rsync needs an explicit destination; copy into the current directory
    "file://"*) rsync -P "${1#file://}" . ;;
  esac
}


But yes, adding the file:// protocol to wget would give you the same interface 
for all these protocols.


What about implementing an scp:// protocol? :)
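As a hedged sketch only, the same dispatch idea could grow an scp:// branch; the scp://host/absolute/path URL form and the parameter expansions below are my assumptions, not anything wget itself supports:

```shell
# Sketch: dispatch on URL scheme; the scp:// branch assumes scp://host/abs/path
download() {
  case "$1" in
    "http://"* | "ftp://"*) wget -c "$1" ;;
    "file://"*) rsync -P "${1#file://}" . ;;
    "scp://"*)  rest="${1#scp://}"                  # -> host/abs/path
                scp "${rest%%/*}:/${rest#*/}" . ;;  # -> host:/abs/path
  esac
}
```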


Bye.

Wolf.


Re: Wishlist: support the file:/// protocol

2005-12-12 Thread gentoo


On Mon, 12 Dec 2005, Serhiy Kachanuk wrote:

> On 11.12.2005 16:41, [EMAIL PROTECTED] wrote:
> > 
> > On Sun, 11 Dec 2005, Dan Jacobson wrote:
> > 
> > 
> > >Wishlist: support the file:/// protocol:
> > >$ wget file:///home/jidanni/2005_safe_communities.html
> > 
> > 
> > *** Could you describe the reason and the intended functionality to me?
> > 
> This may be useful when some network shares are mounted into the local file system

*** OK, so why do you want to download a file from the local file system to the 
local file system?


Bye.

Wolf


Re: Wishlist: support the file:/// protocol

2005-12-11 Thread gentoo


On Sun, 11 Dec 2005, Dan Jacobson wrote:

> Wishlist: support the file:/// protocol:
> $ wget file:///home/jidanni/2005_safe_communities.html
> 

*** Could you describe the reason and the intended functionality to me?

Bye.

Wolf.



Re: Wget download of eMule part File via HTTP or FTP

2005-11-22 Thread gentoo

On Mon, 21 Nov 2005, faina wrote:

> I'm very interested in an extension of wget that is able to download, via the 
> standard protocols, the .part files created by the eMule peer-to-peer protocol 
> during the downloading process from the remote peer.
> The idea is to download pieces of the incoming .part from a remote site using 
> the FTP or HTTP protocol.
> To do this, wget should continuously read the .met file related to a 
> specific .part file (for example every 5-10 minutes). The .met file contains 
> the information about the pieces (parts) of the .part file that eMule is 
> downloading. Knowing this information, wget can fetch the appropriate parts 
> of the .part file using the FTP/HTTP protocol (it passes through all firewalls), 
> and then reassemble them into a local .part file. When the peer-to-peer 
> program eMule completes the download from the remote peer, wget completes 
> the reassembly process on the local computer at the same time.
> 
> Does anybody think this feature makes sense in wget?

*** Hi all.

IMO this feature should NOT be part of wget's functionality. I would like wget 
to remain a pure FTP/HTTP downloader (with most features of FTP and HTTP, like 
authentication etc.), without functions specific to other protocols.

You can implement the requested functionality in a shell script with basic 
Linux utilities like diff, sed/grep, dd etc.
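To illustrate one piece of that script: fetching a byte range over HTTP is what `curl -r` (a Range request) does, and dd can splice the fetched piece into the local .part file at the right offset. Everything below (offsets, file names, the mirror URL) is hypothetical; in reality the offsets would come from parsing the .met file:

```shell
# Hypothetical piece parameters; in reality they would be parsed from the .met file
OFFSET=3            # byte offset of the piece inside the .part file
LENGTH=3            # piece length in bytes

# Over HTTP, each piece could be fetched with a Range request, e.g.:
#   curl -s -r "$OFFSET-$((OFFSET + LENGTH - 1))" -o piece.bin "http://mirror.example.com/file.part"
# (mirror.example.com is a placeholder)

# Demonstrate the splice locally: write the piece into the .part file at the
# right offset; conv=notrunc keeps the rest of the file intact (bs=1 is slow
# but gives byte-exact seeking)
printf 'AAAAAAAAAA' > local.part   # stand-in for the partially downloaded file
printf 'BBB' > piece.bin           # stand-in for the fetched piece
dd if=piece.bin of=local.part bs=1 seek="$OFFSET" conv=notrunc 2>/dev/null
cat local.part                     # AAABBBAAAA
```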


Bye.


Wolf.


Re: File name too long

2005-03-21 Thread gentoo


On Mon, 21 Mar 2005, Martin Trautmann wrote:

> is there a fix when file names are too long?
> 
> bash-2.04$ wget -kxE $URL
> --15:16:37--  
> http://search.ebay.de/ws/search/SaleSearch?copagenum=3D1&sosortproperty=3D2&sojs=3D1&version=3D2&sosortorder=3D2&dfts=3D-1&catref=3DC6&coaction=3Dcompare&soloctog=3D9&dfs=3D20050024&dfte=3D-1&saendtime=3D396614&from=3DR9&dfe=3D20050024&satitle=wget&coentrypage=3Dsearch&ssPageName=3DADME:B:SS:DE:21
>=> 
> `search.ebay.de/ws/search/SaleSearch?copagenum=3D1&sosortproperty=3D2&sojs=3D1&version=3D2&sosortorder=3D2&dfts=3D-1&catref=3DC6&coaction=3Dcompare&soloctog=3D9&dfs=3D20050024&dfte=3D-1&saendtime=3D396614&from=3DR9&dfe=3D20050024&satitle=wget&coentrypage=3Dsearch&ssPageName=3DADME:B:SS:DE:21'
> 
> Proxy request sent, awaiting response... 301 Moved Permanently
> Location: 
> /wget_W0QQcatrefZ3DC6QQcoactionZ3DcompareQQcoentrypageZ3DsearchQQcopagenumZ3D1QQdfeZ3D20050024QQdfsZ3D20050024QQdfteZ3DQ2d1QQdftsZ3DQ2d1QQfltZ3D9QQfromZ3DR9QQfsooZ3D2QQfsopZ3D2QQsaetmZ3D396614QQsojsZ3D1QQsspagenameZ3DADMEQ3aBQ3aSSQ3aDEQ3a21QQversionZ3D2
>  [following]
> --15:16:37--  
> http://search.ebay.de/wget_W0QQcatrefZ3DC6QQcoactionZ3DcompareQQcoentrypageZ3DsearchQQcopagenumZ3D1QQdfeZ3D20050024QQdfsZ3D20050024QQdfteZ3DQ2d1QQdftsZ3DQ2d1QQfltZ3D9QQfromZ3DR9QQfsooZ3D2QQfsopZ3D2QQsaetmZ3D396614QQsojsZ3D1QQsspagenameZ3DADMEQ3aBQ3aSSQ3aDEQ3a21QQversionZ3D2
>=> 
> `search.ebay.de/wget_W0QQcatrefZ3DC6QQcoactionZ3DcompareQQcoentrypageZ3DsearchQQcopagenumZ3D1QQdfeZ3D20050024QQdfsZ3D20050024QQdfteZ3DQ2d1QQdftsZ3DQ2d1QQfltZ3D9QQfromZ3DR9QQfsooZ3D2QQfsopZ3D2QQsaetmZ3D396614QQsojsZ3D1QQsspagenameZ3DADMEQ3aBQ3aSSQ3aDEQ3a21QQversionZ3D2'
> 
> Length: 46,310 [text/html]

> search.ebay.de/wget_W0QQcatrefZ3DC6QQcoactionZ3DcompareQQcoentrypageZ3DsearchQQcopagenumZ3D1QQdfeZ3D20050024QQdfsZ3D20050024QQdfteZ3DQ2d1QQdftsZ3DQ2d1QQfltZ3D9QQfromZ3DR9QQfsooZ3D2QQfsopZ3D2QQsaetmZ3D396614QQsojsZ3D1QQsspagenameZ3DADMEQ3aBQ3aSSQ3aDEQ3
> a21QQversionZ3D2.html: File name too long


*** This is not a problem of wget, but of your filesystem. Try:

touch 
search.ebay.de/wget_W0QQcatrefZ3DC6QQcoactionZ3DcompareQQcoentrypageZ3DsearchQQcopagenumZ3D1QQdfeZ3D20050024QQdfsZ3D20050024QQdfteZ3DQ2d1QQdftsZ3DQ2d1QQfltZ3D9QQfromZ3DR9QQfsooZ3D2QQfsopZ3D2QQsaetmZ3D396614QQsojsZ3D1QQsspagenameZ3DADMEQ3aBQ3aSSQ3aDEQ3a21QQversionZ3D2.html
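You can query the filesystem's limit directly; on most Linux filesystems a file name may be at most 255 bytes:

```shell
# Ask the filesystem of the current directory for its maximum file-name length
getconf NAME_MAX .
```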

> ... apart from that, the main thing I look for is how to obtain
> the search results. I still haven't managed to get the result from
> search.ebay.de and then download the links to cgi.ebay.de in one go:
> 
>   wget -kxrE -l1 -D cgi.ebay.de -H $URL

*** Maybe create the SHA1 sum of the request and store the result in a file
named after it (but you will not know what the original request was, unless
you also keep some DB of requests). Or do just simple counting.

URL=""
sha1sum="$( printf '%s' "$URL" | sha1sum | cut -d' ' -f1 )"
echo "$sha1sum $URL" >> SHA1-URL.db
wget -O "$sha1sum.html" [other options] "$URL"

or

URL=""
i=0   # increment this for each new search
echo "$i $URL" >> URL.db
wget -O "search-$i.html" "$URL"

Could this be a solution for you?

Wolf.


wget https://... behind proxy

2005-03-15 Thread gentoo

Hi,

I found a problem downloading pages from https:// URLs when I have a 
connection only through a proxy. For non-HTTP protocols I use the CONNECT 
method, but wget seems not to use it and tries to access https URLs directly. 
For http:// URLs wget downloads fine.

Can you tell me whether this is an error in wget, or whether there is an 
option or command-line parameter for this situation (to use the proxy also 
for https)? If wget cannot do this, may I file a feature request to use the 
proxy, via the CONNECT method, for https URLs too?
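For reference, wget takes its proxy settings from the environment (or from ~/.wgetrc); the proxy host below is a placeholder, and in my case the https variable appears to have no effect:

```shell
# Point both protocols at the proxy; proxy.example.com:8080 is hypothetical
export http_proxy=http://proxy.example.com:8080/
export https_proxy=http://proxy.example.com:8080/
# wget https://www.example.com/page.html   # this is what I would expect to work
```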

Thank you for answers.

Bye.

Wolf.