Re: question on using Wget v.1.20.3 built on mingw32

2024-04-07 Thread Darshit Shah
Well, it looks like the server you're connecting to is sending a redirection. In 
this case there's nothing that Wget can do; it is simply following the response it got 
from the server. 

Maybe check with the server admin if you think that the redirection is 
incorrect?
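
If you want to confirm exactly what the server is sending, one quick check (a 
sketch using the URL from your report) is to print the server response and stop 
at the first redirect:

```
wget -S --max-redirect=0 https://cgamos.ru/images/MB_LS/01-0203-0745-001157/0001.jpg
```

That will show the 302 status and the Location header the server returns.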

On Sun, Apr 7, 2024, at 21:09, Delta Impresa wrote:
> Hello!
> I try to download the file:
> https://cgamos.ru/images/MB_LS/01-0203-0745-001157/0001.jpg
>
> But instead Wget downloads another file:
> https://cgamos.ru/images/qr_pobeda2.png
>
> Please see the screenshot attached.
> It says after awaiting response: *302 Moved Temporarily*
> and then *it uses a location different from one that I supply in my
> url-list.txt file*.
>
> Could you please help me to fix this problem?
>
> Best regards,
> Vladimir
>
> Attachments:
> * 2024-04-07_21-56-54.png



Re: not working with ssl/ipv6?

2024-03-28 Thread Darshit Shah



On Wed, Mar 27, 2024, at 22:05, Brian Vargo wrote:
> Should probably include this:
> $ wget --version
> GNU Wget 1.21.2 built on linux-gnu.
> $ ufw down; wget
> http://ftp.us.debian.org/debian/pool/main/f/foliate/foliate_4.~really3.1.0-0.1_all.deb
> #no firewall
> sudo: ufw: command not found
> --2024-03-27 16:45:23--
> http://ftp.us.debian.org/debian/pool/main/f/foliate/foliate_4.~really3.1.0-0.1_all.deb
> Resolving ftp.us.debian.org (ftp.us.debian.org)... 2600:3402:200:227::2,
> 2600:3404:200:237::2, 2620:0:861:2:208:80:154:139, ...
> Connecting to ftp.us.debian.org (ftp.us.debian.org
> )|2600:3402:200:227::2|:80...
>
> I see this isn't over ssl, just 80.  But it's IPV6.  Is there some way to
> turn that off (for now until I figure out why IPv4 requests work and the
> others don't?

You can use the `-4` (`--inet4-only`) option to force IPv4-only mode. 
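
For example, using the URL from your transcript:

```
wget -4 http://ftp.us.debian.org/debian/pool/main/f/foliate/foliate_4.~really3.1.0-0.1_all.deb
```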
>
> Suggestions on where else I should start looking for the problem with IPv6?
> (*sigh*)  Firewall's not the issue as you can see unless there's some IP
> tables thing I didn't do.
>
No idea. But as an additional anecdote, I also seem to have trouble with IPv6 
connectivity to GitHub servers in particular. 

> also what resolver is feeding it IPv6 addresses instead IPv4 addresses?  I
> could stop it there too...
>
Your default system resolver. We just make a libc getaddrinfo() call.
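
If you want to see what that call returns on your system, one way (a sketch 
assuming a glibc-based system) is:

```
getent ahosts ftp.us.debian.org     # everything getaddrinfo() returns, in order
getent ahostsv4 ftp.us.debian.org   # IPv4 addresses only
```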
>
> On Wed, Mar 27, 2024 at 2:01 PM Tim Rühsen  wrote:
>
>> On 3/27/24 17:32, Brian Vargo wrote:
>> > I've not been able to use wget on ssl with github and debian.  I ^C'ed
>> out
>> > of the gitusercontent one (it would have timed out) and the second one is
>> > just a random example:
>> >
>> > ```
>> > $ wget
>> >
>> https://raw.githubusercontent.com/MohamedBassem/hoarder-app/main/docker/docker-compose.yml
>> > --2024-03-27 12:27:25--
>> >
>> https://raw.githubusercontent.com/MohamedBassem/hoarder-app/main/docker/docker-compose.yml
>> > Resolving raw.githubusercontent.com (raw.githubusercontent.com)...
>> > 2606:50c0:8000::154, 2606:50c0:8002::154, 2606:50c0:8003::154, ...
>> > Connecting to raw.githubusercontent.com
>> > (raw.githubusercontent.com)|2606:50c0:8000::154|:443...
>> > ^C
>>
>> Works fine for me on Debian.
>> Did you check your network / firewall?
>>
>> $ wget
>>
>> https://raw.githubusercontent.com/MohamedBassem/hoarder-app/main/docker/docker-compose.yml
>> --2024-03-27 18:49:21--
>>
>> https://raw.githubusercontent.com/MohamedBassem/hoarder-app/main/docker/docker-compose.yml
>> Resolving raw.githubusercontent.com (raw.githubusercontent.com)...
>> 2606:50c0:8002::154, 2606:50c0:8003::154, 2606:50c0:8000::154, ...
>> Connecting to raw.githubusercontent.com
>> (raw.githubusercontent.com)|2606:50c0:8002::154|:443... connected.
>> HTTP request sent, awaiting response... 200 OK
>> Length: 1251 (1.2K) [text/plain]
>> Saving to: ‘docker-compose.yml’
>>
>> docker-compose.yml
>> 100%[===>]   1.22K
>> --.-KB/s    in 0s
>>
>> 2024-03-27 18:49:21 (42.8 MB/s) - ‘docker-compose.yml’ saved [1251/1251]
>>
>>
>> $ wget --version
>> GNU Wget 1.24.5 built on linux-gnu.
>>
>> -cares +digest -gpgme +https +ipv6 +iri +large-file -metalink +nls
>> +ntlm +opie +psl +ssl/gnutls
>>



Re: [bug-wget] Version jump from 1.21.4 to 1.24.5

2024-03-16 Thread Darshit Shah
Hi Brian,

Thanks for pointing out the erroneous jump in the version numbers.
The jump happened because I made a typo when finalizing the tag just before the 
release. Unfortunately, the typo went unnoticed until now.

Fear not, I will not be retracting this version. The version is out and it will 
stay as it is. The missing versions will remain a modern mystery :)

On Sat, Mar 16, 2024, at 16:17, Brian Inglis wrote:
> Hi folks,
>
> Why was the version incremented from 1.21.4 to 1.24.5 rather than 
> 1.21.5 or 1.22.1?
> I want to be certain that this will not be changed before I release a 
> package.
>
> -- 
> Take care. Thanks, Brian Inglis  Calgary, Alberta, Canada
>
> La perfection est atteinte   Perfection is achieved
> non pas lorsqu'il n'y a plus rien à ajouter  not when there is no more to add
> mais lorsqu'il n'y a plus rien à retirer but when there is no more to cut
>  -- Antoine de Saint-Exupéry



[bug #63253] install wget sha256sum impersonated from Stockholm's android servers

2024-03-13 Thread Darshit Shah
Update of bug #63253 (group wget):

  Status:None => Need Info  
 Open/Closed:Open => Closed 


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




wget-1.24.5 released [stable]

2024-03-10 Thread Darshit Shah

This is to announce wget-1.24.5, a stable release.

This is another relatively slow release with minor bug fixes. The main one is 
a correction to how subdomains of Top-Level Domains (TLDs) are treated when 
checking for suffixes during HSTS lookups. This is a very low-criticality 
vulnerability that has now been patched.


There have been 33 commits by 6 people in the 43 weeks since 1.21.4.

See the NEWS below for a brief summary.

Thanks to everyone who has contributed!
The following people contributed changes to this release:

  Christian Weisgerber (1)
  Darshit Shah (20)
  Jan Palus (1)
  Jan-Michael Brummer (1)
  Tim Rühsen (9)
  Yaakov Selkowitz (1)

Darshit Shah
 [on behalf of the wget maintainers]
==

Here is the GNU wget home page:
https://gnu.org/s/wget/

For a summary of changes and contributors, see:
https://git.sv.gnu.org/gitweb/?p=wget.git;a=shortlog;h=v1.24.5
or run this command from a git-cloned wget directory:
  git shortlog v1.21.4..v1.24.5

Here are the compressed sources:
https://ftpmirror.gnu.org/wget/wget-1.24.5.tar.gz (5.0MB)
https://ftpmirror.gnu.org/wget/wget-1.24.5.tar.lz (2.5MB)

Here are the GPG detached signatures:
https://ftpmirror.gnu.org/wget/wget-1.24.5.tar.gz.sig
https://ftpmirror.gnu.org/wget/wget-1.24.5.tar.lz.sig

Use a mirror for higher download bandwidth:
https://www.gnu.org/order/ftp.html

Here are the SHA1 and SHA256 checksums:

  62525de6f09486942831ca2e352ae6802fc2c3dd  wget-1.24.5.tar.gz
  +i3DW6tRhOy8Rqnvg97yqqo/TJ88l9S9GdywfU2mN94=  wget-1.24.5.tar.gz
  01659f427c2e90c7c943805db69ea00f5da79b07  wget-1.24.5.tar.lz
  V6EHFR5O+U/flK/+z6xZiWPzcvEyk+2cdAMhBTkLNu4=  wget-1.24.5.tar.lz

Verify the base64 SHA256 checksum with cksum -a sha256 --check
from coreutils-9.2 or OpenBSD's cksum since 2007.
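
For example, one way to compare by eye (a sketch assuming a coreutils cksum 
new enough to support the --base64 flag):

```
cksum -a sha256 --base64 wget-1.24.5.tar.gz
```

and check that the printed digest matches the value listed above.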

Use a .sig file to verify that the corresponding file (without the
.sig suffix) is intact.  First, be sure to download both the .sig file
and the corresponding tarball.  Then, run a command like this:

  gpg --verify wget-1.24.5.tar.gz.sig

The signature should match the fingerprint of the following key:

  pub   rsa4096 2015-10-14 [SC]
    7845 120B 07CB D8D6 ECE5  FF2B 2A17 43ED A91A 35B6
  uid   Darshit Shah 
  uid   Darshit Shah 

If that command fails because you don't have the required public key,
or that public key has expired, try the following commands to retrieve
or refresh it, and then rerun the 'gpg --verify' command.

  gpg --locate-external-key g...@darnir.net

  gpg --recv-keys 64FF90AAE8C70AF9

  wget -q -O- 
'https://savannah.gnu.org/project/release-gpgkeys.php?group=wget=1' 
| gpg --import -


As a last resort to find the key, you can try the official GNU
keyring:

  wget -q https://ftp.gnu.org/gnu/gnu-keyring.gpg
  gpg --keyring gnu-keyring.gpg --verify wget-1.24.5.tar.gz.sig

This release was bootstrapped with the following tools:
  Autoconf 2.72
  Automake 1.16.5
  Gnulib v0.1-7211-gd15237a22b

NEWS

* Noteworthy changes in release 1.24.5 (2024-03-10) [stable]

** Fix how subdomain matches are checked for HSTS.
   Fixes a minor issue where cookies may be leaked to the wrong domain

** Wget will now also parse the srcset attribute in <img> HTML tags

** Support reading fetchmail style "user" and "passwd" fields from netrc

** In some cases, prevent the confusing "Cannot write to... (success)" 
error messages


** Support extremely fast download speeds (TB/s).
   Previously this would cause Wget to crash when printing the speed

** Improve portability on OpenBSD to run the test suite

** Ensure that CSS URLs are correctly quoted (Bug: 64082)



OpenPGP_0x2A1743EDA91A35B6.asc
Description: OpenPGP public key


OpenPGP_signature.asc
Description: OpenPGP digital signature


[bug #64693] Add command line support for bearer token

2024-03-10 Thread Darshit Shah
Follow-up Comment #1, bug #64693 (group wget):

This would be a simple change and I would be happy to accept a patch :)
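
Until such a patch lands, a common workaround is to pass the header yourself 
(a sketch; the URL and token are placeholders):

```
wget --header='Authorization: Bearer YOUR_TOKEN' https://api.example.com/resource
```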


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #65042] Unable to download webp images with GNU Wget

2024-03-10 Thread Darshit Shah
Update of bug #65042 (group wget):

  Status:None => Need Info  

___

Follow-up Comment #1:

Could you please give a more precise command line, including the actual domain
name? From some rudimentary testing, and from how the -A option works, webp
images are downloaded when the page contains any. 
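
For reference, the kind of invocation used in that testing looks roughly like
this (example.com stands in for the actual domain):

```
wget -r -l1 -A webp https://example.com/gallery/
```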



___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #65007] wget uses non-standard way to print IPv6 addresses

2024-03-10 Thread Darshit Shah
Follow-up Comment #3, bug #65007 (group wget):

I think we can leave the IPv4 syntax in Wget's output as-is, but switch the
output for IPv6 to be more compliant. Unlike IPv4, there is a well-established
convention for how IPv6 addresses are printed, and we should try to be
consistent with it. 

The concern about the output changing shouldn't be a big deal. We can test it
out and see if someone complains about it.


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




Re: wget input/output using stdin/stdout

2024-03-01 Thread Darshit Shah
Hi Dan,

For this use case, I would highly recommend using the successor to GNU Wget, 
GNU Wget2. It is now available in most distribution repositories. See 
https://gitlab.com/gnuwget/wget2

Wget2 supports reading from stdin throughout the life of the program. 
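
A rough sketch of that usage (assuming wget2 accepts '-' for --input-file the
same way Wget does; the URLs are placeholders):

```
{ echo 'https://example.com/a.html'; sleep 5; echo 'https://example.com/b.html'; } | wget2 --input-file=-
```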

On Sat, Mar 2, 2024, at 09:35, Dan Lewis via Primary discussion list for GNU 
Wget wrote:
> Greetings,
>
> I have a program that loads and executes wget using the following command
> line:
>
> wget -i - -O -
>
>
> and dups wget's stdin, stdout (and stderr) handles so that I can write URLs
> to wget's stdin and read the responses from wget's stdout. What I wanted to
> do was to write a sequence of URLs to wget's stdin, reading each response
> before the next URL is sent. Rather, wget buffers its output so that it
> doesn't output anything until I close its stdin. As a result, it seems that I
> can only send all of the URLs to wget, close its stdin, and then read all
> of the responses.
>
> Is there any wget command line option that will cause wget to output a
> response after each URL without waiting for me to close its stdin?
>
> Thanks!
> Dan



Re: Long time no pkt running with limit-rate

2024-01-30 Thread Darshit Shah
Hi,

Thanks for the report. 

If I understand you correctly, this is expected behavior. The way the rate 
limiting is implemented, Wget lets a few packets through at the maximum 
bandwidth and then simply sleeps for long enough that the average rate is 
roughly equal to the one set by the user. 

So a quiet gap of roughly 70 seconds is about what you'd expect here: a 700 KB 
file at a 10 KB/s limit works out to about 70 seconds in total, and since the 
server delivered essentially the whole file in the initial burst, Wget spends 
the remaining time sleeping. I don't really know how one could implement a 
better rate-limiting algorithm entirely in user space. 
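
For illustration only, here is a toy shell sketch of that burst-then-sleep idea.
This is not Wget's actual code, and the file arguments are placeholders:

```
#!/bin/sh
# Toy sketch of burst-then-sleep rate limiting; NOT Wget's implementation.
# Copies SRC to DST in CHUNK-byte bursts at full speed and sleeps after each
# burst so that the long-run average stays near LIMIT bytes per second.
SRC=$1
DST=$2
LIMIT=10240        # bytes per second, analogous to --limit-rate=10K
CHUNK=65536        # each burst is written at full speed
SIZE=$(wc -c < "$SRC")
: > "$DST"
i=0
while [ "$(wc -c < "$DST")" -lt "$SIZE" ]; do
    # Copy one CHUNK at offset i (skip/seek are counted in CHUNK-sized blocks).
    dd if="$SRC" of="$DST" bs="$CHUNK" count=1 skip="$i" seek="$i" conv=notrunc 2>/dev/null
    i=$((i + 1))
    sleep $((CHUNK / LIMIT))   # ~6 s of idle time per 64 KiB burst at 10 KB/s
done
```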

On Tue, Jan 30, 2024, at 10:00, Lei B Bao wrote:
> Hi Wget team.
>
> Now we’re using the wget 1.21 version to download a 700K file with a 
> fixed rate-limit 10K, and we want the pkts can be kept running during 
> ~70s, but we found there are about 60s+ no pkt running:
>
>
> root@3c578c2b5c87:/# wget -V
>
> GNU Wget 1.21 built on linux-gnu.
>
>
>
> root@3c578c2b5c87:/# wget http://192.168.200.30:80/wget_traffic_id_2_700KB 
> -O /dev/null --limit-rate=10K
>
>
>
>
>
> 
>
> 02:29:28.007124 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [.], ack 698203, win 1391, options [nop,nop,TS val 3141532161 ecr 
> 3882564233], length 0
>
> 02:29:28.007126 IP 192.168.200.30.80 > 192.168.200.200.44544: Flags 
> [P.], seq 698203:700259, ack 151, win 28, options [nop,nop,TS val 
> 3882564233 ecr 3141532160], length 2056: HTTP
>
> 02:29:28.007131 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [.], ack 700259, win 1408, options [nop,nop,TS val 3141532161 ecr 
> 3882564233], length 0
>
>
>
> < 60s+ no pkts running
>
>
>
>
>
> 02:30:36.358950 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141600513 
> ecr 3882564233], length 0
>
> 02:30:36.568323 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141600723 
> ecr 3882564233], length 0
>
> 02:30:36.776340 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141600931 
> ecr 3882564233], length 0
>
> 02:30:37.184305 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141601339 
> ecr 3882564233], length 0
>
> 02:30:38.032310 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141602187 
> ecr 3882564233], length 0
>
> 02:30:39.696306 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141603851 
> ecr 3882564233], length 0
>
> 02:30:42.960334 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141607115 
> ecr 3882564233], length 0
>
> 02:30:49.808344 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141613963 
> ecr 3882564233], length 0
>
> ^@02:31:03.120308 IP 192.168.200.200.44544 
> > 192.168.200.30.80: Flags [F.], seq 151, ack 700259, win 1408, options 
> [nop,nop,TS val 3141627275 ecr 3882564233], length 0
>
> 02:31:29.232320 IP 192.168.200.200.44544 > 192.168.200.30.80: Flags 
> [F.], seq 151, ack 700259, win 1408, options [nop,nop,TS val 3141653387 
> ecr 3882564233], length 0
>
> While we also tested with old version 1.20.1, it seems good, but also 
> about 5 s no pkts running:
>
>
> root@ebe3ce58547c:/# wget -V
>
> GNU Wget 1.20.1 built on linux-gnu.
>
>
>
> root@ebe3ce58547c:/# wget http://192.168.200.30:80/wget_traffic_id_2_700KB 
> -O /dev/null --limit-rate=10K
>
>
>
> 02:42:00.187604 IP 192.168.200.200.46556 > 192.168.200.30.80: Flags 
> [.], ack 683507, win 195, options [nop,nop,TS val 265728065 ecr 
> 265435757], length 0
>
> 02:42:00.187631 IP 192.168.200.30.80 > 192.168.200.200.46556: Flags 
> [P.], seq 683507:692455, ack 165, win 489, options [nop,nop,TS val 
> 265435757 ecr 265728065], length 8948: HTTP
>
> 02:42:00.187633 IP 192.168.200.30.80 > 192.168.200.200.46556: Flags 
> [P.], seq 692455:698203, ack 165, win 489, options [nop,nop,TS val 
> 265435757 ecr 265728065], length 5748: HTTP
>
> 02:42:00.187633 IP 192.168.200.30.80 > 192.168.200.200.46556: Flags 
> [P.], seq 698203:700259, ack 165, win 489, options [nop,nop,TS val 
> 265435757 ecr 265728065], length 2056: HTTP
>
> 02:42:00.231702 IP 192.168.200.200.46556 > 192.168.200.30.80: Flags 
> [.], ack 700259, win 94, options [nop,nop,TS val 265728110 ecr 
> 265435757], length 0
>
>
>
> <<< ~5s no pkts running.
>
>
>
> 02:42:05.787250 IP 192.168.200.200.46556 > 192.168.200.30.80: Flags 
> [.], ack 700259, win 411, options [nop,nop,TS val 265733665 ecr 
> 265435757], length 0
>
> 02:42:07.746996 IP 192.168.200.200.46556 > 192.168.200.30.80: Flags 
> [F.], seq 165, ack 700259, win 443, options [nop,nop,TS val 265735625 
> ecr 265435757], length 0
>
> 

Re: Bug report

2023-12-15 Thread Darshit Shah
And what is the problem?

On Mon, Dec 11, 2023, at 18:45, Ritick sethi wrote:
> riticksethi@d7-138-10 homebrew % ln -sf ../Cellar/wget/1.16.1/bin/wget 
> ~/homebrew/bin/wget
>
> ln: /Users/riticksethi/homebrew/bin/wget: No such file or directory
> riticksethi@d7-138-10 homebrew % mkdir -p ~/homebrew/bin
> riticksethi@d7-138-10 homebrew % ln -sf 
> ../../Cellar/wget/1.16.1/bin/wget ~/homebrew/bin/wget
> #
> zsh: command not found: #
> riticksethi@d7-138-10 homebrew % ln -sf 
> ../../Cellar/wget/1.16.1/bin/wget ~/homebrew/bin/wget
> riticksethi@d7-138-10 homebrew % wget --version
> GNU Wget 1.21.4 built on darwin23.0.0.
>
> -cares +digest -gpgme +https +ipv6 +iri +large-file -metalink +nls 
> +ntlm +opie -psl +ssl/openssl 
>
> Wgetrc: 
> /opt/homebrew/etc/wgetrc (system)
> Locale: 
> /opt/homebrew/Cellar/wget/1.21.4/share/locale 
> Compile: 
> clang -DHAVE_CONFIG_H -DSYSTEM_WGETRC="/opt/homebrew/etc/wgetrc" 
> -DLOCALEDIR="/opt/homebrew/Cellar/wget/1.21.4/share/locale" -I. 
> -I../lib -I../lib -I/opt/homebrew/opt/openssl@3/include 
> -I/opt/homebrew/Cellar/libidn2/2.3.4_1/include -DNDEBUG -g -O2 
> Link: 
> clang -I/opt/homebrew/Cellar/libidn2/2.3.4_1/include -DNDEBUG -g 
> -O2 -L/opt/homebrew/Cellar/libidn2/2.3.4_1/lib -lidn2 
> -L/opt/homebrew/opt/openssl@3/lib -lssl -lcrypto -ldl -lz 
> ../lib/libgnu.a -liconv -lintl -Wl,-framework -Wl,CoreFoundation 
> -lunistring 
>
> Copyright (C) 2015 Free Software Foundation, Inc.
> License GPLv3+: GNU GPL version 3 or later
> .
> This is free software: you are free to change and redistribute it.
> There is NO WARRANTY, to the extent permitted by law.
>
> Originally written by Hrvoje Niksic .
> Please send bug reports and questions to .
>
> Regards
>
> Ritick sethi
> MSc student Sensor System Technology
> Hochschule Karlsruhe
> +49 176 2549 3112



Re: Issues on installation

2023-12-14 Thread Darshit Shah
That looks like a 0 (zero). The option you're looking for is -O (a capital letter O).
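
For example (the URL is just a placeholder):

```
wget -O installer.sh https://example.com/installer.sh   # capital letter O: save to installer.sh
wget -0 installer.sh https://example.com/installer.sh   # zero: fails with "invalid option -- '0'"
```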

On Fri, Dec 15, 2023, at 06:12, Noah Kpogo wrote:
> Please I'm  trying to install nethunter using wget on termux, they only
> keep saying invalid option. What should I do?
>
> Attachments:
> * Screenshot_20231215-050807.png



Re: css.c: No such file or directory when lex lib not found during build

2023-05-24 Thread Darshit Shah
Hi,

Thanks for reporting this. I think we can modify the configure script to make 
the check for flex a hard error.

I'll do that soon(tm).


On Wed, May 24, 2023, at 11:01, David Cepelik wrote:
> Hi all,
>
> I noticed some strange behavior when building recent wget (fbbdf9ea)
> from sources using recent Flex (d30bdf4) built from sources. I tracked
> the issue down to Flex [1], but I thought that wget's toolchain made the
> problem harder to debug, so I figured it might be a good think to report
> it here as well.
>
> In a nutshell, when lex library is not detected, configure will output
> the following:
>
> [...]
> checking for flex... flex
> checking for lex output file root... lex.yy
> checking for lex library... not found
> configure: WARNING: required lex library not found; giving up on flex
> [...]
>
> which gets drowned in the rest of the output. Even though the library is
> "required" as per the warning above, the build process continues and
> eventually crashes with,
>
> [...]
> echo '#include "wget.h"' > css_.c
> cat css.c >> css_.c
> cat: css.c: No such file or directory
> make[3]: *** [Makefile:3156: css_.c] Error 1
> [...]
>
> This is because LEX defaults to : in src/Makefile:
>
> [...]
>   CC   convert.o
>   CC   cookies.o
> :  -ocss.c css.l
>   CC   ftp.o
>   CC   css-url.o
> [...]
>
> Is this expected?
>
> Best, David
>
> [1] https://github.com/westes/flex/issues/565
> Attachments:
> * signature.asc



Re: bug reporting

2023-05-18 Thread Darshit Shah
And what is the bug? What issue are you facing?

On Thu, May 18, 2023, at 05:59, 亦君羊心 via Primary discussion list for GNU Wget 
wrote:
> == Reinstalling wget
>
> == Pouring wget--1.21.3_1.arm64_monterey.bottle.1.tar.gz
>
>  /opt/homebrew/Cellar/wget/1.21.3_1: 89 files, 4.2MB
>
> == Running `brew cleanup wget`...
>
> Disable this behaviour by setting HOMEBREW_NO_INSTALL_CLEANUP.
>
> Hide these hints with HOMEBREW_NO_ENV_HINTS (see `man brew`).
>
> ➜ ~ wget --version
>
> GNU Wget 1.21.3 built on darwin21.6.0.
>
>
>
>
> -cares +digest -gpgme +https +ipv6 +iri +large-file -metalink +nls
>
> +ntlm +opie -psl +ssl/openssl
>
>
>
>
> Wgetrc:
>
>   /opt/homebrew/etc/wgetrc (system)
>
> Locale:
>
>   /opt/homebrew/Cellar/wget/1.21.3_1/share/locale
>
> Compile:
>
>   clang -DHAVE_CONFIG_H 
> -DSYSTEM_WGETRC="/opt/homebrew/etc/wgetrc"
>
>   
> -DLOCALEDIR="/opt/homebrew/Cellar/wget/1.21.3_1/share/locale" -I.
>
>   -I../lib -I../lib -I/opt/homebrew/opt/openssl@3/include
>
>   -I/opt/homebrew/Cellar/libidn2/2.3.4_1/include -DNDEBUG 
> -g -O2
>
> Link:
>
>   clang -I/opt/homebrew/Cellar/libidn2/2.3.4_1/include 
> -DNDEBUG -g
>
>   -O2 -L/opt/homebrew/Cellar/libidn2/2.3.4_1/lib -lidn2
>
>   -L/opt/homebrew/opt/openssl@3/lib -lssl -lcrypto -ldl -lz
>
>   ../lib/libgnu.a -liconv -lintl -Wl,-framework 
> -Wl,CoreFoundation
>
>   -lunistring
>
>
>
>
> Copyright © 2015 Free Software Foundation, Inc.
>
> License GPLv3+: GNU GPL version 3 or later
>
> 
> This is free software: you are free to change and redistribute it.
>
> There is NO WARRANTY, to the extent permitted by law.
>
>
>
>
> Originally written by Hrvoje Nikšić 
> Please send bug reports and questions to 
>
>
>
>
>
>
>
>
>
>
>
> 亦君羊心
> 450415...@qq.com
>
>
>
> 



[bug #64082] wget unescapes URLs used as CSS url() parameters, leading to spaces and thus invalid CSS

2023-05-16 Thread Darshit Shah
Update of bug #64082 (project wget):

  Status:None => Fixed  
 Open/Closed:Open => Closed 
   Fixed Release:None => trunk  

___

Follow-up Comment #1:

Thanks for the bug report. I've applied a fix to the current master branch
that should hopefully resolve this issue. 

I'd be very grateful if you could build from sources and test it out.
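
In case it helps, a rough sketch of building the current master branch
(assuming git, the usual autotools, flex, and a C toolchain are installed):

```
git clone https://git.savannah.gnu.org/git/wget.git
cd wget
./bootstrap
./configure
make
./src/wget --version
```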


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #64184] Metalink tests fail with gnupg >= 2.4.1

2023-05-14 Thread Darshit Shah
Update of bug #64184 (project wget):

  Status:None => Fixed  
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Applied patch to trunk.

Thank you for your contribution


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




wget-1.21.4 released [stable]

2023-05-10 Thread Darshit Shah
This is to announce wget-1.21.4, a stable release.

This is a slow release, with not many exciting things to talk about. The main 
reason is to allow HSTS tests to function again on i686 systems. 

There have been 29 commits by 3 people in the 62 weeks since 1.21.3.

See the NEWS below for a brief summary.

Thanks to everyone who has contributed!
The following people contributed changes to this release:

  Darshit Shah (6)
  Tim Rühsen (22)
  jinfuchiang (1)

Darshit
 [on behalf of the wget maintainers]
==

Here is the GNU wget home page:
http://gnu.org/s/wget/

For a summary of changes and contributors, see:
  http://git.sv.gnu.org/gitweb/?p=wget.git;a=shortlog;h=v1.21.4
or run this command from a git-cloned wget directory:
  git shortlog v1.21.3..v1.21.4

Here are the compressed sources:
  https://ftpmirror.gnu.org/wget/wget-1.21.4.tar.gz   (4.9MB)
  https://ftpmirror.gnu.org/wget/wget-1.21.4.tar.lz   (2.4MB)

Here are the GPG detached signatures:
  https://ftpmirror.gnu.org/wget/wget-1.21.4.tar.gz.sig
  https://ftpmirror.gnu.org/wget/wget-1.21.4.tar.lz.sig

Use a mirror for higher download bandwidth:
  https://www.gnu.org/order/ftp.html

Here are the SHA1 and SHA256 checksums:

  c6dc52cbda882c14fa5c3401d039901a0ba823fc  wget-1.21.4.tar.gz
  gVQvXO+4+qzDm7vGyC3tgOPkqIUFrnLqUd8nUlvN4Ew=  wget-1.21.4.tar.gz
  42384273c1937458c9db3766a5509afa636a2f00  wget-1.21.4.tar.lz
  NoNhml9Q7cvMsXIKeQBvo3v5uaJVqMW0gEi8PHqHS9k=  wget-1.21.4.tar.lz

Verify the base64 SHA256 checksum with cksum -a sha256 --check
from coreutils-9.2 or OpenBSD's cksum since 2007.

Use a .sig file to verify that the corresponding file (without the
.sig suffix) is intact.  First, be sure to download both the .sig file
and the corresponding tarball.  Then, run a command like this:

  gpg --verify wget-1.21.4.tar.gz.sig

The signature should match the fingerprint of the following key:

  pub   rsa4096 2015-10-14 [SC]
7845 120B 07CB D8D6 ECE5  FF2B 2A17 43ED A91A 35B6
  uid   Darshit Shah 
  uid   Darshit Shah 

If that command fails because you don't have the required public key,
or that public key has expired, try the following commands to retrieve
or refresh it, and then rerun the 'gpg --verify' command.

  gpg --locate-external-key g...@darnir.net

  gpg --recv-keys 64FF90AAE8C70AF9

  wget -q -O- 
'https://savannah.gnu.org/project/release-gpgkeys.php?group=wget=1' | 
gpg --import -

As a last resort to find the key, you can try the official GNU
keyring:

  wget -q https://ftp.gnu.org/gnu/gnu-keyring.gpg
  gpg --keyring gnu-keyring.gpg --verify wget-1.21.4.tar.gz.sig

This release was bootstrapped with the following tools:
  Autoconf 2.71
  Automake 1.16.5
  Gnulib v0.1-6178-gdfdf33a466

NEWS

* Noteworthy changes in release 1.21.4 (2023-05-11)

** Document --retry-on-host-error in help text

** Increase read buffer size to 64k. This should speed up downloads on gigabit 
and faster connections

** Update deprecated option '--html-extension' to '--adjust-extension' in 
documentation

** Update gnulib compatibility layer.
   Fixes HSTS test failures on i686. (Thanks to Andreas Enge for pointing it out)




Re: Test failures on i686-linux

2023-04-17 Thread Darshit Shah
I'll try and make a new release this week. 

On Sun, Apr 16, 2023, at 20:51, Andreas Enge wrote:
> Hi Tim,
>
> Am Sun, Apr 16, 2023 at 06:38:32PM +0200 schrieb Tim Rühsen:
>> Hm, cb114... looks like it's the needed commit. Maybe also cherry-pick
>> 27d3fcba3331a981bcb8807c663a16b8fa4ebeb3 (gnulib update).
>
> it looks like this is definitely needed. But integrating it into our build
> system is tricky, since it is not just a matter of applying a patch to
> the tarball. (Actually, it looks like the gnulib update is the only one
> that is needed. When I run ./bootstrap with the new gnulib, then git
> checkout v1.21.3, ./configure and make dist, I get a tarball that works
> for us on i686.)
>
>> > Have you got an idea which other commit would be crucial? Or do you think
>> > you could make a new release soonish?
>> We should indeed make a release soon. Do you have some spare time @Darshit ?
>
> That would indeed be most welcome! I would be happy to test a release
> candidate. The one I got and put there:
>https://www.multiprecision.org/wget-1.21.3.24-2b723.tar.lz
> works with the core-updates branch of Guix on i686 and x86_64.
>
> Andreas



Re: Mention multiple -O OK

2023-02-18 Thread Darshit Shah



On Fri, Feb 17, 2023, at 18:03, Tim Rühsen wrote:
>
> It is not possible to control the stored file names with multiple -O 
> options (missing feature)

Sorry, but no. This is not a missing feature. It's a long-standing annoyance of 
mine, but the manual describes -O as being equivalent to shell redirection. See 
the selected quotes from the man page below:

```
Use of -O is not intended to mean simply "use the name file instead of the one 
in the URL;" rather, it is analogous to shell redirection: wget -O file 
http://foo is intended to work like wget -O - http://foo > file; file will be 
truncated immediately, and all downloaded content will be written there.
```

And

```
Similarly, using -r or -p with -O may not work as you expect: Wget won't just 
download the first file to file and then download the rest to their normal 
names: all downloaded content will be placed in file.
```

Since this is a long-standing documented "feature", I am not keen on modifying 
the default behavior, as automated scripts that rely on it may break. 

This was the reason that the first thing I did with Wget2 was to remove that 
sentence and implication from its manual, allowing for a more intuitive 
implementation of the -O option there. However, unfortunately, this 
non-intuitive behaviour of -O shall stay in Wget for the time being. 



Re: RESUME SITE DOWNLOAD MIRROR

2022-12-31 Thread Darshit Shah
That's what the -c option is for. 
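
Roughly, re-run your original mirror command from the same directory with -c
added, e.g. (a sketch with a placeholder URL):

```
wget -c --mirror https://example.com/
```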

On Wed, Dec 28, 2022, at 22:26, Sir Dante IMT wrote:
> Hello
> I have a question. I try to mirror a website actually with approximately
> 10GIG of data, my internet connection is very good and i was already at
> 5GIG through and my laptop battery died, and i lost the process but still
> have the folder in my directory. Please is there a way to continue the
> website mirror without it having to re-download the data all the way from
> the beginning? i would want to resume the mirror with Wget in the same
> folder and have it continue where it stopped. Thanks man.



[bug #63253] install wget sha256sum impersonated from Stockholm's android servers

2022-10-23 Thread Darshit Shah
Follow-up Comment #2, bug #63253 (project wget):

I'm not sure what you're trying to achieve or how Wget is involved?


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #62964] --no-clobber does not respect sym-linked files

2022-10-23 Thread Darshit Shah
Follow-up Comment #1, bug #62964 (project wget):

From the manual:


   When running Wget with -N, with or without -r or -p, the decision as to
   whether or not to download a newer copy of a file depends on the local and
   remote timestamp and size of the file.  -nc may not be specified at the
   same time as -N.


Thus, your point 3 is invalid. 
However, I will agree that if you have a symlinked file, then Wget should not
clobber it even with -r unless the file on the server has changed, in which
case you need to use --timestamping to prevent that.

I will look into the symlink issue and see what I can find there. 


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #63252] install wget sha256sum impersonated from Stockholm's android servers

2022-10-23 Thread Darshit Shah
Update of bug #63252 (project wget):

  Status:None => Duplicate  
 Open/Closed:Open => Closed 


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #63224] option -O not respecting mandatory argument

2022-10-23 Thread Darshit Shah
Update of bug #63224 (project wget):

  Status:None => Wont Fix   
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

That is correct and the expected behaviour. Short options don't use the "="
for the argument. Long options may optionally use the "=" if they need to pass
the argument as a single parameter.
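
For example, all of these forms are accepted (a sketch following the usual
getopt conventions):

```
wget -O file.html https://example.com/                   # short option, space-separated argument
wget --output-document=file.html https://example.com/    # long option with '='
wget --output-document file.html https://example.com/    # long option with a space
```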


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #63077] I can't to install nethunter

2022-10-23 Thread Darshit Shah
Update of bug #63077 (project wget):

  Status:None => Need Info  
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Please provide some more information. What is nethunter and what are you
trying to achieve? Feel free to reopen with additional details.


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #62137] wget segfaults when using openssl and a FTPS server in TLS 1.3

2022-10-23 Thread Darshit Shah
Update of bug #62137 (project wget):

  Status:None => Ready for Merge
 Assigned to:None => darnir 


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #62795] Distinguish between error codes

2022-10-23 Thread Darshit Shah
Update of bug #62795 (project wget):

  Status:None => Invalid
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Hi,

The issue here is that there seems to be a misunderstanding of the -O option.
-O is meant to act like shell redirection, i.e. the following are meant to be
identical:

$ wget example.com -O- > file.html
$ wget example.com -O file.html

This means that the file will ideally be truncated before the download even
starts and thus *MUST* always be fully redownloaded. We've tried to add some
checks here and there to protect against footguns, and what you're seeing is one
of them. You asked to never clobber any files and then used -O, which would
immediately clobber it. 

Even if we were to add the option as you requested, it would not be useful,
since it would only check whether a file by that name exists. If you really want
to check that it is the same file as on the server, you must unfortunately drop
the -O option.

I know this is not ideal and not the answer you wanted to hear, but I'm stuck
between a rock and a hard place. This behaviour has been documented for about 25
years and I cannot just change it; doing so would break a huge number of scripts. 


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




[bug #62869] if retry hits a 302 FOUND wget forgets to send the Range header thus appending the whole file to what's downloaded alrdy

2022-10-23 Thread Darshit Shah
Follow-up Comment #3, bug #62869 (project wget):

Hi, sorry for the late response. I accidentally deleted a couple of the files
you uploaded instead of downloading them. Could you please re-upload your
scripts and PoC? I'll take a look in the coming weeks to see what the issue is
and resolve it


___

Reply to this item at:

  

___
Message sent via Savannah
https://savannah.gnu.org/




Re: Problem downloading image with wget

2022-08-25 Thread Darshit Shah
It works perfectly here on my end. What exactly is the issue you're facing?
Please provide the platform/OS you're running on, the Wget version number, and 
full logs from the failing process.

Also, as Tim suggested, please see if you can replicate the issue with the latest 
release.

On Wed, Aug 10, 2022, at 11:42, Edgar Mobile wrote:
> I unsuccessfully try to download an image using wget from discogs using 
> the following command:
>
> wget 
> https://i.discogs.com/huN5PKlYZX2k_gqeNNdTZlkyemt3wzwSLH2TBoOkDJY/rs:fit/g:sm/q:90/h:495/w:500/czM6Ly9kaXNjb2dz/LWRhdGFiYXNlLWlt/YWdlcy9SLTEzNDE1/NTk0LTE1NTM3ODA1/MjQtNjUyMy5qcGVn.jpeg
> Bus error: 10
>
> GNU Wget 1.20.3
>
> Any ideas?
>
> Regards
>
> [https://i.discogs.com/huN5PKlYZX2k_gqeNNdTZlkyemt3wzwSLH2TBoOkDJY/rs:fit/g:sm/q:90/h:495/w:500/czM6Ly9kaXNjb2dz/LWRhdGFiYXNlLWlt/YWdlcy9SLTEzNDE1/NTk0LTE1NTM3ODA1/MjQtNjUyMy5qcGVn.jpeg]



Re: Re bugs on setup

2022-03-08 Thread Darshit Shah
What is not working? You've only sent us back the help output of Wget

On Tue, Mar 8, 2022, at 04:32, Leon Munro wrote:
> Email bug reports, questions, discussions to 
>
> I set up termux but once I got here nothing works?
>
>
> Welcome to Termux!
>
> Communities: https://termux.org/community
> Gitter chat: https://gitter.im/termux/termux
> IRC channel: #termux on libera.chat
>
> Working with packages:
>
>  * Search packages:   pkg search 
>  * Install a package: pkg install 
>  * Upgrade packages:  pkg upgrade
>
> Subscribing to additional repositories:
>
>  * Root: pkg install root-repo
>  * X11:  pkg install x11-repo
>
> Report issues at https://termux.org/issues
>
>
> You are likely using a very old version of Termux,
> probably installed from the Google Play Store.
> There are plans in the near future to remove the
> Termux apps from the Play Store so that new users
> cannot install them and to **disable** them for
> existing users with app updates to prevent the use
> of outdated app versions. Instead, you are
> encouraged to move to F-Droid or Github sources
> (see [1]). You can backup all your current Termux
> data before uninstallation and then restore it later
> by following instructions in the wiki [2]. Check
> the changelog [3] for all the new features and fixes
> that you are currently missing. Check [4] for why
> this is being done.
>
> [1] https://github.com/termux/termux-app#installation
> [2] https://wiki.termux.com/wiki/Backing_up_Termux
> [3] https://github.com/termux/termux-app/releases
> [4] https://github.com/termux/termux-app#google-play-store-deprecated
>
> ~ $ curl -LO
> https://packages.termux.org/apt/termux-main/pool/main/t/termux-keyring/termux-keyring_2.4_all.deb
> apt install ./termux-keyring_2.4_all.deb
> apt update && apt dist-upgrade -yq
>   % Total% Received % Xferd  Average Speed   TimeTime Time
> Current
>  Dload  Upload   Total   SpentLeft
> Speed
>   0 00 00 0  0  0 --  0 00 0
> 0
>0  0  0 --  0 00 00 0  0  0 --  0
>  00 00 0  0  0 --  0 00 00 0
>   0  0 --100   153  100   1530 0 40  0  0:00:03
> 0:00:03 --:--:--40
> Reading package lists... Error!
> E: Invalid archive signature
> E: Internal error, could not locate member
> control.tar{.lz4,.gz,.xz,.bz2,.lzma,}
> E: Could not read meta data from
> /data/data/com.termux/files/home/termux-keyring_2.4_all.deb
> E: The package lists or status file could not be parsed or opened.
> Hit:1 https://termux.mentality.rip/game-packages-24 games InRelease
> Hit:2 https://grimler.se/termux-packages-24 stable InRelease
> Hit:3 https://termux.mentality.rip/science-packages-24 science InRelease
> Reading package lists... Done
> Building dependency tree... Done
> Reading state information... Done
> All packages are up to date.
> Reading package lists...
> Building dependency tree...
> Reading state information...
> Calculating upgrade...
> 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
> ~ $ pkg install wget
> Checking availability of current mirror: ok
> Reading package lists... Done
> Building dependency tree... Done
> Reading state information... Done
> wget is already the newest version (1.21.3-1).
> 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
> ~ $ wget -0 install-nethunter-termux https://offs.ec/2MceZWr
> wget: invalid option -- '0'
> Usage: wget [OPTION]... [URL]...
>
> Try `wget --help' for more options.
> ~ $ wget -o install-nethunter-termux https://offs.ec/2MceZWr
> ~ $ chmod +x install-nethunter-termux
> ~ $ wget --help
> GNU Wget 1.21.3, a non-interactive network retriever.
> Usage: wget [OPTION]... [URL]...
>
> Mandatory arguments to long options are mandatory for short options too.
>
> Startup:
>   -V,  --version   display the version of Wget and exit
>   -h,  --help  print this help
>   -b,  --backgroundgo to background after startup
>   -e,  --execute=COMMAND   execute a `.wgetrc'-style command
>
> Logging and input file:
>   -o,  --output-file=FILE  log messages to FILE
>   -a,  --append-output=FILEappend messages to FILE
>   -d,  --debug print lots of debugging information
>   -q,  --quiet quiet (no output)
>   -v,  --verbose   be verbose (this is the default)
>   -nv, --no-verboseturn off verboseness, without being quiet
>--report-speed=TYPE output bandwidth as TYPE.  TYPE can be
> bits
>   -i,  --input-file=FILE   download URLs found in local or external
> FILE
>   -F,  --force-htmltreat input file as HTML
>   -B,  --base=URL  resolves HTML input-file links (-i -F)
>  relative to URL
>--config=FILE   specify config file 

wget-1.21.3 released [stable]

2022-02-26 Thread Darshit Shah

I'm pleased to announce the release of GNU Wget 1.21.3.

GNU Wget is a free utility for non-interactive download of files from
the  Web. It supports HTTP(S), and FTP(S) protocols, as well as
retrieval through HTTP proxies.

This is a minor bugfix release.

Many thanks to everyone who contributed to this release:
Aarni Koskela
Darshit Shah
Michal Ruprich
Nik Soggia
Per Lundberg
Thomas Niederberger
Tim Rühsen


Here are the compressed sources:
  https://ftpmirror.gnu.org/wget/wget-1.21.3.tar.gz   (4.9MB)
  https://ftpmirror.gnu.org/wget/wget-1.21.3.tar.lz   (2.4MB)

Here are the GPG detached signatures[*]:
  https://ftpmirror.gnu.org/wget/wget-1.21.3.tar.gz.sig
  https://ftpmirror.gnu.org/wget/wget-1.21.3.tar.lz.sig

Use a mirror for higher download bandwidth:
  https://www.gnu.org/order/ftp.html

Here are the SHA1 and SHA256 checksums:

e0c2c21aff77693c3e67cf3889945a2ef5a03a39  wget-1.21.3.tar.gz
Vya7i8XKD23HEQ9kFuS7cBni0v9b+T0cov/MZlbyIOU  wget-1.21.3.tar.gz
4208e02d7dd6ff5f1616e8f3fe79d76688e5300d  wget-1.21.3.tar.lz
29L7XkcUnUdS0Oqg2saMxJzyDUbfT44yb/yPGLKvTqU  wget-1.21.3.tar.lz

The SHA256 checksum is base64 encoded, instead of the
hexadecimal encoding that most checksum tools default to.

[*] Use a .sig file to verify that the corresponding file (without the
.sig suffix) is intact.  First, be sure to download both the .sig file
and the corresponding tarball.  Then, run a command like this:

  gpg --verify wget-1.21.3.tar.gz.sig

If that command fails because you don't have the required public key,
then run this command to import it:

  gpg --keyserver keys.gnupg.net --recv-keys 64FF90AAE8C70AF9

and rerun the 'gpg --verify' command.

This release was bootstrapped with the following tools:
  Autoconf 2.71
  Automake 1.16.5
  Gnulib v0.1-5171-gc5c11d6447

NEWS

* Noteworthy changes in release 1.21.3 (2022-02-26)

** Fix computation of total bytes downloaded during FTP transfers (#61277)

** Add option to select TLS 1.3 on the command line

** Fix HSTS build issues on some 64-bit big-endian systems

** Hide password during status report in --no-verbose

** Remove a spurious print statement that showed up even during --quiet

** Some more cleanups and bug-fixes




OpenPGP_0x2A1743EDA91A35B6.asc
Description: OpenPGP public key


OpenPGP_signature
Description: OpenPGP digital signature


Re: [PATCH] netrc: only remove backslash when we have quotes

2022-02-11 Thread Darshit Shah
Hi,

Thanks a lot for the patch! Would you be willing to write a test case for it? 
The tests are available in the testenv/ directory.

On Fri, Feb 11, 2022, at 12:28, Jose Quaresma wrote:
> If the netrc have backslash char "\" it doesn't work with "wget"
> but the same netrc file works with "curl -n"
>
> - For example if the netrc password have a backslash on it the wget will 
> return:
>   Username/Password Authentication Failed.
>
> - The same password with the backslash works if the it is typed on the 
> stdin with:
>   wget uri --user=username --ask-password
>
> commit 2b2fd2924aa9eac8c831380196a13c427f6b4329, introduce quotation mark
> support and after that wget will remove the backslash char in every token
> on the netrc. The backslash can be removed but only when the presence of
> the quotation mark is detected.
>
> Signed-off-by: Jose Quaresma 
> ---
>  src/netrc.c | 2 +-
>  1 file changed, 1 insertion(+), 1 deletion(-)
>
> diff --git a/src/netrc.c b/src/netrc.c
> index 76e52485..ab090256 100644
> --- a/src/netrc.c
> +++ b/src/netrc.c
> @@ -294,7 +294,7 @@ parse_netrc_fp (const char *path, FILE *fp)
> 
>/* Find the end of the token, handling quotes and escapes.  */
>while (*p && (qmark ? *p != '"' : !c_isspace (*p))){
> -if (*p == '\\')
> +if (qmark && *p == '\\')
>shift_left (p);
>  p ++;
>}
> -- 
> 2.35.1



Re: Suggestion on implementing wget in Rust language

2022-02-08 Thread Darshit Shah
Hi Ali,

It's nice to see you so excited about starting a new project. However, we are 
not interested in rewriting Wget in Rust.

P.S.: You're a high school student, so I'm going to assume you didn't know 
better. But at this point "rewrite it in Rust" is a meme; please do not send 
out messages like that. I think Rust is a great language and it has its 
benefits. If you're really interested in Rust, you'll find a lot of projects 
already written in Rust to contribute to. There is no tangible benefit to 
sending out emails to projects in other languages offering to rewrite them in 
Rust.

If you would like to continue with this as a personal project, do contact me 
privately and we can have a little chat about how you could proceed. It would 
indeed be a nice project to learn some concepts in Rust.

On Sun, Jan 23, 2022, at 13:50, Ali Moeini wrote:
> Hello everyone;
>
> I'm not sure if I'm doing the right thing asking here but anyway,
>
> I'm a high school student and a newbie in rust (and programming and 
> programming in general) and wanted to start implementing wget in rust as 
> to strengthen my knowledge on rust and as to have my first project.
>
> So as I have no experience and no prior knowledge (just some basic rust) 
> I wanted your suggestions on where to start, and maybe some guidance.
>
>
> Again sorry if this is not the right place.
>
> Sincere respect for all your hard work on this great project.



Re: build problems on Fedora Core 36

2022-02-08 Thread Darshit Shah
Hi George,

This is a known issue due to a version mismatch between the gettext and autoconf 
macros. Please run `autoreconf -ivf`; that should hopefully bring the macro 
versions back in sync. 

On Tue, Feb 8, 2022, at 01:59, George R Goffe wrote:
> Hi,
>
> I'm having a heck of a time getting wget to build. I'm using the source 
> from " git clone https://git.savannah.gnu.org/git/wget.git wget".
>
> Here's what I'm seeing:
>
> make[2]: Entering directory '/export/home/tools/wget1/wget/po'
> make[2]: Nothing to be done for 'all'.
> make[2]: Leaving directory '/export/home/tools/wget1/wget/po'
> Making all in gnulib_po
> make[2]: Entering directory '/export/home/tools/wget1/wget/gnulib_po'
> Makefile:643: warning: ignoring prerequisites on suffix rule definition
> *** error: gettext infrastructure mismatch: using a Makefile.in.in from 
> gettext version 0.20 but the autoconf macros are from gettext version 
> 0.19
> make[2]: *** [Makefile:685: stamp-po] Error 1
> make[2]: Leaving directory '/export/home/tools/wget1/wget/gnulib_po'
> make[1]: *** [Makefile:1721: all-recursive] Error 1
> make[1]: Leaving directory '/export/home/tools/wget1/wget'
> make: *** [Makefile:1673: all] Error 2
> fc36-bash 5.1 /tools/wget1/wget# cat ../getwget1
>
>
> I don't see where I'm going wrong, could I get you to take a peek please?
>
> Best regards,
>
> George...



[bug #61427] --no-clobber doesn't work with many files

2022-01-15 Thread Darshit Shah
Update of bug #61427 (project wget):

  Status:None => Invalid
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Hi,

This is expected behaviour. If you're using Wget to mirror a directory, you
should try using the --mirror option, which sets the right combination of
options automatically.

In your particular case, you asked for no-clobber. This means Wget will not
replace the original file in case it was changed. Thus the index.html, or
whichever file links to the new file, will not be updated on disk. As a result,
Wget doesn't know of the existence of the new file and won't download it.

Do you really need -nc when running with -r? What is your use case?

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61733] dot_finish() doesn't print the newline when wget is run with the --no-verbose option

2022-01-15 Thread Darshit Shah
Update of bug #61733 (project wget):

  Status:None => Fixed  
 Open/Closed:Open => Closed 
 Release:None => trunk  

___

Follow-up Comment #1:

Pushed the change.
Thank you for your contribution!

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #20368] Cleanup heavy use of #if, #ifdef, etc

2022-01-09 Thread Darshit Shah
Update of bug #20368 (project wget):

  Status: In Progress => Wont Fix   
 Assigned to:  ardsrk => None   
 Open/Closed:Open => Closed 


___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




Re: wget for DOS, nls support

2021-12-22 Thread Darshit Shah
Hi,

NLS support in GNU Wget is handled via GNU gettext. It is possible that 
your build of Wget does not support NLS. You'll have to check that using `wget 
--version`.

If it is indeed supported, you'll have to check the gettext documentation to 
see where it expects the NLS translation files on DOS.

However, if it is not built in, you'll need to build a new version of Wget with 
NLS support enabled.
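
For example, on a Unix-like shell (a quick sketch):

```
wget --version | grep -o '[+-]nls'   # prints "+nls" if NLS is compiled in, "-nls" otherwise
```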

On Wed, Dec 22, 2021, at 21:26, Fritz Mueller wrote:
> hi,
>
> maybe you know that the freedos group plans to update the freedos 
> installation CD to version 1.3.
>
> I am working on translating the nls files to german.
> one of the files is wget for dos, i know you do not support it 
> directly, but maybe you can help me.
> usually a dos nls file is a single file at c:\freedos\nls. e.g. wget.de 
> or wget.en etc.
> the wget website says that nls is supported.
> i tried this way but it didnt work so it maybe that the dos version 
> does not support nls or my syntax or jumppoints or something else is 
> wrong.
> can you tell me what could be the reason or who is the maintainer of 
> this tool so that i can get in contact with him?
>
> thanks and merry christmas
>
> fritz mueller



[bug #61649] Wget not honouring Content-Encoding: gzip

2021-12-09 Thread Darshit Shah
Follow-up Comment #2, bug #61649 (project wget):

I like the idea and will gladly accept a patch for it :)

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61451] Error reading hsts database on mips_24kc with musl 1.2

2021-12-01 Thread Darshit Shah
Update of bug #61451 (project wget):

  Status:None => Fixed  
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Fixed in trunk by reading a 64-bit value into the variables

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61492] --no-verbose leaks information about HTTP password to stdout

2021-12-01 Thread Darshit Shah
Update of bug #61492 (project wget):

  Status:   Confirmed => Fixed  
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Fixed in trunk.

Thanks!

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




Re: wget 1.21.2 checks failing with unexpected extra downloaded files

2021-12-01 Thread Darshit Shah
Hi Brian,

Thanks for the report, and sorry for the delay in responding to your email.

This is indeed something weird. You are right in your guess that it happens 
because of the wget-log file being created. 
I will have to try to reproduce this issue in a Cygwin environment at some 
point, since I don't have a Windows machine on hand.

However, I can't recall any change in the 1.21.2 release that would cause 
Wget to redirect to the log file in a new scenario. So I have to ask: has 
something changed in your environment?
Wget will automatically redirect its output to wget-log if it detects that 
stdout/stderr are not connected to a terminal.

In either case, I'd love to fix this in a way that works in your environment 
because the tests should always run. 
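
As a quick sanity check, you could run something like this in the same CI step
that runs the tests, to see whether stdout is seen as a terminal there (a sketch):

```
[ -t 1 ] && echo "stdout is a terminal" || echo "stdout is redirected"
```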

On Sun, Sep 26, 2021, at 23:45, Brian Inglis wrote:
> Latest wget 1.21.2 checks are all failing with unexpected extra files 
> being downloaded under latest Cygwin 3.0.2 (under Windows 2020H2 with 
> updates to 2021-August).
> I have also tried building with and without autoreconf without change.
>
> The issue appears under both 64 and 32 bit environments and under our 
> Github Actions scallywag CI servers with minimal installs of only the 
> required (Cygwin) lib...-devel etc. packages for building and testing.
>
> The relevant test-suite.log files have been attached, lightly "sanitized 
> for my protection".
>
> Never had any problems with test failures (except unsupported IRI) 
> building any previous releases under either Cygwin arch.
>
> Is there any chance that files e.g. wget-log are being created or saved 
> in directories differing *only* from the download directories by *case*, 
> or by some unexpected *encoding* difference, due to file name characters 
> prohibited by the underlying Windows filesystem, somewhere in the path, 
> or an unexpected location?
>
> Have there been any significant changes in testing requirements or 
> utilities?
>
> Could there be any issues from including updated gnulib m4 macros?
> I have had a couple of recent issues with other packages requiring 
> tweaks to gnulib m4 macros.
>
> Please let me know if there is any further diagnosis I can do or 
> information I can provide to assist with resolving this issue.
>
> -- 
> Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada
>
> This email may be disturbing to some readers as it contains
> too much technical detail. Reader discretion is advised.
> [Data in binary units and prefixes, physical quantities in SI.]
>
> Attachments:
> * testenv-test-suite.log
> * tests-test-suite.log



Re: wget 1.21.2 config gnulib threadlib.m4 serial 31 enables weak references under Cygwin 64

2021-12-01 Thread Darshit Shah
Thanks Brian,

This will be automatically included in the newest release when we update the 
gnulib version.

On Tue, Sep 28, 2021, at 05:19, Brian Inglis wrote:
> Cygwin 64 does not support shared library weak references as no 
> references may be undefined in Windows DLLs.
> Recent gnulib versions (at least 29-31 in bison, wget, and wget2) 
> default Cygwin to gl_cv_have_weak=yes.
> Bruno Haible has applied a patch (referenced below and in the attached) 
> to gnulib threadlib.m4 serial 31, and I have applied a modified version 
> to serial 29:
>
> .
> * m4/threadlib.m4 (gl_WEAK_SYMBOLS): Force a "guessing no" result on Cygwin.
>
> The actual patch is in:
>
> https://lists.gnu.org/archive/html/bug-gnulib/2021-09/msg00068.html
>
> -- 
> Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada
>
> This email may be disturbing to some readers as it contains
> too much technical detail. Reader discretion is advised.
> [Data in binary units and prefixes, physical quantities in SI.]
>
> Attachments:
> * threadlib-m4-31-cygwin-weak-no.patch



[bug #61038] Windows post fails to post files over 65536 bytes,

2021-12-01 Thread Darshit Shah
Follow-up Comment #5, bug #61038 (project wget):

Unfortunately, I am unable to reproduce this on Linux.

This seems to be a Windows-only issue. Could you please mention where you got
your Windows binaries from? It may be worth raising an issue with that packager as well.

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




Re: wildcard and "--exclude-directories"

2021-12-01 Thread Darshit Shah
Hi Jan,

You seem to be using the -X option incorrectly. That option is valid only for 
exact directory names; it does not perform any regex matching. For your 
use case, you want to use the --reject-regex option instead.
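
Something along these lines might work for your case (a sketch; server.domain.com
is your placeholder, and the exact pattern depends on the --regex-type in use):

```
wget --recursive -X /foo --reject-regex '^https://server\.domain\.com/bar/.*/' https://server.domain.com/
```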

On Tue, Oct 19, 2021, at 22:03, Jan Nagel wrote:
> Hello,
>
> I'm trying to mirror a website "https://server.domain.com/" using GNU 
> wget 1.21 (on Debian 12, "testing").
>
> I want to exclude the directory "https://server.domain.com/foo" and all 
> subdirectories of "https://server.domain.com/bar", but I want all files 
> in "https://server.domain.com/bar/" to be included.
>
> So I run:
> wget --recursive -X "/foo,/bar/*" https://server.domain.com/
>
> This doesn't do what I expect it to do:
> The directory "https://server.domain.com/foo" is excluded ... as 
> expected.
> Files in directory "https://server.domain.com/bar/" are included ... as 
> expected.
> But subdirectories of "https://server.domain.com/bar/" are included, 
> too.
>
> The man page says:
> "-X list
> --exclude-directories=list
> Specify a comma-separated list of directories you wish to exclude from 
> download.  Elements of list may contain wildcards."
>
> How can I prevent wget from downloading subdirectories of 
> "https://server.domain.com/bar/;?
>
> Am I using the wildcard "*" in the wrong way?
>
> Thanks for your help!
>
>
> Jan Nagel



Re: Patch: Documentation for TLS 1.3

2021-12-01 Thread Darshit Shah
Applied. Thanks!

On Sun, Nov 21, 2021, at 20:09, Thomas Niederberger wrote:
> Hi wget-team
>
> Attached you find a very small patch that adds documentation for the 
> TLSv1_3 flag to the help.
> As far as I can see the flag is already fully implemented and just not 
> documented.
> Thanks for all your good work!
>
> Cheers,
> Thomas
> Attachments:
> * 0001-src-main.c-print_help-Add-command-line-option-for-TL.patch



Re: "OpenSSL: unimplemented 'secure-protocol' option value 2"

2021-12-01 Thread Darshit Shah
Hi,

It looks like you have a version of the OpenSSL library that doesn't support 
the SSLv3 protocol. That doesn't mean it's an old version of the library; the 
support may simply have been disabled at compile time.

Please check the version of OpenSSL and link Wget against a version that 
supports it, or try using a different --secure-protocol value.
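
For example, to pick a protocol that your OpenSSL build does support (TLSv1_2
here is only an illustration):

```
wget --secure-protocol=TLSv1_2 https://example.com/
```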

On Mon, Nov 29, 2021, at 14:54, Danny Tuerlings via Primary discussion list for 
GNU Wget wrote:
> Hi,
>
> Please advise. Getting an error "OpenSSL: unimplemented 
> 'secure-protocol' option value 2"
> (debug file enclosed).
>
> Thanks in advance.
> Kind regards,
> Danny Tuerlings
>
>
>
>
> Attachments:
> * debug1.txt



[bug #58484] certificate expired

2021-10-11 Thread Darshit Shah
Update of bug #58484 (project wget):

 Open/Closed:Open => Closed 


___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61277] wget crashes when downloading from redirect to ftp

2021-10-11 Thread Darshit Shah
Follow-up Comment #4, bug #61277 (project wget):

Glad that all your issues were resolved.
The other FTP issue might have been resolved by the fix for an FTP path
segmentation fault that I made back in February.


___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61125] Output SSL_INIT to stderr even with -q

2021-10-11 Thread Darshit Shah
Follow-up Comment #3, bug #61125 (project wget):

You're right. There's a little more cleaning up I want to do with respect to
some open issues. So maybe I'll make a release within a week or two.

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60346] Failed to rename file to file.1: (17) File exists

2021-10-11 Thread Darshit Shah
Update of bug #60346 (project wget):

  Status: Needs Investigation => Fixed  
 Assigned to:  rockdaboot => darnir 
 Open/Closed:Open => Closed 
Operating System:   Microsoft Windows => None   
   Fixed Release:None => trunk  

___

Follow-up Comment #3:

Fixed in 65e6d5b3b8c2c8e3c62ba5b9a43b84f80b01251b

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60289] wget --continue: Range header is included in other options' HEAD requests

2021-10-11 Thread Darshit Shah
Update of bug #60289 (project wget):

 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

The claimed expected behaviour is exactly how Wget behaves on v1.21.2

Closing the issue since it is not reproducible. 

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60246] When writing stdout into a pipe, wget never terminates

2021-10-11 Thread Darshit Shah
Update of bug #60246 (project wget):

Severity:  3 - Normal => 4 - Important  
  Status:   Confirmed => In Progress
 Assigned to:None => darnir 

___

Follow-up Comment #2:

Okay, this is a crazy issue, and I'm surprised no one has reported it in so
many years! I think this issue has likely existed for 15 years if not more.

The problem is simple:
We pass the file descriptor of the file to which the downloaded document is
saved in order to parse it for any links. So, when you combine -p with -O-,
Wget ends up waiting indefinitely on a read() call on stdout, which will
_NEVER_ return anything.

In order to handle -p and -O properly, I think we should download to a
temporary file before parsing and then append it to the output file. This is
especially important when using -O-. When writing to a real file, we could
probably optimize the process by writing directly to the file and then calling
fseek() to only process the newly downloaded part of the file.

I would wager this is also an issue with -r -O-. 
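
For anyone who wants to reproduce this quickly, something like the following
hangs forever (the URL is just a placeholder; any page with requisites will
do):

    # Wget writes the page to stdout, then tries to read() that same
    # descriptor back to find the requisites -- which never returns.
    wget -p -O- https://example.com/ | cat > page.html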

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60246] When writing stdout into a pipe, wget never terminates

2021-10-11 Thread Darshit Shah
Update of bug #60246 (project wget):

  Status:None => Confirmed  
 Release:1.20 => trunk  

___

Follow-up Comment #1:

I can verify that this is an issue on all platforms. However, the issue is not
when writing to pipes, but rather when writing to a pipe and using
--page-requisites

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61140] wget 1.21.2 breaks metalink support on 32-bit x86

2021-10-08 Thread Darshit Shah
Update of bug #61140 (project wget):

 Open/Closed:Open => Closed 


___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61277] wget crashes when downloading from redirect to ftp

2021-10-08 Thread Darshit Shah
Update of bug #61277 (project wget):

  Status: Needs Investigation => Fixed  
 Open/Closed:Open => Closed 

___

Follow-up Comment #2:

This should be fixed in commit aecf5fbf.

I can't reproduce the other issues reported here. Could you please try with
the latest trunk build and see if you can still reproduce them?

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61277] wget crashes when downloading from redirect to ftp

2021-10-08 Thread Darshit Shah
Update of bug #61277 (project wget):

Severity:  3 - Normal => 4 - Important  
  Status:None => Needs Investigation
 Assigned to:None => darnir 
  Regression:  No => Yes

___

Follow-up Comment #1:

Thanks for the detailed bug report. I can reproduce this, but only partially.

The first case, with -c, cannot be reproduced by me on Wget 1.21.2.
However, if the files are already there, trying to download them again with -c
results in a segfault. So I'll start investigating there.

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61140] wget 1.21.2 breaks metalink support on 32-bit x86

2021-09-11 Thread Darshit Shah
Update of bug #61140 (project wget):

  Status:None => Need Info  

___

Follow-up Comment #1:

What distro are you on?

This is something the distro maintainers should handle. Unfortunately,
maintaining compatibility with two different ABIs at once is not feasible, and
switching off y2038 support would not be a smart thing to do. It should be
disabled at compile time by the distro maintainer, or at least the distro
should make sure that everything in its repositories is consistent.

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60617] POST not continued after 301, 302

2021-09-08 Thread Darshit Shah
Update of bug #60617 (project wget):

  Status:None => Ready For Test 

___

Follow-up Comment #1:

I think you're right. I'll fix the code and add a new test case for it.

Thanks for reporting!

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61050] Early abort should not write 0 Byte output file

2021-09-08 Thread Darshit Shah
Update of bug #61050 (project wget):

  Status:None => Invalid
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Sorry, this is documented behaviour and will not be changed in Wget. The
manual explicitly states that the -O option is equivalent to the shell
redirection operator ">". Thus, creating / truncating an existing file to 0
bytes immediately is the correct behaviour.
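
In other words, these two invocations are meant to behave the same way, even
on failure (the URL is a placeholder):

    wget -O out.html https://example.com/does-not-exist
    wget -O - https://example.com/does-not-exist > out.html

Both leave an empty out.html behind, because the file is created / truncated
before any data arrives.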

I completely understand where you are coming from. I don't like this either.
But this has been the documented behaviour for many, many years now and
multiple scripts rely on it. As a result, the default will not be changed, in
the interest of backwards compatibility.

GNU Wget2, the next major version of Wget changes the meaning of -O and
functions exactly as you described / expected it to. It was the first breaking
change I introduced in there :)

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60956] stylesheet and icon elements not properly classified as page requisites

2021-09-08 Thread Darshit Shah
Follow-up Comment #2, bug #60956 (project wget):

I believe this has been fixed in 1.21.2 now?
Can we close it?

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #61125] Output SSL_INIT to stderr even with -q

2021-09-08 Thread Darshit Shah
Update of bug #61125 (project wget):

  Status:None => Fixed  
 Assigned to:None => darnir 
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

Thanks for reporting the issue. That is indeed not acceptable in the quiet
mode. Seems like a debug statement was accidentally left behind.

I've removed it and pushed to master. It will be a part of the next release.
Since this is a major annoyance to people using Wget in scripts, I will likely
mark a new release with this fix within a week.

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




Re: wget-1.21.2 released [stable]

2021-09-08 Thread Darshit Shah
Hi Derek,

Sorry for the inconvenience with the GPG keys. I made sure that my key
was valid before I signed and uploaded the tarball.

See below:
gpg --list-key A91A35B6

pub   rsa4096/0x2A1743EDA91A35B6 2015-10-14 [SC] [expires: 2022-09-07]
  Key fingerprint = 7845 120B 07CB D8D6 ECE5  FF2B 2A17 43ED A91A 35B6
uid   [  full  ] Darshit Shah 
uid   [  full  ] Darshit Shah 
sub   rsa4096/0xE92ADE6826DF1410 2015-10-14 [E] [expires: 2022-09-07]
sub   rsa4096/0x64FF90AAE8C70AF9 2016-08-14 [S] [expires: 2022-09-07]
sub   rsa4096/0x5CEAE5CAD23ABCBF 2016-08-19 [A] [expires: 2022-09-07]

It seems like querying the gnupg keyserver with the signing keyid
doesn't work. I'll have to update the release script to not use that,
but instead the actual fingerprint.

And secondly, while I've uploaded my key to the keyservers, it seems
like the process failed without error. I'll look into it this evening.

Thanks for bringing it to my attention!

On 08.09.21 16:47, Derek Martin wrote:
> On Tue, Sep 07, 2021 at 05:22:03PM -0400, Derek Martin wrote:
>> On Tue, Sep 07, 2021 at 09:28:49PM +0200, Darshit Shah wrote:
>>> We are pleased to announce the release of GNU Wget 1.21.2
>> [...]
>>>   gpg --verify wget-1.21.2.tar.gz.sig
>>>
>>> If that command fails because you don't have the required public key,
>>> then run this command to import it:
>>>
>>>   gpg --keyserver keys.gnupg.net --recv-keys 64FF90AAE8C70AF9
>>
>> $ gpg --keyserver keys.gnupg.net --recv-keys 64FF90AAE8C70AF9
>> gpg: keyserver receive failed: No name
>>
>> :(
> 
> Tried other key servers as well...
> 
> $ gpg --keyserver keys.openpgp.org --recv-keys 64FF90AAE8C70AF9
> gpg: key 2A1743EDA91A35B6: no user ID
> gpg: Total number processed: 1
> 
> $ gpg --list-key 64FF90AAE8C70AF9
> gpg: error reading key: No public key
> 
> Was able to get it from Ubunu's key server, however:
> 
> $ gpg --keyserver keyserver.ubuntu.com --recv-keys 64FF90AAE8C70AF9
> gpg: key 2A1743EDA91A35B6: 8 duplicate signatures removed
> gpg: key 2A1743EDA91A35B6: 104 signatures not checked due to missing
> keys
> gpg: key 2A1743EDA91A35B6: public key "Darshit Shah "
> imported
> gpg: no ultimately trusted keys found
> gpg: Total number processed: 1
> gpg:   imported: 1
> 
> $ gpg  --verify wget-1.21.2.tar.gz.sig wget-1.21.2.tar.gz
> gpg: Signature made Tue 07 Sep 2021 03:05:35 PM EDT
> gpg:using RSA key
> 6B98F637D879C5236E277C5C64FF90AAE8C70AF9
> gpg: Good signature from "Darshit Shah " [expired]
> gpg: aka "Darshit Shah " [expired]
> gpg: Note: This key has expired!
> Primary key fingerprint: 7845 120B 07CB D8D6 ECE5  FF2B 2A17 43ED A91A 35B6
>  Subkey fingerprint: 6B98 F637 D879 C523 6E27  7C5C 64FF 90AA E8C7 0AF9
> 
> In particular, note:
> 
>> gpg: Note: This key has expired!
> 
> And:
> 
> $ gpg --list-key 0x64FF90AAE8C70AF9
> pub   rsa4096 2015-10-14 [SC] [expired: 2020-08-16]
>   7845120B07CBD8D6ECE5FF2B2A1743EDA91A35B6
> uid   [ expired] Darshit Shah 
> uid   [ expired] Darshit Shah 
> 
> This key has been expired for over a year.
> 
> Please take care of the maintenance of your GPG key and if appropriate
> re-sign the tarball.
> 


OpenPGP_0x2A1743EDA91A35B6.asc
Description: OpenPGP public key


OpenPGP_signature
Description: OpenPGP digital signature


wget-1.21.2 released [stable]

2021-09-07 Thread Darshit Shah
We are pleased to announce the release of GNU Wget 1.21.2

GNU Wget is a free utility for non-interactive download of files from
the  Web. It supports HTTP(S), and FTP(S) protocols, as well as
retrieval through HTTP proxies.

This is a minor bugfix release.

Many thanks to everyone who contributed to this release:

Darshit Shah
Josef Moellers
Michal Ruprich
Nekun
Nils
Shamil Gumirov
Tim Rühsen
Vincent Lefevre
WB
jmoellers
jrharri...@gmail.com

===

Here are the compressed sources:
  https://ftpmirror.gnu.org/wget/wget-1.21.2.tar.gz   (4.8MB)
  https://ftpmirror.gnu.org/wget/wget-1.21.2.tar.lz   (2.3MB)

Here are the GPG detached signatures[*]:
  https://ftpmirror.gnu.org/wget/wget-1.21.2.tar.gz.sig
  https://ftpmirror.gnu.org/wget/wget-1.21.2.tar.lz.sig

Use a mirror for higher download bandwidth:
  https://www.gnu.org/order/ftp.html

Here are the SHA1 and SHA256 checksums:

1a0f9beb9d34b41fd214e1c8ddcd58704c72acec  wget-1.21.2.tar.gz
5tTHa+gsZ23X6MYaKbKshRCuEIqBC10dGPyaHSyaJJc  wget-1.21.2.tar.gz
4255d97b9067cf91e5c1439520857cdaa6460fc6  wget-1.21.2.tar.lz
FyejMKhqyss+V2Fc4mj18pl4v3rexKvmow03Age8kbM  wget-1.21.2.tar.lz

The SHA256 checksum is base64 encoded, instead of the
hexadecimal encoding that most checksum tools default to.

[*] Use a .sig file to verify that the corresponding file (without the
.sig suffix) is intact.  First, be sure to download both the .sig file
and the corresponding tarball.  Then, run a command like this:

  gpg --verify wget-1.21.2.tar.gz.sig

If that command fails because you don't have the required public key,
then run this command to import it:

  gpg --keyserver keys.gnupg.net --recv-keys 64FF90AAE8C70AF9

and rerun the 'gpg --verify' command.

This release was bootstrapped with the following tools:
  Autoconf 2.71
  Automake 1.16.4
  Gnulib v0.1-4904-g394dde1b2

NEWS

* Noteworthy changes in release 1.21.2 (2021-09-07)

** Support for autoconf 2.71

** Fix a double free in FTP when using an absolute path

** Release tarballs no longer have a dependency on Python.

** --page-requisites will now also download links marked as "alternate
   stylesheet" or "icon"





Re: Patch for bug 56909

2021-09-07 Thread Darshit Shah
Hi Aleksander,

Thank you for the patch to GNU Wget!

I think the new --keep-auth-header option is a misnomer, since it only applies 
to the case where the user explicitly passes an "Authorization" header, 
bypassing Wget's own handling of it.
Thus, if this feature is to be implemented, I would rather it be implemented as 
an option like "--remove-on-redir", or something else that accepts a list of 
headers to remove. The user can then pass whatever headers they want removed 
on a redirection to a different domain.
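
To illustrate, usage of such an option could look like this (purely 
hypothetical; neither the option nor the example below is implemented):

    # strip the listed headers whenever the redirect leads to another domain
    wget --header="Authorization: Bearer TOKEN" \
         --remove-on-redir="Authorization,Cookie" \
         https://example.com/protected/file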

Also, we would need to document the new option in the man and info pages as 
well.

On Tue, Sep 7, 2021, at 13:13, Aleksander Bułanowski via Primary discussion 
list for GNU Wget wrote:
> Hello wget maintainers,
> 
> Attached there is a patch file that strips sending Authentication headers
> on redirects.
> This should solve the https://savannah.gnu.org/bugs/?56909 / CVE-2021-31879.
> 
> Regards,
> Aleksander Bułanowski
> 
> Attachments:
> * wget-redirect-auth.patch



Re: Wget fails to --continue download

2021-08-14 Thread Darshit Shah
--continue requires server side support. While a majority of the servers out 
there support it, not all do. Check the --debug logs to see if the server 
responds with a 200 OK or a 206 Partial Content response.

For --continue to work correctly, you need the server to respond with a 206 
Partial Content with the correct range. 
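
A quick way to check (the URL is a placeholder for your actual download):

    wget --continue --server-response https://example.com/big.file 2>&1 | grep 'HTTP/'

If you only ever see a "200 OK" status line and never a "206 Partial Content",
the server is ignoring the Range request and the download restarts from the
beginning.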

On Sat, Aug 14, 2021, at 11:21, Nils Andre wrote:
> The subject is self explanatory.
> 
> I started downloading some files using wget on disk A but most of the
> downloads failed due to disk A being full. So I proceeded to `mv` some
> of the files to disk B and then executed the same wget commands again
> adding the --continue flag.
> 
> However, resuming download on both disks fails, it seems that the files
> are downloaded from the beginning again.
> 
> I'm not sure what is causing this. If this is a known issue, with a
> specific cause, please let me know, otherwise I'm here to report that 1
> person has experienced this issue.
> 
> Thanks,
> 
> Nils
> 
> PS: --continue worked as expected on one of the files.
> 
> 



Re: This version does not have support for IRIs

2021-07-28 Thread Darshit Shah
The options are always available. We cannot remove them, since the man pages 
would then not match the actual application, and shipping multiple man pages 
is just not a good idea.

Whether or not the feature is actually supported depends on the compile-time 
options. It seems like your version of Wget was compiled without IRI support. 
You would need to recompile it with IRI support if you want to use it.
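
You can check what your binary was built with by looking at the feature list 
near the top of the version output; IRI support shows up there as "+iri" 
(or "-iri" when it is missing):

    wget --version | head -n 5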

On Mon, Jul 26, 2021, at 16:30, Roger Brooks via Primary discussion list for 
GNU Wget wrote:
> If I add the option "--local-encoding=UTF-8" to my wget script, wget 1.19.1
> (the version on my NAS) says:
> 
> “This version does not have support for IRIs”
> 
> If I run "wget --help" on my NAS, both "--local-encoding" and
> "--remote-encoding" are listed as options.
> This error message was reported as a bug against 1.12.x
> 
> Is it still a known bug?
> 
> Was it fixed between 1.19.1 and 1.21.1?
> 
> Am I doing something wrong?
> 
> Thanks in advance for your advice.
> 



Re: bug report - firebase downloading in msys2 arch linux

2021-07-28 Thread Darshit Shah
Hi,

How does this concern GNU Wget? Why is this sent to us?

On Wed, Jul 28, 2021, at 16:45, Ivo Antônio Clemente Júnior wrote:
> [image: image.png]
> 
> 
> -- 
> Adm. Ivo Antônio Clemente Júnior
> CRA SP 118564
> MBA em Negócios Internacionais - FGV Management
> Ribeirão Preto - São Paulo - Brasil
> + 55 16 3624 7674
> + 55 16 9 9295 0925
> ivo.clemente.jun...@gmail.com
> https://br.linkedin.com/pub/ivo-antônio-clemente-júnior/47/671/860
> 
> Attachments:
> * image.png



[bug #59903] regex.c in trunk and 1.12.1 - ./malloc/dynarray-skeleton.c:205:40: error: expected identifier or '('

2021-03-07 Thread Darshit Shah
Update of bug #59903 (project wget):

 Privacy: Private => Public 

___

Follow-up Comment #1:

Firstly: Please take note that the repository URL you provided is not an
official upstream repository for GNU Wget. We neither know about it, nor
control it. The code you see there is most likely a true mirror of the GNU
Wget sources, but we take no responsibility for it. The correct upstream
sources are either:
* https://git.savannah.gnu.org/git/wget.git
* https://gitlab.com/gnuwget/wget.git

Regarding your specific issue, I hope it has already been fixed in master. But
given that I do not have access to a MacOS device, I cannot guarantee it.
Could you please check it again on the current trunk and  report if it works?

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60017] Italian translation error in help text

2021-03-07 Thread Darshit Shah
Update of bug #60017 (project wget):

  Status:None => Wont Fix   
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

GNU Wget employs the services of the Translation Project to provide
translations. They require us to not interfere in the translation process at
all.

Please contact the Translation team directly about issues pertaining to
specific translations. The team for Italian translations can be reached at:
t...@lists.linux.it

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[bug #60106] wget mistakenly interprets Content-Range as bytes, even with Range-Unit: items, and truncates content

2021-03-07 Thread Darshit Shah
Update of bug #60106 (project wget):

  Status:None => Invalid
 Open/Closed:Open => Closed 

___

Follow-up Comment #2:

Hi,

As far as I can tell, the "Range-Unit" header is not mentioned in any of the
HTTP/1.1 Specifications (RFC 7230-7237). The correct way to specify the range
unit would be to include the unit in the "Content-Range" header itself. This is
very clearly outlined both in RFC 2616, the original HTTP/1.1 specification, and
in RFC 7233, the latest revision of HTTP/1.1. Please refer to the ABNF provided
there to create valid HTTP responses that conform to the specification.

By the specification, if the unit is not specified, the client is allowed to
interpret it as Bytes, which is exactly what Wget does.
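
For reference, a conforming partial response carries the unit inside the
Content-Range field itself, for example:

    HTTP/1.1 206 Partial Content
    Content-Range: bytes 0-999/8000
    Content-Length: 1000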

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




Re: I found bug in wget

2021-03-07 Thread Darshit Shah
Hi,


On 07.03.21 01:14, kmb...@yandex.ru wrote:
> Hello, Bug-wget.
> 
> I have found one unpleasant particularity Wget.
> Sometimes it can't completely copy recursive the site.
> Since pages and directory of the site are generated dynamically on the 
> grounds of Databasee (MySQL) and do not exist in realities.
> As example of the page of the shop
> https://modastori.prom.ua/g39944845-zhenskaya-obuv
> https://modastori.prom.ua/g39944845-zhenskaya-obuv/page_2
> When downloaded the second page it deletes first.
> wget.exe -x -c --no-check-certificate -i getprom.txt -P ".\shop"
> 
> File getprom.txt contains links
> https://modastori.prom.ua/g39944845-zhenskaya-obuv
> https://modastori.prom.ua/g39944845-zhenskaya-obuv/page_2
> 

This is simple. You first download a page called
"g39944845-zhenskaya-obuv", and then try to download a page in a
subdirectory of the same name. A file and a directory cannot share a name,
at least on Unix filesystems, so Wget cannot keep both. Instead, it deletes
the old file and creates a directory in its place.

Wget is smart enough to recognize this when performing a recursive
download and if you also have --convert-links enabled, it will save the
file as g39944845-zhenskaya-obuv.html to prevent a name collision.

In your particular case of using the -i switch, you could use the
--adjust-extension switch to force Wget to add the html extension. It
will however break links. The only way to fix that is to download it
recursively.
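
For your -i invocation, that would roughly be (untested on Windows):

    wget -x -c --adjust-extension --no-check-certificate -i getprom.txt -P ".\shop"

so that the first URL is saved as g39944845-zhenskaya-obuv.html and no longer
collides with the directory needed for page_2.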

> It occur and when recursive download
> wget.exe -x -r --no-check-certificate https://modastori.prom.ua/ -P ".\shop"
> wget.exe -m --no-check-certificate https://modastori.prom.ua/ -P ".\shop"
> In this case wget can't create full copy of site.
> 

I'm sorry, I cannot understand this section. What exactly is the problem
again?

> When download on the contrary
> https://modastori.prom.ua/g39944845-zhenskaya-obuv/page_2
> https://modastori.prom.ua/g39944845-zhenskaya-obuv
> All are OK.
> All files created.
> I think not correct made function of the check of existence of the file.
> OS: WinXP SP3 NTFS
> 

P.S.: I see that you're running Wget 1.11. That is an extremely ancient
version of Wget and absolutely not supported anymore. Please try to
update to a newer version of Wget.



[bug #60119] wget missing URL

2021-02-25 Thread Darshit Shah
Update of bug #60119 (project wget):

  Status:None => Invalid
 Privacy: Private => Public 
 Open/Closed:Open => Closed 

___

Follow-up Comment #1:

```
wget -q -O- https://repo.protonvpn.com/debian/public_key.asc | sudo apt-key add -
```

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




Re: wget 1.21.1 fails to build on macOS (10.14, 10.15, 11.1)

2021-01-25 Thread Darshit Shah



On 25.01.21 06:40, Carlo Cabrera wrote:
> wget 1.21.1 fails to build on macOS 10.14, 10.15, and 11.1. The build
> fails with a series of errors starting with
> 
> In file included from regex.c:74:
> In file included from ./regexec.c:1362:
> ./malloc/dynarray-skeleton.c:195:13: error: expected identifier or '('
> __nonnull ((1))
> ^
> ./malloc/dynarray-skeleton.c:195:13: error: expected ')'
> ./malloc/dynarray-skeleton.c:195:12: note: to match this '('
> __nonnull ((1))
>^
> 
> Complete build logs available at
> 
> https://github.com/Homebrew/homebrew-core/pull/68667
> 
> The errors (on 11.1) starts at
> 
> 
> https://github.com/Homebrew/homebrew-core/pull/68667/checks?check_run_id=1674726970#step:7:910
> 
> The exact same errors affect findutils 4.8.0, so the bug may not be in
> wget. If so, I'd appreciate being pointed to the right place to submit
> this bug report.
> 
> 

This looks like a bug in the gnulib code. The correct place to report
it would be bug-gnu...@gnu.org. I will try and take a closer look at it
soon, but I can't promise a timeline. A little too busy these days.



wget-1.21.1 released [stable]

2021-01-09 Thread Darshit Shah
We are pleased to announce the release of GNU Wget 1.21.1

GNU Wget is a free utility for non-interactive download of files from
the  Web. It supports HTTP(S), and FTP(S) protocols, as well as
retrieval through HTTP proxies.

This is a minor point release to account for some of the build issues on
MacOS and Solaris machines.

Many thanks to everyone who contributed to this release:

Darshit Shah
Jeffrey Walton
Matt Whitlock

===

Here are the compressed sources:
  https://ftpmirror.gnu.org/wget/wget-1.21.1.tar.gz   (4.7MB)
  https://ftpmirror.gnu.org/wget/wget-1.21.1.tar.lz   (2.3MB)

Here are the GPG detached signatures[*]:
  https://ftpmirror.gnu.org/wget/wget-1.21.1.tar.gz.sig
  https://ftpmirror.gnu.org/wget/wget-1.21.1.tar.lz.sig

Use a mirror for higher download bandwidth:
  https://www.gnu.org/order/ftp.html

[*] Use a .sig file to verify that the corresponding file (without the
.sig suffix) is intact.  First, be sure to download both the .sig file
and the corresponding tarball.  Then, run a command like this:

  gpg --verify wget-1.21.1.tar.gz.sig

If that command fails because you don't have the required public key,
then run this command to import it:

  gpg --keyserver keys.gnupg.net --recv-keys 64FF90AAE8C70AF9

and rerun the 'gpg --verify' command.

This release was bootstrapped with the following tools:
  Autoconf 2.70
  Automake 1.16.2
  Gnulib v0.1-4348-gc738b11c8

NEWS

* Noteworthy changes in release 1.21.1 (2021-01-09)

** Fix compilation on MacOS and Solaris 9

** Resolve a bashism in configure.ac

** Fix a compilation warning on 32-bit systems



OpenPGP_signature
Description: OpenPGP digital signature


Re: Warnings on 32-bit machine

2021-01-07 Thread Darshit Shah
Hi Jeffrey,

Thanks! This happened due to some last minute cleanups we did on
portability code in Wget. I've pushed a patch that should fix these
warnings.

In the future, there are more such cleanups I intend to perform to
simplify the codebase now that Wget expects parts of C99 support everywhere.

On 06.01.21 06:38, Jeffrey Walton wrote:
> Hi Tim/Darshit,
> 
> This is from Wget 1.21 on 32-bit hardware.
> 
> retr.c: In function 'fd_read_body':
> retr.c:498:35: warning: format '%ld' expects argument of type 'long
> int', but argument 2 has type 'wgint {aka long long int}' [-Wformat=]
>DEBUGP(("zlib stream ended unexpectedly after "
>^
> wget.h:129:54: note: in definition of macro 'DEBUGP'
>  #define DEBUGP(args) do { IF_DEBUG { debug_logprintf args; } } while (0)
>   ^~~~
> retr.c:499:38: note: format string is defined here
>"%ld/%ld bytes\n", sum_read, toread));
> ~~^
> %lld
> In file included from retr.c:31:0:
> retr.c:498:35: warning: format '%ld' expects argument of type 'long
> int', but argument 3 has type 'wgint {aka long long int}' [-Wformat=]
>DEBUGP(("zlib stream ended unexpectedly after "
>^
> wget.h:129:54: note: in definition of macro 'DEBUGP'
>  #define DEBUGP(args) do { IF_DEBUG { debug_logprintf args; } } while (0)
>   ^~~~
> retr.c:499:42: note: format string is defined here
>"%ld/%ld bytes\n", sum_read, toread));
> ~~^
> %lld
> In file included from retr.c:31:0:
> retr.c:591:19: warning: format '%ld' expects argument of type 'long
> int', but argument 3 has type 'wgint {aka long long int}' [-Wformat=]
>DEBUGP(("zlib read size differs from raw read size (%lu/%ld)\n",
>^
> wget.h:129:54: note: in definition of macro 'DEBUGP'
>  #define DEBUGP(args) do { IF_DEBUG { debug_logprintf args; } } while (0)
>   ^~~~
> 
> Jeff
> 
> 



Re: wget-1.21: 3 * suspicious conditions ?

2021-01-03 Thread Darshit Shah
Hi,

Thanks for the heads up. The expression is indeed more complex than it
should be. I've made some more elaborate changes than you suggested to
fix and simplify these expressions

On 03.01.21 09:21, David Binderman wrote:
> Hello there,
> 
> wget-1.21/src/retr.c:1445:10: style: Suspicious condition (assignment + 
> comparison); Clarify expression with parentheses. [clarifyCondition]
> wget-1.21/src/retr.c:1447:15: style: Suspicious condition (assignment + 
> comparison); Clarify expression with parentheses. [clarifyCondition]
> wget-1.21/src/retr.c:1454:6: style: Suspicious condition (assignment + 
> comparison); Clarify expression with parentheses. [clarifyCondition]
> 
> The first one is
> 
>   if ((overflow = ((unsigned) snprintf (to, sizeof (to), "%s%s%d", fname, 
> SEP, i)) >= sizeof (to)))
> 
> Maybe better code
> 
>   if ((overflow = ((unsigned) snprintf (to, sizeof (to), "%s%s%d", fname, 
> SEP, i))) >= sizeof (to))
> 
> Regards
> 
> David Binderman
> 
> 



Re: [PATCH] configure.ac: Don't use bashisms

2021-01-03 Thread Darshit Shah
Merged.

Thanks for the fix :)

On 02.01.21 20:34, Lars Wendler wrote:
> From: Matt Whitlock 
> 
> Gentoo-bug: https://bugs.gentoo.org/762946
> ---
>  configure.ac | 2 +-
>  1 file changed, 1 insertion(+), 1 deletion(-)
> 
> diff --git a/configure.ac b/configure.ac
> index 96adf13b..f6268fd5 100644
> --- a/configure.ac
> +++ b/configure.ac
> @@ -978,7 +978,7 @@ AM_CONDITIONAL([IRI_IS_ENABLED], [test "X$iri" != "Xno"])
>  AM_CONDITIONAL([WITH_SSL], [test "X$with_ssl" != "Xno"])
>  AM_CONDITIONAL([METALINK_IS_ENABLED], [test "X$with_metalink" != "Xno"])
>  AM_CONDITIONAL([WITH_XATTR], [test "X$ENABLE_XATTR" != "Xno"])
> -AM_CONDITIONAL([WITH_NTLM], [test "X$ENABLE_NTLM" == "Xyes"])
> +AM_CONDITIONAL([WITH_NTLM], [test "X$ENABLE_NTLM" = "Xyes"])
>  
>  dnl
>  dnl Create output
> 



Re: Wget 1.21 and OS X 10.9

2021-01-01 Thread Darshit Shah
Hi,

The Python tests look like they fail because the system does not have a
Python3 binary.

From past experience, it is possible that you have Python 3 installed,
but the system does not provide a python3 binary, just python. I don't have
a clean way to work around this problem.

On 01.01.21 21:59, Jeffrey Walton wrote:
> Hi Everyone,
> 
> It looks like things went sideways on OS X 10.9. OS X 10.9 has a Perl
> new enough so that I execute self tests.
> 
> $ perl -V
> Summary of my perl5 (revision 5 version 16 subversion 2)
> 
> Here's a sample:
> 
> gmake[4]: Entering directory '/Users/jwalton/Build-Scripts/wget-1.21/testenv'
> FAIL: Test-504.py
> FAIL: Test-416.py
> FAIL: Test-auth-basic-fail.py
> FAIL: Test-auth-basic.py
> FAIL: Test-auth-basic-netrc.py
> FAIL: Test-auth-basic-netrc-user-given.py
> FAIL: Test-auth-basic-netrc-pass-given.py
> FAIL: Test-auth-basic-no-netrc-fail.py
> FAIL: Test-auth-both.py
> FAIL: Test-auth-digest.py
> FAIL: Test-auth-no-challenge.py
> FAIL: Test-auth-no-challenge-url.py
> FAIL: Test-auth-retcode.py
> FAIL: Test-auth-with-content-disposition.py
> FAIL: Test-c-full.py
> FAIL: Test-condget.py
> FAIL: Test-Content-disposition-2.py
> FAIL: Test-Content-disposition.py
> FAIL: Test--convert-links--content-on-error.py
> FAIL: Test-cookie-401.py
> FAIL: Test-cookie-domain-mismatch.py
> FAIL: Test-cookie-expires.py
> FAIL: Test-cookie.py
> FAIL: Test-Head.py
> FAIL: Test-hsts.py
> FAIL: Test--https.py
> FAIL: Test--https-crl.py
> FAIL: Test-missing-scheme-retval.py
> FAIL: Test-O.py
> FAIL: Test-pinnedpubkey-der-https.py
> FAIL: Test-pinnedpubkey-der-no-check-https.py
> FAIL: Test-pinnedpubkey-hash-https.py
> FAIL: Test-pinnedpubkey-hash-no-check-fail-https.py
> FAIL: Test-pinnedpubkey-pem-fail-https.py
> FAIL: Test-pinnedpubkey-pem-https.py
> FAIL: Test-Post.py
> FAIL: Test-recursive-basic.py
> FAIL: Test-recursive-include.py
> FAIL: Test-recursive-redirect.py
> FAIL: Test-redirect.py
> FAIL: Test-redirect-crash.py
> FAIL: Test--rejected-log.py
> FAIL: Test-reserved-chars.py
> FAIL: Test--spider-r.py
> FAIL: Test-no_proxy-env.py
> 
> Testsuite summary for wget 1.21
> 
> # TOTAL: 45
> # PASS:  0
> # SKIP:  0
> # XFAIL: 0
> # FAIL:  45
> # XPASS: 0
> # ERROR: 0
> 
> Jeff
> 



Re: wget 1.21 fails to build on macOS

2021-01-01 Thread Darshit Shah
Hi,

Thanks for the heads up. Luckily it has already been fixed in Gnulib
(just one hour after I updated gnulib to make the release).

I'll update our submodule right away and make a new point release in the
next week after accumulating any other complaints that might arise.


On 01.01.21 11:32, FX wrote:
> Hello all,
> 
> wget 1.21 fails to build from source on all macOS version (10.14, 10.15, and 
> 11). This comes from a bug in gnulib, as I understand it:
> In lib/utime.c, in the code block for REPLACE_FUNC_UTIME_FILE, errno and 
> EOVERFLOW are both used without  being included, leading to:
> 
> utime.c:279:38: error: use of undeclared identifier 'errno'
>   if (stat (name, ) == -1 && errno != EOVERFLOW)
>  ^
> utime.c:279:47: error: use of undeclared identifier 'EOVERFLOW'
>   if (stat (name, ) == -1 && errno != EOVERFLOW)
>   ^
> 2 errors generated.
> make[3]: *** [utime.o] Error 1
> 
> See https://github.com/Homebrew/homebrew-core/pull/68095 for full build log
> 
> 
> This was already reported to gnulib at 
> https://lists.gnu.org/archive/html/bug-gnulib/2020-12/msg00295.html
> and fix in 
> https://git.savannah.gnu.org/cgit/gnulib.git/commit/?id=6a76832db224ac5671599ce332717f985a2addc7
> 
> Could the fix please be applied to wget as well?
> 
> Thanks,
> FX Coudert
> 



wget-1.21 released [stable]

2020-12-31 Thread Darshit Shah
We are pleased to announce the release of GNU Wget 1.21.

GNU Wget is a free utility for non-interactive download of files from
the  Web. It supports HTTP(S), and FTP(S) protocols, as well as
retrieval through HTTP proxies.

This is a small release with some bug fixes and a few quality of life
improvements.

On behalf of everyone contributing to GNU Wget, I wish everyone a happy
new year. May it be better than 2020!

Many thanks to everyone who contributed to this release:

Ander Juaristi
Artem Egorenkov
AviSoomirtee
Bruno Haible
Darshit Shah
Eneas U de Queiroz
Frans de Boer
Gisle Vanem
Jens Schleusener
Jim Cathey
JunDong Xie
Lauri Nurmi
Leif Ryge
raminfp
Steven M. Schweda
sulfastor
Swapnil More
Tim Rühsen
Tomas Hozza
Vyacheslav
Вячеслав Петрищев

===

Here are the compressed sources:
  https://ftp.gnu.org/gnu/wget/wget-1.21.tar.gz   (4.7MB)
  https://ftp.gnu.org/gnu/wget/wget-1.21.tar.lz   (2.3MB)

Here are the GPG detached signatures[*]:
  https://ftp.gnu.org/gnu/wget/wget-1.21.tar.gz.sig
  https://ftp.gnu.org/gnu/wget/wget-1.21.tar.lz.sig

Use a mirror for higher download bandwidth:
  https://www.gnu.org/order/ftp.html

Here are the MD5 and SHA1 checksums:

3852118b7a771a7c9c033c8f5dbf  wget-1.21.tar.gz
c10e69697250635fa2275ed0ab4e9439  wget-1.21.tar.lz
267451bb8de36cc7ac9c7c0403b2dce8854c1d97  wget-1.21.tar.gz
07fc9a34a76c91d377a2594dbb61ba83cebab9ad  wget-1.21.tar.lz

[*] Use a .sig file to verify that the corresponding file (without the
.sig suffix) is intact.  First, be sure to download both the .sig file
and the corresponding tarball.  Then, run a command like this:

  gpg --verify wget-1.21.tar.gz.sig

If that command fails because you don't have the required public key,
then run this command to import it:

  gpg --keyserver keys.gnupg.net --recv-keys dar...@gnu.org

and rerun the 'gpg --verify' command.

This release was bootstrapped with the following tools:
  Autoconf 2.70
  Automake 1.16.2
  Gnulib v0.1-4251-g43ee1a6bf


NEWS

* Changes in Wget 1.21

** Improve the number of translated strings

** Remove all uses of alloca
   In some places the length of untrusted strings has been used, e.g.
   strings from the command line or from remote.

** Fix buffer overflows in progress bar code in some locales

** Fix two null pointer accesses

** Amend cookie file header to be recognized by the 'file' command

** Post Handshake Authentication for OpenSSL

** Require gettext version 0.19.3+

** Add configure flags --enable-fsanitize-ubsan, --enable-fsanitize-asan
   and --enable-fsanitize-msan for gcc and clang

** Make several smaller fixes, enhance fuzzing, enhance building

On Behalf of the Maintainers,
Darshit Shah



Re: Wget 1.20.3 v. VMS

2020-12-30 Thread Darshit Shah
Hi Steven,

Thanks for the patch. I've gone through it and applied it to the current
Wget repository.

I've taken the liberty to make some changes to your patches:

1. I reverted the changes to the help output. They weren't VMS specific.
I'd love to get some uniformity here, but this isn't the place to do it.
2. For use_askpass I instead disabled the option entirely on VMS


Regarding your troubles with the print functions, please contact the
gnulib maintainers at bug-gnu...@gnu.org, they will be able to help you
and fixing the issue at that level will benefit all GNU projects.

A new release is coming soon with your patches included. Thanks!

On 22.12.20 07:29, Steven M. Schweda wrote:
>Greetings:
> 
>It's been a while since I've tried to put a current version of Wget
> onto VMS, but I recently tried 1.20.3, with some success, but with some
> changes needed in the main source.
> 
>Original and modified files should be available at:
> 
>   http://antinode.info/ftp/wget/wget-1_20_3a_vms/wget-1_20_3a_mods.zip
> 
>Notes follow.
> 
> 
> 
>   src/hsts.c
> 
>time_t on VMS is typically unsigned.  (Lazy man's solution to 2038?) 
> I added "(time_t)" type casts to negative values ("-1"), and changed
> tests to avoid complaints like:
> 
>   return (t < 0 ?
> ..^
> %CC-I-QUESTCOMPARE, In this statement, the unsigned expression "t" is being 
> comp
> ared with a relational operator to a constant whose value is not greater than 
> ze
> ro.  This might not be what you intended.
> at line number 224 in file 
> ITS$DKA0:[UTILITY.SOURCE.WGET.wget-1_20_3.src]hsts.c;
> 1
> 
> I believe that it's all compatible with a signed time_t.
> 
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>File path name construction using UNIX-only format:
> 
>   filename = aprintf ("%s/.wget-hsts-test", opt.homedir);
> 
> was replaced by a new function which includes (and segregates) the
> VMS-specific alternative code:
> 
>   filename = ajoin_dir_file (opt.homedir, ".wget-hsts-test");
> 
> 
> 
>   src/init.c
> 
>New function, ajoin_dir_file(), to join a directory and file name
> (used in hsts.c, init.c, and main.c).
> 
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>Removed VMS-specific code using "SYS$LOGIN".  (getenv( "HOME) works
> on VMS, too, when handled properly.)
> 
> 
> 
>   src/init.h
> 
>Added prototype for new function, ajoin_dir_file().
> 
> 
> 
>   src/log.c
> 
>Disabled check_redirect_output() on VMS (as on Windows).
> 
> 
> 
>   src/main.c
> 
>Changed to use ajoin_dir_file().
> 
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>Changed a "`" to "'" in the help text.  I assume that you have a
> policy on "`", but its use seems inconsistent.  I avoid it entirely,
> because, in my experience, it's almost always rendered asymmetrically
> with respect to "'", hence ugly and distracting.  Perhaps I just use bad
> fonts.
> 
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>Added a "use_askpass() not implemented on VMS" message, but did not
> actually disable the option.
> 
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>Added a VMS-specific line (could be more) to the -V/--version report
> in print_version().  Typical output at the moment includes the
> (run-time) VMS version and the OpenSSL version, if any:
> 
>   VMS V8.4-2L1, OpenSSL 1.1.1h  22 Sep 2020
> 
> 
> 
>   src/utils.c
> 
>Changed data types in VMS-specific code in fork_to_background() to
> agree with changed types in non-VMS code.
> 
> 
> 
>Happened to notice:
> 
>   README
> 
> Recursive downloading also works with FTP, where Wget can retrieves a
>---^
> 
> 
>   Other complaints.
> 
>I had a bunch of trouble trying to figure out what to do with the GNU 
> print functions (asprintf(), snprintf(), vasnprintf(), vsnprintf()).  I
> assume that the UNIX auto-jive copes with this stuff, but I couldn't see
> why it would make any sense for vasnprintf() to use snprintf(), and for
> snprintf() to use vasnprintf().  It took me a while to diagnose the
> resulting stack overflow.  I found a solution, but...
> 
> - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
> 
>  

Re: Wget2 bootstrap and "Unknown option: --"

2020-08-11 Thread Darshit Shah
Thanks. The bootstrap script is maintained by upstream gnulib. So I'll 
have to discuss this case with them in order to change the script. But 
this patch will definitely be helpful


On 8/10/20 9:49 PM, Jeffrey Walton wrote:

On Mon, Aug 10, 2020 at 1:02 PM Darshit Shah  wrote:


Interesting.. Thanks for the report.

However, it may make sense to keep this behavior since I don't think
that the script is compatible with Python 2. I'll have to check it
again. I will likely change the behaviour to explicitly check for Python
3 and fall back to the bourne shell script instead.


This patch will get to the "Python 2.3.4 is too old" message.

* https://github.com/noloader/Build-Scripts/blob/master/patch/wget2.patch

Jeff





Re: Wget2 bootstrap and "Unknown option: --"

2020-08-10 Thread Darshit Shah

Interesting.. Thanks for the report.

However, it may make sense to keep this behavior since I don't think 
that the script is compatible with Python 2. I'll have to check it 
again. I will likely change the behaviour to explicitly check for Python 
3 and fall back to the bourne shell script instead.


On 8/10/20 6:44 PM, Jeffrey Walton wrote:

Hi Everyone/Tim,

Are you guys interested in this? It's from a Ubuntu 4 system. Python
is there, but it is antique.

What I have found is, the best way to check Python's version is
'python -V'. It works on old and new Python.

Jeff



Unknown option: --
usage: python [option] ... [-c cmd | file | -] [arg] ...
Try `python -h' for more information.
./bootstrap: Error: 'python' not found

./bootstrap: Please install the prerequisite programs



And:

$ python --version
Unknown option: --

$ python -V
Python 2.3.4






Re: Wget fails inside a docker container

2020-08-10 Thread Darshit Shah

Hi,

We cannot help you without more information. Please provide output from 
the working and non-working instances, preferably using the --debug 
switch, so we can see what was happening.


Also, please share the output of wget --version.

On 8/10/20 4:48 PM, Matti Kamarainen wrote:

Hi,

not sure if this email address is right for asking this kind of questions,
but I hope so.

I'm trying to download data inside an Ubuntu 18.04 docker container using
wget. Mostly it works fine, but one specific file can't be downloaded:
"wget
ftp://ftp.cpc.ncep.noaa.gov/precip/PEOPLE/wd52ws/global_temp/CPC_GLOBAL_T_V0.x_0.5deg.lnx.2020"
fails. The download starts, but the progress bar does not advance.

Outside the docker the download of this particular file works fine.

What might be the issue?


With best regards
Matti Kämäräinen





Re: Why building wget requires wget command?

2020-08-09 Thread Darshit Shah

Hi,

Please use bug-wget@gnu.org in the future for questions and help. It has 
a wider audience who will answer your questions.


About your problem, we are aware of it. It happens because the upstream 
gnulib bootstrap script defaults to Wget for downloading the translation 
files. It is however possible to resolve the bootstrapping issue by 
running `./bootstrap --skip-po`, which will build Wget without any 
translations.


If you do care about the translations, you need a two-stage build: first 
build Wget without translations, then use that binary to bootstrap a Wget 
build with translations.
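
Roughly like this (an untested sketch, reusing the --gnulib-srcdir path from 
your command):

    # stage 1: bootstrap and build without translations
    ./bootstrap --skip-po --gnulib-srcdir=/src/git/gnulib
    ./configure && make && make install

    # stage 2 (optional): with a wget binary now available, bootstrap again
    # so that the translation files can be fetched
    ./bootstrap --gnulib-srcdir=/src/git/gnulib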


On 8/9/20 2:22 PM, 川原智和 wrote:

I am trying to build and install wget on macOS catalina (fresh installed)
with source from savannah repo.
Fresh macOS does not have the wget command.

But bootstrap needs wget cmd like below:
hoge@hoge wget % ./bootstrap --gnulib-srcdir=/src/git/gnulib

./bootstrap: Bootstrapping from checked-out wget sources...

./bootstrap: getting gnulib files...

./bootstrap: getting translations into po/.reference for wget...

./bootstrap: line 738: wget: command not found



Please tell me why wget is requied to build it from source and how can I
run bootrap without wget cmd (or with curl)?

Thanks.
Tomon





Re: Download page with scripted table

2020-07-16 Thread Darshit Shah

Hi,

This is the right place for Wget questions.

The page in question is generated dynamically via an AJAX request and 
some Javascript. Wget does not support any Javascript parsing. You would 
need something a lot more heavyweight to do that, such as Selenium, which 
runs a full browser underneath it.


Alternatively, you could find the right GET request parameters and make your 
own request. That will get you the JSON output, which you must then parse 
manually.
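
As a rough sketch (the endpoint and parameters below are placeholders; the 
real ones can be found in your browser's network inspector):

    wget -q -O- "https://www.benzinga.com/SOME-AJAX-ENDPOINT?param=value" | python3 -m json.tool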


On 7/17/20 3:11 AM, Morris West via Primary discussion list for GNU Wget 
wrote:

  Hi,

Might anybody know if there is a better place to ask my question below or know 
where I can get consulting for wget?
I did not see any replies.


Morris

  On Monday, June 22, 2020, 12:13:05 AM EDT, Morris West 
 wrote:
  
  Hi,


Is it possible to for wget to save the page at the link below with the table as 
it appears on the page.  My understanding is the table is the result of a 
script within the page.  I have not been able to save it with wget.  Any 
direction, insight and/or the command line would be greatly appreciated!!

https://www.benzinga.com/calendar/ratings


Morris
   


.





Re: Wget2 Question

2020-05-06 Thread Darshit Shah

Hi Yaakow,

Please use the public mailing lists for questions.

It seems like you're looking for the --span-hosts option (-H).

The --reject option takes a list of extensions, not a regex. For what 
you're trying, you want to use --reject-regex.
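
Untested sketches for both, using your example hosts:

    # 1. also fetch page requisites hosted on the CDN domain
    wget2 -r -p --span-hosts --domains=whatever.com,cdnwhatever.com https://whatever.com/

    # 2. reject by regular expression rather than by extension list
    wget2 -r --reject-regex='\.(jpg|mp4)$' https://whatever.com/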


On 5/5/20 7:08 PM, Yaakov Schwartzman wrote:

Hi Developers,
I am a senior developer (but frankly not in C anymore)
Without burrowing deep in the code, I have been using the latest release
for months and find its performance great.  Much thanks.
I have a rather important question and a less important question.

1. Important question:
Can I configure to extract external images / videos during the download 
process?
For example domain whatever.com  has links to 
images cdnwhatever.com .   Can I automatically 
download those.  If not, can you point me to where approximately in the 
code I would need to change to do this.

2.  Less important question:
The --reject= option does not seem to work.  If I do --reject="*.jpg" or 
--reject="*.jpg, *.mp4" or even --reject="[*.jpg, *.css]"  none work.  
  It seem that I am wrong with my format.. Is it just pure regex 
expression?


Thanks in advance,
--



  Jason Schwartzman

Innovation

L1ght 










Re: Report a bug url cut

2020-02-20 Thread Darshit Shah

Hi Jorge,

That is not a bug in Wget. The "&" character in your command is 
interpreted by the shell you are using to mean 'background this command'.


Wget doesn't get to see the whole URL you typed. You need to double 
quote the entire URL so that the shell does not interpret any special 
characters.


Use:

wget "http://18chan.ml/showthread.php?tid=53=949&_lbGate=863515;

Note-to-self: Add this as a FAQ. It indeed is asked too often.

On 20/02/2020 16:22, Jorge Fernandez wrote:

Hello, please check

I use wget 1.20:
when I send this request for example:
h
wget cut the query to:  http://18chan.ml/showthread.php?tid=53

why?

[image: image.png]


[image: image.png]





Re: wget man page should hit at existence of wget2

2020-02-20 Thread Darshit Shah
That's a good point. We were just discussing making a new release of 
Wget. I'll add an entry to the man page before doing that.


On 20/02/2020 04:14, Dan Jacobson wrote:

The wget man and Info pages should say "SEE ALSO wget2(1)" etc. else the
user will never know.






Re: wget GET vs HEAD

2020-02-03 Thread Darshit Shah




On 03/02/2020 16:08, Peng Yu wrote:

Hi,

I'd like to understand the following two commands. One uses GET, the
other uses HEAD.

wget -q -O /dev/null -S -o- URL
wget -q --spider -S -o- URL

Does the first one still download the response body? Does wget know that it's
/dev/null, so that it just downloads the header and ignores the response
body?


No. Wget does not perform this optimization. As mentioned by Tim, there 
are many valid usecases where one would want to actually download the 
body, but not store it.




If I only want the response header, and GET and HEAD result in the same
header for my purpose, I should use the 2nd one to save time as it
does not have a response body?



Yes. In an ideal world, GET and HEAD should always return exactly the 
same headers (See RFC2616, sec. 9.4). However, some servers do not 
follow this (*cough* Google *cough*). If it works fine for you, yes, you 
can simply use the `--spider` option to get only the headers.


If that doesn't work, there is another possible hack, __if__ the server 
supports HTTP/1.1 Range headers. You can explicitly send
`--header=Range: bytes=0-1`. This will cause the server to send only the 
first two bytes of the body. If you pass `--unlink` or `-O /dev/null`, then 
the file will be automatically deleted for you as well.
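
Putting it together (note the shell quoting, since the header value contains 
a space; the URL is a placeholder):

    wget -S -O /dev/null --header="Range: bytes=0-1" https://example.com/file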

Thanks.





Re: Add --custom-html-attrs option to support custom HTML tags and attributes

2020-01-23 Thread Darshit Shah
Hi Lyubomyr,

Thanks for your patch!

I haven't tested it yet, but I wonder why the existing switch
`--follow-tags=LIST` was not enough for your use case. IIRC, it was
exactly this case that it was implemented for.

* Lyubomyr Shaydariv  [200123 18:35]:
> Hi,
> 
> Some HTML documents use non-standard attributes that are essentially URLs 
> that might be walked through. The attached patch allows to specify new tags 
> and attributes to follow.
> 
> Example of use:
> 
> ./wget -nd -r -P . -A jpg \
>     --custom-html-attrs=div/big_img,div/med_img \
>     http://localhost/index.html
> 
> However:
> The patch does not include tests or external documentation updates of any 
> kind. It does not validate tag/attribute pairs except of simple "/" checking. 
> I'm not a C programmer, so I'm fine if the patch is considered poor and 
> consequently rejected.
> Thanks.




signature.asc
Description: PGP signature


Re: Problem building/installing wget2

2019-11-06 Thread Darshit Shah
Tim is right. This is an issue that came up with an updated version of Doxygen.
The new version broke our existing Doxygen configuration. You can either
downgrade the version of Doxygen you use or use the git master for Wget2.

I have fixed this issue in git already

* Tim Rühsen  [191106 10:25]:
> Hi George,
> 
> can you make sure you have the latest git master (commit
> a1f3f7bcc59ea071a153fed8288d1d66527e8b9d or later) ?
> 
> Darshit meanwhile fixed the doxygen issue, should work on your Fedora 31
> (?) even without pandoc.
> 
> Regards, Tim
> 
> On 11/6/19 9:50 AM, Tim Rühsen wrote:
> > On 11/6/19 4:03 AM, George R Goffe via Primary discussion list for GNU
> > Wget wrote:
> >> Hi,
> >>
> >> I just tried to build/install wget2 but there are some problems at the end 
> >> of the install related to man pages.
> >>
> >> Here's a copy of the log. 
> >>
> >> Did I do something wrong or is this really a bug?
> > 
> > Hi George,
> > 
> > likely it's a bug coming up in a certain environment. Darshit and I
> > recently discussed a similar issue, but somehow we lost focus...
> > 
> > What version of doxygen do you have installed ?
> > 
> > What if you install pandoc and build again (starting with ./configure ...)
> > 
> > As a work-around, you can skip the docs with
> > ./configure --disable-doc
> > 
> > Regards, Tim
> > 
> 




-- 
Thanking You,
Darshit Shah
PGP Fingerprint: 7845 120B 07CB D8D6 ECE5 FF2B 2A17 43ED A91A 35B6


signature.asc
Description: PGP signature


Re: Confusing "Success" error message

2019-11-03 Thread Darshit Shah
Hi Francesco,

This issue has indeed been reported multiple times in the past. While it is
confusing, the text "Success" is out of our hands. It is the string that
`perror()` reports after we detect that a system call resulted in an error.

For some reason the error string is still "Success" in these cases. It is
highly reproducible as well. At some point I will try to dig deeper / contact
the libc upstream to see why this is happening.

* Francesco Turco  [191103 15:04]:
> Hello.
> 
> I'm using wget 1.20.3 on a Gentoo Linux system.
> 
> I obtain a confusing "Success" error message from wget when trying to 
> download any file into a write-protected directory.
> 
> Steps to reproduce:
> 1) mkdir test
> 2) chmod -w test
> 3) cd test
> 4) wget --no-config 
> https://gitweb.gentoo.org/repo/gentoo.git/plain/sys-apps/lm-sensors/lm-sensors-3.6.0.ebuild
> 
> This is the output of the last command:
> 
> > --2019-11-03 14:53:34--  
> > https://gitweb.gentoo.org/repo/gentoo.git/plain/sys-apps/lm-sensors/lm-sensors-3.6.0.ebuild
> > Resolving gitweb.gentoo.org... 108.28.123.238
> > Connecting to gitweb.gentoo.org|108.28.123.238|:443... connected.
> > HTTP request sent, awaiting response... 200 OK
> > Length: 6445 (6.3K) [text/plain]
> > lm-sensors-3.6.0.ebuild: Permission denied
> > 
> > Cannot write to ‘lm-sensors-3.6.0.ebuild’ (Success).
> 
> Exit status is 3 (File I/O error).
> 
> As far as I know you can try replacing the Gentoo ebuild I used with any 
> other file from the internet: the error message will be the same.
> 
> Why does wget use the word "success" when it's clearly a failure instead? Is 
> this a bug?
> 
> -- 
> https://fturco.net/
> 
> 

-- 
Thanking You,
Darshit Shah
PGP Fingerprint: 7845 120B 07CB D8D6 ECE5 FF2B 2A17 43ED A91A 35B6


signature.asc
Description: PGP signature


[Bug-wget] [bug #56909] wget Authorization header leak via 3xx redirects

2019-10-04 Thread Darshit Shah
Update of bug #56909 (project wget):

 Privacy: Private => Public 

___

Follow-up Comment #4:

I agree with Tim here that this is not a security issue.

Wget provides an option to correctly use the Authorization header. If the user
chooses to otherwise coerce Wget into doing something different, we should not
stop them from doing so.

Using `--header=Authorization: ds` means that the user is explicitly opting to
send the header every time rather than only to a specific domain.

On your request I'm making this issue public.


___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




[Bug-wget] [bug #56808] wget uses HEAD method when both --spider and --post-data options are used

2019-08-26 Thread Darshit Shah
Follow-up Comment #2, bug #56808 (project wget):

`--post-data` works by setting `opt.method` and `opt.body_data` while
`--spider` works by setting `opt.method`, albeit indirectly.

Now, I believe that it makes absolutely no sense to set both of those in
contradicting ways, especially with something like `POST` or `PUT`, which
requires a request body, since a request body is not allowed with a HEAD
request.

My suggestion here would be to add a check that prevents _any_ change to
`opt.method` if `--spider` is also passed.

This prevents not only `--post-data`, but also `--method` from setting
something funny that makes no sense.

___

Reply to this item at:

  

___
  Message sent via Savannah
  https://savannah.gnu.org/




Re: [Bug-wget] Does `wget -q -O /dev/null -S -o- url` ignore response body?

2019-08-12 Thread Darshit Shah
That is precisely what the `--spider` option does. It sends a HEAD request.
Just like the similarly named option in Curl.

If you want it to be more explicit, you can use `--method=HEAD` instead. It
will still do the same thing though.
* Peng Yu  [190812 20:56]:
> curl has the --head option. Is there a reason why wget doesn't have it?
> 
>-I, --head
>   (HTTP  FTP  FILE)  Fetch the headers only! HTTP-servers
> feature the command HEAD which this uses to get nothing but the header
> of a document. When used on an
>   FTP or FILE file, curl displays the file size and last
> modification time only.
> 
> On 8/9/19, Tim Rühsen  wrote:
> > On 09.08.19 18:06, Peng Yu wrote:
> >> Hi,
> >>
> >> I just want to retrieve the response header instead of the response body..
> >>
> >> Does `wget -q -O /dev/null -S -o- url` still download the response
> >> body, but then dump it to /dev/null? Or wget is smart enough to know
> >> the destination is /dev/null so that it will not download the response
> >> body at all? Thanks.
> >
> > /dev/null is just a another file.
> >
> > Try with --spider. It will send a HEAD request instead of a GET request
> > - thus no body is downloaded. The server just serves the header as if it
> > was a GET request.
> >
> > Regards, Tim
> >
> >
> 
> 
> -- 
> Regards,
> Peng
> 
> 

-- 
Thanking You,
Darshit Shah
PGP Fingerprint: 7845 120B 07CB D8D6 ECE5 FF2B 2A17 43ED A91A 35B6


signature.asc
Description: PGP signature


Re: [Bug-wget] [PATCH] Disable automatic wget headers.

2019-05-30 Thread Darshit Shah
;File" + str(index)
> +Files[0].append (WgetFile(file_name, file_content, rules=File_rules))
> +WGET_OPTIONS += header  + (',' if index < headers_len else '"')
> +WGET_URLS[0].append (file_name)
> +
> +Servers = [HTTP]
> +
> +ExpectedReturnCode = 0
> +
> + Pre and Post Test Hooks #
> +pre_test = {
> +"ServerFiles"   : Files
> +}
> +test_options = {
> +"WgetCommands"  : WGET_OPTIONS,
> +"Urls"  : WGET_URLS
> +}
> +post_test = {
> +"ExpectedRetcode"   : ExpectedReturnCode
> +}
> +
> +err = HTTPTest (
> +pre_hook=pre_test,
> +test_params=test_options,
> +post_hook=post_test,
> +protocols=Servers
> +).begin ()
> +
> +exit (err)
> diff --git a/testenv/Test-disable-headers-after.py b/testenv/Test-disable-headers-after.py
> new file mode 100644
> index ..344301a3
> --- /dev/null
> +++ b/testenv/Test-disable-headers-after.py
> @@ -0,0 +1,80 @@
> +#!/usr/bin/env python3
> +from sys import exit
> +from test.http_test import HTTPTest
> +from test.base_test import HTTP, HTTPS
> +from misc.wget_file import WgetFile
> +
> +"""
> +This test ensures that the --disable-header option removes user headers
> +from the HTTP request when it's placed after --header="header: value".
> +"""
> +# File Definitions ###
> +file_content = """Les paroles de la bouche d'un homme sont des eaux 
> profondes;
> +La source de la sagesse est un torrent qui jaillit."""
> +
> +Headers = {
> +'Authorization',
> +'User-Agent',
> +'Referer',
> +'Cache-Control',
> +'Pragma',
> +'If-Modified-Since',
> +'Range',
> +'Accept',
> +'Accept-Encoding',
> +'Host',
> +'Connection',
> +'Proxy-Connection',
> +'Content-Type',
> +'Content-Length',
> +'Proxy-Authorization',
> +'Cookie',
> +'MyHeader',
> +}
> +
> +WGET_OPTIONS = ''
> +WGET_URLS = [[]]
> +Files = [[]]
> +
> +# Define user defined headers
> +for header in Headers:
> +WGET_OPTIONS += ' --header="' + header + ': any"'
> +
> +WGET_OPTIONS += ' --disable-header="'
> +headers_len = len(Headers)
> +
> +for index, header in enumerate(Headers, start=1):
> +File_rules = {
> +"RejectHeader": {
> +header : 'any'
> +}
> +}
> +file_name = "File" + str(index)
> +Files[0].append(WgetFile(file_name, file_content, rules=File_rules))
> +WGET_OPTIONS += header  + (',' if index < headers_len else '"')
> +WGET_URLS[0].append(file_name)
> +
> +Servers = [HTTP]
> +
> +ExpectedReturnCode = 0
> +
> + Pre and Post Test Hooks #
> +pre_test = {
> +"ServerFiles"   : Files
> +}
> +test_options = {
> +"WgetCommands"  : WGET_OPTIONS,
> +"Urls"  : WGET_URLS
> +}
> +post_test = {
> +"ExpectedRetcode"   : ExpectedReturnCode
> +}
> +
> +err = HTTPTest (
> +pre_hook=pre_test,
> +test_params=test_options,
> +post_hook=post_test,
> +protocols=Servers
> +).begin ()
> +
> +exit (err)
> diff --git a/testenv/Test-disable-headers-before.py b/testenv/Test-disable-headers-before.py
> new file mode 100644
> index ..bc19fda9
> --- /dev/null
> +++ b/testenv/Test-disable-headers-before.py
> @@ -0,0 +1,78 @@
> +#!/usr/bin/env python3
> +from sys import exit
> +from test.http_test import HTTPTest
> +from test.base_test import HTTP, HTTPS
> +from misc.wget_file import WgetFile
> +
> +"""
> +This test ensures that the --disable-header option doesn't remove user headers
> +from the HTTP request when it's placed before --header="header: value".
> +"""
> +# File Definitions ###
> +file_content = """Les paroles de la bouche d'un homme sont des eaux 
> profondes;
> +La source de la sagesse est un torrent qui jaillit."""
> +
> +Headers = {
> +'Authorization',
> +'User-Agent',
> +'Referer',
> +'Cache-Control',
> +'Pragma',
> +'If-Modified-Since',
> +'Range',
> +'Accept',
> +'Accept-Encoding',
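
For readers skimming the patch, a sketch of the behaviour the two tests above
exercise. Note that `--disable-header` is the option this patch proposes, not
an existing Wget flag, and the URL below is just a placeholder for the test
server:

```
# Placed after --header, the proposed --disable-header is expected to strip
# the user-supplied header again before the request goes out:
wget --header="MyHeader: any" --disable-header="MyHeader" http://127.0.0.1:8080/File1

# Placed before --header, it is expected to leave headers added later on
# the command line untouched:
wget --disable-header="MyHeader" --header="MyHeader: any" http://127.0.0.1:8080/File1
```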

Re: [Bug-wget] informations about patch

2019-05-29 Thread Darshit Shah
Hi,

Sorry it has taken so long; the patch has not been forgotten.
I'm a little too busy with other things right now and haven't had the time to
review it.

I will comment on it as soon as possible. Sorry for the delay.

* adham elkarn  [190529 21:33]:
> 
> 
> Sent from Outlook<http://aka.ms/weboutlook>
> 
> From: adham elkarn
> Sent: Saturday, 18 May 2019, 16:32
> To: bug-wget@gnu.org
> Subject: informations about patch
> 
> Hello,
> is there any news about our patch for bug #54769
> (https://savannah.gnu.org/bugs/?54769)?
> 
> Adham EL KARN
> 
> Sent from Outlook<http://aka.ms/weboutlook>
> 

-- 
Thanking You,
Darshit Shah
PGP Fingerprint: 7845 120B 07CB D8D6 ECE5 FF2B 2A17 43ED A91A 35B6


signature.asc
Description: PGP signature


Re: [Bug-wget] /usr/bin/env: invalid option -- 'S'

2019-05-29 Thread Darshit Shah
That's very weird; the shebang line in that file reads:

```
#!/usr/bin/env perl
```

No options are being passed to env there. I'm going to have to take another
look at this later.
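
Not a conclusion, just a check worth running on the failing machine: the
message is exactly what an `env -S` shebang produces on an env that predates
coreutils 8.30 (the report shows 8.26), so it would help to confirm which
shebang the script being executed actually carries. The path below is the one
from the report:

```
# Print the shebang of the script that produced the error; if it reads
# "#!/usr/bin/env -S perl ...", an env without -S support (pre coreutils
# 8.30) fails with exactly "invalid option -- 'S'".
head -n 1 ./wget-1.20.3/tests/Test-https-pfs.px
```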

* Jeffrey Walton  [190529 14:21]:
> Hi Everyone/Tim,
> 
> Debian 9.9:
> 
> $ lsb_release -a
> No LSB modules are available.
> Distributor ID: Debian
> Description:Debian GNU/Linux 9.9 (stretch)
> Release:9.9
> Codename:   stretch
> 
> $ make check
> ...
> 
> PASS: Test-ftp-pasv-not-supported.px
> FAIL: Test-https-pfs.px
> FAIL: Test-https-tlsv1.px
> FAIL: Test-https-tlsv1x.px
> FAIL: Test-https-selfsigned.px
> SKIP: Test-https-weboftrust.px
> FAIL: Test-https-clientcert.px
> FAIL: Test-https-crl.px
> PASS: Test-https-badcerts.px
> 
> Trying to run manually:
> 
> $ ./wget-1.20.3/tests/Test-https-pfs.px
> /usr/bin/env: invalid option -- 'S'
> Try '/usr/bin/env --help' for more information.
> 
> And
> 
> $ /usr/bin/env --version
> env (GNU coreutils) 8.26
> 
> 

-- 
Thanking You,
Darshit Shah
PGP Fingerprint: 7845 120B 07CB D8D6 ECE5 FF2B 2A17 43ED A91A 35B6


signature.asc
Description: PGP signature

