Your list of files is saved in UTF-16 (aka UCS-2). wget cannot read that
format. If you save the file as UTF-8, with or without a BOM, wget should
be able to download all the links.
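A minimal sketch of the conversion (filenames are hypothetical; the first step just fabricates a UTF-16 file to stand in for the user's saved list):

```shell
# Create a sample UTF-16 link list (stands in for the user's saved file;
# filenames are hypothetical)
printf 'https://example.com/a\nhttps://example.com/b\n' \
  | iconv -f UTF-8 -t UTF-16 > urls-utf16.txt

# Convert it to UTF-8 so wget can read it
iconv -f UTF-16 -t UTF-8 urls-utf16.txt > urls.txt

# wget -i urls.txt   # would now fetch every link
```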
-Original Message-
From: Bug-wget On Behalf
Of pythonomor...@gmail.com
Sent: Tuesday, February 8, 2022
Hi Ali,
It's nice to see you so excited about starting a new project. However, we are
not interested in rewriting Wget in Rust.
P.S.: You're a high school student, so I'm going to assume you didn't know
better. But at this point "re-write it in Rust" is a meme. Please do not send
ou
Hi George,
This is a known issue due to a mismatch in gettext and autoconf. Please run
`autoreconf -ivf`. That should hopefully fix the macro versions.
On Tue, Feb 8, 2022, at 01:59, George R Goffe wrote:
> Hi,
>
> I'm having a heck of a time getting wget to build. I'm using the source
> from
Hi,
please check which pkg-config files you have for zlib
locate zlib.pc
and do
pkg-config --debug zlib
to see whether the correct file is found and that the correct paths and
flags are set.
In case this doesn't help, send me your config.log file.
Regards, Tim
On 26.01.22 01:04, George
Hi Gerd,
On 23.01.22 09:52, ge...@mweb.co.za wrote:
Hi,
I see. Fascinating story. But it probably then is too late to fix that issue
now (old wget, old servers, old antivirus on old router - they are all possible
sources of errors.)
Also, as was pointed out here by others (was it Tim?) as
Hi,
I see. Fascinating story. But it probably then is too late to fix that issue
now (old wget, old servers, old antivirus on old router - they are all possible
sources of errors.)
Also, as was pointed out here by others (was it Tim?) as well as me: The
continuation mechanism by its very
Hello
> I guess nobody even tries to reproduce the issue as nobody uses XP or
> the old wget 1.11.4. For example, I don't even have a Windows license
> and thus no Windows installed.
> - update to the latest wget (hundreds of bugs have been fixed
> meanwhile). Static binaries for 32/64 bit
Hi,
I guess nobody even tries to reproduce the issue as nobody uses XP or
the old wget 1.11.4. For example, I don't even have a Windows license
and thus no Windows installed.
To get better feedback from other users, I would suggest
- update to the latest wget (hundreds of bugs have been
Hi,
On 18.01.22 22:29, George R Goffe wrote:
Hi,
Thanks for your earlier reply. Somehow my /usr/lsd tool chain acquired a libz.a
but NO libz.so. I renamed them to off.* and the wget2 build succeeded.
You can also use ./configure --without-zlib. No need to change anything
on the system
Hi,
Thanks for your earlier reply. Somehow my /usr/lsd tool chain acquired a libz.a
but NO libz.so. I renamed them to off.* and the wget2 build succeeded.
I tried to build the original wget from " git clone
https://git.savannah.gnu.org/git/wget.git wget" but it's failing now. Not the
libz
Hi,
On 15.01.22 01:46, George R Goffe wrote:
Hi,
I'm trying to build wget2 from both the gnu ftp site and from the repository and am
having trouble with "make install". I'm enclosing the build log. Could I get
you to take a look at the end of the log and let me know what I'm doing wrong
Hi,
On 04.01.22 17:45, Laura Hebert wrote:
Hello,
I use Linux Ubuntu 18.04 to build an embedded linux (Yocto Sumo) - it
builds nightly. Just before the holidays it stopped building because of a
fetch php Temporarily Unavailable error.
So I left it hoping it would fix itself over the holidays
On Mon, Dec 27, 2021 at 12:50 PM Tim Rühsen wrote:
>
> It looks like the underlying TLS/SSL library doesn't support SSLv3.
> You possibly need to build the TLS/SSL library with SSLv3 enabled.
For OpenSSL, the option is enable-ssl3:
./Configure enable-ssl3
But OpenSSL master (3.x) does not
It looks like the underlying TLS/SSL library doesn't support SSLv3.
You possibly need to build the TLS/SSL library with SSLv3 enabled.
Regards, Tim
On 27.12.21 16:38, keda...@outlook.com wrote:
XUser:~/Desktop$ wget parrotsec.org --secure-protocol=SSLv3
--2021-12-27 10:29:27--
> If it is indeed supported, you'll have to check the gettext
> documentation to see where it expects the NLS translation files on DOS
YES ... please keep WGET for DOS available. ;-)
Hi,
NLS support in GNU Wget is handled via GNU gettext. It is possible that
your build of Wget does not support NLS. You'll have to check that using `wget
--version`.
If it is indeed supported, you'll have to check the gettext documentation to
see where it expects the NLS translation
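A sketch of that check; the sample feature line below is hypothetical, only the final commented command touches a real wget:

```shell
# `wget --version` prints a feature line; "+nls" means translations are
# compiled in, "-nls" means they are not.  On a real system:
#   wget --version | grep -Eo '[+-]nls'
# Sample feature line from a hypothetical build:
features='+digest +https +ipv6 -nls -ntlm'
case "$features" in
  *'+nls'*) echo "NLS enabled" ;;
  *)        echo "NLS disabled" ;;
esac
```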
On Sat, Dec 4, 2021 at 3:16 PM Ivo Antônio Clemente Júnior
wrote:
>
> Eduardo@Eduardo-PC MINGW64 ~
> $ /XAttacker.git
> -bash: /XAttacker.git: No such file or directory
>
> Eduardo@Eduardo-PC MINGW64 ~
> $ ./XAttacker.git/action
> -bash: ./XAttacker.git/action: No such file or directory
>
>
On 02.12.21 22:48, Johan wrote:
I have a hard time making a site static with wget, it keeps saving CSS
files with versioning like main.css?v=12345 into main.css@v=12345.css, this
is not at all what I want. I would prefer it would just save it to main.css
without the query params, is this
Hi Brian,
Thanks for the report and sorry for the delay it has taken me to respond to
your email.
This is indeed something weird. You are right in your guess that it happens
because of the wget-log file being created.
I will have to try and reproduce this issue in a Cygwin environment at some
Thanks Brian,
This will be automatically included in the newest release when we update the
gnulib version.
On Tue, Sep 28, 2021, at 05:19, Brian Inglis wrote:
> Cygwin 64 does not support shared library weak references as no
> references may be undefined in Windows DLLs.
> Recent gnulib
Hi Jan,
You seem to be using the option -X incorrectly. That option is valid only for
exact directory names. It does not perform any regex matching. For your
usecase, you want to use the --reject-regex command instead.
On Tue, Oct 19, 2021, at 22:03, Jan Nagel wrote:
> Hello,
>
> I'm trying to
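A minimal sketch of the distinction (URL and pattern are hypothetical; the actual wget invocation is commented out since it needs a network):

```shell
# -X only matches literal directory names; pattern-based exclusion needs
# --reject-regex.  Hypothetical invocation:
#   wget -r --reject-regex 'event-[0-9]+' https://example.com/
# --reject-regex matches against the whole URL with a POSIX extended regex,
# the same syntax grep -E uses:
url='https://example.com/cal/event-4711?view=1'
echo "$url" | grep -Eq 'event-[0-9]+' && echo "would be rejected"
```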
Applied. Thanks!
On Sun, Nov 21, 2021, at 20:09, Thomas Niederberger wrote:
> Hi wget-team
>
> Attached you find a very small patch that adds documentation for the
> TLSv1_3 flag to the help.
> As far as I can see the flag is already fully implemented and just not
> documented.
> Thanks for all
Hi,
It looks like you have a version of the OpenSSL library that doesn't support
the SSLv3 protocol. It doesn't mean it's an old version of the library, the
support may just be compile time disabled.
Please check the version of OpenSSL and link Wget against a version that
supports it. Or try
Just tried
https://eternallybored.org/imgs/large/255k.png
with a MinGW64 build on Debian Linux (bookworm).
At least that downloads without issues.
+digest +https +ssl/gnutls +ipv6 +iri +large-file -nls -ntlm -opie +psl
-hsts +iconv +idn2 -zlib -lzma -brotlidec -zstd -bzip2 -lzip -http2
Sorry for the delay, there was vacation and my MinGW build scripts
needed annoying updates for several dependencies.
Just tried the bigtest URL and got
HTTP ERROR response 404 Not Found [https://eternallybored.org/misc/bigtest]
Could you enable that URL for testing again ?
(Next testing likely
Yes, it is recursive :/ IDK why in that case everything is downloaded
and can't be rejected :(
On 10/17/21, Tim Rühsen wrote:
> On 12.10.21 16:55, Gabriele Zaverio wrote:
>> Hi there,
>>
>> Writing about wget --reject.
>>
>> When using it, wget actually downloads(!!!) the file, and THEN deletes
On 12.10.21 16:55, Gabriele Zaverio wrote:
Hi there,
Writing about wget --reject.
When using it, wget actually downloads(!!!) the file, and THEN deletes it.
I think this is wrong. It must not download it at all.
Can this be a bug / be corrected?
Thank you
Are you using wget in recursive
On Wed, Oct 13, 2021 at 11:06 AM Ivo Antônio Clemente Júnior
wrote:
>
> I am trying to install "termuxAlpine.sh" in my msys2 arch linux...
>
> How can I fix this bug?
>
> Eduardo@Eduardo-PC MINGW64 ~
> $ curl -LO
> https://raw.githubusercontent.com/Hax4us/TermuxAlpine/master/TermuxAlpine.sh
> %
Hi,
Though Bash bites me from time to time (and hence I might be wrong),
the last statement doesn't look like valid syntax.
What you're doing is redirecting stdout to a file called '1'. Then the
argument '2' is taken as another URL to download. So wget gets that file
from GitHub
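The shell-splitting effect described above can be demonstrated without wget; `show_args` is a hypothetical stand-in for any program's argument handling, and the URL is made up:

```shell
# show_args stands in for any program's argv handling (URL is hypothetical)
show_args() { printf 'arg: %s\n' "$@"; }

# Quoted, the whole URL, query string included, arrives as one argument:
show_args 'https://example.com/dl?a=1&b=2'

# Unquoted, the same text is shell syntax instead: in
#   wget https://example.com/dl?a=1>1 2
# '>' redirects stdout to a file named "1" and "2" becomes a second URL.
```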
On Wed, Jun 3, 2020 at 11:06 AM anonymous wrote:
>
> Follow-up Comment #1, bug #58484 (project wget):
>
> I fixed my problem.
> I removed the expired AddTrust_External_Root.crt from ca-certificate,
> according to the suggestion from
>
On Sunday, October 3, 2021, 11:21:58, Tim Rühsen wrote:
> while I try to reproduce here (have to build all the deps first), could you
> possibly do some more tests to narrow down the issue !?
> a) test with --max-threads=1 --no-http2
> b) test with --max-threads=1 using a plain text (http://)
Hi Jernej,
while I try to reproduce here (have to build all the deps first), could
you possibly do some more tests to narrow down the issue !?
a) test with --max-threads=1 --no-http2
b) test with --max-threads=1 using a plain text (http://) URL
I think the message "failed to receive: 0" (the
7 0AF9
>
> In particular, note:
>
>> gpg: Note: This key has expired!
>
> And:
>
> $ gpg --list-key 0x64FF90AAE8C70AF9
> pub rsa4096 2015-10-14 [SC] [expired: 2020-08-16]
> 7845120B07CBD8D6ECE5FF2B2A1743EDA91A35B6
> uid [ expired] Darshit Shah
> uid
gpg: Note: This key has expired!
And:
$ gpg --list-key 0x64FF90AAE8C70AF9
pub rsa4096 2015-10-14 [SC] [expired: 2020-08-16]
7845120B07CBD8D6ECE5FF2B2A1743EDA91A35B6
uid [ expired] Darshit Shah
uid [ expired] Darshit Shah
This key has been expired for over a y
On Tue, Sep 07, 2021 at 09:28:49PM +0200, Darshit Shah wrote:
> We are pleased to announce the release of GNU Wget 1.21.2
[...]
> gpg --verify wget-1.21.2.tar.gz.sig
>
> If that command fails because you don't have the required public key,
> then run this command to import it:
>
> gpg
Hi Aleksander,
Thank you for the patch to GNU Wget!
I think the new --keep-auth-header option is a misnomer. Since it only applies
to the case where the user explicitly passes a "Authorization" header, going
around Wget's knowledge of it.
Thus, if this feature is to be implemented, I would
On 23.08.21 06:10, Matt Huszagh wrote:
Tim Rühsen writes:
this works as expected with wget2 built from latest git master. Which
reminds me that we urgently need a new release.
If you want to build wget2 from tarball (which is more hassle-free than
building from git master), follow the
Tim Rühsen writes:
> this works as expected with wget2 built from latest git master. Which
> reminds me that we urgently need a new release.
>
> If you want to build wget2 from tarball (which is more hassle-free than
> building from git master), follow the instruction from
>
Hi Matt,
this works as expected with wget2 built from latest git master. Which
reminds me that we urgently need a new release.
If you want to build wget2 from tarball (which is more hassle-free than
building from git master), follow the instruction from
I apologize: I completely forgot to include the wget2 version. It's
1.99.2. wget version is 1.21.1.
Matt
> Follow-up Comment #2, bug #61038 (project wget):
> Since this is a Windows issue, could you please report this issue to where you
> got the binary from ?
See also: https://lists.gnu.org/archive/html/bug-wget/2021-08/msg2.html
It reliably fails already above 8 KiO.
>> Thanks. I got the binaries from https://eternallybored.org/misc/wget/
>>> Provided by Jernej Simončič
>>> jernej|s-webs...@eternallybored.org
> Yes, try to report to h
Done.
> I'd try some of the older binaries from eternallybored.com as well
I had done this before.
> least that is some
> We currently use wget 1.11.4 to download a file from the web. We
> are experiencing an issue with TLS 1.2 since that website removed
> support for the 1.0 and 1.1 TLS protocols. We were looking for the
> most recent version of wget but as we are using Windows servers
Same bug here.
On 09.08.21 15:41, Taylor wrote:
Hello
Sorry, I can't reproduce this on Linux with wget 1.21 nor with wget from
latest master.
Thanks. I got the binaries from https://eternallybored.org/misc/wget/
Provided by Jernej Simončič
jernej|s-webs...@eternallybored.org
Shall I try to report it to
--continue requires server side support. While a majority of the servers out
there support it, not all do. Check the --debug logs to see if the server
responds with a 200 OK or a 206 Partial Content response.
For --continue to work correctly, you need the server to respond with a 206
Partial
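One way to probe that without reading wget's --debug output; the URL is hypothetical and the network call stays commented out, with a sample status line standing in for a real reply:

```shell
# Hypothetical probe: ask for a byte range and inspect the status line.
#   curl -sI -H 'Range: bytes=100-' https://example.com/big.iso | head -n 1
# Interpreting the answer (sample reply from a hypothetical server):
status='HTTP/1.1 206 Partial Content'
case "$status" in
  *' 206 '*) echo "resume supported" ;;
  *' 200 '*) echo "server ignores Range; --continue restarts from zero" ;;
esac
```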
Hello
> Sorry, I can't reproduce this on Linux with wget 1.21 nor with wget from
> latest master.
Thanks. I got the binaries from https://eternallybored.org/misc/wget/
> Provided by Jernej Simončič
> jernej|s-webs...@eternallybored.org
Shall I try to report it to Jernej? I have never seen a
Sorry, I can't reproduce this on Linux with wget 1.21 nor with wget from
latest master.
Did you report the issue to where you got the wget.exe from ?
Regards, Tim
On 02.08.21 15:19, Taylor wrote:
To : bug-wget@gnu.org
Subj : BUG in recent versions of WGET with POSTing more than 8 KiO data
On Sun, Aug 1, 2021 at 6:48 AM Tim Rühsen wrote:
>
> On 31.07.21 13:39, 積丹尼 Dan Jacobson wrote:
> >> "TR" == Tim Rühsen writes:
> > TR> If you know that the server sends uncompressed content, you can
> > TR> compress it yourself on-the-fly to avoid excessive disk space usage.
> > TR> At
On 31.07.21 13:39, 積丹尼 Dan Jacobson wrote:
"TR" == Tim Rühsen writes:
TR> If you know that the server sends uncompressed content, you can
TR> compress it yourself on-the-fly to avoid excessive disk space usage.
TR> At least it works for single files:
TR> wget -O- | gzip > radio.csv.gz
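The pipeline shape of that one-liner (its URL is elided above) can be exercised locally; file names and contents here are hypothetical stand-ins for the downloaded stream:

```shell
# Local stand-in for the stream: compress on the fly, then read the
# data back to confirm nothing was lost.
printf 'station,freq\nA,101.5\n' > radio.csv     # pretend this came from wget -O-
gzip -c  radio.csv  > radio.csv.gz               # the `| gzip > radio.csv.gz` step
gzip -dc radio.csv.gz | head -n 1                # first line of the recovered CSV
```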
> "TR" == Tim Rühsen writes:
TR> If you know that the server sends uncompressed content, you can
TR> compress it yourself on-the-fly to avoid excessive disk space usage.
TR> At least it works for single files:
TR> wget -O- | gzip > radio.csv.gz
That's nice but like
On 30.07.21 03:30, 積丹尼 Dan Jacobson wrote:
The man page should mention, for --compression=gzip, that
the website can still ignore it and send the whole file uncompressed.
And wget is not picky and will still accept it. So better have extra
disk space ready. Tested with
The options are always available. We cannot remove them, since the man pages
would then not match the actual application, and multiple man pages are just
not a good idea.
Whether or not the feature is actually supported is based on the compile time
options. It seems like your version of Wget
Hi,
How does this concern GNU Wget? Why is this sent to us?
On Wed, Jul 28, 2021, at 16:45, Ivo Antônio Clemente Júnior wrote:
> [image: image.png]
>
>
> --
> Adm. Ivo Antônio Clemente Júnior
> CRA SP 118564
> MBA em Negócios Internacionais - FGV Management
> Ribeirão Preto - São Paulo -
ents[@\?].*|External-Events[@\?].*|event-\d+[@\?].*|/[Ff]onts"
\
--rejected-log=wget-rejected.log \
--restrict-file-names=windows \
--wait=1 \
https://imcz.club/
<<
Thanks for your help!
Regards, Roger
-Original Message-
From: Tim Rühsen
Sent: Thursday, July 8, 20
[@\?].*" (the fourth and last term
in the regex).
Once again, https://regex101.com/ confirms that
"event-4193082@CalendarViewType=1=6%2F27%2F2021.html" matches
this term.
Thanks for your support.
-Original Message-
From: Tim Rühsen
Sent: Monday, July 5, 2021 4:09 PM
To: R
.
Once again, https://regex101.com/ confirms that
"event-4193082@CalendarViewType=1=6%2F27%2F2021.html" matches
this term.
Thanks for your support.
-Original Message-
From: Tim Rühsen
Sent: Monday, July 5, 2021 4:09 PM
To: Roger Brooks ; bug-wget@gnu.org
Subject: Re: Exclusion
On 7/5/21 7:31 PM, Tim Rühsen wrote:
There has been some discussion about whether and how we should implement
FTP(S) in Wget2. I'd like to get more feedback / suggestions / ideas on
https://gitlab.com/gnuwget/wget2/-/issues/3#note_618746514.
In case you don't have and don't want a Gitlab.com account,
On 28.06.21 19:36, Roger Brooks wrote:
I am trying to use wget 1.19.1 to back up a club website. Here is a reduced
version of my wget command, which only accesses the public parts of the
website:
cd /volume1/Backup/
wget -EkKrNpH \
--output-file=wget.log \
On 30.06.21 17:51, Mourad Amani wrote:
Hello,
I work as a software engineer at a company in France; I have run into a bug
with your wget command that I cannot resolve.
I am currently working on a project where I have to send data to a URL
On 18.06.21 18:22, Stephane Ascoet wrote:
I've solved it: the directory is forbidden by robots.txt
Wget should mention this on standard output when finished, something like
"some elements couldn't be downloaded because of robots.txt rules"
Good suggestion ! Maybe even with the exact number of
On Tue, Jun 22, 2021 at 02:55:29PM +0430, Pejman Taslimi wrote:
> The following command with any random IP retrieves google.com! Here I've
> just set a header, but wget connects really to google.com instead of
> 192.168.15.15.
>
> $ wget -O- http://192.168.15.15 --header="Host: www.google.com"
Hi Lionel,
this seems to be an nohup / ssh issue.
It looks like wget (or better: your script) receives another signal than
SIGHUP. It might be SIGPIPE, which is typical when you try to write to a
stream that has been closed or has no reader any more. This may be the
reason why it works with
On 04.05.21 08:59, Josef Moellers wrote:
> Hi,
>
> I'm currently trying to tackle the CVE about passing credentials to
> redirected servers.
> I wonder if it may be necessary to be able to disable this feature, if
> one trusts the servers, ie if some kind of command-line option might be
>
On Fri, Nov 08, 2019 at 07:06:15PM +0200, Eli Zaretskii wrote:
>> Did you read the line "a function that succeeds is allowed
>> to change errno"?
>
> Yes, but that's against every library whose sources I've ever read.
Today I got the error message
Cannot write to ‘’ (Success).
from
On 02.05.21 14:08, Tim Rühsen wrote:
On 01.05.21 07:30, Rm Beer wrote:
SO: Archlinux32 (Linux 5.11.10-arch1-1.0 #1 SMP PREEMPT Sat, 27 Mar
2021 20:56:37 + i686 GNU/Linux)
packages:
wget 1.21.1-1.0
gnutls 3.7.1-1.0
No matter what you do, modifying user-agent, cookies, or headers, it
returns
Hi Tim,
Well, if wget is frozen, what about just failing when multiple regex
arguments are passed? Yes, it may break poorly written scripts too,
but (imo) users shouldn't expect that dirty undocumented tricks
will work forever, and it would produce an apparent error which is
easy to fix, rather
Hi,
your patches look great, good work. And allowing multiple regexes seems
to be a good idea to me.
Here comes the (small) but...
a) Wget is in maintenance mode - we try not to add any new features
here; just bugs are fixed. New features (and I consider this a new
feature due to b))
Thanks, applied.
Regards, Tim
On 01.05.21 07:56, Nekun wrote:
* src/utils.c: Remove unpaired brace
---
src/utils.c | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/utils.c b/src/utils.c
index 426cda31..45304825 100644
--- a/src/utils.c
+++ b/src/utils.c
@@ -711,7
On 01.05.21 07:30, Rm Beer wrote:
SO: Archlinux32 (Linux 5.11.10-arch1-1.0 #1 SMP PREEMPT Sat, 27 Mar
2021 20:56:37 + i686 GNU/Linux)
packages:
wget 1.21.1-1.0
gnutls 3.7.1-1.0
No matter what you do, modifying user-agent, cookies, or headers, it
returns 403. It doesn't happen with other
On Mon, Apr 26, 2021 at 4:12 PM Ivo Antônio Clemente Júnior
wrote:
>
> [image: image.png]
The picture is too small. I can't read the text.
Maybe it would be a good idea to provide the actual text of the error
you are experiencing.
Jeff
On Sun, Apr 25, 2021 at 04:37:45PM +0200, Tim Rühsen wrote:
> b) Did you take a look into Wget2 to compare the code ?
> There is a similar function to basically do the same that was written as
> library function and which is fuzzed continuously (code for the fuzzers is
> in /fuzz).
>
On Sun, Apr 25, 2021 at 04:37:45PM +0200, Tim Rühsen wrote:
> Hey Derek :-)
Hey! :)
> Thanks for going deep into this issue.
>
> Your approach of extracting the code and reducing the initial size of the
> buffer is sensible and allows testing this code much more easily than by
> invoking wget.
Hey Derek :-)
Thanks for going deep into this issue.
Your approach of extracting the code and reducing the initial size of
the buffer is sensible and allows testing this code much more easily than
by invoking wget.
If I get it right, you found two issues
1. The trailing 0 byte is always 1 byte
Actually my previous patch had a bug in the case where there is one
character remaining to be converted in the input buffer and the size
of the output buffer needed to convert it is greater than in_remain * 2
(2 bytes). This patch addresses that as well.
diff -ur wget-1.21.1.orig/src/iri.c
I believe the following patch correctly fixes the reallocation case
and avoids any buffer overflow. The variable tooshort is removed
because it was only set, never used. After looking at the EILSEQ ||
EINVAL case again, I think it does not need to be modified in order to
maintain what it was
The latest master worked for me. Thanks!
On Thu, Apr 15, 2021 at 09:09:05PM +0200, Tim Rühsen wrote:
> Hi Nils,
>
> adding AM_GNU_GETTEXT_VERSION back to configure.ac also auto-generates
> ABOUT-NLS for me. Can you try again with latest master ?
>
> Regards, Tim
>
> On 13.04.21 14:47, Nils
Hi Nils,
adding AM_GNU_GETTEXT_VERSION back to configure.ac also auto-generates
ABOUT-NLS for me. Can you try again with latest master ?
Regards, Tim
On 13.04.21 14:47, Nils Andre wrote:
I'm trying to compile wget from source but I get the following error:
```
You may need to use the
Thank you, Nils.
Commit message slightly amended and pushed.
Regards, Tim
On 13.04.21 19:10, Nils wrote:
The attribute in html is "nofollow" so it is more consistent to call it
so than to hyphenate it.
---
src/html-url.c | 2 +-
src/recur.c    | 2 +-
2 files changed, 2 insertions(+), 2
I managed to compile from git using the latest release (tag v1.21.1) so
I decided to bisect.
```
git bisect start
git bisect old v1.21.1
git bisect new master
git bisect run sh -c 'git clean -xfd && git reset --hard && ./bootstrap'
```
came up with commit 7840db6c0bbf614cd2c4c7bb9365adf2e50037d9
Because ABOUT-NLS just contains text, I used `touch ABOUT-NLS` to have
it be present.
Now, when I run `./configure --with-ssl=openssl`, I get another error
(again on both macOS (M1 Big Sur) and Linux (NixOS)):
```
checking pcre2.h usability... no
checking pcre2.h presence... no
checking for
On 31.03.21 11:06, Aggarcia wrote:
I was checking, from a txt file, whether a group of websites were up and
running, and I found out that www.nalco.com freezes.
Command: wget --spider --wait=5 --timeout=5 --tries=1 www.nalco.com
It says it gets "301 Moved Permanently" (everything right) until it
On Tue, Mar 23, 2021 at 6:16 AM Ryan Schmidt wrote:
>
>
>
> On Mar 23, 2021, at 04:01, Jeffrey Walton wrote:
>
> > On Tue, Mar 23, 2021 at 12:38 AM Ryan Schmidt wrote:
> >>
> >> On Mar 22, 2021, at 17:56, Jeffrey Walton wrote:
> >>
> >>> It looks like this is an Autotools problem. Autotools
On Mar 23, 2021, at 04:01, Jeffrey Walton wrote:
> On Tue, Mar 23, 2021 at 12:38 AM Ryan Schmidt wrote:
>>
>> On Mar 22, 2021, at 17:56, Jeffrey Walton wrote:
>>
>>> It looks like this is an Autotools problem. Autotools should test if
> >>> cpp works as expected, and avoid cpp if it does not.
On Tue, Mar 23, 2021 at 12:38 AM Ryan Schmidt wrote:
>
> On Mar 22, 2021, at 17:56, Jeffrey Walton wrote:
>
> > It looks like this is an Autotools problem. Autotools should test if
> > cpp works as expected, and avoid cpp if it does not. Autotools should
> > not break the build.
> >
> > Perl
On Mar 22, 2021, at 17:56, Jeffrey Walton wrote:
> It looks like this is an Autotools problem. Autotools should test if
> cpp works as expected, and avoid cpp if it does not. Autotools should
> not break the build.
>
> Perl works around Apple's cpp:
>
On Sun, Mar 21, 2021 at 6:41 AM Jeffrey Walton wrote:
>
> Hi Everyone/Tim,
>
> I've been testing iOS cross-compiles. It looks like Wget2 is having
> trouble with arm64:
>
> $ echo $CPPFLAGS
> -DNDEBUG -isysroot
>
On Mar 21, 2021, at 05:41, Jeffrey Walton wrote:
> I've been testing iOS cross-compiles. It looks like Wget2 is having
> trouble with arm64:
>
> $ echo $CPPFLAGS
> -DNDEBUG -isysroot
> /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.2.sdk
>
> $
On 21.03.21 11:41, Jeffrey Walton wrote:
Hi Everyone/Tim,
I've been testing iOS cross-compiles. It looks like Wget2 is having
trouble with arm64:
$ echo $CPPFLAGS
-DNDEBUG -isysroot
/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS8.2.sdk
$ echo
On Wed, Mar 10, 2021 at 02:34:05AM +0200, kmb...@yandex.ru wrote:
> Hello, Bug-wget.
>
> An unpleasant particularity of Wget:
> in version 1.11.4, when I download a file from http://sourceforge.net/, I get it.
> For example
>
Hi,
On 07.03.21 01:14, kmb...@yandex.ru wrote:
> Hello, Bug-wget.
>
> I have found one unpleasant particularity of Wget.
> Sometimes it can't completely copy a site recursively.
> Since the pages and directories of the site are generated dynamically on
> the grounds of a database (MySQL) and do
On 07.03.21 19:41, Tim Rühsen wrote:
On 07.03.21 06:53, Ryan Schmidt wrote:
On Mar 6, 2021, at 22:46, Jeffrey Walton wrote:
I'm building Wget 1.21.1 on an old PowerMac with OS X 10.5. Debian
still lacks a stable image for the G5's, so I keep using Apple's OS X.
$ make
make
On 07.03.21 06:53, Ryan Schmidt wrote:
On Mar 6, 2021, at 22:46, Jeffrey Walton wrote:
I'm building Wget 1.21.1 on an old PowerMac with OS X 10.5. Debian
still lacks a stable image for the G5's, so I keep using Apple's OS X.
$ make
make all-recursive
Making all in lib
make all-am
...
On Sun, Mar 07, 2021 at 02:15:40AM +0200, kmb...@yandex.ru wrote:
> An unpleasant particularity of Wget:
> in version 1.11.4, when I downloaded a file from http://sourceforge.net/, I got it.
> For example
> https://sourceforge.net/projects/sevenzip/files/7-Zip/17.01/7z1701.exe/download
> I got a file with
On Mar 7, 2021, at 00:04, Jeffrey Walton wrote:
> --without-included-regex is the default. (Or help is wrong. And I did
> not add an option to enable it).
Well if it were then you wouldn't see the problem, per the investigations that
were done in January.
On Mar 6, 2021, at 21:37, Jeffrey Walton wrote:
> Hi Everyone,
>
> I'm building Wget 1.21.1 on an Apple M1. Things go well for a while, and then:
>
> % make
> /Library/Developer/CommandLineTools/usr/bin/make all-recursive
> Making all in lib
> /Library/Developer/CommandLineTools/usr/bin/make
On Sun, Mar 7, 2021 at 1:15 AM Ryan Schmidt wrote:
>
> On Mar 7, 2021, at 00:12, Jeffrey Walton wrote:
>
> > On Sun, Mar 7, 2021 at 1:06 AM Ryan Schmidt wrote:
> >>
> >>
> >>
> >> On Mar 7, 2021, at 00:04, Jeffrey Walton wrote:
> >>
> >>> --without-included-regex is the default. (Or help is
On Mar 7, 2021, at 00:12, Jeffrey Walton wrote:
> On Sun, Mar 7, 2021 at 1:06 AM Ryan Schmidt wrote:
>>
>>
>>
>> On Mar 7, 2021, at 00:04, Jeffrey Walton wrote:
>>
>>> --without-included-regex is the default. (Or help is wrong. And I did
>>> not add an option to enable it).
>>
>> Well if it
On Sun, Mar 7, 2021 at 1:06 AM Ryan Schmidt wrote:
>
>
>
> On Mar 7, 2021, at 00:04, Jeffrey Walton wrote:
>
> > --without-included-regex is the default. (Or help is wrong. And I did
> > not add an option to enable it).
>
> Well if it were then you wouldn't see the problem, per the investigations
On Sun, Mar 7, 2021 at 12:56 AM Ryan Schmidt wrote:
>
> On Mar 6, 2021, at 21:37, Jeffrey Walton wrote:
>
> > I'm building Wget 1.21.1 on an Apple M1. Things go well for a while, and
> > then:
> >
> > % make
> > /Library/Developer/CommandLineTools/usr/bin/make all-recursive
> > Making all in