In url.c / url_file_name() an empty query is not used for the filename:
/* Append ?query to the file name. */
u_query = u->query && *u->query ? u->query : NULL;
Should it be patched here ?
Kind regards
Tim Rühsen
On Thursday, 29 March 2012, Tim Ruehsen wrote:
Just some
Hello Alejandro,
here is a patch that fixes the issue with empty HTTP queries.
But the website has two files that can't be loaded (404 Not found). These
files won't be translated to local filenames. This is a correct behaviour,
since these files do not exist locally.
Giuseppe, put you in CC
On Friday, 9 November 2012, Ángel González wrote:
On 09/11/12 16:27, Tim Ruehsen wrote:
While implementing cookies for Mget (https://github.com/rockdaboot/mget)
conforming to RFC 6265, I stumbled over http://publicsuffix.org/ (Mozilla
Public Suffix List).
Looking at Wget sources
On Friday, 9 November 2012, Ángel González wrote:
On 09/11/12 20:17, Tim Rühsen wrote:
On Friday, 9 November 2012, Ángel González wrote:
I see little reason for concern about supercookies on wget given that it
is unlikely
to use it for different tasks in the same invocation
On Thursday, 22 November 2012, Hrvoje Niksic wrote:
Giuseppe Scrivano gscriv...@gnu.org writes:
Let's be realistic, is there any platform/system (with more than 3
users) where C99 is a problem?
Visual Studio is not a problem as there are other ways to build wget on
Windows that
On Tuesday, 4 December 2012, Adrien Dumont wrote:
Hi,
I have found a bug in GNU Wget 1.13.4:
wget $edt_url --config=$wget_config \
--post-data="login=$edt_login&password=$edt_password&action=Connexion" \
--keep-session-cookies --save-cookies '/tmp/edt_cookies.txt' \
-O '/dev/null' -nv -a
On Saturday, 8 December 2012, 7382...@gmail.com wrote:
Hello
I think wget should support HTTP compression (Accept-Encoding: gzip, deflate). It
would put less strain on servers being downloaded from, and use less of
their bandwidth. Is it okay to add this idea to the
On Tuesday, 5 March 2013, Darshit Shah wrote:
Need some help with writing a test for this functionality.
I have implemented a --method=HTTPMethod command that currently supports
DELETE only.
I would be very grateful if someone can help me with writing a test to
ensure that this is working
On Friday, 29 March 2013, Andy Jackson wrote:
When using wget 1.14 to generate warc.gz files, e.g.
wget -O tempname --warc-file=output "http://example.com"
the files this creates do not play back well using the Internet Archive's
warc.gz parsers, throwing errors like
Invalid FExtra
On Monday, 8 April 2013, Olivier Diotte wrote:
Hi Giuseppe,
On Sat, Apr 6, 2013 at 4:22 PM, Giuseppe Scrivano gscriv...@gnu.org wrote:
Hi Oliver,
Olivier Diotte oliv...@diotte.ca writes:
The commands used are:
wget --save-cookies cookies.txt --keep-session-cookies --post-data
Hi Olivier,
On Thursday, 11 April 2013, Olivier Diotte wrote:
On Thu, Apr 11, 2013 at 5:19 AM, Tim Ruehsen tim.rueh...@gmx.de wrote:
Hi Olivier,
I got openWRT running.
And I can reproduce the problem.
Wget -r seems to miss some <a href=...> URLs.
I have a conference right now
On Sunday, 14 April 2013, Darshit Shah wrote:
Assuming that my previous patch adding --method, --body-file and
--body-data options is accepted and merged into master,
I wanted to propose that we use Content-Type: multipart/form-data and send
the whole file as-is when using the --body-file
to get back to you in the next days.
Regards, and HTH,
Stefano
With kind regards
Tim Rühsen
--
Thanking You,
Darshit Shah
Research Lead, Code Innovation
Kill Code Phobia.
B.E.(Hons.) Mechanical Engineering, '14. BITS-Pilani
--
Thanking You
On Wednesday, 1 May 2013, Darshit Shah wrote:
First, sorry for the quick and dirty hack which was the perfect example of
how NOT to do things.
Then it was a good example ;-)
Secondly, it is on me that this feature wasn't tested before submitting
the patch. I had however relied on the
Hi,
On Wednesday, 8 May 2013, Mark wrote:
Hi,
I noticed some problems relating to URLs like
http://www.example.com/path/to/filename.zip?arg1=somestring&arg2=anotherstring&...
Wget doesn't strip the ? and following characters from the filename when
creating local files. As far as I can
Hi Martin,
having an abort() without a message is simply a big waste of time for any
developer who stumbles upon it.
Since the init code of Wget has to be rewritten anyways, I provide the fastest
solution right now: increasing the buffer size and printing a message before
Wget aborts.
And
I replaced some hand-written string code by standard library functions.
In any case these functions may be found in gnulib as well.
Regards, Tim
From d540fd5dbd3644936a8ad1a384516abba10de268 Mon Sep 17 00:00:00 2001
From: Tim Ruehsen tim.rueh...@gmx.de
Date: Thu, 9 May 2013 19:53:36 +0200
Replaced read_whole_file(), which needs one malloc/free per line, by getline()
which reuses a growable buffer.
getline() is a GNU function (but Wget is a GNU tool, isn't it ? :-).
Since Wget compiles/links with gnulib, I don't see a problem here.
Regards, Tim
From
On Sunday, 12 May 2013, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
having an abort() without a message is simply a big waste of time for any
developer who stumbles upon it.
I disagree here, what is so difficult that a debugger cannot catch? On
the other hand, I
On Sunday, 12 May 2013, Ángel González wrote:
On 12/05/13 21:50, Tim Rühsen wrote:
A real solution would be a rewrite of the init stuff (I saw that already
somewhere on the Wget 2.0 wish list or somewhere - don't remember exactly).
I already wrote this kind of code and would contribute
functions that don't behave C99 compliant.
That sounds like a library issue, not a compiler issue.
Maybe some functions have to be provided by Wget. If we just had a list of
issues/functions...
Regards, Tim
On Monday, 13 May 2013, Bykov Aleksey wrote:
Greetings, Tim Rühsen.
Possible that i'm understood
Sorry, forgot to switch my IDE to GNU style.
But now that I made all the requested changes to my working tree, how do I
make a diff to some commit back in time or to upstream ? Especially with git
format-patch ? Locally, I didn't create my own branch, so I am on master.
(I have to read a git
Thank you and Angel for your answers.
On Tuesday, 14 May 2013, Daniel Stenberg wrote:
On Tue, 14 May 2013, Tim Rühsen wrote:
But now that I made all the requested changes to my working tree, how do I
make a diff to some commit back in time or to upstream ? Especially with
git
Hi Alex,
snprintf %a seems to print the correct result with wine (set to WinXP), but
the same executable on a real WinXP just prints 'a'.
Replacing the sprintf() by __mingw_sprintf printed the correct result with
wine and on the WinXP machine.
About C99 - sorry, I think if article isn't
On Thursday, 16 May 2013, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
Hi Alex,
yes, it is the _PC_NAME_MAX issue which is only valid for pathconf().
Attached is the little patch to fix it.
Since MinGW is based on gcc-4.6, C99 should be available
On Sunday, 26 May 2013, Javier Vasquez wrote:
On Sun, May 26, 2013 at 9:51 AM, Tim Rühsen tim.rueh...@gmx.de wrote:
...
You can edit wget.texi and change all e.g. '@item number' into '@item
string'. I can't test it right here since perl 5.18 is still in
experimental
and has some
On Sunday, 26 May 2013, Tim Rühsen wrote:
On Sunday, 26 May 2013, Javier Vasquez wrote:
On Sun, May 26, 2013 at 9:51 AM, Tim Rühsen tim.rueh...@gmx.de wrote:
...
You can edit wget.texi and change all e.g. '@item number' into '@item
string'. I can't test it right here since perl
Hi Darshit,
congratulations for your selection !
I didn't know about your proposal, so I couldn't post my opinion...
In your proposal you write:
The suggestion as one dev put it, “I would prefer a C test environment for a
C project, having tests written in C”.
I guess that was me ;-)
This
On Tuesday, 28 May 2013, David Linn wrote:
Is there a way I can limit the number of links retrieved via wget -m ?
For example, just the first 100 links in a website.
Yes, having a *nix shell and grep/egrep around you can:
wget -m www.your-domain.org 2>&1 | egrep -m 100 'saved|no newer
On Sunday, 16 June 2013, Giuseppe Scrivano wrote:
Giuseppe Scrivano gscriv...@gnu.org writes:
In my understanding, all new general patches should go into 'master', those
regarding metalink/multithreading should go into (experimental)
'parallel-wget' for later merging with 'master'.
On Thursday, 11 July 2013, Tomas Hozza wrote:
Calling wget on an https server with the --timeout option does not work
when the server does not answer SSL handshake. Note that this has
been tested on wget-1.14 compiled with OpenSSL.
Hi,
here is the corresponding patch for GnuTLS.
Regards, Tim
On Thursday, 11 July 2013, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
diff --git a/src/gnutls.c b/src/gnutls.c
index 54422fc..a3b4ecc 100644
--- a/src/gnutls.c
+++ b/src/gnutls.c
do
{
err = gnutls_handshake (session);
- if (err < 0
Hi,
we need a check in http.c:3759:
if (algorithm != NULL && !strcmp (algorithm, "MD5-sess"))
else we strcmp() with algorithm being NULL.
That should do it.
Regards, Tim
On Friday, 12 July 2013, Darshit Shah wrote:
I have tried this response and wget just crashes here. What is the
allocated by strdupdelim () when returning.
That seems to be something that has never been done.
I hope, I am not too late ;-)
Regards, Tim
On Friday, 12 July 2013, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
we need a check in http.c:3759:
if (algorithm != NULL
On Friday, 12 July 2013, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
+ realm = opaque = nonce = qop = NULL;
+ algorithm = "MD5";
Don't do that.
1. 'algorithm' will be xfreed later
2. this forces an 'algorithm=MD5' parameter even if it wasn't given before
On Wednesday, 10 July 2013, Hrvoje Niksic wrote:
The NTLM code kindly donated by Daniel has always required OpenSSL.
configure.ac says:
Updating the code to also support GNU/TLS appears straightforward.
Here is a (quick) patch for testing using libnettle (which GnuTLS relies on
anyway).
I
On Sunday, 14 July 2013, 00:47:48, Giuseppe Scrivano wrote:
Darshit Shah dar...@gmail.com writes:
Do you know a test HTTP server that supports auth-int ?
If yes, we could try to implement it.
In the Test Suite I am currently writing, I had the server simply send
a qop=auth-int
On Sunday, 14 July 2013, 19:02:36, Darshit Shah wrote:
Hi,
In http.c:3739, we club 3 different error types in one.
1. The Server did not send a nonce / realm / uri attribute in the
WWW-Authenticate Header. This should exit as a Protocol Error, Status 7
2. Wget was not invoked with a
On Monday, 15 July 2013, 03:34:46, Darshit Shah wrote:
Wait, this is all Client End. Wget already has NTLM client-end support.
I need to write a Test Server for it.
There seems to be an Apache module for NTLM at
http://modntlm.sourceforge.net/
You should put writing your own NTLM
On Monday, 15 July 2013, 09:50:27, Tom Merriam wrote:
On 07/13/2013 08:00 AM, Tim Rühsen wrote:
Am Mittwoch, 10. Juli 2013 schrieb Hrvoje Niksic:
The NTLM code kindly donated by Daniel has always required OpenSSL.
configure.ac says:
Updating the code to also support GNU/TLS appears
On Monday, 15 July 2013, 16:48:36, Tom Merriam wrote:
When built with configure && make (with that patch):
GNU Wget 1.14 built on linux-gnu.
+digest +https +ipv6 -iri +large-file +nls -ntlm +opie +ssl/gnutls
Wgetrc:
/usr/local/etc/wgetrc (system)
Locale:
On Tuesday, 16 July 2013, 08:34:28, Tom Merriam wrote:
I tried that, but had the same problem. I gave up, re-extracted the
source from the archive, and reapplied the patch. I compiled it with
configure && make and it works!
I am able to authenticate with my Windows Server.
GNU Wget 1.14 built on
Sorry, again git bugged me.
Somehow the last pull merged together with my local changes and everything is
in one commit now.
So, the attached patch is not in git-format-patch format and a ChangeLog entry
is missing:
2013-07-16 Tim Ruehsen tim.rueh...@gmx.de
* NTLM support using
On Tuesday, 16 July 2013, 20:52:01, Darshit Shah wrote:
There are two regions that I would like to draw attention to:
1. http.c:3752 : The code is quite redundant and I would prefer that it was
somehow merged. Ideas on fixing this would be greatly appreciated!
I guess you are talking
On Thursday, 18 July 2013, 02:32:21, Darshit Shah wrote:
I see no wasted cycles in here.
Sorry, my mistake.
And you are right, a bit of code cleanup wouldn't be too bad.
Tim
signature.asc
Description: This is a digitally signed message part.
Hi,
while playing around with the test suite, I realized that --spider -r creates
a directory structure. It's the directories where Wget would save files without
--spider.
Is that intended behaviour (feature) or is it a bug ?
Regards, Tim
On Saturday, 20 July 2013, 23:49:11, Darshit Shah wrote:
When using spider, I guess this should be classified as a bug.
I'll see if I can look into fixing it. I will add a test for the same
nonetheless in the Suite I am working on.
I just took a quick look: the same applies to
On Monday, 22 July 2013, 23:13:17, Darshit Shah wrote:
This patch seems to break for normal builds.
I get the following error on running make:
configure: error: conditional HAVE_NETTLE was never defined.
Usually this means the macro was only invoked conditionally.
make: ***
On Tuesday, 23 July 2013, 13:43:13, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
The above error puzzles me.
Did you do an 'autoreconf' after you locally applied the commit ?
Could someone explain that error to me ?
I think you can reproduce it when you try
On Tuesday, 23 July 2013, 15:47:35, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
You changed my original patch in a way that you won't need AM_CONDITIONAL
any more. Try commenting it out - it should work.
Thanks, it seems to work here. Are you ok with this commit
Thank you for your work, Andrew !
In general, I like the idea of being able to read and/or modify the filenames.
Just for discussion: What about a slightly more extended option:
Call an external program after downloading and saving, not only with filename
but also with additional information
On Saturday, 3 August 2013, 00:14:38, Ángel González wrote:
On 02/08/13 16:11, Tim Ruehsen wrote:
Hi,
I realized that gnutls.c loads every file it can find in the given
ca_directory (default: /etc/ssl/certs).
For me (on Debian SID) this means every certificate is loaded 4 times !
Some improvements to gnutls.c, especially improved certificate loading.
Regards, Tim
From 1194317f35a014c878526dc3d2ada55ebd5fd6de Mon Sep 17 00:00:00 2001
From: Tim Ruehsen tim.rueh...@gmx.de
Date: Sat, 3 Aug 2013 19:56:39 +0200
Subject: [PATCH] gnutls improvements
---
src/ChangeLog | 7
On Monday, 5 August 2013, 00:18:32, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
Some improvements to gnutls.c, especially improved certificate loading.
thanks for the patch but it doesn't seem to apply to origin/master.
On what version is it based? Could you please
On Wednesday, 7 August 2013, 00:18:25, Giuseppe Scrivano wrote:
Hi Tim,
Tim Rühsen tim.rueh...@gmx.de writes:
I don't know what is wrong:
tim@debian:~/src/wget/trunk$ git pull
Already up-to-date.
tim@debian:~/src/wget/trunk$ git branch --list -a
* master
parallel-wget
On Wednesday, 7 August 2013, 08:24:35, Will Dietz wrote:
Hi all,
There's a minor integer error in wget as described in the following bug
report:
https://savannah.gnu.org/bugs/?39453
Patch is included, please review.
Thanks!
Hi Will,
isn't the real problem a signed/unsigned
I deleted and git-cloned the complete repository.
And moved my copy of gnutls.c into src.
+#include "hash.h"
can you please add an explicit dependency to the hash module in
bootstrap.conf?
??? It is the hash.h from the src directory. Why and where should it go into
bootstrap.conf ?
Regards, Tim
On Thursday, 12 September 2013, 12:59:00, Björn Mattsson wrote:
Ran into a bug in wget last week.
Done some digging but can't solve it by myself.
If I try to wget a file containing capital ÅÄÖ they get converted
wrongly, while åäö works fine.
I use wget -m to backup one of my
On Thursday, 12 September 2013, 17:37:17, Tim Ruehsen wrote:
On Thursday 12 September 2013 12:59:00 Björn Mattsson wrote:
Ran into a bug in wget last week.
Done some digging but can't solve it by myself.
If I try to wget a file containing capital ÅÄÖ they get converted
wrongly,
I can download the link through any browser, the link is
http://developer.blackberry.com/native/downloads/fetch/BlackBerry10Simulator
-Installer-BB10_2_0X-1155-Win-201308081613.exe
I installed wsproxy.exe in my c:\windows folder and get the following error:
Just a guess.
You are behind a
On Thursday, 17 October 2013, 12:55:18, Andrea Urbani wrote:
Hi,
first of all I'm sorry: I was not subscribed to the bug-wget list so I saw
only yesterday the replies of other users.
Well, this patch replaces the previous ones from me.
Now wget, after the SYST command, looks if it
Hi Sci-Fi @ hush.ai, found a prob on your XPI (nice rhyme !)
Your problem is reproducible here by using
-e timeout=20 -e check-certificate=off
A workaround is
-e timeout=0
It must be some sort of regression, as you say.
I have no time to dig, but maybe my observation might help
On Monday, 11 November 2013, 18:06:53, daniele.cal...@tin.it wrote:
Hello,
Attached is a fix for bug #40426
Hi Daniele,
thanks for your contribution.
But it would be nice to have -O and -r working together.
Did you try to find out why Wget blocks ?
Regards, Tim
On Thursday, 14 November 2013, 21:00:13, Tim Rühsen wrote:
On Monday, 11 November 2013, 18:06:53, daniele.cal...@tin.it wrote:
Hello,
Attached is a fix for bug #40426
Hi Daniele,
thanks for your contribution.
But it would be nice to have -O and -r working together
On Friday, 20 December 2013, 09:03:43, L Walsh wrote:
But at the end of the update script, I notice a message:
if ($foundignored)
{
print STDERR "\n* => CA Certificates in /etc/ssl/certs are only seen by
some legacy applications.
To install CA-Certificates globally move them to
On Friday, 20 December 2013, 13:54:12, Mike Frysinger wrote:
On Friday 20 December 2013 12:03:43 L Walsh wrote:
Perhaps wget isn't using the new location?
openssl manages its cert locations itself, not wget. file a bug for your
distro.
You are right.
What I wrote before about
On Thursday, 26 December 2013, 01:26:00, SciFi wrote:
ping
I guess I need to remind about this bug,
I haven't opened a real bugzilla report, tho.
Shall I?
FWIW, I've changed to the timeout=0 setting,
which did let the httpS code work.
I'll need to have a non-infinite setting
for
On Friday, 17 January 2014, 11:42:41, Tony Lewis wrote:
Darshit Shah wrote:
In case both the --config and --no-config commands are issued, the one that
appears first on the command line will be considered and the other ignored.
Given my memory of the way the parsing loop works, I would
Hi Jeffrey,
thanks for pointing this out.
BTW, to reproduce the issue I used a GnuTLS compiled/linked version of Wget:
$ wget -d --ca-certificate=ca-rsa-cert.pem --private-key=ca-rsa-key-plain.pem
https://example.com:8443
2014-03-18 21:48:04 (1.88 GB/s) - Read error at byte 5116 (The TLS
On Wednesday, 19 March 2014, 10:59:05, Daniel Kahn Gillmor wrote:
I'm imagining a C library API that has a public suffix list context
object that can do efficient lookups (however we define the lookups),
and the library would bundle a pre-compiled context, based on the
currently-known public
I created a google group mailing list for further libpsl discussion.
(hope it works)
https://groups.google.com/forum/#!forum/libpsl-bugs/join
Tim
On Saturday, 22 March 2014, 17:41:27, Daniel Kahn Gillmor wrote:
I would still like to move the discussion to libpsl-bugs, but so far
nobody is reading it ...
I've tried to subscribe, but apparently i have to be approved first.
please approve me! :)
Sorry, that was my fault (a
On Sunday, 30 March 2014, 16:00:07, Darshit Shah wrote:
Hello,
I've been wanting to clean up the code for Wget for some time now.
Today, I wrote a small script that compiles Wget with a bunch of
warning flags and uploads each warning to GitHub as an issue.
These issues have been created
On Friday, 4 April 2014, 17:14:07, Darshit Shah wrote:
On Fri, Apr 4, 2014 at 4:40 PM, Giuseppe Scrivano gscriv...@gnu.org wrote:
Hi Karl,
k...@freefriends.org (Karl Berry) writes:
Giuseppe et al.,
I suggest making unknown .wgetrc directives a warning (and just ignore
them,
On Thursday, 24 April 2014, 20:00:18, Andries E. Brouwer wrote:
On Thu, Apr 24, 2014 at 03:43:40PM +0200, Tim Ruehsen wrote:
1. How do you know, what filesystem you are writing to ?
I just think of these fat32 USB sticks flying around everywhere.
UTF-8 might be a problem (see
On Sunday, 11 May 2014, 15:56:15, Giuseppe Scrivano wrote:
Darshit Shah dar...@gmail.com writes:
Subject: [PATCH] Fix LOTS of compiler warnings
great work! Just some minor comments:
* http.c: Fix small memory leak
diff --git a/src/css-url.c b/src/css-url.c
index
On Friday, 30 May 2014, 22:24:19, Darshit Shah wrote:
I've attached a patch that adds support for using libpsl for cookie
domain checking in Wget.
The old heuristic checks still remain as a fallback. When the libpsl
library on the system is built without the builtin list, Wget simply
On Friday, 6 June 2014, 13:39:32, Darshit Shah wrote:
I'm facing an issue with the patch I submitted for libpsl and would be
glad if someone could help me.
The configure.ac file does not work as expected. When libpsl is not
installed on a system, the LDFLAGS does not contain -lpsl flag,
On Wednesday, 11 June 2014, 13:50:46, Tim Rühsen wrote:
On Friday, 6 June 2014, 13:39:32, Darshit Shah wrote:
I'm facing an issue with the patch I submitted for libpsl and would be
glad if someone could help me.
The configure.ac file does not work as expected. When libpsl
On Wednesday, 11 June 2014, 18:57:13, Darshit Shah wrote:
On Wed, Jun 11, 2014 at 5:20 PM, Tim Rühsen tim.rueh...@gmx.de wrote:
On Friday, 6 June 2014, 13:39:32, Darshit Shah wrote:
I'm facing an issue with the patch I submitted for libpsl and would be
glad if someone could help me
On Thursday, 12 June 2014, 07:16:13, Darshit Shah wrote:
Yes, the configure statements given by Tim work. I found out that the issue
on my machine was caching of configure values. Deleting the configure cache
fixed the issue.
I also agree with Giuseppe's point about not using the autoconf
On Thursday, 12 June 2014, 13:24:02, Giuseppe Scrivano wrote:
Darshit Shah dar...@gmail.com writes:
On Wed, Jun 11, 2014 at 5:20 PM, Tim Rühsen tim.rueh...@gmx.de wrote:
On Friday, 6 June 2014, 13:39:32, Darshit Shah wrote:
I'm facing an issue with the patch I submitted for libpsl
On Tuesday, 8 July 2014, 16:57:35, Giuseppe Scrivano wrote:
Tomas Hozza thoz...@gnu.org writes:
What do you think about extending --secure-protocol and having a runtime
option instead of a compile time option ? Users could set the system wide
default value in /etc/wgetrc and people are
ACK from here.
And please also amend
return true ? (is_acceptable == 1) : false;
to
return is_acceptable == 1;
Regards, Tim
On Saturday, 19 July 2014, 22:05:26, Darshit Shah wrote:
Does anyone ack this patch? It's a memory leak that I would like to fix.
I'll work on Tim's
On Friday, 18 July 2014, 23:58:58, Ángel González wrote:
I have written a wrapping script for wget that -using tsocks- makes it
connect through a socks proxy if the environment variable socks_proxy
is set.
I'm sharing it here as it may be of interest for a wider audience (or
should it
On Monday, 21 July 2014, 00:58:49, Darshit Shah wrote:
On Mon, Jul 7, 2014 at 8:14 PM, Tim Ruehsen tim.rueh...@gmx.de wrote:
One more comment / idea.
The 'cookie_domain' comes from an HTTP Set-Cookie response header and thus
is (must be) toASCII() encoded (= punycode). Of course this has
On Monday, 21 July 2014, 15:35:10, Giuseppe Scrivano wrote:
Darshit Shah dar...@gmail.com writes:
From a44841cbe2abe712de84d7413c31fc14b44225a7 Mon Sep 17 00:00:00 2001
From: Darshit Shah dar...@gmail.com
Date: Mon, 21 Jul 2014 13:25:54 +0530
Subject: [PATCH] Fix potential memory leak
On Saturday, 30 August 2014, 09:23:08, Darshit Shah wrote:
Earlier this year, I implemented a new, more concise form of the
progress bar. However, I've just been given a bug report regarding the
same, which I was unable to fix.
The currently implemented progress bar shows only up to 15
On Sunday, 28 September 2014, 01:00:40, Darshit Shah wrote:
patch version #3 is attached
Hi Tim,
I just wanted to point out that there is a blocking issue with this patch.
It eliminates the ability for the user to execute single tests directly. I
can run `make check` without much
Hi Darshit,
I am answering inline...
On Sunday, 28 September 2014, 01:23:08, Darshit Shah wrote:
There are a few issues that I've been facing with the old perl based test
suite that I'd like to highlight and discuss here.
1. The way the test suite has been written, I was unable to hack
On Wednesday, 15 October 2014, 13:45:18, Petr Pisar wrote:
On Wed, Oct 15, 2014 at 11:57:47AM +0200, Tim Rühsen wrote:
(meaning the library's defaults are used, whatever those are).
Should we break compatibility and map 'auto' to TLSv1 ?
For the security of the users.
Please no. Instead
On Thursday, 16 October 2014, 14:03:43, Christoph Anton Mitterer wrote:
Hi.
Could you please consider to remove SSLv3 (and if not done yet SSLv2 as
well) from being automatically used, while still leaving users the
choice to manually enable it (e.g. via --secure-protocol=SSLv2/3).
I
On Wednesday, 15 October 2014, 17:26:49, Daniel Kahn Gillmor wrote:
On 10/15/2014 03:10 PM, Tim Rühsen wrote:
I tried to make clear that Wget *explicitly* asks for SSLv2 and SSLv3 in
the default configuration when compiled with OpenSSL. Whatever the
OpenSSL library vendor is doing
Tim Rühsen wrote:
On Wednesday, 15 October 2014, 17:26:49, Daniel Kahn Gillmor wrote:
On 10/15/2014 03:10 PM, Tim Rühsen wrote:
I tried to make clear that Wget *explicitly* asks for SSLv2 and SSLv3
in
the default configuration when compiled with OpenSSL. Whatever the
OpenSSL library
On Thursday, 16 October 2014, 22:01:35, Ángel González wrote:
Ángel González wrote:
First of all, note that wget doesn't react to a disconnect with a
downgraded retry thus
it is mainly not vulnerable to poodle (you could only use
CVE-2014-3566 against servers
not supporting TLS).
On Friday, 17 October 2014, 18:02:39, Christoph Anton Mitterer wrote:
On Thu, 2014-10-16 at 21:34 +0200, Ángel González wrote:
First of all, note that wget doesn't react to a disconnect with a
downgraded retry thus
it is mainly not vulnerable to poodle (you could only use CVE-2014-3566
On Saturday, 18 October 2014, 22:40:14, Tushar wrote:
Hi,
I am a student who would like to contribute to GNU Project. I'm very
passionate about GNU organization and would like to dedicate some time
everyday for GNU. It was mentioned that I have to send an email to this
address before
On Sunday, 19 October 2014, 11:27:51, Matthew Atkinson wrote:
Hi
I was looking through the list archives to find something useful that I
could contribute to wget and get familiar with the code; I have attached
a patch for the following.
Darshit Shah darnir at gmail.com wrote on
On Sunday, 19 October 2014, 16:07:35, Giuseppe Scrivano wrote:
Tim Rühsen tim.rueh...@gmx.de writes:
patch V2
- removed SSLv3 from --secure-protocol=auto|pfs (GnuTLS code)
- removed SSLv3 from --secure-protocol=auto (OpenSSL code)
- amended the docs
I am
On Sunday, 19 October 2014, 21:11:01, Ángel González wrote:
Tim Rühsen wrote:
Hi Ángel,
thanks for your testing.
I would like to reproduce it - can you tell me what you did exactly ?
I used a simple server that printed the TLS Client Hello and closed the
connection.
Browsers
1 - 100 of 831 matches