Re: [Bug-wget] Wget - access list bypass / race condition PoC

2016-08-17 Thread Dawid Golunski
CVE-ID team, could you please let us know a CVE-ID number we could use
for the GNU Wget vulnerability described in this thread?

Thanks

On Wed, Aug 17, 2016 at 7:03 PM, Dawid Golunski  wrote:
> Hi Tim,
>
> Thanks for this.  The filename generation with temporary name in it looks 
> good.
> As for the credit, I discovered this vulnerability and passed it on to
> VeriSign, so I'd appreciate it if you could add 1 more credit in the
> patch / bug announcement as:
>
> Discovered by: Dawid Golunski (http://legalhackers.com)
>
> As for the CVEID, I think we should email this to cve-ass...@mitre.org
>
> Thanks.
>
>
>
> On Wed, Aug 17, 2016 at 4:48 PM, Tim Rühsen  wrote:
>> Please review / test this patch.
>>
>> Please check the 'Reported-by' in the commit message and if you got a CVE
>> number, please report it for inclusion into the commit message (and/or the
>> code).
>>
>> Regards, Tim
>>
>> On Mittwoch, 17. August 2016 10:40:35 CEST Dawid Golunski wrote:
>>> Random file name + .part extension on temporary files would already be
>>> a good improvement (even if still stored within the same directory) and
>>> would help prevent exploitation.
>>>
>>> Thanks.
>>>
>>> On Wed, Aug 17, 2016 at 9:22 AM, Tim Rühsen  wrote:
>>> > On Mittwoch, 17. August 2016 13:37:33 CEST Ander Juaristi wrote:
>>> >> I was thinking we could rename php extensions to phps, but it's all the
>>> >> same thing in the end, and even better, since the former applies to any
>>> >> kind of file and I've seen some broken servers that actually run phps
>>> >> files.
>>> >>
>>> >> So, this is what I would do:
>>> >>
>>> >> 1. Write temporary files with 600 perms, and make sure they're owned
>>> >> by the running user and group. qmail goes even further [1] by not
>>> >> letting root run, but I would not do that here.
>>> >> 2. Use mkostemp() to generate a unique filename and give it a
>>> >> harmless extension (like Mozilla's .part). We already have unique_name()
>>> >> in utils.c, although it returns the file name untouched if it does not
>>> >> exist. We should do some research on whether we could reuse parts of it.
>>> >
>>> > Giuseppe and I  have a working patch that is basically like this. We are
>>> > still discussing the details (mkstemp extension, fixed extension, both or
>>> > maybe a mkdtemp directory where we put all the temp files).
>>> >
>>> > As soon as we agree, we'll post the patch here for further
>>> > discussion/review.
>>> >
>>> >> 3. Place them in /tmp, or even better, in ~/.wget-tempfiles, or
>>> >> something like that.
>>> >
>>> > /tmp often is on a separate filesystem (e.g. RAM disk) with limited space.
>>> > This could open another (DOS) attack vector.
>>> >
>>> > You do not always have a home directory when running Wget.
>>> >
>>> >> There's a patch by Tim somewhere in this list that already does 1 (but
>>> >> please, remove the braces ;D).
>>> >>
>>> >> It also comes to my mind, instead of writing each temp file to its own
>>> >> file, we could put them all in the same file (with O_APPEND). But a) we
>>> >> need a way to tell them apart later, and b) it may cause problems in
>>> >> NFS, according to open(2).
>>> >>
>>> >> [1] http://cr.yp.to/qmail/guarantee.html
>>> >>
>>> >> On 15/08/16 18:31, Tim Rühsen wrote:
>>> >> > On Montag, 15. August 2016 10:02:55 CEST moparisthebest wrote:
>>> >> >> Hello,
>>> >> >>
>>> >> >> I find it extremely hard to call this a wget vulnerability when SO many
>>> >> >> other things are wrong with that 'vulnerable code' implementation it
>>> >> >> isn't even funny:
>>> >> >>
>>> >> >> 1. The image_importer.php script takes a single argument, why would it
>>> >> >> download with the recursive switch turned on?  Isn't that clearly a bug
>>> >> >> in the php script?  Has a php script like this that downloads all files
>>> >> >> from a website of a particular extension ever been observed in the wild?
>>> >> >>
>>> >> >> 2. A *well* configured server would have a whitelist of .php files it
>>> >> >> will execute, making it immune to this.  A *decently* configured server
>>> >> >> would always at a minimum make sure they don't execute code in
>>> >> >> directories with user provided uploads in them.  So it's additionally a
>>> >> >> bug in the server configuration. (incidentally every php package I've
>>> >> >> downloaded has at minimum a .htaccess in upload directories to prevent
>>> >> >> this kind of thing with apache)
>>> >> >>
>>> >> >> It seems to me like there has always been plenty of ways to shoot
>>> >> >> yourself in the foot with PHP, and this is just another iteration on a
>>> >> >> theme.
>>> >> >
>>> >> > Hi,
>>> >> >
>>> >> > this is absolutely true and your points were the first things that came
>>> >> > to my mind when reading the original post.
>>> >> >
>>> >> > But there is also non-obvious wget behavior in creating those (temp)
>>> 

Re: [Bug-wget] Wget - access list bypass / race condition PoC

2016-08-17 Thread Dawid Golunski
Hi Tim,

Thanks for this.  The filename generation with temporary name in it looks good.
As for the credit, I discovered this vulnerability and passed it on to
VeriSign, so I'd appreciate it if you could add 1 more credit in the
patch / bug announcement as:

Discovered by: Dawid Golunski (http://legalhackers.com)

As for the CVEID, I think we should email this to cve-ass...@mitre.org

Thanks.



On Wed, Aug 17, 2016 at 4:48 PM, Tim Rühsen  wrote:
> Please review / test this patch.
>
> Please check the 'Reported-by' in the commit message and if you got a CVE
> number, please report it for inclusion into the commit message (and/or the code).
>
> Regards, Tim
>
> On Mittwoch, 17. August 2016 10:40:35 CEST Dawid Golunski wrote:
>> Random file name + .part extension on temporary files would already be
>> a good improvement (even if still stored within the same directory) and
>> would help prevent exploitation.
>>
>> Thanks.
>>
>> On Wed, Aug 17, 2016 at 9:22 AM, Tim Rühsen  wrote:
>> > On Mittwoch, 17. August 2016 13:37:33 CEST Ander Juaristi wrote:
>> >> I was thinking we could rename php extensions to phps, but it's all the
>> >> same thing in the end, and even better, since the former applies to any
>> >> kind of file and I've seen some broken servers that actually run phps
>> >> files.
>> >>
>> >> So, this is what I would do:
>> >>
>> >> 1. Write temporary files with 600 perms, and make sure they're owned
>> >> by the running user and group. qmail goes even further [1] by not
>> >> letting root run, but I would not do that here.
>> >> 2. Use mkostemp() to generate a unique filename and give it a
>> >> harmless extension (like Mozilla's .part). We already have unique_name()
>> >> in utils.c, although it returns the file name untouched if it does not
>> >> exist. We should do some research on whether we could reuse parts of it.
>> >
>> > Giuseppe and I  have a working patch that is basically like this. We are
>> > still discussing the details (mkstemp extension, fixed extension, both or
>> > maybe a mkdtemp directory where we put all the temp files).
>> >
>> > As soon as we agree, we'll post the patch here for further
>> > discussion/review.
>> >
>> >> 3. Place them in /tmp, or even better, in ~/.wget-tempfiles, or
>> >> something like that.
>> >
>> > /tmp often is on a separate filesystem (e.g. RAM disk) with limited space.
>> > This could open another (DOS) attack vector.
>> >
>> > You do not always have a home directory when running Wget.
>> >
>> >> There's a patch by Tim somewhere in this list that already does 1 (but
>> >> please, remove the braces ;D).
>> >>
>> >> It also comes to my mind, instead of writing each temp file to its own
>> >> file, we could put them all in the same file (with O_APPEND). But a) we
>> >> need a way to tell them apart later, and b) it may cause problems in
>> >> NFS, according to open(2).
>> >>
>> >> [1] http://cr.yp.to/qmail/guarantee.html
>> >>
>> >> On 15/08/16 18:31, Tim Rühsen wrote:
>> >> > On Montag, 15. August 2016 10:02:55 CEST moparisthebest wrote:
>> >> >> Hello,
>> >> >>
>> >> >> I find it extremely hard to call this a wget vulnerability when SO many
>> >> >> other things are wrong with that 'vulnerable code' implementation it
>> >> >> isn't even funny:
>> >> >>
>> >> >> 1. The image_importer.php script takes a single argument, why would it
>> >> >> download with the recursive switch turned on?  Isn't that clearly a bug
>> >> >> in the php script?  Has a php script like this that downloads all files
>> >> >> from a website of a particular extension ever been observed in the wild?
>> >> >>
>> >> >> 2. A *well* configured server would have a whitelist of .php files it
>> >> >> will execute, making it immune to this.  A *decently* configured server
>> >> >> would always at a minimum make sure they don't execute code in
>> >> >> directories with user provided uploads in them.  So it's additionally a
>> >> >> bug in the server configuration. (incidentally every php package I've
>> >> >> downloaded has at minimum a .htaccess in upload directories to prevent
>> >> >> this kind of thing with apache)
>> >> >>
>> >> >> It seems to me like there has always been plenty of ways to shoot
>> >> >> yourself in the foot with PHP, and this is just another iteration on a
>> >> >> theme.
>> >> >
>> >> > Hi,
>> >> >
>> >> > this is absolutely true and your points were the first things that came
>> >> > to my mind when reading the original post.
>> >> >
>> >> > But there is also non-obvious wget behavior in creating those (temp)
>> >> > files in the filesystem. And there is also a long history of attack
>> >> > vectors introduced by temp files as well.
>> >> >
>> >> > Today the maintainers discussed a few possible fixes, all with pros and
>> >> > cons. I would like to list them here, in case someone likes to comment:
>> >> >
>> >> > 1. Rewrite code to keep temp 

Re: [Bug-wget] problems with some wget versions for LDAP authentication

2016-08-17 Thread Tim Rühsen
Hello Irina.

On Montag, 15. August 2016 15:18:50 CEST Irina Gerasimov wrote:
> Hello,
> 
> we use an LDAP authentication scheme on our servers at GES DISC NASA
> GSFC to authenticate data access. We discovered that some wget versions
> do not correctly pass credentials to the LDAP server. To our knowledge
> wget 1.12 and lower as well as wget 1.17 do not work as expected and
> wget 1.14, 1.15 and 1.18 work.
> 
> I would greatly appreciate it if you could look into this and see if you can
> correct the issue in wget versions past 1.14, as we advise our users to
> download wget 1.14 and higher.

Looking into the change history, we fixed a regression in 1.17 regarding HTTP 
authentication.  From what I can see, only 1.17 is affected, 1.17.1 fixes it.
(Your issue is not LDAP but HTTP authentication.)

Please advise your users not to use 1.17, since its authentication is broken.
Meanwhile, all maintained distributions should provide 1.17.1 or 1.18 instead 
of 1.17.

Wget 1.18 is the current version.

Regards, Tim


signature.asc
Description: This is a digitally signed message part.


Re: [Bug-wget] wget: configure.ac: gnutl > openssl ordering patch and libtool support

2016-08-17 Thread Tim Rühsen
On Dienstag, 16. August 2016 08:32:31 CEST Alex Damb wrote:
>  Hello,
> 
> This patch for configure.ac replaces the logic for choosing the SSL backend:
> if none is specified with --with-ssl=openssl, --with-ssl=gnutls or
> --with-ssl=no, it first tries to find the GnuTLS library and, if that fails,
> then tries to find OpenSSL. In addition, it adds an LT_INIT call to invoke
> libtool for specifying the rpath option.
> 
> Motivation: avoiding the need to define --with-ssl=openssl and
> LDFLAGS="-L/opt/openssl/path -Wl,-rpath,/opt/openssl/path" when OpenSSL or
> GnuTLS is correctly installed in a non-standard location but provides correct
> pkg-config files.
> 
> steps:
> tar xf wget-1.18.tar.xz
> cd wget-1.18
> patch < ../wget-1.18-configure.ac.patch
> libtoolize
> autoreconf
> ./configure ...
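For reference, the motivation Alex describes corresponds to an invocation roughly like the following, where OpenSSL sits in a non-standard prefix but ships pkg-config files (the /opt/openssl paths are hypothetical):

```shell
# Hypothetical prefix: let pkg-config discover the custom OpenSSL first,
# and embed the run-time library search path via -rpath.
export PKG_CONFIG_PATH=/opt/openssl/lib/pkgconfig:$PKG_CONFIG_PATH
./configure --with-ssl=openssl \
    LDFLAGS="-L/opt/openssl/lib -Wl,-rpath,/opt/openssl/lib"
```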

Hi Alex,

thanks for your contribution.

Could you split your patch into two - the LT_INIT part and the OpenSSL part,
please?
And please be so kind as to remove the trailing whitespace (e.g. configure your
editor to automatically remove it on saving, and save again).

Regards, Tim

signature.asc
Description: This is a digitally signed message part.


Re: [Bug-wget] buildbot failure in OpenCSW Buildbot on wget-solaris10-i386

2016-08-17 Thread Tim Rühsen
Please update libpsl. Looks like you have a rather outdated version.

Regards, Tim

On Mittwoch, 17. August 2016 21:20:45 CEST build...@opencsw.org wrote:
> The Buildbot has detected a new failure on builder wget-solaris10-i386 while
> building wget. Full details are available at:
>  https://buildfarm.opencsw.org/buildbot/builders/wget-solaris10-i386/builds/
> 167
> 
> Buildbot URL: https://buildfarm.opencsw.org/buildbot/
> 
> Buildslave for this Build: unstable10x
> 
> Build Reason: scheduler
> Build Source Stamp: [branch master] 262baeb11379a2765507455569d16abfa94947c0
> Blamelist: Tim Rühsen 
> 
> BUILD FAILED: failed shell_2 shell_3
> 
> sincerely,
>  -The Buildbot



signature.asc
Description: This is a digitally signed message part.


Re: [Bug-wget] Wget - access list bypass / race condition PoC

2016-08-17 Thread Tim Rühsen
Please review / test this patch.

Please check the 'Reported-by' in the commit message and if you got a CVE
number, please report it for inclusion into the commit message (and/or the code).

Regards, Tim

On Mittwoch, 17. August 2016 10:40:35 CEST Dawid Golunski wrote:
> Random file name + .part extension on temporary files would already be
> a good improvement (even if still stored within the same directory) and
> would help prevent exploitation.
>
> Thanks.
>
> On Wed, Aug 17, 2016 at 9:22 AM, Tim Rühsen  wrote:
> > On Mittwoch, 17. August 2016 13:37:33 CEST Ander Juaristi wrote:
> >> I was thinking we could rename php extensions to phps, but it's all the
> >> same thing in the end, and even better, since the former applies to any
> >> kind of file and I've seen some broken servers that actually run phps
> >> files.
> >>
> >> So, this is what I would do:
> >>
> >> 1. Write temporary files with 600 perms, and make sure they're owned
> >> by the running user and group. qmail goes even further [1] by not
> >> letting root run, but I would not do that here.
> >> 2. Use mkostemp() to generate a unique filename and give it a
> >> harmless extension (like Mozilla's .part). We already have unique_name()
> >> in utils.c, although it returns the file name untouched if it does not
> >> exist. We should do some research on whether we could reuse parts of it.
> >
> > Giuseppe and I  have a working patch that is basically like this. We are
> > still discussing the details (mkstemp extension, fixed extension, both or
> > maybe a mkdtemp directory where we put all the temp files).
> >
> > As soon as we agree, we'll post the patch here for further
> > discussion/review.
> >
> >> 3. Place them in /tmp, or even better, in ~/.wget-tempfiles, or
> >> something like that.
> >
> > /tmp often is on a separate filesystem (e.g. RAM disk) with limited space.
> > This could open another (DOS) attack vector.
> >
> > You do not always have a home directory when running Wget.
> >
> >> There's a patch by Tim somewhere in this list that already does 1 (but
> >> please, remove the braces ;D).
> >>
> >> It also comes to my mind, instead of writing each temp file to its own
> >> file, we could put them all in the same file (with O_APPEND). But a) we
> >> need a way to tell them apart later, and b) it may cause problems in
> >> NFS, according to open(2).
> >>
> >> [1] http://cr.yp.to/qmail/guarantee.html
> >>
> >> On 15/08/16 18:31, Tim Rühsen wrote:
> >> > On Montag, 15. August 2016 10:02:55 CEST moparisthebest wrote:
> >> >> Hello,
> >> >>
> >> >> I find it extremely hard to call this a wget vulnerability when SO many
> >> >> other things are wrong with that 'vulnerable code' implementation it
> >> >> isn't even funny:
> >> >>
> >> >> 1. The image_importer.php script takes a single argument, why would it
> >> >> download with the recursive switch turned on?  Isn't that clearly a bug
> >> >> in the php script?  Has a php script like this that downloads all files
> >> >> from a website of a particular extension ever been observed in the wild?
> >> >>
> >> >> 2. A *well* configured server would have a whitelist of .php files it
> >> >> will execute, making it immune to this.  A *decently* configured server
> >> >> would always at a minimum make sure they don't execute code in
> >> >> directories with user provided uploads in them.  So it's additionally a
> >> >> bug in the server configuration. (incidentally every php package I've
> >> >> downloaded has at minimum a .htaccess in upload directories to prevent
> >> >> this kind of thing with apache)
> >> >>
> >> >> It seems to me like there has always been plenty of ways to shoot
> >> >> yourself in the foot with PHP, and this is just another iteration on a
> >> >> theme.
> >> >
> >> > Hi,
> >> >
> >> > this is absolutely true and your points were the first things that came
> >> > to my mind when reading the original post.
> >> >
> >> > But there is also non-obvious wget behavior in creating those (temp)
> >> > files in the filesystem. And there is also a long history of attack
> >> > vectors introduced by temp files as well.
> >> >
> >> > Today the maintainers discussed a few possible fixes, all with pros and
> >> > cons. I would like to list them here, in case someone likes to comment:
> >> >
> >> > 1. Rewrite code to keep temp files in memory.
> >> > Too complex, needs a redesign of wget. And has been done for wget2...
> >> >
> >> > 2. Add a harmless extension to the file names.
> >> > Possible name collision with wanted files.
> >> > Possible name length issues, have to be worked around.
> >> >
> >> > 3. Using file mode 0 (no flags at all).
> >> > Short vulnerability when changing modes to write/read the data.
> >> >
> >> > 4. Using O_TMPFILE for open().
> >> > Just for Linux, not for every filesystem available.
> >> >
> >> > 5. Using mkostemp().
> >> > Possible name collision with wanted files 

[Bug-wget] buildbot failure in OpenCSW Buildbot on wget-solaris10-sparc

2016-08-17 Thread buildbot
The Buildbot has detected a new failure on builder wget-solaris10-sparc while 
building wget.
Full details are available at:
 https://buildfarm.opencsw.org/buildbot/builders/wget-solaris10-sparc/builds/172

Buildbot URL: https://buildfarm.opencsw.org/buildbot/

Buildslave for this Build: unstable10s

Build Reason: scheduler
Build Source Stamp: [branch master] 262baeb11379a2765507455569d16abfa94947c0
Blamelist: Tim Rühsen 

BUILD FAILED: failed shell_2 shell_3

sincerely,
 -The Buildbot






[Bug-wget] buildbot failure in OpenCSW Buildbot on wget-solaris10-i386

2016-08-17 Thread buildbot
The Buildbot has detected a new failure on builder wget-solaris10-i386 while 
building wget.
Full details are available at:
 https://buildfarm.opencsw.org/buildbot/builders/wget-solaris10-i386/builds/167

Buildbot URL: https://buildfarm.opencsw.org/buildbot/

Buildslave for this Build: unstable10x

Build Reason: scheduler
Build Source Stamp: [branch master] 262baeb11379a2765507455569d16abfa94947c0
Blamelist: Tim Rühsen 

BUILD FAILED: failed shell_2 shell_3

sincerely,
 -The Buildbot






Re: [Bug-wget] Wget tests

2016-08-17 Thread Dale R. Worley
Tim Rühsen  writes:
> Something went wrong... try again:

Yes, you are correct.  I repeated the "git clone", and the Makefile.am
files are present in the new clone.  E.g., the tests/Makefile.am clearly
lists the Perl tests, and the Perl test files are present (e.g.,
tests/Test-auth-basic.px).

Looking at my old working directory, tests/Makefile.am is missing but
tests/Makefile is present.  (It also contains the list of tests at
"PX_TESTS =", but that is impossible to find unless you know the
variable name to look for.)  But the *.px files are missing.

I will investigate why this happened.

It appears that there is only one test for redirection behavior:
testenv/Test-redirect-crash.py.

Thanks for your assistance!

Dale



Re: [Bug-wget] [PATCH] Implement --keep-badhash to keep Metalink's files with a bad hash

2016-08-17 Thread Matthew White
On Thu, 04 Aug 2016 17:08:20 +0200
Tim Ruehsen  wrote:

> On Thursday, August 4, 2016 12:40:49 PM CEST Matthew White wrote:
> > On Thu, 04 Aug 2016 12:08:50 +0200
> > 
> > Tim Ruehsen  wrote:
> > > On Wednesday, August 3, 2016 2:40:14 PM CEST Matthew White wrote:
> > > > On Wed, 03 Aug 2016 14:05:06 +0200
> > > > 
> > > > Tim Ruehsen  wrote:
> > > > > On Tuesday, August 2, 2016 11:27:28 AM CEST Matthew White wrote:
> > > > > > On Mon, 01 Aug 2016 16:32:30 +0200
> > > > > > 
> > > > > > Tim Ruehsen  wrote:
> > > > > > > On Saturday, July 30, 2016 9:41:48 PM CEST Matthew White wrote:
> > > > > > > > Hello!
> > > > > > > > 
> > > > > > > > I think that sometimes it could help to keep downloaded
> > > > > > > > Metalink's files which have a bad hash.
> > > > > > > > 
> > > > > > > > The default wget behaviour is to delete such files.
> > > > > > > > 
> > > > > > > > This patch provides a way to keep files which have a bad hash
> > > > > > > > through the option --keep-badhash. It appends the suffix .badhash
> > > > > > > > to the file name, except without overwriting existing files. In
> > > > > > > > the latter case, a unique suffix is appended after .badhash.
> > > > > > > > 
> > > > > > > > I made this patch working on the following branch:
> > > > > > > > master (latest 20cac2c5ab3d63aacfba35fb10878a2d490e2377)
> > > > > > > > git://git.savannah.gnu.org/wget.git
> > > > > > > > 
> > > > > > > > What do you think?
> > > > > > > 
> > > > > > > Hi Matthew,
> > > > > > > 
> > > > > > > good work !
> > > > > > > 
> > > > > > > While your FSF assignment is underway (PM), we can continue
> > > > > > > polishing.
> > > > > > > 
> > > > > > > Didn't test your code yet, but from what I see, there are just
> > > > > > > these peanuts:
> > > > > > > 
> > > > > > > 1.
> > > > > > > +  bhash = malloc (strlen (name) + strlen (".badhash") + 1);
> > > > > > > +  strcat (strcpy (bhash, name), ".badhash");
> > > > > > > 
> > > > > > > Please consider using concat_strings() from util.h.
> > > > > > > And please leave an empty line between var declaration and code -
> > > > > > > just for readability.
> > > > > > > 
> > > > > > > 2.
> > > > > > > Please add 'link' and 'unlink' to bootstrap.conf. Gnulib will
> > > > > > > emulate these on platforms where they are not available (or have
> > > > > > > a slightly different behavior). I guess we simply forgot 'unlink'
> > > > > > > when we switched to gnulib.
> > > > > > > 
> > > > > > > 3.
> > > > > > > The (GNU) commit messages should ideally look like:
> > > > > > > 
> > > > > > > one line of short description
> > > > > > > 
> > > > > > > file changes
> > > > > > > 
> > > > > > > long description
> > > > > > > 
> > > > > > > For example:
> > > > > > > Add new option --keep-badhash
> > > > > > > 
> > > > > > > * src/init.c: Add keepbadhash
> > > > > > > * src/main.c: Add keep-badhash
> > > > > > > * src/options.h: Add keep_badhash
> > > > > > > * doc/wget.texi: Add docs for --keep-badhash
> > > > > > > * src/metalink.h: Add prototypes badhash_suffix(),
> > > > > > > badhash_or_remove()
> > > > > > > * src/metalink.c: New functions badhash_suffix(),
> > > > > > > badhash_or_remove().
> > > > > > > 
> > > > > > >   (retrieve_from_metalink): Call append .badhash()
> > > > > > > 
> > > > > > > With --keep-badhash, append .badhash to Metalink's files with
> > > > > > > checksum
> > > > > > > mismatch, except without overwriting existing files.
> > > > > > > 
> > > > > > > Without --keep-badhash, remove downloaded files with checksum
> > > > > > > mismatch
> > > > > > > (this conforms to the old behaviour).
> > > > > > > 
> > > > > > > [This also applies to your other patches]
> > > > > > > 
> > > > > > > 4.
> > > > > > > Please not only write 'keepbadhash', but also what you did (e.g.
> > > > > > > remove, rename, add, ...), see my example above.
> > > > > > > 
> > > > > > > Those ChangeLog entries should allow finding changes / faults /
> > > > > > > regressions etc. even when a versioning system is not at hand,
> > > > > > > e.g. when unpacking the sources from a tarball. (Not debatable
> > > > > > > that consulting commit messages directly is much more powerful.)
> > > > > > > 
> > > > > > > 5.
> > > > > > > You write "except without overwriting existing files", maybe you
> > > > > > > should
> > > > > > > mention appending a counter instead of overwriting existent files
> > > > > > > !?
> > > > > > > 
> > > > > > > 
> > > > > > > Regards, Tim
> > > > > > 
> > > > > > Thanks Tim!
> > > > > > 
> > > > > > I really needed your guidance. I sent the modified patches to
> > > > > > bug-wget@gnu.org.
> > > > > > 
> > > > > > I 

Re: [Bug-wget] Wget - access list bypass / race condition PoC

2016-08-17 Thread Dawid Golunski
Random file name + .part extension on temporary files would already be
a good improvement (even if still stored within the same directory) and
would help prevent exploitation.

Thanks.

On Wed, Aug 17, 2016 at 9:22 AM, Tim Rühsen  wrote:
> On Mittwoch, 17. August 2016 13:37:33 CEST Ander Juaristi wrote:
>> I was thinking we could rename php extensions to phps, but it's all the
>> same thing in the end, and even better, since the former applies to any
>> kind of file and I've seen some broken servers that actually run phps files.
>>
>> So, this is what I would do:
>>
>> 1. Write temporary files with 600 perms, and make sure they're owned
>> by the running user and group. qmail goes even further [1] by not
>> letting root run, but I would not do that here.
>> 2. Use mkostemp() to generate a unique filename and give it a
>> harmless extension (like Mozilla's .part). We already have unique_name()
>> in utils.c, although it returns the file name untouched if it does not
>> exist. We should do some research on whether we could reuse parts of it.
>
> Giuseppe and I  have a working patch that is basically like this. We are still
> discussing the details (mkstemp extension, fixed extension, both or maybe a
> mkdtemp directory where we put all the temp files).
>
> As soon as we agree, we'll post the patch here for further discussion/review.
>
>> 3. Place them in /tmp, or even better, in ~/.wget-tempfiles, or
>> something like that.
>
> /tmp often is on a separate filesystem (e.g. RAM disk) with limited space.
> This could open another (DOS) attack vector.
>
> You do not always have a home directory when running Wget.
>
>> There's a patch by Tim somewhere in this list that already does 1 (but
>> please, remove the braces ;D).
>>
>> It also comes to my mind, instead of writing each temp file to its own
>> file, we could put them all in the same file (with O_APPEND). But a) we
>> need a way to tell them apart later, and b) it may cause problems in
>> NFS, according to open(2).
>>
>> [1] http://cr.yp.to/qmail/guarantee.html
>>
>> On 15/08/16 18:31, Tim Rühsen wrote:
>> > On Montag, 15. August 2016 10:02:55 CEST moparisthebest wrote:
>> >> Hello,
>> >>
>> >> I find it extremely hard to call this a wget vulnerability when SO many
>> >> other things are wrong with that 'vulnerable code' implementation it
>> >> isn't even funny:
>> >>
>> >> 1. The image_importer.php script takes a single argument, why would it
>> >> download with the recursive switch turned on?  Isn't that clearly a bug
>> >> in the php script?  Has a php script like this that downloads all files
>> >> from a website of a particular extension ever been observed in the wild?
>> >>
>> >> 2. A *well* configured server would have a whitelist of .php files it
>> >> will execute, making it immune to this.  A *decently* configured server
>> >> would always at a minimum make sure they don't execute code in
>> >> directories with user provided uploads in them.  So it's additionally a
>> >> bug in the server configuration. (incidentally every php package I've
>> >> downloaded has at minimum a .htaccess in upload directories to prevent
>> >> this kind of thing with apache)
>> >>
>> >> It seems to me like there has always been plenty of ways to shoot
>> >> yourself in the foot with PHP, and this is just another iteration on a
>> >> theme.
>> >
>> > Hi,
>> >
>> > this is absolutely true and your points were the first things that came to
>> > my mind when reading the original post.
>> >
>> > But there is also non-obvious wget behavior in creating those (temp) files
>> > in the filesystem. And there is also a long history of attack vectors
>> > introduced by temp files as well.
>> >
>> > Today the maintainers discussed a few possible fixes, all with pros and
>> > cons. I would like to list them here, in case someone likes to comment:
>> >
>> > 1. Rewrite code to keep temp files in memory.
>> > Too complex, needs a redesign of wget. And has been done for wget2...
>> >
>> > 2. Add a harmless extension to the file names.
>> > Possible name collision with wanted files.
>> > Possible name length issues, have to be worked around.
>> >
>> > 3. Using file mode 0 (no flags at all).
>> > Short vulnerability when changing modes to write/read the data.
>> >
>> > 4. Using O_TMPFILE for open().
>> > Just for Linux, not for every filesystem available.
>> >
>> > 5. Using mkostemp().
>> > Possible name collision with wanted files (which would be unexpectedly
>> > named as *.1 in case of a collision). At least the chance for a collision
>> > seems very low.
>> >
>> > Any thoughts or other ideas ?
>> >
>> > Regards, Tim
>



-- 
Regards,
Dawid Golunski
http://legalhackers.com



Re: [Bug-wget] Wget - access list bypass / race condition PoC

2016-08-17 Thread Tim Rühsen
On Mittwoch, 17. August 2016 13:37:33 CEST Ander Juaristi wrote:
> I was thinking we could rename php extensions to phps, but it's all the
> same thing in the end, and even better, since the former applies to any
> kind of file and I've seen some broken servers that actually run phps files.
> 
> So, this is what I would do:
> 
> 1. Write temporary files with 600 perms, and make sure they're owned
> by the running user and group. qmail goes even further [1] by not
> letting root run, but I would not do that here.
> 2. Use mkostemp() to generate a unique filename and give it a
> harmless extension (like Mozilla's .part). We already have unique_name()
> in utils.c, altough it returns the file name untouched if it does not
> exist. We should do some research on whether we could reuse parts of it.

Giuseppe and I  have a working patch that is basically like this. We are still 
discussing the details (mkstemp extension, fixed extension, both or maybe a 
mkdtemp directory where we put all the temp files).

As soon as we agree, we'll post the patch here for further discussion/review.

> 3. Place them in /tmp, or even better, in ~/.wget-tempfiles, or
> something like that.

/tmp often is on a separate filesystem (e.g. RAM disk) with limited space.
This could open another (DOS) attack vector.

You do not always have a home directory when running Wget.

> There's a patch by Tim somewhere in this list that already does 1 (but
> please, remove the braces ;D).
> 
> It also comes to my mind, instead of writing each temp file to its own
> file, we could put them all in the same file (with O_APPEND). But a) we
> need a way to tell them apart later, and b) it may cause problems in
> NFS, according to open(2).
> 
> [1] http://cr.yp.to/qmail/guarantee.html
> 
> On 15/08/16 18:31, Tim Rühsen wrote:
> > On Montag, 15. August 2016 10:02:55 CEST moparisthebest wrote:
> >> Hello,
> >> 
> >> I find it extremely hard to call this a wget vulnerability when SO many
> >> other things are wrong with that 'vulnerable code' implementation it
> >> isn't even funny:
> >> 
> >> 1. The image_importer.php script takes a single argument, why would it
> >> download with the recursive switch turned on?  Isn't that clearly a bug
> >> in the php script?  Has a php script like this that downloads all files
> >> from a website of a particular extension ever been observed in the wild?
> >> 
> >> 2. A *well* configured server would have a whitelist of .php files it
> >> will execute, making it immune to this.  A *decently* configured server
> >> would always at a minimum make sure they don't execute code in
> >> directories with user provided uploads in them.  So it's additionally a
> >> bug in the server configuration. (incidentally every php package I've
> >> downloaded has at minimum a .htaccess in upload directories to prevent
> >> this kind of thing with apache)
> >> 
> >> It seems to me like there has always been plenty of ways to shoot
> >> yourself in the foot with PHP, and this is just another iteration on a
> >> theme.
> > 
> > Hi,
> > 
> > this is absolutely true and your points were the first things that came to
> > my mind when reading the original post.
> > 
> > But there is also non-obvious wget behavior in creating those (temp) files
> > in the filesystem. And there is also a long history of attack vectors
> > introduced by temp files as well.
> > 
> > Today the maintainers discussed a few possible fixes, all with pros and
> > cons. I would like to list them here, in case someone likes to comment:
> > 
> > 1. Rewrite code to keep temp files in memory.
> > Too complex, needs a redesign of wget. And has been done for wget2...
> > 
> > 2. Add a harmless extension to the file names.
> > Possible name collision with wanted files.
> > Possible name length issues, have to be worked around.
> > 
> > 3. Using file mode 0 (no flags at all).
> > Short vulnerability when changing modes to write/read the data.
> > 
> > 4. Using O_TMPFILE for open().
> > Just for Linux, not for every filesystem available.
> > 
> > 5. Using mkostemp().
> > Possible name collision with wanted files (which would be unexpectedly
> > named as *.1 in case of a collision). At least the chance for a collision
> > seems very low.
> > 
> > Any thoughts or other ideas ?
> > 
> > Regards, Tim





Re: [Bug-wget] Wget - access list bypass / race condition PoC

2016-08-17 Thread Ander Juaristi
I was thinking we could rename php extensions to phps, but it's all the
same thing in the end, and even better, since the former applies to any
kind of file and I've seen some broken servers that actually run phps files.

So, this is what I would do:

1. Write temporary files with 600 perms, and make sure they're owned
by the running user and group. qmail goes even further [1] by not
letting root run, but I would not do that here.
2. Use mkostemp() to generate a unique filename and give it a
harmless extension (like Mozilla's .part). We already have unique_name()
in utils.c, although it returns the file name untouched if it does not
exist. We should do some research on whether we could reuse parts of it.
3. Place them in /tmp, or even better, in ~/.wget-tempfiles, or
something like that.
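A minimal sketch of steps 1 and 2 combined, assuming glibc's mkostemps() (the function name open_safe_tempfile is illustrative, not from wget):

```c
#define _GNU_SOURCE           /* for mkostemps(), a glibc extension */
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>

/* Steps 1 and 2 in one call: mkostemps() replaces the six X's with a
 * unique token, keeps the harmless ".part" suffix, and creates the
 * file with O_CREAT|O_EXCL and mode 0600, owned by the running user,
 * which closes the predictable-name race window. */
int open_safe_tempfile(char *tmpl /* e.g. "download-XXXXXX.part" */)
{
    /* second argument is the suffix length, i.e. strlen(".part") */
    return mkostemps(tmpl, 5, O_CLOEXEC);
}
```

mkostemps() modifies the template in place, so the caller can read back the actual file name after the call.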

There's a patch by Tim somewhere in this list that already does 1 (but
please, remove the braces ;D).

It also comes to my mind, instead of writing each temp file to its own
file, we could put them all in the same file (with O_APPEND). But a) we
need a way to tell them apart later, and b) it may cause problems in
NFS, according to open(2).

[1] http://cr.yp.to/qmail/guarantee.html

On 15/08/16 18:31, Tim Rühsen wrote:
> On Montag, 15. August 2016 10:02:55 CEST moparisthebest wrote:
>> Hello,
>>
>> I find it extremely hard to call this a wget vulnerability when SO many
>> other things are wrong with that 'vulnerable code' implementation it
>> isn't even funny:
>>
>> 1. The image_importer.php script takes a single argument, why would it
>> download with the recursive switch turned on?  Isn't that clearly a bug
>> in the php script?  Has a php script like this that downloads all files
>> from a website of a particular extension ever been observed in the wild?
>>
>> 2. A *well* configured server would have a whitelist of .php files it
>> will execute, making it immune to this.  A *decently* configured server
>> would always at a minimum make sure they don't execute code in
>> directories with user provided uploads in them.  So it's additionally a
>> bug in the server configuration. (incidentally every php package I've
>> downloaded has at minimum a .htaccess in upload directories to prevent
>> this kind of thing with apache)
>>
>> It seems to me like there has always been plenty of ways to shoot
>> yourself in the foot with PHP, and this is just another iteration on a
>> theme.
> 
> Hi,
> 
> this is absolutely true and your points were the first things that came to my 
> mind when reading the original post.
> 
> But there is also non-obvious wget behavior in creating those (temp) files in 
> the filesystem. And there is also a long history of attack vectors introduced 
> by temp files as well.
> 
> Today the maintainers discussed a few possible fixes, all with pros and cons.
> I would like to list them here, in case someone likes to comment:
> 
> 1. Rewrite code to keep temp files in memory.
> Too complex, needs a redesign of wget. And has been done for wget2...
> 
> 2. Add a harmless extension to the file names.
> Possible name collision with wanted files.
> Possible name length issues, have to be worked around.
> 
> 3. Using file mode 0 (no flags at all).
> Short vulnerability when changing modes to write/read the data.
> 
> 4. Using O_TMPFILE for open().
> Just for Linux, not for every filesystem available.
> 
> 5. Using mkostemp().
> Possible name collision with wanted files (which would be unexpectedly
> named as *.1 in case of a collision). At least the chance for a collision
> seems very low.
> 
> Any thoughts or other ideas ?
> 
> Regards, Tim
> 
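For option 4 above, a sketch of what the O_TMPFILE approach could look like (Linux >= 3.11 only, and only on filesystems that support it, as noted in the list; the function name is illustrative):

```c
#define _GNU_SOURCE           /* O_TMPFILE is Linux-specific */
#include <fcntl.h>
#include <unistd.h>
#include <errno.h>

/* Option 4: O_TMPFILE creates an unnamed file inside 'dir'.  Because
 * the file never appears in the filesystem namespace, it can never be
 * requested through a web server while the download is in flight.
 * open() fails with EOPNOTSUPP where the filesystem lacks O_TMPFILE
 * support (or EISDIR on kernels that predate the flag). */
int open_unnamed_tempfile(const char *dir)
{
    return open(dir, O_TMPFILE | O_RDWR, 0600);
}
```

Once complete, the file could be given its final name with linkat(), so the destination only ever sees fully downloaded data.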





[Bug-wget] CII best practices for wget?

2016-08-17 Thread Daniel Stenberg

Hey,

The Core Infrastructure Initiative runs a project[1] that lets free and open 
source projects register for a "best practices" badge by filling in a form 
telling the world what practices and procedures the project has and uses.


The idea is to A) make sure more projects do the "right things", and B) 
show the world which projects do those right things.


More and more projects are adding their info there. It would be lovely to see 
wget added as well!


I'm just a wget fan who appreciates the "CII best practices" project, not 
otherwise associated.


[1] = https://bestpractices.coreinfrastructure.org/

--

 / daniel.haxx.se