I am basically interested only in OAuth that can be used by remote
applications / processes running on a user's PC, which isn't available
yet

On Tue, Oct 22, 2013 at 7:18 PM, Chris Steipp <[email protected]> wrote:
> On Tue, Oct 22, 2013 at 1:57 AM, Merlijn van Deen <[email protected]>wrote:
>
>> Hi Chris,
>>
>> On 22 October 2013 05:45, Chris Steipp <[email protected]> wrote:
>>
>> > OAuth does not support this, since the results of an api call
>> > using OAuth signatures aren't signed (only the request from the OAuth
>> > consumer is signed), so it's possible that an attacker could forge a
>> > response back to the application, and the application would think a
>> > different user was logged in. This is less likely if you're using https
>> > for your api calls, but it's surprisingly hard to get https right [1],
>> > even if you trust all your CAs.
>> >
>> (...)
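To make the asymmetry concrete: a minimal sketch (names and values illustrative, not from this thread) of how an OAuth 1.0a consumer signs its outgoing request with HMAC-SHA1 per RFC 5849. The point is that only this request carries a signature; nothing in the protocol signs the server's response.

```python
import base64
import hashlib
import hmac
import urllib.parse


def pct(s: str) -> str:
    # OAuth 1.0a percent-encoding (RFC 5849 section 3.6)
    return urllib.parse.quote(s, safe="~")


def sign_request(method, url, params, consumer_secret, token_secret=""):
    """Build the HMAC-SHA1 signature for an OAuth 1.0a request.

    Only this outgoing request is signed; the response that comes
    back carries no signature at all.
    """
    # 1. Normalize parameters: sort the encoded key/value pairs.
    norm = "&".join(f"{pct(k)}={pct(v)}" for k, v in sorted(params.items()))
    # 2. Signature base string: METHOD & URL & params, each encoded.
    base = "&".join([method.upper(), pct(url), pct(norm)])
    # 3. The key is consumer secret and token secret, joined by '&'.
    key = f"{pct(consumer_secret)}&{pct(token_secret)}"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()
```

Any change to the method, URL, or parameters changes the signature, so the server can verify who sent the request -- but the consumer gets no equivalent guarantee about who sent the response.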
>>
>> >
>> > This common issue is being addressed by the OpenID Connect extension
>> > to OAuth2, which allows the application to request information about the
>> > person doing the authorization, and the result is signed by the server to
>> > prevent tampering.
>>
>> (...)
>>
>> I'm a bit confused by this -- I was under the impression https would be
>> enough to confirm I'm actually talking to the WMF's servers. The main
>> argument in [1] against just using https seems to be it's easy to ignore
>> invalid certificates. Is there another reason why it's dangerous to assume
>> you're talking to mw.o if the certificates check out?
>>
>
> That's correct. The issue is more that we (the security community) keep
> finding code out there that doesn't correctly handle the verification (
> http://www.cs.utexas.edu/~shmat/shmat_ccs12.pdf was one of the popular
> surveys of the subject). It's often the underlying libraries at fault
> (errors parsing the certificates, or the revocation lists, that fail open),
> or common programming mistakes (like how MediaWiki
> set CURLOPT_SSL_VERIFYHOST to true, instead of 2, for a very long time).
> But if you accept that your current libraries are probably flawed, and so
> you keep your libraries up to date, and you're careful about how you're
> doing the verification at the application layer, you *should* be fine.
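As an illustration of "being careful at the application layer" (a Python stdlib sketch, not from the original mail): a default client context already enforces the two checks the buggy code paths above get wrong, and the classic mistake is switching one of them off "just for testing" -- roughly the moral equivalent of the CURLOPT_SSL_VERIFYHOST misconfiguration.

```python
import ssl

# A default client context does both checks that flawed code gets wrong:
# certificate-chain verification and hostname matching.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED  # reject unverifiable chains
assert ctx.check_hostname is True            # reject mismatched hostnames

# The dangerous "temporary" change that keeps shipping to production:
# with these two lines, any certificate for any host is accepted.
insecure = ssl.create_default_context()
insecure.check_hostname = False          # hostname no longer matched
insecure.verify_mode = ssl.CERT_NONE     # chain no longer verified
```

Asserting the secure settings at startup, as above, is a cheap way to notice when a library or a config change has silently weakened verification.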
>
>
>>
>> Basically, I'm not quite sure whether using OIDC will help alleviate this
>> problem - you get a response back, but you still have to check the
>> signature! And given how easy it is to skip checking the signature, you're
>> basically back to the same problem as not checking the SSL certificate.
>>
>
> Correct. Hopefully, applications that really need to know the identity of a
> user (like UTRS) will go through the bother of checking the signature (in
> both OpenID Connect and the intermediate solution I'm proposing, this is
> an HMAC signature using a pre-established secret, so it should be easy
> enough that the effort is worth the security).
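A sketch of what that check could look like on the consumer side, using Python's stdlib (the payload shape, field names, and secret here are hypothetical, not MediaWiki's actual API): an HMAC-SHA256 over the identity blob, keyed with the pre-established secret, verified with a constant-time comparison.

```python
import hashlib
import hmac
import json

# Hypothetical setup: the server returns a JSON identity blob plus an
# HMAC-SHA256 over it, keyed with the consumer secret established at
# registration.  Names are illustrative only.
SHARED_SECRET = b"pre-established-consumer-secret"


def verify_identity(payload: bytes, claimed_mac_hex: str) -> dict:
    """Return the decoded identity only if its HMAC checks out."""
    expected = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing
    if not hmac.compare_digest(expected, claimed_mac_hex):
        raise ValueError("identity response failed HMAC check")
    return json.loads(payload)


payload = json.dumps({"username": "Example", "confirmed": True}).encode()
good_mac = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
identity = verify_identity(payload, good_mac)  # accepted
```

A forged or tampered response fails the check even over a broken TLS connection, which is the whole point of signing the identity assertion separately.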
>
>
>>
>> Nonetheless, I think it's useful to add an authentication mechanism that
>> follows a standard - which is clearly not the case with the current
>> 'api.php?meta=userinfo' calls.
>>
>> Merlijn
>>
>> [1]
>>
>> http://blog.astrumfutura.com/2010/10/do-cryptographic-signatures-beat-ssltls-in-oauth-2-0/
>> _______________________________________________
>> Wikitech-l mailing list
>> [email protected]
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
