Re: Torbutton Documentation - Adversary Capabilities.

2010-07-15 Thread Matthew



On 15/07/10 08:21, Mike Perry wrote:

Thus spake Matthew (pump...@cotse.net):


  So to go back to the OP's question (my question): what do people think
of my questions about JavaScript being able to obtain non-Tor IPs when
wiping the cache?

If you are also restarting the browser, or closing all windows, you
are probably safe from most direct javascript attack vectors. The main
danger is in leaving pages open after changing proxy settings. Then
direct unmasking is possible. Identifiers can be stored in the page
javascript itself.


Thanks for the information about fingerprinting.

Yes, I close the browser, wipe the cache, and all cookies, history, and downloaded 
files, then restart and remove the proxy settings.  I will start to use 
Torbutton, however!


***
To unsubscribe, send an e-mail to majord...@torproject.org with
unsubscribe or-talk in the body. http://archives.seul.org/or/talk/


Re: Torbutton Documentation - Adversary Capabilities.

2010-07-15 Thread Mike Perry
Thus spake Matthew (pump...@cotse.net):

>  So to go back to the OP's question (my question): what do people think 
> of my questions about JavaScript being able to obtain non-Tor IPs when 
> wiping the cache?

If you are also restarting the browser, or closing all windows, you
are probably safe from most direct javascript attack vectors. The main
danger is in leaving pages open after changing proxy settings. Then
direct unmasking is possible. Identifiers can be stored in the page
javascript itself.
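To illustrate what "identifiers can be stored in the page javascript itself" means, here is a minimal sketch (hypothetical names; the network call is simulated so it runs outside a browser). The identifier lives in a closure held by a timer callback, so wiping cookies, cache, and history does not remove it:

```javascript
// Sketch only: `sendBeacon` stands in for any network fetch a page can
// trigger (an Image load, an XHR, etc.); the tracker URL is made up.
function makeTracker(sessionId, sendBeacon) {
  // The identifier assigned while browsing over Tor is captured in this
  // closure -- it is not in the cache, cookie, or history databases.
  return function onTimerFired() {
    // If this timer fires after the user switches the proxy off, the
    // request goes out over the direct connection, carrying the ID.
    sendBeacon("http://tracker.example/ping?id=" + sessionId);
  };
}

const sent = [];
const timerCallback = makeTracker("abc123", (url) => sent.push(url));
// ...user now disables the Tor proxy but leaves the page open...
timerCallback();
console.log(sent[0]); // http://tracker.example/ping?id=abc123
```

Closing the window discards the closure along with the rest of the page, which is why restarting the browser defeats this particular vector.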

However, Javascript still has quite a bit of ability to fingerprint you
based on your desktop resolution, user agent, timezone, and many other
things. Torbutton does a good job of blocking a lot of the
fingerprintable attributes, which makes it hard to correlate your
non-Tor browser fingerprint with your Tor browser fingerprint. More work
still needs to be done here, but we do handle quite a bit of the major
fingerprinting sources.

See also: https://wiki.mozilla.org/Fingerprinting
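A sketch of the attribute-based fingerprinting described above. In a real browser these values come from `navigator`, `screen`, and `Date`; the stubs are made-up numbers that just let the example run outside a browser:

```javascript
// Stubbed browser attributes (hypothetical values, for illustration).
const navigatorStub = { userAgent: "Mozilla/5.0 (X11; Linux) ...", language: "en-GB" };
const screenStub = { width: 1280, height: 800, colorDepth: 24 };
const timezoneOffsetMinutes = -60; // what new Date().getTimezoneOffset() would report

function fingerprint(nav, scr, tzOffset) {
  // Concatenating even a few such attributes yields a string that is
  // often rare enough to distinguish one browser install from another,
  // and it is identical whether or not the traffic goes through Tor.
  return [nav.userAgent, nav.language,
          scr.width + "x" + scr.height,
          scr.colorDepth, tzOffset].join("|");
}

console.log(fingerprint(navigatorStub, screenStub, timezoneOffsetMinutes));
```

This is why Torbutton spoofs or suppresses these attributes: the leak needs no network trick at all, just the same script running in both the Tor and non-Tor states.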


-- 
Mike Perry
Mad Computer Scientist
fscked.org evil labs




Re: Torbutton Documentation - Adversary Capabilities. - fork: Normalization of XHR requests

2010-07-14 Thread Paul Syverson
On Thu, Jul 15, 2010 at 12:53:50AM +0100, Anon Mus wrote:
> Paul Syverson wrote:
>> On Tue, Jul 13, 2010 at 05:30:27PM +0100, Anon Mus wrote:
>>   
>>> Paul Syverson wrote:
>>> 
>> And just as there is no such thing as a secure system---only systems
>> secure against a given adversary conducting a given class of attack
>> provided that the implementation, deployment and environment satisfy
>> certain assumptions, so too there is no such thing as an anonymous
>> system. In that sense, the answer is no, "anonymous" should not mean
>> anonymous, or rather it depends what _you_ mean by anonymous and a
>> whole bunch of other things that must be stated.
>>
>>   
>
> Well, if that is your attitude,

This is not attitude; it's an explanation of science. It's how
'secure' is understood by everyone I know who works on security
analysis and design, from those who write the textbooks on computer
security to those who hack, from those who try to secure major defense
command and control systems to those who try to make secure web
browsers that protect consumers against phishing.

> then why have Tor in the first place?

To protect communication. And Tor does that better than pretty much
anything else available. (The professional philosopher in me feels
obligated to acknowledge that there are adversaries and contexts for
which other systems are more secure, but I believe that they are less
secure in ways that are significant and lack pathways to change that,
unlike Tor.)  And lots of us are working as hard as we can to make it
better still.

> Seems to me you need to pull over and let those who are interested
> in making Tor secure against Timing Attacks take the road. That way
> Tor will at least be on the road to being more secure than it
> is now.
>

Tor is on the road to being more secure now. I have tried to point you
to the research in the area of timing attacks that is being done. I
reiterate that there is no evidence to date that the sorts of things
you are proposing actually work. That is not to say we shouldn't keep
looking at timing attack resistance as a research question. But it is
still very unproven research. I think I have taken this as far as
anyone, and it's still a long way from practical. But having worked on
it and many other aspects for a long time, there are many things that
can be and are being done---also research that will lead to
improvements IMO long before timing-attack countermeasures ever
produce anything but much larger overhead, much smaller anonymity sets
and thus worse anonymity (less entropy if you prefer). Why are you so
focused on timing attacks?  There are plenty of positive changes to work
on where the expected payoff is better.

> Why get up in the morning?
>

To make better Tor (amongst other things).

aloha,
paul


RE: Torbutton Documentation - Adversary Capabilities.

2010-07-14 Thread downie -


> Date: Wed, 14 Jul 2010 22:26:26 +0100
> From: pump...@cotse.net
> To: or-talk@freehaven.net
> Subject: Re: Torbutton Documentation - Adversary Capabilities.
> 
>   So to go back to the OP's question (my question): what do people think 
> of my questions about JavaScript being able to obtain non-Tor IPs when 
> wiping the cache?

I may need correcting here, but I believe that things like Javascript timers 
are stored in memory as part of the page's Document Object Model (DOM), and DOM 
Storage attacks are one of the things that Torbutton protects against. The DOM 
disappears when the window or tab is closed anyway. 
Furthermore, if Torbutton is set up correctly, the cache in the Tor state is 
isolated from the cache in the Non-Tor state, so stored .js files can't come 
back to bite you.
GD
  

Re: Torbutton Documentation - Adversary Capabilities. - fork: Normalization of XHR requests

2010-07-14 Thread Anon Mus

Paul Syverson wrote:

On Tue, Jul 13, 2010 at 05:30:27PM +0100, Anon Mus wrote:
  

Paul Syverson wrote:


Tor doesn't do any batching or delaying.  This is just another way you
could be identified by timing attacks. Tor provides no resistance to
timing attacks, and so far there are no countermeasures that have
been identified as working against a passive, much less active, adversary
without imposing unacceptably high overhead or limitations.
  
Since Tor's inception (must be getting on for 10 years now) it has been 
getting faster year after year; this is due to network speed and bandwidth 
increases, which have been about 200-fold (e.g. speeds of 100+Kbps max in 
2003 to 20+Mbps today).


OK, there have been some increases in web page byte size, but it is not more 
than 10-fold.


That means a real speed increase of at least 10-fold. So perhaps Tor 
developers should start putting in some "timing attack" protection. It 
seems to me that the time is right. What is holding them back? Are they 
afraid of global big brother complaining they cannot identify users at 
will? Anonymous should mean anonymous, no?





Even assuming your description of the evolution of Tor network
communication processing is correct, I don't understand what increase
in network speed (throughput?) or bandwidth have to do with making it
more feasible to protect against timing attacks.


Obvious really; I quote you (from above): "without imposing unacceptably 
high overhead". If the speeds and bandwidth (you might like to read up on 
this subject) are up 10-fold, then the latency is down. Pages load fast 
now, so there IS room for some extra "overhead" now. Didn't you figure 
that out?


There are lots of methods that can be employed to resist timing 
attacks... and there's definite resistance to implementing them, even 
though it's obvious on first principles that they DO work and that other 
anonymity systems have used or do use them. The obvious ones are:


1. Bundling/multiplexing individual streams into mixed streams; 
individual streams can even be split over multiple routes and then 
reconstituted (means streams cannot reliably be followed) - adds entropy.
2. Caching by exit nodes (means streams cannot always be tracked from 
the external site) - adds entropy.
3. Variable-length (3 to n, in a random pattern) node paths (means timing 
attack adversaries cannot EASILY predict route start and end) - adds entropy.
4. Random variable packet delay/sequence position transmission - adds 
entropy.

5. Addition of "chaff" traffic - adds entropy.

INCREASED ENTROPY is the KEY.

The more entropy, the less certain the adversary can be of finding a 
timing attack solution.
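The entropy claim can be made concrete. A sketch (illustrative numbers only, not measurements of Tor) computing Shannon entropy in bits over the adversary's equally likely alternatives:

```javascript
// Shannon entropy in bits: H = -sum(p * log2(p)) over a probability
// distribution. More bits = more adversary uncertainty.
function shannonEntropyBits(probabilities) {
  return probabilities.reduce(
    (h, p) => (p > 0 ? h - p * Math.log2(p) : h), 0);
}

// Fixed 3-hop paths: the adversary knows the path length with certainty.
const fixedLength = [1.0];
// Hypothetical variable-length scheme: 3, 4, 5, or 6 hops, chosen uniformly.
const variableLength = [0.25, 0.25, 0.25, 0.25];

console.log(shannonEntropyBits(fixedLength));    // 0
console.log(shannonEntropyBits(variableLength)); // 2
```

Whether those extra bits actually defeat a timing correlation, as opposed to merely raising its cost, is exactly the point Paul disputes downthread.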


At the moment Tor has the appearance of an ordered NETWORK/WEB/GRAPH 
with low entropy (a predictable system); the above would make it look 
more like an amorphous CLOUD with high entropy (an unpredictable system).


As for the rest you say below: as you are stuck with ever faster 
networks, you'd better get used to it and put some ENTROPY into the Tor 
system.




 Faster networks
should just make timing attacks more effective, and we know that we
were already unable to do anything useful when such attacks were less
effective.

People should continue to work on this hard research problem.  (I
myself have a paper on it to be presented in the Privacy Enhancing
Technologies Symposium next week, "Preventing Active Timing Attacks in
Low-Latency Anonymous Communication ".) But as the blog post I pointed
at noted, nobody has yet made a suggestion that clearly improves the
situation (even in theory) and would clearly be feasible and practical
to deploy on the Tor network as it stands.

  
THE ABOVE 1..5 ALL THEORETICALLY INCREASE ENTROPY, which ACTUALLY makes 
it more difficult to mount timing attacks on Tor, as you need MORE and 
MORE data on MORE Tor nodes and users, and the computational solution 
grows by the power of the number of nodes/users that have to be included 
in the timing attack solution. Why would you argue otherwise?



And just as there is no such thing as a secure system---only systems
secure against a given adversary conducting a given class of attack
provided that the implementation, deployment and environment satisfy
certain assumptions, so too there is no such thing as an anonymous
system. In that sense, the answer is no, "anonymous" should not mean
anonymous, or rather it depends what _you_ mean by anonymous and a
whole bunch of other things that must be stated.

  


Well, if that is your attitude, then why have Tor in the first place? Seems to 
me you need to pull over and let those who are interested in making Tor 
secure against Timing Attacks take the road. That way Tor will at least 
be on the road to being more secure than it is now.


Why get up in the morning?


HTH,
Paul

Re: Torbutton Documentation - Adversary Capabilities.

2010-07-14 Thread Matthew
 So to go back to the OP's question (my question): what do people think 
of my questions about JavaScript being able to obtain non-Tor IPs when 
wiping the cache?

On 13/07/2010, at 6:47 AM, Matthew wrote:


Hello,

I have been reading the Torbutton documentation (thanks, guys) and have a 
question about the adversary capabilities.

The first adversary capability is "inserting javascript".  The document says that 
"If not properly disabled, Javascript event handlers and timers can cause the browser to 
perform network activity after Tor has been disabled, thus allowing the adversary to correlate Tor 
and Non-Tor activity and reveal a user's non-Tor IP address."

The third adversary capability is "inserting CSS".  The document says that "CSS can 
also be used to correlate Tor and Non-Tor activity and reveal a user's Non-Tor IP address, via the 
usage of CSS popups - essentially CSS-based event handlers that fetch content via CSS's 
onmouseover attribute. If these popups are allowed to perform network activity in a 
different Tor state than they were loaded in, they can easily correlate Tor and Non-Tor activity 
and reveal a user's IP address."

I understand that Torbutton is useful for protecting privacy in multiple ways.  
But I would like to address this specific issue if I may.

Let us imagine that a user surfs the net using Tor (and Polipo or Privoxy).  He 
has JavaScript installed and uses it for all sites.  He finishes his activities 
and then closes his browser.  He then wipes the following files and directories 
(I am using Ubuntu as my example):

/.mozilla/firefox/nameofuser/cookies.sqlite
/.mozilla/firefox/nameofuser/downloads.sqlite
/.mozilla/firefox/nameofuser/cookies.sqlite-journal
/.mozilla/firefox/nameofuser/places.sqlite
/.mozilla/firefox/nameofuser/places.sqlite-journal
/.mozilla/firefox/nameofuser/formhistory.sqlite

/.mozilla/firefox/nameofuser/Cache/

Now I assume that these Javascript events and handlers and the CSS handlers 
were downloaded into the Cache from when the user was browsing using Tor.  They 
would then be deleted as detailed above. Therefore, when the user loads up 
Firefox and turns off the Tor proxy settings, presumably the potential for 
JavaScript or CSS to connect Tor and non-Tor activity and get the user's real 
(non-Tor) IP address is no longer a concern?

Is this correct?  Or am I missing something?  Just to re-state: I am only 
looking at this one issue - I am well aware of how useful Torbutton is in 
other areas!

Thanks.




Re: Torbutton Documentation - Adversary Capabilities. - fork: Normalization of XHR requests

2010-07-13 Thread Paul Syverson
On Tue, Jul 13, 2010 at 05:30:27PM +0100, Anon Mus wrote:
> Paul Syverson wrote:
>> Tor doesn't do any batching or delaying.  This is just another way you
>> could be identified by timing attacks. Tor provides no resistance to
>> timing attacks, and so far there are no countermeasures that have
>> been identified as working against a passive, much less active, adversary
>> without imposing unacceptably high overhead or limitations.
> Since Tor's inception (must be getting on for 10 years now) it has been 
> getting faster year after year; this is due to network speed and bandwidth 
> increases, which have been about 200-fold (e.g. speeds of 100+Kbps max in 
> 2003 to 20+Mbps today).
>
> OK, there have been some increases in web page byte size, but it is not more 
> than 10-fold.
>
> That means a real speed increase of at least 10-fold. So perhaps Tor 
> developers should start putting in some "timing attack" protection. It 
> seems to me that the time is right. What is holding them back? Are they 
> afraid of global big brother complaining they cannot identify users at 
> will? Anonymous should mean anonymous, no?
>

Even assuming your description of the evolution of Tor network
communication processing is correct, I don't understand what increase
in network speed (throughput?) or bandwidth have to do with making it
more feasible to protect against timing attacks. Faster networks
should just make timing attacks more effective, and we know that we
were already unable to do anything useful when such attacks were less
effective.

People should continue to work on this hard research problem.  (I
myself have a paper on it to be presented in the Privacy Enhancing
Technologies Symposium next week, "Preventing Active Timing Attacks in
Low-Latency Anonymous Communication ".) But as the blog post I pointed
at noted, nobody has yet made a suggestion that clearly improves the
situation (even in theory) and would clearly be feasible and practical
to deploy on the Tor network as it stands.

And just as there is no such thing as a secure system---only systems
secure against a given adversary conducting a given class of attack
provided that the implementation, deployment and environment satisfy
certain assumptions, so too there is no such thing as an anonymous
system. In that sense, the answer is no, "anonymous" should not mean
anonymous, or rather it depends what _you_ mean by anonymous and a
whole bunch of other things that must be stated.

HTH,
Paul


Re: Torbutton Documentation - Adversary Capabilities. - fork: Normalization of XHR requests

2010-07-13 Thread Anon Mus

Paul Syverson wrote:

Tor doesn't do any batching or delaying.  This is just another way you
could be identified by timing attacks. Tor provides no resistance to
timing attacks, and so far there are no countermeasures that have
been identified as working against a passive, much less active, adversary
without imposing unacceptably high overhead or limitations.
Since Tor's inception (must be getting on for 10 years now) it has been 
getting faster year after year; this is due to network speed and 
bandwidth increases, which have been about 200-fold (e.g. speeds of 
100+Kbps max in 2003 to 20+Mbps today).


OK, there have been some increases in web page byte size, but it is not 
more than 10-fold.


That means a real speed increase of at least 10-fold. So perhaps Tor 
developers should start putting in some "timing attack" protection. It 
seems to me that the time is right. What is holding them back? Are they 
afraid of global big brother complaining they cannot identify users at 
will? Anonymous should mean anonymous, no?



 Most have
these limitations and still don't work.

See the blog post
http://blog.torproject.org/blog/one-cell-enough

  




Re: Torbutton Documentation - Adversary Capabilities. - fork: Normalization of XHR requests

2010-07-13 Thread Paul Syverson
On Tue, Jul 13, 2010 at 03:30:57PM +0800, John Barker wrote:
> On this subject.
> 
> I assume that all javascript requests actually use your browsers
> HTTP/socket engines - therefore although javascript is able to send
> network requests, they'll still be going through Tor.
> 
> This means that you could be identified by timing attacks, a
> particular sequence of Javascript XHR requests could uniquely
> identify Tor users because basically Tor doesn't do all that much
> normalisation (batching, delaying etc) of requests.
> 

Tor doesn't do any batching or delaying.  This is just another way you
could be identified by timing attacks. Tor provides no resistance to
timing attacks, and so far there are no countermeasures that have
been identified as working against a passive, much less active, adversary
without imposing unacceptably high overhead or limitations. Most have
these limitations and still don't work.

See the blog post
http://blog.torproject.org/blog/one-cell-enough

> Perhaps it would be possible for Tor to do some limited batching for
> XHR requests? I understand Tor is capable of some limited
> application scrubbing - couldn't it see the HTTP headers that
> indicate an XHR request and treat them differently?
> 
> Since XHR requests are typically quite interactive, delaying them
> might make Ajax javascript heavy websites sluggish or unusable, but
> it's just an idea I'm throwing out. Websites are definitely moving
> towards more javascript usage and it would be good to reduce this
> weakness.
> 
> 
> 
> On 13/07/2010, at 6:47 AM, Matthew wrote:
> 
> > Hello,
> > 
> > I have been reading the Torbutton documentation (thanks, guys) and have a 
> > question about the adversary capabilities.  
> > 
> > The first adversary capability is "inserting javascript".  The document 
> > says that "If not properly disabled, Javascript event handlers and timers 
> > can cause the browser to perform network activity after Tor has been 
> > disabled, thus allowing the adversary to correlate Tor and Non-Tor activity 
> > and reveal a user's non-Tor IP address."  
> > 
> > The third adversary capability is "inserting CSS".  The document says that 
> > "CSS can also be used to correlate Tor and Non-Tor activity and reveal a 
> > user's Non-Tor IP address, via the usage of CSS popups - essentially 
> > CSS-based event handlers that fetch content via CSS's onmouseover 
> > attribute. If these popups are allowed to perform network activity in a 
> > different Tor state than they were loaded in, they can easily correlate Tor 
> > and Non-Tor activity and reveal a user's IP address."
> > 
> > I understand that Torbutton is useful for protecting privacy in multiple 
> > ways.  But I would like to address this specific issue if I may.
> > 
> > Let us imagine that a user surfs the net using Tor (and Polipo or Privoxy). 
> >  He has JavaScript installed and uses it for all sites.  He finishes his 
> > activities and then closes his browser.  He then wipes the following files 
> > and directories (I am using Ubuntu as my example):
> > 
> > /.mozilla/firefox/nameofuser/cookies.sqlite 
> > /.mozilla/firefox/nameofuser/downloads.sqlite 
> > /.mozilla/firefox/nameofuser/cookies.sqlite-journal
> > /.mozilla/firefox/nameofuser/places.sqlite 
> > /.mozilla/firefox/nameofuser/places.sqlite-journal
> > /.mozilla/firefox/nameofuser/formhistory.sqlite
> >  
> > /.mozilla/firefox/nameofuser/Cache/
> > 
> > Now I assume that these Javascript events and handlers and the CSS handlers 
> > were downloaded into the Cache from when the user was browsing using Tor.  
> > They would then be deleted as detailed above. Therefore, when the user 
> > loads up Firefox and turns off the Tor proxy settings, presumably the 
> > potential for JavaScript or CSS to connect Tor and non-Tor activity and get 
> > the user's real (non-Tor) IP address is no longer a concern?
> > 
> > Is this correct?  Or am I missing something?  Just to re-state: I am only 
> > looking at this one issue - I am well aware of how useful Tor button is in 
> > other areas!
> > 
> > Thanks. 
> > 
> 


Re: Torbutton Documentation - Adversary Capabilities. - fork: Normalization of XHR requests

2010-07-13 Thread John Barker
On this subject.

I assume that all javascript requests actually use your browser's HTTP/socket 
engines; therefore, although javascript is able to send network requests, 
they'll still be going through Tor.

This means that you could be identified by timing attacks: a particular 
sequence of Javascript XHR requests could uniquely identify Tor users because 
basically Tor doesn't do all that much normalisation (batching, delaying, etc.) 
of requests.

Perhaps it would be possible for Tor to do some limited batching for XHR 
requests? I understand Tor is capable of some limited application scrubbing; 
couldn't it see the HTTP headers that indicate an XHR request and treat them 
differently?

Since XHR requests are typically quite interactive, delaying them might make 
Ajax/JavaScript-heavy websites sluggish or unusable, but it's just an idea I'm 
throwing out. Websites are definitely moving towards more javascript usage and 
it would be good to reduce this weakness.
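The batching idea sketched here is hypothetical: Tor itself does no such normalisation, and the class and names below are invented for illustration. The point is only that if requests leave in groups on a fixed clock, their departure times reflect the flush schedule rather than user or application behaviour:

```javascript
// Hypothetical batching layer: queue XHR-like requests and release them
// in groups, so inter-request timing carries less information.
class BatchingQueue {
  constructor(send) {
    this.pending = [];
    this.send = send; // stand-in for the real network transmit function
  }
  enqueue(request) {
    // Requests arrive at whatever moments the page generates them...
    this.pending.push(request);
  }
  flush() {
    // ...but leave together, on a fixed timer, flattening their timing.
    for (const request of this.pending) this.send(request);
    this.pending = [];
  }
}

const delivered = [];
const queue = new BatchingQueue((r) => delivered.push(r));
queue.enqueue("GET /a");   // e.g. an Ajax autocomplete request
queue.enqueue("GET /b");   // a second request, moments later
queue.flush();             // both depart in the same batch
console.log(delivered.length); // 2
```

The latency cost is exactly the sluggishness worry raised above: every request waits, on average, half a flush interval.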



On 13/07/2010, at 6:47 AM, Matthew wrote:

> Hello,
> 
> I have been reading the Torbutton documentation (thanks, guys) and have a 
> question about the adversary capabilities.  
> 
> The first adversary capability is "inserting javascript".  The document says 
> that "If not properly disabled, Javascript event handlers and timers can 
> cause the browser to perform network activity after Tor has been disabled, 
> thus allowing the adversary to correlate Tor and Non-Tor activity and reveal 
> a user's non-Tor IP address."  
> 
> The third adversary capability is "inserting CSS".  The document says that 
> "CSS can also be used to correlate Tor and Non-Tor activity and reveal a 
> user's Non-Tor IP address, via the usage of CSS popups - essentially 
> CSS-based event handlers that fetch content via CSS's onmouseover attribute. 
> If these popups are allowed to perform network activity in a different Tor 
> state than they were loaded in, they can easily correlate Tor and Non-Tor 
> activity and reveal a user's IP address."
> 
> I understand that Torbutton is useful for protecting privacy in multiple 
> ways.  But I would like to address this specific issue if I may.
> 
> Let us imagine that a user surfs the net using Tor (and Polipo or Privoxy).  
> He has JavaScript installed and uses it for all sites.  He finishes his 
> activities and then closes his browser.  He then wipes the following files 
> and directories (I am using Ubuntu as my example):
> 
> /.mozilla/firefox/nameofuser/cookies.sqlite 
> /.mozilla/firefox/nameofuser/downloads.sqlite 
> /.mozilla/firefox/nameofuser/cookies.sqlite-journal
> /.mozilla/firefox/nameofuser/places.sqlite 
> /.mozilla/firefox/nameofuser/places.sqlite-journal
> /.mozilla/firefox/nameofuser/formhistory.sqlite
>  
> /.mozilla/firefox/nameofuser/Cache/
> 
> Now I assume that these Javascript events and handlers and the CSS handlers 
> were downloaded into the Cache from when the user was browsing using Tor.  
> They would then be deleted as detailed above. Therefore, when the user loads 
> up Firefox and turns off the Tor proxy settings, presumably the potential for 
> JavaScript or CSS to connect Tor and non-Tor activity and get the users real 
> (non-Tor) IP address is no longer a concern?
> 
> Is this correct?  Or am I missing something?  Just to re-state: I am only 
> looking at this one issue - I am well aware of how useful Tor button is in 
> other areas!
> 
> Thanks. 
> 



Torbutton Documentation - Adversary Capabilities.

2010-07-12 Thread Matthew

 Hello,

I have been reading the Torbutton documentation (thanks, guys) and have a 
question about the adversary capabilities.


The first adversary capability is "inserting javascript".  The document 
says that "If not properly disabled, Javascript event handlers and timers 
can cause the browser to perform network activity after Tor has been 
disabled, thus allowing the adversary to correlate Tor and Non-Tor activity 
and reveal a user's non-Tor IP address."


The third adversary capability is "inserting CSS".  The document says that 
"CSS can also be used to correlate Tor and Non-Tor activity and reveal a 
user's Non-Tor IP address, via the usage of CSS popups - essentially 
CSS-based event handlers that fetch content via CSS's onmouseover 
attribute. If these popups are allowed to perform network activity in a 
different Tor state than they were loaded in, they can easily correlate Tor 
and Non-Tor activity and reveal a user's IP address."


I understand that Torbutton is useful for protecting privacy in multiple 
ways.  But I would like to address this specific issue if I may.


Let us imagine that a user surfs the net using Tor (and Polipo or 
Privoxy).  He has JavaScript installed and uses it for all sites.  He 
finishes his activities and then closes his browser.  He then wipes the 
following files and directories (I am using Ubuntu as my example):


/.mozilla/firefox/nameofuser/cookies.sqlite
/.mozilla/firefox/nameofuser/downloads.sqlite
/.mozilla/firefox/nameofuser/cookies.sqlite-journal
/.mozilla/firefox/nameofuser/places.sqlite
/.mozilla/firefox/nameofuser/places.sqlite-journal
/.mozilla/firefox/nameofuser/formhistory.sqlite

/.mozilla/firefox/nameofuser/Cache/

Now I assume that these Javascript events and handlers and the CSS handlers 
were downloaded into the Cache from when the user was browsing using Tor.  
They would then be deleted as detailed above. Therefore, when the user 
loads up Firefox and turns off the Tor proxy settings, presumably the 
potential for JavaScript or CSS to connect Tor and non-Tor activity and get 
the user's real (non-Tor) IP address is no longer a concern?


Is this correct?  Or am I missing something?  Just to re-state: I am only 
looking at this one issue - I am well aware of how useful Torbutton is in 
other areas!


Thanks.



Re: Torbutton Documentation - Adversary Capabilities.

2010-07-12 Thread Kyle Williams
Beware of Flash and other third-party plugins in your browser.  Flash
can also store "flash cookies" on your system.
I would look at "about:plugins" and see what Firefox has loaded.  Torbutton
does a good job of stopping third-party plugins, but if you specifically
allow Flash and do not clear Flash's cookies, you may have a problem.

Other than that, you have the right idea. :)



On Mon, Jul 12, 2010 at 8:45 AM, Matthew  wrote:

>  Hello,
>
> I have been reading the Torbutton documentation (thanks, guys) and have a
> question about the adversary capabilities.
>
> The first adversary capability is "inserting javascript".  The document
> says that "If not properly disabled, Javascript event handlers and timers
> can cause the browser to perform network activity after Tor has been
> disabled, thus allowing the adversary to correlate Tor and Non-Tor activity
> and reveal a user's non-Tor IP address."
>
> The third adversary capability is "inserting CSS".  The document says that
> "CSS can also be used to correlate Tor and Non-Tor activity and reveal a
> user's Non-Tor IP address, via the usage of CSS popups - essentially
> CSS-based event handlers that fetch content via CSS's onmouseover attribute.
> If these popups are allowed to perform network activity in a different Tor
> state than they were loaded in, they can easily correlate Tor and Non-Tor
> activity and reveal a user's IP address."
>
> I understand that Torbutton is useful for protecting privacy in multiple
> ways.  But I would like to address this specific issue if I may.
>
> Let us imagine that a user surfs the net using Tor (and Polipo or
> Privoxy).  He has JavaScript installed and uses it for all sites.  He
> finishes his activities and then closes his browser.  He then wipes the
> following files and directories (I am using Ubuntu as my example):
>
> /.mozilla/firefox/nameofuser/cookies.sqlite
> /.mozilla/firefox/nameofuser/downloads.sqlite
> /.mozilla/firefox/nameofuser/cookies.sqlite-journal
> /.mozilla/firefox/nameofuser/places.sqlite
> /.mozilla/firefox/nameofuser/places.sqlite-journal
> /.mozilla/firefox/nameofuser/formhistory.sqlite
>
> /.mozilla/firefox/nameofuser/Cache/
>
> Now I assume that these Javascript events and handlers and the CSS handlers
> were downloaded into the Cache from when the user was browsing using Tor.
> They would then be deleted as detailed above. Therefore, when the user loads
> up Firefox and turns off the Tor proxy settings, presumably the potential
> for JavaScript or CSS to connect Tor and non-Tor activity and get the users
> real (non-Tor) IP address is no longer a concern?
>
> Is this correct?  Or am I missing something?  Just to re-state: I am only
> looking at this one issue - I am well aware of how useful Tor button is in
> other areas!
>
> Thanks.
>

