Bear writes:
> In this case you'd need to set up the wires-and-gates model
> in the QC for two ciphertext blocks, each attached to an
> identical plaintext-recognizer function and attached to the
> same key register. Then you set up the entangled state,
> and collapse the eigenvector on the eigen
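The recognizer Bear describes can be sketched classically: it is the predicate a Grover-style quantum search would amplify over the key register. This is a toy illustration only (hypothetical XOR "cipher" and ASCII recognizer, not any real construction); the point is that attaching the same key to two ciphertext blocks prunes keys that look plausible on one block alone.

```python
def toy_decrypt(key: int, block: bytes) -> bytes:
    """Stand-in cipher: XOR every byte with a 1-byte key."""
    return bytes(b ^ key for b in block)

def looks_like_plaintext(data: bytes) -> bool:
    """Stand-in recognizer: lowercase letters and spaces only."""
    return all(b == 32 or 97 <= b <= 122 for b in data)

def recognizer(key: int, block1: bytes, block2: bytes) -> bool:
    """The oracle predicate: true only if one key yields plaintext in both blocks."""
    return (looks_like_plaintext(toy_decrypt(key, block1)) and
            looks_like_plaintext(toy_decrypt(key, block2)))

# Classical brute force standing in for the quantum search over the key register:
secret = 0x5A
c1 = toy_decrypt(secret, b"attack at dawn")   # XOR is its own inverse
c2 = toy_decrypt(secret, b"retreat at dusk")
hits = [k for k in range(256) if recognizer(k, c1, c2)]
print(hits)  # only the true key survives both blocks
```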
Dr. Mike wrote, patiently, persistently and truthfully:
>
> On Fri, 16 Aug 2002, AARG! Anonymous wrote:
>
> > Here are some more thoughts on how cryptography could be used to
> > enhance user privacy in a system like TCPA. Even if the TCPA group
> > is not receptive t
Here are some more thoughts on how cryptography could be used to
enhance user privacy in a system like TCPA. Even if the TCPA group
is not receptive to these proposals, it would be useful to have an
understanding of the security issues. And the same issues arise in
many other kinds of systems wh
Basically I agree with Adam's analysis. At this point I think he
understands the spec as well as I do. He has a good point
about the Privacy CA key being another security weakness that could
break the whole system. It would be good to consider how exactly that
problem could be eliminate
Joe Ashwood writes:
> Actually that does nothing to stop it. Because of the construction of TCPA,
> the private keys are registered _after_ the owner receives the computer,
> this is the window of opportunity against that as well.
Actually, this is not true for the endorsement key, PUBEK/PRIVEK
Brian LaMacchia writes:
> So the complexity isn't in how the keys get initialized on the SCP (hey, it
> could be some crazy little hobbit named Mel who runs around to every machine
> and puts them in with a magic wand). The complexity is in the keying
> infrastructure and the set of signed state
In discussing how TCPA would help enforce a document revocation list
(DRL) Joseph Ashwood contrasted the situation with and without TCPA
style hardware, below. I just want to point out that his analysis of
the hardware vs software situation says nothing about DRL's specifically;
in fact it doesn'
David Wagner wrote:
> To respond to your remark about bias: No, bringing up Document Revocation
> Lists has nothing to do with bias. It is only right to seek to understand
> the risks in advance. I don't understand why you seem to insinuate
> that bringing up the topic of Document Revocation Lis
Adam Back writes:
> +---------------+-------------+
> | trusted-agent | user mode   |
> | space         | app space   |
> | (code         +-------------+
> |  compartment) | supervisor  |
> |               | mode / OS   |
> +---------------+-------------+
> |        ring -1 / TOR        |
> +-
Seth Schoen of the EFF has a good blog entry about Palladium and TCPA
at http://vitanuova.loyalty.org/2002-08-09.html. He attended Lucky's
presentation at DEF CON and also sat on the TCPA/Palladium panel at
the USENIX Security Symposium.
Seth has a very balanced perspective on these issues compa
AARG! wrote:
> I asked Eric Murray, who knows something about TCPA, what he thought
> of some of the more ridiculous claims in Ross Anderson's FAQ (like the
> SNRL), and he didn't respond. I believe it is because he is unwilling
> to publicly take a position in opposition to such a famous and res
Several people have objected to my point about the anti-TCPA efforts of
Lucky and others causing harm to P2P applications like Gnutella.
Eric Murray wrote:
> Depending on the clients to "do the right thing" is fundamentally
> stupid.
Bram Cohen agrees:
> Before claiming that the TCPA, which is f
Re the debate over whether compilers reliably produce identical object
(executable) files:
The measurement and hashing in TCPA/Palladium will probably not be done
on the file itself, but on the executable content that is loaded into
memory. For Palladium it is just the part of the program called
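The distinction between measuring the file and measuring the loaded executable content can be sketched like this (hypothetical toy binary layout, not a real TCPA/Palladium format): a build timestamp lives in the file header, so whole-file hashes differ between compiles, while a measurement taken over only the code section, as loaded into memory, stays stable.

```python
import hashlib
import struct

def make_binary(timestamp: int, code: bytes) -> bytes:
    """Toy binary: 8-byte little-endian timestamp header, then code bytes."""
    return struct.pack("<Q", timestamp) + code

def measure_file(binary: bytes) -> str:
    """Hash of the whole file, header included."""
    return hashlib.sha1(binary).hexdigest()

def measure_loaded_code(binary: bytes) -> str:
    """Hash of only the executable content, skipping the 8-byte header."""
    return hashlib.sha1(binary[8:]).hexdigest()

code = b"\x90\x90\xc3"              # identical executable content
a = make_binary(1029456000, code)   # built at one time
b = make_binary(1029542400, code)   # rebuilt a day later

print(measure_file(a) == measure_file(b))                # file hashes differ
print(measure_loaded_code(a) == measure_loaded_code(b))  # measurements agree
```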
I want to follow up on Adam's message because, to be honest, I missed
his point before. I thought he was bringing up the old claim that these
systems would "give the TCPA root" on your computer.
Instead, Adam is making a new point, which is a good one, but to
understand it you need a true pictur
Adam Back writes a very thorough analysis of possible consequences of the
amazing power of the TCPA/Palladium model. He is clearly beginning to
"get it" as far as what this is capable of. There is far more to this
technology than simple DRM applications. In fact Adam has a great idea
for how th
An article on Salon this morning (also being discussed on slashdot),
http://www.salon.com/tech/feature/2002/08/08/gnutella_developers/print.html,
discusses how the file-trading network Gnutella is being threatened by
misbehaving clients. In response, the developers are looking at limiting
the net
Seth Schoen writes:
> There is
> a much larger conversation about trusted computing in general, which
> we ought to be having:
>
> What would make you want to enter sensitive information into a
> complicated device, built by people you don't know, which you can't
> take apart under a microscope?
Anon wrote:
> You could even have each participant compile the program himself,
> but still each app can recognize the others on the network and
> cooperate with them.
Matt Crawford replied:
> Unless the application author can predict the exact output of the
> compilers, he can't issue a signatur
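Matt Crawford's objection can be sketched with stand-in primitives (HMAC as the "signature" here, not a real code-signing scheme, and the key and byte strings are hypothetical): the author signs the hash of the binary they compiled, so a peer's own compile that differs in even one byte no longer matches the signature.

```python
import hashlib
import hmac

AUTHOR_KEY = b"author-signing-key"  # hypothetical shared verification key

def sign(binary: bytes) -> bytes:
    """'Signature' over the hash of the binary (HMAC as a stand-in)."""
    return hmac.new(AUTHOR_KEY, hashlib.sha256(binary).digest(),
                    hashlib.sha256).digest()

def verify(binary: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(binary), sig)

authors_build = b"\x55\x89\xe5\xc3"
sig = sign(authors_build)

your_build = b"\x55\x89\xe5\x90\xc3"  # same source, different compiler output

print(verify(authors_build, sig))  # True
print(verify(your_build, sig))     # False: one extra byte breaks the match
```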
Here are some alternative applications for TCPA/Palladium technology which
could actually promote privacy and freedom. A few caveats, though: they
do depend on a somewhat idealized view of the architecture. It may be
that real hardware/software implementations are not sufficiently secure
for som
Peter Trei envisions data recovery in a TCPA world:
> HoM: I want to recover my data.
> Me: OK: We'll pull the HD, and get the data off it.
> HoM: Good - mount it as a secondary HD in my new system.
> Me: That isn't going to work now we have TCPA and Palladium.
> HoM: Well, what do you hav
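The failure mode in Peter's dialogue follows from sealed storage. A toy sketch of the idea (hypothetical scheme with a repeat-key XOR "cipher", not the real TCPA seal operation): the data key is derived from a secret that never leaves the platform's chip plus the measured software state, so the ciphertext on the pulled drive is useless in any other machine.

```python
import hashlib

def derive_key(platform_secret: bytes, measurement: bytes) -> bytes:
    """Seal key depends on both the chip's secret and the software measurement."""
    return hashlib.sha256(platform_secret + measurement).digest()

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy cipher: repeat-key XOR, for illustration only."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

measurement = hashlib.sha256(b"approved OS image").digest()
old_chip, new_chip = b"chip-secret-A", b"chip-secret-B"

sealed = xor_stream(derive_key(old_chip, measurement), b"my data")

# Same machine, same software: unseals correctly.
print(xor_stream(derive_key(old_chip, measurement), sealed))
# Drive mounted in a new machine: wrong platform secret, garbage out.
print(xor_stream(derive_key(new_chip, measurement), sealed))
```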
Peter Trei writes:
> It's rare enough that when a new anonym appears, we know
> that the poster made a considered decision to be anonymous.
>
> The current poster seems to have parachuted in from nowhere,
> to argue a specific position on a single topic. It's therefore
> reasonable to infer
Eric Murray writes:
> TCPA (when it isn't turned off) WILL restrict the software that you
> can run. Software that has an invalid or missing signature won't be
> able to access "sensitive data"[1]. Meaning that unapproved software
> won't work.
>
> [1] TCPAmain_20v1_1a.pdf, section 2.2
We need
Peter Trei writes:
> I'm going to respond to AARGH!, our new Sternlight, by asking two questions.
>
> 1. Why can't I control what signing keys the Fritz chip trusts?
>
> If the point of TCPA is make it so *I* can trust that *my* computer
> to run the software *I* have approved, and refuse to ru
Seth Schoen writes:
> The Palladium security model and features are different from Unix, but
> you can imagine by rough analogy a Unix implementation on a system
> with protected memory. Every process can have its own virtual memory
> space, read and write files, interact with the user, etc. But
Hi All,
I have recently been reading about password-based authentication schemes,
especially EKE and its variants. The papers I've read on EKE, DH-EKE, and
SPEKE all refer to their "perfect forward secrecy," though I have been
unable to find a formal definition of this property, or any detai
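The property these papers are after can be illustrated with a toy sketch (illustrative parameters only, nowhere near cryptographic size, and this omits the password-encryption step that makes EKE an EKE): each session uses fresh ephemeral Diffie-Hellman exponents, discarded after use, so later compromise of the long-term password does not let an eavesdropper reconstruct old session keys.

```python
import secrets

P = 2**127 - 1   # toy prime modulus (a Mersenne prime)
G = 5            # toy generator

def session_key() -> int:
    """One DH exchange with fresh ephemeral exponents on both sides."""
    a = secrets.randbelow(P - 2) + 1   # Alice's ephemeral exponent
    b = secrets.randbelow(P - 2) + 1   # Bob's ephemeral exponent
    A, B = pow(G, a, P), pow(G, b, P)  # exchanged values (EKE would encrypt
                                       # these under the shared password)
    k_alice = pow(B, a, P)
    k_bob = pow(A, b, P)
    assert k_alice == k_bob            # both ends derive the same key
    # a and b are discarded here; only A and B ever crossed the wire.
    return k_alice

k1, k2 = session_key(), session_key()
print(k1 != k2)  # fresh exponents: independent keys per session
```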