13 November, 2019 at 1:52 pm

Anonymous

Why do you say that eigenvectors are determined by eigenvalues? For
example, for symmetric matrices you still do not know the n^2 plus or
minus signs of the coordinates. Isn't finding the signs almost as hard
as finding the eigenvectors themselves?

13 November, 2019 at 4:32 pm

Terence Tao

The title is indeed an oversimplification (it seemed to convey the essence
of the result more succinctly than “norm square of coefficients of
eigenvectors from eigenvalues”). The second sentence of the abstract is
intended to clarify the sense in which we relate eigenvectors and
eigenvalues.



On Fri, Nov 15, 2019 at 1:43 PM Roger Hui <[email protected]> wrote:

> The theorem gives information on the square of every element (component) of
> every eigenvector.  That's a lot of information and that's consistent with
> the informal description "compute eigenvectors using only information about
> eigenvalues".
>
>
> On Fri, Nov 15, 2019 at 9:58 AM Raul Miller <[email protected]> wrote:
>
> > On Fri, Nov 15, 2019 at 11:22 AM Roger Hui <[email protected]>
> > wrote:
> > > The actual theorem and proofs:
> > > https://terrytao.wordpress.com/tag/xining-zhang/
> >
> > I am only seeing a calculation for the magnitudes of the eigenvector
> > components there.
> >
> > (Am I missing something obvious?)
> >
> > Thanks,
> >
> > --
> > Raul
