Here we prove Sylvester’s law of inertia. Let $A$ be an $n \times n$ real symmetric matrix, and assume the spectral theorem.

Take an arbitrary orthogonal basis with respect to the form $\langle v, w \rangle = v^T A w$, and scale (and optionally reorder) so that there are $n_+$ entries of $+1$, $n_-$ entries of $-1$, and $n_0$ entries of $0$ on the diagonal. (This can also be shown using the spectral theorem, but I think it’s overkill and conceptually messier (we’re “mixing up” linear transformation matrices and bilinear form matrices).)
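To make this concrete, here is a small sketch of mine (not code from the post; the function name `signature` is my own, and exact arithmetic via `Fraction` is an implementation choice) that carries out this diagonalization by simultaneous row and column operations, i.e. by congruences $A \mapsto E^T A E$, and reads off $(n_+, n_-, n_0)$ from the signs of the resulting diagonal:

```python
from fractions import Fraction

def signature(A):
    """Return (n_plus, n_minus, n_zero) for a symmetric matrix A,
    by reducing A to a congruent diagonal matrix via simultaneous
    row and column operations (each step is A -> E^T A E)."""
    A = [[Fraction(x) for x in row] for row in A]
    n = len(A)
    for i in range(n):
        if A[i][i] == 0:
            # Try to swap in a later index with a nonzero diagonal entry.
            for j in range(i + 1, n):
                if A[j][j] != 0:
                    A[i], A[j] = A[j], A[i]            # swap rows i, j
                    for row in A:
                        row[i], row[j] = row[j], row[i]  # swap cols i, j
                    break
            else:
                # No nonzero diagonal left; if some A[i][j] != 0, adding
                # row/column j into row/column i makes A[i][i] = 2*A[i][j].
                for j in range(i + 1, n):
                    if A[i][j] != 0:
                        for k in range(n):
                            A[i][k] += A[j][k]
                        for k in range(n):
                            A[k][i] += A[k][j]
                        break
        if A[i][i] == 0:
            continue  # row/column i is zero in the remaining block
        # Clear the rest of row i and column i against the pivot.
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]
            for k in range(n):
                A[j][k] -= f * A[i][k]
            for k in range(n):
                A[k][j] -= f * A[k][i]
    diag = [A[i][i] for i in range(n)]
    return (sum(d > 0 for d in diag),
            sum(d < 0 for d in diag),
            sum(d == 0 for d in diag))
```

Since every step is a congruence, Sylvester’s law is exactly the statement that the output does not depend on the choices made along the way.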

So first we show $n_+, n_-, n_0$ are unique (i.e. invariant under change of basis). This is not too hard, since our basis is quite nice. The maximum dimension of a positive-definite subspace is $n_+$, or else we would get a linearly independent set of at least $n + 1$ vectors. (More precisely, this would force a nontrivial intersection between a dimension-$(n_+ + 1)$ positive-definite space and the dimension-$(n_- + n_0)$ negative-semi-definite space spanned by the $-1$ and $0$ basis vectors, which is clearly absurd.) Similarly, the maximum dimension of a negative-definite subspace is $n_-$, and then $n_0 = n - n_+ - n_-$ is determined too.

Now that we have uniqueness, we move on to the eigenvalue and principal minor determinant interpretations. By the spectral theorem (for real symmetric matrices), the fact that symmetric matrices have real eigenvalues, and uniqueness of the Sylvester form, $A$ has $n_+$ positive eigenvalues, $n_-$ negative eigenvalues, and $n_0$ zero eigenvalues (counted with multiplicity).
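As a quick sanity check of the eigenvalue statement in the $2 \times 2$ case (my own illustration, not from the post; the helper name `eig_signs_2x2` is hypothetical), the characteristic polynomial can be solved directly:

```python
import math

def eig_signs_2x2(a, b, c):
    """Sign counts (positive, negative) of the eigenvalues of the
    symmetric matrix [[a, b], [b, c]], via the quadratic formula on
    the characteristic polynomial t^2 - (a+c) t + (ac - b^2)."""
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(tr * tr - 4 * det)  # always real for a symmetric matrix
    eigs = [(tr - disc) / 2, (tr + disc) / 2]
    return (sum(t > 0 for t in eigs), sum(t < 0 for t in eigs))
```

(Floating point makes exact-zero eigenvalues fragile, so this is only meant for matrices with nonzero determinant.)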

This will allow us to induct for the principal minor determinant interpretation: if $A$ is $n \times n$, and **all leading principal minor determinants are nonzero (important assumption for “mixed”-definite matrices)**, then there are exactly $n_-$ sign changes among the list of determinants of $A_0, A_1, \ldots, A_n$, where $A_k$ denotes the top-left $k \times k$ submatrix and we interpret $\det A_0$ as $+1$.

The base case $n = 1$ is pretty clear. Now suppose $n \ge 2$ and the result holds for $n - 1$. The key interpretation of $A_{n-1}$ is as follows: let $V$ be the set of vectors in $\mathbb{R}^n$ with last coordinate zero. Then $v^T A w = \pi(v)^T A_{n-1} \pi(w)$ for any $v, w \in V$, where $\pi$ denotes projection onto the first $n - 1$ coordinates.

By the inductive hypothesis on $A_{n-1}$, if we have $k$ sign changes, then there are $k$ negative eigenvalues and $n - 1 - k$ positive eigenvalues **of $A_{n-1}$**, with corresponding eigenvectors $w_1, \ldots, w_{n-1}$ (forming an orthonormal basis WLOG by the spectral theorem) (first $k$ negative). To convert to information about $A$, we use the vector interpretation: a negative eigenvalue $\lambda_i$ with eigenvector $w_i$ (with a $0$ appended to give $v_i \in V$) gives $v_i^T A v_i = w_i^T A_{n-1} w_i = \lambda_i < 0$; similarly for positive. Of course, $v_i^T A v_j = \lambda_j w_i^T w_j = 0$ for distinct $i, j$ by orthogonality.

So now we have a positive-definite subspace of dimension $n - 1 - k$ and a negative-definite subspace of dimension $k$, so $A$ has $n_+ \ge n - 1 - k$ and $n_- \ge k$. Since $\det A \ne 0$ forces $n_0 = 0$ and hence $n_+ + n_- = n$, either $(n_+, n_-) = (n - k, k)$ or $(n_+, n_-) = (n - 1 - k, k + 1)$, according to whether $\det A$ (whose sign is $(-1)^{n_-}$) has the same sign as $\det A_{n-1}$ (whose sign is $(-1)^k$) or the opposite sign. Either way the number of sign changes through $\det A_n$ equals $n_-$, completing the induction.
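The finished criterion is easy to test on small examples. Here is a sketch of mine (not code from the post; the names `det` and `minor_sign_changes` are my own, and cofactor expansion is fine at this size):

```python
from fractions import Fraction

def det(M):
    """Determinant by cofactor expansion along the first row
    (adequate for the tiny matrices used here)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = Fraction(0)
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def minor_sign_changes(A):
    """Count sign changes in det(A_0) = 1, det(A_1), ..., det(A_n),
    where A_k is the top-left k-by-k block of A.  By the criterion
    above, this equals the number of negative eigenvalues of A,
    provided every det(A_k) is nonzero."""
    n = len(A)
    seq = [Fraction(1)]
    for k in range(1, n + 1):
        seq.append(det([row[:k] for row in A[:k]]))
    assert all(d != 0 for d in seq), "criterion needs nonzero minors"
    return sum(1 for a, b in zip(seq, seq[1:]) if a * b < 0)
```

For instance, $\begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$ has eigenvalues $3$ and $-1$, and its minor sequence $1, 1, -3$ indeed has one sign change.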


Hey, apologies that this isn’t related to the above, but I was reading your blog and you recommended Stein’s analysis lectures. Now I just so happen to own a copy of the first lecture on Fourier Analysis, and I’ll be taking a course next semester that covers introductory real analysis (Rudin) and complex analysis (Priestley). When do you recommend I should read the Fourier analysis book? Just alongside the course, or is there a point at which it will be more beneficial (after the course maybe)?

Thanks so much and keep up the awesome work!


If you have time, then it would be great alongside (or even before) the course—whenever there’s something from (baby) Rudin used in the Fourier analysis book, that would be good motivation to then learn that material well in Rudin. 🙂 And there are certainly plenty of connections between complex analysis and Fourier analysis!
