in the presence of a non-degen bilinear form (say, an inner product),
a bilinear form is the same thing as an operator
what's the connection?
esp.: is Sylvester's theorem related to the (real?) spectral theorem?
(yes: they're both the same data: a set of axes, each with a weight;
 either diag(a1,...,an) on the axes, or a1 x_1^2 + ... + an x_n^2 as a form)
(yes; think about how the -proofs- are related)
(NB: given any bilinear form, operators yield bilinear forms)

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
upper-triangularizability

understand esp. this one on commuting:
a commuting family of operators stabilizes a common flag
(i.e., is conjugate into the Borel group)
pf: just need to show that they have a common eigenvector, then it's formal
(just as "stabilizes a flag" follows from "has an eigenvector")

[Cor: spectral theorem: T and T^* stabilize a common flag,
 so T also stabilizes the perp flag, hence ortho-diagonalizable
 (take the intersections to get a basis)
 concretely, they stabilize V_0 < V_1 < ... < V_n,
 so they also stabilize (V_n)^perp < ... < (V_0)^perp
 (V_k has dim k; (V_k)^perp has -codim- k)
 V_k + (V_k)^perp is an ortho direct sum (by def'n of perp)
 V_k \cap (V_{k-1})^perp has dim 1: let e_k be a unit norm vector in it;
 the e_i, e_j are ortho b/c if i > j, then e_j in V_j < V_{i-1}
 and e_i in (V_{i-1})^perp, so <e_i, e_j> = 0]

[even more simply:
 Lemma 1: T preserves V iff T^* preserves V^perp; pf: defn
 Lemma 2: on an inner product space, T and T^* both preserve V
 iff T preserves the direct sum V + V^perp (iff they both do, clearly)
 pf of spectral thm: T, T^* commute, so they have a common eigenvector v;
 so they preserve <v> + v^perp.  induct.
 (this skips flags and does the induction -here-, which is a bit simpler)]

Kolchin's thm: a family of nilpotent operators is simultaneously conjugate
into the strictly upper-triangular matrices

two operators A, B s.t.
[A,B] has rank 1 stabilize a common flag
(Drinfeld's problem that cheered him up one year; proof is:
 [A,B] = C has trace 0 (b/c it's a commutator), and rank one means
 C = v \otimes w^*; trace zero means w^* v = 0, so C^2 = 0, so C is nilpotent;
 thus this follows from the general case)
(? what about a family whose derived algebra has rank 1?)
NB: the Borel group isn't commutative, so we should expect something like this;
a commuting family is a very special algebra
(presumably we can bound the rank; I'd guess by n? (if in GL_{n+1})
 oh yeah -- this really should be the def'n of the rank or such)

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
SL2 inverse

When is the map A -> A^{-1} given by conjugation?  Well:
- you better be abelian, as A -> A^{-1} is an -anti-homomorphism,
  and any conjugation is a homomorphism
  [oh! so this is an -outer- automorphism: point is that every abelian group
   has a natural Z/2 action given by A -> A^{-1}, which is an outer
   automorphism, so we get an extension A semi Z/2: the question is to
   find these in linear land]
- you better have determinant +/- 1, as conjugation doesn't change your
  determinant and you need det A^{-1} = (det A)^{-1}, so (det A)^2 = 1,
  so det A = +/- 1: so SL_n (or SL_n^\pm)
- you better be dimension 2, as Cramer's rule says A -> A^{-1} is
  degree (n-1) (b/c of minors) on SL_n

here's 2 EGs -- are they all?

(l 0   )   conjugate by  (0 1)
(0 l^-1)                 (1 0)
[many conjugate such groups: expand on one axis, contract on the other]

(cos th  -sin th)
(sin th   cos th)
(conj by -any- reflection works)

-weird-: conj by (0 1 / 1 0) works on each of these groups individually,
but can't work on products b/c non-abelian, duh:
g(AB)g^{-1} = (gAg^{-1})(gBg^{-1}) = A^{-1}B^{-1} != B^{-1}A^{-1} = (AB)^{-1}

(cosh th  sinh th)
(sinh th  cosh th)
(here conj by (1 0 / 0 -1) works; (0 1 / 1 0) fixes these matrices instead)

- so you better be SL_2 (or SL_1^\pm, where A -> A^{-1} is trivial)

then  (a b)  ->  ( d -b)
      (c d)      (-c  a)
[mnemonic: recall det = ad - bc (cross), and look at the first row, which
 better be (d, -b), so that (row 1) . (a, c) = ad - bc = 1 -- can check work
 by looking at the other entries.
(easy to get confused b/c you switch some and convert others to minus
 -- which is which?)]

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
permutation matrices: transpose is inverse

pf 1: obvious from the matrix
pf 2: a_ji = 1 means you take row i to row j, so obviously the transpose
      reverses this
pf 3: the columns are a standard basis, hence give an orthogonal matrix

also fun for easy computations, esp. S_3; also a good way of demonstrating -sign-
[funny irony: some defns of det use sign on S_n to define it;
 conversely, can use det to define sign on S_n!
 they're really a very similar idea; really, one should say that
 S_n = GL_n F_1 or something, and then sign -is- det
 (well, sorta -- shit, this is the archimedean place).]

also O_n \cap GL_n Z = {1,-1}^n semidirect S_n, with S_n as permutation matrices
[a sorta maximal torus and Weyl group; I wanna say
 U_n \cap some complex thing = (S^1)^n semidirect S_n:
 the complex thing should be "matrices with entries of integer length"]
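(a quick numerical sanity check of the permutation-matrix facts above -- transpose
is inverse, and det recovers sign; numpy and the helper `perm_matrix` are my own
conventions, not anything standard)

```python
import numpy as np

def perm_matrix(p):
    """Permutation matrix P with P e_i = e_{p[i]}, i.e. P[p[i], i] = 1."""
    n = len(p)
    P = np.zeros((n, n), dtype=int)
    for i, pi in enumerate(p):
        P[pi, i] = 1
    return P

# transpose is inverse: the columns are the standard basis (in some order),
# so P is orthogonal  (pf 3 above)
P = perm_matrix([2, 0, 1])  # a 3-cycle in S_3
assert (P.T @ P == np.eye(3, dtype=int)).all()

# det recovers the sign: a 3-cycle is even, a transposition is odd
assert round(np.linalg.det(perm_matrix([2, 0, 1]))) == 1
assert round(np.linalg.det(perm_matrix([1, 0, 2]))) == -1
```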