\chapter{Trace}
The trace is a natural map $\End V \to K$, coming from the pairing $V^* \times V \to K$,
and corresponding, under the tensor--hom adjunction, to the identity map $V \to V$.
It is a sort of counit for $\End V$, and dual to scalars (the unit $K \to \End V$).
In coordinates, it's ``sum of diagonal entries'',
$$\tr A = \sum_i a_{ii}.$$
In Einstein notation, $a^i_i$.
As a $(1,1)$ tensor, its coefficients are the Kronecker delta $\delta^i_j$,
just like the identity.
It is also the sum of the eigenvalues,
and can be interpreted as ``number of fixed points, with weights''.
Geometrically, it is the derivative of the determinant at the identity,
and measures the infinitesimal change in volume.
It is a map of Lie algebras (in particular linear), but not a map of algebras;
this is expected, as it's a derivative of a map of algebras.
\section{Algebra}
We use Einstein notation.
\subsection{Abstract definition}
There is a natural pairing $V^* \times V \to K$, namely ``evaluate a covector on a vector''.
This is bilinear, so it yields a map on the tensor product, $V^* \otimes V \to K$.
Now compose with the natural isomorphism $\Hom(V,V) = V^* \otimes V$ (valid for finite-dimensional $V$), which yields
$$\End V = V^* \otimes V \to K.$$
This map is the trace.
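As a numerical sketch of this definition (assuming NumPy; the covectors and vectors are arbitrary illustrative data), one can represent an operator as a sum of simple tensors $\varphi_k \otimes v_k$ and recover its trace by evaluating each covector on its vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# An operator as a sum of simple tensors phi_k (x) v_k in V* (x) V:
# the simple tensor phi (x) v acts by x |-> phi(x) v, i.e. the matrix v phi^T.
phis = rng.standard_normal((3, 4))  # three covectors on V = R^4
vs   = rng.standard_normal((3, 4))  # three vectors in V

A = sum(np.outer(v, phi) for phi, v in zip(phis, vs))

# The trace map V* (x) V -> K evaluates each covector on its vector.
trace_via_pairing = sum(phi @ v for phi, v in zip(phis, vs))

assert np.isclose(np.trace(A), trace_via_pairing)
```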
\subsection{Relation to identity and scalars, via the Kronecker delta}
The Kronecker delta $\delta^i_j$ appears in three guises:
\begin{itemize}
\item as the components of the identity operator;
\item as the value of the natural pairing on basis elements: $e^i(e_j) = \delta^i_j$;
\item as the components of the trace, viewed as a $(1,1)$-tensor.
\end{itemize}
The identity and the scalars are dual to the trace.
Dually to the trace, the identity is intuitive from the point of view of $\End V$,
but mysterious from the point of view of $V^* \otimes V$
(it is the identity in any basis).
Its components are the Kronecker delta:
$$\mathrm{id} = \sum_{i,j} \delta^i_j\, e^j \otimes e_i = \sum_i e^i \otimes e_i.$$
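This identity can be checked concretely (a sketch assuming NumPy, with the standard basis of $\mathbf{R}^n$): summing the rank-one matrices $e^i \otimes e_i$ reproduces the identity matrix, whose components are the Kronecker delta.

```python
import numpy as np

n = 4
# Standard basis vectors e_i of V = R^n; the dual basis covectors e^i are the
# same arrays, read as row vectors.
basis = np.eye(n)

# id = sum_i e^i (x) e_i: each term is the rank-one matrix e_i (e^i)^T.
identity = sum(np.outer(basis[i], basis[i]) for i in range(n))

assert np.array_equal(identity, np.eye(n))
# The trace of the identity is the dimension.
assert np.trace(identity) == n
```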
\subsection{Relation to multiplication}
The pairing $V^* \times V \to K$, together with the interpretation of operators
as elements of the tensor product (via $\End V = V^* \otimes V$), occurs in a number
of basic operations more familiar than the trace. We illustrate, using Einstein
notation.
\noindent \begin{tabular}{lll}
\textbf{interpretation} & \textbf{matrix interpretation} & \textbf{Einstein notation}\\
inner product & pair covector with vector & $a_i b^i$\\
apply a map to a vector & multiply vector by matrix & $a^i_j c^j$\\
compose maps & multiply matrices & $a^i_j b^j_k$\\
trace & trace & $a^i_i$
\end{tabular}
\noindent \begin{tabular}{ll}
\textbf{in terms of maps} & \textbf{in terms of tensor product}\\
$V^* \times V \to K$ & $V^* \otimes V \to K$\\
$V \times \Hom(V,W) \to W$ & $V \otimes V^* \otimes W \to W$\\
$\Hom(U,V) \times \Hom(V,W) \to \Hom(U,W)$
& $U^* \otimes V \otimes V^* \otimes W \to U^* \otimes W$\\
$\End V \to K$ & $V^* \otimes V \to K$
\end{tabular}
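The rows of these tables can be checked numerically; NumPy's `einsum` takes the Einstein-notation index expressions almost verbatim (a sketch with arbitrary illustrative data):

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal(3)        # covector a_i
b = rng.standard_normal(3)        # vector  b^i
A = rng.standard_normal((3, 3))   # operator a^i_j
B = rng.standard_normal((3, 3))   # operator b^j_k

pairing = np.einsum('i,i->', a, b)       # a_i b^i : pair covector with vector
applied = np.einsum('ij,j->i', A, b)     # a^i_j c^j : apply a map to a vector
compose = np.einsum('ij,jk->ik', A, B)   # a^i_j b^j_k : compose maps
trace   = np.einsum('ii->', A)           # a^i_i : trace

assert np.isclose(pairing, a @ b)
assert np.allclose(applied, A @ b)
assert np.allclose(compose, A @ B)
assert np.isclose(trace, np.trace(A))
```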
As a more sophisticated example, given any mixed variance tensor, one gets lower order tensors
by contraction: pair a covector against a vector (a $(k,l)$-tensor contracts
to yield a $(k-1,l-1)$-tensor; one must choose which indices to contract).
In Einstein notation, pair an upper and a lower index.
A key example is the Ricci curvature, by contraction of the Riemannian curvature.
Note that a tensor is usually interpreted as a form
$$V^* \times \dots \times V^* \times V \times \dots \times V \to K$$
so if one contracts on the left, one must argue why the form descends.
(fixme: a commutative diagram would help)
More naturally, dualize by moving a vector and a covector to the right side,
and then contract there (or rather, interpret the tensor as taking in $(k-1,l-1)$
(covectors, vectors) and outputting a $(1,1)$ (covector, vector), a.k.a.\ an operator):
$$V^* \times \dots \times V^* \times V \times \dots \times V \to V \otimes V^* \to K$$
(the left map is the original tensor, partly dualized; the right map is the pairing).
Thus contraction of a tensor is somewhat indirect, depending on how you interpret
the tensor.
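Contraction in the Ricci style can be sketched with `einsum` (assuming NumPy; the array below is random illustrative data, not an actual curvature tensor): pairing the upper index of a $(1,3)$-tensor $R^a_{bcd}$ against its second lower index yields a $(0,2)$-tensor, as for the Ricci curvature $\mathrm{Ric}_{bd} = R^a_{bad}$.

```python
import numpy as np

rng = np.random.default_rng(2)
# A (1,3)-tensor R^a_{bcd}; axis 0 is the upper index.
R = rng.standard_normal((4, 4, 4, 4))

# Contract the upper index against the second lower index:
# Ric_{bd} = R^a_{bad}, a (0,2)-tensor.
Ric = np.einsum('abad->bd', R)

assert Ric.shape == (4, 4)
# Contraction is linear in the tensor.
assert np.allclose(np.einsum('abad->bd', 2 * R), 2 * Ric)
```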
Einstein notation also explains why ``sum of diagonal entries'' is a natural thing,
while ``sum of anti-diagonal entries'' or ``sum of 1st column'' isn't:
trace is a natural pairing, while these others are not.
\section{Interpretation in terms of fixed points and sum of eigenvalues}
The trace is transparent in terms of a function on $V^* \otimes V$,
but a priori opaque in terms of a function on $\End V$.
It can be interpreted as ``number of fixed points (with weights)'':
given a self-map of a set, $X \to X$, one gets an operator on the free vector space on that set,
and the trace is the number of fixed points: the $i$th diagonal entry is $1$
if $x_i \mapsto x_i$, and zero otherwise. This is most familiar for
permutation matrices.
Note that this is intrinsic, and doesn't depend on an order on the set.
For a general map (not coming from a map on a set), the trace is the sum of eigenvalues
(proof: invariant under change of basis; put in Jordan form (or just Schur form: upper triangular), so diagonal entries are exactly the eigenvalues); the eigenvalues can
be thought of as a sort of ``weight'' of fixed points.
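Both interpretations can be checked numerically (a sketch assuming NumPy; the general operator is random illustrative data):

```python
import numpy as np

# A self-map of a 4-point set: swap the first two points, fix the last two.
sigma = [1, 0, 2, 3]
P = np.eye(4)[sigma]  # the induced operator on the free vector space

# The trace counts the fixed points of sigma.
fixed_points = sum(1 for i, j in enumerate(sigma) if i == j)
assert np.trace(P) == fixed_points == 2

# For a general operator, the trace is the sum of the eigenvalues.
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real)
```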
\section{Properties}
\subsection{Lie algebra map}
Trace is a map of Lie algebras, but not a map of algebras.
Concretely, $\tr AB = \tr BA$, but in general $\tr AB \neq \tr A \tr B$.
Categorically, this is because trace and composition are both the pairing
of vectors and covectors: given two operators $A, B$, you get an element of
$$\End V \otimes \End V = V^* \otimes V \otimes V^* \otimes V.$$
The multiplication $AB$ pairs the inner pair, then trace pairs the outer pair,
while multiplication $BA$ pairs the outer pair, then trace pairs the inner pair,
thus these are equal.
In Einstein notation\footnote{You obviously can switch $i$ and $j$;
they are dummy indices.}, they're both $a^i_jb^j_i = b^j_ia^i_j$.
By contrast, $\tr A \tr B$ pairs the left pair and the right pair separately;
in Einstein notation it is $a^i_i b^j_j$.
(fixme: $V^* \times V$ etc. with over/under braces to indicate the
pairing would help)
As a concrete counter-example showing $\tr AB \neq \tr A \tr B$,
let $A = B = \psmallmatrix{0&1\\1&0}$. Then $\tr A = 0$ but $\tr A^2 = 2$!
Another example is: $A = \psmallmatrix{0&1\\0&0}, B = \psmallmatrix{0&0\\1&0}$.
For $3$-fold products, it is invariant under cyclic permutation,
which is called the \Def{cyclic property} of the trace, but not under
arbitrary permutation; this can be seen by Einstein notation or
drawing $V^* \times V \times V^* \times V \times V^* \times V$ as a hexagon;
then $\tr ABC = \tr BCA = \tr CAB$ are all contracting the same 3 sides,
while other permutations pair different pairs.
(fixme: pretty hexagonal diagram, bitte?)
We can interpret $\tr AB = \tr BA$ as $\tr [A,B] = 0$, so it vanishes on the derived algebra.
Since $K$ has a commutative Lie algebra structure (trivially), the trace
is a map of Lie algebras.
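These identities, and the counter-example from the text, can be verified numerically (a sketch assuming NumPy; $A$, $B$, $C$ are random illustrative matrices):

```python
import numpy as np

rng = np.random.default_rng(4)
A, B, C = rng.standard_normal((3, 3, 3))  # three random 3x3 matrices

# tr AB = tr BA, i.e. the trace vanishes on commutators.
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
assert np.isclose(np.trace(A @ B - B @ A), 0)

# Cyclic property for 3-fold products.
t = np.trace(A @ B @ C)
assert np.isclose(t, np.trace(B @ C @ A))
assert np.isclose(t, np.trace(C @ A @ B))

# But the trace is not multiplicative: the counter-example from the text.
X = np.array([[0, 1], [1, 0]])
assert np.trace(X) == 0 and np.trace(X @ X) == 2
```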
\subsection{Transpose}
Given a matrix, its trace equals the trace of the transpose: $\tr A^T = \tr A$.
This is obvious from the matrix characterization
(as the diagonal is invariant under transpose);
more intrinsically, the trace of the adjoint\footnote{For complex operators,
we just mean the algebraic dual, the transpose: no conjugation.}
of a map equals the original trace:
trace commutes with the duality isomorphism $\End V = \End V^*$.
Written in terms of tensor products,
$$\End V = V^* \otimes V = V \otimes V^* = V^{**} \otimes V^* = \End V^*$$
and the trace is the same pairing: you've just reversed the order of the terms.
(fixme: a commutative diagram would help)
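A one-line numerical check of the matrix characterization (a sketch assuming NumPy, with a random illustrative matrix):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))

# The diagonal is invariant under transpose, so the traces agree.
assert np.isclose(np.trace(A.T), np.trace(A))
```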
\section{Geometric interpretation}
Geometrically, the trace is the infinitesimal change in volume: it is the
derivative of the determinant at the identity, $\tr = \det'_I$.
The fact that it's not multiplicative, but is a map of Lie algebras
suggests looking at the Lie algebra, whose geometric interpretation is infinitesimals.
The determinant is a map of Lie groups, and the trace is a map of Lie algebras,
so it's not surprising that they should be related in this way.
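The relation $\tr = \det'_I$ can be seen numerically: $\det(I + tA) = 1 + t \tr A + O(t^2)$, so a finite-difference derivative of $t \mapsto \det(I + tA)$ at $t = 0$ recovers the trace (a sketch assuming NumPy, with a random illustrative matrix):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((4, 4))

# Numerically differentiate t |-> det(I + tA) at t = 0; the derivative is tr A.
t = 1e-6
derivative = (np.linalg.det(np.eye(4) + t * A) - 1) / t

assert np.isclose(derivative, np.trace(A), atol=1e-4)
```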
The kernel of the trace (the trace-free operators) is $\ssl$, the basic simple Lie algebra;
this is the Lie algebra of the kernel of the determinant, $\SL$.
I discuss their connection more in ``Trace and Determinant: a Lie theory perspective''.
\section{Hopf algebra?}
The endomorphism ring of a finite dimensional vector space
seems a lot like a Hopf algebra, especially if there's an inner product
(dual/transpose being an antipode),
but I can't prove this and can't find any references on it.
From this point of view, the trace is the counit $A \to K$
and the inclusion of scalars is the unit $K \to A$.
\section{Etymology}
The term comes from German, where the trace is called \emph{Spur}
(cognate to English ``spoor'': the track or trail of an animal), notated $\mathrm{Sp}$.
\section{Applications}
A key application is group characters: given a group representation,
its trace is called the character, and the characters are very useful
for understanding the group.
Conversely, you can think of the trace as being a character;
I don't understand characters, so this doesn't give me insight,
but it might work for you.
One also defines a trace for operators on Hilbert spaces and Banach
spaces, but it cannot be defined for all operators; hence one has \Def{trace class}
operators: those for which the trace can be defined.