The force $F$ exerted by the rod is perpendicular to the sphere, so for any curve $c$ tangent to the sphere we have $\ip{F}{\dot{c}} \equiv 0$, which gives $W = \int_0^1 \ip{F(c(t))}{\dot{c}(t)}\, \D{t} = 0$.
If the instantaneous velocity of the bob is $v$ when it is at the point $x$, then the kinetic energy is $T = \frac{1}{2} \norm{v}^2$ (taking the bob to have unit mass), where $\norm{\bullet}$ is the Euclidean norm in $\R^3$ restricted to $T_x \sphere{2}$. To compute the potential energy, let $F = -g e_3$ be the downward force due to gravity. We choose $S = -e_3$ to be the point of zero potential energy and let $c : [0,1] \to M$ be a smooth curve connecting $S$ to $x$. The work done by the gravitational force is $W = \int_0^1 \ip{F}{\dot{c}(t)}\, \D{t} = \int_0^1 -g \ddt{t}{x_3} \D{t} = -g (x_3(1)-x_3(0))$. Thus $W = -g \Delta x_3$, and so the potential energy is $V = -W = g (x_3+1)$ since $x_3(0)=-1$.
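As a numerical sanity check, here is a minimal Python sketch (assuming numpy; the great-circle path, the endpoint angle and the value of $g$ are illustrative choices) that integrates the work along a curve on $\sphere{2}$ from $S$ to $x$, confirming $W = -g\,\Delta x_3$, and that verifies the radial constraint force does no work.
\begin{verbatim}
import numpy as np

g = 9.81       # gravitational acceleration (illustrative value)
theta = 1.0    # polar angle of the endpoint x (arbitrary choice)

# Great-circle path on S^2 from S = (0,0,-1) to x = (sin(theta),0,-cos(theta)).
t = np.linspace(0.0, 1.0, 100001)
c = np.stack([np.sin(theta*t), np.zeros_like(t), -np.cos(theta*t)], axis=1)
cdot = np.stack([theta*np.cos(theta*t), np.zeros_like(t),
                 theta*np.sin(theta*t)], axis=1)

F = np.array([0.0, 0.0, -g])              # gravity F = -g e_3
integrand = cdot @ F                      # <F, c'(t)> at each sample
W = np.sum(0.5*(integrand[1:] + integrand[:-1])*np.diff(t))  # trapezoid rule
x3 = c[-1, 2]
print(np.isclose(W, -g*(x3 + 1)))         # W = -g*Delta x3, so V = g(x3+1): True

# The rod's force is radial, so it does no work: <c(t), c'(t)> = 0 on S^2.
print(np.allclose(np.einsum('ij,ij->i', c, cdot), 0.0))      # True
\end{verbatim}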
The claim is certainly true for $k=1$ when $\tenalg{1}{V} = V$. So, assume the claim is true for $1, \ldots, k-1$. Let $v_1, \ldots, v_n$ be a basis of $V$ and let $\phi_1, \ldots, \phi_n$ be the dual basis of $V^*$. For each $k$-tuple $I=(i_1, \ldots, i_k)$ with $1 \leq i_1, \ldots, i_k \leq n$, let $v_I = v_{i_1} \tensorproduct \cdots \tensorproduct v_{i_k}$ and likewise for $\phi_I$. The multilinearity of $\tensorproduct$ implies that the collection of $v_I$ spans $\tenalg{k}{V}$. To prove it is a basis, suppose that $\eta = \sum_I \eta_I v_I = 0$. We need to show that each coefficient $\eta_I = 0$. Define, for a monomial $w_J = w_{j_1} \tensorproduct \cdots \tensorproduct w_{j_k} \in \tenalg{k}{V}$, \begin{align*} \ip{\phi_I}{w_J} = \prod_{\alpha=1}^k \ip{\phi_{i_{\alpha}}}{w_{j_{\alpha}}}. \end{align*} Due to the multilinearity of $\tensorproduct$, this is a well-defined map that extends from the generators to a linear map. Moreover, we see that $\ip{\phi_I}{v_J} = \delta_{IJ}$. Therefore $0 = \ip{\phi_I}{\eta} = \eta_I$ for all $I$. This proves the linear independence.
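The pairing argument can be checked in coordinates, realising $\tensorproduct$ with the Kronecker product (a Python sketch assuming numpy; the basis, taken as the columns of a random invertible matrix, is an arbitrary choice): the $n^k$ monomials $v_I$ have full rank and $\ip{\phi_I}{v_J} = \delta_{IJ}$.
\begin{verbatim}
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n, k = 2, 3

B = rng.standard_normal((n, n))    # columns are a basis v_1, ..., v_n
Phi = np.linalg.inv(B)             # rows are the dual basis phi_1, ..., phi_n

def mono(vecs):
    # v_{i_1} (x) ... (x) v_{i_k}, realised as an iterated Kronecker product
    out = vecs[0]
    for u in vecs[1:]:
        out = np.kron(out, u)
    return out

idx = list(product(range(n), repeat=k))
vI = np.array([mono([B[:, i] for i in I]) for I in idx])
phiI = np.array([mono([Phi[i, :] for i in I]) for I in idx])

print(np.allclose(phiI @ vI.T, np.eye(n**k)))   # <phi_I, v_J> = delta_IJ: True
print(np.linalg.matrix_rank(vI) == n**k)        # the v_I are a basis: True
\end{verbatim}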
The proof is essentially identical, since we have only used the multilinearity of $\tensorproduct$.
Let $W, V$ be finite-dimensional vector spaces of dimension $\geq 2$. Let $w_i, i=1, \ldots, w$ (resp. $v_j, j=1, \ldots, v$) be a basis of $W$ (resp. $V$). We know that $w_i \tensorproduct v_j$ is a basis of $W \tensorproduct V$. Define a linear map $\phi : W \tensorproduct V \to \Matrect{w}{v}{\R}$ by \begin{align*} \phi(\eta) = \phi(\sum_{ij} \eta_{ij} w_i \tensorproduct v_j) &= \sum_{ij} \eta_{ij} E_{ij} \end{align*} where $E_{ij}$ is the $w \times v$ matrix with zeroes everywhere but in the $(i, j)$ entry, which is $1$. This map $\phi$ is a linear isomorphism. It is clear that if $\eta = a \tensorproduct b$, then the rank of the matrix $\phi(\eta)$ is at most $1$. Since $v, w \geq 2$, there is a matrix $x$ of rank $2$ or more in $\Matrect{w}{v}{\R}$. Then $y = \phi^{-1}(x)$ cannot equal $a \tensorproduct b$ for any $a \in W$, $b \in V$. If we apply this to $\tenalg{k+1}{V} = \tenalg{k}{V} \tensorproduct V$, then we have proven the claim in full generality.
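Concretely (a short Python sketch with numpy; the vectors and the $2 \times 2$ case are arbitrary choices): $\phi(a \tensorproduct b)$ is the outer product $ab'$, which has rank at most $1$, while the identity matrix has rank $2$, so its preimage under $\phi$ is not decomposable.
\begin{verbatim}
import numpy as np

# phi(a (x) b) is the outer product a b', so its rank is at most 1.
a = np.array([1.0, 2.0])
b = np.array([3.0, -1.0])
print(np.linalg.matrix_rank(np.outer(a, b)))   # 1

# The identity has rank 2; phi^{-1}(I) = e1 (x) e1 + e2 (x) e2 is therefore
# not of the form a (x) b for any vectors a, b.
print(np.linalg.matrix_rank(np.eye(2)))        # 2
\end{verbatim}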
For symmetric tensors, let $v, w \in V$ be linearly independent and define $\alpha = v \symmetricproduct v + w \symmetricproduct w \in \symalg{2}{V}$. Without loss of generality, we can suppose that $v, w$ are orthogonal unit vectors. Define $\phi : \symalg{2}{V} \to \Mat{n}{\R}$ by $\phi(x \symmetricproduct y) = xy' + yx'$. This map is well-defined on generators of $\symalg{2}{V}$ and extends to a linear map. Suppose that $A = \phi(\alpha)$ equals $B = \phi(x \symmetricproduct y)$ for some $x, y \in V$. By construction, $A v = 2v$, $A w = 2 w$ and $A s = 0$ for all $s \perp v, w$. On the other hand, the image of $B$ lies in $\Span{x, y}$. Therefore, we must have that $x, y \in \Span{v, w}$. Then, $2 x = A x = B x = x(y'x) + y(x'x)$. Since $x \neq 0$ (otherwise $B = 0 \neq A$), this implies that $y$ is a scalar multiple of $x$. Therefore, the rank of $B$ is at most $1$; but the rank of $A$ is $2$. Absurd.
For skew-symmetric tensors, let $u, v, w, x \in V$ be linearly independent and define $\alpha = u \extproduct v + w \extproduct x \in \extalg{2}{V}$. Define a map $\phi : \extalg{2}{V} \to \Mat{n}{\R}$ by $\phi(a \extproduct b) = ab' - ba'$. One verifies that $\phi$ is well-defined on the generators of $\extalg{2}{V}$ and extends to a linear map. If $A = \phi(\alpha)$ equals $B = \phi(a \extproduct b)$ for some $a, b \in V$, then the rank of $A$ (which is $4$) equals that of $B$ (which is at most $2$, since the image of $B$ lies in $\Span{a, b}$). Absurd.
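Both rank computations can be verified numerically (a Python sketch assuming numpy; the vectors are random, hence linearly independent with probability one):
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Symmetric case: with v, w orthonormal, A = phi(v.v + w.w) = 2vv' + 2ww'
# has rank 2, whereas the proof forces phi(x.y) to have rank at most 1.
v, w = np.eye(n)[0], np.eye(n)[1]
A = 2*np.outer(v, v) + 2*np.outer(w, w)
print(np.linalg.matrix_rank(A))                # 2

# Skew case: phi(a^b) = ab' - ba' has rank at most 2, but
# phi(u^v + w^x) has rank 4 for independent u, v, w, x.
u, v, w, x = rng.standard_normal((4, n))
A4 = (np.outer(u, v) - np.outer(v, u)) + (np.outer(w, x) - np.outer(x, w))
a, b = rng.standard_normal((2, n))
B = np.outer(a, b) - np.outer(b, a)
print(np.linalg.matrix_rank(A4), np.linalg.matrix_rank(B))   # 4 2
\end{verbatim}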
We have that $\exp(t A)x = x + tAx + O(t^2)$. Using the multilinearity of $\tensorproduct$, we have that $(\exp(t A)x) \tensorproduct (\exp(t A)y) = x \tensorproduct y + t (Ax) \tensorproduct y + t x \tensorproduct (Ay) + O(t^2)$. Taking the derivative with respect to $t$ at $t=0$ gives the answer: $(Ax) \tensorproduct y + x \tensorproduct (Ay)$.
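A finite-difference check of this computation (a Python sketch assuming numpy and scipy; $A$, $x$, $y$ are random and $\tensorproduct$ is realised by the Kronecker product):
\begin{verbatim}
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
x, y = rng.standard_normal((2, n))

def g(t):
    # (exp(tA) x) (x) (exp(tA) y) in Kronecker coordinates
    return np.kron(expm(t*A) @ x, expm(t*A) @ y)

h = 1e-6
numeric = (g(h) - g(-h)) / (2*h)               # central difference at t = 0
exact = np.kron(A @ x, y) + np.kron(x, A @ y)  # (Ax) (x) y + x (x) (Ay)
print(np.allclose(numeric, exact, atol=1e-6))  # True
\end{verbatim}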
Suppose that $L, M$ are derivations that agree on $V = \tenalg{1}{V}$. Suppose that $L=M$ on $\tenalg{k-1}{V}$ for some $k \geq 2$. Then $L(v_1 \tensorproduct \cdots \tensorproduct v_{k-1} \tensorproduct v_k) = L(v_1 \tensorproduct \cdots \tensorproduct v_{k-1}) \tensorproduct v_k + v_1 \tensorproduct \cdots \tensorproduct v_{k-1} \tensorproduct L(v_k) = M(v_1 \tensorproduct \cdots \tensorproduct v_{k-1}) \tensorproduct v_k + v_1 \tensorproduct \cdots \tensorproduct v_{k-1} \tensorproduct M(v_k) = M(v_1 \tensorproduct \cdots \tensorproduct v_{k-1} \tensorproduct v_k)$ for all $v_1, \ldots, v_k \in V$. Therefore $L$ and $M$ coincide on a basis of $\tenalg{k}{V}$, and since they are linear, they coincide. By induction, they coincide on $\tenalg{*}{V}$.
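In coordinates, the derivation induced by $A$ on $\tenalg{k}{V}$ is the sum of $k$ Kronecker terms with $A$ in one slot and the identity elsewhere; a sketch (assuming numpy; $k=3$ and the data are random) of the Leibniz identity used in the inductive step:
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)
n = 2
A = rng.standard_normal((n, n))
I = np.eye(n)
x, y, z = rng.standard_normal((3, n))

# The induced derivation on the 3rd tensor power: A in one slot at a time.
L3 = (np.kron(np.kron(A, I), I)
      + np.kron(np.kron(I, A), I)
      + np.kron(np.kron(I, I), A))

lhs = L3 @ np.kron(np.kron(x, y), z)
rhs = (np.kron(np.kron(A @ x, y), z)
       + np.kron(np.kron(x, A @ y), z)
       + np.kron(np.kron(x, y), A @ z))
print(np.allclose(lhs, rhs))                   # Leibniz rule: True
\end{verbatim}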
Define the map $f : \Mat{2n}{\R} \to \sorth{2n;\R}$ (where $\sorth{2n;\R}$ is the vector space of real skew symmetric $2n \times 2n$ matrices), by \begin{align} f(x) &= x'Jx - J, && x \in \Mat{2n}{\R}. \end{align} Since $J$ is skew symmetric, $y = f(x)$ is too, for all $x$. Since $f$ is quadratic in the entries of $x$, it is smooth. In addition, \begin{align} \D{}f_x v &= v'Jx + x'Jv, && x, v \in \Mat{2n}{\R} \end{align} where we have identified $T_x \Mat{2n}{\R}$ with $\Mat{2n}{\R}$. We want to show that when $x \in f^{-1}(0)$ (i.e. $x'Jx = J$), we have that $\D{}f_x : \Mat{2n}{\R} \to \sorth{2n;\R}$ is a surjective linear map. The submersion theorem then says that $f^{-1}(0)$ is a smooth submanifold. First, if $x \in f^{-1}(0)$, then $\det(x)^2 \det(J) = \det(x'Jx) = \det(J) = 1$, so $\det(x)^2 = 1$ and $x$ is invertible. Second, we have that $\D{}f_x v = S \circ R_{Jx} \circ T(v)$ where $T(v)=v'$, $R_{Jx}(y)=yJx$ and $S(z) = z-z'$. The maps $T$ and $S$ are clearly surjective; since $Jx$ is invertible, so is $R_{Jx}$. This proves that $\D{}f_x$ is surjective.
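A numerical check of the surjectivity for $2n = 4$ (a Python sketch assuming numpy and scipy; the point $x = \exp(JS)$ with $S$ symmetric is one convenient choice of element of $f^{-1}(0)$): the matrix of $\D{}f_x$ over the standard basis of $\Mat{2n}{\R}$ has rank $n(2n-1) = \dim \sorth{2n;\R}$.
\begin{verbatim}
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
n = 2                                  # so 2n = 4
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n),       np.zeros((n, n))]])

# A point of f^{-1}(0): exp(J S) with S symmetric satisfies x'Jx = J.
S = rng.standard_normal((2*n, 2*n)); S = S + S.T
x = expm(J @ S)
print(np.allclose(x.T @ J @ x, J))                       # f(x) = 0: True

# Stack Df_x(E_ij) = E_ij' J x + x' J E_ij over the standard basis of
# Mat(2n); the image should be all skew matrices, of dimension n(2n-1).
rows = []
for i in range(2*n):
    for j in range(2*n):
        E = np.zeros((2*n, 2*n)); E[i, j] = 1.0
        rows.append((E.T @ J @ x + x.T @ J @ E).ravel())
rank = np.linalg.matrix_rank(np.array(rows))
print(rank == (2*n)*(2*n - 1)//2)                        # rank 6 = dim: True
\end{verbatim}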
Figure: The spherical pendulum. The bob (in green) moves freely about the pivot $P$.