Alternate Characterization of the Matrix of a Linear Transformation

Let $V$ and $W$ be finite-dimensional vector spaces with respective ordered bases $v_1, \dots, v_n$ and $w_1, \dots, w_m$. The matrix of a linear transformation $T: V \rightarrow W$ is typically defined as the matrix $A$ whose $i^{\text{th}}$ column $A_i$ records the coordinates of $T(v_i)$ with respect to $w_1, \dots, w_m$; that is,
$$T(v_i) = A^j_i w_j,$$
where $A^j_i$ denotes the $m$ scalars in the $i^{\text{th}}$ column and the summation convention is in force.

It is a theorem that there exists a dual basis $w^1, \dots, w^m \in W^*$ such that $w^k(w_j) = \delta^k_j$. Consequently,
$$w^k(T(v_i)) = w^k(A^j_i w_j) = A^j_i \, w^k(w_j) = A^j_i \delta^k_j = A^k_i.$$
Therefore, it appears that the condition $A^k_i = w^k(T(v_i))$ is equivalent to the requirement that the $i^{\text{th}}$ column of $A$ be determined by $T(v_i)$.

My question: is this analysis correct and, if so, is there any reason the entries of the matrix $A$ couldn't simply be defined by the relation $A^k_i = w^k(T(v_i))$?
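For concreteness, here is a minimal worked example (the particular map and bases are my own illustration): take $V = W = \mathbb{R}^2$ with the standard basis $e_1, e_2$ and its dual basis $e^1, e^2$ (the coordinate projections), and let $T(x, y) = (x + 2y,\; 3x + 4y)$. Then
$$T(e_1) = (1, 3), \qquad T(e_2) = (2, 4),$$
so the proposed definition $A^k_i = e^k(T(e_i))$ gives
$$A^1_1 = 1, \quad A^2_1 = 3, \quad A^1_2 = 2, \quad A^2_2 = 4, \qquad \text{i.e.} \qquad A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix},$$
which agrees with the usual definition by which the $i^{\text{th}}$ column holds the coordinates of $T(e_i)$.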
linear-algebra