Suppose we have two linear functions, $f$ and $g$, which agree on all the basis vectors of some space. Then they must agree on every vector in that space: both are linear, and a linear function is completely determined by its values on a basis.
In gory detail, suppose that we know that $f(\vec{e_i}) = g(\vec{e_i})$ for each basis vector $\vec{e_i}$.
Consider some vector $\vec v$. We can express $\vec v$ as a linear combination of basis vectors, say as $\vec v = c_1\vec{e_1} + \cdots + c_n\vec{e_n}.$ Then we know that $\begin{align} f(\vec v) & = f(c_1\vec{e_1} + \cdots + c_n\vec{e_n}) \\ & = c_1f(\vec{e_1}) + \cdots + c_nf(\vec{e_n}) & \text{(linearity of $f$)} \\ & = c_1g(\vec{e_1}) + \cdots + c_ng(\vec{e_n}) & \text{($f=g$ for basis vectors)} \\ & = g(c_1\vec{e_1} + \cdots + c_n\vec{e_n}) & \text{(linearity of $g$)}\\ & = g(\vec v) \end{align} $
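To make this concrete, take $\mathbb{R}^2$ with the standard basis, and suppose $f(\vec{e_1}) = g(\vec{e_1})$ and $f(\vec{e_2}) = g(\vec{e_2})$. Then for, say, $\vec v = 3\vec{e_1} + 5\vec{e_2}$, the chain above gives $f(\vec v) = 3f(\vec{e_1}) + 5f(\vec{e_2}) = 3g(\vec{e_1}) + 5g(\vec{e_2}) = g(\vec v)$, and the coefficients $3$ and $5$ were arbitrary.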
An analogous fact holds for functions of several vector variables. For example, if $f(u,v,x,y)$ and $g(u,v,x,y)$ are linear in each of the arguments $u, v, x,$ and $y$ separately, and if they agree whenever each argument is a basis vector of some space, then they agree on all vectors in that space.
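Spelled out, linearity in each argument lets us expand every argument in the basis: writing $u = \sum_i u_i\vec{e_i}$, and similarly for $v$, $x$, and $y$, we get $$f(u,v,x,y) = \sum_{i,j,k,\ell} u_i v_j x_k y_\ell \, f(\vec{e_i},\vec{e_j},\vec{e_k},\vec{e_\ell}),$$ and the same expansion holds for $g$. So if $f$ and $g$ agree on every basis combination $(\vec{e_i},\vec{e_j},\vec{e_k},\vec{e_\ell})$, the two sums agree term by term.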
Now take $f(u,v,x,y) = (u\times v)\cdot(x\times y)$ and $g(u,v,x,y) = \begin{vmatrix} u\cdot x & v\cdot x \\ u \cdot y & v \cdot y\end{vmatrix}$. Each is linear in each of its four arguments; this is easy to see, or easy to show if you don't see it, using properties of the cross and dot products (for $f$) and of determinants and dot products (for $g$). So if you can show that $f$ and $g$ agree whenever $u,v,x,$ and $y$ are standard basis vectors of $\mathbb{R}^3$, you are done. And for most choices of basis vectors as arguments, both sides are equal to zero, so this is quick to verify.
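For instance, taking $u = x = \vec{e_1}$ and $v = y = \vec{e_2}$, the left side is $(\vec{e_1}\times\vec{e_2})\cdot(\vec{e_1}\times\vec{e_2}) = \vec{e_3}\cdot\vec{e_3} = 1$, and the right side is $\begin{vmatrix} 1 & 0 \\ 0 & 1 \end{vmatrix} = 1$.

If you'd like a machine to grind through the cases, here is a minimal NumPy sketch (the function names `f` and `g` are just ours, mirroring the definitions above) that checks all $3^4 = 81$ basis combinations, plus some random vectors for good measure:

```python
import itertools
import numpy as np

def f(u, v, x, y):
    # Left-hand side: (u x v) . (x x y)
    return np.dot(np.cross(u, v), np.cross(x, y))

def g(u, v, x, y):
    # Right-hand side: the 2x2 determinant of dot products,
    # (u.x)(v.y) - (v.x)(u.y)
    return np.dot(u, x) * np.dot(v, y) - np.dot(v, x) * np.dot(u, y)

basis = np.eye(3)  # rows are the standard basis e1, e2, e3 of R^3

# Check every combination of basis vectors in the four argument slots.
for u, v, x, y in itertools.product(basis, repeat=4):
    assert np.isclose(f(u, v, x, y), g(u, v, x, y))

# Spot-check on random vectors as well.
rng = np.random.default_rng(0)
for _ in range(1000):
    u, v, x, y = rng.standard_normal((4, 3))
    assert np.isclose(f(u, v, x, y), g(u, v, x, y))

print("f and g agree on all tested inputs")
```

Of course, by the linearity argument the 81 basis checks alone already prove the identity; the random checks are just a sanity test of the code.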