A Generalization of the Vector Cross Product
\( \def\matr#1{\mathbf #1} \def\tp{\mathsf T} \)
Imagine you have two 3-dimensional vectors \(\vec{a}\) and \(\vec{b}\) and you require a third vector which is orthogonal to the first two vectors. How would you compute this vector? Typically, one would compute the cross product \(\vec{a} \times \vec{b}\) (or \(\vec{b} \times \vec{a}\), which only flips the sign). This is easy in three dimensions, since we can use the well-known relation

$$
\vec{a} \times \vec{b} = \begin{pmatrix}
a_2 b_3 - a_3 b_2 \\
a_3 b_1 - a_1 b_3 \\
a_1 b_2 - a_2 b_1
\end{pmatrix}.
$$
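The three-dimensional formula translates directly into code. Here is a minimal Python sketch (the function name `cross3` is my own, not from this post):

```python
def cross3(a, b):
    """Cross product of two 3-dimensional vectors a and b."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

a, b = [1, 2, 3], [4, 5, 6]
c = cross3(a, b)                          # [-3, 6, -3]
print(sum(x * y for x, y in zip(c, a)))   # 0 -> orthogonal to a
print(sum(x * y for x, y in zip(c, b)))   # 0 -> orthogonal to b
```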
Also for two dimensions the case is quite simple. However, instead of two vectors you only have one vector \(\vec{a} = (a_1, a_2)^\tp\), and the orthogonal vector is formally just a linear mapping:

$$
\vec{a}_{\perp} = \begin{pmatrix} -a_2 \\ a_1 \end{pmatrix}.
$$
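In code, the two-dimensional mapping is a one-liner (again a Python sketch with my own naming):

```python
def perp2(a):
    """Orthogonal vector in 2D via the linear mapping (a1, a2) -> (-a2, a1)."""
    return [-a[1], a[0]]

a = [3, 4]
p = perp2(a)                      # [-4, 3]
print(a[0] * p[0] + a[1] * p[1])  # 0 -> the inner product vanishes
```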
You can easily check that the inner product of both vectors is zero. But what about higher dimensions? For example, how would you define the cross product in 5 dimensions or maybe even in 100 dimensions?
As it turns out, it is not that difficult to compute the cross product in any dimension \(n\). If we are in a space \(\mathbb{R}^n\), we need \(n-1\) vectors, each with \(n\) elements, that are linearly independent (note that if the vectors are linearly dependent, e.g. two or more of them are collinear, then the cross product will be zero). Hence, we have a set of vectors

$$
\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_{n-1} \in \mathbb{R}^n.
$$
Let us write the cross product as:

$$
\vec{v} = \vec{u}_1 \times \vec{u}_2 \times \cdots \times \vec{u}_{n-1}.
$$
Now let us arrange all vectors in a matrix. Additionally, we add the standard basis vector \(\vec{e}_i\) for dimension \(i\) as a first column in the matrix:

$$
\matr{M}_i = \begin{pmatrix} \vec{e}_i & \vec{u}_1 & \vec{u}_2 & \cdots & \vec{u}_{n-1} \end{pmatrix}.
$$
Note that the standard unit vector \(\vec{e}_i\) is of magnitude 1 and is parallel to the \(i\)-th axis of the coordinate system. Hence, for every dimension \(i\) we have a slightly different matrix \(\matr{M}_i\).
Finally, we can express the cross product as a determinant for each dimension. We skip the math behind the derivation of this formula, since understanding it is not really required for applying the formula. Remember that the cross product is a vector with \(n\) elements. Each element along axis \(i\) is then separately computed with

$$
v_i = \det(\matr{M}_i).
$$
Overall, we receive a vector in the following form:

$$
\vec{v} = \begin{pmatrix} \det(\matr{M}_1) \\ \det(\matr{M}_2) \\ \vdots \\ \det(\matr{M}_n) \end{pmatrix}.
$$
Since the first column of \(\matr{M}_i\) contains only one element different from zero, the determinant can be conveniently computed using the minors of the matrix \(\matr{M}_i\) and Laplace's formula.
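Concretely, when expanding \(\det(\matr{M}_i)\) along the first column, only the single entry 1 in row \(i\) contributes, so each component reduces to one signed \((n-1) \times (n-1)\) minor (the deletion notation \(\setminus i\) below is my own shorthand):

$$
v_i = \det(\matr{M}_i) = (-1)^{i+1} \det \begin{pmatrix} \vec{u}_1 & \vec{u}_2 & \cdots & \vec{u}_{n-1} \end{pmatrix}_{\setminus i},
$$

where the subscript \(\setminus i\) means that the \(i\)-th row of the \(n \times (n-1)\) matrix of input vectors is deleted.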
Let us do this exemplarily for our initial three-dimensional example with the vectors \(\vec{a}\) and \(\vec{b}\), i.e. \(\vec{u}_1 = \vec{a}\) and \(\vec{u}_2 = \vec{b}\).
We first create the matrix \(\matr{M}_1\), which is:

$$
\matr{M}_1 = \begin{pmatrix}
1 & a_1 & b_1 \\
0 & a_2 & b_2 \\
0 & a_3 & b_3
\end{pmatrix}.
$$
Then, we can compute the first element of the cross product vector for axis \(x_1\):

$$
v_1 = \det(\matr{M}_1) = 1 \cdot \det \begin{pmatrix} a_2 & b_2 \\ a_3 & b_3 \end{pmatrix} = a_2 b_3 - a_3 b_2.
$$
For the other two axes \(x_2\) and \(x_3\) we follow the same procedure and get:

$$
v_2 = \det(\matr{M}_2) = \det \begin{pmatrix} 0 & a_1 & b_1 \\ 1 & a_2 & b_2 \\ 0 & a_3 & b_3 \end{pmatrix} = a_3 b_1 - a_1 b_3
$$

$$
v_3 = \det(\matr{M}_3) = \det \begin{pmatrix} 0 & a_1 & b_1 \\ 0 & a_2 & b_2 \\ 1 & a_3 & b_3 \end{pmatrix} = a_1 b_2 - a_2 b_1.
$$
Summarizing the previous results, we receive for the cross product \(\vec{v} = \vec{a} \times \vec{b}\):

$$
\vec{v} = \begin{pmatrix}
a_2 b_3 - a_3 b_2 \\
a_3 b_1 - a_1 b_3 \\
a_1 b_2 - a_2 b_1
\end{pmatrix},
$$
which is exactly the same as we have already noted in the beginning.
Note that the base package in R provides a function crossprod(), which, however, does not compute the cross product as described in this blog post (it computes the matrix product \(\matr{X}^\tp \matr{Y}\)). For this reason, I provide an R function here which efficiently computes the cross product for arbitrary dimensions.
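Since the R code itself is not reproduced above, here is a comparable sketch in pure Python (my own illustration, not the post's implementation), computing each component via the minors and Laplace's formula:

```python
# Sketch of a generalized cross product in pure Python (my own
# illustration; the original post ships an R implementation instead).

def det(m):
    """Determinant via recursive Laplace expansion along the first row
    (sufficient for the small matrices considered here)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)) if m[0][j] != 0)

def cross(*vectors):
    """Cross product of n-1 vectors in R^n. Expanding det(M_i) along
    its first column, the i-th component equals (-1)^(i+1) times the
    minor obtained by deleting row i from the n x (n-1) matrix U whose
    columns are the input vectors (i counted from 1 here)."""
    n = len(vectors[0])
    if len(vectors) != n - 1:
        raise ValueError("need exactly n-1 vectors in n dimensions")
    U = [[v[k] for v in vectors] for k in range(n)]  # columns = inputs
    return [(-1) ** i * det(U[:i] + U[i + 1:]) for i in range(n)]

print(cross([1, 2, 3], [4, 5, 6]))  # [-3, 6, -3], the familiar 3D result
```

The result is orthogonal to every input vector in any dimension, which is easy to verify by checking that the inner products vanish.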