A Generalization of the Vector Cross Product

\( \def\matr#1{\mathbf #1} \def\tp{\mathsf T} \)

Imagine you have two 3-dimensional vectors \(\vec a\) and \(\vec b\), and you require a third vector which is orthogonal to the first two. How would you compute this vector? Typically, one would compute the cross product of \(\vec a\) and \(\vec b\), or vice versa. This is easy in three dimensions, since we can use the well-known relation

\[ \vec a \times \vec b = \begin{pmatrix} a_2 b_3 - a_3 b_2 \\ a_3 b_1 - a_1 b_3 \\ a_1 b_2 - a_2 b_1 \end{pmatrix}. \]
The two-dimensional case is also quite simple. However, instead of two vectors you only have one vector \(\vec a\), and the orthogonal vector is formally just a linear mapping:

\[ \vec a^{\perp} = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \begin{pmatrix} a_1 \\ a_2 \end{pmatrix} = \begin{pmatrix} a_2 \\ -a_1 \end{pmatrix}. \]
You can easily check that the inner product of both vectors is zero. But what about higher dimensions? For example, how would you define the cross product in 5 dimensions or maybe even in 100 dimensions?
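The orthogonality of the two-dimensional mapping can also be checked numerically. A small Python sketch (the helper name `perp` is my own, not from the post):

```python
# 2D "cross product" of a single vector: map (a1, a2) to (a2, -a1).
# Hypothetical helper for illustration; not from the original post.
def perp(a):
    return (a[1], -a[0])

a = (3.0, 4.0)
p = perp(a)
print(p)                          # (4.0, -3.0)
print(a[0] * p[0] + a[1] * p[1])  # inner product: 0.0
```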

As it turns out, it is not that difficult to compute the cross product in any dimension \(n\). If we are in a space \(\mathbb{R}^n\), we need \(n-1\) vectors, each with \(n\) elements, that are linearly independent (note that if the vectors are linearly dependent, e.g., if two of them are collinear, then the cross product will be the zero vector). Hence, we have a set of vectors

\[ \vec u_1, \vec u_2, \ldots, \vec u_{n-1} \in \mathbb{R}^n. \]
Let us write the cross product as:

\[ \vec v = \vec u_1 \times \vec u_2 \times \cdots \times \vec u_{n-1}. \]
Now let us arrange all vectors in a matrix. Additionally, we add the standard basis vector \(\vec e_i\) for dimension \(i\) as the first column of the matrix:

\[ \matr A_i = \begin{pmatrix} \vec e_i & \vec u_1 & \vec u_2 & \cdots & \vec u_{n-1} \end{pmatrix}. \]
Note that the standard unit vector \(\vec e_i\) is of magnitude 1 and is parallel to axis \(i\) of the coordinate system. Hence, for every dimension \(i\) we have a slightly different matrix \(\matr A_i\).

Finally, we can express the cross product as a determinant for each dimension. We skip the math behind the derivation of this formula, since understanding it is not really required for applying it. Remember that the cross product is a vector with \(n\) elements. Each element along axis \(i\) is then computed separately with

\[ v_i = \det(\matr A_i). \]
Overall, we receive a vector of the following form:

\[ \vec v = \begin{pmatrix} \det(\matr A_1) & \det(\matr A_2) & \cdots & \det(\matr A_n) \end{pmatrix}^{\tp}. \]
Since the first column of \(\matr A_i\) contains only one element different from zero, the determinant can be conveniently computed using the minors of the matrix and Laplace's formula: expanding along the first column gives \(\det(\matr A_i) = (-1)^{i+1} \det(\matr M_{i1})\), where the minor \(\matr M_{i1}\) is obtained from \(\matr A_i\) by deleting row \(i\) and the first column.
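Laplace's expansion is easy to implement directly. A small Python sketch of the expansion along the first column (the function name is my own; this is fine for the small matrices used here, but \(O(n!)\) in general):

```python
# Laplace expansion of det(M) along the first column (illustrative sketch).
def laplace_det(M):
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for i in range(n):
        # Minor: delete row i and the first column
        minor = [row[1:] for j, row in enumerate(M) if j != i]
        total += (-1) ** i * M[i][0] * laplace_det(minor)
    return total

print(laplace_det([[1, 2], [3, 4]]))  # -2.0
```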

Let us do this exemplarily for our initial three-dimensional example with two vectors \(\vec a, \vec b \in \mathbb{R}^3\).
We first create the matrix \(\matr A_1\) for the first axis, which is:

\[ \matr A_1 = \begin{pmatrix} 1 & a_1 & b_1 \\ 0 & a_2 & b_2 \\ 0 & a_3 & b_3 \end{pmatrix}. \]
Then, we can compute the first element of the cross product vector, for axis \(i = 1\):

\[ v_1 = \det(\matr A_1) = a_2 b_3 - a_3 b_2. \]
For the other two axes \(i = 2\) and \(i = 3\) we follow the same procedure and get:

\[ v_2 = \det(\matr A_2) = a_3 b_1 - a_1 b_3, \qquad v_3 = \det(\matr A_3) = a_1 b_2 - a_2 b_1. \]
Summarizing the previous results, we receive for the cross product \(\vec v\):

\[ \vec v = \vec a \times \vec b = \begin{pmatrix} a_2 b_3 - a_3 b_2 \\ a_3 b_1 - a_1 b_3 \\ a_1 b_2 - a_2 b_1 \end{pmatrix}, \]
which is exactly the same as we have already noted in the beginning.
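To convince ourselves numerically, a quick Python check of the component formula for a concrete pair of 3D vectors (the example values are my own):

```python
# Classic 3D cross product, spelled out component by component.
a = (1.0, 2.0, 3.0)
b = (4.0, 5.0, 6.0)

v = (a[1] * b[2] - a[2] * b[1],
     a[2] * b[0] - a[0] * b[2],
     a[0] * b[1] - a[1] * b[0])

print(v)  # (-3.0, 6.0, -3.0)
# Orthogonality: both inner products vanish
print(sum(x * y for x, y in zip(a, v)))  # 0.0
print(sum(x * y for x, y in zip(b, v)))  # 0.0
```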

Note that the base package in R provides a function crossprod() which, however, does not compute the cross product as described in this blog post (it computes the matrix product \(\matr X^{\tp} \matr Y\)). For this reason I provide an R function here, which efficiently computes the cross product for arbitrary dimensions.

opX <- function(...) {
  u <- list(...)
  #
  # check length of all vectors
  #
  len <- unlist(lapply(u, length))
  if(!all(len == len[1])) stop("All vectors must be of same length!")
  len <- len[1]
  #
  # Check, if correct number of vectors is provided
  #
  if(length(u) != (len - 1) ) stop("For vector length ",len," you have to provide ",len-1, " vectors!")
  U <- do.call(cbind, u)

  #
  # Compute the determinants based on the minors of U and Laplace's formula
  # The determinants form the new vector which is the result of the cross-product
  #
  sapply(1:len, function(i) (-1)^(i + 1) * det(as.matrix(U[-i,])))
}

#
# Run an example
#
a <- c(1,2,3,4)
b <- c(5,6,7,8)
c <- c(9,10,-11,12)

print(opX(a,b,c))

#
# Check the orthogonality property
#
print(t(a) %*% opX(a,b,c))
print(t(b) %*% opX(a,b,c))
print(t(c) %*% opX(a,b,c))
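For readers who want to cross-check the R results in another language, a minimal Python port of the same construction (my own sketch, not the author's code), using a plain Laplace-expansion determinant:

```python
# Generalized cross product of n-1 vectors in R^n, via signed minors.
def det(M):
    # Laplace expansion along the first column (sufficient for small n)
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** i * M[i][0]
               * det([row[1:] for j, row in enumerate(M) if j != i])
               for i in range(n))

def cross_n(*vectors):
    n = len(vectors[0])
    # Rows of U are indexed by dimension, columns by vector
    U = [list(col) for col in zip(*vectors)]
    # v_i = (-1)^i * det(U with row i removed), 0-indexed sign convention
    return [(-1) ** i * det([row for j, row in enumerate(U) if j != i])
            for i in range(n)]

a = [1, 2, 3, 4]
b = [5, 6, 7, 8]
c = [9, 10, -11, 12]
v = cross_n(a, b, c)
print(v)  # [-176, 264, 0, -88]
print(sum(x * y for x, y in zip(a, v)))  # 0
```
The result is orthogonal to all three input vectors, matching the checks above.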

Markus Thill

I studied computer engineering (B.Sc.) and Automation & IT (M.Eng.). Generally, I am interested in machine learning (ML) approaches (in the broadest sense), but particularly in the fields of time series analysis, anomaly detection, Reinforcement Learning (e.g. for board games), Deep Learning (DL) and incremental (on-line) learning procedures.
