I can generate a random n×n matrix (for property-based testing) just by filling an n×n matrix with random numbers. But can anyone think of a way to generate random pairs of commuting matrices?

One way (but I’m not sure whether this is good or bad) is to generate two random matrices A and B, and return (A+B, A-B). These two matrices certainly commute, as (A+B)(A-B) = (A-B)(A+B) = A^2 - B^2.

Probably too naive a method: randomly generate two diagonal matrices D1 and D2 and an invertible matrix P, and consider the conjugates P * D1 * inv(P) and P * D2 * inv(P). Almost every matrix is invertible, so P is easy to generate. The problem is that you have to invert P, which may be slow.
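A minimal sketch of this construction, assuming NumPy is available (the variable names are just illustrative). The pair commutes because A*B = P * D1 * D2 * inv(P) = P * D2 * D1 * inv(P) = B*A, since diagonal matrices always commute:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Two random diagonal matrices and a random (almost surely invertible) P.
D1 = np.diag(rng.standard_normal(n))
D2 = np.diag(rng.standard_normal(n))
P = rng.standard_normal((n, n))
P_inv = np.linalg.inv(P)  # the inversion step mentioned above

# Conjugate both diagonals by the same P.
A = P @ D1 @ P_inv
B = P @ D2 @ P_inv

print(np.allclose(A @ B, B @ A))  # True, up to floating-point error
```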

Whenever I hear “matrices” and “slow” in the same paragraph, I immediately think about GPUs. Manipulating and transforming large multi-dimensional arrays is what they do best. The Aparapi API makes it relatively easy to do OpenCL processing in Java – you can usually expect somewhere in the neighborhood of a 200X+ speed-up for problems like these. For any problem with a number of independent arithmetic or move operations per input array element, you’re going to see a really nice speed improvement. You have to write the code very differently than you would a native Java implementation, but if that kind of speed-up is worth it to you, and you have a GPU (or other accelerator) available, then it’s a no-brainer.

Note, (A+B) * (A - B) = A^2 + BA - AB - B^2
but
(A - B) * (A + B) = A^2 - BA + AB - B^2

so, those are not the same if A and B don’t already commute.
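A quick numeric check of this (assuming NumPy): the cross terms BA - AB don’t cancel for generic A and B, so the two products differ:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

lhs = (A + B) @ (A - B)  # A^2 + BA - AB - B^2
rhs = (A - B) @ (A + B)  # A^2 - BA + AB - B^2

# The difference is 2(BA - AB), nonzero for almost all random A, B.
print(np.allclose(lhs, rhs))  # False
```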

I think the diagonalization approach is a good suggestion, and that is the approach I’d take.

If you don’t want to compute inverses, you could generate P with orthogonal columns (using, say, a Gram–Schmidt approach: Gram–Schmidt process - Wikipedia); then you can easily compute the inverse by taking the transpose (and rescaling the rows if the columns weren’t normalized).
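A sketch of that idea, assuming NumPy. Here QR decomposition stands in for Gram–Schmidt (it computes the same orthonormal basis, more stably), so the inverse of P is just its transpose and no solve is needed:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Orthonormalize the columns of a random matrix; Q plays the role of P.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

D = np.diag(rng.standard_normal(n))
A = Q @ D @ Q.T  # Q.T is the inverse of Q, since Q is orthogonal

print(np.allclose(Q.T @ Q, np.eye(n)))  # True: transpose == inverse
```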

A subtlety to watch out for is that if P is taken to be orthogonal, then PDP^{-1} = PDP^T is always a symmetric matrix, so it may not be general enough. But perhaps taking P as the product of an orthogonal and an upper triangular matrix will be fine (or the product of an upper and a lower triangular matrix).
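A sketch of the orthogonal-times-upper-triangular variant, assuming NumPy. The inverse stays cheap — inv(P) = inv(R) * Q^T, where inverting the triangular R only costs a back-substitution — and the resulting matrix is no longer forced to be symmetric:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthogonal factor
R = np.triu(rng.standard_normal((n, n)))           # upper triangular factor
np.fill_diagonal(R, np.abs(np.diag(R)) + 1.0)      # keep R safely invertible

P = Q @ R
P_inv = np.linalg.inv(R) @ Q.T  # triangular inverse + transpose

D = np.diag(rng.standard_normal(n))
A = P @ D @ P_inv

print(np.allclose(A, A.T))  # False: A need not be symmetric anymore
```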

Excellent observation. Humans really want multiplication to be commutative. It’s very easy to accidentally make that assumption. Thanks! I’m glad I asked this question.

This works very well. If I generate two matrices, x and y, this way, I can verify that exp(x)*exp(y) = exp(x+y), whereas that is not the case for arbitrary matrices which do not commute.
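For reference, the check can be reproduced roughly like this (assuming NumPy; the `expm` here is a simple scaling-and-squaring Taylor series, a stand-in for a library matrix exponential):

```python
import numpy as np

def expm(M):
    # Scaling and squaring: exp(M) = exp(M / 2^k) ** (2^k),
    # with a short Taylor series for the small-norm factor.
    k = 10
    S = M / 2.0**k
    E, term = np.eye(len(M)), np.eye(len(M))
    for j in range(1, 15):
        term = term @ S / j
        E = E + term
    for _ in range(k):
        E = E @ E
    return E

rng = np.random.default_rng(4)
n = 3
P = rng.standard_normal((n, n))
P_inv = np.linalg.inv(P)

# Commuting pair via the diagonal-conjugation recipe above.
X = P @ np.diag(0.5 * rng.standard_normal(n)) @ P_inv
Y = P @ np.diag(0.5 * rng.standard_normal(n)) @ P_inv

print(np.allclose(expm(X) @ expm(Y), expm(X + Y)))  # True, since X and Y commute
```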