# Linear Algebra: Transposes

The dot product gives us a compact way to express the formula for an entry of a matrix product: to obtain the $(i,j)$th entry of a matrix product $AB$, we dot the $i$th row of $A$ and the $j$th column of $B$.

However, the matrix product by itself is not quite flexible enough to handle a common use case: suppose we have two matrices $A$ and $B$ which contain tabular data stored in the same format. For example, suppose that the columns of $A$ store the vectorized word counts for a series of emails sent from Alice, while the columns of $B$ store the vectorized word counts for a series of emails sent from Bob. If we want to calculate the similarity of each of Alice's emails to each of Bob's emails, then we want to dot the *columns* of $A$ (not its rows) with the columns of $B$.

So we define the **transpose** $A'$ of a matrix $A$ to be the matrix which results from switching the rows and columns of $A$.

**Definition**

If $A$ is an $m \times n$ matrix, then its **transpose** $A'$ is defined to be the $n \times m$ matrix whose $i$th row is equal to the $i$th column of $A$, for each $i$ from 1 to $n$.

**Example**

If $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}$, then $A' = \begin{bmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{bmatrix}$.
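As a quick numerical illustration of this definition, here is a sketch in NumPy (the matrix values are illustrative); the rows of the transpose are the columns of the original matrix:

```python
import numpy as np

# A 3x2 matrix; its transpose is the 2x3 matrix whose rows are A's columns
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
print(A.T)
```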

With this definition in hand, we can write the matrix whose entries are the dot products of columns of $A$ and $B$ as $A'B$.
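A minimal NumPy sketch of this observation, using hypothetical word-count data: the $(i,j)$th entry of $A'B$ is the dot product of the $i$th column of $A$ with the $j$th column of $B$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((5, 3))  # hypothetical word-count columns for 3 of Alice's emails
B = rng.random((5, 4))  # hypothetical word-count columns for 4 of Bob's emails

S = A.T @ B  # S[i, j] is the dot product of column i of A with column j of B
print(np.isclose(S[1, 2], A[:, 1] @ B[:, 2]))  # True
```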

## Transpose properties

Let's develop a few properties of the transpose so that we can manipulate matrix expressions involving the transpose. First, we note that the transpose is a *linear* operator, meaning that

$$(cA + B)' = cA' + B'$$

whenever $c$ is a constant and $A$ and $B$ are matrices of the same dimensions.
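Linearity of the transpose is easy to spot-check numerically; here is a sketch with random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((3, 4))
B = rng.random((3, 4))
c = 2.5

# Linearity of the transpose: (cA + B)' = cA' + B'
print(np.allclose((c * A + B).T, c * A.T + B.T))  # True
```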

Taking the transpose also interacts nicely with matrix multiplication:

**Exercise**

Suppose that $A$ is an $m \times n$ matrix and $B$ is an $n \times p$ matrix. Exactly one of the products $A'B'$, $B'A'$, and $ABA'$ is defined for all such $A$ and $B$ and has the same dimensions as $(AB)'$. Determine which one, and conjecture a formula for $(AB)'$.

Confirm your conjecture numerically in Python with some random matrices. You can generate a random $m \times n$ matrix with `np.random.random_sample((m,n))`, the transpose of `A` is accessed as `A.T`, and the matrix product of `A` and `B` is `A @ B`.

```python
import numpy as np
```

Confirm your conjecture numerically in Julia with some random matrices. You can generate a random $m \times n$ matrix with `rand(m, n)`, the transpose of `A` is `A'`, and the matrix product of `A` and `B` is `A * B`.

*Solution.* Since $A'$ is an $n \times m$ matrix and $B'$ is a $p \times n$ matrix, we can work out the dimensions of each candidate product:

- $A'B'$: this is an $n \times m$ matrix multiplied by a $p \times n$ matrix, and if $m \neq p$ it is not defined. If it is defined, it gives an $n \times n$ matrix.
- $B'A'$: this is a $p \times n$ matrix multiplied by an $n \times m$ matrix and hence is a $p \times m$ matrix.
- $ABA'$: $AB$ is an $m \times p$ matrix, and $A'$ is an $n \times m$ matrix. If $p \neq n$ this product is not defined. If it is defined, it gives an $m \times m$ matrix.

We see that the only matrix product that is always defined, and in fact gives the right dimensions (namely $p \times m$, the dimensions of $(AB)'$), is the second option. And in fact, we have

$$(AB)' = B'A'$$

in general.

The following block of code checks the equation for a particular random example.

```python
import numpy as np
A = np.random.random_sample((3,7))
B = np.random.random_sample((7,3))
np.allclose((A @ B).T, B.T @ A.T)
```

```julia
A = rand(3, 7)
B = rand(7, 3)
isapprox((A * B)', B' * A')
```

## Symmetric Matrices

In some applications, a matrix will have the property that its $(i,j)$th entry is equal to its $(j,i)$th entry for all $i$ and $j$. Such a matrix is called *symmetric*.

**Definition**

If a matrix $A$ satisfies $A' = A$, then we say that $A$ is **symmetric**.
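Numerically, symmetry can be tested by comparing a matrix with its transpose; a sketch with illustrative matrices:

```python
import numpy as np

# A symmetric matrix equals its transpose
S = np.array([[2, 1, 0],
              [1, 3, 4],
              [0, 4, 5]])
print(np.array_equal(S, S.T))  # True

# A non-symmetric matrix does not
M = np.array([[1, 2],
              [0, 1]])
print(np.array_equal(M, M.T))  # False
```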

**Exercise**

Suppose that $A$ and $B$ are symmetric $n \times n$ matrices. Show that $AB$ is symmetric if and only if $A$ and $B$ commute (that is, $AB = BA$).

*Solution.* We have

$$(AB)' = B'A' = BA.$$

Here we used that $A' = A$ and $B' = B$.

In the case where $AB = BA$, this shows that $(AB)' = AB$, so $AB$ is symmetric. Conversely, if $AB$ is symmetric, then $AB = (AB)' = BA$, so $A$ and $B$ commute.
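A numerical spot-check of the identity $(AB)' = B'A' = BA$ for symmetric matrices; in this sketch, random symmetric matrices are built as $M + M'$:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.random((4, 4))
N = rng.random((4, 4))
A = M + M.T  # symmetric by construction
B = N + N.T  # symmetric by construction

# For symmetric A and B, (AB)' = B'A' = BA
print(np.allclose((A @ B).T, B @ A))  # True
```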

**Exercise**

Show that

$$A\mathbf{x} \cdot \mathbf{y} = \mathbf{x} \cdot A'\mathbf{y}$$

for all $m \times n$ matrices $A$ and all vectors $\mathbf{x} \in \mathbb{R}^n$ and $\mathbf{y} \in \mathbb{R}^m$.

*Solution.* Since the dot product of two vectors $\mathbf{u}$ and $\mathbf{v}$ can be written as the matrix product $\mathbf{u}'\mathbf{v}$ (regarding the vectors as column matrices), we have

$$A\mathbf{x} \cdot \mathbf{y} = (A\mathbf{x})'\mathbf{y} = \mathbf{x}'A'\mathbf{y} = \mathbf{x} \cdot (A'\mathbf{y}).$$

In other words, we may move a matrix which is pre-multiplying one of the vectors in a dot product to the other vector, at the cost of taking its transpose. Let's look at one application of this property whose importance will be explored in subsequent sections.
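This matrix-moving property is easy to spot-check numerically; a sketch with random data:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.random((3, 5))
x = rng.random(5)
y = rng.random(3)

# Moving A to the other side of the dot product costs a transpose:
# (Ax) . y == x . (A'y)
print(np.isclose((A @ x) @ y, x @ (A.T @ y)))  # True
```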

**Exercise**

Show that $A'A\mathbf{x} = \boldsymbol{0}$ implies $A\mathbf{x} = \boldsymbol{0}$, for any matrix $A$ and vector $\mathbf{x}$ for which the products are defined.

*Solution.* We have

$$|A\mathbf{x}|^2 = A\mathbf{x} \cdot A\mathbf{x} = \mathbf{x} \cdot (A'A\mathbf{x}) = \mathbf{x} \cdot \boldsymbol{0} = 0,$$

so the length of $A\mathbf{x}$ is zero, which implies $A\mathbf{x} = \boldsymbol{0}$.