Matrix multiplication as a convolution

1. The product of two power series/polynomials is

\displaystyle \begin{array}{rl} & (a_0 + a_1 x + a_2 x^2 + \cdots) (b_0 + b_1 x + b_2 x^2 + \cdots) \\ = & c_0 + c_1 x + c_2 x^2 + \cdots \end{array}

with the coefficients given by

\displaystyle c_n = \sum_{k=0}^n a_k b_{n-k}

This formula for the coefficients is sometimes called the Cauchy product. It is a convolution.
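To make the identification concrete, here is a small Python sketch (just an illustration; the function name cauchy_product is my own) that computes the coefficients directly from the definition and checks them against numpy.convolve:

import numpy as np

def cauchy_product(a, b):
    # c_n = sum_{k=0}^n a_k * b_{n-k}, with out-of-range coefficients treated as 0
    c = [0] * (len(a) + len(b) - 1)
    for n in range(len(c)):
        for k in range(n + 1):
            if k < len(a) and n - k < len(b):
                c[n] += a[k] * b[n - k]
    return c

a = [1, 2, 3]   # 1 + 2x + 3x^2
b = [4, 5]      # 4 + 5x
print(cauchy_product(a, b))   # [4, 13, 22, 15]
print(np.convolve(a, b))      # same coefficients: [ 4 13 22 15]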

2. Let {G} be a finite group and {C[G]} be the group algebra with complex coefficients. Let

\displaystyle f = \sum_x f(x) x, g = \sum_y g(y) y

be two elements in {C[G]}. The product is

\displaystyle f g = \sum_z ( \sum_{x y = z} f(x) g(y) ) z.

When {f} and {g} are treated as functions {f, g : G \rightarrow C},

\displaystyle (f g) (z) = \sum_{x y = z} f(x) g(y)

is a convolution. In fact, {C[G]} is an example of a convolution algebra, namely {L^1(G)}.
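A small Python sketch may help here; to keep it concrete I take {G = Z/5Z} (written additively), so that {xy = z} means {x + y = z} mod {5}. The helper name group_convolve is just for illustration:

def group_convolve(f, g, n=5):
    # Product in C[G] for G = Z/nZ: (f g)(z) = sum over x y = z of f(x) g(y).
    # f and g are dicts mapping group elements to coefficients.
    fg = {z: 0 for z in range(n)}
    for x in range(n):
        for y in range(n):
            z = (x + y) % n                       # compose x and y in the group
            fg[z] += f.get(x, 0) * g.get(y, 0)    # accumulate f(x) g(y) onto z
    return fg

f = {0: 1, 1: 2}              # f = 1*[0] + 2*[1]
g = {1: 3, 4: 1}              # g = 3*[1] + 1*[4]
print(group_convolve(f, g))   # {0: 2, 1: 3, 2: 6, 3: 0, 4: 1}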

3. Let {J = \{1, \dots, n\}} and {G' = J \times J = \{ (i, j) : 1 \le i, j \le n \}}. Then {G'} is a groupoid: not every pair of elements of {G'} can be composed; {(a, b)} can be composed with {(c, d)} only when {b = c}, and in that case {(i, j)(j, k) = (i, k)}. Let us consider the groupoid convolution algebra {C[G']}, defined as above. The convolution in this case is then

\displaystyle (f g) (i,k) = \sum_{ (i, j) (j, k) = (i, k) } f(i, j) g(j, k).

If we rewrite this in matrix notation, it looks more familiar:

\displaystyle (AB)_{ik} = \sum_{j=1}^n A_{ij} B_{jk}.

This is matrix multiplication.
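To check this against ordinary matrix multiplication, here is a Python sketch (again only an illustration, written for square matrices of equal size) that carries out the groupoid convolution literally, summing over composable pairs, and compares the result with numpy's built-in product:

import numpy as np

def groupoid_convolve(A, B):
    # (A B)(i, k) = sum over composable pairs (i, j)(j, k) = (i, k) of A[i, j] * B[j, k]
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for k in range(n):
            C[i, k] = sum(A[i, j] * B[j, k] for j in range(n))
    return C

A = np.arange(1, 10, dtype=float).reshape(3, 3)
B = np.eye(3) + 1.0
print(np.allclose(groupoid_convolve(A, B), A @ B))   # True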
