# Algebra Of Matrices

### Addition and Subtraction of Matrices:

Two matrices can be added only if they are of the same order, and the resulting matrix has that same order. Two matrices A and B of the same order are said to be conformable for addition.

For example:

$\large \left[ \begin{array}{cc} a_1 & b_1 \\ a_2 & b_2 \\ a_3 & b_3 \end{array} \right] + \left[ \begin{array}{cc} c_1 & d_1 \\ c_2 & d_2 \\ c_3 & d_3 \end{array} \right] = \left[ \begin{array}{cc} a_1 + c_1 & b_1 + d_1 \\ a_2 + c_2 & b_2 + d_2 \\ a_3 + c_3 & b_3 + d_3 \end{array} \right]$

Similarly,

$\large \left[ \begin{array}{cc} a_1 & b_1 \\ a_2 & b_2 \\ a_3 & b_3 \end{array} \right] - \left[ \begin{array}{cc} c_1 & d_1 \\ c_2 & d_2 \\ c_3 & d_3 \end{array} \right] = \left[ \begin{array}{cc} a_1 - c_1 & b_1 - d_1 \\ a_2 - c_2 & b_2 - d_2 \\ a_3 - c_3 & b_3 - d_3 \end{array} \right]$

Note:
⋄ Only matrices of the same order can be added or subtracted.

⋄ Addition of matrices is commutative as well as associative.

⋄ Cancellation laws hold for addition: A + B = A + C implies B = C.

⋄ The equation A + X = O has a unique solution, namely X = −A, in the set of all m × n matrices.
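These rules can be checked numerically. The sketch below uses NumPy (an assumption for illustration; any matrix library would do) with two 3 × 2 matrices.

```python
import numpy as np

# Two matrices of the same order (3 x 2), conformable for addition
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])

S = A + B          # element-wise sum, again of order 3 x 2
D = A - B          # element-wise difference

# Commutativity, and the unique solution X = -A of A + X = O
assert np.array_equal(A + B, B + A)
X = -A
assert np.array_equal(A + X, np.zeros((3, 2), dtype=int))
print(S)
```

Attempting `A + B` for matrices of different orders raises a shape error, mirroring the conformability requirement above.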

### Scalar Multiplication:

The matrix obtained by multiplying every element of a matrix A by a scalar λ is called the scalar multiple of A by λ and is denoted by λA, i.e. if A = $[a_{ij}]$ then λA = $[\lambda a_{ij}]$.

Note:

⋄ All the laws of ordinary algebra hold for the addition or subtraction of matrices and their multiplication by scalars.
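As a quick numerical sketch (again assuming NumPy), a couple of those ordinary-algebra laws can be verified on a sample matrix:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
lam, mu = 3, 2     # sample scalars

print(lam * A)     # every element of A multiplied by lam

# Distributive and associative laws for scalar multiplication
assert np.array_equal((lam + mu) * A, lam * A + mu * A)
assert np.array_equal(lam * (mu * A), (lam * mu) * A)
assert np.array_equal(lam * (A + A), lam * A + lam * A)
```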

### Multiplication of Matrices

Two matrices can be multiplied only when the number of columns in the first is equal to the number of rows in the second. Such matrices are said to be conformable for multiplication.

$\large \left[ \begin{array}{cccc} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{array} \right]_{m \times n} \; \left[ \begin{array}{cccc} b_{11} & b_{12} & \cdots & b_{1p} \\ b_{21} & b_{22} & \cdots & b_{2p} \\ \vdots & \vdots & & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{np} \end{array} \right]_{n \times p } = \left[ \begin{array}{cccc} c_{11} & c_{12} & \cdots & c_{1p} \\ c_{21} & c_{22} & \cdots & c_{2p} \\ \vdots & \vdots & & \vdots \\ c_{m1} & c_{m2} & \cdots & c_{mp} \end{array} \right]_{m \times p}$

where , $\large c_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj} = \sum_{k=1}^{n} a_{ik} b_{kj}$

Example : If  $\large A = \left[ \begin{array}{ccc} 2 & 3 & 1 \\ 1 & 3 & 2 \end{array} \right]$ and  $\large B = \left[ \begin{array}{ccc} 1 & 2 \\ 2 & 1 \\ 1 & 3 \end{array} \right]$
Show that AB ≠ BA

Solution:
A B = $\large \left[ \begin{array}{ccc} 2 & 3 & 1 \\ 1 & 3 & 2 \end{array} \right]$ $\large \left[ \begin{array}{ccc} 1 & 2 \\ 2 & 1 \\ 1 & 3 \end{array} \right]$

$\large = \left[ \begin{array}{ccc} 2+6+1 & 4+3+3 \\ 1+6+2 & 2+3+6 \end{array} \right]$

$\large = \left[ \begin{array}{ccc} 9 & 10 \\ 9 & 11 \end{array} \right]$

B A = $\large \left[ \begin{array}{ccc} 1 & 2 \\ 2 & 1 \\ 1 & 3 \end{array} \right]$ $\large \left[ \begin{array}{ccc} 2 & 3 & 1 \\ 1 & 3 & 2 \end{array} \right]$

$\large = \left[ \begin{array}{ccc} 2+2 & 3+6 & 1+4 \\ 4+1 & 6+3 & 2+2 \\ 2+3 & 3+9 & 1+6 \end{array} \right]$

$\large = \left[ \begin{array}{ccc} 4 & 9 & 5 \\ 5 & 9 & 4 \\ 5 & 12 & 7 \end{array} \right]$

Thus AB ≠ BA. Indeed, AB is of order 2 × 2 while BA is of order 3 × 3, so the two products cannot even be compared entry by entry.
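The computation above can be reproduced directly; this sketch uses NumPy's `@` operator for the matrix product.

```python
import numpy as np

A = np.array([[2, 3, 1],
              [1, 3, 2]])      # order 2 x 3
B = np.array([[1, 2],
              [2, 1],
              [1, 3]])         # order 3 x 2

AB = A @ B                     # order 2 x 2
BA = B @ A                     # order 3 x 3

print(AB)   # [[ 9 10]
            #  [ 9 11]]
print(BA)   # [[ 4  9  5]
            #  [ 5  9  4]
            #  [ 5 12  7]]
assert AB.shape != BA.shape    # AB and BA are not even of the same order
```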

Example : Prove that (AB)C = A(BC), where A, B, C are matrices conformable for the product

Solution: Let A = $[a_{ij}]_{m \times n}$ , B = $[b_{jk}]_{n \times p}$ , C = $[c_{kl}]_{p \times q}$

Then the (i, k)-th element of AB is $\large (AB)_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk}$

so the (i, l)-th element of (AB)C is

$\large ((AB)C)_{il} = \sum_{k=1}^{p} \left( \sum_{j=1}^{n} a_{ij} b_{jk} \right) c_{kl} = \sum_{k=1}^{p} \sum_{j=1}^{n} a_{ij} b_{jk} c_{kl}$

Similarly, $\large (BC)_{jl} = \sum_{k=1}^{p} b_{jk} c_{kl}$

so $\large (A(BC))_{il} = \sum_{j=1}^{n} a_{ij} \left( \sum_{k=1}^{p} b_{jk} c_{kl} \right) = \sum_{j=1}^{n} \sum_{k=1}^{p} a_{ij} b_{jk} c_{kl}$

Since the order of two finite sums can be interchanged, the two expressions agree for every i and l.

Hence (AB)C = A(BC)
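The associative law can also be spot-checked numerically; using integer matrices keeps the arithmetic exact, so the two products agree entry by entry (the random shapes below are an arbitrary choice for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(2, 3))   # m x n
B = rng.integers(-5, 6, size=(3, 4))   # n x p
C = rng.integers(-5, 6, size=(4, 2))   # p x q

# Integer arithmetic is exact, so (AB)C and A(BC) match exactly
assert np.array_equal((A @ B) @ C, A @ (B @ C))
```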

Illustration : Every square matrix A can be uniquely expressed as the sum of a symmetric and a skew-symmetric matrix: $\large A = \frac{1}{2}(A + A^T) + \frac{1}{2}(A - A^T)$ , where $\frac{1}{2}(A + A^T)$ is symmetric and $\frac{1}{2}(A - A^T)$ is skew-symmetric.
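A minimal sketch of this decomposition, assuming NumPy: the symmetric part is ½(A + Aᵀ) and the skew-symmetric part is ½(A − Aᵀ), and they sum back to A.

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 10]])

P = (A + A.T) / 2    # symmetric part:       P.T == P
Q = (A - A.T) / 2    # skew-symmetric part:  Q.T == -Q

assert np.array_equal(P.T, P)
assert np.array_equal(Q.T, -Q)
assert np.array_equal(P + Q, A)   # the two parts recover A
print(P)
print(Q)
```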

##### Notes :

∎ Commutative law does not necessarily hold for matrices.

∎ If AB = BA then matrices A and B are called commutative matrices.

∎ If AB = − BA then matrices A and B are called anti-commutative matrices.

∎ Matrix multiplication is associative.

∎ Matrix multiplication is distributive with respect to addition.

∎ Matrices possess divisors of zero, i.e. the product AB = O does not imply that at least one of A and B is the zero matrix.

∎ Cancellation law does not necessarily hold, i.e. AB = AC does not in general imply B = C, even when A ≠ O.
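Both failures in the last two notes can be seen in one small example (the particular 2 × 2 matrices below are an arbitrary illustration):

```python
import numpy as np

A = np.array([[1, 0],
              [0, 0]])
B = np.array([[0, 0],
              [0, 1]])
C = np.array([[0, 0],
              [5, 1]])

# Divisors of zero: AB = O although A != O and B != O
assert np.array_equal(A @ B, np.zeros((2, 2), dtype=int))

# Cancellation fails: AB = AC, yet B != C
assert np.array_equal(A @ B, A @ C)
assert not np.array_equal(B, C)
```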
