
Some basic facts about vectors and matrices


    Addition rule for matrices

    If \(c_1, \ldots, c_k\) are scalars and \(A_1, \ldots, A_k\) are all \(m \times n\) matrices, then \(B = c_1A_1 + c_2A_2 + \cdots + c_kA_k\) is an \(m \times n\) matrix whose \((i, j)\)-th entry is \(B(i, j) = c_1A_1(i, j) + c_2A_2(i, j) + \cdots + c_kA_k(i, j)\), for all \(i = 1, \ldots, m;\ j = 1, \ldots, n\). (Note: sometimes we denote the entries of a matrix by \(A_{ij}\) and sometimes by \(A(i, j)\), but the first index is always for the row and the second for the column.)
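    As a quick check, the sketch below forms such a linear combination entrywise with NumPy (the matrices and scalars are illustrative, not from the text):

```python
import numpy as np

# Two 2x3 matrices and two scalars (illustrative values)
A1 = np.array([[1, 2, 3],
               [4, 5, 6]])
A2 = np.array([[0, 1, 0],
               [1, 0, 1]])
c1, c2 = 2.0, -1.0

# B = c1*A1 + c2*A2 is formed entry by entry: B[i, j] = c1*A1[i, j] + c2*A2[i, j]
B = c1 * A1 + c2 * A2
print(B)
# [[ 2.  3.  6.]
#  [ 7. 10. 11.]]
```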

    Transpose of a matrix

    If \(A\) is an \(m \times n\) matrix, then \(A^T\) (read "\(A\)-transpose") is the \(n \times m\) matrix \(B\) whose \((i, j)\)-th entry is \(B_{ij} = A_{ji}\) for all \(i = 1, \ldots, n;\ j = 1, \ldots, m\).
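    A minimal sketch, again assuming NumPy is available; the .T attribute swaps rows and columns:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2x3
B = A.T                     # 3x2; B[i, j] == A[j, i]
print(B)
# [[1 4]
#  [2 5]
#  [3 6]]
```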

    Inner product of vectors

    If \(x\) and \(y\) are two \(m \times 1\) vectors, then the inner product (or, dot product) between \(x\) and \(y\) is given by \(\left \langle x, y \right \rangle = \sum_{i=1}^{m} x_i y_i\). Note that \(\left \langle x, y \right \rangle = \left \langle y, x \right \rangle\).
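    A short sketch of the inner product and its symmetry, with vectors stored as 1-D NumPy arrays (the values are illustrative):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# <x, y> = sum_i x_i * y_i; symmetry means dot(x, y) == dot(y, x)
print(np.dot(x, y))  # 32.0
print(np.dot(y, x))  # 32.0
```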

    Multiplication of matrices

    If \(A\) is an \(m \times n\) matrix and \(B\) is an \(n \times p\) matrix, then the product \(AB = C\), say, is defined, and it is an \(m \times p\) matrix with \((i, j)\)-th entry \(C_{ij} = \sum_{k=1}^{n}A_{ik}B_{kj}\) for all \(i = 1, \ldots, m;\ j = 1, \ldots, p\). In other words, the \((i, j)\)-th entry of \(AB\) is the inner product of the \(i\)-th row of \(A\) and the \(j\)-th column of \(B\). Note that for \(m \times 1\) vectors \(x\) and \(y\), \(\left \langle x, y \right \rangle = x^Ty = y^Tx\).
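    To illustrate, the sketch below checks one entry of a product against the row-column inner product that defines it (matrix values are arbitrary):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])      # 3x2
B = np.array([[7, 8, 9],
              [0, 1, 2]])   # 2x3

C = A @ B                   # 3x3 product

# Entry (i, j) of C is the inner product of row i of A and column j of B
i, j = 1, 2
print(C[i, j])                    # 35
print(np.dot(A[i, :], B[:, j]))   # 35, same value
```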

    Special matrices

    1. Square matrix: A matrix \(A\) is square if it is \(m \times m\) (that is, number of rows = number of columns).
    2. Symmetric matrix: An \(m \times m\) (square) matrix \(A\) is symmetric if \(A = A^T\). That is, for all \(1 \leq i, j \leq m, A_{ij} = A_{ji}\).
    3. Diagonal matrix: An \(m \times m\) matrix with all entries zero except (possibly) the entries on the diagonal (that is, the \((i, i)\)-th entries, \(i = 1, \ldots, m\)) is called a diagonal matrix.
    4. Identity matrix: The \(m \times m\) diagonal matrix with all diagonal entries equal to 1 is called the identity matrix and is denoted by \(I\) (or, \(I_m\)). It has the property that for any \(m \times n\) matrix \(A\) and any \(p \times m\) matrix \(B\), \(IA = A\) and \(BI = B\).
    5. One vector: The \(m \times 1\) vector with all entries equal to 1 is usually called the one vector (a non-standard term) and is denoted by \(1\) (or, \(1_m\)).
    6. Ones matrix: The \(m \times m\) matrix with all entries equal to 1 is denoted by \(J\) (or, \(J_m\)). Note that \(J_m = 1_m 1_m^T\).
    7. Zero vector: The \(m \times 1\) vector with all entries zero is called the zero vector and is denoted by \(0\) (or, \(0_m\)).
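    The short sketch below builds these special matrices with NumPy and verifies \(J_m = 1_m 1_m^T\) and \(IA = A\) (the dimension \(m = 3\) is illustrative):

```python
import numpy as np

m = 3
I = np.eye(m)                # m x m identity matrix
one_vec = np.ones((m, 1))    # m x 1 one vector
J = np.ones((m, m))          # m x m ones matrix
zero_vec = np.zeros((m, 1))  # m x 1 zero vector

# J_m equals the outer product 1_m 1_m^T
print(np.array_equal(J, one_vec @ one_vec.T))  # True

# IA = A for any m x n matrix A
A = np.arange(6).reshape(m, 2)
print(np.array_equal(I @ A, A))                # True
```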
    • Multiplication is not commutative: If \(A\) and \(B\) are both \(m \times m\) matrices, then both \(AB\) and \(BA\) are defined and are \(m \times m\) matrices. However, in general \(AB \neq BA\) (see the sketch after this list). Notice that \(I_mB = BI_m = B\), where \(I_m\) is the identity matrix.
    • Linear independence: The \(m \times 1\) vectors \(x_1, \ldots, x_k\) (\(k\) arbitrary) are said to be linearly dependent if there exist constants \(c_1, \ldots, c_k\), not all zero, such that $$c_1x_1 + c_2x_2 + \cdots + c_kx_k = 0$$ If no such constants \(c_1, \ldots, c_k\) exist, then the vectors \(x_1, \ldots, x_k\) are said to be linearly independent.
    1. Relationship with dimension: If \(k > m\), then the \(m \times 1\) vectors \(x_1, \ldots, x_k\) are always linearly dependent.
    2. Rank of a matrix: For an \(m \times n\) matrix \(A\), the rank of \(A\), written rank(\(A\)), is the maximal number of linearly independent columns of \(A\) (treating each column as an \(m \times 1\) vector). Also, rank(\(A\)) \(\leq\) min\(\{m, n\}\).
    3. Nonsingular matrix: If an \(m \times m\) matrix \(A\) has full rank, that is, rank(\(A\)) = \(m\) (which is equivalent to saying that all the columns of \(A\) are linearly independent), then the matrix \(A\) is called nonsingular.
    • Inverse of a matrix: If an \(m \times m\) matrix \(A\) is nonsingular, then it has an inverse, that is, a unique \(m \times m\) matrix denoted by \(A^{-1}\) that satisfies the relationship \(A^{-1}A = I_m = AA^{-1}\).
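    The sketch below (matrix values are arbitrary) illustrates non-commutativity, rank, and the defining property of the inverse, using NumPy's linear algebra routines:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Multiplication is not commutative: AB and BA generally differ
print(np.array_equal(A @ B, B @ A))       # False

# rank(A) = 2 here, so A is nonsingular
print(np.linalg.matrix_rank(A))           # 2

# The inverse satisfies A^{-1} A = I = A A^{-1}
A_inv = np.linalg.inv(A)
print(np.allclose(A_inv @ A, np.eye(2)))  # True
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```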

    1. Inverse of a \(2 \times 2\) matrix: Let a \(2 \times 2\) matrix \(A\) be expressed as \(A = \begin{bmatrix}
    a &b \\
    c &d
    \end{bmatrix}\).

    Then \(A\) is nonsingular (and hence has an inverse) if and only if \(ad-bc \neq 0\). If this is satisfied then the inverse is $$A^{-1} = \frac{1}{ad-bc}\begin{bmatrix}
    d & -b\\
    -c & a
    \end{bmatrix}$$
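    As a check, the formula can be applied directly and compared against NumPy's general inverse routine (the entries \(a, b, c, d\) are illustrative):

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0
A = np.array([[a, b],
              [c, d]])

det = a * d - b * c          # ad - bc = -2, nonzero, so A is nonsingular
A_inv = (1.0 / det) * np.array([[ d, -b],
                                [-c,  a]])

print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```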

    2. Solution of a system of linear equations: A system of \(m\) linear equations in \(m\) variables \(b_1, \ldots, b_m\) can be expressed as $$a_{11}b_1 + a_{12}b_2 + \cdots + a_{1m}b_m = c_1$$ $$a_{21}b_1 + a_{22}b_2 + \cdots + a_{2m}b_m = c_2$$ $$\vdots$$ $$a_{m1}b_1 + a_{m2}b_2 + \cdots + a_{mm}b_m = c_m$$

    Here the coefficients \(a_{ij}\) and the constants \(c_i\) are considered known, and the \(b_i\) are the unknowns. This system can be expressed in matrix form as \(Ab = c\), where \(A\) is the \(m \times m\) matrix with \((i, j)\)-th entry \(a_{ij}\), and \(b\) and \(c\) are \(m \times 1\) vectors with \(i\)-th entries \(b_i\) and \(c_i\), respectively, for \(i = 1, \ldots, m;\ j = 1, \ldots, m\).

    If the matrix \(A\) is nonsingular, then a unique solution exists for this system of equations and is given by \(b = A^{-1}c\). To see this, note that \(A(A^{-1}c) = (AA^{-1})c = Ic = c\), which shows that \(A^{-1}c\) is a solution. On the other hand, if \(b = b^*\) is a solution, then it satisfies \(Ab^* = c\). Hence \(b^* = Ib^* = (A^{-1}A)b^* = A^{-1}(Ab^*) = A^{-1}c\), which proves uniqueness.
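    In practice one rarely forms \(A^{-1}\) explicitly; a solver such as np.linalg.solve is preferred. A minimal sketch with illustrative values:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # nonsingular: det = 5
c = np.array([3.0, 5.0])

# Unique solution b = A^{-1} c, computed without forming the inverse
b = np.linalg.solve(A, c)
print(b)                      # [0.8 1.4]
print(np.allclose(A @ b, c))  # True
```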

    Contributors

    • Debashis Paul
    • Cathy Wang

