
Linearly Independent

By Kardi Teknomo, PhD.

<Next | Previous | Index>

On this page, you will learn about linearly dependent and linearly independent vectors.

In the previous topic on Basis Vector, you learned that a set of vectors can form a coordinate system in $n$-dimensional space. A set of vectors that can form a coordinate system is called a set of basis vectors. One main characteristic of basis vectors is that they cannot be expressed as a linear combination of the other basis vectors. This characteristic is called linear independence. Before we discuss linearly independent vectors, we will first discuss linearly dependent vectors.

Linearly Dependent Vectors

A set of vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k$ of the same dimension $n$ is said to be linearly dependent if there is a set of scalars $c_1, c_2, \ldots, c_k$, not all zero, such that the linear combination is a zero vector: $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_k\mathbf{v}_k = \mathbf{0}$.

Linearly dependent vectors cannot be used to make a coordinate system. Geometrically, two vectors are linearly dependent if they point in the same or the opposite direction; such vectors are parallel, or lie on the same line (collinear). Three vectors are linearly dependent if they lie in a common plane passing through the origin (coplanar).



Algebraically, we can augment a set of $n$ vectors (each of dimension $n$) to form a matrix $A$ of size $n$ by $n$ whose columns are the vectors. If the matrix $A$ is singular (i.e., it has no inverse), then we say that the set of vectors is linearly dependent.
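
As a rough sketch of this algebraic test (the function below and its tolerance are illustrative choices, not part of this tutorial's interactive programs), NumPy's determinant can be used to check singularity:

```python
import numpy as np

def is_linearly_dependent(vectors, tol=1e-12):
    """Return True if the given equal-length vectors are linearly dependent.

    The vectors are stacked as columns of a square matrix A; the set is
    dependent exactly when A is singular, i.e. det(A) is (numerically) zero.
    """
    A = np.column_stack(vectors)          # n-by-n matrix whose columns are the vectors
    return abs(np.linalg.det(A)) < tol    # singular => linearly dependent

# Example: the second vector is twice the first, so the set is dependent.
print(is_linearly_dependent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # True
print(is_linearly_dependent([np.array([1.0, 0.0]), np.array([0.0, 1.0])]))  # False
```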

Linearly Independent Vectors

Having discussed linearly dependent vectors, we are now ready for linearly independent vectors.

A set of vectors that is not linearly dependent is called linearly independent. When you set a linear combination of linearly independent vectors equal to the zero vector, $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_k\mathbf{v}_k = \mathbf{0}$, the only solution is $c_1 = c_2 = \cdots = c_k = 0$, because linearly independent vectors cannot be expressed as a linear combination of one another. Since that is the main characteristic of basis vectors, we say that basis vectors are equivalent to linearly independent vectors.
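
A minimal numerical sketch of this property (the helper below is hypothetical, not part of the tutorial) checks that the homogeneous system $A\mathbf{c} = \mathbf{0}$ has only the trivial solution, which happens exactly when the rank of $A$ equals the number of vectors:

```python
import numpy as np

def only_trivial_solution(vectors):
    """True if c1*v1 + ... + ck*vk = 0 forces all ci = 0 (linear independence).

    By the rank-nullity theorem, the homogeneous system A c = 0 has only the
    trivial solution exactly when rank(A) equals the number of columns k.
    """
    A = np.column_stack(vectors)                 # columns are the vectors
    return np.linalg.matrix_rank(A) == A.shape[1]

# The standard basis of the plane is linearly independent.
print(only_trivial_solution([np.array([1.0, 0.0]), np.array([0.0, 1.0])]))  # True
# (1,1) = (1,0) + (0,1), so adding it makes the set dependent.
print(only_trivial_solution([np.array([1.0, 0.0]),
                             np.array([0.0, 1.0]),
                             np.array([1.0, 1.0])]))                        # False
```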

Geometrically, linearly independent vectors form a coordinate system.

By inspection, we can determine whether a set of vectors is linearly independent or linearly dependent. If at least one vector can be expressed as a linear combination (i.e. scalar multiple or sum) of the other vectors, then the set of vectors is linearly dependent. If no vector can be expressed as a linear combination of the other vectors, then the set of vectors is linearly independent.

Examples:
A set of two vectors is linearly dependent when one vector can be expressed as a scalar multiple of the other; for instance, $(2, 4) = 2\,(1, 2)$.
A set of two vectors is linearly independent when no scalar $c$ exists such that one vector equals $c$ times the other; for instance, $(1, 0)$ and $(0, 1)$.
A set of vectors is linearly dependent when one vector can be expressed as a linear combination of the others; for instance, $(1, 1) = (1, 0) + (0, 1)$.

Algebraically, the vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$ comprising the columns of an $n$ by $n$ matrix $A$ are linearly independent if and only if the matrix $A$ is non-singular (i.e., it has an inverse).

For two vectors $\mathbf{x}$ and $\mathbf{y}$, a simpler and faster computational procedure in a computer program is based on the Cauchy-Schwarz inequality, which states that the absolute value of the vector dot product is always less than or equal to the product of their norms, $|\mathbf{x}\cdot\mathbf{y}| \le \|\mathbf{x}\|\,\|\mathbf{y}\|$. The equality $|\mathbf{x}\cdot\mathbf{y}| = \|\mathbf{x}\|\,\|\mathbf{y}\|$ holds if and only if the vectors are linearly dependent. Thus, strict inequality indicates that the vectors are linearly independent.
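
As a sketch of this procedure (the helper name and tolerance below are hypothetical), one can compare $|\mathbf{x}\cdot\mathbf{y}|$ against $\|\mathbf{x}\|\,\|\mathbf{y}\|$ and declare the vectors dependent when the two values coincide within a tolerance:

```python
import numpy as np

def are_dependent_cauchy_schwarz(x, y, tol=1e-12):
    """Two-vector test: x and y are linearly dependent iff |x.y| = ||x|| * ||y||.

    By the Cauchy-Schwarz inequality, |x.y| <= ||x|| * ||y|| always holds;
    equality occurs exactly when the vectors are parallel (dependent).
    """
    return abs(abs(np.dot(x, y)) - np.linalg.norm(x) * np.linalg.norm(y)) < tol

x = np.array([1.0, 2.0])
print(are_dependent_cauchy_schwarz(x, 3 * x))                 # True: same direction
print(are_dependent_cauchy_schwarz(x, np.array([2.0, 1.0])))  # False: independent
```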

The interactive program below is designed to answer whether two vectors are linearly independent or linearly dependent.


See Also: Basis Vector, Changing Basis, Eigen Values & Eigen Vectors

<Next | Previous | Index>


This tutorial is copyrighted.

The preferred reference for this tutorial is

Teknomo, Kardi (2011) Linear Algebra Tutorial. http://people.revoledu.com/kardi/tutorial/LinearAlgebra/

 

 
© 2007 Kardi Teknomo. All Rights Reserved.