

Linearly Dependent & Linearly Independent

In this page, you will learn more about linearly dependent and linearly independent vectors. In the previous topic, Basis Vector, you learned that a set of vectors can form a coordinate system in n-dimensional space. A set of vectors that can form a coordinate system is called a set of basis vectors. One main characteristic of basis vectors is that none of them can be written as a linear combination of the other basis vectors. This characteristic is called linear independence. Before we discuss linearly independent vectors, we will first discuss linearly dependent vectors.

Linearly Dependent Vectors

A set of vectors v1, v2, ..., vn of the same dimension is said to be linearly dependent if there is a set of scalars c1, c2, ..., cn, not all zero, such that the linear combination is the zero vector:

c1*v1 + c2*v2 + ... + cn*vn = 0

Linearly dependent vectors cannot be used to make a coordinate system. Geometrically, two vectors are linearly dependent if they point in the same direction or in opposite directions; such vectors are parallel, or lie on the same line (collinear). Three vectors are linearly dependent if they lie in a common plane passing through the origin (coplanar).
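The definition above can be illustrated with a minimal sketch in plain Python (the vectors and scalars below are made-up example values, not from the tutorial): we build a third vector as v1 + 2*v2, so the scalars (1, 2, -1), which are not all zero, combine the three vectors into the zero vector, and the set is therefore linearly dependent.

```python
# Example vectors: v3 is constructed as v1 + 2*v2, so dependence is built in.
v1 = [1.0, 0.0, 2.0]
v2 = [0.0, 1.0, 1.0]
v3 = [1.0, 2.0, 4.0]   # = v1 + 2*v2

def linear_combination(scalars, vectors):
    """Return c1*v1 + c2*v2 + ... + cn*vn as a plain list."""
    n = len(vectors[0])
    result = [0.0] * n
    for c, v in zip(scalars, vectors):
        for i in range(n):
            result[i] += c * v[i]
    return result

# Scalars (1, 2, -1) are not all zero, yet the combination is the zero
# vector -- exactly the definition of linear dependence.
combo = linear_combination([1.0, 2.0, -1.0], [v1, v2, v3])
print(combo)   # [0.0, 0.0, 0.0]
```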
Algebraically, we can put the set of n vectors side by side as the columns of an n-by-n matrix. If the matrix is singular (i.e., it has no inverse), then the set of vectors is linearly dependent.

Linearly Independent Vectors

Having discussed linearly dependent vectors, we are now ready for linearly independent vectors. A set of vectors that is not linearly dependent is called linearly independent. When you set a linear combination of linearly independent vectors equal to the zero vector,

c1*v1 + c2*v2 + ... + cn*vn = 0

the only solution is c1 = c2 = ... = cn = 0, because linearly independent vectors cannot be written as linear combinations of one another. Since that is the main characteristic of basis vectors, we say that basis vectors are equivalent to linearly independent vectors. Geometrically, linearly independent vectors form a coordinate system.

By inspection we can determine whether a set of vectors is linearly independent or linearly dependent. If at least one vector can be expressed as a linear combination (i.e., a scalar multiple or sum) of the other vectors, then the set of vectors is linearly dependent. If no vector can be expressed as a linear combination of the other vectors, then the set of vectors is linearly independent.

Algebraically, the n vectors comprising the columns of an n-by-n matrix are linearly independent if and only if the matrix is nonsingular (i.e., it has an inverse). For two vectors u and v, a simpler and faster computational procedure in a computer program is based on the Cauchy-Schwarz inequality, which states that the absolute value of the dot product is always less than or equal to the product of the norms: |u . v| <= ||u|| ||v||. Equality holds if and only if the vectors are linearly dependent. Thus, a strict inequality indicates that the vectors are linearly independent.
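The two-vector Cauchy-Schwarz test above can be sketched as follows. This is a minimal illustration, not the tutorial's own interactive program; the function name and the tolerance `tol` (needed because floating-point norms are rarely exactly equal) are assumptions for this example.

```python
import math

def are_linearly_dependent(u, v, tol=1e-12):
    """Two-vector test via the Cauchy-Schwarz inequality:
    |u . v| <= ||u|| * ||v||, with equality iff u and v are
    linearly dependent. A small tolerance absorbs floating-point
    rounding in the square roots."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return abs(abs(dot) - norm_u * norm_v) < tol

print(are_linearly_dependent([1, 2], [2, 4]))   # True: parallel (v = 2u)
print(are_linearly_dependent([1, 2], [2, 1]))   # False: independent
```

For more than two vectors, or vectors of higher dimension, the matrix test described above (checking whether the n-by-n matrix of column vectors is singular) is the general procedure.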
See Also: Basis Vector, Changing Basis, Eigen Values & Eigen Vectors

Preferable reference for this tutorial: Teknomo, Kardi (2011) Linear Algebra tutorial. http://people.revoledu.com/kardi/tutorial/LinearAlgebra/



