Linear Independence, Basis, Dimension and Co-ordinates

Let V be a vector space over the field F, where F is R or C, and let S be a subset of V.

1. Linear independence:

1.1. Linearly dependent set:

Let ϕ ≠ S⊆V. If x∈S is such that x is a linear combination of some other elements of S, then S is said to be linearly dependent. In other words, if S is linearly dependent, then for some x∈S there exist y1, y2, …, yn∈S, all different from x, such that x=α1y1+ α2y2+ …+ αnyn, for some α1, α2, ..., αn∈F. Notice that in this case, (-1)x+α1y1+ α2y2+ …+ αnyn=0, i.e. there exists a linear combination of elements of S which equals zero but in which not all coefficients are zero. Any set containing the zero vector is linearly dependent.

1.2. Linearly independent set:

Let ϕ≠S⊆V. Then S is said to be a linearly independent set if it is not linearly dependent. The empty set ϕ is defined to be linearly independent.
To show that S={a, b}⊆V is linearly independent, one needs to show that b≠αa and a≠βb, for any scalars α and β. Similarly, to show that S={a, b, c}⊆V is linearly independent one needs to show that none of a, b and c is a linear combination of the other two elements.
A general method for showing linear independence is provided in the proposition below.

1.3. Proposition:

Let ϕ≠S⊆V. Then S is linearly independent iff [α1x1+ α2x2+ …+ αnxn=0 ⇒ αi=0, for all i=1, 2, …, n, where α1, α2, ..., αn∈F and x1, x2, …, xn are distinct elements of S].

Proof: Sufficient part:

Let αi=0, for all i=1, 2, …, n, whenever α1x1+ α2x2+ …+ αnxn=0, where α1, α2, ..., αn∈F and x1, x2, …, xn are distinct elements of S. To the contrary, let S be linearly dependent. By the definition of a linearly dependent set, there exists x∈S such that x=α1y1+ α2y2+ …+ αnyn, for some α1, α2, ..., αn∈F and y1, y2, …, yn∈S different from x; combining repeated terms, we may assume y1, y2, …, yn are distinct. Thus (-1)x+α1y1+ α2y2+ …+ αnyn=0 is a linear combination of distinct elements of S which equals zero. By hypothesis, -1=0. This is a contradiction.

Necessary part:

Let S be linearly independent. To the contrary, suppose α1x1+ α2x2+ …+ αnxn=0 for some α1, α2, ..., αn∈F and distinct x1, x2, …, xn∈S, with αi≠0 for some i∈{1, 2, …, n}. Clearly -αixi=α1x1+ α2x2+ …+ αi-1xi-1+ αi+1xi+1+ …+ αnxn. Hence xi=-αi⁻¹(α1x1+ α2x2+ …+ αi-1xi-1+ αi+1xi+1+ …+ αnxn). Note that αi⁻¹ exists because αi≠0. Thus xi is a linear combination of other elements of S, so S is linearly dependent, a contradiction.

1.4. Examples-I:

Consider the vector space R2 over R and let S⊆R2.

(i) S={(0, 1), (1, 2), (2, 7)} is linearly dependent.
Justification: Clearly, (2, 7) is a linear combination of (0, 1) and (1, 2): (2, 7)=2(1, 2)+3(0, 1). Thus, S is linearly dependent. Notice that 2(1, 2)+3(0, 1)-1(2, 7)=0, i.e. a linear combination of elements of S is zero, but not all the coefficients are zero.
Remark. a(0, 1)+b(1, 2)+c(2, 7)=0 ⇒ (b+2c, a+2b+7c)=(0, 0), i.e. b+2c=0 and a+2b+7c=0. This system does not force a=b=c=0; for instance, a=-3, b=-2, c=1 is a non-zero solution, which again shows that S is linearly dependent. So merely writing down the system does not by itself settle dependence or independence; one must check whether it admits a non-zero solution.

(ii) S={(1, 2),(1, 0)} is linearly independent.
Justification: a(1, 2)+b(1, 0)=(0, 0) ⇒ (a, 2a)+(b, 0)=(0, 0) ⇒ (a+b, 2a)=(0, 0). Thus 2a=0 and a+b=0, so a=0 and b=0. Hence both coefficients are zero and therefore S is linearly independent.
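
The criterion of Proposition 1.3 can also be checked numerically: a finite list of vectors in Rn is linearly independent exactly when the matrix whose rows are those vectors has rank equal to the number of vectors. A minimal sketch of this check for the two sets above, assuming Python with NumPy is available (the variable names are illustrative only):

    import numpy as np

    # Rows of A are the vectors of S = {(0, 1), (1, 2), (2, 7)} from Example 1.4(i).
    A = np.array([[0.0, 1.0],
                  [1.0, 2.0],
                  [2.0, 7.0]])
    # The set is linearly independent exactly when the rank equals the number of vectors.
    print(np.linalg.matrix_rank(A))   # 2 < 3, so the set is linearly dependent

    # Rows of B are the vectors of S = {(1, 2), (1, 0)} from Example 1.4(ii).
    B = np.array([[1.0, 2.0],
                  [1.0, 0.0]])
    print(np.linalg.matrix_rank(B))   # 2 = 2, so the set is linearly independent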

1.5. Examples-II:

(i) Consider the vector space R3 over R. Then S={(1, 0, 0), (0, 1, 0), (0, 0, 1)} is linearly independent. Justification: Let α(1, 0, 0)+β(0, 1, 0)+γ(0, 0, 1)=(0, 0, 0), for α, β, γ∈R. Then (α, β, γ)=(0, 0, 0), so α=0, β=0, γ=0, which implies, by definition, that S is linearly independent.
(ii) Consider the vector space P2(x) over R. Then S={1, x, x2+1} is linearly independent. Justification: Let α(1)+β(x)+γ(x2+1)=0, for α, β, γ∈R. Comparing coefficients of x2, x and the constant term gives γ=0, β=0 and α+γ=0, hence α=0, β=0, γ=0, which implies, by definition, that S is linearly independent.
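
The polynomial example reduces to the same rank test once each element of P2(x) is identified with its coefficient vector; this identification is an illustrative device, not part of the original example. A sketch, again assuming NumPy:

    import numpy as np

    # Encode the polynomials 1, x and x^2 + 1 of Example 1.5(ii) as coefficient
    # vectors (constant term, coefficient of x, coefficient of x^2).
    polys = np.array([[1.0, 0.0, 0.0],   # 1
                      [0.0, 1.0, 0.0],   # x
                      [1.0, 0.0, 1.0]])  # x^2 + 1
    # Rank 3 equals the number of polynomials, so the set is linearly independent.
    print(np.linalg.matrix_rank(polys))  # 3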

1.6. Properties of linearly independent and linearly dependent sets:

(i) Any set containing the zero vector is linearly dependent. In particular, {0} is linearly dependent.
(ii) A singleton set containing a non-zero vector is linearly independent.
(iii) A subset of a linearly independent set is linearly independent.
(iv) A superset of a linearly dependent set is linearly dependent.

2. Basis:

A non-empty subset S of V is said to be a basis of V if S is a linearly independent set and spans V.

2.1. Examples:

1. Let S be the linearly independent set given in Example 1.5(i). It can be seen that S spans R3. Hence S is a basis for R3.
2. Let S be the linearly independent set given in Example 1.5(ii). It can be seen that S spans P2(x). Hence S is a basis for P2(x).
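
For n vectors in Rn, being a basis is equivalent to the n×n matrix formed from them being invertible (equivalently, having non-zero determinant). A small sketch of this check for the set of Example 1.5(i), assuming NumPy; the vector v is an arbitrary choice used only to illustrate spanning:

    import numpy as np

    # Columns of M are the vectors of S = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}.
    M = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
    # Three vectors in R^3 form a basis exactly when this 3x3 matrix is invertible.
    print(np.linalg.det(M) != 0)      # True, so S is a basis of R^3
    # Spanning in particular: any v in R^3 is a combination of the vectors of S.
    v = np.array([4.0, -1.0, 2.0])    # an arbitrary illustrative vector
    print(np.linalg.solve(M, v))      # [ 4. -1.  2.], the required coefficients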

3. Dimension:

Let V have a basis consisting of finitely many elements. Then the number of elements in a basis of V is called the dimension of the vector space V and is denoted by dim V. The dimension of {0} is defined to be zero, as {0} is generated by the empty set ϕ.

3.1. Examples:

1. In Example 1.5(i), S is a basis for R3; hence dim R3 = 3.
2. In Example 1.5(ii), S is a basis for P2(x); hence dim P2(x) = 3.

3.2. Properties of basis and dimension:

1. Let V have a finite basis. Then every basis for V contains the same number of vectors.
2. If a basis of V has n elements, then no subset of V having n-1 elements spans V.
3. If a basis of V has n elements, then any subset of V having n+1 elements is linearly dependent.
4. Let B be a subset of V. Then the following are equivalent.
    a. B is a basis.
    b. B is a minimal generating set, that is, no proper subset of B generates V.
    c. B is a maximal linearly independent set.
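
Property 3 above can be observed numerically: since dim R3 = 3, the matrix formed from any four vectors in R3 has rank at most 3, so the four vectors are linearly dependent. A sketch, assuming NumPy; the randomly generated vectors are purely illustrative:

    import numpy as np

    # dim R^3 = 3, so any four vectors in R^3 are linearly dependent (property 3).
    # The particular vectors below are arbitrary illustrative choices.
    rng = np.random.default_rng(0)
    vectors = rng.integers(-5, 6, size=(4, 3)).astype(float)
    print(np.linalg.matrix_rank(vectors) < 4)   # always True: the rank is at most 3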

4. Co-ordinates:

Let V be a vector space, let B={e1, e2} be a basis of V (so that dim V=2 in this illustration) and let x∈V. Then x=αe1+βe2, for some unique α, β∈F. These scalars α and β are called the co-ordinates of x w.r.t. the basis B.

4.1. Examples:

Let R2 be the vector space over R.
1. Consider the basis B={(1, 1), (1, 0)} of the vector space R2 over R. Then (2, 3)∈R2 can be written as (2, 3)=α(1, 1)+β(1, 0). This implies that α=3 and β=-1. Thus the co-ordinates of (2, 3) w.r.t. the basis B are 3, -1.
2. If B={e1, e2} is a basis of the vector space R2 over R, then
    (i) the co-ordinates of e1 w.r.t. the basis B are 1, 0, since e1=1.e1+0.e2;
    (ii) the co-ordinates of e2 w.r.t. the basis B are 0, 1, since e2=0.e1+1.e2.
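
Co-ordinates w.r.t. a basis can be computed by solving the linear system whose columns are the basis vectors. A sketch of the calculation from example 1 above, assuming NumPy (the variable names are illustrative):

    import numpy as np

    # Columns of M are the basis vectors (1, 1) and (1, 0) of B from example 1.
    M = np.array([[1.0, 1.0],
                  [1.0, 0.0]])
    x = np.array([2.0, 3.0])
    # Solving M @ [alpha, beta] = x gives the co-ordinates of x w.r.t. B.
    print(np.linalg.solve(M, x))   # [ 3. -1.], i.e. alpha = 3 and beta = -1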