
Subspaces, Basis, Dimension and Rank
We recall
Theorem 4, Sec 1.3. Let A be an m × n matrix. Then the
following statements are equivalent:
a. For each b ∈ Rm, the equation Ax = b has a solution.
b. Each b ∈ Rm is a linear combination of the columns of A.
c. The columns of A span Rm.
d. A has a pivot position in every row.
Theorem 8, Sec 1.7. Any set {v1, . . . , vp} in Rn is linearly
dependent if p > n.
1 A consequence of Theorem 4 above is that if A has fewer than
m columns, the columns of A cannot span Rm.
2 A consequence of Theorem 8 above is that the maximum
number of linearly independent vectors in Rn is n. Any other
vector is a linear combination of the vectors in a maximal
independent set of vectors.
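These two consequences can be checked numerically. The sketch below (using NumPy, with example matrices of my own choosing; the notes themselves contain no code) shows a 3 × 2 matrix whose columns cannot span R3, and four vectors in R3 that are necessarily dependent:

```python
import numpy as np

# A 3 x 2 matrix: with only 2 columns it has at most 2 pivot positions,
# so it cannot have a pivot in every one of its 3 rows, and its columns
# cannot span R^3 (consequence 1).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.matrix_rank(A))        # 2 < 3 rows

# Four vectors in R^3 (p = 4 > n = 3) are linearly dependent
# (consequence 2): the rank is at most 3, less than the number of columns.
V = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(V))        # 3 < 4 columns
```

The rank computed here counts the pivot positions of the matrix, which is exactly the quantity Theorem 4(d) refers to.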
Subspaces
Definition. A subspace of Rn is any set H in Rn that has three
properties:
a. 0 ∈ H.
b. u + v ∈ H for all u, v ∈ H.
c. cu ∈ H for all scalars c ∈ R and u ∈ H.
A subspace is closed under addition and scalar multiplication.
Example 1. - Planes and lines through the origin in R3 are
subspaces of R3. Plane: H = Span{u, v}, with u and v linearly
independent, is a subspace of R3.
Rn is a subspace of itself.
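The three subspace properties can be illustrated numerically for a plane H = Span{u, v}. In this sketch (the vectors u, v are made-up examples) membership in H is tested by checking that the least-squares residual is zero:

```python
import numpy as np

# H = Span{u, v} in R^3, with u, v linearly independent (example vectors).
u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 1.0])
B = np.column_stack([u, v])

def in_span(x, B, tol=1e-10):
    # x lies in Col(B) iff the least-squares residual is (numerically) zero.
    c, *_ = np.linalg.lstsq(B, x, rcond=None)
    return np.linalg.norm(B @ c - x) < tol

print(in_span(np.zeros(3), B))        # property a: 0 is in H
print(in_span(u + v, B))              # property b: closed under addition
print(in_span(3.7 * u, B))            # property c: closed under scaling
```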
Basis for a Subspace
Definition. A basis for a subspace H of Rn is a linearly
independent set in H that spans H.
Example 2. Let V = {a1, . . . , an} be a set of vectors in Rn.
Suppose that they are the columns of an invertible n × n matrix A.
Then V is a basis for Rn.
Solution. By the big Invertible Matrix Theorem (IMT) we have
that the columns of A form a linearly independent set (Thm 8 (e))
and span Rn (Thm 8 (h)).
So V is a basis for Rn.
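A numerical sketch of Example 2 (the matrix A below is just an example invertible matrix of my own choosing): full rank means the columns are independent and span R3, and np.linalg.solve then finds the unique weights expressing any b in terms of them.

```python
import numpy as np

# An example invertible 3 x 3 matrix (det A = 3, nonzero).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(A))   # 3: columns are independent and span R^3

# Every b in R^3 is then a unique combination of the columns of A.
b = np.array([1.0, 2.0, 3.0])
c = np.linalg.solve(A, b)         # the unique weights c1, c2, c3
print(np.allclose(A @ c, b))      # True
```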
Basis for a Subspace
It is a consequence that the vectors
e1 = [1, 0, . . . , 0]T , e2 = [0, 1, . . . , 0]T , · · · , en = [0, . . . , 0, 1]T
form a basis for Rn, because the identity matrix is invertible.
We also have the converse. If a set V is a basis for Rn, i.e., V is a
linearly independent set that spans Rn, then the matrix A whose
columns are the vectors of V is invertible.
That also follows from the IMT.
Basis for a Subspace
Given a basis B = {b1, · · · , bp} for a subspace H, suppose
that a vector x can be generated in two ways:
x = c1b1 + · · · + cpbp and x = d1b1 + · · · + dpbp.
Then subtracting gives
0 = x − x = (c1 − d1)b1 + · · · + (cp − dp)bp (1)
Since B is linearly independent, all weights in (1) must be
zero. So cj = dj for j = 1, · · · , p.
Vector Coordinates relative to a Basis
Definition. Let B be a basis for H. For each x in H, the
coordinates of x relative to the basis B are the weights
c1, . . . , cp such that x = c1b1 + · · · + cpbp, and the vector
[x]B = [c1, . . . , cp]T in Rp
is called the coordinate vector of x (relative to B) or the
B-coordinate vector of x.
Coordinate systems
Example 3. Let v1 = [3, 6, 2]T , v2 = [−1, 0, 1]T , x = [3, 12, 7]T ,
and B = {v1, v2}. Then B is a basis for H = Span{v1, v2} because
v1 and v2 are linearly independent.
(a) Is x ∈ H? (b) If so, find the B-coordinate vector of x.
(a) x ∈ H if and only if the system
c1[3, 6, 2]T + c2[−1, 0, 1]T = [3, 12, 7]T
is consistent.
Coordinate systems
Row reduction shows that
⎡ 3 −1  3 ⎤   ⎡ 1 0 2 ⎤
⎢ 6  0 12 ⎥ ∼ ⎢ 0 1 3 ⎥
⎣ 2  1  7 ⎦   ⎣ 0 0 0 ⎦
Thus c1 = 2, c2 = 3 is the solution and x is in H.
(b) The B-coordinate vector of x is [x]B = [2, 3]T .
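Example 3 can be reproduced numerically; in this sketch np.linalg.lstsq plays the role of the row reduction (a zero residual means the system is consistent, i.e., x ∈ H):

```python
import numpy as np

v1 = np.array([3.0, 6.0, 2.0])
v2 = np.array([-1.0, 0.0, 1.0])
x  = np.array([3.0, 12.0, 7.0])
B  = np.column_stack([v1, v2])    # 3 x 2 matrix with columns v1, v2

# Solve B c = x in the least-squares sense; a zero residual means x is in H.
c, *_ = np.linalg.lstsq(B, x, rcond=None)
print(np.allclose(B @ c, x))      # True: x is in H
print(np.round(c, 10))            # [2. 3.], i.e., [x]_B = [2, 3]^T
```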
The basis B determines a coordinate system on H (the figure
showing the grid on H is omitted here).
Coordinate systems
The correspondence x ↦ [x]B is a one-to-one correspondence
between H and R2 that preserves linear combinations. Such a
correspondence is called an isomorphism. We say that H is
isomorphic to R2.
In general, if B = {b1, · · · , bk} is a basis of H, then the mapping
x ↦ [x]B is a one-to-one correspondence between H and Rk, even
though the vectors in H may have more than k entries. So H is
isomorphic to Rk.
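The "preserves linear combinations" claim can be sketched numerically, reusing the basis from Example 3 (the weights 2 and −5 below are arbitrary): the coordinates of a combination equal the same combination of the coordinates.

```python
import numpy as np

v1 = np.array([3.0, 6.0, 2.0])
v2 = np.array([-1.0, 0.0, 1.0])
B  = np.column_stack([v1, v2])

def coords(x):
    # [x]_B for x in H = Span{v1, v2}
    c, *_ = np.linalg.lstsq(B, x, rcond=None)
    return c

x = 1.0 * v1 + 4.0 * v2           # two points of H
y = 2.0 * v1 - 1.0 * v2
# The mapping x -> [x]_B preserves linear combinations:
lhs = coords(2.0 * x - 5.0 * y)
rhs = 2.0 * coords(x) - 5.0 * coords(y)
print(np.allclose(lhs, rhs))      # True
```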
Basis and the number of basic vectors
Proposition 1. If a subspace H has a basis with k elements, then
every other basis has exactly k vectors.
Sketch of the proof.
A subspace H with a basis of k vectors is isomorphic to Rk.
So we need to show that every basis for Rk has exactly k vectors.
By Theorem 8, Section 1.7, the maximum number of vectors
a basis for Rk can have is k. So a basis for Rk must have at
most k (independent) vectors.
Suppose a basis for Rk has l vectors with l < k. By Theorem
4, placing the vectors of the basis as the column vectors of a
matrix A, the matrix must have one pivot in each row to span
Rk, i.e., k pivots. But A can have at most l < k pivots,
because it has l columns. A contradiction. Therefore, a basis
for Rk must have exactly k vectors. □
Dimension and Rank
Remark. Any set of k linearly independent vectors in Rk forms a
basis for Rk.
Definition. The dimension of a nonzero subspace H, denoted by
dim H, is the number of vectors in any basis for H. The dimension
of the zero space is zero.
Definition. Given an m × n matrix A, the rank of A is the
maximum number of linearly independent column vectors in A.
This number is denoted by rankA = r .
It can be shown that the number of independent columns in a
matrix A is also equal to the number of independent rows of A.
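The equality of row rank and column rank can be observed numerically (the matrix below is an arbitrary example with one dependent row):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],   # 2 x row 1 -> rows are dependent
              [1.0, 0.0, 1.0, 0.0]])
# Column rank of A equals the row rank (= column rank of A^T).
print(np.linalg.matrix_rank(A))       # 2
print(np.linalg.matrix_rank(A.T))     # 2
```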
Dimension and Rank
Theorem 1. - The Basis Theorem Let H be a k-dimensional
subspace of Rn. Any linearly independent set of exactly k vectors
in H is automatically a basis for H. Also, a set of k elements that
spans H forms a basis for H.
Theorem 2. - Invertible Matrix Theorem (continued) Let A be
an n × n square matrix. Then the following are equivalent.
a. A is invertible.
m. The columns of A form a basis for Rn.
n. rank A = n.
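The equivalence a ⇔ n gives a practical invertibility test; a minimal sketch (the two matrices below are made-up examples):

```python
import numpy as np

def invertible(A):
    # For a square matrix, invertibility <=> rank A = n (IMT, statement n).
    return np.linalg.matrix_rank(A) == A.shape[0]

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])          # det = 1: rank 2, invertible
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # column 2 = 2 x column 1: rank 1
print(invertible(A))                # True
print(invertible(B))                # False
```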