
Review of Linear Algebra


Vector spaces are the principal object of study in linear algebra. A vector space is always defined with respect to a field of scalars.

A field is a set F equipped with two operations, addition and multiplication, and containing two special members 0 and 1 (0 ≠ 1), such that for all a, b, c ∈ F

    1. a + b ∈ F
    2. a + b = b + a
    3. (a + b) + c = a + (b + c)
    4. a + 0 = a
    5. there exists −a such that a + (−a) = 0
    1. a · b ∈ F
    2. a · b = b · a
    3. (a · b) · c = a · (b · c)
    4. a · 1 = a
    5. for each a ≠ 0, there exists a⁻¹ such that a · a⁻¹ = 1
  1. a · (b + c) = a · b + a · c
More concisely
  1. F is an abelian group under addition
  2. F ∖ {0} is an abelian group under multiplication
  3. multiplication distributes over addition

Examples

ℚ, ℝ, ℂ

Let F be a field, and V a set. We say V is a vector space over F if there exist two operations, defined for all a ∈ F, u ∈ V, and v ∈ V:

  • vector addition: (u, v) → u + v ∈ V
  • scalar multiplication: (a, v) → a · v ∈ V
and if there exists an element denoted 0 ∈ V, such that the following hold for all a ∈ F, b ∈ F, and u ∈ V, v ∈ V, and w ∈ V
    1. u + (v + w) = (u + v) + w
    2. u + v = v + u
    3. u + 0 = u
    4. there exists −u such that u + (−u) = 0
    1. a · (u + v) = a · u + a · v
    2. (a + b) · u = a · u + b · u
    3. (a · b) · u = a · (b · u)
    4. 1 · u = u
More concisely,
  1. V is an abelian group under plus
  2. Natural properties of scalar multiplication

Examples

  • ℝᴺ is a vector space over ℝ
  • ℂᴺ is a vector space over ℂ
  • ℂᴺ is a vector space over ℝ
  • ℝᴺ is not a vector space over ℂ
The elements of V are called vectors.

Throughout this course we will think of a signal as a vector x = (x_1, x_2, …, x_N)ᵀ. The samples x_i could be samples from a finite-duration, continuous-time signal, for example.
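As a concrete illustration, such a signal vector can be built directly in NumPy (the cosine signal and N = 8 here are assumed examples, not from the course):

```python
import numpy as np

# An N-sample signal stored as a vector (example signal, assumed).
N = 8
t = np.arange(N) / N               # sample instants of a finite-duration signal
x = np.cos(2 * np.pi * t)          # the samples x_1, ..., x_N collected into a vector
print(x.shape)                     # (8,)
```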

A signal will belong to one of two vector spaces:

Real Euclidean space

x ∈ ℝᴺ (over ℝ)

Complex Euclidean space

x ∈ ℂᴺ (over ℂ)

Let V be a vector space over F.

A subset S V is called a subspace of V if S is a vector space over F in its own right.

V = ℝ², F = ℝ, S = any line through the origin.

Are there other subspaces?

S ⊆ V is a subspace if and only if for all a, b ∈ F and for all s, t ∈ S, a·s + b·t ∈ S.

Let u_1, …, u_k ∈ V.

We say that these vectors are linearly dependent if there exist scalars a_1, …, a_k ∈ F such that

∑_{i=1}^{k} a_i u_i = 0
and at least one a_i ≠ 0.

If this equation only holds for the case a_1 = … = a_k = 0, we say that the vectors are linearly independent.

1 · (1, −1, 2)ᵀ − 2 · (−2, 3, 0)ᵀ + 1 · (−5, 7, −2)ᵀ = 0, so these vectors are linearly dependent in ℝ³.
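This dependence can be checked numerically; the sketch below assumes the relation 1·(1, −1, 2)ᵀ − 2·(−2, 3, 0)ᵀ + 1·(−5, 7, −2)ᵀ = 0:

```python
import numpy as np

# Verify the linear dependence relation 1*v1 - 2*v2 + 1*v3 = 0.
v1 = np.array([1, -1, 2])
v2 = np.array([-2, 3, 0])
v3 = np.array([-5, 7, -2])
print(1 * v1 - 2 * v2 + 1 * v3)                       # [0 0 0]
# Equivalently: stacking the vectors as rows gives rank < 3.
print(np.linalg.matrix_rank(np.stack([v1, v2, v3])))  # 2
```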

Consider the subset S = {v_1, v_2, …, v_k}. Define the span of S as ⟨S⟩ ≡ span(S) ≡ { ∑_{i=1}^{k} a_i v_i | a_i ∈ F }.

Fact: ⟨S⟩ is a subspace of V.

V = ℝ³, F = ℝ, S = {v_1, v_2}, v_1 = (1, 0, 0)ᵀ, v_2 = (0, 1, 0)ᵀ ⇒ ⟨S⟩ = the xy-plane.

Aside

If S is infinite, the notions of linear independence and span are easily generalized:

We say S is linearly independent if, for every finite collection u_1, …, u_k ∈ S (k arbitrary), we have ∑_{i=1}^{k} a_i u_i = 0 ⇒ a_i = 0 for all i. The span of S is ⟨S⟩ = { ∑_{i=1}^{k} a_i u_i | a_i ∈ F, u_i ∈ S, k < ∞ }.

In both definitions, we only consider finite sums.

A set B ⊆ V is called a basis for V over F if and only if

  1. B is linearly independent
  2. ⟨B⟩ = V
Bases are of fundamental importance in signal processing. They allow us to decompose a signal into building blocks (basis vectors) that are often more easily understood.

V = (real or complex) Euclidean space, ℝᴺ or ℂᴺ. B = {e_1, …, e_N} is the standard basis, where e_i = (0, …, 0, 1, 0, …, 0)ᵀ with the 1 in the i-th position.

V = ℂᴺ over ℂ. B = {u_1, …, u_N} is the DFT basis, where u_k = (1, e^{j2πk/N}, …, e^{j2πk(N−1)/N})ᵀ and j = √(−1).
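A sketch of the DFT basis in NumPy (N = 4 and the e^{+j2πkn/N} sign convention are assumptions here):

```python
import numpy as np

# Column k of U is the DFT basis vector u_k, whose entry n is e^{j 2*pi*k*n/N}.
N = 4
n = np.arange(N)
U = np.exp(1j * 2 * np.pi * np.outer(n, n) / N)
# Distinct basis vectors are orthogonal in C^N:
print(abs(np.vdot(U[:, 0], U[:, 1])) < 1e-9)   # True
```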

Key Fact

If B = {v_1, …, v_N} is a basis for V, then every v ∈ V can be written uniquely (up to order of terms) in the form v = ∑_{i=1}^{N} a_i v_i, where a_i ∈ F and v_i ∈ B.
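A numerical illustration of this unique-representation fact (the basis below is an assumed example in ℝ³):

```python
import numpy as np

# Columns of B are basis vectors b1, b2, b3 for R^3 (B is invertible, so a basis).
B = np.column_stack([[1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [0.0, 1.0, 1.0]])
v = np.array([2.0, 3.0, 1.0])
a = np.linalg.solve(B, v)        # the unique coefficients with v = sum_i a_i b_i
print(np.allclose(B @ a, v))     # True
```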

Other Facts

  • If S is a linearly independent set, then S can be extended to a basis.
  • If < S > V , then S contains a basis.

Let V be a vector space with basis B. The dimension of V, denoted dim V , is the cardinality of B.

Every vector space has a basis.

Every basis for a vector space has the same cardinality.

dim V is well-defined.

If dim V < ∞, we say V is finite dimensional.

Examples

vector space    field of scalars    dimension
ℝᴺ              ℝ                   N
ℂᴺ              ℂ                   N
ℂᴺ              ℝ                   2N

Every subspace is a vector space, and therefore has its own dimension.

Suppose S = {u_1, …, u_k} ⊆ V is a linearly independent set. Then dim⟨S⟩ = k.

    Facts
  • If S is a subspace of V, then dim S ≤ dim V.
  • If dim S = dim V < ∞, then S = V.

Let V be a vector space, and let S V and T V be subspaces.

We say V is the direct sum of S and T, written V = S ⊕ T, if and only if for every v ∈ V, there exist unique s ∈ S and t ∈ T such that v = s + t.

If V = S ⊕ T, then T is called a complement of S.

V = C′ = {f : ℝ → ℝ | f is continuous}, S = even functions in C′, T = odd functions in C′. The decomposition f(t) = ½(f(t) + f(−t)) + ½(f(t) − f(−t)) exhibits f as a sum of an even and an odd function. If f = g + h = g′ + h′, with g, g′ ∈ S and h, h′ ∈ T, then g − g′ = h′ − h is both even and odd, which implies g = g′ and h = h′.
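Sampled on a symmetric grid, this even/odd split is easy to check numerically (f(t) = eᵗ is an assumed example function):

```python
import numpy as np

# f(t) = exp(t) decomposed into even part g and odd part h on a symmetric grid.
t = np.linspace(-1.0, 1.0, 201)
f = np.exp(t)
g = 0.5 * (np.exp(t) + np.exp(-t))   # even part: g(t) = g(-t)
h = 0.5 * (np.exp(t) - np.exp(-t))   # odd part:  h(t) = -h(-t)
print(np.allclose(g + h, f))         # True
print(np.allclose(g, g[::-1]))       # True: g is even
print(np.allclose(h, -h[::-1]))      # True: h is odd
```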

Facts

  1. Every subspace has a complement
  2. V = S ⊕ T if and only if
    1. S ∩ T = {0}
    2. ⟨S, T⟩ = V
  3. If V = S ⊕ T and dim V < ∞, then dim V = dim S + dim T

Proofs

Invoke a basis.

Let V be a vector space over F. A norm is a mapping ‖·‖ : V → ℝ such that for all u, v ∈ V and λ ∈ F

  1. ‖u‖ > 0 if u ≠ 0
  2. ‖λu‖ = |λ| ‖u‖
  3. ‖u + v‖ ≤ ‖u‖ + ‖v‖

Examples

Euclidean norms:

x ∈ ℝᴺ: ‖x‖ = (∑_{i=1}^{N} x_i²)^{1/2}

x ∈ ℂᴺ: ‖x‖ = (∑_{i=1}^{N} |x_i|²)^{1/2}
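These formulas agree with NumPy's np.linalg.norm (a quick sketch with assumed example vectors):

```python
import numpy as np

# Real Euclidean norm: sqrt of the sum of squares.
x = np.array([3.0, 4.0])
print(np.sqrt(np.sum(x**2)))           # 5.0
print(np.linalg.norm(x))               # 5.0

# Complex Euclidean norm: use the squared moduli |x_i|^2.
z = np.array([1 + 1j, 1 - 1j])
print(np.sqrt(np.sum(np.abs(z)**2)))   # 2.0
print(np.linalg.norm(z))               # 2.0
```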

Induced Metric

Every norm induces a metric on V, d(u, v) = ‖u − v‖, which leads to a notion of "distance" between vectors.

Let V be a vector space over F, F = ℝ or ℂ. An inner product is a mapping ⟨·, ·⟩ : V × V → F such that

  1. ⟨v, v⟩ ≥ 0, and ⟨v, v⟩ = 0 ⇔ v = 0
  2. ⟨u, v⟩ = ⟨v, u⟩* (where * denotes complex conjugation)
  3. ⟨a u + b v, w⟩ = a⟨u, w⟩ + b⟨v, w⟩

Examples

ℝᴺ over ℝ: ⟨x, y⟩ = yᵀx = ∑_{i=1}^{N} x_i y_i

ℂᴺ over ℂ: ⟨x, y⟩ = yᴴx = ∑_{i=1}^{N} x_i y_i*

If x = (x_1, …, x_N)ᵀ, then xᴴ = (x_1*, …, x_N*) is called the "Hermitian," or "conjugate transpose," of x.

If we define ‖u‖ = √⟨u, u⟩, then ‖u + v‖ ≤ ‖u‖ + ‖v‖. Hence, every inner product induces a norm.
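A quick numerical check of the induced norm (np.vdot conjugates its first argument; the vector is an assumed example):

```python
import numpy as np

x = np.array([1 + 2j, 3j])
ip = np.vdot(x, x)                # <x, x> = |1+2j|^2 + |3j|^2 = 14 (always real)
norm_x = np.sqrt(ip.real)         # the norm induced by the inner product
print(np.isclose(norm_x, np.linalg.norm(x)))   # True
```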

For all u, v ∈ V, |⟨u, v⟩| ≤ ‖u‖ ‖v‖ (the Cauchy–Schwarz inequality). In inner product spaces, we have a notion of the angle between two vectors: ∠(u, v) = arccos(⟨u, v⟩ / (‖u‖ ‖v‖)) ∈ [0, 2π).
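The Cauchy–Schwarz inequality can be spot-checked on random vectors (the seed and dimension are assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)
lhs = abs(np.dot(u, v))                       # |<u, v>|
rhs = np.linalg.norm(u) * np.linalg.norm(v)   # ||u|| ||v||
print(lhs <= rhs + 1e-12)                     # True
```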

u and v are orthogonal if ⟨u, v⟩ = 0. Notation: u ⊥ v.

If in addition ‖u‖ = ‖v‖ = 1, we say u and v are orthonormal.

In an orthogonal (orthonormal) set, each pair of vectors is orthogonal (orthonormal).

Orthogonal vectors in 2 .

An orthonormal basis is a basis {v_i} such that ⟨v_i, v_j⟩ = δ_ij = { 1, i = j; 0, i ≠ j }.

The standard basis for ℝᴺ or ℂᴺ

The normalized DFT basis: u_k = (1/√N) (1, e^{j2πk/N}, …, e^{j2πk(N−1)/N})ᵀ

If the representation of v with respect to {v_i} is v = ∑_i a_i v_i, then a_i = ⟨v, v_i⟩.
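With the normalized DFT basis as an example orthonormal basis, the coefficients are recovered by inner products (N = 4 and v are assumed examples):

```python
import numpy as np

# Columns of U form an orthonormal basis of C^N (normalized DFT basis, assumed +j convention).
N = 4
n = np.arange(N)
U = np.exp(1j * 2 * np.pi * np.outer(n, n) / N) / np.sqrt(N)
v = np.array([1.0, 2.0, 0.0, -1.0], dtype=complex)
a = U.conj().T @ v              # coefficient a_i: conjugated basis vector dotted with v
print(np.allclose(U @ a, v))    # True: v = sum_i a_i u_i
```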

Every inner product space has an orthonormal basis. Any (countable) basis can be made orthonormal by the Gram–Schmidt orthogonalization process.
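A minimal Gram–Schmidt sketch (the helper name gram_schmidt and the input vectors are assumptions for illustration):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in basis:
            w = w - np.dot(q, w) * q          # remove the component along q
        basis.append(w / np.linalg.norm(w))   # normalize the remainder
    return np.array(basis)

Q = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
print(np.allclose(Q @ Q.T, np.eye(2)))        # True: rows are orthonormal
```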

Let S ⊆ V be a subspace. The orthogonal complement of S is S⊥ = { u ∈ V | ⟨u, v⟩ = 0 for all v ∈ S }. S⊥ is easily seen to be a subspace.

If dim V < ∞, then V = S ⊕ S⊥.

Aside: If dim V = ∞, then in order to have V = S ⊕ S⊥ we require V to be a Hilbert space and S to be a closed subspace.
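In finite dimensions the decomposition v = s + s⊥ can be computed with an orthogonal projector (S = the xy-plane in ℝ³ is an assumed example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])                     # columns of A span S (the xy-plane)
v = np.array([2.0, -1.0, 3.0])
P = A @ np.linalg.inv(A.T @ A) @ A.T           # orthogonal projector onto S
s = P @ v                                      # component of v in S
s_perp = v - s                                 # component of v in S-perp
print(np.allclose(s, [2.0, -1.0, 0.0]))        # True
print(np.allclose(s_perp, [0.0, 0.0, 3.0]))    # True
print(np.isclose(np.dot(s, s_perp), 0.0))      # True: the parts are orthogonal
```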

Loosely speaking, a linear transformation is a mapping from one vector space to another that preserves vector space operations.

More precisely, let V, W be vector spaces over the same field F. A linear transformation is a mapping T : V → W such that T(a u + b v) = a T(u) + b T(v) for all a, b ∈ F and u, v ∈ V.

In this class we will be concerned with linear transformations between (real or complex) Euclidean spaces, or subspaces thereof.

The image of T is T(V) = { w ∈ W | T(v) = w for some v ∈ V }.

The nullspace of T, also known as the kernel: ker(T) = { v ∈ V | T(v) = 0 }.

Both the image and the nullspace are easily seen to be subspaces.

rank(T) ≡ dim(T(V))

null(T) ≡ dim(ker(T))

rank(T) + null(T) = dim V
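The rank-nullity relation can be sketched numerically using an assumed 3×4 matrix for T : ℝ⁴ → ℝ³:

```python
import numpy as np

T = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])   # third row = first + second, so rank(T) = 2
rank = np.linalg.matrix_rank(T)
nullity = T.shape[1] - rank            # dim ker(T), by rank-nullity with dim V = 4
print(rank, nullity)                   # 2 2
```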

Every linear transformation T has a matrix representation. If T : ℝᴺ → ℝᴹ (or ℂᴺ → ℂᴹ), then T(v) = Av for some M × N matrix A, whose i-th column is T(e_i).