Vector

What is a vector?

  • The Physics definition - A vector is a quantity with both magnitude and direction.
  • The Computer Science definition - A vector is an ordered tuple of components, where each component is a scalar value representing some parameter.
  • The Mathematics definition - A vector is an element of a vector space over a field closed under addition and scalar multiplication.

We are interested in the Mathematical definition of a vector when studying Linear Algebra. So we need to understand what fields and vector spaces are.

Field

A field F is a non-empty collection of elements with two operations, addition (+) and multiplication (·), where:

The additive structure (F, +) is an abelian group:

  1. Closure: a + b ∈ F for all a, b ∈ F
  2. Associativity: (a + b) + c = a + (b + c)
  3. Identity: there exists 0 ∈ F with a + 0 = a
  4. Inverse: for every a there exists −a with a + (−a) = 0
  5. Commutativity: a + b = b + a

The multiplicative structure (F \ {0}, ·) is an abelian group:

  6. Closure: a · b ∈ F
  7. Associativity: (a · b) · c = a · (b · c)
  8. Identity: there exists 1 ∈ F with a · 1 = a
  9. Inverse: for every a ≠ 0 there exists a⁻¹ with a · a⁻¹ = 1
  10. Commutativity: a · b = b · a

The link between them:

  11. Distributivity: a · (b + c) = a · b + a · c

For example, ℚ, ℝ, and ℂ are fields; ℤ is not, since it lacks multiplicative inverses.
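The axioms above can be checked by brute force on a small example. Below is a minimal Python sketch (my own illustration, not from the notes) verifying them for the finite field GF(5), the integers {0, 1, 2, 3, 4} with arithmetic mod 5:

```python
# Brute-force check of the field axioms on GF(5): integers mod a prime.
P = 5
F = range(P)

def add(a, b):
    return (a + b) % P

def mul(a, b):
    return (a * b) % P

# Additive abelian group: closure, associativity, identity 0, inverses, commutativity.
assert all(add(a, b) in F for a in F for b in F)
assert all(add(add(a, b), c) == add(a, add(b, c)) for a in F for b in F for c in F)
assert all(add(a, 0) == a for a in F)
assert all(any(add(a, b) == 0 for b in F) for a in F)
assert all(add(a, b) == add(b, a) for a in F for b in F)

# Multiplicative abelian group on the nonzero elements: identity 1, inverses.
nonzero = [a for a in F if a != 0]
assert all(mul(a, 1) == a for a in F)
assert all(any(mul(a, b) == 1 for b in nonzero) for a in nonzero)
assert all(mul(a, b) == mul(b, a) for a in F for b in F)

# Distributivity links the two structures.
assert all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
           for a in F for b in F for c in F)
print("GF(5) satisfies the field axioms")
```

The same loops would fail on ℤ mod 6, where 2 and 3 have no multiplicative inverse, which is why the modulus must be prime.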

Vector Space

A vector space V over a field F is a collection of elements (vectors) closed under -

  • Vector Addition
  • Scalar Multiplication

V is not itself a field, but a vector space "over a field F". This means that the scalars for scalar multiplication come from F, not from the components of the vectors in V. V is an abelian group under vector addition and is closed under scalar multiplication by elements of F.

Example - ℝ² over ℝ is a vector space where the two components of each vector come from ℝ, while the scalars for multiplication also come from ℝ.

Vector Addition for u = (u₁, u₂) and v = (v₁, v₂) in ℝ²:

  1. Closure: u + v = (u₁ + v₁, u₂ + v₂) ∈ ℝ²
  2. Associativity: (u + v) + w = u + (v + w)
  3. Identity: the zero vector 0 = (0, 0) satisfies v + 0 = v
  4. Inverse: −v = (−v₁, −v₂) satisfies v + (−v) = 0
  5. Commutativity: u + v = v + u

Scalar Multiplication:

  6. Closure: c · v = (c·v₁, c·v₂) ∈ ℝ²
  7. Associativity: a · (b · v) = (a · b) · v
  8. Identity: 1 · v = v
  9. Inverse: Because associativity works, with c ≠ 0 we can satisfy inversion: (1/c) · (c · v) = ((1/c) · c) · v = 1 · v = v

Link between them:

  10. Distributivity: c · (u + v) = c · u + c · v and (a + b) · v = a · v + b · v
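As a quick numeric spot-check (not a proof), the ℝ² axioms listed above can be exercised on sample vectors represented as plain Python tuples; the helper names vadd and smul and the sample values are my own:

```python
# Spot-check the vector-space axioms of R^2 on sample values.
def vadd(u, v):
    """Componentwise vector addition."""
    return (u[0] + v[0], u[1] + v[1])

def smul(c, v):
    """Scalar multiplication."""
    return (c * v[0], c * v[1])

u, v, w = (1.0, 2.0), (3.0, -1.0), (0.5, 4.0)
a, b = 2.0, -3.0

assert vadd(vadd(u, v), w) == vadd(u, vadd(v, w))           # associativity
assert vadd(v, (0.0, 0.0)) == v                             # identity
assert vadd(v, smul(-1.0, v)) == (0.0, 0.0)                 # inverse
assert vadd(u, v) == vadd(v, u)                             # commutativity
assert smul(a, smul(b, v)) == smul(a * b, v)                # scalar associativity
assert smul(1.0, v) == v                                    # scalar identity
assert smul(a, vadd(u, v)) == vadd(smul(a, u), smul(a, v))  # distributivity
print("all axioms hold on these samples")
```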

Subspaces

Any subset W of the vector space V which by itself satisfies the conditions of a vector space is a subspace of V.

  • V is a trivial subspace of V.
  • The zero vector alone, {0}, is also a trivial subspace of V.
  • Any subset of ℝⁿ forming a linear geometric structure (a line, a plane, and so on) that passes through the origin is a subspace of ℝⁿ.
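The last bullet can be illustrated concretely: in ℝ², the line y = 2x through the origin is closed under addition and scaling, while the shifted line y = 2x + 1 is not. A small Python sketch with sample points of my own choosing:

```python
# Only the line through the origin survives the closure tests.
def on_line(p):
    """Membership in the line y = 2x (passes through the origin)."""
    return p[1] == 2 * p[0]

def on_shifted(p):
    """Membership in the line y = 2x + 1 (misses the origin)."""
    return p[1] == 2 * p[0] + 1

p, q = (1, 2), (3, 6)                        # both on y = 2x
s = (p[0] + q[0], p[1] + q[1])               # their sum stays on the line
assert on_line(s)
assert on_line((5 * p[0], 5 * p[1]))         # scaling stays on the line

p2, q2 = (0, 1), (1, 3)                      # both on y = 2x + 1
s2 = (p2[0] + q2[0], p2[1] + q2[1])          # their sum falls off the line
assert not on_shifted(s2)
assert not on_shifted((0, 0))                # the origin is not even in the set
```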

Linear Combinations

Suppose we have vectors v₁, v₂, …, vₙ and scalars c₁, c₂, …, cₙ, each corresponding to a vector. A resultant vector w is a linear combination of the vectors when -

w = c₁v₁ + c₂v₂ + … + cₙvₙ

When,

  • c₁ = c₂ = … = cₙ = 1, w is the sum of all vectors.
  • c₁ = c₂ = … = cₙ = 1/n, w is the average of all vectors.
  • c₁ + c₂ + … + cₙ = 1, w is an affine combination of the vectors.

Affine Combination

When the sum of all coefficients/scalars in a linear combination adds up to 1, we call such a linear combination an affine combination.

If, in addition, all coefficients are non-negative, we call this combination a Weighted Average.
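These special cases can be computed directly. A short Python sketch (the helper name lincomb and the sample vectors are my own), using exact fractions to sidestep floating-point rounding:

```python
# Sum, average, and affine combination of three sample vectors in R^2.
from fractions import Fraction as Fr

def lincomb(coeffs, vectors):
    """Componentwise sum of c_i * v_i over the given vectors."""
    return tuple(sum(c * v[k] for c, v in zip(coeffs, vectors))
                 for k in range(len(vectors[0])))

vs = [(Fr(1), Fr(0)), (Fr(0), Fr(1)), (Fr(2), Fr(2))]

total   = lincomb([Fr(1)] * 3, vs)                      # all c_i = 1   -> sum
average = lincomb([Fr(1, 3)] * 3, vs)                   # all c_i = 1/n -> average
affine  = lincomb([Fr(1, 2), Fr(1, 4), Fr(1, 4)], vs)   # coefficients sum to 1

assert total == (3, 3)
assert average == (1, 1)
assert sum([Fr(1, 2), Fr(1, 4), Fr(1, 4)]) == 1         # the affine condition
```

Here the affine coefficients are also all non-negative, so this particular combination is a weighted average as well.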

Linear Dependence

If some linear combination of vectors results in the zero vector 0 such that not all coefficients were 0, we say the vectors are linearly dependent.

  • If any set of vectors contains the zero vector, then this set is always a linearly dependent set.

If the only way to get the zero vector by performing a linear combination on the vectors is by setting all coefficients as 0, we say the vectors are linearly independent.

  • A linearly independent set cannot contain the zero vector.
  • A single vector is always linearly independent, unless it is the zero vector.
  • Any subset of a linearly independent set is always linearly independent.
  • Any superset of a linearly dependent set is always linearly dependent.
  • Two vectors are linearly independent if and only if neither is a scalar multiple of the other.
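For two vectors in ℝ², the scalar-multiple test is equivalent to checking that the 2×2 determinant u₁v₂ − u₂v₁ is zero. A small Python sketch (the function name and sample vectors are my own):

```python
# Two vectors in R^2 are linearly dependent iff their 2x2 determinant vanishes.
def dependent_2d(u, v):
    """True when u and v are linearly dependent (one is a multiple of the other)."""
    return u[0] * v[1] - u[1] * v[0] == 0

assert dependent_2d((1, 2), (2, 4))        # v = 2u -> dependent
assert dependent_2d((0, 0), (3, 5))        # any set containing 0 is dependent
assert not dependent_2d((1, 0), (0, 1))    # the standard basis -> independent
```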

Span, Basis, and Dimension

Span

The span of a set of vectors is the set of all possible linear combinations of those vectors.

  • The span of a set of vectors is a vector space.
  • Let S = {v₁, v₂, …, vₖ} be a set of linearly independent n-component vectors. The smallest subspace containing S is span(S).

Basis

The basis of a vector space is a set of linearly independent vectors whose span is the entire vector space.

  • In the example under Span, S is a basis of span(S).
  • The basis of a vector space need not be unique. ℝⁿ has infinitely many bases.

Uniqueness of Representation Theorem

This theorem states – Any vector in a vector space always has a unique representation in terms of the basis.

Let B = {v₁, v₂, …, vₙ} be a set of linearly independent vectors. Let,

w = c₁v₁ + c₂v₂ + … + cₙvₙ

Let w be represented using a different set of coefficients,

w = d₁v₁ + d₂v₂ + … + dₙvₙ

Doing w − w = 0, we get

(c₁ − d₁)v₁ + (c₂ − d₂)v₂ + … + (cₙ − dₙ)vₙ = 0

As B is a set of linearly independent vectors, their linear combination can only be 0 when all the coefficients are 0. Thus cᵢ = dᵢ for every i. So, any vector w, when expressed as a linear combination of a linearly independent set of vectors B, has a unique set of scalars corresponding to each vector in B.

Thus we can say that any vector in a vector space has a unique representation in terms of the basis vectors.
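The theorem can be seen concretely in ℝ². For the basis B = {(1, 0), (1, 1)} (my own example), Cramer's rule produces the one and only coefficient pair for a target vector, because independence makes the 2×2 determinant non-zero:

```python
# Unique coordinates of a vector w in a basis {b1, b2} of R^2 via Cramer's rule.
from fractions import Fraction as Fr

def coords(w, b1, b2):
    """Return the unique (a1, a2) with w = a1*b1 + a2*b2."""
    det = b1[0] * b2[1] - b1[1] * b2[0]
    assert det != 0                      # independence <=> non-zero determinant
    a1 = Fr(w[0] * b2[1] - w[1] * b2[0], det)
    a2 = Fr(b1[0] * w[1] - b1[1] * w[0], det)
    return a1, a2

b1, b2 = (1, 0), (1, 1)
a1, a2 = coords((3, 5), b1, b2)
# Verify the reconstruction: -2*(1,0) + 5*(1,1) = (3,5).
assert (a1 * b1[0] + a2 * b2[0], a1 * b1[1] + a2 * b2[1]) == (3, 5)
assert (a1, a2) == (-2, 5)               # the one and only representation
```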

Dimension

The number of elements/vectors in the basis of a vector space is called the dimension of the vector space.

  • The basis for a vector space need not be unique, but the dimension of a vector space is always unique.

Hyperplanes

Linear Hyperplane

A linear hyperplane in ℝⁿ is:

H = {x ∈ ℝⁿ : a · x = 0}, for some fixed non-zero normal vector a ∈ ℝⁿ

Properties -

  1. It is a linear subspace that always passes through the origin.
  2. For any normal vector a there exist n − 1 basis vectors for the subspace orthogonal to it. These are the n − 1 degrees of freedom.
  3. Here H is the orthogonal complement of span{a}, so dim H = n − 1.
  4. Geometrically, this is a flat (n − 1)-dimensional subspace passing through the origin.
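A concrete sketch in ℝ³ (the normal vector a = (1, 2, 3) and the sample points are my own): the set {x : a · x = 0} contains the origin and is closed under the vector-space operations, as the properties above state.

```python
# Membership and closure checks for the linear hyperplane a . x = 0 in R^3.
a = (1, 2, 3)

def in_H(x):
    """True when x lies on the hyperplane, i.e. a . x = 0."""
    return sum(ai * xi for ai, xi in zip(a, x)) == 0

x, y = (2, -1, 0), (3, 0, -1)                          # two points with a . x = 0
assert in_H((0, 0, 0))                                  # passes through the origin
assert in_H(x) and in_H(y)
assert in_H(tuple(xi + yi for xi, yi in zip(x, y)))     # closed under addition
assert in_H(tuple(5 * xi for xi in x))                  # closed under scaling
```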

Linear functions

Linear functions map a linear subspace to a linear subspace and have f(0) = 0. Equivalently, they satisfy f(c·x + d·y) = c·f(x) + d·f(y).

Affine Hyperplane

An affine hyperplane in ℝⁿ is:

H = {x ∈ ℝⁿ : a · x = b}, for some non-zero a ∈ ℝⁿ and b ∈ ℝ

Properties -

  1. Not a subspace of ℝⁿ unless b = 0.
  2. Still has n − 1 degrees of freedom, plus some constant offset vector.
  3. Here H is a translation of a linear hyperplane by some x₀ with a · x₀ = b: H = x₀ + {x : a · x = 0}.
  4. Geometrically, this is a flat (n − 1)-dimensional surface shifted away from the origin.

An affine subspace is a shifted version of a linear subspace.

Affine functions

An affine function maps a linear subspace to an affine set and has the form f(x) = A·x + b, so f(0) = b. An affine set is of the form x₀ + W, i.e. a linear subspace W shifted by a fixed vector x₀.
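A minimal sketch of an affine map f(x) = Ax + b on ℝ² (the particular A and b are my own samples); note that f(0) = b, so f is affine rather than linear:

```python
# An affine map f(x) = A x + b on R^2, evaluated on sample inputs.
A = ((2, 0), (1, 1))          # rows of a 2x2 matrix
b = (1, -1)

def f(x):
    """Matrix-vector product A x followed by the translation b."""
    return tuple(sum(A[i][j] * x[j] for j in range(2)) + b[i]
                 for i in range(2))

assert f((0, 0)) == (1, -1)   # f(0) = b, not the zero vector, so f is not linear
assert f((1, 2)) == (3, 2)    # A(1,2) = (2,3), then shifted by b
```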