Introduction to Vectors and Matrices
Matrices
Definition: A matrix is a rectangular array of numbers or symbolic elements.
In many applications, the rows of a matrix represent individual cases (people, items, plants, animals, ...) and the columns represent attributes or characteristics.
The dimension of a matrix is its number of rows and columns, often denoted r x c (r rows by c columns).
A matrix can be represented in full form (all elements written out) or in abbreviated form.
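As an illustrative sketch (NumPy is assumed here; the slides do not prescribe any software), a matrix with 3 rows (cases) and 2 columns (attributes) has dimension r x c = 3 x 2:

```python
import numpy as np

# A 3 x 2 matrix: 3 rows (cases), 2 columns (attributes)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

r, c = A.shape   # dimension r x c
print(r, c)      # 3 2
```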
Special Types of Matrices
Regression Examples - Carpet Data
Matrix Addition and Subtraction
Matrix Multiplication
Matrix Multiplication Examples - I
Matrix Multiplication Examples - II
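The addition, subtraction, and multiplication operations named on the preceding slides can be sketched in NumPy (the matrices below are illustrative, not the deck's examples). Addition and subtraction are element-wise and require equal dimensions; multiplying an (r x c) matrix by a (c x k) matrix requires the inner dimensions to match, and the result is r x k:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Addition/subtraction: element-wise; A and B must have the same dimension
print(A + B)    # [[ 6  8] [10 12]]
print(A - B)

# Multiplication: element (i, j) of A @ B is the sum of products of
# row i of A with column j of B
print(A @ B)    # [[19 22] [43 50]]

# Matrix multiplication is generally not commutative
print(np.array_equal(A @ B, B @ A))   # False
```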
Special Matrix Types
Linear Dependence and Rank of a Matrix
Linear dependence: a linear function of the columns (rows) of a matrix produces a zero vector; equivalently, one or more columns (rows) can be written as a linear function of the other columns (rows).
Rank of a matrix: the number of linearly independent columns (rows) of the matrix. The rank cannot exceed the smaller of the number of rows and columns: rank(A) ≤ min(rA, cA). A matrix is full rank if rank(A) = min(rA, cA).
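A small illustration of rank and linear dependence (NumPy assumed; the matrix is made up for this sketch):

```python
import numpy as np

# The third column equals column 1 + column 2,
# so the columns are linearly dependent
A = np.array([[1.0, 2.0,  3.0],
              [4.0, 5.0,  9.0],
              [7.0, 8.0, 15.0]])

print(np.linalg.matrix_rank(A))   # 2 < min(3, 3), so A is not full rank

B = np.eye(3)                     # 3 x 3 identity matrix
print(np.linalg.matrix_rank(B))   # 3 = min(3, 3), so B is full rank
```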
Geometry of Vectors
A vector of order n is a point in n-dimensional space
The line running through the origin and the point represented by the vector defines a 1-dimensional subspace of the n-dimensional space.
Any p linearly independent vectors of order n (p < n) define a p-dimensional subspace of the n-dimensional space.
Any p + 1 vectors in a p-dimensional subspace must have a linear dependency.
Two vectors x and y are orthogonal if x'y = y'x = 0; they form a 90° angle at the origin.
Two vectors x and y are linearly dependent if they form a 0° or 180° angle at the origin.
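These geometric facts can be checked numerically (NumPy assumed; vectors chosen for illustration). The angle between x and y comes from cos θ = x'y / (||x|| ||y||):

```python
import numpy as np

x = np.array([1.0, 1.0])
y = np.array([1.0, -1.0])

# Orthogonal: x'y = 0
print(x @ y)   # 0.0

# Angle at the origin: cos(theta) = x'y / (||x|| ||y||)
cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(round(np.degrees(np.arccos(cos_theta))))   # 90

# Linearly dependent vectors (z = -2x) form a 180-degree angle
z = -2.0 * x
cos_theta = (x @ z) / (np.linalg.norm(x) * np.linalg.norm(z))
cos_theta = np.clip(cos_theta, -1.0, 1.0)        # guard against rounding
print(round(np.degrees(np.arccos(cos_theta))))   # 180
```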
Geometry of Vectors - II
If two vectors each have mean 0 among their elements, then cos θ (the cosine of the angle between them) is the product-moment correlation between the two vectors.
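This identity is easy to verify: center each vector to mean 0, compute cos θ, and compare with the product-moment correlation (NumPy assumed; data made up for the sketch):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0])

# Center each vector so its elements have mean 0
xc = x - x.mean()
yc = y - y.mean()

# Cosine of the angle between the centered vectors ...
cos_theta = (xc @ yc) / (np.linalg.norm(xc) * np.linalg.norm(yc))

# ... equals the product-moment correlation r
r = np.corrcoef(x, y)[0, 1]
print(np.isclose(cos_theta, r))   # True
```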
Matrix Inverse
Note: For scalars (except 0), multiplying a number by its reciprocal gives 1: 2(1/2) = 1, x(1/x) = x·x⁻¹ = 1.
In matrix form, if A is a square matrix of full rank (all rows and columns are linearly independent), then A has an inverse A⁻¹ such that:
A⁻¹A = AA⁻¹ = I
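The defining property A⁻¹A = AA⁻¹ = I can be checked directly (NumPy assumed; the matrix is an arbitrary full-rank example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # square and full rank

A_inv = np.linalg.inv(A)

# A_inv @ A = A @ A_inv = I
print(np.allclose(A_inv @ A, np.eye(2)))   # True
print(np.allclose(A @ A_inv, np.eye(2)))   # True
```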
Computing an Inverse of a 2x2 Matrix
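For a 2x2 matrix A = [[a, b], [c, d]], the standard closed-form inverse is A⁻¹ = (1 / (ad - bc)) [[d, -b], [-c, a]], defined whenever the determinant ad - bc is nonzero. A sketch of that formula (the function name is mine, not the deck's):

```python
import numpy as np

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via the determinant formula."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c           # determinant; must be nonzero (full rank)
    if det == 0:
        raise ValueError("matrix is singular")
    return (1.0 / det) * np.array([[ d, -b],
                                   [-c,  a]])

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(np.allclose(inverse_2x2(A) @ A, np.eye(2)))   # True
```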
Use of Inverse Matrix - Solving Simultaneous Equations
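A system of simultaneous equations Ab = c has the solution b = A⁻¹c when A is full rank. A sketch with a made-up system (2x + y = 5, x + 3y = 10, whose solution is x = 1, y = 3):

```python
import numpy as np

# Solve   2x +  y = 5
#          x + 3y = 10
# written as A b = c, so b = A^-1 c
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
c = np.array([5.0, 10.0])

b = np.linalg.inv(A) @ c
print(b)    # x = 1, y = 3

# In practice np.linalg.solve is preferred (no explicit inverse is formed)
print(np.allclose(np.linalg.solve(A, c), b))   # True
```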
Useful Matrix Results
Orthogonal Matrices
Eigenvalues and Eigenvectors
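An eigenvalue λ and eigenvector v of a square matrix A satisfy Av = λv. A sketch with a symmetric example matrix (NumPy assumed), where `eigh` returns real eigenvalues and orthonormal eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric

# eigh is appropriate for symmetric matrices
eigvals, eigvecs = np.linalg.eigh(A)
print(eigvals)   # eigenvalues 1 and 3

# Each (lambda, v) pair satisfies A v = lambda v
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True
```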
Positive Definite Matrices / Spectral Decomposition
Distance as a Quadratic Form
Square Root of a Positive Definite Square Matrix
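Using the spectral decomposition A = P diag(λ) P', the symmetric square root of a positive definite matrix is A^(1/2) = P diag(√λ) P', which satisfies A^(1/2) A^(1/2) = A. A sketch (NumPy assumed, example matrix made up):

```python
import numpy as np

# A positive definite matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Spectral decomposition: A = P diag(lambda) P'
eigvals, P = np.linalg.eigh(A)

# Square root: A^(1/2) = P diag(sqrt(lambda)) P'
A_half = P @ np.diag(np.sqrt(eigvals)) @ P.T

# Check: A^(1/2) A^(1/2) = A
print(np.allclose(A_half @ A_half, A))   # True
```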
Mean, Variance, Covariance, Correlation
Random Vectors and Matrices
Mean and Variance of Linear Functions of X
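For a random vector X with mean μ and covariance Σ, the linear function Y = AX has E(Y) = Aμ and Var(Y) = AΣA'. A sketch that checks these rules against simulated data (NumPy assumed; μ, Σ, and A are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random vector X with mean mu and covariance Sigma
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

# Linear function Y = A X
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# Theory: E(Y) = A mu, Var(Y) = A Sigma A'
print(A @ mu)            # theoretical mean of Y
print(A @ Sigma @ A.T)   # theoretical covariance of Y

# Monte Carlo check (approximate, large sample)
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = X @ A.T
print(np.allclose(Y.mean(axis=0), A @ mu, atol=0.05))        # True
print(np.allclose(np.cov(Y.T), A @ Sigma @ A.T, atol=0.1))   # True
```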
Standard Deviation and Correlation Matrices
LPGA “Population” Data:
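A covariance matrix Σ converts to a correlation matrix via R = D⁻¹ΣD⁻¹, where D is the diagonal matrix of standard deviations (square roots of the diagonal of Σ). The LPGA data itself is not reproduced in this extract, so the sketch below uses a hypothetical covariance matrix:

```python
import numpy as np

# Hypothetical covariance matrix (stand-in for the LPGA example)
Sigma = np.array([[4.0, 1.2, 0.6],
                  [1.2, 9.0, 2.1],
                  [0.6, 2.1, 1.0]])

# D = diagonal matrix of standard deviations
D = np.diag(np.sqrt(np.diag(Sigma)))
D_inv = np.linalg.inv(D)

# Correlation matrix: R = D^-1 Sigma D^-1
R = D_inv @ Sigma @ D_inv
print(np.diag(R))            # all 1's on the diagonal
print(np.allclose(R, R.T))   # True (symmetric)
```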
Partitioned Covariance Matrix
Matrix Inequalities and Maximization
Multivariate Normal Distribution