
# Introduction to Vectors and Matrices

Matrices. Definition: A matrix is a rectangular array of numbers or symbolic elements. In many applications, the rows of a matrix represent individual cases (people, items, plants, animals, ...) and the columns represent attributes or characteristics.



## Presentation transcript

Slide1

Introduction to Vectors and Matrices

Slide2

Matrices

Definition: A matrix is a rectangular array of numbers or symbolic elements. In many applications, the rows of a matrix represent individual cases (people, items, plants, animals, ...) and the columns represent attributes or characteristics. The dimension of a matrix is its number of rows and columns, often denoted r × c (r rows by c columns). A matrix can be represented in full form or abbreviated form:
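The definition above can be illustrated with a small NumPy sketch (the matrix values are illustrative, not from the slides): rows play the role of cases and columns the role of attributes, and the dimension r × c is the array's shape.

```python
import numpy as np

# A 3x2 matrix: 3 cases (rows) measured on 2 attributes (columns)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
print(A.shape)  # (3, 2): r = 3 rows, c = 2 columns
```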

Slide3

Special Types of Matrices

Slide4

Regression Examples - Carpet Data

Slide5

Slide6

Matrix Multiplication

Slide7

Matrix Multiplication Examples - I

Slide8

Matrix Multiplication Examples - II

Slide9

Special Matrix Types

Slide10

Linear Dependence and Rank of a Matrix

Linear dependence: a linear function of the columns (rows) of a matrix produces a zero vector; equivalently, one or more columns (rows) can be written as a linear function of the other columns (rows). Rank of a matrix: the number of linearly independent columns (rows) of the matrix. The rank cannot exceed the minimum of the number of rows and columns: rank(A) ≤ min(r_A, c_A). A matrix is full rank if rank(A) = min(r_A, c_A).
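A minimal NumPy sketch of linear dependence and rank (the matrix is illustrative, not from the slides): the third column is the sum of the first two, so the columns are linearly dependent and the rank falls below min(r_A, c_A).

```python
import numpy as np

# Third column = col1 + col2, so the columns are linearly dependent
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0]])
r = np.linalg.matrix_rank(A)
print(r)  # 2 < min(3, 3), so A is not full rank
```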

Slide11

Geometry of Vectors

A vector of order n is a point in n-dimensional space

The line running through the origin and the point represented by the vector defines a 1-dimensional subspace of the n-dim space

Any p linearly independent vectors of order n (p < n) define a p-dimensional subspace of the n-dim space

Any p+1 vectors in a p-dim subspace must have a linear dependency

Two vectors x and y are orthogonal if x′y = y′x = 0; they form a 90° angle at the origin

Two vectors x and y are linearly dependent if they form a 0° or 180° angle at the origin
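The two geometric facts above can be checked numerically (the vectors here are illustrative, not from the slides): a zero inner product means a 90° angle, and a scalar multiple means a 0° angle (cosine 1).

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([-2.0, 1.0])
print(np.dot(x, y))       # 0.0 -> orthogonal: 90-degree angle at the origin

u = 2.0 * x               # u is a scalar multiple of x: linearly dependent
cos_angle = np.dot(x, u) / (np.linalg.norm(x) * np.linalg.norm(u))
print(cos_angle)          # ~1.0 -> 0-degree angle at the origin
```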

Slide12

Geometry of Vectors - II

If two vectors each have mean 0 among their elements, then cos θ (where θ is the angle between them) is the product-moment correlation between the two vectors
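This identity is easy to verify numerically (the vectors below are illustrative, not from the slides): after centering each vector to mean 0, the cosine of the angle between them equals the product-moment correlation.

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0, -2.0])
y = np.array([2.0, 0.0, -1.0, -1.0])
x = x - x.mean()  # center to mean 0
y = y - y.mean()

cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
corr = np.corrcoef(x, y)[0, 1]
print(np.isclose(cos_theta, corr))  # True
```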

Slide13

Slide14

Matrix Inverse

Note: For scalars (except 0), multiplying a number by its reciprocal gives 1: 2(1/2) = 1, x(1/x) = x·x⁻¹ = 1. In matrix form, if A is a square matrix of full rank (all rows and columns are linearly independent), then A has an inverse A⁻¹ such that A⁻¹A = AA⁻¹ = I.
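The defining property A⁻¹A = AA⁻¹ = I can be checked directly with NumPy (the matrix is illustrative, not from the slides):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])               # square and full rank
A_inv = np.linalg.inv(A)
print(np.allclose(A_inv @ A, np.eye(2))) # True
print(np.allclose(A @ A_inv, np.eye(2))) # True
```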

Slide15

Computing the Inverse of a 2x2 Matrix
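The slide content is not reproduced in this transcript; the standard closed-form result is that for A = [[a, b], [c, d]] with ad − bc ≠ 0, the inverse is (1/(ad − bc)) · [[d, −b], [−c, a]]. A minimal sketch (matrix values are illustrative):

```python
import numpy as np

def inv2x2(A):
    """Inverse of a 2x2 matrix via the ad - bc (determinant) formula."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return (1.0 / det) * np.array([[d, -b], [-c, a]])

A = np.array([[2.0, 1.0], [1.0, 3.0]])
print(np.allclose(inv2x2(A), np.linalg.inv(A)))  # True
```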

Slide16

Use of Inverse Matrix – Solving Simultaneous Equations
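The slide content is not reproduced in this transcript; the idea is that a system Ax = b has solution x = A⁻¹b when A is full rank. A minimal sketch with an illustrative system (2x₁ + x₂ = 5, x₁ + 3x₂ = 10), using `np.linalg.solve`, which is preferred to forming the inverse explicitly:

```python
import numpy as np

# 2*x1 + x2 = 5  and  x1 + 3*x2 = 10, written as A x = b
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)                    # numerically preferred
print(x)                                     # [1. 3.]
print(np.allclose(np.linalg.inv(A) @ b, x))  # True: same as x = inv(A) b
```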

Slide17

Useful Matrix Results

Slide18

Orthogonal Matrices

Slide19

Eigenvalues and Eigenvectors
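The slide content is not reproduced in this transcript; the defining relation is Av = λv for each eigenvalue λ and eigenvector v. A minimal NumPy sketch (the matrix is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric, so eigenvalues are real
vals, vecs = np.linalg.eig(A)
# Each column v of vecs satisfies A v = lambda v
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))  # True for each eigenpair
```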

Slide20

Positive Definite Matrices / Spectral Decomposition
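The slide content is not reproduced in this transcript; for a symmetric matrix, the spectral decomposition writes A = QΛQ′ with orthogonal Q and diagonal Λ of eigenvalues, and A is positive definite when all eigenvalues are positive. A minimal sketch (the matrix is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])               # symmetric
vals, Q = np.linalg.eigh(A)              # eigh is for symmetric matrices
print(np.all(vals > 0))                  # True -> A is positive definite
# Spectral decomposition: A = Q diag(vals) Q'
print(np.allclose(Q @ np.diag(vals) @ Q.T, A))  # True
```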

Slide21

Slide22

Slide23

Square Root of a Positive Definite Square Matrix
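The slide content is not reproduced in this transcript; one standard construction is A^(1/2) = Q diag(√λ) Q′ from the spectral decomposition, which satisfies A^(1/2) A^(1/2) = A. A minimal sketch (the matrix is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                  # symmetric positive definite
vals, Q = np.linalg.eigh(A)
A_half = Q @ np.diag(np.sqrt(vals)) @ Q.T   # square root via spectral decomposition
print(np.allclose(A_half @ A_half, A))      # True
```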

Slide24

Mean, Variance, Covariance, Correlation

Slide25

Random Vectors and Matrices

Slide26

Mean and Variance of Linear Functions of X
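The slide content is not reproduced in this transcript; the standard results for a random vector X with mean vector μ and covariance matrix Σ are E(a′X) = a′μ and Var(a′X) = a′Σa. A minimal sketch with illustrative values:

```python
import numpy as np

mu = np.array([1.0, 2.0])          # mean vector of X
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])     # covariance matrix of X
a = np.array([3.0, -1.0])          # coefficients of the linear function a'X

print(a @ mu)          # 1.0  -> E(a'X) = a' mu
print(a @ Sigma @ a)   # 16.0 -> Var(a'X) = a' Sigma a
```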

Slide27

Standard Deviation and Correlation Matrices

LPGA “Population” Data:
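The LPGA data themselves are not reproduced in this transcript; the underlying recipe is that a correlation matrix is obtained from a covariance matrix Σ as R = D⁻¹ΣD⁻¹, where D is the diagonal matrix of standard deviations. A minimal sketch with an illustrative 2×2 covariance matrix:

```python
import numpy as np

Sigma = np.array([[4.0, 2.0],
                  [2.0, 9.0]])           # an illustrative covariance matrix
D = np.diag(np.sqrt(np.diag(Sigma)))     # standard deviation matrix
D_inv = np.linalg.inv(D)
R = D_inv @ Sigma @ D_inv                # correlation matrix: unit diagonal
print(R)
```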

Slide28

Partitioned Covariance Matrix

Slide29

Matrix Inequalities and Maximization

Slide30

Multivariate Normal Distribution