Principal Component Analysis - Stochastic Hydrology - Lecture Notes

The main points in stochastic hydrology covered below are: Principal Component Analysis, Identifying Patterns, Number of Dimensions, Matrix Algebra, Eigenvectors and Eigenvalues, Square Matrices, Non-Zero Column Vector, Matrix of Coefficients, Covariance Matrix.


PRINCIPAL COMPONENT ANALYSIS

  • PCA is a way of identifying patterns in the data; the data is
    expressed in such a way that the similarities and differences
    are highlighted.
  • Once the patterns are found in the data, it can be compressed
    (the number of dimensions reduced) without losing much
    information.

Matrix Algebra

  • λ is an eigenvalue of an n × n matrix A, with corresponding
    eigenvector X, if (A − λI)X = 0 with X ≠ 0; this leads to the
    characteristic equation |A − λI| = 0.
  • There are at most n distinct eigenvalues of A.

Example – 1

Obtain the eigenvalues and eigenvectors for the matrix

A = ⎡1 2⎤
    ⎣2 1⎦

The eigenvalues are obtained from the characteristic equation |A − λI| = 0:

⎢1−λ    2 ⎥
⎢ 2    1−λ⎥ = 0

(1 − λ)² − 4 = 0, which gives λ₁ = 3 and λ₂ = −1.

Example – 1 (Contd.)

For λ₁ = 3, the eigenvector is obtained from (A − λ₁I)X₁ = 0:

⎡1−3    2 ⎤ ⎡x₁⎤   ⎡0⎤
⎣ 2    1−3⎦ ⎣y₁⎦ = ⎣0⎦

⎡−2   2⎤ ⎡x₁⎤   ⎡0⎤
⎣ 2  −2⎦ ⎣y₁⎦ = ⎣0⎦

that is,

−2x₁ + 2y₁ = 0
 2x₁ − 2y₁ = 0

which has solution x₁ = y₁, x₁ arbitrary.

The eigenvectors corresponding to λ₁ = 3 are the vectors

⎡x₁⎤
⎣y₁⎦

with x₁ = y₁; e.g., if we take x₁ = 2 then y₁ = 2. The eigenvector is

⎡2⎤
⎣2⎦

Principal Component Analysis (PCA):

  • Data on p variables; these variables may be correlated.
  • Correlation indicates that information contained in one variable
    is also contained in some of the other variables.
  • PCA transforms the p original correlated variables into p
    uncorrelated components (also called orthogonal components or
    principal components).
  • These components are linear functions of the original variables.

The transformation is written as

Z = XA

where

  X is the n × p matrix of n observations on p variables,
  Z is the n × p matrix of n values for each of the p components,
  A is the p × p matrix of coefficients defining the linear
    transformation.

All X are assumed to be deviations from their respective means;
hence X is a matrix of deviations from the mean.
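In code, this step amounts to centering the data and multiplying by A. The sketch below uses made-up numbers purely to show the shapes involved; A is left as the identity matrix as a placeholder, since its actual columns (the eigenvectors of the covariance matrix) are only computed in the later steps.

import numpy as np

# Made-up observations (n = 5, p = 2), standing in for real data
X_raw = np.array([[105.0, 35.0],
                  [115.0, 41.0],
                  [ 98.0, 33.0],
                  [120.0, 44.0],
                  [110.0, 39.0]])

# Deviations from the respective column means
X = X_raw - X_raw.mean(axis=0)

# Placeholder coefficient matrix; in PCA its columns are the
# eigenvectors of the covariance matrix (Steps 4-5 below)
A = np.eye(2)

Z = X @ A          # n x p matrix of component values
print(Z.shape)     # (5, 2)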

The procedure is explained with a simple data set of the yearly
rainfall and the yearly runoff of a catchment for 15 years.

[Table: yearly Rainfall (cm) and Runoff (cm) for Years 1–15; the
numeric values are not legible in this preview.]

Mean of Rainfall = 108.5 cm
Mean of Runoff = 38.3 cm

Step 2: Form a matrix with deviations from mean

The original matrix of observations is converted into X, the matrix
of deviations from the mean. [The entries of both matrices are not
legible in this preview.]

Step 4: Calculate the eigenvalues and eigenvectors of the
covariance matrix

The eigenvalues are obtained from |S − λI| = 0:

λ₁ = 322.4 and λ₂ = 27.7

The eigenvectors are obtained from (S − λI)X = 0 and, placed as
columns, form the coefficient matrix

A = ⎡0.801  −0.599⎤
    ⎣0.599   0.801⎦
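A sketch of this step in NumPy is given below; X is the deviations matrix from Step 2, and the n − 1 denominator for the sample covariance is an assumption, since the preview does not show which convention the notes use.

import numpy as np

def covariance_eig(X):
    """Eigenvalues/eigenvectors of the sample covariance matrix of X.

    X: n x p matrix of deviations from the mean (Step 2).
    Returns eigenvalues in descending order and the matching
    eigenvectors as columns.
    """
    n = X.shape[0]
    S = X.T @ X / (n - 1)          # sample covariance matrix
    lam, vecs = np.linalg.eigh(S)  # ascending order for symmetric S
    order = np.argsort(lam)[::-1]  # re-sort descending
    return lam[order], vecs[:, order]

# With the lecture's 15-year data this gives lam = [322.4, 27.7] and
# eigenvector columns [0.801, 0.599] and [-0.599, 0.801] (each
# determined only up to sign).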

Step 5: Choose components and form a feature vector

The fraction of the total variance accounted for by the jth
principal component is

λⱼ / Trace(S)

where

Trace(S) = Σⱼ λⱼ = 322.4 + 27.7 = 350.1

The first component alone therefore accounts for 322.4 / 350.1 ≈ 92%
of the total variance.

From the two eigenvectors, the feature vector is selected. Since the
first component carries about 92% of the total variance, only the
first eigenvector is retained:

A = ⎡0.801⎤
    ⎣0.599⎦

Step 6: Derive the new data set

Z = XA

With the feature vector above, Z is the 15 × 1 matrix giving the
value of the first principal component for each year. [The numeric
values of Z are not legible in this preview.]