# Solution 1 Computational Intelligence Lab 2011

## Problem 1

a)

${\bar {X}}=X-M$

b)

$\Sigma ={\frac {1}{N}}{\bar {X}}{\bar {X}}^{T}$

c)

$\Sigma =U\Lambda U^{T}$

d)

${\bar {Z}}=U_{K}^{T}{\bar {X}}$

e)

${\tilde {X}}=U_{K}{\bar {Z}}+M$ (since the columns of $U_{K}$ are orthonormal, $U_{K}^{T}U_{K}=I$, so $U_{K}$ inverts the projection back onto its range)
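The steps a) through e) can be sketched numerically; a minimal NumPy version, assuming each column of $X$ is one observation and an arbitrary choice of $K$:

```python
import numpy as np

# Toy data: each column of X is one observation (D x N); sizes are assumed.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 200))
N = X.shape[1]
K = 2  # number of principal components to keep (assumed)

# a) Center the data: subtract the mean of each dimension.
M = X.mean(axis=1, keepdims=True)
X_bar = X - M

# b) Sample covariance matrix.
Sigma = (X_bar @ X_bar.T) / N

# c) Eigendecomposition Sigma = U Lambda U^T (eigh handles symmetric matrices).
eigvals, U = np.linalg.eigh(Sigma)
order = np.argsort(eigvals)[::-1]        # sort eigenvalues in descending order
eigvals, U = eigvals[order], U[:, order]

# d) Project the centered data onto the first K eigenvectors.
U_K = U[:, :K]
Z_bar = U_K.T @ X_bar

# e) Reconstruct (approximately) and add the mean back.
X_tilde = U_K @ Z_bar + M
```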

## Problem 2

a)

High, because the eigenvalue spectrum does not drop off quickly (there is no knee): many dimensions carry comparable variance.

b)

No, because the intrinsic dimensionality is high. The reconstruction error of PCA is the sum of the neglected eigenvalues, and since these are of the same order as the kept eigenvalues (for any $K$), the error will be large.
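This error criterion can be checked numerically; a small sketch (data and sizes are assumptions) showing that the mean squared reconstruction error equals the sum of the neglected eigenvalues:

```python
import numpy as np

# Verify: average squared PCA reconstruction error = sum of dropped eigenvalues.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 500))
N = X.shape[1]
X_bar = X - X.mean(axis=1, keepdims=True)
Sigma = (X_bar @ X_bar.T) / N

eigvals, U = np.linalg.eigh(Sigma)
order = np.argsort(eigvals)[::-1]        # descending eigenvalue order
eigvals, U = eigvals[order], U[:, order]

K = 3
U_K = U[:, :K]
X_rec = U_K @ (U_K.T @ X_bar)            # reconstruct the centered data
err = np.sum((X_bar - X_rec) ** 2) / N   # mean squared error per sample
neglected = eigvals[K:].sum()            # sum of the neglected eigenvalues
```

If the spectrum is flat, `eigvals[K:]` stays of the same order as `eigvals[:K]` for every `K`, which is exactly why compression fails here.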

c)

95

## Problem 3

1. B
2. E
3. C

## Problem 4

a)

• The transformed dimensions have no covariance between them (all off-diagonal entries vanish), so no redundancy remains -> maximal compression
• Computation for each dimension can be done separately (in parallel), because a diagonal covariance matrix means that the dimensions are uncorrelated

b)

$\Sigma _{Z}={\frac {1}{N}}ZZ^{T}$

Substituting $Z=A^{T}X$ (so $Z^{T}=X^{T}A$):

$\Sigma _{Z}={\frac {1}{N}}A^{T}XX^{T}A$

$\Sigma _{Z}=A^{T}({\frac {1}{N}}XX^{T})A$

$\Sigma _{Z}=A^{T}\Sigma _{X}A$

c)

$\Sigma _{Z}=A^{T}\Sigma _{X}A$

$\Sigma _{Z}=A^{T}U\Lambda U^{T}A$

Choosing $A=U$ and using the orthogonality of $U$ ($U^{T}U=I$):

$\Sigma _{Z}=U^{T}U\Lambda U^{T}U=\Lambda$
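Parts b) and c) can be verified numerically; a minimal sketch (data and sizes are assumptions) checking that choosing $A=U$ makes $\Sigma_{Z}$ diagonal:

```python
import numpy as np

# With A = U (the eigenvectors of Sigma_X), the covariance of Z = A^T X
# becomes the diagonal eigenvalue matrix Lambda.
rng = np.random.default_rng(2)
X = rng.normal(size=(4, 300))
N = X.shape[1]
Sigma_X = (X @ X.T) / N                  # covariance (X assumed centered)

eigvals, U = np.linalg.eigh(Sigma_X)     # Sigma_X = U Lambda U^T
A = U
Z = A.T @ X
Sigma_Z = (Z @ Z.T) / N                  # equals A^T Sigma_X A
```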