\begin{block}{Definition: Eigenvalue and eigenvector}
Suppose $A \in\mathbb{R}^{d \times d}$ is a symmetric matrix, i.e.\ $A^T = A$. A scalar $\lambda\in\mathbb{R}$ is called an \textbf{eigenvalue} of $A$ if there exists a vector $u \in\mathbb{R}^{d}$, $u \neq 0$, such that $Au =\lambda u$. Such a $u$ is called an \textbf{eigenvector} of $A$ corresponding to $\lambda$.
\end{block}
}
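\frame
{
\frametitle{Example}
As a small concrete example, the symmetric matrix
$$
A =
\begin{bmatrix}
2 & 1\\
1 & 2
\end{bmatrix}
$$
has eigenvalues $\lambda_1 = 3$ and $\lambda_2 = 1$, with eigenvectors
$$
u_1 = \frac{1}{\sqrt{2}}
\begin{bmatrix}
1\\
1
\end{bmatrix},
\qquad
u_2 = \frac{1}{\sqrt{2}}
\begin{bmatrix}
1\\
-1
\end{bmatrix},
$$
since $Au_1 = 3u_1$ and $Au_2 = u_2$.
}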
\frame
{
\frametitle{Eigenvalues}
To make the calculations easier to handle, let us collect the eigenvectors $u_1, \dots, u_d$ and the corresponding eigenvalues $\lambda_1, \dots, \lambda_d$ into matrices:
$$
U =[u_1, \dots, u_d]\in\mathbb{R}^{d \times d},
$$
$$
\Lambda= \mathrm{diag}(\lambda_1,\dots,\lambda_d)=
\begin{bmatrix}
\lambda_1&&0\\
&\ddots&\\
0&&\lambda_d
\end{bmatrix}
\in\mathbb{R}^{d \times d}
$$
}
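\frame
{
\frametitle{Example (continued)}
For the $2 \times 2$ example above,
$$
U = \frac{1}{\sqrt{2}}
\begin{bmatrix}
1 & 1\\
1 & -1
\end{bmatrix},
\qquad
\Lambda =
\begin{bmatrix}
3 & 0\\
0 & 1
\end{bmatrix},
$$
and one can check directly that $U^T U = U U^T = I$.
}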
\frame
{
\frametitle{Eigenvalue decomposition}
Now, since $Au_i = \lambda_i u_i$ for every $i = 1, \dots, d$, the $d$ eigenvalue equations can be written as a single matrix equation
$$
AU = U\Lambda,
$$
which, after multiplying both sides from the right by $U^T$, gives us
$$
AUU^T = U\Lambda U^T.
$$
Because the eigenvectors of a symmetric matrix can be chosen to be orthonormal, $U$ is an orthogonal matrix: $U^T U = U U^T = I$, i.e.\ $U^{-1} = U^T$. The left-hand side therefore reduces to $A$, and the \textbf{eigendecomposition} of the symmetric matrix $A$ is
$$
A = U \Lambda U^T.
$$
}
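\begin{frame}[fragile]
\frametitle{Eigendecomposition numerically}
A minimal sketch, assuming NumPy is available: \texttt{numpy.linalg.eigh} is the routine for symmetric matrices and returns the eigenvalues together with orthonormal eigenvectors as the columns of $U$.
\begin{verbatim}
import numpy as np

# Symmetric example matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is meant for symmetric (Hermitian) matrices;
# it returns eigenvalues in ascending order and the
# corresponding orthonormal eigenvectors as columns of U.
lam, U = np.linalg.eigh(A)
Lambda = np.diag(lam)

# Verify the eigendecomposition A = U Lambda U^T
print(np.allclose(A, U @ Lambda @ U.T))  # True

# Verify that U is orthogonal: U U^T = I
print(np.allclose(U @ U.T, np.eye(2)))   # True
\end{verbatim}
\end{frame}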
\section{Principal component analysis}
\frame
{
\frametitle{Principal component analysis}
Principal component analysis (PCA) is the data analyst's socket wrench: it is usually the first tool to reach for when getting started with a new data set. PCA goes by many names:
\begin{itemize}
\item Principal component analysis
\item Karhunen-Loève transform
\item Hotelling transform
\item Principal factor analysis
\end{itemize}
}
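\begin{frame}[fragile]
\frametitle{PCA via the eigendecomposition (sketch)}
A rough sketch of how PCA builds on the eigendecomposition above, assuming NumPy and a hypothetical data matrix \texttt{X} whose rows are observations:
\begin{verbatim}
import numpy as np

# Hypothetical data matrix: n observations, d features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# Center the data and form the sample covariance matrix
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / (len(Xc) - 1)

# Eigendecomposition of the symmetric covariance matrix;
# the eigenvectors (columns of U) are the principal directions
lam, U = np.linalg.eigh(C)

# Sort by decreasing eigenvalue (variance explained)
order = np.argsort(lam)[::-1]
lam, U = lam[order], U[:, order]

# Principal component scores: centered data projected onto U
scores = Xc @ U
\end{verbatim}
\end{frame}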
\section{Literature}
\frame
{
\frametitle{References}
\begin{thebibliography}{0}
\selectlanguage{english}