Merge pull request #48 from kashefy/kpca_lin
demonstrate linear combination
kashefy authored May 10, 2022
2 parents a9fb516 + db19811 commit ab7dbe2
Showing 3 changed files with 43 additions and 22 deletions.
11 changes: 5 additions & 6 deletions .travis.yml
@@ -1,8 +1,8 @@
sudo: required
#dist: trusty
#language: r
os: linux
dist: bionic
language: ruby
before_install:
- sudo apt-get -qq update && sudo apt-get install -qq -y --no-install-recommends texlive-fonts-recommended texlive-latex-extra texlive-fonts-extra dvipng texlive-latex-recommended latex-beamer texlive-science
- sudo apt-get -qq update && sudo apt-get install -qq -y --no-install-recommends texlive-fonts-recommended texlive-latex-extra texlive-fonts-extra dvipng texlive-latex-recommended texlive-science
#- tlmgr install index
#- sudo tlmgr init-usertree
#- tlmgr install --reinstall --repository https://www.komascript.de/repository/texlive/2020 koma-script\
Expand All @@ -13,13 +13,12 @@ after_success:
- bash .ci/travis/zip_pdfs.sh
deploy:
provider: releases
api_key: "$GITHUB_API_KEY"
token: "$GITHUB_API_KEY"
file_glob: true
file:
- "./notes/**/tutorial_*.slides.pdf"
- "./notes/**/tutorial_*.notes.pdf"
- "./tutorial_*.zip"
skip_cleanup: true
on:
branch: master
# edge: true
2 changes: 1 addition & 1 deletion README.md
@@ -17,7 +17,7 @@ Topics covered throughout the course:
* Self-Organizing Maps
* Locally Linear Embedding
* Probability density estimation
* Mixture models & Expectation-Maximization algorithm
* Mixture models & the Expectation-Maximization algorithm
* Hidden Markov Models
* Estimation theory

52 changes: 37 additions & 15 deletions notes/03_kernel-pca/3_kpca.tex
@@ -91,7 +91,7 @@ \subsubsection{Centering the immediate input to PCA}
Remember, we will first assume that we have the non-linear mapping $\phi$.\\

PCA assumes its input is centered.
It's direct input are the $\phi$'s. Therefore,
Its direct inputs are the $\phi$'s. Therefore,
\begin{equation}
\frac{1}{p} \sum^{p}_{\alpha=1} \vec{\phi}_{(\vec{x}^{(\alpha)})} \eqexcl \vec 0
\end{equation}
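The centering condition can also be enforced purely in terms of kernel values, without ever forming the $\phi$'s. Below is a hedged numpy sketch (the quadratic feature map is a toy stand-in for $\phi$, not taken from these notes) of the standard kernel-centering identity $\vec K_c = \vec K - \vec 1_p \vec K - \vec K \vec 1_p + \vec 1_p \vec K \vec 1_p$, where $\vec 1_p$ is the $p \times p$ matrix with all entries $1/p$:

```python
# Illustrative sketch: centering in feature space using only the kernel matrix.
# The explicit feature map below is a toy example so the identity can be checked;
# in real kernel PCA phi is never computed.
import numpy as np

rng = np.random.default_rng(1)
p = 30
X = rng.normal(size=(p, 2))
Phi = np.hstack([X, (X**2).sum(axis=1, keepdims=True)])  # toy feature map phi
K = Phi @ Phi.T                                          # K_ab = phi_a . phi_b

one_p = np.full((p, p), 1.0 / p)
Kc = K - one_p @ K - K @ one_p + one_p @ K @ one_p       # centered kernel matrix

# Same result as centering the phi's explicitly and recomputing the kernel:
Phi_centered = Phi - Phi.mean(axis=0)
assert np.allclose(Kc, Phi_centered @ Phi_centered.T)
```

The check at the end confirms that centering the kernel matrix is equivalent to centering the $\phi$'s themselves.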
@@ -156,13 +156,35 @@ \subsubsection{The eigenvalue problem}
\vec{\phi}^{(\beta)}
\end{equation}

\notesonly{
Eq.\ref{eq:ephi} tells us that we can describe $\vec e$ in terms of the transformed observations (a weighted summation of $\phi$'s).
\end{frame}

\mode<article>{

It is possible to arrive at the linear relationship in \eqref{eq:ephi} by substituting \eqref{eq:cov} into the eigenvalue problem in \eqref{eq:eig}:

\begin{align}
\underbrace{\frac{1}{p} \sum_{\alpha=1}^{p} \vec{\phi}^{(\alpha)} {\color{blue}\big(\vec{\phi}^{(\alpha)}\big)^\top}
}_{=\,\vec C_{\phi}}
\,
{\color{blue}\vec e}
= \lambda \;\, \vec e\\
\intertext{with ${\color{blue}\big(\vec{\phi}^{(\alpha)}\big)^\top \vec e = \vec e^\top\vec{\phi}^{(\alpha)}}$ measuring the scalar projection $u_{(\vec{\phi}^{(\alpha)})}$ along $\vec e$:}
\frac{1}{p} \sum_{\alpha=1}^{p} u_{(\vec{\phi}^{(\alpha)})} \vec{\phi}^{(\alpha)}
= \lambda \;\, \vec e\\
\vec e =
\frac{1}{p} \sum_{\alpha=1}^{p} \frac{u_{(\vec{\phi}^{(\alpha)})}}{\lambda} \vec{\phi}^{(\alpha)}\\
\intertext{This effectively expresses $\vec e$ as a linear combination of the transformed observations:}
\vec e =
\frac{1}{p} \sum_{\alpha=1}^{p} a^{(\alpha)} \vec{\phi}^{(\alpha)} \stackrel{\substack{\text{switch index}\\{\alpha \rightarrow \beta}}}{=} \frac{1}{p} \sum_{\beta=1}^{p} a^{(\beta)} \vec{\phi}^{(\beta)}
\end{align}

\paragraph{Deriving the transformed eigenvalue problem}

\eqref{eq:ephi} tells us that we can describe $\vec e$ in terms of the transformed observations (a weighted summation of $\phi$'s).
The use of the index $\beta$ is only to avoid collisions with $\alpha$ later.

}

\end{frame}
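The linear-combination result above is easy to confirm numerically. The following numpy sketch (illustrative only, with randomly generated stand-ins for the $\phi$'s) checks that every eigenvector of the sample covariance matrix is recovered as $\vec e = \frac{1}{p} \sum_\alpha \frac{u_{(\vec\phi^{(\alpha)})}}{\lambda} \vec\phi^{(\alpha)}$:

```python
# Illustrative sketch: each eigenvector e of C_phi = (1/p) sum_a phi_a phi_a^T
# equals (1/p) sum_a (u_a / lambda) phi_a, where u_a = phi_a^T e is the scalar
# projection of phi_a onto e. Random data stands in for the transformed points.
import numpy as np

rng = np.random.default_rng(0)
p, d = 50, 5
Phi = rng.normal(size=(p, d))    # rows play the role of the phi^(alpha)
Phi -= Phi.mean(axis=0)          # PCA assumes centered input
C = (Phi.T @ Phi) / p            # sample covariance C_phi
lam, E = np.linalg.eigh(C)       # columns of E are the eigenvectors e

for k in range(d):
    u = Phi @ E[:, k]                        # u_alpha for every observation
    e_recovered = (u / lam[k]) @ Phi / p     # (1/p) sum (u_alpha/lambda) phi_a
    assert np.allclose(e_recovered, E[:, k])
```

Each eigenvector is thus entirely contained in the span of the data, which is what licenses the ansatz $\vec e = \sum_\beta a^{(\beta)} \vec\phi^{(\beta)}$.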

\begin{frame}{\subsubsecname}

\slidesonly{
Expand All @@ -184,7 +206,7 @@ \subsubsection{The eigenvalue problem}
}

\notesonly{
Substituting Eq.\ref{eq:cov} and Eq.\ref{eq:ephi} into the eigenvalue problem Eq.\ref{eq:eig}:
Substituting \eqref{eq:cov} and \eqref{eq:ephi} into the eigenvalue problem \eqref{eq:eig}:
}
\slidesonly{
Express the eigenvalue problem in terms of $\phi$'s:
@@ -243,15 +265,15 @@ \subsubsection{The eigenvalue problem}
}

\notesonly{
Recall from \sectionref{sec:nonlin} that we are not even able to compute $\vec{\phi}_{(\vec{x})}$ but we now see it is possible to avoid the transformation altogether by exploiting the kernel trick (cf. Eq.\ref{eq:trick}) by substituting
Recall from \sectionref{sec:nonlin} that we are not even able to compute $\vec{\phi}_{(\vec{x})}$, but we now see that the transformation can be avoided altogether by exploiting the kernel trick (cf. \eqref{eq:trick}), substituting
$ K_{\alpha \beta} $ for
$
\vec{\phi}^{\top}_{(\vec{x}^{(\alpha)})}
\,
\vec{\phi}_{(\vec{x}^{(\beta)})}
$
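This substitution is the whole point of the kernel trick: the inner product in feature space is computed directly from the inputs. A hedged toy example (not the notes' particular $\phi$): for the degree-2 polynomial kernel $k(\vec x, \vec y) = (\vec x^\top \vec y)^2$ in two dimensions, the explicit map $\vec\phi_{(\vec x)} = (x_1^2,\, x_2^2,\, \sqrt{2}\, x_1 x_2)$ satisfies $\vec\phi_{(\vec x)}^\top \vec\phi_{(\vec y)} = k(\vec x, \vec y)$:

```python
# Illustrative sketch of the kernel trick with a degree-2 polynomial kernel:
# the kernel value equals the dot product under an explicit quadratic feature
# map, so K_ab never requires forming phi. Toy example, names are illustrative.
import numpy as np

def phi(x):
    # explicit feature map for k(x, y) = (x . y)^2 in 2-D
    return np.array([x[0]**2, x[1]**2, np.sqrt(2.0) * x[0] * x[1]])

def k(x, y):
    return float(x @ y) ** 2

x, y = np.array([1.0, 2.0]), np.array([3.0, -1.0])
assert np.isclose(phi(x) @ phi(y), k(x, y))   # both equal (1*3 + 2*(-1))^2
```

For kernels such as the Gaussian kernel, the corresponding $\phi$ is infinite-dimensional, so the kernel-side computation is not just cheaper but the only option.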

Eq.\ref{eq:eig2} becomes:}
\eqref{eq:eig2} becomes:}

\begin{equation} \label{eq:eig3}
\frac{1}{p} \sum_{\alpha=1}^{p} \sum^{p}_{\beta=1}
@@ -267,7 +289,7 @@ \subsubsection{The eigenvalue problem}

\pause

\notesonly{We }left-multiply\notesonly{ Eq.\ref{eq:eig3}} with $\big(\vec \phi_{(\vec x^{(\gamma)})}\big)^\top$, where $\gamma = 1, \ldots, p$.
\notesonly{We }left-multiply\notesonly{ \eqref{eq:eig3}} with $\big(\vec \phi_{(\vec x^{(\gamma)})}\big)^\top$, where $\gamma = 1, \ldots, p$.
We can pull $\big(\vec \phi^{(\gamma)}\big)^\top$ directly into the sum on the \slidesonly{LHS}\notesonly{left-hand-side} and the sum on the \slidesonly{RHS}\notesonly{right-hand-side}:

\only<2,3>{
@@ -289,7 +311,7 @@ \subsubsection{The eigenvalue problem}

\pause

\newpage
%\newpage

\notesonly{\eqref{eq:eig4} without the clutter:}

@@ -592,13 +614,13 @@ \subsubsection{Normalize the eigenvectors}

\begin{frame}{\subsubsecname}

Recall\notesonly{ing Eq.\ref{eq:ephi} (we add the index $k$ to denote which eigenvector):}
Recall\notesonly{ing \eqref{eq:ephi} (we add the index $k$ to denote which eigenvector):}
\begin{equation}
\label{eq:ephik}
\vec e_k = \sum^{p}_{\beta=1} a_k^{(\beta)} \vec{\phi}_{(\vec{x}^{(\beta)})},
\end{equation}

We want an expression for the norm $\vec e^{\top}_k \vec e_k$ that does not involve $\phi$'s. We left-multiply\slidesonly{ the above}\notesonly{ Eq.\ref{eq:ephik}} with $\left(\vec e_k\right)^\top$:
We want an expression for the norm $\vec e^{\top}_k \vec e_k$ that does not involve $\phi$'s. We left-multiply\slidesonly{ the above}\notesonly{ \eqref{eq:ephik}} with $\left(\vec e_k\right)^\top$:
\begin{align}
\vec e^{\top}_k \vec e_k &= \sum^{p}_{\alpha=1} a_k^{(\alpha)} \vec{\phi}_{(\vec{x}^{(\alpha)})}^\top \sum^{p}_{\beta=1} a_k^{(\beta)} \vec{\phi}_{(\vec{x}^{(\beta)})} \\
&= \sum^{p}_{\alpha=1} \sum^{p}_{\beta=1} a_k^{(\beta)} \underbrace{\vec{\phi}_{(\vec{x}^{(\alpha)})}^\top \vec{\phi}_{(\vec{x}^{(\beta)})}} a_k^{(\alpha)} \\
@@ -611,7 +633,7 @@ \subsubsection{Normalize the eigenvectors}
\begin{frame}{\subsubsecname}

\notesonly{
And when we plug Eq.\ref{eq:eigsimple1} into the above:
And when we plug \eqref{eq:eigsimple1} into the above:
}
\slidesonly{
From $\vec{K} \, \widetilde {\vec a}_k = p \lambda \widetilde {\vec a}_k$ follows:
@@ -641,7 +663,7 @@ \subsubsection{Normalize the eigenvectors}

\notesonly{
Scaling $\widetilde {\vec a}_k$ by $\frac{1}{\sqrt{p \lambda_k}}$ yields
a vector in the same direction as $\widetilde {\vec a}_k$ to satisfy \notesonly{Eq.\ref{eq:eignorm}}\slidesonly{$\vec e^{\top}_k \vec e_k \eqexcl 1$}.\\
a vector in the same direction as $\widetilde {\vec a}_k$ to satisfy \notesonly{\eqref{eq:eignorm}}\slidesonly{$\vec e^{\top}_k \vec e_k \eqexcl 1$}.\\
}
With
\svspace{-5mm}
@@ -737,7 +759,7 @@ \subsubsection{Projection}
\pause

\notesonly{
We substitute $\vec \phi_{(\vec x)}$ for $\vec x$ and plug Eq.\ref{eq:ephi} into Eq.\ref{eq:projlin}:
We substitute $\vec \phi_{(\vec x)}$ for $\vec x$ and plug \eqref{eq:ephi} into \eqref{eq:projlin}:
}
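The eigenvalue problem, the normalization by $\frac{1}{\sqrt{p\lambda_k}}$, and this projection step can be sketched end to end. The numpy sketch below uses a toy explicit feature map purely so that the kernel-side results can be checked against the $\phi$-side ones; it is an illustration of the derivation, not the notes' implementation:

```python
# Illustrative end-to-end sketch: solve K a = (p*lambda) a, rescale a so that
# e^T e = 1, and verify that projections onto e need only kernel values,
# since Phi e = Phi Phi^T a = K a. The feature map is a toy stand-in for phi.
import numpy as np

rng = np.random.default_rng(2)
p = 40
X = rng.normal(size=(p, 2))
Phi = np.hstack([X, (X**2).sum(axis=1, keepdims=True)])  # toy feature map phi
Phi -= Phi.mean(axis=0)          # assume the phi's are already centered
K = Phi @ Phi.T                  # K_ab = phi_a . phi_b

mu, A = np.linalg.eigh(K)        # eigenvalues of K are mu = p * lambda
a = A[:, -1] / np.sqrt(mu[-1])   # scale by 1/sqrt(p*lambda_k) => e^T e = 1
e = Phi.T @ a                    # e = sum_alpha a^(alpha) phi^(alpha)

assert np.isclose(e @ e, 1.0)         # normalization condition holds
assert np.allclose(Phi @ e, K @ a)    # projections computed from K alone
```

The last assertion is the payoff: the projections $u$ of all observations onto $\vec e_k$ come out of $\vec K \vec a_k$, with $\phi$ never evaluated.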

\visible<3>{