2.4.2 Centering in feature space
So far, we have assumed that $\{\boldsymbol{\varphi}(\mathbf{x}_i)\}$ has zero mean, which is usually not fulfilled. Therefore, the formalism needs to be adjusted (Schölkopf et al., 1998b). The following set of points will be centered:

\[ \tilde{\boldsymbol{\varphi}}(\mathbf{x}_i) = \boldsymbol{\varphi}(\mathbf{x}_i) - \frac{1}{n} \sum_{r=1}^{n} \boldsymbol{\varphi}(\mathbf{x}_r) \;. \qquad (2.31) \]
The above analysis holds if the covariance matrix is computed from $\tilde{\boldsymbol{\varphi}}(\mathbf{x}_i)$. Thus, the kernel matrix $K_{ij} = \boldsymbol{\varphi}(\mathbf{x}_i)^T \boldsymbol{\varphi}(\mathbf{x}_j)$ needs to be replaced by $\tilde{K}_{ij} = \tilde{\boldsymbol{\varphi}}(\mathbf{x}_i)^T \tilde{\boldsymbol{\varphi}}(\mathbf{x}_j)$. Using (2.31), $\tilde{K}_{ij}$ can be written as
\[
\begin{aligned}
\tilde{K}_{ij} &= \boldsymbol{\varphi}(\mathbf{x}_i)^T \boldsymbol{\varphi}(\mathbf{x}_j)
 - \frac{1}{n} \sum_{r=1}^{n} \boldsymbol{\varphi}(\mathbf{x}_i)^T \boldsymbol{\varphi}(\mathbf{x}_r)
 - \frac{1}{n} \sum_{r=1}^{n} \boldsymbol{\varphi}(\mathbf{x}_r)^T \boldsymbol{\varphi}(\mathbf{x}_j)
 + \frac{1}{n^2} \sum_{r,s=1}^{n} \boldsymbol{\varphi}(\mathbf{x}_r)^T \boldsymbol{\varphi}(\mathbf{x}_s) \\
&= K_{ij} - \frac{1}{n} \sum_{r=1}^{n} K_{ir} - \frac{1}{n} \sum_{r=1}^{n} K_{rj} + \frac{1}{n^2} \sum_{r,s=1}^{n} K_{rs} \;. \qquad (2.32)
\end{aligned}
\]
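In matrix form, (2.32) is a double-centering of $\mathbf{K}$: with $\mathbf{1}_n$ denoting the $n \times n$ matrix whose entries are all $1/n$, it reads $\tilde{\mathbf{K}} = \mathbf{K} - \mathbf{1}_n\mathbf{K} - \mathbf{K}\mathbf{1}_n + \mathbf{1}_n\mathbf{K}\mathbf{1}_n$. A minimal NumPy sketch (the function and variable names are mine, not from the thesis):

```python
import numpy as np

def center_kernel_matrix(K):
    """Center a kernel matrix in feature space, Eq. (2.32):
    K~_ij = K_ij - (1/n) sum_r K_ir - (1/n) sum_r K_rj + (1/n^2) sum_rs K_rs.
    """
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)  # matrix with all entries 1/n
    return K - one_n @ K - K @ one_n + one_n @ K @ one_n
```

As a sanity check, for the linear kernel $K_{ij} = \mathbf{x}_i^T \mathbf{x}_j$ this reproduces the kernel matrix of the explicitly mean-centered data, and every row and column of $\tilde{\mathbf{K}}$ sums to zero.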
Therefore, we can evaluate the kernel matrix for the centered data using the known matrix $\mathbf{K}$. For the remainder of this thesis, I denote with $\boldsymbol{\alpha}$ the eigenvectors of $\tilde{\mathbf{K}}$ instead of $\mathbf{K}$, and they are normalized according to (2.29) using the eigenvalues of $\tilde{\mathbf{K}}$. The principal components are $\mathbf{v} = \sum_{i=1}^{n} \alpha_i\, \tilde{\boldsymbol{\varphi}}(\mathbf{x}_i)$.
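The whole procedure of this section, i.e., centering via (2.32), eigendecomposition, and projection onto the principal components, can be sketched in NumPy. Since (2.29) is not reproduced in this section, I assume the usual kernel-PCA normalization $\lambda\, \boldsymbol{\alpha}^T \boldsymbol{\alpha} = 1$; the function name is mine:

```python
import numpy as np

def kernel_pca(K, n_components):
    """Sketch: project the training points onto the leading kernel principal
    components, given a precomputed (uncentered) kernel matrix K.
    Assumes the normalization lambda * ||alpha||^2 = 1 for the eigenvectors.
    """
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)
    K_tilde = K - one_n @ K - K @ one_n + one_n @ K @ one_n  # Eq. (2.32)
    eigvals, eigvecs = np.linalg.eigh(K_tilde)    # ascending eigenvalues
    eigvals = eigvals[::-1][:n_components]        # keep the largest ones
    eigvecs = eigvecs[:, ::-1][:, :n_components]
    alphas = eigvecs / np.sqrt(eigvals)           # lambda * ||alpha||^2 = 1
    # Projection of x_i onto component k: sum_j alpha_jk * K~_ij
    return K_tilde @ alphas
```

With a linear kernel $K_{ij} = \mathbf{x}_i^T \mathbf{x}_j$, the resulting projections coincide (up to sign) with the scores of ordinary PCA on the mean-centered data, which makes the sketch easy to verify.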
Heiko Hoffmann
2005-03-22