13.6 Standardizing

Often researchers will standardize the \(x\) variables before conducting a PCA.

  • Standardizing: Take \(X\) and divide each variable by its standard deviation \(\sigma_{x}\).


  • Normalizing: Centering and standardizing.

    \[Z = \frac{(X-\bar{X})}{\sigma_{X}}\]

  • Equivalent to analyzing the correlation matrix (\(\mathbf{R}\)) instead of covariance matrix (\(\mathbf{\Sigma}\)).
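The equivalence in the last bullet can be checked directly: the covariance matrix of the normalized data is exactly the correlation matrix of the original data. A minimal numpy sketch with synthetic data (the notes use R's `princomp`; numpy stands in here):

```python
import numpy as np

rng = np.random.default_rng(0)
# Three columns on deliberately different scales
X = rng.normal(size=(200, 3)) * np.array([1.0, 5.0, 0.2])

# Normalize: center each column and divide by its standard deviation
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# Covariance of the normalized data equals the correlation matrix of X
cov_Z = np.cov(Z, rowvar=False)
R = np.corrcoef(X, rowvar=False)
print(np.allclose(cov_Z, R))  # True
```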

Using the correlation matrix rather than the covariance matrix will generally generate different PCs.

This makes sense given the difference between the two matrices: \(\mathbf{R}\) rescales each covariance by the standard deviations of the two variables involved, so its eigenvalues and eigenvectors differ from those of \(\mathbf{\Sigma}\).
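To see the difference concretely, here is a hedged numpy sketch on synthetic two-variable data with very unequal variances. The covariance-based leading PC is pulled toward the high-variance variable, while the correlation-based leading PC weights the variables more evenly:

```python
import numpy as np

rng = np.random.default_rng(1)
# First variable has ~100x the variance of the second
X = rng.normal(size=(500, 2)) @ np.array([[10.0, 0.5], [0.0, 1.0]])

Sigma = np.cov(X, rowvar=False)    # covariance matrix
R = np.corrcoef(X, rowvar=False)   # correlation matrix

# PC directions are the eigenvectors of each matrix
# (np.linalg.eigh returns eigenvalues in ascending order, so the
#  last column is the leading PC)
_, vec_S = np.linalg.eigh(Sigma)
_, vec_R = np.linalg.eigh(R)
print(np.round(np.abs(vec_S[:, -1]), 3))  # dominated by the high-variance variable
print(np.round(np.abs(vec_R[:, -1]), 3))  # close to equal weights
```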

Standardizing your data prior to analysis (using \(\mathbf{R}\) instead of \(\mathbf{\Sigma}\)) aids the interpretation of the PCs in a few ways:

  1. The total variance is the number of variables \(P\)
  2. The proportion explained by each PC is the corresponding eigenvalue / \(P\)
  3. The correlation between \(C_{i}\) and standardized variable \(x_{j}\) can be written as \(r_{ij} = a_{ij}\,SD(C_{i}) = a_{ij}\sqrt{\lambda_{i}}\), since the variance of \(C_i\) is the eigenvalue \(\lambda_i\)

This last point means that for any given \(C_{i}\) we can quantify the relative degree of dependence of the PC on each of the standardized variables. This quantity is also known as the factor loading (a key term we will return to later).

To calculate the principal components using the correlation matrix with `princomp`, set the `cor` argument to `TRUE`.

  • If we use the covariance matrix and change the scale of a variable (e.g. inches to centimeters), that will change the resulting PCs
  • Many researchers prefer to use the correlation matrix
    • It compensates for the units of measurement of the different variables.
    • Interpretations are made in terms of the standardized variables.
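The scale-sensitivity point can also be demonstrated directly. In this hedged numpy sketch, rescaling one variable changes the covariance-based PCs but leaves the correlation-based PCs unchanged (the 2.54 factor is just an illustrative inches-to-cm conversion):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
X[:, 1] += 0.5 * X[:, 0]       # correlate the two variables

X_cm = X.copy()
X_cm[:, 0] *= 2.54             # rescale the first variable (inches -> cm)

# Covariance-based PC directions change with the rescaling ...
_, vec_S1 = np.linalg.eigh(np.cov(X, rowvar=False))
_, vec_S2 = np.linalg.eigh(np.cov(X_cm, rowvar=False))
print(np.allclose(np.abs(vec_S1), np.abs(vec_S2)))  # False

# ... but correlation-based PC directions are unaffected
_, vec_R1 = np.linalg.eigh(np.corrcoef(X, rowvar=False))
_, vec_R2 = np.linalg.eigh(np.corrcoef(X_cm, rowvar=False))
print(np.allclose(np.abs(vec_R1), np.abs(vec_R2)))  # True
```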