Key Word(s): PCA, Model Selection



CS 109A/STAT 121A/AC 209A/CSCI E-109A

Standard Section 5: Principal Components Analysis (PCA) Fitting and Model Selection

Harvard University
Fall 2017
Section Leaders: Albert Wu, Nathaniel Burbank
Instructors: Pavlos Protopapas, Kevin Rader, Rahul Dave, Margo Levine

Download this notebook from the CS109 repo or here: http://bit.ly/Sec_5_109a (right-click and save to your computer)

For this section, our goal is to review and deepen our understanding of Principal Components Analysis (PCA). PCA is highly effective on high-dimensional datasets, such as the one we will use here. In particular, this section is designed to help us answer Homework 4, part (h).

Specifically, we will:

1. Review the basics of Principal Components Analysis and hone our intuition (see the short sketch after this list)
2. Discuss implementation of PCA within Python and coding issues to keep in mind
3. Use the principles of model selection we have learned in lecture to find a "best" PCA feature set.
4. Compare our PCA model with other models we have fit in labs and lecture and discuss coefficient meanings.
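
To ground our intuition: PCA finds orthonormal directions of maximal variance in the centered data. Below is a minimal sketch (illustrative only, on random toy data rather than this section's dataset) showing that the principal directions from an SVD of the centered matrix agree with what sklearn's PCA reports:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
X = rng.randn(100, 4)                # toy data: 100 observations, 4 features

Xc = X - X.mean(axis=0)              # center each column
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

pca = PCA(n_components=2).fit(X)     # sklearn centers internally

# Rows of Vt are the principal directions (up to sign)
print(np.allclose(np.abs(Vt[:2]), np.abs(pca.components_)))                # True
# Explained variance of each component = squared singular value / (n - 1)
print(np.allclose(S[:2]**2 / (X.shape[0] - 1), pca.explained_variance_))  # True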

For this section we will be using the following packages:

In [2]:
import sys
import numpy as np
import pandas as pd
pd.set_option('display.max_rows', 999)
pd.set_option('display.max_columns', 999)
pd.set_option('display.width', 1000)
pd.set_option('display.notebook_repr_html', True)
import matplotlib
import matplotlib.pyplot as plt
import seaborn as sns

import statsmodels.api as sm
from statsmodels.api import OLS
from statsmodels.api import add_constant
from statsmodels.regression.linear_model import RegressionResults
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression
from sklearn.linear_model import Ridge
from sklearn.linear_model import Lasso
from sklearn.preprocessing import PolynomialFeatures
from sklearn.neighbors import KNeighborsRegressor
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
# Note -- requires sklearn version 0.18 or higher

from sklearn.metrics import r2_score
from collections import Counter
sns.set(style="ticks")
%matplotlib inline

import warnings
warnings.filterwarnings("ignore")

matplotlib.rcParams['figure.figsize'] = (13.0, 6.0)

assert sys.version_info.major == 3, sys.version
# Python 3 is required

Part (1): Load in our data and conduct basic EDA

We will first load in our dataset below and look at the first few rows. Then, we use the describe function to get a sense of the data.

In [3]:
crime_df = pd.read_csv('https://raw.githubusercontent.com/albertw1/data/master/Crime.csv').drop(['Date', 'Year'], axis=1)
crime_df.head()
Out[3]:
Incidence Temp Dewpoint Windspeed Pressure Precipitation TMAX_C TMIN_C Month DOW Weekend Season
0 182 26.444444 15.944444 8.5 1008.5 0.0 31.722222 21.722222 7 Sunday 1 Summer
1 295 24.333333 9.666667 7.9 1012.2 0.0 31.722222 20.000000 7 Monday 0 Summer
2 267 22.722222 9.666667 5.5 1016.0 0.0 28.888889 18.277778 7 Tuesday 0 Summer
3 250 23.166667 14.500000 6.7 1020.1 0.0 28.277778 18.277778 7 Wednesday 0 Summer
4 259 24.500000 15.611111 8.6 1021.3 0.0 30.000000 18.888889 7 Thursday 0 Summer
In [4]:
crime_df.describe()
Out[4]:
Incidence Temp Dewpoint Windspeed Pressure Precipitation TMAX_C TMIN_C Month Weekend
count 1095.000000 1095.000000 1095.000000 1095.000000 1095.000000 1095.000000 1095.000000 1095.000000 1095.000000 1095.000000
mean 238.107763 10.583815 3.713090 9.168767 1016.463562 0.114913 16.309082 5.841755 6.526027 0.285845
std 31.539843 9.913817 10.848571 3.269969 7.632274 0.303409 10.445645 9.806276 3.449427 0.452022
min 115.000000 -14.222222 -24.277778 2.200000 987.400000 0.000000 -10.000000 -19.388889 1.000000 0.000000
25% 219.000000 2.500000 -4.055556 6.900000 1011.300000 0.000000 7.222222 -1.111111 4.000000 0.000000
50% 240.000000 11.333333 4.722222 8.700000 1016.400000 0.000000 17.222222 6.722222 7.000000 0.000000
75% 259.000000 19.111111 12.888889 11.000000 1021.400000 0.055000 25.611111 15.000000 10.000000 1.000000
max 349.000000 30.944444 22.444444 25.800000 1040.400000 3.540000 37.222222 26.111111 12.000000 1.000000

Convert the categorical columns into dummy variables by one-hot encoding.

In [5]:
categorical_columns = ['Month', 'Weekend', 'Season', 'DOW']
numerical_columns = ['Temp', 'Dewpoint', 'Windspeed', 'Pressure', 'Precipitation', 'TMAX_C', 'TMIN_C']
crime_df = pd.get_dummies(crime_df, columns=categorical_columns, drop_first=True)
crime_df.head()
Out[5]:
Incidence Temp Dewpoint Windspeed Pressure Precipitation TMAX_C TMIN_C Month_2 Month_3 Month_4 Month_5 Month_6 Month_7 Month_8 Month_9 Month_10 Month_11 Month_12 Weekend_1 Season_Spring Season_Summer Season_Winter DOW_Monday DOW_Saturday DOW_Sunday DOW_Thursday DOW_Tuesday DOW_Wednesday
0 182 26.444444 15.944444 8.5 1008.5 0.0 31.722222 21.722222 0 0 0 0 0 1 0 0 0 0 0 1 0 1 0 0 0 1 0 0 0
1 295 24.333333 9.666667 7.9 1012.2 0.0 31.722222 20.000000 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0
2 267 22.722222 9.666667 5.5 1016.0 0.0 28.888889 18.277778 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0
3 250 23.166667 14.500000 6.7 1020.1 0.0 28.277778 18.277778 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1
4 259 24.500000 15.611111 8.6 1021.3 0.0 30.000000 18.888889 0 0 0 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0

Now, let's split this dataset up into a testing and training set.

In [6]:
train, test =  train_test_split(crime_df, test_size=.2, random_state=123)
train.shape,test.shape
Out[6]:
((876, 29), (219, 29))

Now let us standardize the numerical variables only, using the training set's mean and standard deviation for both sets so that no test information leaks into the scaling.

In [7]:
mean = train[numerical_columns].mean()
std = train[numerical_columns].std()

train[numerical_columns] = (train[numerical_columns] - mean)/std
test[numerical_columns] = (test[numerical_columns] - mean)/std
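
As a quick sanity check (a sketch, not a required step): the standardized training columns should now have mean approximately 0 and standard deviation approximately 1, while the test columns will be only approximately standardized, since we reused the training statistics:

train[numerical_columns].describe().loc[['mean', 'std']]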

Now let us subset the data and create convenient formats (DataFrame and NumPy array) for the train and test sets.

In [8]:
all_predictors = ['Temp','Dewpoint','Windspeed','Pressure','Precipitation','TMAX_C','TMIN_C','Month_2','Month_3','Month_4','Month_5','Month_6','Month_7','Month_8','Month_9','Month_10','Month_11','Month_12','Weekend_1','Season_Spring','Season_Summer','Season_Winter','DOW_Monday','DOW_Saturday','DOW_Sunday','DOW_Thursday','DOW_Tuesday','DOW_Wednesday']

X_train_df = train[all_predictors]
X_test_df  = test[all_predictors]
X_train_np = train[all_predictors].values
X_test_np  = test[all_predictors].values
y_train = train['Incidence'].values
y_test = test['Incidence'].values
y_test.shape
Out[8]:
(219,)

Part (2): Use subset selection to fit a linear regression model

Let's use the forward/backward subset selection method from HW 3 to fit a linear regression model on the data.

In [9]:
def step_forwards_backwards(df, y_val, direction='forward'):
    """Greedy stepwise subset selection by BIC.

    At each step, add (forward) or remove (backward) the single predictor
    that yields the lowest BIC, then return the predictor set with the
    lowest BIC encountered along the way."""
    
    assert direction in ['forward', 'backward']
    
    y = y_val.reshape(-1,1)
    
    predictors = set(df.columns)
    selected_predictors = set() if direction=='forward' else set(predictors)
    
    n = df.shape[0]
    best_bic = np.inf
    
    best_bics = []
    best_models = []
    
    if direction == 'forward':
        # Start from an intercept-only design matrix.
        X = np.ones(n).reshape(-1,1)
        X = np.concatenate([X, df[list(selected_predictors)].values], axis=1)
        while (True):
            
            possible_bic_scores = []
            # Candidates are the predictors not yet selected.
            possible_predictors = list(selected_predictors ^ predictors)
            
            if len(possible_predictors) == 0:
                break
                
            for predictor in possible_predictors:
                # Tentatively add each candidate and record the resulting BIC.
                x_temp = np.concatenate([X, df[predictor].values.reshape(-1,1)], axis=1)
                model = OLS(endog=y, exog=x_temp).fit()
                bic = model.bic
                possible_bic_scores.append(bic)
                
            best_predictor_ix = np.argmin(possible_bic_scores)
            best_predictor = possible_predictors[best_predictor_ix]
            
            best_bic = np.min(possible_bic_scores)
            best_bics.append(best_bic)
            
            selected_predictors.add(best_predictor)            
            X = np.concatenate([X, df[best_predictor].values.reshape(-1,1)], axis=1)
            best_models.append(list(selected_predictors))

    else:

        while (True):
            possible_bic_scores = []
            possible_predictors = list(selected_predictors)

            if len(possible_predictors) == 0:
                break

            for predictor in possible_predictors:
                # Tentatively drop each remaining predictor and record the BIC.
                X = np.concatenate([np.ones(n).reshape(-1,1), df[list(selected_predictors - set([predictor]))].values], axis=1)
                model = OLS(endog=y, exog=X).fit()
                bic = model.bic
                possible_bic_scores.append(bic)

            best_predictor_ix = np.argmin(possible_bic_scores)
            best_predictor = possible_predictors[best_predictor_ix] 

            best_bic = possible_bic_scores[best_predictor_ix]
            selected_predictors.discard(best_predictor)
            
            best_bics.append(best_bic)
            best_models.append(list(selected_predictors))
            
    # Of all the models visited, return the one with the lowest BIC.
    index_of_best_bic = np.argmin(best_bics)

    return best_models[index_of_best_bic]

Let's run the subset selection function and see which variables were included in the best model:

In [10]:
predictors_forward = # Your code goes here
predictors_forward
  File "", line 1
    predictors_forward = # Your code goes here
                                              ^
SyntaxError: invalid syntax
In [11]:
predictors_backward = # Your code goes here
predictors_backward
  File "", line 1
    predictors_backward = # Your code goes here
                                               ^
SyntaxError: invalid syntax
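
One possible way to fill in the two cells above, mirroring how the same function is called later in this notebook (a sketch, not the only valid answer):

predictors_forward = step_forwards_backwards(X_train_df, y_train, direction='forward')
predictors_backward = step_forwards_backwards(X_train_df, y_train, direction='backward')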

Based on these variables, we can see what the R-squared values are for our training and testing sets.

In [12]:
X = sm.add_constant(X_train_df[predictors_backward])
X_test = sm.add_constant(X_test_df[predictors_backward])
y = train['Incidence'].values.reshape(-1,1)

model = OLS(endog=y, exog=X)
result = model.fit()

y_hat_train = result.predict()
y_hat_test = result.predict(exog=X_test)

print('Backward Selection Training R2 = ', r2_score(y_train, y_hat_train))
print('Backward Selection Testing R2 = ', r2_score(y_test, y_hat_test))
In [13]:
X = sm.add_constant(X_train_df[predictors_forward])
X_test = sm.add_constant(X_test_df[predictors_forward])
y = train['Incidence'].values.reshape(-1,1)

model = OLS(endog=y, exog=X)
result = model.fit()

y_hat_train = result.predict()
y_hat_test = result.predict(exog=X_test)

print('Forward Selection Training R2 = ', r2_score(y_train, y_hat_train))
print('Forward Selection Testing R2 = ', r2_score(y_test, y_hat_test))

Part (3): Create a data frame with continuous predictors taken to polynomial power

Now we will manually raise the numeric predictors to polynomial powers. The next cell is a simple toy example showing how this can be done with np.hstack.

In [14]:
np.hstack((np.array([[1, 2,3], [4, 5,6],[7,8,9],[10,11,12]])**(i+1) for i in range(3)))
Out[14]:
array([[   1,    2,    3,    1,    4,    9,    1,    8,   27],
       [   4,    5,    6,   16,   25,   36,   64,  125,  216],
       [   7,    8,    9,   49,   64,   81,  343,  512,  729],
       [  10,   11,   12,  100,  121,  144, 1000, 1331, 1728]])

We want to create a data frame with the continuous predictors raised to powers up to 3, while keeping the categorical predictors unchanged.

In [15]:
X_train_numerical_powers = # Your code goes here

print('Number of Total Predictors with Continuous Polynomial Terms Added is', X_train_numerical_powers.shape[1])
  File "", line 1
    X_train_numerical_powers = # Your code goes here
                                                    ^
SyntaxError: invalid syntax
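
A possible completion for the cell above, applying the np.hstack pattern from the toy example to the seven numerical columns (a sketch; 7 columns × 3 powers gives 21 predictors):

X_train_numerical_powers = np.hstack([X_train_df[numerical_columns].values**(i+1) for i in range(3)])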
In [17]:
X_train_np_powers = np.concatenate((X_train_numerical_powers,X_train_df.drop(numerical_columns, axis=1)),axis=1)
X_train_df_powers = pd.DataFrame(X_train_np_powers)
newcolname = ['Temp', 'Dewpoint', 'Windspeed', 'Pressure', 'Precipitation', 'TMAX_C', 'TMIN_C', 'Temp^2', 'Dewpoint^2', 'Windspeed^2', 'Pressure^2', 'Precipitation^2', 'TMAX_C^2', 'TMIN_C^2', 'Temp^3', 'Dewpoint^3', 'Windspeed^3', 'Pressure^3', 'Precipitation^3', 'TMAX_C^3', 'TMIN_C^3'] + list(X_train_df.drop(numerical_columns, axis=1))
X_train_df_powers.columns = newcolname
X_train_df_powers.head()                                          
                             

We can do the same with the test set:

In [1]:
X_test_numerical_powers = # Your code goes here
X_test_np_powers = np.concatenate((X_test_numerical_powers,X_test_df.drop(numerical_columns, axis=1)),axis=1)
X_test_df_powers = pd.DataFrame(X_test_np_powers)
newcolname = ['Temp', 'Dewpoint', 'Windspeed', 'Pressure', 'Precipitation', 'TMAX_C', 'TMIN_C', 'Temp^2', 'Dewpoint^2', 'Windspeed^2', 'Pressure^2', 'Precipitation^2', 'TMAX_C^2', 'TMIN_C^2', 'Temp^3', 'Dewpoint^3', 'Windspeed^3', 'Pressure^3', 'Precipitation^3', 'TMAX_C^3', 'TMIN_C^3'] + list(X_train_df.drop(numerical_columns, axis=1))
X_test_df_powers.columns = newcolname
X_test_df_powers.head()          
  File "", line 1
    X_test_numerical_powers = # Your code goes here
                                                   ^
SyntaxError: invalid syntax
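
As with the training set, a possible completion for the first line of the cell above (a sketch):

X_test_numerical_powers = np.hstack([X_test_df[numerical_columns].values**(i+1) for i in range(3)])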

We can do forward and backward selection as well on this new data frame.

In [317]:
predictors_forward = step_forwards_backwards(X_train_df_powers, y_train, direction='forward')
predictors_forward
Out[317]:
['DOW_Sunday',
 'DOW_Monday',
 'Windspeed',
 'Temp',
 'DOW_Wednesday',
 'Weekend_1',
 'Windspeed^3',
 'DOW_Thursday',
 'Season_Summer',
 'DOW_Tuesday',
 'Dewpoint^3']
In [319]:
predictors_backward = step_forwards_backwards(X_train_df_powers, y_train, direction='backward')
predictors_backward
Out[319]:
['DOW_Wednesday',
 'Windspeed',
 'DOW_Thursday',
 'Dewpoint',
 'DOW_Sunday',
 'TMIN_C^3',
 'Season_Summer',
 'DOW_Monday',
 'DOW_Saturday',
 'DOW_Tuesday',
 'Temp']
In [320]:
X = sm.add_constant(X_train_df_powers[predictors_backward])
X_test = sm.add_constant(X_test_df_powers[predictors_backward])
y = train['Incidence'].values.reshape(-1,1)

model = OLS(endog=y, exog=X)
result = model.fit()

y_hat_train = result.predict()
y_hat_test = result.predict(exog=X_test)

print('Backward Selection Training R2 = ', r2_score(y_train, y_hat_train))
print('Backward Selection Testing R2 = ', r2_score(y_test, y_hat_test))
Backward Selection Training R2 =  0.422425372293
Backward Selection Testing R2 =  0.361986140851
In [ ]:
X = sm.add_constant(X_train_df_powers[predictors_forward])
X_test = sm.add_constant(X_test_df_powers[predictors_forward])
y = train['Incidence'].values.reshape(-1,1)

model = OLS(endog=y, exog=X)
result = model.fit()

y_hat_train = result.predict()
y_hat_test = result.predict(exog=X_test)

print('Forward Selection Training R2 = ', r2_score(y_train, y_hat_train))
print('Forward Selection Testing R2 = ', r2_score(y_test, y_hat_test))

Part (4): Create the expanded matrix containing all terms

Let's now create a design matrix that includes all polynomial terms up to the third order, including all interactions.

In [416]:
all_poly_terms = PolynomialFeatures(degree=3, interaction_only=False, include_bias=False)

X_train_full_poly = all_poly_terms.fit_transform(X_train_df)
X_test_full_poly = all_poly_terms.transform(X_test_df)  # transform only -- the feature map was already fit on the training set

print('number of total predictors', X_train_full_poly.shape[1])
number of total predictors 4494
In [417]:
X_train_full_poly.shape
Out[417]:
(876, 4494)

We can use get_feature_names to see how the expanded features map back to the original predictors (printing the first 25):

In [1]:
all_poly_terms.get_feature_names()[0:25]
In [419]:
X_train_full_poly
Out[419]:
array([[ 1.42281184,  1.58038879, -0.57476883, ...,  0.        ,
         0.        ,  0.        ],
       [-1.51727206, -1.79988626,  1.10635803, ...,  0.        ,
         0.        ,  0.        ],
       [-0.47203311,  0.001912  , -1.05509079, ...,  0.        ,
         0.        ,  0.        ],
       ..., 
       [-0.11430427, -0.2112585 ,  0.35585497, ...,  0.        ,
         0.        ,  0.        ],
       [ 0.42228899,  0.05774237,  0.5359757 , ...,  0.        ,
         0.        ,  0.        ],
       [ 0.41669948, -0.05391837, -1.02507067, ...,  0.        ,
         0.        ,  0.        ]])

If we want to drop the columns that are entirely zero, we can first locate their indices:

In [420]:
zero_column_index = np.where(~X_train_full_poly.any(axis=0))[0]
In [421]:
X_train_full_poly_nonzero_col = np.delete(X_train_full_poly, zero_column_index, axis = 1)
X_test_full_poly_nonzero_col = np.delete(X_test_full_poly, zero_column_index, axis = 1)

print(X_train_full_poly_nonzero_col)
[[ 1.42281184  1.58038879 -0.57476883 ...,  0.          0.          0.        ]
 [-1.51727206 -1.79988626  1.10635803 ...,  0.          0.          0.        ]
 [-0.47203311  0.001912   -1.05509079 ...,  0.          0.          0.        ]
 ..., 
 [-0.11430427 -0.2112585   0.35585497 ...,  0.          0.          0.        ]
 [ 0.42228899  0.05774237  0.5359757  ...,  0.          0.          0.        ]
 [ 0.41669948 -0.05391837 -1.02507067 ...,  0.          0.          0.        ]]

Now we can fit our PCA model:

In [447]:
pca = PCA(n_components=5)
pca.fit(X_train_full_poly)
train_pca = pca.transform(X_train_full_poly)
test_pca = pca.transform(X_test_full_poly)

print('Explained variance ratio:', pca.explained_variance_ratio_)
Explained variance ratio: [ 0.8129783   0.04373497  0.03401266  0.02028036  0.01556333]
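
Having reduced the expanded design matrix to five components, we can regress on them just as before. A minimal sketch, reusing the train_pca and test_pca arrays from the previous cell:

X = sm.add_constant(train_pca)
X_test = sm.add_constant(test_pca)

result = OLS(endog=y_train, exog=X).fit()

y_hat_train = result.predict()
y_hat_test = result.predict(exog=X_test)

print('PCA Regression Training R2 = ', r2_score(y_train, y_hat_train))
print('PCA Regression Testing R2 = ', r2_score(y_test, y_hat_test))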

We can obtain the corresponding coefficients of each principal component.

In [428]:
pca.components_
# First "row" of array corresponds to first component weights. 
Out[428]:
array([[  2.89699969e-04,   8.16273796e-04,   2.21196880e-03, ...,
          0.00000000e+00,   0.00000000e+00,   1.63373238e-05],
       [ -2.10903042e-02,  -1.89535884e-02,   3.16788138e-02, ...,
         -0.00000000e+00,  -0.00000000e+00,   1.56530336e-04],
       [  6.60108855e-02,   6.63252686e-02,  -1.68813579e-02, ...,
         -0.00000000e+00,  -0.00000000e+00,  -1.08743378e-04],
       [ -1.22151524e-02,  -1.10596786e-02,  -3.84026203e-02, ...,
          0.00000000e+00,   0.00000000e+00,   2.21319814e-03],
       [  8.82561724e-03,   1.13001835e-02,   4.94705788e-02, ...,
          0.00000000e+00,   0.00000000e+00,   2.29230565e-03]])

If you recall, the squared weights within each principal component must sum to one. Let's verify that this holds:

In [429]:
# Your code goes here
Out[429]:
array([ 1.,  1.,  1.,  1.,  1.])
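
The cell above can be filled in, for example, by summing the squared entries of each row of pca.components_:

np.sum(pca.components_**2, axis=1)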

Now we will look at the weights that make up each of our 5 principal components. Because we centered our dataset, each value can loosely be interpreted as a correlation between a feature and the component.

In [430]:
feature_frame = pd.DataFrame(pca.components_,columns=all_poly_terms.get_feature_names(),index = ['PC-1','PC-2','PC-3','PC-4','PC-5'])
feature_frame
Out[430]:
[Output abridged: a 5 × 4494 DataFrame of component weights, one row per principal component (PC-1 through PC-5) and one column per generated polynomial feature (x0 ... x27 and their second- and third-order products).]

5 rows × 4494 columns

Now, let's take a look at the features with the largest loadings on the first principal component.

In [426]:
feature_frame[0:1].sort_values(by=['PC-1'], axis = 1, ascending = False).T[0:3]
Out[426]:
PC-1
x4^3 0.960024
x2 x4^2 0.146450
x4^2 0.101705

The first principal component is dominated by x4^3, with smaller contributions from x2 x4^2 and x4^2. Since all three loadings are positive, observations that score high on this component tend to have large values of all three features at once.
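Under the hood, these loadings come straight out of pca.components_, which has one row per component. As a minimal sketch of how a table like feature_frame could be built — assuming pca has already been fit and poly is the PolynomialFeatures object that generated the columns (both placeholder names for whatever was used above):

import pandas as pd

# One row per principal component; columns are loadings on the original
# polynomial features (names like 'x4^3', 'x2 x4^2', ...).
loadings = pd.DataFrame(
    pca.components_,
    index=['PC-{}'.format(k + 1) for k in range(pca.components_.shape[0])],
    columns=poly.get_feature_names())

# Three largest loadings on the first component:
loadings.loc['PC-1'].sort_values(ascending=False).head(3)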

Finally, let's look at the train and test R-squared values for regressions on the first 1 through 10 principal components.

In [435]:
R2_pca_train = []
R2_pca_test = []

for i in range(10):
    # Fit PCA with i+1 components on the polynomial training features
    pca = PCA(n_components=i+1)
    pca.fit(X_train_full_poly_nonzero_col)

    # Project both the train and test sets onto those components
    X_train_pca = pca.transform(X_train_full_poly_nonzero_col)
    X_test_pca = pca.transform(X_test_full_poly_nonzero_col)

    # Regress the response on the component scores
    regression_model = LinearRegression(fit_intercept=True)
    regression_model.fit(X_train_pca, y_train)

    # Record train and test R^2 for this number of components
    R2_pca_train.append(regression_model.score(X_train_pca, y_train))
    R2_pca_test.append(regression_model.score(X_test_pca, y_test))

    print('Explained variance ratio for Model with', i + 1, 'components:', pca.explained_variance_ratio_)
    
Explained variance ratio for Model with 1 components: [ 0.8129783]
Explained variance ratio for Model with 2 components: [ 0.8129783   0.04373497]
Explained variance ratio for Model with 3 components: [ 0.8129783   0.04373497  0.03401266]
Explained variance ratio for Model with 4 components: [ 0.8129783   0.04373497  0.03401266  0.02028036]
Explained variance ratio for Model with 5 components: [ 0.8129783   0.04373497  0.03401266  0.02028036  0.01556333]
Explained variance ratio for Model with 6 components: [ 0.8129783   0.04373497  0.03401266  0.02028036  0.01556333  0.00778059]
Explained variance ratio for Model with 7 components: [ 0.8129783   0.04373497  0.03401266  0.02028036  0.01556333  0.00778059
  0.00562893]
Explained variance ratio for Model with 8 components: [ 0.8129783   0.04373497  0.03401266  0.02028036  0.01556333  0.00778059
  0.00562893  0.00509171]
Explained variance ratio for Model with 9 components: [ 0.8129783   0.04373497  0.03401266  0.02028036  0.01556333  0.00778059
  0.00562893  0.00509171  0.00434038]
Explained variance ratio for Model with 10 components: [ 0.8129783   0.04373497  0.03401266  0.02028036  0.01556333  0.00778059
  0.00562893  0.00509171  0.00434038  0.0033645 ]
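One implementation note: when the full SVD solver is used, the leading components of a larger PCA fit are identical to those of a smaller one, so the loop above could also fit PCA once with 10 components and slice columns. A sketch under that assumption, reusing the variable names from above:

# Fit once with the maximum number of components...
pca_full = PCA(n_components=10)
pca_full.fit(X_train_full_poly_nonzero_col)
Z_train = pca_full.transform(X_train_full_poly_nonzero_col)
Z_test = pca_full.transform(X_test_full_poly_nonzero_col)

# ...then reuse the first i columns for each model. The R^2 values should
# match the loop above, since the first i component scores are the same.
for i in range(1, 11):
    model = LinearRegression(fit_intercept=True).fit(Z_train[:, :i], y_train)
    print(i, model.score(Z_train[:, :i], y_train), model.score(Z_test[:, :i], y_test))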
In [436]:
R2_pca_train
Out[436]:
[0.0010658459075083559,
 0.03830594358760786,
 0.13381819106030979,
 0.13382136439024239,
 0.13585955685136197,
 0.13676941737898085,
 0.14451154996044147,
 0.14712808866011484,
 0.17240002211375349,
 0.17960666107411249]
In [437]:
R2_pca_test
Out[437]:
[-0.0053200353011577661,
 0.051752057220783021,
 0.10653207172873704,
 0.10667438546230867,
 0.11083439698814734,
 0.11371485746532772,
 0.115030280594421,
 0.11572373943930712,
 0.13615410672470851,
 0.14113152029249787]
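Two things stand out in these results. First, the train R-squared with a single component is essentially zero even though that component explains about 81% of the variance in the features: the directions of greatest variance in X are not necessarily the directions most predictive of y. Second, the test R-squared climbs steadily to about 0.14 at 10 components, so adding components keeps helping here. A quick way to eyeball where the gains level off is to plot both curves — a sketch reusing the lists computed above:

import matplotlib.pyplot as plt

n_components = range(1, 11)
plt.plot(n_components, R2_pca_train, 'o-', label='Train $R^2$')
plt.plot(n_components, R2_pca_test, 's-', label='Test $R^2$')
plt.xlabel('Number of principal components')
plt.ylabel('$R^2$')
plt.title('PCA regression: train vs. test $R^2$')
plt.legend()
plt.show()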