StatQuest: Decision Trees

Link

A DT asks a question and classifies based on the answer

Note: Classifications can be categorical or numeric

In the 2nd case we are using mouse wt to predict mouse size

More complex DT:

It combines numeric data:

With Yes/No data:

Notice that the cut-off for Resting heart rate need not be the same on both sides

Also the order of questions need not be the same on both sides

The final classifications may be repeated

You start at the top and go down until you get to a point where you can't go any further

That's how a sample is classified

Raw table of data to DT:

We want to create a tree that uses chest pain, good blood circulation, and blocked artery status to predict heart disease (y/n)

We want to decide which of chest pain, good blood circulation, blocked artery status should be root node

We start off by exploring how well Chest pain classifies heart disease and build a tree as shown below:

We build similar trees for Good Blood Circulation and blocked Artery

As shown above, we don't know the BA status for this patient. We skip it here, but there are other alternatives

Because there are missing values for a feature, the total number of patients in each tree is diff

Because none of the leaf nodes is 100% YES Heart disease or 100% NO, they are all considered "impure"

To determine which separation is best we need a way to measure and compare impurity

Gini method to measure impurity

Gini impurity (GI) is calculated for each leaf node as shown below:

Similarly we calculate GI for right leaf node

The leaf nodes do not represent the same number of patients

Thus the total GI for using Chest pain as the root node is the weighted avg of the GI of the 2 leaf nodes:
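A minimal R sketch of these two formulas, using the GBC leaf counts quoted below (37 y / 127 n and 100 y / 33 n); the function name gini is mine, not from the video:

gini <- function(yes, no) {
  p <- yes / (yes + no)
  1 - p^2 - (1 - p)^2          # GI for one leaf: 1 - P(yes)^2 - P(no)^2
}
gi_left  <- gini(37, 127)      # ~0.35
gi_right <- gini(100, 33)      # ~0.37
n_left   <- 37 + 127
n_right  <- 100 + 33
(n_left * gi_left + n_right * gi_right) / (n_left + n_right)   # total GI ~0.36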

Similarly we calculate GI for all 3 possible root nodes

Good blood circulation has the lowest impurity, i.e. it separates the people with and without heart disease the best

So first node (root) = GBC

After the split we get 2 leaf nodes

Left: (37 y, 127 n)

Right: (100 y, 33 n)

Now we need to figure out how to separate (and if we should separate further) these patients in the Left and Right

Lets start with left:

These are the patients with GBC == true

Just like before we separate these patients based on CP and calculate GI as before

We do same for Blocked Artery

GI for BA = 0.29

This is less than the GI for CP and also less than the GI for GBC

Thus we use BA in the left part

Resulting tree:

Now we will use CP to try to separate the Root->L->L node (24/25)

These are the patients with GBC = true and BA = true

CP does a good job in separating the patients:

Now we look at node in Root->L->R (13/102)

Lets try and use CP to divide these 115 patients

Note : Vast majority (89%) of patients in this node dont have heart disease

After separating we get a higher GI than before separating

So we make this node a leaf node

We have built the entire LHS of the tree

For RHS we follow same steps:

  1. Calculate all GI scores

  2. If the node itself has the lowest score, then there is no point in separating and the node becomes a leaf node

  3. If separating the data results in an improvement, pick the separation with the lowest impurity value

Complete tree:

Numeric data in DT:

Imagine if our features were numeric not just Y/N:

  1. Sort patients by wt (lowest to highest)

  2. Calculate the avg wt for each pair of adjacent patients

  3. Calculate the GI for each avg wt

In the above diag, GI is calculated for wt < 167.5

  4. Pick the cutoff with the lowest GI: here it occurs at wt < 205 (GI = 0.27)

So this is the cutoff that we will use when we compare wt to CP or BA
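A hedged R sketch of this whole procedure, with made-up weights and labels (the notes don't reproduce the actual table; the first midpoint works out to 167.5 as in the diag):

wt <- c(155, 180, 190, 220, 225, 310)            # hypothetical weights, sorted
hd <- c(FALSE, TRUE, TRUE, TRUE, FALSE, TRUE)    # hypothetical heart disease labels
cutoffs <- (head(wt, -1) + tail(wt, -1)) / 2     # avg of each adjacent pair, e.g. 167.5
gini <- function(y) { p <- mean(y); 1 - p^2 - (1 - p)^2 }
gi <- sapply(cutoffs, function(cut) {
  left <- hd[wt < cut]; right <- hd[wt >= cut]
  (length(left) * gini(left) + length(right) * gini(right)) / length(hd)
})
cutoffs[which.min(gi)]                           # the cutoff with the lowest GI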

DT with ranked data and multiple choice data

Ranked data is similar to numeric data, except that now we calculate impurity scores for all possible ranks

So if rank is from 1 to 4 (4 being best), we calculate impurity scores as:

  • rank <= 1

  • rank <= 2

  • rank <= 3

We dont need <=4 as it includes everyone

When there are multiple choices, like color choices B, R or G, we calculate GI for each one as well as each possible combination

  • B

  • G

  • R

  • B or G

  • B or R

  • G or R

We dont need to calculate for B or R or G as it includes everyone
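A small R sketch that enumerates exactly these candidate splits (each subset vs. the rest, skipping the full set):

lvls <- c("B", "G", "R")
subsets <- unlist(lapply(1:(length(lvls) - 1),
                         function(k) combn(lvls, k, simplify = FALSE)),
                  recursive = FALSE)
sapply(subsets, paste, collapse = " or ")
# "B" "G" "R" "B or G" "B or R" "G or R"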

StatQuest: Random Forests Part 1 - Building, Using and Evaluating

Link

DTs are easy to build, use and interpret

But in practice, they are not that awesome

Trees have one aspect that prevents them from being the ideal tool for predictive learning, namely inaccuracy

They work great with the data used to create them but are not flexible when it comes to classifying new samples

RF combines simplicity of DTs with flexibility resulting in a vast improvement in accuracy

Step 1 : Create a "bootstrapped" dataset

Say these 4 samples are entire dataset

To create a bootstrapped dataset that is the same size as the original, we randomly select samples from the original dataset

We are allowed to pick the same sample more than once

Say first sample in original dataset = S1

We create bootstrap dataset as: S2, S1, S4, S4
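In R, bootstrapping is just sampling row indices with replacement; a minimal sketch:

original <- c("S1", "S2", "S3", "S4")
set.seed(3)                            # any seed, only for reproducibility
original[sample(4, replace = TRUE)]    # a draw such as "S2" "S1" "S4" "S4"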

Step 2: Create a DT using Bootstrapped dataset but only use a random subset of vars (columns) at each step

In this example we will consider 2 vars at each step

Thus instead of considering all 4 vars (CP, GBC, BA, Wt) to figure out how to split the root node we randomly select 2 : GBC, BA

Say GBC did the best job at separating the samples

Since we used GBC, we grey it out so that we can focus on the remaining vars

Now we have to figure out how to select vars for circled node:

Just like for the root we randomly select 2 vars from (CP, BA, wt)

We select CP and wt
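A sketch of the variable sampling at each node (names from this example; greying out a used var along the path, as these notes describe):

vars <- c("CP", "GBC", "BA", "Wt")
sample(vars, 2)                     # at the root: e.g. "GBC" "BA"
sample(setdiff(vars, "GBC"), 2)     # at the next node: e.g. "CP" "Wt"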

Thus we build the tree by:

  1. using the bootstrapped dataset

  2. considering a random subset of vars at each step

This is done for a single tree

Now we make a new bootstrapped dataset and build a new tree, again considering a random subset of vars at each step

Ideally we do this 100s of times

Because of the randomness in creating the bootstrapped dataset and in choosing random columns at each step, a RF results in a wide variety of DTs

This variety makes a RF more effective than a single DT

Now that we have created the RF, how do we use it?

First we get data of a new patient

We want to predict if Heart disease or not

We take the data and run it down the 1st tree

Output: Yes

We keep track of this result

Similarly we run data thru 2nd... last tree

We keep track of the results and see which option received most votes

Here Yes: 5 No : 1

So conclusion : YES
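A minimal sketch of the vote count in R:

votes <- c("Yes", "Yes", "No", "Yes", "Yes", "Yes")  # one prediction per tree
table(votes)                     # No: 1, Yes: 5
names(which.max(table(votes)))   # "Yes" -- the forest's conclusion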

Bagging: Bootstrapping the data plus using the aggregate to make a decision is called Bagging

Test accuracy of a RF

When we created the bootstrapped dataset we allowed duplicate entries in the bootstrapped dataset

As a result, the above entry was not included in the bootstrapped dataset

Typically about 1/3 of the original data does not end up in the bootstrapped dataset

These entries are called the Out-of-Bag Dataset

We know the true results for the OoB data

Say there is only 1 entry in the OoB data and its label = No

We use these entries to test the RF

We run the data through our first DT

Result : No

Similarly we run it through all the trees and keep track of the results

Then we choose the most common result: here it is correct and = No

We repeat the process for all OoB samples for all trees

Some may be incorrectly labeled

Accuracy: Proportion of OoB samples that were correctly classified by the RF

The proportion of OoB samples that were incorrectly classified is the OoB Error
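As a sketch, with a hypothetical vector recording whether each OoB sample was classified correctly:

oob_correct <- c(TRUE, TRUE, FALSE, TRUE, TRUE, FALSE, TRUE)  # hypothetical results
mean(oob_correct)       # accuracy
1 - mean(oob_correct)   # OoB error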

We now know how to:

  • Build a RF

  • Use a RF

  • Estimate accuracy of RF

We used 2 vars to make a decision at each step

Now we can compare the OoB Error for a RF built using 2 vars per step to a RF built using 3 vars per step

We can test many diff settings and choose the most accurate RF

Process:

  1. Build a RF

  2. Est accuracy of RF

  3. Change no of vars used per step

  4. Repeat a number of times and choose the RF that is most accurate

Typically we start with the square root of the number of vars and then try a few settings above and below that value
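A hedged sketch of that tuning loop with randomForest() (mtry = no of vars tried per step; assumes a dataset like data.imputed from the R section below):

library(randomForest)
oob_err <- sapply(1:10, function(m) {
  rf <- randomForest(hd ~ ., data = data.imputed, mtry = m)
  rf$err.rate[nrow(rf$err.rate), "OOB"]   # OoB error after the last tree
})
which.min(oob_err)                        # mtry with the lowest OoB error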

StatQuest: Random Forests Part 2: Missing data and clustering

Link

Lets see how RF deals with missing data

Missing data can be of 2 types:

  • Missing data can be in original dataset
  • It may be in a new sample we want to categorize

Lets start with Missing data in the original dataset:

We want to create a RF from the data

But we dont know if the 4th patient has BA or what is their wt

We make an initial guess that may be bad and gradually refine the guess until it (hopefully) gets good

Initial guess for BA = most common value = No

Since wt is numeric our initial guess is the median val = 180

This is the dataset with the initial guesses
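A minimal sketch of the initial guesses (hypothetical columns; the 4th entry is the missing one):

ba <- factor(c("No", "Yes", "No", NA))        # blocked artery status
wt <- c(125, 180, 210, NA)                    # weight
ba[is.na(ba)] <- names(which.max(table(ba)))  # most common value -> "No"
wt[is.na(wt)] <- median(wt, na.rm = TRUE)     # median -> 180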

Now we want to refine our guesses

We do this by determining which samples are similar to the one with the missing data

Determining Similarity:

  1. Build a RF

  2. Run all of the data down all of the trees

Lets start by running all of the data down the 1st tree:

Say sample 3 and 4 ended up in the same leaf node

That means they are similar (that is how similarity is defined in a RF)

We keep track of similar samples using a Proximity Matrix

The PM has a row for each sample and a col for each sample

As samples 3 and 4 are similar, we put a 1 there

Similarly we run all of the data down the 2nd tree

Now say samples 2, 3 and 4 all ended up in the same leaf node

PM now:

We add 1 each time a pair ends up in the same leaf node again

We run all the data down the 3rd tree

Updated PM:

Ultimately, we run the data down all the trees and the PM fills in

Then we divide each proximity value by total number of trees (say we had 10 trees)

Updated PM:
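A sketch of how the PM accumulates, with hypothetical leaf assignments for one tree:

prox <- matrix(0, 4, 4)          # 4 samples -> 4x4 proximity matrix
leaf <- c(1, 2, 3, 3)            # hypothetical: samples 3 and 4 share a leaf
same <- outer(leaf, leaf, "==")  # TRUE where two samples share a leaf
diag(same) <- FALSE              # don't count a sample with itself
prox <- prox + same              # add 1 for each pair in the same leaf
# ...repeat for every tree, then divide: prox / n_trees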

Now we can use the proximity values for sample 4 to make better guesses about the missing data

For BA we calculate the weighted freq of Y and N using prox values as wts

Calculations:

Freq of Yes = 1/3

Freq of No = 2/3

The weighted freq of Yes = Freq of Yes * the weight for Yes

The weight for Yes = (Proximity of Yes)/(All proximities)

The proximity for Yes = Proximity value for sample 2 (the only one with Yes)

We divide that by sum of proximities for sample 4

The weight for Yes = 0.1/(0.1 + 0.1 + 0.8) = 0.1

The weighted freq of Yes = 1/3 * 0.1 = 0.03

Similarly,

The weighted freq of No = Freq of No * the weight for No

The weight for No = (0.1 + 0.8)/(0.1 + 0.1 + 0.8) = 0.9

The weighted freq of No = 2/3 * 0.9 = 0.6

Since No has a much higher weighted freq, we choose No

So our new, revised guess for BA, based on the proximities, is No

Filling in missing values for wt:

For wt we use proximities to calculate a weighted avg

Weighted avg = (wt of sample 1 * weight for sample 1) + (wt of sample 2 * weight for sample 2) + (wt of sample 3 * weight for sample 3)

weight for sample 1 = (proximity of sample 1) / (sum of proximities) = 0.1 / (0.1 + 0.1 + 0.8) = 0.1

Similarly we calculate the weights for samples 2 and 3

Weighted avg = (125 * 0.1) + (180 * 0.1) + (210 * 0.8) = 198.5

The weights used to calculate the weighted avg are based on the proximities

So we fill in the missing val as 198.5
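Both guesses in one short R sketch, using the proximities quoted above (0.1, 0.1, 0.8 for samples 1-3 relative to sample 4):

prox <- c(0.1, 0.1, 0.8)       # sample 4's proximities to samples 1-3
w <- prox / sum(prox)          # weights: 0.1 0.1 0.8
ba <- c("No", "Yes", "No")     # BA values of samples 1-3
(1/3) * sum(w[ba == "Yes"])    # weighted freq of Yes ~0.03
(2/3) * sum(w[ba == "No"])     # weighted freq of No = 0.6 -> choose No
wts <- c(125, 180, 210)        # wts of samples 1-3
sum(wts * w)                   # weighted avg = 198.5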

Now that we have revised our guesses a little bit, we do the whole thing over again..

  • we build a RF
  • run data thru the trees
  • recalculate proximities
  • recalculate missing vals
  • we do this 6 or 7 times until the missing values converge i.e. no longer change each time we recalculate

Super Cool stuff with the PM:

We have already seen this PM

This is the PM b4 we divided each value by 10

If samples 3 and 4 ended up in the same leaf node for all 10 trees:

We divide each number by 10

For Samples 3 and 4 the entry will be 1

1 in PM => samples are as close as they can be

Also

1 - prox value = distance

Thus it is possible to derive a Distance Matrix from the PM

Getting distance matrix (which is similar to corr matrix) means we can plot Heat Maps

We can also draw MDS Plots
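A sketch, assuming a forest built with proximity = TRUE (as model is in the R section below):

d <- as.dist(1 - model$proximity)   # 1 - proximity = distance
heatmap(as.matrix(d))               # heat map of the distances
mds <- cmdscale(d, k = 2)           # classical MDS, 2 dimensions
plot(mds[, 1], mds[, 2])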


Missing data in new sample that we want to categorize

Imagine that we have already built a RF and we wanted to classify a new patient

But the patient has missing data for BA

We dont know if patient has BA

So we need to make a guess about BA so that we can run the patient down all the trees in the forest

  1. Create 2 copies of the data (one labeled Yes for heart disease, one labeled No)

  2. Then we use the iterative method discussed above to make a good guess about the missing values

  3. These are the guesses that we came up with:

  4. Then we run the 2 samples down the trees in the forest

  5. Then we see which of the 2 is correctly labeled by the RF the most times

  6. The sample that is correctly labeled more often wins


StatQuest: Random Forests in R

Link

In [106]:
# Import libraries:

library(ggplot2)

# cowplot improves ggplot2's default settings

library(cowplot)

library(randomForest)
In [107]:
# install.packages('cowplot', repos='http://cran.us.r-project.org')

We are going to use the heart disease dataset from the UCI ML repo

In [108]:
url <- "http://archive.ics.uci.edu/ml/machine-learning-databases/heart-disease/processed.cleveland.data"

data <- read.csv(url, header = FALSE)

head(data)
 V1 V2 V3  V4  V5 V6 V7  V8 V9 V10 V11 V12 V13 V14
 63  1  1 145 233  1  2 150  0 2.3   3 0.0 6.0   0
 67  1  4 160 286  0  2 108  1 1.5   2 3.0 3.0   2
 67  1  4 120 229  0  2 129  1 2.6   2 2.0 7.0   1
 37  1  3 130 250  0  0 187  0 3.5   3 0.0 3.0   0
 41  0  2 130 204  0  2 172  0 1.4   1 0.0 3.0   0
 56  1  2 120 236  0  0 178  0 0.8   1 0.0 3.0   0

Lets label the cols:

Data Manual

Only 14 attributes are used:

  1. #3 (age)
  2. #4 (sex)
  3. #9 (cp)
  4. #10 (trestbps)
  5. #12 (chol)
  6. #16 (fbs)
  7. #19 (restecg)
  8. #32 (thalach)
  9. #38 (exang)
  10. #40 (oldpeak)
  11. #41 (slope)
  12. #44 (ca)
  13. #51 (thal)
  14. #58 (num) (the predicted attribute)

3 age: age in years

4 sex: sex (1 = male; 0 = female)

9 cp: chest pain type
  • Value 1: typical angina
  • Value 2: atypical angina
  • Value 3: non-anginal pain
  • Value 4: asymptomatic

10 trestbps: resting blood pressure (in mm Hg on admission to the hospital)

12 chol: serum cholestoral in mg/dl

16 fbs: (fasting blood sugar > 120 mg/dl) (1 = true; 0 = false)

19 restecg: resting electrocardiographic results
  • Value 0: normal
  • Value 1: having ST-T wave abnormality (T wave inversions and/or ST elevation or depression of > 0.05 mV)
  • Value 2: showing probable or definite left ventricular hypertrophy by Estes' criteria

32 thalach: maximum heart rate achieved

38 exang: exercise induced angina (1 = yes; 0 = no)

40 oldpeak = ST depression induced by exercise relative to rest

41 slope: the slope of the peak exercise ST segment
  • Value 1: upsloping
  • Value 2: flat
  • Value 3: downsloping

44 ca: number of major vessels (0-3) colored by flourosopy

51 thal: 3 = normal; 6 = fixed defect; 7 = reversable defect

58 num: diagnosis of heart disease (angiographic disease status)
  • Value 0: < 50% diameter narrowing
  • Value 1: > 50% diameter narrowing
(in any major vessel: attributes 59 through 68 are vessels)

In [109]:
colnames(data) <- c("age", "sex", "cp", "trestbps", "chol", "fbs", "restecg", "thalach", "exang", "oldpeak", 
                    "slope", "ca", "thal", "hd")

head(data)
age sex cp trestbps chol fbs restecg thalach exang oldpeak slope  ca thal hd
 63   1  1      145  233   1       2     150     0     2.3     3 0.0  6.0  0
 67   1  4      160  286   0       2     108     1     1.5     2 3.0  3.0  2
 67   1  4      120  229   0       2     129     1     2.6     2 2.0  7.0  1
 37   1  3      130  250   0       0     187     0     3.5     3 0.0  3.0  0
 41   0  2      130  204   0       2     172     0     1.4     1 0.0  3.0  0
 56   1  2      120  236   0       0     178     0     0.8     1 0.0  3.0  0

str() function gives us the structure of the data

In [110]:
str(data)
'data.frame':	303 obs. of  14 variables:
 $ age     : num  63 67 67 37 41 56 62 57 63 53 ...
 $ sex     : num  1 1 1 1 0 1 0 0 1 1 ...
 $ cp      : num  1 4 4 3 2 2 4 4 4 4 ...
 $ trestbps: num  145 160 120 130 130 120 140 120 130 140 ...
 $ chol    : num  233 286 229 250 204 236 268 354 254 203 ...
 $ fbs     : num  1 0 0 0 0 0 0 0 0 1 ...
 $ restecg : num  2 2 2 0 2 0 2 0 2 2 ...
 $ thalach : num  150 108 129 187 172 178 160 163 147 155 ...
 $ exang   : num  0 1 1 0 0 0 0 1 0 1 ...
 $ oldpeak : num  2.3 1.5 2.6 3.5 1.4 0.8 3.6 0.6 1.4 3.1 ...
 $ slope   : num  3 2 2 3 1 1 3 1 2 3 ...
 $ ca      : Factor w/ 5 levels "?","0.0","1.0",..: 2 5 4 2 2 2 4 2 3 2 ...
 $ thal    : Factor w/ 4 levels "?","3.0","6.0",..: 3 2 4 2 2 2 2 2 4 4 ...
 $ hd      : int  0 2 1 0 0 0 3 0 2 1 ...

Some of the cols are messed up

  • sex is supposed to be a factor where 0: female and 1: male

  • cp is supposed to be a factor where levels 1-3 represent diff types of pain and 4 represents no chest pain

  • fbs is supposed to be a factor

  • restecg is supposed to be a factor

  • exang is supposed to be a factor

  • slope is supposed to be a factor

  • ca and thal are correctly called factors but one of the levels is "?" when we need it to be NA

Change "?" to NA:

In [111]:
data[data == '?'] <- NA

To make data easier on the eye, convert 0s in sex to F and 1s to M

Then convert the col into a factor

In [112]:
data[data$sex == 0,]$sex <- "F"

data[data$sex == 1,]$sex <- "M"

data$sex <- as.factor(data$sex)

head(data)
age sex cp trestbps chol fbs restecg thalach exang oldpeak slope  ca thal hd
 63   M  1      145  233   1       2     150     0     2.3     3 0.0  6.0  0
 67   M  4      160  286   0       2     108     1     1.5     2 3.0  3.0  2
 67   M  4      120  229   0       2     129     1     2.6     2 2.0  7.0  1
 37   M  3      130  250   0       0     187     0     3.5     3 0.0  3.0  0
 41   F  2      130  204   0       2     172     0     1.4     1 0.0  3.0  0
 56   M  2      120  236   0       0     178     0     0.8     1 0.0  3.0  0

We convert the other cols into factors:

In [113]:
data$cp = as.factor(data$cp)
data$fbs = as.factor(data$fbs)
data$restecg = as.factor(data$restecg)
data$exang = as.factor(data$exang)
data$slope = as.factor(data$slope)

Since the ca and thal cols had "?" in them, R took them to be cols of strings

We convert these cols to int and then convert them to factors

In [114]:
data$ca <- as.integer(data$ca)
data$ca <- as.factor(data$ca)

data$thal <- as.integer(data$thal)
data$thal <- as.factor(data$thal)

The last thing is to convert hd (heart disease) into a factor: 0 becomes "Healthy" and everything else becomes "Unhealthy"

In [115]:
data$hd <- ifelse(test = data$hd == 0, yes = "Healthy", no = "Unhealthy")

data$hd <- as.factor(data$hd)
In [116]:
head(data)
age sex cp trestbps chol fbs restecg thalach exang oldpeak slope ca thal hd
63 M 1 145 233 1 2 150 0 2.3 3 2 3 Healthy
67 M 4 160 286 0 2 108 1 1.5 2 5 2 Unhealthy
67 M 4 120 229 0 2 129 1 2.6 2 4 4 Unhealthy
37 M 3 130 250 0 0 187 0 3.5 3 2 2 Healthy
41 F 2 130 204 0 2 172 0 1.4 1 2 2 Healthy
56 M 2 120 236 0 0 178 0 0.8 1 2 2 Healthy
In [117]:
str(data)
'data.frame':	303 obs. of  14 variables:
 $ age     : num  63 67 67 37 41 56 62 57 63 53 ...
 $ sex     : Factor w/ 2 levels "F","M": 2 2 2 2 1 2 1 1 2 2 ...
 $ cp      : Factor w/ 4 levels "1","2","3","4": 1 4 4 3 2 2 4 4 4 4 ...
 $ trestbps: num  145 160 120 130 130 120 140 120 130 140 ...
 $ chol    : num  233 286 229 250 204 236 268 354 254 203 ...
 $ fbs     : Factor w/ 2 levels "0","1": 2 1 1 1 1 1 1 1 1 2 ...
 $ restecg : Factor w/ 3 levels "0","1","2": 3 3 3 1 3 1 3 1 3 3 ...
 $ thalach : num  150 108 129 187 172 178 160 163 147 155 ...
 $ exang   : Factor w/ 2 levels "0","1": 1 2 2 1 1 1 1 2 1 2 ...
 $ oldpeak : num  2.3 1.5 2.6 3.5 1.4 0.8 3.6 0.6 1.4 3.1 ...
 $ slope   : Factor w/ 3 levels "1","2","3": 3 2 2 3 1 1 3 1 2 3 ...
 $ ca      : Factor w/ 4 levels "2","3","4","5": 1 4 3 1 1 1 3 1 2 1 ...
 $ thal    : Factor w/ 3 levels "2","3","4": 2 1 3 1 1 1 1 1 3 3 ...
 $ hd      : Factor w/ 2 levels "Healthy","Unhealthy": 1 2 2 1 1 1 2 1 2 2 ...

Since we are going to be randomly sampling things, let's set the seed for the random number generator so that we can reproduce our results

In [118]:
set.seed(42)

Now we impute values for the NAs in the dataset with rfImpute()

The 1st arg is hd ~ .

This means that we want the hd col to be predicted by the data in the other cols

data specifies the dataset

iter = 6: Here we specify how many RFs rfImpute() should build to estimate the missing values

In theory, 4-6 iters are enough

Lastly, we save the results, i.e. the dataset with imputed values instead of NAs, as data.imputed

In [119]:
data.imputed = rfImpute(hd ~ ., data = data, iter = 6)

head(data)
ntree      OOB      1      2
  300:  17.49% 14.02% 21.58%
ntree      OOB      1      2
  300:  17.16% 13.41% 21.58%
ntree      OOB      1      2
  300:  17.49% 14.02% 21.58%
ntree      OOB      1      2
  300:  17.16% 13.41% 21.58%
ntree      OOB      1      2
  300:  17.16% 13.41% 21.58%
ntree      OOB      1      2
  300:  17.16% 13.41% 21.58%
age sex cp trestbps chol fbs restecg thalach exang oldpeak slope ca thal hd
63 M 1 145 233 1 2 150 0 2.3 3 2 3 Healthy
67 M 4 160 286 0 2 108 1 1.5 2 5 2 Unhealthy
67 M 4 120 229 0 2 129 1 2.6 2 4 4 Unhealthy
37 M 3 130 250 0 0 187 0 3.5 3 2 2 Healthy
41 F 2 130 204 0 2 172 0 1.4 1 2 2 Healthy
56 M 2 120 236 0 0 178 0 0.8 1 2 2 Healthy

After each iteration, rfImpute() prints the Out-of-Bag (OOB) error rate

This should get smaller if the estimates are improving

Now that we have imputed the values, we build a RF

In [120]:
model <- randomForest(hd ~ ., data = data.imputed, proximity = TRUE)

The 1st arg is hd ~ .

This means that we want the hd col to be predicted by the data in the other cols

We also want randomForest() to return the PM

We will use this to cluster the samples

Lastly, we save the randomForest and associated data, like the PM, as model

Get summary of RF and how well it performed:

In [121]:
model
Call:
 randomForest(formula = hd ~ ., data = data.imputed, proximity = TRUE) 
               Type of random forest: classification
                     Number of trees: 500
No. of variables tried at each split: 3

        OOB estimate of  error rate: 16.5%
Confusion matrix:
          Healthy Unhealthy class.error
Healthy       141        23   0.1402439
Unhealthy      27       112   0.1942446

Type of random forest: classification

If we had used the RF to predict wt or ht, it would say "regression"

If we had omitted the variable the RF is supposed to predict entirely, it would say "unsupervised"

Number of trees: 500: how many trees are in RF

No. of variables tried at each split: 3

  • how many cols of data were considered at each internal node

Classification trees have a default setting of the sq root of the no of vars

Regression trees have a default setting of the no of vars divided by 3

OOB estimate of error rate: 16.5% : This means that 83.5% of the OoB samples were correctly classified by the RF

          Healthy Unhealthy class.error
Healthy       141        23   0.1402439
Unhealthy      27       112   0.1942446

This is the Confusion Matrix

In [122]:
head(model$err.rate)
      OOB   Healthy Unhealthy
0.2672414 0.1969697 0.3600000
0.2702703 0.2452830 0.3037975
0.2616034 0.2692308 0.2523364
0.2643678 0.2797203 0.2457627
0.2795699 0.2894737 0.2677165
0.2762238 0.2709677 0.2824427
In [123]:
nrow(model$err.rate)
500

Each row in model$err.rate reflects the error rates at diff stages of creating the RF

The 1st row contains error rates after making 1st tree

2nd row contains error rates after making 1st 2 trees

... and so on

last row contains error rates after making all 500 trees

We want to construct a long-format df with one row per (tree, error-type) pair, i.e. the type of error in the rows rather than the cols

In [124]:
print(rep(c(2,4), each = 4))

print(rep(c(2,4), times = 4))
[1] 2 2 2 2 4 4 4 4
[1] 2 4 2 4 2 4 2 4

Creating col: Type

In [125]:
Type = rep(c("OOB", "Healthy", "Unhealthy"), each = nrow(model$err.rate))

Type
'OOB' × 500 (entries 1-500), then 'Healthy' × 500 (entries 501-1000), then 'Unhealthy' × 500 (entries 1001-1500)
Creating col: Trees

In [126]:
Trees = rep(1:nrow(model$err.rate), times = 3)

Trees
1 2 3 ... 500, repeated 3 times (once per error type: OOB, Healthy, Unhealthy)
  960. 460
  961. 461
  962. 462
  963. 463
  964. 464
  965. 465
  966. 466
  967. 467
  968. 468
  969. 469
  970. 470
  971. 471
  972. 472
  973. 473
  974. 474
  975. 475
  976. 476
  977. 477
  978. 478
  979. 479
  980. 480
  981. 481
  982. 482
  983. 483
  984. 484
  985. 485
  986. 486
  987. 487
  988. 488
  989. 489
  990. 490
  991. 491
  992. 492
  993. 493
  994. 494
  995. 495
  996. 496
  997. 497
  998. 498
  999. 499
  1000. 500
  1001. 1
  1002. 2
  1003. 3
  1004. 4
  1005. 5
  1006. 6
  1007. 7
  1008. 8
  1009. 9
  1010. 10
  1011. 11
  1012. 12
  1013. 13
  1014. 14
  1015. 15
  1016. 16
  1017. 17
  1018. 18
  1019. 19
  1020. 20
  1021. 21
  1022. 22
  1023. 23
  1024. 24
  1025. 25
  1026. 26
  1027. 27
  1028. 28
  1029. 29
  1030. 30
  1031. 31
  1032. 32
  1033. 33
  1034. 34
  1035. 35
  1036. 36
  1037. 37
  1038. 38
  1039. 39
  1040. 40
  1041. 41
  1042. 42
  1043. 43
  1044. 44
  1045. 45
  1046. 46
  1047. 47
  1048. 48
  1049. 49
  1050. 50
  1051. 51
  1052. 52
  1053. 53
  1054. 54
  1055. 55
  1056. 56
  1057. 57
  1058. 58
  1059. 59
  1060. 60
  1061. 61
  1062. 62
  1063. 63
  1064. 64
  1065. 65
  1066. 66
  1067. 67
  1068. 68
  1069. 69
  1070. 70
  1071. 71
  1072. 72
  1073. 73
  1074. 74
  1075. 75
  1076. 76
  1077. 77
  1078. 78
  1079. 79
  1080. 80
  1081. 81
  1082. 82
  1083. 83
  1084. 84
  1085. 85
  1086. 86
  1087. 87
  1088. 88
  1089. 89
  1090. 90
  1091. 91
  1092. 92
  1093. 93
  1094. 94
  1095. 95
  1096. 96
  1097. 97
  1098. 98
  1099. 99
  1100. 100
  1101. 101
  1102. 102
  1103. 103
  1104. 104
  1105. 105
  1106. 106
  1107. 107
  1108. 108
  1109. 109
  1110. 110
  1111. 111
  1112. 112
  1113. 113
  1114. 114
  1115. 115
  1116. 116
  1117. 117
  1118. 118
  1119. 119
  1120. 120
  1121. 121
  1122. 122
  1123. 123
  1124. 124
  1125. 125
  1126. 126
  1127. 127
  1128. 128
  1129. 129
  1130. 130
  1131. 131
  1132. 132
  1133. 133
  1134. 134
  1135. 135
  1136. 136
  1137. 137
  1138. 138
  1139. 139
  1140. 140
  1141. 141
  1142. 142
  1143. 143
  1144. 144
  1145. 145
  1146. 146
  1147. 147
  1148. 148
  1149. 149
  1150. 150
  1151. 151
  1152. 152
  1153. 153
  1154. 154
  1155. 155
  1156. 156
  1157. 157
  1158. 158
  1159. 159
  1160. 160
  1161. 161
  1162. 162
  1163. 163
  1164. 164
  1165. 165
  1166. 166
  1167. 167
  1168. 168
  1169. 169
  1170. 170
  1171. 171
  1172. 172
  1173. 173
  1174. 174
  1175. 175
  1176. 176
  1177. 177
  1178. 178
  1179. 179
  1180. 180
  1181. 181
  1182. 182
  1183. 183
  1184. 184
  1185. 185
  1186. 186
  1187. 187
  1188. 188
  1189. 189
  1190. 190
  1191. 191
  1192. 192
  1193. 193
  1194. 194
  1195. 195
  1196. 196
  1197. 197
  1198. 198
  1199. 199
  1200. 200
  1201. 201
  1202. 202
  1203. 203
  1204. 204
  1205. 205
  1206. 206
  1207. 207
  1208. 208
  1209. 209
  1210. 210
  1211. 211
  1212. 212
  1213. 213
  1214. 214
  1215. 215
  1216. 216
  1217. 217
  1218. 218
  1219. 219
  1220. 220
  1221. 221
  1222. 222
  1223. 223
  1224. 224
  1225. 225
  1226. 226
  1227. 227
  1228. 228
  1229. 229
  1230. 230
  1231. 231
  1232. 232
  1233. 233
  1234. 234
  1235. 235
  1236. 236
  1237. 237
  1238. 238
  1239. 239
  1240. 240
  1241. 241
  1242. 242
  1243. 243
  1244. 244
  1245. 245
  1246. 246
  1247. 247
  1248. 248
  1249. 249
  1250. 250
  1251. 251
  1252. 252
  1253. 253
  1254. 254
  1255. 255
  1256. 256
  1257. 257
  1258. 258
  1259. 259
  1260. 260
  1261. 261
  1262. 262
  1263. 263
  1264. 264
  1265. 265
  1266. 266
  1267. 267
  1268. 268
  1269. 269
  1270. 270
  1271. 271
  1272. 272
  1273. 273
  1274. 274
  1275. 275
  1276. 276
  1277. 277
  1278. 278
  1279. 279
  1280. 280
  1281. 281
  1282. 282
  1283. 283
  1284. 284
  1285. 285
  1286. 286
  1287. 287
  1288. 288
  1289. 289
  1290. 290
  1291. 291
  1292. 292
  1293. 293
  1294. 294
  1295. 295
  1296. 296
  1297. 297
  1298. 298
  1299. 299
  1300. 300
  1301. 301
  1302. 302
  1303. 303
  1304. 304
  1305. 305
  1306. 306
  1307. 307
  1308. 308
  1309. 309
  1310. 310
  1311. 311
  1312. 312
  1313. 313
  1314. 314
  1315. 315
  1316. 316
  1317. 317
  1318. 318
  1319. 319
  1320. 320
  1321. 321
  1322. 322
  1323. 323
  1324. 324
  1325. 325
  1326. 326
  1327. 327
  1328. 328
  1329. 329
  1330. 330
  1331. 331
  1332. 332
  1333. 333
  1334. 334
  1335. 335
  1336. 336
  1337. 337
  1338. 338
  1339. 339
  1340. 340
  1341. 341
  1342. 342
  1343. 343
  1344. 344
  1345. 345
  1346. 346
  1347. 347
  1348. 348
  1349. 349
  1350. 350
  1351. 351
  1352. 352
  1353. 353
  1354. 354
  1355. 355
  1356. 356
  1357. 357
  1358. 358
  1359. 359
  1360. 360
  1361. 361
  1362. 362
  1363. 363
  1364. 364
  1365. 365
  1366. 366
  1367. 367
  1368. 368
  1369. 369
  1370. 370
  1371. 371
  1372. 372
  1373. 373
  1374. 374
  1375. 375
  1376. 376
  1377. 377
  1378. 378
  1379. 379
  1380. 380
  1381. 381
  1382. 382
  1383. 383
  1384. 384
  1385. 385
  1386. 386
  1387. 387
  1388. 388
  1389. 389
  1390. 390
  1391. 391
  1392. 392
  1393. 393
  1394. 394
  1395. 395
  1396. 396
  1397. 397
  1398. 398
  1399. 399
  1400. 400
  1401. 401
  1402. 402
  1403. 403
  1404. 404
  1405. 405
  1406. 406
  1407. 407
  1408. 408
  1409. 409
  1410. 410
  1411. 411
  1412. 412
  1413. 413
  1414. 414
  1415. 415
  1416. 416
  1417. 417
  1418. 418
  1419. 419
  1420. 420
  1421. 421
  1422. 422
  1423. 423
  1424. 424
  1425. 425
  1426. 426
  1427. 427
  1428. 428
  1429. 429
  1430. 430
  1431. 431
  1432. 432
  1433. 433
  1434. 434
  1435. 435
  1436. 436
  1437. 437
  1438. 438
  1439. 439
  1440. 440
  1441. 441
  1442. 442
  1443. 443
  1444. 444
  1445. 445
  1446. 446
  1447. 447
  1448. 448
  1449. 449
  1450. 450
  1451. 451
  1452. 452
  1453. 453
  1454. 454
  1455. 455
  1456. 456
  1457. 457
  1458. 458
  1459. 459
  1460. 460
  1461. 461
  1462. 462
  1463. 463
  1464. 464
  1465. 465
  1466. 466
  1467. 467
  1468. 468
  1469. 469
  1470. 470
  1471. 471
  1472. 472
  1473. 473
  1474. 474
  1475. 475
  1476. 476
  1477. 477
  1478. 478
  1479. 479
  1480. 480
  1481. 481
  1482. 482
  1483. 483
  1484. 484
  1485. 485
  1486. 486
  1487. 487
  1488. 488
  1489. 489
  1490. 490
  1491. 491
  1492. 492
  1493. 493
  1494. 494
  1495. 495
  1496. 496
  1497. 497
  1498. 498
  1499. 499
  1500. 500
In [127]:
Error = c(model$err.rate[, "OOB"],
         model$err.rate[, "Healthy"],
         model$err.rate[, "Unhealthy"])

length(Error)
1500

Making the data frame:

In [128]:
oob.error.data <- data.frame(Trees = Trees, Type = Type, Error = Error)

head(oob.error.data)

nrow(oob.error.data)
  Trees Type     Error
1     1  OOB 0.2672414
2     2  OOB 0.2702703
3     3  OOB 0.2616034
4     4  OOB 0.2643678
5     5  OOB 0.2795699
6     6  OOB 0.2762238

1500

Now we plot this error

In [129]:
ggplot(data = oob.error.data, aes(x=Trees, y=Error)) + geom_line(aes(color = Type))

The blue line shows the error rate when classifying Unhealthy patients

The green line shows the overall OOB error rate, so it sits between the other two (roughly their average, weighted by class size)

The red line shows the error rate when classifying Healthy patients

In general, the error rate decreases as the random forest gets more trees
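As a quick sanity check (a sketch of mine, not from the video), the green OOB line should equal the two class error rates weighted by class size, which is why it sits between the red and blue lines. This assumes the model and data.imputed objects built in the earlier cells:

# Sketch: the overall OOB error is (roughly) the class error rates
# weighted by the number of patients in each class
n.healthy   <- sum(data.imputed$hd == "Healthy")
n.unhealthy <- sum(data.imputed$hd == "Unhealthy")

last <- nrow(model$err.rate)

# Weighted average of the two class error rates after the final tree...
(n.healthy   * model$err.rate[last, "Healthy"] +
 n.unhealthy * model$err.rate[last, "Unhealthy"]) / (n.healthy + n.unhealthy)

# ...should be very close to the reported overall OOB error
model$err.rate[last, "OOB"]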

If we added more trees, would the error rate go down further?

Let's make a random forest with 1000 trees

In [130]:
model2 <- randomForest(hd ~ ., data = data.imputed, ntree = 1000, proximity = TRUE)

model2
Call:
 randomForest(formula = hd ~ ., data = data.imputed, ntree = 1000,      proximity = TRUE) 
               Type of random forest: classification
                     Number of trees: 1000
No. of variables tried at each split: 3

        OOB estimate of  error rate: 16.5%
Confusion matrix:
          Healthy Unhealthy class.error
Healthy       142        22   0.1341463
Unhealthy      28       111   0.2014388

The OOB error rate is the same as before

And the confusion matrix tells us we did no better than before

In [131]:
Type = rep(c("OOB", "Healthy", "Unhealthy"), each = nrow(model2$err.rate))
Trees = rep(1:nrow(model2$err.rate), times = 3)
Error = c(model2$err.rate[, "OOB"],
         model2$err.rate[, "Healthy"],
         model2$err.rate[, "Unhealthy"])

oob.error.data <- data.frame(Trees = Trees, Type = Type, Error = Error)

ggplot(data = oob.error.data, aes(x=Trees, y=Error)) + geom_line(aes(color = Type))

The error rates stabilize right after 500 trees

So adding more trees would not help

But we would not have known this had we not added more trees
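One way to confirm the flattening numerically (a quick sketch, assuming the model2 object above) is to compare the average OOB error over an early window of trees with a late one:

# Sketch: average OOB error around tree 500 vs. around tree 1000
mean(model2$err.rate[450:500, "OOB"])
mean(model2$err.rate[950:1000, "OOB"])

# The two averages come out nearly identical, confirming the curve has flattened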

Optimal number of variables at each internal node

This is controlled by the parameter mtry

In [132]:
# Create an empty vector to hold one OOB error rate per mtry value:

oob.values = vector(length = 10)

for (i in 1:10){

    # Build a random forest using "i" as the number of variables tried at each split

    temp.model = randomForest(hd ~ ., data = data.imputed, mtry = i, ntree = 1000)

    # print(temp.model)

    # The OOB value we want is in the 1st column (OOB) of the last row,
    # i.e. after all 1000 trees have been built

    oob.value <- temp.model$err.rate[nrow(temp.model$err.rate), 1]

    # Store the OOB error rate for the current model

    oob.values[i] <- oob.value
}

oob.values
  1. 0.171617161716172
  2. 0.171617161716172
  3. 0.161716171617162
  4. 0.184818481848185
  5. 0.174917491749175
  6. 0.194719471947195
  7. 0.181518151815182
  8. 0.201320132013201
  9. 0.188118811881188
  10. 0.194719471947195

The 3rd value, i.e. number of variables = 3, gives the lowest OOB error rate, so mtry = 3 is optimal

This also happens to be the default value (for classification, randomForest tries the square root of the number of predictors, rounded down, at each split)
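As a small follow-up (a sketch, not from the video), we can pick the winner programmatically and refit the final model with it, reusing data.imputed from the earlier cells:

# Sketch: choose the mtry with the lowest OOB error and rebuild the model
best.mtry <- which.min(oob.values)
best.mtry   # 3 for this run (forests are random, so reruns may differ slightly)

final.model <- randomForest(hd ~ ., data = data.imputed,
                            mtry = best.mtry, ntree = 1000, proximity = TRUE)
final.model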