--- title: "Building a simple neural network using Keras and Tensorflow - Updated" output: word_document: default html_notebook: default pdf_document: default html_document: df_print: paged --- **Update:** The original code has been updated to use the *tidymodels* init_split() function, rather than using the indicies method which originally used setdiff, which now may have a conflict between base R and the tidyverse. Thank you A big thank you to Leon Jessen for posting his code on github. [Building a simple neural network using Keras and Tensorflow](https://github.com/leonjessen/keras_tensorflow_on_iris/blob/master/README.md) I have forked his project on github and put his code into an R Notebook so we can run it in class. ### Motivation The following is a minimal example for building your first simple artificial neural network using Keras and TensorFlow for R. [TensorFlow for R by Rstudio lives here](https://tensorflow.rstudio.com/keras/). ### Gettings started - Install Keras and TensorFlow for R You can install the Keras for R package from CRAN as follows: ```{r eval=FALSE} # install.packages("keras") ``` TensorFlow is the default backend engine. TensorFlow and Keras can be installed as follows: ```{r eval=FALSE} # library(keras) # install_keras() ``` Naturally, we will also need **Tidyverse**. ```{r eval=FALSE} # Install from CRAN # install.packages("tidyverse") # Or the development version from GitHub # install.packages("devtools") # devtools::install_github("hadley/tidyverse") ``` Once installed, we simply load the libraries. ```{r} library("keras") suppressMessages(library("tidyverse")) ``` ### Artificial Neural Network Using the Iris Data Set Right, let's get to it! ### Data The famous (Fisher's or Anderson's) *iris* data set contains a total of 150 observations of 4 input features *Sepal.Length*, *Sepal.Width*, *Petal.Length* and *Petal.Width* and 3 output classes *setosa* *versicolor* and *virginica*, with 50 observations in each class. The distributions of the feature values looks like so: ```{r} iris_tib <- as_tibble(iris) iris_tib ``` ```{r} iris_tib %>% pivot_longer(names_to = "feature", values_to = "value", -Species) %>% ggplot(aes(x = feature, y = value, fill = Species)) + geom_violin(alpha = 0.5, scale = "width") + theme_bw() ``` Our aim is to connect the 4 input features to the correct output class using an artificial neural network. For this task, we have chosen the following simple architecture with one input layer with 4 neurons (one for each feature), one hidden layer with 4 neurons and one output layer with 3 neurons (one for each class), all fully connected. ![architecture_visualisation.png](/home/esuess/Documents/Stat654/iris_nn/img/architecture_visualisation.png) Our artificial neural network will have a total of 35 parameters: 4 for each input neuron connected to the hidden layer, plus an additional 4 for the associated first bias neuron and 3 for each of the hidden neurons connected to the output layer, plus an additional 3 for the associated second bias neuron, i.e. $4 \times 4 + 4 + 4 \times 3 + 3=35$ ### Prepare data We start with slightly wrangling the iris data set by renaming and scaling the features and converting character labels to numeric. 
### Prepare data

We start by slightly wrangling the iris data set: renaming and scaling the features and converting the character labels to numeric.

```{r}
set.seed(265509)
nn_dat <- iris_tib %>%
  mutate(sepal_length = scale(Sepal.Length),
         sepal_width  = scale(Sepal.Width),
         petal_length = scale(Petal.Length),
         petal_width  = scale(Petal.Width),
         class_label  = as.numeric(Species) - 1) %>%
  select(sepal_length, sepal_width, petal_length, petal_width, class_label)

nn_dat %>% head()
```

Then, we split the iris data into a training and a test data set with initial_split(), setting aside 20% of the data for testing.

```{r}
library(tidymodels)

set.seed(364)
n <- nrow(nn_dat)
n

iris_parts <- nn_dat %>%
  initial_split(prop = 0.8)

train <- iris_parts %>% training()
test  <- iris_parts %>% testing()

list(train, test) %>% map_int(nrow)
```

```{r}
n_total_samples <- nrow(nn_dat)
n_train_samples <- nrow(train)
n_test_samples  <- nrow(test)
```

### Create training and test data

**Note** that the functions in the keras package expect the data to be in a matrix object and not a tibble, so as.matrix() is added at the end of each pipeline.

```{r}
x_train <- train %>% select(-class_label) %>% as.matrix()
y_train <- train %>% select(class_label)  %>% as.matrix() %>% to_categorical()

x_test <- test %>% select(-class_label) %>% as.matrix()
y_test <- test %>% select(class_label)  %>% as.matrix() %>% to_categorical()

dim(y_train)
dim(y_test)
```

### Set Architecture

With the data in place, we now set the architecture of our neural network.

```{r}
model <- keras_model_sequential()
model %>%
  layer_dense(units = 4, activation = 'relu', input_shape = 4) %>%
  layer_dense(units = 3, activation = 'softmax')

model %>% summary
```

Next, the architecture set in the model needs to be compiled.

```{r}
model %>% compile(
  loss      = 'categorical_crossentropy',
  optimizer = optimizer_rmsprop(),
  metrics   = c('accuracy')
)
```

### Train the Artificial Neural Network

Lastly, we fit the model and save the training progress in the *history* object. **Try** changing the *validation_split* from 0 to 0.2 to see the *validation_loss*.

```{r}
history <- model %>% fit(
  x = x_train, y = y_train,
  epochs = 200,
  batch_size = 20,
  validation_split = 0.2
)

plot(history) +
  ggtitle("Training a neural network based classifier on the iris data set") +
  theme_bw()
```

### Evaluate Network Performance

The final performance can be obtained like so.

```{r}
perf <- model %>% evaluate(x_test, y_test)
print(perf)
```

For the next plot, the predicted and true values need to be in a vector. Note that the true values need to be unlisted before putting them into a numeric vector.

```{r}
classes <- iris %>% pull(Species) %>% unique()

y_pred <- model %>% predict_classes(x_test)
y_true <- test %>% select(class_label) %>% unlist() %>% as.numeric()

tibble(y_true  = classes[y_true + 1],
       y_pred  = classes[y_pred + 1],
       Correct = ifelse(y_true == y_pred, "Yes", "No") %>% factor) %>%
  ggplot(aes(x = y_true, y = y_pred, colour = Correct)) +
  geom_jitter() +
  theme_bw() +
  ggtitle(label = "Classification Performance of Artificial Neural Network",
          subtitle = str_c("Accuracy = ", round(perf[2], 3) * 100, "%")) +
  xlab(label = "True iris class") +
  ylab(label = "Predicted iris class")
```
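**Note:** `predict_classes()` has been deprecated and removed in recent versions of the keras R package (TensorFlow 2.6 and later). If the chunk above errors, a minimal equivalent is to predict the class probabilities and take the most probable class yourself:

```{r eval=FALSE}
# For newer keras versions: predict probabilities, then pick the most probable class (0-based)
y_prob <- model %>% predict(x_test)
y_pred <- apply(y_prob, 1, which.max) - 1
```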
```{r}
library(gmodels)
CrossTable(y_pred, y_true,
           prop.chisq = FALSE, prop.t = FALSE, prop.r = FALSE,
           dnn = c('predicted', 'actual'))
```

### Conclusion

I hope this illustrated just how easy it is to get started building an artificial neural network using Keras and TensorFlow in R. With relative ease, we created a 3-class predictor with an accuracy of 100%. This was a basic minimal example. The network can be expanded to create deep learning networks, and the entire TensorFlow API is also available.

Enjoy and Happy Learning!

Leon

**Thanks again Leon, this was awesome!!!**