Today we will review the class, and I will suggest some next steps for learning about Neural Networks and TensorFlow/Keras.
We have been using the Hold-out Method, which means we have been splitting our data into training and test sets.
And we have been using k-fold cross-validation, where each fold takes a turn as the held-out set.
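Here is a minimal sketch in plain R of what those two ideas look like in code; the toy data frame, the 80/20 split, and the choice of k = 5 are all assumptions for illustration.

```r
set.seed(1)
mydata <- data.frame(x = rnorm(100), y = rnorm(100))  # toy data, for illustration only

# Hold-out method: split once into a training set and a test set (80/20 assumed)
n <- nrow(mydata)
train_idx <- sample(n, size = round(0.8 * n))
train <- mydata[train_idx, ]
test  <- mydata[-train_idx, ]

# k-fold cross-validation: assign every row to one of k folds (k = 5 assumed);
# each fold takes a turn as the held-out set while the rest are used for fitting
k <- 5
fold <- sample(rep(1:k, length.out = n))
for (i in 1:k) {
  cv_train <- mydata[fold != i, ]
  cv_test  <- mydata[fold == i, ]
  # fit on cv_train, evaluate on cv_test, then average the k results
}
```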
Working with the new data types that we have seen is a bit challenging, but tensors give us a way to store many different types of data.
Pictures are tensors: a color image is just a height × width × channels array of pixel values.
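As a quick reminder of what that means in R, here is a minimal sketch; the array sizes are made-up examples.

```r
# In R, a tensor is just a multi-dimensional array.
img   <- array(runif(28 * 28 * 3), dim = c(28, 28, 3))          # one 28 x 28 RGB image
batch <- array(runif(32 * 28 * 28 * 3), dim = c(32, 28, 28, 3)) # a batch of 32 such images
dim(batch)  # 32 28 28 3 -- (samples, height, width, channels)
```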
Words can be converted to numbers using an Autoencoder.
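Here is a minimal sketch of that idea with keras in R: one-hot word vectors are squeezed through a small bottleneck layer and reconstructed, so the bottleneck gives each word a short numeric code. The vocabulary size, code size, and layer choices are all assumptions for illustration.

```r
library(keras)

vocab_size <- 1000   # assumed vocabulary size
code_size  <- 16     # assumed length of the numeric code per word

# Encoder squeezes a one-hot word vector down to code_size numbers;
# decoder tries to reconstruct the original one-hot vector.
autoencoder <- keras_model_sequential() %>%
  layer_dense(units = code_size, activation = "relu", input_shape = c(vocab_size)) %>%
  layer_dense(units = vocab_size, activation = "softmax")

autoencoder %>% compile(optimizer = "rmsprop", loss = "categorical_crossentropy")

# Fit with the same one-hot matrix as both the input and the target:
# autoencoder %>% fit(x_onehot, x_onehot, epochs = 10, batch_size = 32)
```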
We have covered all of the basic neural network designs.
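For example, the most basic design we worked with, a densely connected (feed-forward) network, is built layer by layer; in this sketch the input size, layer widths, and 10-class output are assumptions.

```r
library(keras)

# A small densely connected network: stacked dense layers ending in a
# softmax output for a 10-class classification problem (all sizes assumed).
model <- keras_model_sequential() %>%
  layer_dense(units = 64, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 64, activation = "relu") %>%
  layer_dense(units = 10, activation = "softmax")

summary(model)
```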
We have discussed some of the newer applications related to neural networks.
The basic idea of fitting Neural Networks uses gradient descent: the parameters are adjusted step by step to reduce a loss function, with the gradients computed by backpropagation.
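To recall the core idea in the simplest possible setting, here is a gradient descent sketch in plain R for a one-parameter model; the toy data, learning rate, and number of steps are assumptions.

```r
set.seed(1)
x <- runif(100)
y <- 3 * x + rnorm(100, sd = 0.1)   # toy data: the true slope is 3

w  <- 0      # initial parameter value
lr <- 0.1    # learning rate (step size)

for (step in 1:200) {
  pred <- w * x
  grad <- mean(2 * (pred - y) * x)  # gradient of the mean squared error w.r.t. w
  w <- w - lr * grad                # step downhill
}
w   # ends up close to 3
```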
We have discussed the idea of Transfer Learning and Pre-Trained Models.
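As a reminder of what that looks like with keras in R, here is a minimal sketch: load a convolutional base pre-trained on ImageNet, freeze its weights, and add a new classifier on top. The input size, layer widths, and 2-class output are assumptions.

```r
library(keras)

# Pre-trained convolutional base (VGG16 trained on ImageNet), without its
# original classifier on top.
base <- application_vgg16(weights = "imagenet", include_top = FALSE,
                          input_shape = c(150, 150, 3))
freeze_weights(base)   # keep the pre-trained weights fixed during training

# New classifier stacked on top of the frozen base (transfer learning).
model <- keras_model_sequential() %>%
  base %>%
  layer_flatten() %>%
  layer_dense(units = 64, activation = "relu") %>%
  layer_dense(units = 2, activation = "softmax")
```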
We have also discussed the idea of saving a fitted neural network and using it later or elsewhere.
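With keras in R that is a one-liner in each direction; here is a minimal sketch, assuming `model` is a fitted keras model (such as the one above) and using a made-up file name.

```r
library(keras)

save_model_hdf5(model, "my_model.h5")        # write the fitted model to disk
restored <- load_model_hdf5("my_model.h5")   # load it later or on another machine
# 'restored' can be used for prediction or further training right away
```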
We have been using Open Source software. You are now aware of some of the difficulties of working with R and TensorFlow/Keras.
Things change quickly and you may run into problems.
But when things are working they work great! Hopefully you agree.
Many open source models are also freely available for you to download and build on.
How to think about deep learning
“The most surprising thing about deep learning is how simple it is. Ten years ago, no one expected that we would achieve such amazing results on machine-perception problems by using simple parametric models trained with gradient descent. Now, it turns out that all you need is sufficiently large parametric models trained with gradient descent on sufficiently many examples. As Feynman once said about the universe, ‘It’s not complicated, it’s just a lot of it.’” (François Chollet)