Review Stat. 654

Prof. Eric A. Suess

Review

Today we will do some review of the class and I will suggest some next steps for learning about Neural Networks and Tensorflow/Keras.

Review

We have been using the Hold-out Method, which means we have been splitting our data into

  • Training data
  • Validation data
  • Test data

And we have been using k-fold cross-validation.
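The course uses R with TensorFlow/Keras, but the splitting logic is language-neutral. Here is a minimal plain-Python sketch of a hold-out split and of generating k-fold cross-validation index sets; the function names and the 60/20/20 split fractions are illustrative choices, not from the course materials.

```python
import random

def holdout_split(data, train_frac=0.6, val_frac=0.2, seed=0):
    """Shuffle and split data into train / validation / test sets."""
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    n = len(data)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    return (data[:n_train],
            data[n_train:n_train + n_val],
            data[n_train + n_val:])

def kfold_indices(n, k):
    """Yield (train_idx, val_idx) index pairs for k-fold cross-validation."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    idx = list(range(n))
    start = 0
    for size in fold_sizes:
        val_idx = idx[start:start + size]
        train_idx = idx[:start] + idx[start + size:]
        yield train_idx, val_idx
        start += size

train, val, test = holdout_split(list(range(100)))
print(len(train), len(val), len(test))  # 60 20 20
```

Every observation lands in exactly one of the three hold-out sets, and in k-fold CV every observation is used for validation exactly once.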

Review

Working with the new data types we have seen can be a bit challenging, but tensors give us a way to store many different types of data.

Pictures are tensors.

Words can be converted to numbers using word embeddings, learned with an embedding layer.
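To make "pictures are tensors" concrete, here is a small plain-Python sketch (no libraries) where nested lists stand in for tensors: a grayscale image is rank 2 (height × width), an RGB image is rank 3 (height × width × channels), and a batch of images is rank 4. The `shape` helper is a hypothetical utility for this illustration.

```python
def shape(t):
    """Recover the shape of a rectangular nested-list 'tensor'."""
    s = []
    while isinstance(t, list):
        s.append(len(t))
        t = t[0]
    return tuple(s)

gray = [[0, 255],
        [128, 64]]                        # 2x2 grayscale image, rank 2
rgb = [[[255, 0, 0], [0, 255, 0]],
       [[0, 0, 255], [255, 255, 255]]]   # 2x2 RGB image, rank 3
batch = [rgb, rgb]                       # batch of 2 images, rank 4

print(shape(gray))   # (2, 2)
print(shape(rgb))    # (2, 2, 3)
print(shape(batch))  # (2, 2, 2, 3)
```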

Review

We have covered the basic neural network designs and training paradigms.

  • Sequential models
  • Feed-forward Neural Networks
  • Convolutional Neural Networks
  • Recurrent Neural Networks
  • LSTMs
  • Generative Neural Networks
  • GANs
  • Reinforcement Learning

Review

We have discussed some of the newer applications related to neural networks.

  • Stable Diffusion
  • Attention
  • Large Language Models
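The attention mechanism behind LLMs is just a weighted average: each query scores every key, the scores become weights via softmax, and those weights mix the values. Here is a minimal plain-Python sketch of scaled dot-product attention, softmax(QKᵀ/√d)V; the tiny Q, K, V matrices are made-up illustration data.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        # Output row = attention-weighted average of the value rows.
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0]]                    # one query
K = [[1.0, 0.0], [0.0, 1.0]]        # two keys
V = [[10.0, 0.0], [0.0, 10.0]]      # two values
print(attention(Q, K, V))
```

The query matches the first key more strongly, so the output leans toward the first value row while still blending in the second.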

Review

The basic idea of fitting Neural Networks uses

  • Gradient Descent
  • Backpropagation
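Gradient descent repeatedly steps a parameter against its gradient; backpropagation is what computes those gradients for a network via the chain rule. A minimal plain-Python sketch on a one-dimensional example (the function and learning rate are illustrative, not from the course):

```python
def grad_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3).
# The minimum is at x = 3.
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # 3.0
```

Fitting a neural network is the same loop in many dimensions: the weights play the role of `x`, the loss plays the role of `f`, and backpropagation supplies `grad`.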

Review

We have discussed the idea of Transfer Learning and Pre-Trained Models.

We have also discussed the idea of saving a fitted neural network and using it later or elsewhere.
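In Keras one saves and reloads a fitted model with the built-in save/load functions; the principle is simply serializing the fitted parameters to disk. Here is a plain-Python sketch of that idea, where a hypothetical "model" is just a dict of layer weights written to JSON (the layer names and numbers are made up for illustration):

```python
import json
import os
import tempfile

# Hypothetical fitted "model": a dict of layer weights.
weights = {"dense_1": [[0.5, -0.2], [0.1, 0.9]],
           "dense_2": [[1.0], [-1.0]]}

path = os.path.join(tempfile.gettempdir(), "model_weights.json")

# Save the fitted parameters ...
with open(path, "w") as f:
    json.dump(weights, f)

# ... and load them later, or on another machine.
with open(path) as f:
    restored = json.load(f)

print(restored == weights)  # True
```

Transfer learning starts from exactly this kind of saved state: load a pre-trained model's weights, freeze most of them, and fit only the final layers on your own data.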

Open Source software

We have been using Open Source software. You are now aware of some of the difficulties of installing, working with, and using R and TensorFlow/Keras.

Things change quickly and you may run into problems.

But when things are working they work great! Hopefully you agree.

Our Book

Google

Alternatives

Hugging Face

Open source models.

LLMs

Read Chapter 14 Conclusions

How to think about deep learning

“The most surprising thing about deep learning is how simple it is. Ten years ago, no one expected that we would achieve such amazing results on machine-perception problems by using simple parametric models trained with gradient descent. Now, it turns out that all you need is sufficiently large parametric models trained with gradient descent on sufficiently many examples. As Feynman once said about the universe, ‘It’s not complicated, it’s just a lot of it.’”