Building a Neural Network & Making Predictions (Summary)


Congratulations! You built a neural network from scratch using NumPy. With this knowledge, you’re ready to dive deeper into the world of artificial intelligence in Python.

In this course, you learned:

• What deep learning is and what differentiates it from machine learning
• How to represent vectors with NumPy
• What activation functions are and why they’re used inside a neural network
• What the backpropagation algorithm is and how it works
• How to train a neural network and make predictions
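As a quick refresher on a couple of those bullet points, here is a minimal sketch of representing vectors with NumPy and passing a layer's output through a sigmoid activation. The variable names and values are illustrative, not the exact course code:

```python
import numpy as np

# Represent the input and the weights as NumPy vectors
input_vector = np.array([1.66, 1.56])
weights = np.array([1.45, -0.66])
bias = np.array([0.0])

def sigmoid(x):
    # Squashes any real number into the (0, 1) range
    return 1 / (1 + np.exp(-x))

# One layer: dot product plus bias, then the activation function
layer_output = np.dot(input_vector, weights) + bias
prediction = sigmoid(layer_output)
print(prediction)
```

The dot product combines the input vector and the weights into a single number, and the sigmoid turns that number into a value between 0 and 1 that can be read as a probability.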

The process of training a neural network mainly consists of applying operations to vectors. Today, you did it from scratch using only NumPy as a dependency. This isn’t recommended in a production setting because the whole process can be time-consuming and error-prone. That’s one of the reasons why deep learning frameworks like Keras, PyTorch, and TensorFlow are so popular.
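To make that process concrete, here is a hedged sketch of training a single-neuron network with plain NumPy and gradient descent, then using it to make predictions. The toy data, learning rate, and iteration count are assumptions for illustration, not the course's exact code:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Toy data: two input vectors with binary targets
inputs = np.array([[1.66, 1.56], [2.0, 1.5]])
targets = np.array([1.0, 0.0])

rng = np.random.default_rng(42)
weights = rng.standard_normal(2)
bias = 0.0
learning_rate = 0.1

for _ in range(5000):
    for x, target in zip(inputs, targets):
        # Forward pass: dot product, bias, activation
        prediction = sigmoid(np.dot(x, weights) + bias)

        # Backpropagation: chain rule through the squared error
        # and the sigmoid derivative
        d_error = 2 * (prediction - target)
        d_sigmoid = prediction * (1 - prediction)
        gradient = d_error * d_sigmoid

        # Gradient descent update
        weights -= learning_rate * gradient * x
        bias -= learning_rate * gradient

# Make predictions with the trained parameters
for x in inputs:
    print(sigmoid(np.dot(x, weights) + bias))
```

After training, the first input should score close to 1 and the second close to 0, matching the targets. Every step here is a vector operation, which is exactly why frameworks that optimize and parallelize these operations are preferred in production.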

For additional information on topics covered in this course, check out these resources:

• Sample Code (.zip), 23.0 KB
• Course Slides (.pdf), 642.7 KB

Santosh

Fantastic. Brilliantly explained.

Rafael

Good explanation, but why is there no final example, where one vector is given to the trained neural net with an explanation like:

-This Vector was given because…

-We await the prediction of…

-The trained model predicted x, because…

In short: run the model on a random input and interpret the results.

marcinszydlowski1984

Well explained.

bennikambs

Hey,

nice overview course that demystifies some of the deep learning scarecrows.

Two things:

1. Every tutorial one might take after this one will be much more advanced. Something intermediate that builds on this course, but isn’t yet at the level of TensorFlow, Keras, or PyTorch deep learning, would help flatten the steep learning curve.

2. Something minor: yes, the dot product measures similarity, and I know similarity is important for deep learning. However, I think the example in this course slightly misses the point. To my understanding, it is the similarity between two different input vectors that matters, not the similarity of one input vector to the weights.

Thanks!
