Ease training by adding a Loss function and Optimizer to your network
The goal of training a neural network is generally to minimize error. This is why the objective function is commonly referred to as a cost function or a loss function, and the value it computes is simply called the "loss."
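As a concrete illustration, a loss function takes the model's prediction and the ground-truth target and returns a single scalar. The choice of mean squared error below is just one common example (plain NumPy, not any particular framework's API):

```python
import numpy as np

def mse_loss(prediction, target):
    # Mean squared error: the average squared difference between
    # the model's prediction and the ground-truth target.
    return np.mean((prediction - target) ** 2)

prediction = np.array([2.5, 0.0, 2.0])
target = np.array([3.0, -0.5, 2.0])

loss = mse_loss(prediction, target)  # a single scalar "loss" value
```

A smaller loss means the predictions are closer to the targets; training aims to drive this value down.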
The optimizer is an algorithm that adjusts the neural network's parameters so as to reduce the overall loss, thereby improving the accuracy of the model's predictions.
This section describes how to set up the loss function and optimizer at the output (prediction) of our model.
After adding layers, the final steps in setting up a network are:
Adding a loss function and connecting it to the prediction produced by the model's last layer(s). The loss function is also connected to the target value defined by the dataset's ground truth.
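Putting these pieces together, the sketch below wires a loss function to a model's prediction and to the ground-truth target, then lets a plain gradient-descent optimizer update the parameter to reduce the loss. The single-weight linear "network," MSE loss, and hand-derived gradient are all illustrative assumptions, not a specific framework's API:

```python
import numpy as np

# Toy dataset: the ground-truth targets follow y = 2x.
x = np.array([1.0, 2.0, 3.0, 4.0])
y_true = 2.0 * x

w = 0.0    # the network's single trainable parameter
lr = 0.01  # optimizer step size (learning rate)

for _ in range(200):
    y_pred = w * x                             # prediction from the last layer
    loss = np.mean((y_pred - y_true) ** 2)     # loss connects prediction to target
    grad = np.mean(2 * (y_pred - y_true) * x)  # dLoss/dw for MSE
    w -= lr * grad                             # optimizer update reduces the loss
```

After enough updates, w approaches 2.0 and the loss approaches zero, which is exactly the "reduce the overall loss" behavior the optimizer is responsible for.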