From the course: Computer Vision for Data Scientists

Optimization techniques

- [Instructor] Gradient descent is a widely used optimization algorithm for training neural networks, including convolutional neural networks. The algorithm works by iteratively updating the model's parameters using the gradients calculated during backpropagation. The goal is to minimize the loss function, which improves the model's performance. Imagine hiking in the mountains and trying to find the lowest point. You look at the landscape to figure out which direction to go. This is similar to the concept of gradient descent in machine learning. The terrain represents a mathematical function, where each point on the terrain corresponds to a specific input. Your goal is to find the lowest point, or the bottom of the valley, which represents the function's minimum value. To do this, you observe the slope, or steepness, of the terrain at your current location. The gradient represents the slope of the terrain. It…
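The update rule described above can be sketched in a few lines of Python. This is a minimal illustration on a one-dimensional function, not the course's implementation: the function f(x) = (x - 3)**2, its gradient, the learning rate, and the step count are all illustrative choices.

```python
def gradient_descent(grad_fn, x0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient to descend toward a minimum."""
    x = x0
    for _ in range(steps):
        # Core update: parameter <- parameter - learning_rate * gradient
        x -= learning_rate * grad_fn(x)
    return x

# Example: f(x) = (x - 3)**2 has gradient 2 * (x - 3) and its minimum at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 3))  # converges near 3.0
```

In a neural network the same update is applied to every weight at once, with backpropagation supplying the gradient of the loss with respect to each parameter.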
