From the course: AI Workshop: Hands-on with GANs with Deep Convolutional Networks
Setting up the GAN training loop
We are now ready to train our deep convolutional generative adversarial network (DCGAN). When training a GAN, the generator and discriminator networks are trained together, and we'll train them alternately within the same iteration. Let's set up some parameters for training. The loss function we'll use is binary cross-entropy, which effectively mimics the minimax loss used to train a generative adversarial network. We'll set up the loss functions so that the discriminator maximizes the probability of classifying real images as real and fake images as fake, while the generator maximizes the probability of the discriminator classifying fake images as real. I've initialized a batch of 64 fixed noise latent variables. Every so often during generator training, we'll look at some sample images to see how the generator is performing at that point in training. We'll use this fixed noise that I've set up here to…
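The setup described above can be sketched in PyTorch. This is a minimal sketch, not the instructor's exact code: the latent dimension of 100 and the 1×1 spatial noise shape are assumptions (common DCGAN defaults), while the BCE loss, the real/fake targets, and the fixed batch of 64 noise vectors come from the transcript.

```python
import torch
import torch.nn as nn

# Binary cross-entropy stands in for the minimax GAN loss.
criterion = nn.BCELoss()

batch_size = 64
latent_dim = 100  # assumed latent vector size; common DCGAN default

# Target labels: the discriminator is trained toward these directly,
# while the generator is trained with real_targets on fake images,
# pushing the discriminator to call its fakes "real".
real_targets = torch.full((batch_size,), 1.0)
fake_targets = torch.full((batch_size,), 0.0)

# A fixed batch of 64 noise latent vectors, reused throughout training
# so that periodic sample images are comparable across checkpoints.
fixed_noise = torch.randn(batch_size, latent_dim, 1, 1)
```

Keeping `fixed_noise` constant is what makes the periodic sample grids comparable: any change between checkpoints reflects generator updates, not fresh noise.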
Contents
- Generator and discriminator (1m 43s)
- Deep convolutional GANs (DCGANs) (6m 16s)
- Setting up data for GAN training (4m 8s)
- Setting up the generator and discriminator (6m 18s)
- Output from an untrained generator and discriminator (2m 17s)
- Setting up the GAN training loop (7m 9s)
- Viewing GAN training results (4m 34s)