Lecture 8: Generative Adversarial Networks
Artificial Intelligence, November 27, 2019

Generative Adversarial Networks
- Generative: learn a generative model
- Adversarial: trained in an adversarial setting
- Networks: use deep neural networks

Generative Models

Why Generative Models?
- Discriminative models
  - Given an image X, predict a label Y
  - Estimate P(Y|X)
- Discriminative model limitations:
  - Cannot model P(X)
  - Cannot generate new images
- Generative models
  - Can model P(X)
  - Can generate new images (see the toy sketch below)
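A toy sketch of the distinction (not from the slides; the Gaussian fit and threshold score are illustrative stand-ins): a discriminative model only scores P(Y|X) for a given input, while a generative model fits P(X) itself and can therefore draw brand-new samples.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=1000)    # toy "real" data

# Discriminative view: model P(Y | X) only, e.g. P(x belongs to the "large" class).
def p_y_given_x(x, threshold=2.0, scale=0.5):
    return 1.0 / (1.0 + np.exp(-(x - threshold) / scale))

# Generative view: model P(X) itself (here, a Gaussian fit), so we can sample new data.
mu, sigma = x.mean(), x.std()
new_samples = rng.normal(mu, sigma, size=5)      # "generate new images" in miniature

print(p_y_given_x(np.array([1.0, 3.0])))         # label probabilities for given inputs
print(new_samples)                               # brand-new samples drawn from the fitted P(X)
```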
Magic of GANs
- Which one is computer generated?

GANs Architecture
Adversarial Training
- Adversarial samples:
  - We can generate adversarial samples to fool a discriminative model.
  - We can use those adversarial samples to make the model robust.
  - It then requires more effort to generate adversarial samples.
  - Repeat this, and we get a better discriminative model.
- GANs extend that idea to generative models:
  - Generator: generates fake samples and tries to fool the Discriminator.
  - Discriminator: tries to distinguish between real and fake samples.
  - Train them against each other.
  - Repeat this, and we get a better Generator and Discriminator (see the training-loop sketch below).
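A minimal sketch of that alternating training, assuming a PyTorch setup with tiny MLPs on toy 1-D data; the architectures, losses, and hyperparameters are illustrative choices rather than the lecture's.

```python
import torch
import torch.nn as nn

latent_dim = 8
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))     # Generator
D = nn.Sequential(nn.Linear(1, 32), nn.LeakyReLU(0.2), nn.Linear(32, 1))      # Discriminator (logits)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=64):
    return torch.randn(n, 1) * 0.5 + 2.0    # "real" data: N(2, 0.5)

for step in range(2000):
    # --- Train Discriminator: real -> 1, fake -> 0 ---
    x_real = real_batch()
    x_fake = G(torch.randn(64, latent_dim)).detach()    # detach: no gradient into G here
    d_loss = bce(D(x_real), torch.ones(64, 1)) + bce(D(x_fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # --- Train Generator: try to make D label fakes as real ---
    x_fake = G(torch.randn(64, latent_dim))
    g_loss = bce(D(x_fake), torch.ones(64, 1))           # non-saturating generator loss
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(5, latent_dim)).detach().squeeze())  # samples should drift toward N(2, 0.5)
```

The detach() keeps the Discriminator update from pushing gradients into the Generator; the Generator step then uses the non-saturating loss (maximize log D(G(z))), a standard practical variant of the objective below.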
Training Discriminator

Training Generator

Mathematical formulation
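For reference, the standard GAN value function from Goodfellow et al. (2014), which is presumably what these formulation slides develop: the Discriminator D maximizes V(D, G) while the Generator G minimizes it.

```latex
\min_{G}\max_{D} V(D, G)
  = \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_{z}(z)}\big[\log\big(1 - D(G(z))\big)\big]
```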
Advantages of GANs

Problems with GANs

Formulation
- Deep learning models (in general) involve a single player.
  - The player tries to maximize its reward (minimize its loss).
  - We use SGD (with backpropagation) to find the optimal parameters.
  - SGD has convergence guarantees (under certain conditions).
  - Problem: with non-convexity, we might converge to a local optimum.
- GANs instead involve two (or more) players.
  - The Discriminator is trying to maximize its reward.
  - The Generator is trying to minimize the Discriminator's reward.
  - SGD was not designed to find the Nash equilibrium of a game.
  - Problem: we might not converge to the Nash equilibrium at all (see the numeric sketch below).
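A tiny numeric illustration of that last point (my own, not from the slides): in the bilinear game min_x max_y xy, the Nash equilibrium is (0, 0), yet simultaneous gradient steps orbit it and drift away.

```python
import math

# Bilinear game: player 1 minimizes x*y over x, player 2 maximizes x*y over y.
# Nash equilibrium is (0, 0). Simultaneous gradient updates orbit and drift away from it.
x, y, lr = 1.0, 1.0, 0.1
for step in range(200):
    grad_x = y            # d(xy)/dx
    grad_y = x            # d(xy)/dy
    x, y = x - lr * grad_x, y + lr * grad_y   # descent for x, ascent for y, applied simultaneously
    if step % 50 == 49:
        print(f"step {step+1:3d}  x={x:+.3f}  y={y:+.3f}  dist from equilibrium={math.hypot(x, y):.3f}")
```

Each simultaneous step scales the distance from the equilibrium by sqrt(1 + lr^2) > 1, so the iterates spiral outward instead of settling at (0, 0).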
Non-Convergence

Problems with GANs

Mode Collapse

Some Real Examples

Some Solutions
- Mini-Batch GANs
- Supervision with labels
- Some recent attempts: Unrolled GANs, W-GANs

Basic (Heuristic) Solutions
- Mini-Batch GANs
- Supervision with labels

How to reward sample diversity?
- At mode collapse, the Generator produces good samples, but very few distinct ones.
  - Thus, the Discriminator cannot tag them as fake.
- To address this problem, let the Discriminator know about this edge case.
- More formally, let the Discriminator look at the entire batch instead of single examples.
  - If there is a lack of diversity, it will mark the examples as fake.
- Thus, the Generator will be forced to produce diverse samples.

Mini-Batch GANs
- Extract features that capture diversity in the mini-batch.
  - For example, the L2 norm of the difference between all pairs from the batch.
- Feed those features to the Discriminator along with the image.
- Feature values will differ between diverse and non-diverse batches.
- Thus, the Discriminator will rely on those features for classification.
- This, in turn:
  - will force the Generator to match those feature values with the real data;
  - will generate diverse batches (see the sketch below).
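A rough sketch of that idea (my simplification; the minibatch-discrimination layer in the literature uses learned projections rather than raw L2 distances): summarize each sample's average L2 distance to the rest of its batch and feed that statistic to the Discriminator alongside the sample.

```python
import torch
import torch.nn as nn

def diversity_features(batch: torch.Tensor) -> torch.Tensor:
    """Per-example summary of how far each sample is from the rest of its mini-batch.

    batch: (N, D) flattened samples. Returns (N, 1) mean pairwise L2 distance,
    which collapses toward 0 when the batch lacks diversity (mode collapse).
    """
    dists = torch.cdist(batch, batch, p=2)              # (N, N) pairwise L2 distances
    n = batch.shape[0]
    return dists.sum(dim=1, keepdim=True) / (n - 1)     # exclude the zero self-distance

class MiniBatchDiscriminator(nn.Module):
    # Illustrative discriminator: classifies each sample using both the sample itself
    # and the batch-level diversity feature appended to it.
    def __init__(self, data_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(data_dim + 1, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))

    def forward(self, batch: torch.Tensor) -> torch.Tensor:
        feats = diversity_features(batch)
        return self.net(torch.cat([batch, feats], dim=1))   # real/fake logits

# A collapsed (near-identical) fake batch yields tiny diversity features,
# which the discriminator can learn to associate with "fake".
fake = torch.randn(1, 16).repeat(64, 1) + 0.01 * torch.randn(64, 16)
real = torch.randn(64, 16)
print(diversity_features(fake).mean().item(), diversity_features(real).mean().item())
D = MiniBatchDiscriminator(16)
print(D(real).shape)    # torch.Size([64, 1])
```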
Basic (Heuristic) Solutions
- Mini-Batch GANs
- Supervision with labels

Supervision with Labels

Alternate view of GANs

Alternate view of GANs (Contd.)

Energy-Based GANs
Examples

How to reward Disentanglement?

Recap: Mutual Information
- Mutual information captures the mutual dependence between two variables.
- The mutual information between two variables X and Y is defined as:
  I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} = H(X) - H(X \mid Y)

InfoGAN
- We want to maximize the mutual information between the latent code c and the generated sample x = G(z, c).
- Incorporate I(c; G(z, c)) into the value function of the minimax game (see below).
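In its standard form (Chen et al., 2016), the resulting objective augments the GAN value function with a weighted mutual-information term; in practice I(c; G(z, c)) is intractable and is replaced by a variational lower bound.

```latex
\min_{G}\max_{D} V_{\text{InfoGAN}}(D, G) \;=\; V(D, G) \;-\; \lambda\, I\big(c;\; G(z, c)\big)
```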
Conditional GANs
- A simple modification to the original GAN framework that conditions the model on additional information, for better multi-modal learning.
- Lends itself to many practical applications of GANs when we have explicit supervision available (see the sketch below).
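A small sketch of that conditioning, assuming label supervision and a PyTorch setup; the embedding size and MLP layers are illustrative. Both networks take the class label y as an extra input, so G models P(x | y) and D judges (x, y) pairs rather than x alone.

```python
import torch
import torch.nn as nn

num_classes, latent_dim, data_dim = 10, 16, 32

class ConditionalGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(num_classes, 8)                 # label -> dense code
        self.net = nn.Sequential(nn.Linear(latent_dim + 8, 64), nn.ReLU(), nn.Linear(64, data_dim))

    def forward(self, z, y):
        return self.net(torch.cat([z, self.embed(y)], dim=1))     # sample conditioned on y

class ConditionalDiscriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(num_classes, 8)
        self.net = nn.Sequential(nn.Linear(data_dim + 8, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))

    def forward(self, x, y):
        return self.net(torch.cat([x, self.embed(y)], dim=1))     # real/fake logit for (x, y)

G, D = ConditionalGenerator(), ConditionalDiscriminator()
y = torch.randint(0, num_classes, (4,))
x_fake = G(torch.randn(4, latent_dim), y)
print(D(x_fake, y).shape)    # torch.Size([4, 1]); training then follows the usual GAN loop
```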
Conditional GANs

Coupled GAN
- Learning a joint distribution of multi-domain images.
- Using GANs to learn the joint distribution with samples drawn only from the marginal distributions.
- Direct applications in domain adaptation and image translation (see the sketch below).
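A very loose sketch of the weight-sharing idea behind Coupled GAN (Liu and Tuzel, 2016): the two domain generators share their early layers, which ties the high-level content of the two outputs together even though each GAN only ever sees unpaired samples from its own domain. The layer sizes and MLP form are assumptions for illustration.

```python
import torch
import torch.nn as nn

latent_dim = 16

# Shared trunk: maps the latent code to a domain-agnostic representation.
shared = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU())

# Domain-specific heads: render that representation in each image domain.
head_a = nn.Linear(64, 32)    # e.g. photos
head_b = nn.Linear(64, 32)    # e.g. sketches of the "same" content

z = torch.randn(4, latent_dim)
h = shared(z)                      # identical high-level code for both domains
x_a, x_b = head_a(h), head_b(h)    # a corresponding pair, produced from one z
print(x_a.shape, x_b.shape)
```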
Coupled GAN

Applications
Deep Convolutional GANs (DCGAN)

DCGAN (bedrooms)

Image-to-Image Translation

Text-to-Image Synthesis

Face Aging with Conditional GANs

Image Inpainting with GANs
- Haofeng Li, Guanbin Li, Liang Lin, Hongchuan Yu, and Yizhou Yu, "Context-Aware Semantic Inpainting," IEEE Transactions on Cybernetics, DOI: 10.1109/TCYB.2018.2865036, 2018.

GANs Future

Explosion of GANs