Deep Learning in custom training loop with output layer replacement during training

6 views (last 30 days)
Hi there!
I'm working on an SGAN (semi-supervised GAN) that can classify an unlabeled data set, and I found an excellent summary article here.
One of the SGAN examples assumes that a lambda function is used to replace (add) an additional output layer on each iteration of the training loop, something like this (in the article, see the "Stacked Discriminator Models With Shared Weights" description):
  • Train the supervised network (multiple classes, cross-entropy output)
  • Add a sigmoid output to the supervised network
  • Train the unsupervised network (with the new additional output layer)
In the article and the original paper they use Keras and a lambda function for the output layer. How can I make that change to the network in MATLAB during a custom training loop? Replacing the layer during training (with the replaceLayer function) reduces performance dramatically, so I'm looking for some lambda analog for this type of training.
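For reference, the shared-weights trick the article describes can be shown with plain Python math (no Keras needed): the unsupervised real/fake output is *derived* from the same class logits the supervised head uses, so nothing is ever replaced in the network. The formula D(x) = Z(x) / (Z(x) + 1), with Z(x) the sum of exponentiated logits, is my reading of the article's approach; function and variable names here are illustrative only.

```python
import math

def softmax(logits):
    # Supervised head: class probabilities from the shared logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def d_real(logits):
    # Unsupervised head, computed from the SAME logits -- this is what
    # the Keras Lambda layer does, so no layer replacement is needed:
    # D(x) = Z(x) / (Z(x) + 1), where Z(x) = sum(exp(logits)).
    z = sum(math.exp(v) for v in logits)
    return z / (z + 1.0)

logits = [2.0, -1.0, 0.5]   # example logits from a shared trunk
print(softmax(logits))      # supervised output (sums to 1)
print(d_real(logits))       # unsupervised output (in (0, 1))
```

Since both outputs are pure functions of one set of logits, the two "models" share every trainable weight automatically; only the loss applied per batch changes.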
Thank you!
p.s. I attached a screenshot of SGAN training from the book "GANs in Action".
  1 comment
Kirill on 29 Mar 2022
Edited: Kirill on 29 Mar 2022
There is an example of a network with two outputs; I think it can be used, but it would be a different approach than the one explained in the book and article: documentation
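The two-output idea in that comment can be sketched in plain Python: one shared trunk feeds two separate heads, a softmax classifier and a sigmoid real/fake score, each with its own weights (unlike the article's lambda trick, where the real/fake score has no weights of its own). All names and weight values below are made up for illustration.

```python
import math
import random

def softmax(v):
    m = max(v)
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def linear(x, W, b):
    # One dense layer: W is a list of weight rows, b the biases.
    return [sum(w * xi for w, xi in zip(row, x)) + bi
            for row, bi in zip(W, b)]

random.seed(0)
n_feat, n_cls = 4, 3

# Stand-in features from the shared discriminator trunk.
features = [0.3, -1.2, 0.7, 0.1]

# Supervised head: its own weights, softmax over the classes.
W_cls = [[random.uniform(-1, 1) for _ in range(n_feat)] for _ in range(n_cls)]
b_cls = [0.0] * n_cls
class_probs = softmax(linear(features, W_cls, b_cls))

# Unsupervised head: separate weights, single sigmoid real/fake score.
W_rf = [[random.uniform(-1, 1) for _ in range(n_feat)]]
b_rf = [0.0]
p_real = sigmoid(linear(features, W_rf, b_rf)[0])

print(class_probs, p_real)
```

Because both heads exist in the network from the start, a custom training loop only has to pick which head's loss to backpropagate for a given batch; no layers are swapped mid-training.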


Answers (0)
