Could anyone help me include sine, cosine and tanh activation functions for training a neural network?

In my code I have written the following layers:
layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(numHiddenUnits1)
    reluLayer
    fullyConnectedLayer(numHiddenUnits2)
    reluLayer
    fullyConnectedLayer(numClasses)
    reluLayer
    regressionLayer];
Now I want to train the network using sine, cosine and tanh activations instead of ReLU. Could anyone please help me with this?

Answers (1)

Akshat on 27 Aug 2024
Hi Jaah,
I see you want to use different activation functions instead of ReLU.
For "tanh", you can use the built-in "tanhLayer" from Deep Learning Toolbox in place of each "reluLayer".
Using "sine" and "cosine" as activation functions is generally not a good choice: they are periodic and have many local extrema, so different inputs are mapped to the same output and the gradient repeatedly changes sign during training. For this reason they are rarely used as activation functions.
Hope this helps!
Akshat
