How can I apply the ReLU activation function with the Levenberg-Marquardt algorithm for training a neural network?
Ashish Kumar Gupta on 10 Nov 2022
x = input';                               % inputs, one sample per column after transpose
t = target';                              % targets, one sample per column
trainFcn = 'trainlm';                     % Levenberg-Marquardt training function
hiddenLayerSize = 10;                     % neurons in the hidden layer
net = feedforwardnet(hiddenLayerSize, trainFcn);
net.divideParam.trainRatio = 70/100;      % 70% of samples for training
net.divideParam.valRatio   = 15/100;      % 15% for validation
net.divideParam.testRatio  = 15/100;      % 15% for testing
net = train(net, x, t);
Answers (1)
Varun Sai Alaparthi on 22 Nov 2022
Hello Ashish,
You can use 'poslin' as the transfer function for the hidden layer, which is the same as applying the ReLU activation function. You can set it with this line:
net.layers{1}.transferFcn = 'poslin';
I hope this information helps; please reach out with any further issues.
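Putting the question's training script and this setting together, a minimal end-to-end sketch might look like the following. Note the use of simplefit_dataset (a sample dataset shipped with Deep Learning Toolbox) as a stand-in for your own input/target data, which is an assumption; substitute your matrices with one sample per column.

```matlab
% Sketch: feedforward network trained with Levenberg-Marquardt ('trainlm')
% and ReLU ('poslin') in the hidden layer.
[x, t] = simplefit_dataset;               % stand-in data; replace with your own
net = feedforwardnet(10, 'trainlm');      % 10 hidden neurons, LM training
net.layers{1}.transferFcn = 'poslin';     % ReLU on the hidden layer
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio   = 15/100;
net.divideParam.testRatio  = 15/100;
net = train(net, x, t);
y = net(x);                               % predictions from the trained network
```

Set the transfer function after creating the network and before calling train, since feedforwardnet defaults the hidden layer to 'tansig'.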