
How can I apply a second-order derivative boundary condition in the loss function of a PINN-based deep learning network?

20 views (last 30 days)
I want to apply a second-order derivative boundary condition in the loss function of a PINN-based deep learning network.
I am trying to run the MATLAB example code below.
openExample('nnet/TrainPhysicsInformedNeuralNetworkWithLBFGSAndAutoDiffExample')
In the example code, the loss function consists of two terms: one enforcing Burgers' equation and one enforcing the initial and boundary conditions.
Burgers' equation: ∂u/∂t + u·∂u/∂x − (0.01/π)·∂²u/∂x² = 0
Initial and boundary conditions: u(x, t=0) = −sin(πx) as the initial condition, and u(x=−1, t) = 0 and u(x=1, t) = 0 as the boundary conditions.
% Boundary condition points: x = -1 and x = 1, each sampled at 25 times in [0,1].
numBoundaryConditionPoints = [25 25];
x0BC1 = -1*ones(1,numBoundaryConditionPoints(1));
x0BC2 = ones(1,numBoundaryConditionPoints(2));
t0BC1 = linspace(0,1,numBoundaryConditionPoints(1));
t0BC2 = linspace(0,1,numBoundaryConditionPoints(2));
u0BC1 = zeros(1,numBoundaryConditionPoints(1));
u0BC2 = zeros(1,numBoundaryConditionPoints(2));
% Initial condition points: t = 0 with u(x,0) = -sin(pi*x).
numInitialConditionPoints = 50;
x0IC = linspace(-1,1,numInitialConditionPoints);
t0IC = zeros(1,numInitialConditionPoints);
u0IC = -sin(pi*x0IC);
% Collect the initial and boundary condition data into single arrays.
X0 = [x0IC x0BC1 x0BC2];
T0 = [t0IC t0BC1 t0BC2];
U0 = [u0IC u0BC1 u0BC2];
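In the example, these sampled points are then converted to formatted dlarray objects so that dlgradient can trace derivatives through them. A minimal sketch of that step, using the variable names above and the "CB" (channel, batch) format the example uses (the exact training-loop wiring may differ):
% Convert the initial/boundary data to formatted dlarray objects.
% "CB" labels the first dimension as channel and the second as batch.
X0 = dlarray(X0, "CB");
T0 = dlarray(T0, "CB");
U0 = dlarray(U0, "CB");
% The loss function below is then evaluated through dlfeval so that
% dlgradient can compute derivatives, e.g.:
% [loss, gradients] = dlfeval(@modelLoss, net, X, T, X0, T0, U0);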
function [loss, gradients] = modelLoss(net, X, T, X0, T0, U0)
% Make predictions with the initial conditions.
XT = cat(1, X, T);
U = forward(net, XT);
% Calculate derivatives with respect to X and T.
gradientsU = dlgradient(sum(U, "all"), {X, T}, EnableHigherDerivatives = true);
Ux = gradientsU{1};
Ut = gradientsU{2};
% Calculate second-order derivatives with respect to X.
Uxx = dlgradient(sum(Ux, "all"), X, EnableHigherDerivatives = true);
% Calculate mseF. Enforce Burger's equation.
f = Ut + U.*Ux - (0.01./pi).*Uxx;
zeroTarget = zeros(size(f), "like", f);
mseF = l2loss(f, zeroTarget);
% Calculate mseU. Enforce initial and boundary conditions.
XT0 = cat(1, X0, T0);
U0Pred = forward(net, XT0);
mseU = l2loss(U0Pred, U0);
% Calculate the loss to be minimized by combining the errors.
loss = mseF + mseU;
% Calculate gradients with respect to the learnable parameters.
gradients = dlgradient(loss, net.Learnables);
end
I want to change the boundary conditions to second-order derivative form: u''(x=−1,t)=0 and u''(x=1,t)=0.
How can I apply these second-order derivative boundary conditions in mseU?
Could you give me a solution?

Accepted Answer

Ruchika on 16 Aug 2023
Edited: Ruchika on 16 Aug 2023
Hi, to apply the second-order derivative boundary conditions in the loss function, you need to modify the modelLoss function in the MATLAB example code. Here's how you can modify it to enforce the second-order derivative boundary conditions u''(x=-1,t) = 0 and u''(x=1,t) = 0:
function [loss, gradients] = modelLoss(net, X, T, X0, T0, U0)
% Make predictions with the initial conditions.
XT = cat(1, X, T);
U = forward(net, XT);
% Calculate derivatives with respect to X and T.
gradientsU = dlgradient(sum(U, "all"), {X, T}, EnableHigherDerivatives = true);
Ux = gradientsU{1};
Ut = gradientsU{2};
% Calculate second-order derivatives with respect to X.
Uxx = dlgradient(sum(Ux, "all"), X, EnableHigherDerivatives = true);
% Calculate mseF. Enforce Burger's equation.
f = Ut + U.*Ux - (0.01./pi).*Uxx;
zeroTarget = zeros(size(f), "like", f);
mseF = l2loss(f, zeroTarget);
% Calculate mseU. Enforce initial and boundary conditions.
XT0 = cat(1, X0, T0);
U0Pred = forward(net, XT0);
Uxx0 = dlgradient(sum(U0Pred, "all"), X0, EnableHigherDerivatives = true); % Calculate second-order derivatives at boundary points
% Modify mseU to enforce second-order derivative boundary conditions
mseU = l2loss(U0Pred, U0) ...
    + l2loss(Uxx0(1:numBoundaryConditionPoints(1)), zeros(1,numBoundaryConditionPoints(1))) ...
    + l2loss(Uxx0(numBoundaryConditionPoints(1)+1:end), zeros(1,numBoundaryConditionPoints(2)));
% Calculate loss to be minimized by combining errors.
loss = mseF + mseU;
% Calculate gradients with respect to the learnable parameters.
gradients = dlgradient(loss, net.Learnables);
end
In the modified code, we calculate the second-order derivatives at the boundary points using 'dlgradient' and store them in Uxx0. Then, we modify 'mseU' to include the loss term for the second-order derivative boundary conditions by using l2loss to compare Uxx0 with zeros. You need to update the 'numBoundaryConditionPoints' variable to match the number of boundary condition points you want to use for the second-order derivative boundary conditions.
1 comment
Mansung Kang on 16 Aug 2023
Thank you for the kind reply @Ruchika.
When I run the modified code you gave me, it does not work:
('numBoundaryConditionPoints' is an unrecognized function or variable.)
I think the example code (https://kr.mathworks.com/help/deeplearning/ug/solve-partial-differential-equations-with-lbfgs-method-and-deep-learning.html?searchHighlight=PINN%20loss%20function%20second%20order%20&s_tid=srchtitle_support_results_1_PINN%20loss%20function%20second%20order%20) does not use a geometry function for solving the partial differential equation, so 'numBoundaryConditionPoints' is not available inside the modelLoss function in that example.
Is it possible to enforce the second-order derivative boundary condition without geometry information?
One more thing: Uxx0 in your code is not a second-order derivative; it is only a first-order derivative, since dlgradient is applied to U0Pred just once.
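For reference, here is a minimal sketch of what I think a genuinely second-order version could look like. The argument names X0IC, T0IC, U0IC (initial-condition points) and XBC, TBC (boundary points) are just illustrative: the boundary points are passed to modelLoss separately, so numBoundaryConditionPoints is not needed inside the function, and dlgradient is applied twice to obtain the second derivative at the boundary.
function [loss, gradients] = modelLoss(net, X, T, X0IC, T0IC, U0IC, XBC, TBC)
% Interior collocation points: enforce Burgers' equation.
XT = cat(1, X, T);
U = forward(net, XT);
gradientsU = dlgradient(sum(U, "all"), {X, T}, EnableHigherDerivatives = true);
Ux = gradientsU{1};
Ut = gradientsU{2};
Uxx = dlgradient(sum(Ux, "all"), X, EnableHigherDerivatives = true);
f = Ut + U.*Ux - (0.01./pi).*Uxx;
mseF = l2loss(f, zeros(size(f), "like", f));
% Initial condition: u(x, t=0) = -sin(pi*x).
UICPred = forward(net, cat(1, X0IC, T0IC));
mseIC = l2loss(UICPred, U0IC);
% Boundary conditions: u''(x=-1, t) = 0 and u''(x=1, t) = 0.
% Differentiate the boundary predictions twice with respect to XBC.
UBC = forward(net, cat(1, XBC, TBC));
UBCx = dlgradient(sum(UBC, "all"), XBC, EnableHigherDerivatives = true);
UBCxx = dlgradient(sum(UBCx, "all"), XBC, EnableHigherDerivatives = true);
mseBC = l2loss(UBCxx, zeros(size(UBCxx), "like", UBCxx));
% Combine the loss terms and compute gradients for the learnable parameters.
loss = mseF + mseIC + mseBC;
gradients = dlgradient(loss, net.Learnables);
end
Splitting the boundary points from the initial-condition points avoids indexing into X0 and keeps each loss term's inputs explicit. Does this look like the right direction?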
Thank you


More Answers (0)
