Custom loss function with LSTM for Physics Informed Machine Learning

I want to train an LSTM model to predict the time-history response of a dynamical system. I have defined a physics-based loss function in MATLAB code (a .m file). I would like to include this function in model training as the loss function, alongside the standard RMSE. Is this possible using MATLAB's Deep Learning Toolbox?

Accepted Answer

Ranjeet on 8 May 2023
To use a custom loss function, you can create a custom layer by following the documentation below; it includes templates for defining both intermediate and output layers.
For an LSTM-specific example, refer to the following answer and adapt it to your use case:
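As a minimal sketch of the custom-output-layer approach (the class name, property name, and the helper function myPhysicsResidual are illustrative, not from the documentation), a regression output layer combining RMSE with a physics term could look like this:

```matlab
classdef physicsLossLayer < nnet.layer.RegressionLayer
    % Custom regression output layer: RMSE plus a weighted physics residual.
    properties
        PhysicsWeight  % relative weight of the physics term (illustrative)
    end
    methods
        function layer = physicsLossLayer(name, physicsWeight)
            layer.Name = name;
            layer.Description = 'RMSE + physics residual loss';
            layer.PhysicsWeight = physicsWeight;
        end
        function loss = forwardLoss(layer, Y, T)
            % Standard RMSE term over all predictions
            rmseLoss = sqrt(mean((Y(:) - T(:)).^2));
            % Physics residual computed from the predictions; replace
            % myPhysicsResidual (hypothetical) with your own .m function.
            physicsLoss = myPhysicsResidual(Y);
            loss = rmseLoss + layer.PhysicsWeight * physicsLoss;
        end
    end
end
```

Note that forwardLoss only receives the predictions Y and the targets T; if the physics residual needs other quantities, pass them in through layer properties at construction time.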
1 Comment
Shubham Baisthakur on 15 Jun 2023
For the neural-network architecture I am using for my problem, I would like to define a regression output layer with a custom loss function. For this, the regression layer would need two inputs, but I am not able to achieve that. How do I get around this?
Here is the definition of the custom layer:
classdef customLossLayerMultiInput < nnet.layer.RegressionLayer & nnet.layer.Acceleratable
    % Custom regression layer with an RMSE-plus-physics-residual loss.
    properties
        node_properties
        numFeature
    end
    methods
        function layer = customLossLayerMultiInput(name, node_properties, numFeature)
            % Constructor
            layer.Name = name;
            layer.Description = 'Physics-Informed loss function for LSTM training';
            layer.node_properties = node_properties;
            layer.numFeature = numFeature;
        end
        function loss = forwardLoss(layer, Y, T, varargin)
            % Calculate the forward loss
            % Reshape predictions and targets
            Y = reshape(Y, [], 1);
            T = reshape(T, [], 1);
            X1 = varargin{1};
            X2 = varargin{2};
            % Sequence input data
            sequence_input_data = reshape(X1, [], layer.numFeature);
            % Calculate mean residue
            mean_residue = PI_BEM_Residue(Y, T, sequence_input_data, layer.node_properties);
            % Calculate RMSE loss
            rmse_loss = rmse(Y, T);
            % Total loss
            loss = mean_residue + rmse_loss;
        end
    end
end
And this is the network architecture:
layers = [
    sequenceInputLayer(numFeatures, 'Name', 'inputLayer')                        % sequence input layer
    lstmLayer(num_hidden_units, 'OutputMode', 'sequence', 'Name', 'lstmLayer')   % LSTM layer
    fullyConnectedLayer(1, 'Name', 'fullyConnectedLayer')                        % fully connected layer
    dropoutLayer(x.dropout_rate, 'Name', 'dropoutLayer')                         % dropout layer
    customLossLayerMultiInput(LayerName, node_properties, numFeatures)];

% Create a layer graph
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph, "inputLayer", strcat(LayerName, '\in2'));
For this setup, I am getting the following error:
Error using nnet.cnn.LayerGraph>iValidateLayerName
Layer 'RegressionLayer_Node2\in2' does not exist.
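Two issues stand out here. First, connectLayers expects the destination in "layerName/inputName" form with a forward slash, not a backslash. Second, and more fundamentally, a nnet.layer.RegressionLayer output layer is only ever called with the predictions Y and targets T, so a second input cannot be wired into it at all. A common workaround is to drop the output layer and train a dlnetwork with a custom training loop, where the loss function can take any extra arguments. A minimal sketch, reusing the PI_BEM_Residue function and variable names from the comment above (treat the exact reshapes as assumptions to adapt):

```matlab
% Network without an output layer, in dlnetwork form
layers = [
    sequenceInputLayer(numFeatures, 'Name', 'inputLayer')
    lstmLayer(num_hidden_units, 'OutputMode', 'sequence', 'Name', 'lstmLayer')
    fullyConnectedLayer(1, 'Name', 'fcLayer')];
net = dlnetwork(layers);

function [loss, gradients] = modelLoss(net, X, T, node_properties, numFeature)
    % Forward pass
    Y = forward(net, X);
    % Standard RMSE term
    rmseLoss = sqrt(mean((Y - T).^2, 'all'));
    % Physics residual: the raw input X is available here directly,
    % so no second layer input is needed. For gradients to flow through
    % this term, PI_BEM_Residue must use dlarray-compatible operations.
    sequence_input_data = reshape(stripdims(X), [], numFeature);
    residue = PI_BEM_Residue(Y, T, sequence_input_data, node_properties);
    loss = rmseLoss + residue;
    gradients = dlgradient(loss, net.Learnables);
end

% Inside the training loop (per mini-batch):
% [loss, gradients] = dlfeval(@modelLoss, net, dlX, dlT, node_properties, numFeatures);
% [net, averageGrad, averageSqGrad] = adamupdate(net, gradients, ...
%     averageGrad, averageSqGrad, iteration);
```

This also removes the need for layerGraph and connectLayers entirely, since the loss function sees the inputs, predictions, and targets in one place.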


More Answers (0)
