Neural ODE with custom layer for adding various input signals

7 views (last 30 days)
Sergio
Sergio on 28 Apr 2025
Answered: Hitesh on 6 May 2025
Hi all,
Using this post as a guide, I am constructing a Neural ODE that contains a custom layer. The custom layer interpolates an input signal (the first input signal) that varies over time and concatenates the interpolated values with an expanded constant value (the second input signal). The ODE I am solving has the following form:
The custom layer created is the following:
classdef CustomDLLayer < nnet.layer.Layer & nnet.layer.Acceleratable
    properties
        ti    % time
        ui    % value to interpolate
        var_1 % constant
    end
    methods
        function layer = CustomDLLayer(ti,ui,var_1,name)
            layer.Name = name;
            layer.ti = ti;
            layer.ui = ui;
            layer.var_1 = var_1;
        end
        function Z = predict(layer,t)
            interpolation = interp1(layer.ti, layer.ui', t, 'linear', 'extrap'); % perform the interpolation
            var1_expanded = layer.var_1 * ones(size(interpolation), 'like', interpolation); % expand the constant value
            Z = cat(1, interpolation, var1_expanded); % concatenate the interpolated and expanded values
        end
    end
end
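As a quick sanity check, the layer can be exercised in isolation before wiring it into a network. This is only a sketch: the data shapes mirror the generation code below, and the variable names here are illustrative.

```matlab
% Standalone shape check for CustomDLLayer (illustrative values).
ti  = linspace(1,50,50);                 % 50 time points
ui  = randn(100,numel(ti));              % 100-channel signal sampled at ti
lyr = CustomDLLayer(ti,ui,dlarray(1.5),'check');

Z = predict(lyr, 0.5);                   % query the layer at a single time t
disp(size(Z))                            % interpolated rows stacked with the constant row along dim 1
```

Inspecting `size(Z)` this way makes it easy to see what shape the downstream layers must accept.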
Part of my main code is the following:
% Data generation
ti = linspace(1,50,50);
u = dlarray(randn(100,numel(ti)));
var1 = dlarray(1.5);
CustomLayer = CustomDLLayer(ti,u,var1,'CustomLayer'); % note: 'ti', not the undefined 't'
% set up the neural ODE
hiddenstate = 10;
odenet = [
    CustomLayer
    concatenationLayer(1,2)
    fullyConnectedLayer(hiddenstate)
    tanhLayer
    fullyConnectedLayer(1)
    ];
odenet = dlnetwork(odenet,Initialize=false);
tspan = 0:1:50;
odeLayer = neuralODELayer(odenet,tspan,GradientMode="adjoint",Name='ode');
% set up the main neural network
layer = [
    featureInputLayer(1)
    odeLayer
    ];
The error I encounter is that the dimensions of the arrays being concatenated (the output of odenet and the input of layer) are not consistent. This makes sense, because the output of odenet has 2 channels with 50 batch observations each, while the input of layer has 1 channel and 1 batch observation. I've tried initializing the dlnetwork with a dlarray that has 1 channel and 50 batch observations, but it didn't work.
To train the model I am using the function trainnet, and I am using neuralODELayer because I intend to include more tools within this deep learning model.
I am not sure whether I chose the best approach for this problem, and I would appreciate your guidance.
Sergio

Answers (1)

Hitesh
Hitesh on 6 May 2025
Hi Sergio,
The error you are encountering, "dimensions of arrays being concatenated are not consistent", indicates that the ODE network (odenet) outputs a tensor of size [2, 50] (2 channels, 50 batch observations), whereas the main network expects an input of size [1, 1] (1 channel, 1 batch observation). Kindly follow the steps below to resolve this error:
Step 1: Ensure Consistent Input/Output Sizes
  • Custom Layer Output: Your custom layer outputs [2, numel(t)]. You need to ensure that the input to the ODE layer matches this shape.
  • Input Layer: The main network's input layer needs to be compatible with the ODE network's expected input size. If "odenet" expects "[2, batchSize]", the "featureInputLayer" needs to be:
featureInputLayer(2, 'Name', 'input');
  • Data Preparation: Ensure that the input data passed to "trainnet" matches the network's expected input size.
Step 2: Adjust the Network Construction
  • Modify the Input Layer
layer = [
    featureInputLayer(2, 'Name', 'input') % <-- 2 channels
    odeLayer
    ];
  • Modify Data Generation: If the input is a time series, you need to reshape the data so that each sample is [2, batchSize].
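As an illustration of that reshape (the variable names here are placeholders, and the constant 1.5 mirrors var1 from the question):

```matlab
% Illustrative data preparation: one feature plus the constant channel.
X      = randn(1,50);                 % 1 channel x 50 observations
XTrain = [X; repmat(1.5,1,50)];       % stack the constant -> 2 x 50
XTrain = dlarray(XTrain,"CB");        % label as 2 channels, 50 batch observations
```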
  • Remove Redundant Concatenation: Since the custom layer already concatenates the interpolation and the constant, there is no need for a separate "concatenationLayer" in "odenet". Kindly remove "concatenationLayer(1,2)" from "odenet":
odenet = [
    CustomLayer
    fullyConnectedLayer(hiddenstate)
    tanhLayer
    fullyConnectedLayer(1)
    ];
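Putting the pieces together, the adjusted setup might look like the sketch below. XTrain and TTrain are placeholders for your prepared inputs and targets, and the loss function and training options are only examples, not a prescription:

```matlab
% Sketch of the adjusted main network and training call.
odenet   = dlnetwork(odenet, Initialize=false);
tspan    = 0:1:50;
odeLayer = neuralODELayer(odenet, tspan, GradientMode="adjoint", Name="ode");

layers = [
    featureInputLayer(2, Name="input")   % 2 channels, matching odenet's input
    odeLayer
    ];

% XTrain (2 x numObs) and TTrain are placeholders you would supply.
net = trainnet(XTrain, TTrain, layers, "mse", trainingOptions("adam"));
```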
For more information regarding "neuralODELayer", kindly refer to the MATLAB documentation.


Version: R2024b
