Hi all,
Using this post as a guide, I am constructing a Neural ODE that contains a custom layer. The custom layer interpolates an input signal that varies over time (the first input signal) and concatenates the interpolated values with an expanded constant value (the second input signal). The ODE I am solving has the following form:

The custom layer I created is the following:
classdef CustomDLLayer < nnet.layer.Layer & nnet.layer.Acceleratable
    properties
        ti       % time samples of the input signal
        ui       % time-varying input signal (channels-by-time)
        var_1    % constant second input
    end
    methods
        function layer = CustomDLLayer(ti,ui,var_1,name)
            layer.Name = name; layer.ti = ti; layer.ui = ui; layer.var_1 = var_1;
        end
        function Z = predict(layer,t)
            % Interpolate the stored signal at the query time t and
            % concatenate it with the expanded constant input.
            interpolation = interp1(layer.ti, layer.ui', t, 'linear', 'extrap');
            var1_expanded = layer.var_1 * ones(size(interpolation), 'like', interpolation);
            Z = cat(1, interpolation, var1_expanded);
        end
    end
end
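For reference, here is a quick standalone check of the shapes the layer produces when called on its own; the concrete numbers (50 time samples, a 100-channel signal, var_1 = 2.5, query time 0.3) are just placeholders and the data are plain doubles:

ti_chk = linspace(0,1,50);                          % placeholder time samples
ui_chk = randn(100,50);                             % placeholder 100-channel signal over those samples
lyr    = CustomDLLayer(ti_chk, ui_chk, 2.5, 'CustomLayer');
Z      = predict(lyr, 0.3);                         % interp1 returns 1-by-100 here, so Z is 2-by-100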
Part of my main code is the following:

u = dlarray(randn(100,numel(ti)));                      % time-varying input signal (100 channels over the samples in ti)
CustomLayer = CustomDLLayer(t,u,var1,'CustomLayer');    % interpolation layer defined above

% ... (excerpt of the layer array that defines odenet)
fullyConnectedLayer(hiddenstate)

odenet = dlnetwork(odenet,Initialize=false);            % wrap the dynamics layers, initialization deferred
odeLayer = neuralODELayer(odenet,tspan,GradientMode="adjoint",Name='ode');
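For context, the ODE layer is meant to sit inside a larger model, roughly along the lines of this simplified sketch (featureInputLayer, inputSize, outputSize, and the surrounding layers are placeholders, not my actual network):

% simplified sketch of the outer network around the ODE layer
layers = [
    featureInputLayer(inputSize)
    fullyConnectedLayer(hiddenstate)
    odeLayer
    fullyConnectedLayer(outputSize)
    ];
net = dlnetwork(layers);   % initializing the outer network also initializes odenet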
The error I encounter is that the dimensions of the arrays being concatenated (the output of odenet and the input of the layer) are not consistent. This makes sense, because the output of odenet has 2 channels with 50 batches each, while the input of the layer has 1 channel and 1 batch. I've tried initializing the dlnetwork with a dlarray that has 1 channel and 50 batches, but that didn't work.
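The initialization attempt looked roughly like this (a sketch; the exact call and format string may have differed slightly):

X0 = dlarray(zeros(1,50), "CB");     % 1 channel, 50 observations
odenet = initialize(odenet, X0);     % initialize the dynamics network with this example input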
To train the model I am using the function trainnet, and I am using neuralODELayer because I intend to include more tools within this deep learning model.
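The training call itself is along these lines (options, XTrain, and TTrain are placeholders):

% placeholder training setup
options = trainingOptions("adam", Plots="training-progress");
net = trainnet(XTrain, TTrain, net, "mse", options);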
I am not sure whether this is the best approach to this problem, so I would appreciate your guidance.
Sergio