Predict Battery State of Charge Using Deep Learning

This example shows how to train a neural network to predict the state of charge of a battery by using deep learning.

Battery state of charge (SOC) is the level of charge of an electric battery relative to its capacity, measured as a percentage. SOC is critical information for the vehicle energy management system and must be accurately estimated to ensure reliable and affordable electrified vehicles (xEVs). However, because of the nonlinear temperature-, health-, and SOC-dependent behavior of Li-ion batteries, SOC estimation remains a significant automotive engineering challenge. Traditional approaches to this problem, such as electrochemical models, usually require precise parameters and detailed knowledge of the battery composition as well as its physical response. In contrast, using neural networks is a data-driven approach that requires minimal knowledge of the battery or its nonlinear behavior. [1]

This example is based on the MATLAB script from [1]. The example trains a neural network to predict the state of charge of a Li-ion battery, given time series data representing various features of the battery such as voltage, current, temperature, and average voltage and current (over the last 500 seconds).

The training data contains a single sequence of experimental data collected while the battery powered an electric vehicle during a driving cycle at an ambient temperature of 25 degrees Celsius. The test data contains four sequences of experimental data collected during driving cycles at four different temperatures. This example uses the preprocessed data set LG_HG2_Prepared_Dataset_McMasterUniversity_Jan_2020 from [1]. For an example showing how to use a trained neural network inside a Simulink® model to predict the SOC of a battery, see Battery State of Charge Estimation in Simulink Using Deep Learning Network.

Download Data

Each file in the LG_HG2_Prepared_Dataset_McMasterUniversity_Jan_2020 data set contains a time series X of five predictors (voltage, current, temperature, average voltage, and average current) and a time series Y of one target (SOC). Each file represents data collected at a different ambient temperature.

Specify the URL from which to download the data set. Alternatively, you can download this data set manually from https://data.mendeley.com/datasets/cp3473x7xv/3.

url = "https://data.mendeley.com/public-files/datasets/cp3473x7xv/files/ad7ac5c9-2b9e-458a-a91f-6f3da449bdfb/file_downloaded";

Set downloadFolder to the location where you want to download the ZIP file and outputFolder to the location where you want to extract it.

downloadFolder = tempdir;
outputFolder = fullfile(downloadFolder, "LGHG2@n10C_to_25degC");

Download and extract the LG_HG2_Prepared_Dataset_McMasterUniversity_Jan_2020 data set.

if ~exist(outputFolder,"dir")
    fprintf("Downloading LGHG2@n10C_to_25degC.zip (56 MB) ... ")
    filename = fullfile(downloadFolder,"LGHG2@n10C_to_25degC.zip");
    websave(filename,url);
    unzip(filename,outputFolder)
end

Prepare Training Data

For the training data, create a file datastore and specify the read function as the load function. The load function loads the data from the MAT file into a structure array.

folderTrain = fullfile(outputFolder,"Train");
fdsTrain = fileDatastore(folderTrain, ReadFcn=@load); 

Each file in this datastore contains both the predictors X and the targets Y.

To create a transformed datastore tdsPredictorsTrain that returns only the predictor data X from each file, transform the file datastore fdsTrain.

tdsPredictorsTrain = transform(fdsTrain, @(data) {data.X});

Preview the transformed datastore. The output corresponds to a single sequence of predictors X from the first file.

preview(tdsPredictorsTrain)
ans = 1×1 cell array
    {5×669956 double}

To create a transformed datastore tdsTargetsTrain that returns only the target data Y from each file, transform the file datastore fdsTrain.

tdsTargetsTrain = transform(fdsTrain, @(data) {data.Y});

Preview the transformed datastore. The output corresponds to a single sequence of targets Y from the first file.

preview(tdsTargetsTrain)
ans = 1×1 cell array
    {[0.2064 0.2064 0.2064 0.2064 0.2064 0.2064 0.2064 0.2064 0.2064 0.2064 … ]}

To input the predictors and targets from both datastores into a deep learning network, combine them using the combine function.

cdsTrain = combine(tdsPredictorsTrain,tdsTargetsTrain);

Note that to input the sequence data from datastores to a deep learning network, the mini-batches of the sequences must have the same length, which usually requires padding the sequences in the datastore. In this example, padding is not necessary because the training data consists of a single sequence. For more information, see Train Network Using Out-of-Memory Sequence Data.
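Although padding is not needed in this example, the sketch below shows one way such padding could look, using the padsequences function from Deep Learning Toolbox; the random sequences here are placeholders, not data from this example.

```matlab
% Sketch: pad three 5-feature sequences of different lengths to a
% common length so they can form one mini-batch.
sequences = {rand(5,100); rand(5,80); rand(5,120)};   % placeholder data
padded = padsequences(sequences,2,PaddingValue=0);    % pad along the time dimension
size(padded)   % features-by-time-by-observations, here 5-by-120-by-3
```

The second argument specifies the dimension along which to pad; padsequences then concatenates the padded sequences along the next dimension, producing one numeric array suitable for batching.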

Prepare Test and Validation Data

For the testing data, create a file datastore and specify the read function as the load function. The load function loads the data from the MAT file into a structure array.

folderTest = fullfile(outputFolder,"Test");
fdsTest = fileDatastore(folderTest, ReadFcn=@load);

Each file in this datastore contains both the predictors X and the targets Y.

To create a transformed datastore tdsPredictorsTest that returns only the predictor data X from each file, transform the file datastore fdsTest.

tdsPredictorsTest = transform(fdsTest, @(data) {data.X});

Preview the transformed datastore. The output corresponds to a single sequence of predictors X from the first file.

preview(tdsPredictorsTest)
ans = 1×1 cell array
    {5×39293 double}

To create a transformed datastore tdsTargetsTest that returns only the target data Y from each file, transform the file datastore fdsTest.

tdsTargetsTest = transform(fdsTest,@(data) {data.Y});

Preview the transformed datastore. The output corresponds to a single sequence of targets Y from the first file.

preview(tdsTargetsTest)
ans = 1×1 cell array
    {[1 1.0000 1.0000 1.0000 1.0000 1.0000 0.9999 0.9999 0.9999 0.9999 … ]}

Specify the validation data as a subset of the testing data containing only the first file. To input the predictors and targets from both validation datastores into the trainingOptions function, combine them using the combine function.

indices = 1;
vdsPredictors = subset(tdsPredictorsTest,indices);
vdsTargets = subset(tdsTargetsTest,indices);
cdsVal = combine(vdsPredictors,vdsTargets);

Define Network Architecture

Define the network architecture. Set the number of input features to five (voltage, current, temperature, average voltage, and average current).

numFeatures = 5; 

Set the number of output features to one (SOC).

numResponses = 1;

Specify the number of hidden neurons.

numHiddenNeurons = 55; 

Define the layers of the network.

layers = [
    sequenceInputLayer(numFeatures,Normalization="zerocenter")
    fullyConnectedLayer(numHiddenNeurons)
    tanhLayer                            
    fullyConnectedLayer(numHiddenNeurons)
    leakyReluLayer(0.3)                  
    fullyConnectedLayer(numResponses)
    clippedReluLayer(1)                 
    regressionLayer];

Specify the training options. Train for 1200 epochs with mini-batches of size 1 using the "adam" solver. To prevent the gradients from exploding, set the gradient threshold to 1. Specify an initial learning rate of 0.01, a learning rate drop period of 400, and a learning rate drop factor of 0.1. Specify a validation frequency of 30. Experiments in Experiment Manager showed that the initial learning rate of 0.01 and the learning rate drop factor of 0.1 together minimize the validation error. For more information on how to optimize hyperparameters using Experiment Manager, see Choose Training Configurations for LSTM Using Bayesian Optimization.

Epochs = 1200;
miniBatchSize = 1;
LRDropPeriod = 400; 
InitialLR = 0.01;
LRDropFactor = 0.1; 
valFrequency = 30; 

options = trainingOptions("adam", ...                 
    MaxEpochs=Epochs, ...
    GradientThreshold=1, ...
    InitialLearnRate=InitialLR, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropPeriod=LRDropPeriod, ...
    LearnRateDropFactor=LRDropFactor, ...
    ValidationData=cdsVal, ...
    ValidationFrequency=valFrequency, ...
    MiniBatchSize=miniBatchSize, ...
    Verbose=0, ...
    Plots="training-progress");

Train Network

Train the network using trainNetwork with the specified training options.

net = trainNetwork(cdsTrain,layers,options);

Test Network

Make predictions on the test data using predict. To avoid having to pad the sequences to ensure that all sequences in a mini-batch have the same length, set the mini-batch size to 1.

YPred = predict(net,tdsPredictorsTest,MiniBatchSize=1);

Compare the SOC predicted by the network to the target SOC from the test data for different temperatures.

YTarget = readall(tdsTargetsTest);

Plot the predicted and the target SOC for different ambient temperatures.

figure

nexttile
plot(YPred{1})
hold on
plot(YTarget{1})
legend(["Predicted" "Target"], Location="Best")
ylabel("SOC")
xlabel("Time (s)")
title("n10degC")

nexttile
plot(YPred{2})
hold on
plot(YTarget{2})
legend(["Predicted" "Target"], Location="Best")
ylabel("SOC")
xlabel("Time (s)")
title("0degC")

nexttile
plot(YPred{3})
hold on
plot(YTarget{3})
legend(["Predicted" "Target"], Location="Best")
ylabel("SOC")
xlabel("Time (s)")
title("10degC")

nexttile
plot(YPred{4})
hold on
plot(YTarget{4})
legend(["Predicted" "Target"], Location="Best")
ylabel("SOC")
xlabel("Time (s)")
title("25degC")

Calculate the error between the predicted SOC and the target SOC for each ambient temperature.

Err_n10degC = YPred{1} - YTarget{1};
Err_0degC = YPred{2} - YTarget{2};
Err_10degC = YPred{3} - YTarget{3};
Err_25degC = YPred{4} - YTarget{4};

Calculate the root mean squared error (RMSE) as a percentage.

RMSE_n10degC = sqrt(mean(Err_n10degC.^2))*100;
RMSE_0degC = sqrt(mean(Err_0degC.^2))*100;
RMSE_10degC = sqrt(mean(Err_10degC.^2))*100;
RMSE_25degC = sqrt(mean(Err_25degC.^2))*100;

Calculate the maximum error as a percentage.

MAX_n10degC = max(abs(Err_n10degC))*100;
MAX_0degC = max(abs(Err_0degC))*100;
MAX_10degC = max(abs(Err_10degC))*100;
MAX_25degC = max(abs(Err_25degC))*100;
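As a sketch, you can also collect the metrics computed above in a table for easy side-by-side comparison; this snippet assumes the RMSE and MAX variables from the previous steps are in the workspace.

```matlab
% Summarize the per-temperature error metrics in one table.
temperature = [-10; 0; 10; 25];
rmsePercent = [RMSE_n10degC; RMSE_0degC; RMSE_10degC; RMSE_25degC];
maxPercent  = [MAX_n10degC; MAX_0degC; MAX_10degC; MAX_25degC];
errorSummary = table(temperature,rmsePercent,maxPercent)
```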

Plot the RMSE for the different ambient temperatures.

temp = [-10,0,10,25];
figure
nexttile
bar(temp,[RMSE_n10degC,RMSE_0degC,RMSE_10degC,RMSE_25degC])
ylabel("RMSE (%)")
xlabel("Temperature (C)")

Plot the maximum absolute error for the different ambient temperatures.

nexttile
bar(temp,[MAX_n10degC,MAX_0degC,MAX_10degC,MAX_25degC])
ylabel("MAX (%)")
xlabel("Temperature (C)")

Lower values in the RMSE and MAX plots indicate more accurate predictions at the corresponding temperatures; larger values indicate less accurate predictions.

References

[1] Kollmeyer, Phillip, Carlos Vidal, Mina Naguib, and Michael Skells. “LG 18650HG2 Li-Ion Battery Data and Example Deep Neural Network XEV SOC Estimator Script.” Mendeley, March 5, 2020. https://doi.org/10.17632/CP3473X7XV.3.
