Bayesian Hyperparameter Tuning for ANFIS

Raghu on 26 Jun 2021
Answered: Shubham on 5 Sep 2024
Hello,
I developed ANFIS code for 2 inputs and 1 output using the example provided by MATLAB. I would like to optimise it with Bayesian optimisation for hyperparameter tuning, such as the number of membership functions and the number of training epochs. Can someone please help me? The code below is not working.
This is my code for the ANFIS:
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1);
[m,n] = size(Daten) ;
[m,n] = size(y) ;
% Split into train and test
P = 0.7 ;
Training = y(1:round(P*m),:) ;
Testing = y(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest = Testing(:,1:n-1);
YTest = Testing(:,n);
%%
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('MF', [1,20], 'Type', 'integer');
optimizableVariable('EN', [1,100], 'Type', 'integer')];
% Optimize
minfn = @(vars)kfoldLoss(XTrain', YTrain', cv, vars.MF, vars.EN);
results = bayesopt(minfn, vars,'IsObjectiveDeterministic', false,...
'AcquisitionFunctionName', 'expected-improvement-plus');
vars = bestPoint(results);
% ANFIS model
a= Training % training data
x= [a(:,1),a(:,2)]; % input for anfis
opt = anfisOptions('InitialFIS',vars.MF,'EpochNumber',vars.EN);
opt.DisplayErrorValues = 0;
opt.DisplayStepSize = 0;
fis = anfis(y,opt);
anfisOutput= evalfis(fis,x);
z= anfisOutput;

Answers (1)

Shubham on 5 Sep 2024
Hi Raghu,
To optimize an ANFIS model using Bayesian optimization in MATLAB, you need to ensure that the optimization process is correctly set up, particularly the objective function that evaluates the model based on the hyperparameters. Your code has a few issues that need addressing to make it work properly. Here's a step-by-step guide to help you set up the optimization:
Key Steps:
  1. Data Preparation: Ensure your data is correctly split into training and testing sets.
  2. Objective Function: Define an objective function that trains the ANFIS model with given hyperparameters and returns a performance metric (e.g., mean squared error) on validation data.
  3. Bayesian Optimization: Use the bayesopt function to search for the best hyperparameters.
% Generate synthetic data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + 0.1*randn(100, 1);
% Split into train and test
P = 0.7;
m = size(Daten, 1);
Training = Daten(1:round(P*m), :);
Testing = Daten(round(P*m)+1:end, :);
XTrain = Training(:, 1:2);
YTrain = Training(:, 3);
XTest = Testing(:, 1:2);
YTest = Testing(:, 3);
% Define cross-validation partition
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [
    optimizableVariable('MF', [1, 20], 'Type', 'integer');
    optimizableVariable('Epochs', [1, 100], 'Type', 'integer')
];
% Define the objective function
minfn = @(vars) kfoldLossFunction(XTrain, YTrain, cv, vars.MF, vars.Epochs);
% Optimize
results = bayesopt(minfn, vars, 'IsObjectiveDeterministic', false, ...
'AcquisitionFunctionName', 'expected-improvement-plus');
% Extract best hyperparameters
bestVars = bestPoint(results);
% Train final ANFIS model with best hyperparameters
opt = anfisOptions('InitialFIS', genfis1([XTrain YTrain], bestVars.MF), ...
'EpochNumber', bestVars.Epochs);
opt.DisplayErrorValues = 0;
opt.DisplayStepSize = 0;
fis = anfis([XTrain YTrain], opt);
% Evaluate ANFIS model
anfisOutput = evalfis(fis, XTest);
mseTest = mean((YTest - anfisOutput).^2);
fprintf('Test MSE: %.4f\n', mseTest);
% Objective function for cross-validation
function loss = kfoldLossFunction(XTrain, YTrain, cv, numMFs, numEpochs)
    % Generate initial FIS
    fis = genfis1([XTrain YTrain], numMFs);
    % Initialize loss
    loss = 0;
    % Perform cross-validation
    for i = 1:cv.NumTestSets
        trainIdx = training(cv, i);
        valIdx = test(cv, i);
        % Train ANFIS with the candidate hyperparameters
        opt = anfisOptions('InitialFIS', fis, 'EpochNumber', numEpochs);
        opt.DisplayErrorValues = 0;
        opt.DisplayStepSize = 0;
        trainedFis = anfis([XTrain(trainIdx, :) YTrain(trainIdx)], opt);
        % Evaluate on the validation set
        valOutput = evalfis(trainedFis, XTrain(valIdx, :));
        loss = loss + mean((YTrain(valIdx) - valOutput).^2);
    end
    % Average cross-validation loss
    loss = loss / cv.NumTestSets;
end
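Once bayesopt finishes, the returned BayesianOptimization object can be inspected directly; a short sketch (using the results variable from the code above):

```matlab
% Inspect the optimization results (sketch; uses 'results' from above)
bestVars = bestPoint(results);                 % 1-row table with MF and Epochs
fprintf('Best observed CV MSE: %.4f\n', results.MinObjective);
fprintf('Best MF = %d, Epochs = %d\n', bestVars.MF, bestVars.Epochs);
```

MinObjective is the best observed objective value, which may differ slightly from the model-estimated minimum (MinEstimatedObjective).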
Explanation:
  • Data Preparation: Ensure that Daten is correctly split into training and testing datasets.
  • Objective Function (kfoldLossFunction): This function trains the ANFIS model on the training data and evaluates it on the validation data. It returns the mean squared error (MSE) as the loss.
  • Bayesian Optimization: Uses the bayesopt function to find the optimal number of membership functions (MF) and epochs (Epochs). The objective function is evaluated using k-fold cross-validation.
  • Final Model Training: Once the best hyperparameters are found, train the ANFIS model on the entire training set and evaluate it on the test set.
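One caveat: genfis1 is an older interface, and on recent MATLAB releases the documentation recommends genfis with genfisOptions instead. A grid-partition equivalent of the genfis1 calls above might look like this (a sketch, assuming the same XTrain, YTrain, and bestVars as in the code above):

```matlab
% Sketch: replacing genfis1 with the newer genfis API
gopt = genfisOptions('GridPartition');
gopt.NumMembershipFunctions = bestVars.MF;     % same MF count found by bayesopt
initialFis = genfis(XTrain, YTrain, gopt);     % inputs and output passed separately
opt = anfisOptions('InitialFIS', initialFis, 'EpochNumber', bestVars.Epochs);
fis = anfis([XTrain YTrain], opt);
```

The behaviour should match the genfis1 version for grid partitioning; genfis also exposes other initialization schemes (e.g. subtractive clustering) through the same options object.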
