How can I obtain the intermediate data sets during the fitting process with lsqcurvefit?

With x = lsqcurvefit(fun,x0,xdata,ydata), x0 is the initial parameter set and x is the final parameter set.
I need the intermediate parameter sets; if possible, could you let me know how to get them?
I'd like to make some graphs from the intermediate sets via fun in order to compare the differences between the graphs.
I tried running lsqcurvefit multiple times, i.e. the process below, but it takes many hours:
  1. Calculate x by lsqcurvefit and note the iteration count, e.g. 100.
  2. Set MaxIterations to 10 and calculate x10 by lsqcurvefit.
  3. Set MaxIterations to 20, 30, ..., 100 and calculate x20, x30, ..., x100, respectively.
  4. Make graphs with x10, x20, ..., x100.

 Accepted Answer

Umar on 29 Aug 2024

Hi @彰朗,


Please see my response to your comments below.

To address your question effectively, and following the MATLAB documentation provided by @Matt J, I implemented the code below.

https://www.mathworks.com/help/optim/ug/output-functions.html#mw_rtc_OutputFunctionsForOptimizationToolboxExample_B73D5DFB

Your goal was to visualize the fitting progression and compare the differences in the graphs generated from these intermediate data sets. So, let me help you understand my code structure below.

Function Definition: The main function fitWithIntermediateData is designed to perform the fitting while capturing intermediate results. It takes four parameters: the model function fun, the initial guess x0, the independent variable data xdata, and the dependent variable data ydata.

History Structure: A structure named history is initialized to store the intermediate parameter values at each iteration. This is crucial for later analysis and visualization.

Options for lsqcurvefit: The optimoptions function is used to set various options for the lsqcurvefit function. Notably, the OutputFcn option is set to a custom function outfun, which is called at each iteration of the optimization process. This function is responsible for storing the current parameter estimates.

Output Function: The outfun function checks the state of the optimization. If the state is 'iter', it appends the current parameter estimates to the history.x array. This allows you to track how the parameters evolve over iterations.

Model Function: The yourModelFunction is a placeholder for the actual model you wish to fit. In this example, it is a simple linear function, but you can replace it with any model that suits your data.

Final Fit and Plotting: After the fitting process, the final parameters and the history of intermediate results are available. The code then plots each intermediate fit along with the final fit, allowing for visual comparison.
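As a rough cross-language illustration of the same record-the-iterates idea (a sketch assuming SciPy is available; the names model, sse, and history below are my own, not from this thread), scipy.optimize.minimize accepts a callback that receives the current parameter vector once per iteration, playing the role of OutputFcn:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data mirroring the linear example used in this answer
xdata = np.linspace(0, 10, 100)
rng = np.random.default_rng(0)
ydata = 2 * xdata + 1 + rng.normal(size=xdata.size)

def model(params, x):
    return params[0] * x + params[1]

def sse(params):
    """Sum of squared residuals, the quantity lsqcurvefit minimizes."""
    r = model(params, xdata) - ydata
    return float(r @ r)

history = []  # plays the role of history.x in the MATLAB code
result = minimize(sse, x0=[0.0, 0.0], method="BFGS",
                  callback=lambda xk: history.append(xk.copy()))

# history now holds one parameter vector per accepted iteration
print(len(history), result.x)
```

The key detail is that the callback appends to a list that outlives each call, so the full iterate trail is available after the fit finishes.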

Here is the complete code with explanations embedded:

% Define initial guess values as needed
initialGuess = [1, 2, 3];
% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata));
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, initialGuess, xdata, ydata);
% Accessing intermediate results (one row per iteration):
intermediateResults = fitHistory.x;
% Plotting the results
figure;
hold on;
for i = 1:size(intermediateResults, 1)
    plot(xdata, yourModelFunction(intermediateResults(i,:), xdata), ...
        'DisplayName', ['Iteration ' num2str(i)]);
end
% Plot the final fit
plot(xdata, yourModelFunction(xFinal, xdata), 'k--', 'DisplayName', 'Final Fit', 'LineWidth', 2);
legend show;
title('Intermediate Fit Results and Final Fit');
xlabel('X Data');
ylabel('Y Data');
hold off;

function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata)
    % Initialize history structure to store intermediate results
    history.x = [];
    % Define options for lsqcurvefit; outfun is nested below so that it
    % shares this workspace and its appends to history actually persist
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...       % step tolerance
        'FunctionTolerance', 1e-6, ...   % tolerance on the function value
        'MaxIterations', 1000);          % maximum number of iterations
    % Call lsqcurvefit
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, [], [], options);
    % Display final results
    fprintf('Final Parameters: %s\n', mat2str(xsol));
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);

    function stop = outfun(x, optimValues, state)
        stop = false; % allow the optimization to continue
        if strcmp(state, 'iter')
            history.x = [history.x; x(:).']; % append current parameters as a row
        end
    end
end

function y = yourModelFunction(params, xdata)
    % Define your model function here (placeholder: a simple linear model)
    y = params(1) * xdata + params(2) + params(3);
end

Since you are interested in the visualization part of the code: the loop runs through the intermediateResults array, plotting each set of parameters against xdata. This lets you visualize how the model evolves with each iteration. The final fit is plotted as a dashed line for clear distinction.

Please see attached.


So, by utilizing the provided code structure, you can capture and visualize the intermediate data sets during the fitting process with lsqcurvefit. This approach not only saves time compared to running multiple fitting processes but also provides a comprehensive view of how the fitting progresses. You can further customize the model function and the plotting sections of code to suit your specific needs. Please let me know if you have any further questions.

4 Comments

Thank you for your kind answer and sample code. With your code, does history include the middle x (i.e., the x values made in the middle of the fitting calculation by lsqcurvefit)?

Hi @彰朗,

To address: does history include the middle x (i.e., the x values made in the middle of the fitting calculation by lsqcurvefit)?

Yes, the history structure in the provided code includes the intermediate x values generated during the fitting process by lsqcurvefit. The outfun function, which is called at each iteration of the optimization process, stores the current parameter estimates in the history.x array. This lets you track and access all the intermediate x values as they are calculated during the fitting process, and the history structure provides a comprehensive record of these intermediate results for further analysis and visualization.

Hi @彰朗,
Please let us know if you still need assistance with your problem or have any further questions.
Thank you for your follow-up, and I'm sorry for the long delay in answering your comment.
Last week I tried several times to build the program with your attached code.
The command window showed almost the same output as yours.
(Sorry for the Japanese, but the meaning is almost the same.)
But fitHistory had no data; it was an empty matrix.
I used the original function, function y = yourModelFunction(params, xdata), setting params to [100, 0.01, 3].
Could you advise how to improve my situation?
The code is attached below;
% Clear unneeded variables and figures
clear all
close all
% Define initial guess values as needed
initialGuess = [1, 2, 3];
params = [100, 0.01, 3];
% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata));
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, initialGuess, xdata, ydata);
% Accessing intermediate results:
intermediateResults = fitHistory.x;
% Plotting the results
figure;
hold on;
for i = 1:size(intermediateResults, 1)
plot(xdata, yourModelFunction(intermediateResults(i,:), xdata),'DisplayName', ['Iteration ' num2str(i)]);
end
% Plot the final fit
plot(xdata, yourModelFunction(xFinal, xdata), 'k--', 'DisplayName', 'Final Fit', 'LineWidth', 2);
legend show;
title('Intermediate Fit Results and Final Fit');
xlabel('X Data');
ylabel('Y Data');
hold off;
figure
plot(xdata, ydata);
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata)
% Initialize history structure to store intermediate results
history.x = [];
% Define options for lsqcurvefit
options = optimoptions('lsqcurvefit', ...
'OutputFcn', @(x, optimValues, state) outfun(x, optimValues, state, history), ...
'StepTolerance', 1e-6, ... % Adjust step tolerance
'TolFun', 1e-6, ... % Tolerance on the function value
'MaxIter', 1000); % Maximum number of iterations
% Call lsqcurvefit
[xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, [], [], options);
% Display final results
fprintf('Final Parameters: %s\n', mat2str(xsol));
fprintf('Residual Norm: %.4f\n', resnorm);
fprintf('Exit Flag: %d\n', exitflag);
fprintf('Output: %s\n', output.message);
end
function stop = outfun(x, optimValues, state, history)
stop = false; % Allows optimization to continue
if strcmp(state, 'iter')
% Store current x in history
history.x = [history.x; x]; % Append current parameters to history
end
end
function y = yourModelFunction(params, xdata)
% Define your model function here
y = params(1) * xdata + params(2) + params(3);
end


More Answers (4)

Matt J on 29 Aug 2024

4 Comments

Thank you for your answer. I tried the OutputFcn and the PlotFcn, but unfortunately those didn't give me the intermediate data sets. I need the intermediate x values produced during the calculation of the final x.
Thanks a lot for your kind answer. I'll try again.
You're welcome, but if it resolves your question, please click Accept on the answer.


Umar on 2 Sep 2024

Hi @彰朗,

Could you advise how to improve my situation?

Please see my response to your comments below.

Detailed Comparison and Analysis

Initial Parameter Setup

Old Code: The initial guess for parameters was hardcoded with arbitrary values.

New Code: The initial guess is parameterized, and an array of lower and upper bounds is now defined for each parameter.

Improvement

This change enhances flexibility and allows for better control over the parameter space, which can lead to more accurate fitting. By bounding the parameters, the new code avoids unrealistic values that could skew results.

Performance Metrics Calculation

Old Code: The final output primarily includes the fitted parameters and residual norm, offering limited insight into the quality of the fit.

New Code: Introduces performance metrics such as root mean square error (RMSE) and R-squared, which provide a quantitative measure of fit quality.

Improvement

The incorporation of performance metrics allows users to evaluate the fitting process quantitatively, making it easier to assess model performance and compare different models.
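Both metrics follow directly from their definitions; here is a small language-neutral sketch in Python (fit_quality is an illustrative helper name, not part of the MATLAB code):

```python
import numpy as np

def fit_quality(y, y_fit):
    """Return (RMSE, R^2) for observed values y against fitted values y_fit."""
    y = np.asarray(y, dtype=float)
    y_fit = np.asarray(y_fit, dtype=float)
    rmse = np.sqrt(np.mean((y - y_fit) ** 2))      # root mean square error
    ss_res = np.sum((y - y_fit) ** 2)              # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)         # total sum of squares
    return rmse, 1.0 - ss_res / ss_tot             # R^2 = 1 - SSres/SStot

# A perfect fit gives RMSE 0 and R^2 exactly 1
rmse, r2 = fit_quality([1, 2, 3, 4], [1, 2, 3, 4])
print(rmse, r2)
```

Note that R² can go negative when the fit is worse than simply predicting the mean of y, which is a useful sanity check on a misbehaving model.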

Enhanced Output Reporting

Old Code: The reporting of results is minimal, focusing mainly on the fitted parameters.

New Code: More comprehensive output includes RMSE, R-squared, and details about the exit flag and residual norm.

Improvement

This enhancement makes the results more informative, facilitating a deeper analysis of the fitting process, which is vital for troubleshooting and validation.

Data Visualization

Old Code: The plotting section is functional but lacks clarity in distinguishing original data from fitted results.

New Code: An additional plot for original data points is included, and the final fit is clearly distinguished with a line style.

Improvement

Improved visualization aids in better interpretation of results, allowing users to visually assess the fitting quality against the original dataset.

Function Signature and Flexibility

Old Code: The fitting function does not accommodate parameter bounds, which may lead to suboptimal solutions.

New Code: The fitting function now accepts lower and upper bounds as parameters, enhancing the optimization's robustness.

Improvement

This flexibility is crucial in real-world applications where constraints on parameters are common. It allows for a more reliable fitting process and mitigates the risk of non-physical results.
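For readers outside MATLAB, bounded curve fitting works the same way in SciPy: curve_fit accepts a (lower, upper) bounds pair that keeps the optimizer inside a plausible box (a sketch under my own example model and bounds, not the thread's):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic noisy linear data, as in the thread's examples
xdata = np.linspace(0, 10, 100)
rng = np.random.default_rng(1)
ydata = 2 * xdata + 1 + rng.normal(size=xdata.size)

def model(x, a, b):
    return a * x + b

# bounds=(lower, upper) constrains each parameter to a physically plausible range
popt, _ = curve_fit(model, xdata, ydata, p0=[1.0, 0.5],
                    bounds=([0.0, 0.0], [10.0, 10.0]))
print(popt)
```

As in lsqcurvefit, supplying bounds switches the solver to a constrained trust-region method, so a starting point inside the box is required.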

New Code

% Define initial guess values
initialGuess = [100, 0.01, 3];
params = initialGuess; % Set parameters to the initial guess
% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0];    % Lower bounds for [param1, param2, param3]
upperBounds = [200, 1, 10]; % Upper bounds for [param1, param2, param3]
% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data
% Call the fitting function
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, params, xdata, ydata, lowerBounds, upperBounds);
% Calculate performance metrics
yFit = yourModelFunction(xFinal, xdata);
RMSE = sqrt(mean((ydata - yFit).^2));
SSres = sum((ydata - yFit).^2);
SStot = sum((ydata - mean(ydata)).^2);
R_squared = 1 - (SSres / SStot);
% Display performance metrics
fprintf('Final Parameters: %s\n', mat2str(xFinal));
fprintf('RMSE: %.4f\n', RMSE);
fprintf('R²: %.4f\n', R_squared);
% Accessing intermediate results:
intermediateResults = fitHistory.x;
% Plotting the results
figure;
hold on;
for i = 1:size(intermediateResults, 1)
    plot(xdata, yourModelFunction(intermediateResults(i,:), xdata), ...
        'DisplayName', ['Iteration ' num2str(i)]);
end
% Plot the final fit
plot(xdata, yFit, 'k--', 'DisplayName', 'Final Fit', 'LineWidth', 2);
legend show;
title('Intermediate Fit Results and Final Fit');
xlabel('X Data');
ylabel('Y Data');
hold off;
% Plot the original data
figure;
plot(xdata, ydata, 'o');
title('Original Data');
xlabel('X Data');
ylabel('Y Data');

function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    % Initialize history structure to store intermediate results
    history.x = [];
    % outfun is nested below so it shares this workspace and history persists
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...
        'FunctionTolerance', 1e-6, ...
        'MaxIterations', 1000);
    % Call lsqcurvefit with bounds
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
    % Display final results
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);

    function stop = outfun(x, optimValues, state)
        stop = false; % allow the optimization to continue
        if strcmp(state, 'iter')
            history.x = [history.x; x(:).']; % append current parameters as a row
        end
    end
end

function y = yourModelFunction(params, xdata)
    % Define your model function here
    y = params(1) * xdata + params(2) + params(3);
end

Output results after execution of the new code:

Local minimum possible.

lsqcurvefit stopped because the final change in the sum of squares relative to its initial value is less than the value of the function tolerance.

Stopping criteria details: Optimization stopped because the relative sum of squares (r) is changing by less than options.FunctionTolerance = 1.000000e-06.

Residual Norm: 75.4525
Exit Flag: 3
Final Parameters: [2.01098691915462 0.0790032930972242 0.881589904882525]
RMSE: 0.8686
R²: 0.9785

Summary of output results

The output results indicate a reasonably good fit; still, various strategies can be employed to enhance the model's performance further. By refining the initial guesses, adjusting tolerances, and reviewing your model function, you can work toward more optimal fitting results.

Please see attached.

Hope this helps. Please let me know if you have any further questions.

1 Comment

Thank you for updating your code; that is very kind.
The updated code is very useful, and the plot comparison is nice.
I got the same graph and command-window output with the new code.
But it still does not show the intermediate xdata.
So I have questions again.
One question is about usage:
is it OK to copy and paste into the MATLAB editor and click the Run button?
No special setup or process is needed?
The next question is about your result.
If possible, could you attach your result for the intermediate xdata in this comment?
The third is about fitHistory.
Is it correct to understand that fitHistory is updated with the intermediate xdata during the run?


Umar on 2 Sep 2024

Hi @彰朗,

Please see my response to your questions below.

One question is regarding how to use it: is it OK to copy and paste into the MATLAB editor and click the Run button? No special setup or process is needed?

Yes, you can copy and paste the entire code into the MATLAB editor and click the "Run" button. There is no need for special setup or processes, provided you have the necessary toolboxes installed (specifically, the Optimization Toolbox).

The next question is about your result: if possible, could you attach your result for the intermediate xdata in this comment?

The provided code already defines the initial parameters, bounds, and the data to be fitted, but I will make sure we focus on the middle segment of xdata later. The fitting function fitWithIntermediateData is called to optimize the parameters based on the model function defined; this function also captures intermediate results for analysis. To focus on the middle segment of xdata, we can calculate the indices that correspond to the middle 20% of the data as follows:

% Calculate the indices for the middle 20% of xdata
numDataPoints = length(xdata);
middleStartIndex = round(numDataPoints * 0.4); % Start at 40%
middleEndIndex = round(numDataPoints * 0.6);   % End at 60%
% Extract the middle xdata and corresponding ydata
middleXdata = xdata(middleStartIndex:middleEndIndex);
middleYdata = ydata(middleStartIndex:middleEndIndex);
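The same 40%–60% index arithmetic in Python, as a sketch (note that Python is zero-based and slices exclude the end index, so this yields 20 points where the MATLAB range 40:60 yields 21):

```python
import numpy as np

xdata = np.linspace(0, 10, 100)
n = xdata.size
start = round(n * 0.4)       # index at 40% of the data
end = round(n * 0.6)         # index at 60% of the data
middle_x = xdata[start:end]  # roughly the x = 4..6 span of the data
print(middle_x.size, middle_x[0], middle_x[-1])
```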

Using the optimized parameters obtained from the fitting process, you can calculate the fitted values for the middle segment:

% Calculate fitted values for the middle segment
middleYFit = yourModelFunction(xFinal, middleXdata);

Finally, display the results for the middle segment of xdata alongside the original noisy data. This can be done using a simple plot:

% Plotting the middle segment results
figure;
hold on;
plot(middleXdata, middleYdata, 'ro', 'DisplayName', 'Original Middle Data'); % Original data points
plot(middleXdata, middleYFit, 'b-', 'DisplayName', 'Fitted Middle Data'); % Fitted line
legend show;
title('Middle Segment Fit Results');
xlabel('X Data (Middle Segment)');
ylabel('Y Data');
hold off;

Here is the complete code integrating all the steps mentioned above:

% Define initial guess values
initialGuess = [100, 0.01, 3];
params = initialGuess; % Set parameters to the initial guess
% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0];    % Lower bounds for [param1, param2, param3]
upperBounds = [200, 1, 10]; % Upper bounds for [param1, param2, param3]
% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data
% Call the fitting function
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, params, xdata, ydata, lowerBounds, upperBounds);
% Calculate performance metrics
yFit = yourModelFunction(xFinal, xdata);
RMSE = sqrt(mean((ydata - yFit).^2));
SSres = sum((ydata - yFit).^2);
SStot = sum((ydata - mean(ydata)).^2);
R_squared = 1 - (SSres / SStot);
% Display performance metrics
fprintf('Final Parameters: %s\n', mat2str(xFinal));
fprintf('RMSE: %.4f\n', RMSE);
fprintf('R²: %.4f\n', R_squared);
% Accessing intermediate results:
intermediateResults = fitHistory.x;
% Extracting middle segment of xdata
numDataPoints = length(xdata);
middleStartIndex = round(numDataPoints * 0.4);
middleEndIndex = round(numDataPoints * 0.6);
middleXdata = xdata(middleStartIndex:middleEndIndex);
middleYdata = ydata(middleStartIndex:middleEndIndex);
% Calculate fitted values for the middle segment
middleYFit = yourModelFunction(xFinal, middleXdata);
% Plotting the middle segment results
figure;
hold on;
plot(middleXdata, middleYdata, 'ro', 'DisplayName', 'Original Middle Data');
plot(middleXdata, middleYFit, 'b-', 'DisplayName', 'Fitted Middle Data');
legend show;
title('Middle Segment Fit Results');
xlabel('X Data (Middle Segment)');
ylabel('Y Data');
hold off;
% Function Definitions
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    history.x = [];
    % outfun is nested so it shares this workspace and history persists
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...
        'FunctionTolerance', 1e-6, ...
        'MaxIterations', 1000);
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);
    function stop = outfun(x, optimValues, state)
        stop = false;
        if strcmp(state, 'iter')
            history.x = [history.x; x(:).'];
        end
    end
end
function y = yourModelFunction(params, xdata)
    y = params(1) * xdata + params(2) + params(3);
end

Please see attached data

The third is about fitHistory. Is it correct to understand that fitHistory is updated with the intermediate xdata during the run?

Yes, you are correct. The fitHistory structure is updated with the intermediate parameter values during each iteration of the optimization process. This allows you to analyze how the parameters evolve over time.

Hope, this answers all your questions.

4 Comments

Thank you for your explanation and kind comments.
I'm sorry for my poor explanation and choice of words and sentences...
Your code is suitable for comparing the middle data.
I'd also like to have the intermediate params.
In your code, initialGuess provides the initial parameters for fitting xdata to ydata with yourModelFunction. During the lsqcurvefit fitting process, a slightly changed initialGuess is used as the 2nd parameter set, then the 3rd parameter set is created, and so on. After some fitting steps, the final parameters are produced once the norm condition is satisfied. If possible, could you add a data matrix with the 2nd, 3rd, 4th, 5th, ..., parameter sets up to the one before the last?
Sorry for the many comments caused by my unclear explanation.

Hi @彰朗,

I implement the following changes based on your request.

Utilize the initial guess for the second parameter.

Create additional parameters (fourth and fifth).

Making sure that the final parameters are conditioned based on the norm.

Incorporate a data matrix that includes the second, third, fourth, and fifth parameters.

Below is the updated MATLAB code reflecting these modifications:

% Define initial guess values
initialGuess = [100, 0.01, 3, 0.5, 1]; % Added fourth and fifth parameters
params = initialGuess; % Set parameters to the initial guess
% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0, 0, 0];    % Lower bounds for [param1, ..., param5]
upperBounds = [200, 1, 10, 5, 5]; % Upper bounds for [param1, ..., param5]
% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data
% Call the fitting function
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, params, xdata, ydata, lowerBounds, upperBounds);
% Calculate performance metrics
yFit = yourModelFunction(xFinal, xdata);
RMSE = sqrt(mean((ydata - yFit).^2));
SSres = sum((ydata - yFit).^2);
SStot = sum((ydata - mean(ydata)).^2);
R_squared = 1 - (SSres / SStot);
% Display performance metrics
fprintf('Final Parameters: %s\n', mat2str(xFinal));
fprintf('RMSE: %.4f\n', RMSE);
fprintf('R²: %.4f\n', R_squared);
% Accessing intermediate results:
intermediateResults = fitHistory.x;
% Extracting middle segment of xdata
numDataPoints = length(xdata);
middleStartIndex = round(numDataPoints * 0.4);
middleEndIndex = round(numDataPoints * 0.6);
middleXdata = xdata(middleStartIndex:middleEndIndex);
middleYdata = ydata(middleStartIndex:middleEndIndex);
% Calculate fitted values for the middle segment
middleYFit = yourModelFunction(xFinal, middleXdata);
% Plotting the middle segment results
figure;
hold on;
plot(middleXdata, middleYdata, 'ro', 'DisplayName', 'Original Middle Data');
plot(middleXdata, middleYFit, 'b-', 'DisplayName', 'Fitted Middle Data');
legend show;
title('Middle Segment Fit Results');
xlabel('X Data (Middle Segment)');
ylabel('Y Data');
hold off;
% Function Definitions
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    history.x = [];
    % outfun is nested so it shares this workspace and history persists
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...
        'FunctionTolerance', 1e-6, ...
        'MaxIterations', 1000);
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
    % Check if the final parameters meet the norm condition
    if resnorm < 1e-4
        fprintf('Final parameters meet the norm condition.\n');
    else
        fprintf('Final parameters do not meet the norm condition.\n');
    end
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);
    function stop = outfun(x, optimValues, state)
        stop = false;
        if strcmp(state, 'iter')
            history.x = [history.x; x(:).'];
        end
    end
end
function y = yourModelFunction(params, xdata)
    % Updated model function to include additional parameters
    y = params(1) * xdata + params(2) + params(3) + params(4) * xdata.^2 + params(5);
end

Please see attached.

Explanation of Changes

Initial Guess: The initialGuess array now includes two additional parameters, allowing for a more complex model.

Parameter Bounds: The bounds for the parameters have been adjusted to accommodate the new parameters.

Model Function: The yourModelFunction has been updated to include the fourth and fifth parameters, allowing for a polynomial fit.

Norm Condition: After fitting, the code checks if the residual norm is below a specified threshold, indicating a good fit.

Data Matrix: The model function now effectively utilizes all parameters, ensuring that the fitting process is comprehensive.

Please let me know if you have any further questions.

Thank you for following up and changing your code, but I need the intermediate params.
For example, lsqcurvefit starts with the initial params [100, 0.01, 3, 0.5, 1] and stops at loop = 27 with the final params [1.970 0.056 0.658 0.004 0.313].
In that case, I can't know the params at loop = 2, 3, 4, ..., 26.
If possible, I'd like a matrix containing all the params created during the loop calculations of lsqcurvefit.

Hi @彰朗,

To achieve your goal of tracking all parameter values throughout the iterations of lsqcurvefit, you can have the outfun function store the parameters at every iteration. Here's how:

Modify the Output Function: outfun appends each set of parameters to a history matrix.

Share the history variable: because MATLAB passes arguments by value, outfun must be nested inside the fitting function (or use a global variable) so that the history it updates is the same one the fitting function returns.

Here’s how you can implement this:

function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
  history.x = [];
  % outfun is nested below so it shares this workspace and can update history
  options = optimoptions('lsqcurvefit', ...
      'OutputFcn', @outfun, ...
      'StepTolerance', 1e-6, ...
      'FunctionTolerance', 1e-6, ...
      'MaxIterations', 1000);
  [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
  % Check if the final parameters meet the norm condition
  if resnorm < 1e-4
      fprintf('Final parameters meet the norm condition.\n');
  else
      fprintf('Final parameters do not meet the norm condition.\n');
  end
  fprintf('Residual Norm: %.4f\n', resnorm);
  fprintf('Exit Flag: %d\n', exitflag);
  fprintf('Output: %s\n', output.message);
  function stop = outfun(x, optimValues, state)
      stop = false;
      if strcmp(state, 'iter')
          history.x = [history.x; x(:).']; % append current parameters as a new row
      end
  end
end

With this nested approach, history.x accumulates the parameters from each iteration. After the fitting process, the entire parameter history is returned as the function's second output (fitHistory in the earlier scripts). To visualize or analyze the intermediate parameters, you can add the following code after the fitting:

% Example: Display all parameters at each iteration
disp('All Iteration Parameters:');
disp(fitHistory.x); % fitHistory is the second output of fitWithIntermediateData
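SciPy's counterpart to lsqcurvefit, least_squares, has no OutputFcn; as a hedged cross-language sketch (data and names here are illustrative), a similar all-parameters matrix can be accumulated by wrapping the residual function itself. Note this records every function evaluation, a superset of the accepted iterates:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic noisy linear data, as in the thread's examples
xdata = np.linspace(0, 10, 100)
rng = np.random.default_rng(2)
ydata = 2 * xdata + 1 + rng.normal(size=xdata.size)

evals = []  # every parameter vector the solver tries

def residuals(params):
    evals.append(np.array(params, copy=True))  # record before evaluating
    return params[0] * xdata + params[1] - ydata

res = least_squares(residuals, x0=[0.0, 0.0])
param_matrix = np.vstack(evals)  # one row per evaluation, one column per parameter
print(param_matrix.shape, res.x)
```

Because finite-difference Jacobian evaluations are recorded too, downstream plots may want to thin this matrix rather than draw every row.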


彰朗 on 6 Sep 2024
Thank you for your support. I tried with your new code, but fitHistory had no data. I understand the new code applies by swapping in the function [xsol, history] and function stop definitions you wrote in your last comment, right?
I think the new code can store all the parameters into fitHistory. If some extra step is needed, could you help me?

9 Comments

Hi @彰朗,
Could you please share your new code, so I can help resolve your problem?
Thank you for your kind comment.
Below is the problem code; I added your last code and commented out the old functions.
I only used your program, but I added "clear all" and "close all".
===
% Clear unneeded variables and figures
clear all
close all
% Define initial guess values
initialGuess = [100, 0.01, 3, 0.5, 1]; % Added fourth and fifth parameters
params = initialGuess; % Set parameters to the initial guess
% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0, 0, 0]; % Lower bounds for [param1, param2, param3, param4, param5]
upperBounds = [200, 1, 10, 5, 5]; % Upper bounds for [param1, param2, param3, param4, param5]
% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data
% Call the fitting function
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, params, xdata, ydata,lowerBounds, upperBounds);
% Calculate performance metrics
yFit = yourModelFunction(xFinal, xdata);
RMSE = sqrt(mean((ydata - yFit).^2));
SSres = sum((ydata - yFit).^2);
SStot = sum((ydata - mean(ydata)).^2);
R_squared = 1 - (SSres / SStot);
% Display performance metrics
fprintf('Final Parameters: %s\n', mat2str(xFinal));
fprintf('RMSE: %.4f\n', RMSE);
fprintf('R²: %.4f\n', R_squared);
% Accessing intermediate results:
intermediateResults = fitHistory.x;
% Extracting middle segment of xdata
numDataPoints = length(xdata);
middleStartIndex = round(numDataPoints * 0.4);
middleEndIndex = round(numDataPoints * 0.6);
middleXdata = xdata(middleStartIndex:middleEndIndex);
middleYdata = ydata(middleStartIndex:middleEndIndex);
% Calculate fitted values for the middle segment
middleYFit = yourModelFunction(xFinal, middleXdata);
% Plotting the middle segment results
figure;
hold on;
plot(middleXdata, middleYdata, 'ro', 'DisplayName', 'Original Middle Data');
plot(middleXdata, middleYFit, 'b-', 'DisplayName', 'Fitted Middle Data');
legend show;
title('Middle Segment Fit Results');
xlabel('X Data (Middle Segment)');
ylabel('Y Data');
hold off;
% % Example: Display all parameters at each iteration
% disp('All Iteration Parameters:');
% disp(history.x);
% Function Definitions
% function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
% history.x = [];
% options = optimoptions('lsqcurvefit', ...
% 'OutputFcn', @(x, optimValues, state) outfun(x, optimValues, state, history), ...
% 'StepTolerance', 1e-10, ...
% 'TolFun', 1e-10, ...
% 'MaxIter', 1000);
% [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
% % Check if the final parameters meet the norm condition
% if resnorm < 1e-9
% fprintf('Final parameters meet the norm condition.\n');
% else
% fprintf('Final parameters do not meet the norm condition.\n');
% end
% fprintf('Residual Norm: %.4f\n', resnorm);
% fprintf('Exit Flag: %d\n', exitflag);
% fprintf('Output: %s\n', output.message);
% end
% function stop = outfun(x, optimValues, state, history)
% stop = false;
% if strcmp(state, 'iter')
% history.x = [history.x; x];
% end
% end
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
history.x = [];
options = optimoptions('lsqcurvefit', ...
'OutputFcn', @(x, optimValues, state) outfun(x, optimValues, state, history), ...
'StepTolerance', 1e-6, ...
'TolFun', 1e-6, ...
'MaxIter', 1000);
[xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
% Check if the final parameters meet the norm condition
if resnorm < 1e-5
fprintf('Final parameters meet the norm condition.\n');
else
fprintf('Final parameters do not meet the norm condition.\n');
end
fprintf('Residual Norm: %.4f\n', resnorm);
fprintf('Exit Flag: %d\n', exitflag);
fprintf('Output: %s\n', output.message);
end
function stop = outfun(x, optimValues, state, history)
stop = false;
if strcmp(state, 'iter')
history.x = [history.x; x]; % Append current parameters to history
end
end
function y = yourModelFunction(params, xdata)
% Updated model function to include additional parameters
y = params(1) * xdata + params(2) + params(3) + params(4) * xdata.^2 + params(5);
end

Hi @彰朗,

Thank you for sharing the code. Having looked through it, the reason fitHistory stays empty is that MATLAB passes structs by value: each call to outfun receives and modifies its own copy of history, and that copy is discarded when the call returns, so the history that fitWithIntermediateData returns is never updated. The fix, as in the MATLAB "Output Functions" documentation example, is to drop the extra history argument and make outfun a nested function, so that it shares the history variable with its parent:

function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
  history.x = [];
  options = optimoptions('lsqcurvefit', 'OutputFcn', @outfun);
  xsol = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
  function stop = outfun(x, optimValues, state)
    stop = false;
    if strcmp(state, 'iter')
      history.x = [history.x; x];  % Append current parameters to history
    end
  end
end

With the nested form, every iteration's parameters are stored. If you need further operations or modifications, please specify, and I would be glad to assist you.

Thank you for your comment. I have a question. At line 77,
"function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)" sets "fun" as the function for lsqcurvefit. I changed "fun" to "yourModelFunction", but fitHistory remains empty. Is it OK to leave it as "fun", or what should be changed?

Hi @彰朗 ,

To address your question about the use of "fun" versus "yourModelFunction" in the fitWithIntermediateData function, let me clarify a few key points:

Function Handle for Model: In MATLAB, when you define a function that is meant to be passed as an argument (like your model function), it’s common to use a placeholder name like fun. This allows you to keep the implementation flexible and can be replaced by any valid function handle. In your case, if you change fun to yourModelFunction, it should work as long as yourModelFunction is defined correctly.
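As a sketch of the function-handle point (reusing the generic data and bounds from the script above), the call site passes the model with an @ prefix, and fun is simply the local name for that handle inside fitWithIntermediateData:

```matlab
% Sketch: the model is passed as a handle; there is no need to rename
% "fun" inside the function definition itself.
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata));
x0 = [100, 0.01, 3, 0.5, 1];
lb = zeros(1, 5);
ub = [200, 1, 10, 5, 5];
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, x0, ...
    xdata, ydata, lb, ub);
```

Renaming fun to yourModelFunction inside the definition is harmless, but it does not affect whether fitHistory is filled.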

Function Definition: Make sure that your model function (yourModelFunction) is correctly defined and accessible in the scope where you are calling fitWithIntermediateData. If it’s defined later in your script or in another file, MATLAB might not recognize it at the time of execution.

Storing Intermediate Results: The mechanism for storing intermediate results in history.x only works when outfun can actually update the history that fitWithIntermediateData returns. If fitHistory remains empty:

  • Confirm that outfun is a nested function inside fitWithIntermediateData, so it shares history with the parent workspace; a struct passed as an extra argument is copied, and the copy is discarded after each call.

  • Confirm that the optimization is indeed running by checking whether it goes through multiple iterations.

  • Verify that your initial guess (x0) and bounds (lb, ub) are appropriate for your data; if they are too restrictive, optimization may not proceed effectively.

Here is how you can make sure everything is set up correctly. Note that outfun is nested inside fitWithIntermediateData, so the two share the history variable:

function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
  history.x = [];
  options = optimoptions('lsqcurvefit', ...
      'OutputFcn', @outfun, ...
      'StepTolerance', 1e-6, ...
      'FunctionTolerance', 1e-6, ...
      'MaxIterations', 1000);
  [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
  % Check if final parameters meet the norm condition
  if resnorm < 1e-5
      fprintf('Final parameters meet the norm condition.\n');
  else
      fprintf('Final parameters do not meet the norm condition.\n');
  end
  fprintf('Residual Norm: %.4f\n', resnorm);
  fprintf('Exit Flag: %d\n', exitflag);
  fprintf('Output: %s\n', output.message);
  function stop = outfun(x, optimValues, state)
      stop = false;
      if strcmp(state, 'iter')
          history.x = [history.x; x];  % Append current parameters to history
      end
  end
end
function y = yourModelFunction(params, xdata)
  % Ensure this matches your intended model structure
  y = params(1) * xdata + params(2) + params(3) + params(4) * xdata.^2 + params(5);
end

In order to debug why fitHistory remains empty, add print statements inside the outfun function to confirm it gets called during iterations. Check if any warning messages appear during execution that could indicate convergence issues. Also, consider adjusting the optimization settings such as tolerance levels or maximum iterations based on your specific data characteristics.
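A sketch of the print-statement debugging suggested above, written for the nested three-argument form of the output function; if no 'iter' lines appear, the solver is stopping before its first iteration:

```matlab
function stop = outfun(x, optimValues, state)
    stop = false;
    % Debug print: confirm the output function fires and at which stage
    fprintf('outfun: state=%s, iteration=%d\n', state, optimValues.iteration);
    if strcmp(state, 'iter')
        fprintf('  parameters: %s\n', mat2str(x, 4));
        history.x = [history.x; x];  % requires outfun to be nested
    end
end
```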

Now, if you find that fitting does not converge well or produces unexpected results with added parameters, consider simplifying your model temporarily to identify which part may be causing issues.

By following these guidelines and making sure all components of your code are functioning harmoniously together, you should be able to resolve the issue with empty fitHistory. Feel free to reach out if you need further clarification or assistance!

Thank you for your help.
At last, I got it.
Below is the final code.
Thanks to you, and I also checked the MATLAB Help.
The history can be stored.
Thank you very much!
% Clear unneeded variables
clear all
close all
[xsol, history, xdata, ydata] = runlsqcurvefit;
data = history.x;
% xdata = linspace(0, 10, 100);
yFit = objfun2(xsol, xdata);
plot(xdata,ydata,'o');
hold on
plot(xdata,yFit,'-r');
for i = 1 : size(data, 1) - 1  % one row of history.x per iteration
    xsol0 = data(i,:);
    yFit0 = objfun2(xsol0, xdata);
    plot(xdata, yFit0, '--');
end
function f = objfun2(params, xdata)
f = params(1) * xdata + params(2) + params(3) + params(4) * xdata.^2 + params(5);
end
function [xsol,history,xdata,ydata] = runlsqcurvefit
% Set up shared variables with outfun
history.x = [];
% Call optimization
params = [100, 0.00000001, 0.03, 0.5, 1]; % Set parameters as the initial parameters
% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0, 0, 0]; % Lower bounds for [param1, param2, param3, param4, param5]
upperBounds = [1000, 1, 100, 5, 5]; % Upper bounds for [param1, param2, param3, param4, param5]
% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data
options = optimoptions(@lsqcurvefit,'OutputFcn',@outfun,...
'Display','iter');
xsol = lsqcurvefit(@objfun, params, xdata, ydata, lowerBounds, upperBounds, options);
function stop = outfun(x,optimValues,state)
    stop = false;
    switch state
        case 'init'
            % hold on
        case 'iter'
            % Concatenate the current point with history.
            % x must be a row vector.
            history.x = [history.x; x];
        case 'done'
            % hold off
        otherwise
    end
end
function f = objfun(params, xdata)
f = params(1) * xdata + params(2) + params(3) + params(4) * xdata.^2 + params(5);
end
end

Hi @彰朗 ,

Please see my feedback on your code below.

Structure and Clarity

Function Definitions: You have organized your code well by encapsulating functionality within separate functions (objfun2, runlsqcurvefit, and outfun). This modularity enhances readability and maintainability.

Comments: While there are comments present, they could be expanded to clarify the purpose of each section of the code, especially within the outfun function. More detailed comments would assist others (or future self) in understanding the logic without needing to decipher the code.

Code Efficiency

Repeated Function: The model function is implemented twice with identical bodies (objfun inside runlsqcurvefit and objfun2 in the script). Consider keeping a single function to avoid redundancy and potential confusion.

Data Handling: The variable history.x is used to store parameter history but could be better utilized if you also included corresponding objective function values for each iteration, which would provide insight into convergence behavior.
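For example (a sketch; resnorm is among the optimValues fields the documentation lists for the least-squares solvers' output functions), the output function could record the residual norm alongside the parameters, after initializing history.resnorm = [] next to history.x:

```matlab
function stop = outfun(x, optimValues, state)
    stop = false;
    if strcmp(state, 'iter')
        history.x = [history.x; x];                               % parameters
        history.resnorm = [history.resnorm; optimValues.resnorm]; % objective
    end
end
```

semilogy(history.resnorm) then gives a quick picture of convergence behavior.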

Parameter Initialization: The initial parameters are hardcoded. It might be beneficial to either provide a way for users to input these values or to implement a more systematic method for estimating reasonable starting points based on prior knowledge or data characteristics.

Output Analysis

Convergence Information: The output clearly shows convergence behavior, but it would be helpful to add a conditional check to determine if the optimization was successful and provide an informative message to the user.
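A sketch of such a check, using the exitflag output of lsqcurvefit (positive values indicate convergence, 0 means an iteration or function-evaluation limit was reached, negative values indicate failure):

```matlab
[xsol, resnorm, residual, exitflag, output] = lsqcurvefit(@objfun, params, ...
    xdata, ydata, lowerBounds, upperBounds, options);
if exitflag > 0
    fprintf('Optimization converged: %s\n', output.message);
elseif exitflag == 0
    warning('Stopped at the iteration/function-evaluation limit; results may be unconverged.');
else
    warning('Optimization failed: %s', output.message);
end
```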

Plotting Enhancements: When plotting, consider adding labels and legends to enhance clarity:

  • Use xlabel, ylabel, and title functions for axes labeling.
  • A legend differentiating between the original data points and fitted curves would aid in visual interpretation.
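Applied to the final script, the labeling suggestions might look like this (a sketch reusing data, xdata, yFit, and objfun2 from the code above):

```matlab
figure;
plot(xdata, ydata, 'o', 'DisplayName', 'Noisy data');
hold on
plot(xdata, yFit, '-r', 'LineWidth', 1.5, 'DisplayName', 'Final fit');
for i = 1 : size(data, 1) - 1
    h = plot(xdata, objfun2(data(i,:), xdata), '--');
    h.HandleVisibility = 'off';   % keep intermediate fits out of the legend
end
hold off
xlabel('x'); ylabel('y');
title('lsqcurvefit: intermediate and final fits');
legend('Location', 'best');
```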

Overall, your MATLAB code demonstrates solid understanding and application of least squares curve fitting techniques. By refining some areas such as redundancy elimination, enhancing output clarity, improving error handling, and adding informative metrics, you can significantly improve both its functionality and user experience.

Hi @彰朗,
I hope all your questions have been answered; please let me know if you need any further assistance.
Thank you for your comment and kind support. Sorry for the delay in replying.
Thanks to your comments, I modified my program and separated the main script from the sub-functions.
Anyway, my goal was achieved with your support.
I appreciate your kind follow-up. Thank you very much!!
Good luck!


Version: R2023b
Asked: 29 Aug 2024
Commented: 24 Sep 2024
