How can I obtain the intermediate data sets during the fitting process with lsqcurvefit?
- Calculate x with lsqcurvefit and note the number of iterations, e.g. 100.
- Set MaxIterations to 10 and calculate x10 with lsqcurvefit.
- Set MaxIterations to 20, 30, ..., 100 and calculate x20, x30, ..., x100, respectively, with lsqcurvefit.
- Make graphs with x10, x20, ..., x100.
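The approach above can be sketched as a loop (a minimal, hypothetical example: the model fun, the data, and the starting point x0 below are placeholders, not your actual problem):

```matlab
% Run lsqcurvefit repeatedly with an increasing iteration cap and
% collect the intermediate solutions x10, x20, ..., x100.
fun   = @(p, x) p(1) * x + p(2);               % placeholder model
xdata = linspace(0, 10, 50);
ydata = 2 * xdata + 1 + randn(size(xdata));    % simulated noisy data
x0    = [0, 0];

maxIters = 10:10:100;
xAll = zeros(numel(maxIters), numel(x0));
for k = 1:numel(maxIters)
    opts = optimoptions('lsqcurvefit', 'MaxIterations', maxIters(k), 'Display', 'off');
    xAll(k, :) = lsqcurvefit(fun, x0, xdata, ydata, [], [], opts);
end

% Plot the fit obtained at each iteration cap
figure; hold on;
plot(xdata, ydata, 'ko', 'DisplayName', 'data');
for k = 1:numel(maxIters)
    plot(xdata, fun(xAll(k, :), xdata), 'DisplayName', sprintf('MaxIterations = %d', maxIters(k)));
end
legend show; hold off;
```

Since lsqcurvefit is deterministic for a fixed starting point, each run reproduces the same trajectory truncated at a different iteration count; an OutputFcn can capture the same information in a single run.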
Accepted Answer
More Answers (4)
Hi @彰朗,
Could you advise how to improve my situation?
Please see my response to your comments below.
Detailed Comparison and Analysis
Initial Parameter Setup
Old Code: The initial guess for parameters was hardcoded with arbitrary values.
New Code: The initial guess has been parameterized and is now directly linked to an array that defines lower and upper bounds for each parameter.
Improvement
This change enhances flexibility and allows for better control over the parameter space, which can lead to more accurate fitting. By bounding the parameters, the new code avoids unrealistic values that could skew results.
Performance Metrics Calculation
Old Code: The final output primarily includes the fitted parameters and residual norm, offering limited insight into the quality of the fit.
New Code: Introduces performance metrics such as Root Mean Square Error (RMSE) and R-squared, which provide a quantitative measure of fit quality.
Improvement
The incorporation of performance metrics allows users to evaluate the fitting process quantitatively, making it easier to assess model performance and compare different models.
Enhanced Output Reporting
Old Code: The reporting of results is minimal, focusing mainly on the fitted parameters.
New Code: More comprehensive output includes RMSE, R-squared, and details about the exit flag and residual norm.
Improvement
This enhancement makes the results more informative, facilitating a deeper analysis of the fitting process, which is vital for troubleshooting and validation.
Data Visualization
Old Code: The plotting section is functional but lacks clarity in distinguishing original data from fitted results.
New Code: An additional plot for original data points is included, and the final fit is clearly distinguished with a line style.
Improvement
Improved visualization aids in better interpretation of results, allowing users to visually assess the fitting quality against the original dataset.
Function Signature and Flexibility
Old Code: The fitting function does not accommodate parameter bounds, which may lead to suboptimal solutions.
New Code: The fitting function now accepts lower and upper bounds as parameters, enhancing the optimization's robustness.
Improvement
This flexibility is crucial in real-world applications where constraints on parameters are common. It allows for a more reliable fitting process and mitigates the risk of non-physical results.
New Code
% Define initial guess values
initialGuess = [100, 0.01, 3];
params = initialGuess; % Set parameters to the initial guess

% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0];    % Lower bounds for [param1, param2, param3]
upperBounds = [200, 1, 10]; % Upper bounds for [param1, param2, param3]

% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data

% Call the fitting function
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, params, xdata, ydata, lowerBounds, upperBounds);

% Calculate performance metrics
yFit = yourModelFunction(xFinal, xdata);
RMSE = sqrt(mean((ydata - yFit).^2));
SSres = sum((ydata - yFit).^2);
SStot = sum((ydata - mean(ydata)).^2);
R_squared = 1 - (SSres / SStot);
% Display performance metrics
fprintf('Final Parameters: %s\n', mat2str(xFinal));
fprintf('RMSE: %.4f\n', RMSE);
fprintf('R²: %.4f\n', R_squared);
% Accessing intermediate results
intermediateResults = fitHistory.x;

% Plotting the results
figure;
hold on;
for i = 1:size(intermediateResults, 1)
    plot(xdata, yourModelFunction(intermediateResults(i,:), xdata), 'DisplayName', ['Iteration ' num2str(i)]);
end
% Plot the final fit
plot(xdata, yFit, 'k--', 'DisplayName', 'Final Fit', 'LineWidth', 2);
legend show;
title('Intermediate Fit Results and Final Fit');
xlabel('X Data');
ylabel('Y Data');
hold off;
% Plot the original data
figure;
plot(xdata, ydata, 'o');
title('Original Data');
xlabel('X Data');
ylabel('Y Data');
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    % Initialize history structure to store intermediate results
    history.x = [];

    % Define options for lsqcurvefit; outfun is nested below so that it
    % can update the shared 'history' variable (a separate function would
    % only receive a copy, since MATLAB passes structures by value).
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...       % Adjust step tolerance
        'FunctionTolerance', 1e-6, ...   % Tolerance on the function value
        'MaxIterations', 1000);          % Maximum number of iterations

    % Call lsqcurvefit with bounds
    [xsol, resnorm, residual, exitflag, output] = ...
        lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);

    % Display final results
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);

    function stop = outfun(x, optimValues, state)
        stop = false; % Allows optimization to continue
        if strcmp(state, 'iter')
            history.x = [history.x; x]; % Append current parameters to history
        end
    end
end

function y = yourModelFunction(params, xdata)
    % Define your model function here
    y = params(1) * xdata + params(2) + params(3);
end
Output results after execution of new code
Local minimum possible.
lsqcurvefit stopped because the final change in the sum of squares relative to its initial value is less than the value of the function tolerance.
<stopping criteria details>
Residual Norm: 75.4525
Exit Flag: 3
Output: Local minimum possible. lsqcurvefit stopped because the final change in the sum of squares relative to its initial value is less than the value of the function tolerance.
Optimization stopped because the relative sum of squares (r) is changing by less than options.FunctionTolerance = 1.000000e-06.
Final Parameters: [2.01098691915462 0.0790032930972242 0.881589904882525]
RMSE: 0.8686
R²: 0.9785
Summary of output results
The output results indicate a reasonably good fit, but various strategies can be employed to enhance the model's performance further. By refining the initial guesses, adjusting tolerances, and reviewing your model function, you can work towards more optimal fitting results.
Please see attached.

Hope this helps. Please let me know if you have any further questions.
1 Comment
Hi @彰朗 ,
Please see my response to your questions below.
One question is regarding how to use it: is it OK to copy and paste it into the MATLAB editor and click the Run button? Is no special setup or process needed?
Yes, you can copy and paste the entire code into the MATLAB editor and click the "Run" button. There is no need for special setup or processes, provided you have the necessary toolboxes installed (specifically, the Optimization Toolbox).
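If you want to verify the toolbox is available before running, one way (using the standard license function) is:

```matlab
% Returns 1 if an Optimization Toolbox license can be checked out, 0 otherwise
hasOptim = license('test', 'Optimization_Toolbox');
% Alternatively, ver('optim') lists the installed toolbox version
```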
Next question is about your result: if possible, could you attach your result for the middle xdata in this comment?
The provided code already defines the initial parameters, the bounds, and the data to be fitted; the fitting function fitWithIntermediateData is then called to optimize the parameters based on the model function, and it also captures intermediate results for analysis. To focus on the middle segment of xdata, we can calculate the indices that correspond to the middle 20% of the data as follows:
% Calculate the indices for the middle 20% of xdata
numDataPoints = length(xdata);
middleStartIndex = round(numDataPoints * 0.4); % Start at 40%
middleEndIndex = round(numDataPoints * 0.6);   % End at 60%

% Extract the middle xdata and corresponding ydata
middleXdata = xdata(middleStartIndex:middleEndIndex);
middleYdata = ydata(middleStartIndex:middleEndIndex);
Using the optimized parameters obtained from the fitting process, you can calculate the fitted values for the middle segment:
% Calculate fitted values for the middle segment
middleYFit = yourModelFunction(xFinal, middleXdata);
Finally, display the results for the middle segment of xdata alongside the original noisy data. This can be done using a simple plot:
% Plotting the middle segment results
figure;
hold on;
plot(middleXdata, middleYdata, 'ro', 'DisplayName', 'Original Middle Data'); % Original data points
plot(middleXdata, middleYFit, 'b-', 'DisplayName', 'Fitted Middle Data');    % Fitted line
legend show;
title('Middle Segment Fit Results');
xlabel('X Data (Middle Segment)');
ylabel('Y Data');
hold off;
Here is the complete code integrating all the steps mentioned above:
% Define initial guess values
initialGuess = [100, 0.01, 3];
params = initialGuess; % Set parameters to the initial guess

% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0];    % Lower bounds for [param1, param2, param3]
upperBounds = [200, 1, 10]; % Upper bounds for [param1, param2, param3]

% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data

% Call the fitting function
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, params, xdata, ydata, lowerBounds, upperBounds);

% Calculate performance metrics
yFit = yourModelFunction(xFinal, xdata);
RMSE = sqrt(mean((ydata - yFit).^2));
SSres = sum((ydata - yFit).^2);
SStot = sum((ydata - mean(ydata)).^2);
R_squared = 1 - (SSres / SStot);
% Display performance metrics
fprintf('Final Parameters: %s\n', mat2str(xFinal));
fprintf('RMSE: %.4f\n', RMSE);
fprintf('R²: %.4f\n', R_squared);
% Accessing intermediate results
intermediateResults = fitHistory.x;

% Extracting middle segment of xdata
numDataPoints = length(xdata);
middleStartIndex = round(numDataPoints * 0.4);
middleEndIndex = round(numDataPoints * 0.6);
middleXdata = xdata(middleStartIndex:middleEndIndex);
middleYdata = ydata(middleStartIndex:middleEndIndex);

% Calculate fitted values for the middle segment
middleYFit = yourModelFunction(xFinal, middleXdata);
% Plotting the middle segment results
figure;
hold on;
plot(middleXdata, middleYdata, 'ro', 'DisplayName', 'Original Middle Data');
plot(middleXdata, middleYFit, 'b-', 'DisplayName', 'Fitted Middle Data');
legend show;
title('Middle Segment Fit Results');
xlabel('X Data (Middle Segment)');
ylabel('Y Data');
hold off;
% Function Definitions
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    history.x = [];
    % outfun is nested below so that it can update the shared 'history'
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...
        'FunctionTolerance', 1e-6, ...
        'MaxIterations', 1000);
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);

    function stop = outfun(x, optimValues, state)
        stop = false;
        if strcmp(state, 'iter')
            history.x = [history.x; x];
        end
    end
end

function y = yourModelFunction(params, xdata)
    y = params(1) * xdata + params(2) + params(3);
end
Please see attached data

Third is about fitHistory. Is it OK to understand that fitHistory is updated with the middle xdata during the run?
Yes, you are correct. The fitHistory structure is updated with the intermediate parameter values during each iteration of the optimization process. This allows you to analyze how the parameters evolve over time.
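As an illustration, after the fit you can plot how each parameter evolves across iterations (this assumes fitHistory.x holds one row of parameter values per iteration, as in the code above):

```matlab
% Each column of fitHistory.x is one parameter's trajectory
figure;
plot(fitHistory.x, '-o');
xlabel('Iteration');
ylabel('Parameter value');
legend({'param1', 'param2', 'param3'}, 'Location', 'best');
title('Parameter evolution during the fit');
```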
Hope, this answers all your questions.
4 Comments
Hi @彰朗,
I implemented the following changes based on your request:
- Utilize the initial guess for the second parameter.
- Create additional parameters (fourth and fifth).
- Ensure that the final parameters are conditioned on the residual norm.
- Incorporate a data matrix that includes the second, third, fourth, and fifth parameters.
Below is the updated MATLAB code reflecting these modifications:
% Define initial guess values
initialGuess = [100, 0.01, 3, 0.5, 1]; % Added fourth and fifth parameters
params = initialGuess; % Set parameters to the initial guess

% Define lower and upper bounds for the parameters
lowerBounds = [0, 0, 0, 0, 0];    % Lower bounds for [param1, ..., param5]
upperBounds = [200, 1, 10, 5, 5]; % Upper bounds for [param1, ..., param5]

% Define generic data for xdata and ydata
xdata = linspace(0, 10, 100);
ydata = 2 * xdata + 1 + randn(size(xdata)); % Simulated noisy data

% Call the fitting function
[xFinal, fitHistory] = fitWithIntermediateData(@yourModelFunction, params, xdata, ydata, lowerBounds, upperBounds);

% Calculate performance metrics
yFit = yourModelFunction(xFinal, xdata);
RMSE = sqrt(mean((ydata - yFit).^2));
SSres = sum((ydata - yFit).^2);
SStot = sum((ydata - mean(ydata)).^2);
R_squared = 1 - (SSres / SStot);
% Display performance metrics
fprintf('Final Parameters: %s\n', mat2str(xFinal));
fprintf('RMSE: %.4f\n', RMSE);
fprintf('R²: %.4f\n', R_squared);
% Accessing intermediate results
intermediateResults = fitHistory.x;

% Extracting middle segment of xdata
numDataPoints = length(xdata);
middleStartIndex = round(numDataPoints * 0.4);
middleEndIndex = round(numDataPoints * 0.6);
middleXdata = xdata(middleStartIndex:middleEndIndex);
middleYdata = ydata(middleStartIndex:middleEndIndex);

% Calculate fitted values for the middle segment
middleYFit = yourModelFunction(xFinal, middleXdata);
% Plotting the middle segment results
figure;
hold on;
plot(middleXdata, middleYdata, 'ro', 'DisplayName', 'Original Middle Data');
plot(middleXdata, middleYFit, 'b-', 'DisplayName', 'Fitted Middle Data');
legend show;
title('Middle Segment Fit Results');
xlabel('X Data (Middle Segment)');
ylabel('Y Data');
hold off;
% Function Definitions
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    history.x = [];
    % outfun is nested below so that it can update the shared 'history'
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...
        'FunctionTolerance', 1e-6, ...
        'MaxIterations', 1000);
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
    % Check if the final parameters meet the norm condition
    if resnorm < 1e-4
        fprintf('Final parameters meet the norm condition.\n');
    else
        fprintf('Final parameters do not meet the norm condition.\n');
    end
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);

    function stop = outfun(x, optimValues, state)
        stop = false;
        if strcmp(state, 'iter')
            history.x = [history.x; x];
        end
    end
end

function y = yourModelFunction(params, xdata)
    % Updated model function to include additional parameters
    y = params(1) * xdata + params(2) + params(3) + params(4) * xdata.^2 + params(5);
end
Please see attached.

Explanation of Changes
Initial Guess: The initialGuess array now includes two additional parameters, allowing for a more complex model.
Parameter Bounds: The bounds for the parameters have been adjusted to accommodate the new parameters.
Model Function: The yourModelFunction has been updated to include the fourth and fifth parameters, allowing for a polynomial fit.
Norm Condition: After fitting, the code checks if the residual norm is below a specified threshold, indicating a good fit.
Data Matrix: The model function now effectively utilizes all parameters, ensuring that the fitting process is comprehensive.
Please let me know if you have any further questions.
Hi @彰朗,
To achieve your goal of tracking all parameter values throughout the iterations of lsqcurvefit, you can modify the outfun function to store the parameters at every iteration. Here's how you can do this:
Modify the Output Function: The outfun function needs to append each set of parameters to a history matrix.
Share the History: Because MATLAB passes structures by value, a separate output function would only update a copy; defining outfun as a nested function (or using a global variable) lets it update the caller's history directly.
Here’s how you can implement this:
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    history.x = [];
    % outfun is nested below so that it can update the shared 'history'
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...
        'FunctionTolerance', 1e-6, ...
        'MaxIterations', 1000);
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
    % Check if the final parameters meet the norm condition
    if resnorm < 1e-4
        fprintf('Final parameters meet the norm condition.\n');
    else
        fprintf('Final parameters do not meet the norm condition.\n');
    end
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);

    function stop = outfun(x, optimValues, state)
        stop = false;
        if strcmp(state, 'iter')
            history.x = [history.x; x]; % Append current parameters to history
        end
    end
end
With this modified approach, the history.x will now accumulate the parameters from each iteration. After the fitting process, you can access the entire history of parameters by examining history.x. To visualize or analyze the intermediate parameters, you can add the following code after the fitting:
% Example: Display all parameters at each iteration
disp('All Iteration Parameters:');
disp(history.x);
9 Comments
Hi @彰朗,
Yes, the new code you provided is storing all parameters into fitHistory. I just glanced through the code, and the function fitWithIntermediateData is capturing the intermediate parameter values during the optimization process. The outfun function is responsible for appending the current parameters to the history structure whenever an iteration occurs. To make sure that fitHistory contains data, verify that the fitting function is being called correctly and that the optimization process is running as expected. If fitHistory remains empty, consider checking the convergence criteria and the initial parameter values. Here's a brief overview of the relevant code snippet:
function stop = outfun(x, optimValues, state)
    % Nested inside fitWithIntermediateData so that 'history' is shared
    stop = false;
    if strcmp(state, 'iter')
        history.x = [history.x; x]; % Append current parameters to history
    end
end
This code makes sure that every iteration's parameters are stored. If you need further operations or modifications, please specify, and I would be glad to assist you.
Hi @彰朗 ,
To address your question about the use of "fun" versus "yourModelFunction" in the fitWithIntermediateData function, let me clarify a few key points:
Function Handle for Model: In MATLAB, when you define a function that is meant to be passed as an argument (like your model function), it’s common to use a placeholder name like fun. This allows you to keep the implementation flexible and can be replaced by any valid function handle. In your case, if you change fun to yourModelFunction, it should work as long as yourModelFunction is defined correctly.
Function Definition: Make sure that your model function (yourModelFunction) is correctly defined and accessible in the scope where you are calling fitWithIntermediateData. If it’s defined later in your script or in another file, MATLAB might not recognize it at the time of execution.
Storing Intermediate Results: The mechanism for storing intermediate results in history.x depends on the output function actually updating the caller's history. Keep in mind that MATLAB passes structures by value, so assignments inside a separate outfun only modify a copy; defining outfun as a nested function lets it share the caller's history. If fitHistory remains empty:
* Confirm that the optimization is indeed running by checking that it goes through multiple iterations.
* Verify that your initial guess (x0) and bounds (lb, ub) are appropriate for your data; if they are too restrictive, the optimization may not proceed effectively.
Here is how you can make sure everything is set up correctly:
function [xsol, history] = fitWithIntermediateData(fun, x0, xdata, ydata, lb, ub)
    history.x = [];
    % outfun is nested below so that it can update the shared 'history'
    options = optimoptions('lsqcurvefit', ...
        'OutputFcn', @outfun, ...
        'StepTolerance', 1e-6, ...
        'FunctionTolerance', 1e-6, ...
        'MaxIterations', 1000);
    [xsol, resnorm, residual, exitflag, output] = lsqcurvefit(fun, x0, xdata, ydata, lb, ub, options);
    % Check if final parameters meet norm condition
    if resnorm < 1e-5
        fprintf('Final parameters meet the norm condition.\n');
    else
        fprintf('Final parameters do not meet the norm condition.\n');
    end
    fprintf('Residual Norm: %.4f\n', resnorm);
    fprintf('Exit Flag: %d\n', exitflag);
    fprintf('Output: %s\n', output.message);

    function stop = outfun(x, optimValues, state)
        stop = false;
        if strcmp(state, 'iter')
            history.x = [history.x; x]; % Append current parameters to history
        end
    end
end

function y = yourModelFunction(params, xdata)
    % Ensure this matches your intended model structure
    y = params(1) * xdata + params(2) + params(3) + params(4) * xdata.^2 + params(5);
end
To debug why fitHistory remains empty, add fprintf statements inside the outfun function to confirm that it gets called during iterations. Check whether any warning messages appear during execution that could indicate convergence issues. Also, consider adjusting the optimization settings, such as tolerance levels or maximum iterations, based on your specific data characteristics.
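For example, one such trace statement might look like this (optimValues fields such as iteration and resnorm are provided by lsqcurvefit's output-function interface):

```matlab
% Inside outfun, before the state check, print a short trace so that
% each call of the output function becomes visible in the Command Window:
fprintf('outfun: state=%s, iteration=%d, resnorm=%.4g\n', ...
        state, optimValues.iteration, optimValues.resnorm);
```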
Now, if you find that fitting does not converge well or produces unexpected results with added parameters, consider simplifying your model temporarily to identify which part may be causing issues.
By following these guidelines and making sure all components of your code are functioning harmoniously together, you should be able to resolve the issue with empty fitHistory. Feel free to reach out if you need further clarification or assistance!
Hi @彰朗 ,
Please see my feedback on your code below.
Structure and Clarity
Function Definitions: You have organized your code well by encapsulating functionality within separate functions (objfun2, runlsqcurvefit, and outfun). This modularity enhances readability and maintainability.
Comments: While there are comments present, they could be expanded to clarify the purpose of each section of the code, especially within the outfun function. More detailed comments would assist others (or future self) in understanding the logic without needing to decipher the code.
Code Efficiency
Repeated Function: The objfun function is defined twice with identical implementations. Consider removing one of them to avoid redundancy and potential confusion.
Data Handling: The variable history.x is used to store parameter history but could be better utilized if you also included corresponding objective function values for each iteration, which would provide insight into convergence behavior.
Parameter Initialization: The initial parameters are hardcoded. It might be beneficial to either provide a way for users to input these values or to implement a more systematic method for estimating reasonable starting points based on prior knowledge or data characteristics.
Output Analysis
Convergence Information: The output clearly shows convergence behavior, but it would be helpful to add a conditional check to determine if the optimization was successful and provide an informative message to the user.
Plotting Enhancements: When plotting, consider adding labels and legends to enhance clarity:
- Use xlabel, ylabel, and title functions for axes labeling.
- A legend differentiating between the original data points and fitted curves would aid in visual interpretation.
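A minimal sketch of those plotting enhancements (xdata, ydata, and yFit are placeholders for your own arrays):

```matlab
figure;
plot(xdata, ydata, 'o', 'DisplayName', 'Original data');
hold on;
plot(xdata, yFit, 'r-', 'LineWidth', 1.5, 'DisplayName', 'Fitted curve');
xlabel('X Data');
ylabel('Y Data');
title('Least-squares curve fit');
legend show;
hold off;
```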
Overall, your MATLAB code demonstrates solid understanding and application of least squares curve fitting techniques. By refining some areas such as redundancy elimination, enhancing output clarity, improving error handling, and adding informative metrics, you can significantly improve both its functionality and user experience.