Estimate a linear regression, then estimate a normal learning model and see how parameters update over time
I want to estimate a log-linearized regression (ln(y) = alpha + Beta*ln(x) + e) and then see how the parameters (alpha, Beta) update over time given observations, via a normal Bayesian learning model. I am new to normal learning models and use MATLAB infrequently.
Do I need to run maximum likelihood on a log-likelihood function and then run 'bayeslm', or do I run 'bayeslm'/'empiricalblm' and then 'estimate' for the posterior?
Additionally, do I set up a log-likelihood, a prior, and then estimate, or just the log-likelihood and then define the functions?
I have read through some of the MathWorks documentation, but would like to verify this process before proceeding. Thank you!
Answers (1)
Balaji
on 22 Sep 2023
Hi Joshua
I understand that you want to estimate a log-linearized regression model using a normal Bayesian learning approach in MATLAB. For this you can:
- Define your log regression model as you have mentioned: ln(y) = alpha + Beta*ln(x) + e.
- Specify the log-likelihood function, assuming the error is normal, as log(normpdf(ln(y), alpha + Beta*ln(x), sigma)).
- Specify the prior distribution using the 'bayeslm' function.
- Run maximum likelihood to estimate the parameters.
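A minimal sketch of the log-likelihood and maximum-likelihood steps above, assuming your data sit in positive column vectors x and y (placeholder names) and that you have normpdf from the Statistics and Machine Learning Toolbox; it maximizes the likelihood with fminsearch rather than a dedicated fitting routine:
% Log-transform the data for ln(y) = alpha + Beta*ln(x) + e
lx = log(x);
ly = log(y);
% Negative log-likelihood with parameters p = [alpha; Beta; log(sigma)]
% (optimizing log(sigma) keeps sigma positive)
negLL = @(p) -sum(log(normpdf(ly, p(1) + p(2)*lx, exp(p(3)))));
% Starting values from an ordinary least-squares fit
b0 = [ones(size(lx)) lx] \ ly;
p0 = [b0; log(std(ly - [ones(size(lx)) lx]*b0))];
% Maximum likelihood estimates
pHat = fminsearch(negLL, p0);
alphaHat = pHat(1);
betaHat  = pHat(2);
sigmaHat = exp(pHat(3));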
For more information on the 'bayeslm' function, I suggest you refer to its documentation page.
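For the learning-over-time part, here is a minimal sketch assuming you have the Econometrics Toolbox: 'bayeslm' with a conjugate prior gives a model whose posterior from 'estimate' is again a conjugate model, so it can be fed back in as the prior for the next batch of observations. The vectors lx and ly and the batch size are placeholders (lx, ly as in the previous sketch), and 'estimate' prints an estimation summary at each pass so you can watch alpha and Beta update.
% Conjugate normal-inverse-gamma prior for one predictor (intercept included by default)
PriorMdl = bayeslm(1, 'ModelType', 'conjugate');
batchSize = 10;                          % placeholder batch size
Mdl = PriorMdl;
for k = 1:floor(numel(ly)/batchSize)
    idx = (k-1)*batchSize + (1:batchSize)';
    % Posterior given this batch becomes the prior for the next batch
    Mdl = estimate(Mdl, lx(idx), ly(idx));
    % Mdl.Mu holds the current posterior means of [Intercept; Beta]
    disp(Mdl.Mu')
end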
Hope this helps
Thanks
Balaji