How to simulate a basic Markov chain
Hi,
I'm fairly new to MATLAB. Would anybody be able to show me how to simulate a basic discrete-time Markov chain?
Say, for example, I have a transition matrix with 3 states, A, B, and C. How could I simulate, say, 20 steps starting from state A?
     A    B    C
A   .3   .2   .5
B   .2   .1   .7
C   .1   .5   .4
Any help would be greatly appreciated.
Regards
John
Answers (1)
Doug Hull
on 12 Oct 2012
Edited: Doug Hull on 12 Oct 2012
Are you looking to do a simple matrix multiply?
v = [1 0 0]
m = [0.3 0.2 0.5; 0.2 0.1 0.7; 0.1 0.5 0.4]
v = v * m
You can also do this in a loop.
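To illustrate what that loop does, here is a minimal sketch of the same idea in plain Python (standard library only; the variable names mirror the MATLAB code above, and the loop repeats v = v*m twenty times):

```python
# Transition matrix and starting distribution from the answer above.
m = [[0.3, 0.2, 0.5],
     [0.2, 0.1, 0.7],
     [0.1, 0.5, 0.4]]
v = [1.0, 0.0, 0.0]   # probability 1 of starting in state A

# Each iteration is one step of v = v * m (a row vector times the matrix).
for _ in range(20):
    v = [sum(v[i] * m[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in v])   # distribution over A, B, C after 20 steps
```

After 20 iterations this reproduces the v*m^20 result shown below, since applying the matrix once per loop pass is the same as raising it to the 20th power.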
If you want to square the matrix element by element:
m = m.^2
More likely, though, you want the true matrix square:
m = m^2
You can do this for higher powers:
m = m^20
And putting it together:
>> v = v*m^20
v =
0.1652 0.3217 0.5130
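One caveat: v*m^20 gives the probability distribution over states after 20 steps, not a simulated path. To actually sample a 20-step trajectory starting from state A, as the question asks, you can draw each next state from the current state's row of the transition matrix. A sketch in Python (standard library only; the `step` and `simulate` helpers are my own naming, not a MATLAB or toolbox API):

```python
import random

# Rows of the transition matrix from the question, keyed by state.
P = {
    "A": [("A", 0.3), ("B", 0.2), ("C", 0.5)],
    "B": [("A", 0.2), ("B", 0.1), ("C", 0.7)],
    "C": [("A", 0.1), ("B", 0.5), ("C", 0.4)],
}

def step(state, u):
    """Map a uniform draw u in [0, 1) to the next state using the
    cumulative probabilities of the current state's row."""
    cumulative = 0.0
    for nxt, prob in P[state]:
        cumulative += prob
        if u < cumulative:
            return nxt
    return P[state][-1][0]   # guard against floating-point round-off

def simulate(start, n_steps):
    """Return the list of visited states: the start plus n_steps samples."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], random.random()))
    return path

random.seed(1)
print(simulate("A", 20))   # 21 states in all: the start plus 20 transitions
```

Averaging many such sampled paths recovers the same long-run proportions as the v*m^20 distribution above.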