
## October 20, 2018

### Weather forecast with a Markov Chain

Markov chains are a computational tool for modelling systems made up of linked events. Take, for example, a simple weather forecast with three states: rainy, sunny, and cloudy. A little thought reveals that this system behaves differently from the coin-tossing example given in basic statistics courses: here the events are not independent of the previous state. The fact that today has been rainy suggests that tomorrow is likely to be rainy too (especially if you live in England!). The structure of a Markov chain for the weather system can be built by drawing dots (states) and arrows (transitions); here is an example with made-up probabilities for rainy, sunny, and cloudy:

Note that because the arrows represent probabilities, the number attached to each must lie between 0 and 1, and the arrows stemming from any one dot must sum to 1!
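Since the original diagram isn't reproduced here, the transition probabilities below are an illustrative reconstruction (chosen so that the rainy-to-rainy entry is the 0.5 quoted later in the text); any such matrix just has to satisfy the two constraints above:

```python
import numpy as np

# States in a fixed order; entry P[i, j] is the probability of moving
# from state i today to state j tomorrow.
states = ["rainy", "sunny", "cloudy"]

# Made-up transition probabilities (the original diagram isn't shown here);
# the rainy -> rainy entry of 0.5 matches the figure quoted in the text.
P = np.array([
    [0.5, 0.1, 0.4],   # from rainy
    [0.3, 0.5, 0.2],   # from sunny
    [0.4, 0.2, 0.4],   # from cloudy
])

# Every entry is a probability...
assert np.all((P >= 0) & (P <= 1))
# ...and each row (the arrows leaving one dot) must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```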

OK, now that we have written down the Markov diagram, we can easily read off the probability that tomorrow is rainy given that today is rainy: 0.5. We may also be interested in the probability of rain in two days if it is cloudy today. To find it, we add up the probabilities of all the paths that lead from cloudy today to rainy in two days, which amounts to 0.42. This quickly becomes cumbersome to calculate in one's head, which is why it is so convenient to arrange the Markov chain in a matrix P that predicts tomorrow's weather, and use matrix arithmetic to calculate the day after tomorrow's weather, given by P × P = P^2.
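With the chain stored as a matrix, the two-day forecast is a single matrix product. Using the illustrative matrix sketched earlier (made-up numbers, arranged so that the cloudy-to-rainy two-day probability comes out at the 0.42 quoted above):

```python
import numpy as np

states = ["rainy", "sunny", "cloudy"]
P = np.array([
    [0.5, 0.1, 0.4],   # from rainy
    [0.3, 0.5, 0.2],   # from sunny
    [0.4, 0.2, 0.4],   # from cloudy
])

# P squared gives the two-step (day-after-tomorrow) probabilities:
# summing over every intermediate state is exactly matrix multiplication.
P2 = P @ P

cloudy, rainy = states.index("cloudy"), states.index("rainy")
# Probability of rain in two days given cloud today:
print(P2[cloudy, rainy])  # -> 0.42
```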

Similarly, P^3 would give the probabilities for three days ahead, and so on. "The entire future unfolds from this one matrix."

In this simple example, the successive powers of the matrix rapidly converge to a configuration in which all the rows and columns remain stationary:

There is a simple interpretation for this behaviour of the chain: if we let the system evolve long enough, the probability of a given state no longer depends on the initial state. In other words, knowing that today is rainy may offer a clue about tomorrow's weather, but it is not much help in predicting the weather in one month. For such an extended forecast, we may as well consult the long-term averages, which are the values to which the Markov chain converges.
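This convergence is easy to see numerically. Raising the illustrative matrix from earlier (made-up probabilities) to a high power, every row approaches the same stationary distribution, i.e. the starting state is forgotten:

```python
import numpy as np

P = np.array([
    [0.5, 0.1, 0.4],   # from rainy
    [0.3, 0.5, 0.2],   # from sunny
    [0.4, 0.2, 0.4],   # from cloudy
])

# After many steps, all rows of P^n converge to the same stationary
# distribution: the long-term fraction of rainy, sunny, and cloudy days.
P30 = np.linalg.matrix_power(P, 30)
print(P30)

# Every row is numerically identical, so the long-range forecast no
# longer depends on today's weather.
assert np.allclose(P30[0], P30[1]) and np.allclose(P30[1], P30[2])
```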

It's been a pleasure to introduce Markov chains, but if you're looking for more information, check out the resource from which I adapted this example: First Links in the Markov Chain (https://raichev.net/markov/misc/markov_chain.pdf).