Wednesday, December 23, 2009

Markov Chains

You know how developers have an itch to implement something cool they just learned about. No one can say it better than XKCD.



I learnt about Markov Chains today.

In mathematics, a Markov chain, named after Andrey Markov, is a discrete random process with the Markov property. A discrete random process means a system which can be in various states, and which changes randomly in discrete steps. It can be helpful to think of the system as evolving once a minute, although strictly speaking the "step" may have nothing to do with time. The Markov property states that the probability distribution for the system at the next step (and in fact at all future steps) only depends on the current state of the system, and not additionally on the state of the system at previous steps. Since the system changes randomly, it is generally impossible to predict the exact state of the system in the future. However, the statistical properties of the system at a great many steps in the future can often be described. In many applications it is these statistical properties that are important. - From the Wikipedia entry on Markov chains
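That "evolving once a minute" intuition is easy to see in code. Here is a toy two-state weather chain, where tomorrow's weather depends only on today's (the states and transition probabilities are invented purely for illustration):

```python
import random

# Toy Markov chain: the next state depends only on the current state.
# Each state maps to a list of (next_state, probability) pairs.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state according to the current state's probabilities."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, steps):
    """Run the chain for a number of steps and return the full history."""
    history = [start]
    for _ in range(steps):
        history.append(step(history[-1]))
    return history
```

Note that `simulate` never looks at anything but the latest state in `history` - that is the Markov property in action.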


It was so cool that I decided to do something with it. So I wrote a simple helper that generates articles for the app that I was testing. Here is the code in all its glory. It is not the fastest piece of code and is a memory hog, but who cares?
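The original snippet doesn't render here, so the following is a minimal Python sketch of a word-level Markov chain text generator in the same spirit; the `MarkovChain` class name and its API are my assumptions, not the actual helper:

```python
import random
from collections import defaultdict

class MarkovChain:
    """Word-level Markov chain: the next word depends only on the current word."""

    def __init__(self):
        # Maps each word to the list of words observed immediately after it.
        self.transitions = defaultdict(list)

    def train(self, text):
        """Record every adjacent word pair from the source text."""
        words = text.split()
        for current, nxt in zip(words, words[1:]):
            self.transitions[current].append(nxt)

    def generate(self, length=50, seed=None):
        """Produce `length` words by randomly walking the transition table."""
        word = seed or random.choice(list(self.transitions))
        out = [word]
        for _ in range(length - 1):
            followers = self.transitions.get(word)
            if not followers:
                # Dead end (word only appeared at the end); restart randomly.
                word = random.choice(list(self.transitions))
            else:
                word = random.choice(followers)
            out.append(word)
        return " ".join(out)
```

Storing every observed follower (duplicates included) in a plain list is what makes it a memory hog: picking uniformly from that list naturally weights frequent pairs more, at the cost of space.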



This is how you use it.
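Since the usage snippet doesn't render either, here is a self-contained sketch of the same idea condensed into one function, trained on some seed text and asked for an article of a given length (again, the function name and signature are illustrative, not the original):

```python
import random

def generate_article(source_text, n_words=100):
    """Generate n_words of text from a word-level Markov chain over source_text."""
    words = source_text.split()
    # Build the transition table: word -> list of observed next words.
    table = {}
    for cur, nxt in zip(words, words[1:]):
        table.setdefault(cur, []).append(nxt)
    word = random.choice(words)
    out = [word]
    for _ in range(n_words - 1):
        # Fall back to any word in the corpus if we hit a dead end.
        word = random.choice(table.get(word, words))
        out.append(word)
    return " ".join(out)

# Feed it any sizeable chunk of prose and it babbles back plausible nonsense:
seed_text = "a markov chain is a discrete random process with the markov property"
article = generate_article(seed_text, n_words=30)
print(article)
```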



And it produced this fine article. Isn't this cool?

2 comments:

Selvam said...

Thats a cool explanation on markov chains, even better than my college staffs :)

Niranjan said...

Should be called as Mudukov :). Muddu is on a roll...Nice to see a techie article from you after a long time.