Definition of Markov process:
Any stochastic process in which the probabilities of future states, at any given time, depend only on the present state and not on how that state was reached (the Markov property).
A process whose future behavior cannot be predicted from its past behavior beyond what the current state reveals, and which involves random chance or probability. The behavior of a business or economy, the flow of traffic, and the progress of an epidemic are all commonly modeled as Markov processes. Named after the Russian mathematician Andrei Andreevich Markov (1856-1922), who pioneered the study of such processes. A Markov process is a special kind of stochastic process, not a synonym for one: it is a stochastic process with the additional memoryless property described above. See also Brownian motion and random walk theory.
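The memoryless property can be illustrated with a minimal simulation. The sketch below assumes a hypothetical two-state weather chain ("sunny"/"rainy"); the states and transition probabilities are invented for the example, not taken from the definition above. The key point is that each next state is sampled using only the current state.

```python
import random

# Illustrative two-state chain; these states and transition
# probabilities are assumptions chosen only for demonstration.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state from the current state alone.

    This is the Markov property: the distribution over next states
    depends only on `state`, not on the path taken to reach it.
    """
    choices = TRANSITIONS[state]
    return rng.choices(list(choices), weights=list(choices.values()))[0]

def simulate(start, n_steps, seed=42):
    """Generate a path of n_steps transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 10)
print(path)
```

Because `step` receives only the latest state, the whole history `path[:-1]` is irrelevant to each draw, which is exactly what distinguishes a Markov process from a general stochastic process.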