Definition of markov jump process in the English English dictionary
A time-dependent variable that starts in an initial state and remains in that state for a random length of time, after which it makes a transition to another randomly chosen state, and so on
A stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at that state
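A minimal simulation sketch of the mechanism described above: exponential holding times in each state followed by a jump to a random next state. The rate matrix `Q`, the function name, and the parameter values are illustrative assumptions, not part of the definition.

```python
import numpy as np

def simulate_markov_jump_process(Q, x0, t_max, rng=None):
    """Simulate one path of a Markov jump process on [0, t_max].

    Q     : square rate matrix; Q[i, j] (i != j) is the jump rate i -> j,
            and each row sums to zero.
    x0    : index of the initial state.
    Returns the jump times and the states entered at those times.
    """
    rng = np.random.default_rng() if rng is None else rng
    times, states = [0.0], [x0]
    t, x = 0.0, x0
    while True:
        rate = -Q[x, x]                      # total exit rate from state x
        if rate <= 0:                        # absorbing state: stay forever
            break
        t += rng.exponential(1.0 / rate)     # random holding time in state x
        if t >= t_max:
            break
        probs = Q[x].copy()
        probs[x] = 0.0
        probs /= rate                        # distribution over the next state
        x = rng.choice(len(probs), p=probs)  # transition to a random new state
        times.append(t)
        states.append(x)
    return times, states

# Example: a 3-state process with an illustrative rate matrix.
Q = np.array([[-1.0,  0.6,  0.4],
              [ 0.5, -1.5,  1.0],
              [ 0.3,  0.7, -1.0]])
times, states = simulate_markov_jump_process(Q, x0=0, t_max=10.0)
print(list(zip(times, states)))
```

Because the holding times are exponential (memoryless) and each jump depends only on the current state, the simulated path satisfies the Markov property stated in the second definition.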