## Heuristic

The Markov chain model itself provides a means of predicting future states. Using the transition matrix *T* and the initial position vector, i.e., the distance separating the two nodes at the beginning, we can infer the movement probabilities of future steps. The following heuristic predicts the state, i.e., the distance between two nodes, *n* steps later, based on the current situation.

For a given pair of nodes, we apply the position vector to the transition matrix and deduce the probability of being in any state at the future step. This technique provides the probability for the given nodes of being in state *S in {infinity, 1, 2, ..., k}* at the *n*^{th} future step. The computation is *V_{n} = V_{0} T^{n}*, where *V_{0}* is the current position vector.
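This computation can be sketched in plain Python; the function name and the list-of-lists matrix layout are our own illustration, not part of the original description:

```python
def predict_distribution(v, T, n):
    """Return v * T^n: the probability of being in each state
    n steps ahead, given the current position vector v and the
    transition matrix T (rows sum to 1)."""
    size = len(T)
    for _ in range(n):
        # one step: new_v[j] = sum over i of v[i] * T[i][j]
        v = [sum(v[i] * T[i][j] for i in range(size)) for j in range(size)]
    return v
```

For larger *n* or bigger state spaces, the same result is obtained more efficiently by raising *T* to the *n*^{th} power once and applying it to the vector.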

From the result vector, the heuristic outputs two states: the one with the highest probability *S_{f}* and the one with the second highest probability *S_{s}*. These two values are the prediction of our heuristic for the *n*^{th} next step.

Here, we detail an example with k = 2 (we only consider distances 1 and 2 as well as k-intercontact = infinity):

Let us consider the position vector [0 1 0], indicating that the nodes are currently in contact (1-hop distance), and the transition matrix *T*:
[[0.5 0.1 0.4],
[0.7 0.2 0.1],
[0.6 0.3 0.1]].
In this case, we want to predict the potential states for the 1^{st} next step, so we use *T* to the power 1. The multiplication yields the vector:
[0.7 0.2 0.1].

For the next interval, we obtain the following predictions:

- *S_{f}* = infinity -> k-intercontact with probability 0.7,
- *S_{s}* = contact with probability 0.2.
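The worked example can be checked end to end; the state labels below are our own illustrative names for the three states (the matrix and vector come from the example above):

```python
# Transition matrix from the example; row/column order: [infinity, 1, 2]
T = [[0.5, 0.1, 0.4],
     [0.7, 0.2, 0.1],
     [0.6, 0.3, 0.1]]
labels = ["k-intercontact", "contact", "distance 2"]  # illustrative names

v = [0, 1, 0]  # nodes currently in contact (state 1)

# One-step prediction: v * T^1 selects the row of T for state 1
pred = [sum(v[i] * T[i][j] for i in range(3)) for j in range(3)]

# The heuristic keeps the two most probable states
ranked = sorted(range(3), key=lambda j: pred[j], reverse=True)
S_f, S_s = ranked[0], ranked[1]
print(labels[S_f], pred[S_f])  # k-intercontact 0.7
print(labels[S_s], pred[S_s])  # contact 0.2
```

Since the position vector has a single 1, the one-step prediction is simply the corresponding row of *T*; ranking that row gives *S_{f}* and *S_{s}*.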