Combinatorial problem about a flock of sheep

Solution 1:

Provided that all orderings of the sheep's speeds are equally likely and no two sheep have the same speed, the $n$th sheep (counting from the front) is the slowest of its group if and only if all $n-1$ sheep in front of it are faster than it is, which has probability $\dfrac{1}{n}$.

Each group has precisely one slowest sheep, so, by linearity of expectation, with $30$ sheep the expected number of groups is $$\displaystyle \sum_{n=1}^{30} \dfrac1n \approx 3.995$$
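As a quick sanity check (not part of the original argument), here is a minimal Matlab sketch that estimates this expectation by simulation; it uses cummin, so it assumes a reasonably recent Matlab. It relies on the observation above: a sheep starts a new group exactly when its speed is a running minimum, scanning from the front.

n = 30; trials = 100000; g = zeros(trials,1);
for t = 1:trials
    s = rand(1,n);                 % random speeds, listed from front to back
    g(t) = sum(s == cummin(s));    % a new group starts at each running minimum
end
mean(g)                            % close to sum(1./(1:30)), about 3.995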

Solution 2:

Two parts:

1. A visualization (see below) of the flock's evolution: a time-position plot displaying the aggregation process.

[Figure: time-position plot of the 30 trajectories, which merge into groups.]

In this particular run, the flock ends up in $4$ groups.

This representation was obtained using a programming "trick" (Matlab program below) that may interest readers who would like to adapt it for other purposes.

Matlab program

clear all; close all; hold on; box on;
n = 30; s = rand(1,n);                % n sheep with uniform random speeds
a = 100; b = 60; axis([0,a,0,b]);     % time axis [0,a], position axis [0,b]
fill([0,a,a,0], [b,b,0,0], 'c');      % cyan background
for k = 1:n
    % trajectory of sheep k: the segment from (0,k) to (a, k+a*s(k));
    % the cyan fill above this line hides any earlier trajectory that would overtake sheep k
    fill([0,a,a,0], [b,b,k+a*s(k),k], 'c');
end;

Comment: this program stacks trapezoidal shapes, and each newly drawn shape partially occludes the previously drawn ones; as a result, each trajectory remains visible only until it catches up with a slower sheep in front, which is what makes the groups appear on the plot.
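To connect the picture with the count, one can append the following two lines (my addition, not in the original program) after the loop above; they compute the number of groups for the same speed vector s, using the running-minimum characterization from Solution 1, and display it in the figure title:

g = sum(s == cummin(s));                    % number of groups for the plotted speeds
title(['Number of groups: ', num2str(g)]);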

2. A simple proof of the result given by Henry.

Let $G_k$ be the random variable equal to the number of groups for a flock of $k$ sheep.

Consider an $n$th sheep being added to a flock of $n-1$ sheep; then we have

$$ \begin{cases}\text{either}&G_n=G_{n-1}+0 \\ \text{or}&G_n=G_{n-1}+1 \end{cases}$$

The latter case occurs exactly when the newly added sheep is slower than every sheep already in the flock, which happens with probability $\tfrac{1}{n}$ (the new sheep then never catches up with anyone and forms a group of its own). Therefore, the natural model in terms of random variables is:

$$\tag{1}G_n=G_{n-1}+B$$

where $B$ is a Bernoulli random variable with parameter $\tfrac{1}{n}$.

Taking expectation on both sides of (1),

$$E(G_n)=E(G_{n-1})+\tfrac{1}{n}.$$

Using the fact that $G_1=1$, hence $E(G_1)=1$ (a single sheep constitutes a group by itself), we deduce by a straightforward induction that $$E(G_n)=1+\tfrac{1}{2}+\tfrac{1}{3}+\cdots+\tfrac{1}{n}=H_n \approx \ln(n)+\gamma,$$

where $H_n$ is the $n$th harmonic number and $\gamma \approx 0.577$ is the Euler-Mascheroni constant.
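For $n=30$ this recovers the value from Solution 1; here is a two-line Matlab check (my addition) of the harmonic number against its logarithmic approximation:

H30 = sum(1./(1:30))                 % about 3.9950: the expected number of groups
log(30) + 0.5772156649               % about 3.9784: the approximation ln(30) + gamma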