Solution 1:

1. Here is a mild condition on the distribution of the $S_n$'s that guarantees the existence of a limiting distribution.

Proposition. Assume that $\sum_{n=0}^{\infty} (1-\beta)^n |S_n| < \infty$ holds almost surely. Then $(X_n)_{n\geq 0}$ has a limiting distribution.

Proof. For each $s \in \mathbb{R}$, define $f_s : \mathbb{R} \to \mathbb{R}$ by

$$ f_s(x) = \begin{cases} \alpha s + (1-\alpha) x, & \text{if $s \geq x$}, \\ \beta s + (1-\beta) x, & \text{if $s \leq x$}. \end{cases} $$

This function allows us to rewrite the recursive formula as $ X_{n+1} = f_{S_n}(X_n) $, and so,

$$X_n = (f_{S_{n-1}} \circ \cdots \circ f_{S_0} )(0).$$

But since the $S_n$'s are i.i.d., this implies

$$X_n \stackrel{\text{law}}{=} X'_n := (f_{S_{0}} \circ \cdots \circ f_{S_{n-1}} )(0).$$

In light of this, it suffices to prove that $(X'_n)_{n\geq 0}$ converges in distribution. Given the assumption, we actually prove that $(X'_n)_{n\geq 0}$ converges almost surely. Indeed, note that

$$|f_s(y) - f_s(x)| \leq (1-\beta)|y - x|$$

uniformly in $s, x, y \in \mathbb{R}$, since each branch of $f_s$ has slope at most $1-\beta$ in $x$. Applying this to

$$ X'_{n+1} - X'_n = (f_{S_{0}} \circ \cdots \circ f_{S_{n-1}} )(f_{S_n}(0)) - (f_{S_{0}} \circ \cdots \circ f_{S_{n-1}} )(0), $$

we get

$$ \sum_{n=0}^{\infty} \left| X'_{n+1} - X'_n \right| \leq \sum_{n=0}^{\infty} (1-\beta)^n \left|f_{S_n}(0) - 0\right| \leq \sum_{n=0}^{\infty} (1-\beta)^n \alpha \left|S_n\right|. $$

Together with the assumption, the desired conclusion follows. $\square$
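As a quick sanity check (not part of the proof), one can simulate the backward iteration $X'_n$ and watch it converge: successive iterates form an absolutely summable sequence of differences. The choice of a standard normal for $S_n$ and the values of $\alpha$, $\beta$ below are arbitrary illustrative choices, assuming numpy is available.

```python
import numpy as np

# Illustrative parameters: any 0 < beta < alpha < 1 and integrable S_n work
alpha, beta = 0.5, 0.2
rng = np.random.default_rng(0)
S = rng.normal(size=200)  # i.i.d. innovations S_0, S_1, ...

def f(s, x):
    # f_s(x) from the proof: move x towards s with rate alpha or beta
    return alpha * s + (1 - alpha) * x if s >= x else beta * s + (1 - beta) * x

def backward_iterate(n):
    # X'_n = (f_{S_0} o ... o f_{S_{n-1}})(0)
    x = 0.0
    for k in reversed(range(n)):
        x = f(S[k], x)
    return x

# |X'_{n+1} - X'_n| <= (1-beta)^n * alpha * |S_n|, so the series of
# differences converges and the backward iterates settle down.
diffs = [abs(backward_iterate(n + 1) - backward_iterate(n)) for n in range(150)]
print(diffs[10], diffs[100])
```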

Remarks.

  1. The assumption of the proposition is rather arbitrary but still moderately general. For instance, it is satisfied whenever $S_0$ is integrable, since then $\mathbb{E}\bigl[\sum_{n=0}^{\infty}(1-\beta)^n|S_n|\bigr] = \beta^{-1}\,\mathbb{E}[|S_0|] < \infty$.

  2. Although the original problem is formulated with the initial condition $X_0 = 0$, this is not important. Indeed, the recurrence relation shows that $(X_n)$ forgets its initial condition at least exponentially fast, hence the existence of a limiting distribution does not depend on $X_0$.

    To see this, let $X_n(\xi) = (f_{S_{n-1}} \circ \cdots \circ f_{S_0})(\xi)$ denote the sequence given by OP's recurrence relation with the (possibly random) initial condition $X_0 = \xi$. Then

    $$ |X_n(\xi_2) - X_n(\xi_1)| \leq (1-\beta)^n|\xi_2 - \xi_1|. $$
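A minimal numerical illustration of this forgetting property (the innovation distribution and all parameter values below are arbitrary choices):

```python
import numpy as np

alpha, beta = 0.5, 0.2
rng = np.random.default_rng(1)
S = rng.normal(size=100)

def step(x, s):
    # One step of the recurrence: move towards s with rate alpha or beta
    return x + alpha * (s - x) if s > x else x + beta * (s - x)

# Two copies of the chain driven by the SAME innovations, different X_0
x1, x2 = -5.0, 7.0
gap0 = abs(x2 - x1)
for n, s in enumerate(S, start=1):
    x1, x2 = step(x1, s), step(x2, s)
    # Geometric contraction: |X_n(xi2) - X_n(xi1)| <= (1-beta)^n |xi2 - xi1|
    assert abs(x2 - x1) <= (1 - beta) ** n * gap0 + 1e-12

final_gap = abs(x2 - x1)
print(final_gap)
```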


2. Assume that $(X_n)$ converges in distribution to a random variable $X$. Let $S$ have the same distribution as $S_0$ and be independent of $X$. Then the following are easy consequences.

  • If $S$ is supported on an interval $I$, then so is $X$.
  • If $S$ is integrable, then so is $X$. More precisely, we have $\mathbb{E}[|X|] \leq \frac{\alpha}{\beta}\mathbb{E}[|S|]$. Also, the distributional identity $X \stackrel{\text{law}}{=} f_S(X)$ yields $\alpha\,\mathbb{E}[(S-X)^+] = \beta\,\mathbb{E}[(X-S)^+]$, from which

    $$ \mathbb{E}[X] = \mathbb{E}[S] + \frac{\alpha-\beta}{\alpha+\beta}\mathbb{E}[|X-S|]. $$

    In particular, if $S$ is non-degenerate, then $\mathbb{E}[|X-S|] > 0$ and hence $\mathbb{E}[X] > \mathbb{E}[S]$.
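A rough Monte Carlo check of this mean identity, exploiting that $X_n$ and $S_n$ are independent at each step; the innovation distribution, chain length, and tolerance below are arbitrary choices:

```python
import numpy as np

alpha, beta = 0.5, 0.2
rng = np.random.default_rng(2)

# Run the chain long enough to be approximately stationary, keeping the
# pairs (X_n, S_n); S_n is independent of X_n by construction.
n_steps, burn_in = 200_000, 1_000
S = rng.normal(size=n_steps)
xs = np.empty(n_steps)
x = 0.0
for n, s in enumerate(S):
    xs[n] = x
    x = x + alpha * (s - x) if s > x else x + beta * (s - x)

X, Ssamp = xs[burn_in:], S[burn_in:]
lhs = X.mean()
rhs = Ssamp.mean() + (alpha - beta) / (alpha + beta) * np.abs(X - Ssamp).mean()
print(lhs, rhs)
```

With these parameters the two sides should agree up to Monte Carlo error, and both sides exceed $\mathbb{E}[S] = 0$, as the remark predicts.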

Solution 2:

I will take my shot here. However, the post won't fully answer the question, and it lacks rigor in later parts.

Let's first examine a simpler claim that will be a basis for later discussion.

Claim: Let $(S_n)$ be i.i.d. with a symmetric distribution whose characteristic function takes non-negative values, and let $Z_{n+1}=Z_n + \gamma (S_n - Z_n)$ be a sequence of random variables with $Z_0 = 0$ and $0 < \gamma < 1$. Then $Z_n$ converges in distribution.

Proof: Let the characteristic function of $S_n$ be $\psi(t)$. Since $S_n$ is symmetric, $\psi(t)$ takes real values and $\vert \psi(t) \vert \le 1$. Denote the characteristic function of $Z_n$ by $\phi_n(t)$. Then $\phi_0(t) = 1$, and since $Z_{n+1} = (1-\gamma)Z_n + \gamma S_n$ with $S_n$ independent of $Z_n$, $$ \phi_{n+1}(t)=\phi_n((1-\gamma)t)\cdot\psi(\gamma t). $$ Recursively applying the last equation leads to $$ \phi_n(t) = \prod_{k=0}^{n-1}\psi\left( \gamma{(1-\gamma)}^kt \right). \quad (1) $$ For each fixed $t \in \mathbb{R}$, the factors lie in $[0,1]$, so $\phi_n(t)$ is non-increasing in $n$ and bounded below by zero. Hence $\phi_n(t)$ converges pointwise to a limit $\phi(t)$. Provided $\phi$ is continuous at $t = 0$ (which holds, e.g., when $S_n$ is integrable, since then $1 - \phi_n(t) \le \sum_{k=0}^{n-1}\bigl(1-\psi(\gamma(1-\gamma)^k t)\bigr) \le \vert t\vert\,\mathbb{E}[\vert S_0\vert]$), Lévy's continuity theorem shows that $\phi$ is a characteristic function and $Z_n$ converges in distribution.

Note: I think it is possible to weaken the requirements on the characteristic function values of $S_n$. But it will require more work.
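For a concrete instance of the claim (the value of $\gamma$ below is an arbitrary choice): if $S_n \sim \mathcal{N}(0,1)$, then $\psi(u) = e^{-u^2/2}$ is non-negative, and the product in (1) is a Gaussian characteristic function with variance $\gamma^2 \sum_k (1-\gamma)^{2k} \to \gamma/(2-\gamma)$. The exact variance recursion confirms this limit:

```python
gamma = 0.3

# Exact variance recursion for Z_{n+1} = (1-gamma) Z_n + gamma S_n, Var(S_n) = 1
var = 0.0  # Z_0 = 0
for _ in range(200):
    var = (1 - gamma) ** 2 * var + gamma ** 2

# Closed form of the geometric series: gamma^2 / (1 - (1-gamma)^2)
limit = gamma / (2 - gamma)
print(var, limit)
```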

Coming back to the original problem, let's first reformulate it into an equivalent format.

Reformulation: The original recursion is equivalent to $$ X_{n+1} = \max\{(1-\alpha)X_n + \alpha S_n, (1-\beta)X_n + \beta S_n\}. $$

Proof: Recall $0 < \beta < \alpha < 1$. The above is true because if $S_n > X_n$, then $\alpha(S_n - X_n) > \beta(S_n - X_n)$, and if $S_n \le X_n$, then $\alpha(S_n - X_n) \le \beta(S_n - X_n)$. The original recursion always takes the maximum of the two branches.
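The equivalence is easy to spot-check numerically (the parameter values and random inputs below are arbitrary):

```python
import numpy as np

alpha, beta = 0.7, 0.3
rng = np.random.default_rng(3)
X = rng.normal(size=1000)
S = rng.normal(size=1000)

# Original branching recursion vs. the max reformulation
orig = np.where(S > X, X + alpha * (S - X), X + beta * (S - X))
reform = np.maximum((1 - alpha) * X + alpha * S, (1 - beta) * X + beta * S)
ok = bool(np.allclose(orig, reform))
print(ok)
```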

Unfortunately things from here below become a bit fuzzy.

Conjecture 1: $X_n$ does NOT converge a.s. or in probability.

Justification: Both $(1-\alpha)X_n + \alpha S_n$ and $(1-\beta)X_n + \beta S_n$ can be viewed as weighted averages of $X_n$ and $S_n$ with two different weights. Assuming $S_n$ is non-degenerate, the sequence $X_n$ can never settle at a limit point, since each step moves by an amount proportional to $\alpha$ or $\beta$ times $\vert S_n - X_n \vert$, which does not vanish. This is most likely true, since most non-degenerate Markov processes do not converge in probability.
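A quick simulation supports this intuition: the increments $\vert X_{n+1} - X_n \vert$ do not shrink to zero (the innovation distribution, parameters, and threshold below are arbitrary choices):

```python
import numpy as np

alpha, beta = 0.5, 0.2
rng = np.random.default_rng(4)
x = 0.0
increments = []
for s in rng.normal(size=5000):
    x_next = x + alpha * (s - x) if s > x else x + beta * (s - x)
    increments.append(abs(x_next - x))
    x = x_next

# If X_n converged a.s., late increments would have to vanish; they do not.
late_max = max(increments[-500:])
print(late_max)
```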

Conjecture 2: $X_n$ does NOT converge in distribution.

Justification: Equation (1) shows that the limiting characteristic function, and hence the limiting distribution, depends on the parameter $\gamma$. So the recursion patterns $(1-\alpha)X_n + \alpha S_n$ and $(1-\beta)X_n + \beta S_n$ would, on their own, converge to two different distributions since $\alpha \ne \beta$. Given that $S_n$ is non-degenerate, there is always a non-zero probability of following one limiting path and then suddenly switching recursion patterns, so the distribution may keep oscillating between the two limits. I am less confident in this conclusion.

Finding the stationary distribution if there is one:

I actually think this might be tractable computationally and analytically for simple cases.

  • Determine the support of the distribution. The support is limited by the minimum and maximum values of $S_n$ (because each $X_n$ is a weighted average of the $S_n$'s starting from zero, with switching weights). This helps in computation. Even if the range of $S_n$ is not bounded, one can truncate it to obtain a reasonable approximation.
  • Establish a functional equation based on the simple fact that, in stationarity, the distributions before and after a Markov transition are the same. Once the equation is established, analytical solutions can be sought for $S_n$ with simple distributions, and an iterative algorithm can be used to compute the stationary distribution for $S_n$ with more complicated distributions.
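The steps above can be sketched as a fixed-point iteration on a discretized state space. This is only a minimal illustration under arbitrary assumptions: $S_n$ uniform on a grid over $[-1,1]$, and outcomes snapped to the nearest grid point; the grid size, parameters, and tolerance are all arbitrary choices.

```python
import numpy as np

alpha, beta = 0.6, 0.2
grid = np.linspace(-1.0, 1.0, 201)           # support bounded by min/max of S_n
s_pmf = np.full(grid.size, 1.0 / grid.size)  # S_n ~ uniform on the grid

# Build the transition matrix once: T[i, j] = P(X_{n+1} = grid[j] | X_n = grid[i]),
# snapping each outcome of the recursion back onto the nearest grid point.
T = np.zeros((grid.size, grid.size))
for i, x in enumerate(grid):
    rate = np.where(grid > x, alpha, beta)   # alpha if s > x, else beta
    y = x + rate * (grid - x)                # next state for every value of s
    idx = np.abs(grid[None, :] - y[:, None]).argmin(axis=1)
    np.add.at(T[i], idx, s_pmf)

# Fixed-point iteration: a stationary pmf satisfies pmf == pmf @ T
pmf = np.zeros(grid.size)
pmf[grid.size // 2] = 1.0                    # X_0 = 0
for _ in range(500):
    new_pmf = pmf @ T
    delta = np.abs(new_pmf - pmf).sum()
    pmf = new_pmf
    if delta < 1e-12:
        break

mean_X = float(grid @ pmf)
print(delta, mean_X)
```

With $\alpha > \beta$, the computed stationary mean comes out strictly above $\mathbb{E}[S] = 0$, consistent with the asymmetry of the recursion.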