Let $E$ be an $\mathbb R$-Banach space and $g:[0,\infty)\to E$. For an interval $I\subseteq[0,\infty)$, let $$\mathcal S_I:=\{(t_0,\ldots,t_k)\in I^{k+1}:k\in\mathbb N_0\text{ and }t_0<\cdots<t_k\},$$ $$\operatorname{Var}_\varsigma g:=\sum_{i=1}^k\left\|g(t_i)-g(t_{i-1})\right\|_E\quad\text{for }\varsigma=(t_0,\ldots,t_k)\in\mathcal S_I,$$ and $$\operatorname{Var}_Ig:=\sup_{\varsigma\in\mathcal S_I}\operatorname{Var}_\varsigma g.$$
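For concreteness, here is a minimal numerical sketch of these definitions with $E=\mathbb R$ (the helper names are my own; refining uniform partitions only approximates the sup over all of $\mathcal S_I$, but it illustrates the mechanics):

```python
# A minimal sketch of the definitions above, with E = R. Refining
# uniform partitions only approximates the sup over all partitions.

def var_along(g, partition):
    """Var_sigma g: sum of |g(t_i) - g(t_{i-1})| over a partition."""
    return sum(abs(g(b) - g(a)) for a, b in zip(partition, partition[1:]))

def var_interval(g, lo, hi, steps=12):
    """Crude approximation of Var_[lo,hi] g via dyadic refinement."""
    best, n = 0.0, 1
    for _ in range(steps):
        partition = [lo + (hi - lo) * i / n for i in range(n + 1)]
        best = max(best, var_along(g, partition))
        n *= 2
    return best

# g(x) = x(1 - x) rises by 1/4 on [0, 1/2] and falls by 1/4 on [1/2, 1],
# so Var_[0,1] g = 1/2.
print(var_interval(lambda x: x * (1 - x), 0.0, 1.0))  # ~0.5
```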

If $D=\{t_n:n\in\mathbb N_0\}$ is a countable dense subset of $[0,\infty)$, can we show that if $g$ is right-continuous, then $$\operatorname{Var}_{[0,t]}g=\sup_{k\in\mathbb N}\sum_{i=1}^k\left\|g(\min(t_i,t))-g(\min(t_{i-1},t))\right\|_E\tag1$$ for all $t>0$?
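To make the right-hand side of $(1)$ concrete, here is a small sketch of its partial sums (the helper `rhs_partial_sum` is hypothetical, with $E=\mathbb R$; a finite monotone list stands in for the first points of an enumeration of $D$):

```python
# Hypothetical helper for the k-th partial sum on the right-hand side
# of (1), with E = R; `enum` plays the role of the enumeration (t_n).

def rhs_partial_sum(g, t, enum, k):
    """sum_{i=1}^k |g(min(t_i, t)) - g(min(t_{i-1}, t))|."""
    clipped = [min(enum[i], t) for i in range(k + 1)]
    return sum(abs(g(b) - g(a)) for a, b in zip(clipped, clipped[1:]))

# Illustration only: a finite monotone list standing in for the first
# points of an enumeration. For g(x) = min(x, 1) and t = 1 the partial
# sums increase to Var_[0,1] g = 1.
points = [i / 16 for i in range(33)]  # 0, 1/16, ..., 2
print(rhs_partial_sum(lambda x: min(x, 1.0), 1.0, points, 32))  # 1.0
```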

The inequality "$\ge$" is clearly trivial. How can we show the reverse inequality?


Here is a counterexample. Let $E = \mathbb{R}$, $t = 1$, and define $g(x) = 1 - x$ for $x\in [0,1]$ and $g(x) = 0$ for $x \geq 1$ (so $g$ is continuous, hence right-continuous).

Clearly $\operatorname{Var}_{[0,1]}g = 1$. Now take a dense set of points $\{t_n\}_{n=0}^\infty \subset [0, \infty)$ such that $t_{2n} \geq 1$ and $t_{2n + 1} \in [0, 1)$ for all $n\geq 0$; such an enumeration exists, e.g. interleave an enumeration of $\mathbb Q \cap [1,\infty)$ with one of $\mathbb Q \cap [0,1)$.

Since $g(1) = 0$ and $\min(t_{2i}, t) = 1$, we get $$ |g(\min(t_{2i}, t)) - g(\min(t_{2i - 1}, t))| = |g(1) - g(t_{2i - 1})| = |g(t_{2i - 1})|, $$ and similarly $$ |g(\min(t_{2i + 1}, t)) - g(\min(t_{2i}, t))| = |g(t_{2i+1})|. $$ Hence $$ \sum\limits_{i=1}^{2k+1} |g(\min(t_{i}, t)) - g(\min(t_{i - 1}, t))| \geq \sum\limits_{j=0}^k |g(t_{2j+1})|, $$ which diverges as $k\to \infty$: since $\{t_{2j+1}\}$ is dense in $[0,1)$, the terms $|g(t_{2j+1})| = 1 - t_{2j+1}$ do not converge to $0$. Thus the right-hand side of $(1)$ is infinite, while $\operatorname{Var}_{[0,1]}g = 1$.
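For what it's worth, here is a quick numerical check of this counterexample (a sketch; the interleaved dyadic enumeration below is just one concrete choice satisfying the constraints on $\{t_n\}$):

```python
# Numerical check of the counterexample. The interleaved dyadic
# enumeration is one concrete choice: its even-indexed points lie in
# [1, 2) and its odd-indexed points are dense in [0, 1), which is all
# the argument above uses.

def g(x):
    return 1.0 - x if x < 1.0 else 0.0

def dyadics_in_unit():
    """Enumerate the dyadic rationals in [0, 1): 0, 1/2, 1/4, 3/4, ..."""
    yield 0.0
    level = 1
    while True:
        for j in range(1, 2 ** level, 2):
            yield j / 2 ** level
        level += 1

evens, odds = dyadics_in_unit(), dyadics_in_unit()
t_enum = []
for _ in range(50):                    # first 100 points t_0, ..., t_99
    t_enum.append(1.0 + next(evens))   # t_{2n} >= 1
    t_enum.append(next(odds))          # t_{2n+1} in [0, 1)

t = 1.0
clipped = [min(s, t) for s in t_enum]
total = sum(abs(g(b) - g(a)) for a, b in zip(clipped, clipped[1:]))
print(total)  # grows without bound as more points are taken; Var_[0,1] g = 1
```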