Multiplicative version of McDiarmid's inequality?
Suppose you have $n$ i.i.d. random variables taking values in $\{0,1\}$, and $X$ represents their sum. Then you can use a Chernoff bound to control the deviation of $X$ from its expectation. The Chernoff bound has two useful forms: the typical bound which controls the additive deviation, in terms of the number of random variables $n$, and the multiplicative bound, which controls the relative deviation from the expectation, with a bound that is independent of the number of random variables $n$.
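For concreteness, in their standard formulations (stated here as a sketch; constants vary between references) the two forms read:

$$\Pr\big[|X - \mathbb{E}X| \ge t\big] \le 2e^{-2t^2/n} \qquad \text{(additive / Hoeffding form)},$$

$$\Pr\big[|X - \mathbb{E}X| \ge \delta\,\mathbb{E}X\big] \le 2e^{-\delta^2 \mathbb{E}X/3} \qquad \text{(multiplicative form, } 0 < \delta < 1\text{)}.$$

Note that the multiplicative form depends on $\mathbb{E}X$ rather than on $n$ directly, which is what makes it useful when $\mathbb{E}X \ll n$.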
When the quantity that one is interested in is not the sum of $n$ i.i.d. random variables, but instead some other $1$-Lipschitz function of the random variables, then McDiarmid's inequality gives essentially the same bound as the additive version of the Chernoff bound.
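To spell this out: if $f$ satisfies the bounded-differences condition with constants $c_1, \dots, c_n$ (so $1$-Lipschitz in the Hamming sense means $c_i = 1$ for all $i$), McDiarmid's inequality states

$$\Pr\big[|f(X_1,\dots,X_n) - \mathbb{E}f| \ge t\big] \le 2\exp\!\left(-\frac{2t^2}{\sum_{i=1}^n c_i^2}\right) = 2e^{-2t^2/n},$$

which matches the additive Chernoff/Hoeffding bound above but carries no analogue of the multiplicative form.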
My question: Is there a multiplicative version of Mcdiarmid's inequality that bounds the relative deviation of an arbitrary $1$-Lipschitz function of $n$ i.i.d. random variables in a way that is independent of $n$, akin to the multiplicative version of the Chernoff bound?
McDiarmid's inequality is proven by applying Azuma's inequality (which is "additive") to a suitable martingale.
You should be able to obtain "multiplicative McDiarmid-type" inequalities by applying a "multiplicative Azuma-type inequality" (i.e., "Chernoff bounds for dependent r.v.'s") as in, e.g., this MathOverflow question, to the same martingale.
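One standard variance-sensitive inequality of this type (a sketch; see Freedman's inequality for the precise statement) is: for a martingale $(M_k)$ with increments bounded by $1$ and predictable quadratic variation $V_n$,

$$\Pr\big[M_n \ge t \ \text{and}\ V_n \le \sigma^2\big] \le \exp\!\left(-\frac{t^2}{2(\sigma^2 + t/3)}\right).$$

Applying this to the Doob martingale $M_k = \mathbb{E}[f \mid X_1, \dots, X_k]$ replaces the worst-case factor $n$ with the (possibly much smaller) accumulated conditional variance, in the same way the multiplicative Chernoff bound replaces $n$ with $\mathbb{E}X$.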