Concentration inequalities for the maximum of rescaled/normalized partial sums of iid random variables

One can use the Birnbaum–Marshall inequality:

Theorem (Theorem 2.1 in [1]). If $\left(S_k,k\geqslant 1\right)$ is a non-negative sub-martingale and $(c_k,k\geqslant 1)$ a non-decreasing sequence of positive numbers, then for each $p\geqslant 1$: $$\mathbb P\left\{\max_{1\leqslant k\leqslant n}\frac{S_k}{c_k}\geqslant 1\right\}\leqslant \frac{\mathbb E\left[S_n^p\right]}{c_n^p}+\sum_{i=1}^{n-1}\left(\frac 1{c_i^p}-\frac 1{c_{i+1}^p}\right)\mathbb E\left[S_i^p\right].$$
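
As a sketch of how this can be applied to the question (the choices of $p$ and $c_k$ below are mine, purely for illustration): assume the $X_i$ are iid, centered, with variance $\sigma^2$, write $S_k=X_1+\dots+X_k$, and note that $|S_k|$ is a non-negative sub-martingale. Taking $p=2$ and $c_k=\lambda\sqrt{k}$, so that $\mathbb E\left[|S_k|^2\right]=k\sigma^2$, the bound telescopes: $$\mathbb P\left\{\max_{1\leqslant k\leqslant n}\frac{|S_k|}{\sqrt{k}}\geqslant \lambda\right\}\leqslant \frac{n\sigma^2}{\lambda^2 n}+\sum_{i=1}^{n-1}\left(\frac 1{\lambda^2 i}-\frac 1{\lambda^2(i+1)}\right)i\sigma^2=\frac{\sigma^2}{\lambda^2}\sum_{i=1}^{n}\frac 1i\leqslant\frac{\sigma^2\left(1+\log n\right)}{\lambda^2}.$$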

Reference:

[1] Z. W. Birnbaum and A. W. Marshall, "Some Multivariate Chebyshev Inequalities with Extensions to Continuous Parameter Processes", Ann. Math. Statist. 32 (3) (1961), 687–703.


You can do something with Talagrand's inequality for at least some normalizations. The simplest case is when the $X_i$ are mean zero and bounded, say $|X_i| \le 1$ almost surely, and you're looking at $\max_k S_k / \sqrt{k}$. In that case each $S_k / \sqrt{k}$ is a convex 1-Lipschitz function of $(X_1, \dotsc, X_n)$ (its gradient is $(1/\sqrt{k}, \dotsc, 1/\sqrt{k}, 0, \dotsc, 0)$, which has Euclidean norm 1), and so $\max_k S_k/\sqrt{k}$ is a convex 1-Lipschitz function as well. Talagrand's concentration inequality for convex Lipschitz functions of bounded independent variables then gives $$ \mathbb{P} \left[\left| \max_{1 \le k \le n} \frac{S_k}{\sqrt{k}} - \mathbb{E}\max_{1 \le k \le n} \frac{S_k}{\sqrt{k}}\right| \ge \epsilon \right] \le C \exp[-c\epsilon^2] $$ for absolute constants $C, c > 0$. Of course you still need to bound that expectation. Maybe one can do better, but off the top of my head, applying the same sub-Gaussian tail bound to each $|S_k|/\sqrt{k}$ and taking a union bound gives $$ \mathbb{E}\max_{1 \le k \le n} \frac{S_k}{\sqrt{k}} \le \mathbb{E}\max_{1 \le k \le n} \frac{|S_k|}{\sqrt{k}} \le C \sqrt{\log n}. $$
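
As a quick numerical sanity check (not part of either argument above), here is a minimal Monte Carlo sketch in Python, assuming Rademacher $X_i$ so the boundedness hypothesis holds; the path length, the number of trials, and the reference curve $e^{-\epsilon^2}$ (i.e. pretending $C = c = 1$, which Talagrand's inequality does not guarantee) are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5_000       # length of each sample path (arbitrary illustrative choice)
trials = 2_000  # number of Monte Carlo repetitions (arbitrary illustrative choice)

# Rademacher variables: mean 0 and |X_i| <= 1 a.s., matching the bounded case above.
X = rng.choice([-1.0, 1.0], size=(trials, n))
S = np.cumsum(X, axis=1)             # partial sums S_1, ..., S_n for each trial
k = np.arange(1, n + 1)
M = (S / np.sqrt(k)).max(axis=1)     # max_{1<=k<=n} S_k / sqrt(k), one value per trial

print("empirical E[max_k S_k/sqrt(k)]:", M.mean())
print("sqrt(log n) for comparison    :", np.sqrt(np.log(n)))

# Empirical tails around the mean vs. a Gaussian-type reference curve exp(-eps^2)
# (the actual Talagrand bound is C*exp(-c*eps^2) with unspecified absolute constants).
for eps in (0.5, 1.0, 1.5, 2.0):
    p_emp = np.mean(np.abs(M - M.mean()) >= eps)
    print(f"P(|max - mean| >= {eps}): {p_emp:.4f}   exp(-eps^2) = {np.exp(-eps**2):.4f}")
```

The empirical mean should sit well below $\sqrt{\log n}$, and the empirical tail probabilities should decay at least as fast as a Gaussian in $\epsilon$, consistent with the concentration bound above.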