Concentration bounds for sums of random variables of permutations

One trick that comes in handy sometimes (I originally saw it in this paper of Talagrand, though it may go back further): We can view a random permutation in $S_n$ as being generated as follows. Start with the identity permutation, and successively perform the transpositions $(n, a_n)$, then $(n-1, a_{n-1})$, and so on down to $(2, a_2)$, where each $a_j$ is chosen uniformly from the numbers between $1$ and $j$ (with no transposition if $a_j=j$).
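
For concreteness, here is a minimal Python sketch of this generation scheme, reading "perform the transposition $(j, a_j)$" as swapping the entries in positions $j$ and $a_j$ (this is just the Fisher–Yates shuffle written 1-indexed; the function name is my own):

```python
import random

def random_permutation(n):
    """Uniform random permutation of {1, ..., n}, built as described above:
    start from the identity and, for j = n, n-1, ..., 2, swap the entries in
    positions j and a_j, where a_j is uniform on {1, ..., j} and the a_j are
    independent of one another."""
    perm = list(range(1, n + 1))            # identity permutation
    for j in range(n, 1, -1):               # j = n, n-1, ..., 2
        a_j = random.randint(1, j)          # independent, uniform on {1, ..., j}
        # transposition (j, a_j): a no-op when a_j == j; lists are 0-indexed
        perm[j - 1], perm[a_j - 1] = perm[a_j - 1], perm[j - 1]
    return perm

print(random_permutation(10))
```

Every permutation arises from exactly one choice of $(a_2,\dots,a_n)$, which is why the output is uniform over $S_n$.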

The point is that the $a_j$ are now independent. So if we have a situation where each individual $a_j$ has little impact on the sum of the $X_i$ or $Y_i$, then we can apply concentration bounds for independent variables. For example, since changing an individual $a_j$ affects at most $3$ positions in the final permutation (wherever $j$, the old $a_j$, and the new $a_j$ end up), changing that $a_j$ can change $X_1+\dots+X_n$ by at most $3$ (or by at most $1$, if you're more careful), so McDiarmid's inequality immediately gives exponential concentration on the sum.
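
To illustrate (this is my own toy example, not from the question): take $X_i$ to be the indicator that $i$ is a fixed point of the permutation. The sketch below writes the permutation as a deterministic function of the independent choices $a_j$, empirically checks the bounded-differences condition by re-choosing a single $a_j$, and then looks at the spread of the sum over many samples; the helper names `perm_from_choices` and `fixed_points` are hypothetical.

```python
import random

def perm_from_choices(n, a):
    """Deterministic map from the independent choices a = {j: a_j}
    (with a_j in {1, ..., j} for j = 2, ..., n) to a permutation of
    {1, ..., n}, using the same transposition scheme as above."""
    perm = list(range(1, n + 1))
    for j in range(n, 1, -1):
        perm[j - 1], perm[a[j] - 1] = perm[a[j] - 1], perm[j - 1]
    return perm

def fixed_points(perm):
    """Example statistic: X_1 + ... + X_n with X_i = 1 if the permutation fixes i."""
    return sum(1 for i, v in enumerate(perm, start=1) if i == v)

random.seed(0)
n = 100

# Bounded differences: for one draw of the a_j, try every alternative value of
# a single a_j and record the largest change in the statistic.  The change
# should never exceed 3, since at most 3 positions of the final permutation move.
a = {j: random.randint(1, j) for j in range(2, n + 1)}
base = fixed_points(perm_from_choices(n, a))
worst = 0
for j in range(2, n + 1):
    old = a[j]
    for new in range(1, j + 1):
        a[j] = new
        worst = max(worst, abs(fixed_points(perm_from_choices(n, a)) - base))
    a[j] = old
print("largest change from altering one a_j:", worst)

# Empirical concentration of the sum over many independent samples.
samples = [fixed_points(perm_from_choices(n, {j: random.randint(1, j)
                                              for j in range(2, n + 1)}))
           for _ in range(5000)]
mean = sum(samples) / len(samples)
print("empirical mean:", round(mean, 2), "(the expected number of fixed points is 1)")
print("fraction of samples within 5 of the mean:",
      sum(abs(s - mean) <= 5 for s in samples) / len(samples))
```

With $c_j = 3$ for each of the $n-1$ coordinates, McDiarmid's inequality reads $\Pr\bigl(|f - \mathbb{E}f| \ge t\bigr) \le 2\exp\bigl(-2t^2/\sum_j c_j^2\bigr) = 2\exp\bigl(-2t^2/(9(n-1))\bigr)$.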


Note that the variables $X_i$ are exchangeable: they can be permuted without changing their joint distribution. There is a paper by Persi Diaconis that discusses this situation. Essentially one has bounds on the distance to the distribution of i.i.d. variables; what that means for concentration specifically I do not know.