In BSM model building, what counts as fine-tuned?

There is a difference between large hierarchies and fine-tuning.

We don't know how the values of the constants of the Standard Model (or its extensions) are set. Instead, we assume some prior – a sensible one is that the logarithm of the constant in question is uniformly distributed over some range. If we assume that the top and up Yukawas each have this distribution, say from $10^{-10}$ to $1$, then it's no surprise to find a ratio of order $10^{-5}$ between them.
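As a sanity check, we can make the claim above quantitative with a toy Monte Carlo (the setup is an assumed illustration, not from the original argument): draw two couplings from a log-uniform prior spanning ten decades and see how often a five-decade hierarchy appears.

```python
import math
import random

# Toy check (assumed setup): two couplings drawn with log-uniform priors
# on [1e-10, 1]. How often is one at least 1e5 times the other?
random.seed(0)

def log_uniform(lo, hi):
    """Sample x such that log10(x) is uniform on [log10(lo), log10(hi)]."""
    return 10 ** random.uniform(math.log10(lo), math.log10(hi))

trials = 100_000
count = 0
for _ in range(trials):
    a = log_uniform(1e-10, 1.0)
    b = log_uniform(1e-10, 1.0)
    if max(a, b) / min(a, b) >= 1e5:
        count += 1

# Analytically: |log10(a/b)| >= 5 for two uniforms on a 10-decade range
# has probability ((10 - 5) / 10)^2 = 0.25.
print(f"fraction with a 1e5 hierarchy: {count / trials:.2f}")  # ~0.25
```

About a quarter of random draws show such a hierarchy, so under this prior a five-decade ratio is unremarkable – which is the point of the paragraph above.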

If we similarly distributed the Higgs mass and Planck mass, say from $1 \,\mathrm{eV}$ to $10^{30} \, \mathrm{eV}$, then it would likewise be no surprise to see a hierarchy of order $10^{17}$ between them. The problem arises because we take these masses to be set at some high initial scale $\Lambda$ (of order the Planck mass) – their values at low scales are modified by quantum corrections. If there is some other particle in nature that couples to the Higgs and has a mass $M$ of order the Planck mass, the Higgs mass receives corrections

$$m_H^2(\Lambda) - m_H^2(0) \sim M^2 \ln (\Lambda/M) \sim M_P^2 \,.$$

Now we see a problem. In order for $m_H(0)$ to be small, say less than $10^{20} \, \mathrm{eV}$, we require $m_H(\Lambda)$ to be within about $10^{12}\,\mathrm{eV}$ of some mass close to the Planck mass, $10^{28}\,\mathrm{eV}$. Assuming the logarithm of $m_H(\Lambda)$ to be uniformly distributed as before, the probability of this occurring is minute.
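The arithmetic behind these numbers can be checked directly. Here is a sketch using the round values quoted above (all masses in eV; the factor-of-two details are not in the original text, just the standard expansion $(M+\delta)^2 - M^2 \approx 2M\delta$):

```python
import math

# Check of the numbers above (assumed round values, all masses in eV).
M = 1e28          # mass of the heavy particle, near the Planck mass
bound = 1e20      # desired upper bound on m_H(0)
decades = 30      # prior: log10(m) uniform from 1 eV to 1e30 eV

# m_H(Lambda)^2 - M^2 < bound^2 means m_H(Lambda) lies within delta of M,
# where (M + delta)^2 - M^2 ~ 2*M*delta = bound^2.
delta = bound**2 / (2 * M)
print(f"allowed window around M: about {delta:.0e} eV")  # ~5e11 eV

# Probability of a log-uniform draw landing in [M - delta, M + delta].
# The window is far below double precision relative to M, so use log1p
# rather than subtracting two nearly equal logarithms.
width_in_decades = math.log1p(2 * delta / (M - delta)) / math.log(10)
prob = width_in_decades / decades
print(f"probability under the log-uniform prior: about {prob:.0e}")  # ~1e-18
```

A probability of order $10^{-18}$ is what "minute" means here.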

This is the so-called electroweak hierarchy problem. It is a problem insofar as a 1% change in the initial Higgs mass would result in a vastly larger observed Higgs mass, nowhere near the electroweak scale. This is what is meant by a fine-tuning problem: a small change to our original theory results in vastly different or unworkable experimental consequences.
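To see this sensitivity numerically (again with the assumed round values, in eV): start from the initial value $m_H(\Lambda)$ that reproduces an observed-scale Higgs mass of $\sim 10^{11}\,\mathrm{eV}$, then shift it by 1% and recompute $m_H(0)$.

```python
import math

# Toy illustration (assumed round numbers) of the 1% sensitivity.
M = 1e28        # Planck-scale correction, in eV
m_obs = 1e11    # observed-scale Higgs mass, roughly 125 GeV in eV

# The initial value that "works". Note that m_obs**2 is invisible next to
# M**2 even at double precision -- that in itself illustrates the tuning.
m_lambda = math.sqrt(M**2 + m_obs**2)

m_shifted = 1.01 * m_lambda       # a 1% change at the high scale
m0_sq = m_shifted**2 - M**2       # resulting low-scale (mass)^2

print(f"m_H(0) after a 1% shift: {math.sqrt(m0_sq):.1e} eV")  # ~1.4e27 eV
```

A 1% change at the high scale drags the low-scale Higgs mass from $10^{11}\,\mathrm{eV}$ up to $\sim 10^{27}\,\mathrm{eV}$, within an order of magnitude of the Planck scale.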

When people talk about tuning in BSM models, they mean something similar. They are not saying that the model contains parameters which are hundreds of times larger than others. They are saying that 1% changes to the model's parameters result in something unworkable.