Why doesn't increasing resistance increase brightness if $P=I^2\cdot R$

Power sources can operate in two modes: constant voltage (CV) or constant current (CC).

In CV mode, the voltage is imposed, and the output current adjusts depending on the load. This is for instance the case at home, where the electrical outlets deliver 100V (or 220V depending on the country), regardless of what is plugged into them. In this case, $P=V^2/R$ is the relevant expression, as $V$ is the known value.

In CC mode, the current is imposed and the corresponding voltage adjusts to match the load. While this mode is less familiar in domestic use, it is often used in electrical engineering - it ensures, for instance, a constant magnetic field in a coil, regardless of heating effects that could alter the resistance of the circuit. In this case, $P=RI^2$ is the relevant expression. If you increase the resistance, the power supply will increase the voltage to maintain constant current, and the dissipated power is logically higher.
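The contrast between the two modes can be checked numerically. A quick sketch (the voltage, current, and resistance values below are purely illustrative):

```python
# Power dissipated in a resistor under the two supply modes.

def power_cv(V, R):
    """Constant-voltage supply: P = V^2 / R."""
    return V**2 / R

def power_cc(I, R):
    """Constant-current supply: P = R * I^2."""
    return R * I**2

# Doubling R halves the power at constant voltage...
print(power_cv(240, 960), power_cv(240, 1920))   # 60.0 W, 30.0 W
# ...but doubles it at constant current.
print(power_cc(0.25, 960), power_cc(0.25, 1920)) # 60.0 W, 120.0 W
```

The same change in $R$ moves the dissipated power in opposite directions depending on which quantity the supply holds fixed.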

You can use any of these formulas to calculate $P$:

$$P = I^2 \cdot R$$ $$P = \frac{V^2}{R}$$

They are both correct and will give the same result. You cannot tell which one is "dominant".

But to use these formulas you need to know not only $R$ but also $I$ or $V$. And to analyze these formulas you need to know how $I$ or $V$ change when you change $R$.

If you connect the bulb to a power supply which produces a constant voltage $V$, it's easier to use the second formula. You can use the first one too, but you should remember that when $R$ increases, $I$ changes as well. The result would be the same: $P$ decreases.

If you connect the bulb to a power supply which produces constant current $I$ then both formulas would tell you that $P$ increases when $R$ increases.
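The point that both formulas agree, once $V$, $I$, and $R$ are kept consistent through Ohm's law, can be sketched like this (the numbers are illustrative):

```python
# With V, I, R related by Ohm's law, both power formulas give the same P.
V, R = 240.0, 960.0
I = V / R                                 # Ohm's law fixes the current
assert abs(I**2 * R - V**2 / R) < 1e-9    # same power either way

# Increase R at constant V: the current must be recomputed,
# and both formulas then agree that P has dropped.
R2 = 2 * R
I2 = V / R2
print(V**2 / R2, I2**2 * R2)  # both give 30.0 W
```

The apparent contradiction only arises if you change $R$ in one formula while silently holding the other variable fixed.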

The two equations of relevance are

${\rm power} (P) = \,{\rm voltage} \,(V) \times {\rm current} \,(I)$

${\rm resistance}\, (R) = \dfrac{{\rm voltage}\,(V)}{{\rm current}\,(I)}$

From those two equations you can get $P = I^2R$ and $P = \dfrac {V^2}{R}$

Assume that the resistance of the light bulb does not vary with the voltage across it / the current through it.

In your room you have a light stand with a $240 \, \rm V,\;60\, \rm W$ light bulb in it and you want to replace it with a brighter $240 \, \rm V,\;100\, \rm W$ light bulb.
Using $R = \dfrac {V^2}{P}$, the working resistance of the $60\, \rm W$ bulb is $960\, \Omega$ and that of the $100\, \rm W$ bulb is $576\, \Omega$.
So decreasing the resistance increases the brightness.
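The two resistances above follow directly from $R = V^2/P$:

```python
# Working resistance of each bulb from its rated voltage and power.
V = 240.0
for P in (60.0, 100.0):
    print(f"{P:.0f} W bulb: R = {V**2 / P:.0f} ohm")
# 60 W bulb: R = 960 ohm
# 100 W bulb: R = 576 ohm
```

The brighter (higher-power) bulb is the one with the *lower* resistance.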

The problem with using $P=I^2R$ is that you might get the impression that because the resistance $R$ goes down the power also decreases, but in saying that you have assumed that the current $I$ stays constant.
The current is not constant but actually increases, by approximately the same fractional amount as the resistance decreases (for small changes).
But that is not all, because in the equation $P=I^2R$ the current is squared, so the fractional increase in the current squared $I^2$ is roughly double the fractional decrease in the resistance $R$.
So overall the power dissipated increases as the resistance decreases, which leads to increased brightness.
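The bookkeeping of those fractional changes can be checked numerically. A sketch, using the $240\,\rm V$, $960\,\Omega$ figures from above and a 1% change in resistance:

```python
# At constant V, a small fractional drop in R raises I by roughly the
# same fraction, so I^2 rises by roughly twice that, and the net effect
# on P = I^2 * R is a fractional *increase* about equal to the drop in R.
V, R = 240.0, 960.0
dR = -0.01 * R                  # 1% decrease in resistance
I1, I2 = V / R, V / (R + dR)
print((I2 - I1) / I1)           # ~ +1% increase in current
P1, P2 = I1**2 * R, I2**2 * (R + dR)
print((P2 - P1) / P1)           # ~ +1% net increase in power
```

The two opposing effects do not cancel: the squared current term wins, so less resistance means more power and a brighter bulb.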