What would happen to a star if a Dyson sphere lined with mirrors reflected a significant portion of the star's light back to the star?

A partially reflective Dyson sphere is equivalent to asking what happens if we artificially increase the opacity of the photosphere - akin to covering the star with large starspots - because by reflecting energy back, you limit how much (net) flux can actually escape from the photosphere.

The global effects depend on the structure of the star and differ between one that is fully convective and one like the Sun, which has a radiative interior and a relatively thin convective envelope on top. The phenomenon can be treated in a similar way to the effects of large starspots. The canonical paper on this is by Spruit & Weiss (1986). They show that the effects have a short-term and a long-term character, with the dividing point being the thermal timescale of the convective envelope, which is of order $10^{5}$ years for the Sun.
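As a rough illustration of where that number comes from (a back-of-the-envelope sketch, not a calculation from the paper - the ~2% envelope mass fraction and the use of the envelope's gravitational energy as a proxy for its thermal content are assumptions), the thermal timescale is roughly the energy content of the convective envelope divided by the solar luminosity:

```python
# Rough estimate of the thermal timescale of the Sun's convective envelope:
# t ~ G * M_sun * M_env / (R_sun * L_sun).
# The ~2% envelope mass fraction is an assumed round number, and the
# envelope's gravitational energy stands in for its thermal content.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30       # solar mass, kg
R_sun = 6.957e8        # solar radius, m
L_sun = 3.828e26       # solar luminosity, W
M_env = 0.02 * M_sun   # assumed mass of the convective envelope, kg

E_env = G * M_sun * M_env / R_sun   # characteristic energy content, J
t_thermal = E_env / L_sun           # time to radiate that energy away, s
print(f"t_thermal ~ {t_thermal / 3.15e7:.1e} yr")   # a few 10^5 yr
```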

On short timescales the nuclear luminosity of the Sun is unchanged, and the stellar structure and surface temperature remain the same. As only a fraction of the flux from the Sun ultimately gets into space, the net luminosity seen at infinity is decreased. However, things change if you leave the Dyson sphere in place for longer.

On longer timescales, in a star like the Sun, the luminosity will tend to stay the same because the nuclear-burning core is unaffected by what is going on in the thin convective envelope. However, if a large fraction of the luminosity is being reflected back, then in order to radiate the same luminosity it turns out that the radius increases and the photosphere gets a little hotter. In this case, the radius squared times the fourth power of the photospheric temperature must increase to ensure that the luminosity observed beyond the Dyson sphere stays the same - i.e. $R^2T^4(1 - \beta) = R_{\odot}^2 T_{\odot}^4$, where $\beta$ is the fraction of the solar luminosity reflected back by the sphere.
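Written out, this relation simply comes from requiring the net Stefan-Boltzmann luminosity that escapes past the sphere to equal the (unchanged) solar luminosity:

$$4\pi R^2 \sigma T^4 (1-\beta) = 4\pi R_{\odot}^2 \sigma T_{\odot}^4 \quad\Longrightarrow\quad R^2 T^4 (1-\beta) = R_{\odot}^2 T_{\odot}^4 .$$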

The calculations of Spruit & Weiss (1986) indicate that for $\beta=0.1$ the surface temperature increases by just 1.4%, whilst the radius increases by 2%. Thus $R^2 T^4$ is increased by a factor of 1.09. This is not quite $(1-\beta)^{-1}$ because the core temperature and luminosity do drop slightly in response to the increased radius.
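A quick numerical check (a sketch based on the rounded percentages quoted above, not on numbers taken directly from the paper) shows how this compares with the idealized factor $(1-\beta)^{-1}$:

```python
# Compare the quoted changes for beta = 0.1 with the idealized factor
# 1/(1 - beta) that would apply if the nuclear luminosity were exactly fixed.
beta = 0.1
dT = 0.014   # quoted fractional rise in surface temperature (~1.4%)
dR = 0.02    # quoted fractional rise in radius (~2%)

factor = (1 + dR)**2 * (1 + dT)**4   # resulting rise in R^2 T^4
ideal = 1.0 / (1.0 - beta)           # factor needed if L were unchanged
print(f"R^2 T^4 rises by ~{factor:.2f}; the idealized value is {ideal:.2f}")
# The rounded percentages give ~1.10, consistent with the quoted 1.09 and a
# little below 1.11, because the core luminosity drops slightly.
```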

It is probably not appropriate to extrapolate the Spruit & Weiss treatment quantitatively to very large values of $\beta$ - but then, why would you build a Dyson sphere that was highly reflective? Qualitatively, the envelope of the star would expand massively in response to the heat being deposited in it from outside, and in that case the photosphere might actually become cooler, despite the extra heat inflow.

The above discussion holds for the Sun because it has a very thin convection zone, and conditions in the core are barely affected by conditions at the surface. As the convection zone thickens (for example, in a lower-mass main-sequence star), the response is different. The increase in radius becomes more pronounced; to maintain hydrostatic equilibrium the core temperature decreases, and hence so does the rate of nuclear energy generation. The luminosity of the star falls and the surface temperature stays roughly the same.