Misunderstanding of percentage increase

If something increases from $50$ to $200$, it increases by $300\%$ and has a new value that is $400\%$ of the old value.

Similarly, if something increases from $50$ to $52$, it increases by $4\%$ to a new value that is $104\%$ of the old one.


Percentage increase is $$\frac{\text{new number} - \text{old number}}{\text{old number}} \times 100\%.$$

The right computation is $$\frac{200-50}{50} \times 100\% = 300\%.$$
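As a quick sanity check, here is a minimal Python sketch of the formula above; the helper name `percentage_increase` is just for illustration:

```python
def percentage_increase(old, new):
    """Percentage increase from old to new: (new - old) / old * 100."""
    return (new - old) / old * 100

print(percentage_increase(50, 200))  # 300.0  (new value is 400% of old)
print(percentage_increase(50, 52))   # 4.0    (new value is 104% of old)
```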


The convention is that "percentage increase" is the number of percentage points that are added.

So it is assumed that you always start with $100\%$ of a number and then add an $n\%$ increase to that, so you end up with $(100 + n)\%$ of the original number.

If you take the ratio of the ending amount to the starting amount and multiply by $100\%,$ you get the figure $(100 + n)\%.$ You then have to subtract $100\%$ if what you want is the percentage increase.

Indeed $52$ is $104\%$ of $50,$ but the added amount is only $2,$ which is $4\%$ of $50.$ Likewise $200$ is $400\%$ of $50,$ but the added amount is only $150,$ which is $300\%$ of $50.$
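The same ratio-then-subtract view, spelled out numerically (a sketch; variable names are illustrative):

```python
old, new = 50, 200
ratio_percent = new / old * 100         # 400.0: new value as a percentage of old
increase_percent = ratio_percent - 100  # 300.0: subtract the starting 100%
print(ratio_percent, increase_percent)
```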
