"Normalize" values to sum 1 but keeping their weights

Why not just divide each number in your sample by the sum of all the numbers in your sample?


The answers provided here won't work when the set contains negative numbers.

The function you're looking for is called the softmax function. The softmax function is often used in the final layer of a neural network-based classifier. Softmax is defined as:

$$f_i(x) = \frac{e^{x_i}}{\sum_j e^{x_j}}$$
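A minimal sketch of this formula in Python (the max-subtraction trick is a standard addition for numerical stability and does not change the result):

```python
import math

def softmax(xs):
    """Map arbitrary reals (including negatives) to positive weights summing to 1."""
    m = max(xs)  # subtract the max before exponentiating to avoid overflow
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([-1.0, 2.0, 0.5]))  # all outputs positive, summing to 1
```

Note that softmax preserves the *ordering* of the inputs but not their ratios; it weights them by the exponential of their values.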


From the text description, it seems this is what you want:

  • calculate the sum of all elements
  • divide each element by the sum

Note, however, that your example $[40, 10]$ then normalises to $[0.8, 0.2]$, not $[0.75, 0.25]$; the latter would not preserve the ratio of the two elements.
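The two steps above can be sketched as a one-liner in Python (this assumes all values are positive and the sum is nonzero; for sets with negative numbers, see the softmax answer):

```python
def normalize(xs):
    """Divide each element by the sum, so the results sum to 1 and keep their ratios."""
    total = sum(xs)
    return [x / total for x in xs]

print(normalize([40, 10]))  # [0.8, 0.2]
```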

Tags:

Statistics