Finding the average of an array using JS

With ES6 you can turn Andy's solution into a one-liner:

const average = array => array.reduce((a, b) => a + b) / array.length;
console.log(average([1, 2, 3, 4, 5])); // 3
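
Note that reduce with no initial value throws a TypeError on an empty array. A variant (just a sketch; safeAverage is an illustrative name, not from the answers here) passes 0 as the initial value and guards the division:

// Tolerates an empty array: reduce starts from 0, and we avoid
// dividing by zero (returning NaN here is one reasonable choice)
const safeAverage = array =>
  array.length ? array.reduce((a, b) => a + b, 0) / array.length : NaN;
console.log(safeAverage([])); // NaN instead of a TypeError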

You calculate an average by adding all the elements and then dividing by the number of elements.

var grades = [80, 77, 88, 95, 68]; // the sample data from the question
var total = 0;
for (var i = 0; i < grades.length; i++) {
    total += grades[i];
}
var avg = total / grades.length; // 408 / 5 = 81.6

The reason you got 68 is that your loop keeps overwriting the average, so the final value is simply the result of the last calculation. And your division and multiplication by grades.length cancel each other out.
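
For illustration, a loop along these lines reproduces the 68 (this is a sketch of the kind of code that description implies, not the question's actual code): each pass overwrites avg, and multiplying and dividing by grades.length leaves just grades[i], so the final value is the last grade.

var grades = [80, 77, 88, 95, 68];
var avg = 0;
for (var i = 0; i < grades.length; i++) {
    // * grades.length and / grades.length cancel: this is just grades[i]
    avg = grades[i] * grades.length / grades.length;
}
console.log(avg); // 68, the last grade, not the average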


For the second part of your question, you can use reduce to good effect here:

const grades = [80, 77, 88, 95, 68];

function getAvg(grades) {
  // 0 is reduce's initial value; acc is the running sum, c the current grade
  const total = grades.reduce((acc, c) => acc + c, 0);
  return total / grades.length;
}

const average = getAvg(grades);
console.log(average); // 81.6
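
To see what reduce is doing, here is the accumulator step by step for these grades (acc starts at 0, the initial value passed as reduce's second argument):

// 0   + 80 -> 80
// 80  + 77 -> 157
// 157 + 88 -> 245
// 245 + 95 -> 340
// 340 + 68 -> 408
console.log(getAvg(grades)); // 408 / 5 = 81.6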

The other answers have given good insight into why you got 68, so I won't repeat it here.


The MacGyver way, just for lulz

var a = [80, 77, 88, 95, 68];

console.log(eval(a.join('+')) / a.length); // 81.6
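
For the curious, the joined string is what makes this work; eval just evaluates it as an arithmetic expression (fine for lulz, best avoided in real code):

console.log(a.join('+')); // "80+77+88+95+68"
// eval("80+77+88+95+68") -> 408, and 408 / 5 = 81.6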