
I have multiple arrays of size 262144, and I am trying to use the following code to normalize the values of each array so that they all fall between 0 and 1, inclusive. To be clear, each array is normalized independently, because each has different values.

var i;
var max = Number.MIN_VALUE;
var min = Number.MAX_VALUE;
for (i = 0; i < array.length; i++) {
    if(array[i]>max) {
        max = array[i];
    }
}

for (i = 0; i < array.length; i++){
    if(array[i]<min) {
        min = array[i];
    }
}

for (i = 0; i < array.length; i++) {
    var norm = (array[i]-min)/(max-min);
    array[i] = norm;
}

However, I know it is not doing this correctly, because when I run the following code, the numbers logged to the console are often above 1.

max = Number.MIN_VALUE;
min = Number.MAX_VALUE;
for (i = 0; i < array.length; i++) {
    if(array[i]>max) {
        max = array[i];
    }
}

for (i = 0; i < array.length; i++) {
    if(array[i]<min) {
        min = array[i];
    }
}

console.log(max);
console.log(min);

What am I doing wrong? Thank you!

  • array[i] should be able to be greater than MIN_VALUE and less than MAX_VALUE, right? Am I misunderstanding something? @SterlingArcher Commented Aug 1, 2016 at 18:49
  • @gcampbell that was intentional. Is that a problem? Commented Aug 1, 2016 at 18:50
  • @user6645395 No, it's not a problem. I've deleted my comment. Commented Aug 1, 2016 at 18:54
  • @user6645395 no my eyes just played tricks on me. You're right! Commented Aug 2, 2016 at 13:24
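
As the comments above touch on, Number.MIN_VALUE is the smallest positive number JavaScript can represent (about 5e-324), not the most negative one, so seeding max with it only finds the true maximum when the data contains positive values, which evidently holds here. A minimal sketch of the pitfall and of the usual -Infinity/Infinity seeds (the sample array is just for illustration):

var sample = [-5, -2, -9];

// Number.MIN_VALUE is the smallest positive double, so no negative element ever replaces it.
var max = Number.MIN_VALUE;
var min = Number.MAX_VALUE;
for (var i = 0; i < sample.length; i++) {
    if (sample[i] > max) max = sample[i];
    if (sample[i] < min) min = sample[i];
}
console.log(max, min); // 5e-324 -9  (max is wrong)

// Seeding with -Infinity / Infinity works for any numeric data.
max = -Infinity;
min = Infinity;
for (var i = 0; i < sample.length; i++) {
    if (sample[i] > max) max = sample[i];
    if (sample[i] < min) min = sample[i];
}
console.log(max, min); // -2 -9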

2 Answers


Your code works fine for me. Running it on the example array below produces this normalized array:

[
  0,
  0.821917808219178,
  0.0684931506849315,
  0.3835616438356164,
  1
]

Note that min and max do not have to be between 0 and 1 since they represent the minimal and maximal value of your original array.

var array = [4, 64, 9, 32, 77];

var i;
var max = Number.MIN_VALUE;
var min = Number.MAX_VALUE;

// Find the max of the original array.
for (i = 0; i < array.length; i++) {
    if (array[i] > max) {
        max = array[i];
    }
}

// Find the min of the original array.
for (i = 0; i < array.length; i++) {
    if (array[i] < min) {
        min = array[i];
    }
}

// Normalize every element into [0, 1].
for (i = 0; i < array.length; i++) {
    var norm = (array[i] - min) / (max - min);
    array[i] = norm;
}

// Re-scan the normalized array to confirm its extremes.
max = Number.MIN_VALUE;
min = Number.MAX_VALUE;
for (i = 0; i < array.length; i++) {
    if (array[i] > max) {
        max = array[i];
    }
}

for (i = 0; i < array.length; i++) {
    if (array[i] < min) {
        min = array[i];
    }
}

console.log(array);

console.log(max); // 1
console.log(min); // 0

Edit: As you can see in the example, the min and max values after normalization should be 0 and 1, which is the case.
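
A single-pass variant of the same min/max scan is sketched below for reference; it behaves like the two loops above but only walks the array once, which matters for 262144-element arrays (the helper name minMax is just for illustration):

// One pass finds both extremes; Infinity/-Infinity seeds work for any numeric data.
function minMax(array) {
    var min = Infinity;
    var max = -Infinity;
    for (var i = 0; i < array.length; i++) {
        if (array[i] < min) min = array[i];
        if (array[i] > max) max = array[i];
    }
    return { min: min, max: max };
}

var extremes = minMax([4, 64, 9, 32, 77]);
console.log(extremes.min, extremes.max); // 4 77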


5 Comments

  • Why would min and max contain the values for my original array? For the second block of code which I posted, don't min and max get reset to the min and max of the new array? Thank you for your response.
  • Also, when I log the full array to the console, I get values above 1.
  • As far as I can see, in the second snippet you do not do any normalization at all, you only determine the min/max value of the array. Could you please elaborate your thought process?
  • Sorry, I didn't explain what I meant. I meant that in the first two loops, I find the min and max of the original array, and in the third, I normalize the data. In the last two, I reset max and min to what should hopefully be 0 and 1, because I set them based on the normalized array. I was stating why I disagree with the following statement: "Note that min and max do not have to be between 0 and 1 since they represent the minimal and maximal value of your original array."
  • See my edit. Determining the min and max after the normalization seems to work fine as well. What values are you getting, if not 0 and 1?

I found the solution thanks to this answer! The problem was simply that the numbers were being compared as strings, because I had not converted them to floats.
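
For anyone hitting the same symptom: strings compare lexicographically, character by character, so values read in as text (for example from a file or a form field) must be converted to numbers before the min/max and normalization loops behave. A small sketch of the difference and one way to convert (the variable names are just for illustration):

// String comparison is lexicographic, so "9" sorts after "64".
console.log("9" > "64");  // true
console.log(9 > 64);      // false

// Converting the values once up front fixes the comparisons.
var raw = ["4", "64", "9", "32", "77"];
var numbers = raw.map(Number);  // or parseFloat on each element
console.log(Math.max.apply(null, numbers)); // 77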

