So I have an array of values that I need to scale down while maintaining a minimum value for each scaled value.
For example, let's say I have an array of values [1, 1, 3, 5] with a minimum scale factor of .2.
By normalizing the array, we get [.1, .1, .3, .5]. However, enforcing the minimum scale factor, we'd have [.2, .2, .3, .5], which sums to 1.2, not 1.
My thinking was to iterate over the array, first setting every value that would fall below the minimum to the minimum, while keeping a carry variable that tracks how much still needs to be redistributed among the elements that were above the minimum.
Then, for each value that was above the minimum, subtract its proportional share of the carry.
So with the example above, we'd subtract 3/8 * .2 from .3, and 5/8 * .2 from .5 to get [.2, .2, .225, .375].
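A sketch of the approach described above, in Python (the function name `scale_with_min` is just a placeholder):

```python
def scale_with_min(values, minimum):
    """Normalize `values` so they sum to 1, clamping each share at
    `minimum` and redistributing the excess among the other entries."""
    total = sum(values)
    normalized = [v / total for v in values]
    # Carry: how much the clamped entries were pushed up in total.
    carry = sum(minimum - v for v in normalized if v < minimum)
    # Combined weight of the entries that stay above the minimum.
    over_total = sum(v for v in normalized if v >= minimum)
    result = []
    for v in normalized:
        if v < minimum:
            result.append(minimum)
        else:
            # Subtract this entry's proportional share of the carry.
            result.append(v - carry * v / over_total)
    return result

print(scale_with_min([1, 1, 3, 5], 0.2))  # ~[.2, .2, .225, .375]
```

Note that this simple version assumes subtracting the carry never pushes an entry below the minimum itself; if it can, the clamp-and-redistribute step would have to be repeated.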
Is there any other way to do this more efficiently? Or an alternative way to scale the remaining values?
Edit: Sorry, scaling might be the wrong term; in the end, the array's values are to be divided up so that each value changes with respect to the total.
I'll explain the specific implementation so that the question might be more clear:
I have a number of posts, and each post is to be shown for a certain amount of time before fading out, after which the next post is shown. I want the delay between posts to depend on the number of words in each post, but also to be constrained to at least some minimum value.
There is a total amount of time for all of the posts to be shown, and that time is supposed to be split up among the posts.