I have a problem to solve. Let's say there is a normal distribution with mean 5000 and standard deviation 1000. I need to know, say, the mean of the biggest 25% of the values. How do I calculate that? I believe the result should be between 6500 and 8000, and closer to the first value, but how do I calculate it correctly? Is there a function for this?
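To illustrate what I mean, here is a quick Monte Carlo sketch (Python, standard library only; the sample size and seed are arbitrary choices of mine) that estimates the mean of the biggest 25% by brute force:

```python
import random

# Draw many samples from N(5000, 1000) and average the top 25%.
random.seed(42)
samples = sorted(random.gauss(5000, 1000) for _ in range(1_000_000))
top = samples[int(0.75 * len(samples)):]  # the biggest 25% of samples
mean_top = sum(top) / len(top)
print(mean_top)  # roughly 6270 for these parameters
```

This gives a numeric estimate, but I would prefer a closed-form way to get the same number.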
Maybe I did not write it clearly. What I need to calculate is, for example, the average weight of the biggest 50% or 25% of a population. I know that I can split the population into, say, 10 buckets, and then I will know that from 0 to 1000 there is 0%, from 1000 to 2000 there is 2%, and so on; from 5000 to 6000 there will be 30%, and from 9000 to infinity almost 0%. If I then reverse this calculation, I can estimate the average of the biggest 50% of the numbers by taking the average of each bucket, starting from the biggest ones, and averaging those. But with only 10 buckets the estimate is not precise enough. I could split it into thousands of buckets to get a more accurate number, but I don't think this is the best way. There must be some formula to calculate this more easily.
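Here is a sketch of the bucket approach I describe, with many narrow buckets so the estimate converges (Python, standard library only; the bucket count and the ±6 sigma range are arbitrary choices of mine, and the bucket probabilities come from the normal CDF via `math.erf`):

```python
import math

def norm_cdf(x, mu, sigma):
    # Normal CDF expressed through the error function.
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu, sigma = 5000, 1000
n_buckets = 10_000                      # many narrow buckets for accuracy
lo, hi = mu - 6 * sigma, mu + 6 * sigma  # covers essentially all the mass
width = (hi - lo) / n_buckets

# Midpoint and probability mass of each bucket, biggest values first.
buckets = []
for i in range(n_buckets - 1, -1, -1):
    a = lo + i * width
    p = norm_cdf(a + width, mu, sigma) - norm_cdf(a, mu, sigma)
    buckets.append((a + width / 2, p))

# Accumulate buckets from the top until 25% of the mass is covered,
# then take the probability-weighted average of their midpoints.
target, acc, weighted = 0.25, 0.0, 0.0
for mid, p in buckets:
    take = min(p, target - acc)
    weighted += mid * take
    acc += take
    if acc >= target:
        break
tail_mean = weighted / acc
print(tail_mean)
```

This works, but it is exactly the thousands-of-buckets brute force I was hoping to avoid; I am looking for the direct formula instead.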