I'm trying to find a way to compute a coefficient ranging from 0 to 1, where 0 means perfect balance and 1 means the worst imbalance (all requests go to one server).
Here's a practical example... imagine a farm with 3 servers in it. In one day I got 9000 requests served by those servers, split between the 3 of them. Now, here are two extreme scenarios: best case, each server receives 3000 calls, so my coefficient should give me 0 (perfect balance). Worst case, all 9000 calls land on a single server, where my coefficient should compute to 1. Then you have all the possibilities in between, like... 4500 on server 1, 2250 on server 2 and 2250 on server 3 should give a coefficient around 0.5, meaning only half the load is well balanced.
So I'm looking for a formula (with no luck so far) that computes this kind of coefficient, so that when I plot it against the system load, I can tell whether my farm configuration is efficient just by looking at the graph.
I want the formula to be adaptable (changing the number of servers in the farm) so I can simulate different scenarios...
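Not a definitive answer, but here's one candidate I've seen used, sketched in Python (the function name `imbalance` and the exact normalization are my own assumptions): measure how far the busiest server's load sits between the fair share (total/n) and the whole total. It hits 0 and 1 exactly on your best and worst cases and works for any number of servers, though note it gives 0.25 rather than 0.5 for the 4500/2250/2250 case, so the in-between scaling may not match your intuition:

```python
def imbalance(loads):
    """Balance coefficient: 0 = perfectly even, 1 = all traffic on one server.

    Hypothetical formula: position of the busiest server's load
    between the fair share (total/n) and the total itself.
    """
    total = sum(loads)
    n = len(loads)
    if total == 0 or n <= 1:
        return 0.0  # no traffic, or a single server: nothing to balance
    fair = total / n
    return (max(loads) - fair) / (total - fair)

print(imbalance([3000, 3000, 3000]))  # 0.0  (perfect balance)
print(imbalance([9000, 0, 0]))        # 1.0  (everything on one server)
print(imbalance([4500, 2250, 2250]))  # 0.25 under this particular formula
```

If the in-between behavior matters to you, a different normalization (e.g. based on the standard deviation of the per-server loads instead of just the maximum) would give a different curve between the same 0 and 1 endpoints.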
So anyone can point me in the right direction? :-)
Thanks in advance!!!