
On our quiz today, I came across the following question.

Assume a telemarketer has a 20% chance of selling each caller an item, and an 80% chance of not making a sale.

Each call in which the telemarketer makes a successful sale takes 2.5 minutes, and each call without a sale takes 0.5 minutes. If the telemarketer makes 100 calls, find the expected total amount of time they take.

I was conflicted between two methods.

Method 1: $E(Y) = (2.5)(0.2) + (0.5)(0.8) = 0.9$ minutes per call, which I then multiplied by 100 to get 90 minutes.

Method 2: Since the question can be set up as Bernoulli trials, $E(Y) = \mu = np$, so $100(0.2)(2.5) = 50$ minutes.

I think I either misinterpreted the question or set something up incorrectly. In particular, I'm not sure whether Method 1 is set up correctly, or whether it's even usable in this case.

1 Answer


Your Method 2 ignores the time spent on calls that fail to sell. If you add $100(0.8)(0.5) = 40$ minutes to the expected time for successful sales, which is what you calculated, you get the right answer: $50 + 40 = 90$ minutes.

  • @Anon: I believe your formula says the average time for $n$ events is $n$ times the average time for each event. In Method 1 you calculated the average time for each event. In Method 2, multiplying by 0.2 gave the expected number of sales, and the 2.5 gave the expected time for those sales, but you ignored the 0.8 of the calls that don't sell, each of which takes 0.5 minutes. Accounting for those leads to the same answer as Method 1, in fact the same expression up to the distributive law. – 2012-10-01
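The arithmetic above can also be checked empirically. This is a minimal Monte Carlo sketch (the function name and parameters are my own, chosen to match the numbers stated in the question) that simulates many batches of 100 calls and averages the total time:

```python
import random

random.seed(0)

def batch_time(n_calls=100, p_sale=0.2, t_sale=2.5, t_fail=0.5):
    # Total time for one simulated batch of calls: each call is a
    # Bernoulli trial that takes t_sale minutes on a sale, t_fail otherwise.
    return sum(t_sale if random.random() < p_sale else t_fail
               for _ in range(n_calls))

# Average the batch total over many independent simulations.
n_trials = 100_000
avg = sum(batch_time() for _ in range(n_trials)) / n_trials

# The exact expectation is 100 * ((0.2)(2.5) + (0.8)(0.5)) = 90 minutes,
# so avg should be close to 90.
print(avg)
```

With 100,000 trials the sample mean lands well within a fraction of a minute of the exact value of 90, matching Method 1 (and the corrected Method 2).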