
This is all curiosity. It's just something that's been stuck in my head today, and I thought I'd throw it out there.

With my Amazon rewards card, I get 3% cash back on purchases from Amazon and 1% everywhere else (ok, 2% some places, but that's irrelevant right now). I've figured out the following:

Final Cost = Sale Price - (Sale Price * Cash Back)

To figure out whether I should buy an item off of Amazon or, say, pick it up at Wal-Mart, I can just pick the minimum of Final Cost A and Final Cost B, or min(FA,FB). This is simple enough. What I'm wondering is if there's another way of framing this that just compares the distance between the two Sale Prices (SA, SB). In other words, there's got to be a point at which the Amazon price reaches a threshold where, despite the larger amount of cash back, it still costs more than the alternative product. And maybe there's a formula or rule that, assuming the cash back rates remain constant, only requires me to determine whether SA is within a certain distance of SB to know which is actually less expensive.
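The min-of-final-costs approach described above can be sketched in a few lines of Python (the prices are hypothetical; the 3% and 1% rates are the ones from the question):

```python
def final_cost(sale_price, cash_back):
    """Effective cost after a fractional cash-back rate."""
    return sale_price - sale_price * cash_back

# Hypothetical example: a $100 Amazon listing (3% back)
# vs. a $99 Wal-Mart listing (1% back).
amazon_cost = final_cost(100.00, 0.03)   # about 97.00
walmart_cost = final_cost(99.00, 0.01)   # about 98.01
cheaper = min(amazon_cost, walmart_cost)
```

So in this made-up example the Amazon listing wins even though its sticker price is a dollar higher.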

1 Answer

If you get $3\%$ back, your final price is $0.97$ times the sale price, so compare $0.97$ times the Amazon price to $0.99$ times the Wal-Mart price. If you only want to multiply once, you can divide the two coefficients and compare $\frac{0.97}{0.99}=0.9797\ldots$ times the Amazon price to the bare Wal-Mart price. Yes, you might round that to $0.98$.
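This single-multiplication rule can be sketched as follows (a minimal sketch, assuming the $3\%$/$1\%$ rates stay constant; prices in the example are hypothetical):

```python
# Compare 0.97 * A < 0.99 * W by dividing through by 0.99:
# A * (0.97 / 0.99) < W, i.e. scale the Amazon price once and compare.
RATIO = 0.97 / 0.99   # about 0.9798

def amazon_is_cheaper(amazon_price, walmart_price):
    """True when the Amazon listing costs less after cash back."""
    return RATIO * amazon_price < walmart_price

# A $100 Amazon price beats a $99 Wal-Mart price (97.98 < 99),
# but a $102 Amazon price does not (99.94 > 99).
```

Equivalently, the Amazon price can run up to about $2\%$ ($0.99/0.97 \approx 1.0206$) above the Wal-Mart price before it stops being the better deal.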