Is there any way to calculate x/y without using division? For example, 1/10 = 1 * 0.1. I'm asking because division is slower than multiplication on most processors.
Use multiplication instead of division
-
@Raphael Actually, both integer and floating-point division have significantly more latency than multiplication, according to the Intel optimization manual: http://download.intel.com/design/processor/manuals/248966.pdf – 2011-11-24
3 Answers
No. There's no free lunch: the most efficient way to calculate $x/y$ for general $y$ is to ask the hardware to divide $x$ by $y$.
Obviously, in analogy to your example, $x/y = x \cdot \frac{1}{y}$ if you happen to have $\frac{1}{y}$ handy. Maybe if you're dividing many, many numbers by the same $y$, inverting $y$ first and then doing a bunch of multiplications will save some cycles. But see my caveat below.
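A minimal sketch of that idea in C, assuming a hypothetical array of doubles that all need dividing by the same $y$:

```c
#include <stddef.h>

/* Hypothetical helper: divide every element of xs by the same y.
 * One division computes the reciprocal; the loop then multiplies,
 * trading n divisions for n multiplications. Because 1.0 / y is
 * rounded before use, results may differ from a true x / y in the
 * last bit. */
void divide_all(double *xs, size_t n, double y)
{
    double inv = 1.0 / y;   /* the only division */
    for (size_t i = 0; i < n; i++)
        xs[i] *= inv;
}
```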
Another special case is when $y$ is an integer power of 2: the division can then be performed efficiently by bit shifting, if $x$ is also an integer, or by adjusting the exponent of $x$, if $x$ is an IEEE floating-point number of the form $a \cdot 2^b$.
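As an illustration, here is a sketch of both tricks for $y = 8 = 2^3$ (the function names are my own; `ldexp` from `<math.h>` scales a double by a power of two):

```c
#include <stdint.h>
#include <math.h>

/* Integer case: x / 8 is a right shift by log2(8) = 3.
 * (For signed integers the shift rounds toward negative infinity,
 * not toward zero, so it is not a drop-in replacement there.) */
uint32_t div_by_8(uint32_t x)
{
    return x >> 3;
}

/* Floating-point case: ldexp(x, -3) computes x * 2^-3 by
 * adjusting the exponent field; no division is performed. */
double div_by_8_fp(double x)
{
    return ldexp(x, -3);
}
```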
CAVEAT: Please do not try to "optimize division" by hand:
- The compiler you are using is very good at recognizing special cases in which division can be optimized away or replaced by faster operations, and it will almost certainly do a better job of this than you (see the sketch after this list).
- Division is extremely unlikely to be the biggest performance bottleneck in your program, so time spent hand-optimizing it is unlikely to be time well spent.
- Your future self, and anyone else who has to maintain your code, will thank you for writing the straightforward, readable version.
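To see the first point in action, compile something like the sketch below with optimization enabled (e.g. `gcc -O2`) and inspect the generated assembly: mainstream compilers typically replace the constant division with a multiply-and-shift sequence on their own.

```c
/* Division by a constant: with optimization on, compilers such as
 * GCC and Clang usually emit a multiply-and-shift sequence here
 * rather than a division instruction. */
unsigned div_by_10(unsigned x)
{
    return x / 10u;
}
```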
-
This is not only a good answer, but almost certainly the right one. – 2011-11-24
I'm sure this is not the answer you're looking for, but (for $x, y, b > 0$) there's always $x/y = b^{\log_b(x) - \log_b(y)}$, which (with $b = 10$ and using log tables) was the usual way division was performed, to about five digits of precision, in the days before computers.
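For illustration only, here is a sketch of that identity in C with $b = 10$; with hardware `log`/`exp` this is slower and less accurate than a plain division, so it is purely of historical interest:

```c
#include <math.h>

/* Division via logarithms: x / y = 10^(log10(x) - log10(y)).
 * Valid only for x, y > 0; shown to illustrate the log-table
 * method, not as an optimization. */
double log_divide(double x, double y)
{
    return pow(10.0, log10(x) - log10(y));
}
```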