
I found a few other questions (about log, antilog, ...) like this, but couldn't find a complete answer. So: what can we actually compute with a simple calculator (+, -, ×, ÷, sqrt), like sine, log, and some other functions, with good accuracy? And is there a good algorithm for each of them? Thanks

P.S.: One way to find a square root is linear approximation (which doesn't give exact digits), but my teacher gave us an algorithm with which you can find the exact digits. Do we have such algorithms for log and other functions too?

  • 0
    There are series for the functions $\log$, $\sin$, $\cos$, $\tan$ and many more. But whether we can calculate them with reasonable effort using only a simple calculator depends on the argument. The algorithm you mean is probably Heron's method. (But according to your question, the calculator can handle square roots anyway.) (2017-01-25)
  • 0
    In principle, we could calculate most of the functions occurring in practice even if we only have the operations +, -, ×, ÷, but the calculation can require much effort. (2017-01-25)
  • 0
    The current version of the question is still quite broad. Please give concrete functions you want to calculate and at which arguments. $\sin(1)$, for example, is no problem to calculate if we use the Taylor expansion. (2017-01-25)
  • 0
    Thanks for the answers. I found the algorithm on Wikipedia: https://en.m.wikipedia.org/wiki/Methods_of_computing_square_roots (the one named "Digit-by-digit"). My intention in mentioning this is to see whether there is a digit-by-digit algorithm for log, antilog, and other functions as well. (2017-01-26)
  • 0
    Nemexia, there is; search for the CORDIC algorithm. (2017-01-29)

1 Answer

2

I think this is a good question, but it's very broad. So I'll give a general answer with some examples.

To better understand all of this, I suggest you read up on limits and numerical methods.

You can compute almost any function you can imagine with a pocket calculator, pen, and paper. You will need the pen and paper to write down intermediate values. The accuracy will be limited only by the time you spend and the number of digits the calculator gives.

Among the algorithms which use only arithmetic operations (and maybe square roots) are:

  • Infinite series. You have some term $a_n$ which depends on a natural number $n$ and possibly some other variables. The approximation to some function $f$ you get is the partial sum $$f \approx f_N = a_0+a_1+a_2+\dots+a_N$$ Beware! There are two types of series. 1) Convergent series: the error gets smaller as $N$ gets larger, approaching zero, so in the limit you get the exact value. As a rule, the terms $a_n$ need to get smaller and smaller in absolute value for the series to converge. The two most important kinds of convergent series are Taylor series and Fourier series. 2) Asymptotic series: these can give a very good approximation, but if $N$ is too large the error grows again. A good example of a function computed by a series is the exponential function: $$e^x=1+x+\frac{x^2}{2}+\frac{x^3}{2 \cdot 3}+\frac{x^4}{2 \cdot 3 \cdot 4}+\frac{x^5}{2 \cdot 3 \cdot 4 \cdot 5}+\dots$$ This series converges for all $x$, but for larger $x$ you have to take more terms for good accuracy.
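    The partial sum above can be evaluated with nothing but additions, multiplications and divisions. Here is a small sketch in Python (the function name `exp_taylor` and the cutoff of 30 terms are my own choices, not part of the answer):

    ```python
    import math  # used only to check the result at the end

    def exp_taylor(x, n_terms=30):
        """Approximate e^x by the partial sum of its Taylor series.

        Each term is obtained from the previous one via term *= x/n,
        so no factorials are ever computed explicitly.
        """
        total = 1.0  # the n = 0 term
        term = 1.0
        for n in range(1, n_terms):
            term *= x / n  # now term == x^n / n!
            total += term
        return total

    print(exp_taylor(1.0), math.e)  # the two values agree to many digits
    ```

    Computing each term from the previous one mirrors what you would do on a pocket calculator: keep the running term on the display and the running sum on paper.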

  • Infinite products. These are similar to infinite series, only you multiply the terms instead of adding them: $$f \approx f_N = a_0 \cdot a_1 \cdot a_2 \cdot \dots \cdot a_N$$ For the infinite product to converge, the factors $a_n$ need to get closer and closer to $1$. A fun example of an infinite product to compute logarithms is: $$\prod_{k=0}^\infty \frac{2}{1+x^{1/2^k}}=\frac{2}{1+x} \cdot \frac{2}{1+\sqrt{x}} \cdot \frac{2}{1+\sqrt[4]{x}} \cdot \frac{2}{1+\sqrt[8]{x}} \cdot \dots=\frac{2}{x^2-1} \ln x$$
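    Rearranging the identity above gives $\ln x \approx \frac{x^2-1}{2}\prod_{k=0}^{N}\frac{2}{1+x^{1/2^k}}$, which needs only the four arithmetic operations plus the calculator's square-root key. A sketch in Python (the function name and the number of factors are my own choices):

    ```python
    import math  # math.sqrt stands in for the calculator's square-root key

    def log_via_product(x, n_factors=40):
        """Approximate ln(x) via the infinite product
        prod_{k>=0} 2/(1 + x^(1/2^k)) = 2 ln(x) / (x^2 - 1)."""
        prod = 1.0
        r = x  # r holds x^(1/2^k)
        for _ in range(n_factors):
            prod *= 2.0 / (1.0 + r)
            r = math.sqrt(r)  # take one more square root each step
        return (x * x - 1.0) / 2.0 * prod
    ```

    Since $x^{1/2^k}\to 1$ roughly like $1+\ln(x)/2^k$, each extra factor halves the remaining error, so a few dozen square roots already give close to full calculator precision.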

  • Continued fractions. These are a different animal. Basically, you set up some initial value $f_k$ and compute 'from the end': $$f_{k-1}=\frac{1}{a_k+f_k},\quad f_{k-2}=\frac{1}{a_{k-1}+f_{k-1}},\quad \ldots,\quad f=a_0+\frac{1}{a_1+f_1}$$ where the $a_n$ are some numbers. This can be written as $$f=a_0+\cfrac{1}{a_1+\cfrac{1}{a_2+\cfrac{1}{a_3+\cfrac{1}{a_4+ \cdots}}}}$$ If all $a_n$ are positive, the continued fraction converges if and only if the sum $\sum_n a_n$ diverges (the Seidel–Stern theorem). A general continued fraction has the form: $$f=a_0+\cfrac{b_0}{a_1+\cfrac{b_1}{a_2+\cfrac{b_2}{a_3+\cfrac{b_3}{a_4+ \cdots}}}}$$ There are continued fractions for the exponential function, for the logarithm, etc. Any series can be made into a continued fraction, if we want. A simple example of a continued fraction is: $$\cfrac{1}{1+\cfrac{1}{2+\cfrac{1}{3+\cfrac{1}{4+\cdots}}}}=\frac{I_1(2)}{I_0(2)}$$ where $I_0,I_1$ are modified Bessel functions. A more general case for the ratio of Bessel functions is: $$ \frac{x J'_n (x)}{J_n (x)}=n-\cfrac{x^2}{2(n+1)-\cfrac{x^2}{2(n+2)-\cfrac{x^2}{2(n+3)-...}}}$$ Another interesting case is the incomplete gamma function: $$\Gamma (0,x)=\int_x^{\infty} \frac{e^{-p}}{p} dp=\cfrac{e^{-x}}{x+1-\cfrac{1}{x+3-\cfrac{4}{x+5-\cfrac{9}{x+7-\cdots}}}}$$ For more general cases look up Euler continued fractions and Gauss continued fractions.
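    The 'from the end' evaluation described above is easy to carry out mechanically: truncate the fraction, assume some tail value, and fold backwards. A sketch in Python applied to the $I_1(2)/I_0(2)$ example (the helper name `eval_cf` and the truncation depth are my own choices):

    ```python
    def eval_cf(a, tail=0.0):
        """Evaluate a[0] + 1/(a[1] + 1/(a[2] + ...)) 'from the end',
        starting from an assumed value for the truncated tail."""
        f = tail
        for a_n in reversed(a[1:]):
            f = 1.0 / (a_n + f)
        return a[0] + f

    # 1/(1 + 1/(2 + 1/(3 + ...))) = I_1(2)/I_0(2) = 0.6977746579...
    approx = eval_cf([0] + list(range(1, 20)))
    print(approx)
    ```

    Because the partial denominators $1, 2, 3, \dots$ keep growing, the error shrinks faster than factorially, so truncating after a handful of levels already exhausts double precision.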

  • Iterative algorithms. These are more fun and usually very efficient in computing various functions. Basically, they are the generalization of all the above. We have the famous Newton's algorithm for computing roots of equations, including square roots (the so-called Babylonian method), as you mentioned. Logarithms, arctangents and other 'inverse' functions can be computed by iterative algorithms. For example, take $$a_0=x,~~~~b_0=y$$ $$a_1=\frac{a_0+\sqrt{a_0b_0}}{2},~~~~b_1=\frac{b_0+\sqrt{a_0b_0}}{2}$$ $$a_{n+1}=\frac{a_n+\sqrt{a_nb_n}}{2},~~~~b_{n+1}=\frac{b_n+\sqrt{a_nb_n}}{2}$$ Then both sequences converge to the 'logarithmic mean' of $x$ and $y$: $$a_\infty=b_\infty=\frac{x-y}{\ln x-\ln y}$$
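    Setting $y=1$ in the iteration above gives $a_\infty=(x-1)/\ln x$, so the logarithm follows as $\ln x = (x-1)/a_\infty$. A sketch in Python (the function names and the iteration count are my own choices; `math.sqrt` stands in for the calculator's square-root key):

    ```python
    import math

    def log_mean(x, y, iterations=40):
        """Iterate a,b -> ((a+g)/2, (b+g)/2) with g = sqrt(a*b);
        both sequences converge to (x - y)/(ln x - ln y)."""
        a, b = x, y
        for _ in range(iterations):
            g = math.sqrt(a * b)
            a, b = (a + g) / 2.0, (b + g) / 2.0
        return (a + b) / 2.0  # midpoint of the two (nearly equal) values

    def ln(x, iterations=40):
        """ln(x) recovered from the logarithmic mean with y = 1."""
        if x == 1.0:
            return 0.0
        return (x - 1.0) / log_mean(x, 1.0, iterations)
    ```

    The gap $a_n-b_n$ halves at every step, so each iteration buys roughly one more bit of accuracy, much like the digit-by-digit square-root method from the question.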

In any case, take your calculator, pen, and paper, and do some experiments! It's fun. You can check your results using an online tool such as Wolfram Alpha: just paste any number you've got into the search bar, and Wolfram will tell you whether it matches a well-known expression.

  • 0
    Thanks for your good answer. (2017-01-26)