
There is an algorithms textbook that I'm reading to teach myself asymptotic analysis. To demonstrate that not all two-level nested for loops are $\Theta(n^2)$, the book presents the following code to be analyzed...

/* θ(n log n) */
int sum = 0;
for (int i = 1; i <= n; i *= 2)
    for (int j = 1; j <= n; j++)
        sum++;

The book calculates the running time of the above implementation to be $\Theta(n \log n)$. It arrives at that $\Theta(n \log n)$ running time after first suggesting that the intermediate Sigma notation is $\sum_{i=1}^{\log n} n = n \log n$.

Before I ask my question, I'll first set up its context by showing another block of code with $\Theta(n^2)$ running time — whose corresponding Sigma notation is $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$...

/* θ(n^2) */
int sum = 0;
for (int i = 0; i <= n - 1; i++)
    for (int j = n - 1; j >= i; j--)
        sum++;

If the above-listed $\Theta(n^2)$ code were run with $n = 8$, then the value of the variable sum would end up being 36. Correspondingly, the manual calculation of the $\Theta(n^2)$ closed-form summation equation ($\frac{n(n+1)}{2}$) also works out to 36 if you plug the value 8 into its $n$.

If — on the other hand — the above-listed $\Theta(n \log n)$ code were run with $n = 8$, then the value of its sum variable would end up being 32. However, the manual calculation of $n \log n$ — where $n = 8$ — works out to 24. Because asymptotic analysis conventionally assumes $\log_2$, $\log 8 = 3$, and $8 \cdot 3 = 24$.

So now, here are my questions:

  1. How come the $\Theta(n^2)$ code's summation and the summation of its corresponding Sigma notation's closed form equation both work out to be 36 — but the $\Theta(n \log n)$ Sigma notation equation calculation does not jibe with the value calculated by the corresponding $\Theta(n \log n)$ code listed above?

  2. What is the relationship between the 24 that comes out of the $\Theta(n \log n)$ Sigma notation equation calculation and the corresponding code listed above?

  3. What would I need to change in the above-listed $\Theta(n \log n)$ code to make the final value of the summation equal to the summation calculated in its corresponding Sigma notation equation?

  4. What would the $\Theta(n \log n)$ Sigma's closed-form equation ($n \log n$) look like fully expanded — where $n = 8$?

Thank you in advance for your answers.


EDIT: Please, don't let my use of MathJax fool you into thinking I'm some kind of Math dude. 'Coz I'm not. I suck at Math. Which is the very reason why I couldn't answer the question on my own. I also suck at Math jargon. Therefore, I would not be offended in the least if any answers and comments used ELI5-level plain English. In fact, I'm gonna have to insist on jargon-less, ELI5-level plain English. Please? Thanks.

  • 1
    Might be better suited to cs.stackexchange.com. To try to answer your question: Theta and Big-Oh ignore constants. It makes no sense to evaluate at specific values of $n$ and compare. I recommend you revisit the definitions.2017-01-24
  • 0
    Why even use Sigma notation then, if it doesn't represent an actual mathematical summation? Why wouldn't just the Big-O (_asymptotic_) notation on its own suffice?2017-01-24
  • 0
    Furthermore, the knee-jerk, CS 100 canned response — "_Big-O ignores constants and low-order terms_" — still does not satisfy my curiosity. The crux of my post, is essentially asking why some Sigma notation closed-form summations jibe with their corresponding programmatic summations and others don't. I'd like to understand what the difference is. Replying with, "_RTFM_" doesn't get me closer to that understanding. Trust me, I have already _RTFM_. And like your comment @Aryabhata, _TFM_ wasn't helpful either.2017-01-24
  • 0
    Wouldn't that be a capricious and groundless double standard, @William? There are dozens of existing questions in math.stackexchange asking very similar questions: | [Algorithm complexity](http://math.stackexchange.com/questions/1243723/help-understanding-the-complexity-of-my-algorithm-summation?rq=1) | [Asymptotic notation](https://math.stackexchange.com/questions/701244/calculating-run-times-of-programs-with-asymptotic-notation?rq=1) | [θ summation](http://math.stackexchange.com/questions/267810/is-this-why-this-summation-is-equivalent-to-this-theta-notation?rq=1) | ...(etc..)2017-01-25
  • 0
    The answer to _"The crux of my post, is essentially asking why some Sigma notation closed-form summations jibe with their corresponding programmatic summations and others don't"_ is **constants are ignored in BigOh and Theta** (assuming you are referring to the value of `sum` as the programmatic summation). Another answer is, why do you even expect them to "jibe"? (What does "jibe" even mean here?).2017-01-25
  • 0
    You insist on making this about CS — _for what I can only assume are emotional reasons_ — @Aryabhata. When for me, it's not about CS. But about learning the mathematical concepts of summations, mathematical induction and Sigma notation. The hand-wavy **CS050** answer, "_It just is! So deal with it!_" — is not what we call "_**learning**_" where I'm from. You may be conditioned to just accept things without questioning them, in your part of the world. If that's the case, you have my sincere sympathy. But we consider that way of thinking anathema to _**real**_ learning — in my neck of the woods.2017-01-26
  • 0
    Sorry to say this, but it seems you are accustomed to _spoon feeding_. _"It is the constant, look up the definition"_ is very much intended to make you learn. If you don't get that, well.... Anyway, end of discussion for me.2017-01-26
  • 0
    Backpedaling now that I've already answered my own Qs, @Aryabhata? Predictable. You never said anything close to — "_It is the constant, **look up the definition**_". Instead, you kept on about θ, & bleating out your **_CS050_** mantra. Obviously what you are about, is [_Racking Up_...](https://math.stackexchange.com/questions/54341/solving-recurrences-of-the-form-tn-atn-a-thetan-log-2-a?rq=1) | [_Points_](https://math.stackexchange.com/questions/116703/solving-recurrence-tn-t-lceil-n-2-rceil-t-lfloor-n-2-rfloor-thet?rq=1); not about _**teaching**_ anything. Don't be a hypocrite all yer life.2017-01-27
  • 0
    @KangarooCourt — Summarily closing this on trumped-up charges, is also predictable. What would I have to gain by rewording my questions, when they already are ELI5-level clear? To earn points? Un-Math-dudes like moi, don't give 2 sh*ts about points or "_rep_". Besides. I already answered my own questions. That self-learned knowledge is its own reward. So what do I care if this post is on hold 'til $\infty$? No skin off my teeth. The only ones who lose are all the other un-Math-dudes who also prefer ELI5-level answers. But of course that's @KangarooCourt's elitist plan, it looks like.2017-01-27

2 Answers

1

The notation $\sum_{i=1}^{\log n}$ confuses you. The outer loop in the first example executes with indices $i = 2^k$, where $k \in [0, \log n]$. The number of executions of that loop is therefore $\log(n) + 1$, not $\log(n)$.

Of course, asymptotic time complexity of this piece of code will still be $\Theta(n \log n)$.

  • 0
    Which one of the four (_count 'em_) questions in my OP — _five counting the one in the title_ — confuses you? Because you've addressed none of them.2017-01-24
  • 1
    @algoHolic - $n \cdot (\log(n) + 1)$ with $n=8$ will give you 32. And please be creative, but not aggressive2017-01-24
  • 0
    I wasn't being creative. I was being _repetitive_. Meaning: _I only repeated your own usage of "confused", @HEKTO_. Nor was I being "_creative_" with the $\sum_{i=1}^{\log n} n = n \log n$ Sigma notation. Like I've already said in my OP — I'm only repeating the Sigma notation that I copied from an algorithm analysis text book I'm reading. Like I also said — The book based its $\Theta(n \log n)$ running-time solution on what the book claims was the summation of the closed-form of the $\sum_{i=1}^{\log n} n = n \log n$ Sigma notation that the book presented to support its solution...2017-01-25
  • 0
    ...There are dozens of algorithm-related, Sigma notated summations that contain constant terms in their closed-form equations. The above-mentioned $\sum_{i=1}^{n} i = \frac{n(n-1)}{2}$ is one example. So, as a naive self-learner, I gotta figure that the author intentionally presented his $\sum_{i=1}^{\log n} n = n \log n$ summation for some specific reason. Which begs the question: **_What IS the reason the author uses_ $\sum_{i=1}^{\log n} n = n \log n$ _to represent the summation of the above_ $\Theta(n \log n)$ _code?_** — (_that I copy/pasta-ed from the book, I hasten to repeat_)...2017-01-25
  • 0
    ...If $\sum_{i=1}^{\log n} n = n(\log(n) + 1)$ was indeed what the author intended, there is no obvious reason why he would not have just tacked the stupid $+ 1$ onto the end of his equation. After all, by that point in the book, he'd already drilled the universally parroted, "_Big-O ignores constants and low-order terms_" dogma into the reader. So it would have been abundantly clear to me how to deal with the $+ 1$ term in the Sigma notation, if he hadda included it. Granted — maybe the author himself is confused.2017-01-25
  • 0
    I didn't create these either. Cormen et al. did... $$\sum_{k=0}^n k^2 = \frac{n(n+1)(2n+1)}{6}$$ $$\sum_{k=0}^n k^3 = \frac{n^2(n+1)^2}{4}$$ $$\sum_{k=0}^{n} x^k = \frac{x^{n+1} - 1}{x - 1}$$ $$\sum_{k=0}^\infty x^k = \frac{1}{1-x} \quad (|x| < 1)$$ Notice how all the closed-form equations have constant terms?2017-01-25
-1

On further reading (and rereading...and rereading) of the book I referred to in my OP, it eventually registered that not only does the author write, "The first code fragment has its outer for loop executed log n + 1 times...(see pg 71)", but he also presents a second Sigma summation: $\sum_{i=0}^{\log n} n$, which at first glance looks virtually identical to the $\sum_{i=1}^{\log n} n$ summation that I originally asked about. (Notice the different values for the lower bound $i$.)

The two summation functions (see pg 30 & pg 71) look so similar to my naïve mathematical eyes that I blew off the difference between the $i=0$ and $i=1$ lower bounds as just a minor typo. But on a whim, I plugged $\sum_{i=0}^{\log n} n$ into Wolfram-Alpha. And out pops...

$$\sum_{i=0}^{\log n} n = n(\log n + 1)$$
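Expanded term by term, that result is just $n$ added to itself once per index value, and the indices $i = 0, 1, 2, \dots, \log n$ make $\log n + 1$ of them (note the extra term contributed by $i = 0$):

$$\sum_{i=0}^{\log n} n = \underbrace{n + n + \cdots + n}_{\log n \,+\, 1 \text{ terms}} = n(\log n + 1)$$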

The teacher's step-by-step walkthrough of the summation of a constant (starting @6.24) offers a clear, ELI5-level demonstration of expanding a summation. The key mathematical concepts to take away from that ELI5-level video are...

  1. when there's a constant inside a Sigma notation, the index variable does not affect the constant.

  2. for a constant $c$ and an upper bound $n$, the closed form of the Sigma notation is simply expressed as $cn$. And most importantly...

  3. the closed form is evaluated only once; NOT $n$ times, as one might naïvely think at first.

  4. the "Summation Of A Constant" approach to arriving at a sum is a general mathematical concept. As such, it is no more synonymous with computer science than it is with the price of tea in China.

So with that, I submit this corrected ELI5-level solution to my "fully expanded equation" question #4...

$$\log_2 8 = 3$$

$$n(\log n + 1)$$

$$= 8(3 + 1)$$

$$= 8(4)$$

$$= 32$$

I'll also take a stab at answering my own, question #1...

How come...the $\Theta(n \log n)$ Sigma notation equation calculation does not jibe with the value calculated by the corresponding $\Theta(n \log n)$ code listed above?

Referring to page 71 of the above-mentioned book, the author actually says, "...the total cost for the first code fragment can be expressed as $\sum_{i=0}^{\log n} n$...". So that summation's closed-form equation — $\sum_{i=0}^{\log n} n = n(\log n + 1)$ — is what should have been used to do the summation. Not the $\sum_{i=1}^{\log n} n = n \log n$ one that I mistakenly used at first.

As the above "fully expanded equation" answer to question #4 shows, as long as you follow the "Summation Of A Constant" approach, the value $32$ calculated by the book's `for` loop and the value $32$ calculated by the fully expanded equation above actually do "jibe" — to put it in ELI5 vocab.

Having answered my own mathematics-related questions #1 and #4 to my own ELI5-level satisfaction, the other two mathematics-related questions (#2 & #3) fall away on their own.

I am prepared to graciously defer to other more expertly-informed answers to any of the original questions. All's I ask is to ELI5 ;¬)

  • 0
    Just curious: I am clueless as to what "ELI5-level" means. Seriously. I'm from the US, so I'm guessing it relates to a specific level (grade-level) of mathematics. But if you could be so kind as to educate my guesses, I'd be much obliged.2017-02-02
  • 0
    To answer your Q @amWhy : "[_•E is for Explain - merely answering a question is not enough_](https://www.reddit.com/r/explainlikeimfive/)" | "[_•LI5 means friendly, simplified and layman-accessible explanations - not responses aimed at literal five-year-olds_](https://www.reddit.com/r/explainlikeimfive/)" | [[_Source_](https://www.reddit.com/r/explainlikeimfive/)] | It comes from an old adage (_and I paraphrase_) — "_If you can't explain something simple and straightforward enough so that a ten year old would understand it — then you really don't understand that something yourself_"...2017-02-05
  • 0
    Thanks for the lesson! :-)2017-02-05