
Is the problem to prove whether there exists an algorithm whose running time is polynomial in the length of the input string $|s|$, or polynomial in both $|s|$ and the size of the alphabet $|A|$? The papers I'm looking at assume that you already know which of the two they mean.

Edit: Paper

1 Answer


The very first sentence of the paper defines what problem they are considering:

"This paper addresses the smallest grammar problem; namely, what is the smallest context-free grammar that generates exactly one given string?"
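To make the definition concrete, here is a toy instance of my own (it is not taken from the paper): for $s = abababab$ the grammar $S \to AA$, $A \to BB$, $B \to ab$ derives exactly the string $s$, and its total right-hand-side length is $6$, versus $8$ for the trivial grammar $S \to abababab$. A minimal Python sketch of that check, where the dictionary representation and the size measure (sum of right-hand-side lengths) are my own assumptions:

```python
# Toy illustration of the smallest grammar problem (not code from the paper).
# A grammar is a dict: nonterminal -> its single production (a list of symbols).
# With exactly one production per nonterminal and acyclic expansion, the
# grammar derives exactly one string, which is the setting quoted above.

def expand(grammar, symbol):
    """Recursively expand a symbol into the terminal string it derives."""
    if symbol not in grammar:          # terminal symbol
        return symbol
    return "".join(expand(grammar, x) for x in grammar[symbol])

def grammar_size(grammar):
    """Size measure assumed here: total length of all right-hand sides."""
    return sum(len(rhs) for rhs in grammar.values())

# Grammar for s = "abababab":  S -> AA, A -> BB, B -> ab
g = {"S": ["A", "A"], "A": ["B", "B"], "B": ["a", "b"]}

assert expand(g, "S") == "abababab"
print(grammar_size(g))                 # 6, versus 8 for the trivial S -> abababab
```

Requiring exactly one production per nonterminal and no cycles is what makes such a grammar generate exactly one string.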

  • The problem they also address is whether there is a polynomial-time algorithm that finds a smallest grammar. I assume they mean treating the alphabet size as a constant, which seems reasonable: when we ask for the smallest grammar of something, we usually fix the alphabet. But I'm still not completely convinced that is what they mean. (2012-04-17)
  • Well, the alphabet can trivially be taken to be the set of symbols occurring in the input string, or you could fix it as $\{0,1\}$ (see the sketch after these comments). I assume one would allow any number of nonterminals in the grammar; more than $|s|$ of them will never be optimal anyway. (2012-04-17)
  • That's right. I knew that, but the thought bubbled up again; I need to think before firing questions at the forums. Since the alphabet of $s$ can be taken to have at most $|s|$ symbols, an algorithm whose running time is bounded by a polynomial in $|s|$ and the alphabet size is also bounded by a polynomial in $|s|$ alone. (2012-04-17)
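A minimal sketch of the point made in the last two comments (the function names are mine, not from the paper): the alphabet can be taken to be the set of symbols occurring in $s$, so $|A| \le |s|$, and any such alphabet can be recoded over $\{0,1\}$ with fixed-length codewords at the cost of a factor $\lceil \log_2 |A| \rceil$ in the string length.

```python
import math

def effective_alphabet(s):
    """The alphabet can be taken to be the symbols occurring in s, so |A| <= |s|."""
    return set(s)

def recode_binary(s):
    """Recode s over {0,1} with fixed-length codewords of length ceil(log2 |A|).
    This only illustrates the comment above; it is not a construction from the paper."""
    alphabet = sorted(effective_alphabet(s))
    width = max(1, math.ceil(math.log2(len(alphabet))))
    code = {a: format(i, f"0{width}b") for i, a in enumerate(alphabet)}
    return "".join(code[a] for a in s)

s = "abracadabra"
A = effective_alphabet(s)
assert len(A) <= len(s)        # |A| <= |s|
print(sorted(A))               # ['a', 'b', 'c', 'd', 'r']
print(recode_binary(s))        # binary string of length |s| * ceil(log2 |A|)
```

Note that the binary recoding can change the size of the smallest grammar; it only illustrates that the alphabet need not be treated as an unbounded extra parameter.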