1) I wouldn't consider this as a quantitative version of the central limit theorem, but rather as a quantitative version of large deviation theorems (the two are related, of course). Let us focus on the results, and not on the methods used to get them. Let $(X_i)$ be a sequence of i.i.d., $\mathbb{R}$-valued, centered, bounded random variables. I'll denote by $(S_n)$ the sequence of its partial sums. A large deviation principle tells you that there exists a rate function $I: \mathbb{R} \to \mathbb{R}_+$ such that, for any open set $O$:
$- \inf_O I \leq \liminf_{n \to + \infty} \frac{\ln \mathbb{P} (S_n/n \in O)}{n},$
and for any closed set $F$:
$\limsup_{n \to + \infty} \frac{\ln \mathbb{P} (S_n/n \in F)}{n} \leq - \inf_F I.$
In other words, the probability that the sum $S_n$ is large (say, $S_n \geq \varepsilon n$ for a fixed $\varepsilon$) decreases exponentially in $n$, roughly at speed $e^{- I (\varepsilon)n}$.
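For instance, applying the lower bound to the open set $O = (\varepsilon, +\infty)$ and the upper bound to the closed set $F = [\varepsilon, +\infty)$, and assuming $I$ is continuous and increasing on $[\varepsilon, +\infty)$ (which holds in the standard cases, for $\varepsilon$ in the interior of the domain of $I$), both infima equal $I(\varepsilon)$ and you get
$\lim_{n \to + \infty} \frac{\ln \mathbb{P} (S_n/n \geq \varepsilon)}{n} = - I(\varepsilon).$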
A notable feature of these large deviation principles for i.i.d. random variables is that the function $I$, which governs the speed of the decay, is the Legendre transform of the logarithm of the Laplace transform $\lambda \mapsto \mathbb{E} (e^{\lambda X})$. In other words, exactly what you get with the Chernoff bounds! So the Chernoff bounds give you a quantitative upper bound for the large deviation principle:
$\mathbb{P} (S_n/n \geq \varepsilon) \leq e^{- I(\varepsilon) n},$
or equivalently,
$\frac{\ln \mathbb{P} (S_n/n \geq \varepsilon)}{n} \leq - I(\varepsilon).$
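To see where the Legendre transform comes from, here is the standard one-line computation behind the Chernoff bound (just Markov's inequality applied to $e^{\lambda S_n}$): for any $\lambda > 0$,
$\mathbb{P} (S_n/n \geq \varepsilon) = \mathbb{P} (e^{\lambda S_n} \geq e^{\lambda \varepsilon n}) \leq e^{- \lambda \varepsilon n} \mathbb{E} (e^{\lambda S_n}) = e^{- n (\lambda \varepsilon - \ln \mathbb{E} (e^{\lambda X}))}.$
Optimizing over $\lambda > 0$ gives the bound $e^{- I(\varepsilon) n}$ with $I(\varepsilon) = \sup_{\lambda > 0} (\lambda \varepsilon - \ln \mathbb{E} (e^{\lambda X}))$; for centered $X$ and $\varepsilon > 0$, this supremum coincides with the Legendre transform of $\lambda \mapsto \ln \mathbb{E} (e^{\lambda X})$.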
In a more general setting, the rate function $I$ is related to the entropy of some system: you get a large rate function (that is, a small entropy in the physicist's convention; there is often a sign change between the two) when the sum $S_n$ is far from its typical state.
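For a concrete example of both points (the standard coin-flipping computation, not specific to your setting): if $\mathbb{P} (X = 1) = \mathbb{P} (X = -1) = 1/2$, then $\ln \mathbb{E} (e^{\lambda X}) = \ln \cosh (\lambda)$, and its Legendre transform is
$I(x) = \frac{1+x}{2} \ln (1+x) + \frac{1-x}{2} \ln (1-x), \quad x \in [-1, 1],$
which is exactly the relative entropy of the biased coin $(\frac{1+x}{2}, \frac{1-x}{2})$ with respect to the fair coin $(\frac{1}{2}, \frac{1}{2})$: the further $S_n/n$ is from its typical value $0$, the larger the rate function (and the smaller the entropy of the corresponding state).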
==========
There is a point which is worth noting, but has not been raised yet. You can show that moment bounds are stronger than exponential bounds. You know that, for any $p \geq 0$ and any $t > 0$:
$\mathbb{P} (|X| \geq t) \leq \frac{\mathbb{E} (|X|^p)}{t^p}.$
These bounds are stronger than the Chernoff bounds: if you know all the moments of $X$, then the moment bounds allow you to get better bounds on $\mathbb{P} (|X| \geq t)$ than Chernoff bounds, as sketched below. However, they behave very badly when you look at sums of i.i.d. random variables (because the moments of the sum depend on the moments of the summands in a non-trivial way), while the exponential bounds are very easy to manage:
$\mathbb{E} (e^{\lambda S_n}) = \mathbb{E} (e^{\lambda X})^n.$
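To justify the claim that moment bounds beat exponential bounds, here is a sketch of the standard comparison (expand the exponential and note that every term involves a moment ratio at least as large as the best one): for any $\lambda > 0$,
$e^{- \lambda t} \mathbb{E} (e^{\lambda |X|}) = e^{- \lambda t} \sum_{p \geq 0} \frac{\lambda^p \mathbb{E} (|X|^p)}{p!} \geq \left( \inf_{p \geq 0} \frac{\mathbb{E} (|X|^p)}{t^p} \right) e^{- \lambda t} \sum_{p \geq 0} \frac{(\lambda t)^p}{p!} = \inf_{p \geq 0} \frac{\mathbb{E} (|X|^p)}{t^p}.$
Taking the infimum over $\lambda > 0$ on the left-hand side shows that the best moment bound on $\mathbb{P} (|X| \geq t)$ is at least as good as the best exponential bound.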
==========
2) Obviously, Chernoff bounds exist as soon as the moment generating function $\mathbb{E} (e^{\lambda X})$ is finite on a neighborhood of $0$, so you only need exponential tails for $X$ (and not boundedness). Moreover, if you want to get a bound in one direction (i.e. on $\mathbb{P} (S_n/n \geq \varepsilon)$ or on $\mathbb{P} (S_n/n \leq -\varepsilon)$, not on $\mathbb{P} (|S_n/n| \geq \varepsilon)$), you only need exponential tails in the corresponding direction.
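As an illustration of the one-sided remark (a standard example): if $X$ is bounded below but has a heavy right tail, say $\mathbb{P} (X \geq t) \sim c/t^2$, then $\mathbb{E} (e^{\lambda X})$ is finite only for $\lambda \leq 0$, so you can only run the Chernoff argument on the left tail:
$\mathbb{P} (S_n/n \leq - \varepsilon) \leq e^{- n \sup_{\lambda > 0} (\lambda \varepsilon - \ln \mathbb{E} (e^{- \lambda X}))}.$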
If you assume stronger hypotheses on the tails of $X$, you can get stronger Chernoff bounds. Boundedness or sub-Gaussianity of $X$ are typical assumptions.
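For instance (these are the standard bounds under these two assumptions, stated just for illustration): if $X$ is sub-Gaussian, i.e. $\mathbb{E} (e^{\lambda X}) \leq e^{\sigma^2 \lambda^2 / 2}$ for all $\lambda$, the optimization above gives
$\mathbb{P} (S_n/n \geq \varepsilon) \leq e^{- n \varepsilon^2 / (2 \sigma^2)},$
and if $X$ is centered and bounded, say $X \in [a, b]$ almost surely, Hoeffding's lemma gives the sub-Gaussian property with $\sigma^2 = (b-a)^2/4$, hence Hoeffding's inequality
$\mathbb{P} (S_n/n \geq \varepsilon) \leq e^{- 2 n \varepsilon^2 / (b-a)^2}.$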
You can get similar bounds (concentration inequalities) not only for the partial sums of i.i.d. random variables, but also for some martingales (see Collin McQuillan's answer), and for much, much larger classes of processes. This Wikipedia page will give you a taste of it, as well as some key words, if you are interested.