I can't pretend to tell you what the "true meaning" of the prime number theorem is, but I can answer your questions about intuition by talking about how to think about asymptotic information.
To start with, you know what a limit is and what $\pi(x)$ and $\log x$ stand for, so of course you know what the prime number theorem says: the ratio between $\pi(x)$ and $x/\log x$ converges to $1$. As you have observed, the difference (in the sense of subtraction) between two quantities converging to $0$ is not the same as their ratio converging to $1$. When the two quantities diverge to (say, positive) infinity, the former implies the latter, but in general the converse does not hold. Curves approaching each other in an absolute (rather than relative) sense is not the morally correct way to think about asymptotic comparisons, as it is too strict and uninteresting a requirement to gain any footing in realistic problems in mathematics.
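If you want to see the gap between the two notions concretely, here is a tiny numerical sketch (the functions are my own arbitrary choices for illustration): with $f(x)=x^2+x$ and $g(x)=x^2$, both diverge, the ratio converges to $1$, yet the difference blows up.

```python
import math

# f/g -> 1 even though the difference f - g = x diverges to infinity
f = lambda x: x**2 + x
g = lambda x: x**2

for x in [10, 100, 1000, 10**6]:
    print(x, f(x) / g(x), f(x) - g(x))
# the ratio column approaches 1; the difference column is x itself
```

So "ratio tends to $1$" can hold while "difference tends to $0$" fails badly; the implication only goes the other way (for quantities tending to infinity).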
The key to understanding what information the multiplicative definitions of the Bachmann-Landau notations convey is to think about scale and proportion. As two quantities grow to infinity, in order to view their graphs we need to continually zoom out. At this level of thinking, it is quite easy to pick out a way to characterize when one function will outright dominate another: say that $f=o(g)$ ("little-oh") if the graph of $f$ flattens onto the horizontal axis as we zoom out to fit both $f$ and $g$ in our frame. This can be characterized analytically too: it is when $f(x)/g(x)$ tends to $0$ as $x\to\infty$.
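As a concrete instance (my example, not from the question): $\log x=o(x)$, and the ratio test above confirms it numerically.

```python
import math

# log x = o(x): the ratio log(x)/x tends to 0, so in any frame wide
# enough to show y = x, the graph of y = log x flattens onto the axis
for x in [10, 10**3, 10**6, 10**9]:
    print(x, math.log(x) / x)
```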
By $f\sim g$, then, we want an equivalence relation that describes when the graphs of $f$ and $g$ converge together as we zoom out, which is equivalent to $f(x)/g(x)\to1$ as $x\to\infty$. An alternative way to characterize this is that the difference $f-g$ is asymptotically swamped by their growth, which is to say that $f-g=o(f)$, or equivalently $f-g=o(g)$. Thus, if we have a hard-to-characterize function, say $\Phi$, one way to get a foothold on its growth is to find a $\Psi$ for which $\Phi\sim\Psi$, so that the error in our estimate is negligible compared to the actual values of $\Phi$ or $\Psi$.
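A quick sanity check of the equivalence, using the arbitrary pair $f(x)=x\log x+x$ and $g(x)=x\log x$ (again my own toy choice): here $f/g-1$ and $(f-g)/g$ are both exactly $1/\log x$, which tends to $0$.

```python
import math

f = lambda x: x * math.log(x) + x
g = lambda x: x * math.log(x)

for x in [10, 10**3, 10**6]:
    # both columns equal 1/log x, which tends to 0, so f ~ g
    print(x, f(x) / g(x) - 1, (f(x) - g(x)) / g(x))
```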
See for yourself:
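Here is a short sketch you can run (a sieve-based illustration of my own) that computes $\pi(x)/(x/\log x)$ at powers of $10$: the ratio drifts toward $1$, though the convergence is notoriously slow.

```python
import math

def prime_sieve(n):
    """Sieve of Eratosthenes: sieve[m] == 1 exactly when m is prime."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            # cross out every multiple of p from p*p onward
            sieve[p*p::p] = bytearray(len(sieve[p*p::p]))
    return sieve

N = 10**6
sieve = prime_sieve(N)

pi = 0  # running count of primes up to x
for x in range(2, N + 1):
    pi += sieve[x]
    if x in (10, 100, 1000, 10**4, 10**5, 10**6):
        print(f"x = {x:>7}   pi(x) = {pi:>6}   "
              f"pi(x)/(x/log x) = {pi * math.log(x) / x:.4f}")
```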
Another way to think about the prime number theorem is that the proportion $\pi(x)/x$ of prime numbers up to a given magnitude $x$ is inversely proportional to logarithmic growth in $x$. You may be asking yourself what exponentials or logarithms have to do with prime numbers. The answer partially lies in the Cramér model of prime numbers as a Poisson process - for more on this, see for example Soundararajan's lectures on the distribution of primes, or just google around with these terms - but ultimately finding a complete answer to this is part of what analytic number theory is still striving to accomplish.
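The Cramér model declares each integer $n\ge3$ "prime" independently with probability $1/\log n$. A crude simulation (my own toy setup, not a substitute for the references above) already reproduces the $x/\log x$ order of growth:

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

N = 10**6
count = 0
for n in range(3, N + 1):
    # declare n "prime" with probability 1/log n (the Cramer model)
    if random.random() < 1 / math.log(n):
        count += 1

print(count, round(N / math.log(N)))
# the random count comes out within roughly 10% of N/log N,
# the same slow overshoot the true primes exhibit at this range
```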
As for why $c_1<\frac{\pi(x)}{x/\log x}$ (for some constant $c_1<1$) for sufficiently large $x$ is weaker than the PNT: a quantity (here, the ratio between $\pi(x)$ and the PNT estimate $x/\log x$) being confined to an interval after a certain point does not logically imply that it is confined to any smaller interval, or that any limit exists. The existence and knowledge of the limit, on the other hand, immediately tells us that the ratio eventually lies inside any interval around $1$, however small.
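To see why boundedness alone gives no limit, consider a hypothetical ratio $r(x)=1.5+0.4\sin(\log x)$ (an artificial example of mine, nothing to do with $\pi(x)$): it stays inside an interval forever yet never converges.

```python
import math

# bounded between 1.1 and 1.9 for all x > 0, but with no limit:
r = lambda x: 1.5 + 0.4 * math.sin(math.log(x))

xs = [math.exp(k) for k in range(1, 40)]
values = [r(x) for x in xs]

print(min(values), max(values))
# the values keep oscillating across most of the interval forever
```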
To compare: the hierarchy of polynomials should be pretty elementary. Two nonzero polynomials satisfy $f\sim g$ if and only if they have the same degree and the same leading coefficient. Two polynomials are $\Theta$ of each other (i.e. the ratio between them is eventually bounded between two positive constants) if and only if they have the same degree; having the same leading coefficient is not necessary. And $f=o(g)$ for polynomials $f,g$ if and only if the degree of $f$ is strictly less than that of $g$. Ultimately, we have extended our reach so that we can speak about a wide class of functions instead of only polynomial growth. Indeed, there are rates of growth lying strictly between polynomial ones (such as $x\log x$, between $x$ and $x^2$), so this level of description is a huge refinement.
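The same ratio tests, run on some arbitrary polynomials of my choosing, sort them into exactly these classes:

```python
f = lambda x: 3 * x**2 + 5 * x   # ~ 3x^2: same degree, same leading coefficient
g = lambda x: 3 * x**2
h = lambda x: 7 * x**2           # Theta of g: same degree, different coefficient
k = lambda x: x**3               # strictly bigger degree

x = 10**6
print(f(x) / g(x))   # tends to 1, so f ~ g
print(h(x) / g(x))   # tends to 7/3, so h = Theta(g) but h is not ~ g
print(g(x) / k(x))   # tends to 0, so g = o(k)
```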