We introduce infinite sequences and series very thoroughly in calculus classes: we first define infinite sequences, then series, carefully discuss notions of convergence, and cover all sorts of convergence tests before allowing students to see Taylor's theorem.
However, suppose that one just went to the board and wrote down $$e^x = 1 + x + \frac{x^2}{2} + \frac{x^3}{6} + \frac{x^4}{24} + \cdots$$ without making a general definition of an infinite series, or explaining anything about convergence. (Presumably one would have to explain factorials so that the pattern is clear.)
Or, more simply, one could write $$1 = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots$$
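(For concreteness, the partial sums here are $\frac{1}{2} + \frac{1}{4} = \frac{3}{4}$, then $\frac{7}{8}$, then $\frac{15}{16}$, and so on, each falling short of $1$ by exactly the last term added.)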
I have had interesting debates with colleagues as to whether this is a good idea -- and those debates seem to hinge on an empirical question.
Are these formulas easily comprehensible to, say, Calc I students, bright high school students preparing for competitions, or other students who have not had formal exposure to infinite series? Or are infinite series a genuine conceptual stumbling block for such students?
For example, would students be able to see how the first formula allows them to quickly find accurate approximations for $e$?
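(To make that question concrete: plugging in $x = 1$ and using only the terms displayed above gives $$e \approx 1 + 1 + \frac{1}{2} + \frac{1}{6} + \frac{1}{24} = \frac{65}{24} \approx 2.708,$$ already within about $0.01$ of $e \approx 2.71828$, and each further term shrinks the error substantially.)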