One reason calc books are so weird: consider integrals. Newton etc. (17th century) had an intuitive understanding of them; the 19th century (Riemann) actually defined them; the 20th century (Lebesgue) abstracted further, and showed that doing so leads to measure theory, etc. ...and calc books -give- the 19th century definition, but then proceed as if they were in the 18th century, just using integrals as anti-derivatives.

Ditto for sequences/series, kinda: they give 19th century definitions, but only have/expect 18th century sophistication, and they don't look at interesting subtleties like Abel sums. E.g., $1 - 1 + 1 - 1 + 1 - \dots = 1/2$, if you think of it as $\lim_{x \to -1^+} p(x)$, where $p(x)$ is the power series for $1/(1-x)$ about 0, namely $p(x) = 1 + x + x^2 + \dots + x^n + \dots$.

Ditto for functions: for them a function is something you compute with, an elementary (more properly, EL) function.

Ditto for continuity: they give the definition at a point, but then just use intuition... (****** grrrrrrr ****** continuous functions are -much- more subtle than they appear; look at nowhere differentiable functions, Peano curves, fractals!)
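For the integral story, the usual poster child for what Lebesgue's abstraction buys you is the indicator function of the rationals on $[0,1]$:

$$\chi_{\mathbb{Q}}(x) = \begin{cases} 1, & x \in \mathbb{Q}, \\ 0, & x \notin \mathbb{Q}. \end{cases}$$

Every upper Riemann sum is $1$ and every lower sum is $0$ (rationals and irrationals are both dense), so it has no Riemann integral; but $\mathbb{Q}$ is countable, hence has measure zero, so the Lebesgue integral is simply $\int_0^1 \chi_{\mathbb{Q}}\, d\mu = 0$.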
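Spelling out that Abel-sum example: inside the interval of convergence,

$$p(x) = 1 + x + x^2 + \dots + x^n + \dots = \frac{1}{1-x} \qquad (|x| < 1),$$

and the limit as $x$ approaches $-1$ from the right exists:

$$\lim_{x \to -1^+} p(x) = \frac{1}{1-(-1)} = \frac{1}{2}.$$

The partial sums $1, 0, 1, 0, \dots$ never settle down, so the series diverges in the ordinary sense; the Abel sum assigns it $1/2$ anyway.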
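And for the continuity rant, the classic witness is Weierstrass's function

$$W(x) = \sum_{n=0}^{\infty} a^n \cos(b^n \pi x), \qquad 0 < a < 1, \; b \text{ an odd integer}, \; ab > 1 + \tfrac{3\pi}{2},$$

which is continuous everywhere (it's a uniform limit of continuous functions, by the M-test, since $\sum a^n$ converges) yet differentiable nowhere.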