Some people would insist that the kinds of proofs I give in my calculus textbook are “not rigorous.” What does this even mean? They are not what, exactly? What is the definition of “rigorous”? I do not know of any credible definition of this term. And I very much doubt that all those zealous advocates of “rigorous” mathematics do either.
Judging by the way people use the term, the definition of “rigorous,” whatever it is, must, it seems, fulfil two key conditions:
1. It must entail a clear-cut way of telling “rigorous” from “non-rigorous” methods. Whether something is “rigorous” or not is supposed to be an objective, straightforward question, rather like telling whether a shirt is red or blue.
2. It must make it clear why being “rigorous” is desirable. It is always taken for granted, as a virtual truism, that a “rigorous” proof is better than a “non-rigorous” one. Since no one ever argues for this, it must apparently already be evident from the definition why this is so.
I do not know of any definition of “rigour” satisfying both of these conditions. I do, however, know of plenty of definitions satisfying one or the other, which leads me to suspect that people rely on one definition for the one purpose and another definition for the other. But then of course they would be obligated to prove that the two definitions are equivalent, which they never seem to realise.
For instance, people generally assume that “rigour” means something along the lines of “very careful reasoning.” This certainly takes care of (2) very well, but it completely fails (1). Why, for example, are infinitesimal methods not “rigorous”? Merely answering “because they are not careful” obviously accomplishes nothing. This just leads to the question “why are they not careful?”, which puts us right back at square one.
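For concreteness, here is the sort of infinitesimal argument at issue: a standard Leibniz-style derivation of the derivative of $y = x^2$ (a generic illustration, not any particular proof from my book).

```latex
% Leibniz-style derivation of dy/dx for y = x^2,
% using an infinitesimal increment dx.
\[
  dy = (x + dx)^2 - x^2 = 2x\,dx + (dx)^2,
  \qquad
  \frac{dy}{dx} = 2x + dx = 2x,
\]
% where the last step discards the remaining infinitesimal dx
% as negligible compared to the finite quantity 2x.
```

The discarding of $dx$ in the final step is precisely the move that critics call “not careful,” which, again, merely restates the question rather than answering it.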
An opposite approach is to give a definition that clearly satisfies (1). In the case of the foundations of the calculus this is easily done by simply defining “rigorous” to mean methods based on epsilon-delta limit formalism. This indeed makes it clear as day that infinitesimals are “not rigorous,” but it does absolutely nothing to answer (2), i.e., it leaves wide open the question of whether there is any reason to prefer “rigorous” methods in the first place.
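For readers who want the formalism spelled out, the definition in question is the standard epsilon-delta definition of the limit:

```latex
% The epsilon-delta definition of the limit of f at a.
\[
  \lim_{x \to a} f(x) = L
  \quad\Longleftrightarrow\quad
  \forall \varepsilon > 0 \;\, \exists \delta > 0 :\;
  0 < |x - a| < \delta \;\Longrightarrow\; |f(x) - L| < \varepsilon.
\]
```

On this definition the criterion is sharp: a proof is “rigorous” exactly when every limit claim is cashed out in such inequalities, and “non-rigorous” otherwise. But nothing in the definition says why proofs of this form should be preferred.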
Another popular definition is something like that given by Wikipedia: “Mathematical rigour can be defined as amenability to algorithmic proof checking.” This definition is quite well-suited to meet condition (1), but again it does nothing for (2). With this definition too the burden of proof is on the advocates of “rigour” to establish that “rigour” so defined is in fact desirable.
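To illustrate what “amenability to algorithmic proof checking” looks like in practice, here is a trivial machine-checkable proof in the Lean proof assistant (Lean is just one example of such a system, and the theorem name is an arbitrary label of my choosing):

```lean
-- A machine-checkable proof: Lean's kernel verifies the inference
-- mechanically, so whether this counts as "rigorous" in the
-- Wikipedia sense reduces to whether the checker accepts it.
theorem add_comm' (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

A proof either passes the checker or it does not, which settles (1); but the checker’s verdict is silent on whether checkable proofs are the ones worth wanting, which is (2).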
Another common conception of what “rigour” means is that “rigorous reasoning” is “immune to errors,” while “non-rigorous reasoning” is prone to errors and paradoxes. This is again a lopsided definition in terms of the desiderata: this time it passes (2) with flying colours, but leaves us completely in the dark as to (1). In the case of the calculus, people often attempt to underwrite this definition with various assertions about the history of the calculus which, if true, would take care of (1), such as alleged “errors of intuition” that practitioners of infinitesimal methods supposedly tend to commit. However, it is easy to demonstrate that these assertions are propaganda, not truth; they are concocted to justify the emphasis on rigour, not the other way around. But that is a story for another day.