There are two ways to gain knowledge: by thinking (rationalism) or by collecting data (empiricism). Or at least normal people think so. Not educational researchers, however. In their world only the latter exists. Research in education is dogmatically empiricist: doing “research” means collecting and analysing data. The stupidity of this dogma may be illustrated by two paradoxes.

Paradox 1. By definition, educational research is completely ignorant of content. By design, it cannot say anything about whether one explanation a teacher may give is better than another, or whether a given topic should be taught at all, because that’s not based on “data.”

Any normal person would find this absurd. Aren’t these the very core questions of education? How can you call yourself a “department of education” if you explicitly forbid any of your Ph.D. students to address such questions? Suppose we are in fact teaching a certain topic (say for instance l’Hôpital’s rule in calculus) to hundreds of thousands of people every year, but that this topic is actually worthless and serves no meaningful purpose and should not be taught at all. Is it not preposterous and absurd that, in such a case, educational research by definition and design could never speak of this fact? But in the idiotic world of education “research” that’s how it goes.

You want to teach l’Hôpital’s rule to one class by lecture and another by group work, and see who scores better on a test? Excellent research question!

You want to analyse different ways of proving l’Hôpital’s rule from a pedagogical standpoint? You want to discuss the purpose of teaching l’Hôpital’s rule in the first place? You want to investigate how and to what end l’Hôpital’s rule was developed and applied historically, in order to inform teaching? Impossible! None of those things are “research,” you fool!

Paradox 2. The notion of “data” on which educational research is based is incoherent and irrational.

If I spend a lot of time thinking about how to teach l’Hôpital’s rule and write down insightful reflections, then that’s of course just “opinion” and not “data” and therefore “not research,” and it would be impossible for me to get anywhere as an education researcher. But if you interview me about how I teach l’Hôpital’s rule and write a paper about “collegiate mathematics instructors’ attitudes and strategies for teaching l’Hôpital’s rule,” then you are a good data-based researcher sure to rise in the ranks. As I observed to no avail in a graduate seminar in mathematics education: if we each publish our own opinions there is no data and no research, but if we go in a circle and publish descriptions of one another’s opinions then suddenly all our opinions have become “data” and now it’s all “research,” even though we have accomplished nothing except to filter and garble our thoughts through imperfect interpreters.

Likewise, if I study 17th-century texts on problems using l’Hôpital’s rule and draw pedagogical lessons from them, then I am also “not doing research,” since there is no “data.” But if I put a student in front of a camera and have him think aloud while trying to solve the same problems, then this is “data”; and if I draw pedagogical lessons from this in the exact same way I did for the historical texts, then that is “research” all of a sudden. Is it conceivable that it could be more insightful to study the historical development of an idea, which reflects the very insights we want our students to reach, than to overanalyse a video of an unprepared student randomly rambling his way through a problem? We had better hope not, for if so then educational research has shot itself in the foot before it even began.

None of this makes any sense. The dogmatic obsession with this naive notion of “data” makes educational research rotten at its very core.

The rationale for the empiricist dogma is supposedly that research should be based on “facts” rather than “opinions,” which may sound like a reasonable principle. But, as these examples show, the main consequence in practice of the dogmatic empiricist interpretation of this principle is that it makes it impossible to address any questions that actually matter, while incentivising the investigation of stupid questions that can be addressed by this narrow-minded conception of “research” methodology.

Meanwhile, if the goal of the empiricist dogma was to eliminate empty opinions it is not working: educational research is in general blatantly and massively biased, as I have demonstrated in a hundred cases. Worse yet, the empiricist dogma has made it very difficult to expose these biases, or indeed to have any rational and critical discussion, since researchers hide behind the idea that their research is all based on “facts” and “data” and is hence by definition objective and beyond criticism.

The most widely touted conclusion of educational research is “lecture doesn’t work.” Indeed, lecture is a rather irrational way to teach. But the right way to prove this is the rationalistic way, using argument and thought. The wrong way to prove it is to naively proclaim it as an empirical “fact” proved by “data” and “research.” But the latter, of course, is the ruling dogma. See for instance the recent report on Active Learning in Post-Secondary Mathematics Education by the Conference Board of the Mathematical Sciences (an umbrella organisation that includes basically anyone who’s anyone in U.S. higher education). “A wealth of research has provided clear evidence that active learning results in better student performance and retention than more traditional, passive forms of instruction alone,” the report reads, referring in particular to the “landmark” meta-analysis of 225 research studies which I have criticised before.

I therefore call upon everyone concerned with education to resist dogmatic empiricism and to admit a sensible role for rationalism, that is, allowing that one can learn some things through thought and reflection. In other words:

Stop wasting massive resources on proving the obvious. Don’t spend several lifetimes’ worth of work piling up 225 studies and a mountain of data to prove that we shouldn’t lecture when reason proves it in a minute.

Stop hiding behind “data.” Stop preempting debate by pretending that your opinion is a “fact” proved by “research.” Instead, have the courage to allow educational matters into the arena of rational discussion.

Stop letting methodological dogmas dictate the direction of your research. Have the courage to face and tackle the questions that truly matter. Start there, not with a preconceived idea of what “research” ought to look like.