The author is Andrew Hacker, a professor of political science. I'll admit this initially made me wary; I went into the article half-expecting it to be just another long-winded gripe about how the humanities are undervalued. But instead he pointed out that 1) algebra is a massive stumbling block for students up through college, including students who excel at other subjects and even at other areas of math, and 2) as much as people like myself may hate to admit it, algebra is almost never a necessary skill in the life of an adult. That includes me, by the way, even though I'm a working mathematician. Hacker argues that high-school algebra, factoring polynomials and all that, should not be considered an essential part of the curriculum. I really encourage you to take a look at the article, and to keep an open mind; I'm the first person to say we should beat students over the head with numbers, and he managed to convince me.

But there is another thing that Hacker points out, which dovetails with the subject of this blog. The reason students struggle with algebra isn't necessarily that it's difficult. The problem is that it's abstract. Unlike arithmetic, or geometry, or even many higher areas of math, algebra is usually done completely in the abstract, isolated from compelling applications or even visual heuristics. Math *can* be done by pure rote like this, but it's a specialized skill, and one that doesn't necessarily make us better at anything *besides* that one specialized skill. If our goal is to make people mathematically literate, we should probably focus education on skills that either *are* useful in and of themselves, or that build up our understanding of the subject by engaging all the different levels on which we can think about it.

Hacker doesn't say that we should de-emphasize math in the curriculum; instead, he thinks rote algebra should be phased out in favor of "quantitative reasoning". This ranges from understanding how to interpret statistics, to estimating a budget, to being able to spot numbers that are probably bogus. Higher-level quantitative reasoning would include back-of-the-envelope calculations, interpreting complicated diagrams (like a log-log scatterplot), and picking mathematical models based on the properties you know they should have ("I need a formula that starts off steep, then tapers off as it approaches a finite upper limit"). Non-technical people rarely need more math than quantitative reasoning. And even for technical people, if you don't have this as your guiding light, your calculations are probably doomed to fail.
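To make that last kind of reasoning concrete, here is a small sketch (my own example, not one from Hacker's article): a saturating curve like $f(x) = Mx/(x+k)$ has exactly the properties asked for, since it rises steeply near zero and flattens out as it approaches the finite ceiling $M$. The function name and the particular values of `M` and `k` are just illustrative choices.

```python
# A sketch of picking a model by its qualitative properties, not its formula:
# we want something that "starts off steep, then tapers off as it approaches
# a finite upper limit." The saturating curve f(x) = M*x / (x + k) fits:
# its slope at x = 0 is M/k (steep if k is small), and f(x) -> M as x grows.

def f(x, M=100.0, k=5.0):
    """Saturating growth: climbs quickly at first, levels off toward M."""
    return M * x / (x + k)

# Back-of-the-envelope checks of the two properties:
early_gain = f(1.0) - f(0.0)   # big jump right away (steep start)
late_value = f(10_000.0)       # crowded up against the ceiling M = 100

print(round(early_gain, 2))    # early rise per unit of x
print(round(late_value, 2))    # essentially at the upper limit
```

This is the quantitative-reasoning move in miniature: we never solve for anything algebraically; we just check that the curve's shape matches the behavior we know the situation should have.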

To tie this back to linguistics, language is a tool for describing the world that we almost never use in isolation. To do so would be unnatural; our brains are designed to use language for conveying ideas to other people, not for reasoning about them. We *can* reason in a purely rote manner, using syllogisms,