Texas Tech University

Study: Most People Blindly Trust Calculators – Even When They Lie

Glenys Young

October 30, 2019


Researchers found a person’s numerical knowledge is key to their ability to become suspicious.

If John's grandmother was born in 1942, how old was she in 1994? Suppose a calculator tells you the answer is 114.


Did that send up any red flags? If not, it should have.

First, the math is wrong: 1994 minus 1942 is only 52. But beyond that, how many people in the world live to be 114?

If you didn't catch the error, you're not alone, according to a new study from the Texas Tech University Department of Psychological Sciences.

After one of doctoral student Mark LaCour's advisers expressed frustration about students' inability to do simple math problems without a calculator, they had the idea to program a calculator to lie and then test whether students would notice.

"He actually didn't want the calculator to lie by too much in our initial study, because he thought anything more than a 15% added error would be too egregious," said LaCour, the study's lead author. "After doing multiple studies, though, we realized that something in the neighborhood of 120% – more than double the real answer – was necessary to make an appreciable amount of participants suspicious."
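The manipulation LaCour describes amounts to inflating the calculator's correct result by a fixed percentage. A minimal sketch of that idea (the function name and structure are illustrative, not the study's actual code):

```python
def lying_answer(true_answer, error_pct):
    """Inflate the correct result by error_pct percent of added error."""
    return true_answer * (1 + error_pct / 100)

# A 15% added error is subtle; a 120% error more than doubles the answer.
subtle = lying_answer(52, 15)    # ~59.8 -- plausible enough to slip by
blatant = lying_answer(52, 120)  # ~114.4 -- more than double the truth
```

At 15% added error, the displayed answer stays in a believable range, which is consistent with the finding that errors had to grow past doubling before many participants grew suspicious.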

There was a striking result, however. Even participants who became suspicious of the calculator's answers didn't stop using the calculators.

"There are a few things this might be saying, but one possibility is that many people really don't like putting in effort to do math," said co-author Tyler Davis, an associate professor in psychological sciences. "There's probably an effort-accuracy trade-off to some extent, and people seem to be willing to sacrifice some accuracy to continue using the less-effortful calculator.

"The thing that kept surprising me is how much we had to increase the magnitude of errors to get people to notice them. We tried a few steps from the low errors to where we ended up, but we didn't start getting substantial detection until the errors were off by an order of magnitude."

One factor the researchers found correlated with participants' suspicion of errors was their level of numeracy – that is, their number sense: how well a person understands and can appropriately use mathematical concepts. Beyond the errors introduced by the calculator manipulations, participants also had to enter the problems into the calculators correctly to get a correct result.

As one example, participants in the study were asked to calculate 15% of 21. Instead of simply multiplying 21 by 0.15, some subtracted 0.15 from 21, then divided the result by 0.15 – and gave no indication that they recognized their mistake. However, when they were asked to calculate a 15% tip for a $21 meal, they were much less likely to accept a nonsensical answer that resulted from faulty operations.
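The two procedures above give wildly different results, which is what makes the tip framing so effective at exposing the mistake. A quick sketch of both (the variable names are illustrative):

```python
meal = 21

# Correct operation: 15% of 21.
correct_tip = meal * 0.15          # about 3.15

# Faulty sequence some participants used: subtract 0.15, divide by 0.15.
faulty_tip = (meal - 0.15) / 0.15  # 139.0
```

In the abstract, "139" is just a number; framed as a $139 tip on a $21 meal, it is obviously absurd, which illustrates the authors' point about mapping a problem onto a familiar situation.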

"Any instructor will tell you that you should have a conceptual understanding of the new mathematical concepts you're learning, and they don't want their students to simply manipulate numbers and equations without an understanding of why they're doing it," LaCour said. "Our results suggest that encouraging students to actively map a problem onto a familiar situation can help them catch mistakes in choosing which operations and equations they're using."

The results of the study aren't just applicable in mathematics, though – and that's what excited Davis.

"This has potential connections to basic cognitive neuroscience research in my lab," said Davis, director of the CAPROCK fMRI Lab. "Noticing a calculator error involves some of the same processes we might use to notice any kind of anomalous patterns when we are performing a task, like playing the piano or categorizing animals. It was interesting to consider how factors related to expertise and motivation might interact with the degree of anomaly to affect people's error recognition."

While their study didn't directly investigate why people were unable or unwilling to solve the problems accurately, LaCour and Davis cited previous research showing that mathematical abilities have declined over the past few decades as technological advancements, like smartphones, have become increasingly available.

"This is all basic math that students should know how to do in college," Davis said, "but to the degree that we get used to not having to put in the effort, through smartphones, etc., perhaps we learn to discount accuracy when we don't have them around. Likewise, not practicing math by having smartphones around can increase the perceived effort of math once we have to try to do it without one.

"An alternative, of course, is that the people in the study don't know how to do the math, and that's probably scarier. As a professor, I hope that's not true."