Sunday, June 18, 2023

“The irony of this being a story about data fraud in a paper on inducing honesty is not lost on me.”

The problem of data fraud in empirical academic papers has received increased attention. The temptation for such fraud is particularly strong when the data sets are generated by the authors themselves. In economics, for example, much empirical research relies on government-generated data sets from agencies such as the Bureau of the Census.

In principle, other authors can access the same data sets and repeat the analysis. In the subfield of behavioral economics, however, the numbers are often generated by lab experiments or drawn from confidential organizational data. Clinical trials in medicine are similar, and the same temptations arise there, too. Indeed, there is an online newsletter, Retraction Watch, dedicated to highlighting articles retracted due to suspicious data: https://retractionwatch.com/.

It is nonetheless possible for fraud to be detected even in such researcher-generated experimental data, as the article below demonstrates. One case became famous precisely because the fraud was found in an article about honesty. Excerpt below from the Chronicle of Higher Education:

Almost two years ago, a famous study about a clever way to prompt honest behavior was retracted due to an ironic revelation: It relied on fraudulent data. But The Chronicle has learned of yet another twist in the story.

According to one of the authors, Harvard University found that the study contained even more fraudulent data than previously revealed and it’s now asking the journal to note this new information. The finding is part of an investigation into a series of papers that Harvard has been conducting for more than a year, the author said.

Details about the reported fabrications are unclear. Francesca Gino, a world-renowned Harvard Business School professor who studies dishonesty, and is a co-author on the disputed study, is now on administrative leave, according to her faculty page. Gino did not return a request for comment.

The head-spinning saga began in 2012, when a team of five researchers claimed that three experiments they’d done separately, and combined into one paper, showed that when people signed an honesty pledge at the beginning of a form, versus the end, they were less likely to cheat on the form. This intuitive-sounding conclusion turned heads at government agencies and companies.

But by 2020, it was falling apart. The researchers, plus two others, reported in a new paper that they were unable to replicate the effect after running essentially larger versions of experiments Nos. 1 and 2, which involved university students and employees filling out tax forms in a lab. Max H. Bazerman, a Harvard Business School professor, has said that the two experiments were written up by him, Gino, and Lisa Shu, then of Northwestern University.

Scientific findings often fail to replicate for all kinds of reasons, not necessarily because they were fabricated. But in the summer of 2021, a trio of data detectives wrote on their blog that a close examination pointed to fraud in experiment No. 3, which, unlike the others, was based on auto-insurance customer data.

That experiment had been handled by two other authors: Nina Mazar, formerly of the University of Toronto, and Dan Ariely, a Duke University professor. The source of that fraud remains unclear. In 2021, Ariely told BuzzFeed News that he was the only author in touch with the insurance company that provided the data, but he denied fabricating it. At the same time, he gave conflicting answers about the origins of the data file that was the basis for the analysis. BuzzFeed News reported that the insurer was The Hartford, which confirmed doing a “small project” with Ariely but was unable to locate any data resulting from it.

It was yet another blow to the field of behavioral economics — which in the 2000s and 2010s churned out headline-grabbing strategies to subtly influence people’s behavior for the better, and has since walked back many of them. In September 2021, the Proceedings of the National Academy of Sciences retracted the 2012 paper. But that, it turned out, was not quite the end.

The alleged new problems involve experiment No. 1 — one of the two conducted in a lab with students. Bazerman told The Chronicle that on Tuesday, Harvard informed him that it believed fabricated data for this experiment made it invalid. According to Bazerman, Harvard provided a 14-page document with what he described as “compelling evidence” of data alterations. Their analysis found that somebody had accessed a database and added and altered data in the file, he said. “I did not have anything to do with the fabrication,” he told The Chronicle... “The irony of this being a story about data fraud in a paper on inducing honesty is not lost on me.” ...

Full story at https://www.chronicle.com/article/a-weird-research-misconduct-scandal-about-dishonesty-just-got-weirder.

An update to the article above links to still more analysis of the fraud involved in the honesty study. See: https://datacolada.org/109.

Note that the paper's effort to show that making a pledge of honesty before a task increases honesty was presumably undertaken because a positive effect was thought to be more interesting than no effect. But that premise is not obvious. Such pledges are routinely used in many settings, e.g., student honor codes. Finding that these pledges had no effect, it could be argued, would be the more interesting result.
