Madness in the Method
Start talking research methodology to journalists, and many will run screaming from the room. To be fair, the same might be said of some academics working outside the bounds of history, social or pure science departments. Yet, however technical and dry it seems, researchers working in methodology-driven disciplines know that valid methods aren’t a detail but lie at the heart of the most important question to be asked of any study: can you trust the results?
But while academics are under increasing pressure to promote and discuss their research findings in the media, they often face time-pressed reporters who know little about what it takes to conduct valid research. Sometimes they seem not to want to know, given that discovering a piece of research is rubbish does them out of a story.
The uncritical reporting of the Southern Cross Bioethics Institute’s launch of their report on abortion is a case in point. While the media widely reported the “findings” the Institute provided at their launch and via a summary on their website, journalists were not given information on the study’s conduct (how subjects were selected, the questions they were asked, the statistical tests used to crunch the numbers), or on who funded it. Academics and activists attempting to do their homework before providing the media with requested commentary couldn’t obtain such information either. When one activist asked, the PR consultants fielding enquiries initially promised it by email, only to renege later that day with a garbled excuse about only having permission to distribute the report – after payment via the website – through the post. A high-profile statistician who actually teaches research methods to university students was flatly refused a copy of the questions and told not to bother waiting by her mailbox either, as the written report didn’t reveal them.
Despite this, the report claims to “reflect the attitudes to abortion of Australians generally”, though it fails to adequately explain why, if this is so, its results are so discordant with polls by reputable institutions like the Australian National University, which does make its research questions and methods available for public scrutiny.
The peer review process ensures that data – and the conclusions drawn from it – are never taken at face value. A bottom-line requirement for the publication of academic articles and books is the provision of extensive information on how data was collected and crunched. Disclosure of who funded the research has, thankfully, become de rigueur. This is because academics are trained to be sceptics, and are well aware of the biases that can be introduced through the way in which research is conducted and funded, not just how the results are analysed.
Journalists are trained to be sceptics too, and the exercise of their critical faculties regularly protects the public from swallowing what those controlling the agenda want us to see and believe. During the US election, Australian journos reported extensively and well on the way “push-polling” distorted the data produced. Yet as the Southern Cross Bioethics Institute saga makes clear, a lack of journalistic scepticism about what constitutes valid research – coupled with organisational competitiveness and tight deadlines – can have significant consequences for the nature and quality of public debate.
How might things go better next time? Firstly, media academics and journalists need to debate amongst themselves, in peer-reviewed journals and magazines like the Walkley, the precise nature of their responsibilities when it comes to publicising research results. Is it the media’s role to assess the quality of data, and source the money behind it, before placing it before the public? Or is the provision of space for critical commentary on a study’s methods and findings all that can reasonably be required?
For their part, academics must insist in their interactions with journalists that methodology does matter, and provide clear and convincing examples of why this is so. Journalists are in the business of asking questions. It’s not hard for them to understand that the way researchers ask questions shapes the responses they get. Nor should it be hard to convince them that data generated from surveys of large, randomly selected samples of adults of all ages and both genders is going to be more robust than data obtained from 12 elderly members of a church group plus their relatives and friends. This isn’t to say that the latter technique can’t be used, but that it can’t be used if the researcher wants to claim the results represent the views of all Australians.
Academics must also develop and market short, on-site courses for media practitioners that elucidate the basic methodological requirements of good research, and the basic questions journalists must ask – and have answered – before any story goes to press or to air.