NUTRITION SOCIETY BLOG

Scientist or iconoclast?

Reflections on what a scientist can learn from a philosopher

The cohort immediately above mine when I was an undergraduate had the chance of a few lectures on the philosophy of science, something that had dropped off the curriculum by the time my own year came around. Likewise there was little or no formal teaching of it in the PhD programmes of that era, and as someone loosely involved in delivering components of postgraduate taught and research programmes, I can report that philosophy is still viewed as an arcane and at best peripheral topic. A recent scan of the Wikipedia Philosophy of Science page highlighted the attitude that many scientists may hold towards the philosophy of science: “[it] is about as useful to scientists as ornithology is to birds.” To which the riposte was that ornithology would be very useful to birds, if only they studied it.

Why then, when the curricula of undergraduate and postgraduate courses are stretched to bursting and resources are thin on the ground, should we venture over to our colleagues in the Humanities, engage them, and try to fit even more in?

It’s been my concern, as both a teacher/trainer and as a peer-reviewer and peer-reviewee, that there are some core concepts, central to being a scientist, that are no longer understood by parts of the scientific community, let alone communicated adequately to the general public, such that the proper and formal notions of science and the scientist are being lost. Furthermore, I feel there is a rise in the number of papers I see containing highly competent technical work which, through fallacious logic, fails to be scientific. I think this is a particular problem in cellular and molecular nutrition papers, and one particular flaw, the unaccepted enthymeme, is virulent.

There are large numbers of adverts in the press and on TV offering margarine scientifically proven to aid weight loss; cat food scientifically proven to make your cat glossier; and prebiotics scientifically proven to make you perkier and more attractive (probably). It is this notion of “scientific proof” that irritates me, because frankly it’s not scientific. Whilst I know full well that I’m preaching to the converted through this article, and that we all know this stuff already, my serious concern is that the generation currently being trained don’t know that scientists never prove hypotheses. And frankly, why would they? As the philosophy of science isn’t on the curriculum, we just assume that hanging out with more experienced researchers will do the job (despite the cumulative experience of both science and life telling us that the golden rule is: never, never, never assume). However, having marked many dozens of masters theses and PhD progress reports over the last decade, and having asked several cohorts of postgraduate students what the implication of a hypothesis-supporting experiment is, I’m unconvinced that the notion of fallibilism is adequately laid down at undergraduate, masters or doctoral level. I think there is legitimate and interesting debate around whether we test hypotheses or null hypotheses, but I see no basis for movement away from the core scientific value of fallibilism.

So it’s important, I think, for life scientists – such as nutritionists – to have some notion of what science actually is (and, as I’ve written before, of what life actually is). In my experience it’s a fairly easy thing to add to a course: you just rock up to your local philosophy department, knock on the right door and ask the locally eminent philosopher of science to give an hour of their time lecturing your intellectually undernourished cohort. The person in question will doubtless have some great break-out exercises at their fingertips, but fundamentally it is worth not just telling young scientists that we are fallible, but explaining that this is a meaningful and requisite response enabling us to occupy the no-man’s land between infinite regress and dogmatism (so there we go – never be dogmatically fallibilistic, merely infallibly fallible). Broad-brush explanations work fine here, but the key to good practice is understanding underlying rationales (I’m sure I’m not alone in feeling that the concept of “understanding underlying rationales” is being eroded from the curriculum at every level, in every discipline, from GCSE upwards).

But beyond that, is there any need for philosophical training? Again yes, as the application of pure logic, and the ability to avoid fallacious thinking, are important in the structuring of grants, papers and reports, and an invaluable skill in life (and should therefore, seemingly, be prized by the soulless beancounters who quantify the transferability of skills taught on our courses). The ordering and classification of fallacy sounds as dull as dishwater, but a fabulous book by Madsen Pirie, How to Win Every Argument (1), achieves the seemingly impossible task of offering a classification of logical fallacy with wit and occasionally scabrous humour. As a reviewer I’ve found the book a very useful mechanism for articulating flaws I could previously only intuit. And as a regular reviewer of papers in cellular and molecular nutrition for a number of journals, there is a fallacy which I think occurs with greater frequency than all others: the unaccepted enthymeme.

After a bit of digging around the back of the internet, it’s my suspicion that this is a classification of Pirie’s. The unaccepted enthymeme works as follows: if A and C, then B may be imputed. In very simple systems, for example where B is the only route between A and C, this is reasonable. However, biological systems are never simple and in fact are hardwired always to have multiple routes from A to C. It’s therefore either naïve or illogical to apply such a rationale to a biological system: if input or treatment A is applied, and I observe C (and usually this is one observation from a million that could be made), then of all the myriad possibilities, I believe B must have happened. A firm example: I do a lot of work on butyrate, which I had accepted, on the basis of the published literature, to be an inhibitor of HDACs, and my grants and papers over the last decade have accepted this. However, I had cause to want to know its molecular action in more detail, and started scouting for direct evidence – to my concern, this evidence is extraordinarily rare, highly conflicted and not terribly supportive, and counter-hypotheses are easily generated (2). I’ve come up with the notion of the evidenceless paradigm – a pseudohypothesis endemic in the literature, for which direct evidence is negligible or non-existent. I know there are other examples out there, and would like to use the forum or blog responses to hear about your experiences. The concern about evidenceless paradigms is that they can become very hard to shift due to the elusiveness of the primary evidence.

I’m also concerned that we’re paying lip-service to fallibilism itself. In a classic model of science we not only view our hypotheses as fallible and testable, but should test and occasionally, or indeed whenever appropriate, reject them. I recently submitted a manuscript to an august and respected nutrition journal, with two independent datasets suggesting that another paradigm in my particular area should, at the very least, be questioned or scrutinised. In the most vigorous and detailed set of reviewing comments I’ve ever received, one reviewer completely disagreed with me (although, to be very fair, equally supported putting the data and argument into the public domain), but the second called me an iconoclast. As I reflect more, I get frustrated that the term iconoclast was used in a pejorative manner. Is science not fundamentally set up specifically to question rigorously and fairly – by definition, to have no icons? Is questioning and testing not a hallmark of science and the scientist? Indeed, is it not a duty of the scientist to be an iconoclast?

  1. Pirie M. How to Win Every Argument: The Use and Abuse of Logic. London: Continuum; 2006.
  2. Corfe BM. Hypothesis: butyrate is not an HDAC inhibitor, but a product inhibitor of deacetylation. Mol Biosyst 2012;8(6):1609-12.