Hans Eysenck and Carl Sargent’s Dishonesty in Parapsychology

Context

I write this blog as a long-term investigator into psychology and the paranormal. This post concerns a saga of intellectual dishonesty by the late Cambridge University psychologist, Carl Sargent, and his mentor, Professor Hans J Eysenck, of King’s College London. A diary of events weaves a dark story that many wish the world would forget, but the story needs to be told. The parties in this story displayed gullibility, bias and wilful deceit. One of them (CS) was forced to leave his academic post and seek another career. Many years after his death, the other (HJE) stands accused of producing ‘unsafe’ publications on an industrial scale.

Carl Lynwood Sargent (1952 – 2018) was a British parapsychologist and author of several roleplaying game-based products and novels, using the pen name Keith Martin to write Fighting Fantasy gamebooks. Sargent also wrote four books with Hans J Eysenck.  

According to his Wikipedia entry,

“Hans Jürgen Eysenck, born in Berlin and died in London, was a British psychologist of German origin known for his work on personality, the heritability of intelligence and behavioural therapies, and for his critiques of psychoanalysis. At the time of his death, Eysenck was the most frequently cited living psychologist in English-language scientific journals.”

Hans J Eysenck’s intellectual honesty was recently the focus of renewed controversy after Anthony Pelosi exposed a series of impossible findings published by Eysenck in the field of health psychology (see here, here and here). Twenty-six of Eysenck’s publications were recently deemed “unsafe” by an investigation at King’s College London, and many others are also under suspicion.

Intellectual dishonesty

Richard Smith (personal communication) astutely remarked as follows: “When forensic accountants detect fraud they assume that everything else from that person may well be fraudulent. Scientists tend to do the opposite–assuming that everything is OK until proved to be fraudulent. But as proving fraud is hard lots of highly questionable material remains untouched.”

Smith continues: “I think of the example of R K Chandra, who was eventually found guilty not only of research fraud but also of financial and business fraud. His first paper established to be fraudulent was in 1989. Why, I ask myself, would you start being honest after you’d practised fraud–yet hundreds of his papers are left unremarked, including unfortunately some that have been shown to be fraudulent.”

A reliable source and long-time colleague of Eysenck’s states: “Eysenck was a mendacious charlatan. I base that not so much on his published fiction but his denial of the link between smoking and cancer was pernicious. His espousal of the beliefs of the John Birch Society was egregious…a grant had to be withdrawn and several researchers dismissed.”

A profile of Hans Eysenck based on his biography by Rod Buchanan, together with his books with Carl Sargent, provides insights into Eysenck’s intellectual values as a scientist and scholar. There were four books with Sargent, all with Eysenck as first author:

  1. Explaining the Unexplained: Mysteries of the Paranormal – Weidenfeld & Nicolson, London (1982)
  2. Explaining the Unexplained: Mysteries of the Paranormal – (2nd Ed.) – Prion Books Ltd (1993)
  3. Know Your Own Psi-Q – John Wiley (1983)
  4. Are You Psychic? –  Prion Books Ltd (1996).

The collaboration between the two authors began in the early 1980s in Sargent’s heyday at Cambridge and continued until 1996.

A Distorted Account of Parapsychology

These four books present a distorted and strongly biased view that psychic powers are scientifically proven.  The evidence suggests exactly the opposite (Marks, 2020).

Eysenck’s and Sargent’s naivety and credulity are everywhere apparent. They present a one-sided view of the scientific evidence on psi and affect the naive stance that fraud and trickery need not be considered. David Nias and Geoffrey Dean (1986) summarised their criticisms of the Eysenck/Sargent books thus: “the failure of Eysenck and Sargent’s books to cover trickery and credulity is a serious deficiency” (p. 368).

In my opinion, these books are among the most distorted and misleading accounts of parapsychological phenomena ever published by academic psychologists. The four books are a total disgrace and how Eysenck had the gall to put his name to them – perhaps only to build his reputation as the fearless contrarian – is beyond imagination.

In addition to the terrible scholarship, there is convincing evidence of scientific fraud by Sargent. How much Hans Eysenck knew about this, we will never know exactly because Eysenck requested that his papers be destroyed after his death. However, Blackmore’s report on Sargent’s fraud became public knowledge several years into the collaboration and years before the third and fourth books with Eysenck were published.

If Sargent had kept his trickery hidden from Eysenck then Eysenck could have been an innocent party. Yet in a partnership built over 14+ years, surely there would have been a conversation that included a question of the kind, ‘Oh, I hear you left Cambridge, why was that?’ If, as seems likely, Sargent ‘fessed up’ by admitting the occurrence of some kind of experimental ‘error’, then Eysenck could have been party to covering up Sargent’s deceit. Did Eysenck imagine nobody would notice? Or perhaps he simply did not care. After all, that great Cambridge genius, Isaac Newton, had done the same kind of thing, and Eysenck saw no problem with a bit of data fudging. According to Eysenck, as he stated in one of his many pot-boilers, a ‘genius’ does whatever is necessary to prove their theories.

Escaping Early Disaster

1978: Carl Sargent starts playing Dungeons & Dragons and submits an article to Imagine magazine.

1979: The University of Cambridge awards Sargent a PhD, which he claims was the first awarded to a parapsychologist by that university.

1979: The Society for Psychical Research provides a grant to Susan Blackmore (SB) enabling her to visit Sargent’s lab at Cambridge in November. The original plan was a month-long visit; however, SB was only able to stay eight days, from 22 to 30 November 1979. Blackmore describes her visit to Sargent’s lab as follows:

“[Sargent’s Ganzfeld] research was providing dramatically positive results for ESP in the GF and mine was not, so the idea was for me to learn from his methods in the hope of achieving similarly good results …. After watching several trials and studying the procedures carefully, I concluded that CS’s experimental protocols were so well designed that the spectacular results I saw must either be evidence for ESP or for fraud. I then took various simple precautions and observed further trials during which it became clear that CS had deliberately violated his own protocols and in one trial had almost certainly cheated. I waited several years for him to respond to my claims and eventually they were published along with his denial. (Harley & Matthews, 1987; Sargent, 1987).”

Sargent’s Career Temporarily Blossoms with Eysenck

In this period, Sargent developed his career in parapsychology at Cambridge with Blackmore’s ‘cheating’  report brushed under the carpet.

1980: Sargent writes a monograph, Exploring Psi in the Ganzfeld. Parapsychological Monographs No 17.

Sargent, C. L., Harley, T. A., Lane, J. and Radcliffe, K. publish: ‘Ganzfeld psi optimization in relation to session duration’, Research in Parapsychology 1980, 82-84.

1981: Sargent and Matthews publish ‘Ganzfeld GESP performance in variable duration testing’. Journal of Parapsychology 1981, 159-160.

1982: Eysenck and Sargent (1982) publish their first book together, Explaining the unexplained: mysteries of the paranormal. Weidenfeld and Nicolson; First Edition.

1983: Eysenck and Sargent publish their second book, Know Your Own PSI-Q.

Then the Inevitable Downfall

1984: The Parapsychological Association Council asks Martin Johnson to head a committee to investigate SB’s accusation of fraud by Sargent. My book, Psychology and the Paranormal, describes what happened next:

The Parapsychological Association (PA) invited CS to provide an account of the ‘errors’ that SB had reported, but he declined to offer any explanation. The PA President, Stanley Krippner, wrote to CS at four different addresses, but still received no reply. The PA’s ‘Sargent Case Report’ dated 10 December 1986 found that, in spite of strong reservations about CS’s randomisation technique, there was insufficient evidence that CS had used unethical procedures.

CS was ‘reproved’ for failing to respond to the PA’s request for information. CS had in any case allowed his PA membership to lapse through non-payment of dues, but he was informed that, should he wish to renew his membership, his application would be considered with ‘extreme prejudice’, i.e. CS would be unlikely to be re-admitted as a member.

The final report of this committee reprimanded Sargent for failing to respond to their request for information within a reasonable time.

1985: Sargent leaves Cambridge University and the parapsychology field [stated in the 2nd edition of Explaining the unexplained: mysteries of the paranormal, 1993].

At some point, Sargent moves into full-time authoring of game-books.

1987: Susan Blackmore’s 1979 report is finally published: ‘A Report of a Visit to Carl Sargent’s Laboratory’, Journal of the Society for Psychical Research, 54, 186-198.

1993: Undeterred by the report of cheating, Eysenck and Sargent publish their third book, Explaining the unexplained: mysteries of the paranormal. (2nd ed.)

1996: Eysenck and Sargent publish their fourth book, Are You Psychic?: Tests & Games to Measure Your Powers, a revised version of Know Your Own Psi-Q.

The Hidden Truth

Two editions of the book by H. J. Eysenck and Sargent (1982, 1993) raise questions about how much Eysenck knew of the fraud accusations against Sargent in Blackmore’s SPR report of 1979. In the 1982 edition of the first book, the procedural problems with Sargent’s GF research are not even mentioned. In the 1993 edition, the authors refer to ‘spirited exchanges on GF research’ between Blackmore, and Sargent and Harley (p. 189).

However, the Ganzfeld evidence of psi is described by them as ‘very, very powerful indeed’. They do not mention the accusations of fraud, CS’s departure from Cambridge University, or his repeated non-cooperation with the Parapsychological Association enquiry.

I obtained an update from Susan Blackmore on her current thinking about her 30-year-old allegation of fraud by CS and on psi research more generally, which I reproduce below. Here are Susan Blackmore’s answers to a few specific questions:

Do you think, in the light of everything that has come to light, CS committed fraud at Cambridge? (Ideally, a yes or a no).

Yes, at least on one specific trial.

Do you think CS knowingly deceived anybody (including possibly himself) or was he simply a victim of confirmation bias/subjective validation?

The former.

Is there anything else you would like to say about research on psi?

In the light of my decades of research on psi, and especially because of my experiences with the GF, I now believe that the possibility of psi existing is vanishingly small, though not zero. I am glad other people continue to study the subject because it would be so important to science if psi did exist. But for myself, I think doing any further psi research would be a complete waste of time. I would not expect to find any phenomena to study, let alone any that could lead us to an explanatory theory. I may yet be proved wrong of course. (Blackmore, personal communication, 1 August 2019)

Summary of facts and conclusions

  1. A consistent pattern of data manipulation in Hans Eysenck’s and at least two collaborators’ research practice is evident over several decades. Yet only recently have journals found it necessary to retract 14 of Hans Eysenck’s papers and to publish 71 expressions of concern. One paper of concern was published by the Proceedings of the Royal Society of Medicine in 1946. 
  2. A reliable source accused Eysenck of cheating with his data analyses in the 1960s and other colleagues and PhD students publicly critiqued Eysenck’s laboratory methods.
  3. In the late 1970s/early 1980s, Eysenck formed a long-term collaboration with a Cambridge academic Carl Sargent in spite of the fact that Carl Sargent had been accused of fraud in 1979.  Eysenck and Sargent’s joint publications, with Eysenck as senior author, occurred over the period 1982-1993.
  4. In 1992 and 1993, Anthony Pelosi, Louis Appleby and others raised serious questions about publications by Eysenck with R Grossarth-Maticek (Pelosi, A. J., & Appleby, L. (1992). Psychological influences on cancer and ischaemic heart disease. British Medical Journal, 304, 1295–1298; Pelosi, A. J., & Appleby, L. (1993). Personality and fatal diseases. British Medical Journal, 306, 1666–1667). The authorities failed to respond.
  5. Anthony Pelosi (2019) again voiced his concerns. His editorial appealing to King’s College London to open an enquiry finally led to concrete action: 25 publications by H J Eysenck and R Grossarth-Maticek have been deemed unsafe by KCL.
  6.  As suspicions strengthened over a 75-year period from the mid-1940s, torpor and complacency in the academic system enabled research malpractice to continue, not only Eysenck’s and Sargent’s, but across the board.
  7.  The currently available systems for regulating research integrity and malpractice are an abject failure. A totally new approach is required. An independent National Research Integrity Ombudsperson needs to be established to significantly improve the governance of academic research.

Master Faker H J Eysenck

“People fake only when they need to fake.”

The replication crisis in science begins with faked data. I discuss here a well-known recent case, that of Hans J Eysenck. An enquiry at King’s College London, together with scientific journals, concluded that multiple publications by Hans J Eysenck are ‘unsafe’ and must be retracted. These recent events suggest that the entire edifice of Eysenck’s work warrants re-examination. In this post I examine some early experimental research by Eysenck and his students at the Institute of Psychiatry during the 1950s and 60s.

Hans J Eysenck was a chameleon figure in the science of psychology. Eysenck doctored data from the very beginning of his theorising. Time and again HJE proved that he was a master of camouflage. I examine here some historically significant data that HJE used, in a misleading way, to promote his biological theory of personality.

The evidence suggests that HJE massaged data to give them more ‘scientific’ appeal. HJE’s biological theory had predicted that introverts would condition more quickly than extraverts. The original data were collected by Cyril M Franks, who had worked for his PhD under Eysenck’s supervision at the Institute of Psychiatry, London. Even Franks would later turn upon the master for his misleading methodology and data analysis. HJE, however, dismissed and vehemently attacked all of his critics, claiming they were wrong, foolhardy and unreasonable.

HJE’s version of Cyril Franks’ data was originally published in the British Journal of Psychology (Eysenck, H. J. (1962). Conditioning and personality. British Journal of Psychology, 53(3), 299–305), again the next year in Nature (Eysenck, H. J. (1963). Biological basis of personality. Nature, 199(4898), 1031–1034), and in multiple other publications. HJE doctored the data to make the introverts show a more rapid increase than the extraverts. These data were a crucial step in his theory published in his 1957 book, The Dynamics of Anxiety and Hysteria.

HJE used a series of questionable research practices (QRPs) that raised many eyebrows, including among insiders at the Institute of Psychiatry. Eysenck’s theory of personality became the subject of scathing criticism. Chapter 5 of Playing With Fire by Rod Buchanan provides the full details.

My personal skepticism about HJE began as an undergraduate student when a lecturer, Vernon Hamilton, told me that HJE had ‘cheated’ with his data – see Hamilton’s critique here.  Other telling criticism was published by Storms and Sigal here and in another article with Franks: see Sigal, Star and Franks here.

In spite of all of the controversy, which he seemed to rather enjoy, HJE became one of the most influential psychologists of all time. His Nature paper has been cited 6331 times.

In light of the recent exposure of Eysenck as a person who carried out serial publication fraud, it is informative to take a close look at Cyril Franks’ PhD research that in HJE’s creative accounting became a foundation stone of HJE’s first theory of personality.  

EYSENCK’S DOCTORED CURVES  

EYSENCK'S DOCTORED VERSION OF FRANKS, 1957, DATA

An almost perfect set of findings, one might assume – too good to be true even. My detailed scrutiny suggests that this was indeed the case. When one examines the data HJE used to generate these two curves, we see anything but smoothly increasing scores.

JAGGED-LOOKING ORIGINAL DATA PUBLISHED BY FRANKS (1956):

Franks tested a hypothesis attributed to Pavlov: “Neurotics of the dysthymic type form conditioned reflexes rapidly, and these reflexes are difficult to extinguish; neurotics of the hysteric type form conditioned reflexes slowly, and these reflexes are easy to extinguish”. Franks chose data from 20 dysthymic patients (having rejected data from 8 others), 20 hysteric patients (having rejected data from 7 others), and 20 non-patients, tested “…in a specially constructed soundproof conditioning laboratory”. The results for the dysthymic and hysteric groups were as follows:


Not unreasonably, Franks concluded that dysthymics give significantly more CRs than hysterics. Buoyed by his initial success, Franks carried out another study to examine the factor of extraversion/introversion in the same eye-blink conditioning task. In this instance Franks hypothesised, following HJE’s theory, that introverts would condition more quickly than extraverts.

MORE JAGGED-LOOKING ORIGINAL DATA PUBLISHED BY FRANKS (1957):

Franks’ 1957 data again show the rates of  classical conditioning in eye-blink responses in this case for 15 introverts and 15 extraverts.  According to Eysenck’s theory, the former group should show more rapid conditioning than the latter. The maximum score was 15.

FRANKS' 1957 DATA

EYSENCK COMBINED DATA FROM FRANKS’ TWO STUDIES

HJE combined the data from Franks’ two studies in a rather creative and unconventional manner. HJE pooled the data from the introvert non-patients with those from patients diagnosed with dysthymia, and the data from the extravert non-patients with those from patients categorised as hysterics. The result was a hotchpotch of the two Franks studies that needs untangling.

1) Eysenck combined the data from the extraverts with the data from the patients classified as hysterics, and the data from the introverts with that collected from the dysthymics. This rather odd amalgam smoothed out many of the jagged edges in the two data sets.

2) There was no justification for assuming that the CR rates began at zero, because all four groups had minimum scores well above zero. This fact was pointed out by Vernon Hamilton. Yet HJE doctored the data to look this way by imposing curves that started at a zero origin.

The next figure shows the data after they had been combined, groups D and I together, and groups H and E together.  I show the combined data with HJE’s smooth curves and the data points as HJE reported them.

EYSENCK ACHIEVED THE LOOK HE WANTED USING CHILDISHLY SIMPLE METHODS

HJE’s 4-step approach to a successful scientific outcome proceeded as follows:

  • First, HJE combined data from 4 different groups to create two new groups even though there was no scientific basis for doing so.
  • Second, although HJE’s and my computations of the combined data points show a fair degree of consistency, HJE appears to have ‘adjusted’ a few data points that didn’t fit the curve.
  • Third, HJE gave his smoothed curves zero starting points, contrary to the actual data, which indicate above-zero baseline scores. HJE thereby disguised the fact that the groups had radically different, non-zero starting points.
  • Fourth, HJE ignored the fact that two lines with identical slope fitted the data equally well.
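The effects of two of these devices, pooling separate groups and forcing a zero origin, can be demonstrated with a small simulation. The sketch below uses hypothetical jagged series (the numbers are invented for illustration and are NOT Franks’ actual scores):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = np.arange(1.0, 11.0)

# Hypothetical conditioning scores (invented for illustration, NOT Franks'
# data): two jagged series, each with a clearly non-zero baseline.
dysthymics = 3.0 + 0.6 * trials + rng.normal(0.0, 1.0, trials.size)
introverts = 2.5 + 0.5 * trials + rng.normal(0.0, 1.0, trials.size)

def resid_sd(x, y):
    """Standard deviation of residuals around an ordinary least-squares line."""
    slope, intercept = np.polyfit(x, y, 1)
    return float(np.std(y - (slope * x + intercept)))

# Device 1: pooling. A least-squares fit is linear in y, so the residuals of
# the pooled series are the mean of the individual residuals: pooling can
# never look more jagged than the worse of the two original series.
combined = (dysthymics + introverts) / 2.0
assert resid_sd(trials, combined) <= max(resid_sd(trials, dysthymics),
                                         resid_sd(trials, introverts))

# Device 2: forcing a zero origin. A line constrained through (0, 0) can
# never fit better than one with a free intercept; when the true baseline
# sits above zero, it fits distinctly worse.
slope_origin = np.sum(trials * combined) / np.sum(trials ** 2)
sse_origin = float(np.sum((combined - slope_origin * trials) ** 2))
slope, intercept = np.polyfit(trials, combined, 1)
sse_free = float(np.sum((combined - (slope * trials + intercept)) ** 2))
assert sse_origin >= sse_free
```

Pooling hides the jaggedness, and only the free-intercept fit respects above-zero baselines of the kind Hamilton pointed out; the zero-origin constraint buys a tidy-looking curve at the price of a worse fit.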

Using these devices, HJE promoted the data as respectable science fit for publication in the most reputable journals. This analysis suggests something rather different – that HJE was an out-and-out charlatan.

The left panel of the diagram below shows HJE’s published curves with his cunningly averaged data-points converted to percentages (small dots), and the same averaged data-points obtained by this author (DFM; o’s and x’s). In most cases, HJE’s and DFM’s data points coincide. In at least 4 instances, however, HJE’s points lie closer to the theoretical curve than the correct figures would suggest.

The right-hand panel shows the fit to the same two data sets using linear plots with identical slope. Neither of the fitted functions looks anywhere near perfect, but there is no prima facie case for preferring the curvilinear to the linear fit.

 

HJE'S CURVES COMPARED WITH LINEAR FITS

CONCLUSION

HJE constructed a curvilinear association between eye-blink classical conditioning rates and questionnaire measures of extraversion-introversion. These curves were artificially doctored to suggest that introverts conditioned more quickly than extraverts, as HJE’s theory had predicted. By combining data that did not belong together, HJE was able to smooth the data sets, which when considered separately did not fit the predictions quite so well.  HJE avoided a feasible alternative (null) hypothesis that the two groups produced identical rates of conditioning. In so doing, HJE helped to establish his first biological theory of personality. This was not only bad science, it was faked science, the work of a chameleon.

I thank Rod Buchanan for his input and advice.


Uri Geller: Self-Proclaimed ‘Psychic’

A flash-back to 1975 when Uri Geller came to town.  Super-psychic or super-charlatan? Who to Believe?

On the one hand, a scientific report published in Nature verifying Geller’s psychic abilities under supposedly cheat-proof conditions, and on the other, a highly speculative but critical attack published simultaneously in New Scientist. While Targ and Puthoff’s reply … would seem to invalidate his explanations of their significant results, it is difficult to disregard the doubts Hanlon has raised about the “circus atmosphere” that he believes surrounded the SRI experiments. Frankly, at the end of 1974, we were puzzled and confused. Weighing all the evidence available at that time, it seemed impossible to decide whether Geller was a genuine psychic or an ingenious and highly skilled hoaxer. Clearly, the Geller effect had to be taken seriously as, in either case, there would be much of interest to learn about the mechanics of psychic performance. Clearly, what was needed was more experimentation.

First Encounter

Our first live encounter with Geller was accidental. On 23 March 1975 Geller arrived in New Zealand from Australia to begin a series of four “lecture-demonstrations” of his psychic powers. To facilitate communications, I (David Marks) checked into the same hotel as Geller in the capital, Wellington, hoping we could obtain a sufficient level of cooperation to complete a series of laboratory tests. I left a letter for Geller at the hotel reception inviting his participation in some experiments.

I had been told by Geller’s local agent, Bruce Warwick, that Geller was due to arrive on the ten o’clock plane, and so an arrangement was made to talk with Geller the next morning after a press conference. At eight o’clock on the evening of the 23rd, I went down to dinner in the almost empty hotel restaurant. At about nine o’clock a party of noisy, flamboyant people sat down at the table next to me in the quiet dining room. From their accents, some were obviously Americans, others Australians, and others sounded like Americanized Israelis.

Suddenly, as I idly scanned their faces, to my utter amazement, I saw Uri Geller. Apparently, he had materialized himself into New Zealand prior to the aircraft’s arrival! He was sitting with his back to me, not more than ten feet away, opposite a woman with blond hair who spoke loudly and clearly with a distinct American accent.

Geller’s Faux Pas

Although I was dying to meet Geller, my first reaction was to leave, as the last thing I wanted to do was invade Geller’s privacy. However, it was I who had been there first, and they had sat next to me, not vice versa, so I decided to finish my dinner and then leave.

To this day I can still hardly believe what took place in the next few moments. The American woman (whom I later discovered to be Miss Solveig Clark, one of Geller’s personal assistants) asked Geller in a clear and distinctive voice whether he had “read the letter from Dr. Marks.” Like most other people, I find it hard not to tune in to a conversation when my name is mentioned. I heard Geller reply: “Keep that guy away from me; he’ll pick up the signals (sic).”

No words can describe how I felt at that moment. What signals? Could these be the signals described in the New Scientist? Who was Geller’s female confidante? Was Puharich there, or Shipi Shtrang? Although I couldn’t answer all these questions, Geller had already told me more than I ever imagined would be possible. Yet Geller was blissfully ignorant of this major faux pas. I couldn’t help feeling that if Geller were truly psychic, he’d certainly have sensed my presence and avoided giving away trade secrets!

Postscript:

No question about it, from that moment I sensed that Uri Geller was nothing more than a not-so-clever trickster.

Anybody can bend a spoon, as long as you have a firm grip. Try it without touching, however, and it’s a very different story.

Geller successfully conned pretty much the whole world into believing he had special powers.

He does. It’s called sleight-of-hand!

Flashback: an extract from “The Psychology of the Psychic”.

H J Eysenck’s ‘Unsafe’ Publications Total 148

Featured

This post updates the situation regarding publications by Hans J Eysenck that are deemed ‘unsafe’. The 148 publications include 87 publications identified by David F Marks and Roderick D Buchanan and 61 papers in two journals flagged by SAGE Publications on 10 February 2020 (details below).

To date, only fourteen of HJ Eysenck’s 148 suspect papers have been retracted. A list containing  details of 61 of the suspect papers was published more than a year ago.

Why are journals so slow to retract such obviously dubious papers?

Complacent, Complicit Institutions

A large part of the blame lies with King’s College London, with which Hans J Eysenck’s institute was affiliated. The institution has been slow and reluctant to act. KCL conducted a review of Eysenck’s publications but failed to complete the job. A recent editorial co-authored with Eysenck’s biographer, Rod D Buchanan, called on KCL to complete their review properly. To date, KCL has given no response.

Equally culpable is the British Psychological Society, the representative body for psychology and psychologists in the UK, responsible for the promotion of excellence and ethical practice in the science, education, and practical applications of psychology. As the only professional association of psychologists in Britain, the BPS has refused to do anything at all to censure H J Eysenck’s fraudulent research. How can the British public feel protected from ‘fake news’ and fraud if the Society responsible for policing psychological practice in the UK sticks its head in the sand? An utter disgrace!

Remember that, according to HJE, the connection between smoking and cancer was unproven. Moreover, cancer and heart disease can be caused by one’s own personality!

Yet the BPS has done nothing to correct these blatant falsehoods.

To this day, the Society continues to bolster HJE’s flagging reputation.

The Society’s magazine published a letter claiming that this author’s request for an inquiry into H J Eysenck: “…is representative of the very type of smear campaign and witch-hunting which Eysenck was subjected to previously.”

The British Psychological Society’s complicity in Eysenck’s discredited publication record and its refusal to take any action whatsoever is shameful. It is evident that the BPS is more interested in protecting its own than the British public.

Shared responsibility

The responsibility for H J Eysenck’s many suspect publications cannot be laid only at Eysenck’s door. The many co-authors of the long list of suspect publications were required to vouch for the authenticity of the data, analyses and conclusions when the papers were accepted for publication.

Many of the suspect papers were co-authored with well-known figures in the discipline of psychology, including HJE’s second wife, Sybil B. G. Eysenck. Other co-authors include professors holding chairs in the University of London, such as Professors Adrian Furnham and Chris Frith at University College London. Paul Barrett was co-director with Hans Eysenck of the Biosignal Lab at the University of London’s Institute of Psychiatry for 14 years, and is currently Chief Research Scientist at Cognadev (UK and SA) and Professor of Psychology at the University of Auckland, New Zealand.

Another of HJE’s co-authors is Richard Lynn, a former professor of psychology at Ulster University (the title was withdrawn by the university in 2018) and Editor-in-Chief of the journal Mankind Quarterly, which has been described as a “white supremacist journal”. Hans Eysenck’s eugenicist convictions will be the subject of a later post.


No publications in the two journals founded by HJE have yet been retracted. However, three have been listed in an Expression of Concern: https://doi.org/10.1016/j.paid.2020.109855

In spite of the obvious fraud, the journal Personality and Individual Differences, one of the journals founded by HJE, retracts nothing. PAID cannot bring itself to publicly acknowledge that HJE was a charlatan. Many who signed the expression of concern are Eysenck’s co-authors, including Barrett, referred to above. No conflict of interest there, then.


Full bibliographic details can be found at the Retraction Watch database: 14 retractions and 64 expressions of concern.

Journals Slow to Act

73 items are pending a response from the relevant publishers. The publishers are listed in the KCL enquiry report as follows:

Professor Roger Pearson (Editor) Journal of Social, Political, and Economic Studies Council for Social and Economic Studies PO Box 34143 Washington DC 20043, USA (1 paper).

Michelle G. Craske (Editor) Behaviour Research and Therapy Department of Psychology University of California at Los Angeles (UCLA) 405 Hilgard Avenue, Los Angeles, CA 90095-1563 California, USA (4 papers).

Dr Donald Saklofske Personality and Individual Differences Department of Psychology University of Western Ontario Canada (4 papers).

Jaan Valsiner (Editor-in-Chief) Integrative Psychological and Behavioral Science Department of Psychology Clark University Worcester, MA 01610-1477, USA (1 paper).

Professor Oi Ling Siu (Editor) International Journal of Stress Management WYL201/1 Dorothy Y L Wong Building Department of Applied Psychology Lingnan University Tuen Mun Hong Kong (1 paper).

Werner Strik (Editor) Neuropsychobiology University Hospital of Psychiatry Waldau CH-3000 Bern 60 Switzerland (1 paper).

Adam S. Radomsky (Editor) Journal of Behavior Therapy and Experimental Psychiatry L-PY 101-4 Psychology Building 7141 Sherbrooke W. Concordia University in Montreal Canada (1 paper).

Timothy R Elliott (Editor) Journal of Clinical Psychology Education & Human Development Texas A&M University 713A Harrington Office Building (2 papers).

How much longer do these journals need to wait?

Two journals published by SAGE have already listed 13 retractions and expressions of concern on 61 papers. Other journals need to follow suit.

Psychological Reports Expression of Concern

https://doi.org/10.1177/0033294120901991

The Journal Editor and SAGE Publishing hereby issue an expression of concern for the following articles:

  1. Eysenck, H. J. (1955). Psychiatric Diagnosis as a Psychological and Statistical Problem. Psychological Reports, 1(1), 3–17. https://doi.org/10.2466/pr0.1955.1.g.3
  2. Eysenck, S. B. G., & Eysenck, H. J. (1964). “Acquiescence” Response Set in Personality Inventory Items. Psychological Reports, 14(2), 513–514. https://doi.org/10.2466/pr0.1964.14.2.513
  3. Eysenck, H. J. (1956). Diagnosis and Measurement: A Reply to Loevinger. Psychological Reports, 2(3), 117–118. https://doi.org/10.2466/pr0.1956.2.3.117
  4. Eysenck, S. B. G., & Eysenck, H. J. (1967). Physiological Reactivity to Sensory Stimulation as a Measure of Personality. Psychological Reports, 20(1), 45–46. https://doi.org/10.2466/pr0.1967.20.1.45
  5. Sartory, G., & Eysenck, H. J. (1976). Strain Differences in Acquisition and Extinction of Fear Responses in Rats. Psychological Reports, 38(1), 163–187. https://doi.org/10.2466/pr0.1976.38.1.163
  6. Bruni, P., & Eysenck, H. J. (1976). Structure of Attitudes—An Italian Sample. Psychological Reports, 38(3), 956–958. https://doi.org/10.2466/pr0.1976.38.3.956
  7. Eysenck, H. J. (1976). Structure of Social Attitudes. Psychological Reports, 39(2), 463–466. https://doi.org/10.2466/pr0.1976.39.2.463
  8. Eysenck, S. B. G., White, O., & Eysenck, H. J. (1976). Personality and Mental Illness. Psychological Reports, 39(3), 1011–1022. https://doi.org/10.2466/pr0.1976.39.3.1011
  9. Eysenck, H. J. (1958). The Nature of Anxiety and the Factorial Method. Psychological Reports, 4(2), 453–454. https://doi.org/10.2466/pr0.1958.4.h.453
  10. Hewitt, J. K., Eysenck, H. J., & Eaves, L. J. (1977). Structure of Social Attitudes after Twenty-Five Years: A Replication. Psychological Reports, 40(1), 183–188. https://doi.org/10.2466/pr0.1977.40.1.183
  11. Eysenck, S. B. G., & Eysenck, H. J. (1977). Personality Differences between Prisoners and Controls. Psychological Reports, 40(3_suppl), 1023–1028. https://doi.org/10.2466/pr0.1977.40.3c.1023
  12. Eysenck, H. J. (1977). National Differences in Personality as Related to ABO Blood Group Polymorphism. Psychological Reports, 41(3_suppl), 1257–1258. https://doi.org/10.2466/pr0.1977.41.3f.1257
  13. Hewitt, J. K., Fulker, D. W., & Eysenck, H. J. (1978). Effect of Strain and Level of Shock on the Behaviour of Rats in PSI Experiments. Psychological Reports, 42(3_suppl), 1103–1108. https://doi.org/10.2466/pr0.1978.42.3c.1103
  14. Eysenck, S. B. G., & Eysenck, H. J. (1978). Impulsiveness and Venturesomeness: Their Position in a Dimensional System of Personality Description. Psychological Reports, 43(3_suppl), 1247–1255. https://doi.org/10.2466/pr0.1978.43.3f.1247
  15. Eysenck, H. J. (1979). Personality Factors in a Random Sample of the Population. Psychological Reports, 44(3_suppl), 1023–1027. https://doi.org/10.2466/pr0.1979.44.3c.1023
  16. Eysenck, H. J. (1980). Psychology of the Scientist: XLIV. Sir Cyril Burt: Prominence versus Personality. Psychological Reports, 46(3), 893–894. https://doi.org/10.2466/pr0.1980.46.3.893
  17. Eysenck, H. J. (1980). Personality, Marital Satisfaction, and Divorce. Psychological Reports, 47(3_suppl), 1235–1238. https://doi.org/10.2466/pr0.1980.47.3f.1235
  18. Eysenck, H. J. (1959). Comments on a Test of the Personality-Satiation-Inhibition Theory. Psychological Reports, 5(2), 395–396. https://doi.org/10.2466/pr0.1959.5.h.395
  19. Eysenck, H. J. (1959). Personality and Verbal Conditioning. Psychological Reports, 5(2), 570. https://doi.org/10.2466/pr0.1959.5.h.570
  20. Eysenck, H. J. (1959). Personality and Problem Solving. Psychological Reports, 5(3), 592. https://doi.org/10.2466/pr0.1959.5.3.592
  21. Eysenck, H. J. (1982). The Biological Basis of Cross-Cultural Differences in Personality: Blood Group Antigens. Psychological Reports, 51(2), 531–540. https://doi.org/10.2466/pr0.1982.51.2.531
  22. Eysenck, H. J. (1987). Comments on “the Orthogonality of Extraversion and Neuroticism Scales.” Psychological Reports, 61(1), 50. https://doi.org/10.2466/pr0.1987.61.1.50
  23. Eysenck, H. J., & Barrett, P. (1993). The Nature of Schizotypy. Psychological Reports, 73(1), 59–63. https://doi.org/10.2466/pr0.1993.73.1.59
  24. Eysenck, H. J. (1995). Some Comments on the Gough Socialization Scale. Psychological Reports, 76(1), 298. https://doi.org/10.2466/pr0.1995.76.1.298
  25. Eysenck, H. J., Eysenck, S. B. G., & Barrett, P. (1995). Personality Differences According to Gender. Psychological Reports, 76(3), 711–716. https://doi.org/10.2466/pr0.1995.76.3.711

Perceptual and Motor Skills Expression of Concern

https://doi.org/10.1177/0031512520901993

  1. Frith, C. D., & Eysenck, H. J. (1982). Reminiscence and Learning: One or Many? Perceptual and Motor Skills, 54(2), 494. https://doi.org/10.2466/pms.1982.54.2.494
  2. Eysenck, H. J., & Eysenck, S. B. G. (1960). Reminiscence on the Spiral After-Effect as a Function of Length of Rest and Number of Pre-Rest Trials. Perceptual and Motor Skills, 10(2), 93–94. https://doi.org/10.2466/pms.1960.10.2.93
  3. Eysenck, H. J. (1960). Reminiscence, Extraversion and Neuroticism. Perceptual and Motor Skills, 11(1), 21–22. https://doi.org/10.2466/pms.1960.11.1.21
  4. Eysenck, H. J. (1960). Reminiscence as a Function of Rest, Practice, and Personality. Perceptual and Motor Skills, 11(1), 91–94E. https://doi.org/10.2466/pms.1960.11.1.91
  5. Eysenck, H. J., & Holland, H. (1960). Length of Spiral After-Effect as a Function of Drive. Perceptual and Motor Skills, 11(2), 129–130. https://doi.org/10.2466/pms.1960.11.2.129
  6. Eysenck, H. J. (1960). Reminiscence and Post-Rest Increment after Massed Practice. Perceptual and Motor Skills, 11(2), 221–222. https://doi.org/10.2466/pms.1960.11.2.221
  7. Holland, H., & Eysenck, H. J. (1960). Spiral After-Effect as a Function of Length of Stimulation. Perceptual and Motor Skills, 11(2), 228. https://doi.org/10.2466/pms.1960.11.2.228
  8. Lynn, R., & Eysenck, H. J. (1961). Tolerance for Pain, Extraversion and Neuroticism. Perceptual and Motor Skills, 12(2), 161–162. https://doi.org/10.2466/pms.1961.12.2.161
  9. Costello, C. G., & Eysenck, H. J. (1961). Persistence, Personality, and Motivation. Perceptual and Motor Skills, 12(2), 169–170. https://doi.org/10.2466/pms.1961.12.2.169
  10. Eysenck, H. J., & Willett, R. A. (1962). Cue Utilization as a Function of Drive: An Experimental Study. Perceptual and Motor Skills, 15(1), 229–230. https://doi.org/10.2466/pms.1962.15.1.229
  11. Eysenck, H. J., & Willett, R. A. (1962). Performance and Reminiscence on a Symbol Substitution Task as a Function of Drive. Perceptual and Motor Skills, 15(2), 389–390. https://doi.org/10.2466/pms.1962.15.2.389
  12. Eysenck, H. J. (1962). Figural After-Effects, Personality, and Inter-Sensory Comparisons. Perceptual and Motor Skills, 15(2), 405–406. https://doi.org/10.2466/pms.1962.15.2.405
  13. Eysenck, H. J. (1964). Involuntary Rest Pauses in Tapping as a Function of Drive and Personality. Perceptual and Motor Skills, 18(1), 173–174. https://doi.org/10.2466/pms.1964.18.1.173
  14. Eysenck, H. J. (1966). On the Dual Function of Consolidation. Perceptual and Motor Skills, 22(1), 273–274. https://doi.org/10.2466/pms.1966.22.1.273
  15. Eysenck, H. J. (1967). Factor-Analytic Study of the Maitland Graves Design Judgment Test. Perceptual and Motor Skills, 24(1), 73–74. https://doi.org/10.2466/pms.1967.24.1.73
  16. Eysenck, S. B. G., & Eysenck, H. J. (1967). Salivary Response to Lemon Juice as a Measure of Introversion. Perceptual and Motor Skills, 24(3_suppl), 1047–1053. https://doi.org/10.2466/pms.1967.24.3c.1047
  17. Eysenck, H. J. (1969). A New Theory of Post-Rest Upswing or “Warm-up” in Motor Learning. Perceptual and Motor Skills, 28(3), 992–994. https://doi.org/10.2466/pms.1969.28.3.992
  18. Eysenck, H. J. (1970). An Application of the Maitland Graves Design Judgment Test to Professional Artists. Perceptual and Motor Skills, 30(2), 589–590. https://doi.org/10.2466/pms.1970.30.2.589
  19. Eysenck, S. B. G., Russell, T., & Eysenck, H. J. (1970). Extraversion, Intelligence, and Ability to Draw a Person. Perceptual and Motor Skills, 30(3), 925–926. https://doi.org/10.2466/pms.1970.30.3.925
  20. Eysenck, H. J. (1971). Relation between Intelligence and Personality. Perceptual and Motor Skills, 32(2), 637–638. https://doi.org/10.2466/pms.1971.32.2.637
  21. Eysenck, H. J., & Iwawaki, S. (1971). Cultural Relativity in Aesthetic Judgments: An Empirical Study. Perceptual and Motor Skills, 32(3), 817–818. https://doi.org/10.2466/pms.1971.32.3.817
  22. Eysenck, S. B. G., & Eysenck, H. J. (1971). Attitudes to Sex, Personality and Lie Scale Scores. Perceptual and Motor Skills, 33(1), 216–218. https://doi.org/10.2466/pms.1971.33.1.216
  23. Wilson, G. D., Tunstall, O. A., & Eysenck, H. J. (1971). Individual Differences in Tapping Performance as a Function of Time on the Task. Perceptual and Motor Skills, 33(2), 375–378. https://doi.org/10.2466/pms.1971.33.2.375
  24. Eysenck, H. J., & Eysenck, S. B. G. (1971). The Orthogonality of Psychoticism and Neuroticism: A Factorial Study. Perceptual and Motor Skills, 33(2), 461–462. https://doi.org/10.2466/pms.1971.33.2.461
  25. Eysenck, H. J. (1972). Preference Judgments for Polygons, Designs, and Drawings. Perceptual and Motor Skills, 34(2), 396–398. https://doi.org/10.2466/pms.1972.34.2.396
  26. Bone, R. N., & Eysenck, H. J. (1972). Extraversion, Field-Dependence, and the Stroop Test. Perceptual and Motor Skills, 34(3), 873–874. https://doi.org/10.2466/pms.1972.34.3.873
  27. Eysenck, H. J., & Soueif, M. (1972). An Empirical Test of the Theory of Sexual Symbolism. Perceptual and Motor Skills, 35(3), 945–946. https://doi.org/10.2466/pms.1972.35.3.945
  28. Allsopp, J. F., & Eysenck, H. J. (1974). Personality as a Determinant of Paired-Associates Learning. Perceptual and Motor Skills, 39(1), 315–324. https://doi.org/10.2466/pms.1974.39.1.315
  29. Götz, K. O., Lynn, R., Borisy, A. R., & Eysenck, H. J. (1979). A New Visual Aesthetic Sensitivity Test: I. Construction and Psychometric Properties. Perceptual and Motor Skills, 49(3), 795–802. https://doi.org/10.2466/pms.1979.49.3.795
  30. Iwawaki, S., Eysenck, H. J., & Götz, K. O. (1979). A New Visual Aesthetic Sensitivity Test (VAST): II. Cross-Cultural Comparison between England and Japan. Perceptual and Motor Skills, 49(3), 859–862. https://doi.org/10.2466/pms.1979.49.3.859
  31. Chan, J., Eysenck, H. J., & Götz, K. O. (1980). A New Visual Aesthetic Sensitivity Test: III. Cross-Cultural Comparison between Hong Kong Children and Adults, and English and Japanese Samples. Perceptual and Motor Skills, 50(3_suppl), 1325–1326. https://doi.org/10.2466/pms.1980.50.3c.1325
  32. Frith, C. D., & Eysenck, H. J. (1981). Reminiscence—Psychomotor Learning: A Reply to Coppage and Payne. Perceptual and Motor Skills, 53(3), 842. https://doi.org/10.2466/pms.1981.53.3.842
  33. Chan, J. W. C., Eysenck, H. J., & Lynn, R. (1991). Reaction Times and Intelligence among Hong Kong Children. Perceptual and Motor Skills, 72(2), 427–433. https://doi.org/10.2466/pms.1991.72.2.427
  34. Lynn, R., Chan, J. W. C., & Eysenck, H. J. (1991). Reaction Times and Intelligence in Chinese and British Children. Perceptual and Motor Skills, 72(2), 443–452. https://doi.org/10.2466/pms.1991.72.2.443
  35. Eysenck, H. J., & Furnham, A. (1993). Personality and the Barron-Welsh Art Scale. Perceptual and Motor Skills, 76(3), 837–838. https://doi.org/10.2466/pms.1993.76.3.837
  36. Eysenck, H. J. (1959). Personality and the Estimation of Time. Perceptual and Motor Skills, 9(3), 405–406. https://doi.org/10.2466/pms.1959.9.3.405

CONCLUSION

The list of unsafe publications grows and grows. How many more can there be? And when will the scientific record finally be corrected?

KCL Enquiry Stopped Prematurely

HANS EYSENCK EXPOSURE

In 2019 we exposed the largest fraud ever perpetrated in the history of psychology (Marks, 2019; Pelosi, 2019). This audacious fraud was carried out by the UK’s most published and best-known psychologist, the late Professor Hans J Eysenck (1916–1997), by all accounts a maverick and controversial figure.

We called for an enquiry (Marks, 2019).  H J Eysenck’s ex-employer, the Institute of Psychiatry in Denmark Hill, is now a part of King’s College London (KCL).

The enquiry at KCL concluded that 25 publications were unsafe. However, the enquiry report remains unpublished and incomplete.  

KCL reviewed publications written by Eysenck with his collaborator Ronald Grossarth-Maticek. The enquiry failed to investigate 36 other bogus items based on exactly the same data collected by Eysenck’s collaborator.

The KCL enquiry must be properly completed to include the entire set of 61 bogus publications.

The Eysenck affair makes a strong case for a National Research Integrity Ombudsperson.

WHAT HAPPENED

The Journal of Health Psychology published a penetrating review by Anthony Pelosi (2019) of Hans J Eysenck’s research on fatal illnesses and personality.

Eysenck’s research had been conducted with a German sociologist, Ronald Grossarth-Maticek, who claimed affiliation to Eysenck’s employer, the Institute of Psychiatry, now part of King’s College London (KCL). In a survey of Eysenck’s publications about fatal illness and personality, I identified a provisional total of 61 that exceeded any reasonable boundary of scientific credibility. Based on Pelosi’s case and my review of these dubious publications, I called for an investigation by KCL into these publications (Marks, 2019).

On 3rd December 2018  I sent a pre-publication copy of Anthony Pelosi’s review and my editorial to the Principal of KCL, Professor Edward Byrne. On 13th December 2018,  I received a reply informing me that a considered response would follow a KCL review.

FOUR-MONTH DELAY

On 25th June 2019 I was informed that KCL had completed its enquiry into publications authored by Professor Hans Eysenck with Professor Ronald Grossarth-Maticek. Professor Byrne said that KCL had contacted the University of Heidelberg, with which Professor Grossarth-Maticek is associated. Professor Byrne confirmed that the enquiry had found “a number of papers” to be questionable and that KCL would be writing to the editors of the relevant journals to inform them. I requested a copy of the enquiry report but heard nothing more until 4th October 2019, when I received the enquiry report dated ‘May 2019’.

The reason for the four-month delay is unclear.

THE REPORT

According to the enquiry report, the Principal had asked the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) to set up a committee to examine publications authored by Professor Hans Eysenck with Professor Ronald Grossarth-Maticek.

Why only the publications co-authored with Grossarth-Maticek? The reason for this limitation in the scope of the enquiry is not given.

The enquiry committee expressed its concerns about the Eysenck and Grossarth-Maticek papers in the following terms:

“The concerns are based on two issues. First, the validity of the datasets, in terms of recruitment of participants, administration of measures, reliability of outcome ascertainment, biases in data collection, absence of relevant covariates, and selection of cases analysed in each article. Second, the implausibility of the results presented, many of which show effect sizes virtually unknown in medical science. For example, the relative risk of dying of cancer for individuals with ‘cancer-prone’ personality compared with healthy  personality was over 100, while the risk of cancer mortality was reduced 80% by bibliotherapy. These findings are incompatible with modern clinical science and the understanding of disease processes.”

REPORT’S CONCLUSION

“The Committee shared the concerns made by the critics of this body of work. We have come to the conclusion that we consider the published results of studies that included the results of the analyses of data collected as part of the intervention or observational studies to be unsafe and that the editors of the journals should be informed of our decision. We have highlighted 26 papers (Appendix 1) which were published in 11 journals which are still in existence.”

 


SHIFTING THE BLAME

As noted, the KCL enquiry was based on the publications Eysenck co-authored with Ronald Grossarth-Maticek. Was this a manoeuvre designed to try and shift the blame away from Eysenck towards Grossarth-Maticek?

If so, it failed.

Any implication that Eysenck was a hapless victim of a dishonest act of data manipulation by Grossarth-Maticek is inconsistent with the evidence. After all, a large subset of 36 single-authored publications by the great man himself were based, partly or entirely, on the same body of research data as the co-authored publications.

Eysenck produced numerous publications both with and without his collaborator, many of them covering exactly the same material. Multiple publication of the same material is definitely not part of the normally recognised process of academic publication.

There is indubitable evidence in Eysenck’s multiple publications of the Questionable Writing Practice of self-plagiarism. 

OTHER EXAMPLES OF EYSENCK’S FRAUD

The KCL enquiry failed to identify the full extent of Eysenck’s fraud. Its enquiry must be extended to examine the ‘safety’ of Eysenck’s 36 bogus single-authored publications.  It should also examine Eysenck’s multiple publications covering the same ground for evidence of self-plagiarism.

On no fewer than 20 co-authored papers, Grossarth-Maticek was falsely shown as being affiliated to the Institute of Psychiatry. This fraud can be laid squarely at Eysenck’s door: Grossarth-Maticek could not have asserted this false affiliation without Eysenck’s deliberate connivance.

Recent personal communications with Ronald Grossarth-Maticek indicate that Grossarth-Maticek does not have even a minimal command of English.  It can be reasonably assumed that Grossarth-Maticek was 100% reliant on Eysenck to produce the English language versions of the 25 unsafe papers.

TOTAL BODY OF BOGUS WORK MUST BE CONSIDERED

In total, Eysenck was responsible for no fewer than 61 publications using the bogus data sets. This total body of 61 publications includes more than 40 peer-reviewed journal articles, 10 book chapters and two books, each in three editions.

The proper course for editors and publishers is to retract all 61 publications.

To quote James Heathers: “Eysenck would eclipse Diederik Stapel (58) as the most retracted psychologist in history, a scarcely believable legacy for someone who was at one time the most cited psychologist on the planet” (Heathers, 2019).

INABILITY TO PROPERLY INVESTIGATE FRAUD INSIDE ACADEMIC INSTITUTIONS

There are strong reasons to doubt the capability of KCL and other academic institutions to investigate academic misconduct properly and fully because of the obvious conflict of interest. In a previous case, when I brought a complaint of academic misconduct to KCL, the institution failed to follow its own procedures for investigating the complaint. In the Eysenck case, it has investigated fewer than half of the publications pertaining to the complaint.

All of which leads one to conclude that there is an urgent need to establish a National Research Integrity Ombudsperson to investigate allegations of academic misconduct.

The need for an independent UK body to promote good governance, management and conduct of academic, scientific and medical research could never be stronger than in the present situation. The Eysenck affair requires the full attention of the institutions that govern scientific practice.

The only professional body for psychologists, the British Psychological Society, washed its hands of the problem by passing the entire responsibility to KCL.

This is not an issue about a single individual’s alleged misconduct, or about a single institution; it is about the integrity of science. Without a genuine ability to assure governance, quality and integrity, science is a failure unto itself, to reason and to ethics.

As James Heathers points out:

“The question is, does anybody have the will to do anything about it?”

REFERENCES

Heathers, J. (2019). Do we have the will to do anything about it? James Heathers reflects on the Eysenck case. https://retractionwatch.com/2019/10/07/do-we-have-the-will-to-do-anything-about-it-james-heathers-reflects-on-the-eysenck-case/

Marks, D.F. (2019). The Hans Eysenck affair: Time to correct the scientific record. Journal of Health Psychology, 24(4), 409–420.

Pelosi, A. (2019). Personality and fatal diseases: Revisiting a scientific scandal. Journal of Health Psychology, 24(4), 421–439.

[1] Only 25 papers were listed in Appendix 1.

Personality and Fatal Diseases: Revisiting a Scientific Scandal

Featured

During the 1980s and 1990s, Hans J Eysenck conducted a programme of research into the causes, prevention and treatment of fatal diseases in collaboration with one of his protégés, Ronald Grossarth-Maticek. This led to what must be the most astonishing series of findings ever published in the peer-reviewed scientific literature with effect sizes that have never otherwise been encountered in biomedical research. This article outlines just some of these reported findings and signposts readers to extremely serious scientific and ethical criticisms that were published almost three decades ago. Confidential internal documents that have become available as a result of litigation against tobacco companies provide additional insights into this work. It is suggested that this research programme has led to one of the worst scientific scandals of all time. A call is made for a long overdue formal inquiry.

Read paper at:

https://journals.sagepub.com/doi/full/10.1177/1359105318822045

Scientific Fraud at London University

The University of London (UL) is a complex, federal institution that includes University College London (UCL), the LSE, King’s College London and the London Business School. The University is the world’s oldest provider of academic awards through distance and flexible learning, dating back to 1858. The UL website proudly announces that it “has been shortlisted for the International Impact Award at the 2018 Times Higher Education Awards, known as the ‘Oscars’ for higher education.”

The academic context of an institution of the size and complexity of the UL is one of intense external and internal competition. These colleges compete fiercely for resources on a national and international stage. Many of them do exceedingly well. They are obsessed by their positions in various public league tables.  For example, the Times Higher Education (2018) World University Rankings for 2019 place Imperial College, UCL, LSE and King’s at 9th, 14th, 26th and 38th places respectively in a table of 1250 universities. These rankings matter and the only game in town is to move up the table. Oxford and Cambridge are in first and second place, with Stanford, MIT, CalTech and the Ivy League universities not far behind.

Within UL itself, there is intense rivalry between the member colleges, the Medical Schools, the schools and departments within those colleges, the research groups and units within departments, and, finally, individual academics. The white heat of competition needs to be directly observed or experienced to be believed. Academics at every level are under intense pressure to obtain research grants and to publish peer-reviewed papers in high-impact journals, to raise the perceived status of their schools and departments, and to do all of these things as quickly as possible. As a consequence, simply to stay in the race, any method that produces outstanding results will be tried. Unfortunately, from time to time, this means that academics resort to fraudulent practices.

This always does harm; it harms patients, biomedicine and science. It also harms the reputations of the individuals concerned and their institutions. For this reason, information about scientific misconduct seldom finds its way into public arenas, yet it is a notable part of ‘behind the scenes’ academic history. In “Scientific misconduct and the myth of self-correction in science”, Stroebe, Postmes and Spears (2012) discuss 40 cases of fraud that occurred between 1974 and 2012. The majority occurred in biomedicine, and the only two UK cases were at UL. Academic institutions prefer to keep scientific fraud committed by their employees behind closed doors. Then, with the inevitable leaks, news of ‘scandals’ creates headlines in the mainstream media. This means that academic responses to fraud are driven by scandals. To quote Richard Smith (2006): “They accumulate to a point where the scientific community can no longer ignore them and ‘something has to be done’. Usually this process is excruciatingly slow.”

There have been several proven cases of scientific misconduct, involving fabrication and fraud, at esteemed colleges within London University, which has been blighted by a high proportion of ‘celebrity’ fraud cases. A few of these are summarised below.

UNIVERSITY COLLEGE LONDON – BURT SCANDAL


Sir Cyril Burt at University College London claimed that a child’s intelligence is mainly inherited and that social circumstances play only a minor role. Burt was a eugenicist, and he fabricated data in a manner that appeared to confirm genetic theories of intelligence. Burt’s research formed the basis of education policy from the 1920s until his death in 1971. Soon afterwards, evidence of fraud began to seep out, as if from a leaky bucket.

Notable exposures came from Leon Kamin (1974) in his book The Science and Politics of IQ, and from Oliver Gillie (1976, October 24), who claimed in the Sunday Times that “crucial data was faked by eminent psychologist”.

Burt was alleged to have invented results, assistants and authors to fit his theory that intelligence has primarily a genetic basis.  It is widely accepted today that Burt was a fraudster although he still has defenders.

ROYAL FREE HOSPITAL – WAKEFIELD SCANDAL


A fraudulent article in The Lancet falsely linked the MMR vaccine to autism. The publicity about this scared large numbers of parents.  Dr. Andrew J Wakefield and a team (1998) at the Royal Free Hospital and School of Medicine, UL,  falsified their findings. This resulted in a substantial drop in vaccinations causing unnecessary deaths among thousands of unprotected children (e.g., Braunstein, 2012; Deere, 2012). In spite of significant public and scientific concerns, the Wakefield paper was not retracted until February 2010,  12 years after the original publication.  The paper received 1330 citations in the 12-year period prior to retraction and 1260 citations since the retraction. The false evidence that MMR vaccine causes autism is widely cited to the present day, and the paper forms the backbone of an international anti-vaxxing campaign which Wakefield leads from Austin, Texas (Glenza, 2018).

ST GEORGE’S MEDICAL SCHOOL – PEARCE SCANDAL

Dr. Malcolm Pearce of St George’s Medical School, UL, claimed that a 29-year-old woman had given birth to a healthy baby after he had successfully relocated a five-week-old ectopic foetus into her womb (Pearce et al., 1994). The report excited worldwide interest and offered hope to thousands of women prone to pregnancies that start outside the uterus and end in miscarriage. However, Dr Pearce’s patient records had been tampered with, colleagues knew nothing of this astonishing procedure, and the mother could not be tracked down. Pearce had falsified his evidence. The GMC ruled that fraud had occurred and struck his name from the register. His fraud actually ended two careers.

BIRKBECK COLLEGE AND UCL SCANDAL

Turner (2018) describes “a major research scandal, after an inquiry found that scientific papers were doctored over an eleven-year period.” Professor David Latchman, Master of Birkbeck College and one of the country’s top geneticists, was accused of “recklessness” for allowing research fraud to take place at UCL’s Institute of Child Health. The report states that UCL launched a formal investigation after a whistleblower alleged fraud in dozens of papers published by the Institute.

It is alleged that a panel of experts  found that two scientists, Dr Anastasis Stephanou and Dr Tiziano Scarabelli, were guilty of research misconduct by manipulating images in seven published papers. Professor Latchman, a former Dean of the Institute, is cited as an author on all seven of the papers.  In a paper published in the Journal of the American College of Cardiology, the panel said there was “clear evidence” of cloning, where parts of an image were copied and pasted elsewhere.

KING’S COLLEGE LONDON – PETERS AND BANERJEE SCANDAL

Another college in UL tainted by fraud is King’s College. According to the King’s College’s website (https://www.kcl.ac.uk/lsm/about/history/index.aspx) the College was founded in 1829 as a university college “in the tradition of the Church of England”. The first King’s Professor to gain prominence as a fraudster was Professor Timothy Peters, professor of clinical biochemistry at King’s College School of Medicine and Dentistry, who was found guilty of serious professional misconduct in 2001. He was given a severe reprimand by the General Medical Council (GMC) for failing to take action over falsified research published by a junior doctor he was supervising (Dyer, 2001).

Professor Peters had been the research supervisor of Dr Anjan Banerjee, a junior doctor at King’s College Hospital between 1988 and 1991. Dr Banerjee, aged 41, was suspended from practice by the GMC for 12 months in December 2000 for publishing fraudulent research (BMJ 2000;321:1429). When the GMC suspended him, he had already been suspended from his job as consultant surgeon at the Royal Halifax Infirmary as a result of unconnected allegations concerning financial fraud, and he resigned after the GMC suspension. In spite of everything, Dr Banerjee was awarded fellowships at three Royal Colleges and also the MBE!  Nice work, if you can get it.

KING’S COLLEGE LONDON –  HANS J EYSENCK AND R GROSSARTH-MATICEK SCANDAL

A recent publication in the Journal of Health Psychology, ‘Personality and fatal diseases: Revisiting a scientific scandal’ by Anthony Pelosi and editorial, ‘The Hans Eysenck affair: Time to correct the scientific record’ have triggered an investigation into 61 publications by the late Professor H J Eysenck and R Grossarth-Maticek.

Hans Eysenck did his doctorate at UCL under the supervision of Cyril Burt (see section above about the Burt Scandal).

My Open Letter to the President of King’s College London, Professor Edward Byrne, draws attention to the 30-year-old scandal concerning the dodgy data, impossible claims and dirty tobacco money that are the foundation of multiple dubious publications by Professor H J Eysenck and R Grossarth-Maticek. An investigation by KCL of these events is long overdue, and a report of a review by KCL is currently awaited. Watch this space…


Prof Hans J Eysenck (left) and Ronald Grossarth-Maticek (right)

UNIVERSITY COLLEGE LONDON – AHLUWALIA  SCANDAL

The Ahluwalia scandal is described in detail by Dr Geoff. It involved multiple acts of fraud. Jatinder Ahluwalia was obviously a very shrewd operator. In spite of getting found out on more than one occasion, Ahluwalia was able to gain employment in several prestigious institutions including Cambridge University,  Imperial College London, UCL and the University of East London.

These cases indicate the relative ease with which the academic fraudster can accomplish fame and fortune at some of the most prestigious institutions in the land.  The extremely poor record of the authorities at colleges in London University in discovering and calling out fraud is something to behold.

To be continued…

Special issue on the PACE Trial

We are proud that this issue marks a special contribution by the Journal of Health Psychology to the literature concerning interventions to manage adaptation to chronic health problems. The PACE Trial debate reveals deeply embedded differences between critics and investigators. It reveals an unwillingness of the co-principal investigators of the PACE trial to engage in authentic discussion and debate. It leads one to question the wisdom of such a large investment from the public purse (£5million) on what is a textbook example of a poorly done trial.

The Journal of Health Psychology received a submission in the form of a critical review of one of the largest psychotherapy trials ever done, the PACE Trial. PACE was a trial of therapies for patients with myalgic encephalomyelitis (ME)/chronic fatigue syndrome (CFS), a trial that has been associated with a great deal of controversy (Geraghty, 2016). Following publication of the critical paper by Keith Geraghty (2016), the PACE Trial investigators responded with an Open Peer Commentary paper (White et al., 2017). The review and response were sent to more than 40 experts on both sides of the debate for commentaries.

The resulting collection is rich and varied in the perspectives it offers from a neglected point of view. Many of the commentators should be applauded for their courage, resilience and ‘insider’ understanding of experience with ME/CFS.

The Editorial Board wants to go on record that the PACE Trial investigators and their supporters were given numerous opportunities to participate, even extending the possibility of appeals and re-reviews when they would not normally be offered. That they failed to respond appropriately is disappointing.

Commentaries were invited from an equal number of individuals on both sides of the debate (about 20 from each side of the debate). Many more submissions arrived from the PACE Trial critics than from the pro-PACE side of the debate. All submissions were peer reviewed and judged on merit.

The PACE Trial investigators’ defence of the trial was in a template format that failed to engage with critics. Before submitting their reply, Professors Peter White, Trudie Chalder and Michael Sharpe wrote to me as co-principal investigators of the PACE trial to seek a retraction of sections of Geraghty’s paper, a declaration of conflicts of interest (COI) by Keith Geraghty on the grounds that he suffers from ME/CFS, and publication of their response without peer review (White et al., 4 November 2016, email to David F Marks). All three requests were refused.

On the question of COI, the PACE authors themselves appear to hold strong allegiances to cognitive behavioural therapy (CBT) and graded exercise therapy (GET) – treatments they developed for ME/CFS. Stark COI have been exposed by the commentaries, including the PACE authors’ own double role as advisers to the UK Government Department for Work and Pensions (DWP), a sponsor of PACE, while at the same time working as advisers to large insurance companies that have gone on record about the potential financial losses from ME/CFS being deemed a long-term physical illness. In a further twist to the debate, undeclared COI of Petrie and Weinman (2017) were alleged (Lubet, 2017). Professors Weinman and Petrie adamantly deny that their work as advisers to Atlantis Healthcare represents a COI.

After the online publication of several critical Commentaries, Professors White, Sharpe, Chalder and 16 co-authors were offered a further opportunity to respond to their critics in the round but they chose not to do so.

After peer review, authors were invited to revise their manuscripts in response to reviewer feedback and many produced multiple drafts. The outcome is a set of robust papers that should stand the test of time and shed significant new light on what went wrong with a trial that has had such a strong influence on treatment protocols. It is disappointing that the pro-PACE side, which has long been the dominant voice, refused to participate.

Unfortunately, across the pro-PACE group of authors there was a consistent pattern of resistance to the debate. After receiving critical reviews, the pro-PACE authors chose to make only cosmetic changes or not to revise their manuscripts in any way whatsoever. They appeared unwilling to enter into the spirit of scientific debate. They acted with a sense of entitlement not to have to respond to criticism. Two pro-PACE authors even showed disdain for ME/CFS patients, stating: ‘We have no wish to get into debates with patients.’ In another instance, three pro-PACE authors attempted to subvert the journal’s policy on COI by recommending reviewers who were strongly conflicted, forcing rejection of their paper.

The dearth of pro-PACE manuscripts to start off with (five submissions), the poor quality, the intransigence of authors to revise and the unavoidable rejection of three pro-PACE manuscripts led to an imbalance in papers between the two sides. However, this editor was loath to compromise standards by publishing unsound pieces in spite of the pressure to go ahead and publish from people who should know better.


ME/CFS research has been poorly served by the PACE Trial and a fresh new approach to treatment is clearly warranted. On the basis of this Special Issue, readers can make up their own minds about the scientific merits and demerits of the PACE Trial. It is to be hoped that the debate will provide a more rational basis for evidence-based improvements to the care pathway for hundreds of thousands of patients.

References

Geraghty, KJ (2016) ‘PACE-Gate’: When clinical trial evidence meets open data access. Journal of Health Psychology 22(9): 1106–1112.
Lubet, S (2017) Defense of the PACE trial is based on argumentation fallacies. Journal of Health Psychology 22(9): 1201–1205.
Petrie, K, Weinman, J (2017) The PACE trial: It’s time to broaden perceptions and move on. Journal of Health Psychology 22(9): 1198–1200.
White, PD, Chalder, T, Sharpe, M (2017) Response to the editorial by Dr Geraghty. Journal of Health Psychology 22(9): 1113–1117.

The Editorial has been abridged and the photograph of Dr. Keith Geraghty added.

Personality, Heart Disease and Cancer: A Chequered History

Featured

Type A and B Personality

We discuss here the chequered history of the claims by psychologists and others about the links between personality and illness, particularly heart disease and cancer. The research has been marred by dirty money and allegations of fraud.

Speculation about ‘Type A’ and ‘Type B’ personalities and coronary heart disease (CHD) has existed for at least 70 years. The distinction between the two personalities was introduced in the mid-1950s by the cardiologists Meyer Friedman and Ray Rosenman, authors of Type A Behavior and Your Heart (1974). Their ideas can be traced to Franz Alexander, one of the ‘fathers’ of psychosomatic medicine.

The Type A personality is described thus: highly competitive and achievement oriented, not prepared to suffer fools gladly, always in a hurry and unable to bear delays and queues, hostile and aggressive, inclined to read, eat and drive very fast, and constantly thinking what to do next, even when supposedly listening to someone else. Type A was thought to be at greater risk of CHD.

The Type B personality is: relaxed, laid back, lethargic, even-tempered, amiable and philosophical about life, relatively slow in speech and action, and generally has enough time for everyone and everything.

The Type A personality resembles Galen’s choleric temperament, and Type B the phlegmatic. It is well known that men are at greater risk of CHD than women.

‘Classic’ Studies

The key pioneering study of Type A personality and CHD was the Western Collaborative Group Study (WCGS). Over 3,000 Californian men, aged 39 to 59, were followed up initially over eight-and-a-half years, later extended to more than 22 years. At the eight-and-a-half-year follow-up, Type As were twice as likely as Type Bs to suffer from subsequent CHD: 7% developed some signs of CHD and two-thirds of these were Type As. This increased risk remained even when other risk factors, such as blood pressure and cigarette smoking, were statistically controlled.

Similar results were obtained in another large-scale study in Framingham, Massachusetts.  This time the sample contained both men and women.  By the early 1980s, it was confidently asserted that Type A characteristics were as much a risk factor for heart disease as high blood pressure, high cholesterol levels and even smoking.

Failure to Replicate

Later research failed to support these early findings. When Ragland and Brand (1988) conducted a 22-year follow-up of the WCGS, using CHD mortality as the crucially important measure, they failed to find any consistent evidence of an association.

Further research continued up to the late 1980s, yielding few positive findings. Reviewing this evidence, Myrtek (2001) suggests that the modest number of positive findings that did exist were the result of over-reliance on angina as the measure of CHD. Considering studies that adopted hard criteria, including mortality, Myrtek concludes that Type A personality is not a risk factor for CHD.

Enter the Tobacco Industry

With such disappointing results, why did Type A obtain so much publicity over more than 40 years? The reason is in part connected with the involvement of the US tobacco industry.

Mark Petticrew et al. (2012) analysed material lodged at the Legacy Tobacco Documents Library. This is a vast collection of documents that the companies were obliged to make public following litigation in 1998. These documents show that, for over 40 years from the 1950s, the industry heavily funded research into links between personality, CHD and cancer. The industry was hoping to demonstrate that personality variables were associated with cigarette smoking.

Any such links would undermine the alleged causal links between smoking and disease. Thus, for example, if it could be shown that Type A personalities were both more likely to smoke than Type Bs, and more likely to develop CHD, then it could be argued that smoking might be just an innocent background variable.

The Philip Morris company funded Meyer Friedman, the originator of Type A research, for the Meyer Friedman Institute. The research aimed to show that Type A personalities could be changed by interventions, thereby presumably reducing proneness to CHD even if they continued to smoke.

Petticrew et al. show that, while most Type A–CHD studies were not funded by the tobacco industry, most of the positive results were tobacco-funded. As has been pointed out in many areas of science, positive findings invariably get a great deal more publicity than negative findings and rebuttals.

Hans J Eysenck

The late H J Eysenck was one of the most controversial psychologists who ever lived. Generations of UK psychology students had to study his books as gospel.

The German-born, British psychologist worked at the Institute of Psychiatry, University of London. He did a PhD under Sir Cyril Burt, who was proved to have fabricated both collaborators and data to support his eugenic theory of intelligence (Kamin, 1974, The Science and Politics of IQ).

Eysenck used the tobacco industry as a source of funding for his research on psychological theories of personality. According to Pringle (1996), Eysenck received nearly £800,000 to support his research on personality and cancer.  Eysenck’s results were a spectacular exception to the general run of negative findings in this field.  Eysenck (1988) claimed that personality variables are much more strongly related to death from cancer than even cigarette smoking.

One of my lecturers while I was an undergraduate had worked for Eysenck as a research assistant for a year. It had seemed clear to him that data massaging was required before Eysenck’s studies could be submitted for publication. Data manipulation, or even worse, outright fraud, has surfaced in a major re-analysis of Eysenck’s work on tobacco and personality.

Ronald Grossarth-Maticek

Two of Eysenck’s papers, with Ronald Grossarth-Maticek (pictured above), based  in Crvenka, Serbia, claimed to have identified personality types that increase the risk of cancer by about 120 times and heart disease by about 25 times (Grossarth-Maticek and Eysenck, 1991; Eysenck and Grossarth-Maticek, 1991). They also claimed to have tested a new method of psychological treatment that could reduce the death rate for disease prone personalities over the next 13 years from 80% to 32%. These claims are too good to be true.

These extraordinary claims were not received favourably by others in this field. Fox (1988) dismissed earlier reports by Eysenck and Grossarth-Maticek as ‘simply unbelievable’ and the 1991 papers were subjected to devastating critiques by Pelosi and Appleby (1992, 1993) and Amelang, Schmidt-Rathjens and Matthews (1996).  The ‘cancer prone personality’ was not clearly described and seems to have been an odd amalgam of emotional distance and excessive dependence.

A Case of Fraud?

After pointing out a large number of errors, omissions, obscurities and implausible data, in a manner reminiscent of Leon Kamin’s  analysis of Burt’s twin IQ data, Pelosi and Appleby comment:

It is unfortunate that Eysenck and Grossarth-Maticek omit the most basic information that might explain why their findings are so different from all the others in this field. The methods are either not given or are described so generally that they remain obscure on even the most important points . . . Also essential details are missing from the results, and the analyses used are often inappropriate.

(Pelosi and Appleby, 1992: 1297).

They never used the word “fraud”. They didn’t need to. For an update of this story, see this post and this post.

Update

I wrote to Ronald Grossarth-Maticek on 3rd December 2018 and again on 5th March 2019 inviting him to respond to the allegations.
Dr. Grossarth-Maticek has responded saying that he will give me an answer within the next month.
He also says that he will send me the results of his actual research.
To be continued…

The PACE Trial: A Catalogue of Errors

What was the PACE Trial?

Rarely in the history of clinical medicine have doctors and patients been placed so bitterly at loggerheads. The dispute had been a long time coming. Thirty years ago, a few psychiatrists and psychologists advanced a Psychological Theory in which ME/CFS is constructed as a psychosocial illness. According to their theory, ME/CFS patients have “dysfunctional beliefs” that their symptoms are caused by an organic disease. The ‘Dysfunctional Belief Theory’ (DBT) assumes that no underlying pathology is causing the symptoms; patients are being ‘hypervigilant to normal bodily sensations‘ (Wessely et al., 1989; Wessely et al., 1991).

The Psychological Theory assumes that the physical symptoms of ME/CFS are the result of ‘deconditioning’ or ‘dysregulation’ caused by sedentary behaviour, accompanied by disrupted sleep cycles and stress. Counteracting deconditioning involves normalising sleep cycles, reducing anxiety levels and increasing physical exertion. To put it bluntly, the DBT asserts that ME/CFS is ‘all in the mind’.  Small wonder that patient groups have been expressing anger and resentment in their droves.

Top-Down Research

‘Top-down research’ uses a hierarchy of personnel, duties and skill-sets. The person at the top sets the agenda and the underlings do the work. The structure is a bit like the social hierarchy of ancient Egypt. Unless carefully managed, this top-down approach risks creating a self-fulfilling prophecy from confirmation biases at multiple levels. At the top of the research pyramid sits the ‘Pharaoh’, Regius Professor Sir Simon Wessely KB, MA, BM BCh, MSc, MD, FRCP, FRCPsych, F Med Sci, FKC, Knight of the Realm, President of the Royal Society of Medicine, and originator of the DBT. The principal investigators (PIs) for the PACE Trial, Professors White, Chalder and Sharpe, are themselves advocates of the DBT. The PIs all have or had connections both to the Department for Work and Pensions and to insurance companies. The objective of the PACE Trial was to demonstrate that two treatments based on the DBT, cognitive behavioural therapy (CBT) and graded exercise therapy (GET), help ME/CFS patients to recover. There was zero chance the PACE researchers would fail to obtain the results they wanted.

Groupthink, Conflicts and Manipulation

The PACE Trial team were operating within a closed system or groupthink in which they ‘know’ their theory is correct. With every twist and turn, no matter what the actual data show, the investigators are able to confirm their theory. The process is well known in Psychology. It is a self-indulgent process of subjective validation and confirmation bias. Groupthink occurs when a group makes faulty decisions because group pressures lead to a deterioration of “mental efficiency, reality testing, and moral judgment” (Janis, 1972). Given this context, we can see reasons to question the investigators’ impartiality, with many potential conflicts of interest (Lubet, 2017). Furthermore, critical analysis suggests that the PACE investigators involved themselves in manipulating protocols midway through the trial, selecting confirming data and omitting disconfirming data, and publishing biased reports of findings, creating a catalogue of errors.

‘Travesty of Science’

The PACE Trial has been termed a ‘travesty of science’ while sufferers of ME/CFS continue to be offered unhelpful or harmful treatments and are basically being told to ‘pull themselves together’. One commentator has asserted that the situation for ME patients in the UK is: “The 3 Ts – Travesty of Science; Tragedy for Patients and Tantamount to Fraud” (Professor Malcolm Hooper, quoted by Williams, 2017). Serious errors in the design, the protocol and the procedures of the PACE Trial are evident. The catalogue of errors is summarised below. The PACE Trial was loaded towards finding significant treatment effects.

A Catalogue of Errors

The claimed benefits of GET and CBT for patient recovery are entirely spurious. The explanation lies in a sequence of serious errors in the design, the changed protocol and the procedures of the PACE Trial. The investigators neglected or bypassed accepted scientific procedures for a randomised controlled trial (RCT), as follows:

1. Ethical issue: Applying for ethical approval and funding for a long-term trial when the PIs already knew that CBT effects on ME/CFS were short-lived. On 3rd November 2000, Sharpe confirmed: “There is a tendency for the difference between those receiving CBT and those receiving the comparison treatment to diminish with time due to a tendency to relapse in the former” (www.cfs.inform/dk). Wessely stated in 2001 that CBT is “not remotely curative” and that: “These interventions are not the answer to CFS” (Editorial: JAMA 19th September 2001:286:11) (Williams, 2016).

2. Ethical issue: Failure to declare conflicts of interest to the Joint Trial Steering Committee. The three PIs’ conflicts of interest went undeclared in the Minutes of the Joint Trial Steering Committee and Data Monitoring Committee held on 27th September 2004.

3. Ethical issue: Failure to obtain fully informed consent after non-disclosure of conflicts of interest. The PIs failed to declare their vested financial interests to PACE participants, in particular, that they worked for the PHI industry, advising claims handlers that no payments should be made until applicants had undergone CBT and GET.

4. Entry criteria: Use of their own discredited “Oxford” criteria for entry to the trial. Patients with ME would have been screened out of the PACE Trial even though ME/CFS has been classified by the WHO as a neurological disease since 1969 (ICD-10 G93.3).

5. Inadequate outcome measures: Using only subjective outcome measures. The original protocol included the collection of actigraphy data as an objective outcome measure but, after the trial started, the decision was taken that no post-intervention actigraphy data should be obtained.

6. Changing the primary outcomes of the trial after receiving the raw data: Altering outcome measures mid-trial in a manner which gave improved outcomes.

7. Changing entry criteria midway through the trial: Altering the inclusion criteria for trial entry after the main outcome measures were lowered, so that some participants (13%) already met recovery criteria at the trial entry point.

8. Delayed statistical analysis plan: The statistical analysis plan was published two years after selective results had been published, and the re-definition of “recovery” was not specified in it.

9. Inadequate control: Sending participants newsletters promoting one treatment arm over another, thus contaminating the trial.

10. Inadequate control: Lack of comparable placebo/control groups, with inexperienced occupational therapists providing the control treatment while experienced therapists provided CBT.

11. Inadequate control: Repeatedly informing participants in the GET and CBT groups that the therapies could help them get better.

12. Inadequate control: Giving patients in the CBT and GET arms more sessions than those in the control group.

13. Inadequate control: Allowing therapists from different arms to communicate with each other about how patients were doing.

14. Lack of transparency: Blocking release of the raw data for five years, preventing independent analysis by external experts.

Cover-Up

Blocking release of the raw data for five years and preventing independent analysis by external experts was tantamount to a cover-up of the true findings. An editorial by Keith Geraghty (2016) was entitled ‘PACE-Gate’. ME/CFS patient associations were rightly suspicious of the recovery claims concerning the GET arm of the trial because of their own experiences of intense fatigue after ordinary levels of activity which were inconsistent with the recovery claims of the PACE Trial reports. For many sufferers, even moderate exercise results in long ‘wipe-outs’ in which they are almost immobilized by muscle weakness and joint pain. In the US, post-exertional relapse has been recognized as the defining criterion of the illness by the Centers for Disease Control, the National Institutes of Health and the Institute of Medicine. For the PACE investigators, however, the announced recovery results validated their conviction that psychotherapy and exercise provided the key to reversing ME/CFS.

Alem Matthees Obtains Data Release

When Alem Matthees, an ME/CFS patient, sought the original data under the Freedom of Information Act and a British Freedom of Information tribunal ordered the PACE team to disclose their raw data, some of the data were re-analysed according to the original protocols. The legal costs of the tribunal at which QMUL were forced to release the data, against their strenuous objections, were over £245,000. The re-analysis of the PACE Trial data revealed that the so-called “recovery” under CBT and GET all but disappeared (Wilshire, Kindlon, Matthees and McGrath, 2016). The recovery rate for CBT fell to seven percent and the rate for GET to four percent, figures statistically indistinguishable from the three percent rate for the untreated controls. Graded exercise and CBT are still being routinely prescribed for ME/CFS in the UK despite patient reports that the treatments can cause intolerable pain and relapse. The analysis of the PACE Trial by independent critics has revealed a catalogue of errors and provides an object lesson in how not to conduct a scientific trial. The trial can be useful to instructors in research design and methodology for that purpose.
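What “statistically indistinguishable” means here can be illustrated with a simple two-proportion z-test on the re-analysed recovery rates. The sketch below is illustrative only: the arm size of 160 participants per group is an assumption chosen for the example, not a figure taken from the re-analysis, and the generic z-test is not necessarily the method Wilshire and colleagues used.

```python
import math

def two_proportion_z(p1, p2, n1, n2):
    """Two-sided two-proportion z-test on rates p1, p2 with group sizes n1, n2.

    Returns (z, p_value). Uses the pooled-proportion standard error and the
    standard normal tail via math.erfc, so no third-party libraries are needed.
    """
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # two-sided p-value: P(|Z| >= z) for Z ~ N(0, 1)
    p_value = math.erfc(z / math.sqrt(2))
    return z, p_value

# Re-analysed recovery rates: CBT 7%, GET 4%, untreated controls 3%.
# Hypothetical arm size of 160 participants per group (an assumption).
n = 160
for label, (rate, control) in [("CBT vs control", (0.07, 0.03)),
                               ("GET vs control", (0.04, 0.03))]:
    z, p = two_proportion_z(rate, control, n, n)
    print(f"{label}: z = {z:.2f}, p = {p:.3f}")
```

With arms of roughly this size, even the CBT-versus-control difference falls well short of the conventional p < 0.05 threshold, which is consistent with the description of the re-analysed rates as indistinguishable from the control rate.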

Following the re-analyses of the PACE Trial, the DBT is dead in the water. There is an urgent need for new theoretical approaches and scientifically-based treatments for ME/CFS patients. Meanwhile, there is repair work to be done to rebuild patient trust in the medical profession after this misplaced attempt to apply the Psychological Theory to the unexplained syndrome of ME/CFS. The envelope theory of Jason et al. (2009) proposes that people with ME/CFS need to balance their perceived and expended energy levels and provides one way forward, pending further research.

Ultimately, patients, doctors and psychologists are waiting for an organic account of ME/CFS competent to explain the symptoms and to open the door to effective treatments. Patients have a right to nothing less.

An extract from: David F Marks et al. (2018) Health Psychology. Theory, Research & Practice (5th ed.) SAGE Publications Ltd.