A paper published in Psychological Science in the Public Interest has evaluated ten techniques for improving learning, ranging from mnemonics to highlighting, and come to some surprising conclusions.
One of my all-time favourite bloggers, Oxford neuroscientist Prof. Dorothy Bishop, or DeevyBee as she is known on Twitter, has given an amazing open-access lecture focusing largely on the misunderstanding of neuroscience (scroll down to the “Emanuel Miller Lecture” to play the video). The talk is incredibly informative and digestible; even those with no background in neuroscience or psychology whatsoever will take a great deal away. The poor public understanding of neuroscience is one of the main reasons why I started this blog, so if you like this blog then you’ll love this lecture.
Click here to download the slides.
The talk begins with a reasoned explanation of how and when we should be sceptical of neuroscience research. Bishop goes on to cite four key reasons why certain kinds of scientific research will inevitably result in false positives:
“The four horsemen of the apocalypse”
1. Maturation – People develop naturally over time.
“There seems to be an implicit assumption that the brain, because it is a physical organ, is somehow not going to change unless you give it some intervention – that it is there as a static thing. This is completely untrue… as evidenced by this series of images.”
The brain changes naturally over time
2. Practice effects – when people keep doing the same test again and again, they get better at it.
“…purely to do with the fact that you have got better at doing the test and nothing to do with your abilities… People forget that this can apply to language tests and things like that. It also applies to some extent to the brain; often we don’t know how important this is because brain imaging is so new… clearly if you get brain responses to novelty, that means if you do something twice – the first time round you will get different responses to the second time round, when it is no longer novel.”
3. Regression to the mean – a statistical artefact of longitudinal studies that is exacerbated if you select participants on the basis of a low score on a test (for example, participants with developmental difficulties). Bishop does an outstanding job of explaining the problem at about 18 minutes into the talk; a toy simulation follows this list.
“Regression to the mean is as inevitable as death and taxes”
Campbell and Kenny (1999) A primer on regression artefacts
4. The placebo effect – the obvious consideration that continues to undermine poorly designed research, although according to Bishop the three issues listed above could actually be having an even greater impact than the placebo effect.
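Bishop’s third horseman is easy to demonstrate for yourself. Below is a minimal Python sketch (entirely made-up numbers, purely for illustration, not anything from the talk) that retests a group selected for low baseline scores: their average rises at follow-up even though nothing was done to them, simply because extreme scores are partly bad luck and the bad luck does not repeat.

```python
import numpy as np

rng = np.random.default_rng(42)

# Each person's "true" ability plus independent measurement noise on two occasions.
n = 10_000
true_ability = rng.normal(100, 15, n)          # hypothetical underlying score
test1 = true_ability + rng.normal(0, 10, n)    # baseline test (noisy)
test2 = true_ability + rng.normal(0, 10, n)    # follow-up test (noisy, no intervention)

# Select the lowest-scoring 10% at baseline, as a poorly designed study might.
selected = test1 < np.percentile(test1, 10)

print(f"Selected group, baseline mean:  {test1[selected].mean():.1f}")
print(f"Selected group, follow-up mean: {test2[selected].mean():.1f}")
# The follow-up mean is noticeably higher even though no intervention took place:
# part of what made the baseline scores extreme was noise, and the noise does not repeat.
```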
The Solution?
Bishop explains that a control group is vital in order to achieve valid findings, but a control group alone is not enough; we should also be asking questions such as:
- Are the groups randomly assigned – or is there some other factor at play?
- Is the control group given an alternative treatment? If not, why not?
- What causes drop-out? People don’t tend to drop out at random, and this can have a very big effect on results (a toy simulation of this follows the list below).
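As a toy illustration of the drop-out point, the Python sketch below simulates a “treatment” with no real effect in which struggling participants are more likely to drop out; analysing completers alone then makes the treatment look beneficial. The numbers are invented and the set-up is deliberately crude.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trial arm: the treatment has NO real effect on the outcome.
n = 5_000
outcome = rng.normal(50, 10, n)

# Suppose participants who are doing badly are more likely to drop out
# (e.g. they give up on a treatment that does not seem to be helping them).
drop_probability = np.where(outcome < 45, 0.6, 0.1)
dropped = rng.random(n) < drop_probability

completers = outcome[~dropped]
print(f"Mean outcome, everyone:        {outcome.mean():.1f}")
print(f"Mean outcome, completers only: {completers.mean():.1f}")
# Analysing completers alone inflates the apparent benefit, even though
# the 'treatment' did nothing, because drop-out was not random.
```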

References:
Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470–477. DOI: 10.1162/jocn.2008.20040
Campbell, D. T., & Kenny, D. A. (1999). A Primer on Regression Artifacts. Guilford Press.
A wise man named Karl Popper once disagreed with the orthodox view that scientific activity starts with observation, noting that “observation is always selective”. It is often possible to propose a theory and find results that support or verify it, but Popper argued that it is the negative results that are of crucial value. He used the following simple thought experiment to illustrate the point.

A black swan disproves the theory that all swans are white
For thousands of years, Europeans had observed millions of white swans. Using inductive reasoning, we could come up with the theory that all swans are white.
However, the exploration of Australasia introduced Europeans to black swans. No matter how many observations are made which confirm the theory that all swans are white, there is always the possibility that a future observation could refute it. Induction cannot yield certainty.
This simple principle revolutionised science and the world lived happily ever after.
…Of course, that was not how the story ended. Today there is still massive pressure on scientists from industry, funding bodies, the media and universities to chase positive findings that verify and, in effect, support established knowledge. This is all very well, but when a negative finding is discovered it is all too often brushed under the carpet, almost as if it were an embarrassment (an ugly duckling, perhaps). For example, consider for a second how truly horrifying the following statistic actually is:
Only 5.9% of industry-sponsored cancer trials are ever published
Of that 5.9%, an astounding 75% report positive results – suggesting that negative findings are simply not published. This is not the only problem for academics searching for negative results to support a proposition. Because of the way Boolean search algorithms work, you need a little Boolean know-how to actually search for a negative result: simply adding “not” to an English-language query will still yield positive findings. Now a group of scientists has created a database called BioNOT that uses data mining and machine-learning methods to systematically search for negated findings in PubMed and Elsevier.
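To see why plain keyword or Boolean matching struggles with negation, here is a minimal Python sketch on a few invented sentences. The crude cue list is only a stand-in for illustration; BioNOT itself, as the authors describe, relies on more sophisticated machine-learning methods rather than a simple word filter.

```python
import re

# Toy corpus of invented sentences (not real abstracts).
abstracts = [
    "Drug X significantly improved survival in the treatment group.",
    "Drug X did not improve survival compared with placebo.",
    "Survival was not affected by Drug X in this cohort.",
]

# A naive keyword query: every abstract mentioning both terms matches,
# whether the claim is affirmative or negated.
naive_hits = [a for a in abstracts if "drug x" in a.lower() and "survival" in a.lower()]
print(len(naive_hits), "naive keyword hits")  # 3 - negation is invisible to the query

# A crude negation-cue filter, purely to show the idea of isolating negated sentences.
negation_cues = re.compile(r"\b(not|no|never|failed to|lack of)\b", re.IGNORECASE)
negated_hits = [a for a in naive_hits if negation_cues.search(a)]
print(len(negated_hits), "hits containing a negation cue")  # 2
```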
You may wish to write down the URL http://snake.ims.uwm.edu/bionot/ because, in an ironic twist, the Boolean wizards who created this programme have made the URL a little tricky for Google to find – presumably because this website is Google’s arch-nemesis. Perhaps not, but either way, if you type “BioNOT” into Google you end up with something to do with nuts.
Anyway, I’m off to have a play with it. If you need some search ideas, check out the fabulous results of my very first search!
Full text (open-access PDF) on PubMed
Via Neuroskeptic (where you can pop over and read a little more about the potential applications of this tool)
Agarwal, S., Yu, H., & Kohane, I. (2011). BioNOT: A searchable database of biomedical negated sentences. BMC Bioinformatics, 12(1). PMID: 22032181
Image via Erowid
UPDATE 06/12/11: THE INDEPENDENT HAVE NOW AMENDED THEIR ARTICLE FOLLOWING A PCC COMPLAINT (THEY STILL FAIL TO REFERENCE THE FACT THAT THE DRUG IN QUESTION IS ALMOST CERTAINLY “VALIUM”).
UPDATE 05/12/11: THE METRO HAVE NOW REMOVED THEIR ARTICLE FOLLOWING A PCC COMPLAINT.
UPDATE 24/11/11: THIS PIECE HAS NOW BEEN REMOVED BY THE HULL DAILY MAIL AFTER A COMPLAINT TO THE PRESS COMPLAINTS COMMISSION BUT IT HAS REAPPEARED IN STORIES BY THE METRO AND THE INDEPENDENT.
Read this piece by the Hull Daily Mail and see if you can spot the ten major factual and editorial errors yourself. Watch the slide show below for the solution.