A paper published in Psychological Science in the Public Interest has evaluated ten techniques for improving learning, ranging from mnemonics to highlighting, and come to some surprising conclusions.
The lesson you never got taught in school: How to learn!


Follow Simon on Twitter, Facebook, Google+, RSS, or join the mailing list.

One of my all-time favourite bloggers, Oxford neuroscientist Prof. Dorothy Bishop, or DeevyBee as she is known on Twitter, has given an amazing open-access lecture focusing largely on the misunderstanding of neuroscience (scroll down to the “Emanuel Miller Lecture” to play the video). The talk is incredibly informative and digestible; even those with no understanding of neuroscience or psychology whatsoever will take a great deal away. The poor understanding of neuroscience is one of the main reasons why I started this blog, so if you like this blog then you’ll love this lecture.

Click here to download the slides.

The talk begins with a reasoned explanation of how and when we should be sceptical of neuroscience research. Bishop goes on to cite four key reasons why certain kinds of scientific research will inevitably result in false positives:

“The four horsemen of the apocalypse”

1. Maturation – People develop naturally over time.

“There seems to be an implicit assumption that the brain, because it is a physical organ, is somehow not going to change unless you give it some intervention – that it is there as a static thing. This is completely untrue… as evidenced by this series of images.”

The brain changes naturally over time

2. Practice effects – when people keep doing the same test again and again, they get better at it.

“…purely to do with the fact that you have got better at doing the test and nothing to do with your abilities… People forget that this can apply to language tests and things like that. It also applies to some extent to the brain; often we don’t know how important this is because brain imaging is so new… clearly if you get brain responses to novelty, that means if you do something twice, the first time round you will get different responses to the second time round, when it is no longer novel.”

3. Regression to the mean – a statistical artefact of longitudinal studies that is exacerbated if you select participants on the basis of a low score on a test (for example, participants with developmental difficulties). Bishop does an outstanding job of explaining the problem at about 18 minutes into the talk.

“Regression to the mean is as inevitable as death and taxes”

Campbell and Kenny (1999) A primer on regression artefacts

4. The placebo effect – the obvious consideration that continues to undermine poorly designed research, although according to Bishop, the three issues listed above could actually be having an even greater impact than the placebo effect.

The Solution?

Bishop explains that a control group is vital in order to achieve valid findings, but a control group alone is not enough; we should also be asking questions such as:

  • Are the groups randomly assigned – or is there some other factor at play?
  • Is the control group given an alternative treatment? If not, why not?
  • What causes drop out? People don’t tend to drop out at random and this can have a very big effect on results.
If something smells fishy, it probably is fishy.

Sometimes things just go wrong, and currently in the field of brain imaging an awful lot of things have been going wrong. This is well illustrated by the now-famous study of the dead fish in the brain scanner: a significant result was found in two different trials in which a dead fish was “asked” to determine facial expressions. For this reason, all research – but particularly abstract research such as brain imaging – should be taken with a pinch of salt until the results have been replicated, ideally a few times.
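The dead-fish result is generally attributed to uncorrected multiple comparisons: test tens of thousands of voxels of pure noise and some will cross any uncorrected threshold. The sketch below is my own illustration (the voxel counts and threshold are made up, not taken from the original study or the lecture):

```python
import random

random.seed(1)

# Simulate an fMRI-style analysis: thousands of voxels of pure noise,
# each tested independently for "activation". The fish is dead, so
# every voxel is nothing but random measurement noise.
n_voxels = 20_000
n_scans = 20
threshold = 2.5  # an uncorrected per-voxel threshold on the t statistic

false_positives = 0
for _ in range(n_voxels):
    samples = [random.gauss(0, 1) for _ in range(n_scans)]
    mean = sum(samples) / n_scans
    sd = (sum((x - mean) ** 2 for x in samples) / (n_scans - 1)) ** 0.5
    t = mean / (sd / n_scans ** 0.5)
    if abs(t) > threshold:
        false_positives += 1

# With ~2% of pure-noise voxels crossing the threshold, hundreds of
# "active" regions appear where nothing is happening at all.
print(f"'active' voxels in pure noise: {false_positives}")
```

A per-voxel false-positive rate that looks tiny becomes a near-certainty of spurious blobs once it is multiplied across an entire brain volume, which is exactly why corrected thresholds and replication matter.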
If you like this lecture, subscribe to Dorothy Bishop’s blog; it is one of those blogs that is so useful that, frankly, it should be required reading for all concerned. Also, keep an eye out for Dorothy’s forthcoming paper, where she will be publishing some of the ideas she presented in the Emanuel Miller Lecture.


Weisberg, Deena Skolnick (2008). The Seductive Allure of Neuroscience Explanations. Journal of Cognitive Neuroscience, 20(3), 470–477. DOI: 10.1162/jocn.2008.20040

Campbell, D. T., & Kenny, D. A. (1999). A Primer on Regression Artifacts. New York: Guilford Press.


BioNot: The Internet’s Answer to the Principle of Falsifiability

A wise man named Karl Popper once disagreed with the orthodox view that scientific activity starts with observation, noting that “observation is always selective”. It is often possible to propose a theory and find results that support or verify it, but Popper proposed that it is the negative results that are of crucial value. He used the following simple thought experiment to illustrate it.

A black swan disproves the theory that all swans are white

For thousands of years, Europeans had observed millions of white swans. Using inductive evidence, we could come up with the theory that all swans are white.

However, the exploration of Australasia introduced Europeans to black swans. No matter how many observations are made that confirm the theory that all swans are white, there is always the possibility that a future observation could refute it. Induction cannot yield certainty.

This simple principle revolutionised science, and the world lived happily ever after.
…Of course, that was not how the story ended. Today there is still massive pressure on scientists from industry, funding bodies, the media and universities to chase after verifying positive findings and, in effect, to support established knowledge. This is all very well, but when a negative finding is discovered it is all too often brushed under the carpet, almost as if it were an embarrassment (an ugly duckling, perhaps). For example, consider for a second how truly horrifying the following statistic actually is:

Only 5.9% of industry-sponsored cancer trials are ever published

Of that 5.9%, an astounding 75% report positive results – suggesting negative findings are simply not published. This is not the only problem for academics searching for negative results bearing on a proposition. Because of the way Boolean search algorithms work, you need a little Boolean know-how to actually search for a negative result: simply adding “not” to an English-language proposition will still yield positive findings. Now a group of scientists have created a database called BioNot that uses data mining and machine learning methods to systematically search for negative results in PubMed and Elsevier.
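The limitation is easy to see with a toy example. The sketch below is my own illustration (the corpus sentences are invented, and this is not how BioNot itself works): it implements classic Boolean retrieval over three sentences and shows that neither excluding nor requiring the word “not” reliably retrieves negative findings.

```python
# A toy corpus of sentences (hypothetical examples, not real abstracts).
corpus = [
    "Gene X is associated with disease Y.",
    "Gene X is not associated with disease Y.",
    "Gene X was studied; no association with disease Y was found.",
]

def boolean_search(docs, must_have, must_not_have=()):
    """Classic Boolean retrieval: keep docs containing every term in
    must_have and none of the terms in must_not_have."""
    hits = []
    for doc in docs:
        # Crude tokenisation: lowercase, split, strip punctuation.
        words = [w.strip(".,;") for w in doc.lower().split()]
        if all(t in words for t in must_have) and \
           not any(t in words for t in must_not_have):
            hits.append(doc)
    return hits

# Boolean NOT excludes any document containing the term "not", so the
# negated claim is filtered out entirely rather than retrieved.
print(boolean_search(corpus, ["gene", "associated"], ["not"]))

# Requiring "not" as a positive term finds the second sentence but
# still misses "no association", which negates without using "not".
print(boolean_search(corpus, ["gene", "not"]))
```

Boolean operators filter documents by term presence; they have no notion of what a sentence asserts or denies, which is exactly the gap BioNot’s negation detection is meant to fill.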

You may wish to write down the URL http://snake.ims.uwm.edu/bionot/ because, in an ironic twist, the Boolean wizards who created this programme have made the URL a little tricky for Google to find – presumably because this website is Google’s arch-nemesis. Perhaps not, but either way, if you type “BioNot” into Google you end up with something to do with nuts.


Anyway, I’m off to have a play with it. If you need some search ideas, check out the fabulous results of my very first search!

Full Text (Open Access PDF) on PubMed

Via Neuroskeptic (where you can pop on over and read a little more explanation of the potential applications of this tool)

Agarwal, S., Yu, H., & Kohane, I. (2011). BioNOT: A searchable database of biomedical negated sentences. BMC Bioinformatics, 12(1). PMID: 22032181

The worst piece of drugs reporting I have ever read

Image via Erowid




Read this piece from the Hull Daily Mail and see if you can spot the ten major factual and editorial errors yourself. Watch the slideshow below for the solution.

