Sunday, November 3, 2013

The trip into the Wild West continues

There were some great posts on npinoz after my previous post in this series. Nicola Gates talked about the standards of studies we need to take into account when reviewing research, and Jonathan Foster posted a link to an article he recently published (see http://theconversation.com/health-check-does-brain-training-make-you-smarter-18882). This is really good stuff.

I have to admit that I am not going to be as rigorous as Nicola suggests (but maybe she will consider giving us a review). Today I am going to talk about two articles that made an impression on me.


The first study was reviewed by Jonathan in his article, and has been mentioned on npinoz before.
This is Owen, A.M., Hampshire, A., Grahn, J.A., Stenton, R., Dajani, S., Burns, A.S., Howard, R.J., & Ballard, C.G. (2010). Putting brain training to the test. Nature, 465, 775-778 (free full text to be found here). It was a huge study, with 11,430 participants, and it found no effect of brain training at all.

The study used viewers of a popular BBC series, who did a 6-week online brain training program. Four assessment tasks were given at pre-test and again at post-test, looking at reasoning, digits forward, spatial working memory, and paired associate learning; these have good literature support and could be argued to be sensitive to cognitive change. The training tasks are described in the article, but no source or support information is given for their construction or choice.
The minimum dose of training was set at 10 minutes a day, 3 days a week. There were two training groups and a control group, which answered a series of obscure questions. Not surprisingly, the experimental groups improved on the trained tasks. However, all groups improved at the same rate on the post-test assessment.

Well, with over 11,000 subjects even minuscule differences would reach significance. So the study did not report any significance testing and concentrated instead on effect sizes (leaving me fairly bereft, to tell the truth). But the bottom line was quite clear: brain training did not make a difference.
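To make that concrete, here is a tiny, purely illustrative simulation in Python (my own made-up numbers, nothing to do with the actual BBC data): with groups of several thousand, a difference far too small to matter in practice will usually come out 'statistically significant', which is exactly why effect sizes are the more informative thing to report.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 5000                               # hypothetical group size
    control = rng.normal(0.00, 1.0, n)     # standardised post-test gains, control group
    trained = rng.normal(0.06, 1.0, n)     # a tiny, practically meaningless advantage

    t, p = stats.ttest_ind(trained, control)
    d = (trained.mean() - control.mean()) / np.sqrt(
            (trained.var(ddof=1) + control.var(ddof=1)) / 2)
    print(f"p = {p:.4f}, Cohen's d = {d:.3f}")
    # With samples this size, p usually lands well below .05 even though
    # d is far below the 0.2 conventionally described as a "small" effect.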

For a scientific critique of this study you can go here. But I personally believe the results. The BBC producers ran a lot of participants through brain training tasks similar to those commercially available and found no effects of training.

But:
Dr Henry Mahncke has been quoted here as saying that using this study to discount all brain training products would be "like concluding that there are no compounds to fight bacteria because the compound you tested was sugar and not penicillin."


So, is there penicillin out there? Here is another study:

Ball, K., Edwards, J.D., Ross, L.A., & McGwin, G., Jr. (2010). Cognitive training decreases motor vehicle collision involvement of older drivers. Journal of the American Geriatrics Society, 58(11), 2107-2113. Free full text to be found here. Worth a careful read.

This was also a big study, of 908 older participants (over 65, no significant problems) divided into four groups. The control group received no intervention. The remaining groups received memory training, reasoning training or speed of processing training. Some or all of this training, I believe, consisted of tasks produced by Posit Science. Memory training involved teaching mnemonic strategies. Reasoning training consisted of teaching strategies and comprehension of patterns in everyday life. The 'speed of processing' training was really computerized training in visual attention skills. Training involved a maximum of 10 sessions (average of 9, range of 0 to 10).
The outcome measure was the number of at-fault motor vehicle accidents. This was adjusted for driving exposure, calculated in a complicated way to obtain person-miles of travel (using self-reported annual mileage during the follow-up - not an ideal measure, but better than nothing). Data about accidents were taken from state-kept statistics.
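For what it's worth, here is a rough sketch in Python of what an exposure-adjusted crash rate looks like (my own illustration, not the paper's exact method; the function name and numbers are invented): dividing at-fault crashes by person-miles travelled, so that people who simply drive less don't look safer than they really are.

    def crashes_per_million_person_miles(at_fault_crashes, annual_miles, years_followed):
        # exposure = self-reported annual mileage x years of follow-up
        person_miles = annual_miles * years_followed
        return at_fault_crashes / person_miles * 1_000_000

    # e.g. one at-fault crash over six years at a self-reported 7,000 miles per year
    print(crashes_per_million_person_miles(1, 7000, 6))   # about 23.8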
The outcome: training in both reasoning and speed of processing reduced at-fault accidents. Reasoning training was significant only when the results were adjusted for age at baseline, gender, race, education, location, visual acuity, health, depression and mental status. Training in speed of processing was significant with and without such adjustments.

Now, here is some serious generalisation of brain training to everyday life outcomes.


In my mind the main differences between the studies are twofold: the choice of subjects and the choice of tasks. Somehow it is easier for me to believe that effects of brain training will be observable in the aged brain, possibly in the context of a mild cognitive impairment, than in healthy adults. Also, the tasks in the Nature study were designed by BBC producers. The tasks in the driving study came out of research by Professor Merzenich, a father of brain plasticity research.

I can't comment much on the reasoning tasks, but I tried the 'speed of processing' training that was given to one of the study groups. It was a couple of years ago or more, when Posit Science was selling their products as one-off purchases for ugly sums of money (these days it is a neat monthly subscription of $14). I bought the driving module, which was the cheapest. And here I leave this story on a cliffhanger, because I want to resurrect my old PC and get the data from that training before continuing.

cheers,
Izabela

1 comment:

  1. Excellent posts on the Wild West of neuropsych! You should also have a Twitter! I'm writing up a grant and considering those arguments right now...
    Lucette Cysique, Ph.D., UNSW
