Researchers Demonstrate The First Brain-Controlled Robotic Arm

Researchers have demonstrated the first 3D control of a robotic arm from electrodes implanted directly in the brain. The technology had previously been tested in monkeys, but this is the first time it has been successfully trialled in humans; until now, human brain-computer interfaces (BCIs) have been limited to two dimensions. This is clearly a huge step forward for BCI technology. However, it will likely be a very long time before we see it affecting our everyday lives. Methods that do not require electrodes to be attached directly to the brain have to work with much weaker signals, because the brain waves must travel through the skull and the tissue above the brain. Given the cost of invasive brain procedures and the associated risks, it is unlikely that this method will be used for anything but the most extreme cases of paralysis.

Reference
Hochberg, L., Bacher, D., Jarosiewicz, B., Masse, N., Simeral, J., Vogel, J., Haddadin, S., Liu, J., Cash, S., van der Smagt, P., & Donoghue, J. (2012). Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature, 485(7398), 372-375. DOI: 10.1038/nature11076

2 New Brainwave Products, Can You Spot The Fraud?

This week two new brain-computer interface (BCI) based products have hit the headlines, and one of them is a hoax. I’ve placed the adverts for both below; see if you can figure out which one is a real project.

 

Project Black Mirror
The developers of “Project Black Mirror” claim to have developed a BCI that can control an iPhone using Siri.

Neurowear
The developers of “Neurowear” claim to have developed a pair of wearable rabbit ears, driven by a BCI, that move based on your mood.

But, can you tell which one is an elaborate hoax?

(Watch the videos and check out their websites, but don’t scroll down until you’ve made your guess.)


An elaborate hoax

Believe it or not, it turns out that the hoax is actually the mobile phone device, “Project Black Mirror”. This is clear for a number of reasons:

1. EEG cannot yet be deciphered anywhere near the extent necessary to achieve a wide range of commands based simply on imagined words. At the moment it is only possible to assign commands based on cues such as our emotions or imagined movements of different parts of the body (a minimal sketch of this kind of decoding follows this list). Even then, there is a very long way to go before we can achieve significantly more commands than can be counted on one hand.

2. On the “Project Black Mirror” page the group make the blunder of describing the device as an ECG instead of an EEG. An ECG is an electrocardiogram, which measures activity from the heart, while an EEG is an electroencephalogram, which measures activity from the brain: by definition, a necessary component of any BCI (the brain, that is).

3. On the “Project Black Mirror” page the group describe the device as measuring signals in the range of 0-5 V. Real EEG signals are roughly a million times smaller than that: they are measured in microvolts, not volts.

4. The circuit board in the “Project Black Mirror” video isn’t properly attached.

5. In the “Project Black Mirror” video, the laptop screen shows an animation of the Matrix code, presumably standing in for an EEG output.
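To make point 1 concrete, here is a minimal, hypothetical sketch of how a genuine EEG command is usually derived: band-power features from sensorimotor electrodes are fed to a simple classifier, yielding a small discrete set of commands rather than free-form imagined speech. The electrode choice, sampling rate and classifier below are illustrative assumptions, not a description of any particular product.

```python
# Minimal, hypothetical motor-imagery decoder: band-power features + LDA.
# Assumes 'trials' has shape (n_trials, n_channels, n_samples), recorded at
# 250 Hz from sensorimotor electrodes (e.g. C3, C4), and 'labels' marks
# left- vs right-hand imagined movement (0 or 1).
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # sampling rate in Hz (assumed)

def band_power(trial, fmin=8.0, fmax=30.0):
    """Mean mu/beta-band log power per channel, a classic motor-imagery feature."""
    freqs, psd = welch(trial, fs=FS, nperseg=FS)
    band = (freqs >= fmin) & (freqs <= fmax)
    return np.log(psd[:, band].mean(axis=1))  # one feature per channel

def train_decoder(trials, labels):
    features = np.array([band_power(t) for t in trials])
    clf = LinearDiscriminantAnalysis()
    clf.fit(features, labels)
    return clf

def decode(clf, trial):
    """Map a new EEG trial onto one of a handful of discrete commands."""
    return clf.predict(band_power(trial)[None, :])[0]
```

Even a well-tuned pipeline of this kind distinguishes only a few classes; decoding arbitrary imagined sentences for Siri is far beyond it.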

As @Interaxon has pointed out, this is a rather sad trick to play because it devalues the work being done by genuine BCI researchers and raises expectations to an unrealistic level. That said, progress is being made: only this week a breakthrough study was published in The Lancet demonstrating, using EEG, that 19% of patients diagnosed as being in a vegetative state could respond using a BCI.

“Three (19%) of 16 patients could repeatedly and reliably generate appropriate EEG responses to two distinct commands, despite being behaviourally entirely unresponsive (classification accuracy 61–78%)”

(Cruse et al., 2011) [Open access PDF via The Lancet]

This is a major step forward: it demonstrates clinically that there really is potential to use the many different BCI packages in development around the world to communicate with people who currently have no way of communicating whatsoever. That is a noble goal, and one that we are, right now, witnessing being achieved for the first time. The “Project Black Mirror” video, by contrast, appears to be attempting to capitalise on this work by applying to crowd-fund its “project” on Kickstarter. It is at best a poorly thought out hoax and at worst a blundering attempt at major fraud.

Now, there is one question left to answer, and that is…

“What about the BCI rabbit ears?”
Well, it seems that this project may well be genuine. The concept itself is certainly scientifically grounded and empirically demonstrated (Coan & Allen, 2004) [Open access PDF]. As for the product: if there is someone bonkers enough to build it, there is no reason why it would not be technically possible. And it would appear that there is.

NB: This is not an endorsement of the “Neurowear” product. I have seen no published data, and the apparent use of a single electrode suggests the device would be vulnerable to confounding facial movements (see my critical post on the Emotiv). That said, they certainly aren’t the first group to come up with a wacky implementation of BCI, and they certainly won’t be the last.

References:
Cruse, D., Chennu, S., Chatelle, C., Bekinschtein, T. A., Fernández-Espejo, D., Pickard, J. D., Laureys, S., & Owen, A. M. (2011). Bedside detection of awareness in the vegetative state: a cohort study. The Lancet. DOI: 10.1016/S0140-6736(11)61224-5

Coan, J., & Allen, J. (2004). Frontal EEG asymmetry as a moderator and mediator of emotion. Biological Psychology, 67(1-2), 7-50. DOI: 10.1016/j.biopsycho.2004.03.002

Remote Controlled Rats

Head top unit

Last month we covered Kevin Warwick’s creation of a robot controlled remotely by living rat brain tissue. This month a similar project by a group at MIT has completed almost exactly the same task in reverse: an implant in a rat’s brain has been used to remotely control the rat’s movements by delivering pulses of light directly into the brain tissue. Until now, a problem for this method has been the size of the batteries required to power the unit that sits on the rat’s head. This has been solved, to some extent, by using “wireless power”: a small magnetic coil on the head-top unit receives electricity transmitted via a magnetic field from a nearby magnetic coil.
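To give a feel for why a coil pair can replace the battery, here is a small sketch using the standard textbook expression for the maximum efficiency of a resonant inductive link, eta = x / (1 + sqrt(1 + x))^2 with x = k^2 * Q1 * Q2. The coupling coefficients and quality factors below are made-up illustrative values, not figures from the actual head-mounted device.

```python
# Rough sketch: maximum efficiency of a resonant inductive power link.
# All numbers are illustrative assumptions, not parameters of the real device.
import math

def max_link_efficiency(k, q_tx, q_rx):
    """k: coupling coefficient (0-1); q_tx, q_rx: coil quality factors."""
    x = (k ** 2) * q_tx * q_rx
    return x / (1 + math.sqrt(1 + x)) ** 2

# Even a loosely coupled pair (a small head coil some distance from the
# transmitter) transfers useful power if both coils have reasonably high Q.
for k in (0.01, 0.05, 0.1):
    print(f"k = {k:.2f} -> efficiency ~ {max_link_efficiency(k, 100, 100):.1%}")
```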


Wirelessly powered and controlled optogenetics

The remote-control addition to this project is the groundbreaking element here. It has long been known that electrical stimulation of the brain can elicit specific behaviours. More recently the field of optogenetics has emerged, becoming Nature’s “Method of the Year” in 2010. Optogenetics uses a gene from algae that encodes a light-sensitive protein (such as channelrhodopsin), coupled to a promoter that restricts its expression to the neurons driving a given behaviour, such as sex, aggression or the flight response. The modified gene is delivered into the brain by a virus, and the light-sensitive protein then electrically activates those neurons whenever the light is on. The neurons that trigger a particular behaviour can thus be activated at the flick of a switch, just by shining light on the brain using an LED.
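As a toy illustration of the targeting principle only (not of the real biophysics), here is a sketch of a leaky integrate-and-fire neuron that receives an extra inward drive only while the light is on and only if it expresses the opsin. Every constant is an arbitrary illustrative value.

```python
# Toy sketch: a leaky integrate-and-fire neuron driven by a photocurrent
# only when it expresses the opsin AND the LED is on.
# All constants are arbitrary illustrative values, not measured biophysics.
DT, TAU = 0.1, 10.0                               # time step, membrane time constant (ms)
V_REST, V_THRESH, V_RESET = -70.0, -55.0, -70.0   # membrane potentials (mV)

def simulate(light_on, expresses_opsin, photo_drive=20.0, steps=2000):
    """Return spike times (ms) for one simulated neuron."""
    v, spikes = V_REST, []
    for step in range(steps):
        t = step * DT
        drive = photo_drive if (expresses_opsin and light_on(t)) else 0.0
        v += DT / TAU * (-(v - V_REST) + drive)  # leaky integration
        if v >= V_THRESH:                        # threshold crossing -> spike
            spikes.append(t)
            v = V_RESET
    return spikes

# LED pulsed on between 50 ms and 150 ms: only the opsin-expressing
# neuron fires, which is the whole point of the genetic targeting.
pulse = lambda t: 50.0 <= t <= 150.0
print("opsin-positive spikes:", len(simulate(pulse, True)))
print("opsin-negative spikes:", len(simulate(pulse, False)))
```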

Believe it or not, the purpose of this research isn’t a macabre project to develop Frankenstein pets. The combination of optogenetics and remote power transmission is likely to result in vast developments in neural prosthetics, and theoretically there are also endless possibilities for the management of mood disorders. Optogenetics will surely be a field I’ll be keeping my eye on.

The original paper is open-access for 30 days (PDF)


Wentz, C. T., Bernstein, J. G., Monahan, P., Guerra, A., Rodriguez, A., & Boyden, E. S. (2011). A wirelessly powered and controlled device for optical neural control of freely-behaving animals. Journal of Neural Engineering, 8(4). PMID: 21701058

Via Ed Yong on Not Exactly Rocket Science

Rise of the Rat-Brained Robots

On Friday 20th May Kevin Warwick, the designer of the rat-brain robot and the self-proclaimed “world’s first cyborg”, is visiting town for the Bristol Festival of Ideas. “Rewind,” I hear you say, “the what-controlled robot and the what what?” Kevin and his team have grown cultured neural networks from the brain tissue of dissected embryonic rats. The tissue is dissected, split up using enzymes, and fed on a culture of nutrients, antibiotics and growth hormones, half of which has to be replaced twice weekly. Within just an hour of seeding, the cells extend connections with one another, and within 24 hours a dense mat of neuronal extensions appears. The team attach a microelectrode array to the surface of the neuronal mat. After a week, single action potentials (electrical signals) can be seen, and during the second week dense bursts of electrical signals are emitted by the “brain”. The electrode array placed on the brain is then connected via Bluetooth to a robot. The robot scans its environment using a form of sonar and transmits signals to the brain via the electrode array; the brain interprets these signals and responds with signals of its own, which are analysed in real time and used to control the movement of the robot. This closed loop allows the robot to avoid obstacles without any outside help.
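Purely as an illustration of that closed loop (sonar reading in, stimulation pattern out, recorded activity back, motor command out), here is a hypothetical Python sketch. The sonar, electrode-array and motor interfaces, the mapping rules and the thresholds are all invented for illustration; they do not describe the software actually used by Warwick’s group.

```python
# Hypothetical sketch of the culture-in-the-loop controller:
# sonar -> stimulation pattern -> recorded activity -> motor command.
# The sonar/array/motors objects are invented stand-ins, not a real API.
import time

OBSTACLE_THRESHOLD_CM = 30.0

def sonar_to_stimulation(distance_cm):
    """Map a sonar reading onto which electrodes to pulse (assumed coding)."""
    return "near_electrodes" if distance_cm < OBSTACLE_THRESHOLD_CM else "far_electrodes"

def activity_to_command(spike_counts):
    """Decode the culture's recorded response into a motor command (assumed rule)."""
    return "turn" if spike_counts.get("response_channel", 0) > 5 else "forward"

def control_loop(sonar, array, motors, period_s=0.1):
    while True:
        distance = sonar.read_distance_cm()              # sense the environment
        array.stimulate(sonar_to_stimulation(distance))  # talk to the culture
        spikes = array.record_spike_counts()             # listen to its response
        motors.drive(activity_to_command(spikes))        # act on the decoded reply
        time.sleep(period_s)
```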



The research paper concludes:

“The existing, successfully tested infrastructure could be easily modified to investigate culture-mediated control of a wide array of alternative robotic devices such as a robot head, an autonomous vehicle, robotic arms, grippers, mobile robot swarms, and multi-legged walkers.”

Perhaps that’s optimistic, but I’d doubtless have said it was optimistic if I’d been told ten years ago that robots would soon be controlled by brains grown in labs. Way back in 2002 Kevin became the world’s “first cyborg” by having a chip implanted in his arm which allowed him to remotely control a robot hand just by moving his own hand. He received sensory feedback from the robot in the form of electrical signals delivered by the implant in his arm. Before this point it was generally believed that such an operation would cause sensory damage, but no damage occurred; the nerve endings actually grew around the electrode array (Warwick et al., 2005). Following this, a similar array was implanted in his wife, allowing the couple to become the first humans in history to communicate through purely electronic means (albeit only through internal clicks of the hand). In the film below Kevin describes his experience of “being a cyborg” and debates some of the ethical issues with a somewhat unique candour:

“Seeing how poor humans are at doing things, why not upgrade?.. why can’t I have extra memory?.. why can’t I sense the world in infrared or ultrasonic?” (all fair questions).. “If we can simply download information into the brain, education will completely change”… At about five and a half minutes in, things get really quite off the wall and the narrative starts to become slightly disparate, though I’m pretty sure that’s down to the film editor… “I think in terms of holidays.. do you really need to go to a place if the image of it and the memory of it can be downloaded into your brain?”. Kevin has come under attack from some quarters who are, rightly or wrongly but certainly amusingly, completely freaking out about the implications of his scientific advances (or perhaps rather, his interpretations of them). Kevin certainly doesn’t give up any ground to the critics… “I would expect critics, because when we look at enhancements it is going to change life completely, humans could become a sub-species.. (cyborgs) will clearly be intellectually superior in many, many ways.. just the concept of thinking in five or six or ten dimensions as opposed to humans thinking in three dimensions. You’d have a completely different concept of the world around you. If I was a cyborg and you were a cyborg and we’re communicating by thought and so on, and some human comes along making these silly noises that humans make called speech.. (I’m starting to feel sorry for his project student at this point).. well it’s a bit like two humans communicating and a cow coming in going moo moo moo”… “Well I don’t mind.. because I don’t want to be a human, I don’t want to be part of a sub-species.. I want to be a cyborg and I know I’m not the only one”.

I have an enormous respect for Kevin; the work he has done has led to phenomenal scientific progress and is nothing short of inspirational. It appears to me that Kevin is a tad misrepresented by the above clip, as the excessive video editing seems to obstruct his message somewhat. In another excellent monologue (below) Kevin talks freely and is much better presented. I find particularly interesting how he describes the perceptual incorporation of ultrasonic sensors (Warwick et al., 2005). Using the implant in his own arm, feedback from ultrasonic sensors mounted on a baseball cap on his head was fully integrated into his perceptual system, allowing him to judge his distance from objects. It seems he genuinely acquired a sixth sense (something that researchers in Brighton have recently demonstrated is no longer such a unique technological advancement). In the following clip he also tackles the Terminator scenario. I can’t help but think that he is using irony, humour and a pinch of artistic licence to tackle the issues he’s concerned about, which in my opinion is the best way to create dialogue in most domains. Either way, the following clip certainly makes me chuckle to myself at points. Mortal humour is the new black, haven’t you heard? Just me? OK then. I’d really like to be able to think that he’s just engaging in some old-fashioned idiot-baiting, but it certainly appears that he sincerely believes what he is saying, which makes the whole thing pretty chilling.

“The possibilities are there, the sort of Terminator-style machines, they will be the ones in command… There are dangers and threats if you are a human and you want to remain a human, I certainly don’t, I want to become a cyborg on a permanent basis and I can see lots of advantages of doing that.. In the future if you want to be a cyborg and you’re happy upgrading then fine, I think the future’s rosy, an in-the-driving-seat type of existence. However if you are a human and you want to stay human then, well, enjoy life while you can, it’s not going to last much longer”. Things get even darker when Kevin moves on to talk about military applications.. “We can see dramatic changes in the military domain, firstly autonomous fighting machines, by the time we get to 2020.. we’re going to have a plethora of them”… and without dropping the deadpan for even a second… “security is going to have to be tighter, you can’t have people hacking into other people’s brains”.

Could it be that James Cameron might just have hit the nail on the head the first time round? Probably not, but maybe, just maybe, the classic film had the ring of truth. Only the future will tell. Welcome to the brave new world.

Kevin Warwick is speaking at the Bristol Arnolfini on Friday 20th May (along with a host of other fascinating speakers over the coming fortnight at the Bristol Festival of Ideas).

Warwick, K., Xydas, D., Nasuto, S. J., Becerra, V. M., Hammond, M. W., Downes, J., Marshall, S., & Whalley, B. (2010). Controlling a mobile robot with a biological brain. Defence Science, 60(1), 5-14. (Open Access PDF)
Warwick, K., Gasson, M., Hutt, B., & Goodhew, I. (2005). An attempt to extend human sensory capabilities by means of implant technology. Proceedings of the 2005 IEEE International Conference on Systems, Man and Cybernetics, 2(10), 1663-1668. DOI: 10.1109/ICSMC.2005.1571387 (IEEE subscription needed)

Calculating the value of a year of human life in $US

I just returned from a week of amazing talks at the IEEE conference on computational intelligence in Paris (yay!). I won’t bore you with a full review, but I thought I’d write a post on an interesting debate I had based on a paper presented at the conference. The debate was about how much a human life is worth: it’s a crucial variable if you are writing an algorithm for managing a rapid response to a state of emergency. It is still generally too much of a taboo to put a real number on the value, let alone to seriously consider giving different lives different values. That said, we know that if we had one fire engine to send to either a burning school or a burning office, we would not make a random decision. At the moment it can take up to 16 minutes for emergency vehicles to be dispatched in a state of emergency; if dispatch were fully automated, decisions would be made in seconds (Mohammadi & Sadeghian, 2011).
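For illustration only, here is a hypothetical sketch of the kind of prioritisation such an automated dispatcher might perform: rank incidents by the expected life-years saved by sending the one available vehicle there. The incident data, survival estimates and weighting are entirely made up and are not taken from the iFAST paper.

```python
# Hypothetical dispatch prioritisation: send the single available fire engine
# where it is expected to save the most life-years.  All figures are invented
# for illustration; nothing here comes from the iFAST paper or a real system.
from dataclasses import dataclass

@dataclass
class Incident:
    name: str
    people_at_risk: int
    avg_remaining_life_years: float     # rough life expectancy of those at risk
    survival_gain_if_dispatched: float  # added survival probability per person

    def expected_life_years_saved(self) -> float:
        return (self.people_at_risk
                * self.survival_gain_if_dispatched
                * self.avg_remaining_life_years)

def choose_dispatch(incidents):
    return max(incidents, key=lambda i: i.expected_life_years_saved())

incidents = [
    Incident("burning school", people_at_risk=200,
             avg_remaining_life_years=65.0, survival_gain_if_dispatched=0.02),
    Incident("burning office", people_at_risk=80,
             avg_remaining_life_years=40.0, survival_gain_if_dispatched=0.03),
]
print(choose_dispatch(incidents).name)  # "burning school" under these made-up numbers
```

Putting explicit numbers into a rule like this is exactly the taboo the debate was about, which is why the inputs here are so obviously artificial.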

It’s a relevant question for me because my interest is in brain-computer interfacing (BCI), a technology that provides a means of communication for the severely paralysed by linking the human brain directly to a computer. BCI isn’t widely used clinically yet, largely because the medical-grade EEG equipment required costs tens of thousands of dollars. What isn’t discussed is whether health authorities are even interested in extending lives by artificial means: it certainly doesn’t do health budgets any favours, and for some people the prospect leaves a bitter taste in the mouth.

One reason I’m fully behind BCI development is that history shows clearly that the costs of these systems are destined to fall rapidly, and if we invest in them their performance is likely to increase exponentially. If you were to plot the increase in computing power, or the reduction in computing cost, over time on a regular linear graph, you would get a line disappearing almost vertically into the sky.

The graph below is a semi-logarithmic plot: the y-axis is logarithmic, so each level represents a value one hundred times the level below it.

[Graph: exponential growth plotted on a semi-logarithmic scale]
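To see why a logarithmic y-axis is the natural way to draw this, here is a small sketch that plots a made-up exponential cost series on both a linear and a semi-log axis. The numbers are invented purely to show the shape; they are not real hardware data.

```python
# Sketch: the same made-up exponential series on linear vs. semi-log axes.
# A cost that halves every two years hugs zero on a linear plot but becomes
# a straight line once the y-axis is logarithmic.
import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1970, 2012)
cost = 1e6 * 0.5 ** ((years - 1970) / 2.0)  # invented: halves every two years

fig, (ax_lin, ax_log) = plt.subplots(1, 2, figsize=(9, 3.5))
ax_lin.plot(years, cost)
ax_lin.set_title("Linear y-axis: the detail vanishes")
ax_log.semilogy(years, cost)
ax_log.set_title("Logarithmic y-axis: a straight line")
for ax in (ax_lin, ax_log):
    ax.set_xlabel("Year")
    ax.set_ylabel("Cost per unit of computation ($, invented)")
plt.tight_layout()
plt.show()
```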

During some slightly off-topic research on this somewhat rhetorical question I stumbled across this rather morbid paper, which presents some intriguing stats from a meta-analysis of peer-reviewed journals. Below are just a few of the relative costs of treatments or state interventions that are in practice, presented in terms of the cost of the intervention for every “life-year” saved.

[Table: cost per life-year saved for a selection of interventions, from Tengs et al. (1995)]

I’ll be the first to admit that some of the figures are more than a little bit odd (probably due to the infinite number of confounding factors in a comparative study like this), and I haven’t reviewed the paper’s sources, but I recommend taking a glance at the original (PDF). It’s morbidly intriguing, if nothing else.

Tengs, T., Adams, M., Pliskin, J., Safran, D., Siegel, J., Weinstein, M., & Graham, J. (1995). Five-Hundred Life-Saving Interventions and Their Cost-Effectiveness. Risk Analysis, 15(3), 369-390. DOI: 10.1111/j.1539-6924.1995.tb00330.x

Mohammadi, & Sadeghian (2011). iFAST: An Intelligent Fire-Threat Assessment and Size-up Technology for First Responders. Proceedings of the IEEE Symposium Series on Computational Intelligence.
