Human Computer Interface (HCI) Tech Digest - February 2017

Stimulation of visual cortex provides partial sight

The University of Chicago Medicine has been awarded USD 2.4 million by the US National Institutes of Health (NIH) to further develop wireless brain implants that restore partial vision in blind people. The project aims to convert a camera's input into electrical stimulation of the visual cortex, producing the perception of visual stimuli. As part of the NIH's BRAIN initiative, the project will spend two years developing the technology and implants, followed by three years of human trials with the implanted devices.
The subject will wear camera-equipped glasses connected to a waist-worn computer, which processes the data and sends it as electrical signals via a headband to some of the 600 electrodes implanted in the visual cortex, producing the sensation of sight. For now the trial will be limited to people who were born with sight but later lost it, as their visual cortex is well developed.
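As a rough illustration of the camera-to-cortex pipeline described above, the sketch below maps a grayscale camera frame onto a coarse grid of stimulation amplitudes. The 20x30 electrode layout, the current ceiling and the camera and electrode interfaces are illustrative assumptions; the project's actual mapping and hardware are not public.

```python
# Minimal sketch, assuming a hypothetical electrode driver and a 20x30 layout
# for the ~600 electrodes; not the project's actual processing code.
import numpy as np

GRID_ROWS, GRID_COLS = 20, 30          # assumed arrangement of the ~600 electrodes
MAX_CURRENT_UA = 60.0                  # illustrative per-electrode current ceiling, not a real spec

def frame_to_stimulation(frame: np.ndarray) -> np.ndarray:
    """Downsample a grayscale frame (at least 20x30 pixels) to one value per electrode."""
    h, w = frame.shape
    rh, rw = h // GRID_ROWS, w // GRID_COLS
    # Average each image block so brighter regions map to stronger stimulation.
    blocks = frame[:rh * GRID_ROWS, :rw * GRID_COLS].reshape(GRID_ROWS, rh, GRID_COLS, rw)
    intensity = blocks.mean(axis=(1, 3)) / 255.0
    return intensity * MAX_CURRENT_UA   # per-electrode amplitudes in microamps

def process_loop(camera, electrodes):
    """Waist-worn computer loop: read a frame, convert it, push it to the headband."""
    while True:
        frame = camera.read_grayscale()              # hypothetical camera interface
        amplitudes = frame_to_stimulation(frame)
        electrodes.send_stimulation(amplitudes)      # hypothetical wireless electrode driver
```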

Typing from the brain

In a Stanford University-led study, researchers demonstrated that a brain-computer interface (BCI) can enable paralysed people to type with high speed and accuracy. The three study subjects each had two small electrode arrays implanted in the motor cortex. Signals from this area were sent to a computer, which translated them into point-and-click commands guiding a cursor over an on-screen keyboard. One participant was able to type about eight words per minute accurately.
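One common way to turn motor-cortex activity into cursor movement is a linear decoder over binned firing rates. The minimal sketch below illustrates that general idea only; the weights, channel count and click logic are assumptions, not the study's actual decoding algorithm.

```python
# Minimal sketch of decode-to-cursor control, assuming a linear decoder.
import numpy as np

class CursorDecoder:
    def __init__(self, weights: np.ndarray, click_threshold: float = 0.8):
        # weights maps an n-channel firing-rate vector to (vx, vy, click_score)
        self.weights = weights
        self.click_threshold = click_threshold
        self.position = np.zeros(2)

    def step(self, firing_rates: np.ndarray, dt: float = 0.02):
        """Update the cursor from one 20 ms bin of motor-cortex activity."""
        vx, vy, click_score = self.weights @ firing_rates
        self.position += dt * np.array([vx, vy])
        clicked = click_score > self.click_threshold   # "click" the key under the cursor
        return self.position, clicked

# Example with an assumed 192 recording channels (two 96-channel arrays):
decoder = CursorDecoder(weights=np.random.randn(3, 192) * 0.01)
```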

Turtle controlled by mind

Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed a brain-computer interface (BCI) that lets a human steer a turtle by thought. Participants wear a head-mounted display (HMD) showing the turtle's video feed, plus the BCI. The turtle carries a camera, a Wi-Fi transceiver, a computer control module, a battery and a stimulation device that can block the turtle's eyesight, all attached to the upper shell. The participant watches the video feed and decides which way the turtle should move; the BCI classifies this intent as a left, right or idle signal. The signal is sent to the stimulation device, which blocks the turtle's sight on either the left or the right side. The turtle instinctively moves away from the blocked side, so if the left side is blocked it turns right. With further development, the researchers see the technology improving AR and VR and being used in military reconnaissance and surveillance.
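The resulting control loop might look roughly like the sketch below, assuming a hypothetical EEG classifier that returns "left", "right" or "idle" and a hypothetical radio link to the shell-mounted hardware; KAIST's actual software is not public.

```python
# Sketch of the human-to-turtle control loop; all interfaces are hypothetical stand-ins.
def control_loop(bci, turtle_link, display):
    while True:
        display.show(turtle_link.latest_video_frame())   # HMD shows the turtle's camera feed
        command = bci.classify()                          # "left", "right" or "idle"
        if command == "left":
            # Blocking the turtle's RIGHT eye makes it veer away from that side, i.e. turn left.
            turtle_link.block_eye("right")
        elif command == "right":
            turtle_link.block_eye("left")
        else:
            turtle_link.clear_block()                     # idle: let the turtle see normally
```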

Brain-hacking

The University of Washington's Tamara Bonaci and her team recently 'hacked' into a human brain. The team built a BCI (seven electrodes and an EEG device) that recorded subjects' neural responses to in-game imagery while they played a game called Flappy Whale. Logos of restaurants, cars and other brands were flashed for milliseconds, too briefly to be consciously noticed, and the subjects' unconscious emotional responses were captured. Bonaci believes similar techniques could be used to extract information about people's religious beliefs or attitudes to race.
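A generic way to pull such responses out of EEG is to average segments time-locked to each stimulus flash (event-related potential averaging). The sketch below illustrates that standard technique; it is not the team's actual analysis, and the sampling rate and window length are assumptions.

```python
# Sketch of event-related potential averaging over repeated subliminal flashes.
import numpy as np

def average_response(eeg, onsets, fs=256, window_s=0.8):
    """Average EEG segments time-locked to each stimulus onset.

    eeg: array of shape (channels, samples); onsets: stimulus onset sample indices.
    """
    window = int(fs * window_s)
    epochs = [eeg[:, t:t + window] for t in onsets if t + window <= eeg.shape[1]]
    # Averaging across repeated flashes of the same logo reveals the evoked response
    # even though each single exposure was too brief to notice consciously.
    return np.mean(epochs, axis=0)   # shape: (channels, window samples)
```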

Low power voice recognition chip

MIT scientists have developed a voice-recognition chip that could offer power savings of 90% to 99% compared with current chips. A cell phone running speech-recognition software currently requires about 1 watt of power; the new chip requires between 0.2 and 10 milliwatts, depending on how many words it has to recognise. This efficiency could open up a wider field for voice-recognition technology in IoT and wearable devices. The savings come from a voice activity detection circuit that continuously monitors ambient sound for speech; only when speech is detected are the larger, power-hungry recognition circuits switched on to process it.
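In software terms, the gating logic amounts to a cheap always-on detector that wakes the expensive recogniser only when needed. The sketch below is a simplified illustration with an assumed energy-based detector and threshold, not the chip's actual circuit.

```python
# Sketch of voice-activity-detection gating for a power-hungry recogniser.
import numpy as np

SPEECH_ENERGY_THRESHOLD = 0.01   # illustrative normalized-energy threshold (assumption)

def is_speech(frame: np.ndarray) -> bool:
    """Cheap always-on check: treat frames well above ambient energy as speech."""
    return float(np.mean(frame ** 2)) > SPEECH_ENERGY_THRESHOLD

def run(audio_frames, recognizer):
    for frame in audio_frames:                 # e.g. 10 ms windows of microphone samples
        if is_speech(frame):
            recognizer.power_on()              # hypothetical: enable the big recognition stage
            recognizer.process(frame)
        else:
            recognizer.power_off()             # stay in the low-power idle state
```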

Havyn security assistant at IBM

Mike Spisak of IBM has created a voice assistant, named Havyn, in his spare time; it is now being tested in real-world use at IBM's X-Force Command Centers (the company's security operations centres).
The assistant was built on a Raspberry Pi with a seven-inch touchscreen, running code from Bluemix (IBM's cloud platform). Spisak later connected Havyn to IBM BigFix, an endpoint security manager that assesses systems for threats, and the assistant also uses IBM Watson APIs. Havyn can support cybersecurity teams in handling their workflow, responding in real time to verbal requests and commands and pulling in data from open-source security intelligence. For example, Havyn can brief workers on current security threats and recommended remediation procedures.
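The request flow might look roughly like the sketch below. Every interface in it (the speech-to-text call, the threat feed and the speech output) is a hypothetical stand-in, since Havyn's actual code and the exact Watson and BigFix calls it makes are not public.

```python
# Rough sketch of a spoken threat-briefing request; all interfaces are hypothetical.
def handle_request(microphone, speech_to_text, threat_feed, speak):
    text = speech_to_text(microphone.record())            # e.g. a cloud speech service (assumption)
    if "threat" in text.lower():
        threats = threat_feed.latest()                     # e.g. data from BigFix or open-source intel
        summary = "; ".join(f"{t.name}: {t.remediation}" for t in threats)
        speak(f"Current threats and recommended fixes: {summary}")
    else:
        speak("Sorry, I did not catch a security question in that.")
```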

Better electrodes

The Center for Sensorimotor Neural Engineering (CSNE), a multi-university collaboration, has developed a type of electrode that it claims is more durable, lasts longer and transmits clearer data than current electrodes. Today's high-end electrodes are made of thin-film platinum, which can fracture and gradually degrade. The study's answer is glassy carbon electrodes, which are ten times smoother than thin-film platinum, making them less prone to corrosion and giving them a longer lifetime. According to one of the researchers, the material is also promising for reading signals directly from neurotransmitters, offering about double the signal strength of platinum electrodes.

DARPA working on neural interface

DARPA, the Defense Advanced Research Projects Agency, a branch of the US Department of Defense, has revealed that it is working on a direct neural interface, according to a story in Computerworld. The interface would allow humans to control machines and devices with thought alone. Justin Sanchez, director of DARPA's Biological Technologies Office, says the agency aims to create an implantable device with computing power similar to that of a laptop.

Optically conductive fibre could help us better understand the brain

MIT scientists have developed a chemically, electrically and optically conductive fibre that carries signals between the brain and an external device. The researchers expect the fibre to be used mostly in animal research. It allows scientists to inject viral vectors carrying light-sensitive proteins (opsins) directly into the brain; the proteins make neurons sensitive to light, so researchers can send light signals through the fibre and record the neural responses. This procedure is useful for understanding how neural pathways function. The scientists see the main use of the fibre as improving our understanding of Parkinson's disease, depression and other neurological conditions.
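A typical stimulate-and-record experiment enabled by the fibre could be scripted along the lines of the sketch below; the laser and recorder objects, pulse timings and return values are hypothetical placeholders rather than MIT's actual instrumentation.

```python
# Sketch of a light-pulse stimulation and recording session over the fibre.
import time

def pulse_train_experiment(laser, recorder, pulses=20, pulse_ms=5.0, period_ms=100.0):
    """Drive opsin-expressing neurons with brief light pulses and log their responses."""
    recorder.start()                     # hypothetical recording interface
    for _ in range(pulses):
        laser.on()                       # light travels down the optical core of the fibre
        time.sleep(pulse_ms / 1000.0)
        laser.off()
        time.sleep((period_ms - pulse_ms) / 1000.0)
    return recorder.stop()               # electrical responses picked up via the conductive fibre
```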

Line is launching its own voice assistant

Chat app Line has created its own voice assistant, Clova (a portmanteau of 'cloud virtual assistant'). Clova is built into the Line app but will also be offered as a standalone product that can be embedded in third-party devices, an option that has already attracted interest from Sony and Tomy. Line has also recently acquired Japanese hologram company Vinclu, which previously released a holographic virtual assistant. Line plans to release a Clova-powered smart speaker in Japan and Korea in early summer 2017.