Human-Computer Interface (HCI) Tech Digest - March 2017

Bixby digital assistant

Samsung has announced the release of Bixby, a new digital assistant for the company’s devices - initially the Galaxy S8, with other devices to follow. Samsung claims that Bixby is different from other digital assistants on the market due to:
1)    Completeness: Bixby will enable complete voice control of Bixby-enabled apps, eliminating the need to touch the screen.
2)    Context awareness: Bixby is aware of the current context and state of a Bixby-enabled application, helping with a smooth, uninterrupted workflow.
3)    Cognitive tolerance: users will be able to give incomplete voice commands from which Bixby can deduce the intended meaning, asking for confirmation as it completes the task piecemeal (a minimal sketch of this confirm-as-you-go pattern follows below).
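
Here is that sketch: a toy confirm-as-you-go command loop. The Intent class, the ask_user helper and the slot names are our own illustration, not Samsung’s API.

    from dataclasses import dataclass, field

    @dataclass
    class Intent:
        action: str
        slots: dict = field(default_factory=dict)  # missing values are None

    def ask_user(prompt: str) -> str:
        """Stand-in for a voice confirmation round trip."""
        return input(prompt + " ")

    def complete_piecemeal(intent: Intent, guesses: dict) -> Intent:
        # Fill each missing slot with a best guess, confirming one at a time.
        for slot, value in intent.slots.items():
            if value is None:
                reply = ask_user(f"Did you mean {slot} = {guesses.get(slot)}? (y/n)")
                if reply.strip().lower().startswith("y"):
                    intent.slots[slot] = guesses.get(slot)
        return intent

    # "Send the photo" arrives with no recipient; a Bixby-style assistant
    # would deduce one from context and confirm it before acting.
    intent = Intent("send_photo", {"recipient": None})
    print(complete_piecemeal(intent, {"recipient": "Mum"}))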

Brain tells robot it’s not right 

Scientists at the Massachusetts Institute of Technology (MIT) have developed a system that lets humans tell a robot when it is making a mistake. The system uses EEG (electroencephalography) to detect the reactive pattern the brain produces when the user sees the robot do something wrong. To establish a baseline measurement for this error signal, the scientists had five subjects wearing EEG equipment watch a robot reach toward one of two LEDs. When the subjects saw the robot reach toward an arbitrarily designated wrong LED, their brains generated a distinct signal that could cue the robot that it was in error. In 70 percent of the tests the robot identified the error signal and altered its behaviour appropriately. The scientists see this system having uses in autonomous car systems, e.g. alerting the car when the human spots a hazard that the car hasn’t.
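
As a rough illustration of the detection step, the sketch below flags an error-related pattern when a post-action EEG epoch deviates sharply from a resting baseline. The z-score threshold stands in for the study’s actual classifier, and all numbers are synthetic.

    import numpy as np

    def looks_like_error(epoch, baseline, z_threshold=4.0):
        """True if any sample in the epoch deviates strongly from baseline."""
        mu, sigma = baseline.mean(), baseline.std()
        z = np.abs((epoch - mu) / sigma)   # per-sample z-scores
        return bool(z.max() > z_threshold)

    rng = np.random.default_rng(0)
    baseline = rng.normal(0.0, 1.0, 500)   # resting EEG samples
    clean = rng.normal(0.0, 1.0, 200)      # epoch: robot picked the right LED
    errp = clean.copy()
    errp[100] += 8.0                       # spike: user saw the wrong LED chosen

    print(looks_like_error(clean, baseline))  # False (no strong deviation)
    print(looks_like_error(errp, baseline))   # True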

BCI restores motion in a quadriplegic’s arm

Quadriplegic Bill Kochevar has regained movement in his arm and hand with the help of a brain-computer interface. The re-activation happened at Case Western Reserve University in Ohio, USA, through the insertion of a recording electrode array under the skull and a functional electrical stimulation (FES) system connecting Kochevar’s brain to his muscles. As a visualisation exercise, Kochevar spent the first four months of the study using his brain signals to move a virtual-reality arm. He then had the FES system implanted, through which signals from the 96 electrodes in his brain are converted into electrical muscle stimulation. Kochevar has since been able to drink, feed himself mashed potato and scratch his nose with a sponge skewered on a stick.
The scientists say the advances needed to make the system usable outside the lab are not far from being realised, with work ongoing on making the implant wireless and on fine-tuning the system for more precise movements.
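
The decode-and-stimulate loop can be pictured as below: a linear decoder maps electrode firing rates onto stimulation intensities for the arm muscles. In the real system the decoder is fitted during training; the random weight matrix, muscle count and squashing function here are purely illustrative.

    import numpy as np

    N_ELECTRODES = 96   # recording channels mentioned in the study
    N_MUSCLES = 6       # illustrative number of FES channels

    rng = np.random.default_rng(1)
    W = rng.normal(0.0, 0.05, (N_MUSCLES, N_ELECTRODES))  # stand-in decoder

    def decode_to_stimulation(firing_rates):
        """Map electrode firing rates (Hz) to FES intensities in (0, 1)."""
        drive = W @ (firing_rates - firing_rates.mean())  # centre the rates
        return 1.0 / (1.0 + np.exp(-drive))               # squash to (0, 1)

    rates = rng.poisson(20, N_ELECTRODES).astype(float)   # one bin of activity
    print(decode_to_stimulation(rates))                   # six muscle intensities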

Brain Computer Interface (BCI) for VR

Neurable, a startup investigating BCIs, has demonstrated a dry EEG (electroencephalography) system (dry meaning the electrodes need no conductive gel) that allows wearers to interact with a VR environment. The system pairs an HTC Vive headset with the EEG equipment to read and interpret the user’s intentional brain signals. In the demonstration, which used a game environment, the detected intention was fed back into the VR world as a spell selection or throw. The company says its BCI system suffers less noise (signals that obscure an accurate reading) and is quicker than others: with previous versions of the system, Neurable says it achieved 85 percent signal-reading accuracy in real time and 99 percent accuracy with a one-second delay.
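
The accuracy-versus-delay trade-off Neurable reports is the usual consequence of averaging over a longer EEG window before deciding. The toy below shows the shape of that trade-off with synthetic numbers; it does not reproduce Neurable’s figures.

    import numpy as np

    rng = np.random.default_rng(2)

    def accuracy(window_samples, trials=10_000, effect=0.15, noise=1.0):
        # Decide by the sign of the window mean; the true signal is +effect.
        means = rng.normal(effect, noise / np.sqrt(window_samples), trials)
        return (means > 0).mean()

    for window in (25, 250):               # ~0.1 s vs ~1 s at 250 Hz sampling
        print(window, f"{accuracy(window):.1%}")   # longer window, higher accuracy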

Alexa or Siri?

According to Bloomberg, Marriott, a global hotel chain, is deciding on a voice assistant for its rooms. It is testing both Amazon’s Alexa, used in the Echo range of products, and Apple’s Siri. The devices could let guests use voice commands to turn on the lights, turn up the heating or change the TV channel. The hotel chain plans to introduce the chosen platform in some hotels in mid-2017.
If Marriott selects Alexa, guests could also enjoy the voice-controlled games that have been released for Amazon’s Echo speakers.

Tilt to enter

Scientists at the University of St Andrews, UK, have come up with a phone text-input method that uses tilt gestures. The method, called SWiM (Shape Writing in Motion), uses a tilt-controlled pointer that traces a path across the screen: tilt the phone to the left and the pointer moves across the keyboard to the left; tilt it up and the pointer moves up. When the pointer is over the letter the user wishes to input, they tap an on-screen icon to confirm. A study showed that first-time users could input 15 words per minute with little practice, rising to 32 words per minute after 90 minutes of practice. The scientists suggest that tilt-based gesture control could also be used in VR headsets or game controllers.
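
The core of the tilt mapping can be sketched in a few lines: roll and pitch readings move the pointer proportionally, clamped to the keyboard area. The gain and coordinate conventions are our assumptions, not values from the SWiM paper.

    def update_pointer(x, y, roll_deg, pitch_deg, gain=4.0,
                       width=1080, height=400):
        """Move the pointer by an amount proportional to the device tilt."""
        x = min(max(x + gain * roll_deg, 0), width)    # left/right tilt
        y = min(max(y - gain * pitch_deg, 0), height)  # up/down tilt (screen y grows downward)
        return x, y

    x, y = 540, 200                                    # centre of the keyboard
    x, y = update_pointer(x, y, roll_deg=-10, pitch_deg=5)
    print(x, y)                                        # (500, 180): left and up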

Chatbot with empathy

Soul Machines, a New Zealand-based development company, has been working with IBM to develop an avatar called Rachel, which it claims can recognise and respond to human emotional needs. Rachel appears to the user as a life-like human avatar and works as a customer service and consultation agent, as showcased at the US financial services conference LendIt in 2016, where it advised people on credit cards.

An earbud that reads your expression

Scientists at the Fraunhofer Institute for Computer Graphics Research in Rostock, Germany, have developed an earbud that can detect a user’s facial expression and use it as a cue to execute an action, e.g. a smile pulls up Twitter. The earbud’s sensors use changes in the shape of the ear canal to determine the expression. The device can detect five different expressions: smiling, winking, head turning, mouth opening and making a ‘shh’ sound.
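
Once the expression is classified, driving an action is a simple dispatch, as sketched below. The smile-to-Twitter pairing comes from the article; the rest of the table, and the handler names, are illustrative.

    EXPRESSIONS = ("smile", "wink", "head_turn", "mouth_open", "shh")

    def open_twitter():
        print("Opening Twitter...")

    def mute_audio():
        print("Muting audio...")

    ACTIONS = {
        "smile": open_twitter,   # pairing mentioned in the article
        "shh": mute_audio,       # plausible pairing, our assumption
    }

    def on_expression(label: str) -> None:
        """Run the handler bound to a recognised expression, if any."""
        if label in EXPRESSIONS and label in ACTIONS:
            ACTIONS[label]()

    on_expression("smile")       # prints "Opening Twitter..."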

Skin control

Researchers at Saarland University, Germany, have developed a skin-conforming, tattoo-style HCI. The electronic tattoo, called SkinMarks, can be applied to the skin with water and lasts for about three days, during which time it can be used to, for example, adjust the volume of the user’s headphones or to pause or play music. The tattoo is also electroluminescent, allowing it to light up when certain apps receive notifications.
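
A volume control like the one described could, for instance, treat the tattoo as a linear slider, mapping touch position along the mark to a volume level. The dimensions and scaling below are our assumptions, not measurements from the SkinMarks prototype.

    def slider_to_volume(touch_mm, slider_length_mm=40.0):
        """Map a touch position along the tattoo (mm) to a 0-100 volume."""
        frac = min(max(touch_mm / slider_length_mm, 0.0), 1.0)
        return round(100 * frac)

    print(slider_to_volume(10.0))   # a quarter of the way along -> 25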

Neuralink having a think about neural interfaces

Neuralink is developing what it terms ‘ultra-high bandwidth brain-machine interfaces to connect humans and computers’. These interfaces will be implanted. The company is currently advertising multiple jobs on its website, including a mechatronics engineer, a biomedical engineer and a senior technician for immunohistochemistry.
