Reading your mind, from your EEG!

New research from MindSee partners [Aalto University, the Helsinki Institute for Information Technology (HIIT) and the Centre of Excellence in Computational Inference (COIN)] reports that, for the first time, information retrieval is possible with the help of EEG signals interpreted with machine learning. In other words, MindSee is reading minds!

http://cs.aalto.fi/en/current/news/2016-12-08-002/
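At a very high level, such a system trains a classifier to predict a term's relevance from EEG features recorded while the user reads. The minimal sketch below is illustrative only, not the study's actual pipeline: it uses synthetic "band-power" features, made-up labels, and a simple nearest-class-mean classifier in place of the real EEG data and methods used by the Aalto/HIIT team.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "EEG band-power" features for 200 reading epochs:
# epochs for relevant terms (label 1) get a shifted feature distribution.
# Both the features and the shift of 0.8 are hypothetical illustration values.
n, n_features = 200, 8
labels = rng.integers(0, 2, size=n)
features = rng.normal(size=(n, n_features)) + 0.8 * labels[:, None]

# Train a nearest-class-mean classifier on the first 150 epochs.
train_x, train_y = features[:150], labels[:150]
means = np.stack([train_x[train_y == c].mean(axis=0) for c in (0, 1)])

# Predict relevance for held-out epochs by distance to each class mean.
test_x, test_y = features[150:], labels[150:]
dists = np.linalg.norm(test_x[:, None, :] - means[None, :, :], axis=2)
predictions = dists.argmin(axis=1)
accuracy = (predictions == test_y).mean()
print(f"held-out relevance accuracy: {accuracy:.2f}")
```

The point of the sketch is only the shape of the problem: per-epoch EEG features in, a per-term relevance prediction out, which the retrieval system can then use to refine the search.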

 

New Frontiers in Neuroscience article from MINDSEE partner TU Berlin

We're happy to share news of a new Frontiers in Neuroscience article from MindSee project partner Technical University of Berlin (the link below opens in a new window):

http://journal.frontiersin.org/article/10.3389/fnins.2016.00530/full

Prof Benjamin Blankertz, PI on MINDSEE at TUB, said "We are pleased to announce a
new review paper about non-medical applications of BCI technology. Section 6 of the paper discusses MindSee HCI applications."

The Rise of Perceptual Technologies

In his two Intel blog articles (links provided below), the author, Bob Duffy (2015, 2016), focuses on the shift from explicit to implicit computing that is happening both in general computing and in technology companies. The posts emphasise the growing interest in implicit systems, referred to here as “perceptual technologies”. Duffy points out that for decades, technology has worked on developing algorithms and tools to track precise input and information while humans were the ones doing the thinking and the perceiving. The shift is now towards creating machines capable not only of precise measurement but also of determining the relative state of a situation and using this information to trigger an action.

The posts then delve into the possibility of improving machine learning by allowing the computer to experience how we see the world. When interacting, humans implicitly evaluate others’ behaviour to understand their emotions and intentions.
For instance, we may observe a listener’s posture or eye gaze to infer how interested they are in the conversation; but how could this be implemented in machine learning?
Duffy provides a simple example to explain this concept: an interactive screen (a fish tank) with a series of targets (fish) that react to specific people’s actions. If a person moves closer to the screen, the fish come closer to the surface; if the individual gets too close, the fish retreat; and if people pass too quickly in front of the screen, the fish won’t appear at all. Applications of this kind feed the machine learning process and help the computer understand how we see the world, ultimately making the experience more intuitive and more human.
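The reactive behaviour Duffy describes boils down to simple threshold logic on the viewer's distance and speed. A minimal sketch, with hypothetical threshold values chosen purely for illustration (Duffy's posts give no specific numbers):

```python
def fish_behaviour(distance_m: float, speed_m_s: float) -> str:
    """Decide how the on-screen fish react to a passer-by.

    All thresholds are hypothetical illustration values, not from
    Duffy's posts.
    """
    TOO_FAST = 1.5   # m/s: people rushing past don't trigger the fish
    TOO_CLOSE = 0.3  # m: fish retreat if the viewer crowds the screen
    NEAR = 1.0       # m: fish approach an interested, lingering viewer

    if speed_m_s > TOO_FAST:
        return "hidden"
    if distance_m < TOO_CLOSE:
        return "retreat"
    if distance_m < NEAR:
        return "approach"
    return "idle"
```

The interesting part is not the thresholds themselves but that the system infers a relative state (interested, crowding, just passing by) from implicit behaviour rather than from any explicit command.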


Duffy also pictures a future in which these implicit technologies are embedded in many parts of our lives. We all saw glimpses of this in films such as Minority Report (2002), in which the main character enters a room and is targeted by personalised advertising; this notion is already not too far removed from our present daily reality.
We can also see it in our environment: it won’t be long until our devices (e.g., cars) are able to sense and identify both our presence and our intent, and then respond with a congruent action (e.g., parking). Technology is, quite literally, opening doors, and unlike in the past, developers already have the tools to build these types of systems; the technologies at the base of this area of computing are the Internet of Things, perceptual technologies and cloud computing.

                                                                   Intel® RealSense™ Technology


Since 2013, MindSee has been working towards the goal of developing a system able to understand the implicit state of a user engaged in a scientific literature search task. Bob Duffy’s posts show how relevant perceptual technologies have become, and how important applications like MindSee will be in the near future.

Sources:

  • Duffy, B (2015) - https://blogs.intel.com/evangelists/2015/07/22/developers-need-think-relatively-intel-realsense
  • Duffy, B (2016) - https://blogs.intel.com/evangelists/2016/04/13/the-shift-from-explicit-to-implicit-computing/ 

 

Human Interactive Conference

The Human Interactive Conference was held today, 6th of November 2014, at Goldsmiths, University of London, with the aim of exploring the current and future interfaces between human beings and the rapidly evolving landscape of novel technologies.

Prof. Jonathan Freeman was one of the speakers, introducing the work that he and his team at i2 media research have been conducting, applying both conscious and subconscious objective measures of human behaviour. His presentation highlighted ongoing research and development projects including the CEEDs project (ceeds-project.eu) as well as MindSee.

Find out more at  http://humaninteractive.org.uk/

Brain-computer interface: everyday technology?

Major tech companies (Intel with RealSense, Philips and Emotiv) are experimenting more and more with brain-computer interface technology.

The latest to enter the spotlight is Dell, which has recently announced that it is working on a product to detect a person’s mood, for use in education, communications, cars and video games. Jai Menon, head of Dell Research, gave a few examples of the benefits of such systems: “If I can sense the user is working hard on a task, an intuitive computer system might then reduce distractions, such as allowing incoming phone calls to go directly to voicemail and not letting the user be disturbed.”

Dell researchers have also used brain activity headsets from companies like Neurosky to identify users' moods. This worked in only about 50% of cases, so accuracy is still quite low, but combinations of multiple sensors (such as EEG, ECG and pulse oximeters) are currently being tested in the hope of increasing it. Despite such limitations, Dell plans to release an off-the-shelf system in 2017.
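One common way to combine evidence from several sensors is a simple weighted vote over each sensor's per-mood estimate. The sketch below is an assumption-laden illustration of that idea; the mood labels, confidences and fusion rule are hypothetical, and Dell's actual method has not been published.

```python
from collections import defaultdict

def fuse_mood(estimates):
    """Combine per-sensor mood estimates by weighted voting.

    estimates: list of (mood_label, confidence) pairs, one per sensor
    (e.g. EEG headset, ECG, pulse oximeter). Labels and confidences
    are hypothetical illustration values.
    """
    scores = defaultdict(float)
    for mood, confidence in estimates:
        scores[mood] += confidence
    # Return the mood with the highest total confidence across sensors.
    return max(scores, key=scores.get)

# Example: two sensors lean "stressed", one leans "calm".
fused = fuse_mood([("stressed", 0.6), ("calm", 0.4), ("stressed", 0.3)])
```

Even this crude fusion shows why adding sensors can help: a single noisy EEG estimate can be outvoted by agreeing physiological signals.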

Sources: http://neurogadget.com/2014/09/14/dell-plans-release-mood-reading-product-2017/10543, http://www.bbc.co.uk/news/technology-28642935

RE.WORK 2014 Technology Summit – LONDON

On September 18-19, 2014, the MindSee and CEEDs projects were presented at the RE.WORK Technology Summit by Professor Jonathan Freeman. The topics covered at the conference were varied but mainly focused on wearable technologies (Misfits, Studio XO, Institute for Scientific Interchange Foundation), environmental technologies (Tado, University of Bath), robotics (University of Bristol, Minibuilders, University of Oxford, University of Hertfordshire, Sheffield Centre of Robotics), medicine (SENS, Sosafresh, QuantumMDx), technologies for the developing world (Buffalo Grid, Imperial College, Konto46) and new-generation technologies (Goldsmiths College, Ultrahaptics, i2 media research, Swiftkey, Create-net, Concirrus, IBM, Bleepbleeps, CHS, Microsoft).

Copyright © Dan Taylor


Professor Jonathan Freeman was present at the conference to promote MindSee and CEEDs, and during his talk he explained the aims, expectations, stages of development, possible applications, and the innovation achievable by both projects.

Copyright © Dan Taylor


The MindSee and CEEDs projects were also showcased with demos and posters.

Both projects were explained in detail by researchers from UPF/SPECS and Goldsmiths/i2 media research, and received great interest and enthusiasm from the numerous attendees.

Copyright © Dan Taylor



Information-seeking, curiosity, and attention: computational and neural mechanisms

An interesting review paper recently published in Trends in Cognitive Sciences looks into the area of information-seeking from three traditionally separate fields:

  • machine learning,
  • eye movements in natural behavior,
  • and studies of curiosity in psychology and neuroscience. 

Although they use different terminology and methods, these three lines of research in fact grapple with strikingly similar questions and propose overlapping mechanisms. Understanding and recognising recent developments in information seeking from these points of view is of great interest to MindSee, and closely related to the solution the project proposes for advancing the field.

Read More

BBCI Winter School on Neurotechnology 2014

The BBCI Winter School on Neurotechnology 2014 will be held on February 24-28 in Berlin, Germany.

Brain-Computer Interfaces (BCIs) and alternative applications of neurotechnology based on sophisticated data analysis methods have become an active and flourishing field of research. Its attractiveness, and also its complexity, stems from the fact that this area requires the concerted expertise and effort of a number of different fields, including neurophysiology, machine learning, electrical engineering and psychology.

Read More