In two Intel blog articles (links provided below), the author, Bob Duffy (2015, 2016), focuses on the shift from explicit to implicit computing that is happening both in general computing and in technology companies. This brief post highlights the growing interest in implicit systems, referred to here as “perceptual technologies”. Duffy points out that for decades, technology has focused on developing algorithms and tools to track precise input and information, while humans were the ones doing the thinking and the perceiving. The shift now is towards creating machines capable not only of precise measurement but also of determining the relative state of a situation and using this information to trigger an action.
The posts then delve into the possibility of improving machine learning by allowing the computer to experience how we see the world. When interacting, humans implicitly evaluate others’ behaviour in order to understand their emotions and intentions.
For instance, we may observe the listener’s posture or eye gaze to infer how interested they are in the conversation, but how could this be implemented in machine learning?
Duffy illustrates this concept with a simple example: an interactive screen (a fish tank) containing a series of targets (fish) that react to specific actions by people. If a person approaches the screen, the fish come closer to the surface; if the individual gets too close, the fish retreat; and if people pass too quickly in front of the screen, the fish do not appear at all. Applications of this type enable the machine learning process, helping the computer understand how we see the world and ultimately making the experience more intuitive and more human.
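The fish-tank rules above can be sketched as a small decision function. This is only an illustrative sketch, not code from Duffy's posts: the sensor readings (distance, walking speed) and all thresholds are hypothetical assumptions.

```python
# Hypothetical sketch of the fish-tank interaction rules.
# Field names and thresholds are illustrative assumptions, not from the posts.
from dataclasses import dataclass


@dataclass
class Observation:
    distance_m: float  # person's distance from the screen, in metres
    speed_m_s: float   # lateral walking speed past the screen


def fish_behaviour(obs: Observation) -> str:
    """Map an implicit observation of a passer-by to a fish reaction."""
    if obs.speed_m_s > 1.5:   # walking past too quickly: fish stay hidden
        return "hidden"
    if obs.distance_m < 0.3:  # too close: the fish retreat
        return "retreat"
    if obs.distance_m < 1.5:  # approaching: fish rise to the surface
        return "surface"
    return "idle"             # no one nearby: default behaviour


# Example: a person slowly approaching to within a metre of the screen
print(fish_behaviour(Observation(distance_m=1.0, speed_m_s=0.4)))  # "surface"
```

The point of the sketch is that the system responds to the *relative* state of the person (approaching, lingering, rushing past) rather than to an explicit command.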
Duffy also pictures a future in which these implicit technologies will be embedded in many parts of our lives. We all saw glimpses of this in films such as Minority Report (2002), in which the main character enters a room and is targeted by personalised advertising; this notion is already not far removed from our present daily reality.
We can also see it in our environment: it won’t be long until our devices (e.g., cars) are able to sense and identify both our presence and our intent and then respond with a congruent action (e.g., parking). Technology is, quite literally, opening doors, and unlike in the past, developers already have the tools to build these types of systems; the technologies underpinning this area of computing are the internet of things, perceptual technologies and cloud computing.
Since 2013, Mindsee has been working towards the goal of developing a system able to understand the implicit state of a user engaged in a scientific literature task. Bob Duffy’s posts show how relevant perceptual technologies have become, and how important applications like Mindsee will be in the near future.
- Duffy, B. (2015) - https://blogs.intel.com/evangelists/2015/07/22/developers-need-think-relatively-intel-realsense
- Duffy, B. (2016) - https://blogs.intel.com/evangelists/2016/04/13/the-shift-from-explicit-to-implicit-computing/