Last week, I had a whirlwind trip to Grenoble, France: 3 days, 2 summits, and 1 supercool paragliding festival, Coupe Icare, on the side – all filled with “Aha!” moments. Fittingly, the paragliding event I attended on Saturday drove home the point of the most standout talk of the week: Peter Hartwell’s (TDK InvenSense) impassioned presentation, “Archiving and Sharing Your Experiences,” during which he illustrated, sense by sense, how MEMS and sensors technology allows us to capture and save memories and experiences and share them with the world.

I attended Coupe Icare by myself; my interaction with the hordes of attendees was limited by my feeble grasp of French. Being able to share the experience with friends and family back home, thanks to my smartphone camera and social media apps like Snapchat, Instagram, and Facebook, made it even more memorable.

The Gala Dinner was one of the week’s highlights.

The Highlights
Combined, the European MEMS & Sensors and Imaging & Sensors Summits, September 20-22, 2017, drew more than 400 attendees to the Minatec Campus, who gathered to share the latest advancements in MEMS and sensor technology, learn what’s driving these markets, and, most importantly, see how these technologies can impact our lives through real-world applications.

Frederic Breussin, Yole Développement, succinctly wrapped up the three-day event as follows: keynote speakers Carlo Bozotti, STMicroelectronics; Chae Lee, LG Electronics; Hubert Carl Lakner, Fraunhofer IPMS; and Marie-Noëlle Semeria, CEA-Leti, provided a vision of the future for both industry and R&D. We saw a combination of MEMS and sensors for multiple applications. It turns out that we don’t need trillions of sensors, as previously thought; we just need them to be smart. Advancements in artificial intelligence (AI) will help us make the most of all our sensors so that they provide more than just parameter measurements.

Quality data is taking priority over quantity of data, which puts the spotlight on opportunities for software. Breussin also noted that MEMS and imaging technologies are not so different, as they share the same key drivers – automotive and AR/VR – all aimed at improving the human-to-machine interface.

We are entering the third technology wave enabled by MEMS and sensors. (Courtesy of Eric Mounier, Yole Développement)

Providing market insight, Yole’s Eric Mounier said the total solid-state sensor and actuator market is expected to grow from $38B in 2017 to $66B by 2021. While the MEMS market has historically been driven by automotive applications, Mounier noted that since 2003, there has been a transition to consumer applications, particularly smartphones and tablets, which continue to be dynamic, with volumes driven by China.

In 2004, the iPhone Edge contained 5 sensors. By 2014, the number had increased to 12 in the Samsung Galaxy S5 and iPhone 5. Mounier predicts that by 2021, the number of sensors will have climbed to 19-20. Consumer applications still own 70% of the sensor market, with automotive coming in second. With the coming of autonomous vehicles, cars could become the next consumer application for MEMS and sensors, bringing it full circle. According to Mounier, a Level 3 autonomous car will carry $2,500 worth of embedded sensors.

Sense by Sense
As the themes of this year’s summits were MEMS for All Senses and (Image) Sensing the World, respectively, I thought I’d recap the week sense by sense. Currently, sight-based sensing dominates sensing technologies. Next comes hearing, followed by touch/feel, and then smell. Taste, the most complicated of all, is still a sense to be conquered in the MEMS and sensors world. According to Hartwell, there are technologies in development to stimulate the taste buds into tasting certain things, but not yet sensors that can identify tastes themselves.

Sight
Mounier cited a 1994 psychological study by Hatwell (Hatwell, Y., 1994, Traité de psychologie expérimentale, Paris: P.U.F.), which showed that 83% of our perception of the external world comes through vision, followed by hearing, which represents 11% of our perception. It’s no wonder, then, that image sensing has carved out a market of its own from the broader sensor market.

As Fabien Dumon, Airbus, noted in his presentation on VR/AR, the most critical sensors help us understand our environment. For example, highly accurate camera sensors that recognize the room, as well as large and small objects, help recreate the scenes in AR applications.

From the image sensor track, we learned that in visible-light image sensing, the pixel race is over and the focus is turning to performance, with developments centered on advanced functionalities like high dynamic range (HDR) and 3D imaging, the latter required for facial recognition applications. Indeed, the need for improved performance is taking us beyond visible light to multispectral and near-infrared technologies.
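To make the HDR idea concrete, here’s a minimal sketch – my own illustration in Python, not anything presented at the summits – of how several bracketed exposures of the same scene can be fused into one radiance estimate, trusting each pixel most in the frame where it is neither clipped nor buried in the shadows. The fuse_exposures function and the synthetic scene are purely hypothetical.

```python
# A toy exposure-fusion sketch: each pixel is a weighted average over the
# bracketed frames, weighted by how well-exposed it is in each frame.
import numpy as np

def fuse_exposures(frames, exposure_times):
    """frames: sensor images scaled to [0, 1]; exposure_times: relative exposure of each frame."""
    num = np.zeros_like(frames[0])
    den = np.zeros_like(frames[0])
    for img, t in zip(frames, exposure_times):
        weight = 1.0 - np.abs(2.0 * img - 1.0)   # trust mid-tones, ignore clipped or black pixels
        num += weight * (img / t)                # each frame votes for a common radiance estimate
        den += weight
    return num / np.maximum(den, 1e-6)

# Toy usage: three exposures of a scene whose brightness exceeds any single frame's range.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 4.0, size=(4, 4))              # "true" radiance
times = [0.125, 0.5, 2.0]
frames = [np.clip(scene * t, 0.0, 1.0) for t in times]  # longer exposures clip the bright areas
hdr = fuse_exposures(frames, times)
print(np.max(np.abs(hdr - scene)))                      # expected: ~0, scene recovered
```

Real image-sensor pipelines do this on raw data with calibrated response curves, but the principle is the same: combine exposures so that no single frame’s dynamic range limits the result.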

Figure 1: Amazon has become a heavy consumer of MEMS microphones. (Courtesy of Jeremie Bouchaud, IHS)

Sound
Mounier, Hartwell, Jeremie Bouchaud of IHS, and Peter Cooney of SAR Insight and Consulting all talked about the MEMS microphone success story: how improvements in technology – in capacitive MEMS, for example – are driving down microphone form factors and improving performance, and how machine hearing, voice interface, and voice biometrics applications are driving further technology advancements.

Machine hearing is particularly critical to AI, and as such, companies like Amazon, Apple, Google, Microsoft, and Nuance have been snapping up speech recognition technologies. Cooney said that voice is rapidly becoming a key user interface, alongside touch. Beyond just Siri and automotive hands-free applications, voice-activated personal assistants like the Amazon Echo and Google Home for smart home control are all the rage. However, there are security concerns surrounding cloud-connected devices that are always listening. Cooney also touted voice biometrics as an authentication tool that he says will make systems more secure due to the uniqueness of the human voice. However, there is still a lot to accomplish before the technology is accurate enough for that. Personally, I have a hard time believing that voice authentication will ever be more secure than facial recognition, as the technology will have to be sophisticated enough to distinguish a recorded – or even simulated – voice from a live one, especially as voice-synthesis software becomes more and more advanced.
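For readers curious what voice biometrics looks like under the hood, here’s a deliberately simplified sketch – my own illustration, not any vendor’s algorithm – of the basic enroll-and-match idea: store a fixed-length “voiceprint” at enrollment, then accept an attempt only if its voiceprint is close enough to the enrolled one. The voiceprint and authenticate functions below are hypothetical. Note that this kind of matching is exactly why liveness matters: a good recording of the enrolled speaker would match just as well as the speaker.

```python
# A toy speaker-verification sketch: a crude spectral embedding compared by cosine similarity.
import numpy as np

def voiceprint(samples, bands=32):
    """Crude fixed-length embedding: log of the mean magnitude spectrum in coarse frequency bands."""
    spectrum = np.abs(np.fft.rfft(samples))
    edges = np.linspace(0, spectrum.size, bands + 1, dtype=int)
    return np.array([np.log1p(spectrum[a:b].mean()) for a, b in zip(edges[:-1], edges[1:])])

def authenticate(enrolled_audio, attempt_audio, threshold=0.9):
    """Accept the attempt only if its voiceprint is similar (cosine) to the enrolled one."""
    a, b = voiceprint(enrolled_audio), voiceprint(attempt_audio)
    similarity = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    return similarity >= threshold

# Toy usage with synthetic 1-second "voices" sampled at 16 kHz.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 16000, endpoint=False)
speaker_a = np.sin(2 * np.pi * 140 * t) + 0.3 * np.sin(2 * np.pi * 420 * t)
speaker_b = np.sin(2 * np.pi * 560 * t) + 0.3 * np.sin(2 * np.pi * 1760 * t)
enrolled = speaker_a + 0.01 * rng.standard_normal(t.size)
genuine  = speaker_a + 0.01 * rng.standard_normal(t.size)   # same "voice", new recording
impostor = speaker_b + 0.01 * rng.standard_normal(t.size)   # different "voice"
print(authenticate(enrolled, genuine))    # expected: True
print(authenticate(enrolled, impostor))   # expected: False
```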

Touch/Feel
From thin-film piezoelectric transducers that let users feel a user interface, such as those presented by Mathieu Rupin, Hap2U, during the technology showcase, to improved inertial motion sensors for gesture control in AR/VR applications, we heard a number of talks about how sensors tied to touch, feeling, and movement are improving the human-machine interactive experience.

Smell
According to Hartwell, sensors that “smell” offer big opportunities. He called smell the most important passive safety sense: “We can smell smoke long before there is fire,” he noted. Near-term opportunities for gas sensing include VOC sensors that measure outdoor air quality, indoor air quality for safety applications, and breath analysis for medical diagnostics, as well as food safety.

Robots navigate the world using the same sensors as humans. (Courtesy of Peter Hartwell, TDK InvenSense)

Making a Dog
The driver for most (if not all) of these sensors, according to Hartwell, is the explosion in robotics. The ultimate application is an autonomous companion that follows you around, answers questions, changes the music and the temperature, reminds you of appointments, roams the house when you’re away, finds its charger and plugs itself in, monitors your health, wellness, and environment, protects you, and finds lost objects. “In short,” said Hartwell, “we are making a dog.”

Parting Thoughts
Based solely on the enthusiasm of those participating in the MEMS/Imaging and Sensors Summits, it’s easy to envision a future built on MEMS and sensors, limited only by our imagination. However, it’s also important to factor in the unintended consequences of these technologies – but as this event recap is already longer than I intended, I think that topic is best saved for another day. ~ F.v.T.

Francoise von Trapp
