The gentleman speaks but does not use his hands: Do you like this kind of human-computer interaction?

Author: Dr. M

The Human Machine Interface, or Human-Computer Interface (HMI), is the medium for interaction and information exchange between a system and its users. Traditional HMIs mainly rely on serial communication interfaces such as RS232 and RS422/RS485, and on data interfaces such as Ethernet ports and USB, to realize human-computer interaction with equipment. As an important piece of hardware, the HMI takes over part of the role of the traditional mouse and keyboard.
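To make the serial-interface side of a traditional HMI concrete, below is a minimal sketch of a host polling a panel over RS232/RS485 using Python's pyserial library. The port name, baud rate and command bytes are hypothetical placeholders; a real panel defines its own command protocol (Modbus RTU is a common choice).

```python
# Minimal sketch: host polls a hypothetical HMI panel over an RS232/RS485 serial link.
# Assumptions: pyserial is installed; the port name, baud rate and request bytes
# below are placeholders -- a real panel defines its own command protocol.
import serial

def read_panel_status(port="/dev/ttyUSB0", baudrate=9600):
    with serial.Serial(port, baudrate, timeout=1) as ser:
        ser.write(b"\x01\x03\x00\x00\x00\x01")  # hypothetical "read status" request
        return ser.read(16)                     # up to 16 bytes of reply

if __name__ == "__main__":
    print(read_panel_status())
```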

As technology continues to evolve, the interaction modes and forms of HMI have changed dramatically. Today's human-computer interaction may be tangible or intangible: for example, we can control a smartphone with a spoken command or a gesture, or use voice to tell the vehicle's navigation system where we want to go.

Three conjectures about HMI

In both function and concept, today's HMI has changed dramatically. From the perspective of technology and applications, we believe HMI will shape future human-computer interaction in the following three ways.

Screen changes

In consumer electronics, curved and foldable screens are already being adopted in large volumes, and advances in sensor technology have driven the adoption of under-display fingerprint sensing. In cars, the HMI is represented mainly by screens and displays: passengers can operate the audio system by pressing on-screen buttons, for example to select input devices, tune the radio, or browse navigation commands.

In the future, the screens in passenger cars will take on more functions and grow ever larger, and fully customizable, all-digital dashboard displays will be widely used. Tesla's 15-inch vertical touch screen already incorporates functions such as heating, ventilation and air conditioning (HVAC) control, and even larger screens are likely to take on more functions in the future.

Voice is expected to be the next development goal of HMI

Experts predict that by 2022, 80% of in-vehicle HMIs will integrate voice control functions, a figure that does not include the voice recognition systems used in smartphones. As speech recognition technology advances, its applications keep expanding. Today, most in-vehicle HMIs use voice commands mainly to control the audio system and answer calls.

In the future, natural language commands will also be used to perform more complex functions, from a vehicle's adaptive cruise control to non-contact control and operation of smartphones and wearable devices. It can also be said that applying the voice user interface (VUI) to electronic devices will become a major trend in human-computer interaction. AI-based voice assistants have now largely overcome VUI's old weakness of "not understanding what we mean." A voice user interface that combines voice control, AI and machine learning has application prospects that are genuinely exciting to think about.
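As a minimal illustration of the speech-to-text step that a VUI is built on (not any particular vendor's assistant), the sketch below uses the open-source Python SpeechRecognition package to capture a phrase from the microphone and transcribe it; mapping the resulting text to an actual command is left to the application.

```python
# Minimal sketch of the speech-to-text front end of a voice user interface.
# Assumptions: the SpeechRecognition and PyAudio packages are installed;
# recognize_google() sends the audio to Google's free web API for transcription.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # estimate the background noise level
    print("Say a command...")
    audio = recognizer.listen(source)

try:
    text = recognizer.recognize_google(audio)
    print("Heard:", text)  # e.g. "navigate to the nearest charging station"
except sr.UnknownValueError:
    print("Could not understand the audio")
```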

Gesture recognition has become popular in non-contact HMI

Compared with touch screens, gesture control has many advantages: for example, users can issue commands from a distance without touching the device. In addition, gesture control extends the HMI from a two-dimensional user interface into three-dimensional space. BMW, a world-renowned carmaker, has already applied gesture technology to some of its models: once a passenger's gestures are "seen" by an in-cabin camera, the corresponding in-car functions can be executed. Of course, gesture control can also be regarded as an alternative to voice control, especially in public spaces where speaking out loud is not appropriate.
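As a rough sketch of how camera-based gesture sensing works (and not a description of BMW's actual system), the example below uses OpenCV to grab webcam frames and Google's MediaPipe Hands model to detect hand landmarks; a real gesture HMI would add a classifier on top of these landmarks to map poses or motions to in-car commands.

```python
# Rough sketch of camera-based hand tracking, a building block of gesture HMI.
# Assumptions: opencv-python and mediapipe are installed; this only detects hand
# landmarks -- classifying them into gestures/commands is left to the application.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)
cap = cv2.VideoCapture(0)  # default webcam

for _ in range(100):  # process ~100 frames, then stop
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # MediaPipe reports 21 landmarks per hand; index 8 is the index fingertip.
        tip = results.multi_hand_landmarks[0].landmark[8]
        print(f"Index fingertip at x={tip.x:.2f}, y={tip.y:.2f}")

cap.release()
hands.close()
```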

VUI: The future star of HMI

To curb the spread of the virus during the COVID-19 outbreak, demand for non-contact HMI in workplaces, retail stores, hospitals and other environments rose sharply, and as the global economy reopens this trend is likely to accelerate further. For this reason, we boldly predict that over the next 10 years the development of non-contact HMI, and of VUI in particular, will enter the fast lane. The growth of this market also creates business opportunities for semiconductor companies, OEMs/ODMs, proximity sensor suppliers and software companies.

Among the non-contact HMI technologies, why is VUI so favored by the industry? In fact, as early as 2014, Microsoft CEO Satya Nadella predicted: “Human voice is the new interface.” According to a consumer survey conducted by PricewaterhouseCoopers (PwC) in 2018, 90% of respondents are familiar with voice assistants, and 72% of respondents have first-hand experience with this technology.

Figure 1: The application of voice assistants in various electronic devices (Source: PwC 2018 Consumer Survey Questionnaire)

The reason VUI is so widely recognized by consumers is that, on the one hand, it frees our hands and makes communication more convenient; on the other hand, it effectively enhances the customer experience. After all, we speak much faster than we type. With spoken commands controlling the system, communication between humans and machines becomes more natural and effective. As voice recognition technology continues to mature, VUI will benefit billions of users over the next few years; experts predict that within 5 years almost every application will integrate VUI technology in some way.

AI is making machines more and more intelligent, so a VUI combined with AI and machine learning will greatly improve the personalized experience of voice interfaces. According to the research firm Tractica, AI-based voice assistants are playing an increasingly important role in HMI, and their global market value is expected to reach 4.6 billion US dollars by 2025. Over the next few years, 80% of car HMIs will integrate voice recognition systems, not counting smartphone assistants such as Google's voice assistant and Apple's Siri. In cars, voice commands are most commonly used to control the media player and to set destinations for the navigation system; as machine learning algorithms develop, VUI will also be introduced into advanced driver assistance system (ADAS) functions. In smartphones, almost all high-end models can already be operated by voice.

In smart homes, voice control systems take home automation to a new level. Smart home hubs such as Amazon Echo, Google Nest and Samsung SmartThings let users manage connected devices with simple voice commands. At the same time, VUI is gradually making its way into our workplaces and pushing them toward digitization. Gartner predicts that by 2023, 25% of employee interactions with applications will take place via voice.
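To show what "simple voice commands" amount to in software, here is a deliberately naive, hypothetical intent-mapping sketch: the text produced by a speech recognizer is matched against keywords and dispatched to a device action. Real hubs such as Amazon Echo or Google Nest use far more sophisticated natural-language understanding; the device names and handler functions below are invented for illustration.

```python
# Hypothetical sketch: map a transcribed voice command to a smart-home action.
# Device names and handlers are invented; real hubs use full NLU pipelines.
def turn_on_lights():
    print("Living room lights: ON")

def set_thermostat(degrees: int):
    print(f"Thermostat set to {degrees} degrees")

def handle_command(text: str) -> None:
    text = text.lower()
    if "light" in text and "on" in text:
        turn_on_lights()
    elif "thermostat" in text:
        digits = [int(tok) for tok in text.split() if tok.isdigit()]  # naive number pick-up
        if digits:
            set_thermostat(digits[0])
    else:
        print("Sorry, I did not understand that command.")

handle_command("Turn on the living room light")
handle_command("Set the thermostat to 22")
```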

Digging deep into the key technologies

Overall, non-contact HMI mainly involves nine key technologies: camera-based gesture recognition and authentication, ultrasound- or radar-based gesture sensing, eye tracking, voice commands, photodiode-based gesture and position sensors, proximity-sensing touch screens, motion sensor fusion, short-range radio, and other non-contact technologies. According to the latest "Contactless HMI 2020" report from the research firm Touch Display Research, contactless HMI technology has already attracted more than 390 companies, which are focusing on areas such as contactless sensors, software and system integration. Among them, companies offering voice commands and camera-based gesture recognition are the most numerous.

For HMI applications, many technology suppliers have taken active steps. TI, for example, focuses on building a complete ecosystem: its HMI product portfolio spans a wide range of I/O, graphics processing, voice recognition and more, offers various development options, and covers almost all the components, software and support required for this kind of interface. TI's solutions also implement Power over Ethernet to further reduce wiring complexity, and even support wireless connections.

For example, TIDEP-01013 is a gesture-controlled HMI reference design based on a millimeter-wave sensor and a Sitara processor, while the TIDEP0066 speech recognition reference design uses the TI Embedded Speech Recognition (TIesr) library to showcase the speech recognition capabilities of the C5535 and C5545 DSP devices.

Figure 2: Block diagram of the speech recognition reference design (Source: TI)

Due to interference from background noise, speech recognition in practice is usually far from perfect. High-quality MEMS microphones and advanced audio processing are the key factors that make voice-controlled equipment truly suitable for everyday environments. Infineon's VUI market strategy is to provide the industry with a series of innovative reference platforms and ready-to-use next-generation VUI solutions through a gradually established partner ecosystem; currently available solutions come mainly from companies such as Aaware, CEVA, Creoir, SoundAI, Sugr and XMOS. In early 2017, a voice control solution jointly announced by Infineon and XMOS used a smart microphone that enables a voice assistant to accurately identify and locate human voices amid other noise: Infineon's XENSIV radar and silicon microphone sensor combination identifies the speaker's position and distance from the microphone, while XMOS's far-field voice processing technology captures the voice.
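As a highly simplified illustration of why audio pre-processing matters for everyday voice control (and not a description of Infineon's or XMOS's actual algorithms), the sketch below applies an energy-based noise gate to framed audio samples, keeping only frames whose RMS level rises well above an estimated noise floor. Production far-field front ends add beamforming, echo cancellation and machine-learned noise suppression on top of such basics.

```python
# Highly simplified energy-based noise gate for 16 kHz mono audio in a NumPy array.
# This only illustrates the basic idea; real far-field voice front ends also use
# beamforming, echo cancellation and learned noise suppression.
import numpy as np

def noise_gate(samples: np.ndarray, frame_len: int = 320, threshold_ratio: float = 3.0):
    """Zero out frames whose RMS energy is below threshold_ratio * estimated noise floor."""
    n_frames = len(samples) // frame_len
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len).copy()
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    noise_floor = np.percentile(rms, 10)  # assume the quietest 10% of frames is background
    keep = rms > threshold_ratio * noise_floor
    frames[~keep] = 0.0
    return frames.reshape(-1), keep

# Synthetic example: one second of low-level noise with a louder "speech" burst in the middle.
rng = np.random.default_rng(0)
audio = rng.normal(0.0, 0.01, 16000)
audio[6000:9000] += rng.normal(0.0, 0.2, 3000)
gated, voiced = noise_gate(audio)
print(f"{int(voiced.sum())} of {voiced.size} frames kept as likely speech")
```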

Figure 3: Part of the VUI reference designs launched by Infineon together with its partners (Source: compiled from information on the Infineon website)

Concluding remarks

Speech recognition technology began to sprout as far back as the early 1950s, but the systems of that era could only understand digits. By 2017, the technology had made considerable progress, almost reaching the point of accurately understanding human language and thus laying the foundation for the commercial use of VUI.

Human-computer interaction is a prerequisite for automation and intelligence. After decades of development, button- and key-based HMIs are fading from our lives, and the touch screens that replaced them will in turn gradually give way to a new generation of VUIs in many applications. Technology and innovation never stand still: although today's VUI is not perfect, its advantages are already widely recognized by consumers. After all, language is the most effective way for humans to communicate, and future machines should be no exception.
