Learning The Mind, Excelling in Battle
The UX and Human Factors Unit at Elbit Systems' GO focuses on understanding the human mind and behavior in demanding operational contexts, going beyond surveys and observations. Here is how the team uses data collected from users to improve performance on the battlefield, exploring aspects of behavior that words can't capture.
As we navigate the ever-advancing technological era, the simplicity and efficiency of user interfaces have become a standard expectation. Activities that once required extensive steps are now accomplished with a simple button press, reflecting our ability to personalize even the most fundamental features to align with our individual needs.
Technology, while generally user-friendly, can demand significant mental effort, impacting focus and emotions. This is particularly true for tasks requiring constant attention, like operating a UAV (Unmanned Aerial Vehicle). Operators, despite training, can experience ‘looked but failed to see’ incidents where they miss vital information, even though it’s visible. This often results from cognitive and emotional factors like fatigue from long hours, stress from complex information, or the strain of vigilance. These challenges can cause well-trained professionals to overlook critical details.
Current methods like surveys or verbal reports can’t fully capture these issues. That’s where neurophysiological monitoring comes in: tracking brain activity and other bodily signals to understand the operator’s mental state. It is especially important in high-stress scenarios because it gives a clearer picture of cognitive load, the mental effort a task requires. By applying this monitoring, designers can build technology that helps operators stay alert and make good decisions, going beyond the insights that traditional methods provide.
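For readers curious what such a measure looks like in practice, below is a minimal sketch of one widely used cognitive-load proxy: task-evoked pupil dilation, since pupils widen as mental effort rises. The data shapes, baseline procedure, and threshold are illustrative assumptions, not a description of Elbit's actual pipeline.

```python
# A minimal sketch of a common cognitive-load proxy: baseline-corrected
# pupil diameter. All values and thresholds are illustrative assumptions.
import numpy as np

def load_index(pupil_mm: np.ndarray, baseline_mm: np.ndarray) -> np.ndarray:
    """Z-score each task sample against a resting baseline; larger positive
    values suggest greater mental effort (pupils dilate under load)."""
    return (pupil_mm - baseline_mm.mean()) / baseline_mm.std()

def flag_overload(index: np.ndarray, threshold: float = 2.0) -> np.ndarray:
    """Boolean mask of samples where the load proxy exceeds the threshold."""
    return index > threshold

# Synthetic example: 60 s of rest and 60 s of a demanding task at 50 Hz.
rng = np.random.default_rng(0)
baseline = rng.normal(3.5, 0.2, 3000)   # resting pupil diameter, mm
task = rng.normal(4.1, 0.3, 3000)       # dilated pupils during a hard task
overloaded = flag_overload(load_index(task, baseline))
print(f"{overloaded.mean():.0%} of task samples exceed the load threshold")
```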
Addressing these challenges is the primary focus of Elbit Systems’ UX (User Experience) department, where scientists and designers are dedicated to making significant advancements to enhance these crucial aspects of technology use.
Here’s how they accomplish this.
Eyes Tell All
“In our laboratories, we employ a range of tools to analyze users’ neuro-physiological signals. This allows us to understand their behavioral, cognitive, and emotional responses during interactions with specific interfaces or products,” says Dima, Ph.D., an expert in human-computer interaction research.
As the head of UX research at Elbit Systems, he is committed to building a laboratory infrastructure that facilitates the testing of innovative ideas and hypotheses about human-machine interaction. This advanced setup is crafted to evaluate whether interfaces or systems place too great a cognitive demand on their operators, thereby ensuring a solid applied basis for their development.
As they conduct various experiments, the UX team is able to identify specific interaction requirements for each system or interface being tested. They carefully analyze and adjust the features of each solution, suggesting improvements and changes aimed at making the product easier to use and less mentally demanding for users.
“Remarkably, by tracking participants’ eye movements during their engagement with the system, we can extract a wealth of insights,” Dima elaborates. “This allows us to evaluate the efficiency of the current system interface, and to determine if all its features are being recognized and utilized effectively.”
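To illustrate the kind of analysis eye tracking enables, the sketch below totals gaze dwell time inside each on-screen area of interest (AOI); a feature that attracts almost no dwell time is a candidate for being overlooked. The AOI layout, coordinate system, and fixation format are assumptions made for the example, not a real Elbit interface.

```python
# An illustrative dwell-time analysis over areas of interest (AOIs).
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float            # gaze position on screen, pixels
    y: float
    duration_ms: float  # how long the gaze rested here

# Hypothetical rectangular AOIs: name -> (x_min, y_min, x_max, y_max)
AOIS = {
    "map": (0, 0, 1200, 800),
    "alerts_panel": (1200, 0, 1600, 400),
    "telemetry": (1200, 400, 1600, 800),
}

def dwell_times(fixations: list[Fixation]) -> dict[str, float]:
    """Sum fixation durations that fall inside each AOI."""
    totals = dict.fromkeys(AOIS, 0.0)
    for f in fixations:
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= f.x < x1 and y0 <= f.y < y1:
                totals[name] += f.duration_ms
    return totals

gaze = [Fixation(600, 400, 230), Fixation(1300, 200, 180), Fixation(650, 420, 300)]
for name, ms in dwell_times(gaze).items():
    # Near-zero dwell time flags a feature the user may never have noticed.
    print(f"{name}: {ms:.0f} ms")
```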
Monitoring participants’ heart rate, blood pressure, and skin conductance during their interaction with different systems offers substantial insights. “The body’s reactions are authentic,” Dima emphasizes. “Even though individuals might sometimes alter their verbal feedback or unintentionally misrepresent their experiences, these physiological responses add crucial context to their feedback.”
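To give a flavor of how such bodily signals are quantified, here is a small sketch of one standard heart-based stress marker, RMSSD: a heart-rate-variability statistic that typically drops under acute stress. The inter-beat intervals below are synthetic, and the analysis is a textbook formula rather than Elbit's method.

```python
# RMSSD: root mean square of successive differences between inter-beat
# intervals, a classic heart-rate-variability marker. Lower values often
# accompany acute stress. The data below is synthetic for illustration.
import numpy as np

def rmssd(ibi_ms: np.ndarray) -> float:
    """Compute RMSSD from a sequence of inter-beat intervals in ms."""
    return float(np.sqrt(np.mean(np.diff(ibi_ms) ** 2)))

relaxed = np.array([850, 870, 820, 880, 845, 860, 830])   # varied rhythm
stressed = np.array([650, 655, 648, 652, 649, 651, 650])  # fast, rigid rhythm
print(f"relaxed RMSSD:  {rmssd(relaxed):.1f} ms")   # higher variability
print(f"stressed RMSSD: {rmssd(stressed):.1f} ms")  # suppressed variability
```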
We Can “Read” Your Mind
Undoubtedly, the most fascinating element of exploring user experience and human factors lies in the study of the brain. Dima, along with his colleagues at Elbit Systems’ UX lab, highlights the tremendous potential for gaining deeper insights through analyzing the raw data obtained directly from participants’ brains.
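One concrete example of what can be extracted from raw brain recordings is a classic EEG workload marker: the ratio of frontal theta-band (4–8 Hz) power to parietal alpha-band (8–12 Hz) power, which tends to rise with mental workload. The electrode sites, sampling rate, and use of Welch's method in this sketch are assumptions for illustration, not the lab's actual setup.

```python
# An illustrative EEG workload index: frontal theta power / parietal alpha
# power. Signals here are synthetic; a real pipeline would add filtering
# and artifact rejection first.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate, Hz

def band_power(signal: np.ndarray, low: float, high: float) -> float:
    """Integrate Welch power spectral density over a frequency band."""
    freqs, psd = welch(signal, fs=FS, nperseg=FS * 2)
    mask = (freqs >= low) & (freqs < high)
    return float(np.trapz(psd[mask], freqs[mask]))

def workload_index(frontal: np.ndarray, parietal: np.ndarray) -> float:
    """Theta (4-8 Hz) at a frontal site over alpha (8-12 Hz) at a parietal site."""
    return band_power(frontal, 4, 8) / band_power(parietal, 8, 12)

# Ten seconds of synthetic data from two assumed electrode sites (Fz, Pz).
t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(1)
frontal = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)
parietal = 0.6 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(f"workload index: {workload_index(frontal, parietal):.2f}")
```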
“Our ultimate aim is to discern the factors that foster a user’s trust in the system they utilize,” Dima explains. “Thorough analysis of data collected from a wide range of participants and system interactions can lead to unparalleled conclusions. These insights can then be integrated into other technologies and solutions.”
For instance, effective human-machine interaction often requires the machine to clearly communicate its processes to the user. When a computer displays an error message, it can cause frustration, prompting users to seek a quick solution. However, in highly complex systems operating in high-risk settings, encountering certain errors or machine-generated insights might be inevitable.
“Much like the need for two-way communication in human interactions, engaging with a machine also demands reciprocal effort,” Dima stresses. “From our standpoint, our goal is to make the machine’s operations as clear as possible, while still providing all the essential information a user requires. Achieving this balance requires a profound understanding of the shared task they need to accomplish together.” Dima adds that the team plans to examine these aspects closely to extract interaction requirements tailored for manned-unmanned teaming missions. This focused analysis will ensure that both human operators and their unmanned counterparts can function effectively and cohesively in complex scenarios, optimizing the synergy and performance of the integrated team.
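As a thought experiment in what such machine-to-operator transparency could look like, the sketch below structures a status report around what the machine is doing, how confident it is, and what it needs from the operator. The field names and scenario are purely hypothetical, not an Elbit interface specification.

```python
# A hypothetical structure for transparent machine-to-operator messages:
# instead of a bare error code, report the task, confidence, and rationale.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MachineStatus:
    task: str                 # what the machine is currently doing
    confidence: float         # self-assessed confidence, 0.0-1.0
    rationale: str            # why the machine reached this state
    operator_action: Optional[str] = None  # what, if anything, it needs

def render(status: MachineStatus) -> str:
    """Format a status for the operator, surfacing confidence explicitly."""
    lines = [
        f"Task: {status.task} (confidence {status.confidence:.0%})",
        f"Why:  {status.rationale}",
    ]
    if status.operator_action:
        lines.append(f"Needs you to: {status.operator_action}")
    return "\n".join(lines)

print(render(MachineStatus(
    task="tracking ground target",
    confidence=0.62,
    rationale="target partially occluded by cloud cover",
    operator_action="confirm target identity before track resumes",
)))
```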
Is AI Changing Human-Machine Interactions?
As we navigate the rapid advancements in AI-based products, we are witnessing a transformation not only in human comprehension skills but also in our conventional understanding of human-machine interactions.
This evolution places user experience and human factors engineering at a pivotal crossroads. It raises key questions: How does AI impact the human role in managing complex systems, and is there an impending era where the human element in these interactions becomes less essential? Some of these questions are the focus of upcoming applied research at Elbit Systems.
“I believe that in the foreseeable future, completely overlooking user feedback is improbable,” Dima asserts. “A harmonious integration of human insight and machine learning will offer us a more reliable grasp of reality. There are still elements that machines find challenging to interpret, like contextual understanding and moral reasoning. Consequently, from my perspective, the human factor will maintain its importance and remain a vital component.”