Is voice recognition the next big thing for aviation iPad apps?
Today’s GA avionics and iPad apps have only scratched the surface when it comes to reducing pilot workload and improving GA safety. While Apple, aviation app developers and accessory manufacturers have made great strides over the past 6 years, we’re still interacting with our data in much the same way. Yes, we have digital charts and near real-time weather imagery at our fingertips, but to ingest this information we have to take our attention away from flying the airplane and interact with small buttons and menus on a panel-mount MFD or tablet in our lap.
The iPad’s touchscreen display has made this interaction much easier, but at the end of the day it’s really not much different than looking down at a paper sectional chart or flipping through the pages of an instrument approach book. No matter how you slice it, it’s time away from looking out the window, scanning flight instruments or monitoring aircraft systems.
You could outfit your cockpit with all the iPads, gadgets and information systems available today, but that would only provide a fraction of the benefit of an alternative low-tech solution: a human co-pilot. There’s no better way to reduce pilot workload than to have another knowledgeable pilot sitting beside you to help out with routine tasks in the airplane. The best part about this arrangement is the tried and true communication link with this resource: verbal dialogue.
Need the ATIS frequency for the destination? All it takes is a simple spoken request using the headset and they can locate the frequency for you (and maybe even enter it in the standby radio if you’re lucky). Need the before landing checklist? Again a simple request will have them reading the items out loud while you keep your head up and accomplish each task. The best part about this resource is it allows you to keep focused on flying the airplane and not spend time hunting for data on an iPad or checklist in your lap. This is a tremendous resource to have during task-saturated events, like taxiing around a busy airport or setting up for an instrument approach in rough weather.
The airlines recognize the benefit of a two-pilot crew and there’s no question it’s a big reason they operate with a near-flawless safety record. It’s not realistic, though, for GA pilots to always operate with a two-pilot crew, as that would take away a lot of the flexibility afforded by our GA airplanes.
So if we accept that we’re going to continue operating single-pilot in a high-workload environment for the foreseeable future, what other resources might be out there to help reduce pilot workload? You don’t have to look far to see one possibility, which is currently exploding in the consumer market:
“Hey Siri, order me an Uber to the Peachtree Airport”
“Hey Siri, set the home thermostat to 72°”
Voice control was designed to allow us to communicate with our devices without taking our eyes off another task, such as walking down the street or driving a car. All the popular smartphones and tablets now offer these digital assistants, and Amazon even offers a product that resides in your home that allows you to order products from the online retailer using only voice commands.
If the digital assistant concept works so well for hands-free tasks in everyday life, why couldn’t it benefit pilots in the airplane as well? The folks at the MITRE Corporation think it can, and have been researching the concept for nearly a year now. MITRE is a not-for-profit organization that operates research and development centers sponsored by the federal government. Their main interest is to develop new technologies that benefit the public interest. For example, they developed the core technologies that power your favorite portable ADS-B receiver.
Their current mission is to develop the “Digital Copilot,” with the goal to increase safety in the single-pilot environment. These technologies would allow the pilot to speak to an iPad app to request information pertinent to the current phase of flight, and the app could either display that information on the screen or “speak” the requested info right back to the pilot. This would provide many of the same capabilities as a human copilot, looking up ATC frequencies, reading checklists or providing reminders to switch fuel tanks, all with no heads-down time needed from the pilot.
MITRE’s Digital Copilot looks to go beyond a simple challenge/response system and provide smart services, too. It will monitor what’s occurring in your flight and try to alert you to safety issues, such as altitude deviations or taxiing onto the incorrect runway. While en route it could monitor your flight progress, and automatically read to you the current weather at the destination and runway in use when you start to descend.
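MITRE hasn’t published the Digital Copilot’s internals, but the altitude-deviation alerting described above boils down to simple threshold logic. Here’s a minimal sketch; the 200-foot tolerance and all names are illustrative assumptions, not MITRE’s actual implementation:

```python
# Illustrative altitude-deviation monitor, loosely modeled on the Digital
# Copilot concept. The threshold and names are assumptions for this sketch.

ALERT_THRESHOLD_FT = 200  # tolerance before flagging a deviation (assumed)

def altitude_alert(assigned_ft, current_ft):
    """Return a spoken-style alert if the aircraft has drifted from its
    assigned altitude, or None if it is within tolerance."""
    deviation = current_ft - assigned_ft
    if abs(deviation) < ALERT_THRESHOLD_FT:
        return None
    direction = "above" if deviation > 0 else "below"
    return f"Check altitude: {abs(deviation):.0f} feet {direction} assigned"

# Example: assigned 5,000 ft, currently at 5,350 ft
print(altitude_alert(5000, 5350))  # -> Check altitude: 350 feet above assigned
print(altitude_alert(5000, 5100))  # -> None (within tolerance)
```

In a real system the inputs would come from GPS or ADS-B position data, and the string would be handed to a text-to-speech engine rather than printed.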
One of the most interesting things about this project is that MITRE is not trying to develop a dedicated EFB app with these features to compete in the current app market. Rather, their goal is to develop these new technologies and then transfer the research and capabilities to the GA software industry. It’ll then be up to the individual aviation app developers to incorporate the Digital Copilot concept if they so choose.
The timing of this project coincides well with the release of iOS 10, which opens up Apple’s Siri digital voice assistant to 3rd-party apps. Up to this point Siri could only be used for specific Apple-approved requests, which were pretty limited. Now individual app developers can take advantage of voice recognition to perform basic tasks in their specific apps, e.g. calling for an Uber right from the main Siri screen.
Garmin also introduced voice control in its panel-mount avionics lineup this year, officially called Telligence Voice Command. This requires a GMA 350 or GMA 35 audio panel, a GTN 650/750 touchscreen navigator with the required Telligence system software, and a push-to-command button on the yoke (separate from your communication radio push-to-talk switch).
There are hundreds of commands available with Telligence, allowing you to perform routine actions on the GTN navigators without touching or looking at the screen. For example, you could have it tune in specified frequencies:
“Tune destination ATIS”
“Tune destination approach”
The system can also be questioned and will provide contextual information using a computer-generated voice. You might ask:
“Say bearing and distance from destination airport”
“Say winds”
When it comes to voice recognition and the concept of a personal digital assistant, there are really two separate trends to pay attention to over the next few years. The first is the basic voice command technology that Garmin recently made available in their panel-mount navigators. This technology is here now, but its capabilities are limited to a finite set of commands and actions programmed by the developers.
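A fixed command set like this is essentially a phrase-to-action lookup table. This toy sketch shows the idea; the command table and responses are invented for illustration and are not Garmin’s actual grammar:

```python
# Toy illustration of a finite voice-command grammar, in the spirit of the
# Telligence commands quoted above. All entries here are invented examples.

COMMANDS = {
    "tune destination atis":     lambda: "Tuning destination ATIS in standby",
    "tune destination approach": lambda: "Tuning destination approach in standby",
    "say winds":                 lambda: "Winds 270 at 10",  # placeholder data
}

def handle_command(spoken):
    """Match a recognized phrase against the fixed command table."""
    action = COMMANDS.get(spoken.strip().lower())
    if action is None:
        # Anything outside the programmed set is rejected -- the key
        # limitation of non-AI voice command systems noted above.
        return "Command not recognized"
    return action()

print(handle_command("Tune destination ATIS"))     # -> Tuning destination ATIS in standby
print(handle_command("What's the weather ahead?")) # -> Command not recognized
```

The second trend, true AI-driven assistance, would replace this rigid table with a system that can interpret requests it was never explicitly programmed for.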
The next step is combining voice recognition with artificial intelligence, allowing the avionics or iPad app to “think” in a sense and provide more meaningful data and resources to you throughout a flight. This would represent the true Digital Copilot that MITRE is researching and has the potential to affect aviation safety in big ways.
If voice command for avionics works as well as the “award winning state of the art” voice command in my 2016 SUV, we’re going to have all sorts of new NTSB findings. “I’m sorry, I didn’t understand your request, please try again” is the response I get most of the time. This, after spending much time programming the system and having it assure me “thank you, your voice command setup is complete,” now 3 separate times. Siri is about as reliable, too, so my guess is that voice command avionics meeting any kind of reliability standard won’t happen very soon.
You haven’t tried Alexa have you?
I seriously doubt that further exposure to automation dependency by transferring activities from pilot’s brain to some silicon AI, Affected Intelligence, will make flying more joyful … ;-).