Apple kicked off its annual developer conference yesterday at its headquarters in Cupertino, California. During the keynote, Apple took the wraps off iOS 17 and iPadOS 17 and showed off many of the new features coming to Apple devices later this year. As with past major OS updates, Apple will release a public beta of iOS 17 and iPadOS 17 this summer, then likely release the final version for everyone later this year alongside the iPhone 15.
This year’s mobile operating system update from Apple doesn’t include any groundbreaking features, but it does deliver a plethora of welcome improvements to many of the applications we use on a regular basis. And while iOS and iPadOS may not have received any significant new features, Apple did announce the new Apple Vision Pro mixed reality headset, which could have some major implications for the aviation industry.
We’ll first look at what’s new in iOS 17 for iPhone, then look at some of the unique features designed for the larger screen in iPadOS 17.
What’s new in Phone and FaceTime

Voice phone calls may be going the way of the Dodo, but Apple has added some new flair and a clever new voicemail feature. You can now personalize a custom Contact Poster using a favorite photo or Memoji, which becomes part of your contact card. Family and friends will see that image wherever you communicate, including when you call them.
A really clever feature, Live Voicemail, displays a real-time transcription of a voicemail as the caller speaks. You can even pick up the call at any time while they’re still on the line.
If you prefer making FaceTime video or audio calls in lieu of a traditional phone call, you can now leave a video or audio message when someone misses your call.
What’s new in Messages
Many of the things you send most often through the Messages app are now easier to access. A new plus button provides quick access to photos, audio messages, and the rest of your iMessage apps.
“Check In” now lets your friends or family know when you’ve arrived at your destination. This could be a great way to automate arrival messages to your company or family members. The Check In feature will also send location, battery level, and cell reception information to a designated contact if you stop making progress on your trip. You can share your location with a contact simply by tapping the plus button while in a message thread. If someone shares their location with you, you can view it directly within the conversation. You can also now share an AirTag or other Find My network accessory with up to five other people.
Ever land after a long flight, turn off airplane mode and find yourself greeted by dozens of new messages? iOS 17 now has a Catch-up arrow that lets you jump to the first message you haven’t seen. Then you can simply swipe right to send a reply to any specific message. New search filters also make it easy to go back and find a specific message, and any audio message you receive is now automatically transcribed, which can be helpful in a noisy cockpit where listening to iPhone audio may be difficult or impossible.
Apple has now made your mobile device even more useful when it’s being charged. StandBy gives you glanceable information designed to be viewed from a distance. Turn your iPhone into a bedside clock, showcase special moments from your photos, and get the right information at the right time with widget Smart Stacks. Maybe ForeFlight’s Passenger app could work with this mode, showing continuously updated ETA information?
StandBy also supports full-screen Live Activities, and Siri results are optimized to be viewed from a distance.
Widgets continue to evolve from simple, convenient information views into small, interactive applets. Think of interactive widgets as mini versions of your favorite apps that give you access to specific features without even launching the app. Interact with a widget from your Home Screen, Lock Screen, or even in StandBy. Complete a to-do, play or pause a song or podcast, or access your Home controls to get tasks done in the moment.
These have been a popular feature on Android devices for years, and could offer easy access to frequently used tools like airport weather or logbook stats. Check with your favorite app developer to find out about any interactive widgets in the pipeline.
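For developers, interactive widgets are built on Apple's WidgetKit and App Intents frameworks: a button in the widget triggers an AppIntent that runs without opening the app. Here's a minimal sketch of what a hypothetical aviation widget might look like; the METAR refresh intent, the `MetarEntry` fields, and the station shown are our own illustrative assumptions, not any shipping app's API.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent: refresh the METAR for a saved airport.
// A real app would call its own weather service in perform().
struct RefreshMetarIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh METAR"

    func perform() async throws -> some IntentResult {
        // Fetch fresh weather and reload widget timelines here.
        return .result()
    }
}

// Illustrative timeline entry for the widget.
struct MetarEntry: TimelineEntry {
    let date: Date
    let station: String
    let rawMetar: String
}

// The widget view: Button(intent:) is the new iOS 17 mechanism
// that lets the tap run in the background, without launching the app.
struct MetarWidgetView: View {
    var entry: MetarEntry

    var body: some View {
        VStack(alignment: .leading, spacing: 4) {
            Text(entry.station).font(.headline)
            Text(entry.rawMetar).font(.caption)
            Button(intent: RefreshMetarIntent()) {
                Label("Refresh", systemImage: "arrow.clockwise")
            }
        }
    }
}
```

Because the intent runs in the widget extension, the same pattern works on the Home Screen, Lock Screen, and in StandBy.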
AirDrop has made sharing files across mobile devices incredibly easy, and the AirDrop framework now enables additional sharing features.
Want a quick and easy way to share a flight plan? Share with intention now lets you simply bring two devices close together to initiate an AirDrop. Previously you had to dig through the content you wanted to share, find the send-to or share option, find the AirDrop option, then select the correct device name of the person you wanted to share with. Simply bringing the two devices together greatly simplifies the process.
This process also now allows two nearby devices to swap contact information instantly. Hold your iPhone near someone else’s iPhone or Apple Watch to use NameDrop. You’ll both be able to choose the specific phone numbers or email addresses you want to share, and you can share them along with your Contact Poster instantly.
Similarly, hold your iPhone close to your friend’s iPhone to instantly watch content, listen to music, play games in sync, and more with SharePlay. And if you need to run before the content is finished sending, AirDrop will continue the transfer over the internet in full quality.
Much like saving your aviation chart data before each flight, Apple Maps data is now available to download for offline use. You can save an area of a map to your iPhone and explore it while offline. View information like hours and ratings on place cards and get turn-by-turn directions for driving, walking, cycling, or riding transit. This could be helpful if you’re flying to a new area and are unsure whether cell service or Wi-Fi will be available upon your arrival, or for reviewing in flight.
Later this year, AirPlay will be available in supported hotel rooms. Scan the QR code on your room’s TV and you can securely share videos, photos, and music from your iPhone to the TV. Additionally, if you have multiple AirPlay devices, the list is now sorted by relevance, and iOS can proactively suggest connections based on your preferences.
A minor update, but one with a very helpful option, is the ability to mark up PDFs, scans, papers, and more in the Notes app. Have an airport diagram or chart on which you’d like to highlight pertinent info? Just mark directly on the document. Notes now also allows you to link related notes so you can group them, like a trip itinerary. We’ve used Notes for years for scanning aviation documents, and this could be a valuable upgrade for your POH, checklist, or maintenance documents.
Journal is a new way to reflect on and relive special moments. Pilots could capture thoughts on each flight, airport or FBO, and add details to any entry with photos, music, audio recordings, and more. There’s also an option to mark important moments and revisit them later to find new insights or set new goals. Using on-device machine learning, your iPhone creates personalized suggestions of moments for you to remember and write about based on your photos, music, workouts, and more. There’s even a Journal API, which could enable some logbook apps to integrate with this new app—time will tell.
What’s new in iPadOS 17
iPadOS 17 includes all of the same features as iOS 17, including the improvements to widgets, Messages, FaceTime, Notes, and AirDrop, but builds on several of the new features designed to increase productivity and collaboration for those using the iPad as an everyday computer. The one feature that doesn’t appear to be included on iPad is StandBy.
Personalized lock screen
Take advantage of super-high-quality wallpaper to customize the look of your iPad Lock Screen, including all-new motion effects for Live Photos. Live Activities are now available on iPad so you can keep track of things happening in real time, like the status of an upcoming flight. Additionally, you can now add your favorite widgets to the iPad Home Screen.
The option to mark up documents in the Notes app becomes acutely relevant on the large iPad screen, especially when using the Apple Pencil. PDFs and document scans are presented in full width, making them much easier to work with. This also includes the same Enhanced AutoFill as iOS 17, but the larger screen of the iPad improves on this feature, making it very simple to complete a flight plan or other form.
In the continued evolution to turn the iPad into a dedicated mobile computing device, Stage Manager now offers more flexibility when moving and resizing multiple windows, so you can set up your workspace exactly how you like it. This can be especially helpful for optimizing a larger screen when connected to an external monitor. You can also use a camera attached to an external monitor to make and receive FaceTime calls.
The list of new features in this latest update is impressive. Other minor but notable additions to this operating system update include:
- Siri now allows back-to-back requests without reactivating.
- Keyboard autocorrect is more accurate and includes in-line predictions as you type.
- Spotlight searches intelligently offer your app shortcuts.
- Visual Look Up lets you lift a subject from a photo or video frame and look up information from the callout menu.
- The Health app can provide powerful insights into your mental health and vision health, and is now available on iPad.
- Improved security also includes a Lockdown mode which can be activated simultaneously across all your devices.
- Enhanced AutoFill can use information from Contacts to fill out PDF documents faster.
Apple Vision Pro – A HUD for your face
The famous “one more thing…” part of Apple’s presentation was the long-awaited virtual reality/augmented reality/mixed reality/magic box. Apple calls it Vision Pro, a self-contained “spatial computing” device that includes dual 4K screens with a 180-degree field of view, built-in speakers, and a whole host of sensors to read the world around you and help you interact with it. The rumors were swirling, the demo was impressive—but what does it mean for pilots?
What is it good for?
The Vision Pro includes a stunning array of hardware, from chips to cameras to sensors, and integrating all of this into one product is something that only Apple could pull off. It’s the highest performing VR headset yet designed, but a key question is “what will owners use it for?” Or to put it in the terms of so much technology debate, “what’s the killer app?”
Apple seems to be taking a similar approach to its launch of the Watch in 2015: try a bunch of potential use cases and see what wins in the marketplace. The Watch started out as a luxury product, a timepiece, and maybe a smart home controller; over time it evolved into a fitness tracker and notification screen. The killer app was workout tracking, not leather Hermès watch bands. Likewise, the Vision Pro demo showed potential uses for productivity apps, video conferencing, gaming, photos, and much more. Many of those use cases may fail to live up to the hype, but some will stick.
The most likely winner, in our opinion, is entertainment: watching movies, browsing YouTube, and playing games. Productivity—think Word, Excel, Slack—might take off, but it’s not clear why a headset or a 3D version of these apps is better than a powerful laptop or even a tablet. Still, a powerful video device could bring benefits for aviation training, with immersive videos on aerodynamics and interactive simulations of aircraft parts (yes, stay tuned for some exciting updates to Sporty’s Pilot Training app).
Vision Pro also has great potential in the flight simulator world. The ability to look out the left window of a virtual Cessna while flying X-Plane might make it much more valuable as a training tool. Likewise for practicing cockpit flows and flipping virtual switches—the current method of zooming in on a switch and then trying to click it with a mouse leaves a lot to be desired. Apple appears to be offering a headset with enough resolution to show a virtual cockpit in exquisite detail, and the ability to interact with hand gestures is a major upgrade.
Before the keynote presentation was even finished, some pilots were speculating that Vision Pro could make a great heads-up display (HUD) for in-flight use. It sounds fun, but we are skeptical. First, the points in its favor:
- Vision Pro is self-contained, so you don’t have to be tethered to a computer.
- There are no pesky hand controllers like you find on Meta’s Quest, so keeping your hands on the controls would be possible.
- There is plenty of computing power built in, so it should be able to handle processor-hungry aviation apps.

On the other hand, there are some serious drawbacks:

- Most importantly, this is not a transparent set of glasses. While it looks like one in the images, the front is actually opaque, with an OLED screen that shows a video representation of your eyes. The inability to turn off the magic and see out the window (without relying on the built-in cameras and screens) seems like a major problem for pilots.
- Battery life isn’t great—only two hours with the attached battery pack—although this will likely improve with future models.
- It is big and heavy. Apple is emphasizing quality with the metal and glass construction, but that could make it uncomfortable for long flights on hot days. Could you wear an aviation headset around it? Maybe…
Someone will surely try to fly with Vision Pro as a HUD, but we think it’s a bad idea. For a true in-flight HUD, the answer is probably see-through glasses, which could be many years away.
Of course, speculating about version one of a product that barely exists is a dangerous game. It’s clear that Apple views this product as the start of a new platform, not a one-off. Note that it’s called Vision Pro, suggesting a regular Vision or even Vision SE might be on the roadmap at some point, just like the iPad or iPhone. The original iPhone was a fairly unimpressive device; only with the iPhone 3G and 3GS did the product find its footing. That very well might happen here (Apple Vision SE, the hot gift of Christmas 2025).
Meta, perhaps Apple’s fiercest rival in the world of AR/VR, has tried to win on price with their headsets, sacrificing screen resolution and ease of use in order to ship a product for $300. So far sales have been decent, but hardly transformational. Apple has taken the exact opposite approach, essentially building a product with every possible feature and sensor and letting the price land where it lands—which, for now, is an eye-watering $3500.
The critical advantage for Apple is its ecosystem, a word that is wildly overused in tech circles but certainly applies here. For example, the Vision Pro demo showed off the ability to take and play back 3D photos and videos for a next-level visual experience. We would bet the iPhone 15 coming later this year will allow you to create those same 3D videos: shoot it on the phone, replay it on the headset. That would dramatically increase the number of 3D videos, and could also be helpful for post-flight debriefs and aviation training. Apple can scale this feature because of the 200 million iPhones it sells every year.
And then there’s the App Store and the 1 million+ apps available for Apple devices. By announcing Vision Pro almost a year before it ships, Apple is giving its developer community plenty of time to modify existing apps or build completely new ones for the upcoming visionOS. There will be tens of thousands of apps available the day the first headset ships, compared to Meta’s paltry list of under 3,000. What will all these new apps do? We can’t wait to find out.