Daniel Lyons' Notes

WWDC25: Explore new advances in App Intents (Apple)

Description

Explore all the new enhancements available in the App Intents framework in this year’s releases. Learn about developer quality-of-life improvements like deferred properties, new capabilities like interactive app intent snippets and entity view annotations, how to integrate Visual Intelligence, and much more. We’ll take you through how App Intents is more expressive than ever, while becoming even easier and smoother to adopt. We’ll also share exciting new clients of App Intents this year, like Spotlight and Visual Intelligence, and show how to write app intents that work great in those contexts.

Explore related documentation, sample code, and more:
App intent domains: https://developer.apple.com/documentation/AppIntents/app-intent-domains
Creating your first app intent: https://developer.apple.com/documentation/AppIntents/Creating-your-first-app-intent
Integrating actions with Siri and Apple Intelligence: https://developer.apple.com/documentation/AppIntents/Integrating-actions-with-siri-and-apple-intelligence
Making actions and content discoverable and widely available: https://developer.apple.com/documentation/AppIntents/Making-actions-and-content-discoverable-and-widely-available
PurchaseIntent: https://developer.apple.com/documentation/StoreKit/PurchaseIntent
App Shortcuts: https://developer.apple.com/documentation/AppIntents/app-shortcuts
App Intents: https://developer.apple.com/documentation/AppIntents
Adopting App Intents to support system experiences: https://developer.apple.com/documentation/AppIntents/adopting-app-intents-to-support-system-experiences
Building a workout app for iPhone and iPad: https://developer.apple.com/documentation/HealthKit/building-a-workout-app-for-iphone-and-ipad
Accelerating app interactions with App Intents: https://developer.apple.com/documentation/AppIntents/AcceleratingAppInteractionsWithAppIntents
Bring your app’s core features to users with App Intents: https://developer.apple.com/videos/play/wwdc2024/10210
Design App Intents for system experiences: https://developer.apple.com/videos/play/wwdc2024/10176
What’s new in App Intents: https://developer.apple.com/videos/play/wwdc2024/10134
Bring your app to Siri: https://developer.apple.com/videos/play/wwdc2024/10133
Get to know App Intents: https://developer.apple.com/videos/play/wwdc2025/244
Develop for Shortcuts and Spotlight with App Intents: https://developer.apple.com/videos/play/wwdc2025/260
Design interactive snippets: https://developer.apple.com/videos/play/wwdc2025/281

My Notes

00:00 Intro

  • 00:06 Jeff from the App Intents team introduces new advances.
  • 00:15 App Intents integrates app features into Shortcuts, Spotlight, and Visual Intelligence.
  • 00:25 Suggests "Get to know App Intents" session for beginners.
  • 00:35 Outline of the talk: interactive snippets, new system integrations, user experience refinements, and convenience APIs.

00:55 Interactive snippets

  • 00:56 Snippets display tailored views for App Intents (confirmation or result).
  • 01:04 Snippets now support interactivity.
  • 01:08 Examples: turning on sprinklers, configuring food orders, integrating with Live Activities.
  • 01:28 Demo with TravelTracking app to find the closest landmark.
  • 01:45 Snippet shows landmark with a heart button to favorite it; snippet updates immediately on tap.
  • 02:04 Built using the new SnippetIntent protocol, which renders views based on parameters and app state.
  • 02:32 How result snippets work: System populates parameters, fetches app entities, runs perform method to render the view.
  • 02:44 Views can associate buttons/toggles with any App Intents, reusing existing ones.
  • 03:15 System runs the associated intent, then triggers another update for the snippet.
  • 03:49 View changes are animated with SwiftUI contentTransition APIs.
  • 04:05 Implementation: Add ShowsSnippetIntent to the return type of an existing intent and provide the snippet intent in the result method.
  • 04:27 Implement the snippet intent by conforming to SnippetIntent, marking variables as @Parameter, accessing AppDependencies, and returning ShowsSnippetView in the perform method.
  • 05:04 Snippet intents are created/run many times; avoid mutating app state.
  • 05:14 Render views quickly; use parameters for App Entities (which the system re-queries) and for primitive, unchanging values.
  • 05:40 SwiftUI views trigger intents using Button or Toggle initializers with corresponding App Intents.
  • 06:13 Confirmation snippets: Demo for "Find Tickets" asks for ticket count, then shows price.
  • 06:30 If a snippet's button triggers an intent that presents its own snippet, it replaces the original (if original was a result snippet).
  • 06:43 Use requestConfirmation method to present a configuration view.
  • 07:03 The method throws an error if the snippet is canceled; let it terminate the perform method.
  • 07:11 Interactions follow the same update cycle. Modeling search requests as AppEntity allows system to fetch newest values.
  • 07:37 App is not terminated while snippet is visible; states can be retained in memory.
  • 07:47 After interactions, the original App Intent's execution resumes.
  • 07:56 Call static reload method on the snippet intent to update it mid-task.
  • 08:09 References another session for designing interactive snippets.
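A minimal sketch of the snippet flow above, using the TravelTracking example from the demo. `LandmarkEntity`, `ModelData`, and `ClosestLandmarkView` are assumptions standing in for the app's own types, and the exact result-builder names are as recalled from the session:

```swift
import AppIntents
import SwiftUI

// The ordinary app intent returns a snippet intent instead of a plain view.
struct FindClosestLandmarkIntent: AppIntent {
    static let title: LocalizedStringResource = "Find Closest Landmark"

    @Dependency var modelData: ModelData

    func perform() async throws -> some IntentResult & ShowsSnippetIntent {
        let landmark = try await modelData.closestLandmark()
        // Hand the system a snippet intent; the system re-runs it whenever
        // the snippet needs to refresh (e.g., after a button tap).
        var snippet = ClosestLandmarkSnippetIntent()
        snippet.landmark = landmark
        return .result(snippetIntent: snippet)
    }
}

// Snippet intents are created and run many times, so this one only renders;
// it never mutates app state.
struct ClosestLandmarkSnippetIntent: SnippetIntent {
    static let title: LocalizedStringResource = "Closest Landmark Snippet"

    @Parameter var landmark: LandmarkEntity
    @Dependency var modelData: ModelData

    func perform() async throws -> some IntentResult & ShowsSnippetView {
        // Any mutation (e.g., favoriting) lives in a separate intent wired to
        // a Button in the view; the system then re-renders this snippet.
        return .result(view: ClosestLandmarkView(
            landmark: landmark,
            isFavorite: modelData.isFavorite(landmark)))
    }
}
```

The heart button in `ClosestLandmarkView` would use the `Button(intent:)` initializer with a favoriting app intent, so the update cycle described above kicks in automatically.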

08:15 New system integrations

  • 08:15 Image Search (new in iOS 26) allows apps to show search results from camera captures or screenshots.
  • 08:36 Demo: Screenshot of a landmark, performing image search, selecting TravelTracking app, tapping result opens the landmark page in the app.
  • 08:54 Implement a query conforming to IntentValueQuery (accepts SemanticContentDescriptor, returns AppEntities). Image Search uses display representations.
  • 09:12 Tapped results send the AppEntity to its OpenIntent, which must exist.
  • 09:24 Query implementation: struct conforms to IntentValueQuery, values method takes SemanticContentDescriptor (pixels), converts to CGImage, then returns matched entities.
  • 09:52 Implement an OpenIntent (with matching entity type for target parameter) to handle tapped results.
  • 10:03 OpenIntents also work in Spotlight for direct navigation to entities.
  • 10:14 Return a few pages of results and allow users to continue searching in the app if they don't find what they're looking for.
  • 10:52 Use the new AppIntent macro with semanticContentSearch schema (replaces AssistantIntent). Add the semanticContent property.
  • 11:18 In the perform method, process search metadata and navigate to the search view.
  • 11:24 Use UnionValues to return a mixture of entity types from one query (e.g., LandmarkEntity and CollectionEntity). Implement OpenIntent for each type.
  • 12:05 Onscreen entities: Using NSUserActivity, apps can associate entities with onscreen content for Apple Intelligence (e.g., asking ChatGPT about visible content).
  • 12:29 Demo: Asking Siri about Niagara Falls in TravelTracking app, sending screenshot/full content (PDF) to ChatGPT.
  • 13:00 To associate: add userActivity modifier to view, associate entity identifier with activity.
  • 13:13 Support converting LandmarkEntity to data types ChatGPT understands (e.g., PDF) by conforming to Transferable protocol.
  • 13:38 Spotlight improvements: Now supports running actions directly on Mac.
  • 13:53 Make app entities conform to IndexedEntity and donate to Spotlight for filtering parameters.
  • 14:05 Use indexingKey parameter on property attribute to associate entity properties with Spotlight keys (e.g., searching "Asia" for Asian landmarks). This also enables Shortcuts to generate find actions.
  • 14:31 Annotate onscreen content with entities for prioritized suggestions.
  • 14:41 Implement PredictableIntent for system to learn and suggest intents based on user behavior.
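The image-search query and tap handling above can be sketched roughly as follows. `ModelData`, `Navigator`, and the `search(matching:)` image-matching helper are hypothetical, and the `pixelBuffer` accessor on `SemanticContentDescriptor` is as recalled from the session:

```swift
import AppIntents

struct LandmarkIntentValueQuery: IntentValueQuery {
    @Dependency var modelData: ModelData

    // The system passes what the camera or screenshot captured; the returned
    // entities' display representations become the Image Search results.
    func values(for input: SemanticContentDescriptor) async throws -> [LandmarkEntity] {
        guard let pixelBuffer = input.pixelBuffer else { return [] }
        // Hypothetical helper that matches the pixels against the app's catalog.
        return try await modelData.search(matching: pixelBuffer)
    }
}

// Tapping a result routes the entity to its OpenIntent, which must exist.
struct OpenLandmarkIntent: OpenIntent {
    static let title: LocalizedStringResource = "Open Landmark"

    @Parameter(title: "Landmark")
    var target: LandmarkEntity

    @Dependency var navigator: Navigator  // hypothetical navigation model

    func perform() async throws -> some IntentResult {
        await navigator.open(target)
        return .result()
    }
}
```

To return a mixture of entity types (landmarks and collections), the `values` return type would use the `UnionValues` mechanism mentioned at 11:24, with one `OpenIntent` per entity type.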

15:01 User experience refinements

  • 15:02 Undo: UndoableIntent protocol allows people to undo App Intents with familiar gestures.
  • 15:28 Demo: Deleting a collection with a shortcut, then undoing it with a three-finger swipe.
  • 15:49 DeleteCollectionIntent conforms to UndoableIntent, which provides an optional undoManager property to register undo actions.
  • 16:06 System provides the relevant undo manager, ensuring sync across UI and App Intents.
  • 16:28 Multiple-choice API: Present several options for users to choose from.
  • 16:38 Demo: Deleting a collection now offers an "Archive" option.
  • 16:51 In perform method, call requestChoice with an array of options (custom titles, styles).
  • 17:09 Customize snippet with dialog and SwiftUI view.
  • 17:16 Chosen option is returned; cancellation throws an error that terminates the perform method.
  • 17:32 Use a switch statement to branch on the chosen option.
  • 17:43 Supported Modes: Intents can behave differently based on user interaction (e.g., voice-only when driving, foregrounding app when looking at device).
  • 18:15 Demo: "Get crowd status" intent shows only dialog when "Open When Run" is off (like Siri with headphones), but takes user to app when enabled.
  • 18:49 Add supportedModes static variable: background only (never foregrounded) or foreground (launches app before running).
  • 19:11 Use currentMode property to check if in foreground and navigate accordingly.
  • 19:31 Modify foreground mode to dynamic (intent decides when to launch app), or deferred (eventually launches, but not immediately).
  • 20:14 For dynamic or deferred modes, use continueInForeground method to control when to bring the app forward.
  • 20:23 Check systemContext.canForegroundApp, then call continueInForeground (alwaysConfirm: false avoids prompt if recent activity).
  • 20:50 If launch request is denied, the method throws an error for handling.
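A rough sketch of the supported-modes behavior described above. `LandmarkEntity`, `ModelData`, and `crowdStatus(for:)` are hypothetical, and the `canForegroundApp` / `continueInForeground(alwaysConfirm:)` names follow the notes rather than verified signatures:

```swift
import AppIntents

struct GetCrowdStatusIntent: AppIntent {
    static let title: LocalizedStringResource = "Get Crowd Status"

    // Runs in the background by default; the intent itself decides at runtime
    // whether to bring the app forward (dynamic foreground mode).
    static let supportedModes: IntentModes = [.background, .foreground(.dynamic)]

    @Parameter(title: "Landmark")
    var landmark: LandmarkEntity

    @Dependency var modelData: ModelData

    func perform() async throws -> some IntentResult & ProvidesDialog {
        let status = try await modelData.crowdStatus(for: landmark)

        if systemContext.canForegroundApp {
            // alwaysConfirm: false skips the confirmation prompt if the person
            // interacted with the device recently; throws if the request is denied.
            try await continueInForeground(alwaysConfirm: false)
            await modelData.navigate(to: landmark)
        }
        return .result(dialog: "Crowd status for \(landmark.name): \(status)")
    }
}
```

With "Open When Run" off, the same intent would report only the dialog, matching the voice-only behavior in the 18:15 demo.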

21:02 Convenience APIs

  • 21:07 View control APIs: Remove UI navigation code from App Intents and let views handle it.
  • 21:20 Conform intent to TargetContentProvidingIntent. Use onAppIntentExecution view modifier, which takes intent type and action closure with intent passed in.
  • 22:04 This allows removing UI code or even the perform method from the intent.
  • 22:30 If multiple views have the same modifier, all will run.
  • 22:39 Control which scene/window runs an intent using handlesExternalEvents APIs.
  • 22:52 TargetContentProvidingIntent has contentIdentifier (defaults to persistentIdentifier).
  • 23:12 Use HandlesExternalEvents modifier on scenes to set activation condition (array of identifiers matching intent's contentIdentifier).
  • 23:26 For dynamic activation conditions, use the modifier on views instead.
  • 23:52 For UIKit: conform intents to UISceneAppIntent or have scene delegate respond via AppIntentSceneDelegate.
  • 24:14 New ComputedProperty macro avoids storing derived values on AppEntities (e.g., accessing UserDefaults directly from getter).
  • 24:42 New DeferredProperty macro lowers instantiation cost by fetching expensive properties (e.g., network calls) only when explicitly requested.
  • 25:21 Choose ComputedProperty over DeferredProperty for lower overhead; use Deferred only if calculation is expensive.
  • 25:34 App Intents can now be put in Swift Packages (and static libraries) using the AppIntentsPackage protocol.
  • 25:57 Next steps: Try interactive snippets, associate entities with onscreen content, provide multiple choice options, support multiple modes.
  • 26:23 Check out the sample app on the developer website.
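The two property macros from 24:14–25:21 might look like this on an entity; the query type, key names, and `CrowdService.fetchCrowdStatus` network helper are all assumptions for illustration:

```swift
import AppIntents
import Foundation

struct LandmarkEntity: AppEntity {
    static let typeDisplayRepresentation: TypeDisplayRepresentation = "Landmark"
    static let defaultQuery = LandmarkQuery()  // assumed to exist elsewhere

    let id: String

    @Property(title: "Name")
    var name: String

    // Read straight from the source of truth each time, instead of copying
    // a derived value onto the entity at instantiation.
    @ComputedProperty
    var isFavorite: Bool {
        UserDefaults.standard.bool(forKey: "favorite-\(id)")
    }

    // Fetched only when a caller explicitly requests it, keeping the entity
    // cheap to instantiate; reserve this for genuinely expensive work.
    @DeferredProperty
    var currentCrowdLevel: Int {
        get async throws {
            try await CrowdService.fetchCrowdStatus(for: id)
        }
    }

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}
```

Per the 25:21 guidance, `@ComputedProperty` is the default choice; `@DeferredProperty` only pays off when the getter would otherwise do expensive work like a network call.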
