What's new for enterprise developers
Discover how you can build compelling apps for your business on iOS, iPadOS, macOS, and watchOS. We'll take you through a curated overview of the latest updates to Apple platforms and explore relevant features that you can use to create engaging enterprise apps to transform workflows, inform business decisions, and boost employee productivity.
- Capture machine-readable codes and text with VisionKit
- Complications and widgets: Reloaded
- Create parametric 3D room scans with RoomPlan
- Discover ARKit 6
- Dive into App Intents
- Enhance voice communication with Push to Talk
- Get more mileage out of your app with CarPlay
- Go further with Complications in WidgetKit
- Hello Swift Charts
- Implement App Shortcuts with App Intents
- Meet Apple Maps Server APIs
- Meet the expanded San Francisco font family
- Meet WeatherKit
- Swift Charts: Raise the bar
- The SwiftUI cookbook for navigation
- What's new in MapKit
- What's new in SF Symbols 4
- What's new in Vision
Hey, everyone! My name is Andria Jensen and I'm a consulting engineer helping our enterprise customers with their app-development efforts. Today I'd like to talk to you about what's new for enterprise developers. We see Apple products being used around the world by businesses of all sizes and across industries in more ways than we ever imagined. Enterprise developers make sure our devices are used most effectively for business. Whether it's a point-of-sale app in a retail store or a flight-planning app for airline pilots, you rely on our platforms to provide the tools and features you need to do your best work and enable others to do theirs. This year at WWDC, Apple announced major updates across software platforms, including iOS, iPadOS, macOS, and watchOS. There were a lot of announcements this year, with sessions discussing all of these topics and more. I'm going to highlight ones I think will be most relevant to you, including the new data-scanner API allowing for minimal code to create your own barcode-scanning interface, and Live Activities and Lock Screen widgets that show current data from your app at a glance. And these are just the start. To better navigate all of this new information, I'm grouping things into related areas that we'll continue to reference throughout this session. And as I go through each of these at a summary level, I'll be pointing you towards additional sessions you can watch in the Developer app or on the website to learn more. OK! Let's get going. We'll start with Siri. In iOS 15 and before, SiriKit and Intents have been a way for your apps to leverage the power of Siri. This year, we announced App Intents, a new Swift-only framework designed to make it faster and easier to build great actions for your app. Unlike the previous way of making SiriKit custom intents, new App Intents do not require intent definition files or generated code. And if you have existing custom intents with intent definition files, it's easy to convert them over. 
Simply click the Convert to an App Intent button in your existing intent definition file. Once you have an app intent created, you can use it to build the all-new App Shortcuts. When an app is installed, so are its app intents and App Shortcuts, making shortcuts immediately available in Siri, Spotlight Search, and the Shortcuts app before the app is ever opened. This means there is now no user setup needed to create an App Shortcut; they're automatically available to users and have predefined phrases. Those initial trigger phrases can now also contain parameters, making it faster than ever to invoke a shortcut to do exactly what you want in any moment. Since App Shortcuts are installed with the app, an Add to Siri button is no longer needed. But users will still need to be made aware of the phrases you've included with your app. So we've introduced two ways to make shortcuts easy to discover. The Siri Tip view helps people understand what shortcuts are available and how to use them. And a new Shortcuts link button has been added, so it's easy to jump into the Shortcuts app and see all of the App Shortcuts your app offers. Your app is limited to 10 predefined App Shortcuts, so those installed with the app should be focused and well-defined. This allows the people using your app to create even more efficient workflows. We introduced two new modern ways to leverage Siri with app intents and App Shortcuts, and helpful ways to discover them with the Siri tip view and Shortcuts link. Check out these WWDC sessions to learn more about integrating these in your apps. So that's Siri. Widgets are a great way to leverage Siri intents for surfacing relevant details of your app at a glance. The new app intents are great for powering App Shortcuts, but for those using intents to power widgets, you should continue to use custom SiriKit intents and intent definition files. With iOS 16, you can use WidgetKit to build complications on the Lock Screen for iPhone. 
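To make the App Shortcuts flow above concrete, here's a minimal sketch of the new Swift-only framework. The intent, parameter, and trigger phrase are hypothetical examples for an enterprise inventory app, not an API prescribed by the session:

```swift
import AppIntents

// Hypothetical intent for an enterprise inventory app.
struct CheckInventoryIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Inventory"

    @Parameter(title: "Item Name")
    var itemName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Look up the item in your own data store here.
        .result(dialog: "Checking stock for \(itemName).")
    }
}

// App Shortcuts are declared in code and installed with the app --
// no intent definition file, no generated code, no user setup.
struct InventoryShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: CheckInventoryIntent(),
            phrases: ["Check inventory in \(.applicationName)"]
        )
    }
}
```

Because the shortcut provider ships inside the app bundle, the phrase is available in Siri and Spotlight as soon as the app is installed.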
And complications on the Apple Watch are now built using Widget Extensions as well. So you can write code once for iOS 16 and watchOS 9. You can even share that infrastructure with your existing Home Screen widgets. Let's look a little closer. iOS 14 introduced system widgets on iPhone and iPad, and watchOS 7 introduced custom complications using ClockKit. New in iOS 16 and watchOS 9, WidgetKit can be used to build widgets for the new Lock Screen and complications on Apple Watch. We've added new widget family types for iOS and watchOS that replace the previous ClockKit complication families. So now whether you're on iPhone or Apple Watch, you'll have a few common widget types to choose from: rectangular, circular, inline, and a special type just for Apple Watch, accessoryCorner. We've also added new rendering modes to support the new system styles in iOS 16 so you can make sure your widget feels at home no matter what style your user prefers. This year we're introducing another powerful way to leverage your Widget Extensions: Live Activities. Live Activities make it easy to stay on top of the things that are happening in real time. They're displayed on your Lock Screen or from the new Dynamic Island. Live Activities use WidgetKit functionality and SwiftUI for their user interface. This makes the presentation code of a Live Activity similar to widget code. It also enables code sharing between your widgets and Live Activities. However, Live Activities don't use a timeline for updates like widgets. ActivityKit was introduced in iOS 16.1 to handle the life cycle of each Live Activity. You use it to request, update, and end a Live Activity. In addition, Live Activities may also be updated by receiving remote push notifications, so your Live Activities can stay in sync when data changes on your back end. 
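Here's a minimal sketch of that ActivityKit life cycle as introduced in iOS 16.1. The delivery-tracking attributes and values are hypothetical, chosen to match the enterprise use cases discussed in this session:

```swift
import ActivityKit

// Hypothetical attributes for a delivery-tracking Live Activity.
struct DeliveryAttributes: ActivityAttributes {
    // Dynamic values rendered on the Lock Screen and in the Dynamic Island.
    struct ContentState: Codable, Hashable {
        var stopsRemaining: Int
        var estimatedArrival: Date
    }
    // Static values fixed for the life of the activity.
    var orderNumber: String
}

func startDeliveryActivity() throws -> Activity<DeliveryAttributes> {
    let attributes = DeliveryAttributes(orderNumber: "A-1042")
    let state = DeliveryAttributes.ContentState(
        stopsRemaining: 3,
        estimatedArrival: .now.addingTimeInterval(1800)
    )
    // Request the Live Activity; update(using:) and
    // end(using:dismissalPolicy:) handle the rest of its life cycle,
    // and remote push updates can drive the same ContentState.
    return try Activity.request(attributes: attributes, contentState: state)
}
```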
Imagine using this in aviation apps to monitor wind speed in real time for making safety decisions; or retail and food service managers could see real-time stats for their shift at a glance. Here's what you might find on the Lock Screen for an iPhone 14: a Live Activity persistently displayed without unlocking the phone, such as up-to-the-minute status on a current order. With the iPhone 14 Pro and iPhone 14 Pro Max, you also see Live Activities on the Lock Screen, and in addition, you'll also see Live Activities on the new Dynamic Island. This allows for displaying and interacting with real-time data across the system at a glance. So a delivery driver can see their current delivery information, a field worker can see relevant data for their current job site, and a shift worker can keep track of their current hours; all of this in real time without unlocking the device and even while using another app. We introduced new widgets for the Lock Screen and deprecated ClockKit complications in favor of Widget Extensions. Widget Extensions now power Apple Watch complications, Lock Screen and Home Screen widgets, as well as the new Live Activities. Learn more with these sessions and start building glanceable experiences for your apps today. That's what's new with widgets. So it might surprise you, but iOS and iPadOS are the biggest augmented reality platforms in the world, and they help you to bring your ideas to life in more realistic ways than ever. New in ARKit 6, we've introduced a 4K video mode that lets you run the camera stream in the highest image resolution yet. We also introduced some additional camera enhancements that give you more control of the video backdrop. There were updates to the behavior of plane anchors, additions to the Motion Capture API, and Location Anchor support in new cities. 
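Opting in to the ARKit 6 4K video mode mentioned above is, to the best of my understanding, a one-property change; this sketch assumes the `recommendedVideoFormatFor4KResolution` API introduced with ARKit 6:

```swift
import ARKit

// Opt in to 4K video capture where the device supports it (ARKit 6).
func makeHighResConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    // Returns nil on devices that can't run the 4K format,
    // so the session falls back to the default video format.
    if let format = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        configuration.videoFormat = format
    }
    return configuration
}
```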
Powered by ARKit, RoomPlan is a new Swift API that utilizes the Camera and LiDAR Scanner on iPhone and iPad to create a 3D floor plan of a room, including key characteristics such as dimensions and types of furniture. Imagine the possibilities with RoomPlan in workflows across architecture and design, retail, hospitality, and real estate industries. With great updates to ARKit 6 and the addition of RoomPlan for creating 3D floor plans, there are even more ways to integrate augmented reality in your apps. Learn more by watching these Dub Dub sessions. That was augmented reality. Vision allows you to build features that can process and analyze images and video using computer vision, enabling your users to quickly scan for relevant information using just the device's camera. This year, we announced new revisions to text recognition, barcode scanning, and optical flow. These revisions offer increased performance and accuracy. Barcode scanning now offers faster scanning for multiple barcodes, barcodes detected per image, and improved bounding boxes for 2D barcodes. Revision three of text recognition and barcode scanning serves as the foundation for our great new Live Text and data scanner APIs, which I'll share more about in a moment. For text recognition, we now support Korean and Japanese. There's also the ability to do language recognition for cases where language is not known up front. But if you do know what language the user will be trying to recognize, you should configure that in the API for better performance. Live Text was a system feature introduced in iOS 15 that let you interact with data in photos in brand-new ways. This year, we announced a new Swift API in VisionKit that lets you bring Live Text features to your own apps. Built on the great Vision APIs we just discussed, the Live Text APIs work beautifully on static images and can be adapted for use on paused video frames as well. 
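Bringing Live Text to a static image boils down to analyzing it and attaching the result to an interaction. A minimal sketch using the VisionKit APIs described above; the function name is mine, not the session's:

```swift
import UIKit
import VisionKit

// Attach Live Text interaction to a static image (iOS 16, VisionKit).
@MainActor
func enableLiveText(on imageView: UIImageView) async throws {
    guard let image = imageView.image else { return }

    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.text, .machineReadableCode])
    let analysis = try await analyzer.analyze(image, configuration: configuration)

    // Users can now select, copy, translate, and act on text
    // and QR codes found in the image.
    interaction.analysis = analysis
    interaction.preferredInteractionTypes = .automatic
}
```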
Now you can allow interaction directly with text inside of an image; standard interactions like Copy are available; and data detectors are available for things like getting directions to an address, making a phone call, or sending an email. Users can even see text translated into their preferred language. Finally, QR code detection is also available. All of this without having to write complex code. The Live Text APIs are great for analyzing static image frames, but sometimes you need to do live-image analysis using the camera, such as for barcode scanning. iOS 16 introduces a new DataScannerViewController as part of the VisionKit framework that simplifies data scanning for developers. This new view controller will greatly reduce the time you spend creating a scanning experience. We've taken all the great Vision APIs that we just discussed and wrapped them in an easy-to-use drop-in UI component that takes care of the scanning interface for you. Data scanner offers built-in features for scanning text and machine-readable codes using a live camera preview; customizable guidance labels and item highlighting that help your users easily find and scan the right things quickly; tap to focus, which is also used for selection; and lastly, pinch to zoom to get a closer look. Integrating the data scanner API into your apps can reduce costs associated with using third-party scanning libraries and allow you to create beautiful scanning experiences customized to your app's needs. Whether your app is used for scanning medical supplies, retail inventory management, or baggage handling for an airline, the data scanner API can handle it. For machine-readable codes, all of these symbologies are supported, including these highlighted formats we commonly see in enterprise. You can even scan multiple codes with different symbologies at once. And for text scanning, we support content types from street address to date, time, and duration. 
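Setting up the drop-in scanner is mostly a matter of declaring what you want recognized. A sketch, with symbologies and content type picked as plausible examples rather than anything mandated by the API:

```swift
import VisionKit

// Drop-in scanning UI for barcodes and text (iOS 16).
@MainActor
func makeScanner() -> DataScannerViewController {
    DataScannerViewController(
        recognizedDataTypes: [
            // Scan QR and Code 128 barcodes plus dates and durations;
            // these particular choices are illustrative.
            .barcode(symbologies: [.qr, .code128]),
            .text(textContentType: .dateTimeDuration)
        ],
        qualityLevel: .balanced,
        isGuidanceEnabled: true,       // built-in guidance labels
        isHighlightingEnabled: true    // built-in item highlighting
    )
}

// Present the controller, then call startScanning(); recognized items
// arrive through DataScannerViewControllerDelegate callbacks.
```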
Imagine integrating this into a travel or expenses app that might allow employees to scan their documents and receipts for relevant information. New revisions were introduced this year for text recognition, barcode scanning, and optical flow. And revision one for face detection and landmarks has been deprecated this year. Remember, it's always best practice to use the latest revisions. We've added support for Korean and Japanese, all of this powering new Live Text and Data Scanner APIs. To learn more and integrate these great new APIs in your apps, check out these sessions. That's what's new in Vision. Now let's take a look at Maps. Apple Maps has been improving every year, adding more visual detail and higher-quality map imagery. This year, we took MapKit to a new level with an all-new map. Along with that, we're providing a 3D city experience which lets you move around cities in amazing detail. The 3D city experience is available in many metropolitan areas around the world. And now, you can include this new experience and the all-new map in your own apps without any additional code. We've also introduced a Look Around API, allowing you to bring this immersive experience into your own apps. Imagine working for a plumbing service and being able to visualize each client address before arriving or being a delivery driver servicing a large store like this Safeway for the first time. You can see the parking lot and exactly where you need to go in order to make your delivery as well as how accessible it might be for your truck. With selectable map features, you can now take advantage of all the great annotations that Apple Maps provides. Before, you could see points of interest provided by Maps but could not interact with them. New this year, we're changing that. By using a default annotation view or creating your own custom experience, you can now interact with points of interest provided by Apple Maps.
MapKit has supported overlays with several styling options for years. In iOS 16, we're improving our existing APIs to allow your overlays to seamlessly integrate with the map. Notice here how the route is occluded by the trees. Finally, we've added a new Map Configuration API that enables you to configure your maps using different types of displays. A new property for preferred configuration has been added that you should begin using. There are three configuration types that allow you to choose what works best for your map's context. The imagery map configuration is used to present satellite-style imagery. The hybrid configuration is used to present an imagery-based map with added map features, such as road labels and points of interest. The standard map configuration is used to present a fully graphics-based map. These three map configurations may sound familiar to you as they're similar to our existing map types. MKMapType and the associated MKMapView properties for showing particular map features are deprecated. You should transition to using the new Map Configurations API. Our native frameworks offer an amazing map experience. To extend that experience, we've created the new Maps Server APIs. With geocoding APIs, you can convert an address to geographic coordinates in latitude and longitude; reverse geocoding does the opposite, allowing you to take coordinates and turn them into an address. The Search API lets you pass in a search string to discover places like businesses, points of interest, and more. And the ETA API will let you calculate how far you are from a given destination. We introduced a beautiful all-new map and 3D city experience. We've provided new MapKit APIs for you to bring that experience to your own apps, with a new way to configure your maps and a few new server APIs as well. Check out these sessions to get more details on bringing these new features to your maps. That's Maps. Next up, Weather. 
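The migration from the deprecated map type to the new configuration API described above can be sketched like this; the elevation style and filter are illustrative choices:

```swift
import MapKit

// Migrate from the deprecated mapType to preferredConfiguration (iOS 16).
func configureMap(_ mapView: MKMapView) {
    // Standard: a fully graphics-based map.
    let configuration = MKStandardMapConfiguration(elevationStyle: .realistic)
    configuration.pointOfInterestFilter = .includingAll
    mapView.preferredConfiguration = configuration

    // Alternatives: MKImageryMapConfiguration() for satellite-style
    // imagery, MKHybridMapConfiguration() for imagery plus road labels
    // and points of interest.

    // Opt in to the new selectable Apple Maps points of interest.
    mapView.selectableMapFeatures = [.pointsOfInterest]
}
```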
Providing current weather information helps people stay up to date, safe, and prepared. This year, we announced the all-new Apple Weather Service. It's available as part of the Apple Developer Program for both apps in the App Store and those published as custom apps. The Apple Weather Service powers WeatherKit, a new native framework available on all Apple platforms. And because a consistent experience is important, we're also providing a REST API that can be used to bring the Apple Weather Service to any platform. It uses high-resolution weather models -- machine learning and prediction algorithms -- to give you hyper-local weather forecasts around the globe. And we will always respect your users' privacy. Location is only used for weather forecasts. It's never associated with any personally identifying information, and no user data is ever shared or sold. With WeatherKit, you can get current weather conditions, 10-day hourly forecasts for temperature, precipitation, wind, UV index, and more. Minute-by-minute precipitation for the next hour and severe weather alerts are available for select regions. You can even get historical weather data to identify trends. Imagine integrating WeatherKit into a field service or aviation app where the weather conditions make a big difference to the work environment. We introduced the all-new Apple Weather Service which powers the new native WeatherKit framework. There's also a REST API to bring WeatherKit to any platform. All of this helping you deliver hyperlocal, up-to-the-minute forecasts to your apps. To find out more, watch the "Meet WeatherKit" session. And that was Weather. Push-to-talk apps have many uses in fields where rapid communication is essential, such as healthcare and emergency services. In these fields, communication needs to be real-time and response times are critical. 
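Fetching those forecasts with the native WeatherKit framework is a couple of async calls. A minimal sketch, assuming the WeatherKit capability is enabled for your app ID; the coordinates are just an example:

```swift
import WeatherKit
import CoreLocation

// Fetch current conditions and a daily forecast (iOS 16, WeatherKit).
func fetchWeather() async throws {
    let service = WeatherService.shared
    let cupertino = CLLocation(latitude: 37.3349, longitude: -122.0090)

    let weather = try await service.weather(for: cupertino)
    print("Now: \(weather.currentWeather.temperature), \(weather.currentWeather.condition)")

    // Ten-day forecast with high temperature and precipitation chance.
    for day in weather.dailyForecast.prefix(10) {
        print(day.date, day.highTemperature, day.precipitationChance)
    }
}
```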
In iOS 16, we've introduced the Push to Talk framework to help you create this experience, and I think this will be a huge addition for many of our enterprise developers. The Push to Talk framework enables a new class of audio communication apps on iOS that provide a walkie-talkie-style system experience. The Push to Talk framework provides developers with APIs to leverage a system UI that your users can access from anywhere without having to directly launch your app. The system UI allows the user to quickly activate an audio transmission which will launch your app in the background. It will then record and stream audio to your back-end server. The new framework eliminates special entitlements and workarounds that, in the past, have kept walkie-talkie apps running continuously in the background. With the Push to Talk framework, the system will wake your app only when needed to preserve battery life. Push-to-talk features have a long history of use by first responders and law enforcement. We've also seen increasing use of push-to-talk solutions in healthcare as well as retail and warehouse environments, where they're used to support curbside pickup or pick-and-pack use cases. The Push to Talk framework provides you with a way to create walkie-talkie-style apps, utilizing a familiar system interface available from anywhere on the device. And it's designed to be compatible with your existing end-to-end communication solutions and your back-end infrastructure. To learn more about implementing Push to Talk in your apps, watch this session from WWDC. So that's Push to Talk. This year, we made several improvements to CarPlay that we think are great additions for our enterprise customers. Navigation apps can now display maps and turn-by-turn instructions in a second location, such as the instrument cluster located directly in front of the driver. This is a great feature for use cases in field service, sales, delivery, and transit. 
All CarPlay apps require the Apple Developer Program and a CarPlay app entitlement that matches your app type. Previously, CarPlay entitlements were limited to these app types: navigation, audio, communication, EV charging, parking, and quick food ordering. This year, we've added two new types: fueling and driving task apps. Fueling apps might help a user start up a gas pump, and driving task apps can enable a wide variety of simple tasks that a user might need to do while driving. These apps can be used to help control car accessories, like a trailer controller; or help with tasks at the beginning or end of a drive, like tracking mileage for expense reporting. We also made testing CarPlay apps easier with the new CarPlay Simulator. CarPlay Simulator is a standalone Mac application that replicates a CarPlay environment. Once installed and connected to an iPhone, CarPlay will start on iPhone and run just the same as if you had it connected to a real car. And because your app is running on an actual iPhone, you have access to complete iPhone functionality, allowing you to test any scenario necessary. We've created the ability to have turn-by-turn directions in a second location, added two new app types as well as the CarPlay Simulator. Check out this session if you're interested in learning more about using CarPlay with your apps. So far we've looked at some improvements to frameworks used throughout the system. Now, I want to tell you about the updates we've made to our UI frameworks. Something I'm really excited about this year is Swift Charts, a powerful and flexible new framework for transforming your data into informative visualizations. I think you're going to really love it. It uses the same concise syntax that you're already familiar with in SwiftUI, allowing for effective and customizable charts built on minimal code. 
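As a taste of that concise syntax, here's a minimal Swift Charts sketch; the sales data and view are hypothetical examples for a reporting dashboard:

```swift
import SwiftUI
import Charts

// Hypothetical daily-sales data for a reporting dashboard.
struct DailySales: Identifiable {
    let id = UUID()
    let day: String
    let units: Int
}

struct SalesChart: View {
    let sales = [
        DailySales(day: "Mon", units: 120),
        DailySales(day: "Tue", units: 95),
        DailySales(day: "Wed", units: 160)
    ]

    var body: some View {
        // Scales and axes are generated automatically from the data
        // and adjust when the data changes.
        Chart(sales) { item in
            BarMark(
                x: .value("Day", item.day),
                y: .value("Units Sold", item.units)
            )
        }
    }
}
```

Swapping `BarMark` for `LineMark` or `PointMark` gives you a line chart or scatter plot with the same data.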
Swift Charts has built-in support for localization and accessibility, and just like SwiftUI, Swift Charts offers support for all Apple platforms, so you can make your charts truly available to everyone. Swift Charts allows for easy communication of patterns or trends in your data. Line charts, bar charts, and scatter plots are all easy to create. And when you create a chart, it automatically generates scales and axes that fit your data, adjusting them as needed when your data changes. Swift Charts is an easy way to bring charts like these to apps that require data reporting or visualization. Think of the beautiful ways you could bring Swift Charts to your enterprise apps, reporting dashboards, sales trends, real-time metrics, patient health data, and anything else you could imagine. Swift Charts can help you bring it to life. With iPadOS 16, you can now build some amazing new features into your iPad apps, making them truly desktop class. We've added an all-new Find and Replace UI. Search now takes up less space and is shown inline in the navigation bar on iPadOS. It can also collapse to a button. Search suggestions appear when the search is activated and can be updated as the search query changes. And we've brought desktop-class editing to the iPad. Edit menus now have alternate presentations based on the input used. For touch interactions, you'll see the familiar menu you're used to but with a new paging behavior. When using the Magic Keyboard or Trackpad, you'll see a context menu presented. UIKit is formalizing the existing navigation bar styles and introducing two new ones with a denser and more customizable layout. Navigator apps have a familiar push-pop navigation model. This is generally appropriate for apps that display hierarchical data, like Settings. Browsers, like Safari or Files, are ideal for viewing and navigating back and forth between multiple documents or folder structures. 
And editors are great for focused viewing or editing of individual documents. Just as we're getting more specific with the navigation bar styles on the iPad, we're introducing new SwiftUI APIs for more specific navigation styles as well. Until now, you used navigation views for navigation-based apps. This year, we're moving away from navigation views and have introduced two more specific options to handle navigation. Let's start with navigation split view, which is perfect for multicolumn apps. Navigation split view automatically adapts to a single-column stack on iPhone. There's a two-column layout as with the App Store Connect app. There's even a three-column layout like Notes. Or the Mail app, which shows a three-column layout on iPad and Mac. The single-column layout also displays in Slide Over on iPad and even adapts for Apple Watch. Navigation stacks represent a push-pop interface for navigating forward and backward in a stack of views. We see this style of navigation in many places, such as the Settings app or Find My on Apple Watch. You can mix navigation stacks with split views for even more ways to create your interface. The new navigation stack also allows for an easy implementation of deep linking and programmatic navigation. Along with these updates, navigation links have changed with iOS 16. Navigation links are used to present other views in SwiftUI. In the Settings app, as you move through the different options and hierarchies, navigation links tell your app which view to present next. Previously, navigation links were configured by providing a title and a destination view to present, but now they can also be triggered based on a presented data value. Finally, navigation stacks also keep track of a path. Paths represent all the data on the stack as you move between views. When the stack is just showing its root view, the path is empty. As views are added to the stack, their values are appended to the path. 
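Here's how that path-driven pattern looks in code. The job-site model is a hypothetical example; any `Hashable` value can drive the navigation:

```swift
import SwiftUI

// Data-driven navigation with a path binding (iOS 16).
struct JobSiteList: View {
    @State private var path: [String] = []   // empty path = root view
    let sites = ["Warehouse A", "Store 12", "Depot North"]

    var body: some View {
        NavigationStack(path: $path) {
            List(sites, id: \.self) { site in
                // Tapping appends the link's value to the path.
                NavigationLink(site, value: site)
            }
            .navigationDestination(for: String.self) { site in
                Text("Details for \(site)")
            }
        }
    }

    // Programmatic deep link: mutating the path pushes the views.
    func openDeepLink() {
        path = ["Depot North"]
    }
}
```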
Navigation stacks use a binding to this path to easily enable data-driven programmatic navigation. And for developers using UIDevice.name, I've got some important changes. Before iOS 16, the UIDevice API allowed you to access the user-assigned device name. To better safeguard user data, the UIDevice.name API will now return only the model of the device instead of the user-assigned name. We realize that there are some circumstances where apps will still need the device name. Perhaps you have a document-management app where the user needs to see on which of their devices they last edited a file. You can request an entitlement to still access the user-assigned device name as you always have. If you have an enterprise or custom app that requires UIDevice.name, please describe your use case in detail when submitting the entitlement request. Be sure to include whether or not it's a shared device. For a full list of eligibility criteria, check out the documentation for the user-assigned device name entitlement. And that's UI frameworks. We introduced Swift Charts for beautiful data visualizations and added several new features to make the iPad truly desktop class. We've deprecated navigation views in SwiftUI in favor of using navigation stack or navigation split view. To make sure you're up to date on all the latest details, go to the Developer app and get started with these sessions. We just saw UI frameworks that help you build great interfaces in code. Finally, let's take a look at some ways we've improved the tools you use to design those interfaces. But first, I'd like to take a moment to talk about why design matters and not just for consumer apps. Enterprise apps that are well-designed can make a huge impact on your business. Creating a consistent, familiar interface that your users expect can make them more productive, more effective employees. 
And that saves the business money on things like training and support costs, all while ensuring a high level of adoption for your app and efficiency for your employee workflows. Because design is such an important component of building great apps, we created the Human Interface Guidelines. The Human Interface Guidelines has long been a comprehensive resource to help you create great experiences across Apple platforms. Now it's been fully redesigned and refreshed to meet your current needs. The HIG has merged its platform-specific guidance into a unified document. This makes it simpler to explore common design approaches while still preserving relevant details about each platform. It's easier to navigate and is now searchable. Coming later this year, we're also adding change logs for the entire set of guidelines. SF Symbols is a large library of iconography that allows you to bring consistent icons to your apps. With SF Symbols 4, we've added 700 new symbols. So now, there are more than 4000 symbols to choose from, and all of these are now available directly in Xcode or from the SF Symbols app. Rendering modes give you control over how color is applied to a symbol. SF Symbols supports four rendering modes: monochrome, hierarchical, palette, and multicolor. These rendering modes allow for a wide variety of ways to display symbols in your apps, choosing what works best for a given context. Previously, if no rendering mode was specified, the default would always be monochrome. With SF Symbols 4, we've added a new rendering mode called "automatic." With automatic rendering, individual symbols can identify one of the four modes as their own preferred rendering mode. This allows each symbol to best highlight its unique features. For example, here, the cloud, sun, and rain symbol as well as the hexagon grid prefer monochrome, while the SharePlay and iPhone radio waves icons prefer hierarchical. 
Automatic rendering will now be the default mode for all symbols and is preferred unless another mode is explicitly requested for a certain context. To help visualize how symbols look in different rendering modes and configurations, the SF Symbols app has gained a new preview area located in the right sidebar. SF Symbols 4 also adds support for variable symbols. You can find these updated symbols by selecting the variable collection in the left sidebar. Variable symbols allow apps to display different layers of a single symbol based on a value from zero to one. An app can now use a speaker symbol to represent current volume. At a value of zero, the speaker waves are faded out; and as the value increases up to one, the speaker waves progressively fill in, indicating the change in volume level as it happens. And if you're making custom symbols, you'll want to check out the new unified layer annotation, also shown in the preview area. This allows you to have a shared-layer structure across rendering modes, making annotation faster and easier. Continuing in design. This year, we've added three new width styles to the San Francisco font family: condensed, compressed, and expanded. This means you have more flexibility when designing your typography. You can see examples of these new styles in Photo Memories and Apple News. We've also added SF Arabic and SF Arabic Rounded to the system fonts, making Arabic typography on Apple platforms modern, clear, and refined. We've made it a lot easier to add an icon for your iOS app in Xcode 14. In Xcode 13 and before, you had to provide images of all sizes. We're now only requiring a single large-format image for iOS. The large app icon will be scaled for display on all iPhone, iPad, and Apple Watch devices. If you want to add custom images for smaller sizes, you can still add specific icon images just for those. 
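The variable symbols described above can be driven from any value between zero and one. A minimal SwiftUI sketch of the volume example; the view and property names are hypothetical:

```swift
import SwiftUI

// Variable symbols render their layers based on a 0...1 value (SF Symbols 4).
struct VolumeIndicator: View {
    var volume: Double   // 0.0 ... 1.0

    var body: some View {
        // Speaker waves fade out near 0 and fill in progressively
        // as the value approaches 1.
        Image(systemName: "speaker.wave.3", variableValue: volume)
    }
}
```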
This year, we completely revamped the Human Interface Guidelines, brought you new icons and ways to render them with SF Symbols 4, font additions, and the ability to use a single image for app icons. To find out more, check out these Dub Dub sessions. Well, that was a lot, but we made it! Those were the highlights from this year's announcements that I wanted you all to know about. And I think there were a lot of great things here that you can use to make your enterprise apps even better. To learn more about everything I've covered, you can check out all of our sessions in the Developer app. Start using the new SDKs today. And please provide feedback as you explore the new features. I can't wait to see how you'll use these new tools to improve your users' experience and make their jobs faster and easier to accomplish. Thanks for listening!