iOS 11 was announced in early June 2017 at Apple's Worldwide Developers Conference (WWDC), and with it came a lot of changes. There were design changes, app updates, new features, and more, but what's most important for developers to know are the new iOS 11 developer features.
iOS 11 is arguably one of the biggest operating system releases in Apple history. Not only did it come packed with tons of updates, but it also included a bunch of new features, as well as APIs, frameworks, and integrations that can help developers build better and more intelligent apps.
From augmented reality to machine learning and more, here are the new iOS 11 developer features that you need to know.
New iOS 11 Features for Developers
ARKit
One of the biggest announcements for iOS 11 was ARKit, a new framework from Apple that lets you easily create and incorporate augmented reality into your apps and games. ARKit runs on the Apple A9, A10, and A11 processors, whose powerful capabilities provide a very smooth and impressive experience.
It allows an iOS device to understand the environment around you, identifying real-world surfaces, planes, and objects onto which you can place virtual content, then manipulate and track it as the device moves.
ARKit integrates easily with Apple's 2D and 3D rendering frameworks, SpriteKit and SceneKit (setting up an AR-enabled SceneKit view takes only a few lines of code), as well as with third-party tools like Unity and Unreal Engine, allowing you to create unparalleled augmented reality experiences for mobile apps and games.
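The SceneKit setup really is only a few lines. Here is a minimal sketch of an AR view controller (the class name `ARViewController` is illustrative):

```swift
import UIKit
import ARKit

class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking follows the device's motion and can detect
        // horizontal planes to anchor virtual content to.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

From here, adding an `SCNNode` to `sceneView.scene.rootNode` places a virtual object in the tracked scene.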
For more augmented reality tools, check out our Top 11 Mobile Augmented Reality Tools In 2018.
Core ML
Another game-changing framework introduced in iOS 11 is Core ML, Apple's new machine learning framework, which helps speed up app tasks involving artificial intelligence, such as image recognition. Core ML already powers many Apple features, including Camera, Siri, and QuickType, and is now available to all developers through an easy integration that takes just a few lines of code.
Core ML supports Vision's high-performance image analysis and computer vision techniques, which allow you to identify faces, detect features, and classify scenes in images and video. It also works with Foundation to perform natural language analysis of text, and with GameplayKit to evaluate learned decision trees and automatically build predictive models from the data you provide.
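As a sketch of how little code the Vision integration takes: the snippet below classifies an image with a bundled Core ML model. `MyClassifier` is a placeholder for whatever .mlmodel file you add to your project (Xcode generates a Swift class for it automatically).

```swift
import Vision
import CoreML

// Classify a CGImage using a Core ML model wrapped for Vision.
// "MyClassifier" is a hypothetical model class generated by Xcode.
func classify(_ image: CGImage) throws {
    let model = try VNCoreMLModel(for: MyClassifier().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let best = results.first else { return }
        print("Top label: \(best.identifier), confidence: \(best.confidence)")
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
}
```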
New App Store
Nearly ten years after the App Store launched, Apple has completely redesigned it to improve the app discovery experience for users. The redesign introduced many changes that affect mobile app and game developers, including:
Limiting allowed app titles to 30 characters.
Adding another 30-character subtitle to apps.
Allowing app descriptions to be changed only when you submit a new version or build.
Adding a 170-character promotional text to an app which you can change at any time.
Allowing developers to feature up to three app previews, each up to 30-seconds long.
Autoplaying app previews with muted audio when a user views a product page.
The ability to customize in-app purchases by adding display names, descriptions, and promotional images for each.
The ability to promote up to 20 of them on a product page.
Splitting in-app purchases and subscriptions into two separate sections.
Accumulated ratings across all versions.
The ability to reset an app's ratings and reviews.
Disallowing custom review prompts; apps must request reviews through Apple's standardized API.
Limiting review prompts to three times per year.
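The standardized review API is a single call; the system, not your app, decides whether the prompt actually appears, enforcing the three-per-year cap:

```swift
import StoreKit

// Ask the system to show the rating prompt at an appropriate moment,
// e.g. after the user completes a meaningful task. iOS may silently
// ignore the request if the user has been prompted recently.
func askForReviewIfAppropriate() {
    SKStoreReviewController.requestReview()
}
```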
For a more in-depth dive into the iOS 11 app store changes, make sure to check out our post: How Will The iOS 11 App Store Affect Your App?
Depth Map API
Apple first generated depth maps when it introduced Portrait Mode in the Camera app; a depth map describes, for each pixel, the distance in meters to the corresponding object in the scene. Now, thanks to Apple's Depth Map API, developers can access the depth information captured by the device's dual cameras, which can be used to create better experiences and smarter features for photo editing apps.
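A sketch of how an app opts in to depth capture, assuming a capture session already configured with a dual-camera input:

```swift
import AVFoundation

let photoOutput = AVCapturePhotoOutput()

// Request that depth data be delivered alongside the photo.
// Only supported on devices with a depth-capable camera system.
func capturePhotoWithDepth(delegate: AVCapturePhotoCaptureDelegate) {
    guard photoOutput.isDepthDataDeliverySupported else { return }
    photoOutput.isDepthDataDeliveryEnabled = true

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = true
    photoOutput.capturePhoto(with: settings, delegate: delegate)
    // In the delegate's photoOutput(_:didFinishProcessingPhoto:error:),
    // photo.depthData is an AVDepthData whose pixel buffer holds the
    // per-pixel depth/disparity map.
}
```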
Metal 2
Metal is Apple's low-level, low-overhead, hardware-accelerated 3D graphics and compute shader API, introduced back in iOS 8, which combines functionality similar to OpenGL and OpenCL in a single API. In iOS 11, Apple announced its second version, Metal 2.
Metal 2 provides near-direct access to the GPU, allowing you to maximize the graphics and compute potential of your mobile apps and games. It gives the GPU more control of the rendering pipeline to further boost performance, letting you build on a low-overhead architecture with precompiled shaders, fine-grained resource control, and multithreading support.
Metal 2 also complements Core ML perfectly, as it provides deep support for GPU-accelerated machine learning, along with developer tools to help you debug, optimize, and deploy your apps.
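The entry point to all of this is small. A minimal sketch of the boilerplate every Metal app starts from, before attaching render or compute encoders:

```swift
import Metal

// Grab the GPU, create a command queue, and submit an (empty) command
// buffer. Real apps encode render or compute passes into the buffer
// before committing it.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let commandBuffer = queue.makeCommandBuffer() else {
    fatalError("Metal is not supported on this device")
}
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```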
SiriKit
Siri was released back in 2011, and with iOS 11 you can now connect your app to Siri using SiriKit.
SiriKit allows you to handle user requests originating from Siri or Maps by defining the types of requests, known as intents, that users can make. Related intents are grouped into domains to make it clear which intents your app might support. For example, in the Lists and Notes domain, users can create to-do lists, mark to-do items as complete, or ask Siri to create or modify notes in your app.
You can also build an extension that allows your app to communicate with Siri, even when the app isn’t running. Siri will then handle all of the interactions with the user, including natural language and voice recognition, and work with the extension to handle requests and perform the necessary actions.
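A sketch of such an extension handler for the Lists and Notes domain (the class name is illustrative; a shipping handler would also implement the resolve/confirm steps and populate the response's created note):

```swift
import Intents

// Handles "create a note" requests routed to the app by Siri.
class CreateNoteHandler: NSObject, INCreateNoteIntentHandling {
    func handle(intent: INCreateNoteIntent,
                completion: @escaping (INCreateNoteIntentResponse) -> Void) {
        // Persist the note (intent.title, intent.content) in the app's
        // own store here, then report success back to Siri.
        completion(INCreateNoteIntentResponse(code: .success, userActivity: nil))
    }
}
```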
HomeKit
Apple now allows developers to add HomeKit support to their products, which lets you control and communicate with home automation devices straight from your mobile app. You can configure devices and define actions for them to perform based on the data they provide. Users can organize a home into logical groupings of accessories, services, and commands, then invoke those actions by voice through Siri. This applies to actions such as turning on the lights or controlling the air conditioning in a smart home.
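For instance, switching on every lightbulb in a home comes down to walking the accessory/service hierarchy and writing one characteristic. A sketch, assuming the user has granted HomeKit permission and configured their accessories:

```swift
import HomeKit

// Turn on every lightbulb service in the given home.
func turnOnLights(in home: HMHome) {
    for accessory in home.accessories {
        for service in accessory.services
        where service.serviceType == HMServiceTypeLightbulb {
            let power = service.characteristics.first {
                $0.characteristicType == HMCharacteristicTypePowerState
            }
            power?.writeValue(true) { error in
                if let error = error {
                    print("Failed to switch light: \(error)")
                }
            }
        }
    }
}
```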
Drag and Drop
iOS 11 introduces system-wide drag and drop, taking advantage of the power of Multi-Touch, letting users drag content (text, images, contacts, reminders, maps, etc.) across different apps and from one mobile app to another.
With an easy-to-use yet powerful API, you can add drag and drop support to your views, which can be used in the Home screen, Dock, Reminders, Calendar, Messages, Spotlight, Files, Safari, Contacts, iBooks, News, Notes, Photos, Maps, Keynote, Pages, and Numbers.
However, full cross-app drag and drop is only available on iPad; on iPhone, drag and drop works only within a single app.
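Adding drag support to a view is a matter of attaching a `UIDragInteraction` and vending drag items from its delegate. A minimal sketch (the controller name and image asset are illustrative):

```swift
import UIKit

class DraggableImageController: UIViewController, UIDragInteractionDelegate {
    let imageView = UIImageView(image: UIImage(named: "photo"))

    override func viewDidLoad() {
        super.viewDidLoad()
        // Image views don't receive touches by default.
        imageView.isUserInteractionEnabled = true
        imageView.addInteraction(UIDragInteraction(delegate: self))
        view.addSubview(imageView)
    }

    // Called when the user begins a drag from the image view.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = imageView.image else { return [] }
        // NSItemProvider handles serializing the image for the receiver.
        return [UIDragItem(itemProvider: NSItemProvider(object: image))]
    }
}
```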
MusicKit
MusicKit allows developers to add Apple Music features to their apps, giving them access to millions of songs. Once the user grants permission to access their Apple Music account, you can play songs and albums, create playlists, find music recommendations, and more, right from within your app.
Even if the user isn't an Apple Music member, MusicKit lets you prompt them to create an account, offering them a free trial directly from your app.
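The permission and membership checks look roughly like this sketch, using the StoreKit cloud service APIs:

```swift
import StoreKit

// Ask for Apple Music access, then check what the account can do.
SKCloudServiceController.requestAuthorization { status in
    guard status == .authorized else { return }
    SKCloudServiceController().requestCapabilities { capabilities, error in
        if capabilities.contains(.musicCatalogPlayback) {
            // Subscriber: safe to play catalog songs, build playlists, etc.
        } else if capabilities.contains(.musicCatalogSubscriptionEligible) {
            // Not a subscriber: present the Apple Music sign-up/trial offer
            // (SKCloudServiceSetupViewController) from your app.
        }
    }
}
```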
Files App
In iOS 11, Apple removed the iCloud Drive app and introduced the more comprehensive Files app in its place. Files lets users browse files and content whether they're stored locally, on other iOS devices, in iCloud Drive, or in other services like Dropbox, Google Drive, and more.
You can use the integration, especially if your app is primarily focused on storing and managing user documents, to let users easily browse, access, and share their files.
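One simple way into that integration is the system document picker, which surfaces the same providers as Files. A sketch (the controller name is illustrative):

```swift
import UIKit

class DocumentsViewController: UIViewController, UIDocumentPickerDelegate {
    // Present a picker browsing iCloud Drive and third-party providers.
    func presentPicker() {
        let picker = UIDocumentPickerViewController(
            documentTypes: ["public.content"], in: .open)
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController,
                        didPickDocumentsAt urls: [URL]) {
        // Work with the chosen files here. For .open mode, wrap file
        // access in startAccessingSecurityScopedResource() /
        // stopAccessingSecurityScopedResource().
    }
}
```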
AirPlay 2
AirPlay 2 now allows you to wirelessly send content from any Apple device to AirPlay-enabled displays and audio systems. This lets your app's users extend their experience to high-definition devices such as AirPlay-enabled sound systems or an Apple TV.
Core NFC
For the first time, developers get access to the NFC chip in Apple's devices, and its use now extends well beyond payments. Core NFC allows developers to detect NFC tags and read messages containing NDEF data, giving users more information about their physical environment and the real-world objects in it.
Possible applications include letting users view more information about products while shopping through your app, or syncing data with their gym equipment.
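Reading a tag is a short delegate-based flow. A sketch, assuming the app has the NFC tag-reading capability enabled in its entitlements:

```swift
import CoreNFC

// Reads NDEF messages from nearby NFC tags and prints their payloads.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    func beginScanning() {
        let session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                           invalidateAfterFirstRead: true)
        session.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                let text = String(data: record.payload, encoding: .utf8)
                print(text ?? "<binary payload>")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        print("Session ended: \(error)")
    }
}
```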
Business Chat
iOS 11's Business Chat allows businesses and developers to connect with their users, clients, and customers through the Messages app. By integrating Business Chat, businesses can answer questions, schedule appointments, accept payments with Apple Pay, and more.
It allows users to easily find your business and start conversations from Safari, Maps, Spotlight, and Siri, and it integrates with popular existing platforms, including LivePerson, Salesforce, Nuance, and Genesys.