How Apple Intelligence can elevate your App

During Apple’s Worldwide Developers Conference (WWDC) 2024, developers and customers were introduced to Apple’s version of generative AI, called Apple Intelligence. While the underlying technology of Apple Intelligence is not that different from other AI solutions utilizing Large Language Models (LLMs), Apple also introduced a very deep integration into its operating systems and a set of sophisticated privacy protection mechanisms. In this article we want to give a brief overview of what Apple Intelligence means for your apps and how it differs from other AI solutions.

Marcos Ramos Rubio — Senior iOS Developer

October 30, 2024

Siri and App Intents

Until now, apps have been able to use App Intents to make content and actions discoverable to the system so that they can be used in Spotlight, Shortcuts, Siri, and more. Users had to build their own shortcuts or know exactly which phrases to use with Siri. With Apple Intelligence, Apple is giving App Intents powerful new features.
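To recap that starting point, here is a minimal sketch of a classic App Intent exposed as an App Shortcut. The intent and the "last photo" scenario are hypothetical examples; only the AppIntents types and the AppShortcut initializer are real API.

```swift
import AppIntents

// Hypothetical intent for an "open the last photo" action.
struct OpenLastPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Last Photo"
    static var description = IntentDescription("Opens the photo you viewed most recently.")
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // App-specific navigation to the most recent photo would go here.
        return .result()
    }
}

// Exposes the intent to Shortcuts and Siri with fixed phrases:
// the "know exactly which keywords to use" experience prior to Apple Intelligence.
struct PhotoAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenLastPhotoIntent(),
            phrases: ["Open my last photo in \(.applicationName)"],
            shortTitle: "Last Photo",
            systemImageName: "photo"
        )
    }
}
```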

To integrate an app with Siri and Apple Intelligence, developers need to use assistant schemas. These schemas define properties and ensure App Intents follow the necessary protocols. This allows App Intents to work seamlessly with Apple’s pre-trained models, making them understandable to Apple Intelligence. If the user tells Siri to "open the last photo", for example, Apple Intelligence will be able to understand that the user wants to open a photo and will bring up the app in which the user last looked at a photo.
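To sketch what adopting an assistant schema looks like, the example below uses the in-app search schema (covered in the list further down), since it is the most compact one and needs no custom entity type. The protocol and property names reflect what Apple presented at WWDC 2024 for iOS 18 and should be verified against the current SDK; the SearchNavigator helper is a hypothetical stand-in for an app's own routing layer.

```swift
import AppIntents

// Hypothetical stand-in for the app's own search/navigation code.
enum SearchNavigator {
    static func showResults(for term: String) {
        // Open the in-app search screen and run the query.
    }
}

// Adopting an assistant schema: the macro verifies at compile time that the
// intent has the shape Apple Intelligence expects, so no title or parameter
// descriptions have to be declared by hand.
@AssistantIntent(schema: .system.search)
struct SearchLibraryIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]

    let criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Apple Intelligence fills `criteria` with the term it extracted
        // from the user's request.
        SearchNavigator.showResults(for: criteria.term)
        return .result()
    }
}
```

Once such an intent ships with the app, a spoken request that mentions searching in your app can be routed to it without the user having to set up a shortcut first.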

In a recent newsletter from Bloomberg, additional details about Apple’s plans for 2025 and Apple Intelligence came to light. Apparently it will not only be possible to trigger certain app functions via Siri, but also to automate all kinds of processes just by talking to your AI assistant. This opens up a whole new way of interacting with your smart home. In the article, Mark Gurman describes it as “home automation on steroids”.

If you want your product to be part of those automations and offer unique functions, an iOS app with the necessary App Intents is indispensable. We recommend thinking about these user journeys now so you will be ready for 2025.

Current Schemas

In the current iOS 18.1 beta there are only three assistant schemas available, with more being added later on.

  1. Email: The email schema will allow developers to create, update, delete, and archive emails or drafts as well as send, forward, or reply to emails.
  2. Photo and Video: The photo schema has actions to create, delete, update, duplicate, and open albums, photos, or videos, as well as to perform some edits, such as changing depth, saturation, and warmth values, cropping, or adding filters.
  3. In-app search: The in-app search schema offers only one action for now. It allows Apple Intelligence to use the search inside an app to find relevant data (this is the schema used in the sketch above).

Schemas coming soon

Apple only gave us a sneak peek at the names of the schemas. Further details will follow. This is the provided list:

  • Books

  • Browsers

  • Cameras

  • Document readers

  • File management

  • Journals

  • Presentations

  • Spreadsheets

  • Whiteboards

  • Word processors

As mentioned above, we expect even more to be added to the list over the course of the year. Home automation will be an important part of Apple Intelligence’s success.

Writing Tools

Another, more direct way to enhance the user experience in apps is to support Writing Tools. Writing Tools is a new suite of features that is available in text views and helps polish any text. Users can let Apple Intelligence:

  • Rewrite text to make it feel more friendly, professional, or concise
  • Summarize written text
  • Get key points from written text
  • Convert text into a list or table

Developers can take advantage of the power of Writing Tools out of the box simply by using native UI components such as UITextView or TextEditor.
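As a minimal SwiftUI sketch of that out-of-the-box behavior (the view itself is hypothetical), nothing in the following code is specific to Writing Tools; the system adds the feature to the standard TextEditor on devices with Apple Intelligence enabled.

```swift
import SwiftUI

// A plain TextEditor already participates in Writing Tools on devices with
// Apple Intelligence enabled; no additional code is required.
struct NoteEditor: View {
    @State private var text = ""

    var body: some View {
        TextEditor(text: $text)
            .padding()
    }
}
```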

Developers can define how their app reacts when the user makes use of Writing Tools by implementing the appropriate methods. If they do not want to offer these new capabilities in some contexts of their app, it is possible to opt out completely or to offer just a limited panel experience.
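On the UIKit side, this could look roughly like the sketch below, assuming an iOS 18 deployment target. The writingToolsBehavior values and the two delegate callbacks are the iOS 18 additions as presented at WWDC 2024 and should be checked against the current SDK; the view controller and the syncing hooks are hypothetical.

```swift
import UIKit

final class EditorViewController: UIViewController, UITextViewDelegate {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        textView.delegate = self
        view.addSubview(textView)

        // .limited offers the reduced panel experience; .none opts out of
        // Writing Tools entirely, .complete enables the full experience.
        textView.writingToolsBehavior = .limited
    }

    // Called when the user starts a Writing Tools session, for example to
    // pause autosave or collaboration syncing while the text is rewritten.
    func textViewWritingToolsWillBegin(_ textView: UITextView) {
        // pauseSyncing()  // hypothetical app-specific hook
    }

    // Called when the session ends and the text is final again.
    func textViewWritingToolsDidEnd(_ textView: UITextView) {
        // resumeSyncing()  // hypothetical app-specific hook
    }
}
```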

Custom text views can also adopt the new features introduced with Apple Intelligence, but they need some extra treatment.

Conclusion

While Apple Intelligence does not open up a vast number of new features to be utilized inside an app, users will expect apps to work seamlessly with Apple Intelligence. Using native components and adapting custom UI to support Apple Intelligence will be crucial as adoption of Apple’s new assistant grows. If ChatGPT’s Advanced Voice Mode is any indication, Apple Intelligence will permanently change how we control and communicate with Apple devices within the next two years. Some of the features mentioned above will roll out with the iOS 18.1 update in the US. It is unclear when all features will be available and when exactly users in the EU will be able to get their hands on Apple Intelligence. It will happen though!

Now is the time to think about user journeys, use cases and business cases for your products. Be AI-ready for 2025!

TL;DR
  • Apple Intelligence enables users to perform actions in third-party apps using Siri

  • Apps can be associated with certain topics (called schemas), and Apple Intelligence will suggest your app or execute commands based on user requests

  • Apps can benefit from integrated tools like Rewrite, Proofread, and Summarize for text

  • With features like Image Playground and the Clean Up tool, users can create and enhance images directly within apps

  • Apple Intelligence ensures user privacy with on-device processing and Private Cloud Compute
