Apple Asks Developers to Start Optimizing Apps for HomePod Using SiriKit in iOS 11.2

iOS 11.2, released this morning, introduces SiriKit support for the HomePod, according to Apple. With SiriKit for HomePod now available, Apple is asking developers to make sure SiriKit-compatible apps are optimized for HomePod ahead of the device's release.

SiriKit is designed to allow iOS and watchOS apps to work with Siri, so users can complete tasks with Siri voice commands. SiriKit is available for a wide range of apps on those two platforms, but its availability is slightly more limited when it comes to HomePod.


Third-party apps that use SiriKit Messaging, Lists, and Notes are compatible with the HomePod. Siri will recognize voice requests given to the HomePod, with those requests carried out on a linked iOS device. So, for example, users can ask HomePod to send a message to a friend, add an item to a list, or create a new note. Sample HomePod requests:

- Send a text to Eric using WhatsApp
- In WeChat, tell Eric I'll be late
- Add chocolate and bananas to my list in Things
- Create a note that says "hello" in Evernote

Developers can test the voice-only experience of their apps using Siri through headphones connected to an iOS device with the iOS 11.2 beta.

Apple plans to release the HomePod this December, but a specific launch date for the speaker has not yet been provided. When it becomes available, the HomePod will cost $349.

Related Roundups: iOS 11, HomePod
Tag: Siri

Discuss this article in our forums

Apple Says ‘Hey Siri’ Detection Briefly Becomes Extra Sensitive If Your First Try Doesn’t Work

A new entry in Apple's Machine Learning Journal provides a closer look at how hardware, software, and internet services work together to power the hands-free "Hey Siri" feature on the latest iPhone and iPad Pro models.


Specifically, a very small speech recognizer built into the embedded motion coprocessor runs all the time and listens for "Hey Siri." When just those two words are detected, Siri parses any subsequent speech as a command or query.

The detector uses a Deep Neural Network to convert the acoustic pattern of a user's voice into a probability distribution. It then uses a temporal integration process to compute a confidence score that the phrase uttered was "Hey Siri."
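As a rough illustration, the temporal integration step can be thought of as smoothing the per-frame DNN outputs into one running score. The sketch below uses a plain moving average as a stand-in; Apple's actual integration scheme and window length are not published.

```python
from collections import deque

WINDOW_FRAMES = 20  # illustrative window length, not Apple's actual value

class TemporalIntegrator:
    """Smooth per-frame DNN probabilities into a single confidence score.

    A plain moving average over a sliding window is used here as a
    stand-in for Apple's (unpublished) integration process.
    """
    def __init__(self, window=WINDOW_FRAMES):
        self.frames = deque(maxlen=window)

    def update(self, frame_probability):
        """Add one frame's probability and return the current confidence."""
        self.frames.append(frame_probability)
        return sum(self.frames) / len(self.frames)
```

A sustained run of high per-frame probabilities drives the confidence up, while a single noisy frame has little effect, which is the point of integrating over time rather than trusting any one frame.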

If the score is high enough, Siri wakes up and proceeds to complete the command or answer the query automatically.

If the score exceeds Apple's lower threshold but not the upper threshold, however, the device enters a more sensitive state for a few seconds, so that Siri is much more likely to be invoked if the user simply repeats the phrase, even without exaggerating.

"This second-chance mechanism improves the usability of the system significantly, without increasing the false alarm rate too much because it is only in this extra-sensitive state for a short time," said Apple.
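Putting the two thresholds together, the second-chance mechanism can be sketched as follows. The threshold values and the length of the sensitive window are illustrative assumptions, not Apple's published parameters.

```python
import time

# Illustrative values only; Apple does not publish its real parameters.
UPPER_THRESHOLD = 0.9    # confidence needed to wake Siri normally
LOWER_THRESHOLD = 0.6    # near-miss score that arms the "second chance" state
SENSITIVE_WINDOW = 4.0   # seconds the lowered threshold stays in effect

class HeySiriDetector:
    def __init__(self):
        self.sensitive_until = 0.0  # timestamp when the sensitive state expires

    def should_wake(self, score, now=None):
        """Decide whether a confidence score should wake Siri."""
        now = time.monotonic() if now is None else now
        in_sensitive_state = now < self.sensitive_until
        threshold = LOWER_THRESHOLD if in_sensitive_state else UPPER_THRESHOLD
        if score >= threshold:
            self.sensitive_until = 0.0  # reset once Siri wakes
            return True
        if score >= LOWER_THRESHOLD:
            # Near miss: briefly lower the bar in case the user repeats the phrase.
            self.sensitive_until = now + SENSITIVE_WINDOW
        return False
```

With these toy numbers, a first utterance scoring 0.7 fails to wake Siri but arms the sensitive state, and an identical repeat a couple of seconds later succeeds; the same repeat well after the window has expired fails again, which is how the false alarm rate is kept down.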

To reduce false triggers from strangers, Apple invites users to complete a short enrollment session in which they say five phrases that each begin with "Hey Siri." The examples are saved on the device.

As Apple explains: "We compare the distances to the reference patterns created during enrollment with another threshold to decide whether the sound that triggered the detector is likely to be 'Hey Siri' spoken by the enrolled user."

This process not only reduces the probability that "Hey Siri" spoken by another person will trigger the iPhone, but also reduces the rate at which other, similar-sounding phrases trigger Siri.
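A minimal sketch of that speaker check, assuming each utterance has already been reduced to a fixed-length vector, with a made-up distance threshold standing in for Apple's:

```python
import math

SPEAKER_THRESHOLD = 1.2  # illustrative; Apple's actual threshold is not published

def distance(a, b):
    """Euclidean distance between two fixed-length speaker vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def matches_enrolled_user(candidate, enrollment_refs, threshold=SPEAKER_THRESHOLD):
    """Accept the trigger only if the candidate utterance is close enough
    to at least one reference pattern saved during enrollment."""
    return min(distance(candidate, ref) for ref in enrollment_refs) <= threshold

# The five reference vectors would come from the enrollment phrases;
# here they are toy 2-D points for illustration.
refs = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (0.0, 0.3), (0.3, 0.0)]
```

An utterance whose vector lands near any of the enrolled references passes, while one far from all of them is rejected, even if the acoustic detector itself fired.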
Apple also says it created "Hey Siri" recordings both close and far in various environments, such as in the kitchen, car, bedroom, and restaurant, based on native speakers of many languages around the world.

For many more technical details about how "Hey Siri" works, be sure to read Apple's full article on its Machine Learning Journal.



Apple Hires AI Team From Init.ai to Join Work on Siri

Apple this week "acqui-hired" the team from Init.ai, a startup that designed a smart assistant to allow customer service representatives to easily parse through and automate some interactions with users, reports TechCrunch.

The startup focused on creating an AI with natural language processing and machine learning to analyze chat-based conversations between humans.


Init.ai announced that it was shutting down its service earlier this week to join a new project.
Today is an exciting day for our team. Init.ai is joining a project that touches the lives of countless people across the world. We are thrilled and excited at the new opportunities this brings us.

However, this means Init.ai will discontinue its service effective December 16th 2017. While we wish to make this transition as smooth as possible, we cannot continue to operate Init.ai going forward.
Apple did not purchase Init.ai outright and will not obtain any of its intellectual property, nor is there any indication that Apple plans to use existing Init.ai services. Instead, Apple has taken on the Init.ai team, who will now work on Apple's Siri personal assistant.

The addition of the Init.ai team may hint at Apple's future Siri plans, with the company perhaps planning to build out more business integrations to supplement Business Chat, the iOS 11 iMessage feature that allows businesses to communicate with customers.

TechCrunch says it's not entirely clear how many people from Init.ai will be transitioning to Apple, but the startup only employed six people.



Apple’s Siri Turns Six: AI Assistant Announced Alongside iPhone 4s on October 4, 2011

On October 4, 2011, Apple held a media event in which it introduced Find My Friends, refreshed the iPod nano and iPod touch, and revealed the iPhone 4s with its all-new Siri voice assistant. That means today marks the sixth anniversary of Siri's introduction to the world, although the AI helper wouldn't be available to the public until the iPhone 4s launched on October 14, 2011.

In the original press release for Siri, Apple touted using your voice to send text messages, schedule meetings, set timers, ask about the weather, and more. Apple highlighted Siri's understanding of context and indirect questions, such as presenting a weather forecast when you ask "Will I need an umbrella this weekend?"

The original Siri interface on iOS 5
Siri on iPhone 4S lets you use your voice to send messages, schedule meetings, place phone calls, and more. Ask Siri to do things just by talking the way you talk. Siri understands what you say, knows what you mean, and even talks back. Siri is so easy to use and does so much, you’ll keep finding more and more ways to use it.

Siri understands context allowing you to speak naturally when you ask it questions, for example, if you ask “Will I need an umbrella this weekend?” it understands you are looking for a weather forecast. Siri is also smart about using the personal information you allow it to access, for example, if you tell Siri “Remind me to call Mom when I get home” it can find “Mom” in your address book, or ask Siri “What’s the traffic like around here?” and it can figure out where “here” is based on your current location.
Apple didn't create Siri itself, however. The company purchased the technology now found across its iOS devices by acquiring Siri, Inc., a spinoff of SRI International, where the technology originated. Prior to the assistant's presence on the iPhone, Siri was a standalone app on the App Store (launched February 2010) that offered automated personal assistant services through integrations with third-party apps like OpenTable, FlightStats, and Google Maps. Users could interact with these apps using Siri's voice-recognition technology, which was provided by Nuance.

Just two months after Siri appeared on the App Store, reports of Apple's acquisition of Siri surfaced in April 2010, and the purchase was quickly confirmed by representatives and board members from the voice-recognition company. According to Siri board member Shawn Carolan, "The offer from Apple was of a scope and tenor that it was a no-brainer to accept it." The standalone app was removed from the App Store after Apple's unveiling of its own Siri in October 2011.


Over the years, Siri has debuted new features and expanded to more devices, including the iPad (June 2012), iPod Touch (September 2012), Apple Watch (April 2015), Apple TV with Siri Remote (September 2015), Mac with macOS Sierra (September 2016), and HomePod (coming December 2017).

Since 2011, Siri has become a large enough part of Apple's brand that the company just this year launched a series of advertisements focusing solely on the assistant's helpfulness, aided by actor Dwayne "The Rock" Johnson. The latest version of iOS, iOS 11, brings a few improvements to Siri, including a more natural speaking voice, a Type to Siri option, and a translation feature.

Details about Siri's origin at Apple have continued to emerge over the years, with voice actress Susan Bennett revealing a few behind-the-scenes tidbits about the early days of the project in an interview posted earlier in 2017. Bennett described having to say "nonsense phrases" like "Say the shrodding again," which she later realized provided Apple with "all of the sounds of the English language."


The next place Siri will be found is Apple's HomePod speaker, which was for a long time simply called the "Siri Speaker" prior to its official unveiling at WWDC in June. HomePod will rely heavily on user interaction with Siri, allowing for music playback, HomeKit control, timer settings, news reports, and essentially most of the tasks Siri can already do elsewhere. Most importantly, Siri will become a "musicologist" on HomePod, gaining a greater understanding of music-related trivia to further enhance HomePod's position as a high-quality audio device.

Despite advancements, many users frequently point out Siri's flaws and inconsistencies in certain situations. It's been rumored previously that Apple's development of Siri has been hindered by the company's commitments to privacy. But in an interview last month, Apple VP of marketing Greg Joswiak argued that user data privacy and a smart AI assistant can co-exist: "We're able to deliver a very personalized experience... without treating you as a product that keeps your information and sells it to the highest bidder. That's just not the way we operate."

Tag: Siri


Apple Drops Bing Search Engine Results for Siri and Spotlight in Favor of Google

Starting today, Apple search results from Siri and Spotlight on Mac and iOS will be provided by Google rather than Microsoft's Bing. Apple announced the news in a statement that was given to TechCrunch this morning, claiming consistency across iOS and Mac devices is the reason behind the switch.

"Switching to Google as the web search provider for Siri, Search within iOS and Spotlight on Mac will allow these services to have a consistent web search experience with the default in Safari," reads an Apple statement sent this morning. "We have strong relationships with Google and Microsoft and remain committed to delivering the best user experience possible."
Prior to this morning, all results from searches conducted via Spotlight on the Mac or the swipe-down search bar on iOS came from Bing, as did all search information provided by Siri. Now, when you search using Spotlight or ask Siri a question that involves a web search, the info will come from Google.

According to TechCrunch, the swap will include both web links and video results from YouTube, but web image results in Siri and Spotlight searches will continue to be provided by Bing for the time being. Google searches will use the standard search API and will provide the same search results you'd get from a Google.com search.

While Apple has used Bing for search results for things like Siri and Spotlight, Google has remained the default search engine on iOS and Mac devices. Earlier this year, reports suggested Google paid Apple nearly $3 billion to maintain its position as the default search engine on iOS devices.

The search engine swap began rolling out to users at 9:00 a.m. Pacific Time.



How to Use Siri’s New Translation Feature in iOS 11

iOS 11 brings new functionality to Siri, including a translation feature that allows Siri to translate words and phrases spoken in English into a handful of other languages. Translation is super simple to use, and while the translations aren't always perfect, they generally get the gist of what you're trying to say across to someone who speaks another language.


Using Siri Translate



  1. Activate Siri, either by holding down the Home button or using a "Hey Siri" command.

  2. Tell Siri the phrase you want to translate and the language you want it in. For example: "Siri, how do I say 'Where's the bathroom?' in Spanish?"

  3. Siri will respond with the appropriate translation, both in text form and vocally. The vocal component can be replayed by pressing the play button located at the bottom of the translation.

  4. There are multiple ways to phrase your translation requests. Siri will respond to "Translate [phrase] into [language]" or "How do I say [phrase] in [language]?"

Available Languages


Siri can translate English to Mandarin, French, German, Italian, and Spanish. There's no two-way translation available yet; it's only English to the languages listed above. Apple has said it plans to add additional languages to the Siri translation feature following the release of iOS 11.

Apple appears to be using an in-house translation engine for Siri, as the translations do not match up with translations provided by popular services like Google Translate or Bing Translate. Also of note, while Siri can translate from English to several other languages, the translation features do not work with British, Canadian, or Australian English settings.

Because Siri speaks translations aloud, the translation feature can come in handy when traveling and trying to get simple communications across. It's a simple addition, but one that may go a long way towards making Siri more useful.

Related Roundup: iOS 11
Tag: Siri


Apple’s Greg Joswiak on Siri: We Deliver a Personalized Experience Without Treating You as a Product

Ahead of the launch of iOS 11, Apple VP of marketing Greg Joswiak sat down with several publications to talk about Siri, the personal assistant built into all major Apple devices. His interview with Wired was published last week, and today, Fast Company published its interview, in which Joswiak talks Siri and privacy, among other topics.

It's been long believed that Apple's Siri development has been hindered by the company's deep commitment to privacy, but according to Joswiak, privacy, respect for user data, and an intelligent AI can co-exist.


"I think it's a false narrative," he told Fast Company. "We're able to deliver a very personalized experience... without treating you as a product that keeps your information and sells it to the highest bidder. That's just not the way we operate."

Much of Apple's Siri functionality is handled on-device, rather than in the cloud like competing services. That's shifting slightly in Apple's 2017 software updates, with the company planning to allow Siri to communicate across devices to learn more about users. Still, many features, like Siri's ability to find photos matching a specific subject or date, are powered on-device.
"Your device is incredibly powerful, and it's even more powerful with each generation," Joswiak said. "And with our focus on privacy, we're able to really take advantage of exploiting that power with things like machine learning on your device to create an incredible experience without having to compromise your data."
Apple does use the cloud to answer requests and to train Siri, but it strips out all user-identifiable data. Each Siri request is stripped of the user ID and assigned a random request ID, and the request is then encrypted and sent to the cloud. Apple stores six months of voice recordings to allow its voice recognition engine to develop a better understanding of users. A second copy of recordings can be stored for up to two years, also with the aim of improving Siri.
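A minimal sketch of that anonymization step, replacing the user ID with a fresh random request identifier before anything leaves the device. The field names and dictionary structure here are assumptions for illustration, not Apple's actual request format.

```python
import uuid

# Illustrative field names; Apple's real request schema is not public.
IDENTIFYING_FIELDS = {"user_id", "device_id", "email"}

def anonymize_request(request):
    """Strip user-identifying fields and attach a random request ID,
    so the server can process the utterance without knowing who sent it."""
    scrubbed = {k: v for k, v in request.items() if k not in IDENTIFYING_FIELDS}
    scrubbed["request_id"] = str(uuid.uuid4())  # fresh random ID per request
    return scrubbed

outgoing = anonymize_request({"user_id": "u123", "utterance": "set a timer"})
```

Because the request ID is random and generated per request, two utterances from the same person cannot be linked back to a user account on the server side, which matches the design Joswiak describes.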

"We leave out identifiers to avoid tying utterances to specific users so we can do a lot of machine learning and a lot of things in the cloud without having to know that it came from [the user]," said Joswiak.

Alongside Joswiak, Apple's Craig Federighi, senior vice president of Software Engineering, weighed in on Siri's future in an email to Fast Company. "Siri is no longer just a voice assistant," he said. "Siri on-device intelligence is streamlining everyday interactions with our devices."

He went on to say that with iOS 11, macOS High Sierra, tvOS 11, and watchOS 4, users will "experience even more Siri functionality," adding that in the "years to come," Siri will be "ever more integral" to the core user experience on all of the company's platforms, from Mac to iPhone to Apple TV.

Federighi and Joswiak's full Siri interview, which provides more insight into the inner workings of Siri and Apple's commitment to privacy, can be read over at Fast Company.

Tag: Siri


Apple’s Greg Joswiak: Siri Wasn’t Engineered to Be Trivial Pursuit

In iOS 11, Apple's AI-based personal assistant Siri has a much more natural voice that goes a long way towards making Siri sound human-like. Siri speaks with a faster, smoother cadence, with elongated syllables and pitch variation, a noticeable departure from the more machine-like sound of iOS 10.

The team behind Siri, including Siri senior director Alex Acero, has worked for years to improve the way Siri speaks, according to a new interview Acero did alongside Apple VP of marketing Greg Joswiak with Wired. While Siri's voice recognition capabilities were powered by a third-party company early on in Siri's life, Acero's team took over Siri development a few years back, leading to several improvements to the personal assistant since then.

Siri is powered by deep learning and AI, technology that has greatly improved its speech recognition capabilities. According to Wired, Siri's raw voice recognition is now able to correctly identify 95 percent of users' speech, on par with rivals like Alexa and Cortana.

Apple is still working to overcome negative perceptions about Siri, and blames many of the early issues on the aforementioned third-party partnership.
"It was like running a race and, you know, somebody else was holding us back," says Greg Joswiak, Apple's VP of product marketing. Joswiak says Apple always had big plans for Siri, "this idea of an assistant you could talk to on your phone, and have it do these things for you in a more easy way," but the tech just wasn't good enough. "You know, garbage in, garbage out," he says.
Joswiak says Apple's aim from the beginning has been to make Siri a "get-s**t-done" machine. "We didn't engineer this thing to be Trivial Pursuit!" he told Wired. Apple wants Siri to serve as an automated friend that can help people do more.


One unique Siri attribute is its ability to work in multiple languages. Siri supports English, French, Dutch, Mandarin, Cantonese, Finnish, Hebrew, Malay, Arabic, Italian, Spanish, and more, including dialect variants (like English in the UK and Australia) and accents. The Siri team combines pre-existing databases of local speech with local voice talent and on-device dictation, transcribing and dissecting the content to find all of the individual sounds in a given language and all of the ways those sounds are pronounced.

In areas where Apple offers spoken dictation but no Siri support, it's gathering data for future Siri support, and in places where Siri is already available, spoken interactions between user and device (gathered anonymously) are used to improve algorithms and train the company's neural network.

Creating the right voice for Siri in a given language hinges on the proper voice talent, and Apple uses an "epic search" with hundreds of people to find someone who sounds helpful, friendly, spunky, and happy without overdoing it. Once the right person is found, Apple records them for weeks at a time to create the right sound. So far, Apple has repeated this process for all 21 languages Siri supports.

Ultimately, Acero and his Siri team are aiming to make Siri sound more like a trusted person than a robot, creating an attachment to the AI that will "make Siri great" even when Siri fails to answer a query properly. Apple also wants to make people more aware of what Siri can and can't do and that it exists in the first place, which is why iOS 11 includes Siri-centric features like cross-device syncing and a better understanding of user interests and preferences.

Wired's full piece, which goes into much more detail on how Siri recognizes various aspects of speech and how Apple chooses voice talent, can be read over on the site.

Tag: Siri


Apple Acknowledges Siri Leadership Has Officially Moved From Eddy Cue to Craig Federighi

Apple has updated its executive profiles to acknowledge that software engineering chief Craig Federighi now officially oversees development of Siri. The responsibility previously belonged to Apple's services chief Eddy Cue.

Craig Federighi is Apple’s senior vice president of Software Engineering, reporting to CEO Tim Cook. Craig oversees the development of iOS, macOS, and Siri. His teams are responsible for delivering the software at the heart of Apple’s innovative products, including the user interface, applications and frameworks.
Apple's leadership page is only now reflecting Federighi's role as head of Siri, but the transition has been apparent for several months, based on recent interviews and stage appearances at Apple's keynotes.

At WWDC 2016, for example, Federighi and Apple marketing chief Phil Schiller joined Daring Fireball's John Gruber to discuss how Apple was opening Siri up to third-party developers with SiriKit later that year.

At WWDC 2017, Federighi was on stage to discuss improvements to Siri in iOS 11, including more natural voice, built-in translation capabilities, and advances in machine learning and artificial intelligence.

Cue continues to oversee the iTunes Store, Apple Music, Apple Pay, Apple Maps, iCloud, and the iWork and iLife suites of apps, and handing off Siri should allow him to focus more on Apple's push into original content.

Apple's updated leadership page also now lists profiles for recent hires Deirdre O'Brien, Vice President of People, and Isabel Ge Mahe, Vice President and Managing Director of Greater China.



Apple Updates Machine Learning Journal With Three Articles on Siri Technology

Back in July, Apple introduced the "Apple Machine Learning Journal," a blog detailing Apple's work on machine learning, AI, and other related topics. The blog is written entirely by Apple's engineers, and gives them a way to share their progress and interact with other researchers and engineers.

Apple today published three new articles to the Machine Learning Journal, covering topics that are based on papers Apple will share this week at Interspeech 2017 in Stockholm, Sweden.


The first article may be the most interesting to casual readers, as it explores the deep learning technology behind the Siri voice improvements introduced in iOS 11. The other two articles cover the technology behind the way dates, times, and other numbers are displayed, and the work that goes into introducing Siri in additional languages.

Links to all three articles are below:

Apple is notoriously secretive and has kept its work under wraps for many years, but over the course of the last few months, the company has been more open to sharing some of its machine learning advancements. The blog, along with research papers, allows Apple engineers to participate in the wider AI community and may help the company retain employees who do not want to keep their progress a secret.

