iOS 12 Said to Feature Animoji in FaceTime, Deeper Siri Integration, and Do Not Disturb Improvements

Apple's alleged plans to double down on the quality of its iPhone, iPad, and Mac software platforms, rather than rush to introduce new features, have been revealed in more detail by Mark Gurman at Bloomberg News.


The report claims that Apple's software engineers will have more discretion to delay features that aren't as polished, with the company essentially shifting to more of a two-year roadmap for iOS and macOS, rather than trying to release major feature-packed upgrades every single year without question.
Instead of keeping engineers on a relentless annual schedule and cramming features into a single update, Apple will start focusing on the next two years of updates for its iPhone and iPad operating system, according to people familiar with the change. The company will continue to update its software annually, but internally engineers will have more discretion to push back features that aren't as polished to the following year.
The report describes Apple's new strategy as a "major cultural shift," and an admission that its recent software updates have suffered from an uncharacteristic number of bugs, ranging from a critical root superuser vulnerability on Mac to iMessages appearing in the wrong order across Apple devices.

Apple's commitment to a fast-paced iOS release schedule has already led to some features being delayed, including Apple Pay Cash and Messages in iCloud, so the new strategy would likely see Apple hold off on announcing or beta testing features until they are much closer to being ready for public release.

Despite the increased focus on under-the-hood refinements, iOS 12 is still expected to include some significant new features, including Animoji in FaceTime, which will enable people to place virtual faces over themselves during video calls.

Additionally, in iOS 12, Apple is planning deeper Siri integration in the iPhone's search view, Do Not Disturb improvements that will give users more options to automatically reject phone calls or silence notifications, a redesigned version of its Stocks app, and a multiplayer mode for augmented reality games.

As previously reported, Apple is also expected to make it possible for developers to release apps that work across iPhone, iPad, and Mac, starting with iOS 12 and macOS 10.14, which should be introduced at WWDC 2018 in June.

Last month, Gurman reported that developers will be able to design a single third-party app that works with both a touchscreen and a mouse or trackpad, depending on whether it's running on an iPhone, iPad, or Mac. Apple would presumably also streamline its own apps across the desktop and mobile.

The report didn't reveal exactly how the process will work, but Apple could be planning to release a new SDK with new APIs that enable true cross-platform functionality. Right now, Apple's UIKit and AppKit frameworks provide the required infrastructure for iOS and macOS app user interfaces respectively.

Today's report reiterates other features that are delayed, including redesigned home screens on iPhone, iPad, and CarPlay, tabbed apps on iPad, and the ability to view two screens from the same app side by side on iPad.

Related Roundup: iOS 12

Discuss this article in our forums

Uber’s Latest App Update Restores Siri and Apple Maps Integrations

Uber yesterday updated its iPhone app, and while the release notes do not mention any specific changes, the latest version appears to re-enable the ability to request a vehicle for pickup using Siri or Apple Maps.


After updating the Uber app, we were able to successfully ask Siri to hail us a ride, while tapping on the Ride tab in Apple Maps once again listed Uber as one of the available ride-hailing services alongside Lyft.

While the Siri and Apple Maps integrations are working again in the United States, we encountered errors when trying to hail an Uber with Siri and Apple Maps in Toronto, Canada, where the features were previously supported.

As noted by Christian Zibreg at iDB, some users may need to manually re-enable the Siri and Apple Maps integrations in Settings → Uber → Siri & Search and Settings → Maps under "Ride Booking Extensions."

The ability to hail an Uber ride with Siri or Apple Maps had disappeared in late January following an earlier update to the Uber app. Both features were originally added in iOS 10, and it's unclear what prompted their temporary removal.

Uber's app is available for free on the App Store.



Uber’s Siri and Apple Maps Integrations Have Disappeared

Uber's latest app update appears to have removed several important iOS integrations, with the service now unavailable through both Siri and Apple Maps.

If you ask Siri to get you an Uber, a feature that has been available since the launch of iOS 10, Siri will say that Uber hasn't activated that feature. In the "Siri & Search" section of the Uber options in the Settings app, there's also no longer a "Use with Siri" toggle.


Similarly, in Apple Maps, you can no longer select Uber as an option under the "Ride" tab when getting directions. This is also a feature that debuted in iOS 10.

Both Siri and Apple Maps integrations are still available for other ride-sharing apps like Lyft, so the problem seems to lie with the Uber app rather than with Apple's services.

The removal of both features was noticed by MacRumors readers and Reddit users starting last week. It is not clear whether Uber deliberately removed the features or whether it's a bug, and the company did not respond to a request for comment when contacted by MacRumors earlier this afternoon. We have also contacted Apple and will update this post when we hear back.

Uber integration with Siri, enabled through the SiriKit API, was a much touted feature when iOS 10 first launched, as was Apple Maps integration. Both Apple and Uber heavily promoted the two options when iOS 10 rolled out.



Siri Gains Info About Tennis and Golf Tournaments Ahead of Australian Open

Siri has been updated with additional sports information, allowing the personal assistant to provide details about a range of tennis and golf events. Siri's new knowledge has been introduced just ahead of the Australian Open, which is set to kick off this weekend, and it joins other sports data Siri offers for baseball, basketball, hockey, and football.

As noted by 9to5Mac, Siri can provide information on both upcoming tournaments and past events from recent years, along with details on player backgrounds and statistics.


For tennis, the personal assistant can answer queries about the ATP World Tour and the Women's Tennis Association (WTA), offering up data from 2016 to 2018. For golf, Siri can provide details about the men's PGA Tour and the women's LPGA Tour.

The new golf and tennis data available from Siri is accessible on iOS devices running the latest version of iOS, and it is also available on Macs, the Apple TV, and the Apple Watch.

Tag: Siri


Apple Asks Developers to Start Optimizing Apps for HomePod Using SiriKit in iOS 11.2

iOS 11.2, released this morning, introduces SiriKit support for the HomePod, according to Apple. With SiriKit for HomePod now available, Apple is asking developers to make sure SiriKit-compatible apps are optimized for HomePod ahead of the device's release.

SiriKit is designed to allow iOS and watchOS apps to work with Siri, so users can complete tasks with Siri voice commands. SiriKit is available for a wide range of apps on those two platforms, but its availability is slightly more limited when it comes to HomePod.


Third-party apps that use SiriKit Messaging, Lists, and Notes are compatible with the HomePod. Siri will recognize voice requests given to the HomePod, with those requests carried out on a linked iOS device. So, for example, users can ask HomePod to send a message to a friend, add an item to a list, or create a new note. Sample HomePod requests:

- Send a text to Eric using WhatsApp
- In WeChat, tell Eric I'll be late
- Add chocolate and bananas to my list in Things
- Create a note that says "hello" in Evernote

Developers can test the voice-only experience of their apps by using Siri through headphones connected to an iOS device running iOS 11.2.

Apple plans to release the HomePod this December, but a specific launch date for the speaker has not yet been provided. When it becomes available, the HomePod will cost $349.

Related Roundups: iOS 11, HomePod
Tag: Siri


Apple Says ‘Hey Siri’ Detection Briefly Becomes Extra Sensitive If Your First Try Doesn’t Work

A new entry in Apple's Machine Learning Journal provides a closer look at how hardware, software, and internet services work together to power the hands-free "Hey Siri" feature on the latest iPhone and iPad Pro models.


Specifically, a very small speech recognizer built into the embedded motion coprocessor runs all the time and listens for "Hey Siri." When just those two words are detected, Siri parses any subsequent speech as a command or query.

The detector uses a Deep Neural Network to convert the acoustic pattern of a user's voice into a probability distribution. It then uses a temporal integration process to compute a confidence score that the phrase uttered was "Hey Siri."

If the score is high enough, Siri wakes up and proceeds to complete the command or answer the query automatically.

If the score exceeds Apple's lower threshold but not the upper threshold, however, the device enters a more sensitive state for a few seconds, so that Siri is much more likely to be invoked if the user repeats the phrase, even without any extra effort.

"This second-chance mechanism improves the usability of the system significantly, without increasing the false alarm rate too much because it is only in this extra-sensitive state for a short time," said Apple.
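The two-threshold, second-chance logic can be sketched in a few lines of Python. This is a minimal illustration of the mechanism as described; the threshold values and the length of the sensitive window are assumptions made for the example, not Apple's actual parameters:

```python
import time

# Illustrative values only; Apple's real thresholds are tuned and not public.
UPPER_THRESHOLD = 0.9      # score above this triggers immediately
LOWER_THRESHOLD = 0.6      # near-miss: opens the second-chance window
SENSITIVE_WINDOW = 3.0     # seconds of extra sensitivity after a near-miss
SENSITIVE_THRESHOLD = 0.6  # lowered trigger threshold inside the window

class HeySiriDetector:
    def __init__(self):
        # Timestamp until which the detector stays in the extra-sensitive state.
        self.sensitive_until = 0.0

    def process(self, confidence, now=None):
        """Return True if a confidence score in [0, 1] should wake Siri."""
        now = time.monotonic() if now is None else now
        # Inside the second-chance window, a lower score is enough.
        in_window = now < self.sensitive_until
        threshold = SENSITIVE_THRESHOLD if in_window else UPPER_THRESHOLD
        if confidence >= threshold:
            self.sensitive_until = 0.0
            return True
        # A near-miss opens (or refreshes) the short second-chance window.
        if confidence >= LOWER_THRESHOLD:
            self.sensitive_until = now + SENSITIVE_WINDOW
        return False
```

In this sketch, a near-miss opens a brief window during which the trigger threshold drops, so a repeated "Hey Siri" succeeds even if it scores no higher than the first attempt, which is exactly the usability gain Apple describes.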

To reduce false triggers from strangers, Apple invites users to complete a short enrollment session in which they say five phrases that each begin with "Hey Siri." The examples are saved on the device.
Apple explains: "We compare the distances to the reference patterns created during enrollment with another threshold to decide whether the sound that triggered the detector is likely to be 'Hey Siri' spoken by the enrolled user."

This process not only reduces the probability that "Hey Siri" spoken by another person will trigger the iPhone, but also reduces the rate at which other, similar-sounding phrases trigger Siri.
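The enrollment check can be sketched the same way. In this illustration each utterance is reduced to a fixed-length feature vector and compared by Euclidean distance against the patterns saved during the five-phrase enrollment session; the representation, distance measure, and threshold are all assumptions for the sake of the example, since Apple does not disclose its actual internals:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def matches_enrolled_user(candidate, references, max_distance=1.0):
    """Accept the trigger if the candidate is close enough to ANY enrolled pattern.

    `references` stands in for the reference patterns saved on the device
    during enrollment; `max_distance` is an illustrative threshold.
    """
    return min(euclidean(candidate, r) for r in references) <= max_distance

# Hypothetical reference patterns from the five enrollment phrases.
enrolled = [[0.0, 1.0], [0.1, 0.9], [0.2, 1.1], [0.0, 0.8], [0.1, 1.0]]
```

Raising `max_distance` admits more of the enrolled user's natural variation but also more impostors, which mirrors the trade-off Apple tunes with its own threshold.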
Apple also says it created "Hey Siri" recordings made both up close and at a distance in a variety of environments, such as the kitchen, car, bedroom, and restaurant, from native speakers of many languages around the world.

For many more technical details about how "Hey Siri" works, be sure to read Apple's full article on its Machine Learning Journal.



Apple Hires AI Team From Init.ai to Join Work on Siri

Apple this week "acqui-hired" the team from Init.ai, a startup that designed a smart assistant to allow customer service representatives to easily parse through and automate some interactions with users, reports TechCrunch.

The startup focused on creating an AI with natural language processing and machine learning to analyze chat-based conversations between humans.


Init.ai announced that it was shutting down its service earlier this week to join a new project.
Today is an exciting day for our team. Init.ai is joining a project that touches the lives of countless people across the world. We are thrilled and excited at the new opportunities this brings us.

However, this means Init.ai will discontinue its service effective December 16th 2017. While we wish to make this transition as smooth as possible, we cannot continue to operate Init.ai going forward.
Apple did not purchase Init.ai and will not obtain any of its intellectual property, nor is there any indication that Apple plans to use existing Init.ai services. Instead, Apple has taken on the Init.ai team, who will now work on Apple's Siri personal assistant.

The addition of the Init.ai team may hint at Apple's future Siri plans, with the company perhaps planning to build out more business integrations to supplement Business Chat, the iOS 11 iMessage feature that allows businesses to communicate with customers.

TechCrunch says it's not entirely clear how many people from Init.ai will be transitioning to Apple, but the startup only employed six people.



Apple’s Siri Turns Six: AI Assistant Announced Alongside iPhone 4s on October 4, 2011

On October 4, 2011, Apple held a media event at which it introduced Find My Friends, refreshed the iPod nano and iPod touch, and revealed the iPhone 4S with its all-new Siri voice assistant. That means today marks the sixth anniversary of Siri's introduction to the world, although the AI helper wouldn't be available to the public until the iPhone 4S launched on October 14, 2011.

In the original press releases for Siri, Apple touted using your voice to send text messages, schedule meetings, set timers, ask about the weather, and more. Apple explained Siri's understanding of context and non-direct questions, like presenting you with a weather forecast if you ask "Will I need an umbrella this weekend?"

The original Siri interface on iOS 5
Siri on iPhone 4S lets you use your voice to send messages, schedule meetings, place phone calls, and more. Ask Siri to do things just by talking the way you talk. Siri understands what you say, knows what you mean, and even talks back. Siri is so easy to use and does so much, you’ll keep finding more and more ways to use it.

Siri understands context allowing you to speak naturally when you ask it questions, for example, if you ask “Will I need an umbrella this weekend?” it understands you are looking for a weather forecast. Siri is also smart about using the personal information you allow it to access, for example, if you tell Siri “Remind me to call Mom when I get home” it can find “Mom” in your address book, or ask Siri “What’s the traffic like around here?” and it can figure out where “here” is based on your current location.
Apple didn't create Siri itself, however, as the company purchased the technology that’s now prevalent in all iOS devices by acquiring Siri, Inc., a spinoff of SRI International where the technology originated. Prior to the assistant's presence on iPhone, Siri was a standalone app on the App Store (launched February 2010) that offered automated personal assistant services through integrations with third-party apps like OpenTable, FlightStats, Google Maps, and more. Users could interact with these apps using Siri's voice-recognition technology, created by Nuance.

Just two months after Siri appeared on the App Store, reports of Apple's acquisition of Siri surfaced in April 2010, and the purchase was quickly confirmed by representatives and board members from the voice-recognition company. According to Siri board member Shawn Carolan, "The offer from Apple was of a scope and tenor that it was a no-brainer to accept it." The standalone app was removed from the App Store after Apple's unveiling of its own Siri in October 2011.


Over the years, Siri has debuted new features and expanded to more devices, including the iPad (June 2012), iPod Touch (September 2012), Apple Watch (April 2015), Apple TV with Siri Remote (September 2015), Mac with macOS Sierra (September 2016), and HomePod (coming December 2017).

Since 2011, Siri has become a large enough part of Apple's brand that just this year the company launched a series of advertisements focused solely on the assistant's helpfulness, aided by actor Dwayne "The Rock" Johnson. The latest version of iOS, iOS 11, brings a few improvements to Siri, including a more natural speaking voice, the ability to type queries to Siri, and a translation feature.

Details about Siri's origin at Apple have continued to emerge over the years, with voice actress Susan Bennett revealing a few behind-the-scenes tidbits about the early days of the project in an interview posted earlier in 2017. Bennett described having to say "nonsense phrases" like "Say the shrodding again," which she later realized provided Apple with "all of the sounds of the English language."


The next place Siri will be found is Apple's HomePod speaker, which for a long time was simply called the "Siri Speaker" prior to its official unveiling at WWDC in June. HomePod will rely heavily on user interaction with Siri, allowing for music playback, HomeKit control, timers, news reports, and essentially most of the tasks Siri can already do elsewhere. Most notably, Siri will become a "musicologist" on HomePod, gaining a greater understanding of music-related trivia to further enhance HomePod's position as a high-quality audio device.

Despite these advancements, many users frequently point out Siri's flaws and inconsistencies in certain situations. It has previously been rumored that Apple's development of Siri has been hindered by the company's commitments to privacy. But in an interview last month, Apple VP of marketing Greg Joswiak argued that user data privacy and a smart AI assistant can coexist: "We're able to deliver a very personalized experience... without treating you as a product that keeps your information and sells it to the highest bidder. That's just not the way we operate."

Tag: Siri


Apple Drops Bing Search Engine Results for Siri and Spotlight in Favor of Google

Starting today, Apple search results from Siri and Spotlight on Mac and iOS will be provided by Google rather than Microsoft's Bing. Apple announced the news in a statement that was given to TechCrunch this morning, claiming consistency across iOS and Mac devices is the reason behind the switch.

"Switching to Google as the web search provider for Siri, Search within iOS and Spotlight on Mac will allow these services to have a consistent web search experience with the default in Safari," reads an Apple statement sent this morning. "We have strong relationships with Google and Microsoft and remain committed to delivering the best user experience possible."
Prior to this morning, all results from a search conducted in Spotlight on Mac or via the swipe-down search bar on iOS were Bing search results, as was all search information provided by Siri. Now, when you search using Spotlight or ask Siri a question that involves a web search, the information will come from Google.

According to TechCrunch, the swap will include both web links and video results from YouTube, but web image results in Siri and Spotlight searches will continue to be provided by Bing for the time being. Google searches will use the standard search API and will provide the same search results you'd get from a Google.com search.

While Apple has used Bing for search results for things like Siri and Spotlight, Google has remained the default search engine on iOS and Mac devices. Earlier this year, reports suggested Google paid Apple nearly $3 billion to maintain its position as the default search engine on iOS devices.

The search engine swap began rolling out to users at 9:00 a.m. Pacific Time.



How to Use Siri’s New Translation Feature in iOS 11

iOS 11 brings new functionality to Siri, including a translation feature that allows Siri to translate words and phrases spoken in English to a handful of other languages. Translation is super simple to use, and while the translations aren't always perfect, they get the gist of what you're attempting to say across to someone who speaks another language.


Using Siri Translate



  1. Activate Siri, either by holding down the Home button or using a "Hey Siri" command.

  2. Tell Siri the phrase you want to translate and the language you want it in. For example: "Siri, how do I say 'where's the bathroom' in Spanish?"

  3. Siri will respond with the appropriate translation, both in text form and vocally. The vocal component can be replayed by pressing the play button located at the bottom of the translation.

  4. There are multiple ways to phrase your translation requests. Siri will respond to "Translate X to X language" or "How do I say X in X language?"

Available Languages


Siri can translate English to Mandarin, French, German, Italian, and Spanish. There's no two-way translation available yet; translation only works from English to the languages listed above. Apple has said it plans to add more languages to the Siri translation feature following the release of iOS 11.

Apple appears to be using an in-house translation engine for Siri, as the translations do not match up with those provided by popular services like Google Translate or Microsoft's Bing Translator. Also of note, while Siri can translate from English to several other languages, the translation features do not work with British, Canadian, or Australian English settings.

Because Siri speaks translations aloud, the translation feature can come in handy when traveling and trying to get simple communications across. It's a simple addition, but one that may go a long way towards making Siri more useful.

Related Roundup: iOS 11
Tag: Siri
