Is Apple's Siri Sherlocking* Google Search?
iOS 18.2 is coming this week, with new AI features and ChatGPT integration for Siri. The buzz is likely to increase consumer interest in Siri. More intriguing is Siri's growing use of Apple's knowledge graphs to answer many queries directly on device, and the potential impact that could have on Google.
The discovery of Siri's improved Local Search capability led me to ask: Is the new Siri capable of similarly answering questions across other knowledge domains? Could Siri pull off the biggest Sherlock* ever by integrating features to replace much of what search does?
Siri and Spotlight search have long had the ability to surface types of knowledge that aren't limited to local entities. In fact, the newly reconstituted assistant has access to many in-depth knowledge graphs that Apple has been building out since 2012.
Siri 2.0 brings the ability to leverage AI to remember the context of your initial query and provide follow-on answers. Importantly, it can now be accessed through text entry and not just voice.
Many of these features are still in their early stages, but they hint at a future where more of the questions we currently ask Google are answered directly on our phones—instantly, seamlessly, without opening a browser or launching an app, and without interrupting what we're doing.
Apple Builds Out Knowledge Graphs
Many of the answers that we see in Siri are based on knowledge graphs that Apple has built for specific applications, covering a broad range of potential user interests. Apple is answering questions directly on the device: questions about entertainment, movies, TV shows, sports and real-time scores, current events, news, music, weather, Wikipedia, local POIs, the web and more. I even found evidence of Siri being able to perform product searches.