Apple’s rumored use of an LLM in certain apps might make a huge difference in Siri’s usefulness

There is no question that Siri has failed to live up to the promise it showed when first introduced in 2011 on the iPhone 4s. Very rarely does Siri deliver a single answer to a query the way Google Assistant is apt to do. For example, ask Google Assistant the release date of the iPhone 4s and it comes back with one answer: October 14, 2011. Ask Siri the same question and she returns three websites for you to choose from.
Anyone who depends on Siri to deliver certain information is excited to see what changes Apple is planning on making to the digital assistant with the AI makeover that will be briefly touched on by Apple CEO Tim Cook on May 7th, and more thoroughly previewed during the keynote at WWDC 2024 on June 10th. AppleInsider says that it has learned specifics about how Apple intends to use its Ajax Large Language Model (LLM) and some of the AI-based features that could make their way to iOS 18.
Apple’s goal is to improve iPhone users’ experiences running native iOS apps, including Siri, by offering AI capabilities that will run on-device to save time and maintain privacy. Apple is almost treating the chatbot feature as a parlor trick. Instead of offering a chatbot that might amuse some without helping to improve the user experience of running certain apps, Apple wants to give its system apps AI features such as text summarization, AI-enhanced search options, and document analysis. We could see these offered on apps like Siri, Messages, Mail, Spotlight Search, and Safari.

One of the AI-powered features I’m looking forward to is a summarization of the website on the screen while browsing with the Safari app. By analyzing certain words and phrases found on a website, the app will show a summary explaining the most important points of the content on the screen. Not only will Siri offer a similar tool, which we can only hope improves the responses the assistant delivers, but it will also be integrated with the iOS Messages app.

Apple’s LLM will also provide users with simple, basic responses on-device. Without having to rely on cloud-based servers, which lengthen response times and are less secure, multiple responses can surface, ranked by accuracy, speed, and other metrics. The report says that Ajax will show the name and information of any contact that appears in text it uncovers and will scan the calendar app for upcoming events it can include when creating a response.

Apple’s on-device AI will collect text from a text field or box, along with information gathered from the Safari and Messages apps. It all sounds exciting, and with Tim Cook now expected to make some brief AI-related comments next Tuesday during the iPad-focused “Let Loose” event, there will be a buzz surrounding Apple right through the WWDC keynote. If Apple plays its cards right, the buzz could continue through the iPhone unveiling event in September.

If Siri does get a feature that summarizes the websites it often delivers as the response to a user query, that could be among the most useful AI features that Apple announces this year.
