Apple has signaled that it is focusing its artificial intelligence efforts on language models. At an internal event, the company told employees it is focused on AI and large language models.
According to The New York Times, many teams, including those working on Siri, are testing "language-generating concepts."
There have been complaints about Siri not understanding queries, and other assistants like Alexa and Google Assistant have struggled to understand the accents of people in different parts of the world.
Former Apple engineer John Burke, who worked on Siri, told the NYT that Apple's assistant has evolved slowly because of "clunky code," which he said made it harder to push feature updates.
He added that Siri relies on a big database of words, so whenever engineers needed to add features or phrases, the entire database had to be rebuilt, a process he said took up to six weeks.
However, the NYT report didn't specify whether Apple is building its own language models or plans to adopt an existing one.
Apple has been using AI-powered features for a while.
These include better keyboard suggestions, computational photography, unlocking with Face ID while wearing a mask, separating subjects from the background in photos, handwashing and crash detection on the Apple Watch, and, most recently, a karaoke feature on Apple Music, but none of them are as in-your-face as chatbots.
Apple has been quiet about its AI efforts, but in January the company launched a program offering AI-powered narration services to authors to turn their books into audiobooks.