Apple is reportedly experimenting with language-generating AI


If not for last week’s Silicon Valley Bank (SVB) collapse, almost every conversation in tech would be centered on AI and chatbots. In the last few days, Microsoft-backed OpenAI released a new language model called GPT-4, its competitor Anthropic released the Claude chatbot, Google said that it is integrating AI into Workspace tools like Gmail and Docs, and Microsoft Bing has drawn attention with its chatbot-enabled search. The one name missing from the action? Apple.

Last month, the Cupertino-based company held an internal event focused on AI and large language models. According to a report from The New York Times, many teams, including the one working on Siri, are testing “language-generating concepts” regularly. Separately, 9to5Mac reported that Apple has introduced a new framework for “Siri Natural Language Generation” in tvOS 16.4.

People (myself included) have long complained about Siri not understanding queries. Siri, along with other assistants like Alexa and Google Assistant, has struggled to understand the different accents and phonetics of people living in different parts of the world, even when they are speaking the same language.

The newfound popularity of ChatGPT and text-based search has made it easier for people to interact with different AI models. But currently, the only way to chat with Apple’s AI assistant Siri by text is to enable a feature buried in the accessibility settings.