Apple’s OpenELM is a family of eight small language models.
Photo Credit: Pexels/Mateusz Taciak
Apple’s OpenELM AI models have up to 3 billion parameters
Apple has released a new family of artificial intelligence (AI) models dubbed OpenELM, short for Open-source Efficient Language Models. The family comprises eight AI models in total: four pre-trained variants and four instruct variants. All of them are small language models (SLMs) that specialise in text-related tasks, highlighting an alignment with the tech giant's reported ambitions of introducing on-device AI features this year. Notably, the company is also said to have acquired a French AI startup called Datakalab, which works with computer vision models.
The OpenELM AI models were spotted on Apple's Hugging Face page. Introducing the SLMs, the company said, "We introduce OpenELM, a family of Open-source Efficient Language Models. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy." Each model comes in a pre-trained and an instruction-tuned variant, at four sizes: 270 million, 450 million, 1.1 billion, and 3 billion parameters.
Parameters are the numerical weights a model's neural network learns during training, and they determine how the model processes and responds to text. Broadly, a higher parameter count lets a model handle more complex questions, though at a greater computational cost. For reference, the recently released Microsoft Phi-3-mini contains 3.8 billion parameters, whereas the smallest Google Gemma model comes with 2 billion parameters. The pre-trained AI models are base models suited to general text generation and coherent responses, while the instruct variants are fine-tuned to follow instructions and complete specific tasks.
Small language models might not match the all-encompassing knowledge base or conversational capacity of ChatGPT or Gemini, but they are efficient at handling specific tasks and queries and are generally less error-prone. While Apple did not mention any specific use cases for the AI models, it has made the model weights available to the community. The weights are released under Apple's sample code licence, which allows their use for both research and commercial purposes.
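Since the weights are hosted on Hugging Face, the sketch below shows how one might load an OpenELM checkpoint with the transformers library and generate a short completion. The apple/OpenELM-270M-Instruct identifier, the use of trust_remote_code, and the choice of a Llama-2 tokenizer are assumptions rather than details confirmed in the article, so treat this as illustrative only.

```python
# Minimal sketch: loading an assumed OpenELM checkpoint from Hugging Face.
# The model id, tokenizer id, and trust_remote_code flag are assumptions;
# the OpenELM repositories ship custom model code, so remote code is enabled.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"   # assumed checkpoint name
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # assumed compatible tokenizer (gated repo)

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation for a sample prompt.
inputs = tokenizer("Small language models are useful because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```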
Apple's decision to develop SLMs underscores the company's reported focus on on-device AI. So far, it has also published papers on three other AI models: one focused on on-device capabilities, one with multimodal capabilities, and another with computer vision capabilities that can understand smartphone screen interfaces.