
Opera Gives Users Access to AI Models on Their Device

Opera announced yesterday that users will now be able to download and run large language models (LLMs) locally on their computers. More than 150 models from over 50 families will be available to Opera One users on the developer stream, including Llama from Meta, Gemma from Google, and Vicuna.

The feature uses the open source Ollama framework to run the AI models directly on the device, so no internet connection is needed once a model is downloaded. Each model typically takes up more than 2GB of space, so users should make sure they have enough free storage before starting a download. Opera has not yet applied any compression to reduce these sizes.
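For readers curious what "running locally" looks like in practice, Ollama serves models over a small HTTP API on the user's own machine. A minimal sketch in Python, assuming a local Ollama install with a model already pulled (the endpoint and model name are Ollama defaults, not details of Opera's integration):

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing here leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Assemble the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its reply."""
    body = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `ask("llama3", "Why run an LLM locally?")` would return the model's answer entirely offline, which is essentially what Opera's browser integration wraps in a UI.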

Downloading the models allows users to test and experiment with different AI assistants without an online connection. However, services like HuggingChat and Quora’s Poe also offer ways to explore models online without using as much local storage.

Opera has been steadily adding AI features to its browsers. Last year it introduced a sidebar assistant named Aria, and in January the company shared plans for a new iOS browser built on its own engine, anticipating that Apple will be required to allow alternative browser engines in the EU.

By providing local access to popular AI models, Opera gives its users more control and room to customize, and makes it easier for developers and enthusiasts to experiment with conversational AI. If the download sizes shrink through compression, local LLMs in Opera could bring these capabilities within reach of even more users.
