Welcome
Power up your web apps with local AI.
Meet AiBrow, which enables on-device AI in your browser. Private, Fast and Free. It's Open Source and supports Llama, Gemini, Phi and many other models.
The AiBrow API follows the Prompt API proposal currently being developed in Google Chrome and extends that base feature set with new capabilities. AI in the browser can be provided in one of three ways:
- Using the in-browser window.ai API
- Using the AiBrow extension, which implements native llama.cpp
- Using web APIs such as WebGPU and WASM
Each method has its own advantages, limitations, and performance considerations (see the feature comparison). You can use the AiBrow Web API to check for on-device support and access each of these APIs as needed.
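As a minimal sketch of how that detection might look (the window.aibrow property is an assumed injection point for the AiBrow extension, used here purely for illustration; check the API reference for the detection helpers AiBrow actually exposes):

```typescript
// Sketch of backend detection. The window.aibrow property is an assumed
// injection point for the AiBrow extension; verify the exact detection
// helpers against the API reference before relying on this.
function detectBackend (): 'extension' | 'built-in' | 'web' {
  if (typeof (window as any).aibrow !== 'undefined') {
    return 'extension'   // AiBrow extension backed by native llama.cpp
  }
  if (typeof (window as any).ai !== 'undefined') {
    return 'built-in'    // The browser's built-in window.ai Prompt API
  }
  return 'web'           // Fall back to WebGPU/WASM models running in-page
}

console.log(`Detected backend: ${detectBackend()}`)
```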
Quick Start
Install the dependencies:
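For example, assuming the client library is published on npm as @aibrow/web (the package name here is an assumption; see the API reference for the actual one):

```sh
npm install @aibrow/web
```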
You can use the languageModel API to have a conversation with the AI, using whichever backend you choose.
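A minimal sketch of a conversation, assuming the @aibrow/web package from the install step above and Prompt-API-style create() / prompt() methods (the exact identifiers may differ; see the API reference):

```typescript
// Minimal sketch: the import path and the create()/prompt() names follow
// the Prompt API proposal and are assumptions here. Check the API
// reference for AiBrow's confirmed identifiers.
import AI from '@aibrow/web'

async function main () {
  // Create a session; Prompt-API-style implementations typically accept
  // options such as a system prompt.
  const session = await AI.languageModel.create({
    systemPrompt: 'You are a friendly assistant.'
  })

  // Send a prompt and wait for the complete reply
  const reply = await session.prompt('Write a haiku about browsers.')
  console.log(reply)

  // Release the session's resources when finished
  session.destroy()
}

main()
```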
- Take a look at the examples to get started
- Check out the API reference to see everything that AiBrow supports
- The AiBrow extension is on GitHub if you want to contribute or chat
- Try out some of the AiBrow demos