Welcome
Power up your web apps with local AI.
Meet AiBrow, which enables on-device AI in your browser. Private, fast, and free. It's open source and supports Llama, Gemini, Phi, and many other models.
The AiBrow API follows the current proposals for the in-browser machine learning APIs, namely the Prompt API. These proposals are currently being developed and trialled in Chrome, but AiBrow extends this base feature set with new capabilities. This means you can use AI in the browser through a number of different implementations:
Using the in-browser APIs when available
Using the AiBrow browser extension, which uses native llama.cpp
Using web APIs such as WebGPU and WASM
Each method has its own advantages, limitations, and performance considerations to take into account. You can use AiBrow to check on-device support and access each of these APIs as needed.
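As a rough illustration, backend selection can be done with simple feature detection. This is a minimal sketch, not AiBrow's actual helper: the `window.ai` and `window.aibrow` property names are assumptions based on the Prompt API proposal and the extension's injection pattern.

```javascript
// Hedged sketch: pick the first available backend in preference order.
// Property names (ai, aibrow, languageModel) are assumptions, not the
// confirmed AiBrow API surface.
function pickBackend(scope) {
  if (scope?.ai?.languageModel) return "browser";       // built-in browser API
  if (scope?.aibrow?.languageModel) return "extension"; // AiBrow extension (native llama.cpp)
  return "web";                                         // fall back to WebGPU/WASM
}

// In a page: pickBackend(window) → "browser", "extension", or "web"
```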
Install the dependencies:
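For example, with npm (the package name shown here is an assumption; check the AiBrow docs for the exact package):

```shell
# Hypothetical package name -- verify against the AiBrow documentation
npm install @aibrow/extension
```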
You can use the languageModel API to have a conversation with the AI, using whichever backend you choose.
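A minimal conversation sketch, assuming the Prompt API shape (`create()` returning a session with `prompt()` and `destroy()`); exact names may differ from the shipped AiBrow API:

```javascript
// Hedged sketch of a single-turn conversation via the languageModel API.
// The session methods (create, prompt, destroy) follow the Prompt API
// proposal; treat them as assumptions, not a confirmed signature.
async function askLanguageModel(ai, question) {
  if (!ai?.languageModel) {
    return "languageModel API not available";
  }
  const session = await ai.languageModel.create();
  try {
    return await session.prompt(question);
  } finally {
    session.destroy(); // free model resources when done
  }
}

// In a page with the AiBrow extension installed:
// askLanguageModel(window.aibrow, "Write a haiku about browsers").then(console.log);
```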
Take a look at the examples to get started
Read the API reference to see everything that AiBrow supports
The AiBrow extension is on GitHub if you want to contribute or chat
Try out some of the demos