Welcome
Power up your web apps with local AI.
Meet AiBrow, which enables on-device AI in your browser. Private, fast, and free. It's open source and supports Llama, Gemini, Phi, and many other models.
The AiBrow API follows the current proposal for in-browser AI, and extends this base feature set with new capabilities. AI in the browser is implemented in one of three ways:
- Using the in-browser window.ai API
- Using the AiBrow extension, which implements native llama.cpp
- Using web APIs such as WebGPU and WASM
Each method has its own advantages, limitations, and performance considerations to take into account. You can check on-device support and access each of these APIs as needed.
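As a rough illustration, the three backends above can be feature-detected at runtime. This is only a sketch: the `window.ai` and `window.aibrow` globals are assumptions based on the description on this page, and `navigator.gpu` is the standard WebGPU check.

```javascript
// Sketch: detect which AI backends are available in the current page.
// The window.ai and window.aibrow property names are assumptions based
// on this page's description; check the API reference for exact names.
function detectBackends(win, nav) {
  const backends = [];
  if (win && win.ai) backends.push('in-browser window.ai');
  if (win && win.aibrow) backends.push('AiBrow extension (native llama.cpp)');
  if (nav && nav.gpu) backends.push('WebGPU / WASM'); // standard WebGPU check
  return backends;
}
```

In a page you would call `detectBackends(window, navigator)` and pick a backend from the returned list.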
Install the dependencies:
You can use the languageModel API to have a conversation with the AI, using whichever backend you choose.
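A minimal sketch of such a conversation is shown below. The session shape (`create`, `prompt`, `destroy`) mirrors the in-browser window.ai proposal, and the assumption here is that AiBrow's languageModel entry point follows the same shape; see the API reference for the exact names and options.

```javascript
// Sketch: ask the model a single question via a languageModel entry
// point (e.g. the assumed window.aibrow.languageModel), whose shape
// mirrors the in-browser window.ai proposal.
async function ask(languageModel, question) {
  // Create a session; the first call may trigger a model download.
  const session = await languageModel.create({
    systemPrompt: 'You are a concise assistant.'
  });
  const reply = await session.prompt(question);
  await session.destroy(); // free model resources when done
  return reply;
}
```

For example, `ask(window.aibrow.languageModel, 'Hello')` (entry-point name assumed) would return the model's reply as a string.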
- Take a look at the API reference to see everything that AiBrow supports
- The AiBrow extension is open source if you want to contribute or chat
- Try out some of the demos