
Welcome


Power up your web apps with local AI.

Meet AiBrow, an open-source project that brings on-device AI to your browser: private, fast and free. It supports Llama, Gemini, Phi and many other models.

The AiBrow API follows the current proposals for the browser machine learning APIs, namely the Prompt API, the Writing assistance API and the Translation API.

These are currently being developed and trialled in Google Chrome, but AiBrow extends this base feature set with new capabilities. This means you can use AI in the browser through a number of different implementations:

  1. Using the in-browser APIs when available

  2. Using the AiBrow extension, which uses native llama.cpp

  3. Using web APIs such as WebGPU and WASM

Each method has its own advantages, limitations and performance considerations to take into account (see the Feature comparison). You can use the AiBrow Web API to check on-device support and access each of these APIs as needed, as in the sketch below.
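
As a rough illustration, a capability check before creating a session could look like the following. This is a minimal sketch that assumes the @aibrow/web package from the Quick Start below; the availability() method and its string values ('unavailable', 'downloadable', 'downloading', 'available') are taken from the Prompt API proposal, so check the API reference for the exact shape AiBrow exposes.

import AI from '@aibrow/web'

// Ask two of the backends whether a language model can run on this device
const extAvailability = await AI.AIBrow.LanguageModel.availability()
const webAvailability = await AI.AIBrowWeb.LanguageModel.availability()

if (extAvailability !== 'unavailable') {
  // Prefer the extension's native llama.cpp backend when it is installed
  const session = await AI.AIBrow.LanguageModel.create()
  console.log(await session.prompt('Hello from the extension backend'))
} else if (webAvailability !== 'unavailable') {
  // Otherwise fall back to running in the page with WebGPU / WASM
  const session = await AI.AIBrowWeb.LanguageModel.create()
  console.log(await session.prompt('Hello from the web backend'))
}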

Quick Start

Install the dependencies:

npm install @aibrow/web

You can use the LanguageModel API to have a conversation with the AI, using whichever backend you choose:

import AI from '@aibrow/web'

// WebGPU
const webGpu = await AI.AIBrowWeb.LanguageModel.create()
console.log(await webGpu.prompt('Write a short poem about the weather'))

// Llama.cpp
const ext = await AI.AIBrow.LanguageModel.create()
console.log(await ext.prompt('Write a short poem about the weather'))

// Chrome AI
const browser = await AI.Browser.LanguageModel.create()
console.log(await browser.prompt('Write a short poem about the weather'))
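
The first create() call for a backend may need to download a model, so it helps to show progress to the user. The sketch below is an assumption based on the create monitor in the Prompt API proposal (the monitor option, the 'downloadprogress' event and its loaded field); see the Model download feedback example for the form AiBrow supports.

import AI from '@aibrow/web'

// Surface model download progress while the session is being created
const session = await AI.AIBrow.LanguageModel.create({
  monitor (m) {
    m.addEventListener('downloadprogress', (e) => {
      // e.loaded reports how much of the model has been fetched so far
      console.log('Model download progress:', e.loaded)
    })
  }
})

console.log(await session.prompt('Write a short poem about the weather'))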

😃 Take a look at the examples to get started

📔 Check out the API reference to see everything that AiBrow supports

👾 The AiBrow extension is on GitHub if you want to contribute or chat

🧪 Try out some of the AiBrow demos
