Getting started

The API allows you to make the best use of the device's hardware to run local AI in the browser in the most performant way possible. It's based around the Chrome built-in AI APIs, but adds support for new features such as custom/HuggingFace models, grammar schemas, JSON output, LoRA adapters, embeddings, and a fallback to a self-hosted or public server for lower-powered devices.
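
The examples on this page import from the @aibrow/web package. Assuming it's published to npm under the same name used in the imports below, it can be installed with:

npm install @aibrow/web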

Take a look at the Feature comparison table for each implementation.

AiBrow extension using llama.cpp natively

Using the AiBrow extension gives the best on-device performance with the broadest feature set. It's a browser extension that leverages the powerful llama.cpp and can give great performance on all kinds of desktop computers, using either the GPU or the CPU. Downloaded models are stored in a common repository, meaning models only need to be downloaded once. You can use models provided by AiBrow, or any GGUF model hosted on HuggingFace.

import AI from '@aibrow/web'

const { ready, extension, helper } = await AI.aibrow.capabilities()
if (ready) {
  const session = await AI.aibrow.languageModel.create()
  console.log(await session.prompt('Write a short poem about the weather'))
} else {
  // Here are some tips to help users install the AiBrow extension & helper https://docs.aibrow.ai/guides/helping-users-install-aibrow
  console.log(`Extension is not fully installed. Extension=${extension}. Helper=${helper}`)
}

AiBrow on WebGPU

WebGPU provides a good middle ground between performance and feature set, but it comes with some memory usage restrictions and performance overheads. If you only need to use small models, or want to provide a fallback for when the extension isn't installed, this can be a great solution. Under the hood it uses transformers.js from HuggingFace. Models are downloaded through an AiBrow frame, which means models only need to be downloaded once. You can use models provided by AiBrow, or any ONNX model hosted on HuggingFace.

import AI from '@aibrow/web'

const session = await AI.web.languageModel.create()
console.log(await session.prompt('Write a short poem about the weather'))
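
When a model hasn't been downloaded yet, create() can take a while on first use. The docs include a dedicated Model download feedback example and an AICreateMonitor type; the sketch below assumes AiBrow follows the Chrome built-in AI monitor convention, so the event name and fields are assumptions rather than confirmed API:

import AI from '@aibrow/web'

// Hedged sketch: report model download progress while the session is created.
// The 'downloadprogress' event and e.loaded field follow the Chrome built-in AI
// convention and are assumptions for AiBrow.
const session = await AI.web.languageModel.create({
  monitor (m) {
    m.addEventListener('downloadprogress', (e) => {
      console.log('downloadprogress', e.loaded)
    })
  }
})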

Chrome built-in AI

The Chrome built-in AI is a great option for simple tasks such as summarization, writing, etc. It has a smaller feature set than the AiBrow extension and WebGPU, with reasonable on-device performance.

import AI from '@aibrow/web'

if (AI.browser) {
  const session = await AI.browser.languageModel.create()
  console.log(await session.prompt('Write a short poem about the weather'))
} else {
  console.log(`Your browser doesn't support the built-in window.ai API`)
}
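
Because all three entry points expose the same languageModel interface, the checks above can be combined into a single fallback chain. Below is a minimal sketch using only the calls shown on this page; the helper name and the order of preference are suggestions, not part of the API:

import AI from '@aibrow/web'

// Prefer the extension (native llama.cpp), then Chrome built-in AI, then WebGPU.
// createBestSession is a hypothetical helper for illustration only.
async function createBestSession () {
  const { ready } = await AI.aibrow.capabilities()
  if (ready) return await AI.aibrow.languageModel.create()
  if (AI.browser) return await AI.browser.languageModel.create()
  return await AI.web.languageModel.create()
}

const session = await createBestSession()
console.log(await session.prompt('Write a short poem about the weather'))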
