Getting JSON output

AiBrow supports defining a grammar on the coreModel and languageModel APIs. This allows you to constrain the output as needed. Grammar support is provided by llama-cpp.

Learn more about how to use grammar in llama-cpp.

Constraining the output allows you to parse it confidently and work with the result. This is an excellent way to chain prompts together and make full use of the language model. Here's how you can extract data from a piece of text as a JSON structure:

const session = await window.ai.coreModel.create()
// We want to extract some data from this text
const prompt = "Extract data from the following text: John Doe is an innovative software developer with a passion for creating intuitive user experiences. Based in the heart of England, John has spent the past decade refining his craft, working with both startups and established tech companies. His deep commitment to quality and creativity is evident in the numerous award-winning apps he has developed, which continue to enrich the digital lives of users worldwide. Beyond his technical skills, John is admired for his collaborative spirit and mentorship, always eager to share his knowledge and inspire the next generation of tech enthusiasts."

// Define a JSON Schema describing the object we want returned
const grammar = {
  "type": "object",
  "properties": {
    "first_name": {
      "type": "string"
    },
    "last_name": {
      "type": "string"
    },
    "country": {
      "type": "string"
    }
  }
}

// Prompt the model
const stream = await session.promptStreaming(prompt, { grammar })
let output = ''
for await (const chunk of stream) {
  console.log(chunk)
  output += chunk
}

console.log(JSON.parse(output))
// { "first_name": "John", "last_name": "Doe", "country": "England" }