CoreModel API
Most top-level APIs don't pass the prompt you provide directly to the language model. Instead, they wrap your prompt in a template that gives the model more context about the task. For example, the template could indicate that the model is taking in a conversation in which some snippets of text come from an assistant and others from a user.
The CoreModel API is powerful because it exposes the underlying model and passes your prompts and options straight through. This gives you more power to constrain the language model in new ways, at the cost of building templates and configurations yourself.
Prompting
Prompting works in much the same way as with the other models. The prompt you provide to the session is passed directly to the language model.
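To make the difference concrete, here is a minimal sketch in Python contrasting the two behaviours. The function names and the `<|user|>` / `<|assistant|>` markers are purely illustrative assumptions, not part of any real API: a higher-level API typically wraps your text in role markers behind the scenes, while a core-model API forwards the string unchanged.

```python
# Hypothetical sketch: contrast a templated prompt with a raw pass-through.
# The names and markers below are illustrative, not a real API surface.

def wrap_in_chat_template(prompt: str) -> str:
    """What a higher-level API might do behind the scenes."""
    return "<|user|>\n" + prompt + "\n<|assistant|>\n"

def core_model_input(prompt: str) -> str:
    """A core-model API forwards the prompt to the model unchanged."""
    return prompt

raw = "Summarise this article in one sentence."
print(wrap_in_chat_template(raw))  # role markers added around your text
print(core_model_input(raw))       # exactly the string you provided
```

The point is only that with the CoreModel API, whatever markers the model needs are yours to supply.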
Prompting with templates
Using a template is a useful way to indicate that the language model is taking part in a conversation: it can mark some snippets of text as coming from the user and others from the assistant. You can build your own templates manually or use third-party libraries to help.
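As a sketch of building a template manually, the Python function below renders a list of (role, text) turns into a single prompt string. It uses ChatML-style `<|im_start|>` / `<|im_end|>` markers as one common convention; this is an assumption for illustration, and the exact markers your model expects depend on how it was trained.

```python
# Illustrative only: one common convention (ChatML-style markers) for
# tagging which turns come from the user and which from the assistant.
# Check your model's documentation for the markers it was trained with.

def build_chat_prompt(turns: list[tuple[str, str]]) -> str:
    """Render (role, text) turns into a single templated prompt string."""
    parts = [f"<|im_start|>{role}\n{text}<|im_end|>" for role, text in turns]
    # Finish with an open assistant turn so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

conversation = [
    ("user", "What is the capital of France?"),
    ("assistant", "Paris."),
    ("user", "And of Italy?"),
]
print(build_chat_prompt(conversation))
```

Ending the string with an open assistant turn is the key design choice: the model's completion then reads naturally as the assistant's next reply.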
Demos