Token Limits

What are tokens?

Tokens are the basic units of input and output in a Large Language Model. A token can be a whole word, part of a word, a single character, or a piece of punctuation.

To count the tokens in a piece of text, you can use OpenAI's tokenizer tool: https://platform.openai.com/tokenizer
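If you prefer to count tokens in code rather than in the browser, the sketch below uses OpenAI's open-source tiktoken library. The cl100k_base encoding shown here is an assumption; pick the encoding that matches the model you are targeting.

```python
# Minimal sketch of counting tokens with OpenAI's open-source tiktoken
# library (pip install tiktoken). The "cl100k_base" encoding is an
# assumption; use the encoding that matches your target model.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")

text = "Plugins can return a limited number of tokens to the AI agent."
tokens = encoding.encode(text)

print(f"Token count: {len(tokens)}")
print(f"Token IDs:   {tokens}")
```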

How many tokens can be returned from a plugin?

Today, a plugin can return up to 1,400 tokens (roughly 5.6 kB) of data to the AI agent.
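As a rough pre-flight check, you can count the tokens in your serialized payload before returning it and compare against the limit. The sketch below is illustrative only; the TOKEN_LIMIT constant, the sample payload, and the use of tiktoken and json are assumptions, not part of the plugin runtime.

```python
# Illustrative pre-flight check: count the tokens in a serialized payload
# before handing it to the AI agent. TOKEN_LIMIT, the sample payload, and
# the tiktoken encoding are assumptions for this sketch.
import json
import tiktoken

TOKEN_LIMIT = 1400  # current per-plugin response limit

encoding = tiktoken.get_encoding("cl100k_base")

payload = {"records": [{"id": i, "status": "open"} for i in range(50)]}
serialized = json.dumps(payload)

token_count = len(encoding.encode(serialized))
if token_count > TOKEN_LIMIT:
    print(f"Payload is {token_count} tokens; trim it before returning.")
else:
    print(f"Payload fits: {token_count} / {TOKEN_LIMIT} tokens.")
```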

If you need to reduce the number of tokens returned from your plugin, we have a few tools you can use (illustrated in the sketch after this list):

  1. Eliminate fields from a data structure using the MERGE statement from our Data Mapper.
  2. Remove records from your list using DSL FILTER expressions.
  3. Reduce the length of a text field using the summarize text action.
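The Data Mapper MERGE statement, DSL FILTER expressions, and the summarize text action each have their own syntax. The plain-Python sketch below only illustrates the effect of the three techniques on a payload: dropping unneeded fields, filtering out irrelevant records, and shortening long text. The field names and the naive truncation rule are assumptions for illustration, not product behavior.

```python
# Plain-Python illustration of the three reduction techniques. It does not
# use Data Mapper, DSL, or summarize text action syntax; field names and
# the truncation rule are assumptions for illustration only.
records = [
    {"id": 1, "status": "open",   "description": "A very long description ... " * 20, "audit_log": "..."},
    {"id": 2, "status": "closed", "description": "Another long description ...",      "audit_log": "..."},
]

# 1. Eliminate fields the agent does not need (the effect of a MERGE that
#    keeps only selected fields).
KEEP_FIELDS = {"id", "status", "description"}
trimmed = [{k: v for k, v in r.items() if k in KEEP_FIELDS} for r in records]

# 2. Remove records that are not relevant (the effect of a FILTER expression).
filtered = [r for r in trimmed if r["status"] == "open"]

# 3. Shorten long text fields (the effect of the summarize text action;
#    naive truncation is shown here purely as a stand-in).
MAX_CHARS = 200
for r in filtered:
    if len(r["description"]) > MAX_CHARS:
        r["description"] = r["description"][:MAX_CHARS] + "…"

print(filtered)
```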