Token Limits
Tokens are the basic units of input and output in a language model. A token can be a whole word, part of a word, a single character, or a piece of punctuation. To count the tokens in your text, you can use OpenAI's tokenizer tool: https://platform.openai.com/tokenizer
Note that you can return at most 1,400 tokens (~5.6 kB) of data.
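If you would rather count tokens programmatically than with the web tool, a minimal sketch using the open-source tiktoken library might look like the following. The encoding name ("cl100k_base") and the sample text are illustrative assumptions; choose the encoding that matches your model.

```python
# Count tokens locally with tiktoken (pip install tiktoken) and check
# the count against the 1,400-token output limit described above.
import tiktoken

MAX_OUTPUT_TOKENS = 1400  # output limit from this page

enc = tiktoken.get_encoding("cl100k_base")  # assumed encoding
text = "Some body of text you plan to return."  # placeholder input
token_count = len(enc.encode(text))

if token_count > MAX_OUTPUT_TOKENS:
    print(f"{token_count} tokens exceeds the {MAX_OUTPUT_TOKENS}-token limit")
else:
    print(f"{token_count} tokens fits within the limit")
```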
If you would like to reduce the number of tokens in a body of text, consider using a FILTER expression (see DSL) to remove unneeded pieces of data, or use a summarize text action to condense the text.
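As a general illustration of trimming data down to a token budget (this is not the FILTER expression or the summarize text action, whose syntax is specific to this product's DSL), a Python sketch using tiktoken might look like this:

```python
# Hypothetical helper that truncates text to fit a token budget.
# It keeps the longest prefix of the text that encodes to at most
# `budget` tokens; real reduction would more likely filter fields
# or summarize, but the budgeting idea is the same.
import tiktoken

def truncate_to_token_budget(text: str, budget: int = 1400,
                             encoding_name: str = "cl100k_base") -> str:
    """Return the longest prefix of `text` within `budget` tokens."""
    enc = tiktoken.get_encoding(encoding_name)
    tokens = enc.encode(text)
    if len(tokens) <= budget:
        return text
    return enc.decode(tokens[:budget])
```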