Input interface for ChatCohere.

Optional apiKey: The API key to use.
Default: process.env.COHERE_API_KEY
Optional model: The name of the model to use.
Default: "command"
Optional streamUsage: Whether or not to include token usage when streaming. This will include an extra chunk at the end of the stream with eventType: "stream-end" and the token usage in usage_metadata (see the streaming sketch after this list).
Default: true
Optional streaming: Whether or not to stream the response.
Default: false
Optional temperature: What sampling temperature to use, between 0.0 and 2.0. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.
Default: 0.3
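
A minimal construction sketch follows. It assumes ChatCohere is imported from the @langchain/cohere package (the package name is an assumption, not stated above); every field shown maps to an optional property of this interface and falls back to the documented default when omitted.

```typescript
import { ChatCohere } from "@langchain/cohere";

// All fields are optional; the defaults documented above apply when a
// field is omitted (model "command", temperature 0.3, streaming false,
// apiKey read from process.env.COHERE_API_KEY).
const model = new ChatCohere({
  apiKey: process.env.COHERE_API_KEY,
  model: "command",
  temperature: 0.3,
  streaming: false,
});

const response = await model.invoke("Tell me a joke about bears.");
console.log(response.content);
```

Top-level await assumes an ES module context; otherwise wrap the calls in an async function.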
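Because streamUsage defaults to true, a streamed response ends with an extra chunk whose usage_metadata field carries the token counts. The sketch below, again assuming the @langchain/cohere package and LangChain's standard .stream() method, prints the text chunks and then the usage report.

```typescript
import { ChatCohere } from "@langchain/cohere";

const model = new ChatCohere({ streamUsage: true }); // true is the default

const stream = await model.stream("Write a haiku about the sea.");
for await (const chunk of stream) {
  if (chunk.usage_metadata) {
    // The final "stream-end" chunk carries no text, only token usage.
    console.log(chunk.usage_metadata);
  } else {
    // Text chunks; content is cast to string for this sketch.
    process.stdout.write(chunk.content as string);
  }
}
```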