Improved

Increased Context Length for J2 Foundation Models

We are excited to announce that our Jurassic-2 Foundation Models, including J2-Large, J2-Grande, and J2-Jumbo, now support an increased context length of up to 8,192 tokens (8K). This enhancement allows users to work with longer prompts and receive more comprehensive completions.

  • Impact: Users can now supply more extensive input prompts or longer conversation histories, enabling more contextually relevant and accurate responses. This significantly expands the flexibility of the Jurassic-2 Foundation Models, making them suitable for a wider range of use cases and applications.
  • Usage: To take advantage of the increased context length, simply provide your prompt or conversation history within the new token limit. The models handle the additional context automatically, with no further changes required in the API call (see the sketch after this list).
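
For illustration, here is a minimal sketch of a completion request that passes a long prompt to a J2 model over the Studio REST API. The endpoint path, model name ("j2-jumbo"), and request field names shown here are assumptions for the sake of the example; please consult the API reference for the exact request format.

```python
import os
import requests

# Minimal sketch of a long-context completion request.
# NOTE: the endpoint path, model name ("j2-jumbo"), and request fields below
# are illustrative assumptions -- check the API reference for exact names.
API_KEY = os.environ["AI21_API_KEY"]
ENDPOINT = "https://api.ai21.com/studio/v1/j2-jumbo/complete"  # assumed path

# A long prompt (e.g. a multi-turn conversation history). The only requirement
# is that prompt tokens plus requested completion tokens stay within 8,192.
with open("conversation_history.txt") as f:
    long_prompt = f.read()

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "prompt": long_prompt,  # may now be much longer than before
        "maxTokens": 512,       # completion budget; prompt + completion <= 8,192 tokens
        "temperature": 0.7,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json())
```

As the sketch suggests, the request shape itself does not change; the model simply accepts the longer prompt as long as the combined prompt and completion stay within the 8,192-token limit.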

As always, we appreciate your feedback and encourage you to reach out with any questions or concerns. Stay tuned for more updates and improvements in the future!