Enhancement: New features on the Anthropic API reducing costs by up to 90%. #5034
tracer8 started this conversation in Feature Requests & Suggestions
Replies: 1 comment · 3 replies
- Thanks. Prompt caching is already supported, and some of these features already have open issues.
-
What features would you like to see added?
New features on the Anthropic API reducing costs by up to 90%.
Prompt caching
Provide Claude with more background knowledge and example outputs to improve response accuracy and latency—all while reducing costs by up to 90%.
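As a rough illustration of how prompt caching is used, the Anthropic API lets you mark a cache breakpoint on a content block with `"cache_control": {"type": "ephemeral"}`; the prefix up to that block is cached on first use and reused at reduced cost afterward. The sketch below only assembles the request body offline (the model ID and text are placeholders), so it runs without an API key:

```python
# Sketch of a prompt-caching request body. The background text and model ID
# are placeholder assumptions; sending it requires the Anthropic client or HTTP.
background = "Long reference document the assistant should rely on..."

request = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "system": [
        {"type": "text", "text": "You are a support assistant."},
        # Everything up to and including this block is eligible for caching
        # and reused on subsequent calls with the same prefix.
        {
            "type": "text",
            "text": background,
            "cache_control": {"type": "ephemeral"},
        },
    ],
    "messages": [{"role": "user", "content": "Summarize the document."}],
}
```

The savings come from keeping the large, stable context (background knowledge, example outputs) before the breakpoint and only varying the final user message.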
Message Batches API
Send any number of batches of up to 100,000 messages per batch. Batches are processed asynchronously with results returned as soon as the batch is complete and cost 50% less than standard API calls.
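For context, a batch request pairs each entry's `custom_id` with the params of an ordinary Messages call, so results can be matched back when the batch completes. This sketch only builds the JSON body offline; the questions and model ID are stand-ins:

```python
# Sketch of a Message Batches request body: one entry per message, each with
# a caller-chosen custom_id and standard Messages params. Built offline only.
questions = ["What is prompt caching?", "What is batching?"]

batch = {
    "requests": [
        {
            "custom_id": f"question-{i}",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 256,
                "messages": [{"role": "user", "content": q}],
            },
        }
        for i, q in enumerate(questions)
    ]
}
```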
Token counting
Determine the number of tokens in a message before sending it to Claude to make informed decisions about your prompts and usage.
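The token-counting endpoint accepts the same shape as a regular Messages call and returns an input-token count without generating a response. The sketch below only assembles that payload (model ID and text are placeholders); the actual call would go to the counting endpoint rather than the Messages endpoint:

```python
# Sketch of a token-counting payload: same fields as a Messages request,
# but no max_tokens is needed since nothing is generated. Built offline only.
count_request = {
    "model": "claude-3-5-sonnet-20241022",
    "system": "You are a terse assistant.",
    "messages": [{"role": "user", "content": "How many tokens is this prompt?"}],
}
# This would be POSTed to the token-counting endpoint, and the response's
# input_tokens field used to decide whether the prompt fits a budget.
```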
Multimodal PDF support
Understand and analyze both text and visual content within PDF documents for more comprehensive analysis.
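PDFs are passed to the API as a document content block containing a base64-encoded source alongside an ordinary text prompt. In this sketch the bytes are a stand-in for a real file, and only the message structure is built:

```python
import base64

# Sketch of a user message carrying a PDF. The bytes below are a placeholder,
# not a valid PDF; in practice you would read the file from disk.
pdf_bytes = b"%PDF-1.4 ... (real file contents here)"

message = {
    "role": "user",
    "content": [
        {
            "type": "document",
            "source": {
                "type": "base64",
                "media_type": "application/pdf",
                "data": base64.b64encode(pdf_bytes).decode("ascii"),
            },
        },
        {"type": "text", "text": "Summarize both the text and the figures."},
    ],
}
```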
Models endpoint
List all available models to select between the latest Anthropic models, or query a specific model string to validate user-provided model names.
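One concrete use for the models endpoint is validating a user-supplied model string before accepting it. In this sketch the listing is a hard-coded stand-in for the endpoint's response (the model IDs shown are examples), so only the validation logic is demonstrated:

```python
# Sketch: validate a user-provided model string against a models listing.
# `listed` stands in for the response of the models endpoint; in a real
# integration it would come from the API instead of being hard-coded.
def is_valid_model(model_id, listed_models):
    """Return True if model_id appears in the listing."""
    return any(m["id"] == model_id for m in listed_models)

listed = [
    {"id": "claude-3-5-sonnet-20241022"},
    {"id": "claude-3-5-haiku-20241022"},
]
```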
More details
New features on the Anthropic API reducing costs by up to 90%.
Which components are impacted by your request?
No response