Feature Request: Integrate Anthropic's New Prompt Caching #104

Open
stchakwdev opened this issue Aug 16, 2024 · 0 comments
We should consider incorporating Anthropic's new prompt caching feature into this project. This integration could offer several benefits:

  1. Reduced costs: Anthropic bills cached prompt reads at a reduced rate compared to uncached input tokens, so repeatedly sending Devon's large system prompt should cost less.
  2. Improved latency: cache hits skip reprocessing the cached prefix, so repeated or similar queries should return faster.
  3. Enhanced agentic capabilities: Anthropic positions the feature as well suited to agentic workflows that reuse long instructions and tool definitions across many turns. It would be worth measuring how this affects Devon's performance.

Since Devon resends a long, mostly stable prompt on every agent step, integrating prompt caching could meaningfully cut both per-request cost and latency.
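As a rough sketch of what the integration might look like: at the time of this issue, prompt caching is an Anthropic beta that works by attaching a `cache_control` marker to a stable content block (e.g. the system prompt) and sending a beta header. The helper name, prompt text, and model string below are placeholders, not Devon code.

```python
# Sketch of marking a large, stable system prompt as cacheable,
# based on Anthropic's prompt-caching beta as documented in Aug 2024.
# LONG_SYSTEM_PROMPT and build_cached_system_block are hypothetical names.

LONG_SYSTEM_PROMPT = "You are Devon, an autonomous software engineer. ..."  # placeholder

def build_cached_system_block(text: str) -> list:
    """Wrap a system prompt in a content block that asks the API to
    cache it, so later requests with the same prefix can reuse it."""
    return [
        {
            "type": "text",
            "text": text,
            # Marks this block as cacheable on Anthropic's side.
            "cache_control": {"type": "ephemeral"},
        }
    ]

# The block would then be passed to the Messages API roughly like this
# (requires the `anthropic` SDK and an API key, so left commented out):
#
# import anthropic
# client = anthropic.Anthropic()
# response = client.messages.create(
#     model="claude-3-5-sonnet-20240620",
#     max_tokens=1024,
#     extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},
#     system=build_cached_system_block(LONG_SYSTEM_PROMPT),
#     messages=[{"role": "user", "content": "Fix the failing test."}],
# )
```

The key design point is that only the stable prefix (system prompt, tool definitions) is marked for caching, while the per-step user messages stay uncached, so every agent turn after the first hits the cache.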
