Update README.md
Added OpenAI to the list of inference providers. Updated the coming-soon feature list and roadmap.
mmurad2 authored Sep 4, 2024
1 parent 1a7a1c8 commit 41fee7a
Showing 1 changed file with 5 additions and 4 deletions.
README.md: 9 changes (5 additions & 4 deletions)
@@ -21,7 +21,6 @@ The Bee framework makes it easy to build agentic workflows with leading open-sou
- ⏸️ **Serialization**: Handle complex agentic workflows and easily pause/resume them [without losing state](https://github.com/i-am-bee/bee-agent-framework/blob/main/docs/overview.md#serializer) (sketched below).
- 🔍 **Traceability**: Get full visibility of your agent’s inner workings, [log](https://github.com/i-am-bee/bee-agent-framework/blob/main/docs/overview.md#logger) all running events, and use our MLflow integration (coming soon) to debug performance.
- 🎛️ **Production-level** control with [caching](https://github.com/i-am-bee/bee-agent-framework/blob/main/docs/overview.md#cache) and [error handling](https://github.com/i-am-bee/bee-agent-framework/blob/main/docs/overview.md#errors).
-- 🚧 (Coming soon) **Evaluation**: Run evaluation jobs with your own data source (custom csv or Airtable).
- 🚧 (Coming soon) **Model-agnostic support**: Change model providers in 1 line of code without breaking your agent’s functionality.
- 🚧 (Coming soon) **Chat UI**: Serve your agent to users in a delightful GUI with built-in transparency, explainability, and user controls.
- ... more on our [Roadmap](#roadmap)
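
As a rough illustration of the serialization feature above: a serializable component's state can be captured as a string and restored later. This is only a sketch; the import path and the `UnconstrainedMemory`, `serialize`, and `fromSerialized` names are assumptions, and the linked serialization docs describe the actual classes and methods.

```ts
// Sketch of pause/resume via serialization (assumed API; see docs/overview.md#serializer).
import { UnconstrainedMemory } from "bee-agent-framework/memory/unconstrainedMemory";

const memory = new UnconstrainedMemory();
// ... run part of a workflow so the memory accumulates messages ...

// Pause: capture the component's state as a plain string you can store anywhere.
const snapshot = memory.serialize();

// Resume: rebuild an equivalent object from the snapshot without losing state.
const restored = UnconstrainedMemory.fromSerialized(snapshot);
```
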
@@ -112,9 +111,10 @@ To run this example, be sure that you have installed [ollama](https://ollama.com
| Name | Description |
| ------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
| `Ollama` | LLM + ChatLLM support ([example](./examples/llms/providers/ollama.ts)) |
+| `OpenAI` | LLM + ChatLLM support ([example](./examples/llms/providers/openai.ts)) |
| `LangChain` | Use any LLM that LangChain supports ([example](./examples/llms/providers/langchain.ts)) |
| `WatsonX` | LLM + ChatLLM support ([example](./examples/llms/providers/watsonx.ts)) |
-| `BAM (IBM Internal)` | LLM + ChatLLM support ([example](./examples/llms/providers/bam.ts)) |
+| `BAM (Internal)` | LLM + ChatLLM support ([example](./examples/llms/providers/bam.ts)) |
| [Request](https://github.com/i-am-bee/bee-agent-framework/discussions) | |
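
Since every provider in the table exposes the same ChatLLM surface, swapping providers mostly comes down to which adapter is constructed. A minimal sketch of that idea; the import paths, class names, and constructor options below are assumptions inferred from the linked example files (ollama.ts, openai.ts), not a verified API.

```ts
// Assumed adapter imports; check the linked provider examples for the real paths.
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { OpenAIChatLLM } from "bee-agent-framework/adapters/openai/chat";

// Local model served by Ollama (assumes the model has already been pulled).
const ollamaLLM = new OllamaChatLLM({ modelId: "llama3.1" });

// Hosted model via the newly added OpenAI provider (expects OPENAI_API_KEY to be set).
const openaiLLM = new OpenAIChatLLM({ modelId: "gpt-4o-mini" });

// Either instance can back the same agent code, since both implement the ChatLLM interface.
```
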

### 📦 Modules
@@ -144,12 +144,13 @@ To see more in-depth explanation see [docs](./docs/overview.md).

## Roadmap

-- Evaluation with MLFlow integration
+- MLFlow integration for trace observability
- JSON encoder/decoder for model-agnostic support
-- Chat Client (GUI)
- Structured outputs
+- Chat Client (GUI)
- Improvements to base Bee agent
- Guardrails
+- Evaluation
- 🚧 TBD 🚧

## Contribution guidelines
