Merge branch 'master' into add_promql_capability
nherment authored Jan 29, 2025
2 parents 768e39a + bfab843 commit 7633699
Showing 229 changed files with 8,487 additions and 2,596 deletions.
3 changes: 1 addition & 2 deletions .github/workflows/build-and-test.yaml
@@ -37,10 +37,9 @@ jobs:
curl -sSL https://install.python-poetry.org | python3 - --version 1.4.0
poetry config virtualenvs.create false
poetry install --no-root
poetry run python -m playwright install --with-deps firefox
sudo apt-get install -y binutils
pyinstaller holmes.py --add-data 'holmes/plugins/runbooks/*:holmes/plugins/runbooks' --add-data 'holmes/plugins/prompts/*:holmes/plugins/prompts' --add-data 'holmes/plugins/toolsets/*:holmes/plugins/toolsets' --hidden-import=tiktoken_ext.openai_public --hidden-import=tiktoken_ext --hiddenimport litellm.llms.tokenizers --collect-data litellm
pyinstaller holmes.py --add-data 'holmes/plugins/runbooks/*:holmes/plugins/runbooks' --add-data 'holmes/plugins/prompts/*:holmes/plugins/prompts' --add-data 'holmes/plugins/toolsets/*:holmes/plugins/toolsets' --hidden-import=tiktoken_ext.openai_public --hidden-import=tiktoken_ext --hiddenimport litellm.llms.tokenizers --hiddenimport litellm.litellm_core_utils.tokenizers --collect-data litellm
ls dist
- name: Run tests
2 changes: 1 addition & 1 deletion .github/workflows/build-binaries-and-brew.yaml
@@ -67,7 +67,7 @@ jobs:
# regarding the tiktoken part of the command, see https://github.com/openai/tiktoken/issues/80
# regarding the litellm part of the command, see https://github.com/pyinstaller/pyinstaller/issues/8620#issuecomment-2186540504
run: |
pyinstaller holmes.py --add-data 'holmes/plugins/runbooks/*:holmes/plugins/runbooks' --add-data 'holmes/plugins/prompts/*:holmes/plugins/prompts' --add-data 'holmes/plugins/toolsets/*:holmes/plugins/toolsets' --hidden-import=tiktoken_ext.openai_public --hidden-import=tiktoken_ext --hiddenimport litellm.llms.tokenizers --collect-data litellm
pyinstaller holmes.py --add-data 'holmes/plugins/runbooks/*:holmes/plugins/runbooks' --add-data 'holmes/plugins/prompts/*:holmes/plugins/prompts' --add-data 'holmes/plugins/toolsets/*:holmes/plugins/toolsets' --hidden-import=tiktoken_ext.openai_public --hidden-import=tiktoken_ext --hiddenimport litellm.llms.tokenizers --hiddenimport litellm.litellm_core_utils.tokenizers --collect-data litellm
ls dist
- name: Zip the application (Unix)
1 change: 0 additions & 1 deletion .github/workflows/llm-evaluation.yaml
@@ -26,7 +26,6 @@ jobs:
curl -sSL https://install.python-poetry.org | python3 - --version 1.4.0
poetry config virtualenvs.create false
poetry install --no-root
poetry run python -m playwright install --with-deps firefox
- name: Run tests
shell: bash
1 change: 0 additions & 1 deletion Dockerfile
@@ -75,7 +75,6 @@ ENV PYTHONPATH=$PYTHONPATH:.:/app/holmes
WORKDIR /app

COPY --from=builder /app/venv /venv
RUN python -m playwright install firefox --with-deps

# We're installing here libexpat1, to upgrade the package to include a fix to 3 high CVEs. CVE-2024-45491,CVE-2024-45490,CVE-2024-45492
RUN apt-get update \
149 changes: 129 additions & 20 deletions README.md
@@ -19,7 +19,7 @@ To this 👇

### Key Features
- **Automatic data collection:** HolmesGPT surfaces up the observability data you need to investigate
- **Secure:** *Read-only* access to your data - respects RBAC permissions
- **Secure:** *Read-only* access to your data - respects RBAC permissions
- **Runbook automation and knowledge sharing:** Tell Holmes how you investigate today and it will automate it
- **Extensible:** Add your own data sources (tools) and Holmes will use them to investigate
- **Data Privacy:** Bring your own API key for any AI provider (OpenAI, Azure, AWS Bedrock, etc)
@@ -41,7 +41,7 @@ Includes free use of the Robusta AI model.
![Screenshot 2024-10-31 at 11 40 09](https://github.com/user-attachments/assets/2e90cc7b-4b0a-4386-ab4f-0d36692b549c)


[Sign up for Robusta SaaS](https://platform.robusta.dev/signup/?utm_source=github&utm_medium=holmesgpt-readme) (Kubernetes cluster required) or contact us about on-premise options.
[Sign up for Robusta SaaS](https://platform.robusta.dev/signup/?utm_source=github&utm_medium=holmesgpt-readme&utm_content=ways_to_use_holmesgpt_section) (Kubernetes cluster required) or contact us about on-premise options.
</details>

<details>
@@ -178,7 +178,20 @@ See an example implementation [here](examples/custom_llm.py).

Like what you see? Discover [more use cases](#more-use-cases) or get started by [installing HolmesGPT](#installation).

## Installation
## In-Cluster Installation (Recommended)

Install Holmes + [Robusta](https://github.com/robusta-dev/robusta) as a unified package:

- Analysis based on **GPT-4o** (no API key needed)
- Simple installation using `helm`
- Built-in integrations with **Prometheus alerts** and **Slack**
- Visualize Kubernetes issues on a timeline, and analyze them with Holmes in a single click

**Note:** Requires a Kubernetes cluster.

[Create a free Robusta UI account »](https://platform.robusta.dev/signup/?utm_source=github&utm_medium=holmesgpt-readme&utm_content=easy_install_in_cluster_section)

## More Installation methods

**Prerequisite:** <a href="#getting-an-api-key"> Get an API key for a supported LLM.</a>

@@ -281,32 +281,79 @@ docker run -it --net=host -v -v ~/.holmes:/root/.holmes -v ~/.aws:/root/.aws -v
<details>
<summary>Run HolmesGPT in your cluster (Helm)</summary>

Most users should install Holmes using the instructions in the [Robusta docs ↗](https://docs.robusta.dev/master/configuration/ai-analysis.html) and NOT the below instructions.
Most users should install Holmes using the instructions in the [Robusta docs ↗](https://docs.robusta.dev/master/configuration/ai-analysis.html) and NOT the instructions below.

By using the ``Robusta`` integration you’ll benefit from an end-to-end integration that integrates with ``Prometheus alerts`` and ``Slack``. Using the below instructions you’ll have to build many of those components yourself.
By using the `Robusta` integration, you’ll benefit from a fully integrated setup that works seamlessly with `Prometheus alerts` and `Slack`. Using the instructions below requires you to build and configure many of these components yourself.

In this mode, all the parameters should be passed to the HolmesGPT deployment, using environment variables.
### Environment Variable Configuration

We recommend pulling sensitive variables from Kubernetes ``secrets``.
In this mode, all parameters should be passed to the HolmesGPT deployment using environment variables. To securely manage sensitive data, we recommend pulling sensitive variables from Kubernetes `secrets`.

First, you'll need to create your ``holmes-values.yaml`` file, for example:
#### Example Configuration
Create a `holmes-values.yaml` file with your desired environment variables:

    additionalEnvVars:
      - name: MODEL
        value: gpt-4o
      - name: OPENAI_API_KEY
        value: <your open ai key>
```yaml
additionalEnvVars:
  - name: MODEL
    value: gpt-4o
  - name: OPENAI_API_KEY
    value: <your open ai key>
```
Install Holmes with Helm:
```bash
helm repo add robusta https://robusta-charts.storage.googleapis.com && helm repo update
helm install holmes robusta/holmes -f holmes-values.yaml
```

For all LLMs, you must provide the `MODEL` environment variable, which specifies the model you are using. Some LLMs may require additional variables.

Then, install with ``helm``;
### Using `{{ env.VARIABLE_NAME }}` for Secrets

    helm repo add robusta https://robusta-charts.storage.googleapis.com && helm repo update
    helm install holmes robusta/holmes -f holmes-values.yaml
For enhanced security and flexibility, you can substitute values directly with environment variables using the `{{ env.VARIABLE_NAME }}` syntax. This is especially useful for passing sensitive information like API keys or credentials.

Example configuration for OpenSearch integration:

```yaml
toolsets:
  opensearch:
    enabled: true
    config:
      # OpenSearch configuration
      opensearch_clusters:
        - hosts:
            - host: "{{ env.OPENSEARCH_URL }}"
              port: 9200
          headers:
            Authorization: "Basic {{ env.OPENSEARCH_BEARER_TOKEN }}"
          # Additional parameters
          use_ssl: true
          ssl_assert_hostname: false
          verify_certs: false
          ssl_show_warn: false
```
In this example:
- `{{ env.OPENSEARCH_URL }}` will be replaced by the `OPENSEARCH_URL` environment variable.
- `{{ env.OPENSEARCH_BEARER_TOKEN }}` will pull the value of the `OPENSEARCH_BEARER_TOKEN` environment variable.

This approach allows sensitive variables to be managed securely, such as by using Kubernetes secrets.
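For example, assuming the Helm chart forwards `additionalEnvVars` entries verbatim as standard Kubernetes container environment variables (the secret and key names below are illustrative), the token above could be sourced from a secret instead of being written into the values file:

```bash
# Create a secret holding the sensitive value (names are illustrative)
kubectl create secret generic holmes-opensearch --from-literal=bearer-token='<your token>'
```

```yaml
additionalEnvVars:
  - name: OPENSEARCH_BEARER_TOKEN
    valueFrom:
      secretKeyRef:
        name: holmes-opensearch
        key: bearer-token
```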

### Custom Toolset Configurations

You can also add custom configurations for other toolsets. For example:

For all LLMs you need to provide the ``MODEL`` environment variable, which specifies which model you are using.
```yaml
toolsets:
  tool_name_here:
    enabled: true
    config:
      # Custom configuration for your tool
      custom_param: "{{ env.CUSTOM_PARAM }}"
```

Some LLMs require additional variables:
This structure enables you to add or modify toolset configurations easily, while leveraging environment variables for flexibility and security.

<details>
<summary>OpenAI</summary>
@@ -491,7 +551,7 @@ To use Vertex AI with Gemini models, set the following environment variables:

```bash
export VERTEXAI_PROJECT="your-project-id"
export VERTEXAI_LOCATION="us-central1"
export VERTEXAI_LOCATION="us-central1"
export GOOGLE_APPLICATION_CREDENTIALS="path/to/your/service_account_key.json"
```

@@ -560,7 +620,8 @@ Fetching runbooks through URLs
</summary>

HolmesGPT can consult webpages containing runbooks or other relevant information.
HolmesGPT uses playwright to scrape webpages and requires playwright to be installed and working through `playwright install`.
This is done through an HTTP GET; the resulting HTML is cleaned and parsed into Markdown.
Any JavaScript on the webpage is ignored.
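As a rough illustration of that flow (not HolmesGPT's actual implementation — the library choices here are assumptions), fetching a page and reducing it to Markdown can be sketched as:

```python
import requests
from bs4 import BeautifulSoup
from markdownify import markdownify

def fetch_runbook_as_markdown(url: str) -> str:
    html = requests.get(url, timeout=30).text   # plain HTTP GET, no JavaScript execution
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()                         # drop scripts and styling before conversion
    return markdownify(str(soup))               # remaining HTML -> Markdown
```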
</details>

<details>
@@ -646,6 +707,7 @@ You can view an example config file with all available settings [here](config.ex

By default, without specifying `--config`, the agent will try to read `~/.holmes/config.yaml`. When a setting is present in both the config file and the CLI, the CLI option takes precedence.
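For example (the question and file path below are placeholders):

```bash
# Point Holmes at an explicit config file; CLI flags still override values it contains
holmes ask "what pods are unhealthy and why?" --config ~/.holmes/staging.yaml
```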


<details>
<summary>Custom Toolsets</summary>

@@ -762,6 +824,53 @@ Configure Slack to send notifications to specific channels. Provide your Slack t

</details>


<details>
<summary>OpenSearch Integration</summary>

The OpenSearch toolset (`opensearch`) allows Holmes to consult an OpenSearch cluster for its health, settings, and shard information.
The toolset supports multiple OpenSearch or Elasticsearch clusters, which are configured by editing Holmes' configuration file (or, when running in-cluster, the configuration secret):

```yaml
opensearch_clusters:
  - hosts:
      - https://my_elasticsearch.us-central1.gcp.cloud.es.io:443
    headers:
      Authorization: "ApiKey <your_API_key>"
  # or
  # - hosts:
  #     - https://my_elasticsearch.us-central1.gcp.cloud.es.io:443
  #   http_auth:
  #     username: ELASTIC_USERNAME
  #     password: ELASTIC_PASSWORD
```

The configuration for each OpenSearch cluster is passed directly to the [opensearch-py](https://github.com/opensearch-project/opensearch-py) module. Refer to the module's documentation for detailed guidance on configuring connectivity.
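For reference, each entry roughly maps onto the keyword arguments of the `OpenSearch` client constructor from `opensearch-py`; a hand-written equivalent of the first cluster above might look like this (illustrative only — the toolset builds the client for you):

```python
from opensearchpy import OpenSearch

# Equivalent of one opensearch_clusters entry, constructed by hand
client = OpenSearch(
    hosts=["https://my_elasticsearch.us-central1.gcp.cloud.es.io:443"],
    headers={"Authorization": "ApiKey <your_API_key>"},
)
print(client.cluster.health())  # the kind of cluster health data Holmes can consult
```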

To enable OpenSearch integration when running HolmesGPT in a Kubernetes cluster, **include the following configuration** in the `Helm chart`:

```yaml
toolsets:
  opensearch:
    enabled: true
    config:
      # OpenSearch configuration
      opensearch_clusters:
        - hosts:
            - host: "{{ env.OPENSEARCH_URL }}"
              port: 9200
          headers:
            Authorization: "Basic {{ env.OPENSEARCH_BEARER_TOKEN }}"
          # Additional parameters
          use_ssl: true
          ssl_assert_hostname: false
          verify_certs: false
          ssl_show_warn: false
```
</details>
<details>
<summary>Custom Runbooks</summary>