diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000..220feb3 --- /dev/null +++ b/.gitignore @@ -0,0 +1,7 @@ +# Python +__pycache__ + +# uv +.python-version +.venv +build diff --git a/README.md b/README.md index 192b49c..44f2ad3 100644 --- a/README.md +++ b/README.md @@ -5,102 +5,57 @@ -## What is Stacks for Terraform? - -**Stacks** is a code pre-processor for Terraform. It implements a **sustainable scaling pattern**, **prevents drift** and **boilerplate**, all while **plugging into your already existing Terraform pipeline**. - -Stacks was initially presented at [SREcon23 Americas](https://www.usenix.org/conference/srecon23americas/presentation/bejarano). - -***Warning:** Stacks is under heavy development, many things may change.* - - -## What is a "stack"? - -- A **stack** is a set of Terraform resources you want to deploy one or more times. -- Each instance of a stack is a **layer**. A stack has one or more layers, hence, the name "stacks". - -### Example - -``` -vpc/ -│ -├── base/ -│ ├── vpc.tf -│ └── subnets.tf -│ -├── layers/ -│ ├── production/ -│ │ └── layer.tfvars -│ └── staging/ -│ ├── layer.tfvars -│ └── vpn.tf -│ -└── stack.tfvars -``` - -- This is an example stack called `vpc`. -- It contains a `base` folder, containing the common Terraform configuration scoped for all layers in this stack. -- It contains a `layers` folder with two layers, one called `production` and one called `staging`. Layer directories contain layer-specific Terraform configuration. -- Finally, it contains an optional `stack.tfvars` file, which defines variables global to all layers in the stack. These variables can be overriden at the layer level through a layer-specific `layer.tfvars`. - - -## How does Stacks work? - -Stacks sits between you (the Terraform user) and Terraform. It's a **code pre-processor**. -Here's an overview of Stacks inner workings: - -1. It takes your stack definitions (as shown above) -1. For each layer: - 1. 
Joins the `base` code with the layer-specific code - 1. Applies a number of transformations - 1. Injects some extra configuration - 1. Bundles it up for Terraform to plan/apply on it - - -## How to use Stacks? - -First, you need to put the Stacks code somewhere close to your stack definitions. -Here's an example (not necessarily what we recommend): - -``` -your-terraform-repository/ -│ -├── src/ # the contents of the `src` directory -│ ├── helpers.py -│ ├── postinit.py -│ └── preinit.py -│ -├── environments/ # see the `example` directory on how to set this up -│ ├── production/ -│ │ ├── backend.tfvars -│ │ └── environment.tfvars -│ └── staging/ -│ -└── stacks/ # put your stack definitions here - └── vpc/ # the `vpc` stack shown above - ├── base/ - │ ├── vpc.tf - │ └── subnets.tf - ├── layers/ - │ ├── production/ - │ │ └── layer.tfvars - │ └── staging/ - │ ├── layer.tfvars - │ └── vpn.tf - └── stack.tfvars -``` - -You can find [another example here](example/stacks/example) with all the appropriate file contents. - -Then you need to run Stacks in the layer you want to apply: -```bash -cd stacks/vpc/layers/production -python3 ../../../../src/preinit.py -cd stacks.out # where the preinit output goes -terraform init -python3 ../../../../../src/postinit.py -``` - -Now you're ready to run any further `terraform` commands in the `stacks.out` directory. - -***Note:** we recommend putting `stacks.out` in `.gitignore` to prevent it from being tracked by git.* +## What is Stacks? + +**Stacks** is a [Terraform](https://www.terraform.io/) code pre-processor. +Its primary goal is to minimize your total Terraform codebase without giving up on coverage. To do more with less. + +As a code pre-processor, Stacks receives your "input code" and returns "output code" for Terraform to consume. + +Stacks was originally developed and continues to be maintained by the Infrastructure SRE team at [Cisco ThousandEyes](https://www.thousandeyes.com/). 
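As a quick illustration of that input/output relationship (the `vpc_cidr` variable, paths and values below are hypothetical, not taken from this repository):

```hcl
# input code you write, with a Jinja expression plain Terraform would reject
resource "aws_vpc" "main" {
  cidr_block = "{{ var.vpc_cidr }}"
}

# output code rendered into stacks.out/ for Terraform to consume,
# assuming vpc_cidr = "10.0.0.0/16"
resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
}
```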
+It was initially presented and open-sourced at [SREcon23 Americas](https://www.usenix.org/conference/srecon23americas/presentation/bejarano). + +You can read "Terraform" and "OpenTofu" interchangeably; Stacks works with both, but we've chosen "Terraform" for readability. + +The ["I am starting from scratch" quick-start guide](<2.2. I am starting from scratch.md>) is a good introduction to Stacks and what it does. + + +## Documentation + +1. About +    1. [Considerations before using]() +    2. [Stacks vs. its alternatives]() + +2. Quick-start guide +    1. [Installation instructions]() +    2. [I am starting from scratch]() +    3. [I am collaborating to an existing stack]() +    4. [I am collaborating to Stacks itself]() + +3. Reference +    1. Native features +        1. [Global Terraform code]() +        2. [Reusable root modules]() +        3. [Jinja templating for Terraform]() +        4. [Jinja templating for variables]() +        5. [Remote lookup functions]() +        6. [Inline secret encryption]() +        7. [Automatic variable initialization]() +    2. Features you can build with Stacks +        1. [Terraform state backend configuration]() +        2. [Terraform provider generation]() +        3. [Input validation]() +    3. Command-line interface +        1. [`stacks render`]() +        2. [`stacks terraform`]() +        3. [`stacks diff`]() +        4. [`stacks encrypt`]() +        5. [`stacks decrypt`]() +        6. [`stacks surgery list`]() +        7. [`stacks surgery import`]() +        8. [`stacks surgery remove`]() +        9. [`stacks surgery rename`]() +        10. [`stacks surgery move`]() +        11. [`stacks surgery edit`]() +    4. [Directory structure]() +    5. [Special variables]() diff --git a/docs/1.1. Considerations before using.md b/docs/1.1. Considerations before using.md new file mode 100644 index 0000000..54edcea --- /dev/null +++ b/docs/1.1. Considerations before using.md @@ -0,0 +1,31 @@ +# Considerations before using + +## Stacks is designed for monorepos + +Stacks is primarily designed to be used in Terraform monorepos.
+ +We define a Terraform monorepo as a single version control repository whose code maps to two or more Terraform states. +You can have multiple Terraform monorepos as long as they all map to more than one state each. + +While it's perfectly possible to use Stacks in "1 repository = 1 state" setups, it shines brightest when "1 repository = N states". + +If your setup is not monorepo-like, we do not recommend you use Stacks. + +## Not all Terraform automation tools support pre-processors like Stacks + +Stacks was originally developed to run on top of [Atlantis](https://www.runatlantis.io/). +Atlantis does support code pre-processors in the form of [pre-workflow hooks](https://www.runatlantis.io/docs/pre-workflow-hooks.html#pre-workflow-hooks). + +Unfortunately, not all Terraform automation tools are flexible enough to let you run a tool that modifies your checked-out code before Terraform consumes it. + +The same restrictions apply to any code scanning tools you may be using. You'll have to put Stacks before them, which may not be possible depending on what continuous integration platform they're running on. + +If your Terraform automation pipeline does not support such code pre-processors, you cannot use Stacks. + +## You will not be able to use code formatters like `terraform fmt` + +Since not all of Stacks' input code is valid HCL, formatters like `terraform fmt` will not work on it. + +You can still run code formatters on output code, but since it's not meant to be persisted anywhere, there's little reason to do that either. + +If enforcing formatting with code formatters is something you're not willing to give up, you cannot use Stacks. diff --git a/docs/1.2. Stacks vs its alternatives.md b/docs/1.2. Stacks vs its alternatives.md new file mode 100644 index 0000000..7a71f9e --- /dev/null +++ b/docs/1.2. Stacks vs its alternatives.md @@ -0,0 +1,47 @@ +# Stacks vs. its alternatives + +## Stacks vs.
Terraform workspaces + +A Stacks stack is a Terraform root module which you can deploy as many times as you have layers. + +Terraform CLI [workspaces](https://developer.hashicorp.com/terraform/language/state/workspaces) let you deploy the same root module as many times as you have workspaces. + +Both allow you to inject different input variable values on different layers or workspaces, respectively. + +So the two might seem similar; however, the primary goal of Terraform workspaces is to [enable testing changes](https://developer.hashicorp.com/terraform/cli/workspaces#use-cases) on a separate state before modifying production infrastructure, and HashiCorp [explicitly recommends against](https://developer.hashicorp.com/terraform/cli/workspaces#when-not-to-use-multiple-workspaces) using workspaces for long-lived parallel deployments of the same root module. + +_HCP Terraform workspaces are a different feature from Terraform CLI workspaces, and do not compare with Stacks._ + +## Stacks vs. Terragrunt + +[Terragrunt](https://terragrunt.gruntwork.io/) and Stacks achieve very similar results with very different strategies. + +Both enforce a specific directory structure on your repository. +Both generate output code for Terraform to consume. + +Terragrunt adds an [extra layer of configuration](https://terragrunt.gruntwork.io/docs/getting-started/overview/#example) on top of Terraform which lets you define what code it generates. +Terragrunt is heavily influenced by Terraform's [specifics](https://terragrunt.gruntwork.io/docs/features/state-backend/). +It even has special features for [AWS](https://terragrunt.gruntwork.io/docs/features/aws-authentication/). + +Stacks is radically simpler in that it's mainly a thin layer of [Jinja](https://jinja.palletsprojects.com/en/stable/) on top of your Terraform code.
+So much so that you can probably use Stacks for other declarative purposes, like generating [Kubernetes](https://kubernetes.io/) manifests for [`kubectl`](https://kubernetes.io/docs/reference/kubectl/) to consume. + +_Terragrunt Stacks is a Terragrunt feature that does not compare with Stacks, and while the word "stacks" is overloaded, Stacks' existence predates that of Terragrunt Stacks._ + +## Stacks vs. CDK for Terraform + +Both Stacks and [CDKTF](https://developer.hashicorp.com/terraform/cdktf) can be used to achieve the same results, but again with very different approaches. + +Where Stacks adds a thin layer of Jinja templating on top of the HCL you already know, CDKTF replaces HCL with one of the imperative programming languages it supports. +While that can be a good thing if what you want is limitless customizability of your infrastructure based on imperative logic, we've found that very similar results can be achieved without the complexities CDK for Terraform comes with. + +## Stacks vs. Pulumi + +Everything we said above about CDK for Terraform can be said about Pulumi, as they're basically interchangeable. + +## Terraform Cloud Stacks + +Terraform Cloud Stacks is an HCP Terraform feature that enables orchestrating the deployment of multiple interdependent root modules together. +So while the names are the same, HCP Terraform Stacks does not compare with Stacks. + +_And about the word "stacks" being overloaded again: Stacks was initially released in March 2023, while Terraform Cloud Stacks was announced in October of that year._ diff --git a/docs/2.1. Installation instructions.md b/docs/2.1. Installation instructions.md new file mode 100644 index 0000000..d41ddad --- /dev/null +++ b/docs/2.1. Installation instructions.md @@ -0,0 +1,43 @@ +# Installation instructions + +## 1. Install Python + +Stacks is written in Python, so you'll need a working [Python](https://www.python.org/downloads/) interpreter on your machine.
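A quick sanity check that an interpreter is available before you continue:

```shell
# print the Python 3 version found on $PATH; a "command not found"
# error here means you still need to install Python
python3 --version
```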
+ +We recommend the version specified in [pyproject.toml](../pyproject.toml). + +## 2. Install `pip` or `uv` + +Stacks is installable with both [`pip`](https://pypi.org/project/pip/) and [`uv`](https://docs.astral.sh/uv/). + +Choose one and install it. If you already have `pip` installed you can use it and skip `uv`; otherwise we recommend `uv`. + +## 3. Install Terraform + +Stacks requires [Terraform](https://developer.hashicorp.com/terraform/install) (or [OpenTofu](https://opentofu.org/docs/intro/install/)) to be installed on your machine. + +Stacks will use the binary named by the `STACKS_TERRAFORM_PATH` environment variable, which defaults to `terraform` (so it'll look up `terraform` in `$PATH` and use that). + +If you use OpenTofu make sure to set `STACKS_TERRAFORM_PATH` to `tofu`. + +If the binary you want to use is not in `$PATH`, you can also set `STACKS_TERRAFORM_PATH` to its absolute path (e.g.: `STACKS_TERRAFORM_PATH=/usr/bin/terraform`). + +## 4. Install Stacks + +To install Stacks using `pip`: +```shell +pip3 install --break-system-packages git+https://github.com/cisco-open/stacks.git +``` + +To install Stacks using `uv`: +```shell +uv tool install git+https://github.com/cisco-open/stacks.git +``` + +For development, we recommend you install Stacks from source: +```shell +git clone git@github.com:cisco-open/stacks.git +cd stacks/ +uv tool install --editable . +``` +The `--editable` flag allows you to try your changes right away without reinstalling `stacks`. diff --git a/docs/2.2. I am starting from scratch.md b/docs/2.2. I am starting from scratch.md new file mode 100644 index 0000000..db268b2 --- /dev/null +++ b/docs/2.2. I am starting from scratch.md @@ -0,0 +1,178 @@ +# "I am starting from scratch" quick-start guide + +We assume you have followed Stacks' [installation instructions](<2.1. Installation instructions.md>). + +## 1.
Create the base directory structure + +In your working directory, create the following base directory structure: +``` +|-- environments/ +`-- stacks/ +``` + +## 2. Create your first environment + +An environment is a unique context where your Terraform stacks will be deployed. + +Environments are represented by a subdirectory of `environments/` and their name must conform to this regular expression: `^[a-zA-Z0-9-]{,254}$`. + +Let's create your first environment: +``` +|-- environments/ +|   `-- development/ +|       `-- env.tfvars # must be named "env.tfvars" +`-- stacks/ +``` + +This creates a `development` environment. +The `env.tfvars` file should contain any environment-specific variables to be shared across stacks deployed in this environment. +Typically these are provider settings like IAM roles to assume, or the bucket where you want this environment's state to be stored. + +## 3. Create your first stack + +A stack is the collection of a base and its layers. +A stack base is input code that translates to a Terraform root module. +A stack layer is an instance of its stack's base, on the environment it maps to. + +Stacks are represented by a subdirectory of `stacks/` and their name must conform to this regular expression: `^[a-zA-Z0-9-]{,254}$`. +A stack's base is stored under its `base/` subdirectory. +A stack's layers are stored under its `layers/` subdirectory. + +Let's create your first stack: +``` +|-- environments/ +|   `-- development/ +|       `-- env.tfvars +`-- stacks/ +    `-- vpc/ +        |-- base/ +        |   |-- backend.tf +        |   `-- main.tf # you can use any code structure you want here +        |-- layers/ +        |   `-- development/ # must be named after an existing environment +        |       `-- layer.tfvars # can be named *.tfvars +        `-- stack.tfvars # can be named *.tfvars +``` + +This creates a `vpc` stack, with its base and one layer on the `development` environment we created earlier. +The variables in `stack.tfvars` will be injected in all layers of the stack.
+You can override them on a per-layer basis through the `layer.tfvars` file. +You can have multiple `*.tfvars` files. + +## 4. Deploy your layer + +Now let's deploy your newly created layer: +```shell +cd stacks/vpc/layers/development +stacks terraform apply +``` + +You'll notice a `stacks.out` directory was created. +This is where the output code is stored for Terraform to consume; feel free to inspect it. +Make sure to exclude `stacks.out` from version control (e.g. in Git add it to `.gitignore`). + +## 5. Promote to production + +Now that we've validated our base in our `development` environment, let's deploy to production. + +To do that, we'll need a new `production` environment, and a new `production` layer of our `vpc` stack: +``` +|-- environments/ +|   |-- development/ +|   |   `-- env.tfvars +|   `-- production/ +|       `-- env.tfvars +`-- stacks/ +    `-- vpc/ +        |-- base/ +        |   |-- backend.tf +        |   `-- main.tf +        |-- layers/ +        |   |-- development/ +        |   |   `-- layer.tfvars +        |   `-- production/ +        |       `-- layer.tfvars +        `-- stack.tfvars +``` + +Then repeat the deployment process for our `production` layer: +```shell +cd stacks/vpc/layers/production +stacks terraform apply +``` + +## 6. Deploy another stack + +Let's now deploy something on our VPC: +``` +|-- environments/ +|   |-- development/ +|   |   `-- env.tfvars +|   `-- production/ +|       `-- env.tfvars +`-- stacks/ +    |-- ec2/ +    |   |-- base/ +    |   |   |-- backend.tf +    |   |   `-- main.tf +    |   |-- layers/ +    |   |   |-- development/ +    |   |   |   `-- layer.tfvars +    |   |   `-- production/ +    |   |       `-- layer.tfvars +    |   `-- stack.tfvars +    `-- vpc/ +        |-- base/ +        |   |-- backend.tf +        |   `-- main.tf +        |-- layers/ +        |   |-- development/ +        |   |   `-- layer.tfvars +        |   `-- production/ +        |       `-- layer.tfvars +        `-- stack.tfvars +``` +And `stacks terraform apply` accordingly. + +## 7. Deduplicate code + +You've surely realized by now that there are a number of things we're duplicating across environments, stacks and layers.
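Before deduplicating anything, it helps to recall the scoping rule from step 3 with a concrete sketch (the `instance_type` variable and its values are hypothetical, not from this guide):

```hcl
# stacks/vpc/stack.tfvars -- applies to every layer of the "vpc" stack
instance_type = "t3.micro"

# stacks/vpc/layers/production/layer.tfvars -- overrides the stack-level
# value for the production layer only
instance_type = "m5.large"
```

Every other layer keeps `t3.micro`; only `production` sees `m5.large`.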
+ +For starters, you can remove the need to define `backend.tf` on every stack by moving it to `stacks/`: +``` +|-- environments/ +`-- stacks/ +    |-- backend.tf # (Step 3) Make it a Jinja template and put it here for all stacks to share. +    |-- ec2/ +    |   |-- base/ # (Step 1) Remove backend.tf here. +    |   |   `-- main.tf +    |   |-- layers/ +    |   `-- stack.tfvars +    `-- vpc/ +        |-- base/ # (Step 2) ...and here. +        |   `-- main.tf +        |-- layers/ +        `-- stack.tfvars +``` + +If you're doing things the Terraform way, you're probably also using a [remote state data source](https://developer.hashicorp.com/terraform/language/state/remote-state-data) in the `ec2` stack to pull the `vpc_id` output you exported in the `vpc` stack. And that likely has at least a couple of hard-coded settings as to where the state is stored and how to access it. +Why do that when you could simply use `{{ output("vpc_id", stack="vpc") }}` wherever you need the VPC ID in your `ec2` stack? The `output` Jinja filter will figure out the value and inject it for you. +Better yet, if you don't even want to define an `output` in your `vpc` stack you can use `{{ resource("aws_vpc.main", stack="vpc")["id"] }}` instead, which will look up the resource by address in the `vpc` stack's state and inject it just like before. + +There are surely other things you're duplicating across your `env.tfvars` or `stack.tfvars` files. +For those, you can create common `*.tfvars` files in `stacks/` too, much like with global `*.tf` Terraform code: +``` +|-- environments/ +`-- stacks/ +    |-- backend.tf +    |-- globals.tfvars # put common variables here +    |-- ec2/ +    `-- vpc/ +``` + +--- + +Hopefully you're starting to see how Stacks can drastically reduce the amount of code you need to write for Terraform to work.
+It does so by templating common code and boilerplate away, both globally and per-stack; by cascading variable scopes that let you define values once without losing the ability to override per-stack or even per-layer; and with extra goodies like the `variable`, `output` and `resource` Jinja filters, which completely remove the need for hard-coded remote state data sources to connect states together. + +Check out the _Reference_ section of these docs for an exhaustive list of Stacks' built-in features along with usage examples, plus other features not native to Stacks but that you can build on top of it. diff --git a/docs/2.3. I am collaborating to an existing stack.md b/docs/2.3. I am collaborating to an existing stack.md new file mode 100644 index 0000000..e54fb76 --- /dev/null +++ b/docs/2.3. I am collaborating to an existing stack.md @@ -0,0 +1,31 @@ +# "I am collaborating to an existing stack" quick-start guide + +We assume you have followed Stacks' [installation instructions](<2.1. Installation instructions.md>). + +All stacks have the following structure: +``` +|-- environments/ +`-- stacks/ +    `-- vpc/ +        |-- base/ +        |   |-- backend.tf +        |   `-- main.tf +        |-- layers/ +        |   |-- development/ +        |   |   `-- layer.tfvars +        |   `-- production/ +        |       `-- layer.tfvars +        `-- stack.tfvars +``` + +Here's what you need to know: +- `vpc` is the stack name. +- `base` is where the stack's Terraform root module is located. +- Variables in the stack's `*.tfvars` files are injected in all layers. +- Variables in the layers' `*.tfvars` files override those of the stack's `*.tfvars` files. + +With that in mind: +- If you want to **modify the infrastructure in all layers**, change the stack's base to your liking. +- If you want to **modify the infrastructure in one or a subset of layers**, change the stack's base so its templating branches on a variable whose value differs between the layers you want to change and the rest.
+- If you want to **modify the variable values of all layers**, change the stack's `*.tfvars` files (make sure they aren't being overridden at the layer level). +- If you want to **modify the variable value of one or a subset of layers**, change the layers' `*.tfvars` files. diff --git a/docs/2.4. I am collaborating to Stacks itself.md b/docs/2.4. I am collaborating to Stacks itself.md new file mode 100644 index 0000000..3d74787 --- /dev/null +++ b/docs/2.4. I am collaborating to Stacks itself.md @@ -0,0 +1,9 @@ +# "I am collaborating to Stacks itself" quick-start guide + +We assume you have followed Stacks' [installation instructions](<2.1. Installation instructions.md>). +Make sure you install Stacks from source, though, so you can develop comfortably. + +We're open to contributions but are very strict on the project's direction, so make sure you align with us before starting any significant development effort. +You can contact us by raising an issue on our GitHub repository. + +Follow [CONTRIBUTING.md](../CONTRIBUTING.md). diff --git a/docs/3.1.1. Global Terraform code.md b/docs/3.1.1. Global Terraform code.md new file mode 100644 index 0000000..99d2600 --- /dev/null +++ b/docs/3.1.1. Global Terraform code.md @@ -0,0 +1,14 @@ +# Global Terraform code + +Stacks allows you to define global Terraform code that's shared across all stacks. + +This covers use cases like defining a common Terraform state backend template (see [this](<3.2.1. Terraform state backend configuration.md>)), or iterating through a list of regions to define multiple instances of the same provider (see [this](<3.2.2. Terraform provider generation.md>)). + +Global Terraform code must be located in the stacks directory like this: +``` +|-- environments/ +`-- stacks/ +    |-- ... +    `-- global.tf # here +``` +Any files named `*.tf` will work. diff --git a/docs/3.1.2. Reusable root modules.md b/docs/3.1.2.
Reusable root modules.md new file mode 100644 index 0000000..185ac9d --- /dev/null +++ b/docs/3.1.2. Reusable root modules.md @@ -0,0 +1,9 @@ +# Reusable root modules + +A stack is the collection of a base and its layers. +A stack base is input code that translates to a Terraform root module. +A stack layer is an instance of its stack's base, on the environment it maps to. + +This means you get to reinstantiate the same base (root module) as many times (layers) as you want. + +It's like Terraform modules but simpler: you don't even have to instantiate the module 1000 times to get 1000 copies of it, just create 1000 empty folders in your layers directory. diff --git a/docs/3.1.3. Jinja templating for Terraform.md b/docs/3.1.3. Jinja templating for Terraform.md new file mode 100644 index 0000000..d0fab1f --- /dev/null +++ b/docs/3.1.3. Jinja templating for Terraform.md @@ -0,0 +1,45 @@ +# Jinja templating for Terraform + +As you know, Terraform's configuration language is the [HashiCorp configuration language](https://github.com/hashicorp/hcl). +While there are a lot of questionable things about HCL, it's Terraform's use of it that gets in the way of users and their goals the most. + +Terraform expects you to hard-code a number of values, such as module URLs or state backend settings, that remove all ability to reuse that code across projects. +Jinja can work around that, since it doesn't care what the underlying template represents. + +Terraform also lacks conditional logic to include/exclude resources, other than janky `count` attributes with ternary operators that pollute your resource state addresses. +Jinja if blocks work better for that. + +And finally, it was not until recently that Terraform enabled custom [provider functions](https://www.hashicorp.com/blog/terraform-1-8-improves-extensibility-with-provider-defined-functions). +We've had custom filters in Jinja for a decade.
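To make that last remark concrete, here is what a custom Jinja filter looks like in plain Jinja, outside of Stacks (the `quote` filter below is a made-up example for illustration, not one of Stacks' helpers):

```python
from jinja2 import Environment

# register a made-up custom filter the way any Jinja filter is registered;
# Stacks' own helpers are injected into the environment in a similar spirit
env = Environment()
env.filters["quote"] = lambda s: '"' + s + '"'

# the template is just text to Jinja; here it happens to look like HCL
rendered = env.from_string("name = {{ 'main' | quote }}").render()
print(rendered)  # name = "main"
```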
+ +Here's where you can use Jinja in Terraform code: +``` +|-- environments/ +`-- stacks/ + `-- vpc/ + |-- base/ + | |-- main.tf # you get Jinja support here (any *.tf files) + | |-- module + | | `-- main.tf # and here (if you add the module to var.stacks_jinja_enabled_modules) + | `-- script.py # but not here (since it's not Terraform code) + `-- layers/ +``` + +For using variables where you typically couldn't: +```hcl +module "vpc" { + source = "{{ var.module_vpc_source }}" + + name = "main" + cidr = "10.0.0.0/16" +} +``` + +For including code conditionally: +```hcl +{% if var.enable_cloudtrail %} +resource "aws_cloudtrail" "main" { + name = "main" +} +{% endif %} +``` diff --git a/docs/3.1.4. Jinja templating for variables.md b/docs/3.1.4. Jinja templating for variables.md new file mode 100644 index 0000000..71fe639 --- /dev/null +++ b/docs/3.1.4. Jinja templating for variables.md @@ -0,0 +1,26 @@ +# Jinja templating for variables + +Similar to how you get Jinja for templating Terraform code, you also get Jinja support to define variables themselves. + +You can do so by creating any number of `*.tfvars.jinja` files where you would normally have `*.tfvars` files. +These get rendered using the variables defined in the non-Jinja counterparts, and they get higher override priority than them, of course. + +Here's where you can use Jinja in variables: +``` +|-- environments/ +| `-- production/ +| `-- env.tfvars # not here +`-- stacks/ + |-- ec2/ + | |-- base/ + | |-- layers/ + | | `-- production/ + | | |-- layer.tfvars # not here + | | `-- layer.tfvars.jinja # here + | |-- stack.tfvars # not here + | `-- stack.tfvars.jinja # here + |-- globals.tfvars # not here + `-- globals.tfvars.jinja # here +``` + +Be careful not to put yourself in a cyclical dependency, however. diff --git a/docs/3.1.5. Remote lookup functions.md b/docs/3.1.5. Remote lookup functions.md new file mode 100644 index 0000000..4c28811 --- /dev/null +++ b/docs/3.1.5. 
Remote lookup functions.md @@ -0,0 +1,37 @@ +# Remote lookup functions + +Jinja lets you inject your own custom functions for use in templating, and Stacks does exactly that. + +## The `variable` lookup function +With the `variable` lookup function you can fetch any variable value from any other layer in your repository. +### Example 1: fetching a `vpc_id` variable from the same layer in a different stack +```hcl +# stacks/ec2/stack.tfvars.jinja: +vpc_id = "{{ variable("vpc_id", stack="vpc") }}" +``` +### Example 2: fetching a `vpc_id` variable from a different environment in a different stack +```hcl +# stacks/ec2/layers/development/layer.tfvars.jinja: +vpc_id = "{{ variable("vpc_id", stack="vpc", environment="production") }}" +``` +### Example 3: full example +```hcl +# stacks/ec2/layers/development@us-east-1_foo/layer.tfvars.jinja: +vpc_id = "{{ variable("vpc_id", stack="vpc", environment="development", subenvironment="us-east-1", instance="foo") }}" # stack/environment/subenvironment/instance all default to the caller's +``` + +## The `output` lookup function +With the `output` lookup function you can fetch any output value from the state of any other layer in your repository. +### Example 1: fetching a `vpc_id` output from the same layer in a different stack +```hcl +# stacks/ec2/stack.tfvars.jinja: +vpc_id = "{{ output("vpc_id", stack="vpc") }}" +``` + +## The `resource` lookup function +With the `resource` lookup function you can fetch any resource's attributes from the state of any other layer in your repository. +### Example 1: fetching an `aws_vpc.main` resource from the same layer in a different stack +```hcl +# stacks/ec2/stack.tfvars.jinja: +vpc_id = "{{ resource("aws_vpc.main", stack="vpc")["id"] }}" +``` diff --git a/docs/3.1.6. Inline secret encryption.md b/docs/3.1.6. Inline secret encryption.md new file mode 100644 index 0000000..9c8fd27 --- /dev/null +++ b/docs/3.1.6.
Inline secret encryption.md @@ -0,0 +1,42 @@ +# Inline secret encryption + +Similar to [sops](https://github.com/getsops/sops), [eyaml](https://github.com/voxpupuli/hiera-eyaml) or [Kubernetes sealed secrets](https://github.com/bitnami-labs/sealed-secrets), Stacks gives you the ability to write down encrypted secrets directly in your configuration. + +## Encrypting a secret + +First, if you don't have a public key already, generate a key pair as described in the last section of this document. + +Then use your public key to encrypt the secret: +```shell +$ stacks encrypt --public-key-path path/to/public.pem -- 'mysecr3t' # the "--" before your secret is only required if your secret begins with "--", so Stacks doesn't parse it as a non-existent flag, but it doesn't hurt to always use it +ENC[l42kj562...v349120j] +``` +The `ENC[l4...0j]` output is your encrypted secret. + +Finally, copy your encrypted secret anywhere you wish to use it: +```hcl +# environments/production/env.tfvars +aws_secret_access_key = "ENC[l42kj562...v349120j]" +``` + +***Note:** only string values can be encrypted, and they must be fed to Stacks via `*.tfvars` variables (i.e. you cannot use them directly in resource attributes).* + +## Using an encrypted secret + +To use your encrypted secrets, all you need to do is set the `STACKS_PRIVATE_KEY_PATH` environment variable to point to your private key, and then run Stacks normally. + +***Note:** this implies you can only use one key pair to encrypt/decrypt all the secrets of your running layer.* + +## Decrypting a secret + +Here's how you can consult the value of an encrypted secret (you'll need your private key): +```shell +$ stacks decrypt --private-key-path path/to/private.pem 'ENC[l42kj562...v349120j]' +mysecr3t +``` +The `mysecr3t` output is your decrypted secret. + +## Generating a new key pair + +1. Run `stacks genkey --public-key-path public.pem --private-key-path private.pem`. +2. Store `private.pem` in a safe place. 
Make sure to exclude it from version control! diff --git a/docs/3.1.7. Automatic variable initialization.md b/docs/3.1.7. Automatic variable initialization.md new file mode 100644 index 0000000..f9f7168 --- /dev/null +++ b/docs/3.1.7. Automatic variable initialization.md @@ -0,0 +1,10 @@ +# Automatic variable initialization + +In Terraform, variables are defined and assigned in separate steps: your `variables.tf` (definitions) and your `variables.tfvars` (assignments). + +In variable definitions you set things like their `type`, `description` and whether they have a `default`. +In variable assignments you set their actual value. + +Stacks removes the need to define variables by defining any undefined variables for you. + +All you need to do is nothing. You can "forget" to define your variables with Stacks. diff --git a/docs/3.2.1. Terraform state backend configuration.md b/docs/3.2.1. Terraform state backend configuration.md new file mode 100644 index 0000000..309c15d --- /dev/null +++ b/docs/3.2.1. Terraform state backend configuration.md @@ -0,0 +1,30 @@ +# Terraform state backend configuration + +Let's leverage Stacks' global Terraform code feature to simplify Terraform state management. + +## 1. Define state backend template + +```hcl +# stacks/backend.tf +terraform { + backend "{{ var.backend_type }}" { + {% for key, value in var.backend_args.items() -%} + {{ key }} = "{{ value }}" + {% endfor -%} + key = "{{ var.stacks_path }}/terraform.tfstate" # e.g. "stacks/vpc/layers/production/terraform.tfstate" + } +} +``` + +## 2. Define `backend_type` and `backend_args` + +```hcl +# stacks/backend.tfvars +backend_type = "s3" +backend_args = { + region = "eu-south-2" + bucket = "my-terraform-state" + dynamodb_table = "my-terraform-state" + role_arn = "arn:aws:iam::0123456789:role/Terraform" +} +``` diff --git a/docs/3.2.2. Terraform provider generation.md b/docs/3.2.2.
Terraform provider generation.md new file mode 100644 index 0000000..11629d6 --- /dev/null +++ b/docs/3.2.2. Terraform provider generation.md @@ -0,0 +1,112 @@ +# Terraform provider generation + +Let's leverage Stacks' global Terraform code feature to simplify multi-account AWS provider management. + +## 1. Define AWS providers template + +```hcl +# stacks/aws.tf +{% macro aws_provider(alias, region, role) -%} +provider "aws" { + {% if alias -%} + alias = "{{ alias }}" + {% endif -%} + region = "{{ region }}" + assume_role { + role_arn = "{{ role }}" + } + default_tags { + tags = { + stacks_path = "{{ var.stacks_path }}" + } + } +} +{% endmacro -%} + +# injects default provider in var.aws_region with var.aws_role_arn +{{ aws_provider(alias=None, region=var.aws_region, role=var.aws_role_arn) }} + +# injects provider with var.aws_role_arn in all var.aws_regions +{% for region in var.aws_regions -%} +{{ aws_provider(alias=region, region=region, role=var.aws_role_arn) }} +{% endfor -%} + +# injects providers for all other environments in all var.aws_regions +{% for environment, variables in var.stacks_environments.items() -%} +{{ aws_provider(alias=environment, region=variables.aws_region, role=variables.aws_role_arn) }} +{% for region in var.aws_regions -%} +{{ aws_provider(alias=[environment,region]|join("_"), region=region, role=variables.aws_role_arn) }} +{% endfor -%} +{% endfor -%} +``` + +## 2. Define `aws_regions` + +```hcl +# stacks/aws.tfvars +aws_regions = [ + "us-east-1", + "us-east-2", + "us-west-1", + "us-west-2", + "eu-central-1", +] +``` + +## 3. Define per-environment `aws_region` and `aws_role_arn` + +```hcl +# environments/production/env.tfvars +aws_region = "us-east-1" +aws_role_arn = "arn:aws:iam::0123456789:role/Terraform" +``` + +and + +```hcl +# environments/development/env.tfvars +aws_region = "eu-south-2" +aws_role_arn = "arn:aws:iam::9876543210:role/Terraform" +``` + +## 4. 
Use the automatically-injected providers + +```hcl +# stacks/vpc/base/foo.tf +resource "foo" "bar" { # uses the environment's account and region (default provider) + foo = "bar" +} +``` + +or + +```hcl +# stacks/vpc/base/foo.tf +resource "foo" "bar" { + provider = aws.us-west-2 # uses the environment's account in the us-west-2 region + + foo = "bar" +} +``` + +or + +```hcl +# stacks/vpc/base/foo.tf +resource "foo" "bar" { + provider = aws.development # uses the "development" environment's account and region + + foo = "bar" +} +``` + +or + +```hcl +# stacks/vpc/base/foo.tf +resource "foo" "bar" { + provider = aws.development_us-east-2 # uses the "development" environment's account in the us-east-2 region + + foo = "bar" +} +``` diff --git a/docs/3.2.3. Input validation.md b/docs/3.2.3. Input validation.md new file mode 100644 index 0000000..8b16743 --- /dev/null +++ b/docs/3.2.3. Input validation.md @@ -0,0 +1,32 @@ +# Input validation + +One interesting thing Stacks does with Jinja is inject a `throw(msg)` filter which performs a Python `raise Exception(msg)`. +This allows halting the execution of a Stacks render if the template is programmed to do so. + +We can use this for input validation, for example. + +## Example: enforcing that an `owner` variable is defined and set to a valid value + +```hcl +# stacks/owners.tf +{% if 'owner' not in var %}{{ throw('var.owner is not defined') }}{% endif %} +{% if var.owner not in var.owners %}{{ throw('var.owner not in var.owners') }}{% endif %} +``` + +```hcl +# stacks/owners.tfvars +owners = [ + "engineering", + "marketing", + "operations", + "sales", +] +``` + +From now on, on each stack/layer, you'll have to define a valid owner: +```hcl +owner = "engineering" +``` +Otherwise Stacks will fail to render. + +This can be used to enforce common resource tagging policies, for ownership tracking, cost analysis, etc. diff --git a/docs/3.3.1. stacks render.md b/docs/3.3.1.
stacks render.md new file mode 100644 index 0000000..561fe48 --- /dev/null +++ b/docs/3.3.1. stacks render.md @@ -0,0 +1,25 @@ +# `stacks render` + +## Usage +``` +Usage: stacks render [OPTIONS] + + Render a layer into working Terraform code. + +Options: + --init TEXT Run terraform init (auto, always, never) + --help Show this message and exit. +``` + +## Description + +Used to render a layer's input code into its output code (under `stacks.out`). + +Must run within a layer directory (e.g. `stacks/vpc/layers/production`). + +## Example + +```shell +$ cd stacks/vpc/layers/production
$ stacks render +``` diff --git a/docs/3.3.10. stacks surgery move.md b/docs/3.3.10. stacks surgery move.md new file mode 100644 index 0000000..027671f --- /dev/null +++ b/docs/3.3.10. stacks surgery move.md @@ -0,0 +1,22 @@ +# `stacks surgery move` + +## Usage +``` +Usage: stacks surgery move [OPTIONS] FROM_ADDRESS TO_ADDRESS TO_PATH + + Move a resource from one state to another by address. + +Options: + --help Show this message and exit. +``` + +## Description + +Moves a resource from one state to another: `FROM_ADDRESS` in the current layer's state becomes `TO_ADDRESS` in the state of the layer at `TO_PATH`. + +## Example + +```shell +$ cd stacks/vpc/layers/production +$ stacks surgery move aws_vpc.main aws_vpc.main stacks/vpc/layers/development +``` diff --git a/docs/3.3.11. stacks surgery edit.md b/docs/3.3.11. stacks surgery edit.md new file mode 100644 index 0000000..5ed760b --- /dev/null +++ b/docs/3.3.11. stacks surgery edit.md @@ -0,0 +1,24 @@ +# `stacks surgery edit` + +## Usage +``` +Usage: stacks surgery edit [OPTIONS] + + Edit state with vi. + +Options: + --help Show this message and exit. +``` + +## Description + +Opens up a text editor with your remote Terraform state. + +Uses whatever you have in the `EDITOR` environment variable, or `vi` otherwise. + +## Example + +```shell +$ cd stacks/vpc/layers/production +$ stacks surgery edit +``` diff --git a/docs/3.3.2. stacks terraform.md b/docs/3.3.2.
stacks terraform.md new file mode 100644 index 0000000..63697fe --- /dev/null +++ b/docs/3.3.2. stacks terraform.md @@ -0,0 +1,26 @@ +# `stacks terraform` + +## Usage +``` +Usage: stacks terraform [OPTIONS] [ARGS]... + + Terraform command wrapper. + +Options: + --init TEXT Run terraform init (auto, always, never) + --help Show this message and exit. +``` + +## Description + +Runs Terraform after rendering a layer. + +Must run within a layer directory (e.g. `stacks/vpc/layers/production`). + +## Example + +```shell +$ cd stacks/vpc/layers/production +$ stacks terraform apply +... +``` diff --git a/docs/3.3.3. stacks diff.md b/docs/3.3.3. stacks diff.md new file mode 100644 index 0000000..f02cf58 --- /dev/null +++ b/docs/3.3.3. stacks diff.md @@ -0,0 +1,32 @@ +# `stacks diff` + +## Usage +``` +Usage: stacks diff [OPTIONS] + + Render and compare Git HEAD vs current uncommitted changes. + +Options: + --help Show this message and exit. +``` + +## Description + +***Note:** assumes your Terraform code uses Git for version control.* + +Runs `stacks render` on both Git `HEAD` (without uncommitted changes) and your current working directory (including uncommitted changes). +Then diffs the two output codes. + +Useful when debugging Stacks Jinja templating. + +Uses `git stash` to temporarily store your uncommitted changes. +If it crashes, it may not have cleaned up after itself, in which case you'll need to run `git stash pop` yourself. + +Must run within a layer directory (e.g. `stacks/vpc/layers/production`). + +## Example + +```shell +$ cd stacks/vpc/layers/production +$ stacks diff +``` diff --git a/docs/3.3.4. stacks encrypt.md b/docs/3.3.4. stacks encrypt.md new file mode 100644 index 0000000..ea8ef70 --- /dev/null +++ b/docs/3.3.4. stacks encrypt.md @@ -0,0 +1,16 @@ +# `stacks encrypt` + +## Usage +``` +Usage: stacks encrypt [OPTIONS] STRING + + Encrypt a secret string using a public key. Can run in any directory.
+ +Options: + --public-key-path TEXT [required] + --help Show this message and exit. +``` + +## Description & example + +See [Inline secret encryption](<3.1.6. Inline secret encryption.md>). diff --git a/docs/3.3.5. stacks decrypt.md b/docs/3.3.5. stacks decrypt.md new file mode 100644 index 0000000..756b942 --- /dev/null +++ b/docs/3.3.5. stacks decrypt.md @@ -0,0 +1,16 @@ +# `stacks decrypt` + +## Usage +``` +Usage: stacks decrypt [OPTIONS] STRING + + Decrypt an encrypted string using a private key. Can run in any directory. + +Options: + --private-key-path TEXT [required] + --help Show this message and exit. +``` + +## Description & example + +See [Inline secret encryption](<3.1.6. Inline secret encryption.md>). diff --git a/docs/3.3.6. stacks surgery list.md b/docs/3.3.6. stacks surgery list.md new file mode 100644 index 0000000..2f096f9 --- /dev/null +++ b/docs/3.3.6. stacks surgery list.md @@ -0,0 +1,15 @@ +# `stacks surgery list` + +## Usage +``` +Usage: stacks surgery list [OPTIONS] + + List all resources in state by address. + +Options: + --help Show this message and exit. +``` + +## Description + +Equivalent of running `stacks terraform state list`. diff --git a/docs/3.3.7. stacks surgery import.md b/docs/3.3.7. stacks surgery import.md new file mode 100644 index 0000000..93ef4dc --- /dev/null +++ b/docs/3.3.7. stacks surgery import.md @@ -0,0 +1,15 @@ +# `stacks surgery import` + +## Usage +``` +Usage: stacks surgery import [OPTIONS] ADDRESS RESOURCE + + Import a resource into state by id. + +Options: + --help Show this message and exit. +``` + +## Description + +Equivalent of running `stacks terraform import`. diff --git a/docs/3.3.8. stacks surgery remove.md b/docs/3.3.8. stacks surgery remove.md new file mode 100644 index 0000000..fda4435 --- /dev/null +++ b/docs/3.3.8. stacks surgery remove.md @@ -0,0 +1,15 @@ +# `stacks surgery remove` + +## Usage +``` +Usage: stacks surgery remove [OPTIONS] ADDRESS + + Remove a resource from state by address. 
+ +Options: + --help Show this message and exit. +``` + +## Description + +Equivalent of running `stacks terraform state rm`. diff --git a/docs/3.3.9. stacks surgery rename.md b/docs/3.3.9. stacks surgery rename.md new file mode 100644 index 0000000..e96f8d3 --- /dev/null +++ b/docs/3.3.9. stacks surgery rename.md @@ -0,0 +1,15 @@ +# `stacks surgery rename` + +## Usage +``` +Usage: stacks surgery rename [OPTIONS] FROM_ADDRESS TO_ADDRESS + + Rename a resource in the current state. + +Options: + --help Show this message and exit. +``` + +## Description + +Renames a resource from one state address to another within the current state. diff --git a/docs/3.4. Directory structure.md b/docs/3.4. Directory structure.md new file mode 100644 index 0000000..4c14e0e --- /dev/null +++ b/docs/3.4. Directory structure.md @@ -0,0 +1,41 @@ +# Directory structure + +The following is an example Stacks directory structure, as full as it can get, annotated with the specific freedoms and constraints that apply to each element. + +``` +|-- environments/ # The environments folder must be named after the value of the STACKS_ENVIRONMENTS_DIR environment variable, which defaults to "environments". +| `-- production/ # Environment names must follow the ^[a-zA-Z0-9-]{,254}$ regular expression. +| `-- env.tfvars # Environments can only have one env.tfvars file to define environment-specific variables. +`-- stacks/ # The stacks folder must be named after the value of the STACKS_STACKS_DIR environment variable, which defaults to "stacks". + |-- ec2/ # Stack names must follow the ^[a-zA-Z0-9-]{,254}$ regular expression. + | |-- base/ # The stack base folder must be named after the value of the STACKS_BASE_DIR environment variable, which defaults to "base". + | | |-- backend.tf # The base folder contains a Terraform root module; you can use Jinja in any of its *.tf files. + | | `-- modules/...
# You can also have folders here for stuff like local modules, but those don't get Jinja support unless added to the var.stacks_jinja_enabled_modules special variable. + | |-- layers/ # The layers folder must be named after the value of the STACKS_LAYERS_DIR environment variable, which defaults to "layers". + | | |-- production/ # Stack layers must be named after an existing environment. Stacks must run within one of these layer directories. + | | | |-- layer.tfvars # Layers can contain any number of *.tfvars files, which are sorted alphabetically for preference (i.e. a.tfvars is overridden by z.tfvars). + | | | `-- stacks.out/ # Stacks operations that produce output code store it in stacks.out (or the value of the STACKS_OUTPUT_DIR environment variable). Make sure to exclude this directory from version control. + | | `-- production_foo/ # If you want to deploy your stack multiple times per environment you can use layer instances, which are represented by a layer suffixed by an underscore followed by a string that must follow the ^[a-zA-Z0-9-]{,254}$ regular expression. + | | `-- .keep # Technically, layer directories may be empty, but most version control systems don't check out empty directories, so you may need to add an empty file such as this ".keep". + | `-- stack.tfvars # Stacks can contain any number of *.tfvars files, which are sorted alphabetically for preference (i.e. a.tfvars is overridden by z.tfvars). + |-- global.tf # The stacks folder can contain any number of *.tf files, which are joined with the stack base code for cross-stack Terraform code reusability (for example, to define state backends and providers). + `-- globals.tfvars # The stacks folder can contain any number of *.tfvars files, which are sorted alphabetically for preference (i.e. a.tfvars is overridden by z.tfvars).
+``` + +Variable values of the environments' `env.tfvars` are overridden by those in `stacks/*.tfvars`, which are overridden by those in the stack's `*.tfvars`, which are overridden by those in the layer's `*.tfvars`. + +Overrides are performed by joining two adjacent data structures with [`deepmerge.always_merger.merge(...)`](https://pypi.org/project/deepmerge/), so: +- `bool`/`number`/`string` values are replaced: + - if `stack.tfvars` says `color = "red"` + - and `layer.tfvars` says `color = "blue"` + - the result is `color = "blue"` +- `list`/`set` values are joined: + - if `stack.tfvars` says `colors = ["red", "blue"]` + - and `layer.tfvars` says `colors = ["green", "yellow"]` + - the result is `colors = ["red", "blue", "green", "yellow"]` +- `map` values are joined: + - if `stack.tfvars` says `colors = { red = "#ff0000", blue = "#00ffff" }` + - and `layer.tfvars` says `colors = { blue = "#0000ff", green = "#00ff00" }` + - the result is `colors = { red = "#ff0000", blue = "#0000ff", green = "#00ff00" }` + +You can also have `*.tfvars.jinja` files anywhere you can have `*.tfvars` files, which support Jinja templating, but be careful not to introduce a cyclic reference. diff --git a/docs/3.5. Special variables.md b/docs/3.5. Special variables.md new file mode 100644 index 0000000..d2af3f1 --- /dev/null +++ b/docs/3.5. Special variables.md @@ -0,0 +1,52 @@ +# Special variables + +Along with any variables you may define, Stacks also injects a number of read-only variables for you to use in Jinja and/or Terraform. + +Here's a full list of all special variables: + +## `var.stacks_path` +The path of the current running layer, relative to the stacks directory parent. +Example: `stacks/vpc/layers/production`. + +## `var.stacks_root` +The path of the stacks directory parent, relative to Terraform's runtime working directory. +Similar to Terraform's `${path.root}`, so you can reference other files relative to the root of the repository. +Example: `../../../../..`.
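As a sketch of how `var.stacks_root` can be used (the `policies/default.json` file here is hypothetical), a stack's base code can reference repository-level files from inside the rendered output directory:

```hcl
# stacks/vpc/base/policy.tf
# var.stacks_root climbs from the rendered layer's working directory
# (e.g. stacks.out) back up to the repository root, so files kept at the
# top of the repository can be read without hardcoding "../../../../..".
data "local_file" "policy" {
  filename = "${var.stacks_root}/policies/default.json"
}
```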
+ +## `var.stacks_stack` +The name of the stack. +Example: `vpc`. + +## `var.stacks_layer` +The full (directory) name of the layer. +Example: `production@us-east-1_foo`. + +## `var.stacks_environment` +The name of the environment. +Example: `production`. + +## `var.stacks_subenvironment` +The name of the subenvironment, if applicable, otherwise it's `""` (empty string). +Example: `us-east-1`. + +## `var.stacks_instance` +The name of the instance, if applicable, otherwise it's `""` (empty string). +Example: `foo`. + +## `var.stacks_environments` +A map of all environments and their settings. +Example: +``` +{ + production = { + production = true + aws_region = "eu-south-2" + aws_assume_role = "Admin" + } + development = { + production = false + aws_region = "eu-south-2" + aws_assume_role = "Developer" + } +} +``` diff --git a/example/environments/production/backend.tfvars b/example/environments/production/backend.tfvars deleted file mode 100644 index 47eab92..0000000 --- a/example/environments/production/backend.tfvars +++ /dev/null @@ -1,11 +0,0 @@ -/* -This file is required. -It configures the S3 remote state backend for this environment. -The scope of these variables is limited to the backend configuration for the -layers in this environment. -*/ - -role_arn = # CHANGEME (the role Terraform will assume to work with S3 remote state) -region = # CHANGEME (the region where the S3 remote state bucket is) -bucket = # CHANGEME -dynamodb_table = # CHANGEME diff --git a/example/environments/production/environment.tfvars b/example/environments/production/environment.tfvars deleted file mode 100644 index 3bd6e7a..0000000 --- a/example/environments/production/environment.tfvars +++ /dev/null @@ -1,8 +0,0 @@ -/* -This file is required. -It defines environment-level variables. -The scope of these variables is limited to all layers in this environment. 
-*/ - -role_arn = # CHANGEME (the role the AWS providers will assume) -region = # CHANGEME diff --git a/example/environments/staging/backend.tfvars b/example/environments/staging/backend.tfvars deleted file mode 100644 index 47eab92..0000000 --- a/example/environments/staging/backend.tfvars +++ /dev/null @@ -1,11 +0,0 @@ -/* -This file is required. -It configures the S3 remote state backend for this environment. -The scope of these variables is limited to the backend configuration for the -layers in this environment. -*/ - -role_arn = # CHANGEME (the role Terraform will assume to work with S3 remote state) -region = # CHANGEME (the region where the S3 remote state bucket is) -bucket = # CHANGEME -dynamodb_table = # CHANGEME diff --git a/example/environments/staging/environment.tfvars b/example/environments/staging/environment.tfvars deleted file mode 100644 index 3bd6e7a..0000000 --- a/example/environments/staging/environment.tfvars +++ /dev/null @@ -1,8 +0,0 @@ -/* -This file is required. -It defines environment-level variables. -The scope of these variables is limited to all layers in this environment. -*/ - -role_arn = # CHANGEME (the role the AWS providers will assume) -region = # CHANGEME diff --git a/example/stacks/example/base/foo.tf b/example/stacks/example/base/foo.tf deleted file mode 100644 index 3911a2a..0000000 --- a/example/stacks/example/base/foo.tf +++ /dev/null @@ -1 +0,0 @@ -resource "null_resource" "foo" {} diff --git a/example/stacks/example/base/outputs.tf b/example/stacks/example/base/outputs.tf deleted file mode 100644 index 10a6c50..0000000 --- a/example/stacks/example/base/outputs.tf +++ /dev/null @@ -1,3 +0,0 @@ -output "feedback" { - value = var.feedback -} diff --git a/example/stacks/example/layers/production/.keep b/example/stacks/example/layers/production/.keep deleted file mode 100644 index e31b179..0000000 --- a/example/stacks/example/layers/production/.keep +++ /dev/null @@ -1,3 +0,0 @@ -Empty file. 
- -We need an empty file so that the directory is tracked in git. diff --git a/example/stacks/example/layers/staging/bar.tf b/example/stacks/example/layers/staging/bar.tf deleted file mode 100644 index a724a3d..0000000 --- a/example/stacks/example/layers/staging/bar.tf +++ /dev/null @@ -1,7 +0,0 @@ -/* -This file is optional. -It defines layer-level resources. -The scope of these resources is limited to this layer in this stack. -*/ - -resource "null_resource" "bar" {} diff --git a/example/stacks/example/layers/staging/layer.tfvars b/example/stacks/example/layers/staging/layer.tfvars deleted file mode 100644 index 515398d..0000000 --- a/example/stacks/example/layers/staging/layer.tfvars +++ /dev/null @@ -1,7 +0,0 @@ -/* -This file is optional. -It defines layer-level variables. -The scope of these variables is limited to this layer in this stack. -*/ - -feedback = "It lets me override variables at the layer level!" diff --git a/example/stacks/example/stack.tfvars b/example/stacks/example/stack.tfvars deleted file mode 100644 index 903b747..0000000 --- a/example/stacks/example/stack.tfvars +++ /dev/null @@ -1,7 +0,0 @@ -/* -This file is optional. -It defines stack-level variables. -The scope of these variables is limited to all layers in this stack. -*/ - -feedback = "Stacks for Terraform is very cool!" 
diff --git a/pyproject.toml b/pyproject.toml new file mode 100644 index 0000000..b638dfa --- /dev/null +++ b/pyproject.toml @@ -0,0 +1,28 @@ +[project] +name = "stacks" +version = "2.0.3" +description = "Stacks, the Terraform code pre-processor" +readme = "README.md" +requires-python = ">=3.10" +dependencies = [ + "click>=8.1.7", + "cryptography>=43.0.3", + "deepmerge>=2.0", + "gitpython>=3.1.43", + "jinja2>=3.1.4", + "packaging>=24.2", + "python-hcl2>=5.1.1", +] + +[project.scripts] +stacks = "stacks:main.cli" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.hatch.build.targets.wheel] +packages = ["src/stacks"] + +[tool.ruff] +line-length = 300 diff --git a/src/filters.py b/src/filters.py deleted file mode 100644 index 8990cfc..0000000 --- a/src/filters.py +++ /dev/null @@ -1,35 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- -def deepformat(value, params): - if isinstance(value, dict): - return { - deepformat(key, params): deepformat(value, params) - for key, value in value.items() - } - if isinstance(value, list): - return [ - deepformat(item, params) - for item in value - ] - if isinstance(value, str): - return value.format(**params) - return value - - -__all__ = [ - deepformat, -] diff --git a/src/helpers.py b/src/helpers.py deleted file mode 100755 index 8929aab..0000000 --- a/src/helpers.py +++ /dev/null @@ -1,208 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import fnmatch -import glob -import json -import os -import pathlib -import shutil -import tempfile - -import deepmerge -import hcl2 -import jinja2 -import yaml - -import filters -from tools import encryption_decrypt - - -def directory_copy(srcpath, dstpath, ignore=[]): - """Copy the contents of the dir in 'srcpath' to the dir in 'dstpath'. 
- - Keyword arguments: - srcpath[str]: path to source directory - dstpath[str]: path to destination directory - ignore[list]: files in srcpath to not ignore when copying - """ - srcpath = pathlib.Path(srcpath) - dstpath = pathlib.Path(dstpath) - def ignorefunc(parent, items): - return [ - item - for item in items - if ( - item in ignore - or item == dstpath.name - or parent == dstpath.name - or any(fnmatch.fnmatch(item, pattern) for pattern in ignore) - ) - ] - shutil.copytree(srcpath, dstpath, ignore=ignorefunc, dirs_exist_ok=True) - - -def directory_remove(path, keep=[]): - """Remove directory in 'path', but preserve any files in 'keep'. - - Keyword arguments: - path[str]: path to directory - keep[list]: files in directory to keep - """ - path = pathlib.Path(path) - if not path.is_dir(): - return - - for item in path.iterdir(): - if item.name not in keep: - if item.is_dir(): - shutil.rmtree(item) - else: - item.unlink() - - -def json_read(patterns): - """Read JSON files in 'patterns' and return their merged contents. - - Keyword arguments: - patterns[str/list]: pattern/s to JSON file/s, in ascending order of priority - """ - assert(isinstance(patterns, list)) - data = {} - for pattern in patterns: - for path in sorted(glob.glob(str(pattern))): - path = pathlib.Path(path) - if not path.is_file(): - continue - with open(path, "r") as f: - data = deepmerge.always_merger.merge(data, json.load(f)) - return data - - -def json_write(data, path, indent=2): - """Write 'data' to file in 'path' in JSON format. - - Keyword arguments: - path[str]: destination file path - data[any]: JSON-serializable data structure to write to file - indent[int,optional]: number of spaces to indent JSON levels with - """ - with open(pathlib.Path(path), "w") as f: - json.dump(data, f, indent=indent) - - -def yaml_read(patterns): - """Read YAML files in 'patterns' and return their merged contents. 
- - Keyword arguments: - patterns[str/list]: pattern/s to YAML file/s, in ascending order of priority - """ - assert(isinstance(patterns, list)) - data = {} - for pattern in patterns: - for path in sorted(glob.glob(str(pattern))): - path = pathlib.Path(path) - if not path.is_file(): - continue - with open(path, "r") as f: - data = deepmerge.always_merger.merge(data, yaml.safe_load(f)) - return data - - -def yaml_write(data, path, indent=2, width=1000): - """Write 'data' to file in 'path' in YAML format. - - Keyword arguments: - path[str]: destination file path - data[any]: YAML-serializable data structure to write to file - """ - with open(pathlib.Path(path), "w") as f: - yaml.dump(data, f, indent=indent, width=width) - - -def hcl2_read(patterns): - """Read HCL2 files in 'patterns' and return their merged contents. - - Keyword arguments: - patterns[str/list]: pattern/s to HCL2 file/s, in ascending order of priority - """ - assert(isinstance(patterns, list)) - data = {} - for pattern in patterns: - for path in sorted(glob.glob(str(pattern))): - path = pathlib.Path(path) - if not path.is_file(): - continue - with open(path, "r") as f: - data = deepmerge.always_merger.merge(data, hcl2.load(f)) - return hcl2_decrypt(data) - - -def hcl2_decrypt(data): - """Decrypts all strings in 'data'. 
- - Keyword arguments: - data[any]: any HCL2-sourced data structure - """ - if isinstance(data, str) and data.startswith("ENC[") and data.endswith("]"): - key_path = os.getenv("STACKS_PRIVATE_KEY_PATH") - if not key_path: - raise Exception("could not decrypt data: STACKS_PRIVATE_KEY_PATH is not set") - if not pathlib.Path(key_path).exists(): - raise Exception(f"could not decrypt data: STACKS_PRIVATE_KEY_PATH ({key_path}) does not exist") - return encryption_decrypt.main(data, key_path) - - elif isinstance(data, list): - for i in range(len(data)): - data[i] = hcl2_decrypt(data[i]) - - elif isinstance(data, dict): - for k, v in data.items(): - data[k] = hcl2_decrypt(v) - - return data - - -def jinja2_render(patterns, data): - """Overwrite files in 'patterns' with their Jinja2 render. - - Keyword arguments: - patterns[str/list]: pattern/s of text file/s - data[dict]: variables to render files with - """ - assert(isinstance(patterns, list)) - for pattern in patterns: - for path in sorted(glob.glob(str(pattern))): - path = pathlib.Path(path) - if not path.is_file(): - continue - try: - with open(path, "r") as fin: - template = jinja2.Template(fin.read()) - - rendered = template.render(data | { - func.__name__: func - for func in filters.__all__ - }) - - with open(path, "w") as fout: - fout.write(rendered) - except jinja2.exceptions.UndefinedError as e: - print(f"Failure to render {path}: {e}", file=sys.stderr) - sys.exit(1) - except jinja2.exceptions.TemplateSyntaxError as e: - print(f"Failure to render {path} at line {e.lineno}, in statement {e.source}: {e}", file=sys.stderr) - sys.exit(1) diff --git a/src/postinit.py b/src/postinit.py deleted file mode 100755 index c685a2e..0000000 --- a/src/postinit.py +++ /dev/null @@ -1,77 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. 
-# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import glob -import pathlib - -import git - -import helpers - - -# define context -cwd = pathlib.Path().cwd() # current working directory -rootdir = cwd.parent.parent.parent.parent.parent # repository root -envsdir = rootdir.joinpath("environments") # where environment definitions live -stacksdir = rootdir.joinpath("stacks") # where stack definitions live -stackdir = cwd.parent.parent.parent # where this stack's definition lives -workdir = cwd # where plan/apply will happen -tfdir = workdir.joinpath(".terraform") # the .terraform directory -modulesdir = tfdir.joinpath("modules") # where child modules live -stack = stackdir.name # the stack's name -layer = cwd.parent.name # the layer's name -layer_split = layer.split("_", 1) # split environment and instance -env = layer_split[0] # the layer's environment's name -envdir = envsdir.joinpath(env) # the layer's environment directory -instance = layer_split[1] if len(layer_split) > 1 else None # the layer's instance's name, otherwise None - -# read variables -variables = { - **helpers.hcl2_read([ - envdir.joinpath("environment.tfvars"), - stacksdir.joinpath("*.common.tfvars"), stacksdir.joinpath("common.tfvars"), - stackdir.joinpath("*.stack.tfvars"), stackdir.joinpath("stack.tfvars"), - cwd.joinpath("*.layer.tfvars"), cwd.joinpath("layer.tfvars"), - ]), - "stacks-root": "../../../../..", # repository root, relative to workdir - "stacks-path": f"stacks/{stack}/layer/{layer}", # layer path, relative to repository root - "stacks-stack": stack, # stack name - "stacks-layer": layer, # stack layer - 
"stacks-environment": env, # layer environment - "stacks-instance": instance or "", # layer instance (if exists) -} - -for module in helpers.json_read([modulesdir.joinpath("modules.json")]).get("Modules", []): - # ignore root module - if module["Key"] == "": - continue - - moduledir = workdir.joinpath(module["Dir"]) - - # remove changes to avoid interference between runs - try: - repo = git.Repo(moduledir) - repo.git.reset("--hard") - repo.git.clean("--force") - except git.exc.InvalidGitRepositoryError: # use git if the module doesn't to keep track of initial state between runs - repo = git.Repo.init(moduledir) - repo.config_writer().set_value("user", "name", "Stacks").release() - repo.config_writer().set_value("user", "email", "stacks@example.com").release() - repo.git.add(".") - repo.git.commit("-m", "stacks: postinit checkpoint") - - # render files - helpers.jinja2_render([workdir.joinpath("*.tf")], {"var": variables}) diff --git a/src/preinit.py b/src/preinit.py deleted file mode 100755 index ae2e3a0..0000000 --- a/src/preinit.py +++ /dev/null @@ -1,128 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- -import glob -import os -import pathlib - -import helpers - - -# define context -cwd = pathlib.Path().cwd() # current working directory -rootdir = cwd.parent.parent.parent.parent # repository root -envsdir = rootdir.joinpath("environments") # where environment definitions live -stacksdir = rootdir.joinpath("stacks") # where stack definitions live -stackdir = cwd.parent.parent # where this stack's definition lives -basedir = stackdir.joinpath("base") # where this stack's base code lives -workdir = cwd.joinpath("stacks.out") # where init/plan/apply will happen -stack = stackdir.name # the stack's name -layer = cwd.name # the layer's name -layer_split = layer.split("_", 1) # split environment and instance -env = layer_split[0] # the layer's environment's name -envdir = envsdir.joinpath(env) # the layer's environment directory -instance = layer_split[1] if len(layer_split) > 1 else None # the layer's instance's name, otherwise None -# validate context -assert(basedir.exists()) -assert(envdir.exists()) - -# remove workdir to avoid interference between runs -helpers.directory_remove(workdir, keep=[ - ".terraform", # keep .terraform to avoid re-init - ".terraform.lock.hcl", # keep .terraform.lock.hcl to avoid re-init -]) - -# read variables -variables = { - **helpers.hcl2_read([ - envdir.joinpath("environment.tfvars"), - stacksdir.joinpath("*.common.tfvars"), stacksdir.joinpath("common.tfvars"), - stackdir.joinpath("*.stack.tfvars"), stackdir.joinpath("stack.tfvars"), - cwd.joinpath("*.layer.tfvars"), cwd.joinpath("layer.tfvars"), - ]), - "stacks-root": "../../../../..", # repository root, relative to workdir - "stacks-path": f"stacks/{stack}/layer/{layer}", # layer path, relative to repository root - "stacks-stack": stack, # stack name - "stacks-layer": layer, # stack layer - "stacks-environment": env, # layer environment - "stacks-instance": instance or "", # layer instance (if exists) -} - -# merge stack and layer files -reserved = [".terraform", 
".terraform.lock.hcl", "stacks.tf.json", "zzz.auto.tfvars.json"] -helpers.directory_copy(basedir, workdir, ignore=reserved) -helpers.directory_copy(cwd, workdir, ignore=reserved+[ - "layer.tfvars", "*.layer.tfvars", - pathlib.Path(os.getenv("PLANFILE","default.tfplan")).name, # do not copy plan files (binary) - pathlib.Path(os.getenv("SHOWFILE","default.json")).name, # do not copy plan files (json) -]) - -# render files -helpers.jinja2_render([workdir.joinpath("*.tf")], {"var": variables}) - -# initialize universe -universe = {"terraform": {}, "provider": [], "variable": {}} - -# configure remote state backend -universe["terraform"]["backend"] = {"s3": { - **helpers.hcl2_read([envdir.joinpath("backend.tfvars")]), - "key": f"layers/{stack}/{f'{instance}/' if instance else ''}terraform.tfstate", -}} - -# add missing variables' declarations -variables_declared = [ - list(variable.keys())[0] - for variable in helpers.hcl2_read([workdir.joinpath("*.tf")]).get("variable", []) -] -for variable in {**variables, **helpers.hcl2_read([workdir.joinpath("*.auto.tfvars")])}.keys(): # also autodeclare variables in *.auto.tfvars files - if variable not in variables_declared: - universe["variable"][variable] = {} - -# configure AWS providers -regions = [] # CHANGEME (the list of regions you operate in) -def provider_aws_append(region, role, alias=None): - global universe, variables, stack, env - universe["provider"].append({"aws": { - **variables.get("stacks-provider-aws-extra-args", {}), - **({"alias": alias} if alias else {}), - "region": region, - "assume_role": [{"role_arn": role}], - "default_tags": [{"tags": { - **variables.get("stacks-provider-aws-extra-tags", {}), - "stacks-path": variables["stacks-path"], - }}], - }}) -# inject default provider -provider_aws_append(variables["region"], variables["role_arn"]) -# inject default provider in other regions -for region in regions: - provider_aws_append(region, variables["role_arn"], region) -# inject all other environments' 
default providers -for path in envsdir.iterdir(): - if not path.is_dir(): - continue - try: - environment = helpers.hcl2_read([path.joinpath("environment.tfvars")]) - provider_aws_append(environment["region"], environment["role_arn"], path.name) - # inject all other environments' other regions' providers - for region in regions: - provider_aws_append(region, environment["role_arn"], f"{path.name}_{region}") - except KeyError: - continue - -# persist universe and variables -helpers.json_write(universe, workdir.joinpath("stacks.tf.json")) -helpers.json_write(variables, workdir.joinpath("zzz.auto.tfvars.json")) # 'zzz' so that it has the topmost precedence diff --git a/src/requirements.txt b/src/requirements.txt deleted file mode 100644 index 6dffa41..0000000 --- a/src/requirements.txt +++ /dev/null @@ -1,6 +0,0 @@ -cryptography==43.0.1 -deepmerge==2.0 -GitPython==3.1.43 -Jinja2==3.1.4 -python-hcl2==4.3.5 -PyYAML==6.0.2 diff --git a/src/stacks/__init__.py b/src/stacks/__init__.py new file mode 100644 index 0000000..c0a4e2b --- /dev/null +++ b/src/stacks/__init__.py @@ -0,0 +1 @@ +# this is needed so uv treats this as a package diff --git a/src/stacks/cmd/__init__.py b/src/stacks/cmd/__init__.py new file mode 100644 index 0000000..567b956 --- /dev/null +++ b/src/stacks/cmd/__init__.py @@ -0,0 +1,15 @@ +from . 
import surgery
+from .context import Context
+from .diff import diff
+from .preinit import preinit
+from .render import render
+from .terraform import terraform
+
+__all__ = [
+    "Context",
+    "diff",
+    "preinit",
+    "render",
+    "surgery",
+    "terraform",
+]
diff --git a/src/stacks/cmd/config.py b/src/stacks/cmd/config.py
new file mode 100644
index 0000000..cdf93c3
--- /dev/null
+++ b/src/stacks/cmd/config.py
@@ -0,0 +1,10 @@
+import os
+
+
+ENVIRONMENTS_DIR = os.getenv("STACKS_ENVIRONMENTS_DIR", "environments")
+STACKS_DIR = os.getenv("STACKS_STACKS_DIR", "stacks")
+BASE_DIR = os.getenv("STACKS_BASE_DIR", "base")
+LAYERS_DIR = os.getenv("STACKS_LAYERS_DIR", "layers")
+OUTPUT_DIR = os.getenv("STACKS_OUTPUT_DIR", "stacks.out")
+TERRAFORM_PATH = os.getenv("STACKS_TERRAFORM_PATH", "terraform")
+EDITOR = os.getenv("STACKS_EDITOR", os.getenv("EDITOR", "vi"))
diff --git a/src/stacks/cmd/context.py b/src/stacks/cmd/context.py
new file mode 100644
index 0000000..824e3fb
--- /dev/null
+++ b/src/stacks/cmd/context.py
@@ -0,0 +1,34 @@
+import pathlib
+
+from . 
import config
+
+
+class Context:
+    def __init__(self, path=None, out=None, parent=None):  # defaults are resolved per call, not once at import time
+        path = path or pathlib.Path().cwd()
+        out = out or path.joinpath(config.OUTPUT_DIR)
+        if path == path.parent.joinpath(out.name): out, path = path, path.parent  # we are already inside the output directory
+        self.path = path
+        self.root_dir = self.path.parent.parent.parent.parent
+        self.envs_dir = self.root_dir.joinpath(config.ENVIRONMENTS_DIR)
+        self.stacks_dir = self.root_dir.joinpath(config.STACKS_DIR)
+        self.stack_dir = self.path.parent.parent
+        self.base_dir = self.stack_dir.joinpath(config.BASE_DIR)
+        self.work_dir = out
+        self.terraform_dir = self.work_dir.joinpath(".terraform")
+        self.modules_dir = self.terraform_dir.joinpath("modules")
+        self.universe_file = self.work_dir.joinpath("stacks.tf.json")
+        self.variables_file = self.work_dir.joinpath("zzz.auto.tfvars.json")  # 'zzz.auto.tfvars.json' so that it has the topmost precedence
+        self.stack = self.stack_dir.name
+        self.layer = self.path.name
+        layer_split = self.layer.split("_", 1)
+        env_split = layer_split[0].split("@", 1)
+        self.env = env_split[0]
+        self.env_dir = self.envs_dir.joinpath(self.env)
+        self.subenv = env_split[1] if len(env_split) > 1 else None
+        self.subenv_dir = self.env_dir.joinpath(self.subenv) if self.subenv else None
+        self.instance = layer_split[1] if len(layer_split) > 1 else None
+        assert self.env_dir.exists()
+        assert self.subenv_dir is None or self.subenv_dir.exists()
+        self.ancestor = parent.ancestor if parent and parent.ancestor else parent
+        self.parent = parent
diff --git a/src/stacks/cmd/diff.py b/src/stacks/cmd/diff.py
new file mode 100644
index 0000000..1f6436b
--- /dev/null
+++ b/src/stacks/cmd/diff.py
@@ -0,0 +1,21 @@
+import git
+
+from . import render
+from . import context
+from .. 
import helpers
+
+
+def diff(ctx):
+    repository = git.Repo(path=ctx.path, search_parent_directories=True)
+    stashed = repository.is_dirty(untracked_files=True)  # only stash when there is something to stash, otherwise 'stash pop' below would fail
+    if stashed: repository.git.stash()
+    ctx_old = context.Context(path=ctx.path, out=ctx.work_dir.parent.joinpath(f"{ctx.work_dir.name}.old"))
+    render.render(ctx=ctx_old)
+
+    if stashed: repository.git.stash("pop")
+
+    ctx_new = context.Context(path=ctx.path, out=ctx.work_dir.parent.joinpath(f"{ctx.work_dir.name}.new"))
+    render.render(ctx=ctx_new)
+
+    helpers.run_command("diff", "-ur", "--color", ctx_old.work_dir, ctx_new.work_dir)
+    helpers.run_command("rm", "-rf", ctx_old.work_dir, ctx_new.work_dir)
diff --git a/src/stacks/cmd/preinit.py b/src/stacks/cmd/preinit.py
new file mode 100644
index 0000000..7e933ef
--- /dev/null
+++ b/src/stacks/cmd/preinit.py
@@ -0,0 +1,98 @@
+import shutil
+
+from .. import helpers
+
+
+def preinit(ctx):
+    helpers.directory_remove(ctx.work_dir, keep=[".terraform", ".terraform.lock.hcl"])
+
+    variables_predefined = {
+        "stacks_path": f"stacks/{ctx.stack}/layer/{ctx.layer}",
+        "stacks_root": "/".join([".."] * (len(ctx.work_dir.parts) - len(ctx.root_dir.parts))),
+        "stacks_stack": ctx.stack,
+        "stacks_layer": ctx.layer,
+        "stacks_environment": ctx.env,
+        "stacks_subenvironment": ctx.subenv or "",
+        "stacks_instance": ctx.instance or "",
+        "stacks_environments": {
+            item.name: helpers.hcl2_read([item.joinpath("env.tfvars")])  # TODO: replace 'env.tfvars' with '*.tfvars' after all stacks have been upgraded to v2
+            for item in ctx.envs_dir.iterdir()
+            if item.is_dir() and item.joinpath("env.tfvars").exists()
+        },
+    }
+
+    helpers.copy_files(ctx.stacks_dir, ctx.work_dir, include=["*.tf", "*.tfvars.jinja"], prefix="common_")
+    helpers.copy_files(ctx.stack_dir, ctx.work_dir, include=["*.tfvars.jinja"], prefix="stack_")
+    for item in ctx.base_dir.iterdir():
+        if item.is_dir():
+            shutil.copytree(item, ctx.work_dir.joinpath(item.name), dirs_exist_ok=True)
+        elif item.is_file() and not item.match("*.tf"):
+            shutil.copyfile(item, ctx.work_dir.joinpath(item.name))
+    
helpers.copy_files(ctx.base_dir, ctx.work_dir, include=["*.tf"], prefix="base_") + helpers.copy_files(ctx.path, ctx.work_dir, include=["*.tfvars.jinja"], prefix="layer_") + + helpers.jinja2_render( + ctx=ctx, + patterns=[ctx.work_dir.joinpath("*.tfvars.jinja")], + data={ + "var": { + **helpers.hcl2_read( + [ + pattern + for pattern in [ + ctx.env_dir.joinpath("env.tfvars"), + ctx.subenv_dir.joinpath("*.tfvars") if ctx.subenv_dir else None, + ctx.stacks_dir.joinpath("*.tfvars"), + ctx.stack_dir.joinpath("*.tfvars"), + ctx.path.joinpath("*.tfvars"), + ] + if pattern + ] + ), + **variables_predefined, + } + }, + ) + + variables = { + **helpers.hcl2_read( + [ + pattern + for pattern in [ + ctx.env_dir.joinpath("env.tfvars"), + ctx.subenv_dir.joinpath("*.tfvars") if ctx.subenv_dir else None, + ctx.stacks_dir.joinpath("*.tfvars"), + ctx.work_dir.joinpath("common_*.tfvars.jinja"), + ctx.stack_dir.joinpath("*.tfvars"), + ctx.work_dir.joinpath("stack_*.tfvars.jinja"), + ctx.path.joinpath("*.tfvars"), + ctx.work_dir.joinpath("layer_*.tfvars.jinja"), + ] + if pattern + ] + ), + **variables_predefined, + } + helpers.jinja2_render( + ctx=ctx, + patterns=[ctx.work_dir.joinpath("*.tf")], + data={"var": variables}, + ) + + variables_declared = [list(variable.keys())[0] for variable in helpers.hcl2_read([ctx.work_dir.joinpath("*.tf")]).get("variable", [])] + + helpers.json_write( + { + "variable": { + variable: {} + for variable in { + **variables, + **helpers.hcl2_read([ctx.work_dir.joinpath("*.auto.tfvars")]), + } + if variable not in variables_declared + } + }, + ctx.universe_file, + ) + + helpers.json_write(variables, ctx.variables_file) diff --git a/src/stacks/cmd/render.py b/src/stacks/cmd/render.py new file mode 100644 index 0000000..beb39e8 --- /dev/null +++ b/src/stacks/cmd/render.py @@ -0,0 +1,10 @@ +from . import config +from . import preinit +from .. 
import helpers + + +def render(ctx, init="auto"): + preinit.preinit(ctx=ctx) + + if init == "always" or (init == "auto" and not ctx.terraform_dir.joinpath("terraform.tfstate").exists()): + helpers.run_command(config.TERRAFORM_PATH, f"-chdir={ctx.work_dir}", "init") diff --git a/src/stacks/cmd/surgery.py b/src/stacks/cmd/surgery.py new file mode 100644 index 0000000..a2ee879 --- /dev/null +++ b/src/stacks/cmd/surgery.py @@ -0,0 +1,43 @@ +import hcl2 + +from . import terraform +from . import context +from . import config +from . import render +from .. import helpers + + +def _list(ctx): + terraform.terraform(ctx=ctx, args=["state", "list"]) + + +def _import(address, _id, ctx): + terraform.terraform(ctx=ctx, args=["import", address, _id]) + + +def remove(address, ctx): + terraform.terraform(ctx=ctx, args=["state", "rm", address]) + + +def move(from_address, to_path, to_address, ctx): + render.render(ctx=ctx) + _id = list(list(hcl2.loads(helpers.run_command(config.TERRAFORM_PATH, f"-chdir={ctx.work_dir}", "state", "show", "-no-color", from_address, interactive=False).stdout)["resource"][0].items())[0][1].items())[0][1]["id"] # this is ugly but it works, do NOT touch + _import(ctx=context.Context(path=to_path), address=to_address, _id=_id) + remove(ctx=ctx, address=from_address) + + +def edit(ctx): + render.render(ctx=ctx) + + old_state = ctx.terraform_dir.joinpath("old.tfstate") + helpers.run_script(f"{config.TERRAFORM_PATH} -chdir={ctx.work_dir} state pull > '{old_state}'") # we use run_script so we can redirect output easily to old_state + + new_state = ctx.terraform_dir.joinpath("new.tfstate") + helpers.run_command("cp", old_state, new_state) + + helpers.run_command(config.EDITOR, new_state) + + helpers.run_script(f"diff -ru --color {old_state} {new_state} || true") # '|| true' because diff returns non-zero if differences were found, which would mean we sys.exit and halt execution + if input("Proceed with changes? 
[y/N] ").lower().startswith("y"):
+        # we do NOT increase serial automatically, to protect non-experts from themselves
+        helpers.run_command(config.TERRAFORM_PATH, f"-chdir={ctx.work_dir}", "state", "push", new_state)
diff --git a/src/stacks/cmd/terraform.py b/src/stacks/cmd/terraform.py
new file mode 100644
index 0000000..82a94a7
--- /dev/null
+++ b/src/stacks/cmd/terraform.py
@@ -0,0 +1,8 @@
+from . import config
+from . import render
+from .. import helpers
+
+
+def terraform(ctx, init="auto", args=()):  # a tuple default avoids the mutable-default-argument pitfall
+    render.render(ctx=ctx, init=init)
+    return helpers.run_command(config.TERRAFORM_PATH, f"-chdir={ctx.work_dir}", *args)
diff --git a/src/stacks/filters/__init__.py b/src/stacks/filters/__init__.py
new file mode 100644
index 0000000..c095039
--- /dev/null
+++ b/src/stacks/filters/__init__.py
@@ -0,0 +1,12 @@
+from .throw import throw
+from .deepformat import deepformat
+from .lookup import variable, output, resource
+
+
+__all__ = [  # note: this intentionally holds the filter callables themselves (not their names); template.py iterates it as a registry
+    throw,
+    deepformat,
+    variable,
+    output,
+    resource,
+]
diff --git a/src/stacks/filters/deepformat.py b/src/stacks/filters/deepformat.py
new file mode 100644
index 0000000..3ed31b1
--- /dev/null
+++ b/src/stacks/filters/deepformat.py
@@ -0,0 +1,8 @@
+def deepformat(ctx, value, params):
+    if isinstance(value, str):
+        return value.format(**params)
+    elif isinstance(value, list):
+        return [deepformat(ctx, item, params) for item in value]
+    elif isinstance(value, dict):
+        return {deepformat(ctx, key, params): deepformat(ctx, val, params) for key, val in value.items()}
+    return value
diff --git a/src/stacks/filters/lookup.py b/src/stacks/filters/lookup.py
new file mode 100644
index 0000000..8807d80
--- /dev/null
+++ b/src/stacks/filters/lookup.py
@@ -0,0 +1,54 @@
+import pathlib
+import json
+
+import hcl2
+
+from ..cmd import config
+from ..cmd import context
+from ..cmd import preinit
+from .. 
import helpers + + +def remote_context(ctx, stack=None, environment=None, subenvironment=None, instance=None): # TODO: explore if this should be a method of context.Context + assert any([stack, environment, subenvironment, instance]) + remote_path = pathlib.Path(config.STACKS_DIR, stack or ctx.stack, config.LAYERS_DIR, (environment or ctx.env) + (f"@{subenvironment}" if subenvironment else "") + (f"_{instance}" if instance else "")) + return context.Context(path=ctx.root_dir.joinpath(remote_path), out=ctx.work_dir.joinpath(remote_path, config.OUTPUT_DIR)) + + +def variable(ctx, name, *args, **kwargs): + remote_ctx = remote_context(ctx=ctx, *args, **kwargs) + return helpers.hcl2_read( + [ + pattern + for pattern in [ + remote_ctx.env_dir.joinpath("env.tfvars"), + remote_ctx.subenv_dir.joinpath("*.tfvars") if remote_ctx.subenv_dir else None, + remote_ctx.stacks_dir.joinpath("*.tfvars"), + remote_ctx.stack_dir.joinpath("*.tfvars"), + remote_ctx.path.joinpath("*.tfvars"), + ] + if pattern + ] + )[name] + + +def terraform_init_headless(ctx, argv, *args, **kwargs): + remote_ctx = remote_context(ctx=ctx, *args, **kwargs) + preinit.preinit(ctx=remote_ctx) + code = helpers.hcl2_read([remote_ctx.work_dir.joinpath("*.tf")]) + helpers.directory_remove(remote_ctx.work_dir) + helpers.json_write({"terraform": [{"backend": code["terraform"][0]["backend"]}]}, remote_ctx.universe_file) + helpers.run_command(config.TERRAFORM_PATH, f"-chdir={remote_ctx.work_dir}", "init") # we cannot avoid pulling providers because we need to know the resources' schema + return helpers.run_command(config.TERRAFORM_PATH, f"-chdir={remote_ctx.work_dir}", *argv, interactive=False).stdout + + +def output(ctx, name, *args, **kwargs): + if ctx.ancestor == ctx.parent and ctx.parent is not None: + return "" # empty string so it cannot be iterated upon + return json.loads(terraform_init_headless(ctx=ctx, argv=["output", "-json", name], *args, **kwargs)) + + +def resource(ctx, address, *args, **kwargs): + if 
ctx.ancestor == ctx.parent and ctx.parent is not None:
+        return ""  # empty string so it cannot be iterated upon
+    return hcl2.loads(terraform_init_headless(ctx=ctx, argv=["state", "show", "-no-color", address], *args, **kwargs))["resource"][0].popitem()[1].popitem()[1]
diff --git a/src/stacks/filters/throw.py b/src/stacks/filters/throw.py
new file mode 100644
index 0000000..c8645f4
--- /dev/null
+++ b/src/stacks/filters/throw.py
@@ -0,0 +1,2 @@
+def throw(ctx, message):
+    raise Exception(message)
diff --git a/src/stacks/helpers/__init__.py b/src/stacks/helpers/__init__.py
new file mode 100644
index 0000000..57612d2
--- /dev/null
+++ b/src/stacks/helpers/__init__.py
@@ -0,0 +1,23 @@
+from .config import config_read, json_read, hcl2_read, config_write, json_write
+from .crypto import genkey, encrypt, decrypt
+from .directory import directory_remove, copy_files
+from .merge import merge
+from .run import run_command, run_script
+from .template import jinja2_render
+
+__all__ = [
+    "config_read",
+    "json_read",
+    "hcl2_read",
+    "config_write",
+    "json_write",
+    "genkey",
+    "encrypt",
+    "decrypt",
+    "directory_remove",
+    "copy_files",
+    "merge",
+    "run_command",
+    "run_script",
+    "jinja2_render",
+]
diff --git a/src/stacks/helpers/config.py b/src/stacks/helpers/config.py
new file mode 100644
index 0000000..f4e6e0b
--- /dev/null
+++ b/src/stacks/helpers/config.py
@@ -0,0 +1,52 @@
+import glob
+import json
+import pathlib
+
+import hcl2
+
+from .crypto import decrypt
+from .merge import merge
+
+
+def config_read(patterns, decoderfunc, **decoderargs):
+    """Read configuration files in 'patterns' using 'decoderfunc' and return their merged contents. 
+ + Keyword arguments: + patterns[list]: patterns to configuration files, in ascending order of priority + decoderfunc[function]: function that parses a given configuration file into a data structure + decoderargs[dict]: keyword arguments to pass to decoderfunc + """ + assert isinstance(patterns, list) + data = {} + for pattern in patterns: + for path in sorted(glob.glob(str(pattern))): + path = pathlib.Path(path) + if path.is_file(): + with open(path, "r") as f: + data = merge(data, decoderfunc(f, **decoderargs)) + return decrypt(data) + + +def json_read(patterns): + return config_read(patterns, json.load) + + +def hcl2_read(patterns): + return config_read(patterns, hcl2.load) + + +def config_write(data, path, encoderfunc, **encoderargs): + """Write 'data' to file in 'path' using 'encoderfunc' for formatting. + + Keyword arguments: + data[any]: structure to write to file + path[pathlib.Path]: destination file path + encoderfunc[function]: function that formats a given data structure into a configuration file + encoderargs[dict]: keyword arguments to pass to encoderfunc + """ + with open(path, "w") as f: + encoderfunc(data, f, **encoderargs) + + +def json_write(data, path): + config_write(data, path, json.dump, indent=2) diff --git a/src/stacks/helpers/crypto.py b/src/stacks/helpers/crypto.py new file mode 100644 index 0000000..8b36689 --- /dev/null +++ b/src/stacks/helpers/crypto.py @@ -0,0 +1,142 @@ +import base64 +import cryptography.hazmat.backends +import cryptography.hazmat.primitives.asymmetric.padding +import cryptography.hazmat.primitives.asymmetric.rsa +import cryptography.hazmat.primitives.ciphers +import cryptography.hazmat.primitives.hashes +import cryptography.hazmat.primitives.padding +import cryptography.hazmat.primitives.serialization +import os + + +def genkey(public_key_path, private_key_path): + """Generate a public/private key pair to use with 'encrypt' and 'decrypt'. 
+ + Keyword arguments: + public_key_path[pathlib.Path]: where to store the generated public key + private_key_path[pathlib.Path]: where to store the generated private key + """ + key = cryptography.hazmat.primitives.asymmetric.rsa.generate_private_key( + backend=cryptography.hazmat.backends.default_backend(), + key_size=2**11, + public_exponent=2**16 + 1, + ) + with open(private_key_path, "wb") as f: + f.write( + key.private_bytes( + encoding=cryptography.hazmat.primitives.serialization.Encoding.PEM, + format=cryptography.hazmat.primitives.serialization.PrivateFormat.PKCS8, + encryption_algorithm=cryptography.hazmat.primitives.serialization.NoEncryption(), + ) + ) + with open(public_key_path, "wb") as f: + f.write( + key.public_key().public_bytes( + encoding=cryptography.hazmat.primitives.serialization.Encoding.PEM, + format=cryptography.hazmat.primitives.serialization.PublicFormat.SubjectPublicKeyInfo, + ) + ) + + +def encrypt(public_key_path, string): + """Encrypt 'string' using 'public_key_path'. 
+ + Keyword arguments: + public_key_path[pathlib.Path]: path to public key + string[str]: string to encrypt + """ + padder = cryptography.hazmat.primitives.padding.PKCS7(128).padder() + padded = padder.update(string.encode()) + padder.finalize() + + symmetric_key = os.urandom(32) + + init_vector = os.urandom(12) + init_vector_base64 = base64.b64encode(init_vector).decode("utf-8") + + encryptor = cryptography.hazmat.primitives.ciphers.Cipher( + cryptography.hazmat.primitives.ciphers.algorithms.AES(symmetric_key), + cryptography.hazmat.primitives.ciphers.modes.GCM(init_vector), + backend=cryptography.hazmat.backends.default_backend(), + ).encryptor() + + string_encrypted = encryptor.update(padded) + encryptor.finalize() + string_encrypted_base64 = base64.b64encode(string_encrypted).decode("utf-8") + + encryptor_tag_base64 = base64.b64encode(encryptor.tag).decode("utf-8") + + with open(public_key_path, "rb") as f: + public_key = cryptography.hazmat.primitives.serialization.load_pem_public_key( + f.read(), + backend=cryptography.hazmat.backends.default_backend(), + ) + + symmetric_key_encrypted_base64 = base64.b64encode( + public_key.encrypt( + symmetric_key, + cryptography.hazmat.primitives.asymmetric.padding.OAEP( + mgf=cryptography.hazmat.primitives.asymmetric.padding.MGF1(algorithm=cryptography.hazmat.primitives.hashes.SHA256()), + algorithm=cryptography.hazmat.primitives.hashes.SHA256(), + label=None, + ), + ) + ).decode("utf-8") + + return f"ENC[{symmetric_key_encrypted_base64};{encryptor_tag_base64};{init_vector_base64};{string_encrypted_base64}]" + + +def decrypt(data, private_key_path=os.getenv("STACKS_PRIVATE_KEY_PATH")): + """Decrypt 'data' using 'private_key_path'. 
+ + Keyword arguments: + private_key_path[pathlib.Path]: path to private key + data[any]: any data structure + """ + if isinstance(data, str) and data.startswith("ENC[") and data.endswith("]"): + ( + symmetric_key_encrypted_base64, + encryptor_tag_base64, + init_vector_base64, + string_encrypted_base64, + ) = data.removeprefix("ENC[").removesuffix("]").split(";") + + with open(private_key_path, "rb") as f: + private_key = cryptography.hazmat.primitives.serialization.load_pem_private_key( + f.read(), + password=None, + backend=cryptography.hazmat.backends.default_backend(), + ) + symmetric_key = private_key.decrypt( + base64.b64decode(symmetric_key_encrypted_base64.encode()), + cryptography.hazmat.primitives.asymmetric.padding.OAEP( + mgf=cryptography.hazmat.primitives.asymmetric.padding.MGF1(algorithm=cryptography.hazmat.primitives.hashes.SHA256()), + algorithm=cryptography.hazmat.primitives.hashes.SHA256(), + label=None, + ), + ) + + init_vector = base64.b64decode(init_vector_base64.encode()) + + string_encrypted = base64.b64decode(string_encrypted_base64.encode()) + + encryptor_tag = base64.b64decode(encryptor_tag_base64.encode()) + + decryptor = cryptography.hazmat.primitives.ciphers.Cipher( + cryptography.hazmat.primitives.ciphers.algorithms.AES(symmetric_key), + cryptography.hazmat.primitives.ciphers.modes.GCM(init_vector, encryptor_tag), + backend=cryptography.hazmat.backends.default_backend(), + ).decryptor() + + unpadder = cryptography.hazmat.primitives.padding.PKCS7(128).unpadder() + padded = decryptor.update(string_encrypted) + decryptor.finalize() + + string_decrypted = unpadder.update(padded) + unpadder.finalize() + + return string_decrypted.decode("utf-8") + + elif isinstance(data, list): + return [decrypt(private_key_path=private_key_path, data=item) for item in data] + + elif isinstance(data, dict): + return {key: decrypt(private_key_path=private_key_path, data=value) for key, value in data.items()} + + return data diff --git 
a/src/stacks/helpers/directory.py b/src/stacks/helpers/directory.py
new file mode 100644
index 0000000..03cb30a
--- /dev/null
+++ b/src/stacks/helpers/directory.py
@@ -0,0 +1,31 @@
+import fnmatch
+import shutil
+
+
+def copy_files(src, dst, include=(), prefix=""):
+    assert src.is_dir()
+
+    for item in src.iterdir():
+        if item.is_file() and any(fnmatch.fnmatch(item.name, pattern) for pattern in include):
+            dst.mkdir(exist_ok=True, parents=True)
+            shutil.copyfile(
+                src=item,
+                dst=dst.joinpath(f"{prefix}{item.name}"),
+            )
+
+
+def directory_remove(path, keep=[]):
+    """Remove 'path' dir, but preserve any paths in 'keep'.
+    You can 'keep' paths in 'path', but not in any of its subdirectories.
+
+    Keyword arguments:
+    path[pathlib.Path]: path to directory
+    keep[list]: paths to keep
+    """
+    if path.is_dir():
+        for item in path.iterdir():
+            if item.name not in keep:
+                if item.is_dir():
+                    shutil.rmtree(item)
+                else:
+                    item.unlink()
diff --git a/src/stacks/helpers/merge.py b/src/stacks/helpers/merge.py
new file mode 100644
index 0000000..11ef7b9
--- /dev/null
+++ b/src/stacks/helpers/merge.py
@@ -0,0 +1,18 @@
+import deepmerge
+
+
+def merge(a, b):
+    """Merge the contents of 'a' and 'b'.
+
+    Keyword arguments:
+    a[any]: any data structure
+    b[any]: any data structure
+    """
+    if isinstance(a, dict) and isinstance(b, dict):
+        for key in list(b.keys()):
+            if key in a and key.endswith("_override"):  # TODO: remove this and use bare deepmerge
+                # Ideally, this should be handled by the config language instead (i.e. HCL).
+                # This only works with top-level keys because deepmerge won't let me define a custom strategy recursively. 
+ a[key] = b[key] + b.pop(key) + return deepmerge.always_merger.merge(a, b) diff --git a/src/stacks/helpers/run.py b/src/stacks/helpers/run.py new file mode 100644 index 0000000..771220a --- /dev/null +++ b/src/stacks/helpers/run.py @@ -0,0 +1,16 @@ +import subprocess +import sys + + +def run_command(*argv, interactive=True): + try: + p = subprocess.run(args=argv, encoding="utf-8", check=True, capture_output=not interactive) + except subprocess.CalledProcessError as e: + if interactive: + sys.exit(e.returncode) + raise e + return p + + +def run_script(script): + return run_command("bash", "-c", script) diff --git a/src/stacks/helpers/template.py b/src/stacks/helpers/template.py new file mode 100644 index 0000000..508c1ac --- /dev/null +++ b/src/stacks/helpers/template.py @@ -0,0 +1,36 @@ +import glob +import pathlib + +import jinja2 + +from .. import filters + + +def jinja2_render(ctx, patterns, data={}): + """Overwrite files in 'patterns' with their Jinja2 render. + + Keyword arguments: + patterns[list]: patterns of text files + data[dict]: data to render files with + """ + for pattern in patterns: + for path in sorted(glob.glob(str(pattern))): + path = pathlib.Path(path) + if path.is_file(): + try: + with open(path, "r") as fin: + template = jinja2.Template(fin.read()) + with open(path, "w") as fout: + filters_dict = {} + for filter in filters.__all__: + filter_name = filter.__name__ + + def filter_with_context(*args, filter_name=filter_name, **kwargs): + return getattr(filters, filter_name)(ctx, *args, **kwargs) + + filters_dict[filter_name] = filter_with_context + fout.write(template.render(data | filters_dict)) + except jinja2.exceptions.UndefinedError as e: + raise Exception(f"Failure to render {path}: {e}") + except jinja2.exceptions.TemplateSyntaxError as e: + raise Exception(f"Failure to render {path} at line {e.lineno}, in statement {e.source}: {e}") diff --git a/src/stacks/main.py b/src/stacks/main.py new file mode 100644 index 0000000..691c137 --- 
/dev/null +++ b/src/stacks/main.py @@ -0,0 +1,143 @@ +import pathlib + +import click + +from . import cmd +from . import helpers + + +@click.group() +def cli(): + """ + Stacks, the Terraform code pre-processor. + + All commands MUST run within a layer directory, unless noted otherwise. + """ + pass + + +@cli.command(hidden=True) # hidden because it should not be used independently unless for advanced debugging purposes +def preinit(): + cmd.preinit(ctx=cmd.Context()) + + +@cli.command() +@click.option("--init", default="auto", help="Run terraform init (auto, always, never)") +def render(init): + """ + Render a layer into working Terraform code. + """ + cmd.render(ctx=cmd.Context(), init=init) + + +@cli.command() +def diff(): + """ + Render and compare Git HEAD vs current uncommitted changes. + """ + cmd.diff(ctx=cmd.Context()) + + +@cli.command(context_settings={"ignore_unknown_options": True}) +@click.option("--init", default="auto", help="Run terraform init (auto, always, never)") +@click.argument("args", nargs=-1, type=click.UNPROCESSED) +def terraform(init, args): + """ + Terraform command wrapper. + """ + cmd.terraform(ctx=cmd.Context(), init=init, args=args) + + +@cli.command(hidden=True) # hidden because the key pair is always the same +@click.option("--public-key-path", required=True) +@click.option("--private-key-path", required=True) +def genkey(public_key_path, private_key_path): + helpers.genkey(public_key_path=pathlib.Path(public_key_path), private_key_path=pathlib.Path(private_key_path)) + + +@cli.command() +@click.option("--public-key-path", required=True) +@click.argument("string") +def encrypt(public_key_path, string): + """ + Encrypt a secret string using a public key. + Can run in any directory. 
+ """ + print(helpers.encrypt(public_key_path=pathlib.Path(public_key_path), string=string)) + + +@cli.command() +@click.option("--private-key-path", required=True) +@click.argument("string") +def decrypt(private_key_path, string): + """ + Decrypt an encrypted string using a private key. + Can run in any directory. + """ + print(helpers.decrypt(private_key_path=pathlib.Path(private_key_path), data=string)) + + +@cli.group() +def surgery(): + """ + Terraform state surgery utilities. + """ + pass + + +@surgery.command("edit") +def surgery_edit(): + """ + Edit state with vi. + """ + cmd.surgery.edit(ctx=cmd.Context()) + + +@surgery.command("list") +def surgery_list(): + """ + List all resources in state by address. + """ + cmd.surgery._list(ctx=cmd.Context()) + + +@surgery.command("import") +@click.argument("address", required=True) +@click.argument("resource", required=True) +def surgery_import(address, resource): + """ + Import a resource into state by id. + """ + cmd.surgery._import(ctx=cmd.Context(), address=address, _id=resource) + + +@surgery.command("remove") +@click.argument("address", required=True) +def surgery_remove(address): + """ + Remove a resource from state by address. + """ + cmd.surgery.remove(ctx=cmd.Context(), address=address) + + +@surgery.command("move") +@click.argument("from_address", required=True) +@click.argument("to_address", required=True) +@click.argument("to_path", required=True) +def surgery_move(from_address, to_address, to_path): + """ + Move a resource from one state to another by address. + """ + ctx = cmd.Context() + cmd.surgery.move(ctx=ctx, from_address=from_address, to_address=to_address, to_path=ctx.root_dir.joinpath(to_path)) + + +@surgery.command("rename") +@click.argument("from_address", required=True) +@click.argument("to_address", required=True) +def surgery_rename(from_address, to_address): + """ + Rename a resource in the current state. 
+ """ + ctx = cmd.Context() + cmd.surgery.move(ctx=ctx, from_address=from_address, to_address=to_address, to_path=ctx.path) diff --git a/src/tools/cli_wrapper.py b/src/tools/cli_wrapper.py deleted file mode 100644 index f62b540..0000000 --- a/src/tools/cli_wrapper.py +++ /dev/null @@ -1,29 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -def main(func): - import argparse, inspect, json - parser = argparse.ArgumentParser() - for key, value in inspect.signature(func).parameters.items(): - parser.add_argument( - f"--{key.replace('_','-')}", - action = argparse.BooleanOptionalAction if isinstance(value.default, bool) else None, - default = value.default, - required = value.default == value.empty, - ) - output = func(**vars(parser.parse_args())) - if output is not None: - print(json.dumps(output, indent=2)) diff --git a/src/tools/encryption_decrypt.py b/src/tools/encryption_decrypt.py deleted file mode 100644 index 1c7374c..0000000 --- a/src/tools/encryption_decrypt.py +++ /dev/null @@ -1,59 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. 
-# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -def main(string, private_key_path): - import base64 - import cryptography.hazmat.backends - import cryptography.hazmat.primitives.asymmetric.padding - import cryptography.hazmat.primitives.hashes - import cryptography.hazmat.primitives.padding - import cryptography.hazmat.primitives.serialization - - symmetric_key_encrypted_base64, encryptor_tag_base64, init_vector_base64, string_encrypted_base64 = string.removeprefix("ENC[").removesuffix("]").split(";") - - with open(private_key_path, "rb") as key_file: - symmetric_key = cryptography.hazmat.primitives.serialization.load_pem_private_key( - key_file.read(), - password = None, - backend = cryptography.hazmat.backends.default_backend(), - ).decrypt( - base64.b64decode(symmetric_key_encrypted_base64.encode()), - cryptography.hazmat.primitives.asymmetric.padding.OAEP( - mgf = cryptography.hazmat.primitives.asymmetric.padding.MGF1(algorithm=cryptography.hazmat.primitives.hashes.SHA256()), - algorithm = cryptography.hazmat.primitives.hashes.SHA256(), - label = None, - ) - ) - - decryptor = cryptography.hazmat.primitives.ciphers.Cipher( - cryptography.hazmat.primitives.ciphers.algorithms.AES(symmetric_key), - cryptography.hazmat.primitives.ciphers.modes.GCM( - base64.b64decode(init_vector_base64.encode()), - base64.b64decode(encryptor_tag_base64.encode()), - ), - backend = cryptography.hazmat.backends.default_backend(), - ).decryptor() - padded = decryptor.update(base64.b64decode(string_encrypted_base64.encode())) + decryptor.finalize() - - unpadder = cryptography.hazmat.primitives.padding.PKCS7(128).unpadder() 
- unpadded = unpadder.update(padded) + unpadder.finalize() - - return unpadded.decode("utf-8") - - -if __name__ == "__main__": - import cli_wrapper - cli_wrapper.main(main) diff --git a/src/tools/encryption_encrypt.py b/src/tools/encryption_encrypt.py deleted file mode 100644 index 9f48f3c..0000000 --- a/src/tools/encryption_encrypt.py +++ /dev/null @@ -1,62 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -def main(string, public_key_path): - import base64 - import cryptography.hazmat.backends - import cryptography.hazmat.primitives.asymmetric.padding - import cryptography.hazmat.primitives.ciphers - import cryptography.hazmat.primitives.hashes - import cryptography.hazmat.primitives.padding - import cryptography.hazmat.primitives.serialization - import os - - padder = cryptography.hazmat.primitives.padding.PKCS7(128).padder() - padded = padder.update(string.encode()) + padder.finalize() - - symmetric_key = os.urandom(32) - - init_vector = os.urandom(12) - init_vector_base64 = base64.b64encode(init_vector).decode("utf-8") - - encryptor = cryptography.hazmat.primitives.ciphers.Cipher(cryptography.hazmat.primitives.ciphers.algorithms.AES(symmetric_key), cryptography.hazmat.primitives.ciphers.modes.GCM(init_vector), backend=cryptography.hazmat.backends.default_backend()).encryptor() - - string_encrypted = encryptor.update(padded) + encryptor.finalize() - string_encrypted_base64 = 
base64.b64encode(string_encrypted).decode("utf-8") - - encryptor_tag_base64 = base64.b64encode(encryptor.tag).decode("utf-8") - - with open(public_key_path, "rb") as f: - public_key = cryptography.hazmat.primitives.serialization.load_pem_public_key( - f.read(), - backend = cryptography.hazmat.backends.default_backend(), - ) - - symmetric_key_encrypted_base64 = base64.b64encode(public_key.encrypt( - symmetric_key, - cryptography.hazmat.primitives.asymmetric.padding.OAEP( - mgf = cryptography.hazmat.primitives.asymmetric.padding.MGF1(algorithm=cryptography.hazmat.primitives.hashes.SHA256()), - algorithm = cryptography.hazmat.primitives.hashes.SHA256(), - label = None, - ) - )).decode("utf-8") - - return f"ENC[{symmetric_key_encrypted_base64};{encryptor_tag_base64};{init_vector_base64};{string_encrypted_base64}]" - - -if __name__ == "__main__": - import cli_wrapper - cli_wrapper.main(main) diff --git a/src/tools/encryption_generate_key.py b/src/tools/encryption_generate_key.py deleted file mode 100644 index d43bc18..0000000 --- a/src/tools/encryption_generate_key.py +++ /dev/null @@ -1,42 +0,0 @@ -#!/usr/bin/env python3 - -# Copyright 2024 Cisco Systems, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
- -def main(public_key_path, private_key_path): - import cryptography.hazmat.backends - import cryptography.hazmat.primitives.serialization - import cryptography.hazmat.primitives.asymmetric.rsa - - key = cryptography.hazmat.primitives.asymmetric.rsa.generate_private_key( - backend = cryptography.hazmat.backends.default_backend(), - key_size = 2**11, - public_exponent = 2**16+1, - ) - with open(private_key_path, "wb") as f: - f.write(key.private_bytes( - encoding = cryptography.hazmat.primitives.serialization.Encoding.PEM, - format = cryptography.hazmat.primitives.serialization.PrivateFormat.PKCS8, - encryption_algorithm = cryptography.hazmat.primitives.serialization.NoEncryption(), - )) - with open(public_key_path, "wb") as f: - f.write(key.public_key().public_bytes( - encoding = cryptography.hazmat.primitives.serialization.Encoding.PEM, - format = cryptography.hazmat.primitives.serialization.PublicFormat.SubjectPublicKeyInfo, - )) - - -if __name__ == "__main__": - import cli_wrapper - cli_wrapper.main(main) diff --git a/uv.lock b/uv.lock new file mode 100644 index 0000000..8e983ce --- /dev/null +++ b/uv.lock @@ -0,0 +1,293 @@ +version = 1 +requires-python = ">=3.10" + +[[package]] +name = "cffi" +version = "1.17.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pycparser" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/90/07/f44ca684db4e4f08a3fdc6eeb9a0d15dc6883efc7b8c90357fdbf74e186c/cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14", size = 182191 }, + { url = 
"https://files.pythonhosted.org/packages/08/fd/cc2fedbd887223f9f5d170c96e57cbf655df9831a6546c1727ae13fa977a/cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67", size = 178592 }, + { url = "https://files.pythonhosted.org/packages/de/cc/4635c320081c78d6ffc2cab0a76025b691a91204f4aa317d568ff9280a2d/cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382", size = 426024 }, + { url = "https://files.pythonhosted.org/packages/b6/7b/3b2b250f3aab91abe5f8a51ada1b717935fdaec53f790ad4100fe2ec64d1/cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702", size = 448188 }, + { url = "https://files.pythonhosted.org/packages/d3/48/1b9283ebbf0ec065148d8de05d647a986c5f22586b18120020452fff8f5d/cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3", size = 455571 }, + { url = "https://files.pythonhosted.org/packages/40/87/3b8452525437b40f39ca7ff70276679772ee7e8b394934ff60e63b7b090c/cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6", size = 436687 }, + { url = "https://files.pythonhosted.org/packages/8d/fb/4da72871d177d63649ac449aec2e8a29efe0274035880c7af59101ca2232/cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17", size = 446211 }, + { url = "https://files.pythonhosted.org/packages/ab/a0/62f00bcb411332106c02b663b26f3545a9ef136f80d5df746c05878f8c4b/cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8", size 
= 461325 }, + { url = "https://files.pythonhosted.org/packages/36/83/76127035ed2e7e27b0787604d99da630ac3123bfb02d8e80c633f218a11d/cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e", size = 438784 }, + { url = "https://files.pythonhosted.org/packages/21/81/a6cd025db2f08ac88b901b745c163d884641909641f9b826e8cb87645942/cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be", size = 461564 }, + { url = "https://files.pythonhosted.org/packages/f8/fe/4d41c2f200c4a457933dbd98d3cf4e911870877bd94d9656cc0fcb390681/cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c", size = 171804 }, + { url = "https://files.pythonhosted.org/packages/d1/b6/0b0f5ab93b0df4acc49cae758c81fe4e5ef26c3ae2e10cc69249dfd8b3ab/cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15", size = 181299 }, + { url = "https://files.pythonhosted.org/packages/6b/f4/927e3a8899e52a27fa57a48607ff7dc91a9ebe97399b357b85a0c7892e00/cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401", size = 182264 }, + { url = "https://files.pythonhosted.org/packages/6c/f5/6c3a8efe5f503175aaddcbea6ad0d2c96dad6f5abb205750d1b3df44ef29/cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf", size = 178651 }, + { url = "https://files.pythonhosted.org/packages/94/dd/a3f0118e688d1b1a57553da23b16bdade96d2f9bcda4d32e7d2838047ff7/cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4", size = 445259 }, + { url = 
"https://files.pythonhosted.org/packages/2e/ea/70ce63780f096e16ce8588efe039d3c4f91deb1dc01e9c73a287939c79a6/cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41", size = 469200 }, + { url = "https://files.pythonhosted.org/packages/1c/a0/a4fa9f4f781bda074c3ddd57a572b060fa0df7655d2a4247bbe277200146/cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1", size = 477235 }, + { url = "https://files.pythonhosted.org/packages/62/12/ce8710b5b8affbcdd5c6e367217c242524ad17a02fe5beec3ee339f69f85/cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6", size = 459721 }, + { url = "https://files.pythonhosted.org/packages/ff/6b/d45873c5e0242196f042d555526f92aa9e0c32355a1be1ff8c27f077fd37/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d", size = 467242 }, + { url = "https://files.pythonhosted.org/packages/1a/52/d9a0e523a572fbccf2955f5abe883cfa8bcc570d7faeee06336fbd50c9fc/cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6", size = 477999 }, + { url = "https://files.pythonhosted.org/packages/44/74/f2a2460684a1a2d00ca799ad880d54652841a780c4c97b87754f660c7603/cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f", size = 454242 }, + { url = "https://files.pythonhosted.org/packages/f8/4a/34599cac7dfcd888ff54e801afe06a19c17787dfd94495ab0c8d35fe99fb/cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b", size = 478604 }, + { url = 
"https://files.pythonhosted.org/packages/34/33/e1b8a1ba29025adbdcda5fb3a36f94c03d771c1b7b12f726ff7fef2ebe36/cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655", size = 171727 }, + { url = "https://files.pythonhosted.org/packages/3d/97/50228be003bb2802627d28ec0627837ac0bf35c90cf769812056f235b2d1/cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0", size = 181400 }, + { url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178 }, + { url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840 }, + { url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803 }, + { url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850 }, + { url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729 }, + { url = 
"https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256 }, + { url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424 }, + { url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568 }, + { url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736 }, + { url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448 }, + { url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976 }, + { url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989 }, + { url = 
"https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802 }, + { url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792 }, + { url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893 }, + { url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810 }, + { url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200 }, + { url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447 }, + { url = "https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size 
= 484358 }, + { url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469 }, + { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 }, + { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 }, +] + +[[package]] +name = "click" +version = "8.1.8" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "platform_system == 'Windows'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188 }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 }, +] + +[[package]] +name = "cryptography" +version = "44.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cffi", marker = "platform_python_implementation != 'PyPy'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/91/4c/45dfa6829acffa344e3967d6006ee4ae8be57af746ae2eba1c431949b32c/cryptography-44.0.0.tar.gz", hash = "sha256:cd4e834f340b4293430701e772ec543b0fbe6c2dea510a5286fe0acabe153a02", size = 710657 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/55/09/8cc67f9b84730ad330b3b72cf867150744bf07ff113cda21a15a1c6d2c7c/cryptography-44.0.0-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:84111ad4ff3f6253820e6d3e58be2cc2a00adb29335d4cacb5ab4d4d34f2a123", size = 6541833 }, + { url = "https://files.pythonhosted.org/packages/7e/5b/3759e30a103144e29632e7cb72aec28cedc79e514b2ea8896bb17163c19b/cryptography-44.0.0-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b15492a11f9e1b62ba9d73c210e2416724633167de94607ec6069ef724fad092", size = 3922710 }, + { url = "https://files.pythonhosted.org/packages/5f/58/3b14bf39f1a0cfd679e753e8647ada56cddbf5acebffe7db90e184c76168/cryptography-44.0.0-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:831c3c4d0774e488fdc83a1923b49b9957d33287de923d58ebd3cec47a0ae43f", size = 4137546 }, + { url = "https://files.pythonhosted.org/packages/98/65/13d9e76ca19b0ba5603d71ac8424b5694415b348e719db277b5edc985ff5/cryptography-44.0.0-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:761817a3377ef15ac23cd7834715081791d4ec77f9297ee694ca1ee9c2c7e5eb", size = 3915420 }, + { url = 
"https://files.pythonhosted.org/packages/b1/07/40fe09ce96b91fc9276a9ad272832ead0fddedcba87f1190372af8e3039c/cryptography-44.0.0-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:3c672a53c0fb4725a29c303be906d3c1fa99c32f58abe008a82705f9ee96f40b", size = 4154498 }, + { url = "https://files.pythonhosted.org/packages/75/ea/af65619c800ec0a7e4034207aec543acdf248d9bffba0533342d1bd435e1/cryptography-44.0.0-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:4ac4c9f37eba52cb6fbeaf5b59c152ea976726b865bd4cf87883a7e7006cc543", size = 3932569 }, + { url = "https://files.pythonhosted.org/packages/c7/af/d1deb0c04d59612e3d5e54203159e284d3e7a6921e565bb0eeb6269bdd8a/cryptography-44.0.0-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ed3534eb1090483c96178fcb0f8893719d96d5274dfde98aa6add34614e97c8e", size = 4016721 }, + { url = "https://files.pythonhosted.org/packages/bd/69/7ca326c55698d0688db867795134bdfac87136b80ef373aaa42b225d6dd5/cryptography-44.0.0-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:f3f6fdfa89ee2d9d496e2c087cebef9d4fcbb0ad63c40e821b39f74bf48d9c5e", size = 4240915 }, + { url = "https://files.pythonhosted.org/packages/ef/d4/cae11bf68c0f981e0413906c6dd03ae7fa864347ed5fac40021df1ef467c/cryptography-44.0.0-cp37-abi3-win32.whl", hash = "sha256:eb33480f1bad5b78233b0ad3e1b0be21e8ef1da745d8d2aecbb20671658b9053", size = 2757925 }, + { url = "https://files.pythonhosted.org/packages/64/b1/50d7739254d2002acae64eed4fc43b24ac0cc44bf0a0d388d1ca06ec5bb1/cryptography-44.0.0-cp37-abi3-win_amd64.whl", hash = "sha256:abc998e0c0eee3c8a1904221d3f67dcfa76422b23620173e28c11d3e626c21bd", size = 3202055 }, + { url = "https://files.pythonhosted.org/packages/11/18/61e52a3d28fc1514a43b0ac291177acd1b4de00e9301aaf7ef867076ff8a/cryptography-44.0.0-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:660cb7312a08bc38be15b696462fa7cc7cd85c3ed9c576e81f4dc4d8b2b31591", size = 6542801 }, + { url = 
"https://files.pythonhosted.org/packages/1a/07/5f165b6c65696ef75601b781a280fc3b33f1e0cd6aa5a92d9fb96c410e97/cryptography-44.0.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1923cb251c04be85eec9fda837661c67c1049063305d6be5721643c22dd4e2b7", size = 3922613 }, + { url = "https://files.pythonhosted.org/packages/28/34/6b3ac1d80fc174812486561cf25194338151780f27e438526f9c64e16869/cryptography-44.0.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:404fdc66ee5f83a1388be54300ae978b2efd538018de18556dde92575e05defc", size = 4137925 }, + { url = "https://files.pythonhosted.org/packages/d0/c7/c656eb08fd22255d21bc3129625ed9cd5ee305f33752ef2278711b3fa98b/cryptography-44.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:c5eb858beed7835e5ad1faba59e865109f3e52b3783b9ac21e7e47dc5554e289", size = 3915417 }, + { url = "https://files.pythonhosted.org/packages/ef/82/72403624f197af0db6bac4e58153bc9ac0e6020e57234115db9596eee85d/cryptography-44.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f53c2c87e0fb4b0c00fa9571082a057e37690a8f12233306161c8f4b819960b7", size = 4155160 }, + { url = "https://files.pythonhosted.org/packages/a2/cd/2f3c440913d4329ade49b146d74f2e9766422e1732613f57097fea61f344/cryptography-44.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:9e6fc8a08e116fb7c7dd1f040074c9d7b51d74a8ea40d4df2fc7aa08b76b9e6c", size = 3932331 }, + { url = "https://files.pythonhosted.org/packages/7f/df/8be88797f0a1cca6e255189a57bb49237402b1880d6e8721690c5603ac23/cryptography-44.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:d2436114e46b36d00f8b72ff57e598978b37399d2786fd39793c36c6d5cb1c64", size = 4017372 }, + { url = "https://files.pythonhosted.org/packages/af/36/5ccc376f025a834e72b8e52e18746b927f34e4520487098e283a719c205e/cryptography-44.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a01956ddfa0a6790d594f5b34fc1bfa6098aca434696a03cfdbe469b8ed79285", size = 4239657 }, + { url = 
"https://files.pythonhosted.org/packages/46/b0/f4f7d0d0bcfbc8dd6296c1449be326d04217c57afb8b2594f017eed95533/cryptography-44.0.0-cp39-abi3-win32.whl", hash = "sha256:eca27345e1214d1b9f9490d200f9db5a874479be914199194e746c893788d417", size = 2758672 }, + { url = "https://files.pythonhosted.org/packages/97/9b/443270b9210f13f6ef240eff73fd32e02d381e7103969dc66ce8e89ee901/cryptography-44.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:708ee5f1bafe76d041b53a4f95eb28cdeb8d18da17e597d46d7833ee59b97ede", size = 3202071 }, + { url = "https://files.pythonhosted.org/packages/77/d4/fea74422326388bbac0c37b7489a0fcb1681a698c3b875959430ba550daa/cryptography-44.0.0-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:37d76e6863da3774cd9db5b409a9ecfd2c71c981c38788d3fcfaf177f447b731", size = 3338857 }, + { url = "https://files.pythonhosted.org/packages/1a/aa/ba8a7467c206cb7b62f09b4168da541b5109838627f582843bbbe0235e8e/cryptography-44.0.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:f677e1268c4e23420c3acade68fac427fffcb8d19d7df95ed7ad17cdef8404f4", size = 3850615 }, + { url = "https://files.pythonhosted.org/packages/89/fa/b160e10a64cc395d090105be14f399b94e617c879efd401188ce0fea39ee/cryptography-44.0.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:f5e7cb1e5e56ca0933b4873c0220a78b773b24d40d186b6738080b73d3d0a756", size = 4081622 }, + { url = "https://files.pythonhosted.org/packages/47/8f/20ff0656bb0cf7af26ec1d01f780c5cfbaa7666736063378c5f48558b515/cryptography-44.0.0-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:8b3e6eae66cf54701ee7d9c83c30ac0a1e3fa17be486033000f2a73a12ab507c", size = 3867546 }, + { url = "https://files.pythonhosted.org/packages/38/d9/28edf32ee2fcdca587146bcde90102a7319b2f2c690edfa627e46d586050/cryptography-44.0.0-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:be4ce505894d15d5c5037167ffb7f0ae90b7be6f2a98f9a5c3442395501c32fa", size = 4090937 }, + { url = 
"https://files.pythonhosted.org/packages/cc/9d/37e5da7519de7b0b070a3fedd4230fe76d50d2a21403e0f2153d70ac4163/cryptography-44.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:62901fb618f74d7d81bf408c8719e9ec14d863086efe4185afd07c352aee1d2c", size = 3128774 }, +] + +[[package]] +name = "deepmerge" +version = "2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a8/3a/b0ba594708f1ad0bc735884b3ad854d3ca3bdc1d741e56e40bbda6263499/deepmerge-2.0.tar.gz", hash = "sha256:5c3d86081fbebd04dd5de03626a0607b809a98fb6ccba5770b62466fe940ff20", size = 19890 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2d/82/e5d2c1c67d19841e9edc74954c827444ae826978499bde3dfc1d007c8c11/deepmerge-2.0-py3-none-any.whl", hash = "sha256:6de9ce507115cff0bed95ff0ce9ecc31088ef50cbdf09bc90a09349a318b3d00", size = 13475 }, +] + +[[package]] +name = "gitdb" +version = "4.0.12" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "smmap" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/72/94/63b0fc47eb32792c7ba1fe1b694daec9a63620db1e313033d18140c2320a/gitdb-4.0.12.tar.gz", hash = "sha256:5ef71f855d191a3326fcfbc0d5da835f26b13fbcba60c32c21091c349ffdb571", size = 394684 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a0/61/5c78b91c3143ed5c14207f463aecfc8f9dbb5092fb2869baf37c273b2705/gitdb-4.0.12-py3-none-any.whl", hash = "sha256:67073e15955400952c6565cc3e707c554a4eea2e428946f7a4c162fab9bd9bcf", size = 62794 }, +] + +[[package]] +name = "gitpython" +version = "3.1.44" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "gitdb" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c0/89/37df0b71473153574a5cdef8f242de422a0f5d26d7a9e231e6f169b4ad14/gitpython-3.1.44.tar.gz", hash = "sha256:c87e30b26253bf5418b01b0660f818967f3c503193838337fe5e573331249269", size = 214196 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/1d/9a/4114a9057db2f1462d5c8f8390ab7383925fe1ac012eaa42402ad65c2963/GitPython-3.1.44-py3-none-any.whl", hash = "sha256:9e0e10cda9bed1ee64bc9a6de50e7e38a9c9943241cd7f585f6df3ed28011110", size = 207599 }, +] + +[[package]] +name = "jinja2" +version = "3.1.5" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/af/92/b3130cbbf5591acf9ade8708c365f3238046ac7cb8ccba6e81abccb0ccff/jinja2-3.1.5.tar.gz", hash = "sha256:8fefff8dc3034e27bb80d67c671eb8a9bc424c0ef4c0826edbff304cceff43bb", size = 244674 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bd/0f/2ba5fbcd631e3e88689309dbe978c5769e883e4b84ebfe7da30b43275c5a/jinja2-3.1.5-py3-none-any.whl", hash = "sha256:aba0f4dc9ed8013c424088f68a5c226f7d6097ed89b246d7749c2ec4175c6adb", size = 134596 }, +] + +[[package]] +name = "lark" +version = "1.2.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/af/60/bc7622aefb2aee1c0b4ba23c1446d3e30225c8770b38d7aedbfb65ca9d5a/lark-1.2.2.tar.gz", hash = "sha256:ca807d0162cd16cef15a8feecb862d7319e7a09bdb13aef927968e45040fed80", size = 252132 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2d/00/d90b10b962b4277f5e64a78b6609968859ff86889f5b898c1a778c06ec00/lark-1.2.2-py3-none-any.whl", hash = "sha256:c2276486b02f0f1b90be155f2c8ba4a8e194d42775786db622faccd652d8e80c", size = 111036 }, +] + +[[package]] +name = "markupsafe" +version = "3.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", size = 14357 }, + { url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393 }, + { url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732 }, + { url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866 }, + { url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964 }, + { url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977 }, + { url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366 }, + { url 
= "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091 }, + { url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065 }, + { url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514 }, + { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353 }, + { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392 }, + { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984 }, + { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120 }, + { url = 
"https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032 }, + { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057 }, + { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359 }, + { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306 }, + { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094 }, + { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521 }, + { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274 }, + { url = 
"https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348 }, + { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149 }, + { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118 }, + { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993 }, + { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178 }, + { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319 }, + { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352 }, + { url = 
"https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097 }, + { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601 }, + { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274 }, + { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352 }, + { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122 }, + { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085 }, + { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978 }, + { url = 
"https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208 }, + { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357 }, + { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344 }, + { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101 }, + { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603 }, + { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510 }, + { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486 }, + { url = 
"https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480 }, + { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914 }, + { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796 }, + { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473 }, + { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114 }, + { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098 }, + { url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208 }, + { url = 
"https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739 }, +] + +[[package]] +name = "packaging" +version = "24.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 }, +] + +[[package]] +name = "pycparser" +version = "2.22" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552 }, +] + +[[package]] +name = "python-hcl2" +version = "5.1.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "lark" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ae/cb/18f9662201e948dbc3a1a37daf5392ac0d60667138e23d73799fc8c7bc8d/python-hcl2-5.1.1.tar.gz", hash = "sha256:e5c759d902e7566b9e8f1fcbf3eeef426269904f3fc0c3bf50d9090d00d1f57c", size = 23292 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/19/d2/fc80edee3535bf3c226c992dddebf64c1eef260f2d95c0a6e1838b0135d8/python_hcl2-5.1.1-py3-none-any.whl", hash = "sha256:4670e90aac0b0825f3a634fc684ba94a4268dbc15b062df26f4b4441af24c29a", size = 13931 }, +] + +[[package]] +name = "smmap" +version = "5.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/44/cd/a040c4b3119bbe532e5b0732286f805445375489fceaec1f48306068ee3b/smmap-5.0.2.tar.gz", hash = "sha256:26ea65a03958fa0c8a1c7e8c7a58fdc77221b8910f6be2131affade476898ad5", size = 22329 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/be/d09147ad1ec7934636ad912901c5fd7667e1c858e19d355237db0d0cd5e4/smmap-5.0.2-py3-none-any.whl", hash = "sha256:b30115f0def7d7531d22a0fb6502488d879e75b260a9db4d0819cfb25403af5e", size = 24303 }, +] + +[[package]] +name = "stacks" +version = "2.0.3" +source = { editable = "." } +dependencies = [ + { name = "click" }, + { name = "cryptography" }, + { name = "deepmerge" }, + { name = "gitpython" }, + { name = "jinja2" }, + { name = "packaging" }, + { name = "python-hcl2" }, +] + +[package.metadata] +requires-dist = [ + { name = "click", specifier = ">=8.1.7" }, + { name = "cryptography", specifier = ">=43.0.3" }, + { name = "deepmerge", specifier = ">=2.0" }, + { name = "gitpython", specifier = ">=3.1.43" }, + { name = "jinja2", specifier = ">=3.1.4" }, + { name = "packaging", specifier = ">=24.2" }, + { name = "python-hcl2", specifier = ">=5.1.1" }, +]