
feat(cli): reduce redundancy on context to cluster flags in command deployment create #1156

Conversation

instamenta
Contributor

Description

Removes the --context-cluster flag and its usages, replacing them with data provided by the local config's deployments.clusters.
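Conceptually, the replacement can be sketched as follows. This is an illustrative TypeScript sketch based on the local-config.yaml shape shown later in this thread; the type and function names here are assumptions, not the actual Solo source:

```typescript
// Hypothetical sketch: resolve clusters and contexts from the local config
// instead of a --context-cluster flag. Field names mirror the
// local-config.yaml example shown later in this PR thread.
interface LocalConfig {
  userEmailAddress: string;
  deployments: Record<string, {clusters: string[]}>;
  currentDeploymentName: string;
  clusterContextMapping: Record<string, string>;
}

// Given a deployment name, return [cluster, context] pairs derived from
// the local config rather than from a CLI flag.
function resolveClusterContexts(
  config: LocalConfig,
  deploymentName: string,
): Array<[cluster: string, context: string]> {
  const deployment = config.deployments[deploymentName];
  if (!deployment) {
    throw new Error(`Unknown deployment: ${deploymentName}`);
  }
  return deployment.clusters.map(cluster => [
    cluster,
    // Fall back to the cluster name itself when no explicit mapping exists.
    config.clusterContextMapping[cluster] ?? cluster,
  ]);
}
```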

Related Issues

…data from flag.deploymentClusters and parsed data inside local config

Signed-off-by: instamenta <[email protected]>
@instamenta self-assigned this Jan 14, 2025
@instamenta requested review from leninmehedy and a team as code owners January 14, 2025 11:33

github-actions bot commented Jan 14, 2025

Unit Test Results - Linux

  1 files  ±0   59 suites  ±0   4s ⏱️ -1s
233 tests ±0  233 ✅ ±0  0 💤 ±0  0 ❌ ±0 
238 runs  ±0  238 ✅ ±0  0 💤 ±0  0 ❌ ±0 

Results for commit 523480d. ± Comparison against base commit 64fbf95.

♻️ This comment has been updated with latest results.


github-actions bot commented Jan 14, 2025

Unit Test Results - Windows

  1 files  ±0   59 suites  ±0   13s ⏱️ -1s
233 tests ±0  233 ✅ ±0  0 💤 ±0  0 ❌ ±0 
238 runs  ±0  238 ✅ ±0  0 💤 ±0  0 ❌ ±0 

Results for commit 523480d. ± Comparison against base commit 64fbf95.

♻️ This comment has been updated with latest results.


github-actions bot commented Jan 14, 2025

E2E Test Report

 17 files  126 suites   1h 29m 9s ⏱️
258 tests 258 ✅ 0 💤 0 ❌
269 runs  269 ✅ 0 💤 0 ❌

Results for commit 523480d.

♻️ This comment has been updated with latest results.


codacy-production bot commented Jan 14, 2025

Coverage summary from Codacy

See diff coverage on Codacy

| Coverage variation | Diff coverage |
| --- | --- |
| Report missing for 64fbf95 ¹ | 65.63% |

Coverage variation details

| Commit | Coverable lines | Covered lines | Coverage |
| --- | --- | --- | --- |
| Common ancestor commit (64fbf95) | Report Missing | Report Missing | Report Missing |
| Head commit (523480d) | 21721 | 18221 | 83.89% |

Coverage variation is the difference between the coverage for the head and common ancestor commits of the pull request branch: <coverage of head commit> - <coverage of common ancestor commit>

Diff coverage details

| | Coverable lines | Covered lines | Diff coverage |
| --- | --- | --- | --- |
| Pull request (#1156) | 32 | 21 | 65.63% |

Diff coverage is the percentage of lines that are covered by tests out of the coverable lines that the pull request added or modified: <covered lines added or modified>/<coverable lines added or modified> * 100%
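As a worked check of that formula for this PR: 21 covered / 32 coverable = 0.65625, i.e. 65.63% after rounding, which matches the diff coverage reported above.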


Codacy stopped sending the deprecated coverage status on June 5th, 2024.

Footnotes

  1. Codacy didn't receive coverage data for the commit, or there was an error processing the received data. Check your integration for errors and validate that your coverage setup is correct.


codecov bot commented Jan 14, 2025

Codecov Report

Attention: Patch coverage is 59.37500% with 13 lines in your changes missing coverage. Please review.

Project coverage is 82.99%. Comparing base (9d1a8cb) to head (523480d).
Report is 7 commits behind head on main.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/core/config/remote/listr_config_tasks.ts | 46.15% | 7 Missing ⚠️ |
| src/commands/cluster/tasks.ts | 28.57% | 3 Missing and 2 partials ⚠️ |
| src/core/config/local_config.ts | 66.66% | 1 Missing ⚠️ |
Additional details and impacted files

Impacted file tree graph

@@            Coverage Diff             @@
##             main    #1156      +/-   ##
==========================================
+ Coverage   82.89%   82.99%   +0.10%     
==========================================
  Files          78       78              
  Lines       21763    21721      -42     
  Branches     1383     1585     +202     
==========================================
- Hits        18040    18028      -12     
+ Misses       3696     3599      -97     
- Partials       27       94      +67     
| Files with missing lines | Coverage Δ |
| --- | --- |
| src/commands/base.ts | 94.25% <100.00%> (ø) |
| src/commands/flags.ts | 75.85% <ø> (+0.20%) ⬆️ |
| src/commands/mirror_node.ts | 75.61% <100.00%> (ø) |
| src/commands/network.ts | 65.79% <100.00%> (ø) |
| src/core/config/remote/remote_config_manager.ts | 76.27% <100.00%> (ø) |
| src/core/templates.ts | 77.77% <100.00%> (+5.43%) ⬆️ |
| src/core/config/local_config.ts | 65.86% <66.66%> (+0.13%) ⬆️ |
| src/commands/cluster/tasks.ts | 88.00% <28.57%> (-0.84%) ⬇️ |
| src/core/config/remote/listr_config_tasks.ts | 65.97% <46.15%> (-1.05%) ⬇️ |

... and 15 files with indirect coverage changes



@jeromy-cannon left a comment


@instamenta ,

some changes:

  • check and make sure that K8s testClusterConnection validates that the context exists in the kube config prior to setting it
  • put K8s testClusterConnection logic in a try/catch block, and if there is a failure revert the context back to its original value.
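A minimal sketch of what those two suggestions could look like together, with a hypothetical client interface (the real K8 class in this repo may expose different method names):

```typescript
// Hypothetical sketch of the review suggestion: validate the context against
// the kube config before switching to it, and revert on failure. The
// interface and method names are illustrative, not the repo's actual K8 API.
interface KubeClient {
  getContexts(): string[];
  getCurrentContext(): string;
  setCurrentContext(context: string): void;
  testClusterConnection(): Promise<void>;
}

async function testContextConnection(k8: KubeClient, context: string): Promise<void> {
  // 1. Make sure the context exists in the kube config before setting it.
  if (!k8.getContexts().includes(context)) {
    throw new Error(`Context ${context} is not found in the kube config`);
  }

  const originalContext = k8.getCurrentContext();
  k8.setCurrentContext(context);
  try {
    // 2. Probe the connection; on failure, restore the original context so
    //    later operations (e.g. lease release) still have a valid context.
    await k8.testClusterConnection();
  } catch (e) {
    k8.setCurrentContext(originalContext);
    throw e;
  }
}
```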

In this case:

npm run solo -- deployment create -n jeromy --email [email protected] --deployment-clusters solo-e2e

the cluster solo-e2e is what solo will use as an alias to map to a context kind-solo-e2e which I supplied when it prompted me.

The local-config.yaml looks good:

❯ cat local-config.yaml
userEmailAddress: [email protected]
deployments:
  jeromy:
    clusters:
      - solo-e2e
currentDeploymentName: jeromy
clusterContextMapping:
  solo-e2e: kind-solo-e2e

Currently, testClusterConnection fails because solo-e2e isn't a valid context, but by that point it has already updated the kube current context. So, when the program attempts to exit, the lease release fails because K8 is a singleton and is still pointing to context = solo-e2e.

✔ Initialize
  ✔ Acquire lease - lease acquired successfully, attempt: 1/10
↓ Prompt local configuration
❯ Validate cluster connections
  ✖ No active cluster!
◼ Create remote config
*********************************** ERROR *****************************************
failed to read existing leases, unexpected server response of '500' received
***********************************************************************************

@jeromy-cannon added the "PR: Unresolved Comments" and "PR: Merge Conflicts" labels Jan 15, 2025
…cy-on-context-to-cluster-flags-in-solo-deployment-create

# Conflicts:
#	src/commands/deployment.ts
@jeromy-cannon added the "PR: Needs Approval" label and removed the "PR: Merge Conflicts" and "PR: Unresolved Comments" labels Jan 17, 2025
jeromy-cannon previously approved these changes Jan 17, 2025
@jeromy-cannon added the "PR: Ready to Merge" and "PR: Merge Conflicts" labels and removed the "PR: Needs Approval" label Jan 17, 2025
…cy-on-context-to-cluster-flags-in-solo-deployment-create

# Conflicts:
#	src/commands/deployment.ts
#	src/core/k8.ts
@instamenta removed the "PR: Merge Conflicts" label Jan 20, 2025

@jeromy-cannon left a comment


I think maybe something got broken during the merge-conflict resolution, or I missed something, or I started doing something wrong. I'm now getting an error when I try this:

❯ kind delete cluster -n solo
Deleting cluster "solo" ...
Deleted nodes: ["solo-control-plane"]
❯ kind create cluster -n solo
Creating cluster "solo" ...
 ✓ Ensuring node image (kindest/node:v1.27.3) 🖼
 ✓ Preparing nodes 📦  
 ✓ Writing configuration 📜 
 ✓ Starting control-plane 🕹️ 
 ✓ Installing CNI 🔌 
 ✓ Installing StorageClass 💾 
Set kubectl context to "kind-solo"
You can now use your cluster with:

kubectl cluster-info --context kind-solo

Have a nice day! 👋
❯ rm -Rf ~/.solo

❯ npm run solo -- init

> @hashgraph/[email protected] solo
> node --no-deprecation --no-warnings dist/solo.js init


******************************* Solo *********************************************
Version                 : 0.33.0
Kubernetes Context      : kind-solo
Kubernetes Cluster      : kind-solo
**********************************************************************************
✔ Setup home directory and cache
✔ Check dependencies [3s]
  ✔ Check dependency: helm [OS: darwin, Release: 23.6.0, Arch: arm64] [3s]
✔ Setup chart manager [1s]
✔ Copy templates in '/Users/user/.solo/cache'


***************************************************************************************
Note: solo stores various artifacts (config, logs, keys etc.) in its home directory: /Users/user/.solo
If a full reset is needed, delete the directory or relevant sub-directories before running 'solo init'.
***************************************************************************************
❯ npm run solo -- deployment create -n jeromy-solo --deployment-clusters kind-solo

> @hashgraph/[email protected] solo
> node --no-deprecation --no-warnings dist/solo.js deployment create -n jeromy-solo --deployment-clusters kind-solo


******************************* Solo *********************************************
Version                 : 0.33.0
Kubernetes Context      : kind-solo
Kubernetes Cluster      : kind-solo
Kubernetes Namespace    : jeromy-solo
**********************************************************************************
✔ Initialize
✔ Setup home directory
✔ Prompt local configuration [6s]
✖ Context kind-solo is not valid for cluster
◼ Validate context
◼ Update local configuration
◼ Validate cluster connections
◼ Create remoteConfig in clusters
*********************************** ERROR *****************************************
Error installing chart solo-deployment
***********************************************************************************
❯ cat ~/.solo/cache/local-config.yaml
userEmailAddress: [email protected]
deployments:
  jeromy-solo:
    clusters:
      - kind-solo
currentDeploymentName: jeromy-solo
clusterContextMapping:
  kind-solo: kind-solo

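For reference, the failing validation step could be sketched as a pure check of the mapping against the contexts that actually exist in the kube config (names here are illustrative, not the repo's actual code):

```typescript
// Hypothetical sketch: given the clusterContextMapping from local-config.yaml
// and the list of contexts present in the kube config, report any cluster
// whose mapped context is missing.
function findInvalidContextMappings(
  clusterContextMapping: Record<string, string>,
  kubeConfigContexts: string[],
): Array<{cluster: string; context: string}> {
  const known = new Set(kubeConfigContexts);
  return Object.entries(clusterContextMapping)
    .filter(([, context]) => !known.has(context))
    .map(([cluster, context]) => ({cluster, context}));
}
```

In the run above, `kind-solo` maps to `kind-solo`, and `kind create cluster -n solo` does register a `kind-solo` context, so a check of this shape would pass; the "Context kind-solo is not valid for cluster" error suggests the validation may be comparing against something else (for example, the cluster alias rather than the kube-config context list).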
@jeromy-cannon added the "PR: Unresolved Comments" label and removed the "PR: Ready to Merge" label Jan 21, 2025
…cy-on-context-to-cluster-flags-in-solo-deployment-create

# Conflicts:
#	src/commands/deployment.ts
@instamenta added and then removed the "PR: Unresolved Comments" label Jan 22, 2025
@instamenta requested a review from a team as a code owner January 23, 2025 08:15
@instamenta removed the "PR: Unresolved Comments" label Jan 23, 2025
…cy-on-context-to-cluster-flags-in-solo-deployment-create
@jeromy-cannon added the "PR: Needs Approval" label Jan 24, 2025
@jeromy-cannon merged commit 4ab4dd5 into main Jan 24, 2025
41 of 42 checks passed
@jeromy-cannon deleted the 00977-reduce-redundancy-on-context-to-cluster-flags-in-solo-deployment-create branch January 24, 2025 16:46
Successfully merging this pull request may close these issues.

reduce redundancy on context to cluster flags in solo deployment create