aidebugs

A CLI that debugs error messages produced in your command line by calling an LLM!


Shoutout to aicommits for the inspiration!


Setup

Check your Node.js version with node --version. It should be at least v14!

  1. Installation:

    npm install -g aidebugs
  2. Get your API key from OpenAI

    Note: If you haven't already, you'll have to create an account and set up billing.

  3. Set the key so aidebugs can use it:

    aidebugs config --key "<your-api-key>"

    This will create a .aidebugs file in your home directory.

Upgrading

Check the installed version with:

aidebugs --version

If it's not the latest version, run:

npm update -g aidebugs

Usage

Send error message

Use the --command flag to specify a command for the terminal to execute. Any error output it produces is then fed into the OpenAI API to get tips and ways to fix it! For example:

aidebugs --command "npm run dev"
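
To make the mechanism concrete, here's a minimal sketch of the idea behind --command (not aidebugs' actual implementation): run the command, capture its stderr, and send it to the OpenAI chat completions API. It assumes Node 18+ for the built-in fetch; the model name, prompt wording, and reading the key from an environment variable are all illustrative (the real tool stores the key in the .aidebugs file created during setup).

// Sketch only: run a command, collect stderr, ask the model for debugging tips.
import { exec } from "node:child_process";

function run(command: string): Promise<string> {
  // Resolve with stderr whether or not the command exits non-zero.
  return new Promise((resolve) => {
    exec(command, (_error, _stdout, stderr) => resolve(stderr));
  });
}

async function debugCommand(command: string): Promise<void> {
  const stderr = await run(command);
  if (!stderr.trim()) {
    console.log("No error output - nothing to debug.");
    return;
  }
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`, // illustrative; aidebugs reads the key from .aidebugs
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // model choice is an assumption
      messages: [
        { role: "system", content: "You are a debugging assistant. Suggest likely causes and concrete fixes." },
        { role: "user", content: `Running \`${command}\` produced this error:\n\n${stderr}` },
      ],
    }),
  });
  const data = await response.json();
  console.log(data.choices[0].message.content);
}

debugCommand("npm run dev").catch(console.error);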

Include files

You can also include files with the --file flag to give the model added context for understanding your error message. For example, including two files:

aidebugs --command "python3 scripts/script1.py" --file "scripts/script1.py" "scripts/script2.py"
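
As a rough idea of how that extra context could be wired in, here's a hypothetical sketch that reads the files passed with --file and appends them to the error message before it is sent to the model. The helper name and prompt layout are assumptions, not aidebugs' actual format.

// Sketch only: bundle file contents with the error output for extra context.
import { readFileSync } from "node:fs";

function buildPrompt(command: string, stderr: string, files: string[]): string {
  const context = files
    .map((path) => `--- ${path} ---\n${readFileSync(path, "utf8")}`)
    .join("\n\n");
  return `Running \`${command}\` produced this error:\n\n${stderr}\n\nRelevant files:\n\n${context}`;
}

// e.g. buildPrompt("python3 scripts/script1.py", stderr,
//                  ["scripts/script1.py", "scripts/script2.py"]);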

Specify file lines

You can also include a range of lines from a file instead of the entire file. To do this, append the range to the filename as "filename.abc:num1-num2", where the lines from num1 through num2 (right after the ':') are included. For example:

aidebugs --command "python3 scripts/script1.py" --file "scripts/script1.py:1-2" "scripts/script2.py"
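
Here's a small hypothetical sketch of how the "filename:num1-num2" form might be parsed, treating the range as 1-based and inclusive as described above; the helper name and regex are illustrative, not necessarily how aidebugs does it.

// Sketch only: read either a whole file ("path") or a line range ("path:start-end").
import { readFileSync } from "node:fs";

function readFileSpec(spec: string): string {
  const match = spec.match(/^(.*):(\d+)-(\d+)$/);
  if (!match) {
    return readFileSync(spec, "utf8"); // plain "--file path": include the whole file
  }
  const [, path, start, end] = match;
  return readFileSync(path, "utf8")
    .split("\n")
    .slice(Number(start) - 1, Number(end)) // slice end is exclusive, so line `end` is kept
    .join("\n");
}

// "scripts/script1.py:1-2" -> only lines 1 and 2 of script1.py
console.log(readFileSpec("scripts/script1.py:1-2"));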

Additions coming soon

  • Support custom prompts such as "I expected an output of 4 but got 5, what could the issue be?" instead of only debugging when an error is hit.
  • Expanding on that, even using this as a debugging buddy! It's common to add print and logging statements in specific places to figure out where your code has an issue; it would be cool to have this tool insert them by itself to help the user debug their code!
  • Make it prettier and improve error handling.
