Use LLMs to infer arg names of generated configurations #2652
Comments
- /bounty 65$
- /attempt 2652
- @ssddOnTop: Reminder that in 1 day the bounty will become up for grabs, so please submit a pull request before then 🙏
- /attempt #2652
- Note: The user @ssddOnTop is already attempting to complete issue #2652 and claim the bounty. We recommend checking in on @ssddOnTop's progress, and potentially collaborating, before starting a new solution.
- /attempt #2652
- Note: The user @mobley-trent is already attempting to complete issue #2652 and claim the bounty. We recommend checking in on @mobley-trent's progress, and potentially collaborating, before starting a new solution.
- @mobley-trent: Reminder that in 1 day the bounty will become up for grabs, so please submit a pull request before then 🙏
- 💡 @laststylebender14 submitted a pull request that claims the bounty. You can visit your bounty board to reward it.
- Action required: Issue inactive for 30 days.
- Issue closed after 7 days of inactivity.
Tailcall uses AI to auto-generate configurations. Currently, this feature is primarily used to infer the names of types, and it works reliably well. This issue proposes extending the same approach to infer argument names.
Before: the configuration as generated.

After (infer_type_names: true): type names are inferred by the LLM.

After (infer_arg_names: true, proposed): argument names are inferred as well (see the sketch below).
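For illustration, here is a minimal hypothetical sketch of the kind of rename infer_arg_names would apply; the type, field, and argument names below are placeholders and are not taken from this issue.

Before (machine-generated argument name):

```graphql
type Post {
  id: Int!
}

type Query {
  # Placeholder argument name produced by the config generator.
  post(p1: Int!): Post
}
```

After, with infer_arg_names: true (LLM-suggested argument name):

```graphql
type Post {
  id: Int!
}

type Query {
  # The LLM suggests a descriptive name for the argument.
  post(postId: Int!): Post
}
```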
Technical Requirements
- fieldName, which is currently a required field, should be made optional and used only if provided (see the sketch after this list).
- Take the existing infer_type_name.rs as a reference and add the new infer_arg_name.rs.
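To make the first requirement concrete, below is a minimal Rust sketch of how an argument-name prompt could treat fieldName as optional. The names (ArgNamePrompt, render) are hypothetical and do not reflect Tailcall's actual internals in infer_type_name.rs; the point is only that the optional field name is used when present and skipped otherwise.

```rust
/// Hypothetical input for one argument-name inference request.
struct ArgNamePrompt {
    /// The type that owns the field, e.g. "Query".
    type_name: String,
    /// Optional field name: per the requirement above, it is only
    /// included in the prompt when provided.
    field_name: Option<String>,
    /// The machine-generated argument name to improve, e.g. "p1".
    current_arg_name: String,
    /// The GraphQL type of the argument, e.g. "Int!".
    arg_type: String,
}

impl ArgNamePrompt {
    /// Builds the text sent to the LLM, using `field_name` only if present.
    fn render(&self) -> String {
        let mut prompt = format!(
            "Suggest a descriptive name for argument `{}` of type `{}` on `{}`.",
            self.current_arg_name, self.arg_type, self.type_name
        );
        if let Some(field) = &self.field_name {
            prompt.push_str(&format!(" The argument belongs to the field `{}`.", field));
        }
        prompt
    }
}

fn main() {
    // With a field name provided, the prompt mentions it.
    let with_field = ArgNamePrompt {
        type_name: "Query".to_string(),
        field_name: Some("post".to_string()),
        current_arg_name: "p1".to_string(),
        arg_type: "Int!".to_string(),
    };
    println!("{}", with_field.render());

    // Without a field name, the prompt simply omits that sentence.
    let without_field = ArgNamePrompt {
        type_name: "Query".to_string(),
        field_name: None,
        current_arg_name: "p1".to_string(),
        arg_type: "Int!".to_string(),
    };
    println!("{}", without_field.render());
}
```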
Additional Links