feat: implement SDK #2
Conversation
Resolved issues in the following files with DeepSource Autofix:
1. Sources/BaseMindClient/BaseMindClient.swift
2. Tests/BaseMindClientTests/BaseMindClientTests.swift
This pull request sets up GitHub code scanning for this repository. Once the scans have completed and the checks have passed, the analysis results for this pull request branch will appear on this overview. Once you merge this pull request, the 'Security' tab will show more code scanning analysis results (for example, for the default branch). Depending on your configuration and choice of analysis tool, future pull requests will be annotated with code scanning analysis results. For more information about GitHub code scanning, check out the documentation.
1. Regarding BaseMindError: is there a case that handles this kind of error? Example: unable to fetch a prompt response for the given input template variables due to insufficient data at the LLM end.
This would be a
Sure
If it were a bigger application, sure. In this case I think it's fine.
Can you explain?
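To illustrate the error case raised above, here is a minimal sketch of how a caller could distinguish a "not enough data at the LLM end" failure from other failures. It assumes the SDK surfaces a Swift `Error` enum; the `PromptError` type, its case names, and the `requestPrompt` function below are hypothetical illustrations for discussion, not the actual BaseMindClient API.

```swift
import Foundation

// Hypothetical error type for discussion; case names are illustrative only.
enum PromptError: Error {
    case missingTemplateVariable(name: String)
    case insufficientData(detail: String)   // LLM could not produce a response
    case serverError(underlying: Error)
}

// Hypothetical stand-in for an SDK call that resolves a prompt template.
// A real client would perform a network request here.
func requestPrompt(variables: [String: String]) async throws -> String {
    guard !variables.isEmpty else {
        throw PromptError.missingTemplateVariable(name: "userInput")
    }
    // ... gateway call would go here; this sketch always fails for demonstration ...
    throw PromptError.insufficientData(detail: "LLM returned no content")
}

// Caller-side handling: each failure mode gets its own branch, so the
// "insufficient data at the LLM end" case is reported explicitly rather
// than being swallowed by a generic catch.
func fetchGreeting() async {
    do {
        let text = try await requestPrompt(variables: ["userInput": "hello"])
        print("prompt response: \(text)")
    } catch PromptError.missingTemplateVariable(let name) {
        print("missing template variable: \(name)")
    } catch PromptError.insufficientData(let detail) {
        print("LLM could not produce a response: \(detail)")
    } catch {
        print("unexpected error: \(error)")
    }
}
```

Whether the SDK models this as a dedicated error case or folds it into a generic server error is the design choice being discussed in this thread; the sketch only shows how a distinct case would look to a caller.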