Shortcut Timeout Workaround #101

Open
VenusTruth opened this issue Nov 20, 2024 · 0 comments
Labels: enhancement (New feature or request)

@VenusTruth

I frequently run into the time limit before a shortcut action can complete. As a workaround, I’ve seen other apps solve this by calling a URL that triggers another shortcut to run. Below are my workaround ideas; in my opinion, any or all of these would be beneficial on their own.

Because the output from a query can be large, I’d like the response from the shortcut action to be saved to a file in the on-device folder (or a customizable location), and then have a specified URL called to trigger another shortcut to run. For simplicity, some users might also appreciate having the query output embedded as part of the text that makes up the URL.

Ideally, LLM Farm would simply call any URL (more customizable), and I could specify the URL to call either in the relevant chat’s configuration (in the app or in the JSON file) or as part of the shortcut action that triggers the query.

I’m happy to do whatever is needed in Shortcuts to handle the integration; heck, I could even have a shortcut repeatedly check for a file saved in the specified location to determine whether the query completed. But I figure it’s more reliable if LLM Farm calls the URL to signal that the query has finished processing.
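To make the file-plus-callback idea concrete, here’s a rough sketch of the app-side behavior I’m picturing (Swift; all names here are made up, this isn’t LLM Farm’s actual code):

```swift
import UIKit

// Rough sketch only (hypothetical names, not actual LLM Farm code):
// after a Shortcuts-triggered query finishes, write the full output to a
// file, then open a user-configured callback URL so a second shortcut can
// pick the result up.
func finishShortcutQuery(output: String, callbackURL: URL?) {
    // Stand-in for the "on-device folder"; in practice this would be
    // whatever location the chat's configuration points at.
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let resultFile = documents.appendingPathComponent("last_query_output.txt")
    try? output.write(to: resultFile, atomically: true, encoding: .utf8)

    // Fire the callback, e.g. shortcuts://run-shortcut?name=HandleLLMResult.
    // For the "output in the URL" variant, a short, percent-encoded piece of
    // the output could be appended as a query item instead.
    if let url = callbackURL {
        UIApplication.shared.open(url)
    }
}
```

The callback could be a Shortcuts URL like shortcuts://run-shortcut?name=HandleLLMResult, so the second shortcut starts as soon as the file is written.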

An entirely alternative idea: LLM Farm runs an integrated web server. My shortcut opens LLM Farm (so it’s on screen), submits a web GET request in the background to trigger a query, and then ends. When the LLM Farm query completes, it saves the data to a file and calls a URL as described above. An integrated web server API would also make it easier for a shortcut to interact with a remote installation of LLM Farm running on a Mac or another iOS device, which would have other potential use cases.
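And here’s a very rough sketch of the integrated-web-server alternative, assuming something like Apple’s Network framework (again purely illustrative; the port and paths are made up):

```swift
import Foundation
import Network

// Purely illustrative: a tiny GET endpoint LLM Farm could expose so a
// shortcut (local or remote) can submit a prompt with a background request.
let listener = try! NWListener(using: .tcp, on: 8080)
listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65536) { data, _, _, _ in
        guard let data = data,
              let request = String(data: data, encoding: .utf8),
              let requestLine = request.components(separatedBy: "\r\n").first
        else { return }
        // Request line looks like: "GET /query?prompt=hello%20world HTTP/1.1"
        let target = requestLine.components(separatedBy: " ").dropFirst().first ?? ""
        let prompt = URLComponents(string: target)?
            .queryItems?.first(where: { $0.name == "prompt" })?.value ?? ""
        // Hand `prompt` to the model here, reply immediately so the shortcut
        // isn't kept waiting, then use the file + callback URL above when the
        // query actually completes.
        let body = "accepted: \(prompt)"
        let response = "HTTP/1.1 200 OK\r\nContent-Length: \(body.utf8.count)\r\n\r\n\(body)"
        connection.send(content: response.data(using: .utf8),
                        completion: .contentProcessed { _ in connection.cancel() })
    }
}
listener.start(queue: .main)
```

The shortcut side would then just be a “Get Contents of URL” action against something like http://localhost:8080/query?prompt=… before the shortcut exits, with completion signaled via the file and callback URL described above.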

@guinmoon added the enhancement (New feature or request) label on Nov 24, 2024