compel 2.0.2, on InvokeAI `main`
Example prompt: `("polar bear","james bond").blend(1,1) painting`
```
Traceback (most recent call last):
  File "/home/bat/Documents/Code/InvokeAI/invokeai/app/services/processor.py", line 106, in __process
    outputs = invocation.invoke_internal(
  File "/home/bat/Documents/Code/InvokeAI/invokeai/app/invocations/baseinvocation.py", line 610, in invoke_internal
    output = self.invoke(context)
  File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/bat/Documents/Code/InvokeAI/invokeai/app/invocations/compel.py", line 119, in invoke
    conjunction = Compel.parse_prompt_string(self.prompt)
  File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/compel/compel.py", line 158, in parse_prompt_string
    conjunction = pp.parse_conjunction(prompt_string)
  File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/compel/prompt_parser.py", line 330, in parse_conjunction
    root = self.conjunction.parse_string(prompt)
  File "/home/bat/invokeai/.venv/lib/python3.10/site-packages/pyparsing/core.py", line 1141, in parse_string
    raise exc.with_traceback(None)
pyparsing.exceptions.ParseException: Expected {explicit_conjunction | {[Group:({lora}...)] {blend | Group:([{cross_attention_substitute | lora | attention | Forward: string enclosed in '"' | parenthesized_fragment | free_word | Suppress:(<SP><TAB><CR><LF>)}]...)} [Group:({lora}...)] StringEnd}}, found 'painting' (at char 41), (line:1, col:42)
```
I would expect this to fall back to treating the prompt as a raw prompt.
fix: on error parsing, fall back to "raw" prompt
75d3ff4
Closes damian0815#69
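The fix description above suggests the approach: attempt the structured grammar first and, if parsing raises an exception, treat the entire string as a raw prompt. A minimal, self-contained sketch of that fall-back pattern follows. It does not use compel's actual grammar or API; the regex grammar, the `PromptParseError` exception, and the `parse_prompt` helper are all hypothetical stand-ins for illustration only.

```python
import re

class PromptParseError(ValueError):
    """Raised when a prompt does not match the structured grammar."""

def parse_structured(prompt: str) -> dict:
    # Toy stand-in for compel's grammar: accepts only
    # ("a","b").blend(x,y) with nothing trailing, mimicking how the
    # real parser rejects a blend followed by extra text.
    m = re.fullmatch(
        r'\("([^"]*)","([^"]*)"\)\.blend\(([\d.]+),([\d.]+)\)',
        prompt.strip(),
    )
    if m is None:
        raise PromptParseError(f"could not parse: {prompt!r}")
    return {
        "blend": [m.group(1), m.group(2)],
        "weights": [float(m.group(3)), float(m.group(4))],
    }

def parse_prompt(prompt: str) -> dict:
    """Try the structured grammar; on error, fall back to the raw string."""
    try:
        return parse_structured(prompt)
    except PromptParseError:
        # Fall back: treat the entire string as a single raw prompt
        # instead of surfacing the parse error to the caller.
        return {"raw": prompt}
```

With this pattern, the failing prompt from the report no longer raises: `parse_prompt('("polar bear","james bond").blend(1,1) painting')` returns the whole string as a raw prompt, while a well-formed blend still parses into its structured form.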