
fixed bug with double negated preconditions. We now simplify the theory #6

Open · wants to merge 2 commits into main

Conversation

nirlipo

@nirlipo nirlipo commented Sep 7, 2022

This fixes the following bug:

image

into:

image

Please check that this is the correct transformation. Note that the original code negated the AND formula, so the precondition became an OR of double-negated fluents, which the planner then handled incorrectly.
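To pin down the transformation being discussed, here is a minimal, self-contained sketch of the double-negation simplification (plain nested tuples, not the actual Tarski formula objects):

```python
# Hypothetical sketch (not the actual Tarski/PR code): removing double
# negations from a precondition formula.
def simplify(f):
    """Recursively remove double negations from a formula tree.

    Formulas are nested tuples: an atom is a string, ('not', f) is a
    negation, and ('and', ...) / ('or', ...) are connectives."""
    if isinstance(f, str):
        return f
    op, *args = f
    if op == 'not':
        inner = simplify(args[0])
        # not(not(p)) -> p
        if isinstance(inner, tuple) and inner[0] == 'not':
            return inner[1]
        return ('not', inner)
    return (op, *(simplify(a) for a in args))

# The buggy precondition: an OR of double-negated fluents
buggy = ('or', ('not', ('not', 'p')), ('not', ('not', 'q')))
print(simplify(buggy))  # -> ('or', 'p', 'q')
```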

@haz
Contributor

haz commented Sep 7, 2022

Can you clarify -- was the original incorrect, or just not handled by the planner you were using?

@haz
Contributor

haz commented Sep 7, 2022

Read through the change -- not sure this works semantically. We want to negate the other precondition, and if it's a conjunction, it indeed turns into a disjunction. Without doing that, you're essentially saying the action fails only if you can violate all of the domain2 preconditions while satisfying all of the domain1 preconditions. Instead, we want to violate just one of the domain2 preconditions while satisfying the domain1 preconditions.
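To spell out the De Morgan step with toy Python (hypothetical names, not project code): negating the conjunction of domain2 preconditions yields a disjunction, so the failed action should fire when at least one precondition is violated, not when all of them are:

```python
# Toy semantic check: not(p1 and p2) == (not p1) or (not p2).
from itertools import product

def violate_all(p1, p2):   # wrong encoding: fires only when every precondition is false
    return (not p1) and (not p2)

def violate_one(p1, p2):   # De Morgan of not(p1 and p2): fires when any precondition is false
    return (not p1) or (not p2)

# violate_one matches the negated conjunction on all assignments
for p1, p2 in product([False, True], repeat=2):
    assert violate_one(p1, p2) == (not (p1 and p2))

# The two encodings disagree when exactly one precondition fails:
print(violate_all(True, False), violate_one(True, False))  # False True
```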

That make sense?

@nirlipo
Author

nirlipo commented Sep 7, 2022

I tried to use the lor formula, but then Tarski would print an action embedding an or connective between each term, which defeats the purpose.

In the latest commit, I ended up adding a failed action for each precondition, by negating only one precondition at a time. This indicates exactly which precondition is triggering the error.

image

It may be that the original encoding was correct, but I couldn't understand which precondition was failing. Now I do :)

@nirlipo
Author

nirlipo commented Sep 7, 2022

On a different note, many alignments fail because students' submissions are parsed fine by the online solver but not by Tarski. We need a workaround, e.g. exposing this as an autograder, so they can make sure their models work with Tarski too.

@haz
Contributor

haz commented Sep 7, 2022

In the latest commit, I ended up adding a failed action for each precondition, by negating only one precondition at a time. This indicates exactly which precondition is triggering the error.

Very compelling. But I'm a bit torn. To truly know what's wrong, that's the best approach. To just give students a hint (e.g., when running validation on a webserver where they don't get to see the reference domain), perhaps it's too much info. Ultimately, having the choice might get us the best of both worlds. Two requests, if you're up for it:

  1. Make it a cmd-line option to go with the original mode (negate it all) -vs- the split mode (negate each precond and use multiple actions).
  2. Put in the assertions required to ensure the approach operates only on domains that make sense. If I have arbitrary nesting, or an "or" in the preconditions, or whatever, then it will fail silently (I think?). The original mode handles these cases, so the check only needs to be there for the "split" option.
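A rough sketch of what 1+2 above might look like (the `--failed-mode` flag and the helper function are hypothetical names; the real CLI may differ):

```python
# Sketch: a command-line switch between the two modes, plus an assertion
# that the split mode only runs on flat conjunctions of literals.
import argparse

def is_flat_conjunction_of_literals(pre):
    """True iff the precondition is a flat list of fluents or single
    negations of fluents -- no nesting, no 'or'."""
    for p in pre:
        if isinstance(p, str):
            continue
        if isinstance(p, tuple) and p[0] == 'not' and isinstance(p[1], str):
            continue
        return False
    return True

parser = argparse.ArgumentParser()
parser.add_argument('--failed-mode', choices=['negate-all', 'split'],
                    default='negate-all',
                    help='negate-all: one failed action with the negated '
                         'conjunction; split: one failed action per '
                         'negated precondition')
args = parser.parse_args(['--failed-mode', 'split'])

pre = ['clear_b', ('not', 'handempty')]
if args.failed_mode == 'split':
    # Fail loudly instead of silently on shapes the split mode cannot handle.
    assert is_flat_conjunction_of_literals(pre), \
        'split mode requires a flat conjunction of literals'
print(args.failed_mode)  # -> split
```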

On a different note, many alignments fail because students' submissions are parsed fine by the online solver but not by Tarski. We need a workaround, e.g. exposing this as an autograder, so they can make sure their models work with Tarski too.

Ya, this is a major problem. I've kind of conceded to not solving it since I need to move to a different style of assessment for the course:

  • Many low-stakes instant-feedback assignments
  • Higher-stakes quizzes (from 2 to 3)
  • The p4pp will be hosted, and validation against the reference will be provided as instant feedback

@nirlipo
Author

nirlipo commented Sep 7, 2022

Hosting seems like a good move to allow testing against Tarski and continuous feedback while developing their models.

@haz
Contributor

haz commented Sep 11, 2022

Big-time agree. Think you can tackle 1+2 above? I'll get it server-wrapped for us from there (will be needed early 2023).

@haz
Contributor

haz commented Jan 26, 2023

@nirlipo Any chance you'd like to make a move on the above? In the meantime, I've started picking away at the server setup...
