
[Pilot] Prepare Argilla workspace for Eval-Team #2

Open · 1 task done
PaulNdrei opened this issue Sep 23, 2024 · 7 comments
PaulNdrei commented Sep 23, 2024

The text eval team will need to start their first annotation:

  • Initialize a workspace and dataset in the dev instance for the text eval team

Workspace name: text-eval-team
Dataset name: AYA-RT_es_test_100

Once Javier Aula approves the content, fields, etc., we will initialize the workspace and the dataset on the Argilla production instance.
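A minimal setup sketch with the Argilla Python SDK (assuming v2.x); the API key, field names, and question wording below are placeholders, not the agreed final configuration:

```python
# Hypothetical workspace/dataset initialization against the dev instance.
# Field and question names are illustrative assumptions, not the final schema.
import argilla as rg

client = rg.Argilla(
    api_url="https://argilla.dev.aina.bsc.es",
    api_key="<owner-api-key>",  # placeholder
)

# Workspace for the text eval team
workspace = rg.Workspace(name="text-eval-team")
workspace.create()

# Dataset settings: the fields shown to annotators plus the two questions
settings = rg.Settings(
    guidelines="Annotation criteria go here (shown in the Guidelines tab).",
    fields=[
        rg.TextField(name="prompt"),
        rg.TextField(name="chosen"),
        rg.TextField(name="rejected"),
    ],
    questions=[
        rg.LabelQuestion(
            name="preference",
            labels=["Chosen", "Rejected", "Both", "None"],
        ),
        rg.RatingQuestion(name="rating", values=[1, 2, 3, 4, 5]),
    ],
)

dataset = rg.Dataset(
    name="AYA-RT_es_test_100",
    workspace="text-eval-team",
    settings=settings,
    client=client,
)
dataset.create()
```

This is a configuration sketch that needs a live Argilla instance and an owner API key to run; the same settings would be reused when moving to production.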

@PaulNdrei PaulNdrei self-assigned this Sep 23, 2024
PaulNdrei (Author) commented:

Annotation initialized here:
https://argilla.dev.aina.bsc.es/datasets?workspaces=text-eval-team

Waiting for Javier Aula's approval before initializing the annotation in the production Argilla instance.

cbarone511 commented:
Andrei needs to review the tasks. He created three users. Javier AB requested some changes, but one of them can't be done in Argilla, so production is blocked on this for now.
There is a pending meeting with Hannah on how to set up the image.
He will transfer the image from Magma into the Galtea repo.

@gullabi gullabi changed the title Prepare Argilla workspace for Eval-Team [Pilot] Prepare Argilla workspace for Eval-Team Sep 23, 2024
PaulNdrei (Author) commented:

Javier AB feedback:

  • Is there a way to have a text box before the Prompt box, OR before the right side bar panel, OR to create a pop-up tooltip when hovering over a question on the right side bar panel? We want to use it to remind annotators of the criteria they should follow. I understand this may be a bit tough to follow, so please call me if needed 🙂
  • "How satisfied are you with the response?" — please change it to "How much better is the chosen response with respect to the rejected one?"
  • When selecting "Both" or "None" for the first question, the second one (the 5-point scale) should not be available (greyed out or hidden).


PaulNdrei commented Sep 23, 2024

List of things to do:

  • Is there a way to have a text box before the Prompt box, OR before the right side bar panel, OR to create a pop-up tooltip when hovering over a question on the right side bar panel? We want to use it to remind annotators of the criteria they should follow.
    • This could be accomplished by adding a markdown field with the desired content.
    • The guidelines section could be considered.
      UPDATE 25/09/2024: Javier had not seen the guidelines tab. The guidelines will nevertheless be added at the top in another color with Markdown, so that they are always visible, since the guidelines tab seems difficult to find (UX issue).
      Javier will send us the correct guidelines to add at the top of the UI.
  • "How satisfied are you with the response?" — please change it to "How much better is the chosen response with respect to the rejected one?"
    • String replacement only (easy to do)
  • When selecting "Both" and "None" to the first question, the second one (the 5 point scale) should not be available (greyed out or disappear).
    • It seems this is not possible natively in Argilla, and it would also be complex to integrate.
    • I have asked on the Hugging Face Argilla support channel. (They responded that it is not implemented yet.)
      UPDATE 25/09/2024: This feature could be costly to implement, so the 5-point scale will not be hidden; instead, a clear note will be added to the guidelines telling annotators to leave the value at "1" (the value will be ignored in the post-annotation phases).
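Since the scale cannot be greyed out in the UI, the ignore-the-rating rule could be applied after export. A minimal sketch, assuming exported responses are plain dicts with hypothetical `preference` and `rating` keys:

```python
def clean_record(record):
    """Null out the 5-point rating when the first answer makes it
    irrelevant ("Both" or "None"); annotators leave it at "1" in that case."""
    cleaned = dict(record)
    if cleaned.get("preference") in ("Both", "None"):
        cleaned["rating"] = None
    return cleaned

records = [
    {"preference": "Chosen", "rating": 4},
    {"preference": "Both", "rating": 1},  # placeholder value to be ignored
]
print([clean_record(r) for r in records])
# → [{'preference': 'Chosen', 'rating': 4}, {'preference': 'Both', 'rating': None}]
```

The key names and label values are assumptions for illustration; the actual export schema depends on the final dataset settings.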

gullabi commented Sep 30, 2024

We need to add language and model metadata to the records, so that they show up in the exports.
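A sketch of enriching each record with this metadata before logging it, assuming records are plain dicts; the `language` and `model` values and the record shape are illustrative:

```python
def with_metadata(record, language, model):
    """Return a copy of the record with language and model metadata
    attached, so that exported records carry both fields."""
    enriched = dict(record)
    metadata = dict(enriched.get("metadata", {}))  # don't mutate the input
    metadata.update({"language": language, "model": model})
    enriched["metadata"] = metadata
    return enriched

record = {"fields": {"prompt": "Hola"}}
print(with_metadata(record, language="es", model="aya"))
# → {'fields': {'prompt': 'Hola'}, 'metadata': {'language': 'es', 'model': 'aya'}}
```

In Argilla itself this would map to metadata properties on the dataset settings plus a `metadata` dict on each record, but the helper above only shows the enrichment step.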


PaulNdrei commented Oct 1, 2024

  • Add guidelines
  • Add text team user (use their bsc nicknames)
  • Check model fields to see if they exist in the output files.
  • Add "N/A" ("not applicable") option to the list of rating values

gullabi commented Oct 7, 2024

Waiting for the final dataset from Javier. [Paused]
