
# GLUE Benchmark

## Generating Submissions

jiant supports generating submission files for GLUE. To generate test predictions, pass the `--write_test_preds` flag to `runscript.py` when running your workflow. This writes a `test_preds.p` file to the specified output directory. To convert `test_preds.p` to the required GLUE submission format, use the following command:

```bash
python benchmark_submission_formatter.py \
    --benchmark GLUE \
    --input_base_path $INPUT_BASE_PATH \
    --output_path $OUTPUT_BASE_PATH
```

where `$INPUT_BASE_PATH` contains the task folder(s) output by `runscript.py`. Alternatively, a subset of tasks can be formatted using:

```bash
python benchmark_submission_formatter.py \
    --benchmark GLUE \
    --tasks cola mrpc \
    --input_base_path $INPUT_BASE_PATH \
    --output_path $OUTPUT_BASE_PATH
```