


We will present a range of “case study” assignment configurations, from simple through complex, using a variety of different automated grading methods including per-character and per-line output difference checkers, external unit testing frameworks (such as JUnit), memory debugging tools (Valgrind and DrMemory), code coverage (e.g., Emma), static analysis tools, and custom graders. Submitty can be customized per test case as appropriate to apply resource limits (running time, number of processes, output file size, etc.) and to display or hide from students the program output, autograding results, and testing logs.

Where to Store your Assignment Configurations

To allow backups and re-use of assignment configurations, we recommend that assignment configurations be prepared in a separate version-controlled repository (e.g., Git).

The assignment configuration may contain hidden input examples, solution output, and/or solution code that should not be publicly available to students. Thus, this repository should be private or shared only with other instructors or teaching assistants.

We suggest storing these per-course private repositories on the server, with controlled permissions. For example:

/var/local/submitty/private_course_repositories/computer_science_1/
/var/local/submitty/private_course_repositories/data_structures/
etc.
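
The setup above can be sketched as follows. This is an illustrative sketch only: the `BASE` default here uses the current directory so the commands can run anywhere, but on a production Submitty server it would be `/var/local/submitty/private_course_repositories` (and creating it would typically require root).

```shell
# Illustrative sketch: initialize one private, version-controlled
# per-course repository with group-only permissions.
BASE="${BASE:-$PWD/private_course_repositories}"
COURSE="computer_science_1"

mkdir -p "$BASE/$COURSE"
chmod 750 "$BASE/$COURSE"            # no world access: configs contain solution code
git -C "$BASE/$COURSE" init --quiet  # version control for backup and re-use
```

Sharing with other instructors or TAs can then be handled with group membership on the server or with access controls on the remote Git host.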

Overall Structure of an Assignment Configuration

You can study the Tutorial sample assignment configurations here:
Tutorial Example Configurations

Additional configuration examples are also available.

Each assignment configuration will have a top level directory with a config.json file. Here is the general structure of a homework configuration directory:

   computer_science_1
   └── my_python_homework
       ├── config.json                   [ REQUIRED ]
       ├── provided_code                 [ OPTIONAL ]
       │   ├── instructor_code.cpp
       │   └── instructor_code.h
       ├── test_input                    [ OPTIONAL ]
       │   ├── input_1.txt
       │   ├── input_2.txt
       │   └── input_3.txt
       ├── test_output                   [ OPTIONAL ]
       │   ├── output_1.txt
       │   ├── output_2.txt
       │   └── output_3.txt
       ├── instructor_CMakeLists.txt     [ OPTIONAL ]
       └── custom_validation_code        [ OPTIONAL ]
           ├── grader.cpp
           ├── grader.h
           └── another_file.cpp
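
A minimal config.json for a directory like the one above might look like the following. This is an illustrative sketch only; the field names, commands, and point values are examples, and the Tutorial sample configurations should be consulted for authoritative syntax.

```json
{
    "testcases": [
        {
            "title": "Compilation",
            "type": "compilation",
            "command": "g++ *.cpp -o a.out",
            "executable_name": "a.out",
            "points": 2
        },
        {
            "title": "Test 1: example input",
            "command": "./a.out < input_1.txt",
            "points": 8,
            "validation": [
                {
                    "method": "diff",
                    "actual_file": "STDOUT.txt",
                    "expected_file": "output_1.txt"
                }
            ]
        }
    ]
}
```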

Phases of Autograding

First Phase: Compilation

  1. Create a temporary directory for autograding this student’s submission.

  2. Create a temporary subdirectory for compilation.

  3. Copy the student’s submitted source code (for compiled languages) to this temporary directory. Note: The copied files can be controlled with the submitted_to_compilation variable in config.json.

  4. Copy the files from the provided_code directory into the temporary compilation subdirectory.

  5. Scan through the testcases in the config.json for all testcases with type = “compilation”.

  6. Execute the “command”(s) for the compilation testcases.

  7. Rename the STDOUT.txt, STDERR.txt, execution logfiles, and any specified output files created by the program execution (prefixing each with the test case number).
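
The compilation phase steps above are driven by the config.json. As a hedged sketch (the compiler command and glob patterns are examples, not required values), a compilation testcase and the file-copy control variable might look like:

```json
{
    "autograding": {
        "submitted_to_compilation": [ "*.cpp", "*.h" ]
    },
    "testcases": [
        {
            "title": "Compilation",
            "type": "compilation",
            "command": "clang++ -Wall *.cpp -o a.out",
            "executable_name": "a.out",
            "points": 2
        }
    ]
}
```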

Second Phase: Execution

  1. Create a temporary subdirectory for runner and validation work.

  2. Copy the student’s submitted source code (for interpreted languages) to the tmp_work subdirectory. Note: The copied files can be controlled with the submitted_to_runner variable in config.json.

  3. Copy the test input files to the tmp_work subdirectory.

  4. Copy the compiled executables from the tmp_compilation subdirectory to the tmp_work subdirectory. Note: The copied files can be controlled with the compilation_to_runner variable in config.json.

  5. Scan through the testcases in the config.json for all testcases with type = “execution”.

  6. Execute the “command”(s) for the execution testcases.

  7. Rename the STDOUT.txt, STDERR.txt, execution logfiles, and any specified output files created by the program execution (prefixing each with the test case number).
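
The execution phase steps above can likewise be sketched in config.json. In this hedged example (file names and globs are illustrative), an interpreted-language submission is copied to the work directory and run against one of the test inputs:

```json
{
    "autograding": {
        "submitted_to_runner": [ "*.py" ],
        "compilation_to_runner": [ "a.out" ]
    },
    "testcases": [
        {
            "title": "Test 1",
            "command": "python3 my_homework.py < input_1.txt",
            "points": 5
        }
    ]
}
```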

Third Phase: Validation

  1. Copy specific files as needed from the student’s submission to the tmp_work subdirectory. Note: These files are specified with the submitted_to_validation variable in config.json.

  2. Copy the custom validation code into the tmp_work subdirectory.

  3. Copy the expected test output into the tmp_work subdirectory.

  4. Copy output files from compilation from the tmp_compilation subdirectory to the tmp_work subdirectory. Note: The copied files can be controlled with the compilation_to_validation variable in config.json.

  5. Scan through the test cases in the config.json and perform the validation checks indicated within each check.

  6. Calculate the score for each test case, and determine what messages and files should be displayed for each test case.

  7. Write the results.json and grade.txt files.

  8. Copy files as needed from the tmp_work directory to the details subfolder of the student’s results directory for this assignment and submission version, for archiving. Note: The copied files can be controlled with the work_to_details variable in config.json.
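
The three file-copy variables named in the validation steps above sit alongside the others in the autograding block. A hedged sketch (the glob patterns are examples only):

```json
{
    "autograding": {
        "submitted_to_validation": [ "README.txt" ],
        "compilation_to_validation": [ "*.txt" ],
        "work_to_details": [ "*.txt" ]
    }
}
```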

Overall Specification of a config.json file

You are allowed to have C/C++ style comments in a config.json file. These will be removed before compilation of the autograding executables.
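
For example, both comment styles are legal and are stripped before use:

```json
{
    // single-line comments are allowed
    /* and block comments too; both are removed before the
       autograding executables are compiled */
    "testcases": []
}
```

Note that standard JSON does not permit comments, so a generic JSON linter may flag a commented config.json even though Submitty accepts it.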

Specification of a Testcase

Types of Action

{
    "actual_file": "0.png",
    "description": "This description will be shown to the student",
    "method": "fileExists",
    "show_actual": "always",
    "show_message": "always"
}

Specification of a Textbox

Specification of an Image Object

Specification of a Validation Object

Validation Methods
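
As one illustration of a validation method, a common check compares the program’s standard output against an instructor-provided expected file. This is a hedged sketch following the pattern of the fileExists example above; the exact set of fields and method names should be confirmed against the sample configurations.

```json
{
    "method": "diff",
    "actual_file": "STDOUT.txt",
    "expected_file": "output_1.txt",
    "show_actual": "always",
    "show_message": "on_failure"
}
```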