We will present a range of “case study” assignment configurations, from simple through complex, using a variety of different automated grading methods including per-character and per-line output difference checkers, external unit testing frameworks (such as JUnit), memory debugging tools (Valgrind and DrMemory), code coverage (e.g., Emma), static analysis tools, and custom graders. Submitty can be customized per test case as appropriate to apply resource limits (running time, number of processes, output file size, etc.) and to display or hide from students the program output, autograding results, and testing logs.

Where to Store your Assignment Configurations

To allow backups and re-use of assignment configurations, we recommend preparing assignment configurations in a separate version-controlled repository (e.g., Git).

The assignment configuration may contain hidden input examples, solution output, and/or solution code that should not be publicly available to students. Thus, this repository should be private or shared only with other instructors or teaching assistants.

We suggest storing these per-course private repositories on the server, with controlled permissions. For example:

/var/local/submitty/private_course_repositories/computer_science_1/
/var/local/submitty/private_course_repositories/data_structures/
etc.

Overall Structure of an Assignment Configuration

You can study the Tutorial sample assignment configurations here:
Tutorial Example Configurations

Additional configuration examples are also available.

Each assignment configuration will have a top level directory with a config.json file. Here is the general structure of a homework configuration directory:

   computer_science_1
   └── my_python_homework
       ├── config.json                   [ REQUIRED ]
       ├── provided_code                 [ OPTIONAL ]
       │   ├── instructor_code.cpp
       │   └── instructor_code.h
       ├── test_input                    [ OPTIONAL ]
       │   ├── input_1.txt
       │   ├── input_2.txt
       │   └── input_3.txt
       ├── test_output                   [ OPTIONAL ]
       │   ├── output_1.txt
       │   ├── output_2.txt
       │   └── output_3.txt
       ├── instructor_CMakeLists.txt     [ OPTIONAL ]
       └── custom_validation_code        [ OPTIONAL ]
           ├── grader.cpp
           ├── grader.h
           └── another_file.cpp
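
As a sketch of how these pieces fit together, a minimal config.json defines an array of testcases. The field names below follow the Tutorial examples; the commands and filenames are placeholders:

```json
{
    "testcases" : [
        {
            "type" : "Compilation",
            "title" : "Compilation",
            "command" : "g++ -o hw.out *.cpp",
            "executable_name" : "hw.out",
            "points" : 2
        },
        {
            "title" : "Test 1",
            "command" : "./hw.out input_1.txt",
            "points" : 5
        }
    ]
}
```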

Phases of Autograding

First Phase: Compilation

  1. Create a temporary directory for autograding this student’s submission.

  2. Create a temporary subdirectory for compilation.

  3. Copy the student’s submitted source code (for compiled languages) to this temporary directory. Note: The copied files can be controlled with the submission_to_compilation variable in config.json.

  4. Copy the files from the provided_code directory into the temporary compilation subdirectory.

  5. Scan through the testcases in the config.json for all testcases with type = “compilation”.

  6. Execute the “command”(s) for the compilation testcases.

  7. Rename STDOUT.txt, STDERR.txt, the execution logfiles, and any specified output files the program execution is expected to have created (each prefixed with the test case number).

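The compilation testcases scanned for in step 5 might look like the following sketch (the compiler command and filenames are placeholders; the field names follow the Tutorial examples):

```json
{
    "type" : "Compilation",
    "title" : "Compile student code",
    "command" : "g++ -Wall -o hw.out *.cpp",
    "executable_name" : "hw.out",
    "points" : 2
}
```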
Second Phase: Execution

  1. Create a temporary subdirectory for runner and validation work.

  2. Copy the student’s submitted source code (for interpreted languages) to the tmp_work subdirectory. Note: The copied files can be controlled with the submission_to_runner variable in config.json.

  3. Copy the test input files to the tmp_work subdirectory.

  4. Copy the compiled executables from the tmp_compilation subdirectory to the tmp_work subdirectory. Note: The copied files can be controlled with the compilation_to_runner variable in config.json.

  5. Scan through the testcases in the config.json for all testcases with type = “execution”.

  6. Execute the “command”(s) for the execution testcases.

  7. Rename STDOUT.txt, STDERR.txt, the execution logfiles, and any specified output files the program execution is expected to have created (each prefixed with the test case number).

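An execution testcase scanned for in step 5 might look like the following sketch; the command and input filename are placeholders, and the resource_limits block (capping CPU time, as mentioned in the introduction) is shown as one example of a per-testcase limit:

```json
{
    "title" : "Run with input_2.txt",
    "command" : "./hw.out input_2.txt",
    "points" : 5,
    "resource_limits" : {
        // cap running time, in seconds
        "RLIMIT_CPU" : 10
    }
}
```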
Third Phase: Validation

  1. Copy specific files as needed from the student’s submission to the tmp_work subdirectory. Note: These files are specified with the submission_to_validation variable in config.json.

  2. Copy the custom validation code into the tmp_work subdirectory.

  3. Copy the expected test output into the tmp_work subdirectory.

  4. Copy output files from compilation from the tmp_compilation subdirectory to the tmp_work subdirectory. Note: The copied files can be controlled with the compilation_to_validation variable in config.json.

  5. Scan through the test cases in the config.json and perform the validation checks indicated within each check.

  6. Calculate the score for each test case, and determine what messages and files should be displayed for each test case.

  7. Write the results.json and grade.txt files.

  8. Copy files as needed from the tmp_work directory, for archiving, to the details subfolder of the student’s results directory for this assignment and submission version. Note: The copied files can be controlled with the work_to_details variable in config.json.
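
The validation checks performed in step 5 are listed per testcase. A sketch comparing the program’s captured standard output against an expected output file (the method and filenames are illustrative, following the Tutorial examples):

```json
{
    "title" : "Test 1",
    "command" : "./hw.out input_1.txt",
    "points" : 5,
    "validation" : [
        {
            "method" : "diff",
            "actual_file" : "STDOUT.txt",
            "expected_file" : "output_1.txt"
        }
    ]
}
```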

Variables to move files

As outlined in the sections and diagrams above, there are six configuration settings in the config.json that control the movement of files. Some of them have reasonable defaults for assignments that compile and run Python, C++, and Java programs (we may update these defaults in future revisions of Submitty). Each setting should be a list of one or more strings to match against files. You may use wildcards. Example syntax:

    "autograding" : {
        "submission_to_compilation" : [ "part1/*.pdf" ],
        "submission_to_runner" : [ "part2/*.pdf", "special.xlsx" ],
        "compilation_to_runner" : [ "**/*.pdf" ],
        "submission_to_validation" : [ "part3/*.png" ],
        "compilation_to_validation" : [ "*/*.pdf" ],
        "work_to_details" : [ "*.pdf" ]
    },

These file match patterns will be appended to the Submitty defaults, defined here: grading/load_config_json.cpp

Overall Specification of a config.json file

You are allowed to have C/C++ style comments in a config.json file. These will be removed before compilation of the autograding executables.

Specification of a Testcase

Specification of a Networked Gradeable

Example Specification:

//use_router can be specified at the testcase level.
"use_router" : true,
"containers" : [
                {
                    "container_name" : "server",
                    "commands" : ["python3 server.py server"],
                    "outgoing_connections" : ["client"],
                    "container_image" : "ubuntu:custom"
                },
                {
                    "container_name" : "client",
                    "commands" : ["sleep 1", "python3 client.py client 0"],
                    "outgoing_connections" : ["server"]
                },
                {
                    "container_name" : "router",
                    "commands" : ["python3 router.py"]
                }
]

Notes:

  1. In networks specified with use_router = true, a “router” node intercepts and relays student messages. This allows an instructor to log all messages sent within the system, as well as to add rules regarding message delay and loss. A router must be hand-specified by the instructor per testcase. See Submitty Tutorial 16 for an example router.

  2. It can be important to ensure your containers start in the correct order. In the example above, a sleep is used on the client to ensure that the server starts before the client does.

  3. A known bug causes standard output to fail to flush its buffer in networked gradeables (confirmed in Python). As such, all instructor and student code should either explicitly flush stdout or write output to a file.

Dispatcher Actions (Standard Input)

It is possible to communicate with an assignment running in docker via standard input.

"dispatcher_actions" :
[
  {
    "action" : "delay",
    "seconds" : 2
  },
  {
    "containers" : ["container0"],
    "action" : "stdin",
    "string" : "Hi there! I'm container0\n"
  },
  {
    "containers" : ["container1"],
    "action" : "stdin",
    "string" : "Hi there! I'm container1\n"
  }
],

Dispatcher actions are specified at the testcase level and are delivered sequentially to student containers. There are two types of action: stdin and delay. A delay specifies a floating-point number of seconds to wait before the next action is processed. A stdin action delivers a string to every container whose name appears in its “containers” array. Please note that many languages require a newline at the end of input expected on stdin.

Interfacing With Graphics Applications

It is possible to provide keyboard and mouse input to running student graphics applications.

Types of Graphics Application Actions

{   
  "actual_file": "0.png",   
  "description": "This description will be shown to the student",    
  "method": "fileExists",    
  "show_actual": "always",   
  "show_message": "always"   
}  

Automatically Generated Submission Limit Test Case

Students are allowed to resubmit if they discover an error, and they should be able to submit partial work early to verify they are on the right track.

However, we assume that students will do the bulk of their development, testing, and debugging on a local machine. To prevent overuse of limited resources, Submitty automatically adds a test case that applies a small penalty of 1/10th of a point for each submission beyond the first 20. Note that autograding totals round down to the nearest integer.

        {
            "title": "Submission Limit",
            "type": "FileCheck",
            "max_submissions": 20,
            "penalty": -0.1,
            "points": -5
        }

You may adjust this limit by pasting this syntax into your config.json and adjusting the parameters. The student is allowed max_submissions penalty-free submissions. After that, they are charged penalty points per additional submission, up to the maximum total penalty set with the points parameter.
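For instance, to allow 40 penalty-free submissions with the same per-submission penalty but a larger maximum penalty (a sketch; adjust the numbers to suit your course):

```json
{
    "title": "Submission Limit",
    "type": "FileCheck",
    "max_submissions": 40,
    "penalty": -0.1,
    "points": -10
}
```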

Hidden Test Case configuration with customized Submission Limit

Note, you can view the defaults added to your config file by viewing:

   /var/local/submitty/courses/<SEMESTER>/<COURSE>/config/complete_config/complete_config_<GRADEABLE>.json

Specification of a Textbox

Specification of an Image Object

Specification of a Validation Object

Validation Methods