
4. Setting up a contest

After installation is successful, you want to run your contest! Configuring DOMjudge to run a contest (or a number of them, in sequence) involves the following steps, each detailed in a subsection below: configure the contest data, set up user authentication, provide testdata, start the daemons, and check that everything works.

4.1 Configure the contest data

DOMjudge stores and retrieves most of its data from the MySQL database. Some information must be filled in beforehand; other tables will be populated by DOMjudge itself.

You can use the jury web interface to add, edit and delete most types of data described below. It's advised to keep a version of phpMyAdmin handy in case of emergencies, or for general database operations like import and export.

This section describes the meaning of each table and what you need to put into it. Tables marked with an `x' are the ones you have to configure with contest data before running a contest (via the jury web interface or e.g. with phpMyAdmin), the other tables are used automatically by the software:
  auditlog                Log of every state-changing event.
  balloon                 Balloons to be handed out.
  clarification           Clarification requests/replies are stored here.
x configuration           Runtime configuration settings.
x contest                 Contest definitions with start/end time.
x contestproblem          Coupling of problems to contests and data specific to it.
x contestteam             Coupling of teams to contests.
  event                   Log of events during contests.
x executable              Executable compile/run/compare scripts.
  internal_error          Stores errors that occurred on judgehosts, including logs.
  judgehost               Computers (hostnames) that function as judgehosts.
x judgehost_restriction   Optional restriction sets on submissions taken by judgehosts.
  judging                 Judgings of submissions.
  judging_run             Result of one testcase within a judging.
x language                Definition of allowed submission languages.
x problem                 Definition of problems (name, timelimit, etc.).
  rankcache               Cache of team ranking data for public/teams and for the jury.
  rejudging               Metadata for batched rejudging.
  role                    Possible user roles.
  scorecache              Cache of the scoreboards for public/teams and for the jury.
  submission              Submission metadata of solutions to problems.
  submission_file         Submitted code files.
x team                    Definition of teams.
x team_affiliation        Definition of institutions a team can be affiliated with.
x team_category           Different category groups teams can be put in.
  team_unread             Records which clarifications are read by which team.
x testcase                Definition of testdata for each problem.
x user                    Users that will be able to access the system.
x userrole                Mapping of users to their roles.

Now follows a longer description (including fields) per table that has to be filled manually. As a general remark: almost all tables have an identifier field. Most of these are numeric and automatically increasing; these do not need to be specified. The tables executable and language have text strings as identifier fields. These need to be manually specified and only alpha-numeric, dash and underscore characters are valid, i.e. a-z, A-Z, 0-9, -, _.

configuration

This table contains configuration settings. These entries are simply stored as name, value pairs, where the values are JSON encoded, type contains the allowed data type, and description documents the configuration setting.
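
For example, a global setting can be changed directly with a statement like the sketch below. Normally you would edit settings in the jury web interface, and which rows exist depends on your DOMjudge version; the memory_limit setting is the one referenced under the problem table below:

    -- Set the global memory limit to 524288 kB. The value column is
    -- JSON encoded, so the number is stored as the JSON literal 524288.
    UPDATE configuration SET value = '524288' WHERE name = 'memory_limit';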

contest

The contests that the software will run. E.g. a test session and the live contest.

cid is the reference ID and contestname is a descriptive name used in the interface, while shortname is the publicly visible identifier.

activatetime, starttime and endtime are required fields and specify when this contest is active and open for submissions. The optional freezetime and unfreezetime control scoreboard freezing, and deactivatetime sets when the contest stops being visible. For a detailed treatment of these, see section Contest milestones. All contest times can be specified relative to starttime, except of course starttime itself. The input given in the jury interface (either relative or absolute) is stored in the *time_string fields, while a calculated absolute version is stored in the fields without the _string suffix.

The public field can be used to limit which contests are displayed as public scoreboards (as opposed to privately to a selected set of teams), while enabled can be used to (temporarily) disable a contest altogether.
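
As an illustration, a minimal contest row could look like the sketch below. This is a sketch under assumptions: the time string syntax shown (absolute with timezone, relative as an offset) may differ per version, and a direct insert leaves the calculated absolute *time fields empty. Editing contests through the jury interface, which derives those fields from the _string input, is therefore preferred:

    -- Hypothetical contest: activates one hour before the start,
    -- runs for five hours from June 1st, 12:00 local time.
    INSERT INTO contest (contestname, shortname, activatetime_string,
                         starttime_string, endtime_string, public, enabled)
    VALUES ('Example contest 2024', 'example24', '-1:00',
            '2024-06-01 12:00:00 Europe/Amsterdam', '+5:00', 1, 1);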

contestproblem

This table couples problems to contests: cid and probid describe the pairing.

Furthermore, it stores problem data that is specific to the contest in which it is included: shortname is a contest-unique identifier string for the problem; points defaults to 1 and can be set to assign non-even scoring; allow_submit determines whether teams can submit solutions for this problem (non-submittable problems are also not displayed on the scoreboard, which can be used to define spare problems that can then be added to the contest quickly); allow_judge determines whether judgehosts will judge submissions for this problem. See also the explanation for language.

The color tag can be filled with a CSS colour specification to associate with this problem; see also section Scoreboard: colours.
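
A coupling row could then be created as follows (a sketch; the cid and probid values must reference existing contest and problem rows):

    -- Include problem 1 in contest 1 as problem `A', worth 1 point,
    -- open for submission and judging, shown in red on the scoreboard.
    INSERT INTO contestproblem (cid, probid, shortname, points,
                                allow_submit, allow_judge, color)
    VALUES (1, 1, 'A', 1, 1, 1, 'red');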

contestteam

This table couples teams to contests. Teams can only submit solutions to problems in contests that are public or which they are part of.
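
For a non-public contest this pairing can be made with a row like the following sketch:

    -- Let team 1 take part in (non-public) contest 1.
    INSERT INTO contestteam (cid, teamid) VALUES (1, 1);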

executable

This table stores zip-bundles of executable scripts that can be used as compile, run, and compare scripts.

judgehost_restriction

This table encodes restriction sets for selecting which submissions are sent to a judgehost. The restrictions are JSON encoded in the restrictions column, and can be set in the admin web interface to restrict on specific contests, problems, languages, and to never rejudge on the same judgehost. A restriction set can be assigned to judgehost(s) on the edit page of the judgehosts overview.
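
For illustration only, a restriction row might look like the sketch below. Both the name column and the JSON keys shown here are hypothetical; create restriction sets through the admin web interface to get the exact format your version expects:

    -- Hypothetical restriction set: only judge contest 1 and the `c'
    -- language, and never rejudge on the same judgehost.
    INSERT INTO judgehost_restriction (name, restrictions)
    VALUES ('c-only',
            '{"contest": [1], "language": ["c"], "rejudge_own": false}');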

language

Programming languages in which to accept and judge submissions. langid is a string of maximum length 8, which references the language. name is the displayed name of the language; extensions is a JSON encoded list of recognized filename extensions; allow_submit determines whether teams can submit using this language; allow_judge determines whether judgehosts will judge submissions in this language. The latter can for example be set to no to temporarily hold judging when a problem occurs with the judging of a specific language; after resolution of the problem it can be set to yes again.

time_factor is the relative factor by which the timelimit is multiplied for solutions in this language; compile_script refers to a compile executable script that is used for this language.
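
For example, a row for the C language might look like this (a sketch; compare it against the rows shipped with your installation):

    -- C, recognized by the .c extension, compiled with the `c' compile
    -- script from the executable table, no extra time allowance.
    INSERT INTO language (langid, name, extensions, time_factor,
                          compile_script, allow_submit, allow_judge)
    VALUES ('c', 'C', '["c"]', 1, 'c', 1, 1);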

problem

This table contains the problem definitions. probid is the reference ID and name is the full name (description) of the problem. Problems are attached to one or more contests via the contestproblem table described above.

Whether teams can submit solutions for a problem and whether judgehosts will judge them is configured per contest with the allow_submit and allow_judge fields of the contestproblem table; see the explanation there.

timelimit is the timelimit in seconds within which solutions for this problem have to run (taking into account time_factor per language). See also enforcement of time limits for more details.

memlimit is the memory limit in kB allotted for this problem. If empty then the global configuration setting memory_limit is used. Equivalently for outputlimit.

If special_run is not empty, it defines a custom run program run_<special_run> to run compiled submissions for this problem, and if special_compare is not empty, it defines a custom compare program compare_<special_compare> to compare output for this problem.

The color tag can be filled with a CSS colour specification to associate with this problem; see also section Scoreboard: colours.

In problemtext a PDF, HTML or plain text document can be placed, which allows teams, the public and the jury to download the problem statement. Note that no additional filtering takes place, so HTML (and PDF to some extent) should come from a trusted source to prevent cross-site scripting or other attacks. The file type is stored in problemtext_type.
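
Putting these fields together, a problem could be defined as in the sketch below (the problem statement and testcases are more conveniently uploaded through the jury interface afterwards):

    -- Problem with a 5 second time limit; memlimit and outputlimit are
    -- left NULL so the global configuration settings apply, and the
    -- default run and compare scripts are used.
    INSERT INTO problem (name, timelimit, memlimit, outputlimit,
                         special_run, special_compare)
    VALUES ('Hello World', 5, NULL, NULL, NULL, NULL);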

team

Table of teams: teamid is the (internal) ID of the team, while externalid can be used to store an ID for importing from/exporting to other systems. name is the displayed name of the team, categoryid is the ID of the category the team is in, and affilid is the affiliation ID of the team.

When enabled is set to 0, the team immediately disappears from the scoreboards and cannot use the team web interface anymore, even when already logged in. One use case could be to disqualify a team on the spot.

members are the names of the team members, separated by newlines, and room is the location or room of the team; both are for display only. comments can be filled with arbitrary useful information and is only visible to the jury. The timestamp teampage_first_visited and the hostname field indicate when/whether/from where a team visited its team web interface.

The penalty field can be used to give this team a (positive or negative) number of penalty minutes to correct for exceptional circumstances.
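
A complete team row might then look like this (a sketch; categoryid and affilid must reference existing team_category and team_affiliation rows):

    -- Team in category 2, affiliated with institution 1; the members
    -- field separates names by newlines ('\n' in MySQL string syntax).
    INSERT INTO team (name, categoryid, affilid, room, members, penalty)
    VALUES ('Example team', 2, 1, 'B1.20', 'Alice\nBob', 0);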

team_affiliation

affilid is the reference ID and name is the name of the institution. country should be the 3 character ISO 3166-1 alpha-3 abbreviation of the country and comments is a free form field that is displayed in the jury interface.

A country flag can be displayed on the scoreboard. For this to work, the country field must match a (flag) picture in webapp/web/images/countries/<country>.png. All country flags are present there, named with their 3-character ISO codes. See also webapp/web/images/countries/README.
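
For example (a sketch; the country code must match one of the flag images described above):

    -- Affiliation whose teams get the Dutch flag
    -- (webapp/web/images/countries/NLD.png) on the scoreboard.
    INSERT INTO team_affiliation (name, country, comments)
    VALUES ('Example University', 'NLD', 'Contact: coach@example.edu');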

team_category

categoryid is the reference ID and name is a string: the name of the category. sortorder is the order in which this group is sorted on the scoreboard, where a higher number sorts lower; teams with an equal sortorder are ranked among each other by score.

The color is again a CSS colour specification used to discern different categories easily. See also section Scoreboard: colours.

The visible flag determines whether teams in this category are displayed on the public/team scoreboard. This feature can be used to remove teams from the public scoreboard by assigning them to a separate, invisible category.
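
For example, an invisible category for guest teams could be added as follows (a sketch; this assumes your regular participants use a lower sortorder):

    -- Category hidden from the public scoreboard, sorted below
    -- categories with sortorder 0.
    INSERT INTO team_category (name, sortorder, color, visible)
    VALUES ('Observers', 1, 'silver', 0);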

testcase

The testcase table contains testdata for each problem. testcaseid is a unique identifier; input and output contain the testcase input/output and image an optional graphical representation of the testcase for the jury. The fields md5sum_input, md5sum_output, and md5sum_image contain their respective md5 hashes, which the judgehosts use to check whether their cached versions are up to date; image_thumb and image_type contain a thumbnail version and the MIME type string of the image. The field probid is the corresponding problem and rank determines the order of the testcases for one problem. description is an optional description for this testcase. See also providing testdata.
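
The testcases of a problem can be inspected with a query like the following sketch; the checksums shown are the ones the judgehosts compare against their cached copies:

    -- List the testcases of problem 1 in judging order.
    SELECT `rank`, description, md5sum_input, md5sum_output
      FROM testcase WHERE probid = 1 ORDER BY `rank`;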

user

This table has the users that the system knows about with their login credentials. Each user may have one or more roles, like being part of a team, being a jury member or administrator. There are also functional accounts, like for judgedaemons.

4.2 Contest milestones

The contest table specifies timestamps for each contest that mark specific milestones in the course of the contest.

The triplet activatetime, starttime and endtime define when the contest runs and are required fields (activatetime and starttime may be equal).

activatetime is the moment when a contest first becomes visible to the public and teams. Nothing can be submitted yet and the problem set is not revealed. Clarifications can be viewed and sent.

At starttime, the scoreboard is displayed and submissions are accepted. At endtime the contest stops. New incoming submissions will still be processed and judged, but the result will no longer be shown to teams; they instead receive the verdict `too-late'. Unjudged submissions received before endtime will still be judged normally.

freezetime and unfreezetime control scoreboard freezing. freezetime is the time after which the public and team scoreboard are not updated anymore (frozen). This is meant to make the last stages of the contest more thrilling, because no-one knows who has won. Leaving them empty disables this feature. When using this feature, unfreezetime can be set to automatically `unfreeze' the scoreboard at that time. For a more elaborate description, see also section Scoreboard: freezing and defrosting.

The scoreboard, results and clarifications will remain visible to teams and the public after a contest, until the deactivatetime.

All events happen at the first moment of the defined time. That is: for a contest with starttime "12:00:00" and endtime "17:00:00", the first submission will be accepted at 12:00:00 and the last one at 16:59:59.

The following ordering must always hold: activatetime <= starttime < (freezetime <=) endtime (<= unfreezetime) (<= deactivatetime).
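
This ordering can be verified manually with a query along the lines of the sketch below; it assumes the calculated absolute time fields compare chronologically and that unused optional times are NULL. The config-checker in the jury interface performs similar sanity checks:

    -- Report contests whose milestones violate the required ordering.
    SELECT cid, shortname FROM contest
     WHERE NOT (activatetime <= starttime AND starttime < endtime)
        OR (freezetime IS NOT NULL
            AND NOT (starttime < freezetime AND freezetime <= endtime))
        OR (unfreezetime IS NOT NULL AND unfreezetime < endtime)
        OR (deactivatetime IS NOT NULL AND deactivatetime < endtime);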

4.3 User authentication

The authentication system lets the domserver know which user it is dealing with and which role(s) the user has. The 6.0 version of DOMjudge only supports username/password authentication.

Each user receives a password and PHP's session management is used to keep track of which user is logged in. This does require the administrator to generate users and passwords for all teams (which can be done in the jury interface) and to distribute those. Each team has to log in each time they (re)start their browser. The password is stored as a salted hash in the password field of the user table.

4.4 Providing testdata

Testdata is used to judge the problems: when a submission run is given the input testdata, the resulting output is compared to the reference output data using a compare script. The default compare script simply checks if the outputs are equal up to whitespace differences, but more elaborate comparisons can be done, see e.g. the float and boolfind_cmp scripts.

The database has a separate table named testcase, which can be manipulated from the web interface. Under a problem, click on the testcase link. There the files can be uploaded. The judgehosts cache a copy based on MD5 sum, so if you need to make changes later, re-upload the data in the web interface and it will automatically be picked up.

Testdata can also be imported into the system from a problem zip file, following the Kattis problem package format.

4.5 Start the daemons

Once everything is configured, you can start the daemons. They all run as a normal user on the system. The needed root privileges are gained through sudo only when necessary.

4.6 Check that everything works

If the daemons have started without any problems, you've come a long way! Now to check that you're ready for a contest.

First, go to the jury interface: http(s)://yourhost.example.edu/domjudge/jury. Look under all the menu items to see whether the displayed data looks sane. Use the config-checker under `Admin Functions' for some sanity checks on your configuration.

Go to a team workstation and see if you can access the team page and if you can submit solutions.

Next, it is time to submit some test solutions. If you have the default Hello World problem enabled, you can submit some of the example sources from under the doc/examples directory. They should give `CORRECT'.

You can also try some (or all) of the sources under the tests directory. Use make check to submit a variety of tests; this should work when the submit client is available and the default example problems are in the active contest. There's also make stress-test, but be warned that these tests might crash a judgedaemon. The results can be checked in the web interface; each source file specifies the expected outcome with some explanations. For convenience, there is a link judging verifier in the admin web interface; this will automatically check whether submitted sources from the tests directory were judged as expected. Note that a few sources have multiple possible outcomes: these must be verified manually.

When all this worked, you're quite ready for a contest. Or at least, the practice session of a contest.

4.7 Testing jury solutions

Before running a real contest, you and/or the jury will want to test the jury's reference solutions on the system.

The simplest way to do this is to include the jury solutions in a problem zip file and upload this. You can also upload a zip file containing just solutions to an existing problem. Note that the zip archive has to adhere to the Kattis problem package format. For this to work, the jury/admin who uploads the problem has to have an associated team to which the solutions will be assigned. The solutions will automatically be judged if the contest is active (but it need not have started yet). You can verify whether the submissions gave the expected answer from the link on the jury/admin index page.

