Hi Maarten,

Sorry for the late reply.

All recent NWERCs, and probably most contests, use problemtools nowadays. It doesn't use runguard, but that shouldn't matter much.
Then on your CI system, you can simply run verifyproblem from problemtools to check that your submissions give the expected results.
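In case it helps, such a CI step could look roughly like this. It's only a minimal sketch: it assumes problemtools is already installed on the CI runner, and that each problem lives in its own directory under problems/ (that layout is my assumption, not something problemtools requires):

```shell
#!/bin/sh
# Minimal sketch of a CI step. Assumes problemtools is installed and
# problems are laid out one-per-directory under problems/ (hypothetical
# layout -- adjust the glob to your repository).
set -e
for problem in problems/*/; do
    # verifyproblem compiles the jury submissions in the problem package
    # and checks that each gets the verdict its directory implies
    # (accepted, wrong_answer, time_limit_exceeded, ...).
    verifyproblem "$problem"
done
```

With `set -e`, the script exits non-zero as soon as one problem fails verification, which is what you want for a CI job.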

Be aware that you will probably want to update languages.yaml (https://github.com/Kattis/problemtools/blob/develop/problemtools/config/languages.yaml), which specifies the languages, i.e. their compile and run commands.
See also https://github.com/Kattis/problemtools#configuration
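For example, to bring the compiler invocation in line with what your DOMjudge judgehosts use, you could override an entry in that file. The sketch below follows the structure of the linked languages.yaml; the concrete flags are assumptions on my side, so compare them against your actual judgehost configuration:

```yaml
# Illustrative override of one languages.yaml entry. The keys mirror
# the linked config file; the flags shown are assumptions -- match
# them to your DOMjudge judgehost settings.
cpp:
  name: 'C++'
  priority: 1000
  files: '*.cc *.cpp'
  compile: 'g++ -g -O2 -std=gnu++17 -static -o "{binary}" {files}'
  run: '{binary}'
```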

Best,
Tobi

On Mon, Nov 26, 2018 at 8:12 PM Maarten Sijm <M.P.Sijm@student.tudelft.nl> wrote:
Hi,

I am planning to set up automatic testing of jury solutions using continuous integration.
From the admin manual (section 4.7), I've found that testing jury solutions can be done by uploading the problems to the system, but I would like to avoid this manual step.
I already have a naive shell script set up that does some basic diffing, which works for simple problems.
However, I thought there might already be a tool that automatically verifies solutions.

I've found that Kattis has a tool to verify problems: https://github.com/Kattis/problemtools
However, I can imagine that this `verifyproblem` tool uses different settings from those used on a DOMjudge instance.

Could I get some advice on how I could best set this up?

Kind regards,
M.P. Sijm
_______________________________________________
DOMjudge-devel mailing list
DOMjudge-devel@domjudge.org
https://www.domjudge.org/mailman/listinfo/domjudge-devel