Configurable SAT Solver Challenge (CSSC) 2013
Thanks to all participants! There is now a technical report with the results of CSSC 2013, as well as an arXiv paper about the CSSC.
Results

We configured each solver for each benchmark and measured its number of timeouts and its runtime on solved instances (all on test instances disjoint from those used for configuration). We summed these measures across the benchmarks in each track and, for reference, also computed the same measures for the default configurations (a minimal sketch of this aggregation is given after the tables below). Here are the results:
Table: Final results for CSSC Application track

Rank   | Solver                    | CSSC #(timeouts) | CSSC avg. time for solved | Default #(timeouts) | Default avg. time for solved
-------|---------------------------|------------------|---------------------------|---------------------|-----------------------------
Gold   | lingeling                 |  115             | 12.78                     |  136                | 16.35
Silver | riss3g                    |  117             | 10.18                     |  122                | 10.80
Bronze | Solver43                  |  127             | 13.17                     |  127                | 14.46
       | forl-nodrup               |  128             | 15.07                     |  152                | 19.01
       | simpsat                   |  128             | 20.27                     |  134                | 19.59
       | clasp-cssc                |  130             |  9.97                     |  163                | 11.21
       | sat4j                     |  176             | 19.46                     |  184                | 21.37
       | gnovelty+GCwa             | 1090             | 23.88                     | 1131                |  7.30
       | gnovelty+PCL              | 1099             |  9.62                     | 1101                | 14.07
       | gnovelty+GCa              | 1104             | 10.97                     | 1129                | 14.81
       | riss3gExt (hors concours) |   82             |  8.25                     |  123                | 10.53
Table: Final results for CSSC Crafted track

Rank   | Solver                    | CSSC #(timeouts) | CSSC avg. time for solved | Default #(timeouts) | Default avg. time for solved
-------|---------------------------|------------------|---------------------------|---------------------|-----------------------------
Gold   | clasp-cssc                |   96             | 13.87                     |  139                |  7.10
Silver | forl-nodrup               |   98             | 15.27                     |  135                | 10.01
Bronze | lingeling                 |  107             | 10.75                     |  148                |  7.51
       | riss3g                    |  131             |  7.52                     |  148                |  7.52
       | simpsat                   |  149             |  9.86                     |  149                |  9.86
       | Solver43                  |  152             |  8.79                     |  156                | 10.75
       | sat4j                     |  161             |  7.44                     |  172                |  9.42
       | gnovelty+GCwa             |  334             | 13.34                     |  375                |  4.33
       | gnovelty+GCa              |  353             |  7.13                     |  423                |  4.74
       | gnovelty+PCL              |  361             | 12.02                     |  378                | 12.02
       | riss3gExt (hors concours) |   44             |  4.71                     |  148                |  7.09
Table: Final results for CSSC Random track

Rank   | Solver                    | CSSC #(timeouts) | CSSC avg. time for solved | Default #(timeouts) | Default avg. time for solved
-------|---------------------------|------------------|---------------------------|---------------------|-----------------------------
Gold   | clasp-cssc                |  250             |  1.58                     |  261                | 14.10
Silver | lingeling                 |  250             |  4.20                     |  258                | 13.24
Bronze | riss3g                    |  250             |  7.68                     |  260                | 11.51
       | Solver43                  |  253             | 12.32                     |  256                | 11.34
       | simpsat                   |  254             | 13.85                     |  254                | 13.95
       | sat4j                     |  255             | 14.96                     |  257                | 16.33
       | forl-nodrup               |  258             | 11.61                     |  289                | 14.75
       | gnovelty+GCwa             |  375             | 13.40                     |  382                | 22.64
       | gnovelty+GCa              |  378             | 19.58                     |  537                | 48.11
       | gnovelty+PCL              |  385             | 44.46                     |  624                |  0.11
       | riss3gExt (hors concours) |  250             |  7.20                     |  261                | 11.00
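For concreteness, here is a minimal Python sketch of the per-track aggregation described above. The data layout and all names are illustrative assumptions, not the organizers' actual evaluation scripts, and it assumes that "avg. time for solved" pools the solved test instances across a track's benchmarks.

```python
# Minimal sketch of the per-track aggregation: sum timeouts across all
# benchmarks of a track and average the runtime over all solved test
# instances. Data layout and names are illustrative assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class InstanceResult:
    runtime: float    # measured runtime on a test instance (seconds)
    timed_out: bool   # True if the run hit the cutoff


def aggregate_track(results_per_benchmark: List[List[InstanceResult]]):
    """Return (#timeouts summed over benchmarks, avg. runtime over solved instances)."""
    timeouts = 0
    solved_runtimes = []
    for benchmark in results_per_benchmark:
        for res in benchmark:
            if res.timed_out:
                timeouts += 1
            else:
                solved_runtimes.append(res.runtime)
    avg_time_solved = (sum(solved_runtimes) / len(solved_runtimes)
                       if solved_runtimes else float("nan"))
    return timeouts, avg_time_solved


# Made-up example: one benchmark with two instances, one with a single instance.
demo = [[InstanceResult(12.3, False), InstanceResult(300.0, True)],
        [InstanceResult(4.5, False)]]
print(aggregate_track(demo))  # -> (1, 8.4)
```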
These results show quite clearly that algorithm configuration has a major impact on the ranking of algorithms (e.g., none of the top 3 solvers in the CSSC would have been in the top 3 had we only looked at default configurations). Here are the detailed measures for each benchmark, and here are scatter plots of test performance for the default vs. the optimized configuration (for each solver and each benchmark).
Instances

We will shortly provide separate downloads for each of the benchmark instance sets.
CSSC directory structure

Here is the entire CSSC directory structure (with the exception of the instances; see above). This includes all solvers, configuration spaces, scenario files, configurators, and results.
Solver configurations

For each solver, we provide a file with the optimized configuration for each of the benchmarks. The configurations are given as call strings that can be executed from the CSSC directory structure. Each line corresponds to one benchmark, with the first line being the default configuration.
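As a rough illustration, here is a small Python sketch of how such a per-solver file might be parsed and one of its configurations run. The file path, instance path, cutoff, and the assumption that the instance file is simply appended to the call string are all hypothetical; only the line layout (one call string per line, the first line being the default) follows the description above.

```python
# Sketch of using a per-solver configuration file (assumptions noted above).

import shlex
import subprocess


def load_callstrings(path):
    """Return the call strings, one per non-empty line (first entry = default)."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]


def run_configuration(callstring, instance, cutoff_seconds=300):
    """Run one configuration from the CSSC directory structure (assumed to be
    the current working directory), appending the instance file to the call
    string. Raises subprocess.TimeoutExpired if the cutoff is hit."""
    cmd = shlex.split(callstring) + [instance]
    return subprocess.run(cmd, capture_output=True, text=True,
                          timeout=cutoff_seconds)


# Hypothetical usage (file and instance names are placeholders):
# callstrings = load_callstrings("configurations/lingeling.txt")
# default_config = callstrings[0]
# result = run_configuration(default_config, "instances/application/example.cnf")
# print(result.stdout)
```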
Please send any questions, concerns, or comments to cssc.organizers@gmail.com.