SMAC: Sequential Model-based Algorithm Configuration
Bioinformatics, and Empirical & Theoretical Algorithmics Laboratory (β-Lab)
July 19th, 2015 | New release (version 2.10.03) with a minor bug fix.
July 5th, 2015 | New release (version 2.10.02) with a new PCS format and other bug fixes. The license is now AGPLv3.
August 4th, 2014 | New release (version 2.08.00), which should be much simpler to use.
October 2013 | New release (version 2.06.01) with some minor bug fixes.
August 28th, 2013 | New release (version 2.06.00) with a number of bug fixes and usability improvements.
August 7th, 2013 | New release (version 2.04.02) with a pair of bug fixes. A beta release (2.06.00b) with some new utilities and further bug fixes has also been posted.
February 16, 2013 | New release (version 2.04.01), including bug fixes, usability improvements, and minor feature improvements.
October 25th, 2012 | After a series of internal releases, SMAC version 2 has been publicly released. In short, this is a complete rewrite of SMAC in Java that features many improvements, is well documented, and is portable and easy to use.
February 1, 2012 | A substantially improved version of SMAC will be available soon; if you want to start using SMAC in the meantime, please send a quick email to Frank. We also plan to provide a quickstart guide similar to the one for ParamILS, as well as a Java implementation of SMAC.
September 9, 2011 | First version of this page set up. Before this, SMAC was only available upon request.
SMAC (sequential model-based algorithm configuration) is a versatile tool for optimizing algorithm parameters (or the parameters of some other process we can run automatically, or a function we can evaluate, such as a simulation).
SMAC has helped us speed up both local search and tree search algorithms by orders of magnitude on certain instance distributions. Recently, we have also found it to be very effective for the hyperparameter optimization of machine learning algorithms, scaling better to high dimensions and discrete input dimensions than other algorithms. Finally, the predictive models SMAC is based on can also capture and exploit important information about the model domain, such as which input variables are most important.
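To make this concrete, the sketch below shows the core sequential model-based optimization (SMBO) loop that SMAC instantiates: fit a regression model (SMAC uses random forests) on the configurations evaluated so far, use the model's predictive mean and uncertainty to score candidate configurations with an expected-improvement criterion, evaluate the most promising one, and repeat. The toy objective, the use of scikit-learn, and all function names here are illustrative assumptions, not the actual Java implementation distributed on this page.

```python
# Illustrative SMBO loop in the spirit of SMAC (a sketch, not SMAC's real code).
# Assumes NumPy, SciPy, and scikit-learn; the objective is a stand-in for
# "any process we can run automatically".
import numpy as np
from scipy.stats import norm
from sklearn.ensemble import RandomForestRegressor

def objective(x):
    # Hypothetical target: pretend this runs an algorithm with parameters x
    # and returns its cost (e.g. runtime); lower is better.
    return np.sum((x - 0.3) ** 2) + 0.01 * np.random.randn()

rng = np.random.default_rng(0)
dim = 4
X = rng.uniform(0, 1, size=(5, dim))            # small initial design
y = np.array([objective(x) for x in X])

for _ in range(30):
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

    # Predictive mean/uncertainty from the per-tree predictions,
    # in the spirit of SMAC's random-forest model.
    cand = rng.uniform(0, 1, size=(500, dim))
    per_tree = np.stack([t.predict(cand) for t in model.estimators_])
    mu, sigma = per_tree.mean(axis=0), per_tree.std(axis=0) + 1e-9

    # Expected improvement over the best cost seen so far (minimization).
    best = y.min()
    z = (best - mu) / sigma
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # Evaluate the most promising candidate and add it to the history.
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best cost found:", y.min())
```

The real tool goes well beyond this sketch: it handles categorical and conditional parameters, multiple problem instances, and censored (capped) runs, as described in the publications listed below.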
We hope you find SMAC similarly useful. Ultimately, we hope that it helps algorithm designers focus on tasks that are more scientifically valuable than parameter tuning.
For any comments, questions, or concerns, please check out the SMAC forum.
Frank Hutter, Holger Hoos, and Kevin Leyton-Brown. Parallel Algorithm Configuration. In: Learning and Intelligent Optimization (LION 6). To appear.
Frank Hutter, Holger Hoos, and Kevin Leyton-Brown. Bayesian Optimization With Censored Response Data. In: 2011 NIPS Workshop on Bayesian Optimization, Experimental Design, and Bandits.