minConf is a set of Matlab functions for optimization of differentiable real-valued multivariate functions subject to simple constraints on the parameters. On many problems, the functions in minConf can solve problems more efficiently than Matlab's fmincon function; they can also handle a much larger number of variables, and they use line searches that are robust to several common function pathologies.
min_x f(x), subject to LB <= x <= UB

The default options use a quasi-Newton strategy, where limited-memory BFGS updates are used to compute the step direction, and a backtracking line search is used to find a step satisfying an Armijo condition along the projection arc. You can change the number of L-BFGS corrections to store for the quasi-Newton update (options.corrections), and whether or not to use skipping or damping of the L-BFGS updates (options.damped). If funObj returns the Hessian of the objective function as its third output argument, you can use the Hessian explicitly by setting options.method to newton. To use this function, you need to specify the vectors LB and UB giving the lower and upper bounds on the variables. Elements of LB/UB can be set to -inf/inf if the corresponding variables are unconstrained below/above.
Some examples of its usage can be found here.
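As an additional minimal sketch of the interface for this function, consider a small bound-constrained quadratic. The objective quadObj below is made up for this illustration and is not part of minConf; the calling signature follows the anonymous-function example at the end of this page, and the option value for options.method is assumed to be passed as a string.

% Save the following as quadObj.m (hypothetical example objective):
function [f,g,H] = quadObj(x)
A = [3 1;1 2]; b = [-1;4];   % fixed problem data for the example
f = 0.5*x'*A*x - b'*x;       % objective value
g = A*x - b;                 % gradient
H = A;                       % Hessian (constant for a quadratic)

% Then, in a script or at the command prompt:
x0 = zeros(2,1);
LB = zeros(2,1); UB = ones(2,1);   % simple bound constraints
options = [];
options.method = 'newton';         % use the Hessian returned as the third output
x = minConf_TMP(@quadObj,x0,LB,UB,options);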
min_x f(x), subject to x is in X

The default options use a Barzilai-Borwein scaling of the gradient, and a non-monotone Armijo line search along the feasible direction to find the next iterate. You can change the number of iterations to remember in the non-monotone line search (options.memory), the type of Barzilai-Borwein scaling (options.bbType), whether to backtrack along the projection arc (options.curvilinear), whether to compute an additional projection at each iteration to test optimality (options.testOpt), whether or not to use the Barzilai-Borwein scaling (options.useSpectral), and whether or not to project the initial parameters (options.feasibleInit). To use this function, you need to specify a function funProj that computes the projection onto the convex set:
funProj(x) = argmin_y ||x - y||_2, subject to y is in X

Included with minConf are some examples of projection functions: boundProject computes the projection when the convex set consists of simple bound constraints, projectSimplex computes the projection onto the probability simplex, and linearProject uses quadprog to project onto general linear constraints (LB <= x <= UB, Aeq*x = beq, and/or A*x + b >= 0).
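As a sketch of how this might be used (assuming a calling convention of the form minConf_SPG(funObj,x0,funProj,options), that projectSimplex takes the point as its only argument, and reusing the hypothetical quadObj helper from the sketch above):

funProj = @(x)projectSimplex(x);   % projection onto the probability simplex
x0 = ones(2,1)/2;                  % a starting point already in the simplex
options = [];
options.testOpt = 1;               % extra projection per iteration to test optimality
options.memory = 10;               % iterations remembered by the non-monotone line search
x = minConf_SPG(@quadObj,x0,funProj,options);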
min_x f(x), subject to x is in X

The default options use limited-memory BFGS quasi-Newton updates to form a quadratic approximation to the function, use minConf_SPG to minimize this quadratic approximation over the set, and use an Armijo backtracking line search along the feasible direction to find the next iterate. You can change the number of BFGS corrections to store (options.corrections), the number of iterations of SPG to run (options.SPGiters), the optimality tolerance for the SPG sub-routine (options.SPGoptTol), whether to initialize the SPG sub-routine with the Barzilai-Borwein step (options.bbInit), the maximum number of projections (options.maxProject), and whether or not to use quadratic interpolation to initialize the step length in the line search (options.adjustStep).
There is also a separate webpage dedicated to this method here, and some examples of its usage can be found here.
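Here is a corresponding sketch for this function, assuming it shares the funObj/funProj interface, i.e. minConf_PQN(funObj,x0,funProj,options), and that boundProject takes the point and the bounds as boundProject(x,LB,UB); quadObj is the made-up helper from the first sketch above.

LB = zeros(2,1); UB = ones(2,1);
funProj = @(x)boundProject(x,LB,UB);   % projection onto the box [LB,UB]
x0 = zeros(2,1);
options = [];
options.corrections = 10;              % L-BFGS corrections to store
options.SPGiters = 100;                % SPG iterations per subproblem
x = minConf_PQN(@quadObj,x0,funProj,options);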
anonFunc = @(x)funObj(x,arg1,arg2);
minConf_TMP(anonFunc,x,LB,UB,options);

If the .zip file does not contain a compiled .mex version of lbfgsC.mex*** for your operating system, you will need to run mexAll before using minConf_TMP or minConf_PQN (please e-mail me the compiled mex file so I can include it in the .zip file).
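For concreteness, here is a sketch of the anonymous-function pattern above with a made-up objective that takes extra data arguments; leastSquaresObj is hypothetical and not part of minConf.

% Save the following as leastSquaresObj.m:
function [f,g] = leastSquaresObj(x,Adata,bdata)
r = Adata*x - bdata;   % residuals
f = 0.5*(r'*r);        % least-squares objective value
g = Adata'*r;          % gradient

% Then wrap the data arguments in an anonymous function and call the solver:
Adata = randn(20,5); bdata = randn(20,1);
LB = -ones(5,1); UB = ones(5,1);
options = [];
options.corrections = 10;
anonFunc = @(x)leastSquaresObj(x,Adata,bdata);
x = minConf_TMP(anonFunc,zeros(5,1),LB,UB,options);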
To reference minConf in a publication, please include my name and a link to this website. You may also want to include the function used and the date, since I may update the software in the future.