ConvexSolver Performance
We are often asked how large the problems/models handled by the ojAlgo mathematical programming solvers can be. In particular, we get this question in relation to the MarkowitzModel class.
To examine this we have created a simple test. It builds a simplified, streamlined version of a Markowitz portfolio optimisation model.
Compared to using ojAlgo's MarkowitzModel class, and the performance you could expect with it, there are three things to keep in mind:
- The MarkowitzModel class can be used in three different ways: "target variance", "target return" and "risk aversion". The model here corresponds to the "risk aversion" problem formulation (sketched just after this list). The "target variance" and "target return" use cases could be something like 10 times slower, because they actually solve a sequence of optimisation problems.
- The covariance matrix used here is much simpler than a realistic one - it is a diagonal matrix. A solver could exploit that structure to greatly increase performance, but ojAlgo does not have that feature. We therefore assume performance will be similar to that of a more realistic, dense covariance matrix.
- Compared to these timings, the MarkowitzModel class would add overhead for building the optimisation model. Apart from the model here being somewhat simplified, model creation time is not measured at all - only the call to minimise() is timed.
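For reference, the QP built by the test below is the risk-aversion formulation. In the notation used here (our own, not from the code): w are the asset weights, r the expected returns, Σ the covariance matrix (diagonal in this test) and q the risk aversion factor, which the code fixes to 2:

```latex
\begin{aligned}
\min_{w} \quad & q \, w^{\top} \Sigma \, w \;-\; r^{\top} w \\
\text{subject to} \quad & \sum_{i} w_i = 1, \qquad w_i \ge 0 .
\end{aligned}
```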
```java
import static org.ojalgo.constant.BigMath.*;

import org.ojalgo.OjAlgoUtils;
import org.ojalgo.netio.BasicLogger;
import org.ojalgo.optimisation.Expression;
import org.ojalgo.optimisation.ExpressionsBasedModel;
import org.ojalgo.optimisation.Optimisation.Result;
import org.ojalgo.optimisation.Optimisation.State;
import org.ojalgo.optimisation.Variable;
import org.ojalgo.random.Uniform;

/**
 * This example creates random/simple quadratic programming (QP) problems - similar to what the MarkowitzModel
 * portfolio optimiser would create - at increasing sizes. The execution speed is measured for each problem
 * size. The problem size is doubled for every iteration. Theoretically, regardless of how long you're willing
 * to wait, the ojAlgo convex solvers have an absolute limit somewhere around 20000 variables.
 *
 * @author apete
 */
public class ConvexSolverPerformance {

    /**
     * Random number [0.0%,20.0%)
     */
    private static final Uniform UNIFORM_20 = new Uniform(0.0, 0.2);

    public static void main(final String[] args) {

        BasicLogger.debug();
        BasicLogger.debug(ConvexSolverPerformance.class.getSimpleName());
        BasicLogger.debug(OjAlgoUtils.getTitle());
        BasicLogger.debug(OjAlgoUtils.getDate());
        BasicLogger.debug();

        // Tiny bit of warmup to get rid of the worst timing problems
        ConvexSolverPerformance.buildModel(3).minimise();
        ConvexSolverPerformance.buildModel(33).minimise();
        ConvexSolverPerformance.buildModel(333).minimise();

        for (int exp = 1; exp <= 12; exp++) {

            final int tmpNumberOfVariables = (int) Math.pow(2, exp);

            final ExpressionsBasedModel tmpModel = ConvexSolverPerformance.buildModel(tmpNumberOfVariables);

            final long tmpBefore = System.nanoTime();
            final Result tmpResult = tmpModel.minimise();
            final long tmpAfter = System.nanoTime();

            final State tmpState = tmpResult.getState();
            final double tmpTime = (tmpAfter - tmpBefore) / 1_000_000_000.0;

            BasicLogger.debug();
            // Print the problem size, execution time and result state
            BasicLogger.debug("{} variables =>\t{} in {}s", tmpNumberOfVariables, tmpState, tmpTime);

            // If the problem is small; also print the entire model
            if (tmpNumberOfVariables <= 10) {
                BasicLogger.debug(tmpModel);
            }
        }
    }

    private static ExpressionsBasedModel buildModel(final int numberOfVariables) {

        // One variable per asset: nonnegative weight, with a (negated) random expected return as the linear objective term
        final Variable[] tmpVariables = new Variable[numberOfVariables];
        for (int i = 0; i < numberOfVariables; i++) {
            tmpVariables[i] = Variable.make("V" + Integer.toString(i)).lower(ZERO).weight(-UNIFORM_20.doubleValue());
        }

        final ExpressionsBasedModel retVal = new ExpressionsBasedModel(tmpVariables);

        // The weights must sum to 1 (100%)
        final Expression tmp100P = retVal.addExpression("Balance");
        for (final Variable tmpVariable : tmpVariables) {
            tmp100P.set(tmpVariable, ONE);
        }
        tmp100P.level(ONE);

        // Quadratic objective term: a random diagonal "covariance matrix", scaled by the risk aversion factor 2
        final Expression tmpVar = retVal.addExpression("Variance");
        for (final Variable tmpVariable : tmpVariables) {
            tmpVar.set(tmpVariable, tmpVariable, UNIFORM_20);
        }
        tmpVar.weight(TWO);

        return retVal;
    }

}
```
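As a usage note (not part of the original test), here is a minimal sketch of how the solution itself could be inspected after calling minimise(). It assumes the code is placed inside ConvexSolverPerformance (so that the private buildModel method and the imports above are available), that the Result can be read as an Access1D view of the solution, and that the solved values are written back to the Variable instances - treat the details as an assumption to verify against the ojAlgo version you use.

```java
// Sketch only: assumes this runs inside ConvexSolverPerformance, e.g. at the end of main(...)
final ExpressionsBasedModel model = ConvexSolverPerformance.buildModel(8);
final Result result = model.minimise();

if (result.getState().isOptimal()) {

    // The objective value (here: 2 * variance - expected return)
    BasicLogger.debug("Objective value: {}", result.getValue());

    // The Result is an Access1D; indices follow the order in which the variables were created
    for (int i = 0; i < model.countVariables(); i++) {
        BasicLogger.debug("w[{}] = {}", i, result.doubleValue(i));
    }

    // Alternatively, the solved values are available from the Variable instances themselves
    for (final Variable variable : model.getVariables()) {
        BasicLogger.debug("{} = {}", variable.getName(), variable.getValue());
    }
}
```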
Running ConvexSolverPerformance produces output like this:

```
ConvexSolverPerformance
ojAlgo
2016-06-21

2 variables =>	OPTIMAL in 0.007090837s
############################################
0 <= V0: 0.675025 (-0.177244) <= 1.000000
0 <= V1: 0.324975 (-0.042475) <= 1.000000
1.000000 <= Balance: 1.0 <= 1.000000
Variance: 0.053618 (2.000000)
############################################

4 variables =>	OPTIMAL in 0.002845425s
############################################
0 <= V0: 0.633435 (-0.077728)
0 <= V1: 0.124587 (-0.027958)
0 <= V2: 0 (-0.007380)
0 <= V3: 0.241978 (-0.090821)
1.000000 <= Balance: 1.0 <= 1.000000
Variance: 0.016004 (2.000000)
############################################

8 variables =>	OPTIMAL in 0.003616068s
############################################
0 <= V0: 0 (-0.045985)
0 <= V1: 0 (-0.077335)
0 <= V2: 0.078457 (-0.149730)
0 <= V3: 0.023964 (-0.136193)
0 <= V4: 0.897578 (-0.160384)
0 <= V5: 0 (-0.116369)
0 <= V6: 0 (-0.036489)
0 <= V7: 0 (-0.008732)
1.000000 <= Balance: 1.0 <= 1.000000
Variance: 0.00677 (2.000000)
############################################

16 variables =>	OPTIMAL in 0.008724122s

32 variables =>	OPTIMAL in 0.013094184s

64 variables =>	OPTIMAL in 0.042818407s

128 variables =>	OPTIMAL in 0.059538885s

256 variables =>	OPTIMAL in 0.165197883s

512 variables =>	OPTIMAL in 0.558241339s

1024 variables =>	OPTIMAL in 2.257617094s

2048 variables =>	OPTIMAL in 15.365805817s

4096 variables =>	OPTIMAL in 68.704221689s
```