This vignette describes the modelling techniques available in ompr to make your life easier when developing a mixed integer programming model.

A MILP Model

You can think of a MIP model as a big constraint matrix and a set of vectors. But you can also think of it as a set of decision variables, an objective function and a number of constraints expressed as equations/inequalities. ompr implements the latter approach.

For example, Wikipedia describes the Knapsack problem like this: \[ \begin{equation*} \begin{array}{ll@{}ll} \text{max} & \displaystyle\sum\limits_{i=1}^{n} v_{i}x_{i} & &\\ \text{subject to}& \displaystyle\sum\limits_{i=1}^{n} w_{i}x_{i} \leq W, & &\\ & x_{i} \in \{0,1\}, &i=1 ,\ldots, n& \end{array} \end{equation*} \]

This is the ompr equivalent:

n <- 10; W <- 2
v <- runif(n);w <- runif(n)
model <- MILPModel() %>% 
  add_variable(x[i], i = 1:n, type = "binary") %>% 
  set_objective(sum_expr(colwise(v[i]) * x[i], i = 1:n)) %>% 
  add_constraint(sum_expr(colwise(w[i]) * x[i], i = 1:n) <= W)

The overall idea is to use modern R idioms to construct models like the one above as readably as possible, directly in R. ompr does the heavy lifting, transforming everything into matrices/vectors and passing it to your favorite solver.

Vectorized semantics

ompr supports different backends. A backend is the empty model to which you add variables, constraints etc. Currently two backends exist: MIPModel and MILPModel. This vignette describes the latter, as the former will eventually be deprecated.

Compared to the old MIPModel backend, MILPModel has vectorized semantics, meaning that model variables accept and expect vectors. This enables a speedup by a factor of 1000 and more. More details can be found at the end of this document.

Pipes

Each function in ompr creates an immutable copy of the model. In addition, the function interface has been designed to work with magrittr pipes. You always start with an empty model and add components to it.
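For completeness, a minimal model along the following lines (a single continuous variable, a maximization objective and one constraint) produces the summary below:

library(ompr)
library(magrittr)

MIPModel() %>% 
  add_variable(x) %>%                 # one continuous variable
  set_objective(x, sense = "max") %>% # maximize x
  add_constraint(x <= 1)              # a single constraint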

## Mixed integer linear optimization problem
## Variables:
##   Continuous: 1 
##   Integer: 0 
##   Binary: 0 
## Model sense: maximize 
## Constraints: 1

Variable types

Variables can be of type continuous, integer or binary.

MIPModel() %>% 
  add_variable(x, type = "integer") %>% 
  add_variable(y, type = "continuous") %>% 
  add_variable(z, type = "binary")
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 1 
##   Integer: 1 
##   Binary: 1 
## No objective function. 
## Constraints: 0

Variable bounds

Variables can have lower and upper bounds.

MIPModel() %>% 
  add_variable(x, lb = 10) %>% 
  add_variable(y, lb = 5, ub = 10)
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 2 
##   Integer: 0 
##   Binary: 0 
## No objective function. 
## Constraints: 0

Indexed variables

Often, when you develop a complex model, you work with indexed variables. This is an important concept that ompr supports.

MILPModel() %>% 
  add_variable(x[i], i = 1:10) %>%  # creates 10 decision variables
  set_objective(x[5]) %>% 
  add_constraint(x[5] <= 10)
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 10 
##   Integer: 0 
##   Binary: 0 
## Model sense: maximize 
## Constraints: 1

Summation over variables

If you have indexed variables then you often want to sum over a subset of variables.

The following code creates a model with three decision variables \(x_1\), \(x_2\), \(x_3\). An objective function \(\sum_i x_i\) and one constraint \(\sum_i x_i \leq 10\).

MILPModel() %>% 
  add_variable(x[i], i = 1:3) %>% 
  set_objective(sum_expr(x[i], i = 1:3)) %>% 
  add_constraint(sum_expr(x[i], i = 1:3) <= 10)
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 3 
##   Integer: 0 
##   Binary: 0 
## Model sense: maximize 
## Constraints: 1

Quantifiers

add_variable, add_constraint, set_bounds and sum_expr all support a common quantifier interface that also supports filter expressions. A more complex example will show what that means.
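For example, a model along the following lines (the variables and filters here are illustrative) produces the summary below. The trailing, unnamed arguments such as i %% 2 == 0 are filter expressions that restrict the respective quantifier:

n <- 10
MILPModel() %>% 
  # an x[i, j] variable for every combination of i and j
  add_variable(x[i, j], type = "binary", i = 1:n, j = 1:n) %>% 
  # a y[i] variable only for even i
  add_variable(y[i], type = "binary", i = 1:n, i %% 2 == 0) %>% 
  # maximize the x[i, j] just below the diagonal
  set_objective(sum_expr(x[i, j], i = 1:n, j = 1:n, i == j + 1)) %>% 
  # one constraint for each even i ...
  add_constraint(sum_expr(x[i, j], j = 1:n) <= 1, i = 1:n, i %% 2 == 0) %>% 
  # ... and one constraint for each i, without any filter
  add_constraint(sum_expr(x[i, j], j = 1:n) <= 2, i = 1:n)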

## Mixed integer linear optimization problem
## Variables:
##   Continuous: 0 
##   Integer: 0 
##   Binary: 105 
## Model sense: maximize 
## Constraints: 15

Special bounds on a subset of variables

Imagine you want to model a matching problem with a single binary decision variable \(x_{i,j}\) that is \(1\) iff object \(i\) is matched to object \(j\). One constraint would be to allow matches only if \(i \neq j\). This can be modelled by a constraint or by selectively changing bounds on variables. The latter approach can be used by solvers to improve the solution process.

MILPModel() %>% 
  add_variable(x[i, j], i = 1:10, j = 1:10, 
               type = "integer", lb = 0, ub = 1) %>% 
  set_objective(sum_expr(x[i, j], i = 1:10, j = 1:10)) %>% 
  add_constraint(x[i, i] == 0, i = 1:10) %>% 
  
   # this sets the ub to 0 without adding new constraints
  set_bounds(x[i, i], ub = 0, i = 1:10)
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 0 
##   Integer: 100 
##   Binary: 0 
## Model sense: maximize 
## Constraints: 10

External model parameters

Of course you will need external parameters for your models. You can reuse any variable defined in your R environment within the MIP Model.

n <- 5 # number of our variables
costs <- rpois(n, lambda = 3) # a cost vector
max_elements <- 3
MILPModel() %>% 
  add_variable(x[i], type = "binary", i = 1:n) %>% 
  set_objective(sum_expr(colwise(costs[i]) * x[i], i = 1:n)) %>% 
  add_constraint(sum_expr(x[i], i = 1:n) <= max_elements)
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 0 
##   Integer: 0 
##   Binary: 5 
## Model sense: maximize 
## Constraints: 1

Extract model solutions

Once you have a model, you pass it to a solver and get back a solution. The main interface to extract variable values from a solution is the function get_solution. It returns a data.frame for indexed variables, which makes it easy to work with the values afterwards.

We use ROI and GLPK to solve it.
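A model built and solved along the following lines (a small assignment-type model; the cost vector is just an assumed example parameter) yields a solver log of the shape shown below:

library(ROI)
library(ROI.plugin.glpk)
library(ompr.roi)

n <- 5
# assumed example data: one cost per (i, j) combination of the model
costs <- round(runif(n * n, max = 10))

result <- MILPModel() %>% 
  add_variable(x[i, j], i = 1:n, j = 1:n, type = "binary") %>% 
  set_objective(sum_expr(colwise(costs) * x[i, j], i = 1:n, j = 1:n)) %>% 
  # each i is matched to exactly one j
  add_constraint(sum_expr(x[i, j], j = 1:n) == 1, i = 1:n) %>% 
  solve_model(with_ROI(solver = "glpk", verbose = TRUE))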

## ROI.plugin.glpk: R Optimization Infrastructure
## Registered solver plugins: nlminb, glpk.
## Default solver: auto.
## <SOLVER MSG>  ----
## GLPK Simplex Optimizer, v4.63
## 5 rows, 25 columns, 25 non-zeros
##       0: obj =  -0.000000000e+00 inf =   5.000e+00 (5)
##       5: obj =   2.400000000e+01 inf =   0.000e+00 (0)
## *    14: obj =   4.400000000e+01 inf =   0.000e+00 (0)
## OPTIMAL LP SOLUTION FOUND
## GLPK Integer Optimizer, v4.63
## 5 rows, 25 columns, 25 non-zeros
## 25 integer variables, all of which are binary
## Integer optimization begins...
## +    14: mip =     not found yet <=              +inf        (1; 0)
## +    14: >>>>>   4.400000000e+01 <=   4.400000000e+01   0.0% (1; 0)
## +    14: mip =   4.400000000e+01 <=     tree is empty   0.0% (0; 1)
## INTEGER OPTIMAL SOLUTION FOUND
## <!SOLVER MSG> ----
get_solution(result, x[i, j]) %>% 
  dplyr::filter(value == 1)
##   variable i j value
## 1        x 4 1     1
## 2        x 2 2     1
## 3        x 5 3     1
## 4        x 3 4     1
## 5        x 1 5     1

You can also fix certain indexes.

get_solution(result, x[2, j])
##   variable j value
## 1        x 1     0
## 2        x 2     1
## 3        x 3     0
## 4        x 4     0
## 5        x 5     0

Vectorized semantics revisited

Each variable accepts vectors. The following code snippets show the behaviour by example:

Instead of passing index variables through quantifiers, you can also use vectors directly. Each element of a vector creates a new row for that variable. The two constraint groups below are equivalent.

n <- 10L
MILPModel() %>% 
  add_variable(x[i, j], i = 1:n, j = 1:n) %>% 
  add_constraint(x[i, j] == 1, i = 1:n, j = 1:n, i == j) %>% 
  add_constraint(x[1:n, 1:n] == 1) # this is equivalent
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 100 
##   Integer: 0 
##   Binary: 0 
## No objective function. 
## Constraints: 20

You can also add vectors columnwise using the function colwise or as_colwise:

MILPModel() %>% 
  add_variable(x[i, j], i = 1:n, j = 1:n) %>% 
  add_constraint(sum_expr(x[i, j], i = 1:n) == 1, j = 1:n) %>% 
  add_constraint(x[1:n, colwise(1:n)] == 1) # this is equivalent
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 100 
##   Integer: 0 
##   Binary: 0 
## No objective function. 
## Constraints: 20

Another example:

Say you want to express the following matrix:

\[ \begin{bmatrix} x_{1,1} \\ x_{1,1} + x_{1,2} \\ x_{1,1} + x_{1,2} + x_{1,3} \end{bmatrix} \]

With vectorized semantics you can do the following:

x[1, colwise(1, 1:2, 1:3)]

Or with the support of the ompr function sum_expr:

MILPModel() %>% 
  add_variable(x[i, j], i = 1, j = 1:n) %>% 
  add_constraint(sum_expr(x[1, j], j = colwise(1, 1:2, 1:3)) == 1)
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 10 
##   Integer: 0 
##   Binary: 0 
## No objective function. 
## Constraints: 3

Adding variables - vectorized

Sometimes, you want to control both variable indexes and their bounds very specifically. You can either use filter expressions together with set_bounds or pass in the concrete variable indexes explicitly.

The following two blocks of code are equivalent:

MILPModel() %>% 
  add_variable(x[i, j], i = 1:10, j = 1:10) 
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 100 
##   Integer: 0 
##   Binary: 0 
## No objective function. 
## Constraints: 0
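An equivalent formulation passes the concrete index vectors explicitly, here built with expand.grid; it produces the second summary below and also defines the grid object used in the next example:

grid <- expand.grid(i = 1:10, j = 1:10)
MILPModel() %>% 
  add_variable(x[grid$i, grid$j]) 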
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 100 
##   Integer: 0 
##   Binary: 0 
## No objective function. 
## Constraints: 0

Building on the second block of code, you can also set bounds on the variables in the same way:

MILPModel() %>% 
  add_variable(x[grid$i, grid$j], lb = grid$i * 10, ub = grid$i * 20) 
## Mixed integer linear optimization problem
## Variables:
##   Continuous: 100 
##   Integer: 0 
##   Binary: 0 
## No objective function. 
## Constraints: 0

This is especially handy if you work with large models where a lot of index combinations are invalid. An example is a routing problem, where you define a connection between two cities by a binary variable x[i, j]; a lot of i, j combinations are usually invalid and do not need to enter the model explicitly (although many solvers remove such variables during presolve).
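For example, a filter expression on add_variable keeps invalid self-connections out of such a (hypothetical) routing model from the start:

n <- 10 # number of cities in this illustrative example
MILPModel() %>% 
  # only create connection variables between two distinct cities
  add_variable(x[i, j], i = 1:n, j = 1:n, i != j, type = "binary")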

Feedback

Do you have any questions, ideas, comments? Or did you find a mistake? Let’s discuss on Github.