Predictors

The main entry point for embedding prediction models into JuMP is `add_predictor`.

All methods use the form `y, formulation = MathOptAI.add_predictor(model, predictor, x)` to add the relationship `y = predictor(x)` to `model`.
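
As a minimal sketch of this workflow, assuming the `Affine(A, b)` constructor listed in the table below represents $f(x) = Ax + b$:

```julia
using JuMP, MathOptAI

model = Model()
@variable(model, x[1:2])
# Affine(A, b) represents f(x) = A * x + b, here mapping R^2 -> R^1.
predictor = MathOptAI.Affine([2.0 3.0], [1.0])
# `y` is a vector of new decision variables constrained to equal predictor(x);
# `formulation` records the variables and constraints that were added.
y, formulation = MathOptAI.add_predictor(model, predictor, x)
```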

Supported predictors

The following predictors are supported. See their docstrings for details:

| Predictor | Relationship | Dimensions |
| --- | --- | --- |
| `Affine` | $f(x) = Ax + b$ | $M \rightarrow N$ |
| `BinaryDecisionTree` | A binary decision tree | $M \rightarrow 1$ |
| `GrayBox` | $f(x)$ | $M \rightarrow N$ |
| `Pipeline` | $f(x) = (l_1 \circ \ldots \circ l_N)(x)$ | $M \rightarrow N$ |
| `Quantile` | The quantiles of a distribution | $M \rightarrow N$ |
| `ReLU` | $f(x) = \max.(0, x)$ | $M \rightarrow M$ |
| `ReLUBigM` | $f(x) = \max.(0, x)$ | $M \rightarrow M$ |
| `ReLUQuadratic` | $f(x) = \max.(0, x)$ | $M \rightarrow M$ |
| `ReLUSOS1` | $f(x) = \max.(0, x)$ | $M \rightarrow M$ |
| `Scale` | $f(x) = scale .* x .+ bias$ | $M \rightarrow M$ |
| `Sigmoid` | $f(x) = \frac{1}{1 + e^{-x}}$ | $M \rightarrow M$ |
| `SoftMax` | $f(x) = \frac{e^x}{\sum e^{x_i}}$ | $M \rightarrow M$ |
| `SoftPlus` | $f(x) = \frac{1}{\beta} \log(1 + e^{\beta x})$ | $M \rightarrow M$ |
| `Tanh` | $f(x) = \tanh.(x)$ | $M \rightarrow M$ |

Note that some predictors, such as the ReLU ones, offer multiple formulations of the same mathematical relationship. The "right" choice is solver- and problem-dependent.
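
Predictors can also be chained. As an illustrative sketch (assuming `Pipeline` accepts its layers as positional arguments, and `Affine` and `ReLU` are constructed as in their docstrings), a small two-layer network might be embedded like this:

```julia
using JuMP, MathOptAI

model = Model()
@variable(model, x[1:3])
# A two-layer network: an affine map, a ReLU activation, then a final affine map.
predictor = MathOptAI.Pipeline(
    MathOptAI.Affine([1.0 2.0 3.0; 4.0 5.0 6.0], [0.5, -0.5]),
    MathOptAI.ReLU(),
    MathOptAI.Affine([7.0 8.0], [9.0]),
)
y, formulation = MathOptAI.add_predictor(model, predictor, x)
```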

ReLU

There are a number of different mathematical formulations for the rectified linear unit (ReLU).

  • `ReLU`: requires the solver to support the `max` nonlinear operator.
  • `ReLUBigM`: requires the solver to support mixed-integer linear programs, and requires the user to have prior knowledge of a suitable value for the "big-M" parameter.
  • `ReLUQuadratic`: requires the solver to support quadratic equality constraints.
  • `ReLUSOS1`: requires the solver to support SOS-1 constraints.

The correct choice for which ReLU formulation to use is problem- and solver-dependent.
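
For example, here is a minimal sketch of opting into the big-M formulation (assuming `ReLUBigM` takes the big-M value as its only argument):

```julia
using JuMP, MathOptAI

model = Model()
@variable(model, -1 <= x[1:2] <= 1)
# The big-M value must be a valid bound on the magnitude of the ReLU input;
# 100.0 is assumed to be safe for this model. Solving a model that uses this
# formulation requires a mixed-integer solver.
predictor = MathOptAI.ReLUBigM(100.0)
y, formulation = MathOptAI.add_predictor(model, predictor, x)
```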