ExaModels

ExaModels.ExaModelsModule
ExaModels

An algebraic modeling and automatic differentiation tool in the Julia language, specialized for the SIMD abstraction of nonlinear programs.

For more information, please visit https://github.com/exanauts/ExaModels.jl

source
ExaModels.AdjointNode1Type
AdjointNode1{F, T, I}

A node with one child for the first-order forward pass tree

Fields:

  • x::T: function value
  • y::T: first-order sensitivity
  • inner::I: child
source
ExaModels.AdjointNode2Type
AdjointNode2{F, T, I1, I2}

A node with two children for the first-order forward pass tree

Fields:

  • x::T: function value
  • y1::T: first-order sensitivity w.r.t. first argument
  • y2::T: first-order sensitivity w.r.t. second argument
  • inner1::I1: child #1
  • inner2::I2: child #2
source
ExaModels.AdjointNodeSourceType
AdjointNodeSource{VT}

A source of AdjointNode. adjoint_node_source[i] returns an AdjointNodeVar at index i.

Fields:

  • inner::VT: variable vector
source
ExaModels.CompressorType
Compressor{I}

Data structure for the sparse index

Fields:

  • inner::I: stores the sparse index in tuple form
source
ExaModels.ExaCoreType

ExaCore([array_eltype::Type; backend = nothing, minimize = true])

Returns an intermediate data object ExaCore, which can later be used to create an ExaModel

Example

julia> using ExaModels

julia> c = ExaCore()
An ExaCore

  Float type: ...................... Float64
  Array type: ...................... Vector{Float64}
  Backend: ......................... Nothing

  number of objective patterns: .... 0
  number of constraint patterns: ... 0

julia> c = ExaCore(Float32)
An ExaCore

  Float type: ...................... Float32
  Array type: ...................... Vector{Float32}
  Backend: ......................... Nothing

  number of objective patterns: .... 0
  number of constraint patterns: ... 0

julia> using CUDA

julia> c = ExaCore(Float32; backend = CUDABackend())
An ExaCore

  Float type: ...................... Float32
  Array type: ...................... CUDA.CuArray{Float32, 1, CUDA.DeviceMemory}
  Backend: ......................... CUDA.CUDAKernels.CUDABackend

  number of objective patterns: .... 0
  number of constraint patterns: ... 0
source
ExaModels.ExaModelMethod
ExaModel(core)

Returns an ExaModel object, which can be solved by nonlinear optimization solvers within the JuliaSmoothOptimizers ecosystem, such as NLPModelsIpopt or MadNLP.

Example

julia> using ExaModels

julia> c = ExaCore();                      # create an ExaCore object

julia> x = variable(c, 1:10);              # create variables

julia> objective(c, x[i]^2 for i in 1:10); # set objective function

julia> m = ExaModel(c)                     # create an ExaModel object
An ExaModel{Float64, Vector{Float64}, ...}

  Problem name: Generic
   All variables: ████████████████████ 10     All constraints: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
            free: ████████████████████ 10                free: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
         low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0              low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
          infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
            nnzh: ( 81.82% sparsity)   10              linear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
                                                    nonlinear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
                                                         nnzj: (------% sparsity)
                                                     lin_nnzj: (------% sparsity)
                                                     nln_nnzj: (------% sparsity)

julia> using NLPModelsIpopt

julia> result = ipopt(m; print_level=0)    # solve the problem
"Execution stats: first-order stationary"
source
ExaModels.Node1Type
Node1{F, I}

A node with one child for the symbolic expression tree

Fields:

  • inner::I: child
source
ExaModels.Node2Type
Node2{F, I1, I2}

A node with two children for the symbolic expression tree

Fields:

  • inner1::I1: child #1
  • inner2::I2: child #2
source
ExaModels.ParameterSubexprType
ParameterSubexpr

A parameter-only subexpression whose values are computed once when parameters are set, not at every function evaluation. Use this for expressions that depend only on parameters (θ), not on variables (x). Values are automatically recomputed when set_parameter! is called.

source
ExaModels.ReducedSubexprType
ReducedSubexpr

A reduced-form subexpression that substitutes the expression directly when indexed. No auxiliary variables or constraints are created - the expression is inlined.

source
ExaModels.SIMDFunctionType
SIMDFunction(gen::Base.Generator, o0 = 0, o1 = 0, o2 = 0)

Returns a SIMDFunction constructed from the generator gen.

Arguments:

  • gen: an iterable function specified in Base.Generator format
  • o0: offset for the function evaluation
  • o1: offset for the derivative evaluation
  • o2: offset for the second-order derivative evaluation
source
ExaModels.SecondAdjointNode1Type
SecondAdjointNode1{F, T, I}

A node with one child for the second-order forward pass tree

Fields:

  • x::T: function value
  • y::T: first-order sensitivity
  • h::T: second-order sensitivity
  • inner::I: child
source
ExaModels.SecondAdjointNode2Type
SecondAdjointNode2{F, T, I1, I2}

A node with two children for the second-order forward pass tree

Fields:

  • x::T: function value
  • y1::T: first-order sensitivity w.r.t. first argument
  • y2::T: first-order sensitivity w.r.t. second argument
  • h11::T: second-order sensitivity w.r.t. first argument
  • h12::T: second-order sensitivity w.r.t. first and second argument
  • h22::T: second-order sensitivity w.r.t. second argument
  • inner1::I1: child #1
  • inner2::I2: child #2
source
ExaModels.SubexprType
Subexpr

A subexpression that has been lifted to auxiliary variables with defining equality constraints. Can be indexed like a Variable to get Var nodes for use in objectives and constraints.

source
ExaModels.VarType
Var{I}

A variable node for the symbolic expression tree

Fields:

  • i::I: (parameterized) index
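
For illustration, indexing a Variable yields Var leaves, and applying operations to them builds Node1 and Node2 nodes (a minimal sketch):

using ExaModels

c = ExaCore()
x = variable(c, 3)

# x[1] and x[2] are Var leaves; sin(...) produces a Node1 and + produces a Node2
ex = sin(x[1]) + x[2]^2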
source
ExaModels._recompute_param_subexprs!Method
_recompute_param_subexprs!(c::ExaCore)

Re-evaluates all parameter-only subexpressions and updates their cached values in θ. Called automatically by set_parameter!.

source
ExaModels.constraint!Method
constraint!(c, c1, expr, pars)

Expands the existing constraint c1 in c by adding additional constraint terms specified by expr and pars.

source
ExaModels.constraint!Method
constraint!(c::C, c1, gen::Base.Generator) where {C<:ExaCore}

Expands the existing constraint c1 in c by adding additional constraint terms specified by a generator.

Arguments

  • c::C: The model to which the constraints are added.
  • c1: The constraint object (as returned by constraint) to be expanded.
  • gen::Base.Generator: A generator that produces pairs of constraint index and term to be added.

Example

julia> using ExaModels

julia> c = ExaCore();

julia> x = variable(c, 10);

julia> c1 = constraint(c, x[i] + x[i+1] for i=1:9; lcon = -1, ucon = (1+i for i=1:9));

julia> constraint!(c, c1, i => sin(x[i+1]) for i=4:6)
Constraint Augmentation

  s.t. (...)
       g♭ ≤ (...) + ∑_{p ∈ P} h(x,θ,p) ≤ g♯

  where |P| = 3
source
ExaModels.constraintMethod
constraint(core, n; start = 0, lcon = 0,  ucon = 0)

Adds empty constraints of dimension n, so that terms can be added later with constraint!.
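
For instance, empty constraints can be created first and populated afterwards with constraint! (a minimal sketch):

using ExaModels

c = ExaCore()
x = variable(c, 10)

con = constraint(c, 3)                         # three empty constraints
constraint!(c, con, i => x[i]^2 for i = 1:3)   # add one term to each of the three constraints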

source
ExaModels.constraintMethod
constraint(core, generator; start = 0, lcon = 0,  ucon = 0)

Adds constraints specified by a generator to core, and returns a Constraint object.

Keyword Arguments

  • start: The initial guess of the dual solution. Can either be Number, AbstractArray, or Generator.
  • lcon : The constraint lower bound. Can either be Number, AbstractArray, or Generator.
  • ucon : The constraint upper bound. Can either be Number, AbstractArray, or Generator.

Example

julia> using ExaModels

julia> c = ExaCore();

julia> x = variable(c, 10);

julia> constraint(c, x[i] + x[i+1] for i=1:9; lcon = -1, ucon = (1+i for i=1:9))
Constraint

  s.t. (...)
       g♭ ≤ [g(x,θ,p)]_{p ∈ P} ≤ g♯

  where |P| = 9
source
ExaModels.constraintMethod
constraint(core, expr [, pars]; start = 0, lcon = 0,  ucon = 0)

Adds constraints specified by an expr and pars to core, and returns a Constraint object.

source
ExaModels.drpassMethod
drpass(d::D, y, adj)

Performs dense gradient evaluation via the reverse pass on the computation (sub)graph formed by the forward pass

Arguments:

  • d: first-order computation (sub)graph
  • y: result vector
  • adj: adjoint propagated up to the current node
source
ExaModels.gradient!Method
gradient!(y, f, x, adj)

Performs dense gradient evaluation

Arguments:

  • y: result vector
  • f: the function to be differentiated in SIMDFunction format
  • x: variable vector
  • adj: initial adjoint
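
In user code these kernels are typically reached through the NLPModels interface rather than called directly. A minimal sketch, assuming NLPModels is available:

using ExaModels, NLPModels

c = ExaCore()
x = variable(c, 3)
objective(c, x[i]^2 for i in 1:3)
m = ExaModel(c)

g = grad(m, m.meta.x0)   # dense gradient, computed via ExaModels' reverse-pass kernels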
source
ExaModels.grpassMethod
grpass(d::D, comp, y, o1, cnt, adj)

Performs sparse gradient evaluation via the reverse pass on the computation (sub)graph formed by the forward pass

Arguments:

  • d: first-order computation (sub)graph
  • comp: a Compressor, which helps map counter to sparse vector index
  • y: result vector
  • o1: index offset
  • cnt: counter
  • adj: adjoint propagated up to the current node
source
ExaModels.hdrpassMethod
hdrpass(t1::T1, t2::T2, comp, y1, y2, o2, cnt, adj)

Performs sparse hessian evaluation ((df1/dx)(df2/dx)' portion) via the reverse pass on the computation (sub)graph formed by the second-order forward pass

Arguments:

  • t1: second-order computation (sub)graph regarding f1
  • t2: second-order computation (sub)graph regarding f2
  • comp: a Compressor, which helps map counter to sparse vector index
  • y1: result vector #1
  • y2: result vector #2 (only used when evaluating sparsity)
  • o2: index offset
  • cnt: counter
  • adj: second adjoint propagated up to the current node
source
ExaModels.jrpassMethod
jrpass(d::D, comp, i, y1, y2, o1, cnt, adj)

Performs sparse jacobian evaluation via the reverse pass on the computation (sub)graph formed by the forward pass

Arguments:

  • d: first-order computation (sub)graph
  • comp: a Compressor, which helps map counter to sparse vector index
  • i: constraint index (this is the i-th constraint)
  • y1: result vector #1
  • y2: result vector #2 (only used when evaluating sparsity)
  • o1: index offset
  • cnt: counter
  • adj: adjoint propagated up to the current node
source
ExaModels.multipliersMethod
multipliers(result, y)

Returns the multipliers for constraints y associated with result, obtained by solving the model.

Example

julia> using ExaModels, NLPModelsIpopt

julia> c = ExaCore();

julia> x = variable(c, 1:10, lvar = -1, uvar = 1);

julia> objective(c, (x[i]-2)^2 for i in 1:10);

julia> y = constraint(c, x[i] + x[i+1] for i=1:9; lcon = -1, ucon = (1+i for i=1:9));

julia> m = ExaModel(c);

julia> result = ipopt(m; print_level=0);

julia> val = multipliers(result, y);


julia> val[1] ≈ 0.81933930
true
source
ExaModels.multipliers_LMethod
multipliers_L(result, x)

Returns the multipliers_L for variable x associated with result, obtained by solving the model.

Example

julia> using ExaModels, NLPModelsIpopt

julia> c = ExaCore();

julia> x = variable(c, 1:10, lvar = -1, uvar = 1);

julia> objective(c, (x[i]-2)^2 for i in 1:10);

julia> m = ExaModel(c);

julia> result = ipopt(m; print_level=0);

julia> val = multipliers_L(result, x);

julia> isapprox(val, fill(0, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
source
ExaModels.multipliers_UMethod
multipliers_U(result, x)

Returns the multipliers_U for variable x associated with result, obtained by solving the model.

Example

julia> using ExaModels, NLPModelsIpopt

julia> c = ExaCore();

julia> x = variable(c, 1:10, lvar = -1, uvar = 1);

julia> objective(c, (x[i]-2)^2 for i in 1:10);

julia> m = ExaModel(c);

julia> result = ipopt(m; print_level=0);

julia> val = multipliers_U(result, x);

julia> isapprox(val, fill(2, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
source
ExaModels.objectiveMethod
objective(core::ExaCore, generator)

Adds objective terms specified by a generator to core, and returns an Objective object. Note: it is assumed that the terms are summed.

Example

julia> using ExaModels

julia> c = ExaCore();

julia> x = variable(c, 10);

julia> objective(c, x[i]^2 for i=1:10)
Objective

  min (...) + ∑_{p ∈ P} f(x,θ,p)

  where |P| = 10
source
ExaModels.objectiveMethod
objective(core::ExaCore, expr [, pars])

Adds objective terms specified by an expr and pars to core, and returns an Objective object.

source
ExaModels.parameterMethod
parameter(core, start::AbstractArray)

Adds parameters with initial values specified by start, and returns a Parameter object.

Example

julia> using ExaModels

julia> c = ExaCore();

julia> θ = parameter(c, ones(10))
Parameter

  θ ∈ R^{10}
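
Parameters can be used directly inside objective and constraint expressions. Continuing the example above (a minimal sketch):

x = variable(c, 10)
constraint(c, x[i] - θ[i] for i in 1:10)   # each constraint term reads the current value of θ[i]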
source
ExaModels.set_parameter!Method
set_parameter!(core, param, values)

Updates the values of parameters in the core.

Example

julia> using ExaModels

julia> c = ExaCore();

julia> p = parameter(c, ones(5))
Parameter

  θ ∈ R^{5}

julia> set_parameter!(c, p, rand(5))  # Update with new values
source
ExaModels.sgradient!Method

sgradient!(y, f, x, adj)

Performs sparse gradient evaluation

Arguments:

  • y: result vector
  • f: the function to be differentiated in SIMDFunction format
  • x: variable vector
  • adj: initial adjoint
source
ExaModels.shessian!Method
shessian!(y1, y2, f, x, adj1, adj2)

Performs sparse hessian evaluation

Arguments:

  • y1: result vector #1
  • y2: result vector #2 (only used when evaluating sparsity)
  • f: the function to be differentiated in SIMDFunction format
  • x: variable vector
  • adj1: initial first adjoint
  • adj2: initial second adjoint
source
ExaModels.sjacobian!Method
sjacobian!(y1, y2, f, x, adj)

Performs sparse jacobian evaluation

Arguments:

  • y1: result vector #1
  • y2: result vector #2 (only used when evaluating sparsity)
  • f: the function to be differentiated in SIMDFunction format
  • x: variable vector
  • adj: initial adjoint
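
These kernels back the NLPModels coordinate interface for the constraint Jacobian. A minimal sketch, assuming NLPModels is available:

using ExaModels, NLPModels

c = ExaCore()
x = variable(c, 3)
constraint(c, x[i] * x[i+1] for i in 1:2)
m = ExaModel(c)

rows, cols = jac_structure(m)    # sparsity pattern of the constraint Jacobian
vals = jac_coord(m, m.meta.x0)   # nonzero values at the initial point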
source
ExaModels.solutionMethod
solution(result, x)

Returns the solution for variable x associated with result, obtained by solving the model.

Example

julia> using ExaModels, NLPModelsIpopt

julia> c = ExaCore();

julia> x = variable(c, 1:10, lvar = -1, uvar = 1);

julia> objective(c, (x[i]-2)^2 for i in 1:10);

julia> m = ExaModel(c);

julia> result = ipopt(m; print_level=0);

julia> val = solution(result, x);

julia> isapprox(val, fill(1, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
source
ExaModels.subexprMethod
subexpr(core, generator; reduced=false, parameter_only=false)

Creates a subexpression that can be reused in objectives and constraints.

Three forms are available:

  • Lifted (default, reduced=false): Creates auxiliary variables with defining equality constraints. This generates derivative code once and uses simple variable references thereafter. Adds variables and constraints to the problem.

  • Reduced (reduced=true): Stores the expression for direct substitution when indexed. No auxiliary variables or constraints are created. The expression is inlined wherever used.

  • Parameter-only (parameter_only=true): For expressions that depend only on parameters (θ), not variables (x). Values are computed once when parameters are set, not at every function evaluation. Automatically recomputed when set_parameter! is called.

Both lifted and reduced forms support SIMD-vectorized evaluation and can be nested.

Example

julia> using ExaModels

julia> c = ExaCore();

julia> x = variable(c, 10);

julia> s = subexpr(c, x[i]^2 for i in 1:10)
Subexpression (lifted)

  s ∈ R^{10}

julia> objective(c, s[i] + s[i+1] for i in 1:9);

Reduced form (experimental)

Warning

The reduced form (reduced=true) is experimental and may have issues with complex nested expressions. Use the default lifted form for production code.

c = ExaCore()
x = variable(c, 10)

# Reduced form - no extra variables/constraints
s = subexpr(c, x[i]^2 for i in 1:10; reduced=true)

# s[i] substitutes x[i]^2 directly into the expression
objective(c, s[i] + s[i+1] for i in 1:9)

Parameter-only form

For expressions involving only parameters, use parameter_only=true to evaluate them once when parameters change, rather than at every optimization iteration:

c = ExaCore()
θ = parameter(c, ones(10))
x = variable(c, 10)

# Parameter-only subexpression - computed once per parameter update
weights = subexpr(c, θ[i]^2 + θ[i+1] for i in 1:9; parameter_only=true)

# Use in objective - weights[i] returns cached value, not re-computed
objective(c, weights[i] * x[i]^2 for i in 1:9)

Multi-dimensional example

c = ExaCore()
x = variable(c, 0:T, 0:N)

# Automatically infers 2D structure from Cartesian product
dx = subexpr(c, x[t, i] - x[t-1, i] for t in 1:T, i in 1:N)

# Now dx[t, i] can be used in constraints
constraint(c, dx[t, i] - something for t in 1:T, i in 1:N)
source
ExaModels.variableMethod
variable(core, dims...; start = 0, lvar = -Inf, uvar = Inf)

Adds variables with dimensions specified by dims to core, and returns a Variable object. Each entry of dims can be either an Integer or a UnitRange.

Keyword Arguments

  • start: The initial guess of the solution. Can either be Number, AbstractArray, or Generator.
  • lvar : The variable lower bound. Can either be Number, AbstractArray, or Generator.
  • uvar : The variable upper bound. Can either be Number, AbstractArray, or Generator.

Example

julia> using ExaModels

julia> c = ExaCore();

julia> x = variable(c, 10; start = (sin(i) for i=1:10))
Variable

  x ∈ R^{10}

julia> y = variable(c, 2:10, 3:5; lvar = zeros(9,3), uvar = ones(9,3))
Variable

  x ∈ R^{9 × 3}
source
ExaModels.@register_bivariateMacro
@register_bivariate(f, df1, df2, ddf11, ddf12, ddf22)

Registers a bivariate function f with ExaModels so that it can be used within objective and constraint expressions.

Arguments:

  • f: function
  • df1: derivative function (w.r.t. first argument)
  • df2: derivative function (w.r.t. second argument)
  • ddf11: second-order derivative function (w.r.t. first argument)
  • ddf12: second-order derivative function (w.r.t. first and second arguments)
  • ddf22: second-order derivative function (w.r.t. second argument)

Example

julia> using ExaModels

julia> relu23(x,y) = (x > 0 || y > 0) ? (x + y)^3 : zero(x)
relu23 (generic function with 1 method)

julia> drelu231(x,y) = (x > 0 || y > 0) ? 3 * (x + y)^2 : zero(x)
drelu231 (generic function with 1 method)

julia> drelu232(x,y) = (x > 0 || y > 0) ? 3 * (x + y)^2  : zero(x)
drelu232 (generic function with 1 method)

julia> ddrelu2311(x,y) = (x > 0 || y > 0) ? 6 * (x + y) : zero(x)
ddrelu2311 (generic function with 1 method)

julia> ddrelu2312(x,y) = (x > 0 || y > 0) ? 6 * (x + y) : zero(x)
ddrelu2312 (generic function with 1 method)

julia> ddrelu2322(x,y) = (x > 0 || y > 0) ? 6 * (x + y) : zero(x)
ddrelu2322 (generic function with 1 method)

julia> @register_bivariate(relu23, drelu231, drelu232, ddrelu2311, ddrelu2312, ddrelu2322)
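
Once registered, relu23 can be used inside objective and constraint expressions like any built-in function (a brief sketch):

c = ExaCore()
x = variable(c, 4)
constraint(c, relu23(x[i], x[i+1]) for i in 1:3; lcon = 0, ucon = 1)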
source
ExaModels.@register_univariateMacro
@register_univariate(f, df, ddf)

Registers a univariate function f with ExaModels so that it can be used within objective and constraint expressions.

Arguments:

  • f: function
  • df: derivative function
  • ddf: second-order derivative function

Example

julia> using ExaModels

julia> relu3(x) = x > 0 ? x^3 : zero(x)
relu3 (generic function with 1 method)

julia> drelu3(x) = x > 0 ? 3*x^2 : zero(x)
drelu3 (generic function with 1 method)

julia> ddrelu3(x) = x > 0 ? 6*x : zero(x)
ddrelu3 (generic function with 1 method)

julia> @register_univariate(relu3, drelu3, ddrelu3)
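
Once registered, relu3 can likewise be used in model expressions (a brief sketch):

c = ExaCore()
x = variable(c, 3)
objective(c, relu3(x[i]) for i in 1:3)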
source