ExaModels
ExaModels.ExaModels — Module
ExaModels
An algebraic modeling and automatic differentiation tool in the Julia language, specialized for the SIMD abstraction of nonlinear programs.
For more information, please visit https://github.com/exanauts/ExaModels.jl
ExaModels.AbstractExaModel — Type
AbstractExaModel
An abstract type for ExaModel, which is a subtype of NLPModels.AbstractNLPModel.
ExaModels.AdjointNode1 — Type
AdjointNode1{F, T, I}
A node with one child for the first-order forward pass tree.
Fields:
- x::T: function value
- y::T: first-order sensitivity
- inner::I: child
ExaModels.AdjointNode2 — Type
AdjointNode2{F, T, I1, I2}
A node with two children for the first-order forward pass tree.
Fields:
- x::T: function value
- y1::T: first-order sensitivity w.r.t. the first argument
- y2::T: first-order sensitivity w.r.t. the second argument
- inner1::I1: child #1
- inner2::I2: child #2
ExaModels.AdjointNodeSource — Type
AdjointNodeSource{VT}
A source of AdjointNode. adjoint_node_source[i] returns an AdjointNodeVar at index i.
Fields:
- inner::VT: variable vector
ExaModels.AdjointNodeVar — Type
AdjointNodeVar{I, T}
A variable node for the first-order forward pass tree.
Fields:
- i::I: index
- x::T: value
ExaModels.AdjointNull — Type
AdjointNull
A null node.
ExaModels.Compressor — Type
Compressor{I}
Data structure for the sparse index.
Fields:
- inner::I: stores the sparse index in tuple form
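The derivative kernels below (grpass, jrpass, hdrpass) use a Compressor to map a dense evaluation counter to positions in a sparse result vector. A minimal sketch of the idea in base Julia, where the tuple-backed mapping and the accumulation behavior are assumptions for illustration only:

```julia
# Hypothetical sketch: the k-th counter slot writes to sparse position
# comp.inner[k]; repeated positions accumulate their contributions.
comp = (inner = (1, 3, 1),)   # counter k → sparse slot comp.inner[k]
y = zeros(3)
for (k, v) in enumerate((10.0, 20.0, 30.0))
    y[comp.inner[k]] += v     # accumulate into the mapped slot
end
# y is now [40.0, 0.0, 20.0]: slots 1 and 3 received values, slot 1 twice
```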
ExaModels.ExaCore — Type
ExaCore([array_eltype::Type; backend = backend, minimize = true])
Returns an intermediate data object ExaCore, which can later be used to create an ExaModel.
Example
julia> using ExaModels
julia> c = ExaCore()
An ExaCore
Float type: ...................... Float64
Array type: ...................... Vector{Float64}
Backend: ......................... Nothing
number of objective patterns: .... 0
number of constraint patterns: ... 0
julia> c = ExaCore(Float32)
An ExaCore
Float type: ...................... Float32
Array type: ...................... Vector{Float32}
Backend: ......................... Nothing
number of objective patterns: .... 0
number of constraint patterns: ... 0
julia> using CUDA
julia> c = ExaCore(Float32; backend = CUDABackend())
An ExaCore
Float type: ...................... Float32
Array type: ...................... CUDA.CuArray{Float32, 1, CUDA.DeviceMemory}
Backend: ......................... CUDA.CUDAKernels.CUDABackend
number of objective patterns: .... 0
number of constraint patterns: ... 0
ExaModels.ExaModel — Method
ExaModel(core)
Returns an ExaModel object, which can be solved by nonlinear optimization solvers within the JuliaSmoothOptimizers ecosystem, such as NLPModelsIpopt or MadNLP.
Example
julia> using ExaModels
julia> c = ExaCore(); # create an ExaCore object
julia> x = variable(c, 1:10); # create variables
julia> objective(c, x[i]^2 for i in 1:10); # set objective function
julia> m = ExaModel(c) # create an ExaModel object
An ExaModel{Float64, Vector{Float64}, ...}
Problem name: Generic
All variables: ████████████████████ 10 All constraints: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
free: ████████████████████ 10 free: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0 lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0 upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0 low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0 fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0 infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
nnzh: ( 81.82% sparsity) 10 linear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
nonlinear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
nnzj: (------% sparsity)
lin_nnzj: (------% sparsity)
nln_nnzj: (------% sparsity)
julia> using NLPModelsIpopt
julia> result = ipopt(m; print_level=0) # solve the problem
"Execution stats: first-order stationary"
ExaModels.Node1 — Type
Node1{F, I}
A node with one child for the symbolic expression tree.
Fields:
- inner::I: child
ExaModels.Node2 — Type
Node2{F, I1, I2}
A node with two children for the symbolic expression tree.
Fields:
- inner1::I1: child #1
- inner2::I2: child #2
ExaModels.Null — Type
Null
A null node.
ExaModels.ParIndexed — Type
ParIndexed{I, J}
A parameterized data node.
Fields:
- inner::I: parameter for the data
ExaModels.ParSource — Type
ParSource
A source of parameterized data.
ExaModels.ParameterSubexpr — Type
ParameterSubexpr
A parameter-only subexpression whose values are computed once when parameters are set, not at every function evaluation. Use this for expressions that depend only on parameters (θ), not on variables (x). Values are automatically recomputed when set_parameter! is called.
ExaModels.ReducedSubexpr — Type
ReducedSubexpr
A reduced-form subexpression that substitutes the expression directly when indexed. No auxiliary variables or constraints are created; the expression is inlined.
ExaModels.SIMDFunction — Type
SIMDFunction(gen::Base.Generator, o0 = 0, o1 = 0, o2 = 0)
Returns a SIMDFunction built from gen.
Arguments:
- gen: an iterable function specified in Base.Generator format
- o0: offset for the function evaluation
- o1: offset for the derivative evaluation
- o2: offset for the second-order derivative evaluation
ExaModels.SecondAdjointNode1 — Type
SecondAdjointNode1{F, T, I}
A node with one child for the second-order forward pass tree.
Fields:
- x::T: function value
- y::T: first-order sensitivity
- h::T: second-order sensitivity
- inner::I: child
ExaModels.SecondAdjointNode2 — Type
SecondAdjointNode2{F, T, I1, I2}
A node with two children for the second-order forward pass tree.
Fields:
- x::T: function value
- y1::T: first-order sensitivity w.r.t. the first argument
- y2::T: first-order sensitivity w.r.t. the second argument
- h11::T: second-order sensitivity w.r.t. the first argument
- h12::T: second-order sensitivity w.r.t. the first and second arguments
- h22::T: second-order sensitivity w.r.t. the second argument
- inner1::I1: child #1
- inner2::I2: child #2
ExaModels.SecondAdjointNodeSource — Type
SecondAdjointNodeSource{VT}
A source of SecondAdjointNode. Indexing at i returns a SecondAdjointNodeVar at index i.
Fields:
- inner::VT: variable vector
ExaModels.SecondAdjointNodeVar — Type
SecondAdjointNodeVar{I, T}
A variable node for the second-order forward pass tree.
Fields:
- i::I: index
- x::T: value
ExaModels.SecondAdjointNull — Type
SecondAdjointNull
A null node.
ExaModels.Subexpr — Type
Subexpr
A subexpression that has been lifted to auxiliary variables with defining equality constraints. Can be indexed like a Variable to get Var nodes for use in objectives and constraints.
ExaModels.TwoStageExaModel — Type
TwoStageExaModel{T, VT, M}
A two-stage optimization model where all scenarios are fused into a single ExaModel and evaluated in a single kernel launch.
Fields
- model::M: single fused ExaModel containing all scenarios
- ns::Int: number of scenarios
- nv::Int: recourse variables per scenario
- nd::Int: design (shared) variables
- nc::Int: constraints per scenario
- nθ::Int: parameters per scenario
- nnzj_per_scenario::Int: Jacobian nonzeros per scenario (approximate)
- nnzh_per_scenario::Int: Hessian nonzeros per scenario (approximate)
Structure
- Total variables: ns*nv + nd
- Total constraints: ns*nc
- Global variable layout: [v₁; v₂; ...; vₛ; d]
- Global constraint layout: [c₁; c₂; ...; cₛ]
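The block layout above reduces to simple index arithmetic. A sketch in base Julia, using hypothetical helper names that mirror the accessor functions documented below (recourse_var_indices, design_var_indices, cons_block_indices):

```julia
# Hypothetical helpers implied by the layout [v₁; v₂; ...; vₛ; d] and [c₁; c₂; ...; cₛ].
recourse_range(i, nv) = (i-1)*nv + 1 : i*nv        # scenario i's recourse block
design_range(ns, nv, nd) = ns*nv + 1 : ns*nv + nd  # shared design block at the end
cons_range(i, nc) = (i-1)*nc + 1 : i*nc            # scenario i's constraint block

ns, nv, nd, nc = 3, 4, 2, 5
@assert recourse_range(2, nv) == 5:8               # scenario 2 occupies slots 5..8
@assert design_range(ns, nv, nd) == 13:14          # design vars follow all ns*nv recourse vars
@assert cons_range(3, nc) == 11:15                 # scenario 3's constraints
```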
ExaModels.TwoStageExaModel — Method
TwoStageExaModel(build, nd, nv, ns, θ_sets; backend=nothing)
Build a two-stage model where all scenarios are fused into a single ExaModel.
All scenarios share ONE compiled expression pattern, achieving maximum GPU efficiency. This requires scenarios to have identical structure.
Arguments
- build::Function: function (c, d, v, θ, ns, nv, nθ) -> nothing, where
  - c: ExaCore
  - d: variable handle for the design variables (indices 1:nd)
  - v: variable handle for ALL recourse variables (indices 1:ns*nv); scenario i's vars are at indices (i-1)*nv+1 : i*nv
  - θ: parameter handle for ALL parameters (length ns*nθ); scenario i's params are at indices (i-1)*nθ+1 : i*nθ
  - ns, nv, nθ: dimensions for building iteration data
- nd::Int: number of design variables
- nv::Int: number of recourse variables per scenario
- ns::Int: number of scenarios
- θ_sets::Vector{<:AbstractVector}: parameter vectors for each scenario
Keyword Arguments
- backend: backend for computation (default: nothing)
- d_start: initial values for design variables (scalar or vector of length nd, default: 0.0)
- d_lvar: lower bounds for design variables (scalar or vector of length nd, default: -Inf)
- d_uvar: upper bounds for design variables (scalar or vector of length nd, default: Inf)
- v_start: initial values for recourse variables (scalar or vector of length ns*nv, default: 0.0)
- v_lvar: lower bounds for recourse variables (scalar or vector of length ns*nv, default: -Inf)
- v_uvar: upper bounds for recourse variables (scalar or vector of length ns*nv, default: Inf)
Example
ns, nv, nd, nθ = 100, 5, 2, 3
θ_sets = [rand(nθ) for _ in 1:ns]
model = TwoStageExaModel(nd, nv, ns, θ_sets) do c, d, v, θ, ns, nv, nθ
obj_data = [(i, j, (i-1)*nv + j, (i-1)*nθ) for i in 1:ns for j in 1:nv]
objective(c, θ[θ_off + 1] * v[v_idx]^2 for (i, j, v_idx, θ_off) in obj_data)
con_data = [(i, j, (i-1)*nv + j, (i-1)*nθ) for i in 1:ns for j in 1:nv]
constraint(c, v[v_idx] + d[1] - θ[θ_off + 3] for (i, j, v_idx, θ_off) in con_data)
end
ExaModels.Var — Type
Var{I}
A variable node for the symbolic expression tree.
Fields:
- i::I: (parameterized) index
ExaModels.VarSource — Type
VarSource
A source of variable nodes.
ExaModels.WrapperNLPModel — Method
WrapperNLPModel(VT, m)
Returns a WrapperModel{T, VT} wrapping m <: AbstractNLPModel{T}.
ExaModels.WrapperNLPModel — Method
WrapperNLPModel(m)
Returns a WrapperModel{Float64, Vector{Float64}} wrapping m.
ExaModels._recompute_param_subexprs! — Method
_recompute_param_subexprs!(c::ExaCore)
Re-evaluates all parameter-only subexpressions and updates their cached values in θ. Called automatically by set_parameter!.
ExaModels.cons_block_indices — Method
cons_block_indices(model::TwoStageExaModel, i) -> UnitRange
Get the index range for the constraints of scenario i in the global constraint vector. Use as: c_global[cons_block_indices(model, i)].
ExaModels.constraint! — Method
constraint!(c, c1, expr, pars)
Expands the existing constraint c1 in c by adding additional constraint terms specified by expr and pars.
ExaModels.constraint! — Method
constraint!(c::C, c1, gen::Base.Generator) where {C<:ExaCore}
Expands the existing constraint c1 in c by adding additional constraint terms specified by a generator.
Arguments
- c::C: the model to which the constraints are added
- c1: an initial constraint value or expression
- gen::Base.Generator: a generator that produces pairs of constraint index and term to be added
Example
julia> using ExaModels
julia> c = ExaCore();
julia> x = variable(c, 10);
julia> c1 = constraint(c, x[i] + x[i+1] for i=1:9; lcon = -1, ucon = (1+i for i=1:9));
julia> constraint!(c, c1, i => sin(x[i+1]) for i=4:6)
Constraint Augmentation
s.t. (...)
g♭ ≤ (...) + ∑_{p ∈ P} h(x,θ,p) ≤ g♯
where |P| = 3
ExaModels.constraint — Method
constraint(core, n; start = 0, lcon = 0, ucon = 0)
Adds empty constraints of dimension n, so that terms can later be added with constraint!.
ExaModels.constraint — Method
constraint(core, generator; start = 0, lcon = 0, ucon = 0)
Adds constraints specified by a generator to core, and returns a Constraint object.
Keyword Arguments
- start: the initial guess of the dual solution; can be a Number, AbstractArray, or Generator
- lcon: the constraint lower bound; can be a Number, AbstractArray, or Generator
- ucon: the constraint upper bound; can be a Number, AbstractArray, or Generator
Example
julia> using ExaModels
julia> c = ExaCore();
julia> x = variable(c, 10);
julia> constraint(c, x[i] + x[i+1] for i=1:9; lcon = -1, ucon = (1+i for i=1:9))
Constraint
s.t. (...)
g♭ ≤ [g(x,θ,p)]_{p ∈ P} ≤ g♯
where |P| = 9
ExaModels.constraint — Method
constraint(core, expr [, pars]; start = 0, lcon = 0, ucon = 0)
Adds constraints specified by expr and pars to core, and returns a Constraint object.
ExaModels.design_var_index — Method
design_var_index(model, j) -> global_idx
Get the global index for design variable j.
ExaModels.design_var_indices — Method
design_var_indices(model::TwoStageExaModel) -> UnitRange
Get the index range for the design variables in the global variable vector. Use as: x_global[design_var_indices(model)].
ExaModels.drpass — Method
drpass(d::D, y, adj)
Performs dense gradient evaluation via the reverse pass on the computation (sub)graph formed by the forward pass.
Arguments:
- d: first-order computation (sub)graph
- y: result vector
- adj: adjoint propagated up to the current node
ExaModels.extract_cons_block! — Method
extract_cons_block!(dest, model::TwoStageExaModel, i, c_global)
Extract the constraint block for scenario i into the pre-allocated dest. Returns dest for convenience.
ExaModels.extract_design_vars! — Method
extract_design_vars!(dest, model::TwoStageExaModel, x_global)
Extract the design variables into the pre-allocated dest. Returns dest for convenience.
ExaModels.extract_grad_block! — Method
extract_grad_block!(g_v, g_d, model::TwoStageExaModel, i, g_global)
Extract the gradient block for scenario i into the pre-allocated g_v and g_d. Returns (g_v, g_d) for convenience.
Note: The design variable gradient accumulates contributions from all scenarios.
ExaModels.extract_recourse_vars! — Method
extract_recourse_vars!(dest, model::TwoStageExaModel, i, x_global)
Extract the recourse variables for scenario i into the pre-allocated dest. Returns dest for convenience.
ExaModels.get_model — Method
get_model(model::TwoStageExaModel)
Get the underlying ExaModel for direct NLPModels interface usage.
ExaModels.global_con_index — Method
global_con_index(model, i, local_idx) -> global_idx
Convert a local constraint index to the global index for scenario i.
ExaModels.global_var_index — Method
global_var_index(model, i, local_idx) -> global_idx
Convert a local variable index to the global index for scenario i.
Local ordering (for the scenario API): [d₁, ..., d_nd, v₁, ..., v_nv]
Global ordering: [v₁¹, ..., v_nv¹, v₁², ..., v_nv², ..., v₁ⁿˢ, ..., v_nvⁿˢ, d₁, ..., d_nd]
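The local-to-global mapping can be sketched in plain index arithmetic; to_global below is a hypothetical standalone helper illustrating the orderings above, not the library function itself:

```julia
# Sketch of the local→global mapping, assuming local indices 1:nd address the
# design variables and nd+1:nd+nv address scenario i's recourse variables.
function to_global(i, local_idx, nd, nv, ns)
    if local_idx <= nd
        ns*nv + local_idx             # design vars sit after all recourse blocks
    else
        (i-1)*nv + (local_idx - nd)   # recourse var j = local_idx - nd of scenario i
    end
end

nd, nv, ns = 2, 3, 4
@assert to_global(2, 1, nd, nv, ns) == 13  # d₁ → first slot of the design block
@assert to_global(2, 3, nd, nv, ns) == 4   # v₁ of scenario 2
```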
ExaModels.grad_design_indices — Method
grad_design_indices(model::TwoStageExaModel) -> UnitRange
Get the index range for the design gradient. Same as design_var_indices, since the gradient has the same layout as the variables.
ExaModels.grad_recourse_indices — Method
grad_recourse_indices(model::TwoStageExaModel, i) -> UnitRange
Get the index range for the recourse gradient of scenario i. Same as recourse_var_indices, since the gradient has the same layout as the variables.
ExaModels.gradient! — Method
gradient!(y, f, x, adj)
Performs dense gradient evaluation.
Arguments:
- y: result vector
- f: the function to be differentiated, in SIMDFunction format
- x: variable vector
- adj: initial adjoint
ExaModels.grpass — Method
grpass(d::D, comp, y, o1, cnt, adj)
Performs sparse gradient evaluation via the reverse pass on the computation (sub)graph formed by the forward pass.
Arguments:
- d: first-order computation (sub)graph
- comp: a Compressor, which helps map the counter to the sparse vector index
- y: result vector
- o1: index offset
- cnt: counter
- adj: adjoint propagated up to the current node
ExaModels.hdrpass — Method
hdrpass(t1::T1, t2::T2, comp, y1, y2, o2, cnt, adj)
Performs sparse Hessian evaluation (the (df1/dx)(df2/dx)' portion) via the reverse pass on the computation (sub)graph formed by the second-order forward pass.
Arguments:
- t1: second-order computation (sub)graph for f1
- t2: second-order computation (sub)graph for f2
- comp: a Compressor, which helps map the counter to the sparse vector index
- y1: result vector #1
- y2: result vector #2 (only used when evaluating sparsity)
- o2: index offset
- cnt: counter
- adj: second adjoint propagated up to the current node
ExaModels.jrpass — Method
jrpass(d::D, comp, i, y1, y2, o1, cnt, adj)
Performs sparse Jacobian evaluation via the reverse pass on the computation (sub)graph formed by the forward pass.
Arguments:
- d: first-order computation (sub)graph
- comp: a Compressor, which helps map the counter to the sparse vector index
- i: constraint index (this is the i-th constraint)
- y1: result vector #1
- y2: result vector #2 (only used when evaluating sparsity)
- o1: index offset
- cnt: counter
- adj: adjoint propagated up to the current node
ExaModels.multipliers — Method
multipliers(result, y)
Returns the multipliers for the constraints y associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
julia> x = variable(c, 1:10, lvar = -1, uvar = 1);
julia> objective(c, (x[i]-2)^2 for i in 1:10);
julia> y = constraint(c, x[i] + x[i+1] for i=1:9; lcon = -1, ucon = (1+i for i=1:9));
julia> m = ExaModel(c);
julia> result = ipopt(m; print_level=0);
julia> val = multipliers(result, y);
julia> val[1] ≈ 0.81933930
true
ExaModels.multipliers_L — Method
multipliers_L(result, x)
Returns the multipliers_L (lower-bound multipliers) for the variable x associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
julia> x = variable(c, 1:10, lvar = -1, uvar = 1);
julia> objective(c, (x[i]-2)^2 for i in 1:10);
julia> m = ExaModel(c);
julia> result = ipopt(m; print_level=0);
julia> val = multipliers_L(result, x);
julia> isapprox(val, fill(0, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
ExaModels.multipliers_U — Method
multipliers_U(result, x)
Returns the multipliers_U (upper-bound multipliers) for the variable x associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
julia> x = variable(c, 1:10, lvar = -1, uvar = 1);
julia> objective(c, (x[i]-2)^2 for i in 1:10);
julia> m = ExaModel(c);
julia> result = ipopt(m; print_level=0);
julia> val = multipliers_U(result, x);
julia> isapprox(val, fill(2, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
ExaModels.objective — Method
objective(core::ExaCore, generator)
Adds objective terms specified by a generator to core, and returns an Objective object. Note: the terms are assumed to be summed.
Example
julia> using ExaModels
julia> c = ExaCore();
julia> x = variable(c, 10);
julia> objective(c, x[i]^2 for i=1:10)
Objective
min (...) + ∑_{p ∈ P} f(x,θ,p)
where |P| = 10
ExaModels.objective — Method
objective(core::ExaCore, expr [, pars])
Adds objective terms specified by expr and pars to core, and returns an Objective object.
ExaModels.parameter — Method
parameter(core, start::AbstractArray)
Adds parameters with initial values specified by start, and returns a Parameter object.
Example
julia> using ExaModels
julia> c = ExaCore();
julia> θ = parameter(c, ones(10))
Parameter
θ ∈ R^{10}
ExaModels.recourse_var_index — Method
recourse_var_index(model, i, j) -> global_idx
Get the global index for recourse variable j of scenario i.
ExaModels.recourse_var_indices — Method
recourse_var_indices(model::TwoStageExaModel, i) -> UnitRange
Get the index range for the recourse variables of scenario i in the global variable vector. Use as: x_global[recourse_var_indices(model, i)].
ExaModels.set_all_scenario_parameters! — Method
set_all_scenario_parameters!(model, θ_sets)
Update the parameters for all scenarios.
ExaModels.set_parameter! — Method
set_parameter!(core, param, values)
Updates the values of the parameters in the core.
Example
julia> using ExaModels
julia> c = ExaCore();
julia> p = parameter(c, ones(5))
Parameter
θ ∈ R^{5}
julia> set_parameter!(c, p, rand(5)) # Update with new values
ExaModels.set_scenario_parameters! — Method
set_scenario_parameters!(model, i, θ_new)
Update the parameters for scenario i.
ExaModels.sgradient! — Method
sgradient!(y, f, x, adj)
Performs sparse gradient evaluation.
Arguments:
- y: result vector
- f: the function to be differentiated, in SIMDFunction format
- x: variable vector
- adj: initial adjoint
ExaModels.shessian! — Method
shessian!(y1, y2, f, x, adj1, adj2)
Performs sparse Hessian evaluation.
Arguments:
- y1: result vector #1
- y2: result vector #2 (only used when evaluating sparsity)
- f: the function to be differentiated, in SIMDFunction format
- x: variable vector
- adj1: initial first adjoint
- adj2: initial second adjoint
ExaModels.sjacobian! — Method
sjacobian!(y1, y2, f, x, adj)
Performs sparse Jacobian evaluation.
Arguments:
- y1: result vector #1
- y2: result vector #2 (only used when evaluating sparsity)
- f: the function to be differentiated, in SIMDFunction format
- x: variable vector
- adj: initial adjoint
ExaModels.solution — Method
solution(result, x)
Returns the solution for the variable x associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
julia> x = variable(c, 1:10, lvar = -1, uvar = 1);
julia> objective(c, (x[i]-2)^2 for i in 1:10);
julia> m = ExaModel(c);
julia> result = ipopt(m; print_level=0);
julia> val = solution(result, x);
julia> isapprox(val, fill(1, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
ExaModels.subexpr — Method
subexpr(core, generator; reduced=false, parameter_only=false)
Creates a subexpression that can be reused in objectives and constraints.
Three forms are available:
- Lifted (default, reduced=false): creates auxiliary variables with defining equality constraints. This generates derivative code once and uses simple variable references thereafter. Adds variables and constraints to the problem.
- Reduced (reduced=true): stores the expression for direct substitution when indexed. No auxiliary variables or constraints are created; the expression is inlined wherever used.
- Parameter-only (parameter_only=true): for expressions that depend only on parameters (θ), not variables (x). Values are computed once when parameters are set, not at every function evaluation. Automatically recomputed when set_parameter! is called.
Both lifted and reduced forms support SIMD-vectorized evaluation and can be nested.
Example
julia> using ExaModels
julia> c = ExaCore();
julia> x = variable(c, 10);
julia> s = subexpr(c, x[i]^2 for i in 1:10)
Subexpression (lifted)
s ∈ R^{10}
julia> objective(c, s[i] + s[i+1] for i in 1:9);
Reduced form (experimental)
The reduced form (reduced=true) is experimental and may have issues with complex nested expressions. Use the default lifted form for production code.
c = ExaCore()
x = variable(c, 10)
# Reduced form - no extra variables/constraints
s = subexpr(c, x[i]^2 for i in 1:10; reduced=true)
# s[i] substitutes x[i]^2 directly into the expression
objective(c, s[i] + s[i+1] for i in 1:9)
Parameter-only form
For expressions involving only parameters, use parameter_only=true to evaluate them once when parameters change, rather than at every optimization iteration:
c = ExaCore()
θ = parameter(c, ones(10))
x = variable(c, 10)
# Parameter-only subexpression - computed once per parameter update
weights = subexpr(c, θ[i]^2 + θ[i+1] for i in 1:9; parameter_only=true)
# Use in objective - weights[i] returns cached value, not re-computed
objective(c, weights[i] * x[i]^2 for i in 1:9)
Multi-dimensional example
c = ExaCore()
x = variable(c, 0:T, 0:N)
# Automatically infers 2D structure from Cartesian product
dx = subexpr(c, x[t, i] - x[t-1, i] for t in 1:T, i in 1:N)
# Now dx[t, i] can be used in constraints
constraint(c, dx[t, i] - something for t in 1:T, i in 1:N)
ExaModels.variable — Method
variable(core, dims...; start = 0, lvar = -Inf, uvar = Inf)
Adds variables with dimensions specified by dims to core, and returns a Variable object. Each entry of dims can be either an Integer or a UnitRange.
Keyword Arguments
- start: the initial guess of the solution; can be a Number, AbstractArray, or Generator
- lvar: the variable lower bound; can be a Number, AbstractArray, or Generator
- uvar: the variable upper bound; can be a Number, AbstractArray, or Generator
Example
julia> using ExaModels
julia> c = ExaCore();
julia> x = variable(c, 10; start = (sin(i) for i=1:10))
Variable
x ∈ R^{10}
julia> y = variable(c, 2:10, 3:5; lvar = zeros(9,3), uvar = ones(9,3))
Variable
x ∈ R^{9 × 3}
NLPModels.cons! — Method
cons!(model::TwoStageExaModel, x_global, c_global)
Evaluate all constraints. Output: c_global ∈ ℝ^{ns*nc}.
NLPModels.get_ncon — Method
get_ncon(model::TwoStageExaModel)
Total number of constraints.
NLPModels.get_nnzh — Method
get_nnzh(model::TwoStageExaModel)
Total number of Hessian nonzeros.
NLPModels.get_nnzj — Method
get_nnzj(model::TwoStageExaModel)
Total number of Jacobian nonzeros.
NLPModels.get_nvar — Method
get_nvar(model::TwoStageExaModel)
Total number of variables.
NLPModels.grad! — Method
grad!(model::TwoStageExaModel, x_global, g_global)
Evaluate the total gradient. Output: g_global ∈ ℝ^{ns*nv + nd}.
NLPModels.hess_coord! — Method
hess_coord!(model::TwoStageExaModel, x_global, y_global, hess_global; obj_weight=1.0)
Evaluate the full Hessian of the Lagrangian (COO values).
NLPModels.hess_structure! — Method
hess_structure!(model::TwoStageExaModel, rows, cols)
Get the full Hessian sparsity structure.
NLPModels.jac_coord! — Method
jac_coord!(model::TwoStageExaModel, x_global, jac_global)
Evaluate the full Jacobian (COO values).
NLPModels.jac_structure! — Method
jac_structure!(model::TwoStageExaModel, rows, cols)
Get the full Jacobian sparsity structure.
NLPModels.obj — Method
obj(model::TwoStageExaModel, x_global)
Evaluate the total objective (sum over all scenarios).
ExaModels.@register_bivariate — Macro
@register_bivariate(f, df1, df2, ddf11, ddf12, ddf22)
Register a bivariate function f to ExaModels, so that it can be used within objective and constraint expressions.
Arguments:
- f: function
- df1: derivative function (w.r.t. the first argument)
- df2: derivative function (w.r.t. the second argument)
- ddf11: second-order derivative function (w.r.t. the first argument)
- ddf12: second-order derivative function (w.r.t. the first and second arguments)
- ddf22: second-order derivative function (w.r.t. the second argument)
Example
julia> using ExaModels
julia> relu23(x,y) = (x > 0 || y > 0) ? (x + y)^3 : zero(x)
relu23 (generic function with 1 method)
julia> drelu231(x,y) = (x > 0 || y > 0) ? 3 * (x + y)^2 : zero(x)
drelu231 (generic function with 1 method)
julia> drelu232(x,y) = (x > 0 || y > 0) ? 3 * (x + y)^2 : zero(x)
drelu232 (generic function with 1 method)
julia> ddrelu2311(x,y) = (x > 0 || y > 0) ? 6 * (x + y) : zero(x)
ddrelu2311 (generic function with 1 method)
julia> ddrelu2312(x,y) = (x > 0 || y > 0) ? 6 * (x + y) : zero(x)
ddrelu2312 (generic function with 1 method)
julia> ddrelu2322(x,y) = (x > 0 || y > 0) ? 6 * (x + y) : zero(x)
ddrelu2322 (generic function with 1 method)
julia> @register_bivariate(relu23, drelu231, drelu232, ddrelu2311, ddrelu2312, ddrelu2322)
ExaModels.@register_univariate — Macro
@register_univariate(f, df, ddf)
Register a univariate function f to ExaModels, so that it can be used within objective and constraint expressions.
Arguments:
- f: function
- df: derivative function
- ddf: second-order derivative function
Example
julia> using ExaModels
julia> relu3(x) = x > 0 ? x^3 : zero(x)
relu3 (generic function with 1 method)
julia> drelu3(x) = x > 0 ? 3*x^2 : zero(x)
drelu3 (generic function with 1 method)
julia> ddrelu3(x) = x > 0 ? 6*x : zero(x)
ddrelu3 (generic function with 1 method)
julia> @register_univariate(relu3, drelu3, ddrelu3)