infiniteopt / InfiniteOpt.jl

An intuitive modeling interface for infinite-dimensional optimization problems.

Home Page: https://infiniteopt.github.io/InfiniteOpt.jl/stable

License: MIT License

Languages: Julia 100.00%
Topics: optimization, julia, modeling-language, dynamic-programming, stochastic-optimization, optimal-control, pde-constrained-optimization, nonlinear-optimization, differential-equations, measure-theory

infiniteopt.jl's Introduction

A JuMP extension for expressing and solving infinite-dimensional optimization problems. Such areas include stochastic programming, dynamic programming, space-time optimization, and more. InfiniteOpt serves as an easy-to-use modeling interface for these advanced problem types that can be used by those with little to no background in these areas. It also contains a wealth of capabilities, making it a powerful and convenient tool for advanced users.

It builds upon JuMP to add support for many complex modeling objects which include:

  • Infinite parameters (e.g., time, space, uncertainty, etc.)
  • Finite parameters (similar to ParameterJuMP)
  • Infinite variables (decision functions) (e.g., y(t, x))
  • Derivatives (e.g., ∂y(t, x)/∂t)
  • Measures (e.g., ∫y(t,x)dt, 𝔼[y(ξ)])
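
For illustration, here is a minimal sketch of what a model using these objects can look like (based on the syntax in the current documentation; treat the exact macro and function names as assumptions that may vary between versions):

using InfiniteOpt

model = InfiniteModel()
@infinite_parameter(model, t in [0, 10], num_supports = 11)  # infinite parameter (time)
@variable(model, y >= 0, Infinite(t))                        # infinite variable y(t)
@variable(model, z)                                          # finite variable
@objective(model, Min, integral((y - z)^2, t))               # measure: ∫(y(t) - z)² dt
@constraint(model, deriv(y, t) <= 2)                         # derivative: ∂y(t)/∂t ≤ 2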

The unifying modeling abstraction behind InfiniteOpt captures a wide spectrum of disciplines which include dynamic, PDE, stochastic, and semi-infinite optimization. Moreover, we facilitate transferring techniques between these to synthesize new optimization paradigms!

Comments, suggestions and improvements are welcome and appreciated.

License

InfiniteOpt is licensed under the MIT "Expat" license.

Installation

InfiniteOpt.jl is a registered Julia package and can be installed by entering the following in the REPL.

julia> ]

(v1.11) pkg> add InfiniteOpt
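
Equivalently, the package can be added non-interactively with the Pkg API:

julia> import Pkg; Pkg.add("InfiniteOpt")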

Documentation

Please visit our documentation pages to learn more. These pages are quite extensive and feature overviews, guides, manuals, tutorials, examples, and more!

Questions

For additional help please visit and post in our discussion forum.

Citing

If you use InfiniteOpt.jl in your research, we would greatly appreciate your citing it.

@article{pulsipher2022unifying,
      title = {A unifying modeling abstraction for infinite-dimensional optimization},
      journal = {Computers & Chemical Engineering},
      volume = {156},
      year = {2022},
      issn = {0098-1354},
      doi = {https://doi.org/10.1016/j.compchemeng.2021.107567},
      url = {https://www.sciencedirect.com/science/article/pii/S0098135421003458},
      author = {Joshua L. Pulsipher and Weiqi Zhang and Tyler J. Hongisto and Victor M. Zavala},
}

A pre-print version is freely available through arXiv.

Project Status

The package is tested against Julia's LTS release and the latest released version on Linux, Mac, and Windows.

Contributing

InfiniteOpt is being actively developed and suggestions or other forms of contribution are encouraged. There are many ways to contribute to this package. For more information please visit CONTRIBUTING.

infiniteopt.jl's People

Contributors

pulsipher, wzhangw, bdaves12, github-actions[bot], odow, azev77, amadeusine, dnguyen227, mzagorowska

infiniteopt.jl's Issues

Update for JuMP 0.21

Need to extend the set_time_limit_sec, unset_time_limit_sec and time_limit_sec methods. DONE!

Add Developer Guides

Add documentation to explain how to extend InfiniteOpt and how to contribute. The extensions should include how to add user-defined infinite sets, user-defined measure evaluation methods, user-defined support generation, and user-defined reformulation methods to solve the model (e.g., polynomial chaos). Perhaps we can extend it with FlexibilityAnalysis.jl as an example.

The contribution documentation should include a style-guide, a setup tutorial (e.g., using the dev command), using docstrings, using Documenter, making testsets, understanding the build and coverage tests, and using Github's pull request framework.

Direct Derivative Evaluation Transcription

Currently, when derivatives are evaluated for optimizer models, they are first transformed into a reduced InfiniteOpt expression and then transcribed into JuMP expressions. However, this workflow can be reworked so that they map directly to JuMP expressions for optimizer models. This paradigm shift should yield a notable increase in performance.

Note this direct paradigm would be harder to implement with measures since they build in place and can be nested whereas derivative evaluation constraints are always built independently and do not even need to be mapped.

Add and Document Use Examples

Need to make more examples and include the source code in the examples folder. These should also be used in the Examples section of the documentation.

Improve Modularity/Extendability of MeasureEvalMethods

This module should be structured a little more modularly. In particular, the methods are called with different arguments and with different infinite sets, which makes extensions overly cumbersome. Currently, there is no safeguard against calling a method that is invalid for a particular set type.

[FEATURE] Intelligent Integral Defaults

Describe the feature you'd like
More intelligently define integral defaults to work better in conjunction with ODEs and to avoid adding extraneous supports with other integral calls.

Does this pertain to a particular class of problems/techniques? If so explain.
Any problem with integrals.

Describe what you currently do about this
Currently, we depend a lot on the user to manage supports and methods but this is not very intuitive.

Additional context
This should use inheritance from infinite parameters to determine the eval_method, should make the trapezoid rule the default for IntervalSets, and should enable the trapezoid rule to use existing supports, only adding more if needed.
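
For context, a hedged sketch of how an evaluation method can be specified explicitly today (the eval_method keyword and the UniTrapezoid method name are taken from the current documentation and assumed here; there is also a model-wide default setter in the docs):

using InfiniteOpt

model = InfiniteModel()
@infinite_parameter(model, t in [0, 1], num_supports = 11)
@variable(model, y, Infinite(t))
# Explicitly request the trapezoid rule, which reuses the existing supports:
int = integral(y, t, eval_method = UniTrapezoid())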

Revamp Macros with JuMP Dependencies

We need to change our macro structure to be devoid of internal JuMP function dependencies for better stability. It also appears we'll need to forgo our convenient wrapping of JuMP.@variable within our macros. Please see the discussion here: jump-dev/JuMP.jl#2333. Based on this discussion, it seems we'll need to decide between the following two options:

  1. Implement all the back-ending necessary to support our macros without the use of JuMP internal functions or JuMP.@variable.
  2. Drop most of our macros in favor of JuMP.@variable but with object-oriented inputs (e.g., @variable(m, y >= 0, Infinite(t, x))), thus breaking our symbolic mathematical syntax.
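
For reference, option 2 roughly corresponds to the variable syntax used in the current documentation; a hedged sketch (macro and tag names assumed from the docs):

using InfiniteOpt

m = InfiniteModel()
@infinite_parameter(m, t in [0, 1])
@infinite_parameter(m, x[1:2] in [-1, 1])
# Option 2: reuse JuMP.@variable with an object-oriented Infinite(...) tag
@variable(m, y >= 0, Infinite(t, x))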

Add measure default setting

  1. Modify the measure function to identify default eval_method for sampling and quadrature methods.

  2. Add a user-friendly function set_measure_default to change the default measure parameters for a model.

Return Interpolated Value Functions

We should build in some interpolation functionality to automatically construct start value functions and optionally return infinite variable solutions as functions.

Extension Template and Tests for Derivative Evaluation

We need to add an extension template and tests for user-defined derivative evaluation methods. This should detail how to define AbstractDerivativeMethods and how to implement evaluate_derivative using our provided helper functions as appropriate.

Performance Optimization

We need to analyze and optimize InfiniteOpt's structure to enhance performance. This might involve changes such as shifting to an array-based structure instead of dictionaries. Also, memory access should be better optimized to leverage locality effectively (i.e., use the cache effectively). We want to make it such that JuMP is the bottleneck.

Add measure evaluation registry

Add a registry to the current model that records which evaluation functions are valid for each type of infinite set. Users should be able to query and modify the registry, which is necessary for adding new evaluation functions and/or new infinite set types.

Better integration with JuMP + GSOC

There is a pretty incredible amount of work here (docs are outstanding).

I haven't quite wrapped my head around all of the functionality you have, so my question is: what, if anything, can we add to JuMP to make this simpler?

@tomasfmg is doing a GSOC adding parameters to MOI. I think there is some overlap, particularly with https://pulsipher.github.io/InfiniteOpt.jl/stable/guide/expression/#Datatype-Hierarchy-1. At the very least, we should understand the differences so we're not duplicating a lot of work.

cc @blegat @mlubin @joaquimg

[FEATURE] Restructure Transcribe.jl

Describe the feature you'd like
transcribe.jl needs to be rewritten to address the following concerns:

  • Needs to adopt a more straightforward paradigm like measure expansion
  • Should make a support/variable look up table for all variables to quicken variable searches
  • Should build expressions in-place using the look-up table
  • Reduced variables need to be mapped to transcription variables to allow partial transcription of vector parameters
  • Reduced variables and measures should reference transcription equivalents in order to query their values
  • Perhaps measure expansions should be located in connected equality constraints in order to query value
  • The parameter references present in each expression should be found at creation not in TranscriptionOpt
  • Parameter references should be ordered at creation
  • Make a way for PointVariableRefs generated by measure expansion to have names.

All of the above should improve performance and stability by an appreciable margin and make the transcription process much more straightforward.

Does this pertain to a particular class of problems/techniques? If so explain.
Everything

Describe what you currently do about this
We use a hacky methodology with a lot of unnecessary searches and data type conversions.

[BUG] Safe Attribute Accessing

Most methods will throw an OutOfBounds error when querying attributes of a deleted variable reference. This behavior differs from JuMP, which is able to process such queries. This can likely be rectified by using get(container, key, fallback) to access the container data objects and then extending Base.getproperty and Base.setproperty! for the fallback type in an intelligent way.
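
A minimal sketch of the generic pattern described above, with hypothetical names (this is not InfiniteOpt's actual internal container or fallback type):

# Hypothetical fallback object returned when a key (e.g., a deleted variable) is absent.
struct DeletedObject end

# Give the fallback readable behavior instead of an OutOfBounds/KeyError.
Base.getproperty(::DeletedObject, name::Symbol) =
    error("Cannot access `$name`: the referenced object has been deleted.")

data = Dict{Int, Any}()               # stand-in for a model's data container
obj = get(data, 42, DeletedObject())  # safe lookup with a fallback instead of data[42]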

[FEATURE] Trajectory Start Values

Describe the feature you'd like
Implement functionality to specify better start values for infinite variables. This will greatly help with convergence.

Does this pertain to a particular class of problems/techniques? If so explain.
All infinite problems

Describe what you currently do about this
Only a single value can be used. Trajectories cannot be given.

Additional context
Perhaps it would be nice to implement some automatic methods.

[FEATURE] Google Analytics

We should add Google Analytics to our docs via the analytics keyword in the make.jl file. We can emulate how JuMP does this.

[FEATURE] Vector Constraints

Describe the feature you'd like
We should extend capabilities to handle JuMP vector constraints such as semi-definite and conic.

Does this pertain to a particular class of problems/techniques? If so explain.
Semi-definite programming and conic programming

Describe what you currently do about this
Conic constraints can be worked around with quadratic constraints, but that approach will not work with conic solvers. Semi-definite constraints are not currently possible.

Constraint Measures to Enable Event Constraints

We can generalize our treatment of measures to allow constraint-like expressions as input to enable things like chance-constraints and exceedance probabilities, since all of these entities essentially implement a measure that operates on a constraint-like (i.e., threshold) condition.

Reformulation in these cases will typically involve expanding in place and defining auxiliary constraints and variables (e.g., big-M constraints). This would be a useful feature for better enabling risk measures as well. Programmatically, the paradigm we might want to adopt is tracking the variables and/or constraints that are made by measures to prevent unnecessary duplication.
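
For concreteness, a chance constraint is already a measure applied to a constraint-like condition: Pr[g(y(ξ), ξ) ≤ 0] = 𝔼[𝟙{g(y(ξ), ξ) ≤ 0}] ≥ 1 − α, so allowing constraint-like inputs to measure would cover chance constraints, exceedance probabilities, and related risk measures with one mechanism.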

Problem Simulation and Automatic Initialization

Describe the feature you'd like
Allow this to plug in to effective dynamic simulators to simulate the model. This could be used to generate better guess values in accordance with #43.

Does this pertain to a particular class of problems/techniques? If so explain.
Dynamic optimization problems.

Describe what you currently do about this
We don't do this at all.

[FEATURE] Make a Dependent Multi-Dimensional Parameter Object

Describe the feature you'd like
In connection with reworking the supports we should make a separate object for multi-dimensional infinite parameters that are dependent, so we can store their supports together. We should also make a macro to implement this, so independent and dependent parameters are defined with separate
macros.

Does this pertain to a particular class of problems/techniques? If so explain.
Principally, to multi-variate distribution random parameters.

Describe what you currently do about this
We currently treat them separately and account for them in a hacky way with group IDs.

Additional context
We'll need to update everything that depends on infinite parameters to accommodate this.

Update Documentation for v0.3.0

We need to update the documentation now that we support derivatives. This will involve the following:

  • Update out of date doc tests (e.g., ones that show InfiniteModels)
  • Write the derivative guide page
  • Add a section on derivative evaluation extensions to the extensions page

Add Direct Mode

Implement JuMP's direct mode in combination with optimizer models to achieve performance enhancements. This may help with #18.

Improve Infinite Set Extendability

Need to review and modify functions that use infinite sets to better accommodate extensions. This will
likely involve implementing dispatch methods. For example JuMP.has_lower_bound cannot be extended.

[FEATURE] Update to include JuMP 0.21.6 Features

New features to extend include:

  • JuMP.simplex_iterations (done)
  • JuMP.barrier_iterations (done)
  • JuMP.node_count (done)
  • JuMP.value (constraints with custom var_value function)
  • JuMP.relax_integrality (done)
  • JuMP.reduced_cost (done)
  • Maybe more...

[FEATURE] Support/Coefficient DataType

Describe the feature you'd like
We should investigate making a datatype to associate measure coefficients with supports, but should only implement this if there is no degradation in performance.

Does this pertain to a particular class of problems/techniques? If so explain.
All uses of measures.

Describe what you currently do about this
We store these attributes separately.

OptimizerFactory not defined

using InfiniteOpt,JuMP
[ Info: Precompiling InfiniteOpt [20393b10-9daf-11e9-18c9-8db751c92c57]
ERROR: LoadError: LoadError: UndefVarError: OptimizerFactory not defined
Stacktrace:
[1] getproperty(::Module, ::Symbol) at .\Base.jl:13
[2] top-level scope at C:\Users\Administrator.julia\packages\InfiniteOpt\7pV7u\src\datatypes.jl:63
[3] include at .\boot.jl:328 [inlined]
[4] include_relative(::Module, ::String) at .\loading.jl:1105
[5] include at .\Base.jl:31 [inlined]
[6] include(::String) at C:\Users\Administrator.julia\packages\InfiniteOpt\7pV7u\src\InfiniteOpt.jl:1
[7] top-level scope at C:\Users\Administrator.julia\packages\InfiniteOpt\7pV7u\src\InfiniteOpt.jl:13
[8] include at .\boot.jl:328 [inlined]
[9] include_relative(::Module, ::String) at .\loading.jl:1105
[10] include(::Module, ::String) at .\Base.jl:31
[11] top-level scope at none:2
[12] eval at .\boot.jl:330 [inlined]
[13] eval(::Expr) at .\client.jl:425
[14] top-level scope at .\none:3
in expression starting at C:\Users\Administrator.julia\packages\InfiniteOpt\7pV7u\src\datatypes.jl:63
in expression starting at C:\Users\Administrator.julia\packages\InfiniteOpt\7pV7u\src\InfiniteOpt.jl:13
ERROR: Failed to precompile InfiniteOpt [20393b10-9daf-11e9-18c9-8db751c92c57] to C:\Users\Administrator.julia\compiled\v1.3\InfiniteOpt\p3GvY_rYFwp.ji.
Stacktrace:
[1] error(::String) at .\error.jl:33
[2] compilecache(::Base.PkgId, ::String) at .\loading.jl:1283
[3] _require(::Base.PkgId) at .\loading.jl:1024
[4] require(::Base.PkgId) at .\loading.jl:922
[5] require(::Module, ::Symbol) at .\loading.jl:917

[FEATURE] Analytic Measure Evaluation

Add functionality in measure to detect integral measures whose expressions have no parameter dependence or no infinite variable dependence, and to derive and compute the associated constant value analytically from the integral form.

[FEATURE] Enhanced `InfiniteModel` Storage

Describe the feature you'd like
Reduce the number of fields used by InfiniteModel significantly and make it more efficient. Do this by defining structures to contain variables and all mapping information. Store these via CleverDicts using variable-specific indices with explicit types. Replace variable reference dependencies with these variable-specific indices. Maybe also consolidate the constraint reference structure.

Does this pertain to a particular class of problems/techniques? If so explain.
All infinite models.

Describe what you currently do about this
We use a lot of dictionaries and abstract types.

Finish Guide/Manual in Documentation

Need to finish the guide section in the documentation. In particular the following sections:

  • Variables
  • Measures
  • Objectives
  • Expressions
  • Constraints
  • Model Transcription
  • Optimization
  • Results

The corresponding docstrings need to be updated as well to use jldoctest and appropriate links.

Ordinary Differential Equations

Add syntax to support ordinary differential equations and separable partial differential equations with a single partial term. This could involve a native implementation or the use of bridge constraints. Point variables and measures will be used for evaluation. Also, make sure it displays in a way the user expects.

Macro Expression Bug

We get an error if we embed expressions like x +2y in macro calls. This can probably be resolved in combination with #5.

Option to Return Result Queries as N-dimensional Arrays

It would be convenient to be able to obtain the value and other quantities of variables and constraints as an n-dimensional array instead of a vector of tuples. This would enable slicing operations to more easily access the results when multiple independent parameters are at play.

Cannot Transcribe Scalar Measures/Derivatives of Dependent Infinite Parameters

Describe the bug
Currently we support modeling of measures and derivatives that partially evaluate dependent parameter dependencies; however, this cannot be transcribed in general. Fundamentally, this occurs because it produces reduced variables with partial dependent parameters, and it is not clear how to reformulate them. For example, consider dependent parameters x[1:2] with supports [0, 1], [1, 1], [0.5, 0]. Then, by virtue of a measure or derivative with an infinite variable y(x), we can obtain the reduced expression y([x[1], 0]) + y([x[1], 1]). The question is then how we would (or whether it even makes sense to) transcribe this function, given that x[1] is not restricted to a support and may have a different number of possible values for each reduced variable.

To Reproduce

using InfiniteOpt, JuMP 

m = InfiniteModel()
@infinite_parameter(m, x[1:2] in [0, 1])
@infinite_variable(m, y(x))
@objective(m, Min, integral(y, x[1]))
build_optimizer_model!(m)
ERROR: Unable to locate transcription variable by support, consider rebuilding the infinite model with less significant digits. Note this might be due to partially evaluating dependent parameters which is not supported by TranscriptionOpt.
Stacktrace:
 [1] error(::String) at .\error.jl:33
 [2] _supp_error() at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\model.jl:285
 [3] get at .\dict.jl:523 [inlined]
 [4] lookup_by_support(::Model, ::GeneralVariableRef, ::Type{ReducedVariableIndex}, ::Array{Float64,1}) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\model.jl:299
 [5] transcription_expression(::Model, ::GeneralVariableRef, ::Type{ReducedVariableIndex}, ::Array{Float64,1}) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\transcribe.jl:300
 [6] transcription_expression(::Model, ::GeneralVariableRef, ::Array{Float64,1}) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\transcribe.jl:289
 [7] macro expansion at C:\Users\pulsipher\.julia\packages\MutableArithmetics\H0Uof\src\rewrite.jl:224 [inlined]
 [8] macro expansion at C:\Users\pulsipher\.julia\packages\JuMP\MnJQc\src\macros.jl:45 [inlined]
 [9] transcription_expression(::Model, ::GenericAffExpr{Float64,GeneralVariableRef}, ::Array{Float64,1}) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\transcribe.jl:336
 [10] transcribe_measures!(::Model, ::InfiniteModel) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\transcribe.jl:387
 [11] build_transcription_model!(::Model, ::InfiniteModel) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\transcribe.jl:618
 [12] build_optimizer_model! at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\TranscriptionOpt\optimize.jl:30 [inlined]
 [13] #build_optimizer_model!#322(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::typeof(build_optimizer_model!), ::InfiniteModel) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\optimize.jl:532
 [14] build_optimizer_model!(::InfiniteModel) at C:\Users\pulsipher\.julia\dev\InfiniteOpt\src\optimize.jl:531
 [15] top-level scope at REPL[40]:1

Expected behavior
Currently, we throw an error at transcription, but perhaps we can consider removing this capability altogether to improve performance and simplicity. Alternatively, if this behavior makes sense mathematically, then perhaps we can enable TranscriptionOpt to handle this situation accordingly.

Infinite Parameter Function Objects to Embed in Expressions

Without an NLP interface, we cannot readily support general functions of infinite parameters that get evaluated at the transcription step. An example case would be an objective function of ∫(x(t) - sin(t))² dt. Here we could introduce some programmatic infinite parameter function d(t) = sin(t), where d(t) can be introduced into expressions normally and then evaluated to its numerical value at transcription.
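
For context, a hedged sketch of the kind of syntax this could take, modeled on the parameter function interface described in the current documentation (treat the macro name and form as assumptions about how this was eventually implemented):

using InfiniteOpt

model = InfiniteModel()
@infinite_parameter(model, t in [0, 2π])
@variable(model, x, Infinite(t))
# A known function of t that is only evaluated numerically at transcription:
@parameter_function(model, d == sin(t))
@objective(model, Min, integral((x - d)^2, t))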

[FEATURE] Differential Equation Definition

Desired Feature
Enable definition of [partial] derivatives with an arbitrary number of infinite parameter dependencies.

Where would this be applicable?
This is relevant in dynamic and space-time problems with ODEs and/or PDEs.

What is currently done?
Currently these have to be transcribed manually, using user-defined point variables to implement methods such as finite differences. Measures can also be used for ODEs.
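
To make the manual approach concrete, here is a hedged sketch of encoding dy/dt = -2y(t) with backward finite differences via point variables (the functional point-variable syntax y(τ) is assumed from the current documentation; older versions used a dedicated point-variable macro):

using InfiniteOpt

model = InfiniteModel()
@infinite_parameter(model, t in [0, 1], num_supports = 11)
@variable(model, y, Infinite(t))
ts = supports(t)
# Backward finite differences: y(t_k) - y(t_{k-1}) = -2 y(t_k) (t_k - t_{k-1})
for k in 2:length(ts)
    y_k, y_prev = y(ts[k]), y(ts[k-1])  # point variables at the two supports
    @constraint(model, y_k - y_prev == -2 * y_k * (ts[k] - ts[k-1]))
end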

Add Built-in Methods for Measure Data

Need to add built-in sampling and quadrature methods to better enable measures. The desired syntax is of the form measure(expr, param(s), lb = [param lb], ub = [param ub]; eval_method = [default_method]). Maybe the evaluation methods could be defined in a submodule.

[FEATURE] Nonlinear Expressions

Enable nonlinear expression support with @NLconstraint, @NLexpression, and @NLobjective. This probably will involve making our own graph storage structure to account for different variable types unlike JuMP. This structure will then need to be transcribable such that it can interface with JuMP so we can use JuMP's differentiation tools. Also, it should accommodate extensions.

[FEATURE] Add Check Free Mode

Add a mode for InfiniteModels such that all the checks are turned off to help speed up the performance. This should be implemented as a keyword argument in the InfiniteModel constructor that is then saved as an attribute. This attribute should then be used to turn on/off the checks present in InfiniteOpt.

[FEATURE] Automatic Transcription of Differential Equations

Describe the feature you'd like
Automatically transcribe models with ODEs and/or PDEs via TranscriptionOpt using finite difference, orthogonal collocation, and/or integrals.

Does this pertain to a particular class of problems/techniques? If so explain.
This is relevant to dynamic and space-time problems.

Describe what you currently do about this
Currently users have to manually encode such methods via point variables.

Additional context
This should be extendable to allow for a wide range of user-defined methods. This primarily has to deal with adding additional supports if needed (e.g., interior points) and with forming linking constraints to properly define the derivative variables.

Also this needs to be done in conjunction with #9.

[BUG] Reduced Infinite Variables

Describe the bug
Reduced infinite variables cannot account for partially transcribed parameter tuple array elements. This leads to incorrect measure expansion.

To Reproduce

using InfiniteOpt, JuMP
m=InfiniteModel()
@infinite_parameter(m, x[1:2] in [0, 2])
@infinite_parameter(m, t in [0, 1])
@infinite_variable(m, y(t, x))
expand(integral(y, x[1], num_supports = 2))
y(t, 0.02083778845386952) + y(t, 1.6710951524438982) # x dimensions are lost

Expected behavior
Reduced variables should be able to partially transcribe arrays of parameters and maintain structure.

Desktop (please complete the following information):

  • OS: Windows10
  • Package Versions : InfiniteOpt v0.1.0
  • Julia Version: 1.2

Additional context
This should be resolved by redoing how reduced variables are structured and made. Also, we should make a data object for better storing parameter tuples and propagate it through.

Update to MOI 0.9.16

This contains the fixed version of CleverDicts and can replace _get with Base.get.

[BUG] Resolve MeasureRef Operations

We need to check operations that involve MeasureRefs to check/handle the possibility of expressions that become non-quadratic when the measure is expanded.

Restructure to Distinguish Between Internal/Public Supports

With the addition of differential equations, which might employ methods like orthogonal collocation over finite elements, it becomes prudent to track the difference between public and internal supports (e.g., internal collocation nodes). Such a change will necessitate that TranscriptionOpt track which generated supports are internal/public. By default, supports should return only public supports and return the full set when indicated to do so.
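
For reference, a hedged sketch of how this distinction could surface in the query API (the label keyword and the All label are assumed from the current documentation):

supports(t)               # returns only the public supports by default
supports(t, label = All)  # also includes internal supports (e.g., collocation nodes)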

[FEATURE] Streamline TranscriptionOpt

Describe the feature you'd like
There are a number of ways to enhance the transcription process.

  • Implement #41
  • Pass along in _make_transcription_function if the expr has measures
  • Expand measures in place
  • Make transcribed constraints by replacing variables in place (complicated with parameters)
  • Track highest variable level encountered while expanding to avoid checking all at the end
  • Improve the support matching checks for floats
  • Avoid unnecessary conversions
  • Reduce quad expressions to aff expressions when appropriate

Describe what you currently do about this
We don't do any of the above.
