faradayinstitution / bpx
BPX schema in pydantic and JSON schema format, and parsers
License: MIT License
Provide support in the BPX standard for electrode OCPs to be defined as general functions of more parameters, in order to define a hysteresis model. The simplest such example could be the sigmoid hysteresis model (distinct charge/discharge OCP profiles) in PyBaMM.
[request from industrial contributor]
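One hedged sketch of what such an extension could look like (the nested charge/discharge field names are hypothetical, not part of the current BPX standard): the OCP entry becomes an object with separate branches, while a plain string keeps its current meaning.

```python
import json

# Hypothetical extension: "OCP [V]" may be an object with distinct
# charge/discharge branches instead of a single expression string.
# None of these nested field names are in the current BPX standard.
ocp_entry = {
    "OCP [V]": {
        "Charge": "0.2 + 0.9 * exp(-15.0 * x)",
        "Discharge": "0.15 + 0.9 * exp(-15.0 * x)",
    }
}

# A parser could fall back to a single expression when no hysteresis
# is defined, keeping existing files valid.
def ocp_branches(entry):
    value = entry["OCP [V]"]
    if isinstance(value, str):
        return {"Charge": value, "Discharge": value}
    return value

print(json.dumps(ocp_branches(ocp_entry), indent=2))
```

The fallback keeps backwards compatibility: files written against the current schema would validate unchanged.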
Add version number to a parameter set, separate from (in addition to) the BPX version number.
This would allow us to fix mistakes or inaccuracies in parameter sets while maintaining old versions (and hence not changing any previous results/publications that have been obtained using an old version).
It can probably just be a single version number, doesn't need to be semantic versioning.
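A minimal sketch of what such a header could look like; the "Parameter set version" field name is an assumption, not part of the current standard.

```python
# Hypothetical header carrying a parameter-set version alongside the
# existing BPX schema version. "Parameter set version" is a proposed
# field name, not part of the current standard.
header = {
    "BPX": 0.4,                  # schema version (existing field)
    "Parameter set version": 2,  # proposed: bump when fixing mistakes
    "Title": "Example NMC pouch cell",
}

def is_newer(a, b):
    """Compare two headers by parameter-set version. A plain integer
    suffices; semantic versioning is not needed here."""
    return a["Parameter set version"] > b["Parameter set version"]
```

Old versions of a parameter set can then be archived unchanged, so results obtained with them remain reproducible.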
Does it have to go via a tempfile?
https://github.com/pybamm-team/BPX/blob/70f1637b239489e92e828c46c73655f14c0f4937/bpx/function.py#L54-L66
Can the function not be evaluated directly? Is this for safety reasons?
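A direct-evaluation alternative could look like the sketch below, which evaluates the expression string against a restricted namespace instead of writing it to a temporary module. This is an illustration of the question, not the actual bpx implementation.

```python
import math

# Sketch: evaluate a BPX function string directly with a restricted
# namespace, rather than round-tripping through a temporary file.
# Not the actual bpx implementation.
ALLOWED = {"exp": math.exp, "tanh": math.tanh, "__builtins__": {}}

def make_function(expression):
    def f(x):
        return eval(expression, ALLOWED, {"x": x})
    return f

ocp = make_function("2.0 * exp(-1.0 * x) + tanh(x)")
print(ocp(0.0))  # 2.0
```

Note that eval with a stripped `__builtins__` is still not a real sandbox, which may be exactly the safety concern behind the tempfile route.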
Checking for required parameters should be done by model (e.g. SPM needs fewer parameters than DFN). This would allow users to share "partial" BPX files that are sufficient for reduced models.
spotted by @ikorotkin
The BPX schema includes a field ["Parameterisation"]["Cell"]["Thermal conductivity [W.m-1.K-1]"]. However, this parameter is not defined or used in the corresponding mathematical specification for any model.
Proposed fix:
Recommended to fix after v0.4 release.
Currently, the "User-defined" section requires that all fields are floats, strings, or dicts of the form {"x": x_data, "y": y_data}, which then get converted into FloatFunctionTable objects. We should allow users to add their own JSON structure within this field, so long as the final fields are of type FloatFunctionTable. This requires more carefully defining what is and is not an InterpolatedTable.
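One way to phrase that rule is a recursive walk over "User-defined": arbitrary nesting is allowed, provided every leaf is a float, a string, or a table-shaped dict. The helper below is a hypothetical sketch, not bpx API.

```python
# Sketch of a recursive check over "User-defined": nested dicts are
# allowed, provided every leaf is a float, a string, or a table of the
# form {"x": [...], "y": [...]}. Hypothetical helper, not bpx API.
def is_table(value):
    return (
        isinstance(value, dict)
        and set(value) == {"x", "y"}
        and len(value["x"]) == len(value["y"])
    )

def leaves_are_valid(node):
    if isinstance(node, (float, int, str)) or is_table(node):
        return True
    if isinstance(node, dict):
        return all(leaves_are_valid(v) for v in node.values())
    return False

user_defined = {
    "My group": {"Scalar": 1.5, "Table": {"x": [0, 1], "y": [2, 3]}}
}
print(leaves_are_valid(user_defined))  # True
```

Anything dict-shaped with exactly the keys "x" and "y" is treated as a table here, which is the ambiguity the issue points at: the schema needs a sharper definition of what counts as an InterpolatedTable.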
Should be 'Surface area per unit volume [m-1]'
In BPX/tests/test_utilities.py there is a typo: "Positive electrode Maximum concentration [mol.m-3]": 631040 should be 63104.0.
Relates to #4
Consider adding support within BPX for standard degradation modes that can be expressed within a physics-based model:
Simon Clark (SINTEF) wrote:
Generally, I am thinking if we can structure this in a way that will support higher dimensions (e.g. P3D, P4D) in the future? That is affected mostly by the electrode parameter structure. As it currently is implemented, the electrode level contains information about the coating (e.g. thickness, porosity, transport efficiency, etc.) and the active material (e.g. stoichiometry, OCP, etc.) while omitting information about the current collectors. While the current collectors are not important for P2D models, they can be very important in P4D models. Also, enabling the possibility to set the composition of the coating (e.g. X wt% active material, Y wt% carbon black) would be good – even if it is not directly supported by PyBaMM at the moment. That would look something like this (just as a pseudo-code example):
{
    "Negative electrode": {
        "Coating": {
            "Thickness [m]": 4.44e-05,
            "Porosity": 0.20666,
            "Transport efficiency": 0.09395,
            "Active material": {
                "name": "Graphite",
                "Mass fraction": 0.92,
                "Diffusivity [m2.s-1]": 9.6e-15,
                "OCP [V]": "5.29210878e+01 * exp(-1.72699386e+02 * x) - 1.17963399e+03 + 1.20956356e+03 * tanh(6.72033948e+01 * (x + 2.44746396e-02)) + 4.52430314e-02 * tanh(-1.47542326e+01 * (x - 1.62746053e-01)) + 2.01855800e+01 * tanh(-2.46666302e+01 * (x - 1.12986136e+00)) + 2.01708039e-02 * tanh(-1.19900231e+01 * (x - 5.49773440e-01)) + 4.99616805e+01 * tanh(-6.11370883e+01 * (x + 4.69382558e-03))",
                "Entropic change coefficient [V.K-1]": "(-0.1112 * x + 0.02914 + 0.3561 * exp(-((x - 0.08309) ** 2) / 0.004616)) / 1000",
                "Conductivity [S.m-1]": 7.46,
                "Surface area per unit volume [m-1]": 473004,
                "Reaction rate constant [mol.m-2.s-1]": 6.872E-06,
                "Minimum stoichiometry": 0.0016261
            },
            "Conductive additive": {
                "name": "Carbon black",
                "Mass fraction": 0.04
            },
            "Binder": {
                "name": "CMC",
                "Mass fraction": 0.04
            }
        },
        "Current collector": {
            "name": "Copper foil",
            "Thickness [µm]": 10
        }
    }
}
Remove 'Cell height [m]', 'Cell width [m]', 'Cell thickness [m]', 'Cell diameter [m]', and add "Cell volume [m3]" and "Cell surface area [m2]" instead. Allow/encourage people to add info on cell form factor and detailed dimensions in the "description" field. This will be enough for lumped thermal models.
When parsing a BPX we should check 1) that any given parameter is valid (a float, or a function that can be evaluated), and 2) how "complete" the BPX is (which model, if any, you can simulate).
This allows sharing partial BPX files, e.g. for a single electrode.
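A completeness check of this kind could be sketched as a per-model set of required fields; the sets below are illustrative placeholders, not the actual requirements of either model.

```python
# Sketch of a "completeness" check: which model(s) could be simulated
# with the parameters present? The field sets are illustrative
# placeholders, not the real per-model requirements.
REQUIRED = {
    "SPM": {"Thickness [m]", "Particle radius [m]",
            "Maximum concentration [mol.m-3]"},
    "DFN": {"Thickness [m]", "Particle radius [m]",
            "Maximum concentration [mol.m-3]",
            "Porosity", "Transport efficiency"},
}

def supported_models(fields):
    present = set(fields)
    return sorted(m for m, needed in REQUIRED.items() if needed <= present)

partial = {"Thickness [m]": 8.5e-5, "Particle radius [m]": 5.8e-6,
           "Maximum concentration [mol.m-3]": 63104.0}
print(supported_models(partial))  # ['SPM']
```

A partial file for a single electrode would then report an empty list for full-cell models while still validating field by field.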
Add a half-cell variant of BPX - this would require writing down the model equations in the standard and having a slightly different schema and validation.
Update to 'Reaction rate constant [mol.m-2.s-1]'
In the presence of an invalid BPX object, the BPX parser reports non-existent errors in addition to real errors.

import bpx
bpx.parse_bpx_file("examples/nmc_pouch_cell_BPX.json", v_tol=1)

Here, the file is parsed with no errors (valid BPX). Now remove the following line from nmc_pouch_cell_BPX.json:

"Ambient temperature [K]": 298.15,

and parse again:

import bpx
bpx.parse_bpx_file("examples/nmc_pouch_cell_BPX.json", v_tol=1)

The parser raises 40 errors, most of which are not correctly identified (spurious reports of missing or extra fields). The only real error (the missing required field ["Parameterisation"]["Cell"]["Ambient temperature [K]"]) is raised twice, as the first and last reported error.
Recommended by Simon Clark (SINTEF).
Raise an error if there are fields in the JSON file that aren't in the schema
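The check amounts to a set difference between the file's keys and the schema's. With pydantic models this is what the "forbid extra fields" configuration does; the sketch below shows the idea as a plain comparison, with a placeholder field set.

```python
# Sketch of detecting fields that are not in the schema, so the parser
# can raise instead of silently ignoring them. SCHEMA_FIELDS here is a
# tiny placeholder, not the real BPX field list.
SCHEMA_FIELDS = {"Ambient temperature [K]", "Nominal cell capacity [A.h]"}

def extra_fields(section):
    return sorted(set(section) - SCHEMA_FIELDS)

print(extra_fields({"Ambient temperature [K]": 298.15, "Colour": "blue"}))
# ['Colour']
```

Silently dropped fields are a common source of hard-to-spot typos (a misspelled key simply vanishes), which is the case for raising here.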
Add to the BPX standard a specifically defined equation set for "SPMe", within which the BPX parameters are used.
This will require some discussion on whether the PyBaMM implementation is appropriate, or if the canonical SPMe defined by Marquis 2019 should be used. See also pybamm-team/PyBaMM#4145
Simon Clark (SINTEF) wrote:
Maybe also include some terms for bibliographic metadata (e.g. dcterms) to track the creator and source of the parameters for traceability. That might actually be another good argument for defining quantities as objects:
"Thickness": {
"value": 10E-6,
"unit": "m",
"source": "https://doi.org/10.1149/1945-7111/ab9050"
}
The use of BPX is currently limited by the restrictive set of mathematical operations that are allowed in FloatFunctionTable inputs. The schema currently supports the following operations: *, /, -, +, **, exp, tanh.
Suggestions include adding > and < to support piecewise functions as sums, e.g. (x < 0) * 1 + (x > 0) * 0.5.
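In Python, comparisons already behave as 0/1 in arithmetic, so the proposed piecewise-as-sum expression evaluates directly; a minimal sketch:

```python
# The suggested comparison operators let piecewise functions be written
# as sums of indicator terms: each comparison contributes 0 or 1.
def piecewise(x):
    return (x < 0) * 1 + (x > 0) * 0.5

print(piecewise(-2.0))  # 1.0
print(piecewise(3.0))   # 0.5
```

Note that at x = 0 both indicators are false, so this particular expression returns 0, which is something schema authors would want to flag.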
[...] I think it's an irritation of the present BPX that it's necessary to define the mass loading of active components indirectly, via the specification of a surface area and a particle radius combined with the assumption of spherical particles.
I think it would be more comprehensible to be able (optionally) to introduce some ability to specify e.g. a mass fraction and corresponding specific capacity of each blend component, and make the active volumetric surface area a dependent quantity under the equivalent spherical particle assumption.
Originally posted by @ejfdickinson in #33 (comment)
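Under the equivalent-spherical-particle assumption the relation is a = 3ε/R, so the volumetric surface area could be made a derived quantity. In the sketch below, the active-material volume fraction is taken as given; in the proposal it would itself be derived from a mass fraction and densities.

```python
# Under the equivalent-spherical-particle assumption, the volumetric
# active surface area follows from the active-material volume fraction
# eps_active and the particle radius: a = 3 * eps_active / radius.
def surface_area_per_unit_volume(eps_active, radius):
    return 3.0 * eps_active / radius

a = surface_area_per_unit_volume(0.5, 1e-5)
print(round(a))  # 150000
```

Making this a dependent quantity would remove the current indirection where mass loading is implied by a surface area plus a radius.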
Some mathematical symbols could be more verbose, to avoid confusion of a local or spatially-resolved quantity with a lumped quantity. For example, replace
Derived from a recommendation by Edwin Knobbe (BMW).
Provide support in the BPX standard for electrodes to contain a specified blend of different materials (e.g. by weight or atom fraction) with discretely defined OCP functions and maximum concentrations.
[request from industrial contributor]
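A hedged sketch of what a blend could look like: each component carries its own mass fraction, OCP, and maximum concentration. All field names and values below are illustrative, not part of the current BPX standard.

```python
# Hypothetical blended electrode: per-component mass fraction, OCP and
# maximum concentration. Names and numbers are illustrative only.
blend = {
    "Active materials": {
        "Graphite": {
            "Mass fraction": 0.9,
            "Maximum concentration [mol.m-3]": 33133.0,
            "OCP [V]": "0.2 + 0.9 * exp(-15.0 * x)",
        },
        "Silicon": {
            "Mass fraction": 0.1,
            "Maximum concentration [mol.m-3]": 310000.0,
            "OCP [V]": "0.4 + 0.5 * exp(-10.0 * x)",
        },
    }
}

# A validator could require the mass fractions to sum to one.
total = sum(m["Mass fraction"] for m in blend["Active materials"].values())
assert abs(total - 1.0) < 1e-9
```

This shape also leaves room for the composition fields (conductive additive, binder) suggested elsewhere in this thread.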
Simon Clark (SINTEF) wrote:
I’m wondering if including the units as part of the key name is wise? A quantity always has two parts: a value and a unit, and for flexibility it is helpful to express it that way. For example [below] which allows you a little more flexibility in supporting alternative units or unit conversions. It does add an extra level, but I like the clarity of it. It also brings in the option to add source information for where parameters came from (see point on creator metadata below). If the unit is part of the key name, then you need to either (a) already know what the unit is or (b) parse it out from the string, which adds a level of ambiguity / effort.
Example:
Replace:
"Thickness [m]": 10e-6
with
"Thickness": {
"value": 10E-6,
"unit": "m"
}
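The parsing ambiguity Simon mentions can be made concrete: with units embedded in key names, a consumer has to extract them from the string. The hypothetical helper below converts the current style into the proposed value/unit objects.

```python
import re

# Hypothetical converter from the current "Name [unit]" key style to
# the proposed value/unit objects. Dimensionless keys get unit None.
def split_unit(key):
    match = re.fullmatch(r"(.+?) \[(.+)\]", key)
    if match is None:
        return key, None  # dimensionless, e.g. "Porosity"
    return match.group(1), match.group(2)

def to_object_style(params):
    out = {}
    for key, value in params.items():
        name, unit = split_unit(key)
        out[name] = {"value": value, "unit": unit}
    return out

print(to_object_style({"Thickness [m]": 10e-6, "Porosity": 0.47}))
```

The regex is exactly the extra effort the comment warns about: the unit has to be recovered from a string convention rather than read from a field.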
Parameters required for a lumped thermal model are currently optional. Check that if e.g. activation energy is supplied then a reference temperature is also supplied.
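A co-occurrence check of this kind could look like the sketch below; the field names are illustrative, not the exact BPX keys.

```python
# Sketch of a co-occurrence check: if an Arrhenius activation energy is
# supplied, the corresponding reference temperature must be too.
# Field names are illustrative.
def check_thermal_pairs(cell):
    errors = []
    if ("Activation energy [J.mol-1]" in cell
            and "Reference temperature [K]" not in cell):
        errors.append("Activation energy given without a reference temperature")
    return errors

print(check_thermal_pairs({"Activation energy [J.mol-1]": 17800.0}))
```

Keeping the fields individually optional but validating them as a group preserves flexibility for isothermal use while catching incomplete thermal parameterisations.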
The latter is preferred.
The model, parameterization and how they were simulated are intimately connected. For example, how many mesh points did I need in my model to achieve convergence or the desired accuracy? We should add some simulation metadata so that people providing a BPX can say "I tested this model on this platform using this approach and it worked". Obviously, users are free to use a different platform, but this is useful (and sometimes necessary) information.
Things like how the model should be discretised in space (method, gridpoints, etc.) and solver settings.
Suggestion from Darryl at About:Energy, seconded by me!
Provide a utility function that will take validation data in a suitable format (for example .csv) and merge it with an existing BPX .json. Include some sanity checking (for example, input data contain time series that are the same length of array).
Maybe also consider a utility function that will strip validation data (or some subset of it) from the BPX .json.
Ensure that any generated BPX .json passes the schema validation in parse_bpx_file() before being returned.
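The merge utility could be sketched as below: read time-series columns from a CSV and attach them under a validation field, checking that the series line up. The "Validation" key and CSV column names are hypothetical.

```python
import csv
import io

# Sketch of the proposed utility: read time-series validation data from
# CSV and attach it to the BPX dict, with a sanity check that every
# series has the same length. Field names are hypothetical.
def merge_validation(bpx_dict, experiment_name, csv_text):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    columns = {name: [float(r[name]) for r in rows] for name in rows[0]}
    lengths = {len(v) for v in columns.values()}
    if len(lengths) != 1:
        raise ValueError("validation time series have mismatched lengths")
    bpx_dict.setdefault("Validation", {})[experiment_name] = columns
    return bpx_dict

doc = merge_validation({}, "1C discharge",
                       "Time [s],Voltage [V]\n0,4.2\n1,4.1\n")
print(sorted(doc["Validation"]["1C discharge"]))
```

The stripping utility would be the inverse: pop the "Validation" key (or selected experiments) before sharing the file.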
The Pydantic V2 release broke the PyBaMM build. This has been temporarily fixed by pinning the pydantic version in #35, but we still need to make the changes required to migrate to Pydantic 2.
Here's the Migration Guide.
For easily adding extra parameters that don't fall within the specification.
Alternatively, this could be done manually by first loading the BPX file and then adding parameters in a Python script.
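The manual route is a few lines; here it is sketched on an in-memory document (swap the string for an open() call on a real file; the header contents are placeholders).

```python
import json

# Manual alternative: load the BPX JSON, add a parameter under
# "User-defined", and write it back out. Shown on an in-memory
# document; use open("my_cell_BPX.json") in practice.
raw = '{"Header": {"BPX": 0.4}, "Parameterisation": {}}'
doc = json.loads(raw)

doc["Parameterisation"].setdefault("User-defined", {})["My parameter"] = 1.23

print(json.dumps(doc["Parameterisation"]["User-defined"]))
```

A dedicated utility would mainly add validation on top of this, so the extra parameters still round-trip through the schema.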
"validation" field should be a dict from experiment names to Experiment
model
Is there a workflow for suggesting new fields? Something like a lightweight version of a PEP?
Could/should limits be specified by the BPX for inputs into functions? e.g. concentration between 0 and 3 molar, temperature between 10 and 40 Celsius. See pybamm-team/PyBaMM#989