powergridmodel / power-grid-model
Python/C++ library for distribution power system analysis
License: Mozilla Public License 2.0
The minimum degree reordering might always have zero edges.
As suggested in Sonar Cloud
(https://sonarcloud.io/summary/overall?id=PowerGridModel_power-grid-model),
update sonar-project.properties if that's the intended choice, and document the design choice in that file.
A suggestion to add Probot: DCO to the project.
Probot:DCO is a GitHub Integration built with probot that enforces the Developer Certificate of Origin (DCO) on Pull Requests. It requires all commit messages to contain the Signed-off-by line with an email address that matches the commit author.
From Insight into Loads, we would like to use the voltage measurements from FlexOVL smart meters, which are usually situated on or near the LV rack of a secondary substation, and include these in a state-estimation calculation. A challenge here is that the tap position of the distribution transformer can have a significant influence on the effect of this measurement on the estimated state of the rest of the (MV) grid.
I was wondering whether you see possibilities for including the uncertainty in the tap position of the transformer in the state estimation of the power-grid-model. For example, when the tap position is set to 3, but with position 4 the total residuals of the measurements would be significantly lower, choose that position. My first suspicion is that this isn't trivial. It seems easier to determine this over several time steps: if the load, and thus the voltage drop, is very low, a wrong tap position is easier to detect than with a high load. But then a calculation/analysis over several time steps would be necessary. If this feature works well, it could also mean a nice data improvement that yields much more than just for our team.
Describe the bug
Starting from 1.3.250 I am getting segmentation faults when running .calculate_power_flow()
Input Data Validity
Did you check the validity of the input data using the helper function? See the examples
[here](../blob/main/examples/Validation Examples.ipynb).
Data Dump
no datadump for now since I am not sure whether I can share the data outside alliander. Shared internally with @TonyXiang8787
To Reproduce
input_data = load_json()  # placeholder in the report: loads the (internal) input data set
model = PowerGridModel(input_data, system_frequency=50.0)
model.calculate_power_flow(calculation_method=CalculationMethod.linear)
Expected behavior
A traceback instead of a segmentation fault would be helpful.
Acceptance criteria
The calculate_state_estimation functionality is a statistical optimization of the existing state, based on measurements. Those measurements are, inherently, not perfect, so variance in the optimal state is expected. This feature request is about exposing that variance to the users.
We need short circuit calculation for fault analysis.
To provide insight into the code quality, I would suggest adding power-grid-model to SonarCloud (https://sonarcloud.io/project/configuration?id=alliander-opensource_power-grid-model) via GitHub Actions.
The Weather Provider API project is a good example. Badges can also be used to show that the quality gate is passed and what the maintainability and security ratings are. See: https://github.com/alliander-opensource/Weather-Provider-API & https://sonarcloud.io/project/overview?id=alliander-opensource_Weather-Provider-API
The tap_size property should be allowed to be exactly zero when validating data with assert_valid_input_data.
BR,
Santiago
v3 of Catch2
is released. It has some major breaking changes. The current C++ unit tests will fail to build with v3 of Catch2.
See migration tutorial here:
https://github.com/catchorg/Catch2/blob/devel/docs/migrate-v2-to-v3.md#top
It would be nice to have a voltage-controlled generator model, since it would allow the simulation of broader grid models.
The generator will be modeled as a PV (active power + voltage magnitude set point) bus in the calculation engine.
BR,
Santiago
Describe the feature request
Adjust the pipeline to build a macOS universal wheel file which is compatible with both x86_64 and arm64.
Requirement
Either
It would be nice if the numpy dtypes of the PGM input and output arrays were directly accessible.
This would make it easier to extend PGM arrays (e.g. as done in the APG project).
Example of exposed dtypes:
class Base:
    """Base dtype"""

    id: np.int8 = np.iinfo(np.int8).min

class Node(Base):
    """Node data type"""

    u_rated: np.float64 = np.nan
    node_type: np.int8 = np.iinfo(np.int8).min
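If those dtypes were exposed, extending an array type could be straightforward. A minimal sketch with plain numpy, where the field layout of node_dtype is an assumption for illustration, not the actual PGM dtype:

```python
import numpy as np

# Hypothetical stand-in for an exposed PGM structured dtype.
node_dtype = np.dtype([("id", "i4"), ("u_rated", "f8")])

# Extending it with a project-specific field (as e.g. an APG-style project
# might do) is then just concatenating field descriptions.
extended_node_dtype = np.dtype(node_dtype.descr + [("zone", "i1")])

node = np.zeros(2, dtype=extended_node_dtype)
node["id"] = [1, 2]
node["u_rated"] = 10.5e3
node["zone"] = [0, 1]
```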
We have code/documentation duplication of the input/output/update attributes of components at many places:
Describe the attributes in central JSON or YAML files. Use a Python script to generate code/documentation automatically. In this way we avoid duplication.
The issue #82 should be solved by this.
The test files should be in comparable folder/name structure as the structure of the source codes.
Eigen version 3.4 is released. We can migrate the code to use Eigen 3.4. Some alignment issues will become easier to handle.
Use pre-commit to check code quality locally before committing
The pre-commit
tool is a very handy tool to run linters etc before committing the code. It is up to the user to activate the pre-commit hook or not.
I would suggest adding the following .pre-commit-config.yaml
file:
repos:
  - repo: https://github.com/fsfe/reuse-tool
    rev: v1.0.0
    hooks:
      - id: reuse
  - repo: https://github.com/psf/black
    rev: 22.6.0
    hooks:
      - id: black
        language_version: python3.8
Then we need to add pre-commit
to the dev requirements and users can choose to activate it by executing:
pre-commit install
I would like to extend the pre-commit file with isort, flake8 and mypy, but then we should also add them to the GitHub pipeline:
  - repo: https://github.com/pycqa/isort
    rev: 5.10.1
    hooks:
      - id: isort
  - repo: https://github.com/pycqa/flake8
    rev: 4.0.1
    hooks:
      - id: flake8
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v0.971
    hooks:
      - id: mypy
Currently the versioning on the main branch is major.minor.build_number, which can grow pretty fast as there are many builds on feature branches.
It is desired to retrieve the latest version from PyPI instead of hard-coding major.minor.0. This avoids the error of putting the wrong number on the Python side.
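A rough sketch of how the latest released version could be picked up on the Python side; the JSON shape here is a simplified stand-in for the real PyPI API response, and the version arithmetic is only illustrative:

```python
import json

# Simplified stand-in for the PyPI JSON API response for a package.
sample_response = json.loads('{"releases": {"1.2.0": [], "1.2.5": [], "1.3.12": []}}')

# Parse versions as integer tuples so they compare correctly (1.3.12 > 1.3.2).
versions = [tuple(map(int, v.split("."))) for v in sample_response["releases"]]
major, minor, build = max(versions)

# Derive the next build number from the published state instead of a manual counter.
next_version = f"{major}.{minor}.{build + 1}"
```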
To make the C++ code easier to understand, it would be nice to improve the code documentation.
Tasks:
Describe the bug
The batch validation probably does not work for asym loads. This is likely because asym load batches are 3D arrays, while other components give only 2D arrays.
To Reproduce
# some basic imports
from power_grid_model import PowerGridModel, CalculationMethod, CalculationType, LoadGenType
from power_grid_model import initialize_array
from power_grid_model.validation import assert_valid_input_data, assert_valid_batch_data
# node
node = initialize_array("input", "node", 3)
node["id"] = [1, 2, 6]
node["u_rated"] = [10.5e3, 10.5e3, 10.5e3]
# line
line = initialize_array("input", "line", 3)
line["id"] = [3, 5, 8]
line["from_node"] = [1, 2, 1]
line["to_node"] = [2, 6, 6]
line["from_status"] = [1, 1, 1]
line["to_status"] = [1, 1, 1]
line["r1"] = [0.25, 0.25, 0.25]
line["x1"] = [0.2, 0.2, 0.2]
line["c1"] = [10e-6, 10e-6, 10e-6]
line["tan1"] = [0.0, 0.0, 0.0]
line["i_n"] = [1000, 1000, 1000]
line["r0"] = [0.25, 0.25, 0.25]
line["x0"] = [0.2, 0.2, 0.2]
line["c0"] = [10e-6, 10e-6, 10e-6]
line["tan0"] = [0.0, 0.0, 0.0]
# load
asym_load = initialize_array("input", "asym_load", 2)
asym_load["id"] = [4, 7]
asym_load["node"] = [2, 6]
asym_load["status"] = [1, 1]
asym_load["type"] = [LoadGenType.const_power, LoadGenType.const_power]
asym_load["p_specified"] = [[20e6, 21e6, 10e6], [10e6, 15e6, 10e6]]
asym_load["q_specified"] = [[5e6, 5e6, 2e6], [1e6, 2e6, 3e6]]
# source
source = initialize_array("input", "source", 1)
source["id"] = [10]
source["node"] = [1]
source["status"] = [1]
source["u_ref"] = [1.0]
# all
input_data = {
"node": node,
"line": line,
"asym_load": asym_load,
"source": source
}
# batch data
asym_load_update = initialize_array("update", "asym_load", (2, 2))
asym_load_update["id"] = [[4, 7], [4, 7]]
asym_load_update["p_specified"] = [[[20e6, 21e6, 10e6], [10e6, 15e6, 10e6]], [[31e6, 31e6, 20e6], [20e6, 25e6, 20e6]]]
batch_data = {"asym_load": asym_load_update}
assert_valid_input_data(input_data=input_data, calculation_type=CalculationType.power_flow, symmetric=False)
assert_valid_batch_data(input_data=input_data, update_data=batch_data, calculation_type=CalculationType.power_flow, symmetric=False)
Screenshots
Additional context
Fix: Maybe we can add this before line 219? Would that be right/enough?
if mask.ndim == 2:
    mask = mask.prod(axis=1, dtype=bool)  # use the builtin bool; np.bool is deprecated
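A self-contained sketch of what that collapse would do for an asym-load mask; all(axis=-1) is equivalent to a boolean prod, and the shapes here are illustrative, not taken from the actual validation code:

```python
import numpy as np

# Per-phase validity mask for 2 asym loads (3 phases each).
mask = np.array([[True, True, False],
                 [True, True, True]])

# Collapse the phase axis so asym (2-D) masks behave like sym (1-D) masks:
# an object only counts if all of its phases do.
if mask.ndim == 2:
    mask = mask.all(axis=-1)
```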
False positives in is_component_update_independent
Currently the function checks if all batches contain exactly the same set of object IDs. If so, the batches are considered independent. However, the actual data in the batch can contain NaN values, in which case the update is not independent.
A (partial) example of input data for two sources and two seemingly independent batch updates:
Input
id node status u_ref u_ref_angle ...
1 1001 1 10000.0 0.0 ...
2 1002 1 10000.0 0.0 ...
Update batch #1
id status u_ref u_ref_angle
1 NaN 10100.0 NaN
2 NaN NaN 0.01
Update batch #2
id status u_ref u_ref_angle
1 NaN NaN 0.02
2 NaN 10200.0 NaN
In general, if the batches are independent, there is no need to copy the input data before applying each batch. However, when a batch contains a NaN value (at a position where another batch contains a non-NaN value), some values may not be overwritten, so you might end up with values from batch 1 in batch 2, etc.
Batch #1
id node status u_ref u_ref_angle ...
1 1001 1 10100.0 0.0 ...
2 1002 1 10000.0 0.01 ...
Batch #2
id node status u_ref u_ref_angle ...
1 1001 1 10100.0 0.02 ...
2 1002 1 10200.0 0.01 ...
Notice how the batch update data from batch 1 is still present in batch 2.
Expected behavior
We'd expect is_component_update_independent
to return false
in this case.
Guiding directions
We should add a check, for each attribute of each object, for a NaN value following a non-NaN value when we loop through the batches from batch 0 to N. A simpler implementation would be to check that either all values are NaN or none of the values are NaN.
#177 might fix the problem automatically.
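The simpler check described above (per object and attribute, either all batches are NaN or none are) could be sketched as follows; the function name and array layout are assumptions for illustration, not the actual validation API:

```python
import numpy as np

def is_attribute_independent(values: np.ndarray) -> bool:
    """values has shape (n_batches, n_objects); NaN means 'not updated'."""
    nan = np.isnan(values)
    all_nan = nan.all(axis=0)      # object never updated in any batch
    none_nan = (~nan).all(axis=0)  # object updated in every batch
    return bool(np.all(all_nan | none_nan))

# u_ref column from the example: batch 1 updates source 1, batch 2 updates source 2.
u_ref = np.array([[10100.0, np.nan],
                  [np.nan, 10200.0]])
```

For this u_ref column the check returns False, matching the expected behavior described above.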
eigen3
is slow in many scenarios.
arm64 (macOS, Linux).
It is an output attribute.
Currently the link is modeled as a small impedance. This can cause numerical issues.
It is proposed to merge the nodes connected by link into one single bus.
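One way to sketch the proposed merging is a union-find over the link topology; the node ids and link pairs below are an invented example, not from the issue:

```python
# Union-find with path compression: nodes connected by a link collapse
# into one representative bus.
def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path compression
        i = parent[i]
    return i

nodes = [1, 2, 3, 4]
links = [(1, 2), (3, 4)]  # assumed example topology

parent = {n: n for n in nodes}
for a, b in links:
    parent[find(parent, a)] = find(parent, b)

# Each node maps to its merged bus representative.
buses = {n: find(parent, n) for n in nodes}
```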
Acceptance criteria
Transformer functionality should be tested in test_main_model.hpp.
Similar tests should be performed as for the other components.
Describe the problem
When a three-phase (a.k.a. asymmetrical) value is updated with one or two NaN values, the original value is lost.
To Reproduce
In general, the logic for updating values is:
if (!is_nan(update_value)) {
value = update_value;
}
Three phase values are tuples of three values like (230.1, 230.2, 230.3)
. In the C++ code, a three phase value is considered a single value which can be NaN or non-NaN. Currently it is considered NaN iff all three values are NaN:
value | is_nan
--- | ---
(NaN, NaN, NaN) | ✔
(NaN, 0.0, NaN) | ❌
(0.0, NaN, 0.0) | ❌
(0.0, 0.0, 0.0) | ❌
It follows that updating (230.1, 230.2, 230.3)
with (NaN, NaN, 0.0)
(which is not NaN
) results in (NaN, NaN, 0.0)
.
Improved behavior
After some discussion, we decided that updating (230.1, 230.2, 230.3)
with (NaN, NaN, 0.0)
should result in (230.1, 230.2, 0.0)
as that gives the most flexibility and it follows the logic of updating scalar values.
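The improved semantics can be sketched with a per-phase numpy update, mirroring the scalar is_nan logic element-wise:

```python
import numpy as np

# Per-phase update: NaN entries in the update keep the original phase value,
# matching the logic of updating scalar values.
value = np.array([230.1, 230.2, 230.3])
update = np.array([np.nan, np.nan, 0.0])

value = np.where(np.isnan(update), value, update)
# value is now [230.1, 230.2, 0.0]
```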
Additional remarks
The Python validation code follows the same behavior as the C++ code (when issue #91 is fixed), so make sure that the validation code is updated as well.
Dependabot takes the effort out of maintaining your dependencies. You can use it to ensure that your repository automatically keeps up with the latest releases of the packages and applications it depends on.
You enable Dependabot version updates by checking a configuration file into your repository. The configuration file specifies the location of the manifest, or of other package definition files, stored in your repository. Dependabot uses this information to check for outdated packages and applications. Dependabot determines if there is a new version of a dependency by looking at the semantic versioning (semver) of the dependency to decide whether it should update to that version.
For more information see: https://docs.github.com/en/code-security/supply-chain-security/keeping-your-dependencies-updated-automatically/about-dependabot-version-updates
GitHub also supports Dependabot, see: https://github.com/alliander-opensource/power-grid-model/network/updates
When we have the calculation results, they are in numpy structured arrays. For example, for the node you have:
Input
id u_pu ...
7 1.05 ...
10 1.07 ...
8 1.07 ...
103 1.0 ...
Say we want to retrieve a sub-array with node id 10
and 103
. To do this, we need to translate the list of ids to a list of position numbers in the original numpy array. The C++ core already has a hash table to do the lookup.
Say we have a numpy array a = [10, 103]
. We should be able to retrieve the position number by using the following call:
position = model.get_indexer('node', a)
The returned value should be [1, 3]
.
We can then for example to retrieve the u_pu
for these two nodes by:
u_pu_sub_array = node_result['u_pu'][position]
The result u_pu_sub_array
should be [1.07, 1.0].
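For illustration, the same lookup can be sketched in plain numpy; the actual get_indexer is backed by the C++ hash table, and the arrays below mirror the example node result:

```python
import numpy as np

# Node result from the example above.
node_ids = np.array([7, 10, 8, 103])
u_pu = np.array([1.05, 1.07, 1.07, 1.0])

# Translate ids to positions via a sorted lookup (equivalent sketch only).
a = np.array([10, 103])
order = np.argsort(node_ids)
position = order[np.searchsorted(node_ids, a, sorter=order)]
# position is [1, 3]

u_pu_sub_array = u_pu[position]
# u_pu_sub_array is [1.07, 1.0]
```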
Describe the bug
When running pre-commit hook, pylint
does not actually check any files.
pylint...............................................(no files to check)Skipped
Test data
failure message
Not all values match for line.p_from (rtol=1e-05, atol=1e-05, pattern=default)
Actual: [[ 682.79979676 291.65058163 1394.98543615]]
Expected: [[0.22670672 0.22670672 0.22670672]]
Difference: [[ 682.57309004 291.42387492 1394.75872943]]
Matches: [[False False False]]
https://github.com/alliander-opensource/power-grid-model/runs/5216237639?check_suite_focus=true
Acceptance criteria
Some examples:
Acceptance criteria
As suggested in Sonar Cloud
(https://sonarcloud.io/summary/new_code?id=PowerGridModel_power-grid-model),
update sonar-project.properties if that's the intended choice, and document the design choice in that file.
Originally posted by Thijss May 4, 2022
I can see a lot of (patch) releases on https://pypi.org/project/power-grid-model/#history
I have no clue what the release notes for these are.
GitHub has a very easy system for creating releases and auto-generating release notes from PR titles:
https://docs.github.com/en/repositories/releasing-projects-on-github/automatically-generated-release-notes
Perhaps we can use this here too?
I could help set it up.
Implement zigzag winding for transformers, so we can cope with the Yzn5 connection, which is a typical connection for MV/LV transformers.
Describe the feature request
It would be nice to include support for network configurations with so-called three-winding transformers.
In my understanding these transformers can connect three nodes with different voltage levels.
In the current data model, a Transformer can only connect two nodes: (from_node) and (to_node).
current:
[from_node]--[transformer]--[to_node]
requested support for:
[node_1]--[transformer]--[node_2]
              |
              |--[node_3]
Acceptance criteria
validate_input_data
and validate_batch_data
Use pybind11 to replace cython as the binding to C++.