hugovdberg / PIconnect
A python connector to the OSISoft PI and PI-AF databases
License: MIT License
A coworker experienced an error when trying to install PIconnect
into her Anaconda environment, and I experienced the same error when installing into a vanilla venv.
Using Python 3.7.4 with the standard library:
py -m venv venv
venv\Scripts\activate
pip install PIconnect
Error:
Collecting PIconnect
Downloading https://files.pythonhosted.org/packages/63/29/714e87723fb8cbb4b21e6fdedcce34d4c28bdf6d37f4fe8a687c3bfb7289/PIconnect-0.7.1.tar.gz
Complete output from command python setup.py egg_info:
Download error on https://pypi.org/simple/pytest-runner/: [WinError 10054] An existing connection was forcibly closed by the remote host -- Some packages may not be found!
Couldn't find index page for 'pytest-runner' (maybe misspelled?)
Download error on https://pypi.org/simple/: [WinError 10054] An existing connection was forcibly closed by the remote host -- Some packages may not be found!
No local packages or working download links found for pytest-runner
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\...\AppData\Local\Temp\pip-install-rq3lke40\PIconnect\setup.py", line 109, in <module>
package_data={
File "c:\users\...\desktop\test\venv\lib\site-packages\setuptools\__init__.py", line 144, in setup
_install_setup_requires(attrs)
File "c:\users\...\desktop\test\venv\lib\site-packages\setuptools\__init__.py", line 139, in _install_setup_requires
dist.fetch_build_eggs(dist.setup_requires)
File "c:\users\...\desktop\test\venv\lib\site-packages\setuptools\dist.py", line 724, in fetch_build_eggs
replace_conflicting=True,
File "c:\users\...\desktop\test\venv\lib\site-packages\pkg_resources\__init__.py", line 782, in resolve
replace_conflicting=replace_conflicting
File "c:\users\...\desktop\test\venv\lib\site-packages\pkg_resources\__init__.py", line 1065, in best_match
return self.obtain(req, installer)
File "c:\users\...\desktop\test\venv\lib\site-packages\pkg_resources\__init__.py", line 1077, in obtain
return installer(requirement)
File "c:\users\...\desktop\test\venv\lib\site-packages\setuptools\dist.py", line 791, in fetch_build_egg
return cmd.easy_install(req)
File "c:\users\...\desktop\test\venv\lib\site-packages\setuptools\command\easy_install.py", line 673, in easy_install
raise DistutilsError(msg)
distutils.errors.DistutilsError: Could not find suitable distribution for Requirement.parse('pytest-runner')
----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in C:\Users\...\AppData\Local\Temp\pip-install-rq3lke40\PIconnect\
PIconnect and its dependencies are installed by pip. Installing pytest-runner before installing PIconnect resolved the error. A possible fix would be to make the pytest-runner requirement in setup.py conditional, so it is only pulled in when the tests are actually run.
| | Me | Coworker |
|---|---|---|
| OS | Windows 10 | Windows 10 |
| Python version | 3.7.4 | 2.7.x |
| PIconnect version | 0.7.1 | 0.7.1 |
| Anaconda Dist. | - | 5.2.0 |
Line 34 in ec65edd
Line 78 in ec65edd
Tried to get interpolated values by passing datetime objects instead of strings. Expected PIconnect
to transform the objects to the correct format, got TypeError: no constructor matches given arguments
instead.
Issue is in AF.Time.AFTimeRange(start_time, end_time)
, where the .NET library of the AF SDK accepts strings only.
import PIconnect as PI
from datetime import datetime
start = datetime(2015, 1, 1)
end = datetime(2016, 1, 1)
interval = '1d'
with PI.PIServer() as server:
    tag = server.search('*')[0]
    data = tag.interpolated_values(start, end, interval)
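Until PIconnect converts datetimes itself, a workaround is to format the datetime objects as strings before passing them in. A minimal sketch; the exact format string the AF SDK parses is an assumption and may depend on server locale settings:

```python
from datetime import datetime

def to_pi_time(value):
    """Format datetime objects as strings for the AF SDK; pass strings through."""
    if isinstance(value, datetime):
        return value.strftime('%Y-%m-%d %H:%M:%S')
    return value

start = to_pi_time(datetime(2015, 1, 1))   # '2015-01-01 00:00:00'
end = to_pi_time('*')                      # strings are left untouched
```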
Hello, Hugo. Suppose there is a tag and we want to WRITE data to it. Could you please provide sample Python code showing how to do it using your PIconnect package? Thank you.
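For reference, an update_value call pattern appears later in this thread (in the Pt Created issue). A hedged sketch of a write helper built on that pattern; the tag name and values in the comment are purely illustrative:

```python
from datetime import datetime

def write_value(tag, value, timestamp, update_mode, buffer_mode):
    """Forward a single (timestamp, value) event to a PI tag through
    PIconnect's PIPoint.update_value."""
    tag.update_value(value, time=timestamp,
                     update_mode=update_mode, buffer_mode=buffer_mode)

# With a live server this would be used roughly as:
#   import PIconnect as PI
#   from PIconnect.PIConsts import UpdateMode, BufferMode
#   with PI.PIServer() as server:
#       tag = server.search('MyWritableTag')[0]   # hypothetical tag name
#       write_value(tag, 42.0, datetime.now(),
#                   UpdateMode.NO_REPLACE, BufferMode.BUFFER_IF_POSSIBLE)
```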
When I tried to import the PI package using the code below
import PIconnect as PI
I got the error below. It worked fine on my personal computer, but on my work machine I get this error. Is this normal behavior on a protected network, and how can I fix it?
SocketException Traceback (most recent call last)
SocketException: No such host is known
at System.Net.Dns.GetAddrInfo(String name)
at System.Net.Dns.InternalGetHostByName(String hostName, Boolean includeIPv6)
at System.Net.Dns.GetHostEntry(String hostNameOrAddress)
at System.ServiceModel.Channels.DnsCache.Resolve(Uri uri)
The above exception was the direct cause of the following exception:
EndpointNotFoundException Traceback (most recent call last)
<ipython-input-1-4cbc695e5bee> in <module>
----> 1 import PIconnect as PI
...project_env\lib\site-packages\PIconnect\__init__.py in <module>
26 from PIconnect.config import PIConfig
27 from PIconnect.PI import PIServer
---> 28 from PIconnect.PIAF import PIAFDatabase
29
30 # pragma pylint: enable=unused-import
...project_env\lib\site-packages\PIconnect\PIAF.py in <module>
59
60
---> 61 class PIAFDatabase(object):
62 """PIAFDatabase
63
...project_env\lib\site-packages\PIconnect\PIAF.py in PIAFDatabase()
69 servers = {
70 s.Name: {"server": s, "databases": {d.Name: d for d in s.Databases}}
---> 71 for s in AF.PISystems()
72 }
73 if AF.PISystems().DefaultPISystem:
...project_env\lib\site-packages\PIconnect\PIAF.py in <dictcomp>(.0)
69 servers = {
70 s.Name: {"server": s, "databases": {d.Name: d for d in s.Databases}}
---> 71 for s in AF.PISystems()
72 }
73 if AF.PISystems().DefaultPISystem:
...roject_env\lib\site-packages\PIconnect\PIAF.py in <dictcomp>(.0)
68
69 servers = {
---> 70 s.Name: {"server": s, "databases": {d.Name: d for d in s.Databases}}
71 for s in AF.PISystems()
72 }
EndpointNotFoundException: No DNS entries exist for host myafserver.
Server stack trace:
at System.ServiceModel.Channels.DnsCache.Resolve(Uri uri)
at System.ServiceModel.Channels.SocketConnectionInitiator.GetIPAddresses(Uri uri)
at System.ServiceModel.Channels.SocketConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
at System.ServiceModel.Channels.BufferedConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
at System.ServiceModel.Channels.ConnectionPoolHelper.EstablishConnection(TimeSpan timeout)
at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at System.ServiceModel.ICommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.ClientBase`1.System.ServiceModel.ICommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.ClientBase`1.Open()
at OSIsoft.AF.Support.AFProxy.Reconnect(Boolean autoPrompt, AFConnectionProtocol protocol, String host, Int32 port, String accountName, TimeSpan timeout)
Currently the code contains a lot of noise to remain backwards compatible with Python 2.7. However, testing compatibility is increasingly hard, and the need within our company to maintain backward compatibility has finally fallen away. Therefore I want to deprecate Python 2 effective immediately, although no active removal of backward compatibility will be done for version 0.9 (due in August, I think). For version 0.10 I want to clean up the code to be fully Python 3.6+ compatible. Please comment below if this would lead to insurmountable issues for you, so we can discuss this plan.
The documentation uses points = server.search('*')[0]
to get a PI tag and then retrieve data in various ways. It would be good to add an example that shows how the user can provide a tag with a name they already know, e.g. my_pi_tag.pv.
I think it's a common occurrence for the user to already know the names of the tags they want to retrieve data for.
I can use the search function to do it e.g. points = server.search('my_pi_tag.pv')[0]
but perhaps something like this would be of use:
with PI.PIServer() as server:
    tag = server.get_tag('my_pi_tag.pv')
    data = tag.recorded_values('*-48h', '*')
    print(data)
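A sketch of how such a get_tag helper could be built on the existing search(), assuming exact-name matching is wanted; the helper name and the error handling are suggestions, not existing PIconnect API:

```python
def get_tag(server, name):
    """Return the single PIPoint whose name matches exactly,
    using the existing search() under the hood."""
    for tag in server.search(name):
        if tag.name == name:
            return tag
    raise KeyError(f'tag {name!r} not found')
```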
Is your feature request related to a problem? Please describe.
Currently the Pipfile
and requirements_dev.txt
contain a lot of dependencies for the development environment, some of which I'm not sure why they were added.
Describe the solution you'd like
A critical review of the development dependencies. I suspect some dependencies were added for the travis setup script, and can probably be deleted.
Asynchronous reading
I am wondering if it would be possible to make PIconnect asyncio compatible. The idea is to collect data from various tags without waiting for one tag read to finish before starting the next one, similar to what is done with the pyodbc and aioodbc libraries.
I came up with this need because I am trying to develop an asynchronous application that reads data from PI. The problem is that truly asynchronous behaviour is not achieved, because PIconnect doesn't use the asyncio library.
There is this link from the PI Square website that explains how to use asynchronous reading with PI:
https://pisquare.osisoft.com/s/Blog-Detail/a8r1I000000GvS4QAK/async-with-pi-af-sdk-introduction
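Until native asyncio support exists, one workaround is to push the blocking SDK calls onto worker threads so multiple reads overlap. A sketch (requires Python 3.9+ for asyncio.to_thread); the read function here is a stand-in for a real PIconnect call such as tag.recorded_values:

```python
import asyncio

def read_tag_blocking(tag_name):
    """Stand-in for a blocking PIconnect read such as
    tag.recorded_values('*-1h', '*')."""
    return f'data for {tag_name}'

async def read_many(tag_names):
    # Each blocking read runs in the default thread pool, so the
    # network waits overlap instead of happening sequentially.
    return await asyncio.gather(
        *(asyncio.to_thread(read_tag_blocking, name) for name in tag_names)
    )

results = asyncio.run(read_many(['sinusoid', 'cdt158']))
print(results)  # ['data for sinusoid', 'data for cdt158']
```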
When trying to run a server listing test I got this error message:
AttributeError: module 'PIconnect' has no attribute 'PIServer'
At first I was sure that the reason was some faulty installation, but after some alternative installs it became clear that this was not the root cause.
After digging a little deeper, I compared the AFSDK.py file with another library I have been using to get PI data, and found that the path to the AF SDK was set for 64-bit installations.
PIAF_SDK = os.getenv("PIHOME", "C:\\Program Files\\PIPC")
PIAF_SDK += "\\AF\\PublicAssemblies\\4.0\\"
if not os.path.isdir(PIAF_SDK):
    raise ImportError("PIAF SDK not found in %s, check installation" % PIAF_SDK)
I changed this line to
PIAF_SDK = os.getenv("PIHOME", "C:\\Program Files (x86)\\PIPC")
And now it works like a charm.
My company has lots of old desktops still, so by default any installation comes as 32-bit.
My suggestion is a simple fix that can help other guys with jurassic 32-bit versions like myself, with some path flexibility depending on the version installed.
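A sketch of such a flexible lookup, trying PIHOME first and then both conventional install roots; the paths are the standard defaults, so non-standard installs would still need the environment variable:

```python
import os

def find_piaf_sdk():
    """Return the first existing AF SDK assemblies folder, checking the
    PIHOME environment variable, then the 64-bit and 32-bit defaults."""
    candidates = [
        os.getenv('PIHOME'),
        'C:\\Program Files\\PIPC',
        'C:\\Program Files (x86)\\PIPC',
    ]
    for base in candidates:
        if base is None:
            continue
        sdk = os.path.join(base, 'AF', 'PublicAssemblies', '4.0')
        if os.path.isdir(sdk):
            return sdk
    raise ImportError('PIAF SDK not found, check installation')
```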
Awesome work , thank you for doing this!
Is there a way to extract information from a PIPoint, like the tag name or value, as strings or numbers? I'm trying to narrow a list of PIPoints down based on tag names, but I want to keep the filter generic (i.e. match the filter via regular expressions), since I'm not always looking for a specific PI tag but rather a list of tags containing the same type of information (which means the tag names should follow a similar format).
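One way to do this today is to pull the name out of each PIPoint (PIconnect exposes it as tag.name, as used in other snippets in this thread) and filter with the re module. A sketch with the PI-specific parts stubbed out:

```python
import re

def filter_tags(tags, pattern):
    """Keep only the tags whose .name matches the regular expression."""
    regex = re.compile(pattern)
    return [tag for tag in tags if regex.search(tag.name)]

# With a live server: filter_tags(server.search('*'), r'\.PV$')
```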
Is your feature request related to a problem? Please describe.
Currently automated testing and documentation of the package is impossible because the SDK is unavailable on the servers.
Describe the solution you'd like
By automatically injecting the SDK in the functions that depend on it, while allowing explicit overriding if necessary, we can use the package as we do now, but inject a testing version on the automated build systems.
Describe alternatives you've considered
Another option could be to automatically import a testing framework depending on an environment variable.
Export the EventFrame functionality in a manner similar to PIAFElements.
The EventFrame functionality of PI AF is currently only available through the raw SDK methods, and not through the PIconnect classes.
Add a new class PIAFEventFrame.
One of the things on your ToDo doc is to get rid of the auxiliary classes. Is there a way to do this manually? When I try to convert the type of one of the series to floats it just gives an error.
data.D1.astype(float)
TypeError: float() argument must be a string or a number, not 'AFEnumerationValue'
A method is needed for using the data from the PI pulls.
I noticed that it is possible to first convert the data (floats, strings, etc.) into strings only. After that, the numbers can all be converted into floats. From there I think I can fix the dataframe completely. Maybe there is an easier way.
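The two-step conversion described above (everything to str, then numbers to float) can be done in one pass with pandas. A sketch; non-numeric entries such as AFEnumerationValue state names become NaN:

```python
import pandas as pd

def to_float(series):
    """Cast mixed PI values to str, then let pandas parse the numbers;
    anything non-numeric (e.g. enumeration states) becomes NaN."""
    return pd.to_numeric(series.astype(str), errors='coerce')

mixed = pd.Series([1.5, '2', 'Bad Input'])
print(to_float(mixed))   # 1.5, 2.0, NaN
```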
Expand the ability to retrieve any/all PI point attributes.
Managing PI servers and tuning point compression means monitoring data point attributes like:
PIconnect has made watching data quick and easy, but quick access to point attributes would further expand those data management tasks.
1. Possible expansion of the point.* function(s).
2. Retrieve all of a point's attributes into a Python dictionary with a new function, and leave it up to the developer to extract the dictionary entries needed.
System.ComponentModel.Win32Exception: The logon attempt failed
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File ".\test_pi_4.py", line 1, in
import PIconnect as PI
File "C:\Users\anusorbo\AppData\Local\Programs\Python\Python38\lib\site-packages\PIconnect\__init__.py", line 28, in
from PIconnect.PIAF import PIAFDatabase
File "C:\Users\anusorbo\AppData\Local\Programs\Python\Python38\lib\site-packages\PIconnect\PIAF.py", line 61, in
class PIAFDatabase(object):
File "C:\Users\anusorbo\AppData\Local\Programs\Python\Python38\lib\site-packages\PIconnect\PIAF.py", line 69, in PIAFDatabase
servers = {
File "C:\Users\anusorbo\AppData\Local\Programs\Python\Python38\lib\site-packages\PIconnect\PIAF.py", line 70, in
s.Name: {"server": s, "databases": {d.Name: d for d in s.Databases}}
File "C:\Users\anusorbo\AppData\Local\Programs\Python\Python38\lib\site-packages\PIconnect\PIAF.py", line 70, in
s.Name: {"server": s, "databases": {d.Name: d for d in s.Databases}}
System.Security.Authentication.InvalidCredentialException: The server has rejected the client credentials.
at System.Net.Security.NegoState.ProcessReceivedBlob(Byte[] message, LazyAsyncResult lazyResult)
at System.Net.Security.NegoState.StartSendBlob(Byte[] message, LazyAsyncResult lazyResult)
at System.Net.Security.NegoState.StartSendBlob(Byte[] message, LazyAsyncResult lazyResult)
at System.Net.Security.NegoState.ProcessAuthentication(LazyAsyncResult lazyResult)
at System.Net.Security.NegotiateStream.AuthenticateAsClient(NetworkCredential credential, String targetName, ProtectionLevel requiredProtectionLevel, TokenImpersonationLevel allowedImpersonationLevel)
at System.ServiceModel.Channels.WindowsStreamSecurityUpgradeProvider.WindowsStreamSecurityUpgradeInitiator.OnInitiateUpgrade(Stream stream, SecurityMessageProperty& remoteSecurity)
When attempting to use a filter_expression argument with the filtered_summaries method, the method does not appear to apply the argument, or at least does not apply it consistently.
Steps to reproduce the behavior:
I expected that the single maximum value returned would be below 500, but instead, I continue to get a value above 500.
I have confirmed that the recorded_values and interpolated_values methods both apply the filter_expression argument correctly, in the manner I expect, when following the examples provided in the documentation.
So I am wondering if there is a bug in the code for filtered_summaries (but maybe just for filtering greater than?) or maybe I just have a syntax issue.
Screenshot: trying to find the single max value over the time period, but it's not filtering.
Screenshot: trying to find the max value below 400 every 5 days, but there are a bunch of them above 400.
Screenshot: showing that, oddly enough, filtering seems to work for "values greater than".
I just wanted to thank you and the community for putting this together. It's already been amazingly helpful to use Python to gather PI data and manipulate it. Using DataLink was getting problematic as the data sets and the various requests grew in size.
With commit fe138d9 the automatic disconnect from the server when exiting the context manager was disabled. This means connections to the database remain open until the Python session ends or the SDK's grace period expires. This issue serves as a request for comments on how to fix this more efficiently.
Running this snippet:
import PIconnect as PI
with PI.PIAFDatabase() as db:
    print(db.server.IsConnected)
print(db.server.IsConnected)
prints:
True
True
The expected result would be
True
False
The problem is that PIconnect.PIAF.PIAFDatabase.__exit__ no longer calls self.server.Disconnect. This was done intentionally to fix unexpected disconnects, but the current fix might not be the most efficient solution.
Add any other context about the problem here.
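One possible direction, sketched with illustrative names (not PIconnect's actual internals): reference-count the connection so only the outermost context manager disconnects. Nested with blocks then stay safe, while the disconnect on final exit is restored:

```python
class ConnectionGuard:
    """Sketch: connect on the first enter, disconnect only when the
    outermost context exits, so nested `with` blocks don't tear down a
    connection an outer block is still using."""

    def __init__(self, server):
        self.server = server
        self._depth = 0

    def __enter__(self):
        if self._depth == 0:
            self.server.Connect()
        self._depth += 1
        return self

    def __exit__(self, *exc_info):
        self._depth -= 1
        if self._depth == 0:
            self.server.Disconnect()
```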
I installed PIconnect using pip, but importing failed because wrapt was missing.
# pip install PIconnect
# python
> import PIconnect as PI
Add wrapt as a dependency.
Created from comment by @mpepera on #496:
I had a slightly different error. We are using an older PI Server without an Asset Framework implementation. Importing PIconnect failed on the PIAFDatabase import step. I just commented out the following line from __init__.py and happily went along my way!
from PIconnect.PIAF import PIAFDatabase
Cool project! Thanks for building it.
File "C:\Users\myuser\AppData\Local\Continuum\anaconda3_32\lib\site-packages\PIconnect\__init__.py", line 26, in <module>
from PIconnect.PIAF import PIAFDatabase
File "C:\Users\myuser\AppData\Local\Continuum\anaconda3_32\lib\site-packages\PIconnect\PIAF.py", line 61, in <module>
class PIAFDatabase(object):
File "C:\Users\myuser\AppData\Local\Continuum\anaconda3_32\lib\site-packages\PIconnect\PIAF.py", line 67, in PIAFDatabase
default_server = servers[AF.PISystems().DefaultPISystem.Name]
AttributeError: 'NoneType' object has no attribute 'Name'
Originally posted by @mpepera in #496 (comment)
When I used update_value before the Pt Created timestamp with the REPLACE option, the PI Server merely flushed the recently updated data (I guess it flushed rows where the special flag was set, i.e. the Substituted flag, see https://docs.osisoft.com/bundle/af-sdk/page/html/T_OSIsoft_AF_Data_AFUpdateOption.htm). I had not expected that, of course.
However, for data that is updated after the Pt Created timestamp, the REPLACE method works as expected, as far as I can see and test.
import PIconnect as PI
from PIconnect.PIConsts import RetrievalMode
from datetime import datetime, timedelta
with PI.PIServer() as server:
    points = server.search(tag)
    p = points[0]
    created = p.raw_attributes['creationdate']
    created = datetime(created.Year, created.Month, created.Day, created.Hour, created.Minute, created.Second, created.Millisecond)
    print('before:')
    print(p.recorded_values(created, datetime.now()).head(), end=f'\n{"-"*10}\n')
    p.update_value(
        1e+10,
        time=created - timedelta(days=1),
        update_mode=PI.PIConsts.UpdateMode.NO_REPLACE,
        buffer_mode=PI.PIConsts.BufferMode.BUFFER_IF_POSSIBLE,
    )
    print('updated:')
    print(p.recorded_values(created - timedelta(days=30), datetime.now()).head(), end=f'\n{"-"*10}\n')
    p.update_value(
        1e+10,
        time=created - timedelta(days=1),
        update_mode=PI.PIConsts.UpdateMode.REPLACE,
        buffer_mode=PI.PIConsts.BufferMode.DO_NOT_BUFFER,
    )
    print('replaced:')
    print(p.recorded_values(created - timedelta(days=30), datetime.now()).head(), end=f'\n{"-"*10}\n')
In my work, I have created test points in order to reproduce service behavior (a test stand). So the newly created test points have their creation stamp at a recent time, but I was supposed to start debugging with historical data, and here the update_value method came to help me, or so I thought. I ran into the bug mentioned above and spent time locating it in the code. I believe a warning about this buggy behavior would have saved me that time.
I'm not sure we want to dig hard into the PI problem itself. Maybe there should simply be a warning when update_value is called with a timestamp older than the created property (as an "unexpected behavior, hardcode at your own risk" notice).
before:
2021-07-13 11:28:03+00:00 9003
2021-07-13 11:29:03+00:00 9000
2021-07-13 11:30:03+00:00 9001
2021-07-13 11:31:03+00:00 9002
2021-07-13 11:32:03+00:00 9002
Name: SINUSOID6, dtype: object
----------
updated:
2021-07-12 11:28:03+00:00 1e+10
2021-07-13 11:28:03+00:00 9003
2021-07-13 11:29:03+00:00 9000
2021-07-13 11:30:03+00:00 9001
2021-07-13 11:31:03+00:00 9002
Name: SINUSOID6, dtype: object
----------
replaced:
2021-07-13 11:28:03+00:00 9003
2021-07-13 11:29:03+00:00 9000
2021-07-13 11:30:03+00:00 9001
2021-07-13 11:31:03+00:00 9002
2021-07-13 11:32:03+00:00 9002
Name: SINUSOID6, dtype: object
----------
Of course, maybe there is a good workaround to fill the PI server with a lot of data for a newly created point. For me, the solution was to use INSERT as the option for update_value on the test point.
UPD: I can report that the bug sometimes does not occur (you can check this via several runs of the code).
UPD2: I added a screenshot of the script execution result.
PIconnect v0.8.0 defaults to myafserver, and the error is: no DNS entries exist for host myafserver.
Steps to reproduce the behavior:
import PIconnect as PI
with PI.PIServer(server='PI1') as server:
    print(server.server_name)
Should print PI1
v0.7.1 works OK.
If applicable, add screenshots to help explain your problem.
Hugo, thanks for sharing your code. I have a feature request:
tags is a list of PIPoint objects. The following code works for me:
my_single_data = PIconnect.PIData.PISeriesContainer.interpolated_values(tags[0],"2018-03-19 20:00:00","2018-03-19 20:02:00","00:00:30")
It would be nice if the function could take a list of PIPoints and return an array of data.
my_data_table = PIconnect.PIData.PISeriesContainer.interpolated_values(tags,"2018-03-19 20:00:00","2018-03-19 20:02:00","00:00:30")
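In the meantime, a small wrapper can build that table from the existing single-tag API. A sketch using pandas; the column naming and index alignment behaviour are design choices here, not existing PIconnect features:

```python
import pandas as pd

def interpolated_table(tags, start, end, interval):
    """One column per tag: fetch each tag's interpolated_values() series
    and align them on their shared timestamp index."""
    series = {tag.name: tag.interpolated_values(start, end, interval)
              for tag in tags}
    return pd.concat(series, axis=1)
```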
Bug report
Description
PIconnect.PIAF.PIAFAttribute object returns data as floats.
To Reproduce
with PI.PIAFDatabase(server=servers[0], database="Test") as database:
    element = database.descendant(r'Europe\A\B')
    attribute = element.attributes['DD']
    print(type(attribute))
    data = attribute.recorded_values('*-5mo', '*')
    for i in range(10):
        print(type(data[i]))
        print(data[i])
Expected behavior
I expect the data to be (datetime, float) pairs, but I only get floats. How do I get the timestamps?
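Judging by the SINUSOID6 output elsewhere in this thread, recorded_values returns a pandas Series whose index holds the timestamps, so the (datetime, value) pairs are already there. A sketch with a stand-in Series:

```python
import pandas as pd

# Stand-in for the Series that attribute.recorded_values() returns:
data = pd.Series([1.0, 2.0],
                 index=pd.to_datetime(['2021-01-01', '2021-01-02']))

# data[i] yields only the values; the timestamps live in data.index,
# so iterate the (timestamp, value) pairs instead:
for timestamp, value in data.items():
    print(timestamp, value)
```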
System
Hi, I didn't get how to explicitly pass username and password when connecting to a specific PI server. In my environment I need to provide a technical user to connect to the server instead of using my own Windows credentials.
Apologies for the basic question, but I am new to the library and do not have 100% clarity on how the servers are discovered.
Although I am able to ping my server, the server is not discovered when running the code snippet from the tutorials. If I provide the name of the server, the same happens: it is not found and I only get the Testing server.
Is there any kind of network constraint that could cause this?
I appreciate the help in advance.
Steps to reproduce the behavior:
import PIconnect as PI
with PI.PIServer(server='MyServerName') as server:
    print(server.server_name)
I would expect to see MyServerName
I have no clear understanding of why this would happen; I can ping the server. A colleague in a different office does see the server as it is supposed to be, which leads me to think this is a network question. I just don't understand how the discovery is done and based on what. Given that I can ping the server, I would expect to be able to connect to it, unless we are talking about a specific port that might be blocked.
PIconnect does not load the SDK correctly, but uses the Testing framework instead.
Steps to reproduce the behavior:
import PIconnect as PI
with PI.PIServer() as server:
    print(server.server_name)
The name of the connected server is printed
When I run import PIconnect as PI
I get the error below. Do I need to set up the OSIsoft software first?
AttributeError Traceback (most recent call last)
in
----> 1 import PIconnect as PI
2 print(list(PI.PIServer.servers.keys()))
~\AppData\Local\Continuum\anaconda3\lib\site-packages\PIconnect\__init__.py in <module>
24 from PIconnect.AFSDK import AF
25 from PIconnect.PI import PIServer
---> 26 from PIconnect.PIAF import PIAFDatabase
27
28 version = "0.7.1"
~\AppData\Local\Continuum\anaconda3\lib\site-packages\PIconnect\PIAF.py in <module>
59
60
---> 61 class PIAFDatabase(object):
62 """Context manager for connections to the PI Asset Framework database."""
63
~\AppData\Local\Continuum\anaconda3\lib\site-packages\PIconnect\PIAF.py in PIAFDatabase()
65
66 servers = {x.Name: {"server": x, "databases": {}} for x in AF.PISystems()}
---> 67 default_server = servers[AF.PISystems().DefaultPISystem.Name]
68
69 def __init__(self, server=None, database=None):
AttributeError: 'NoneType' object has no attribute 'Name'
I guess it may be useful if a point had a created property. Currently the creation date is located in the raw_attributes of the point, in System.DateTime format.
When I was debugging data stored in the PI Server, I occasionally ran into No Data rows, and it took me several tries to identify the starting point of the data recording. For other data I now have to proceed the same way, and I suppose that with a created property we could simplify successful data extraction for a point: we could simply extract values from the creation date onwards.
relates to #506 (comment)
In the PI module, add the property discussed above:
@property
def created(self):
    """Return the creation datetime of a point."""
    self.__load_attributes()
    created = self.raw_attributes['creationdate']
    return datetime(created.Year, created.Month, created.Day, created.Hour, created.Minute, created.Second, created.Millisecond)
Why a property and not a cached_property? It seems we load the attributes each time via the raw_attributes method.
To export interpolated values only over the time range for which values were actually recorded, some helper attributes would be useful:
PIPoint.first_recorded_value
PIPoint.first_recorded_timestamp
PIPoint.last_recorded_value
PIPoint.last_recorded_timestamp
A workaround is available in the meantime:
import PIconnect as PI
start_date = PI.AFSDK.AF.Time.AFTime('01-01-1970')
end_date = PI.AFSDK.AF.Time.AFTime('*')
at_or_after_date = PI.AFSDK.AF.Data.AFRetrievalMode.AtOrAfter
at_or_before_date = PI.AFSDK.AF.Data.AFRetrievalMode.AtOrBefore
with PI.PIServer() as server:
    tags = server.search('*')
    for tag in tags:
        first_val = tag.pi_point.RecordedValue(start_date, at_or_after_date)
        last_val = tag.pi_point.RecordedValue(end_date, at_or_before_date)
        first_recorded_timestamp = first_val.Timestamp
        first_recorded_value = first_val.Value
        last_recorded_timestamp = last_val.Timestamp
        last_recorded_value = last_val.Value
        print(tag.name, first_recorded_timestamp, first_recorded_value, last_recorded_timestamp, last_recorded_value)
The values from this example are not cleaned up into python objects yet, but can be useful already.
I have been using this great module to quickly grab some data and have built my own functions for analysis (this is my first introduction to OSI Soft applications).
Initially built this on my Windows PC but have recently switched to MacOS systems.
I am now unable to access the server with this module (returns "TestingAF") even when I explicitly indicate the server and/or database.
I've spent substantial time parsing through the documents but can't seem to find what I'm missing.
I understand that the PI SDK runs on 32- and 64-bit Windows platforms and provides access to servers on all PI platforms - am I unable to leverage this module due to my OS?
with PI.PIAFDatabase(server='sname', database='dname') as database:
    print(database.server_name)
>>> TestingAF
/opt/anaconda3/lib/python3.7/site-packages/PIconnect/PIAF.py:89: UserWarning: Server "sname" not found, using the default server.
warn(message=message.format(server=server), category=UserWarning)
Thanks for the support!
Tried to get interpolated values for a long period (years) with a short interval (minutes). The system gives a runtime error.
start = "01-01-2014"
eind = "13-03-2018"
interval = '2m'
for item in lijst:
    with PI.PIServer() as server:
        tags = server.search(item)
        for tag in tags:
            data = tag.interpolated_values(start, eind, interval)
            data.to_csv(tag.tag + ".csv")
PITimeoutException: [-10722] PINET: Timeout on PI RPC or System Call.
at OSIsoft.AF.PI.PIException.ConvertAndThrowException(PIServer piServer, Exception ex, String message)
at OSIsoft.AF.PI.PIPoint.InterpolatedValuesByCount(AFTimeRange timeRange, Int32 numberOfValues, String filterExpression, Boolean includeFilteredValues)
at OSIsoft.AF.PI.PIPoint.InterpolatedValues(AFTimeRange timeRange, AFTimeSpan interval, String filterExpression, Boolean includeFilteredValues)
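A common workaround for the RPC timeout is to split the long period into smaller windows and query them one by one. A sketch of the splitting helper; the 30-day window size is an arbitrary starting point, so tune it to your server:

```python
from datetime import datetime, timedelta

def date_chunks(start, end, days=30):
    """Split [start, end) into consecutive windows of at most `days` days,
    so each interpolated_values() call stays small enough to finish."""
    chunks = []
    current = start
    while current < end:
        nxt = min(current + timedelta(days=days), end)
        chunks.append((current, nxt))
        current = nxt
    return chunks

# e.g. query each window separately and concatenate the results:
for chunk_start, chunk_end in date_chunks(datetime(2014, 1, 1), datetime(2018, 3, 13)):
    pass  # data = tag.interpolated_values(chunk_start, chunk_end, '2m')
```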
When querying data from a PIAFAttribute it's possible to get the underlying data, but it does not seem easy (or possible) to get the tag name that the attribute must be pointing to or otherwise referencing in order for that data to be retrieved.
When building applications on top of PI data one often wants to use PI AF in order to discover what data is available but once discovered to not incur the overhead of going back to AF in order to run additional queries on the timeseries data.
AF data is typically fairly static or is slowly changing. Once a PI tag name is determined from a PIAF attribute it's usually fixed for quite some time.
I've poked around the PIAFAttribute class and started peering into the PIAFAttribute.attribute but while I can find things that seem to be "name" or in some cases "tag" the value often seems to be much closer to a display name than an underlying tag name. Something like "Tank Level" which looks like a display name rather than "\SERVER\REGION.LOCATION.SITE.SKID.EQUIPMENT" which looks much more like a SCADA tag name.
It would be great if there were a property or a function on the PIAFAttribute that one could call to retrieve the tag name. I can see it with the OSI PI Explorer but I haven't been able to figure out how to track it down within the PIConnect system.
I did poke around both the PIAttribute.attribute and PIAttribute.attribute.Data (using print(dir())) and also what comes back from PIAttribute.recorded_values(start, end) but none of them seemed to have what I was looking for. This despite the PISeries having an attribute called "tag" but again it seems to have a display name in it, not something that looks like a PI tag.
This seems like it might be related to #506: #506
I'm happy to help by forking and making a PR but I suppose I'm not sure entirely where to start! I'm hoping that I could get some guidance or advice. Thanks!
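A possible starting point, heavily hedged: for attributes backed by a PI Point data reference, the wrapped AF SDK object (reachable via PIAFAttribute.attribute) exposes a PIPoint whose Name should be the underlying tag. Whether that property is populated depends on the data reference type, so verify against the AF SDK documentation before relying on it:

```python
def underlying_tag_name(piaf_attribute):
    """Best-effort lookup of the PI tag behind a PIAFAttribute: reach the
    raw AF SDK attribute and, if it carries a PIPoint, return its Name."""
    raw = getattr(piaf_attribute, 'attribute', piaf_attribute)
    pi_point = getattr(raw, 'PIPoint', None)
    return pi_point.Name if pi_point is not None else None
```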
Is there any function to send values back to PI?
It would be really interesting.
It would be able to extract data, work with it and send results to tags (predictions for example).
I am getting a TCP/IP port connection failure; see below:
When I build my Python script (Anaconda 3.7.7) in Spyder, using PIConnect, everything works like a charm down though this point:
import sys
import re
import PIconnect as PI
from PIconnect.PIConsts import SummaryType, TimestampCalculation
with PI.PIServer(server='mypi') as server:
    print(server.server_name)
This builds a connection to my PI server (mypi) over port 5450 as I would expect, and all is well with the world.
However, when I try to run Python from the command line, using either python.exe or ipython.exe (again Anaconda 3.3.7) the connection fails as it tries to connect on port 5457.
C:\Users\14541\OneDrive - FirstEnergy Corp\My Python\OSI PI>python.exe test.py
System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it 10.10.16.99:5457
at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)
at System.Net.Sockets.Socket.Connect(EndPoint remoteEP)
at System.ServiceModel.Channels.SocketConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "test.py", line 3, in <module>
import PIconnect as PI
File "D:\Anaconda3\lib\site-packages\PIconnect\__init__.py", line 28, in <module>
from PIconnect.PIAF import PIAFDatabase
File "D:\Anaconda3\lib\site-packages\PIconnect\PIAF.py", line 61, in <module>
class PIAFDatabase(object):
File "D:\Anaconda3\lib\site-packages\PIconnect\PIAF.py", line 71, in PIAFDatabase
for s in AF.PISystems()
File "D:\Anaconda3\lib\site-packages\PIconnect\PIAF.py", line 71, in <dictcomp>
for s in AF.PISystems()
File "D:\Anaconda3\lib\site-packages\PIconnect\PIAF.py", line 70, in <dictcomp>
s.Name: {"server": s, "databases": {d.Name: d for d in s.Databases}}
System.ServiceModel.EndpointNotFoundException: Could not connect to net.tcp://mypi:5457/AFServer/Service. The connection attempt lasted for a time span of 00:00:01.1774284. TCP error code 10061: No connection could be made because the target machine actively refused it 10.10.16.99:5457.
Server stack trace:
at System.ServiceModel.Channels.SocketConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
at System.ServiceModel.Channels.BufferedConnectionInitiator.Connect(Uri uri, TimeSpan timeout)
at System.ServiceModel.Channels.ConnectionPoolHelper.EstablishConnection(TimeSpan timeout)
at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at System.ServiceModel.ICommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.ClientBase`1.System.ServiceModel.ICommunicationObject.Open(TimeSpan timeout)
at OSIsoft.AF.Support.AFProxy.Reconnect(Boolean autoPrompt, AFConnectionProtocol protocol, String host, Int32 port, String accountName, TimeSpan timeout)
C:\Users\14541\OneDrive - FirstEnergy Corp\My Python\OSI PI>
I would expect a connection to my PI server, mypi:5450.
Everything is set up correctly in the OSI PI SDK Utility, including Server Name and Port Number.
PI systems allow a maximum number of events to be exported at once.
This feature request proposes an optional blockwise export of events.
PI Systems have a maximum number of events that can be exported at once (150k for PI versions before 2012, 1.5M for later versions). To get more than that amount of events the data needs to be exported in blocks at most as large as the limit.
Add a new argument blocksize to the data extraction methods, defaulting to None (meaning the data is exported in a single request). When blocksize is set, we can implement an algorithm, as explained in this deep dive, to extract all values in the requested interval.
In summary, it exports the data in blocks of at most blocksize events. Each new block starts at the last timestamp in the previous block. To remove duplicates, as many events are removed from the top of the new block as there were events with that timestamp at the bottom of the previous block. The full motivation for this algorithm is explained in the deep dive.
No alternative options were considered so far.
Special thanks to Rick Davin, who explains the problem and solution much clearer than I could have done.
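The stitching step described above can be sketched in plain pandas, independent of the PI SDK. This is only an illustration of the algorithm (stitch_blocks is a hypothetical name), assuming each block is a Series indexed by timestamp and each new block starts at the previous block's last timestamp:

```python
import pandas as pd

def stitch_blocks(blocks):
    """Concatenate event blocks, removing duplicated events at each seam.

    Each block after the first is assumed to start at the last timestamp
    of the previous block; as many leading events are dropped from the new
    block as the previous block already holds at that timestamp.
    """
    out = blocks[0]
    for block in blocks[1:]:
        seam = out.index[-1]
        # how many events at the seam timestamp are already kept
        n_dup = int((out.index == seam).sum())
        out = pd.concat([out, block.iloc[n_dup:]])
    return out
```

In the eventual implementation the blocks would of course be fetched lazily, one request of at most blocksize events at a time, rather than held in a list.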
I am trying to get values from an attribute of an element similar to how I am getting values from a PI tag. My PI tag example is this (retrieves data at 5 minute intervals for the previous day):
import PIconnect as PI
import pandas as pd
from datetime import datetime
from datetime import timedelta
end = datetime.today()
start = (end + timedelta(days=-1)).date()
end = end.strftime('%m-%d-%Y')
start = start.strftime('%m-%d-%Y')
interval = '5m'
with PI.PIServer() as server:
    tags = server.search('BLK1WNAC*MVWdspd')
    output = pd.concat([tag.interpolated_values(start, end, interval) for tag in tags], axis=1)
    output.to_csv(r"D:\Python\Windspeed\Windspeed " + start + ".csv")
I have been able to retrieve the names of the attributes of an element in my current PI system:
import PIconnect as PI
with PI.PIAFDatabase() as database:
    element = database.descendant(r'MNRCSRVR01 ModuleDB\Turbine\A04')
    for attribute in element.attributes:
        print(attribute)
Output:
ExpectedProduction_Met2
StateFault_Text
Met1_WdSpd
Turbine_Name
OperationState_Text
Turbine_Tag_Name
OperationState
WindSpeed
ExpectedProduction_Met1
ExpectedProduction_Met3
MetAvg_WdSpd
SCADAState_Text
TurbineCondition
SCADAState
WTGState_Text
Met3_WdSpd
VoltagePhaseB_LN
ActivePower
AvailabilitySCADA
StateFault
ExpectedProduction_Nac
Turbine_Num
WTGState
Met2_WdSpd
How do I get the 5 minute data for one of the attributes (such as ExpectedProduction_Nac)?
Attached is the screen in PI System Explorer that shows the attributes.
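PIAFAttribute objects in PIconnect expose the same data extraction methods as PIPoint (interpolated_values among them), so something like the sketch below should work. The helper name is made up here, and the dict-style attributes lookup is an assumption based on how the loop above prints attribute names; verify both against your PIconnect version:

```python
def attribute_interpolated(database, element_path, attribute_name,
                           start, end, interval='5m'):
    """Fetch interpolated values for one AF attribute of an element.

    `database` is assumed to be an open PIconnect PIAFDatabase, whose
    elements expose a name -> attribute mapping via `.attributes`.
    """
    element = database.descendant(element_path)
    attribute = element.attributes[attribute_name]
    return attribute.interpolated_values(start, end, interval)
```

With the element from the example above, that would be called as `attribute_interpolated(database, r'MNRCSRVR01 ModuleDB\Turbine\A04', 'ExpectedProduction_Nac', start, end)`.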
When I call points.recorded_value('-5m') I get the error Python.Runtime.PythonException: since Python.NET 3.0 int can not be converted to Enum implicitly. Use Enum(int_value), or token was not expected in string. I can get the current value of points with no problem, so I think this is related to the way timestamps are created/processed.
From reading here: stackoverflow, this might be related to pythonnet and Python 3.9.
On install, the install script failed because of pythonnet, with the error described here: stackoverflow2. I used a user's suggested solution of manually installing with pip install pythonnet==3.0.0a2, which worked. After that, the install of PIconnect worked. But I think something got messed up, and I don't have the knowledge to track the issue down further.
Create a venv using Python 3.9
manually install pythonnet: pip install pythonnet==3.0.0a2
install PIconnect: pip install PIconnect
then use:
import PIconnect as PI
with PI.PIServer() as server:
    points = server.search('LDB1.P*')[10]
    data = points.recorded_value('-5m')
    print(data)
I went ahead and created a new venv and installed the pythonnet wheel for my Python 3.9 32-bit version. This was the actual solution to the pythonnet install issue under Python 3.9 (see stackoverflow2).
PIconnect installs with no further issues after that.
Now the error is different.
Good news: .recorded_value works now. This failed before
Bad news: Summary still does not work
import PIconnect as PI
from PIconnect.PIConsts import SummaryType
with PI.PIServer() as server:
    points = server.search('LDB1.P*')[10]
    # Works
    data = points.recorded_value('-1h')
    print(data)
    # Fails
    data2 = points.summary('*-14d', '*', SummaryType.MAXIMUM)
    print(data2)
Error
return AF.Time.AFTimeRange(start_time, end_time)
System.FormatException: The 'd' token in the string '*-14d' was not expected.
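Until the pythonnet/AFTime marshalling issue is resolved, one workaround worth trying is to sidestep relative PI time strings like '*-14d' and pass explicit timestamps instead. This is a sketch (the helper name is made up); whether it avoids the FormatException depends on where the string conversion actually breaks:

```python
from datetime import datetime, timedelta

def absolute_range(days_back):
    """Build explicit start/end strings instead of relative PI time syntax."""
    end = datetime.now()
    start = end - timedelta(days=days_back)
    fmt = '%Y-%m-%d %H:%M:%S'
    return start.strftime(fmt), end.strftime(fmt)
```

Then the failing call would become `start, end = absolute_range(14); data2 = points.summary(start, end, SummaryType.MAXIMUM)`.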
Hello Hugo,
Thank you for putting this package together, it has been very useful for pulling PI data.
Currently when using PIPoint.summaries, the returned pandas dataframe has the summary name as the column name. However when pulling summaries for several PI tags at once, all columns will have the summary name as the column name. Could there be an option for summaries to return the PI tag as the column name instead of the summary name, like with interpolated_values?
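As a workaround until such an option exists, the tag name can be attached on the pandas side. Concatenating the per-tag summary frames with a dict gives a (tag, summary) column MultiIndex, so identical summary names no longer collide (plain pandas, no PI connection needed; combine_summaries is a hypothetical helper name):

```python
import pandas as pd

def combine_summaries(per_tag):
    """per_tag: {tag_name: summary DataFrame}.

    Returns one frame whose columns are (tag, summary_name) pairs,
    so equal summary names from different tags stay distinguishable.
    """
    return pd.concat(per_tag, axis=1)
```

Usage would be along the lines of `combine_summaries({name: tag.summary(start, end, SummaryType.MAXIMUM) for name, tag in ...})`, with the tag-name lookup adjusted to however you obtain the tag string in your code.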
Well it's pretty terrific that you have put this together. And I have tested it and it works. Here's a thought:
We have a dozen PI/AF servers configured. But some of them are not available. When we want to connect to a specific server, PIConnect seems to try to validate all the servers in the list. And since we don't have access to some of the servers at all times, we get a timeout.
Feel free to contact me on e-mail [email protected]
Thanks for this effort!
Hi,
I want to download AF Audit Trails using PIconnect rather than manually downloading it in PI PSE. Does PIconnect have this option?
Thanks,
Yazdan
To assist compatibility with Python 3.6, can you remove references to 'basestring'
For example, Pi.py line 64:
elif not isinstance(query, basestring):
Can this be changed to:
elif not isinstance(query, str):
Bug report
I cannot connect to anything besides a Testing Server
This seems similar to the #595 question. I've tried following through those suggestions.
Note that this script is not on the actual node / server that the PI is on. It's on the same network though.
In order to get past the
raise ImportError("PIAF SDK not found in %s, check installation" % PIAF_SDK)
from #595, I added the PIPC folder to my local Program Files.
Thanks!
Hi! Thanks for this awesome package!!
Below is my system info:
OS Name: Microsoft Windows 10 Enterprise
OS Version: 10.0.17763 N/A Build 17763
Python 3.7.6
Here's my code that works for example:
with PI.PIServer() as server:
    points = server.search('tagname')
    data = points[0].recorded_values('-89h', '')
    print(data)
but if I enter -90h or larger for the start time, it throws a pandas assertion error:
AssertionError Traceback (most recent call last)
in
2 points=server.search('TO00742510-Sump_Temperature_Transmitter')
3 data=points[0].recorded_values('-90h','')
----> 4 print(data)
~\Anaconda3\lib\site-packages\pandas\core\series.py in repr(self)
1369 min_rows=min_rows,
1370 max_rows=max_rows,
-> 1371 length=show_dimensions,
1372 )
1373 result = buf.getvalue()
~\Anaconda3\lib\site-packages\pandas\core\series.py in to_string(self, buf, na_rep, float_format, header, index, length, dtype, name, max_rows, min_rows)
1433 float_format=float_format,
1434 min_rows=min_rows,
-> 1435 max_rows=max_rows,
1436 )
1437 result = formatter.to_string()
~\Anaconda3\lib\site-packages\pandas\io\formats\format.py in init(self, series, buf, length, header, index, na_rep, name, float_format, dtype, max_rows, min_rows)
258 self.adj = _get_adjustment()
259
--> 260 self._chk_truncate()
261
262 def _chk_truncate(self) -> None:
~\Anaconda3\lib\site-packages\pandas\io\formats\format.py in _chk_truncate(self)
283 row_num = max_rows // 2
284 series = series._ensure_type(
--> 285 concat((series.iloc[:row_num], series.iloc[-row_num:]))
286 )
287 self.tr_row_num = row_num
~\Anaconda3\lib\site-packages\pandas\core\base.py in _ensure_type(self, obj)
91 Used by type checkers.
92 """
---> 93 assert isinstance(obj, type(self)), type(obj)
94 return obj
95
AssertionError: <class 'pandas.core.series.Series'>
Please advise.
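The traceback ends inside pandas' output truncation path (_chk_truncate), not inside PIconnect, which matches a known bug in pandas 1.0.0/1.0.1 where the repr of a long Series trips an assertion (fixed in later 1.0.x releases, if memory serves, so upgrading pandas is the first thing to try). The data itself is fine; only the truncated repr fails, and bypassing it avoids the error:

```python
import pandas as pd

# Stand-in for the -90h result: any Series long enough to be truncated.
data = pd.Series(range(1000))

full_text = data.to_string()        # bypasses the truncating repr that raises
print(full_text.splitlines()[-1])   # e.g. just the last row
print(data.tail())                  # or inspect only the end
```

That also explains why -89h "works": a shorter interval returns few enough rows that pandas never enters the truncation code path.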
Hello,
Thank you very much for this very well made package, it's really very helpful.
I have been struggling for some weeks with how to write code that pulls sampled data at 500 ms for many tags at the same time. I have about 120 tag names for which I want to pull data in parallel as a numpy array.
Some tag names from my list of 120:
KE_91_T_EX_Motors_SHUTTLE_UDFB_MOTOR_TEMP.PV
KE_91_T_EX_Motors_SHUTTLE_UDOP_MOTOR_TEMP.PV
KE_91_T_EX_Motors_SHUTTLE_FBBK_MOTOR_TEMP.PV
KE_91_T_EX_Motors_SHUTTLE_FBOP_MOTOR_TEMP.PV
KE_91_T_EX_Motors_BLAST_UPBK_MOTOR_TEMP.PV
KE_91_T_EX_Motors_BLAST_UPOP_MOTOR_TEMP.PV
KE_91_T_EX_Motors_BLAST_DNBK_MOTOR_TEMP.PV
KE_91_T_EX_Motors_BLAST_DNOP_MOTOR_TEMP.PV
My actual script (which is not working correctly) is below:
import pandas as pd
import numpy as np
import PIconnect as PI

# read the tag list, stripping the trailing newline each line carries
# (without strip(), server.search never matches the tag names)
with open("tags.txt") as file_in:
    lines = [line.strip() for line in file_in]

with PI.PIServer() as server:
    print(server.server_name)
    frames = []
    # loop over every tag instead of only the first 8 matches
    for tag_name in lines:
        points = server.search(tag_name)[0]
        data = points.interpolated_values('*-1h', '*', '10m')
        frames.append(data)
    output = pd.concat(frames, axis=1)

with open('all_data.txt', 'a') as f:
    np.savetxt(f, output.to_numpy(), '%1.3f')
Thank you for your support
Getting error when trying to access timezone.
import PIconnect as PI
with PI.PIServer() as server:
    print(server.server_name)
    print(PI.PIConfig.DEFAULT_TIMEZONE)
Tried pip install git+https://github.com/Hugovdberg/PIconnect@develop but no difference for me.
Thanks for making this awesome project!
Hi 👋
This is my first visit to this fine repo, but it seems you have been working hard to keep all dependencies updated so far.
Once you have closed this issue, I'll create separate pull requests for every update as soon as I find one.
That's it for now!
Happy merging! 🤖
I am observing the following behavior regarding timezones. If I use the PISDK library (through the win32com.client.dynamic Dispatch package), the timestamp information is returned as expected. However, when I use PIconnect, it returns the timestamps as GMT+1. I would expect it to return the GMT timezone, so I could use pandas' tz_convert to put it in the correct timezone.
These are the tests I am running.
test1.py
from win32com.client.dynamic import Dispatch
from datetime import datetime
import pandas as pd
PL = Dispatch("PISDK.PISDK")
server = PL.Servers("your server here")
tags = ['your tag here']
t_ini = datetime(2019,12,16,8,25,0)
t_end = datetime(2019,12,16,9,35,0)
tag = server.PIPoints(tags[0])
t_inis = t_ini.strftime('%Y-%m-%d %H:%M:%S')
t_ends = t_end.strftime('%Y-%m-%d %H:%M:%S')
L=[]
nValues = (t_end-t_ini).total_seconds()/60+1
pvs = tag.Data.InterpolatedValues(t_inis,t_ends,nValues,1,1,None)
for pv in pvs:
    L.append((tags[0], str(pv.Timestamp.LocalDate), str(pv.Value)))
df1 = pd.DataFrame(L,columns=['tag','time','value'])
test2.py
import PIconnect as PI
from datetime import datetime
import pandas as pd
t_ini = datetime(2019,12,16,8,35,0)
t_end = datetime(2019,12,16,9,35,0)
ti = t_ini.strftime('%Y-%m-%d %H:%M:%S')
te = t_end.strftime('%Y-%m-%d %H:%M:%S')
server = PI.PIServer(server='your server here')
points = server.search('your tag here')[0]
data = points.interpolated_values(ti, te,'1m')
data = data.tz_convert(tz='America/Sao_Paulo')
df = pd.DataFrame(data)
Expected behavior
I initially thought this could be a configuration problem on the server, but I checked and it seems to be OK. And reading using the PISDK works.
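Whatever zone the SDK labels the data with, pandas can re-express the same instants in the desired zone once the index is tz-aware; PIconnect also exposes PI.PIConfig.DEFAULT_TIMEZONE (visible in other issues here) to control the zone it assigns. A plain-pandas illustration, with a GMT+1-labelled series standing in for the interpolated_values result from test2.py:

```python
import pandas as pd

# Stand-in for an interpolated_values result: a tz-aware series labelled
# GMT+1, which is what PIconnect appeared to return in this case.
# (Note: in the Etc/ zone names the sign is inverted, so Etc/GMT-1 is UTC+1.)
idx = pd.date_range('2019-12-16 08:35', periods=3, freq='1min', tz='Etc/GMT-1')
series = pd.Series([1.0, 2.0, 3.0], index=idx)

# Re-express the same instants in the desired local zone.
local = series.tz_convert('America/Sao_Paulo')
```

tz_convert only relabels the timestamps; the underlying instants (and values) are unchanged, so this is safe to apply even if the source zone label was wrong by a fixed offset only in name.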
When I connect to a different server by specifying an incorrect name, PIconnect silently connects to the default server. The default behaviour should be to raise a warning when this happens.
with PI.PIServer('WrongServer') as server:
    print(server.server_name)
The output is simply the name of the default server.
Describe the bug
wrapt is missing from the dependencies.