ipinfo / python
Official Python Library for IPinfo API (IP geolocation and other types of IP data)
Home Page: https://ipinfo.io
License: Apache License 2.0
Make sure the cache key contains a number to indicate the version of the cached data. Data changes that change what's expected in cached data require a version change.
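A minimal sketch of what this could look like (the constant name and key shape here are illustrative, not necessarily the library's actual internals):

```python
# Bump this constant whenever the shape of the cached data changes, so entries
# written by older versions of the code are never read back by newer code.
CACHE_KEY_VSN = "2"

def versioned_cache_key(ip):
    # Embedding the version in the key makes stale entries unreachable,
    # with no explicit cache flush needed on upgrade.
    return f"{ip}:{CACHE_KEY_VSN}"
```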
Been using this amazing module for a long time. But now my system's Python is 3.12 and it fails to install:
~ $ mkvirtualenv foo
created virtual environment CPython3.12.0.final.0-64 in 161ms
creator CPython3macOsBrew(dest=/Users/luke/.virtualenvs/foo, clear=False, no_vcs_ignore=False, global=True)
seeder FromAppData(download=False, pip=bundle, via=copy, app_data_dir=/Users/luke/Library/Application Support/virtualenv)
added seed packages: pip==23.3.1
activators BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator,PythonActivator
virtualenvwrapper.user_scripts creating /Users/luke/.virtualenvs/foo/bin/predeactivate
virtualenvwrapper.user_scripts creating /Users/luke/.virtualenvs/foo/bin/postdeactivate
virtualenvwrapper.user_scripts creating /Users/luke/.virtualenvs/foo/bin/preactivate
virtualenvwrapper.user_scripts creating /Users/luke/.virtualenvs/foo/bin/postactivate
virtualenvwrapper.user_scripts creating /Users/luke/.virtualenvs/foo/bin/get_env_details
~ v3.12.0 (foo) $ pip install ipinfo
Collecting ipinfo
Using cached ipinfo-4.4.3-py3-none-any.whl.metadata (648 bytes)
Requirement already satisfied: requests in /opt/homebrew/lib/python3.12/site-packages (from ipinfo) (2.31.0)
Collecting cachetools (from ipinfo)
Using cached cachetools-5.3.2-py3-none-any.whl.metadata (5.2 kB)
Collecting aiohttp<=4 (from ipinfo)
Using cached aiohttp-3.8.6.tar.gz (7.4 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Collecting attrs>=17.3.0 (from aiohttp<=4->ipinfo)
Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Requirement already satisfied: charset-normalizer<4.0,>=2.0 in /opt/homebrew/lib/python3.12/site-packages (from aiohttp<=4->ipinfo) (3.3.2)
Collecting multidict<7.0,>=4.5 (from aiohttp<=4->ipinfo)
Using cached multidict-6.0.4-cp312-cp312-macosx_14_0_arm64.whl
Collecting async-timeout<5.0,>=4.0.0a3 (from aiohttp<=4->ipinfo)
Using cached async_timeout-4.0.3-py3-none-any.whl.metadata (4.2 kB)
Collecting yarl<2.0,>=1.0 (from aiohttp<=4->ipinfo)
Using cached yarl-1.9.2-cp312-cp312-macosx_14_0_arm64.whl
Collecting frozenlist>=1.1.1 (from aiohttp<=4->ipinfo)
Using cached frozenlist-1.4.0-cp312-cp312-macosx_14_0_arm64.whl
Collecting aiosignal>=1.1.2 (from aiohttp<=4->ipinfo)
Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
Requirement already satisfied: idna<4,>=2.5 in /opt/homebrew/lib/python3.12/site-packages (from requests->ipinfo) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/homebrew/lib/python3.12/site-packages (from requests->ipinfo) (2.0.7)
Requirement already satisfied: certifi>=2017.4.17 in /opt/homebrew/lib/python3.12/site-packages (from requests->ipinfo) (2023.7.22)
Using cached ipinfo-4.4.3-py3-none-any.whl (24 kB)
Using cached cachetools-5.3.2-py3-none-any.whl (9.3 kB)
Using cached async_timeout-4.0.3-py3-none-any.whl (5.7 kB)
Building wheels for collected packages: aiohttp
Building wheel for aiohttp (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for aiohttp (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [188 lines of output]
*********************
* Accelerated build *
*********************
running bdist_wheel
running build
running build_py
creating build
creating build/lib.macosx-14-arm64-cpython-312
creating build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_ws.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/worker.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/multipart.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_response.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/client_ws.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/test_utils.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/tracing.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_exceptions.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_middlewares.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/http_exceptions.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_app.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/streams.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_protocol.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/log.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/client.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_urldispatcher.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_request.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/http_websocket.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/client_proto.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/locks.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/__init__.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_runner.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_server.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/base_protocol.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/payload.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/client_reqrep.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/http.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_log.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/resolver.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/formdata.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/payload_streamer.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_routedef.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/connector.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/client_exceptions.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/typedefs.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/hdrs.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/web_fileresponse.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/http_writer.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/tcp_helpers.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/helpers.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/http_parser.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/cookiejar.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/pytest_plugin.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/abc.py -> build/lib.macosx-14-arm64-cpython-312/aiohttp
running egg_info
writing aiohttp.egg-info/PKG-INFO
writing dependency_links to aiohttp.egg-info/dependency_links.txt
writing requirements to aiohttp.egg-info/requires.txt
writing top-level names to aiohttp.egg-info/top_level.txt
reading manifest file 'aiohttp.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'aiohttp' anywhere in distribution
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files matching '*.pyd' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.lib' found anywhere in distribution
warning: no previously-included files matching '*.dll' found anywhere in distribution
warning: no previously-included files matching '*.a' found anywhere in distribution
warning: no previously-included files matching '*.obj' found anywhere in distribution
warning: no previously-included files found matching 'aiohttp/*.html'
no previously-included directories found matching 'docs/_build'
adding license file 'LICENSE.txt'
writing manifest file 'aiohttp.egg-info/SOURCES.txt'
copying aiohttp/_cparser.pxd -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/_find_header.pxd -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/_headers.pxi -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/_helpers.pyi -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/_helpers.pyx -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/_http_parser.pyx -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/_http_writer.pyx -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/_websocket.pyx -> build/lib.macosx-14-arm64-cpython-312/aiohttp
copying aiohttp/py.typed -> build/lib.macosx-14-arm64-cpython-312/aiohttp
creating build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/_cparser.pxd.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/_find_header.pxd.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/_helpers.pyi.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/_helpers.pyx.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/_http_parser.pyx.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/_http_writer.pyx.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/_websocket.pyx.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
copying aiohttp/.hash/hdrs.py.hash -> build/lib.macosx-14-arm64-cpython-312/aiohttp/.hash
running build_ext
building 'aiohttp._websocket' extension
creating build/temp.macosx-14-arm64-cpython-312
creating build/temp.macosx-14-arm64-cpython-312/aiohttp
clang -fno-strict-overflow -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -O3 -Wall -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.sdk -I/Users/luke/.virtualenvs/foo/include -I/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12 -c aiohttp/_websocket.c -o build/temp.macosx-14-arm64-cpython-312/aiohttp/_websocket.o
aiohttp/_websocket.c:1475:17: warning: 'Py_OptimizeFlag' is deprecated [-Wdeprecated-declarations]
if (unlikely(!Py_OptimizeFlag)) {
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/cpython/pydebug.h:13:1: note: 'Py_OptimizeFlag' has been explicitly marked deprecated here
Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_OptimizeFlag;
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/pyport.h:317:54: note: expanded from macro 'Py_DEPRECATED'
#define Py_DEPRECATED(VERSION_UNUSED) __attribute__((__deprecated__))
^
aiohttp/_websocket.c:2680:27: warning: 'ma_version_tag' is deprecated [-Wdeprecated-declarations]
return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0;
^
aiohttp/_websocket.c:1118:65: note: expanded from macro '__PYX_GET_DICT_VERSION'
#define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag)
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/cpython/dictobject.h:22:5: note: 'ma_version_tag' has been explicitly marked deprecated here
Py_DEPRECATED(3.12) uint64_t ma_version_tag;
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/pyport.h:317:54: note: expanded from macro 'Py_DEPRECATED'
#define Py_DEPRECATED(VERSION_UNUSED) __attribute__((__deprecated__))
^
aiohttp/_websocket.c:2692:36: warning: 'ma_version_tag' is deprecated [-Wdeprecated-declarations]
return (dictptr && *dictptr) ? __PYX_GET_DICT_VERSION(*dictptr) : 0;
^
aiohttp/_websocket.c:1118:65: note: expanded from macro '__PYX_GET_DICT_VERSION'
#define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag)
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/cpython/dictobject.h:22:5: note: 'ma_version_tag' has been explicitly marked deprecated here
Py_DEPRECATED(3.12) uint64_t ma_version_tag;
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/pyport.h:317:54: note: expanded from macro 'Py_DEPRECATED'
#define Py_DEPRECATED(VERSION_UNUSED) __attribute__((__deprecated__))
^
aiohttp/_websocket.c:2696:56: warning: 'ma_version_tag' is deprecated [-Wdeprecated-declarations]
if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict)))
^
aiohttp/_websocket.c:1118:65: note: expanded from macro '__PYX_GET_DICT_VERSION'
#define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag)
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/cpython/dictobject.h:22:5: note: 'ma_version_tag' has been explicitly marked deprecated here
Py_DEPRECATED(3.12) uint64_t ma_version_tag;
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/pyport.h:317:54: note: expanded from macro 'Py_DEPRECATED'
#define Py_DEPRECATED(VERSION_UNUSED) __attribute__((__deprecated__))
^
aiohttp/_websocket.c:2741:9: warning: 'ma_version_tag' is deprecated [-Wdeprecated-declarations]
__PYX_PY_DICT_LOOKUP_IF_MODIFIED(
^
aiohttp/_websocket.c:1125:16: note: expanded from macro '__PYX_PY_DICT_LOOKUP_IF_MODIFIED'
if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\
^
aiohttp/_websocket.c:1118:65: note: expanded from macro '__PYX_GET_DICT_VERSION'
#define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag)
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/cpython/dictobject.h:22:5: note: 'ma_version_tag' has been explicitly marked deprecated here
Py_DEPRECATED(3.12) uint64_t ma_version_tag;
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/pyport.h:317:54: note: expanded from macro 'Py_DEPRECATED'
#define Py_DEPRECATED(VERSION_UNUSED) __attribute__((__deprecated__))
^
aiohttp/_websocket.c:2741:9: warning: 'ma_version_tag' is deprecated [-Wdeprecated-declarations]
__PYX_PY_DICT_LOOKUP_IF_MODIFIED(
^
aiohttp/_websocket.c:1129:30: note: expanded from macro '__PYX_PY_DICT_LOOKUP_IF_MODIFIED'
__pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\
^
aiohttp/_websocket.c:1118:65: note: expanded from macro '__PYX_GET_DICT_VERSION'
#define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag)
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/cpython/dictobject.h:22:5: note: 'ma_version_tag' has been explicitly marked deprecated here
Py_DEPRECATED(3.12) uint64_t ma_version_tag;
^
/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12/pyport.h:317:54: note: expanded from macro 'Py_DEPRECATED'
#define Py_DEPRECATED(VERSION_UNUSED) __attribute__((__deprecated__))
^
aiohttp/_websocket.c:3042:55: error: no member named 'ob_digit' in 'struct _longobject'
const digit* digits = ((PyLongObject*)x)->ob_digit;
~~~~~~~~~~~~~~~~~~ ^
aiohttp/_websocket.c:3097:55: error: no member named 'ob_digit' in 'struct _longobject'
const digit* digits = ((PyLongObject*)x)->ob_digit;
~~~~~~~~~~~~~~~~~~ ^
aiohttp/_websocket.c:3238:55: error: no member named 'ob_digit' in 'struct _longobject'
const digit* digits = ((PyLongObject*)x)->ob_digit;
~~~~~~~~~~~~~~~~~~ ^
aiohttp/_websocket.c:3293:55: error: no member named 'ob_digit' in 'struct _longobject'
const digit* digits = ((PyLongObject*)x)->ob_digit;
~~~~~~~~~~~~~~~~~~ ^
aiohttp/_websocket.c:3744:47: error: no member named 'ob_digit' in 'struct _longobject'
const digit* digits = ((PyLongObject*)b)->ob_digit;
~~~~~~~~~~~~~~~~~~ ^
6 warnings and 5 errors generated.
error: command '/usr/bin/clang' failed with exit code 1
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for aiohttp
Failed to build aiohttp
ERROR: Could not build wheels for aiohttp, which is required to install pyproject.toml-based projects
Please add a Python program that takes a token and an IP address as input and prints the details for that IP address.
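A minimal sketch of such a program, based on the library's documented `getHandler`/`getDetails` API (the `handler_factory` parameter is a hypothetical injection point so the function can be exercised without a live token):

```python
def lookup_ip(token, ip, handler_factory=None):
    """Return the details dict for `ip` using the given IPinfo token."""
    if handler_factory is None:
        import ipinfo  # the official library; requires `pip install ipinfo`
        handler_factory = ipinfo.getHandler
    handler = handler_factory(token)
    # Details.all is the raw dict of every field the API returned.
    return handler.getDetails(ip).all

# Example usage (needs a real token and network access):
# print(lookup_ip("YOUR_TOKEN", "8.8.8.8"))
```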
Good day, thanks for your service. Can you please tell me how I can get information about an IP, such as whether it is a proxy or a VPN?
How do I add proxy configuration to the ipinfo library? I do see this uses urllib3 which could be integrated using the following variable:
proxy = urllib3.ProxyManager('http://localhost:3128/')
Is there a way to do it in ipinfo without having to manually change the code?
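A sketch of one possible approach, under the assumption that the handler forwards `request_options` entries as keyword arguments to `requests`, so a standard requests-style `proxies` mapping passes through. This is not a documented option, so treat it as an experiment:

```python
# A requests-style proxy mapping; the assumption is that `request_options`
# is forwarded verbatim to requests.get/requests.post by the sync handler.
proxy_request_options = {
    "proxies": {
        "http": "http://localhost:3128",
        "https": "http://localhost:3128",
    },
}

# Hypothetical usage (requires `pip install ipinfo` and a real token):
# import ipinfo
# handler = ipinfo.getHandler("YOUR_TOKEN", request_options=proxy_request_options)
```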
We have several files like eu.json, countries.json, continents.json, and so on, which are loaded during initialization / startup of the client.
Instead of loading these as such, which has risks such as the asset not appearing in a production environment properly, and has a performance penalty during init of loading an on-disk file, we should inline the files into a static, in-memory map / dictionary or similar.
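A sketch of the proposed change (entries abbreviated; the real maps would be generated once from the existing JSON assets):

```python
# Instead of reading continents.json from disk at handler init, ship the data
# as a module-level constant: no file I/O at startup, and no risk of the
# asset being missing in a frozen/production build.
CONTINENTS = {
    "US": {"code": "NA", "name": "North America"},
    "FR": {"code": "EU", "name": "Europe"},
    # ... remaining entries generated from continents.json ...
}

def continent_for(country_code):
    # Dict lookup replaces the per-init json.load of the asset file.
    return CONTINENTS.get(country_code)
```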
I am getting IndexError for following code
import ipinfo
access_token = '************'
handler = ipinfo.getHandler(access_token)
ip_country = lambda x:handler.getDetails(x).country
df1['country'] = df1["IP_address"].apply(ip_country)
There are a few IP addresses for which it gives IndexError. For example: 192.168.73.231, 10.62.100.139.
If we try handler.getDetails('10.62.100.139').country, it will return IndexError: tuple index out of range.
Can you edit the ipinfo package so that it returns NaN in such cases?
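Until the library handles this itself, a workaround on the caller's side is to wrap the lookup and fall back to NaN (a sketch; private/bogon addresses like 10.x.x.x carry no country data, which is what triggers the error):

```python
import math

def country_or_nan(handler, ip):
    # Private and bogon addresses have no country field, so the lookup can
    # raise; fall back to NaN so a pandas .apply() keeps working.
    try:
        return handler.getDetails(ip).country
    except (IndexError, AttributeError):
        return math.nan

# df1['country'] = df1["IP_address"].apply(lambda ip: country_or_nan(handler, ip))
```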
I am using the Python ipinfo module to geolocate some IP addresses. You can see my test code below -
import ipinfo
access_token = "TOKEN"
handler = ipinfo.getHandler(access_token)
lookup = ['8.8.8.8', '8.8.4.4']
details = handler.getBatchDetails(lookup)
I am just requesting info for 2 IP addresses but when I open up my ipinfo dashboard, the usage count increases by 4.
What I found interesting is that even though I am requesting information for just 2 IP addresses, the ipinfo handler is sending 4 IP addresses in the POST request to the servers where in each IP is added twice.
When I checked the getBatchDetails
method, I found that the IP addresses are being appended twice to the lookup list here and here.
Am I missing something here?
I'd like to know why an IP address is added to the lookup twice.
Thanks
curl ipinfo.io/124.122.223.235?token=
$ curl ipinfo.io/124.122.223.235?token=
{
"ip": "124.122.223.235",
"hostname": "ppp-124-122-223-235.revip2.asianet.co.th",
"city": "Hat Yai",
"region": "Songkhla",
"country": "TH",
"loc": "7.0084,100.4767",
"org": "AS17552 True Online",
"postal": "90110",
"timezone": "Asia/Bangkok"
}
The user should be allowed, via some option during client initialization, to set custom HTTP headers for all requests.
This error was found on a Windows 10 environment.
import ipinfo
access_token = 'MY_ACCESS_TOKEN'
handler = ipinfo.getHandler(access_token)
This code triggers the following error:
Traceback (most recent call last):
File "<pyshell#2>", line 1, in <module>
handler = ipinfo.getHandler(access_token)
File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\ipinfo\__init__.py", line 7, in getHandler
return Handler(access_token, **kwargs)
File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\ipinfo\handler.py", line 68, in __init__
self.countries_currencies = handler_utils.read_json_file(
File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\ipinfo\handler_utils.py", line 107, in read_json_file
countries_json = f.read()
File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\encodings\cp1252.py", line 23, in decode
return codecs.charmap_decode(input,self.errors,decoding_table)[0]
UnicodeDecodeError: 'charmap' codec can't decode byte 0x8f in position 299: character maps to <undefined>
Step 1
Uninstall the IPinfo module version 4.4.0 with the following command:
pip uninstall ipinfo
Step 2
Install the previous stable release IPinfo version: 4.3.1 with the following command:
pip install ipinfo==4.3.1
You can check your Python library version with the following command as well:
pip show ipinfo
Which will output the following:
Name: ipinfo
Version: 4.3.1
.....
Then you should be ready to go if you are not using the functions added in version 4.4.0, such as:
Hello, I have an issue accessing the API. This is my error; does someone have a solution, please?
requests.exceptions.ProxyError: HTTPSConnectionPool(host='ipinfo.io', port=443): Max retries exceeded with url: /92.63.197.48 (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 302 Found')))
Output:
{'bogon': True,
'country_name': None,
'ip': '192.***.**.**',
'latitude': None,
'longitude': None}
The code:
import ipinfo
import pprint
handler = ipinfo.getHandler(IP_TOKEN)
details = handler.getDetails(MY_IP)
pprint.pprint(details.all)
We are in 2020; the network is used asynchronously now.
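For what it's worth, the library does ship an async handler (getHandlerAsync, backed by aiohttp). A minimal usage sketch, with the handler passed in as a parameter so the coroutine itself stays testable:

```python
import asyncio

async def lookup_ip_async(handler, ip=None):
    # `handler` is expected to come from ipinfo.getHandlerAsync(token);
    # passing no ip looks up the caller's own address.
    details = await handler.getDetails(ip)
    return details.all

# Hypothetical usage (requires `pip install ipinfo` and a real token):
# import ipinfo
# handler = ipinfo.getHandlerAsync("YOUR_TOKEN")
# print(asyncio.run(lookup_ip_async(handler, "8.8.8.8")))
```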
[Errno 2] No such file or directory: 'C:\Users\FyrekS\AppData\Local\Temp\_MEI188962\ipinfo\countries.json'
I built my Python script, and then I get this error: [Errno 2] No such file or directory: 'C:\Users\FyrekS\AppData\Local\Temp\_MEI188962\ipinfo\countries.json'
It was discovered in #40 that a call to requests.post in the sync handler was throwing an exception even when raise_on_fail=False.
This behavior can cause the loss of data from successful chunks in a >1000-size batch request if any of the chunks fail because of an error in requests.post.
TODO: raise_on_fail=False.
Hello:
Code that has been working without a hitch for months now throws this error when I call: handler = ipinfo.getHandler(access_token)
after a call to pip install ipinfo
on a fresh VM:
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.7/dist-packages/ipinfo/eu.json'
I confirm that version.py
contains SDK_VERSION = "4.3.0"
I have a dataset of IP addresses. When I use the code to cycle through them, at the 506th IP address I get this error: AttributeError: loc is not a valid attribute of Details. This error does not make sense. Can you investigate?
import ipinfo

access_token = '<REDACTED>'
handler = ipinfo.getHandler(access_token, cache_options={'ttl': 30, 'maxsize': 5000}, request_options={'timeout': 10})

loc_list = []
for f in members['ip_address']:
    print(len(loc_list))
    match = handler.getDetails(str(f))
    location = match.loc
    loc_list.append(location)
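The Details object only exposes the fields the API actually returned, so a private or bogon address in the dataset has no loc at all, which is what raises here. A defensive accessor (sketch) sidesteps the crash:

```python
def safe_loc(details):
    # Details raises AttributeError for fields absent from the API response
    # (e.g. bogon/private addresses have no `loc`); return None instead so
    # the loop over the dataset keeps going.
    return getattr(details, "loc", None)

# location = safe_loc(handler.getDetails(str(f)))
```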
Please make a video on how to use this feature.
When I use it in Python IDLE it works completely, but in Visual Studio Code it produces this error:
Module 'ipinfo' has no 'getHandler' memberpylint(no-member) AttributeError: partially initialized module 'ipinfo' has no attribute 'getHandler' (most likely due to a circular import)
Create an iterator-based batch function, so that the user can loop over the batch call and get results per IP 1-by-1 rather than all at once, which is probably necessary for really large inputs whose outputs could exceed the available RAM on small machines e.g. in the cloud.
This is particularly useful after #37 because the user can input millions of IPs at once to the function.
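A sketch of what such an iterator could look like, layered over the existing batch call (`batch_fn` stands in for `handler.getBatchDetails`, which returns a dict keyed by IP; the chunk size is illustrative):

```python
def iter_batch_details(ips, batch_fn, chunk_size=1000):
    # Yield (ip, details) pairs one at a time, issuing the underlying batch
    # request one chunk at a time so only a single chunk's results are ever
    # resident in memory.
    for start in range(0, len(ips), chunk_size):
        chunk = ips[start:start + chunk_size]
        results = batch_fn(chunk)  # e.g. handler.getBatchDetails(chunk)
        for ip in chunk:
            yield ip, results.get(ip)
```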
Implement SDK support for IP summaries.
Got the following exception when using handler.getBatchDetails([uri]) while checking a large number of IPs from a third-party API data source.
ReadTimeoutError: HTTPSConnectionPool(host='ipinfo.io', port=443): Read timed out. (read timeout=2)
/usr/local/lib/python3.8/site-packages/ipinfo/handler.py in getBatchDetails(self, ip_addresses)
67 headers = self._get_headers()
68 headers["content-type"] = "application/json"
---> 69 response = requests.post(
70 url, json=lookup_addresses, headers=headers, **self.request_options
71 )
Not sure if this is fixable in this library; maybe you should add rate-limiting or something to the requests?
Possible workaround: time.sleep() between each IP being sent in an ipinfo request.
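A sketch of that workaround, pacing individual lookups client-side (separately, the handler accepts request_options={'timeout': ...} if you simply want a longer read timeout than the default):

```python
import time

def paced_lookups(handler, ips, delay=1.0):
    # Sleep between consecutive single-IP lookups to avoid hammering the
    # API; this is crude client-side pacing, not real rate limiting.
    results = {}
    for i, ip in enumerate(ips):
        if i:
            time.sleep(delay)
        results[ip] = handler.getDetails(ip)
    return results
```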
Full trace:
---------------------------------------------------------------------------
timeout Traceback (most recent call last)
/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
425 # Otherwise it looks like a bug in the code.
--> 426 six.raise_from(e, None)
427 except (SocketTimeout, BaseSSLError, SocketError) as e:
/usr/local/lib/python3.8/site-packages/urllib3/packages/six.py in raise_from(value, from_value)
/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
420 try:
--> 421 httplib_response = conn.getresponse()
422 except BaseException as e:
/usr/local/Cellar/[email protected]/3.8.4/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py in getresponse(self)
1331 try:
-> 1332 response.begin()
1333 except ConnectionError:
/usr/local/Cellar/[email protected]/3.8.4/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py in begin(self)
302 while True:
--> 303 version, status, reason = self._read_status()
304 if status != CONTINUE:
/usr/local/Cellar/[email protected]/3.8.4/Frameworks/Python.framework/Versions/3.8/lib/python3.8/http/client.py in _read_status(self)
263 def _read_status(self):
--> 264 line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
265 if len(line) > _MAXLINE:
/usr/local/Cellar/[email protected]/3.8.4/Frameworks/Python.framework/Versions/3.8/lib/python3.8/socket.py in readinto(self, b)
668 try:
--> 669 return self._sock.recv_into(b)
670 except timeout:
/usr/local/Cellar/[email protected]/3.8.4/Frameworks/Python.framework/Versions/3.8/lib/python3.8/ssl.py in recv_into(self, buffer, nbytes, flags)
1240 self.__class__)
-> 1241 return self.read(nbytes, buffer)
1242 else:
/usr/local/Cellar/[email protected]/3.8.4/Frameworks/Python.framework/Versions/3.8/lib/python3.8/ssl.py in read(self, len, buffer)
1098 if buffer is not None:
-> 1099 return self._sslobj.read(len, buffer)
1100 else:
timeout: The read operation timed out
During handling of the above exception, another exception occurred:
ReadTimeoutError Traceback (most recent call last)
/usr/local/lib/python3.8/site-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
438 if not chunked:
--> 439 resp = conn.urlopen(
440 method=request.method,
/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
723
--> 724 retries = retries.increment(
725 method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
/usr/local/lib/python3.8/site-packages/urllib3/util/retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
402 if read is False or not self._is_method_retryable(method):
--> 403 raise six.reraise(type(error), error, _stacktrace)
404 elif read is not None:
/usr/local/lib/python3.8/site-packages/urllib3/packages/six.py in reraise(tp, value, tb)
734 raise value.with_traceback(tb)
--> 735 raise value
736 finally:
/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
669 # Make the request on the httplib connection object.
--> 670 httplib_response = self._make_request(
671 conn,
/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
427 except (SocketTimeout, BaseSSLError, SocketError) as e:
--> 428 self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
429 raise
/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py in _raise_timeout(self, err, url, timeout_value)
334 if isinstance(err, SocketTimeout):
--> 335 raise ReadTimeoutError(
336 self, url, "Read timed out. (read timeout=%s)" % timeout_value
ReadTimeoutError: HTTPSConnectionPool(host='ipinfo.io', port=443): Read timed out. (read timeout=2)
During handling of the above exception, another exception occurred:
ReadTimeout Traceback (most recent call last)
<ipython-input-12-81b6d3ab61af> in <module>
56
57 # check ipinfo data to see if the IP is multi-tenant / legitimate potential
---> 58 domains_count = ipinfo_get_domains(ipinfo_token, ip)
59 domain_threshold = 100
60
<ipython-input-12-81b6d3ab61af> in ipinfo_get_domains(ipinfo_token, ip)
15 handler = ipinfo.getHandler(ipinfo_token)
16 uri = ip
---> 17 r = handler.getBatchDetails([uri])
18 #print(r)
19
/usr/local/lib/python3.8/site-packages/ipinfo/handler.py in getBatchDetails(self, ip_addresses)
67 headers = self._get_headers()
68 headers["content-type"] = "application/json"
---> 69 response = requests.post(
70 url, json=lookup_addresses, headers=headers, **self.request_options
71 )
/usr/local/lib/python3.8/site-packages/requests/api.py in post(url, data, json, **kwargs)
117 """
118
--> 119 return request('post', url, data=data, json=json, **kwargs)
120
121
/usr/local/lib/python3.8/site-packages/requests/api.py in request(method, url, **kwargs)
59 # cases, and look like a memory leak in others.
60 with sessions.Session() as session:
---> 61 return session.request(method=method, url=url, **kwargs)
62
63
/usr/local/lib/python3.8/site-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
528 }
529 send_kwargs.update(settings)
--> 530 resp = self.send(prep, **send_kwargs)
531
532 return resp
/usr/local/lib/python3.8/site-packages/requests/sessions.py in send(self, request, **kwargs)
641
642 # Send the request
--> 643 r = adapter.send(request, **kwargs)
644
645 # Total elapsed time of the request (approximately)
/usr/local/lib/python3.8/site-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
527 raise SSLError(e, request=request)
528 elif isinstance(e, ReadTimeoutError):
--> 529 raise ReadTimeout(e, request=request)
530 else:
531 raise
ReadTimeout: HTTPSConnectionPool(host='ipinfo.io', port=443): Read timed out. (read timeout=2)
Thanks,
Hi,
I am using the ipinfo.io free data with a token. For every request my server connects to ipinfo.io server.
ipinfo.io also has a free database with Free IP to Country + IP to ASN, and also the paid database, that I can download to my server to avoid sending thousands of requests over the internet.
Can I configure this ipinfo Python library to use the downloaded database as the source of data rather than connecting to the ipinfo.io server for every request?
I'm using an async call inside the class Map method _get_city_center_coord.
import ipinfo
import folium

class Map:
    def __init__(self, event_coordinate=None, zoom_start=15, popup=None, tooltip=None):
        self.event_coordinate = event_coordinate
        self.zoom_start = zoom_start
        self.popup = popup
        self.tooltip = tooltip

    @staticmethod
    async def _get_city_center_coord():
        handler = ipinfo.getHandlerAsync('ipinfo_access_token')
        details = await handler.getDetails()
        lat, lon = (details.latitude, details.longitude)
        return lat, lon

    async def show_events(self, event_list):
        m = await self._init_map()
        for event in event_list:
            id, popup, lat, long, tooltip = event
            folium.Marker(
                location=list((lat, long)),
                popup=popup,
                tooltip=tooltip,
                icon=folium.Icon(color="red", icon="info-sign")
            ).add_to(m)
        return m

    async def _init_map(self):
        if self.event_coordinate:
            return folium.Map(location=self.event_coordinate, zoom_start=self.zoom_start)
        else:
            return folium.Map(
                location=await self._get_city_center_coord(),
                zoom_start=self.zoom_start
            )
And this code is used by the FastAPI framework this way:
@router.get('/all', response_model=List[EventList])
async def events_list(
    request: Request,
    service: EventsService = Depends(),
    user: User = Depends(UserService.get_authenticated_user_id),
):
    events = await service.get_list()
    events_data = [
        (
            event['id'], event['title'], event['location']['lat'],
            event['location']['long'], event['activity']['name']
        )
        for event in events
    ]
    m = await Map(zoom_start=12).show_events(events_data)
    return templates.TemplateResponse(
        'events.html',
        context={
            'request': request,
            'events': events,
            'm': m._repr_html_(),
            'user': user
        }
    )
Everything works fine, IP coordinates are detected and the map is displayed. But in the console I see a warning
Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x7f25f9e1e800>
Unclosed connector
connections: ['[(<aiohttp.client_proto.ResponseHandler object at 0x7f25f9e2bac0>, 178355.411)]']
connector: <aiohttp.connector.TCPConnector object at 0x7f25f9e1dff0>
Is there a way to close the connection after a request has been made? It looks like the async with syntax is not implemented.
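One way to avoid the warning is to close the handler's session explicitly once the lookup is done. Recent versions of this library expose a deinit() coroutine on the async handler for exactly this (verify against your installed release). The sketch below uses a stub handler so it runs standalone; the real code would create the handler with ipinfo.getHandlerAsync(token).

```python
import asyncio

class StubAsyncHandler:
    """Stand-in for ipinfo.getHandlerAsync(...); only the shape matters here."""
    async def getDetails(self):
        class Details:
            latitude, longitude = "52.52", "13.40"
        return Details()

    async def deinit(self):
        # The real handler would close its aiohttp ClientSession here.
        self.closed = True

async def get_city_center_coord():
    handler = StubAsyncHandler()  # real code: ipinfo.getHandlerAsync(token)
    try:
        details = await handler.getDetails()
        return details.latitude, details.longitude
    finally:
        # Closing in a finally block guarantees the session is released
        # even if getDetails() raises.
        await handler.deinit()

print(asyncio.run(get_city_center_coord()))
```

The try/finally pairing is the important part: it gives you the same lifetime guarantee an async with block would, without the library needing to implement the context-manager protocol.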
Is there any way to use RIPE records with your script?
Currently, this library is Python 3 specific. __init__.py is missing in the ipinfo.cache package. Also, the ABC metaclass declaration is going to be Python 3 specific. PR incoming.
The release was made in November, but the last commit was made in March.
The latest code contains the feature to add custom headers, which is also documented, but when you pip install the library, this feature is not there.
I ran into an issue when I limited my token to a specific referrer, but then the library was not sending the corresponding header, which resulted in a very ambiguous HTTP 400 error:
return self.handler.getDetails(ip)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".../lib/python3.11/site-packages/ipinfo/handler.py", line 140, in getDetails
response.raise_for_status()
File ".../lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://ipinfo.io/89.58.48.55
Could you please make a new release with the latest code?
I have integrated this library into a Python 3.6 application which runs in ECS on AWS.
The library works as expected when running the application locally, and fails with a 403 response to the request when deployed into the AWS environment. I have done some debugging and this appears to not be a network issue; I can write equivalent code that calls the API using the requests library directly which returns the correct results, while making 'the same' request internally to the library fails with a 403 response.
Sample code to demonstrate this; it is a simplified version of the code running in my application
import logging
import sys

import ipinfo
import requests

ip = "<some IP address>"
try:
    # headers copied from library internals
    headers = {
        "user-agent": "IPinfoClient/Python{version}/{sdk_version}".format(
            version=sys.version_info[0], sdk_version="4.2.1"
        ),
        "accept": "application/json",
        "authorization": "Bearer <my token>",
    }
    response = requests.get(url="https://ipinfo.io/" + ip, headers=headers)
    logging.warning("Headers: %s", response.headers)
    logging.warning("Text: %s", response.text)
    response.raise_for_status()  # this does not fail, the expected values are logged above
except Exception as e:
    logging.warning("manual call failed", exc_info=e)

try:
    return ipinfo.getHandler(<my token>).getDetails(ip_address=ip)  # this fails
except Exception as e:
    logging.warning(
        "Failed to get IP Location Details", exc_info=e,
    )
    return None
The output of the call to the library is
File "/home/stellar/virtualenv/lib/python3.6/site-packages/ipinfo/handler.py", line 98, in getDetails
response.raise_for_status()
File "/home/stellar/virtualenv/lib/python3.6/site-packages/requests/models.py", line 943, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://ipinfo.io/<IP address>
I would appreciate any assistance in diagnosing or fixing this issue
Hello, as I saw in the docs, I can use these options to print out the results. But it shows an error when I run the program. I have the free plan.
import ipinfo
access_token = 'MY-API'
ip_address = '130.117.190.132'
handler = ipinfo.getHandler(access_token)
details = handler.getDetails(ip_address)
print(details.ip)
print(details.hostname)
print(details.city)
print(details.region)
print(details.country)
print(details.loc)
print(details.org)
print(details.postal)
print(details.timezone)
The error that it prints is :
File "C:\Users\User\Files\VsCode\pyprojects\test.py", line 8, in <module>
    print(details.hostname)
File "C:\Python\Python391\lib\site-packages\ipinfo\details.py", line 18, in __getattr__
    raise AttributeError(
AttributeError: hostname is not a valid attribute of Details
Note that, as I tested, when I enter a different IP it may print the same error for attributes other than hostname.
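The Details object raises AttributeError for any field the API did not return, and hostname is only present for some IPs on some plans. A defensive access pattern handles this; the sketch below uses a stand-in Details class that mimics the library's behavior so it runs by itself (the real object also exposes an .all dict of the raw response).

```python
class Details:
    """Stand-in mimicking ipinfo.details.Details: unknown attributes raise."""
    def __init__(self, raw):
        self.all = raw            # the real Details also exposes .all
        self.__dict__.update(raw)

    def __getattr__(self, name):
        # Only called for attributes NOT found in __dict__.
        raise AttributeError(f"{name} is not a valid attribute of Details")

details = Details({"ip": "130.117.190.132", "city": "Paris"})

# Option 1: getattr with a default instead of a bare attribute access
print(getattr(details, "hostname", None))      # None instead of a crash

# Option 2: the .all dict, which holds exactly what the API returned
print(details.all.get("hostname", "n/a"))
```

Either option turns the missing-field case into a value you can check, rather than an exception that kills the script.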
I am getting this error while running the following lines of code in my Python IDE (VS Code).
I am new to the world of async and coroutines so a little help might be useful.
Use case: I receive an IP from a user which is an input to an async function that return the details of the said IP. However, when I run the code it behaves in a very unpredictable way. Sometimes it runs for two or three times and then suddenly pops up the error/warning of event loop being closed. And sometimes it just stops running on the very first try.
Additionally, when I try to profile the code with the timeit module, it gives the same warning at the very last iteration. (Not to mention the time it takes: 0.5s on average, which is not ideal for my use case.)
Code sample 1:
code1 = '''
import time

import ipinfo
import asyncio

access_token = 'd8ee206f9edf'
handler = ipinfo.getHandlerAsync(access_token)
ip_address = '216.239.36.21'
st = time.time()

async def main():
    details = await handler.getDetails(ip_address)
    return details.country_name

# If Jupyter
try:
    loop = asyncio.get_running_loop()
    task = loop.create_task(main())
    # details = asyncio.run(main())
    task.add_done_callback(lambda t: print(t.result()))
# If VScode
except:
    loop = None
    details = asyncio.run(main())
    print(details)
    asyncio.run(handler.httpsess.close())
print(time.time() - st)
'''
timeit.timeit(stmt=code1, number=17) / 17
Error/Warning:
AttributeError: 'NoneType' object has no attribute 'send' Exception ignored in: <function _SSLProtocolTransport.__del__ at 0x00000219B777A1F0> Traceback (most recent call last): File "e:\LLB\VScode\SC\.conda\lib\asyncio\base_events.py", line 515, in _check_closed raise RuntimeError('Event loop is closed') RuntimeError: Event loop is closed
However, when I add the following lines to the setup of the timeit module, it runs fine and fast as well (0.03s). Maybe it is because getHandlerAsync is called once (in the setup) and not every time the loop runs. Any clarification would also be very helpful.
Code Sample 2:
setup = '''
import time

import ipinfo
import asyncio

access_token = 'd8ee206f9edf'
handler = ipinfo.getHandlerAsync(access_token)
ip_address = '216.239.36.21'
'''
code2 = '''
st = time.time()

async def main():
    details = await handler.getDetails(ip_address)
    return details.country_name

# If Jupyter
try:
    loop = asyncio.get_running_loop()
    task = loop.create_task(main())
    # details = asyncio.run(main())
    task.add_done_callback(lambda t: print(t.result()))
# If VScode
except:
    loop = None
    details = asyncio.run(main())
    print(details)
    asyncio.run(handler.httpsess.close())
print(time.time() - st)
'''
timeit.timeit(setup=setup, stmt=code2, number=17) / 17
What should I change in code sample 1 to make it work every time (ideally with shorter runtimes)?
Why does the second code sample work and have a shorter runtime as well?
Thanks.
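The core problem in sample 1 is that the handler (and its aiohttp session) is created once but each asyncio.run() call creates and then closes a fresh event loop, so later calls touch a session bound to a closed loop. Creating and closing the handler inside the coroutine keeps everything on one loop. A sketch of that pattern, using a stub handler so it runs standalone (substitute ipinfo.getHandlerAsync and the session-closing call your installed version provides):

```python
import asyncio

class StubHandler:
    """Stand-in for ipinfo.getHandlerAsync(token); only the shape matters."""
    async def getDetails(self, ip):
        class Details:
            country_name = "United States"
        return Details()

    async def deinit(self):
        # The real handler would close its aiohttp ClientSession here.
        pass

async def main(ip):
    # Create the handler INSIDE the running loop...
    handler = StubHandler()   # real code: ipinfo.getHandlerAsync(access_token)
    try:
        details = await handler.getDetails(ip)
        return details.country_name
    finally:
        # ...and close it on the SAME loop, before asyncio.run() tears it down.
        await handler.deinit()

print(asyncio.run(main("216.239.36.21")))
```

This also explains why sample 2 is faster only by accident: timeit's setup runs once, so the session is reused across iterations, but it is still never closed cleanly. The pattern above gives each asyncio.run() call ownership of the full handler lifecycle.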
Add an optional IP selection handler to the SDK client initialization step which accepts the request context and is expected to return an IP.
Add a default handler which looks at the X-Forwarded-For header and falls back to the source IP.
The resulting IP is the IP for which details are fetched.
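A sketch of what the proposed default selector could look like; the request-context shape (a headers mapping plus a source address) is an assumption, since the feature is not yet designed:

```python
def default_ip_selector(headers, remote_addr):
    """Pick the client IP: first X-Forwarded-For entry, else the source IP."""
    xff = headers.get("X-Forwarded-For", "")
    if xff:
        # XFF is a comma-separated chain; the left-most entry is the
        # original client, later entries are intermediate proxies.
        return xff.split(",")[0].strip()
    return remote_addr

print(default_ip_selector({"X-Forwarded-For": "203.0.113.7, 10.0.0.1"}, "10.0.0.2"))
print(default_ip_selector({}, "10.0.0.2"))
```

The SDK would call this (or a user-supplied replacement) once per request and fetch details for the returned IP.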
Since the ipaddress library is part of the core Python libraries, it would be good to include support for it: https://docs.python.org/3/library/ipaddress.html
The current result:
Traceback (most recent call last):
File "test.py", line 24, in <module>
print(handler.getDetails(ip))
File "ipinfo/handler.py", line 48, in getDetails
raw_details = self._requestDetails(ip_address)
File "ipinfo/handler.py", line 96, in _requestDetails
url += "/" + ip_address
TypeError: must be str, not IPv4Address
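Until the library accepts ipaddress objects natively, coercing with str() on the caller's side sidesteps the TypeError; inside the handler, the same one-line normalization would add the support this issue asks for (the helper name is illustrative, not the library's):

```python
import ipaddress

def normalize_ip(ip):
    """Accept str, IPv4Address, or IPv6Address and return the string form."""
    if isinstance(ip, (ipaddress.IPv4Address, ipaddress.IPv6Address)):
        return str(ip)
    return ip

ip = ipaddress.ip_address("8.8.8.8")
url = "https://ipinfo.io" + "/" + normalize_ip(ip)   # no TypeError now
print(url)   # https://ipinfo.io/8.8.8.8
```

str() on an IPv4Address/IPv6Address is guaranteed to produce the canonical dotted or colon-separated form, so the normalized value is safe to embed in the request URL.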
I initially received a warning that enum34 was not correct. I uninstalled enum34 and reattempted.
I then received warnings for three dependencies: aiohttp, multidict, yarl. They all say permission denied.
Underneath, it says it could not build wheels for these three, because they use PEP 517 and cannot be installed directly.
https://github.com/ipinfo/python/blob/master/ipinfo/handler.py#L42
If cache_options was defined, we end up re-sending it after having already retrieved maxsize and ttl from the same dictionary. This passes the same parameters twice in the function call, so it fails.
Line 114 in a003e43: the ip_address parameter is optional, but a ValueError will be raised by the bogon check here if ip_address is None.
Currently the code retrieves results, caches that, and then formats and returns. This means even if the results are in the cache, we go through a formatting step.
As an optimization, especially in batch ops, we should retrieve -> format -> cache, so that formatting isn't happening unnecessarily on cached details.
(The async code in #32 will already do this, so this fix is mostly for the synchronous code).
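The reordering can be sketched with a dict cache and a counter showing the formatting step runs once per distinct IP; the names are illustrative, not the library's internals:

```python
format_calls = 0

def format_details(raw):
    """Illustrative formatting step: derive a display field from raw data."""
    global format_calls
    format_calls += 1
    return {**raw, "country_name": raw["country"].upper()}

cache = {}

def get_details(ip, fetch):
    # retrieve -> format -> cache: cached entries are already formatted,
    # so a cache hit skips the formatting pass entirely.
    if ip not in cache:
        cache[ip] = format_details(fetch(ip))
    return cache[ip]

fetch = lambda ip: {"ip": ip, "country": "us"}
get_details("8.8.8.8", fetch)
get_details("8.8.8.8", fetch)   # cache hit, no second formatting pass
print(format_calls)             # 1
```

In the current retrieve -> cache -> format order, format_calls would be 2 here; for batch operations over mostly-cached IPs the saved formatting work adds up.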
import requests

headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) '
                  'Chrome/108.0.0.0 '
                  'Safari/537.36'
}
url = "https://ipinfo.io/8.8.8.8?token=***********"
response = requests.get(url, headers=headers)
data = response.json()
print(data)
For a long time, I used the above code and it correctly returned JSON content. Since two days ago, error 406 has been returned, as follows:
{'error': '406', 'request': '/8.8.8.8?token=***********"', 'data': "If you believe you've received this in error, please open a support request."}
2 issues combined into one:
Processing 50,000 IP addresses using the batch functionality provided by the library is limited to 1000 IPs per batch. Due to the limitation, users need to split it into 50 different batches. Sometimes the server might time out and the request would be left incomplete. Despite the timeout, the requests still counted for the request limit. We need to improve the batch processing functionality and timeouts handling with large amounts of data.
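Client-side, the 50,000 addresses have to be cut into chunks of at most 1000 IPs, and a per-chunk retry keeps one timeout from sinking the whole run while recording which chunks were never completed. A stdlib sketch of that loop; the lookup callable stands in for the library's batch call:

```python
def chunked(items, size=1000):
    """Yield successive chunks of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def batch_lookup(ips, lookup, retries=2):
    """Run lookup(chunk) per chunk, retrying chunks that time out.

    Returns (results, failed): merged results plus the IPs from chunks
    that still timed out after all retries.
    """
    results, failed = {}, []
    for chunk in chunked(ips):
        for attempt in range(retries + 1):
            try:
                results.update(lookup(chunk))
                break
            except TimeoutError:
                if attempt == retries:
                    failed.extend(chunk)   # give up on this chunk only
    return results, failed

ips = [f"10.0.{i // 256}.{i % 256}" for i in range(2500)]
results, failed = batch_lookup(ips, lambda chunk: {ip: {} for ip in chunk})
print(len(results), len(failed))   # 2500 0
```

Tracking the failed list also addresses the billing complaint: the caller knows exactly which IPs were never resolved, instead of silently losing them when a batch times out.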
There seems to currently be a problem with the /batch endpoint of the backend API. When the endpoint is used to look up an ASN which, according to the /<ASN>/json endpoint, "Can't be found", it seems to hang.
I have raised an issue regarding this with your support team and they suggested that I also open an issue here since it can be somewhat handled gracefully in this library.
When fetching the problematic ASNs via getBatchDetails, it just hangs until the timeout_per_batch value is exceeded and then throws an exception (see stack trace below).
2021-02-07 16:41:41,092 [INFO] looking up ASN range "range(1870, 2000)"
2021-02-07 16:41:41,093 [DEBUG] Starting new HTTPS connection (1): ipinfo.io:443
Traceback (most recent call last):
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 445, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 440, in _make_request
httplib_response = conn.getresponse()
File "/usr/lib64/python3.8/http/client.py", line 1347, in getresponse
response.begin()
File "/usr/lib64/python3.8/http/client.py", line 307, in begin
version, status, reason = self._read_status()
File "/usr/lib64/python3.8/http/client.py", line 268, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
File "/usr/lib64/python3.8/socket.py", line 669, in readinto
return self._sock.recv_into(b)
File "/usr/lib64/python3.8/ssl.py", line 1241, in recv_into
return self.read(nbytes, buffer)
File "/usr/lib64/python3.8/ssl.py", line 1099, in read
return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/klm/project/venv/lib64/python3.8/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 755, in urlopen
retries = retries.increment(
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/util/retry.py", line 531, in increment
raise six.reraise(type(error), error, _stacktrace)
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/packages/six.py", line 735, in reraise
raise value
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 699, in urlopen
httplib_response = self._make_request(
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 447, in _make_request
self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
File "/home/klm/project/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 336, in _raise_timeout
raise ReadTimeoutError(
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='ipinfo.io', port=443): Read timed out. (read timeout=60)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "api.py", line 86, in <module>
asn_data = client.getBatchDetails([f"AS{num}" for num in bulk_asns], timeout_per_batch=60, raise_on_fail=False)
File "/home/klm/project/venv/lib64/python3.8/site-packages/ipinfo/handler.py", line 189, in getBatchDetails
response = requests.post(
File "/home/klm/project/venv/lib64/python3.8/site-packages/requests/api.py", line 119, in post
return request('post', url, data=data, json=json, **kwargs)
File "/home/klm/project/venv/lib64/python3.8/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/home/klm/project/venv/lib64/python3.8/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/klm/project/venv/lib64/python3.8/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/home/klm/project/venv/lib64/python3.8/site-packages/requests/adapters.py", line 529, in send
raise ReadTimeout(e, request=request)
requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='ipinfo.io', port=443): Read timed out. (read timeout=60)
The error itself seems to occur at line 189 of handler.py (https://github.com/ipinfo/python/blob/master/ipinfo/handler.py#L189).
I've taken a snapshot of the local variables at the time of the call to requests.post on line 189:
{
"batch_size": 1000,
"chunk": ["AS1870", "AS1871", "AS1872", "AS1873", "AS1874", "AS1875", "AS1876", "AS1877", "AS1878", "AS1879", "AS1880", "AS1881", "AS1882", "AS1883", ...],
"headers": {
"user-agent": "IPinfoClient/Python3/4.1.0",
"accept": "application/json",
"authorization": "Bearer <MY API KEY>",
"content-type": "application/json"
},
"i": 0,
"ip_address": "AS1919",
"ip_addresses": ["AS1870", "AS1871", "AS1872", "AS1873", "AS1874", "AS1875", "AS1876", "AS1877", "AS1878", "AS1879", "AS1880", "AS1881", "AS1882", "AS1883", ...],
"lookup_addresses": ["AS1870", "AS1871", "AS1872", "AS1873", "AS1874", "AS1875", "AS1876", "AS1877", "AS1878", "AS1879", "AS1880", "AS1881", "AS1882", "AS1883", ...],
"raise_on_fail": False
}
My input arguments to getBatchDetails were:
ip_addresses: ["AS1870", ..., "AS1919"] (all 50 between those values)
timeout_per_batch: 60
raise_on_fail: False
I have been able to bypass this problem by making individual calls to getDetails in the meantime. However, being able to use getBatchDetails would be preferable to reduce network overhead.
I do believe that once the issue with /batch has been resolved in the main API, this would likely sort itself out, but perhaps there could be handling for the requests.post call failing in these erroneous cases?
Thanks
Create a simple function that accepts an IP list (max 500k) and returns the JSON response from https://ipinfo.io/maps.
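A sketch of such a helper. The request details (POST body shape, content type, response format) are assumptions to verify against the https://ipinfo.io/maps endpoint, so the payload construction is split out where it can be checked without any network access:

```python
import json
import urllib.request

MAPS_URL = "https://ipinfo.io/maps"   # endpoint named in the request above
MAX_IPS = 500_000

def build_map_request(ips):
    """Validate the IP list and build (url, body) for the maps endpoint."""
    if len(ips) > MAX_IPS:
        raise ValueError(f"at most {MAX_IPS} IPs per map, got {len(ips)}")
    # Assumption: the endpoint accepts a JSON array of IP strings.
    return MAPS_URL, json.dumps(ips).encode()

def get_map(ips):
    """POST the IP list and return the parsed JSON response (assumed shape)."""
    url, body = build_map_request(ips)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Payload construction alone (no network):
print(build_map_request(["8.8.8.8", "1.1.1.1"]))
```

Keeping validation and payload building separate from the HTTP call makes the 500k limit testable and leaves one obvious place to adjust once the real request format is confirmed.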