
dims / etcd3-gateway

This repository is now read-only. Please see https://opendev.org/openstack/etcd3gw for the new location for this code.

Home Page: https://opendev.org/openstack/etcd3gw

License: Apache License 2.0

Python 98.75% Shell 1.25%

etcd3-gateway's Introduction

etcd3 gateway Python Client


A Python client for the etcd3 grpc-gateway v3 API.

etcd3-gateway's People

Contributors

dims, fasaxc, gmelikov, heychirag, lihiwish, skywalker-nick, stmcginnis, thatisgeek, thomasgoirand, tirkarthi, yoctozepto


etcd3-gateway's Issues

Issues with TLS client authentication

While setting up Calico in our OpenStack deployment, we encountered issues when trying to use TLS authentication to access the etcd cluster.

To reproduce:

  1. Set up an etcd cluster with TLS client authentication.
  2. Create a new client: c = Etcd3Client(host="server", protocol="https", ca_cert="server_ca.crt", cert_cert="user.crt", cert_key="user.key")
  3. Try to get some random key: c.get("random_key")
  4. You should get a TLS alert bad certificate error, or CERTIFICATE_VERIFY_FAILED if server_ca is not in your system trust store.

Looking through the source code, we noticed that the TLS parameters are stored in self.kwargs and then never used:

self.kwargs = {

When the code was edited to actually set these values on the session object, TLS auth started working.
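A minimal sketch of that workaround, under the assumption that the client's underlying transport is a requests.Session (whose standard TLS attributes are `verify` and `cert`); `apply_tls_to_session` is a hypothetical helper, not etcd3gw code:

```python
def apply_tls_to_session(session, ca_cert=None, cert_cert=None, cert_key=None):
    """Apply Etcd3Client-style TLS arguments to a requests.Session.

    Mirrors the fix described above: instead of stashing the parameters
    in self.kwargs, set them on the session so every request uses them.
    """
    if ca_cert is not None:
        session.verify = ca_cert              # trust this CA bundle
    if cert_cert is not None and cert_key is not None:
        session.cert = (cert_cert, cert_key)  # client certificate pair
    return session
```

Any object with `verify`/`cert` attributes works, so this can be smoke-tested without a live etcd.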

The issue appeared on both Ubuntu Xenial (Python 2.7.12, OpenSSL 1.0.2g) and Fedora 27 (Python 2.7.14, OpenSSL 1.1.0h).

Fetching lease fails with TTL

I'm using etcd3gw in OpenStack Cinder and I'm running into an issue where the lease refresh fails due to a missing field.

$ pip freeze | grep etcd
etcd3==0.6.2
etcd3gw==0.1.0

---excerpt from the cinder log

2017-08-23 12:25:16.771 24793 ERROR root [-] Unexpected exception occurred 60 time(s)... retrying.: KeyError: 'TTL'
2017-08-23 12:25:16.771 24793 ERROR root Traceback (most recent call last):
2017-08-23 12:25:16.771 24793 ERROR root File "/usr/lib/python2.7/site-packages/oslo_utils/excutils.py", line 250, in wrapper
2017-08-23 12:25:16.771 24793 ERROR root return infunc(*args, **kwargs)
2017-08-23 12:25:16.771 24793 ERROR root File "/usr/lib/python2.7/site-packages/tooz/coordination.py", line 187, in _beat_forever_until_stopped
2017-08-23 12:25:16.771 24793 ERROR root wait_until_next_beat = self._driver.heartbeat()
2017-08-23 12:25:16.771 24793 ERROR root File "/usr/lib/python2.7/site-packages/tooz/drivers/etcd3gw.py", line 191, in heartbeat
2017-08-23 12:25:16.771 24793 ERROR root lock.heartbeat()
2017-08-23 12:25:16.771 24793 ERROR root File "/usr/lib/python2.7/site-packages/tooz/drivers/etcd3gw.py", line 38, in wrapper
2017-08-23 12:25:16.771 24793 ERROR root return func(*args, **kwargs)
2017-08-23 12:25:16.771 24793 ERROR root File "/usr/lib/python2.7/site-packages/tooz/drivers/etcd3gw.py", line 154, in heartbeat
2017-08-23 12:25:16.771 24793 ERROR root self._lease.refresh()
2017-08-23 12:25:16.771 24793 ERROR root File "/usr/lib/python2.7/site-packages/etcd3gw/lease.py", line 64, in refresh
2017-08-23 12:25:16.771 24793 ERROR root return int(result['result']['TTL'])
2017-08-23 12:25:16.771 24793 ERROR root KeyError: 'TTL'
2017-08-23 12:25:16.771 24793 ERROR root
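The etcd grpc-gateway can omit 'TTL' from a keep-alive response when the lease has already expired, which would produce exactly the KeyError above. One defensive option is to treat the missing field as an expired lease (TTL 0); `parse_refresh_ttl` is a hypothetical helper sketching that, not the library's code:

```python
def parse_refresh_ttl(result):
    """Return the refreshed TTL from a lease keep-alive response.

    A missing 'TTL' field is read as 0 (lease expired) instead of
    raising KeyError, so callers such as heartbeat loops can react
    by re-acquiring the lease.
    """
    return int(result.get('result', {}).get('TTL', 0))
```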

support etcd cluster

Can I set multiple etcd cluster addresses, e.g. etcd3gw.client(host="192.168.0.9,192.168.0.10")?
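The client takes a single host, so multi-endpoint support would have to be layered on top. A rough client-side failover sketch, where `make_client` and the endpoint list are hypothetical stand-ins for your own client factory:

```python
def first_reachable(endpoints, make_client, probe=lambda c: c.status()):
    """Return a client for the first endpoint that answers a status probe.

    endpoints: iterable of (host, port) pairs.
    make_client: factory, e.g. lambda host, port: Etcd3Client(host=host, port=port).
    probe: any cheap call that raises when the endpoint is down.
    """
    last_error = None
    for host, port in endpoints:
        client = make_client(host=host, port=port)
        try:
            probe(client)          # raises if this endpoint is unreachable
            return client
        except Exception as exc:   # broad by design: any failure means "try next"
            last_error = exc
    raise last_error or RuntimeError("no endpoints supplied")
```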

Doesn't respect the real key prefix

I use the client.watch_prefix API to watch a set of keys that start with '/tx-resource'.
But it always returns events for keys starting with '/tx-', including both '/tx-status' and '/tx-resource'.
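The symptom is consistent with the watch's range_end being computed from too short a portion of the prefix. Per etcd's range semantics, watching exactly `prefix*` means key=prefix and range_end=prefix with its last byte incremented (trailing 0xff bytes dropped first). A sketch of that computation, not the library's actual code:

```python
def prefix_range_end(prefix: bytes) -> bytes:
    """Compute the exclusive range_end covering exactly the keys under prefix.

    e.g. b'/tx-resource' -> b'/tx-resourcf', so b'/tx-status' falls
    outside the watched range while b'/tx-resource/1' falls inside it.
    """
    s = bytearray(prefix)
    while s and s[-1] == 0xFF:   # 0xff cannot be incremented; drop it
        s.pop()
    if not s:                    # prefix was all 0xff: watch to end of keyspace
        return b'\x00'
    s[-1] += 1
    return bytes(s)
```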

Watch doesn't handle error cases

I do a watch for a key prefix with client.watch_prefix('prefix'). The server is an etcd gateway (the TCP proxy).
It turns out etcd3-gateway doesn't handle at least two error cases:

  1. when I restart the (upstream) etcd nodes, the etcd gateway closes the JSON streaming "array" and ends the connection with: {"error":{"grpc_code":14,"http_code":503,"message":"rpc error: code = Unavailable desc = transport is closing","http_status":"Service Unavailable"}}
  2. when I cut the etcd gateway's TCP connections to the upstream servers it just closes the HTTP connection

Currently etcd3-gateway handles neither of these, making the watch call hang forever.
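Handling case 1 means recognising the error envelope in the JSON stream so the caller can stop or reconnect instead of hanging. A hypothetical frame classifier sketching that (the error shape is the one quoted above):

```python
import json

def classify_watch_frame(line: bytes):
    """Classify one line of the gateway's JSON watch stream.

    Returns ('error', payload) for the gateway's error envelope and
    ('event', payload) for normal watch responses, letting the watch
    loop surface failures instead of silently hanging.
    """
    frame = json.loads(line)
    if 'error' in frame:
        return 'error', frame['error']
    return 'event', frame.get('result', frame)
```

Case 2 (a plain connection close) would additionally need a read timeout or EOF check in the streaming loop.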

Default endpoint changed causing etcd3gw.exceptions.Etcd3Exception: Not Found

In the new release (0.2.6 from Jul 30, 2020), etcd3gw switched from the /v3alpha to the /v3beta endpoints, which are not available in the default etcd installations on current platforms (Ubuntu Bionic, Focal; CentOS 8). The resulting exception wording ("Not Found") is unfortunate, and I guess we all fell into this trap of thinking etcd3gw was unable to raise (some of?) its own exceptions:

Example:

2020-08-12 10:22:39.645 1787 ERROR oslo_service.service [-] Error starting thread.: etcd3gw.exceptions.Etcd3Exception: Not Found
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service Traceback (most recent call last):
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/oslo_service/service.py", line 807, in run_service
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     service.start()
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/cinder/service.py", line 220, in start
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     coordination.COORDINATOR.start()
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/cinder/coordination.py", line 67, in start
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     self.coordinator.start(start_heart=True)
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/tooz/coordination.py", line 689, in start
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     super(CoordinationDriverWithExecutor, self).start(start_heart)
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/tooz/coordination.py", line 426, in start
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     self._start()
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/tooz/drivers/etcd3gw.py", line 224, in _start
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     self._membership_lease = self.client.lease(self.membership_timeout)
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/etcd3gw/client.py", line 122, in lease
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     json={"TTL": ttl, "ID": 0})
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service   File "/var/lib/kolla/venv/lib/python3.6/site-packages/etcd3gw/client.py", line 91, in post
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service     raise exceptions.Etcd3Exception(resp.text, resp.reason)
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service etcd3gw.exceptions.Etcd3Exception: Not Found
2020-08-12 10:22:39.645 1787 ERROR oslo_service.service 
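The mismatch is version-dependent: etcd 3.2's gateway serves only /v3alpha/, 3.3 serves /v3beta/, and 3.4+ serves /v3/. A hypothetical helper (not part of etcd3gw) that picks the matching prefix from the server's reported version; etcd3gw itself hardcodes its path, so this would have to be wired in, or expressed via an api_path option if your etcd3gw version has one:

```python
def api_path_for(server_version: str) -> str:
    """Pick the grpc-gateway API prefix for a given etcd server version.

    Boundaries per the etcd gateway docs: /v3alpha/ up to 3.2,
    /v3beta/ for 3.3, /v3/ from 3.4 onward.
    """
    major, minor = (int(x) for x in server_version.split('.')[:2])
    if (major, minor) >= (3, 4):
        return '/v3/'
    if (major, minor) == (3, 3):
        return '/v3beta/'
    return '/v3alpha/'
```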

No way to get important metadata

One issue that we found in writing our product using etcd3gw is that the methods don't return the metadata from the response "envelope". There are various cases where that information is needed:

  • To do watches correctly, you need the response revision, not the per-key revision. This is because the per-key revision may be pre-compaction and you can't watch on a compacted revision.
  • To do chunked gets (with limit=<n>) correctly, you need the "more" flag from the response and the response revision.

Perhaps you could switch to a dedicated Response object that exposes the KVs and other metadata?
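A sketch of what such a Response object could look like, assuming the raw /kv/range JSON body shape (header.revision, kvs, more) from the etcd gateway; both the class and the parser are hypothetical, not an existing etcd3gw API:

```python
from dataclasses import dataclass

@dataclass
class GetResponse:
    """Envelope-aware return type: key/values plus response metadata."""
    kvs: list          # the key/value pairs themselves
    revision: int      # header.revision: cluster revision at response time
    more: bool = False # True when limit=<n> left further keys unread

def parse_range_response(raw: dict) -> GetResponse:
    """Build a GetResponse from a raw /kv/range JSON body."""
    return GetResponse(
        kvs=raw.get('kvs', []),
        revision=int(raw['header']['revision']),
        more=bool(raw.get('more', False)),
    )
```

With this, a watch can start from `resp.revision` and a chunked get can loop while `resp.more` is set.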

Values are not correctly decoded in python3

I was trying to acquire the lock using this module with Python 3, but lock.is_acquired() always returned False. After some investigation, I found that the values are not correctly decoded: utils._decode returns an instance of bytes instead of str on Python 3.

Running the unit tests with Python 3 against a running etcd3 server reveals the issue.

$ tox -e py36
...
collected 21 items                                                                                                                                                             

etcd3gw/tests/test_client.py::TestEtcd3Gateway::test_client_default PASSED                                                                                               [  4%]
etcd3gw/tests/test_client.py::TestEtcd3Gateway::test_client_ipv4 PASSED                                                                                                  [  9%]
etcd3gw/tests/test_client.py::TestEtcd3Gateway::test_client_ipv6 PASSED                                                                                                  [ 14%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_client_lease PASSED                                                                                                [ 19%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_client_lease_with_keys FAILED                                                                                      [ 23%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_client_lock_acquire_release PASSED                                                                                 [ 28%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_client_locks FAILED                                                                                                [ 33%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_client_members PASSED                                                                                              [ 38%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_client_status PASSED                                                                                               [ 42%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_client_with_keys_and_values FAILED                                                                                 [ 47%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_create_fail FAILED                                                                                                 [ 52%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_create_success FAILED                                                                                              [ 57%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_get_and_delete_prefix FAILED                                                                                       [ 61%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_get_prefix_sort_order FAILED                                                                                       [ 66%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_get_prefix_sort_order_explicit_sort_target_key FAILED                                                              [ 71%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_get_prefix_sort_order_explicit_sort_target_rev FAILED                                                              [ 76%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_replace_fail FAILED                                                                                                [ 80%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_replace_success FAILED                                                                                             [ 85%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_sequential_watch_prefix_once PASSED                                                                                [ 90%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_watch_key FAILED                                                                                                   [ 95%]
etcd3gw/tests/test_etcd3gw.py::TestEtcd3Gateway::test_watch_prefix FAILED                                                                                                [100%]

=================================================================================== FAILURES ===================================================================================
...
_________________________________________________________________ TestEtcd3Gateway.test_client_lease_with_keys _________________________________________________________________
NOTE: Incompatible Exception Representation, displaying natively:

testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/root/src/etcd3gw/etcd3gw/tests/test_etcd3gw.py", line 222, in test_client_lease_with_keys
    self.assertIn('foo12', keys)
  File "/root/src/etcd3gw/.tox/py36/lib/python3.6/site-packages/testtools/testcase.py", line 417, in assertIn
    self.assertThat(haystack, Contains(needle), message)
  File "/root/src/etcd3gw/.tox/py36/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 'foo12' not in [b'foo12', b'foo13']


______________________________________________________________________ TestEtcd3Gateway.test_client_locks ______________________________________________________________________
NOTE: Incompatible Exception Representation, displaying natively:

testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/root/src/etcd3gw/etcd3gw/tests/test_etcd3gw.py", line 354, in test_client_locks
    self.assertTrue(lock.is_acquired())
  File "/root/src/etcd3gw/.tox/py36/lib/python3.6/site-packages/unittest2/case.py", line 702, in assertTrue
    raise self.failureException(msg)
AssertionError: False is not true
...
===================================================================== 13 failed, 8 passed in 9.60 seconds ======================================================================

Request to have a new release cut

There have been several commits since the last etcd3-gateway release. The ones that include resp.reason as part of an exception are of particular interest, since that makes debugging client issues through the gateway far easier. As such, it would be much appreciated to have a new release cut.

Watch seems to leak connections

Run a local etcd server on port 2379, e.g. with docker run --net=host quay.io/coreos/etcd.

Create an Etcd3Client and make an initial connection to the server:

(py27) neil@smaug:~/calico/networking-calico$ python
Python 2.7.14+ (default, Mar 13 2018, 15:23:44) 
[GCC 7.3.0] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from etcd3gw.client import Etcd3Client
>>> c = Etcd3Client()
>>> c.status()
{u'raftTerm': u'2', u'header': {u'raft_term': u'2', u'revision': u'1', u'cluster_id': u'14841639068965178418', u'member_id': u'10276657743932975437'}, u'version': u'3.2.9', u'raftIndex': u'4', u'dbSize': u'24576', u'leader': u'10276657743932975437'}

Find out the PID of that python shell, and then in another window monitor its etcd connections with

watch 'lsof -p 6770 | grep 2379'

You'll initially see that there is 1 connection:

python  6770 neil    3u  IPv4 837856      0t0      TCP localhost:40306->localhost:2379 (ESTABLISHED)

Start and then cancel a watch:

>>> stream, cancel = c.watch_prefix("/")
>>> cancel()

The lsof watch still shows just 1 connection.

However, now, for each further iteration of

>>> stream, cancel = c.watch_prefix("/")
>>> cancel()

we see 1 more ESTABLISHED connection. For example, after 8 more iterations, there are now 9 ESTABLISHED connections:

python  6770 neil    3u  IPv4 837856      0t0      TCP localhost:40306->localhost:2379 (ESTABLISHED)
python  6770 neil    4u  IPv4 852249      0t0      TCP localhost:40374->localhost:2379 (ESTABLISHED)
python  6770 neil    5u  IPv4 851242      0t0      TCP localhost:40386->localhost:2379 (ESTABLISHED)
python  6770 neil    6u  IPv4 880526      0t0      TCP localhost:40580->localhost:2379 (ESTABLISHED)
python  6770 neil    7u  IPv4 884101      0t0      TCP localhost:40584->localhost:2379 (ESTABLISHED)
python  6770 neil    8u  IPv4 884108      0t0      TCP localhost:40588->localhost:2379 (ESTABLISHED)
python  6770 neil   10u  IPv4 884113      0t0      TCP localhost:40592->localhost:2379 (ESTABLISHED)
python  6770 neil   11u  IPv4 885316      0t0      TCP localhost:40596->localhost:2379 (ESTABLISHED)
python  6770 neil   12u  IPv4 880574      0t0      TCP localhost:40600->localhost:2379 (ESTABLISHED)

I believe this pattern continues without any bound, because in an overnight OpenStack churn test with networking-calico, the DHCP agent (which uses etcd3gw, and creates and cancels watches as above) was found to have more than 900 open connections to etcd.
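The leak pattern is consistent with cancel() stopping the event iteration but never closing the streaming HTTP response, so the socket stays open. Closing the response (requests.Response.close releases the connection) would fix it. A sketch of such a cancel factory; this is a hypothetical illustration of the fix, not etcd3gw's actual code:

```python
import threading

def make_cancel(response, event=None):
    """Build a cancel() for a watch that releases its HTTP connection.

    response: any object with a close() method (e.g. a streaming
    requests.Response). Returns (cancel, stop_event); the reader loop
    should exit when stop_event is set.
    """
    event = event or threading.Event()

    def cancel():
        event.set()        # tell the reader loop to stop
        response.close()   # drop the underlying TCP connection

    return cancel, event
```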

Having IPv6 addresses in client URL fails

See https://review.openstack.org/505168 for context, config file has:

backend_url = etcd3+http://[::1]:2379

but the address seems to get mangled along the way:

Sep 19 15:14:36.951835 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server [None req-bf45693c-0d1e-4e1f-884b-e1a5419b1a81 tempest-TestVolumeSwap-1355161147 None] Exception during message handling: InvalidURL: Failed to parse: ::1:2379
Sep 19 15:14:36.951988 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Sep 19 15:14:36.952169 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 160, in _process_incoming
Sep 19 15:14:36.952324 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Sep 19 15:14:36.952500 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 222, in dispatch
Sep 19 15:14:36.952662 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Sep 19 15:14:36.952804 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 192, in _do_dispatch
Sep 19 15:14:36.952954 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Sep 19 15:14:36.953153 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "<decorator-gen-245>", line 2, in attach_volume
Sep 19 15:14:36.953305 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/opt/stack/new/cinder/cinder/coordination.py", line 169, in _synchronized
Sep 19 15:14:36.953439 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     with lock(blocking):
Sep 19 15:14:36.954766 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tooz/locking.py", line 31, in __enter__
Sep 19 15:14:36.954949 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return self.lock.__enter__(*self.args, **self.kwargs)
Sep 19 15:14:36.955110 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tooz/locking.py", line 52, in __enter__
Sep 19 15:14:36.955404 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     acquired = self.acquire(*args, **kwargs)
Sep 19 15:14:36.955559 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tooz/drivers/etcd3gw.py", line 38, in wrapper
Sep 19 15:14:36.955711 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return func(*args, **kwargs)
Sep 19 15:14:36.958234 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tooz/drivers/etcd3gw.py", line 113, in acquire
Sep 19 15:14:36.958396 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return _acquire()
Sep 19 15:14:36.958541 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tenacity/__init__.py", line 171, in wrapped_f
Sep 19 15:14:36.958690 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return self.call(f, *args, **kw)
Sep 19 15:14:36.958850 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tenacity/__init__.py", line 248, in call
Sep 19 15:14:36.959000 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     start_time=start_time)
Sep 19 15:14:36.959157 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tenacity/__init__.py", line 203, in iter
Sep 19 15:14:36.959309 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return fut.result()
Sep 19 15:14:36.959461 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/concurrent/futures/_base.py", line 422, in result
Sep 19 15:14:36.959612 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return self.__get_result()
Sep 19 15:14:36.959755 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tenacity/__init__.py", line 251, in call
Sep 19 15:14:36.959902 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     result = fn(*args, **kwargs)
Sep 19 15:14:36.960044 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/tooz/drivers/etcd3gw.py", line 82, in _acquire
Sep 19 15:14:36.960185 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     self._lease = self._coord.client.lease(self._timeout)
Sep 19 15:14:36.960332 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/etcd3gw/client.py", line 113, in lease
Sep 19 15:14:36.960520 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     json={"TTL": ttl, "ID": 0})
Sep 19 15:14:36.960665 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/etcd3gw/client.py", line 78, in post
Sep 19 15:14:36.960823 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     resp = self.session.post(*args, **kwargs)
Sep 19 15:14:36.960977 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 555, in post
Sep 19 15:14:36.961162 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     return self.request('POST', url, data=data, json=json, **kwargs)
Sep 19 15:14:36.961305 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 494, in request
Sep 19 15:14:36.961448 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     prep = self.prepare_request(req)
Sep 19 15:14:36.961607 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 437, in prepare_request
Sep 19 15:14:36.961757 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     hooks=merge_hooks(request.hooks, self.hooks),
Sep 19 15:14:36.961899 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 305, in prepare
Sep 19 15:14:36.962069 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     self.prepare_url(url, params)
Sep 19 15:14:36.962212 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 373, in prepare_url
Sep 19 15:14:36.962354 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server     raise InvalidURL(*e.args)
Sep 19 15:14:36.962495 ubuntu-xenial-infracloud-chocolate-10994920 cinder-volume[22504]: ERROR oslo_messaging.rpc.server InvalidURL: Failed to parse: ::1:2379
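The traceback shows the brackets being lost before the URL reaches requests: 'http://::1:2379' is unparseable, while 'http://[::1]:2379' is valid. A sketch of the URL construction the client would need, re-bracketing bare IPv6 literals (`build_base_url` is a hypothetical helper):

```python
def build_base_url(host: str, port: int, protocol: str = 'http') -> str:
    """Build the client base URL, bracketing bare IPv6 literals.

    'localhost' -> 'http://localhost:2379'
    '::1'       -> 'http://[::1]:2379'   (requests rejects the unbracketed form)
    """
    if ':' in host and not host.startswith('['):
        host = '[{}]'.format(host)
    return '{}://{}:{}'.format(protocol, host, port)
```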

Move to openstack?

Hi @dims

not sure if this was discussed, but would it make sense to move this project to opendev, aka the openstack hosting infra?

Sort target is incorrectly mapped

If I specify "key" for the sort target, it gets mapped to "sort_target": 1 on the API, but the docs say that key should be 0:

  enum SortTarget {
	KEY = 0;
	VERSION = 1;
	CREATE = 2;
	MOD = 3;
	VALUE = 4;
  }
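A mapping that matches the SortTarget enum above; the point of the report is that "key" must map to 0, not 1. Sketch of the corrected table (names lowercased as the client accepts them):

```python
# Corrected sort-target mapping, mirroring etcd's SortTarget enum.
SORT_TARGET = {
    'key': 0,      # sort by key bytes (the value the client currently gets wrong)
    'version': 1,
    'create': 2,   # creation revision
    'mod': 3,      # modification revision
    'value': 4,
}
```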
