pyrqlite's Introduction

rqlite is a relational database which combines SQLite's simplicity with the power of a robust, fault-tolerant, distributed system. It's designed for easy deployment and lightweight operation, offering a developer-friendly and operator-centric solution for Linux, macOS, and Windows, as well as various CPU platforms.

Check out the rqlite FAQ.

Why run rqlite?

rqlite is your solution for a rock-solid, fault-tolerant, relational database with effortless installation, deployment, and operation. It's ideal as a lightweight, distributed relational data store for both developers and operators. Think Consul or etcd, but with relational modeling available.

Use rqlite to reliably store your most important data, ensuring it's always available to your applications. If you're interested in understanding how distributed systems actually work, it's a good example to study. A lot of thought has gone into its design, separating storage, consensus, and API clearly.


Pronunciation

Common pronunciations of rqlite include "R Q lite" and "ree-qwell-lite".

pyrqlite's People

Contributors

0x0101010, alanjds, arthurzam, benschweizer, blackmoomba, harpreetkhanuja51, kaihil, otoolep, sc68cal, sum12, thedrow, thotypous, zmedico


pyrqlite's Issues

EphemeralRqlited does not wait for rqlited to become leader

The pyrqlite unit tests fail with rqlite 6.7.0 because EphemeralRqlited does not wait for the ephemeral rqlited instance to become the leader. This can be fixed by polling the HTTP /status endpoint until .store.raft.state is Leader.
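The polling described above could be sketched as follows. The `/status` endpoint and the `.store.raft.state` path come from the issue text; the timeout and interval values are assumptions.

```python
# Sketch: poll rqlited's /status endpoint until the node reports itself
# as Raft leader (a sketch, not EphemeralRqlited's actual code).
import json
import time
import urllib.request


def is_leader(status):
    """True if a parsed /status payload reports the node as Raft leader."""
    return status.get('store', {}).get('raft', {}).get('state') == 'Leader'


def wait_for_leader(host, port, timeout=30.0, interval=0.25):
    """Poll http://host:port/status until the node is leader or timeout expires."""
    deadline = time.monotonic() + timeout
    url = 'http://{}:{}/status'.format(host, port)
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if is_leader(json.loads(resp.read().decode('utf-8'))):
                    return True
        except OSError:
            pass  # node not reachable yet
        time.sleep(interval)
    return False
```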

support ssl_context in connect args

pyrqlite hard-codes its SSL parameters, so tuning things like certificate verification is not possible.
Drivers like pg8000 support an optional custom ssl_context to override these defaults.

From SQLAlchemy, this offers a neat interface:

    import ssl

    import sqlalchemy as sa

    ssl_context = ssl.create_default_context()
    ssl_context.check_hostname = False
    ssl_context.verify_mode = ssl.CERT_NONE
    engine = sa.create_engine(
        "postgresql+pg8000://scott:[email protected]/test",
        connect_args={"ssl_context": ssl_context},
    )

Join queries with duplicate column names return rows whose keys do not match the number of values

First, this is my table creation SQL:
CREATE TABLE IF NOT EXISTS dp_project_gateway (
id integer primary key not null,
name text not null,
describe text,
machine_id integer not null, /* foreign_key machine(id) */
create_time timestamp not null,
CONSTRAINT major_sign_1 UNIQUE(name),
CONSTRAINT major_sign_2 UNIQUE(machine_id));
CREATE TABLE IF NOT EXISTS dp_project_strategy (
id integer primary key not null,
name text not null,
describe text,
strategy_type integer not null,
target_type integer not null,
way integer,
project_gateway_id integer not null, /* foreign_key dp_project_gateway(id) */
platform_type integer,
platform_id integer,
raw_cycle_type integer not null,
raw_cycles text,
raw_time_type integer not null,
raw_time timestamp,
raw_start_time timestamp,
raw_end_time timestamp,
raw_every_few_type integer,
raw_every_few integer,
reserve_count integer not null,
CONSTRAINT major_sign UNIQUE(name, strategy_type, target_type));

Then, this is my test script:
#!/usr/bin/env python

import pyrqlite.dbapi2 as dbapi2

connection = dbapi2.connect(host='192.168.25.216', port=4001)
with connection.cursor() as cursor:
    cursor.execute(
        'select * from dp_project_gateway inner join dp_project_strategy on dp_project_gateway.id=dp_project_strategy.project_gateway_id;')
    row = cursor.fetchone()
    print('---------------------- keys ----------------------------')
    print(list(row.keys()))
    print(f'number of keys: {len(list(row.keys()))}')
    print('---------------------- values --------------------------')
    print(row.values())
    print(f'number of values: {len(row.values())}')
    print('--------------------------------------------------------')

Finally, this is the output of the script in the terminal:
1704855818948
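A plausible explanation (assuming pyrqlite's Row type is dict-backed): a join of two tables that both define `id` and `name` yields duplicate column names, and a dict keeps only one entry per key, so `keys()` ends up shorter than the raw value payload. The snippet below illustrates that collapse with made-up values, not pyrqlite's actual code:

```python
# Duplicate column names collapse when stored in a dict-like row.
from collections import OrderedDict

pairs = [('id', 1), ('name', 'gw1'), ('id', 7), ('name', 'strategy1')]
row = OrderedDict(pairs)  # later duplicates overwrite earlier keys

print(len(row.keys()))    # 2 distinct keys
print(len(pairs))         # 4 values in the payload
```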

Suggestion: sqlite3 row compatibility

Thank you for pyrqlite.

By default, sqlite3 returns data as a tuple or a list of tuples.
My suggestion is to add a parameter that makes pyrqlite compatible with sqlite3:

  1. Add a parameter 'compat_rows' to connections.py.
  2. Then in cursors.py, replace the code:

    for payload_row in values:
        row = []
        for field, converter, value in zip(fields, converters, payload_row):
            row.append((field, (value if converter is None
                                else converter(value))))
        rows.append(Row(row))

    with:

    if self._connection.compat_rows is None:
        for payload_row in values:
            row = []
            for field, converter, value in zip(fields, converters, payload_row):
                row.append((field, (value if converter is None
                                    else converter(value))))
            rows.append(Row(row))
    elif self._connection.compat_rows == 'sqlite3':
        for payload_row in values:
            row = []
            for field, converter, value in zip(fields, converters, payload_row):
                row.append(value if converter is None
                           else converter(value))
            rows.append(tuple(row))
    else:
        raise ValueError('compat_rows must be None or sqlite3')

Thank you.

No ability to use this package via pip vcs

I have tried a couple different methods of installing this package through pip, because I don't believe that having to download the source and build it by hand is really the way to do things.

Pip has the ability to install from VCS; however, this repository does not have a PEP 440-compliant version.

Adding git+https://github.com/rqlite/[email protected] to my requirements.txt file ends up failing with

  WARNING: Built wheel for pyrqlite is invalid: Metadata 1.2 mandates PEP 440 version, but 'HEAD' is not
Failed to build pyrqlite
ERROR: Could not build wheels for pyrqlite, which is required to install pyproject.toml-based projects

There are differences between accessing the rqlite master and slaves with pyrqlite

I use SQLAlchemy and pyrqlite to access an rqlite database. Here is my SQLAlchemy engine:
engine = create_engine('rqlite+pyrqlite://' + ip + ':4001/', poolclass=NullPool)
It works properly when there is only one rqlite node. With three rqlite nodes, I found it only works when ip is the rqlite master's ip. When ip is a slave node, I get the following error from SQLAlchemy:
No JSON object could be decoded
My rqlited starts as:

rqlited -http-addr 0.0.0.0:4001 -http-adv-addr 192.168.2.14:4001 -raft-addr 0.0.0.0:4002 -raft-adv-addr 192.168.2.14:4002 ~/node.1
rqlited -http-addr 0.0.0.0:4001 -http-adv-addr 192.168.2.13:4001 -raft-addr 0.0.0.0:4002 -raft-adv-addr 192.168.2.13:4002 -join http://192.168.2.14:4001 ~/node.2
rqlited -http-addr 0.0.0.0:4001 -http-adv-addr 192.168.2.15:4001 -raft-addr 0.0.0.0:4002 -raft-adv-addr 192.168.2.15:4002 -join http://192.168.2.14:4001 ~/node.3

I don't think this is an rqlite server problem, because when I use the rqlite client with the --host parameter, it works properly. So is this a pyrqlite issue?

Encoding incompatibility when running cursor.py

I am trying to use rqlite to replace SQLite in my parallel computing project, and I encountered an encoding issue in pyrqlite/cursors.py:

Traceback (most recent call last):
File "mmm_tuner.py", line 58, in
GccFlagsTuner.main(argparser.parse_args())
File "/home/sx233/test/opentuner/opentuner/measurement/interface.py", line 291, in main
return TuningRunMain(cls(args, *pargs, **kwargs), args).main()
File "/home/sx233/test/opentuner/opentuner/tuningrunmain.py", line 194, in main
self.init()
File "/home/sx233/test/opentuner/opentuner/tuningrunmain.py", line 176, in init
self.search_driver = self.search_driver_cls(**driver_kwargs)
File "/home/sx233/test/opentuner/opentuner/search/driver.py", line 74, in init
self.session.flush()
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 2171, in flush
self._flush(objects)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 2291, in _flush
transaction.rollback(_capture_exception=True)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py", line 66, in exit
compat.reraise(exc_type, exc_value, exc_tb)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 2255, in _flush
flush_context.execute()
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 389, in execute
rec.execute(self)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 548, in execute
uow
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 181, in save_obj
mapper, table, insert)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 835, in _emit_insert_statements
execute(statement, params)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 945, in execute
return meth(self, multiparams, params)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/sql/elements.py", line 263, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1053, in _execute_clauseelement
compiled_sql, distilled_params
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1189, in _execute_context
context)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1405, in _handle_dbapi_exception
util.reraise(*exc_info)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
context)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/sqlalchemy/engine/default.py", line 470, in do_execute
cursor.execute(statement, parameters)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/pyrqlite/cursors.py", line 114, in execute
operation = self._substitute_params(operation, parameters)
File "/home/sx233/datuner/releases/Linux_x86_64/install/lib/python2.7/site-packages/pyrqlite/cursors.py", line 102, in _substitute_params
return ''.join(subst)
UnicodeDecodeError: 'ascii' codec can't decode byte 0xda in position 2: ordinal not in range(128)

A similar error occurs when I set the default Python encoding to "utf-8":

UnicodeDecodeError: 'utf8' codec can't decode byte 0xda in position 2: invalid continuation byte

and SQLAlchemy fails to insert and rolls back:

[ 0s] INFO sqlalchemy.engine.base.Engine: INSERT INTO tuning_run (uuid, program_version_id, machine_class_id, input_class_id, name, args, objective, state, start_date, end_date, final_config_id) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
2017-09-11 10:14:32,577 INFO sqlalchemy.engine.base.Engine ('0d0a42787b78461a9b6ba19ba7199cf5', 1, None, None, 'unnamed', 'x\xdaMQ=o\xd4@\x10%!\xc7%\xbe|\x1fp\x04\x1aJ\x10\x12\t\x12U\xba\x90\xf2t\xae\x982Z\x8d\xd7s\xf6H\xeb\xf5\xce\xce\xba\xb8H\x91\xa0\xc3\xff\x97\x1f\xc0 ... (760 characters truncated) ... 91\xc5\xb2\x80"p\xc8B\x9e}#\xef\x96/\xf1\xb0\x9d\x12F\xdb\x9a\xfb\xf7\x18\xe5[\x92\xb3\x11f\xcf\xf6\x96\xf7%L}o\xea!\xa8|\x18\x87\xea\xeb?\xf1,\xac(', '\x80\x02copentuner.search.objective\nMinimizeTime\nq\x01)\x81q\x02}q\x03U\x06driverq\x04Nsb.', 'QUEUED', '2017-09-11 10:14:32.559986', None, None) [ 0s] INFO sqlalchemy.engine.base.Engine: ('0d0a42787b78461a9b6ba19ba7199cf5', 1, None, None, 'unnamed', 'x\xdaMQ=o\xd4@\x10%!\xc7%\xbe|\x1fp\x04\x1aJ\x10\x12\t\x12U\xba\x90\xf2t\xae\x982Z\x8d\xd7s\xf6H\xeb\xf5\xce\xce\xba\xb8H\x91\xa0\xc3\xff\x97\x1f\xc0 ... (760 characters truncated) ... 91\xc5\xb2\x80"p\xc8B\x9e}#\xef\x96/\xf1\xb0\x9d\x12F\xdb\x9a\xfb\xf7\x18\xe5[\x92\xb3\x11f\xcf\xf6\x96\xf7%L}o\xea!\xa8|\x18\x87\xea\xeb?\xf1,\xac(', '\x80\x02copentuner.search.objective\nMinimizeTime\nq\x01)\x81q\x02}q\x03U\x06driverq\x04Nsb.', 'QUEUED', '2017-09-11 10:14:32.559986', None, None)
2017-09-11 10:14:32,578 INFO sqlalchemy.engine.base.Engine ROLLBACK
[ 0s] INFO sqlalchemy.engine.base.Engine: ROLLBACK

rqlite does not like empty DATETIME columns

I get the following error unless I set defaults on DATETIME columns:

venv/lib/python3.8/site-packages/pyrqlite/extensions.py", line 111, in <lambda> 'DATETIME': lambda x: x.replace('T', ' ').rstrip('Z')
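The converter quoted in the error assumes it receives a string; a NULL column value arrives as None, and None.replace(...) raises AttributeError. A None-safe variant might look like this (a sketch, not pyrqlite's actual fix):

```python
# None-safe version of the DATETIME converter from pyrqlite/extensions.py.
def convert_datetime(value):
    if value is None:
        return None  # empty DATETIME column: pass NULL through unchanged
    return value.replace('T', ' ').rstrip('Z')
```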

Client should route request to leader on receipt of 301

The 3.0 release series of rqlite now returns a 301, with the location of the leader set in a header in the response, if the node receiving the request is not the leader. For example:

$ curl -G -v 'localhost:4004/db/query?pretty&timings' --data-urlencode 'q=SELECT * FROM foo'
*   Trying ::1...
* connect to ::1 port 4004 failed: Connection refused
*   Trying 127.0.0.1...
* Connected to localhost (127.0.0.1) port 4004 (#0)
> GET /db/query?pretty&timings&q=SELECT%20%2A%20FROM%20foo HTTP/1.1
> Host: localhost:4004
> User-Agent: curl/7.43.0
> Accept: */*
> 
< HTTP/1.1 301 Moved Permanently
< Location: http://localhost:4001/db/query
< Date: Mon, 02 May 2016 19:40:20 GMT
< Content-Length: 65
< Content-Type: text/html; charset=utf-8
< 
<a href="http://localhost:4001/db/query">Moved Permanently</a>.

* Connection #0 to host localhost left intact

The Python client could detect when this happens, and automatically re-issue the request to the leader.
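A sketch of the suggested behaviour: on a 301, re-issue the request against the host and port named in the Location header. The helper below only shows the redirect-target parsing; wiring it into pyrqlite's request path is left assumed.

```python
# Extract the leader's (host, port) from a 301 Location header such as
# 'http://localhost:4001/db/query', so the request can be re-issued there.
from urllib.parse import urlparse


def redirect_target(location):
    parsed = urlparse(location)
    # Fall back to the scheme's default port when none is given.
    port = parsed.port or (443 if parsed.scheme == 'https' else 80)
    return parsed.hostname, port
```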

If transactions are implemented in rqlite, add support

The plan right now is to implement transaction support in rqlite 5.0. If this work completes, this library should be updated to take advantage of it. See rqlite/rqlite#266 for details of the proposed design.

When a cursor is first created, and assuming the system is not in autocommit mode, a transaction should be started (hit the proposed API on the rqlite node). Store the transaction ID on the cursor, which will prevent other cursors from writing within that transaction. The connection can commit or rollback the transaction at anytime, by hitting the same endpoint. Whether that call should supply the transaction ID is TBD.

It would be worth studying the mainstream SQLite DB API 2.0 driver to learn its behaviour.

test_PragmaTableInfo fails with rqlite 7.2.0

This test failure appeared after upgrade from rqlite 7.1.0 to 7.2.0:

======================================================================
FAIL: test_PragmaTableInfo (__main__.SqliteTypeTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "src/test/test_types.py", line 124, in test_PragmaTableInfo
    self.assertEqual(rows,
AssertionError: Lists differ: [(0, [34 chars]s', 'varchar', 0, None, 0), (2, 'f', 'number',[38 chars], 0)] != [(0, [34 chars]s', 'VARCHAR', 0, None, 0), (2, 'f', 'NUMBER',[38 chars], 0)]

First differing element 1:
(1, 's', 'varchar', 0, None, 0)
(1, 's', 'VARCHAR', 0, None, 0)

  [(0, 'i', 'INTEGER', 0, None, 0),
-  (1, 's', 'varchar', 0, None, 0),
?            ^^^^^^^

+  (1, 's', 'VARCHAR', 0, None, 0),
?            ^^^^^^^

-  (2, 'f', 'number', 0, None, 0),
?            ^^^^^^

+  (2, 'f', 'NUMBER', 0, None, 0),
?            ^^^^^^

   (3, 'b', 'BLOB', 0, None, 0)]

How to implement connection.create_function() ?

After implementing most of Python's sqlite3 type extensions, I tried to adapt a Django DB backend for rqlite based on the SQLite one. But Django's stock SQLite backend uses the non-standard Connection.create_function() to add REGEXP and some more functionality to SQLite, which has no server-side PL features AFAIK.

As this works by connecting a callable to a SQLite custom function, and such callables cannot be transferred to the rqlite server, I am now clueless about how to implement/emulate this feature.

I am halting development of this Django backend. This issue is just a tip/heads-up for anyone trying to do the same in the future. If this could be solved, it seems quite possible to have a Django backend over a highly available relational database based on SQLite.

Thanks @otoolep for the help during the implementation of the types extensions (#9)

Multiple endpoint support?

It would be nice if I could connect to whichever endpoint is online. Something like this:

connection = dbapi2.connect(
    hosts=['db1', 'db2', 'db3'],
    port=4001,
)

Otherwise, the endpoint has to be shared by multiple nodes in some sort of failover setup in order to avoid a single point of failure.
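The proposed `hosts` list could be handled with a simple client-side fallback loop: try each endpoint in turn and keep the first one that connects. In this sketch, `connect` stands in for pyrqlite.dbapi2.connect and is passed in as a parameter (an assumption made so the logic is self-contained and testable):

```python
# Try each host in order; return the first successful connection.
def connect_any(hosts, port, connect):
    last_error = None
    for host in hosts:
        try:
            return connect(host=host, port=port)
        except OSError as exc:  # connection refused, timeout, DNS failure...
            last_error = exc
    raise last_error or OSError('no hosts given')
```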

Prepared statement parameterization failure on multiple-item inserts

Hello,

I just ran into this issue when trying to insert a large amount of rows into an rqlite database using multiple-item INSERTs.

Example code:

#!/usr/bin/env python3

import pyrqlite.dbapi2

if __name__ == '__main__':
	con = pyrqlite.dbapi2.connect(host="127.0.0.1", port=4001)
	cur = con.cursor()
	cur.execute("CREATE TABLE bug_demonstration (id INT UNSIGNED PRIMARY KEY, name TEXT)")
	con.commit()
	cur.execute("INSERT INTO bug_demonstration (name) VALUES (?), (?)", ['why am I being seen as a substitution token?', "little bobby tables sends his regards"])

When run against a stock rqlite server compiled from the current GitHub code and run in a completely vanilla fashion (rqlited bug_database), I get the following error:

{"error": "near \"little\": syntax error"}
Traceback (most recent call last):
  File "pyrqlite_bug.py", line 10, in <module>
    cur.execute("INSERT INTO bug_demonstration (name) VALUES (?), (?)", ['why am I being seen as a substitution token?', "little bobby tables sends his regards"])
  File "/usr/local/lib/python3.6/dist-packages/pyrqlite-HEAD-py3.6.egg/pyrqlite/cursors.py", line 178, in execute
sqlite3.Error: {"error": "near \"little\": syntax error"}

I used Wireshark to investigate the request coming down the line, which looked like this:

POST /db/execute?transaction HTTP/1.1
Host: 127.0.0.1:4001
Accept-Encoding: identity
Content-Length: 139
Content-Type: application/json

["INSERT INTO bug_demonstration (name) VALUES ('why am I being seen as a substitution token'little bobby tables sends his regards''), (?)"]

I figured it'd be a good idea to report this, as any parameterization or data-escaping errors have serious security implications.

Cheers,
--Jays
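The capture above suggests one '?' placeholder consumed both parameters. A correct (if naive) qmark substitution pairs each placeholder with exactly one parameter and doubles single quotes inside string values. This is an illustrative sketch, not pyrqlite's actual `_substitute_params`, and it ignores '?' characters that appear inside SQL string literals in the statement itself:

```python
# Naive qmark parameter substitution: one placeholder, one parameter.
def substitute_params(operation, parameters):
    parts = operation.split('?')
    if len(parts) - 1 != len(parameters):
        raise ValueError('placeholder/parameter count mismatch')
    out = [parts[0]]
    for param, tail in zip(parameters, parts[1:]):
        if param is None:
            out.append('NULL')
        elif isinstance(param, str):
            # Double single quotes: standard SQL string escaping.
            out.append("'" + param.replace("'", "''") + "'")
        else:
            out.append(str(param))
        out.append(tail)
    return ''.join(out)
```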

Tests always pass. Even when broken.

I tried to add a test to the tests folder and ran setup.py test. It passed, but the test was just a raise RuntimeError.

The test environment runs the tests but swallows the errors.

SOCKS Proxy support

Is there a way that I can connect to rqlite through a SOCKS proxy?

Thanks!

PyPI release

There is no pyrqlite on the cheeseshop. Will you create one? Or is it OK if I do?

Build is crashing (because of no `license`?)

With git 9a35768.

$ python -m build -w
* Creating virtualenv isolated environment...
* Installing packages in isolated environment... (setuptools, wheel)
* Getting build dependencies for wheel...
/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/config/_apply_pyprojecttoml.py:75: _MissingDynamic: `license` defined outside of `pyproject.toml` is ignored.
!!

        ********************************************************************************
        The following seems to be defined outside of `pyproject.toml`:

        `license = 'MIT'`

        According to the spec (see the link below), however, setuptools CANNOT
        consider this value unless `license` is listed as `dynamic`.

        https://packaging.python.org/en/latest/specifications/declaring-project-metadata/

        To prevent this problem, you can list `license` under `dynamic` or alternatively
        remove the `[project]` table from your file and rely entirely on other means of
        configuration.
        ********************************************************************************

!!
  _handle_missing_dynamic(dist, project_table)
/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/config/_apply_pyprojecttoml.py:75: _MissingDynamic: `maintainers` defined outside of `pyproject.toml` is ignored.
!!

        ********************************************************************************
        The following seems to be defined outside of `pyproject.toml`:

        `maintainers = 'Zac Medico'`

        According to the spec (see the link below), however, setuptools CANNOT
        consider this value unless `maintainers` is listed as `dynamic`.

        https://packaging.python.org/en/latest/specifications/declaring-project-metadata/

        To prevent this problem, you can list `maintainers` under `dynamic` or alternatively
        remove the `[project]` table from your file and rely entirely on other means of
        configuration.
        ********************************************************************************

!!
  _handle_missing_dynamic(dist, project_table)
/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/config/expand.py:133: SetuptoolsWarning: File '/tmp/pyrqlite/README.md' cannot be found
  return '\n'.join(
Traceback (most recent call last):
  File "/usr/lib/python3.12/site-packages/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
    main()
  File "/usr/lib/python3.12/site-packages/pyproject_hooks/_in_process/_in_process.py", line 335, in main
    json_out['return_val'] = hook(**hook_input['kwargs'])
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/site-packages/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
    return hook(config_settings)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/build_meta.py", line 325, in get_requires_for_build_wheel
    return self._get_build_requires(config_settings, requirements=['wheel'])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
    self.run_setup()
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/build_meta.py", line 311, in run_setup
    exec(code, locals())
  File "<string>", line 72, in <module>
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/__init__.py", line 103, in setup
    return distutils.core.setup(**attrs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/_distutils/core.py", line 159, in setup
    dist.parse_config_files()
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/_virtualenv.py", line 23, in parse_config_files
    result = old_parse_config_files(self, *args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/dist.py", line 627, in parse_config_files
    pyprojecttoml.apply_configuration(self, filename, ignore_option_errors)
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/config/pyprojecttoml.py", line 67, in apply_configuration
    return _apply(dist, config, filepath)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/config/_apply_pyprojecttoml.py", line 56, in apply
    _apply_project_table(dist, config, root_dir)
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/config/_apply_pyprojecttoml.py", line 82, in _apply_project_table
    corresp(dist, value, root_dir)
  File "/tmp/build-env-ha5_7xmd/lib/python3.12/site-packages/setuptools/config/_apply_pyprojecttoml.py", line 183, in _license
    _set_config(dist, "license", val["text"])
                                 ~~~^^^^^^^^
KeyError: 'text'

ERROR Backend subprocess exited when trying to invoke get_requires_for_build_wheel

Can't install in docker

Dockerfile:

FROM python:3.8.10
COPY . /app
WORKDIR /app
RUN git clone https://github.com/rqlite/pyrqlite.git
RUN pip install ./pyrqlite

Result:

 => ERROR [5/6] RUN pip install ./pyrqlite                                                                                               6.7s
------                                                                                                                                        
 > [5/6] RUN pip install ./pyrqlite:                                                                                                          
#9 1.694 Processing ./pyrqlite                                                                                                                
#9 1.695   DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
#9 1.695    pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
#9 1.833   Installing build dependencies: started
#9 5.540   Installing build dependencies: finished with status 'done'
#9 5.543   Getting requirements to build wheel: started
#9 5.857   Getting requirements to build wheel: finished with status 'done'
#9 5.860     Preparing wheel metadata: started
#9 6.170     Preparing wheel metadata: finished with status 'done'
#9 6.174 Building wheels for collected packages: pyrqlite
#9 6.176   Building wheel for pyrqlite (PEP 517): started
#9 6.513   Building wheel for pyrqlite (PEP 517): finished with status 'done'
#9 6.514   Created wheel for pyrqlite: filename=pyrqlite-HEAD-py3-none-any.whl size=13130 sha256=079284a3f3d5ed725b9b427b1a8b355ef775ae06522d1fd4391b324a74d2c7b7
#9 6.514   Stored in directory: /tmp/pip-ephem-wheel-cache-_ak8qae9/wheels/64/25/12/591e62106996869404e56b8de62562ffa87c421d7a464babcd
#9 6.516   WARNING: Built wheel for pyrqlite is invalid: Metadata 1.2 mandates PEP 440 version, but 'HEAD' is not
#9 6.516 Failed to build pyrqlite
#9 6.516 ERROR: Could not build wheels for pyrqlite which use PEP 517 and cannot be installed directly
#9 6.523 WARNING: You are using pip version 21.1.3; however, version 22.3 is available.
#9 6.523 You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
------
executor failed running [/bin/sh -c pip install ./pyrqlite]: exit code: 1

BinaryConverterTests.test_CheckBinaryInputForConverter fails with both pyrqlite and sqlite3

This test fails with pyrqlite as follows:

____________ BinaryConverterTests.test_CheckBinaryInputForConverter ____________

self = <test_types.BinaryConverterTests testMethod=test_CheckBinaryInputForConverter>

    def test_CheckBinaryInputForConverter(self):
        testdata = b"abcdefg" * 10
        compressed = zlib.compress(testdata)
>       result = self.con.execute('select ? as "x [bin]"', (compressed,)).fetchone()[0]

src/test/test_types.py:397: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
src/pyrqlite/connections.py:100: in execute
    return cursor.execute(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <pyrqlite.cursors.Cursor object at 0x7efbf49d79d0>
operation = 'select \'x\x9cKLJNIMKO\xa4\x8c\x02\x00\xca\x0f\x1bY\' as "x [bin]"'
parameters = ('x\x9cKLJNIMKO\xa4\x8c\x02\x00\xca\x0f\x1bY',)

    def execute(self, operation, parameters=None):
    
        if parameters:
            operation = self._substitute_params(operation, parameters)
    
        command = self._get_sql_command(operation)
        if command in ('SELECT', 'PRAGMA'):
            payload = self._request("GET",
                                    "/db/query?" + _urlencode({'q': operation}))
        else:
            payload = self._request("POST", "/db/execute?transaction",
                                    headers={'Content-Type': 'application/json'}, body=json.dumps([operation]))
    
        last_insert_id = None
        rows_affected = -1
        payload_rows = {}
        try:
            results = payload["results"]
        except KeyError:
            pass
        else:
            rows_affected = 0
            for item in results:
                if 'error' in item:
                    logging.error(json.dumps(item))
>                   raise Error(json.dumps(item))
E                   Error: {"error": "unrecognized token: \"'x\ufffdKLJNIMKO\ufffd\ufffd\u0002\""}

It also fails with python's own sqlite3 module, as follows:

____________ BinaryConverterTests.test_CheckBinaryInputForConverter ____________

self = <test_types.BinaryConverterTests testMethod=test_CheckBinaryInputForConverter>

    def test_CheckBinaryInputForConverter(self):
        testdata = b"abcdefg" * 10
        compressed = zlib.compress(testdata)
>       result = self.con.execute('select ? as "x [bin]"', (compressed,)).fetchone()[0]
E       ProgrammingError: You must not use 8-bit bytestrings unless you use a text_factory that can interpret 8-bit bytestrings (like text_factory = str). It is highly recommended that you instead just switch your application to Unicode strings.

src/test/test_types.py:397: ProgrammingError

@alanjds

ColNamesTests.test_CheckCaseInConverterName fails

I'm having difficulty trying to make test_CheckCaseInConverterName and test_CheckColName both succeed simultaneously (though they both succeed with python's own sqlite3 module).

=================================== FAILURES ===================================
_________________ ColNamesTests.test_CheckCaseInConverterName __________________

self = <test_types.ColNamesTests testMethod=test_CheckCaseInConverterName>

    def test_CheckCaseInConverterName(self):
>       self.cur.execute("select 'other' as \"x [b1b1]\"")

src/test/test_types.py:298: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
src/pyrqlite/cursors.py:178: in execute
    else converter(value))))
src/pyrqlite/extensions.py:180: in _decode_base64_converter
    return converter(value.decode('base64'))
/usr/lib64/python2.7/encodings/base64_codec.py:42: in base64_decode
    output = base64.decodestring(input)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

s = 'other'

    def decodestring(s):
        """Decode a string."""
>       return binascii.a2b_base64(s)
E       Error: Incorrect padding

As you can see, it fails because it's trying to decode 'other' as base64. The sql which does not need base64 decoding is as follows:

create table test(x foo);
select 'other' as "x [b1b1]";

Meanwhile, test_CheckColName would fail without the base64 decoding support. The SQL that triggers the need for base64 decoding is as follows:

create table test(x foo);
insert into test(x) values ('xxx');
select x as "x [bar]" from test;

Feeding the SQL statements into the rqlite v3.14.0 shell confirms the behavior, where 'eHh4' is the base64 encoding of 'xxx':

127.0.0.1:4001> create table test(x foo);
1 row affected (0.000543 sec)
127.0.0.1:4001> select 'other' as "x [b1b1]";
+----------+
| x [b1b1] |
+----------+
| other    |
+----------+
127.0.0.1:4001> insert into test(x) values ('xxx');
1 row affected (0.000206 sec)
127.0.0.1:4001> select x as "x [bar]" from test;
+---------+
| x [bar] |
+---------+
| eHh4    |
+---------+
127.0.0.1:4001>

I see that rqlite/rqlite#265 reports the issue for rqlited, and the relevant code is visible in rqlite/rqlite#244.
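One tolerant decoding strategy (a sketch only, not pyrqlite's actual code; the function name is hypothetical) would attempt base64 decoding and fall back to the raw value when the payload is not valid base64. Note that this cannot fully resolve the ambiguity shown above, since a plain string such as 'eHh4' is itself valid base64:

```python
import base64
import binascii


def decode_maybe_base64(value):
    """Try to base64-decode value; return it unchanged if it isn't valid base64.

    This is only a heuristic: a short plain string like 'eHh4' is itself
    valid base64, so false positives remain possible.
    """
    try:
        return base64.b64decode(value, validate=True)
    except (binascii.Error, ValueError):
        # Not valid base64 (bad padding or illegal characters), e.g. 'other'.
        return value
```

With this sketch, `decode_maybe_base64('other')` returns the string unchanged instead of raising "Incorrect padding", while `decode_maybe_base64('eHh4')` still decodes to `b'xxx'` — illustrating why a heuristic alone cannot distinguish the two cases reliably.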

@alanjds

Missing Response Code Handling

Hi,

I've found missing error handling that surfaces when a cluster fails (two nodes rebooting in a three-node cluster):

rqlite logs:

Feb 20 17:23:59 <hostname> rqlited[682]: 2023-02-20T17:23:59.030Z [WARN]  raft: Election timeout reached, restarting election
Feb 20 17:23:59 <hostname>[682]: 2023-02-20T17:23:59.030Z [INFO]  raft: entering candidate state: node="Node at [dead::beef]:4002 [Candidate]" term=8998

traceback:

Traceback (most recent call last):
  File "test.py", line 45, in <module>
    print (conn.execute(query).fetchone())
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 980, in execute
    return meth(self, multiparams, params)
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 273, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1093, in _execute_clauseelement
    ret = self._execute_context(
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1239, in _execute_context
    self._handle_dbapi_exception(
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1460, in _handle_dbapi_exception
    util.reraise(*exc_info)
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 277, in reraise
    raise value
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1235, in _execute_context
    self.dialect.do_execute(
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 536, in do_execute
    cursor.execute(statement, parameters)
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/pyrqlite/cursors.py", line 164, in execute
    payload = self._request("GET",
  File "/home/schweizerbe/code/.venv/lib/python3.8/site-packages/pyrqlite/cursors.py", line 79, in _request
    response_json = json.loads(
  File "/usr/lib/python3.8/json/__init__.py", line 370, in loads
    return cls(**kw).decode(s)
  File "/usr/lib/python3.8/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.8/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

The actual response body is empty, which causes the JSONDecodeError.
Digging into _request(), there is no HTTP status code check before json.loads is called, so an empty or non-JSON response fails with an opaque error instead of a DB-API exception.
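A minimal fix could validate the response before decoding it. The sketch below is an assumption about the shape of such a check (the function name, and using OperationalError as the exception, are illustrative; they are not pyrqlite's actual implementation):

```python
import json


class OperationalError(Exception):
    """DB-API 2.0 style error for connection/cluster failures (placeholder)."""


def parse_response(status, body):
    """Validate an HTTP response before JSON-decoding it.

    `status` is the integer HTTP status code and `body` the raw response
    bytes -- stand-ins for what a method like Cursor._request() receives.
    """
    if status != 200:
        # Surface the server's status and (truncated) body instead of
        # letting json.loads fail on an error page.
        raise OperationalError(
            "rqlite returned HTTP %d: %r" % (status, body[:200]))
    if not body:
        # An empty body is what the traceback above ran into while the
        # cluster was mid-election and had no leader.
        raise OperationalError(
            "rqlite returned an empty response (cluster may have no leader)")
    return json.loads(body)
```

With this guard in place, the election scenario above would raise an OperationalError that SQLAlchemy can map to a database error, rather than an unhandled json.decoder.JSONDecodeError.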
