
Flaky test_max_csv_mb test · datasette · 3 comments · OPEN

simonw commented on July 24, 2024
Flaky test_max_csv_mb test


Comments (3)

simonw commented on July 24, 2024

datasette/tests/test_csv.py

Lines 192 to 216 in 64a125b

def test_max_csv_mb(app_client_csv_max_mb_one):
    # This query deliberately generates a really long string
    # should be 100*100*100*2 = roughly 2MB
    response = app_client_csv_max_mb_one.get(
        "/fixtures.csv?"
        + urllib.parse.urlencode(
            {
                "sql": """
            select group_concat('ab', '')
            from json_each(json_array({lots})),
                json_each(json_array({lots})),
                json_each(json_array({lots}))
            """.format(
                    lots=", ".join(str(i) for i in range(100))
                ),
                "_stream": 1,
                "_size": "max",
            }
        ),
    )
    # It's a 200 because we started streaming before we knew the error
    assert response.status == 200
    # Last line should be an error message
    last_line = [line for line in response.body.split(b"\r\n") if line][-1]
    assert last_line.startswith(b"CSV contains more than")
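The size estimate in that comment checks out: cross-joining three 100-row json_each(json_array(...)) tables yields 100**3 = 1,000,000 rows, and group_concat appends the two-byte string 'ab' once per row, so the result is roughly 2,000,000 bytes, well over the 1 MB limit the fixture's name implies (max_csv_mb=1). A standalone way to verify the arithmetic, assuming a SQLite build with the JSON1 functions (this sketch is illustrative, not part of the test suite):

    import sqlite3

    lots = ", ".join(str(i) for i in range(100))
    sql = """
        select length(group_concat('ab', ''))
        from json_each(json_array({lots})),
             json_each(json_array({lots})),
             json_each(json_array({lots}))
    """.format(lots=lots)

    conn = sqlite3.connect(":memory:")
    # 100**3 rows, 2 bytes each: prints 2000000
    print(conn.execute(sql).fetchone()[0])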


simonw commented on July 24, 2024

It failed again, this time with a 500 rather than a 400, and with a different version combination:

Flaky test failed again in a different way: https://github.com/asg017/datasette/actions/runs/9503497141/job/26194049093

    def test_max_csv_mb(app_client_csv_max_mb_one):
        # This query deliberately generates a really long string
        # should be 100*100*100*2 = roughly 2MB
        response = app_client_csv_max_mb_one.get(
            "/fixtures.csv?"
            + urllib.parse.urlencode(
                {
                    "sql": """
                select group_concat('ab', '')
                from json_each(json_array({lots})),
                    json_each(json_array({lots})),
                    json_each(json_array({lots}))
                """.format(
                        lots=", ".join(str(i) for i in range(100))
                    ),
                    "_stream": 1,
                    "_size": "max",
                }
            ),
        )
        # It's a 200 because we started streaming before we knew the error
>       assert response.status == 200
E       assert 500 == 200
E        +  where 500 = <datasette.utils.testing.TestResponse object at 0x7fe7fb6d0790>.status

/home/runner/work/datasette/datasette/tests/test_csv.py:213: AssertionError
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 306, in sql_operation_in_thread
    cursor.execute(sql, params if params is not None else {})
sqlite3.OperationalError: interrupted

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/runner/work/datasette/datasette/datasette/app.py", line 1732, in route_path
    response = await view(request, send)
  File "/home/runner/work/datasette/datasette/datasette/app.py", line 1899, in async_view_for_class
    return await async_call_with_supported_arguments(
  File "/home/runner/work/datasette/datasette/datasette/utils/__init__.py", line 1022, in async_call_with_supported_arguments
    return await fn(*call_with)
  File "/home/runner/work/datasette/datasette/datasette/views/base.py", line 89, in __call__
    return await handler(request, datasette)
  File "/home/runner/work/datasette/datasette/datasette/views/database.py", line 61, in get
    return await QueryView()(request, datasette)
  File "/home/runner/work/datasette/datasette/datasette/views/base.py", line 89, in __call__
    return await handler(request, datasette)
  File "/home/runner/work/datasette/datasette/datasette/views/database.py", line 564, in get
    return await stream_csv(datasette, fetch_data_for_csv, request, db.name)
  File "/home/runner/work/datasette/datasette/datasette/views/base.py", line 441, in stream_csv
    response_or_template_contexts = await fetch_data(request)
  File "/home/runner/work/datasette/datasette/datasette/views/database.py", line 560, in fetch_data_for_csv
    results = await db.execute(sql, params, truncate=True)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 336, in execute
    results = await self.execute_fn(sql_operation_in_thread)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 282, in execute_fn
    return await asyncio.get_event_loop().run_in_executor(
  File "/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 280, in in_thread
    return fn(conn)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 319, in sql_operation_in_thread
    raise QueryInterrupted(e, sql, params)
datasette.database.QueryInterrupted: (OperationalError('interrupted'), "\n            select group_concat('ab', '')\n            from json_each(json_array(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99)),\n                json_each(json_array(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99)),\n                json_each(json_array(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99))\n            ", {'_stream': '1', '_size': 'max'})
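The root cause in that traceback is sqlite3.OperationalError: interrupted, which SQLite raises when a progress handler aborts a running query; that is, as I understand it, the mechanism datasette uses to enforce its sql_time_limit_ms setting, so a slow CI runner can hit the time limit before any CSV rows have been streamed. A standalone sketch of the mechanism, using only the stdlib sqlite3 module (an illustration, not datasette's actual code):

    import sqlite3

    conn = sqlite3.connect(":memory:")

    def abort_query():
        # Returning a non-zero value from a progress handler interrupts
        # the running query with OperationalError: interrupted
        return 1

    conn.set_progress_handler(abort_query, 1000)  # called every 1000 VM ops

    lots = ", ".join(str(i) for i in range(100))
    try:
        conn.execute(
            "select count(*) from json_each(json_array({l})), "
            "json_each(json_array({l})), json_each(json_array({l}))".format(l=lots)
        ).fetchone()
    except sqlite3.OperationalError as e:
        print(e)  # interrupted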


simonw commented on July 24, 2024

That second failure looks like we need to catch the datasette.database.QueryInterrupted exception somewhere.
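One possible shape for that fix, sketched against the stream_csv code path shown in the traceback: catch QueryInterrupted around the initial fetch, before any bytes have been streamed, and turn it into a clean 400 rather than an unhandled 500. This is a hypothetical patch, not a committed change:

    # Hypothetical sketch of a fix inside stream_csv (datasette/views/base.py);
    # not the actual committed change.
    from datasette.database import QueryInterrupted
    from datasette.utils.asgi import Response

    async def stream_csv(datasette, fetch_data, request, database):
        try:
            # This is the call that blew up in the traceback above
            response_or_template_contexts = await fetch_data(request)
        except QueryInterrupted:
            # The query hit the SQL time limit before streaming started,
            # so we can still return a proper error status instead of a 500
            return Response.text("SQL query was interrupted", status=400)
        ...  # existing streaming logic continues unchanged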

