planetscale / database-js

A Fetch API-compatible PlanetScale database driver

Home Page: https://planetscale.com/docs/tutorials/planetscale-serverless-driver

License: Apache License 2.0

TypeScript 100.00%
database edge-computing mysql serverless vercel

database-js's Introduction

PlanetScale serverless driver for JavaScript

A Fetch API-compatible PlanetScale database driver for serverless and edge compute platforms that require HTTP external connections, such as Cloudflare Workers or Vercel Edge Functions.

Installation

npm install @planetscale/database

Usage

import { connect } from '@planetscale/database'

const config = {
  host: '<host>',
  username: '<user>',
  password: '<password>'
}

const conn = connect(config)
const results = await conn.execute('select 1 from dual where 1=?', [1])
console.log(results)

Database URL

A single database URL value can be used to configure the host, username, and password values.

import { connect } from '@planetscale/database'

const config = {
  url: process.env['DATABASE_URL'] || 'mysql://user:pass@host'
}

const conn = connect(config)

Connection factory

Use the Client connection factory class to create fresh connections for each transaction or web request handler.

import { Client } from '@planetscale/database'

const client = new Client({
  host: '<host>',
  username: '<user>',
  password: '<password>'
})

const conn = client.connection()
const results = await conn.execute('select 1 from dual')
console.log(results)

Transactions

Use the transaction function to safely perform database transactions. If any unhandled errors are thrown during execution of the transaction, the transaction will be rolled back.

The following example is based on the Slotted Counter Pattern.

import { connect } from '@planetscale/database'

const config = {
  host: '<host>',
  username: '<user>',
  password: '<password>'
}

const conn = connect(config)
const results = await conn.transaction(async (tx) => {
  const whenBranch = await tx.execute('INSERT INTO branches (database_id, name) VALUES (?, ?)', [42, "planetscale"])
  const whenCounter = await tx.execute('INSERT INTO slotted_counters(record_type, record_id, slot, count) VALUES (?, ?, RAND() * 100, 1) ON DUPLICATE KEY UPDATE count = count + 1', ['branch_count', 42])
  return [whenBranch, whenCounter]
})
console.log(results)

Usage with PlanetScale Boost

To enable PlanetScale Boost, run SET @@boost_cached_queries = true once. All subsequent queries run on the same connection will use boost if it's enabled for the query pattern. Non-matching queries will run as normal.

To learn more, visit: Query caching with PlanetScale Boost

const conn = client.connection()
// Enable boost for the connection
await conn.execute('SET @@boost_cached_queries = true')

const results = await conn.execute('...')

// Optionally, you may disable boost for the connection by setting to false
await conn.execute('SET @@boost_cached_queries = false')

Custom fetch function

Node.js version 18 includes a built-in global fetch function. When using an older version of Node.js, you can provide a custom fetch function implementation. We recommend the undici package on which Node's built-in fetch is based.

import { connect } from '@planetscale/database'
import { fetch } from 'undici'

const config = {
  fetch,
  host: '<host>',
  username: '<user>',
  password: '<password>'
}

const conn = connect(config)
const results = await conn.execute('select 1 from dual')
console.log(results)

To leverage HTTP/2, you can use the fetch-h2 shim. fetch-h2 also supports Node.js 12+.

import { connect } from '@planetscale/database'
import { context } from 'fetch-h2'
const { fetch, disconnectAll } = context()

const config = {
  fetch,
  host: '<host>',
  username: '<user>',
  password: '<password>'
}

const conn = connect(config)
const results = await conn.execute('select 1 from dual')
console.log(results)
await disconnectAll()

Custom query parameter format function

Query replacement parameters identified with ? are replaced with escaped values. Named replacement parameters are supported with a colon prefix.

const results1 = await conn.execute('select 1 from dual where 1=?', [42])
const results2 = await conn.execute('select 1 from dual where 1=:id', { id: 42 })

Providing a custom format function overrides the built-in escaping with an external library, like sqlstring.

import { connect } from '@planetscale/database'
import SqlString from 'sqlstring'

const config = {
  format: SqlString.format,
  host: '<host>',
  username: '<user>',
  password: '<password>'
}

const conn = connect(config)
const results = await conn.execute('select 1 from dual where 1=?', [42])
console.log(results)

Custom type casting function

Column values are converted to their corresponding JavaScript data types. This can be customized by providing a cast function.

import { connect, cast } from '@planetscale/database'

function inflate(field, value) {
  if (field.type === 'INT64' || field.type === 'UINT64') {
    return BigInt(value)
  }
  return cast(field, value)
}

const config = {
  cast: inflate,
  host: '<host>',
  username: '<user>',
  password: '<password>'
}

const conn = connect(config)

You can also pass a custom cast function to execute. If present, this will override the cast function set by the connection:

const result = await conn.execute(
  'SELECT userId, SUM(balance) AS balance FROM UserBalanceItem GROUP BY userId',
  {},
  {
    cast: (field, value) => {
      if (field.name === 'balance') {
        return BigInt(value)
      }
      return cast(field, value)
    }
  }
)

Row return values

Rows can be returned as an object or an array of column values by passing an as option to execute.

const query = 'select 1 as one, 2 as two where 1=?'
const objects = await conn.execute(query, [1], { as: 'object' })
// objects.rows => [{one: '1', two: '2'}]

const arrays = await conn.execute(query, [1], { as: 'array' })
// arrays.rows => [['1', '2']]

Development

npm install
npm test

Need help?

Get help from the PlanetScale support team, or join our community on Discord or the GitHub discussion board to see how others are using PlanetScale.

License

Distributed under the Apache 2.0 license. See LICENSE for details.

database-js's People

Contributors

andrewvasilchuk, ayrton, bjesuiter, cdcarson, dependabot[bot], dgraham, iheanyi, jkomyno, jvandenaardweg, mattrobenolt, miketromba, mscoutermarsh, nickvanw, sevinf, skottler, tbarn, yutak23

database-js's Issues

Support for a DATABASE_URL style config

I think either providing a helper such as configFromEnv('DATABASE_URL') or something along these lines to parse a config in the traditional mysql://user:pass@host format would be nice for configuration. We could even be a bit more PlanetScale-specific with our URI, such as pscale://user:pass@host. I don't personally have an opinion on whether this is its own thing like:

connect(configFromEnv())

Or built into the config such as:

connect({uri: env['DATABASE_URL']})

Up to whatever you think is most common or would be most desirable to adopt.

This would allow simplifying an application config with one variable instead of 3 different ones, and we can leverage the already standard DATABASE_URL that's established.

Allow passing fetch config

Currently, I don't see any way to pass any configs to the fetch call. One very specific use case where this is going to be necessary is the upcoming next.js app router driven data fetch infrastructure, where next will start caching fetch responses and you will need to pass next.revalidate to fetch calls to configure how long the cache lives.

I don't think this is super high priority, as it's also possible to export the revalidate const from a route segment. However, that reduces granularity of control -- i.e. I may have a route segment that consists of multiple DB requests, some of which are more permanent than others and thus can have a longer cache. That said, I also think it's a pretty trivial thing to support, and I'd even be happy to open a PR to implement it, if y'all at PlanetScale would be open to accepting such a PR.

Send along an identifying User-Agent header with all requests

Something as simple as User-Agent: database-js/0.1.0 seems like it'd be sufficient. Happy to accommodate more data as you see fit, but at least we can start identifying the client version.

This might require an addition to our CORS policy, so a tiny bit of experimentation may be needed to make sure it'll work.

Open to other suggestions too. We could potentially use a different header entirely Client: database-js/0.1.0, etc. We own everything here, so we can accommodate what works.
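
In the meantime, a custom fetch wrapper (see the Custom fetch function section of the readme) can inject such a header from userland. A minimal sketch, assuming a platform with a global fetch; the header value is illustrative, not an official convention:

import { connect } from '@planetscale/database'

// Sketch: wrap the platform fetch to attach an identifying header.
// Assumes init.headers is a plain object, which is what the driver passes.
const fetchWithUserAgent = (input, init = {}) => {
  const headers = { ...init.headers, 'User-Agent': 'database-js/0.1.0' }
  return fetch(input, { ...init, headers })
}

const conn = connect({
  fetch: fetchWithUserAgent,
  host: '<host>',
  username: '<user>',
  password: '<password>'
})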

Changelogs

I can't find the changelog for v1.6.0 (or any version, really). Can someone tell me where to look? Thanks. πŸ™

Question: How to import in node.js/express server

How can I import the js driver in node using require? Thanks!

Edit:

This:

const { Client } = require("@planetscale/database");

Results in this:
Failed to load function definition from source: FirebaseError: Failed to load function definition from source: Failed to generate manifest from function source: Error [ERR_REQUIRE_ESM]: require() of ES Module /Users/raphael/code/test/node_modules/@planetscale/database/dist/index.js from /Users/raphael/code/test/https/api.js not supported. Instead change the require of index.js in /Users/raphael/code/test/https/api.js to a dynamic import() which is available in all CommonJS modules
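
As the error message itself suggests, one workaround from CommonJS is a dynamic import(). A minimal sketch:

// CommonJS workaround: load the ESM-only package via dynamic import().
async function getClient() {
  const { Client } = await import("@planetscale/database");
  return new Client({ url: process.env.DATABASE_URL });
}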

Empty result when querying from Next.js App Router component

Hi!
I'm trying to setup a new Next.js project with PlanetScale using the new Next.js App Router.
I'm facing a problem where querying the DB from a script works fine, but sending the same query from a component returns an empty result.
In the following image, the first log is from the server startup running in a separate script, and the second and third logs are from when I call the function from a component.
(screenshot omitted)
Here's a link to the repo, it's quite bare-bone.

Thank you very much!

Support for client side parameter interpolation

It'd be expected to support a syntax like:

client.execute("select * from foo where id = ?", [1])

Where they can safely pass a list of parameters into their query instead of needing to do their own escaping or using prepared statements. This is a common pattern across drivers to allow client side interpolation, and in our case, this would be highly preferred over preparing statements.

For JavaScript, we came across sqlstring, which supports the same API as well as some other goodies. We should leverage this automatically internally. If parameters are passed, run it through SqlString.format(...).

It's worth noting that SqlString comes from the mysqljs ecosystem and is used within the mysqljs/mysql driver.

Error handling

[DatabaseError: target: reanvue.-.primary: vttablet: rpc error: code = AlreadyExists desc = Duplicate entry 'user_2O3GkFE9ev2C2REVYxmxqyVf13P-user_2O3pPzMN7jALRmhpGI1yEWyhjI' for key 'Connection.Connection_senderId_receiverId_key' (errno 1062) (sqlstate 23000) (CallerID: 211kmwyh6r7bw465wddi): Sql: "insert into `Connection`(senderId, receiverId) values (:vtg1, :vtg2)", BindVars: {REDACTED}] {
  status: 400,
  name: 'DatabaseError',
  body: {
    message: 'target: reanvue.-.primary: vttablet: rpc error: code = AlreadyExists desc = Duplicate entry \'user_2O3GkFE9ev2C2REVYxmxqyVf13P-user_2O3pPzMN7jALRmhpGI1yEWyhjI\' for key \'Connection.Connection_senderId_receiverId_key\' (errno 1062) (sqlstate 23000) (CallerID: 211kmwyh6r7bw465wddi): Sql: "insert into `Connection`(senderId, receiverId) values (:vtg1, :vtg2)", BindVars: {REDACTED}',
    code: 'UNKNOWN'
  }
}

I am getting this error from my database because I have a unique constraint set up, which is okay.

But is there a way I can perhaps extract the error code AlreadyExists from it to send back a different response to my client?

Thanks!
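
Based on the error shape shown above (status, body.message, body.code), one ad-hoc approach is matching on the message text, since the vttablet code is not exposed as a structured field. A sketch, with the insert statement borrowed from the error above:

try {
  await conn.execute(
    'insert into `Connection` (senderId, receiverId) values (?, ?)',
    [senderId, receiverId]
  )
} catch (error) {
  // The full vttablet message is in error.body.message; there is no
  // structured 'AlreadyExists' field, so match on the text itself.
  if (error.body?.message?.includes('code = AlreadyExists')) {
    // e.g. respond with 409 Conflict
  } else {
    throw error
  }
}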

Performance difference

I did a small performance comparison with the standard MySQL connection and the new driver in a lambda function. (using this)

I found that when using the serverless driver, the cold start times and the execution times in a warm lambda take at least twice as long, often longer. For example, a cold start with MySQL is 3-4s; with the HTTP driver, it's more like 10s. Warm starts are 1s compared to 3-4s.

I haven't yet added any logging of what's taking the additional time, but before I dig into things further, I was curious to hear what others' experiences have been in terms of performance.

How to select what branch to connect to

Hi, it seems that my connection points to the production database. How can I point to a development branch instead?

I'm using separate host/username/password. There is no option to select a branch. Am I supposed to switch to a URL string and connect via a pscale proxy locally?

Thanks
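
For context, each PlanetScale branch has its own connection credentials, so pointing at a development branch means generating a password on that branch and using those values. A minimal sketch, with hypothetical environment variable names:

import { connect } from '@planetscale/database'

// DATABASE_URL and DATABASE_URL_DEV are hypothetical names; each holds
// credentials generated for the corresponding branch.
const url = process.env.NODE_ENV === 'production'
  ? process.env.DATABASE_URL
  : process.env.DATABASE_URL_DEV

const conn = connect({ url })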

Inserting multiple rows at once

Hey guys, not sure if I'm missing something, but does this client not support inserting more than one row at a time? I'm having trouble finding any mention of it in the PlanetScale docs, the readme, or the code in this library. I've tried the following (and the variations after it); see the sketch after this list.

connection.execute(
  `
  INSERT INTO test (id)
  VALUES (?)
  `,
  [[1], [2]]
)

I've also tried:

VALUES (??)
VALUES ??
VALUES ?

etc
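
For what it's worth, the readme's custom format option with sqlstring handles this case: sqlstring expands a nested array after VALUES ? into grouped lists. A sketch, not an officially documented pattern:

import { connect } from '@planetscale/database'
import SqlString from 'sqlstring'

const conn = connect({
  format: SqlString.format,
  host: '<host>',
  username: '<user>',
  password: '<password>'
})

// With sqlstring, VALUES ? plus [[[1], [2]]] expands to VALUES (1), (2)
await conn.execute('INSERT INTO test (id) VALUES ?', [[[1], [2]]])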

Local Connection Fails

Scenario

  1. I run pscale connect galaxy --port 3309 locally to connect to my PlanetScale DB
  2. I attempt to connect my T3 app using import { connect } from "@planetscale/database";

Problem

  1. When I use url to connect to my local db connection, it does not use the specified port; instead it uses port 443:

const connection = connect({
  url: "mysql://root@127.0.0.1:3309"
});


  2. When I use host & username to connect to my local db connection, it uses the correct port but fails with an ERR_SSL_WRONG_VERSION_NUMBER status code:
const connection = connect({
  host: "127.0.0.1:3309",
  username: "root"
});


I'd love to be able to use this package instead of prisma for my next project.

Anything that can be done to help here would be greatly appreciated!

cast's type does not support undefined or bigint

I'm migrating from a system where null values are represented as undefined in JS. cast has no issue doing this for us, however its type is missing undefined. The same goes for the example with BigInt. The type should probably be unknown or any.

Doesn't work with jest

Please see #68. It was never fixed, and using the minimal example provided, it still errors. This is blocking me really hard; if anyone could help me, that would be great! Thanks

Prisma-like JSON Query Constructor

Any plans to make a Prisma-like JSON query constructor? The idea is to make conditional queries easier, for example:

import { NextApiRequest, NextApiResponse } from "next";
import { pscale } from "@planetscale/database"

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse
) {
    const { param1, param2, param3 } = req.query as {
      param1?: string;
      param2?: string;
      param3?: string;
    };
    const response = await pscale.table.findMany({
        where: {
            ...(param1 && {
                  attr1: param1
             }),
            ...(param2 && {
                  attr2: param2
             }),
            ...(param3 && {
                  attr3: param3
             }),
        }
    })
    res.status(200).json(response);
}

Wrong path to type definitions

Hi, I'm using the @planetscale/database npm package in a TypeScript project (using Deno). It works, but I had trouble getting the type definitions to work. package.json specifies dist/index.d.js as the types file, but this file does not exist. It should be dist/index.d.ts. Can you please fix the path in your package.json?

For the record, here is my PoC with working types:

// main.ts

// @deno-types="./node_modules/@planetscale/database/dist/index.d.ts"
import { connect } from "npm:@planetscale/database";

const connection = connect({
  host: "xx",
  password: "xx",
  username: "xx",
});

const start = Date.now();
const _result = await connection.execute("SELECT 1");
console.log(`Execute: ${Date.now() - start} ms`);

Then run deno run --allow-net --node-modules-dir main.ts

Environment:

deno --version
deno 1.28.2 (release, aarch64-apple-darwin)
v8 10.9.194.1
typescript 4.8.3

Getting ENOTFOUND from Docker image on fly.io

Hey, I'm trying to deploy a NextJS (13, using app-dir) app to fly.io but hitting the following error. I also tried passing the global fetch from NextJS to the connection config:

db:execute:pscale:ERROR TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11522:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  cause: Error: getaddrinfo ENOTFOUND "aws.connect.psdb.cloud"
      at GetAddrInfoReqWrap.onlookupall [as oncomplete] (node:dns:119:26)
      at GetAddrInfoReqWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
    errno: -3008,
    code: 'ENOTFOUND',
    syscall: 'getaddrinfo',
    hostname: '"aws.connect.psdb.cloud"'
  }
}

Not even sure this is the right place here but felt like a good starting point.
Never had this issue before, locally and Cloudflare Pages/Worker work as expected.

Any idea or hint would be very much appreciated 🙏🏼

NULL types are mishandled

It seems, due to a quirk in how a protobuf is translated into JSON, the NULL type is left out of the field because its integer value is 0: https://github.com/vitessio/vitess/blob/main/proto/query.proto#L111-L112

What happens here is an HTTP response that looks like this for a query such as select null, 1:

"result": {
    "fields": [
      {
        "name": "null"
      },
      {
        "name": ":vtg1",
        "type": "INT64"
      }
    ],
    "rows": [
      {
        "lengths": [
          "-1",
          "1"
        ],
        "values": "MQ=="
      }
    ]
  }
}

The quirk here is that type is entirely left off of the fields, resulting in us decoding the row incorrectly.

The equivalent, after being decoded by the driver, is:

{
  headers: [ 'null', ':vtg1' ],
  types: { null: undefined, ':vtg1': 'INT64' },
  fields: [ { name: 'null' }, { name: ':vtg1', type: 'INT64' } ],
  rows: [ { null: null, ':vtg1': '1' } ],
  rowsAffected: null,
  insertId: null,
  size: 1,
  statement: 'select null, 1',
  time: 194
}

As we can see, the null column has an undefined type, rather than something such as "NULL" as I'd expect.

Even worse, albeit an extreme edge case, on a query like select null, we actually throw an error:

DOMException [InvalidCharacterError]: The string to be decoded is not correctly encoded.
    at new DOMException (node:internal/per_context/domexception:53:5)
    at __node_internal_ (node:internal/util:505:10)
    at atob (node:buffer:1288:11)
    at decodeRow (file:///xxx/node_modules/@planetscale/database/dist/index.js:161:20)
    at parseObjectRow (file:///xxx/node_modules/@planetscale/database/dist/index.js:149:17)
    at file:///xxx/node_modules/@planetscale/database/dist/index.js:158:88
    at Array.map (<anonymous>)
    at parse (file:///xxx/node_modules/@planetscale/database/dist/index.js:158:17)
    at Connection.execute (file:///xxx/node_modules/@planetscale/database/dist/index.js:86:31)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)

The HTTP response we get is:

  "result": {
    "fields": [
      {
        "name": "null"
      }
    ],
    "rows": [
      {
        "lengths": [
          "-1"
        ]
      }
    ]
  }

My assumption here again is a similar quirk: values is just a missing key entirely, since it'd be an empty string representing no data. It's stemming from within atob, so I didn't check exactly what's going on, but my assumption is we're passing undefined in there, rather than anything else.

My hunch here though is we need to explicitly handle the NULL type better, and can probably fix both issues together.

New as array/object option breaks Typescript

Hi! πŸ‘‹

I updated my Planetscale driver version today and Typescript started generating errors.

After a bit of investigation, it seems that Typescript is seeing that a row can now be an Array or an Object and defaulting to Array (so things like query.rows[0].row_name break, even when specifying as).

It seems to be this line. A row can either be an array or an object, and array is preferred, so it's underlined in red.

type Row = Record<string, any> | any[]
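
Until the return type is narrowed by the as option, one workaround is asserting the row shape at the call site. A sketch, with a hypothetical row type and an existing conn:

type UserRow = { id: string; name: string } // hypothetical shape

const result = await conn.execute('select id, name from users', [], { as: 'object' })
// Row is Record<string, any> | any[], so assert the object shape explicitly.
const rows = result.rows as UserRow[]
console.log(rows[0].name)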

PlanetScale migrate

Is there a way to use the Drizzle ORM migrate API with PlanetScale?

I tried:

import { drizzle } from "drizzle-orm/mysql2";
import { migrate } from "drizzle-orm/mysql2/migrator";
import mysql from "mysql2/promise";

async function main() {
  // create the connection
  const poolConnection = mysql.createPool({
    host: process.env.MYSQL_HOSTNAME,
    user: process.env.MYSQL_USERNAME,
    database: process.env.MYSQL_DATABASE,
    multipleStatements: true,
  });

  const db = drizzle(poolConnection);

  // this will automatically run needed migrations on the database
  await migrate(db, { migrationsFolder: "./migrations" });
}
main();

But, I got code: 'ECONNREFUSED'. I have confirmed that the stuff in my env is all good.

Thanks!

Syntax error at position 42 near 'WHERE'

Hey! Great job on this JS driver! :)

I tried to run a simple query:

SELECT * FROM database.table LIMIT 100 WHERE slug = 'xyz'

But I got the following error:

syntax error at position 42 near 'WHERE'

I later realized that it's because my LIMIT clause cannot come before my WHERE clause.

i.e. this works:

SELECT * FROM database.table WHERE slug = 'xyz' LIMIT 100

It would be nice if we could write a freeform SQL query without restricting the clause sequence :)

WASM / Rust version

Since edge runtimes have WASM support, it'd be good if there were a PlanetScale driver in WASM instead of JS. Edge resources are limited, so the more savings the better :)

base64 compatibility

We currently rely on btoa and atob because they work in browsers and in the versions of Node we tested against, but that's not as universal as we had hoped.

We should instead provide a compatibility shim against the Buffer API, which is more likely to exist. We might need to provide a fallback for the case where neither Buffer nor btoa and friends exist.

But for now, it seems either way, it'd be more favorable to use the Buffer API when available over btoa.

I propose something like this, granted I dunno the best way to do this in JavaScript/TypeScript these days:

var b64encode;
var b64decode;
if(typeof Buffer !== "undefined") {
  b64encode = x => Buffer.from(x, 'latin1').toString('base64')
  b64decode = x => Buffer.from(x, 'base64').toString('latin1')
} else if (typeof btoa !== "undefined") {
  b64encode = btoa
  b64decode = atob
} else {
  // I dunno, natively has neither
}

console.log(b64encode('foo'))
console.log(b64decode('Zm9v'))

I think this is still reasonably applicable since we don't simply use it for the Authorization header. While that's the only use of btoa, atob is used to decode binary row data, which we do for every row in a QueryResult.

Refs: #47

Suggestion: Allow setting the `cast` option in `execute`

tldr;

const result = connOrTx.execute(query, values, { as: 'object',  cast: myOneOffCastFn })

Reasoning

AFAICT you have to pass cast to connect or the Client factory. This is inconvenient when the code that's calling execute isn't the code that created the connection. For example I might have this...

type UserBalance = {
  userId: string; // but stored as INT64
  balance: bigint; // yeah, my app's users are super wealthy
}
const selectUserBalance = async (
  input: { where: Sql, order: Sql, limit: Sql }, 
  db: Transaction | Connection
): Promise<UserBalance[]>  {...}

As it is now, selectUserBalance has to re-cast the rows after getting them from the library. If you don't know (or, in my case, can never remember) what the connection's cast is doing, this is hard to get right. Am I casting to BigInt?, etc.

I think it'd be nicer to allow the connection's cast to be overridden or extended as needed. In many (if not all) cases cast is the concern of the model "at hand," not the connection. In addition to BigInt as in the example above, there are at least a couple other cases where one might want to cast on a per-model basis:

  • boolean. The library (rightly) does not try to cast this on its own from say INT8. Even if it did, I'm not sure it's possible to CAST to that in MySQL from an aggregate query. I think the common case, though, is that it should end up as a boolean.
  • Cases where one might want to manipulate the raw data, e.g. turning a float into Decimal or massaging JSON.

How it'd work

  • Folks could still pass a custom "global" cast to connect. This or the library's default cast would be used if execute is called without cast set.
  • If execute is called with cast, the library would use that function rather than the custom or default one.

In userland...

Defining a "global" cast function...

const myGlobalCast = (field: Field, value: string | null) => {
  if (field.type === 'DATETIME' && typeof value === 'string') {
    return new Date(value);
  }
  // etc
  return cast(field, value);
}
const myGetConnection = (url: string): Connection => {
  return connect({
    cast: myGlobalCast,
    url
  });
}

...which can be overridden as need be...

type UserBalance = {
  userId: string; // but stored as INT64
  balance: bigint; // yeah, my app's users are super wealthy
};
const selectUserBalance = async (
  input: { where: Sql; order: Sql; limit: Sql },
  db: Transaction | Connection
): Promise<UserBalance[]> => {
  const { sql: query, values } = sql`some fancy sql here`;
  const cast = (field: Field, value: string | null) => {
    if (field.name === 'balance') {
      return BigInt(value);
    }
    return myGlobalCast(field, value); // or just use the library's default cast
  };
  const result = await db.execute(query, values, { as: 'object', cast });
  return result.rows as UserBalance[];
};

Alternatives Considered

  • Currently I'm short-circuiting the connection's cast with (field: Field, value: string | null) => value, then recasting the fetched rows knowing that I only have to deal with string|null values. This isn't as terrible as it sounds, since it's using cast under the hood for most things, but it's not great.
  • Rename columns to indicate type. Ack. Nope.
  • Maybe I'm missing something?

Send Request-Id header on requests

Our API supports sending along a Request-Id header, and this will be traced through on our side for logging and whatnot.

A simple UUID or something unique would be sufficient.

Note that this header should already be allowed by our CORS, but a manual verification would be nice.

LAST_INSERT_ID comes back as string

The types returned by the HTTP driver usually line up well with what's in the db, but for some reason LAST_INSERT_ID comes back as a string.

const newId = await tx.execute("SELECT LAST_INSERT_ID() as id");
console.log(newId.rows, typeof newId.rows[0].id);

which prints

(screenshot omitted: typeof reports 'string' for the returned id)
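
Until that changes, the per-query cast hook documented in the readme can coerce the value. A sketch using the tx from the snippet above; BigInt is the target since LAST_INSERT_ID() is an unsigned 64-bit value:

import { cast } from '@planetscale/database'

const newId = await tx.execute('SELECT LAST_INSERT_ID() as id', [], {
  cast: (field, value) => {
    // Coerce the aliased id column; leave everything else to the default.
    if (field.name === 'id' && value !== null) {
      return BigInt(value)
    }
    return cast(field, value)
  }
})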

Retrieved binary is converted to a string

AFAICT, this library does not support retrieving binary as binary.

I added some code inspired by the custom type casting function section:

function inflate(field, value) {
  console.log("field.type", field.type)
  console.log("value", value)
  console.log("value type", typeof value)
  return cast(field, value)
}

When I do so, here is some representative output of some UUIDs:

field.type BINARY
value ø0sô¼A€ÿçÐsô
value type string
field.type BINARY
value AD3ADFB038724973
value type string
field.type BINARY
value type string

Yes, the third value is missing: it is seemingly unprintable and silently killed that console.log. This isn't surprising, because not all binary is valid UTF-8.

Some code, if you want gory details.
create table Media_User
(
    mediaId binary(16)  not null,
    userId  varchar(21) not null,
    primary key (mediaId, userId)
)
    collate = utf8mb4_unicode_ci;

INSERT INTO Media_User (mediaId, userId) VALUES (0x11f830737ff4bc41a4ffe792d073f41f, 'ff')
INSERT INTO Media_User (mediaId, userId) VALUES (0x41443341444642303338373234393733, 'ff')
INSERT INTO Media_User (mediaId, userId) VALUES (0x737295711B1121479411906BBD2F7A34, 'ff')
import { connect, cast } from "@planetscale/database"

const config = {
  cast: inflate,
  url: somethngSecret,
}

const conn = connect(config)
const results = await conn.execute("select mediaId from planetScaleDb.Media_User")
console.log("------------------")
console.log(results.rows[0])
console.log(results.rows[1])
console.log(results.rows[2])

This occurs for me on cloudflare workers (and miniflare).

Add CJS distribution to allow both NextJS and Node usage

This package is esm-only at the time of writing. This is a good call and a growing trend.

However, NextJS does not currently fully support Node’s explicit file extension pattern for esm.

This means that it is impossible to:

  • Use this library with NextJS
  • AND write tests that use the library running in Node (such as E2E/Integration tests that require DB mutations).

Further, despite esm seen as the de-facto better future, its support is still limited, with many issues still to iron out.

A CJS distribution alongside an ESM one would ensure that developers do not have to waste time on esm friction (I did, and it took 10 hours).

The diagram below depicts the issue.

(diagram omitted)

Idea: Multiple fetch styles

I do PHP at work, and a popular library there, Doctrine, has something like "fetch styles". I guess that would be a cool thing to implement.

Something like below. Maybe also useful for others 🤔

con.fetchOne('SELECT id from xxx LIMIT 1'); // returns the ID directly

con.fetchAllIndexed('SELECT id, name FROM xxx'); // returns {1:  {name: "foo"}}

con.fetchAllKeyValue('SELECT id, name FROM xxx') // returns {1: "foo"}

Support for multiple statements?

Apologies if this is already supported, but would it be possible to get the ability to run multiple statements in one request? Something like

await conn.execute("INSERT INTO foo(x) VALUES(12); SELECT LAST_INSERT_ID() as id;")

Obviously you can achieve that with separate queries, but that would require blocking until the insert is done, and then blocking while you grab the inserted id.

How to close connection?

import { connect } from "@planetscale/database";
const conn = connect(config);
export const executeQuery = async (query, params = []) => {
  try {
    const results = await conn.execute(query, params);
    return results;
  } catch (error) {
    console.error(`Error in executeQuery ${error.message}`);
    console.error(`Error in executeQuery ${error.stack}`);
    throw error;
  }
};

How does connection handling work, and how do I close an existing database connection?

Error [ERR_REQUIRE_ESM]: require() of ES Module

Trying to use in my typescript project and i get

Error [ERR_REQUIRE_ESM]: require() of ES Module /home/.../dev/...-monorepo/node_modules/@planetscale/database/dist/index.js from /home/.../dev/...-monorepo/...-backend/db/client.ts not supported.
Instead change the require of index.js in /home/.../dev/...-monorepo/...-backend/db/client.ts to a dynamic import() which is available in all CommonJS modules.

First class support for transactions

I don't want to speculate on what an API for this could be, since I think this is more fitting to precedent within the ecosystem and expectations, but it's not clear or obvious that you can do transactions through database-js, so we should provide some examples and likely some conn.beginTransaction(), etc, helpers.

To be clear, you can simply do:

await conn.execute('begin')
await conn.execute('select 1')
await conn.execute('commit')

and this works fine, as long as it's on the same session.

Missing/more errors

Thanks for this great library, it's fantastic to be able to query directly from Cloudflare Pages πŸ₯³

Based on my experiments so far, I feel like the library is pretty silent regarding errors; instead of returning or throwing anything, it just does nothing. For example, when inserting a record with an existing id twice, only the first time shows the result of the query; the second time, nothing happens.

Personally, I definitely prefer that it doesn't throw, but it'd be really helpful if it could return errors, so I could react accordingly.

const result = await conn.execute(`INSERT INTO user (id, username) VALUES (?, ?)`, [id, username]);
console.log(JSON.stringify({ result }, null, 4));

Edit: just stumbled upon #62 😅 though, as stated, I'd prefer the former (returning errors instead of throwing), or the option to configure this; in my case it does nothing where I expect some form of information.

Parse Date, timezone problem because Z is removed

There is a sanitiser that removes the Z from a date:

return quote(value.toISOString().replace('Z', ''))

But when I parse that string back into a date, I get the wrong result. I needed to add the Z back to make it work:

new Date(Date.parse(d + "Z"));

I had that problem here:
https://github.com/roelandmoors/authjs-adapter-mysql/blob/5867a205154f91c88086c0ce2a5101e79c61faf7/src/db.ts#L45

Seems like a bug to me?
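
A userland workaround until this is settled: a custom cast that re-appends the UTC marker when parsing DATETIME columns, assuming the stored values are UTC (which stripping the Z implies). A sketch modeled on the readme's cast example:

import { connect, cast } from '@planetscale/database'

const conn = connect({
  url: process.env.DATABASE_URL,
  cast: (field, value) => {
    // The serializer strips the trailing Z, so add it back and parse as UTC.
    if (field.type === 'DATETIME' && typeof value === 'string') {
      return new Date(value.replace(' ', 'T') + 'Z')
    }
    return cast(field, value)
  }
})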

Question: Error when trying to use the module in jest

Expected Behavior

"@planetscale/database" should be able to import in Jest tests.

Current Behavior

An error occurs and jest is not able to run.

 FAIL  ./test.test.ts
  ● Test suite failed to run

    Jest encountered an unexpected token

    Jest failed to parse a file. This happens e.g. when your code or its dependencies use non-standard JavaScript syntax, or when Jest is not configured to support such syntax.

    Out of the box Jest supports Babel, which will be used to transform your files into valid JS based on your Babel configuration.

    By default "node_modules" folder is ignored by transformers.

    Here's what you can do:
     β€’ If you are trying to use ECMAScript Modules, see https://jestjs.io/docs/ecmascript-modules for how to enable it.
     β€’ If you are trying to use TypeScript, see https://jestjs.io/docs/getting-started#using-typescript
     β€’ To have some of your "node_modules" files transformed, you can specify a custom "transformIgnorePatterns" in your config.
     β€’ If you need a custom transformation specify a "transform" option in your config.
     β€’ If you simply want to mock your non-JS modules (e.g. binary assets) you can stub them out with the "moduleNameMapper" config option.

    You'll find more details and examples of these config options in the docs:
    https://jestjs.io/docs/configuration
    For information about custom transformations, see:
    https://jestjs.io/docs/code-transformation

    Details:

    /foo/node_modules/@planetscale/database/dist/index.js:1
    ({"Object.<anonymous>":function(module,exports,require,__dirname,__filename,jest){import { format } from './sanitization.js';
                                                                                      ^^^^^^

    SyntaxError: Cannot use import statement outside a module

What I've tried

I'm not sure what the root cause is, but it seems Jest cannot transform the ESM file included at @planetscale/database/dist/index.js.

I assumed this error is a well-known Jest misconfiguration and tried several of the commonly suggested solutions. The same error occurred each time.

Steps to Reproduce

I've created a minimum code base as follows:
https://github.com/kenfdev/planetscale-jest-poc

You can reproduce it with npm test.

I might totally be missing something from the beginning but am pretty much stuck right now. Any advice would be highly appreciated!
Thank you in advance!
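
One of the options the Jest output itself lists is transformIgnorePatterns. A sketch that exempts this package from the node_modules ignore list so its ESM output gets transformed; it assumes a transform (such as babel-jest or ts-jest) that can handle import syntax, and is not a verified fix for this issue:

// jest.config.js (sketch)
module.exports = {
  // node_modules is ignored by default; exempt @planetscale/database
  // so Jest transforms its ESM build.
  transformIgnorePatterns: ['node_modules/(?!@planetscale/database)'],
}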

Reject promise if query fails

Hey,

I would currently expect that when a query fails, the promise is rejected. Maybe with a new config rejectPromiseIfQueryFails: boolean.

Thanks for building this. Works awesome with Cloudflare Workers

[DOCS]: Parallel executes not supported within a single transaction

I assume the statement in the title is correct. If so, it might be worth adding the following to the Transactions section of the README:

Note: execute cannot be called in parallel (Promise.all) within a transaction.

The transaction example provided in the README:

const results = await conn.transaction(async (tx) => {
  const whenBranch = await tx.execute(query1)
  const whenCounter = await tx.execute(query2)
  return [whenBranch, whenCounter]
})

However, if you try to fire execute in parallel:

const results = await conn.transaction(async (tx) => {
  await Promise.all([
    tx.execute(query1),
    tx.execute(query2)
  ])
})

It breaks with:

target: dbname.-.primary: vttablet: rpc error: code = Aborted desc = transaction 1678232354685529562: in use: for query (CallerID: r05ouwpwzfqr7yd16bf2)

Explore supporting StreamExecute/OLAP workloads

Within some contexts, especially serverless, in-process memory limits are very low. So when performing aggregations in process, or when streaming out to the end user, it can be desirable to leverage StreamExecute from our API.

I will say, this is going to be rather non-trivial, but I'd like us to see what it'd take to support this without fully adopting connect-web or protobufs.

Streaming requires adopting the Connect protocol, rather than simply application/json, which can be done, and is done within connect-web. https://connect.build/docs/protocol#streaming-response

To add to this, this feature is required to support OLAP queries.

cc @reconbot

Prisma integration?

Hi! Is it possible to integrate this connector with the Prisma client? I'm running an app on Cloudflare Pages and so can only send requests through HTTP.

Ability to easily switch branches for local development

Currently, database.js is not compatible with pscale connect because they are entirely different protocols (http vs mysql).

One pain point when using database.js for local development is quickly switching branches. Ideally, we'd find a solution where developers do not have to create & copy/paste connection strings into their code to swap branches.

Opening this issue so that: 1. people can find it when googling, and 2. the discussion is tracked in one place.

Why transactions don't work in this case

I have this code

async function getDBConnection() {
    const config = {
        host: process.env["DATABASE_HOST"],
        username: process.env["DATABASE_USERNAME"],
        password: process.env["DATABASE_PASSWORD"],
    };
    return connect(config);
}

async function test() {
    const queries = [
        "CREATE TABLE `a` (`id` serial AUTO_INCREMENT PRIMARY KEY NOT NULL)",
        "CREATE TABLE `b` (`id` serial AUTO_INCREMENT PRIMARY KEY NOT NULL)",
        "CREATE TABLE `c` (`id` serial AUTO_INCREMENT PRIMARY KEY NOT NULL)",
        "" // This will fail with DatabaseError: Query was empty
    ]
    const conn = await getDBConnection();
    return conn.transaction(async (tx) => {
        for (const q of queries) {
            await tx.execute(q)
        }
    })
}

test()
    .then((res) => {
        console.log(res);
    })
    .catch((err) => {
        console.log(err);
    });

I've executed it, and the empty query error was returned:

DatabaseError: Query was empty
    at Connection.execute (.../@planetscale/database/dist/index.js:78:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async .../scripts/run-migration.mjs:115:13
    at async Connection.transaction (.../@planetscale/database/dist/index.js:59:25) {
  status: 400,
  body: { message: 'Query was empty', code: 'UNKNOWN' }

I was expecting the tables to not be created, but when I go to the console I see

(screenshot omitted: the tables exist)

What am I missing?

Json Datatypes are not natively supported

When sending a JSON object or a JSON-stringified string, I get the following errors. I can't tell if I should be doing something different.

JSON Object

target: readcted.-.primary: vttablet: rpc error: code = Unknown desc = Invalid JSON text: \"Invalid value.\" at position 1 in value for `data`) values (:vtg1,)\", BindVars: {vtg1: \"type:VARCHAR value:\\\"[object Object]\\\"\}

JSON Stringified

target: redacted.-.primary: vttablet: rpc error: code = Unknown desc = Invalid JSON text: \"Invalid value.\" at position 0 in value for column 'Redacted.data'. (errno 3140) (sqlstate 22032) (CallerID: planetscale-admin): Sql: \"insert into Redacted(`data`) values (:vtg1)\", BindVars: {vtg1: \"type:VARCHAR value:\\\"{\\\\\\\"foo\\\\\\\":\\\\\\\"bar\\\\\\\"}\\\"\}

Table Schema

CREATE TABLE `Redacted` (
	`id` varchar(191) NOT NULL,
	`createdAt` datetime(3) NOT NULL DEFAULT CURRENT_TIMESTAMP(3),
	`data` json NOT NULL,
	PRIMARY KEY (`id`)
) ENGINE InnoDB,
  CHARSET utf8mb4,
  COLLATE utf8mb4_unicode_ci;
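
One workaround commonly suggested for string-bound JSON is an explicit CAST in the statement, so the bound VARCHAR is converted server-side. A sketch against the schema above; this is an assumption, not verified against this exact error:

// Bind the JSON as a string, then convert it in SQL with CAST(? AS JSON).
const payload = JSON.stringify({ foo: 'bar' })
await conn.execute(
  'INSERT INTO Redacted (id, `data`) VALUES (?, CAST(? AS JSON))',
  ['some-id', payload]
)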

fetch config types not compatible with node-fetch v2 or undici

import { connect } from '@planetscale/database'
import { fetch } from 'undici'

export const connection = connect({
  url: process.env.DATABASE_URL,
  fetch,
})

Comes with a type error

Type '(input: RequestInfo, init?: RequestInit | undefined) => Promise<Response>' is not assignable to type '(input: string, init?: ReqInit | undefined) => Promise<Pick<Response, "ok" | "json" | "status" | "statusText" | "text">>'.
  Types of parameters 'init' and 'init' are incompatible.
    Type 'ReqInit | undefined' is not assignable to type 'RequestInit | undefined'.
      Type 'ReqInit' is not assignable to type 'RequestInit | undefined'.
        Type 'ReqInit' is not assignable to type 'RequestInit'.
          Types of property 'headers' are incompatible.
            Type 'HeadersInit | undefined' is not assignable to type 'import("/Users/wizard/src/telley/telley-api/node_modules/.pnpm/[email protected]/node_modules/undici/types/fetch").HeadersInit | undefined'.
              Type 'Headers' is not assignable to type 'HeadersInit | undefined'.
                Type 'Headers' is missing the following properties from type 'Headers': keys, values, entries, [Symbol.iterator]ts(2322)
index.d.ts(34, 5): The expected type comes from property 'fetch' which is declared here on type 'Config'

A very similar error comes with the node-fetch v2 types.

Geometry binary data appears to be incorrect value

For some context, I'm trying to interface with Drizzle ORM. I'm trying to parse the response for a geometry field that I'm receiving from the driver; I'm deliberately not using ST_AsBinary(), as that is how it will end up getting called by Drizzle. My understanding is that MySQL should return the SRID + WKB as the default response when not given ST_AsText or ST_AsBinary.

The geometry in my database: LINESTRING(0 0,1 1,2 2), 4326

db.Execute('SELECT Top(1) geom from polys')

The response from the driver:

\u0010\u0000\u0000\u0001\u0002\u0000\u0000\u0000\u0003\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000 ?\u0000\u0000\u0000\u0000\u0000\u0000 ?\u0000\u0000\u0000\u0000\u0000\u0000\u0000@\u0000\u0000\u0000\u0000\u0000\u0000\u0000@

I'm then reading it into a buffer and slicing off the first 4 bytes for the SRID. However, doing so produces a marginally different output than expected.

I'm expecting (from MySqlWorkBench):

01020000000300000000000000000000000000000000000000000000000000F03F000000000000F03F00000000000000400000000000000040

Receiving (after reading from buffer):

01020000000300000000000000000000000000000000000000000000000000fd3f000000000000fd3f00000000000000400000000000000040

It's very close to what it should be.

MySqlWorkBench for reference:
(screenshot omitted)

Is there something I'm missing here?

Codesandbox without pscale hooked up
