
node-pg-types's Introduction

pg-types

This is the code that turns all the raw text from postgres into JavaScript types for node-postgres.

use

This module is consumed and exported from the root pg object of node-postgres. To access it, do the following:

var types = require('pg').types

Generally what you'll want to do is override how a specific data-type is parsed and turned into a JavaScript type. By default the PostgreSQL backend server returns everything as strings. Every data type corresponds to a unique OID within the server, and these OIDs are sent back with the query response. So, you need to match a particular OID to a function you'd like to use to take the raw text input and produce a valid JavaScript object as a result. null values are never parsed.

Let's do something I commonly like to do on projects: return 64-bit integers (int8) as JavaScript integers. Because JavaScript doesn't have support for 64-bit integers, node-postgres cannot confidently parse int8 results as numbers: a huge value would overflow, and the result you'd get back from node-postgres would not match the value in the database. That would be a very bad thing, so node-postgres just returns int8 results as strings and leaves the parsing up to you. Let's say you know you don't and won't ever have numbers greater than int4 in your database, but you're tired of receiving results from the COUNT(*) function as strings (because that function returns int8). You would do this:

var types = require('pg').types
types.setTypeParser(20, function(val) {
  return parseInt(val, 10)
})

boom: now you get numbers instead of strings.
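If you actually do need the full 64-bit range, a hedged alternative (requires Node.js 10.4+, and note that BigInt values don't JSON.stringify by default) is to parse into a native BigInt instead:

var types = require('pg').types
types.setTypeParser(20, function(val) {
  // BigInt keeps the full 64-bit precision that Number would silently lose
  return BigInt(val)
})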

Just as another example -- not saying this is a good idea -- let's say you want to return all dates from your database as moment objects. Okay, do this:

var types = require('pg').types
var moment = require('moment')
var parseFn = function(val) {
   return val === null ? null : moment(val)
}
types.setTypeParser(types.builtins.TIMESTAMPTZ, parseFn)
types.setTypeParser(types.builtins.TIMESTAMP, parseFn)

note: I've never done that with my dates, and I'm not 100% sure moment can parse all the date strings returned from postgres. It's just an example!

If you're thinking "gee, this seems pretty handy, but how can I get a list of all the OIDs in the database and what they correspond to?!?!?!" worry not:

$ psql -c "select typname, oid, typarray from pg_type order by oid"

If you want to find out the OID of a specific type:

$ psql -c "select typname, oid, typarray from pg_type where typname = 'daterange' order by oid"

😄

license

The MIT License (MIT)

Copyright (c) 2014 Brian M. Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


node-pg-types's Issues

Last update breaks app

Hi,

I've no idea why but the update from 1.7.0 to 1.8.0 breaks my app with the following error (the same error and stack trace repeat for every affected query):

Uncaught ReferenceError: val is not defined
FROM
Array.parseBigIntegerArray (/app/api/node_modules/bub-backend-shared/node_modules/pg/node_modules/pg-types/lib/textParsers.js:32:22)
new eval (eval at <anonymous> (/app/api/node_modules/bub-backend-shared/node_modules/pg/lib/result.js:98:20), <anonymous>:28:73)
Result.parseRow (/app/api/node_modules/bub-backend-shared/node_modules/pg/lib/result.js:61:10)
Query.handleDataRow (/app/api/node_modules/bub-backend-shared/node_modules/pg/lib/query.js:58:26)
null.<anonymous> (/app/api/node_modules/bub-backend-shared/node_modules/pg/lib/client.js:108:24)
emit (events.js:107:17)
Socket.<anonymous> (/app/api/node_modules/bub-backend-shared/node_modules/pg/lib/connection.js:109:12)
Socket.emit (events.js:107:17)
readableAddChunk (_stream_readable.js:163:16)
Socket.Readable.push (_stream_readable.js:126:10)
TCP.onread (net.js:538:20)


Inconsistent (date) values when working with Timezones

Some time ago I noticed "small problems" working with data (types) from a json subquery (brianc/node-postgres#1876). I know that there are many similar issues, but now that I have to handle different time zones, "things" look worse...

Consider The Experiment

with (postgresql.conf) timezone = 'Europe/Warsaw'
or (sql) SET TIMEZONE = 'Europe/Warsaw'
and (node) $ env TZ='Europe/Warsaw' node server.js

and query:

SELECT
    current_setting('TIMEZONE')        AS "TZ_current",
    now()                              AS "now_______",
    now()::timestamptz                 AS "now_tstz__",
    now()::timestamp                   AS "now_ts____",
    now() AT TIME ZONE 'Europe/Warsaw' AS "now_WAW___",
    now() AT TIME ZONE 'UTC'           AS "now_UTC___",
    (SELECT row_to_json(r) FROM (
        SELECT
            now()                              AS "now_______",
            now()::timestamptz                 AS "now_tstz__",
            now()::timestamp                   AS "now_ts____",
            now() AT TIME ZONE 'Europe/Warsaw' AS "now_WAW___",
            now() AT TIME ZONE 'UTC'           AS "now_UTC___"
    ) r );

the results look like this:

console.log(process.env.TZ); //: (ok) Europe/Warsaw
console.log(new Date()); //: 2019-04-14T12:38:47.301Z (ok) UTC
console.log(new Date().getTimezoneOffset()/60); //: (ok) -2

{
  "TZ_current": "Europe/Warsaw",
  "now_______": "2019-04-14T12:38:47.301Z", //: ? UTC
  "now_tstz__": "2019-04-14T12:38:47.301Z", //: ? UTC
  "now_ts____": "2019-04-14T12:38:47.301Z", //: ERROR: now_tstz__ = now_ts____
  "now_WAW___": "2019-04-14T12:38:47.301Z", //: ERROR: now_tstz__ = now_WAW___
  "now_UTC___": "2019-04-14T10:38:47.301Z", //: ERROR: now_tstz__ <> now_UTC___ 
  "row_to_json": {
    "now_______": "2019-04-14T14:38:47.301076+02:00",
    "now_tstz__": "2019-04-14T14:38:47.301076+02:00",
    "now_ts____": "2019-04-14T14:38:47.301076",
    "now_WAW___": "2019-04-14T14:38:47.301076",
    "now_UTC___": "2019-04-14T12:38:47.301076"
  }
}

types.setTypeParser(1114, (s) => s); //: timestamp
types.setTypeParser(1184, (s) => s); //: timestamptz

{
  "TZ_current": "Europe/Warsaw",
  "now_______": "2019-04-14 14:39:35.911219+02",
  "now_tstz__": "2019-04-14 14:39:35.911219+02",
  "now_ts____": "2019-04-14 14:39:35.911219",
  "now_WAW___": "2019-04-14 14:39:35.911219",
  "now_UTC___": "2019-04-14 12:39:35.911219",
  "row_to_json": {
    "now_______": "2019-04-14T14:39:35.911219+02:00",
    "now_tstz__": "2019-04-14T14:39:35.911219+02:00",
    "now_ts____": "2019-04-14T14:39:35.911219",
    "now_WAW___": "2019-04-14T14:39:35.911219",
    "now_UTC___": "2019-04-14T12:39:35.911219"
  }
}

and for me this is a Real Problem with the default (JS Date) type.

Maybe another argument for #50 – Do not return DATE fields as Javascript Date – ???

Originally posted by @veriKami in brianc/node-postgres#1876 (comment)

Error with Buffer() Deprecated

Hello,

First of all, thank you for your work!

You have a dependency on "postgres-bytea": "~1.0.0". It seems that dependency causes an error in AWS Lambda due to the deprecated Buffer() constructor.

Perhaps you could update your dependency to version 2.0.0 to resolve this issue?

Thank you!

Problem parsing multidimensional json arrays

I have a column defined as a jsonb[] type. I'm actually going to store 2D arrays of JSON objects like the following.

[
    [
        {"count": 2, "type": "king"},
        {"count": 3, "type": "queen"}
    ], [
        {"count": 2, "type": "king"},
        {"count": 3, "type": "queen"}
    ]
]

The data representation of this on the database is like this...

{{"{\"type\": \"king\", \"count\": 2}","{\"type\": \"queen\", \"count\": 3}"},{"{\"type\": \"king\", \"count\": 2}","{\"type\": \"queen\", \"count\": 3}"}}

When trying to retrieve this kind of data structure from the database however I end up getting the following error.

SyntaxError: Unexpected token , in JSON at position 28
    at JSON.parse (<anonymous>)
    at c:\vagrant\aws\property_manager_api\property_api\node_modules\pg-types\lib\textParsers.js:113:45
    at Array.map (<anonymous>)
    at Array.parseJsonArray (c:\vagrant\aws\property_manager_api\property_api\node_modules\pg-types\lib\textParsers.js:113:14)
    at Result.parseRow (c:\vagrant\aws\property_manager_api\property_api\node_modules\pg\lib\result.js:73:36)
    at Query.handleDataRow (c:\vagrant\aws\property_manager_api\property_api\node_modules\pg\lib\query.js:91:24)
    at Connection.<anonymous> (c:\vagrant\aws\property_manager_api\property_api\node_modules\pg\lib\client.js:217:22)
    at emitOne (events.js:116:13)
    at Connection.emit (events.js:211:7)
    at Socket.<anonymous> (c:\vagrant\aws\property_manager_api\property_api\node_modules\pg\lib\connection.js:123:12)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at addChunk (_stream_readable.js:263:12)
    at readableAddChunk (_stream_readable.js:250:11)
    at Socket.Readable.push (_stream_readable.js:208:10)

I believe this might be because it's ambiguous as to whether or not the curly braces are for an object or an array. Note that I haven't declared the column as a jsonb[][] because internally for postgres it should be treated the same.

Is there any fix for this?
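One possible user-land workaround (a sketch, untested against every nesting depth) is to register a parser for the jsonb[] OID (3807) that delegates the brace nesting to the exposed arrayParser and JSON-parses each element:

var types = require('pg').types

// 3807 is the OID of jsonb[] (_jsonb); verify with:
//   select typarray from pg_type where typname = 'jsonb'
types.setTypeParser(3807, function(val) {
  // arrayParser handles the nested {...} levels; each entry it yields is
  // the quoted JSON text of a single element, which JSON.parse restores.
  return types.arrayParser.create(val, function(entry) {
    return entry === null ? null : JSON.parse(entry)
  }).parse()
})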

README contains wrong OIDs for time types

The timestamp and timestamptz oids are reversed on the README page.

select typname, oid, typarray from pg_type where typtype = 'b' and typname LIKE 'timestamp%' order by oid
"timestamp";1114;1115
"timestamptz";1184;1185

setTypeParser within json_agg

Hi, I'm using the example provided to do this:

pg.types.setTypeParser(3912, parseDaterange)

However, when I've grouped documents using json_agg it doesn't seem to run those type parsers. This makes sense to me because I'm sure it's formatted as JSON by Postgres before the type parser ever sees it. I'm just curious if there's a recommended approach to parsing data types inside something like this?
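Since the server serialises the aggregated rows to JSON before pg-types ever sees them, one hedged approach is to re-parse the affected fields by hand after the query. A sketch, where documents and period are hypothetical names and parseDaterange is the same function registered above:

client.query('SELECT json_agg(d) AS docs FROM documents d', function(err, res) {
  if (err) throw err
  // json_agg output arrives as one plain JSON value, so the registered
  // type parsers never run on its fields; apply them manually instead.
  var docs = res.rows[0].docs.map(function(doc) {
    doc.period = parseDaterange(doc.period)
    return doc
  })
})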

Error message undefined

Hi,

I am using pg to execute a stored procedure on Postgres 10. This procedure may fail and raise an exception; the exception message cannot be extracted from the error (err.error is undefined), but it can be logged to the console.
When I log the whole error (console.log(err)) I see the exception message along with the stack trace, but when I log only the error message (console.log(err.error)) I get undefined.

let query = {
    text: 'SELECT insert_events($1, $2, $3, $4, $5)',
    values: [provider, type, events, types, expectedVersion],
  };
  const qfunc = async () => {
    const cnn = await pool.connect();
    try {
      const res = await cnn.query(query);
      return res.rows[0].insert_events;
    } catch (err) {
      console.log('Error : ', err.where);
      throw err;
    } finally {
      cnn.release();
    }
  };
  return qfunc();

Entries of type DATE are converted from UTC to local date

PostgreSQL DATE values are stored in the format yyyy-MM-dd in UTC (following the SQL convention), but the parseDate function converts them to a local date, treating the timezone as missing.

      //it is a date in YYYY-MM-DD format
      //add time portion to force js to parse as local time
      return new Date(isoDate + ' 00:00:00');

That results in offsetting the stored date (depending on the local timezone). The DATE entry should be treated as a UTC date.
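A hedged user-land override along those lines (1082 is the date OID) would pin DATE text to UTC:

var types = require('pg').types

// 1082 is the OID for date; appending an explicit UTC time keeps the
// engine from interpreting 'YYYY-MM-DD' as local midnight.
types.setTypeParser(1082, function(val) {
  return val === null ? null : new Date(val + 'T00:00:00Z')
})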

numeric parsed as a string : is this the expected behaviour ?

Hi folks, and thanks for this module,

I'm currently using node-postgres 5.1, which uses pg-types 1.11.
When I query a table with a numeric(5,2) column (OID 1700), I get the result as a string.
E.g. '2.00', though I was expecting 2 (a number).

Is this the expected result?
If yes, shouldn't the docs be updated, as was done for big ints?

Should I set my own parser for OID 1700?

Thanks in advance.

Parsing postgres intervals in ISO 8601 format

I am using moment.duration as the internal format for intervals in my application. What would be the most direct way of saving and restoring intervals in the ISO 8601 format?

  1. What are the steps to override the pg-types parse and format functions for interval? (I am new to pg-types.)
  2. How do I force postgres to output ISO 8601 intervals programmatically (equivalent to the command SET intervalstyle = iso_8601)? A sketch touching both questions follows this list.
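A minimal sketch covering both questions, assuming the interval OID 1186 and that setting intervalstyle per session is acceptable:

var moment = require('moment')
var pg = require('pg')

// 1186 is the OID for interval; with intervalstyle = iso_8601 the server
// sends strings like 'PT1H5S', which moment.duration understands directly.
pg.types.setTypeParser(1186, function(val) {
  return val === null ? null : moment.duration(val)
})

var client = new pg.Client()
client.connect(function(err) {
  if (err) throw err
  // Equivalent of SET intervalstyle = iso_8601, applied per connection.
  client.query("SET intervalstyle = 'iso_8601'", function(err) {
    if (err) throw err
    client.query("SELECT interval '1 hour 5 seconds' AS d", function(err, res) {
      if (err) throw err
      console.log(res.rows[0].d.toISOString()) // a moment.duration
      client.end()
    })
  })
})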

Parsing BOOL arrays

Right now BOOL arrays don't appear to be parsed. OID is 1000. Just wanted to make sure I'm not doing something wrong before I submit a pull request 😄
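For reference, a hedged user-land parser in the meantime, reusing the exposed arrayParser (1000 is the bool[] OID):

var types = require('pg').types

// 1000 is the OID of bool[] (_bool); the server sends 't'/'f' per element.
types.setTypeParser(1000, function(val) {
  return types.arrayParser.create(val, function(entry) {
    return entry === null ? null : entry === 't'
  }).parse()
})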

Register parser for type time[]

Hi, I was having trouble with parsing time arrays and found that the problem is easily solvable by adding a small line of code to lib/textParsers.js. In the init function, just add:

  register(1183, parseStringArray); // time[]

I would do a pull request, but it seems like a lot of trouble for such a simple change.

Should this package try to parse like PostgreSQL, or just what PostgreSQL produces?

Take the boolean parser, for example:

function parseBool (value) {
  if (value === null) return value
  return value === 'TRUE' ||
    value === 't' ||
    value === 'true' ||
    value === 'y' ||
    value === 'yes' ||
    value === 'on' ||
    value === '1'
}

It checks for 'TRUE', 't', 'true', 'y', 'yes', 'on', and '1', and assumes anything else is false. This seems like overkill for interpreting fields from the server to me, since PostgreSQL only sends 't' and 'f' as text representations… but it’s not the right way to parse booleans in general the way PostgreSQL would either, since the latter

  • is case-insensitive ('True' is valid)
  • ignores surrounding whitespace (E' yes\t\n\f\r' is valid)
  • allows prefixes of words ('of', 'tru', 'fals' are valid)

In other words, there are strings PostgreSQL will parse as true that parseBool will parse as false with no warning.

So is 't' → true, 'f' → false, otherwise → throw a good direction? I'll take a look at how it affects the performance of pg at some point.
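For concreteness, the strict variant proposed here might look like this (a sketch, not the current implementation):

function parseBool (value) {
  // PostgreSQL only ever sends 't' or 'f' over the wire for bool columns.
  if (value === 't') return true
  if (value === 'f') return false
  throw new Error('unexpected boolean text representation: ' + value)
}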

How to parse custom types?

We make extensive use of custom types in Postgres, both for sending and receiving data from the DB. I don't see any mention of node-pg-types supporting this, but I wanted to check if I missed it; apologies if I did!

So given an example custom type like this:

create type car as (
   make  text,
   model text,
   style text,
   color text
);

How could I parse a JS object into that PG type? And likewise, how to parse a raw “car” result from PG into a JS object?

For data coming out of PG we can use to_jsonb() easily enough. Data going into PG, though, is more difficult, as we use a lot of PG functions with custom type parameters.
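For the input direction, one approach that avoids client-side serialisation entirely is to let the server assemble the composite from plain parameters, e.g. SELECT my_function(ROW($1, $2, $3, $4)::car) with four ordinary string values (my_function is hypothetical). For the output direction, a hedged sketch of a naive text parser — the OID is a placeholder you'd look up in pg_type, and this ignores quoting, so it breaks on values containing commas, quotes, or parentheses:

var types = require('pg').types

var CAR_OID = 123456 // hypothetical placeholder; find the real OID with:
//   select oid from pg_type where typname = 'car'

// A car arrives as text like '(Toyota,Corolla,sedan,blue)'.
types.setTypeParser(CAR_OID, function(val) {
  var fields = val.slice(1, -1).split(',')
  return { make: fields[0], model: fields[1], style: fields[2], color: fields[3] }
})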

handling hh:mm:ss offsets

For early timestamp with timezones, postgres gives odd offsets, for example:

select timestamp with time zone '0005-02-03 10:00:00.000Z'

gives "0005-02-03 10:53:28+00:53:28" on a server in Germany and "0005-02-03 10:53:28+00:53:28" for a server in the UK. Weird!

I guess these are based on your environment's locally stored timezone file. This isn't part of the spec but postgres does output it, so I guess this project should handle it.

I'll submit a PR in a moment.

Parser for citext[]

Hi!

I am missing a citext[] parser, which can be parsed exactly like text[]. Currently I use a workaround by casting my function result from citext[] to text[], but it is nasty and I don't like it.
I wouldn't mind creating a pull request but I have no clue how to retrieve the type-number.

Thanks a lot,

Jos
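For what it's worth, the type number can be looked up at runtime instead of hard-coded, since pg_type's typarray column holds the OID of each type's array form. A hedged sketch:

var pg = require('pg')
var types = pg.types

var client = new pg.Client()
client.connect(function(err) {
  if (err) throw err
  // typarray on the base type's row is the OID of its array type.
  client.query("SELECT typarray FROM pg_type WHERE typname = 'citext'", function(err, res) {
    if (err) throw err
    // Parse citext[] exactly like text[]: leave each element a string.
    types.setTypeParser(res.rows[0].typarray, function(val) {
      return types.arrayParser.create(val, function(entry) {
        return entry
      }).parse()
    })
    client.end()
  })
})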

should expose arrayParser method

node-postgres-hstore does not natively handle arrays of hstores. I've written my own simple array parsing method, but it requires accessing the unexposed arrayParser method in pg-types (see code below). It would be much better if pg-types could expose arrayParser so I didn't have to depend on the file structure.

var path = require('path');
var pgTypes = require('pg-types');
var pgTypesPath = path.dirname(require.resolve('pg-types'));
var pgArrayParser = require(path.join(pgTypesPath, 'lib', 'arrayParser'));

pgTypes.setTypeParser(config.hstore.oid, hstore.parse);

pgTypes.setTypeParser(config.hstore.arrayOid, function (val) {
  val = String(val);
  if (!val) return null;

  var p = pgArrayParser.create(val, function (entry) {
    if (entry !== null) entry = hstore.parse(entry);
    return entry;
  });

  return p.parse();
});

select count(*) as c from users returns type "string"

First of all thanks for this nice project. I am experiencing an issue where I use count(*) in a query and I get the value "2" returned, which is represented as a Javascript string.

An explicit typecast fixes the problem, but I am afraid that many more of these bugs might occur in the project. As this issue came up overnight, I am unsure where to search for the culprit.

I have written test cases using db.connect -> db.query -> typeof result.rows[0].c === "number", and found out that only count(*) raises this issue. I am using pg version 4.4.1.

Binary array unknown ElementType log vs Error

While going through the code to add linting, I noticed the log line at line 200:

if ((elementType == 0x17) || (elementType == 0x14)) {
  // int/bigint
  result = parseBits(value, length * 8, offset);
  offset += length * 8;
  return result;
} else if (elementType == 0x19) {
  // string
  result = value.toString(this.encoding, offset >> 3, (offset += (length << 3)) >> 3);
  return result;
} else {
  console.log("ERROR: ElementType not implemented: " + elementType);
}
};

Should be a throw new Error(...) right? At the very least something should be returned as now it'd default to returning undefined.
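For concreteness, the final branch would then become something like:

else {
  // Fail loudly instead of silently returning undefined.
  throw new Error('ElementType not implemented: ' + elementType);
}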

Wrong and too strict TS type definitions

TS type definitions were added recently in version 2.2.0.

However, there is already an existing Definitely Typed version of them.

They are different, and the Definitely Typed version is better for us to use. So I would suggest merging them together.

Differences (issues in new type definitions):

  1. Missing TypeParser type for parsing function
  2. Too strict oid: TypeId which does not allow custom DB types which might have any number
  3. Wrong arrayParser type definition that does not respect the implementation (Definitely Typed definition does)

double precision field not parsed correctly, loss of precision

I get data from these fields in PostgreSQL 9.4:

create table bookseller
(
...
	maplatitude double precision,
	maplongitude double precision,
...
);

And I set up node-pg like this:


const {Pool} = require('pg')
const types = require('pg').types

types.setTypeParser(701, (val) => {
  console.log(`val double : ${val}`)
  return parseFloat(val)
})

Data from pg looks like this:

maplatitude : 43.84668200000001
maplongitude : 5.559941999999999

But I get this with node-pg, whatever I do (default type parser, or with the above override):

maplatitude : 43.846682
maplongitude : 5.559942

The result from node-pg:

Result {
  command: 'SELECT',
  rowCount: 1,
  oid: NaN,
  rows: 
   [ anonymous {
     ...
       maplatitude: 43.846682,
       maplongitude: 5.559942 } ],
  fields: 
...
     Field {
       name: 'maplatitude',
       tableID: 724451,
       columnID: 11,
       dataTypeID: 701,
       dataTypeSize: 8,
       dataTypeModifier: -1,
       format: 'text' },
     Field {
       name: 'maplongitude',
       tableID: 724451,
       columnID: 12,
       dataTypeID: 701,
       dataTypeSize: 8,
       dataTypeModifier: -1,
       format: 'text' } ],

I've read countless bug reports and the node-pg docs on this; it seems like it should work correctly. Is this a bug? Or something I'm not doing correctly?

Question: performance impact of types.arrayParser.create vs direct postgres-array.parse

(sorry if this is the wrong venue to ask this question).

First, I am very pleased with the quality and extensibility of node-pg and its suite (coming from the Java world).

Now, while writing some custom type parsers for arrays, I understood that the way to override an array type parser is like this:

Option 1)

// _int8: 1016
types.setTypeParser(1016, function (val: string) {
  return types.arrayParser.create(val, parseInt).parse();
});

Now the thing which confuses me is that types.arrayParser.create creates a new function/"scope" per record cell, which seems like a potential performance issue. There might be some JS optimization that makes it a non-issue, but I would like confirmation.

The other option would be to use the postgres-array directly.

Option 2)

import * as postgresArray from 'postgres-array';

types.setTypeParser(1016, function (val: string) {
  return postgresArray.parse(val); 
});

Is there any performance disadvantage of option 1 over 2, or, given the way JS/V8 optimizes, is the new function/"scope" created per record in option 1 not really an issue?

Timestamp without time zones issue?

Hi Brian

I'm having an issue and I'm not sure if it's on my side or the library.

I'm setting some dates on the client, sending to the server and the server writes to the DB.
Afterwards the client requests the data, and the server fetches this date and returns it (standard flow).

The client is sending the date in ISO format (2014-10-09T18:41:28.772Z) which should be bringing it to time zone 0. The server is setting the value using the same format, the column type is timestamp without time zone (internal code 1114).
When I select this record, the date field seems to be at time zone 0 (hour wise) but the date object gets the time zone of the server.

So in my case, it goes like this.

  • Client (GMT -03) sends 6pm no timezone
  • Server (GMT -03) stores 6pm no timezone
  • Server fetches 6pm at GMT -03 which yields 9pm

I tried overriding the handler for type 1114 and see what I get, the date comes in this format:
2014-10-09 21:01:00

Due to not having time zone, I think the code is going this way:
https://github.com/brianc/node-pg-types/blob/master/lib/textParsers.js#L71

Which implicitly sets the server time zone to that Date object which, in my case at least, is wrong. You should be aware that you're handling a "no time zone" date and set the time zone offset to 0.

Thanks
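A hedged user-land override that pins type 1114 to UTC rather than server-local time (a sketch; whether UTC is the right assumption depends on how the timestamps were written):

var types = require('pg').types

// 1114 = timestamp without time zone; appending 'Z' makes the engine
// read the text as UTC instead of attaching the local offset.
types.setTypeParser(1114, function(val) {
  return val === null ? null : new Date(val.replace(' ', 'T') + 'Z')
})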

Parse array of BigInt

Hi,
I have a column of type bigint[],
but in the response it is converted to an array of strings.

How do I parse it?
This works when the column is of type bigint:

import { types } from "pg"
types.setTypeParser(20, BigInt); 

I tried this, but it gives an error:

import { Client, types } from "pg"
types.setTypeParser(1016, str => types.arrayParser(str,BigInt));
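arrayParser is an object used via its create(...).parse() API rather than called directly, which is likely why the snippet above throws. A hedged sketch:

import { types } from "pg"

// 1016 is the OID of bigint[] (_int8); each entry arrives as a string.
types.setTypeParser(1016, (str) =>
  types.arrayParser.create(str, (entry) =>
    entry === null ? null : BigInt(entry)
  ).parse()
)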

Postgres outputs non-javascript-compatible JSON

knex.raw('select row_to_json(row(1111111111111111111::bigint)) as example;')
  .then(function(res) { console.log(res.rows[0].example); });
// outputs { f1: 1111111111111111200 }

That is, postgres doesn't account for javascript's type limits when outputting its json, and node-pg-types can't use its logic for defaulting those types to string, because it just sees that json is coming down the wire. This affects at least bigints and arbitrary-precision numbers, possibly more things.

What do we do? Can we do anything?
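One partial mitigation (a sketch, not a full fix) is to swap JSON.parse for a big-number-aware parser such as the third-party json-bigint module, so int8 values inside JSON survive instead of being rounded:

var types = require('pg').types
var JSONbig = require('json-bigint')

// 114 = json, 3802 = jsonb; parse with a big-number-aware parser
// instead of the default JSON.parse.
var parseFn = function(val) {
  return val === null ? null : JSONbig.parse(val)
}
types.setTypeParser(114, parseFn)
types.setTypeParser(3802, parseFn)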

Breaking changes in v2?

Hi,

What were the breaking changes introduced in v2? It's not quite clear from the commit history.

Best,
Joel

Numeric fields returned as text

PostgreSQL returns numeric fields as text, instead of binary, so pg-types does not parse numeric fields. The type definition for numeric fields is in lib/binaryParsers.js: register(1700, parseNumeric);. However, since PostgreSQL returns numeric data as text, the parser is not invoked.

Reproducing the problem

My environment

  • PostgreSQL server 9.3.6
  • pg module version 4.3.0
  • pg-types module version 1.7.0

SQL table

CREATE TABLE test (
  id INTEGER,
  cost NUMERIC
);
INSERT INTO test VALUES (1, 10.50);

Node.js code

var pg = require('pg');
pg.connect('postgres://user@localhost/test_db', function(err, client, done) {
  if (err) throw err;
  client.query('SELECT * FROM test;', function(err, result) {
    if (err) throw err;
    console.log(result);
    done();
    process.exit();
  });
});

Output

{ command: 'SELECT',
  rowCount: 1,
  oid: NaN,
  rows: [ { id: 1, cost: '10.50' } ],
  fields: 
   [ { name: 'id',
       tableID: 17536,
       columnID: 1,
       dataTypeID: 23,
       dataTypeSize: 4,
       dataTypeModifier: -1,
       format: 'text' },
     { name: 'cost',
       tableID: 17536,
       columnID: 2,
       dataTypeID: 1700,
       dataTypeSize: -1,
       dataTypeModifier: -1,
       format: 'text' } ],
  _parsers: [ [Function], [Function] ],
  RowCtor: [Function],
  rowAsArray: false,
  _getTypeParser: [Function] }

From the output, the cost column has format 'text' instead of 'binary', so pg-types does not parse it, and returns it as a string (rows: [ { id: 1, cost: '10.50' } ]).

Workaround

Add 2 lines to the script to set a custom parser for numeric fields:

var pg = require('pg');

// Fix for parsing of numeric fields
var types = require('pg').types
types.setTypeParser(1700, 'text', parseFloat);

pg.connect('postgres://user@localhost/test_db', function(err, client, done) {
  if (err) throw err;
  client.query('SELECT * FROM test;', function(err, result) {
    if (err) throw err;
    console.log(result);
    done();
    process.exit();
  });
});

The output is now correct:

{ command: 'SELECT',
  rowCount: 1,
  oid: NaN,
  rows: [ { id: 1, cost: 10.5 } ],
  fields: 
   [ { name: 'id',
       tableID: 17549,
       columnID: 1,
       dataTypeID: 23,
       dataTypeSize: 4,
       dataTypeModifier: -1,
       format: 'text' },
     { name: 'cost',
       tableID: 17549,
       columnID: 2,
       dataTypeID: 1700,
       dataTypeSize: -1,
       dataTypeModifier: -1,
       format: 'text' } ],
  _parsers: [ [Function], [Function: parseFloat] ],
  RowCtor: [Function],
  rowAsArray: false,
  _getTypeParser: [Function] }

Bug Fix

Add this line to lib/textParsers.js:

var init = function(register) {
  ...
  register(1700, parseFloat);
};

ENUM ARRAY types get parsed as single string

Hi,

I found that currently there is an issue with the way pg-types parses ENUM ARRAY column types.
Currently the array is not parsed at all, and the SQL array is returned serialised as a plain string, e.g.:

'{"bar", "baz"}'

will be returned as is, instead of the expected value:

['bar', 'baz']

Here is a piece of code that will reproduce this issue (pg is the only dependency):

'use strict';

const assert = require('assert');
const pg = require('pg');
const util = require('util');

const pool = new (pg.Pool)({
  user: 'postgres',
  password: 'postgres',
  host: 'postgresql',
  database: 'enum_test'
});

Promise
  .resolve()
  .then(() => pool.query(
    `
      CREATE TYPE
        MY_ENUM
      AS ENUM
        (
          'foo',
          'bar',
          'baz'
        );

      CREATE TABLE
        my_table
        (
          id             INTEGER        PRIMARY KEY,
          my_enum        MY_ENUM        NOT NULL,
          my_enum_array  MY_ENUM ARRAY  NOT NULL
        );

      INSERT INTO
        my_table
        (
          id,
          my_enum,
          my_enum_array
        )
      VALUES
        (
          1,
          'foo',
          '{"bar", "baz"}'
        );
    `
  ))
  .then(() => pool.query(
    `
      SELECT
        *
      FROM
        my_table
      WHERE
        id = 1;
    `
  ))
  .then(result => result.rows)
  .then(rows => {
    console.log(util.inspect(rows, { colors: true, depth: 100 }));
    assert.ok(util.isArray(rows));
    assert.strictEqual(rows.length, 1);
    assert.ok(util.isObject(rows[0]));
    assert.strictEqual(rows[0].id, 1);
    assert.strictEqual(rows[0].my_enum, 'foo');
    assert.ok(util.isArray(rows[0].my_enum_array));
    assert.strictEqual(rows[0].my_enum_array.length, 2);
    assert.strictEqual(rows[0].my_enum_array[0], 'bar');
    assert.strictEqual(rows[0].my_enum_array[1], 'baz');
  })
  .catch(error => console.log(error, error.stack));

This code will fail at line 68 with the following AssertionError:

{ AssertionError: false == true
    at Promise.resolve.then.then.then.then.rows (/opt/app/src/pg-enum-array-bug-poc.js:68:12)
    at process._tickCallback (internal/process/next_tick.js:103:7)
  name: 'AssertionError',
  actual: false,
  expected: true,
  operator: '==',
  message: 'false == true',
  generatedMessage: true } 'AssertionError: false == true\n    at Promise.resolve.then.then.then.then.rows (/opt/app/src/pg-enum-array-bug-poc.js:68:12)\n    at process._tickCallback (internal/process/next_tick.js:103:7)'

The value of rows will be:

[ anonymous { id: 1, my_enum: 'foo', my_enum_array: '{bar,baz}' } ]

A temporary fix for this in user land is the following intermediary then() block:

  .then(rows => rows.map(row => {
    row.my_enum_array = pgArray.create(row.my_enum_array, String).parse();
    return row;
  }));

where pgArray is:

const pgArray = require('pg').types.arrayParser;

Is it possible to parse types for sub-documents?

Consider something like:

select
  id
, ( array_to_json( array(
    select row_to_json(user_books) from user_books
    where user_books.user_id = users.id
  ))) as books
from users
where id = 123;

Is it even possible for node-pg-types to detect that a sub-document is included and parse types there as well?

null won't be passed into parser

In readme, the example says:

var types = require('pg').types
types.setTypeParser(20, function(val) {
  //remember: all values returned from the server are either NULL or a string
  return val === null ? null : parseInt(val)
})

I want to make a parser that converts NULL data in the pg JSON type to something else. But testing shows that if a field is NULL, it won't be passed into the parser. I read the code and found in result.js#L78 that the parser is invoked only if the raw data is not NULL.

intervals could be represented more explicitly

The code for parsing intervals explicitly removes properties having zero values. This leads to some confusing cases. Here's my test program "test.js":

var pg = require('pg');
var util = require('util');

var conString = 'postgres://localhost/zag';
var query = process.argv[2];

if (!query) {
        console.error('node test.js QUERY_STRING');
        process.exit(2);
}

pg.connect(conString, function (err, client, done) {
        if (err) {
                console.error('pg.connect: %s', err.message);
                process.exit(1);
        }

        client.query(query, function (err, result) {
                console.error(util.inspect(result, null, 5));
                done();
                process.exit(0);
        });
});

Here's a zero interval:

$ node test.js 'select NOW() - NOW() as my_interval'
{ command: 'SELECT',
  rowCount: 1,
  oid: NaN,
  rows: [ { my_interval: {} } ],
  fields: 
   [ { name: 'my_interval',
       tableID: 0,
       columnID: 0,
       dataTypeID: 1186,
       dataTypeSize: 16,
       dataTypeModifier: -1,
       format: 'text' } ],
  _parsers: [ [Function] ],
  RowCtor: [Function],
  rowAsArray: false,
  _getTypeParser: [Function] }

It seems a little weird that "{}" denotes "0 seconds".

Here's an interval of 1 hour and 5 seconds, which is a little weird because "hours" and "seconds" are both present, but "minutes" isn't:

$ node test.js "select TIMESTAMP '2014-01-26T12:00:00Z' - TIMESTAMP '2014-01-26T10:59:55Z' as my_interval;"
{ command: 'SELECT',
  rowCount: 1,
  oid: NaN,
  rows: [ { my_interval: { hours: 1, seconds: 5 } } ],
  fields: 
   [ { name: 'my_interval',
       tableID: 0,
       columnID: 0,
       dataTypeID: 1186,
       dataTypeSize: 16,
       dataTypeModifier: -1,
       format: 'text' } ],
  _parsers: [ [Function] ],
  RowCtor: [Function],
  rowAsArray: false,
  _getTypeParser: [Function] }

Is that expected? Is the code trying to make the object "look friendly"?

custom parser for type DOMAIN

It seems that either pg or postgres treats a domain type as if it were a regular type: even though it has its own OID, I currently can't write a parser for it.

CREATE DOMAIN custom_id AS bigint;

CREATE TABLE test (
	id custom_id
)

The result object specifies the dataTypeID of bigint instead of custom_id:

fields: [
  Field {
    name: 'id',
    tableID: 6937143,
    columnID: 1,
    dataTypeID: 20, // OID of bigint instead of custom_id
    dataTypeSize: 8,
    dataTypeModifier: -1,
    format: 'text'
  }
],

Is there any way around this?

numeric[] should be parsed via parseStringArray(...) not parseFloatArray(...)

Arrays of the numeric type are registered to be parsed as arrays of floats, whereas the non-array numeric is returned as a string to ensure no loss of precision:

register(1231, parseFloatArray); // _numeric

It should instead return back an array of the string values so the user gets the full value.

This would be a breaking change to anyone previously expecting an array of Numbers.
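For illustration, the user-land equivalent of the proposed change (a sketch using the exposed arrayParser; 1231 is the numeric[] OID):

var types = require('pg').types

// Return numeric[] entries as strings to preserve full precision,
// mirroring how scalar numeric (1700) is already handled.
types.setTypeParser(1231, function(val) {
  return types.arrayParser.create(val, function(entry) {
    return entry
  }).parse()
})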

Strange array representation before parsing

Hi there.

I have several arrays of types _int4 and _int8. When updating the type parser for _int8 I noticed that the db returns such a representation for this column: [0:3]={2,4,3,1}.
Have you seen something like this?

I am using AWS RDS 10.4

array_agg returns array as a string

SELECT
  c1.relname "tableName",
  c2.relname "indexName",
  i1.indisunique "indexIsUnique",
  array_agg(a1.attname) "columnNames"
FROM
  pg_class c1,
  pg_class c2,
  pg_index i1,
  pg_attribute a1
WHERE
  c1.oid = i1.indrelid
  AND c2.oid = i1.indexrelid
  AND a1.attrelid = c1.oid
  AND a1.attnum = ANY(i1.indkey)
  AND c1.relkind IN ('r', 'm')
GROUP BY
  c1.relname,
  c2.relname,
  i1.indisunique
ORDER BY
  c1.relname,
  c2.relname

Expecting columnNames to return an array.

Returns {foo,bar} instead of the expected array.

I couldn't figure out how to configure setTypeParser to capture this expression.
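For what it's worth, pg_attribute.attname has type name, so array_agg(a1.attname) comes back as name[] (OID 1003 in a stock catalog), which has no registered parser. A hedged sketch:

var types = require('pg').types

// 1003 is the OID of name[] (_name); split it like a plain text array.
types.setTypeParser(1003, function(val) {
  return types.arrayParser.create(val, function(entry) {
    return entry
  }).parse()
})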

Timestamptz, same point in time but different representation for same query with 'set local time zone'

I have an issue with the representation of timestamps with time zone in my web app, caused by a query result from PostgreSQL. I use pg-promise, which uses node-postgres and this library under the hood. I have code that reproduces the issue in a question on StackOverflow. I make use of SQL's set local time zone, and I explain there how I apply it. For simplicity, I just link to the question page:
Timestamptz issue on StackOverflow.

Do not return DATE fields as Javascript Date

A Javascript Date is always parsed in local timezone. A DATE column in Postgres or any other DB doesn't have any timezone associated with it, it is just a date. It begins and ends differently depending on where you are. Parsing this as a JS Date that has timezone associated with it is a huge problem. Let's say your server is located in GMT+2 (Germany). You insert a date into your database, like 2016-08-26. When you query that in Germany, you will get the Date object Fri Aug 26 2016 00:00:00 GMT+0200. But run toUTCString() on it, and you get Thu, 25 Aug 2016 22:00:00 GMT! The date is one day off. This imposes a huge problem for distributed applications.

TL;DR JavaScript objects are not suited to represent a DATE field, because they have a timezone and time associated, while a DATE has not.

postgres-date should not be used to parse it and instead, the string should be returned (just like with a DECIMAL). Users can register their own type parser with their own DateOnly object if they want.

This will be a breaking change and we would like to depend on this in Sequelize 4.

See sequelize/sequelize#4858

Doc- types.builtins.NUMERIC usage

One example in the readme.md is
types.setTypeParser(types.builtins.TIMESTAMPTZ, parseFn)

This approach
var types = require('pg').types;
types.setTypeParser(types.builtins.NUMERIC, function(val) { return parseFloat(val); });

shows an error
Cannot read property 'NUMERIC' of undefined
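types.builtins likely only ships with pg-types 2.x (bundled with newer pg releases), which would explain the undefined property on older installs. Falling back to the raw OID sidesteps the problem (a sketch):

var types = require('pg').types

// NUMERIC's OID is 1700; using the number directly works even where
// types.builtins is not available.
types.setTypeParser(1700, function(val) {
  return parseFloat(val)
})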
