
hdb - Pure JavaScript SAP HANA Database Client


Install

Install from npm:

npm install hdb

or clone from the GitHub repository to run tests and examples locally:

git clone https://github.com/SAP/node-hdb.git
cd node-hdb
npm install

SAP Support for hdb and @sap/hana-client

The hdb and @sap/hana-client Node.js SAP HANA client drivers are both supported by SAP for connecting to SAP HANA Cloud and SAP HANA Platform servers. When starting a new project, you are encouraged to use the fully featured @sap/hana-client driver (documentation).

npm install @sap/hana-client

Below is a major feature comparison chart between the two drivers:

| Feature | @sap/hana-client | hdb |
| --- | --- | --- |
| Connectivity to SAP HANA Cloud | ✔️ | ✔️ |
| Connectivity to SAP HANA as a Service | ✔️ | ✔️ |
| Connectivity to SAP HANA Platform | ✔️ | ✔️ |
| Transport Layer Security (TLS) | ✔️ | ✔️ |
| Automatic Reconnect (Transparent Session Recovery) | ✔️ | |
| Active-Active Read Enabled | ✔️ | |
| Connection Pooling (Implicit and Explicit) | ✔️ | |
| Client-Side Data Encryption | ✔️ | |
| Database Redirection | ✔️ | |
| Statement Distribution | ✔️ | |
| Password/PBKDF2 Authentication | ✔️ | ✔️ |
| SAML Authentication | ✔️ | ✔️ |
| JWT Authentication | ✔️ | ✔️ |
| LDAP Authentication | ✔️ | ✔️ |
| Kerberos Authentication | ✔️ | |
| X.509 Authentication | ✔️ | |
| Secure User Store Integration (hdbuserstore) | ✔️ | |
| Connections through HTTP proxy | ✔️ | |
| Connections through SOCKS proxy (SAP Cloud Connector) | ✔️ | |
| Network Compression | ✔️ | |
| Network Packet Size | ✔️ | |
| Network Poll before Send | ✔️ | |
| Advanced Tracing via external utility or environment variables | ✔️ | |
| Tracing via environment variable to a file | ✔️ | ✔️ |
| Promise support | ✔️ | |
| TypeScript support | ✔️ | |
| Pure JavaScript package | | ✔️ |
| Security Provider Support | SAP CommonCryptoLib, OpenSSL, MSCrypto | OpenSSL |
| Node.js major version support | See SAP Note 3165810 | All Supported Versions |
| License (without alternate SAP license agreement) | SAP Developer Agreement | Apache 2.0 |
| SAP Support (with SAP Support agreement) | Component HAN-DB-CLI | Component HAN-DB-CLI |
| Community Support | answers.sap.com HANA tag | node-hdb/issues |

The hdb driver may also have different APIs for, or lack support of, SAP HANA server features that @sap/hana-client fully supports. APIs that exist in both drivers may behave differently.

Getting started

If you do not have access to an SAP HANA server, go to the SAP HANA Developer Center and choose one of the options to use SAP HANA Express or deploy a new SAP HANA Cloud server.

This is a very simple example showing how to use this module:

var hdb    = require('hdb');
var client = hdb.createClient({
  host     : 'hostname',
  port     : 30015,
  user     : 'user',
  password : 'secret'
});
client.on('error', function (err) {
  console.error('Network connection error', err);
});
client.connect(function (err) {
  if (err) {
    return console.error('Connect error', err);
  }
  client.exec('select * from DUMMY', function (err, rows) {
    client.end();
    if (err) {
      return console.error('Execute error:', err);
    }
    console.log('Results:', rows);
  });
});

Establish a database connection

The first step in establishing a database connection is to create a client object. It is recommended to pass all required connect options such as host, port, user and password to the createClient function; they are used as defaults for subsequent connect calls on the created client instance. Options beginning with the prefix "SESSIONVARIABLE:" set session-specific client information at connect time (see the example setting EXAMPLEKEY=EXAMPLEVALUE below). To handle network connection errors such as a connection timeout or a database restart, you should register an error event handler. If no 'error' listener is registered, an emitted error is thrown, following standard Node.js EventEmitter behaviour.

var hdb    = require('hdb');
var client = hdb.createClient({
  host     : 'hostname',
  port     : 30015,
  user     : 'user',
  password : 'secret',
  'SESSIONVARIABLE:EXAMPLEKEY' : 'EXAMPLEVALUE'
});
client.on('error', function (err) {
  console.error('Network connection error', err);
});
console.log(client.readyState); // new

When a client instance is created it does not immediately open a network connection to the database host. Initially, the client is in a 'new' state. When you call connect for the first time, two things are done internally:

  1. A network connection is established and the communication is initialized (protocol and product version exchange). The connection is now ready for exchanging messages, but since no user session is established yet the client is in a 'disconnected' state. This step is skipped if the client is already in a 'disconnected' state.

  2. The authentication process is initiated. After a successful user authentication a database session is established and the client is in a 'connected' state. If authentication fails, the client remains in a 'disconnected' state.

client.connect(function (err) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log(client.readyState); // connected
});

If user and password are specified they will override the defaults of the client. It is possible to disconnect and reconnect with a different user on the same client instance and the same network connection.

The client also supports SAP HANA systems installed in multiple-container (MDC) mode. In this case a single SAP HANA system may contain several isolated tenant databases, each identified by its name. One of the databases in an MDC setup is the system database, which is used for central system administration. You can connect to a specific tenant database directly via its host and SQL port (as shown in the example above), or via the system database, which can look up the exact host and port of a tenant database by name.

var hdb    = require('hdb');
var client = hdb.createClient({
  host         : 'hostname', // system database host
  port         : 30013,      // system database port
  databaseName : 'DB1',      // name of a particular tenant database
  user         : 'user',     // user for the tenant database
  password     : 'secret'    // password for the user specified
});

The client also accepts an instance number instead of the port of the system database:

var hdb    = require('hdb');
var client = hdb.createClient({
  host           : 'hostname', // system database host
  instanceNumber : '00',       // instance number of the HANA system
  databaseName   : 'DB1',      // name of a particular tenant database
  user           : 'user',     // user for the tenant database
  password       : 'secret'    // password for the user specified
});

Multiple hosts can be provided to the client as well:

var hdb    = require('hdb');
var client = hdb.createClient({
  hosts : [ { host: 'host1', port: 30015 }, { host: 'host2', port: 30015 } ],
  user     : 'user',
  password : 'secret'
});

This is suitable for multiple-host SAP HANA systems which are distributed over several hosts. The client establishes a connection to the first available host from the list.
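The first-available-host selection can be pictured as trying each entry in order until one connect attempt succeeds. The helper below is a hypothetical illustration only (hdb implements this internally); the synchronous `dial` function stands in for a real network connect:

```javascript
// Hypothetical helper illustrating first-available-host selection; hdb does
// this internally. "dial" stands in for a real connect attempt.
function firstAvailableHost(hosts, dial) {
  var lastError = new Error('no hosts configured');
  for (var i = 0; i < hosts.length; i++) {
    try {
      return dial(hosts[i]); // first host that connects wins
    } catch (err) {
      lastError = err;       // remember the failure, try the next host
    }
  }
  throw lastError;           // every host failed
}

// Mock dial: pretend only host2 is reachable.
function mockDial(hostInfo) {
  if (hostInfo.host !== 'host2') {
    throw new Error(hostInfo.host + ' unreachable');
  }
  return 'connected to ' + hostInfo.host + ':' + hostInfo.port;
}

var result = firstAvailableHost(
  [{ host: 'host1', port: 30015 }, { host: 'host2', port: 30015 }],
  mockDial
);
console.log(result); // prints: connected to host2:30015
```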

Authentication mechanisms

Details about the different authentication methods can be found in the SAP HANA Security Guide.

User / Password

Users authenticate themselves with their database user and password.

SAML assertion

SAML bearer assertions as well as unsolicited SAML responses that include an unencrypted SAML assertion can be used to authenticate users. SAML assertions and responses must be signed using XML signatures. XML Digital signatures can be created with xml-crypto or xml-dsig.

Instead of user and password you have to provide a SAML assertion:

client.connect({
  assertion: '<Assertion xmlns="urn:oasis:names:tc:SAML:2.0:assertion" ...>...</Assertion>'
},function (err) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('User:', client.get('user'));
  console.log('SessionCookie:', client.get('SessionCookie'));
});

After a successful SAML authentication, the server returns the database user and a SessionCookie which can be used for reconnecting.

JWT token

JWT tokens can also be used to authenticate users.

Instead of user and password you have to provide a JWT token:

client.connect({
  token: 'eyJhbGciOiJSUzI1NiJ9....'
},function (err) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('User:', client.get('user'));
  console.log('SessionCookie:', client.get('SessionCookie'));
});

After a successful JWT authentication, the server returns the database user and a SessionCookie which can be used for reconnecting.

Encrypted network communication

To establish an encrypted database connection just pass either key, cert and ca or a pfx to createClient.

var client = hdb.createClient({
  host : 'hostname',
  port : 30015,
  key  : fs.readFileSync('client-key.pem'),
  cert : fs.readFileSync('client-cert.pem'),
  ca   : [fs.readFileSync('trusted-cert.pem')],
  ...
});

Use the useTLS option if you would like to connect to SAP HANA using Node.js's trusted certificates.

var client = hdb.createClient({
  host : 'hostname',
  port : 30015,
  useTLS: true,
  ...
});

Note for MDC use cases: The system database and the target tenant database may be configured to work with different certificates. If so, make sure to include all the necessary TLS-related properties for both the databases in the client's options.

In case you need custom logic to validate the server's hostname against the certificate, you can assign a callback function to the checkServerIdentity property, alongside the other connection options. The callback is supplied to the tls.connect function of the TLS API and should conform to the signature described there.

Controlling the Maximum Packet Size

By default, the node-hdb driver restricts the size of outgoing packets to 128KB. Attempting to execute SQL statements larger than this limit results in an error, while large object parameters (LOBs) exceeding the limit are broken up and sent to the server in multiple packets. This limit is configurable via the packetSize and packetSizeLimit connect options. Node-hdb never allocates outgoing packets larger than packetSizeLimit and, where possible, restricts the packet size further to packetSize (by breaking up LOB parameters into multiple packets).

For example, the following configuration will allow packets up to 2^20 bytes (1MB) if necessary, but will still restrict LOB data to 2^17 byte (128KB) packets:

var client = hdb.createClient({
  host : 'hostname',
  port : 30015,
  packetSize : Math.pow(2, 17),
  packetSizeLimit : Math.pow(2, 20),
  ...
});

If not set, the value of packetSize defaults to 131072 (128KB) and packetSizeLimit defaults to packetSize. Values for packetSize may range from 65536 (64KB) to 1073741823 (1GB-1). Values for packetSizeLimit may range from packetSize to 1073741823 (1GB-1).
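The defaulting and range rules above can be sketched as a small validation function. This is a hypothetical illustration (`resolvePacketOptions` is not part of the hdb API; the real validation lives inside the driver):

```javascript
// Hypothetical sketch of the packetSize/packetSizeLimit defaulting rules
// described above; not the driver's actual code.
var MIN_PACKET_SIZE = 65536;       // 64KB lower bound for packetSize
var MAX_PACKET_SIZE = 1073741823;  // 1GB-1 upper bound for both options

function resolvePacketOptions(options) {
  options = options || {};
  var packetSize = options.packetSize || 131072; // default 128KB
  if (packetSize < MIN_PACKET_SIZE || packetSize > MAX_PACKET_SIZE) {
    throw new Error('packetSize out of range');
  }
  // packetSizeLimit defaults to packetSize and may not be smaller than it.
  var packetSizeLimit = options.packetSizeLimit || packetSize;
  if (packetSizeLimit < packetSize || packetSizeLimit > MAX_PACKET_SIZE) {
    throw new Error('packetSizeLimit out of range');
  }
  return { packetSize: packetSize, packetSizeLimit: packetSizeLimit };
}

var resolved = resolvePacketOptions({});
console.log(resolved); // { packetSize: 131072, packetSizeLimit: 131072 }
```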

Direct Statement Execution

Direct statement execution is the simplest way to execute SQL statements. The only input parameter is the SQL command to be executed. Generally, statement execution results are returned using callbacks. The type of returned result depends on the kind of statement.

DDL Statement

In the case of a DDL statement nothing is returned:

client.exec('create table TEST.NUMBERS (a int, b varchar(16))', function (err) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Table TEST.NUMBERS has been created');
});

DML Statement

In the case of a DML Statement the number of affectedRows is returned:

client.exec('insert into TEST.NUMBERS values (1, \'one\')', function (err, affectedRows) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Number of affected rows:', affectedRows);
});

Query

The exec function is a convenient way to completely retrieve the result of a query. In this case all selected rows are fetched and returned in the callback. The resultSet is automatically closed and all Lobs are completely read and returned as buffer objects. If streaming of the results is required you will have to use the execute function. This is described in section Streaming results:

client.exec('select A, B from TEST.NUMBERS order by A', function(err, rows) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Rows:', rows);
});

Different Representations of Query Results

The default representation of a single row is an object where the property names are the columnDisplayNames of the resultSetMetadata:

var command = 'select top 1 * from t1';
client.exec(command, function(err, rows) {
  /* rows will be an array like this:
  [{
    ID: 1,
    A: 't1.1.a',
    B: 't1.1.b'
  }]
  */
});

If your SQL statement is a join with overlapping column names, you may want to get separate objects for each table per row. This is possible if you set option nestTables to TRUE:

var command = 'select top 1 * from t1 join t2 on t1.id = t2.id';
var options = {
  nestTables: true
};
client.exec(command, options, function(err, rows) {
  /* rows will be an array like this now:
  [{
    T1: {
      ID: 1,
      A: 't1.1.a',
      B: 't1.1.b',
    },
    T2: {
      ID: 1,
      A: 't2.1.a',
      B: 't2.1.b',
    },
  }]
  */
});

It is also possible to return all rows as an array where the order of the column values is exactly the same as in the resultSetMetadata. In this case you have to set the option rowsAsArray to TRUE:

var command = 'select top 1 * from t1 join t2 on t1.id = t2.id';
var options = {
  rowsAsArray: true
};
client.exec(command, options, function(err, rows) {
  /* rows will be an array like this now:
  [[
    1,
    't1.1.a',
    't1.1.b',
    1,
    't2.1.a',
    't2.1.b'
  ]]
  */
});
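The relationship between the two representations can be sketched by zipping an array row with the column display names from the result set metadata. This is a hypothetical helper for illustration only (the column names are assumed, and `rowToObject` is not part of the hdb API):

```javascript
// Hypothetical helper relating a rowsAsArray row to the default object
// representation: values are zipped with the column display names.
function rowToObject(columnNames, arrayRow) {
  var row = {};
  columnNames.forEach(function (name, index) {
    row[name] = arrayRow[index];
  });
  return row;
}

var columnNames = ['ID', 'A', 'B']; // assumed metadata for illustration
var objectRow = rowToObject(columnNames, [1, 't1.1.a', 't1.1.b']);
console.log(objectRow); // { ID: 1, A: 't1.1.a', B: 't1.1.b' }
```

Note that for a join with overlapping column names, duplicate keys would collide in the object form; that is precisely the situation the nestTables option addresses.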

Prepared Statement Execution

Prepare a Statement

The client returns a statement object which can be executed multiple times:

client.prepare('select * from DUMMY where DUMMY = ?', function (err, statement){
  if (err) {
    return console.error('Error:', err);
  }
  // do something with the statement
  console.log('StatementId', statement.id);
});

Execute a Statement

The execution of a prepared statement is similar to direct statement execution on the client. The difference is that the first parameter of the exec function is an array of positional parameters. In the case of named parameters it can also be a parameters object:

statement.exec(['X'], function (err, rows) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Rows:', rows);
});

If you use the execute function instead of the exec function the resultSet is returned in the callback like in the direct query execution above.
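Named-parameter binding can be pictured as mapping the object's properties onto the positional order that the prepared statement expects. The helper below is a hypothetical sketch (the parameter names 'A' and 'B' are assumed for illustration; the real mapping is done by the driver from the statement's parameter metadata):

```javascript
// Hypothetical sketch of named-parameter binding; the driver derives the
// positional order from the prepared statement's parameter metadata.
function toPositionalParameters(parameterNames, namedValues) {
  return parameterNames.map(function (name) {
    if (!(name in namedValues)) {
      throw new Error('missing value for parameter ' + name);
    }
    return namedValues[name];
  });
}

// For a statement whose parameters are A and B, in that order:
var positional = toPositionalParameters(['A', 'B'], { B: 4, A: 3 });
console.log(positional); // [ 3, 4 ]
```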

Calling Stored Procedures

If you have a stored procedure similar to the following example:

create procedure PROC_DUMMY (in a int, in b int, out c int, out d DUMMY, out e TABLES)
  language sqlscript
  reads sql data as
  begin
    c := :a + :b;
    d = select * from DUMMY;
    e = select * from TABLES;
  end

You can call it via a prepared statement. The second argument is always an object with the scalar parameters. If there are no scalar parameters, an empty object {} is returned. The following arguments are the resultSets:

client.prepare('call PROC_DUMMY (?, ?, ?, ?, ?)', function(err, statement){
  if (err) {
    return console.error('Prepare error:', err);
  }
  statement.exec({
    A: 3,
    B: 4
  }, function(err, parameters, dummyRows, tableRows) {
    if (err) {
      return console.error('Exec error:', err);
    }
    console.log('Parameters:', parameters);
    console.log('Dummies:', dummyRows);
    console.log('Tables:', tableRows);
  });
});

Note: Default values for stored procedures are not supported.

Drop Statement

To drop the statement simply call:

statement.drop(function(err){
  if (err) {
    return console.error('Drop error:', err);
  }
  console.log('Statement dropped');
});

The callback is optional in this case.

Using Datetime types

If you want to use DATETIME types in a prepared statement, be aware that strings like '14.04.2016 12:41:11.215' are not processed by the SAP HANA Database but by the node-hdb module. Therefore, you must use the exact required format that would be returned by a selection made with this module. The formats are:

TIME: '13:32:20'
DATE: '2016-04-14'
TIMESTAMP: '2016-04-14T13:32:20.737'
SECONDDATE: '2016-04-14T13:32:20'

Another possibility is to use the functions TO_DATE, TO_DATS, TO_TIME and TO_TIMESTAMP in your SQL statement to convert your string to a valid DATETIME type.
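Producing the exact TIMESTAMP format above from a JavaScript Date can be sketched as follows. These helpers are hypothetical (not part of the hdb API); UTC accessors are used here to keep the example deterministic, so adjust to local time as your application requires:

```javascript
// Hypothetical formatting helper producing the TIMESTAMP format listed
// above ('YYYY-MM-DDTHH:mm:ss.SSS'); not part of the hdb API.
function pad(value, width) {
  return String(value).padStart(width, '0');
}

function toHanaTimestamp(date) {
  return date.getUTCFullYear() +
    '-' + pad(date.getUTCMonth() + 1, 2) +  // months are 0-based in JS
    '-' + pad(date.getUTCDate(), 2) +
    'T' + pad(date.getUTCHours(), 2) +
    ':' + pad(date.getUTCMinutes(), 2) +
    ':' + pad(date.getUTCSeconds(), 2) +
    '.' + pad(date.getUTCMilliseconds(), 3);
}

var ts = toHanaTimestamp(new Date(Date.UTC(2016, 3, 14, 13, 32, 20, 737)));
console.log(ts); // 2016-04-14T13:32:20.737
```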

Bulk Insert

If you want to insert multiple rows with a single execution, you just have to provide all parameters as an array of rows, for example:

client.prepare('insert into TEST.NUMBERS values (?, ?)', function(err, statement){
  if (err) {
    return console.error('Prepare error:', err);
  }
  statement.exec([[1, 'one'], [2, 'two'], [3, 'three']], function(err, affectedRows) {
    if (err) {
      return console.error('Exec error:', err);
    }
    console.log('Array of affected rows:', affectedRows);
  });
});

For further details, see: app9.
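Since an error in a bulk insert can be hard to trace back to a specific row, a simple pre-check of the row shapes can help localize the problem before the statement is sent. This is a hypothetical helper, not part of the hdb API:

```javascript
// Hypothetical pre-check before a bulk exec; not part of the hdb API.
// Verifies that every row is an array with the expected number of values,
// so a malformed row is reported with its index instead of failing deep
// inside the driver.
function checkRowShapes(rows, expectedColumns) {
  rows.forEach(function (row, index) {
    if (!Array.isArray(row) || row.length !== expectedColumns) {
      throw new Error('row ' + index + ': expected ' + expectedColumns +
        ' values, got ' + (Array.isArray(row) ? row.length : typeof row));
    }
  });
  return rows.length;
}

var checked = checkRowShapes([[1, 'one'], [2, 'two'], [3, 'three']], 2);
console.log(checked, 'rows look consistent'); // prints: 3 rows look consistent
```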

Streaming results

If you use the execute function of the client or statement instead of the exec function, a resultSet object is returned in the callback instead of an array of all rows. The resultSet object allows you to create an object-based or array-based stream of rows which can be piped to a writer object. Don't forget to close the resultSet if you use the execute function:

client.execute('select A, B from TEST.NUMBERS order by A', function(err, rs) {
  if (err) {
    return console.error('Error:', err);
  }
  rs.setFetchSize(2048);
  rs.createObjectStream()
    .pipe(new MyWriteStream())
    .on('finish', function (){
      if (!rs.closed) {
       rs.close();
      }
    });
});

For further details, see app4.

Transaction handling

The default behavior is that each statement is automatically committed. If you want to manually control commit and rollback of a transaction, you can do this by calling setAutoCommit(false) on the client object:

function execTransaction(cb) {
  client.setAutoCommit(false);
  async.series([
    client.exec.bind(client, "insert into NUMBERS values (1, 'one')"),
    client.exec.bind(client, "insert into NUMBERS values (2, 'two')")
  ], function (err) {
    if (err) {
      client.rollback(function(err){
        if (err) {
          err.code = 'EROLLBACK';
          return cb(err);
        }
        cb(null, false);
      });
    } else {
      client.commit(function(commitError){
        if (commitError) {
          commitError.code = 'ECOMMIT';
          return cb(commitError);
        }
        cb(null, true);
      });
    }
    client.setAutoCommit(true);
  });
}

execTransaction(function(err, ok){
  if (err) {
    return console.error('Commit or Rollback error', err);
  }
  if (ok) {
    console.log('Committed');
  } else {
    console.log('Rolled back');
  }
})

For further details, see: tx1.

Streaming Large Objects

Read Streams

Reading a large object as a stream is possible if you use the execute method of the client or statement. In this case a Lob object is returned for every LOB column. You can call createReadStream to create a readable stream, or read to read the LOB completely.

Write Streams

Writing large objects is handled automatically. You just have to pass a Readable instance or a buffer object as the parameter value.

For further details, see: app7.

CESU-8 encoding support

The SAP HANA server connectivity protocol uses CESU-8 encoding. Node.js does not support CESU-8 natively, so by default the driver converts all text, including SQL statements, to CESU-8 in the JavaScript layer.

Due to the fact that Node.js has built-in support for UTF-8, using UTF-8 in the HDB drivers can lead to performance gains especially for large text data. If you are sure that your data contains only BMP characters, you can disable CESU-8 conversion by setting a flag in the client configuration.

createClient accepts the useCesu8 parameter; set it to false to disable CESU-8 conversion. Here is how to provide the configuration:

var hdb    = require('hdb');
var client = hdb.createClient({
  host     : 'hostname',
  port     : 30015,
  user     : 'user',
  password : 'secret',
  useCesu8 : false
});

This setting is per client and cannot be changed later.

Note: Using CESU-8 brings performance penalties proportionate to the text size that has to be converted.
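The difference between CESU-8 and UTF-8 only appears for characters outside the BMP: CESU-8 encodes each UTF-16 code unit separately, so a supplementary character becomes two 3-byte sequences instead of one 4-byte UTF-8 sequence. The encoder below is a minimal sketch for illustration only (the driver has its own implementation):

```javascript
// Minimal CESU-8 encoder sketch (illustration only). Each UTF-16 code unit,
// including surrogates, is encoded on its own -- that is the CESU-8 rule.
function cesu8Encode(text) {
  var bytes = [];
  for (var i = 0; i < text.length; i++) {
    var unit = text.charCodeAt(i); // one UTF-16 code unit, 0x0000-0xFFFF
    if (unit < 0x80) {
      bytes.push(unit);
    } else if (unit < 0x800) {
      bytes.push(0xC0 | (unit >> 6), 0x80 | (unit & 0x3F));
    } else {
      // Surrogate halves land here too, yielding 3 bytes each.
      bytes.push(0xE0 | (unit >> 12),
                 0x80 | ((unit >> 6) & 0x3F),
                 0x80 | (unit & 0x3F));
    }
  }
  return Buffer.from(bytes);
}

// U+1F600 takes 4 bytes in UTF-8 but 6 bytes in CESU-8.
console.log(Buffer.from('\u{1F600}', 'utf8').length); // 4
console.log(cesu8Encode('\u{1F600}').length);         // 6
```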

TCP Keepalive

To configure TCP keepalive behaviour, include the tcpKeepAliveIdle connect option. The value provided for this option is the number of seconds before an idle connection will begin sending keepalive packets. By default, TCP keepalive will be turned on with a value of 200 seconds. If a value of 0 is specified, keepalive behaviour is determined by the operating system. The following example creates a client whose connections will begin sending keepalive packets after 300 seconds.

var hdb    = require('hdb');
var client = hdb.createClient({
  host             : 'hostname',
  port             : 30015,
  user             : 'user',
  password         : 'secret',
  tcpKeepAliveIdle : 300
});

TCP keepalive can be explicitly disabled by specifying tcpKeepAliveIdle=false, as in the example below.

var hdb    = require('hdb');
var client = hdb.createClient({
  host             : 'hostname',
  port             : 30015,
  user             : 'user',
  password         : 'secret',
  tcpKeepAliveIdle : false
});

Setting Session-Specific Client Information

The client information is a list of session variables (case-sensitive property-value pairs) that an application can set on a client object. These variables can be set at connect time via "SESSIONVARIABLE:"-prefixed options, or later by using the setClientInfo method to set a single property-value pair.

var hdb    = require('hdb');
var client = hdb.createClient({
  host             : 'hostname',
  port             : 30015,
  user             : 'user',
  password         : 'secret',
  "SESSIONVARIABLE:EXAMPLEKEY1" : "EXAMPLEVALUE1"
});
client.setClientInfo("EXAMPLEKEY2", "EXAMPLEVALUE2");

Session variables set via the setClientInfo method will be sent to the server during the next execute, prepare, or fetch operation.

Running tests

To run the unit tests for hdb simply run:

make test-unit

To run the unit tests as well as acceptance tests for hdb you have to run:

make test

For the acceptance tests a database connection has to be established. Copy the configuration template config.tpl.json in the test/db folder to config.json and change the connection data to match your system. If the config.json file does not exist, a local mock server is started instead.

Running examples

For any examples you need a valid config.json in the test/db folder.

  • app1: Simple query.
  • app2: Fetch rows from ResultSet.
  • app3: Streaming rows createObjectStream().
  • app4: Pipe row into JSON-Transform and to stdout.
  • app5: Stream from the filesystem into a db table.
  • app6: Stream from a db table into the filesystem.
  • app7: Insert a row with a large image into a db table (uses WriteLobRequest and Transaction internally).
  • app8: Automatic reconnect when network connection is lost.
  • app9: Insert multiple rows with large images into a db table as one batch.
  • app10: Usage example of query option nestTables.
  • call1: Call stored procedure.
  • call2: Call stored procedure with lob input and output parameter.
  • call3: Call stored procedure with table as input parameter.
  • tx1: Transaction handling (shows how to use commit and rollback).
  • csv: Stream a db table into csv file.
  • server: Stream rows into http response http://localhost:1337/{schema}/{tablename}?top={top}

To run the first example:

node examples/app1

Licensing

Copyright 2013-2021 SAP SE or an SAP affiliate company and node-hdb contributors. Please see our LICENSE for copyright and license information. Detailed information including third-party components and their licensing/copyright information is available via the REUSE tool.

node-hdb's People

Contributors

abap-weasel, alexpenev-s, bobdenos, bridgear, clncln1, cplussharp, dotchev, gvachkov, holgerkoser, ianmccurdy, jeffalbion, johannes-vogel, larshp, michaelsp, mikelr, petar-iv, pseitz, rjayasinghe, sebastianwolf-sap, simsso


node-hdb's Issues

Cannot read property 'createTask' of undefined when closing connection early

I'm using async.waterfall similar to your examples to create client -> connect -> execute -> disconnect.

In the execution function, before the actual execution, there is a timer for 10 sec. If the timer hits before the query returns, the callback is called to the client.disconnect.

Shortly after the error for the 10 sec limit, an uncaught error is emitted from the HDB driver and crashes the server.

The error and code below.

Any workaround or way to catch the error would be much appreciated.

Thanks!


TypeError: Cannot read property 'createTask' of undefined
    at Connection.enqueue (/data/rootdir/home/i827087/Projects/Hanamon/node_modules/hdb/lib/protocol/Connection.js:366:30)
    at Connection.fetchNext (/data/rootdir/home/i827087/Projects/Hanamon/node_modules/hdb/lib/protocol/Connection.js:487:8)
    at ResultSet.sendFetch (/data/rootdir/home/i827087/Projects/Hanamon/node_modules/hdb/lib/protocol/ResultSet.js:282:20)
    at ResultSet.resume (/data/rootdir/home/i827087/Projects/Hanamon/node_modules/hdb/lib/protocol/ResultSet.js:130:15)
    at Readable._read (/data/rootdir/home/i827087/Projects/Hanamon/node_modules/hdb/lib/util/index.js:224:8)
    at Readable.read (_stream_readable.js:348:10)
    at flow (_stream_readable.js:761:34)
    at ResultSetTransform.<anonymous> (_stream_readable.js:623:7)
    at emitNone (events.js:86:13)
    at ResultSetTransform.emit (events.js:185:7)

async.waterfall([

    // Create client connection
    (callbackWF) => {

        var client = hdb.createClient({

            host: host,
            port: port,
            user: user,
            password: password

        });

        callbackWF(null, client);

    },

    // Connect the client
    (client, callbackWF) => {

        client.connect( (err) => {

            if (err) {

                console.log("Error while connecting to client, " + name +  ". " + err);
                return callbackWF(1, client);

            }

                return callbackWF(null, client);

        });
    },

    // Get data for table.
    (client, callbackWF) => {

        client.on('error', function (err) {

            console.error('HANA HDB Error: ', err);

        });

        var timer = setTimeout(function() {

            console.log(name + " " + host + " did not return statement within 10,000ms. Statement: " + select);
            callbackWF(1, client);
            return clearTimeout(timer);

        }, 10000);

        client.exec(select, {rowsAsArray: true}, (err, rows) => {

            clearTimeout(timer);

            if (err) {

                console.log("Error while getting data from " + name +  " table: " + table + ". " + err + select);
                return callbackWF(1, client);

            } else {

                if (rows.length == 0) {
                    
                    client.exec(<another select>, {rowsAsArray: true}, (err, emptyRows) => {

                        if (err) {

                            console.log("Error sending empty tables: " + err);

                        } else {

                            <Do stuff with data>

                        }

                    });

                }


                <Do stuff with data>

            }

                callbackWF(null, client);

        });

    }

], (err, client) => {

    client.disconnect();

});

Connection option DATAFORMATVERSION2=1

Currently DATAFORMATVERSION2=1
https://github.com/SAP/node-hdb/blob/master/lib/protocol/part/ConnectOptions.js#L115

  // The client indicates this set of understood type codes and field formats.
  // The server then defines the value according to its own capabilities,
  // and sends it back. The following values are supported:
  // *  `1` Baseline data type support (SAP HANA SPS 02)
  // *  `4` Baseline data type support (SAP HANA SPS 06)
  //        Support for ALPHANUM, TEXT, SHORTTEXT, LONGDATE, SECONDDATE,
  //        DAYDATE, and SECONDTIME.
  this.dataFormatVersion = DataFormatVersion.COMPLETE_DATATYPE_SUPPORT;
  this.dataFormatVersion2 = DataFormatVersion.COMPLETE_DATATYPE_SUPPORT;

Here DataFormatVersion.COMPLETE_DATATYPE_SUPPORT = 1

This results in some data types being mapped to other data types, e.g.
SQL: create table TEST_TYPES(SECDT SECONDDATE, SMALLDEC SMALLDECIMAL, ALNUM ALPHANUM)
SQL: insert into TEST_TYPES values(?, ?, ?)
param 1: type 16 (TIMESTAMP)
param 2: type 5 (DECIMAL)
param 3: type 11 (NVARCHAR)

Notice that parameter types do not match column types.
Forcing the version like this

client.connectOptions.dataFormatVersion2 = 4;

improves the result

SQL: insert into TEST_TYPES values(?, ?, ?)
param 1: type 62 (SECONDDATE)
param 2: type 5 (DECIMAL)
param 3: type 55 (ALPHANUM)

So why not change the default to

  this.dataFormatVersion2 = DataFormatVersion.LEVEL4;

Also is it intended that applications can change client.connectOptions because it is not documented at the moment?

Bulk Insert: TypeError: value is out of bounds

When launching a bulk insert with 1000 rows, I sometimes get this error. I am assuming a piece of data I am putting in is causing this issue. Is there anyway to show the user a better message? possibly containing the offending row?

TypeError: value is out of bounds
at TypeError ()
at checkInt (buffer.js:705:11)
at Buffer.writeInt32LE (buffer.js:782:5)
at Writer.writeInt [as 3]
at Writer.add (/project/node_modules/hdb/lib/protocol/Writer.js:90:15)
at Writer.setValues (/project/node_modules/hdb/lib/protocol/Writer.js:57:10)
at next (/project/node_modules/hdb/lib/protocol/ExecuteTask.js:161:21)
at ExecuteTask.getParameters (/project/node_modules/hdb/lib/protocol/ExecuteTask.js:177:3)
at ExecuteTask.sendExecute (/project/node_modules/hdb/lib/protocol/ExecuteTask.js:184:8)
at execute (/project/node_modules/hdb/lib/protocol/ExecuteTask.js:74:10)
at ExecuteTask.run (/project/node_modules/hdb/lib/protocol/ExecuteTask.js:112:3)

Sends all queries in block letters

invalid column name: INDEX_NAMES: line 1 col 38 (at pos 37). This error pops up even though a column named index_names exists. It worked after altering the table and changing the column name to block letters (uppercase). Any solution for this?

ST_LOCATION

Hi, we've just upgraded to HANA 1.00.81.00.392776 and it seems like the returned ST_AsGeoJSON Attributes are newly returned as a Buffer:
JSONLOCATION: <Buffer 7b 22 74 79 70 65 22 3a 20 22 50 6f 69 6e 74 22 2c 20 22 63 6f 6f 72 64 69 6e 61 74 65 73 22 3a 20 5b 34 37 2e 36 33 33 39 36 30 39 39 39 39 39 39 39 39 ...

Regards, Tobi

Add the ability to cancel a statement

It would be neat if the driver allowed you to cancel running queries.

It would allow my application to kill running queries when users close their browser, or when a query takes too long to execute. It would also allow my users to kill any long-running queries manually (e.g. using a cancel button in our UI).

Perhaps the API could look something like:

var query = client.exec("<statement>", function(err, result) {...});
query.cancel(doneCallback);
// or maybe...
var statementId = client.exec(...);
client.cancel(statementId, doneCallback);

This method could be added to the ResultSet class as well, or the client.execute call.

Any thoughts?

Callback not executed when connecting to an active port (not SQL port)

The error occurs with the following code:

var hdb    = require('hdb');
var client = hdb.createClient({
  host     : 'hostname',
  port     : port,
  user     : 'user',
  password : 'secret'
});

client.connect(function (err) {
  if (err) {
    return console.error('Connect error', err);
  }  
});

If the port parameter is set to a random unused port, the callback function of client.connect is executed and the error is set. If an active non-SQL port (e.g. the xs-engine HTTP port 8000) is used, the callback is never executed and the code exits silently somewhere.
The expected result would be the execution of the callback with an error like 'wrong port - connection failed' or similar.

Weird error behaviour during batch update

Hi,

when doing batch updates I noticed the following problem:
In case any of the rows in the batch contains values which do not adhere to the data types in the table, the error message can be very misleading and point to values / columns which are actually correct. It is also sometimes impossible to determine which row caused the problem.

The example which I stumbled upon is this:

  • change the datatype in app9.js from INT to DOUBLE
  • replace the numerical ID of the second row by the string 'THIS WILL CAUSE AN ERROR'

When I now execute app9.js, I get the following error message

{ [Error: error while parsing protocol: numeric value out of range: nan: type_code=7, index=1error while parsing protocol: cannot parse the string 'aliquam' as a number: type_code=30, index=1no such data type: type_code=65, index=3error while parsing protocol: no such data type: type_code=108, index=1error while parsing protocol: no such data type: type_code=105, index=1error while parsing protocol: no such data type: type_code=113, index=1error while parsing protocol: no such data type: type_code=117, index=1]
  message: 'error while parsing protocol: numeric value out of range: nan: type_code=7, index=1error while parsing protocol: cannot parse the string \'aliquam\' as a number: type_code=30, index=1no such data type: type_code=65, index=3error while parsing protocol: no such data type: type_code=108, index=1error while parsing protocol: no such data type: type_code=105, index=1error while parsing protocol: no such data type: type_code=113, index=1error while parsing protocol: no such data type: type_code=117, index=1',
  code: 1033,
  sqlState: 'HY000',
  level: 1,
  position: -1 }

Even though the first error Error: error while parsing protocol: numeric value out of range: nan: type_code=7, index=1 points to the right column, the additional errors like: error while parsing protocol: cannot parse the string 'aliquam' as a number: type_code=30, index=1 are very confusing as they are pointing to values / columns which are actually ok.

It seems to me that an error in one row causes errors in all subsequent rows because array items get accidentally shifted.

Also, importantly, there is no information about which row an error occurred in. This matters for batch scenarios where one would often insert thousands of records at a time.

In my concrete use case I wrote a node module which reads large, GB-sized CSV files, does some transformations, and then inserts them into a HANA table. I need information about which rows failed so I can handle them separately / write them into an error-output CSV file. Therefore I need to know which row caused the problem.

I would greatly appreciate if this can be fixed.

Best Regards Christian
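Until the driver reports per-row errors, a common workaround is to try the batch first and, on failure, fall back to inserting row by row so the failing rows can be collected for an error-output file. A sketch (the helper and its callback shape are ours):

```javascript
// Retry a failed batch one row at a time, collecting the rows that fail
// so they can be written to an error-output CSV later.
function insertWithFallback(statement, rows, done) {
  statement.exec(rows, function (batchErr) {
    if (!batchErr) return done(null, []);
    var failed = [];
    (function next(i) {
      if (i >= rows.length) return done(null, failed);
      statement.exec([rows[i]], function (err) {
        if (err) failed.push({ row: rows[i], error: err });
        next(i + 1);
      });
    })(0);
  });
}
```

This trades throughput for diagnosability only on the error path; clean batches still go through in one round trip.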

Executed prepared statement does not return output values under HANA SPS10 when HANA warning occurs

I've encountered an issue where a prepared statement for a stored procedure does not seem to return the procedure's outputs (including the scalar output object and any other output tables). This happens when a warning is being returned from HANA under SPS10.

The exact case where I'm having the issue is when HANA SPS10 returns a warning because a DDL statement is present within a dynamic SQL operation.

The prepared statement is correctly prepared for call TEST_PROCEDURE(?, ?, ?), and it is created for the procedure below:

create PROCEDURE TEST_PROCEDURE(
  in value       nvarchar(100),
  out reference  nclob,
  out error      table(message nvarchar(100))
)
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER
AS
BEGIN
  exec 'create table MYTABLE(id int)';
  reference := '';
  error = select 'error' as MESSAGE from dummy;
END;

The prepared statement is executed using the following code

input = {VALUE: 'abc'};
preparedStatement.exec(input, function(error, parameters, errorTable) {
  console.log(arguments);
});

When executing this, HANA SPS10 returns an error object in the first parameter of the callback, but there are no output parameters present. Executing the procedure in HANA Studio correctly returns the outputs.

{ '0':
   { [Error: Not recommended feature: DDL statement is used in Dynamic SQL (current dynamic_sql_ddl_error_level = 1)]
     message: 'Not recommended feature: DDL statement is used in Dynamic SQL (current dynamic_sql_ddl_error_level = 1)',
     code: 1347,
     sqlState: 'HY000',
     level: 0,
     position: 0 } }

When running the code against an SPS09 system, everything works fine.

If you remove the dynamic SQL from the procedure, the output values are being returned on both SPS 9 and 10.

create PROCEDURE TEST_PROCEDURE(
  in value       nvarchar(100),
  out reference  nclob,
  out error      table(message nvarchar(100))
)
  LANGUAGE SQLSCRIPT
  SQL SECURITY INVOKER
AS
BEGIN
  reference := '';
  error = select 'error' as MESSAGE from dummy;
END;

Arguments of the callback:

{ '0': null,
  '1': { REFERENCE: <Buffer > },
  '2':
   [ { Code: '1',
       Messagetype: 'error',
       Message: 'error',
       Description: 'error' } ] }

I am wondering if this is an issue with the library or if I'm missing a configuration that I can set for the client?

Thanks,
Balazs

Discrepancy in returning SELECT results

Hi,
For a JOIN query on two tables which have columns with the same name, a simple SELECT returns only the values of the matching columns in the second table.

Descriptive Example:
Table TAB1 (ID NUMBER, NAME NVARCHAR(512), AGE NUMBER)
Table TAB2 (NID NUMBER, NAME NVARCHAR(512), SALARY NUMBER)

Query: SELECT TOP 1 * FROM TAB1, TAB2 WHERE TAB1.ID = TAB2.NID;

Using node-hdb, the results for the query are of the format:
[{ID: 1, AGE: 55, NID: 1, NAME: 'ABC', SALARY: 35000}, {ID: 2, AGE: 55, NID: 2, NAME: 'BCD', SALARY: 350}, .... ]

In each object of the array, only the value of the NAME field from the last table of the join is returned; the NAME values from the first table(s) are lost. I understand that this is inherent to JavaScript object semantics, since no object can have duplicate keys, but I think it would be best to come up with a more streamlined way of returning the results, especially in cases where there is potential for data 'loss'.

Best Regards,
Abhi

P.S.: Of course, the problem does not arise if one uses aliases for the columns within the SELECT query, but this becomes cumbersome in most cases.

'No initialization reply received within 5 sec' error, need info!

I am starting to see a strange error in my application. Sometimes I get the following error: No initialization reply received within 5 sec. Most of the time things work fine.

What does this mean? Is this something that happens when the connection is created or can it happen on an execute call? What can cause this error?

Note: I have a connection pool, it sometimes creates new clients on errors or if the client is idle for too long.

Out Parameter defined as NUMBER is returned as a JS string

Hi,
I have a stored procedure with the following structure:

PROCEDURE "SCHEMA_NAME"."UPDATE_ME" ( IN ID NUMBER, OUT NUM_JOBS NUMBER, OUT SCHEDULES_TAB "SCHEMA_NAME"."MY_TYPE")

Upon executing the procedure using node-hdb, the scalar OUT parameter "NUM_JOBS" is returned as a string with a value similar to "0e+0". In JS code this effectively rules out using strict equality operators to compare values (e.g. "0e+0" === 0 returns false).
Is this intended behaviour? Shouldn't the API return scalars converted to the corresponding JS type automatically?

Best Regards
Abhi
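As a workaround, the returned string can be converted explicitly; JavaScript's Number() understands the scientific notation HANA emits:

```javascript
// "0e+0" is valid scientific notation, so Number() converts it directly.
var numJobs = Number('0e+0');    // 0
console.log(numJobs === 0);      // true, strict comparison now works
```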

Fetching large dataset using ResultSet pauses when hitting FETCH_SIZE

Hello! First of all, thanks for making this awesome driver. It's awesome!

I'm grabbing 150k rows (each row with 200+ columns) from the database to generate CSV reports and dumps. I know it's a lot of data, but it is required for a particular use case.

I'm using ResultSet to stream the rows through. It works great until I hit the FETCH_SIZE row count (2^15). Then the stream pauses for a minute, and the streaming continues until the next FETCH_SIZE batch.

I would expect to see a continuous stream of rows, but I assume the driver buffers FETCH_SIZE rows until it pushes it through the stream. Maybe you can explain this mechanism further? It might give me hints for what I can optimize.

In any case, what can I do to shorten this stream pause? Is there something I can tweak at the driver level?

I tried increasing FETCH_SIZE and MAX_FETCH_SIZE: Unfortunately I get error while parsing protocol: wrong fetch size after exceeding a certain size.

I also poked around the source and I saw averageRowLength. Is this something I should explore? I'm not sure what kind of unit this value is in (is it bytes? or number of columns?). The default seems low for the amount of columns I have.

In addition, is there something I could do in my Hana configuration?
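One thing to try is raising the fetch size on the ResultSet itself before streaming; the driver caps it at MAX_FETCH_SIZE (32767 rows), which explains the 'wrong fetch size' error beyond that. A sketch, assuming setFetchSize and close are usable from application code:

```javascript
// Stream a large result set row by row with a larger fetch size,
// so fewer fetch round trips interrupt the stream.
function streamRows(client, sql, fetchSize, onRow, done) {
  client.execute(sql, function (err, rs) {
    if (err) return done(err);
    rs.setFetchSize(fetchSize);          // assumption: public ResultSet API
    var stream = rs.createObjectStream();
    stream.on('data', onRow);
    stream.on('error', done);
    stream.on('end', function () {
      rs.close();                        // assumption: explicit close needed
      done(null);
    });
  });
}
```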

ECONNRESET Error

I am getting an unhandled error event when the connection to HANA gets lost, e.g. because of a restart:

events.js:85
      throw er; // Unhandled 'error' event
            ^
Error: read ECONNRESET
    at exports._errnoException (util.js:746:11)
    at TCP.onread (net.js:550:26)

Is it possible to catch it within the node-hdb module?
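Yes: the client is an EventEmitter, so registering an 'error' listener keeps Node from throwing the unhandled 'error' event when the socket dies. A minimal sketch (the wrapper function is ours):

```javascript
// Attaching an 'error' listener prevents Node from throwing an
// unhandled 'error' event when the underlying socket drops.
function guardClient(client, onError) {
  client.on('error', function (err) {
    // e.g. mark the connection broken and reconnect lazily
    onError(err);
  });
  return client;
}
```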

Connect callback is called multiple times if there is a network error during connection

During automated testing we have been seeing rare crashes due to what appears to be the callback on connect() being invoked multiple times. These appear during long automated tests during periods of network disruption. I have not been able to build a simple repro.

When we have network disruptions, usually the error that is returned is:

Error: Could not connect to any host: [ <<hostname>> - connect ECONNREFUSED <<ip address and port>> ]

These are handled properly. However, in rare cases when an ECONNRESET is raised during connection, we receive:

Received connect error: Error: Could not connect to any host: [ <<hostname>> - read ECONNRESET ]

When the ECONNRESET is received, it appears the callback supplied to connect is being called twice.

I believe this may be because in ConnectionManager.js, a handler is registered for the 'error' event (https://github.com/SAP/node-hdb/blob/master/lib/protocol/ConnectionManager.js#L81). The handler is not removed until after fetchDbConnectInfo has been called (https://github.com/SAP/node-hdb/blob/master/lib/protocol/ConnectionManager.js#L102).

If a network error arises during the fetchDbConnectInfo call, I believe that both the error handler will be called, and fetchDbConnectInfo will return the error, which will cause the callback to be called twice.

There is a specific test case for this: https://github.com/SAP/node-hdb/blob/master/test/hdb.Client.js#L509. However, I believe the mock is not correctly mimicking the connection object. Specifically, this test redefines fetchDbConnectInfo to emit the 'error' (https://github.com/SAP/node-hdb/blob/master/test/hdb.Client.js#L516). However, I believe the real connection object would both emit the error, and call a callback with the error.

I made the changes on my fork (efarrar@a6b7b4d), and when I run the test, I see that the callback is being invoked twice. (Unfortunately, mocha does not detect it because the second invocation happens after that test case has passed, so mocha silently ignores it.) In the output, I see:

     should report an error if instance number string is not valid
Connect callback was called
      should report an error if error event has occurred during fetching DB_CONNECT_INFO
Connect callback was called
      should close the connection to system-db in case of error during opening it

ECONNRESET Error

I am getting this error when a connection has been idle for a while:

Error: read ECONNRESET
    at errnoException (net.js:905:11)
    at TCP.onread (net.js:559:19)

I am assuming this is an issue with the connection timing out.
Is there an easy way to stop this happening?

Closing a connection twice causes an error

When closing a client connection twice it gives the following error in lib/protocol/Connection.js#L475:

  if (this._queue.empty && !this._queue.busy) {
                 ^
TypeError: Cannot read property 'empty' of undefined

While closing it twice doesn't make a lot of sense, this can be made more robust. A second close should just be a noop. It also would remove the burden from the user to check for the connection state before closing. What do you think?
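Until the driver guards this itself, a thin wrapper makes repeated closes harmless (a sketch, not part of the hdb API):

```javascript
// Defensive wrapper: the second and later calls become no-ops,
// so callers need not track connection state themselves.
function safeClose(client) {
  var closed = false;
  return function () {
    if (closed) return;   // second close is a noop
    closed = true;
    client.close();
  };
}
```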

Kerberos Authentication

I am looking for Kerberos authentication. Do you have plans to implement it?

You have mentioned in Readme that "This mechanism is currently not implemented. Please contact me if you require Kerberos based authentication."

using timestamps

Hey Holger,

When I use timestamps in my statement with a between condition, there is an error with the timestamp format (YYYYMMDD).
This select:
select TOP 30 index, name, datetime, lat, lon from db where index = ? and datetime between ? and ? ORDER BY DATETIME DESC.
Called by:
req_statement.exec([index, 20080101, 20100101], function (err, rows)...or
req_statement.exec([index, '20080101', '20100101'], function (err, rows)...

The same query works in an SQL console in SAP HANA Studio without any problems.

Best,
Lukas
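One workaround that may help is to pass fully qualified timestamp literals ('YYYY-MM-DD HH:MM:SS', which HANA accepts) instead of the compact YYYYMMDD form. A formatting helper (ours, not part of the driver):

```javascript
// Format a JS Date as a HANA-friendly 'YYYY-MM-DD HH:MM:SS' string.
function pad(n) { return (n < 10 ? '0' : '') + n; }
function toHanaTimestamp(d) {
  return d.getFullYear() + '-' + pad(d.getMonth() + 1) + '-' + pad(d.getDate()) +
         ' ' + pad(d.getHours()) + ':' + pad(d.getMinutes()) + ':' + pad(d.getSeconds());
}
// e.g. req_statement.exec([index, toHanaTimestamp(from), toHanaTimestamp(to)], ...)
```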

Connect to HCP HANA Trial

Hi Experts,

Is there a way to connect to the HCP trial account HANA database? We are running a Node.js application in a Cloud Foundry instance. We can't, however, figure out what the connection parameters have to look like (specifically the hostname).

Thank you very much.

Best Nils

Result rows as array instead of dictionary

When retrieving rows with the execute function, the resulting rows are always dictionaries with column/name pairs. Would it be possible to return a list of arrays in order to maintain the order of columns?

For example, if I execute this query: SELECT A, B, C FROM MYTABLE, the resulting rows currently look like this: {A: 156, B: 78, C: 0}. However, I think it would be useful to have the option to return rows like this: [156, 78, 0].
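Until the driver offers this, a small projection helper gives the array form (note it cannot recover duplicate column names; it is just a reordering sketch):

```javascript
// Project each row object onto an explicit column order.
function rowToArray(row, columns) {
  return columns.map(function (name) { return row[name]; });
}

rowToArray({ A: 156, B: 78, C: 0 }, ['A', 'B', 'C']); // [156, 78, 0]
```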

Remove debuglog devDependency

According to PR #37 the debuglog dependency should be defined in package.json. But this module should not have any dependencies; only devDependencies are acceptable. Therefore this devDependency should be removed completely.

Support LOBs as string feature.

I am getting a Buffer as the result for LOB data. I found that all LOB objects are converted to Buffers while retrieving the result set, but I need the large object as a string. Shouldn't there be an option to convert it back to a string?
(Iterating over the whole result set on my side would consume a lot of time and is a bad option.)
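For character LOBs (CLOB/NCLOB) the returned Buffer holds text, so decoding it is a one-liner rather than a full pass over the result set (a sketch; that your column really holds character data is an assumption):

```javascript
// Decode a character-LOB Buffer into a JS string.
var lob = Buffer.from('a long text value', 'utf8'); // stand-in for a fetched LOB
var text = lob.toString('utf8');
console.log(text); // 'a long text value'
```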

Multiple Container HDB

Hi there,

I need to connect to a multiple-container (MDC) HANA database.
Is it possible to connect with node-hdb?
(I need to specify the name of the tenant database container in the connection options.)

Thank you and best regards,
jazz

Error while opening tunnel

I had installed JDK version 1.8 initially, but when I was trying to open a tunnel using the command neo open-db-tunnel... I got an error saying "Java cannot be found on your system". When I checked java -version, it gave me an error in the registry key saying 1.7 is required. I installed JDK 1.7 and things went fine afterwards. But I would like to understand whether the current HCP SDK does not support version 1.8, or whether there is a new Java web file I should refer to.

Regards,
Pallavi

Problems executing huge SQL statements

Hello, I get the following error when trying to execute huge SQL statements (the statement string is 144k large):

Execute error: { [Error: error while parsing protocol: invalid communication buffer structure]
  message: 'error while parsing protocol: invalid communication buffer structure',
  code: 1033,
  sqlState: 'HY000',
  level: 1,
  position: 0 }

The SQL is large due to around 11000 string constants that are put into a hierarchy expression, so it can't be formulated differently (e.g. with a join).

When I reduce the SQL statement to around 126k (using only 5100 string constants), it works and I get the data back.

Is this a limit in the DB server, or does it come from node-hdb?

Typo in TopologyInformation

The constant ALL_IP_ADRESSES in lib/protocol/common/TopologyInformation.js is misspelled, missing the second D. Fixing it would be a breaking change.

SQL parsing bug causes syntax error

> SELECT TABLE_NAME, CONCAT(CONCAT('"', STRING_AGG(COLUMN_NAME, '","')), '"') FROM (SELECT TABLE_NAME, COLUMN_NAME FROM SYS.CONSTRAINTS WHERE SCHEMA_NAME='MY_SCHEMA' ORDER BY "POSITION") GROUP BY TABLE_NAME

undefined:4
"CONCAT(CONCAT('"',STRING_AGG(COLUMN_NAME,'","')),'"')": this.readString()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^

SyntaxError: Unexpected string
    at Function (native)
    at createEvilParseFunction (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/Parser.js:149:10)
    at Parser.createParseFunction (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/Parser.js:50:12)
    at Parser.createParseRowFunction (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/Parser.js:36:15)
    at Parser.createTransform (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/Parser.js:71:38)
    at ResultSet.createParser (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/ResultSet.js:198:39)
    at ResultSet.createArrayStream (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/ResultSet.js:224:52)
    at ResultSet.fetch (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/ResultSet.js:144:21)
    at next (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/Result.js:226:19)
    at fetchAll (/home/woverton/Documents/Repositories/censql/node_modules/hdb/lib/protocol/Result.js:228:3)

Not too sure what is going on here. It just really doesn't like it when I run that query.
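From the trace, the generated row parser (createEvilParseFunction) appears to embed the derived column name in JavaScript source, and the double quotes inside the CONCAT expression break the generated code. A minimal reproduction of the suspected problem, with JSON.stringify as one possible escaping fix (our illustration, not the driver's actual code):

```javascript
// The derived column name itself contains a double quote.
var name = 'CONCAT(\'"\', X)';
// Embedding it unescaped produces invalid JavaScript source:
var bad  = 'row["' + name + '"] = this.readString()';
// Escaping via JSON.stringify keeps the generated source valid:
var good = 'row[' + JSON.stringify(name) + '] = this.readString()';
```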

How to execute multiple statements ?

I am trying to execute multiple DROP statements, like:

var drop_tables = `
DROP TABLE ESH_TESTING.PRODUCT_DETAILS CASCADE;
DROP TABLE ESH_TESTING.PRODUCT_DELIVERY_LOCATION CASCADE;
DROP TABLE ESH_TESTING.TERM_MAPPING_COUNTRIES CASCADE;
        `;

but it keeps giving me a syntax error at line 3. Could you suggest a workaround with which I could execute multiple statements in one go?

It does work if I split the drops into individual statements, but not when they are combined.

The same behavior occurs for inserts given in the above format.
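The driver executes one statement per exec call, so the usual workaround is to split the script and run the pieces sequentially. A naive sketch (the split on ';' breaks if a semicolon appears inside a string literal):

```javascript
// Split a multi-statement script and execute the pieces one after another.
function execAll(client, script, done) {
  var stmts = script.split(';')
    .map(function (s) { return s.trim(); })
    .filter(Boolean);
  (function next(i) {
    if (i >= stmts.length) return done(null);
    client.exec(stmts[i], function (err) {
      if (err) return done(err);
      next(i + 1);
    });
  })(0);
}
```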

select on BOOL value returns internal INT representation

SELECT TRUE a, FALSE b, 1 c, 0 d FROM DUMMY

returns

[{"A":1,"B":0,"C":1,"D":0}]

Is this intentional?
I understand that HANA uses integers for the internal representation of boolean values, but I would expect a Node database client to return a JavaScript boolean here.
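Until the driver maps it, a small converter for the columns you know are boolean works as a stopgap (a workaround sketch):

```javascript
// Map HANA's 0/1 representation back to a JS boolean,
// preserving NULL as null.
function toBool(value) {
  return value === null ? null : value !== 0;
}
```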

Setting active schema

Hi there,

I am wondering if there is a way to set the current active schema so that it does not need to be specified every query. Thanks.

P.S.: Is there a better place to ask this question?

Local Temporary table is not accessible

Hi,
When I create a local temporary table using the node-hdb module with the query given below,

CREATE LOCAL TEMPORARY COLUMN TABLE "#1231CD57A6569D15E10000000A600482" AS (SELECT 1 FROM DUMMY) WITH DATA

I'm not able to access the table afterwards, while when I create a global temporary table I am able to access it.

NOTE: I'm using the same connection with which I created the table.

any help would be highly appreciated.

Reconnection of client after Network connection error results in error on Statement Execution

After an error in the client object, the client seems to get closed.
As we have the option 'autoReconnect' enabled, it reconnects to HANA again.
See the messages in the terminal, the functions onerror, onclose and onconnect are the same as in examples/client.js:

Network connection error { [Error: read ECONNRESET] code: 'ECONNRESET', errno: 'ECONNRESET', syscall: 'read' }
Client closed
Client connected

When we then try to execute a statement, we get the following error in lib/protocol/Connection.js#L316

      queueable = this._queue.createTask(this.send.bind(this, task), cb);

TypeError: Cannot call method 'createTask' of undefined

'RangeError: Maximum call stack size exceeded' during large insert

A large insert (1000+ rows) causes errors:

RangeError: Maximum call stack size exceeded
    at ExecuteTask.end (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/ExecuteTask.js:115:41)
    at done (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/ExecuteTask.js:53:10)
    at finalize (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/ExecuteTask.js:59:14)
    at receive (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/ExecuteTask.js:76:16)
    at send (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/ExecuteTask.js:186:14)
    at next (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/ExecuteTask.js:163:16)
    at handleParameters (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/ExecuteTask.js:152:5)
    at done (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/Writer.js:196:5)
    at next (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/Writer.js:166:14)
    at Writer.finializeParameters (/var/node/wo/Convertiq/node_modules/hdb/lib/protocol/Writer.js:184:3)

My Code (ish):

var sql = 'insert into "' + databaseName.toUpperCase() + '"."' + tableName + '" values (' + columns + ')';

this.connections[sessionID].prepare(sql, function(err, statement) {

    if(err) throw err;

    var insertData = [];

    for (var i = data.length - 1; i >= 0; i--) {
        insertData.push(data[i])
    }

    statement.exec(insertData, function(err, affectedRows) {
        debug(err, 'Amount of affected rows: ' + affectedRows.length);
    });
});

Hana Version/Status:

SELECT * FROM SYS.M_DATABASE

SYSTEM_ID | DATABASE_NAME | HOST | START_TIME | VERSION | USAGE
- - - - - - - - - - - - - - - - - - - 
M1D | M1D | prd-nohana01 | 2015-08-19T14:53:45.932 | 1.00.097.01.1436865239 | DEVELOPMENT

> \st

HOST | HOST_ACTIVE | HOST_STATUS
- - - - - - - - - - - - - - - - - - - 
prd-nohana01 | YES | OK

LIMIT query runs really slow with node-hdb

node-hdb version: 0.5.3
Running a SELECT query with LIMIT option using node-hdb is really slow.

Parent query: select * from TABLE where A=5 ORDER BY B;

Total Number of Rows:
select count(*) from TABLE where A= 5 , Output: 1102

Limit Query: select * from TABLE where A=5 order by B limit 200

Code:

var d = new Date();
client.prepare("select * from TABLE where A=5 order by B limit 200", function(err, statement) {
                if (err) {
                    console.error('Error during query preparation: ', query, err);
                    return callback(err);
                }
                statement.exec(values, function(err, rows) {
                    console.log('Time taken for prepared query: %s seconds', (new Date() - d) / 1000);
                    if (err) {
                        console.error('Error during statement execution: ', query, err);
                        return callback(err);
                    }
                    return callback(err, rows);
                });
            });

Output of the code above: Time taken for prepared query: 53.291 seconds

The same query runs just fine (465 ms) via HANA Studio.

P.S.: I changed the LIMIT query to SELECT * FROM (select *, ROW_NUMBER() over (order by TID) as LID from TABLE where A=5) WHERE LID <= 200. This gives better performance (25 seconds) but nowhere near HANA Studio.

Batch insert has side effects on input parameters

When doing batch insert the input array with arguments is changed.

Steps to reproduce:

  1. You have to have a database table in which you will insert.

For example: CREATE COLUMN TABLE "SOME_SCHEMA"."SOME_TABLE" ( "ID" INT CS_INT, "NAME" NVARCHAR(255) CS_STRING )

  2. Execute this code:
var hdb = require('hdb');
var client = hdb.createClient({
  host: <host>,
  port: <port>,
  user: <user>,
  password: <password>
});
var params = [[1, 'a'], [2, 'b']];
client.connect(onConnect);

function onConnect(err) {
  console.log(JSON.stringify(params));
  client.prepare('INSERT INTO "SOME_SCHEMA"."SOME_TABLE" VALUES(?,?);', onPrepare);
}

function onPrepare(err, statement) {
  statement.exec(params, onExec);
}

function onExec(err, rows) {
  console.log(JSON.stringify(params));
}

Expected result:
[[1,"a"],[2,"b"]]
[[1,"a"],[2,"b"]]

Actual result:
[[1,"a"],[2,"b"]]
[]

Problem:
the params array is modified
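Until this is fixed, handing the driver a copy keeps the caller's array intact:

```javascript
// Give statement.exec a copy so the driver's mutation does not
// affect the caller's params array.
var params = [[1, 'a'], [2, 'b']];
var copy = params.map(function (row) { return row.slice(); });
// statement.exec(copy, onExec);  // `params` remains [[1,'a'],[2,'b']]
```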

Unicode support for select statements?

I'm having issues displaying unicode characters retrieved via node-hdb select queries. The first attached image shows the unicode character correctly stored in HANA (screenshot from the HANA Admin console).
hdb

The second image shows my WebStorm console after retrieving the same piece of data, with an out-of-bounds placeholder for the unicode character.
node-hdb

Are there any workarounds for this behavior?

Date parsing - dates returned as strings

Column types DATE, DATETIME, etc. are not returned as JavaScript Date objects, but as strings. Similar Node libraries (pg, for example) return date-type fields as JS dates.

See lib/protocol/common/Reader.js, functions readDate, readTime, readTimestamp, etc. Perhaps, to maintain backwards compatibility, conversion to a Date object could be optional and specified when the client is initialised (clientOptions).

Cheers
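Until such a clientOptions flag exists, the conversion can be done at the application layer (a sketch; it assumes the 'YYYY-MM-DD' string form the driver currently returns):

```javascript
// Convert HANA's 'YYYY-MM-DD' date string to a JS Date (UTC midnight).
function parseHanaDate(s) {
  if (s === null) return null;
  var m = /^(\d{4})-(\d{2})-(\d{2})$/.exec(s);
  if (!m) return null;
  return new Date(Date.UTC(+m[1], +m[2] - 1, +m[3]));
}
```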

Can node-hdb APIs be executed synchronously?

Hi Holger,
Do you provide a way in which node-hdb database operations can be executed synchronously, conditionally?
The requirement stems from the fact that I would like to execute a few database operations in reaction to the process SIGKILL event. According to the node.js process docs, the event handler must only execute synchronous code to have any guarantee of execution.

Best Regards,
Abhi

Configure a default schema in connection

Hi, I am trying to use a schema other than "SYSTEM" as my default schema, so that I don't have to add the schema name to every query, but I can't find a way to do so.
Any help will be really appreciated!
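A session-level workaround is HANA's SET SCHEMA statement, issued once right after connecting; unqualified object names then resolve against that schema for the session. A sketch (the helper is ours, and the naive quoting assumes the schema name contains no double quotes):

```javascript
// Set the session's current schema so queries need no schema prefix.
function useSchema(client, schema, done) {
  client.exec('SET SCHEMA "' + schema + '"', done);
}
```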

Number of affected rows is returned as undefined during batch insert

Present Scenario
The number of affected rows is returned as undefined during a batch insert when some of the rows have errors,
but the correct records are inserted into the database.

Expected scenario
Both parameters of the callback should be filled, containing both the error and the number of inserted rows.


SQL statement CONNECT fails with Error: Invalid or unsupported FunctionCode

SQL

CONNECT new_user PASSWORD Password1

Error stack

     Error: Invalid or unsupported FunctionCode
      at Result.handle (node_modules/hdb/lib/protocol/Result.js:104:11)
      at handleReply (node_modules/hdb/lib/Client.js:236:12)
      at receive (node_modules/hdb/lib/util/Queue.js:113:18)
      at Connection.receive (node_modules/hdb/lib/protocol/Connection.js:341:3)
      at Socket.ondata (node_modules/hdb/lib/protocol/Connection.js:216:12)
      at readableAddChunk (_stream_readable.js:153:18)
      at Socket.Readable.push (_stream_readable.js:111:10)
      at TCP.onread (net.js:531:20)

[email protected]

Encoding problem?

I have the following statement:

https://gist.githubusercontent.com/drleidig/47e5ad6cb28c581fdef73f953d68f783/raw/80bff2b5f54f6f1f94b9341b13dbc6a088824ff8/statement.sql

(you have to download it to see the utf-8 characters)

Testing the statement with node-hdb:

https://gist.githubusercontent.com/drleidig/a95b36ee8b90f884b0fb4964d4bbc32c/raw/d33b8c4a1105cbbeabb7fe3f3305786acc1c11b2/encoding.js

I get an SQL syntax error: incorrect syntax near ...

When I test the statement from the SQL console within the HANA studio everything is fine.
I assume something with the encoding is going wrong.
Somewhere in the docs I found that HANA uses CESU-8 encoding, which is similar to UTF-8 but not the same. Maybe this could be the source of the error?
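That is a plausible suspect: CESU-8 encodes each UTF-16 surrogate half as its own 3-byte sequence, so characters outside the Basic Multilingual Plane take 6 bytes instead of UTF-8's 4. A sketch of the difference (an illustration, not the driver's actual encoder):

```javascript
// Encode a JS string as CESU-8 by encoding each UTF-16 code unit
// (surrogate halves included) as if it were a standalone code point.
function toCesu8(str) {
  var bytes = [];
  for (var i = 0; i < str.length; i++) {
    var c = str.charCodeAt(i); // UTF-16 code unit, surrogates kept as-is
    if (c < 0x80) {
      bytes.push(c);
    } else if (c < 0x800) {
      bytes.push(0xc0 | (c >> 6), 0x80 | (c & 0x3f));
    } else {
      bytes.push(0xe0 | (c >> 12), 0x80 | ((c >> 6) & 0x3f), 0x80 | (c & 0x3f));
    }
  }
  return Buffer.from(bytes);
}
```

For BMP characters (including most accented letters) CESU-8 and UTF-8 agree byte for byte, so only supplementary characters such as emoji would trip this up.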
