twright-msft / mssql-node-docker-demo-app

Demonstration application using Microsoft SQL Server on Linux and Node in a Docker container.

License: MIT License

Shell 23.49% JavaScript 45.48% Dockerfile 27.52% TSQL 3.51%

mssql-node-docker-demo-app's Introduction

Overview

This is a demo application created to show how SQL Server can operate in a DevOps scenario where an application developer checks code in to GitHub and then triggers a build in Red Hat OpenShift to deploy the changes automatically as pods (containers). This demo was first shown at the Nordic Infrastructure Conference (NIC) 2017 in Oslo, Norway on Feb 3, 2017. This demo application is notable for showing a few things:

  • An entrypoint CMD that executes an import-data.sh script at run time, which uses sqlcmd to run a .sql script that creates a database and populates the initial schema.
  • The import-data.sh script also uses bcp to bulk import the data found in the Products.csv file.
  • A simple Node application that acts as a web service, using FOR JSON AUTO to automatically format the data from the SQL Server database as JSON and return it in the response.

IMPORTANT: This project has been tested with SQL Server v.Next version CTP 1.4 (March 17, 2017 release).

Running the Demo

Setting up the application and building the image for the first time

First, create a folder on your host and then git clone this project into that folder:

git clone https://github.com/twright-msft/mssql-node-docker-demo-app.git

To run the demo, first build the container image:

docker build -t node-web-app .

Then, you need to run the container:

docker run -e ACCEPT_EULA=Y -e SA_PASSWORD=Yukon900 -p 1433:1433 -p 8080:8080 -d node-web-app

Note: make sure that your password matches what is in the import-data.sh script.

Then you can connect to SQL Server in the container either by running a tool on your host or by using docker exec to run sqlcmd from inside the container:

docker exec -it <container name|ID> /bin/bash
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P Yukon900
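
If you prefer to connect from the host instead, a minimal sketch (assuming the sqlcmd tools are installed on your host and port 1433 is published as in the docker run command above) would be:

sqlcmd -S localhost,1433 -U sa -P Yukon900 -Q "SELECT @@VERSION"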

To show the web service response, open up a browser and point it to http://localhost:8080.
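
You can also hit the service from a terminal with curl; assuming the container is running and the data import has finished, you should get back a JSON array of products, roughly like this:

curl http://localhost:8080
# [{"ID":1,"ProductName":"Car"},{"ID":2,"ProductName":"Truck"}, ...]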

Now that you have the demo application prepared, you can setup and configure OpenShift.

Setting up OpenShift

For demo purposes, you can deploy an OpenShift environment into any major cloud provider using templates available in the marketplaces, you can use OpenShift Online, or you can deploy a VirtualBox-based OpenShift environment called Minishift on your local development/demo machine.

You will also need the oc command-line utility installed on your host/demo machine.

Once you have OpenShift set up and you have the oc command line utility installed, you need to login to your OpenShift environment as a cluster administrator, create a new project, and set the permissions to allow any user identity to run as root (required by the mssql user that runs the sqlservr.sh script in a container for now).

oc login
oc new-project demo
oadm policy add-scc-to-user anyuid -z default

Now that you have OpenShift set up, you are ready to do an initial test run of deploying into OpenShift.

Deploying into OpenShift

At a terminal prompt in the root of the application folder enter this command:

oc new-app .

That will deploy the app into OpenShift. You can now log into OpenShift and see a few artifacts.
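
To watch the build and deployment from the command line, a few standard oc commands are useful (the build config name below is an assumption based on the folder name; adjust it to whatever oc new-app actually created):

oc status
oc get builds
oc get pods
oc logs -f bc/mssql-node-docker-demo-app   # follow the build log (name assumed)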

Detailed Explanation

Here's a detailed look at each of the files in the project.

Dockerfile

The Dockerfile defines how the image will be built. Each of the commands in the Dockerfile is described below.

The Dockerfile can define a base image as the first layer. In this case, the Dockerfile uses the official Microsoft SQL Server Linux image that can be found on Docker Hub, pulling the image with the 'latest' tag. This image requires two environment variables to be passed to it at run time - ACCEPT_EULA and SA_PASSWORD. The Microsoft SQL Server Linux image is in turn based on the official Ubuntu Linux image ubuntu:16.04.

FROM microsoft/mssql-server-linux:latest

This RUN command refreshes the package lists in the image, installs the curl utility if it is not already there, and then installs Node.js. These commands must be run with root privileges, so the user is switched to root first.

USER root
RUN apt-get -y update  && \
        apt-get install -y curl && \
        curl -sL https://deb.nodesource.com/setup_14.x | bash - && \
        apt-get install -y nodejs
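
As a quick sanity check after building the image, you can override the CMD and print the installed Node.js version (this assumes the node-web-app image tag used elsewhere in this README):

docker run --rm node-web-app node --version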

This installs the tedious driver for SQL Server which allows node applications to connect to SQL Server and run SQL commands. This is an open source project to which Microsoft is now one of the main contributors.

NPM package details

Source Code

RUN npm install tedious

This RUN command creates a new directory inside the container at /usr/src/app and then sets the working directory to that directory.

RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app

Then this command copies the package.json file from the source code in this project to the /usr/src/app directory inside the container. The RUN command npm install will install all the dependencies defined in the package.json file.

COPY package.json /usr/src/app/
RUN npm install

Then all the source code from the project is copied into the container image in the /usr/src/app directory.

COPY . /usr/src/app

In order for the import-data.sh script to be executable you need to run the chmod command to add +x (execute) to the file.

RUN chmod +x /usr/src/app/import-data.sh

The EXPOSE instruction documents which port the application listens on inside the container. Note that the port still has to be published to the host with -p (or -P) at run time, as in the docker run command shown earlier.

EXPOSE 8080
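
For example, if you wanted the service on a different host port, only the -p mapping in the docker run command changes; the EXPOSE value and the port in server.js stay at 8080 (a sketch, reusing the run command from earlier):

docker run -e ACCEPT_EULA=Y -e SA_PASSWORD=Yukon900 -p 1433:1433 -p 8081:8080 -d node-web-app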

Lastly, the CMD command defines what will be executed when the container starts. In this case, it will execute the entrypoint.sh script contained in the source code for this project. The source code including the entrypoint.sh is contained in the /usr/src/app directory which has also been made the working directory by the commands above. The user is also switched back to the mssql user for security reasons.

USER mssql
CMD /bin/bash ./entrypoint.sh

entrypoint.sh

The entrypoint.sh script is executed when the container first starts. The script kicks off three things simultaneously:

  • Start SQL Server using the sqlservr.sh script. This script will look for the existence of the ACCEPT_EULA and SA_PASSWORD environment variables. Since this will be the first execution of SQL Server the SA password will be set and then the sqlservr process will be started. Note: Sqlservr runs as a process inside of a container, not as a daemon.
  • Executes the import-data.sh script contained in the source code of this project. The import-data.sh script creates a database, populates the schema and imports some data.
  • Runs npm start which will start the node application.
/opt/mssql/bin/sqlservr.sh & /usr/src/app/import-data.sh & npm start 

import-data.sh

The import-data.sh script is a convenient way to delay the execution of the SQL commands until SQL Server has started. Typically SQL Server takes about 5-10 seconds to start up and be ready for connections and commands. Moving the SQL commands out of entrypoint.sh into a separate .sh script creates modularity between the commands that should be run at container start-up time and the SQL commands that need to be run. It also allows the container start-up commands to run immediately while the SQL commands are delayed.

The script needs to wait for SQL Server to start up. To handle this, it makes a brute-force attempt to execute the sqlcmd command in a loop until it succeeds or it has tried 50 times, one second apart.

The sqlcmd command uses the SQL Server command line utility sqlcmd to execute the SQL commands contained in the setup.sql file. Commands can also be passed directly to sqlcmd via the -Q parameter, which runs the query and exits (see the example after the loop below). For better readability when you have lots of SQL commands, it's best to create a separate .sql file and put all the SQL commands in it.

for i in {1..50};
do
    /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P Yukon900 -d master -i setup.sql
    if [ $? -eq 0 ]
    then
        echo "setup.sql completed"
        break
    else
        echo "not ready yet..."
        sleep 1
    fi
done

IMPORTANT: Make sure to change your password here if you use something other than 'Yukon900'.
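
As mentioned above, for a small number of statements you can skip the .sql file and pass a command inline; a minimal sketch using -Q with a harmless read-only query:

/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P Yukon900 -d master -Q "SELECT name FROM sys.databases"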

The setup.sql script creates a new database called DemoData and a table called Products in the default dbo schema. The following bcp command then imports the data contained in the source code file Products.csv. IMPORTANT: If you change the names of the database or the table in the setup.sql script, make sure you change them here too, and again update the password if you use something other than 'Yukon900'.

bcp DemoData.dbo.Products in "/usr/src/app/Products.csv" -c -t',' -S localhost -U sa -P Yukon900
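
After the import, a quick way to verify the row count is another sqlcmd call (a sketch; run it inside the container or wherever the tools are installed):

/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P Yukon900 -d DemoData -Q "SELECT COUNT(*) FROM Products"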

setup.sql

The setup.sql file defines some simple commands to create a database and some simple schema. You could use a .sql file like this for other purposes such as creating logins, assigning permissions, creating stored procedures, and much more. When creating a database in production situations, you will probably want to be more specific about where the database files are created so that they are stored on persistent storage (see the volume example after the script below). This SQL script creates a table with two columns - ID (integer) and ProductName (nvarchar(max)).

CREATE DATABASE DemoData;
GO
USE DemoData;
GO
CREATE TABLE Products (ID int, ProductName nvarchar(max));
GO
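
One simple way to keep the database files on persistent storage, as mentioned above, is to mount a volume over SQL Server's default data directory when you run the container; a sketch using a named Docker volume (the volume name mssql-data is just an example):

docker run -e ACCEPT_EULA=Y -e SA_PASSWORD=Yukon900 -p 1433:1433 -p 8080:8080 -v mssql-data:/var/opt/mssql -d node-web-app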

Products.csv

This CSV data file contains some sample data to populate the Products table. It has two columns - ID and ProductName separated by a comma. The bcp command in the import-data.sh script uses this file to import the data into the Products table created by the setup.sql script file.

1,Car
2,Truck
3,Motorcycle
4,Bicycle
5,Horse
6,Boat
7,Plane
8,Scooter
9,Gopher
.... more data if you want ....

server.js

The server.js file defines the node application that exposes the web service and retrieves the data from SQL Server and returns it to the requestor as a JSON response.

The require statements at the top of the file bring in some libraries like tedious and express and define some global variables which can be used by the rest of the application.

var express = require("express");
var app = express();
var connection = require('tedious').Connection;
var request = require('tedious').Request;

The app.get defines the route for this application. Any GET request that comes to the root of this application will be handled by this function. This effectively creates a simple REST-style interface for returning data in JSON from a GET request.

app.get('/', function (req, res) {

The next set of commands define the connection parameters and creates a connection object.

IMPORTANT: Make sure to change your password here if you use something other than 'Yukon900'.

IMPORTANT: If you change the name of the database in the setup.sql script, make sure you change it here too.

    var config = {
        server: 'localhost',
        authentication: {
            type: 'default',
            options: {
                userName: 'sa',
                password: 'Yukon900' // update me
            }
        },
        options: {
            database: 'DemoData' // note: database belongs in this top-level options object, not under authentication
        }
    };
var conn = new connection(config);

This next command registers the event handler function for the connection's 'connect' event.

conn.on('connect', function(err) {

Assuming the connection is made correctly, the next command sets up the query that will be executed. This uses SQL Server's built in JSON functions to retrieve the data in JSON format for us so we don't have to write code to convert the data from a traditional row set into JSON. Nice!

More information on JSON in SQL server

sqlreq = new request("SELECT * FROM Products FOR JSON AUTO", function(err, rowCount) {

The next set of commands sets up the event handler function for the SQL request's row event, which is triggered for each row in a response. In this case there will only be a single row and a single column because we are using FOR JSON AUTO to get the data returned as a single JSON string. Assuming the request comes back with a row and a column value, we simply return the JSON string (the column.value) directly to the browser in the response (res).

sqlreq.on('row', function(columns) {
   columns.forEach(function(column) {  
      if (column.value === null) {  
         console.log('NULL');
      } else {  
         res.send(column.value);
      }  
   });
});

This is the command that actually sends in the SQL request:

conn.execSql(sqlreq);

This command starts the app listening on port 8080. IMPORTANT: If you change the port number in the Dockerfile EXPOSE command make sure you change it here too.

var server = app.listen(8080, function () {
    console.log("Listening on port %s...", server.address().port);
});

Alternative approach using SQL scripts for seeding schema and data

So far, the steps above have described how to create a Docker image of SQL Server that seeds the database on first run. If you have a large database to seed, however, setup.sql can contain all of the CREATE/INSERT SQL statements needed to seed it, so you do not have to import the data from a CSV file.

You can either add the CREATE/INSERT statements to setup.sql and have them run each time a container is created, or you can create a new image that has the schema and data captured inside it. In the latter case, after starting a new container and executing the .sql script, commit the running container with the seeded database as a new image using the docker commit command.

docker commit <container_id> <docker image tag>

You can now use this new image in other Docker-based projects, including in a Docker Compose app via a docker-compose.yml file.
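
Putting those steps together, a rough sketch of producing a pre-seeded image might look like this (the container name seed-me and image tag seeded-mssql-demo are just examples, and the sleep is a crude stand-in for waiting until setup.sql and bcp have finished):

docker run -e ACCEPT_EULA=Y -e SA_PASSWORD=Yukon900 -p 1433:1433 -d --name seed-me node-web-app
sleep 60   # wait for SQL Server to start and the seeding to complete
docker commit seed-me seeded-mssql-demo
docker run -e ACCEPT_EULA=Y -e SA_PASSWORD=Yukon900 -p 1433:1433 -d seeded-mssql-demo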

The Node.js dependency can also be removed in this case; Node.js is only used here as an example web service to show that data can be retrieved from SQL Server.

mssql-node-docker-demo-app's People

Contributors

alandball, imaarthi, ingalless, porcupinenick, robrich, septem151, ttilberg, twright-msft


mssql-node-docker-demo-app's Issues

Separation of concerns?

It would be nice to have separation of concerns.

  • A Docker DB container with its own volume, with safe, reliable, robust, production-ready DB initialization (no 90-second wait).

  • A Docker app container communicating with the Docker DB container.

Question tailing microsoft/mssql-server-linux logs

So, I'm following the pattern of using entrypoint.sh to start and seed the server inside of a docker-compose file. The problem I'm running into is my app resides in a separate container. So, I can't just run npm start to keep the container alive. I'm wondering if I can do something like this:

/opt/mssql/bin/sqlservr & /app/mssql-import.sh && tail -f <MSSQL_LOGS>

Any insights are much appreciated! Been banging my head on this one for a while. Thanks!

db container exited with code 0

This setup runs my setup.sql script, creates my database and switches context to that database, and then the container 'exited with code 0'.

Keep container alive without node

How would you keep the container alive without running a node web service as the last command while still starting sql server and initializing the database after the server has started? Wouldn't it be better to register sql server as a service, and then import the data and have the final command check for the service?

Something like this:

/opt/mssql/bin/sqlservr & ./init_db.sh && while true; do sleep 1; done

sa login fails

I'm trying to run this as coded (i.e. I have not updated any passwords so should all just work) and when run I get the following error (Docker on Mac):

Bergkamp:mssql-node-docker-demo-app convery$ docker run -e ACCEPT_EULA=Y -e SA_PASSWORD=Yukon900 -p 1433:1433 -p 8080:8080 -d node-web-app
08ce68c8584e651e2f17c7dc8730b41ef38964664b7ff93a8ecb8e921d9a6472
Bergkamp:mssql-node-docker-demo-app convery$ docker ps -a
CONTAINER ID        IMAGE               COMMAND                  CREATED             STATUS              PORTS                                            NAMES
08ce68c8584e        node-web-app        "/bin/sh -c '/bin/..."   4 seconds ago       Up 1 second         0.0.0.0:1433->1433/tcp, 0.0.0.0:8080->8080/tcp   nostalgic_yonath
Bergkamp:mssql-node-docker-demo-app convery$ docker logs nostalgic_yonath
Configuring Microsoft(R) SQL Server(R)...

> [email protected] start /usr/src/app
> node server.js

Listening on port 8080...
Configuration complete.
This is an evaluation version.  There are [146] days left in the evaluation period.
RegQueryValueEx HADR for key "Software\Microsoft\Microsoft SQL Server\MSSQL\MSSQLServer\HADR" failed.
2017-04-18 19:06:49.55 Server      Microsoft SQL Server vNext (CTP1.4) - 14.0.405.198 (X64) 
	Mar 11 2017 01:54:12 
	Copyright (C) 2016 Microsoft Corporation. All rights reserved.
	on Linux (Ubuntu 16.04.1 LTS)
2017-04-18 19:06:49.55 Server      UTC adjustment: 0:00
2017-04-18 19:06:49.55 Server      (c) Microsoft Corporation.
2017-04-18 19:06:49.55 Server      All rights reserved.
2017-04-18 19:06:49.55 Server      Server process ID is 4116.
2017-04-18 19:06:49.55 Server      Logging SQL Server messages in file 'C:\var\opt\mssql\log\errorlog'.
2017-04-18 19:06:49.55 Server      Registry startup parameters: 
	 -d C:\var\opt\mssql\data\master.mdf
	 -l C:\var\opt\mssql\data\mastlog.ldf
	 -e C:\var\opt\mssql\log\errorlog
2017-04-18 19:06:49.56 Server      SQL Server detected 4 sockets with 1 cores per socket and 1 logical processors per socket, 4 total logical processors; using 4 logical processors based on SQL Server licensing. This is an informational message; no user action is required.
2017-04-18 19:06:49.56 Server      SQL Server is starting at normal priority base (=7). This is an informational message only. No user action is required.
2017-04-18 19:06:49.56 Server      Detected 3158 MB of RAM. This is an informational message; no user action is required.
2017-04-18 19:06:49.57 Server      Using conventional memory in the memory manager.
2017-04-18 19:06:49.60 Server      Default collation: SQL_Latin1_General_CP1_CI_AS (us_english 1033)
2017-04-18 19:06:49.67 Server      Buffer pool extension is already disabled. No action is necessary. 
2017-04-18 19:06:49.73 Server      InitializeExternalUserGroupSid failed. Implied authentication will be disabled.
2017-04-18 19:06:49.73 Server      Implied authentication manager initialization failed. Implied authentication will be disabled.
2017-04-18 19:06:49.75 Server      The maximum number of dedicated administrator connections for this instance is '1'
2017-04-18 19:06:49.75 Server      Node configuration: node 0: CPU mask: 0x000000000000000f:0 Active CPU mask: 0x000000000000000f:0. This message provides a description of the NUMA configuration for this computer. This is an informational message only. No user action is required.
2017-04-18 19:06:49.76 Server      Using dynamic lock allocation.  Initial allocation of 2500 Lock blocks and 5000 Lock Owner blocks per node.  This is an informational message only.  No user action is required.
2017-04-18 19:06:49.77 Server      In-Memory OLTP initialized on lowend machine.
2017-04-18 19:06:49.79 Server      Database Instant File Initialization: enabled. For security and performance considerations see the topic 'Database Instant File Initialization' in SQL Server Books Online. This is an informational message only. No user action is required.
2017-04-18 19:06:49.80 Server      Query Store settings initialized with enabled = 1, 
2017-04-18 19:06:49.81 spid6s      Starting up database 'master'.
2017-04-18 19:06:49.81 Server      Software Usage Metrics is disabled.
2017-04-18 19:06:49.90 spid6s      9 transactions rolled forward in database 'master' (1:0). This is an informational message only. No user action is required.
2017-04-18 19:06:49.94 spid6s      0 transactions rolled back in database 'master' (1:0). This is an informational message only. No user action is required.
2017-04-18 19:06:49.94 spid6s      Recovery is writing a checkpoint in database 'master' (1). This is an informational message only. No user action is required.
2017-04-18 19:06:50.02 spid6s      Buffer pool extension is already disabled. No action is necessary. 
2017-04-18 19:06:50.02 spid6s      Resource governor reconfiguration succeeded.
2017-04-18 19:06:50.02 spid6s      SQL Server Audit is starting the audits. This is an informational message. No user action is required.
2017-04-18 19:06:50.02 spid6s      SQL Server Audit has started the audits. This is an informational message. No user action is required.
2017-04-18 19:06:50.05 spid6s      SQL Trace ID 1 was started by login "sa".
2017-04-18 19:06:50.07 spid6s      Server name is '08ce68c8584e'. This is an informational message only. No user action is required.
2017-04-18 19:06:50.07 spid6s      The NETBIOS name of the local node that is running the server is '08ce68c8584e'. This is an informational message only. No user action is required.
2017-04-18 19:06:50.08 spid16s     Password policy update was successful.
2017-04-18 19:06:50.09 spid19s     Always On: The availability replica manager is starting. This is an informational message only. No user action is required.
2017-04-18 19:06:50.09 spid6s      Starting up database 'msdb'.
2017-04-18 19:06:50.09 spid19s     Always On: The availability replica manager is waiting for the instance of SQL Server to allow client connections. This is an informational message only. No user action is required.
2017-04-18 19:06:50.09 spid7s      Starting up database 'mssqlsystemresource'.
2017-04-18 19:06:50.10 spid7s      The resource database build version is 14.00.405. This is an informational message only. No user action is required.
2017-04-18 19:06:50.11 spid7s      Starting up database 'model'.
2017-04-18 19:06:50.36 spid6s      6 transactions rolled forward in database 'msdb' (4:0). This is an informational message only. No user action is required.
2017-04-18 19:06:50.38 spid6s      0 transactions rolled back in database 'msdb' (4:0). This is an informational message only. No user action is required.
2017-04-18 19:06:50.38 spid7s      Polybase feature disabled.
2017-04-18 19:06:50.38 spid7s      Clearing tempdb database.
2017-04-18 19:06:50.41 spid16s     A self-generated certificate was successfully loaded for encryption.
2017-04-18 19:06:50.42 spid16s     Server is listening on [ 0.0.0.0 <ipv4> 1433].
2017-04-18 19:06:50.42 Server      Server is listening on [ 127.0.0.1 <ipv4> 1434].
2017-04-18 19:06:50.42 Server      Dedicated admin connection support was established for listening locally on port 1434.
2017-04-18 19:06:50.43 spid16s     SQL Server is now ready for client connections. This is an informational message; no user action is required.
2017-04-18 19:06:50.67 spid7s      Starting up database 'tempdb'.
2017-04-18 19:06:50.85 spid7s      The tempdb database has 1 data file(s).
2017-04-18 19:06:50.85 spid19s     The Service Broker endpoint is in disabled or stopped state.
2017-04-18 19:06:50.85 spid19s     The Database Mirroring endpoint is in disabled or stopped state.
2017-04-18 19:06:50.86 spid19s     Service Broker manager has started.
2017-04-18 19:06:50.87 spid6s      Recovery is complete. This is an informational message only. No user action is required.
2017-04-18 19:07:04.04 Logon       Error: 18456, Severity: 14, State: 38.
2017-04-18 19:07:04.04 Logon       Login failed for user 'sa'. Reason: Failed to open the explicitly specified database 'DemoData'. [CLIENT: 127.0.0.1]
{ ConnectionError: Login failed for user 'sa'.
    at ConnectionError (/node_modules/tedious/lib/errors.js:12:12)
    at Parser.<anonymous> (/node_modules/tedious/lib/connection.js:373:38)
    at emitOne (events.js:96:13)
    at Parser.emit (events.js:188:7)
    at Parser.<anonymous> (/node_modules/tedious/lib/token/token-stream-parser.js:54:15)
    at emitOne (events.js:96:13)
    at Parser.emit (events.js:188:7)
    at readableAddChunk (/node_modules/readable-stream/lib/_stream_readable.js:212:18)
    at Parser.Readable.push (/node_modules/readable-stream/lib/_stream_readable.js:171:10)
    at Parser.Transform.push (/node_modules/readable-stream/lib/_stream_transform.js:123:32) message: 'Login failed for user \'sa\'.', code: 'ELOGIN' }

The passwords appear to be consistent throughout.

not working with latest sqlserver 2019

I'm using mcr.microsoft.com/mssql/server:2019-latest,

but this setup is not working as expected, because the latest 2019 image runs as a non-root user.

Could you please create an article for using the 2019 non-root image?

An alternative to the 90 second sleep

Thanks for the sample! I was able to eliminate your 90s delay using the following & figured I'd share it.

while [ ! -f /var/opt/mssql/log/errorlog ]
do
  sleep 2
done

tail -f /var/opt/mssql/log/errorlog | while read LOGLINE
do
   [[ "${LOGLINE}" == *"Using 'xpstar.dll' version"* ]] && pkill -P $$ tail
done

Feel free to upvote these 2 if this is useful to you:
https://superuser.com/a/449307
https://stackoverflow.com/a/2379904

Image ms sql without node.js

Hello

I removed Node.js from the image and now my MS SQL container keeps restarting.

I do not know what is wrong.

Lukas

This sample doesn't work at all with current sqlserver 2019 docker version

I reimplemented server.js using the Node.js package mssql instead of tedious.
The trustServerCertificate: true option must be added for local connections to avoid a certificate error.

Actually, two things should be changed only:

  • Replace the line RUN npm install tedious in Dockerfile with RUN npm install mssql
  • Replace server.js with this block of text
var express = require("express");
var app = express();
var sql = require('mssql')

app.get('/', async function (req, res) {
    console.log("********** SERVER.JS app.get")
    var config = {
        server: 'localhost',
        user: 'sa',
        password: 'Yukon900', // update me
        database: 'DemoData', // doesn't seem to work?
        options: {
            encrypt: true, // for azure
            trustServerCertificate: true // change to true for local dev / self-signed certs        
        },
    };
    var pool
    try {
        pool = await sql.connect(config)
        console.log("********** SERVER.JS connected")
        const result = await sql.query`SELECT * FROM DemoData.dbo.Products` // FOR JSON AUTO
        console.dir(result)
        res.send(result?.recordset);
    } catch (err) {
        console.log(err)
        res.send(JSON.stringify(err))
    } finally {
        pool?.close()
    }
})  

var server = app.listen(8080, function () {
    console.log(`========== Listening on http://localhost:${server.address().port}`);
});

I have provided an updated readme, too.

Entrypoint

Why does the Dockerfile not just use ENTRYPOINT "/bin/bash ./entrypoint.sh" instead of CMD /bin/bash ./entrypoint.sh?

Docker-compose newbie question

First off, thanks for this repo, as it definitely helped me out. I was able to get everything running using docker build and docker run. My Dockerfile uses sqlcmd to create a new database and I'm using sqlpackage to deploy a dacpac file. Everything is gravy on that front.

My current issue is likely a result of my lack of expertise in Docker in general. When I use docker-compose to run the solution, it doesn't run the entrypoint shell script. My docker-compose file is below.

Any advice would be greatly appreciated.

Thanks,
Isaac


version: "3"
services:
  db:
    build: .
    ports:
      - "1433:1433"

Did exactly what is done in this demo, but i am getting : Login failed for user 'sa'. Reason: Failed to open the explicitly specified database 'CIT'.

Hi @twright-msft, I am trying to do exactly the same thing as shown in your demo. Mine is a Windows machine and I allocated 7 GB of memory for Docker with 2 CPUs. When I run the Docker image with the command below:

docker run -e ACCEPT_EULA=Y -e SA_PASSWORD=Yukon900 -d testsqlserver

I get the error below:

2017-08-21 16:58:23.16 Logon Error: 18456, Severity: 14, State: 38.
2017-08-21 16:58:23.16 Logon Login failed for user 'sa'. Reason: Failed to open the explicitly specified database 'CIT'. [CLIENT: 172.17.0.2]
SQLState = 37000, NativeError = 4060
Error = [unixODBC][Microsoft][ODBC Driver 13 for SQL Server][SQL Server]Cannot open database "CIT" requested by the login. The login failed.
SQLState = 28000, NativeError = 18456
Error = [unixODBC][Microsoft][ODBC Driver 13 for SQL Server][SQL Server]Login failed for user 'sa'.
SQLState = 08001, NativeError = 10057
Error = [unixODBC][Microsoft][ODBC Driver 13 for SQL Server]TCP Provider: Error code 0x2749
SQLState = 08001, NativeError = 10057
Error = [unixODBC][Microsoft][ODBC Driver 13 for SQL Server]A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.

Any kind of help is appreciated!!
