
🚀 A lightweight, framework-agnostic database migration tool.

License: MIT License


Dbmate


Dbmate is a database migration tool that will keep your database schema in sync across multiple developers and your production servers.

It is a standalone command line tool that can be used with Go, Node.js, Python, Ruby, PHP, or any other language or framework you are using to write database-backed applications. This is especially helpful if you are writing multiple services in different languages, and want to maintain some sanity with consistent development tools.

For a comparison between dbmate and other popular database schema migration tools, please see Alternatives.


Features

  • Supports MySQL, PostgreSQL, SQLite, ClickHouse, and BigQuery.
  • Uses plain SQL for writing schema migrations.
  • Migrations are timestamp-versioned, to avoid version number conflicts with multiple developers.
  • Migrations are run atomically inside a transaction.
  • Supports creating and dropping databases (handy in development/test).
  • Supports saving a schema.sql file to easily diff schema changes in git.
  • Database connection URL is defined using an environment variable (DATABASE_URL by default), or specified on the command line.
  • Built-in support for reading environment variables from your .env file.
  • Easy to distribute, single self-contained binary.

Installation

NPM

Install using NPM:

$ npm install --save-dev dbmate
$ npx dbmate --help
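
If you install via NPM, you can also wire dbmate into your package.json scripts; the fragment below is only an illustration, and the script names are arbitrary:

{
  "scripts": {
    "db:new": "dbmate new",
    "db:migrate": "dbmate up",
    "db:rollback": "dbmate rollback"
  }
}

$ npm run db:migrate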

macOS

Install using Homebrew:

$ brew install dbmate

Linux

Install the binary directly:

$ sudo curl -fsSL -o /usr/local/bin/dbmate https://github.com/amacneil/dbmate/releases/latest/download/dbmate-linux-amd64
$ sudo chmod +x /usr/local/bin/dbmate

Windows

Install using Scoop:

scoop install dbmate

Docker

Docker images are published to GitHub Container Registry (ghcr.io/amacneil/dbmate).

Remember to set --network=host, or see this comment for more tips on using dbmate with Docker networking:

$ docker run --rm -it --network=host ghcr.io/amacneil/dbmate --help

If you wish to create or apply migrations, you will need to use Docker's bind mount feature to make your local working directory (pwd) available inside the dbmate container:

$ docker run --rm -it --network=host -v "$(pwd)/db:/db" ghcr.io/amacneil/dbmate new create_users_table
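
If you run the Docker image frequently, a shell alias saves repeating the flags. This is a sketch; adjust the mounted path to your project layout:

alias dbmate='docker run --rm -it --network=host -v "$(pwd)/db:/db" ghcr.io/amacneil/dbmate'

$ dbmate new create_users_table
$ dbmate up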

Heroku

To use dbmate on Heroku, either use the NPM method, or store the Linux binary in your git repository:

$ mkdir -p bin
$ curl -fsSL -o bin/dbmate https://github.com/amacneil/dbmate/releases/latest/download/dbmate-linux-amd64
$ chmod +x bin/dbmate
$ git add bin/dbmate
$ git commit -m "Add dbmate binary"
$ git push heroku master
$ heroku run bin/dbmate --help

Commands

dbmate --help    # print usage help
dbmate new       # generate a new migration file
dbmate up        # create the database (if it does not already exist) and run any pending migrations
dbmate create    # create the database
dbmate drop      # drop the database
dbmate migrate   # run any pending migrations
dbmate rollback  # roll back the most recent migration
dbmate down      # alias for rollback
dbmate status    # show the status of all migrations (supports --exit-code and --quiet)
dbmate dump      # write the database schema.sql file
dbmate load      # load schema.sql file to the database
dbmate wait      # wait for the database server to become available

Command Line Options

The following options are available with all commands. You must use command line arguments in the order dbmate [global options] command [command options]. Most options can also be configured via environment variables (and loaded from your .env file, which is helpful to share configuration between team members).

  • --url, -u "protocol://host:port/dbname" - specify the database url directly. (env: DATABASE_URL)
  • --env, -e "DATABASE_URL" - specify an environment variable to read the database connection URL from.
  • --env-file ".env" - specify alternate environment variable file(s) to load.
  • --migrations-dir, -d "./db/migrations" - where to keep the migration files. (env: DBMATE_MIGRATIONS_DIR)
  • --migrations-table "schema_migrations" - database table to record migrations in. (env: DBMATE_MIGRATIONS_TABLE)
  • --schema-file, -s "./db/schema.sql" - a path to keep the schema.sql file. (env: DBMATE_SCHEMA_FILE)
  • --no-dump-schema - don't auto-update the schema.sql file on migrate/rollback (env: DBMATE_NO_DUMP_SCHEMA)
  • --strict - fail if migrations would be applied out of order (env: DBMATE_STRICT)
  • --wait - wait for the db to become available before executing the subsequent command (env: DBMATE_WAIT)
  • --wait-timeout 60s - timeout for --wait flag (env: DBMATE_WAIT_TIMEOUT)
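
Most of these options can instead be pinned in your .env file using the environment variables listed above. For example (the values shown are placeholders):

$ cat .env
DATABASE_URL="postgres://postgres@127.0.0.1:5432/myapp_development?sslmode=disable"
DBMATE_MIGRATIONS_DIR="./db/migrations"
DBMATE_SCHEMA_FILE="./db/schema.sql"
DBMATE_WAIT="true"
DBMATE_WAIT_TIMEOUT="30s"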

Usage

Connecting to the Database

Dbmate locates your database using the DATABASE_URL environment variable by default. If you are writing a twelve-factor app, you should be storing all connection strings in environment variables.

To make this easy in development, dbmate looks for a .env file in the current directory, and treats any variables listed there as if they were specified in the current environment (existing environment variables take precedence, however).

If you do not already have a .env file, create one and add your database connection URL:

$ cat .env
DATABASE_URL="postgres://[email protected]:5432/myapp_development?sslmode=disable"

DATABASE_URL should be specified in the following format:

protocol://username:password@host:port/database_name?options
  • protocol must be one of mysql, postgres, postgresql, sqlite, sqlite3, clickhouse
  • username and password must be URL encoded (you will get an error if you use special characters)
  • host can be either a hostname or IP address
  • options are driver-specific (refer to the underlying Go SQL drivers if you wish to use these)

Dbmate can also load the connection URL from a different environment variable. For example, before running your test suite, you may wish to drop and recreate the test database. One easy way to do this is to store your test database connection URL in the TEST_DATABASE_URL environment variable:

$ cat .env
DATABASE_URL="postgres://[email protected]:5432/myapp_dev?sslmode=disable"
TEST_DATABASE_URL="postgres://[email protected]:5432/myapp_test?sslmode=disable"

You can then specify this environment variable in your test script (Makefile or similar):

$ dbmate -e TEST_DATABASE_URL drop
Dropping: myapp_test
$ dbmate -e TEST_DATABASE_URL --no-dump-schema up
Creating: myapp_test
Applying: 20151127184807_create_users_table.sql

Alternatively, you can specify the url directly on the command line:

$ dbmate -u "postgres://postgres@127.0.0.1:5432/myapp_test?sslmode=disable" up

The only advantage of using dbmate -e TEST_DATABASE_URL over dbmate -u $TEST_DATABASE_URL is that the former takes advantage of dbmate's automatic .env file loading.
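
For example, a test script might reset the test database before running your tests. This is a sketch; run_tests.sh is a placeholder for your own test runner:

#!/bin/sh
set -e

dbmate -e TEST_DATABASE_URL drop
dbmate -e TEST_DATABASE_URL --no-dump-schema up
./run_tests.sh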

PostgreSQL

When connecting to Postgres, you may need to add the sslmode=disable option to your connection string, as dbmate by default requires a TLS connection (some other frameworks/languages allow unencrypted connections by default).

DATABASE_URL="postgres://username:[email protected]:5432/database_name?sslmode=disable"

A socket or host parameter can be specified to connect through a unix socket (note: specify the directory only):

DATABASE_URL="postgres://username:password@/database_name?socket=/var/run/postgresql"

A search_path parameter can be used to specify the current schema while applying migrations, as well as for dbmate's schema_migrations table. If the schema does not exist, it will be created automatically. If multiple comma-separated schemas are passed, the first will be used for the schema_migrations table.

DATABASE_URL="postgres://username:[email protected]:5432/database_name?search_path=myschema"
DATABASE_URL="postgres://username:[email protected]:5432/database_name?search_path=myschema,public"

MySQL

DATABASE_URL="mysql://username:[email protected]:3306/database_name"

A socket parameter can be specified to connect through a unix socket:

DATABASE_URL="mysql://username:password@/database_name?socket=/var/run/mysqld/mysqld.sock"

SQLite

SQLite databases are stored on the filesystem, so you do not need to specify a host. By default, files are relative to the current directory. For example, the following will create a database at ./db/database.sqlite3:

DATABASE_URL="sqlite:db/database.sqlite3"

To specify an absolute path, add a forward slash to the path. The following will create a database at /tmp/database.sqlite3:

DATABASE_URL="sqlite:/tmp/database.sqlite3"

ClickHouse

DATABASE_URL="clickhouse://username:[email protected]:9000/database_name"

To work with a ClickHouse cluster, there are 4 connection query parameters that can be supplied:

  • on_cluster - Indication to use cluster statements and a replicated migration table. (default: false) If this parameter is not supplied, other cluster-related query parameters are ignored.
DATABASE_URL="clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster"

DATABASE_URL="clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster=true"
  • cluster_macro (Optional) - Macro value to be used for ON CLUSTER statements and for the replicated migration table engine ZooKeeper path. (default: {cluster})
DATABASE_URL="clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster&cluster_macro={my_cluster}"
  • replica_macro (Optional) - Macro value to be used for the replica name in the replicated migration table engine. (default: {replica})
DATABASE_URL="clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster&replica_macro={my_replica}"
  • zoo_path (Optional) - The ClickHouse/ZooKeeper path for the replicated migration table. (default: /clickhouse/tables/<cluster_macro>/{table})
DATABASE_URL="clickhouse://username:password@127.0.0.1:9000/database_name?on_cluster&zoo_path=/zk/path/tables"

See other supported connection options.

BigQuery

Use the following DATABASE_URL format when connecting to BigQuery in GCP:

bigquery://projectid/location/dataset

projectid (mandatory) - Project ID

dataset (mandatory) - Dataset name within the Project

location (optional) - Where Dataset is created

NOTE: Follow this doc on how to set the GOOGLE_APPLICATION_CREDENTIALS environment variable for proper authentication.

Use the following format when connecting to a custom endpoint, e.g. the BigQuery Emulator:

bigquery://host:port/projectid/location/dataset?disable_auth=true

disable_auth (optional) - Pass true to skip authentication; use only for testing and connecting to an emulator.
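
For example, a typical setup might look like this (the project, location, dataset, and key path are placeholders):

$ export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"
$ export DATABASE_URL="bigquery://my-project/us-east1/my_dataset"
$ dbmate up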

Creating Migrations

To create a new migration, run dbmate new create_users_table. You can name the migration anything you like. This will create a file db/migrations/20151127184807_create_users_table.sql in the current directory:

-- migrate:up

-- migrate:down

To write a migration, simply add your SQL to the migrate:up section:

-- migrate:up
create table users (
  id integer,
  name varchar(255),
  email varchar(255) not null
);

-- migrate:down

Note: Migration files are named in the format [version]_[description].sql. Only the version (defined as all leading numeric characters in the file name) is recorded in the database, so you can safely rename a migration file without having any effect on its current application state.

Running Migrations

Run dbmate up to run any pending migrations.

$ dbmate up
Creating: myapp_development
Applying: 20151127184807_create_users_table.sql
Writing: ./db/schema.sql

Note: dbmate up will create the database if it does not already exist (assuming the current user has permission to create databases). If you want to run migrations without creating the database, run dbmate migrate.

Pending migrations are always applied in numerical order. However, dbmate does not prevent migrations from being applied out of order if they are committed independently (for example: if a developer has been working on a branch for a long time, and commits a migration which has a lower version number than other already-applied migrations, dbmate will simply apply the pending migration). See #159 for a more detailed explanation.
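
If you want dbmate to fail instead of silently applying an out-of-order migration, pass the --strict flag described under Command Line Options:

$ dbmate --strict up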

Rolling Back Migrations

By default, dbmate doesn't know how to roll back a migration. In development, it's often useful to be able to revert your database to a previous state. To accomplish this, implement the migrate:down section:

-- migrate:up
create table users (
  id integer,
  name varchar(255),
  email varchar(255) not null
);

-- migrate:down
drop table users;

Run dbmate rollback to roll back the most recent migration:

$ dbmate rollback
Rolling back: 20151127184807_create_users_table.sql
Writing: ./db/schema.sql

Migration Options

dbmate supports options passed to a migration block in the form of key:value pairs. List of supported options:

  • transaction

transaction

transaction is useful if you need to run some SQL which cannot be executed from within a transaction. For example, in Postgres, you would need to disable transactions for migrations that alter an enum type to add a value:

-- migrate:up transaction:false
ALTER TYPE colors ADD VALUE 'orange' AFTER 'red';

transaction will default to true if your database supports it.

Waiting For The Database

If you use a Docker development environment for your project, you may encounter issues with the database not being immediately ready when running migrations or unit tests. This can be due to the database server having only just started.

In general, your application should be resilient to not having a working database connection on startup. However, for the purpose of running migrations or unit tests, this is not practical. The wait command avoids this situation by allowing you to pause a script or other application until the database is available. Dbmate will attempt a connection to the database server every second, up to a maximum of 60 seconds.

If the database is available, wait will return no output:

$ dbmate wait

If the database is unavailable, wait will block until the database becomes available:

$ dbmate wait
Waiting for database....

You can also use the --wait flag with other commands if you sometimes see failures caused by the database not yet being ready:

$ dbmate --wait up
Waiting for database....
Creating: myapp_development

You can customize the timeout using --wait-timeout (default 60s). If the database is still not available, the command will return an error:

$ dbmate --wait-timeout=5s wait
Waiting for database.....
Error: unable to connect to database: dial tcp 127.0.0.1:5432: connect: connection refused

Please note that the wait command does not verify whether your specified database exists, only that the server is available and ready (so it will return success if the database server is available, but your database has not yet been created).
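
For example, a container entrypoint script might combine --wait with up before starting the application. This is a sketch; start_server is a placeholder for your own command:

#!/bin/sh
set -e

# Wait for the database server, then create the database and run migrations
dbmate --wait --wait-timeout=30s up

# Start the application (placeholder command)
exec ./start_server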

Exporting Schema File

When you run the up, migrate, or rollback commands, dbmate will automatically create a ./db/schema.sql file containing a complete representation of your database schema. Dbmate keeps this file up to date for you, so you should not manually edit it.

It is recommended to check this file into source control, so that you can easily review changes to the schema in commits or pull requests. It's also possible to use this file when you want to quickly load a database schema, without running each migration sequentially (for example in your test harness). However, if you do not wish to save this file, you could add it to your .gitignore, or pass the --no-dump-schema command line option.

To dump the schema.sql file without performing any other actions, run dbmate dump. Unlike other dbmate actions, this command relies on the respective pg_dump, mysqldump, or sqlite3 commands being available in your PATH. If these tools are not available, dbmate will silently skip the schema dump step during up, migrate, or rollback actions. You can diagnose the issue by running dbmate dump and looking at the output:

$ dbmate dump
exec: "pg_dump": executable file not found in $PATH

On Ubuntu or Debian systems, you can fix this by installing postgresql-client, mysql-client, or sqlite3 respectively. Ensure that the package version you install is greater than or equal to the version running on your database server.
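
For example, on Ubuntu or Debian (install the client package matching your database server):

$ sudo apt-get update
$ sudo apt-get install postgresql-client
# or: sudo apt-get install mysql-client
# or: sudo apt-get install sqlite3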

Note: The schema.sql file will contain a complete schema for your database, even if some tables or columns were created outside of dbmate migrations.

Library

Use dbmate as a library

Dbmate is designed to be used as a CLI with any language or framework, but it can also be used as a library in a Go application.

Here is a simple example. Remember to import the driver you need!

package main

import (
	"net/url"

	"github.com/amacneil/dbmate/v2/pkg/dbmate"
	_ "github.com/amacneil/dbmate/v2/pkg/driver/sqlite"
)

func main() {
	u, _ := url.Parse("sqlite:foo.sqlite3")
	db := dbmate.New(u)

	err := db.CreateAndMigrate()
	if err != nil {
		panic(err)
	}
}

See the reference documentation for more options.

Embedding migrations

Migrations can be embedded into your application binary using Go's embed functionality.

Use db.FS to specify the filesystem used for reading migrations:

package main

import (
	"embed"
	"fmt"
	"net/url"

	"github.com/amacneil/dbmate/v2/pkg/dbmate"
	_ "github.com/amacneil/dbmate/v2/pkg/driver/sqlite"
)

//go:embed db/migrations/*.sql
var fs embed.FS

func main() {
	u, _ := url.Parse("sqlite:foo.sqlite3")
	db := dbmate.New(u)
	db.FS = fs

	fmt.Println("Migrations:")
	migrations, err := db.FindMigrations()
	if err != nil {
		panic(err)
	}
	for _, m := range migrations {
		fmt.Println(m.Version, m.FilePath)
	}

	fmt.Println("\nApplying...")
	err = db.CreateAndMigrate()
	if err != nil {
		panic(err)
	}
}

Concepts

Migration files

Migration files are very simple, and are stored in ./db/migrations by default. You can create a new migration file named [date]_create_users.sql by running dbmate new create_users. Here is an example:

-- migrate:up
create table users (
  id integer,
  name varchar(255)
);

-- migrate:down
drop table if exists users;

Both up and down migrations are stored in the same file, for ease of editing. Both up and down directives are required, even if you choose not to implement the down migration.

When you apply a migration, dbmate stores only the version number, not the contents, so you should always roll back a migration before modifying its contents. Because only the version is recorded, you can safely rename a migration file without affecting its applied status, as long as you keep the version number intact.

Schema file

The schema file is written to ./db/schema.sql by default. It is a complete dump of your database schema, including any applied migrations, and any other modifications you have made.

This file should be checked in to source control, so that you can easily compare the diff of a migration. You can use the schema file to quickly restore your database without needing to run all migrations.
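
For example, a test harness could rebuild the database from the schema file instead of replaying every migration, using the commands listed earlier (a sketch):

$ dbmate drop
$ dbmate create
$ dbmate load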

Schema migrations table

Dbmate stores a record of each applied migration in a table named schema_migrations. This table will be created for you automatically if it does not already exist.

The table is very simple:

CREATE TABLE IF NOT EXISTS schema_migrations (
  version VARCHAR(255) PRIMARY KEY
)
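
You can inspect which migrations have been applied by querying this table directly, for example:

SELECT version FROM schema_migrations ORDER BY version;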

You can customize the name of this table using the --migrations-table flag or DBMATE_MIGRATIONS_TABLE environment variable.

Alternatives

Why another database schema migration tool? Dbmate was inspired by many other tools, primarily Active Record Migrations, with the goals of being trivial to configure, and language & framework independent. Here is a comparison between dbmate and other popular migration tools.

The comparison covers dbmate, goose, sql-migrate, golang-migrate, activerecord, sequelize, flyway, and sqitch, across the following features and drivers:

Features

  • Plain SQL migration files
  • Support for creating and dropping databases
  • Support for saving schema dump files
  • Timestamp-versioned migration files
  • Custom schema migrations table
  • Ability to wait for database to become ready
  • Database connection string loaded from environment variables
  • Automatically load .env file
  • No separate configuration file
  • Language/framework independent

Drivers

  • PostgreSQL
  • MySQL
  • SQLite
  • ClickHouse

If you notice any inaccuracies in this comparison, please propose a change.

Contributing

Dbmate is written in Go; pull requests are welcome.

Tests are run against a real database using docker-compose. To build a docker image and run the tests:

$ make docker-all

To start a development shell:

$ make docker-sh

dbmate's Issues

schema dump includes `AUTO INCREMENT`

Hi,

When I run dbmate migrate, the schema dump includes the current AUTO_INCREMENT value of each table. This is a problem because if I import the schema as-is (without going through all the migrations) onto a server or another computer, the row count will start from where my local table ends.

Thanks

Protection from accidental database dropping

I am a little bit worried that there is no confirmation before dropping a database. A single slip-up in typing can ruin all the data on a production server.

With all respect to the simplicity of the original design, I see the solution as an interactive confirmation that prints the host[:port], user, and database name in the prompt, to make sure the user understands what they are about to do, plus a --force|-f option for use in scripts.

Are you interested in implementing this? Thanks.

Cannot run inside a transaction block

PostgreSQL

-- migrate:up
ALTER TYPE my_enum ADD VALUE 'my_value';

Running this migration results in

Error: pq: ALTER TYPE ... ADD cannot run inside a transaction block

version command

Would be nice to have a dbmate version command that prints the current schema version.

dbmate with postgres - Error: pq: must be owner of extension uuid-ossp

I'm trying to use dbmate to bring up a database on a newly provisioned Postgres instance.
It does not matter whether the database exists or not, or whether the uuid-ossp extension was created in the public schema or not. The DB instance is AWS RDS PostgreSQL 9.5.
dbmate fails to execute the _init.sql script from the migrations folder.
As a result, the database is created, but the initial script is not executed.
The error is:
Error: pq: must be owner of extension uuid-ossp
I'm not sure what pq is in this context, or how to fix it.
Any ideas?

Issue with Postgres Connection String

Hey, I'm trying to run docker-compose run dbmate new test but I'm getting the following error:

Error: parse "postgres://postgres:[email protected]:5432/dbname?sslmode=disable": first path segment in URL cannot contain colon

I'm setting the database URL using the DATABASE_URL env variable and am using the amacneil/dbmate docker image.

I added dbmate as a service to my docker-compose file which looks like this:

  dbmate:
    image: amacneil/dbmate
    depends_on:
      - database
    env_file:
      - docker.env
    volumes:
      - ./db/migrations:/db/migrations

Thanks

How does this tool work with existing schemas?

I have a large existing schema that was manually created.

What are the steps to getting dbmate up to date with this schema and start tracking database migrations through dbmate?

Is there any kind of automation that you can suggest?

url.URL does not support DSN format anymore

Due to some recent changes in the net/url package, a string produced by github.com/go-sql-driver/mysql#Config.FormatDSN can no longer be parsed by net/url#Parse.

The net/url package now parses a URL following RFC 3986 more closely; therefore a MySQL DSN string such as the following one is no longer parsed:

_, err = url.Parse("mysql://boulder@tcp(localhost:3306)/boulder_test?parseTime=true")
if err != nil {
	fmt.Println(err)
}
// parse mysql://boulder@tcp(localhost:3306)/boulder_test?parseTime=true: invalid port ":3306)" after host

The dbmate.New constructor should be kept compatible with the standard MySQL DSN format and not (necessarily) rely on *url.URL.

A work-around is to manually build a string such as:

sqlURL := fmt.Sprintf("mysql://%s:%s@%s/%s", sqlConfig.User, sqlConfig.Passwd, sqlConfig.Addr, sqlConfig.DBName)
u, _ := url.Parse(sqlURL)
dbmate.New(u)

References
https://play.golang.org/p/jUs0pRjjn_Q
golang/go@61bb56a
golang/go#33646

no schema has been selected to create "schema_migrations" in

$ DATABASE_URL="postgres://root@localhost:5432/test12?sslmode=disable" dbmate up
Error: pq: no schema has been selected to create "schema_migrations" in

What does it mean?
I've created migration file with the creation of a couple of tables then I'm trying to migrate and getting this error. Maybe you have any ideas?

MySQL driver doesn't support DELIMITER statement

Hello,

I am trying to migrate stored procedures with dbmate, but it seems like they aren't supported. I am getting the below error while executing the dbmate up command:

sql file content for stored procedure:
DELIMITER //
DROP PROCEDURE IF EXISTS foobar //
create procedure foobar (in var1 int)
begin select var1 + 2 as result;
END //
DELIMITER ;

ERROR:
Applying: 20180907100150_create_store_procedure.sql
Error: Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'DELIMITER //
DROP PROCEDURE IF EXISTS foobar //
create procedure foobar (in var1' at line 1

status command

dbmate status would display the current migration status.

It would also be nice to have an optional flag that would just output the number of pending migrations so that external systems can leverage that to warn when migrations are pending during an application boot process for example.

Publish to NPM

I've been using dbmate with a Node project. It works great, but it would be nice if I could integrate it with the rest of my project's dependencies. While NPM is focused on JavaScript, you can publish any binary to the repository (see https://github.com/sanathkr/go-npm for a Go-specific tool and instructions on doing this).

This would not only make it easier to use with node projects, but for some users would make it easier to install since NPM handles cross platform binaries automatically. You would even get support for npx dbmate, which lets you run a binary without explicitly installing it first. This would also simplify using it on Heroku.

Errors during migration lack line numbers

I just noticed that if I run a dbmate migrate and there is an error in one of the statements, the error being printed out is a bit useless, and doesn't even contain the line number or the actual message from Postgres.

I checked if there are any options for more verbosity, but I don't see any, so at the moment we simply don't get errors printed out?

embed the migration time in the file

It appears that the timestamp prefix of the schema migration files is used to indicate the ordering of migrations. This feature prohibits using completely custom filenames (e.g. copying a file and changing the suffix and/or using more human-readable timestamps). Moreover, if the filenames are incorrect (or there's a tie in the dates) then migrations could be applied in the wrong order. To resolve this issue, the migration times could be embedded in the files; furthermore dbmate new could ensure a new migration occurs after all existing ones. Additionally it could be helpful to have a tool that can verify the order in which migrations get applied.

How to use on Windows?

I have problems with dbmate on Windows, installed using "go get -u github.com/amacneil/dbmate".
"dbmate command not found"

Running dbmate with docker or docker-compose run

Hello,

I want to run dbmate inside the provided docker container, but it's not clear to me how this works in practice. I'm learning docker at the same time here so this may be part of my ignorance.

If I run dbmate using docker run, then it writes the migration file inside the Docker container that was created. But how would I go about accessing that file again? I tried specifying the migrations directory using the --migrations-dir option, but still can't find a way to edit the migration file that was created.

Brew Install Outdated

First of all, love this tool, does exactly what I want and no more. Well done.

Small issue with the brew distribution. I installed the latest, and the version appears to be 1.3.0, but it is missing the wait command.

$ dbmate
NAME:
   dbmate - A lightweight, framework-independent database migration tool.

USAGE:
   dbmate [global options] command [command options] [arguments...]

VERSION:
   1.3.0

COMMANDS:
     new, n          Generate a new migration file
     up              Create database (if necessary) and migrate to the latest version
     create          Create database
     drop            Drop database (if it exists)
     migrate         Migrate to the latest version
     rollback, down  Rollback the most recent migration
     dump            Write the database schema to disk
     help, h         Shows a list of commands or help for one command

GLOBAL OPTIONS:
   --env value, -e value             specify an environment variable containing the database URL (default: "DATABASE_URL")
   --migrations-dir value, -d value  specify the directory containing migration files (default: "./db/migrations")
   --schema-file value, -s value     specify the schema file location (default: "./db/schema.sql")
   --no-dump-schema                  don't update the schema file on migrate/rollback
   --help, -h                        show help
   --version, -v                     print the version

When I install using go get the wait command is included as expected.

$ dbmate
NAME:
   dbmate - A lightweight, framework-independent database migration tool.

USAGE:
   dbmate [global options] command [command options] [arguments...]

VERSION:
   1.3.0

COMMANDS:
     new, n          Generate a new migration file
     up              Create database (if necessary) and migrate to the latest version
     create          Create database
     drop            Drop database (if it exists)
     migrate         Migrate to the latest version
     rollback, down  Rollback the most recent migration
     dump            Write the database schema to disk
     wait            Wait for the database to become available
     help, h         Shows a list of commands or help for one command

GLOBAL OPTIONS:
   --env value, -e value             specify an environment variable containing the database URL (default: "DATABASE_URL")
   --migrations-dir value, -d value  specify the directory containing migration files (default: "./db/migrations")
   --schema-file value, -s value     specify the schema file location (default: "./db/schema.sql")
   --no-dump-schema                  don't update the schema file on migrate/rollback
   --help, -h                        show help
   --version, -v                     print the version

I can get by using go get, but my teammates might prefer the brew install. Thanks again!

Different profiles

Suppose I have some base database structure to which I can apply different changes. What would be a good way to do that?

One way I thought I could do that would look something like this:

  • db/migrations/base.sql
  • db/migrations/first/migration.sql
  • db/migrations/first/base.sql -> ../base.sql
  • db/migrations/second/migrations.sql
  • db/migrations/second/base.sql -> ../base.sql

So using a symlink, and then calling dbmate with one or the other directory.
I think this would work; is there a better way to do something similar?

DB connection strings do not support exclamation points in password

I was getting an "Error 1045: Access denied for user...", given a valid connection string. Checked privileges and made sure everything was right on database side. It seems that mysql is fine with connection strings with exclamation points in them, while net/url mangles them and passes them to mysql as %21, which mysql interprets literally.

Sqitch Comparison

I see you have a comparison table in your readme, but I don't see Sqitch in there, which has been around for quite a while and would be a good yardstick here.

Add specifying the database url as a command line argument

While working on a wrapper to invoke the command from Node.js, I've found that it would be more flexible to specify the database URL via a command line argument rather than having dbmate read it from a .env file. This allows the app and testing environment to provide the database URL after loading it however they want.

Dump the database schema to sql file

Similar to Rails, we should dump the current database schema to a db/schema.sql file after running migrations.

This allows developers interacting with a git repo to better track the current state of a database (e.g. when reviewing pull requests, it's much easier to tell what has changed versus reviewing a single migration without context).

I don't believe this is possible to do with the mysql/postgres libraries, so we may need to shell out to mysqldump/pg_dump if they are available, and either error or fail silently if not available.

Suggested changes:

  • dbmate migrate - should automatically dump the current schema after applying migrations
  • dbmate dump - new command: should dump the current schema

Tampering resistance by storing the migrations' hashes

Hi,

great project, it ticks off all of my requirements for a lightweight migrations tool, except the following:

The table of applied migrations should contain a column for the hash of each migration file.

This prevents applying a chain of migrations if an already applied migration file has different contents than previously seen. See e.g. node-postgres-migrations for an example where this is implemented.

Additionally, migrations should always be applied in order, so a database with the migrations 100_foo and 200_bar should refuse to apply 150_baz.

On MySQL 5.6 with default utf8mb4 encoding, schema_migrations table fails to create

Running dbmate on my MySQL 5.6 server with the DEFAULT CHARSET utf8mb4 gave me this error:

Error: Error 1071: Specified key was too long; max key length is 767 bytes

It's because on MySQL 5.6 InnoDB with utf8mb4 encoding, the max length limit for any key is 191 characters (utf8mb4 chars are 4 bytes each). Digging into the dbmate code, it looks like it's because the schema_migrations table is created as such:

// CreateMigrationsTable creates the schema_migrations table
func (drv MySQLDriver) CreateMigrationsTable(db *sql.DB) error {
	_, err := db.Exec("create table if not exists schema_migrations " +
		"(version varchar(255) primary key)")

	return err
}

255 chars * 4 bytes = 1020 bytes, which will go over the limit. This command will fail even if the schema_migrations table has already been manually created with a lower varchar limit.

To fix this, we could possibly:

  • Detect if the version is 5.6 and limit the varchar to 191
  • Specify latin1 encoding for the schema_migrations table, which only takes 1 byte per character

What do you think? Thanks for making a great tool!

Command line option --no-dump-schema does not work

Executing:

$ bin/dbmate migrate --no-dump-schema

Gives:

Incorrect Usage: flag provided but not defined: -no-dump-schema

NAME:
   dbmate migrate - Migrate to the latest version

USAGE:
   dbmate migrate [arguments...]
Error: flag provided but not defined: -no-dump-schema

Exclusive lock during migration

Hi,

as far as I can tell, there is currently no explicit lock applied during migrations. That would mean that it is dangerous to run migrations in a concurrent setting, e.g. when several instances of some API server including dbmate are running.

It would be advisable to obtain an exclusive lock of the migrations table before migrating.

Support for semantic versioning

Like other database migration tools, dbmate only supports numerical version numbers. I would like to be able to version my migrations semantically, using tags like 1.0.0 for example.

Golang libraries exist that allow you to compare semantic versions, like https://github.com/hashicorp/go-version.

Is there any chance of supporting this style of versioning in future? Is there a database limitation that means these can't be used?

Postgres COPY command not supported

Hi, I have the following migration :

-- migrate:up
COPY clients.business_types (id, label) FROM stdin delimiter ',';
ASS,Association
EARL,Entreprise agricole à responsabilité limitée
\.

-- migrate:down

I can't apply it; it gives me the following error:

postgres_1  | 2018-03-10 14:37:22.868 UTC [837] ERROR:  syntax error at or near "ASS" at character 68
postgres_1  | 2018-03-10 14:37:22.868 UTC [837] STATEMENT:
postgres_1  | 	COPY clients.business_types (id, label) FROM stdin delimiter ',';
postgres_1  | 	ASS,Association
postgres_1  | 	EARL,Entreprise agricole à responsabilité limitée
postgres_1  | 	\.

use unix_sock with Google cloud platforms: Cloud SQL syntax

We have this snippet:

FROM node:lts-slim
RUN curl -fsSL -o /usr/local/bin/dbmate https://github.com/amacneil/dbmate/releases/download/v1.7.0/dbmate-linux-amd64; chmod +x /usr/local/bin/dbmate
WORKDIR /var/www/
COPY . .
ENV DATABASE_URL="postgres://postgres:pass@/bbf_dev?unix_sock=/cloudsql/pg-bbf-instance-dev/.s.PGSQL.5432"
RUN /usr/local/bin/dbmate -e DATABASE_URL migrate

It simply fails at the migrate command:

Step #0: Step 4/10 : WORKDIR /var/www/
Step #0:  ---> Running in ef0ea58fcc5d
Step #0: Removing intermediate container ef0ea58fcc5d
Step #0:  ---> 66c868af67d1
Step #0: Step 5/10 : COPY . .
Step #0:  ---> 2bc0858ba21c
Step #0: Step 6/10 : ENV DATABASE_URL="postgres://postgres:pass@/bbf_dev?unix_sock=/cloudsql/pg-bbf-instance-dev/.s.PGSQL.5432"
Step #0:  ---> Running in 4b06ce1334ac
Step #0: Removing intermediate container 4b06ce1334ac
Step #0:  ---> 12cf38003b1a
Step #0: Step 7/10 : RUN /usr/local/bin/dbmate -e DATABASE_URL migrate
Step #0:  ---> Running in 8429b83962b2
Step #0: �[91mError: dial tcp 127.0.0.1:5432: connect: connection refused
Step #0: The command '/bin/sh -c /usr/local/bin/dbmate -e DATABASE_URL migrate' returned a non-zero code: 1
Finished Step #0
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/docker" failed: exit status 1

It tries the default 127.0.0.1:5432, but neither the .env file nor the ENV DATABASE_URL is being picked up, as we don't have 127.0.0.1 in the .env file either.

Is something broken? Is there another syntax?

I built another example; with export, the var is there, but dbmate is not picking it up.

Step #0: Step 7/11 : RUN export
Step #0:  ---> Running in b2f14d01c305
Step #0: export DATABASE_URL='postgres://postgres:pass@/bbf_dev?unix_sock=/cloudsql/pg-bbf-instance-dev/.s.PGSQL.5432'
Step #0: export HOME='/root'
Step #0: export HOSTNAME='b2f14d01c305'
Step #0: export NODE_VERSION='12.13.1'
Step #0: export PATH='/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
Step #0: export PORT='8080'
Step #0: export PWD='/var/www'
Step #0: export YARN_VERSION='1.19.1'
Step #0: Removing intermediate container b2f14d01c305
Step #0:  ---> 9ed92413d49b
Step #0: Step 8/11 : RUN /usr/local/bin/dbmate -e DATABASE_URL migrate
Step #0:  ---> Running in 1bb033a84122
Step #0: �[91mError: dial tcp 127.0.0.1:5432: connect: connection refused
Step #0: The command '/bin/sh -c /usr/local/bin/dbmate -e DATABASE_URL migrate' returned a non-zero code: 1

The question might be that it's not picking up the unix_sock parameter.

see the GCP guide for unix sockets here: https://cloud.google.com/sql/docs/postgres/connect-run

Inconsistencies with migrate and rollback.

Scenario.

2 developers working on different branches of code.

Developer 1 creates a new migration 20180124192200_featureA.sql

Developer 2 creates a new migration 20180124203000_featureB.sql

Developer 2's code is merged and put into production and dbmate is used to apply 20180124203000_FeatureB.sql.

Developer 1's code is merged and put into production and dbmate is used to apply 20180124192200_FeatureA.sql

I want to rollback FeatureA so I run:

$ dbmate rollback

but instead FeatureB is rolled back because it has a higher timestamp.

  1. I want dbmate to apply any missing schema migrations. I don't care about the timestamp and like the way 'migrate' works.
  2. I want dbmate to rollback the most recently applied migration, not the migration with the highest timestamp.

How to fix this?

Add a timestamp to each schema_migrations row and roll back the most recently applied migration that way, rather than doing a descending sort on the migration version.

MySQL 8 Support?

It seems like MySQL 8 databases aren't supported due to the new authentication method. I get the following error when running dbmate up using the brew OSX package:

Error: this authentication plugin is not supported

execute all migrations in one transaction

Currently dbmate executes each migration in a new transaction. The potential issue with this is that it can leave the database in an incompatible state with either version of the application, which is bad. Potential causes of this sort of failure can be difficult to detect in testing -- db server crash, oom, excessive temp files, tx id wraparound, etc.

I think we should change dbmate to execute all pending migrations in a single transaction. Thoughts?

dbmate does not support DOS/Windows file line endings

dbmate only supports files ending with '\n' (Unix line endings), not the usual Windows/DOS format, which is '\r\n'.

Trying to perform a dbmate up command with migration files that have Windows/DOS line endings will result in an error, but it will work if the file is converted back to Unix line endings.

The error produced when trying to run on migration files with DOS file endings:

Applying: 20180605202455_query.sql
Error: Error 1065: Query was empty

"Deprecated" error when running

To reproduce:

$ go get -u github.com/amacneil/dbmate
$ dbmate new stuff
DEPRECATED Action signature.  Must be `cli.ActionFunc`.  This is an error in the application.  Please contact the distributor of this application if this is not you.  See https://github.com/urfave/cli/blob/master/CHANGELOG.md#deprecated-cli-app-action-signature

Seems to still work fine.

Log output from migrations

Is there a way to enable logging? I'd like to get an output of the migration.

For example with the following, I'd like to look in a log and see exactly how many rows were inserted. It might not be what I expected.

START TRANSACTION;

INSERT INTO db.tbl
  SELECT col1, col2, col3 FROM db.tbl2
  WHERE col2 IN ( val1, val2, ... );

SELECT ROW_COUNT();

COMMIT;

Build fails with Go 1.13.4

$ go version
go version go1.13.4 darwin/amd64
$ mkdir go
$ export GOPATH=$PWD/go
$ go get -u github.com/amacneil/dbmate
# github.com/amacneil/dbmate
go/src/github.com/amacneil/dbmate/main.go:33:17: cannot use cli.StringFlag literal (type cli.StringFlag) as type cli.Flag in array or slice literal:
	cli.StringFlag does not implement cli.Flag (Apply method has pointer receiver)
go/src/github.com/amacneil/dbmate/main.go:38:17: cannot use cli.StringFlag literal (type cli.StringFlag) as type cli.Flag in array or slice literal:
	cli.StringFlag does not implement cli.Flag (Apply method has pointer receiver)
go/src/github.com/amacneil/dbmate/main.go:43:17: cannot use cli.StringFlag literal (type cli.StringFlag) as type cli.Flag in array or slice literal:
	cli.StringFlag does not implement cli.Flag (Apply method has pointer receiver)
go/src/github.com/amacneil/dbmate/main.go:48:15: cannot use cli.BoolFlag literal (type cli.BoolFlag) as type cli.Flag in array or slice literal:
	cli.BoolFlag does not implement cli.Flag (Apply method has pointer receiver)
go/src/github.com/amacneil/dbmate/main.go:54:15: cannot use []cli.Command literal (type []cli.Command) as type []*cli.Command in assignment
go/src/github.com/amacneil/dbmate/main.go:138:25: c.GlobalBool undefined (type *cli.Context has no field or method GlobalBool)
go/src/github.com/amacneil/dbmate/main.go:139:23: c.GlobalString undefined (type *cli.Context has no field or method GlobalString)
go/src/github.com/amacneil/dbmate/main.go:140:20: c.GlobalString undefined (type *cli.Context has no field or method GlobalString)
go/src/github.com/amacneil/dbmate/main.go:148:10: c.GlobalString undefined (type *cli.Context has no field or method GlobalString)

Is there a command to do a dry-run?

Wondering if there is a command to check whether a particular target DB is consistent with the current migrations, without making destructive changes like dbmate up does.

Skips some migrations

  • I have more than 5 migration files. There have been issues where dbmate doesn't apply the schema change to the database, even though it prints that migration's name in the console.
