jtdaugherty / dbmigrations
A library for the creation, management, and installation of schema updates for relational databases.
License: Other
HDBC 2.4 is now out, but dbmigrations explicitly requires < 2.4.
Since dbmigrations 2.1.0, migrations with a . in their name, such as 2018-11-13_20-18-16_v0.1.0.txt, are truncated at the last non-extension . and interpreted as 2018-11-13_20-18-16_v0.1.
As a test, renaming the file to 2018-11-13_20-18-16_v0.1.txt results in an interpreted migration name of 2018-11-13_20-18-16_v0. Building dbmigrations-postgresql with a constraint of dbmigrations==2.0.0 confirmed that pinning the older version resolves the issue.
Obviously, this breaks all existing migration stores where a . was used in migration names.
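The reported behavior is consistent with the extension being stripped twice — for instance a takeBaseName applied to a name whose extension was already removed. A minimal sketch that reproduces the observed mangling with System.FilePath; whether dbmigrations' store code actually does this is an assumption:

```haskell
-- Sketch: applying extension-stripping twice reproduces the reported
-- truncation. This is a guess at the cause, not the actual dbmigrations code.
import System.FilePath (dropExtension, takeBaseName)

main :: IO ()
main = do
  -- Intended name "2018-11-13_20-18-16_v0.1.0"; the ".0" is lost.
  putStrLn (dropExtension (takeBaseName "2018-11-13_20-18-16_v0.1.0.txt"))
  -- Renaming only shifts the problem: ".1" is now treated as the extension.
  putStrLn (dropExtension (takeBaseName "2018-11-13_20-18-16_v0.1.txt"))
```

A single takeBaseName would preserve the dotted name intact, which is why this looks like a double strip rather than ordinary extension handling.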
It appears there are some modules missing from the release tarball: http://hydra.cryp.to/build/574168/nixlog/2/raw.
Sometimes migrations are created for an existing database, or are migrated from another migration system.
In these cases it is necessary to be able to fake the migrations, so that only the migration state data are written but no migration is actually executed, to get the database and the migration state information into a consistent state.
How can this be done with dbmigrations?
I read the README and MOO.TXT, but I still don't know how to use dbmigrations programmatically in my program.
moo is fine for development, but I want to use dbmigrations directly in production code.
Could you add an example of an automatic database upgrade for production code to the README? The production code's UI should just prompt the user for a database upgrade. The production code should also create a SQLite file and initialize the database file completely.
Is it just a fun name, an acronym, a backronym? Where did it come from? The suspense is killing me. It's been 12 years; the people have waited long enough.
I'm trying to use dbmigrations with SQLite. It should not require PostgreSQL to be installed on my system.
Or am I missing something?
So I don't forget: @ocharles agreed to update TestDriver.hs to also test the MySQL backend.
This isn't a big deal, but it's a significant annoyance (also, I submit this report fully aware of the relevance of XKCD #1172 ;).
The addition of MySQL support imposed the mysql-0.1.1.8 dependency, which requires mysql_config (a binary provided by MySQL). Thus, it is no longer possible to build dbmigrations without installing MySQL.
It would be great if this could be enabled or disabled with a flag, so that we can avoid installing an unused database system on the existing postgresql-focused build machines. This may become a significantly more problematic issue for anyone building with shared libraries (since those libs won't be available on a production system).
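A package flag would do it. A sketch of what the .cabal stanza might look like — the flag name, the module name, and the layout here are hypothetical, not taken from the actual dbmigrations.cabal:

```
flag mysql
  description: Build the MySQL backend (requires mysql_config at build time)
  default:     True
  manual:      True

library
  ...
  if flag(mysql)
    build-depends:   mysql >= 0.1.1.8
    -- hypothetical module name for the MySQL backend:
    exposed-modules: Database.Schema.Migrations.Backend.MySQL
    cpp-options:     -DWITH_MYSQL
```

Users on PostgreSQL-only build machines could then disable the backend with `cabal install dbmigrations -f-mysql`, avoiding the mysql_config requirement entirely.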
What would it take to make dbmigrations support v2 of the aeson library?
I would like to add functional tests around the command line interface using shelltestrunner. Do you have any concerns regarding this?
Test #1: moo run without a command should print usage.
moo
<<<
>>>
Usage: moo <command> [args]
Environment:
DBM_DATABASE: database connection string
DBM_DATABASE_TYPE: database type, one of postgresql,sqlite3
DBM_MIGRATION_STORE: path to migration store
Commands:
new [--no-ask] [--config-file] <migrationName>
Create a new empty migration
apply [--test] [--config-file] <migrationName>
Apply the specified migration and its dependencies
revert [--test] [--config-file] <migrationName>
Revert the specified migration and those that depend on it
test [--config-file] <migrationName>
Test the specified migration by applying ...
upgrade [--test] [--config-file]
Install all migrations that have not yet been installed
upgrade-list
Show the list of migrations not yet installed
reinstall [--test] [--config-file] <migrationName>
Reinstall a migration by reverting, then reapplying it
list [--config-file]
List migrations already installed in the backend
Options:
-c FILE --config-file=FILE Specify location of configuration file
-t --test Perform the action then rollback when finished
-n --no-ask Do not interactively ask any questions, just do it
>>>=1
A run:
:functional/moo.test: [OK]
Test Cases Total
Passed 1 1
Failed 0 0
Total 1 1
hspec has Expectations, which are just IO (). I use this to wire the shelltestrunner test results into cabal. I will use shelly to set up a fake environment (without having to mess with the real environment) for when I want to test environment configuration.
{-# LANGUAGE OverloadedStrings #-}
module FunctionalTestExample where

import Shelly
import Test.Hspec

main :: IO ()
main = hspec runMooWithNoCommand

runMooWithNoCommand :: Spec
runMooWithNoCommand =
  describe "moo" $ do
    it "prints out a usage statement" noCommand

noCommand :: IO ()
noCommand = shelly $ do
  whenMooIsRunWithNoOptions
  thenItShouldSendAUsageStatementToStandardOut

whenMooIsRunWithNoOptions :: Sh ()
whenMooIsRunWithNoOptions = do
    let testRunnerFileForMooWithNoCommand = "moo_shell_test.txt"
    run_ pathToShellTestRunner [testRunnerFileForMooWithNoCommand]
  where
    pathToShellTestRunner = fromText "cabal-dev/bin/shelltest"

thenItShouldSendAUsageStatementToStandardOut :: Sh ()
thenItShouldSendAUsageStatementToStandardOut = do
    exitCode <- lastExitCode
    liftIO $ exitCode `shouldBe` success
  where
    success = 0
Aside from Persistent's primitive migrations, this library comes the closest I've seen to my dream of building my application so that it is totally capable of doing migrations on its own. In other words, I don't want to deploy moo to my endpoints; I want my web application to do migrations by itself.
I'd be interested in making this functionality more "first class" if you think it's worth it. At first glance, it seems that providing some TH functions to embed the preparsed YAML data could be sufficient.
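For the migration data itself, the file-embed package's embedDir splice already gets most of the way there; a hypothetical sketch, assuming a local migrations/ directory at build time (feeding these pairs into dbmigrations would still need an in-memory MigrationStore backend, which is exactly the "first class" support discussed above):

```haskell
{-# LANGUAGE TemplateHaskell #-}
-- Sketch: embed all migration files into the binary at compile time
-- using the file-embed package. The "migrations" path is an assumption.
module EmbeddedMigrations where

import Data.ByteString (ByteString)
import Data.FileEmbed (embedDir)

-- (relative path, file contents) for every file under migrations/,
-- resolved when this module is compiled.
embeddedMigrations :: [(FilePath, ByteString)]
embeddedMigrations = $(embedDir "migrations")
```

The server binary then carries its migrations with it, so a deploy never depends on a store directory being present on the host.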
It's not easy to tell which version of dbmigrations you're using, which makes reporting bugs more difficult.
Supporting the standard version command line flags would be pretty nice, and given the nature of the tool, it would be great to also support --numeric-version.
Expected behavior:
# This first one could be more verbose:
$ moo --version
dbmigrations 0.9
# This should report *exactly* the version number, and nothing else, as it is intended to be parsed by the cabal version number parser; and as such, can more easily be added as a build-tool dependency.
$ moo --numeric-version
0.9
See ghc --version and ghc --numeric-version for more examples.
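Cabal already generates the needed datum: the Paths_dbmigrations module it produces exports `version :: Version`, and Data.Version.showVersion renders it in exactly the numeric form. A sketch of the two outputs — a literal version stands in for the generated module so the snippet is self-contained:

```haskell
import Data.Version (Version, makeVersion, showVersion)

-- Stand-in for the cabal-generated Paths_dbmigrations.version.
version :: Version
version = makeVersion [0, 9]

main :: IO ()
main = do
  putStrLn ("dbmigrations " ++ showVersion version)  -- for --version
  putStrLn (showVersion version)                     -- for --numeric-version
```

Wiring these into moo's existing option parsing is left out here, since that part of the code isn't shown in this thread.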
Why are the migration files named .txt and not .yaml? Seems like a strange choice for YAML files. Is the .txt meaningful? I can work around this with some editor configuration to treat these files as YAML, but it'd be nice if I didn't have to do that.
Detect migration dependencies automatically?
This tool obviously needs some kind of configuration; that much can be seen from the help output of moo, which mentions environment variables and a config file.
However, it is still unclear whether there is a default location the config file is read from, what its format is, and what options are available. Generally, what is the expected setup and workflow? A getting-started tutorial is needed.
I've sort of touched on this idea before, but now I'm quite certain this would be nice. Instead of having to run a separate program in order to run my migrations, I'd like to capture all my migrations directly in source code and then run them within my server process. I'm imagining (though I might be wrong) that this would make a zero downtime deploy much easier. Perhaps I'm wrong, in which case I'd love to hear how you do it.
We have all our tables in a custom schema. I created the initial migration with pg_dump and then performed a little cleanup. The script begins with:
CREATE SCHEMA uniplex;
SET search_path = uniplex, pg_catalog;
However, moo upgrade begins by creating installed_migrations in the public schema. This means that moo upgrade crashes, since the inserts aren't qualified with a schema (so the final INSERT INTO installed_migrations tries to INSERT into uniplex.installed_migrations).
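Until moo can target a specific schema, one workaround is to restore the search path before the migration's Apply block ends, so that moo's own unqualified bookkeeping INSERT resolves to public again. A sketch of such a migration; this is a suggested workaround, not a documented dbmigrations feature:

```
Description: initial schema (search_path workaround sketch)
Created: 2018-01-01 00:00:00
Depends:
Apply: |
  CREATE SCHEMA uniplex;
  SET search_path = uniplex, pg_catalog;
  -- ... schema objects ...
  -- Reset before the migration ends, so moo's unqualified
  -- INSERT INTO installed_migrations lands in public again:
  SET search_path = public;
```

A cleaner long-term fix would be for moo to schema-qualify its own statements, or to take the bookkeeping schema as configuration.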
I keep shooting myself in the foot by, for example, commenting out some SQL in a way that makes parsing the YAML file fail. When I make a mistake like this and run the migrations, I'm just told that the database is up to date. It would be good to have some descriptive output about YAML files failing to parse. If you agree, I'm happy to send you a pull request to fix it.
When I run the following migration with the command dotenv 'moo-postgresql upgrade':
Description: Sample Migration
Created: 2024-01-30 14:21:56.815803195 UTC
Depends:
Apply: |
SELECT * FROM table;
I have the following error:
There were errors in the migration store: There was an error loading a migration: Could not parse migration migrations/foo.txt: Error in "migrations/foo.txt": unrecognized field found
The workaround I found was to remove the fractional seconds (.815803195) and remove 'UTC' from the Created field, like this:
Description: Sample Migration
Created: 2024-01-30 14:21:56
Depends: create_new_table
Apply: |
SELECT * FROM table;
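The accepted form matches the time library's `%Y-%m-%d %H:%M:%S` format, with no fractional seconds and no zone suffix. A sketch of producing a compatible Created value with Data.Time — that the parser is exactly this strict is inferred from the workaround above, not confirmed against the dbmigrations source:

```haskell
import Data.Time (UTCTime, defaultTimeLocale, formatTime, getCurrentTime)

-- Render a timestamp in the shape the migration parser accepted:
-- whole seconds only, no fractional part, no "UTC" suffix.
createdField :: UTCTime -> String
createdField = formatTime defaultTimeLocale "%Y-%m-%d %H:%M:%S"

main :: IO ()
main = do
  now <- getCurrentTime
  putStrLn ("Created: " ++ createdField now)
```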
As a user I would like to be able to specify whether the database type, store path and database connection string get read from a configuration file or from the environment.
Implementing this brings up a few behavioral issues:
DBM_DATABASE_TYPE: "postgresql"
DBM_DATABASE: "a=beautiful; connection=string"
DBM_MIGRATION_STORE: "Path to a migration store"
A simple implementation using Data.Yaml, in applicative style. (Data.Yaml is based on Aeson, and the names haven't changed yet.)
module Moo.ConfigParser where
import Data.Yaml
import Moo.Core
parseConfigOptions :: FilePath -> IO ConfigData
parseConfigOptions configFilePath = do
  configContents <- decodeFile configFilePath
  case configContents of
    Nothing -> error "Failed Parsing Yaml"
    Just a  -> return a
module Moo.Core where

import Control.Applicative ( (<$>), (<*>) )
import Control.Monad ( mzero )
import Data.Yaml ( FromJSON( parseJSON ), Value ( Object ), (.:) )
...
-- |ConfigData holds the options read from the configuration file
-- or from the environment.
data ConfigData = ConfigData { _dbTypeStr     :: String
                             , _dbConnStr     :: String
                             , _fileStorePath :: String
                             }

instance FromJSON ConfigData where
    parseJSON (Object v) = ConfigData <$>
                           v .: "DBM_DATABASE_TYPE" <*>
                           v .: "DBM_DATABASE" <*>
                           v .: "DBM_MIGRATION_STORE"
    parseJSON _ = mzero
module Moo.CommandInterface where
...
commandOptions :: [ OptDescr (CommandOptions -> IO CommandOptions) ]
commandOptions = [ optionConfigFile
, optionTest
, optionNoAsk
]
optionConfigFile :: OptDescr (CommandOptions -> IO CommandOptions)
optionConfigFile = Option "c" ["config-file"]
(ReqArg (\arg opt ->
return opt { _configFilePath = Just arg }) "FILE")
"Specify location of configuration file"
The rest would depend on the answers to the questions above I suppose.
At least according to the Rails framework, it is not a good idea to depend on running every migration to completely rebuild a database's schema from scratch. Would it be a good idea to have an option that enables commands to dump the database schema (and the installed_migrations relation) to a file after every migration? I would also implement a moo schema-load command to go along with it.
It appears that there's a slight difference between the cabal file in the 0.6 package and the one that is in the git repository under the 0.6 tag. Would you mind bumping the version number and uploading another package?
Also, the main branch is still at 0.5 and there are no other branches in the repository. It would be nice if the 0.6 tag was merged to master and master represented the current release.
The following migration fails with a parse error:
Description: Adds two tables, but has syntax errors.
Created: 2014-09-15 15:59:52.862597 UTC
Depends: 001-initial-schema
Apply:
CREATE TABLE IF NOT EXISTS table1
(
-- comments work when they are on their own line.
id SERIAL PRIMARY KEY NOT NULL,
username TEXT NOT NULL, -- the author
date TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(),
);
CREATE TABLE IF NOT EXISTS table2
(
id SERIAL PRIMARY KEY NOT NULL,
data TEXT NOT NULL
);
Revert:
drop table table2
drop table table1
Example:
$ moo test testmigration1
applying: testmigration1...
A database error occurred: execute: PGRES_FATAL_ERROR: ERROR: syntax error at or near "CREATE"
LINE 2: CREATE TABLE IF NOT EXISTS table2 ( id SERIAL PRIMARY KEY NO...
However, this migration isn't recognized as being a migration:
Description: Adds two tables, but has syntax errors.
Created: 2014-09-15 15:59:52.862597 UTC
Depends: 001-initial-schema
Apply:
CREATE TABLE IF NOT EXISTS table1
(
-- comments work when they are on their own line.
id SERIAL PRIMARY KEY NOT NULL,
username TEXT NOT NULL, -- the author
date TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(),
);
-- test comment:
CREATE TABLE IF NOT EXISTS table2
(
id SERIAL PRIMARY KEY NOT NULL,
data TEXT NOT NULL
);
Revert:
drop table table2
drop table table1
Example:
$ moo test testmigration2
applying: testmigration2...
No such migration: testmigration2
If you remove the ':' from the second comment in testmigration2, then a different syntax error is generated.
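Both failures point at plain-scalar YAML parsing: without a block scalar, a line such as `-- test comment:` is read as a new mapping key, so the document no longer parses as a migration at all. Writing Apply and Revert as literal block scalars (`|`) sidesteps this. A sketch of the same migration in that form; it is an assumption that this is how the parser treats these files, and the trailing comma after `now(),` (a SQL error in its own right) is removed here:

```
Description: Adds two tables.
Created: 2014-09-15 15:59:52.862597 UTC
Depends: 001-initial-schema
Apply: |
  CREATE TABLE IF NOT EXISTS table1
  (
    -- comments work when they are on their own line.
    id SERIAL PRIMARY KEY NOT NULL,
    username TEXT NOT NULL, -- the author
    date TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now()
  );
  -- test comment:
  CREATE TABLE IF NOT EXISTS table2
  (
    id SERIAL PRIMARY KEY NOT NULL,
    data TEXT NOT NULL
  );
Revert: |
  drop table table2;
  drop table table1;
```

With block scalars, colons and comments inside the SQL are inert text as far as YAML is concerned, so only genuine SQL errors should remain.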
For some projects I would like all migrations to be linear in that they always and only depend on the last migration. I have time to spend implementing this option. I would like to know if you have any thoughts on this.