dbt-labs / dbt-redshift

dbt-redshift contains all of the code enabling dbt to work with Amazon Redshift

Home Page: https://getdbt.com

License: Apache License 2.0

Python 97.24% Shell 0.42% Makefile 1.19% Dockerfile 1.15%

dbt-redshift's Introduction


dbt enables data analysts and engineers to transform their data using the same practices that software engineers use to build applications.

dbt is the T in ELT. Organize, cleanse, denormalize, filter, rename, and pre-aggregate the raw data in your warehouse so that it's ready for analysis.

dbt-redshift

The dbt-redshift package contains all of the code enabling dbt to work with Amazon Redshift. For more information on using dbt with Redshift, consult the docs.

Getting started

Join the dbt Community

Reporting bugs and contributing code

Code of Conduct

Everyone interacting in the dbt project's codebases, issue trackers, chat rooms, and mailing lists is expected to follow the dbt Code of Conduct.

dbt-redshift's People

Contributors

beckjake, brangisom, brunomurino, buremba, clrcrl, cmcarthur, colin-rogers-dbt, dataders, davidbloss, dbeatty10, dependabot[bot], drewbanin, emmyoop, fishtownbuildbot, github-actions[bot], gshank, iknox-fa, jiezhen-chen, jtcohen6, kconvey, leahwicz, mcknight-42, michelleark, mikealfare, nathaniel-may, nssalian, peterallenwebb, tjengel, tyang209, versusfacit


dbt-redshift's Issues

Repo housekeeping

  • Issue + PR templates
  • Changelog (start with v1.0.0 - Release TBD)
  • CLA bot
  • License (Apache 2.0)
  • Repo metadata (description, URL)
  • Review README copy
  • Review setup.py (i.e. make sure it points to the new repo URL)
  • Rename default branch to develop
  • Branch protection for develop, *.latest

Then do the same in dbt-snowflake + dbt-bigquery, and dbt-spark where appropriate

[CT-837] Support IAM authentication for Redshift Serverless

Describe the feature

Redshift Serverless is generally available now. It supports connecting via IAM. dbt-redshift supports IAM for Redshift clusters, but not for serverless workgroups.
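A connection layer supporting both could branch on the endpoint type before deciding which AWS API to call for temporary credentials: provisioned clusters use the `redshift` API's GetClusterCredentials, while serverless workgroups use the `redshift-serverless` API's GetCredentials. A minimal sketch in Python, assuming the endpoint hostname is available from the profile; the hostname heuristic and function names are illustrative, not dbt-redshift's actual implementation:

```python
def is_serverless_host(host: str) -> bool:
    """Heuristic: serverless workgroup endpoints contain 'redshift-serverless'
    in the hostname, e.g.
    '<workgroup>.<account>.<region>.redshift-serverless.amazonaws.com'."""
    return "redshift-serverless" in host


def iam_credentials_api(host: str) -> str:
    """Return which AWS API a client would call to mint temporary credentials.

    Provisioned clusters: redshift:GetClusterCredentials
    Serverless workgroups: redshift-serverless:GetCredentials
    """
    if is_serverless_host(host):
        return "redshift-serverless:GetCredentials"
    return "redshift:GetClusterCredentials"
```

In practice the serverless call takes a workgroup name rather than a cluster identifier, which is why the existing cluster-only IAM path cannot simply be reused.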

Describe alternatives you've considered

At the moment, I can use database user and password to connect to Redshift Serverless

Additional context

Redshift Serverless uses a different API than Redshift.

Who will this benefit?

Anyone using Redshift Serverless and IAM.

Are you interested in contributing this feature?

I have the PR ready.

SSL error: unsupported method

Describe the bug

When running dbt test we occasionally get an 'SSL error: unsupported method' error.

dbt is run from Airflow on an EC2 instance. When the error happens, restarting the process succeeds; as far as we are aware, the error has never recurred on a retry.

The error happens maybe once or twice a week across processes that run once daily.

It only ever happens with dbt test; we have never seen it happen with dbt run.

Steps To Reproduce

Have not been able to reproduce the error manually.

Expected behavior

To run without an SSL error

Screenshots and log output

Here's an extract of the log around the error, with the debug flag on:

[2021-08-17 06:23:09,311] {bash_operator.py:157} INFO - 2021-08-17 06:23:09.311283 (MainThread): Acquiring new redshift connection "master".
[2021-08-17 06:23:09,311] {bash_operator.py:157} INFO - 2021-08-17 06:23:09.311414 (MainThread): Opening a new connection, currently in state init
[2021-08-17 06:23:21,250] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.248250 (ThreadPoolExecutor-1_0): Acquiring new redshift connection "list_staging_dbt_staging".
[2021-08-17 06:23:21,250] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.248727 (ThreadPoolExecutor-1_1): Acquiring new redshift connection "list_staging_dbt_dims".
[2021-08-17 06:23:21,250] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.250831 (ThreadPoolExecutor-1_1): Opening a new connection, currently in state init
[2021-08-17 06:23:21,257] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.249071 (ThreadPoolExecutor-1_2): Acquiring new redshift connection "list_staging_dbt_seed_data".
[2021-08-17 06:23:21,264] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.249919 (ThreadPoolExecutor-1_4): Acquiring new redshift connection "list_staging_dbt".
[2021-08-17 06:23:21,264] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.262399 (ThreadPoolExecutor-1_4): Opening a new connection, currently in state init
[2021-08-17 06:23:21,278] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.250255 (ThreadPoolExecutor-1_5): Acquiring new redshift connection "list_staging_dbt_marts".
[2021-08-17 06:23:21,290] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.249580 (ThreadPoolExecutor-1_3): Acquiring new redshift connection "list_staging_dbt_facts".
[2021-08-17 06:23:21,301] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.262215 (ThreadPoolExecutor-1_2): Opening a new connection, currently in state init
[2021-08-17 06:23:21,304] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.250694 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state init
[2021-08-17 06:23:21,306] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.290397 (ThreadPoolExecutor-1_5): Opening a new connection, currently in state init
[2021-08-17 06:23:21,308] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.290957 (ThreadPoolExecutor-1_1): Using redshift connection "list_staging_dbt_dims".
[2021-08-17 06:23:21,308] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.301143 (ThreadPoolExecutor-1_3): Opening a new connection, currently in state init
[2021-08-17 06:23:21,311] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.301585 (ThreadPoolExecutor-1_4): Using redshift connection "list_staging_dbt".
[2021-08-17 06:23:21,311] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.303918 (ThreadPoolExecutor-1_2): Using redshift connection "list_staging_dbt_seed_data".
[2021-08-17 06:23:21,312] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.306141 (ThreadPoolExecutor-1_0): Using redshift connection "list_staging_dbt_staging".
[2021-08-17 06:23:21,312] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.308204 (ThreadPoolExecutor-1_5): Using redshift connection "list_staging_dbt_marts".
[2021-08-17 06:23:21,312] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.308377 (ThreadPoolExecutor-1_1): On list_staging_dbt_dims: BEGIN
[2021-08-17 06:23:21,312] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.311538 (ThreadPoolExecutor-1_3): Using redshift connection "list_staging_dbt_facts".
[2021-08-17 06:23:21,312] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.311732 (ThreadPoolExecutor-1_4): On list_staging_dbt: BEGIN
[2021-08-17 06:23:21,312] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.311918 (ThreadPoolExecutor-1_2): On list_staging_dbt_seed_data: BEGIN
[2021-08-17 06:23:21,313] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.312094 (ThreadPoolExecutor-1_0): On list_staging_dbt_staging: BEGIN
[2021-08-17 06:23:21,313] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.312258 (ThreadPoolExecutor-1_5): On list_staging_dbt_marts: BEGIN
[2021-08-17 06:23:21,313] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.312510 (ThreadPoolExecutor-1_1): Connecting to Redshift using 'database' credentials
[2021-08-17 06:23:21,313] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.312705 (ThreadPoolExecutor-1_3): On list_staging_dbt_facts: BEGIN
[2021-08-17 06:23:21,313] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.312846 (ThreadPoolExecutor-1_4): Connecting to Redshift using 'database' credentials
[2021-08-17 06:23:21,314] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.312980 (ThreadPoolExecutor-1_2): Connecting to Redshift using 'database' credentials
[2021-08-17 06:23:21,314] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.313141 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'database' credentials
[2021-08-17 06:23:21,314] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.313250 (ThreadPoolExecutor-1_5): Connecting to Redshift using 'database' credentials
[2021-08-17 06:23:21,314] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.313763 (ThreadPoolExecutor-1_3): Connecting to Redshift using 'database' credentials
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334000 (ThreadPoolExecutor-1_3): Got an error when attempting to open a postgres connection: 'SSL error: unsupported method
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - '
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334140 (ThreadPoolExecutor-1_1): Got an error when attempting to open a postgres connection: 'SSL error: unsupported method
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - '
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334312 (ThreadPoolExecutor-1_3): Error running SQL: BEGIN
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334425 (ThreadPoolExecutor-1_1): Error running SQL: BEGIN
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334529 (ThreadPoolExecutor-1_3): Rolling back transaction.
[2021-08-17 06:23:21,334] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334656 (ThreadPoolExecutor-1_1): Rolling back transaction.
[2021-08-17 06:23:21,335] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334804 (ThreadPoolExecutor-1_3): On list_staging_dbt_facts: No close available on handle
[2021-08-17 06:23:21,335] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.334939 (ThreadPoolExecutor-1_1): On list_staging_dbt_dims: No close available on handle
[2021-08-17 06:23:21,335] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.335141 (ThreadPoolExecutor-1_3): Error running SQL: macro list_relations_without_caching
[2021-08-17 06:23:21,335] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.335333 (ThreadPoolExecutor-1_1): Error running SQL: macro list_relations_without_caching
[2021-08-17 06:23:21,335] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.335459 (ThreadPoolExecutor-1_3): Rolling back transaction.
[2021-08-17 06:23:21,336] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.335574 (ThreadPoolExecutor-1_1): Rolling back transaction.
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.377842 (ThreadPoolExecutor-1_0): SQL status: BEGIN in 0.06 seconds
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.378061 (ThreadPoolExecutor-1_0): Using redshift connection "list_staging_dbt_staging".
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.378143 (ThreadPoolExecutor-1_0): On list_staging_dbt_staging: /* {"app": "dbt", "dbt_version": "0.16.1", "profile_name": "carnext_dwh", "target_name": "dev", "connection_name": "list_staging_dbt_staging"} */
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - tablename as name,
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - 'table' as type
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - from pg_tables
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - where schemaname ilike 'dbt_staging'
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - union all
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - viewname as name,
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - 'view' as type
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - from pg_views
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO - where schemaname ilike 'dbt_staging'
[2021-08-17 06:23:21,378] {bash_operator.py:157} INFO -
[2021-08-17 06:23:21,405] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.405686 (ThreadPoolExecutor-1_2): SQL status: BEGIN in 0.09 seconds
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.405881 (ThreadPoolExecutor-1_2): Using redshift connection "list_staging_dbt_seed_data".
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.405973 (ThreadPoolExecutor-1_2): On list_staging_dbt_seed_data: /* {"app": "dbt", "dbt_version": "0.16.1", "profile_name": "carnext_dwh", "target_name": "dev", "connection_name": "list_staging_dbt_seed_data"} */
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - tablename as name,
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - 'table' as type
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - from pg_tables
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - where schemaname ilike 'dbt_seed_data'
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - union all
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - viewname as name,
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - 'view' as type
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - from pg_views
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO - where schemaname ilike 'dbt_seed_data'
[2021-08-17 06:23:21,406] {bash_operator.py:157} INFO -
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.410117 (ThreadPoolExecutor-1_5): SQL status: BEGIN in 0.10 seconds
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.410350 (ThreadPoolExecutor-1_5): Using redshift connection "list_staging_dbt_marts".
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.410491 (ThreadPoolExecutor-1_5): On list_staging_dbt_marts: /* {"app": "dbt", "dbt_version": "0.16.1", "profile_name": "carnext_dwh", "target_name": "dev", "connection_name": "list_staging_dbt_marts"} */
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - tablename as name,
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - 'table' as type
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - from pg_tables
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - where schemaname ilike 'dbt_marts'
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - union all
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - viewname as name,
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - 'view' as type
[2021-08-17 06:23:21,410] {bash_operator.py:157} INFO - from pg_views
[2021-08-17 06:23:21,411] {bash_operator.py:157} INFO - where schemaname ilike 'dbt_marts'
[2021-08-17 06:23:21,411] {bash_operator.py:157} INFO -
[2021-08-17 06:23:21,421] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.421338 (ThreadPoolExecutor-1_0): SQL status: SELECT in 0.04 seconds
[2021-08-17 06:23:21,477] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.462472 (ThreadPoolExecutor-1_5): SQL status: SELECT in 0.05 seconds
[2021-08-17 06:23:21,493] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.467758 (ThreadPoolExecutor-1_2): SQL status: SELECT in 0.06 seconds
[2021-08-17 06:23:21,618] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.603308 (ThreadPoolExecutor-1_5): On list_staging_dbt_marts: ROLLBACK
[2021-08-17 06:23:21,658] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.652783 (ThreadPoolExecutor-1_2): On list_staging_dbt_seed_data: ROLLBACK
[2021-08-17 06:23:21,760] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.759903 (ThreadPoolExecutor-1_0): On list_staging_dbt_staging: ROLLBACK
[2021-08-17 06:23:21,850] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.850589 (ThreadPoolExecutor-1_4): SQL status: BEGIN in 0.54 seconds
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.850805 (ThreadPoolExecutor-1_4): Using redshift connection "list_staging_dbt".
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.850886 (ThreadPoolExecutor-1_4): On list_staging_dbt: /* {"app": "dbt", "dbt_version": "0.16.1", "profile_name": "carnext_dwh", "target_name": "dev", "connection_name": "list_staging_dbt"} */
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - tablename as name,
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - 'table' as type
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - from pg_tables
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - where schemaname ilike 'dbt'
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - union all
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - select
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - 'staging' as database,
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - viewname as name,
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - schemaname as schema,
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - 'view' as type
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - from pg_views
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO - where schemaname ilike 'dbt'
[2021-08-17 06:23:21,851] {bash_operator.py:157} INFO -
[2021-08-17 06:23:21,880] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.879900 (ThreadPoolExecutor-1_4): SQL status: SELECT in 0.03 seconds
[2021-08-17 06:23:21,885] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.885352 (ThreadPoolExecutor-1_4): On list_staging_dbt: ROLLBACK
[2021-08-17 06:23:21,888] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.888201 (MainThread): Connection 'master' was properly closed.
[2021-08-17 06:23:21,888] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.888341 (MainThread): Connection 'list_staging_dbt_staging' was left open.
[2021-08-17 06:23:21,888] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.888452 (MainThread): On list_staging_dbt_staging: Close
[2021-08-17 06:23:21,888] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.888673 (MainThread): Connection 'list_staging_dbt_dims' was properly closed.
[2021-08-17 06:23:21,888] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.888782 (MainThread): Connection 'list_staging_dbt_seed_data' was left open.
[2021-08-17 06:23:21,888] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.888893 (MainThread): On list_staging_dbt_seed_data: Close
[2021-08-17 06:23:21,889] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.889097 (MainThread): Connection 'list_staging_dbt_facts' was properly closed.
[2021-08-17 06:23:21,889] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.889201 (MainThread): Connection 'list_staging_dbt' was left open.
[2021-08-17 06:23:21,889] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.889306 (MainThread): On list_staging_dbt: Close
[2021-08-17 06:23:21,889] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.889484 (MainThread): Connection 'list_staging_dbt_marts' was left open.
[2021-08-17 06:23:21,889] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.889582 (MainThread): On list_staging_dbt_marts: Close
[2021-08-17 06:23:21,889] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.889816 (MainThread): ERROR: Database Error
[2021-08-17 06:23:21,890] {bash_operator.py:157} INFO - SSL error: unsupported method
[2021-08-17 06:23:21,890] {bash_operator.py:157} INFO -
[2021-08-17 06:23:21,890] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.890042 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f5458d22d30>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f5458150e80>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f54427d7ac8>]}
[2021-08-17 06:23:21,890] {bash_operator.py:157} INFO - 2021-08-17 06:23:21.890317 (MainThread): Flushing usage events
[2021-08-17 06:23:22,581] {bash_operator.py:157} INFO - 1 errors detected
[2021-08-17 06:23:22,581] {bash_operator.py:161} INFO - Command exited with return code 1

System information

Which database are you using dbt with?

  • postgres
  • redshift
  • bigquery
  • snowflake
  • other (specify: ____________)

The output of dbt --version:

 0.16.1

The operating system you're using:

CentOS Linux 7

The output of python --version:
Python 3.6.13

[CT-1247] [Feature] Consolidate current_timestamp & associates

Is this your first time submitting a feature request?

  • I have read the expectations for open source contributors
  • I have searched the existing issues, and I could not find an existing issue for this feature
  • I am requesting a straightforward extension of existing dbt-redshift functionality, rather than a Big Idea better suited to a discussion

Describe the feature

Adapter implementation of dbt-labs/dbt-core#5521

Describe alternatives you've considered

No response

Who will this benefit?

No response

Are you interested in contributing this feature?

No response

Anything else?

No response

Getting AccessExclusiveLock for redshift while running dbt

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

Hi, I have been seeing an AccessExclusiveLock taken by dbt run statements connecting to my Redshift data warehouse.
I wanted to know whether that is the type of lock dbt creates.
https://aws.amazon.com/premiumsupport/knowledge-center/prevent-locks-blocking-queries-redshift/

And how can I set it to use an AccessShareLock instead?

Expected Behavior

No response

Steps To Reproduce

No response

Relevant log output

No response

Environment

- OS:linux
- Python:3.7
- dbt:

What database are you using dbt with?

No response

Additional Context

No response

[CT-471] Redshift: Retry n times on connection timeout

Describe the feature

Runs/builds/tests can create hundreds of independent database connections, depending on the size of the project, and a single connection timeout due to transient network conditions, EC2 load, or Redshift load can cause an entire run to fail. Connection timeouts are most often transient, and the connection will usually succeed on a retry.

Additional context

This would be similar to connect_retries on Snowflake.
See: https://github.com/dbt-labs/dbt-snowflake/pull/6/files
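A retry wrapper around connection opening could look like the following sketch; the function name, backoff policy, and exception types are illustrative assumptions, not existing dbt-redshift code:

```python
import time


def with_retries(open_connection, retries=3, backoff_seconds=1.0,
                 retry_on=(TimeoutError, ConnectionError)):
    """Call open_connection(), retrying up to `retries` extra times on
    transient errors, with exponential backoff between attempts.

    Mirrors the connect_retries pattern used by dbt-snowflake.
    """
    last_error = None
    for attempt in range(retries + 1):
        try:
            return open_connection()
        except retry_on as exc:
            last_error = exc
            if attempt < retries:
                # back off: 1s, 2s, 4s, ... (scaled by backoff_seconds)
                time.sleep(backoff_seconds * (2 ** attempt))
    raise last_error
```

Any non-transient error (e.g. bad credentials) is not in `retry_on` and fails immediately, so retries only absorb the transient timeouts described above.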

Who will this benefit?

Anyone using the redshift connector.

[CT-454] Add env var for `ra3_node` profile config

Background

dbt-labs/dbt-core#3408 added support for cross-database (read-only) querying IFF the user is connecting to a cluster with RA3-type nodes. On the recommendation of the Redshift team, who do not anticipate "classic" node types disappearing anytime soon, and see them remaining the default for some time, we kept our preexisting error in place, and gave users a way to tell us (in profiles.yml) whether they're using RA3 or not. (Original discussion in dbt-labs/dbt-core#3179 + dbt-labs/dbt-core#3236)

Problem

dbt Cloud users do not have access to custom profiles.yml fields today, but they do have access to other configuration options (i.e. dbt_project.yml + env vars). Which of those is the right way forward?

  1. "project" config (dbt_project.yml): I can't think of an exact precedent for this. Maybe quoting? Or the simple fact that sources have been defined in another warehouse? When negotiating the fuzzy boundary between configuration of the project and configuration of the warehouse, this feels more warehouse than project.
  2. "global" config (flag + env var + profiles.yml): This feels more consistent with existing configurations.
    • profiles.yml already supports this config (though under a Redshift target, rather than the top-level config: block)
    • Env var is totally doable: DBT_REDSHIFT_RA3_NODE=True|False. If set via env var, this should take precedence over the profiles.yml setting, just like with global configs.
    • Flag would require a change to dbt-core (main.py). Eventually, we should aim to make this not the case, and allow plugins to register their own custom flags.

Tactical question: Which is the right spot to use the env var, if defined?

ra3_node: Optional[bool] = False

ra3_node = self.config.credentials.ra3_node

Much bigger question: Should we natively support env vars for all adapter-specific profiles.yml configs, via the formulation DBT_{ADAPTER}_{CONFIG}? If we could add this logic to dbt-core, in the Credentials base class, it would make it easier to support new configs as soon as they're added. I'm just not sure if this is the right-sized solution for the problem we have today in dbt Cloud, where it's difficult to coordinate versioned schemas of adapter-specific profiles.yml configuration. (I'll open a dbt-core issue for this)
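The precedence described above (env var beats profiles.yml, which beats the default of False) can be sketched as follows; `resolve_ra3_node` is a hypothetical helper for illustration, not existing dbt-core or dbt-redshift code:

```python
import os


def resolve_ra3_node(profile_value=None, env=None):
    """Resolve the ra3_node setting with the proposed precedence:
    env var > profiles.yml value > default (False).

    The env var name DBT_REDSHIFT_RA3_NODE follows the
    DBT_{ADAPTER}_{CONFIG} formulation from this issue.
    """
    if env is None:
        env = os.environ
    raw = env.get("DBT_REDSHIFT_RA3_NODE")
    if raw is not None:
        # accept the usual truthy spellings for boolean env vars
        return raw.strip().lower() in ("true", "1", "yes")
    if profile_value is not None:
        return bool(profile_value)
    return False
```

Putting this logic in the `Credentials` base class, keyed on the adapter name, is what the generic DBT_{ADAPTER}_{CONFIG} proposal would amount to.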

[CT-1174] mypy broken despite no code changes

Describe the bug

A clear and concise description of what the bug is. What command did you run? What happened?

Steps To Reproduce

In as much detail as possible, please provide steps to reproduce the issue. Sample data that triggers the issue, example model code, etc is all very helpful here.

Expected behavior

A clear and concise description of what you expected to happen.

Screenshots and log output

If applicable, add screenshots or log output to help explain your problem.

System information

The output of dbt --version:

<output goes here>

The operating system you're using:

The output of python --version:

Additional context

Add any other context about the problem here.

[CT-1118] Materialize as Redshift external tables (Glue tables)

Describe the feature

With dbt we have only table and view materializations, which is fine for most engines. With Redshift Spectrum, it would be useful to be able to materialize models directly as external tables, for better integration with the lakehouse capabilities of Redshift.

Describe alternatives you've considered

A workaround I currently use is to materialize the model as a view, then use a post-hook to create an external table from the view:

{{ config(
    alias='exter',
    materialized='view',
    bind=False,
    post_hook=[
        {"sql": "{{ insert_to_external_table(
             this.schema,
             this.table,
             'glue_catalog_test',
             'crwiris',
         ) }}",
         "transaction": False},
        after_commit("drop view dev.dbt.exter")
    ]
) }}

select * from "dev"."glue_catalog_test"."crwiris"


Who will this benefit?

The lakehouse approach has become attractive, and Redshift (with Spectrum) is a major player for handling lake and warehouse data with the same engine and query format. Many companies would benefit from dbt being able to handle Glue tables through Redshift, not only for reading but also for creating them (already possible through macros) and inserting into them seamlessly.


[CT-1179] [Bug] Fix bot changelog generation

Is this a new bug in dbt-redshift?

  • I believe this is a new bug in dbt-redshift
  • I have searched the existing issues, and I could not find an existing issue for this bug

Current Behavior

The changelog generated by bot-changelog.yml has the following bugs:

  • does not correctly select label
  • adds a newline to the changelog that causes tests to fail
  • runs twice on opened PRs

Expected Behavior

All tests pass and it runs a single time for the correct label.

Steps To Reproduce

Look at a PR from dependabot.

Relevant log output

No response

Environment

- OS:
- Python:
- dbt-core:
- dbt-redshift:

Additional Context

The changes were made in dbt-core here. The same changes can be made here. This includes updating the version of the marketplace action that is being used.

Redshift distribution and sort key - default to auto or allow top level configuration

Describe the feature

Redshift now has an auto option for table distribution styles and sort keys. Additionally, they have released Automatic Table Optimization, a feature that allows Redshift to change these settings on its own; it more or less automates the suggestions that show up in the svv_alter_table_recommendations system view, the console UI, etc.

It would be super helpful to have that as the default, either by convention in dbt, or by allowing the dist or sort configuration values to be set in dbt_project.yml.

EDIT: Looks like sort keys can be set in dbt_project.yml already.
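For reference, the per-folder configuration mentioned in the EDIT looks roughly like the following dbt_project.yml fragment; the project name and column names are hypothetical, and whether `auto` is accepted for these configs is exactly what this issue requests:

```yaml
models:
  my_project:            # hypothetical project name
    marts:
      +materialized: table
      +dist: customer_id # distribution key column (or a dist style)
      +sort: [created_at] # sort key column(s)
```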

Describe alternatives you've considered

A config() block in every model. 😢

Additional context

This is a Redshift database specific feature.

Who will this benefit?

Anyone who uses Redshift, and would like to default to auto dist and/or sort keys to take advantage of the Automatic Table Optimization feature.

Are you interested in contributing this feature?

Sure.

[CT-1144] [CT-1093] [Bug] Incremental materialization alters model table to look like temporary table

Is this a new bug in dbt-core?

  • I believe this is a new bug in dbt-core
  • I have searched the existing issues, and I could not find an existing issue for this bug

Current Behavior

We have created an incremental model. As part of the initial creation of this model, we have run a macro to add a primary key, which involves creating a temporary not-null column, populating it, and then replacing the original nullable column:

        alter table "{{table_relation.schema}}"."{{table_relation.identifier}}" add column TEMP_FOR_PK {{column_type}} not null default {{default_value}};
        update "{{table_relation.schema}}"."{{table_relation.identifier}}" set TEMP_FOR_PK = nvl( "{{column_name}}" :: {{column_type}}, {{default_value}});
        alter table "{{table_relation.schema}}"."{{table_relation.identifier}}" drop column "{{column_name}}";
        alter table "{{table_relation.schema}}"."{{table_relation.identifier}}" rename column TEMP_FOR_PK to "{{column_name}}";

However, dbt undoes this work as part of the incremental run:

13:42:15.480122 [debug] [Thread-1  ]: On model.redacted.example_model: /* {"app": "dbt", "dbt_version": "1.0.1", "profile_name": "default", "target_name": "default", "node_id": "model.redacted.example_model"} */

    alter table "keith"."warehouse"."example_model" add column "example_column__dbt_alter" character varying(259);
    update "keith"."warehouse"."example_model" set "example_column__dbt_alter" = "example_column";
    alter table "keith"."warehouse"."example_model" drop column "example_column" cascade;
    alter table "keith"."warehouse"."example_model" rename column "example_column__dbt_alter" to "example_column"

dbt then attempts to insert the new rows into the base table from the temporary table:

insert into "keith"."warehouse"."example_model" (REDACTED)
(
    select REDACTED
    from "example_model__dbt_tmp134213420791"
)

Which results in the following error from the database:

13:42:15.904777 [debug] [Thread-1  ]: Postgres adapter: Postgres error: cannot insert/update into table after dropping non-nullable column

Expected Behavior

dbt would not alter the base table. Instead, it would either (1) insert the rows from the temporary table as-is, or (2) alter the temporary table to match the base table.

Steps To Reproduce

  1. Create an incrementally materialized model with a post-hook that makes columns not-nullable and creates a primary key.
  2. Perform a full refresh.
  3. Perform an incremental refresh.
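
A minimal sketch of such a model (names are hypothetical; make_column_not_nullable stands in for the macro shown under Current Behavior):

```sql
-- models/example_model.sql -- hypothetical reproduction
{{ config(
    materialized='incremental',
    post_hook="{{ make_column_not_nullable(this, 'example_column', 'varchar(259)', \"''\") }}"
) }}

select
    id,
    example_column,
    updated_at
from {{ ref('stg_source') }}
{% if is_incremental() %}
where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```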

Relevant log output

See log excerpts in Current Behavior section.

Environment

- OS: Ubuntu 20.04 (uname -a `Linux ip-10-150-179-62 5.15.0-1017-aws #21~20.04.1-Ubuntu SMP Fri Aug 5 11:44:14 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux`)
- Python: 3.8.10
- dbt: 1.2.1

Which database adapter are you using with dbt?

redshift

Additional Context

No response

Indefinite AWS MFA auth loop when running dbt with Redshift IAM auth

Describe the bug

I'm following the documentation here to set up IAM auth with Redshift profiles, but I keep being prompted to enter an MFA token indefinitely.

Steps To Reproduce

See below for the command and log output (the model code is trivial; it's just a simple select from a raw/base model):

12:06 $ dbt run --profile dade --target iam --profiles-dir profiles/dev -m assigned_worker
Running with dbt=0.18.0
* Deprecation Warning: dbt v0.17.0 introduces a new config format for the
dbt_project.yml file. Support for the existing version 1 format will be removed
in a future release of dbt. The following packages are currently configured with
config version 1:
 - dbt_analytics
 - dbt_utils
 - redshift

For upgrading instructions, consult the documentation:
  https://docs.getdbt.com/docs/guides/migration-guide/upgrading-to-0-17-0

* Deprecation Warning: The "adapter_macro" macro has been deprecated. Instead,
use the `adapter.dispatch` method to find a macro and call the result.
adapter_macro was called for: dbt_utils.type_string
Found 264 models, 1430 tests, 0 snapshots, 10 analyses, 332 macros, 1 operation, 0 seed files, 167 sources

Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
…

This does not appear to be an issue with my AWS config or dbt profiles, as I can connect successfully when I run dbt debug:

12:05 $ dbt debug --profile dade --target iam --profiles-dir profiles/dev 
Running with dbt=0.18.0
dbt version: 0.18.0
python version: 3.8.10
python path: /Users/silvaale/.pyenv/versions/3.8.10/bin/python3.8
os info: macOS-10.15.7-x86_64-i386-64bit
Using profiles.yml file at profiles/dev/profiles.yml
Using dbt_project.yml file at /Users/silvaale/Documents/data-dbt-analysis/dbt_project.yml

Configuration:
  profiles.yml file [OK found and valid]
  dbt_project.yml file [OK found and valid]

Required dependencies:
 - git [OK found]

Connection:
  host: 10.0.20.70
  port: 5439
  user: alex_iam
  database: dade
  schema: dbt_analytics__alex_iam
  search_path: None
  keepalives_idle: 3600
  sslmode: None
  method: iam
  cluster_id: dw-redshift-dev
  iam_profile: wmops
  iam_duration_seconds: 900
Enter MFA code for arn:aws:iam::042199655640:mfa/alex: 
  Connection test: OK connection ok

Here's the dbt profiles.yml I'm using in this test (FYI, the transformers group has the necessary dbt grants, e.g. create on the database):

config:
  partial_parse: True

dade:
  outputs:
    
    prod:
      type: redshift
      threads: 1
      host: "{{ env_var('DBT_HOST') }}"
      port: 5439
      user: dbt
      password: "{{ env_var('DBT_PASSWORD') }}"
      dbname: "{{ env_var('DBT_DBNAME') }}"
      schema: dbt_analytics

    iam:
      type: redshift
      method: iam
      cluster_id: dw-redshift-dev
      host: 10.0.20.70
      user: alex_iam
      iam_profile: wmops # optional
      iam_duration_seconds: 900  # optional
      autocreate: true           # optional
      db_groups: ['transformers']    # optional

      # Other Redshift configs:
      port: 5439
      dbname: dade
      schema: dbt_analytics__alex_iam
      threads: 1
      keepalives_idle: 3600 # default 0, indicating the system default
      # search_path: public # optional, but not recommended
      #sslmode: [optional, set the sslmode used to connect to the database (in case this parameter is set, will look for ca in ~/.postgresql/root.crt)]

  target: prod

Expected behavior

I would expect to be prompted once, with that MFA token remaining valid for some configurable duration.
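
One way to get that behavior is to fetch the temporary credentials once and reuse them until they expire. A minimal caching sketch (this is not dbt's actual implementation; the fetch callable stands in for the boto3 get_cluster_credentials call that triggers the MFA prompt):

```python
import time


def make_credential_cache(fetch, ttl_seconds=900):
    """Cache the result of `fetch` (e.g. a Redshift get_cluster_credentials
    call) for ttl_seconds, so every connection opened within that window
    reuses one MFA-backed STS call instead of prompting again."""
    state = {"creds": None, "expires": 0.0}

    def get():
        now = time.time()
        if state["creds"] is None or now >= state["expires"]:
            state["creds"] = fetch()  # the only place the MFA prompt fires
            state["expires"] = now + ttl_seconds
        return state["creds"]

    return get
```

With a wrapper like this, only the first connection of each iam_duration_seconds window would prompt for an MFA code.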

Screenshots and log output

This is the log output of the dbt run command pasted above:

2021-08-03 16:06:35.471848 (MainThread): Running with dbt=0.18.0
2021-08-03 16:06:35.666706 (MainThread): running dbt with arguments Namespace(cls=<class 'dbt.task.run.RunTask'>, debug=False, defer=None, exclude=None, fail_fast=False, full_refresh=False, log_cache_events=False, log_format='default', models=['assigned_worker'], partial_parse=None, profile='dade', profiles_dir='profiles/dev', project_dir=None, record_timing_info=None, rpc_method='run', selector_name=None, single_threaded=False, state=None, strict=False, target='iam', test_new_parser=False, threads=None, use_cache=True, use_colors=None, vars='{}', version_check=True, warn_error=False, which='run', write_json=True)
2021-08-03 16:06:35.667153 (MainThread): Tracking: tracking
2021-08-03 16:06:35.692776 (MainThread): Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x107d15940>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x107d1df10>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x107d1d910>]}
2021-08-03 16:06:35.851744 (MainThread): profile hash mismatch, cache invalidated
2021-08-03 16:06:35.858492 (MainThread): Parsing macros/business_time__pb.sql
2021-08-03 16:06:35.864497 (MainThread): Parsing macros/get_fiscal_year_dates.sql
2021-08-03 16:06:35.869095 (MainThread): Parsing macros/get_custom_field_name__pb.sql
2021-08-03 16:06:35.889176 (MainThread): Parsing macros/periodic_amount.sql
2021-08-03 16:06:35.892840 (MainThread): Parsing macros/get_fiscal_periods.sql
2021-08-03 16:06:35.896603 (MainThread): Parsing macros/work_age__pb.sql
2021-08-03 16:06:35.902465 (MainThread): Parsing macros/business_hours_per_day__pb.sql
2021-08-03 16:06:35.905195 (MainThread): Parsing macros/try_cast_date.sql
2021-08-03 16:06:35.908347 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.909843 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.910350 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.910871 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.913435 (MainThread): Parsing macros/get_date_parts.sql
2021-08-03 16:06:35.916688 (MainThread): Parsing macros/compress_if_full_refresh.sql
2021-08-03 16:06:35.918522 (MainThread): Parsing macros/get_description_line_item_effective_date.sql
2021-08-03 16:06:35.924617 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.925722 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.928387 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.928713 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.929299 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.929535 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:35.932469 (MainThread): Parsing macros/get_table_name_from_options.sql
2021-08-03 16:06:35.934838 (MainThread): Parsing macros/get_custom_field_select__pb.sql
2021-08-03 16:06:35.937328 (MainThread): Parsing macros/business_open_close__pb.sql
2021-08-03 16:06:35.938716 (MainThread): Parsing macros/get_datestamp.sql
2021-08-03 16:06:35.939923 (MainThread): Parsing macros/strip_deleted_on_from_company_name.sql
2021-08-03 16:06:35.940880 (MainThread): Parsing macros/get_holidays__pb.sql
2021-08-03 16:06:35.942821 (MainThread): Parsing macros/geo_distance.sql
2021-08-03 16:06:35.945618 (MainThread): Parsing macros/get_full_name.sql
2021-08-03 16:06:35.946619 (MainThread): Parsing macros/get_pricing_stratgey_name.sql
2021-08-03 16:06:35.947467 (MainThread): Parsing macros/get_assignment_lifecyle_stage_times_sql.sql
2021-08-03 16:06:35.949606 (MainThread): Parsing macros/catalog.sql
2021-08-03 16:06:35.962517 (MainThread): Parsing macros/relations.sql
2021-08-03 16:06:35.963659 (MainThread): Parsing macros/adapters.sql
2021-08-03 16:06:35.986786 (MainThread): Parsing macros/materializations/snapshot_merge.sql
2021-08-03 16:06:35.988298 (MainThread): Parsing macros/catalog.sql
2021-08-03 16:06:35.990930 (MainThread): Parsing macros/relations.sql
2021-08-03 16:06:35.992574 (MainThread): Parsing macros/adapters.sql
2021-08-03 16:06:36.013348 (MainThread): Parsing macros/materializations/snapshot_merge.sql
2021-08-03 16:06:36.019007 (MainThread): Parsing macros/core.sql
2021-08-03 16:06:36.023778 (MainThread): Parsing macros/materializations/helpers.sql
2021-08-03 16:06:36.033839 (MainThread): Parsing macros/materializations/snapshot/snapshot_merge.sql
2021-08-03 16:06:36.036026 (MainThread): Parsing macros/materializations/snapshot/strategies.sql
2021-08-03 16:06:36.052617 (MainThread): Parsing macros/materializations/snapshot/snapshot.sql
2021-08-03 16:06:36.087185 (MainThread): Parsing macros/materializations/seed/seed.sql
2021-08-03 16:06:36.108958 (MainThread): Parsing macros/materializations/incremental/helpers.sql
2021-08-03 16:06:36.111102 (MainThread): Parsing macros/materializations/incremental/incremental.sql
2021-08-03 16:06:36.143231 (MainThread): Parsing macros/materializations/common/merge.sql
2021-08-03 16:06:36.159458 (MainThread): Parsing macros/materializations/table/table.sql
2021-08-03 16:06:36.167452 (MainThread): Parsing macros/materializations/view/view.sql
2021-08-03 16:06:36.174588 (MainThread): Parsing macros/materializations/view/create_or_replace_view.sql
2021-08-03 16:06:36.181366 (MainThread): Parsing macros/etc/get_custom_alias.sql
2021-08-03 16:06:36.183087 (MainThread): Parsing macros/etc/query.sql
2021-08-03 16:06:36.184873 (MainThread): Parsing macros/etc/is_incremental.sql
2021-08-03 16:06:36.186903 (MainThread): Parsing macros/etc/datetime.sql
2021-08-03 16:06:36.197351 (MainThread): Parsing macros/etc/get_custom_schema.sql
2021-08-03 16:06:36.199863 (MainThread): Parsing macros/etc/get_custom_database.sql
2021-08-03 16:06:36.202025 (MainThread): Parsing macros/adapters/common.sql
2021-08-03 16:06:36.251588 (MainThread): Parsing macros/schema_tests/relationships.sql
2021-08-03 16:06:36.254195 (MainThread): Parsing macros/schema_tests/not_null.sql
2021-08-03 16:06:36.256444 (MainThread): Parsing macros/schema_tests/unique.sql
2021-08-03 16:06:36.259138 (MainThread): Parsing macros/schema_tests/accepted_values.sql
2021-08-03 16:06:36.266467 (MainThread): Parsing macros/cross_db_utils/except.sql
2021-08-03 16:06:36.268474 (MainThread): Parsing macros/cross_db_utils/replace.sql
2021-08-03 16:06:36.270278 (MainThread): Parsing macros/cross_db_utils/concat.sql
2021-08-03 16:06:36.273076 (MainThread): Parsing macros/cross_db_utils/identifer.sql
2021-08-03 16:06:36.276381 (MainThread): Parsing macros/cross_db_utils/datatypes.sql
2021-08-03 16:06:36.284952 (MainThread): Parsing macros/cross_db_utils/_is_relation.sql
2021-08-03 16:06:36.287489 (MainThread): Parsing macros/cross_db_utils/length.sql
2021-08-03 16:06:36.290274 (MainThread): Parsing macros/cross_db_utils/dateadd.sql
2021-08-03 16:06:36.293805 (MainThread): Parsing macros/cross_db_utils/intersect.sql
2021-08-03 16:06:36.295436 (MainThread): Parsing macros/cross_db_utils/right.sql
2021-08-03 16:06:36.298633 (MainThread): Parsing macros/cross_db_utils/datediff.sql
2021-08-03 16:06:36.301864 (MainThread): Parsing macros/cross_db_utils/safe_cast.sql
2021-08-03 16:06:36.304376 (MainThread): Parsing macros/cross_db_utils/hash.sql
2021-08-03 16:06:36.306316 (MainThread): Parsing macros/cross_db_utils/position.sql
2021-08-03 16:06:36.308451 (MainThread): Parsing macros/cross_db_utils/literal.sql
2021-08-03 16:06:36.309810 (MainThread): Parsing macros/cross_db_utils/current_timestamp.sql
2021-08-03 16:06:36.313803 (MainThread): Parsing macros/cross_db_utils/width_bucket.sql
2021-08-03 16:06:36.320144 (MainThread): Parsing macros/cross_db_utils/last_day.sql
2021-08-03 16:06:36.323559 (MainThread): Parsing macros/cross_db_utils/split_part.sql
2021-08-03 16:06:36.325917 (MainThread): Parsing macros/cross_db_utils/date_trunc.sql
2021-08-03 16:06:36.328169 (MainThread): Parsing macros/materializations/insert_by_period_materialization.sql
2021-08-03 16:06:36.355620 (MainThread): Parsing macros/logger/pretty_log_format.sql
2021-08-03 16:06:36.356980 (MainThread): Parsing macros/logger/pretty_time.sql
2021-08-03 16:06:36.358396 (MainThread): Parsing macros/logger/log_info.sql
2021-08-03 16:06:36.359968 (MainThread): Parsing macros/datetime/date_spine.sql
2021-08-03 16:06:36.364875 (MainThread): Parsing macros/web/get_url_host.sql
2021-08-03 16:06:36.366800 (MainThread): Parsing macros/web/get_url_path.sql
2021-08-03 16:06:36.369672 (MainThread): Parsing macros/web/get_url_parameter.sql
2021-08-03 16:06:36.371259 (MainThread): Parsing macros/geo/haversine_distance.sql
2021-08-03 16:06:36.372574 (MainThread): Parsing macros/schema_tests/equal_rowcount.sql
2021-08-03 16:06:36.374434 (MainThread): Parsing macros/schema_tests/relationships_where.sql
2021-08-03 16:06:36.377029 (MainThread): Parsing macros/schema_tests/recency.sql
2021-08-03 16:06:36.378638 (MainThread): Parsing macros/schema_tests/not_constant.sql
2021-08-03 16:06:36.379972 (MainThread): Parsing macros/schema_tests/at_least_one.sql
2021-08-03 16:06:36.381396 (MainThread): Parsing macros/schema_tests/unique_combination_of_columns.sql
2021-08-03 16:06:36.383061 (MainThread): Parsing macros/schema_tests/cardinality_equality.sql
2021-08-03 16:06:36.385031 (MainThread): Parsing macros/schema_tests/expression_is_true.sql
2021-08-03 16:06:36.386622 (MainThread): Parsing macros/schema_tests/equality.sql
2021-08-03 16:06:36.390251 (MainThread): Parsing macros/schema_tests/mutually_exclusive_ranges.sql
2021-08-03 16:06:36.397645 (MainThread): Parsing macros/sql/nullcheck_table.sql
2021-08-03 16:06:36.399137 (MainThread): Parsing macros/sql/generate_series.sql
2021-08-03 16:06:36.403254 (MainThread): Parsing macros/sql/get_relations_by_prefix.sql
2021-08-03 16:06:36.407734 (MainThread): Parsing macros/sql/get_tables_by_prefix_sql.sql
2021-08-03 16:06:36.410829 (MainThread): Parsing macros/sql/star.sql
2021-08-03 16:06:36.414005 (MainThread): Parsing macros/sql/unpivot.sql
2021-08-03 16:06:36.420531 (MainThread): Parsing macros/sql/union.sql
2021-08-03 16:06:36.432087 (MainThread): Parsing macros/sql/groupby.sql
2021-08-03 16:06:36.433759 (MainThread): Parsing macros/sql/surrogate_key.sql
2021-08-03 16:06:36.436302 (MainThread): Parsing macros/sql/nullcheck.sql
2021-08-03 16:06:36.438736 (MainThread): Parsing macros/sql/get_column_values.sql
2021-08-03 16:06:36.443972 (MainThread): Parsing macros/sql/pivot.sql
2021-08-03 16:06:36.447725 (MainThread): Parsing macros/sql/get_query_results_as_dict.sql
2021-08-03 16:06:36.451488 (MainThread): Parsing macros/ddl.sql
2021-08-03 16:06:36.460682 (MainThread): Parsing macros/compression.sql
2021-08-03 16:06:36.479121 (MainThread): Parsing macros/utilities.sql
2021-08-03 16:06:36.480733 (MainThread): Parsing macros/try_cast.sql
2021-08-03 16:06:36.482308 (MainThread): Parsing macros/unload.sql
2021-08-03 16:06:36.488977 (MainThread): Parsing macros/redshift_maintenance_operation.sql
2021-08-03 16:06:36.493495 (MainThread): Parsing macros/introspection.sql
2021-08-03 16:06:36.512359 (MainThread): Parsing macros/queries.sql
2021-08-03 16:06:36.520854 (MainThread): * Deprecation Warning: dbt v0.17.0 introduces a new config format for the
dbt_project.yml file. Support for the existing version 1 format will be removed
in a future release of dbt. The following packages are currently configured with
config version 1:
 - dbt_analytics
 - dbt_utils
 - redshift

For upgrading instructions, consult the documentation:
  https://docs.getdbt.com/docs/guides/migration-guide/upgrading-to-0-17-0

2021-08-03 16:06:36.521009 (MainThread): Sending event: {'category': 'dbt', 'action': 'deprecation', 'label': 'be4ceda5-9e7a-4414-83c6-834981ac4227', 'property_': 'warn', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x107d1d100>]}
2021-08-03 16:06:36.602240 (MainThread): profile hash mismatch, cache invalidated
2021-08-03 16:06:36.682165 (MainThread): Acquiring new redshift connection "model.dbt_analytics.universal_holidays_list__pb".
2021-08-03 16:06:36.702383 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.713468 (MainThread): Acquiring new redshift connection "model.dbt_analytics.universal_address__pb".
2021-08-03 16:06:36.724059 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.733285 (MainThread): Acquiring new redshift connection "model.dbt_analytics.universal_company_users__pb".
2021-08-03 16:06:36.743875 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.756038 (MainThread): Acquiring new redshift connection "model.dbt_analytics.universal_holidays__pb".
2021-08-03 16:06:36.762990 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.771648 (MainThread): Acquiring new redshift connection "model.dbt_analytics.universal_company__pb".
2021-08-03 16:06:36.780055 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.787440 (MainThread): Acquiring new redshift connection "model.dbt_analytics.universal_user_acl_roles__pb".
2021-08-03 16:06:36.796351 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.804414 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_note__pb".
2021-08-03 16:06:36.821309 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.828691 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_agingsla__pb".
2021-08-03 16:06:36.871575 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:36.873046 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:36.873486 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:36.873932 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:36.891371 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.898530 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_custom_field_values__pb".
2021-08-03 16:06:36.909888 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.918025 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_question_answer_pair_details__pb".
2021-08-03 16:06:36.924605 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.931978 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_resource_time_tracking_firstoccurrence__pb".
2021-08-03 16:06:36.938117 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.944223 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_negotiation_spending__pb".
2021-08-03 16:06:36.952463 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.960601 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_aging__pb".
2021-08-03 16:06:36.982794 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:36.989133 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_routing_strategy_details__pb".
2021-08-03 16:06:36.998926 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.007666 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_changelog_status__pb".
2021-08-03 16:06:37.013743 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.019736 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_custom_field_instance_history__pb".
2021-08-03 16:06:37.037314 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.044675 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_custom_field_instance__pb".
2021-08-03 16:06:37.053385 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.060513 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_agingsla_escalations_date_mesh__pb".
2021-08-03 16:06:37.069965 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.077867 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_resource_time_tracking_lastoccurrence__pb".
2021-08-03 16:06:37.085618 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.095507 (MainThread): Acquiring new redshift connection "model.dbt_analytics.first_trip_resolution__pb".
2021-08-03 16:06:37.105952 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.114760 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_parts_usage__pb".
2021-08-03 16:06:37.128780 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.135895 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_sub_status_type_association_details__undeleted__pb".
2021-08-03 16:06:37.145172 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.153156 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_negotiation_scheduling__pb".
2021-08-03 16:06:37.165710 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.173262 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_sub_status_type_association_details__pb".
2021-08-03 16:06:37.183147 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.193693 (MainThread): Acquiring new redshift connection "model.dbt_analytics.callback__pb".
2021-08-03 16:06:37.210108 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.219193 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_milestones_current__pb".
2021-08-03 16:06:37.227568 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.237178 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_note__undeleted__pb".
2021-08-03 16:06:37.246177 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.255657 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_escalation_details__pb".
2021-08-03 16:06:37.270372 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.280300 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_rescheduled_first__pb".
2021-08-03 16:06:37.290420 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.299321 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_resource_assigned_lastoccurrence__pb".
2021-08-03 16:06:37.308273 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.315808 (MainThread): Acquiring new redshift connection "model.dbt_analytics.worker_lob_zone_postalcode__pb".
2021-08-03 16:06:37.329158 (MainThread): * Deprecation Warning: The "adapter_macro" macro has been deprecated. Instead,
use the `adapter.dispatch` method to find a macro and call the result.
adapter_macro was called for: dbt_utils.type_string
2021-08-03 16:06:37.329533 (MainThread): Sending event: {'category': 'dbt', 'action': 'deprecation', 'label': 'be4ceda5-9e7a-4414-83c6-834981ac4227', 'property_': 'warn', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x108a87310>]}
2021-08-03 16:06:37.339427 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
… (the "Acquiring new redshift connection" / "'soft_unicode' has been renamed" message pair repeats for each of the remaining models)
2021-08-03 16:06:37.874800 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.885570 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__users__pb".
2021-08-03 16:06:37.900399 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.910699 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_milestones__pb".
2021-08-03 16:06:37.919961 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.929106 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_question_answer_pair__pb".
2021-08-03 16:06:37.941912 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.950268 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__field_option_selection__pb".
2021-08-03 16:06:37.958380 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.966879 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_sub_status_type__pb".
2021-08-03 16:06:37.975146 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:37.987564 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_acl__acl_group_to_acl_priv__pb".
2021-08-03 16:06:38.029409 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.038975 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__postal_code__pb".
2021-08-03 16:06:38.048855 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.058603 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_shipment__shipment__pb".
2021-08-03 16:06:38.068271 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.080220 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_changelog__pb".
2021-08-03 16:06:38.090365 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.099683 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_acl__acl_priv__pb".
2021-08-03 16:06:38.109191 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.117600 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__group_routing_strategy_association__pb".
2021-08-03 16:06:38.127137 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.136123 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_resource_time_tracking__pb".
2021-08-03 16:06:38.145783 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.152969 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__routing_strategy__pb".
2021-08-03 16:06:38.160762 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.168866 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__custom_field_definition__pb".
2021-08-03 16:06:38.178256 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.185933 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__note__pb".
2021-08-03 16:06:38.195583 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.203768 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_acl__user_membership__pb".
2021-08-03 16:06:38.211131 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.219620 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__custom_field_instance__pb".
2021-08-03 16:06:38.228053 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.235431 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_shipment__part_with_tracking__pb".
2021-08-03 16:06:38.243448 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.251008 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_shipment__shipment_group_undeleted__pb".
2021-08-03 16:06:38.260122 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.269447 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__company__pb".
2021-08-03 16:06:38.278255 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.287378 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_employee_spend__pii".
2021-08-03 16:06:38.297467 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.306276 (MainThread): Acquiring new redshift connection "model.dbt_analytics.worker_location".
2021-08-03 16:06:38.321632 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.330482 (MainThread): Acquiring new redshift connection "model.dbt_analytics.top_talent_and_employee_spend".
2021-08-03 16:06:38.343748 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.352213 (MainThread): Acquiring new redshift connection "model.dbt_analytics.order_item_invoice".
2021-08-03 16:06:38.374982 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.382431 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_contractor_spend__pii".
2021-08-03 16:06:38.390357 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.397578 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_final_cost_details".
2021-08-03 16:06:38.414873 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.423684 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_worker__pii".
2021-08-03 16:06:38.432576 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.439699 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_location__pii".
2021-08-03 16:06:38.447155 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.453869 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_assignment__pii".
2021-08-03 16:06:38.465299 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.472795 (MainThread): Acquiring new redshift connection "model.dbt_analytics.talent_pool_members__pii".
2021-08-03 16:06:38.481594 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.489612 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__address__pii".
2021-08-03 16:06:38.496825 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:38.504253 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__users__pii".
2021-08-03 16:06:38.512556 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
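The repeated `'soft_unicode' has been renamed to 'soft_str'` lines above are deprecation warnings from MarkupSafe, not from dbt itself: the Jinja2 release this dbt install depends on still imports the old `soft_unicode` name, which MarkupSafe 2.0.x deprecates and 2.1 removes. That dependency chain is an assumption read off the warning text, not something this log states. The warnings are harmless on MarkupSafe 2.0.x, but the same environment breaks outright once MarkupSafe 2.1 is installed, so the common workaround at the time was to hold the pin until dbt/Jinja2 could be upgraded:

```shell
# Assumption: the warnings come from the bundled Jinja2 still importing
# the name that MarkupSafe 2.1 removes. Pinning below 2.1 keeps the
# (harmless) warning but prevents a hard ImportError on upgrade:
pip install 'MarkupSafe<2.1'
```

The durable fix is upgrading to a dbt release whose Jinja2 dependency calls `soft_str` directly, at which point the pin can be dropped.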
2021-08-03 16:06:38.519934 (MainThread): Acquiring new redshift connection "model.dbt_analytics.sdlc_metrics".
2021-08-03 16:06:38.531366 (MainThread): Acquiring new redshift connection "model.dbt_analytics.test_metrics__integration_test_cases".
2021-08-03 16:06:38.545742 (MainThread): Acquiring new redshift connection "model.dbt_analytics.test_metrics__mabl".
2021-08-03 16:06:38.558529 (MainThread): Acquiring new redshift connection "model.dbt_analytics.test_metrics__ios_smoke".
2021-08-03 16:06:38.569831 (MainThread): Acquiring new redshift connection "model.dbt_analytics.test_metrics__api".
2021-08-03 16:06:38.581046 (MainThread): Acquiring new redshift connection "model.dbt_analytics.performance_metrics__bulk_upload".
2021-08-03 16:06:38.592159 (MainThread): Acquiring new redshift connection "model.dbt_analytics.performance_metrics__monolith".
2021-08-03 16:06:38.608312 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__role_to_user_undeleted".
2021-08-03 16:06:38.620889 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__feature_type_undeleted".
2021-08-03 16:06:38.633373 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__role_localization_undeleted".
2021-08-03 16:06:38.646329 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__feature_type_localization_undeleted".
2021-08-03 16:06:38.660624 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__permission_localization_undeleted".
2021-08-03 16:06:38.673321 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__permission_undeleted".
2021-08-03 16:06:38.686471 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__role_undeleted".
2021-08-03 16:06:38.703406 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_throughput".
2021-08-03 16:06:38.729017 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_spend_year_over_year_by_project".
2021-08-03 16:06:38.755392 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_status_by_client".
2021-08-03 16:06:38.776586 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_minimal".
2021-08-03 16:06:38.798958 (MainThread): Acquiring new redshift connection "model.dbt_analytics.labor_cloud_status".
2021-08-03 16:06:38.821999 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_staffing_forecast".
2021-08-03 16:06:38.843640 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_total_and_average_spend_by_client".
2021-08-03 16:06:38.864390 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_project_minimal".
2021-08-03 16:06:38.882060 (MainThread): Acquiring new redshift connection "model.dbt_analytics.denorm_work_one".
2021-08-03 16:06:38.909437 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_employee_spend".
2021-08-03 16:06:38.925553 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_contractor_spend".
2021-08-03 16:06:38.944274 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_client_minimal".
2021-08-03 16:06:38.965690 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_operational_metrics_by_project".
2021-08-03 16:06:38.984783 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_spend_year_over_year_by_client".
2021-08-03 16:06:39.004084 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_spend_year_over_year".
2021-08-03 16:06:39.020413 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_status_by_project".
2021-08-03 16:06:39.036553 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assignment_operational_metrics_by_clients".
2021-08-03 16:06:39.056458 (MainThread): Acquiring new redshift connection "model.dbt_analytics.talent_pool_metrics".
2021-08-03 16:06:39.071013 (MainThread): Acquiring new redshift connection "model.dbt_analytics.order_custom_field_value".
2021-08-03 16:06:39.094965 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_durations".
2021-08-03 16:06:39.114258 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_routing_strategy".
2021-08-03 16:06:39.134154 (MainThread): Acquiring new redshift connection "model.dbt_analytics.project_work_association__no_dupes".
2021-08-03 16:06:39.148912 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_time_details".
2021-08-03 16:06:39.166507 (MainThread): Acquiring new redshift connection "model.dbt_analytics.candidate_flow".
2021-08-03 16:06:39.186548 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_company_with_accepted_dates".
2021-08-03 16:06:39.204602 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_history_summary_latest".
2021-08-03 16:06:39.220626 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_custom_field_instance".
2021-08-03 16:06:39.238315 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_milestones_resource_details".
2021-08-03 16:06:39.257117 (MainThread): Acquiring new redshift connection "model.dbt_analytics.paid_work_details".
2021-08-03 16:06:39.274670 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_application".
2021-08-03 16:06:39.292826 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_project".
2021-08-03 16:06:39.314865 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_escalation__with_company".
2021-08-03 16:06:39.335111 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_milestones_negotiation_details".
2021-08-03 16:06:39.353576 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work".
2021-08-03 16:06:39.367213 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_time_metrics".
2021-08-03 16:06:39.387757 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_client_spend".
2021-08-03 16:06:39.403944 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_custom_field".
2021-08-03 16:06:39.419212 (MainThread): Acquiring new redshift connection "model.dbt_analytics.rating__undeleted".
2021-08-03 16:06:39.434156 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_company_with_paid_dates".
2021-08-03 16:06:39.450378 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_resource_time_tracking_firstoccurrence".
2021-08-03 16:06:39.463254 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_closed_with_worker_labels".
2021-08-03 16:06:39.483807 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_question_answer_pair".
2021-08-03 16:06:39.496766 (MainThread): Acquiring new redshift connection "model.dbt_analytics.work_resource_time_tracking_lastoccurrence".
2021-08-03 16:06:39.509401 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision_flow_template_undeleted".
2021-08-03 16:06:39.527968 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision_flow_undeleted".
2021-08-03 16:06:39.542935 (MainThread): Acquiring new redshift connection "model.dbt_analytics.user_login_history".
2021-08-03 16:06:39.560024 (MainThread): Acquiring new redshift connection "model.dbt_analytics.user_custom_field_value".
2021-08-03 16:06:39.584602 (MainThread): Acquiring new redshift connection "model.dbt_analytics.worker_labels".
2021-08-03 16:06:39.612726 (MainThread): Acquiring new redshift connection "model.dbt_analytics.qualification__undeleted".
2021-08-03 16:06:39.628082 (MainThread): Acquiring new redshift connection "model.dbt_analytics.user_to_qualification__undeleted".
2021-08-03 16:06:39.641177 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assessment_user_choice".
2021-08-03 16:06:39.660182 (MainThread): Acquiring new redshift connection "model.dbt_analytics.assigned_worker".
2021-08-03 16:06:39.674962 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_flowengine__node_undeleted".
2021-08-03 16:06:39.688944 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_flowengine__flow_undeleted".
2021-08-03 16:06:39.701597 (MainThread): Acquiring new redshift connection "model.dbt_analytics.revenue__adhoc".
2021-08-03 16:06:39.719188 (MainThread): Acquiring new redshift connection "model.dbt_analytics.revenue__transactional".
2021-08-03 16:06:39.737662 (MainThread): Acquiring new redshift connection "model.dbt_analytics.creditmemo_time_period_amount_distribution".
2021-08-03 16:06:39.763096 (MainThread): Acquiring new redshift connection "model.dbt_analytics.service_transaction_revenue_effective_dates".
2021-08-03 16:06:39.776024 (MainThread): Acquiring new redshift connection "model.dbt_analytics.revenue__amount_distribution__hour".
2021-08-03 16:06:39.797225 (MainThread): Acquiring new redshift connection "model.dbt_analytics.revenue__type_by_client__month".
2021-08-03 16:06:39.813417 (MainThread): Acquiring new redshift connection "model.dbt_analytics.creditmemo_issued_invoice_line_items".
2021-08-03 16:06:39.831072 (MainThread): Acquiring new redshift connection "model.dbt_analytics.revenue__nonadhoc_amount_distribution".
2021-08-03 16:06:39.847305 (MainThread): Acquiring new redshift connection "model.dbt_analytics.revenue__nonadhoc".
2021-08-03 16:06:39.864939 (MainThread): Acquiring new redshift connection "model.dbt_analytics.revenue__adhoc_amount_distribution".
2021-08-03 16:06:39.881770 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_pricing_configuration".
2021-08-03 16:06:39.895014 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_worker".
2021-08-03 16:06:39.910128 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_payment".
2021-08-03 16:06:39.925799 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_assignment".
2021-08-03 16:06:39.941449 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_company".
2021-08-03 16:06:39.988047 (MainThread): Acquiring new redshift connection "model.dbt_analytics.dc_location".
2021-08-03 16:06:40.003716 (MainThread): Acquiring new redshift connection "model.dbt_analytics.talent_pool_members".
2021-08-03 16:06:40.017869 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work".
2021-08-03 16:06:40.034809 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision_status".
2021-08-03 16:06:40.047369 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision_flow".
2021-08-03 16:06:40.059672 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__client_company".
2021-08-03 16:06:40.070026 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_qualification__qualification".
2021-08-03 16:06:40.085540 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__project_work_association".
2021-08-03 16:06:40.096656 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_resource_label".
2021-08-03 16:06:40.109216 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_history_summary".
2021-08-03 16:06:40.123389 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__user_rating_history_summary".
2021-08-03 16:06:40.137278 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision_flow_template".
2021-08-03 16:06:40.150326 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_authentication__login_history".
2021-08-03 16:06:40.163356 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__address".
2021-08-03 16:06:40.178862 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__field_option_selection".
2021-08-03 16:06:40.192750 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__country".
2021-08-03 16:06:40.203535 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__rating".
2021-08-03 16:06:40.217365 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__role_to_permission".
2021-08-03 16:06:40.229779 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_plutus__order_state_type".
2021-08-03 16:06:40.243351 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_plutus__order".
2021-08-03 16:06:40.257390 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_plutus__order_item".
2021-08-03 16:06:40.271513 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__user_to_qualification".
2021-08-03 16:06:40.285716 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__assessment_user_association".
2021-08-03 16:06:40.303020 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_custom_field".
2021-08-03 16:06:40.317935 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_milestones".
2021-08-03 16:06:40.331312 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__plutus_invoice_to_invoice".
2021-08-03 16:06:40.345978 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__login_info".
2021-08-03 16:06:40.361754 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__register_transaction".
2021-08-03 16:06:40.376066 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__role_localization".
2021-08-03 16:06:40.389051 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__assessment".
2021-08-03 16:06:40.403794 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__assessment_item".
2021-08-03 16:06:40.417732 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_escalation".
2021-08-03 16:06:40.432262 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__field_option".
2021-08-03 16:06:40.445592 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__user_group".
2021-08-03 16:06:40.458823 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_negotiation".
2021-08-03 16:06:40.470120 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__state".
2021-08-03 16:06:40.482580 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__routing_strategy".
2021-08-03 16:06:40.493558 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__users".
2021-08-03 16:06:40.509620 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__permission".
2021-08-03 16:06:40.521841 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_custom_field_group".
2021-08-03 16:06:40.533639 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__assessment_item_choice".
2021-08-03 16:06:40.548275 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_resource_time_tracking".
2021-08-03 16:06:40.561295 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__role".
2021-08-03 16:06:40.574242 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_flowengine__flow".
2021-08-03 16:06:40.588306 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__time_dimension".
2021-08-03 16:06:40.603238 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__time_zone".
2021-08-03 16:06:40.615397 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__service_transaction".
2021-08-03 16:06:40.627366 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decider_type".
2021-08-03 16:06:40.640931 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__custom_field_definition".
2021-08-03 16:06:40.654486 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__feature_type".
2021-08-03 16:06:40.666438 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__assessment_attempt".
2021-08-03 16:06:40.677521 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_qualification__qualification_type".
2021-08-03 16:06:40.691370 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__service_transaction_revenue".
2021-08-03 16:06:40.704523 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__feature_type_localization".
2021-08-03 16:06:40.716118 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__user_user_group_association".
2021-08-03 16:06:40.727285 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision_step".
2021-08-03 16:06:40.740146 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_plutus__invoice".
2021-08-03 16:06:40.752141 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__permission_localization".
2021-08-03 16:06:40.765089 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_status_transition_history_summary".
2021-08-03 16:06:40.777068 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_permissions__role_to_user".
2021-08-03 16:06:40.789558 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__credit_memo_audit".
2021-08-03 16:06:40.801712 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_plutus__source_type".
2021-08-03 16:06:40.814589 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_resource".
2021-08-03 16:06:40.827844 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision".
2021-08-03 16:06:40.839847 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__group_routing_strategy_association".
2021-08-03 16:06:40.850232 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__assessment_attempt_response".
2021-08-03 16:06:40.864249 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_custom_field_saved".
2021-08-03 16:06:40.875775 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__invoice_line_item".
2021-08-03 16:06:40.886301 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_decisionflow__decision_flow_status".
2021-08-03 16:06:40.898064 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__profile".
2021-08-03 16:06:40.907293 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__project".
2021-08-03 16:06:40.919840 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__routing_strategy_tracking".
2021-08-03 16:06:40.929822 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_plutus__invoice_state_type".
2021-08-03 16:06:40.940749 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_question_answer_pair".
2021-08-03 16:06:40.953458 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_plutus__payable_type".
2021-08-03 16:06:40.967106 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__invoice".
2021-08-03 16:06:40.978312 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__field_type".
2021-08-03 16:06:40.990202 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_flowengine__node".
2021-08-03 16:06:41.003385 (MainThread): Acquiring new redshift connection "model.dbt_analytics.ms_customfield__custom_field_instance".
2021-08-03 16:06:41.015835 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__company".
2021-08-03 16:06:41.024969 (MainThread): Acquiring new redshift connection "model.dbt_analytics.marketcore__work_custom_field_group_association".
2021-08-03 16:06:41.034827 (MainThread): Acquiring new redshift connection "model.dbt_analytics.users".
2021-08-03 16:06:41.047676 (MainThread): Acquiring new redshift connection "model.dbt_analytics.users_with_profile".
2021-08-03 16:06:41.061441 (MainThread): Acquiring new redshift connection "model.dbt_analytics.address".
2021-08-03 16:06:41.073374 (MainThread): Acquiring new redshift connection "model.dbt_analytics.fiscal_periods".
2021-08-03 16:06:41.100926 (MainThread): Acquiring new redshift connection "model.dbt_analytics.company_average_days_to_approval".
2021-08-03 16:06:41.115043 (MainThread): Acquiring new redshift connection "model.dbt_analytics.company_assignment_lifecycle_overview".
2021-08-03 16:06:41.132279 (MainThread): Acquiring new redshift connection "model.dbt_analytics.company_average_rating".
2021-08-03 16:06:41.149809 (MainThread): Acquiring new redshift connection "model.dbt_analytics.company_average_days_to_pay".
2021-08-03 16:06:41.163667 (MainThread): Acquiring new redshift connection "model.dbt_analytics.buyer_company_ratings".
2021-08-03 16:06:41.181058 (MainThread): Acquiring new redshift connection "model.dbt_analytics.invoice_line_item_adhoc_subscription_effective_split".
2021-08-03 16:06:41.192445 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:41.193202 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:41.195970 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:41.196337 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:41.197057 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:41.197309 (MainThread): invalid escape sequence '\d'
2021-08-03 16:06:41.224192 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.work_with_location_and_start_time".
2021-08-03 16:06:41.239233 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.work_count_by_company_and_date".
2021-08-03 16:06:41.250463 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.worker_scorecard".
2021-08-03 16:06:41.262099 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.staffing_rate_by_company_date_and_routing_strategy".
2021-08-03 16:06:41.271507 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.throughput_by_company_and_date".
2021-08-03 16:06:41.280577 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.company_client_risk_metrics".
2021-08-03 16:06:41.295339 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.company_risk_metrics".
2021-08-03 16:06:41.313398 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.company_assignment_lifecycle_trend".
2021-08-03 16:06:41.323037 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.work_project_throughput".
2021-08-03 16:06:41.333867 (MainThread): Acquiring new redshift connection "analysis.dbt_analytics.company_scorecard".
2021-08-03 16:06:41.353263 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_work_history_summary_latest_sent_once".
2021-08-03 16:06:41.362256 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__work_history_summary".
2021-08-03 16:06:41.371013 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_creditmemo_issued_invoice_line_items_amount_greater_zero".
2021-08-03 16:06:41.380675 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__work".
2021-08-03 16:06:41.389149 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__work_resource".
2021-08-03 16:06:41.399175 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__work_milestones".
2021-08-03 16:06:41.407866 (MainThread): Acquiring new redshift connection "test.dbt_analytics.unique_marketcore__users_associate_object_id__and_not_blank".
2021-08-03 16:06:41.417671 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__group_routing_strategy_association".
2021-08-03 16:06:41.428971 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__work_resource_time_tracking".
2021-08-03 16:06:41.438375 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__routing_strategy".
2021-08-03 16:06:41.447028 (MainThread): Acquiring new redshift connection "test.dbt_analytics.unique_marketcore__company_org_object_id__and_not_blank".
2021-08-03 16:06:41.455147 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__work_status_transition_history_summary".
2021-08-03 16:06:41.463658 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_minimum_years_data__routing_strategy_tracking".
2021-08-03 16:06:41.475014 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_revenue__adhoc_amount_distribution_valid".
2021-08-03 16:06:41.484166 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_worker_company_missing_worker_company__pb".
2021-08-03 16:06:41.494034 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:41.501150 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_stale_custom_field_value_worker_custom_field_instance__pb".
2021-08-03 16:06:41.508943 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:41.517442 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_wrong_worker_company__work_agingsla__pb".
2021-08-03 16:06:41.527825 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:41.534782 (MainThread): Acquiring new redshift connection "test.dbt_analytics.assert_wrong_worker_company_null__work_agingsla__pb".
2021-08-03 16:06:41.546076 (MainThread): 'soft_unicode' has been renamed to 'soft_str'. The old name will be removed in MarkupSafe 2.1.
2021-08-03 16:06:41.556709 (MainThread): Acquiring new redshift connection "operation.dbt_analytics.dbt_analytics-on-run-end-0".
2021-08-03 16:06:57.872900 (MainThread): Acquiring new redshift connection "model.redshift.redshift_columns".
2021-08-03 16:06:57.881424 (MainThread): Acquiring new redshift connection "model.redshift.redshift_tables".
2021-08-03 16:06:57.889271 (MainThread): Acquiring new redshift connection "model.redshift.redshift_constraints".
2021-08-03 16:06:57.897938 (MainThread): Acquiring new redshift connection "model.redshift.redshift_sort_dist_keys".
2021-08-03 16:06:57.905571 (MainThread): Acquiring new redshift connection "model.redshift.redshift_admin_queries".
2021-08-03 16:06:57.916754 (MainThread): Acquiring new redshift connection "model.redshift.redshift_admin_users_schema_privileges".
2021-08-03 16:06:57.928091 (MainThread): Acquiring new redshift connection "model.redshift.redshift_admin_dependencies".
2021-08-03 16:06:57.945133 (MainThread): Acquiring new redshift connection "model.redshift.redshift_admin_users_table_view_privileges".
2021-08-03 16:06:57.957302 (MainThread): Acquiring new redshift connection "model.redshift.redshift_admin_table_stats".
2021-08-03 16:06:57.975535 (MainThread): Acquiring new redshift connection "model.redshift.pg_namespace".
2021-08-03 16:06:57.982455 (MainThread): Acquiring new redshift connection "model.redshift.stl_query".
2021-08-03 16:06:58.059277 (MainThread): Acquiring new redshift connection "model.redshift.svv_diskusage".
2021-08-03 16:06:58.066455 (MainThread): Acquiring new redshift connection "model.redshift.pg_user".
2021-08-03 16:06:58.073073 (MainThread): Acquiring new redshift connection "model.redshift.pg_tables".
2021-08-03 16:06:58.079463 (MainThread): Acquiring new redshift connection "model.redshift.redshift_cost".
2021-08-03 16:06:58.087366 (MainThread): Acquiring new redshift connection "model.redshift.stv_tbl_perm".
2021-08-03 16:06:58.094488 (MainThread): Acquiring new redshift connection "model.redshift.stv_blocklist".
2021-08-03 16:06:58.100749 (MainThread): Acquiring new redshift connection "model.redshift.pg_class".
2021-08-03 16:06:58.107047 (MainThread): Acquiring new redshift connection "model.redshift.pg_depend".
2021-08-03 16:06:58.113019 (MainThread): Acquiring new redshift connection "model.redshift.stl_explain".
2021-08-03 16:06:58.118720 (MainThread): Acquiring new redshift connection "model.redshift.pg_attribute".
2021-08-03 16:06:58.124855 (MainThread): Acquiring new redshift connection "model.redshift.pg_views".
2021-08-03 16:06:58.130896 (MainThread): Acquiring new redshift connection "model.redshift.stl_wlm_query".
2021-08-03 16:07:05.058912 (MainThread): Found 264 models, 1430 tests, 0 snapshots, 10 analyses, 332 macros, 1 operation, 0 seed files, 167 sources
2021-08-03 16:07:05.191909 (MainThread): 
2021-08-03 16:07:05.192346 (MainThread): Acquiring new redshift connection "master".
2021-08-03 16:07:05.194861 (ThreadPoolExecutor-0_0): Acquiring new redshift connection "list_dade".
2021-08-03 16:07:05.202996 (ThreadPoolExecutor-0_0): Using redshift connection "list_dade".
2021-08-03 16:07:05.203148 (ThreadPoolExecutor-0_0): On list_dade: /* {"app": "dbt", "dbt_version": "0.18.0", "profile_name": "dade", "target_name": "iam", "connection_name": "list_dade"} */

    select distinct nspname from pg_namespace
  
2021-08-03 16:07:05.203260 (ThreadPoolExecutor-0_0): Opening a new connection, currently in state init
2021-08-03 16:07:05.203343 (ThreadPoolExecutor-0_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:07:05.203420 (ThreadPoolExecutor-0_0): Connecting to Redshift using 'IAM'with profile wmops
2021-08-03 16:07:11.068376 (ThreadPoolExecutor-0_0): unclosed <ssl.SSLSocket fd=12, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('192.168.1.98', 53410), raddr=('54.239.21.217', 443)>
2021-08-03 16:07:11.745975 (ThreadPoolExecutor-0_0): SQL status: SELECT in 6.54 seconds
2021-08-03 16:07:11.753228 (ThreadPoolExecutor-0_0): unclosed <ssl.SSLSocket fd=12, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('192.168.1.98', 53411), raddr=('54.239.31.87', 443)>
2021-08-03 16:07:11.754197 (ThreadPoolExecutor-0_0): On list_dade: Close
2021-08-03 16:07:13.671289 (ThreadPoolExecutor-1_0): Acquiring new redshift connection "list_dade_dbt_analytics__alex_iam_pii".
2021-08-03 16:07:13.678812 (ThreadPoolExecutor-1_0): Using redshift connection "list_dade_dbt_analytics__alex_iam_pii".
2021-08-03 16:07:13.678923 (ThreadPoolExecutor-1_0): On list_dade_dbt_analytics__alex_iam_pii: BEGIN
2021-08-03 16:07:13.679018 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state closed
2021-08-03 16:07:13.679102 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:07:13.679184 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM'with profile wmops
2021-08-03 16:07:24.161795 (ThreadPoolExecutor-1_0): Error running SQL: BEGIN
2021-08-03 16:07:24.161984 (ThreadPoolExecutor-1_0): Rolling back transaction.
2021-08-03 16:07:24.162127 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state closed
2021-08-03 16:07:24.162232 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:07:24.162325 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM'with profile wmops
2021-08-03 16:07:28.038175 (ThreadPoolExecutor-1_0): Error running SQL: macro list_relations_without_caching
2021-08-03 16:07:28.038336 (ThreadPoolExecutor-1_0): Rolling back transaction.
2021-08-03 16:07:28.038463 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state closed
2021-08-03 16:07:28.038555 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:07:28.038640 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM'with profile wmops
2021-08-03 16:07:34.957104 (ThreadPoolExecutor-1_0): unclosed <ssl.SSLSocket fd=14, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('192.168.1.98', 53416), raddr=('54.239.21.217', 443)>
2021-08-03 16:07:35.528364 (ThreadPoolExecutor-1_0): On list_dade_dbt_analytics__alex_iam_pii: Close
2021-08-03 16:07:35.529254 (ThreadPoolExecutor-1_0): Acquiring new redshift connection "list_dade_dbt_analytics__alex_iam_pb".
2021-08-03 16:07:35.532012 (ThreadPoolExecutor-1_0): unclosed <ssl.SSLSocket fd=14, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('192.168.1.98', 53417), raddr=('54.239.31.87', 443)>
2021-08-03 16:07:35.534319 (ThreadPoolExecutor-1_0): Using redshift connection "list_dade_dbt_analytics__alex_iam_pb".
2021-08-03 16:07:35.534478 (ThreadPoolExecutor-1_0): On list_dade_dbt_analytics__alex_iam_pb: BEGIN
2021-08-03 16:07:35.534605 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state closed
2021-08-03 16:07:35.534679 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:07:35.534747 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM'with profile wmops
2021-08-03 16:07:49.998274 (ThreadPoolExecutor-1_0): Error running SQL: BEGIN
2021-08-03 16:07:49.998403 (ThreadPoolExecutor-1_0): Rolling back transaction.
2021-08-03 16:07:49.998547 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state closed
2021-08-03 16:07:49.998667 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:07:49.998816 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM'with profile wmops
2021-08-03 16:08:06.263592 (ThreadPoolExecutor-1_0): unclosed <ssl.SSLSocket fd=15, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('192.168.1.98', 53429), raddr=('54.239.21.217', 443)>
2021-08-03 16:08:06.978559 (ThreadPoolExecutor-1_0): Error running SQL: macro list_relations_without_caching
2021-08-03 16:08:06.978709 (ThreadPoolExecutor-1_0): Rolling back transaction.
2021-08-03 16:08:06.982698 (ThreadPoolExecutor-1_0): On list_dade_dbt_analytics__alex_iam_pb: Close
2021-08-03 16:08:06.983418 (ThreadPoolExecutor-1_0): Acquiring new redshift connection "list_dade_dbt_analytics__alex_iam".
2021-08-03 16:08:06.985626 (ThreadPoolExecutor-1_0): unclosed <ssl.SSLSocket fd=15, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('192.168.1.98', 53430), raddr=('54.239.31.87', 443)>
2021-08-03 16:08:06.987386 (ThreadPoolExecutor-1_0): Using redshift connection "list_dade_dbt_analytics__alex_iam".
2021-08-03 16:08:06.987476 (ThreadPoolExecutor-1_0): On list_dade_dbt_analytics__alex_iam: BEGIN
2021-08-03 16:08:06.987733 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state closed
2021-08-03 16:08:06.987931 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:08:06.988006 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM'with profile wmops
2021-08-03 16:08:12.749574 (ThreadPoolExecutor-1_0): Error running SQL: BEGIN
2021-08-03 16:08:12.749728 (ThreadPoolExecutor-1_0): Rolling back transaction.
2021-08-03 16:08:12.749857 (ThreadPoolExecutor-1_0): Opening a new connection, currently in state closed
2021-08-03 16:08:12.749952 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM' credentials
2021-08-03 16:08:12.750037 (ThreadPoolExecutor-1_0): Connecting to Redshift using 'IAM'with profile wmops

System information

Which database are you using dbt with?

  • postgres
  • redshift
  • bigquery
  • snowflake
  • other (specify: ____________)

The output of dbt --version:

0.18.0

The operating system you're using:

Mac Catalina 10.15.7 (19H524)

The output of python --version:

Python 3.8.10

Additional context

Potentially relevant issues:

[CT-791] intermittent database connection error

Describe the bug

We've seen some intermittent database connection errors with Redshift, something like

connection to server at "redshift cluster domain name", port 5439 failed: timeout expired

I've been tuning two parameters in profiles.yml:

  • iam_duration_seconds
  • keepalives_idle (set to a small value so that keepalive pulses fire more frequently)

and the errors have been less frequent since. But I don't fully understand how dbt manages connections: does it open a new connection for each model, or reuse an available connection from a pool?
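
For reference, a minimal sketch of where those two settings live in a Redshift IAM profile; the profile name, host, and values below are illustrative assumptions, not a recommendation:

```yaml
# profiles.yml (illustrative values only)
my_profile:
  target: iam
  outputs:
    iam:
      type: redshift
      method: iam
      cluster_id: my-cluster                  # assumption: your cluster id
      host: my-cluster.example.us-east-1.redshift.amazonaws.com
      port: 5439
      user: dbt_user
      dbname: analytics
      schema: analytics
      iam_duration_seconds: 900               # lifetime of the temporary IAM credentials
      keepalives_idle: 60                     # idle seconds before a TCP keepalive probe
```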

Steps To Reproduce

No easy way to reproduce

Expected behavior

No connection errors unless there is a genuine network failure.

Screenshots and log output

If applicable, add screenshots or log output to help explain your problem.

System information

The output of dbt --version:

installed version: 0.19.1
   latest version: 1.0.0

Your version of dbt is out of date! You can find instructions for upgrading here:
https://docs.getdbt.com/docs/installation

Plugins:
  - postgres: 0.19.1
  - redshift: 0.19.1
  - snowflake: 0.19.1
  - bigquery: 0.19.1

The operating system you're using:

Amazon Linux Docker container running on an ECS cluster

The output of python --version:

Python 3.8.10

Additional context

We connect to Redshift using IAM role authentication.

[CT-79] Show a better error message than `SSL SYSCALL error: EOF detected`

Describe the feature

Occasionally, someone will post in the community Slack because they've run into an inscrutable error message SSL SYSCALL error: EOF detected.

The state of the art seems to be this post from Jake back in the day, recommending a decrease of the keepalives_idle value.

If/when this does happen, we should do a better job of explaining what happened and how it can be resolved. It's confusing for the problem to be described as an SSL error* plus an end-of-file error when it is ultimately a timeout.

*I read it as Single Sockets Layer, which is doubly concerning - is my connection unsafe somehow?

I'd prefer something like SSL SYSCALL error: EOF detected. The socket was closed unexpectedly, try a lower keepalives_idle value. (Related: this would be a great candidate for nice shortlinks to read more in the docs @runleonarun)

Describe alternatives you've considered

  • ✅ Updating the docs so that people find out how to fix it when they search the error message
  • Copy-pasting the same Slack link for the rest of my life

Who will this benefit?

What kind of use case will this feature be useful for? Please be specific and provide examples, this will help us prioritize properly.

  • Redshift and Postgres users who have a patchy connection

Are you interested in contributing this feature?

Yeah, this seems like the sort of thing I could do, if I knew where the error was raised and how to safely relay it
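
A hedged sketch of the idea: intercept the opaque driver message and append remediation advice before re-raising. The function name and hint text here are my own illustrations, not dbt's actual code:

```python
# Hypothetical sketch: wrap an inscrutable driver error with an actionable hint.
KEEPALIVE_HINT = (
    "The socket was closed unexpectedly, likely an idle-connection timeout. "
    "Try a lower keepalives_idle value in your profile."
)

def annotate_connection_error(message: str) -> str:
    """Append a remediation hint to known-inscrutable driver errors."""
    if "SSL SYSCALL error: EOF detected" in message:
        return message + ". " + KEEPALIVE_HINT
    return message
```

The adapter's exception handler could call this on the message of any `OperationalError` before surfacing it to the user.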

Versioning the dbt-redshift plugin

  • Let's start pinning minor versions of dbt-core (~=1.0), rather than tightly pinning exact versions
  • For testing changes to this package, should we be installing the latest version of dbt-core (from develop), rather than the latest version available on PyPi? Otherwise, this causes test failures when there are core changes that require corresponding plugin changes, or even just corresponding changes to integration tests
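
A hypothetical sketch of what the looser pin could look like in setup.py; per PEP 440, the compatible-release operator `~=1.0` means ">=1.0, <2.0", so the plugin floats with any dbt-core 1.x release instead of one exact version:

```python
# Hypothetical setup.py excerpt (illustrative, not the repo's actual file).
install_requires = [
    "dbt-core~=1.0",  # compatible release: >=1.0, <2.0
]
```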

`test_concurrent_transaction` failing

This test is pretty hard to debug, since it exists to reproduce a common-yet-wacky scenario. There's some helpful context in the README.

This is the error we started seeing in scheduled runs this morning:

Unable to connect to Python Management Process
DETAIL:
  -----------------------------------------------
  error:  Unable to connect to Python Management Process
  code:      10001
  context:   system:2
  query:     0
  location:  pymanager_client.cpp:43
  process:   padbmaster [pid=6709]
  -----------------------------------------------

I think this error is coming from Redshift, and it's possibly related to a recent Redshift change: https://forums.aws.amazon.com/thread.jspa?threadID=347702

[CT-1210] [Feature] Cross-database macro for type_boolean()

Is this your first time submitting a feature request?

  • I have read the expectations for open source contributors
  • I have searched the existing issues, and I could not find an existing issue for this feature
  • I am requesting a straightforward extension of existing dbt-redshift functionality, rather than a Big Idea better suited to a discussion

Describe the feature

see dbt-labs/dbt-core#5739

Describe alternatives you've considered

No response

Who will this benefit?

No response

Are you interested in contributing this feature?

No response

Anything else?

No response

[CT-1185] [Feature] [Spike] Support dbt Python models on Redshift/AWS

UPDATE: closed in favor of #204

Is this your first time submitting a feature request?

  • I have read the expectations for open source contributors
  • I have searched the existing issues, and I could not find an existing issue for this feature
  • I am requesting a straightforward extension of existing dbt-redshift functionality, rather than a Big Idea better suited to a discussion

Describe the feature

dbt Python models, but on Redshift/AWS

Describe alternatives you've considered

n/a

Who will this benefit?

Redshift/AWS users

Are you interested in contributing this feature?

not exactly...

Anything else?

seems like the Python cursor API (thanks @dataders) supports reading and writing dataframes: https://docs.aws.amazon.com/redshift/latest/mgmt/python-api-reference.html#python-api-cursor

potentially combine that with something like this: https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/launch-a-spark-job-in-a-transient-emr-cluster-using-a-lambda-function.html

we should investigate the technical feasibility here

[CT-644] Align Botocore and Boto3 Dependencies pinnings to support MWAA Airflow

Describe the feature

MWAA Python package dependencies cannot be resolved with the currently pinned ranges in dbt-redshift.

MWAA is a managed Airflow environment that runs in AWS. AWS suggests a list of constraints to ensure that the environment stays healthy when installing new packages. While I can imagine a desire to avoid resolving conflicts with every platform, the likelihood that an AWS environment is using Redshift seems pretty high. The constraint file for the most current version of MWAA can be found here:
https://raw.githubusercontent.com/apache/airflow/constraints-2.2.2/constraints-3.7.txt

It looks like MWAA wants boto3 to be boto3==1.18.65.

For reference, here are the logs when attempting to install dbt-redshift in the environment:

ERROR: Cannot install dbt-redshift==0.13.0, dbt-redshift==0.13.1, dbt-redshift==0.14.0, dbt-redshift==0.14.1, dbt-redshift==0.14.2, dbt-redshift==0.14.3, dbt-redshift==0.14.4, dbt-redshift==0.15.0, dbt-redshift==0.15.1, dbt-redshift==0.15.2, dbt-redshift==0.15.3, dbt-redshift==0.16.0, dbt-redshift==0.16.1, dbt-redshift==0.17.0, dbt-redshift==0.17.1, dbt-redshift==0.17.2, dbt-redshift==0.18.0, dbt-redshift==0.18.1, dbt-redshift==0.18.2, dbt-redshift==0.19.0, dbt-redshift==0.19.1 and dbt-redshift==0.19.2 because these package versions have conflicting dependencies.

The conflict is caused by:
    dbt-redshift 0.19.2 depends on boto3<1.16 and >=1.4.4
    dbt-redshift 0.19.1 depends on boto3<1.16 and >=1.4.4
    dbt-redshift 0.19.0 depends on boto3<1.16 and >=1.4.4
    dbt-redshift 0.18.2 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.18.1 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.18.0 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.17.2 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.17.1 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.17.0 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.16.1 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.16.0 depends on botocore<1.15 and >=1.5.0
    dbt-redshift 0.15.3 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.15.2 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.15.1 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.15.0 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.14.4 depends on boto3<1.11.0 and >=1.4.4
    dbt-redshift 0.14.3 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.14.2 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.14.1 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.14.0 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.13.1 depends on boto3<1.10.0 and >=1.6.23
    dbt-redshift 0.13.0 depends on boto3<1.10.0 and >=1.6.23
    The user requested (constraint) boto3==1.18.65
    The user requested (constraint) botocore==1.21.65

Describe alternatives you've considered

We'll continue to use the dbt package and call out to the external CLI.

Who will this benefit?

Users that run dbt via Airflow.

Are you interested in contributing this feature?

I can help look into it, but I'm unfamiliar with the historical reason the packages are pinned as they are.

[CT-789] Add support for grant macros

Everything in Redshift is mostly the same as in PostgreSQL, so we don't require any changes to:

  • materializations
  • get_grant_sql
  • get_revoke_sql

However, the syntax for showing grants is very different. I think something like this will work, though I'll be the first to admit that it looks & feels pretty weird:

{% macro redshift__get_show_grant_sql(relation) %}

with privileges as (

    -- valid options per https://docs.aws.amazon.com/redshift/latest/dg/r_HAS_TABLE_PRIVILEGE.html
    select 'select' as privilege
    union all
    select 'insert' as privilege
    union all
    select 'update' as privilege
    union all
    select 'delete' as privilege
    union all
    select 'references' as privilege

)

select
    u.usename as grantee,
    p.privilege
from pg_user u
cross join privileges p
where has_table_privilege(u.usename, '{{ relation }}', privilege)

{% endmacro %}

Even though that function is called has_table_privilege, it works for both tables + views. Schema-level privileges require using has_schema_privilege instead; not something we need to worry about for now.

Assuming that query is able to return columns by the same name (and we could rename them in the SQL if need be), we shouldn't need to override any other parts of the puzzle.

[CT-351] Use the new testing framework in the redshift plugin

As part of the testing overhaul we have created a suite of adapter tests that will be located in the dbt.tests.adapter namespace and will be available for subclassing in specific adapter test suites. This ticket is to enable using the new test suite and implement the initial batch of tests (the ones that replace dbt-adapter-tests).

[CT-987] Relation does not exist when create new view

Describe the bug

I first use drop ... cascade to drop my snapshot ('notifications') and the view derived from it ('notifications_current'). Then I recreate the snapshot and the dependent view. The snapshot is created successfully, but the view is not.

Steps To Reproduce

  1. dbt run a model to drop the snapshot and the cascaded view
  2. dbt snapshot to recreate the snapshot
  3. dbt run to create the view

Expected behavior

The view can be created

Screenshots and log output

01:37:49 Completed with 1 error and 0 warnings:
01:37:49
01:37:49 Database Error in model notifications_current (models/notifications_poc/notifications_current.sql)
01:37:49 relation "uat_conforming_notifications.notifications_current" does not exist

System information

dbt core 1.1.0


Redshift

python 3.9.13


[CT-1617] Materialization to create external tables (Redshift first)

Describe the feature

When using Redshift Spectrum, dbt is able to stage external tables stored in S3 and do useful things such as declaring the source data format (csv, json, parquet, etc.). The idea is to create a new materialization option (potentially "external") which would handle persisting query results and enable a few capabilities:

  • Specifying a location to output partitioned results generated by a model's sql
  • Specifying the output format (csv, json, parquet) and compression where applicable
  • Control over the partitioning logic (e.g. use a defined set of columns to create partitions in the lake)
  • For Redshift, ability to CLEAR PATH to overwrite existing partitions in the lake

Describe alternatives you've considered

The implementation of create_external_table here accomplishes this when triggered by a run-operation. The goal here is to make that logic a materialization so that it can become part of the dbt run pipeline.

Additional context

I believe this is relevant for any of the databases currently supported in the external tables package:

  • Redshift (Spectrum)
  • Snowflake
  • BigQuery
  • Spark
  • Synapse
  • Azure SQL

Who will this benefit?

dbt Users who have existing infrastructure that leverages a more data lake centric approach for managing persistence will benefit from this. They can use dbt and the warehouse as an ephemeral compute / transform layer, and then persist the data to a file store, which enables other tools (e.g. AWS Glue / Athena) to query the results using existing analytical patterns.

Are you interested in contributing this feature?

Yes.

[CT-1011] Add changie to dbt-redshift

Describe the feature

Implement changie in all adapters to reach parity with dbt-core.


Who will this benefit?

Users will have an easy format to verify they have created a changelog for their PR, and a simple interactive mode to create said changelog.

Maintainers will be able to easily batch PRs by release and pre-release, for an easier release process.

Are you interested in contributing this feature?

Yes

[CT-1122] [CT-1116] [Bug] `adapter.get_relation` for relations in other databases [RA3]

Is this a new bug in dbt-core?

  • I believe this is a new bug in dbt-core
  • I have searched the existing issues, and I could not find an existing issue for this bug

Current Behavior

adapter.get_relation seems to ignore the database argument and uses the target database anyway.

Expected Behavior

The right database is used.

Steps To Reproduce

  • I have a model, defined as below, which checks whether it already exists in the same schema but in another database. If it exists, it logs a message and reads data from there:

    {{ config({ "materialized":"table" }) }}
    
    {%- set relation = adapter.get_relation(
        database="integrated",
        schema=this.schema,
        identifier=this.identifier) -%}
    
    {% if relation is not none %}
    
    {{ log(relation.database ~ '.' ~ relation.schema ~ '.' ~ 
           relation.identifier ~ " already exists", info=true) }}
    
    select * from {{relation}}
    
    {% else %}
    
    ... (e.g. creates an empty table or do nothing if exists)
    
    {% endif %}
    
  • I have 2 databases: dev and integrated.

  • In dev I have that table already created, in integrated - not.

  • The database specified in the active target is dev.

  • Now, when running dbt run -m "<model_identifier>", I would expect it to do nothing in the dev database. But instead I get:

    02:41:51  1 of 1 START table model some_schema.some_model ................... [RUN]
    02:41:52  integrated.some_schema.some_model already exists
    02:41:52  1 of 1 ERROR creating table model some_schema.some_model .......... [ERROR in 0.80s]
    
    02:41:55  Completed with 1 error and 0 warnings:
    02:41:55  
    02:41:55  Database Error in model <model_identifier>
    02:41:55    Schema some_schema does not exist in the database.
    

    so adapter.get_relation claims to be able to find the table in the integrated database, although it doesn't exist there. And because it doesn't exist, the run fails when it tries to actually read the data from there.

PS: that's only a dummy example to visualise the issue; please don't focus on the logic itself.

Relevant log output

No response

Environment

- OS: macOS Monterey 12.5.1
- Python: 3.9.11
- dbt:
  
  Core:
  - installed: 1.2.1
  - latest:    1.2.1 - Up to date!

  Plugins:
  - redshift: 1.2.1 - Up to date!
  - postgres: 1.2.1 - Up to date!

Which database adapter are you using with dbt?

redshift

Additional Context

No response

[CT-23] syntax error with table backup option

Describe the bug

dbt run fails with a syntax error for models that have both the backup option and a table attribute (e.g. dist) set.

Steps To Reproduce

Run this model:

{{ config(backup=False, dist='a', sort='a') }}
select 1 as a

Expected behavior

The model should build.

Screenshots and log output

Screen Shot 2022-01-05 at 1 05 37 PM

System information

The output of dbt --version:

installed version: 1.0.1
   latest version: 1.0.1

Up to date!

Plugins:
  - snowflake: 0.21.0
  - redshift: 1.0.0
  - postgres: 1.0.1

The operating system you're using:
Mac
The output of python --version:
Python 3.8.6

Additional context

The backup option needs to come before the table attributes, which is not the case in the compiled SQL. This is related to https://github.com/dbt-labs/dbt-redshift/pull/42/files
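For reference, the Redshift CREATE TABLE grammar places the BACKUP clause before the table attributes (DISTSTYLE, DISTKEY, SORTKEY). A minimal illustration of what the compiled SQL should and should not look like:

```sql
-- Valid: BACKUP precedes the table attributes
create table my_table (a int)
backup no
diststyle key
distkey (a)
sortkey (a);

-- Invalid: table attributes before BACKUP raises a syntax error
-- create table my_table (a int) distkey (a) sortkey (a) backup no;
```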

[CT-359] Column names not quoted in alter table statements

Describe the bug

When using the config on_schema_change='sync_all_columns' for a model, dbt executes alter statements on the pre-existing table to add a column. The column name in the alter statement is not quoted correctly, which leads to SQL syntax errors.

In the screenshot below, the generated alter table statement is:

alter table "analytics"."dbt_analytics_intermediate"."hourly_precalculated_metrics" add column events__screener page bigint

The column that dbt is attempting to add to the incremental model is named "events__Screener Page".

Steps To Reproduce

  1. Create and materialise a new incremental model with the sync_all_columns configuration
  2. Add a quoted column to the model with spaces such as "This is a new column"
  3. Run the model

Expected behavior

The alter statement should be generated correctly with a new column created with a double quoted column name.
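A minimal sketch of the quoting the generated statement needs (quote_identifier and add_column_sql are hypothetical helpers for illustration, not the adapter's actual implementation):

```python
def quote_identifier(name: str) -> str:
    """Wrap an identifier in double quotes, escaping any embedded quotes."""
    return '"' + name.replace('"', '""') + '"'


def add_column_sql(relation: str, column: str, data_type: str) -> str:
    """Build an ALTER TABLE ... ADD COLUMN statement with a quoted column name."""
    return f"alter table {relation} add column {quote_identifier(column)} {data_type}"


print(add_column_sql('"analytics"."dbt_analytics_intermediate"."hourly_precalculated_metrics"',
                     "events__Screener Page", "bigint"))
# → alter table "analytics"."dbt_analytics_intermediate"."hourly_precalculated_metrics" add column "events__Screener Page" bigint
```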

Screenshots and log output

Screen Shot 2022-03-14 at 5 34 45 pm

System information

The output of dbt --version:

installed version: 1.0.3
   latest version: 1.0.3

Up to date!

Plugins:
  - redshift: 1.0.0 - Up to date!
  - postgres: 1.0.3 - Up to date!

The operating system you're using:
Mac

The output of python --version:
Python 3.9.7

Additional context


[CT-1061] Remove `ra3_node` profile config?

Describe the feature

Currently, if you're using "2nd gen" Redshift nodes, dbt-redshift will try to raise a helpful error when it thinks you're doing something disallowed, such as querying/cataloging a table that lives in a different database:

raise dbt.exceptions.NotImplementedException(
    "Cross-db references allowed only in RA3.* node. ({} vs {})".format(
        database, expected
    )
)

(As it turns out, this exception is raised inconsistently — yes during docs generate, but seemingly not during dbt run if you've defined sources with other database values.)

With the advent of RA3 nodes in 2019 (!), you can opt into this functionality by setting an ra3_node: true configuration in profiles.yml.
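For context, the opt-in looks like this in profiles.yml (host and profile names are placeholders, connection details abbreviated):

```yaml
my_profile:
  target: dev
  outputs:
    dev:
      type: redshift
      host: my-cluster.abc123.us-east-1.redshift.amazonaws.com
      # ... user, password, port, dbname, schema ...
      ra3_node: true  # opt in to cross-database references
```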

I think there's a legitimate case to just remove the exception entirely, and treat verify_database as a no-op.

Describe alternatives you've considered

Leaving ra3_node as a profile configuration, but setting it to true by default, now that the majority (we think) of dbt-redshift users are running on RA3 nodes.

Additional context

I'm trying to remind myself of why we went for the opt-in config in the first place, rather than removing this gate completely. The relevant issues are from earlier last year:

I recall our rationale coming down to trying to answer the question: Are RA3 nodes the future of Redshift? Or, if 2nd-gen node types are sticking around, how fundamental is the fork in functionality between 2nd-gen and 3rd-gen+ nodes?

I don't remember getting crystal-clear answers to those questions. In retrospect, the ra3_node config was a highly conservative approach which assumed that:

  • 2nd-gen and 3rd-gen node types were sticking around for some time (mostly true)
  • There would be many more functional gaps as RA3 nodes added more and more capabilities, such that dbt would need to be able to check a conditional config — this hasn't really proven to be the case

Who will this benefit?

Having profile configurations where they're not truly needed leads to:

  • potential confusion for end users
  • documentation maintenance burden
  • additional work for dbt Cloud to support those profile configurations

[CT-33] Add macro/run-operation for uploading files from local filesystem to S3

Describe the feature

Prompted by brooklyn-data/dbt_artifacts#60.

Access to a macro such as:

{% do adapter.upload_file(file_path, destination) %}

would enable dbt_artifacts to become compatible with Redshift. The user would need to create their own S3 bucket, and grant any policies needed.

The adapter already has boto3 as a dependency, so no additional packages should be needed.
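A minimal sketch of how such a run-operation could resolve its destination, assuming an s3:// URI for the destination argument (the boto3 call is shown as a comment, since it requires AWS credentials; parse_s3_uri is a hypothetical helper):

```python
from urllib.parse import urlparse


def parse_s3_uri(uri: str) -> tuple[str, str]:
    """Split an s3://bucket/key URI into a (bucket, key) pair."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an s3 URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")


bucket, key = parse_s3_uri("s3://my-bucket/artifacts/run_results.json")
# upload_file(file_path, destination) could then call:
#   boto3.client("s3").upload_file(file_path, bucket, key)
print(bucket, key)  # → my-bucket artifacts/run_results.json
```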

Describe alternatives you've considered


Additional context


Who will this benefit?


Are you interested in contributing this feature?


Redshift Serializable isolations and dbt transactions

Describe the feature

We use dbt on top of Redshift at a decent scale. It's about 1600 models, many updated hourly and every quarter hour. We also have a large BI instance hooked up to the same database. As we approached 1000 BI users we started to have rampant serializable isolation violations in redshift. As you may know this is the only isolation available, but the granularity is at the table level. If the BI tool joins a source table for a dbt model and that model's output, it is very easy to get a non serializable sequence of events.

We have found that the inbuilt redshift materializations hold long transactions while doing the following:

Table:

  1. CTAS into new table
  2. ALTER cur -> old
  3. ALTER new -> cur

Incremental:

  1. CTAS into tmp table
  2. delete from cur where id in tmp
  3. insert into cur * from tmp
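Sketched as SQL (model and column names are illustrative, not the adapter's exact compiled output), the built-in materializations hold a single transaction across all steps:

```sql
-- Table materialization: one transaction around CTAS + both renames
begin;
create table my_model__dbt_tmp as (select /* model SQL */ 1 as id);
alter table my_model rename to my_model__dbt_backup;
alter table my_model__dbt_tmp rename to my_model;
commit;

-- Incremental materialization: one transaction around CTAS + delete + insert
begin;
create temporary table my_model__dbt_tmp as (select /* model SQL */ 1 as id);
delete from my_model using my_model__dbt_tmp
  where my_model.id = my_model__dbt_tmp.id;
insert into my_model select * from my_model__dbt_tmp;
commit;
```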

Describe alternatives you've considered

We tried Redshift's new datashare feature to get, effectively, a kind of snapshot isolation but it's cumbersome.

Who will this benefit?

Redshift users who encounter this issue; we're not sure how many are out there. This could increase commit queue delays, since there are two commits instead of one, but decrease the serialization issues. In the table materialization, a failure in the ALTER could also leave a stray _new table, which would be cleaned up in the next run.

Are you interested in contributing this feature?

Our solution was to break the above into two separate transactions, committing after each CTAS, and we have been running it for two weeks. It has reduced our serialization errors 100x. We can submit the code as a PR attached to this issue, but wanted to see if this is something that would be accepted before we posted. The materializations are declared as:

{% materialization incremental, adapter='redshift' -%}

[CT-348] Unit tests broken for release branches

For our unit tests, we wanted to make sure we were testing new changes against the newest version of dbt-core so that we could adapt to any changes made there (they are essentially our foundation). So for pushes, PR's, and just manual triggers, we clone the latest code from dbt-core and dbt-postgres on the main branch and use that to test against. The code for this can be found here.

This works well for dbt-redshift when we are testing out the main branch or changes going in to the main branch.

This doesn't work well when we are trying to test a release branch or changes going into a release branch. We would essentially be testing dbt-core code in main against dbt-redshift code that is for 1.0.latest. We should be testing against dbt-core's 1.0.latest branch instead.

We now have breaking changes on the dbt-core main branch that we incorporated into the dbt-redshift main branch, but the dbt-redshift 1.0.latest branch is broken because it is referring to the newer dbt-core code.

[CT-1340] [Feature] Support Python models (dbt-py) on Redshift/AWS

Is this your first time submitting a feature request?

  • I have read the expectations for open source contributors
  • I have searched the existing issues, and I could not find an existing issue for this feature
  • I am requesting a straightforward extension of existing dbt-redshift functionality, rather than a Big Idea better suited to a discussion

Describe the feature

Background:

There's a Spark Redshift connector. This would allow users to run Python transformation code on an EMR cluster that loads data from Redshift and writes the transformed data back to Redshift. The whole process is very similar to using Dataproc to run Python models on GCP/BigQuery.

Items needed for implementation:

  • If additional profile information is needed for the EMR cluster, we can add it as optional attributes on Credentials (existing example for bigquery).
  • We need a macro to generate the final code to run on the EMR cluster; previous example for dbt-bigquery here.
  • With the profile info and the macro to generate the final code in place, we need submission classes to submit the Python code to the cluster. Existing submission code for dbt-bigquery, functions to define in impl.py (link1, link2), and how those classes are used by dbt-core (this doesn't need to be changed).
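As a pseudocode sketch of what a compiled Python model might do on EMR (the connector format string, JDBC URL, and tempdir option are assumptions about the community Spark-Redshift connector, not verified API):

```python
# Pseudocode sketch: read a Redshift table into Spark on EMR, transform,
# and write back. Hosts, buckets, and table names are placeholders.
df = (
    spark.read.format("io.github.spark_redshift_community.spark.redshift")
    .option("url", "jdbc:redshift://<host>:5439/<db>")
    .option("dbtable", "some_schema.source_table")
    .option("tempdir", "s3://<bucket>/tmp/")
    .load()
)

transformed = df  # the user's model(dbt, session) logic would go here

(
    transformed.write.format("io.github.spark_redshift_community.spark.redshift")
    .option("url", "jdbc:redshift://<host>:5439/<db>")
    .option("dbtable", "some_schema.target_table")
    .option("tempdir", "s3://<bucket>/tmp/")
    .mode("overwrite")
    .save()
)
```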

Describe alternatives you've considered

No response

Who will this benefit?

No response

Are you interested in contributing this feature?

No response

Anything else?

No response

Redshift create table with backup option

Describe the feature

The CREATE TABLE statement in Redshift has a configuration option that controls whether or not the table should be included in automated and manual cluster snapshots. The default setting for this option is YES. At present this is not configurable in dbt, and the default is used for all table and incremental materialisations.

By allowing this parameter to be configured in dbt models, we can save processing time when creating and restoring snapshots, and reduce storage space on Amazon Simple Storage Service.

Describe alternatives you've considered

There is no way to get around this except to turn off automated snapshots at a cluster level.

To disable automated snapshots, set the retention period to zero. If you disable automated snapshots, Amazon Redshift stops taking snapshots and deletes any existing automated snapshots for the cluster.

Additional context

This feature is Redshift specific. It is similar, but not identical, to Snowflake's TRANSIENT tables.

Who will this benefit?

This will benefit all dbt users on Redshift who want to control storage costs and speed up snapshot creation and restores.

P.S. If you think this is worthwhile and in keeping with dbt's principles, then I would love to work on this issue with a PR.

[CT-1616] get_relation method returns None for Redshift external tables

Describe the bug

The get_relation method (https://docs.getdbt.com/docs/writing-code-in-dbt/jinja-context/adapter/#get_relation) returns None when provided with the database, schema, and table name of a Redshift external table (AWS Spectrum).

Steps To Reproduce

  1. Create an external table with AWS Spectrum: https://docs.aws.amazon.com/redshift/latest/dg/c-spectrum-external-tables.html
  2. Attempt to get_relation on that external table.

Expected behavior

It should return a regular relation, as it does for native Redshift tables, e.g. an output like analytics.spectrum_schema.mytable instead of None.
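For background, Spectrum external tables are not listed in the Postgres-style catalog views that native tables appear in; they are surfaced through SVV_EXTERNAL_TABLES instead, which is presumably why the relation lookup misses them (schema and table names below are illustrative):

```sql
-- Native tables show up in the Postgres-style catalogs:
select schemaname, tablename
from pg_tables
where schemaname = 'spectrum_schema';

-- ...but Spectrum external tables only show up here:
select schemaname, tablename
from svv_external_tables
where schemaname = 'spectrum_schema'
  and tablename = 'mytable';
```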

Screenshots and log output

System information

Which database are you using dbt with?

  • postgres
  • redshift
  • bigquery
  • snowflake
  • other (specify: - - - )

The output of dbt --version:

0.16.0

The operating system you're using:
MacOS Mojave

The output of python --version:
3.6.2

Additional context

[CT-577] Fix: tables not being dropped after testing even though logs show cascade/drop happening

For the past couple of nights, CI/CD has flagged https://github.com/dbt-labs/dbt-redshift/runs/6205412456?check_suite_focus=true with 1040 failures for too many tables, including temporary tables.

Looking in the Redshift test account shows that tables are stacking up and staying, even though the test logs show the drop call happening.

Expected behavior

Tests should run, build tables/views, and clean up after the run.

Screenshots and log output

cascade call:
error log

integration test logs:
https://github.com/dbt-labs/dbt-redshift/runs/6205412456?check_suite_focus=true#step:6:1

Additional context


Support writing to external tables in dbt models

Describe the feature

Historically, to get data from Redshift into S3, you needed to run an unload command. It looks like a new query syntax, create external table ... as, will allow dbt to write transformed data directly to S3.

The dbt-external-tables package might be a good home for logic like this? Or, I could buy this being something natively supported by dbt if the extensions to the existing table materialization are minimal / tractable to implement.
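The two approaches, sketched with placeholder bucket, role, and table names:

```sql
-- Historical approach: UNLOAD query results to S3
unload ('select * from analytics.my_model')
to 's3://my-bucket/my_model/'
iam_role 'arn:aws:iam::111122223333:role/my-redshift-role';

-- Newer syntax: write directly to an external (Spectrum) table backed by S3
create external table spectrum.my_model
stored as parquet
location 's3://my-bucket/my_model/'
as select * from analytics.my_model;
```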

Who will this benefit?

Data Lake Professionals ™️
