metabase_api_python's Introduction

Installation

pip install metabase-api

Initializing

from metabase_api import Metabase_API

mb = Metabase_API('https://...', 'username', 'password')  # if the password is not given, you will be prompted for it

Functions

REST functions (get, post, put, delete)

Calling Metabase API endpoints (documented here) can be done using the corresponding REST function in the wrapper.
E.g. to call the endpoint GET /api/database/, use mb.get('/api/database/').

Helper Functions

You usually don't need to call these functions directly (e.g. get_item_info, get_item_id, get_item_name).

Custom Functions

For a complete list of function parameters, see the function definitions using the above links. Here we provide a short description:

  • create_card

Specify the name to be used for the card, the table (name/id) to use as the source of data, and the collection (name/id) in which to save the card (the default is the root collection).

mb.create_card(card_name='test_card', table_name='mySourceTable')  # Setting `verbose=True` will print extra information while creating the card.

Using the column_order parameter we can specify the column order of the created card. Accepted values are 'alphabetical', 'db_table_order' (default), or a list of column names.

mb.create_card(card_name='test_card', table_name='mySourceTable', column_order=['myCol5', 'myCol3', 'myCol8'])

All or some of the function parameters, as well as further information (e.g. visualisation settings), can be provided to the function in a dictionary, using the custom_json parameter (also see the make_json function below).

q = '''
  select *
  from my_table
  where city = '{}'
'''

for city in city_list:

  query = q.format(city)

  # Here we include only the minimum required keys; you can add more.
  my_custom_json = {
    'name': 'test_card',
    'display': 'table',
    'dataset_query': {
      'database': db_id,
      'native': {'query': query},
      'type': 'native'
    }
  }

  # See the function definition for other parameters (e.g. in which collection to save the card)
  mb.create_card(custom_json=my_custom_json)
  • create_collection

Create an empty collection. Provide the name of the collection, and the name or id of the parent collection (i.e. where you want the created collection to reside). If you want to create the collection in the root, you need to provide parent_collection_name='Root'.

mb.create_collection(collection_name='test_collection', parent_collection_id=123)
  • create_segment

Provide the name to be used for creating the segment, the name or id of the table you want to create the segment on, the column of that table to filter on and the filter values.

mb.create_segment(segment_name='test_segment', table_name='user_table', column_name='user_id', column_values=[123, 456, 789])
  • copy_card

At the minimum you need to provide the name/id of the card to copy and the name/id of the collection to copy the card to.

mb.copy_card(source_card_name='test_card', destination_collection_id=123)
  • copy_pulse

Similar to copy_card but for pulses.

mb.copy_pulse(source_pulse_name='test_pulse', destination_collection_id=123)
  • copy_dashboard

You can determine whether you want to deepcopy the dashboard or not (default False).
If you don't deepcopy, the duplicated dashboard will use the same cards as the original dashboard.
When you deepcopy a dashboard, the cards of the original dashboard are duplicated and these cards are used in the duplicate dashboard.
If the destination_dashboard_name parameter is not provided, the destination dashboard name will be the same as the source dashboard name (plus any postfix if provided).
The duplicated cards (in case of deepcopying) are saved in a collection called [destination_dashboard_name]'s cards and placed in the same collection as the duplicated dashboard.

mb.copy_dashboard(source_dashboard_id=123, destination_collection_id=456, deepcopy=True)
  • copy_collection

Copies the given collection and its contents to the given destination_parent_collection (name/id). You can determine whether to deepcopy the dashboards.

mb.copy_collection(source_collection_id=123, destination_parent_collection_id=456, deepcopy_dashboards=True, verbose=True)

You can also specify a postfix to be added to the names of the child items that get copied.

  • clone_card

Similar to copy_card but a different table is used as the source for filters of the card.
This comes in handy when you want to create similar cards with the same filters that differ only on the source of the filters (e.g. cards for 50 US states).

mb.clone_card(card_id=123, source_table_id=456, target_table_id=789, new_card_name='test clone', new_card_collection_id=1)
  • update_column

Update the column in the Data Model by providing the relevant parameters (a list of all parameters can be found here).
For example, to change the column type to 'Category', we can use:

mb.update_column(column_name='myCol', table_name='myTable', params={'semantic_type': 'type/Category'})  # For Metabase versions before v0.39, use params={'special_type': 'type/Category'}
  • search

Searches for Metabase objects and returns basic info.
Provide the search term and optionally item_type to limit the results.

mb.search(q='test', item_type='card')
  • get_card_data

Returns the rows of the card's results.
Provide the card name/id and the data format of the output (csv or json). You can also provide filter values.

results = mb.get_card_data(card_id=123, data_format='csv')
  • make_json

It's very helpful to use the browser's Inspect tool (Network tab) to see what Metabase is doing. You can then use the generated JSON to build your automation. To turn the JSON generated in the browser into a Python dictionary, copy it, paste it inside triple quotes (''' ''') and apply the make_json function:

raw_json = ''' {"name":"test","dataset_query":{"database":165,"query":{"fields":[["field-id",35839],["field-id",35813],["field-id",35829],["field-id",35858],["field-id",35835],["field-id",35803],["field-id",35843],["field-id",35810],["field-id",35826],["field-id",35815],["field-id",35831],["field-id",35827],["field-id",35852],["field-id",35832],["field-id",35863],["field-id",35851],["field-id",35850],["field-id",35864],["field-id",35854],["field-id",35846],["field-id",35811],["field-id",35933],["field-id",35862],["field-id",35833],["field-id",35816]],"source-table":2154},"type":"query"},"display":"table","description":null,"visualization_settings":{"table.column_formatting":[{"columns":["Diff"],"type":"range","colors":["#ED6E6E","white","#84BB4C"],"min_type":"custom","max_type":"custom","min_value":-30,"max_value":30,"operator":"=","value":"","color":"#509EE3","highlight_row":false}],"table.pivot_column":"Sale_Date","table.cell_column":"SKUID"},"archived":false,"enable_embedding":false,"embedding_params":null,"collection_id":183,"collection_position":null,"result_metadata":[{"name":"Sale_Date","display_name":"Sale_Date","base_type":"type/DateTime","fingerprint":{"global":{"distinct-count":1,"nil%":0},"type":{"type/DateTime":{"earliest":"2019-12-28T00:00:00","latest":"2019-12-28T00:00:00"}}},"special_type":null},{"name":"Account_ID","display_name":"Account_ID","base_type":"type/Text","fingerprint":{"global":{"distinct-count":411,"nil%":0},"type":{"type/Text":{"percent-json":0,"percent-url":0,"percent-email":0,"average-length":9}}},"special_type":null},{"name":"Account_Name","display_name":"Account_Name","base_type":"type/Text","fingerprint":{"global":{"distinct-count":410,"nil%":0.0015},"type":{"type/Text":{"percent-json":0,"percent-url":0,"percent-email":0,"average-length":21.2916}}},"special_type":null},{"name":"Account_Type","display_name":"Account_Type","base_type":"type/Text","special_type":"type/Category","fingerprint":{"global":{"distinct-count":5,"nil%":0.0015},"type":{"type/Text":{"percent-json":0,"percent-url":0,"percent-email":0,"average-length":3.7594}}}}],"metadata_checksum":"7XP8bmR1h5f662CFE87tjQ=="} '''
myJson = mb.make_json(raw_json)  # setting 'prettyprint=True' will print the output in a structured format.
mb.create_card('test_card2', table_name='mySourceTable', custom_json={'visualization_settings':myJson['visualization_settings']})
  • move_to_archive

Moves the item (Card, Dashboard, Collection, Pulse, Segment) to the Archive section.

mb.move_to_archive('card', item_id=123)
  • delete_item

Deletes the item (Card, Dashboard, Pulse). Currently Collections and Segments cannot be deleted using the Metabase API.

mb.delete_item('card', item_id=123)

Notes

There are also two other Python wrappers for Metabase API here and here.

metabase_api_python's People

Contributors

dynnammo, etoulas, moralescastillo, tjarksaul, vmercierfr, vvaezian, zbjdonald


metabase_api_python's Issues

Creating cards with description = ''

Copying a card or creating a card using custom_json with description = '' results in a non-informative error. My suggestion is to either add an informative error message, or to add a conditional statement such as: if custom_json['description'] == '' then custom_json['description'] = None.
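The suggested conditional could be sketched like this (normalize_description is a hypothetical helper name, not part of the wrapper):

```python
def normalize_description(custom_json):
    """Replace an empty-string description with None before the card JSON is sent.

    Hypothetical helper illustrating the guard suggested above.
    """
    if custom_json.get('description') == '':
        # copy so the caller's dict is left untouched
        custom_json = {**custom_json, 'description': None}
    return custom_json
```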

Check column existence in create_card function

In the create_card function, first check that all the mentioned columns exist in the database.
There was a case where the function didn't produce any error, but the card failed to open in Metabase.

NameError: name 'null' is not defined in get_card_data if chosen format is json

I am getting an error when trying to use the get_card_data method with the data format 'json'.

Traceback (most recent call last):
  File "<input>", line 1, in <module>
  File "/usr/local/lib/python3.9/site-packages/metabase_api/metabase_api.py", line 837, in get_card_data
    return eval(res.text)
  File "<string>", line 2, in <module>
NameError: name 'null' is not defined

It seems eval() does not like 'null'; using json.loads() instead might help.
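The difference can be demonstrated locally with a response body containing a JSON null (the body string here is made up for illustration):

```python
import json

# A hypothetical response body containing a JSON null
res_text = '[{"id": 1, "value": null}]'

# eval(res_text) would raise NameError: name 'null' is not defined,
# because 'null' is not a Python name. json.loads maps null to None:
rows = json.loads(res_text)
```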

Function for changing the data source of a dashboard

For the first version we can assume that the source of all the cards on the dashboard is the same (an initial check needs to be done for this).
The function for changing the data source of a card is applied to each card of the dashboard.
The dashboard filters need to be updated as well.

Make wrapper more maintenance-proof with non-breaking refactor

Currently, the module is built quite simply, with REST functions at the beginning of the metabase-api and helper functions that help developers perform the most common actions on a Metabase instance: creating cards, modifying them, getting information.

In order to have a more complete wrapper that embraces all the possibilities of the Metabase API while maintaining its simplicity, it could be relevant to do some non-breaking refactoring:

metabase_api_python # we keep the upper level unchanged
    README.md
    LICENSE
    ...
    metabase_api_python # here come the changes
        src
            rest_functions.py # contains the REST functions needed by all endpoints
            utils.py # get_item_id, get_item_info etc. go here
            endpoints # relevant helper functions relating to a specific endpoint
                cards.py
                dashboards.py
                collections.py
                ... # other endpoints not currently implemented
        metabase_api.py # keeps the Metabase_API class, which imports all functions from the src folder

Funded by
Open Source Politics

Missing step in the clone_card function

Hello,

I found that the function clone_card doesn't change the column_id to target_col_id for 'aggregation' part of the query here

My card_info has the following part

'dataset_query': {'database': 5,
                   'query': {'aggregation': [['distinct',
                                              ['field', 577, None]]],
                             'filter': ['and',
                                        ['=', ['field', 576, None], 'type'],
                                        ['=',
                                         ['field', 567, None],
                                         'approved']],
                             'source-table': 103},
                   'type': 'query'},

I assume we have to iterate over the aggregation part as well.

Metabase v0.43.0
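One way to cover the missing step is to walk the whole dataset_query recursively rather than rewriting only the filter clause; remap_fields below is an illustrative sketch, not the wrapper's actual code, and the id mapping is hypothetical:

```python
def remap_fields(node, id_map):
    """Recursively remap ['field', <id>, ...] references in an MBQL expression.

    Hypothetical helper: because it walks lists and dicts uniformly, the
    'aggregation', 'filter', 'breakout' etc. clauses are all covered.
    """
    if isinstance(node, list):
        if len(node) >= 2 and node[0] == 'field' and isinstance(node[1], int):
            # rewrite the field id, keep any trailing options as-is
            return ['field', id_map.get(node[1], node[1])] + list(node[2:])
        return [remap_fields(x, id_map) for x in node]
    if isinstance(node, dict):
        return {k: remap_fields(v, id_map) for k, v in node.items()}
    return node

# The query from the issue above, with a made-up id mapping:
query = {'aggregation': [['distinct', ['field', 577, None]]],
         'filter': ['and', ['=', ['field', 576, None], 'type']],
         'source-table': 103}
remapped = remap_fields(query, {577: 1577, 576: 1576})
```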

Update get_item_id and get_item_name

As a developer, I want to get items from a Metabase instance only knowing their names.

Currently, get_item_id and get_item_name aren't fully working with objects like database or collection.

Purpose

Complete those auxiliary functions with support for further object types: dashboard, collection and pulse

Funding

Funded by OpenSourcePolitics

Deepcopying dashboards may fail with 'None' card ids

Hello, I found your project on the Metabase discourse and I'm already using it in my company's deployment pipelines, thank you!

I had to fix an issue I found when deepcopying dashboards: for some reason the Metabase API returned cards whose card ids were None. I've added some None checks in metabase_api.py around lines 530-560.

I've forked your project with a fix for this case; if you're interested I could open a PR.

Using VCR for testing coverage

Context
Currently, the testing suite is based on a database setup private to @vvaezian. When adding new endpoints or helper functions, we can't run the suite locally (or it implies a heavy setup) to be sure that they accomplish what we want.

Feature
The Ruby wrapper of Metabase uses a gem called VCR that has an interesting way of working: API calls made with the wrapper against a Metabase instance are recorded in YAML / JSON files. When running the tests, instead of hitting a Metabase instance, the previously recorded API calls are used to check the results.

While it may seem like a lot of work, the quoted wrapper currently has 34 recorded "cassettes" that cover most of the endpoints currently implemented in metabase_api_python.

Roadmap

  • Add VCRpy and pytest-recording to the project
  • Create all necessary cassettes for the current tests
  • Refactor the tests to fit in
  • Add a GitHub Action that runs the tests on PRs and the master branch

authentication triggers an error simplejson.errors.JSONDecodeError

I installed the package and I tried to authenticate with both the command line and a script, yet it's not working. When I run this code

from metabase_api import Metabase_API

Metabase_API("http://localhost:4009/","[email protected]","test123")

I'm getting the following error in my terminal:

Traceback (most recent call last):
  File "mbmap.py", line 59, in <module>
    Metabase_API("http://localhost:4009/","[email protected]","test123")
  File "/home/test/.local/lib/python3.8/site-packages/metabase_api/metabase_api.py", line 13, in __init__
    self.authenticate()
  File "/home/test/.local/lib/python3.8/site-packages/metabase_api/metabase_api.py", line 25, in authenticate
    self.session_id = res.json()['id']
  File "/usr/lib/python3/dist-packages/requests/models.py", line 897, in json
    return complexjson.loads(self.text, **kwargs)
  File "/usr/lib/python3/dist-packages/simplejson/__init__.py", line 518, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 370, in decode
    obj, end = self.raw_decode(s)
  File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 400, in raw_decode
    return self.scan_once(s, idx=_w(s, idx).end())
simplejson.errors.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

KeyError: 'sizeX'

When running the mb.copy_collection command:

Copying the dashboard "Report" ...
Traceback (most recent call last):
  File "/root/./copy_metabase_reports.py", line 5, in <module>
    mb.copy_collection(source_collection_id=5, destination_parent_collection_id=11, deepcopy_dashboards=True, verbose=True)
  File "/usr/local/lib/python3.10/dist-packages/metabase_api/metabase_api.py", line 931, in copy_collection
    self.copy_dashboard(source_dashboard_id=dashboard_id,
  File "/usr/local/lib/python3.10/dist-packages/metabase_api/metabase_api.py", line 838, in copy_dashboard
    card_json[prop] = card[prop]
KeyError: 'sizeX'

{ "browser-info": { "language": "en-GB", "platform": "Win32", "userAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36", "vendor": "Google Inc." }, "system-info": { "file.encoding": "UTF-8", "java.runtime.name": "OpenJDK Runtime Environment", "java.runtime.version": "11.0.17+8", "java.vendor": "Eclipse Adoptium", "java.vendor.url": "https://adoptium.net/", "java.version": "11.0.17", "java.vm.name": "OpenJDK 64-Bit Server VM", "java.vm.version": "11.0.17+8", "os.name": "Linux", "os.version": "5.15.0-56-generic", "user.language": "en", "user.timezone": "GMT" }, "metabase-info": { "databases": [ "h2", "mysql" ], "hosting-env": "unknown", "application-database": "mysql", "application-database-details": { "database": { "name": "MySQL", "version": "8.0.31-0ubuntu0.22.04.1" }, "jdbc-driver": { "name": "MariaDB Connector/J", "version": "2.7.6" } }, "run-mode": "prod", "version": { "date": "2022-12-07", "tag": "v0.45.1", "branch": "release-x.45.x", "hash": "019d31c" }, "settings": { "report-timezone": null } } }
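A likely cause, judging from the traceback, is a key rename between Metabase versions (camelCase sizeX/sizeY vs snake_case size_x/size_y). A defensive workaround could be sketched as follows; the rename itself is an assumption, not confirmed against release notes, and normalize_size_keys is a hypothetical helper:

```python
def normalize_size_keys(card):
    """Make both camelCase and snake_case dashcard size keys available.

    Hypothetical workaround: the sizeX/size_x rename is an assumption
    based on the KeyError in the traceback above.
    """
    card = dict(card)  # copy so the original response dict is untouched
    for snake, camel in (('size_x', 'sizeX'), ('size_y', 'sizeY')):
        if snake in card and camel not in card:
            card[camel] = card[snake]
        elif camel in card and snake not in card:
            card[snake] = card[camel]
    return card
```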

Unable to use get_columns_name_id as a non-superuser

In the get_columns_name_id function, friendly_names_is_disabled is checked before we are allowed to use the function, as seen in https://github.com/vvaezian/metabase_api_python/blob/master/metabase_api/metabase_api.py#L339. However, the friendly_names_is_disabled function does a self.get('/api/setting'), as seen in https://github.com/vvaezian/metabase_api_python/blob/master/metabase_api/metabase_api.py#L372. This causes a problem because, according to https://github.com/metabase/metabase/blob/master/docs/api-documentation.md#setting, GET /api/setting/ is only available to superusers. So even when "Friendly Table and Field Names" has already been disabled by the admin, a non-superuser still can't use get_columns_name_id, because self.get('/api/setting') returns False for them. Is it possible to consider the scenario of a non-superuser?

create_collection creates collections with names already in use

The create_collection method does not check whether a collection with that name already exists.
I only noticed this when using the get_item_info method to get the details of the collection I had created, and it returned "ValueError: There is more than one collection with the name".

Changing the database with the clone_card function

I'd love to be able to change the underlying database with the clone_card function. The use case I have in mind is having multiple environments which all have the same question/card but different databases.

I've imagined something like this:

mb.clone_card(card_id=2, source_table_id=16, target_table_id=69, target_database_id=1, new_card_name='Cloned Card', new_card_collection_id=6, new_database_id=2, ignore_these_filters=[])

As far as I can tell, only the database_id and dataset_query.database properties of the card have to be changed.
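Assuming only those two properties need to change, the rewrite could be sketched like this (retarget_card_json is a hypothetical helper, not an existing wrapper function):

```python
def retarget_card_json(card_json, new_database_id):
    """Point a card's JSON at a different database id.

    Hypothetical sketch: updates only database_id and
    dataset_query.database, per the observation above.
    """
    card_json = dict(card_json)  # shallow copy; caller's dict untouched
    card_json['database_id'] = new_database_id
    dataset_query = dict(card_json.get('dataset_query', {}))
    dataset_query['database'] = new_database_id
    card_json['dataset_query'] = dataset_query
    return card_json
```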

Copy collection to root collection does not work

For my product I needed to copy my collection from root to root. I couldn't do it unless I added a ternary operator at line 1005 of the metabase_api.py file. I used the parameter value "Root", as the create_collection function suggests.
(screenshot of the suggested change omitted)

deepcopy collections give error: if item['model'] == 'collection': TypeError: string indices must be integers

Traceback (most recent call last):
  File "metabase.py", line 42, in <module>
    mb.copy_collection(source_collection_id=17, destination_parent_collection_id=8, destination_collection_name='Vlaanderen_copy', deepcopy_dashboards=True, verbose=True)
  File "C:\Users\janwa\AppData\Local\Programs\Python\Python37\lib\site-packages\metabase_api\metabase_api.py", line 761, in copy_collection
    if item['model'] == 'collection':
TypeError: string indices must be integers
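One plausible cause (an assumption, not confirmed): iterating over the collection-items response yields string keys when the API returns a dict wrapper such as {'data': [...]} instead of a bare list. A defensive sketch with a hypothetical helper:

```python
def unwrap_collection_items(res):
    """Return the item list whether the API response is a bare list
    or a {'data': [...]} wrapper.

    Hypothetical helper; the wrapper shape is an assumption based on
    the TypeError above.
    """
    if isinstance(res, dict) and 'data' in res:
        return res['data']
    return res
```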

Add card to dashboard custom function

As a business analyst, I would like to have many other custom functions, for instance add_card_to_dashboard, to add a specific card to a specific dashboard.

KeyError: 'Native' for Clone_Card

When running clone_card the following error is generated:

filters_data = card_info['dataset_query']['native']['template-tags']
KeyError: 'native'

Metabase Version: 0.41.2

Function 'move_to_archive()' returning SyntaxError

Hi!
I am experiencing the following problem:

The code I am trying to execute is:
from metabase_api import Metabase_API

Error message:
File "/usr/local/lib/python3.6/dist-packages/metabase_api/__init__.py", line 1, in <module>
from metabase_api import Metabase_API
File "/usr/local/lib/python3.6/dist-packages/metabase_api/metabase_api.py", line 705
self.verbose_print(verbose, 'Successfully Archived.') if res == 202 else print('Archiving Failed.')
^
SyntaxError: invalid syntax

I can't use the other functions because of this error. I commented out the function (move_to_archive()), saved (using admin permissions), and now it's working.

Are there any prerequisites for the move_to_archive() function to work correctly?
