
googlecloudplatform / explainable_ai_sdk

This is an SDK for Google Cloud Explainable AI service. Explainable AI SDK helps users build explanation metadata for their models and visualize feature attributions returned from the model.

Home Page: https://cloud.google.com/explainable-ai

License: Apache License 2.0


explainable_ai_sdk's Introduction

Explainable AI SDK

This is a Python SDK for Google Cloud Explainable AI, an explanation service that provides insight into machine learning models deployed on AI Platform. The Explainable AI SDK helps to visualize explanation results, and to define explanation metadata for the explanation service.

Explanation metadata tells the explanation service which of your model's inputs and outputs to use for your explanation request. The SDK has metadata builders that help you to build and save an explanation metadata file before you deploy your model to AI Platform.
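To make the metadata format concrete, here is a minimal sketch (plain Python, not the SDK's builder API) of the kind of JSON structure an explanation metadata file contains. The feature and tensor names below are hypothetical; in practice the metadata builders generate this file for you.

```python
import json

# Hypothetical explanation metadata for a model with one tabular input
# and one output tensor. The tensor names are made up for illustration;
# the SDK's metadata builders derive the real names from your model.
metadata = {
    "inputs": {
        "age": {"input_tensor_name": "dense_input:0"},
    },
    "outputs": {
        "probability": {"output_tensor_name": "dense_2/Sigmoid:0"},
    },
    "framework": "tensorflow",
}

print(json.dumps(metadata, indent=2))
```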

The Explainable AI SDK also helps you to visualize feature attribution results on models deployed to AI Platform.

Installation

The Explainable AI SDK supports models built with:

  • Python 3.7 and later
  • TensorFlow 1.15 or TensorFlow 2.x

The Explainable AI SDK is preinstalled on Google Cloud AI Platform Notebooks images.

For other platforms:

  1. Make sure that you have the Cloud SDK installed. The Explainable AI SDK requires a shell environment with the Cloud SDK in order to communicate with Cloud AI Platform.

  2. Install the Explainable AI SDK:

    pip install explainable-ai-sdk

Metadata Builders

After you build your model, you use a metadata builder to create your explanation metadata. This produces a JSON file that is used for model deployment on AI Platform.

There are different metadata builders for TensorFlow 1.x and 2.x in their respective folders.

TensorFlow 2.x

For TensorFlow 2.x, there is a single metadata builder that takes a SavedModel and uploads both your model and your metadata file to Cloud Storage.

For example:

from explainable_ai_sdk.metadata.tf.v2 import SavedModelMetadataBuilder

model_path = 'gs://my_bucket/saved_model'  # Path to your SavedModel directory.
builder = SavedModelMetadataBuilder(model_path)
builder.save_model_with_metadata('gs://my_bucket/model')  # Save the model and the metadata.

TensorFlow 1.x

For TensorFlow 1.x, the Explainable AI SDK supports models built with Keras, Estimator, and the low-level TensorFlow API. There is a separate metadata builder for each of these three TensorFlow APIs. Example usage for a Keras model:

from tensorflow import keras
from explainable_ai_sdk.metadata.tf.v1 import KerasGraphMetadataBuilder

my_model = keras.models.Sequential()
my_model.add(keras.layers.Dense(32, activation='relu', input_dim=10))
my_model.add(keras.layers.Dense(32, activation='relu'))
my_model.add(keras.layers.Dense(1, activation='sigmoid'))
builder = KerasGraphMetadataBuilder(my_model)
builder.save_model_with_metadata('gs://my_bucket/model')  # Save the model and the metadata.

For examples using the Estimator and TensorFlow Core builders, refer to the v1 README file.

Making Predict and Explain Calls

The Explainable AI SDK includes a model interface to help you communicate with the deployed model more easily. With this interface, you can call predict() and explain() functions to get predictions and explanations for the provided data points, respectively.

Here is an example snippet for using the model interface:

import explainable_ai_sdk

project_id = "example_project"
model_name = "example_model"
version_name = "v1"

m = explainable_ai_sdk.load_model_from_ai_platform(project_id, model_name, version_name)
instances = []

# ... steps for preparing instances

predictions = m.predict(instances)
explanations = m.explain(instances)

Explanation, Attribution, and Visualization

The explain() function returns a list of Explanation objects -- one Explanation per input instance. This object makes it easier to interact with returned attributions. You can use the Explanation object to get feature importance and raw attributions, and to visualize attributions.

Note: Currently, the feature_importance() and as_tensors() functions work only on tabular models, due to the limited payload size. We are working on making both functions available for image models.

Get feature importance

The feature_importance() function returns the importance of each feature based on feature attributions. Note that if a feature has more than one dimension, its importance is calculated by aggregating the attributions across those dimensions.

explanations[0].feature_importance()
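To illustrate the aggregation described above, here is a plain-Python sketch (not the SDK's implementation; feature names and values are made up) in which a multi-dimensional feature's per-dimension attributions are summed into a single importance score:

```python
# Illustrative sketch of attribution aggregation -- not the SDK's code.
# Scalar features keep their attribution as-is; multi-dimensional
# features have their per-dimension attributions summed into one score.
def aggregate_importance(raw_attributions):
    importance = {}
    for feature, values in raw_attributions.items():
        if isinstance(values, (int, float)):
            importance[feature] = values
        else:
            importance[feature] = sum(values)
    return importance

# Hypothetical raw attributions: one scalar feature, one 3-d feature.
attrs = {"age": 0.12, "embedding": [0.05, -0.02, 0.04]}
importance = aggregate_importance(attrs)
```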

Get raw attributions

To get feature attributions over each dimension, use the as_tensors() function to return the raw attributions as tensors.

explanations[0].as_tensors()

Visualize attributions

The Explanation class allows you to visualize feature attributions directly. For both image and tabular models, you can call visualize_attributions() to see feature attributions.

explanations[0].visualize_attributions()

Here is an example visualization:

An attribution visualization for a tabular model

Caveats

  • This library works with (and depends on) either major version of TensorFlow.
  • Do not import the metadata/tf/v1 and metadata/tf/v2 folders in the same Python runtime. If you do, there may be unintended side effects of mixing TensorFlow 1.x and 2.x behavior.

Explainable AI documentation

For more information about Explainable AI, refer to the Explainable AI documentation.

License

All files in this repository are under the Apache License, Version 2.0 unless noted otherwise.

Note: We are not accepting contributions at this time.

explainable_ai_sdk's People

Contributors: besimav, nanchenchen

explainable_ai_sdk's Issues

.explain() fails with Endpoint doesn't have traffic_split precondition error

When getting explanations for a Vertex AI model, I get the following error message:

{
  "error": {
    "code": 400,
    "message": "Endpoint projects/sascha-playground-doit/locations/us-central1/endpoints/519294943752093696 doesn't have traffic_split.",
    "status": "FAILED_PRECONDITION"
  }
}

Model: Vertex AI AutoML Image Classification
Method: XRAY

Code to reproduce

import explainable_ai_sdk
from base64 import b64encode
import tensorflow as tf

instances = []

img_bytes = tf.io.read_file('/content/cast_def_0_1104.jpeg')
b64str = b64encode(img_bytes.numpy()).decode('utf-8')
instances.append({"content": {'b64': b64str}})

remote_model = explainable_ai_sdk.load_model_from_vertex(PROJECT, REGION, ENDPOINT_ID)
#predictions = remote_ig_model.predict(instances)
response = remote_model.explain(instances)

This precondition limits us in multiple ways:

  1. Sometimes only one version is deployed, so enabling traffic splitting is not an option.
  2. Even when there are multiple versions, we usually enable traffic splitting only for a short time; as soon as we are sure the new model performs well in production, we switch 100% of the traffic to it.

Looking forward to your feedback

Best regards
Sascha

explainable_ai_sdk.load_model_from_ai_platform errors with "google.auth.credentials' has no attribute 'CredentialsWithQuotaProject"

Following the notebook (just using different data) at GoogleCloudPlatform/ai-platform-samples/blob/master/notebooks/samples/explanations/tf2/ai-explanations-tabular.ipynb,
using TF 2.2 and the 1.0.5 release of explainable_ai_sdk, with Python 3.7 on an AI Platform Notebook instance.

When calling load_model_from_ai_platform(proj, model, version), the following error is raised:
AttributeError: module 'google.auth.credentials' has no attribute 'CredentialsWithQuotaProject'

Not sure if this is due to a mistake I have made or a package dependency.
thanks,
jim

Here is the full error message:


AttributeError Traceback (most recent call last)
in
2 PROJECT_ID = 'jwtfxpipe'
3 print('model {} version {}'.format(MODEL,VERSION))
----> 4 remote_ig_model = explainable_ai_sdk.load_model_from_ai_platform(PROJECT_ID, MODEL, VERSION)
5 ig_response = remote_ig_model.explain([prediction_json])
6 print('OK@',time.asctime())

/opt/conda/lib/python3.7/site-packages/explainable_ai_sdk/model/model_factory.py in load_model_from_ai_platform(project, model, version, credentials)
56 if version:
57 endpoint = os.path.join(endpoint, 'versions', version)
---> 58 return _MODEL_REGISTRY[_REMOTE_MODEL_KEY](endpoint, credentials)
59
60

/opt/conda/lib/python3.7/site-packages/explainable_ai_sdk/model/ai_platform_model.py in init(self, endpoint, credentials)
48 self._credentials = credentials
49 self._endpoint = endpoint
---> 50 self._explanation_metadata = self._get_explanation_metadata()
51 self._modality_input_list_map = utils.get_modality_input_list_map(
52 self._explanation_metadata)

/opt/conda/lib/python3.7/site-packages/explainable_ai_sdk/model/ai_platform_model.py in _get_explanation_metadata(self)
103
104 """
--> 105 explanation_md_uri = self._get_explanation_metadata_uri()
106
107 md = explain_metadata.ExplainMetadata.from_file(explanation_md_uri)

/opt/conda/lib/python3.7/site-packages/explainable_ai_sdk/model/ai_platform_model.py in _get_explanation_metadata_uri(self)
83 gcs bucket uri.
84 """
---> 85 gcs_uri = self._get_deployment_uri()
86 match = re.search('gs://(?P<bucket_name>[^/])[/](?P.*)',
87 gcs_uri)

/opt/conda/lib/python3.7/site-packages/explainable_ai_sdk/model/ai_platform_model.py in _get_deployment_uri(self)
63 """
64 response = http_utils.make_get_request_to_ai_platform(
---> 65 self._endpoint, self._credentials)
66
67 if 'deploymentUri' not in response:

/opt/conda/lib/python3.7/site-packages/explainable_ai_sdk/model/http_utils.py in make_get_request_to_ai_platform(uri_params_str, credentials, timeout_ms)
93 Request results in json format.
94 """
---> 95 headers = _get_request_header(credentials)
96 ai_platform_endpoint = (
97 os.getenv('CLOUDSDK_API_ENDPOINT_OVERRIDES_ML') or

/opt/conda/lib/python3.7/site-packages/explainable_ai_sdk/model/http_utils.py in _get_request_header(credentials)
40 # If credentials is not given, use the default credentials
41 if credentials is None:
---> 42 credentials, _ = google.auth.default()
43
44 # Refresh credentials in case it has been expired.

/opt/conda/lib/python3.7/site-packages/google/auth/_default.py in default(scopes, request)
306 scopes (Sequence[str]): The list of scopes for the credentials. If
307 specified, the credentials will automatically be scoped if
--> 308 necessary.
309 request (google.auth.transport.Request): An object used to make
310 HTTP requests. This is used to detect whether the application

/opt/conda/lib/python3.7/site-packages/google/auth/_default.py in _get_gae_credentials()
178 explicit_file = os.environ.get(environment_vars.CREDENTIALS)
179
--> 180 _LOGGER.debug(
181 "Checking %s for explicit credentials as part of auth process...", explicit_file
182 )

/opt/conda/lib/python3.7/site-packages/google/auth/app_engine.py in
79
80 class Credentials(
---> 81 credentials.Scoped, credentials.Signing, credentials.CredentialsWithQuotaProject
82 ):
83 """App Engine standard environment credentials.

AttributeError: module 'google.auth.credentials' has no attribute 'CredentialsWithQuotaProject'
