
Ontology Access Kit: A python library and command line application for working with ontologies

Home Page: https://incatools.github.io/ontology-access-kit/

License: Apache License 2.0


ontology-access-kit's Introduction

Ontology Access Kit (OAK)

A Python library and command line tool for common ontology operations over a variety of backends.


OAK provides a collection of interfaces for various ontology operations, including:

  • look up basic features of an ontology element, such as its label, definition, relationships, or aliases
  • search an ontology for a term
  • validate an ontology
  • modify or delete terms
  • generate and visualize subgraphs
  • identify lexical matches and export as SSSOM mapping tables
  • perform more advanced operations, such as graph traversal, OWL axiom processing, or text annotation

These interfaces are separated from any particular backend, for which there are a number of different adapters. This means the same Python API and command line can be used regardless of whether the ontology:

  • is served by a remote API such as OLS or BioPortal
  • is present locally on the filesystem in owl, obo, obojson, or sqlite formats
  • is to be downloaded from a remote repository such as the OBO library
  • is queried from a remote database, including SPARQL endpoints (Ontobee/Ubergraph), a SQL database, or a Solr/ES endpoint
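The adapter is chosen with a selector string of the general form `scheme:slug` (e.g. `sqlite:obo:cl`, `obolibrary:pato.obo`, `ubergraph:`, as used in the examples below). A minimal sketch of splitting such a selector, for illustration only; OAK's real resolution logic is more involved:

```python
def split_selector(selector: str) -> tuple:
    """Split an OAK-style selector string into (scheme, slug).

    Illustrative only; OAK's real adapter resolution is more involved.
    """
    scheme, _, slug = selector.partition(":")
    return scheme, slug

# The slug may itself contain colons, as in "sqlite:obo:cl"
assert split_selector("sqlite:obo:cl") == ("sqlite", "obo:cl")
assert split_selector("ubergraph:") == ("ubergraph", "")
```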

Documentation: https://incatools.github.io/ontology-access-kit/

Contributing

See the contribution guidelines at CONTRIBUTING.md. All contributors are expected to uphold our Code of Conduct.

Usage

from oaklib import get_adapter

# connect to the CL sqlite database adapter
# (will first download if not already downloaded)
adapter = get_adapter("sqlite:obo:cl")

NEURON = "CL:0000540"

print('## Basic info')
print(f'ID: {NEURON}')
print(f'Label: {adapter.label(NEURON)}')

for alias in adapter.entity_aliases(NEURON):
    print(f'Alias: {alias}')

print('## Relationships (direct)')
for relationship in adapter.relationships([NEURON]):
    print(f' * {relationship.predicate} -> {relationship.object} "{adapter.label(relationship.object)}"')
    
print('## Ancestors (over IS_A and PART_OF)')
from oaklib.datamodels.vocabulary import IS_A, PART_OF
from oaklib.interfaces import OboGraphInterface

if not isinstance(adapter, OboGraphInterface):
    raise ValueError('This adapter does not support graph operations')

for ancestor in adapter.ancestors(NEURON, predicates=[IS_A, PART_OF]):
    print(f' * ANCESTOR: "{adapter.label(ancestor)}"')

For more examples, see the documentation.

Command Line

See:

Search

Use the pronto backend to fetch and parse an ontology from the OBO library, then use the search command:

runoak -i obolibrary:pato.obo search osmol 

Returns:

PATO:0001655 ! osmolarity
PATO:0001656 ! decreased osmolarity
PATO:0001657 ! increased osmolarity
PATO:0002027 ! osmolality
PATO:0002028 ! decreased osmolality
PATO:0002029 ! increased osmolality
PATO:0045034 ! normal osmolality
PATO:0045035 ! normal osmolarity

QC and Validation

Perform validation on PR using a sqlite/rdftab instance:

runoak -i sqlite:../semantic-sql/db/pr.db validate

List all terms

List all terms that obolibrary has for mondo:

runoak -i obolibrary:mondo.obo terms 

Lexical index

Make a lexical index of all terms in Mondo:

runoak  -i obolibrary:mondo.obo lexmatch -L mondo.index.yaml

Search

Searching over OBO using ontobee:

runoak  -i ontobee: search tentacle

yields:

http://purl.obolibrary.org/obo/CEPH_0000256 ! tentacle
http://purl.obolibrary.org/obo/CEPH_0000257 ! tentacle absence
http://purl.obolibrary.org/obo/CEPH_0000258 ! tentacle pad
...

Searching over a broader set of ontologies in BioPortal (requires an API key; see https://www.bioontology.org/wiki/BioPortal_Help#Getting_an_API_key):

runoak set-apikey bioportal YOUR-KEY-HERE
runoak  -i bioportal: search tentacle

yields:

BTO:0001357 ! tentacle
http://purl.jp/bio/4/id/200906071014668510 ! tentacle
CEPH:0000256 ! tentacle
http://www.projecthalo.com/aura#Tentacle ! Tentacle
CEPH:0000256 ! tentacle
...

Alternatively, you can add "BIOPORTAL_API_KEY" to your environment variables.
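A common pattern is to fall back to the environment variable when no key has been stored. A hedged sketch of that fallback (the variable name `BIOPORTAL_API_KEY` is from the text above; the helper itself is hypothetical, and OAK's own key handling via `runoak set-apikey` may differ):

```python
import os

def resolve_bioportal_key(stored_key=None):
    """Return the stored key if present, else fall back to the environment.

    Hypothetical helper, not OAK's actual implementation.
    """
    key = stored_key or os.environ.get("BIOPORTAL_API_KEY")
    if not key:
        raise RuntimeError("No BioPortal API key configured")
    return key
```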

Searching over a more limited set of ontologies in Ubergraph:

runoak -v -i ubergraph: search tentacle

yields:

UBERON:0013206 ! nasal tentacle

Annotating Texts

runoak  -i bioportal: annotate neuron from CA4 region of hippocampus of mouse

yields:

object_id: CL:0000540
object_label: neuron
object_source: https://data.bioontology.org/ontologies/NIFDYS
match_type: PREF
subject_start: 1
subject_end: 6
subject_label: NEURON

object_id: http://www.co-ode.org/ontologies/galen#Neuron
object_label: Neuron
object_source: https://data.bioontology.org/ontologies/GALEN
match_type: PREF
subject_start: 1
subject_end: 6
subject_label: NEURON

...

Mapping

Create an SSSOM mapping file for a set of ontologies:

robot merge -I http://purl.obolibrary.org/obo/hp.owl -I http://purl.obolibrary.org/obo/mp.owl convert --check false -o hp-mp.obo
runoak lexmatch -i hp-mp.obo -o hp-mp.sssom.tsv
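The output is a TSV whose column names follow the SSSOM specification. A minimal sketch of reading such a file with the standard library (the column names come from the SSSOM spec; the data row is an invented example):

```python
import csv
import io

# A minimal SSSOM-style TSV; the data row is invented for illustration
sssom_tsv = (
    "subject_id\tpredicate_id\tobject_id\tmapping_justification\n"
    "HP:0000001\tskos:exactMatch\tMP:0000001\tsemapv:LexicalMatching\n"
)

# csv.DictReader maps each row to a dict keyed by the header line
rows = list(csv.DictReader(io.StringIO(sssom_tsv), delimiter="\t"))
assert rows[0]["predicate_id"] == "skos:exactMatch"
```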

Visualization of ancestor graphs

Use the sqlite backend to visualize the graph upwards from 'vacuole' (GO:0005773), using the test ontology sqlite database:

runoak -i sqlite:tests/input/go-nucleus.db  viz GO:0005773


The same using ubergraph, restricting to is-a and part-of:

runoak -i ubergraph:  viz GO:0005773 -p i,BFO:0000050

The same using pronto, fetching the ontology from obolibrary:

runoak -i obolibrary:go.obo  viz GO:0005773

Configuration

OAK uses pystow for caching downloaded ontologies. By default, these go inside ~/.data/, but the location can be configured.
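pystow's root directory can be overridden with the `PYSTOW_HOME` environment variable. A simplified sketch of the default-resolution logic (see the pystow documentation for the full precedence rules):

```python
import os
from pathlib import Path

def pystow_root() -> Path:
    """Resolve the cache root: PYSTOW_HOME if set, else ~/.data.

    A simplified sketch of pystow's behavior, not its actual code.
    """
    env = os.environ.get("PYSTOW_HOME")
    return Path(env) if env else Path.home() / ".data"
```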

ontology-access-kit's People

Contributors

anitacaron, azankl, caufieldjh, cmungall, cthoyt, dalito, deepakunni3, glass-ships, hrshdhgd, joeflack4, julesjacobsen, justaddcoffee, kevinschaper, kimrutherford, lubianat, manulera, matentzn, nicholsn, nlharris, pkalita-lbl, plbremer, realmarcin, shawntanzk, sierra-moxon, souzadevinicius, sujaypatil96, yarikoptic


ontology-access-kit's Issues

Define internationalization_interface

This issue is to define internationalization requirements (not to implement them yet).

This interface would have methods such as:

  • get_translations(curie, property)

Alternatively, this could just be woven into existing interfaces. Note that all SPARQL implementations have:

    multilingual: bool = None
    preferred_language: LANGUAGE_TAG = field(default_factory=lambda: "en")

This is used to determine the default language if the triplestore supports multiple languages.
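A sketch of what a `get_translations(curie, property)` method could return, backed by a toy in-memory table (the method name comes from the proposal above; the table and its data are invented for illustration):

```python
# Toy in-memory translation table keyed by (curie, property); invented data
TRANSLATIONS = {
    ("CL:0000540", "label"): {"en": "neuron", "de": "Neuron"},
}

def get_translations(curie: str, prop: str) -> dict:
    """Return language-tag -> value mappings for one entity property."""
    return TRANSLATIONS.get((curie, prop), {})

assert get_translations("CL:0000540", "label")["de"] == "Neuron"
```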

High-level interface via the Bioregistry

The bioontologies package has a very high-level interface for getting OBO graphs by prefix, since it can automatically look up the appropriate IRIs via its mappings to OBO Foundry.

import bioontologies

parse_results = bioontologies.get_obograph_by_prefix("go")
go_graph_document = parse_results.graph_document

This could be extended to OAK, and could also let you choose whether the OWL, OBO, or OBO Graph JSON artifact is consumed.

`poetry install` issue: ` • Installing scipy (1.6.1): Failed`

Description

I'm trying to use oaklib with mondo-analysis, but am having difficulty installing it.

I don't think it's available on PyPI yet.

I made a new pyproject.toml:

[tool.poetry]
name = "xxx"
version = "0.0.0"
description = "xxx"
authors = ["xxx"]

readme = "README.md"

[tool.poetry.dependencies]
python = "^3.9"
oaklib = ">0.0.0"

[tool.poetry.dev-dependencies]
pytest = "^5.2"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"

[tool.poetry.scripts]
runoak = "oaklib.cli:main"
omk = "oaklib.omk.omk_cli:main"

[tool.poetry.extras]

I then ran:

python3 -m poetry install

(I was having difficulty with a plain poetry install; too many Python installations on my computer.)

Error messages

python3 -m poetry install
Creating virtualenv xxx-iIwCVzCe-py3.9 in /Users/joeflack4/Library/Caches/pypoetry/virtualenvs
Updating dependencies
Resolving dependencies... (263.2s)

Writing lock file

Package operations: 116 installs, 0 updates, 0 removals

  • Installing six (1.16.0)
  • Installing hbreader (0.9.1)
  • Installing isodate (0.6.1)
  • Installing pyparsing (2.4.7)
  • Installing antlr4-python3-runtime (4.9.3)
  • Installing certifi (2022.5.18)
  • Installing charset-normalizer (2.0.12)
  • Installing idna (3.3)
  • Installing jsonasobj (2.0.1)
  • Installing rdflib (6.1.1)
  • Installing urllib3 (1.26.9)
  • Installing attrs (21.4.0)
  • Installing click (8.1.3)
  • Installing markupsafe (2.1.1)
  • Installing mdurl (0.1.1)
  • Installing pyjsg (0.11.10)
  • Installing pyrsistent (0.18.1)
  • Installing pytz (2022.1)
  • Installing pyyaml (6.0)
  • Installing rdflib-jsonld (0.6.1)
  • Installing requests (2.27.1)
  • Installing wrapt (1.14.1)
  • Installing zipp (3.8.0)
  • Installing alabaster (0.7.12)
  • Installing babel (2.10.1)
  • Installing chardet (4.0.0)
  • Installing decorator (5.1.1)
  • Installing deprecated (1.2.13)
  • Installing docutils (0.17.1)
  • Installing frozenlist (1.3.0)
  • Installing imagesize (1.3.0)
  • Installing importlib-metadata (4.11.3)
  • Installing jinja2 (3.1.2)
  • Installing json-flattener (0.1.9)
  • Installing jsonasobj2 (1.0.4)
  • Installing jsonpointer (2.3)
  • Installing jsonschema (4.5.1)
  • Installing markdown-it-py (2.1.0)
  • Installing multidict (6.0.2)
  • Installing packaging (21.3)
  • Installing ply (3.11)
  • Installing prefixcommons (0.1.9)
  • Installing pygments (2.12.0)
  • Installing rdflib-shim (1.0.3)
  • Installing ruamel.yaml.clib (0.2.6)
  • Installing shexjsg (0.8.2)
  • Installing snowballstemmer (2.2.0)
  • Installing sparqlwrapper (2.0.0)
  • Installing sphinxcontrib-applehelp (1.0.2)
  • Installing sphinxcontrib-devhelp (1.0.2)
  • Installing sphinxcontrib-htmlhelp (2.0.0)
  • Installing sphinxcontrib-jsmath (1.0.1)
  • Installing sphinxcontrib-qthelp (1.0.3)
  • Installing sphinxcontrib-serializinghtml (1.1.5)
  • Installing aiosignal (1.2.0)
  • Installing async-timeout (4.0.2)
  • Installing cfgraph (0.2.1)
  • Installing et-xmlfile (1.1.0)
  • Installing greenlet (1.1.2)
  • Installing jsonpatch (1.32)
  • Installing jsonpath-ng (1.5.3)
  • Installing linkml-runtime (1.2.14)
  • Installing mdit-py-plugins (0.3.0)
  • Installing numpy (1.22.3)
  • Installing pyshexc (0.9.1)
  • Installing python-dateutil (2.8.2)
  • Installing ruamel.yaml (0.17.21)
  • Installing sparqlslurper (0.5.1)
  • Installing sphinx (4.5.0)
  • Installing tqdm (4.64.0)
  • Installing typing-extensions (4.2.0)
  • Installing yarl (1.7.2)
  • Installing aiohttp (3.8.1)
  • Installing argparse (1.4.0)
  • Installing commonmark (0.9.1)
  • Installing fastobo (0.11.1)
  • Installing graphviz (0.20)
  • Installing joblib (1.1.0)
  • Installing linkml-dataops (0.1.0)
  • Installing more-click (0.1.1)
  • Installing myst-parser (0.17.2)
  • Installing networkx (2.8.1)
  • Installing openpyxl (3.0.9)
  • Installing pandas (1.4.2)
  • Installing parse (1.19.0)
  • Installing pydantic (1.9.0)
  • Installing pyshex (0.8.1)
  • Installing pystow (0.4.3)
  • Installing scipy (1.6.1): Failed

  RuntimeError

  Invalid hashes (sha256:334a53f51b972b6df99ca2f5b723848472e5f130eed4189420ad85162f41d68e) for scipy (1.6.1) using archive scipy-1.6.1-cp39-cp39-macosx_10_9_x86_64.whl. Expected one of sha256:0c8a51d33556bf70367452d4d601d1742c0e806cd0194785914daf19775f0e67, sha256:0e5b0ccf63155d90da576edd2768b66fb276446c371b73841e3503be1d63fb5d, sha256:2481efbb3740977e3c831edfd0bd9867be26387cacf24eb5e366a6a374d3d00d, sha256:33d6b7df40d197bdd3049d64e8e680227151673465e5d85723b3b8f6b15a6ced, sha256:5da5471aed911fe7e52b86bf9ea32fb55ae93e2f0fac66c32e58897cfb02fa07, sha256:5f331eeed0297232d2e6eea51b54e8278ed8bb10b099f69c44e2558c090d06bf, sha256:5fa9c6530b1661f1370bcd332a1e62ca7881785cc0f80c0d559b636567fab63c, sha256:6725e3fbb47da428794f243864f2297462e9ee448297c93ed1dcbc44335feb78, sha256:68cb4c424112cd4be886b4d979c5497fba190714085f46b8ae67a5e4416c32b4, sha256:794e768cc5f779736593046c9714e0f3a5940bc6dcc1dba885ad64cbfb28e9f0, sha256:83bf7c16245c15bc58ee76c5418e46ea1811edcc2e2b03041b804e46084ab627, sha256:8e403a337749ed40af60e537cc4d4c03febddcc56cd26e774c9b1b600a70d3e4, sha256:a15a1f3fc0abff33e792d6049161b7795909b40b97c6cc2934ed54384017ab76, sha256:a423533c55fec61456dedee7b6ee7dce0bb6bfa395424ea374d25afa262be261, sha256:a5193a098ae9f29af283dcf0041f762601faf2e595c0db1da929875b7570353f, sha256:bd50daf727f7c195e26f27467c85ce653d41df4358a25b32434a50d8870fc519, sha256:c4fceb864890b6168e79b0e714c585dbe2fd4222768ee90bc1aa0f8218691b11, sha256:e79570979ccdc3d165456dd62041d9556fb9733b86b4b6d818af7a0afc15f092, sha256:f46dd15335e8a320b0fb4685f58b7471702234cba8bb3442b69a3e1dc329c345.

  at ~/Library/Python/3.9/lib/python/site-packages/poetry/installation/executor.py:627 in _download_link
      623│                     )
      624│                 )
      625│
      626│             if archive_hashes.isdisjoint(hashes):
    → 627│                 raise RuntimeError(
      628│                     "Invalid hashes ({}) for {} using archive {}. Expected one of {}.".format(
      629│                         ", ".join(sorted(archive_hashes)),
      630│                         package,
      631│                         archive_path.name,

  • Installing sphinx-click (4.0.3)
  • Installing sphinx-rtd-theme (1.0.0)
  • Installing sqlalchemy (1.4.36)
  • Installing watchdog (2.1.8)

Support for reasoning

Strategies:

Note that for static ontologies, when using either

  • sql_implementation
  • ubergraph

we already get pre-reasoned relation-graph results.
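"Pre-reasoned" here means the transitive closure over edges has already been materialized, so ancestor queries become simple lookups. A toy sketch of materializing such a closure for one node (invented identifiers; relation-graph itself does much more, e.g. reasoning over property chains):

```python
# Toy is-a edge list mapping child -> parent (invented identifiers)
EDGES = {"c": "b", "b": "a"}

def ancestors(node: str) -> list:
    """Walk parent links, materializing the transitive closure for one node."""
    result = []
    while node in EDGES:
        node = EDGES[node]
        result.append(node)
    return result

assert ancestors("c") == ["b", "a"]
```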

Search behavior on obo files not as expected

I noticed this first when going through the tutorials and running search via the cli:

runoak --input obolibrary:fbbt.obo search 'wing vein'

What is expected in the tutorial is:

FBbt:00004751 ! wing vein
FBbt:00004754 ! axillary vein
FBbt:00004759 ! wing vein L1
FBbt:00004760 ! wing vein L2

But what is returned by OAK is:

FBbt:00004751 ! wing vein

This behavior is seen on:

  • main (ref 9bd64d9)
  • v0.1.19

After a bit of tinkering, I realized this might be because the default search behavior changed from SearchTermSyntax.REGULAR_EXPRESSION to SearchTermSyntax.STARTS_WITH.
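The difference between the two syntaxes can be reproduced with plain string matching: `STARTS_WITH` only matches labels that begin with the term, while a regular-expression search matches anywhere. A sketch using the labels from the expected output above (the term "vein" is chosen to make the contrast visible, since no label starts with it):

```python
import re

labels = ["wing vein", "axillary vein", "wing vein L1", "wing vein L2"]
term = "vein"

# STARTS_WITH-style matching: anchored at the beginning of the label
starts_with = [x for x in labels if x.startswith(term)]
# REGULAR_EXPRESSION-style matching: anywhere in the label
regex = [x for x in labels if re.search(term, x)]

assert starts_with == []
assert len(regex) == 4
```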

set_label_for_curie not implemented for owl files?

The use case here is adding RDF labels to existing OWL files using lexmatch from OAK. When you do this, there is an error that the set_label_for_curie method is not implemented. Per @cmungall, lexmatch RDF label generation should currently work with the pronto implementation.

runoak -i ../../DLO/deep-learning-ontology/deep-learning-ontology-full.owl -a ../../DLO/deep-learning-ontology/external/ml-ontology-202010021305.owl lexmatch --add-labels
WARNING:root:Using rdflib rdf/xml parser; this behavior may change in future
WARNING:root:Using rdflib rdf/xml parser; this behavior may change in future
WARNING:root:Not a curie: deep-learning-ontology/deep-learning-ontology-full.owl
WARNING:root:Not a curie: deep-learning-ontology/deep-learning-ontology-full.owl
Traceback (most recent call last):
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/bin/runoak", line 8, in <module>
    sys.exit(main())
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/oaklib/cli.py", line 1150, in lexmatch
    add_labels_from_uris(impl)
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/oaklib/utilities/lexical/lexical_indexer.py", line 46, in add_labels_from_uris
    oi.set_label_for_curie(curie, label)
  File "/Users/marcin/Documents/VIMSS/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/oaklib/interfaces/basic_ontology_interface.py", line 203, in set_label_for_curie
    raise NotImplementedError
NotImplementedError

Ensure curies are always curies

Many of the datamodels force the use of CURIEs.

Sometimes applying naive URI-to-CURIE conversion yields things like

uberon/core#BRAIN:NAME:ABV

It is not clear if this is due to some kind of roundtrip through OBO, but either way the code should be robust and always yield CURIEs where these are expected.
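A CURIE has the shape `prefix:local_id`, and strings like `uberon/core#BRAIN:NAME:ABV` fail that shape because the text before the first colon is not a valid prefix. A sketch of a validity check (the regex is a simplification of the W3C CURIE syntax, not OAK's actual validator):

```python
import re

# Simplified CURIE shape: a word-like prefix, a colon, then a local id.
# This is a sketch, deliberately stricter than the full W3C CURIE grammar.
CURIE_RE = re.compile(r"^[A-Za-z_][\w.]*:[\w.\-]+$")

def looks_like_curie(s: str) -> bool:
    return bool(CURIE_RE.match(s))

assert looks_like_curie("CL:0000540")
assert not looks_like_curie("uberon/core#BRAIN:NAME:ABV")
```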

Searches on `cl.obo`, `uberon.obo` fail

I'm following the tutorial, and diverged a bit to try searches in other OBO ontologies.

The following search fails:

$ runoak -i obolibrary:cl.obo search 'epithelial cell of lung'
/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/parsers/_fastobo.py:84: SyntaxWarning: source document contains incomplete creation date: 2021-11-08
  process_clause_typedef(clause, data, self.ont)
/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/parsers/_fastobo.py:84: NotImplementedWarning: cannot process `equivalent_to_chain: attaches_to part_of` macro
  process_clause_typedef(clause, data, self.ont)
Traceback (most recent call last):
  File "/home/harry/oak-tutorial/oak-env/bin/runoak", line 8, in <module>
    sys.exit(main())
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1654, in invoke
    super().invoke(ctx)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/oaklib/cli.py", line 140, in main
    settings.impl = impl_class(resource)
  File "<string>", line 6, in __init__
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/oaklib/implementations/pronto/pronto_implementation.py", line 73, in __post_init__
    ontology = Ontology.from_obo_library(resource.slug)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/ontology.py", line 206, in from_obo_library
    return cls(
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/ontology.py", line 283, in __init__
    cls(self).parse_from(_handle)  # type: ignore
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/parsers/obo.py", line 48, in parse_from
    self.symmetrize_lineage()
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/parsers/base.py", line 84, in symmetrize_lineage
    graphdata.lineage[superentity].sub.add(subentity)
KeyError: 'CARO:0000000'

Looks like a pronto parsing issue?

Searching cl.owl instead produces numerous errors but eventually yields the desired result (CL:0000082 ! epithelial cell of lung)

I get a similar result when running

$ runoak -i obolibrary:uberon.obo search 'epithelial cell'

for Uberon, though in that case searching uberon.owl instead raises an AttributeError:

Traceback (most recent call last):
  File "/home/harry/oak-tutorial/oak-env/bin/runoak", line 8, in <module>
    sys.exit(main())
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1654, in invoke
    super().invoke(ctx)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/oaklib/cli.py", line 140, in main
    settings.impl = impl_class(resource)
  File "<string>", line 6, in __init__
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/oaklib/implementations/pronto/pronto_implementation.py", line 73, in __post_init__
    ontology = Ontology.from_obo_library(resource.slug)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/ontology.py", line 206, in from_obo_library
    return cls(
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/ontology.py", line 283, in __init__
    cls(self).parse_from(_handle)  # type: ignore
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/parsers/rdfxml.py", line 115, in parse_from
    self._extract_annotation_property(prop, curies)
  File "/home/harry/oak-tutorial/oak-env/lib/python3.9/site-packages/pronto/parsers/rdfxml.py", line 668, in _extract_annotation_property
    label = elem.find(_NS["rdfs"]["label"]).text
AttributeError: 'NoneType' object has no attribute 'text'

Add a generate-categories method

Interface:

TBD whether to:

  • add to BasicOntologyInterface, or
  • make a new interface

The input will be a CURIE or list of CURIEs, plus an optional list of preferred systems, drawn from:

  • Biolink
  • COB
  • UMLS semantic types (STY)
  • the OWL vocabulary (e.g. owl:Class)
  • design patterns

These are typically indicated via rdf:type (but not in OWL-DL), dc:conformsTo, or subproperties of rdf:type (e.g. biolink:category).

The results will be an iterator of tuples (curie, system, category_curie).

Implementation:

Not all implementations will be able to resolve this for all systems. If the list of preferred systems is empty, do a best effort; if preferred systems are passed and cannot be satisfied, raise NotImplementedError.

For now, start with Ubergraph: it precomputes Biolink categories, so this should be a simple lookup query.

Out of scope for this issue, but in future:

  • add COB categories to the Ubergraph implementation; depends on INCATools/ubergraph#55
  • use either graph walking or relation-graph (e.g. for the SQL backend, which already loads RG) to compute other categories (related to what @jhc is doing)
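One possible shape for the method, sketched with a toy in-memory lookup standing in for a real backend such as Ubergraph (all names and data below are illustrative, not the actual interface):

```python
from typing import Iterable, Iterator, List, Optional, Tuple

CURIE = str

# Toy lookup standing in for a backend's category data.
_TOY_CATEGORIES = {
    ("CL:0000082", "biolink"): "biolink:Cell",
    ("CL:0000082", "owl"): "owl:Class",
}

def categories(
    curies: Iterable[CURIE],
    preferred_systems: Optional[List[str]] = None,
) -> Iterator[Tuple[CURIE, str, CURIE]]:
    """Yield (curie, system, category_curie) tuples.

    Empty/absent preferred_systems => best effort over all known systems;
    an unsatisfiable requested system => NotImplementedError.
    """
    for curie in curies:
        systems = preferred_systems or [s for (c, s) in _TOY_CATEGORIES if c == curie]
        for system in systems:
            key = (curie, system)
            if key not in _TOY_CATEGORIES:
                if preferred_systems:
                    raise NotImplementedError(f"cannot resolve {system} for {curie}")
                continue
            yield curie, system, _TOY_CATEGORIES[key]
```

Returning an iterator keeps the contract lazy, matching the "use iterators ubiquitously" direction discussed elsewhere in this tracker.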

lexmatch failing for OBO file

This starts from an OBO file (for the Deep Learning Ontology, DLO) generated with robot convert from the original RDF/XML.
Links to the input files are at the bottom of the ticket.

Robot command to generate OBO input:
robot convert --input DLO.xrdf --format owl --output DLO.obo

OAK command:

runoak -i DLO.obo lexmatch -o DLO.ssom.tsv

This gives the following error:

/Users/marcin/Documents/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/pronto/parsers/rdfxml.py:286: SyntaxWarning: <Element '{http://purl.org/dc/elements/1.1/}description' at 0x7fca5a2b2cc0> contains text but no `xsd:datatype`
  meta.annotations.add(self._extract_literal_pv(child))
/Users/marcin/Documents/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/pronto/parsers/rdfxml.py:286: SyntaxWarning: <Element '{http://purl.org/dc/elements/1.1/}title' at 0x7fca5a2b2d60> contains text but no `xsd:datatype`
  meta.annotations.add(self._extract_literal_pv(child))
/Users/marcin/Documents/ontology/OAK/ontology-access-kit/venv/lib/python3.9/site-packages/pronto/ontology.py:283: NotImplementedWarning: cannot process plain `owl:AnnotationProperty`
  cls(self).parse_from(_handle)  # type: ignore

To reproduce, the DLO input files are here:
DLO.xrdf
DLO.obo

SQLImpl() does not download ontology?

from oaklib.resource import OntologyResource
from oaklib.implementations.sqldb.sql_implementation import SqlImplementation
resource = OntologyResource(slug='mondo.obo', local=False)
oi = SqlImplementation(resource)

results in:

ArgumentError: Could not parse rfc1738 URL from string 'mondo.obo'

The same works with the pronto implementation.

extract-triples sometimes leaves OWL axioms incomplete, leading to ErrorN classes in the OWL API

To replicate:

runoak -i ubergraph:uberon extract-triples UBERON:0001463 -o finger1.ttl -O ttl

The TTL looks OK at first, but if we visualize it:

runoak -i finger1.ttl viz UBERON:0001463 -p i,p

we get truncated graphs:

(image)

In Protégé:

(image)

Under manus we have:

    rdfs:subClassOf [ a owl:Restriction ;
            owl:onProperty ns1:RO_0002551 ;
            owl:someValuesFrom ns1:UBERON_0001442 ],
        [ a owl:Restriction ;
            owl:onProperty ns1:RO_0002202 ;
            owl:someValuesFrom ns1:UBERON_0006875 ] ;
    owl:equivalentClass [ ] ;

For some reason the subClassOf axiom isn't extracted, and the equivalentClass axiom points to an empty blank node.

Lexmatch only creates skos:closeMatch?

Is there currently any documentation on the lexmatch code?

I noticed that currently everything in lexmatch gets skos:closeMatch:

DOID:0050140	acute diarrhea	skos:closeMatch	SYMP:0000675	acute diarrhea	Lexical	oaklib	rdfs:label	rdfs:label	acute diarrhea
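One way a lexical matcher could vary the predicate is to condition on which properties produced the match. The heuristic below is an assumption for illustration only, not OAK's actual rule set:

```python
# Illustrative heuristic (an assumption, not lexmatch's actual logic):
# choose a SKOS mapping predicate from the matched properties.
def choose_predicate(subject_prop: str, object_prop: str) -> str:
    if subject_prop == object_prop == "rdfs:label":
        # primary label matched primary label
        return "skos:exactMatch"
    if "rdfs:label" in (subject_prop, object_prop):
        # label matched a synonym on the other side
        return "skos:closeMatch"
    # synonym-to-synonym or weaker evidence
    return "skos:relatedMatch"
```

In the example row above, both match fields are rdfs:label, so under a scheme like this it would be eligible for skos:exactMatch rather than skos:closeMatch.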

Add a notebook demonstrating mappings

in the notebooks/ folder

Rough script of how it should work:

Part 1: fetching existing mappings

Extract an SSSOM file from BioPortal.
Extract an SSSOM file from a downloaded ontology.

Choose an Uberon term like limb as an example.

Part 2: generating mappings

Choose two small ontologies, e.g. ZFA and XAO, and demonstrate using lexmatch to find mappings.

Add a command for structurally comparing two ontologies based on an SSSOM mapping set

Inputs: O1, O2, Mappings

Extract from O1 the set of terms used in the mappings, then perform gap-fill to obtain non-redundant inferred relationships between all mapped terms.

For each relationship, compare it with the matched relationship in O2. Categorize as:

  • identical, asserted
  • identical, inferred on either or both sides
  • different relationship (e.g. switches from is-a to part-of)
  • invalid (may require dynamic reasoning so many implementations may not implement this part)
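The categorization step could be sketched as follows, as a toy over sets of (subject, predicate, object) triples already mapped into O2's identifier space (names are illustrative; real invalidity checking may need dynamic reasoning, per the last bullet):

```python
from enum import Enum

class DiffCategory(str, Enum):
    IDENTICAL_ASSERTED = "identical, asserted"
    IDENTICAL_INFERRED = "identical, inferred"
    DIFFERENT_PREDICATE = "different relationship"
    INVALID = "invalid"

def categorize(rel1, rel2_asserted, rel2_inferred):
    """Categorize one O1 relationship against O2's asserted/inferred edges.

    rel1 is a (subject, predicate, object) triple whose ids have already
    been translated to O2 via the mapping set. "Invalid" is used here as
    a fallback for edges with no counterpart at all; a real check may
    require reasoning.
    """
    if rel1 in rel2_asserted:
        return DiffCategory.IDENTICAL_ASSERTED
    if rel1 in rel2_inferred:
        return DiffCategory.IDENTICAL_INFERRED
    s, _, o = rel1
    if any(s2 == s and o2 == o for (s2, _, o2) in rel2_asserted | rel2_inferred):
        return DiffCategory.DIFFERENT_PREDICATE
    return DiffCategory.INVALID
```

Running this over the gap-filled edge set for each mapped term pair yields one category per relationship, which the command could then tabulate.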

Docs example fails: `runoak validate -i obolibrary:mondo.obo terms`

Description

Hello all. Glad to be finally starting to use OAK.

I'm just playing around with some options, and this first example that I took from the README.md didn't work.

Error

Error: No such option: -i

Log

runoak validate -i obolibrary:mondo.obo terms
Usage: runoak validate [OPTIONS]
Try 'runoak validate --help' for help.

Error: No such option: -i

0.2.0 release things to be decided

  • switch label to lbl #26
  • use iterators ubiquitously?
  • One CLI with many subcommands, or subcommands partitioned across different CLIs
  • package structure
  • CLI: runoak vs claiming oak

Document outputs more in tutorial

Part 1 relies on some understanding of basic Unix CLI philosophy.

Consider adding basic examples of piping and redirection.

Also explain --output and --output-format up front.

Add SOP to PRs

Documentation strategy:
Add an SOP for PRs to always include documentation for every new feature.

review cross-ontology-diff datamodel prior to implementing

Currently the cross-ontology-diff code uses an ad-hoc tuple as the output:

ANALOGOUS_RELATION = Tuple[DIFF_CATEGORY, CURIE, PRED_CURIE, CURIE, List[CURIE]]

current results look like this:

('IDENTICAL', 'EMAPA:16037', 'rdfs:subClassOf', 'EMAPA:36032', [])
('OTHER', 'EMAPA:16041', 'rdfs:subClassOf', 'EMAPA:16039', [])

which is not so useful for review; we want to see labels, plus the mapped concepts.

We would like to move to a documented datamodel that may have applications beyond this software library:

https://incatools.github.io/ontology-access-kit/datamodels/cross-ontology-diff/index.html

Review this prior to implementing
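A record along these lines would carry labels and mapped concepts alongside the raw tuple. The field names below are assumptions for illustration; the linked datamodel page is authoritative:

```python
from dataclasses import dataclass, field
from typing import List, Optional

CURIE = str
PRED_CURIE = str

# Illustrative sketch of a richer result record replacing the ad-hoc
# ANALOGOUS_RELATION tuple; field names are assumptions, not the
# actual cross-ontology-diff datamodel.
@dataclass
class RelationalDiff:
    category: str                      # e.g. "IDENTICAL", "OTHER"
    left_subject_id: CURIE
    left_predicate_id: PRED_CURIE
    left_object_id: CURIE
    left_subject_label: Optional[str] = None
    left_object_label: Optional[str] = None
    right_subject_ids: List[CURIE] = field(default_factory=list)
```

Compared with ('IDENTICAL', 'EMAPA:16037', 'rdfs:subClassOf', 'EMAPA:36032', []), the named fields make reviewed output self-describing and leave room for labels.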

lexmatch should create a legal sssom file

Currently I notice these violations:

  • License and mapping set ids should be PURLs (@hrshdhgd we can use the same defaults as in sssom-py)
  • no CURIE map (we should be able to pass in a metadata parameter, just as in sssom-py)
  • some subject_id/object_id values are not CURIEs, presumably because they are not known without a CURIE map

Implement gap filling for subsets

From @dosumis:

The graph is seeded with 2 terms and an objectProperty, and shows a large set of related terms and their relationships. I want only the relationships between the terms specified, via subClassOf plus some set of objectProperty existential restrictions.

Here's an example from HuBMAP: the seed = a set of terms + BFO:0000050. Only terms in the seed are in the graph. A non-redundant set of part_of and subClassOf relationships is shown.

(image)

I think this is more or less the same as: ontodev/robot#497
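In outline, gap filling over a seed set is entailment restriction plus a transitive-style reduction. This toy sketch only reduces same-predicate chains; a real implementation must also handle property-chain compositions (e.g. is_a over part_of) and reasoning:

```python
def gap_fill(seed, entailed_edges):
    """Toy gap-fill: restrict entailed (s, p, o) edges to the seed set,
    then drop edges implied by a same-predicate chain through another
    seed term, leaving a non-redundant set."""
    # keep only edges whose endpoints are both in the seed
    edges = {(s, p, o) for (s, p, o) in entailed_edges
             if s in seed and o in seed and s != o}
    # an edge is redundant if a two-step same-predicate path covers it
    redundant = set()
    for (s, p, o) in edges:
        for (s2, p2, m) in edges:
            if s2 == s and p2 == p and m != o and (m, p, o) in edges:
                redundant.add((s, p, o))
    return edges - redundant
```

For a seed {finger, hand, limb} with entailed part_of edges finger→hand, hand→limb, and finger→limb, the finger→limb edge is dropped as redundant, matching the "non-redundant set of part_of and subClassOf relationships" described above.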
