
pygeofilter's Introduction

geopython

Vanity package for geopython projects

pip install geopython
>>> import geopython

pygeofilter's People

Contributors

bitner, captaincoordinates, constantinius, drnextgis, github-actions[bot], jburkinshaw, jokiefer, ka7eh, kalxas, mkeller3, mmcfarland, ricardogsilva, tomkralidis, totycro, vincentsarago


pygeofilter's Issues

SqlAlchemy temporal operations not properly handled

Correct me if I'm completely misusing this, but I'm trying to perform a simple "meets" expression as cql-json:

For simplicity my db contains the following datetimes:

1: 2019-06-24T07:41:10Z
2: 2019-06-24T07:41:13Z
3: 2019-06-24T07:41:17Z
4: 2019-06-24T07:41:21Z

And I want to perform a "meets" request with time interval 2019-06-24T07:41:13 - 2019-06-24T07:41:18 as input.
This should result in all rows from the db where the datetime coincides with the beginning of the interval, which in my db is only the second row: 2019-06-24T07:41:13Z

However, when running the query, the expression seems to parse correctly:

from pygeofilter.parsers.cql_json import parse as parse_json
from pygeofilter.backends.sqlalchemy import to_filter
import sqlalchemy as sa

cql_json = "meets": [{ "property": "datetime" }, ["2019-06-24T07:41:13Z", "2019-06-24T07:41:18Z"]]

ast = parse_json(cql_json)
  TimeMeets(lhs=ATTRIBUTE datetime, rhs=Interval(start=datetime.datetime(2019, 6, 24, 7, 41, 13, tzinfo=<StaticTzInfo 'Z'>), end=datetime.datetime(2019, 6, 24, 7, 41, 18, tzinfo=<StaticTzInfo 'Z'>)))

but when it evaluates the ast to a SqlAlchemy expression and queries the db, it returns both rows 2 and 3: 2019-06-24T07:41:13Z and 2019-06-24T07:41:17Z

Inspecting the temporal method of filters.py in the sqlalchemy backend (screenshot omitted), it seems to handle special cases only for the before, after and tequals operations; all other temporal operators end up being evaluated as either between, >=, or <=. My query therefore ends up being evaluated as between(datetime, "2019-06-24T07:41:13Z", "2019-06-24T07:41:18Z"), which explains why row 3 is also returned.
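For what it's worth, the predicate I would expect for a meets comparison is an equality against the interval start rather than a between (a rough sketch of the expected semantics only, using a hypothetical Record model; this is not pygeofilter's current code):

def meets(column, interval_start, interval_end):
    # Allen's "meets": a timestamp, treated as the degenerate interval [t, t],
    # meets the query interval exactly when it coincides with the interval
    # start, so only row 2 (2019-06-24T07:41:13Z) should match.
    return column == interval_start

# hypothetical usage:
# session.query(Record).filter(meets(Record.datetime, start, end))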

On a side note: it seems that you have to specify an interval as input for anything besides before and after, otherwise it crashes (screenshot omitted) with TypeError: cannot unpack non-iterable datetime.datetime object.

From the OAF specification:

CQL supports date and timestamps as time instants, but even the smallest "instant" has a duration and can also be evaluated as an interval.

Hence, to my understanding, you should be able to run "meets", "metby", "begins" etc. on exact datetimes, treating them as intervals, without having to explicitly specify an interval (despite it being nonsensical to do in practice).

Am I completely missing something here or is the temporal method in fact not handling all temporal cases?

cql-text

I see that there is a branch with a base implementation for cql-text.

When I try to use it, I get import errors.

What is the road map for this branch? Do you have a timeline? Is it actively being developed?

Invalid cql-json `and` expression is parsed into "wrong" ast

I cannot find any description of how pygeofilter is supposed to behave when it meets invalid filters. The issue is written assuming that pygeofilter should fail when parsing invalid filters.

Example 1, where pygeofilter silently ignores an invalid and expression:

>>> import pygeofilter.ast
>>> from pygeofilter.parsers.cql_json import parse as parse_json
>>> 
>>> cql_json = { "intersects": [ { "property": "geometry" }, { "type": "Point", "coordinates": [ 10.4064, 55.3951 ] } ], "and": [ {"eq": [ { "property": "direction" }, "east" ] } ] }
>>> print(pygeofilter.ast.get_repr(parse_json(cql_json)))
INTERSECTS(ATTRIBUTE geometry, Geometry(geometry={'type': 'Point', 'coordinates': [10.4064, 55.3951]}))

Example 2, where pygeofilter parses an invalid and into a simple comparison:

>>> import pygeofilter.ast
>>> from pygeofilter.parsers.cql_json import parse as parse_json
>>> 
>>> cql_json = { "and": [ {"eq": [ { "property": "direction" }, "east" ] } ] }
>>> print(pygeofilter.ast.get_repr(parse_json(cql_json)))
ATTRIBUTE direction = 'east'

Needless to say, it would be a better user experience if both of these examples failed with a meaningful message.
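For illustration, a pre-validation step along these lines (just a sketch, not an existing pygeofilter API) would reject both documents before they reach parse_json:

def validate_cql_json(cql: dict) -> dict:
    # A CQL-JSON filter document must have exactly one top-level operator.
    if len(cql) != 1:
        raise ValueError(f"expected exactly one top-level operator, got {sorted(cql)}")
    (op, args), = cql.items()
    # An "and"/"or" combinator needs a list of at least two sub-predicates.
    if op in ("and", "or") and (not isinstance(args, list) or len(args) < 2):
        raise ValueError(f"'{op}' requires a list of at least two sub-predicates")
    return cql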

CQL parsing error: No terminal matches '[' in the current parser context

For OGC API Records work in Testbed 18, we need to leverage the VendorSpecificParameters option, as available from the pyCSW implementation ().

The pyCSW implementation uses pygeofilter for request parsing:

from pygeofilter.parsers.ecql import parse as parse_ecql
from pygeofilter.parsers.cql2_json import parse as parse_cql2_json

When using the option, the CQL parser throws an exception:

{
  "code": "InvalidParameterValue",
  "description": "CQL parsing error: No terminal matches '[' in the current parser context, at line 1 col 4\n\ndcs[key_challenge] = \"secret\"\n   ^\nExpected one of: \n\t* LESSTHAN\n\t* COMMA\n\t* \"<>\"\n\t* EQUAL\n\t* IS\n\t* IN\n\t* \"DOES-NOT-EXIST\"\n\t* SLASH\n\t* AFTER\n\t* EXISTS\n\t* BEFORE\n\t* BETWEEN\n\t* AND\n\t* NOT\n\t* ILIKE\n\t* \"<=\"\n\t* \">=\"\n\t* RPAR\n\t* DURING\n\t* LPAR\n\t* MINUS\n\t* PLUS\n\t* STAR\n\t* LIKE\n\t* MORETHAN\n\t* OR\n\nPrevious tokens: Token('NAME', 'dcs')\n"
}

You can reproduce the error with the following CURL request:

curl -X GET "https://ogc.demo.secure-dimensions.de/pycsw/collections/metadata%3Amain/items?limit=10&f=json&dcs[key_challenge]=secret" -H "accept: application/geo+json"

to_cql2 incorrectly formats isNull args

According to the cql2 spec, the NULL predicate should look like this in cql2-json:

{
      "op": "isNull",
      "args": [ { "property": "geometry" } ]
}

However, if using to_cql2 to generate the json, args is not a list:

>>> from pygeofilter.parsers.ecql import parse as parse_ecql
>>> from pygeofilter.backends.cql2_json import to_cql2
>>> to_cql2(parse_ecql("geometry IS NULL"))
'{"op": "isNull", "args": {"property": "geometry"}}'

[cql2-text] NOT operator is not supported

Initially raised in stac-utils/stac-fastapi#473, it appears as though the NOT operator is under- or un-supported for CQL2-TEXT. E.g.

collection NOT LIKE 'not-a-collection-id'

#22 appears to be a similar issue that asks about general CQL2-TEXT support, but I thought a specific issue for this problem might be helpful. Please feel free to close if you feel as though this is a duplicate of #22 (or another issue). Thanks!

sqlalchemy tests fail due to non-existent management keyword argument

geoalchemy2 removed the management argument from the initialization of geoalchemy2.types._GISType earlier this year:

geoalchemy/geoalchemy2#415

As a result of this change, subsequent geoalchemy releases are not compatible with pygeofilter's sqlalchemy tests, which are still using the management keyword argument:

class Record(Base):
    __tablename__ = "record"

    identifier = Column(String, primary_key=True)
    geometry = Column(
        Geometry(
            geometry_type="MULTIPOLYGON",
            srid=4326,
            spatial_index=False,
            management=True,
        )
    )

This means the tests are currently failing.

Possible solutions:

  1. Follow suit with geoalchemy2 and simply remove the management argument
  2. Pin geoalchemy2 to a known version

I guess the first option is better, since the fix is really small - I'll prepare a PR with the change
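If the first option is chosen, the test model would presumably end up looking roughly like this (a sketch only, with imports shown for completeness):

from geoalchemy2 import Geometry
from sqlalchemy import Column, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Record(Base):
    __tablename__ = "record"

    identifier = Column(String, primary_key=True)
    geometry = Column(
        Geometry(
            geometry_type="MULTIPOLYGON",
            srid=4326,
            spatial_index=False,
            # management=True dropped: newer geoalchemy2 releases no longer accept it
        )
    )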

booleanLiteral terminals parsed as attributes in cql2-text

System Info

Python Version: 3.10

OS: macOS Ventura 13.6.6

pygeofilter Version: 0.2.1 (main branch)

Problem Description

According to the OGC CQL2 documentation, a value of TRUE (case-insensitive) should be interpreted as a booleanLiteral value (see CQL2-Text example 14). So, for example, in the following query we should interpret attr as an attribute and TRUE as a booleanLiteral that we are comparing attr against:

attr = TRUE

However, the current CQL2 Text parser interprets the value of TRUE as an attribute. This leads to errors when we try to translate the AST to a backend like SQLAlchemy.

Steps to Reproduce:

from pygeofilter import ast
from pygeofilter.parsers.cql2_text import parse

result = parse("attr = TRUE")
assert result == ast.Equal(
    ast.Attribute("attr"),
    True,
)
# Assertion fails. The actual object is
# ast.Equal(
#     ast.Attribute("attr"),
#     ast.Attribute("TRUE")
# )

Possible Solution

I will put up a PR shortly with a possible solution that uses Lark's terminal priority to ensure that values of TRUE/true/FALSE/false are parsed as booleanLiteral values.
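For reference, a minimal standalone sketch of the terminal-priority idea (this is not pygeofilter's actual grammar; the rule names here are made up):

from lark import Lark

# Giving the boolean terminal a higher priority than NAME ("BOOLEAN.2") makes
# TRUE/true/FALSE/false tokenise as booleans instead of attributes.
grammar = r"""
    ?start: comparison
    comparison: NAME "=" (BOOLEAN | NAME)
    BOOLEAN.2: "TRUE"i | "FALSE"i
    NAME: /[A-Za-z_][A-Za-z0-9_]*/
    %import common.WS
    %ignore WS
"""

parser = Lark(grammar, parser="lalr")
print(parser.parse("attr = TRUE").pretty())
# comparison
#   attr
#   TRUE   <- tokenised as BOOLEAN, not as a NAME/attribute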

colon in attribute name

Hi,

I am trying to filter on an attribute with a colon in its name, in my case 's5p:orbit'. After conversion to Django filters this name is unchanged, so Django cannot find the matching field, and I currently change the AST manually so that colons become underscores. Of course, while typing this, I see that you can pass a field mapping to to_filter, so now I'm wondering if it would make sense to have a mode where ':' is automatically converted to '_' for cases where the Django model matches this?
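For anyone else hitting this, the field-mapping route mentioned above looks roughly like this (a sketch only; s5p_orbit is a hypothetical Django model field, and the exact parameter name of the mapping argument should be checked against the django backend):

from pygeofilter.backends.django import to_filter
from pygeofilter.parsers.cql_json import parse as parse_json

ast = parse_json({"eq": [{"property": "s5p:orbit"}, 1234]})
# The second argument maps the colon-containing attribute onto the actual
# Django model field; the exact signature may differ from this sketch.
query = to_filter(ast, {"s5p:orbit": "s5p_orbit"})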

Are cql2-json operators mandatory lowercase?

The cql2_json parser fails with a ValueError if I try to pass it an operator in uppercase, e.g. "op": "or", "args": [...] is fine, but "op": "OR", "args": [...] fails. And indeed, in cql2_json/parser.py we find an explicit test if op in ("and", "or"): (ditto for all the other operators, of course).

What is the reasoning behind this? cql2_text is case-insensitive w.r.t. operators (thanks to the presumably deliberate "OR"i etc. terminal definitions in the Lark grammar), but if I consult the CQL2 JSON standard spec I just see things like orExpression = booleanExpression "OR" booleanExpression; which to me would indicate either case insensitivity (I haven't been able to quickly find out what the default is for BNF literals) or in fact mandatory uppercase...
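If case-insensitive operators turn out to be the intended behaviour, a caller-side workaround (or a one-line change in the parser) could simply lower-case op values before dispatch; a rough sketch of the workaround:

def lowercase_ops(node):
    # Recursively lower-case "op" values in a CQL2-JSON document so that
    # e.g. {"op": "OR", "args": [...]} is accepted by the current parser.
    if isinstance(node, dict):
        return {
            key: value.lower() if key == "op" and isinstance(value, str)
            else lowercase_ops(value)
            for key, value in node.items()
        }
    if isinstance(node, list):
        return [lowercase_ops(item) for item in node]
    return node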

Use pinned ranges in requirements*.txt

As discovered in #57, breaking changes in dependencies can cause issues with pygeofilter.

I suggest amending the requirements*.txt files to pin specific versions or version ranges for all dependencies.

pygeoif 1.0.0 is not compatible with pygeofilter 0.1.2, needs to be pinned to an earlier version

pygeoif released v 1.0.0 on Sept 22 2022, which is the first release since 0.7 in 2017 (!).

pygeoif is unpinned in pygeofilter, so the latest version 1.0.0 is pulled, even though it's not compatible.

stac-fastapi gets this failure when trying to import the cql2 converters:

File "/app/stac_fastapi/pgstac/stac_fastapi/pgstac/core.py", line 13, in
from pygeofilter.backends.cql2_json import to_cql2
File "/usr/local/lib/python3.8/site-packages/pygeofilter/backends/cql2_json/init.py", line 1, in
from .evaluate import to_cql2
File "/usr/local/lib/python3.8/site-packages/pygeofilter/backends/cql2_json/evaluate.py", line 32, in
from ..evaluator import Evaluator, handle
File "/usr/local/lib/python3.8/site-packages/pygeofilter/backends/evaluator.py", line 31, in
from .. import ast
File "/usr/local/lib/python3.8/site-packages/pygeofilter/ast.py", line 32, in
from . import values
File "/usr/local/lib/python3.8/site-packages/pygeofilter/values.py", line 33, in
from pygeoif.geometry import as_shape
ImportError: cannot import name 'as_shape' from 'pygeoif.geometry' (/usr/local/lib/python3.8/site-packages/pygeoif/geometry.py)

The fix in stac-fastapi was to pin it to 0.7 in setup.py:

    "pygeoif==0.7",

Related issues:

CQL JSON like query fails

>>> from pygeofilter.parsers.cql_json import parse as parse_json
>>> parse_json({ "eq": [{ "property": "title" }, "Lorem ipsum"]})  # sanity check/works
Equal(lhs=ATTRIBUTE title, rhs='Lorem ipsum')
>>> parse_json({ "like": [{ "property": "title" }, "Lorem%"]})
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/tomkralidis/Dev/pycsw/lib/python3.7/site-packages/pygeofilter/parsers/cql_json/parser.py", line 228, in parse
    return walk_cql_json(cql)
  File "/Users/tomkralidis/Dev/pycsw/lib/python3.7/site-packages/pygeofilter/parsers/cql_json/parser.py", line 167, in walk_cql_json
    walk_cql_json(value['like'][0]),
TypeError: list indices must be integers or slices, not str

cc @kalxas

[CQL2-JSON] function's arguments should be passed using `args` not `arguments` keyword

In
https://github.com/geopython/pygeofilter/blob/9a2d0ea43ad15032cbe92208612cb5a5ffb990a9/pygeofilter/parsers/cql2_json/parser.py#LL107C69-L107C78

the function input is parsed using the arguments keyword, while the OGC specification uses args.

ref: https://docs.ogc.org/DRAFTS/21-065.html#functions

example 👇

{
  "op": "s_within",
  "args": [
    {"property": "road"},
    {
      "function": {
        "name":"Buffer",
        "args": [
          {"property": "geometry"},
          10,
          "m"
        ]
      }
    }
  ]
}

Provide containerized test executor

Local environment configuration for test execution can be tricky. A containerized execution environment should be provided for at least one of the supported Python versions to make it easier to execute tests.

Does pygeofilter support various CRS other than 4326?

Summary

Hi! I've been working on pygeoapi for the past couple of weeks, and a recent PR #964 added the possibility to use CQL to query features inside the pygeoapi PostgreSQL provider! :)

The implementation of this new feature uses pygeofilter as it references the to_filter function to translate ECQL AST to a Django Query expression. The call to to_filter from pygeoapi can be found here.

While using this new pygeoapi CQL feature, I have come across an issue that could be coming from pygeofilter? I'm creating this issue more as a discussion topic, because I'd like to know more information about the way pygeofilter is meant to be used with geometries using various SRID. I'm interested to know in order to see if I can work around my issue or if it's something that might bring an idea to someone working in this current repository.

Context

In order to provide context, here's an example of my usage in pygeoapi and the steps to reproduce my issue:

  1. In pygeoapi, via a service request handler, I'm receiving a CQL written as such: INTERSECTS(geom, POLYGON((-103.7402 55.4461, -103.7402 55.8272, -103.2597 55.8272, -103.2597 55.4461, -103.7402 55.4461))).

  2. pygeoapi starts by using the parse function to parse the CQL string. To the best of my understanding, this uses Lark to parse the string and generate a result (ref: Lark's parse). The call to pygeofilter's parse from pygeoapi can be found here.

  3. Then, in pygeoapi, this result is sent to the aforementioned to_filter function in order to finally pass the CQL filter results to an sqlalchemy query like so: session.query(self.table_model).filter(cql_filters). This call from pygeoapi can be found here.

  4. Everything seems to be great, and I'm pretty convinced that, if the CQL had a geometry with its coordinates in SRID=4326, or maybe the same SRID as the data, everything would work fine. That being said, I believe I've stumbled on this particular issue because our data is stored in SRID=3978, not in SRID=4326(!). So, when I execute this CQL on our point layer, in 3978, sqlalchemy understandably fails and gives this error: sqlalchemy.exc.InternalError: (psycopg2.errors.InternalError_) ST_Intersects: Operation on mixed SRID geometries (Point, 3978) != (Polygon, 4326) and [SQL: SELECT {....} FROM {....} WHERE ST_Intersects({geom_field}, ST_GeomFromEWKT(%(ST_GeomFromEWKT_1)s)).

  5. Therefore, the first thing I tried (even if that's not what we want in pygeoapi) was to send a CQL with a POLYGON using coordinates in SRID=3978 (the same SRID as the data), just to attempt to get results from the query. I used this CQL: INTERSECTS(geom, POLYGON((-574624 735257, -574624 870115, -417433 870115, -417433 735257, -574624 735257))). It failed with the same error message, which told me something was probably forcing it to SRID=4326.

  6. I then tried to send a CQL in which I was explicitly writing the SRID of the POLYGON using a PostGIS EWKT format as such: INTERSECTS(geom, SRID=3978;POLYGON((-574624 735257, -574624 870115, -417433 870115, -417433 735257, -574624 735257))) to attempt to force the geometry in 3978. It was still giving me the same error about a 4326 incompatibility.

  7. At this point, I thought either (1) pygeofilter is not meant to be used with a SRID other than 4326, or (2) it's not supporting the EWKT format. I was thinking I had to abandon using to_filter altogether and try to find a different way to make it work for CQL inside pygeoapi. However, before doing so, I jumped in GitHub to try to understand the implementation of to_filter and something caught my attention. Is it possible that, with all the formatting that happens (impressive work btw!), the function parse_geometry is called at some point?


If so, this might explain the issue that I'm experiencing. This parse_geometry function starts by calling as_shape** from the pygeoif library and then calls to_wkt**, again from the pygeoif library. Afterwards, it uses a regular expression to search for a SRID={some_number}; prefix in the geometry WKT string to determine what SRID should be used in the returned ST_GeomFromEWKT. Even though, inspecting the rhs variable for the geometry, I can validate that SRID 3978 is defined (Geometry(geometry={'type': 'Polygon', 'coordinates': (((-574624.0, 735257.0), (-574624.0, 870115.0), (-417433.0, 870115.0), (-417433.0, 735257.0), (-574624.0, 735257.0)),), 'crs': {'type': 'name', 'properties': {'name': 'urn:ogc:def:crs:EPSG::3978'}}})), after the object goes through as_shape and to_wkt it loses its SRID. Actually, my understanding is that a WKT never has a SRID - when a WKT has a SRID it's called an EWKT - so I think it's normal that the SRID can never be read here. In my particular case, the SRID={number}; prefix is never found and the default SRID=4326; is always used in the response, which is probably causing my issue.
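For reference, a rough illustration of the SRID-defaulting behaviour described above (illustrative only, not pygeofilter's actual code): the WKT produced by pygeoif carries no SRID= prefix, so the lookup never matches and the default 4326 is used.

import re

def to_ewkt(wkt: str, default_srid: int = 4326) -> str:
    # Keep an explicit EWKT prefix if one is present; otherwise fall back to
    # the default, which is how every geometry ends up as SRID=4326 here.
    if re.match(r"SRID=\d+;", wkt):
        return wkt
    return f"SRID={default_srid};{wkt}"

print(to_ewkt("POLYGON((-574624 735257, -574624 870115, -417433 870115, -417433 735257, -574624 735257))"))
# -> SRID=4326;POLYGON((-574624 735257, ...))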

My question, finally! :), is it possible to use pygeofilter CQL on geometries that are in other SRIDs? Maybe the parse_geometry function should have an additional parameter like the similar parse_bbox function?

Thank you for taking the time to read!

TL;DR

Is it possible that the function parse_geometry always responds with SRID=4326 in the ST_GeomFromEWKT it returns, making pygeofilter work exclusively(?) with geometries in the SRID=4326 projection? Is there a way to work with other SRID values for a geometry in pygeofilter CQL? Maybe the parse_geometry function should have an additional parameter like the similar parse_bbox function?

Footnotes

** pygeofilter (and by extension pygeoapi) is referencing an old version of pygeoif (version==0.7, which dates back to sometime near 2012, maybe?). This is on purpose, as it seems that the latest version of pygeoif has caused issues recently, and this commit addresses that. That is why the references above point to other places in pygeoif. Indeed, the function as_shape in pygeoif's Geometry class doesn't exist anymore; it's now in pygeoif's Factories class. Also, the function to_wkt in pygeoif's Geometry class has been renamed to wkt.

django like filter won't work for various wildcard chars

Status Quo

if a fes filter:

<ogc:Filter xmlns:ogc="http://www.opengis.net/ogc">
 <ogc:And>
  <ogc:PropertyIsEqualTo>
   <ogc:PropertyName>Type</ogc:PropertyName>
   <ogc:Literal>dataset</ogc:Literal>
  </ogc:PropertyIsEqualTo>
  <ogc:PropertyIsLike wildCard="*" singleChar="_" escapeChar="/">
   <ogc:PropertyName>AnyText</ogc:PropertyName>
   <ogc:Literal>*potentielle Evapotranspiration*</ogc:Literal>
  </ogc:PropertyIsLike>
 </ogc:And>
</ogc:Filter>

is converted to a django filter, the like query will result in an exact lookup for *potentielle Evapotranspiration*.

This is because the current like function does not take the wildCard character into account. It always uses the hard-coded % character as the wildcard.
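A possible direction (a sketch only, not the current pygeofilter code) is to translate the filter's declared wildCard/singleChar/escapeChar into the standard SQL LIKE characters before building the Django lookup:

def normalize_like_pattern(pattern, wildcard="*", single_char="_", escape_char="/"):
    # Rewrite a PropertyIsLike pattern into a SQL-style LIKE pattern using
    # % and _, honouring the wildCard/singleChar/escapeChar declared on the filter.
    out = []
    i = 0
    while i < len(pattern):
        char = pattern[i]
        if char == escape_char and i + 1 < len(pattern):
            out.append(pattern[i + 1])  # escaped character: emit it literally
            i += 2
            continue
        if char == wildcard:
            out.append("%")
        elif char == single_char:
            out.append("_")
        elif char in ("%", "_"):
            out.append("\\" + char)  # protect literal SQL wildcards (backend-specific)
        else:
            out.append(char)
        i += 1
    return "".join(out)

print(normalize_like_pattern("*potentielle Evapotranspiration*"))
# -> %potentielle Evapotranspiration%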

support for DuckDB

Hi,
I am currently trying to figure out a way to use pygeofilter to convert CQL2 to a DuckDB query using the spatial extension. This extension is still quite prototypical, and I have stumbled across a couple of caveats. Here's my current approach:

import duckdb

duckdb.install_extension('spatial')
duckdb.load_extension('spatial')

Define the CQL2 filter

cql2_filter = {
    "op": "and",
    "args": [
        {
            "op": ">=",
            "args": [
                {
                    "property": "end_datetime"
                },
                '2020-03-28T20:05:46+02'
            ]
        },
        {
            "op": "<=",
            "args": [
                {
                    "property": "start_datetime"
                },
                '2020-03-28T22:06:15+02'
            ]
        },
        {
            "op": "=",
            "args": [
                {
                    "property": "sar:instrument_mode"
                },
                'IW'
            ]
        }
    ]
}

Optionally add spatial filtering

# ext = None
ext = {'xmin': -4, 'xmax': -2, 'ymin': 6, 'ymax': 8}

if ext is not None:
    arg = {
        'op': 's_intersects',
        'args': [
            {
                'property': 'geometry'
            },
            {
                'type': 'Polygon',
                'coordinates': [[[ext['xmin'], ext['ymin']],
                                 [ext['xmin'], ext['ymax']],
                                 [ext['xmax'], ext['ymax']],
                                 [ext['xmax'], ext['ymin']],
                                 [ext['xmin'], ext['ymin']]]]
            }
        ]
    }
    cql2_filter['args'].append(arg)

Convert CQL2 filter to SQL where clause

from pygeofilter.parsers.cql2_json import parse as json_parse

filter = json_parse(cql2_filter)
from pygeofilter.backends.sql.evaluate import to_sql_where

sql_where = to_sql_where(filter, {
    's1:datatake': 's1:datatake',
    'datetime': 'datetime',
    'sar:instrument_mode': 'sar:instrument_mode',
    'end_datetime': 'end_datetime',
    'start_datetime': 'start_datetime',
    'geometry': 'geometry'
})

Create DuckDB query for a geoparquet file:
Here it gets ugly because (1) the WKB column needs to be converted to GEOMETRY type and (2) the DuckDB-spatial implementation of ST_GeomFromWKB cannot read the WKB-HEX representation returned by to_sql_where.

import re

sql_query = "SELECT * EXCLUDE geometry, ST_GeomFromWKB(geometry) AS geometry FROM '20200301_20200401.parquet' WHERE %s" % sql_where

if ext is not None:
    # convert WKB blob to GEOMETRY
    sql_query = sql_query.replace('ST_Intersects("geometry"', 'ST_Intersects(ST_GeomFromWKB(geometry)')
    
    # duckdb_spatial apparently cannot yet read wkb_hex representation -> convert it back to text
    spatial = ("ST_GeomFromText('POLYGON(({xmin} {ymin}, {xmin} {ymax}, "
               "{xmax} {ymax}, {xmax} {ymin}, {xmin} {ymin}))')")
    sql_query = re.sub(r'ST_GeomFromWKB\(x\'.*\'\)', spatial.format(**ext), sql_query)
sql_query

Execute the query:

df = duckdb.query(sql_query)

I wonder, how would you do this? Do you think there is anything that could/should be modified on the pygeofilter end or is it entirely up to the DuckDB-spatial package? I'd appreciate any help.

cql2-json between operator fails

I see in the specs that the between operator accepts 3 args: the property, the lower limit, and the upper limit.
But the pygeofilter cql2_json parser expects two args: the property and a list of two values indicating the lower and upper limits.

Here's the example from OGC Specs (https://docs.ogc.org/DRAFTS/21-065.html#advanced-comparison-operators), which fails with pygeofilter:

{
  "op": "between",
  "args": [
    { "property": "depth" },
    100.0,
    150.0
  ]
}

pygeofilter expects it to be like this:

{
  "op": "between",
  "args": [
    { "property": "depth" },
    [100.0, 150.0]
  ]
}

Is this due to changes in specs?
It seems like a straightforward fix here: https://github.com/geopython/pygeofilter/blob/main/pygeofilter/parsers/cql2_json/parser.py#L131.
If it's an actual bug, I can submit a PR.
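Until the parser is updated, a caller-side shim (a sketch only) can bridge the two shapes by collapsing the spec's three-argument form into the two-argument form pygeofilter currently expects:

def normalize_between(node):
    # Recursively rewrite {"op": "between", "args": [prop, low, high]} into
    # the shape the current cql2_json parser accepts: [prop, [low, high]].
    if isinstance(node, dict):
        node = {key: normalize_between(value) for key, value in node.items()}
        if node.get("op") == "between" and len(node.get("args", [])) == 3:
            prop, low, high = node["args"]
            node["args"] = [prop, [low, high]]
        return node
    if isinstance(node, list):
        return [normalize_between(item) for item in node]
    return node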
