hotosm / galaxy-api

Backend to fetch data from Underpass

Home Page: https://galaxy-api.hotosm.org/latest/redoc

License: GNU Affero General Public License v3.0

Languages: Python 99.53%, Dockerfile 0.25%, Shell 0.22%
Topics: python, osm

galaxy-api's Introduction

GALAXY API


Getting Started

1. Install

Clone the Repo to your machine

git clone https://github.com/hotosm/galaxy-api.git

Navigate to repo

cd galaxy-api

Install python dependencies

pip install -r requirements.txt

2. Create config.txt inside src directory.

(screenshot: example config.txt)

3. Setup Underpass

Run Underpass from here, or create a database "underpass" in your local Postgres and load the sample dump from /tests/src/fixtures/underpass.sql

4. Setup Oauth

Log in to OSM, click on My Settings, and register your local Galaxy app under OAuth 2 applications

(screenshot: OAuth 2 application registration)

Check "read user preferences" and enter the redirect URI as follows: http://127.0.0.1:8000/latest/auth/callback/

Grab the Client ID and Client Secret and put them inside config.txt in the [OAUTH] block. You can generate the secret key for your application yourself.

5. Put your credentials inside config.txt

Insert config blocks with the database credentials for the databases hosting underpass, insights, and tm:

[UNDERPASS]
host=localhost
user=postgres
password=admin
database=underpass
port=5432

[OAUTH]
client_id= your client id
client_secret= your client secret
url=https://www.openstreetmap.org
scope=read_prefs
login_redirect_uri=http://127.0.0.1:8000/latest/auth/callback/
secret_key=jnfdsjkfndsjkfnsdkjfnskfn

[API_CONFIG]
env=dev
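As an illustration, an INI-style file like the one above can be read with Python's configparser. This is a hedged sketch: the function name and the returned dict shape are assumptions, not the actual galaxy-api code.

```python
import configparser

# A sample block matching the config.txt layout above.
SAMPLE = """
[UNDERPASS]
host=localhost
user=postgres
password=admin
database=underpass
port=5432
"""

def load_db_params(text, section="UNDERPASS"):
    """Parse one config section into psycopg2-style keyword arguments."""
    config = configparser.ConfigParser()
    config.read_string(text)
    block = config[section]
    return {
        "host": block["host"],
        "user": block["user"],
        "password": block["password"],
        "dbname": block["database"],  # psycopg2 calls this 'dbname'
        "port": int(block["port"]),
    }

params = load_db_params(SAMPLE)
# psycopg2.connect(**params) would then open the connection
```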

6. Optional Configuration

You can further customize the API with the [API_CONFIG] block:

[API_CONFIG]
api_host=http://127.0.0.1 # define this if you use a different host
api_port=8000
log_level=info # options are info, debug, warning, error
env=dev # default is dev; supported values are dev and prod
7. Set up the Tasking Manager database for TM-related development

Set up the Tasking Manager from here, or create a database "tm" in your local Postgres and load the TM test dump (wget https://raw.githubusercontent.com/hotosm/tasking-manager/develop/tests/database/tasking-manager.sql):

psql -U postgres -h localhost tm < tasking-manager.sql

Add this block to config.txt with the values you use in your Tasking Manager configuration:

[TM]
host=localhost
user=postgres
password=admin
database=tm
port=5432

You can test it with the /mapathon/detail/ endpoint and with the following input: {"fromTimestamp":"2019-04-08 10:00:00.000000","toTimestamp":"2019-04-08 11:00:00.000000","projectIds":[1],"hashtags":[]}
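A hedged sketch of posting that payload from Python; the helper name is hypothetical and the URL path is an assumption based on the endpoint name mentioned above (the dev server from step 8 must be running for the call to succeed):

```python
import json
from urllib import request

# Sample payload from the docs above.
payload = {
    "fromTimestamp": "2019-04-08 10:00:00.000000",
    "toTimestamp": "2019-04-08 11:00:00.000000",
    "projectIds": [1],
    "hashtags": [],
}

def post_mapathon_detail(body, url="http://127.0.0.1:8000/latest/mapathon/detail/"):
    """POST the JSON body and return the decoded response (assumed URL)."""
    req = request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# post_mapathon_detail(payload)  # uncomment once the server is running
```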

8. Run server

uvicorn API.main:app --reload

9. Navigate to the FastAPI docs to get details about the API endpoints

After successfully running the server, open this URL in your browser:

http://127.0.0.1:8000/latest/docs

Check Authentication

  1. Hit /auth/login/
  2. Hit Url returned on response
  3. You will get access_token
  4. You can use that access_token with all endpoints that require authentication. To check the token, pass it to /auth/me/; it should return your OSM profile.

If you get a 401 response with the detail "User is not staff member", get your OSM id using https://galaxy-api.hotosm.org/v1/docs#/default/get_user_id_osm_users_ids__post, then run the following SQL on underpass database replacing ID:

INSERT INTO users_roles VALUES (ID, 1);

Repeat the steps to get a new access_token.

The API has been set up successfully!

Run tests

Galaxy-API uses pytest for tests. Navigate to the root directory and install the package in editable mode:

pip install -e .

Make sure you have PostgreSQL installed locally with the PostGIS extension enabled, then run pytest:

py.test -v -s

Run Individual tests

py.test -k <test_function_name>

Galaxy Package

Local Install

python setup.py install

Now import as:

import galaxy

For database:

from galaxy import Database

For Mapathon:

from galaxy import Mapathon

New Relic

When using New Relic, save the newrelic.ini to the root of the project and run the following to start the server:

NEW_RELIC_CONFIG_FILE=newrelic.ini $PATH_TO_BIN/newrelic-admin run-program $PATH_TO_BIN/uvicorn API.main:app

Setup Documentation

Galaxy API uses Sphinx for its technical documentation.

To set up:

Navigate to the docs folder and build the .rst files first

cd docs

If you want to generate documentation for src

sphinx-apidoc -o source ../src/galaxy

If you want to generate documentation for API

sphinx-apidoc -o source ../API

You can create HTML files with the following (or export to other formats supported by Sphinx):

make html

All exported html files are inside build/html

galaxy-api's People

Contributors

d-rita, dakotabenjamin, emi420, eternaltyro, jorgemartinezg, kshitijrajsharma, nicolasgrosjean, petya-kangalova, ramyaragupathy, robsavoye


galaxy-api's Issues

move mapathon/utils.py and refactor database access

mapathon/utils.py is currently a module separate from the router code. As this is a collection of queries for a mapathon, this file should be moved into the core src directory and renamed to something more descriptive of its functionality. It currently accesses psycopg2 directly, so it should also be modified to use the new database base class.
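A minimal sketch of what such a shared database base class could look like, so query modules stop touching psycopg2 directly. All names here are assumptions, not the actual galaxy-api design; the psycopg2 import is deferred so the sketch loads without a database.

```python
class Database:
    """Owns the connection; callers only supply SQL and parameters."""

    def __init__(self, params):
        self.params = params  # psycopg2-style keyword arguments
        self.conn = None

    def connect(self):
        import psycopg2  # deferred so the sketch imports without a server
        self.conn = psycopg2.connect(**self.params)

    def executequery(self, query, args=None):
        # psycopg2 cursors are context managers; rows come back as tuples
        with self.conn.cursor() as cur:
            cur.execute(query, args)
            return cur.fetchall()

    def close(self):
        if self.conn:
            self.conn.close()
```

A module like mapathon/utils.py would then accept a `Database` instance instead of opening its own connections.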

Embed documentation comments in Python code

While Python uses pydoc, which uses comment strings in the class definitions, Doxygen produces web viewable documentation by parsing Doxygen specific comments. This enables people to get a list of all files, classes, methods, etc.. which lets others analyze the code from a high-level. As code gets developed, documentation will also grow. This task is mostly a reminder to integrate Doxygen comments into code development along with sufficient comments for pydoc. Documentation is important to let others learn about our project if they want to contribute or use it.

Group reports for a set of OSM usernames

Given a set of usernames and hashtags, the endpoint should return statistics on feature contributions for individual users

As a permissioned user, I can input a set of OSM usernames, specify a date range and a set of hashtags so that I can generate a user contribution report for a given time period.

Endpoint: /osm-users/reports/ POST

Target Groups:
Training Group
Internal mapping projects
Data Quality team

Data Source:
Insights RDS

Sample request body:
{
  "Osm_usernames": [],
  "Hashtags": [],
  "from_timestamp": "2021-08-27 09:00:00",
  "to_timestamp": "2021-08-27 11:00:00",
  "Period": weekly/monthly
}

ACCEPTANCE CRITERIA:

  • Users are able to select from a calendar picker from and to dates

    • Frontend should handle UTC time conversion
    • Frontend should make sure both from and to dates are filled by the user
    • Required field
    • Should be limited to a one-month time interval
  • Hashtags

    • Default to ‘Any’
    • Users are able to search and type in hashtags.
    • All entered hashtags appear in the hashtag input form.
    • Not a required field
  • Users can provide a set of OSM usernames

    • Default to ‘Any ‘
    • All entered usernames should appear in the username input form.
    • Limit to 50 usernames
  • Let user download the result in file format

    • Binary field
    • Default value is false
  • Output file format

    • JSON - this is the default option
    • CSV - for the download option

Output :

JSON

{
  "contributors": [
    {
      "Username":
      "Changesets":
      "Buildings_created":
      "Buildings_modified":
      "Editors": []
    }
  ]
}

CSV Username|changesets|buildings_created|buildings_modified

  • Submit button displayed at the bottom of the page.
  • Submit button activated only when no errors found in submit API request form.

Create a base class for charts

Since the output from the API will be used for the front-end, possibly multiple front-ends, it should be possible to generate charts using the Python matplotlib module to create PNG images the various other tools can load. As these will be downloadable either locally or via the Galaxy website, this will enable these charts to be used in other people's reports.

Implement mapathon endpoints to use Galaxy database

Currently the mapathon statistics only use the insights database. All the mapathon endpoints in the API need to use the Galaxy database instead. The API should work with either database where possible.

Bug in the filters of mapathon summary and details report

There is an issue in the queries used for the mapathon summary and details reports.

Incidence

The issue was highlighted during the Galaxy WG on 20 Oct 2021, when TM project #111 was used to show mapathon summary stats for the time span 3-9 AM of 20-10-2021, and it returned features created and modified by users. However, checking the mapping activity for project 111 on TM showed that there was no mapping activity within the specified time span.

Investigation and debugging

Checking the query filters that were sent to the Insights database, the query is the following:

select t.uid, t.key, t.action, count(distinct id)
from (
    select (each(osh.tags)).key, (each(osh.tags)).value, osh.*
    from public.osm_element_history osh
    where osh.changeset in (
        select c.id
        from public.osm_changeset c
        where c.created_at between '2021-10-20 03:00:00' and '2021-10-20 09:00:00'
          and (
            (c.tags -> 'comment') ~~ '%hotosm-project-111%'
            or (c.tags -> 'hashtags') ~~ '%hotosm-project-111%'
          )
    )
) as t
group by t.key, t.action, t.uid
order by 1, 3, 4 desc;

To validate the changesets returned for the mapathon report, the inner subquery was extracted and run; it returned 28 changesets, with IDs listed below.

select *
from public.osm_changeset c
where c.created_at between '2021-10-20 03:00:00' and '2021-10-20 09:00:00'
and (
(c.tags -> 'comment') ~~ '%hotosm-project-111%' or (c.tags -> 'hashtags') ~~ '%hotosm-project-111%'
)

Validating the tags in the returned changesets, such as the following:
"source"=>"tms[22]:https://services.digitalglobe.com/earthservice/tmsaccess/tms/1.0.0/DigitalGlobe:ImageryTileService@EPSG:3857@jpg/{zoom}/{x}/{-y}.jpg?connectId=ca613e76-811f-46e7-9e1d-84f6795441c2", "comment"=>"#hotosm-project-11132 Added buildings #mapbeks #UNMAPPEDPH #UNMAPPEDTarlac #UNMAPPEDPH2021 #fb-communityimpact #hotmicrogrants2021", "hashtags"=>"#hotosm-project-11132;#mapbeks;#UNMAPPEDPH;#UNMAPPEDTarlac;#UNMAPPEDPH2021;#fb-communityimpact;#hotmicrogrants2021", "created_by"=>"JOSM/1.5 (18193 en)"

It is obvious that these changesets were returned because they belong to project 11132, whose hashtag is #hotosm-project-11132. Since the query uses create_hashtag_filter_query, the filter looks like hashtag like '%hotosm-project-111%', which also matches project 11132's hashtag #hotosm-project-11132.

Proposed solution

Noticing that the project hashtag is followed by a space in the comments and by a semicolon in the changeset hashtags, a potential solution is to append a space (for comment hashtags) and a semicolon (for the changeset hashtags) to the pattern to avoid such scenarios. The revised query filters would look like the following:

select *
from public.osm_changeset c
where c.created_at between '2021-10-20 03:00:00' and '2021-10-20 09:00:00'
and (
(c.tags -> 'comment') ~~ '%hotosm-project-111 %' or (c.tags -> 'hashtags') ~~ '%hotosm-project-111;%'
)
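The boundary problem can be illustrated in plain Python. `sql_like` below is a hypothetical stand-in for SQL's LIKE operator (via glob matching), and the patterns mirror the buggy and fixed filters above:

```python
from fnmatch import fnmatch

def sql_like(value, pattern):
    """Approximate SQL LIKE by mapping '%' to the glob wildcard '*'."""
    return fnmatch(value, pattern.replace("%", "*"))

# Changeset hashtags from project 11132, as seen in the tag dump above.
hashtags_11132 = "#hotosm-project-11132;#mapbeks;#UNMAPPEDPH"
hashtags_111 = "#hotosm-project-111;#somethingelse"

# Buggy filter: the project 111 pattern also matches project 11132.
assert sql_like(hashtags_11132, "%hotosm-project-111%")

# Fixed filter: the trailing ';' pins the hashtag boundary.
assert not sql_like(hashtags_11132, "%hotosm-project-111;%")
assert sql_like(hashtags_111, "%hotosm-project-111;%")
```

Note one caveat of the boundary approach: a hashtag at the very end of the string, with no trailing semicolon, would also need to be handled.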

Changeset IDs returned during debugging:

112722791
112724299
112727253
112727656
112728100
112728147
112728948
112729144
112729266
112729338
112729365
112729651
112729796
112730082
112730181
112730352
112730372
112730566
112730622
112730691
112730753
112730919
112731099
112731271
112731350
112731401
112731631
112733069

@JorgeMartinezG @ramyaragupathy

Import Leaderboard database

Underpass contains a Python utility, util/stats2galaxy.py, which reads the existing Leaderboard database and converts it to the database schema used by Galaxy. There is a recent dump of the Leaderboard database in the /data subdirectory on the new EC2 instance. The file is mm-osm-stats-20211110.dump. This file initially needs to be imported into Postgres. The script reads from that database and writes it to the Galaxy database. This will populate the Galaxy statistics database with many years of historical statistics.

When Underpass runs, it'll update any existing records in the database, but this will help bootstrap Galaxy with the historical statistics.

Filter data quality reports by Tasking Manager Project ID

An API endpoint that filters out data quality issues for a specific Tasking Manager Project ID

Data Source:
Underpass

Sample request body:

{
  "project_ids": [],
  "issue_type": ["bad_geometry", "bad_value", "incomplete_tags", "all"]
}

Acceptance Criteria:

  • User feeds in a Tasking manager Project ID
    • Required input
    • Integer array field
    • Any non-integer value in the array should throw an error - 400 Bad Request
    • An empty array should throw an error - 400 Bad Request
  • User can select the data quality issue from a list of options
    • Required input value

    • String array
      - bad_geometry
      - bad_value
      - incomplete_tags
      - all

    • Any value other than the options provided should throw an error - 400 Bad Request

    • An empty array should throw an error - 400 Bad Request

Output file format:
GeoJSON

{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "Osm_id": int,
        "Changeset_id": int,
        "Changeset_timestamp": timestamp,
        "Issue_type": string
      },
      "geometry": {
        "type": "Point",
        "coordinates": [
          97.734375,
          58.26328705248601
        ]
      }
    }
  ]
}

CSV: OSM ID|Changeset ID|latitude|longitude|timestamp|Issue type
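A hypothetical converter from the GeoJSON output above to the stated CSV layout might look like this; the property names follow the schema above, everything else (function name, sample values) is an assumption:

```python
import csv
import io

# Example feature following the GeoJSON schema above.
feature = {
    "type": "Feature",
    "properties": {
        "Osm_id": 123,
        "Changeset_id": 456,
        "Changeset_timestamp": "2021-08-27 09:00:00",
        "Issue_type": "bad_geometry",
    },
    "geometry": {"type": "Point", "coordinates": [97.734375, 58.26328705248601]},
}

def features_to_csv(features):
    """Flatten GeoJSON features into the pipe-delimited CSV layout above."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="|")
    writer.writerow(["OSM ID", "Changeset ID", "latitude", "longitude",
                     "timestamp", "Issue type"])
    for f in features:
        props = f["properties"]
        lon, lat = f["geometry"]["coordinates"]  # GeoJSON order is [lon, lat]
        writer.writerow([props["Osm_id"], props["Changeset_id"], lat, lon,
                         props["Changeset_timestamp"], props["Issue_type"]])
    return buf.getvalue()

csv_text = features_to_csv([feature])
```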

Mapathon end point producing strange results

To fix this bug we need to update the Query Builder's hashtag-query-producing function. @omranlm identified the issue in the query. We need to update it, check the affected endpoint, and make the tests pass.
For that, the result and the query generated from the builder need to be verified.
cc : @JorgeMartinezG

SQL endpoint for user data quality

Add a file & class for validation queries. There will be several of these for different types of validation analysis; this task is focused on just a single endpoint. This class should take a user id and polygon and query the validation table in osmstats. Additional parameters will also be needed to filter data by changeset, hashtag, and time interval. A time interval is an optional starting and ending time for the query. There are currently two values in the validation table, badvalue and badgeom. There are other optional values for this database column that will also need to be supported. These are incomplete and complete, which are used with the optional tag completeness validation, which isn't enabled by default at runtime.

The output of this query should be the location and other relevant tags to aid the human validator. Layered on top of this query is also support to export the data returned from the query as either CSV or GeoJson.

Create a base class for output

Multiple endpoints in the API need to support exporting data in a form it can be used in other programs. The two formats to support are CSV and GeoJson. While postgres can output a query in these formats, there are also numerous python modules that can be used for more control over the output. Since the SQL queries will be generating the input to this class, there will need to be a flexible and portable data flow between them.
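One possible shape for such an output class, sketched under the assumption that rows arrive as (lon, lat, properties) tuples; the class name and interface are hypothetical, since the real design is still open:

```python
import csv
import io
import json

class RowExporter:
    """Hypothetical shared output layer: one row source, two export formats."""

    def __init__(self, rows):
        self.rows = rows  # assumed shape: (lon, lat, properties-dict)

    def to_csv(self):
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["longitude", "latitude", "properties"])
        for lon, lat, props in self.rows:
            writer.writerow([lon, lat, json.dumps(props)])
        return buf.getvalue()

    def to_geojson(self):
        return json.dumps({
            "type": "FeatureCollection",
            "features": [
                {"type": "Feature",
                 "properties": props,
                 "geometry": {"type": "Point", "coordinates": [lon, lat]}}
                for lon, lat, props in self.rows
            ],
        })

exporter = RowExporter([(97.73, 58.26, {"issue": "badgeom"})])
```

SQL query classes would hand their result rows to one instance of this class, keeping the format logic out of the endpoints.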

Refactor directory structure

The directory structure and naming should be modified into something closer to how other projects are organized. The existing code under osmstats should be renamed to FastAPI (or something similar), as it's going to be a thin layer, and not the core functionality of the API. The osmstats module should be renamed src (or something similar), with the test cases moved into a tests sub-directory. It's entirely likely the API may grow to contain many files for various API access, so they should be grouped under separate directories based on common functionality. Typing pytest at the top level or under tests should run all the tests.

osm-stats-api
  -> FastAPI (or front-end)
  -> src (or backend)
     -> validation
     -> mapathon
     -> merl
     -> etc...
  -> tests
     -> backend (or src)
     -> frontend (or FastAPI)

It's easier to do this refactoring now before there are more files. Good organization will make it easier for people new to this project to navigate through the code.

Use a polygon for all data filtering

Currently all the SQL queries are using countries, whereas the goal is to use any polygon as a boundary, not just countries. It's entirely possible people may want to use smaller boundaries for filtering data, like a state, county, or city. All the SQL queries need to support boundaries that aren't a country. A list of countries may very well be used as a menu on the website for users that do wish to use country boundaries, but the goal is to support any polygon.

Enhancement in Underpass

Currently, the stats in the Mapathon summary report generated from Insights and Underpass do not match; the cause of the bug needs to be found and fixed.
Related to #64

Document class design

There are many endpoints in the user stories, but they do break down into related groups of data queries. To avoid code duplication, a design document should identify base classes that can be shared by higher level classes. For example, related queries about user statistics might have similar functionality, and should share code that uses arguments to the methods to change behavior. This will help determine the implementation of the data flow of the API, which is better to do now while the project is small and fluid. While currently the endpoints are relatively simple, there will be growing complexity as more endpoints get implemented.

Add AGPLv3 to the code

Currently there is no software license for galaxy-api. After many months of discussion, AGPLv3 was chosen. An example of this is in the HOT qa-tiles repository: https://github.com/hotosm/hot-qa-tiles. This includes the copyright block at the top of all files, and a copy of the AGPL license in the top-level directory.

Rename osmstats to Galaxy

For better branding and less confusion with the older osm-stats database used by Leaderboard, every occurrence of osm-stats, osmstats, etc. should be renamed to 'galaxy'. This is the Galaxy API, and it does more than just osm-stats. This includes renaming the GitHub repository.

Mapathon summary report: Accessible by any user

An API endpoint that exports mapathon activity information requested by the user, including the number of contributors and the features created and modified, for a given period of time, by projects.

Any user, without logging in, can select a specific date range and different combinations of filters to generate mapathon summary information.

Endpoint: /mapathon/summary/

Data Source:

  • Should come from Underpass - for now relying on the validated data from Insights

ACCEPTANCE CRITERIA:

  1. An option to provide from and to timestamp filters in UTC
  • Validate that both from and to timestamps are provided - both fields are required parameters
  • The from/to interval should be less than or equal to 24 hrs
  2. An option to provide a hashtag filter
  • Text array - this will be a text field and we will internally handle the hashtag extraction
  • There is no limit on the number of hashtags
  • Not necessary if a Tasking Manager project ID filter is supplied
  3. An option to provide Tasking Manager project IDs
  • Integer array - this will be an integer field and we will internally handle the project ID extraction
  • Any non-integer values will throw a validation error
  • There is no limit on the number of project IDs
  • Not necessary if a hashtag filter is supplied
  4. Output file format
  • JSON - this is the default option

Output Schema

{
  "contributorsCount": integer,
  "mappedFeatures": [
    {
      "key": string,
      "Action": string,
      "count": number
    }
  ]
}
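The acceptance criteria above (both timestamps required, interval at most 24 hours, a hashtag or project ID filter present) could be enforced with a validator along these lines; this is a sketch, and all names and error messages are assumptions:

```python
from datetime import datetime

def validate_summary_request(from_ts, to_ts, hashtags=None, project_ids=None):
    """Hypothetical validator mirroring the acceptance criteria above."""
    fmt = "%Y-%m-%d %H:%M:%S"
    start = datetime.strptime(from_ts, fmt)  # both timestamps are required
    end = datetime.strptime(to_ts, fmt)
    if (end - start).total_seconds() > 24 * 3600:
        raise ValueError("from/to interval must be <= 24 hours")
    if not hashtags and not project_ids:
        raise ValueError("either a hashtag filter or projectIds is required")
    if project_ids and not all(isinstance(p, int) for p in project_ids):
        raise ValueError("projectIds must be integers")
    return True

validate_summary_request("2021-08-27 09:00:00", "2021-08-27 11:00:00",
                         hashtags=["mapandchathour2021"])
```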

Time input validation for Mapathon endpoints

Following our discussion at the scrum, we should make sure from and to timestamps are validated in the back end for the following:

  • The from timestamp should always be earlier than the to timestamp
  • Both the timestamps should never exceed the current timestamp

@d-rita - is already handling this on the client side

cc @JorgeMartinezG @itskshitiz321
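A minimal sketch of those backend checks (hypothetical helper, not the actual implementation; `now` is injectable to keep the sketch testable):

```python
from datetime import datetime

def validate_time_range(from_ts, to_ts, now=None):
    """From must precede to, and neither timestamp may lie in the future."""
    now = now or datetime.utcnow()
    if from_ts >= to_ts:
        raise ValueError("fromTimestamp must be earlier than toTimestamp")
    if to_ts > now:
        raise ValueError("timestamps must not exceed the current timestamp")
    return True

validate_time_range(datetime(2021, 9, 24, 7), datetime(2021, 9, 24, 16),
                    now=datetime(2021, 10, 1))
```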

Update API docs

Currently the API docs reference "osm stats". All occurrences should use "galaxy" instead.

Create a test suite for OSM Stats

The OSM Stats API needs a test suite so it can be validated via a CI system. This will primarily be unit level tests. Each class in the API module should have a test case that imports, and tests all the class methods. This should include using some bad data to test stability of the error handling.

This task will also require a small dedicated test database that can be created from within the test execution. Part of this task is deciding on the best Python testing framework. I use DejaGnu for other Python projects as it's very mature and maintained, but am open to other solutions.

As creating a test suite is a long-term project that never really ends, this task will be complete when there are tests for the currently existing functionality as of October 2021, which at this stage is still relatively small.

Mapathon Detailed Report: Accessible only for signed in users

An API endpoint that exports mapathon activity information requested by the user, which includes:

  • the contributors and their contribution count - features created and modified, tasks mapped and validated, given a certain period of time by projects.

As [Permissioned User], I can select a specific date range and select different combinations of filters so I can generate and download detailed mapathon information.

Endpoint: /mapathon/detailed/ POST request

Data Source:

  • Insights RDS
  • Tasking Manager RDS - make a read-only user to retrieve the data from RDS directly, so that there is less dependency on the TM API

Sample request body:

{
	"project_ids":[11224,10042,9906,1381,11203,10681,8055,8732,11193,7305,11210,10985,10988,11190,6658,5644,10913,6495,4229],
	"from_timestamp":"2021-08-27 09:00:00",
	"to_timestamp":"2021-08-27 11:00:00",
	"hashtags":["mapandchathour2021"]
}

ACCEPTANCE CRITERIA:

  1. An option to provide from and to timestamp filters in UTC
  • Validate that both from and to timestamps are provided - both fields are required parameters
  • The from/to interval should be less than or equal to 24 hrs
  2. An option to provide a hashtag filter
  • Text array - this will be a text field and we will internally handle the hashtag extraction
  • There is no limit on the number of hashtags
  • Not necessary if a Tasking Manager project ID filter is supplied
  3. An option to provide Tasking Manager project IDs
  • Integer array - this will be an integer field and we will internally handle the project ID extraction
  • Any non-integer values will throw a validation error
  • There is no limit on the number of project IDs
  • Not necessary if a hashtag filter is supplied
  4. An option to provide mapping types - to be discussed with stakeholders
  5. Let user download the result in file format
  • Binary field
  • Default value is false
  6. Output file format
  • JSON - this is the default option
  • CSV - for the download option
  • Caching - endpoint/frontend?
  7. Output Schema
    JSON
{
  "mappedFeatures": [
    {
      "key": string,
      "Action": string,
      "count": number,
      "Username": "user_1"
    },
    {
      "key": string,
      "Action": string,
      "count": number,
      "Username": "user_2"
    }
  ],
  X: [
    {
      "User":
      "Building_created":
      "building_modified":
      "Highways_length_created":
      "Tasks_mapped":
      "tasks_validated":
    }
  ]
}

Action = created, modified, deleted(?)

CSV_1 (User stats per feature)

User|Key|Action|Count
User_1|building|created|100
User_1|building|modified|50

CSV_2 (User stats grouped by mapping type)

User|buildings_created|buildings_modified|highway_length_created|amenity_created|amenity_modified|landuse_created|landuse_modified|tasks_mapped|tasks_validated

Bug in GitHub CI while running tests

Tests fail in CI while succeeding in the local environment, probably due to the PostGIS image I am using in GitHub CI. Need to figure it out.

Bug in Mapathon Query Generator

{
    "fromTimestamp": "2021-09-24T07:00:00.000",
    "toTimestamp": "2021-09-24T16:00:00.000",
    "projectIds": [
        11494,
        11495
    ],
    "hashtags": [
        ""
    ]
}

When these parameters are passed, even with blank or whitespace-only hashtags and project IDs, the system gets stuck and sends anonymous queries to the database.
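A hedged sketch of the input hygiene this fix implies: drop blank hashtags and non-numeric project IDs before any SQL is built. The helper name is hypothetical, not the actual galaxy-api code.

```python
def clean_params(project_ids, hashtags):
    """Drop non-numeric project IDs and blank/whitespace-only hashtags."""
    ids = [int(p) for p in project_ids if str(p).strip().isdigit()]
    tags = [h.strip() for h in hashtags if h and h.strip()]
    return ids, tags

ids, tags = clean_params([11494, 11495], [""])
# here tags comes back empty, so the caller can reject the filter set
# instead of building an unconstrained query
```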

SQL endpoint for TM project quality

Add a file & class for validation queries. There will be several of these for different types of validation analysis; this task is focused on just a single endpoint. This class should take a hashtag and polygon and query the validation table in osmstats. An additional parameter will also be needed to filter data by time interval. A time interval is an optional starting and ending time for the query. There are currently two values in the validation table, badvalue and badgeom. There are other optional values for this database column that will also need to be supported. These are incomplete and complete, which are used with the optional tag completeness validation, which isn't enabled by default at runtime.

The output of this query should be the location and other relevant tags to aid the human validator. Layered on top of this query is also support to export the data returned from the query as either CSV or GeoJson.

Filter data quality reports by OSM username: API endpoint for data quality by username

An API endpoint that filters out data quality issues for specific OSM usernames

This endpoint requires OSM Login

Endpoint: /data-quality/user-reports/ POST

Data Source:
Underpass

Sample request body:

{
  "Osm_usernames": [],
  "issue_type": [bad_geometry/bad_value/all],
  "From_timestamp": "2021-08-27 09:00:00",
  "to_timestamp": "2021-08-31 09:00:00",
  "Output_type": GeoJSON/CSV
}

Acceptance Criteria:

  • User feeds in one or more OSM usernames
    • Required input
    • Text array
    • An empty array should throw an error - 400 Bad Request
  • User can select the data quality issue from a list of options
    • Required input value
    • String array
      - bad_geometry
      - bad_value
      - all
    • Any value other than the options provided should throw an error - 400 Bad Request
    • An empty array should throw an error - 400 Bad Request
  • Timestamps
    • From_timestamp
      - UTC format
      - Required field
      - An empty value should throw an error
      - Other validation on timestamp - error if it exceeds the current time
    • To_timestamp
      - UTC format
      - Required field
      - An empty value should throw an error
      - Other validation on timestamp - error if it exceeds the current time
      - A maximum of 1 month/31 days difference between start and end dates
  • Output type
    • GeoJSON - default file format
    • CSV
Output file format:
GeoJSON
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {
        "Osm_id": int,
        "Changeset_id": int,
        "Changeset_timestamp": timestamp,
        "Issue_type": string
      },
      "geometry": {
        "type": "Point",
        "coordinates": [
          97.734375,
          58.26328705248601
        ]
      }
    }
  ]
}
CSV
OSM ID|Changeset ID|latitude|longitude|timestamp|Issue type

Make OSM Stats API work without FastAPI

Currently the OSM Stats API code is only usable via a REST request. We want an API that other projects can use, or write their own front-end. The goal of this is to separate the API code that queries the databases, so it can be imported into other Python projects. The code that supports our front-end will load the same module.

This task will be complete when a unit level test case for an endpoint works that can be run as part of a CI system.

Unit test for Database Connection and Query Running

Purpose :

Create a unit test class that will test the current database state and connection

Method :

pytest

Idea :

  • Test database that can be created from within the test execution
  • Create, insert, and search through test tables and clear everything after execution
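The idea can be sketched in pytest style. sqlite3 stands in here so the example runs anywhere; the real suite would point pytest at a throwaway Postgres database instead:

```python
import sqlite3

def test_insert_and_search():
    # Fresh in-memory database created from within the test execution.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'mapper')")
    row = conn.execute("SELECT name FROM users WHERE id = 1").fetchone()
    assert row == ("mapper",)
    conn.close()  # everything is cleared after execution

test_insert_and_search()  # pytest would discover and run this automatically
```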

Related Issue
