ecoindex_python_fullstack's People

Contributors

dependabot[bot], paulphpe, vvatelot

ecoindex_python_fullstack's Issues

CI Todo

  • Project quality on merge on main -> Generate Badge
  • Push docker images latest when not draft
  • Update Docker Images documentation
  • Stale action

Exclude a list of hosts from analysis

Sometimes we want to exclude some hosts from analysis; for example, when the API is deployed in production, we don't want to allow analysis of localhost...
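A minimal sketch of such an exclusion check (the `EXCLUDED_HOSTS` set and `is_host_allowed` helper are hypothetical names, not the project's actual API):

```python
from urllib.parse import urlparse

# Hypothetical exclusion list -- illustrative only, not the project's API
EXCLUDED_HOSTS = {"localhost", "127.0.0.1"}

def is_host_allowed(url: str) -> bool:
    # Reject analysis requests whose hostname is on the exclusion list
    host = urlparse(url).hostname or ""
    return host not in EXCLUDED_HOSTS
```

In practice the list would likely come from configuration rather than a hard-coded set.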

[Bug]: Docker not working on Mac

What happened?

Docker images do not work on the Mac ARM platform, for either the API or the CLI image.

We need to automate building and publishing the images with a GitHub Action.

Project

Ecoindex API

What OS do you use?

Mac

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: Error with mp3

What happened?

Error when trying to analyze an mp3 file

Project

Ecoindex Scraper

What OS do you use?

No response

urls

https://www.pascalfaure.com/hyper_relax/01_relaxation_corporelle.mp3

Relevant log output

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/celery/app/trace.py", line 477, in trace_task
    R = retval = fun(*args, **kwargs)
                 ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/celery/app/trace.py", line 760, in __protected_call__
    return self.run(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/celery/app/autoretry.py", line 60, in run
    ret = task.retry(exc=exc, **retry_kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/celery/app/task.py", line 736, in retry
    raise_with_context(exc)
  File "/usr/local/lib/python3.12/site-packages/celery/app/autoretry.py", line 38, in run
    return task._orig_run(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ecoindex/worker/tasks.py", line 36, in ecoindex_task
    queue_task_result = run(
                        ^^^^
  File "/usr/local/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/asyncio/base_events.py", line 684, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ecoindex/worker/tasks.py", line 52, in async_ecoindex_task
    ecoindex = await EcoindexScraper(
               ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ecoindex/scraper/scrap.py", line 48, in get_page_analysis
    page_metrics = await self.scrap_page()
                   ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ecoindex/scraper/scrap.py", line 90, in scrap_page
    return PageMetrics(
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/pydantic/main.py", line 164, in __init__
    __pydantic_self__.__pydantic_validator__.validate_python(data, self_instance=__pydantic_self__)
ValueError

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: Sitemap only found 1 URL in Ecoindex-cli

What happened?

Steps to reproduce:
alias ecoindex-cli="docker run -it --rm --add-host=host.docker.internal:host-gateway -v /tmp/ecoindex-cli:/tmp/ecoindex-cli vvatelot/ecoindex-cli:2.26.0 ecoindex-cli"
ecoindex-cli analyze --url https://simbios.fr --sitemap https://simbios.fr/sitemap.xml --export-format json

Only 1 URL was found, while the sitemap contains more than 50 entries: https://simbios.fr/sitemap.xml
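One classic cause of this symptom is a sitemap parse that ignores the sitemap XML namespace and therefore finds almost no `<loc>` entries. A minimal stdlib sketch of a namespace-aware parse (illustrative only, not the CLI's actual implementation):

```python
import xml.etree.ElementTree as ET

# The sitemaps.org namespace must be given explicitly, otherwise
# findall() on bare "loc" tags silently matches nothing.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def extract_urls(sitemap_xml: str) -> list[str]:
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://simbios.fr/</loc></url>
  <url><loc>https://simbios.fr/contact</loc></url>
</urlset>"""
```

Sitemap index files (`<sitemapindex>`) would additionally need to be fetched and expanded recursively.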

Project

Ecoindex CLI

What OS do you use?

Linux

urls

https://simbios.fr/sitemap.xml

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: Local analysis with the CLI not working

What happened?

Analyzing a localhost page fails when running the CLI with Docker.

The host network configuration should be documented in the README.
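A possible README addition (illustrative, using standard Docker networking flags):

```shell
# Let the container reach services running on the host machine,
# then target host.docker.internal instead of localhost:
docker run -it --rm --add-host=host.docker.internal:host-gateway \
  vvatelot/ecoindex-cli ecoindex-cli analyze --url http://host.docker.internal:8000

# Linux only: share the host's network stack so localhost resolves directly
docker run -it --rm --network host \
  vvatelot/ecoindex-cli ecoindex-cli analyze --url http://localhost:8000
```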

Project

Ecoindex CLI

What OS do you use?

No response

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

Migration TODO

  • Template PR and issues
  • Testing
  • Refactoring API dependencies (sort...)
  • BFF methods
  • script update swagger
  • auto update version api
  • Record DB
  • Health
  • Refactor routers dependencies (pagination, host, date)
  • Finish migration pydantic v2
  • Migrate List to list...
  • Readme badges
  • Main readme / disclaimer
  • Sentry / Glitchtip
  • Github actions
    • Build publish docker API (workflow dispatch)
    • Build publish docker CLI (workflow dispatch)
    • Build publish ecoindex compute package
    • Build publish ecoindex scraper package
  • Tests
  • Start-dev: auto reload

Commands

CLI

Build docker image

cd projects/ecoindex_cli
poetry lock
poetry build-project
docker build -t ecoindex-cli:playwright --build-arg="wheel=ecoindex_cli-2.23.0-py3-none-any.whl" .

API

Build docker image

cd projects/ecoindex_api
poetry lock
poetry build-project
docker build -t ecoindex-api-backend --build-arg="wheel=ecoindex_api-3.1.0-py3-none-any.whl" -f docker/backend/dockerfile .
docker build -t ecoindex-api-worker --build-arg="wheel=ecoindex_api-3.1.0-py3-none-any.whl" -f docker/worker/dockerfile .
docker compose up

Create migration

cd projects/ecoindex_api
poetry run alembic revision --autogenerate -m "Migration name"

Github actions

Pull request

  • Check conventional commit
  • Add labels
  • Check quality
    • Linting: ruff, mypy
    • Tests
    • Coverage

Build deploy compute

  • Trigger on workflow_dispatch with inputs:
    • bump_type: major, minor, patch
  • Actions:
    • Checkout
    • Setup python
    • Setup poetry:
      • pip install poetry
      • poetry config virtualenvs.create false
    • Setup poetry multi project:
      • poetry self add poetry-multiproject-plugin
    • Get previous version:
      • echo "previous_version=$(poetry version)" >> $GITHUB_OUTPUT
    • Bump version
    • Bump ecoindex_compute_version in components/ecoindex/data/__init__.py
    • Commit + Push changes:
      • poetry version ${{ inputs.bump_type }}
      • echo "new_version=$(poetry version)" >> $GITHUB_OUTPUT
      • chore(compute): bump version to ${version}
    • Build project:
      • poetry build-project
    • Publish to pypi:
    • Create tag:
      • v${version}-compute
      • target: ${{ github.ref }}
      • force: true
    • Create changelog:
      • Exclude scopes: api, cli, scraper
      • From: v${version}-compute
      • To: v${previous_version}-compute
    • Create release:
      • Body: changelog
      • Title: [compute] ${version}

[Bug]: Wrong total host count when filtered

What happened?

The total count matches the number of analyses, not the number of hosts...

Project

Ecoindex API

What OS do you use?

No response

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

Help - Understanding the page weight calculation

Hello,

Could you clarify which elements ecoindex-cli takes into account when computing the page weight metric from the command line?

Does everything count towards the total: JS, CSS, images (eager, lazy; is everything listed in a srcset source counted too, even when the analysis runs at a specific resolution?), the HTML page, internal fonts?

Internal resources only? Internal and external?

Thanks!

Help - Running ecoindex_cli from this monorepo

Hello,

I'm not familiar with poetry tasks, and I couldn't figure out how to run the ecoindex_cli client for my analyses the way the standalone repo https://github.com/cnumr/ecoindex_cli allows.

I tried

`ubuntu@xxxx:~/ecoindex_python_fullstack/projects/ecoindex_cli$ task docker-build
Updating dependencies
Resolving dependencies... (5.1s)

The command "build-project" does not exist.
exit status 1
ubuntu@xxxx:~/ecoindex_python_fullstack/projects/ecoindex_cli$ `

without much success.
Could you help me?

Thanks!
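The "build-project" command comes from a Poetry plugin. Mirroring the CI setup steps listed later on this page, a possible fix is installing that plugin before running the build task:

```shell
# build-project is provided by the poetry-multiproject-plugin
# (same steps the project's GitHub Actions use)
pip install poetry
poetry self add poetry-multiproject-plugin
cd projects/ecoindex_cli
poetry build-project
```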

[Bug]: EcoIndex Scraper error on some URLs

What happened?

The library works correctly on some sites, but I unfortunately ran into a problem with certain URLs.

Exception message: "'charmap' codec can't decode byte 0x9d in position 766503: character maps to <undefined>"

Test code used to run EcoIndex Scraper:

import asyncio

from ecoindex.scraper import EcoindexScraper

def main():
    print()
    print("ECOINDEX ANALYSIS")
    print()

    url = "https://www.orange.com/fr"
    try:
        page_analysis = asyncio.run(EcoindexScraper(url=url).get_page_analysis())
        print(page_analysis.score)
        print(page_analysis.ges)
        print(page_analysis.water)
    except Exception as e:
        print("Error while running the EcoIndex scraper")
        print(e)

if __name__ == "__main__":
    main()

Project

Ecoindex Scraper

What OS do you use?

Windows

urls

https://www.orange.com/fr
https://www.businessdecision.com/fr-fr

Relevant log output

*************************
ECOINDEX ANALYSIS
*************************
Error on execute EcoIndex scrapper
'charmap' codec can't decode byte 0x9d in position 766503: character maps to <undefined>
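A minimal reproduction of this failure mode, assuming (hedged: not confirmed from the scraper's code) that the page's UTF-8 bytes get decoded with the Windows locale codec (cp1252) instead of an explicit encoding:

```python
# A curly right quote encodes to three UTF-8 bytes, one of which (0x9d)
# has no mapping in cp1252 -- the default locale codec on Windows.
raw = "\u201d".encode("utf-8")  # b'\xe2\x80\x9d'

try:
    raw.decode("cp1252")        # what an implicit Windows-side decode does
    msg = ""
except UnicodeDecodeError as exc:
    msg = str(exc)              # "'charmap' codec can't decode byte 0x9d ..."

text = raw.decode("utf-8")      # an explicit encoding avoids the error
```

Passing `encoding="utf-8"` (or decoding with `errors="replace"`) wherever the page content is read would sidestep the locale dependence.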

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: Pydantic serialization error

What happened?

A bug occurs in the BFF endpoint /{version}/ecoindexes/latest

Project

Ecoindex API

What OS do you use?

No response

urls

No response

Relevant log output

ecoindex-api-backend  |   Expected `datetime` but got `str` - serialized value may not be as expected
ecoindex-api-backend  |   Expected `uuid` but got `str` - serialized value may not be as expected
ecoindex-api-backend  |   ... (the two warnings above repeat for each returned item)
ecoindex-api-backend  |   return self.serializer.to_python(
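The warnings suggest raw database rows (plain `str` values) are handed to a response model that declares `datetime`/`uuid` fields without re-validation. A stdlib sketch of the kind of coercion that would silence them (`coerce_row` and the field names are hypothetical, not the API's actual code):

```python
from datetime import datetime
from uuid import UUID

def coerce_row(row: dict) -> dict:
    # Convert raw string values back to the types the response model
    # declares, so serialization no longer sees an unexpected `str`.
    out = dict(row)
    if isinstance(out.get("id"), str):
        out["id"] = UUID(out["id"])
    if isinstance(out.get("date"), str):
        out["date"] = datetime.fromisoformat(out["date"])
    return out
```

Alternatively, re-validating rows through the model (instead of constructing it without validation) lets pydantic perform these conversions itself.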

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: aiomysql

What happened?

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/celery/app/trace.py", line 477, in trace_task
    R = retval = fun(*args, **kwargs)
                 ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/celery/app/trace.py", line 760, in __protected_call__
    return self.run(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/celery/app/autoretry.py", line 60, in run
    ret = task.retry(exc=exc, **retry_kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/celery/app/task.py", line 736, in retry
    raise_with_context(exc)
  File "/usr/local/lib/python3.12/site-packages/celery/app/autoretry.py", line 38, in run
    return task._orig_run(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ecoindex/worker/tasks.py", line 38, in ecoindex_task
    queue_task_result = run(async_ecoindex_task(self, url, width, height))
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/asyncio/base_events.py", line 664, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ecoindex/worker/tasks.py", line 47, in async_ecoindex_task
    await check_quota(host=urlparse(url=url).netloc)
  File "/usr/local/lib/python3.12/site-packages/ecoindex/backend/utils/__init__.py", line 86, in check_quota
    count_daily_request_per_host = await get_count_daily_request_per_host(host=host)
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ecoindex/database/repositories/ecoindex.py", line 110, in get_count_daily_request_per_host
    results = await db.execute(statement)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlmodel/ext/asyncio/session.py", line 145, in execute
    return await super().execute(
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/ext/asyncio/session.py", line 455, in execute
    result = await greenlet_spawn(
             ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/util/_concurrency_py3k.py", line 190, in greenlet_spawn
    result = context.throw(*sys.exc_info())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlmodel/orm/session.py", line 129, in execute
    return super().execute(
           ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 2308, in execute
    return self._execute_internal(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 2190, in _execute_internal
    result: Result[Any] = compile_state_cls.orm_execute_statement(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/context.py", line 293, in orm_execute_statement
    result = conn.execute(
             ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1416, in execute
    return meth(
           ^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 516, in _execute_on_connection
    return connection._execute_clauseelement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1639, in _execute_clauseelement
    ret = self._execute_context(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1848, in _execute_context
    return self._exec_single_context(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1988, in _exec_single_context
    self._handle_dbapi_exception(
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2346, in _handle_dbapi_exception
    raise exc_info[1].with_traceback(exc_info[2])
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1969, in _exec_single_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 922, in do_execute
    cursor.execute(statement, parameters)
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/aiomysql.py", line 93, in execute
    return self.await_(self._execute_async(operation, parameters))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/util/_concurrency_py3k.py", line 125, in await_only
    return current.driver.switch(awaitable)  # type: ignore[no-any-return]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/util/_concurrency_py3k.py", line 185, in greenlet_spawn
    value = await result
            ^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/dialects/mysql/aiomysql.py", line 102, in _execute_async
    result = await self._cursor.execute(operation, parameters)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/aiomysql/cursors.py", line 239, in execute
    await self._query(query)
  File "/usr/local/lib/python3.12/site-packages/aiomysql/cursors.py", line 457, in _query
    await conn.query(q)
  File "/usr/local/lib/python3.12/site-packages/aiomysql/connection.py", line 469, in query
    await self._read_query_result(unbuffered=unbuffered)
  File "/usr/local/lib/python3.12/site-packages/aiomysql/connection.py", line 683, in _read_query_result
    await result.read()
  File "/usr/local/lib/python3.12/site-packages/aiomysql/connection.py", line 1164, in read
    first_packet = await self.connection._read_packet()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/aiomysql/connection.py", line 609, in _read_packet
    packet_header = await self._read_bytes(4)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/aiomysql/connection.py", line 657, in _read_bytes
    data = await self._reader.readexactly(num_bytes)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/asyncio/streams.py", line 734, in readexactly
    await self._wait_for_data('readexactly')
  File "/usr/local/lib/python3.12/asyncio/streams.py", line 527, in _wait_for_data
    await self._waiter
RuntimeError: Task <Task pending name='Task-268' coro=<async_ecoindex_task() running at /usr/local/lib/python3.12/site-packages/ecoindex/worker/tasks.py:47> cb=[_run_until_complete_cb() at /usr/local/lib/python3.12/asyncio/base_events.py:180]> got Future <Future pending> attached to a different loop
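The final RuntimeError is the key: each `asyncio.run()` call inside the Celery task creates a fresh event loop, so any awaitable or pooled DB connection created under a previous loop cannot be awaited later. A minimal stdlib reproduction of that failure mode (a sketch of the mechanism, not of the worker's actual code):

```python
import asyncio

async def make_future():
    # Create a Future bound to the currently running loop
    return asyncio.get_running_loop().create_future()

async def await_foreign(fut):
    # Awaiting a Future that belongs to another loop raises RuntimeError
    try:
        await fut
    except RuntimeError as exc:
        return str(exc)
    return "no error"

foreign = asyncio.run(make_future())           # bound to loop #1 (now closed)
message = asyncio.run(await_foreign(foreign))  # loop #2 rejects it
```

A likely direction for a fix is ensuring the async engine/session (and aiomysql connections behind it) are created inside the same `asyncio.run()` call that uses them, rather than kept at module level across task invocations.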

Project

Ecoindex API

What OS do you use?

No response

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

[Bug]: iframe content not accounted for when calculating ecoindex

What happened?

ecoindex_cli does not take the content of iframes into account when calculating the ecoindex of a page.

I'm using the files from the attached testcase.zip, served locally on port 8080.
I analyze the page with the following command and get this result:

ecoindex-cli analyze --url http://127.0.0.1:8080/testEcoIndexIFrame.html --export-format json --outputfile ./result.json

result:

[
    {
        "width": 1920,
        "height": 1080,
        "url": "http://127.0.0.1:8080/testEcoIndexIFrame.html",
        "size": 2.511,
        "nodes": 6,
        "requests": 3,
        "grade": "A",
        "score": 97.0,
        "ges": 1.06,
        "water": 1.59,
        "ecoindex_version": "5.4.1",
        "date": "2023-02-27 17:50:40.593434",
        "page_type": null
    }
]

The number of DOM nodes detected by ecoindex-cli is 6, which matches the content of the main HTML file but does not include the content of the HTML inside the iframe.
The number of requests and the size seem to be correct, though: when I increase the subPage.html page's size, the size reported by ecoindex-cli increases as well.

On some real-life pages, this can make a huge difference in the final rank obtained when analyzing a page (from G to E).
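A toy illustration (stdlib only, not the scraper's actual mechanism) of why per-frame counts need to be aggregated: counting element nodes in the main document alone misses the iframe's entire DOM.

```python
from html.parser import HTMLParser

class NodeCounter(HTMLParser):
    """Counts element nodes, roughly what the DOM-node metric measures."""
    def __init__(self):
        super().__init__()
        self.count = 0
    def handle_starttag(self, tag, attrs):
        self.count += 1

def count_nodes(html_doc: str) -> int:
    counter = NodeCounter()
    counter.feed(html_doc)
    return counter.count

main_doc = "<html><body><p>hi</p><iframe src='sub.html'></iframe></body></html>"
sub_doc = "<html><body><ul><li>a</li><li>b</li></ul></body></html>"

main_only = count_nodes(main_doc)         # counting the main frame alone
total = main_only + count_nodes(sub_doc)  # main frame + iframe document
```

In a browser-driven scraper this would presumably translate to summing the node counts of every frame in the page (the main frame plus each iframe's document) rather than evaluating only on the top-level document.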

Version

Above 3.6

What OS do you use?

Linux

urls

No response

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct
