dynamist / subgit
Sub-git repo handler
License: Apache License 2.0
We need an integration test script that can be rerun locally any number of times and that runs all commands in a few different ways, including corner cases, just to exercise which commands should be possible to run in which situations. This script or feature is not something done within pytest/unittest, as it should be a shell script, or possibly an invoke script, that really runs commands like `subgit status` or `subgit pull`.
This will require a set of different config files that the script can run and use.
This feature is needed to complement the normal unittests: doing complex integration tests within this type of code is difficult, and it is much cheaper, simpler, and more effective to run a series of cli commands designed to test this at an integration test level instead.
When running the `subgit init` command you are rewarded with an empty config file. This doesn't help the user of subgit in any usable way. We should look to `vagrant init` as a good example of how to do this.
Current behaviour:
```
$ subgit init
$ cat .subgit.yml
repos: {}
```
Expected behaviour:
```
$ subgit init
$ cat .subgit.yml
repos:
  subgit:
    url: [email protected]:dynamist/subgit.git
    revision:
      branch: master
```
We could even include comments for each key (`url`, `revision`, etc).
Run bandit checks and add appropriate exclusions to the source code.
To better and more simply handle the hierarchy, the clone order, and cases where we might want to clone a second repo into an already cloned folder, it would make much more sense, and be simpler to process and parse, if the user defined the repo ordering themselves.
An example of the new format would be:
```
repos:
  - name: books
    url: git@.../books.git
    revision:
      branch: master
  - name: windows
    url: git@.../windows.git
    revision:
      branch: master
```
Plenty of things are affected by this; here are a few that probably need to be rewritten.
A possibly good idea within these kinds of changes is to add a dedicated, separate migration section to the documentation that describes how a user migrates between different versions of the syntax. Another idea is to implement "versions" of the syntax that can be used to validate or check whether this version of the cli actually works with or supports a specific version of the config file. That, however, might be another ticket and probably should not be done here right now.
We need this change to better and more simply support the other feature described in #33.
So @naestia had the idea that we should be able to generate or "import" all git repos that are available from either a user account or an organisation account.
We already have an example script on the side that supports this for our own needs, but it should be supported in the main tool if possible.
Implement cli commands similar to this:
```
subgit import github <username-or-organisation> <output-file>
subgit import gitlab <username-or-organisation-id> <output-file>
```
where we also define the output file. Note that we need protection against overwriting an existing configuration file, or at least to ask the user first. The output file should maybe be optional and default to `.subgit.yml`.
The things we should parse out of each repo are the clone url, repo name, and default branch name, and save them in the configuration file. Anything else is redundant by default and needs to be added manually by the user afterwards.
One critical thing here: any dependencies we need to make these commands work, like git-python, or any other libraries depending on what the script translation requires, should be put in the `extras_require` section of the setup.py file, and the commands should be enabled/disabled based on whether the python imports exist and can be done.
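A minimal sketch of the translation and overwrite-protection steps described above. The record fields (`name`, `ssh_url`, `default_branch`) mirror the shape of the GitHub API response but are assumptions here, as are the function names:

```python
import os

def build_subgit_config(api_repos):
    """Translate repo records (as returned by the GitHub or GitLab API)
    into the subgit config structure. Only the clone url, repo name and
    default branch are kept; everything else is dropped and left for
    the user to add manually."""
    return {
        "repos": {
            repo["name"]: {
                "url": repo["ssh_url"],
                "revision": {"branch": repo["default_branch"]},
            }
            for repo in api_repos
        }
    }

def guard_output_file(output_file=".subgit.yml", force=False):
    """Overwrite protection: refuse when the output file already exists,
    unless the user explicitly confirmed or forced it."""
    if os.path.exists(output_file) and not force:
        raise FileExistsError(
            f"{output_file} already exists, refusing to overwrite"
        )
```

The config dict would then be serialized to YAML and written to the output file.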
There is a need for a command that will print a specific status report, simulating the same kind of features that `git status` does, but somewhat limited.
Procedure that should be done:
sgit would have the option of cloning objects to a subfolder, something like the following:
```
sgit repo add --path subfolder/ <name> <url> <rev>
```
where `--path` defaults to `./`.
If nested repos are used, the closest root folder should be cloned first:
```
sgit repo add --path repo/ subrepo https://git.example.com/subrepo.git master
sgit repo add --path . repo https://git.example.com/repo.git master
```
In the above case, `repo.git` should be cloned first, since `subrepo.git` depends on it path-wise locally.
Disallowed characters or paths: `/../` anywhere in the path should be rejected, and working outside the initialized folder shouldn't be allowed.
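A sketch of the path restriction above, assuming Python 3.9+ for `Path.is_relative_to`; the function name and error handling are hypothetical:

```python
from pathlib import Path

def validate_clone_path(root, path):
    """Reject paths containing '..' components and paths that would
    resolve outside the initialized root folder."""
    if ".." in Path(path).parts:
        raise ValueError(f"'..' is not allowed in clone paths: {path}")
    resolved = (Path(root) / path).resolve()
    if not resolved.is_relative_to(Path(root).resolve()):
        raise ValueError(f"path escapes the initialized folder: {path}")
    return resolved
```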
```
$ mkdir tmp
$ sgit init
$ sgit repo add tmp/foobar [email protected]:dynamist/foobar.git master
$ sgit update
DEBUG: Repo update - all
Are you sure you want to update the following repos "foo, bar, baz, tmp/foobar"
(y/n) << y
TODO: Parse for any changes...
DEBUG: Handling branch update case
Traceback (most recent call last):
  File "/home/nn/code/sgit/sgit/cli.py", line 186, in cli_entrypoint
    exit_code = run(cli_args, sub_args)
  File "/home/nn/code/sgit/sgit/cli.py", line 172, in run
    retcode = core.update(repo)
  File "/home/nn/code/sgit/sgit/core.py", line 319, in update
    repo.create_head(f'{branch_revision}', f'origin/{branch_revision}')
  File "/home/nn/code/sgit/.venv/lib/python3.6/site-packages/git/repo/base.py", line 386, in create_head
    return Head.create(self, path, commit, force, logmsg)
  File "/home/nn/code/sgit/.venv/lib/python3.6/site-packages/git/refs/symbolic.py", line 543, in create
    return cls._create(repo, path, cls._resolve_ref_on_create, reference, force, logmsg)
  File "/home/nn/code/sgit/.venv/lib/python3.6/site-packages/git/refs/symbolic.py", line 506, in _create
    (full_ref_path, existing_data, target_data))
OSError: Reference at 'refs/heads/master' does already exist, pointing to '631771e9b527cb8b1ef92735b936744270f363ee', requested was '4a29994ad13964de9f869b82873482cdcf03b894'
> /home/nn/code/sgit/.venv/lib/python3.6/site-packages/git/refs/symbolic.py(506)_create()
-> (full_ref_path, existing_data, target_data))
```
After #59 was merged we get this error when using the old config file format.
```
❯ subgit pull
Exception type : TypeError
EXCEPTION MESSAGE: string indices must be integers
To get more detailed exception set environment variable 'DEBUG=1'
To PDB debug set environment variable 'PDB=1'
Here is the traceback:
Traceback (most recent call last):
[...]
  File "/home/nn/code/github.com/dynamist/subgit/subgit/core.py", line 354, in pull
    active_repos = self._get_active_repos(config)
  File "/home/nn/code/github.com/dynamist/subgit/subgit/core.py", line 335, in _get_active_repos
    repo_name = repo_data["name"]
TypeError: string indices must be integers
```
Expected result: the help in `sgit --help` says "Initialize a new subgit repo", so I expect the command to initialize a new subgit repo.
Actual result:
```
$ sgit init
Unable to find a config file in any directory...
$ sgit init https://github.com/dynamist/subgit.git
Unable to find a config file in any directory...
$ echo $?
1
```
To reproduce:
```
pip3 install --user git+https://github.com/dynamist/sgit.git
rh@rh ~ mkdir /tmp/sgit-test
rh@rh ~ cd /tmp/sgit-test
rh@rh /tmp/sgit-test sgit init
Successfully wrote new config file ".sgit.yml" to disk
rh@rh /tmp/sgit-test sgit repo add sgit https://github.com/dynamist/sgit.git master
Successfully added new repo "sgit"
rh@rh /tmp/sgit-test sgit update
DEBUG: Repo update - all
Are you sure you want to update the following repos "sgit"
(y/n) << y
Traceback (most recent call last):
  File "/home/rh/.local/lib/python3.6/site-packages/sgit/cli.py", line 185, in cli_entrypoint
    exit_code = run(cli_args, sub_args)
  File "/home/rh/.local/lib/python3.6/site-packages/sgit/cli.py", line 171, in run
    retcode = core.update(repo)
  File "/home/rh/.local/lib/python3.6/site-packages/sgit/core.py", line 253, in update
    repo = Repo(repo_path)
  File "/home/rh/.local/lib/python3.6/site-packages/git/repo/base.py", line 130, in __init__
    raise NoSuchPathError(epath)
git.exc.NoSuchPathError: /tmp/sgit-test/sgit
> /home/rh/.local/lib/python3.6/site-packages/git/repo/base.py(130)__init__()
-> raise NoSuchPathError(epath)
rh@rh /tmp/sgit-test
```
We need to rename the `import` command to avoid the confusion that "import" might be understood as importing something into github/gitlab, rather than importing something into a config file.
The current best suggestion is to rename it to `inspect`.
Right now the update command will clone and fetch/hard reset, but we need a separate command that runs `git fetch` in all repos without modifying them the way `sgit update` would. In addition to fetch, implement `-p`, as that is common, or try to find some way to pass arbitrary flags through to the fetch command.
When running the method `def _check_remote(self, repo):` on a repo with a specific setup, it will trigger a false-positive.
Say we have a cloned repo, e.g. pykwalify, that has been around for some time. Then a feature branch named `f/foobar` is removed on the remote because it was merged in a github PR. The next time `_check_remote` is called we will get a diff, because the logic is written so that any mismatch between local and remote is assumed to mean something is wrong, and this diff is treated the same as if the repo had a commit ahead of the remote or an uncommitted local file.
The fix is probably to make the checks in the method much more specific, with corner cases that should not be treated as a dirty repo.
One possible exception: if I manually create a local branch that is not pushed to the remote, the repo could legitimately be seen as dirty, counter to the above.
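One way to make the check more specific is to classify the branch-name difference instead of flagging any mismatch. A sketch operating on plain name sets (the real `_check_remote` works on GitPython ref objects, so this helper is hypothetical):

```python
def classify_branch_diff(local_branches, remote_branches):
    """Classify branch-name differences between local and remote
    instead of treating any mismatch as a dirty repo."""
    local = set(local_branches)
    remote = set(remote_branches)
    return {
        # exists only locally: either a stale tracking copy of a branch
        # deleted on the remote (e.g. f/foobar merged in a PR), which
        # should NOT count as dirty, or a legitimately unpushed local
        # branch, which should
        "local_only": sorted(local - remote),
        # exists only on the remote: nothing wrong locally
        "remote_only": sorted(remote - local),
        "in_sync": sorted(local & remote),
    }
```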
So right now we have no way from the commandline to choose any file other than the default. The Core class already supports this feature, but the cli needs to be extended as well; we have many configuration files that we can't really use because we have no way to point subgit at another file on demand.
In the hierarchy of cli parsers it needs to be figured out whether we should put it in the global scope or in the local command scope. It is simpler to put it once in the global scope; the downside is that a global flag can't be used anywhere we like within the cli arguments.
The downside of putting it on all leaf parsers is that it is redundant and requires several duplicate implementations. This could possibly spark a merge of all cli parsers into one parser, as we no longer have as many leaf commands as we used to.
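A sketch of the global-scope placement with argparse; the flag name `-c/--config` is an assumption. It also demonstrates the downside mentioned above: the flag must come before the subcommand:

```python
import argparse

def build_parser():
    """Top-level parser with a global --config flag, one of the two
    placements discussed above."""
    parser = argparse.ArgumentParser(prog="subgit")
    parser.add_argument(
        "-c", "--config", default=".subgit.yml",
        help="path to an alternative config file",
    )
    sub = parser.add_subparsers(dest="command")
    sub.add_parser("pull")
    sub.add_parser("fetch")
    return parser

# Works: subgit -c other.yml pull
# Fails: subgit pull -c other.yml  (flag only exists on the top parser)
args = build_parser().parse_args(["-c", "other.yml", "pull"])
print(args.config, args.command)
```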
Right now everything is pretty much jumbled into one place: we do all repo tracking and parsing within the Core class.
What I want to change is to split the logic into two classes: we make a wrapper around git-python's Repo class, our own SgitRepo class, where we put most of the logic we add that deals with the repo itself.
We then keep most of the cli business logic within the Core class as today, but try to push down as much code as possible into SgitRepo to reduce complexity.
Note that we also want to avoid re-coding the same things all over; many commands already do things like resetting, tracking config, custom state handling, and checking if a repo is dirty. This belongs in SgitRepo, as many different commands will want to re-use it.
I have previously executed `subgit pull` using this configuration file (`.subgit.yml`):
```
repos:
  adminbooks:
    url: [email protected]:org/abooks.git
    revision:
      branch: master
  tbooks:
    url: [email protected]:org/tbooks.git
    revision:
      branch: master
```
A few weeks later (after more commits have been pushed), I run into a problem running `subgit pull` again.
Invocation with `DEBUG=1`:
```
DEBUG - git.cmd:976 - Popen(['git', 'fetch', '-v', '--', 'origin'], cwd=/opt/documents/tbooks, universal_newlines=True, shell=None, istream=None)
DEBUG - subgit.core:442 - Handling branch pull case
DEBUG - git.cmd:976 - Popen(['git', 'cat-file', '--batch-check'], cwd=/opt/documents/tbooks, universal_newlines=False, shell=None, istream=<valid stream>)
Traceback (most recent call last):
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/subgit/cli.py", line 309, in cli_entrypoint
    exit_code = run(cli_args, sub_args)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/subgit/cli.py", line 250, in run
    retcode = core.pull(repos)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/subgit/core.py", line 448, in pull
    repo.create_head(f"{branch_revision}", f"origin/{branch_revision}")
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/git/repo/base.py", line 490, in create_head
    return Head.create(self, path, commit, logmsg, force)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/git/refs/symbolic.py", line 615, in create
    return cls._create(repo, path, cls._resolve_ref_on_create, reference, force, logmsg)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/git/refs/symbolic.py", line 567, in _create
    raise OSError(
OSError: Reference at 'refs/heads/master' does already exist, pointing to '7f0b7fd89dc7134d6deaf574f04e7321d6d75cc7', requested was 'e8d505b6f69c8c3ac9efac863773e3d2fe13c1f1'
Traceback (most recent call last):
  File "/home/nn/.local/bin/subgit", line 8, in <module>
    sys.exit(cli_entrypoint())
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/subgit/cli.py", line 309, in cli_entrypoint
    exit_code = run(cli_args, sub_args)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/subgit/cli.py", line 250, in run
    retcode = core.pull(repos)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/subgit/core.py", line 448, in pull
    repo.create_head(f"{branch_revision}", f"origin/{branch_revision}")
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/git/repo/base.py", line 490, in create_head
    return Head.create(self, path, commit, logmsg, force)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/git/refs/symbolic.py", line 615, in create
    return cls._create(repo, path, cls._resolve_ref_on_create, reference, force, logmsg)
  File "/home/nn/.local/pipx/venvs/subgit/lib/python3.8/site-packages/git/refs/symbolic.py", line 567, in _create
    raise OSError(
OSError: Reference at 'refs/heads/master' does already exist, pointing to '7f0b7fd89dc7134d6deaf574f04e7321d6d75cc7', requested was 'e8d505b6f69c8c3ac9efac863773e3d2fe13c1f1'
```
This is a rewrite of #2 with some updates.
Right now we only have the ability to clone into the same folder, but we want to be able to clone into a sub-folder if possible.
Extend the configuration to allow the option `clone-path: <folder-path>` for any repo we define.
If this is set, we should attempt to clone the git repo into that folder. But there are two different cases here.
The first is that we might want to clone one git repo into the already cloned folder of another repo. Say we have the repo `pykwalify/` already cloned to disk, and we want to clone another repo, `redis-py-cluster`, into the path `pykwalify/redis-cluster/` instead of the root folder we stand in. The problem is that we need some kind of dependency resolution that says in what order repos should be processed: we can't create a random folder, clone `redis-py-cluster` into it first, and then clone `pykwalify/` over the same folder. On top of this, our configuration file is a nested dict that does not always guarantee or preserve insert order, so we can't guarantee that pykwalify will always be processed before redis-py-cluster, for instance.
If we go down the dependency-order lane, keep it as simple as possible: dependencies are defined as one repo "depends on" another repo, and the tool simply re-orders the execution to fulfill the dependencies, then executes the commands in that order.
The second alternative is to ignore dependencies entirely and hope that subgit executes things in the correct order. This might cause issues and problems if we leave it to chance. The upside is that this is far simpler to implement: we could ship the feature now and add dependency handling later to move the tool forward.
Disallowed characters or paths: `/../` anywhere in the path should be rejected, and working outside the initialized folder shouldn't be allowed.
We also might need to sort out how to deal with moving an already cloned folder when things are reorganized. This might mean implementing state tracking of already executed commands. This needs further investigation.
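Without explicit dependency declarations, a simple ordering heuristic is to process shallower clone paths first, which covers the nested-folder case above. A sketch, where the `clone-path` field mirrors the proposed option and defaults to the current folder:

```python
from pathlib import PurePosixPath

def clone_order(repos):
    """Order repos so that shallower clone paths are processed first,
    e.g. 'pykwalify/' is cloned before 'pykwalify/redis-cluster/'."""
    return sorted(
        repos,
        key=lambda r: len(PurePosixPath(r.get("clone-path", ".")).parts),
    )

repos = [
    {"name": "redis-py-cluster", "clone-path": "pykwalify/redis-cluster"},
    {"name": "pykwalify", "clone-path": "."},
]
print([r["name"] for r in clone_order(repos)])
# → ['pykwalify', 'redis-py-cluster']
```

This handles path nesting but not arbitrary "depends on" relations, which would need a real topological sort.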
Running tox linters after #61 is merged will exit zero, but there are issues in both the `subgit/` and `tests/` folders, so creating an issue.
After the issues are fixed, remove `--exit-zero` from tox.ini:31 and re-run `tox -e linters` to verify.
```
$ tox -e linters
ROOT: tox-gh-actions won't override envlist because tox is not running in GitHub Actions
linters: commands[0]> ruff --exit-zero subgit/ tests/
subgit/core.py:896:32: C413 Unnecessary `list` call around `sorted()`
subgit/core.py:903:32: C413 Unnecessary `list` call around `sorted()`
subgit/core.py:909:32: C413 Unnecessary `list` call around `sorted()`
tests/conftest.py:6:1: I001 [*] Import block is un-sorted or un-formatted
tests/conftest.py:17:47: B026 Star-arg unpacking after a keyword argument is strongly discouraged
tests/test_core.py:4:1: I001 [*] Import block is un-sorted or un-formatted
tests/test_core_query.py:4:1: I001 [*] Import block is un-sorted or un-formatted
tests/test_core_query.py:22:9: B015 Pointless comparison. This comparison does nothing but waste CPU instructions. Either prepend `assert` or remove it.
Found 8 errors.
[*] 3 potentially fixable with the --fix option.
linters: OK (0.04=setup[0.03]+cmd[0.01] seconds)
congratulations :) (0.08 seconds)
```
Implement a `subgit delete` command. We want a new command that, when run, wipes and deletes either the selected repos or all repos defined in the configuration file.
Precautions: it should always ask for user confirmation before deleting a folder and the entire git tree. Secondly, it should refuse to delete entirely if the user has any dirty, unstaged, or uncommitted files that might be lost. A possible third precaution would be to diff local branches against remote branches to find unpushed commits that might be lost, though this could be deferred to the future.
We also probably want to implement options like `-y, --yes` and/or `-f, --force` to allow automating this step in cases like CI.
The CLI command to implement would be something like:
```
subgit delete (<repo>...) (-y) (-f)
```
If no repo is given, it means all repos; if one or more repos are explicitly given, work only on them.
The most useful use-case for this is ci-cd scenarios where you clone the repos, run some operation that may add files etc. into the cloned folders, and you want to wipe it all and restart on the next run, or within the same run.
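A sketch of the precaution logic, under the assumption that a dirty check is available from elsewhere; the function and parameter names are hypothetical:

```python
import shutil


def delete_repo(path, is_dirty, assume_yes=False, force=False, ask=input):
    """Sketch of 'subgit delete' precautions: refuse to delete a dirty
    working tree unless forced, and ask for confirmation unless
    -y/--yes was given. Returns True if the folder was deleted."""
    if is_dirty and not force:
        print(f"refusing to delete dirty repo '{path}' (use --force)")
        return False
    if not assume_yes:
        answer = ask(f"Really delete '{path}' and its entire git tree? (y/n) ")
        if answer.strip().lower() != "y":
            return False
    shutil.rmtree(path, ignore_errors=True)
    return True
```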
Integration test this against existing repos and implement light unittests for this
So we have a use-case where we want to be able to define one or more folders and/or files that should be checked out from the git repo.
For instance, we have a repo that only contains folders in its root, which can be used for different things in different situations. To avoid having to clone or check out everything, we want one of two things.
One solution is to clone the entire git repo locally but use sparse-checkouts, an older feature where we tell git to only check out a list of names from the repo and maintain them.
The second solution is to use `git clone --filter` (available since git 2.19; it now works on GitHub, tested 2020-09-18 with git 2.25.1), which might also spare us from downloading everything from the remote.
The configuration for a repo has to be extended with an option to enable sparse checkouts and define the list of names to check out.
Note that this will eventually affect all commands like pull, init and so on, which need to handle both adding and removing this. The bad part about this feature is that it adds a ton of complexity in terms of which repo states we need to support and how to move between them. It might be that if we go from a full checkout to a sparse checkout, we simply raise an error telling the user to run `subgit delete` first before we can do a sparse checkout.
One idea is that we may have to implement a custom state tracking mechanism ourselves that tracks certain states of the git repo that are not obvious, simple, or even possible to derive from an already checked-out repo. I would like to avoid this as it adds complexity, but if we need it to track our repos, then so be it...
Source material for this feature and examples of how sparse checkouts work can be found here: https://stackoverflow.com/questions/180052/checkout-subdirectories-in-git
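Combining the two solutions, a sparse, blob-filtered clone could be built as the following command sequence (git >= 2.25 for the `sparse-checkout` subcommand). The sketch only constructs the commands, leaving execution (e.g. via `subprocess.run`) to the caller:

```python
def sparse_checkout_commands(url, dest, paths):
    """Build the git command sequence for a sparse, blob-filtered clone
    of only the given top-level folders: clone without checkout, set
    the sparse-checkout paths, then materialize the working tree."""
    return [
        ["git", "clone", "--filter=blob:none", "--no-checkout", url, dest],
        ["git", "-C", dest, "sparse-checkout", "set", *paths],
        ["git", "-C", dest, "checkout"],
    ]
```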
Implement ruff (https://github.com/astral-sh/ruff).
It should run checks in github actions and be possible to run locally (and be documented).
Using this `.sgit.yml`:
```
repos:
  tag-major-minor:
    clone-url: [email protected]:dynamist/subgit-sample-f.git
    revision:
      tag: v1.2
```
Doing an initial `sgit update`, I get:
```
$ sgit update
DEBUG: Repo update - all
Are you sure you want to update the following repos "tag-major-minor"
(y/n) << y
Traceback (most recent call last):
  File "/home/nn/code/dynamist/sgit/sgit/cli.py", line 186, in cli_entrypoint
    exit_code = run(cli_args, sub_args)
  File "/home/nn/code/dynamist/sgit/sgit/cli.py", line 172, in run
    retcode = core.update(repo)
  File "/home/nn/code/dynamist/sgit/sgit/core.py", line 297, in update
    raise SgitException('Initial clone must be from a branch')
sgit.exceptions.SgitException: Initial clone must be from a branch
> /home/nn/code/dynamist/sgit/sgit/core.py(297)update()
-> raise SgitException('Initial clone must be from a branch')
(Pdb)
```
Here is a reproducible transcript:
```
nn@origo:/tmp$ mkdir -p /tmp/pykwalify
nn@origo:/tmp$ cd /tmp/pykwalify/
nn@origo:/tmp/pykwalify$ sgit init
Successfully wrote new config file ".sgit.yml" to disk
nn@origo:/tmp/pykwalify$ sgit repo add pykwalify [email protected]:Grokzen/pykwalify.git
Successfully added new repo "pykwalify"
nn@origo:/tmp/pykwalify$ sgit update -y
Are you sure you want to update the following repos "pykwalify"
--yes flag set, automatically answer yes to question
Successfully cloned repo "pykwalify" from remote server
nn@origo:/tmp/pykwalify$ cd pykwalify/
nn@origo:/tmp/pykwalify/pykwalify$ git tag -l
0.1.2
1.0.0
1.0.1
1.1.0
1.2.0
1.3.0
1.4.0
1.4.1
1.5.0
1.5.1
1.5.2
1.6.0
1.6.1
1.7.0
1.8.0
14.06
14.06.1
14.08
14.12
15.01
v0.1.0
v0.1.1
nn@origo:/tmp/pykwalify/pykwalify$ cd ..
nn@origo:/tmp/pykwalify$ sgit repo set pykwalify tag 1.8.0
Exception type : KeyError
EXCEPTION MESSAGE: 'tag'
To get more detailed exception set environment variable 'DEBUG=1'
To PDB debug set environment variable 'PDB=1'
```
I am running 1859c72
So I found out that running `subgit fetch` was super slow, even with only 3-4 repos in my config file. It would make sense to use threads or subprocesses here to run the fetch operations in some kind of batches. A subprocess pool with 5 or 10 parallel workers is probably good, as it would rate-limit our calls to github/gitlab in case you have a huge config file.
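Since fetching is I/O bound, a thread pool is likely enough. A minimal sketch where `fetch_one` stands in for whatever function performs `git fetch` on a single repo:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(repo_names, fetch_one, max_workers=5):
    """Run the per-repo fetch function in a small thread pool. Capping
    the pool at 5-10 workers rate-limits concurrent calls to
    github/gitlab. Results are returned in input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_one, repo_names))
```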
Implement a `subgit reset` command. We want a new command that, when run, does a `git reset`-like operation on the selected git repos or all repos in the configuration. Resetting a git repo throws away anything unstaged or uncommitted and goes back to the state or commit defined in the `.subgit.yml` config file.
Precautions: it should always ask for user confirmation before resetting a folder and the entire git tree. Compared to the delete command this is a bit nicer to work with, but since we are still modifying and possibly deleting files or changes the user has made, we need a confirmation step asking the user whether to really perform the action, especially if there are uncommitted things.
We also probably want to implement an option like `-y, --yes` to allow automating this step in cases like CI.
The CLI command to implement would be something like:
```
subgit reset (<repo>...) (-y)
```
If no repo is given, it means all repos; if one or more repos are explicitly given, work only on them.
The most useful use-case for this is ci-cd scenarios where you clone the repos, run some operation that may add files etc. into the cloned folders, and you want to reset everything back without having to re-clone before continuing with some other operation within the build.
Integration test this against existing repos and implement light unittests for this
Based on the following blog post, it is possible to create custom git commands that the `git` tool itself picks up and runs. We want to implement this for subgit as well. This will help bridge usability, since more people know git and use things like `git submodules`, which we basically want to extend. https://gitbetter.substack.com/p/automate-repetitive-tasks-with-custom
The task is to extend `setup.py` with a new entrypoint script, named so as to be compatible with this git-subcommand feature. Build a package of subgit and test the feature out so that we can call something like `git subgit pull`, for instance.
One major issue with this ticket that needs a solution is the redundancy in our name when appended to the `git` command: writing "git" twice to run our commands is verbose and redundant. We need suggestions or alterations to our name to make this simpler. For instance we could go with `git sub`, `git module`, `git sg`, or something completely different. It is within the task to come up with suggestions, present them to the team, and get approval on the command name we will move forward with.
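Git discovers any executable named `git-<something>` on `PATH` as a subcommand, so a console script called `git-subgit` would be enough. A hypothetical excerpt of the `entry_points` for setup.py (the alias name is one of the candidates, not a decided choice):

```python
# Hypothetical setup.py entry_points: installing a console script named
# "git-subgit" puts it on PATH, which is all git needs to make
# "git subgit pull" work as a custom subcommand.
entry_points = {
    "console_scripts": [
        "subgit = subgit.cli:cli_entrypoint",
        # the git-subcommand alias discussed above
        "git-subgit = subgit.cli:cli_entrypoint",
    ],
}
```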
The `subgit clean` command is meant to work (in most cases) together with `subgit reset` as a way of completely restoring a repo to a specific state, by cleaning the repo of any untracked files.
The CLI command to implement would be something like:
```
subgit clean (<repo>...) (-y)
```
Add a webhook to publish documentation to https://subgit.readthedocs.io/
Right now `sgit update` only says that it successfully updated the master branch to the latest on that branch; it does not indicate whether it changed anything or left it unaltered. Also show more details about what happens when it updates.
There is a need to be able to enable and disable repos from the cli, so that repos that don't work, or repos that you don't want to work with, are ignored when running commands.
The simplest way is to add a new flag to the repo data structure, `enable: True/False`, and control it with the cli commands `sgit repo enable <repo-name>` and `sgit repo disable <repo-name>`.
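A sketch of the flag semantics: a missing `enable` key counts as enabled, so existing config files keep working unchanged. The helper names are assumptions:

```python
def active_repos(repos):
    """Filter out repos whose 'enable' flag is False; a missing flag
    counts as enabled for backwards compatibility."""
    return {name: data for name, data in repos.items()
            if data.get("enable", True)}

def set_enabled(repos, name, enabled):
    """Backend for 'sgit repo enable/disable <repo-name>'."""
    repos[name]["enable"] = enabled
```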
For `sgit repo add <name> <url> <rev>`, perhaps `<rev>` should default to `master` and be an optional argument?
Implement a basic set of pre-commit hooks.
The scope is only hooks from https://github.com/pre-commit/pre-commit-hooks for now.
It is currently not possible to add a repo that is bare and doesn't have any refs on the remote side. It fails with the error `fatal: Remote branch master not found in upstream origin`.
Investigate whether it is possible to determine if the repo is bare. Otherwise a new solution has to be devised for this case, as it must be supported.
Instead of having an `--output ...` flag that controls this, it makes more sense, and is more compatible with Linux in general, to have this command print to STDOUT and let the user redirect the output wherever they want.
In #17 we added some basic batch mode handling. It would be preferable to have a command line switch over this solution.