
sirlynix / obs-kinect


OBS plugin to use a Kinect (all models supported) in OBS and set up a virtual green screen based on depth and/or body detection.

License: GNU General Public License v2.0

Lua 6.63% C++ 93.37%
kinect kinect-v2 obs-plugin kinect-sdk obs-kinect depth-map image-processing azure-kinect obs-studio


obs-kinect's Issues

VCRUNTIME140.DLL missing (?)

I was trying to run this plugin on my OBS Studio 25.0.8 64-bit, but it wrote the following to the log:

22:01:31.677: LoadLibrary failed for '../../obs-plugins/64bit/obs-kinect.dll': The specified module could not be found.
22:01:31.677: (126)
22:01:31.677: Module '../../obs-plugins/64bit/obs-kinect.dll' not loaded

I ran Dependency Walker as suggested in one of the other issues, and it displayed some errors regarding DLLs.
Here's the Dependency Walker output:
obs-kinect.zip

I've already installed the Visual Studio redistributables you mentioned (both 32- and 64-bit).

Nothing shows up when adding a KinectSource

Hi, thanks for this project. Looks promising.

I tested on Windows 10 build 19041.264 with OBS 25.0.8 64-bit, and properly installed all dependencies as described.

I downloaded and tested both 0.3 RC 1 and RC 2; neither showed any video after adding a KinectSource, despite waiting a long while.

Annotation 2020-06-24 054855

[v1] "Kinect Source" not available/present when adding source

Going to preface this with a "thank you" for making this plugin. Very clever to be able to implement stuff like this into OBS to see all the creative ways it can be used.

That being said, after installing the plugin I'm unable to add the Kinect to my scene, as the source entry isn't showing up when I try to add a source. Granted, I am using a Kinect v1 (for which I have to use version 1.8 of the drivers, SDK, and toolkit), which might have something to do with it. Will the source entry only show up when the program/plugin detects that a Kinect is connected to the system, or when some other condition is met?

static scene masking trick

Hi SirLynix,
I would like to add a new masking layer to your great Kinect plugin.
I think a scene often contains a lot of static objects that need to stay visible, but they are so shiny that the depth sensor has trouble detecting their depth (for example, turntables with vinyl records on them).

My idea is to add a new static layer to the plugin with an optional mask (a black-and-white mask, possibly with grayscale for smooth edges).
This layer would take priority over the depth sensor mask.
(In a first version, the mask could be made with any tool, like GIMP, from a screenshot of the scene.)

Result: static things in the scene (microphone, desk and my DJ gear) would always stay visible, whereas currently they show a lot of depth sensor noise. This would dramatically improve the faux green screen effect for static scenes where this kind of masking is needed.
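
A rough sketch of how such a static mask could be combined with the depth-based mask, assuming simple 8-bit alpha masks processed on the CPU (the plugin's real filtering runs on the GPU, so this is only illustrative):

#include <algorithm>
#include <cstdint>
#include <vector>

// Combine the depth-derived mask with a user-supplied static mask.
// Both masks are 8-bit alpha values (0 = hidden, 255 = visible).
// The static mask wins wherever it is brighter, so shiny static objects
// (turntables, microphone, desk) stay visible regardless of depth noise,
// and a grayscale static mask blends smoothly.
std::vector<std::uint8_t> CombineMasks(const std::vector<std::uint8_t>& depthMask,
                                       const std::vector<std::uint8_t>& staticMask)
{
    std::vector<std::uint8_t> result(depthMask.size());
    for (std::size_t i = 0; i < depthMask.size(); ++i)
        result[i] = std::max(depthMask[i], staticMask[i]);
    return result;
}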

Thanks in advance for your thoughts about it. I can help you with the code (after I learn how it works, because I'm not really a C++ guy; I currently work mostly with C# and TypeScript).

Bob

[Discussion] A portable Kinect library?

Hello!

For a while now, my work on this plugin has been more about supporting the different Kinect devices than about OBS integration, and I've been thinking...

Would you find it interesting to have a portable, MIT-licensed library able to detect and use all three Kinect models?

In short, I've put a lot of work into handling all Kinect models in a common and (soon to be) cross-platform way, and I've been wondering whether other projects could benefit from it.
Of course, this would have no impact on obs-kinect; it would simply use this library instead of carrying all that code itself.

Edit: To be clear, what I'm talking about is an abstraction over the official Kinect SDKs, freenect (1 and 2) and NuiSensorLib, exposing a common interface in a cross-platform way.
It would have a C API so it could be used from other languages as well.
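
A purely hypothetical sketch of what such a C interface could look like (every name below is invented for illustration; this is not an existing header):

extern "C" {

// Opaque device handle; the backend (Kinect SDK v1/v2, Azure SDK, freenect...) is hidden behind it.
typedef struct KinectDevice KinectDevice;

typedef enum KinectStreamType {
    KINECT_STREAM_COLOR,
    KINECT_STREAM_DEPTH,
    KINECT_STREAM_INFRARED,
    KINECT_STREAM_BODY_INDEX
} KinectStreamType;

// Frame delivery callback: raw pixels plus dimensions, with a user pointer for context.
typedef void (*KinectFrameCallback)(KinectStreamType type, const void* data,
                                    int width, int height, void* userdata);

// Device enumeration and lifetime.
int kinect_get_device_count(void);
KinectDevice* kinect_open_device(int index);
void kinect_close_device(KinectDevice* device);

// Start/stop individual streams; frames are pushed through the callback.
int kinect_start_stream(KinectDevice* device, KinectStreamType type,
                        KinectFrameCallback callback, void* userdata);
int kinect_stop_stream(KinectDevice* device, KinectStreamType type);

} // extern "C"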

Upside down body detection

Hi!

First: thanks to your great piece of software I'm able to use my Kinect v2 for my streaming setup (on Windows at least, so far). Cool stuff, and thanks!

Due to the way I mounted my Kinect (upside down), body detection does not work. I did not think about that when I chose my mounting spot, because flipping the image is just one click. I guess you cannot do anything about that behaviour?
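
For reference, flipping a frame for an upside-down mount is cheap in itself; a minimal sketch assuming a tightly packed 8-bit single-channel buffer (this alone would not make the SDK's skeletal tracking recognize an upside-down body, it only flips the image data):

#include <algorithm>
#include <cstdint>

// Rotate a single-channel 8-bit frame (e.g. a body-index image) by 180 degrees.
// For a tightly packed width*height buffer this is just a full reverse.
void Rotate180(std::uint8_t* pixels, int width, int height)
{
    std::reverse(pixels, pixels + static_cast<std::size_t>(width) * height);
}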

I'm strongly looking forward to Linux support as well!

Kind regards =)

PS: You mention in the 0.3 notes that the German translation is not fully done, but I guess that is just a false report? If not, I can do translations.

0.3 update thread (WIP)

Some updates about the next obs-kinect version!

General stuff

I massively refactored the code and improved performance. You can now have multiple Kinect v2 sources at the cost of one (Kinect image/color-to-depth mapping is only done once instead of once per source, dramatically improving performance). I'm not sure how useful that will be, but it's still an improvement.

Multiple Kinect v2 sources

I also added body and depth hybrids (as requested by someone on reddit):

  • "Body or Depth" (everything in the depth range or a body)
  • "Body within Depth" (a body inside the depth range)

One of the big things about the refactor is also the support for multiple Kinects from multiple frameworks!

This is done by having all the code related to a Kinect runtime (or even all the Windows-specific stuff) in a separate .dll which is loaded by obs-kinect at startup. This was done to make it possible to add support for freenect(2) in the future, making the plugin available on Linux and macOS (that won't be in 0.3).
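
A simplified sketch of what loading such backend DLLs at startup can look like on Windows (the exported symbol name and the overall structure are invented for illustration and do not match the plugin's actual loader):

#include <string>
#include <vector>
#include <windows.h>

// Each backend DLL is assumed to export a registration function under a known name.
using RegisterBackendFn = void (*)();

std::vector<HMODULE> LoadKinectBackends(const std::vector<std::wstring>& backendDlls)
{
    std::vector<HMODULE> loaded;
    for (const std::wstring& dll : backendDlls)
    {
        HMODULE module = LoadLibraryW(dll.c_str());
        if (!module)
            continue; // backend runtime not installed, skip it silently

        auto registerBackend = reinterpret_cast<RegisterBackendFn>(
            GetProcAddress(module, "obs_kinect_register_backend")); // hypothetical export name
        if (registerBackend)
        {
            registerBackend();
            loaded.push_back(module);
        }
        else
            FreeLibrary(module); // not a valid backend
    }
    return loaded;
}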

There's also a new "Kinect service priority" combo box, which tells obs-kinect to raise the priority of the KinectService.exe process. This is helpful when your CPU is under heavy load (e.g. compiling or playing games) and prevents/reduces lost frames. (This is only an issue with the Kinect runtime 2.0; as far as I know all the other frameworks, including the Kinect runtime 1.0 and freenect, don't use a separate service for this.)

Also:

  • I fixed the depth/infrared distortion; switching to a depth/infrared source now resizes the Kinect source to the right size.
  • I added a "dirty depth" max count setting, which lets the source reuse a depth value from a previous frame when it has no value for a pixel. This helps with flickering but introduces some "depth lag" when things move (see the sketch below).
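
A sketch of the assumed "dirty depth" behavior (illustrative CPU code, not the plugin's implementation):

#include <cstdint>
#include <vector>

// When the current frame has no depth value for a pixel (0), reuse the last known
// value for up to maxDirtyCount consecutive frames before giving up. This reduces
// flickering at the cost of some "depth lag" when things move.
void FillDirtyDepth(std::vector<std::uint16_t>& depth,           // current frame, 0 = no data
                    std::vector<std::uint16_t>& lastKnownDepth,  // persistent buffer, same size
                    std::vector<std::uint8_t>& dirtyCounters,    // persistent buffer, same size
                    std::uint8_t maxDirtyCount)
{
    for (std::size_t i = 0; i < depth.size(); ++i)
    {
        if (depth[i] != 0)
        {
            lastKnownDepth[i] = depth[i];
            dirtyCounters[i] = 0;
        }
        else if (lastKnownDepth[i] != 0 && dirtyCounters[i] < maxDirtyCount)
        {
            depth[i] = lastKnownDepth[i];
            ++dirtyCounters[i];
        }
    }
}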

About Kinect v1

Color stream
Depth stream

As you can see, I'm now able to retrieve the Kinect v1 color and depth streams. I'm even able to retrieve both of them at the same time (which isn't as straightforward as with the Kinect v2) and to try some green-screen stuff.

image

... Okay, that part still needs some work.

Interesting notes about Kinect v1:

  • It's a pain in the ass to use compared to the Kinect v2 SDK.
  • It has lower color/depth quality than the v2 (duh).
  • It has an accelerometer and a motor allowing it to tilt up/down on its own.
  • It has a "near mode", which brings the depth range from 80-400 cm down to something like 40-200 cm.
  • It works with exclusive access, which means you can't have obs-kinect and another process using the same Kinect at the same time (you can still have multiple obs-kinect v1 sources).
  • You can have multiple Kinect v1 devices at the same time; obs-kinect will ask you which one you want to use (warning: as far as I know only the first one can have a body stream, due to an SDK limitation).
  • It allows changing the white balance/contrast/hue/saturation/exposure/brightness/gain/backlight compensation settings. The Kinect v2 unfortunately doesn't allow that (but I'm looking into adding support for it through raw USB commands, since the device does support it).
  • It has a dedicated background-removal stream.

That last one was a surprise to me, and it works quite well (even better than my own background removal with the Kinect v2, except for the color quality).

Here's an example from Microsoft SDK:

I wish I could get something this smooth with the Kinect v2 one day.

Anyway, I'll expose this new source type for the Kinect v1 (unfortunately the v2 doesn't have an equivalent).

I have no idea when 0.3 will be out; things got a lot more complicated because of all the Kinect v1 possibilities (and because it works in such a different way from the Kinect v2). I'll update this thread when I have anything relevant to share.

I'm sometimes streaming obs-kinect dev on Twitch

I worked on the obs-kinect 0.3 body/depth hybrids in English a few days ago; you can watch the replay here:
https://www.twitch.tv/videos/605065358

I also worked on the Kinect v1 support, in my native language (French), here:
https://www.twitch.tv/videos/607021382

Enjoy!

Lagging/stuttering

Machine details:
Ryzen 9 3900x
GeForce 2080
32gb RAM
Everything on SSDs
Asrock x570 Steel Legend
Windows 10 with all the latest updates, the recommended Visual Studio redistributables, and the Kinect runtime
Kinect Configuration Verifier shows all green checks except USB, which shows a yellow (!) due to non-Intel/Realense USB 3.0 ports, but it works fine

When I use the Kinect with the depth sensor in OBS, it stutters and lags. This doesn't happen when I use the Kinect in TouchDesigner (or any other program); in fact, it doesn't happen when I use TouchDesigner to Spout into OBS, so I have narrowed it down to an issue with the OBS plugin itself. The regular camera via the plugin works fine; it's just anything using the depth sensor that lags.

Support for IR webcams

I am wondering if it would be possible to use IR/Windows Hello webcams instead of a Kinect for this, since they use similar technology. I know it is possible to use a Kinect as an IR camera, but I cannot find an easy way to do it the other way around.

Ability to adjust alignment

The depth channel and the color channel are not fully aligned, at least not on my setup. You can see that the edges on the right of my body are followed very well, but on the left they're not close. The wooden block I'm holding, and my hand, show the difference between the two.

image

LoadLibrary failed with Kinect v1

I have just added the new plugin version to OBS, but OBS will not start with the following messages:
18:04:50.980: LoadLibrary failed for '../../obs-plugins/64bit/obs-kinect-sdk20.dll': (null) (126)
18:04:50.980: Module '../../obs-plugins/64bit/obs-kinect-sdk20.dll' not loaded
18:04:50.981: LoadLibrary failed for 'obs-kinect-sdk20': (null) (126)
18:04:51.198: Required value 'get_name' for '(null)' not found. obs_register_source failed.

At first glance it seems that it tries to load the Kinect v2 backend, even though I only have a Kinect v1.

I suspect this could be either that I do not have a Kinect v2 hooked up or that I have not installed the v2 SDK.

Did you ever try to start OBS with just a Kinect v1 attached?

By the way: Great work. Reviving this old hardware in such an amazing fashion.

Azure Kinect Files in the OBS folder.

I've been using this for the last year with a Kinect v2 and it worked great. Thank you!

I recently switched to an Azure Kinect; the Kinect source shows up in OBS but is totally blank. Does the entire Azure Kinect SDK need to be copied into the OBS directory?
I put the folder labeled "Azure Kinect SDK v1.4.1" in the obs > bin > 64bit directory and that does not seem to work.

Thank you!


Multiple V2s or V1s

Hello,

I am currently using a v2 Kinect with OBS, awesomely (thanks, by the way).

I wanted to ask you something else though. I know you are working on multiple streams from multiple Kinect units. Are you doing anything to combine those point clouds?

I'm interested in this from a 3D scanning perspective. I currently use v1s and v2s for 3D scanning and have gotten decent results.

I tried messing with the SDK and multiple Kinect streams but haven't been able to sync the clouds up well enough to grab a REALLY good scan.

Sorry to intrude on the OBS issue tracker, but I didn't know of another way to contact you.

[0.3rc1] Kinect v1 No image, "failed to open color stream: Unknown error" when enabling faux greenscreen

Going to link the logs upfront: https://obsproject.com/logs/93eDn7cI3SDKJ-gB
https://obsproject.com/logs/iwPUGmC7PA67g_vw

0.3 has been an absolute godsend for getting the 360 Kinects up and running. It appears that checking "Enable faux greenscreen effect" results in no image showing, regardless of which filter type is used (I'm assuming .greenscreentype_dedicated is the one to use for v1). The log shows the error "failed to open color stream: Unknown error" for every instance where it is ticked; the color stream itself works, however, as shown below.
https://imgur.com/tXp4DUS

I double-checked everything needed in terms of runtimes for both Visual Studio 2019 and the Kinect; is there something else that's needed? I have the runtimes and SDKs for both 1.8 and 2.0; could those be conflicting?

v1 support?

Hey, just curious what the state of v1 support is? Also OS X?

I have some libfreenect experience. Lemme know if I can help.

Device parameters are not applied in some cases

Device-specific parameters are not applied when the source is created (and no device has been associated yet) or while the source is invisible (as the device is no longer associated).

A KinectSource should remember its device-specific parameters and apply them when it retrieves a device; the problem is that it cannot decode device-specific parameters without a device.
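
One possible workaround, sketched below with invented names (this is not the plugin's actual code): keep the parameters in raw, undecoded form on the source and replay them whenever a device is (re)associated.

#include <cstdint>
#include <map>
#include <string>

// Raw, undecoded device-specific parameters remembered by the source.
struct PendingDeviceSettings
{
    std::map<std::string, std::int64_t> values;
};

// Stand-in for the real device interface, which knows how to decode each parameter.
class KinectDeviceStub
{
public:
    void SetParameter(const std::string& name, std::int64_t value)
    {
        (void)name; (void)value; // decode and apply on the actual device here
    }
};

// Called right after the source retrieves/associates a device.
void ApplyPendingSettings(KinectDeviceStub& device, const PendingDeviceSettings& pending)
{
    for (const auto& [name, value] : pending.values)
        device.SetParameter(name, value);
}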

Any news or workaround to use the plugin in Streamlabs?

Sorry to ask, because you mention in the readme that it is not compatible...

I copied the plugin files into:
C:\Program Files\Streamlabs OBS\resources\app.asar.unpacked\node_modules\obs-studio-node\obs-plugins
C:\Program Files\Streamlabs OBS\resources\app.asar.unpacked\node_modules\obs-studio-node\data\obs-plugins\obs-kinect

But the plugin is not detected as a widget.

It works fine in OBS Studio. Thanks!
Is there any workaround?

Kinect v1 servo-support

The Kinect v1 (and the v2 as well?) has a servo which can control the vertical orientation (-31° to 31°).

It's already supported by libfreenect; would you consider adding an option to control this from OBS?
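
For reference, a minimal sketch of driving the tilt motor through libfreenect (assuming libfreenect is installed and its header is available as <libfreenect.h>; the official Kinect SDK v1 appears to expose the same capability through NuiCameraElevationSetAngle):

#include <libfreenect.h>

// Tilt the first Kinect v1 to the given angle (valid range is roughly -31 to +31 degrees).
// Returns 0 on success, a negative value on failure.
int SetKinectTilt(double degrees)
{
    freenect_context* ctx = nullptr;
    if (freenect_init(&ctx, nullptr) < 0)
        return -1;

    freenect_device* dev = nullptr;
    if (freenect_open_device(ctx, &dev, 0) < 0)
    {
        freenect_shutdown(ctx);
        return -1;
    }

    int result = freenect_set_tilt_degs(dev, degrees);

    freenect_close_device(dev);
    freenect_shutdown(ctx);
    return result;
}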

Thanks.

Kinect 1 - disables high res mode when changing scenes

It doesn't save the high-res option. When you return to a scene with a KinectSource or open OBS, high-res mode is disabled, even though the option is checked. Unchecking and re-checking it enables it again, until you change the scene.

FPS lagging

Hello!
I followed all the instructions; my Kinect v2 is detected correctly, but unfortunately the camera runs at around 10-15 fps.
I tried 0.2, 0.3rc1 and 0.3rc2 and it's the same (a bit more fluid with 0.3).
My setup is a Ryzen 7 3700X, a 2070S and 16 GB of RAM at 3100 MHz. I don't think the problem comes from my hardware, but I must have messed up somewhere.
Thanks for your help ^^

Freenect(2) update thread (Linux/macOS support)

Hi,

With the upcoming 0.3 release, I split obs-kinect into multiple parts: common stuff like OBS plugin registration, depth/infrared/faux green-screen processing, and source and general device handling is part of the core, but the core doesn't know anything about Kinect devices (or even about Windows). Kinect v1/v2 handling is done in separate DLLs which are loaded by obs-kinect at startup.

This was done to add support for the Kinect v1 without requiring both the v1 and v2 runtimes to be installed, and also to support Linux/macOS.

Obviously, the Microsoft Kinect SDKs don't work on Linux/macOS; fortunately, projects like libfreenect and libfreenect2 allow using a Kinect on these systems.

Since obs-kinect device handling is done in separate libraries, it's possible to add libfreenect support fairly easily.

I tried both of these libraries on Windows (since they support Windows as well) for testing, and they work well!

freenect (v1):
image
image

freenect2 (v2):
image

However, since libfreenect(2) needs raw USB access, it requires installing a custom USB driver (like UsbDk). Fortunately this doesn't conflict with the Kinect v2 (I can use libfreenect2 and the Kinect SDK v2.0 without problems), but for the Kinect v1 it's more of a hassle: you have to replace the Kinect USB driver (using Zadig) for libfreenect to work, which prevents the Kinect SDK v1.8 from working, and it took me an hour to reinstall the official Kinect drivers. So I don't think I will release an obs-kinect libfreenect backend on Windows, since it doesn't seem to offer anything you cannot do with the official Kinect v1.8 runtime (except maybe LED control?).

As for the Kinect v2, libfreenect2 does add more control, especially over the RGB camera color settings (exposure mode, white balance, gain, and such), so I think I'll support it on Windows too (even though I hope to eventually do the same with the official SDK backend, using this).

Important note

Both libfreenect libraries are really cool, but they lack some of the higher-level features the Kinect SDKs provide, like body/skeletal detection or dedicated background removal (as the Kinect SDK v1.8 offers). That doesn't mean you won't be able to have background removal on Linux/macOS, but it does mean you won't be able to use body information for it, and there's not much I can do about that (short of implementing body detection myself, which requires skills I don't have).

I will be able to compile and test the plugin on Linux without problems, but I won't be able to do that on macOS. So if you have a Mac and are willing to help, please let me know!

Also, as the 0.3 release is already a big one, Linux/macOS support won't be part of it (but I'll try to do it as soon as possible afterwards).

TL;DR: Linux/macOS support requires an open-source backend, since the official Microsoft SDKs are Windows-only. There are backends for the Kinect v1 and v2, but they don't do higher-level processing like body detection. Things like body detection and body-based green-screen filtering won't work on Linux/macOS because of that.

Kinect source in OBS

Hi all, I have downloaded all the files to use my Kinect as a webcam/green screen in OBS. I copied the files over to obs-studio on my C drive, but unfortunately when I open OBS there is no Kinect source when I go to add a new source.

It worked on my other laptop, but unfortunately its graphics card isn't good enough, hence why I'm trying on the new laptop.

Any ideas would be most appreciated.
OBS sources

Plugin not loaded

Hi !

I'm trying to install your plugin but it is not loaded by OBS at startup.

Here's what I found in the OBS logs:
17:57:37.963: LoadLibrary failed for '../../obs-plugins/64bit/obs-kinect.dll': The specified module could not be found.
17:57:37.963: (126)
17:57:37.963: Module '../../obs-plugins/64bit/obs-kinect.dll' not loaded

The plugin files are correctly copied; I'm using a Kinect v2 with the 2.2_1905 runtime installed.

My goal is to use your plugin for mixed reality with an Oculus Quest.

obs-kinect not loaded

Hi,

Thanks for the great plugin, but I'm having some issues loading.

Below are my obs logs:

16:28:44.226: LoadLibrary failed for '../../obs-plugins/64bit/obs-kinect.dll': (null) (126)
16:28:44.226: Module '../../obs-plugins/64bit/obs-kinect.dll' not loaded

Full log here: https://obsproject.com/logs/Jo1VVBQey9oZgt0E

Already installed both:

32-bit: https://aka.ms/vs/16/release/vc_redist.x86.exe
64-bit: https://aka.ms/vs/16/release/vc_redist.x64.exe

I also installed Kinect SDK 1.8 from here:

https://www.microsoft.com/en-us/download/details.aspx?id=40278

My Kinect is a v1.

What am I missing?

Body & depth glitches sometimes?

I've gotten it to where Depth alone renders my close body pretty well, but Body & Depth sometimes puts a body up and to the left of where I actually am. I'd love to fork and help you play with this stuff; it's just been a long time since I developed anything for Windows beyond PHP, lmao.
image

Standalone Driver

Would you be able to make a standalone driver that allows the Kinect to be used as a webcam? Currently this is possible with OBS-VirtualCam.
With the shortage of webcams right now, I bet you could sell something like that for a few bucks. I know I would be willing to spend the money so I didn't have to run OBS while on a Skype call.

Kinect V1 device detected but remains black screen

I have the plugin installed in OBS v27, using 0.3 release candidate 2. I added the source and moved it to the top. All of the properties are there, and my device is detected as "KinectSDK1.0 - Kinect #0: USB" along with the device serial.

No matter what settings I change, nothing happens and the Kinect doesn't display any image. Yet it works perfectly fine in the dev toolkit apps.

Clipping via world coordinates

Hi, thanks for writing this plugin!
Just wanted to drop an idea based on how I was using the kinect.

A neat feature would be to use the Kinect's accelerometer, or do some floor-plane detection, to determine real-world XYZ coordinates, and then clip the image using those coordinates.

For example, you could remove the floor from the scene, or a wall.
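
A sketch of the underlying math (illustrative only, with invented parameter names): back-project each depth pixel with the pinhole model, undo the sensor tilt, then clip against a floor height or a maximum forward distance.

#include <cmath>
#include <cstdint>

struct Intrinsics { float fx, fy, cx, cy; }; // depth camera intrinsics

// Returns true if the pixel should be kept (fx/cx would only be needed to clip side walls).
bool KeepPixel(int y, std::uint16_t depthMm, const Intrinsics& in,
               float sensorPitchRad,   // upward tilt of the sensor (e.g. from the accelerometer)
               float sensorHeightM,    // height of the sensor above the floor
               float floorMarginM,     // discard anything lower than this above the floor
               float maxForwardM)      // discard anything farther forward than this (e.g. a wall)
{
    if (depthMm == 0)
        return false; // no depth data for this pixel

    // Back-project into camera space: y down, z forward (pinhole model).
    float z = depthMm / 1000.0f;
    float down = (y - in.cy) * z / in.fy;
    float up = -down;

    // Undo the sensor tilt (rotation about the camera's X axis).
    float worldUp = z * std::sin(sensorPitchRad) + up * std::cos(sensorPitchRad);
    float worldForward = z * std::cos(sensorPitchRad) - up * std::sin(sensorPitchRad);
    float heightAboveFloor = sensorHeightM + worldUp;

    return heightAboveFloor > floorMarginM && worldForward < maxForwardM;
}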

Offset?

I tried playing around with this today, and I noticed that unless I stand 1-2 meters away from the Kinect, there is a very large offset between the color picture and the depth information. Everything seems to be shifted over by 10-20 pixels, leaving a strange-looking shadow.

I assume this is because the two cameras are quite far apart from each other, so some distance is needed for it to combine everything correctly.

Is there a possibility that in the future, you could add some manual controls to align the depth and color images?
I'm not able to place the camera far away from me due to space constraints.
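
One way such a manual control could work, sketched below with invented names (not necessarily the plugin's approach): shift the depth-derived mask by a user-adjustable pixel offset before applying it to the color image.

#include <cstdint>
#include <vector>

// Shift an 8-bit mask by (offsetX, offsetY) pixels; areas shifted in from outside stay hidden (0).
std::vector<std::uint8_t> ShiftMask(const std::vector<std::uint8_t>& mask,
                                    int width, int height, int offsetX, int offsetY)
{
    std::vector<std::uint8_t> shifted(mask.size(), 0);
    for (int y = 0; y < height; ++y)
    {
        int srcY = y - offsetY;
        if (srcY < 0 || srcY >= height)
            continue;
        for (int x = 0; x < width; ++x)
        {
            int srcX = x - offsetX;
            if (srcX >= 0 && srcX < width)
                shifted[static_cast<std::size_t>(y) * width + x] =
                    mask[static_cast<std::size_t>(srcY) * width + srcX];
        }
    }
    return shifted;
}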

Azure Kinect support

Good news everyone.

Microsoft contacted me about this plugin and is willing to help me support the Azure Kinect by sending me one so I can add support for it!

This also means I will work on the plugin again, after not touching it for a few months due to other projects.

I'll use this thread to keep you informed about Azure Kinect support in obs-kinect.

Motion Tracking Support

When I add the Kinect as a source in OpenNIVirtualCam, I can see motion tracking when the smart-tracking feature is selected. I don't see the same in OBS, though. This is a very helpful feature, especially when there are multiple people in the room or when the subject is moving. It would be a great addition to the already excellent features in obs-kinect :)

Kinect not showing in OBS 27.1.3 64bit windows

I have downloaded the OBS plugin, the GitHub plugin, the Windows runtime, and the Windows SDK as shown in this video, but I still have no luck getting the Kinect to show up as a source in OBS.

I'm able to get it to come up in the runtime and the SDK.

Can we use that to identify them, then? A fluctuating pixel, considered over a period of 5 frames, will have a wide range of values in a way that most pixels wouldn't. That would introduce lag, I guess, but maybe not much.

Thanks for the links. If I'm reading it right, the z data is a float value? What is it when there's no value, -1?

Some of my function logic might work best for static subjects, like people sitting at desks on video calls or streaming gameplay. The function may not be good for scenes with lots of movement, but I reckon that doesn't mean it's not worth doing.

Originally posted by @Funkcorner in https://github.com/SirLynix/obs-kinect/issue_comments/710183429
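
A sketch of the idea described above (illustrative CPU code with invented names): keep the last few depth frames and flag pixels whose value range over that window exceeds a threshold.

#include <algorithm>
#include <cstdint>
#include <deque>
#include <vector>

// Flag flickering pixels: those whose depth varies by more than rangeThresholdMm
// across the stored frames (e.g. the last 5).
std::vector<std::uint8_t> FlagFlickeringPixels(const std::deque<std::vector<std::uint16_t>>& lastFrames,
                                               std::uint16_t rangeThresholdMm)
{
    if (lastFrames.empty())
        return {};

    const std::size_t pixelCount = lastFrames.front().size();
    std::vector<std::uint8_t> flickering(pixelCount, 0);
    for (std::size_t i = 0; i < pixelCount; ++i)
    {
        std::uint16_t minV = 0xFFFF;
        std::uint16_t maxV = 0;
        for (const auto& frame : lastFrames)
        {
            minV = std::min(minV, frame[i]);
            maxV = std::max(maxV, frame[i]);
        }
        if (maxV - minV > rangeThresholdMm)
            flickering[i] = 1; // wide range over the window: likely a fluctuating pixel
    }
    return flickering;
}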

Control Brightness, Hue, or Gain

If possible, could you add the ability to adjust the brightness/ISO of the Kinect via the SDK? I try to use a filter in OBS, but since the Kinect itself is too bright, the data is already lost, so I can't just lower the brightness afterwards.

image

Kinect

Evening all. I've had the Xbox Kinect working fine for weeks now, but today the Kinect comes on for around 10 seconds, then goes off, and then comes back on again. It just keeps repeating this; the image is visible while the Kinect is on, but then freezes when it goes off.

Any ideas what the cause of this might be? I would really appreciate any assistance.

OBS - Kinect Source Stuttering

Hey mate,

I'm having an issue in OBS where every ~3 seconds the Camera will stutter (freeze for ~0.5 seconds).

Color Basics on the SDK works fine. This seems to only happen in OBS.

I have an XBONE Kinect, and installed Runtime v2.2, SDK v2.0, and plugin 0.3 RC2. There also doesn't seem to be any excessive resource utilisation from the plugin/OBS.

Your assistance would be much appreciated.

Donation

This plugin is very useful and I was wondering if you had a way for people who enjoy the plugin to send you a donation/tip to thank you for your hard work.

Add support for custom shaders

I was thinking: there are already plugins that bring shader support to OBS Studio, but because they apply after this plugin renders, none of them can use the depth map and such. It would be nice to add a way to make that possible.

It would also open the door to some augmented-reality effects.

60 fps?

Hi, I'm from Mexico, sorry for my bad English. First, thank you for all your work. My question is whether I can run my Kinect v2 at 60 fps, or if I can only use 30 fps.

Depth camera mapping to OBS layer height

I use this excellent plugin to create scenes during video conferences with depth.

This is generally no more complex than

  1. a background
  2. a foreground object
  3. a kinect green screen person

By using OBS object layering, I can make myself appear to be part of the scene. This morning I had a Millennium Falcon backdrop, and the foreground object was the chess table, so I could make it look like I was sitting behind it, simply by placing the table at the top of the layers, then the Kinect source, then the background. Later on I added a few more layers so I could walk behind the seat upright, but I had to alter the layers manually to do so.

Imagine if the layer order were altered in real time by the depth camera input! A scene could have 3 layers, and the position of the Kinect source in the layer stack could change depending on how close the user is to the camera, giving a sense of 3D!
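
A sketch of how this could be approximated (invented names; obs-kinect does not currently do this): map the average depth of the visible pixels to a layer index, which could then be applied with libobs' obs_sceneitem_set_order_position().

#include <cstddef>
#include <cstdint>
#include <vector>

// Map the average depth of the visible (unmasked) pixels to a scene layer index,
// given a sorted list of near-to-far depth thresholds in millimeters.
int DepthToLayerIndex(const std::vector<std::uint16_t>& depthMm,
                      const std::vector<std::uint8_t>& visibleMask,
                      const std::vector<std::uint16_t>& layerThresholdsMm)
{
    std::uint64_t sum = 0;
    std::size_t count = 0;
    for (std::size_t i = 0; i < depthMm.size(); ++i)
    {
        if (visibleMask[i] && depthMm[i] != 0)
        {
            sum += depthMm[i];
            ++count;
        }
    }
    if (count == 0)
        return static_cast<int>(layerThresholdsMm.size()); // nothing visible: put the source at the back

    std::uint16_t average = static_cast<std::uint16_t>(sum / count);
    for (std::size_t layer = 0; layer < layerThresholdsMm.size(); ++layer)
    {
        if (average <= layerThresholdsMm[layer])
            return static_cast<int>(layer); // closer subject = layer nearer the top of the stack
    }
    return static_cast<int>(layerThresholdsMm.size());
}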

Minimum depth

Hi, when I get too close to the Kinect while using the faux green screen, the image isn't captured. Is this a hardware limitation of the 3D scan?
If not, could you make it possible to detect objects closer than the current minimum?
If it is, could you make the 2D camera fill in the missing image only in the spots that got too close?

I've been messing around with using the two sources, but unless I can find a way to have an image mask that follows me, it defeats the purpose of the green screen.

Add body part tracking

It would be cool to let obs-kinect zoom to your head position, even while you're moving. And maybe some more cool effects related to this? Bodies have been handled in obs-kinect for many months now; a lot of cool things are possible.
