thibaultbee / streampack
RTMP/RTMPS and SRT live streaming libraries for Android
Home Page: https://thibaultbee.github.io/StreamPack/index.html
License: Apache License 2.0
preview.setSurfaceProvider(previewView.getSurfaceProvider())
Since AutoFitSurfaceView doesn't provide a SurfaceProvider, how can I switch from the back camera to the front camera?
2.5.2 (or commit: 3a9af52)
Pixel 4 - AOSP 10 (Rooted, Custom OS)
Root is required because MountAngle in camera_config.xml, the read-only configuration file of the OS, needs to be rewritten. This can be reproduced by changing the angle to 0 or 180.
Thanks for this project; it has been very helpful while I was implementing SRT.
After using the demo and libraries of this project, I found that devices where MountAngle is 0 or 180 are not oriented correctly. MountAngle in camera_config.xml is the value of SENSOR_ORIENTATION in CameraCharacteristics.
e.g. https://github.com/LineageOS/android_device_bq_sdm660-common/blob/lineage-16.0/configs/camera/camera_config.xml
For example, if you request Size(1280, 720), a camera with MountAngle=90 will preview and deliver a 1280 x 720 image with the correct proportions. However, with a camera with MountAngle=0, the captured video, 1280 wide by 720 high, is distorted, as if it had been squeezed into 1280 high by 720 wide.
The following is a capture of the distorted image (Android preview screen and the image delivered over SRT).
After applying my patch (please check the alternative solutions).
I have created a patch to support the less common MountAngle devices. Here is the patch; apologies for the rough code.
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt
index 2dc06e1..cc60558 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/DeviceOrientationProvider.kt
@@ -16,14 +16,22 @@
package io.github.thibaultbee.streampack.internal.data.orientation
import android.content.Context
+import android.hardware.camera2.CameraCharacteristics
import android.util.Size
import io.github.thibaultbee.streampack.internal.interfaces.IOrientationProvider
import io.github.thibaultbee.streampack.internal.utils.extensions.deviceOrientation
import io.github.thibaultbee.streampack.internal.utils.extensions.isDevicePortrait
import io.github.thibaultbee.streampack.internal.utils.extensions.landscapize
import io.github.thibaultbee.streampack.internal.utils.extensions.portraitize
+import io.github.thibaultbee.streampack.utils.getCameraCharacteristics
+import io.github.thibaultbee.streampack.internal.sources.camera.CameraCapture
class DeviceOrientationProvider(private val context: Context) : IOrientationProvider {
+ private var cameraCapture: CameraCapture? = null
+ fun setCameraCapture(capture: CameraCapture) {
+ this.cameraCapture = capture
+ }
+
override val orientation: Int
get() {
//TODO: this might not be working on all devices
@@ -31,6 +39,13 @@ class DeviceOrientationProvider(private val context: Context) : IOrientationProv
return if (deviceOrientation == 0) 270 else deviceOrientation - 90
}
+ override val cameraAngle: Int
+ get() {
+ val cameraId = this.cameraCapture?.cameraId ?: return 90
+ val characteristics = context.getCameraCharacteristics(cameraId)
+ return characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION) as Int
+ }
+
override fun orientedSize(size: Size): Size {
return if (context.isDevicePortrait) {
size.portraitize()
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt
index 8f5ee4f..e14c15f 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/data/orientation/FixedOrientationProvider.kt
@@ -21,7 +21,7 @@ import io.github.thibaultbee.streampack.internal.utils.extensions.landscapize
import io.github.thibaultbee.streampack.internal.utils.extensions.portraitize
import io.github.thibaultbee.streampack.utils.OrientationUtils
-class FixedOrientationProvider(override val orientation: Int) : IOrientationProvider {
+class FixedOrientationProvider(override val orientation: Int, override val cameraAngle: Int = 90) : IOrientationProvider {
override fun orientedSize(size: Size): Size {
return if (OrientationUtils.isPortrait(orientation)) {
size.portraitize()
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt
index 1ed1709..1a0f2d5 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/encoders/VideoMediaCodecEncoder.kt
@@ -88,8 +88,15 @@ class VideoMediaCodecEncoder(
val videoConfig = config as VideoConfig
orientationProvider.orientedSize(videoConfig.resolution).apply {
// Override previous format
- format.setInteger(MediaFormat.KEY_WIDTH, width)
- format.setInteger(MediaFormat.KEY_HEIGHT, height)
+ val cameraAngle = orientationProvider.cameraAngle
+
+ if (cameraAngle == 90 || cameraAngle == 270) {
+ format.setInteger(MediaFormat.KEY_WIDTH, width)
+ format.setInteger(MediaFormat.KEY_HEIGHT, height)
+ } else {
+ format.setInteger(MediaFormat.KEY_WIDTH, height)
+ format.setInteger(MediaFormat.KEY_HEIGHT, width)
+ }
}
}
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt b/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt
index 599d8ed..215bd72 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/internal/interfaces/IOrientationProvider.kt
@@ -27,6 +27,12 @@ interface IOrientationProvider {
*/
val orientation: Int
+ /**
+ * CameraSensor angle. Generally 90, 270.
+ * Expected values: 0, 90, 180, 270.
+ */
+ val cameraAngle: Int
+
/**
* Return the size with the correct orientation.
*/
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt
index d685267..f2016d1 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseCameraStreamer.kt
@@ -61,8 +61,13 @@ open class BaseCameraStreamer(
initialOnErrorListener = initialOnErrorListener
), ICameraStreamer {
private val cameraCapture = videoCapture as CameraCapture
+ private val deviceOrientationProvider = orientationProvider as DeviceOrientationProvider
override val helper = CameraStreamerConfigurationHelper(muxer.helper)
+ init {
+ deviceOrientationProvider.setCameraCapture(cameraCapture)
+ }
+
/**
* Get/Set current camera id.
*/
diff --git a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt
index cff625d..44df9ef 100644
--- a/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt
+++ b/core/src/main/java/io/github/thibaultbee/streampack/streamers/bases/BaseStreamer.kt
@@ -59,7 +59,7 @@ abstract class BaseStreamer(
private val context: Context,
protected val audioCapture: IAudioCapture?,
protected val videoCapture: IVideoCapture?,
- orientationProvider: IOrientationProvider,
+ protected val orientationProvider: IOrientationProvider,
private val muxer: IMuxer,
protected val endpoint: IEndpoint,
initialOnErrorListener: OnErrorListener? = null
Is there a way to configure the CameraRtmpLiveStreamer to write the stream to a file? Maybe by exposing the muxer property, so we can pass in writeToFile = true?
class CameraRtmpLiveStreamer(
context: Context,
enableAudio: Boolean = true,
initialOnErrorListener: OnErrorListener? = null,
initialOnConnectionListener: OnConnectionListener? = null
) : BaseCameraLiveStreamer(
context = context,
enableAudio = enableAudio,
muxer = FlvMuxer(context = context, writeToFile = false),
endpoint = RtmpProducer(hasAudio = enableAudio, hasVideo = true),
initialOnErrorListener = initialOnErrorListener,
initialOnConnectionListener = initialOnConnectionListener
)
Hi there. Recently I tried to record the screen with MediaProjection and VirtualDisplay to push to an SRT server, but I am facing some issues. Could you please add a screen recording feature in a future release?
Originally posted by @DesenYang in #14 (comment)
Support RTMPS
Hello, when the device is not connected to Wi-Fi, SRT cannot be used. Why? Looking forward to your reply, thank you!
In my test with Simple Realtime Server it times out.
I used the demo screen app; it seems that when the image is static, no data is sent to the server, so it considers the connection idle and forces a disconnection.
I tested both SRT and RTMP.
Maven Central does not provide a sources artifact for StreamPack. Could you add sources to the published artifacts?
I am trying to implement my own ISurfaceCapture for live streaming GL-rendered content; however, it is very difficult to understand the class hierarchy and what the library classes do without sources. Android Studio has a "Download sources" feature, but it doesn't work for your library due to the missing sources artifact.
There is a Stack Overflow answer that presents a simple solution to this issue: https://stackoverflow.com/questions/26874498/publish-an-android-library-to-maven-with-aar-and-sources-jar I can use it to publish sources to a local Maven repository, but I thought it was worth sharing to help other devs too.
Why is it that when I rotate the phone from portrait to landscape, AutoFitSurfaceView shows people and things upright, but the media player's image does not? How can I make the media player's image display upright when I rotate the phone?
Hi,
I often get the error below, which crashes my app. What does this exception mean?
io.github.thibaultbee.streampack.error.StreamPackError: java.lang.NullPointerException: Attempt to invoke virtual method 'int android.media.audiofx.AcousticEchoCanceler.setEnabled(boolean)' on a null object reference
The error occurs at this line:
cameraRtmpLiveStreamer?.configure(audioConfig, videoConfig)
Adds support for the hardware HEVC video encoder when it is available on the device.
Why?
Improves video quality for the same bitrate.
TODO
Hi,
My app's camera AutoFitSurfaceView preview image is too thin in portrait but too fat in landscape. How can I solve this problem?
Hi,
Thanks for the library. It works, but on one of my devices the app crashes as soon as I call stream.release(). The crash happens in native code, so there is no way to try/catch it. Here is the crash log:
12-19 19:25:49.181 16919 16919 F libc : Fatal signal 11 (SIGSEGV), code 1, fault addr 0x0 in tid 16919 (com.my.example)
12-19 19:25:49.214 18091 18091 F DEBUG : *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
12-19 19:25:49.214 18091 18091 F DEBUG : Build fingerprint: 'samsung/nobleltelgt/nobleltelgt:7.0/NRD90M/N920LKLU2DVG1:user/release-keys'
12-19 19:25:49.215 18091 18091 F DEBUG : Revision: '9'
12-19 19:25:49.215 18091 18091 F DEBUG : ABI: 'arm64'
12-19 19:25:49.215 18091 18091 F DEBUG : pid: 16919, tid: 16919, name: com.my.example >>> com.my.example <<<
12-19 19:25:49.215 18091 18091 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0
12-19 19:25:49.215 18091 18091 F DEBUG : x0 0000007ab7a8a100 x1 0000007fe5412810 x2 0000000000000000 x3 0000007ab794dde2
12-19 19:25:49.215 18091 18091 F DEBUG : x4 0000007fe5412710 x5 0000800000000000 x6 0000000000ffffff x7 ffffffffffffffff
12-19 19:25:49.215 18091 18091 F DEBUG : x8 0000000000000000 x9 0000000000000000 x10 0000000000430000 x11 0000000072287fc0
12-19 19:25:49.215 18091 18091 F DEBUG : x12 00000000722e6660 x13 ffffff0000000000 x14 0000007a83c6d800 x15 0000000000000000
12-19 19:25:49.215 18091 18091 F DEBUG : x16 0000007aaa93b578 x17 0000007ab885f36c x18 00000000ebad6082 x19 0000000000000000
12-19 19:25:49.215 18091 18091 F DEBUG : x20 0000007fe5412810 x21 0000007ab7a8a100 x22 0000007fe5412b2c x23 00000000729bbb63
12-19 19:25:49.215 18091 18091 F DEBUG : x24 0000000000000004 x25 47469ecc02d4fc0d x26 0000007ab7ac7a98 x27 47469ecc02d4fc0d
12-19 19:25:49.215 18091 18091 F DEBUG : x28 0000000000000001 x29 0000007fe54127f0 x30 0000007aaa8e8884
12-19 19:25:49.215 18091 18091 F DEBUG : sp 0000007fe54127d0 pc 0000007ab885f388 pstate 0000000020000000
12-19 19:25:49.583 18091 18091 F DEBUG :
12-19 19:25:49.583 18091 18091 F DEBUG : backtrace:
12-19 19:25:49.583 18091 18091 F DEBUG : #00 pc 000000000000e388 /system/lib64/libutils.so (_ZNK7android7RefBase9decStrongEPKv+28)
12-19 19:25:49.583 18091 18091 F DEBUG : #1 pc 0000000000033880 /system/lib64/libmedia_jni.so
12-19 19:25:49.583 18091 18091 F DEBUG : #2 pc 00000000026e41b0 /system/framework/arm64/boot-framework.oat (offset 0x1fe8000) (android.media.MediaCodec.native_release+124)
12-19 19:25:49.584 18091 18091 F DEBUG : #3 pc 00000000026e6950 /system/framework/arm64/boot-framework.oat (offset 0x1fe8000) (android.media.MediaCodec.release+60)
12-19 19:25:49.584 18091 18091 F DEBUG : #4 pc 00000000000d1eb4 /system/lib64/libart.so (art_quick_invoke_stub+580)
12-19 19:25:49.584 18091 18091 F DEBUG : #5 pc 00000000000deb88 /system/lib64/libart.so (_ZN3art9ArtMethod6InvokeEPNS_6ThreadEPjjPNS_6JValueEPKc+208)
12-19 19:25:49.584 18091 18091 F DEBUG : #6 pc 000000000028db00 /system/lib64/libart.so (_ZN3art11interpreter34ArtInterpreterToCompiledCodeBridgeEPNS_6ThreadEPNS_9ArtMethodEPKNS_7DexFile8CodeItemEPNS_11ShadowFrameEPNS_6JValueE+312)
12-19 19:25:49.584 18091 18091 F DEBUG : #7 pc 0000000000286adc /system/lib64/libart.so (_ZN3art11interpreter6DoCallILb0ELb0EEEbPNS_9ArtMethodEPNS_6ThreadERNS_11ShadowFrameEPKNS_11InstructionEtPNS_6JValueE+592)
12-19 19:25:49.584 18091 18091 F DEBUG : #8 pc 00000000005565c8 /system/lib64/libart.so (MterpInvokeVirtualQuick+452)
12-19 19:25:49.584 18091 18091 F DEBUG : #9 pc 00000000000c8614 /system/lib64/libart.so (ExecuteMterpImpl+29972)
12-19 19:25:51.089 18091 18091 E : ro.debug_level = 0x4f4c
12-19 19:25:51.090 18091 18091 E : sys.mobilecare.preload = false
Hi,
I receive SocketException: Connection does not exist every time while streaming to a remote SRT device with a passphrase, but it works fine when streaming to a remote SRT device without a passphrase. Both use the sample app with the latest release version 2.5.2.
I also tried streaming with ffmpeg with a passphrase, and it works fine, so I think it might not be a server problem. (Same Docker image and environment; the only difference is with or without a passphrase.)
Could you please give me some advice on solving this problem?
Thanks a lot.
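For reference, an ffmpeg test like the one described would look something like this; the host, port, and passphrase are placeholders. ffmpeg's libsrt protocol accepts the passphrase and pbkeylen options directly in the URL:

```shell
# Push an MPEG-TS stream over SRT with encryption enabled.
# passphrase must be 10-79 characters; pbkeylen selects the AES key length (16/24/32).
ffmpeg -re -i input.mp4 -c copy -f mpegts \
  "srt://example.com:9998?passphrase=mysecretphrase&pbkeylen=16"
```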
2023-03-13 18:23:50.718 14063-14667/io.github.thibaultbee.streampack.sample E/: onError
io.github.thibaultbee.streampack.error.StreamPackError: io.github.thibaultbee.streampack.error.StreamPackError: java.net.SocketException: Connection does not exist
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$videoEncoderListener$1.onOutputFrame(BaseStreamer.kt:123)
at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder$encoderCallback$1.onOutputBufferAvailable(MediaCodecEncoder.kt:109)
at android.media.MediaCodec$EventHandler.handleCallback(MediaCodec.java:1865)
at android.media.MediaCodec$EventHandler.handleMessage(MediaCodec.java:1763)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loopOnce(Looper.java:233)
at android.os.Looper.loop(Looper.java:334)
at android.os.HandlerThread.run(HandlerThread.java:67)
Caused by: io.github.thibaultbee.streampack.error.StreamPackError: java.net.SocketException: Connection does not exist
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$muxListener$1.onOutputFrame(BaseStreamer.kt:135)
at io.github.thibaultbee.streampack.internal.muxers.ts.utils.TSOutputCallback.writePacket(TSOutputCallback.kt:24)
at io.github.thibaultbee.streampack.internal.muxers.ts.packets.TS.write(TS.kt:130)
at io.github.thibaultbee.streampack.internal.muxers.ts.packets.Pes.write(Pes.kt:51)
at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.generateStreams(TSMuxer.kt:154)
at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.encode(TSMuxer.kt:143)
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$videoEncoderListener$1.onOutputFrame(BaseStreamer.kt:120)
at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder$encoderCallback$1.onOutputBufferAvailable(MediaCodecEncoder.kt:109)
at android.media.MediaCodec$EventHandler.handleCallback(MediaCodec.java:1865)
at android.media.MediaCodec$EventHandler.handleMessage(MediaCodec.java:1763)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loopOnce(Looper.java:233)
at android.os.Looper.loop(Looper.java:334)
at android.os.HandlerThread.run(HandlerThread.java:67)
Caused by: java.net.SocketException: Connection does not exist
at io.github.thibaultbee.srtdroid.models.Socket.send(Socket.kt:647)
at io.github.thibaultbee.streampack.ext.srt.internal.endpoints.SrtProducer.write(SrtProducer.kt:160)
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$muxListener$1.onOutputFrame(BaseStreamer.kt:132)
at io.github.thibaultbee.streampack.internal.muxers.ts.utils.TSOutputCallback.writePacket(TSOutputCallback.kt:24)
at io.github.thibaultbee.streampack.internal.muxers.ts.packets.TS.write(TS.kt:130)
at io.github.thibaultbee.streampack.internal.muxers.ts.packets.Pes.write(Pes.kt:51)
at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.generateStreams(TSMuxer.kt:154)
at io.github.thibaultbee.streampack.internal.muxers.ts.TSMuxer.encode(TSMuxer.kt:143)
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer$videoEncoderListener$1.onOutputFrame(BaseStreamer.kt:120)
at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder$encoderCallback$1.onOutputBufferAvailable(MediaCodecEncoder.kt:109)
at android.media.MediaCodec$EventHandler.handleCallback(MediaCodec.java:1865)
at android.media.MediaCodec$EventHandler.handleMessage(MediaCodec.java:1763)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loopOnce(Looper.java:233)
at android.os.Looper.loop(Looper.java:334)
at android.os.HandlerThread.run(HandlerThread.java:67)
The bitrate spikes drastically right at minute 5:00 when streaming to an RTMP server.
To Reproduce
Steps to reproduce the behavior:
Run the demo-camera app on an Android device.
Expected behavior
Bitrate stays at 2000 kb/s for as long as the stream lasts.
Adds support for a minimal set of camera settings:
Why?
Improves camera capture settings.
First off, I'm super excited about this project! Good luck!
I was trying to test the demo app in the first beta release on my Pixel 4a, but I'm getting the following error:
startStream: java.lang.UnsupportedOperationException:
Tried to obtain display from a Context not associated with one. Only visual Contexts (such as Activity or one created with Context#createWindowContext) or ones created with Context#createDisplayContext are associated with displays. Other types of Contexts are typically related to background entities and may return an arbitrary display.
Thanks!
Hi, thanks for your masterpiece.
I am not familiar with Kotlin, and I tried to build it but failed. I really want to try SRT on my phone; I was wondering, is there any release APK package?
Basically, I would like to show a VU meter with the audio volume/gain/amplitude being sent while streaming.
I could not find any way to do it; I wonder if that is possible?
Thanks
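A VU meter like the one described usually just needs the RMS level of the captured PCM buffers. A minimal sketch, assuming 16-bit PCM samples; rmsDbfs is a hypothetical helper, not a StreamPack API:

```kotlin
import kotlin.math.log10
import kotlin.math.sqrt

// Computes the RMS level of a 16-bit PCM buffer in dBFS (0 dBFS = full scale),
// which is the number a VU meter bar would display.
fun rmsDbfs(samples: ShortArray): Double {
    if (samples.isEmpty()) return Double.NEGATIVE_INFINITY
    val sumOfSquares = samples.sumOf { s -> s.toDouble() * s.toDouble() }
    val rms = sqrt(sumOfSquares / samples.size)
    return if (rms == 0.0) Double.NEGATIVE_INFINITY // digital silence
    else 20 * log10(rms / Short.MAX_VALUE)
}
```

In practice the samples would be tapped from the audio capture path before encoding.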
Adds RIST support: see https://www.rist.tv/
Why?
Improves compatibility with RIST devices.
Prerequisites
An Android RIST wrapper.
Do you plan on exposing GL so OpenGL can be used to apply filters to the streaming video? Most similar libraries on Android (https://github.com/pedroSG94/rtmp-rtsp-stream-client-java) and HaishinKit on iOS provide this feature, or some way to filter or modify the broadcast image. I specifically want this feature to add an overlay image via a shader, similar to this: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/4b85bce8475deb97380dad3bbe51aca6d24087ea/encoder/src/main/java/com/pedro/encoder/input/gl/render/filters/object/ImageObjectFilterRender.java
As a side note, I think this library has a lot of potential and is heading in the right direction.
Adds support for the hardware Opus audio encoder when it is available on the device.
Why?
Improves audio quality for the same bitrate.
Prerequisites
A phone with a hardware Opus encoder.
Recently, I drafted a PR for the api.video iOS library to support mirroring the video image: apivideo/api.video-swift-live-stream#11
I hoped to come here and do the same, but I think it may not be possible. I was curious whether you had any thoughts on a solution.
My initial attempt was to setMatrix on the previewSurface as well as the encoderSurface. But this requires locking the canvas, which I was unable to do while the preview is active. Here is the Mirror class I was experimenting with. The new parameter cameraCapture is passed in when the CameraSettings class is instantiated.
class Mirror(
    private val context: Context,
    private val cameraController: CameraController,
    private val cameraCapture: CameraCapture
) {
    private val notMirrored = Matrix() // identity: no flip
    private val mirrored = Matrix()
    private var _isMirrored = false

    init {
        // setScale(0F, 0F) would collapse the image to a point;
        // the "not mirrored" matrix should be the identity.
        notMirrored.setScale(1F, 1F)
        // Flip horizontally around the vertical axis.
        mirrored.setScale(-1F, 1F)
    }

    var isMirrored: Boolean
        get() = _isMirrored
        set(newValue) {
            val surface = cameraCapture.previewSurface ?: return
            // Passing null locks the whole surface area.
            val canvas: Canvas = surface.lockCanvas(null)
            canvas.setMatrix(if (newValue) mirrored else notMirrored)
            surface.unlockCanvasAndPost(canvas)
            _isMirrored = newValue
        }
}
Most solutions seem to involve modifying the actual SurfaceView or TextureView, but this library relies on the Camera2 API to handle the surfaces (targets) directly.
Curious if you have any additional ideas on how to implement mirroring (flipping) the image.
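For what it's worth, at the pixel level the effect being asked for is a horizontal flip of each scanline, which is what the setScale(-1F, 1F) matrix expresses. A minimal, GPU-free sketch on an ARGB buffer, illustrative only and not a StreamPack API:

```kotlin
// Flips a width x height ARGB pixel buffer horizontally (mirror effect).
fun mirrorHorizontally(pixels: IntArray, width: Int, height: Int): IntArray {
    require(pixels.size == width * height) { "buffer size must match dimensions" }
    val out = IntArray(pixels.size)
    for (y in 0 until height) {
        for (x in 0 until width) {
            // Pixel at column x moves to column (width - 1 - x) on the same row.
            out[y * width + (width - 1 - x)] = pixels[y * width + x]
        }
    }
    return out
}
```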
Would you mind enhancing it to support torch/flashlight parameter?
Thanks.
Add link aggregation support in order to transmit the stream over multiple links (Ethernet, Wi-Fi, cellular, ...).
Why?
Improves stream quality by using the cellular and Wi-Fi modems.
Prerequisite
SRT support, then srtdroid support.
Pixel 6 Pro Android 12
SRT
Yes
Using a local SRT stream server on a Linux system and trying to run the app in an emulator.
It should stream the video.
E/StandaloneCoroutine: startStream failed
java.net.ConnectException: Connection setup failure: connection timed out
at io.github.thibaultbee.srtdroid.models.Socket.connect(Socket.kt:241)
at io.github.thibaultbee.srtdroid.models.Socket.connect(Socket.kt:254)
at io.github.thibaultbee.streampack.ext.srt.internal.endpoints.SrtProducer$connect$4.invokeSuspend(SrtProducer.kt:119)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:3
2.5.2
When recording the screen and selecting 1920x1080 as the resolution, in portrait mode the receiving side shows black around the phone screen, as a full-frame 1920x1080 video.
If the phone is in portrait, an option to lock this to 1080x1920, for example, so that the video is sized correctly on the receiving end.
An option in settings to lock the resolution to the starting orientation, and to flip the resolution so that it is sized correctly.
TL;DR: A single device has experienced a malfunction while using the zoom functionality. If this turns out to be a bigger issue, it should be looked at further.
If you come across this issue and you have a Nokia device, it would be nice if you could test it so we can get some data on how widespread the issue is.
When zooming, in this case using the StreamPack demo-camera app, setting the zoomRatio makes the preview stop displaying the camera view, with the error message E/CameraDeviceCallback: Camera 0 is in error 4.
At this point we don't know the scope of this issue.
We want to figure out its cause.
These are the candidates we could come up with.
First, confirm that the issue you are experiencing is the same as described here.
If it is the same issue, please provide the following information.
if (value > 1f && zoomRatioRange.value != null) Log.d("STREAMPACK - ZoomRange:", zoomRatioRange.value!!.lower.toString() + " - " + zoomRatioRange.value!!.upper.toString())
(also referenced in the link just below)
E/CameraDeviceCallback: Camera 0 is in error 4
On a Nokia T20 it works without any errors and without any similarities to the issue.
This issue has been discussed here:
As of now, this issue has only been experienced on a single device and has not been reproduced on any other device, so it would not be considered a critical fault in StreamPack. But if, for instance, the scope of the issue affected all Nokia devices on Android 11 and above, it might have to be looked at.
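One defensive check worth adding while gathering data: clamp the requested ratio to the range the device advertises, mirroring Camera2's CONTROL_ZOOM_RATIO_RANGE. clampZoomRatio is a hypothetical helper, not a StreamPack API:

```kotlin
// Clamps a requested zoom ratio into the device-reported [lower, upper] range,
// so an out-of-range request can never reach the capture session.
fun clampZoomRatio(requested: Float, lower: Float, upper: Float): Float =
    requested.coerceIn(lower, upper)
```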
Hi, is it possible to assign an SRT passphrase to a streamer?
Hello!
Thanks for the library!
I'm using a TextureView to show the preview, and there is a transition that changes the TextureView's size (and aspect ratio). I need the preview width and height to calculate the scale for the matrix so the preview keeps the correct aspect ratio.
Is there any way to get the stream preview dimensions?
Thanks
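If the preview buffer size is known, the matrix scale described above can be sketched as follows (center-crop style; names are illustrative only):

```kotlin
// Given the preview buffer size and the current view size, compute the X/Y
// scale factors that restore the preview's aspect ratio while filling the view.
fun aspectScale(
    previewWidth: Int, previewHeight: Int,
    viewWidth: Int, viewHeight: Int
): Pair<Float, Float> {
    val previewAspect = previewWidth.toFloat() / previewHeight
    val viewAspect = viewWidth.toFloat() / viewHeight
    return if (viewAspect > previewAspect) {
        // View is relatively wider: stretch Y so the frame still fills it.
        1f to (viewAspect / previewAspect)
    } else {
        // View is relatively taller: stretch X instead.
        (previewAspect / viewAspect) to 1f
    }
}
```

The resulting pair would feed a Matrix.setScale call centered on the view.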
Hi,
I am trying to run the screen recorder demo app on my OnePlus 9RT device (Android 12). I have selected the RTMP protocol and set the server URL in settings. These are the settings I used:
But when pressing the record screen button, I get the following crash on my phone:
An error occurred
io.github.thibaultbee.streampack.error.StreamPackError: java.lang.IllegalArgumentException
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:242)
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:264)
at io.github.thibaultbee.streampack.screenrecorder.ScreenRecorderService.onStartCommand(ScreenRecorderService.kt:314)
at android.app.ActivityThread.handleServiceArgs(ActivityThread.java:4807)
at android.app.ActivityThread.access$2100(ActivityThread.java:254)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2222)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loopOnce(Looper.java:233)
at android.os.Looper.loop(Looper.java:344)
at android.app.ActivityThread.main(ActivityThread.java:8212)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:584)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1034)
Caused by: java.lang.IllegalArgumentException
at android.media.MediaCodec.native_configure(Native Method)
at android.media.MediaCodec.configure(MediaCodec.java:2176)
at android.media.MediaCodec.configure(MediaCodec.java:2092)
at io.github.thibaultbee.streampack.internal.encoders.MediaCodecEncoder.configureCodec(MediaCodecEncoder.kt:180)
at io.github.thibaultbee.streampack.internal.encoders.VideoMediaCodecEncoder.createVideoCodec(VideoMediaCodecEncoder.kt:102)
at io.github.thibaultbee.streampack.internal.encoders.VideoMediaCodecEncoder.configure(VideoMediaCodecEncoder.kt:59)
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:237)
at io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:264)
at io.github.thibaultbee.streampack.screenrecorder.ScreenRecorderService.onStartCommand(ScreenRecorderService.kt:314)
at android.app.ActivityThread.handleServiceArgs(ActivityThread.java:4807)
at android.app.ActivityThread.access$2100(ActivityThread.java:254)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2222)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loopOnce(Looper.java:233)
at android.os.Looper.loop(Looper.java:344)
at android.app.ActivityThread.main(ActivityThread.java:8212)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:584)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1034)
Thank you very much!!!
2.5.2
Just an idea, as this is now a very popular system used by thousands of people for IRL streaming... but I understand this may be a lot of work.
Support for SRTLA, an SRT transport proxy with link aggregation for connection bonding, to make use of the mobile network and Wi-Fi to create a bonded connection. SRTLA is open source and already implemented in the 'IRL Pro' streaming app; however, that app only provides camera support, not screen recording.
SRTLA is used as part of the BELABOX streaming project, which allows bonding over multiple connections. More information regarding SRTLA here: https://github.com/BELABOX/srtla
Thank you, and thank you for the support you have already provided.
Implement an option to use an SRTLA server as an endpoint, making use of mobile cellular data and Wi-Fi.
2.5.2
Emulator
It would be nice to have a background streaming service.
Hi, thank you for making this wonderful project. The SRT part works very well, but there seem to be some problems with the RTMP part: when I stop RTMP publishing, the app crashes, with a log like this:
11/14 23:58:46: Launching 'demo-camera' on smartisan OS105.
Install successfully finished in 3 s 498 ms.
$ adb shell am start -n "io.github.thibaultbee.streampack.sample/io.github.thibaultbee.streampack.app.ui.main.MainActivity" -a android.intent.action.MAIN -c android.intent.category.LAUNCHER
Connected to process 5910 on device 'smartisan-os105-b558c244'.
Capturing and displaying logcat messages from application. This behavior can be disabled in the "Logcat output" section of the "Debugger" settings page.
I/art: Late-enabling -Xcheck:jni
I/System: Daemon delayGCRequest, sDelayGCRequest=false, delay=true, sPendingGCRequest=false
W/art: Before Android 4.1, method android.graphics.PorterDuffColorFilter androidx.vectordrawable.graphics.drawable.VectorDrawableCompat.updateTintFilter(android.graphics.PorterDuffColorFilter, android.content.res.ColorStateList, android.graphics.PorterDuff$Mode) would have incorrectly overridden the package-private method in android.graphics.drawable.Drawable
I/CameraManagerGlobal: Connecting to camera service
W/CameraManagerGlobal: [soar.cts] ignore the status update of camera: 2
W/CameraManagerGlobal: [soar.cts] ignore the status update of camera: 3
W/DataBinding: Setting the fragment as the LifecycleOwner might cause memory leaks because views lives shorter than the Fragment. Consider using Fragment's view lifecycle
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
W/Utils: could not parse long range '175-174'
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
W/VideoCapabilities: Unrecognized profile/level 0/3 for video/mpeg2
W/VideoCapabilities: Unrecognized profile/level 0/3 for video/mpeg2
W/VideoCapabilities: Unsupported mime video/x-ms-wmv
W/VideoCapabilities: Unsupported mime video/x-ms-wmv
W/VideoCapabilities: Unsupported mime video/x-ms-wmv
W/VideoCapabilities: Unsupported mime video/divx
W/VideoCapabilities: Unsupported mime video/divx311
W/VideoCapabilities: Unsupported mime video/divx4
W/VideoCapabilities: Unsupported mime video/mp4v-esdp
I/VideoCapabilities: Unsupported profile 4 for video/mp4v-es
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
I/VideoMediaCodecEncoder: Selected encoder OMX.qcom.video.encoder.avc
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.qcom.video.encoder.avc selected
I/MediaCodec: MediaCodec will operate in async mode
E/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010
I/ExtendedACodec: setupVideoEncoder()
W/ACodec: do not know color format 0x7fa30c04 = 2141391876
W/ACodec: do not know color format 0x7fa30c00 = 2141391872
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/ACodec: setupAVCEncoderParameters with [profile: High] [level: Level51]
I/ACodec: [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
I/ACodec: setupVideoEncoder succeeded
I/ExtendedACodec: [OMX.qcom.video.encoder.avc] configure, AMessage : AMessage(what = 'conf', target = 1) = {
string mime = "video/avc"
int32_t frame-rate = 30
int32_t color-format = 2130708361
int32_t profile = 8
int32_t height = 1280
int32_t width = 720
int32_t bitrate = 2000000
float i-frame-interval = 1.000000
int32_t level = 32768
int32_t encoder = 1
}
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/Adreno: QUALCOMM build : 54dba37, I1b6e53de78
Build Date : 02/21/18
OpenGL ES Shader Compiler Version: XE031.14.00.04
Local Branch :
Remote Branch : quic/gfx-adreno.lnx.1.0.r9-rel
Remote Branch : NONE
Reconstruct Branch : NOTHING
I/Adreno: PFP: 0x005ff087, ME: 0x005ff063
D/: SurfaceMonitor closed!
I/AudioMediaCodecEncoder: Selected encoder OMX.google.aac.encoder
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.google.aac.encoder selected
I/MediaCodec: MediaCodec will operate in async mode
D/StandaloneCoroutine: Streamer is created
D/OpenGLRenderer: RenderMonitor init!
D/OpenGLRenderer: RenderMonitor closed!
I/OpenGLRenderer: Initialized EGL, version 1.4
D/OpenGLRenderer: Swap behavior 1
I/PreviewView: Starting on camera: 0
D/PreviewView: View finder size: 1080 x 2010
D/PreviewView: Selected preview size: 1920x1080
I/art: Do partial code cache collection, code=17KB, data=29KB
I/art: After code cache collection, code=17KB, data=29KB
I/art: Increasing code cache capacity to 128KB
D/AutoFitSurfaceView: Measured dimensions set: 1131 x 2010
W/art: Before Android 4.1, method double java.util.concurrent.ThreadLocalRandom.internalNextDouble(double, double) would have incorrectly overridden the package-private method in java.util.Random
W/art: Before Android 4.1, method int java.util.concurrent.ThreadLocalRandom.internalNextInt(int, int) would have incorrectly overridden the package-private method in java.util.Random
W/art: Before Android 4.1, method long java.util.concurrent.ThreadLocalRandom.internalNextLong(long, long) would have incorrectly overridden the package-private method in java.util.Random
I/CameraController: Supported FPS range list: [[15, 15], [20, 20], [24, 24], [7, 30], [30, 30], [10, 30]]
D/CameraController: Selected Fps range [30, 30]
I/System: Daemon delayGCRequest, sDelayGCRequest=true, delay=false, sPendingGCRequest=false
I/: Connection succeeded
D/ACodec: dataspace changed to 0x10c10000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:1(BT709), T:3(SMPTE_170M))
I/: Format changed : {csd-1=java.nio.HeapByteBuffer[pos=0 lim=9 cap=9], mime=video/avc, frame-rate=30, width=720, height=1280, color-standard=1, color-range=2, bitrate=2000000, csd-0=java.nio.HeapByteBuffer[pos=0 lim=22 cap=22], color-transfer=3, max-bitrate=2000000}
I/: Format changed : {bitrate=128000, mime=audio/mp4a-latm, csd-0=java.nio.HeapByteBuffer[pos=0 lim=2 cap=2], channel-count=2, sample-rate=44100, max-bitrate=128000}
D/VideoMediaCodecEncoder: Not running
D/AudioMediaCodecEncoder: Not running
I/AudioMediaCodecEncoder: Selected encoder OMX.google.aac.encoder
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.google.aac.encoder selected
I/MediaCodec: MediaCodec will operate in async mode
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706434 for video/avc
I/VideoMediaCodecEncoder: Selected encoder OMX.qcom.video.encoder.avc
I/OMXClient: MuxOMX ctor
E/ACodec: found 1 codecs
E/ACodec: codec OMX.qcom.video.encoder.avc selected
I/MediaCodec: MediaCodec will operate in async mode
E/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010
I/ExtendedACodec: setupVideoEncoder()
W/ACodec: do not know color format 0x7fa30c04 = 2141391876
W/ACodec: do not know color format 0x7fa30c00 = 2141391872
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/ACodec: setupAVCEncoderParameters with [profile: High] [level: Level51]
I/ACodec: [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
I/ACodec: setupVideoEncoder succeeded
I/ExtendedACodec: [OMX.qcom.video.encoder.avc] configure, AMessage : AMessage(what = 'conf', target = 10) = {
string mime = "video/avc"
int32_t frame-rate = 30
int32_t color-format = 2130708361
int32_t profile = 8
int32_t height = 1280
int32_t width = 720
int32_t bitrate = 2000000
float i-frame-interval = 1.000000
int32_t level = 32768
int32_t encoder = 1
}
W/ACodec: do not know color format 0x7f000789 = 2130708361
A/libc: Fatal signal 7 (SIGBUS), code 1, fault addr 0x7f617f72b1 in tid 5910 (reampack.sample)
I found that if I comment out socket.close()
in RtmpProducer.kt
, the A/libc: Fatal signal 7 (SIGBUS), code 1, fault addr 0x7f617f72b1 in tid 5910 (reampack.sample)
crash no longer occurs, but then the socket is not closed, and when publishing starts again, the app crashes again.
I hope you can provide some help. Thanks.
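A SIGBUS on close often means another thread is still writing to the socket's native resources after they have been freed. One common mitigation is to gate writes and close behind a shared flag; the sketch below is illustrative only (not StreamPack's actual code), and a real fix would also need to stop and join the writer thread before closing:

```kotlin
import java.util.concurrent.atomic.AtomicBoolean

/** Illustrative guard: refuse writes once close() has begun, and make
 *  close() idempotent, so no thread touches freed native resources.
 *  The doWrite/doClose lambdas stand in for the real socket calls. */
class GuardedSocket(
    private val doWrite: (ByteArray) -> Unit,
    private val doClose: () -> Unit,
) {
    private val open = AtomicBoolean(true)

    /** Returns false (and writes nothing) once the socket is closed. */
    fun write(data: ByteArray): Boolean {
        if (!open.get()) return false
        doWrite(data)
        return true
    }

    /** Only the first call actually closes the underlying socket. */
    fun close() {
        if (open.compareAndSet(true, false)) doClose()
    }
}
```

This only narrows the race window; the writer thread must still be fully stopped before `close()` to rule out the crash entirely.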
Thanks! Will this project support the SRT streamid in the future?
I ran the sample app on Android. Providing an RTMP URL with the streaming key starts a successful stream, but when I check the actual data received I get nothing. This library is also being used in api.video-reactnative-live-stream
2.5.2
N/A
N/A
No response
No response
Not tested
Call srtProducer.connect(url)
where url is a valid SRT URL.
The method returns successfully.
InvalidParameterException: Failed to parse URL <xx>: unknown host
is thrown (from SrtProducer.kt:89)
Hi! I encountered this small corner case while extending the SRT streamer and playing with it in a toy setup.
It appears that the safe call at SrtProducer.kt:120 will return null when onConnectionListener
has not been set.
This seems to cause SrtProducer.kt:88 to evaluate to null even when uri.host
is valid and the connection has succeeded. So it throws the error even when everything has worked.
I believe a possible fix might be to return true or Unit at the end of the connect(ip, port)
method so that it's not null when successful. (Though I'm not sure of the code design/style considerations here, so I thought it better to just note the issue rather than submit a PR.)
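The pattern can be reproduced in plain Kotlin, independent of the library (a minimal sketch; the names are illustrative, not StreamPack's actual code):

```kotlin
// Minimal stand-alone reproduction of the bug pattern: a function whose
// last expression is a safe call on a nullable listener evaluates to null
// when the listener is unset, even though the "connection" succeeded.
interface ConnectionListener { fun onSuccess() }

class Producer {
    var onConnectionListener: ConnectionListener? = null

    // Buggy shape: the safe call is the last expression, so the result
    // is null whenever onConnectionListener is null.
    fun connectBuggy(): Unit? {
        // ... imagine a successful connection here ...
        return onConnectionListener?.onSuccess()
    }

    // Fixed shape: always end with a non-null value so callers can't
    // mistake "no listener set" for "connection failed".
    fun connectFixed() {
        // ... imagine a successful connection here ...
        onConnectionListener?.onSuccess()
    }
}
```

With no listener set, `connectBuggy()` evaluates to null while `connectFixed()` returns Unit, which matches the difference described above.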
Cheers,
George
No response
2.5.2
Pixel 5a
I would like to show the current bitrate to the user, but I am not able to get it.
A value in the settings object would be nice, or a callback on bitrate change.
No response
Many Android devices do not have H.264/H.265 encoders, but they do have either a VP8 or VP9 encoder. How about extending support to these encoders as well?
If the initial state of the demo is portrait mode, the sink display is normal, and if the phone is then switched between landscape and portrait, it keeps working well.
If the initial state of the phone is landscape, the displayed picture is stretched:
before clicking the "record screen" button, keep the phone in landscape mode, and then click "record screen".
The picture displayed on the sink is stretched, and if the phone later switches between landscape and portrait, the picture is also distorted.
Adds an adaptive bitrate algorithm to adjust the bitrate to the available bandwidth.
Why ?
Improves stream quality.
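A common starting point for such a feature is an additive-increase/multiplicative-decrease (AIMD) loop driven by transport statistics such as packet loss. The sketch below is a toy illustration of the idea under assumed thresholds; none of the names or numbers come from StreamPack:

```kotlin
/** Toy AIMD bitrate adapter: back off multiplicatively on loss,
 *  probe upward additively otherwise. All thresholds are illustrative. */
class AimdBitrateAdapter(
    private val minBitrate: Int = 300_000,
    private val maxBitrate: Int = 5_000_000,
) {
    var bitrate: Int = 1_000_000
        private set

    /** Feed a fresh loss ratio (0.0..1.0) from the transport stats. */
    fun onStats(packetLossRatio: Double) {
        bitrate = if (packetLossRatio > 0.05) {
            // Loss detected: cut by 20%, but never below the floor.
            (bitrate * 0.8).toInt().coerceAtLeast(minBitrate)
        } else {
            // Healthy link: probe upward in small additive steps.
            (bitrate + 50_000).coerceAtMost(maxBitrate)
        }
    }
}
```

In a real integration the new value would be pushed to the encoder (e.g. via a bitrate-change request) each time it changes.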
2.5.2
samsung galaxy 21+
pixel 6 pro
SRT
AAC
sampleRate = 44100,
startBitrate = 128000,
default setting
resolution: 120x120
No
create stream
view stream on another android device
low lag video stream
very laggy video stream
No response
No response
I've been using StreamPack on Android via api.video-reactnative-live-stream.
Running on a device or emulator, the broadcasting service livepeer.studio is not able to show the broadcast camera content on Android, even though I am able to see it if I broadcast to a local ffplay -listen 1 -i rtmp://0.0.0.0:1935/s/streamKey.
After talking with a colleague of mine at Livepeer, he debugged the RTMP stream received content. His feedback is:
The RTMP content is completely corrupt:
[124+0] 58 bytes of H264 video keyframe header
[124+0] 4020 bytes of H264 video keyframe NALU
[158+0] 1476 bytes of H264 video iframe NALU
[191+0] 1060 bytes of H264 video iframe NALU
[224+0] 1412 bytes of H264 video iframe NALU
[235+0] 1524 bytes of H264 video iframe NALU
[221+0] 1476 bytes of H264 video iframe NALU
[231+0] 1540 bytes of H264 video iframe NALU
[219+0] 1412 bytes of H264 video iframe NALU
[229+0] 1572 bytes of H264 video iframe NALU
[238+0] 1588 bytes of H264 video iframe NALU
[226+0] 2116 bytes of H264 video iframe NALU
[212+0] 2404 bytes of H264 video iframe NALU
[222+0] 1156 bytes of H264 video iframe NALU
[233+0] 1572 bytes of H264 video iframe NALU
[220+0] 1588 bytes of H264 video iframe NALU
[229+0] 1956 bytes of H264 video iframe NALU
[217+0] 1780 bytes of H264 video iframe NALU
[227+0] 2036 bytes of H264 video iframe NALU
[213+0] 2052 bytes of H264 video iframe NALU
[224+0] 2228 bytes of H264 video iframe NALU
[211+0] 2228 bytes of H264 video iframe NALU
[220+0] 1476 bytes of H264 video iframe NALU
[208+0] 1828 bytes of H264 video iframe NALU
[218+0] 1956 bytes of H264 video iframe NALU
[204+0] 2036 bytes of H264 video iframe NALU
[215+0] 1284 bytes of H264 video iframe NALU
[225+0] 1348 bytes of H264 video iframe NALU
[235+0] 1428 bytes of H264 video iframe NALU
[245+0] 964 bytes of H264 video iframe NALU
[209+0] 868 bytes of H264 video iframe NALU
Numbers in front are timestamps in milliseconds. They're all over the place!
Do you know why this is happening?
Thanks
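RTMP timestamps are expected to be milliseconds, monotonically increasing from the start of the stream, so values jumping around like the ones above usually suggest that raw encoder PTS values were written without being rebased. A minimal sketch of the usual normalization (a hypothetical helper, not StreamPack's actual code):

```kotlin
/** Rebases absolute encoder PTS values (in microseconds) to RTMP-style
 *  timestamps: milliseconds relative to the first frame.
 *  Illustrative helper, not part of StreamPack. */
class RtmpTimestampRebaser {
    private var firstPtsUs = Long.MIN_VALUE

    fun toRtmpMs(ptsUs: Long): Int {
        // Latch the first PTS seen and measure everything from it.
        if (firstPtsUs == Long.MIN_VALUE) firstPtsUs = ptsUs
        return ((ptsUs - firstPtsUs) / 1000).toInt()
    }
}
```

With this shape, a stream whose encoder PTS starts at an arbitrary offset still produces 0, 33, 66, ... for 30 fps output instead of scattered absolute values.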
Hi Thibault! First of all, thanks for the library!
Implementing streaming in my project, I followed your experience as an example, but I ran into some difficulties. Do you mind if I ask a few questions?
When you start the application, it is noticeable that the camera frame rate is below 30, although the value is set to 30, and when you start the stream, the fps drops even more. As far as I understand, this is due to holder.setFixedSize() being called every time the surface changes.
I tried to call holder.setFixedSize() only in surfaceCreated() and then there are no problems with fps, but in this case, when the stream starts, the information about the set fps does not reach the server, which leads to errors and the stream does not play.
Can you please tell me how to fix this? I would be grateful for any advice!
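The "call setFixedSize() only once" idea can be sketched without any Android dependency (the FakeSurfaceHolder interface below merely stands in for android.view.SurfaceHolder so the sketch is self-contained; whether skipping the call in surfaceChanged() is safe for every device remains the open question in this thread):

```kotlin
// Stand-in for android.view.SurfaceHolder so the sketch compiles
// without the Android SDK.
interface FakeSurfaceHolder {
    fun setFixedSize(width: Int, height: Int)
}

/** Calls setFixedSize() at most once, on surface creation, instead of
 *  on every surface change, to avoid surfaceChanged/setFixedSize loops
 *  that can drag the preview fps down. */
class OneShotSizeSetter(private val width: Int, private val height: Int) {
    var applied = false
        private set

    fun onSurfaceCreated(holder: FakeSurfaceHolder) {
        if (!applied) {
            holder.setFixedSize(width, height)
            applied = true
        }
    }

    fun onSurfaceChanged(holder: FakeSurfaceHolder) {
        // Intentionally empty: do not request the size again here.
    }
}
```

The open issue the reporter describes (the server not receiving the configured fps) would still need to be solved separately; this only shows the guard itself.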
Hi
I wanted to know if this protocol is a good fit for a webinar with 5 callers and 50 listeners?
Hi, I used the app with SRT. I want to encrypt the stream, but I do not know how to fill in the passphrase field.
Could you give me some advice?
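For SRT encryption, both ends must use the same passphrase, and libsrt requires it to be 10 to 79 characters long. On the receiving side, ffmpeg/ffplay accept it as URL query parameters (passphrase, pbkeylen). A small Kotlin helper illustrating how such a receiver URL is assembled; the host, port, and passphrase here are placeholders:

```kotlin
/** Builds a listener-mode SRT URL for a receiver such as ffplay.
 *  libsrt rejects passphrases outside 10..79 characters, so we
 *  validate that up front. pbkeylen selects the AES key length
 *  in bytes (16, 24, or 32). */
fun srtListenerUrl(passphrase: String, port: Int = 9998): String {
    require(passphrase.length in 10..79) {
        "SRT passphrase must be 10-79 characters"
    }
    return "srt://0.0.0.0:$port?mode=listener&passphrase=$passphrase&pbkeylen=16"
}
```

The sender (the app's passphrase field) must then be given exactly the same string, e.g. `ffplay "${srtListenerUrl("ChangeMe_10to79")}"` on the receiving machine.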
Dispatch network and heavy-computation APIs off the main thread.
Concerned API:
Why ?
To avoid creating an unresponsive UI and simplify API usage.
2.5.2
Xiaomi Redmi Note 8T
Is it possible to use a DNS hostname to broadcast the stream, say from no-ip or similar services?
No response
No response
On devices with a realtime source timestamp, it is not possible to read the stream with ffmpeg.
On the ffmpeg side, the following errors appear:
2:41:18.339593/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=0 avail=0 ack.seq=1646609462 pkt.seq=1646609462 rcv-remain=8191 drift=1545
12:41:18.339743/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=1 avail=0 ack.seq=1646609462 pkt.seq=1646609463 rcv-remain=8191 drift=1545
12:41:19.324344/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=8 avail=0 ack.seq=1646609462 pkt.seq=1646609470 rcv-remain=8191 drift=1545
12:41:19.324490/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=9 avail=0 ack.seq=1646609462 pkt.seq=1646609471 rcv-remain=8191 drift=1545
12:41:19.340980/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=12 avail=0 ack.seq=1646609462 pkt.seq=1646609474 rcv-remain=8191 drift=1545
12:41:19.341026/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=13 avail=0 ack.seq=1646609462 pkt.seq=1646609475 rcv-remain=8191 drift=1545
12:41:19.372917/T0x700008c41000!W:SRT.qr: @898072221:No room to store incoming packet: offset=17 avail=0 ack.seq=1646609462 pkt.seq=1646609479 rcv-remain=8191 drift=1545
See #11 (comment)