Comments (18)
Hi,
This librtmp always sets the hasVideo and hasAudio flags, so that is not the problem. However, I managed to solve it: the issue was the timestamp order. I was computing audio and video timestamps correctly, but I was not sending them in the appropriate order. It seems that to mux audio and video over RTMP, every packet, regardless of whether it is video or audio, must carry a timestamp greater than or equal to the previous one. What was happening to me is that I was sending a video frame and right after it an audio frame with a slightly lower timestamp. Now that I make sure every frame (or RTMP packet) carries a timestamp greater than or equal to the previous one, even while interleaving video and audio packets, it works great!
Thanks for your help anyway.
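The monotonic-timestamp rule described here can be sketched as a tiny helper placed just before the muxer calls. `MonotonicTimestamp` is a hypothetical name, not part of librtmp; it simply clamps every outgoing timestamp so the muxed stream never steps backwards, whatever the packet type:

```java
// Hypothetical helper illustrating the fix described above: every packet
// (audio or video) must leave with a timestamp >= the previous one.
class MonotonicTimestamp {
    private int lastMs = 0;

    // Returns a timestamp (in ms) that never decreases across calls.
    int next(int tsMs) {
        if (tsMs < lastMs) {
            tsMs = lastMs; // clamp a late frame instead of stepping backwards
        }
        lastMs = tsMs;
        return tsMs;
    }
}
```

Clamping is the simplest option; reordering frames in a queue (discussed further down the thread) preserves the original timing more faithfully.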
from librtmp-client-for-android.
@insthync it has been more than six months since I last programmed anything related to Java or Android (actually I have never been an Android developer, I am a complete noob 😛). I haven't seen anything obviously wrong in your implementation (apart from the fact that you may be sending the codec configuration frame multiple times), but it seems odd to me that you use the same thread to encode the frames and to send the RTMP data. I used the grafika examples to code the VideoEncoder class and thread; they might help you write a more elaborate encoder class.
Some hints that come to my mind:
- Use a separate thread for the RTMP muxer/sender. It will help if you intend to add audio in the future.
- Make sure the first frame you send to the RTMP service thread is the codec configuration (the SPS and PPS NAL units). Send it only once, as the very first frame.
- Make sure the coded data you append to the RTMP muxer is correct (dump it to disk as a .h264 file and play it with VLC).
- Make sure your frame timestamps are consistent (in milliseconds).
- I used a frame queue to connect the encoder and RTMP muxer threads.
Those are just my two cents; be aware that I am not an Android developer, so use my suggestions at your own risk.
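The third hint (dumping the coded stream to disk) can be sketched with a small debug helper. `H264Dumper` is a hypothetical name; the only idea is appending the raw Annex-B NAL units, configuration first, to a file that VLC can open:

```java
import java.io.FileOutputStream;
import java.io.IOException;

// Hypothetical debug helper: append coded H.264 data (SPS/PPS first, then
// every access unit) to a raw .h264 file for offline inspection with VLC.
class H264Dumper implements AutoCloseable {
    private final FileOutputStream out;

    H264Dumper(String path) throws IOException {
        out = new FileOutputStream(path);
    }

    void write(byte[] nalData) throws IOException {
        out.write(nalData); // Annex-B buffers already carry their start codes
    }

    @Override
    public void close() throws IOException {
        out.close();
    }
}
```

If the resulting file does not play in VLC, the problem is in the encoder output rather than in the RTMP layer.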
Finally I solved part of my problem: there was an issue with my timestamp calculations, so I can confirm they must be given in milliseconds. However, I couldn't manage to stream audio and video together. I can send video and audio independently and each works, but when I try to send both, the connection crashes. I am sending realtime audio and video; does anyone know if I should take care to send video and audio frames in some specific order (i.e. audio waits for video, or video waits for audio)? Right now, as soon as I grab a frame (audio or video) I send it.
Hi,
You mean you do not get video and audio playing together, right? If so, some servers do not send a correct header indicating that the stream contains both video and audio, so ExoPlayer does not initialize the audio track. To work around this, I always let ExoPlayer initialize the audio track, and then audio and video play together.
I comment out the `hasAudio &&` condition.
Do you think you might have a problem like that?
@davidcassany @mekya Hi, I have the same question: I cannot publish video and audio together. The write result returns 0 and the timestamps are increasing (type 9 is video, 8 is audio). How did you solve it?
11-22 01:31:32.306 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 29,dts 0,result 0
11-22 01:31:32.316 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 5184,dts 0,result 0
11-22 01:31:32.348 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 24,dts 67,result 0
11-22 01:31:32.352 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 2,dts 0,result 0
11-22 01:31:32.352 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 92,dts 0,result 0
11-22 01:31:32.372 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 113,dts 20,result 0
11-22 01:31:32.395 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 128,dts 40,result 0
11-22 01:31:32.398 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 124,dts 60,result 0
11-22 01:31:32.415 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 82,dts 134,result 0
11-22 01:31:32.433 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 120,dts 80,result 0
11-22 01:31:32.472 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 120,dts 101,result 0
11-22 01:31:32.486 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 278,dts 201,result 0
11-22 01:31:32.492 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 118,dts 140,result 0
11-22 01:31:32.514 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 117,dts 160,result 0
11-22 01:31:32.532 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 103,dts 180,result 0
11-22 01:31:32.551 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 110,dts 200,result 0
11-22 01:31:32.557 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 1113,dts 267,result 0
11-22 01:31:32.571 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 102,dts 220,result 0
11-22 01:31:32.577 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 97,dts 240,result 0
11-22 01:31:32.615 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 9,size 1594,dts 334,result 0
11-22 01:31:32.633 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 101,dts 260,result 0
11-22 01:31:32.635 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 93,dts 300,result 0
11-22 01:31:32.671 11950-12464/com.pursll.frl.app D/SrsFlvMuxer: rtmp write type 8,size 106,dts 320,result 0
@gzsll your timestamps are not correct: each frame added to the stream should have a dts higher than the previous one (regardless of whether it is audio or video). See the entry at 11-22 01:31:32.415: that frame has 134 as its timestamp, but the next one has 80, which is wrong. To solve it I queued audio and video frames in a custom priority queue, where they were sorted by timestamp, and then fed frames from that queue to the RTMP library.
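The reordering fix can be sketched with a standard `PriorityQueue`. This is a minimal, hypothetical sketch (`ReorderQueue` and `Pkt` are stand-in names for whatever frame type the real code uses):

```java
import java.util.Comparator;
import java.util.PriorityQueue;

// Sketch of the fix: buffer audio and video packets in one priority queue
// keyed by dts, then drain them in timestamp order before handing each
// packet to the RTMP muxer, so the stream's dts never goes backwards.
class ReorderQueue {
    static class Pkt {
        final int type;  // 9 = video, 8 = audio (FLV tag types)
        final int dtsMs;
        Pkt(int type, int dtsMs) { this.type = type; this.dtsMs = dtsMs; }
    }

    private final PriorityQueue<Pkt> queue =
            new PriorityQueue<>(Comparator.comparingInt((Pkt p) -> p.dtsMs));

    void push(Pkt p) { queue.add(p); }

    Pkt pop() { return queue.poll(); } // lowest dts first, or null if empty
}
```

Note that a plain priority queue only helps if frames are buffered a little before draining; popping immediately after each push reproduces the original order.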
@davidcassany should the first frame's timestamp be 0, and every other frame higher than the previous one?
@gzsll I am not sure whether the first frame needs to be zero, but have a close look at your timestamps. See the entries at 11-22 01:31:32.615 and then 11-22 01:31:32.633: the second timestamp belongs to the past. Video and audio timestamps are not independent. If you use Wireshark to inspect your RTMP headers, you will notice the error in them.
@davidcassany thanks, I will try it.
Can you share your source code for sending audio and video?
I am looking for this urgently.
Thank you very much.
Hi @davidarchi,
I will try to summarise the relevant parts, and maybe some code snippets, between today and tomorrow. I hope it helps. I can't share all the code, as the parts I am using are quite coupled to a specific application, and it isn't that short.
@davidcassany Hi, can you check my implementation? Its source code is located at https://github.com/insthync/AndroidSimpleScreenRTMP
For now I am just trying to send video data; the write result is 0 but it does not work, and I don't know what is wrong.
@davidarchi sorry for my delayed answer, I did not have the chance to write the implementation summary as I promised :P. Anyway, better late than never. Here it goes:
I structured the code with the following classes:
- MediaCommons: holds all the media format constants (i.e. frame rate, bitrate, audio codec, video codec, etc.) and also includes the routines to set up the camera. It is a singleton, so the camera is configured the first time you access it, and only once (that is when you learn all the media constants, since they may be device dependent).
- CodedFrame: holds the coded frame bytes plus the codec configuration bytes (the SPS/PPS NAL units or the AAC header). It includes these attributes:

```java
public final static Boolean VIDEO = true;
public final static Boolean AUDIO = false;

byte[] encData;
byte[] extraInfo; // SPS/PPS NAL units or AAC headers
Long timestamp;   // in usec, direct value from the Android encoder (presentationTimeUs)
Boolean video;
```

And these methods:
```java
int getTimestampMilliseconds() {
    return (int) (timestamp / 1000);
}

@Override
public int compareTo(Object o) {
    CodedFrame frame = (CodedFrame) o;
    return timestamp.compareTo(frame.timestamp);
}
```
- AVFrameQueue: a prioritized queue, so all frames, regardless of being audio or video, are sorted by timestamp. It includes these attributes:

```java
private PriorityQueue<CodedFrame> pQueue;
private int videoCount;
private int audioCount;
```
And these methods:

```java
public void push(CodedFrame frame);
// Be aware of race conditions; I synchronized the sensitive parts of push() on the pQueue object.

public CodedFrame pop() throws InterruptedException;
// Returns a frame if, and only if, there is at least one video AND one audio frame in the queue.
// This is the reason for keeping separate audio and video counters.
// If the condition is not met, it returns null.

public void clear();
// Resets the queue to an empty state.
```
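The pop() gating just described can be sketched as follows. This is a simplified, hypothetical version (single-threaded, with a stand-in Frame type); the real class must also synchronize push() and pop():

```java
import java.util.Comparator;
import java.util.PriorityQueue;

// Simplified sketch of the AVFrameQueue gating: the head frame is released
// only when at least one audio AND one video frame are queued, so a late
// frame of the other type can no longer overtake it once it leaves.
class GatedQueue {
    static class Frame {
        final boolean video;
        final long timestamp;
        Frame(boolean video, long timestamp) { this.video = video; this.timestamp = timestamp; }
    }

    private final PriorityQueue<Frame> pQueue =
            new PriorityQueue<>(Comparator.comparingLong((Frame f) -> f.timestamp));
    private int videoCount;
    private int audioCount;

    void push(Frame f) {
        pQueue.add(f);
        if (f.video) videoCount++; else audioCount++;
    }

    // Returns null until both stream types are represented in the queue.
    Frame pop() {
        if (videoCount == 0 || audioCount == 0) return null;
        Frame f = pQueue.poll();
        if (f.video) videoCount--; else audioCount--;
        return f;
    }
}
```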
- AudioCapture: includes independent threads to grab raw audio frames, encode them to AAC, and push them to the AVFrameQueue. It includes these attributes:

```java
private EncoderThread mEncoderThread;
// An inner class that runs the audio encoding; it places the coded data in a
// CodedFrame and pushes it to the AVFrameQueue.
private CaptureThread mCaptureThread;
// An inner class that grabs raw audio frames and queues them for the audio encoder.
private MediaCodec mEncoder;
private MediaFormat audioFormat;
```
The encoder thread has a drainEncoder() method (this is the method that pushes encoded frames to the AVFrameQueue):
```java
public void drainEncoder() {
    final int TIMEOUT_USEC = 0; // no timeout -- check for buffers, bail if none
    ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
    while (true) {
        int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
        if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // no output available yet
            break;
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            // not expected for an encoder
            encoderOutputBuffers = mEncoder.getOutputBuffers();
        } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // Should happen before receiving buffers, and should only happen once.
            mEncodedFormat = mEncoder.getOutputFormat();
            Log.d(TAG, "encoder output format changed: " + mEncodedFormat);
        } else if (encoderStatus < 0) {
            Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                    encoderStatus);
            // let's ignore it
        } else {
            ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
            if (encodedData == null) {
                throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                        " was null");
            }
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                encodedData.position(mBufferInfo.offset);
                encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
                codecConfig = new byte[mBufferInfo.size];
                encodedData.get(codecConfig);
                mBufferInfo.size = 0;
            }
            if (mBufferInfo.size != 0) {
                // adjust the ByteBuffer values to match BufferInfo (not needed?)
                encodedData.position(mBufferInfo.offset);
                encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
                byte[] outData = new byte[mBufferInfo.size];
                encodedData.get(outData);
                CodedFrame frame = new CodedFrame(CodedFrame.AUDIO);
                frame.timestamp = mBufferInfo.presentationTimeUs;
                frame.extraInfo = codecConfig;
                frame.encData = outData;
                queue.push(frame);
                if (VERBOSE) {
                    Log.d(TAG, "sent " + mBufferInfo.size + " bytes to muxer, ts=" +
                            mBufferInfo.presentationTimeUs);
                }
            }
            mEncoder.releaseOutputBuffer(encoderStatus, false);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                Log.w(TAG, "reached end of stream unexpectedly");
                break; // out of while
            }
        }
    }
}
```
- VideoEncoder: a class with an independent thread to encode video frames. It parallels the AudioCapture class; it also includes an encoder thread with a drainEncoder() method that pushes CodedFrames to the AVFrameQueue.
- Muxer: another independent thread, which consumes frames from the AVFrameQueue and adds them to the RTMP muxer. It includes these attributes:

```java
private RTMPMuxer muxer;
private boolean isStreaming;
private AVFrameQueue queue;
private URI url;
```

and the following run() method:
```java
@Override
public void run() {
    if (muxer.open(url.toString()) <= 0 || muxer.isConnected() <= 0) {
        isStreaming = false;
        cb.onMuxerClose();
        return;
    }
    boolean videoConf = false;
    boolean audioConf = false;
    CodedFrame frame;
    while (isStreaming) {
        try {
            frame = queue.pop();
        } catch (InterruptedException e) {
            isStreaming = false;
            continue;
        }
        if (frame == null) {
            continue;
        }
        if (muxer.isConnected() > 0) {
            if (frame.video) {
                if (!videoConf && frame.extraInfo != null) {
                    videoConf = true;
                    muxer.writeVideo(frame.extraInfo, 0, frame.extraInfo.length,
                            frame.getTimestampMilliseconds());
                } else if (videoConf) {
                    muxer.writeVideo(frame.encData, 0, frame.encData.length,
                            frame.getTimestampMilliseconds());
                }
            } else {
                if (!audioConf && frame.extraInfo != null) {
                    audioConf = true;
                    // The length must match the buffer being written (extraInfo, not encData).
                    muxer.writeAudio(frame.extraInfo, 0, frame.extraInfo.length,
                            frame.getTimestampMilliseconds());
                } else if (audioConf) {
                    muxer.writeAudio(frame.encData, 0, frame.encData.length,
                            frame.getTimestampMilliseconds());
                }
            }
        } else {
            break;
        }
    }
    if (muxer.isConnected() > 0) {
        muxer.close();
    }
    if (isStreaming) {
        isStreaming = false;
        cb.onMuxerClose();
    }
}
```
That's all I have, I hope it helps :) Be warned that I am not an Android developer (nor a Java developer); this was actually my first Android app, so I am pretty sure it can be improved a lot. My guide was the grafika project.
@davidcassany Thank you, I will try it :)
Now I send the video format data before sending video frame data, but I still cannot watch the stream in VLC. I've noticed that in the video data on the server the framerate is always zero. I am not sure what I should do.
Edit: The video recorded at the server is fine, I can watch it, but I can't watch it live.
@insthync from your implementation I would change the timestamp calculations; I would directly use the timestamps provided by the encoder.
You have:

```java
if (startTime == 0)
    startTime = mBufferInfo.presentationTimeUs / 1000;
int timestamp = currentFrame * (1000 / FRAME_RATE);
```

I would try:

```java
int timestamp = (int) (mBufferInfo.presentationTimeUs / 1000 - startTime);
```

where startTime (in milliseconds) is taken right after the encoder starts.
Add some control so the configuration is sent only once: make sure that mRTMPMuxer.writeVideo call runs only once, and pass the real timestamp value rather than zero. Even though it is the first frame, its timestamp doesn't necessarily have to be zero.
```java
if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
    // Pulling codec config data
    encodedData.position(mBufferInfo.offset);
    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
    Log.i(TAG, "sent " + mBufferInfo.size + " bytes to muxer...");
    byte[] bytes = new byte[encodedData.remaining()];
    encodedData.get(bytes);
    int writeResult = mRTMPMuxer.writeVideo(bytes, 0, bytes.length, 0);
    Log.d(TAG, "RTMP_URL write format result: " + writeResult);
    mBufferInfo.size = 0;
}
```
Which server are you using? I used nginx plus its RTMP module, and it played well with either VLC or ffplay. You can use Wireshark to verify that all your RTMP headers are correct; this is how I managed to debug the timestamps, by reviewing the RTMP packet headers (Wireshark has a decoder for RTMP packets). You should see only deltas of the frame time, since RTMP does not use absolute timestamps.
@davidcassany Thank you, I will try it. About the server, I use nginx with its RTMP plugin.
Now I can see the live stream in VLC, but I have to keep the screen active so the frames change. It seems like it might be a problem with MediaCodec: when the frame does not change, dequeueOutputBuffer() returns MediaCodec.INFO_TRY_AGAIN_LATER, and in that case no data is sent to the server. I will change the condition to keep the last frame's data and resend it whenever dequeueOutputBuffer() does not return >= 0.
The framerate value on the server is still 0, but I think it does not matter.
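The keep-last-frame idea above can be sketched as a small, pure helper. `FrameRepeater` is a hypothetical name, and the real code would invoke it around the dequeueOutputBuffer() result:

```java
// Hypothetical sketch of the "repeat last frame" workaround: when the
// encoder produces no new output for a static screen, resend the previous
// coded frame (with a fresh timestamp) so the server keeps receiving data.
class FrameRepeater {
    private byte[] lastFrame;

    // freshFrame is null when dequeueOutputBuffer() returned
    // INFO_TRY_AGAIN_LATER; otherwise it is the newly coded frame.
    byte[] select(byte[] freshFrame) {
        if (freshFrame != null) {
            lastFrame = freshFrame;
            return freshFrame;
        }
        return lastFrame; // may still be null before the first coded frame
    }
}
```

Note that blindly resending a delta (P) frame is only safe if the decoder tolerates duplicates; a more robust variant would request periodic key frames instead.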