shogo4405 / haishinkit.dart

Camera and Microphone streaming library via RTMP for Flutter.

Home Page: https://pub.dev/packages/haishin_kit

License: BSD 3-Clause "New" or "Revised" License

Languages: Kotlin 17.25%, Ruby 2.64%, Swift 21.37%, Objective-C 0.61%, Dart 58.14%
Topics: flutter, rtmp, streaming

haishinkit.dart's Introduction

HaishinKit Plugin

pub package

  • A Flutter plugin for iOS and Android. Camera and microphone streaming library via RTMP.

Supported platforms: Android SDK 21+, iOS 12.0+

🌏 Dependencies

  • HaishinKit for iOS, macOS and tvOS. Camera and microphone streaming library via RTMP and HLS. License: BSD 3-Clause "New" or "Revised" License
  • HaishinKit for Android. Camera and microphone streaming library via RTMP. License: BSD 3-Clause "New" or "Revised" License

🎨 Features

RTMP

  • Authentication
  • Publish and Recording (H264/AAC)
  • Playback (Beta)
  • Adaptive bitrate streaming
    • Automatic drop frames
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS)
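
For instance, the native RTMPS support can be exercised with the same connect call used in the Example section below. A minimal sketch, assuming the RtmpConnection API shown in that example; the host and port are placeholders:

```dart
// Sketch: publishing over native RTMPS (RTMP over SSL/TLS).
// "example.com" is a placeholder, not a real ingest endpoint.
final connection = await RtmpConnection.create();
connection.connect("rtmps://example.com:443/live");
```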

🐾 Example

Here is a small example Flutter app that displays a camera preview.

import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:flutter/material.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/net_stream_drawable_texture.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_source.dart';
import 'package:permission_handler/permission_handler.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatefulWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  CameraPosition currentPosition = CameraPosition.back;

  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  Future<void> initPlatformState() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    // Set up AVAudioSession for iOS.
    final session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions:
      AVAudioSessionCategoryOptions.allowBluetooth,
    ));

    RtmpConnection connection = await RtmpConnection.create();
    connection.eventChannel.receiveBroadcastStream().listen((event) {
      switch (event["data"]["code"]) {
        case 'NetConnection.Connect.Success':
          _stream?.publish("live");
          setState(() {
            _recording = true;
          });
          break;
      }
    });
    RtmpStream stream = await RtmpStream.create(connection);
    stream.attachAudio(AudioSource());
    stream.attachVideo(VideoSource(position: currentPosition));

    if (!mounted) return;

    setState(() {
      _connection = connection;
      _stream = stream;
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(title: const Text('HaishinKit example app'), actions: [
          IconButton(
            icon: const Icon(Icons.flip_camera_android),
            onPressed: () {
              if (currentPosition == CameraPosition.front) {
                currentPosition = CameraPosition.back;
              } else {
                currentPosition = CameraPosition.front;
              }
              _stream?.attachVideo(VideoSource(position: currentPosition));
            },
          )
        ]),
        body: Center(
          child: _stream == null
              ? const Text("")
              : NetStreamDrawableTexture(_stream),
        ),
        floatingActionButton: FloatingActionButton(
          child: _recording
              ? const Icon(Icons.fiber_smart_record)
              : const Icon(Icons.not_started),
          onPressed: () {
            if (_recording) {
              _connection?.close();
              setState(() {
                _recording = false;
              });
            } else {
              _connection?.connect("rtmp://192.168.1.9/live");
            }
          },
        ),
      ),
    );
  }
}

haishinkit.dart's People

Contributors

shogo4405, tian-zhihui, alesk7, simonwang9610, brauliolomeli, kodam-zz

haishinkit.dart's Issues

Unable to stream from Android

Describe the bug

When I call _connection?.connect("rtmp://a.rtmp.youtube.com/live2/$token");

the first event received by the listener is 'NetConnection.Connect.Success', but within a second 'NetConnection.Connect.Closed' arrives.
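
To see why the connection drops, the listener from the README example can be extended to log the Closed event as well. A sketch, assuming the same event payload shape as the example code; the cause named in the comment is a common one, not a confirmed diagnosis:

```dart
connection.eventChannel.receiveBroadcastStream().listen((event) {
  switch (event["data"]["code"]) {
    case 'NetConnection.Connect.Success':
      _stream?.publish("live");
      break;
    case 'NetConnection.Connect.Closed':
      // YouTube may close the socket shortly after a successful
      // handshake if the stream key or encoder settings are rejected.
      debugPrint("RTMP connection closed: $event");
      break;
  }
});
```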

To Reproduce

Call _connection?.connect("rtmp://a.rtmp.youtube.com/live2/$token"); from Android.

Expected behavior

The livestream starts.

Version

haishin_kit: ^0.11.2

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

Page closed, but camera stays idle instead of being closed

Describe the bug

import 'package:flutter_pet_community/tools.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_source.dart';

import 'package:audio_session/audio_session.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/stream_view_texture.dart';

class TestView extends StatefulWidget {
  static const String routeName = "TestView";

  const TestView({super.key});

  @override
  State<TestView> createState() => _TestViewState();
}

class _TestViewState extends State<TestView> {
  String streamURL = "rtmp://192.168.0.102:1935/live";
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  CameraPosition currentPosition = CameraPosition.back;

  RtmpConnection? connection;
  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  Future<void> initPlatformState() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    // Set up AVAudioSession for iOS.
    final session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions: AVAudioSessionCategoryOptions.allowBluetooth,
    ));
    connection = await RtmpConnection.create();

    connection?.eventChannel.receiveBroadcastStream().listen((event) {
      switch (event["data"]["code"]) {
        case 'NetConnection.Connect.Success':
          _stream?.publish("live");
          setState(() {
            _recording = true;
          });
          break;
      }
    });

    RtmpStream stream = await RtmpStream.create(connection!);
    await stream.attachAudio(AudioSource());
    await stream.attachVideo(VideoSource(position: currentPosition));

    if (!mounted) return;

    setState(() {
      _connection = connection;
      _stream = stream;
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('HaishinKit example app'), actions: [
        IconButton(
          icon: const Icon(Icons.flip_camera_android),
          onPressed: () {
            if (currentPosition == CameraPosition.front) {
              currentPosition = CameraPosition.back;
            } else {
              currentPosition = CameraPosition.front;
            }
            _stream?.attachVideo(VideoSource(position: currentPosition));
          },
        )
      ]),
      body: Center(
        child: _stream == null ? const Text("") : StreamViewTexture(_stream),
      ),
      floatingActionButton: FloatingActionButton(
        child: _recording ? const Icon(Icons.fiber_smart_record) : const Icon(Icons.not_started),
        onPressed: () async {
          if (_recording) {
            await Future.wait([
              _stream!.attachAudio(null),
              _stream!.attachVideo(null),
            ]);

            _connection?.close();

            setState(() {
              _recording = false;
            });
          } else {
            _connection?.connect(streamURL);
          }
        },
      ),
    );
  }

  @override
  void dispose() {
    _connection?.close();
    _connection?.dispose();
    _stream?.close();
    _stream?.dispose();
    super.dispose();
  }
}

To Reproduce

After the page is closed and exited, the camera keeps printing log output and remains idle without being closed.

Expected behavior

Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_CLOSED for client com.samsung.adaptivebrightnessgo API Level 2

Version

haishin_kit: ^0.13.0

dart: ">=3.4.3 <4.0.0"
flutter: ">=3.22.0"

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_IDLE for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_CLOSED for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_OPEN for client com.moyou.flutter_pet_community API Level 2
I/CameraManagerGlobal(19085): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_ACTIVE for client com.moyou.flutter_pet_community API Level 2
I/CameraManagerGlobal(19085): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_IDLE for client com.moyou.flutter_pet_community API Level 2
I/CameraManagerGlobal(19085): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_CLOSED for client com.moyou.flutter_pet_community API Level 2
E/CameraCaptureSession(19085): Session 0: Exception while stopping repeating: 
E/CameraCaptureSession(19085): android.hardware.camera2.CameraAccessException: CAMERA_ERROR (3): cancelRequest:688: Camera 0: Error clearing streaming request: Function not implemented (-38)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.CameraManager.throwAsPublicException(CameraManager.java:1731)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.impl.ICameraDeviceUserWrapper.cancelRequest(ICameraDeviceUserWrapper.java:99)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.impl.CameraDeviceImpl.stopRepeating(CameraDeviceImpl.java:1371)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.impl.CameraCaptureSessionImpl.close(CameraCaptureSessionImpl.java:579)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.impl.CameraCaptureSessionImpl$2.onDisconnected(CameraCaptureSessionImpl.java:790)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.impl.CameraDeviceImpl$7.run(CameraDeviceImpl.java:267)
E/CameraCaptureSession(19085): 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
E/CameraCaptureSession(19085): 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
E/CameraCaptureSession(19085): 	at java.lang.Thread.run(Thread.java:1012)
E/CameraCaptureSession(19085): Caused by: android.os.ServiceSpecificException: cancelRequest:688: Camera 0: Error clearing streaming request: Function not implemented (-38) (code 10)
E/CameraCaptureSession(19085): 	at android.os.Parcel.createExceptionOrNull(Parcel.java:3037)
E/CameraCaptureSession(19085): 	at android.os.Parcel.createException(Parcel.java:3007)
E/CameraCaptureSession(19085): 	at android.os.Parcel.readException(Parcel.java:2990)
E/CameraCaptureSession(19085): 	at android.os.Parcel.readException(Parcel.java:2932)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.ICameraDeviceUser$Stub$Proxy.cancelRequest(ICameraDeviceUser.java:666)
E/CameraCaptureSession(19085): 	at android.hardware.camera2.impl.ICameraDeviceUserWrapper.cancelRequest(ICameraDeviceUserWrapper.java:97)
E/CameraCaptureSession(19085): 	... 7 more
I/CameraManagerGlobal(19085): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_OPEN for client com.moyou.flutter_pet_community API Level 2
I/CameraManagerGlobal(19085): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_ACTIVE for client com.moyou.flutter_pet_community API Level 2
W/System  (19085): A resource failed to call Surface.release. 
W/System  (19085): A resource failed to call Surface.release. 
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_OPEN for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_ACTIVE for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_IDLE for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_CLOSED for client com.samsung.adaptivebrightnessgo API Level 2
D/BufferPoolAccessor2.0(19085): evictor expired: 1, evicted: 0
D/BufferPoolAccessor2.0(19085): bufferpool2 0xb400007bd4232c28 : 0(0 size) total buffers - 0(0 size) used buffers - 112/117 (recycle/alloc) - 6/224 (fetch/transfer)
D/BufferPoolAccessor2.0(19085): evictor expired: 1, evicted: 1
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_OPEN for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_ACTIVE for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_IDLE for client com.samsung.adaptivebrightnessgo API Level 2
I/CameraManagerGlobal(19085): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_CLOSED for client com.samsung.adaptivebrightnessgo API Level 2

Typo: method name "setScreenSettings" is misspelled as "setScreenSettigns" in rtmp_stream_method_channel.dart

Describe the bug

Error: Unhandled Exception: MissingPluginException(No implementation found for method RtmpStream#setScreenSettigns on channel com.haishinkit)

This is clearly a typo.

To Reproduce

  1. _stream?.screenSettings = ScreenSettings(width: 1080, height: 1920);
  2. Run your Flutter project on an iPhone
  3. You will see the error in the console:
    [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: MissingPluginException(No implementation found for method RtmpStream#setScreenSettigns on channel com.haishinkit)
    #0 MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:332:7)
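
The fix is presumably a one-line rename in rtmp_stream_method_channel.dart. A sketch, assuming the usual Flutter method-channel pattern; the argument shape is a placeholder, not the plugin's actual payload:

```dart
// rtmp_stream_method_channel.dart (sketch)
Future<void> setScreenSettings(ScreenSettings settings) async {
  // Was: invokeMethod('RtmpStream#setScreenSettigns', ...); the name
  // must match the native handler: 'RtmpStream#setScreenSettings'.
  await methodChannel.invokeMethod('RtmpStream#setScreenSettings', settings);
}
```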

Expected behavior

No error is thrown, and setScreenSettings works as expected.

Version

flutter 3.22.2
Dart 3.4.3
haishin_kit: ^0.13.0

Smartphone info.

iPhone 13
iOS 17.5.1

Additional context

No response

Screenshots


Relevant log output

[ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: MissingPluginException(No implementation found for method RtmpStream#setScreenSettigns on channel com.haishinkit)
#0      MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:332:7)
<asynchronous suspension>
#1      MethodChannelRtmpStream.setScreenSettings (package:haishin_kit/rtmp_stream_method_channel.dart:57:12)
<asynchronous suspension>
[ERROR:flutter/shell/common/shell.cc(1055)] The 'com.haishinkit.eventchannel/12916161104' channel sent a message from native to Flutter on a non-platform thread. Platform channel messages must be sent on the platform thread. Failure to do so may result in data loss or crashes, and must be fixed in the plugin or application code creating that channel.
See https://docs.flutter.dev/platform-integration/platform-channels#channels-and-platform-threading for more information.
flutter: -------------->
flutter: {type: ioError, data: null}
[ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: NoSuchMethodError: The method '[]' was called on null.

Back Camera quality is very poor and blurred on Android.

Describe the bug

I have tested livestreaming with the example code, and it worked on mux.com.

The biggest problem is that the video quality is very poor: the mux playback looks blurred. Can you help me understand what I am missing? What is causing this poor video quality?

import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:flutter/material.dart';
import 'package:haishin_kit/audio_settings.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/capture_settings.dart';
import 'package:haishin_kit/net_stream_drawable_texture.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_settings.dart';
import 'package:haishin_kit/video_source.dart';
import 'package:permission_handler/permission_handler.dart';

void main() {
  runApp(const MyApp());
}

const streamKey = "***********************";

class MyApp extends StatefulWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  String _mode = "publish";
  CameraPosition currentPosition = CameraPosition.front;

  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  @override
  void dispose() {
    _stream?.dispose();
    _connection?.dispose();
    super.dispose();
  }

  Future<void> initPlatformState() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    // Set up AVAudioSession for iOS.
    final session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions:
      AVAudioSessionCategoryOptions.allowBluetooth,
    ));

    RtmpConnection connection = await RtmpConnection.create();
    connection.eventChannel.receiveBroadcastStream().listen((event) {
      switch (event["data"]["code"]) {
        case 'NetConnection.Connect.Success':
          if (_mode == "publish") {
            _stream?.publish(streamKey);
          } else {
            _stream?.play("live");
          }
          setState(() {
            _recording = true;
          });
          break;
      }
    });

    RtmpStream stream = await RtmpStream.create(connection);
    stream.audioSettings = AudioSettings(muted: false, bitrate: 64 * 1000);
    stream.videoSettings = VideoSettings(
      width: 1080,
      height: 1920,
      bitrate: 12 * 1000,
      frameInterval: 2
    );
    stream.attachAudio(AudioSource());
    stream.attachVideo(VideoSource(position: currentPosition));

    if (!mounted) return;

    setState(() {
      _connection = connection;
      _stream = stream;
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(title: const Text('HaishinKit'), actions: [
          // IconButton(
          //   icon: const Icon(Icons.play_arrow),
          //   onPressed: () {
          //     if (_mode == "publish") {
          //       _mode = "playback";
          //       _stream?.attachVideo(null);
          //       _stream?.attachAudio(null);
          //     } else {
          //       _mode = "publish";
          //       _stream?.attachAudio(AudioSource());
          //       _stream?.attachVideo(VideoSource(position: currentPosition));
          //     }
          //   },
          // ),
          IconButton(
            icon: const Icon(Icons.flip_camera_android_sharp),
            onPressed: () {
              currentPosition = currentPosition == CameraPosition.front
                  ? CameraPosition.back
                  : CameraPosition.front;

              _stream?.attachVideo(VideoSource(position: currentPosition));
            },
          )
        ]),
        body: Center(
          child: _stream == null
              ? const CircularProgressIndicator()
              : Stack(
              children: [
                NetStreamDrawableTexture(_stream),
                Positioned(
                  bottom: 20,
                  right: 20,
                  left: 20,
                  child: SizedBox(
                    width: double.infinity,
                    height: 100,
                    child: Row(
                      mainAxisAlignment: MainAxisAlignment.spaceEvenly,
                      children: const [
                        Icon(Icons.shopping_bag, color: Colors.white,),
                        Icon(Icons.thumb_up, color: Colors.white,),
                        Icon(Icons.report, color: Colors.white,),
                      ],
                    ),
                  ),
                )
              ]
          ),
        ),
        floatingActionButton: FloatingActionButton(
          child: _recording
              ? const Icon(Icons.fiber_smart_record)
              : const Icon(Icons.not_started),
          onPressed: () {
            if (_recording) {
              _connection?.close();
              setState(() {
                _recording = false;
              });
            } else {
              _connection?.connect("rtmps://global-live.mux.com:443/app");
            }
          },
        ),
      ),
    );
  }
}
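
One likely cause is visible in the settings above: bitrate: 12 * 1000 is 12 kbps, orders of magnitude too low for 1080p video, so the encoder has to blur heavily. A sketch with more plausible values; the numbers are illustrative assumptions, not official recommendations:

```dart
// Sketch: video/audio settings with bitrates in a realistic range.
// Roughly 2-4 Mbps is a common ballpark for 720p RTMP publishing.
stream.videoSettings = VideoSettings(
  width: 720,
  height: 1280,
  bitrate: 2500 * 1000, // 2.5 Mbps, not 12 kbps
  frameInterval: 2,
);
stream.audioSettings = AudioSettings(muted: false, bitrate: 128 * 1000);
```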

To Reproduce

Run the above code as mentioned in the bug.

Expected behavior

720p (and higher) output should look HD.

Version

0.9.2

Smartphone info.

  • Xiaomi Redmi Note 8 Pro

Additional context

No response

Screenshots

No response

Relevant log output

Launching lib/main.dart on Redmi Note 8 Pro in debug mode...
Running Gradle task 'assembleDebug'...
✓  Built build/app/outputs/flutter-apk/app-debug.apk.
Installing build/app/outputs/flutter-apk/app.apk...
Debug service listening on ws://127.0.0.1:64958/HisNmM4CRNs=/ws
Syncing files to device Redmi Note 8 Pro...
E/ion     ( 1410): ioctl c0044901 failed with code -1: Invalid argument
I/estream_haishi( 1410): ProcessProfilingInfo new_methods=1122 is saved saved_to_disk=1 resolve_classes_delay=8000
D/libMEOW ( 1410): applied 1 plugins for [com.example.livestream_haishin]:
D/libMEOW ( 1410):   plugin 1: [libMEOW_gift.so]:
I/GED     ( 1410): [GT]_get_procNameprocess pid(1410)
I/GED     ( 1410): [GT]_getprocess name(com.example.livestream_haishin)
I/estream_haishi( 1410): [GT] ret(1) gt_status(00000000) aniso_debug_level(0) gt_aniso_max_level(16) ani so mask(00000001) tri mask(00000002)
I/libMEOW_gift( 1410): ctx:0xb4000071be46ff70, ARC not Enabled.
I/BufferQueueConsumer( 1410): [](id:58200000000,api:0,p:-1,c:1410) connect(): controlledByApp=true
I/CameraManagerGlobal( 1410): Connecting to camera service
D/libMEOW ( 1410): applied 1 plugins for [com.example.livestream_haishin]:
D/libMEOW ( 1410):   plugin 1: [libMEOW_gift.so]:
I/GED     ( 1410): [GT]_get_procNameprocess pid(1410)
I/GED     ( 1410): [GT]_getprocess name(com.example.livestream_haishin)
I/estream_haishi( 1410): [GT] ret(1) gt_status(00000000) aniso_debug_level(0) gt_aniso_max_level(16) ani so mask(00000001) tri mask(00000002)
I/libMEOW_gift( 1410): ctx:0xb4000071b765bf68, ARC not Enabled.
W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 20
W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 21
W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 22
W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 61
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 21
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 22
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 61
I/BufferQueueProducer( 1410): [SurfaceTexture-0-1410-0](id:58200000000,api:1,p:1410,c:1410) connect(): api=1 producerControlledByApp=true
E/libc    ( 1410): Access denied finding property "persist.vendor.camera.privapp.list"
E/CameraManagerGlobal( 1410): Camera 61 is not available. Ignore physical camera status change
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 21
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 22
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 61
I/BufferQueueConsumer( 1410): [](id:58200000001,api:0,p:-1,c:1410) connect(): controlledByApp=true
I/BufferQueueConsumer( 1410): [](id:58200000002,api:0,p:-1,c:1410) connect(): controlledByApp=true
I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-1](id:58200000001,api:4,p:825,c:1410) connect(): api=4 producerControlledByApp=true
I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-2](id:58200000002,api:4,p:825,c:1410) connect(): api=4 producerControlledByApp=true
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
V/PhoneWindow( 1410): DecorView setVisiblity: visibility = 4, Parent = android.view.ViewRootImpl@f58e519, this = DecorView@8fd59de[MainActivity]
I/GED     ( 1410): ged_boost_gpu_freq, level 100, eOrigin 2, final_idx 27, oppidx_max 27, oppidx_min 0
E/CameraCaptureSession( 1410): Session 0: Exception while stopping repeating: 
E/CameraCaptureSession( 1410): android.hardware.camera2.CameraAccessException: CAMERA_ERROR (3): The camera device has encountered a serious error
E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.checkIfCameraClosedOrInError(CameraDeviceImpl.java:2305)
E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.stopRepeating(CameraDeviceImpl.java:1263)
E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraCaptureSessionImpl.close(CameraCaptureSessionImpl.java:578)
E/CameraCaptureSession( 1410): 	at com.haishinkit.media.Camera2Source.setSession(Camera2Source.kt:79)
E/CameraCaptureSession( 1410): 	at com.haishinkit.media.Camera2Source.setDevice(Camera2Source.kt:52)
E/CameraCaptureSession( 1410): 	at com.haishinkit.media.Camera2Source.onError(Camera2Source.kt:199)
E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.notifyError(CameraDeviceImpl.java:1703)
E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.lambda$oDs27OTfKFfK18rUW2nQxxkPdV0(Unknown Source:0)
E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.-$$Lambda$CameraDeviceImpl$oDs27OTfKFfK18rUW2nQxxkPdV0.accept(Unknown Source:8)
E/CameraCaptureSession( 1410): 	at com.android.internal.util.function.pooled.PooledLambdaImpl.doInvoke(PooledLambdaImpl.java:278)
E/CameraCaptureSession( 1410): 	at com.android.internal.util.function.pooled.PooledLambdaImpl.invoke(PooledLambdaImpl.java:201)
E/CameraCaptureSession( 1410): 	at com.android.internal.util.function.pooled.OmniFunction.run(OmniFunction.java:97)
E/CameraCaptureSession( 1410): 	at android.os.Handler.handleCallback(Handler.java:938)
E/CameraCaptureSession( 1410): 	at android.os.Handler.dispatchMessage(Handler.java:99)
E/CameraCaptureSession( 1410): 	at android.os.Looper.loop(Looper.java:236)
E/CameraCaptureSession( 1410): 	at android.os.HandlerThread.run(HandlerThread.java:67)
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 21
I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-1](id:58200000001,api:4,p:825,c:1410) disconnect(): api=4
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 22
D/libMEOW ( 1410): applied 1 plugins for [com.example.livestream_haishin]:
D/libMEOW ( 1410):   plugin 1: [libMEOW_gift.so]:
W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 61
I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-2](id:58200000002,api:4,p:825,c:1410) disconnect(): api=4
E/CameraManagerGlobal( 1410): Camera 61 is not available. Ignore physical camera status change

Can't dispose camera when leaving the page

Describe the bug

The camera is not disposed when leaving the page, and the preview shows black.

To Reproduce

`
import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:flutter/material.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/net_stream_drawable_texture.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_source.dart';
import 'package:permission_handler/permission_handler.dart';

class Stream extends StatefulWidget {
const Stream({super.key});

@OverRide
  State createState() => _StreamState();
}

class _StreamState extends State {
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  CameraPosition currentPosition = CameraPosition.front;

  @override
  void initState() {
    initPlatformState();
    super.initState();
  }

  Future<void> initPlatformState() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    // Set up AVAudioSession for iOS.
    final session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions:
          AVAudioSessionCategoryOptions.allowBluetooth,
    ));

    RtmpConnection connection = await RtmpConnection.create();
    connection.eventChannel.receiveBroadcastStream().listen((event) {
      switch (event["data"]["code"]) {
        case 'NetConnection.Connect.Success':
          _stream?.publish("new");
          setState(() {
            _recording = true;
          });
          break;
      }
    });
    RtmpStream stream = await RtmpStream.create(connection);

    stream.attachAudio(AudioSource());
    stream.attachVideo(VideoSource(position: currentPosition));

    if (!mounted) return;

    setState(() {
      _connection = connection;
      _stream = stream;
    });
  }

  @override
  Future<void> dispose() async {
    await _stream!.close();
    await _stream!.dispose();
    _connection!.close();
    _connection!.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('HaishinKit example app'), actions: [
        IconButton(
          icon: const Icon(Icons.flip_camera_android),
          onPressed: () {
            if (currentPosition == CameraPosition.front) {
              currentPosition = CameraPosition.back;
            } else {
              currentPosition = CameraPosition.front;
            }
            _stream?.attachVideo(VideoSource(position: currentPosition));
          },
        )
      ]),
      body: Center(
        child: _stream == null
            ? const Text("No Response")
            : NetStreamDrawableTexture(_stream),
      ),
      floatingActionButton: FloatingActionButton(
        child: _recording
            ? const Icon(Icons.fiber_smart_record)
            : const Icon(Icons.not_started),
        onPressed: () {
          if (_recording) {
            _connection?.close();
            setState(() {
              _recording = false;
            });
          } else {
            _connection?.connect("rtmp://10.0.2.2:1935/app/i");
            // setState(() {
            //   _recording = true;
            // });
          }
        },
      ),
    );
  }
}


Expected behavior

Dispose camera when leaving the page
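
A cleanup sketch along these lines (assuming the haishin_kit 0.9.x API used in this report) may be closer to what the framework expects — `State.dispose()` is declared synchronous, so the plugin calls are started without awaiting them, and detaching the sources first releases the camera and microphone before the stream itself is torn down:

```dart
// Sketch only, assuming the haishin_kit 0.9.x API used in this report.
// attachVideo(null)/attachAudio(null) detach the capture sources, as
// shown elsewhere in this plugin's examples.
@override
void dispose() {
  _stream?.attachVideo(null);
  _stream?.attachAudio(null);
  _stream?.close();
  _stream?.dispose();
  _connection?.close();
  _connection?.dispose();
  super.dispose();
}
```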

Version

haishin_kit: ^0.9.1

Smartphone info.

Android emulator

Additional context

No response

Screenshots

No response

Relevant log output

D/EGL_emulation(18846): eglMakeCurrent: 0xa56f17e0: ver 3 0 (tinfo 0xa76ee670)
(the line above repeats continuously for the remainder of the log)

Video Resolution Does Not Change

Describe the bug

Hello, I am trying to increase the video height to 720 and the width to 1080, but the change is not reflected in the video preview once I run the project.

To Reproduce

//import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:flutter/material.dart';
import 'package:haishin_kit/audio_settings.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/net_stream_drawable_texture.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_settings.dart';
import 'package:haishin_kit/video_source.dart';
import 'package:permission_handler/permission_handler.dart';

class Streaming extends StatefulWidget {
  const Streaming({Key? key}) : super(key: key);

  @override
  State<Streaming> createState() => _StreamingState();
}

class _StreamingState extends State<Streaming> {
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  String _mode = "publish";
  CameraPosition currentPosition = CameraPosition.back;

  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  @override
  void dispose() {
    _stream?.dispose();
    _connection?.dispose();
    super.dispose();
  }

  Future<void> initPlatformState() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    // Set up AVAudioSession for iOS.
    final session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions:
          AVAudioSessionCategoryOptions.allowBluetooth,
    ));

    RtmpConnection connection = await RtmpConnection.create();
    connection.eventChannel.receiveBroadcastStream().listen((event) {
      switch (event["data"]["code"]) {
        case 'NetConnection.Connect.Success':
          if (_mode == "publish") {
            _stream?.publish("live");
          } else {
            _stream?.play("live");
          }
          setState(() {
            _recording = true;
          });
          break;
      }
    });

    RtmpStream stream = await RtmpStream.create(connection);
    stream.audioSettings = AudioSettings(muted: true, bitrate: 64 * 1000);
    stream.videoSettings = VideoSettings(
      width: 720,
      height: 1080,
      bitrate: 160 * 1000,
      muted: true,
    );
    stream.attachAudio(AudioSource());
    stream.attachVideo(VideoSource(position: currentPosition));

    if (!mounted) return;

    setState(() {
      _connection = connection;
      _stream = stream;
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(title: const Text('HaishinKit'), actions: [
          IconButton(
            icon: const Icon(Icons.play_arrow),
            onPressed: () {
              if (_mode == "publish") {
                _mode = "playback";
                _stream?.attachVideo(null);
                _stream?.attachAudio(null);
              } else {
                _mode = "publish";
                _stream?.attachAudio(AudioSource());
                _stream?.attachVideo(VideoSource(position: currentPosition));
              }
            },
          ),
          IconButton(
            icon: const Icon(Icons.flip_camera_android),
            onPressed: () {
              if (currentPosition == CameraPosition.front) {
                currentPosition = CameraPosition.back;
              } else {
                currentPosition = CameraPosition.front;
              }
              _stream?.attachVideo(VideoSource(position: currentPosition));
            },
          )
        ]),
        body: Center(
          child: _stream == null
              ? const Text("")
              : NetStreamDrawableTexture(_stream),
        ),
        floatingActionButton: FloatingActionButton(
          child: _recording
              ? const Icon(Icons.fiber_smart_record)
              : const Icon(Icons.not_started),
          onPressed: () {
            if (_recording) {
              _connection?.close();
              setState(() {
                _recording = false;
              });
            } else {
              _connection?.connect(
                  "rtmp://*******");  // removed by shogo4405
            } 
          },
        ),
      ),
    );
  }
}

Expected behavior

Any changes I make to the height and width should be reflected in the video.
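
One point worth noting when reading the snippet above (an observation, not a confirmed fix): in HaishinKit, VideoSettings configures the H.264 encoder for the published stream, while the on-screen preview follows the camera capture resolution, so a resolution change may show up for viewers without visibly changing the local preview:

```dart
// Sketch: VideoSettings governs the encoded output sent to the server;
// the local preview texture follows the camera source, so changing
// width/height here may not visibly change the preview.
stream.videoSettings = VideoSettings(
  width: 720,
  height: 1080,
  bitrate: 160 * 1000,
);
stream.attachVideo(VideoSource(position: currentPosition));
```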

Version

version : 0.13.0

Smartphone info.

  • Device Android: Samsung Galaxy M31 S

Additional context

No response

Screenshots

No response

Relevant log output

No response

StreamViewTexture not showing camera preview on iOS

Describe the bug

The StreamViewTexture widget is not returning a preview on iOS.
The RtmpStream#registerTexture method in RTMPStreamHandler.swift seems to be incorrect.

To Reproduce

  1. Use StreamViewTexture as per example
  2. attachVideo as per example
  3. See no preview of camera
  4. Start streaming (works)

Expected behavior

  1. StreamViewTexture(_stream) should return the preview from camera on Flutter

Version

Latest from Main

Smartphone info.

  • iOS 18 (iPhone 15 pro max)

Additional context

import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:baseline/resources/constants/colors.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:haishin_kit/video_settings.dart';
import 'package:keep_screen_on/keep_screen_on.dart';
import 'package:nylo_framework/nylo_framework.dart';
import 'package:permission_handler/permission_handler.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/stream_view_texture.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_source.dart';

class HaishinInterfacePage extends NyStatefulWidget {
  static const path = '/stream-interface';

  HaishinInterfacePage({super.key})
      : super(path, child: () => _HaishinInterfacePageState());
}

class _HaishinInterfacePageState extends NyState<HaishinInterfacePage> {
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  CameraPosition currentPosition = CameraPosition.back;

  @override
  void initState() {
    super.initState();
    NyLogger.info("HaishinInterfacePage initState");

    KeepScreenOn.turnOn();

    SystemChrome.setPreferredOrientations([
      DeviceOrientation.landscapeRight,
      DeviceOrientation.landscapeLeft,
    ]);

    SystemChrome.setEnabledSystemUIMode(SystemUiMode.immersiveSticky);

    setupStream();
  }

  void setupStream() async {
    try {
      await Permission.camera.request();
      await Permission.microphone.request();

      final session = await AudioSession.instance;
      await session.configure(const AudioSessionConfiguration(
        avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
        avAudioSessionCategoryOptions:
            AVAudioSessionCategoryOptions.allowBluetooth,
      ));

      RtmpConnection connection = await RtmpConnection.create();
      connection.eventChannel.receiveBroadcastStream().listen((event) {
        NyLogger.info('Event received: $event');
        if (event != null) {
          var data = event["data"];
          if (data != null) {
            NyLogger.info('Event data: $data');
            var code = data["code"];
            if (code != null) {
              NyLogger.info('Event code: $code');
              switch (code) {
                case 'NetConnection.Connect.Success':
                  _stream?.videoSettings.width = 1920;
                  _stream?.videoSettings.height = 1080;
                  _stream?.videoSettings.bitrate = 4000 * 1000; // Bitrate HD
                  _stream?.videoSettings.profileLevel = ProfileLevel.h264High52;
                  _stream?.videoSettings.frameInterval = 1;
                  _stream?.publish('8d63-yapg-0h78-rhxf-e3u5');
                  setState(() {
                    _recording = true;
                  });
                  break;
              }
            } else {
              NyLogger.error('Code is null in event data');
            }
          } else {
            NyLogger.error('Event data is null');
          }
        } else {
          NyLogger.error('Event is null');
        }
      });

      RtmpStream stream = await RtmpStream.create(connection);
      stream.attachAudio(AudioSource());
      stream.attachVideo(VideoSource(position: currentPosition));

      if (!mounted) return;

      setState(() {
        NyLogger.info('Connection created');
        _connection = connection;
        _stream = stream;
      });
    } catch (e) {
      NyLogger.error("Initialization error: $e");
    }
  }

  @override
  void dispose() {
    _stopStream();
    KeepScreenOn.turnOff();
    SystemChrome.setPreferredOrientations([
      DeviceOrientation.portraitUp,
      DeviceOrientation.portraitDown,
    ]);
    SystemChrome.setEnabledSystemUIMode(SystemUiMode.edgeToEdge);
    super.dispose();
  }

  void _startStream() async {
    if (_recording) {
      _connection?.close();
      setState(() {
        _recording = false;
      });
    } else {
      _connection?.connect("rtmp://a.rtmp.youtube.com/live2");
    }
  }

  void _stopStream() async {
    _connection?.close();
  }

  @override
  Widget view(BuildContext context) {
    return Scaffold(
      backgroundColor: BColors.gray900,
      body: Stack(
        children: [
          SizedBox(
            width: double.infinity,
            height: double.infinity,
            child: StreamViewTexture(_stream),
          ),
          Positioned(
            left: 36,
            right: 36,
            top: 24,
            child: Row(
              mainAxisAlignment: MainAxisAlignment.spaceBetween,
              children: [
                Flexible(
                  child: Image.asset(
                    'public/assets/images/baseline-light.png',
                    width: 120,
                  ),
                ),
                ElevatedButton(
                  onPressed: () async {
                    NyLogger.info('Pressed Start Stream');
                    _startStream();
                  },
                  style: ElevatedButton.styleFrom(
                    elevation: 0,
                    padding:
                        const EdgeInsets.symmetric(horizontal: 16, vertical: 8),
                    shape: RoundedRectangleBorder(
                      borderRadius: BorderRadius.circular(4),
                    ),
                  ),
                  child: const Text(
                    "Toggle Stream",
                    style: TextStyle(color: Colors.white, fontSize: 16),
                    textAlign: TextAlign.center,
                  ),
                ),
              ],
            ),
          ),
          Positioned(
            left: 36,
            right: 36,
            bottom: 24,
            child: Row(
              mainAxisAlignment: MainAxisAlignment.spaceBetween,
              children: [
                Flexible(
                  child: Container(
                    padding:
                        const EdgeInsets.symmetric(horizontal: 16, vertical: 8),
                    decoration: BoxDecoration(
                      color: Colors.black.withOpacity(0.5),
                      borderRadius: BorderRadius.circular(4),
                    ),
                    child: const Text(
                      "Score will be displayed here on stream",
                      style: TextStyle(color: Colors.white, fontSize: 16),
                      textAlign: TextAlign.center,
                    ),
                  ),
                ),
                ElevatedButton(
                  onPressed: () {
                    // Add your instructions logic here
                  },
                  style: ElevatedButton.styleFrom(
                    backgroundColor: BColors.gray700,
                    elevation: 0,
                    padding:
                        const EdgeInsets.symmetric(horizontal: 16, vertical: 8),
                    shape: RoundedRectangleBorder(
                      borderRadius: BorderRadius.circular(4),
                    ),
                  ),
                  child: const Text(
                    "Instructions",
                    style: TextStyle(color: Colors.white, fontSize: 16),
                    textAlign: TextAlign.center,
                  ),
                ),
              ],
            ),
          ),
        ],
      ),
    );
  }
}

Screenshots

(screenshots attached)

Relevant log output

No response

crashes

Describe the bug

Using the provided example on Android, tapping stop after starting a live stream always crashes the app.

To Reproduce

 floatingActionButton: FloatingActionButton(
          child: _recording
              ? const Icon(Icons.fiber_smart_record)
              : const Icon(Icons.not_started),
          onPressed: () {
            if (_recording) {
              _connection?.close();
              setState(() {
                _recording = false;
              });
            } else {
              _connection?.connect("rtmp://192.168.1.9/live");
            }
          },
        ),

Expected behavior

~

Version

haishin_kit: ^0.12.0
audio_session: ^0.1.18

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

SIGSEGV on dispose

Describe the bug

Segfault in OplusCCodec when closing or disposing the connection or stream.

To Reproduce

@override
void dispose() {
  super.dispose();
  _connection?.close();
}

Expected behavior

Closing or disposing the resources should not crash the app.

Version

haishin_kit: 0.9.2

Smartphone info.

  • Device: OnePlus 8 pro
  • Os: OxygenOs 12.1 / Android 12

Additional context

Any attempt to close or dispose the resources ends in a segfault.

Screenshots

No response

Relevant log output

D/RtmpStream(16158): current=PUBLISHING, change=CLOSED
D/AudioRecord(16158): stop(2625): mActive:1
D/AudioRecord(16158): mAudioRecord->stop()
D/AudioRecord(16158): AudioRecordThread pause()
D/AudioRecord(16158): stop() end
D/OplusCCodec(16158): initiateShutdown [386]: (0xb4000075e0a15340) keepComponentAllocated=1
D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
D/OplusCCodec(16158): initiateShutdown [386]: (0xb4000075e0a15340) keepComponentAllocated=0
F/libc    (16158): Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x7703fde000 in tid 16596 (odec.AudioCodec), pid 16158 (com.partaga.app)
Process name is com.partaga.app, not key_process
keyProcess: 0
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
Build fingerprint: 'OnePlus/OnePlus8Pro_EEA/OnePlus8Pro:12/RKQ1.211119.001/Q.GDPR.202210170945:user/release-keys'
Revision: '0'
ABI: 'arm64'
Timestamp: 2022-11-22 04:27:17.263872176+0100
Process uptime: 0s
Cmdline: com.partaga.app
pid: 16158, tid: 16596, name: odec.AudioCodec  >>> com.partaga.app <<<
uid: 10607
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x7703fde000
    x0  0000007703fde000  x1  00000076ead176f0  x2  00000000000006f8  x3  0000000000000000
    x4  0000000000000008  x5  0000000000000000  x6  0000007703fde000  x7  0004000400040004
    x8  0004000400040004  x9  1e050955096496b5  x10 0000000000000700  x11 0000000000000000
    x12 0000000000033680  x13 00000076ead170e8  x14 0000000000000002  x15 0000007593376000
    x16 00000076e776bae8  x17 00000076ec686940  x18 0000007576e00000  x19 b4000075e0b69400
    x20 0000000000000800  x21 0000000000000000  x22 0000007703fde000  x23 00000076e76ce260
    x24 0000000000000700  x25 0000007593376000  x26 00000076e7769978  x27 0000007593374e18
    x28 0000007593374e50  x29 0000007593374ca0
    lr  00000076e7703730  sp  0000007593374c70  pc  00000076ec6869f4  pst 0000000000001000
backtrace:
      #00 pc 00000000000759f4  /apex/com.android.runtime/lib64/bionic/libc.so (memcpy_opt+180) (BuildId: bbbdeb7c87c74f1491f92c6e605095b0)
      #01 pc 000000000005d72c  /system/lib64/libaudioclient.so (android::AudioRecord::read(void*, unsigned long, bool)+380) (BuildId: 3cd928556fc187c2febdb332bc052fa9)
      #02 pc 0000000000172864  /system/lib64/libandroid_runtime.so (android_media_AudioRecord_readInDirectBuffer(_JNIEnv*, _jobject*, _jobject*, int, unsigned char)+248) (BuildId: 0a3f50eaf6daea7090ba06720efb1840)
      #03 pc 000000000033b1c0  /data/misc/apexdata/com.android.art/dalvik-cache/arm64/boot.oat (art_jni_trampoline+128)

Camera quality

Describe the bug

The camera preview is blurry.
(screenshot attached: Screenshot_20240524_100423)

To Reproduce

How can I modify the settings to get a clear resolution?

Expected behavior

Current video settings:

RtmpStream stream = await RtmpStream.create(connection);
stream.audioSettings = AudioSettings(bitrate: 64 * 1000);
stream.videoSettings = VideoSettings(
  width: 1920,
  height: 1080,
  bitrate: 2700 * 1000,
  profileLevel: ProfileLevel.h264HighAutoLevel,
);
stream.attachAudio(AudioSource());
stream.attachVideo(VideoSource(position: currentPosition));

Version

Flutter 3.19.0

Smartphone info.

Android Huawei

Additional context

No response

Screenshots

No response

Relevant log output

No response

On Android, app is crashing after receiving IllegalArgumentException: type=13 or 14

Describe the bug

When live streaming for the first time after the app is opened, I sometimes get an IllegalArgumentException: type=13 on the Java side. When this happens, live streaming does not work, although the preview works fine. When I then try to dispose the stream and the connection, a type=14 exception occurs. From then on, the next time I try to dispose the stream or connection, whether or not the live stream was working, the app crashes.

To Reproduce

  1. Start a live stream
  2. Wait for IllegalArgumentException to occur
  3. End the live stream

Expected behavior

To not crash the app or handle the IllegalArgumentExceptions gracefully.
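
A native IllegalArgumentException that crashes the Java side cannot be caught from Dart, but errors that do surface across the method channel can be guarded. A defensive cleanup sketch, assuming such failures arrive as PlatformException (an assumption about this plugin's channel behavior; `safeTeardown` is a hypothetical helper name):

```dart
// Sketch: guard plugin teardown against channel-level errors.
// PlatformException is assumed here; a crash inside the Android
// process itself cannot be intercepted from Dart.
// (requires package:flutter/services.dart and package:flutter/foundation.dart)
Future<void> safeTeardown() async {
  try {
    await _stream?.close();
    await _stream?.dispose();
  } on PlatformException catch (e) {
    debugPrint('stream teardown failed: $e');
  }
  try {
    _connection?.close();
    _connection?.dispose();
  } on PlatformException catch (e) {
    debugPrint('connection teardown failed: $e');
  }
}
```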

Version

I'm currently using version 0.11.1 of the Dart package, because versions 0.11.2 and 0.11.3 behaved even worse.

Smartphone info.

  • Device: S20 Ultra
  • OS: Android 12
  • Server: Antmedia

Additional context

No response

Screenshots

No response

Relevant log output

I/CameraManagerGlobal( 5885): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_ACTIVE for client com.example.live API Level 2
I/BufferQueueProducer( 5885): [SurfaceTexture-1-5885-11](id:16fd00000013,api:4,p:1398,c:5885) queueBuffer: queued for the first time.
I/BufferQueueProducer( 5885): [SurfaceTexture-1-5885-12](id:16fd00000014,api:4,p:1398,c:5885) queueBuffer: queued for the first time.
I/CameraManagerGlobal( 5885): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_IDLE for client com.example.live API Level 2
I/CameraManagerGlobal( 5885): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_CLOSED for client com.example.live API Level 2
I/CameraManagerGlobal( 5885): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_OPEN for client com.example.live API Level 2
I/flutter ( 5885): hey hey {data: {mode: 1.0, code: NetConnection.Connect.Success, capabilities: 33.0, fmsVer: RED5/1,0,9,0, data: [], level: status, description: Connection succeeded.}, type: rtmpStatus}
I/flutter ( 5885): Event codeSSSS NetConnection.Connect.Success
I/CameraManagerGlobal( 5885): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_ACTIVE for client com.example.live API Level 2
D/RtmpStream( 5885): current=INITIALIZED, change=OPEN
I/BufferQueueProducer( 5885): [SurfaceTexture-1-5885-12](id:16fd00000014,api:4,p:1398,c:5885) queueBuffer: queued for the first time.
I/BufferQueueProducer( 5885): [SurfaceTexture-1-5885-11](id:16fd00000013,api:4,p:1398,c:5885) queueBuffer: queued for the first time.
D/InsetsSourceConsumer( 5885): ensureControlAlpha: for ITYPE_STATUS_BAR on com.example.live/com.example.live.MainActivity
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 0
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 1
V/Toast   ( 5885): show: caller = io.github.ponnamkarthik.toast.fluttertoast.MethodCallHandlerImpl.onMethodCall:101
V/Toast   ( 5885): show: focusDisplayId = 0, isFocusInDesktop = false mCustomDisplayId=-1 isDexDualMode=false
V/Toast   ( 5885): show: isActivityContext = false
D/NativeCustomFrequencyManager( 5885): [NativeCFMS] BpCustomFrequencyManager::BpCustomFrequencyManager()
I/ViewRootImpl@435ecfd[Toast]( 5885): setView = android.widget.FrameLayout@2b782c0 TM=true
I/ViewRootImpl@435ecfd[Toast]( 5885): Relayout returned: old=(0,74,1080,2274) new=(405,2065,674,2174) req=(269,109)0 dur=6 res=0x7 s={true -5476376633730209568} ch=true fn=-1
D/OpenGLRenderer( 5885): eglCreateWindowSurface
I/ViewRootImpl@435ecfd[Toast]( 5885): [DP] dp(1) 0 android.view.ViewRootImpl.reportNextDraw:11420 android.view.ViewRootImpl.performTraversals:4193 android.view.ViewRootImpl.doTraversal:2919
D/ViewRootImpl@435ecfd[Toast]( 5885): Creating frameDrawingCallback nextDrawUseBlastSync=false reportNextDraw=true hasBlurUpdates=false
D/ViewRootImpl@435ecfd[Toast]( 5885): Creating frameCompleteCallback
D/ViewRootImpl@435ecfd[Toast]( 5885): Received frameDrawingCallback frameNum=1. Creating transactionCompleteCallback=false
I/BufferQueueProducer( 5885): [ViewRootImpl@435ecfd[Toast]#4(BLAST Consumer)4](id:16fd00000015,api:1,p:5885,c:5885) queueBuffer: queued for the first time.
D/OpenGLRenderer( 5885): GPIS:: SetUp Pid : 5885    Tid : 5942
D/ViewRootImpl@435ecfd[Toast]( 5885): Received frameCompleteCallback  lastAcquiredFrameNum=1 lastAttemptedDrawFrameNum=1
I/ViewRootImpl@435ecfd[Toast]( 5885): [DP] pdf(0) 0 android.view.ViewRootImpl.lambda$addFrameCompleteCallbackIfNeeded$3$ViewRootImpl:4995 android.view.ViewRootImpl$$ExternalSyntheticLambda16.run:6 android.os.Handler.handleCallback:938
I/ViewRootImpl@435ecfd[Toast]( 5885): [DP] rdf()
D/ViewRootImpl@435ecfd[Toast]( 5885): reportDrawFinished (fn: -1)
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 0
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 1
I/CameraManagerGlobal( 5885): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_IDLE for client com.example.live API Level 2
I/CameraManagerGlobal( 5885): Camera 0 facing CAMERA_FACING_BACK state now CAMERA_STATE_CLOSED for client com.example.live API Level 2
I/CameraManagerGlobal( 5885): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_OPEN for client com.example.live API Level 2
I/CameraManagerGlobal( 5885): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_ACTIVE for client com.example.live API Level 2
D/OpenGLRenderer( 5885): setSurface called with nullptr
D/OpenGLRenderer( 5885): setSurface() destroyed EGLSurface
D/OpenGLRenderer( 5885): destroyEglSurface
I/ViewRootImpl@435ecfd[Toast]( 5885): dispatchDetachedFromWindow
D/InputTransport( 5885): Input channel destroyed: 'a5f2f74', fd=451
I/BufferQueueProducer( 5885): [SurfaceTexture-2-5885-13](id:16fd00000016,api:4,p:1398,c:5885) queueBuffer: queued for the first time.
I/BufferQueueProducer( 5885): [SurfaceTexture-2-5885-14](id:16fd00000017,api:4,p:1398,c:5885) queueBuffer: queued for the first time.
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 0
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 1
I/flutter ( 5885): publishing with j7pCoHfImSmA1695029303492
D/RtmpStream( 5885): current=OPEN, change=PUBLISH
I/flutter ( 5885): started
D/RtmpStream( 5885): current=PUBLISH, change=PUBLISHING
D/Camera2Source( 5885): startRunning: android.hardware.camera2.impl.CameraDeviceImpl@16ae9d8
D/MediaCodecList( 5885): codecHandlesFormat: no format, so no extra checks
D/MediaCodecList( 5885): codecHandlesFormat: no format, so no extra checks
D/MediaCodecList( 5885): codecHandlesFormat: no format, so no extra checks
I/ACodec  ( 5885):  [] Now uninitialized
I/ACodec  ( 5885): [] onAllocateComponent
I/OMXClient( 5885): IOmx service obtained
I/ACodec  ( 5885): [OMX.qcom.video.encoder.avc] Now Loaded
I/MediaCodec( 5885): MediaCodec will operate in async mode
D/VQApply ( 5885): raise bitrate: configured 800000 to floor 1367847
D/VQApply ( 5885): minquality: requested QP unsupported, boost bitrate 1367847 by 273569
D/VQApply ( 5885): minquality/target bitrate raised from 800000 to 1641416 bps
D/MediaCodec( 5885): shapeMediaFormat: deltas(2): AMessage(what = 0x00000000) = {
D/MediaCodec( 5885):     int32_t bitrate = 1641416
D/MediaCodec( 5885):     int32_t android._encoding-quality-level = 0
D/MediaCodec( 5885):   }
I/ACodec  ( 5885): app-pid(5885)
W/OMXUtils( 5885): do not know color format 0x7fa30c06 = 2141391878
W/OMXUtils( 5885): do not know color format 0x7fa30c04 = 2141391876
W/OMXUtils( 5885): do not know color format 0x7fa30c00 = 2141391872
W/OMXUtils( 5885): do not know color format 0x7fa30c09 = 2141391881
W/OMXUtils( 5885): do not know color format 0x7fa30c0a = 2141391882
W/OMXUtils( 5885): do not know color format 0x7fa30c08 = 2141391880
W/OMXUtils( 5885): do not know color format 0x7fa30c07 = 2141391879
W/OMXUtils( 5885): do not know color format 0x7f000789 = 2130708361
I/ACodec  ( 5885): app-name : com.example.live
I/ACodec  ( 5885): setupAVCEncoderParameters with [profile: Baseline] [level: Level31]
I/ACodec  ( 5885): Enable Perceptual Video Coding
I/ACodec  ( 5885): Success set IPB VideoMinQP(2/2/2) VideoMaxQP(50/50/50)
I/ACodec  ( 5885): reconfigEncoder4OtherApps
I/ACodec  ( 5885): [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
I/ACodec  ( 5885): setupVideoEncoder succeeded
I/ACodec  ( 5885): [OMX.qcom.video.encoder.avc] configure, AMessage : AMessage(what = 'conf', target = 14) = {
I/ACodec  ( 5885):   int32_t capture-rate = 30
I/ACodec  ( 5885):   int32_t color-format = 2130708361
I/ACodec  ( 5885):   int32_t i-frame-interval = 2
I/ACodec  ( 5885):   int32_t level = 512
I/ACodec  ( 5885):   string mime = "video/avc"
I/ACodec  ( 5885):   int32_t profile = 1
I/ACodec  ( 5885):   int32_t width = 702
I/ACodec  ( 5885):   int32_t channel-count = 1
I/ACodec  ( 5885):   int32_t bitrate = 1641416
I/ACodec  ( 5885):   int32_t frame-rate = 30
I/ACodec  ( 5885):   int32_t height = 866
I/ACodec  ( 5885):   int32_t repeat-previous-frame-after = 33333
I/ACodec  ( 5885):   int32_t android._encoding-quality-level = 0
I/ACodec  ( 5885):   int32_t flags = 1
I/ACodec  ( 5885):   int32_t encoder = 1
I/ACodec  ( 5885): }
W/OMXUtils( 5885): do not know color format 0x7f000789 = 2130708361
D/MediaCodec( 5885): keep callback message for reclaim
I/ACodec  ( 5885): [OMX.qcom.video.encoder.avc] Now Loaded->Idle
I/ACodec  ( 5885): [OMX.qcom.video.encoder.avc] Now Idle->Executing
I/MediaCodec( 5885): setCodecState state(0), called in 6
I/ACodec  ( 5885): [OMX.qcom.video.encoder.avc] Now Executing
D/ACodec  ( 5885): dataspace changed to 0x10c10000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:2(BT601_625), T:3(SMPTE_170M))
I/MediaCodec( 5885): {max-bitrate=1641416, csd-1=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8], color-transfer=3, mime=video/avc, width=702, bitrate=1641416, color-range=2, frame-rate=30, color-standard=2, height=866, csd-0=java.nio.HeapByteBuffer[pos=0 lim=22 cap=22]}
I/MediaCodec( 5885): setCodecState state(1), called in 6
D/OpenGLRenderer( 5885): setSurface called with nullptr
I/flutter ( 5885): socket stream:value:updated [1002, {started_at: 20231008160803, status: 1}]
W/System  ( 5885): A resource failed to call release. 
W/System  ( 5885): A resource failed to call release.
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 0
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 1
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 0
I/ViewRootImpl@d961a53[MainActivity]( 5885): ViewPostIme pointer 1
I/DecorView( 5885): notifyKeepScreenOnChanged: keepScreenOn=false
I/flutter ( 5885): disposing 1
I/flutter ( 5885): disposing 2
I/flutter ( 5885): disposing 7
I/flutter ( 5885): socket disconnect
I/flutter ( 5885): socket disconnect forced close
I/CameraManagerGlobal( 5885): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_IDLE for client com.example.live API Level 2
I/CameraManagerGlobal( 5885): Camera 1 facing CAMERA_FACING_FRONT state now CAMERA_STATE_CLOSED for client com.example.live API Level 2
E/com.gcomm.ucct( 5885): [SurfaceTexture-0-5885-10] checkAndUpdateEglState: invalid current EGLDisplay
E/flutter ( 5885): [ERROR:flutter/fml/platform/android/jni_util.cc(204)] java.lang.IllegalStateException: Unable to update texture contents (see logcat for details)
E/flutter ( 5885): 	at android.graphics.SurfaceTexture.nativeUpdateTexImage(Native Method)
E/flutter ( 5885): 	at android.graphics.SurfaceTexture.updateTexImage(SurfaceTexture.java:249)
E/flutter ( 5885): 	at io.flutter.embedding.engine.renderer.SurfaceTextureWrapper.updateTexImage(SurfaceTextureWrapper.java:55)
E/flutter ( 5885): 	at android.os.MessageQueue.nativePollOnce(Native Method)
E/flutter ( 5885): 	at android.os.MessageQueue.next(MessageQueue.java:335)
E/flutter ( 5885): 	at android.os.Looper.loopOnce(Looper.java:186)
E/flutter ( 5885): 	at android.os.Looper.loop(Looper.java:313)
E/flutter ( 5885): 	at android.app.ActivityThread.main(ActivityThread.java:8669)
E/flutter ( 5885): 	at java.lang.reflect.Method.invoke(Native Method)
E/flutter ( 5885): 	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:571)
E/flutter ( 5885): 	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1135)
E/flutter ( 5885):
F/flutter ( 5885): [FATAL:flutter/shell/platform/android/platform_view_android_jni_impl.cc(1327)] Check failed: fml::jni::CheckException(env).
F/libc    ( 5885): Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 5885 (com.example.live), pid 5885 (com.example.live)
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
Build fingerprint: 'samsung/z3qsqw/z3q:12/SP1A.210812.016/G988USQU3FVH1:user/release-keys'
Revision: '15'
ABI: 'arm64'
Processor: '4'
Timestamp: 2023-10-08 16:09:07.193291567+0800
Process uptime: 263s
Cmdline: com.example.live
pid: 5885, tid: 5885, name: com.example.live  >>> com.example.live <<<
uid: 10432
signal 6 (SIGABRT), code -1 (SI_QUEUE), fault addr --------
Abort message: '[FATAL:flutter/shell/platform/android/platform_view_android_jni_impl.cc(1327)] Check failed: fml::jni::CheckException(env).
'
    x0  0000000000000000  x1  00000000000016fd  x2  0000000000000006  x3  0000007fd001e590
    x4  000000790e762000  x5  000000790e762000  x6  000000790e762000  x7  000000000b5df312
    x8  00000000000000f0  x9  69a24e70ae026170  x10 0000000000000000  x11 ffffff80fffffbdf
    x12 0000000000000001  x13 000000000000007e  x14 0000000000000000  x15 00003fc309ea4233
    x16 000000790a13a060  x17 000000790a1169d0  x18 000000790ddec000  x19 00000000000016fd
    x20 00000000000016fd  x21 00000000ffffffff  x22 b40000764a27abf8  x23 0000000000000000
    x24 0000007fd001f120  x25 0000007fd001efd8  x26 0000007fd001ee98  x27 0000000000000001
    x28 000000000000005f  x29 0000007fd001e610
    lr  000000790a0c6b50  sp  0000007fd001e570  pc  000000790a0c6b7c  pst 0000000000001000
backtrace:
      #00 pc 0000000000051b7c  /apex/com.android.runtime/lib64/bionic/libc.so (abort+168) (BuildId: 0f39fc790debeef5bc191987ab30a3f5)
      #01 pc 000000000162514c  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #02 pc 000000000164bc44  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #03 pc 0000000001631dc8  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #04 pc 00000000016265cc  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #05 pc 000000000162633c  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #06 pc 00000000019719a0  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #07 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #08 pc 0000000001971c38  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #09 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #10 pc 0000000001971c38  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #11 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #12 pc 0000000001971c38  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #13 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #14 pc 0000000001971c38  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #15 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #16 pc 0000000001971c38  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #17 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #18 pc 0000000001971c38  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #19 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #20 pc 0000000001971c38  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #21 pc 000000000196ab68  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #22 pc 0000000001970098  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #23 pc 0000000001a0cae4  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #24 pc 000000000164c7f0  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #25 pc 0000000001651d58  /data/app/~~ESGSXOBkRCz_ERt9TRoFGQ==/com.example.live-mLOPznfP-ixFg5O18IwgoQ==/lib/arm64/libflutter.so (BuildId: a0d4c78b9cfa53527751a2a655d07347efcef020)
      #26 pc 0000000000018184  /system/lib64/libutils.so (android::Looper::pollInner(int)+916) (BuildId: 748948a5650ad93d18b12eb1d9a51a89)
      #27 pc 0000000000017d84  /system/lib64/libutils.so (android::Looper::pollOnce(int, int*, int*, void**)+116) (BuildId: 748948a5650ad93d18b12eb1d9a51a89)
      #28 pc 0000000000159170  /system/lib64/libandroid_runtime.so (android::android_os_MessageQueue_nativePollOnce(_JNIEnv*, _jobject*, long, int)+48) (BuildId: 90cd534e06cca8c1ea7217ddc4584767)
      #29 pc 00000000003fe504  /data/misc/apexdata/com.android.art/dalvik-cache/arm64/boot.oat (art_jni_trampoline+116)
Lost connection to device.

Exited.

Front camera reversed

Describe the bug

The front camera preview does not appear to be mirrored, which is why it looks reversed. Can this be fixed?

To Reproduce

Launch stream with front camera

Expected behavior

The front camera preview should be mirrored horizontally.
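Until the plugin mirrors the preview itself, a client-side workaround is to flip the preview widget horizontally whenever the front camera is active. This is a sketch only: `buildPreview` is a hypothetical helper, and the `preview` argument would be the `NetStreamDrawableTexture` widget from the example app. Note that it flips the local preview only, not the published stream.

```dart
import 'package:flutter/material.dart';

/// Hypothetical helper: mirror the local preview for the front camera.
/// The published RTMP stream itself is left untouched.
Widget buildPreview(Widget preview, {required bool isFrontCamera}) {
  if (!isFrontCamera) return preview;
  return Transform(
    alignment: Alignment.center,
    transform: Matrix4.diagonal3Values(-1.0, 1.0, 1.0), // flip horizontally
    child: preview,
  );
}
```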

Version

0.13.0

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

Build failed with an exception.

Describe the bug

Execution failed for task ':app:checkDebugAarMetadata'.

Could not resolve all files for configuration ':app:debugRuntimeClasspath'.
Could not resolve com.github.shogo4405.HaishinKit~kt:haishinkit:0.10.2.
Required by:
project :app > project :haishin_kit
> Could not resolve com.github.shogo4405.HaishinKit~kt:haishinkit:0.10.2.
> Could not get resource 'https://jitpack.io/com/github/shogo4405/HaishinKit~kt/haishinkit/0.10.2/haishinkit-0.10.2.pom'.
> Could not GET 'https://jitpack.io/com/github/shogo4405/HaishinKit~kt/haishinkit/0.10.2/haishinkit-0.10.2.pom'. Received status code 521 from server:

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

To Reproduce

  1. flutter run

Expected behavior

  1. flutter run completes and the app builds successfully

Version

0.9.2

Smartphone info.

  • Android

Additional context

No response

Screenshots

No response

Relevant log output

No response

Why does the mobile app freeze for 5-10 seconds after disposing streams?

Describe the bug

Starting the stream works fine, but after I stop the stream and dispose it, the app is unresponsive for 5-10 seconds.

To Reproduce

Launch the example, then dispose it (stopping the RTMP connection and stream).

Expected behavior

Teardown should not block the UI (e.g. close it on another thread).
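A teardown ordering that reportedly helps (detaching the capture sources before closing the connection, as noted in another issue in this tracker) can be sketched as follows. The ordering is an assumption based on those reports, not documented behavior:

```dart
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';

Future<void> stopStreaming(RtmpStream stream, RtmpConnection connection) async {
  // Detach capture sources first; closing the connection while sources
  // are still attached has been reported to stall or crash the app.
  await stream.attachAudio(null);
  await stream.attachVideo(null);
  connection.close();
}
```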

Version

0.13.0

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

Exception is thrown when wifi is turned off (Android)

Describe the bug

There is no callback or error event if the user turns off the Wi-Fi/cellular network.

To Reproduce

  1. Use code from "Example" section of https://github.com/shogo4405/HaishinKit.dart
  2. Replace urls with your server
  3. Start streaming
  4. Turn off wifi on device
  5. You'll see java.net.SocketException in logcat

Expected behavior

Some callback with error code through RtmpConnection.connection (just like on native iOS)
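Until the plugin surfaces such a callback, one workaround is to watch connectivity on the Dart side and treat a network drop as a disconnect. A sketch assuming the `connectivity_plus` package (the exact stream type of `onConnectivityChanged` varies between versions of that package):

```dart
import 'package:connectivity_plus/connectivity_plus.dart';

/// Hypothetical workaround: report a lost network so the app can close
/// the RtmpConnection itself instead of waiting on a SocketException.
void watchNetwork(void Function() onLost) {
  Connectivity().onConnectivityChanged.listen((result) {
    if (result == ConnectivityResult.none) {
      onLost(); // e.g. close the connection and update the UI
    }
  });
}
```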

Version

haishin_kit: ^0.9.4
minSdkVersion 21

Smartphone info.

  • Device: Redmi Note 7
  • OS: Android 10

Additional context

No response

Screenshots

No response

Relevant log output

java.net.SocketException: Broken pipe
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:117)
at java.net.SocketOutputStream.write(SocketOutputStream.java:161)
at com.haishinkit.net.NetSocketImpl.doOutput(NetSocketImpl.kt:132)
at com.haishinkit.net.NetSocketImpl.access$doOutput(NetSocketImpl.kt:20)
at com.haishinkit.net.NetSocketImpl$doConnection$1.invokeSuspend(NetSocketImpl.kt:156)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.internal.LimitedDispatcher.run(LimitedDispatcher.kt:42)
at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:95)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)

java.net.SocketException: Software caused connection abort
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:119)
at java.net.SocketInputStream.read(SocketInputStream.java:176)
at java.net.SocketInputStream.read(SocketInputStream.java:144)
at com.haishinkit.net.NetSocketImpl.doInput(NetSocketImpl.kt:101)
at com.haishinkit.net.NetSocketImpl.doConnection(NetSocketImpl.kt:160)
at com.haishinkit.net.NetSocketImpl.access$doConnection(NetSocketImpl.kt:20)
at com.haishinkit.net.NetSocketImpl$connect$1.invokeSuspend(NetSocketImpl.kt:59)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.internal.LimitedDispatcher.run(LimitedDispatcher.kt:42)
at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:95)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)

Missing release of bugfix

Describe the bug

PR #7 fixed a critical bug, but the fix has not yet been released on pub.dev.

To Reproduce

  1. Download the latest version from pub.dev
  2. Set video settings
  3. Start streaming

Expected behavior

Streaming uses the video settings provided.

Version

0.9.1

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

Cannot receive RtmpStream events

Describe the bug

When creating a RtmpConnection, we could listen to RtmpConnection.eventChannel.receiveBroadcastStream() for receiving connection status.

However, RtmpStream.eventChannel.receiveBroadcastStream() will not report any event from android side.

I suspect that the Android side does not add event listeners in RtmpStreamHandler.init at https://github.com/shogo4405/HaishinKit.dart/blob/main/android/src/main/kotlin/com/haishinkit/haishin_kit/RtmpStreamHandler.kt: both RtmpConnectionHandler and RtmpStreamHandler on iOS add event listeners, and RtmpConnectionHandler on Android also adds them in its init, but RtmpStreamHandler on Android does not.
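If that diagnosis is correct, the fix on the Android side would be to register a listener when the stream is created, the way the connection handler does. The following is a rough sketch only: the listener interface and event names follow HaishinKit for Android's event API as I understand it, and are assumptions rather than the plugin's actual code.

```kotlin
import com.haishinkit.event.Event
import com.haishinkit.event.EventUtils
import com.haishinkit.event.IEventListener
import com.haishinkit.rtmp.RtmpStream

// Hypothetical sketch: forward stream-level RTMP status events to Flutter,
// mirroring what RtmpConnectionHandler already does for the connection.
class StreamEventForwarder(private val emit: (Map<String, Any?>) -> Unit) : IEventListener {
    fun attach(stream: RtmpStream) {
        stream.addEventListener(Event.RTMP_STATUS, this)
    }

    override fun handleEvent(event: Event) {
        // The emit callback must deliver on the main thread before the
        // map reaches the EventChannel sink.
        emit(mapOf("type" to event.type, "data" to EventUtils.toMap(event)))
    }
}
```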

To Reproduce

With the code below, the _onStreamEvent callback never receives any data on Android.

import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:haishin_kit/audio_settings.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/av_capture_session_preset.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_settings.dart';
import 'package:haishin_kit/video_source.dart';
import 'package:streaming_device_management/models/models.dart';

class CameraLiveController {
  final String rtmpServer;
  final String streamKey;
  final StreamController<CameraLiveState> _stateController =
      StreamController<CameraLiveState>.broadcast();
  final StreamController<RtmpEvent> _eventController =
      StreamController<RtmpEvent>.broadcast();

  CameraLiveController(this.rtmpServer, this.streamKey) {
    _stateController.onListen = () {
      _stateController.add(_state);
    };
  }

  Future<void> publish() async {
    if (!_audiSessionConfigured) {
      _audiSessionConfigured = await _configureAudioSession();
    }

    _connection ??= await _createConnection();
    _stream ??= await _createStream(_connection!);

    /// once the connection is established, the stream will be published automatically
    _connection!.connect(rtmpServer);
  }

  ///! it will crash entire app if not detached audio and video before closing connection
  Future<void> stop() async {
    _state = CameraLiveState(
      cameraPosition: _state.cameraPosition,
      audioSettings: _state.audioSettings,
      videoSettings: _state.videoSettings,
      sessionPreset: _state.sessionPreset,
    );
    _stateController.add(_state);

    await _stream?.attachAudio(null);
    await _stream?.attachVideo(null);

    _connection?.close();
  }

  Future<void> switchCamera() async {
    final position = _state.cameraPosition == CameraPosition.back
        ? CameraPosition.front
        : CameraPosition.back;

    _notifyStateChange(cameraPosition: position);
    _stream?.attachVideo(VideoSource(position: position));
  }

  set videoSettings(VideoSettings settings) {
    if (_state.videoSettings == settings || _stream == null) return;
    _notifyStateChange(videoSettings: settings);
    _stream?.videoSettings = settings;
  }

  set sessionPreset(AVCaptureSessionPreset preset) {
    if (_state.sessionPreset == preset || _stream == null) return;
    _notifyStateChange(sessionPreset: preset);
    _stream?.sessionPreset = preset;
  }

  set frameRate(int value) {
    if (_state.frameRate == value || _stream == null) return;

    _notifyStateChange(frameRate: value);
    _stream?.frameRate = value;
  }

  /// [RtmpConnection] will close and dispose all [RtmpStream] associated with it,
  /// so no need to close/dispose [RtmpStream] manually
  void dispose() async {
    await _stateController.close();
    await _eventController.close();

    // await _stream?.dispose();
    _connection?.dispose();
    _streamEventSub?.cancel();

    _connectionEventSub?.cancel();

    _stream = null;
    _connection = null;
  }

  RtmpConnection? _connection;
  RtmpStream? _stream;

  CameraLiveState _state = CameraLiveState.defaultState();
  Stream<CameraLiveState> get state => _stateController.stream;
  Stream<RtmpEvent> get event => _eventController.stream;

  StreamSubscription? _connectionEventSub;
  StreamSubscription? _streamEventSub;

  Future<RtmpConnection> _createConnection() async {
    final connection = await RtmpConnection.create();

    _connectionEventSub?.cancel();
    _connectionEventSub = connection.eventChannel
        .receiveBroadcastStream()
        .listen(_onConnectionEvent);
    return connection;
  }

  Future<RtmpStream> _createStream(RtmpConnection connection) async {
    final stream = await RtmpStream.create(connection);

    _streamEventSub?.cancel();
    _streamEventSub =
        stream.eventChannel.receiveBroadcastStream().listen(_onStreamEvent);

    stream.audioSettings = _state.audioSettings;
    stream.videoSettings = _state.videoSettings;

    return stream;
  }

  void _onConnectionEvent(dynamic data) async {
    final map = <String, dynamic>{};

    for (final entry in (data["data"] as Map).entries) {
      final key = entry.key as String?;
      if (key != null) {
        map[key] = entry.value;
      }
    }

    final event = RtmpConnectionEvent.fromMap(map);

    _eventController.add(event);

    if (event.code == RtmpConnectionCode.connectSuccess) {
      _notifyStateChange(connected: true);

      await _stream?.attachAudio(AudioSource());
      await _stream?.attachVideo(VideoSource(position: _state.cameraPosition));
      await _stream?.publish(streamKey);

      _notifyStateChange(streaming: true, localStream: _stream);
    }
  }

  ///! no RtmpStreamEvent sent from [RtmpStream.eventChannel]
  void _onStreamEvent(dynamic data) {
    final map = <String, dynamic>{};

    for (final entry in (data["data"] as Map).entries) {
      print(entry);

      final key = entry.key as String?;
      if (key != null) {
        map[key] = entry.value;
      }
    }

    final event = RtmpStreamEvent.fromMap(map);

    _eventController.add(event);

    if (event.code == RtmpStreamCode.publishStart) {
      // _notifyStateChange(streaming: true, localStream: _stream);
    }
  }

  void _notifyStateChange({
    bool? streaming,
    bool? connected,
    CameraPosition? cameraPosition,
    AudioSettings? audioSettings,
    VideoSettings? videoSettings,
    RtmpStream? localStream,
    int? frameRate,
    AVCaptureSessionPreset? sessionPreset,
  }) {
    _state = _state.copyWith(
      streaming: streaming,
      connected: connected,
      cameraPosition: cameraPosition,
      videoSettings: videoSettings,
      audioSettings: audioSettings,
      frameRate: frameRate,
      sessionPreset: sessionPreset,
      localStream: localStream,
    );
    _stateController.add(_state);
  }

  bool _audiSessionConfigured = false;
}

Future<bool> _configureAudioSession() async {
  final session = await AudioSession.instance;
  const config = AudioSessionConfiguration(
    avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
    avAudioSessionCategoryOptions: AVAudioSessionCategoryOptions.allowBluetooth,
  );
  try {
    await session.configure(config);
    return true;
  } catch (e) {
    print(e);
    return false;
  }
}

Below is CameraLiveState:

import 'package:haishin_kit/av_capture_session_preset.dart';
import 'package:haishin_kit/audio_settings.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_settings.dart';
import 'package:haishin_kit/video_source.dart';

class CameraLiveState {
  final bool streaming;
  final bool connected;
  final RtmpStream? localStream;
  final CameraPosition cameraPosition;
  final AudioSettings audioSettings;
  final VideoSettings videoSettings;
  final AVCaptureSessionPreset sessionPreset;
  final int frameRate;

  const CameraLiveState({
    this.streaming = false,
    this.connected = false,
    this.cameraPosition = CameraPosition.back,
    this.localStream,
    this.frameRate = 30,
    this.sessionPreset = AVCaptureSessionPreset.hd1280x720,
    required this.audioSettings,
    required this.videoSettings,
  });

  factory CameraLiveState.defaultState() {
    return CameraLiveState(
      audioSettings: AudioSettings(bitrate: 64 * 1000),
      videoSettings: VideoSettings(
        width: 480,
        height: 272,
        bitrate: 512 * 1000,
      ),
    );
  }

  bool get canPreview => streaming && connected && localStream != null;

  CameraLiveState copyWith({
    bool? streaming,
    bool? connected,
    CameraPosition? cameraPosition,
    AudioSettings? audioSettings,
    VideoSettings? videoSettings,
    RtmpStream? localStream,
    int? frameRate,
    AVCaptureSessionPreset? sessionPreset,
  }) {
    return CameraLiveState(
      streaming: streaming ?? this.streaming,
      connected: connected ?? this.connected,
      frameRate: frameRate ?? this.frameRate,
      sessionPreset: sessionPreset ?? this.sessionPreset,
      localStream: localStream ?? this.localStream,
      cameraPosition: cameraPosition ?? this.cameraPosition,
      audioSettings: audioSettings ?? this.audioSettings,
      videoSettings: videoSettings ?? this.videoSettings,
    );
  }

  @override
  String toString() {
    return 'CameraLiveState{streaming: $streaming, connected: $connected, localStream: $localStream, cameraPosition: $cameraPosition, audioSettings: $audioSettings, videoSettings: $videoSettings, sessionPreset: $sessionPreset, frameRate: $frameRate}';
  }
}

Expected behavior

Events related to RtmpStream can be received via RtmpStream.eventChannel.receiveBroadcastStream().listen(_onStreamEvent)

Version

version: 0.11.2

flutter version:

[✓] Flutter (Channel stable, 3.10.5, on macOS 13.4.1 22F770820d darwin-arm64, locale en-CA)
    • Flutter version 3.10.5 on channel stable at /Users/simonwang/flutter
    • Upstream repository https://github.com/flutter/flutter.git
    • Framework revision 796c8ef792 (3 months ago), 2023-06-13 15:51:02 -0700
    • Engine revision 45f6e00911
    • Dart version 3.0.5
    • DevTools version 2.23.1

[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.1)
    • Android SDK at /Users/simonwang/Library/Android/sdk
    • Platform android-33, build-tools 33.0.1
    • Java binary at: /Applications/Android Studio.app/Contents/jre/Contents/Home/bin/java
    • Java version OpenJDK Runtime Environment (build 11.0.12+0-b1504.28-7817840)
    • All Android licenses accepted.

[✓] Xcode - develop for iOS and macOS (Xcode 14.3.1)
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Build 14E300c
    • CocoaPods version 1.12.0

[✓] Chrome - develop for the web
    • Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome

[✓] Android Studio (version 2021.2)
    • Android Studio at /Applications/Android Studio.app/Contents
    • Flutter plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/6351-dart
    • Java version OpenJDK Runtime Environment (build 11.0.12+0-b1504.28-7817840)

[✓] IntelliJ IDEA Community Edition (version 2022.3)
    • IntelliJ at /Applications/IntelliJ IDEA CE.app
    • Flutter plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/6351-dart

[✓] VS Code (version 1.81.1)
    • VS Code at /Applications/Visual Studio Code.app/Contents
    • Flutter extension version 3.70.0

[✓] Connected device (4 available)
    • Pixel 6 (mobile)           • 21121FDF6006BK                       • android-arm64  • Android
      12 (API 32)
    • iPhone 14 Pro Max (mobile) • 24654780-1645-4B74-90EB-E276AE1AFF03 • ios            •
      com.apple.CoreSimulator.SimRuntime.iOS-16-4 (simulator)
    • macOS (desktop)            • macos                                • darwin-arm64   • macOS
      13.4.1 22F770820d darwin-arm64
    • Chrome (web)               • chrome                               • web-javascript • Google
      Chrome 116.0.5845.140

[✓] Network resources
    • All expected network resources are available.

• No issues found!

Smartphone info.

  • Device: Google Pixel 6

Additional context

No response

Screenshots

No response

Relevant log output

No response

Youtube RTMP stream bug

Describe the bug

I am facing a bug (I think it is a native Swift issue): when I pass the YouTube stream URL and the stream key, it gives me an error.

import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:flutter/material.dart';
import 'package:haishin_kit/audio_settings.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/net_stream_drawable_texture.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_settings.dart';
import 'package:haishin_kit/video_source.dart';
import 'package:permission_handler/permission_handler.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatefulWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  String _mode = "publish";
  CameraPosition currentPosition = CameraPosition.front;

  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  @override
  void dispose() {
    _stream?.dispose();
    _connection?.dispose();
    super.dispose();
  }

  Future<void> initPlatformState() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    // Set up AVAudioSession for iOS.
    final session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions:
          AVAudioSessionCategoryOptions.allowBluetooth,
    ));

    RtmpConnection connection = await RtmpConnection.create();
    connection.eventChannel.receiveBroadcastStream().listen((event) {
      switch (event["data"]["code"]) {
        case 'NetConnection.Connect.Success':
          if (_mode == "publish") {
            _stream?.publish("live");
          } else {
            _stream?.play("live");
          }
          setState(() {
            _recording = true;
          });
          break;
      }
    });

    RtmpStream stream = await RtmpStream.create(connection);
    stream.audioSettings = AudioSettings(muted: false, bitrate: 64 * 1000);
    stream.videoSettings = VideoSettings(
      width: 480,
      height: 272,
      bitrate: 512 * 1000,
    );
    stream.attachAudio(AudioSource());
    stream.attachVideo(VideoSource(position: currentPosition));

    if (!mounted) return;

    setState(() {
      _connection = connection;
      _stream = stream;
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(title: const Text('HaishinKit'), actions: [
          IconButton(
            icon: const Icon(Icons.play_arrow),
            onPressed: () {
              if (_mode == "publish") {
                _mode = "playback";
                _stream?.attachVideo(null);
                _stream?.attachAudio(null);
              } else {
                _mode = "publish";
                _stream?.attachAudio(AudioSource());
                _stream?.attachVideo(VideoSource(position: currentPosition));
              }
            },
          ),
          IconButton(
            icon: const Icon(Icons.flip_camera_android),
            onPressed: () {
              if (currentPosition == CameraPosition.front) {
                currentPosition = CameraPosition.back;
              } else {
                currentPosition = CameraPosition.front;
              }
              _stream?.attachVideo(VideoSource(position: currentPosition));
            },
          )
        ]),
        body: Center(
          child: _stream == null
              ? const Text("")
              : NetStreamDrawableTexture(_stream),
        ),
        floatingActionButton: FloatingActionButton(
          child: _recording
              ? const Icon(Icons.fiber_smart_record)
              : const Icon(Icons.not_started),
          onPressed: () {
            if (_recording) {
              _connection?.close();
              setState(() {
                _recording = false;
              });
            } else {
              _connection?.connect("rtmp://a.rtmp.youtube.com/live2");
              _stream?.publish("....-....-....-....-....");
              setState(() {
                _recording = true;
              });
            }
          },
        ),
      ),
    );
  }
}
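A likely contributing factor here: the snippet above calls publish() immediately after connect(), before the RTMP handshake can complete. A hedged sketch of deferring publish() until the success status arrives (the event payload shape is assumed from the plugin's rtmpStatus events; this belongs inside _MyAppState):

```dart
// Sketch: defer publish() until the connection reports success, instead of
// calling it right after connect(). Assumes rtmpStatus event maps of the
// form {"type": "rtmpStatus", "data": {"code": ...}}.
void startPublishing() {
  _connection?.eventChannel.receiveBroadcastStream().listen((event) {
    if (event is Map &&
        event['data'] is Map &&
        event['data']['code'] == 'NetConnection.Connect.Success') {
      _stream?.publish('....-....-....-....-....');
      setState(() => _recording = true);
    }
  });
  _connection?.connect('rtmp://a.rtmp.youtube.com/live2');
}
```

Whether this avoids the crash below depends on the plugin's internals, but publishing on an unestablished connection is a common source of RTMP failures.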

To Reproduce

Call the connect function

Expected behavior

I expected to live stream to YouTube.

Version

I am running haishin_kit: ^0.9.2

Smartphone info.

  • Device: iPhone 6s
  • OS: iOS 12.5

Additional context

No response

Screenshots

No response

Relevant log output

`Unsupported value: [Optional(undefined)] of type __SwiftValue`
`Lost connection to device.`

There are some problems with the plugin implementation

Describe the bug

You pass some wrong arguments

[ERROR:flutter/shell/common/shell.cc(1038)] The 'com.haishinkit.eventchannel/12929595648' channel sent a message from native to Flutter on a non-platform thread. Platform channel messages must be sent on the platform thread. Failure to do so may result in data loss or crashes, and must be fixed in the plugin or application code creating that channel.

To Reproduce

Just launch it on an iPhone

Expected behavior

It should be parsable

Version

0.13.0

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

Adobe authentication for rtmp not working on Android side

Describe the bug

Hello, I've been trying to connect to an RTMP publish server which requires authentication.
On my dart side I'm using

RtmpConnection connection = await RtmpConnection.create();
connection.eventChannel.receiveBroadcastStream().listen((event) {
  printDebug("EVENT " + event.toString());
});

to read the auth messages that the server responds with.
iOS devices work fine and can publish a mic stream, with the following messages:

flutter: EVENT {data: {level: error, code: NetConnection.Connect.Rejected, description: [ AccessManager.Reject ] : [ code=403 need auth; authmod=adobe ] : }, type: rtmpStatus}
flutter: EVENT {type: rtmpStatus, data: {description: [ AccessManager.Reject ] : [ authmod=adobe ] : ?reason=needauth&user=user&salt=XXXX==&challenge=XXXX==&opaque=XXXX==, level: error, code: NetConnection.Connect.Rejected}}
flutter: EVENT {data: {data: [null], description: Connection succeeded, objectEncoding: 0.0, clientid: 1.0, level: status, code: NetConnection.Connect.Success}, type: rtmpStatus}
flutter: EVENT {type: rtmpStatus, data: {code: NetStream.Publish.Start, description: , clientid: 1.0, level: status, details: /streams.mount123}}
flutter: EVENT {data: null, type: rtmpStatus}
flutter: EVENT {type: rtmpStatus, data: {details: /streams.mount123, description: , level: status, code: NetStream.Publish.Start, clientid: 1.0}}
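For reference, the status code in these event maps can be extracted with a small helper (a hypothetical function, not part of haishin_kit; the payload shape is assumed from the logs above):

```dart
/// Hypothetical helper: pull the status code out of an rtmpStatus event
/// map shaped like {"type": "rtmpStatus", "data": {"code": ...}}.
String? rtmpStatusCode(dynamic event) {
  if (event is Map && event['type'] == 'rtmpStatus') {
    final data = event['data'];
    if (data is Map) return data['code'] as String?;
  }
  return null;
}
```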

On the Android side I'm not able to connect to the streaming server via Adobe auth; I'm getting the following messages:

I/flutter (28417): EVENT {data: {code: NetConnection.Connect.Rejected, level: error, description: [ AccessManager.Reject ] : [ code=403 need auth; authmod=adobe ] : }, type: rtmpStatus}
I/flutter (28417): EVENT {data: {code: NetConnection.Connect.Rejected, level: error, description: [ AccessManager.Reject ] : [ authmod=adobe ] : ?reason=needauth&user=user&salt=XXXX==&challenge=XXXX==&opaque=XXXX==}, type: rtmpStatus}
D/TrafficStats(28417): tagSocket(6) with statsTag=0xffffffff, statsUid=-1
I/flutter (28417): EVENT {data: {code: NetConnection.Connect.Closed, level: status}, type: rtmpStatus}

From what I've seen on the Android side, connection.close() is always called before a new connect attempt when the server asks for a user and password, but on iOS the close is called only when the user or password field is empty. Is this where the issue originates?
Can you please help me with this? I'm not sure where to begin looking on the Android side.

To Reproduce

  1. Use the example project
  2. For line 138 use _connection?.connect("rtmp://user:password@url/live"); in order to have user and password for the native layer to push forward.

Expected behavior

Both Android and iOS devices should connect and publish to the rtmp adobe auth server.

Version

haishin_kit: ^0.9.4

Smartphone info.

  • Any Android device or emulator
  • Any iOS device, or simulator (on simulator the connection occurs but it won't stream any data from the mic)

Additional context

No response

Screenshots

No response

Relevant log output

No response

Cannot set the com.android.graphics.injectLayers.enable attribute to true in the APK's AndroidManifest.xml file

Describe the bug

I encountered an issue when trying to upload my Flutter app bundle to the Google Play Store. During the upload process, I received an error stating "Cannot set the com.android.graphics.injectLayers.enable attribute to true in the APK's AndroidManifest.xml file" This error prevented me from successfully publishing my app on the Play Store.

To Reproduce

  1. Generate an app bundle using the Flutter build command.
  2. Upload the app bundle to the Google Play Console.
  3. During the upload process, the error message "Cannot set the com.android.graphics.injectLayers.enable attribute to true in the APK's AndroidManifest.xml file" is displayed, preventing the upload from completing successfully.

Expected behavior

The app bundle should be uploaded to the Google Play Store without any errors related to "Cannot set the com.android.graphics.injectLayers.enable attribute to true in the APK's AndroidManifest.xml file"

Version

Flutter version: 3.7.12
Flutter build command used to generate the app bundle: flutter build appbundle

dependencies:
haishin_kit: ^0.9.4

Smartphone info.

Android min sdk 21

Additional context

I realized that it is your package that is causing the error because I removed the dependency and all its usages from the project, and I was able to upload an app bundle successfully. The strange thing is that I had successfully uploaded an app bundle before using the package. I have been trying to upload the update to the store for over a month, and this issue has been preventing me from doing so. I even contacted Google Play support, but they couldn't assist me

Please let me know if there is any additional information or logs that I can provide to help resolve this issue.
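If the attribute is arriving through the plugin's merged manifest, a commonly suggested workaround (an assumption — verify the exact element in the manifest-merger report) is to strip it from the app's AndroidManifest.xml with a merger remove rule:

```xml
<!-- Sketch: remove meta-data injected by a library manifest.
     Requires xmlns:tools="http://schemas.android.com/tools" on <manifest>. -->
<application>
    <meta-data
        android:name="com.android.graphics.injectLayers.enable"
        android:value="true"
        tools:node="remove" />
</application>
```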

Screenshots

(screenshot attached)

Relevant log output

No response

Adding VideoEffect in RTMPStreamHandler does not take effect

Describe the bug

The breakpoint will be entered, but has no effect

To Reproduce

instance?.registerVideoEffect(MonochromeEffect())

final class MonochromeEffect: VideoEffect {
    let filter: CIFilter? = CIFilter(name: "CIColorMonochrome")

    override func execute(_ image: CIImage, info: CMSampleBuffer?) -> CIImage {
        guard let filter: CIFilter = filter else {
            return image
        }
        filter.setValue(image, forKey: "inputImage")
        filter.setValue(CIColor(red: 0.75, green: 0.75, blue: 0.75), forKey: "inputColor")
        filter.setValue(1.0, forKey: "inputIntensity")
        return filter.outputImage!
    }
}

Expected behavior

The image should be rendered in grayscale.

Version

fork Version

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

Cannot connect to Mux RTMP secure server URL

Describe the bug

Thank you for all of your work on your excellent product.

I use your example Flutter code in debug mode to try to connect via an iPhone 11 to a Mux RTMP server secure URL e.g. rtmps://global-live.mux.com:443/app/MUX STREAM KEY and the response after attempting to connect is always "NetConnection.Connect.Closed". I test the same URL on the iPhone using Larix and it works fine.

To Reproduce

  1. Use your example Flutter code
  2. Edit pubspec.yaml to use haishin_kit 0.9.1
  3. Use a Mux secure URL e.g. rtmps://global-live.mux.com:443/app/MUX STREAM KEY
  4. Observe that it does not connect

Expected behavior

To connect and broadcast

Version

haishin_kit 0.9.1

Smartphone info.

  • Device: iPhone 11
  • OS: iOS 15.5

Additional context

No response

Screenshots

No response

Relevant log output

No response

NetConnection.Call.Failed not emitted when trying to connect to server and there is no internet connection

Describe the bug

When a user has no internet connection and attempts to connect to the RTMP server, no connection event is emitted to the Flutter side of the application.
However, the following error is emitted on the Java side:

java.net.ConnectException: failed to connect to "example-IP" (port 1935) from /:: (port 0): connect failed: ENETUNREACH (Network is unreachable)

From my understanding, the connection event "NetConnection.Call.Failed" should be emitted and from that we can inform the user manually that they have a connectivity issue on their side.
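Until the plugin surfaces a failure event, a Dart-side workaround sketch (not plugin API; the event payload shape is assumed from the plugin's rtmpStatus events) is to resolve connect() manually with a timeout:

```dart
import 'dart:async';

import 'package:haishin_kit/rtmp_connection.dart';

/// Workaround sketch: treat the absence of any status event within
/// [timeout] as a failed connection, since NetConnection.Call.Failed
/// may never reach Dart when the network is unreachable.
Future<bool> connectWithTimeout(RtmpConnection connection, String url,
    {Duration timeout = const Duration(seconds: 5)}) async {
  final completer = Completer<bool>();
  final sub = connection.eventChannel.receiveBroadcastStream().listen((event) {
    if (completer.isCompleted) return;
    final data = (event is Map) ? event['data'] : null;
    final code = (data is Map) ? data['code'] : null;
    if (code == 'NetConnection.Connect.Success') {
      completer.complete(true);
    } else if (code == 'NetConnection.Connect.Rejected' ||
        code == 'NetConnection.Connect.Closed') {
      completer.complete(false);
    }
  });
  connection.connect(url);
  final ok = await completer.future.timeout(timeout, onTimeout: () => false);
  await sub.cancel();
  return ok;
}
```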

To Reproduce

  • Run example app Example2
  • Turn off all internet connections on android device
  • Try to connect to a working RTMP server
  • See Java exception java.net.ConnectException: failed to connect to in the logs

Expected behavior

To receive the connection event NetConnection.Call.Failed

Version

haishin_kit version 0.13.0

Smartphone info.

Google Pixel 7 Pro
Android version 14

Additional context

No response

Screenshots

No response

Relevant log output

[        ] W/NetSocketImpl( 1122): java.net.ConnectException: failed to connect to "example-IP" (port 1935) from /:: (port 0): connect failed: ENETUNREACH (Network is unreachable)
[        ] W/NetSocketImpl( 1122): 	at libcore.io.IoBridge.connect(IoBridge.java:187)
[        ] W/NetSocketImpl( 1122): 	at java.net.PlainSocketImpl.socketConnect(PlainSocketImpl.java:142)
[        ] W/NetSocketImpl( 1122): 	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:390)
[        ] W/NetSocketImpl( 1122): 	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:230)
[        ] W/NetSocketImpl( 1122): 	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:212)
[        ] W/NetSocketImpl( 1122): 	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:436)
[        ] W/NetSocketImpl( 1122): 	at java.net.Socket.connect(Socket.java:646)
[        ] W/NetSocketImpl( 1122): 	at java.net.Socket.connect(Socket.java:595)
[        ] W/NetSocketImpl( 1122): 	at java.net.Socket.<init>(Socket.java:475)
[        ] W/NetSocketImpl( 1122): 	at java.net.Socket.<init>(Socket.java:243)
[        ] W/NetSocketImpl( 1122): 	at com.haishinkit.net.NetSocketImpl.createSocket(NetSocketImpl.kt:203)
[        ] W/NetSocketImpl( 1122): 	at com.haishinkit.net.NetSocketImpl.doConnection(NetSocketImpl.kt:165)
[        ] W/NetSocketImpl( 1122): 	at com.haishinkit.net.NetSocketImpl.access$doConnection(NetSocketImpl.kt:21)
[        ] W/NetSocketImpl( 1122): 	at com.haishinkit.net.NetSocketImpl$connect$1.invokeSuspend(NetSocketImpl.kt:64)
[        ] W/NetSocketImpl( 1122): 	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
[        ] W/NetSocketImpl( 1122): 	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
[        ] W/NetSocketImpl( 1122): 	at kotlinx.coroutines.internal.LimitedDispatcher$Worker.run(LimitedDispatcher.kt:115)
[        ] W/NetSocketImpl( 1122): 	at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:103)
[        ] W/NetSocketImpl( 1122): 	at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)
[        ] W/NetSocketImpl( 1122): 	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)
[        ] W/NetSocketImpl( 1122): 	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)
[        ] W/NetSocketImpl( 1122): 	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)
[        ] W/NetSocketImpl( 1122): Caused by: android.system.ErrnoException: connect failed: ENETUNREACH (Network is unreachable)

Camera Not Closed

Describe the bug

The camera is not released when leaving and disposing the page (calling stream.close() has no effect on the referenced camera).

The bug is in this file

https://github.com/shogo4405/HaishinKit.dart/blob/main/android/src/main/kotlin/com/haishinkit/haishin_kit/RtmpStreamHandler.kt

Ctrl+F for "camera" and you can see that camera.open() is called, but close()/dispose() is never called on the camera.

as a result the camera is never released even after leaving/closing the page (the privacy alert camera icon is still in the status bar)
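Until the Kotlin handler releases the camera, a Dart-side workaround sketch is to detach the sources before closing (an assumption: detaching may trigger the native source teardown, but whether the camera is fully released still depends on the plugin):

```dart
// Workaround sketch: detach audio/video sources before closing so the
// native side has a chance to release the camera and microphone.
@override
void dispose() {
  _stream?.attachVideo(null);
  _stream?.attachAudio(null);
  _stream?.close();
  _connection?.close();
  super.dispose();
}
```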

To Reproduce

please check the source code in file

https://github.com/shogo4405/HaishinKit.dart/blob/main/android/src/main/kotlin/com/haishinkit/haishin_kit/RtmpStreamHandler.kt

Expected behavior

close camera when stream.close() is called

Version

latest

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

Draggable turns black when moved

Describe the bug

Hello, I am using a Draggable so that I can move the camera preview while transmitting. When it starts it looks fine, but when I move it, it turns black. Is there any way to use NetStreamDrawableTexture with Draggable?

To Reproduce

Wrap NetStreamDrawableTexture in a Draggable, just that
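One thing worth trying (an assumption, not a confirmed fix: the black frame may come from Draggable rebuilding the texture widget mid-drag): supply the stream texture as the drag feedback and collapse the original child, so a live texture stays on screen throughout the drag:

```dart
// Sketch: show the stream texture as the drag feedback so a live
// preview remains visible while dragging.
Draggable(
  feedback: SizedBox(
    width: 160,
    height: 90,
    child: NetStreamDrawableTexture(_stream),
  ),
  childWhenDragging: const SizedBox.shrink(),
  child: NetStreamDrawableTexture(_stream),
)
```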

Expected behavior

The camera preview should remain visible while it is being moved.

Version

0.9.4

Smartphone info.

0.9.4

Additional context

No response

Screenshots

WhatsApp.Video.2023-06-03.at.12.18.26.PM.mp4

Relevant log output

No response

StreamViewTexture height and stream view resolution

Describe the bug

Can't adjust StreamViewTexture to show full screen camera.

To Reproduce

If I change _updatePlatformState like this:

Future _updatePlatformState(BoxConstraints constraints) async {
  widget.netStream?.updateTextureSize({
    "width": constraints.maxWidth,
    "height": constraints.maxHeight / 3,
  });
}

it shows a full-screen view, but the resolution is too poor: objects look taller than they are and appear unclear.

Expected behavior

I want to know how to show the texture full screen at high resolution so the user is clearly visible.
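One way to fill the screen without the one-third-height distortion (a sketch; the 480x272 values are an assumption and must match the stream's VideoSettings) is to keep the texture at the stream's own aspect ratio and let FittedBox crop it:

```dart
// Sketch: crop the preview to fill the screen instead of stretching
// the texture. Use BoxFit.contain to letterbox instead of crop.
Widget fullScreenPreview(RtmpStream stream) {
  return FittedBox(
    fit: BoxFit.cover,
    clipBehavior: Clip.hardEdge,
    child: SizedBox(
      width: 480, // assumed to match VideoSettings width
      height: 272, // assumed to match VideoSettings height
      child: NetStreamDrawableTexture(stream),
    ),
  );
}
```

Note this only changes how the preview is displayed; the encoded resolution is still governed by videoSettings.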

Version

Flutter 3.19.3 • channel stable • https://github.com/flutter/flutter.git
Framework • revision ba39319843 (3 months ago) • 2024-03-07 15:22:21 -0600
Engine • revision 2e4ba9c6fb
Tools • Dart 3.3.1 • DevTools 2.31.1

Smartphone info.

Huawei nova 3i

Additional context

No response

Screenshots

No response

Relevant log output

No response

registerTexture may not update the size of the texture on iOS

Describe the bug

When invoking RtmpStream#registerTexture on iOS, it may create multiple NetStreamDrawableTexture instances.

As a result, it may show a black screen when previewing the camera video, as the texture has zero media size.

For example, to preview the camera correctly, we have to:

  1. registerTexture to send the texture id back to the dart side
  2. registerTexture with the media size to set the correct screen size for this texture.

However, if doing the two steps as the following:

mixin LiveStreamTextureMixin on LiveStreamConnectionManager {
  int? _textureId;
  Size? _previousSize;

  int? get textureId => _textureId;

  Future<int?> registerTexture({Size? size}) async {
    if (_stream == null) return null;

    _textureId ??= await _stream!.registerTexture({});

    if (size != null && _previousSize != size) {
      final textureId = await _stream!.registerTexture({
        "width": size.width,
        "height": size.height,
      });

      if (textureId != null && _textureId != textureId) {
        _textureId = textureId;
      }

      _previousSize = size;
    }

    return _textureId;
  }
}

The second invocation of registerTexture (the one passing "width"/"height") may create a new NetStreamDrawableTexture instead of setting the size of the texture previously created by the first invocation.

The main reason is that attaching the stream to a texture is an asynchronous task (see the attached screenshots of the Swift implementation). Consequently, the second invocation may take the instance?.mixer.drawable == nil branch and try to create a new texture, instead of setting a media size on the existing texture as expected.

To Reproduce

Use the LiveStreamTextureMixin code shown above in the bug description to register textures.

The camera preview may show a black screen, depending on whether the texture has a valid media size.

If registerTexture({Size? size}) is called repeatedly, there is a high chance of ending up with a black screen.

Expected behavior

  1. The first invocation of registerTexture should register the texture and attach it to the stream synchronously.
  2. The second invocation of registerTexture should set the media size of the previously created texture correctly.

Version

HaishinKit.dart: v0.11.2
Flutter: v3.10.5

Smartphone info.

  • iPhone 13 mini (iOS 16.1.2)

Additional context

Is it necessary to attach a texture to the stream asynchronously?

In my forked version, I split the logic of registering the texture into 3 different methods:

  1. registerTexture: only register a texture for Flutter
  2. updateTextureSize: only set the media size for the texture attached to the net stream. if no attachment, just return nil.
  3. unregisterTexture: to unregister the texture from the flutter registry (no need for Android, but recommended for iOS)

By doing so, we could ensure we are never creating a new texture and that the media size could be updated correctly.

However, the issue is not solved completely, as we still cannot predict exactly when the texture is attached to the net stream.

Therefore, I suggest creating the above three methods to make them more specific and then attaching a texture to the net stream synchronously. If it makes sense, I will create a PR.
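Usage of the proposed split might look like this (method names follow the reporter's proposal; only updateTextureSize appears elsewhere in the plugin, so treat registerTexture/unregisterTexture semantics here as hypothetical):

```dart
// Sketch of the proposed three-method flow (hypothetical API).
Future<void> setupTexture(RtmpStream stream, Size size) async {
  // 1. Only register a texture with the Flutter registry.
  final textureId = await stream.registerTexture({});

  // 2. Only set the media size on the already-attached texture.
  await stream.updateTextureSize({
    'width': size.width,
    'height': size.height,
  });

  // 3. On teardown, release the texture (recommended on iOS).
  await stream.unregisterTexture(textureId);
}
```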

Screenshots

No response

Relevant log output

No response

Support pure audio (no video) ?

Describe the bug

describe the bug: s
to reproduce: s
expected behavior: s
version: s

To Reproduce

Expected behavior

1

Version

1

Smartphone info.

No response

Additional context

No response

Screenshots

No response

Relevant log output

No response

ios/Pods/HaishinKit/Sources/Media/IOCaptureUnit.swift:178:62 'isMultiCamSupported' is only available in iOS 13.0 or newer

Describe the bug

ios/Pods/HaishinKit/Sources/Media/IOCaptureUnit.swift:178:62 'isMultiCamSupported' is only available in iOS 13.0 or newer

To Reproduce

./ios/Podfile

# Uncomment this line to define a global platform for your project
# platform :ios, '11.0'
platform :ios, '11.0'

# rm -rf Podfile.lock pods .symlink Runner.xcworkspace && flutter clean && flutter pub get && pod install --repo-update
# source 'https://github.com/CocoaPods/Specs.git'
source 'https://mirrors.tuna.tsinghua.edu.cn/git/CocoaPods/Specs.git'

# CocoaPods analytics sends network stats synchronously affecting flutter build latency.
ENV['COCOAPODS_DISABLE_STATS'] = 'true'

project 'Runner', {
  'Debug' => :debug,
  'Profile' => :release,
  'Release' => :release,
}
...

Expected behavior

The project should build and run normally.
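Since the support table at the top of this README lists iOS 12.0+, the likely fix (an assumption based on that table and the availability error) is raising the Podfile platform so the pod compiles against a new enough deployment target:

```ruby
# Sketch: raise the global platform in ios/Podfile from 11.0 to at least
# 12.0, matching HaishinKit's stated minimum iOS version.
platform :ios, '12.0'
```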

Version

haishin_kit: ^0.11.0
flutter doctor -v
[!] Flutter (Channel stable, 3.10.6, on macOS 13.4.1 22F770820d
    darwin-arm64 (Rosetta), locale zh-Hans-CN)
    • Flutter version 3.10.6 on channel stable at
      /Users/leeyi/workspace/flutter
    ! Warning: `dart` on your path resolves to
      /opt/homebrew/Cellar/dart/2.18.6/libexec/bin/dart, which is not
      inside your current Flutter SDK checkout at
      /Users/leeyi/workspace/flutter. Consider adding
      /Users/leeyi/workspace/flutter/bin to the front of your path.
    • Upstream repository https://github.com/flutter/flutter.git
    • Framework revision f468f3366c (4 周前), 2023-07-12 15:19:05 -0700
    • Engine revision cdbeda788a
    • Dart version 3.0.6
    • DevTools version 2.23.1
    • Pub download mirror https://pub.flutter-io.cn
    • Flutter download mirror https://storage.flutter-io.cn
    • If those were intentional, you can disregard the above warnings;
      however it is recommended to use "git" directly to perform update
      checks and upgrades.

[✓] Android toolchain - develop for Android devices (Android SDK version
    33.0.1)
    • Android SDK at /Users/leeyi/Library/Android/sdk
    • Platform android-33, build-tools 33.0.1
    • Java binary at: /Applications/Android
      Studio.app/Contents/jbr/Contents/Home/bin/java
    • Java version OpenJDK Runtime Environment (build
      17.0.6+0-17.0.6b829.9-10027231)
    • All Android licenses accepted.

[✓] Xcode - develop for iOS and macOS (Xcode 14.3.1)
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Build 14E300c
    • CocoaPods version 1.12.1

[✓] Chrome - develop for the web
    • Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google
      Chrome

[!] Android Studio
    • Android Studio at /Applications/Android Studio
      Preview.app/Contents
    • Flutter plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/6351-dart
    ✗ Unable to find bundled Java version.
    • Try updating or re-installing Android Studio.

[✓] Android Studio (version 2022.3)
    • Android Studio at /Applications/Android Studio.app/Contents
    • Flutter plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/6351-dart
    • Java version OpenJDK Runtime Environment (build
      17.0.6+0-17.0.6b829.9-10027231)

[✓] Connected device (3 available)
    • iPhone7 (mobile) • 064cedc7fa15dc1353e8a45ab54655f10eeb5a1a • ios
      • iOS 15.7.8 19H364
    • macOS (desktop)  • macos                                    •
      darwin-arm64   • macOS 13.4.1 22F770820d darwin-arm64 (Rosetta)
    • Chrome (web)     • chrome                                   •
      web-javascript • Google Chrome 115.0.5790.114

[✓] Network resources
    • All expected network resources are available.

Smartphone info.

  • device iPhone7 Real machine

Additional context

this my open source project https://github.com/imboy-pub/imboy-flutter

Screenshots

No response

Relevant log output

No response

Unsupported codec value error


Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Unsupported value for standard codec'

Unsupported value: [] of type __SwiftValue
*** Assertion failure in -[FlutterStandardWriter writeValue:], FlutterStandardCodec.mm:338
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Unsupported value for standard codec'
*** First throw call stack:
(0x1b13c1288 0x1ca0bb744 0x1b2c4e360 0x10a673478 0x10a6735ac 0x10a6735ac 0x10a673b00 0x10a6712fc 0x105bc5c50 0x105bc5388 0x105bc5454 0x1b1352834 0x1b13eefd4 0x1b13c21d0 0x1b13688ac 0x1b2b37754 0x1065913bc 0x1065914c8 0x10665ccf0 0x106652038 0x106652644 0x106652644 0x106652788 0x10667efe0 0x10661afd0 0x10661bcac 0x10661c2f8 0x1b142c120 0x1b144917c 0x1b1026e6c 0x1b1028a30 0x1b1030124 0x1b1030c80 0x1b103b500 0x222c800bc 0x222c7fe5c)
libc++abi: terminating with uncaught exception of type NSException
* thread #34, queue = 'com.haishinkit.HaishinKit.NetSocket.input', stop reason = signal SIGABRT
    frame #0: 0x00000001e8f56b38 libsystem_kernel.dylib`__pthread_kill + 8
libsystem_kernel.dylib`__pthread_kill:
->  0x1e8f56b38 <+8>:  b.lo   0x1e8f56b58               ; <+40>
    0x1e8f56b3c <+12>: pacibsp 
    0x1e8f56b40 <+16>: stp    x29, x30, [sp, #-0x10]!
    0x1e8f56b44 <+20>: mov    x29, sp
Target 0: (Runner) stopped.
