livekit / client-sdk-js

LiveKit browser client SDK (javascript)

Home Page: https://livekit.io

License: Apache License 2.0

TypeScript 96.99% HTML 1.64% JavaScript 0.94% CSS 0.42%
javascript typescript webrtc

client-sdk-js's Introduction

The LiveKit icon, the name of the repository and some sample code in the background.

LiveKit: Real-time video, audio and data for developers

LiveKit is an open source project that provides scalable, multi-user conferencing based on WebRTC. It's designed to provide everything you need to build real-time video, audio, and data capabilities in your applications.

LiveKit's server is written in Go, using the awesome Pion WebRTC implementation.


Features

Documentation & Guides

https://docs.livekit.io

Live Demos

Ecosystem

  • Agents: build real-time multimodal AI applications with programmable backend participants
  • Egress: record or multi-stream rooms and export individual tracks
  • Ingress: ingest streams from external sources like RTMP, WHIP, HLS, or OBS Studio

SDKs & Tools

Client SDKs

Client SDKs enable your frontend to include interactive, multi-user experiences.

Language Repo Declarative UI Links
JavaScript (TypeScript) client-sdk-js React docs | JS example | React example
Swift (iOS / MacOS) client-sdk-swift Swift UI docs | example
Kotlin (Android) client-sdk-android Compose docs | example | Compose example
Flutter (all platforms) client-sdk-flutter native docs | example
Unity WebGL client-sdk-unity-web docs
React Native (beta) client-sdk-react-native native
Rust client-sdk-rust

Server SDKs

Server SDKs enable your backend to generate access tokens, call server APIs, and receive webhooks. In addition, the Go SDK includes client capabilities, enabling you to build automations that behave like end-users.

Language Repo Docs
Go server-sdk-go docs
JavaScript (TypeScript) server-sdk-js docs
Ruby server-sdk-ruby
Java (Kotlin) server-sdk-kotlin
Python (community) python-sdks
PHP (community) agence104/livekit-server-sdk-php

Tools

Install

Tip

We recommend installing LiveKit CLI along with the server. It lets you access server APIs, create tokens, and generate test traffic.

The following will install LiveKit's media server:

MacOS

brew install livekit

Linux

curl -sSL https://get.livekit.io | bash

Windows

Download the latest release here

Getting Started

Starting LiveKit

Start LiveKit in development mode by running livekit-server --dev. It'll use a placeholder API key/secret pair.

API Key: devkey
API Secret: secret

To customize your setup for production, refer to our deployment docs

Creating access token

A user connecting to a LiveKit room requires an access token. Access tokens (JWT) encode the user's identity and the room permissions they've been granted. You can generate a token with our CLI:

livekit-cli create-token \
    --api-key devkey --api-secret secret \
    --join --room my-first-room --identity user1 \
    --valid-for 24h
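
If you'd rather mint tokens from your backend, a minimal sketch with the Node server SDK (livekit-server-sdk) looks roughly like this; note that toJwt() is async in recent versions of that SDK:

    import { AccessToken } from 'livekit-server-sdk';

    // Dev credentials from above; load real keys from the environment in production.
    const at = new AccessToken('devkey', 'secret', { identity: 'user1', ttl: '24h' });
    at.addGrant({ roomJoin: true, room: 'my-first-room' });

    const token = await at.toJwt(); // returns the signed JWT
    console.log(token);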

Test with example app

Head over to our example app and enter a generated token to connect to your LiveKit server. This app is built with our React SDK.

Once connected, your video and audio are now being published to your new LiveKit instance!
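
To connect from your own page with this SDK instead, a minimal sketch looks like the following (URL and token are placeholders):

    import { Room, RoomEvent } from 'livekit-client';

    const room = new Room();
    room.on(RoomEvent.TrackSubscribed, (track) => {
      // attach() returns a media element playing the remote track
      document.body.appendChild(track.attach());
    });

    await room.connect('ws://localhost:7880', token);
    // publish the local camera and microphone
    await room.localParticipant.enableCameraAndMicrophone();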

Simulating a test publisher

livekit-cli join-room \
    --url ws://localhost:7880 \
    --api-key devkey --api-secret secret \
    --room my-first-room --identity bot-user1 \
    --publish-demo

This command publishes a looped demo video to a room. Due to how the video clip was encoded (keyframes every 3s), there's a slight delay before the browser has sufficient data to begin rendering frames. This is an artifact of the simulation.

Deployment

Use LiveKit Cloud

LiveKit Cloud is the fastest and most reliable way to run LiveKit. Every project gets free monthly bandwidth and transcoding credits.

Sign up for LiveKit Cloud.

Self-host

Read our deployment docs for more information.

Building from source

Pre-requisites:

  • Go 1.22+ is installed
  • GOPATH/bin is in your PATH

Then run

git clone https://github.com/livekit/livekit
cd livekit
./bootstrap.sh
mage

Contributing

We welcome your contributions toward improving LiveKit! Please join us on Slack to discuss your ideas and/or PRs.

License

LiveKit server is licensed under Apache License v2.0.


LiveKit Ecosystem
Realtime SDKs: React Components · Browser · Swift Components · iOS/macOS/visionOS · Android · Flutter · React Native · Rust · Node.js · Python · Unity (web) · Unity (beta)
Server APIs: Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community)
Agents Frameworks: Python · Playground
Services: LiveKit server · Egress · Ingress · SIP
Resources: Docs · Example apps · Cloud · Self-hosting · CLI

client-sdk-js's People

Contributors

bekriebel · boks1971 · burzomir · caiiiycuk · cnderrauber · davideberlein · davidliu · davidzhao · dbkr · dependabot[bot] · dsa · frostbyte73 · github-actions[bot] · hermanbilous · hiroshihorie · jibon57 · lukasio · mpnri · ocupe · pablofuente · renovate[bot] · rnakano · scott-lc · svajunas-budrys · tab1293 · theomonnom · toger5 · uninen · wcarle · zesun96


client-sdk-js's Issues

Quickly muting and unmuting audio causes a muting loop

If you quickly toggle the mute state of an audio track, the track goes into a state of muting and unmuting until disconnecting from the server. I'm not sure if this issue is in the client or server code.

Reproduction steps with the sample client:

  1. Using the example/sample, add a set of hooks to log mute and unmute room events:
     .on(RoomEvent.TrackMuted, () => appendLog("Track muted"))
     .on(RoomEvent.TrackUnmuted, () => appendLog("Track unmuted"))
  2. Launch the sample and connect two clients.
  3. Open the console on a client and quickly mute and unmute the audio track:
     window.muteAudio(); window.muteAudio();
  4. Observe from the second client that the audio track continues to get muted and unmuted.

The console command is the most reliable way to trigger it, but I have also had it trigger in a real scenario with a push-to-talk setup, where the user presses and immediately releases or accidentally double-taps the push-to-talk key. This can be mitigated by adding a delay between mute calls in the client implementation (see the sketch below), but it seems like something gets into a bad state when mute/unmute calls are made back to back.
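
For reference, a minimal client-side mitigation could serialize the toggles so each mute/unmute waits for the previous one to settle (a sketch only, not part of the SDK):

    import { LocalAudioTrack } from 'livekit-client';

    // Hypothetical helper: queue mute/unmute calls on a LocalAudioTrack.
    let pending: Promise<unknown> = Promise.resolve();

    function setMuted(track: LocalAudioTrack, muted: boolean) {
      pending = pending.then(() => (muted ? track.mute() : track.unmute()));
      return pending;
    }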

It reported an error after I installed it (npm i livekit-client; webpack; node v13.11.0)

in ./node_modules/livekit-client/dist/livekit-client.esm.js

Module parse failed: Unexpected token (4675:48)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
| creationTime: isSet$1(object.creationTime) ? Number(object.creationTime) : 0,
| turnPassword: isSet$1(object.turnPassword) ? String(object.turnPassword) : '',

        enabledCodecs: Array.isArray(object?.enabledCodecs)

| ? object.enabledCodecs.map((e) => Codec.fromJSON(e))
| : [],

@ ./node_modules/cache-loader/dist/cjs.js??ref--13-0!./node_modules/babel-loader/lib!./node_modules/cache-loader/dist/cjs.js??ref--1-0!./node_modules/vue-loader/lib??vue-loader-options!./src/components/meeting/index.vue?vue&type=script&lang=js& 88:0-64 234:96-101 241:20-32 401:22-31 401:88-97 401:160-169 401:216-225 401:276-285 574:13-22 574:78-87 574:149-158 574:204-213 574:263-272
@ ./src/components/meeting/index.vue?vue&type=script&lang=js&
@ ./src/components/meeting/index.vue
@ ./node_modules/cache-loader/dist/cjs.js??ref--13-0!./node_modules/babel-loader/lib!./node_modules/cache-loader/dist/cjs.js??ref--1-0!./node_modules/vue-loader/lib??vue-loader-options!./src/views/meeting.vue?vue&type=script&lang=js&
@ ./src/views/meeting.vue?vue&type=script&lang=js&
@ ./src/views/meeting.vue
@ ./src/router/index.js
@ ./src/main.js

Provide option to avoid stopping audio track during mute

When using Bluetooth devices for audio, stopping the audio track triggers a profile change on the headset (from communication mode to HQ playback mode).

Doing so causes an audible change in playback and could cause issues. By default, we should not trigger the change in communication profile, but offer an option for the user to stop the track if desired (to turn off the recording indicator).
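
For what it's worth, newer SDK versions expose a room option along these lines; the flag name below (stopMicTrackOnMute) is how it appears in recent RoomOptions, but verify it against the version you're using:

    import { Room } from 'livekit-client';

    const room = new Room({
      // Assumed option: when false, the mic track is kept alive on mute, avoiding the
      // Bluetooth profile switch; when true, the track is stopped to clear the
      // recording indicator.
      stopMicTrackOnMute: false,
    });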

Module '"livekit-client"' has no exported member 'createLocalVideoTrack'.

Hello,

When I try to import createLocalVideoTrack from 'livekit-client' it gives me the error below. It is the same on your React example. Other imports work just fine. I looked into src and could not find anything related to createLocalVideoTrack. Here is the screenshot:

[screenshot omitted]

Error: Module '"livekit-client"' has no exported member 'createLocalVideoTrack'. I am going to use it for displaying the user's local video.

Ability to select output audio device

Currently we are able to set the input device for audio, but not the output. While there are ways of setting this by manually binding audio elements, this is something we could simplify.
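
As a manual workaround today, browsers that implement the standard setSinkId API (Chromium-based ones) let you route the element returned by attach() to a chosen output device; a sketch:

    // Route a remote audio track to a specific output device, where supported.
    const element = audioTrack.attach() as HTMLAudioElement;
    const devices = await navigator.mediaDevices.enumerateDevices();
    const speaker = devices.find((d) => d.kind === 'audiooutput');
    if (speaker && 'setSinkId' in element) {
      await element.setSinkId(speaker.deviceId);
    }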

Enable remote video track publications on first attachment

When adaptiveStream is enabled, video tracks should be enabled only when they are visible. The current implementation presumes that new tracks are attached as soon as they are subscribed and are therefore enabled by default.

As one approach to allow for use cases where this is not the case (e.g. a user starts sharing their screen but decides to actually present it at a later point), this behaviour could be inverted:

The initial value of RemoteTrackPublication.disabled should be set to true if kind === Track.Kind.Video and this value should be then set to false as soon as the corresponding RemoteVideoTrack has been attach()'ed for the first time (only if it's actually visible at this point, of course).

[Feature] callback event for Stats

It would be best if it were possible to get live stats via a callback event.

async getSenderStats(): Promise<VideoSenderStats[]> {

This way we can detect the user's connection status and give a warning or suggestion about a slow connection. The current implementation only works if simulcast is enabled. It would be better if there were a way to get this status for any type of connection (see the sketch below).
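
Until a dedicated event exists, a rough polling sketch over that method could look like this; it assumes getSenderStats is reachable on the local video track and that the returned stats include a packetsLost field:

    // Poll sender stats and warn on packet loss (illustrative only).
    const statsTimer = setInterval(async () => {
      const stats = await localVideoTrack.getSenderStats();
      const packetsLost = stats.reduce((sum, s) => sum + (s.packetsLost ?? 0), 0);
      if (packetsLost > 0) {
        console.warn('possible connection degradation, packets lost:', packetsLost);
      }
    }, 5000);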

Vue project error Module parse failed: Unexpected token

The following snippets fail to parse and result in an exception; the error is Module parse failed: Unexpected token.

const opts = { ...options, };

const opts = { ...(_a = this.roomOptions) === null || _a === void 0 ? void 0 : _a.publishDefaults, ...options, };

this.options.audioCaptureDefaults = { ...audioDefaults, ...options === null || options === void 0 ? void 0 : options.audioCaptureDefaults, };
this.options.videoCaptureDefaults = { ...videoDefaults, ...options === null || options === void 0 ? void 0 : options.videoCaptureDefaults, };
this.options.publishDefaults = { ...publishDefaults, ...options === null || options === void 0 ? void 0 : options.publishDefaults, };

client should wait for track metadata to arrive before giving up on track matches

When a MediaTrack is received, the client may not have received the updated track metadata yet. The current check we have in place is naive and ignores any track that may have been added after the Participant has joined.

https://github.com/livekit/client-sdk-js/blob/main/src/room/participant/RemoteParticipant.ts#L40

Instead, the client should keep looking for the track for some period of time, and fire an error event if it couldn't be subscribed.

switchActiveDevice fails for audioinput on Safari

When using switchActiveDevice on Safari 15.0 (16612.1.29.41.4, 16612), the track will stop publishing abruptly and end with the following error in console:

A MediaStreamTrack ended due to a capture failure

track metadata

In v15.0 the name property of tracks was removed.

When publishing tracks from the server-sdk, I misused the name as metadata to let the app know what to do with that particular track. I'm not sure there is a need for an actual "name" property, but something that users can use as a place for arbitrary (meta)data on a track would be great!

Using codec unsupported by the server causes uncaught error in promise

Using v0.12.1 from npmjs

  1. Use a server that does not have h264 as an enabled codec
  2. Create a video track with videoCodec: "h264"
  3. Note that the track is not published to the server, but no error or warning is spawned
  4. Mute the video track
  5. Unmute the video track, the following error is thrown in the console log:
01:51:47.553 LocalTrack.js?0f51:77 restarting track with constraints {width: {…}, height: {…}, frameRate: {…}, deviceId: {…}, resolution: {…}}
01:51:47.568 LocalVideoTrack {_events: {…}, _eventsCount: 2, _maxListeners: undefined, attachedElements: Array(1), isMuted: false, …}
01:51:48.257 LocalTrack.js?0f51:92 re-acquired MediaStreamTrack
01:51:48.258 LocalVideoTrack.js?646b:6 Uncaught (in promise) Error
Promise.then (async)
step @ LocalVideoTrack.js?646b:7
eval @ LocalVideoTrack.js?646b:8
__awaiter @ LocalVideoTrack.js?646b:4
restartTrack @ LocalVideoTrack.js?646b:163
unmute @ LocalVideoTrack.js?646b:71
(anonymous) @ VM22733:1

`getTrack` never returns the ScreenShare publication

Hey ! 👋
First of all, a big thanks for LiveKit, it's really great and you did an awesome job 🎉

Here is the problem I encountered:

The function participant.getTrack never returns the ScreenShare publication when called this way: localParticipant.getTrack(Track.Source.ScreenShare).

In fact I was testing the React SDK a bit, especially with this component: https://github.com/livekit/livekit-react/blob/master/src/components/ControlsView.tsx

And when I clicked on "Share screen", the content of the button did not switch to "Stop sharing" as it should. So I investigated and found this bug that seems to happen on my PC, but I can't really say why.

So I did a monkeypatch of the getTrack function by adding this:

  // @ts-ignore
  room.localParticipant.getTrack = (
    source: Track.Source
  ): TrackPublication | undefined => {
    if (source === Track.Source.Unknown) {
      return;
    }
    for (const [, pub] of room.localParticipant.tracks) {
      if (pub.source === source) {
        return pub;
      }
      if (pub.source === Track.Source.Unknown) {
        if (
          source === Track.Source.Microphone &&
          pub.kind === Track.Kind.Audio &&
          pub.trackName !== "screen"
        ) {
          return pub;
        }
        if (
          source === Track.Source.Camera &&
          pub.kind === Track.Kind.Video &&
          pub.trackName !== "screen"
        ) {
          return pub;
        }
        if (
          source === Track.Source.ScreenShare &&
          pub.kind === Track.Kind.Video &&
          pub.trackName === "screen"
        ) {
          return pub;
        }

       // 👇👇👇 Added this condition
        if (
          source === Track.Source.ScreenShare &&
          pub.kind === Track.Kind.Video &&
          pub.track?.source === "screen_share"
        ) {
          return pub;
        }

        if (
          source === Track.Source.ScreenShareAudio &&
          pub.kind === Track.Kind.Audio &&
          pub.trackName === "screen"
        ) {
          return pub;
        }
      }
    }
  };

And now it does indeed work. The condition above the one I added didn't work because, in my case, the screen sharing publication always had an empty trackName (""). I didn't investigate further to understand why, but I'm at your disposal if you need more information.

If you would like me to submit a PR for this, it would be my pleasure!

Thanks again for your time.

Local peer metadata update feature

Can the client SDK expose a way for users to update the local peer's metadata from the client side itself, so that other users can receive that metadata?

Any new participants should get metadata for all existing users.

Why is there a dependency on updating it from the server side?
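
For context, the currently supported path goes through the server API; a sketch with the Node server SDK (values are placeholders):

    import { RoomServiceClient } from 'livekit-server-sdk';

    // A backend endpoint could call this when a client asks to change its own metadata.
    const svc = new RoomServiceClient('http://localhost:7880', 'devkey', 'secret');
    await svc.updateParticipant('my-first-room', 'user1', JSON.stringify({ handRaised: true }));
    // Other clients receive RoomEvent.ParticipantMetadataChanged with the new value.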

How to check whether a participant's camera or microphone is enabled?

Hi,

I have been stuck on an issue for 2 days. What I am trying to do is access a participant's isCameraEnabled and isMicrophoneEnabled properties. Because those are implemented as getters, they do not return true when I check them.

In the participantConnected(participant: Participant) method, I am trying to console.log participant.isMicrophoneEnabled. It returns false, but I already connected with the camera and mic enabled. Also, if I set isMicrophoneEnabled to true in the Vue Chrome extension, I can see my video on the screen. I also put some console logs in node_modules\livekit-client\dist\room\participant\Participant.js to see if it is triggered.

...
console.log(participant.isMicrophoneEnabled) // returns false
console.log(participant) //
...

node_modules\livekit-client\dist\room\participant\Participant.js
[screenshot omitted]

Neither console log works until I click the participant's isCameraEnabled in the console (you can see the images below).

Step 1: [screenshot omitted]

Step 2 - After I clicked (...) next to isCameraEnabled: [screenshots omitted]

I also tried room.participants.get('participant.sid'), but it is not initialized in the participantConnected function. I could also use that approach if it would work.
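
A pattern that sidesteps the timing issue is to re-read those getters whenever track events fire, instead of only inside participantConnected; a sketch:

    import { RoomEvent, Participant } from 'livekit-client';

    function updateIndicators(participant: Participant) {
      console.log(participant.identity, {
        camera: participant.isCameraEnabled,
        microphone: participant.isMicrophoneEnabled,
      });
    }

    room
      .on(RoomEvent.TrackPublished, (_pub, participant) => updateIndicators(participant))
      .on(RoomEvent.TrackMuted, (_pub, participant) => updateIndicators(participant))
      .on(RoomEvent.TrackUnmuted, (_pub, participant) => updateIndicators(participant));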

Thanks

Safari - "InvalidAccessError: Failed to set remote offer sdp: Failed to set remote video description send parameters."

When on Safari, a user is unable to see remote participant video tracks that are sent via h264 simulcast from browsers that have the 42001f profile available. As you guys have noted in the server code, the 42001f profile is unsupported in Safari: livekit/livekit@4ce2979

However, the baseline codec seems to be making it through from Chrome participants, at least. This is from a sendonly offer sdp grabbed from some browser logs:

v=0
o=- 7320374909748723303 1628044518 IN IP4 0.0.0.0
s=-
t=0 0
a=fingerprint:sha-256 AB:7E:F5:C3:31:EF:AE:94:02:E0:1A:97:20:B2:89:FF:8B:D1:D1:A2:1B:4F:C4:03:C8:8B:09:D7:B4:79:FF:2D
a=group:BUNDLE 0 1
m=video 9 UDP/TLS/RTP/SAVPF 102
c=IN IP4 0.0.0.0
a=setup:actpass
a=mid:0
a=ice-ufrag:jbJTpnfaRAcgQTad
a=ice-pwd:QcTqYYWKZRUbiixrLkeUVMwugbKDSJBe
a=rtcp-mux
a=rtcp-rsize
a=rtpmap:102 H264/90000
a=fmtp:102 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42001f
a=rtcp-fb:102 goog-remb
a=rtcp-fb:102 transport-cc
a=rtcp-fb:102 ccm fir
a=rtcp-fb:102 nack
a=rtcp-fb:102 nack pli
a=ssrc:464145329 cname:PA_TZk9wyEQB79u|TR_cZ7uqAwHHSXe
a=ssrc:464145329 msid:PA_TZk9wyEQB79u|TR_cZ7uqAwHHSXe TR_cZ7uqAwHHSXe
a=ssrc:464145329 mslabel:PA_TZk9wyEQB79u|TR_cZ7uqAwHHSXe
a=ssrc:464145329 label:TR_cZ7uqAwHHSXe
a=msid:PA_TZk9wyEQB79u|TR_cZ7uqAwHHSXe TR_cZ7uqAwHHSXe
a=sendonly
m=audio 9 UDP/TLS/RTP/SAVPF 111
c=IN IP4 0.0.0.0
a=setup:actpass
a=mid:1
a=ice-ufrag:jbJTpnfaRAcgQTad
a=ice-pwd:QcTqYYWKZRUbiixrLkeUVMwugbKDSJBe
a=rtcp-mux
a=rtcp-rsize
a=rtpmap:111 opus/48000/2
a=fmtp:111 minptime=10;useinbandfec=1
a=rtcp-fb:111 transport-cc
a=ssrc:2246747591 cname:PA_TZk9wyEQB79u|TR_HQewpt7DVKad
a=ssrc:2246747591 msid:PA_TZk9wyEQB79u|TR_HQewpt7DVKad TR_HQewpt7DVKad
a=ssrc:2246747591 mslabel:PA_TZk9wyEQB79u|TR_HQewpt7DVKad
a=ssrc:2246747591 label:TR_HQewpt7DVKad
a=msid:PA_TZk9wyEQB79u|TR_HQewpt7DVKad TR_HQewpt7DVKad
a=sendonly

I think the issue is here:

const selected = cap.codecs.find(

The codecs appear to be ordered in such a way that the first codec found will always be the baseline profile (42001f) if the browser has that codec. The .find might need to be a little more robust and also check that the sdpFmtpLine includes 42e01f when the videoCodec is h264 (if the sdpFmtpLine property exists; I think it might only be in Chrome and possibly Firefox). Here is a lazy hack I experimented with that gave me consistent cross-browser h264 simulcast behavior:

const selected = cap.codecs.find((c) => {
  const codec = c.mimeType.toLowerCase();
  const matchesVideoCodec = codec === `video/${videoCodec}`;

  return videoCodec === "h264" && c.sdpFmtpLine
    ? matchesVideoCodec && c.sdpFmtpLine.includes("42e01f")
    : matchesVideoCodec || codec === "audio/opus";
});

TrackSubscribed event sent before connect is completed

When a user tries to connect while publishing tracks at the same time, i.e. with: connect(..., {audio: true, video: true}), it's possible for the browser to pop up a permission prompt. When this happens, remote TrackSubscribed events would be missed.

A workaround for now is to publish video and audio only after the initial connection is made.
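
With the Room-based API, that means connecting first and enabling devices afterwards, roughly:

    import { Room } from 'livekit-client';

    const room = new Room();
    await room.connect(url, token);
    // Enabling devices after connect avoids missing TrackSubscribed events
    // while the permission prompt is open.
    await room.localParticipant.setCameraEnabled(true);
    await room.localParticipant.setMicrophoneEnabled(true);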

Calling `publishTracks` and then `unPublishTracks` results in the peer connection senders growing uncontrollably

I have implemented the code to call unpublishTracks in our app. When I join and leave (calling publishTracks and then calling unpublishTracks on the same track objects passed to publishTracks), it results in the following sequence of log messages:

unpublishTrack: removing the track from 2 senders 
unpublishTrack: removing the track from 4 senders 
unpublishTrack: removing the track from 6 senders 

const senders = this.engine.peerConn.getSenders();

Support for Screen Sharing audio

Hello! Congrats on this awesome project. I've been trying several open source solutions for WebRTC, but LiveKit really stands out for its simplicity and flexibility. The documentation is great too.

It would be great to support audio when requesting a screen sharing stream.

I don't know whether this is better suited as an option or as the default.

navigator.mediaDevices.getDisplayMedia({ audio: true, video: true })

Because to receive an audio track the user needs to explicitly check "share audio" anyway.
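
For reference, recent SDK versions accept screen-share capture options with an audio flag; the exact shape below should be checked against your installed version:

    // Assumed API: ScreenShareCaptureOptions with an audio flag in newer SDK versions.
    await room.localParticipant.setScreenShareEnabled(true, { audio: true });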

Regards

track cannot be muted before/immediately after publishing

Steps to reproduce:

  1. Create your tracks using createLocalTracks
  2. Mute the audio track that you have created using audioTrack.mute(), leave the video track unmuted
  3. Connect to your livekit server using connect() and pass the existing tracks
  4. Connect another client to the server - observe that audio is correctly not transmitting
  5. Mute your video track using videoTrack.mute()
  6. Your audio will start transmitting.

(This will also happen if immediately muting a track after the connect has succeeded).

I believe this is caused by the mute state not being sent to the engine when initially adding a track. So, when the video track gets muted, LocalParticipant.updateInfo is called, which includes:

// match local track mute status to server
info.tracks.forEach((ti) => {
  const pub = <LocalTrackPublication> this.tracks.get(ti.sid);
  if (!pub) {
    return;
  }
  if (ti.muted && !pub.isMuted) {
    pub.mute();
  } else if (!ti.muted && pub.isMuted) {
    pub.unmute();
  }
});
This unmutes the local track.

This can be tested with the sample code by adding the mute step to this part of the code:

} else if (publication.kind === Track.Kind.Audio) {
  // skip adding local audio track, to avoid your own sound
  // only process local video tracks
  audioTrack = <LocalAudioTrack>publication.track;
}

Making it:

    } else if (publication.kind === Track.Kind.Audio) {
      // skip adding local audio track, to avoid your own sound
      // only process local video tracks
      audioTrack = <LocalAudioTrack>publication.track;
      appendLog('muting audio');
      audioTrack.mute();
    }

I ran into this when working on an app that saves the user's mute state between sessions, so a track is immediately muted before/during connection if that is the state they last used.

Two requests/suggestions:

  1. Reverse the logic of LocalParticipant.updateInfo to push the local state to the server instead of pulling local state from the server. I don't think the server mute state should ever override the local mute state of a track.
  2. Push the mute state of a track to the server at the time it is added.

I can try to make a PR for this soon, but I'm still trying to figure out the best way to pass the mute state during the track add step.

React Native SDK timeline

It would be great if we could get a rough timeline for when to expect the React Native SDK, as it will help us figure out whether we can use LiveKit.

can't use/build it in an angular app

Hi!

I'm not able to use the client-sdk in a simple angular app.

reproduce

npm install -g @angular/cli
ng new my-app
cd my-app
npm install livekit-client --save

edit src/main.ts by adding

import {
  Room
} from 'livekit-client';

to the imports, then run

ng build

error

On my Linux machine, this results in the following build error:

✔ Browser application bundle generation complete.

Error: node_modules/livekit-client/dist/logger.d.ts:11:30 - error TS2503: Cannot find namespace 'log'.

11 declare const livekitLogger: log.Logger;
                                ~~~


Error: node_modules/livekit-client/dist/proto/livekit_models.d.ts:1:8 - error TS1192: Module '"projects/livekit/angular-issue/issue/node_modules/protobufjs/minimal"' has no default export.

1 import _m0 from "protobufjs/minimal";
         ~~~


Error: node_modules/livekit-client/dist/proto/livekit_rtc.d.ts:1:8 - error TS1192: Module '"projects/livekit/angular-issue/issue/node_modules/protobufjs/minimal"' has no default export.

1 import _m0 from "protobufjs/minimal";
         ~~~

Versions

OS: linux

ng --version:

Angular CLI: 13.2.6
Node: 16.14.0
Package Manager: npm 8.3.1
OS: linux x64

Angular: 13.2.6
... animations, cli, common, compiler, compiler-cli, core, forms
... platform-browser, platform-browser-dynamic, router

Package                         Version
---------------------------------------------------------
@angular-devkit/architect       0.1302.6
@angular-devkit/build-angular   13.2.6
@angular-devkit/core            13.2.6
@angular-devkit/schematics      13.2.6
@schematics/angular             13.2.6
rxjs                            7.5.5
typescript                      4.5.5

potential fixes

replace import _m0 from "protobufjs/minimal"; with import * as _m0 from "protobufjs/minimal"; in the files proto/livekit_rtc.d.ts and livekit_models.d.ts

replace import log from 'loglevel'; with import * as log from 'loglevel'; in the file logger.d.ts

Thanks

Thanks for your time and effort 😃

Unable to use livekit client with http: protocol.

Hi, I am unable to use LiveKit with the http: protocol. I have a site that is served over both https and http. Everything works fine on https, but when I open it over http, LiveKit never connects. I don't need anything special in the http case; I only need data channels, so I don't want to use media devices. The problem is inside livekit-client:

        if (isWeb()) {
          window.addEventListener('beforeunload', this.onBeforeUnload);
          navigator.mediaDevices.addEventListener('devicechange', this.handleDeviceChange); // <-- HERE
        }

        resolve(this);

Over the http protocol, navigator.mediaDevices is not defined, so resolve is never called.

Mute/Unmute events are not fired

Hello!

I just discovered that when a local user mutes a track with trackPublication.mute(), only localParticipant.onTrackMuted is fired locally, so remote peers aren't notified when this happens.

I tried to subscribe to remoteParticipant.on(ParticipantEvent.TrackMuted) and room.on(RoomEvent.TrackMuted) without luck.

Without this, there is no reactive way to update a UI when a peer user mutes/unmutes a track. (The same goes for unmute)

Additionally, version 12.0.1 of the client-sdk-js stopped emitting localParticipant.onTrackMuted when a participant is muted by the server.

Regards.

Video Mute/Unmute event is not working

When I mute/unmute my video, the corresponding event doesn't get fired.

This only happens with video; for audio, the mute/unmute event is triggered properly.

I am using the latest version of livekit-client 1.11.1

---- Mute ----
(pub as LocalTrackPublication).mute();

---- Unmute ----
videoPub.forEach(pub => {
  (pub as LocalTrackPublication).unmute();
});

This is how I am doing the mute/unmute.

Use an own logger instance within livekit

When the application that uses the client-sdk also uses loglevel as its logging library, the log level is currently set globally for the whole application, effectively allowing only one log level for everything.
The desired solution would be to be able to say something like "give me log level DEBUG for my code, but LiveKit should use log level WARN".
It would IMHO be more developer friendly if LiveKit used its own logger instance of loglevel, making it easy to switch between the desired logging outputs.

From the loglevel docs:

A log.getLogger(loggerName) method.
This gets you a new logger object that works exactly like the root log object, but can have its level and logging methods set independently. All loggers must have a name (which is a non-empty string, or a Symbol). Calling getLogger() multiple times with the same name will return an identical logger object.
In large applications, it can be incredibly useful to turn logging on and off for particular modules as you are working with them. Using the getLogger() method lets you create a separate logger for each part of your application with its own logging level.
Likewise, for small, independent modules, using a named logger instead of the default root logger allows developers using your module to selectively turn on deep, trace-level logging when trying to debug problems, while logging only errors or silencing logging altogether under normal circumstances.
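
For illustration, the named-logger pattern with loglevel looks like this (the 'livekit' logger name is just an example):

    import log from 'loglevel';

    // Application code keeps its own verbose logger...
    const appLogger = log.getLogger('my-app');
    appLogger.setLevel('debug');

    // ...while a dedicated LiveKit logger could be dialed down independently.
    const livekitLogger = log.getLogger('livekit');
    livekitLogger.setLevel('warn');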

I would be happy to submit a PR if this is something that you are open to!

When I upgrade from 0.17.0 to 0.18.2, the project reports an error (node v14.18.2, webpack, Vue CLI 4). How do I deal with it?

./node_modules/livekit-client/dist/livekit-client.esm.js

Module parse failed: Unexpected token (4675:48)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
| creationTime: isSet$1(object.creationTime) ? Number(object.creationTime) : 0,
| turnPassword: isSet$1(object.turnPassword) ? String(object.turnPassword) : '',

        enabledCodecs: Array.isArray(object?.enabledCodecs)

| ? object.enabledCodecs.map((e) => Codec.fromJSON(e))
| : [],

@ ./node_modules/cache-loader/dist/cjs.js??ref--13-0!./node_modules/babel-loader/lib!./node_modules/cache-loader/dist/cjs.js??ref--1-0!./node_modules/vue-loader/lib??vue-loader-options!./src/components/meeting/index.vue?vue&type=script&lang=js& 88:0-64 234:96-101 241:20-32 401:22-31 401:88-97 401:160-169 401:216-225 401:276-285 575:13-22 575:78-87 575:149-158 575:204-213 575:263-272
@ ./src/components/meeting/index.vue?vue&type=script&lang=js&
@ ./src/components/meeting/index.vue
@ ./node_modules/cache-loader/dist/cjs.js??ref--13-0!./node_modules/babel-loader/lib!./node_modules/cache-loader/dist/cjs.js??ref--1-0!./node_modules/vue-loader/lib??vue-loader-options!./src/views/meeting.vue?vue&type=script&lang=js&
@ ./src/views/meeting.vue?vue&type=script&lang=js&
@ ./src/views/meeting.vue

iOS Safari doesn't connect to a room.

SDK version: 0.17.1
LiveKit server version: 15.5
iOS version: 15.3

Only in iOS Safari is the client unable to connect to a room. If the client connects without publishing any media stream, the connection eventually happens after some time. But if there is at least one published media stream, it seems the client can't finish the connection.

Failed calls to `connect` do not clean up peer connections

I have a situation where our livekit-server is responding to all connect calls with 500s, which is not erroneous behavior per se (our Redis server is having issues, so the 500 is expected), but the way the client handles multiple calls to connect seems problematic:

[screenshot omitted]

As you can see, each failed call to connect keeps a peer connection around.

The client side error:

RTCClient.js:19 WebSocket connection to 'wss://livekit-server.zeet.app/rtc?access_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ2aWRlbyI6eyJyb29tIjoiMTQ1ZTIxZWYtZGZlZS00NGNlLTkzOWYtZWFjNDNlZGRhODI4Iiwicm9vbUpvaW4iOnRydWV9LCJpYXQiOjE2MTIzOTY1NDQsIm5iZiI6MTYxMjM5NjU0NCwiZXhwIjoxNjEyNDEwOTQ0LCJpc3MiOiJ0ZXN0IiwianRpIjoiMHhGRkM4MGJkMkE0MTNmMzdFMTI1RDM5QzI4MUNjODVCODhkY2ViRjIwIn0.bHbzPEU2Cg8ZN9joMr8UYXQok7fpPQz82v1uVWa_pNg' failed: Error during WebSocket handshake: Unexpected response code: 500

The err message: Could not connect

Chrome in iOS

Hi, in iOS 14.3 Google Chrome supports WebRTC. I have tried your library but it doesn't work. How can I get around this?

Signal reconnect logic disconnects the room

When the websocket connection gets disconnected and the reconnect logic is triggered, the engine also triggers a disconnect. So, even when the reconnection succeeds, the engine & room get disconnected.

This appears to be coming from here:

this.client.onLeave = () => {
  this.close();
  this.emit(EngineEvent.Disconnected);
};

Which is coming from here:

} else if (msg.leave) {
  if (this.onLeave) {
    this.onLeave();
  }
}

I stopped tracing back at this point to see where the root cause is, but it looks like a lot of this logic was added in this commit: 7cdc80b.

I had this happen in a real setup when there was a network interruption, but I think a clean way to test it manually is to call room.engine.client.ws.close().

RoomEvent.StateChanged event

In order to make it easier to subscribe to all room state changes, it would be nice to expose this with a dedicated event.

room.on(RoomEvent.StateChanged, handleRoomStateChanged);
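
Newer SDK versions expose something close to this as ConnectionStateChanged; a sketch (verify the event name against your version):

    import { RoomEvent, ConnectionState } from 'livekit-client';

    room.on(RoomEvent.ConnectionStateChanged, (state: ConnectionState) => {
      console.log('room state is now', state);
    });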

Parameter deviceId doesn't work

The deviceId parameter doesn't work when connecting to rooms, but it works in version 14.2.

return {
  audio: true,
  video: {
    frameRate: { ideal: 24, max: 60 },
    width: 1920,
    height: 1080,
    deviceId: videoDeviceId,
  },
};
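
For comparison, with recent SDK versions device selection is usually passed through RoomOptions.videoCaptureDefaults rather than raw getUserMedia constraints; a sketch (videoDeviceId is a placeholder):

    import { Room } from 'livekit-client';

    const room = new Room({
      videoCaptureDefaults: {
        deviceId: videoDeviceId,
        resolution: { width: 1920, height: 1080, frameRate: 24 },
      },
    });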
