
LiveKit Flutter SDK

Use this SDK to add real-time video, audio and data features to your Flutter app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.

This package is published to pub.dev as livekit_client.

Docs

More docs and guides are available at https://docs.livekit.io.

Currently supported features

Platform   Subscribe/Publish   Simulcast   Background audio   Screen sharing   End-to-End Encryption
Web        🟢                  🟢          🟢                  🟢               🟢
iOS        🟢                  🟢          🟢                  🟢               🟢
Android    🟢                  🟢          🟢                  🟢               🟢
Mac        🟢                  🟢          🟢                  🟢               🟢
Windows    🟢                  🟢          🟢                  🟢               🟢
Linux      🟢                  🟢          🟢                  🟢               🟢

🟢 = Available

🟡 = Coming soon (Work in progress)

🔴 = Not currently available (Possibly in the future)

Example app

We built a multi-user conferencing app as an example in the example/ folder. You can join the same room from any supported LiveKit clients.

Installation

Add this package to your pubspec.yaml:

---
dependencies:
  livekit_client: <version>
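Alternatively, you can add the dependency from the command line (this assumes the Flutter CLI is on your PATH):

```shell
# Adds the latest compatible version of livekit_client to pubspec.yaml
flutter pub add livekit_client
```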

iOS

Camera and microphone usage must be declared in your Info.plist file.

<dict>
  ...
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your camera</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your microphone</string>

Your application can continue a voice call while switched to the background if background mode is enabled. Select the app target in Xcode, click the Capabilities tab, enable Background Modes, and check Audio, AirPlay, and Picture in Picture.

Your Info.plist should have the following entries.

<dict>
  ...
  <key>UIBackgroundModes</key>
  <array>
    <string>audio</string>
  </array>

Notes

Since Xcode 14 no longer supports 32-bit builds, and our latest version is based on libwebrtc m104+, the iOS framework no longer supports 32-bit builds. We strongly recommend upgrading to Flutter 3.3.0+. If you are using Flutter 3.0.0 or below, there is a high chance that your Flutter app cannot be compiled correctly due to the missing i386 and arm 32-bit frameworks (#132, #172).

You can try to modify your {projects_dir}/ios/Podfile to fix this issue.

post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)

    target.build_configurations.each do |config|

      # Workaround for https://github.com/flutter/flutter/issues/64502
      config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES' # <= this line

    end
  end
end

For iOS, the minimum supported deployment target is 12.1. You will need to add the following to your Podfile.

platform :ios, '12.1'

You may need to delete Podfile.lock and re-run pod install after updating the deployment target.

Android

We require a set of permissions that need to be declared in your AndroidManifest.xml. These are required by Flutter WebRTC, which we depend on.

<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.your.package">
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
  <uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30" />
  ...
</manifest>

To use Bluetooth headsets correctly on Android devices, you need to add permission_handler to your project and call the following code after launching your app for the first time.

import 'package:permission_handler/permission_handler.dart';

Future<void> _checkPermissions() async {
  var status = await Permission.bluetooth.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Permission disabled');
  }
  status = await Permission.bluetoothConnect.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Connect Permission disabled');
  }
}

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await _checkPermissions();
  runApp(MyApp());
}

Desktop support

In order to enable Flutter desktop development, please follow instructions here.

On M1 Macs, you will also need to install the x86_64 version of FFI:

sudo arch -x86_64 gem install ffi

On Windows, VS 2019 is needed (the link in the Flutter docs downloads VS 2022).

Usage

Connecting to a room, publish video & audio

final roomOptions = RoomOptions(
  adaptiveStream: true,
  dynacast: true,
  // ... your room options
);

final room = await LiveKitClient.connect(url, token, roomOptions: roomOptions);
try {
  // video will fail when running in ios simulator
  await room.localParticipant.setCameraEnabled(true);
} catch (error) {
  print('Could not publish video, error: $error');
}

await room.localParticipant.setMicrophoneEnabled(true);
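When you are done with a call, disconnect from the room and release its resources. A minimal sketch, assuming Room exposes disconnect() and dispose() as in this SDK's example app:

```dart
// Leave the room and free the underlying native resources.
await room.disconnect();
await room.dispose();
```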

Screen sharing

Screen sharing is supported across all platforms. You can enable it with:

room.localParticipant.setScreenShareEnabled(true);

Android

On Android, you must define a foreground service in your AndroidManifest.xml.

<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <application>
    ...
    <service
        android:name="de.julianassmann.flutter_background.IsolateHolderService"
        android:enabled="true"
        android:exported="false"
        android:foregroundServiceType="mediaProjection" />
  </application>
</manifest>

iOS

On iOS, a broadcast extension is needed in order to capture screen content from other apps. See setup guide for instructions.

Desktop (Windows/macOS)

On desktop, you can use ScreenSelectDialog to select the window or screen you want to share.

try {
  final source = await showDialog<DesktopCapturerSource>(
    context: context,
    builder: (context) => ScreenSelectDialog(),
  );
  if (source == null) {
    print('cancelled screenshare');
    return;
  }
  print('DesktopCapturerSource: ${source.id}');
  var track = await LocalVideoTrack.createScreenShareTrack(
    ScreenShareCaptureOptions(
      sourceId: source.id,
      maxFrameRate: 15.0,
    ),
  );
  await room.localParticipant.publishVideoTrack(track);
} catch (e) {
  print('could not publish screen sharing: $e');
}

End to End Encryption

LiveKit supports end-to-end encryption for audio/video data sent over the network. On native platforms, E2EE works without any additional settings; for Flutter web, follow these steps to create the e2ee.worker.dart.js file.

# for the example app
dart compile js web/e2ee.worker.dart -o example/web/e2ee.worker.dart.js
# for your project
export YOUR_PROJECT_DIR=your_project_dir
git clone https://github.com/livekit/client-sdk-flutter.git
cd client-sdk-flutter && flutter pub get
dart compile js web/e2ee.worker.dart -o ${YOUR_PROJECT_DIR}/web/e2ee.worker.dart.js
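Once the worker file is in place, E2EE is typically enabled by passing encryption options when connecting. A hedged sketch — E2EEOptions and BaseKeyProvider are the names used in this SDK's example code, but verify them against the current API docs:

```dart
import 'package:livekit_client/livekit_client.dart';

Future<Room> connectWithE2EE(String url, String token) async {
  // Assumption: all participants must share the same key (passphrase)
  // for decryption to work.
  final keyProvider = await BaseKeyProvider.create();
  await keyProvider.setKey('my-shared-secret');

  final roomOptions = RoomOptions(
    e2eeOptions: E2EEOptions(keyProvider: keyProvider),
  );
  return await LiveKitClient.connect(url, token, roomOptions: roomOptions);
}
```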

Advanced track manipulation

The setCameraEnabled/setMicrophoneEnabled helpers are wrappers around the Track API.

You can also manually create and publish tracks:

var localVideo = await LocalVideoTrack.createCameraTrack();
await room.localParticipant.publishVideoTrack(localVideo);
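Audio tracks can likewise be created and published manually; a sketch assuming LocalAudioTrack.create and publishAudioTrack mirror the video API shown above:

```dart
// Create a microphone track with default capture options and publish it.
final localAudio = await LocalAudioTrack.create(AudioCaptureOptions());
await room.localParticipant.publishAudioTrack(localAudio);
```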

Rendering video

Each track can be rendered separately with the provided VideoTrackRenderer widget.

VideoTrack? track;

@override
Widget build(BuildContext context) {
  if (track != null) {
    return VideoTrackRenderer(track);
  } else {
    return Container(
      color: Colors.grey,
    );
  }
}

Audio handling

Audio tracks are played automatically as long as you are subscribed to them.

Handling changes

LiveKit client makes it simple to build declarative UI that reacts to state changes. It notifies changes in two ways:

  • ChangeNotifier - generic notification of changes. This is useful when you are building reactive UI and only care about changes that may impact rendering.
  • EventsListener<Event> - listener pattern to listen to specific events (see events.dart).

This example will show you how to use both to react to room events.

class RoomWidget extends StatefulWidget {
  final Room room;

  RoomWidget(this.room);

  @override
  State<StatefulWidget> createState() {
    return _RoomState();
  }
}

class _RoomState extends State<RoomWidget> {
  late final EventsListener<RoomEvent> _listener = widget.room.createListener();

  @override
  void initState() {
    super.initState();
    // used for generic change updates
    widget.room.addListener(_onChange);

    // used for specific events
    _listener
      ..on<RoomDisconnectedEvent>((_) {
        // handle disconnect
      })
      ..on<ParticipantConnectedEvent>((e) {
        print("participant joined: ${e.participant.identity}");
      });
  }

  @override
  void dispose() {
    // be sure to dispose listener to stop listening to further updates
    _listener.dispose();
    widget.room.removeListener(_onChange);
    super.dispose();
  }

  void _onChange() {
    // perform computations and then call setState
    // setState will trigger a build
    setState(() {
      // your updates here
    });
  }

  @override
  Widget build(BuildContext context) {
    // your build function
  }
}

Similarly, you could do the same when rendering participants. Reacting to changes makes it possible to handle tracks published/unpublished or re-ordering participants in your UI.

class VideoView extends StatefulWidget {
  final Participant participant;

  VideoView(this.participant);

  @override
  State<StatefulWidget> createState() {
    return _VideoViewState();
  }
}

class _VideoViewState extends State<VideoView> {
  TrackPublication? videoPub;

  @override
  void initState() {
    super.initState();
    widget.participant.addListener(this._onParticipantChanged);
    // trigger initial change
    _onParticipantChanged();
  }

  @override
  void dispose() {
    widget.participant.removeListener(this._onParticipantChanged);
    super.dispose();
  }

  @override
  void didUpdateWidget(covariant VideoView oldWidget) {
    oldWidget.participant.removeListener(_onParticipantChanged);
    widget.participant.addListener(_onParticipantChanged);
    _onParticipantChanged();
    super.didUpdateWidget(oldWidget);
  }

  void _onParticipantChanged() {
    var subscribedVideos = widget.participant.videoTracks.values.where((pub) {
      return pub.kind == TrackType.VIDEO &&
          !pub.isScreenShare &&
          pub.subscribed;
    });

    setState(() {
      if (subscribedVideos.isNotEmpty) {
        var videoPub = subscribedVideos.first;
        // when muted, show placeholder
        if (!videoPub.muted) {
          this.videoPub = videoPub;
          return;
        }
      }
      this.videoPub = null;
    });
  }

  @override
  Widget build(BuildContext context) {
    var videoPub = this.videoPub;
    if (videoPub != null) {
      return VideoTrackRenderer(videoPub.track as VideoTrack);
    } else {
      return Container(
        color: Colors.grey,
      );
    }
  }
}

Mute, unmute local tracks

On LocalTrackPublications, you can control whether the track is muted by setting its muted property. Changing the mute status will generate an onTrackMuted or onTrackUnmuted delegate call for the local participant. Other participants will receive the status change as well.

// mute track
trackPub.muted = true;

// unmute track
trackPub.muted = false;

Subscriber controls

When subscribing to remote tracks, the client has precise control over the status of its subscriptions. You can subscribe to or unsubscribe from a track, change its quality, or disable the track temporarily.

These controls are accessible on the RemoteTrackPublication object.

For more info, see Subscriber controls.
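As a hedged sketch of what these controls look like on a RemoteTrackPublication — the method names below follow this SDK's publication API, but check the current docs before relying on them:

```dart
// pub is a RemoteTrackPublication obtained from a remote participant.

// Stop receiving this track's data entirely, then re-subscribe later.
await pub.unsubscribe();
await pub.subscribe();

// For video, request a lower simulcast layer to save bandwidth.
await pub.setVideoQuality(VideoQuality.LOW);

// Temporarily pause delivery without tearing down the subscription.
await pub.disable();
await pub.enable();
```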

Getting help / Contributing

Please join us on Slack to get help from our devs and community members. We welcome your contributions (PRs), and details can be discussed there.

License

Apache License 2.0

Thanks

A huge thank you to flutter-webrtc for making it possible to use WebRTC in Flutter.


LiveKit Ecosystem
Client SDKs: Components · JavaScript · Rust · iOS/macOS · Android · Flutter · Unity (web) · React Native (beta)
Server SDKs: Node.js · Golang · Ruby · Java/Kotlin · PHP (community) · Python (community)
Services: LiveKit server · Egress · Ingress
Resources: Docs · Example apps · Cloud · Self-hosting · CLI
