LiveKit Flutter SDK

Use this SDK to add real-time video, audio and data features to your Flutter app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.

This package is published to pub.dev as livekit_client.

Docs

More docs and guides are available at https://docs.livekit.io.

Current supported features

Feature   Subscribe/Publish   Simulcast   Background audio   Screen sharing   End-to-End Encryption
Web       🟒                   🟒          🟒                  🟒               🟒
iOS       🟒                   🟒          🟒                  🟒               🟒
Android   🟒                   🟒          🟒                  🟒               🟒
Mac       🟒                   🟒          🟒                  🟒               🟒
Windows   🟒                   🟒          🟒                  🟒               🟒
Linux     🟒                   🟒          🟒                  🟒               🟒

🟒 = Available

🟑 = Coming soon (Work in progress)

πŸ”΄ = Not currently available (Possibly in the future)

Example app

We built a multi-user conferencing app as an example in the example/ folder. You can join the same room from any supported LiveKit client.

Installation

Add this package to your pubspec.yaml:

---
dependencies:
  livekit_client: <version>
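
Then run flutter pub get and import the client library where you need it:

import 'package:livekit_client/livekit_client.dart';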

iOS

Camera and microphone usage must be declared in your Info.plist file.

<dict>
  ...
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your camera</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your microphone</string>

If background mode is enabled, your application can keep a voice call running after it is switched to the background. Select the app target in Xcode, click the Capabilities tab, enable Background Modes, and check Audio, AirPlay, and Picture in Picture.

Your Info.plist should have the following entries.

<dict>
  ...
  <key>UIBackgroundModes</key>
  <array>
    <string>audio</string>
  </array>

Notes

Since Xcode 14 no longer supports 32-bit builds, and our latest version is based on libwebrtc m104+, the iOS framework no longer supports 32-bit builds. We strongly recommend upgrading to Flutter 3.3.0+. If you are using Flutter 3.0.0 or below, there is a high chance that your Flutter app cannot be compiled correctly due to the missing i386 and 32-bit ARM frameworks (#132, #172).

You can try to modify your {projects_dir}/ios/Podfile to fix this issue.

post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)

    target.build_configurations.each do |config|

      # Workaround for https://github.com/flutter/flutter/issues/64502
      config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES' # <= this line

    end
  end
end

For iOS, the minimum supported deployment target is 12.1. You will need to add the following to your Podfile.

platform :ios, '12.1'

You may need to delete Podfile.lock and re-run pod install after updating the deployment target.

Android

We require a set of permissions that need to be declared in your AndroidManifest.xml. These are required by Flutter WebRTC, which we depend on.

<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.your.package">
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
  <uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30" />
  ...
</manifest>

To use Bluetooth headsets correctly on Android devices, you need to add permission_handler to your project and call the following code after launching your app for the first time.

import 'package:permission_handler/permission_handler.dart';

Future<void> _checkPermissions() async {
  var status = await Permission.bluetooth.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Permission disabled');
  }
  status = await Permission.bluetoothConnect.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Connect Permission disabled');
  }
}

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await _checkPermissions();
  runApp(MyApp());
}

Desktop support

To enable Flutter desktop development, please follow the instructions here.

On M1 Macs, you will also need to install the x86_64 version of FFI:

sudo arch -x86_64 gem install ffi

On Windows, VS 2019 is needed (the link in the Flutter docs will download VS 2022).

Usage

Connecting to a room, publishing video & audio

final roomOptions = RoomOptions(
  adaptiveStream: true,
  dynacast: true,
  // ... your room options
);

final room = await LiveKitClient.connect(url, token, roomOptions: roomOptions);
try {
  // video will fail when running in ios simulator
  await room.localParticipant.setCameraEnabled(true);
} catch (error) {
  print('Could not publish video, error: $error');
}

await room.localParticipant.setMicrophoneEnabled(true);
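
When the session is over, the room can be disconnected explicitly (a minimal sketch, reusing the room object from the snippet above):

// leave the room and close the connection
await room.disconnect();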

Screen sharing

Screen sharing is supported across all platforms. You can enable it with:

room.localParticipant.setScreenShareEnabled(true);

Android

On Android, you will need to define a foreground service in your AndroidManifest.xml.

<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <application>
    ...
    <service
        android:name="de.julianassmann.flutter_background.IsolateHolderService"
        android:enabled="true"
        android:exported="false"
        android:foregroundServiceType="mediaProjection" />
  </application>
</manifest>

iOS

On iOS, a broadcast extension is needed in order to capture screen content from other apps. See the setup guide for instructions.

Desktop (Windows/macOS)

On desktop, you can use ScreenSelectDialog to select the window or screen you want to share.

try {
  final source = await showDialog<DesktopCapturerSource>(
    context: context,
    builder: (context) => ScreenSelectDialog(),
  );
  if (source == null) {
    print('cancelled screenshare');
    return;
  }
  print('DesktopCapturerSource: ${source.id}');
  var track = await LocalVideoTrack.createScreenShareTrack(
    ScreenShareCaptureOptions(
      sourceId: source.id,
      maxFrameRate: 15.0,
    ),
  );
  await room.localParticipant.publishVideoTrack(track);
} catch (e) {
  print('could not publish screen sharing: $e');
}

End to End Encryption

LiveKit supports end-to-end encryption for audio/video data sent over the network. On native platforms, E2EE works by default without any extra settings; for Flutter web, you need the following steps to create the e2ee.worker.dart.js file.

# for the example app
dart compile js web/e2ee.worker.dart -o example/web/e2ee.worker.dart.js
# for your project
export YOUR_PROJECT_DIR=your_project_dir
git clone https://github.com/livekit/client-sdk-flutter.git
cd client-sdk-flutter && flutter pub get
dart compile js web/e2ee.worker.dart -o ${YOUR_PROJECT_DIR}/web/e2ee.worker.dart.js
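
With the worker file in place, E2EE is enabled through the room options. This is a rough sketch based on the SDK's example app; E2EEOptions, BaseKeyProvider.create(), and setKey() are assumed names and may differ between versions:

// assumed API: a shared-key provider wired into the room options
final keyProvider = await BaseKeyProvider.create();
await keyProvider.setKey('your-shared-secret'); // every participant must use the same key
final roomOptions = RoomOptions(
  e2eeOptions: E2EEOptions(keyProvider: keyProvider),
);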

Advanced track manipulation

The setCameraEnabled/setMicrophoneEnabled helpers are wrappers around the Track API.

You can also manually create and publish tracks:

var localVideo = await LocalVideoTrack.createCameraTrack();
await room.localParticipant.publishVideoTrack(localVideo);
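
Audio can be handled the same way (a short sketch; assumes LocalAudioTrack.create() and publishAudioTrack() mirror the camera example above):

// create a microphone track and publish it manually
var localAudio = await LocalAudioTrack.create();
await room.localParticipant.publishAudioTrack(localAudio);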

Rendering video

Each track can be rendered separately with the provided VideoTrackRenderer widget.

VideoTrack? track;

@override
Widget build(BuildContext context) {
  if (track != null) {
    return VideoTrackRenderer(track);
  } else {
    return Container(
      color: Colors.grey,
    );
  }
}

Audio handling

Audio tracks are played automatically as long as you are subscribed to them.

Handling changes

LiveKit client makes it simple to build declarative UI that reacts to state changes. It notifies changes in two ways:

  • ChangeNotifier - generic notification of changes. This is useful when you are building reactive UI and only care about changes that may impact rendering.
  • EventsListener<Event> - listener pattern to listen to specific events (see events.dart).

This example will show you how to use both to react to room events.

class RoomWidget extends StatefulWidget {
  final Room room;

  RoomWidget(this.room);

  @override
  State<StatefulWidget> createState() {
    return _RoomState();
  }
}

class _RoomState extends State<RoomWidget> {
  late final EventsListener<RoomEvent> _listener = widget.room.createListener();

  @override
  void initState() {
    super.initState();
    // used for generic change updates
    widget.room.addListener(_onChange);

    // used for specific events
    _listener
      ..on<RoomDisconnectedEvent>((_) {
        // handle disconnect
      })
      ..on<ParticipantConnectedEvent>((e) {
        print("participant joined: ${e.participant.identity}");
      });
  }

  @override
  void dispose() {
    // be sure to dispose listener to stop listening to further updates
    _listener.dispose();
    widget.room.removeListener(_onChange);
    super.dispose();
  }

  void _onChange() {
    // perform computations and then call setState
    // setState will trigger a build
    setState(() {
      // your updates here
    });
  }

  @override
  Widget build(BuildContext context) {
    // your build function
  }
}

Similarly, you could do the same when rendering participants. Reacting to changes makes it possible to handle tracks published/unpublished or re-ordering participants in your UI.

class VideoView extends StatefulWidget {
  final Participant participant;

  VideoView(this.participant);

  @override
  State<StatefulWidget> createState() {
    return _VideoViewState();
  }
}

class _VideoViewState extends State<VideoView> {
  TrackPublication? videoPub;

  @override
  void initState() {
    super.initState();
    widget.participant.addListener(this._onParticipantChanged);
    // trigger initial change
    _onParticipantChanged();
  }

  @override
  void dispose() {
    widget.participant.removeListener(this._onParticipantChanged);
    super.dispose();
  }

  @override
  void didUpdateWidget(covariant VideoView oldWidget) {
    oldWidget.participant.removeListener(_onParticipantChanged);
    widget.participant.addListener(_onParticipantChanged);
    _onParticipantChanged();
    super.didUpdateWidget(oldWidget);
  }

  void _onParticipantChanged() {
    var subscribedVideos = widget.participant.videoTracks.values.where((pub) {
      return pub.kind == TrackType.VIDEO &&
          !pub.isScreenShare &&
          pub.subscribed;
    });

    setState(() {
      if (subscribedVideos.isNotEmpty) {
        var videoPub = subscribedVideos.first;
        // when muted, show placeholder
        if (!videoPub.muted) {
          this.videoPub = videoPub;
          return;
        }
      }
      this.videoPub = null;
    });
  }

  @override
  Widget build(BuildContext context) {
    var videoPub = this.videoPub;
    if (videoPub != null) {
      return VideoTrackRenderer(videoPub.track as VideoTrack);
    } else {
      return Container(
        color: Colors.grey,
      );
    }
  }
}

Mute, unmute local tracks

On LocalTrackPublications, you can control whether the track is muted by setting its muted property. Changing the mute status will generate an onTrackMuted or onTrackUnmuted delegate call for the local participant. Other participants will receive the status change as well.

// mute track
trackPub.muted = true;

// unmute track
trackPub.muted = false;

Subscriber controls

When subscribing to remote tracks, the client has precise control over the status of its subscriptions. You can subscribe to or unsubscribe from a track, change its quality, or disable the track temporarily.

These controls are accessible on the RemoteTrackPublication object.

For more info, see Subscriber controls.
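
As a rough sketch (assuming RemoteTrackPublication exposes subscribe/unsubscribe, enable/disable, and setVideoQuality; names may vary between SDK versions):

// pub is a RemoteTrackPublication obtained from a RemoteParticipant
await pub.unsubscribe();                      // stop receiving this track
await pub.subscribe();                        // start receiving it again
await pub.setVideoQuality(VideoQuality.LOW);  // request a lower simulcast layer
await pub.disable();                          // pause delivery without unsubscribing
await pub.enable();                           // resume delivery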

Getting help / Contributing

Please join us on Slack to get help from our devs / community members. We welcome your contributions (PRs), and details can be discussed there.

License

Apache License 2.0

Thanks

A huge thank you to flutter-webrtc for making it possible to use WebRTC in Flutter.


LiveKit Ecosystem
Client SDKs: Components Β· JavaScript Β· Rust Β· iOS/macOS Β· Android Β· Flutter Β· Unity (web) Β· React Native (beta)
Server SDKs: Node.js Β· Golang Β· Ruby Β· Java/Kotlin Β· PHP (community) Β· Python (community)
Services: LiveKit server Β· Egress Β· Ingress
Resources: Docs Β· Example apps Β· Cloud Β· Self-hosting Β· CLI
