Use this SDK to add real-time video, audio and data features to your Android/Kotlin app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.
Docs
Docs and guides at https://docs.livekit.io.
The API reference can be found at https://docs.livekit.io/client-sdk-android/index.html.
Installation
LiveKit for Android is available as a Maven package.
...
dependencies {
    implementation "io.livekit:livekit-android:1.2.1"

    // Snapshots of the latest development version are available at:
    // implementation "io.livekit:livekit-android:1.2.2-SNAPSHOT"
}
You'll also need JitPack as one of your repositories:
subprojects {
    repositories {
        google()
        mavenCentral()
        // ...
        maven { url 'https://jitpack.io' }

        // For SNAPSHOT access
        // maven { url 'https://s01.oss.sonatype.org/content/repositories/snapshots/' }
    }
}
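If your project uses the Kotlin Gradle DSL instead, a minimal sketch of the equivalent setup (same coordinates and repositories as above) looks like this:

// build.gradle.kts (a sketch, assuming the Kotlin Gradle DSL)
dependencies {
    implementation("io.livekit:livekit-android:1.2.1")
}

repositories {
    google()
    mavenCentral()
    maven(url = "https://jitpack.io")
}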
Usage
Permissions
LiveKit relies on the RECORD_AUDIO and CAMERA permissions to use the microphone and camera. These permissions must be requested at runtime, as in the sketch below. Reference the sample app for a complete example.
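As a minimal sketch (assuming an AndroidX Activity and the Activity Result APIs; the launcher name is illustrative), the permissions can be requested like this:

// Register before the Activity is started, ideally as an instance val.
val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    // grants maps each requested permission to whether it was granted.
    if (grants.values.any { granted -> !granted }) {
        // Handle denial, e.g. show a rationale before retrying.
    }
}

// Launch the request before publishing audio/video.
permissionLauncher.launch(
    arrayOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
)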
Publishing camera and microphone
room.localParticipant.setCameraEnabled(true)
room.localParticipant.setMicrophoneEnabled(true)
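As in the full example further below, these are suspend functions and should be called from a coroutine scope:

lifecycleScope.launch {
    room.localParticipant.setCameraEnabled(true)
    room.localParticipant.setMicrophoneEnabled(true)
}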
Sharing screen
// Create an intent launcher for screen capture.
// This *must* be registered prior to onCreate(), ideally as an instance val.
val screenCaptureIntentLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    val resultCode = result.resultCode
    val data = result.data
    if (resultCode != Activity.RESULT_OK || data == null) {
        return@registerForActivityResult
    }
    lifecycleScope.launch {
        room.localParticipant.setScreenShareEnabled(true, data)
    }
}

// When it's time to enable the screen share, perform the following:
val mediaProjectionManager =
    getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
screenCaptureIntentLauncher.launch(mediaProjectionManager.createScreenCaptureIntent())
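Note that on Android 10 and above, the platform requires a foreground service with the mediaProjection foreground service type while the screen is being captured; see the sample app and the Android MediaProjection documentation for details.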
Rendering subscribed tracks
LiveKit uses SurfaceViewRenderer to render video tracks. A TextureView implementation is also provided through TextureViewRenderer. Subscribed audio tracks are automatically played.
class MainActivity : AppCompatActivity() {

    lateinit var room: Room

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // Create Room object.
        room = LiveKit.create(applicationContext)

        // Setup the video renderer.
        room.initVideoRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))

        connectToRoom()
    }

    private fun connectToRoom() {
        val url = "wss://your_host"
        val token = "your_token"

        lifecycleScope.launch {
            // Setup event handling.
            launch {
                room.events.collect { event ->
                    when (event) {
                        is RoomEvent.TrackSubscribed -> onTrackSubscribed(event)
                        else -> {}
                    }
                }
            }

            // Connect to server.
            room.connect(
                url,
                token,
            )

            // Turn on audio/video recording.
            val localParticipant = room.localParticipant
            localParticipant.setMicrophoneEnabled(true)
            localParticipant.setCameraEnabled(true)
        }
    }

    private fun onTrackSubscribed(event: RoomEvent.TrackSubscribed) {
        val track = event.track
        if (track is VideoTrack) {
            attachVideo(track)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack) {
        videoTrack.addRenderer(findViewById<SurfaceViewRenderer>(R.id.renderer))
        findViewById<View>(R.id.progress).visibility = View.GONE
    }
}
See the basic sample app for the full implementation.
Audio modes
WebRTC utilizes an audio module to interface with the device's audio input and output. By default, the audio module is configured for two-way communications.
If you are building a livestreaming or music app, you can make the following tweaks to improve playback quality:
val options = WebRTCModuleOptions.getInstance()
val adm = JavaAudioDeviceModule.builder(this)
    .setAudioAttributes(
        AudioAttributes.Builder()
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .build()
    )
    .setUseStereoOutput(true)
    .build()
options.audioDeviceModule = adm
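Note that the audio device module is picked up when the SDK initializes its WebRTC internals, so these options should be set before creating your first Room.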
@FlowObservable
Properties marked with @FlowObservable can be accessed as a Kotlin Flow to observe changes directly:
coroutineScope.launch {
    room::activeSpeakers.flow.collectLatest { speakersList ->
        /*...*/
    }
}
Sample App
Note: If you wish to run the sample apps directly from this repo, please consult the Dev Environment instructions.
We have a basic quickstart sample app here, showing how to connect to a room, publish your device's audio/video, and display the video of one remote participant.
There are two more full-featured video conferencing sample apps. They both use the CallViewModel, which handles the Room connection and exposes the data needed for a basic video conferencing app.
The respective ParticipantItem class in each app is responsible for displaying each participant's UI.
Dev Environment
To develop the Android SDK or run the sample apps directly from this repo, you'll need to:
- Clone the repo to your computer
- Ensure the protocol submodule repo is initialized and updated
git clone https://github.com/livekit/client-sdk-android.git
cd client-sdk-android
git submodule update --init
For those developing on Apple M1 Macs, please add the following to $HOME/.gradle/gradle.properties:
protoc_platform=osx-x86_64
Optional (Dev convenience)
- Download webrtc sources from https://webrtc.googlesource.com/src
- Add sources to Android Studio by pointing at the webrtc/sdk/android folder
| LiveKit Ecosystem | |
|---|---|
| Client SDKs | Components · JavaScript · iOS/macOS · Android · Flutter · React Native · Rust · Python · Unity (web) · Unity (beta) |
| Server SDKs | Node.js · Golang · Ruby · Java/Kotlin · PHP (community) · Python (community) |
| Services | LiveKit server · Egress · Ingress |
| Resources | Docs · Example apps · Cloud · Self-hosting · CLI |