# Android Kotlin SDK for LiveKit
Official Android Client SDK for LiveKit. Easily add video & audio capabilities to your Android apps.
## Docs

Docs and guides are available at https://docs.livekit.io.
## Installation

LiveKit for Android is available as a Maven package.

```groovy title="build.gradle"
...
dependencies {
    implementation "io.livekit:livekit-android:<version>"
    // Snapshots of the latest development version are available at:
    // implementation "io.livekit:livekit-android:<version>-SNAPSHOT"
}
```
You'll also need JitPack as one of your repositories.
```groovy
subprojects {
    repositories {
        google()
        mavenCentral()
        // ...
        maven { url 'https://jitpack.io' }

        // For SNAPSHOT access
        // maven { url 'https://s01.oss.sonatype.org/content/repositories/snapshots/' }
    }
}
```
## Sample App

There are two sample apps with similar functionality: one built with Jetpack Compose and one using standard Android views.
## Usage

### Permissions

LiveKit relies on the RECORD_AUDIO and CAMERA permissions to use the microphone and camera.

These permissions must be requested at runtime. See the sample app for a complete example.
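A minimal sketch of such a request, assuming an AppCompatActivity and the AndroidX Activity Result API (the launcher name here is illustrative):

```kotlin
import android.Manifest
import androidx.activity.result.contract.ActivityResultContracts

// Register before the activity is started, ideally as an instance val.
private val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { grants ->
    if (grants.values.all { it }) {
        // Both permissions granted; safe to enable the camera and microphone.
    }
}

// Launch the request, e.g. from onCreate():
permissionLauncher.launch(
    arrayOf(Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA)
)
```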
### Publishing camera and microphone

```kotlin
room.localParticipant.setCameraEnabled(true)
room.localParticipant.setMicrophoneEnabled(true)
```
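These are suspend functions, so call them from a coroutine, as in the connect example further below:

```kotlin
lifecycleScope.launch {
    room.localParticipant.setCameraEnabled(true)
    room.localParticipant.setMicrophoneEnabled(true)
}
```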
### Sharing screen

```kotlin
// Create an intent launcher for screen capture.
// This *must* be registered prior to onCreate(), ideally as an instance val.
val screenCaptureIntentLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    val resultCode = result.resultCode
    val data = result.data
    if (resultCode != Activity.RESULT_OK || data == null) {
        return@registerForActivityResult
    }
    lifecycleScope.launch {
        room.localParticipant.setScreenShareEnabled(true, data)
    }
}

// When it's time to enable the screen share, perform the following:
val mediaProjectionManager =
    getSystemService(MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
screenCaptureIntentLauncher.launch(mediaProjectionManager.createScreenCaptureIntent())
```
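To stop sharing, the same call with `false` should suffice (a minimal sketch; the projection intent is assumed to be needed only when enabling):

```kotlin
lifecycleScope.launch {
    room.localParticipant.setScreenShareEnabled(false)
}
```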
### Rendering subscribed tracks

LiveKit uses the WebRTC-provided `org.webrtc.SurfaceViewRenderer` to render video tracks. A `TextureView` implementation is also provided through `TextureViewRenderer`. Subscribed audio tracks are automatically played.
```kotlin
class MainActivity : AppCompatActivity(), RoomListener {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // ...
        val url = "wss://your_host"
        val token = "your_token"

        lifecycleScope.launch {
            val room = LiveKit.connect(
                applicationContext,
                url,
                token,
                ConnectOptions(),
                RoomOptions(),
            )
            val localParticipant = room.localParticipant
            localParticipant.setMicrophoneEnabled(true)
            localParticipant.setCameraEnabled(true)

            launch {
                room.events.collect { event ->
                    when (event) {
                        is RoomEvent.TrackSubscribed -> onTrackSubscribed(event)
                    }
                }
            }
        }
    }

    private fun onTrackSubscribed(event: RoomEvent.TrackSubscribed) {
        val track = event.track
        if (track is VideoTrack) {
            attachVideo(track)
        }
    }

    private fun attachVideo(videoTrack: VideoTrack) {
        // viewBinding.renderer is an `org.webrtc.SurfaceViewRenderer` in your layout.
        videoTrack.addRenderer(viewBinding.renderer)
    }
}
```
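When the view goes away, detach the track from the renderer and release it. A minimal sketch, assuming `videoTrack` is the track attached above:

```kotlin
override fun onDestroy() {
    // Stop feeding frames to the view before releasing it.
    videoTrack.removeRenderer(viewBinding.renderer)
    viewBinding.renderer.release()
    super.onDestroy()
}
```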
### @FlowObservable

Properties marked with `@FlowObservable` can be accessed as a Kotlin Flow to observe changes directly:
```kotlin
coroutineScope.launch {
    room::activeSpeakers.flow.collectLatest { speakersList ->
        /*...*/
    }
}
```
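Other observable properties work the same way; for example, collecting the room's connection state (assuming `Room::state` is also marked `@FlowObservable`):

```kotlin
coroutineScope.launch {
    room::state.flow.collect { state ->
        // React to connection state changes, e.g. update the UI.
    }
}
```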
## Dev Environment

To develop the Android SDK or run the sample app, you'll need:

- The protocol submodule repo initialized and updated with `git submodule update --init`
- Android Studio Arctic Fox 2020.3.1+

For those developing on Apple M1 Macs, please add the following to $HOME/.gradle/gradle.properties:

```
protoc_platform=osx-x86_64
```
### Optional (Dev convenience)

- Download the webrtc sources from https://webrtc.googlesource.com/src
- Add the sources to Android Studio by pointing at the `webrtc/sdk/android` folder.