Livestreaming from mobile devices lets you connect with your audience wherever you are, and services that offer this capability are popular across many fields.
In this article, we will take a closer look at how to build your own Android application for mobile streaming and livestream playback, and how to integrate it with the Gcore Streaming Platform.
Streaming protocols
Streaming protocols are used to send video and audio over public networks. One of the most popular streaming protocols is RTMP, which is supported by most streaming platforms. RTMP is reliable and well suited for livestreaming thanks to its low latency and TCP-based retransmission of data packets.
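An RTMP ingest URL generally takes the form `rtmp://<host>/<application>/<stream key>`. As a small illustration (the host, application, and key names below are placeholders, not real Gcore values), such a URL can be assembled like this:

```kotlin
// Illustrative only: builds an RTMP ingest URL of the common form
// rtmp://<host>/<application>/<stream key>. All arguments here are
// hypothetical placeholders, not actual Gcore endpoints.
fun buildRtmpUrl(host: String, app: String, streamKey: String): String {
    require(host.isNotBlank() && app.isNotBlank() && streamKey.isNotBlank()) {
        "host, app, and streamKey must be non-empty"
    }
    return "rtmp://$host/$app/$streamKey"
}
```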
To distribute and play content on users' devices, streaming platforms offer the popular and scalable HLS and DASH broadcast formats. In addition, Android devices ship with a native MediaPlayer that supports HLS playback, which is why we will focus on this particular protocol.
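The core idea of HLS is that a stream is described by a plain-text playlist (`.m3u8`) pointing at short media segments, which the player downloads one after another. A deliberately simplified sketch of that idea (real playlists carry many more tags than this toy parser handles):

```kotlin
// Simplified sketch: extract media segment URIs from an HLS playlist.
// Real playlists contain many more tags (variants, encryption, etc.);
// this only demonstrates that HLS is a text playlist of short segments.
fun parseSegmentUris(playlist: String): List<String> =
    playlist.lineSequence()
        .map { it.trim() }
        // Tag lines start with '#'; the remaining non-empty lines are segment URIs.
        .filter { it.isNotEmpty() && !it.startsWith("#") }
        .toList()
```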
Selecting a library to create an RTMP stream
There are few open-source solutions for RTMP streaming from Android devices, and even fewer truly functional options. Let's take a look at some of them.
1. rtmp-rtsp-stream-client-java
Pros:
- New, constantly updated library
- Minimum API 16 support
- Camera and Camera2 API support
- Camera switching during streaming
- Adaptive bitrate
- Audio and video activation/deactivation during streaming
- Possibility to configure broadcasting parameters
- Installation of OpenGL filters, images, GIFs, or text in real-time
- Easy to use: Everything works out of the box, without any additional manipulations
- Availability of documentation and instructions for the library on GitHub
- Convenient library import via Gradle
Cons:
- Pause function for a broadcast stream is not available
2. HaishinKit
Pros:
- Stable updates
- Supports RTMP playback
- Uses the current Camera2 API
- Supports camera switching during streaming
- Allows you to configure broadcasting parameters
Cons:
- Adaptive bitrate option is not available
- Pause function for a broadcast stream is not available
3. LiveVideoBroadcaster
Pros:
- Minimum API 28 support
- Camera switching during streaming
- Adaptive stream quality (frame rate and bit rate)
- Audio activation/deactivation during streaming
Cons:
- Out of date: last commit on July 28, 2020
- Uses outdated Camera API
- No library import via Gradle
- Much more complex than rtmp-rtsp-stream-client-java
- Frames sent by the library contain errors (DC, AC, MV)
To summarize, rtmp-rtsp-stream-client-java has the most advantages and the fewest disadvantages of the three options, so we consider it the most suitable solution.
Streaming implementation via RTMP protocol from an Android smartphone
The rtmp-rtsp-stream-client-java library provides two types of objects for streaming: RtmpCamera1 and RtmpCamera2. The former uses the Camera API to capture the stream from your smartphone camera, while the latter uses the Camera2 API.
We recommend RtmpCamera2, as the Camera API has been deprecated since Android API level 21. Our example uses the up-to-date RtmpCamera2.
We will now take a step-by-step look at how to use the rtmp-rtsp-stream-client-java library for mobile streaming. But first, let's briefly review how it works.
- When the rtmpCamera2.startPreview() method is called, the camera is captured.
- The captured camera session is sent to the OpenGlView input, where it is displayed to the user.
- When the rtmpCamera2.startStream() method is called, a connection to the remote server is established.
- The captured session from the camera is transmitted via the RTMP protocol to the specified rtmpUrl.
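The flow above can be sketched as a tiny state model. This is purely illustrative and not part of the library's API; it just captures the states the camera object moves through (idle, previewing, streaming) and the rule that streaming is only started once the preview is running:

```kotlin
// Illustrative state model of the preview/stream flow described above.
// Not part of rtmp-rtsp-stream-client-java; a sketch only.
enum class CameraState { IDLE, PREVIEWING, STREAMING }

class StreamLifecycle {
    var state = CameraState.IDLE
        private set

    // startPreview() captures the camera and shows it in the view.
    fun startPreview() {
        if (state == CameraState.IDLE) state = CameraState.PREVIEWING
    }

    // startStream() connects to the server; requires an active preview.
    fun startStream() {
        if (state == CameraState.PREVIEWING) state = CameraState.STREAMING
    }

    // Stopping the stream returns to the preview-only state.
    fun stopStream() {
        if (state == CameraState.STREAMING) state = CameraState.PREVIEWING
    }

    fun stopPreview() {
        if (state == CameraState.PREVIEWING) state = CameraState.IDLE
    }
}
```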
Now let's take a step-by-step look at how to organize online mobile streaming.
1. Init
To use the rtmp-rtsp-stream-client-java library in your project, you need to add dependencies to build.gradle:
allprojects {
    repositories {
        maven { url "https://jitpack.io" }
    }
}

dependencies {
    implementation "com.github.pedroSG94.rtmp-rtsp-stream-client-java:rtplibrary:2.1.7"
}
2. Permissions
Specify the required permissions in the AndroidManifest.xml file:
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
3. Displaying the camera stream
When streaming from a smartphone camera, you need to see what is being broadcast. For this purpose, a corresponding View displays the camera stream on the screen. Android offers SurfaceView and TextureView; the library also provides its own OpenGlView, which inherits from SurfaceView.
With RtmpCamera1 you can use any of these Views, while only OpenGlView is available with RtmpCamera2. But among all Views, only OpenGlView allows you to use various filters, images, GIFs, or text during streaming.
Add OpenGlView to Layout Activity or Fragment to see the camera stream:
<com.pedro.rtplibrary.view.OpenGlView
    android:id="@+id/openGlView"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:keepScreenOn="true"
    android:visibility="gone"
    app:keepAspectRatio="true"
    app:aspectRatioMode="adjust"/>
4. Preparing for streaming
To initialize the RtmpCamera2 object, you need the OpenGlView object and the ConnectCheckerRtmp interface implementation:
private val connectCheckerRtmp = object : ConnectCheckerRtmp {

    override fun onAuthErrorRtmp() {
        _toastMessageId.postValue(R.string.auth_error)
    }

    override fun onAuthSuccessRtmp() {
        _toastMessageId.postValue(R.string.auth_success)
    }

    override fun onConnectionFailedRtmp(reason: String) {
        _toastMessageId.postValue(R.string.connection_failed)
        stopBroadcast()
    }

    override fun onConnectionStartedRtmp(rtmpUrl: String) {}

    override fun onConnectionSuccessRtmp() {
        _toastMessageId.postValue(R.string.connection_success)
        _streamState.postValue(StreamState.PLAY)
    }

    override fun onDisconnectRtmp() {
        _toastMessageId.postValue(R.string.disconnected)
        _streamState.postValue(StreamState.STOP)
    }

    override fun onNewBitrateRtmp(bitrate: Long) {}
}
To use the adaptive bitrate, make some additions to the implementation of this interface:
//...
private lateinit var bitrateAdapter: BitrateAdapter

override fun onConnectionSuccessRtmp() {
    bitrateAdapter = BitrateAdapter { bitrate ->
        rtmpCamera2?.setVideoBitrateOnFly(bitrate)
    }.apply {
        setMaxBitrate(StreamParameters.maxBitrate)
    }
    _toastMessageId.postValue(R.string.connection_success)
    _currentBitrate.postValue(rtmpCamera2?.bitrate)
    _streamState.postValue(StreamState.PLAY)
}
//...
override fun onNewBitrateRtmp(bitrate: Long) {
    bitrateAdapter.adaptBitrate(bitrate)
    _currentBitrate.postValue(bitrate.toInt())
    disableStreamingAfterTimeOut()
}
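The BitrateAdapter above adjusts the encoder's target bitrate based on the throughput the connection actually achieves. The idea can be sketched in pure Kotlin as follows; note that the thresholds and step sizes here are illustrative guesses, not the library's actual values:

```kotlin
// Simplified sketch of adaptive bitrate: lower the target when measured
// throughput falls below it, probe upward cautiously when there is headroom.
// The 90% threshold and 10% step are illustrative, not the library's values.
class SimpleBitrateAdapter(private val maxBitrate: Int, initialBitrate: Int) {
    var currentBitrate = initialBitrate
        private set

    fun adapt(measuredBitrate: Int): Int {
        currentBitrate = when {
            // Network can't keep up: drop to what was actually achieved.
            measuredBitrate < currentBitrate * 9 / 10 -> measuredBitrate
            // Headroom available: raise by ~10%, capped at maxBitrate.
            else -> minOf(maxBitrate, currentBitrate * 11 / 10)
        }
        return currentBitrate
    }
}
```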
Add a callback to the OpenGlView object. The callback methods will be used to start and stop the camera preview:
private val surfaceHolderCallback = object : SurfaceHolder.Callback {

    override fun surfaceCreated(holder: SurfaceHolder) {
        rtmpCamera2?.startPreview(
            StreamParameters.resolution.width,
            StreamParameters.resolution.height
        )
    }

    override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {}

    override fun surfaceDestroyed(holder: SurfaceHolder) {
        rtmpCamera2?.stopPreview()
    }
}

binding.openGlView.holder.addCallback(surfaceHolderCallback)
Create the RtmpCamera2 object that will be used for streaming:
rtmpCamera2 = RtmpCamera2(openGlView, connectCheckerRtmp)
5. Launching and stopping a livestream
Set the video and audio parameters, and launch the livestream.
Livestream with default parameters:
fun startBroadcast(rtmpUrl: String) {
    rtmpCamera2?.let {
        if (!it.isStreaming) {
            if (it.prepareAudio() && it.prepareVideo()) {
                _streamState.value = StreamState.PLAY
                it.startStream(rtmpUrl)
            } else {
                _streamState.value = StreamState.STOP
                _toastMessageId.value = R.string.error_preparing_stream
            }
        }
    }
}
Livestream with custom parameters:
fun startBroadcast(rtmpUrl: String) {
    val audioIsReady = rtmpCamera2?.prepareAudio(
        StreamParameters.audioBitrate,
        StreamParameters.sampleRate,
        StreamParameters.isStereo,
        StreamParameters.echoCanceler,
        StreamParameters.noiseSuppressor
    ) ?: false
    val videoIsReady = rtmpCamera2?.prepareVideo(
        StreamParameters.resolution.width,
        StreamParameters.resolution.height,
        StreamParameters.fps,
        StreamParameters.startBitrate,
        StreamParameters.iFrameIntervalInSeconds,
        CameraHelper.getCameraOrientation(getApplication())
    ) ?: false
    rtmpCamera2?.let {
        if (!it.isStreaming) {
            if (audioIsReady && videoIsReady) {
                _streamState.value = StreamState.PLAY
                it.startStream(rtmpUrl)
            } else {
                _streamState.value = StreamState.STOP
                _toastMessageId.value = R.string.error_preparing_stream
            }
        }
    }
}
Stopping the stream:
fun stopBroadcast() {
    rtmpCamera2?.let {
        if (it.isStreaming) {
            _streamState.value = StreamState.STOP
            it.stopStream()
        }
    }
}
Integration with Gcore Streaming Platform
Creating a Gcore account
To integrate the streaming platform into the project, you need to create a free Gcore account with your email and password.
Activate the service by selecting Free Live or any other suitable plan.
To interact with Gcore Streaming Platform, we will use Gcore API. We will execute requests using Retrofit together with RxJava. But you can use any other method of sending HTTP requests if you want.
Authorization
Log in to start working with API. Use the email and password specified during registration to receive the Access Token, which you will need for further requests.
class AuthRequestBody(
    @SerializedName("username") val eMail: String,
    @SerializedName("password") val password: String,
    @SerializedName("one_time_password") val oneTimePassword: String = "authenticator passcode"
)

interface AuthApi {
    @POST("./auth/jwt/login")
    fun performLogin(@Body body: AuthRequestBody): Single<AuthResponse>
}

private fun auth(eMail: String, password: String) {
    val requestBody = AuthRequestBody(eMail = eMail, password = password)
    compositeDisposable.add(
        (requireActivity().application as GCoreApp).authApi
            .performLogin(requestBody)
            .subscribeOn(Schedulers.computation())
            .observeOn(AndroidSchedulers.mainThread())
            .auth(requireActivity(), requestBody)
            .subscribe({ authResponse ->
                showToast(R.string.logged_success)
                saveAuthData(requestBody, authResponse)
                routeToStreams()
            }, {
                showToast(R.string.logged_fail)
            })
    )
}
Getting PUSH URL
There are two ways to get the URL for sending the RTMP stream:
Method 1. Send a Get all live streams request to get all livestreams. As a response, we will receive data on all streams created in your account.
An example of sending a request:
interface StreamsApi {
    /**
     * @param page integer; Query parameter. Use it to list the paginated content
     * @param with_broadcasts integer; Query parameter.
     * Set to 1 to get details of the broadcasts associated with the stream
     */
    @GET("./vp/api/streams")
    fun getStreams(
        @Header("Authorization") accessToken: String,
        @Query("page") page: Int,
        @Query("with_broadcasts") withBroadcasts: Int = 1
    ): Single<List<StreamItemResponse>>
    //...
}

private fun loadStreamItems(page: Int = 1) {
    val accessToken = getAccessToken()
    var currentPage = page
    if (currentPage == 1) {
        streamItems.clear()
    }
    compositeDisposable.add(
        (requireActivity().application as GCoreApp).streamsApi
            .getStreams("Bearer $accessToken", page = currentPage)
            .subscribeOn(Schedulers.computation())
            .observeOn(AndroidSchedulers.mainThread())
            .subscribe({
                if (it.isNotEmpty()) {
                    it.forEach { streamItemResponse ->
                        streamItems.add(StreamItemModel.getInstance(streamItemResponse))
                    }
                    loadStreamItems(page = ++currentPage)
                } else {
                    updateDataInAdapter()
                }
            }, {
                it.printStackTrace()
            })
    )
}
Method 2. Send a Get live stream request to get a particular livestream. As a response, we will receive data only on the requested stream, if such stream exists.
An example of sending a request:
interface StreamsApi {
    //...
    @GET("/vp/api/streams/{stream_id}")
    fun getStreamDetailed(
        @Header("Authorization") accessToken: String,
        @Path("stream_id") streamId: Int
    ): Single<StreamDetailedResponse>
    //...
}

private fun getStreamDetailedInfo(streamId: Int) {
    val accessToken = getAccessToken()
    compositeDisposable.add(
        (requireActivity().application as GCoreApp).streamsApi
            .getStreamDetailed("Bearer $accessToken", streamId)
            .subscribeOn(Schedulers.computation())
            .observeOn(AndroidSchedulers.mainThread())
            .subscribe({ streamDetailedInfo ->
                //...
            }, {
                it.printStackTrace()
            })
    )
}
From the responses to these requests, we get a push_url and use it as the URL for sending the RTMP stream. Once the stream begins, the broadcast will start automatically. Select the required stream in your personal account. You can use the preview before publishing the stream to your website or player.
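In the app itself, Retrofit and Gson deserialize these responses into typed models. Purely for illustration, pulling a push_url field out of a raw JSON string could look like the sketch below; the JSON sample is made up and is not an actual Gcore payload:

```kotlin
// Illustrative only: extract "push_url" from a raw JSON string with a regex.
// In the article's app this is handled by Retrofit + Gson deserialization;
// the JSON used in the test is a made-up example, not a real Gcore response.
fun extractPushUrl(json: String): String? =
    Regex("\"push_url\"\\s*:\\s*\"([^\"]+)\"")
        .find(json)
        ?.groupValues
        ?.get(1)
```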
Active stream playback
With Gcore Streaming Platform, you can broadcast streams on third-party resources in various formats, including HLS.
In our example, we will not consider simultaneous streaming and playback on one device. Instead, streaming should be launched from any other device.
To play the active stream, use the standard Android MediaPlayer. If you want to have more control over the stream and the ability to customize the player, we recommend using ExoPlayer.
On-screen display
To display the video stream on the smartphone screen, VideoView is required. It should be added to the Layout Activity or Fragment where you plan to play the stream:
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/streamPlayer"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="@color/black"
    tools:context=".screens.StreamPlayerFragment">

    <VideoView
        android:id="@+id/videoView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_gravity="center"/>

    ...
</FrameLayout>
Starting playback
Before playback, embed the hls_playlist_url of the active stream in the player during its initialization. hls_playlist_url is returned in the response to the Get live stream request mentioned above.
Player initialization:
private fun initializePlayer(streamUri: String) {
    val videoView = binding.videoView
    videoView.setVideoURI(Uri.parse(streamUri))

    val mediaController = MediaController(videoView.context)
    mediaController.setMediaPlayer(videoView)
    videoView.setMediaController(mediaController)

    videoView.setOnPreparedListener {
        binding.progressBar.visibility = View.GONE
        videoView.start()
    }
    videoView.setOnErrorListener(mediaPlayerOnErrorListener)
}
Once the initialization is complete, start playback by calling the videoView.start() method.
Releasing the player:
fun releasePlayer() {
    binding.videoView.stopPlayback()
}
Summary
Using our examples, organizing a livestream in an Android application is quite simple and does not take much time. All you need is the open-source rtmp-rtsp-stream-client-java library and our Streaming Platform.
All the code mentioned in the article can be viewed on GitHub.