Seamlessly Switching Camera Lenses During Video Recording with CameraX on Android

Ngenge Senior
5 min read · Sep 9, 2023


Photo by Angela Compagnone on Unsplash

One of the Snapchat features I have always wondered about is how the app manages to switch between the front and back camera lenses during an ongoing recording.

One suggested implementation is to save a temporary clip each time the lens is switched, join all the clips into a single video when the recording ends, and then delete the individual clips. That sounds workable, but as a developer you probably just want to switch between the two lenses without all that hassle. Let us go.

1. Create a Project in Android Studio

Start by creating a new project in Android Studio and add the CameraX dependencies, version 1.3.0-beta01 or later; in this demonstration, I use version 1.3.0-rc01.

dependencies {
    // CameraX dependencies
    def camerax_version = "1.3.0-rc01"
    implementation "androidx.camera:camera-core:${camerax_version}"
    implementation "androidx.camera:camera-camera2:${camerax_version}"
    implementation "androidx.camera:camera-lifecycle:${camerax_version}"
    implementation "androidx.camera:camera-video:${camerax_version}"
    implementation "androidx.camera:camera-view:${camerax_version}"
    implementation "androidx.camera:camera-extensions:${camerax_version}"
    // To use .await() on the camera provider's ListenableFuture
    implementation "androidx.concurrent:concurrent-futures-ktx:1.1.0"

    // other Android dependencies
}

2. Make sure to add CAMERA and RECORD_AUDIO permissions in your app’s AndroidManifest.xml file.



<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />

3. I will assume you know how to request runtime permissions, so I will leave that to you and focus on the main topic.
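For completeness, here is a minimal sketch of one way to do it with the Activity Result API. The names permissionLauncher and requestPermissionsIfNeeded are placeholders, and you may prefer a different flow in your own app.

// Hypothetical runtime-permission helpers for CAMERA and RECORD_AUDIO
private val permissionLauncher = registerForActivityResult(
    ActivityResultContracts.RequestMultiplePermissions()
) { results ->
    if (results.values.all { it }) {
        // All permissions granted; safe to start the camera
    }
}

private fun requestPermissionsIfNeeded() {
    val required = arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO)
    val missing = required.filter {
        ContextCompat.checkSelfPermission(this, it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing.isNotEmpty()) {
        permissionLauncher.launch(missing.toTypedArray())
    }
}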

Create your view layout with a PreviewView, a button for starting and stopping video recording, and an ImageView for switching between the camera lenses.

<?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools">

    <androidx.constraintlayout.widget.ConstraintLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        tools:context=".PersistentRecordingActivity">

        <androidx.camera.view.PreviewView
            android:id="@+id/previewView"
            android:layout_width="0dp"
            android:layout_height="0dp"
            app:layout_constraintBottom_toBottomOf="parent"
            app:layout_constraintEnd_toEndOf="parent"
            app:layout_constraintStart_toStartOf="parent"
            app:layout_constraintTop_toTopOf="parent" />

        <Button
            android:id="@+id/buttonRecord"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_marginBottom="24dp"
            android:text="@string/start_record"
            app:layout_constraintBottom_toBottomOf="parent"
            app:layout_constraintEnd_toEndOf="parent"
            app:layout_constraintStart_toStartOf="parent" />

        <ImageView
            android:id="@+id/imageViewFlipCamera"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_marginTop="16dp"
            android:layout_marginEnd="16dp"
            app:layout_constraintEnd_toEndOf="parent"
            app:layout_constraintTop_toTopOf="parent"
            app:srcCompat="@drawable/flip_camera" />

    </androidx.constraintlayout.widget.ConstraintLayout>
</layout>

Next, in our Activity's Kotlin file, we initialize our variables: the recording, the camera selector, the Preview use case, the recorder, and the VideoCapture use case.

class PersistentRecordingActivity : AppCompatActivity() {

    private val binding: ActivityPersistentRecordingBinding by lazy {
        ActivityPersistentRecordingBinding.inflate(layoutInflater)
    }

    // Holds the current recording
    private var recording: Recording? = null

    private val preview: Preview by lazy {
        Preview.Builder().build()
    }

    // The lens we are currently using; the back camera by default
    private var cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA

    // Recorder to use with the video capture use case
    private val recorder: Recorder by lazy {
        Recorder.Builder().build()
    }

    private val videoCapture: VideoCapture<Recorder> by lazy {
        VideoCapture.withOutput(recorder)
    }

    private lateinit var cameraProvider: ProcessCameraProvider

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(binding.root)
        lifecycleScope.launch {
            // We will implement this
            startCamera()
        }
        // And this
        createClickListeners()
    }
}

The recording variable holds the current recording, which is null when the Activity is launched. The cameraSelector holds the current lens facing, which for us defaults to the back camera, and the recorder is created and bound to the videoCapture use case. We used Kotlin's by lazy delegate to initialize most of the variables, although we could also have done this inside the startCamera method.
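As a side note, the Recorder above uses CameraX's default quality selection. If you want to request a specific capture quality, the builder accepts a QualitySelector; the following is just a sketch of that option and is not required for lens switching.

// Optional alternative: ask the Recorder for Full HD, falling back to the
// closest supported quality if the device cannot provide it
private val recorder: Recorder by lazy {
    Recorder.Builder()
        .setQualitySelector(
            QualitySelector.from(
                Quality.FHD,
                FallbackStrategy.lowerQualityOrHigherThan(Quality.FHD)
            )
        )
        .build()
}

Now, let us look at the startCamera method.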

private suspend fun startCamera() {
    cameraProvider = ProcessCameraProvider.getInstance(this).await()
    preview.setSurfaceProvider(binding.previewView.surfaceProvider)
    cameraProvider.unbindAll()
    try {
        cameraProvider.bindToLifecycle(this, cameraSelector, preview, videoCapture)
    } catch (ex: Exception) {
        ex.printStackTrace()
    }
}

The startCamera method is a suspend function because we use .await(), an extension from the androidx.concurrent:concurrent-futures-ktx artifact we added, which suspends until the ListenableFuture completes and then returns its result. To get the camera feed, we let the PreviewView's surface act as the surface provider of the Preview use case; without calling preview.setSurfaceProvider(), the camera feed will not be displayed. We finish by calling ProcessCameraProvider's bindToLifecycle method, passing in the lifecycle owner, the camera selector, and the two use cases. At this point, running the project should display the camera feed with the record button and the flip-camera ImageView.
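If you would rather not use coroutines and the concurrent-futures-ktx artifact, the same binding can be done by attaching a listener to the ListenableFuture instead; the method name startCameraWithListener below is just a placeholder for this sketch.

// Non-coroutine alternative: resolve the camera provider via a listener
private fun startCameraWithListener() {
    val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
    cameraProviderFuture.addListener({
        cameraProvider = cameraProviderFuture.get()
        preview.setSurfaceProvider(binding.previewView.surfaceProvider)
        cameraProvider.unbindAll()
        try {
            cameraProvider.bindToLifecycle(this, cameraSelector, preview, videoCapture)
        } catch (ex: Exception) {
            ex.printStackTrace()
        }
    }, ContextCompat.getMainExecutor(this))
}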

The createClickListeners method wires up the two views: buttonRecord calls startRecording, and imageViewFlipCamera calls flipCamera.

private fun createClickListeners() {
    binding.apply {
        imageViewFlipCamera.setOnClickListener {
            flipCamera()
        }

        buttonRecord.setOnClickListener {
            startRecording()
        }
    }
}

Start Persistent Recording

To switch cameras while recording, we first pause the current recording and unbind all the use cases; once the new camera lens is ready, we bind it along with the Preview and VideoCapture use cases. For this to be possible, we must create the recording as a persistent recording, so that it keeps running and audio data continues to be captured while the use cases are unbound. Below is the startRecording method implementation.

@OptIn(ExperimentalPersistentRecording::class)
private fun startRecording() {
    if (recording == null) {
        val contentValues = ContentValues().apply {
            put(MediaStore.Video.Media.DISPLAY_NAME, System.currentTimeMillis())
            put(MediaStore.Video.Media.MIME_TYPE, "video/mp4")
            put(MediaStore.Video.Media.RELATIVE_PATH, "DCIM/CameraX")
        }
        val outputOptions = MediaStoreOutputOptions.Builder(
            contentResolver,
            MediaStore.Video.Media.EXTERNAL_CONTENT_URI
        )
            .setContentValues(contentValues)
            .build()
        recording = recorder.prepareRecording(this, outputOptions)
            .asPersistentRecording() // Audio data is still recorded after the VideoCapture is unbound
            .withAudioEnabled()
            .start(ContextCompat.getMainExecutor(this)) { recordEvent ->
                when (recordEvent) {
                    is VideoRecordEvent.Start -> {
                        binding.buttonRecord.text = getString(R.string.stop_record)
                    }

                    is VideoRecordEvent.Finalize -> {
                        binding.buttonRecord.text = getString(R.string.start_record)
                    }
                }
            }
    } else {
        recording?.stop()
        recording = null
    }
}

Each time buttonRecord is tapped, we either start a new persistent recording or stop the current one. When a new recording starts, we keep a reference to it; that reference is what we use later when we switch camera lenses while recording.
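If you want the event listener to be a bit more defensive, the Finalize event also reports whether the recording finished with an error and where the file was saved. The helper below is a sketch that could replace the inline lambda; handleRecordEvent is a hypothetical name, and the extra branches are optional.

// Hypothetical replacement for the inline record-event lambda
private fun handleRecordEvent(event: VideoRecordEvent) {
    when (event) {
        is VideoRecordEvent.Start -> {
            binding.buttonRecord.text = getString(R.string.stop_record)
        }

        is VideoRecordEvent.Finalize -> {
            binding.buttonRecord.text = getString(R.string.start_record)
            if (event.hasError()) {
                Log.e("PersistentRecording", "Recording finalized with error ${event.error}")
            } else {
                Log.d("PersistentRecording", "Video saved to ${event.outputResults.outputUri}")
            }
        }

        else -> Unit // Pause, Resume and Status events are not handled here
    }
}

Next, let us see the flipCamera method.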

private fun flipCamera() {
    // Change the camera lens
    cameraSelector = if (cameraSelector == CameraSelector.DEFAULT_BACK_CAMERA) {
        CameraSelector.DEFAULT_FRONT_CAMERA
    } else {
        CameraSelector.DEFAULT_BACK_CAMERA
    }
    // Verify whether there is a current recording
    if (recording != null) {
        recording?.pause()
        cameraProvider.unbindAll()
        cameraProvider.bindToLifecycle(this, cameraSelector, preview, videoCapture)
        recording?.resume()
    } else {
        cameraProvider.unbindAll()
        cameraProvider.bindToLifecycle(this, cameraSelector, preview, videoCapture)
    }
}

The above method is called in two scenarios: while a recording is in progress, or when no recording is in progress. When there is no recording ongoing, we simply change the camera selector, unbind all the use cases, and rebind them with the new selector. When there is an ongoing recording, because we created it as a persistent recording we can first pause it, unbind the use cases, rebind them with the new selector, and finally resume the recording.
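The version above assumes the device actually has both lenses. If you want to guard against devices that expose only one camera, ProcessCameraProvider.hasCamera() can be checked before switching; the variant below is a sketch of that idea, not part of the original implementation.

// Defensive variant: only switch if a camera with the new facing exists
private fun flipCameraIfAvailable() {
    val newSelector = if (cameraSelector == CameraSelector.DEFAULT_BACK_CAMERA) {
        CameraSelector.DEFAULT_FRONT_CAMERA
    } else {
        CameraSelector.DEFAULT_BACK_CAMERA
    }
    val available = try {
        cameraProvider.hasCamera(newSelector)
    } catch (ex: CameraInfoUnavailableException) {
        false
    }
    if (!available) return

    cameraSelector = newSelector
    if (recording != null) {
        recording?.pause()
        cameraProvider.unbindAll()
        cameraProvider.bindToLifecycle(this, cameraSelector, preview, videoCapture)
        recording?.resume()
    } else {
        cameraProvider.unbindAll()
        cameraProvider.bindToLifecycle(this, cameraSelector, preview, videoCapture)
    }
}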

Note that in order to stop a persistent recording, we must explicitly call recording.stop() or recording.close().
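Because of that, it is worth releasing the recording yourself if the Activity goes away while one is still running; a minimal sketch:

override fun onDestroy() {
    super.onDestroy()
    // A persistent recording is not stopped by lifecycle events or by
    // unbinding, so stop it explicitly if the Activity is destroyed
    recording?.stop()
    recording = null
}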

Below is the complete source code for the activity. Note that you must request the required runtime permissions before you can run this code and record video along with audio.

For a demonstration, below is how the final app looks and a sample recorded video.

Okay then, this is where we come to a close. Let me know your thoughts.
