iOS Camera - A surface dive into a deep topic

Summary

There are different ways to access the camera on iOS devices. One simple way is to use UIImagePickerController, which makes use of the stock system camera UI. This is useful if you want to quickly get up and running to take a picture or video, and you don’t really need to leverage the full power of the camera (e.g., manually adjusting focus or exposure, processing the camera feed, etc.).
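For context, presenting the stock camera UI can be as small as this sketch (assuming a view controller that adopts the picker’s delegate protocols and a device with a camera; the function name is just for illustration):

    import UIKit

    // Minimal sketch: present the stock system camera UI.
    func presentSystemCamera(
        from presenter: UIViewController & UIImagePickerControllerDelegate & UINavigationControllerDelegate
    ) {
        // Simulators and some devices have no camera, so check first.
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = presenter
        presenter.present(picker, animated: true)
    }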

If you are looking to do more custom things with the camera, hopefully these high level notes help get you started and serve as a reference during your development! This post focuses on the camera, but these concepts can be extended to other phone sensors like the microphone.


Resources

Note - Input refers to data that the device can collect (camera feed, microphone, etc.). Output refers to what the device produces from that input (a video file, a sound file, etc.).

Steps to set up the camera

  1. Request authorization to use the camera
    Note: you need to add NSCameraUsageDescription to your plist with a description of why you want to access the camera. Your users will see this when asked to grant permission to use the camera. Apple has been pretty vocal about including an accurate and detailed description of why you want permission, so don’t take this description lightly. Also, if you don’t add this, your app will just crash!
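    Requesting access might look like this minimal sketch (the AVCaptureDevice calls are the real API; the wrapper function name is just for illustration):

     import AVFoundation

     // Sketch: check the current status, and only prompt if it is undetermined.
     func requestCameraAccess(_ completion: @escaping (Bool) -> Void) {
         switch AVCaptureDevice.authorizationStatus(for: .video) {
         case .authorized:
             completion(true)
         case .notDetermined:
             // Shows the system prompt with your NSCameraUsageDescription text.
             AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
         default: // .denied, .restricted
             completion(false)
         }
     }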

  2. Set up the capture session - You use AVCaptureSession to configure your camera’s session inputs (e.g., which camera & related settings such as focus type) and outputs (e.g., how to consume the input from the camera, such as making video or pictures). You configure the capture session inputs by checking the capture session’s canAddInput with an AVCaptureInput and then calling addInput. Similarly, you configure the output by checking canAddOutput with an AVCaptureOutput and then calling addOutput (see the fuller sketch below). Also, make sure to do this in the capture session’s begin/commit scope:

     func configure(session: AVCaptureSession) {
         // ...
         session.beginConfiguration()
         defer { session.commitConfiguration() }
         // configure your session
     }
    

    Note: you are able to change inputs and outputs after the session is running, but you will still need to wrap those changes in the same beginConfiguration()/commitConfiguration() scope shown above.
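    Putting the pieces together, a minimal sketch of a configuration, assuming the default back wide-angle camera for input and still photos (AVCapturePhotoOutput) for output:

     import AVFoundation

     func configure(session: AVCaptureSession) throws {
         session.beginConfiguration()
         defer { session.commitConfiguration() }

         // Input: the default back wide-angle camera (an assumption for this sketch).
         guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                    for: .video,
                                                    position: .back) else { return }
         let input = try AVCaptureDeviceInput(device: camera)
         if session.canAddInput(input) {
             session.addInput(input)
         }

         // Output: still photos.
         let photoOutput = AVCapturePhotoOutput()
         if session.canAddOutput(photoOutput) {
             session.addOutput(photoOutput)
         }
     }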

  3. Get the camera preview ready - You will typically want to see what your camera is looking at, so this is an important step. Also, setting up the camera and seeing it work is such a cool feeling!

    Apple recommends using a UIView subclass with an AVCaptureVideoPreviewLayer as the backing layer (see the sample app in Resources above), which ends up playing nicely with Interface Builder. If you are like me and like to do things programmatically, something like this also works:

     import AVFoundation
     import UIKit

     final class CameraViewController: UIViewController {

         private let _previewLayer: AVCaptureVideoPreviewLayer = {
             let layer = AVCaptureVideoPreviewLayer()
             layer.videoGravity = .resizeAspectFill // fill the view
             return layer
         }()

         init(session: AVCaptureSession) {
             super.init(nibName: nil, bundle: nil)
             _previewLayer.session = session
         }

         required init?(coder _: NSCoder) { fatalError("\(#function) has not been implemented") }

         override func viewDidLoad() {
             super.viewDidLoad()
             view.layer.addSublayer(_previewLayer)
         }

         override func viewDidLayoutSubviews() {
             super.viewDidLayoutSubviews()
             // Keep the preview sized to the view, including after layout changes.
             _previewLayer.frame = view.layer.bounds
         }
     }
    

    Note 1: microphone-only or spy camera apps may not want a camera preview 😅.
    Note 2: if your app supports multiple orientations, you will want to set the preview layer’s AVCaptureConnection’s videoOrientation property (videoPreviewLayer.connection?.videoOrientation = someOrientation). This is usually done when initially setting up the AVCaptureSession and during orientation changes like in a view controller’s viewWillTransition(to:with:) (see the sketch after these notes).
    Note 3: typically the camera preview is shown full screen, but since the preview is simply a view/layer, you can shape and resize the preview however you would like! I have done a camera preview that lives in a small rectangle in a modal popup, which was smaller than the overall screen size.
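    A sketch of that orientation handling, assuming it lives in the CameraViewController above and iOS 13+ for windowScene (the updatePreviewOrientation helper name is just for illustration):

     override func viewWillTransition(to size: CGSize,
                                      with coordinator: UIViewControllerTransitionCoordinator) {
         super.viewWillTransition(to: size, with: coordinator)
         // Update after the transition so we read the *new* interface orientation.
         coordinator.animate(alongsideTransition: nil) { [weak self] _ in
             self?.updatePreviewOrientation()
         }
     }

     private func updatePreviewOrientation() {
         guard let interfaceOrientation = view.window?.windowScene?.interfaceOrientation else { return }
         // Explicit mapping from interface orientation to video orientation.
         let videoOrientation: AVCaptureVideoOrientation
         switch interfaceOrientation {
         case .portrait:           videoOrientation = .portrait
         case .portraitUpsideDown: videoOrientation = .portraitUpsideDown
         case .landscapeLeft:      videoOrientation = .landscapeLeft
         case .landscapeRight:     videoOrientation = .landscapeRight
         default:                  return
         }
         _previewLayer.connection?.videoOrientation = videoOrientation
     }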

  4. Run the session! - Simply call session.startRunning(). If you have permission to use the camera, configured the session properly, and added your preview, you will see a camera feed on your phone! This never gets old 😎!

Note - Apple recommends that you start your session off the main thread, since startRunning() is a blocking call; this keeps the UI responsive. Apple’s sample code uses a dedicated serial queue for this, and puts the rest of the camera work on the same queue: setting up the capture session, changing inputs/outputs, capturing camera input to produce an output, and changing device settings like focus or exposure. This keeps the UI as responsive as possible by offloading work from the main queue.

Also, Apple notes this in their sample project (see Resources above): “In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.” If you didn’t use a serial queue, there could be a race condition when changing the camera session and related settings, which could result in undefined behavior.
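A minimal sketch of that pattern (the queue label and type/function names are just for illustration):

    import AVFoundation

    final class CameraController {
        let session = AVCaptureSession()
        // One serial queue for all session work: configuration, start/stop,
        // input/output changes, and device settings.
        private let sessionQueue = DispatchQueue(label: "camera.session.queue")

        func start() {
            sessionQueue.async { [weak self] in
                // startRunning() blocks until the session starts,
                // so keep it off the main queue.
                self?.session.startRunning()
            }
        }

        func stop() {
            sessionQueue.async { [weak self] in
                self?.session.stopRunning()
            }
        }
    }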

What is my phone capable of?


Additional Thoughts

Please feel free to reach out; I appreciate any feedback!
