SCNRecorder

SCNRecorder allows you to record videos and capture images from ARSCNView, SCNView and ARView (RealityKit) without sacrificing performance. It gives you an incredible opportunity to share the media content of your augmented reality app or SceneKit-based game.

Starting with version 2.2.0, SCNRecorder supports Metal only.

Sample

Requirements

Installation

CocoaPods

pod 'SCNRecorder', '~> 2.7'

Carthage

github "gorastudio/SCNRecorder"

Usage

Import the SCNRecorder module.

import SCNRecorder

Call sceneView.prepareForRecording() in viewDidLoad().

@IBOutlet var sceneView: SCNView!

override func viewDidLoad() {
  super.viewDidLoad()
  sceneView.prepareForRecording()
}

Now you can use the new methods to capture videos.

try sceneView.startVideoRecording()
sceneView.finishVideoRecording { (videoRecording) in
  /* Process the captured video. Main thread. */
  let controller = AVPlayerViewController()
  controller.player = AVPlayer(url: videoRecording.url)
  self.navigationController?.pushViewController(controller, animated: true)
}
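
For example, tying recording to a record button might look like the following. This is a minimal sketch; the action names and the outlet wiring are hypothetical and not part of the library.

import AVKit
import SCNRecorder

// Hypothetical actions wired to a record/stop button.
@IBAction func startRecording(_ sender: UIButton) {
  do {
    // Begins writing the frames rendered by the scene view.
    try sceneView.startVideoRecording()
  } catch {
    print("Can't start video recording: \(error)")
  }
}

@IBAction func stopRecording(_ sender: UIButton) {
  sceneView.finishVideoRecording { videoRecording in
    // Called on the main thread with the finished recording.
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: videoRecording.url)
    self.navigationController?.pushViewController(controller, animated: true)
  }
}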

To capture an image, it is enough to call:

sceneView.takePhoto { (photo: UIImage) in
  /* Your photo is now here. Main thread. */
}

or

sceneView.takePhotoResult { (result: Result<UIImage, Swift.Error>) in
  /* Result is here. Main thread. */
}
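
The Result-based variant makes failures explicit. A minimal sketch (saving to the photo library is only an illustration and would require the NSPhotoLibraryAddUsageDescription key):

sceneView.takePhotoResult { (result: Result<UIImage, Swift.Error>) in
  switch result {
  case .success(let photo):
    // Use the captured image, e.g. save it to the photo library.
    UIImageWriteToSavedPhotosAlbum(photo, nil, nil, nil)
  case .failure(let error):
    print("Can't take a photo: \(error)")
  }
}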

Look at the Example project for more details.

Audio capture
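
Whichever view you record from, capturing audio requires the NSMicrophoneUsageDescription key in the app's Info.plist. You may also want to request microphone access up front; a minimal sketch using the standard AVFoundation API:

import AVFoundation

// NSMicrophoneUsageDescription must be present in Info.plist.
AVCaptureDevice.requestAccess(for: .audio) { granted in
  guard granted else {
    print("Microphone access denied; videos will be recorded without sound")
    return
  }
  // Safe to start an audio-enabled recording now.
}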

ARSCNView

To capture video with audio from ARSCNView enable audio in the ARConfiguration.

let configuration = ARWorldTrackingConfiguration()
configuration.providesAudioData = true
sceneView.session.run(configuration)

SCNView

To capture audio from SCNView you have to implement it by yourself.

var captureSession: AVCaptureSession?

override func viewDidLoad() {
  super.viewDidLoad()
  sceneView.prepareForRecording()
  
  guard let recorder = sceneView.recorder else { return }
  let captureSession = AVCaptureSession()
  
  guard let captureDevice = AVCaptureDevice.default(for: .audio) else { return }
  
  do {
    let captureInput = try AVCaptureDeviceInput(device: captureDevice)
    
    guard captureSession.canAddInput(captureInput) else { return }
    captureSession.addInput(captureInput)
  }
  catch { print("Can't create AVCaptureDeviceInput: \(error)") }
  
  guard captureSession.canAddRecorder(recorder) else { return }
  captureSession.addRecorder(recorder)
  
  captureSession.startRunning()
  self.captureSession = captureSession
}

or, simply

var captureSession: AVCaptureSession?

override func viewDidLoad() {
  super.viewDidLoad()
  sceneView.prepareForRecording()
  
  captureSession = try? .makeAudioForRecorder(sceneView.recorder!)
}

Music Overlay

Instead of capturing audio with the microphone, you can play music and mix it into the video at the same time.


let audioEngine = AudioEngine()

override func viewDidLoad() {
  super.viewDidLoad()
  
  sceneView.prepareForRecording()
  do {
    audioEngine.recorder = sceneView.recorder
    
    // If true, use sound data from audioEngine if any
    // If false, use sound data from ARSession/AVCaptureSession if any
    sceneView.recorder?.useAudioEngine = true
    
    let player = try AudioEngine.Player(url: url)
    audioEngine.player = player
    
    player.play()
  }
  catch {
    print(error)
  }
}

RealityKit

To support recording from RealityKit, copy the ARView+MetalRecordable.swift and ARView+SelfSceneRecordable.swift files into your project. Then look at RealityKitViewController.swift for usage.
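
Once the files are copied, recording is expected to follow the same pattern as with SCNView. A minimal sketch, assuming the copied extensions expose the same prepareForRecording() and video recording API on ARView (check RealityKitViewController.swift for the exact calls):

import RealityKit
import SCNRecorder
import UIKit

class ViewController: UIViewController {

  @IBOutlet var arView: ARView!

  override func viewDidLoad() {
    super.viewDidLoad()
    // Assumed to be provided by the copied ARView+SelfSceneRecordable.swift extension.
    arView.prepareForRecording()
  }

  func startRecording() {
    do {
      try arView.startVideoRecording()
    } catch {
      print("Can't start video recording: \(error)")
    }
  }
}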

That's it!

Look at the Example project for more details.

Author

Thanks to Fedor Prokhorov and Dmitry Yurlov for testing, reviewing and inspiration.

GORA Studio

Made with magic 🪄 at GORA Studio

License

This project is licensed under the MIT License - see the LICENSE file for details.