FDSoundActivatedRecorder


Start recording when the user speaks. All you have to do is tell us when to start listening. Then we wait for an audible noise and start recording. This is mostly useful for user speech input, such as after a "Start talking now" prompt.

:pizza: Virtual tip jar: https://amazon.com/hz/wishlist/ls/EE78A23EEGQB


Installation

Add this to your project using Swift Package Manager. In Xcode that is simply: File > Swift Packages > Add Package Dependency... and you're done. Alternative installation options are shown below for legacy projects.
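
If you declare the dependency in a Package.swift manifest instead of through Xcode, a minimal sketch might look like the following; the package name and the version requirement are assumptions, so adjust them to the release you want:

// swift-tools-version:5.3
// A sketch only; "MyApp" and the "from" version are placeholders.
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        .package(url: "https://github.com/fulldecent/FDSoundActivatedRecorder.git", from: "1.0.0"),
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["FDSoundActivatedRecorder"]),
    ]
)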

CocoaPods

If you are already using CocoaPods, just add pod 'FDSoundActivatedRecorder' to your Podfile, then run pod install.
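
For example, a Podfile might look like this; the target name is a placeholder and the version pin matches the Usage section below:

# A sketch; 'MyApp' is a placeholder target name
use_frameworks!

target 'MyApp' do
  pod 'FDSoundActivatedRecorder', '~> 1.0.0'
end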

Carthage

If you are already using Carthage, just add to your Cartfile:

github "fulldecent/FDSoundActivatedRecorder" ~> 0.1

Then run carthage update to build the framework and drag the built FDSoundActivatedRecorder.framework into your Xcode project.

Usage

First, install the library using one of the options above, for example by adding pod 'FDSoundActivatedRecorder', '~> 1.0.0' to your Podfile.

Import the project with:

import FDSoundActivatedRecorder

Then begin listening with:

self.recorder = FDSoundActivatedRecorder()
self.recorder.delegate = self
self.recorder.startListening()

A full implementation example is provided in this project.
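
For a quick orientation, here is a minimal sketch of a view controller that adopts the delegate protocol; the class name is illustrative, and the delegate methods mirror the Full API section below.

import UIKit
import FDSoundActivatedRecorder

// A minimal sketch; RecordViewController is an illustrative name and the
// delegate methods mirror the Full API section below.
class RecordViewController: UIViewController, FDSoundActivatedRecorderDelegate {
    let recorder = FDSoundActivatedRecorder()

    override func viewDidLoad() {
        super.viewDidLoad()
        recorder.delegate = self
        recorder.startListening()
    }

    func soundActivatedRecorderDidStartRecording(recorder: FDSoundActivatedRecorder) {
        print("Recording started")
    }

    func soundActivatedRecorderDidTimeOut(recorder: FDSoundActivatedRecorder) {
        print("Listening timed out with no recording")
    }

    func soundActivatedRecorderDidAbort(recorder: FDSoundActivatedRecorder) {
        print("Listening or recording was aborted")
    }

    func soundActivatedRecorderDidFinishRecording(recorder: FDSoundActivatedRecorder, andSaved file: NSURL) {
        print("Recording saved to \(file)")
    }
}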

If your app is in the App Store, I would much appreciate it if you could add your app to https://www.cocoacontrols.com/controls/fdsoundactivatedrecorder under "Apps using this control" and "I Use This Control".

Regular Recorder

If you want to use it as a regular recorder, without trimming the audio, follow these steps (a wired-up sketch follows the list):

  1. Begin listening:
self.recorder.startListening()
  2. Begin recording:
self.recorder.startRecording()
  3. Finally, you can stop recording using the following method:
self.recorder.stopAndSaveRecording()
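
A minimal sketch of that manual flow wired to buttons; the class and action names are illustrative:

import UIKit
import FDSoundActivatedRecorder

// A sketch of the manual (non-sound-activated) flow above.
class ManualRecordViewController: UIViewController {
    let recorder = FDSoundActivatedRecorder()

    override func viewDidLoad() {
        super.viewDidLoad()
        recorder.startListening()            // 1. begin listening
    }

    @IBAction func recordTapped(_ sender: UIButton) {
        recorder.startRecording()            // 2. begin recording manually
    }

    @IBAction func stopTapped(_ sender: UIButton) {
        recorder.stopAndSaveRecording()      // 3. stop and save
    }
}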

Full API

The full API, from FDSoundActivatedRecorder.swift, is copied below:

@objc protocol FDSoundActivatedRecorderDelegate {
    /// A recording was triggered or manually started
    func soundActivatedRecorderDidStartRecording(recorder: FDSoundActivatedRecorder)

    /// No recording has started or been completed after listening for `TOTAL_TIMEOUT_SECONDS`
    func soundActivatedRecorderDidTimeOut(recorder: FDSoundActivatedRecorder)

    /// The recording and/or listening ended and no recording was captured
    func soundActivatedRecorderDidAbort(recorder: FDSoundActivatedRecorder)

    /// A recording was successfully captured
    func soundActivatedRecorderDidFinishRecording(recorder: FDSoundActivatedRecorder, andSaved file: NSURL)
}

class FDSoundActivatedRecorder : NSObject {
    /// A log-scale reading between 0.0 (silent) and 1.0 (loud), nil if not recording
    dynamic var microphoneLevel: Double

    /// Receiver for status updates
    weak var delegate: FDSoundActivatedRecorderDelegate?

    /// Listen and start recording when triggered
    func startListening()

    /// Go back in time and start recording `RISE_TRIGGER_INTERVALS` ago
    func startRecording()

    /// End the recording and send any processed & saved file to `delegate`
    func stopAndSaveRecording()

    /// End any recording or listening and discard any recorded files
    func abort()

    /// This is a PRIVATE method but it must be public because a selector is used in NSTimer (Swift bug)
    func interval()
}
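
For example, microphoneLevel can be read while listening or recording to drive a simple level meter. A minimal sketch, assuming a hypothetical UIProgressView outlet named levelMeter, a stored Timer property named meterTimer, and the recorder instance from the Usage section:

// Poll microphoneLevel roughly 20 times per second while listening or recording.
// levelMeter and meterTimer are hypothetical properties of your view controller.
meterTimer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
    guard let self = self else { return }
    self.levelMeter.progress = Float(self.recorder.microphoneLevel)
}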

Technical discussion

This library is tuned for human speech detection using Apple retail iOS devices in a quiet or noisy environment. You are welcome to tune the audio detection constants of this program for any special needs you may have. The following is a technical description of how the algorithm works, from FDSoundActivatedRecorder.swift.

V               Recording
O             /-----------\
L            /             \Fall
U           /Rise           \
M          /                 \
E  --------                   --------
   Listening                  Done
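
Conceptually, the trigger behaves like a small state machine over averaged level readings: a rise above the ambient baseline starts recording, and a fall back toward the baseline ends it. The sketch below only illustrates that idea; the names and threshold values are assumptions, not the library's actual constants.

// Illustrative only: names and thresholds are assumptions, not the library's constants.
enum Phase { case listening, recording, done }

var phase = Phase.listening

func process(averagedLevel level: Double, baseline: Double) {
    switch phase {
    case .listening where level > baseline + 0.10:
        phase = .recording        // the "Rise" edge in the diagram above
    case .recording where level < baseline + 0.05:
        phase = .done             // the "Fall" edge in the diagram above
    default:
        break
    }
}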

Sponsorship

[ YOUR LOGO HERE ]

Please contact github.com@phor.net to discuss adding your company logo above and supporting this project.