In this tutorial you will have fun learning how to use the AVAudioRecorder and AVAudioPlayer classes. In an iOS project, an instance of the AVAudioRecorder class lets you give the app user the ability to record a sound. An instance of the AVAudioPlayer class lets you give the app user the ability to play, pause, and stop a recorded sound file.

Tutorial Revision Notes
  • The tutorial was updated to fix the issue of the AudioDiary app not working on a real device.
  • The code presented on this page assumes you are using Xcode 7.1 and Swift 2, so a newer version of Swift may produce errors. Fix them with Xcode’s Fix-it tool.
  • I assume you aren’t a newbie to the Swift programming language and that you know your way around the Xcode integrated development environment.
  • The icons used in the Xcode project were downloaded for free from the icons8 site.

Here is the link to download the Xcode project, AudioDiary, that I created for this tutorial:

download-excodeproj

The tutorial has three parts, and today you will work on part 1 to implement these features in the project.

audiodiary-screen1

The Recorder View is shown every time the app user launches the app. You’ll add code in the RecorderViewController.swift class file to enable the app user to perform these tasks:

  • Record a sound by clicking the record button
  • Stop the recorder by clicking the stop button

The AVAudioRecorder Class

You will create an object called audioRecorder from the AVAudioRecorder class. You will use two functions of the audioRecorder object: one to record a sound, the other to stop the recorder.

The UIProgressView Class

You will create an object called progressView from the UIProgressView class. You will use it to show recording progress.

The NSTimer Class

You’ll create an object called progressViewTimer from the NSTimer class. You’ll use it to animate the progressView object.

The Navigation Bar Button

When the app user taps the navigation bar button, the app segues the user to the Recordings View.
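If the navigation bar button is hooked to an IBAction instead of being connected directly to a storyboard segue, the action could trigger the segue with code like this minimal sketch (the recordingsButtonClicked name and the showRecordings segue identifier are placeholders, not names taken from the project):

    @IBAction func recordingsButtonClicked() {
        // Placeholder segue identifier; set the real identifier on the segue in the storyboard
        performSegueWithIdentifier("showRecordings", sender: self)
    }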

The RecorderViewController Code

Let us get to work adding code to the RecorderViewController.swift class file. Start by using Xcode’s Source Control menu to create a new branch from the master branch and name it branch-one. Next, modify the top portion of the RecorderViewController class so it looks like this:

audiodiary-code1

You imported the AVFoundation module in the file because you’ll be using the AVAudioRecorder class in the RecorderViewController class. You conformed the RecorderViewController class to the AVAudioRecorderDelegate protocol to prevent the compiler from complaining when you use the AVAudioRecorder class to create an audioRecorder object and use its functions and properties in the RecorderViewController class. You declared a recordingSession object variable from the AVAudioSession class. Finally, you declared an audioRecorder object variable from the AVAudioRecorder class.
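Here is a minimal sketch of what that top portion might look like, based on the properties the rest of the file uses; the exact declarations and outlet connections in the downloadable project may differ slightly:

import UIKit
import AVFoundation

class RecorderViewController: UIViewController, AVAudioRecorderDelegate {

    // Connected to the progress bar in the storyboard
    @IBOutlet weak var progressView: UIProgressView!

    var recordingSession: AVAudioSession!
    var audioRecorder: AVAudioRecorder!
    var progressViewTimer: NSTimer!
    var progressCounter: Float = 0.0
    var soundFileName = ""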

Now, modify the remaining code in the RecorderViewController.swift file so it looks like this:

    override func viewDidLoad() {
        super.viewDidLoad()
        
        self.view.backgroundColor = UIColor(patternImage: UIImage(named:"wallpaper.png")!)
        progressView.tintColor = UIColor(red: 1.0, green: 0.02, blue: 0.00, alpha: 1.0)
        progressView.progress = progressCounter
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func recordButtonClicked() {
        progressCounter = 0.0
        setupAudioRecorder()
        audioRecorder.record()
        
        if progressViewTimer != nil {
         progressViewTimer.invalidate()
        }
        
        progressViewTimer = NSTimer.scheduledTimerWithTimeInterval(0.1,
            target:self,
            selector:"updateProgressView",
            userInfo:nil,
            repeats:true)
    }

    @IBAction func stopButtonClicked() {
        if audioRecorder != nil {
            audioRecorder.stop()
            progressViewTimer.invalidate()
            progressView.progress = 0.0
        }
    }

    func updateProgressView() {
        progressCounter += 0.01
        progressView.progress = progressCounter
        
        if progressCounter > 1.0 {
            progressViewTimer.invalidate()
        }
        
        /*** DEBUG STATEMENT ***/
        print("Progress: \(progressCounter)")
    }

    func setupAudioRecorder() {
        recordingSession = AVAudioSession.sharedInstance()
        
       do {
            try recordingSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
            try recordingSession.setActive(true)
            
            let fileManager = NSFileManager.defaultManager()
            let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
            let documentsDirectoryURL: NSURL = urls.first!
            soundFileName = "recording-" + randomString() + ".caf"
            let audioFileURL = documentsDirectoryURL.URLByAppendingPathComponent(self.soundFileName)
            
            /*** DEBUG STATEMENT ***/
            print("Audio File Url\n\(audioFileURL)")
            
            let audioSettings = [AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 12000.0,
                AVNumberOfChannelsKey: 1 as NSNumber,
                AVEncoderAudioQualityKey: AVAudioQuality.High.rawValue]
            
            do {
                audioRecorder = try AVAudioRecorder(URL: audioFileURL, settings: audioSettings)
                audioRecorder.delegate = self
                audioRecorder.prepareToRecord()
            } catch let error as NSError {
                // Recorder not initialized
                 /*** DEBUG STATEMENT ***/
                print(error.localizedDescription)
            }
        } catch let error as NSError {
            print("Recording session failed: \(error.localizedDescription)")
        }
    }

    func randomString() -> String {
        let len: Int = 3 // String length
        let needle : NSString = "0123456789"
        let randomString : NSMutableString = NSMutableString(capacity: len)
        
        for _ in 0 ..< len {
            let length = UInt32(needle.length)
            let rand = arc4random_uniform(length)
            randomString.appendFormat("%C", needle.characterAtIndex(Int(rand)))
        }
        
        return randomString as String
    }
    
    // MARK: - AVAudioRecorder delegates
    
    func audioRecorderDidFinishRecording(recorder: AVAudioRecorder, successfully flag: Bool) {
        let msg = "Your recording was saved as, \(soundFileName)"
        let alert = UIAlertController(title: "Audio Diary", message: msg, preferredStyle: .Alert)
        alert.addAction(UIAlertAction(title: "OK", style: .Default, handler: nil))
        self.presentViewController(alert, animated: true, completion: nil)
        //self.navigationController?.dismissViewControllerAnimated(true, completion: nil)
    }
    
    func audioRecorderEncodeErrorDidOccur(recorder: AVAudioRecorder, error: NSError?) {
        print("\(error!.localizedDescription)")
    }
}

That’s all the code you have to put in the RecorderViewController class. Use the Source Control menu to commit changes made to the class file.

Test The Recorder View Controller Code

Run the app on your real device or on the Simulator. The Recorder View should look like the first image shown below. When you click the Record button, the invisible audioRecorder object starts recording and the progressView animates, as shown in the second image.

audiodiary-screen1 audiodiary-screen2

For example, use the Recorder View to record yourself saying these words into your Mac’s microphone or your headset’s microphone. When you are done, click the stop button.

  • Bread
  • milk
  • toilet paper
  • butter
  • Apple pie
  • orange juice, two bottles
  • Fish
  • Beer and wine
  • tissue
  • chicken wings

How The Recorder View Controller Code Works

It is time to examine how the code you entered in the RecorderViewController class file works.

The viewDidLoad Function

Code you entered in the function does the following:

  • Sets an image as the view’s background.
  • Sets the progressView’s tintColor property so that, when it animates, you see a red progress bar.
  • Initializes the progressView’s progress property.

The Record Button Code

When you click the record button, the recordButtonClicked() function is fired. Code you entered in the function does the following:

  • Resets the progressCounter variable.
  • Calls setupAudioRecorder(), which configures the audioRecorder to save a uniquely named sound file in the application sandbox’s Documents directory.
  • Starts the invisible audioRecorder by calling its record() function.
  • Stops any running progressViewTimer inside an if statement block.
  • Initializes the progressViewTimer object with NSTimer’s scheduledTimerWithTimeInterval() function. That function takes five parameters, one of which is a selector that calls the updateProgressView() function. That function updates the progress property of the progressView and stops the progressViewTimer when progressCounter is greater than 1.0.

The Stop Button Code

When you click the stop button, the stopButtonClicked() function is fired. Code you entered in the function does the following:

  • Stops the invisible audioRecorder by calling its stop() function.
  • Terminates the progressViewTimer by calling its invalidate() function.
  • Resets the progressView’s progress property.

The updateProgressView Function

When called, code in this function does the following:

  • Increments the progressCounter variable.
  • Sets the progressView’s progress property.
  • Terminates the progressViewTimer, inside an if statement block, once progressCounter exceeds 1.0.

The final statement in the updateProgressView function is for debugging purposes: it prints the progressCounter variable’s value in the console.

The setupAudioRecorder Function

This function is the workhorse of the RecorderViewController class. The first statement you entered in the function initializes the recordingSession object. The code in the first do block sets the recording session’s category to play-and-record and activates the session, gets the URL of the sandbox’s Documents directory, builds a unique name for the sound file the audioRecorder object will save there, and appends the soundFileName to the directory URL to form audioFileURL. Next, you print the audioFileURL in the console. Finally, you create an audioSettings dictionary with four key-value pairs.

Now, in the second do block, you configured the audioRecorder object by feeding it the audioFileURL and the audioSettings dictionary you set up in the first do block. Next, you made the RecorderViewController class the audioRecorder’s delegate. The last statement in the second do block prepares the audioRecorder object for recording and creates the sound file at audioFileURL in the sandbox’s Documents directory. The statement in the first catch block prints the error message the audioRecorder object returns. The statement in the second catch block prints the error message the recordingSession object returns.
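The revision notes mention a fix for the app not working on a real device. Although it is not part of the code listed above, it is worth making sure the app has permission to use the microphone before recording. Here is a minimal sketch of how that check might look, assuming it is added inside setupAudioRecorder() right after recordingSession.setActive(true):

            recordingSession.requestRecordPermission { allowed in
                // The completion block may run off the main thread, so hop back to it for any UI work
                dispatch_async(dispatch_get_main_queue()) {
                    if !allowed {
                        // Without microphone access the recorder cannot capture any audio
                        print("Microphone permission was denied")
                    }
                }
            }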

The randomString Function

This function creates a random three-digit string and returns it to its caller. The setupAudioRecorder() function uses it to create a unique sound file name, such as recording-307.caf.

The AVAudioRecorderDelegate Protocol Functions

The final set of code you entered in the RecorderViewController.swift file implements two functions of the AVAudioRecorderDelegate protocol. Implementing the AVAudioRecorderDelegate functions is optional.

audioRecorderDidFinishRecording() – This function is called when the audioRecorder’s stop() function is called, which happens when you click the Recorder View’s stop button. The code you entered in the audioRecorderDidFinishRecording() function shows the soundFileName in an alert view.

audiodiary-screen3

audioRecorderEncodeErrorDidOccur() – This function is called when the audioRecorder throws an encoding error. The code you entered in the function prints the error’s description in the console.

That’s it. You’ve reached the end of the first part of the tutorial. Use the Source Control menu to commit your changes to the git repository. Next week I will bring you part 2 of the tutorial. In the meantime, comments are welcome! 🙂
