iOS Development With Swift Part 7

This is the seventh and final part of the Swift iOS programming tutorial series. The first part of the series is here, and the previous part, tutorial #6, is here. In this tutorial we will finish the app by adding two more buttons to the audio playback screen that change the pitch of the recorded audio, making your voice sound like a chipmunk or like Darth Vader. We will learn about the audio engine and some new classes that work with it to change pitch. Finally, we will close out the series by learning how to run the app on a physical device.

Adding Chipmunk Button

First, let’s start with something we have done many times by now: add the chipmunk button to the second screen, add constraints, and add an action.

  • First, download this image for the chipmunk button.
  • Next, add it to the Xcode image assets and name it chipmunk.
  • Next, add a button, change its image to chipmunk, and place it a little below the slow audio button, aligned with it.
  • Add a vertical constraint so that the chipmunk button sits at a fixed distance below the slow button. You can do that by control-dragging from the chipmunk button to the slow button.
  • Add a constraint to pin it at the same distance from the left edge as the slow button by control-dragging from the button to the left edge.
  • Add an action in the play sounds view controller and name it playChipmunkAudio. You can add a print statement inside it to confirm the function is called and to help with debugging.

How to Change Pitch

Now let’s tackle the harder problem: how to change the pitch of the audio. If we increase the pitch of the audio, the recorded voice will sound like a chipmunk.

  • Have a look at the AVFoundation reference to find a suitable class that can help us change the pitch of the audio. (Hint: search for “pitch” in the documentation and you should land on the correct class.)
  • There is one class in AVFoundation that can change the pitch, and it is AVAudioUnitTimePitch. Looking at the documentation we can see that it has a pitch property. Click on the pitch property and you can see that the default value is 1.0 and that the pitch can be changed within the range -2400 to 2400, so we need to increase the pitch to sound like a chipmunk.
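To get a feel for those numbers: the pitch property is measured in cents, where 100 cents is one semitone and 1200 cents is one octave. The helpers below are my own sketch, written in current Swift syntax and not part of the course code, showing how you might clamp a requested pitch into the documented -2400 to 2400 range and convert semitones to cents:

```swift
// Sketch only, not part of the tutorial's project code.
// AVAudioUnitTimePitch.pitch is measured in cents:
// 100 cents = 1 semitone, 1200 cents = 1 octave,
// and the documented range is -2400 to 2400.

let minPitch: Float = -2400
let maxPitch: Float = 2400

// Clamp a requested pitch (in cents) into the accepted range.
func clampedPitch(_ requested: Float) -> Float {
    return min(max(requested, minPitch), maxPitch)
}

// Convert a shift in semitones to the cents value pitch expects.
func cents(fromSemitones semitones: Float) -> Float {
    return semitones * 100
}

print(clampedPitch(1000))        // the chipmunk value passes through unchanged
print(clampedPitch(5000))        // clamped down to 2400
print(cents(fromSemitones: 12))  // one octave up = 1200 cents
```

This also makes it clear why 1000 is a sensible chipmunk value: it is ten semitones up, slightly less than an octave.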

AVAudioUnitTimePitch & Audio Engine

  • Good, we have found which class to use, but we still need an example of how to actually use it, so let’s tackle that next. Google for “AVAudioUnitTimePitch example” and you should find these articles (Article 1 & Article 2). You will also find a WWDC link. WWDC is the annual Apple developer conference where Apple engineers discuss the latest libraries and frameworks; whenever you want to do something new, always check WWDC for reference videos that can help you. There is a video there called AVAudioEngine in Practice; the Udacity resources suggest watching it from 24:41 to 30:57 later when you have time.
  • Basically, here is the pseudocode in plain English; we will dive into the code soon.
    • Make an object of AVAudioEngine and initialize it. AVAudioEngine is like a container or board: you attach nodes/components to it and connect them to do complex audio processing. Nodes are smaller components that attach to the audio engine, and by connecting them inside the engine you can build complex configurations for processing audio, such as mixing audio or adding multiple input or output sources.
    • Create a player node and attach it to audio engine. Player node is the component which will be able to play the audio.
    • Create a time pitch node (AVAudioUnitTimePitch) and set its pitch property to 1000 to make the voice sound like a chipmunk. Then attach it to the audio engine.
    • Now the concept: inside the audio engine we will connect the player node to the pitch node, and connect the pitch node in turn to the output node (the speakers). The WWDC video explains this in more detail, but once you complete all the connections from the player node through to the output node and start everything, the audio flows automatically: the player node holds the audio, so it acts as the input to the time pitch node; the pitch node processes the incoming audio (in our case, raising its pitch) and sends it to its output; and that output in turn feeds the output node, the speakers, so we hear the transformed audio from the device.
    • We will connect the player node to the pitch node.
    • We will then connect the pitch node to the output node.
    • Finally, we will start the audio engine and the player node, and we should hear the processed audio from the device speakers.

Audio Engine Code

Now that we understand roughly how things will work, let’s start coding. We will make all the changes below in PlaySoundsViewController.swift.

  • First, declare objects for AVAudioEngine and AVAudioFile below where we declared audioPlayer and our model, just before the viewDidLoad function. We need an AVAudioFile object because the player node will not accept a URL; it needs an AVAudioFile in order to play.
    var audioEngine:AVAudioEngine!
    var audioFile:AVAudioFile!
  • Next, in viewDidLoad, initialize the audio engine and audio file objects. For the audio file we pass in the recorded file’s URL.
    audioEngine = AVAudioEngine()
    audioFile = AVAudioFile(forReading: receivedAudio.filePathUrl, error: nil)
  • We will create a function that plays the audio at a given pitch value. We are making it a separate function because we will need the same code, with a different pitch, for the Darth Vader voice. Inside playChipmunkAudio, call it with a pitch of 1000:

    playAudioWithVariablePitch(1000)
  • Next, add the playAudioWithVariablePitch function as given below.
    func playAudioWithVariablePitch(pitch: Float) {
        audioPlayer.stop()
        audioEngine.stop()
        audioEngine.reset()

        var audioPlayerNode = AVAudioPlayerNode()
        audioEngine.attachNode(audioPlayerNode)

        var changePitchEffect = AVAudioUnitTimePitch()
        changePitchEffect.pitch = pitch
        audioEngine.attachNode(changePitchEffect)

        audioEngine.connect(audioPlayerNode, to: changePitchEffect, format: nil)
        audioEngine.connect(changePitchEffect, to: audioEngine.outputNode, format: nil)

        audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
        audioEngine.startAndReturnError(nil)

        audioPlayerNode.play()
    }
  • We have already gone through the above code in pseudocode form, but let me go over it briefly again.
    • The first three lines stop any audio player that is already running and reset the audio engine so we start from a clean state.

      audioPlayer.stop()
      audioEngine.stop()
      audioEngine.reset()
    • The next two lines make a player node and attach it to the audio engine.

      var audioPlayerNode = AVAudioPlayerNode()
      audioEngine.attachNode(audioPlayerNode)
    • The next three lines make a time pitch node, set its pitch property to the value passed in, and attach it to the audio engine.

      var changePitchEffect = AVAudioUnitTimePitch()
      changePitchEffect.pitch = pitch
      audioEngine.attachNode(changePitchEffect)
    • The next two lines make the proper connections within the audio engine between the player node, the pitch node, and the output node.

      audioEngine.connect(audioPlayerNode, to: changePitchEffect, format: nil)
      audioEngine.connect(changePitchEffect, to: audioEngine.outputNode, format: nil)
    • Finally, in the last lines, the audio file is scheduled on the player node, the engine is started, and the player node is started so processing can begin.

      audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
      audioEngine.startAndReturnError(nil)

      audioPlayerNode.play()
    • Now run the app and record your voice. When you tap the chipmunk icon you will hear your voice at a higher pitch, which makes it sound like a chipmunk.

Adding Darth Vader Voice

Now follow the same sequence we used for the chipmunk button: add a Darth Vader button with this image, add constraints to align it to the right of the chipmunk button, and add an action. Where the chipmunk button called our playAudioWithVariablePitch function with a pitch of 1000, the Darth Vader action should call playAudioWithVariablePitch with a value of -1000. Run the app to complete it.
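Since both buttons end up calling the same playAudioWithVariablePitch function with different values, one option (my own suggestion, not from the course code) is to keep the two pitch offsets in a single lookup table, so the numbers live in one place:

```swift
// Sketch only: pitch offsets (in cents) for each voice effect.
// +1000 raises the voice (chipmunk), -1000 lowers it (Darth Vader).
let effectPitches: [String: Float] = [
    "chipmunk": 1000,
    "vader": -1000,
]

// Look up the pitch for a named effect; nil if the name is unknown.
func pitch(forEffect name: String) -> Float? {
    return effectPitches[name]
}

// Each button's action would then do something like:
//   playAudioWithVariablePitch(pitch(forEffect: "chipmunk")!)
print(pitch(forEffect: "chipmunk")!)  // 1000.0
print(pitch(forEffect: "vader")!)     // -1000.0
```

This is purely a refactoring idea; wiring each button directly to a call with a literal value, as the tutorial does, works just as well.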

Bonus: Run the App on an Actual Device

Now you have the app running on the simulator, but it is always a good idea to make sure the application works on an actual device; every app’s development has a stage where it is deployed and tested on compatible devices. Also, it is cool to have an app you created yourself on your iPad to show off to your friends.

  • In order to run the application on an actual device you will need an Apple developer account. Developing in Xcode and testing on the simulator is free, but if you want to run your app on a physical device or release it to the App Store you will need a yearly Apple Developer Program subscription. It costs $99/year; you can find out more about the program or register here.
  • If you have a developer account, download this PDF provided by the Udacity instructor to learn how to set up Xcode to run your app on the device.
  • If you are able to set up Xcode to run the app on your device, you might also consider TestFlight, since it makes it easy to track multiple testers. You will still need to set up Xcode by following the PDF above, but TestFlight can track a lot more and makes it easier to release builds for beta testing. Don’t worry if you don’t like TestFlight; the procedure in the PDF works on its own. With TestFlight, once you have defined the provisioning profile and set up Xcode, all you have to do is make an archive for distribution and upload the build using their app or website; the build is then automatically delivered to everyone you have added on TestFlight, and they are notified.

Conclusions

This concludes your application. Feel free to ask any questions if my instructions are confusing. Also, don’t forget that these instructions are only companion notes to the Swift course videos, so watch the videos first and use these instructions if you get stuck or don’t understand something. Use the Udacity Swift course forums for help, and if you get stuck or need my help, don’t hesitate to add a comment to these tutorials or use the contact form; I will try to help you to the best of my abilities. Also, feel free to make this app your own by adding more features, or work on your own idea and start building your own apps.

Also, if you find any mistakes, want to send suggestions for this tutorial, want me to write more detail for some part of the series, or want me to write a tutorial on something you are interested in, feel free to contact me using the contact form or drop me an email at jawad@ndataconsulting.com. I have experience in Swift and Objective-C programming on iOS, so if you have work, drop me an email at my personal address above or contact my company at info@ndataconsulting.com. Also, do check out my company, nDataConsulting, and the work we do in our portfolio here.
