ShazamKit in iOS

October 05, 2021 · Published by EXPERT APP DEVS · 2 min read

Prerequisites for implementing ShazamKit

  • Requires an Apple developer account and a physical device running iOS 15
  • Requires Xcode 13

What is ShazamKit?

  • Shazam was introduced in iOS and iPadOS 14.2 in Control Center under the name “Music Recognition”.
  • ShazamKit was introduced in iOS and iPadOS 15 for developers.
  • ShazamKit exposes Shazam’s audio-recognition technology as a framework.
  • ShazamKit converts audio into a unique signature and searches for a match.
  • Matching works in one direction only: a signature cannot be converted back into audio.
  • ShazamKit can also create signature references for your own searchable audio.
  • The diagram below shows how an audio signature is matched.
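As a sketch of how a signature reference for your own audio might be generated (the `SHSignatureGenerator` API is part of ShazamKit on iOS 15; the file URL and helper name here are illustrative assumptions, not from the article):

```swift
import ShazamKit
import AVFoundation

// Hypothetical helper: read a reference audio file and produce a
// ShazamKit signature that could later be added to a custom catalog.
func makeReferenceSignature(from fileURL: URL) throws -> SHSignature {
    let audioFile = try AVAudioFile(forReading: fileURL)
    let format = audioFile.processingFormat
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(audioFile.length)) else {
        throw NSError(domain: "SignatureDemo", code: -1)
    }
    try audioFile.read(into: buffer)

    let generator = SHSignatureGenerator()
    // Append the audio to the generator; ShazamKit turns it into a signature.
    try generator.append(buffer, at: nil)
    return generator.signature()
}
```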

[Image: signature match]

ShazamKit Working flow diagram 

[Image: ShazamKit working flow diagram]

How to implement ShazamKit

  1. Create a project with a unique bundle identifier.
  2. Register your App ID in the developer console and enable ShazamKit under App Services.
  3. Add the microphone usage permission to the Info.plist file.
  4. Write the following code in AppDelegate.swift.

// Import AVFoundation
import AVFoundation

// Add record-permission code in didFinishLaunchingWithOptions
if AVAudioSession.sharedInstance().recordPermission != .granted {
    AVAudioSession.sharedInstance().requestRecordPermission { (isGranted) in
        print("Microphone permissions \(isGranted)")
    }
}
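For step 3, the Info.plist entry might look like the following (the description string shown to the user is a placeholder):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to recognise music with ShazamKit.</string>
```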

  5. Now write code in ViewController.swift
  • Import the following dependencies

import ShazamKit

import AVFoundation

  • Create variables in ViewController

let session = SHSession()
let audioEngine = AVAudioEngine()
var lastMatchID: String = ""


  • Create functions to start and stop audio analysis

func startAnalysingAudio() {
    let inputNode = audioEngine.inputNode
    let bus = 0
    inputNode.installTap(onBus: bus, bufferSize: 2048, format: inputNode.inputFormat(forBus: bus)) { (buffer: AVAudioPCMBuffer, time: AVAudioTime) -> Void in
        // Stream each captured buffer to the ShazamKit session for matching
        self.session.matchStreamingBuffer(buffer, at: time)
    }
    audioEngine.prepare()
    try? audioEngine.start()
}

func stopAnalysingAudio() {
    let inputNode = audioEngine.inputNode
    let bus = 0
    inputNode.removeTap(onBus: bus)
    audioEngine.stop()
}
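On a real device, the shared AVAudioSession generally needs to be configured and activated before the engine starts. A minimal sketch (this helper is an addition, not part of the original article):

```swift
import AVFoundation

// Assumption: configure and activate the audio session for recording
// before calling startAnalysingAudio().
func activateAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.record, mode: .default)
        try audioSession.setActive(true)
    } catch {
        print("Failed to activate audio session: \(error)")
    }
}
```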

  6. Now start and stop analysing audio
  • Set the session delegate in viewDidLoad

session.delegate = self

  • Write the code below to start analysing audio

startAnalysingAudio()

  • Write the code below to stop analysing audio

stopAnalysingAudio()

  7. Now add the ShazamKit session delegate protocols
  • Create an extension for the ShazamKit delegate

extension ViewController: SHSessionDelegate {}

  • Add the protocol methods to the extension

func session(_ session: SHSession, didFind match: SHMatch) {
    if let matchedItem = match.mediaItems.first,
       let title = matchedItem.title,
       let artist = matchedItem.artist,
       let matchId = matchedItem.shazamID,
       matchId != lastMatchID {
        lastMatchID = matchId
        DispatchQueue.main.async {
            print("I am listening: \(title) by \(artist) - via ShazamKit")
        }
    }
}

func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
    if let error = error {
        print(error)
    }
}
