Why did Apple spend $400M to acquire Shazam?

Posted in iPhone, edited September 2018
Rumors that Apple was seeking to acquire UK firm Shazam Entertainment Ltd. first floated back in December. Following an EU approval process, Apple officially announced it had finalized the purchase on Monday, ostensibly to "provide users even more great ways to discover, experience and enjoy music." That appears to be a major understatement.

Shazam
Shazam discovers music, but that's not a feature worth $400 million

Shazam is one of the oldest apps in the store

It might be hard to get excited about Apple acquiring one of the first apps to ever appear in the iPhone App Store way back in 2008. But Shazam isn't just as old as the iOS App Store; it's about as old as Mac OS X itself, having been founded in 1999 and first launched in 2002 as a dialup service for identifying songs.

Sixteen years ago, UK users could dial the short code 2580 (the keypad's middle column of digits) on their Nokia to reach the service, which would listen for 30 seconds, then disconnect, process the recording, and text the user back with the song title and artist. The service launched in the U.S. in 2004, costing about $1 per identification. When it arrived as a free iPhone app in 2008, it presented identified songs in iTunes, harvesting an affiliate cut of suggested download purchases.

Over the last decade, Apple has regularly promoted and partnered with Shazam, calling it one of the top ten apps for iOS back in 2013 and integrating it with Siri in 2014. The next year it showed off Shazam working on the Apple Watch, displaying a song's lyrics in real time.

Shazam iOS 8 Siri
Shazam has been integrated into Siri since iOS 8


Shazam remains wildly popular today across a variety of platforms, including the Mac. However, given the competition in the space, music identification is not an exclusive feature Apple can simply buy, nor is it compelling enough on its own to keep users from defecting or to convince anyone to switch.

So while Shazam integrated itself with Snapchat a couple of years ago, that hasn't stopped Facebook's Instagram Stories from ransacking Snapchat's users and popularity. Similarly, Google developed its own Shazam-like service for identifying ambient songs automatically on Pixel 2 phones, but few in Android-land cared enough about that -- or any of Pixel's other features -- to pay a premium for the device.

Why buy Shazam?

On top of the dubious value of basic music identification, Apple also isn't known for racking up scores of acquisitions willy-nilly just to blow money. Note that's in contrast to Google and Microsoft, each of which incinerated $15 billion on a pair of outlays that were later written off in the manner of a millionaire playboy totaling his new Lamborghini before walking away with just a casual smirk.

Virtually every one of Apple's recent acquisitions can be directly linked to the launch of serious, significant new features or to embellishing core initiatives designed to help sell its hardware, including Face ID (Faceshift, Emotient, and Perceptio); Siri (VocalIQ); Photos and CoreML (Turi, Tuplejump, Lattice Data, Regaind); Maps (Coherent Navigation, Mapsense, and Indoor.io); wireless charging (PowerbyProxi) and so on.


Apple's Shazam acquisition was close to the price it paid for AuthenTec, which netted Touch ID


Further, the reported $400 million price tag on the Shazam acquisition puts it in a rare category of large purchases that Apple has made which involved revolutionary changes to its platforms. Only Anobit (affordable flash storage), AuthenTec (Touch ID), PrimeSense (TrueDepth imaging) and NeXT itself are in the same ballpark apart from Beats-- Apple's solitary, incomparably larger $3 billion purchase that delivered both the core of Apple Music and an already profitable audio products subsidiary paired with a popular, global brand.

What about Shazam's "ways to discover, experience and enjoy music" could possibly be worth the amount of money Apple spent to acquire it?

Shazam's answer to Flow and FireFly: a visual onramp to Augmented Reality

Rather than basic music discovery, the purchase is much more likely related to a key initiative at Shazam distinct from the song identification feature it's most famous for, as Mike Wuerthele first noted for AppleInsider in December.

Back in 2015, Shazam announced an effort to move beyond audio identification using a microphone-- largely offered to users as a music identifying service-- to using smartphone cameras to visually identify items. However, rather than simply identifying objects, Shazam's visual recognition system was developed as a platform for marketers seeking to engage with audiences.

Shazam It
Shazam launched visual recognition in 2015, aimed at marketing engagement


A year earlier, Amazon debuted its FireFly service to perform a similar sort of "visual Shazam," except that Amazon's aim was to use FireFly as a way to sell its then-new Fire Phone.

FireFly was based on Flow, an iPhone app originally launched by Amazon's A9 search subsidiary back in 2011. Flow was designed to recognize millions of products by scanning their barcodes.

Amazon A9 Flow app for iOS
The 2011 Amazon A9 Flow app for iOS identified products using the camera


Years later, Amazon presented "FireFly" as a way for Fire Phone users to similarly point their camera at a product (in the demo, the same jar of Nutella that Flow was promoted as recognizing) to identify it and subsequently order it from Amazon. FireFly was also designed to recognize other data such as phone numbers on a business card.

FireFly
Amazon's Jeff Bezos demonstrated FireFly as if it were an entirely new feature of the Fire Phone, rather than a three-year-old port of an existing iPhone app


Amazon's camera-based FireFly was nearly identical in concept to Shazam's use of a microphone to identify a song and then take the user to iTunes to purchase it. But while Shazam had been wildly popular as a free app for identifying songs, the Fire Phone ended up (for several reasons) a huge flop as a somewhat-expensive new phone aimed primarily at making it easier to shop on Amazon.

When Shazam introduced its own visual recognition service the following year, rather than tying it into a store to suggest referral purchases, it launched the service with a series of marketing partners looking for new ways to get audiences to interact with their brands. From Disney and Warner Bros. to Target to a variety of book and periodical publishers and other product brands, Shazam enabled interactive campaigns built around a "Shazam tag."

Once a user identified a code with the Shazam app (or it recognized an image such as a book or album cover), the tag worked like a simple URL: it could open a marketing website, take them to a movie preview, or even get them shopping in Target's online store. At that point, Shazam had little more than a proprietary QR Code embellished with some visual recognition features.

Even so, in 2016, Shazam's new foray into marketing was compelling enough to raise $30 million in new funding from investors, giving the company a unicorn valuation of $1 billion. The fact that Apple subsequently paid "only" $400 million for it makes the deal sound like a bargain.

Last year, Shazam took an additional step, embracing Augmented Reality. Now, rather than just taking users to a standard website, it could use its Shazam Codes (or visual recognition of products or posters) to launch an engaging experience right in the camera, layering what the camera sees with "augmented" graphics synced to the movement of the user's device.

Users could now identify a bottle of Bombay Sapphire gin (below) and watch animated botanicals bloom in front of them, alongside suggested cocktail recipes.




Shazam's AR campaign for Bombay Sapphire


Another Shazam campaign in Australia, for Disney's "Guardians of the Galaxy Vol. 2," delivered a Spotify playlist "mixtape" along with the movie trailer and an opportunity to buy tickets. A campaign in Spain let users animate Fanta billboards in AR using their phones. And a Hornitos tequila campaign used a mini-game, rendered in ARKit, to award discounts on purchases.

Given Apple's interest in building traction for ARKit, which launched last fall as the world's largest AR platform, it seems pretty clear that Apple bought Shazam not really for any particular technology-- Apple has already developed its own core visual recognition engine for iOS-- but because Shazam has built significant relationships with global brands that use AR as a way to engage with audiences.

Apple's iAd, part two

Back in 2010, Steve Jobs introduced iAd as an initiative to help iOS developers monetize free apps. Rather than just putting up conventional banner ads-- which could only pull users out of their apps and throw them into the browser to display a marketing website-- iAd was intended to launch a mini-world of marketing content users could explore, then leave to return to the app they were using.

Apple hoped to build a legitimate ad business that didn't rely on tracking users, giving developers a better way to monetize their apps and giving end users advertising that respected their privacy and kept navigation straightforward.

Within an iAd experience hosted as a secured HTML5 container running in parallel to the app, users could explore 3D graphical images, customize a product, and even engage in mini-games.


Steve Jobs' vision for iAd was a better ad experience, but advertisers wanted to keep tracking users the way Google had enabled


Apple's iAd was a novel idea, but advertisers hated the fact that Apple erected barriers to stop marketers from tracking users and collecting data on them the way Google and other ad networks allowed. That eventually killed iAd, after a brutal series of articles from ad-world figures bashed it as too expensive, then too cheap-- all a distraction from the real reason ad agencies hated iAd so much: it reined in their ability to spy on users, which had already become the standard basis for setting ad rates and charging advertisers.

Apple didn't leave the advertising business entirely -- it was still in the business of merchandising content in its own iTunes and App Store, and it later added search advertising to its storefronts. However, it no longer had a way to closely interact with brands beyond giving them a development platform to build their own marketing apps.

With the development of ARKit, Apple has now created a new "mixed reality" world of app experiences that mesh right into the real world. At its last two WWDC events, Apple has introduced various games as primary examples of using ARKit. However, Shazam has already developed marketing campaigns that take advantage of ARKit to build engaging experiences-- very similar to the core concept of iAd many years ago.
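
For a sense of what such a campaign involves on the developer side, here is a minimal, hypothetical sketch of an ARKit world-tracking session rendered through SceneKit-- the kind of session a Shazam-style experience layers branded content into. The view controller name and the placeholder sphere are illustrative assumptions, not Shazam's actual implementation.

```swift
import UIKit
import ARKit

// Minimal sketch of an ARKit world-tracking session, assuming an ARSCNView
// named `sceneView` wired up in a storyboard. A marketing campaign would
// replace the placeholder geometry with branded 3D content.
class ARCampaignViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track the device's position and orientation in the real world,
        // detecting horizontal surfaces to anchor virtual content on.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // Drop a simple placeholder node wherever ARKit finds a plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        node.addChildNode(SCNNode(geometry: SCNSphere(radius: 0.02)))
    }
}
```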

Using Shazam's AR as an iAd-like brand engagement tool

This starts to explain why Apple paid out a significant sum to own Shazam: it's perhaps one of the most valuable applications of Apple's ARKit technology outside of games. Shazam's business is already established across a series of major brands, ranging from Coca-Cola and Pepsi to studios from Disney to Fox.

With Shazam inside Apple, iOS can further expand its integration of visual recognition features into the camera, something it has already started doing.

In 2017, Apple introduced intelligent support for QR Codes -- simply open the Camera app and point it at one, and iOS will interpret the link to provide a button to open a website, or even to configure something such as the password for a local Wi-Fi network.


Last year the Jamf Nation user conference took advantage of QR Code support in iOS 11 to make it easy to join its Wi-Fi network
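
Third-party apps can read the same kind of codes through Apple's Vision framework. Below is a minimal sketch, assuming a still image that contains a code; the Wi-Fi payload mentioned in the comment follows the common "WIFI:" string convention that the iOS 11 Camera understands, and the network name and password shown are made up.

```swift
import UIKit
import Vision

// Minimal sketch: pull a QR code's payload out of a still image using Vision.
// The payload might be a URL, or a Wi-Fi descriptor in the common format
// "WIFI:S:ExampleNetwork;T:WPA;P:example-password;;" (hypothetical values).
func decodeQRCode(in image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage else { return completion(nil) }

    let request = VNDetectBarcodesRequest { request, _ in
        // Hand back the embedded string from the first QR symbol found.
        let payload = (request.results as? [VNBarcodeObservation])?
            .first { $0.symbology == .QR }?
            .payloadStringValue
        completion(payload)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```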


Apple also already makes internal use of machine learning-based visual recognition techniques: to stay focused on a moving subject when recording video, to read the characters on a redeemed iTunes gift card, to pair an Apple Watch, or to migrate to a new iPhone. It has also opened up Core ML capabilities for third-party developers to use in their own apps.


Apple has supported visual recognition in iTunes Card redemption and Apple Watch for years
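
The same recognition machinery is what Apple exposes to outside developers through Core ML and Vision. The sketch below shows the general shape of an image-classification request; "ProductClassifier" is a hypothetical model class standing in for any compiled .mlmodel added to a project.

```swift
import UIKit
import CoreML
import Vision

// Illustrative sketch of the Vision + Core ML pipeline available to third parties.
// "ProductClassifier" is a hypothetical Xcode-generated model class.
func classify(_ image: UIImage) throws {
    guard let cgImage = image.cgImage else { return }

    let visionModel = try VNCoreMLModel(for: ProductClassifier().model)
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Each observation pairs a label with a confidence score.
        if let best = (request.results as? [VNClassificationObservation])?.first {
            print("Saw \(best.identifier) with confidence \(best.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
}
```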


With or without Shazam, Apple could already build more sophisticated applications of visual recognition right into its iOS camera -- potentially in the future, even in the background -- allowing iOS devices to "see" not just QR Codes but recognized posters, billboards, products and other objects, and suggest interactions with them, without manually launching the camera. But Shazam gives Apple a way to demonstrate the value of its AR and visual ML technologies, and to apply these in partnerships with global brands.

Amazon has received incredible attention for taking the concept of Siri's voice assistant and enhancing it into a smarter, always-listening background service with Alexa, something that Apple is now addressing with Hey Siri and HomePod.

But the 50 million simple Alexa and Google Assistant devices out there lack something that most iOS devices already have: a camera capable of providing visual, not just audio-based, assistance.

For Amazon and Google, that will require up-selling their Wi-Fi microphone buyers to more expensive devices outfitted with a camera. Apple already has over a billion users whose iPhones and iPads can do this today, and most of those devices can launch ARKit experiences based on what they see.

Good luck trying to build a minimally functional AR experience on the global installed base of Android phones, most of which can barely browse the web and run a few apps, let alone aspire to run Google's ARCore platform, which only works on a very limited number of recent, high-end Android models.

Additionally, as the developer of iOS, Apple can integrate location-based geofencing and use "Hey Siri" or Siri Shortcuts to invoke not just ARKit sessions for Shazam-style marketing or chapters of a location-based game, but even work applications in enterprise settings-- pointing a camera at a broken device to begin evaluating what's required to fix it, for example, or visually scanning a person's invitation to check them into an event.
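
As a rough illustration of how such an invocation could surface through iOS 12's Siri Shortcuts, an app can donate an NSUserActivity describing its camera-scanning flow so Siri can suggest or trigger it by voice. The activity type, title and phrase below are hypothetical placeholders, not anything Shazam or Apple has shipped.

```swift
import UIKit
import Intents

// Sketch: expose a hypothetical AR scanning flow to Siri Shortcuts (iOS 12)
// by donating an NSUserActivity. All identifiers here are placeholders.
func donateARScanShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.marketing.startARScan")
    activity.title = "Scan with camera"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true           // lets Siri suggest it
    activity.suggestedInvocationPhrase = "Start scanning"

    // Attaching the activity to the visible view controller donates it to the system.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```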

From that perspective, it's pretty obvious why Apple bought Shazam. It wants to own a key component supporting AR experiences, as well as to flesh out applications of visual recognition. This also explains why Apple is investing so much in the core A12 Bionic silicon used to interpret what camera sensors see, as well as the technologies supporting AR-- combining motion sensing and visual recognition to create a model for graphics augmented into the raw camera view.

Today, that means building captivating games, educational tools, marketing experiences and even business tools that use mobile devices to blend reality with AR content. In the future this will likely move from a handheld display to a vehicle windshield, or even wearables that augment what we see with digital information layered on top.

This all started with one of the first interesting iPhone apps.

Comments

  • Reply 1 of 28
    This app has been brilliant since day 1.
  • Reply 2 of 28
    TL;DR but one of the annoying things about the concept is that it won't work via CarPlay. Siri refuses to listen and tell me what is on. And the place I listen to music most is, you guessed it, in the car.
  • Reply 3 of 28
    mike1 Posts: 3,286 member
    I want a gin and tonic right now.
  • Reply 4 of 28
    Rayz2016 Posts: 6,957 member

    in contrast to Google and Microsoft, each of which incinerated $15 billion on a pair of outlays that were later written off in the manner of a millionaire playboy totaling his new Lamborghini before walking away with just a casual smirk.

    🤣 Genius!
  • Reply 5 of 28
    I wish Apple would buy Sirius and make it less garbage with CarPlay. Right now Sirius on CarPlay is a dumpster fire.
  • Reply 6 of 28
    Me: “Hey, Siri what am I looking at?”

    Siri: “You idiot, that’s Algore!”

    Me: “OK... what am I thinking about?”

    Actually, this is a very insightful and prescient article... Good job DED!




  • Reply 7 of 28
    Good piece.

    "Virtually every one of Apple's recent acquisitions can be directly linked..." -- we still don't know what Beddit was all about.
  • Reply 8 of 28
    I wish Apple would buy Sirius and make it less garbage with CarPlay. Right now Sirius on CarPlay is a dumpster fire.
    Hum. Sirius is a standalone radio receiver system, actually no different in principle than the AM/FM radio. I think perhaps you're suggesting a better integration of inputs? Not sure exactly how (or why) Apple would want to do that. But I do agree that the auto manufacturers need to get better support for CarPlay.
  • Reply 9 of 28
    In 2007, Apple’s iPhone changed the way we communicate with each other...

    The jury is still out whether this is a good thing or a bad thing.
  • Reply 10 of 28
    davgreg Posts: 1,037 member
    If memory serves me correctly, Shazam can ID TV shows and movies from the audio. This could be great for discovery, which is a problem buying content. A lot of people do not know the specific episode or movie they want.

    As to advertising, I block it all and the harder they try to put it in front of me the harder I resist. AR in the consumer market seems to be similar to self driving cars- the answer to the question nobody on the end user side was asking.
  • Reply 11 of 28
    I suppose that could be interesting, but why can't I search for text in Photos? Can Apple sort out that minor but very functional issue first? What year is this?
  • Reply 12 of 28
    mike1 said:
    I want a gin and tonic right now.
    Mmmmmm. Bombay Sapphire....
  • Reply 13 of 28
    I suppose that could be interesting, but why can't I search for text in Photos? Can Apple sort out that minor but very functional issue first? What year is this?
    That should be relatively easy to do, as opposed to video or changing scenes while driving.
  • Reply 14 of 28
    Latko Posts: 398 member
    In 2007, Apple’s iPhone changed the way we communicate with each other...

    The Xs seems to radically break with that phenomenon
  • Reply 15 of 28
    Latko said:
    In 2007, Apple’s iPhone changed the way we communicate with each other...

    The Xs seems to radically break with that phenomenon
    The A12 Bionic chip, especially the neural engine have yet to be exploited... 

    I suspect that we’ll soon see Apple and 3rd-party apps that can dynamically determine what the camera is seeing (signs, buildings, landmarks, etc.) and package that with what the iPhone already knows...

    Consider a simple example: your friend is at Niagara Falls and you can see what she is looking at via the Find Friends app...

    ...the weather is here I wish you were beautiful — oh, you are and so am I (both).
  • Reply 16 of 28
    tmay Posts: 6,341 member
    Latko said:
    In 2007, Apple’s iPhone changed the way we communicate with each other...

    The Xs seems to radically break with that phenomenon
    The A12 Bionic chip, especially the neural engine have yet to be exploited... 

    I suspect that we’ll soon see Apple and 3rd-party apps that can dynamically determine what the camera is seeing (signs, buildings, landmarks, etc.) and package that with what the iPhone already knows...

    Consider a simple example: your friend is at Niagara Falls and you can see what she is looking at via the Find Friends app...

    ...the weather is here I wish you were beautiful — oh, you are and so am I (both).
    Give me a Siri interface to an AI version of IMDB, and I might be able to track those long ago scenes in film and television that I barely remember, but want to see again.
  • Reply 17 of 28
    tmay said:
    Latko said:
    In 2007, Apple’s iPhone changed the way we communicate with each other...

    The Xs seems to radically break with that phenomenon
    The A12 Bionic chip, especially the neural engine have yet to be exploited... 

    I suspect that we’ll soon see Apple and 3rd-party apps that can dynamically determine what the camera is seeing (signs, buildings, landmarks, etc.) and package that with what the iPhone already knows...

    Consider a simple example: your friend is at Niagara Falls and you can see what she is looking at via the Find Friends app...

    ...the weather is here I wish you were beautiful — oh, you are and so am I (both).
    Give me a Siri interface to an AI version of IMDB, and I might be able to track those long ago scenes in film and television that I barely remember, but want to see again.
    Oh yeah...

    ...”there were these three woodsmen”...

    or

    ...”thank god Hop Sing was”...


    Ya know, categorizing IMDB videos as a searchable database would be a natural for Apple’s FoundationDB...

    Maybe Apple should buy Vimeo.
  • Reply 18 of 28

    I found this very informative. I didn't know Shazam had so much going on. Thanks Dan!

  • Reply 19 of 28
    Rayz2016 said:

    in contrast to Google and Microsoft, each of which incinerated $15 billion on a pair of outlays that were later written off in the manner of a millionaire playboy totaling his new Lamborghini before walking away with just a casual smirk.

    🤣 Genius!


    Yeah, what is DED's beef with Bruce Wayne?



  • Reply 20 of 28
    cincytee said:
    mike1 said:
    I want a gin and tonic right now.
    Mmmmmm. Bombay Sapphire....

    They should've changed the name to Mumbai Sapphire years ago!!