Podcast analytics in an immature market

How reliable is podcast analytics at the moment? What can be measured and what cannot? In this article we dive into the current state of podcast analytics.

With my background in content marketing, I understand better than anyone how important podcast analytics is. You invest a lot of resources in producing content, and of course you want to see the effect: on your business objectives, but also on your listeners.

Podcast analytics gives you the data you need to evaluate and adjust. But how reliable are these podcast statistics at the moment?

The short answer... it is difficult.

The longer answer is that web analytics is quite mature and the industry has standardised it to a reasonable extent. Podcast analytics, on the other hand, is still in its infancy.

And what makes it even more difficult is that the big tech companies are locked in a power struggle, all striving for a monopoly in the podcast world. I describe the effect of this on podcast analytics under the heading 'The Wild West'.

Definitions for this article

In order to clarify what is currently possible, I would like to start with a few definitions and principles, so that certain things can be put into perspective.

Reliable vs. indicative data

The first thing to know is that there is always a difference between watertight, reliable statistics and indicative statistics. This applies not only to podcast analytics, but also to web analytics... or even any kind of analytics.

Reliable analytics consists of hard data, though even this data is never 100% watertight. Think of it as a house: you can make it draught-free, but you can never keep all the air out. Still, it is data you can rely on, as long as you allow for a small margin of error.

And then you have indicative analytics. These are statistics that cannot be measured in a watertight way. There are various reasons for this, but the cause generally lies in an uncontrolled environment.

Think of web analytics. If someone visits your website incognito, blocks all cookies or uses an adblocker, you cannot track everything, so the figures are not watertight, only indicative.

Of course, you do have different levels of accuracy in the 'indicative' category. If I had to give a mark for web analytics based on my gut feeling, I would say an '8' for reliability. For podcast analytics, I would now say a '5'.

I will come back to what that means for you.

On-platform & off-platform data

This is about who has access and control over the data.

On-platform data is data generated via the platform that provides the analytics. In the case of Springcast, data generated via our embedded podcast player is on-platform data. We have all the data and can reliably report on it.

Off-platform data is data generated by an external application. In the case of podcasts, this is data that comes from Apple, Google or Spotify.

The current state of podcast analytics

At the moment, podcast analytics is still quite indicative. This is because parties such as Spotify, Apple and Google do not yet apply a standard when it comes to statistics.

For example, Spotify works with starts and streams, while Apple works with the number of devices that have listened. In terms of how long people listen, Spotify uses 25%, 50%, 75% and 100%, while Apple reports in average minutes listened.
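To make the mismatch concrete, here is a small sketch in Python with invented numbers (none of these figures or names come from the Spotify or Apple APIs). Each reporting style first has to be reduced to a rough "average completion" estimate before the two can be compared at all, and detail is lost along the way:

```python
# Invented figures for one 40-minute episode, purely for illustration.
episode_minutes = 40

# Spotify-style: share of listeners still listening at each quartile
reach = {0.25: 0.80, 0.50: 0.65, 0.75: 0.50, 1.00: 0.35}

# Apple-style: a single aggregate number
apple_avg_minutes = 24

# Rough average completion from the quartile curve: assume listeners
# who drop off within an interval listened up to its midpoint.
bounds = [0.0, 0.25, 0.50, 0.75, 1.00]
reach_at = [1.0, reach[0.25], reach[0.50], reach[0.75], reach[1.00]]
avg_completion = reach[1.00]  # listeners who finished count in full
for i in range(len(bounds) - 1):
    dropped = reach_at[i] - reach_at[i + 1]
    midpoint = (bounds[i] + bounds[i + 1]) / 2
    avg_completion += dropped * midpoint

print(f"Spotify-derived estimate: {avg_completion:.0%} completion")
print(f"Apple-derived estimate:   {apple_avg_minutes / episode_minutes:.0%} completion")
```

The two estimates will rarely agree, because they are built from different raw measurements; that is exactly why cross-platform totals remain indicative rather than watertight.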

In short: as long as no industry standard is established, reporting will remain difficult and, above all, indicative.

The Wild West

Apple, Google, Amazon and Spotify are investing billions in the podcast industry. It is a domain they all want to dominate. Because in Silicon Valley one belief prevails ... the winner takes all!

And that means monopolies. And monopolies lead to closed doors.

In the world of podcasting, we are beginning to suffer from this too. Spotify buys up popular content creators and makes their content available exclusively on its own platform. Amazon and Apple do the same.

Spotify also took over the free podcast hosting platform Anchor. Users there are already noticing that Spotify gets priority when it comes to integrations, while Apple and Google are increasingly left out. In other words: fewer integration options.

The latest move, which many podcasters and podcast hosting providers are currently suffering from, is that Spotify now stores all content on its own servers. This means that external podcast applications and podcast hosts can no longer track what happens to an episode after Spotify has downloaded it.

Do you want to know what happens? Then you need to log in to your Spotify account. This is the response we got from Spotify when we requested access to their analytics:

[Screenshot: Spotify's response to our analytics integration request]

And I understand their position. We are now in further talks with Spotify to explore the possibilities, but it does not make it any easier for you as a podcaster to gain insight into your podcast statistics.

This is why many international podcast hosting providers exclude Spotify from their statistics completely. This is also one of the reasons why many companies choose not to publish their podcasts on Spotify and Apple, but to distribute them via their own platform (website, newsletter, etc).

Of course, we believe that you as a podcaster should have access to all (non-privacy-sensitive) data and that you should be able to find it in one central place. That's why, in addition to our workarounds and on-platform analytics (coming soon), we will continue to fight for integration with Spotify, Apple and Google.

Steering on indicative data

As described at the beginning of this article under the heading 'Definitions for this article', there are no watertight numbers in the world of web and podcast analytics. So it's not about the absolute data, but the relative data... unless you are reporting to advertisers.

By relative data, I mean how the figures develop over time. Not 'what are the figures?', but 'how are the figures trending?'

Because suppose you see 1,000 downloads... or 15,000 downloads... what does that say in itself? Nothing. It is neither much nor little if you have no frame of reference.

If you take my podcast as a benchmark, then 1,000 downloads is a nice number. If you compare it to Sander Schimmelpennink's podcast, it's downright depressing 😉

What is interesting is to see how the number of downloads grows compared to the previous period. And as long as the technical way of measuring does not change, the increase is most likely correct.
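As a small sketch, with invented download counts, this is the kind of period-over-period growth figure I mean:

```python
# Invented monthly download counts for one podcast, for illustration only.
downloads = {"March": 820, "April": 1000, "May": 1150}

# Absolute numbers say little on their own; the period-over-period
# growth rate is what you can actually steer on.
periods = list(downloads)
for prev, cur in zip(periods, periods[1:]):
    growth = (downloads[cur] - downloads[prev]) / downloads[prev]
    print(f"{prev} -> {cur}: {growth:+.1%}")
```

As long as the measurement method stays the same, these relative changes remain meaningful even when the absolute counts are only indicative.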

Below you can see how many downloads my own podcast had at the time of measurement. This gives me the opportunity to investigate why episodes 1 and 3 performed better than episode 2. A number by itself tells me nothing without a reference.

So look for the signals that the statistics are giving. See them mainly as a reason to delve deeper into the figures or to conduct further research.

Springcast analytics

As we speak, we are busy building an extensive analytics suite, which will mainly provide insight into (on-platform) data. For strategic reasons, we can't share all the metrics yet 😉 There are some really cool ones coming... I can promise you that.

And of course we will continue to push the directories, such as Google, Apple and Spotify, to give us access to the data we need to provide quality reports.

Do you have contacts in such a directory? You will get a big reward if your lead brings us to the right people 😉

Nico Oud

An entrepreneur with a mission. A mission to help entrepreneurs reach their goals. As a business coach I share my 20+ years of experience and with Springcast I help entrepreneurs to spread their message through podcasts. I'm also an avid podcast listener and producer.
