The digital analytics divide in #highered and #hesm is getting bigger

February 5th, 2015 Karine Joly

This post is #3 in a series of 10 to celebrate the 10th anniversary of this blog while looking back at the past decade in higher ed digital marketing and communications.

A 15-year love story with… digital analytics

I first blogged about digital measurement, analytics and return on investment in posts published in March 2005. But my obsession with measurement is older.

With a background in the news industry followed by a couple of years at a big dot com, my focus has always been on audiences. In my radio days, I always awaited with trepidation the results of the bi-annual radio ratings to find out whether or not our station had improved.

  • Did our work matter to listeners?
  • How many of them?
  • Were we in good standing compared to the other stations in the market?

Later, when I started working on the web for About.com, one of the big dot coms of the late ’90s, I fell even more in love with digital communications as soon as I saw what could be measured with web analytics.

If you’ve never experienced firsthand a situation where you can’t measure the impact of your work due to the lack of accessible or affordable technology, I’m not sure you can understand how amazed I felt when I discovered Google Analytics.

Before GA was implemented by virtually all schools in higher education, there were a few web log analyzers and traffic analysis packages – some even open source – but what they gave us wasn’t much compared to what’s at our fingertips today.

Web Analytics 10 years ago in higher ed

A decade ago, when I started this blog and worked for a college, I would include in my annual report some data on page views, submitted request-for-information forms and online donations, as I explained in this post published in March 2005.

At that time, web hits – keeping track of every server request, be it for an image, a page or a script – were still one of the most used web data points in reports to decision makers.
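
To see why hits flattered everyone, here’s a minimal sketch of my own (the log file name and the page-detection rule are assumptions for illustration only) that tallies hits versus page views from an Apache-style access log:

    # Count every request as a "hit", but only HTML documents as page views,
    # to show how much a hits number inflates compared to actual page views.
    import re

    REQUEST_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP')

    hits = 0
    page_views = 0
    with open("access.log") as log:              # hypothetical log file
        for line in log:
            match = REQUEST_RE.search(line)
            if not match:
                continue
            hits += 1                            # every request counts as a hit
            path = match.group(1).split("?")[0]
            if path.endswith((".html", ".htm")) or path.endswith("/"):
                page_views += 1                  # only pages count as page views

    print(f"Hits: {hits}  Page views: {page_views}")

A single page that loads a dozen images and scripts registers a dozen-plus hits for one actual page view – which is exactly why the hits numbers looked so flattering.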

Most top executives at colleges and universities didn’t know much about websites and even less about web analytics. So, the bigger the number sounded, the better it was. Some of you might remember how many college presidents announced in their speeches – or via press releases – that good-looking 7-digit metric.

Digital Analytics today in higher ed

Fast forward 10 years, and we can now track, measure and dissect the digital footprints our visitors leave behind. And this can be done on no budget, for website visitors and social media audiences alike.

Yet, I keep hearing the same discussions and arguments about the impossibility of measuring this or that digital initiative.

While many institutions have reached important milestones on the road to measurement for higher ed websites, I’ve been hearing lately the familiar “we don’t have the time or the money to measure” song in the higher ed social media community.

Not enough money to buy the social media measurement tool that will spit out – at the click of a button – the magic bullet number…

This is a familiar tune, because it’s exactly what web professionals would tell whoever would listen 10 years ago. The tools weren’t quite there back then, but now the same tool that revolutionized web measurement – Google Universal Analytics – can also help social media pros a lot.
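
To give just one generic example (an illustration on my part, not a claim about any school’s setup): campaign tagging. Append the standard utm_source, utm_medium and utm_campaign parameters to the links you share on social networks, and Google Analytics will attribute the resulting visits to the right network and campaign. The URL and campaign name below are made up:

    # Build a campaign-tagged URL for a social media post so Google Analytics
    # can attribute the resulting visits to that network and campaign.
    from urllib.parse import urlencode

    def tag_url(base_url, source, medium, campaign):
        """Append standard Google Analytics UTM parameters to a URL."""
        params = urlencode({
            "utm_source": source,        # e.g. "twitter" or "facebook"
            "utm_medium": medium,        # e.g. "social"
            "utm_campaign": campaign,    # e.g. "spring-open-house"
        })
        separator = "&" if "?" in base_url else "?"
        return f"{base_url}{separator}{params}"

    print(tag_url("https://www.example.edu/visit", "twitter", "social", "spring-open-house"))

No budget required – just a bit of discipline in how links get shared.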

What hasn’t changed – as hinted in my post yesterday – is the fact that while some institutions have chosen to focus on measurable goals, many are still lacking in that department.

Why?

  • Lack of interest?
  • Missing expertise or training?
  • Low level of focus?
  • Fear of the failures measuring could reveal?

My bet is that it’s probably a combination of all of the above.

The increasing digital analytics divide in higher education

Yet, while many institutions haven’t moved an inch beyond vanity metrics like website visits and follower counts on the road to digital measurement, a few have made huge strides.

In Liz Gross’ course on social media measurement, which I’m currently taking as part of our quality review process, I’ve had to read a few articles this week about the issue of social media return on investment (our textbook is Social Media ROI by Olivier Blanchard).

Among these articles were a few posts about ROI published 3 years ago by a few higher ed bloggers.

A lot of what they wrote back then is still very representative of what I hear or see today on Twitter or at conferences when it’s time to explain (justify?) why we can’t measure social media (or any other digital initiative).

Here are a few of these arguments:

  • We can’t measure the impact because what we do results in non-financial outcomes (typically the case for social media, communications or PR professionals)
  • We can only show correlation, not causation, when trying to demonstrate impact.
  • We can’t predict what impact our initiatives will have (so, let’s pray and hope for the best)
  • We can’t put a dollar value on a web visit, a video play or any other of our micro-conversions on our website
  • We can’t attribute any of this value to social media – or any specific digital campaign.

At the same time I’m taking Liz’s course, I’m also wrapping up the preparation work for the 3rd edition of the Higher Ed Analytics Conference next Wednesday. In this context, I got a chance to preview the 12 sessions on the program last week – and saw a totally different picture.

Because this annual conference focuses exclusively on digital analytics in higher education, this event tends to feature the best practices presented by early adopters in the field of digital measurement.

The 12 presenters of the 2015 edition will explain – among other things – how to do the impossible (at least what’s still described as impossible by a majority):

  • how to measure the contribution of – and request funding for – web content with no financial outcomes, such as featured stories on a homepage
  • how to prove that your digital campaign CAUSED a specific outcome on your website
  • how to put a real dollar value on your website micro-conversions
  • how to find the share of this value that can be attributed to social media (a rough sketch of one way to think about these last two points follows this list)
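
I won’t spoil their sessions, but to make those last two points concrete, here’s a back-of-the-envelope sketch – hypothetical numbers and a deliberately simplified model, not the presenters’ methods – of one common way to value a micro-conversion and carve out the share social media drove:

    # Value a micro-conversion (a request-for-information form submission) by the
    # macro-conversion it eventually leads to, then attribute part of that value
    # to social media based on campaign-tagged traffic. All numbers are made up.
    APPLICATION_VALUE = 200.0        # assumed dollar value of one submitted application
    RFI_TO_APPLICATION_RATE = 0.05   # assumed: 5% of RFI submissions turn into applications

    rfi_value = RFI_TO_APPLICATION_RATE * APPLICATION_VALUE   # $10 per RFI submission

    rfi_total = 1200                 # hypothetical RFI submissions this month
    rfi_from_social = 180            # hypothetical, identified via utm_source/utm_medium

    social_share = rfi_from_social / rfi_total
    social_value = rfi_from_social * rfi_value

    print(f"Each RFI submission is worth about ${rfi_value:.2f}")
    print(f"Social media drove {social_share:.0%} of them, roughly ${social_value:.2f} in value")

Crude? Absolutely. But even a rough dollar figure beats “we can’t put a value on it.”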

That’s why I believe the digital analytics divide has increased over the past 10 years.
Some institutions are still stuck in the past, while others have chosen to go beyond the familiar tune to measure what can and should be measured.

Where does YOUR school stand?

If it’s still stuck in the past like most, what’s missing to get it on the other side of this divide?
