12 Takes on measuring #HigherEd content performance
Karine Joly
There is no shortage of bad content online.
But there is no shortage of great content either.
When there is much more content than available attention and time, measuring the performance of your content is key – even if numbers and words (or pictures!) are not always a match made in heaven.
That’s why I asked the 12 higher ed speakers of the 2018 Higher Ed Content Conference to tell us how content performance is measured or evaluated at their respective schools.
Engagement metrics and attribution for Aaron Baker, Digital Analytics Lead at Harvard University
For news content, we value engagement metrics like time on page and the percentage of pageviews that scrolled to the end. We also collect attribution channel, pageviews, bounces, device, and location to provide context for the engagement numbers. We use the data to talk about how content is being distributed across platforms and what worked in terms of how we put the piece together or the format used for telling the story (e.g., Q&A or photo gallery), and we use data from all stories to set a baseline benchmark for performance (what is typical?).
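For teams that want to reproduce this kind of baseline outside the analytics interface, here is a minimal sketch of the "what is typical?" calculation. It assumes a per-story CSV export with hypothetical column names (story, format, pageviews, avg_time_on_page, scroll_completions); it is an illustration, not Harvard's actual workflow.

```python
# Minimal sketch: computing a "what is typical?" baseline from exported story metrics.
# Assumes a CSV export with hypothetical columns:
# story, format, pageviews, avg_time_on_page, scroll_completions.
import pandas as pd

stories = pd.read_csv("story_metrics.csv")

# Share of pageviews that scrolled to the end of each story.
stories["scroll_rate"] = stories["scroll_completions"] / stories["pageviews"]

# Median engagement per story format (e.g., Q&A vs. photo gallery) as the baseline.
baseline = (
    stories.groupby("format")[["avg_time_on_page", "scroll_rate"]]
    .median()
    .add_prefix("typical_")
)
print(baseline)
```

A new story can then be compared against the baseline for its format to decide whether it over- or under-performed.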
Engagement metrics and demographics data for Amanda Waite, Creative Communications Director at the University of Vermont
We use a combination of Google Analytics, native analytics on social platforms, and, this past year, a CASE survey to benchmark our magazine against other schools’. We pay the closest attention to engagement metrics (how the content performs among those who see it), which tell us the most about the value and relevance of our work. Tracking traffic from social to our website helps us better understand what kinds of content work on different platforms, and reveals a bit about how the audiences differ (for example, although Facebook drives higher numbers, our Twitter followers are more likely to spend time with a story and engage with it once they’ve clicked through).
Spreadsheet for vanity and engagement metrics for Erika Forsack, Social Media Strategist at Virginia Commonwealth University
At our university, we use a (relatively) simple spreadsheet that lives as a Google Document. We update it monthly with our traditional vanity metrics (follows, likes) as well as different engagement touch points. Throughout the month we track things more organically and share successes within our team through email and shout-outs during staff meetings. A good chunk of the content we post is tied to our news center, so we use click-through rates and referrals from social. Because we work somewhat like an agency, with many different PR specialists and areas of the university, what we measure can change based on their KPIs (key performance indicators). When I was a team of one at the School of the Arts, operating on a shoestring (maybe it was more velcro) and unable to afford the fancy tools, I used a simple version of this Google Document. I think it’s important to realize you can manage all of this on your own without special tools. Looking at the previous month’s metrics helps me make changes for the upcoming month based on how much content is published, when it’s posted, and how the audience is interacting with it.
Engagement and referral traffic for Krista Boniface, Social Media Officer at the University of Toronto
At the University of Toronto, our measures of success depend on the specific campaign goal; however, there are some consistent factors that we evaluate. For social media, we pay a lot of attention to engagement, growth, views, view time, impressions, and referral traffic. We also like to break down engagement with a qualitative sentiment analysis to see if we’re changing attitudes or actions through our social content (i.e., did our followers find this helpful? Were they able to make a smooth transition from social to web, and did they stay long?)
Excel is so great! There are so many exciting measures of success waiting right there in the raw data. With Excel, all you need is some analytical motivation, a comparative time frame, benchmarks to get started with, and a lot of patience to fill in the data as you go. Whether it’s year to year or even day to day, these metrics will empower you to tweak your content to be more successful with each trial.
Reports generated with the power of Excel that show engagement and growth for each channel can point to your next strategic move or show where your team should shift gears, as a new content approach could make all the difference. Data is a huge driver for change and ROI, and it’s also a great way to showcase your success stories and repeat formats that work.
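The same comparison can also be sketched outside Excel. Below is a minimal pandas version of the period-over-period and benchmark logic described above, assuming a hypothetical channel_metrics.csv export with channel, month, engagements, and followers columns.

```python
# Minimal pandas sketch of a period-over-period comparison with benchmarks.
# Assumes a hypothetical export with columns: channel, month, engagements, followers.
import pandas as pd

df = pd.read_csv("channel_metrics.csv", parse_dates=["month"])
df = df.sort_values(["channel", "month"])

# Month-over-month change per channel for each metric.
for metric in ["engagements", "followers"]:
    df[f"{metric}_mom_change"] = df.groupby("channel")[metric].pct_change()

# Benchmark: each channel's trailing 12-month median engagement.
df["engagement_benchmark"] = df.groupby("channel")["engagements"].transform(
    lambda s: s.rolling(12, min_periods=3).median()
)

print(df.tail())
```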
Vanity and engagement metrics for Andrew Cassel, Social Media Admin at the University of Alaska Fairbanks
Content I share is measured and evaluated by my direct supervisor. We had a series of meetings over a period of months to narrow down the exact data I should report. There is SO MUCH available that we decided it needed to be easy and quick to find across all platforms, should be reported every month, and should help measure our progress towards certain goals. For smaller platforms I simply report the number of followers. For Facebook and Twitter that expands to include impressions, clicks, and engagement. I enter these numbers into a shared spreadsheet. The most exciting part is to look at the numbers after a year and see how many million (!) impressions content has received.
Page views, time on page and engagement for Conny Liegl, Senior Designer at California Polytechnic State University
At Kennedy Library, we’re using different metrics to track content performance. For websites, Google Analytics gives us a good idea of our users’ behavior, from page views and time on page to page depth and traffic sources. On social media, we mainly measure engagement, i.e., likes, mentions, shares, or comments. Applying UX strategies, we try to optimize all content for SEO to secure organic traffic and good rankings through keyword discoverability. We can also determine the impact of our outreach based on philanthropic gifts the library received.
Benchmarking with past content for Sonja Likness Foust, Director of Social Media and Content Strategy at Duke University
We measure our big content pieces campaign-style, so we look at how the content performed across all of our channels, including web and social. We then benchmark those numbers with our standards for similar stories. It involves a lot of elbow-deep analytics work in Google Analytics and all of the social analytics platforms, manual screen-capturing and copy-pasting, and using our thinking caps to figure out conclusions based on the data. It’s worth it, though! We base decisions for future content largely on how past content has performed and how well it performed on different channels.
Conversions for Danielle Sewell, Director of Marketing & Communications at Coker College
I’m a big believer in creating messaging for what you measure. If the goal of a particular piece of content is to encourage attendance at an event, then I’ll measure success, at least in part, by how many people showed up. If we want more prospective students to connect with our admissions team, then I want to know how many people filled out an inquiry form after viewing the content. If the goal is to build a sense of community and school spirit, then I’ll be looking at social engagement metrics like shares, comments, or hashtag use. The metrics used to determine your level of success should change as the goal of your content changes. Data can be a powerful tool to help you determine what’s working well and what needs to be adjusted—but you have to be looking at the data that actually reflects what you were hoping to achieve.
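As a small illustration of matching the metric to the goal, here is a hypothetical sketch; the goal names, metric names, and numbers are invented for the example, not Coker College's actual reporting.

```python
# Hypothetical sketch: the metric that counts as "success" changes with the goal.
def success_rate(goal: str, counts: dict) -> float:
    """Share of the audience that did what the content asked them to do."""
    numerator_by_goal = {
        "event_attendance": "attendees",
        "admissions_inquiry": "inquiry_forms",
        "community_spirit": "shares_comments_hashtag_uses",
    }
    return counts[numerator_by_goal[goal]] / counts["content_viewers"]

# Example with invented numbers: 58 inquiry forms out of 2,400 content viewers.
print(success_rate("admissions_inquiry",
                   {"content_viewers": 2400, "inquiry_forms": 58}))
```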
Goal Tracking for Jeff Stevens, Assistant Web Manager at UF Health
UF Health Web Services’ primary focus is evaluating our content’s effectiveness for our clinical health mission, which is measured by how well we connect our patients to care resources. We use Google Analytics with goal tracking to evaluate a patient’s journey through our site towards making an appointment. We look at how often users rely on certain pages in their decision-making process, and adjust our sites accordingly to make them as straightforward as possible.
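To make the goal-tracking idea concrete, here is a minimal sketch of reading a patient-journey funnel from step-level counts exported from an analytics tool; the step names and numbers are illustrative placeholders, not UF Health's actual funnel or goals.

```python
# Illustrative sketch of a patient-journey funnel built from step-level counts
# (e.g., pageviews or goal completions exported from Google Analytics).
# Step names and numbers are placeholders, not UF Health's actual funnel.
funnel = [
    ("find_a_doctor", 12_000),
    ("provider_profile", 6_500),
    ("appointment_form", 2_100),
    ("appointment_confirmed", 900),  # tracked goal completion
]

for (step, count), (_, prev) in zip(funnel[1:], funnel[:-1]):
    print(f"{step}: {count / prev:.1%} continued from the previous step")

print(f"overall goal conversion: {funnel[-1][1] / funnel[0][1]:.1%}")
```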
Measurement framework for Carol Duan, International Social Media Specialist at Boston University
Our social team uses a four-step framework for identifying meaningful metrics for social performance:
1) Identify overarching goals
2) Determine the communication activities that support these goals
3) Identify the metrics that determine success
4) Measure the performance
Using the Chinese social channels I’m managing as an example: we use Sina Weibo as the main Chinese distribution channel to promote the content produced by our editorial team, so traffic generation is still an important metric for evaluating post performance. But when we look at the social stats on WeChat, we use engagement rate and open rate to evaluate content performance, given that we use WeChat to feature student stories and promote an inclusive campus life.
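One way to make a framework like this operational is to record the channel-to-goal-to-metric mapping as data. The sketch below follows the Weibo/WeChat example above; the metric names are hypothetical.

```python
# Hypothetical sketch: the framework's channel-to-goal-to-metric mapping as data.
CHANNEL_METRICS = {
    "weibo": {
        "goal": "drive traffic to editorial content",
        "metrics": ["link_clicks", "referral_sessions"],
    },
    "wechat": {
        "goal": "feature student stories and promote inclusive campus life",
        "metrics": ["open_rate", "engagement_rate"],
    },
}

def metrics_for(channel: str) -> list:
    """Step 3 of the framework: which metrics determine success for this channel."""
    return CHANNEL_METRICS[channel]["metrics"]

print(metrics_for("wechat"))  # ['open_rate', 'engagement_rate']
```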
Comprehensive analytics reports for Tia Linder, Assistant Director of Online Communications at Fordham University
We measure and evaluate content performance through analytics. We rely heavily on our metric reports to tell a story about our content and web pages. We compile monthly, quarterly, and yearly reports and compare traffic to our pages and ads. Our analytics gives us insight into what our web pages/ads are or are not doing. We encourage editors to gather their yearly reports in a Google document so they can compare performance over time.
360° reports on analytics with top insights for Lindsay Nyquist, Director of Digital Communication at Fort Lewis College
I gather metrics monthly and distribute them to my student marketing team, the Marketing & Communications Department, the Admission Department, and upper leadership. These metrics include:
A report from Sprout Social, our social media management tool, that analyzes our Facebook, Twitter, and Instagram activity;
A screenshot of our YouTube analytics;
A spreadsheet of stats from Mailchimp, our email marketing platform, including open rates, click-through rates, and numbers of emails sent;
A few graphs from Meltwater, our media monitoring service, that show our media mentions compared to our competitors in both the social and news spheres;
A screenshot of activity from YouVisit, our virtual tour, that shows the number of visits and actions taken;
A spreadsheet that shows the activity from any recent Snapchat geofilters, including number of swipes and uses;
And anything else that seems relevant!
The most important thing is that I don’t send these reports out blindly and expect those outside of digital communication to get value out of them. Each month, I pick the five most interesting points across all the metrics and explain them. This could include why certain numbers look especially high or low, why numbers may have changed compared to last month or last year, or particular events or initiatives that may have driven spikes in engagement or followers.
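A quick way to shortlist candidates for those five points is to rank every metric by the size of its month-over-month change and explain the biggest movers. The sketch below does exactly that; the metric names and numbers are invented, purely as an illustration.

```python
# Illustrative sketch: rank every metric by the size of its month-over-month change
# and keep the five largest movers to explain by hand. All numbers are invented.
this_month = {"fb_impressions": 210_000, "tw_engagements": 3_400,
              "email_open_rate": 0.31, "yt_watch_time_hrs": 880,
              "geofilter_uses": 1_250, "media_mentions": 95}
last_month = {"fb_impressions": 150_000, "tw_engagements": 3_900,
              "email_open_rate": 0.29, "yt_watch_time_hrs": 400,
              "geofilter_uses": 1_300, "media_mentions": 120}

changes = {name: (this_month[name] - last_month[name]) / last_month[name]
           for name in this_month}

top_five = sorted(changes.items(), key=lambda kv: abs(kv[1]), reverse=True)[:5]
for name, pct in top_five:
    print(f"{name}: {pct:+.0%} vs. last month")
```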
A conference focusing on higher ed content?
The Higher Ed Content Conference (now available on-demand!) is a must-attend event for higher ed content professionals and teams looking for new ideas and best practices.
Read below what a few of your higher ed colleagues who attended the past editions of the Higher Ed Content Conference say about the experience.