
How true is a review?

According to the 2014 report Keep Social Honest, commissioned by CIM and run via YouGov, 65% of consumers would look at TripAdvisor, Mumsnet or Amazon reviews before purchasing goods or services. In research projects run in a previous life, we'd seen a trend that younger students were more likely than mature students to seek out reviews before buying student essentials. That led me to wonder how important the growing range of student reviews is in influencing which university an applicant chooses.

I had a look at some online university reviews, which are still a relatively new phenomenon. Yes, there has always been the National Student Satisfaction Survey, but I'm particularly interested in the unprompted, user-generated reviews growing at pace across social media. As well as reviews on established social platforms, there is a growing number of student review sites such as Student Crowd. I am working on a bigger piece, but for now I'll share some discoveries just from Facebook reviews across five institutions chosen at random.

Facebook gives the option of rating between 1 and 5 stars, where 1 is poor and 5 is excellent. It describes what the ratings mean, but it is possible some reviewers mistake 1 for 'first place' and 5 for the lowest score, unintentionally inverting their ratings.

There was an immediate discrepancy between the count of reviews recorded by Facebook and the number that could actually be displayed. This was mainly an issue when there were over 300 reviews. For this reason I had to base the scores on the reviews I could actually see and measure.

A high proportion of the reviews were in fact promotional messages. These were not reviews at all, but people trying to get in front of students. They tended to promote student housing, exam and coursework writing services, club nights and local businesses, with the occasional foreign national looking for funding. Promotional 'reviewers' almost always scored the university 5 out of 5. Unofficial university pages were more likely to carry promotional 'reviews' (9.5%) than verified accounts (4.2%). A large number of verified accounts have disabled the review feature altogether.
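For a larger sample than five institutions, one crude way to flag likely promotional 'reviews' would be a simple keyword match against the themes above. This is only a sketch: the keyword list and the sample reviews below are entirely hypothetical, not the real classification rules or data.

```python
# Crude keyword flag for promotional 'reviews' -- illustrative only.
PROMO_KEYWORDS = ("housing", "accommodation", "essay", "coursework",
                  "club night", "funding")

def looks_promotional(text: str) -> bool:
    """Return True if the review text mentions a promotional theme."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in PROMO_KEYWORDS)

# Hypothetical sample: (review text, whether the page is verified).
sample = [
    ("Great halls! Book your student housing with us now", False),
    ("Loved my course and the campus", True),
    ("We write essays and coursework, message me", False),
    ("Best three years of my life", True),
]

# Compare the promotional share on verified vs unofficial pages.
for verified in (True, False):
    texts = [text for text, v in sample if v == verified]
    flagged = sum(looks_promotional(text) for text in texts)
    share = 100 * flagged / len(texts) if texts else 0.0
    label = "verified" if verified else "unofficial"
    print(f"{label}: {share:.0f}% promotional")
```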

The second discovery was how few reviewers leave any explanation for their score: nearly 75% did not. I noted that a number of these 'no comment' reviewers had posted over 200 reviews each. I suspect many are bogus 'click farm' accounts, but given this was an afternoon musing rather than a full-on project, I did not look into it further. One for another day. I should add that I do not think the institutions paid for these reviews; it is more likely the accounts are reviewing universities to boost their own credibility.

The third discovery was the number of reviews from students who had not actually started at the university yet. They made up 3% of all reviews (where it was possible to detect this) and almost always scored the university a 5. Most were expressing gratitude for an offer or acceptance, excitement about starting soon, or enthusiasm after a good time at an open day.

The final discovery, an easy calculation, was how many times a representative of the university had replied to a review: 0%. Not one comment or 'like' from any of the five institutions chosen at random. One university (university D in this example) had a number of complaints, and there were no responses from the institution at all. Perhaps the PR team advised them to stick to the classic "never explain, never apologise", but I can't help thinking some human interaction responding to the concerns would be helpful and well received.

I then wondered how the Facebook average scores might change with the promotional messages removed, or when grouped by the different types of review.
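I can't share the underlying reviews here, but the recalculation itself is straightforward. A minimal Python sketch follows; the reviews list, the type labels and every number in it are made up purely for illustration, not drawn from the real data.

```python
from collections import defaultdict

# Hypothetical, illustrative reviews -- NOT the real dataset.
# Each review has a 1-5 star rating and a manually assigned type.
reviews = [
    {"stars": 5, "type": "promotional"},   # e.g. a housing ad posing as a review
    {"stars": 5, "type": "pre-arrival"},   # offer holder, not yet enrolled
    {"stars": 4, "type": "student"},
    {"stars": 2, "type": "student"},
    {"stars": 3, "type": "no comment"},    # score left with no explanation
]

def average(scores):
    """Mean of a list of scores, or None if the list is empty."""
    return sum(scores) / len(scores) if scores else None

# The headline average, as Facebook would display it.
print("All reviews:", average([r["stars"] for r in reviews]))

# The average with promotional 'reviews' stripped out.
genuine = [r["stars"] for r in reviews if r["type"] != "promotional"]
print("Promotional removed:", average(genuine))

# Averages grouped by review type.
by_type = defaultdict(list)
for review in reviews:
    by_type[review["type"]].append(review["stars"])
for review_type, scores in sorted(by_type.items()):
    print(review_type, average(scores))
```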

So what have I learned, if anything?

  • Promotional messages posted as reviews inflate review scores slightly.
  • People about to attend the university are more likely to give a higher score. Is there a window of opportunity here to engage these ambassadors and make them feel welcome and part of the campus?
  • People who leave an explanation tend to score higher than those who do not, except at university D, where there was a lot of passionate commentary.
  • The five universities in this example never responded to reviewers' comments, whether positive or negative. I suspect this is down to lack of time. Disabling the review function removes the dilemma, but I can't help thinking it is a missed opportunity for gathering customer feedback and having positive interactions.
  • Fact: giving a low score without an explanation is no help to anyone. The next blog may be on review etiquette...
