Event success can simply mean that your event turns a profit. After all the work that goes into an event, that is something every organiser should rightly feel proud and excited about.
However, in our industry there are many other outcomes that deserve to be counted as successes, given that the end goal is to create holistic experiences for people who are passionate about the subjects we gather around. In this post, we spoke with four seasoned event professionals to find out what else they consider “success” to be, and what you could be measuring to get a better overall picture of how well your efforts paid off.
In events, we value the importance of face-to-face time with like-minded folks. We appreciate the thrill of ambiance and a positive atmosphere. We even get pleasure from the smallest things, like an event that makes sure its snacks are memorable (and delicious).
While a traditional framework for ROI is valuable, there’s so much more to a genuinely worthwhile conference than a number in a spreadsheet.
Going Beyond Tracking Revenue
Mike Madarasz from Digiday
Mike is a seasoned event and marketing professional. If you’re unfamiliar with Digiday, it’s a publishing powerhouse whose repeatable business and conference models are the envy of the industry. When you’re that successful, though, it’s easy to get complacent about improvement. That’s never been an issue for Mike and his team. Here, he shares their modern recipe for measuring event success.
“Revenue is first and foremost what we track as a measure of event success. We want to make sure we’re retaining as many customers as possible year-to-year. We also want to make sure we’re growing those customers. Someone might buy a ticket this year and, if we can get them to sponsor that event next year, that’s a huge win for sure.
“We also want to know — it’s really tough to quantify, but we’re starting to get pretty good at it — the influence in the room.”
“The way we do that is by collecting the budgets of the people who are attending. We think it’s a good way to go about it. We’ve turned our attention more towards quality rather than quantity as far as the audience goes.
“We’re hosting the Digiday Retail Forum and, if I wanted to, I could guarantee that the event is going to be full tomorrow, but I want to make sure we have the right audience there, one that’s going to contribute to the discussions and learning for everyone else.
“We send out follow-up surveys after every event. We get qualitative and quantitative data from those surveys. That’s kind of simple but that’s our method.
“It’s usually sent out less than a week after.”
“There’s a number of things that we want to put in front of people first, just to close the loop on their experience, like the slides from the event, photos from the event, what other events we have that are adjacent to that one coming up next, etc. Once we cover our bases on all that we’ll go back and try to see what our guests thought.”
Using Multiple Platforms to Evaluate Feedback
Scoring an entire conference out of 10 is a subjective measure of event success. One person’s six is another person’s four, and so on. While there’s a lot to be said for that kind of scoring, what Ministry of Testing shares below is more valuable still. They’ve established core areas where they pursue feedback from attendees in multiple ways, ensuring they collect both qualitative and quantitative results.
“We’ve been thinking about this a lot. We’ve tried a few things.”
“At our event in the Netherlands last year we printed off a schedule of the running order of the day, and on the back of that we printed an emoji feedback form for each talk.”
“The idea was you could circle which emoji you think represents what this talk meant to you. And then we added a box at the bottom for more feedback if you wanted to give it. We actually got pretty good feedback because people could do it anonymously.”
“Anonymous feedback can be dangerous, but in our first instance of the experiment with it, it went really well.”
“We read the blog posts that people write after the events. We check the engagement on our online platform. We also look at, when we send out the email after the event for attendees for access to the videos, what percentage of those attendees reply to get access.
“These are all things we’re looking at. Now that we’re looking at the metrics, we’re seeing that we’re probably not as successful as we thought, and that’s something we’re going to work on this year.
“In terms of our talks we’ve already started working on that. We’ve opened our reviews for papers to the public. So, you sign into your Ministry of Testing profile and you can review every single submission that we’ve had for a conference on our system.
“Some people felt like we were just picking talks because we liked people; our process really wasn’t that transparent. But now, if people are complaining about the line-ups we can say you can go on here and you can actually review talks. You can help us choose these line-ups. It’s in your hands.”
Re-Thinking Surveys for Event Success
“We try to do a survey after every event. We do three surveys: one for our sponsors, one for our attendees, and one for our speakers. Every year we aim to improve based on the results of those surveys. We try to make sure we have the surveys almost ready to go before the event even starts so that the minute that’s over, we can get it right out there while folks are still thinking about the event.”
“We ask questions about the quality of the speakers and the quality of the venue. We ask about the CoC (code of conduct) and whether they felt comfortable at the event. The food, the swag, the location, and the value.”
“Time is valuable, so we can’t make the survey too long. I think the ability to keep it anonymous is good, unless respondents want you to contact them. Folks are more open and honest when they don’t need to worry about retaliation.”
Old-School Feedback Making a Comeback
Saron Yitbarek from Codeland
Saron came highly recommended as someone to reach out to when it comes to conference feedback and event success. As well as the team’s reputation for providing a phenomenally warm and welcoming show, the way they evaluate and improve on that experience is innovative in its simplicity.
“We do feedback very manually. We print out a form with ten questions, each scored on a scale of one to ten, and we hand it out to folks at the end of the conference.”
“Each feedback form has a raffle ticket attached to it.”
“And if they fill it out, the forms get put into a big box and we do raffle prizes for folks as a thank-you for filling out the form and (hopefully) being honest about their experience. Afterwards, we tally up all the feedback and put it in a spreadsheet. Then we’re able to see what we did well and what we could do better, and we can see year over year what has changed.
“I don’t really see the value in having a name to the feedback form. If anything, it biases you. It makes the person feel more uncomfortable. However, I do think it’s important to know what type of person is giving the feedback. Is the feedback from a volunteer? Is it from an attendee? Is it from a sponsor? Because that’s really interesting. Your conference experience for an attendee is very different from that of a sponsor. So I think it’s really important to have on the form, you know, which one do you fall under?”
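The tallying step Saron describes can be sketched in a few lines of Python. This is purely an illustration of averaging 1–10 scores per question and per respondent type; the questions, roles, and scores below are invented, not Codeland’s actual data:

```python
# Hypothetical feedback forms: each has a respondent type and a list of
# 1-10 scores, one score per question on the printed form.
forms = [
    {"role": "attendee", "scores": [8, 9, 7]},
    {"role": "sponsor",  "scores": [6, 8, 9]},
    {"role": "attendee", "scores": [10, 7, 8]},
]

num_questions = len(forms[0]["scores"])

# Average score per question across all forms.
averages = [
    sum(form["scores"][i] for form in forms) / len(forms)
    for i in range(num_questions)
]

# Average score per question, split by respondent type, since a sponsor's
# experience can differ sharply from an attendee's.
by_role = {}
for role in {form["role"] for form in forms}:
    subset = [form for form in forms if form["role"] == role]
    by_role[role] = [
        sum(form["scores"][i] for form in subset) / len(subset)
        for i in range(num_questions)
    ]

print(averages)
print(by_role)
```

Comparing these per-question averages across years is what surfaces the “what has changed” trend Saron mentions.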
If you’re interested in finding out more about how our customers strategise about sales, marketing and event success, you can subscribe to blog updates from us below.