Having run thousands of 360 feedback processes, we've learned what makes them a waste of time.
We've learned by getting feedback on our approach from:
To date we've experimented with:
We're now pretty damn good at helping People Leaders enhance their feedback culture - and here are our top 9 traps to avoid when doing your next 360 feedback process.
~~~
There's a tendency to cram as many questions as you can into a 360 feedback template.
The thinking is "we might not do this again for 9-12 months, so let's get as much data as possible".
But there's an inverse relationship between the amount of feedback you ask for and the amount you get.
The more feedback you ask for, the longer it takes each person to write it out, and the fewer people who actually do it. Or everybody does it, but the responses get shorter and less thoughtful as people rush to finish on time.
Ideally, you don't want employees spending more than an hour typing out feedback for each person.
Stick to 4-6 questions for your 360 feedback template. Focus on quality, not quantity. By doing this, you'll end up with a greater quantity of feedback anyway.
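To make the trade-off concrete, here's a rough back-of-envelope sketch. The minutes-per-answer figure and the question counts are illustrative assumptions, not numbers from our data:

```python
# Back-of-envelope estimate of how long one reviewer spends writing
# feedback for one colleague. The default minutes-per-answer is a guess,
# not a measured figure.

def minutes_per_colleague(questions: int, minutes_per_answer: float = 8.0) -> float:
    """Estimated writing time for one reviewer, for one colleague."""
    return questions * minutes_per_answer

# A 12-question template blows past the one-hour-per-person guideline;
# a 5-question template stays comfortably inside it.
print(minutes_per_colleague(12))  # 96.0 minutes
print(minutes_per_colleague(5))   # 40.0 minutes
```

Multiply that by however many colleagues each person is asked to review, and the difference between a bloated template and a lean one is measured in hours, not minutes.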
~~~
If you give people room to write 1,000 words, they'll feel pressured to do just that.
Not only does this create anxiety for the feedback giver, it also creates a sh*t sandwich of advice.
It goes like this:
"Here's something good that you do. And here's something not so good. But don't forget this good thing."
What should the recipient focus on - the two good things, or the one not so good thing? The point of the feedback isn't clear.
If you instead give people, say, 200 words, they're forced to be concise and factual. There's no room for waffle, and the result is far more clear-cut and actionable for the person reading it.
At Howamigoing we usually restrict responses to 280 characters - a tweet. That's about as far as our relationship with Twitter goes though...
~~~
When you let people give feedback in the form of free text, you need to control for creative writing.
In order for feedback to be actionable, it needs to be tied to an action. There can't be behaviour change without a behaviour to refer back to.
Separating feedback into Fact and Feeling helps to moderate responses and overcome our natural tendency to only say how we're feeling.
And here's a short video with a couple of examples.
~~~
Once a 360 feedback process is complete, we usually survey employees to ask which questions were most valuable.
The result? Feedback is 6x more valuable when received as a comment vs a rating.
Why? Ratings might give you a snapshot about where you stand, but they don't tell you how to get better. The purpose of feedback is to help you get better.
To illustrate this, consider these two questions.
Which would help you improve your presentation skills the most?
The second one, of course.
That's not to say ratings aren't useful at all, just that they should be combined with comments.
~~~
When using ratings, make sure you:
Compare these two questions:
Which do you think will give you better information about the level of peer recognition in your business?
The issues with Q1 are:
Now consider this question:
The problem here is that it's not clear to the user:
What if sometimes my attention to detail was "below average", but other times it was "above average"? How do you weight the situations? What makes for an outlier? It's subjectivity layered upon subjectivity.
Finally, consider these two questions:
Question 1 is the infamous NPS (Net Promoter Score) metric. As a statistician, I can safely say that it is a vanity metric of very little managerial value.
Do you ask people what they would want their weight to be, and then build a bell curve from that? No, you measure what their weight actually is.
Also, when you finally manage to catch up with a good friend or family member for 30 minutes, how high on the agenda is recommending your employer?
Making decisions based on what people say they will do is fraught with danger. Make decisions based on what people actually do. If it hasn't happened yet, it's not data, it's conjecture.
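If you haven't met NPS before, it's built on a single stated-intention question ("How likely are you to recommend us?", scored 0 to 10). Here's a minimal sketch of the standard calculation; the scale and the promoter/detractor cut-offs below are the usual NPS convention, not something specific to our process, and the score lists are made up for illustration:

```python
# Standard NPS: % promoters (scores 9-10) minus % detractors (scores 0-6),
# with passives (7-8) ignored. Response lists are invented for illustration.

def nps(scores: list[int]) -> float:
    """Net Promoter Score from 0-10 'likelihood to recommend' responses."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Two very different sets of responses, identical headline number:
print(nps([10, 10, 9, 8, 8, 7, 0, 0]))  # 12.5
print(nps([9, 9, 9, 7, 7, 7, 6, 6]))    # 12.5
```

Either way, the headline number is built entirely on what people say they might do, which is exactly the problem described above.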
~~~
Have you ever given your wife, husband, son or daughter the same advice over and over, only for it to be ignored? Then somebody else comes along with exactly the same advice, and suddenly it gets acted on? Frustrating, but that's life. Sometimes the actor saying the lines is more important than the script.
We listen to the people that we trust and respect.
Just because we work with someone doesn't mean we care what they think about our work. It probably shouldn't be that way, but it is.
Yes, it might make sense for John, who I've done three projects with this year, to give me feedback. But if I don't think he has my best interests at heart, I'm not going to place much weight on his comments.
I'm far more interested in feedback from Jane, who I only did one project with, but I look up to her and think she's a boss.
~~~
Originally, Howamigoing catered for both anonymous and non-anonymous feedback.
When feedback wasn't anonymous, it was very often labelled "watered down" and "too positive", and the 360 process was then dismissed as "tick the box".
So we tried removing people's names from their comments and ratings.
We found that this resulted in "more honest" feedback. There were no reports of "damaging" or "hurtful" feedback, just that "an extra layer of the onion had been peeled".
This was comforting for many HR Leads, who feared that letting people respond anonymously would create a culture of "non-ownership" of comments and "keyboard warriors". It didn't; it just allowed people to be a little more open than they normally are.
As a result, the 360 process was considered more valuable and worthwhile, which led to more transparent, more trusting performance conversations.
This was especially true in small businesses where the fear of hurting feelings was high. And even more true for leadership 360s, where people were afraid to be critical of their bosses.
~~~
We found that most employees don't want to wait 6 or 12 months to know where they stand.
And we found that 3 questions is enough for a 360 to deliver value to people.
So don't wait until the end of the year to do a big hairy 360.
Get a short, sharp one done each quarter with just a few pointed questions, like:
~~~
It's such a shame to see a business enthusiastically launch their first 360 feedback process...
...only for the CEO or Founder to leave all their feedback requests/submissions to the last minute, or not take part at all.
It sends such a conflicting message to employees. "Hey, here's this new process that we are really excited about that will greatly benefit the business, but C-Suite aren't prioritising it."
Compare this to when your CEO is the first one to request feedback and the first one to give it. The social proof and momentum this creates is enormous.
Send an email to hello@howamigoing.com if you'd like help. We're always happy to have a chat and provide advice, even if we're not a good fit for you!