By William Ronco, Ph.D.
O would some power the gift give us to see ourselves as others see us. – Robert Burns
“Are the villagers carrying torches?”
– A scientist beginning to review her 360 results
360 Survey Uses, Potentials
We like to use 360 surveys because they consistently generate useful results. Scientists we work with in leadership training, teambuilding and strategic planning use their 360 results to make meaningful gains in their leadership and communications performance. Their survey results help them see more clearly how their communications with others advance – or obstruct – their science.
360 surveys provide the person being surveyed with data about their communications and leadership effectiveness. The surveys use the “360” label because they usually elicit responses from a mix of people around the person being surveyed – people who manage, collaborate with, are clients of and/or are managed by that person.
Once thought radical, 360 surveys are now in widespread use in most industries.
Numerous vendors offer 360 survey packages online. Software like Survey Monkey makes it possible for amateurs to design their own surveys with minimal investments of time and effort. Some companies conduct 360 surveys on key managers and employees and use the results when making promotion and advancement decisions. Others enable survey recipients to control the process themselves, selecting respondents more for learning and development purposes.
Scientists in particular benefit from the 360 survey process. If done well, the surveys translate the important but difficult-to-describe competencies of “communications” and “working relationships” into data. Scientists can take the time to analyze the often-rich data in their survey results, develop and implement thoughtful action plans. Some scientists we’ve worked with have taken their 360 results data far beyond the percentage and mean scores the survey software produces, creating scatter diagrams and exploring standard deviations.
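As a small illustration of going beyond the packaged mean scores (the response values below are invented, not from any actual survey), a few lines of Python compute the mean, the standard deviation and the full response distribution for a single 5-point question:

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical 5-point responses (1 = Strongly Disagree ... 5 = Strongly Agree)
# to one survey question from ten respondents.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

avg = mean(responses)        # the score most survey packages report
spread = stdev(responses)    # how much respondents actually disagree
counts = Counter(responses)  # the full distribution behind the mean

print(f"mean = {avg:.1f}, std dev = {spread:.2f}")
print("distribution:", dict(sorted(counts.items())))
```

The distribution line is the one the packaged reports usually omit; it shows at a glance whether a middling mean reflects consensus or a split among respondents.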
360 Survey Shortcomings, Dangers
Of course, 360 results do not reflect respondents’ opinions with perfect accuracy. Some respondents are wary of the surveys’ claim of confidentiality and so restrain their responses. Some use the surveys as a weapon, communicating long-held grudges. Others take the surveys as an opportunity to advance their own position, e.g., in open-ended comments like “Harry is the best boss I ever had, signed George.”
It’s also quite possible for 360 surveys to cause significant damage. The whole premise of receiving “objective” survey feedback on one’s communications effectiveness glosses over the gravity of the personal issues 360 surveys may address. Listening skills, showing respect, working well with others — the behaviors the surveys describe can be difficult to think about and discuss, entangled as they are in ego and identity.
A few companies have put 360 surveys to ill use, reassuring respondents and recipients that the process is confidential but then using the data to influence advancement and layoff decisions. More often, the harm people cause with 360 surveys is unintentional – the result of careless planning, careless survey design or a failure to put processes in place that ensure respondent confidentiality.
Getting The Most From Your 360 Survey: Ten Tips For Planning And Launching
It takes some clear thinking and planning to avoid the problems and shortcomings, and get the significant results 360 surveys offer:
1. Clarify outcomes. It’s most useful to begin work with 360 surveys not by reviewing the packaged survey offerings online or drafting survey questions but by listing what kinds of information you want the survey to provide and clarifying how you want to use the survey. If the survey is being initiated not by you but by your organization and your influence on survey design is limited, it’s still very useful to clarify what you’d like the survey to do for you, what aspects of your communications you’d like it to spotlight and how you anticipate using the results.
2. Clarify survey ownership. Some companies retain 360 survey data in Human Resource files, others enable recipients themselves to receive the only copy of the data. Both approaches can be effective for different purposes as long as they are made clear and then scrupulously maintained. Changing the uses and circulation of survey data mid-stream is ineffective and unethical.
3. Clarify respondent confidentiality. Some 360 surveys don’t try to achieve respondent confidentiality, using the process more as an open reporting device. More often, 360 surveys strive to ensure respondent confidentiality to maximize the value and validity of responses. Respondents’ confidence in your assurance of confidentiality will enhance the quality of their responses. Take the time to discuss your plans for ensuring confidentiality with several of your actual respondents to confirm that your strategy is effective.
4. Consider the number of respondents. It usually takes a minimum of 8 respondents to both provide reasonable respondent confidentiality and generate enough responses to detect patterns and trends in your communications. It’s fine to increase the number of respondents to 15 or 20, but often difficult to analyze results beyond that.
5. Maximize response rate. A survey of 10 respondents with a 100% response rate is much more valuable – and valid – than one of 100 respondents with a 10% response rate. Work to increase your response rate by selecting respondents thoughtfully and encouraging them to be open and frank in their responses. Keeping your survey as short as possible also maximizes response rates. It’s tempting and easy to add questions that are “interesting”; however, each question you add has the potential to reduce your response rate.
6. Link your desired outcomes with the survey questions. Pre-packaged surveys generate tidy, graphically impressive results and enable you to compare your responses with those of people in other organizations. However, many of the statisticians we’ve worked with have scoffed at packaged survey statistics as misleading, challenging the practice of reporting mean scores while neglecting to show how the data points are arrayed. Also, science jobs seldom lend themselves to comparison across organizations because they usually evolve in ways shaped by specific, project-related tasks.
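The statisticians’ complaint is easy to demonstrate. In the sketch below (both response sets are invented), two questions share an identical mean of 3.0 yet tell opposite stories – one reflects mild consensus, the other a polarized group:

```python
from statistics import mean, stdev

# Two invented questions with identical means but very different stories.
consensus = [3, 3, 3, 3, 3, 3, 3, 3]  # everyone mildly agrees
split     = [1, 1, 1, 5, 5, 5, 3, 3]  # respondents are polarized

print(f"consensus: mean {mean(consensus):.1f}, std dev {stdev(consensus):.2f}")
print(f"split:     mean {mean(split):.1f}, std dev {stdev(split):.2f}")
```

A report showing only the two means would treat these questions as interchangeable; the spread (or a simple histogram of responses) reveals that they call for very different follow-up.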
7. If you’re designing your own survey, take the time to do the online tutorials. Customizing your survey usually makes it easier to link survey questions with your desired outcomes, but it does take a bit of effort. The neatness and agility of Survey Monkey and other online tools can mask the ambiguity of the questions you formulate. To ensure that your questions address the items that interest you, pre-test them on a few people and discuss their thoughts.
8. Tune your questions. Organize your questions into 5 – 8 groupings of 5 – 6 questions per group. Use a 5-point Strongly Agree – Strongly Disagree scale for responses to closed-ended questions. The difference between an “Agree” and a “Strongly Agree” response matters both to respondents expressing their sentiments and to scientists who want to perform at the highest levels. Use mostly closed-ended 5-point scale questions, but also include several open-ended questions to capture whatever the closed-ended questions miss. Remind respondents that the survey software reports their open-ended comments verbatim, so they should be careful not to use catch phrases that may identify them.
9. Consider using 2-part questions. Two-part questions ask respondents not only to rate each statement on the 5-point Strongly Agree – Strongly Disagree scale but also to rank the importance of the question on a 5-point scale from Very Important to Very Unimportant. This question form often shows the person being surveyed that he/she is performing effectively on tasks respondents don’t think are important while neglecting tasks that respondents rate as high priorities.
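One way two-part results might be analyzed is sketched below. The question labels and mean scores are invented for illustration; the idea is simply to compute the gap between importance and performance for each question and sort so neglected priorities rise to the top:

```python
# Hypothetical mean scores per question: (performance 1-5, importance 1-5).
# Question names and values are invented, not from any real survey.
questions = {
    "Shares progress on key tasks": (2.4, 4.6),
    "Runs efficient meetings":      (4.5, 2.8),
    "Gives usable feedback":        (3.1, 4.4),
}

# A large positive gap (importance - performance) flags a neglected priority;
# a large negative gap flags effort spent on things respondents value less.
gaps = {q: imp - perf for q, (perf, imp) in questions.items()}
for q, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{gap:+.1f}  {q}")
```

In this invented example, “Shares progress on key tasks” tops the list: respondents rate it very important but see weak performance, exactly the mismatch two-part questions are designed to surface.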
10. Ask respondents to email you when they’ve completed the survey. That way you can protect their confidentiality while also keeping tabs on your response rate. If your response rate lags, ask all respondents again to complete the survey. Allow about a week for responses.
“I Know Who Wrote This”
Initially reviewing their 360 survey results, many people ignore the quantitative data, turn to the open-ended comments, quickly find the one or two less-than-wonderful phrases and even more quickly conclude, “I know who wrote this.” Many others comb quickly through the data, focusing, then obsessing about the several less-than-perfect response sets.
Both responses are understandable but not terribly productive strategies for getting the most from 360 survey results. Finding patterns and trends in responses generates more useful insights than attempting to identify the source of a rogue response. And the statisticians among our readers know very well what to do with the outliers in any data set.
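For readers who want a concrete recipe for handling a rogue response, one common approach – not specific to any survey package – is to report a median or trimmed mean alongside the mean, so a single outlier doesn’t dominate the summary. A small sketch with invented responses:

```python
from statistics import mean, median

# Hypothetical responses to one question: nine consistent scores and one
# possible "grudge" outlier.
responses = [4, 5, 4, 4, 5, 4, 4, 5, 4, 1]

print(f"mean   = {mean(responses):.1f}")    # pulled down by the single 1
print(f"median = {median(responses):.1f}")  # reflects the typical respondent

# A simple trimmed mean: drop the single lowest and highest response.
trimmed = sorted(responses)[1:-1]
print(f"trimmed mean = {mean(trimmed):.2f}")
```

When the mean and the median (or trimmed mean) diverge, that divergence itself is informative: the overall pattern is positive, and one response is doing most of the damage.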
Scientists react with less than scientific responses to their 360 survey data because the data addresses personal issues. Despite our professional preferences, traces of emotion stubbornly remain. However much we claim to – and really do – want objective feedback about our communications effectiveness, reviewing data describing our actual performance tests the limits of our Spockian ideals.
As a scientist you will also probably quibble with the validity of 360 data, and your quibbles are valid: 360 survey data is subjective, your sample size is small, and respondents completed the survey under less than perfect lab conditions. Still, the survey gives you a fairly accurate, comprehensive and calibrated sense of others’ perceptions of your communications effectiveness.
Look Through The Johari Window
The Johari Window provides a useful perspective for interpreting 360 results. Named not, as it sounds, after a mystical Eastern philosophy but for its inventors, Joseph Luft and Harrington Ingham, the contingency table crosses what is known to self with what is known to others, dividing one’s 360 responses into four blocks: “Public” (known to self and to others), “Blind” (known to others but not to self), “Hidden” (known to self but not to others) and “Unknown” (known to neither).
Optimally, all of a survey recipient’s reactions to his or her 360 data would fit in the “Public” block; recipients would be able to perfectly predict all their responses.
Working with thousands of scientists and their 360 data, we have not yet encountered anyone who perfectly predicted – or was completely surprised by – his or her 360 responses. A few people’s predictions closely approximate their actual results, some broadly miss, but most are generally accurate. Nearly all miss a few things, and they say the survey results are helpful in pointing out specific items they need to work on.
Some of the things people need to work on reside in the “Blind” block. These are aspects of their communications that elicit reactions they were not aware of. Scientists’ “Blind” block often includes others’ reactions to their criticism: the scientist did not realize that criticism of others’ work they thought was “crisp” and “rigorous but fair” was perceived by its recipients as “devastating,” “withering,” “over the top,” or “dehumanizing.”
Other things people need to work on fit more accurately in the “Hidden” block. These are aspects of their communications that they thought they had made clear, but that others apparently have not seen. This occurs when scientists think they are providing more than enough information about progress on key tasks, only to find that their 360 survey respondents want and need much more.
The distinction between “Blind” and “Hidden” responses is important because each requires a different action response. Responding to “Blind” issues asks 360 survey recipients to generally withdraw – to pull back on behaviors that irritate, annoy or anger others. Responding to “Hidden” issues, on the other hand, requires 360 survey recipients to generally step up, push forward and communicate more about matters that interest, involve or impact others.
Six Tips For Getting The Most From Your 360 Results
1. Focus first on the quantitative responses, then look at the open-ended comments. The quantitative data is less colorful but much more useful for identifying patterns and themes in your communications effectiveness. Once you’ve got a grasp on your overall response profile, move on to the open-ended comments for the “color” that helps you understand the quantitative shades of grey. Try not to dwell on any open-ended comment, no matter how interesting or upsetting it may be. Remember, the people writing these comments often don’t intend them to reflect their best thinking.
2. Begin with the “Overall” question. Most people start reviewing their data by looking at the responses to the first question on the survey, then moving on to each question in sequence. It’s more useful to skip to the “overall,” summary questions that usually conclude the results because they provide context for all the other questions. It’s also useful to prioritize the questions, focusing most on those that address the issues most important to your effectiveness overall.
3. Aim for a majority of 5’s on a 5-point scale on the key questions. If you’re wondering what counts as an “excellent” 360 result, aim for a majority of 5’s. 4’s are fine, and 3’s may also be all right. However, it’s worth aiming for 5’s, especially on the questions that address the issues most important to your job success, for the same reasons you aim for excellence in your science work. Yes, some respondents just don’t give 5’s for anything and yes, you can’t please everyone. But the 360 is about communicating effectively, not pleasing people.
4. Convert your insights to actions. You’re most likely to make improvements and convert your 360 survey insights into actions if you:
- Keep the data in front of you. Some people tape a printout to their monitor, or keep it open in their browser
- Schedule times when you specifically address the survey insights
5. Discuss your survey results with respondents. Of course you shouldn’t ever ask respondents how they themselves responded to your survey. However, discussing your reactions, insights and action plans with respondents usually elicits positive responses from them: it shows them you’re taking their responses seriously. Also – as long as you don’t attempt to defend or explain yourself – discussing your results with respondents usually generates additional insights that are very useful for you.
6. Plan to re-survey in 6 – 12 months. Circumstances change, and you’ve been working on addressing the issues that came up in that survey. Getting into the habit of doing a 360 annually enables you to benchmark and chart your progress in leadership and communications effectiveness the same ways you do for the technical aspects of your science.
This post was originally published on Genetic Engineering & Biotechnology News as a two-part post.
Director of the Biotech Leadership Institute William Ronco, Ph.D. (firstname.lastname@example.org), designs and uses 360 surveys in his work with hundreds of scientists every year. Dr. Ronco consults on leadership, communications, team, and partnering performance in pharmaceutical, biotech, and science organizations.