Forums

My COO has asked me to rank the field staff we have within a specific region, given a recent loss of business. Obviously we are looking to downsize, but not having been through this process before, I would like to know what format and criteria others use to determine these rankings.

I work in the mobile medical arena, so this isn't typical sales, where you can look at someone's production or metrics to see who your top performers are.

Any insight is greatly appreciated,

Dan

arc1's picture

Dan,

We are just coming off the back of the performance period and grappling with the same problem. People don't mind being ranked, but they object if there's a lack of clarity as to why.

The things I am focusing on to address this issue are:

[b]Clear high-level criteria[/b]. You can't rank people against each other unless you have criteria; without them, it's completely subjective. What general behaviours does your company want to encourage (e.g. speed, incident rates, contribution beyond the immediate job description)?
[b]Job-specific criteria[/b]. If people aren't all in the exact same role, then to rank them you probably need to translate the high-level criteria into the specific things that particular person would need to be doing to demonstrate them. A lot of large corporates do this in the form of a written "scorecard".
[b]Audit trail[/b]. I'd suggest you should keep a clear record of your decision process, so that if it's challenged (by your boss, or by a staff complaint / query), you would be able to demonstrate that the process was fair and objective, not a "finger in the air".
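To make the scorecard idea concrete, here is a minimal sketch of weighted criteria rolled up into a single comparable number. The criteria names, weights, and rating scale are entirely illustrative, not a recommended set:

```python
# Illustrative weighted scorecard. Criteria, weights, and ratings
# are made-up examples; a real scorecard would use your company's
# agreed high-level criteria, communicated to staff in advance.
WEIGHTS = {"speed": 0.4, "incident_rate": 0.3, "extra_contribution": 0.3}

def weighted_score(ratings):
    """Combine per-criterion ratings (say, a 0-5 scale) into one number."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

# One hypothetical staff member's ratings against the criteria.
ratings = {"speed": 4, "incident_rate": 3, "extra_contribution": 5}
print(round(weighted_score(ratings), 2))  # prints 4.0
```

The point is not the arithmetic but the audit trail: the weights and ratings, written down per person, are exactly the record you would show if the ranking were challenged.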

It sounds like you're not being given time to actually measure and performance-manage the people before concluding a ranking? If so, that makes it pretty tough. It's also a real challenge if you're basing the ranking on criteria that were never communicated to those staff beforehand (i.e. they didn't know where the goalposts were!)

But can you get access to any relevant performance records, periodic reviews, etc? That might help a lot.

Chris

Mark's picture

I'm not sure I follow you. You've never ranked anyone before, and as such you want a template of some sort, and in your case it's made more difficult because you have no metrics upon which to base your ranking?

(This is not judgmental, just a brief paraphrase.)

Mark

US41's picture

I have three teams reporting to me, with five people on each team. I rank the teams, and I rank the individuals. I have a scoreboard on which I record successes and failures; I divide them out and come up with numbers. When I sort the list from best score to worst, the people fall into roughly the order I had intuitively perceived anyway, with a few surprising exceptions.

It's hard to make metrics purely objective, but I've managed to come close. It has been a long process, with a lot of experimentation, thought, and slow work toward identifying the key behaviors. Most difficult was deciding what to count as successes, because obviously I cannot attend every call, read every document, or see everything they do. However, once I found an easy-to-track denominator against which to divide exceptions, I was able to come up with formulas that work surprisingly well to build the scoreboard.
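The denominator approach can be sketched in a few lines. Every name, count, and the particular rate formula below is invented for illustration; the actual metrics would come from whatever successes and exceptions you can reliably track:

```python
# Hypothetical scoreboard sketch: score each person by successes minus
# exceptions, divided by an easy-to-track denominator (here, total items
# handled), then sort best to worst. All data is made up.
staff = [
    # (name, successes, exceptions, total_items_handled)
    ("Alice", 42, 3, 50),
    ("Bob",   30, 9, 48),
    ("Carol", 55, 2, 60),
]

def score(successes, exceptions, total):
    """Net success rate against the tracked denominator."""
    return (successes - exceptions) / total

ranked = sorted(staff, key=lambda s: score(s[1], s[2], s[3]), reverse=True)
for name, *_ in ranked:
    print(name)  # prints Carol, Alice, Bob (one per line)
```

Sorting the scored list is what produces the standings you can publish and, if layoffs come, read from the bottom up.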

It's not easy to do. It took a long time to develop, with a couple of ludicrous failed attempts at metrics along the way.

Once you have something like this in place, and you honestly track it, give feedback against it, and publish the standings to your folks so they can improve themselves, it serves as major feedback, and their performance in the areas that drive those metrics will turn sharply upward. We've seen a 50% increase in behaviors that result in effective performance since we started working with the scoreboard.

Having it in place means that if layoffs come, I pluck the bottom folks from that sorted list, and I keep the people I need.

But I was very, very careful to spend a long time (months) listing the behaviors that I wanted to measure and track. If you don't measure it, they don't do it. If you do, they will. Make sure what you measure is what you want.

If that doesn't sound doable for you (and I can understand why not everyone would do it), then be brave and list them in the order you think they perform. You are probably 80% correct against what your perfect objective matrix would reveal anyway, so you have a good chance of making the right decision.

... IF you are doing feedback, O3s, etc., and are in touch with your people and how well they perform.

That last part is the real key.