My problem with Jack Welch's "vitality curve" is that it assumes a certain distribution of performance is immutable. It assumes a static world in which 20% deserve to be promoted and rewarded, 10% deserve to be fired, 70% are good where they are.
Now, a large company in 1981 (when Welch became CEO of GE) would have a reasonable amount of deadweight. Why? Because from 1945 to 1981, people were still (culturally, at least) in shell shock from the Great Depression and (a) firing was rare, and (b) people didn't leave stable jobs, even after they'd checked out, and even if there were great opportunities elsewhere. So, as much as I dislike the "greed is good" attitude of the 1980s and the rampant "cost-cutting", I have to admit that most companies circa 1980 could use a bit of cutting, because they'd become pretty bloated and risk-averse.
Firing the bottom 10% of a large, sprawling company in 1981 was probably a good idea. You could do this maybe twice (i.e. two years in a row) and improve the company. The first time, you get a major effect, because the worst people aren't just costing money, but sucking away time and morale: they're dividers rather than mere subtractors. Every divider gone is a good step for a company. Subtractors tend to be just less competent than expected, and they should be improved and given more chances; but don't ever think twice about firing a divider.
But once you've gotten rid of your deadweight, now you're firing half-decent people who just haven't "clicked" yet. After the obvious underperformers (1 to 30% depending on the organization) are gone, the next targets are junior members of underperforming teams (and this is perverse, because the newest people are the least responsible for the team's problems). This, however, exacerbates the discrepancy between the good and bad teams: before, the differences were matters of prestige and where one sat in the cafeteria, but now being on the wrong team means getting fired. So the careerist scramble to get onto the right projects and under bosses who can protect their underlings gets a lot more heated. Soon you have the careerism and "warring departments" dynamic that Microsoft is getting raked over the coals for. Google, which has a less severe rank-and-yank (5% get PIP'd, which doesn't usually lead to firing, but only because most Googlers can get jobs anywhere they want with "Google" on their resume), has the careerism but not the warring departments (yet); still, the internal mismanagement of Google+ sociology is definitely a step in the wrong direction.
> My problem with Jack Welch's "vitality curve" is that it assumes a certain distribution of performance is immutable. It assumes a static world in which 20% deserve to be promoted and rewarded, 10% deserve to be fired, 70% are good where they are.
Once your organization is large enough, a pretty reasonable case can be made that this distribution remains roughly constant (although, in fairness, the argument isn't that 10% deserve to be fired; it's that 10% are underperforming, and you need to quickly determine whether that is going to change). It's undoubtedly NOT precise or immutable, but it is probably closer to "correct" than what happens without such practices in place.
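As a toy illustration (purely hypothetical, assuming performance collapses to a single noisy score per employee, with names and numbers of my own invention), the mechanics of a forced 20/70/10 cut are easy to sketch, and running it at different headcounts shows the statistical sense in which the split looks "constant" at scale:

```python
import random

def vitality_buckets(scores, top=0.20, bottom=0.10):
    """Rank scores and split them into top/middle/bottom buckets (20/70/10 by default)."""
    ranked = sorted(scores, reverse=True)
    n = len(ranked)
    top_cut = int(n * top)
    bottom_cut = int(n * bottom)
    return {
        "top_20": ranked[:top_cut],
        "middle_70": ranked[top_cut:n - bottom_cut],
        "bottom_10": ranked[n - bottom_cut:],
    }

if __name__ == "__main__":
    for headcount in (50, 5000):
        # Pretend performance is a single noisy score -- a big assumption.
        scores = [random.gauss(100, 15) for _ in range(headcount)]
        buckets = vitality_buckets(scores)
        # Cut-off scores: the lowest score still in the top 20%, and the
        # highest score still in the bottom 10%. With 50 people these jump
        # around from run to run; with 5000 they barely move.
        print(headcount,
              round(min(buckets["top_20"]), 1),
              round(max(buckets["bottom_10"]), 1))
```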
> But once you've gotten rid of your deadweight, now you're firing half-decent people who just haven't "clicked" yet.
You are assuming no hiring, no acquisitions, and no changes in the business that alter the value of employees' work. That is the typical image of a large, lumbering conglomerate, and part of the point of codifying the practice is to force the organization to step out of that myth.