Whether you are a shelf stocker at Walmart, a second-year associate at a consulting company or an equity analyst at an investment bank, you may feel that you are not adequately compensated for the work you do — in other words, you are underpaid. But underpaid relative to what? How do employers determine compensation levels, and what consequences can these decisions have for the organization?

Indeed, many people think that compensation systems are broken, with some CEOs paid exorbitant sums that are not always related to their performance while lower-level employees are paid salaries that barely keep them above the poverty level.

An article published last month on Bloomberg.com illustrates the gap between high- and low-wage earners in the U.S. According to the article, in 2012, the average multiple of CEO compensation to that of rank-and-file workers was 204, up 20% since 2009. In other words, the average CEO made 204 times what the average worker earned in wages and benefits.

The most egregious example cited by Bloomberg.com was Ron Johnson, former CEO of J.C. Penney, which fired him April 8 after a 17-month stint during which he failed to turn around the company. Johnson, according to Bloomberg, received $53.3 million in compensation as reported in the company’s 2012 proxy — “1,795 times the average wage and benefits of a U.S. department store worker [$29,688] when he was hired.”

Comparing the two numbers “is the equivalent of stacking the length of a loaf of bread — give or take a few slices — against the height of the Empire State Building,” the article said, citing two other notable examples: Abercrombie & Fitch CEO Michael Jeffries received $48.1 million in 2012, 1,640 times the average clothing-store worker’s $29,310 compensation package. Simon Property Group paid CEO David Simon $137.2 million in 2011, 1,594 times the average compensation of $86,033 paid to “employees of funds, trusts and other financial vehicles.”
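
To make the arithmetic behind these multiples concrete, here is a minimal sketch in Python that recomputes the ratios from the figures cited above; small differences from the published multiples reflect rounding in the reported pay figures.

```python
# Recompute the CEO-to-worker pay multiples cited in the article.
pay_gaps = [
    ("Ron Johnson / J.C. Penney", 53_300_000, 29_688),               # vs. department store worker
    ("Michael Jeffries / Abercrombie & Fitch", 48_100_000, 29_310),  # vs. clothing-store worker
    ("David Simon / Simon Property Group", 137_200_000, 86_033),     # vs. funds/trusts employee
]

for name, ceo_pay, worker_pay in pay_gaps:
    multiple = ceo_pay / worker_pay
    print(f"{name}: roughly {multiple:,.0f} times the average worker's pay")
```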

But CEO pay is just part of a much bigger issue: What does it mean to be “fairly” compensated? What are the consequences when employees feel they are underpaid, and how can employers address this concern? 

The Employee Perspective

According to Wharton management professor Peter Cappelli, the issue comes down to “whether employees believe that the amount you are paying them, all things considered, is unfair relative to what you are asking them to do and relative to what [type of job] they could get someplace else.”

Consider universities and other non-profits, says Cappelli, whose latest book is Why Good People Can’t Get Jobs: The Skills Gap and What Companies Can Do About It. “They tell employees that they may be making less money in terms of salary, but the benefits are good, the jobs are stable, the mission is important. All that could be true. The problem comes if, on balance, employees believe that the other attributes of the job are not commensurate with the low pay. If that is the case, the organization is likely to lose people, and its turnover rate will probably be higher, which of course ends up costing money. Also, the organization is likely to get people for whom it is their second or third choice — i.e., people who don’t necessarily want to be there but couldn’t get better jobs somewhere else.”

One of the key determinants of satisfaction — or dissatisfaction — with compensation is how employees feel their pay package compares to others, according to Wharton management professor Matthew Bidwell. “No doubt if somebody thinks he or she is doing the same work as another who is paid a lot more, this leads to resentment and ultimately to disengagement.”

Employers pay employees different compensation partly because of supply and demand, says Bidwell. “If the supply conditions are favorable, then wages go down. It feels rational.” But clearly employers also want their employees to be happy in their jobs. “So paying people the absolute minimum you can get away with is probably not a very good idea in terms of motivating them and keeping them from jumping ship.”

Wharton management professor John Paul MacDuffie cites research which suggests that employees arrive at perceptions of fairness regarding their compensation by comparing the ratio of their inputs — including, for example, their credentials, level of experience and amount of effort put into the job — to their outcomes, including such things as salary and benefits. Under this theory, employees also compare themselves to someone else, such as another person in the organization or even to themselves at an earlier stage of their career. In any case, “if the ratio is not equal, it causes a psychological strain that the employee wants to resolve,” MacDuffie says.
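
MacDuffie is describing a ratio comparison. A minimal sketch of that logic, using purely hypothetical numbers and the simplifying assumption that inputs and outcomes can be scored on a common scale, might look like this:

```python
# Sketch of the input/outcome comparison MacDuffie describes.
# Scoring "inputs" and "outcomes" on a single numeric scale is an
# illustrative assumption, not a measurement method.

def fairness_gap(my_inputs, my_outcomes, other_inputs, other_outcomes):
    """Positive gap: the comparison person gets more outcome per unit of input."""
    return other_outcomes / other_inputs - my_outcomes / my_inputs

# Hypothetical example: equal credentials and effort (inputs = 10),
# but the colleague's pay-and-benefits package scores higher.
gap = fairness_gap(my_inputs=10, my_outcomes=50, other_inputs=10, other_outcomes=65)
print(f"Perceived inequity: {gap:+.1f}")  # a positive gap creates the strain he mentions
```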

To deal with a feeling of being underpaid, he adds, an employee can do a number of things: focus on the fact that he is lucky to have a job in a down economy, focus on the benefits of the job instead of the low pay, demand a raise or quit.

Wharton management professor Adam Cobb comes at the issue from the perspective of labor rates versus labor costs. Organizations, he says, do everyone a disservice by “equating the two. Labor rates refer to how much an employee makes per hour. But labor costs also reflect productivity. You could have two workers,” one who gets paid $20 an hour and the other $10 an hour. “But that doesn’t mean your labor costs are higher” if the $20 employee is five times more productive. Employers, especially when it comes to low-wage workers, “tend to think that if you raise the minimum wage, it will make labor prohibitively costly. But the reality is, if you pay people more, they tend to work harder,” whether that means devoting more attention to customers or pointing out ways that business processes can be improved.
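
Cobb’s distinction is easy to see with back-of-the-envelope arithmetic. The sketch below uses his hypothetical $20-an-hour and $10-an-hour workers and assumes output can be counted in units per hour:

```python
# Labor rate (pay per hour) vs. labor cost (pay per unit of output),
# using Cobb's hypothetical workers. Productivity figures are assumptions.
workers = [
    {"name": "higher-paid worker", "hourly_rate": 20.0, "units_per_hour": 50},  # 5x as productive
    {"name": "lower-paid worker",  "hourly_rate": 10.0, "units_per_hour": 10},
]

for w in workers:
    cost_per_unit = w["hourly_rate"] / w["units_per_hour"]
    print(f"{w['name']}: ${w['hourly_rate']:.2f}/hour, ${cost_per_unit:.2f} per unit of output")

# The higher-paid worker's labor *rate* is twice as high, but the labor
# *cost* per unit of output ($0.40 vs. $1.00) is less than half.
```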

A growing body of research, says Cobb, looks at the connection between low-wage work and productivity, and yet these studies don’t always filter down into corporate decision making. Instead, store managers often get bonuses if they reduce labor costs by eliminating employee bonuses, cutting back employee hours and so forth — i.e., “doing things that will diminish productivity.”

Employees and the Recession

An economy in recession, or slowly recovering from one, is rarely good news for wage earners, especially for the newly hired. Despite record highs in the stock market and slow improvement in the job market, unemployment remains at 7.5% and the federal minimum wage is $7.25 an hour, last raised in July 2009.

According to Lawrence Mishel, president of the Economic Policy Institute in Washington, D.C., productivity between 1973 and 2011 grew around 80% while wages and benefits of the median worker grew about 11%. “Almost all of that growth occurred in the 1995 to 2000 period. So, outside of these few years, there has been almost no growth in pay, but substantial growth in productivity.” In recent years, he adds, “we have seen the phenomenon of historically high profits [along with] substantial unemployment and little wage growth for [virtually any] group of workers,” including both high school and college graduates.

At the same time, economic policies over the past three decades “have failed workers although they may have succeeded in doing what they were meant to do — which is to make companies better off,” Mishel states. “The rich get richer and other workers are unable to participate fully in the economy’s gains.” A clear example is Apple, he says, “which pays college graduates $12 to $14 an hour to work in their stores, and yet the company has so much cash it doesn’t know what to do with it. Many companies are doing extraordinarily well, but it doesn’t seem to translate into greater pay for their workers.” 

Recessions can somewhat distort wage scales, notes Bidwell, citing a book by Princeton psychologist Daniel Kahneman called Thinking, Fast and Slow, in which the author observes how people’s perceptions of fairness in pay are heavily skewed towards the compensation they are receiving today. For example, Bidwell says, if you hire a worker at $20 per hour in good times and cut his pay to $15 an hour during a recession, he will be more dissatisfied than if he had been brought in at $13 an hour, even if he ends up making more money in the long run. “This idea gets to the question of why employers tend not to cut employees’ pay during recessions, and why employees hired during a recession tend to be paid less than they would in a booming economy.”

The answer, according to one recent survey, is that employers don’t cut pay “because their employees would be so upset that it isn’t worth it,” says Bidwell. “So you have this kind of tension, which is that markets go up and down, but in any kind of individual exchange, we tend to expect the terms of our own employment to stay the same. We would be outraged if our employers cut our pay, but we also know that wages are supposed to adjust to fluctuating markets. So the brunt of adjustment falls on people getting hired at the time.”

Today’s economic conditions make decisions about compensation tougher than they used to be, Bidwell adds. “When we had high inflation, it was easy to give people [real] pay cuts by giving them no raise at all or a very low one. But because we have low inflation and people are very reluctant to cut nominal pay, it’s much harder to bring people’s pay down. Once again, you may see more of the effects going on when people are being hired.”
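
Bidwell’s point about inflation reduces to simple arithmetic; the sketch below uses hypothetical raise and inflation figures:

```python
# Under high inflation, a frozen nominal wage is a sizable real pay cut;
# under low inflation it barely moves. All figures are hypothetical.
def real_pay_change(nominal_raise_pct, inflation_pct):
    """Approximate real change in pay, in percent."""
    return ((1 + nominal_raise_pct / 100) / (1 + inflation_pct / 100) - 1) * 100

# A 0% raise during 8% inflation vs. a 0% raise during 1.5% inflation.
print(f"High inflation: {real_pay_change(0, 8.0):+.1f}% real pay change")  # about -7.4%
print(f"Low inflation:  {real_pay_change(0, 1.5):+.1f}% real pay change")  # about -1.5%
```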

This scenario has been played out frequently over the past few years, says Cobb. “When the economy hits a rough spot, the most obvious response is to institute hiring freezes and cut wages and benefits — actions which have a fast but very short-term effect on cash outlays. You are spending less money, but over the medium and long term, calculating how productive your workers are is a lot harder than calculating your labor costs.”

Fast-food Restaurants vs. Investment Banks

The issue of fairness, as noted earlier, comes up when employees in a particular company discover that similar jobs elsewhere pay more. Such comparisons happen frequently in retail — fast food restaurants, department stores and consumer electronics companies, for example — where salary information is relatively easy to obtain. “At that point, the employee might say, ‘If you [the company] are paying me 30% less than [what I think is fair], I am going to work 30% less hard,’” says Cappelli.

In the retail sector, where one of the big problems is theft by a company’s own employees, the impact can be especially acute, he adds. Once theft becomes a serious problem, the employer may start to monitor employees with cameras and other security devices, creating an environment that feels “like a prison, which in turn, leads to even greater attempts by the employees to get back at the company.”

Economists say that you get what you pay for, Cappelli notes. “If you are paying really low wages, you are going to get low quality people performing at a low level who don’t worry about getting fired.” But relevant to all this is the idea of fairness. “Employees do have the ability to [practice] discretion in their jobs. They can use that discretion to make jobs seem fair, which means they can steal or slow down [their output] or shirk [responsibilities]. Economists don’t take those things into account.”

Higher-paid jobs, such as those in knowledge industries where companies might be able to justify paying some employees more than others, face a different set of issues. “Here, it depends on the culture of the organization,” says Cobb. “If everyone knows this is a competitive place, and if performance is what [determines] compensation, it might be less of a concern because individuals are self-selecting into those kinds of firms. It’s when firms have strong norms around teamwork and project-based goals that rewarding someone over another person can be problematic.”

MacDuffie suggests that “organizations that pay for performance are often thinking primarily about the incentive effect. But reward system design needs to balance incentive effects and equity concerns. The wider the dispersion of rewards generated by pay for performance systems, the harder it is for people to believe it is fair, and the harder it is to believe that performance actually varies as widely as the rewards that are being generated.”

Consequently, he adds, “some employers will try to compress, or limit, the range of that dispersion. That can help with the perceptions of fairness, especially if the work is interdependent and performance metrics are hard to come by.”

In addition, some employers will consciously choose to pay above-market compensation “because they think it will help them attract and retain better talent,” MacDuffie says. Employers also might hire people at different compensation levels based on individual negotiations with the employee, he adds. “But if there is too much of that, you increase the risk that people will find it unfair, assuming employees have information” about their colleagues’ salaries.

Underlying discussions about jobs in knowledge industries, Cobb adds, is the reality that high-status companies, including top-ranked investment banks and consulting firms, know they are especially attractive to young workers and therefore don’t have to pay as much when hiring externally. Instead, these firms can point to benefits that are not reflected in a paycheck — such as the networks these employees are building or the experience they are getting that will help them get accepted into a good graduate school.

Most people would not refer to these types of work arrangements as “exploitation,” says Cobb. “Lower wage work is where the prototypical modern labor exploitation takes place. And even here, it is a relative term. A child working in a factory in Bangladesh presents a much more serious case of exploitation than what is typically found in the U.S.”

At the same time, he adds, “if you look at the working conditions and low wages paid by some U.S. companies, it is clear that these employees are the people who keep our prices low and allow us to have inexpensive food and inexpensive clothes, because the gap between the amount they produce relative to the amount they get paid is so huge.”

The Employer Perspective

What can employers do when an employee, or a group of employees, perceives that they are underpaid? “One thing is to clearly explain to employees what the reward systems are … and establish a grievance or complaint process to ensure procedural fairness,” says MacDuffie.

Employers can also draw upon market wage information, suggesting, for example, that employees “compare themselves to people who work in their particular industry, job class and labor market,” he adds, and then commission a wage and benefit study to show those exact comparables. “It’s one of the ways employers can try to shape employee perceptions about the fairness or unfairness of rewards.”

Kevin Hallock, director of the Institute for Compensation Studies, professor of economics at Cornell University and author of a recent book titled Pay: Why People Earn What They Earn and What You Can Do Now to Make More, notes that “many large companies have administrative pay functions that are very formalized. They include clear compensation grids, matching to external data, use of surveys and so forth. Even for jobs that aren’t common in a company’s particular market, there are ways to slot them in and figure out what the market might pay someone [who has skills important to the organization]. So there is actually quite a bit of science in this.”
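
As an illustration of the kind of formalized structure Hallock describes, here is a minimal sketch of a pay grid with a market-anchored “compa-ratio” check; the grades, salary ranges and figures are hypothetical, not Hallock’s or any particular company’s system:

```python
# Minimal sketch of a formalized pay grid: each grade carries a salary range
# anchored to external market data. All figures are hypothetical.
PAY_GRID = {
    "grade_1": {"min": 30_000, "midpoint": 37_500, "max": 45_000},
    "grade_2": {"min": 45_000, "midpoint": 55_000, "max": 65_000},
    "grade_3": {"min": 65_000, "midpoint": 80_000, "max": 95_000},
}

def compa_ratio(salary: float, grade: str) -> float:
    """Salary relative to the grade midpoint (1.0 = at the market midpoint)."""
    return salary / PAY_GRID[grade]["midpoint"]

# A hire slotted into grade_2 at $50,000 sits below the market midpoint.
print(f"compa-ratio: {compa_ratio(50_000, 'grade_2'):.2f}")  # 0.91
```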

At the same time, Hallock adds, “some employees feel decisions about compensation are arbitrary, because organizations don’t always communicate well about pay. So [there has to be] a well functioning system.” Workers, he notes, should think about their total compensation package — including such things as health benefits, child care programs and training opportunities — when comparing one job to another. “It’s not just the salary.”

Some companies, both for-profit and non-profit, make it a point to give employees a printout every year that clearly explains their benefits and what it costs the company to provide them.

Hallock suggests that each firm needs to decide its optimal wage. “Some have a low-wage strategy that works for them. They are less concerned about employee turnover, and perhaps customer interaction doesn’t matter as much. Other companies profit quite a bit by paying even a little more because it makes a huge difference in the quality of their workers…. On the other hand, if a company raises wages too much, it might not have a big payoff. It depends on” the market and industry.

Roxana Barbulescu, a management professor at McGill University and a visiting professor of organizational behavior at Wharton, notes that companies could choose to be more transparent about how they are run. “They can make the allocation of resources clearer, use input from employees for decision making or indicate if they are having a bad year and need to regroup.”

She also points to a well-documented aspect of today’s economy — the high number of job seekers taking positions for which they are overqualified, “which implies that they will also be underpaid relative to what they would be able to make” in a stronger economy, she says. “But they are taking these jobs in order to improve their employability for future jobs and eventually get back into the labor pool. They are trying to turn [the jobs] they have into an investment in their future.”

“We teach our students that fairness and the market are two completely different things,” says Bidwell. “The market is all about matching supply and demand and trying to tailor compensation to get and retain the best people you can while not paying too much to everyone else. Fairness is paying everyone the same. To some extent, you can do one or the other. The more we focus on fairness, the more we risk losing our [best] people or people with unusual situations. The more we say we want certain people and will do whatever is necessary to get them, the more we [inject] unfairness and inequity into the workplace. There is no easy answer.”