There was a lot that Anjani Jain liked about Bloomberg Businessweek’s ranking of business schools. It was only when the deputy dean at the Yale School of Management dug deeper that it stopped making sense to him.
Like most publications that rate institutions of higher education, this one chose certain categories to evaluate, such as how much money graduates make, and weighted each category based on its importance. But unlike many of its peers, Businessweek asked students, recent alumni, and recruiters what was important to them, and used their responses to determine how much weight to give to each of the five categories it used to evaluate schools: compensation, learning, networking, entrepreneurship, and diversity. To Jain, this seemed like a good idea.
The dean, who has a background in mathematics, physics, and operations research, began poking around at the numbers, trying to replicate the ranking using the information that was public. He has tinkered with other rankings too; digging into the calculations can be a way for him to procrastinate when he has other work. But when following Businessweek’s published methodology, Jain’s ranking of the business schools did not come out in the same order.
“I noticed that somehow there were numerical anomalies,” he said. The crowdsourced weights that the magazine said it gave to each of its five categories didn’t seem to yield the ranking order the magazine had published.
Perplexed, Jain wrote to Businessweek. An editor wrote back, explaining that the weights had been applied to the raw scores, rather than to the published scores.
Jain was skeptical. He believed that he had deduced what the weights would have to be in order to recreate the list that Businessweek published. Those weights, he said, were not the same as what the magazine had gotten from its surveys.
The way Jain sees it, one of two things happened. Either the weights were applied to the data before it was normalized — a mathematical process that can be used to ensure that data sets are being compared on the same numerical scale — which would be a statistical error, he said. Or the ranking was calculated accurately, but after the fact some sort of “mysterious manipulation,” as Jain put it, took place.
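The distinction Jain is drawing can be made concrete. In a minimal sketch (with hypothetical schools, scores, and weights, not Businessweek's actual data), applying the weights after normalizing each category can produce a different ranking than applying them to raw values on incompatible scales:

```python
# Hypothetical illustration of why the order of normalization and
# weighting matters. None of these numbers come from Businessweek.

def min_max(values):
    """Min-max normalize a list of numbers onto the [0, 1] scale."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Three made-up schools, two categories on very different raw scales:
# compensation in dollars, a survey-based learning score out of 10.
schools = ["A", "B", "C"]
compensation = [150_000, 120_000, 100_000]
learning = [6.0, 9.5, 9.0]
weights = {"compensation": 0.4, "learning": 0.6}

# Statistically valid order: normalize each category first, then weight.
comp_n, learn_n = min_max(compensation), min_max(learning)
after = [weights["compensation"] * c + weights["learning"] * l
         for c, l in zip(comp_n, learn_n)]

# Invalid order: weight the raw values, then normalize the combined score.
# Compensation's large dollar scale swamps the intended 40/60 weighting.
raw = [weights["compensation"] * c + weights["learning"] * l
       for c, l in zip(compensation, learning)]
before = min_max(raw)

def rank(scores):
    """Return school names ordered from highest to lowest score."""
    return [s for _, s in sorted(zip(scores, schools), reverse=True)]

print(rank(after))   # → ['B', 'C', 'A']
print(rank(before))  # → ['A', 'B', 'C']
```

With identical inputs and identical weights, the two procedures put different schools on top, which is the kind of divergence Jain says he found between the published weights and the published ranking.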
A Businessweek editor did not respond to the Chronicle’s request for comment. The magazine stands by its ranking.
On a Zoom call, Jain asked a Businessweek editor if he could answer a yes-or-no question: Did you normalize the data before applying the weights? The editor asked for two weeks to get back to Jain, but the dean felt that was too long. He published his findings in Poets & Quants, a news site that covers graduate business schools and has its own ranking of such institutions. He later published a second piece about the 2018 and 2019 rankings, in which he says he also found some issues.
Businessweek requested a correction.
“Neither Poets & Quants nor Yale had access to the raw scores that are used in calculating the ranking, which Businessweek pointed out to Yale multiple times prior to the analysis’ publication,” a Bloomberg News spokesperson told Poets & Quants, according to an article that was published a week later. “By design, our proprietary ranking cannot be replicated or gamed using published data.”
The spokesperson said that the magazine’s methodology was vetted by multiple data scientists and that disclosing raw data “would create the possibility that the rankings could be reverse-engineered or gamed by a school for an unfair advantage.”
In response, Jain said he didn’t need access to the raw scores, nor did he ask for them.
“The published normalized scores are a sufficient proxy for the raw data,” he told Poets & Quants. “The statistically valid way to preserve the weights is to apply them to either the normalized or standardized versions of the raw data. That is what I did, and found the resulting rankings to be widely divergent from what BBW published.”
Jain did not back down. Instead, he urged the magazine to correct its ranking.
“I understand why this departure from your dug-in position is difficult,” he wrote to an editor. “But I hope that ethical considerations will ultimately prevail.”
In Jain’s revised ranking, some of the biggest names in the business-school world would plummet. The University of Pennsylvania’s Wharton School, for example, would slide from No. 9 into the 20s, he said.
The Poets & Quants posts did not go unnoticed. Jain’s original piece received almost 40,000 views. He heard from multiple deans, he said. One of them was Hasan Pirkul, dean of the school of management at the University of Texas at Dallas. His institution was ranked No. 32 on Businessweek’s list, but Jain said that by his calculation, it should have been No. 9.
“He’s done a service to all of us,” Pirkul said. He said his appreciation of Jain’s calculations had nothing to do with the fact that his school would bounce up the list. “You want transparency. You want people to be able to believe in the results.”
Colleges have a tortured relationship with the popular lists that rank them. Some say they favor well-heeled, exclusive institutions and incentivize colleges to admit wealthy students. But Pirkul said they can help some institutions prove their value to a national and international audience — and potential applicants — whom they would otherwise be unable to reach.
“Rankings are very important for some schools,” he said. “If you’re a newcomer like us, you really need to have some kind of criteria where you can compete.”
Tatiana Melguizo, an associate professor at the University of Southern California’s Rossier School of Education who has a background in economics and researches quantitative methods of analysis, took a look at Jain’s work at the Chronicle’s request.
“My feeling is that he’s right,” she said. “Depending on when you normalize and when you apply the weights, the final ranking is subject to change.”
Melguizo said she was glad someone was pushing back on the ranking by asking for more transparency.
“The issue of replicability is huge,” she said, arguing that people should be able to see how these lists are being created. “We really need to push these rankings agencies to show the code.”
She added that Businessweek’s ranking had some good features. It was a good idea, she said, to ask the people who might be affected by such rankings how much they cared about each of the categories that the magazine measured.
“That democratizes the rankings,” she said. And she liked the inclusion of the diversity category, which was new for the magazine this year.
She wondered if there might have also been something else motivating Jain. How, for example, did his school fare in the ranking? When he shared his calculations with his students, Jain said they had a similar question.
Yale was ranked No. 12 on Businessweek’s list. And on Jain’s recalculated one?
“It so happens,” he said, “that Yale would slightly improve.”
If Yale had scored the highest on the Businessweek list, his students asked, would he have bothered to go through all this trouble?
“Without a shadow of doubt.”