The overall “quality” of the 15 biggest U.S. airlines improved in 2011 over the previous year, but not by much: According to the annual Airline Quality Rating (AQR), scores improved from -1.20 to -1.08. (AQR scores are always negative, so the less negative the better.) Moreover, say the report’s authors, the score for 2011 was the best since they started the current scoring system in 1990.
Among individual airlines, the top scorer, ironically, was AirTran, which owner Southwest is phasing out as a brand. Rounding out the top five, in descending order, were Hawaiian Airlines, JetBlue Airways, Frontier, and Alaska Airlines. Delta Air Lines came next, scoring best among the giant "legacy" lines, followed by Southwest, US Airways, American, Continental, and United. As usual, the regionals accounted for most of the low scores.
AQR scores are a composite of four statistical measures compiled and reported by the U.S. Department of Transportation (DOT): on-time performance, how many travelers were denied boarding ("bumped"), how many bags the line mishandled, and how many complaints were logged by the DOT's consumer program, all adjusted to a consistent per-passenger index. Collectively, the 15 scored airlines improved performance in all four of the statistical measures. Four individual lines, however, saw their scores decline: Continental, Mesa Air Group, United, and (very slightly) Hawaiian Airlines.
Current-year AQR scores appear to be a reasonably good predictor of future performance. Although the order of scores changed somewhat from 2010 figures, both winners and losers tend to be fairly consistent from year to year.
Nevertheless, I have a big problem with the AQR scores. As I’ve noted in previous analyses, people define “quality” for any product or service in two dramatically different ways:
- How good the product is, when delivered as the supplier promises, and
- How well the supplier delivers on what it promises.
The AQR scores reflect only the delivery component of quality—how closely each airline comes to meeting the basic obligations it has assumed. Thus, for the most part, the scores reflect the absence of problems rather than any positive experience. They have nothing to say about the many other factors that affect how much you might enjoy a flight—seat space, onboard service, in-flight entertainment, and such—nor do they account for the generosity (or stinginess) of the various lines' frequent-flyer programs or the presence or absence of annoying fees and charges.
The problem with delivery-based scores is, of course, that an airline can deliberately design and provide a poor product—think Spirit or Ryanair—and still earn a top AQR score by delivering its poor product consistently.
From the beginning, AQR authors have touted the scores as unique in that they’re based entirely on objective statistical data. True enough: Most of the “how good is the product” quality features can be measured only by traveler surveys, with their unavoidable biases. Nevertheless, when most of you select an airline, those “how good is the product” factors tend to outweigh the AQR factors by far, especially given the relatively small spread between top and bottom performance scores. And for the “how good” element of quality, my take—and that of many surveys—is that JetBlue offers the best flight experience of any large domestic airline; the other usual front-runner, Virgin America, didn’t carry enough traffic to make the AQR scoring.
The annual AQR release usually generates more attention than I've seen so far this year, probably because attention is currently focused on fuel prices, fees, and vexatious security hassles. Even so, you'll probably still see quite a bit about AQR. And when you do, keep in mind that AQR measures only one part—and, to me, the less important part—of the overall airline quality picture.
Ed Perkins on Travel is copyright (c) 2012 Tribune Media Services, Inc.