Transcript of Webwatch Travel Research from “Trust or Consequence: Why Credibility Is the Killer App for Online Travel”
- Bill McGee, Consultant to Consumer Reports WebWatch
- Perry Perfors, Head of Research on Services, Consumentenbond in the Netherlands
- Beau Brendler, Consumer Reports WebWatch
- Shafiq Khan, Marriott Hotels
- Tracey Weber, Travelocity
- Cheryl Rosner, Hotels.com
- Monte Ramey, Advantage Rent-A-Car
- Madelyn Miller, TravelLady.com
- Neil Bainton, FareCompare
- Michele McDonald, AMC Communications International, Inc.
Note: This is an edited transcript of the proceedings.
Bill McGee: Thank you and good morning. I just feel it is worth reiterating what Beau [Brendler] said about this being an interactive day. We’ve designed all of the events so that at any time, if you have questions, concerns, criticisms, just jump right in.
As recently as 1999, Web-surfers who typed in www.delta.com would not have been directed to the home page of Delta Air Lines. In fact, they also would not have been directed to the home page of Delta Faucets. They would not even have been directed to the home page of television actress Delta Burke.
Instead, depending on the year, they would have been directed either to the home page of Delta Financial Corporation of Woodbury, New York, or the home page of DeltaCom Development, a software developer based in Raleigh, North Carolina.
So how did this happen? How could the world’s third-largest airline, a company with vast human and technological resources in the fields of sales and distribution, have not foreseen the need to secure intuitive Web domains and URLs?
Much smaller travel companies, with much more limited resources, had correctly forecasted the need to develop Internet sales strategies and had quickly implemented them.
Eventually, of course, Delta Air Lines did secure the URL of Delta.com, as well as DeltaAir.com, and DeltaAirlines.com. But in those days, prior to the introduction of domain registration regulation, published reports indicated that the carrier apparently was forced to pay a hefty sum for the rights to Delta.com.
This isn’t meant as a criticism of Delta. Ironically, in fact, the carrier was quite aggressive about revamping travel distribution strategies in the 1990s, and led the entire industry, for example, in a controversial move to reduce travel agency-based commissions in 1994.
But the Delta.com issue highlights a simple fact. No one – not even the best minds in business, technology, journalism, academia or the government – correctly foresaw all the ways in which the Internet was going to revolutionize worldwide commerce, or how quickly and permanently shopping patterns would be altered forever.
All of us have undertaken this journey together. At times we may truly forget what the world was like prior to the widespread use of the Internet little more than a decade ago. Then we are suddenly reminded of it, and we realize that the introduction of new technologies may well be outpacing our ability to fully absorb and implement them.
Some of us, in fact, may be numbed by how completely our world has changed, and continues to change. I’m reminded of this every time I watch my son use a technological tool. He was born in 1993, and I sincerely believe that he and all other children of his generation arrived hardwired from the factory different from older humans, such as an early 1960s model like me.
Anyone who’s spent time around a child born during the Clinton administration knows this is not a fabrication. My son instinctively understands how to punch, probe and press at computer keyboards, cellular phones, VCRs, and Gameboys, in a way I wouldn’t be able to master even after reading a stack of owners’ manuals.
Yet I’m not ashamed to admit that I’m continually learning from an 11-year-old. In fact, all of us are still learning about this strange and wondrous medium known as the Web.
The historical record makes it clear that each major advance in communications brought unforeseen benefits, just as it was coupled with unforeseen problems. This was true for the printing press, the telephone, the radio, the motion picture, the television, the cell phone. And now it’s just as true for the Internet.
In looking back at the early research conducted by Consumers Union into the online travel industry, it’s clear that some problems have been fully addressed by now, while some of today’s pressing concerns had not even been raised then. And by “early,” I’m referring to research that was completed less than five years ago.
This is a key reason why we are gathered here today. For three years, Consumer Reports WebWatch has been devoted to analyzing, reporting on, and seeking to improve the quality of content on the Internet. Today’s conference is a chance for all of us to take stock, note how and where progress has been made, and clearly address where changes are generally required.
Specifically, I’d like to, one, offer you an overview of the online travel testing projects Consumer Reports WebWatch has undertaken. Two, share new findings from a fresh report examining first-class airline bookings, which is just being released today. And three, explore the challenges ahead.
Today’s event is not designed as a unilateral exercise. Everyone in this room – WebWatch staffers, panelists and attendees alike – is encouraged to participate.
Of course, analyzing the Web is no easy task. We have to recognize that, in many ways, this is like asking fish to analyze water. We are all wedded to technology today, and it’s hard to find the intellectual perspective to grasp both the nuances and the big picture at once.
Undoubtedly, many of us used the Web to research and/or book the airline, hotel and car rental reservations required to be here in Dallas today. At WebWatch, we appreciate the irony that we are using the Internet as a tool to analyze itself.
Consumers Union and the other funders of WebWatch are committed to improving the quality of online content, so it’s fitting that much of the focus has been on travel, the single largest slice of Internet commerce.
As Jupiter Research noted recently in the fall, the online buying population will grow from 98 million users in 2003 to 156 million in 2009. And their prediction is that the number of consumers who purchase travel online will grow from 26 million to 45 million during this period.
The fact is, over the years, Consumer Reports WebWatch has been harshly critical of many travel companies, and some of those companies are represented here today. But we’d also like to praise those travel sites that have implemented significant improvements on behalf of consumers, and we’d like to work together to explore new opportunities for improvement.
We recognize that, for travel suppliers, whether they are airlines, hotel companies, rental car firms, cruise lines, tour operators, or vacation packagers, the challenges have never been greater.
Consider for a moment that your travel product can be sold through a storefront ticket office, a toll-free reservations line, a direct-mail system, an e-mail alert system, a third-party wholesaler or consolidator, a global distribution system, the entire travel agency community, with or without a base commission policy, a targeted travel agency compensated with a specific override bonus reward program, a corporate travel department. And then, of course, a third-party integrated travel site, an opaque bidding site, a fourth-party aggregator site, or your company’s own branded Web site.
There’s no doubt that, from the travel supplier’s side, most sensible strategies rely not just on one or two of these choices, but on most or even all of them. And there’s also no doubt that the Internet represents the least expensive way for nearly all travel companies to sell their wares.
Barry Diller, the CEO of InterActiveCorp, the parent company of Expedia, Hotels.com, Hotwire and other companies, said that the Internet pulls power away from suppliers and puts it in the hands of consumers.
So think for a moment about this from the consumer side – after all, that’s what we do all the time at Consumer Reports WebWatch. Buying travel has become more complex, not less complex. The technological tools have paradoxically made it easier and harder to buy travel online.
Yes, there is more choice. And WebWatch’s research has unveiled compelling evidence that there is more competition as well. But just as the Bruce Springsteen song laments that we now have 57 channels and there’s nothing on, the sheer magnitude of the choices offered to travelers can be overwhelming.
Consider it: For just about every retail product on the market, from eggs to shoes to a brand-new SUV, there are sufficient benchmarks so that even the most novice of buyers can be certain that a given price is fair.
But for travel, the pricing complexities are too dizzying even for many of the most sophisticated shoppers. And that’s just the price of the travel product itself. Determining the correct channel through which to buy it can be even more daunting.
As Joe Brancatelli, the travel columnist, said: “The knowledge barriers are actually quite high, and the average traveler doesn’t know enough to get the best fare. They need to know more than where they want to fly and what they want to pay.”
Civilians, that is, people who work in neither journalism nor travel, often approach me, and they’ll bark out the rate they recently paid for an airline flight or a hotel room or a rental car. Usually the booking has already been completed, and they want me to confirm that they paid a fair price.
On a much deeper level, they want confirmation that they weren’t fooled or duped. Often I start to explain that I’ll need a lot more detail before I could even begin to comment. For example, I’ll say, travel has reversed the distressed inventory theory, and placed the burden not on the supplier, but on the demander. Unlike rotten fruit or Christmas trees on December 24th, the price of travel goes up as it gets closer to its expiration date.
In fact, according to Runzheimer International, in a very recent survey, average airfare prices increase as much as 152% in one day, when comparing seven-day advance purchase airfares to less than seven-day advance purchases.
Invariably, this is the point in the conversation when the civilian abruptly changes the subject, with a shameful look that seems to acknowledge being taken.
Hey, “Don’t worry,” I want to say. We’ve all been there. It’s a universal feeling that buying travel can tie you up in knots. After all, I once paid $400 to fly from New York to Orlando. On Spirit Airlines, no less. And I do this for a living.
Is it any wonder, then, that many consumers are shell-shocked about how and where to buy travel? According to Jupiter Research, nearly two in five online travel consumers say they believe that no one site has the lowest rates or fares.
Before I begin to highlight the research we’ve conducted, it would be useful to explain the genesis of these testing projects. Like many complex undertakings, it had a very simple beginning.
In the spring of 2000, when I was the editor of Consumer Reports Travel Letter, published by Consumers Union, the idea emerged at a staff meeting to test the claims of then-burgeoning travel Web sites, several of which all boasted of offering the lowest rates. Why not simultaneously query identical fares from several sites, and see for ourselves who offered the lowest prices?
Several of us were new to Consumers Union, and we had no idea that this simple project, which I envisioned would take the better part of a morning, would take the better part of that spring and summer.
We soon became indoctrinated into the CU testing environment. First, the research department helped us determine which sites to test. Then the survey department helped us draft the itineraries. Then the statistics department helped us decide on a valid sampling of queries. Then the fact-checking department helped us review every facet of our work. And finally, the legal department helped us to determine that our findings were ready for publication.
What emerged was a methodology based on real-time simultaneous testing, conducted repetitively, anonymously, and without bias of any kind.
Consumer Reports WebWatch for the first time teamed with Consumer Reports Travel Letter to take a fresh look at the online travel market. From that joint project came the decision to continue this work, after Consumer Reports Travel Letter shut down at the end of 2002. Since 2003, WebWatch has sponsored six additional online travel testing projects, including the examination of first-class airline tickets being released today.
To date, we’ve tested 56 different sites based in seven different countries. The total number of rate queries exceeds 8,290. And the total number of hours spent testing – just doing the queries for a real-time test – is more than 786 hours. The total number of hours spent on these sites in pre- and post-testing equals thousands of hours a year, for all of us.
The methodology crafted nearly five years ago has held up very well, but it has also been revised and improved. In fact, most of the improvements have come about due to direct feedback from the travel sites we have tested.
WebWatch recognizes that just as some travel sites have been educated by our research, so too have we been educated by their suggestions. Among the key changes that WebWatch implemented due to feedback from the online industry, one is that we varied the travel itineraries and the length of the booking windows in order to obtain better comparative samplings.
We also revamped our selection of lowest rates to reflect the reality that new display tools, such as horizontal and vertical matrices used on many Web sites, have replaced the scrolls of fares pioneered by computer reservation systems. Instead of selecting the first rate – and in fact now, it’s sometimes not even clear which rate occupies the first spot on many home screens – WebWatch chooses the lowest rate from among the first five results or the first printed page of results, whichever is greater. And, again, this was feedback that we got directly from the sites.
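As an illustrative sketch only – not WebWatch's actual tooling – the lowest-of-the-first-five selection rule might look like this in Python, where `results` is a hypothetical list of displayed fares in the order a site returns them:

```python
def select_lowest_fare(results, top_n=5):
    """Pick the lowest fare from among the first top_n displayed results.

    `results` is a hypothetical list of fares (dollars, in display order);
    we do not assume the first result shown is the cheapest one.
    """
    if not results:
        return None
    return min(results[:top_n])

# The cheapest fare may appear anywhere in the first five slots.
fares = [312.40, 298.10, 305.00, 299.99, 310.50, 250.00]
print(select_lowest_fare(fares))  # → 298.1 -- the sixth result (250.00) is ignored
```

The key design point is that matrix-style displays made "the first result" ambiguous, so the rule scans a small fixed window of results rather than trusting display order.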
WebWatch also broadened the rankings to determine not just which travel sites offered the lowest rates and fares, but also which sites offered the closest rates and fares. Because this is such a competitive market, basically, we found that very often the site that came in second was less than a dollar more expensive than the site with the lowest fare. And so we felt that we needed some way to quantify who was repeatedly offering competitive fares.
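To make the lowest-versus-closest distinction concrete, here is a hypothetical tally for a single query (names and the $10 closeness threshold are illustrative assumptions, not WebWatch's published code):

```python
def tally_query(site_fares, closeness=10.00):
    """For one query, given a hypothetical {site: fare} mapping, return
    which sites offered the lowest fare and which were 'closest' --
    within $closeness of the lowest without matching it.
    """
    best = min(site_fares.values())
    lowest = [s for s, f in site_fares.items() if f == best]
    closest = [s for s, f in site_fares.items() if 0 < f - best <= closeness]
    return lowest, closest

lowest, closest = tally_query({"SiteA": 199.00, "SiteB": 199.60, "SiteC": 239.00})
print(lowest)   # → ['SiteA']
print(closest)  # → ['SiteB'] -- only 60 cents more than the lowest
```

Tallying "closest" fares across many queries is what lets a site that loses by pennies still register as consistently competitive.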
WebWatch expanded its examinations to include not just performance rankings of lowest fares, but analysis of other factors important to consumers, such as ease of use, customer service, and privacy and security policies.
I think before I move on, I’ll just pause for a moment, since I just explained our methodology, and just see if anyone has any questions, specifically about that.
Female Speaker: [INAUDIBLE] earlier [INAUDIBLE] purchases?
Bill McGee: That was, I believe, a forecast that was made by – and you might have the slide there, Jan – that was made by Jupiter Research. And they’re anticipating that, from 2003-2009 the number of consumers who purchase travel online will grow from 26 million to 45 million.
Female Speaker: Thank you.
Bill McGee: Any other questions about our methodology? Because obviously the methodology is the basis for all of our work today.
Male Speaker: Was the focus on airfare only, or is it other components of travel also?
Bill McGee: Actually you’ll see that when we get into specific findings, for the most part what we’re talking about are rankings for lowest airfares and rates for other products, hotel and car rental.
But we’ve also tried – and it’s not easy, quite frankly, to come up with a methodology that’s fair and accurate and apples to apples – to quantify over the years issues such as customer service, by looking at who provides 24-hour help desks, and things of that nature.
And we’ve also, in the latest study, included a new ranking in which we ranked the quality of the lowest fares based on nonstop service. This happens to be a test for first-class airfares.
Beau Brendler: Bill, when we did the study awhile back on opaque sites, we got criticized for our methodology pretty vociferously. Can you talk about some of the reasons related to that, in terms of why we chose particular routes or chose particular points of data to look at, as opposed to sort of testing individual companies’ marketing messages. In other words, I’m referring actually to Hotwire.
Bill McGee: Actually, over time, we’ve been criticized more than once for, particularly with airfares, with the selection of routes that we use.
We always first commission research through Consumers Union’s research department, to look at the most heavily traveled routes in the U.S. What you find when you do that, when you pull up the statistics from the Department of Transportation, is that you find the same cities over and over again. If you look at the top 10 air markets in the U.S., and you remove New York, Chicago, and Los Angeles, you’re not left with much.
And so we really wanted to expand beyond that. For one thing, we wanted to look at a sampling of what the industry terms both leisure routes and corporate routes. We have a sampling of a lot of routes from cold-weather cities to sun destinations, and things like that. But also routes such as Chicago/Dallas, which are for the most part very heavily traveled by business travelers.
So we have geographically expanded across the entire country. And when we’ve done international studies, we’ve looked at gateways across the entire country and destinations throughout the world.
But, again, we have been criticized for that. Our answer is that we’re trying to reach as broad an audience as possible. And in fact, a given Web site may be quite adept at providing low fares or low rates for air or hotel or car in certain markets and not in others.
All of the major third-party Web sites have very aggressive sales staffs, and they’re out talking to hotel chains, or even specific hotel properties or airlines. One of the things we try very hard to do when we have all this data is to see what kinds of patterns emerge. In other words, if Web site A is very adept at offering low fares out of Chicago on Tuesday, did that happen again on Wednesday?
And every time we think we have a pattern going, we find that it falls apart, and I would say the reason is those aggressive sales forces. This is such a dynamic industry, and inventory shifts so quickly among these different competing sites, that it’s impossible for us to say:
Well, if you’re looking for low rates on hotels in Seattle, then this is the site you should go to. Because, in fact, over time, repetitively, we found that in fact there’s a lot of competition.
So that means two things. One is that, obviously, there are positive aspects in the fact that there’s all this competition. But it always comes back to our mantra, which we just say repeatedly over and over again, that you do have to shop around.
And as far as some other criticisms that were leveled at that report, I’ll get into those later on.
Female Speaker: I’m confused about what the criticism was. Was it that your route selection was too narrow?
Bill McGee: Yes, well, actually, that it was too broad, that it didn’t reflect the top 10 routes in the United States. In other words: New York/Boston, New York/Washington, New York/Chicago, New York/LA. So we looked at the DOT traffic numbers, and then from there we usually expand it. And, again, try and get a broader geographical sampling.
Female Speaker: So the criticism was that it wasn’t focused on the top 10?
Bill McGee: That’s right.
Female Speaker: That’s weird.
Bill McGee: Are there any other questions specifically about the methodology?
Female Speaker: Yes, how did you deal with the nearby airports when you were conducting your study?
Bill McGee: That’s a very good question. One of the hardest things we face is that the whole basis of our methodology is to be as apples to apples and as fair to all sites as possible. And that can be extremely challenging.
Anyone who looks closely at competing Web sites will see that there are different interfaces, different technologies. And in fact, you can go on one Web site and say that you’re looking for a lowest fare from New York, and the default will be for the three largest airports: Newark, La Guardia, Kennedy. Others will give you seven airports, and will include Stewart and White Plains and Islip. And of course for other cities around the country, the same thing for outlying airports.
So what we do is, we always ensure that when we’re doing simultaneous testing, that the methodology is very, very specific. And all of our reports, which are published online, have very, very specific methodology sections – they’re usually the longest sections of the reports, actually – where we’ll say specifically which airports are acceptable and which are not. Because it would not be fair to say that we found the lowest fare out of White Plains for New York on one site, when the other site didn’t even offer White Plains.
What that means for our testing is that it sometimes becomes very cumbersome for our testers, because some sites are not advanced enough to offer multiple airports. We then have to slow the testing down so that we stay in real time, and one tester can do La Guardia/Chicago, Kennedy/Chicago, Newark/Chicago, while the rest of us are catching up.
It’s a challenge. And the bottom line is that, for consumers, again, it’s kind of about choice. Obviously it’s great that some sites offer more airports, but we have to take a step back when we’re doing our testing, and we have to kind of say: It’s not about what a consumer may likely do. This is about making the testing as absolutely fair as possible, so that the sites are tested in the same way.
Female Speaker: [INAUDIBLE QUESTION] ?
Bill McGee: Right, how did we determine which sites to include in the testing? With each project, basically, we use public information from many of the research firms that have been cited here today – whether it’s Media Metrix, Jupiter or Forrester.
We try and get a sense of the market. But what we have found is that, in recent years when we look at the large integrated sites, we have repetitively looked at Expedia, Orbitz and Travelocity, the three largest. We have looked at smaller sites, and one of the things we’ve continually discussed internally is including more of them.
But it’s a challenge, based on our resources. At the same time, we usually try and balance the third-party integrated sites. By integrated, basically meaning third-party sites that state that they’re offering unbiased listings of multiple competing brands.
At the same time, we usually, depending on the test, include branded sites – specific airlines’ own sites, or hotel sites or car rental sites.
So what we have, over time, is a very rich historical record of how the three largest sites have performed: Expedia, Orbitz and Travelocity. And then a smaller sampling of how other sites have performed, based on the statistical evidence.
Male Speaker: Do you capture the fare basis code as you’re doing your analysis? That’s number one. And, number two, do you verify bookability of these fares?
Bill McGee: You’ll see as we go into the projects, there’s only been one project we have actually purchased travel, and that was because we were looking at opaque sites, which I’ll discuss later on. In all cases, we do not book. We confirm availability up to the final step of booking.
As far as fare basis codes, not all sites offer them. And by fare basis codes, for those who don’t know, I mean the airline industry codes that indicate not only the class of service, but the fare ranking. It’s an internal language that airlines and travel agents use to quantify a fare. When available, we try to confirm it. And, in fact, that issue was very important in our most recent study, because of first-class airline tickets.
What we found is that some sites were offering service that was listed as first-class when in fact it wasn’t. And I’ll get into that more specifically later.
Male Speaker: Did you take into account the intangibles of the amount of service, frequency of service, number of seats available, time slots? Comparing, perhaps, an airline that offers one 6am flight versus a carrier that offers hourly service?
Bill McGee: What we do depends on the test. For example, if we’re looking at domestic airfares, where there’s a lot of frequency, we’ll be very specific. So we’ll say: We’re looking to leave from Philadelphia at 9am on Thursday, and we’re going to San Francisco.
We then evaluate the results based on that. So, in other words, what we have found, to our dismay, very often, is that some sites do not return what we requested. And our rhetorical question is: Why offer that functionality if you’re not going to provide it?
In other words, if the site just says: We can only offer you functionality to offer you a listing of fares on Thursday, fine. But if we had specifically asked it for 9am departure on Thursday, why is the first return at 3pm? Why is the next return at 11pm? Why is the next return at 6am?
So we then evaluate the first five returns based on those criteria. If a result falls outside a two-hour window either way of the requested time, it isn’t valid; and if the first five returns aren’t valid on that basis, then we’ll discount it.
Same as if we ask for a specific airport. And, again, this happens all the time, surprisingly. Again, it begs the question: Why ask? Why provide that functionality if you’re not going to give the answer? In other words, if we ask for a departure from Newark Airport, and we only get listings from La Guardia, then we’ll discount that.
Then, what we found is that, over time, we have to broaden that. Because, for example, if you’re looking at international fares, obviously you can’t be that specific. There isn’t that much frequency on most international routes to say “I only want to leave at 9am.”
And the same for some other products, where we basically had to broaden the criteria. But within the confines of that search, we limit the results accordingly.
Female Speaker: How do you separate nonstop versus connected service?
Bill McGee: Actually, this is the very first – the report that’s being released on first-class fares is the first time that we’ve quantified it.
It’s something that, kind of anecdotally, we’ve noted in the past, but we felt it was time to quantify. So we recorded not only who had the lowest fares on these first-class routes, but also we basically tabulated the number of nonstops and, in fact, the total number of stops.
And in some cases, the total number of stops can be quite amazing, even though we were only looking at domestic routes. And we’ll get into that later as well.
So for this test, yes, we actually have findings that show that among the three largest sites, for example, in first-class fares, one site, Expedia, provided the highest percentage of lowest fares. And I’ll be getting into all these numbers later and sharing them with you.
A second site, Orbitz, provided the highest percentage of closest fares, fares that were within $10 of the lowest. And a third site, Travelocity, provided the highest percentage of nonstop flights. And an argument can be made that Travelocity had, although it didn’t rank above Expedia and Orbitz in terms of low fares, that the quality of the flights returned was higher.
So one of the interesting things that we’ve kind of started to notice in recent years is that the major sites, although in many ways they’re alike, we start to see some patterns emerge in terms of differences.
Anything else? Okay, I’ll go on. Just as WebWatch has evolved, so, too, has the industry. In fact, we have been witnesses to some remarkable changes that have taken place in online travel commerce, all in a relatively short period of time.
First and foremost has been the phenomenal growth of online travel, both in itself and as a percentage of all travel booking channels. PhoCusWright, in its online travel overview for the next two years, has said that in the U.S. alone this $190 billion industry will see more than half of its total business booked online by 2006.
Now, for many of the people in this room that work in the industry, they know just how far that has come in a very short period of time. Ten years ago it was low single digits.
But other changes have emerged as well, and they include the growth of lower fares online; the higher sales efficiencies afforded by Internet sales, leading to fuller airplanes and hotels; and the intensity of online competition.
At WebWatch, we’ve seen firsthand that this tremendous growth has been coupled with an influx of lower rates on the Internet. Over the last few years, many journalists have written about Web-only fares and the emergence of lower prices for travel on the Web.
We can state that it’s not a trend created by travel writers; it’s a statistical fact. In the early online travel tests conducted by WebWatch, the results were benchmarked by simultaneously searching for the same fares in Sabre and Apollo Galileo, two global distribution systems used by travel agencies.
We retained Harrell and Associates, an outside consultancy, to participate in all of our testing projects. We did this because the U.S. Department of Transportation at that time was regulating the content provided by GDS. And this regulation, which since then has been abolished, was designed to ensure that all listings of rates and travel products were presented in a fair and unbiased manner.
Therefore, Sabre and Apollo Galileo would provide us not only with the true lowest fares, but they would do so fairly and honestly. It’s amazing how quickly the entire travel distribution landscape changed.
In 2000, in the first test that Consumers Union did, none of the four independent travel sites tested provided an airline fare that was equal to or lower than the GDS fare more than 45% of the time.
Three years later, in 2003, WebWatch examined opaque travel sites such as Priceline and Hotwire. And we included Sabre in this testing project as a benchmark for placing real-time bids. The shocking result was that among a total of 135 separate queries for airline, hotel and car rental prices, Sabre did not provide – or even tie for – a single lowest rate. Thus, within three years’ time, the GDSs fell from providing an unmatched lowest rate 55% of the time to providing no lowest fares at all.
Now, this is not to say that GDSs have not evolved as well, since most now incorporate Web fares into their inventory, and provide travel agents with new technological tools. But it was remarkable to see how quickly travel Web sites emerged as price leaders.
Of course, such changes spawned responses from traditional travel booking channels. American Express, for example, said that most Web-only hotel rates are no bargain for business travelers, as they typically require prepayment, entail full or partial penalties, or are only valid with a minimum number of room nights.
This is something that comes up time and again. Every time we release the results of the latest Web testing project, invariably someone says, “Well, then, if there are problems with online sites, should we use a traditional brick-and-mortar travel agent?” And of course what we’re finding is that the answer to that can vary on many factors, including the products tested.
One of the things we’ve considered very carefully for several years now at WebWatch is looking at more complicated products such as cruises or vacation packages. And we haven’t done it as yet, for several reasons. One is, the percentage booked through third-party sites is really quite minuscule when you compare it to airlines, for example.
At the same time, it’s also presented serious challenges for us in terms of apples-to-apples comparisons, since they’re such subjective products. But undoubtedly more of this travel is going to migrate to the Web, and eventually we’ll have to address that challenge.
Another tangible effect of online commerce has been the enhanced ability of travel suppliers to sell more of what the industry calls their “distressed inventory” that otherwise would have remained unsold.
In all areas of travel, occupancy rates have gone up, thanks in large measure to the Internet’s ability to inexpensively sell more travel, particularly at the last minute.
This slide reflects numbers provided by the Air Transport Association, a trade organization for the U.S. airline industry: passenger load factors. As you can see, the percentage of occupied seats on U.S. airlines – reflecting both domestic and international flights – has gone up every single year for the last 10 years, with the obvious exception of 2001, when the terrorist attacks occurred.
In fact, according to the Air Transport Association, airline load factors are now at highs not recorded since World War II. For many of us who travel quite frequently, it’s not our imagination that the middle seat is occupied as it never used to be, and that overhead bin space seems to be getting smaller – in some cases, it actually is.
But nearly all analysts predict that this is a permanent trend, and nearly all analysts credit the Web as the single most important contributing factor.
The final significant change noted by WebWatch has been the intensity of competition among the leading travel sites. Originally we recorded lowest prices and rounded them to the nearest dollar. In recent tests, we have found the need to record exact amounts to the penny, because rival sites have been continually beating each other’s lowest rates by cents, not dollars.
In fact, in the examination of first-class airline fares we just conducted, we observed numerous cases in which the site with the lowest rate was literally one cent lower than its nearest rival. So, clearly, the large sites are not only looking at their own inventory, but they’re looking at their rivals’ inventories.
These economic slugfests seem to indicate several things. One is that competition is thriving on travel Web sites. Two is that consumers apparently can benefit from this competition. And three is that, more so than ever, you need to shop around in order to truly find the best online bargains.
So then what else have we found? Each of the individual testing projects undertaken by Consumer Reports WebWatch has unearthed surprises. Some of these have been pleasant surprises, and some have not.
Although the record indicates that we have been thorough in testing a large number of travel Web sites, there are still many more sites to examine, particularly as aggregator sites emerge as a new force in online travel, and representatives from many of the aggregator sites are here today.
As this slide shows you, over the years we have tested a very high number of U.S. airline sites, as well as car rental and hotel sites. And, with third-party travel sites, over time we’ve looked at most of the large ones – CheapTickets, Expedia, Hotels.com, Hotwire, Lodging.com, LowestFare, OneTravel, Orbitz, Priceline, QuickBook, TravelNow and Travelocity – as well as two global distribution systems: Sabre and Apollo Galileo.
In addition, Consumer Reports WebWatch has also been involved in the testing of several international sites, and I’ll speak specifically in greater detail about that later on.
What follows is a breakdown of each of WebWatch’s testing projects. The first project was released in June 2002. That was when WebWatch teamed with Consumer Reports Travel Letter to publish “Travel Web sites: You Still Need to Compare.”
This test pitted six leading third-party integrated travel sites against one another: CheapTickets, Expedia, OneTravel, Orbitz, TravelNow, and Travelocity. We compared the lowest domestic airfares provided by these sites with the lowest fares provided by the GDS, in this case Sabre. Our findings were mixed.
On the one hand, we noted that the highest percentage of lowest fares was provided by Expedia. However, we also noted that the highest percentage of lowest fares coupled with viable flights was provided by Travelocity.
So what’s a viable flight? In those early days, we were confounded to see that the lowest fare did not always make sense for most travelers. In fact, the prevalence of such wacky flights forced us to reevaluate our findings and create new benchmarks for sensible itineraries.
For example, in 2000, we were on the site CheapTickets, and we queried a flight from New York to Chicago. I think at the time it was the third busiest route in the U.S. The lowest fare provided by CheapTickets was on U.S. Airways, and it involved seven legs in all, four outbound: New York to Syracuse, to Buffalo, to Philadelphia, to Chicago. According to the published times – assuming all four flights were on time, which statistically was highly unlikely at that time – it would have taken 11 hours and 29 minutes to get from New York to Chicago.
Similarly, in 2000, on the Web site LowestFare, we queried a flight from Los Angeles to New York, and we queried the departure time for 5pm on August 21st, and the first flight provided was at 6:59am on August 22nd.
What we found, when we were tabulating the results, is that we needed to come up with some methodology, because it wouldn’t be fair to simply say Web site A has the lowest fare when, by most sensible parameters, the itinerary wouldn’t have been feasible for most travelers – except perhaps for some backpackers who didn’t mind sleeping in four airports overnight.
So what we did was create our own methodology, working in reverse. We did some research into how global distribution systems such as Sabre – which, as I mentioned, were regulated at that time – handled such requests.
So, in other words, if a travel agent asked Sabre for a departure at 10am, what was acceptable? So then we used those standards to evaluate our results a second time. And so, again, we kind of had two tiers of results in the early days, those with the lowest fares and those with the lowest fares that made sense for most travelers, based on accepted parameters.
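The two-tier tabulation just described can be sketched in code. This is a hypothetical illustration only – the viability thresholds, field names, and sample fares below are invented, not WebWatch’s actual parameters or data:

```python
# Hypothetical sketch of the two-tier tabulation described above.
# The viability thresholds (max legs, departure-time window) and the
# sample quotes are invented for illustration.
from dataclasses import dataclass

@dataclass
class Quote:
    site: str
    fare: float               # round-trip fare in dollars
    legs: int                 # total flight legs
    depart_offset_hrs: float  # hours from the requested departure time

def is_viable(q: Quote, max_legs: int = 2, max_offset_hrs: float = 4.0) -> bool:
    """Tier-two screen: keep only itineraries a typical traveler could use."""
    return q.legs <= max_legs and abs(q.depart_offset_hrs) <= max_offset_hrs

def lowest(quotes):
    return min(quotes, key=lambda q: q.fare)

quotes = [
    Quote("Site A", 118.00, legs=4, depart_offset_hrs=0.5),   # cheap but "wacky"
    Quote("Site B", 129.00, legs=1, depart_offset_hrs=1.0),
    Quote("Site C", 124.50, legs=2, depart_offset_hrs=10.0),  # departs far off-request
]

print(lowest(quotes).site)                               # tier one: raw lowest fare
print(lowest([q for q in quotes if is_viable(q)]).site)  # tier two: lowest viable fare
```

Tier one crowns the cheapest quote regardless of routing; tier two drops the seven-leg-style itineraries first, which is why the two lists can name different winners.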
As shown by these slides, Consumers Union’s first travel testing projects produced some bizarre results. By 2002, WebWatch noted that travel sites had made big improvements, but we still recorded some wacky flights.
For example, a round-trip from Miami to Atlanta included an outbound stop here in Dallas, and a return stop in Washington, D.C. Amazingly, that itinerary was topped on the wacky scale by another Miami/Atlanta leg that included a stop in Newark.
We also found multiple cases in which a departure time was at least 10 hours earlier or later than what we had specifically requested.
For the most part, the wacky flight problem has abated, although it does crop up from time to time when searching for flights using a travel site’s lowest fare option. In our most recent testing project, conducted in December, Orbitz provided a first-class fare on American Airlines from Houston to Honolulu that included seven legs in all, routed from Houston to Austin to Dallas to Los Angeles to Honolulu, and back from Honolulu to Dallas to Austin to Houston. But, to its credit, Orbitz also provided more sensible itineraries that were priced higher, so the consumer can decide.
The 2002 project raised other issues that continue to be relevant today. For one thing, we noted how confusing it was to keep track of booking fees within each of the sites. We also expressed concern over proprietary deals between third-party sites and travel suppliers such as airlines, because we were worried that such marketing agreements had the potential to taint the unbiased listings promised by the leading third-party sites.
This is a theme that I’ll continue to return to this morning – and, I think, throughout the rest of the day – this issue of potential bias, which is a gravely serious issue.
In 2000, for example, Consumer Reports Travel Letter noted that, on some sites at that time, including Travelocity, advertised airlines dominated flight listings. On LowestFare, many TWA flights with inconvenient itineraries – obtained through a contract fare deal with TWA’s former chairman – were repeatedly listed first.
And on all four third-party sites tested in 2000, certain airlines with viable itineraries for routes tested were not listed at all. We saw them in the GDSs, but we didn’t see them in the sites. Similar issues would continue to arise in future testing.
I just want to pause and see if there are any questions on those specific tests from 2000 and 2002. Okay. In April 2003, WebWatch published “Booking Hotels Online.” This was our first independent analysis of online travel, and it was a project focused not on airlines but on the hotel market.
The test included the GDS Sabre as a benchmark, as well as Expedia, Hotels.com, Lodging.com, Orbitz, and Travelocity, and a number of branded hotel Web sites. At that time we looked at Four Seasons, Hilton, Hyatt, Marriott, Omni, Ritz-Carlton, Sheraton, and Westin.
Because the statistical sampling for each of those hotel chains was small in and of itself, we grouped them together into a branded category. Even then, the hotel sector of the travel industry was emerging as a hotbed of competition and controversy.
Bjorn Hansen, the global hospitality industry leader at PricewaterhouseCoopers, just recently put together some very interesting statistics. He tabulated the Internet’s effect on the hotel industry in 2004. You can see it’s rather a mixed bag.
The total induced demand nights created by the Internet was 50,000 daily room nights. The revenue gain from bookings induced by the Internet was $1.41 billion. The revenue loss due to transparency and competition of the Internet was $2.04 billion. So the net income effect was, in fact, a loss for the hotel industry of $635 million last year. And that may or may not surprise many people in the travel industry. Based on Mr. Hansen’s analysis, it’s clear that the Internet has been a source of both good news and bad news for the hotel industry.
WebWatch found some key differences between the hotel and airline segments. One was that, collectively, the branded hotel sites performed rather well when rates were queried at specific properties. Thus an inherent difference emerged between hotels on the one hand and airlines on the other, clearly due to the fact that many hotel locations are privately owned or franchised rather than corporately owned and managed, and consequently some individual hotel properties were aggressively marketed via the Internet.
Overall for this project, Travelocity provided the highest percentage of lowest hotel rates. WebWatch also noted the continuation of several trends. One was that competition was heating up among the leading sites, and rate differences were narrowing considerably. Another was the advent of Web-only rates, since Sabre, the GDS used for this test, was beaten to the lowest rate 85% of the time by one or more of the Web sites.
The Hotel Electronic Distribution Network, HEDNA, in 2001 published a white paper called “Biasing: What You Need to Know.” And at that time they stated: “Increasingly the hotel industry is citing examples where information delivered on travel displays is no longer nondiscriminatory or neutral, but displays a preference to another hotel or group of hotels.”
We found this report in our research when we were starting this project, and so we included a detailed examination of bias of hotel displays in third-party integrated travel sites in this report.
Specifically, WebWatch observed a troublesome trend. The functionality to obtain lowest hotel rates on all five third-party sites – Expedia, Hotels.com, Lodging.com, Orbitz, and Travelocity – was rather cumbersome, and often required advanced searches. Whereas displays of airline flights at that time in the leading integrated displays could be easily ordered on a lowest-to-highest fare basis, this was more difficult in the hotel sector.
I’d like to pause now and see if anyone has any questions specifically on this hotel project. Okay. In October 2003, Consumer Reports WebWatch focused on yet another sector of the travel industry, by examining travel sites that offered car rental bookings.
The resulting report, “Renting Cars Online,” reflected simultaneous testing of Expedia, Orbitz, Travelocity and four branded car rental sites: Alamo, Avis, Dollar, and Hertz.
Female Speaker: How does the hotel industry respond to such a big loss?
Bill McGee: The loss from the Internet bookings? Yeah, that was a statistic that certainly surprised me. Would you like to speak to that?
Shafiq Khan: The thing is, that’s a natural effect for the industry.
Bill McGee: Right.
Shafiq Khan: Individual players may have different experiences.
Bill McGee: Point well taken.
Shafiq Khan: Yeah, I’ll repeat. The number we see here from the PWC report is a macro effect on the industry, and individual players within the industry fared differently. Some of us fared better than others – as a result of which, our net effect was not a negative – but some obviously had even worse negative numbers than the industry did.
Bill McGee: And I should introduce Shafiq Khan from Marriott. I think one of the things that I found surprising about that – well, certainly you’re right about the macro effect – is that a lot of energy is being put into using the Internet as a distribution tool for the hotel industry.
And, feel free to correct me if I’m wrong, but it seems like an awful lot of kind of trial and error is taking place in terms of finding the right strategy. But I have to say, I was surprised to see that on a whole it turned out to be a negative number like that.
Now, the question is: how is that going to play out in the marketplace? Maybe not so much for those companies like Marriott that had a positive economic effect from the Internet, but from those that didn’t.
Male Speaker: Similar to load factors, do you have occupancy levels for hotels during those periods also?
Bill McGee: I don’t have them, but I know that over time the effect has been similar over the last several years. And, again, it has to do with this distressed inventory factor. Tracey [Weber, of Travelocity], would you agree with that, since you’re –
Tracey Weber: Yes. I think [INAUDIBLE], regardless of exactly how the players came together and it happened. I think that’s where the micro effect – the players who realize the way to use the Internet to drive a positive impact – made a difference.
Male Speaker: Isn’t it true that the money is still there? That the revenue’s being shifted, really, because so many hotels are going to the merchant model, where the hotel actually doesn’t make the mark-up so much as the reseller?
Bill McGee: Right – and Mr. Hansen’s job is covering the hotel industry itself, and the bottom lines of hotels. So clearly you’re right, there is a shift going on there. And how that plays out is, I think it’s fair to say, pretty much in flux now as well.
Again, from our perspective, when we analyze these issues, we’re always trying to bring it back to the consumer perspective. The bottom line is, there’s a lot of competition out there right now, and obviously that’s a good thing. We do have concerns to see how that eventually works out, and whether it leads to consolidation that in fact over time may lead to less competition.
Okay, we maybe can move on to car rentals.
The statistic that we just saw is from Burst Media – a survey conducted just recently, in February, of several thousand Internet users who book travel online. Among survey participants who use the Internet solely as a research tool, 11% reported they had researched car rental rates and availability. That struck us as a very, very small number. I’m not sure about some of the particulars of that survey, but when we did research for this project, looking at car rental sites, we found much higher numbers across the board for booking online.
When the testing was completed on this car rental project, Orbitz emerged as a clear winner at providing the highest percentage not only of lowest rates, but also of a new category that we created at that time: closest rates, meaning rates that fell within one dollar per day of the lowest rate.
I should point out that when we include this “closest rate” category in projects, we always establish the rate in advance, before we’ve seen any of the results, so that we’re not working in reverse after seeing the results. But we establish the rate based on a lot of factors. And for car rentals, since it is such a competitive market…
Bill McGee: …small and little-known rental companies from the mix, and Orbitz included a lot of those. Travelocity, in fact, emerged as the price leader, based on more established car rental firms. Interestingly, WebWatch also found that all three integrated sites – Expedia, Orbitz and Travelocity – sometimes provided lower rates for rentals from Alamo or Dollar than either Alamo.com or Dollar.com provided. The other two sites, Avis and Hertz, were not a factor in this.
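The “closest rate” category described a moment ago – any rate within one dollar per day of the lowest – can be sketched as follows. The function name and sample rates are assumptions for illustration, not figures from the actual project:

```python
# Hypothetical sketch of the "closest rate" tabulation: any site whose
# daily rate falls within $1.00/day of the lowest rate found.
# The sample rates below are invented for illustration.

def closest_rates(rates, margin=1.00):
    """Return sites whose daily rate is within `margin` dollars of the lowest."""
    floor = min(rates.values())
    return sorted(site for site, rate in rates.items() if rate - floor <= margin)

daily_rates = {"Orbitz": 27.99, "Expedia": 28.50, "Travelocity": 29.25}
print(closest_rates(daily_rates))  # the lowest rate itself always qualifies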
However, the real news that emerged from this project concerned two serious issues: Discrepancies with pricing displays, and potential bias of displays.
This project marked the emergence of a phenomenon Consumer Reports WebWatch would eventually dub “fare jumping,” in which a rate suddenly changes, usually by increasing, in real-time, during the middle of the shopping process. With this project, rates suddenly changed with both Orbitz and Expedia. WebWatch contacted both companies, and their responses were included in the published research report – and, again, all of this research is available on the WebWatch site. Executives from both Orbitz and Expedia said their companies were striving for 100% accuracy in all pricing displays.
We can pause here to speak about fare jumping if you’d like, but it’s certainly going to be a topic that I’ll be getting to later on as well. This was the first time that this phenomenon emerged.
And, again, just to be clear, what we’re talking about is not that you go on and see a rate, then stop and talk to your partner about it, maybe do some other research, and then come back and the rate has changed. We’re talking about real-time. We have records of every study that we undertake – in fact, every screen of every study that we undertake – so we have filled quite a few warehouses with our results. And, in many cases, with the time and date, we can see that less than seven seconds have elapsed.
Male Speaker: Does fare jumping include the non-availability if you have a price presented, and then you go to book it and it’s not there?
Bill McGee: No. If I understand you correctly, if the first offering, say, is a flight on United available at $500, and then it turns out that that’s not available and the second one is on Delta for $625. Is that what you mean?
Male Speaker: I mean like in a hotel or a car display, where a rate is presented and it says “rooms available.” And you click through to basically a lie – like the guy at the beginning [in the video] said – and see, “Sorry, it’s not available.”
Bill McGee: Yes, what he was talking about and, again, I know Beau [Brendler] said this, but I want to stress this: there was no interaction between the WebWatch staff and the outside company that interviewed those people. So when I saw that film, I was as interested in it as many of you were.
And it’s also something that I’ve been hearing anecdotally from people outside the industry as well. We were referring specifically to – on the results page of a third-party site, where you have multiple brands and multiple fares, that we then click on the lowest fare offered.
And usually it says something like “price this item” or “select this item” or “choose this item.” We click on it to get further information. Usually it involves more itinerary details, flight times or a specific hotel location. And it’ll usually include the breakdown of taxes and fees and things.
That’s what we’re talking about. So, again, we’re talking about in real-time, in a matter of seconds, the price has gone up. And then within that subset of fare jumping, we’ve seen two types. One is where it’s flagged, and one is where it isn’t. And we’ll talk about that as well.
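A real-time check like the one just described could be sketched this way. The seven-second window comes from the observation above, while the function name, timestamps, and prices are invented for illustration:

```python
# Hypothetical sketch of detecting "fare jumping": the same itinerary
# re-priced at a higher fare within seconds of the results screen.
# The quotes below are invented for illustration.
from datetime import datetime

def fare_jumped(first_quote, second_quote, window_secs=7.0):
    """True if the price rose within `window_secs` of the first screen."""
    t1, price1 = first_quote
    t2, price2 = second_quote
    elapsed = (t2 - t1).total_seconds()
    return elapsed <= window_secs and price2 > price1

shown  = (datetime(2005, 3, 1, 10, 15, 2), 214.00)  # results page
priced = (datetime(2005, 3, 1, 10, 15, 8), 239.00)  # "select this item" page
print(fare_jumped(shown, priced))  # → True
```

A price that changes minutes later, or one that drops, would not count under this definition – which matches the distinction drawn above between real-time jumps and rates that simply change between visits.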
The other issue of potential bias – and, again, we’re speaking about this car rental project in 2003 – specifically concerned Expedia. And at that time, we saw that it was confined to its display of car rental rates, not other travel products such as airline flights and hotel rooms. That apparently has changed since then.
WebWatch expressed concern that Expedia had in effect created a two-tier system within its car rental display, by first offering the “Expedia Picks” listing of certain car rental firms, and then providing a “Show more vendors” option that offered additional listings. We have some screen captures from this project, this car rentals project that we may be able to pull up for you while I’m speaking.
Again, what we’re talking about here is a third-party site where, in searching for car rentals, rather than being given the entire universe of what Expedia had in its inventory for those specific parameters – that day, that city, that type of vehicle – what we found, in fact, was a two-tier system.
That the choice was not “all vendors,” but the choice was “Expedia Picks.” And then after you went to the “Expedia Picks” page, there was an icon – a rather small icon, in fact – on the bottom of the listings, that said “Show more vendors,” which then became a fully integrated listing of two sets of vendors.
Now, we spent a lot of time researching this, and in fact speaking to Expedia about it. Upon closer examination, we then re-tabulated all the results, and we found, first of all, that the secondary list often included reputable name-brand car rental firms. So it was not as if the five largest rental firms were in the first list and smaller ones were in the second list; there was actually a mix, with household brands in both lists. And what we also found is that the second list often included a lower rate than the initial list, which was of great concern to us.
In response to a query from Consumer Reports WebWatch, Expedia explained that the “Expedia Picks” listings represented the Web site’s preferred partners. However, Expedia refused to disclose if those partners did or did not pay for such placement.
I just want to pause here to see if anyone has any questions on this issue.
Male Speaker: We talk about Expedia as the only company involved in showing their own picks. But from my perspective, every one of them does that. Every one of them has a display that prefers their own picks, whatever they are, and then follows that with often what’s called “the retail display.”
Bill McGee: What we have seen – and we talked about this earlier from a supplier’s perspective; it’s not the first time we’ve heard a travel supplier say this – is that, in the displays themselves, this was the first time we saw it categorized as such to consumers.
So in a way, this is certainly kind of a double-edged sword. Expedia, in one way, was in fact perhaps more honest than other Web sites in presenting to consumers that there was a two-tiered system in effect – although you could argue that it was done in a confusing way.
The issue of omissions of viable suppliers in third-party sites has been something that’s concerned us since the beginning. It’s also – I think you can appreciate, from our perspective – very hard for us to get our hands around. It’s something we looked at with every single test. In the days when we were using Sabre or Apollo to benchmark our tests, we found repeatedly that the GDSs were offering options that were not offered by the third-party sites.
This is one reason that we have gone back to the suppliers and Beau Brendler has been involved in this directly, in speaking to – excuse me, not to the suppliers, but to the third-party sites. And one of the things that we’ve requested from all travel sites, third-party sites, is a list of the universe of given suppliers.
Now, some omissions are fairly well known. For example, in the airline sector we know that Southwest and JetBlue are not in third-party sites, so even many consumers may be aware of that.
But when we have spoken to third-party sites about this, they all have stated that they reserve the right to include certain suppliers and not others. And they usually draw metaphors to other industries, saying that, well, a department store doesn’t carry every single brand of shoes, or what have you.
But, for us, it’s more complex than that, because what I can say is that, at WebWatch, we hold integrated third-party sites to a very high standard – a standard to which we don’t hold, for example, branded sites. It’s just a given that when I bought a Focus, I went into a Ford dealership and the sign said “Ford.” I expected to only be told about Fords. We understand that that’s an intuitive thing that all consumers grasp.
What we’re concerned about is what’s a very gray area here. In our testing we’ve seen basically preferred supplier lists that have, in our view, tainted what should be an inclusive, integrated listing. That second listing should have been the first one.
Now, if you want to offer an option to say: Well, this is Expedia’s preferred suppliers, that’s another issue. But the fact that it’s happening without consumer knowledge, and therefore without our ability to find it, that’s even more disturbing.
Male Speaker: Excuse me. Your assumption so far has been that bias is not allowed? Or is morally incorrect? The reason I say that is that, yes, under the DOT rules, the GDSs had to keep out all forms of bias. But the electronic, the Internet travel companies have not been constrained by that in any form.
And if you were to look at everybody here who’s selling something online today in the travel arena, there’s all sorts of forms of bias, whether it be very obvious or very subtle.
Bill McGee: That’s exactly right. The DOT rules that applied for GDSs for the better part of 20 years do not apply to integrated travel sites online.