Methodology

The purpose of this report is to provide a benchmark for airlines to identify which disruptions impact the passenger experience most, so as to prioritise solutions.

In creating this report, it’s important to be as transparent as possible, showing where discrepancies can occur, often due to regional regulations or definitional inconsistencies. For example, at what point does a protracted delay become a cancellation?

Even though most disruptions are caused by some knock-on effect of inclement weather, the optics of disruptions can be problematic for an airline, which is why airlines tend to prefer delays over cancellations.

The good news is that most passengers expect some form of disruption on the day of travel. However, they also expect the airline to provide better service when disruptions occur.


Airline Data | Delays & Cancellations

All airline data is provided by FlightAware and focuses on arrival time at the destination. It is compiled on a daily, weekly and monthly basis, focusing on cancellations and delays, as well as origin/destination disruption data for airports.

FlightAware uses the US Bureau of Transportation Statistics’ definition of a delay: any flight that arrives at the gate more than 14 minutes after its scheduled arrival time.

FlightAware defines a true cancellation as one confirmed by the airline via some data-sharing method. FlightAware will also synthetically cancel flights where flight plan information exists but cannot be linked to actual aircraft movement.

All metrics are based on a North American standard, including the estimation of total planned flights per airline.

Delays are segmented by duration (see the sketch below the list):

  • Minor = under 1 hour (15-59 minutes)
  • Medium = 1-3 hours (60-180 minutes)
  • Major = over 3 hours (>180 minutes)
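For illustration only, this segmentation can be expressed as a simple rule. The following minimal Python sketch is ours, not FlightAware’s or the report’s implementation; the function name is hypothetical and the thresholds simply mirror the buckets above.

    def delay_severity(delay_minutes: int) -> str:
        """Classify an arrival delay (in minutes) into the report's severity buckets."""
        if delay_minutes <= 14:
            return "On time"   # within the BTS 14-minute allowance, not counted as a delay
        if delay_minutes < 60:
            return "Minor"     # under 1 hour
        if delay_minutes <= 180:
            return "Medium"    # 1-3 hours
        return "Major"         # over 3 hours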

Airline Tiers and Classification

This report focuses on airlines with flight volumes exceeding 1,200 flights per month (>1 million passengers per year). We track 600+ airlines but focus on the top 290 [approx.] by size.

For global percentages, we use the total volume of flights and disruptions for the featured dataset (approx. 290 airlines); any airline below Tier 4 is designated “Tier 5 or Other.”

For airline classification, ICAO and IATA databases were consulted, alongside an airline’s own documentation. An airline’s approach to ancillaries and seating classes was also considered. That said, there is no industry consensus around airline classification; it is often used more as a branding exercise than as an informative one.

To keep it simple, we differentiate between Full Service Carriers [FSC] and Low Cost Carriers [LCC].

Tiers are defined as follows (an illustrative mapping is sketched after the definitions):

Tier 1:
Approximately >17K flights/month | >25 million pax/year

Tier 2:
Approximately 6.5K - 17K flights/month | 10-25 million pax/year

Tier 3:
Approximately 2K - 6.5K flights/month | 3-10 million pax/year

Tier 4:
Approximately 1.2K - 2K flights/month | 1-3 million pax/year
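As a rough illustration, assuming approximate monthly flight volume is the deciding input, the tier boundaries above could be mapped as in the sketch below. This is our own sketch, not the report’s internal tooling.

    def airline_tier(flights_per_month: int) -> str:
        """Map approximate monthly flight volume to a report tier (illustrative thresholds)."""
        if flights_per_month > 17_000:
            return "Tier 1"
        if flights_per_month >= 6_500:
            return "Tier 2"
        if flights_per_month >= 2_000:
            return "Tier 3"
        if flights_per_month >= 1_200:
            return "Tier 4"
        return "Tier 5 or Other"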


ImPax | Impact + Passenger

ImPax Value

The ImPax Value is calculated as the number of flights with medium delays, major delays and cancellations.

We do not use Minor delays in calculating the ImPax Value because the goal is to analyse the impact on the passenger experience, not on the airline. Generally, sub-one-hour delays have less of an impact on the passenger experience.

That said, we do not underestimate the impact that large volumes of sub-one-hour delays can have on an airline’s operations, leading to greater scheduling problems, nor the impact a minor delay can have in specific passenger scenarios, e.g. a tight connection.
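In concrete terms, the ImPax Value is a count of the more severe disruptions. A minimal Python sketch, assuming each flight record carries a severity label and a cancellation flag (the field names are ours, for illustration only):

    def impax_value(flights: list[dict]) -> int:
        """Count flights with a Medium delay, a Major delay or a cancellation."""
        return sum(
            1 for f in flights
            if f["cancelled"] or f["severity"] in ("Medium", "Major")
        )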

Estimated Impacted Passengers

We use the ImPax Percentage to make a conservative estimate of the number of disrupted passengers who require some form of action by the airline - anything from simple updates to full rebookings and accommodation. This is based on an average airline flight at 80% capacity. The average flight load is calculated as total passengers divided by total flights for a given time period. This figure can vary, as passenger volumes for a given year are sometimes released with a delay.
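A minimal sketch of one plausible reading of this calculation, using hypothetical monthly figures of our own rather than data from the report:

    # Illustrative monthly figures for a hypothetical Tier 1 carrier (not real data)
    total_passengers = 2_000_000   # passengers carried in the period
    total_flights = 17_000         # flights operated in the same period
    impax_flights = 1_000          # flights with Medium/Major delays or cancellations

    # Average flight load = total passengers / total flights (roughly an 80%-full aircraft)
    avg_passengers_per_flight = total_passengers / total_flights

    # Conservative estimate of passengers requiring some form of airline action
    estimated_impacted_passengers = impax_flights * avg_passengers_per_flight
    print(round(estimated_impacted_passengers))  # approx. 117,647 passengers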

The idea is simply to provide an alternative view of what disruptions look like at a passenger level, and to show airlines that prioritising solutions begins with a much smaller cohort of disrupted passengers than they might think.

For example, if we define “disruption” as any delay above 15 minutes, a Tier 1 airline could have a >20% disruption rate for a given month - or millions of disrupted passengers. For the large majority, these disruptions have little impact on the travel experience and require nothing more than communication from the airline.

ImPax Percentage

Calculated as the ImPax Value (above) expressed as a percentage of an airline’s total scheduled flights.
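For example, continuing with the hypothetical figures used above (not data from the report):

    impax_value_count = 1_000          # Medium delays + Major delays + cancellations (illustrative)
    total_scheduled_flights = 17_000   # airline's scheduled flights for the period (illustrative)

    impax_percentage = impax_value_count / total_scheduled_flights * 100
    print(f"{impax_percentage:.1f}%")  # approx. 5.9%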


Sentiment Analysis | Reviews

For sentiment analysis, TripAdvisor, TrustPilot, Social, and Skytrax are used as sources for airline reviews. Each month, the report focuses on a selection of airlines (8–10) across each tier and region, using a sample size of >200 reviews.

The idea is simply to close the feedback loop for airlines and get a sense, in any given month, of the dominant classifiers used in passenger reviews.

The impetus to write a review is often driven by the memorability or exceptionality of the customer experience – when the experience is above or below customer expectations. For this reason, reviews in travel tend toward the negative. When travel operates smoothly, it is unexceptional because that is what the customer has paid for: an on-time flight, baggage arriving on time, working in-flight entertainment, etc. When it goes wrong – and because airlines are highly sensitive to inclement weather, it often can – it inspires an emotional spike that negatively imprints the memory.

Interestingly, most reviews relate to a travel experience one to two months in the past. This suggests a negotiation period during which the airline might have an opportunity to engage the passenger and provide compensation to make things right.

The most insightful reviews [and the ones we try to surface] are those that outline something negative but also share how the airline responded and mitigated the negative impact.


Disclaimer

We have spoken to >150 airlines in the last 24 months, and all of them are pursuing new IROPS solutions to an industry-wide problem that is largely driven by factors – namely weather – outside their control. In no way does this report seek to negatively highlight an airline’s efforts; that is why the reviews are kept anonymous. Using sentiment analysis is simply another way to learn from the passenger experience.


About the Author

Matthew Walker

Matthew Walker heads up marketing and research at Plan3. He has worked in travel technology since 2014.

For all inquiries about the Airline ImPax Report, email: impax@plan3.aero