The Surprising Stories Behind LAX, JFK, and LHR

Every airport has a three-letter code that travelers memorize without questioning. But these codes hide fascinating stories—naming conventions, historical accidents, and bureaucratic decisions that created the alphabet soup we now navigate. The system that seems arbitrary actually follows patterns, and the exceptions reveal aviation history’s quirks and compromises.

How the System Started

Airport codes didn’t begin with airports. They evolved from weather station identifiers used in the 1930s by the U.S. Weather Bureau, the forerunner of today’s National Weather Service.

Two letters first: Weather stations used two-letter codes tied to city names. When commercial aviation needed its own identification system, the existing two-letter codes provided the foundation.

The third letter arrives: As aviation expanded, two letters proved insufficient. The International Air Transport Association (IATA) added a third character, creating the familiar format. Many early codes simply appended “X” to the existing two-letter weather codes—hence LAX (LA + X), PDX (PD + X), and PHX (PH + X).

Not the same as ICAO: IATA codes (three letters) differ from ICAO codes (four letters) used in flight plans and air traffic control. LAX is KLAX in ICAO format; Heathrow is EGLL. Travelers see IATA codes; pilots use ICAO.
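The relationship between the two systems amounts to a lookup table. A minimal sketch, using a hand-picked sample of airports rather than any official registry:

```python
# Toy mapping from IATA (traveler-facing) to ICAO (flight-plan) identifiers.
# Illustrative sample only, not an official dataset.
IATA_TO_ICAO = {
    "LAX": "KLAX",  # contiguous-U.S. airports typically prefix "K" to the IATA code
    "JFK": "KJFK",
    "LHR": "EGLL",  # elsewhere, ICAO codes follow regional prefixes (EG = Great Britain)
    "CDG": "LFPG",  # LF = France
}

def icao_for(iata_code: str) -> str:
    """Return the ICAO identifier for a known IATA code."""
    return IATA_TO_ICAO[iata_code.upper()]

print(icao_for("lhr"))  # EGLL
```

Note that the U.S. “just add K” shortcut is a regional convenience, not a rule: outside the contiguous United States the two systems diverge completely, which is why a real application would consult a full registry rather than derive one code from the other.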

LAX: Los Angeles, California

Los Angeles International exemplifies the “add an X” solution to the two-to-three letter transition.

The original code: The weather station at Los Angeles used “LA.” When three letters became standard, adding X created LAX—pronounceable, memorable, and tied to the city abbreviation.

Why not LOS?: LOS was already assigned to Lagos, Nigeria. The X suffix avoided conflicts with existing city-name codes.

The name itself: Los Angeles International wasn’t always the name. The airport opened as Mines Field in 1930, became Los Angeles Municipal Airport in 1941, and received its current name in 1949. The LAX code stuck through all changes.

JFK: John F. Kennedy International

New York’s primary international gateway carries a code honoring an assassinated president—but that wasn’t always its identity.

The original: IDL: The airport opened in 1948 as New York International Airport and received the code IDL (for Idlewild, the neighborhood where it was built). IDL served as the code for fifteen years.

The renaming: One month after President Kennedy’s assassination in November 1963, the airport was renamed John F. Kennedy International Airport. The code changed to JFK, creating one of aviation’s most recognized identifiers.

Newark’s complication: Newark Airport uses EWR, drawn from the letters of nEWaRk. Codes beginning with “N” were reserved for U.S. Navy facilities, so the obvious first letter was off limits, a common pattern when preferred codes were unavailable.

LHR: London Heathrow

Britain’s busiest airport has a straightforward code—but London’s other airports tell more complex stories.

The logic: LHR simply takes “L” for London and “HR” for Heathrow. The system worked when Heathrow was London’s only major airport.

Gatwick’s parallel: London Gatwick received LGW—”L” for London and “GW” for Gatwick, consistent with Heathrow’s pattern.

Stansted and Luton: STN (Stansted) and LTN (Luton) follow similar logic. But London City Airport, built decades later, became LCY—adding “Y” when the obvious combinations were taken.

Southend’s stretch: London Southend Airport, a marketing name for an airport actually in Essex, received SEN. The “London” branding is aspirational; the code is practical.

ORD: Chicago O’Hare

Chicago’s code confuses everyone who doesn’t know the history.

Not CHI: Chicago’s obvious code serves as the metropolitan-area designation covering all of the city’s airports, so no single airport could claim it. O’Hare needed something else.

The original name: O’Hare began as Orchard Place Airport, a military facility built in 1942. Its original code was ORD—for “Orchard.” The name changed but the code remained.

Edward O’Hare: The airport was renamed in 1949 to honor Edward “Butch” O’Hare, a naval aviator and Medal of Honor recipient killed in combat. His story—shooting down five Japanese bombers in a single mission—made him a Chicago hero.

Midway’s clarity: Chicago’s other major airport, Midway, has the sensible code MDW. Built before O’Hare, it was originally Chicago Municipal Airport and got renamed for the Battle of Midway in 1949.

SFO: San Francisco

San Francisco International’s code follows the straightforward pattern—and reveals why obvious codes usually worked best.

Simple logic: SFO takes the city abbreviation directly. No weather station complications, no name changes, no conflicts with other airports.

Oakland’s code: Across the bay, Oakland International uses OAK—equally logical. The Bay Area’s third major airport, San Jose, uses SJC (“San Jose, California”).

Why clarity matters: Travelers rarely confuse SFO, OAK, and SJC because the codes relate obviously to city names. Compare this to New York’s EWR, JFK, and LGA—three airports whose codes require memorization.

CDG: Paris Charles de Gaulle

France’s main international gateway honors a president but didn’t always carry his name.

The opening: The airport was planned as Aéroport de Paris Nord (Paris North Airport) but opened in 1974 already renamed for Charles de Gaulle, who had died in 1970. The code CDG reflected the new name from the start.

Orly’s persistence: Paris Orly, the older airport, uses ORY—a rare case where the obvious three letters were available. Orly was Paris’s primary airport until CDG opened.

Beauvais complications: Paris Beauvais Airport, marketed to budget carriers as a Paris option despite being 85 kilometers from the city, uses BVA. The “Paris” designation is creative marketing.

SIN: Singapore Changi

Singapore’s code looks like an abbreviation but represents the entire country.

Country as city: Singapore is both a city and a nation, so SIN works as an abbreviation for either. Changi Airport doesn’t need separate identification from the country.

The old airport: Before Changi opened in 1981, Singapore Paya Lebar Airport used SIN. The code transferred to the new facility seamlessly.

Regional patterns: Other city-states face similar situations. Monaco (MCM, actually a heliport), Luxembourg (LUX), and Malta (MLA) all use codes that stand for the nation as much as the airport.

DXB: Dubai International

Dubai’s code follows Arabic transliteration patterns common in Gulf aviation.

The logic: DXB represents “Dubai” with an X as filler between the D and B—similar to the American X-suffix pattern but applied to an Arabic name’s transliteration.

DWC’s meaning: Dubai’s newer airport, Al Maktoum International, uses DWC (Dubai World Central). If it eventually becomes the primary airport, the code will remain even as traffic shifts.

Regional patterns: Abu Dhabi uses AUH, Doha uses DOH, and Riyadh uses RUH—all logical transliterations. The Gulf region’s codes mostly make intuitive sense.

NRT and HND: Tokyo’s Two Airports

Tokyo’s situation illustrates how airport names and codes diverge as systems evolve.

Narita’s code: NRT comes from Narita, the city where the airport is located—not Tokyo. The airport opened in 1978 and was initially called New Tokyo International Airport.

Haneda’s persistence: Tokyo’s original airport, Haneda, uses HND. It was Tokyo’s only major airport until Narita opened and has recently reclaimed significant international traffic.

The confusion factor: Tourists booking “Tokyo” flights must choose between NRT and HND. The codes don’t indicate Tokyo at all—creating exactly the confusion the coding system was designed to prevent.

YYZ: Toronto Pearson

Canada’s largest airport has a code that mystifies non-Canadians.

The Y prefix: All Canadian airport codes begin with Y, a designation from the original radio beacon system. The letter indicated Canadian stations.

Why YZ: The “YZ” portion came from a nearby radio beacon. Toronto’s airport wasn’t actually in Toronto—it was in the municipality of Malton. The closest beacon’s identifier appended to Y created YYZ.

The Rush connection: The rock band Rush titled an instrumental track “YYZ” in 1981, with the song’s opening mimicking the Morse code for the letters. Generations of rock fans learned airport code trivia through the song.
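That Morse-code rhythm is easy to verify. A small sketch, using the standard International Morse patterns for Y and Z:

```python
# Spell an airport code in International Morse, as in the opening of Rush's "YYZ".
# Minimal table: only the letters needed for this example.
MORSE = {"Y": "-.--", "Z": "--.."}

def to_morse(code: str) -> str:
    """Return the Morse spelling of a code, letters separated by spaces."""
    return " ".join(MORSE[ch] for ch in code.upper())

print(to_morse("YYZ"))  # -.-- -.-- --..
```

The dash-heavy pattern is what gives the song’s intro its distinctive stuttering rhythm.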

MSP: Minneapolis-St. Paul

Shared airports serving multiple cities create unique naming challenges.

The compromise: MSP takes “M” from Minneapolis and “SP” from St. Paul. Neither city gets priority in the code—a diplomatic solution for the Twin Cities.

Similar situations: Dallas-Fort Worth (DFW) follows the same pattern. The Tampa Bay area uses TPA (Tampa). Kansas City splits between MCI (from the airport’s former name, Mid-Continent International) and MKC (Kansas City Downtown).

When diplomacy fails: Some regions never resolved their airport identity. Washington D.C. area travelers choose between DCA (Reagan National), IAD (Dulles), and BWI (Baltimore-Washington)—three airports serving the same metropolitan area with codes tied to three different local identities.

The Mysterious Ones

Some codes seem to defy all logic until you know the story.

MCO (Orlando): Orlando’s code references McCoy Air Force Base, the military installation that became Orlando International Airport. McCoy was named for Colonel Michael McCoy, killed in a 1957 reconnaissance plane crash.

CLE (Cleveland): Simple enough—Cleveland abbreviated. But it’s named Hopkins International after a city manager, not the city.

MSY (New Orleans): The code references Moisant Stock Yards, the area where the original airport was built, itself named for pioneer aviator John Moisant. Louis Armstrong International Airport keeps a code that predates its jazz-themed name by decades.

ORF (Norfolk): The code references Norfolk (NRF was taken), adding O for the original airport name, Ocean View.

What the Codes Reveal

Airport codes preserve history that names often obscure. Renamed airports keep old codes. Relocated airports retain identifiers from former locations. Merged airports carry codes from component facilities.

The practical lesson: Codes aren’t designed for memorization—they’re designed for unique identification. When LAX and LHR and JFK become memorable, that’s incidental to their function.

The traveler’s adaptation: Experienced travelers learn codes through repetition. The stories behind them add context but aren’t essential. What matters is that the codes work—every airport worldwide has a unique identifier that systems can process without confusion.

Next time you check a boarding pass, remember that the three letters printed on it and stamped on your luggage tag carry decades of aviation history. The system that seems arbitrary follows patterns—and where patterns break, stories hide.

Marcus Chen
