The ABCs of LAX: How Airports Get Their Codes
Have you ever wondered why some airport codes make total sense — like "SYD" for Sydney Kingsford Smith Airport — but others, like Kahului Airport's "OGG," don't?
In the 1930s, American pilots began using two-letter airport codes based on a system the US Weather Bureau (now the National Weather Service) used to identify cities. By the late 1940s, there were more airports than the 676 possible two-letter combinations could cover, so a third letter was added.
In the 1960s, the International Air Transport Association (IATA) began standardizing these three-letter codes, producing the system still in use today.
In an ideal world, all airport codes would be the first three letters of the location's name, like "SAN" for San Diego International Airport. But if that code is already in use, then the second choice is a different combination of letters from the location name, like San Fernando Airport's "SFE."
However, if that combination is also taken, the IATA has to get creative.
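As a rough illustration of that preference order, here is a minimal Python sketch. The function name suggest_code and its fallback logic are hypothetical, invented for this example; IATA's real assignment process is handled by people, not a formula.

```python
import itertools

def suggest_code(name, taken):
    """Suggest a three-letter code following the preference order
    described above. Illustrative only; not IATA's real process."""
    letters = [c for c in name.upper() if c.isalpha()]
    # First choice: the first three letters of the location's name.
    first = "".join(letters[:3])
    if len(first) == 3 and first not in taken:
        return first
    # Second choice: another in-order combination of the name's letters.
    for combo in itertools.combinations(letters, 3):
        code = "".join(combo)
        if code not in taken:
            return code
    # Otherwise the IATA has to "get creative": local history, people,
    # or an old two-letter code with an "X" added.
    return None

print(suggest_code("San Diego", set()))       # prints "SAN"
print(suggest_code("San Fernando", {"SAN"}))  # prints "SAF"
```

Note that the sketch produces "SAF" for San Fernando, while the real second choice was "SFE", a reminder that actual assignments involve human judgment rather than a mechanical rule.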
Some codes refer to the area around an airport. For example, Bucharest Henri Coandă International Airport's "OTP" comes from the town of Otopeni, where the airport is located, 17 kilometers north of the Romanian capital.
Sometimes the code references the area's history or honors an important person. According to the official State of Hawaii website, Kahului Airport's "OGG" comes from the last three letters of the surname of Jim Hogg, Hawaiian Airlines' chief pilot when the airport opened.
In other cases, an "X" was simply added to the old two-letter airport code. This explains the codes for places like Los Angeles International Airport — "LAX" — and Phoenix Sky Harbor International Airport, which is "PHX."
According to the IATA, there are more than 11,000 airport codes currently in use, and roughly 40 to 50 new codes are added each year.