No matter where the Allies decided to invade Nazi-occupied France in the spring of 1944, the tide was going to be an important determinant of the success or failure of the amphibious assault. Along the entire French coast of the English Channel, the vertical range from low tide to the next high tide always exceeded 6 meters. At low tide, those large tidal ranges exposed long stretches of beach that Allied soldiers would have to cross under heavy German fire.
German Field Marshal Erwin Rommel was well aware of tidal issues. He had been tasked by Adolf Hitler with building the so-called Atlantic Wall to defend against the anticipated invasion. “When they come, it will be at high water,” Rommel told his troops time and again. He believed that all was lost and Germany was doomed if he could not stop the Allies at the beaches. And so, beginning in February 1944, he had thousands of underwater obstacles built in the intertidal zone (see figure 1). They were positioned so as to be covered by midtide and, unseen, rip out the bottoms of the landing craft. But Allied aerial reconnaissance soon spotted the obstacles and recognized their purpose. That forced significant changes in the invasion plan and made its success even more dependent on accurate tide predictions.
Figure 1. German Field Marshal Erwin Rommel (front row, third from left) on the French channel coast in April 1944, at low tide. He was inspecting some of the millions of obstacles he had ordered built between the low- and high-water lines on beaches where the Allied invasion force might land.
Predicting tides
In 1944 the astronomical tide (as distinguished from wind-driven effects on water levels) was the most predictable of all marine phenomena, and had been so for more than a century. In fact, tide prediction had existed in some form for 2000 years. Of course for most of those two millennia, prediction methods were crude and not based on any physical understanding of how tides are produced. But even crude tide prediction was important to ancient shell fishermen, who had to know when to leave low-water mudflats before the tide rushed back in and drowned them. And mariners had to know the time of high water so they could safely bring their ships into port without running aground.
The connection between the tide and the movement and phases of the Moon was too obvious not to be noticed. So ancient fishermen and mariners developed simple prediction methods based on the observed time interval between the Moon’s highest ascent above the horizon and the next high water. They also knew that the tidal range varied throughout the month, increasing as one approached full or new Moon.1
The first physical oceanographic data series was probably the tabulation of the times and heights of high and low waters at the northern end of the Persian Gulf, compiled around 150 BC by the Hellenistic mathematician Seleucus of Babylon.2 Seleucus recognized that the two high tides on any given day could be quite different in height and that the difference varied throughout the month, being greatest when the Moon is farthest north or south of the equator.
Over succeeding centuries, methods of tide prediction became more elaborate. But until Isaac Newton addressed the problem, they were still based only on observed correlations between water levels and excursions of the Moon and Sun. The earliest tide table yet discovered, printed in China in 1056, is surprisingly accurate. It predicted the arrival times of the tidal bore—a tidal wavefront steepened by shallow water—in the Qiantang River at Yanguan. There were in fact three tables: one for winter, one for summer, and one for spring and autumn. The Chinese had recognized that the differences between afternoon and morning high tides were larger at the solstices than at the equinoxes.3
Perhaps the most elaborate European tide table was the Brouscon tidal almanac produced in the 1540s for the king of France. It consisted of beautiful color charts showing dozens of harbors. A pocket-sized version on vellum was used by most mariners at that time.
Understanding tides
No really accurate tide-prediction method could be devised before someone discovered how the tides arise. Even Galileo Galilei and Johannes Kepler didn’t accomplish that, though each did figure out a piece of the puzzle. It was not until 1687 that Newton finally put the pieces together and explained how the tides are generated. In the Principia, he showed that the generation of tides depends on both the gravitational attraction of the Moon (as suggested by Kepler) and the centrifugal force of the Moon–Earth orbit. The latter involved accelerations not unlike the ones Galileo had invoked, though Galileo had erroneously concluded that the Sun plays the principal role.
As Newton explained it, gravitational attraction pulls Earth and the Moon toward each other. But they are also centrifugally pushed apart as they both orbit their joint center of mass. Near Earth’s center, gravitational attraction and centrifugal repulsion balance each other. But that’s not so on Earth’s surface. On the side closest to the Moon, gravitation is stronger, and on the opposite side, centrifugal repulsion wins out. Therefore the planet, largely covered by water, has two tidal bulges, one toward the Moon on the side closest to the Moon and one on the opposite side. So any point on the sea will exhibit two high tides a day as Earth rotates.
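Newton’s two-bulge picture can be checked with one line of algebra. As a back-of-the-envelope sketch (standard textbook reasoning, not spelled out in this article): for the Moon, of mass M, at distance d from Earth’s center, the difference between its gravitational pull at the near surface and at the center is

$$\Delta g = \frac{GM}{(d-R)^2} - \frac{GM}{d^2} \approx \frac{2GMR}{d^3},$$

where R is Earth’s radius and G the gravitational constant. A deficit of the same magnitude appears on the far side, which is why there are two bulges rather than one. And because the effect falls off as the cube of the distance, the nearby Moon raises roughly twice the tide that the far more massive Sun does.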
Because it ignores the effect of the continents, Newton’s explanation is of course oversimplified. The continents break up the Earth-covering waters into several large oceans. Newton, having assumed for simplicity that the sea responds instantaneously to the astronomical forces, essentially ignored all hydrodynamic effects—for example, the effect on the tide of the natural oscillation frequency of an ocean basin, something that Galileo had recognized.
Newton did not exploit his theory to propose an improved method of tide prediction. That task was taken up early in the next century by Daniel Bernoulli, who refined Newton’s equilibrium theory to produce tide tables that better incorporated several important astronomical frequencies. However, it was not until 1776 that Pierre Simon Laplace first described how the oceans respond dynamically to the slowly oscillating gravitational effects of the Moon and Sun.
Using the calculus, Laplace derived three equations for the global ocean, the first based on the conservation of mass and the other two based on momentum conservation in two horizontal directions.4 The Laplace tidal equations marked the beginning of the hydrodynamic modeling of the oceans, the first real application of physics to the sea and the birth of geophysical fluid dynamics.
The Laplace equations were very complex; they could not be fully solved until the advent of digital computers in the 20th century. But they did yield an immediate benefit that did not require a complete solution. Laplace demonstrated that the tide is unique among all oceanic phenomena in that all its energy is concentrated at only a few specific astronomical frequencies. For example, at many locations most of the energy is found at three semidiurnal frequencies: 1.93 cycles per day (due to the Moon), 2.00 cycles per day (due to the Sun), and 1.90 cycles per day (due to the eccentricity of the lunar orbit). In some locations, three other diurnal frequencies, due to the asymmetry introduced by the tilt of Earth’s axis, play an important role. The tidal energy spectrum contrasts dramatically with the ocean’s nontidal hydrodynamic energy, which is spread broadly across the frequency spectrum and reflects random wind-driven changes in water level.
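Those frequencies follow directly from the constituents’ periods: each is simply 24 hours divided by the period. A quick check (the constituent names and periods are standard oceanographic values, assumed here rather than taken from this article):

```python
# Periods of the three main semidiurnal tidal constituents (conventional
# names M2, S2, N2; textbook values, not quoted from this article),
# converted to the cycles-per-day frequencies cited above.
periods_hours = {
    "M2, principal lunar": 12.4206,
    "S2, principal solar": 12.0000,
    "N2, lunar elliptic":  12.6583,
}
for name, period in periods_hours.items():
    print(f"{name}: {24.0 / period:.2f} cycles per day")
# Prints 1.93, 2.00, and 1.90.
```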
Laplace proposed that one could accurately predict the tide if one could calculate the energy at each of the most important astronomical frequencies. That insight would be at the heart of all future tide-prediction methods. Ultimately, it would lead to one of the most elegant mechanical computing machines ever invented.
The harmonic method
Laplace’s proposal for developing practical tide prediction was first taken up 80 years later in Britain by William Thomson (who became Lord Kelvin in 1892). Thomson used Laplace’s idea to develop the harmonic method—essentially Fourier analysis—for analyzing time series of tide measurements to determine how much energy there is at each tidal frequency. Those energies vary from place to place because of the way oceans, bays, and other waterways affect the tide. But the brilliance of the harmonic method was that it required no understanding of hydrodynamics. One simply needed to analyze a long-enough data record at each location so that the energies at the most important astronomical frequencies could be separated from each other. Independent of Thomson but also inspired by Laplace, William Ferrel of the US Coast and Geodetic Survey also developed a technique for harmonic analysis and prediction in the early 1880s.
The portion of the tidal height due to the energy at a given frequency can be plotted as a cosine curve that has a particular amplitude and a particular time (relative to some astronomical reference) for its high water. When all the cosine curves are added together, the result closely matches a measured tide curve. Harmonic analysis determines what the amplitudes and phases (the harmonic constants) must be for the predicted tide curve to match the measured curve as closely as possible. Dozens of pairs of harmonic constants can also be calculated for other tidal frequencies that represent other orbit variations in the Earth-Moon-Sun system.
Later it was discovered that nonlinear shallow-water effects transfer tidal energy to still other frequencies, and so those frequencies were included in the harmonic prediction method. These so-called overtides involve higher harmonics of the basic astronomical frequencies, and they are what make a tide curve for shallow water look distorted when compared with the simple cosine curve for deep water. In shallow water, the tide curve can, for example, exhibit faster rise and slower fall.
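In modern terms, the synthesis just described takes only a few lines of code. Here is a minimal sketch in Python; the amplitudes and phases are invented for illustration (they are not the harmonic constants of any real port), while the angular speeds are the standard astronomical values:

```python
import numpy as np

# Harmonic tide synthesis: predicted height = mean level plus one cosine
# per tidal constituent. Amplitudes (m) and phases (deg) are invented;
# the angular speeds (deg/hour) are standard values.
constituents = [
    # name, amplitude (m), phase (deg), speed (deg/hour)
    ("M2", 2.0, 110.0, 28.9841),  # principal lunar semidiurnal
    ("S2", 0.7, 150.0, 30.0000),  # principal solar semidiurnal
    ("N2", 0.4,  95.0, 28.4397),  # larger lunar elliptic
    ("M4", 0.2, 200.0, 57.9682),  # shallow-water overtide at twice
                                  # the M2 speed; it skews the curve
]
Z0 = 4.0  # mean level above chart datum (m), also invented

def tide_height(t_hours):
    """Predicted height (m) at t_hours after the reference epoch."""
    h = Z0
    for _name, amp, phase, speed in constituents:
        h = h + amp * np.cos(np.radians(speed * t_hours - phase))
    return h

t = np.arange(0.0, 48.0, 0.25)  # two days at 15-minute steps
h = tide_height(t)
print(f"predicted range: {h.min():.2f} to {h.max():.2f} m")
```

Dropping the M4 line leaves a smooth deep-water curve; keeping it skews the curve the way shallow water does (whether the rise or the fall is faster depends on M4’s phase relative to M2).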
Exploiting the harmonic method required lots of data. Water-level measurements had to be made frequently (usually hourly) for several weeks. The longer the data series, the more harmonic components one could quantify for improved tide prediction.
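In today’s formulation, that analysis is a linear least-squares problem, because each constituent’s frequency is known in advance; only the amplitudes and phases are unknown. A sketch under that framing (the least-squares refinement mentioned near the end of this article, not the hand method of Kelvin’s day):

```python
import numpy as np

# Harmonic analysis by least squares: given water levels sampled at known
# times, solve for each constituent's amplitude A and phase g. Since
# A*cos(wt - g) = (A cos g)*cos(wt) + (A sin g)*sin(wt), the unknowns
# a = A cos g and b = A sin g enter linearly.
SPEEDS = [28.9841, 30.0000, 28.4397]  # deg/hour: M2, S2, N2

def harmonic_analysis(t_hours, levels):
    columns = [np.ones_like(t_hours)]  # mean-sea-level term
    for w in SPEEDS:
        wt = np.radians(w * t_hours)
        columns += [np.cos(wt), np.sin(wt)]
    X = np.column_stack(columns)
    coef, *_ = np.linalg.lstsq(X, levels, rcond=None)
    constants = []
    for i in range(len(SPEEDS)):
        a, b = coef[1 + 2 * i], coef[2 + 2 * i]
        constants.append((np.hypot(a, b),                       # amplitude
                          np.degrees(np.arctan2(b, a)) % 360))  # phase
    return coef[0], constants  # mean level, then (A, g) per constituent
```

The need for long records falls out of the algebra: the M2 and S2 speeds differ by only about one degree per hour, so roughly two weeks of hourly data are needed before their columns in the design matrix stop being nearly collinear, and more closely spaced constituent pairs take correspondingly longer to separate.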
Big brass machines
Once the harmonic constants were calculated, the problem became how to use them for tide predictions without having to make long, laborious calculations. In the early 1870s, Thomson designed an ingenious mechanical analog computer to automate the prediction process. It had dozens of gears and pulleys over which ran a wire that was connected to a pen touching a moving roll of paper.5 Each tidal constituent was represented by one gear rotating with a speed that was specific to that constituent’s frequency. A pin-and-yoke arrangement transformed the gear’s rotation into an up-and-down motion that pulled on the wire, thus providing that constituent’s contribution to the tide curve being drawn on the moving paper. Each pin-and-yoke pair was adjusted to provide the correct amplitude and phase as determined by the two harmonic constants for that tidal constituent.
Thomson’s first machine, built in London in 1872 by the Légé Engineering Company, summed the contributions of the 10 most important tidal constituents. It was a big, finely crafted, brass apparatus, later widely known as Kelvin’s tide machine. In the US, Ferrel designed a 19-constituent machine that was built in Washington, DC, in 1882 by Fauth and Company. But Ferrel’s machine was based on slightly different equations; instead of producing a complete, continuous tide-prediction curve, it predicted only the levels and times of high and low waters.
In 1906, Edward Roberts, who had worked out the gear ratios for Kelvin’s original device, designed a 40-constituent tide-predicting machine for the Bidston Observatory’s Liverpool Tidal Institute. A second US machine, with 37 tidal constituents, was designed by Rollin Harris and completed in 1912 in the workshops of the US Coast and Geodetic Survey. Like the Kelvin and Roberts devices, that machine, known as Old Brass Brains, produced entire tide curves.
With such machines, tide predictions were made for an entire year for all major ports and harbors around the world for which data had been taken and harmonically analyzed. For those so-called reference stations, the heights and times of all predicted high and low waters were published in annual tide tables. For “secondary stations” with insufficient data for harmonic analysis, high- and low-tide differences from a nearby reference station were calculated and used for daily prediction of local tides.
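For a secondary station, the correction was simple arithmetic applied to the reference station’s predicted highs and lows. A minimal sketch with invented numbers (tide tables variously give time differences together with height differences or height ratios; a ratio is shown here):

```python
# Secondary-station prediction by tidal differences: shift each predicted
# high/low water at the reference station by a fixed time offset and
# scale its height. All values below are invented for illustration.
reference_extremes = [(4.5, 7.1), (10.8, 0.9), (17.0, 7.3)]  # (hour, m)
TIME_DIFF_HOURS = 0.4  # local tide arrives about 24 minutes later
HEIGHT_RATIO = 0.85    # local tides roughly 15% smaller

local_extremes = [(t + TIME_DIFF_HOURS, h * HEIGHT_RATIO)
                  for t, h in reference_extremes]
print(local_extremes)
```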
World War II
By September 1939, when the German invasion of Poland launched the second world war in a generation, huge brass mechanical machines descended from Kelvin’s original had made accurate tide prediction commonplace. The prediction process itself was fairly efficient, but a considerable manual effort was still required to extract the harmonic constants from long records of water-level data. It took weeks to harmonically analyze a year’s worth of data, something that a modern electronic computer would accomplish in an instant.
In 1940, when it looked like the Germans would invade England with their own amphibious landing—a plan they code-named Unternehmen Seelöwe (Operation Sea Lion)—tide predictions were important for both sides. The British believed the Germans would land at high tide to minimize the length of beach they would have to cross under fire. On 26 June Winston Churchill, who had become prime minister in May, asked the Admiralty for “a chart of the tides and Moons to cover the next six weeks” and “on which days conditions will be most favorable for a seaborne landing.”6
The Admiralty responded that the most likely time would be “when high water occurs near dawn, with no Moon.” They selected dates in July and August that met the criteria for beaches near eight English ports. But in September, having failed to destroy the Royal Air Force, Hitler canceled Sea Lion. The Admiralty soon banned the publication of all tide-prediction tables that might help the Germans in the future.
Four days after Japan attacked Pearl Harbor on 7 December 1941, Hitler declared war on the US. The following November, the Allies undertook their own first amphibious landing, at Casablanca in French Morocco, using tide predictions produced by the US Coast and Geodetic Survey with the 30-year-old Harris machine. Over the next two years, that machine also provided tide predictions for American amphibious landings on various Japanese-held islands in the Pacific.
The Normandy beaches
As an Allied cross-channel invasion loomed in 1944, Rommel, convinced that it would come at high tide, installed millions of steel, cement, and wooden obstacles on the possible invasion beaches, positioned so they would be under water by midtide. But the Allies first observed Rommel’s obstacles from the air in mid-February 1944. “Thereafter they seemed to grow like mushrooms . . . until by May there was an obstacle on every two or three yards of front.”7
The obstacles came in a variety of shapes and sizes. In figure 1 we see rows of half-buried logs pointed upward at a low angle, some with explosive mines on them. There were also so-called hedgehogs, each consisting of three 2-meter iron bars crossed at right angles, and “Belgian gates,” 2- by 3-meter steel frames planted upright.
The Allies would certainly have liked to land at high tide, as Rommel expected, so their troops would have less beach to cross under fire. But the underwater obstacles changed that. The Allied planners now decided that initial landings must be soon after low tide so that demolition teams could blow up enough obstacles to open corridors through which the following landing craft could navigate to the beach. The tide also had to be rising, because the landing craft had to unload troops and then depart without danger of being stranded by a receding tide.
There were also nontidal constraints. For secrecy, Allied forces had to cross the English Channel in darkness. But naval artillery needed about an hour of daylight to bombard the coast before the landings. Therefore, low tide had to coincide with first light, with the landings to begin one hour after. Airborne drops had to take place the night before, because the paratroopers had to land in darkness. But they also needed to see their targets, so there had to be a late-rising Moon. Only three days in June 1944 met all those requirements for “D-Day,” the invasion date: 5, 6, and 7 June.8
A 6-meter tidal range meant that water would rise at a rate of at least a meter per hour—perhaps even faster due to shallow-water effects. The times of low water and the speed of the tidal rise had to be known rather precisely, or there might not be enough time for the demolition teams to blow up a sufficient number of beach obstacles. Also, the low-water times were different at each of the five landing beaches (from west to east, they were code-named Utah, Omaha, Gold, Juno, and Sword). Between Utah and Sword, separated by about 100 km, the difference was more than an hour. So H-Hour, the landing time on each beach, would have to be staggered according to the tide predictions. Tidal currents, the along-shore flow due to the changing tide, were another important consideration. Strong tidal currents could easily push amphibious craft down the beach, away from their intended landing spots. But tidal currents were much harder to predict than the tides themselves, because they were much harder to measure.
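The meter-per-hour figure is easy to check. For an idealized, purely sinusoidal semidiurnal tide of range R = 6 m and period T ≈ 12.42 hours (a back-of-the-envelope assumption; the real curve, as noted, is distorted by shallow water),

$$h(t) = \frac{R}{2}\sin\frac{2\pi t}{T}, \qquad \left(\frac{dh}{dt}\right)_{\max} = \frac{\pi R}{T} \approx \frac{\pi \times 6\ \mathrm{m}}{12.42\ \mathrm{h}} \approx 1.5\ \mathrm{m/h},$$

and even the average rate over the rising half cycle, R divided by T/2, comes to nearly 1 m/h. Shallow-water distortion could push the mid-tide rate higher still.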
All the Admiralty tide and tidal-current predictions for the war effort were produced by Arthur Thomas Doodson at the Liverpool Tidal Institute. The 53-year-old Doodson was at that time the world’s leading authority on tide prediction. He used two tide-predicting machines: the Kelvin machine, built in 1872 but overhauled in 1942 (shown in figure 2), and the Roberts-designed machine, built in 1906.
Figure 2. Kelvin’s tide machine, the mechanical calculator built for William Thomson (later Lord Kelvin) in 1872 but shown here as overhauled in 1942 to handle 26 tidal constituents. It was one of the two machines used by Arthur Doodson (above) at the Liverpool Tidal Institute to predict tides for the Normandy invasion. (Photos courtesy of Proudman Oceanographic Laboratory.)
The two machines were put in separate rooms at the observatory to minimize the chance of a bomb destroying both. That was a very real possibility. Worries were heightened during one of the Nazi propaganda broadcasts by the infamous “Lord Haw Haw” (the British traitor William Joyce), who promised that “by morning, Bidston Observatory will be no more.” Many bombs did fall near the observatory. Hundreds of windows were shattered, and many doors were damaged, including the entrance to Doodson’s bunker. But both tide-predicting machines survived and were kept running from early morning to late at night, seven days a week.
As a precaution, predictions for all the ports in the Admiralty Tide Tables were completed two years ahead of time. But the machines were also required for many additional predictions for wherever around the world the Allies needed them. By 1943, because of the war effort’s demands for educated personnel, the technical staff at the Liverpool Tidal Institute had been reduced to just Doodson and six young women who carried out the thousands of tabulations and arithmetic computations required for tidal analysis. Their additional duties included nighttime fire watch on the roof in tin helmets and trench coats and carrying buckets of water in case an incendiary bomb hit the observatory.9
Tide predictions for Normandy
On 15 July 1943, Lieutenant General Frederick Morgan sent his plan for Operation Overlord—the code name for the Allied invasion of northern Europe—to the British Chiefs of Staff. He had determined that the best site for the invasion was the stretch of Normandy beaches in the Bay of the Seine. His recommendation was accepted the following month by the Allied command.
At that time, the Admiralty’s superintendent of tides at the Hydrographic Office was William Ian Farquharson, a 43-year-old commander in the Royal Navy. Farquharson became responsible for finding a way to provide Doodson with the harmonic constants—or the data to calculate them—he would need for making accurate tide predictions for the landing beaches. But because of extremely tight security, Farquharson could not tell Doodson the biggest secret of the war—that the landings were to take place in Normandy. So he labeled the coastal location to which a particular set of harmonic constants or tidal data belonged with a code name. And he, in Bath, and Doodson, in Liverpool 200 km away, used those code names in their exchange of letters and telegrams.
To produce accurate tide predictions for the Normandy beaches, Doodson ideally needed accurate harmonic constants calculated from tide data measured at or very near each landing beach. But the British had no such data. The only harmonic constants they had were for the two French ports that bracketed the beaches: Le Havre to the east and Cherbourg to the west. Simple interpolation would not work, because shallow-water conditions varied from place to place. Shallow-water distortion of the tide can speed up the rise from low to high water, giving demolition teams less time to do their work. Very accurate shallow-water tidal data from the beaches themselves were therefore essential.
The 1943 Admiralty Tide Tables did include three locations near the Normandy beaches. But all three were secondary stations, and the tables contained warnings about the probable inaccuracy of predictions for those locations. Now Farquharson desperately needed tide data direct from the invasion beaches. So British teams in small boats and midget submarines carried out several secret midnight reconnaissance missions on the enemy beaches. Those dangerous missions did yield a few tide and current measurements, but far less data than is normally required for tidal analysis.
On 9 October 1943, Farquharson sent Doodson a three-page handwritten letter marked “MOST URGENT” (see figure 3). It included 11 pairs of harmonic constants for the location code-named “Position Z.” He asked Doodson to produce hourly height predictions for four months commencing 1 April 1944. “The place is nameless and the constants inferred,” he wrote. “There is in fact very little data for it. I am gambling on the inferred shallow-water constants giving something like the right answer.”10
Figure 3. A “most urgent” October 1943 note to Arthur Doodson from William Farquharson, the Admiralty’s superintendent of tides, listing 11 pairs of tidal harmonic constants for a location, code-named “Position Z,” for which he was to prepare hourly tide predictions for April through July 1944. Doodson was not told that the predictions were for the Normandy coast, but he guessed as much.10
It is still not known how Farquharson came up with those 11 pairs of harmonic constants. Probably he modified the Le Havre constants in such a way as to match the shape of a measured tide curve determined from the little bit of water-level data collected by one of the reconnaissance teams. He may also have taken into consideration the time and height differences found in the Admiralty Tide Tables. In any case, Farquharson’s harmonic constants for the Normandy beaches compare favorably with those derived from modern hydrodynamic tide models of the English Channel.
Doodson put Farquharson’s harmonic constants on his tide-predicting machines to produce the tide predictions for D-Day. (Years later Doodson admitted that he had guessed from those harmonic constants that the Normandy beaches were the intended landing site.) Doodson’s tide predictions were then modified at the Hydrographic Office for each of the landing beaches and provided to the military planners and commanders in a variety of convenient ways. Most important were the tidal and illumination diagrams (see figure 4), which combined the tide predictions with additional information about moonrise, moonset, sunrise, sunset, and various degrees of twilight.11
Figure 4. Tidal and illumination diagram for Omaha Beach, 5–21 June 1944, shows one of the formats in which Doodson’s predictions were provided to military commanders. The diagram gives not only tides but also moonlight and degrees of twilight. Times are given in Greenwich Mean Time.11
The D-Day landings
Tides had been the key factor in selecting 5, 6, and 7 June as the three days most suitable for D-Day. But it was weather that ultimately determined on which of those days the invasion was launched. Bad weather, with high winds, high waves, and high surf on the beaches, would make an amphibious landing impossible. It would also mean no air support and very inaccurate naval gunfire, as well as low clouds and poor visibility for the airborne operations during the night before the landing. June weather was usually good on the Normandy coast. But in 1944 bad June weather ended up delaying D-Day from the original 5 June date chosen by General Dwight Eisenhower and his staff.
Eisenhower’s decision to go one day later, based on his meteorologists’ forecast of a 36-hour break in the weather, is credited with being a critical factor in surprising the Germans. Rommel, having concluded that the weather was too bad for an invasion during that stretch, left his headquarters on 5 June and went to Germany to spend the next day with his wife on her birthday. There was, however, one more reason for the reduced German preparedness on 6 June: Rommel, still believing that the Allies must come at high tide, thought the 6 June tides would not suit them.12
Once the invasion began, the tide predictions proved to be quite accurate. The landings went smoothly, with one dramatic exception. One hour after low water, American demolition teams landed at Omaha Beach, the only beach on which the German defenders had not significantly reduced their readiness because of the weather and the (supposedly) unfavorable tides. The teams were tasked with blasting 16 channels, each 70 meters wide, through the beach obstacles. But German gunfire from the bluffs above the beach took a heavy toll. The demolition teams managed to blast only six complete gaps and three partial ones; more than half their engineers were killed in the process. But without accurate knowledge of the exact timing of the fast-rising tide, things would have been even worse.
At Utah, the other beach assigned to US forces and the westernmost of the five invasion beaches, American demolition teams had cleared the entire beach of obstacles by 8 am, with only 6 engineers killed and 11 wounded. Ironically, an unexpectedly strong tidal current actually helped the invaders by sweeping their landing craft two kilometers southeast of the intended landing spot, to a stretch of beach that turned out to be more lightly defended.
Aftermath
The D-Day landings might be considered the ultimate success for those big, beautiful brass tide-calculating machines. They continued in use by the British, Americans, Germans, and others into the 1960s, when digital computers finally took over. Though absurdly slow by today’s standards, the brass machines were very accurate.
The basic harmonic method on which their calculations were based changed relatively little with the transition to digital computers; numerical equations replaced pulleys and gears. There have, of course, been various numerical refinements to harmonic analysis—for example, least-squares techniques. Variations have also been developed to better handle the many additional overtide and compound constituents found in shallow waterways with large tidal ranges. There is now also a competing technique called the response method, a cross-spectral approach that’s useful for research but doesn’t really improve prediction accuracy. But the original harmonic methods of Kelvin and Ferrel are still today at the heart of tide prediction.
Bruce Parker, former chief scientist of the National Oceanic and Atmospheric Administration’s National Ocean Service, is a visiting professor at the Stevens Institute of Technology’s Center for Maritime Systems in Hoboken, New Jersey.