Survival Notes
Because survival is good for you.
I was puzzled at first by how to calculate the distance from me to the horizon, but when I viewed the problem as calculating the distance from the horizon to me, it became a simple hypotenuse equation, a^2 = b^2 + c^2.
Your eye's distance from the center of the Earth squared is equal to the radius of the Earth squared plus the distance to the horizon squared. In math, (h+r)^2 = r^2 + d^2, where h=your height from the ground, r=radius of Earth, and d=distance to horizon. Thus, we can solve the equation for d as d=sqrt((h+r)^2 - r^2), which finds the distance to the horizon for any height. Or to solve for h, we would use h = sqrt(r^2 + d^2) - r, which finds the height that would be required to see any given distance.
The mean radius of the Earth is about 3959 miles, i.e. about 20903520 feet. If an adult is 6 feet tall, his horizon is at d=sqrt((6+20903520)^2 - 20903520^2) feet, i.e. about 15838 feet, which is almost exactly 3 miles. In other words, you could see a small shiny object placed on the ground at that distance, but no farther, if the Earth were perfectly spherical. In reality, the Earth is not a perfect sphere, but the formula provides a good estimate, especially for greater heights and distances or over water.
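The two formulas above can be checked with a short Python sketch (the function names here are just illustrative, not from the original):

```python
import math

R_FEET = 3959 * 5280  # approximate mean radius of Earth in feet

def horizon_distance(h):
    """Distance in feet to the horizon for an eye height h in feet."""
    return math.sqrt((h + R_FEET) ** 2 - R_FEET ** 2)

def required_height(d):
    """Eye height in feet needed to see a distance d in feet."""
    return math.sqrt(R_FEET ** 2 + d ** 2) - R_FEET

# A 6-foot adult sees almost exactly 3 miles over a perfect sphere.
print(horizon_distance(6) / 5280)
```

Note that the two functions are inverses of each other: plugging the distance back into `required_height` recovers the original eye height.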
Because radio waves travel in roughly line-of-sight paths, the horizon formula also helps estimate how far an antenna can reach at any height (with sufficient transmitter power or receiver sensitivity).
Arm's length is about 24 inches, so a height of one inch at arm's length is equal to a slope of 1/24, because it's 1 inch for every 24 inches (i.e. 1 inch per 2 feet). So if a tree at a distance of 1000 feet looks the same height as something one inch tall held at arm's length, then you know the tree is about 1 * 1000 / 2 = 500 inches tall = 41'8" tall. There is no complicated trigonometry or calculus required here, just a basic understanding of slopes and some simple math.
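The similar-triangles idea above fits in a few lines of Python (a sketch, assuming a 24-inch arm; the function name is illustrative):

```python
def object_height_inches(apparent_inches, distance_feet, arm_inches=24):
    """Estimate an object's real height in inches from its apparent
    height on a ruler held at arm's length."""
    # apparent height / arm length = real height / distance (similar triangles)
    return apparent_inches * (distance_feet * 12) / arm_inches

# a tree 1000 feet away that looks 1 inch tall at arm's length
print(object_height_inches(1, 1000))  # 500.0 inches, i.e. 41'8"
```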
Suppose we know the size but not the distance. For example, we can safely assume that a distant man is about six feet tall (72 inches), so if he appears to be 1/16 inch high (0.0625 inch) on a ruler held out at arm's length, we can estimate he is about 72 / 0.0625 * 2 = 2304 feet away. (The 2 is arm's length expressed in feet: 24 inches is 2 feet.)
To put it another way, the ratio of height to distance is always the same. So 1/16 inch at arm's length is the same ratio as the height of a man at 2304 feet.
To estimate the distance in feet to a man, divide 144 by the number of inches of apparent height on a ruler held at arm's length. For yards of distance, divide 48 by the inches of apparent height.
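The same ratio runs in reverse to give distance from a known size. A minimal sketch (the default 72-inch height and function name are illustrative):

```python
def distance_to_man_feet(apparent_inches, real_height_inches=72, arm_feet=2):
    """Estimate distance in feet to a man of known height, from his
    apparent height in inches on a ruler held at arm's length."""
    # real height / apparent height = distance / arm length
    return real_height_inches / apparent_inches * arm_feet

print(distance_to_man_feet(1 / 16))  # 2304.0 feet

# the shortcut from the text: 144 / apparent inches gives the same answer,
# because 144 = 72 inches * 2 feet
print(144 / (1 / 16))  # 2304.0
```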
©2019 Ron Spain