Post by Scoutpilot on Apr 21, 2021 16:27:25 GMT -5
I took this excerpt from Parler. Does it sound familiar?
"Authored by Barry Brownstein via The American Institute for Economic Research,
Hungarian mathematician Abraham Wald was a World War II hero. He worked out of a nondescript apartment building in Harlem for the Applied Mathematics Panel. Wald’s ability to see the unseen was a significant factor in the Allied victory in World War II.
Allied bomber planes were being shot down at such an alarming rate that bomber airmen were called “ghosts already.” The Air Force concluded that more armor was needed on the planes, but adding armor would add weight. David McRaney, the author of several books on cognitive biases, tells the story of how Wald saved the military from a major blunder:
“The military looked at the bombers that had returned from enemy territory. They recorded where those planes had taken the most damage. Over and over again, they saw that the bullet holes tended to accumulate along the wings, around the tail gunner, and down the center of the body. Wings. Body. Tail gunner. Considering this information, where would you put the extra armor? Naturally, the commanders wanted to put the thicker protection where they could clearly see the most damage, where the holes clustered. But Wald said no, that would be precisely the wrong decision. Putting the armor there wouldn’t improve their chances at all.”
Wald looked at the same bullet holes and saw a pattern revealing “where a bomber could be shot and still survive the flight home.”
Wald didn’t fall for survivorship bias. Here is what he advised:
“What you should do is reinforce the area around the motors and the cockpit. You should remember that the worst-hit planes never come back. All the data we have come from planes that make it to the bases. You don’t see that the spots with no damage are the worst places to be hit because these planes never come back.”
McRaney writes, “The military had the best data available at the time, and the stakes could not have been higher, yet the top commanders still failed to see the flaws in their logic. Those planes would have been armored in vain had it not been for the intervention of a man trained to spot human error.”
We easily succumb to the “what you see is all there is” (WYSIATI) bias. In his book Thinking, Fast and Slow, Daniel Kahneman explains, “You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it.”
Think of the last time you looked to a “survivor” for career and life advice, eager to learn their ticket to success. McRaney writes, “The problem here is that you rarely take away from these inspirational figures advice on what not to do, on what you should avoid, and that’s because they don’t know.” We make faulty decisions when we ignore the evidence from those who did not survive a selection process. "
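Wald's logic is easy to see with a toy simulation. This is my own sketch, not from the article; the section names and lethality numbers are made up for illustration. Hits land uniformly across the airframe, but hits to the engines or cockpit are assumed far more likely to down the plane, so counting holes only on the bombers that return makes those sections look almost untouched.

# Toy Monte Carlo illustration of survivorship bias (my own sketch; every
# number below is made up for illustration and is not from the article).
import random

random.seed(0)

SECTIONS = ["wings", "fuselage", "tail", "engines", "cockpit"]
# Assumed chance that a single hit to that section downs the bomber.
LETHALITY = {"wings": 0.05, "fuselage": 0.05, "tail": 0.05,
             "engines": 0.85, "cockpit": 0.85}

observed = {s: 0 for s in SECTIONS}  # holes counted on planes that made it back
actual = {s: 0 for s in SECTIONS}    # holes on every plane, returned or not

for _ in range(100_000):
    # Each sortie takes 1-6 hits, spread uniformly over the airframe.
    hits = [random.choice(SECTIONS) for _ in range(random.randint(1, 6))]
    survived = all(random.random() > LETHALITY[h] for h in hits)
    for h in hits:
        actual[h] += 1
        if survived:
            observed[h] += 1

for s in SECTIONS:
    seen = observed[s] / sum(observed.values())
    true = actual[s] / sum(actual.values())
    print(f"{s:8s}  share of holes on returning planes: {seen:.2f}   true share: {true:.2f}")

With these made-up numbers, the holes seen on returning planes cluster on the wings, fuselage, and tail, while the engines and cockpit look nearly untouched, even though every section is hit equally often. That is exactly the trap the commanders fell into.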
"Authored by Barry Brownstein via The American Institute for Economic Research,
Austrian mathematician Abraham Wald was a World War II hero. He worked out of a nondescript apartment building in Harlem for the Applied Mathematics Panel. Wald’s ability to see the unseen was a significant factor in the Allied victory in World War II.
Allied bomber planes were being shot down at such an alarming rate that bomber airmen were called “ghosts already.” The Air Force concluded that more armor was needed on the planes but adding armor would add weight. David McRaney, the author of several books on cognitive biases, tells the story of how Wald saved the military from a major blunder:
“The military looked at the bombers that had returned from enemy territory. They recorded where those planes had taken the most damage. Over and over again, they saw that the bullet holes tended to accumulate along the wings, around the tail gunner, and down the center of the body. Wings. Body. Tail gunner. Considering this information, where would you put the extra armor? Naturally, the commanders wanted to put the thicker protection where they could clearly see the most damage, where the holes clustered. But Wald said no, that would be precisely the wrong decision. Putting the armor there wouldn’t improve their chances at all.”
Wald looked at the same bullet holes and saw a pattern revealing “where a bomber could be shot and still survive the flight home.”
Wald didn’t fall for survivorship bias. Here is what he advised:
“What you should do is reinforce the area around the motors and the cockpit. You should remember that the worst-hit planes never come back. All the data we have come from planes that make it to the bases. You don’t see that the spots with no damage are the worst places to be hit because these planes never come back.”
McRaney writes, “The military had the best data available at the time, and the stakes could not have been higher, yet the top commanders still failed to see the flaws in their logic. Those planes would have been armored in vain had it not been for the intervention of a man trained to spot human error.”
We easily succumb to what you see is all there is (WYSIATI) mindset bias. In his book Thinking, Fast and Slow, Daniel Kahneman explains, “You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it.”
Think of the last time you looked to a “survivor” for career and life advice, eager to learn their ticket to success. McRaney writes, “The problem here is that you rarely take away from these inspirational figures advice on what not to do, on what you should avoid, and that’s because they don’t know.” We make faulty decisions when we ignore the evidence from those who did not survive a selection process. "