In a letter dated November 1861, General George B. McClellan — newly appointed by President Abraham Lincoln as commander of the Union Army — wrote to his wife Mary Ellen that “Mr. Lincoln is nothing more than a well-meaning baboon.”
McClellan’s undisguised disdain echoed a broader sentiment among the political and military elite, who badly misjudged Lincoln’s capacity to lead the United States in a moment of national crisis. In the years that followed, history would vindicate Lincoln as America’s greatest commander-in-chief — while McClellan’s own legacy was overshadowed by the very man he had once so casually disparaged.
McClellan was hardly the first person to look down on someone far greater than himself, and he certainly wasn’t the last. Take Ignaz Semmelweis, for example, the brilliant Hungarian physician whose simple, lifesaving idea should have made him a medical hero.
In the 1840s, Semmelweis researched the high incidence of women dying after childbirth in hospitals and concluded that it was caused by doctors moving straight from autopsies to maternity wards, thereby infecting mothers. A staggering one in every six mothers died due to this practice.
There was a simple solution, Semmelweis said: doctors needed to wash their hands so that ‘cadaverous particles’ — germ theory had not yet been developed — would be removed. But the response to his suggestion was not gratitude but outrage. One senior Viennese physician dismissed Semmelweis’s handwashing solution as “the outpourings of a disturbed mind.”
The hostility to Semmelweis grew, and it essentially ended his career; the man poised to save countless lives was ridiculed into obscurity. He was eventually committed to an insane asylum, where he died at the age of 47. Only decades later did the medical world finally admit that the ‘disturbed mind’ had been right all along.
Semmelweis was not the only doctor ridiculed for seeing the truth too clearly. During the 1854 cholera outbreak in London, Dr. John Snow proposed an idea that all his colleagues considered utterly laughable: he argued that cholera wasn’t caused by “bad air” or mysterious atmospheric vapors, but by contaminated water. Today we don’t question this fact — but in mid-19th-century London, it was considered scientific heresy.
Snow wasn’t put off easily. He painstakingly mapped cholera cases, eventually traced the outbreak to the Broad Street water pump, and persuaded local officials to remove its handle so no one could draw water there. Deaths plummeted almost immediately, but the medical establishment still refused to take him seriously.
The president of the General Board of Health dismissed Snow’s work as “mere hypothesis,” and another critic sneered that his theory “cannot be entertained in any scientific discussion.” Snow, like Semmelweis, was treated as an irritant rather than a visionary. Only years later, long after his early death at 45, did the world recognize that the man they had waved away as a crank had actually solved one of the great medical mysteries of all time.
This pattern of condescension was not limited to the medical world. In the 1840s, Ada Lovelace — daughter of the poet Lord Byron and one of the most extraordinary minds of her generation — became fascinated by Charles Babbage’s proposed “analytical engine,” a mechanical device most people viewed as little more than an elaborate calculator.
But Lovelace saw something far more revolutionary. In a set of notes that she appended to her translation of an Italian science paper, she suggested that this machine, if built, would be able to manipulate symbols of all kinds, not just numbers, and might even compose elaborate music — concepts that today form the backbone of modern computing and, more recently, AI.
But her vision was far too radical for her contemporaries. One prominent engineer dismissed her ideas as “the wild fancies of a young woman,” and others insisted Lovelace simply did not understand the limits of machinery. Lovelace, like Semmelweis and Snow, was written off as someone who thought too strangely, too imaginatively, too far beyond the accepted boundaries.
A century later, computer scientists rediscovered her work and suddenly realized that her “wild fancies” were, in fact, the earliest blueprint for the digital age. The woman whose insights were rudely dismissed in her lifetime became known as the world’s first computer programmer.
The dismissal of great people by their peers was not a phenomenon limited to the nineteenth century. History is replete with such examples, going all the way back to the Bible itself, with the most famous case appearing in Parshat Vayeishev.
Long before Lincoln was dismissed by McClellan, long before Semmelweis was mocked as delusional, long before John Snow was waved away as a crank, and long before Ada Lovelace was written off as an over-imaginative dreamer, Joseph’s brothers concluded that he was an overblown egotist punching way above his weight. They saw his confidence and heard his dreams, and immediately decided he was an arrogant narcissist obsessed with visions of grandeur.
What they never paused to consider was that perhaps these dreams were not fantasies at all, but glimpses of a destiny that he alone could perceive. Their prejudices and preconceived notions of their little brother blinded them to the remarkable qualities standing right in front of them: Joseph’s intuition, his emotional intelligence, his spiritual imagination, his innate leadership — all of which would emerge in the concluding chapters of Genesis.
Convinced they were dealing with an insufferable younger sibling who needed to be put in his place, they misread the situation entirely. In their rush to dismiss him, they failed to recognize that he was, in fact, the person who would one day save them all.
Malbim offers a psychologically astute insight that applies equally to all the examples throughout history: people interpret ambiguous information through the filter of their existing emotions. Because the brothers already viewed Joseph with suspicion, they didn’t read his dreams as neutral messages but as hostile declarations.
Their own jealousy and insecurity shaped what they thought the dreams meant — and, by extension, who they believed Joseph was. Malbim points out that had they not been so entangled in their biases, they might have seen the dreams in an entirely different light.
Which brings us to the most unsettling question of all. If Lincoln could be written off as a “well-meaning baboon,” if Semmelweis could be mocked into madness, if John Snow could be dismissed as a crank, and if Ada Lovelace could be waved away as a fanciful young woman, how many other potential Josephs has history quietly buried?
How many brilliant minds, original thinkers, and visionary spirits were crushed before their gifts could ever see daylight, not because they lacked greatness, but because those around them lacked the imagination to recognize it?
Joseph survived his brothers’ attempts to dismiss him and ultimately rose to fulfill his destiny. But his story stands as a warning: when we assume we already know someone’s limits, we may be blinding ourselves to the greatness standing right in front of us. And the tragedy is not only what we fail to see; it’s what the world loses when a future savior is silenced before he ever has a chance to begin.
So here’s a challenge for us all: This week, champion a quiet contrarian in your own circle. Seek out someone with unconventional ideas, and nurture them. Who knows, you might just uncover the next great thinker whose insights can change the world. Let us learn from the past and ensure that no potential Joseph is buried under the weight of our doubts.