Field of Science

Bottom-up and top-down in drug discovery

There are two approaches to discovering new drugs. In one approach drugs fall in your lap from the sky. In the other you scoop them up from the ocean. Let’s call the first the top-down approach and the second the bottom-up approach.

The bottom-up approach assumes that you can discover drugs by thinking hard about them, by understanding what makes them tick at the molecular level, by deconstructing the dance of atoms orchestrating their interactions with the human body. The top-down approach assumes that you can discover drugs by looking at their effects on biological systems, by gathering enough data about them without understanding their inner lives, by generating numbers through trial and error, by listening to what those numbers are whispering in your ear.

To a large extent, the bottom-up approach assumes knowledge while the top-down approach assumes ignorance. Since human beings have been ignorant for most of their history, for most of the recorded history of drug discovery they have pursued the top-down approach. When you don't know what works, you try things out randomly. The South Americans found out by accident that chewing the bark of the cinchona tree relieved them of the afflictions of malaria. Through the Middle Ages and beyond, people who called themselves physicians prescribed a witches' brew of substances ranging from sulfur to mercury to arsenic to try to cure a corresponding witches' brew of maladies, from consumption to the common cold. More often than not these substances killed patients as readily as the diseases themselves.

The top-down approach may seem crude and primitive, and it was primitive, but it worked surprisingly well. For the longest time it was exemplified by the ancient medical systems of China and India – one of these systems delivered an antimalarial medicine that helped its discoverer bag a Nobel Prize for Medicine. Through fits and starts, scores of failures and a few solid successes, the ancients discovered many treatments that were often lost to the dust of ages. But the philosophy endured. It endured right up to the early 20th century, when the German physician Paul Ehrlich tested hundreds of chemical compounds - products of the burgeoning dye industry pioneered by the Germans - and found that compound number 606 worked against syphilis. Syphilis had so bedeviled people since medieval times that it was often a default diagnosis of death, and cures were desperately needed. Ehrlich's 606 was arsenic-based, unstable and had severe side effects, but the state of medicine back then was such that anything was regarded as a significant improvement over the previous mercury-based compounds.

It was with Ehrlich's discovery that drug discovery started to transition to a more bottom-up discipline, systematically trying to make and test chemical compounds and understand how they worked at the molecular level. But it still took decades before the approach bore fruit. For that we had to await a nexus of great and concomitant advances in theoretical and synthetic organic chemistry, spectroscopy and cell and molecular biology. These advances helped us figure out the structure of druglike organic molecules, they revealed the momentous fact that drugs work by binding to specific target proteins, and they also allowed us to produce these proteins in useful quantities and uncover their structures. Finally, at the beginning of the 80s, we thought we had enough understanding of chemistry to design drugs by bottom-up approaches, "rationally", as if everything that had gone on before was simply the product of random flashes of unstructured thought. The advent of personal computers (Apple and Microsoft had been founded in the mid-70s) and their immense potential left people convinced that it was only a matter of time before drugs were "designed with computers". What the revolution probably found inconvenient to discuss much was that it was the top-down analysis which had preceded it that had produced some very good medicines, from penicillin to Thorazine.

Thus began the era of structure-based drug design which tries to design drugs atom by atom from scratch by knowing the protein glove in which these delicate molecular fingers fit. The big assumption is that the hand that fits the glove can deliver the knockout punch to a disease largely on its own. An explosion of scientific knowledge, startups, venture capital funding and interest from Wall Street fueled those heady times, with the upbeat understanding that once we understood the physics of drug binding well and had access to more computing power, we would be on our way to designing drugs more efficiently. Barry Werth's book "The Billion-Dollar Molecule" captured this zeitgeist well; the book is actually quite valuable since it's a rare as-it-happens study and not a more typical retrospective one, and therefore displays the same breathless and naive enthusiasm as its subjects.

And yet, 30 years after the prophecy was enunciated in great detail and to great fanfare, where are we? First, the good news. The bottom-up approach did yield great dividends - most notably in the field of HIV protease inhibitor drugs against AIDS. I actually believe that this contribution from the pharmaceutical industry is one of the greatest public services that capitalism has performed for humanity. Important drugs for lowering blood pressure and controlling heartburn were also beneficiaries of bottom-up thinking.

The bad news is that the paradigm fell short of the wild expectations that we had from it. Significantly short in fact. And the reason is what it always has been in the annals of human technological failure: ignorance. Human beings simply don't know enough about perturbing a biological system with a small organic molecule. Biological systems are emergent and non-linear, and we simply don't understand how simple inputs result in complex outputs. Ignorance was compounded with hubris in this case. We thought that once we understood how a molecule binds to a particular protein and optimized this binding, we had a drug. But what we had was simply a molecule that bound better to that protein; we still worked on the assumption that that protein was somehow critical for a disease. Also, a molecule that binds well to a protein has to overcome enormous other hurdles of oral bioavailability and safety before it can be called a drug. So even if - and that's a big if - we understood the physics of drug-protein binding well, we still wouldn't be any closer to a drug, because designing a drug involves understanding its interactions with an entire biological system and not just with one or two proteins.

In reality, diseases like cancer manifest themselves through subtle effects on a host of physiological systems involving dozens if not hundreds of proteins. Cancer especially is a wily disease because it activates cells for uncontrolled growth through multiple pathways. Even if one or two proteins were the primary drivers of this process, simply designing a molecule to block their actions would be too simplistic and reductionist. Ideally we would need to block a targeted subset of proteins to produce optimum effect. In reality, either our molecule would not bind even one favored protein sufficiently and lack efficacy, or it would bind the wrong proteins and show toxicity. In fact the reason why no drug can escape at least a few side effects is precisely because it binds to many other proteins other than the one we intended it to.

Faced with this wall of biological complexity, what do we do? Ironically, what we had done for hundreds of years, only this time armed with far more data and smarter data analysis tools. Simply put, you don't worry about understanding how exactly your molecule interacts with a particular protein; you worry instead only about its visible effects, about how much it impacts your blood pressure or glucose levels, or how much it increases urine output or metabolic activity. These endpoints are agnostic of knowledge of the detailed mechanism of action of a drug. You can also compare these results across a panel of drugs to try to decipher similarities and differences.
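The kind of target-agnostic comparison described above can be sketched in a few lines. Everything here - the drug names, the endpoints, the numbers - is invented purely for illustration; the point is only that endpoint profiles can be compared without any knowledge of mechanism:

```python
import numpy as np

# Hypothetical endpoint readouts (rows: drugs; columns: endpoints such as
# change in blood pressure, glucose level and urine output). The values
# are made up for illustration, not real assay data.
panel = {
    "drug_A": [-12.0, -1.5, 0.3],
    "drug_B": [-11.0, -1.2, 0.4],   # profile resembling drug_A's
    "drug_C": [  0.5, -9.0, 2.1],   # a very different profile
}

def profile_similarity(a, b):
    """Pearson correlation between two endpoint profiles."""
    return float(np.corrcoef(a, b)[0, 1])

sim_ab = profile_similarity(panel["drug_A"], panel["drug_B"])
sim_ac = profile_similarity(panel["drug_A"], panel["drug_C"])

# Drugs with similar visible effects group together even though we know
# nothing about which proteins they bind.
assert sim_ab > sim_ac
```

In a real screen the panel would hold hundreds of drugs and endpoints, and the pairwise similarities would feed into clustering, but the logic is the same: the comparison never asks what the drug's target is.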

This is top-down drug design and discovery, writ large in the era of Big Data and techniques from computer science like machine learning and deep learning. The field is fundamentally steeped in data analysis and takes advantage of new technology that can measure umpteen effects of drugs on biological systems, greatly improved computing power and hardware to analyze these effects, and refined statistical techniques that can separate signal from noise and find trends.

The top-down approach is today characterized mainly by phenotypic screening and machine learning. Phenotypic screening involves simply throwing a drug at a cell, organ or animal and observing its effects. In its primitive form it was used to discover many of today's important drugs; in the field of anxiety medicine for instance, new drugs were discovered by giving them to mice and simply observing how much fear the mice exhibited toward cats. Today's phenotypic screening can be more fine-grained, looking at drug effects on cell size, shape and elasticity. One study I saw looked at potential drugs for wound healing; the most important tool in that study was a high-resolution camera, and the top-down approach manifested itself through image analysis techniques that quantified subtle changes in wound shape, depth and appearance. In all these cases, the exact protein target the drug might be interacting with was a distant horizon and an unknown. The large scale, often visible, effects were what mattered. And finding patterns and subtle differences in these effects - in images, in gene expression data, in patient responses - is what the universal tool of machine learning is supposed to do best. No wonder that every company and lab from Boston to Berkeley is trying feverishly to recruit data and machine learning scientists and build burgeoning data science divisions. These companies have staked their fortunes on a future that is largely imaginary for now.
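A toy version of that wound-healing readout might look like the following sketch. The binary masks stand in for segmented camera images, and all the numbers are invented for illustration:

```python
import numpy as np

# Toy "images": binary masks where 1 marks wound pixels. In a real study
# these would come from a high-resolution camera plus image segmentation.
day0 = np.zeros((8, 8), dtype=int)
day0[2:6, 2:6] = 1          # 4x4 wound = 16 pixels

day7_treated = np.zeros((8, 8), dtype=int)
day7_treated[3:5, 3:5] = 1  # wound shrunk to 2x2 = 4 pixels

def wound_area(mask):
    """Count wound pixels -- a crude, target-agnostic phenotypic endpoint."""
    return int(mask.sum())

# Fractional closure after treatment: the visible, large-scale effect.
healing = 1 - wound_area(day7_treated) / wound_area(day0)
assert abs(healing - 0.75) < 1e-9   # 75% closure, no mechanism needed
```

The endpoint is just a pixel count; nothing in the analysis refers to a protein target, which is exactly the point of the top-down approach.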

Currently there seems to be, if not a war, at least a simmering and uneasy peace between top-down and bottom-up approaches in drug discovery. And yet this seems to be mainly a fight where opponents set up false dichotomies and straw men rather than find complementary strengths and limitations. First and foremost, the ultimate proof of the pudding is in the eating, and machine learning's impact on the number of approved new drugs still has to be demonstrated; the field is simply too new. The constellation of techniques has also proven itself to be much better at solving certain problems (mainly image recognition and natural language processing) than others. A lot of early stage medicinal chemistry data contains messy assay results and unexpected structure-activity relationships (SAR) containing "activity cliffs" in which a small change in structure leads to a large change in activity. Machine learning struggles with these discontinuous stimulus-response landscapes. Secondly, there are still technical issues in machine learning such as working with sparse data and noise that have to be resolved. Thirdly, while the result of a top-down approach may be a simple image or change in cell type, the number of potential factors that can lead to that result can be hideously tangled and multifaceted. Finally, there is the perpetual paradigm of garbage-in-garbage-out (GIGO). Your machine learning algorithm is only as good as the data you feed it, and chemical and biological data are notoriously messy and ill-curated; chemical structures might be incorrect, assay conditions might differ in space and time, patient reporting and compliance might be sporadic and erroneous, human error riddles data collection, and there might be very little data to begin with. The machine learning mill can only turn data grist into gold if what it's provided with is grist in the first place.
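A minimal illustration of why activity cliffs trouble smooth models: fit a straight line to an invented one-descriptor SAR series containing a cliff, and the largest prediction errors land at the cliff itself. The descriptor and activity values below are entirely made up:

```python
import numpy as np

# Toy SAR series: a 1-D "structure descriptor" vs measured activity (pIC50).
# An activity cliff sits between x=0.50 and x=0.52: a tiny structural
# change produces a huge activity change.
x = np.array([0.10, 0.20, 0.30, 0.40, 0.50, 0.52, 0.60, 0.70])
y = np.array([5.0,  5.1,  5.2,  5.3,  5.4,  8.9,  9.0,  9.1])

# Fit a straight line -- the kind of smooth model that assumes similar
# structures must have similar activities.
slope, intercept = np.polyfit(x, y, 1)
pred = slope * x + intercept
errors = np.abs(pred - y)

# The model's worst errors occur at the two compounds flanking the cliff,
# not in the smoothly varying regions of the series.
assert errors.argmax() in (4, 5)
```

Any model that interpolates smoothly between neighbors will stumble in the same way; real activity cliffs live in high-dimensional descriptor spaces, but the failure mode is this one.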

In contrast to some of these problems with the top-down paradigm, bottom-up drug design has some distinct advantages. First of all, it has worked, and nothing speaks like success. Also operationally, since you are usually looking at the interactions between a single molecule and protein, the system is much simpler and cleaner, and the techniques to study it are less prone to ambiguous interpretation. Unlike machine learning, which can be a black box, here you can understand exactly what's going on. The amount of data might be smaller, but it may also be more targeted, manageable and reproducible. You don't usually have to deal with the intricacies of data fitting and noise reduction or the curation of data from multiple sources. Ultimately, if, like HIV protease, your target does turn out to be the Achilles heel of a deadly disease like AIDS, your atom-by-atom design can be as powerful as Thor's hammer. There is little doubt that bottom-up approaches have worked in selected cases, where the relevance of the target has been validated, and there is little doubt that this will continue to be the case.

Now it's also true that just like the top-downers, the bottom-uppers have had their burden of computational problems and failures, and both paradigms have been subjected to their fair share of hype. Starting from that "designing drugs using computers" headline in 1981, people have understood that there are fundamental problems in modeling intermolecular interactions: some of these problems are computational and can in principle be overcome with better hardware and software, but others, like the poor understanding of water molecules and electrostatic interactions, are fundamentally scientific in nature. The downplaying of these issues and the emphasizing of occasional anecdotal successes have led to massive hype in computer-aided drug design. But in the case of machine learning it's even worse in some sense, since hype from applications of the field in other human endeavors is spilling over into drug discovery too; it seems hard for some to resist claiming that their favorite machine learning system will soon cure cancer because it's making inroads in trendy applications like self-driving cars and facial recognition. Unlike machine learning, though, the bottom-up school has had some 30 years of successes and failures to draw on, so there is a lid of sorts on the hype, one that skeptics constantly hold in place.

Ultimately, the biggest advantage of machine learning is that it allows us to bypass detailed understanding of complex molecular interactions and biological feedback and work from the data alone. It's like a system of psychology that studies human behavior purely based on stimuli and responses of human subjects, without understanding how the brain works at a neuronal level. The disadvantage is that the approach can remain a black box; it can lead to occasional predictive success but at the expense of understanding. And a good open question is to ask how long we can keep on predicting without understanding. Knowing how many unexpected events or "Black Swans" exist in drug discovery, how long can top-down approaches keep performing well?

The fact of the matter is that both top-down and bottom-up approaches to drug discovery have strengths and limitations and should therefore be part of an integrated approach to drug discovery. In fact they can hopefully work well together, like members of a relay team. I have heard of at least one successful major project in a leading drug firm in which top-down phenotypic screening yielded a valuable hit which then, midstream, was handed over to a bottom-up team of medicinal chemists, crystallographers and computational chemists who deconvoluted the target and optimized the hit all the way to an NDA (New Drug Application). At the same time, it was clear that the latter would not have been possible without the former. In my view, the old guard of the bottom-up school has been reluctant and cynical in accepting membership in the guild for the young Turks of the top-down school, while the Turks have been similarly guilty of dismissing their predecessors as antiquated and irrelevant. This is a dangerous game of all-or-none in the very complex and challenging landscape of drug discovery and development, where only multiple and diverse approaches are going to allow us to discover the proverbial needle in the haystack. Only together will the two schools thrive, and there are promising signs that they might in fact be stronger together. But we'll never know until we try.


Unifiers and diversifiers in physics, chemistry and biology

On my computer screen right now are two molecules. They are both large rings with about thirty atoms each, a motley mix of carbons, hydrogens, oxygens and nitrogens. In addition they have appendages of three or four atoms dangling off their periphery. There is only one, seemingly minor difference: the appendage in one of the rings has two more carbon atoms than that in the other. If you looked at the two molecules in flat 2D - in the representation most familiar to practicing chemists - you would sense little difference between them.
Yet when I look at the two molecules in 3D - if I look at their spatial representations or conformations - the differences between them are revealed in their full glory. The presence of two extra carbons in one of the compounds causes it to scrunch up, to slightly fold upon itself the way a driver edges close to the steering wheel. This slight difference causes many atoms which are otherwise far apart to come together and form hydrogen bonds, weak interactions that are nonetheless essential in holding biological molecules like DNA and proteins together. These hydrogen bonds can in turn modulate the shape of the molecule and allow it to get past cell membranes better than the other one. A difference of only two carbons - negligible on paper - can thus have profound consequences for the three-dimensional life of these molecules. And this difference in 3D can in turn translate to significant differences in their functions, whether those functions involve capturing solar energy or killing cancer cells.
Chemistry is full of hidden differences and similarities like these. Molecules exist on many different levels, and on each level they manifest unique properties. In one sense they are like human beings. On the surface they may appear similar, but probe deeper and each one is unique. And probing even deeper may then again reveal similarities. They are thus both similar and different all at once. But just like human beings molecules are shy; they won't open up unless you are patient and curious, they may literally fall apart if you are too harsh with them, and they may even turn the other cheek and allow you to study them better if you are gentle and beguiling enough. It is often only through detailed analysis that you can grasp their many-splendored qualities. It is this ever-changing landscape of multifaceted molecular personalities, slowly but surely rewarding the inquisitive and dogged mind, that makes chemistry so thrilling and open-ended. It is why I get a kick out of even mundane research.
When I study the hidden life of molecules I see diversity. And when I see diversity I am reminded of how important it is in all of science. Sadly, the history of science in the twentieth century has led both scientists and the general public to value unity over diversity. The main culprit in this regard has been physics, whose quest for unity has become a victim of its own success. Beginning with the unification of mechanics with heat and electricity with magnetism in the nineteenth century, physics achieved a series of spectacular feats when it combined space with time, special relativity with quantum mechanics and the weak force with electromagnetism. One of the greatest unsolved problems in physics today is the combination of quantum mechanics with general relativity. These unification feats are both great intellectual achievements and noteworthy goals, but they have led many to believe that unification is the only thing that really matters in physics, and perhaps in all of science. They have also led to the belief that fundamental physics is all that is worth studying. The hype generated by the media in fields like cosmology and string theory and the spate of popular books written by scientist-celebrities in these fields have only made matters worse. All this is in spite of the fact that most of the world's physicists don't study fundamental physics in their daily work.
The obsession with unification has led to an ignorance of the diversity of discoveries in physics. In parallel with the age of the unifiers has existed the universe of diversifiers. While the unifiers have been busy proclaiming discoveries from the rooftops, the diversifiers have been quietly building new instruments and cataloging the reach of physics in less fundamental but equally fascinating fields like solid-state physics and biophysics. They have also gathered the important data which allowed the unifiers to ply their trade. Generally speaking, unifiers tend to be part of idea-driven revolutions while diversifiers tend to be part of tool-driven revolutions. The unifiers would never have seen their ideas validated if the diversifiers had not built tools like telescopes, charge-coupled devices and superconducting materials to test the great theories of physics. And yet, just as unification is idolized at the expense of diversification, ideas in physics have also been lionized at the expense of practical tools. We need to praise the tools of physics as much as the diversifiers who build them.
As a chemist I find it easier to appreciate diversity. Examples of molecules like the ones I cited above abound in chemistry. In addition chemistry is too complex to be reduced to a simple set of unifying principles, and most chemical discoveries are still made by scientists looking at special cases rather than those searching for general laws. It's also a great example of a tool-driven revolution, with new instrumental technologies like x-ray diffraction and nuclear magnetic resonance (NMR) completely revolutionizing the science during the twentieth century. There were of course unifiers in chemistry too - the chemists who discovered the general laws of chemical bonding are the most prominent example - but these unifiers have never been elevated to the status seen among physicists. Diversifiers who play in the mud of chemical phenomena and find chemical gems are still more important than ones who might proclaim general theories. There will always be the example of an unusual protein structure, a fleeting molecule whose existence defies our theories, or a new polymer with amazing ductility that will keep chemists occupied. And this will likely be the case for the foreseeable future.
Biology too has seen its share of unifiers and diversifiers. For most of its history biology was the ultimate diversifiers' delight, with intrepid explorers, taxonomists and microbiologists cataloging the wonderful diversity of life around us. When Charles Darwin appeared on the scene he unified this diversity in one stunning fell swoop through his theory of evolution by natural selection. The twentieth-century modern synthesis of biology that married statistics, genetics and evolutionary biology was also a great feat of unification. And yet biology continues to be a haven for diversifiers. There is always the odd protein, the odd gene sequence or the odd insect with a particularly startling method of reproduction that catches the eye of biologists. These examples of unusual natural phenomena do not defy the unifying principles, but they do illustrate the sheer diversity in which the unifying principles can manifest themselves, especially on multiple emergent levels. They assure us that no matter how much we may unify biology, there will always be a place for diversifiers in it.
At the dawn of the twenty-first century there is again a need for diversifiers, especially in new fields like neuroscience and paleontology. We need to cast off the spell of fundamental physics and realize that diversifiers play on the same field as unifiers. Unifiers may come up with important ideas, but diversifiers are the ones who test them and who open up new corners of the universe for unifiers to ponder. Whether in chemistry or physics, evolutionary biology or psychology, we should continue to appreciate unity in diversity and diversity in unity. Together the two will advance science into new realms.

If you believe Western Civilization is oppressive, you will ensure it is oppressive

Philosopher John Locke's defense of the natural rights of man should apply to all people, not just to one's favorite factions
This is my third monthly column for the website 3 Quarks Daily. In it I lament what I see as an attack on Enlightenment values and Western Civilization from both the right and the left. I am especially concerned by the prevalent narrative on the left that considers Western Civilization as fundamentally oppressive, especially since the left could be the only thing standing between civilization and chaos at the moment. Both right and left thus need to reach back into their roots as stewards of Enlightenment values.

When the British left India in 1947, they left a complicated legacy behind. On one hand, Indians had suffered tremendously under oppressive British rule for more than 250 years. On the other hand, India was fortunate to have been ruled by the British rather than the Germans, Spanish or Japanese. The British, with all their flaws, did not resort to putting large numbers of people in concentration camps or regularly subjecting them to the Inquisition. Their behavior in India bore scant resemblance to the behavior of the Germans in Namibia or the Japanese in Manchuria.
More importantly, while they were crisscrossing the world with their imperial ambitions, the British were also steeping the world in their long history of the English language, of science and the Industrial Revolution and of parliamentary democracy. When they left India, they left this legacy behind. The wise leaders of India who led the Indian freedom struggle - men like Jawaharlal Nehru, Mahatma Gandhi and B. R. Ambedkar - understood well the important role that all things British had played in the world, even as they agitated and went to jail to free themselves of British rule. Many of them were educated at Western universities like London, Cambridge and Columbia. They hated British colonialism, but they did not hate the British; once the former rulers left they preserved many aspects of their legacy, including the civil service, the great network of railways spread across the subcontinent and the English language. They incorporated British thought and values in their constitution, in their educational institutions, in their research laboratories and in their government services. Imagine what India would have been like today had Nehru and Ambedkar dismantled the civil service, banned the English language, gone back to using bullock carts and refused to adopt a system of participatory democracy, simply because all these things were British in origin.
The leaders of newly independent India thus had the immense good sense to separate the oppressor and his instruments of oppression from his enlightened side, to not throw out the baby with the bathwater. Nor was an appreciation of Western values limited to India by any means. In the early days, when the United States had not yet embarked on its foolish, paranoid misadventures in Southeast Asia, Ho Chi Minh looked toward the American Declaration of Independence as a blueprint for a free Vietnam. At the end of World War 1 he held the United States in great regard and tried to get an audience with Woodrow Wilson at the Versailles Conference. It was only when he realized that the Americans would join forces with the occupying French in keeping Vietnam a colonial possession that his views about the U.S. rightly soured. In other places in Southeast Asia and Africa too, the formerly oppressed preserved many remnants of the oppressor's culture.
Yet today I see many, ironically in the West, not understanding the wisdom which these leaders in the East understood very well. The values bequeathed by Britain which India upheld were part of the values which the Enlightenment bequeathed to the world. These values in turn went back to key elements of Western Civilization, including Greek, Roman, Byzantine, French, German and Dutch. And simply put, Enlightenment values and Western Civilization are today under attack, in many ways from those who claim to stand by them. Both left and right are trampling on them in ways that are misleading and dangerous. They threaten to undermine centuries worth of progress.
The central tenets of Enlightenment values should be common knowledge, and yet the fact that they seem worth reiterating is a sign of our times.
To wit, consider the following almost axiomatic statements:
Freedom of speech, religion and the press is all-important and absolute.
The individual and his property have certain natural and inalienable rights.
Truth, whatever it is, is not to be found in religious texts.
Kings and religious rulers cannot rule by fiat and are constrained by the wishes of the governed.
The world can be deciphered by rationally thinking about it.
All individuals deserve fair trials by jury and are not to be subjected to cruel punishment.
The importance of these ideas cannot be overstated. When they were first introduced they were novel and revolutionary; we now take them for granted, perhaps too much for granted. They are in large part what allow us to distinguish ourselves as human beings, as members of the unique creative species called Homo sapiens.
The Enlightenment reached its zenith in mid-eighteenth century France, Holland and England, but its roots go back deep into the history of Western Civilization. As far back as ancient Babylon, the code of Hammurabi laid out principles of justice describing proportionate retaliation for crimes. The peak of enlightened thought before the French Enlightenment was in Periclean Athens. With Socrates, Plato and Aristotle, Athens led the way in philosophy and science, in history and drama; in some sense, almost every contemporary political and social problem and its resolution goes back to the Greeks. Even when others superseded Greek and Roman civilization, traces of enlightened thought kept on appearing throughout Europe, even in its dark ages. For instance, the Code of the Emperor Justinian laid out many key judicial principles that we take for granted, including a right to a fair trial, a right against self-incrimination and a proscription against trying someone twice for the same crime.
In 1215, the Magna Carta became the first modern document to codify the arguments against the divine authority of kings. Even as wars and revolutions engulfed Europe during the next five hundred years, principles like government through the consent of the governed, trial by jury and the prohibition of cruel and unusual punishment got solidified through trial and error, through resistance and triumph. They saw their culmination in the English and American wars of independence and the constitutions of these countries in the seventeenth and eighteenth centuries. By the mid-eighteenth century, philosophers had made these principles explicit: John Locke had written of the natural rights of men, and Charles-Louis Montesquieu of the tripartite separation of powers in government. These principles are today the bedrock of most democratic republics around the world, Western and Eastern. At the same time, let us acknowledge that Eastern ideas and thinkers – Buddha and Confucius in particular – have also contributed immensely to humanity's progress and will continue to do so. In fact, personally I believe that the concepts of self-control, detachment and moderation that the East has given us will, in the final analysis, supersede everything else. However, most of these ideas are personal and inward looking. They are also very hard to live up to for most mortals, and for one reason or another have not yet integrated themselves thoroughly into our modern ways of life. Thus, there is little doubt that modern liberal democracies as they stand today, both in the West and the East, are mostly products of Western Civilizational notions.
In many ways, the study of Western Civilization is therefore either a study of Enlightenment values or of forces – mainly religious ones – aligned against them. It shows a steady march of the humanist zeitgeist through dark periods which challenged the supremacy of these values, and of bright ones which reaffirmed them. One would think that a celebration of this progress would be beyond dispute. And yet what we see today is an attack on the essential triumphs of Western Civilization from both left and right.
Each side brings its own brand of hostility and hypocrisy to bear on the issue. As the left rightly keeps pointing out, the right often seems to forget about the great mass of humanity that was not only cast onto the sidelines but actively oppressed and enslaved, even as freedom and individual rights seemed to be taking root elsewhere for a select few. In the 17th and 18th centuries, as England, America and France were freeing themselves from monarchy and the divine rights of kings, they were actively plunging millions of men and women in Africa, India and other parts of the world into bondage and slavery and pillaging their nations. The plight of slaves transported to the English colonies under inhuman conditions was appalling, and so was the hypocrisy of thinkers like Thomas Jefferson and George Washington, who wrote that all men are created equal while simultaneously keeping them unequal. Anyone who denies the essential hypocrisy of such liberal leaders in selectively promulgating their values would be deceiving themselves and others.
Even later, as individual rights became more and more codified into constitutions and official documents, they remained confined to a minority, largely excluding people of color, indigenous people, women and poor white men from their purview. This hadn't been too different even in the crucible of democracy, Periclean Athens, where voting and democratic membership were restricted to landowning men. It was only in the late twentieth century - more than two hundred years after the Enlightenment - that these rights were fully extended to all. That's an awfully long time for what we consider basic freedoms to seep into every stratum of society. But we aren't there yet. Even today, the right often denies the systemic oppression of people of color and likes to pretend that all is well when it comes to equality before the law; in reality, when it comes to debilitating life events like police stops and searches, prison incarceration and health emergencies, minorities, women and the poor are disproportionately affected. The right will seldom agree with these facts, but mention crime or dependence on welfare and the right is more than happy to generalize its accusations to all minorities or illegal immigrants.
The election of Donald Trump has given voice to ugly elements of racism and xenophobia in the U.S., and there is little doubt that these elements are mostly concentrated on the right. Even if many right-wingers are decent people who don't subscribe to these views, they also don't seem to be doing much to actively oppose them. Nor are they actively opposing the right's many direct assaults on the environment and natural resources, assaults that may constitute the one political action whose crippling effects are irreversible. Meanwhile, the faux patriotism on the far right that worships men like Jefferson and Woodrow Wilson while ignoring their flaws, and that regurgitates catchy slogans drafted by Benjamin Franklin and others during the American Revolution, conveniently sweeps the oppressive and hypocritical behavior of these men under the rug. Add to this a perverse miscasting of individual and states' rights, and you end up with people celebrating the Confederate Flag and Jefferson Davis.
If criticizing this hypocrisy and denial of the great inequities in this country's past and present were all that the left was doing, then all would be well and good. Unfortunately the left has itself started behaving in ways that aren't just equally bad but possibly worse in light of the essential function it needs to serve in a liberal society. Let's first remember that the left is the political faction that claims to uphold individual rights and freedom of speech. But especially in the United States during the last few years, the left has instead become obsessed with playing identity politics, and both individual rights and free speech have become badly mangled victims of this obsession. For the left, individual rights and freedom of speech are important only insofar as they apply to its favorite political groups, most notably minorities and women. For the extreme left in particular, there is no merit to individual opinion anymore unless it is seen through the lens of the group the individual belongs to. Nobody denies that membership in a group shapes individual views, but the left believes those views basically have no independent existence; this is an active rejection of John Locke's primacy of the individual as the most important unit of society. The left has also decided that some opinions – even if they may state facts or provoke interesting discussion – are so offensive that they must be censored, if not by official government fiat, then by mass protest and outrage that verges on bullying. Needless to say, social media - with its echo chambers and the false sense of reassurance engendered by surrounding yourself with people who think just like you - has greatly amplified this regressive behavior.
As is painfully familiar by now, this authoritarian behavior is playing out especially on college campuses, with a new case of "liberal" students bullying or banning conservative speakers emerging almost every week. Universities are supposed to be the one place in the world where speech of all kinds is not just explicitly allowed but encouraged, but you would not see this critical function fulfilled on many college campuses today. Add to this the Orwellian construct of "microaggressions" - which essentially lets anyone decide whether an action, piece of speech or event is an affront to their favorite oppressed political group - and you have a case of full-blown unofficial censorship based purely on personal whims that stifles any kind of disagreement. It is censorship that squarely attacks freedom of speech as espoused by Voltaire, Locke, Adams and others. As Voltaire's biographer Evelyn Hall - a woman living in Victorian times - famously summarized his philosophy, "I disapprove of what you say, but I will defend to the death your right to say it." It seems that a woman in Victorian times - a society that was decidedly oppressive to women - had more wisdom about defending freedom of speech than a young American liberal in the twenty-first century.
This behavior threatens to undermine and tear apart the very progressive forces the left claims to believe in. Notably, its so-called embrace of individual rights and diversity often seems to exclude white people, and white men in particular. The same people who claim to be champions of individual rights assert that all white men are "privileged", have too many rights, are trampling on others' rights and do not deserve more. The writer Edward Luce, who has just written a book warning about the decline of progressive values in America, talks about how, at the Democratic National Convention leading up to the 2016 U.S. election, he saw pretty much every "diversity" box checked except the one belonging to white working class people; it was almost as if the Democrats wanted to intentionally exclude this group. For many on the left, diversity equates only to ethnic and gender diversity; any other kind of diversity, and especially intellectual or viewpoint diversity, is to be either ignored or actively condemned. This attitude is entirely contrary to the free exchange of ideas and respect for diverse opinions that was the hallmark of Enlightenment thinking.
The claim that white men have enough rights and are being oppressive is factually contradicted by the plight of millions of poor whites who are having as miserable a time as any oppressed minority. They have lost their jobs and their health insurance, they have been sold a pipe dream full of empty promises by all political parties, and in addition they find themselves mired in racist and ignorant stereotypes. The privilege the left keeps drumming on about is real, but it is also context-dependent; it can rise and ebb with time and circumstance. To illustrate with just one example, a black man in San Francisco will enjoy certain financial and social privileges that a white man in rural Ohio quite certainly won't: how then can one generalize notions of privilege to all white men, and especially to those who have been dealt a bad hand? The white working class has thus found itself with almost no friends; rich white people have both Democrats and Republicans, rich and poor minorities largely have Democrats, but poor whites have no one and are being constantly demonized. No wonder they voted for Donald Trump out of desperation; he at least pretended to be their friend, while the others did not even put on a pretense. The animosity among white working class people is thus understandable, and it is documented in many enlightening books, especially Arlie Hochschild's "Strangers in Their Own Land". Even Noam Chomsky, who cannot remotely be accused of being a conservative, has sympathized with their situation and justifiable resentment. And as Chomsky says, the problem is compounded by the fact that not everyone on the left actually cares about poor minorities, since the Democratic party which they support has largely turned into a party of moneyed neoliberal white elites in the last two decades.
This singling out of favorite political groups at the expense of other oppressed ones is identity politics at its most pernicious, and it's not just hypocritical but destructive; the counter-response to selective oppression cannot also be selective oppression. As Gandhi said, an eye for an eye makes the whole world blind. And this kind of favoritism steeped in identity politics is again at odds with John Locke's idea of putting the individual front and center. Locke was a creature of his times, so just like Jefferson he did not actively espouse individual freedom for indigenous people, but his idealization of the individual as the bearer of natural rights was clear and critical. For hundreds of years that individual was mostly white, but the response to that asymmetry cannot simply be to pick an individual of another skin color.
The general response on the left to the sins of Western Civilization and white men has been to consider the whole edifice of Western Civilization as fundamentally oppressive. In some sense this is not surprising, since for many years the history of Western Civilization was written by the victors: by white men. A strong counter-narrative emerged with books like Howard Zinn's "A People's History of the United States"; since then many others have followed suit, contributing valuable, essential perspectives from the other side. Important contributions to civilizational ideas from the East have also received their due. But the solution is not to swing to the other extreme and dismiss everything that white men in the West did, or to focus only on their sins, especially as viewed through the lens of our own times. That would be a classic case of throwing out the baby with the bathwater, and exactly the kind of regressive thinking that the leaders of India avoided when they overthrew the British.
Yes, there are many elements of Western Civilization that were undoubtedly oppressive, but the paradox is that Western Civilization and white men also simultaneously crafted many ideas and values that were gloriously progressive; ideas that could serve to guide humanity toward a better future and are applicable to all people in all times. And these ideas came from the same white men who also brought us colonialism, oppression of women and slavery. If that seems self-contradictory or inconvenient, it only confirms Walt Whitman's famous admission: "Do I contradict myself? Very well, then I contradict myself. I am large, I contain multitudes." We can celebrate Winston Churchill's wartime leadership and oratory while condemning his horrific role in one of India's worst famines. We can celebrate Jefferson's plea for separation of church and state and his embrace of science while condemning his treatment of slaves; but if you want to dismantle statues of him or James Madison from public venues, then you are effectively denouncing both the slave-owning practices and the Enlightenment values of these founding fathers.
Consider one of the best-known Enlightenment passages, the beginning of the Declaration of Independence as enshrined in Jefferson's soaring words: "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty and the pursuit of happiness." It is easy to dismiss the slave-owning Jefferson as a hypocrite for writing these words, but their immortal essence was captured well by Abraham Lincoln when he recognized the young Virginian's genius in crafting them:
"All honor to Jefferson--to the man who, in the concrete pressure of a struggle for national independence by a single people, had the coolness, forecast, and capacity to introduce into a merely revolutionary document, an abstract truth, applicable to all men and all times, and so to embalm it there, that to-day, and in all coming days, it shall be a rebuke and a stumbling-block to the very harbingers of re-appearing tyranny and oppression."
Thus, Lincoln clearly recognized that whatever his flaws, Jefferson intended his words to apply not just to white people or black people or women or men, but to everyone besieged by oppression or tyranny in all times. Like a potent mathematical theorem, the abstract, universal applicability of Jefferson's words made them immortal. In light of this great contribution, Jefferson's hypocrisy in owning slaves, while unfortunate and deserving condemnation, cannot be held up as a mirror against his entire character and legacy.
In its blanket condemnation of dead white men like Jefferson, the left also fails to appreciate what is perhaps one of the most marvelous paradoxes of history. It was precisely words like these, written and codified by Jefferson, Madison and others in the American Constitution, that gradually allowed slaves, women and minorities to become full, voting citizens of the American Republic. Yes, the road was long and bloody, and yes, we aren't even there yet, but as Martin Luther King memorably put it, the arc of the moral universe definitely bent toward justice in the long term. The left ironically forgets that the same people whom it rails against also created the instruments of democracy and freedom that put the levers of power into the hands of Americans of all colors and genders. There is no doubt that this triumph was made possible by the ceaseless struggles of traditionally oppressed groups, but it was also made possible by a constitution written exclusively by white men who oppressed others: Whitman's multitudinous contradictions in play again.
Along with individual rights, a major triumph of Western Civilization and the Enlightenment has been to place science, reason, facts and observations front and center. In one sense, in fact, the entire history of Western Civilization can be seen as a struggle between reason and faith. This belief in science as a beacon of progress was enshrined in the Royal Society's motto extolling skepticism: "Nullius in verba", or "Take nobody's word for it". Being skeptical about kings' divine rights or about truth as revealed in religious texts was a profound, revolutionary and counterintuitive idea at the time. Enlightenment values ask us to bring only the most ruthless skepticism to bear on truth-seeking, and to let the facts lead us where they will. Science is the best tool for ridding us of our prejudices, but it never promises that its truths will be psychologically comforting or conform to our preconceived social and political beliefs. In fact, if science does not periodically make us uncomfortable about our beliefs and our place in the universe, we are almost certainly doing it wrong.
Sadly, the left and right have both played fast and loose with this critical Enlightenment value. Each side looks to science and cherry-picks facts that confirm its social and political beliefs; each side then surrounds itself with people who believe what they do and denounces the other side as immoral or moronic. For instance, the right rejects factual data on climate change because it's contrary to its political beliefs, while the left rejects data on gender or racial differences because it's contrary to its own. The religious right rejects scientific evidence that contradicts scripture, while parts of the left reject the evidence on vaccine safety. Meanwhile, each side embraces with missionary zeal the data that the other has rejected, because it supports its social agenda. Data on other social or religious issues is similarly met with rebuke and rejection. The right does not want to have a reasonable discussion on socialism, while the left does not want to have a reasonable discussion on immigration or Islam. The right often fails to see the immense contribution of immigration to this country's place in the world, while the left often regards any discussion even touching on reasonable limits to immigration as xenophobia and nativism.
The greatest tragedy of this willful blindness is that where angels fear to tread, fools and demagogues willingly step in. For instance, the left's constant refusal to engage in an honest and reasonable critique of Islam, and its branding of those who wish to do so as Islamophobes, discourages level-headed people from entering that arena, thus paving the way for bonafide Islamophobes and white supremacists. Meanwhile, the right's refusal to accept even reasonable evidence for climate change opens the field to those who treat global warming as a secular religion with special punishments for heretics. Both sides lose, but what really loses here is the cause of truth. Since truth has already become a casualty in this era of fake news and exaggerated polemics on social media, this refusal on both sides to accept facts that are incompatible with their psychological biases will surely sound the death knell for science and rationality. Then, as Carl Sagan memorably put it, unable to distinguish between what is true and what feels good, clutching our crystals and nervously consulting our horoscopes, we will gradually slide, almost without noticing, back into superstition and darkness.
We need to resurrect the cause of Enlightenment values and Western Civilization, the values espoused by Jefferson, Locke and Hume, by Philadelphia, London and Athens. The fact that flawed white men largely created them should have nothing to do with their enthusiastic acceptance and propagation, since their essential, abstract, timeless qualities have nothing to do with the color of the skin of those who thought of them; rejecting them because of the biases of their creators would be, at the very least, replacing one set of biases with another.
One way of appreciating these values is to actually resurrect them, with all their glories and faults, in college courses, because college is where the mind truly forms. In the last 40 years or so, the number of colleges that include Western Civilization as a required course in their curriculum has shrunk significantly. Emphasis is put instead on world history. It is highly rewarding to expose students to world history, but surely there is space to include a capsule history of the fundamental principles of Western Civilization as a required component of these curricula. Another strategy is to use the power of social media in a constructive manner, to use the great reach of the Internet to bring together people who are passionate about these ideals and who care about their preservation and transmission.
This piece may seem to dwell more on the failures of the left than the right. For me the reason is simple: Donald Trump's election in the United States, along with the rise of similar authoritarian right-wing leaders in other countries, convinces me that at least for the foreseeable future, we won't be able to depend on the right to safeguard these values. Over the last few decades, conservative parties around the world, and especially the Republican party in the United States, have made clear their intention to retreat from the shores of science, reason and moderation. That does not mean that nobody on the right cares about these ideals, but it does mean that for now, the left will largely have to fill the void. In fact, by stepping up the left will in one sense simply be fulfilling the ideals enshrined by many of its heroes, including Franklin Roosevelt, Rosa Parks, Susan B. Anthony and John F. Kennedy. Conservatives in turn will again have to become the party of Abraham Lincoln and Dwight Eisenhower if they want to sustain democratic ideals, but they seem light years away from that right now. If both sides fail, libertarians will have to step in, but unfortunately libertarians comprise a minority of politically effective citizens. At one point, libertarians and liberals were united in sharing the values of individual rights, free speech, rational enlightenment and a fearless search for truth, but the left seems to have sadly ceded that ground in the last few years. Its view of Western Civilization has become not only one-sided but also fundamentally pessimistic and dangerous.
Here are the fatal implications of that view: If you think Western Civilization is essentially oppressive, then you will always see it as oppressive. You will always see only the wretchedness in it. You will end up focusing only on its evils and not its great triumphs. You will constantly see darkness where you should see light. And once you relinquish stewardship of Western Civilization, there may be nobody left to stand up for liberal democracy, for science and reason, for all that is good and great that we take for granted.
You will then not just see darkness but ensure it. Surely none of us want that.

Why the world needs more Leo Szilards

The body of men and women who built the atomic bomb was vast, diverse and talented. Every conceivable kind of professional - from theoretical physicist to plumber - worked on the Manhattan Project for three years, an enterprise that spread across the country and equaled the US automobile industry in its marshaling of resources like metals and electricity.
The project may have been the product of this sprawling hive mind, but one man saw both the essence and the implications of the bomb, in both science and politics, long before anyone else. Stepping off the curb at a traffic light across from the British Museum in London in 1933, Leo Szilard saw the true nature and consequences of the nuclear chain reaction six years before reality breathed heft and energy into its abstract soul. In one sense, though, this remarkable propensity for seeing into the future was business as usual for the Hungarian scientist. Born into a Europe that would soon crumble under the onslaughts of fascism even as it was being elevated by revolutionary discoveries in science, Szilard grasped early in his youth both the reality of a world split apart by totalitarian regimes and the necessity of international cooperation engendered by humankind's rapidly developing ability to destroy itself with science. During his later years Szilard once told an audience, "Physics and politics were my two great interests". Throughout his life he would try to forge the essential partnership between the two that he thought was necessary to save the human species from annihilation.
A few years ago, Bill Lanouette brought out a new, revised edition of his authoritative, sensitive and sparkling biography of Szilard. It is essential reading for those who want to understand the nature of science, both as an abstract flight into the deep secrets of nature and as a practical tool that can be wielded for humanity's salvation or destruction. As I read the book and pondered Szilard's life, I realized that the twentieth-century Hungarian would have been right at home in the twenty-first. More than anything else, what makes Szilard remarkable is how prophetically his visions have played out since his death in 1964, all the way to the year 2014. But Szilard was also the quintessential example of a multifaceted individual. If you look at the essential events of the man's life you can see several Szilards, each of whom holds great relevance for the modern world.
There's of course Leo Szilard the brilliant physicist. Where he came from, precocious ability was commonplace. Szilard belonged to the crop of Hungarian-born scientists known as the "Martians" - men whose intellectual powers were off the scale - who played key roles in European and American science during the mid-twentieth century. On a strict scientific basis Szilard was perhaps not as accomplished as his fellow Martians John von Neumann and Eugene Wigner, but that is probably because he found a higher calling in his life. He certainly did not lack originality, however. As a graduate student in Berlin - where he hobnobbed with the likes of Einstein and von Laue - Szilard came up with a novel way to unify the microscopic and macroscopic descriptions of heat, now called statistical mechanics and thermodynamics. He also wrote a paper connecting entropy and energy to information, predating Claude Shannon's seminal creation of information theory by two decades. In another prescient paper he set forth the principle of the cyclotron, a device which was to secure a Nobel Prize for its recognized inventor - the physicist Ernest Lawrence - more than a decade later.
Later in the 1930s, after campaigning on behalf of expelled Jewish scientists and seeing visions of neutrons branching out and releasing prodigious amounts of energy, Szilard performed some of the earliest experiments in the United States demonstrating fission. And while he famously disdained getting his hands dirty, he played a key role in helping Enrico Fermi set up the world's first nuclear reactor.
Szilard as scientist also drives home the importance of interdisciplinary research, a fact which hardly needs explication in today's scientific world, where researchers routinely team up with colleagues from other fields and cross disciplinary boundaries with impunity. After the war Szilard became truly interdisciplinary himself when he left physics for biology and inspired some of the earliest founders of molecular biology, including Jacques Monod, James Watson and Max Delbruck. His reason for switching should be taken to heart by young researchers: he said that while physics was a relatively mature science, biology was a young science where even the low-hanging fruit was ripe for the picking.
Szilard was not only a notable theoretical scientist; he also had another strong streak, one which has helped so many scientists put their supposedly rarefied knowledge to practical use - that of scientific entrepreneur. His early training had been in chemical engineering, and during his days in Berlin he famously patented an electromagnetic refrigerator with his friend and colleague Albert Einstein; by alerting Einstein to the tragic accidents caused by leakage in mechanical refrigerators, he helped the technically savvy former patent clerk put his knowledge of engineering to good use (as another indication of how underappreciated Szilard remains, the Wikipedia entry on the device is titled "Einstein refrigerator"). Szilard was also finely attuned to the patent system, filing a patent for the nuclear chain reaction with the British Admiralty in 1934, before anyone had an inkling of what element would make it work, as well as a later patent for a nuclear reactor with Fermi.
He also excelled at what we today call networking. His networking skills were on full display, for instance, when he secured rare, impurity-free graphite from a commercial supplier as a moderator for Fermi's nuclear reactor; in fact, the failure of German scientists to secure such pure graphite, and the subsequent inability of their contaminated graphite to sustain fission, damaged their belief in the viability of a chain reaction and held them back. Szilard's networking abilities were also evident in his connections with prominent financiers and bankers whom he constantly tried to conscript in support of his scientific and political adventures; in attaining his goals he would not hesitate to write any letter, ring any doorbell, ask for any amount of money, travel to any land and generally use all means at his disposal to secure support from the right authorities. In his case the "right authorities" ranged, at various times in his life, from top scientists to bankers to a Secretary of State (James Byrnes), a President of the United States (FDR) and a Premier of the Soviet Union (Nikita Khrushchev).
I am convinced that had Szilard been alive today, his ability to jump across disciplinary boundaries, his taste for exploiting the practical benefits of his knowledge and his savvy public relations skills would have made him feel as much at home in the world of Boston or San Francisco venture capital as in the ivory tower.
If Szilard had accomplished his scientific milestones and nothing more, he would already have been a notable name in twentieth-century science. But more than almost any other scientist of his time, Szilard was also imbued from an early age with an intense desire to engage himself politically - to "save the world", as he put it. Among the scientists of his time, probably only Niels Bohr came close to exhibiting the same kind of genuine and passionate concern for the social consequences of science that Szilard did. This was Leo Szilard the political activist. Even in his teens, before the Great War had broken out, he could see how the geopolitical landscape of Europe would change, how Russia would "lose" even if it won the war. When Hitler came to power in 1933 and others were not yet taking him seriously, Szilard was one of the few scientists who foresaw the horrific legacy that this madman would bequeath Europe. This realization was what prompted him to help Jewish scientists find jobs in the UK, at about the same time that he had his prophetic vision at the traffic light.
It was during the war that Szilard's striking role as conscientious political advocate became clear. He famously alerted Einstein to the implications of fission - at this point (July 1939) Szilard and his fellow Hungarian expatriates were probably the only scientists who clearly saw the danger - and helped Einstein draft the now iconic letter to President Roosevelt. Einstein's name remains attached to the letter while Szilard's is often sidelined; a recent article about the letter from the Institute for Advanced Study on my Facebook feed mentioned the former but not the latter. Without Szilard the bomb would certainly have been built, but the letter may never have been written and the beginnings of fission research in the US may have been delayed. When he was invited to join the Manhattan Project, Szilard snubbed the invitation, declaring that anyone who went to Los Alamos would go crazy. He did remain connected to the project through the Met Lab in Chicago, however. In the process he drove Manhattan Project security up the wall through his rejection of compartmentalization; throughout his life Szilard had been - in the words of the biologist Jacques Monod - "as generous with his ideas as a Maori chief with his wives", and he favored open and honest scientific inquiry. At one point General Groves, the head of the project, even wrote a letter to Secretary of War Henry Stimson asking him to consider incarcerating Szilard; Stimson, a wise and humane man - he later took the ancient and sacred city of Kyoto off Groves's atomic bomb target list - refused.
Szilard's day in the sun came when he circulated a petition, directed toward the president and signed by 70 scientists, advocating a demonstration of the bomb to the Japanese and an attempt at cooperation with the Soviets in the field of atomic energy. This was activist Leo Szilard at his best. Groves was livid, Oppenheimer - who by now had tasted power and was an establishment man - was deeply hesitant, and the petition was stashed away in a safe until after the war. Szilard's disappointment that his advice was not heeded turned to even bigger concern after the war, when he witnessed the arms race between the two superpowers. In 1949 he wrote a remarkable fictitious story titled 'My Trial As A War Criminal' in which he imagined what would have happened had the United States lost the war to the Soviets; Szilard's point was that in participating in the creation of nuclear weapons, American scientists were no less and no more complicit than their Russian counterparts. Szilard's take on the matter raised valuable questions about the moral responsibility of scientists, an issue that we are grappling with even today. The story played a small part in inspiring the Soviet physicist Andrei Sakharov in his campaign for nuclear disarmament. Szilard also helped organize the Pugwash Conferences for disarmament, gave talks around the world on nuclear weapons, and met with Nikita Khrushchev in Manhattan in 1960; the results of this amiable meeting were the gift of a Schick razor to Khrushchev and, more importantly, Khrushchev's agreement with Szilard's suggestion that a telephone hotline be installed between Moscow and Washington for emergencies. The significance of this hotline was acutely highlighted by the 1962 Cuban missile crisis. Sadly, Szilard's two later attempts at meeting with Khrushchev failed.
After playing a key role in the founding of the Salk Institute in California, Szilard died peacefully in his sleep in 1964, hoping that the genie whose face he had seen at the traffic light in 1933 would treat human beings with kindness.
Since Szilard the common and deep roots that underlie the tree of science and politics have become far clearer. Today we need scientists like Szilard to stand up for science every time a scientific issue such as climate change or evolution collides with politics. When Szilard pushed scientists to get involved in politics it may have looked like an anomaly, but today we are struggling with very similar issues. As in many of his other actions, Szilard's motto for the interaction of science with politics was one of accommodation. He was always an ardent believer in the common goals that human beings seek, irrespective of the divergent beliefs that they may hold. He was also an exemplar of combining thought with action, projecting an ideal meld of the idealist and the realist. Whether he was balancing thermodynamic thoughts with refrigeration concerns or following up political idealism with letters to prominent politicians, he taught us all how to both think and do. As interdisciplinary scientist, as astute technological inventor, as conscientious political activist, as a troublemaker of the best kind, Leo Szilard leaves us with an outstanding role model and an enduring legacy. It is up to us to fill his shoes.

October, 1949: Oppenheimer is on the cover of LIFE, and cigarettes are still cool

Back in the good old days of the late 1940s, with the age of innocence still writ large on this country's lifeline, LIFE magazine was a microcosm of American life, a weekly staple that brought the leading lights and events of the country into the living rooms of the middle class. When readers received the October, 1949 issue in their mail they found the godlike face of American science and technology gazing beneficently at them from the cover. Thanks to the substantial capabilities of eBay I was able to retrieve a copy.

J. Robert Oppenheimer had already become a household name because of his leadership of the atomic bomb project, and now he seemed to have outdone himself by becoming the director of the Institute for Advanced Study in Princeton, effectively making himself the boss of Albert Einstein, John von Neumann and Kurt Gödel. The 1949 issue paints a picture of Oppenheimer as the quintessential polymath genius and new frontiersman, with a healthy contribution from Oppenheimer the Family Man making the picture complete. There are also other goodies in the installment, with a cheerful smattering of old-fashioned 1940s sexism advertising household products for men and their doting wives. And yes, the biggest concern about cigarettes is throat irritation, a myth reassuringly dismissed by Camel.

The good old times.

First, the father of the atomic bomb inspiring readers with his steel-blue eyes, thoughtful gaze and ever-present cigarette.

Oppenheimer was regarded as the quintessential intellectual plumbing the intricate depths of physics. His signature porkpie hat, cigarette, and dazzling mastery of topics as far-flung as French poetry and Sanskrit literature made him the poster boy for the rarefied American intellectual, a species which until then had largely seemed endemic to Europe.

Daddy's Home!: The glowing, breathless profile packed with quotes from Oppie painted Oppenheimer as that rare combination of ivory tower genius and everyman with a great family life who enjoyed romping around with his kids when he returned from work. Reality was different: his wife Kitty was given to bouts of heavy drinking even during the day, and she could be a very unpleasant person in personal interactions. His children Toni and Peter lived in the shadow of their often acerbic and absent father, and both their lives were marked by tragedy: Toni committed suicide after her parents' deaths, and Peter Oppenheimer is a recluse who very rarely talks about his father.

But enough about Oppie. Nothing says class and poise better than Van Heusen shirts for the modern American man, worn especially when his wife lovingly bathes him.

Finally, a piece of good news to divert readers' minds from all that heavy mathematical physics and philosophizing. Camel cigarettes don't cause any throat irritation! (only lung cancer).

On John F. Kennedy's 100th birthday: Let us begin

It’s Bostonian John F. Kennedy’s 100th birthday today. Kennedy largely remains a hero on both sides of the aisle; for his moderate liberalism, for his passion for civil rights of minorities, for his tough yet cautious stance against the Soviet Union, for his calls to public service, and most importantly, for the unflagging optimism and positive vision for the future of America which he embodied. More than any other presidency in the last forty years, and in stark contrast to now, his tenure inspired Americans to believe that they were unified, and that their best days lay ahead of them. There have been few times in the history of this country when that optimism has been more sorely needed.

JFK was known for many things, but one of the enduring hallmarks of his legacy has been the words in his speeches. Among all presidents he was one of the most eloquent, and after him only Barack Obama has come close to displaying the same fluency of language. There are many speeches of JFK's that are worth reading and remembering, but one that truly stands out is a speech delivered on June 10, 1963, in which he made an impassioned plea for peace. The speech, given at American University in Washington D.C., was carefully crafted; copies were shown to only a few trusted advisors for comment, and Kennedy's indispensable speechwriter Ted Sorensen worked on it day and night to meet the president's schedule. In his book "To Move the World: JFK's Quest for Peace", the economist Jeffrey Sachs considers this to be Kennedy's most important speech, and I tend to agree.

JFK's dedication to peacemaking shines through in his words. The piece contains one of the most memorable paragraphs that I have seen in any exhortation, political or otherwise. In words that are now famous, Kennedy appealed to our basic connection on this planet as the most powerful argument for worldwide peace:

"So let us not be blind to our differences, but let us also direct attention to our common interests and the means by which those differences can be resolved. And if we cannot end now our differences, at least we can help make the world safe for diversity. For in the final analysis, our most basic common link is that we all inhabit this small planet. We all breathe the same air. We all cherish our children's futures. And we are all mortal."

In a time of deep societal division, this call to finding common ground and building on our common values rather than our differences cannot be overemphasized. Kennedy was also speaking from hard practical experience, against the background of the Cuban Missile Crisis of October 1962 that had brought the world to the edge of nuclear war. Recently declassified documents indicate that the Soviets had more than 150 nuclear weapons in Cuba, and there were many close calls which could have sent the world over the precipice into thermonuclear destruction. For instance, a little-known Soviet naval officer, Vasili Arkhipov, refused to authorize the launch of his submarine's nuclear torpedo even as American ships were dropping dummy depth charges around the submarine. When the crisis was averted everyone thought that it was because of rational men's rational actions, but Kennedy knew better; he and his advisors understood that, helped though they were by their stubborn refusal to give in to the military hardliners' insistence that Cuba be bombed, it was ultimately dumb luck that saved humanity.

Kennedy was thus well aware in 1963 of how quickly and unpredictably war in general, and nuclear war in particular, can spiral out of everyone's hands; two years before, in another well-known speech in front of the United Nations, Kennedy had talked about the ominous and omnipresent sword of Damocles that everyone lives under, "hanging by the slenderest of threads, capable of being cut at any moment by accident, or miscalculation, or by madness". His Soviet counterpart Nikita Khrushchev understood this too, cautioning JFK not to tighten the "knot of war", which would otherwise eventually have to be catastrophically severed. As one consequence of the crisis, a telephone hotline was established between the two countries that would allow their leaders to communicate with each other quickly in an emergency.

Kennedy followed the Peace Speech with one of the signal achievements of his presidency, the signing and ratification of the Partial Test Ban Treaty (PTBT), which banned nuclear tests in the atmosphere, underwater and in space. Sachs describes how Kennedy used all the powers of persuasion at his disposal to convince the Joint Chiefs of Staff, Republican hardliners and Southern Democrats to endorse the treaty, while at the same time striking compromises with them that permitted nuclear testing to continue underground.

How has Kennedy's understanding of the dangers of nuclear war, his commitment to securing peace and his efforts toward nuclear disarmament played out in the fifty years after his tragic and untimely death? On one hand there is much cause for optimism. Kennedy's pessimistic prediction that by 1975 ten or twenty countries would have nuclear weapons has not come true. In fact the PTBT was followed in 1968 by the Nuclear Non-Proliferation Treaty (NPT), which for all its flaws has served as a deterrent to the formation of new nuclear states. Other treaties like SALT, START and most recently New START have drastically reduced the number of nuclear weapons to a fraction of what they were during the heyday of the Cold War; ironically, it is the Republican presidents Ronald Reagan and George H. W. Bush who must be credited with the greatest arms reductions. In addition there are several success stories of countries like South Africa, Sweden, Libya, Brazil and the former Soviet republics giving up nuclear weapons or weapons programs after wisely realizing that they would be better off without them.

Yet there are troubling signs that Kennedy's dream is still very much a dream. Countries like Israel and India, which did not sign the NPT, have acquired nuclear arsenals. North Korea is baring its nuclear teeth, and Iran seems to be meandering, even if not resolutely marching, toward acquiring a bomb. In addition, loose nuclear material, non-state actors and unstable regimes like Pakistan's pose an ever-present challenge that threatens to spiral out of control; the possibility of "accident, or miscalculation, or madness" is very much still with us.

There are also few signs that the United States is going to unilaterally dismantle its nuclear arsenal, in spite of having the most sophisticated and powerful conventional weapons in the world, ones which can hit almost any target anywhere with massive destruction. The US did unilaterally disarm its biological weapons arsenal in the 70s, but nuclear weapons still seem to inspire myths and illusions that cannot be easily dispelled. A factor that's not much discussed but which is the massive elephant in the room is spending on nuclear weapons; depending on which source you look at, the US spends anywhere between 20 and 50 billion dollars every year on the maintenance of its nuclear arsenal, more than it did during the Cold War! Thousands of weapons are still deployment-ready, decades after the Cold War ended.

It goes without saying that this kind of spending is unconscionable, especially when it takes valuable resources away from pressing problems like healthcare and education. Eisenhower, who warned us about the military-industrial complex, lamented exactly this glut of misguided priorities in his own "Chance for Peace" speech in 1953:

"Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed. This world in arms is not spending money alone. It is spending the sweat of its laborers, the genius of its scientists, the hopes of its children. The cost of one modern heavy bomber is this: a modern brick school in more than 30 cities. It is two electric power plants, each serving a town of 60,000 population. It is two fine, fully equipped hospitals. It is some fifty miles of concrete pavement. We pay for a single fighter with a half-million bushels of wheat. We pay for a single destroyer with new homes that could have housed more than 8,000 people. This is not a way of life at all, in any true sense. Under the cloud of threatening war, it is humanity hanging from a cross of iron."

It is of course hard to imagine a conservative politician saying this today, but more tragically, it is disconcerting to find exactly the same problems that Eisenhower and Kennedy pointed out in the 50s and 60s still looming over our future.

In a greater sense too, Kennedy's vision is facing serious challenges. Jeffrey Sachs believes that sustainable development has replaced nuclear weapons as the cardinal problem facing us today, and so far the signs for sustainable development have not been very promising. When it comes to states struggling with poverty, Sachs accurately reminds us that countries like the US often "regard these nations as foreign policy irrelevancies; except when poverty leads to chaos and extremism, in which case they suddenly turn into military or terrorist threats". The usual policy toward such countries is akin to that of a doctor who, instead of preventing a disease, waits until it turns into a full-blown infection and then delivers medication that almost kills the patient without getting rid of the root cause. Sadly, for both parties in this country, drones are a much bigger priority than dams. This has to change.

We are still struggling with the goal laid out by John Kennedy in his Peace Speech, and in our own times his words are as crucial and as desperately needed as they ever were. But Kennedy was also realistic in recognizing that reaching the goal would be a gradual, dogged and piecemeal process. As he put it in the Peace Speech:

"There is no single, simple key to this peace; no grand or magic formula to be adopted by one or two powers. Genuine peace must be the product of many nations, the sum of many acts. It must be dynamic, not static, changing to meet the challenge of each new generation. For peace is a process -- a way of solving problems."

And, as he had made clear in his inaugural address:

"All this will not be finished in the first 100 days. Nor will it be finished in the first 1,000 days, nor in the life of this administration, nor even perhaps in our lifetime on this planet. But let us begin."

Indeed. We do not know how it will end, nor do we even know how it will progress, but we can begin.