Saturday, November 3, 2012

The Riddle of Irreducible Complexity

Is life too complex to have evolved without guidance or design?  Proponents of Intelligent Design (ID) like Michael Behe and Kirk Cameron think so.  They claim that even the simplest bacterium has so many interdependent parts -- like the gears and springs of a watch -- that it could not have sprung into being without the direct input of an intelligent creator.

At first glance, they seem to have a point.  Deleting even a single gene in a developing fly embryo can be lethal to the fly.  Changing just one or two letters out of 3 billion in the human genome can give rise to a variety of deadly diseases.  If living things are so intricately balanced, how could they have evolved gradually, as evolutionary biologists claim?  Wouldn't all of the parts have to spring into existence at the same time?

To answer this, consider the following analogy.  Suppose a hunter-gatherer from the deep Amazon encounters an iPad.  Opening its case to peer at the microelectronics inside, she is puzzled: who could design and build such intricate parts?  Even when she finds out what each piece of silicon and metal is for, she is baffled.  Nobody has small enough hands, or a large enough intellect, to build a microchip from scratch.  They must have had the help of a computer -- but then, who built that one?  She might conclude that the first computer must have been a gift from an ultra-intelligent, and perhaps divine, being.

Of course, our friend is overlooking history.  As she investigates further, she learns that the predecessors of computers did not have microchips, but ran on manpower or steam.  Engineers later capitalized on inventions from electronics and the materials industry, as well as centuries of advances in physics and mathematics.  As computers became more complex, they helped people to design smaller, faster, and more complex computers, as well as the sophisticated microfabrication systems needed to build them.  The first computers helped to improve the hardware, which in turn helped to build faster computers, and so on.

It's an imperfect analogy, of course.  Unlike biology, even primitive computer technology is the product of intelligent designers driven by a set of goals.  But even that gives modern computer engineers too much credit.  The rise of the information age depended on the work of thousands of innovators spread over many centuries.  Their ranks include mathematicians, physicists, and chemists who had no idea that their work would give rise to microprocessors, Google, or Twitter.  In most cases, their goals had nothing to do with building a powerful computer; their work was adapted to that purpose later.

Similarly, biology has a goal: survival.  To get an edge on their competitors, living things capitalize on whatever past innovations they carry in their genes.  They re-purpose molecular bilge pumps for use as tiny propellers.  They transform their ancestors' solar power plants into cameras.  They even copy and build upon entire developmental blueprints from other living things, which is why it's so hard to tell the difference between the early embryos of humans and crocodiles.

Since evolution is blind, it can't predict whether an innovation will be successful.  And it's usually not.  Recall the quote attributed to Thomas Edison: "I have not failed; I've just found 10,000 ways that won't work."  As long as a creature can sacrifice a little time and energy on mistakes -- and every genetic disorder could qualify as a mistake -- it will occasionally hit upon a great idea.  Then comes the payoff.

In terms of our understanding of biology, we are like our fictitious hunter-gatherer: bright and curious, but almost completely ignorant of the past.  To make matters worse, no one was around to document evolution during 99.99% of its history.  If we want to understand how life arose and evolved, we have to take every little miraculous piece of biological machinery and examine it.  What are its genetics?  What evolutionary path could it have taken?  Are there any paths we can rule out?  It takes a long time.

Understanding the natural world is tedious.  But it's infinitely more satisfying than exclaiming "Impossible!" and walking away.

Friday, April 13, 2012

Franklin’s Foibles and Other Closing Thoughts

Having just finished his Compleated Autobiography, I have no doubt that Ben Franklin was in many ways an exceptional man.  Following his successful career as a printer, inventor and scientist, he spent his thirty-odd years of retirement as a public servant.  During the Revolution he secured vital aid from France, which would become the United States’ closest ally in its struggle for independence.  Later, he played a role in setting up the new nation, fighting his chronically painful gout and kidney stones to help draft the U.S. Constitution before he succumbed to the ailments of age.  Socially, he was far ahead of his time, advocating for the abolition of the slave trade and then slavery altogether some eighty years before the Emancipation Proclamation.  In his last will and testament, he forgave the debt of his son-in-law Richard Bache on condition that he free his slave, Bob.

Still, it’s a bit reassuring to learn that even a legend like Franklin had his foibles, at least when viewed from our present vantage point.  In his discussion of Native Americans, he was uncommonly sympathetic, particularly in cases where European settlers at the frontiers maliciously murdered innocent Natives without provocation.  He clearly viewed them as equals of the Europeans from a human rights standpoint.  However, he also possessed much of his contemporaries’ cultural and religious chauvinism, which he used to justify the expansion of Europeans deeper into America, as passages like the following show:
I often wish’d that I were employ’d by the Crown to settle a colony in Ohio, that we could do it effectually and without putting the nation to much expense.  What a glorious thing it would have been to settle in that fine country a large strong body of religious and industrious people!  What a security to the other colonies, and advantage to Britain by increasing her people, territory, strength and commerce.  Might it not have greatly facilitated the introduction of pure religion among the heathen, if we could, by such a colony, show them a better sample of Christians than they commonly saw in our Indian traders, the most vicious and abandoned wretches of our nation?  In such an enterprise I could have spent the remainder of my life with pleasure.
In other writings, he often referred to Native Americans as “the savages.”  My knowledge of 18th century English is limited, so I’m not sure how derogatory “savage” was at the time.  Furthermore, elsewhere Franklin advocated the honest purchase of their territory rather than forcible seizure by war.  However, it’s clear that he had no compunction about encroaching on Native lands for the sake of promulgating his own people’s culture and religion, which he clearly implied to be superior.  It's hard to fault him too gravely for this, considering the cultural milieu in which he was raised; even the sight of giants has its horizons.

On the topic of religion, he was more tolerant than many of his time.  For instance, he referred to his rationalist friend Joseph Priestley as an “honest heretic,” opining that “…’tis his honesty that has brought upon him the character of heretic” and “…I think all heretics I have known have been virtuous men.”  He continually advocated the importance of virtue and the maintenance of good, honest character.  On the other hand, he was not above using religious language in pursuit of carnal fulfillment.  Here is an excerpt from one of his many letters to an object of his affection and lust, Madame Brillon, a married woman young enough to be his daughter:
And now as I am consulting you upon a case of conscience, I will mention the opinion of a certain father of the church, which I find myself willing to adopt, tho’ I am not sure it is orthodox.  It is this, that the most effectual way to get rid of a certain temptation is, as often as it returns, to comply with and satisfy it.  Pray instruct me how far I may venture to practice upon this principle?
This series of exchanges happened after his own wife’s death, and there’s no clear evidence in this autobiography that he had any sexual affairs during his marriage.  However, considering that Franklin spent most of the last twenty years of his marriage overseas, as well as his rumored reputation as a philanderer, it’s conceivable.  Some have accused Franklin of neglecting his wife by traveling so much, though Skousen suggests in the epilogue that Benjamin and Deborah had simply drifted apart in their later years (she repeatedly refused his many offers to take her to England with him in the years prior to the Revolution).  In any case, it would be interesting to hear what Mrs. Franklin had to say about their relationship.

I imagine Franklin’s political enemies and other acquaintances had other negative things to say about him, but his writings convince me that he possessed an advanced moral compass for his time and, on the whole, earnestly strove to do the right thing while remaining cognizant of the limits of his own ability to determine the best course of action.  Sounds like a decent guy to me.

Monday, April 9, 2012

Magnetic Healing and the Importance of Reproducibility, Then and Now

If you're a scientist, reproducibility is the little demon always lurking in your shadow.  At the helm of a series of often delicate experiments and complex analyses, you always wonder what will be discovered by those in your wake.  Did you carry out your study properly?  Were there any variables you didn't account for?  Above all, can an equally competent scientist independently reproduce your results?

In the past couple of years I’ve noticed increasing concern for reproducibility.  It’s an essential part of science, since it helps to cancel out the effects of human fallibility and conflicts of interest.  The more often a research finding is reproduced, the stronger a foundation it forms for future knowledge-building.

The problem is that most scientific journals do not publish straightforward reproducibility studies, with occasional high-profile exceptions -- for example, the apparently erroneous measurements of faster-than-light neutrinos mentioned in last week’s Nature podcast. Clearly, we need a place for reports like these to be routinely filed so that the value of any and every study can be more accurately judged.  Last week's Science podcast mentioned one such initiative in the field of psychology, called PsychFileDrawer.  I think every scientific field needs a repository like this, and the reproducibility studies should be linked to the original articles to minimize the time wasted following dead-end roads.  

As scientists, we need to be our own worst critics, for two main reasons:
  1. People outside our own specialties are not trained to critique our work properly; and
  2. When external audits discover problems, the whole of science suffers a blow to its reputation.
Of course, problems in scientific reproducibility are not a new phenomenon.  In his Compleated Autobiography, Ben Franklin mentions the case of a quack named Franz Mesmer who popularized a form of magnetic therapy based on his “theory” of animal magnetism:
We were much amiss’d in France for the past two years with the pretended new art of healing by what was called Magnetisme Animale.  The professor of this art, Mr. Mesmer, had in a short time made nearly twenty thousand Louis d’ors by teaching and practicing it.  He proposed the presence of a universal animal magnetic fluid that could heal the sick and prevent illness through a simple laying of hands, gestures, and signs… The remedy met with enough success to encourage the hopes of the ill; and even educated people, including doctors and surgeons, admitted to the school of magnetism.
One of Mesmer's devices, a magnetic tub called a baquet.
Franklin and others were curious and a little skeptical, so on orders of the King of France, he and a group of independent experimenters of the Academy of Science carried out their own study of Mesmer’s art of healing:
In our investigation, we tried to detect the presence of the magnetic fluid; but this fluid was imperceptible to all the senses.  The experiments we conducted on ourselves, including blindfolded subjects, caused us to reject it absolutely as a cure of illness.  We concluded that the system of magnetism did not cure anything; that both magnetism and his brilliant theory exist only in the imagination; and that a spectacle such as this seemed to transport us to the age and the reign of the fairies.
Unfortunately, Mesmer and his successors proceeded to swindle many credulous patients after this report was published.  Perhaps Mesmer earnestly believed in the technique.  Maybe his patients derived some comfort from the practice.  In any case, at least the study of the Academy of Science helped prevent magnetic healing from becoming an accepted mainstream medical treatment.

Non-reproducible science is bad science, and bad science is anti-science.  It's better to publish nothing at all than to publish an incorrect result.  As Franklin put it:
…one must not be indifferent to the ill-founded reign of false opinion: the sciences, which grow larger with the truth, have even more to gain by the suppression of an error.  An error is always a spoiled yeast which ferments and eventually corrupts the substance in which it is introduced.  But when this error leaves the empire of the sciences to spread among the multitude, and to divide and agitate minds when it presents a misleading way to heal the sick whom it discourages from looking elsewhere for help, a good government has an interest in destroying it.
Given the explosion of scientific inquiry in our era, we need a more concerted, systematic approach to finding the “spoiled yeast” and removing it before it spreads.  I think these reproducibility repositories are a step in the right direction.

Thursday, April 5, 2012

Giant Tyrannosaur relative discovered... and it's fuzzy!

Chinese paleontologist Xing Xu and his colleagues announced today in Nature that they've uncovered beautifully preserved fossils of three large tyrannosauroids (relatives of the infamous T. rex).  The showstopper: they apparently had feathers.

I'll never see Jurassic Park in the same way again.

"Ian, Freeze!  Stop laughing!"
"Ha ha!  I can't help it.  This Tyrannosaur looks, uh, silly."
(Bottom image cropped from drawing by Brian Choo).

Xu named the new species Yutyrannus huali, a hybrid of Mandarin and Latin meaning "beautiful (huali) feathered (yu) tyrant (tyrannus)." Although smaller than T. rex, the largest of these three dinos is estimated to have weighed about 1.5 tons, making it the largest feathered dinosaur yet discovered -- in fact, about 40 times as large as the previous record-holder, Beipiaosaurus.  The authors opine that other large dinosaurs might have had feathers, but that "the discovery of Y. huali provides solid evidence for the existence of gigantic feathered dinosaurs and, more significantly, of a gigantic species with an extensive feathery covering."

It's not known what the plumage may have been used for, but the authors suggest it may have served as insulation.  There is no fossil evidence that T. rex bore a similar fuzzy coat, but it seems this is still up for debate.  Since T. rex appeared later in the Cretaceous, it's possible that it either retained feathers or lost them as the climate warmed and insulation became less important.  There are many examples of mammals losing most or all of their fur (elephants, humans) even as their relatives did not (woolly mammoths, chimpanzees); why couldn't the same have happened with dinosaurs?

Several smaller, early Tyrannosaur relatives had feathers.  Did T. rex have them, too?
(Source: Nature 484, 92–95.)
Whether a minority or a majority of dinosaurs had plumage is up for debate, but there is evidence that other dinos were feathery, including Velociraptor.  It seems more and more accurate to say that dinosaurs were just early birds, and that birds are late dinosaurs.

In any case, there's something freakishly alluring about a predator that combines the fearsomeness of a T. rex with the beauty of a bird-of-paradise or the silliness of a chicken.  I'd like to see that remake of Jurassic Park.

Wednesday, April 4, 2012

Some baby steps towards better self-driving cars

Imagine you've just spent a great night out with some friends: delicious food, good conversation, and perhaps a few drinks.  You don't worry about driving home, because you know your car will deliver you there safe and sound while you snooze in the back seat.  Such is one of many possible uses for the autonomous car.

I confess that I feel ambivalent about the idea.  Can we ever fully trust a computer to perform the complex cognitive task of driving as well as a human?  Still, it would be awesome to have your Chevy conduct you from errand to errand while you catch up on the news, or to send your Subaru out to pick up your clothes from the cleaners while you're at work.  As a commuter, I'd particularly benefit from the extra time a cybernetic chauffeur would give me each day.

I do think it'll happen eventually, and maybe within a couple of decades.  Google is already test-driving automated cars in California and Nevada.  If the proper sensing and judgment algorithms are perfected, robots may actually outperform human drivers in some respects.  Unfettered by the sluggish speed of human nerve conduction (mere tens of meters per second), they might have ultrafast reaction times.  If engineered properly, they will have no blind spots, will never misjudge distances, and will be free of distractions.  They will never get drunk.  They might even be able to accurately sense ice or other obstacles on the road and quickly relay their locations to other cars in the area so those cars can take appropriate precautions.
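To put rough numbers on that reaction-time advantage, here's a back-of-the-envelope sketch in Python (the speed and reaction times are illustrative assumptions, not measured values):

```python
# How far a car travels before braking even begins, for an assumed
# human reaction time vs. an assumed robotic sensing/compute latency.

def reaction_distance(speed_m_per_s, reaction_time_s):
    """Distance covered during the reaction delay, before any braking."""
    return speed_m_per_s * reaction_time_s

highway_speed = 30.0   # m/s, roughly 67 mph (assumed)
human_reaction = 1.5   # s, a commonly cited figure for alert drivers (assumed)
robot_reaction = 0.05  # s, hypothetical 50 ms of sensor + compute latency

print(f"Human: {reaction_distance(highway_speed, human_reaction):.1f} m")  # 45.0 m
print(f"Robot: {reaction_distance(highway_speed, robot_reaction):.1f} m")  # 1.5 m
```

Even with these crude numbers, the machine's head start is tens of meters -- often the difference between a near miss and a collision.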

University of Michigan grad student Nick Carlevaris-Bianco has been working on the sensing part, using various lasers and cameras mounted on a Segway robot to help it build an accurate high-resolution map of its surroundings.


This technology must be introduced very carefully and slowly to avoid serious accidents.  The public and insurance companies need time to catch up.  But with some more development in this field, our grandchildren could be driving -- or rather, riding -- on roads that are virtually accident-free.  

Tuesday, April 3, 2012

Ben Franklin: "Sometimes I Regret I Was Born Too Soon"


I came across a great, partially prophetic quote in Ben's Compleated Autobiography today:
The rapid progress true science now makes occasions my regretting sometimes that I was born so soon.  It is impossible to imagine the heights to which the power of man may be carried over matter in a thousand years.  We may perhaps learn to deprive large masses of their gravity and give them absolute levity for the sake of easy transport.  Agriculture may diminish its labour and double its produce.  All diseases may by sure means be prevented or cured, not excepting even that of old age, and our lives lengthened at pleasure even beyond the antediluvian standard. 
So, 230 years on, how are we doing?

Power over gravity: check.  Well, sort of.  We can at least defy it for a while with the help of the Bernoulli effect, strong propulsion, or electromagnets.

Improved agricultural efficiency: check.  In the developed world, a small minority of the population provides a surplus of food at dramatically improved yields per acre.  This was made possible almost entirely by science and technology.  Now, if we can only make it more sustainable and environmentally friendly...

Curing all disease: not so much.  We've lengthened life expectancy by a few decades in the developed world, and a large proportion of the developing world is catching up (in fact, the developed/developing distinction may itself be becoming outdated).  Infectious disease is well controlled by better hygiene, vaccinations, and antibiotics (for now...).  Chronic diseases like cancer and diabetes will take longer to conquer.

There have also been many surprises.  Franklin never saw Darwin, Einstein, Planck, Schrodinger, or Turing coming.  Sometimes I furtively imagine taking somebody from the 18th Century on a tour of all the technology and gadgets we now have thanks to paradigm changers like these, and it makes me appreciate my car or smartphone a little more.

What lies in the future?  Sustainable solutions to feeding the world's billions?  Hopefully.  True antigravity?  Genuine biological immortality?  Not likely in my lifetime, but we have 770 more years before Franklin's thousand are up.  Let's get on it.
O that moral science were as fair a way of improvement, and that men would cease to be wolves to one another and that human beings would at length learn what they now improperly call humanity!
Amen to that.  Still a long way to go on that front.

Sunday, April 1, 2012

Systems biology: How bacteria use their sense of "smell" to navigate

What is it that distinguishes a living thing from lifeless matter?  It's not the molecules it's made of -- chemists can synthesize proteins, DNA, RNA, and lipids, but to breathe life into these lifeless ingredients, a higher level of order is needed.  It's the networks of interactions between these molecules -- biological circuits -- that allow cells to respond to their environment, harness energy and funnel it into their growth and development, and pass their genes on to the next generation.  In fact, if we ever find life on another planet, I doubt its molecular building blocks will closely resemble those of life on Earth, but I am sure that its biological information networks will.

I've been learning a lot about these networks from Professor Uri Alon of the Weizmann Institute, who has posted several of his enlightening lectures to YouTube.  In one lecture he discusses chemotaxis, the ability of many bacteria and other cells to swim towards nutrients and away from toxins.

In order to perform chemotaxis, organisms like E. coli have to detect tiny gradients of nutrients or toxins, sometimes as small as 1 molecule in 1,000 over the length of their one-micrometer-long bodies.  Ultimately, they accomplish this by randomly swimming in different directions, and when they sense an increase in a nutrient (or decrease in a toxin), they are less likely to turn.  It's kind of a unicellular version of Hide the Object ("You're getting warmer!").

Before it can detect changes in the levels of a nutrient or toxin, the cell has to "subtract out" whatever level it currently sees in the environment.  This is adaptation, and is related to how your eyes adapt when you step outside into the sunlight, or how your sense of smell adapts after first smelling those chocolate chip cookies baking in your oven.  If you smelled cookies baking in a strange house, your nose could lead you to the kitchen, largely because your sense of smell adapts to the amount of cookie aroma in each room you visit.  You can tell if you're getting closer, because the cookie scent will become stronger.

Based on the model Alon describes, I built a little simulation showing how this works.  In the video below, each white dot represents a bacterium, and the green color represents a delicious nutrient such as aspartate.  On the left, the bacteria cannot sense the nutrient, while on the right, they can. You'll notice that the bacteria on the right wiggle and jiggle quite a bit, but they generally stick to the green part of the screen, where all the goodies are.  They sometimes move away from the green area, but usually realize their mistake and turn back.  The bacteria in this simulation have the properties of gradient sensing and adaptation, even though they are using just three virtual enzymes!
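I won't reproduce the full simulation here, but the heart of the run-and-tumble strategy with adaptation fits in a few dozen lines.  Below is a minimal toy sketch in Python, in the spirit of Alon's model rather than the actual code behind the video; all parameters are arbitrary:

```python
import math
import random

def nutrient(x, y):
    """Gaussian patch of attractant centered at the origin."""
    return math.exp(-(x**2 + y**2) / 50.0)

class Bacterium:
    def __init__(self):
        self.x = random.uniform(-20, 20)
        self.y = random.uniform(-20, 20)
        self.heading = random.uniform(0, 2 * math.pi)
        self.memory = nutrient(self.x, self.y)  # adapted baseline signal

    def step(self, speed=0.5, adapt_rate=0.1, base_tumble=0.3):
        signal = nutrient(self.x, self.y)
        # If the signal is rising relative to the adapted baseline,
        # tumble less often -- i.e., keep swimming up the gradient.
        tumble_prob = base_tumble * (0.2 if signal > self.memory else 1.0)
        if random.random() < tumble_prob:
            self.heading = random.uniform(0, 2 * math.pi)  # tumble: new direction
        # Run: swim forward along the current heading.
        self.x += speed * math.cos(self.heading)
        self.y += speed * math.sin(self.heading)
        # Adaptation: the baseline relaxes toward the current signal, so only
        # *changes* in concentration are sensed, not absolute levels.
        self.memory += adapt_rate * (signal - self.memory)

swarm = [Bacterium() for _ in range(100)]
for _ in range(2000):
    for b in swarm:
        b.step()

# Most of the swarm ends up loitering near the nutrient peak at the origin.
mean_dist = sum(math.hypot(b.x, b.y) for b in swarm) / len(swarm)
print(f"Mean distance from the nutrient peak: {mean_dist:.1f}")
```

The two ingredients are exactly the ones described above: a temporal comparison (is the signal better than my adapted baseline?) and slow adaptation of that baseline, which together bias an otherwise random walk up the gradient.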


The real system in E. coli is based on only six proteins in total, so it's not much more complicated.  Their solution to the problem is different from anything we humans would design, but it's a simple and elegant system that, according to Prof. Alon, resembles many other networks throughout the tree of life.  Maybe it can even tell us something about the higher-order functions of our own brains?

Wednesday, March 28, 2012

Advice from Ben Franklin: Doctors should be competent in science

In 1766 Ben Franklin wrote letters of recommendation for the sons of two friends who were applying to medical school at the University of Edinburgh, which might have been the premier institution for studying medicine (or "physic") at the time.  Along with the letters he supplied the young students with some words of wisdom:
But you will be your own best friends, if you apply diligently to your studies, refraining from all idle, useless amusements that are apt to lessen or withdraw the attention from your main business.
Evidently, even 250 years ago the elder generation was concerned about their youngsters going off to drink and party at college.  More interesting to me is the following:
I recommend one thing particularly to you, that besides the study of medicine, you endeavour to obtain a thorough knowledge of natural philosophy in general.  You will from thence draw great aids in judging well both of diseases and remedies, and avoid many errors.  I mention this, because I have observed that a number of physicians, here as well as in America, are miserably deficient in it.
How interesting that Franklin thought it necessary to explicitly draw the connection between medicine and science!  Medicine really must have been more an art (or pseudoscience) at that time.  I am sure he would be proud of the progress of modern evidence-based medicine, and would decry recent movements against it.

Monday, March 26, 2012

Ben Franklin's "Compleated Autobiography": First Thoughts

After the dismal story of Joseph Stalin's rise to power, I was hungry for a more uplifting biography, and one closer to home.  So, when I saw The Compleated Autobiography of Benjamin Franklin on the shelves of my audiobook shop, I eagerly picked it up.  Like a typical American, I have some nebulous ideas about Franklin as an accomplished statesman, inventor, scientist, and Founding Father of my country, but these perceptions tend to be changed by the long fermentations of history, spiked with the obligatory patriotic admiration.  But who was he, really?


Having just read Stalin's story, I think it will be interesting to compare these two historical figures.  Ben Franklin is something close to a demigod to Americans, and it seems Stalin and his government worked to instill a similar opinion of him during his rule of the Soviet Union.  Both men were revolutionaries with above-average intelligence and considerable charisma.  What about their personalities distinguishes these two towering figures of history?  What characteristics resulted in one being remembered as a brutally repressive dictator while the other persists as a beloved national hero?  It may seem naive, bizarre, or even offensive to make this comparison, but I'm trying to remain open-minded, since my opinions of these historical figures are certainly colored by my education and culture.

In fact, it might not be fair to base my comparison on these two books.  Ben Franklin's autobiography was largely written by the man himself, then ultimately compiled and edited by historian and Franklin descendant Mark Skousen in 2007, more than two centuries after Franklin's death.  Young Stalin, on the other hand, is written entirely from the perspective of a modern Western historian who is not at all sympathetic to the dictator or his cause.  Certainly the tone of the two books will differ greatly with regard to their subjects.  Yet I do think that actions speak louder than words, and I hope that the actions and choices these two men made will provide a fair testament to their respective characters.

Now into the third chapter, I am already impressed by the clarity and versatility of Franklin's thought.  In one minute he is discussing tax disputes with the proprietors of Pennsylvania, and in the next, he is describing a musical instrument he invented or a chemistry experiment he conducted with Cambridge professor John Hadley.  Then, his mind will turn to a critique of a religious text and its moral implications.  Two and a half centuries later, Franklin's words still speak of a sharp mind tirelessly examining all aspects of the world around him, and an earnest desire to offer improvements when possible.

A modern replica of Franklin's musical instrument, the armonica.
His philosophy was clearly influenced by his Puritan upbringing.  He valued hard work, good deeds towards others, and a measure of sobriety (on returning to his adopted home of Philadelphia in 1762 he decried the rapid proliferation of taverns there during his six-year absence).  Yet he always had one foot firmly planted outside of his times, and a long stride he had.  While he must have been a racist by modern standards, Franklin, upon visiting a "negro school" in Philadelphia with a clergyman friend, observed that the children's "apprehension seems as quick, their memory as strong, and their docility in every respect equal to that of white children."  It took the rest of society two more centuries to come to that recognition of equality, and some still haven't caught on.  Similarly, when mobs of angry countrymen began brutally massacring innocent Native Americans in their midst, Franklin took the initiative to condemn these attacks and secured majority public opinion behind him.  Still, he didn't express any guilt over his countrymen taking the Natives' land in the first place.

I'm only just delving into this book, but so far it seems that Franklin was a remarkable person who deserves every bit of his reputation.  He was both an optimist and a realist, seeking to appraise every situation clearly and then make the best of it.  I may have to work harder to read between the lines for Franklin's faults... but in any case, looking forward to learning more.

Wednesday, March 21, 2012

Final Impressions of Young Stalin

Soooooo, I finally finished Young Stalin last week.  I took a few days off here and there since the story got a bit boring right around 1908-1913.  Stalin had finished with his major bank robberies, leaving several years filled with short exiles, a long string of covert political party meetings, and an equally lengthy list of brief love affairs.  Hence, this section was riddled with unfamiliar Russian, Georgian, and otherwise Slavic or Caucasian names that made my brain hurt.  I eventually persevered, and even if I’ve forgotten many of the details, this book taught me a great deal about Stalin’s personality and the events of his early life that poised him to become one of the world’s most brutal dictators.

Montefiore’s thesis in this book is that Stalin was far from the mediocrity that his Bolshevik peer Trotsky claimed; rather, he possessed a singular combination of incisive intelligence, quiet charisma, complete confidence in his own convictions, and utter lack of empathy that propelled him to the pinnacle of Soviet power for three decades.  The memoirs of his political peers (and rivals, many of whom would later fall victim to his targeted killing sprees of the 1930s) testify to his intellect, gravitas, and pigheadedness.  And, as an old man, Stalin himself reflected that there were only two people he really loved in life: his first wife, the dressmaker Kato (Ekaterina Svanidze), and his mother, Keke (Ketevan Geladze).  Once Kato succumbed to what was probably typhus, he stated that “with her died my last warm feelings for humanity.”


Keke Geladze (left) and Kato Svanidze.  The portrait of Keke is by Isaak Brodsky.

I find the role of Kato in his life particularly fascinating.  It fits neatly into the archetype of the prodigy turned evil by the death of his beloved (à la Darth Vader).  On the other hand, Stalin was wedded first and foremost to the revolution, and had committed violent atrocities in its name well before Kato died.  He might well have become the dictator Stalin even had Kato survived.  Furthermore, his love for her didn’t stop him from later murdering most of her surviving immediate family, perhaps because they had expressed disapproval of him and even blamed him for causing Kato’s death through neglect.  Probably his disregard for the value of human life had deeper roots, extending all the way back to his abusive upbringing in a society full of belligerent drunkards.

Of course, bloodthirsty sociopaths can be found in all societies at all times.  One needs only look to serial killers and terrorists to find examples of this.  The only difference with Stalin was one of historical accident: he had the will and opportunity to enforce his psychosis on millions.  It’s interesting to note that more moderate Bolsheviks were also vying for power after the 1917 revolution – those that Lenin called the “tea drinkers,” the conciliatory intellectuals that wanted to unite with the opposing Mensheviks.  If these had gained power, perhaps the legacy of 20th century Communism/Socialism would have looked much different.

Moreover, Stalin could not have ascended to power without Lenin, his well-to-do mentor and colleague.  In contrast to the fatherly portrait some have painted of Lenin, Montefiore points out that at times Lenin was more extremist, less conciliatory, and more committed to underhanded gangster politics than Stalin.  In the young Georgian, Vladimir Lenin saw a man who could get the job done by whatever means it took.  And to both men, all means were justified in achieving their goal.  As Stalin told one Menshevik acquaintance, “a lie always has a stronger effect than the truth.  The main thing is to obtain one’s objective.”  In Stalin’s view, this clearly extended far beyond the world of words, as is revealed by his history of bank robbing, incitement of riots, and generally wanton killing that only accelerated late in his career.

Stalin (left) and Lenin (right) in 1919, a few years after the revolution.  They seem like nice guys, no?
Regarding the dictator’s atheism, Montefiore says little beyond the role it played in his early schooling at the theological seminary in Tiflis (Tbilisi).  Stalin’s reading of Darwin’s On the Origin of Species as a teenager had a strong impact on him, and seemingly prompted his initial questioning of Christianity.  “We’ve been deceived,” he told one peer at the time.  His introduction to the works of Marx, Plekhanov and others would come later, along with his conversion of many of his classmates to the cause.  On reflection, it seems the main role of his atheism was to prime him to reject his teachers’ authority and gradually replace Christian dogma with that of Marxism (or at least his own version of it, imbued at first with Georgian nationalism and later with many of Lenin's ideas).  Would there have been a Stalin had the young Iosif not read Darwin?  It’s hard to say; clearly he was a voracious reader and probably would have come across Marxist writings in any case.  Still, ironically, had his mother not worked doggedly to support his seminary studies, Stalin would likely have worked as a cobbler in his father’s workshop, and Soviet history might have looked quite different.

I can’t conclude without mentioning his five-year exile in Siberia.  Stalin spent the time between 1912 and 1917 in the tiny village of Turukhansk, which consisted of three extended families living in small huts.  It was completely isolated from civilization; the only means of transit in winter was by reindeer-drawn sleigh on the frozen Yenisei river, and in summer, by boat on the same.  It was here that the 34-year-old Stalin seduced and married a thirteen-year-old girl (scandalous even at that time) named Lidia, who bore him a son whom he promptly abandoned.  In any case, the frigid winters and self-reliant ways of life here had a lasting impact on Stalin.  Into old age, he recounted increasingly impressive tales of his hunting exploits there, and it is said that he retained the Siberian habit of snacking on bits of frozen fish for the rest of his life.

Turukhansk (A), where Stalin spent several years in exile. (Swiped from Google Maps).
All in all, Young Stalin provided a fascinating glimpse of the randomness of history.  It is incredibly ironic that a revolution that was supposed to give power to the proletariat only gave rise to a new monarchy, freshly imbued with a steel-hearted utopian vision to impose on the people it was supposed to represent.  Circumstance granted the megalomaniacal Iosif Dzhugashvili a window of opportunity, and he took it.  Sadly, such reigns of terror can and probably will happen again... all it takes is political instability and an opportunistic regime with the power to oppress its people.  All of which makes me view the recent uprisings in the Middle East, as well as my own government’s steps towards tighter surveillance of communications, with greater anxiety.

Friday, March 9, 2012

Can Blogging Assignments Improve Science Education?

Image pilfered from Pitt County Schools.


I'm fortunate that my institution has large support centers dedicated to improving students' and faculty members' skills in writing (the Sweetland Center) and teaching (the Center for Research on Learning and Teaching).  Today, they jointly hosted a workshop entitled "Blogging in the Classroom," which I attended.  Coming from a field in the natural and physical sciences, I expected to be a fish out of water.  Surely a workshop on blogging as a teaching tool would inherently cater to instructors in the writing-heavy humanities and social sciences?  I actually considered skipping it, since I have plenty of data to analyze from a recent round of experiments.

Am I ever glad I went!  The panelists were three well-spoken instructors from the disparate fields of history, writing, and environmental science.  It was the presence of this latter speaker, Bill Currie, that convinced me that blogging might be a valuable tool in a science classroom.  There are a lot of ways to implement a blogging module in a course, but Dr. Currie uses it as a platform for public posting of formal essays by students as part of their coursework, i.e., as an alternative to a traditional term paper.  Students are invited to post (civil!) comments on each other's entries, and are graded for both their own entries and their reactions to others.

Coming away from this workshop, I can see the following as possible advantages of blogging assignments.

1) It puts the direction of learning into the hands of students, rather than leaving it solely to professors and grad assistants.  Instead of listening to the lone voice of a professor, students take a role in shaping their own learning.  In this way, blogging plays a role similar to traditional oral presentations, but uses less class time and exercises a different skill set.  It might also allow more in-depth discussion than classroom presentations.  Of course, student-run learning comes with its share of caveats, but perhaps some form of instructor (or peer!) review could be implemented to rein in factual errors to some extent.

2) It provides a safe(r) forum to exchange ideas without fear of sounding ignorant.  The panelists noticed that when students were given the option to post entries and comments anonymously (that is, anonymously to everyone except the professor), they were less inhibited in discussing controversial topics.  I bet it would also encourage them to ask for clarification of fundamental points of scientific arguments without embarrassment.

3) It provides an incentive for students to showcase their best critical thinking and writing.  After all, many of these course-run blogs are public.  If your essay is being read not only by your instructors, but also by your peers (and potentially your grandmother, as one panelist put it), you have a stronger motivation to put forth your best, most polished work.

4) It enfranchises students and prepares them for deeper participation in the democratic forum of the internet.  This seems to me a very valuable skill for anyone wanting to shape our increasingly interconnected society through their ideas.

5) It provides real-world, public writing experience.  This should make for a nice resume item, and possibly "open the valve" of open-ended communication for many students, as panelist Naomi Silver put it.

Since there is such a heavy emphasis on learning the core material in my field (chemistry), any writing and blogging assignments would have to take a backseat to more structured conceptual learning and problem solving.  However, science professors have already been incorporating writing assignments into their curricula for some time.  In my intro biochemistry course, for instance, my prof had us summarize one or two peer-reviewed articles in our own words, discussing the main findings and any strengths or weaknesses of each study.  This could easily be translated to a blog format, with the added advantage of promoting discussion (and, one hopes, a sense of relevance!) of the topics.  I also expect the anonymity of blogging via usernames would make students less ashamed to admit their misunderstandings about a topic.  This might be a convenient way for future scientists and medical professionals to "break into" the primary literature without worrying about what they think they're already supposed to know.

Lots of exciting possibilities here... I'm so glad to be living in the information age!

Thursday, February 23, 2012

Disposable personal genome sequencers may be available by the end of the year for under $1000

Last week, Nature published a story about portable genome sequencers under development by a UK company called Oxford Nanopore Technologies (US company Ion Torrent Systems is developing its own technology in parallel).  The devices work by electronically reading the "letters" of the genetic code as single molecules of DNA pass through nanometer-sized pores in a membrane.  The company is developing pocket-sized disposable devices to accomplish this for around $900 apiece.  Within a year, you may be able to decode your whole genome for less than my family's first computer cost.

A prototype of the MinION, a personal genome sequencer.   Yes, that's a USB plug.  Oxford Nanopore Technologies.

Its error rate is still high (4%), but the company hopes to reduce it to less than 1% by the time the device launches.  In addition, thus far the company has only reported the sequencing of viral genomes, which are orders of magnitude smaller than the human genome.
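To get a feel for what those error rates mean at genome scale, here's a quick back-of-the-envelope calculation in Python (genome sizes are rounded, and I'm assuming a single sequencing pass with no error correction from repeated coverage):

```python
# Expected miscalled bases at a given per-base error rate, assuming a
# single sequencing pass and no correction from repeated coverage.
genomes = {
    "small viral genome (~50 kb)": 50_000,
    "human genome (~3 Gb)": 3_000_000_000,
}

for name, size in genomes.items():
    for error_rate in (0.04, 0.01):
        errors = size * error_rate
        print(f"{name} at {error_rate:.0%}: ~{errors:,.0f} miscalled bases")
```

At 4%, a single read-through of a human genome would contain on the order of a hundred million miscalled bases, which is why the error rate matters far more for us than for viruses.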

Regardless, this is a very promising start.  The technology has been in development for the better part of two decades, but this would be the first practical application of it.  With so many known genetic markers of diseases, this kind of device could help usher in a new era of more personalized medicine (as well as a slew of possible ethical issues).

Tuesday, February 21, 2012

Genesis of a Tyrant: Young Stalin


I am not a history buff.  Frankly, I never really saw the point.  Sure, there is the standard defense that it gives us perspective, that knowing the past gives us insight into the future.  But even a cursory glance at the past several hundred years of history seems to validate the quote attributed to Hegel, that “We learn from history that we do not learn anything from history.”  It’s difficult to derive any inspiration or hope from stories of war upon war, conquest after conquest, and the replacement of one form of oppression with another.

So, I’m branching out a bit with the audiobook Young Stalin by Simon Sebag Montefiore.  I chose Stalin’s particular tale for a few reasons.  First, in contrast to the other great tyrant of the 20th century, I know next to nothing about Stalin.  This is not surprising, since my K-12 curricula gave me a solid foundation in American history but emphasized world history too little (really only in the context of a single class on Western Civilization in 7th grade).  Second, and for the same reasons, I don’t know much about the history of Russia, which seems very alien and exotic to me.  When I hear terms like Bolshevik and Gulag thrown around, a fog of ignorance creeps over me.  Third, as I often hear religious conservatives cite Stalinism as the inevitable endpoint of liberal secularism, I want to understand for myself the relationship between Stalin’s political philosophy, his religious beliefs and upbringing, and the horrid brutality of his regime.


After listening to the prologue and the first couple of chapters, I’m already learning all sorts of interesting tidbits about the future dictator’s early years.  For instance, I never knew that he had grown up in Georgia, the same country that Russia invaded in 2008 – indeed, some of his relatives were from the very South Ossetia region upon which the 2008 conflict centered.  Or that his surname at birth was not Stalin, but the very Georgian name Dzhugashvili. Or that his father was a handsome and successful cobbler-turned-drunkard who savagely beat the young Iosif (Joseph) and his mother in fits of jealous inebriated rage, allegedly desensitizing Iosif to violence and teaching him to hate at a very young age (NOTE: My subsequent reading of Steven Pinker's work, particularly The Blank Slate, calls this heavily nurture-ist perspective into doubt).  Or that his mother dearly cherished him after losing her first two babies to illness, yet exerted a stern disciplinary hand (i.e., she beat him to keep him in line... apparently, it didn't work).  Or that he grew up in a culture rife with gang violence and male bravado, influenced by militant revolutionaries and romantic tales of righteous despoilment of the ruling class.

I'm just scratching the surface, though.  It will be interesting to see how a sensitive, flower-loving little boy was transformed into such a brutally oppressive dictator.


Saturday, February 18, 2012

Talent “ex nihilo”: How brain damage can make people more musical

When you think of brain damage, what effects come to mind?  If you’re like me, probably mental and physical handicaps – aphasia, amnesia, perhaps loss of coordination or personality changes.  As I finish Musicophilia, one of the standout surprises for me is that brain damage can sometimes add function… or so it seems.

In keeping with the book’s theme, Dr. Sacks presents several cases of brain damage or disease contributing to “hypermusicality,” an enhanced appreciation of and/or talent for music.  The most striking example comes at the beginning of the book, where he describes a patient who became obsessed with music after an encounter with a bolt of lightning (see what I did there?).

Image credit: David Selby
The patient, Tony Cicoria, apparently had no major adverse health effects apart from burns to his face and left foot, as well as some temporary sluggishness and memory loss that lasted a few weeks.  However, Cicoria soon felt a sudden, irrepressible urge to learn piano music, despite having had virtually no interest in performing music in his first forty or so years.  He bought recordings of Ashkenazy playing Chopin and adored every one.  Around that time, a family babysitter needed somewhere to store her piano, and this was just the ticket he needed to start making music himself.

Then, he began hearing music in his head.  “It’s like a frequency,” he said, “a radio band.  If I open myself up, it comes.  I want to say, ‘It comes from heaven,’ as Mozart said.”  After the lightning strike, he also became hyper-religious, believing that he had been saved for some purpose, and that purpose was his music.

This incessant influx of music needed an outlet, and he began composing at every available opportunity.  It totally possessed him during every waking hour, and his wife was none too pleased.  (I can relate: when I go through obsessive piano binges, it can certainly intrude upon time together with my wife!)  In the intervening years, Cicoria has developed a veritable musical career, presenting his compositions first at informal recitals and later at more formal concerts.  All this despite having only a handful of piano lessons as a young boy, about thirty years before.

So, after being assaulted by several thousand amps of electricity, Cicoria’s brain was not only unimpaired; it revealed talents and interests that he had never possessed before!

This sort of phenomenon is not restricted to those struck by lightning.  Sacks also describes several patients with frontotemporal dementia (FTD, or Pick’s disease), a neurodegenerative disease that is similar to Alzheimer’s except that it specifically affects the front and side portions of the brain.  Here is an image of a brain from a patient with advanced FTD:

Note the shrinking of the gyri (folds) of the brain towards the front, on the right.
Unlike those of Alzheimer’s, the first symptoms of FTD do not involve memory loss, but changes in personality and loquacity.  The symptoms of FTD vary a lot, but in general the patients become more outgoing and less inhibited, which is not surprising, since the frontal lobe is involved in suppressing socially questionable behaviors.  Sacks discusses some patients who begin to spontaneously burst out into song, and even some who suddenly develop extraordinary gifts of composition out of the blue.  Bruce Miller described one such man who began composing music at the age of 68!  Sacks speculates that these probably don’t represent entirely new behaviors of the brain, but result from the de-inhibition of existing thought patterns that are then free to flourish.

This reminds me of a few occasions I've had to speak French with native speakers while mildly intoxicated.  With a few glasses of wine in my system, I was actually more fluent.  It was as if my conscious brain gave way to a more fluent unconscious self.  Unfortunately, sobriety brought the return of my more typical halting, clumsy manner of speaking.  I have never experimented with psychoactive drugs, but I would not be surprised if such de-inhibitions were responsible for the bizarre creative visions of some drug-using artists and religious cults.

Sadly, most patients lose their newfound abilities as the FTD progresses further.  Many remain ebullient and enthusiastic, but their semantic memories usually begin to fail.  One elderly man would gaily sing “We Wish You a Merry Christmas” but could not tell his doctor precisely what Christmas was.  As Sacks put it,
The musical or artistic powers that may be released in frontotemporal dementia or other forms of brain damage do not come out of the blue; they are, one must presume, potentials or propensities that are already present but inhibited – and undeveloped.  Once released by damage to these inhibitory factors, musical or artistic powers can potentially be developed, nurtured, and exploited to produce a work of real artistic value – at least as long as frontal lobe function, with its executive and planning powers, is intact.  In the case of frontotemporal dementia, this may provide a brief, brilliant interlude as the disease advances.  The degenerative process in frontotemporal dementia, unfortunately, does not come to a halt, and sooner or later, all is lost – but for a brief time, for some, there can at least be music or art, with some of the fulfillment, the pleasure, and joy it can so uniquely provide.
One must wonder, finally, about the “Grandma Moses” phenomenon – the unexpected and sometimes sudden appearance of new artistic or mental powers in the absence of any clear pathology.  Perhaps one should speak of “health” rather than “pathology” here, since there may be, even at an advanced age, a relaxing or release of lifelong inhibitions.  Whether this release is primarily psychological, social, or neurological, it can unleash a torrent of creativity as surprising to oneself as it is to others.
What hidden creative potentials lie in each of our brains, ready to burst out but inhibited by the tyranny of rational thought?  Still, I’m not eager to get struck by lightning or stricken with dementia to find out.

Saturday, February 11, 2012

I don't remember you, but I love you: The story of Clive Wearing

The latest tenant in my dashboard CD player is Musicophilia by Oliver Sacks, the distinguished physician best known for Awakenings. In Musicophilia, Sacks explores the myriad modes of interaction between music and the human brain from a neurologist’s perspective. Topics range from the various components of musical awareness to the infinity of ways in which it can go haywire – musical hallucinations; impaired perception of pitch, harmony, melody, or timbre; and even hypermusicality induced by a lightning strike. However, one of the most intriguing and touching stories in the book concerns the tragic amnesia of musician and musicologist Clive Wearing.

In 1985, after a rare case of viral encephalitis, Clive acquired one of the most severe cases of amnesia ever documented. He lost not only decades of prior memories (retrograde amnesia), but also his ability to form new memories (anterograde amnesia). As a result, he lives entirely in the present, with only about thirty seconds of conscious memory. This transcends mere forgetfulness, impinging on his very sense of continuity from moment to moment. As his wife Deborah wrote:
It was as if every waking moment was the first waking moment. Clive was under the constant impression that he had just emerged from unconsciousness because he had no evidence in his own mind of ever being awake before… “I haven’t heard anything, seen anything, touched anything, smelled anything,” he would say. “It’s like being dead.”

It’s difficult to imagine how utterly terrifying this would be. On some level, Clive seems to be aware that he is an adult with a history, and has even retained some memories of events earlier in his life. Yet, at any given instant he has no conscious memory of having been awake, or even alive, before. At one point, hoping that it would help him to maintain some continuity from moment to moment, his caretakers encouraged him to keep a journal of his thoughts. This only resulted in page after page of entries like the following (taken from Wikipedia):
8:31 AM: Now I am really, completely awake.
9:06 AM: Now I am perfectly, overwhelmingly awake.
9:34 AM: Now I am superlatively, actually awake.

What kind of life can one hope to lead with such an impenetrable fog occluding both past and future? Yet Clive has been able to live with some measure of comfort and happiness. On some deep emotional level, he does remember his wife, and greets her with a shower of hugs and kisses when she comes to visit (or even when she drifts from his awareness for a minute or two and then he realizes, as if for the first time, that she is there). An accomplished pianist, he can still sight-read, improvise, and play many songs from memory with great musicality, even if he is convinced he has never played them before.
In these moments, it is clear that Clive’s amnesia is not complete. Human memory is a multi-dimensional complex of episodic recollections mingled with deeper emotional and procedural memories, and these latter two are still largely intact for him. He has adapted to his condition to some extent, and can hold conversations and enjoy the company of others in his own way. But it must have been an extremely difficult adjustment, not only for Clive, but for Deborah and their family members as well. I truly admire their courage.
In Total Recall, a perennial favorite of teenage boys, the psychic mutant Kuato offers Arnold Schwarzenegger’s character this admonition: “You are what you do! A man is defined by his actions, not his memory.” Clive certainly doesn’t completely remember the man he was before his encephalitis, but when he sees Deborah, or plays the piano, or directs a choir, he is very much alive.

Links:
Deborah Wearing’s Forever Today
Oliver Sacks’s Musicophilia

Motivations and purpose

I used to loathe commuting, and it hated me. Now we're getting along a little better. Let me explain.

Every day, I have the distinct pleasure of driving 69 miles each way to the university where I am pursuing a higher education. In addition to expelling several more tons of CO2 into the atmosphere each year, this consumes an even more precious personal resource: time. In the past year, I have spent around 500 hours driving to and from my research lab, which amounts to about 20 days of travel time. I do this so that my wife and I can live together despite attending graduate schools in different cities. It's a worthwhile sacrifice, but for most of these last few years I couldn't help but lament the irrevocable loss of precious time.

Then, about a year ago, we subscribed to a local audiobook rental store called Hear You Go. It was my wife's initiative, and I have to say I was pretty resistant to the idea. I enjoyed listening to the radio and my iPod day after day, and stubbornly clung to this habit. Fortunately, she persisted, and now I am probably more enthusiastic about audiobooks than she is!

Stepping back a bit: I've always been a slow reader. I have an insatiable curiosity about the world, but have generally considered reading books to take too much time, particularly when I must keep abreast of work in my own field of research (which, incidentally, lies at the intersection of biochemistry and nanotechnology). Like many of my generation, I procure the majority of my general-purpose information from blog posts, podcasts, sound bites, and of course, Wikipedia. I am part of probably the last generation that is still astounded by the amount of information at our fingertips thanks to the internet. But modern media often sacrifice depth in favor of breadth, and I've long felt that my slow reading speed and short attention span for most topics have kept me ignorant of a whole world of thought and discovery.

Thanks to my newfound passion for audiobooks, I currently breeze through about 2-3 books each month. I'm "reading" more widely and quickly than I ever have before. I actually look forward to my commute each day, eagerly anticipating the barrage of novel ideas, information, and stories that will come pulsing from the well-worn speakers of my '06 Impala. I am hooked!

This blog is about the rekindling of my passion for learning outside of my ultra-specialized field of research. It is about my journey, literally and metaphorically, towards my own internal realization of E. O. Wilson's "Consilience" -- the unity of knowledge from disparate fields. It is a candle lit by the fire of my curiosity that I hope to share with other curious and open minds.

I do not intend for this to be an endless series of audio book reviews. It will be much broader (and probably more random), encompassing any ideas my daily dabblings in the thoughts of others may engender. My initial focus will concern my current interests: science, biochemistry, evolution, music, food, language, and religion. If all goes as planned, this list will expand to encompass other topics such as history and economics. In the end, it's all the same: the attempts of one meager brain to make sense of this beautiful, bountiful, and intractably complex universe.

Won't you join me?