
Thursday, August 17, 2017


Public monuments matter. They make assertions about the values shared by the groups that erect and maintain them. When those entities are governments, in a purportedly democratic society, the monuments are claims about the public consensus.

Monuments are also considerably more complicated than one might think without giving them much reflection. For one thing, they are time-dependent. They purport to be about a person, or multiple people, or events. But that which is memorialized existed, or happened, some time before they were erected. So as statements, they refer not to the time of their subjects, but to the time of their creation. Maya Lin's Vietnam memorial was controversial because of the assertion it seemed to make about how we should view the Vietnam conflict at the time the monument was erected. But after a while, that is also in the past. When we view monuments today, they are saying something about the time of their creation, and that may be jarringly different from our time. As it turns out, Lin's memorial has not only held up very well, it has come to seem even more appropriate over time. Her vision was prescient, not limited to the historical moment. But obviously that isn't always the case.

My office is in the middle of Providence River Park, and specifically that part of the park that is full of monuments. There are monuments to the dead of both World Wars, to the Irish famine, and to the Shoah. The latter was erected last year, as it happens. This year, a monument (privately sponsored but on public land) to the Easter Uprising was installed next to the Irish famine memorial. There is also a Civil War memorial which, obviously, commemorates Union soldiers.

Some 20 years ago, I worked as a consultant for the sponsors of the New England Holocaust Memorial in Boston. Again, it was privately financed but stands on public land. It has been vandalized twice this year.

Most of the memorials near my office do not, as far as I know, attract much controversy today. People in Providence don't have a problem honoring the war dead or the victims of the Shoah. The Irish memorials would be problematic for some English people. The question of English responsibility for the Irish famine is contested, as is the righteousness of the Easter rebellion. But English identity has not endured among North American settlers, whereas lots of people still think of themselves as Irish. So these go up on public land without any visible fuss.

All that throat clearing leads to this. The Confederate monuments which have suddenly become incandescently controversial do not, in fact, refer to the Civil War. They refer to the post-Reconstruction era, the rise of Jim Crow, and the Reign of Terror that re-established white supremacy in the South. They were erected during that era, generally 50 years or more after the Civil War, as a message about the present, not the past. That is not to say that the past was any less reprehensible. The cause of the Confederacy was treason in defense of slavery. The heroes of the Confederacy are not remembered for any other accomplishments of historic significance. Whether the monuments refer to 1865 or 1915, that is still true.

Nevertheless, it is important to keep in mind that they were not erected to commemorate history, but rather to enforce a current ideology -- an ideology which it is imperative, in the present era, that we utterly repudiate. Anyone who does not understand this is unfit for public office in 2017.

Monday, August 14, 2017

It is happening here

The BS debate about how Democrats can win back Trump voters through economic populism needs to end now. Donald Trump appeals to Republican voters because he articulates and enables racism. That's what they mean when they call him a "straight shooter" and say he "tells it like it is," even though he lies every time he moves his lips. He refuses to condemn Nazis and Klansmen because a) they are his base and b) they are among his closest advisers. (Bannon, Gorka and Miller, to get specific. His Attorney General is also a racist.)

Here's an interview with historian Timothy Snyder. I'll give you a pull quote:

What's most striking, if you want to try to link what happened yesterday to our own history, is that we now have a president who doesn't regard Nazis as a symbol of evil. . . .  His reaction to this event is to say that everyone is at fault, and we should all hold together. That's not the reaction that one would expect from the president of the United States. But it is consistent with what I've been trying to get across for the past few months. It's consistent with Trump and Steven Bannon's attempt to do away with the part of the American story that celebrates entering and winning the Second World War. It's consistent with their attempt to do away with the part of the American identity that has to do with being anti-fascist, or anti-Nazi. It's consistent with their botching the Holocaust Remembrance Day in January. It's consistent with the utterly bizarre way that Sean Spicer talked about the Holocaust, when he said Hitler didn't kill his own people. It's consistent with Trump being the first major American politician in recent memory to skip visiting the Ghetto Memorial when he came to Warsaw in August.
And above all, it's consistent with his “America First” slogan. This is what America First means. America First means an America where a Nazi Germany was not the enemy. So that's the broad historical circle. We have an administration which has "America First." What "America First" meant when it was used during the WWII era was that we should not resist Nazi Germany. Mr. Trump's remarks on Saturday are totally consistent with that. This is who and what the administration has been from the very beginning.
And believe me, the white supremacists are getting the message loud and clear.

This must end. This racist, malignant, demented psychopath must be removed from office. If we can't get that done, our century of progress has ended.

Thursday, August 10, 2017

Crispy Critters

That would be millions of us Homo sapiens as much of the Middle East and Europe bake in unprecedented heat. Yes, Iraq is normally hot in the summer, but now it is close to uninhabitable:

As temperatures rose towards 51C on Thursday, Iraq’s government declared a mandatory holiday, allowing civil servants to shelter at home.
So far this month in the Iraqi capital, every day but one has reached 48C or higher, and the forecast is for the high temperatures to continue for the next week. July was little different, in Iraq and in Syria, where the capital, Damascus, has also been several degrees hotter than usual nearly every day since late June.
In Kuwait, where birds have reportedly dropped from the skies, and Riyadh, where building work has ceased this week, locals have called for mercy from a hotter-than-normal air mass that has remained nearly stationary over central Arabia for more than three weeks, stretching the capacity of electricity networks beyond limits.
For those of you who don't know, that's about 124 degrees Fahrenheit. In Europe temperatures are hitting 104 Fahrenheit, as they have in the U.S. northwest. Healthy people will survive that, but it's pretty damn miserable. In Seattle, where 90-degree heat is rare, the misery is compounded by suffocating smoke from wildfires in British Columbia. I could continue to go around the world, from methane seeping from the melting Arctic permafrost to the possible collapse of the West Antarctic ice sheet, and on and on. A major precipitant of the Syrian civil war was the collapse of Syrian agriculture due to extended drought and the resulting flood of unemployed refugees into the cities.
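If you want to check these conversions yourself, the formula is F = C × 9/5 + 32; a minimal sketch in Python:

```python
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

print(c_to_f(51))  # 123.8 -- the roughly 124 F cited above
print(c_to_f(48))  # 118.4 -- Baghdad's daily highs this month
print(c_to_f(40))  # 104.0 -- the European heat wave figure
```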

If your tribal identity depends on denying what is obviously true and right in front of your nose, I expect that nothing I can say will matter to you.

No, it's not the only thing that matters, but if we don't get very serious about this, very soon, nothing else will matter very much.

Monday, August 07, 2017

CRISPR critters

I suppose I should say something about the recently announced work of Shoukhrat Mitalipov of Oregon Health and Science University, who claims to have successfully edited the genome of human embryos, in this case to eliminate a disease-causing mutation. This work is as yet unpublished and not peer reviewed, but let's assume it is sound.

The technique, which has been much in the news, is called CRISPR/Cas9. I'm not going to go into the technical details here, but you can certainly look them up if you are interested; the Wikipedia article is actually reasonably accessible if you have some basic understanding of genetics. But getting under the hood doesn't really matter. This is a genetic system that evolved in prokaryotic cells to combat bacteriophages -- the viruses that infect bacteria. It turns out that it provides a method for precise editing of DNA. Previously, the best researchers could do was shoot DNA into a nucleus and hope that it would be incorporated somewhere, or selectively knock out genes. This provides a method for editing specific genes.

Before Mitalipov's work, however, attempts to edit genes in human embryos weren't very successful, mostly because the editing didn't take in every cell. The Mitalipov team got in early, and they claim to have had potentially clinically useful results. This sounds like good news for people who carry genetic disorders and who want to have children, and it may well be. But it's setting off all sorts of ethical alarm bells.

Obviously you have to make a bunch of embryos and you'll end up destroying most of them. That already happens with in vitro fertilization, however, and the anti-abortion ideologues seem to have pretty much gotten used to it. As I have said many times, they do not really believe that blastocysts are human beings or that zygotes have the moral status of persons. That's just an excuse.

Once we've gotten past that, the obvious question is whether the technique could be used, not only to fix hereditary diseases, but to make designer babies -- with enhanced intelligence, physical capacities, specific talents, whatever their wealthy parents want.

I have a two-part answer to this. The first part is that the border between fixing disease and enhancement is very fuzzy. The boundary between say, just being short and dwarfism, or not being the brightest bulb on the tree and cognitive disability, is essentially arbitrary. So if you don't think enhancement is ethical, you need to decide where to draw the line -- and that's always going to be open to dispute.

Part two is that phenotypes are not generally highly determined by genotypes. While a single mutation can definitively cause certain diseases such as Huntington's or Sickle Cell, for the most part our genetic heritage interacts with our environment to create us. For example, people might be at risk for developing Type 2 diabetes if they consume a particular diet; while many people with the same genetic profile will never get Type 2 diabetes. Do we blame it on their genes or the culinary culture?

Intelligence, musical ability, athletic ability -- all of these are also products of genetic heritage unfolding within a particular environment and personal history. You might have John Coltrane's genes, but if you don't practice, or you could never afford a saxophone in the first place, you won't be John Coltrane. Furthermore, these proclivities for some outcome to occur within some given environment are determined by a whole suite of interacting genes. Scientists have so far found only very small influences on outcomes such as heart disease or IQ from any given gene variant, and even these often turn out to be spurious on further investigation, or might not occur at all in a different environment.

If you tried to maximize the genetic profile for intelligence, it might well come at the cost of some other characteristic, such as longevity or sociability or something else people would be loath to damage. Or it might only work if the child could be guaranteed some specific form of nurture, and otherwise be affirmatively harmful.

So this is a bridge we won't have to cross for a long time, if we ever get there. Nonetheless, I can't say that nobody will ever try it. People should probably try to be educated about this issue and we should be talking about it, but there's no need to panic.

Wednesday, August 02, 2017


I've written about the North American warrior game here before, and if you've been hanging around long enough you know that my views on it have changed along with the evidence. Our biggest worry used to be spinal cord injuries. Rule changes to prohibit using the helmet as a ram, and coaching at the youth level to emphasize keeping the head up during contact, greatly reduced that risk.

But it's time now to face the truth: no conceivable rule changes that would let you keep on calling it the same game can keep the players' brains from turning to cornmeal mush. The fundamental problem is the helmet. This may be counterintuitive. It seems that the helmet is there to protect the brain, but the opposite is true. The helmet protects the skull and the face, which enables the head to collide forcefully with other players and the ground. When that happens, the brain sloshes around and smashes against the inside of the skull. Rugby players do get concussions and are apparently at risk for Chronic Traumatic Encephalopathy, but not as high a risk as North American football players precisely because they don't wear helmets. That forces them to protect their heads.

The linked study in JAMA has gotten a lot of publicity. In case you've been too obsessed with a certain orange idiot's twitter feed to pay attention, the researchers got families to donate the brains of deceased football players. Of the 111 who had played in the NFL, 110 met the diagnostic criteria for CTE based on microscopic examination of their brain tissue.

Now, this doesn't mean that nearly 100% of NFL players will ultimately get the disease. The families donated the brains because they were worried, and indeed, basically all of these players had observable pathology when they were alive:

Among the 111 CTE cases with standardized informant reports on clinical symptoms, a reported progressive clinical course was common in participants with both mild and severe CTE pathology, occurring in 23 (85%) mild cases and 84 (100%) severe cases (Table 3). Behavioral or mood symptoms were common in participants with both mild and severe CTE pathology, with symptoms occurring in 26 (96%) mild cases and 75 (89%) severe cases. Impulsivity, depressive symptoms, apathy, and anxiety occurred in 23 (89%), 18 (67%), 13 (50%), and 14 (52%) mild cases and 65 (80%), 46 (56%), 43 (52%), and 41 (50%) severe cases, respectively. Additionally, hopelessness, explosivity, being verbally violent, being physically violent, and suicidality (including ideation, attempts, or completions) occurred in 18 (69%), 18 (67%), 17 (63%), 14 (52%), and 15 (56%) mild cases, respectively. Substance use disorders were also common in participants with mild CTE, occurring in 18 (67%) mild cases. Symptoms of posttraumatic stress disorder were uncommon in both groups, occurring in 3 (11%) mild cases and 9 (11%) severe cases.

Cognitive symptoms were common in participants with both mild and severe CTE pathology, with symptoms occurring in 23 (85%) mild cases and 80 (95%) severe cases. Memory, executive function, and attention symptoms occurred in 19 (73%), 19 (73%), and 18 (69%) mild cases and 76 (92%), 67 (81%), and 67 (81%) severe cases, respectively. Additionally, language and visuospatial symptoms occurred in 54 (66%) and 44 (54%) severe cases, respectively. A premortem diagnosis of AD and a postmortem (but blinded to pathology) consensus diagnosis of dementia were common in severe cases, occurring in 21 (25%) and 71 (85%), respectively. There were no asymptomatic (ie, no mood/behavior or cognitive symptoms) CTE cases. Motor symptoms were common in severe cases, occurring in 63 (75%). Gait instability and slowness of movement occurred in 55 (66%) and 42 (50%) severe cases, respectively. Symptom frequencies remained similar when only pure CTE cases (ie, those with no neuropathological evidence of comorbid neurodegenerative disease) were considered (eTable in the Supplement).

Sure, a lot of people develop dementia, but not this young. The median age at death was 66, the youngest was 47, and 3/4 were younger than 76. You would most definitely not expect to see such a prevalence of brain damage in the general population in that age cohort. So granted, we don't know the actual prevalence among football players, and it's likely less than it appears from this sample. But . . .
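The selection problem can be made concrete with a back-of-the-envelope Bayes calculation. The numbers below are purely hypothetical, chosen only to show the mechanism, not drawn from the JAMA study: if families are far more likely to donate a brain when the player showed symptoms in life, the donated sample will show a much higher CTE rate than the population of players as a whole.

```python
# Hypothetical illustration of donation selection bias.
# None of these numbers come from the study itself.
p_cte = 0.30          # assumed true CTE prevalence among all players
p_donate_cte = 0.50   # chance a family donates if the player had CTE (symptoms seen)
p_donate_no = 0.02    # chance a family donates if the player did not

# Prevalence of CTE among donated brains, by Bayes' rule:
p_cte_in_sample = (p_cte * p_donate_cte) / (
    p_cte * p_donate_cte + (1 - p_cte) * p_donate_no
)
print(f"{p_cte_in_sample:.0%}")  # 91%: a 30% true rate looks like >90% in the sample
```

Even with a true prevalence well under half, nearly every donated brain shows the disease, which is exactly why the study's near-100% figure can't be read as the rate for all players.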

The danger is obviously real and it's turning out to be unacceptable to a lot of players. We're seeing a spate of early retirements from the game. Sure, kids who are already dreaming of NFL glory are unlikely to give up playing in high school and college, but we have to expect that fewer and fewer parents will allow their boys to take up the game. That, I think, is how the game will ultimately die: when the pipeline of players dries up. Regretfully, I have to say I hope that it will.