Do germ-phobia and hyper-sanitation protect the public, or should we be exposed to infections and get more resilient?

Are we going overboard on protecting the public from “germs”?

That’s an especially good question given the flu season.

I downsized into a high-rise condo building, so I am exposed to a lot more people (especially kids) every day. Furthermore, a garage I use in Arlington forces me into contact with people because of construction problems that lead to delays.

I did get the flu shot from my doctor in September. On New Year’s Day, I had a scary dry cough and a very slight fever, but it went away in 36 hours, leaving a bit of a rattle. I would like to think this was a case of a vaccination providing partial protection and keeping the H3N2 symptoms mild (NY Times blog post).

There is also variation among people in how they react to flu. About half of adults can have an exposure with relatively few symptoms. That may be related to the size of the exposure (a very small exposure may act like a vaccination), or to the anti-oxidants in the person’s immune system. But we also see a replay of the 1918 H1N1 flu: some younger adults with robust immune systems die of their own overreactive immune response and cytokine storm, and effectively “drown”.

When kids are exposed to more germs through less sanitation, they are sick more often as kids, but may grow up to be more resilient as adults. NBC News has a good article on that from Dr. Ty here. That was pretty much the case with me: I was “sickly” as a child, but had very few missed sick days during my entire adult working career.

In my early 40s, I had a couple of strep throats, but that hasn’t happened since, suggesting natural immunity. In 2004, I had a serious periodontal infection that led to a CT scan. It went away with stronger antibiotics and did not return, probably because of an immune response.

A natural infection does provide more exact protection than a flu shot, but it seems reasonable that a flu shot that blunts a subsequent natural infection offers the best of both.

But there is a larger question of how far society should go in preventing infections altogether.   I hear the debate on “presenteeism”, but I wonder if people just need to get tougher and more resilient.

We believe some infections are very dangerous, and have quarantined people who are exposed, as we saw with Ebola in 2014, when a few people who had worked in Africa were isolated (one in particular in NYC). We’ve also treated SARS that way, back in 2003, with aggressive contact tracing. SARS (and MERS) are caused by coronaviruses, most of which produce only mild laryngitis or cold-like illness; but a few of them are novel and dangerous.

We got through the H1N1 crisis of 2009. And we hear talk of bird flu (which became an ABC TV movie in 2006), like H5N1 or H7N9, with the idea that it could jump from birds to humans (through other animals) and sustain transmission. In Southeast Asia, the practice of keeping agriculture very close to homes increases the risk.

We have to deal with whether an enemy could introduce something like anthrax, as we saw in September 2001 right after 9/11, and that was dramatized on ABC Nightline in 1999.  There are good reasons to think this is harder for an enemy to do than most novelists admit.

Finally, there are sexually transmitted diseases, with the outdated “chain letter” debates from the right wing in the 1980s, where HIV could become more transmissible with time (a sci-fi horror scenario) or lead to the spread of more secondary infections (like TB). But we never know when some bizarre new disease will arise and behave in an unprecedented way (as in my novel manuscript “Angel’s Brother”). Likewise, with Zika, you have the idea of a virus that can be spread both by insects and by sexual transmission, and that affects some people (unborn children) far more than normal healthy adults.

The film “Unrest” presented the idea that clusters of chronic fatigue syndrome have been noted since the 1980s. In fact, clusters of Hodgkin’s Disease had been reported in a few communities in the late 1970s, a few years before AIDS became known.

(Posted: Friday, January 26, 2018 at 4 PM EST)

Vaccine development is still the best prevention against bio-terror or airline-spread pandemics

Among the major perils that can seriously disrupt western civilization as we know it would be future pandemics.

I haven’t covered this threat as much here as some others (EMP, cyberwar, solar storms, nuclear), and I actually don’t think it is as likely.

Nevertheless, it’s good to review the various pieces in play.

In modern times, the most obvious major pandemic has, of course, been HIV, which grew in the male gay community and overseas in other communities, exploding with a kind of big bang in the early 1980s, with social and political consequences already widely covered (as with the 2014 HBO film of Larry Kramer’s “The Normal Heart”). But HIV, as an STD, is extremely unlikely to affect the general public outside of restricted modes of transmission. Other viruses, including recently Hepatitis C (and B), have behaved in a somewhat similar fashion without becoming enormous threats. More recently, Zika virus has presented the idea of a virus transmitted both by sex and by arthropods (mosquitoes), which can pose some theoretical dilemmas about “amplification”. Imagine a sci-fi scenario where a novel virus is normally harmless but can gradually make a population sterile (“Children of Men”, 2006), or poses novel results involving personal identity (as in my own novel manuscript “Angel’s Brother”).

After 9/11, the idea of bioterror took root very quickly with the almost coincidental “Amerithrax” anthrax attacks, which apparently started in Florida with an attack on a company that publishes supermarket tabloids. In the beginning the attacks appeared to come from foreign Islamic extremism, but later attention was drawn to a scientist at Fort Detrick, MD, with tragic results. I do remember arrests at a Trenton, NJ apartment complex (not too far from where I lived on my first job) that never got mentioned again. Back in 1999 (two years before 9/11), ABC Nightline did a several-evening simulation of a fictitious anthrax powder attack on the BART subway in San Francisco, where powder with spores was thrown into a tunnel. So the idea had been thought of before. After the 2001 incidents, people were sometimes questioned by police when any powdery substance appeared in mail they had sent, an idea that would never have occurred to anyone before.

More speculation has been drawn to the possibility of re-weaponizing smallpox (as in Revolutionary and even French and Indian War times).  Daniel Percival developed this possibility in the FX 2002 film “Smallpox 2002: Silent Weapon”.  All of this depends on the fact that the practice of vaccinating Americans for smallpox has been allowed to lapse.

But the biggest concern in the past fifteen years or so has been the possibility of pandemics based on respiratory illnesses, mainly influenzas (with the Spanish Flu of 1918 the archetype) and SARS-like illnesses, caused by coronaviruses, most of which are relatively harmless. Major films on this issue include “Contagion” (2011, Steven Soderbergh), “Pandemic” (2007, Hallmark), and “Fatal Contact: Bird Flu in America” (2006, ABC Studios).

Wikipedia lists many “avian influenza” viruses, but two of the most important are H5N1 and H7N9 (which a China Today newspaper wrote about recently). The practice of keeping poultry and farm animals very near houses in poor countries (especially in Southeast Asia) raises the probability of animal-to-human transmission. So far, subsequent person-to-person transmission remains rare, but if it happens, air travel can spread it around the world. The avian influenza issue raises the idea of “herd behavior” and how ordinarily private behavior sometimes has major secondary public consequences.
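The dividing line between rare spillovers and a world-spanning pandemic is what epidemiologists call the basic reproduction number, R0: if each case infects more than one other person on average, case counts grow geometrically; below one, chains of infection fizzle out on their own. A minimal toy sketch of that threshold (the R0 values below are purely illustrative assumptions, not measured figures for H5N1 or H7N9):

```python
# Toy branching-process model of the R0 threshold.
# All R0 values used here are illustrative, not real estimates.

def expected_cases(r0, generations, initial=1):
    """Expected cumulative case count after a number of
    transmission generations, if each case infects r0 others."""
    total, current = initial, initial
    for _ in range(generations):
        current *= r0          # next generation of infections
        total += current
    return total

# R0 > 1: sustained transmission, geometric growth.
print(expected_cases(2.0, 10))                 # 2047 expected cases
# R0 < 1: the sputtering, self-limiting spillovers seen so far.
print(round(expected_cases(0.3, 10), 2))       # about 1.43 cases
```

The point of the air-travel worry is the first branch: once R0 crosses one, the growth curve does not care about distance.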

Then, of course, we have the history of Ebola virus hemorrhagic fever, which broke out in West Africa in 2014. A number of doctors, health care workers, and relatives became infected, and a few returned to the U.S., including one death. In fact, Ebola is a Category A bioterrorism agent (whereas bird flu is Category C). A major controversy developed over the need to isolate or quarantine those who might have been exposed, as on airline flights.

All of this brings up two major questions. One is vaccine development, and the public’s willingness to accept vaccines, given a new administration somewhat anti-science and sympathetic to vaccine denial. Indeed, an effective Ebola vaccine may soon be available, which would be essential to encouraging humanitarian volunteer work overseas (again, we have an administration with a near-sighted, nationalistic “take care of your own first” value system). I think we could become more proactive in developing avian influenza vaccines now, as well as vaccines against coronavirus infections, because natural resistance to these agents does develop with exposure.

I note the flawed thinking behind the vaccine denial movement (as in the film “Vaxxed“), which seems, again, to stem from a “take care of your own first” value system (sometimes religion).

The other measure would be social distancing, and isolation of patients.  This has been used (as for example to stop SARS from spreading in 2003) but it hardly sounds practical in the long run, and tends to invoke draconian powers from government.

In fact, the CDC attracted controversy with its “Final Rule of Control of Communicable Diseases: Domestic and Foreign”, issued January 19, on the last day of the Obama administration.

Major reading includes (from the 1990s) Richard Preston’s “The Hot Zone” and Laurie Garrett’s “The Coming Plague: Emerging Diseases in a World out of Balance“.

(Posted: Friday, March 10, 2017 at 4 PM EST)

My own self-broadcast speech in a post-truth world (?)


So, how do you know what’s right and what’s wrong?  Let’s get back to that epistemology of Philosophy 101.

We can propose some ideas: the Golden Rule, or the libertarian idea of harmlessness. What we run up against repeatedly is inequality, not just among groups but among people within any group. Individuals benefit from the sacrifices of others that they didn’t see happen before their eyes. Practically none of us have fully paid our dues; practically all of us have some bad karma, can have “stuff” taken away from us, and won’t find claiming victimhood particularly honorable. You can think of some sort of “axiom of choice” that’s self-evident: if you want a better station in life than what seems “assigned”, reach down and offer a hand up to others, with some real personhood from “you”. You may have to accept the idea of belonging to some group, even if that group isn’t right about absolutely everything.

History suggests, though, that you can rationalize almost any ideology.  That’s particularly true when you talk about subcultures of your basic political formats:  democracy, communism, fascism (including ancient “Spartanism”), theocracy.  So it’s natural to turn to religion, and let scripture decide which ideology is right.  But then you have to decide, whose scripture, and which interpretation.  You often wind up with pronouncements (like homophobia) that sound totally irrational to an individual, but perhaps sensible for the long term survival of a circumscribed, threatened tribal group.  Or you may decide this by social affiliation.  You belong to a group, whose leadership fights for you.  You may belong to more than one group, and the groups could be somewhat at odds.  Examples of groups to belong to:  natural family (as extended), faith-based, labor, business, or civil-rights oriented (by race, nationality, or gender “tuple”).  It’s the task of a progressive democracy to overcome both over-rationalization and fundamentalism, and come up with a culture of what values are acceptable for people to live together and communicate.

So, then, we come to the “value” of my style of ungated speech.  That’s coming under fire because of where “fake news” has led (especially with the recent total libel of some local small businesses in Washington DC, in “pizzagate”, and the violence that erupted).

There is a view that “news” should be delivered by professionals who know how to fact-check, and that their “privilege of being listened to” be regulated by a political structure. That’s how (Putin’s) Russia and China work with their own people now. You can argue that if the leadership reached power legitimately, it’s in the best interest of everyone to know their place (“rightsizing”) and not speak out of turn. Yes, that’s a kind of authoritarianism, hidden under democracy, which can provide the illusion of stability for a long time, but not forever. (Marxist states have not been able to sustain themselves forever.) Authoritarian states argue, with some credibility though, that “average Joe” people are vulnerable to “propaganda” (one of Vladimir Putin’s favorite words, even in justifying the 2013 anti-gay law), and are unable to ferret out “truth” for themselves in a quantum sea of infinite information.


So, then, you deal with my kind of speech. I offer theories now as to how, for example, Section 230 could quickly come apart under Trump. But someone else might want me to shut up, inasmuch as I might be (unwittingly) throwing kerosene on a fire of people who want to implement exactly the policies that I fear, policies that would end my own second career and my own life’s “second half”. After all, there are some groups (many lower-income families with children) whose members might be better off if there were no user-generated content because there was no Section 230. These “average Joes” could be as passionate about something like this under Trump (protecting their kids, when many of us don’t even have them) as they are now about keeping manufacturing jobs within the United States.

A similar thing happened in the pre-Internet 1980s, in the early days of the AIDS epidemic, when conservative legislators in Texas wanted to expand the sodomy laws and ban gay men from most occupations (let alone the military) on a theory that gays would “amplify” a new disease until it threatened civilization as a whole. I actually corresponded with the group behind this (the so-called “Dallas Doctors Against AIDS”), looking for some sort of rational refutation. As diabolical as that theory is, HIV has never behaved this way, but it’s easy for a science fiction writer to conjure up a new imaginary virus that does. (Something like that happens in my novel manuscript “Angel’s Brother”, where the “virus” contains short-lived micro black holes.)

But what I’m doing is not spreading “news” or claiming that certain events have happened (as was done, for example, in the “Pizzagate” affair). I’m articulating (playing “devil’s advocate”) possible new interpretations of known and reasonably verifiable facts. My doing so makes “identity politics” more difficult, because there is always the possibility that someone will have an inherited, enhanced inclination to behave in some negative or downstream-harmful way to overcome a disadvantage that society has assigned, over history, to his or her “group”. I am trying to force everyone to look for the truth in a post-truth world, before they act.

(Posted: Monday, December 5, 2016 at 10:15 PM EST)

Blood donation policy in US still excludes most gay men in practice; is this really necessary for public health?


I don’t think I have given blood since 1982.  I remember having outstanding blood pressure numbers then.

During that time, some gay men would sit all day in plasmapheresis centers, their forearms taped in a robust push to develop a Hepatitis B vaccine.  And Dallas banks would include “become a superdonor” with their statements in the early days of blood component and even bone marrow donation research.  I would get a Hepatitis B vaccination (two shots) from my own private doctor in the fall of 1982, covered by normal workplace health insurance.

I would learn, rather suddenly, that MSM (men who have sex with men) should exclude themselves from donating blood, at an AIDS information forum at the old Metropolitan Community Church in the early spring of 1983. This would also apply to organ donation (and I had signed an organ donor card around 1977).

So fast-forward almost a quarter century. Finally, the FDA in the United States is willing to allow MSM to donate blood, but only those men who have abstained from gay sex for at least one year. That holds even for men in longstanding monogamous relationships not legally recognized as marriages.

In fact, no one is supposed to give blood if he or she has had sex with someone in a “risk” group in the past year.

Wikipedia has a chart of the rules, for the US and around the world, here.

Blood testing for HIV is quite thorough, and includes antigen tests as well as antibody tests. Logically, with blood from a definitively higher-risk source, there is a marginally higher statistical risk that an infected unit gets through.
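To put that “marginally higher statistical risk” into arithmetic: a standard back-of-envelope model multiplies the rate of new infections in the donor pool by the fraction of the year an early infection sits inside the test’s detection window, before any test can pick it up. The numbers below are hypothetical placeholders for illustration, not actual surveillance or FDA figures:

```python
# Incidence / window-period model of residual transfusion risk.
# Both input values used below are hypothetical, for illustration only.

def residual_risk(incidence_per_year, window_days):
    """Approximate chance a donated unit is infected but still tests
    negative: the donor caught the virus recently enough that
    antigen/antibody tests cannot yet detect it."""
    return incidence_per_year * (window_days / 365.0)

# With identical testing, a donor pool with 100x the incidence
# carries roughly 100x the residual risk.
low  = residual_risk(incidence_per_year=1e-5, window_days=10)
high = residual_risk(incidence_per_year=1e-3, window_days=10)
print(low, high)
```

This is why the policy debate centers on donor-pool risk rather than on the quality of the tests themselves: the testing term is the same for everyone, and only the incidence term differs.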

This gets to be an ethical problem a bit like welcoming refugees –  asking members of the public to take a very small personal risk for a supposed common good.  It has to do with herd effects, like the vaccine debate.  It can invoke the idea of “sacrifice” (like military service).


MSM supposedly have a higher risk of undetected infection because of the herd “chain letter” effect – women cannot give HIV (and perhaps certain other blood-borne viruses) back to men as easily as men can give it to one another.

This epidemiology seemed particularly the case with HIV (originally called HTLV-III). It’s speculative whether it could be the case with something like Zika, or perhaps Hepatitis C. But in early 1983, right-wing elements in Texas tried to use this speculative “sci-fi horror” theory to justify a very draconian anti-gay law, which would have banned gays from most occupations (let alone the military); fortunately it never got out of committee in the Legislature (the Dallas Gay Alliance was busy with this one).

I needed a blood transfusion (one unit) for my acetabular hip fracture after an accident in a convenience store in Minneapolis in January 1998. I would be comfortable receiving a unit if the waiting period were shorter, say 90 days.

I did not give blood after the Pulse attack in Orlando, although I would have been eligible. I live in Virginia; had I lived in Florida, I probably would have. (I had visited the Pulse myself in July 2015.)

But it’s notable that Floridians (women and non-gays) gave blood to save the lives of those not legally able to reciprocate.

In fact, in Russia, some lawmakers tried to use the “unsafe blood” argument as justification for the anti-gay “propaganda” law in 2013. (That Putin sees all speech as “propaganda” is itself troubling.) That sort of thinking presumes people have a natural obligation to share their organs and body parts.

The blood ban was personally embarrassing to me once.  When I was working for USLICO, a company that specialized in part in selling life insurance to military officers, I was approached about a blood drive by another employee in 1993, and he sounded oblivious as to why the request could be problematic.

The blood policy would appear to apply to the Armed Forces, where MSM have been able to serve openly since 2011.  But emergency battlefield transfusions are rare in practice today.

When I was growing up, there was not as much that could be done about many life-threatening diseases, and (apart from local blood drives) organ donations were not talked about much.  Technology has ironically created a moral dilemma about one’s claim to his own organs, and when it’s important to “step up.”

(Published: Friday, September 16, 2016, at 6 PM EDT)