Friday, October 31, 2014

Halloween

We used to turn out the lights and bring in the Pumpkin a bit after 8:00 PM, but this year I kept the operation going until 9:00. We had too much candy, I'm afraid.

Oops, just got three more.

We've noticed a lot of changes from twenty-some years ago, when our kids were in the demographic. Fewer local elementary-school-age kids, for example - probably because the neighborhood has aged. Many more older kids, 11-16, who wouldn't have been caught dead trick-or-treating back then.

Our neighborhood has always gotten a sort of late surge, hordes of kids coming in vans, ten or twelve at a time, not sure from where. It's gotten smaller lately, and less concentrated - a van with five to seven kids instead of ten. Maybe the big vans, which I sort of facetiously imagined being dispatched from candy stores in Juarez, have moved on to more lucrative neighborhoods. Or families and vans have just gotten smaller.

The costumes were pretty good this year. Several kids said they had gotten theirs from the internet.

Seventies Presidents: Minus the Marble Togas

The Invisible Bridge: The Fall of Nixon and the Rise of Reagan, by Rick Perlstein - some views from a reader at sea in the text.

I bought an electronic copy of Rick Perlstein's book, the third of his trilogy on Goldwater, Nixon, and Reagan, because I thought I could use a fast, easy read or two to spell the relatively dense astrophysics textbooks I was working my way through. If it had occurred to me to check the hardcover page count (880 pages), I might have rethought the fast and easy part.

Perlstein is not mainly interested in telling a tale of competing ideologies, much less one of horse races, but rather in drawing a fully fleshed picture of the country. As the blurb puts it:

a dazzling portrait of America on the verge of a nervous breakdown in the tumultuous political and economic times of the 1970s.

Of course I was there in those years, present as an adult voting citizen, or at least as a graduate student who happened to be a military veteran. Military friends of mine were killed in Vietnam, and others were bombed by Reagan at U.C. Berkeley. It's interesting, though, how much I had forgotten. One thing I had forgotten was the quantity of terrorism the country was subjected to at the time, almost all of it by American terrorists with American political objectives. Leftists pursuing their latest crackpot causes, murdering an Oakland school official and a physics grad student, among others. Racists blowing up churches and schools. Soreheads of all political stripes targeting mostly innocent third parties.

Perlstein does not seem to be one of those historians who likes to glorify his Presidential subjects. Of course Nixon was well beyond his Presidential triumphs, such as they were, and working his way toward his paranoid crackup when the book opens, but he still comes off as a particularly contemptible human being. Ronald Reagan is the central figure in the book, and Perlstein works pretty hard to find the man beneath the myth. It's not clear that he entirely succeeds, but Perlstein has a theory, and he makes a persuasive case for it.

Reagan came from a family just barely keeping it together. An alcoholic father and a mother who spent most of her energies outside the home, on church work and an amateur theatrical career, provided only minimally. In his earliest photographs, Perlstein claims, Reagan always looked lost, the image of a bookish and nerdy outsider. At some point in his early teens, though, there is a dramatic transformation in the pictures. He moves to center stage, shows a dominating presence, and learns to present himself so as to show off his striking good looks. From that time on, he was always the hero of the personal drama he learned to construct about himself.

He had lots of friends, but very few claimed to really know him. His children, perhaps especially, always felt like outsiders. His son Michael, for example, reported that when Reagan gave the graduation speech at his high school, Reagan shook his hand and introduced himself, seeming not to recognize him. His daughter claims that he concealed her mother's abusiveness, which apparently included regular beatings, and ordered her not to talk about them. All of his children seem to have tales of neglect or abuse, and most have struggled to build lives of their own - something probably true of a lot of non-political Hollywood children as well.

His presidential rivals don't fare much better. Ford became perhaps unfairly famous for physical stumbles, but he is portrayed as not too bright and far from principled.

Jimmy Carter has spent much of his post-Presidential career running for the office of national saint, but Jimmy Carter the candidate gets the Perlstein treatment too. Behind the much-practiced political smile was a mind both devious and, when convenient, dishonest. Carter presented himself as a man who would be all things to all people: nuclear physicist, farmer, good old Southern boy, and early advocate of black civil rights. In fact, he wasn't really quite any of these things - engineer rather than physicist, warehouseman rather than farmer, military officer more than good old boy, and a man with a rather ambiguous record on civil rights.

Artificial Intelligence and Artificial Stupidity

Tesla CEO and famous technology innovator Elon Musk has repeatedly warned about AI threats. In June, he said on CNBC that he had invested in AI research because “I like to just keep an eye on what's going on with artificial intelligence. I think there is a potential dangerous outcome there.” He went on to invoke The Terminator. In August, he tweeted that “We need to be super careful with AI. Potentially more dangerous than nukes.” And at a recent MIT symposium, Musk dubbed AI an “existential threat” to the human race and a “demon” that foolish scientists and technologists are “summoning.” Musk likened the idea of control over such a force to the delusions of “guy[s] with a pentagram and holy water” who are sure they can control a supernatural force—until it devours them. - Adam Elkus in Slate.

Elkus devotes his article to a lamebrained attack on Musk's fears. I say lame because Elkus fails to engage Musk on substance, or even to grasp the nature of the perceived threat. Mostly he just worries that Musk's expressions of concern will hurt funding for AI research - Doh! - and mumbles incoherently about Skynet. He also, quite absurdly, accuses Musk of being technopathic - a disorder apparently characterized by an unjustified fear of technological change. His article concludes with the following feel-good drivel:

If Musk redirected his energies and helped us all learn how to understand and control intelligent systems, he could ensure that the technological future is one that everyone feels comfortable debating and weighing in on. A man of his creative-engineering talents and business instincts surely could help ensure that we get a Skynet for the John and Sarah Connors of the world, not just the Bay Area tech elites.

By focusing on the science-fiction threat of AI, and failing to engage even that, Elkus ignores all the real threats that AI poses. One such threat is its takeover of many jobs formerly done by humans. It's sometimes assumed that this happens only when the computer gets as smart at the job as the human it replaces. That's far from the case, as anyone who has dealt with phone-menu hell knows from personal experience. For the employer, very minimal performance may be perfectly acceptable if the price is right. The phone menu replaces reasonably competent humans, who can occasionally exhibit common sense, with minimally competent robots with zero common sense, producing a sort of artificial stupidity.

Even Skynet was an example of this kind of artificial stupidity. Its job was to protect the environment of Earth, presumably for the benefit of its inhabitants, but it apparently failed to understand the "presumably" part, while noticing that the big threat to the environment was those self-same inhabitants.

My internet went out for a few days recently, perhaps as a result of a much shorter-lived neighborhood power failure. Patient navigation of phone-menu hell - if you can call cursing and snarling "patient", that is - finally got me to a message saying that the provider knew my service was out and that their technicians were working on it, with no prognosis for a fix.

Of course no internet is a major catastrophe these days. It turns out that my family no longer owns a physical phone book, for example. Fortunately I have this "smart phone", right? Well, it seems that I didn't understand my new, one-day-old smartphone very well, since it proved equally inept at reaching the internet. It took me perhaps a day to realize that my phone, in its AI wisdom/stupidity, passed all calls to the internet through my still-working home WiFi. It was smart enough to realize that this could save me data/money, but not smart enough to figure out that this was a bad strategy when my WiFi modem could not connect to the internet.

These kinds of artificial stupidity are usually just a minor nuisance, Skynet notwithstanding, but they illustrate some of the folly of depending, Elkus fashion, on all of us learning "how to understand and control intelligent systems." Our brains aren't designed for that. A substantial part of my career was spent working with AI systems and trying to apply them to real-world problems, and I still failed the simple test provided by my "smart" phone. Even a genius like Musk knows that he has problems controlling AI systems.

Thursday, October 30, 2014

The LA Ghetto

The liberal arts, or education appropriate to a free person, originally consisted of the Trivium (grammar, rhetoric, and logic) and the Quadrivium (arithmetic, geometry, astronomy, and music). As knowledge and universities became more specialized, the liberal arts continued at first to encompass much of the curriculum, including science and mathematics.

Various forces seem to have propelled more and more studies into science, engineering, business, fine arts, and other distinct colleges, while what was left of the traditional liberal arts became ghettoized into a Humanities college - a mini-university of people who counted on their fingers, often with only history, literature, philosophy, and some fragments of language studies as their subject matter, sometimes lumped a bit uncomfortably with social sciences like anthropology and psychology.

Naturally this loss of centrality has left the liberal arts professoriate angry, disoriented, and perhaps dazed and confused. At a time when students are graduating from school with mountains of debt, they and their parents want to believe that they are learning something relevant. It's sort of clear that a degree in engineering, accounting, or even mathematics prepares you for a potentially middle-class job, but what about that liberal arts degree? If one is lucky, a literature or history degree, plus some secondary-ed courses, might get you a job teaching high school. And what the heck is an art history degree good for? Unpaid work as a museum docent?

"We teach critical thinking," respond the professors of liberal arts. Pish posh, I say. If you want to learn critical thinking, you might try statistics.

I might sound like I'm picking on the liberal arts types. Could my skepticism be nothing more than the old-time grudge of a science nerd who remembers the philosophy majors taking bong hits, discussing Camus, and getting laid while we, the citizens of geekdom, labored at interminable problem sets? Of course not! For one thing, dope didn't invade the campus big time until I was a grad student.

Actually, I would like to defend the worth of the liberal arts, even if they did let the crucial disciplines of math, science, and rhetoric slip from their grasp. Literature and history are about story and narrative, and story and narrative form the core of human thought - even in math and science. The first female winner of the Fields Medal, the premier accolade in mathematics, mentioned that her childhood ambition had been to be a novelist - though now the characters in her narratives were mathematical objects. The mathematician's pen, like that of the poet, captures:

...the forms of things unknown, ... Turns them to shapes and gives to airy nothing / A local habitation and a name. / Such tricks hath strong imagination... - WS, A Midsummer Night's Dream, Act V, Scene 1.

Literature and History remain the central domain of narrative in its purest and original form. That's not a trivial role.

A very superior student I know, like many, had trouble making up his mind about a major, so he spent a couple of years in the University's honors program before picking a more specialized one. I asked him what he learned there.

"How to write a five page essay," he replied.

A very non-trivial skill.

Tuesday, October 21, 2014

Losing It

The fifties, sixties, and early seventies were years of sky high top bracket income taxes, strong economic growth, and rapid improvement in median American incomes. They were also the last years of Democratic Party political dominance. Sometime between then and now, the Democrats lost the confidence of the American working class, especially working class men. How so?

Well, of course, it's complicated. Wars, race, oil shocks, terrorism all played their parts.

Political parties tend to be dominated by elites. The Great Depression and its economic questions, and the struggle against Fascism and Communism, had dominated the Democratic Party and its elites for a generation, but by the sixties and seventies social issues had come to the fore. Social issues have dominated the mind of the party elite ever since: civil rights, feminism, abortion, immigration, gay rights. Enormous progress has been made on many of these issues, in the law and in public opinion, but these wars were never the wars of many white working men, and some have threatened them very directly.

Meanwhile, the Democratic Party supped at the tables of the Wall Street and corporate lobbyist fat cats, and the position of the American worker steadily deteriorated. Of course a lot of that deterioration is a direct result of the Reagan and Bush policies of tax breaks for zillionaires, but voters have plenty of reason to be unhappy with the flabby defense provided by the Democrats too.

White Privilege

I caught Jon Stewart's interview with Bill O'Reilly the other day. Jon tried to get O'Reilly to admit that "White Privilege" existed in the US. O'Reilly handed him his head, incidentally demonstrating why Stewart would have been a terrible, awful, no good choice for news show host. Stewart is a wonderful satirist, and a very funny guy when interviewing his show business buds, but in trying to conduct a semi-hostile interview with O'Reilly he wound up looking like the befuddled pothead he described himself as being in his college years, while O'Reilly looked like the skilled debater he is.

Part of Stewart's problem was that he saddled himself with the dumb formulation "white privilege", which is a lame and inaccurate way of describing the sorts of obstacles blacks face in America. A privilege, says my online dictionary, is:

...a special right, advantage, or immunity granted or available only to a particular person or group of people.

That's not a good description of the advantages that whites enjoy today in America. Whites, by virtue of being white, don't get to go to the head of the line, pay lower taxes, or enjoy immunity from prosecution for crimes. The legal privileges were essentially removed by the Civil War, and many socially enforced informal "privileges" have been attacked by a series of civil rights laws.

None of which contradicts the undeniable fact that blacks, on average, are much poorer economically, have less access to good neighborhoods and schools, are far more likely to be targeted (and shot) by police, and are subject to a lot of destructive discrimination. These are huge factors in American life, but have little or nothing to do with any "privilege." Rather, they have a lot to do with history and circumstance.

Rhetoric isn't a strong point with Stewart. If he wants to joust with the likes of Bill O'Reilly, he needs to be much better prepared, and he needs to take his thinking beyond slogans to details.

Sunday, October 19, 2014

Progress in Fusion Energy

Lockheed engineers recently claimed to have a breakthrough idea for a fusion reactor. Cheap, plentiful fusion power is only 20 years away!

Cynical observers familiar with the long history of fusion power research contained their enthusiasm. Fifty years or so ago, when peaceful fusion power had already frustrated the expectations of its devotees for a while, a prescient wit observed that fusion power was fifty years away - and always would be. If that's been updated to twenty years away - with the same qualification - that might be considered progress of a sort.

The possibility of extracting energy from nuclear fusion was first understood in the 1920s, and the great astrophysicist Eddington quickly appreciated that such fusion must power the stars. The Wikipedia article on fusion power notes that the first fusion reactor patent dates to 1948.

Getting hydrogen atoms to fuse to form helium is a multi-step process, but the basic idea is simple: get two protons close together and they can fuse to form deuterium (plus a positron and a neutrino). A couple more steps get one to helium. The hard part is getting those protons really close together - about 10^-13 cm apart. It's hard because protons carry electrical charge and consequently repel each other.

It's a pretty easy lab experiment though - just accelerate protons to a few tens of thousands of electron volts (or so) and slam them into a proton rich target, and some tiny fraction will fuse in one of the stages of helium formation. Stars manage the feat by getting their internal temperatures up to about 15 million K. At that temperature, an incredibly tiny fraction of their protons will react. Because stars are really big - about 24,000 Earth masses, minimum - it takes a while for the energy released in fusion to leak out, and enough remains behind long enough to keep the temperature up and the reaction going.
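A back-of-the-envelope calculation, my own illustration of the numbers above, shows why that reacting fraction is so incredibly tiny: the Coulomb barrier for two protons at about 10^-13 cm is roughly a thousand times the mean thermal energy at 15 million K, so fusion relies on rare fast protons and quantum tunneling.

```python
e = 1.602e-19        # proton charge, coulombs
k_coulomb = 8.988e9  # Coulomb constant, N m^2 / C^2
k_B = 1.381e-23      # Boltzmann constant, J / K

r = 1e-15            # 10^-13 cm expressed in meters
T = 1.5e7            # ~15 million K, solar core temperature

# Electrostatic potential energy of two protons at separation r
barrier_keV = (k_coulomb * e**2 / r) / e / 1e3

# Mean thermal kinetic energy, (3/2) k_B T
thermal_keV = (1.5 * k_B * T) / e / 1e3

print(f"Coulomb barrier:      ~{barrier_keV:.0f} keV")
print(f"Mean thermal energy:  ~{thermal_keV:.2f} keV")
print(f"Ratio:                ~{barrier_keV / thermal_keV:.0f}x")
```

This is also why the lab shortcut works: a few tens of keV from an accelerator already puts a proton within tunneling range of the barrier, no 15-million-degree plasma required.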

Aside from stars, we really only know one way to keep the fusion energy from leaking out too fast to sustain thermonuclear fusion - take a big mass of hydrogen isotopes and drastically compress it to super high temperature with the blast of X-rays from a nuclear bomb. At the super high temperatures and densities achieved, a whole lot of fusion takes place before it has time to blow itself to smithereens.

Neither the gravitational confinement of the star nor the inertial confinement of the thermonuclear bombs can be domesticated on human scale, so other means of confinement must be sought. Such efforts have been failing now for about 2/3 of a century.

Friday, October 17, 2014

Training an Army to Fight ISIS

It will take years, they say.

Let me provide a clue: the US has demonstrated again and again that it is absolutely incompetent at training one side of a civil war. Vietnam, Iraq, Afghanistan. Like all our other proxy armies, they will show up to get their paychecks until they are faced with actual combat, at which point they will flee like mice before a cat.

My guess is that the root problem is that we are selling something that they aren't buying, but it probably doesn't matter.

Monday, October 13, 2014

Lumo Confuses Himself About Probability - Again

Unfortunately I must report that the Blogfather has once again gotten himself dizzy while chasing his tail in yet another futile attempt to make his version of Statistical Mechanics make sense. This time he decides that suitable mutilation of the concept of probability will do the trick:

But what's important to notice is that the meaning of the probability always refers to the situation

a property is unknown/blurred at t = 1 − ϵ; it is well-known/sharp at t = 1 + ϵ

The two signs simply cannot be interchanged. The very meaning i.e. the right interpretation of the wave function or the phase space probability distribution is in terms of probabilities so the time-reversal-breaking quote above is inevitably understood in every and any discussion about probability distributions and wave functions.

He is trying to tell us that probability only applies to prediction of the future - though the stuff about wave functions and quantum mechanics is thrown in mostly for obfuscation. Oddly enough, he forgets that one of the more familiar roles of probability in particle physics - his subdiscipline - is in retrodiction. Suppose one is measuring the cross section for a particular type of event as a function of energy, and one notices more events at one energy than at the rest. The question one then wants to answer is: what is the probability that the current state resulted from the various possible or imaginable states in the past? Is the bump in your curve a fluctuation, or does it indicate a resonance producing a real change in the cross section?
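That retrodictive question can be made concrete with a toy calculation - mine, not Lumo's. Given an expected background of mu events in one energy bin, the chance of seeing an excess as large as the observed count n purely by fluctuation is the Poisson tail probability. The specific numbers below are invented for illustration.

```python
import math

def poisson_tail(n, mu):
    """P(N >= n) for N ~ Poisson(mu), by direct summation of the CDF."""
    cdf = sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n))
    return 1.0 - cdf

mu = 10.0  # expected background events in the bin (assumed)
n = 25     # observed events in the bin (assumed)

p = poisson_tail(n, mu)
print(f"P(N >= {n} | mu = {mu}) = {p:.2e}")
```

A tiny tail probability is evidence (though not proof) that the bump is a real resonance rather than a fluctuation - which is exactly a probability statement about how the present data came to be.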

Probability is a more versatile concept than is imagined in the Lumonator's world. We use it in a far more time-symmetric fashion than he imagines. Of course all these ideas, including the unitary evolution of the quantum state, have been discussed by more subtle minds than mine or Lumo's. He really ought to read some of them.

Zombies and Vampires

Zombies and Vampires are a major infestation in television and even movies. How should we explain the popularity of this sort of ancient pagan mythology in a supposedly scientific age? Aside from being cheap to film, these tribal dramas must tap into our primal psychology somehow.

The zombie and the vampire are typically contagious and evil. A suggestion that caught my eye is that zombies metaphorically represent the poor and vampires the rich. Each threatens, preys upon, and lusts after the human - representing, perhaps, the middle class. In an age when middle-class membership is increasingly fragile, with status heavily dependent on circumstances seemingly beyond individual control, with poverty an illness or layoff (zombie bite) away, many may crave the vampire bite (lottery ticket?) that offers a way out.

Thursday, October 09, 2014

Entering and Leaving - Hong Kong Style

Tyler Cowen looks at elevator and subway leaving protocol in Hong Kong and makes just the kind of (IMHO) spectacularly dumb Libertarian/Coasean analysis I would expect:

I’ve noticed in Hong Kong that exiters are not accorded absolute priority. That is, those entering the elevator can push their way through before the leavers have left, without being considered impolite, unlike in the United States. In part, Hong Kongers are in a hurry, but that does not itself explain the difference in customs. After all, exiters are in a hurry too, so why take away their priority rights? Perhaps we should look again to Coase. If some people who wish to enter are in a truly big hurry, they can barge forward. Furthermore, an exiter who is not in a hurry at all can hold back, knowing that someone will rush to fill the void, rather than ending up in the equilibrium of excessive politeness where each defers to the other and all movements are delayed. That is not an equilibrium you see often in downtown Hong Kong.

There is another positive effect from the Hong Kong method. If you will be exiting the elevator, you have to step forward early on and be ready to leave promptly, to avoid being swamped by the new entrants. That means the process of exit takes place more quickly. And so the entrants who are in a hurry actually do get on their way earlier than would otherwise have been the case.

#smallstepstowardamuchbetterworld

From: http://marginalrevolution.com/marginalrevolution/2014/10/should-everyone-leave-the-elevator-subway-car-before-others-try-to-enter.html

Nonsense, I say. The trouble with competitive exiting is turbulence - everybody is slowed down when the fluid particles, er, people, try to push past each other. The point of competitive exiting has nothing to do with efficiency and everything to do with helpless pawns in the economy expressing their frustrations by pushing other people around. I think of it as degeneracy pressure, human style.

Wednesday, October 08, 2014

Tribal Shibboleths

Astrophysics is encrusted with odd conventions that serve as a monument to history, tradition, and obsolete measurement technologies. The magnitude system, for example, is, as one of my astronomy professors put it, "so irrational that it serves just to keep the physicists out." Of course there is the matter of a couple thousand years of history, plus more modern questions of measurement, that keep it in place. A related notion is the measurement of distances in parsecs (and kiloparsecs and megaparsecs) rather than the much more rational (and covariant-minded) light years or even meters. Add to that a vast amount of idiosyncratic notation and convention, as well as plenty of necessary purely descriptive knowledge, and the physicist wonders if he is dealing with physics or stamp collecting, in Rutherford's metaphor. The answer, of course, is both.
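For the physicists kept out by the magnitude system: it is logarithmic and inverted, with a difference of 5 magnitudes defined as a flux ratio of exactly 100, so each magnitude is a factor of 100^(1/5) ≈ 2.512. A minimal converter, my own sketch rather than anything from an astronomy library:

```python
def flux_ratio(m1, m2):
    """Flux ratio f1/f2 given magnitudes m1 and m2 (brighter = smaller m)."""
    return 10 ** (0.4 * (m2 - m1))

PC_IN_LY = 3.2616  # one parsec in light years (approximate)

print(flux_ratio(0.0, 5.0))  # 5 magnitudes fainter = 100x less flux
print(10 * PC_IN_LY)         # 10 parsecs expressed in light years
```

The parsec itself encodes a measurement technique: it is the distance at which one astronomical unit subtends one arcsecond of parallax, which is why it survives despite light years being friendlier to relativistic thinking.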

Astronomy has been and remains more of an observational than experimental science.

One of the more peculiar traditions of astronomy is referring to all elements but hydrogen and helium as "metals". I have no idea where that came from. Any ideas?