## Friday, December 27, 2013

### Technocracy. It's new.

No. It's not.

I've read a number of recent books and articles about how technology, particularly computers and robots, will change everything and create a bipartite society where, in compact form, "there will be those who tell computers what to do, and those who are told what to do by computers." (As a computer engineer, I sort of approve of this message. :-)

This idea of a bipartite society with a small elite lording over the undifferentiated masses is not new (really, not new at all). That it's a result of technology instead of divine intervention or application of force is also not new, but, since most people have an "everything that happened before me is irrelevant because my birth was the most important event in the totality of space-time" attitude towards the past, this is ignored.

There are a few reasons contributing to the popularity of this idea:

It's mostly right, and in a highly visible way. Technological change makes life harder for those who fail to adapt to it. In the case of better robotics and smarter computers, adaptation is more difficult than it was for other changes like the production line or electricity. One way to see this is to see how previously personalized services have been first productized (ex: going from real customer service representatives to people following an interactive script on a computer) and then the production processes were automated (ex: from script-following humans to voice-recognition speech interfaces to computers). Technological change is real, it's important, and it's been a constant for a long time now.

(Added Jan. 7, 2014: Yes, I understand that the economics of technology adoption have a lot to do with things other than technology, namely broader labor and economic policies. I have teaching exercises for the specific purpose of making that point to execs and MBAs. Because discussion of these topics touches the boundaries of politics, I keep them out of my blog.)

It's partially wrong, but in a non-obvious way. People adapt and new careers appear that weren't possible before; there are skilled jobs available, only the people who write books/punditize/etc don't understand them; and humans are social animals. The reason why these are non-obvious, in order: it's hard to forecast evolution of the use of a technology; people with "knowledge work" jobs don't get Mike Rowe's point about skilled manual labor; most people don't realize how social they are.

(On top of these sociological reasons there's a basic point of product engineering that most authors/pundits/etc don't get, as they're not product engineers themselves: a prototype or technology demonstrator working in laboratory conditions or very limited and specific circumstances is a far cry from a product fitting with the existing infrastructure at large and usable by an average customer. Ignoring this difference leads authors/pundits/etc to over-estimate the speed of technological change and therefore the capacity of regular people to adapt to it.)

Change sells. There's really a very small market for "work hard and consume less than you produce" advice, for two reasons. First, people who are likely to take that advice already know it. Second, most people want a shortcut or an edge; if all that matters is change, that's a shortcut (no need to learn what others have spent time learning) and gives the audience an edge over other people who didn't get the message.

It appeals to the chattering classes. The chattering classes tend to see themselves as the elite (mostly incorrectly, in the long term, especially for information technologies) and therefore the idea that technology will cement their ascendancy over the rest of the population appeals to them. That they don't, in general, understand the technologies is a fact that escapes them.

It appeals to the creators of these technologies.
Obviously so, as they are hailed as the creators of the new order. And since these tend to be successful people whom some/many others want to understand or imitate, there's a ready market for books/tv/consulting. Interestingly enough, most of the writers, pundits, etc, especially the more successful ones, are barely conversant with the technical foundations of the technologies. Hence the constant reference to unimportant details and biographical information.

It appeals to those who are failing. It suggests that one's problems come from outside, from change that is being imposed on them. Therefore failure is not the result of goofing off in school, going to work under the influence of mind-altering substances, lack of self-control, the uselessness of a degree in Narcissism Studies from Givusyourstudentloans U.  No, it's someone else's fault. Don't bother with STEM, business, or learning a useful skill. Above all, don't do anything that might harm your self-esteem, like taking a technical MOOC with grades.

It appeals to those in power. First, it justifies the existence of a class of people who deserve to have power over others. Second, it describes a social problem that can only be solved by the application of power: since structural change creates a permanent underclass, not by their fault, wealth must be redistributed for the common good. Third, it readily identifies the class of people who must be punished/taxed: the creators of these technologies, who also create new sources of wealth to be taxed. Fourth, it absolves those in power from responsibility, since it's technology, not policy that is to blame. Fifth, it suggests that technology and other agents of change should be brought under the control of the powerful, since they can wreak such havoc in society.

To be clear, technology changes society and has been doing so since fire, the wheel, agriculture, and writing – skipping ahead – the printing press, systematic experiments, the production line, electricity, DNA testing, selfies... The changes these technologies have brought are now integrated in the way we view the world, making them so "obvious" that they don't really count. Or do they? Maybe "we" should do some research. If these changes were obvious, certainly they were accurately predicted at the time. Like "we" are doing now with robots and AI.

You can find paper books about these changes on your local sky library dirigible, which you reach with your nuclear-powered Plymouth flying car, wearing your metal fabric onesie with a zipper on your shoulder, right after getting your weekly nutrition pill. You can listen to one of three channels bringing you music via telephone wires, from the best orchestras in Philadelphia and St. Louis while you read.

Or you can look up older predictions using Google on your iPhone, while you walk in wool socks and leather shoes to drink coffee brewed in the same manner as in 1900. The price changed, though. It's much cheaper to make, but you pay a lot more for the ambiance.

## Saturday, December 21, 2013

### Books I read in 2013

At the beginning of 2013 I decided to keep a book log (including Audible audiobooks). These are the non-work books I read in 2013, by author. Some are re-readings, and there's still enough time for a few more. I'll be adding notes later.

✏ Chris Anderson: Makers: The New Industrial Revolution

✏ Julian Assange et al.: Cypherpunks: Freedom and the Future of the Internet

✏ Walter Bagehot: Lombard Street: A Description of the Money Market (reread; free)

✏ Albert-Laszlo Barabasi: Bursts: The Hidden Pattern Behind Everything We Do

✏ Gregory Benford: Foundation's Fear (reread on Dec 31st.)
I'm impressed by how careful Prof. Benford is to make sure that none of his personal feelings about being an academic in America comes across in his SciFi writing. The Foundation series is a good illustration of the preachiness and neoteny of most science fiction; it's mostly amateur sociology with minimal exploration of the real changes that technology creates. As a former aficionado, I have some residual interest in the genre, but you get better futurism from a McKinsey or Bain conjectural report than from most SciFi, even Cyberpunk.
✏ Gregory Benford and Larry Niven: Bowl of Heaven
The only new sci-fi book I read this year. Hard sci-fi took a hit after 2000, when some authors decided to join the culture wars and write metaphors for the American political system.
✏ David Brin: Earth (reread)
In July I decided to reduce the amount of stuff I owned, so I replaced a number of paper books with electronic copies. This led to an assessment of which sci-fi books I wanted to reread; Earth, Brin's best book in my opinion, was one of them. Some of the other sci-fi books replaced by eBooks are mentioned below. In the end, I donated or recycled almost two thousand paper books.
✏ Sean Carroll: The Particle at the End of the Universe: How the Hunt for the Higgs Boson Leads Us to the Edge of a New World

✏ Phillip Dennis Cate et al.: Impressionists on the Water (FAMSF Exhibition Catalog)

✏ Arthur C Clarke: Childhood's End (reread)

✏ Daniel Dennett: Intuition Pumps and Other Tools for Thinking

✏ Edward Dolnick: The Forger's Spell
Few things describe the arts world as precisely as the end of chapter 11: "Van Meegeren fooled the world with a seventeenth-century painting made of plastic."
✏ Niall Ferguson: Civilization: The West and the Rest

✏ Niall Ferguson: The Great Degeneration
Like Civilization, you can get most of the content of the book from Niall Ferguson's talks. But I wanted the notes and details so I read the books.
✏ Seth Godin: The Icarus Deception

✏ Rose-Marie Hagen and Rainer Hagen: Masterpieces in Detail (Art book)

✏ Chip Heath and Dan Heath: Decisive: How to Make Better Choices in Life and Work

✏ Robert Heinlein: The Cat Who Walks Through Walls (reread)

✏ Daniel Kahneman: Thinking, Fast and Slow (reread)

✏ Walter Lewin: For the Love of Physics
As autobiographies of scientists go, this one is more educative than the Feynman pair (Surely You're Joking, Mr. Feynman! and What Do You Care What Other People Think?). Lewin is a superstar Physics professor from MIT, who would be the first to say that students learn Physics only when they solve the problem sets, not in the lectures.
✏ William Manchester and Paul Reid: The Last Lion: Winston Spencer Churchill: Defender of the Realm, 1940-1965
Vol. III of Manchester's biography of Churchill, written by Reid based on Manchester's notes. Hard on the French. Read in one day plus two evenings. 1200 pages, but the last 130 are notes and references.
✏ Michael Moss: Salt Sugar Fat: How the Food Giants Hooked Us

✏ Larry Niven and Jerry Pournelle: Lucifer's Hammer (reread)
One of my favorite sci-fi books (and the only Pournelle to make my top 10). I reread parts of it often (notes and highlights help). I bought it 5 times in different languages and formats.
✏ Donald Norman: The Design of Everyday Things: Revised and Expanded Edition (added Dec 25)
There are enough changes from the previous edition to merit purchasing it anew, but in my case I get the added benefit of moving from a paper edition to a Kindle book, reducing the need for physical storage space. Subsumes Living With Complexity as well.
✏ Iain Pears: The Bernini Bust (reread)

✏ Iain Pears: The Titian Committee (reread)

✏ Terry Pratchett and Neil Gaiman: Good Omens (reread)

✏ Mark Sisson: The Primal Blueprint
Good book, though I wouldn't want to give up resistant starch altogether and the high-impact exercise recommendation is better ignored. But worth reading as motivation for life changes.
✏ Benn Steil: The Battle of Bretton Woods: John Maynard Keynes, Harry Dexter White, and the Making of a New World Order

Reread for the 10th or 20th time, despite being over 1000 pages long (read it in one very long reading marathon when it came out); possibly Stephenson's best book.
✏ Nassim Nicholas Taleb: Antifragile: Things That Gain from Disorder
Best non-fiction book I read in 2013. I think I'll be rereading my highlights and notes for years to come. Sometimes NNT's style may be a little over the top, but the substance is worth it.
✏ Barbara Tuchman: The Guns of August (reread on Remembrance Day)

✏ Barbara Tuchman: The March of Folly

✏ Mark Twain: The Innocents Abroad
Mark Twain takes on tourism, Americans, and foreigners. For some reason I had never read it before. It's available for free, since it predates the Mickey Mouse copyright rules.
✏ Lea van der Vinde et al.: Girl with a Pearl Earring: Dutch Paintings from the Mauritshuis (FAMSF Exhibition Catalog)

✏ Ingo Walther and Norbert Wolf: Masterpieces of Illumination: The World's Most Famous Manuscripts 400 To 1600 (Art book)

✏ Evelyn Waugh: Brideshead Revisited (reread)
It's a book about class, friendship, religion, and growing up. The movie was a distortion of the book as bad as Starship Troopers was of the Heinlein original; the ITV series was acceptable, but the writing itself is a major part of the value of the book, and cannot be appreciated from video.
✏ Evelyn Waugh: Sword of Honor (reread)

✏ Evelyn Waugh: Vile Bodies (reread)

✏ P.G. Wodehouse: Big Money (reread)

✏ P.G. Wodehouse: Carry On Jeeves (reread)

✏ P.G. Wodehouse: The Code of the Woosters (reread, for the 20th time or so...)

✏ P.G. Wodehouse: A Man of Means
Found a Wodehouse I hadn't read before. The year was worth it. Huzzah!
✏ P.G. Wodehouse: Mulliner Nights (reread)

✏ William Zinsser: On Writing Well (reread)
Best book on writing ever, IMNSHO. I reread parts of it often; read the whole book at least once a year; and reread my notes about it before starting any writing project. Technically it's a work book for me, but I like to read it for pleasure as well.

The secret to reading this many books: watching very little television. Most of these books take only a few hours to read (though some may take a lot more), so an evening or two without television is enough to read a book. By that metric, I read a lot less than my potential, and that's not considering the multitasking afforded by audiobooks during walks or repetitive exercise like Concept II rowing.

## Sunday, December 8, 2013

### How strong must evidence be to reverse belief?

I've seen this quote attributed to Carl Sagan and to Christopher Hitchens, but I think Rev. Thomas Bayes may have called dibs on it a few centuries ago:

"Extraordinary claims demand extraordinary evidence."

As I've written before [at length, poorly, and desperately in need of an editor], I find the attitude of most people who use this phrase counterproductive. But instead of pointlessly arguing back and forth like they do in certain disciplines, we'll dig into the numbers involved and see what we can learn.

I know a super-genius, an Einstein-grade mind, who for decades believed that "this is the year that the Red Sox will come back and start a long series of victories," a belief unfounded in reality.

Yes, very smart people can have strong beliefs that appear nonsensical to others.

Let's say that the claim is about some proposition ("God exists," "Red Sox are a great team") which we'll call $G$. The prior belief in $G$ we'll denote $q \doteq \Pr[G]$; so a person may be a strong believer if $q = .99$ or a moderate believer if $q = .80$.

Let's call the evidence against $G$, $E$, which is a binary observable ("no Rapture", "loss against the Chicago Cubs"), with the probability of observing the evidence given that $G$ is false denoted by $p \doteq \Pr[E| \neg G]$. We'll consider evidence that has symmetric error probabilities, $\Pr[E|G] = 1 - \Pr[E|\neg G]$, so the probability that we get a false positive is equal to that of a false negative, $1-p$.

For example, if $p=0.90$ there's a 10% chance of no Rapture even if God exists; if $p = 0.99$, then there's only a 1% probability of the faithful burning up in Hellfire on Earth with the rest of us sinners, when there is a God. Note that with symmetric errors, $p=0.90$ has the interesting characteristic that there's a 10% chance of Rapture with no God at all, which probably would say something about the design of the experiment (psilocybin would be my guess).

Now this is the question we want to ask: for any given prior belief $q$ in $G$, how good would the evidence against it have to be (meaning how big would $p$ have to be) to convince the believer to flip her beliefs, i.e. to believe against $G$ with the probability $q$, or formally, to have $\Pr[G|E] = 1-q$. The reason to go for a flip of beliefs is empirical: no zealot like a convert.
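Filling in the algebra behind the numbers, using the definitions above: Bayes' rule with symmetric errors gives

$$\Pr[G|E] = \frac{(1-p)\,q}{(1-p)\,q + p\,(1-q)},$$

and setting this equal to $1-q$ and solving for $p$ yields

$$p = \frac{q^2}{q^2 + (1-q)^2}.$$

For $q = 0.99$ this is $p = 0.9801/0.9802 \approx 0.9999$.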

(Really trying to goose up page views here. Was it Stephen Hawking who said a book's potential audience is halved by each formula in it? This blog must be down to individual quarks...)

For example, if JohnDCL believes in the greatness of the Red Sox with $q= 0.99$, how strong a piece of evidence of Sox suckage would be necessary for JohnDCL to think that the probability of the Red Sox being great is only 1%?

The result is $p \approx 0.9999$, in other words, JohnDCL would have to believe that the evidence only gives a false positive (it's evidence against $G$, remember) once every 10,000 tries.

Let's say the evidence is losing against the Chicago Cubs. For JohnDCL to flip his beliefs based on observing such a defeat, he'd have to believe that, were the Sox a great team, they could play the Cubs 10,000 times and lose only once. (Recall that we're assuming symmetric errors, for simplicity.)

Here are a few other values for $q$ and corresponding $p$:

$q = 0.999 \Rightarrow p \approx 0.999 999$ one false positive in a million tries;

$q = 0.9999 \Rightarrow p \approx 0.999 999 99$ one false positive in one hundred million tries;

$q = 0.99999 \Rightarrow p \approx 0.999 999 999 9$ one false positive in ten billion tries;

$q = 0.999999 \Rightarrow p \approx 0.999 999 999 999$ one false positive in one trillion tries.
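A hedged check of this table: a few lines of Python (my sketch, not from the post) that solve $\Pr[G|E] = 1-q$ for $p$ using Bayes' rule with the symmetric-error assumption.

```python
# Belief-flip threshold: with symmetric errors, Pr[E|G] = 1-p and
# Pr[E|~G] = p, so Bayes' rule gives
#   Pr[G|E] = (1-p)q / ((1-p)q + p(1-q)).
# Setting Pr[G|E] = 1-q and solving for p yields the closed form below.
def flip_threshold(q):
    return q**2 / (q**2 + (1 - q)**2)

for q in (0.99, 0.999, 0.9999, 0.99999, 0.999999):
    p = flip_threshold(q)
    # 1/(1-p) reads as "one false positive in N tries"
    print(f"q = {q}: p = {p:.12f}, one false positive in ~{1/(1-p):,.0f} tries")
```

The counts come out at roughly 9,800, 998,000, and so on, which round to the one-in-10,000 and one-in-a-million figures above.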

(How strong is faith in the Red Sox? In God? In Quantitative Easing Forever And Ever?)

In other words, it's true that to reverse a strong belief you need extraordinary evidence. What is equally true is that the beliefs and the evidence aren't perceived equally by all participants in a conversation. People who proselytize for a cause will not be able to convince anyone else until they see the probabilities from the other person's point of view.

Of course, those who say "extraordinary claims demand extraordinary evidence" typically see the world from their own point of view only.

## Friday, December 6, 2013

### Word salad of scientific jargon

"The scientists that I respect are scientists who work hard to be understood, to use language clearly, to use words correctly, and to understand what is going on. We have been subjected to a kind of word salad of scientific jargon, used out of context, inappropriately, apparently uncomprehendingly." – Richard Dawkins, in the video Dangerous Ideas - Deepak Chopra and Richard Dawkins, approximately 27 minutes in.

That's how I feel about a lot of technical communications: conference panels, presentations, and articles.  An observed regularity: the better the researchers, the less they tend to go into "word salad of scientific jargon" mode.

## Thursday, December 5, 2013

### Identifying the problem: innumeracy or science ignorance?

In previous posts I said that many people who believe in Science™ (as opposed to people who know science) can't answer simple questions, like "what is the kinetic energy of a 2-ton SUV going 65MPH?"

An insightful person suggested that the problem might be due to innumeracy (which is bad in itself; read the linked book) so here's another version that requires no computation: which has more kinetic energy, the aforementioned SUV or a 1-ton car going 130MPH?

A sample of three people who believe in Science™ showed 100% inability to answer with explanation. (Explanation is necessary because a random pick will be right about half of the time.) Two of three picked the wrong answer (SUV) and the third "felt" the car was the right answer.

The question can be answered without calculation, as long as one knows how mass and speed relate to kinetic energy. We're not talking advanced science here: this used to be taught in the ~~seventh~~ ninth grade.
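A minimal numeric check (my sketch; the post gives only the reasoning), assuming "ton" means a US short ton of about 907 kg, which matches the 766 kJ figure quoted for the SUV in a later post:

```python
# KE = (1/2) m v^2: doubling speed multiplies energy by 4, while
# halving mass only divides it by 2 -- the lighter, faster car wins, 2:1.
SHORT_TON_KG = 907.185   # assumption: "ton" = US short ton
MPH_TO_MS = 0.44704      # miles per hour -> meters per second

def kinetic_energy_kj(mass_kg, speed_mph):
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v**2 / 1000.0

suv = kinetic_energy_kj(2 * SHORT_TON_KG, 65)    # ~766 kJ
car = kinetic_energy_kj(1 * SHORT_TON_KG, 130)   # exactly twice the SUV's
print(f"SUV: {suv:.0f} kJ, car: {car:.0f} kJ")
```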

-- -- -- --
Note: these posts have nothing to do with the wrong idea that scientists have faith in science in the same sense that religious people have faith in a deity. This is about people who don't know any science but like to invoke Science™ as a talisman or a prop.

Postscript: I'm compiling a list of questions to ask when faced with a Science™ believer, tagged by fashionable intellectual pursuit; after a round of testing, I'll probably post it.

## Wednesday, November 27, 2013

### Intellectual counterfeit fashionistas and the corruption of STEM and analytics

I have acquaintances who say they like classical music but never listen to it and can't tell Bach from Brahms. While this is entertaining to classical music aficionados, a similar disconnect happens in STEM and business analytics, where it has serious consequences.

I've observed many people who are always saying how important science is, who can name several recent Nobel laureates in the sciences, but can't compute the kinetic energy of a 2-ton SUV going 65MPH (766kJ), or, ironically, can't explain what the research of those Nobel laureates was about.

I know people who are always talking about Big Data™ and "the" Management Information Revolution™ (yes, they think the current one is the only one), but cannot write Bayes's formula and think that standard deviation is the same as standard error.

These are the signs of the rise of the intellectual counterfeit fashionista (ICF). The ICF wants others to consider him or her an intellectual (that's the I), up to date on the latest hottest intellectual topic (that's the F), but is not willing to do the work and the learning necessary to understand that topic (that's the C).

No matter how infuriating or entertaining an ICF can be on a personal level, their rise is a problem -- chiefly because of their effect on education, the practice of technical professions, and the general perception of STEM and analytics in society.

Education: by trying to recruit proto-ICFs into STEM/analytics, teaching institutions end up having to water down their courses, since the ICFs don't want to do the work needed for real learning. This leads to lower quality education for every student, even the non-ICFs.

In the mid-to-long term, this creates a number of credentialed ignoramuses and gives rise to the strange situation where people who hire engineers say there's a dearth of them, while engineering associations say there's a glut. I guess it depends on how you define engineer, by skills or by credentials.

Professions: the obvious effect of ICFs is the rise in average incompetence. The more pernicious effect is the destructive nature of internal politics, which always increase in organizations with large numbers of people for which appearances and narratives are more important than observable realities and hard work.

I wish nerdiness became unfashionable again, so that the ICFs moved on to corrupt something else and left STEM and analytics alone.

## Sunday, November 24, 2013

### Carrying less to do more

Every so often I look back at a packing list from some years ago, and find myself flabbergasted at how much simpler travel has been made by technological advance and some judicious choices.

This is all the hardware (plus cell phone, mine being a prepaid for emergencies only) I carry on a work trip:

Laptop
+ Power brick
+ Presentation remote (with green laser and 4GB drive)

iPod Touch 5
+ Charger cable
+ audiophile earphones
+ sports earphones

Backup hard drive (2TB of space, mostly filled with optional content for work & downtime)
+ USB 3 cable

Large capacity USB flash drives (including a 32GB one on my keychain!)

Rite-in-the-Rain notebook
+ Fisher space pen

Microfleece cleaning tissues doubling as packing material.

Add a magazine to read when electronic devices aren't allowed (I get The Tech and Smithsonian Magazine for free, so I take those and dispose of them when done), clothing (planning helps), toiletries, and food for travel.

The magic enabling the ever shrinking ever more powerful hardware packing comes from multitaskers and digital content.

The iPod Touch replaces a lot of equipment I previously carried (iPod, still camera, video camera, voice recorder, backup remote control for presentations, books to read, and even my iPad 1.0 in many respects). The hard drive carries a hitherto unthinkable library of work and play stuff. (I don't play computer games, other than the occasional solitaire, bejeweled, or mahjong, so I don't carry -- or own -- a game controller.)

The second part of the magic is the move to digital content.

Many years ago I'd carry a small sleeve case with CDs for my Sony Discman (Get off my lawn, kids!!!), some DVDs, paperback books, work books -- hardcover textbooks! -- and other heavy objects with minimal bits-to-atoms ratio. Now I carry thousands of music tracks, hundreds of books, audiobooks, and technical papers, dozens of movies, videos, and television shows, and even a few comic books for nostalgia's sake, all as bits on the hard drive. (Obviously these are not the only copies I have of those bits.)

Anything important is backed up in a multiplicity of places: laptop hard drive, portable hard drive, USB flash drives, multiple online services. Because it's well known that anything important of which you only have one copy will, by the laws of Physics, necessarily be lost, inoperative, or confiscated by the TSA.

Of course, you still need to bring a few changes of clothes and toiletries. There's no digitizing those.

## Thursday, November 21, 2013

### The roots of my disillusionment with 'official skeptics'

There were several contributing events, all similar in one point: 'official' skeptics prove to be so in name but not in actuality. This is one of the events, involving James Randi, whom I still admire.

James Randi had a long feud with Uri Geller regarding spoon bending. Now, I used to do a lot of spoon bending myself before I got an OXO Good Grips ice-cream scoop, but that's not the type of spoon bending that got Messrs. Randi and Geller at loggerheads.

Mr Geller claimed he had paranormal powers, which he demonstrated by bending spoons. Mr. Randi implied (for legal reasons he couldn't outright state) that Mr Geller was in fact using prestidigitation. (For a moment ignore the obvious question of why someone with paranormal powers would use them to bend eating utensils instead of, say, make a fortune on Wall St.) You'd think that Mr. Randi would explain how the trick is done, so that the audience could check whether Mr. Geller was in fact using that trick.

No. Mr. Randi invoked the Magician's Code and declined to explain how the trick is done. (FYI: you bend the spoon with finger pressure or against a table, takes a bit of practice to do it without other people noticing, and even with practice they will notice if they're looking for it.) So, here is Mr. Randi, allegedly a skeptic, asking his audience to accept on faith that there exists such a trick that Mr Geller could be using.

When Mr. Randi replicated his great feat of spoon bending, allegedly using a trick, Mr. Geller took advantage of Mr Randi's adherence to the Magician Code to say that Mr. Randi was in fact using his -- Randi's -- paranormal powers. All because Mr. Randi's argument relied on the audience's faith, not a testable proposition.

Now, that's ironic.

-- -- -- --
Note: this vignette was part of the post "Fed up with 'trust us, we're experts' science," but it detracted from the point of that post so I separated it into its own post.

## Wednesday, November 20, 2013

### Fed up with "trust us, we're experts" science

Somehow in my lifetime we went from Feynman's idea of science requiring 'a belief in the fallibility of experts' to a caste system where science experts must be trusted without question, and acolytes jump on anyone who dares ask anything.

The trigger event for this rant was the Mythbusters Breaking Bad Special; in particular, the test of the hydrofluoric acid disposal of a body in a bathtub that ends up with a big hole in the floor and ceiling of Jesse's home. (Season 1, Episode 2, "Cat's in the Bag...")

(Big Breaking Bad fan here, and still grudgingly a fan of the Mythbusters.)

First off, the Mythbusters test the effect of the 100ml of hydrofluoric acid on a number of samples of the materials involved (meat, wood, drywall, iron, steel, linoleum), all of the same size. Yes, size, not appropriate mass computed from molar calculation. Apparently no one thought of asking a chemist (though one is present to run the experiment) about mass balance and stoichiometry.

After they fail to dissolve these objects with the apparently arbitrarily chosen volume of hydrofluoric acid, the Mythbusters move on to replicate the scene in the show with a different solvent.

This is the point when I really lose it: they say that the solution to the body-disposal problem is to use sulfuric acid and a secret sauce.

A. Secret. Sauce.

Because knowledge should only be held by experts?! Say whaaa?

This is what science entertainment teaches its audience: if you're not an expert, you should not expect full information: "Trust us, we know what's going on, and you'll get to see the result on TV, so it's real." Of course this trains audiences to (a) accept TV as the authority on who's an expert; (b) believe in experts' statements without requiring proof or independent verification; and (c) think of science as something beyond the comprehension of the audience member, and therefore not to be questioned by him or her.

Yes, I get their legalistic "we're not here to teach people how to dispose of bodies," but it's ridiculous: acquiring the large quantities of acid necessary would be more suspicious than a number of other ways that can easily be found on the interwebs or on Bones or Dexter. Joe Pesci explains the traditional approach at the beginning of Casino: "dig the hole before you whack the guy, so you don't have to dig it with the body out in the open."

(The secret sauce is hydrogen peroxide, another chemical that would really raise eyebrows -- FBI and DHS eyebrows -- if purchased in quantity, since it is used for improvised explosive devices. Also, really really really temperamental chemical.)

Then I remembered the Mythbusters had done this before, in the thermite episode, for which they blurred the names of the igniter reagents. FYI,  to ignite thermite you drop glycerol on a mound of potassium permanganate on top of the thermite; though you can simply use a long-neck torch, like they did on, oh irony, Breaking Bad.

When I was a kid, I liked chemistry almost as much as electronics, and this is the kind of thing we got to play with before the world became full of Sitzpinkler. Do they even sell chemistry sets for children anymore? If not, where is the next generation of chemists and chemical engineers going to come from? Chemistry can be dangerous, but bringing up an entire generation ignorant of it is terminally stupid. But I digress...

Back to the main problem: It has become acceptable to make the argument that the audience should trust the experts on faith, since the technical stuff is either too difficult or too dangerous or too easily misused by the non-initiated.

This kind of thinking is more dangerous to science than 10 Tomás de Torquemadas. Because this is the kind of thinking that creates 10,000 Torquemadas, all convinced that they are the paladins of science and all ready to auto-da-fé those whom the experts deem to be the enemies of Science™. Thus quelling dissent and killing the basis of all progress in science.

A lot of people will line up for this; after all there are many people who like the idea and image of science. As long as they don't have to learn any, of course.

-- -- -- --
Note: edited on Nov 21st to remove unnecessary detour about "skeptics."

## Friday, November 8, 2013

### Thoughts inspired by a science joke

Another day, another science joke. Not a very funny one, but enlightening.

When I say "science joke," I mean one that involves a modicum of science knowledge. Which makes this yet another post against the scientistologists that are all in favor of science as long as they don't have to learn any. They like the idea and the image of science, but are not willing to do the work necessary to learn it.

Last Sunday I tweeted: According to my alarm clock, the computer & phone spent two hours moving at almost 90% of speed of light. That's one explanation.

Since that was the end of Daylight Saving Time, what that tweet says is that clocks which get a synchronization signal from the internet were one hour behind those that I have to reset manually. The twist is that I calculated what speed would compress time 1:2, $v = 0.8660254 c$, and included that in the joke.

(By the way, this time compression is an example of the twins "paradox," which is not paradoxical at all.)
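The arithmetic behind the 0.8660254 figure is a one-liner; here is a sketch (the function name is mine, not from the tweet):

```python
import math

def beta_for_dilation(gamma):
    """Fraction of the speed of light at which time dilates by factor gamma.

    From gamma = 1 / sqrt(1 - beta^2), solve for beta.
    """
    return math.sqrt(1.0 - 1.0 / gamma ** 2)

# Two hours elapse on the manual alarm clock while the synced devices
# "experience" one, so we want a dilation factor of 2:
beta = beta_for_dilation(2.0)
print(beta)  # 0.8660254037844386 -- almost 90% of the speed of light
```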

As for the people who "love science" (as long as they don't have to learn any), well, many of them have a vague notion that I was referring to relativity, but no idea whether the 90% number was right, wrong, or random. Science is something they believe in, without actually knowing any of the details.

More and more people are falling into this trap of believing in science as opposed to actually learning it. That is a very bad trend in a technology-dependent society.

## Thursday, November 7, 2013

### Twitter valuation is a bet on network value

No, I don't think Twitter is prima facie over-valued.

The following graphic from the WSJ (reproduced here because deep linking is discouraged) has been making the rounds, generally in support of the idea that Twitter's valuation is yet another finance mistake:

But here's the funny thing. Note how both LinkedIn and Twitter are apparently over-priced, and suddenly an alternative explanation appears: the market understands that, while right now the revenue models of these companies are not good, there is value in their networks that, either directly through advertising and other attention-monetizing strategies, or indirectly via the information value of the network, will eventually be captured. (Even if that requires a change of management, which sometimes it does.)

It's a bet on the future value of networks and their associated preference and communication data.*

As I mentioned in my post about the Skype acquisition, these companies are not just some black-box generators of revenue. In particular Twitter's resources include:
• Knowledge of the network to a level of detail that can be closed off to outsiders.
• Personnel and technology that allow for exploitation of the knowledge in the network; inasmuch as the data and technology have unique features, the personnel and the resources are partially locked into the company and are assets to be taken into account in valuation.
• An installed base that serves as a barrier to entry to competitors trying to build their network.
So, not being privy to the financial details, I cannot say whether the valuation is right or wrong, but I can certainly say that people who pass judgment on that valuation based on last year's revenue are terminally myopic. Sadly, even people whom I respect seem to fall into this trap.

There's gold in those networks and the nerds who can analyze them.

-- -- -- --
* A bet not dissimilar to Google's: building its own social network with Google Plus, and taking actions in other properties, like YouTube, to nudge people into using the social media affordances of Google Plus instead of the older comments and video responses (the latter now discontinued; get your linkage on G+).

## Monday, October 21, 2013

### My phone is just as smart as you guys!

Dunning-Kruger Effect, the internet is your multiplier.

Anyone can search for anything, which makes knowing what to search for and how to interpret the results more important than ever. The commoditization of information increases the value of knowledge.

Early on in the most recent episode of The Big Bang Theory (season 7, episode 5, "The Workplace Proximity" *), Amy, Bernadette, and Penny are in Penny's apartment drinking wine and talking about Amy's temporary move to Caltech:

Amy: "I'm leading a study to see if deficiency of the monoamine oxidase enzyme leads to paralyzing fear in monkeys."

[Bernadette lets slip that she might have done that research with death row convicts, which she quickly denies because it would have been unethical.]

Penny: "Not many people know this, but the monoamine oxidase [mispronounced as "oxidize"] enzyme was discovered by a woman, Mary Bernheim."

Penny: "That's right. My phone is just as smart as you guys."

And this captures a common confusion between knowledge and information. Note the pathologies illustrated in that vignette:

1. Who discovered MAO is irrelevant for the work Amy will be doing. Like Penny, many people pluck some vaguely related fact from the internet to interject into a discussion, in the illusion that they will appear knowledgeable. This behavior is becoming more and more common, especially with smartphones, but knowledge is a lot more than a simple collection of facts.

2. Penny searches for MAO because someone else brought up the topic. Without a framework of knowledge to integrate facts, people who depend on search don't know what to search for. In other words, the input for a meaningful search requires knowledge.

3. Even if Penny found useful MAO information, for example the mechanism by which it catalyzes the oxidation of monoamines and affects mood, she wouldn't be able to interpret the biochemistry and neuroscience involved. In other words the output of the search only gets meaning through knowledge.

Yes, I understand it's a joke. But this attitude that learning substantive material is passé, made unnecessary by the existence of search engines — an attitude that sadly can be found even among educators — is corrupting, corrosive, and counterproductive.

Without knowledge, information is useless. More people making knowledge-poor searches leads to more random facts being flung haphazardly into discussions; this makes having the knowledge to select and interpret the important facts more valuable than before.

Knowledge is power, the power to use information. Pity so few people know that.

-- -- -- --
* Even though the general arc of the show has become a soap opera, there are still some good jokes in each episode, and the final joke in this one is among the best.

## Friday, October 18, 2013

### Digging too deeply into a Heisenberg (Physics not crystal meth) joke

Some days ago I saw and retweeted this joke:
Police officer: "Sir, do you realize you were going 67.58 MPH?"
Werner Heisenberg: "Oh great. Now I'm lost."
Ok, it's a funny joke, provided you have a passing acquaintance with basic physics.

But here's my problem: a lot of people who kinda-sorta understand that joke have no idea what's really behind it. And that's a problem I've had for a while now with the "science fanclub that cannot do basic science," as I call them. (The people who think that Surely You're Joking, Mr. Feynman! is a physics book and like to watch soap-opera biographies of scientists, heavy on the drama, light on the actual science.)

[Added later] My problem with these people is that they perceive science as something that comes from authority and must not be questioned or further investigated by others. For example, they "know" that the position and the velocity of an elementary particle cannot be jointly determined with arbitrary precision; but when pressed about how they know that, they say something about "Cosmos" or mention a Richard Dawkins book (which of course would not cover this); they behave as acolytes to those they recognize as high priests of science, who – presumably – are anointed by a Council of Wise Ones. That's precisely the opposite of what gave science its success, the idea that anyone can question received wisdom and experiment or observation are the ultimate arbiters of correctness. [End of addition.]

A simplified form of Heisenberg's inequality, good enough for our purposes, is

$\qquad \Delta p \, \Delta x \ge h$

Going by orders of magnitude alone, assuming that the mass of Heisenberg plus car is of the order of 1000 kg, and noting that the speed is given to a precision of 0.01 mi/h, an order of magnitude of $10^{-2}$ m/s (so $\Delta p \approx 10^{3} \times 10^{-2} = 10$ kg m/s), with $h \approx 10^{-34}$ Js, we get a $\Delta x$ of the order of

$\qquad \Delta x \approx \frac{ 10^{-34} }{10} = 10^{-35}$ m.

That's a lot of precision to consider oneself lost. For comparison, the width of a typical human hair is in the order of 10-100 micrometers, or $10^{-5}$ to $10^{-4}$ m.
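The same back-of-the-envelope calculation with exact rather than order-of-magnitude numbers, as a sketch (the 1000 kg mass and the 0.01 mi/h precision are the assumptions from the text):

```python
h = 6.626e-34                    # Planck's constant, J*s
mass = 1000.0                    # Heisenberg plus car, kg (assumption)
dv = 0.01 * 1609.344 / 3600.0    # 0.01 mi/h converted to m/s (~4.5e-3)
dp = mass * dv                   # momentum uncertainty, ~4.5 kg*m/s
dx = h / dp                      # simplified bound: dx >= h / dp
print(dx)                        # ~1.5e-34 m -- absurdly precise for "lost"
```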

Yes, these numbers show how stupid it would be to use Heisenberg's Uncertainty Principle for macroscopic observations. That's the joke; the fact that many members of the science-fanclub have no idea of the magnitudes involved but like to lord their science-fandom over others is part of my irritation.

I see this all the time in my job, with people who can't write Bayes's formula talking loudly about graphical models (should really be graphal models, BTW, since they are based on graphs, not graphics).
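For the record, Bayes's formula is $P(A \mid B) = P(B \mid A)\,P(A)/P(B)$. A minimal sketch in code, with made-up numbers for illustration (the classic base-rate example, not anything from my job):

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes's formula: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Made-up numbers: a test with 99% sensitivity, a 5% false-positive rate,
# and a 1% base rate for the condition being tested.
p_a = 0.01
p_b = 0.99 * 0.01 + 0.05 * 0.99   # total probability of a positive test
posterior = bayes(0.99, p_a, p_b)
print(posterior)  # ~0.167 -- a positive test is still probably a false alarm
```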

## Sunday, July 28, 2013

### For better presentations, avoid most presentation advice

If you want to become a better presenter, you probably should avoid most advice about presentations.

Yes, here I am, an educator, apparently telling people to avoid sources of knowledge. The problem is that much presentation advice is not a source of knowledge; more like a source of sophistry that helps perpetuate some of the worst problems with presentations.

As an avid reader of books, articles, and blog posts about presentations, I identified a few pathologies from the mass of material available:

1. Presentationism. This is what I call the tendency of people who do presentation training or information design training to focus on the style and delivery of the presentation instead of the substantive material that the presentation is about. This is a form of professional deformation, but one that can become a serious obstacle to understanding the real value of presentation skills: usually that of changing the audience's mind, unless the presentation is being done for entertainment, legal, or other purposes.

2. Perfectionism. The idea that all presentations have to be done to the standard of excellence and that all presenters should put as much effort as needed into preparing, rehearsing, delivering, and clarifying every presentation. In reality there are many people who have to do presentations with minimal resources, for whom the time and effort required to create a better presentation represent a net loss of value.

3. Ideological purity. Instead of choosing the best tool for a given presentation, many authors are strict ideologues: the presentation should conform to their choice of tool and styles. This affects some famous authors in information design and presentation techniques, and has led to pointless arguments about which tool is better, tout court. Like arguing whether a hammer or a drill is a better tool, independently of the project, and equally pointless. This creates a subordinate problem:

4. Subject matter and audience independence. According to a plurality of authors, Einstein presenting to an audience of Princeton scientists and the Frito-Lay head of sales for northeast Kentucky reporting on the penetration of new chip varieties to a group of mid-level executives should prepare and deliver their presentations in about the same manner, with similar presentation support (typically, though not always, slides), and about the same effort. To be clear, these authors don't suggest that the substance of the presentation should be the same, but rather that the process of preparing and delivering these presentations and the style and design of the materials should be the same.

5. The "tricks and tips" distraction. Many authors offer only tricks and tips, which may be good or bad, but in general create a false sense of learning: the problem with most bad presentations is systemic, not something that a tip will solve. Similarly, a lot of authors use cherry-picked results from psychology to support their approach. As a general rule, unless you can read the original source and determine whether the result applies to your circumstances, it's better to ignore this.

So, what is someone who wants to become a better presenter to do? I've written about it (note the "most" in the title above, which is not "all" on purpose), and here are three further recommendations:

- James Humes's Speak Like Churchill, Stand Like Lincoln is a short, well-thought-out book on public speaking.

- Edward Tufte's books, courses, and web site, despite a bit of ideological purity, are possibly the best source for people for whom getting complex messages across to their audience is important and worth the effort.

- Don Norman's critique of Tufte makes a good counterpoint piece for ET's works.

Above all, think critically about the advice being given; ask "does this make sense in my case?" Even the best advice has exceptions.

## Sunday, January 20, 2013

### On hiatus

I'll be taking a break from blogging in order to finish a number of writing projects.

I'll probably tweet the occasional pithy thought and post any photos I find interesting. But long-form blogging is unlikely to continue in the previous form; when I return I'll probably be posting book notes or observations about coding in R.