Wednesday, January 29, 2014

Food rules, via experience marketing

Using experience marketing to improve one's lifestyle; if there were ever a case of déformation professionnelle…

A while ago I saw this quote in The London Lounge:
Fitzroy Maclean, the real-life James Bond who died a few years ago, always carried with him on his travels a tube of anchovy paste. He explained that in his experience one could always locate some alcohol and a crust of bread: his tube made it a party. This sort of discernment has much to do with small luxuries: too luxurious and they cease to be fun, too small and they cease to be rare.
I thought it very insightful, added some marketing knowledge and a dash of Paleo to it, and came to the conclusion that life is much better if I stick to the following rules:

1. Life is too short to eat bad food. Corollary: better to go hungry than eat bad food, considering that I have subcutaneous energy reserves. When we consider the needs serviced by food, in particular hedonic needs, the satisfaction created by bad food, which at best satisfies a functional need for calories, is necessarily inferior to that of an experience good; why waste a perfectly good opportunity for high-value need satisfaction on a mediocre product? What are we, an airline?

2. Small luxuries, infrequently consumed. Three reasons for this: (a) experiences saturate fast in an individual consumption occasion, i.e. the pleasure from eating 25g of Brie is much higher than 1/20 the pleasure from eating 500g of Brie in a single sitting [yes, it's possible, with enough baguette]; (b) longer periods between consumption occasions lead to better experience, first by decreasing any habituation that might lower the situational consumption value, second by increasing the anticipatory value of the experience; (c) infrequent consumption of unhealthy items (such as crème brûlée) doesn't have the long-term health damaging effects of frequent consumption (like insulin resistance). One of the great advantages of intermittent fasting (eating only when hungry, which sometimes means less than once a day) is that each meal is a noteworthy experience: fasting takes you off the hedonic treadmill. Oh, yes, there are also some health advantages, but who cares about that?

3. Creative use of available resources yields surprising value. When we productize an experience, such as with packaged food, a lot of the degrees of freedom for consumer adaptation of the experience are lost. But when we deconstruct the productized experience, as in the quote above, the result is that surprisingly good value can be achieved with minor adjustments. "Anchovy paste, therefore party" is not something that would have come to mind easily, but it's remarkable what a few drops of white truffle-infused olive oil can do to boring fish or a thin slice of foie gras (Nein! Verboten in Kalifornien!) to a green salad.

And two general rules that have to do with how manufacturing and marketing work. (Shush, don't tell anyone.)

4. Ignore most talk about nutrition. Most people are "educated" by marketing and PR, usually without realizing it, and the purpose of marketing is to deliver value and capture part of the surplus created, not to educate people about scientific results. Not that marketing communications are outright lies, but there are entire floors of copywriters in advertising agencies and buildings full of PR people whose job is to shape the narrative. (Doesn't that sound much better than "obfuscate, defuse, and deflect?") Read Salt, sugar, and fat; great marketing book, but it will raise some hackles about manufactured food. (I'm assigning a few excerpts in my value proposition design modules.)

5. Cook your own food. Corollary: learn about food and cooking, from sources like On food and cooking: the science and lore of the kitchen, The professional chef, Larousse gastronomique, and Alton Brown. (Famous chef cookbooks are worthless, as they are mostly productization of the chef's brand equity. Neat marketing, worthless pedagogy. Julia Child excepted, because in her case the fame followed the cookbook.) With a little practice and learning, cooking becomes a multi-faceted experience (sense, think, and act, sometimes even feel – all in one activity), plus it can become a free part of one's identity, replacing those expensive Pradarmani clothes [so 1990s].

Former students will recognize much of the above, except in the opposite direction of what we discussed in class. Experience marketing is the foundation for this post, but here I'm optimizing for the point of view of the consumer while in class we optimized for the point of view of the seller.

Bon appétit

Tuesday, January 21, 2014

MOOC-rize this!

Participant-centered learning is not scalable, so it's MOOC-resistant.

A couple of colleagues (in different fields) have shared MOOC-related worries with me. The logic goes, our research jobs are funded to a large extent by teaching, and if the need for teachers disappears, many schools will stop hiring expensive research faculty. Cathy "Mathbabe" O'Neil suspects MOOCs will have tragic consequences for mathematics research.

I'm not convinced.

As I see it, there are three main MOOC threats to traditional higher education: cost-effectiveness, brand equity of the schools offering the MOOCs, and quality of content. There's also one main visible weakness, certification.

Cost-effectiveness. The cost-effectiveness of MOOCs is the main argument I hear for "the end of universities as we know them," to which I say: if you can replace class X with videos of lectures and computer-graded problems sets, good riddance to class X.

Distance learning is an old proposition; it started with something called "a book." MOOCs add better media, the possibility of computer-graded problem sets (for some fields, and requiring a significant investment in problem set design), and tutoring or discussion affordances.

But here's the crux: the scalable parts of MOOCs are the easy part of education. The hard part is motivating students, interacting with them and being responsive to their questions, taking the time to understand the reason for their incomprehension, and reacting in real time to information they bring into the class or developments in the field.

So, while MOOCs will work really well for highly motivated, studious students (nerds like me), the average student will need more personalized attention than is cost-effective to offer at scale.

Repeat after me: Personalized attention doesn't scale.

True, many classes in many institutions of higher learning don't deliver anything more than the scalable parts of the MOOCs; no personalized attention or significant interaction with the students at all. Those classes are ripe for replacement by MOOCs, and that's good.

This is what gets me steamed about Mathbabe's post: if the professors don't add value to a student reading the textbook and solving the problem sets (that in many cases are straight off teachers' manuals from the textbook publisher and graded by teaching assistants), then what is the purpose of hiring someone with a deep understanding of the field a/k/a a research faculty member?

The answer to that lies in the value of an instructor with a deep understanding of the field to manage participant-centered learning (now called "flipped classroom" but in fact the only way anyone ever really learned any technical material was by practicing it).


Brand equity. Who wouldn't rather say "I took the Caltech Machine Learning course" than "I took the Cal State-Moraga Machine Learning course"? This is indeed a problem, but to a large extent it's a matter of brand credibility footprint, not a technological issue.

With prestigious schools creating extension campuses and joint ventures with other universities, MOOCs are only a small part of the problem. And let's remember that brand extensions are not one-way propositions; MOOC-rizing courses may dilute a school's brand equity. (So may having extension campuses, of course. Armani Exchange doesn't help the brand equity of Armani.)

Talking to some Hahvahd B.S. colleagues, I got the distinct impression that they believe the students' physical presence on their Cambridge (Allston, really) campus is an essential part of the brand identity, one that they are not willing to compromise on. I'd venture that at Hahvahd B.S. they know a thing or two about the network and identity dimensions of brand equity.

So, I agree that the brand equity is an issue, but more because of extension campuses and joint ventures than MOOCs, since the brand credibility footprint is much more likely to encompass the former than the latter. (Says the visiting professor at TheLisbonMBA, a joint venture of UCP, UNL, and MIT.)


Quality. Obviously there's a difference between the quality of the classes taught at Caltech and at [the fictional] Cal State-Moraga; and that is part of the brand equity of Caltech. But the real question is whether the students of CS-Moraga are going to benefit from a class that was designed for Caltech students more than from one that was designed specifically for them.

Note that this immediately raises the question of whether CS-Moraga classes are customized to their student population (that is, now, before being MOOC-rized). And that's again the issue of what faculty are doing at CS-Moraga: if they rely on the textbook and the teachers' materials provided by the textbook publisher in order to save themselves the trouble of actually preparing a class, then as I said above, good riddance.

On the other hand, in participant-centered learning the instruction follows from the participants' needs and skills, moving at their pace, therefore for good quality the instructor must have a broad training in the general field and a deep understanding of the materials of the class.

It's incumbent upon the faculty to make itself more valuable than a cost-effective MOOC, or a textbook for that matter. Otherwise, it's their own fault if they're MOOC-rized.


Certification. Certification of knowledge is the weak point of MOOCs as they currently exist, but it's important to note two issues with this.

First, certification cannot be the only function of universities or research faculty, as certification alone doesn't require the large infrastructure and cost of a university or the need for broad research programs.

Second, and much more critical, if the MOOC certification weakness is part of the advantage of a traditional university, that weakness ends if universities stop taking their certification responsibilities seriously. When some schools graduate computer engineers who never wrote a program that passed a compiler's syntax check, let alone ran, let alone ran correctly or efficiently — to choose an example I heard from someone I trust — then the credibility of universities as certification mechanisms comes into question, and their advantage vis-à-vis MOOCs in this regard evaporates.

(Yes, there's a third possible issue, that of MOOCs adding some sort of credible certification. I believe that that's a long way off, given how it would require (a) an infrastructure to prevent fraud; (b) some sort of long-term evaluation, since not everything can be certified with a short test; and (c) legal protection in case of unacceptable demographic results in aggregate, which universities seem to have had grandfathered in, but for which other institutions have found themselves liable.)



I for one welcome our new MOOC multimedia limited-interaction e-textbooks for the 21st Century. As a complement to real instruction: customized, personal, and responsive. And as a mechanism for making universities take certification seriously.

Friday, December 27, 2013

Technocracy. It's new.

No. It's not.

I've read a number of recent books and articles about how technology, particularly computers and robots, will change everything and create a bipartite society, where "there will be those who tell computers what to do, and those who are told what to do by computers" – in a compact form. (As a computer engineer, I sort of approve of this message. :-)

This idea of a bipartite society with a small elite lording over the undifferentiated masses is not new (really, not new at all). That it's a result of technology instead of divine intervention or application of force is also not new, but, since most people have an "everything that happened before me is irrelevant because my birth was the most important event in the totality of space-time" attitude towards the past, this is ignored.

There are a few reasons contributing to the popularity of this idea:

It's mostly right, and in a highly visible way. Technological change makes life harder for those who fail to adapt to it. In the case of better robotics and smarter computers, adaptation is more difficult than it was for other changes like the production line or electricity. One way to see this is to note how previously personalized services have been first productized (ex: going from real customer service representatives to people following an interactive script on a computer) and then the production processes were automated (ex: from script-following humans to voice-recognition speech interfaces to computers). Technological change is real, it's important, and it's been a constant for a long time now.

(Added Jan. 7, 2014: Yes, I understand that the economics of technology adoption have a lot to do with things other than technology, namely broader labor and economic policies. I have teaching exercises for the specific purpose of making that point to execs and MBAs. Because discussion of these topics touches the boundaries of politics, I keep them out of my blog.)

It's partially wrong, but in a non-obvious way. People adapt and new careers appear that weren't possible before; there are skilled jobs available, only the people who write books/punditize/etc don't understand them; and humans are social animals. The reason why these are non-obvious, in order: it's hard to forecast evolution of the use of a technology; people with "knowledge work" jobs don't get Mike Rowe's point about skilled manual labor; most people don't realize how social they are.

(On top of these sociological reasons there's a basic point of product engineering that most authors/pundits/etc don't get, as they're not product engineers themselves: a prototype or technology demonstrator working in laboratory conditions or very limited and specific circumstances is a far cry from a product fitting with the existing infrastructure at large and usable by an average customer. Ignoring this difference leads authors/pundits/etc to over-estimate the speed of technological change and therefore the capacity of regular people to adapt to it.)

Change sells. There's really a very small market for "work hard and consume less than you produce" advice, for two reasons. First, people who are likely to take that advice already know it. Second, most people want a shortcut or an edge; if all that matters is change, that's a shortcut (no need to learn what others have spent time learning) and gives the audience an edge over other people who didn't get the message.

It appeals to the chattering classes. The chattering classes tend to see themselves as the elite (mostly incorrectly, in the long term, especially for information technologies) and therefore the idea that technology will cement their ascendancy over the rest of the population appeals to them. That they don't, in general, understand these technologies is itself beyond their grasp.

It appeals to the creators of these technologies. Obviously so, as they are hailed as the creators of the new order. And since these tend to be successful people whom some/many others want to understand or imitate, there's a ready market for books/tv/consulting. Interestingly enough, most of the writers, pundits, etc., especially the more successful ones, are barely conversant with the technical foundations of the technologies. Hence the constant reference to unimportant details and biographical information.

It appeals to those who are failing. It suggests that one's problems come from outside, from change that is being imposed on them. Therefore failure is not the result of goofing off in school, going to work under the influence of mind-altering substances, lack of self-control, the uselessness of a degree in Narcissism Studies from Givusyourstudentloans U.  No, it's someone else's fault. Don't bother with STEM, business, or learning a useful skill. Above all, don't do anything that might harm your self-esteem, like taking a technical MOOC with grades.

It appeals to those in power. First, it justifies the existence of a class of people who deserve to have power over others. Second, it describes a social problem that can only be solved by the application of power: since structural change creates a permanent underclass, not by their fault, wealth must be redistributed for the common good. Third, it readily identifies the class of people who must be punished/taxed: the creators of these technologies, who also create new sources of wealth to be taxed. Fourth, it absolves those in power from responsibility, since it's technology, not policy that is to blame. Fifth, it suggests that technology and other agents of change should be brought under the control of the powerful, since they can wreak such havoc in society.

To be clear, technology changes society and has been doing so since fire, the wheel, agriculture, writing – skipping ahead – the printing press, systematic experiments, the production line, electricity, DNA testing, selfies... The changes these technologies have brought are now integrated in the way we view the world, making them so "obvious" that they don't really count. Or do they? Maybe "we" should do some research. If these changes were obvious, certainly they were accurately predicted at the time. Like "we" are doing now with robots and AI.

You can find paper books about these changes on your local sky library dirigible, which you reach with your nuclear-powered Plymouth flying car, wearing your metal fabric onesie with a zipper on your shoulder, right after getting your weekly nutrition pill. You can listen to one of three channels bringing you music via telephone wires, from the best orchestras in Philadelphia and St. Louis while you read.

Or you can look up older predictions using Google on your iPhone, while you walk in wool socks and leather shoes to drink coffee brewed in the same manner as in 1900. The price changed, though. It's much cheaper to make, but you pay a lot more for the ambiance.

Think about that last word.

Friday, December 6, 2013

Word salad of scientific jargon

"The scientists that I respect are scientists who work hard to be understood, to use language clearly, to use words correctly, and to understand what is going on. We have been subjected to a kind of word salad of scientific jargon, used out of context, inappropriately, apparently uncomprehendingly." – Richard Dawkins, in the video Dangerous Ideas - Deepak Chopra and Richard Dawkins, approximately 27 minutes in.

That's how I feel about a lot of technical communications: conference panels, presentations, and articles.  An observed regularity: the better the researchers, the less they tend to go into "word salad of scientific jargon" mode.

Wednesday, November 27, 2013

Intellectual counterfeit fashionistas and the corruption of STEM and analytics

I have acquaintances who say they like classical music but never listen to it and can't tell Bach from Brahms. While this is entertaining to classical music aficionados, a similar disconnect happens in STEM and business analytics, where it has serious consequences.

I've observed many people who are always saying how important science is, who can name several recent Nobel laureates in the sciences, but can't compute the kinetic energy of a 2-ton SUV going 65 mph (766 kJ), or, ironically, can't explain what the research of those Nobel laureates was about.
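For the curious, the arithmetic behind that 766 kJ figure is a one-liner once units are converted to SI. A minimal sketch (assuming "2-ton" means two US short tons of 2,000 lb each):

```python
# Kinetic energy of a 2-ton (US short ton) SUV at 65 mph: KE = 1/2 * m * v^2.
# Convert everything to SI units first.

LB_PER_TON = 2000          # US short ton
KG_PER_LB = 0.45359237     # exact definition
M_PER_MILE = 1609.344      # exact definition
S_PER_HOUR = 3600

mass_kg = 2 * LB_PER_TON * KG_PER_LB        # ~1814.4 kg
speed_ms = 65 * M_PER_MILE / S_PER_HOUR     # ~29.06 m/s
kinetic_energy_j = 0.5 * mass_kg * speed_ms ** 2

print(f"{kinetic_energy_j / 1000:.0f} kJ")  # → 766 kJ
```

If "2-ton" were read as 2,000 kg (metric tonnes), the answer would come out about 10% higher; the post's 766 kJ matches the short-ton reading.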

I know people who are always talking about Big Data™ and "the" Management Information Revolution™ (yes, they think the current one is the only one), but cannot write Bayes's formula and think that standard deviation is the same as standard error.

These are the signs of the rise of the intellectual counterfeit fashionista (ICF). The ICF wants others to consider him or her an intellectual (that's the I), up to date on the latest hottest intellectual topic (that's the F), but is not willing to do the work and the learning necessary to understand that topic (that's the C).

No matter how infuriating or entertaining an ICF can be on a personal level, their rise is a problem -- chiefly because of their effect on education, the practice of technical professions, and the general perception of STEM and analytics in society.

Education: by trying to recruit proto-ICFs into STEM/analytics, teaching institutions end up having to water down their courses, since the ICFs don't want to do the work needed for real learning. This leads to lower quality education for every student, even the non-ICFs.

In the mid-to-long term, this creates a number of credentialed ignoramuses and gives rise to the strange situation where people who hire engineers say there's a dearth of them, while engineering associations say there's a glut. I guess it depends on how you define engineer, by skills or by credentials.

Professions: the obvious effect of ICFs is the rise in average incompetence. The more pernicious effect is the destructive nature of internal politics, which always increase in organizations with large numbers of people for whom appearances and narratives are more important than observable realities and hard work.

I wish nerdiness would become unfashionable again, so that the ICFs moved on to corrupt something else and left STEM and analytics alone.

Sunday, November 24, 2013

Carrying less to do more

Every so often I look back at a packing list from some years ago, and find myself flabbergasted at how much simpler travel has been made by technological advance and some judicious choices.

This is all the hardware (plus cell phone, mine being a prepaid for emergencies only) I carry on a work trip:

Laptop
+ Power brick
+ VGA adapter
+ Presentation remote (with green laser and 4GB drive)

iPod Touch 5
+ Charger cable
+ audiophile earphones
+ sports earphones

Backup hard drive (2TB of space, mostly filled with optional content for work & downtime)
+ USB 3 cable

Large capacity USB flash drives (including a 32GB one on my keychain!)

Rite-in-the-Rain notebook
+ Fisher space pen

Microfleece cleaning tissues doubling as packing material.

Add a magazine to read when electronic devices aren't allowed (I get The Tech and Smithsonian Magazine for free, so I take those and dispose of them when done), clothing (planning helps), toiletries, and food for travel.

The magic enabling the ever shrinking ever more powerful hardware packing comes from multitaskers and digital content.

The iPod Touch replaces a lot of equipment I previously carried (iPod, still camera, video camera, voice recorder, backup remote control for presentations, books to read, and even my iPad 1.0 in many respects). The hard drive carries a hitherto unthinkable library of work and play stuff. (I don't play computer games, other than the occasional solitaire, bejeweled, or mahjong, so I don't carry -- or own -- a game controller.)

The second part of the magic is the move to digital content.

Many years ago I'd carry a small sleeve case with CDs for my Sony Discman (Get off my lawn, kids!!!), some DVDs, paperback books, work books -- hardcover textbooks! -- and other heavy objects with minimal bits-to-atoms ratio. Now I carry thousands of music tracks, hundreds of books, audiobooks, and technical papers, dozens of movies, videos, and television shows, and even a few comic books for nostalgia's sake, all as bits on the hard drive. (Obviously these are not the only copies I have of those bits.)

Anything important is backed up in a multiplicity of places: laptop hard drive, portable hard drive, USB flash drives, multiple online services. Because it's well known that anything important of which you only have one copy will, by the laws of Physics, necessarily be lost, inoperative, or confiscated by the TSA.

Of course, you still need to bring a few changes of clothes and toiletries. There's no digitizing those.

Thursday, November 21, 2013

The roots of my disillusionment with 'official skeptics'

There were several contributing events, all similar in one point: 'official' skeptics prove to be so in name but not in actuality. This is one of the events, involving James Randi, whom I still admire.

James Randi had a long feud with Uri Geller regarding spoon bending. Now, I used to do a lot of spoon bending myself before I got an OXO Good Grips ice-cream scoop, but that's not the type of spoon bending that got Messrs Randi and Geller at loggerheads.

Mr Geller claimed he had paranormal powers, which he demonstrated by bending spoons. Mr. Randi implied (for legal reasons he couldn't outright state) that Mr Geller was in fact using prestidigitation. (For a moment ignore the obvious question of why someone with paranormal powers would use them to bend eating utensils instead of, say, make a fortune on Wall St.) You'd think that Mr. Randi would explain how the trick is done, so that the audience could check whether Mr. Geller was in fact using that trick.

No. Mr. Randi invoked the Magician's Code and declined to explain how the trick is done. (FYI: you bend the spoon with finger pressure or against a table; it takes a bit of practice to do it without other people noticing, and even with practice they will notice if they're looking for it.) So, here is Mr. Randi, allegedly a skeptic, asking his audience to accept on faith that there exists such a trick that Mr Geller could be using.

When Mr. Randi replicated his great feat of spoon bending, allegedly using a trick, Mr. Geller took advantage of Mr Randi's adherence to the Magician Code to say that Mr. Randi was in fact using his -- Randi's -- paranormal powers. All because Mr. Randi's argument relied on the audience's faith, not a testable proposition.

Now, that's ironic.

-- -- -- --
Note: this vignette was part of the post "Fed up with 'trust us, we're experts' science," but it detracted from the point of that post so I separated it into its own post.