Showing posts with label STEM. Show all posts

Tuesday, August 30, 2016

Some thoughts on quant interviews

Being a curmudgeonly quant, I started reacting to people who "love" science and math with simple Post-It questions like this:


(This is not a gotcha question; all you need is to apply the Pythagorean theorem twice. I even picked numbers that work out well. Yes, $9 \sqrt{2}$ is a number that works out well.)

Which reminds me of quant interviews and their shortcomings.

I already wrote about what I think is the most important problem in quantitative thinking for the general public, in Innumeracy, Acalculia, or Numerophobia, which was inspired by this Sprezzaturian's post (Sprezzaturian was writing about quant interviews).


In search of quants

That was for the general public. This post is specifically about interviewing to determine the quality of quantitative thinking, which is more than just mathematical and statistical knowledge.

One way to test mathematical knowledge is to ask the same type of questions one gets in an exam, such as:

$\qquad$ Compute $\frac{\partial }{\partial x} \frac{\partial }{\partial y} \frac{2 \sin(x) - 3 \sin(y)}{\sin(x)\sin(y)}$.

Having interacted with self-appointed "analytics experts" who had trouble with basic calculus (sometimes even basic algebra), this kind of test sounds very appealing at first. But its focus is on the wrong side of the skill set.
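Incidentally, that sample question rewards understanding over grinding: the fraction separates into $2/\sin(y) - 3/\sin(x)$, so the mixed partial derivative is identically zero. A quick symbolic check (a sketch in Python with sympy):

```python
import sympy as sp

x, y = sp.symbols("x y")
f = (2 * sp.sin(x) - 3 * sp.sin(y)) / (sp.sin(x) * sp.sin(y))

# The fraction separates: f == 2/sin(y) - 3/sin(x), so each term
# depends on only one variable and the mixed partial vanishes.
mixed = sp.diff(f, x, y)
print(sp.simplify(mixed))  # 0
```

A candidate who sees the separation answers instantly; a candidate who grinds through the quotient rule twice gets the same zero, eventually.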

Physicist Eric Mazur has the best example of the disconnect between being able to answer a technical question and understanding the material:

TL;DR: students can't apply Newton's third law of motion (for every action there's an equal and opposite reaction) to a simple problem (car collision), though they can all recite that selfsame third law. I wrote a post about this before.

Testing what matters

Knowledge tests should at the very least be complemented with (if not superseded by) "facility with quantitative thinking"-type questions. For example, let's say Bob is interviewing for a job and is given the following graph (and formula):

Nina, the interviewer, asks Bob to explain what the formula means and to grok the parameters.

Bob Who Recites Knowledge will say something like "it's a sine with argument $2 \pi \rho x$ multiplied by an exponential of $- \kappa x$; if you give me the data points I can use Excel Solver to fit a model to get estimates of $\rho$ and $\kappa$."

Bob Who Understands will start by calling the graph what it is: a dampened oscillation over $x$. Treating $x$ as time for exposition purposes, that makes $\rho$ a frequency in Hertz and $\kappa$ the dampening factor.

Next, Bob Who Understands says that there appear to be 5 1/4 cycles between 0 and 1, so $\hat \rho = 5.25$. Estimating $\kappa$ is a little harder, but since the first 3/4 cycle maps to an amplitude of $-0.75$, all we need is to solve two equations, first translating 3/4 cycle to the $x$ scale,

$\qquad$ $10.5 \, \pi x = 1.5 \, \pi$, or $x = 1/7 \approx 0.14$

and then computing a dampening of $0.75$ at that point, since $\sin(3/2 \, \pi) = - 1$,

$\qquad$ $\exp(-\hat\kappa \times 0.14) = 0.75$, or $\hat \kappa = - \log(0.75)/0.14 \approx 2.05$

Bob Who Understands then says, "of course, these are only approximations; given the data points I can quickly fit a model in #rstats that gets better estimates, plus quality measures of those estimates."

(Nerd note: If instead of $e^{-\kappa x}$ the dampening had been $2^{-\kappa x}$, then $1/\kappa$ would be the half-life of the process; but the numbers aren't as clean with base $e$.)
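Bob Who Understands's quick-fit promise is easy to sketch. The post mentions #rstats, but here is the same idea in Python with scipy, using made-up parameter values to generate stand-in data for the graph:

```python
import numpy as np
from scipy.optimize import curve_fit

def damped(x, rho, kappa):
    # The model Bob is grokking: sin(2*pi*rho*x) * exp(-kappa*x)
    return np.sin(2 * np.pi * rho * x) * np.exp(-kappa * x)

# Synthetic stand-in for the graph's data points (hypothetical true values)
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = damped(x, 5.25, 2.0) + rng.normal(0, 0.02, x.size)

# Bob's eyeball estimates make good starting values for the fit
(rho_hat, kappa_hat), cov = curve_fit(damped, x, y, p0=[5.25, 2.0])
se = np.sqrt(np.diag(cov))  # standard errors: the "quality measures"
print(rho_hat, kappa_hat, se)
```

The covariance matrix returned by `curve_fit` is where the "quality measures of those estimates" come from: its diagonal gives the variances of $\hat\rho$ and $\hat\kappa$.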

This facility with approximate reasoning (and use of #rstats :-) signals something important about Bob Who Understands: he understands what the numbers mean in terms of their effects on the function; he groks the function.

Nina hires Bob Who Understands. Bonuses galore follow.

Bob Who Recites Knowledge joins a government agency, funding research based on "objective, quantitative" metrics, where he excels at memorizing the 264,482 pages of regulation defining rules for awarding grants.

Sunday, August 14, 2016

Working the solution versus solving the problem

Some time ago I tweeted that I was going to row a number of nautical miles on my trusty old Concept IIc machine. As an engineer, I use SI units for everything --- except on the water, where I use traditional units: nautical miles and knots.

A couple of rowers I know asked me how I had hacked the controller on the Concept IIc to change the units. This was my answer:

How I "hacked the software" on the Concept IIc to use nautical miles. #genius

Many people missed the point: the others were making a common mistake in problem-solving, a mistake that forecloses most creative solutions.

The mistake is working the solution instead of solving the problem.

Hidden in the question about the hack is an assumption: that the solution has to come from my programming skills (they know what I do, so it's not an unreasonable assumption). That assumption sets a path to a solution, which would include reprogramming the firmware inside the Concept IIc controller.

The ability to backtrack from that path to the beginning and choose another one is the key step here. Too many people start down one path and can't get off it to pursue other possible paths to the solution.

By focusing on the problem, i.e. the question "what is to be achieved?", rather than on the solution under consideration (changing the software), the mistake was avoided.

Yes, this is a trivial and obvious (after the fact) example, but often the difference between a non-working "solution" and a working solution is a matter of focus on the problem to be solved.

Alas, changing their focus is too hard for some would-be problem solvers.

Saturday, July 30, 2016

Product ≠ Prototype ≠ Technology ≠ Idea

Production note: Some credit to Thunderf00t, for had he not made such a complete pig's breakfast of his analysis of Hyperloop, this "why scientists are bad at engineering" post wouldn't have been written. *




There are significant differences between an idea ("it would be great to fly from London to New York in four hours, let's use fighter jet technologies to make an airliner") and a marketable product (the Concorde). That's just on the engineering side, without the additional complexity of the business side.


Ideas to technology

An idea is just an organization of thoughts, for example: "if we got a train riding on magnets instead of wheels, we could get rid of friction, wear, and fatigue; then if we put the train in a low pressure tube we could go really fast."

This idea becomes a technology when you get something actually working; this something is called, for obvious reasons, a technology demonstrator. It's used to show that the technology has some potential, and it used to be a minimum requirement for getting funding. (More on that below.)

Linear motor Maglev technology is already available, though maybe not quite up-to-spec, but there are some technological barriers to overcome regarding the tubes and the pods.

Here it's worth noting a common error of reasoning, which is to assume that just because something hasn't been done, it can't be done.
For example, TF's use of a video excerpt showing Brian Cox inside "the largest vacuum chamber in existence." It's the largest because there was never a need for a larger one. It doesn't represent a technology limit. It's not that difficult to make a long tube that can take a big pressure differential (= pipeline), though we currently design this kind of tube for over-pressure because that's what its current use requires.
Many of the "the largest X in existence" limits are determined by economic necessity, not laws of physics. Think about the largest pizza ever made; was its size determined by some limit of the laws of physics?
Sometimes the technology is based on existing science, or co-developed with it, like some of the current work in biotech. Sometimes the technology precedes the science needed to explain it (or at least the attention of the scientists whose expertise is necessary to build the explanation), as was the case of most of the mechanical innovations in the first industrial revolution.

Part of the funding of Hyperloop is an investment in technology development that will have applications beyond the Hyperloop itself ("spillovers"). There's this thingamabob called a "laser" that was imagined as a pew-pew death-ray in sci-fi, became reality as a pure Physics experiment, and mostly is used to check out groceries, read data off of polycarbonate discs, pump bits down fiber optics, and annoy cats. Oh, some pew-pew, too.

Sometimes licensing or developing the technology in directions other than the originally intended ends up being the most important part of the business.

It's probably worth noting two things at this point:
  • Hyperloop projects haven't finished the technology development phase; that would be indicated by a technology demonstration. Assertions about the final product at this stage are futile.
  • Getting funded by professional investment organizations (with their due diligence and fiduciary obligations) requires passing much stricter scrutiny than that given to crowdsourced projects (like Solar Roadways, the Fontus water bottle, or Triton artificial gills).

Technology to prototype

Once the technologies necessary for implementing the idea exist, they have to be put together and made to work under laboratory conditions or at test-scale, in the form of prototypes.

Here's where the "scientists are bad at engineering" point becomes most pointy.

Prototypes will obey the laws of Physics (and other sciences), since they operate in reality. It may be the case that the laws aren't known yet (as with the first industrial revolution) or that they are being simultaneously developed, but no prototype can violate the laws of Physics.

The problem is that there's a lot of specialized knowledge that goes into engineering. Each small piece of that knowledge obeys the laws of Physics, but deriving it from first principles isn't practical. (And real scientists don't dirty their hands with engineering.)
For example, a physicist friend of mine didn't know why the suspenders of a suspension bridge (the vertical cables from the big catenary cable to the bridge deck) sometimes have a thin metal helix around them. When pressed on it he said "it's probably a reinforcement of some kind." I knew that the helix is there to limit aerodynamic flutter, and told him. He said, "oh, of course" and mentioned some interesting facts of turbulent flow.
That's what I mean by "science is the foundation of engineering, but scientists don't learn the body of knowledge of engineering." Most scientists are humble enough to understand that there are things they don't know. My physicist friend didn't assert that the helix was for reinforcement; he actually said, "I don't know," a sentence more people would be wise to use.
For illustration, here's a series of videos about metal shop work (the presenter is a professor, I believe, since he keeps talking about research prototypes, but he's seriously shop-savvy):


Instructive and entertaining videos. A big hat tip to Star Simpson for the link, via Casey Handmer. Such is the serendipitous nature of internet knowledge discovery.

A prototype is a one-off, possibly scaled-down, version of the product reduced to its core elements. It's designed to be operated by specialists under controlled circumstances. It requires constant attention during performance and, conversely, is usually over-instrumented for its final purpose (as a product, that is), since part of its purpose as a prototype is to see which parts of the engineering body of knowledge need to be applied to the technology itself.

Sometimes that extensive instrumenting of prototypes helps discover hitherto unknown issues or phenomena and leads to rethinking of extant technologies and redesign or retrofit of existing products. Historically a good part of the body of knowledge of engineering has evolved by this process.
For example, vortex shedding in aircraft wings was not identified for the first several decades of aviation, even though the physics necessary for it was developed in the late 19th Century. Once the engineering idea of vortex shedding wingtips (or, for older airframes being retrofitted, winglets) entered the body of knowledge, it became universal for new airframe design.
The gulf between a prototype, typically a one-off object made to laboratory-grade specifications that requires an expert to operate, and a final product is almost as big as that between idea and prototype, and a lot of other specialized skills are necessary to bridge that gulf.

Prototype to product

Any engineering product development textbook will identify a lot of things that separate a prototype from a product, but here are a few off the top of my head (and the figure above):
  • Products have to be mass-produced by production facilities, not prototyping shops or laboratories. Figuring out how to mass-produce a product and organizing that production is what's called production engineering. Sometimes that involves the development of specialized production technology, and its prototyping and production, which might involve production engineering of its own, which might require... etc.
  • Products are to be operated by normal people, not expert operators (the drunk Russian truck drivers in the figure were motivated by the Only In Russia twitter account, a terrible sink of productivity). Though it's not entirely accurate, many people believe that Apple's success stems from its ability to deploy technology into final products by making it accessible to average users. That is the field of user experience design.
  • Products also need to be much more resilient, safe, repairable, and maintainable than prototypes. Though, sadly for the practice of engineering --- and the environment --- the "discard don't repair" mentality has taken hold, so maintainability and repairability aren't priorities in much product design. It being a railway, Hyperloop would have to be designed for both, of course.
There are a lot more. Engineering textbooks exist for a reason; they're not just collections of photos of pretty machines. A lot of knowledge goes into actually making things.

In the case of Hyperloop the product is passenger rail transportation, so there's yet another body of knowledge involved, that of managing railroad operations.

Yes, it sounds exciting, doesn't it?

The whole "how hyperloop will kill you" schtick is nonsensical, since there's no final design to evaluate; but it becomes hilarious when almost all the ways to "kill" the passengers have well-established railroad solutions, namely sectioning (you can isolate sections of a line, and you can have isolation joints in the tube), shunt lines and spurs (to remove a pod from the main tube and access the outside world), instrumentation and control system with appropriate redundancies, and a wealth of other factors that any railroad engineer would be aware of.

I'm not a railroad engineer; these are basic Industrial Management observations.

And then there's deployment…

Anyone with a passing knowledge of operations management or project management could find some possible issues with the infrastructure of Hyperloop, even without knowing the details of the technology. Not impossibilities, issues that might cost money and time.
For example, a number of logistics complications come to mind regarding the construction of the Hyperloop along Route 5, namely: the movement of large-sized tube elements; the use of the Route 5 lanes as part of the construction area (even if most of the staging is done off of the road itself) while it's in use as a public roadway; and let's not forget that California municipalities are among the most anti-change in the world: NIMBY was invented here. Unless you know someone who knows someone who knows…
To have an idea of the scale of the problem created by moving the many elements of the tube, consider what happens when just one large assembly has to move on public roadways:

Building the Hyperloop infrastructure is essentially a large-scale project management problem, and specialists would be involved; I added the example above to show that there are more obvious difficulties than the risk of depressurization; in fact, depressurization isn't much of an issue under good operations management and a well thought-out track.

But pointing out commonsensical logistical difficulties doesn't help with the whole "I am a great scientist, hear me snark" persona.



- - - - - - - - - - Footnote - - - - - - - - - -

* My current view of transportation is that trains and ships are better for freight and cars and airplanes are better for people. By cars I mean autonomous individual vehicles, not necessarily individually owned, chaining for inter-city travel at 200-300 km/h (individual pods self-organizing into convoys), and swarming for autonomous intra-city travel. Most of the current problems with air travel are economic, regulatory, cultural, and managerial, not technological, though I'd like to see supersonic aircraft further along the product development process.

Maybe the Acela corridor would make sense for Hyperloop, though. Particularly since weather in the frozen Winter wasteland and broiling Summer Inferno of the Northeast is more volatile than in California, and the Hyperloop tube would be more resilient than the air shuttles, particularly the small planes. (Boston to NYC late December in a small plane… the horror, the horror.)

But as mentioned above, I believe there are some potential high-value spillovers from the technological developments necessary for Hyperloop, including advances in materials science and production engineering, even if it isn't ever actually built.


A couple of acquaintances asked me why I don't address TF's video (or its follow-up and comments on both YouTube and Reddit) directly. I gave it minimal thought, which is about what it deserves.


But the main reason not to get into online arguments with strangers is basically the same as for not wrestling with a pig: you both get dirty but the pig enjoys it.

Friday, June 17, 2016

More fun with people who "love" science

"How many Joule in a kilowatt-hour?"

This is not a trick question. It's a trivially simple question that requires a middle-school understanding of science. Yet someone whom I'll call Igor (for Ignorant Grandstanding Oblivious Rabble-rouser):

a) Had no idea what I was talking about;

b) Didn't think there was any relation between my question and Igor's topic of "energy";

c) Didn't realize that Igor's ignorance of basic units of energy undermined Igor's credibility as a source of information on "energy"; and

d) Wasn't deterred from continuing a long jeremiad about the "good" types of "energy" and the "bad" types of energy.

(One Watt equals one Joule per second, so one kWh is 3.6 million Joule. I knew this before I was 10, since I was a science geek even then, but it's taught in middle school where I come from.)
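The arithmetic, spelled out (a trivial sketch):

```python
# One watt is one joule per second; a kilowatt-hour is 1000 W sustained for 3600 s.
watts_per_kilowatt = 1000
seconds_per_hour = 3600
joules_per_kwh = watts_per_kilowatt * seconds_per_hour
print(joules_per_kwh)  # 3600000, i.e. 3.6 million joules
```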

Having worked in education for a while (on and off), I've seen many cases where people don't learn, forget what they learned, and forget that there's something to be learned. But Igor is different.

Igor thinks that learning is unnecessary, because Igor already knows. Igor knows because... well, because all Igor's life, Igor was never contradicted as long as Igor's words fit the prevailing narrative. Igor's self-esteem ballooned like a spinnaker in a strong wind, and never deflated. Igor's education avoided science, where Igor might occasionally be wrong, so Igor never learned the most important lesson:

Reality always wins in the end.

Thursday, February 25, 2016

People in glass houses shouldn't call smart kids ignorant


So, an acquaintance forwarded another "kids these days can only take tests but don't know anything important" link; it included these questions as example of the problem:

"Who fought in the Peloponnesian war?  What was at stake at the Battle of Salamis?  Who taught Plato, and whom did Plato teach?  How did Socrates die?  Raise your hand if you have read both the Iliad and the Odyssey.  The Canterbury Tales?  Paradise Lost? The Inferno? 
Who was Saul of Tarsus?  What were the 95 theses, who wrote them, and what was their effect?  Why does the Magna Carta matter?  How and where did Thomas Becket die?  What happened to Charles I?  Who was Guy Fawkes, and why is there a day named after him?  What happened at Yorktown in 1781?  What did Lincoln say in his Second Inaugural?  His first Inaugural?  How about his third Inaugural? Who can tell me one or two of the arguments that are made in Federalist 10? Who has read Federalist 10?  What are the Federalist Papers?"

The funny thing, and I'm not the first one to notice this, is that the people who ask these questions in order to call others ignorant have little knowledge of the sciences, technologies, engineering, and math. (Or economics and business, for that matter.)

So, here's my response:

What happens when you drop metallic copper into sulfuric acid? What does it mean that the half-life of caffeine in the human body is approximately 2 hours? What is the main function of the kidneys and how does the heart work, namely what's connected to each part? Raise your hand if you can write the chemical equations for sodium hydroxide reacting with hydrochloric acid and for the combustion of propane. The quadratic equation solution formula? The equations of motion for a ballistic projectile? The complex conjugate of $(4 - 7i)\times (3+ 2i)$? 
What is discounted cash flow? How far are the Sun and the Moon from Earth? What is kinetic energy, and for a given moving object does it increase more when you double the mass or the speed? Why does the standard error for an estimate matter? How does a pressure cooker do its faster cooking? What's the difference in market outcomes for an increase in demand and an increase in supply, everything else being constant? What happens at Lagrange Points? What amino acids are essential, and why are they "essential"? What's Newton's first law of motion? His second law? What's an example of the difference in programming languages between a cycle and a conditional statement? Who can tell me one or two main differences between Newtonian physics and general relativity? Newtonian physics and quantum mechanics? What makes quantum mechanics "quantum"?

I contend that knowing the answers to my questions is a lot more important than knowing the answers to the first set. Alas, many "educated" people don't think so. After all, most of the top questions lead to discussions where one can say more or less what one wants, but the bottom questions all have outside validators (the science, engineering, math, and economics or business).

The kids may well be ignorant, but the haughty superciliousness of most people whose knowledge base is the Humanities or Social Sciences is completely undeserved.

I'm going to start asking people who make big pronouncements about the ignorance of today's youth to calculate something like the missing value in the diagram above. It's basic Pythagorean theorem, applied twice, so everyone with a basic education should be able to do it, right? Right? RIGHT?

[Thoughts ruminate during the work day…]

The more I think about these two cultures, the more I see it's not just about different knowledge, it's about the focus of attention.

Compare the following question, from the original article:
Who taught Plato, and whom did Plato teach?  
with
What is kinetic energy, and for a given moving object does it increase more when you double the mass or the speed?
The answer the author was looking for, I think, is Socrates and Aristotle. Not the thoughts of Socrates and of Aristotle, but simply the persons. A lot of the questions in the original article are about people or events, not about concepts, ideas, or tools, which are what all my questions are about. (Kinetic energy is the energy of motion, $E_{K} = \frac{1}{2} m v^{2}$ so doubling the speed quadruples the kinetic energy, while doubling the mass only doubles the energy.)
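That last point can be checked in a few lines (a sketch, with hypothetical masses and speeds):

```python
def kinetic_energy(m, v):
    # E_K = (1/2) m v^2
    return 0.5 * m * v**2

base = kinetic_energy(2.0, 3.0)
print(kinetic_energy(4.0, 3.0) / base)  # 2.0: doubling mass doubles E_K
print(kinetic_energy(2.0, 6.0) / base)  # 4.0: doubling speed quadruples E_K
```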

Of course, some questions are out-and-out cultural virtue signaling. I'll see your
Raise your hand if you have read both the Iliad and the Odyssey.
And raise you a
Raise your hand if you have read both Molecular Biology of the Gene and Walter Rudin's Real and Complex Analysis and can answer the questions at the end of the chapters.
Game, set, and match, as they say in the Super Bowl.

One of the funniest things to see is the collision of these two focuses of attention, for example when people who don't like science try to pretend they "love" science by emphasizing people or events. That's when we see "science" questions like
  • Where was Einstein born? 
  • What Nobel Prizes did Marie Curie win?
These are, at best, history questions. Compare with
  • What is the energy of a 1kg mass going $99\%$ of the speed of light? 
  • If we start with 100g of Thorium-231 ($^{231}\mathrm{Th}$, an isotope in the decay chain of Uranium) and wait 51 hours (two half-lives), how much $^{231}\mathrm{Th}$ is left?
The answers to these don't depend on historic events or individual people. (They do relate to the people in the questions above by way of their work.) They require computation and thinking, for real. And that "for real" part is killer. For example, one can argue endlessly about the meaning of texts and the existence of "penumbras" in law or sticking to original intent, but there is no arguing with the technical questions.
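For the curious, both answers take only a few lines to compute. A sketch: the relativistic kinetic energy uses the Lorentz factor $\gamma$, and the question's "51 hours (two half-lives)" implies a $^{231}\mathrm{Th}$ half-life of 25.5 hours:

```python
import math

c = 299_792_458.0                        # speed of light, m/s
m = 1.0                                  # kg
v = 0.99 * c
gamma = 1 / math.sqrt(1 - (v / c) ** 2)  # Lorentz factor, about 7.09
kinetic = (gamma - 1) * m * c**2         # relativistic kinetic energy, joules
print(f"{kinetic:.2e} J")                # on the order of 5e17 J

# Th-231: two half-lives leave (1/2)^2 of the initial mass
initial_g = 100.0
elapsed_half_lives = 51 / 25.5
remaining_g = initial_g * 0.5 ** elapsed_half_lives
print(remaining_g)                       # 25.0 g
```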

That's one of the big issues that separates technical material from "soft" material: there's really an answer, and that answer can be shown to be right or tested with experiments that don't depend on feelings or whether Taul of Sarsus came up with it in the $94 \frac{1}{2}$ theses he nailed to the door of the Delicatessen in Wittenberg while he went in for a Schlagobers after the battle of the Salamis (pork against beef against chicken against vegan).

BTW, people who "love" science and haughty non-STEM professoriate: what's the answer to those two technical questions? Hint: don't forget the Lorentz correction.

"Won't someone rid us of these meddlesome quants?"

Sunday, February 8, 2015

Science popularization has an identity problem

Some influential science popularizers are doing a disservice to public understanding of science and possibly even to science education.

Yes, it's a strong statement. Alas, it's a demonstrable one.

With the caveats that I enjoy the Mythbusters show, especially the recent series with their back-to-origins style, and that this post is not specifically about them, the recent episode about The A-Team presented an almost-perfect example of the problem.

"Stoichiometry."

Midway through the episode Adam uses this word. It's an expensive way of saying "mass balancing of chemical equations" (not how it was described in the show). And then, well... and then Jamie proceeded to not use stoichiometry.

To be concrete: they were exploding propane. Jamie tried mixing it with pure oxygen and got a big explosion. Then they mention stoichiometry. At this point, what they should have done was to introduce some basic chemistry.

The propane molecule has 3 carbon and 8 hydrogen atoms, $\mathrm{C}_{3} \mathrm{H}_{8}$. It burns with molecular oxygen, $\mathrm{O}_{2}$, yielding carbon dioxide, $\mathrm{C} \mathrm{O}_{2}$, and water vapor, $\mathrm{H}_{2} \mathrm{O}$.

Chemists represent reactions with equations, like this:

$\mathrm{C}_{3} \mathrm{H}_{8} + \mathrm{O}_{2} \rightarrow \mathrm{C} \mathrm{O}_{2} + \mathrm{H}_{2} \mathrm{O}$

This equation is unbalanced: for example, there are three carbons on the left-hand side, but only one on the right-hand side. By changing the proportions of reagents, we can get both sides to match:

$\mathrm{C}_{3} \mathrm{H}_{8} + \mathbf{5} \, \mathrm{O}_{2} \rightarrow \mathbf{3} \, \mathrm{C} \mathrm{O}_{2} + \mathbf{4} \, \mathrm{H}_{2} \mathrm{O}$

Once we have this balance, we can determine that we need 160 grams of oxygen for each 44 grams of propane. For this we need to look up the atomic masses (to compute molar masses) of carbon (12 g/mol), hydrogen (1 g/mol) and oxygen (16 g/mol). (*)
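The bookkeeping is short enough to sketch:

```python
# Atomic masses (g/mol): C = 12, H = 1, O = 16
propane_molar = 3 * 12 + 8 * 1   # C3H8 -> 44 g/mol
oxygen_molar = 2 * 16            # O2   -> 32 g/mol

# Balanced reaction: C3H8 + 5 O2 -> 3 CO2 + 4 H2O
oxygen_needed = 5 * oxygen_molar  # 160 g of O2 per 44 g of propane
ratio = oxygen_needed / propane_molar
print(oxygen_needed, round(ratio, 2))  # 160 3.64
```

That ratio, roughly 3.6 to 1 by mass, is the answer Jamie could have computed before touching a single valve.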

Back on the Mythbusters, after mentioning stoichiometry, Jamie starts trying out different proportions of propane to oxygen. If he had actually used stoichiometry he'd already have the proportions calculated, as I did above, about four times more oxygen than propane by mass; no need to experiment with different proportions.

(Yes, there's a lot of experimentation in engineering, but no engineer ignores the basic scientific foundations of her field. Chemical engineers don't figure out mass balances by trial and error; they use trial and error after exhausting the established science.)

This illustrates a major problem in the way science is being popularized: to a segment of the educated and interested audience, science is an identity product. Like a Prada bag or a sports franchise logo on a t-shirt, they see science as something that can signal membership in a desired group and exclusion from undesirable groups.

Hence the word "stoichiometry" inserted in a show that doesn't actually use stoichiometry.

"Stoichiometry" here is, like the sports franchise logo, purely a symbol. The audience learns the word, in the sense that they can repeat it, but not the concept, let alone the principles and the tools of stoichiometry. The audience gains a way to signal that they "like" science, but no actual knowledge. Like a sedentary person who wears "team colors" to watch televised games.

Some successful science popularizers pander to this "like, not learn, science" audience, instead of trying to use that audience's interest in science to educate them.

So what, most people will ask. It's the market working: you give the audience what they want. And there's no question that selling science as identity is good business. Shows like House MD, Bones, and The Big Bang Theory all take advantage of this trend. Gift shops at science museums cater to the identity much more than the education: a look at their sales typically finds much more logo-ed merchandise than chemistry sets or microscopes.

(Personal anecdote: despite having three science museums nearby, I had to use the web to get a real periodic table poster. A printable simple table from Los Alamos National Lab.)

"Liking" science without learning it is bad for society:

1. Crowds out opportunities for education. People have limited time (and money) for their hobbies and activities. If they spend their "science budget" on identity, they won't have any left for actual science learning. Many more people read Feynman's two autobiographies than his Lectures On Physics or his popular physics books.

2. Devalues the work of scientists and engineers, by presenting a view of science that excludes the hard work of learning and the value of the knowledge base (trial-and-error in lieu of mass balance calculations, for example). Some people end up thinking that science is just another type of institution credential (or celebrity worship) instead of being validated by physical reality.

3. Weakens science education. Some people who go into science expect it to be easy and entertaining (in the purely ludic sense), instead of hard but rewarding (deriving satisfaction from really understanding something), as that's what the popularization depicts. They then want schools to match those expectations. While colleges may not want to simplify science and engineering classes, they put pressure on faculty for more "engaging" teaching: less technical, more show. (**)

4. As science becomes more of an identity product to some people, and increasingly perceived as identity-only by others, it becomes more vulnerable to non-scientific identity threats, such as derailing a major scientific and technical achievement in space exploration by talking about sartorial choices and sociological forces in academia.


So, what can we do?

First, we should recognize that an interest in science, even if currently trending towards identity, can be channeled into support for science and science education. As societal trends go, a generalized liking for science is better than most alternatives.

Second, there are plenty of sources of information and education that can be used to learn science. There's a broad variety of online resources for science education at different levels of knowledge, free and accessible to anyone with an internet connection (or indeed a library card; books were the original MOOCs).

Third, current "science as identity" popularizers may be open to educating their audiences. Contacting them, offering feedback, and using social media to otherwise proselytize for science (as in scientific knowledge and thinking like a scientist) might induce them to change their approach.

The most important thing anyone can do, though, is to try to get people who "like" science to understand that they should really learn some.

(Final note on the A-Team episode: Adam should have played Murdock, not Hannibal.)

- - - -
(*) I learned to do this on my own as a kid, but the material was covered in ninth grade chemistry. (A long time ago in a country far away, in ninth grade you chose a technical or artistic area in school; mine was 'chemical technology' because my school didn't have electronics.) A side-effect of my early interest in chemistry is that I have quasi-Brezhnevian eyebrows: you burn them off five or six hundred times, they grow back with a vengeance.

(**) Some schools protect their main reputation-building degrees by creating non-technical versions of the technical courses and bundling them into subsidiary degrees. So, for example, they have information technology courses, which sound like computer science courses but are in fact nothing like them.
Another approach is the encroachment of humanities, arts, and social sciences "breadth" requirements into science and engineering degrees. When I studied EECS in Europe, we had five years of math, physics, chemistry, and engineering courses. A similar degree in the US has four years and usually a minimum of one-year-equivalent of those "breadth" requirements, though some people can have more than two-year-equivalent by choosing "soft engineering" courses like "social impact of computers."

Friday, December 6, 2013

Word salad of scientific jargon

"The scientists that I respect are scientists who work hard to be understood, to use language clearly, to use words correctly, and to understand what is going on. We have been subjected to a kind of word salad of scientific jargon, used out of context, inappropriately, apparently uncomprehendingly." – Richard Dawkins, in the video Dangerous Ideas - Deepak Chopra and Richard Dawkins, approximately 27 minutes in.

That's how I feel about a lot of technical communications: conference panels, presentations, and articles. An observed regularity: the better the researchers, the less they tend to go into "word salad of scientific jargon" mode.

Wednesday, November 27, 2013

Intellectual counterfeit fashionistas and the corruption of STEM and analytics

I have acquaintances who say they like classical music but never listen to it and can't tell Bach from Brahms. While this is entertaining to classical music aficionados, a similar disconnect happens in STEM and business analytics, where it has serious consequences.

I've observed many people who are always saying how important science is, who can name several recent Nobel laureates in the sciences, but can't compute the kinetic energy of a 2-ton SUV going 65MPH (766kJ), or, ironically, can't explain what the research of those Nobel laureates was about.
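For readers who want to check that kinetic energy figure, here is a minimal sketch in Python. It assumes "2-ton" means two US short tons and uses the standard conversion factors; the variable names are mine, chosen for illustration.

```python
# Back-of-the-envelope check of the SUV kinetic energy figure in the text.
# Assumptions: "2 tons" = 2 US short tons (907.185 kg each); speed = 65 MPH.

TON_KG = 907.18474      # one US short ton in kilograms
MPH_TO_MS = 0.44704     # miles per hour to meters per second

mass = 2 * TON_KG              # ~1814.4 kg
speed = 65 * MPH_TO_MS         # ~29.06 m/s

kinetic_energy = 0.5 * mass * speed ** 2   # KE = (1/2) m v^2, in joules
print(f"{kinetic_energy / 1000:.0f} kJ")   # prints "766 kJ"
```

Using metric tonnes (1000 kg) instead would give about 844 kJ, so the 766 kJ in the text indeed assumes US tons.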

I know people who are always talking about Big Data™ and "the" Management Information Revolution™ (yes, they think the current one is the only one), but cannot write Bayes's formula and think that standard deviation is the same as standard error.
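Both of those items fit in a few lines of Python. This is a sketch with made-up illustrative numbers (a 1% base rate, a 90%-sensitive test with a 5% false-positive rate, and a toy data sample), not any particular real-world case:

```python
import math

# Bayes's formula: P(H|E) = P(E|H) * P(H) / P(E).
# Illustrative numbers: 1% base rate, 90% sensitivity, 5% false-positive rate.
p_h = 0.01
p_e_given_h = 0.90
p_e = p_e_given_h * p_h + 0.05 * (1 - p_h)          # total probability of E
p_h_given_e = p_e_given_h * p_h / p_e               # ~0.154, not 0.90

# Standard deviation (spread of the data) vs. standard error of the mean
# (uncertainty of the sample mean): SE = SD / sqrt(n). Not the same thing.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)
mean = sum(data) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))  # sample SD
se = sd / math.sqrt(n)

print(f"P(H|E) = {p_h_given_e:.3f}, SD = {sd:.3f}, SE = {se:.3f}")
```

The point of the first calculation is that a "90% accurate" positive test only raises the probability to about 15% at a 1% base rate; the point of the second is that SD stays roughly constant as you collect more data, while SE shrinks with the square root of the sample size.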

These are the signs of the rise of the intellectual counterfeit fashionista (ICF). The ICF wants others to consider him or her an intellectual (that's the I), up to date on the latest hottest intellectual topic (that's the F), but is not willing to do the work and the learning necessary to understand that topic (that's the C).

No matter how infuriating or entertaining an ICF can be on a personal level, their rise is a problem -- chiefly because of their effect on education, the practice of technical professions, and the general perception of STEM and analytics in society.

Education: by trying to recruit proto-ICFs into STEM/analytics, teaching institutions end up having to water down their courses, since the ICFs don't want to do the work needed for real learning. This leads to lower quality education for every student, even the non-ICFs.

In the mid-to-long term, this creates a number of credentialed ignoramuses and gives rise to the strange situation where people who hire engineers say there's a dearth of them, while engineering associations say there's a glut. I guess it depends on how you define engineer, by skills or by credentials.

Professions: the obvious effect of ICFs is the rise in average incompetence. The more pernicious effect is the destructive nature of internal politics, which always increases in organizations with large numbers of people for whom appearances and narratives are more important than observable realities and hard work.

I wish nerdiness became unfashionable again, so that the ICFs moved on to corrupt something else and left STEM and analytics alone.