My series on Quantum Field Theory was meant to document my own learning endeavors, but it has turned into a meta-contemplation on the ‘explainability’ of theoretical physics.

Initially I had been motivated by a comment David Tong made in his introductory lecture: Comparing different QFT books, he states that Steven Weinberg’s books are hard reads because *at the time of writing Weinberg probably knew more about Quantum Field Theory than anyone else in the world*. By contrast, Weinberg’s book on General Relativity is accessible, which Tong attributes to Weinberg’s having learned GR himself while writing that textbook.

So I figured nothing can go awry if I don’t know too much myself. Of course you should know what you are talking about – and avoid masking ignorance with vague phrases such as *scientists proved*, *experts said*, or *in a very complicated process XY has been done*.

Yet my lengthy posts on phase space didn’t score too high on the accessibility scale. Science writer Jennifer Ouellette blames readers’ confusion on writers not knowing their target audience:

This is quite possibly the most difficult task of all. You might be surprised at how many scientists and science writers get the level of discourse wrong when attempting to write “popular science.” Brian Greene’s The Elegant Universe was an undeniably important book, and it started off quite promising, with one of the best explications of relativity my layperson’s brain has yet encountered. But the minute he got into the specifics of string theory — his area of expertise — the level of discourse shot into the stratosphere. The prose became littered with jargon and densely packed technical details. Even highly science-literate general readers found the latter half of the book rough going.

Actually, I have experienced this effect myself as a reader of popular physics books. I haven’t read *The Elegant Universe*, but Lisa Randall’s *Warped Passages* or her *Knocking on Heaven’s Door* are in my opinion similar with respect to an exponential learning curve.

Authors go to great lengths in explaining the mysteries of ordinary quantum mechanics: the double-slit experiment, Schrödinger’s cat, the wave-particle dualism, probably a version of Schrödinger’s equation motivated by analogies to hydrodynamics.

Then tons of different fundamental particles get introduced – hard to keep track of if you don’t have a print-out of the Standard Model of particle physics at hand, but still doable. But suddenly you find yourself in a universe you have lost touch with. Re-reading such books now, I find full-blown lectures on QFT compressed into single sentences. The compression rate here is much higher than for the petty QM explanations.

I have a theory:

**The comprehensibility of a popular physics text is inversely proportional to the compression factor of the math used** (even if math is not explicitly referenced).

In *Pi in the Sky* John Barrow mulls over succinct laws of nature in terms of *the unreasonable effectiveness of mathematics*. An aside: yet Barrow is as critical as Nassim Taleb with respect to the allure of ‘*Platonicity*’: *What is most remarkable about the success of mathematics in [particle physics and cosmology] is that they are most remote from human experience* (quote from *Pi in the Sky*).

Important concepts in QM can be explained in high school math. My old high school physics textbook contained a calculation of the zero point energy of a Fermi gas of electrons in metals.

Equations in advanced theoretical physics might still appear simple, still using symbols taken from the Latin or Greek alphabets. But unfortunately these letters denote mathematical objects that are not simple numbers – this is highly efficient, compressed notation. These objects are the proverbial *mathematical machinery* (*) that *acts on* other objects. Sounds like the vague phrases I scathed before, doesn’t it? These operators are rather like software programs using the thing to the right of the machine as input – but that’s already too much of a metaphor, as the ‘input’ is not a number either.

(*) In earlier posts I used the also-common term *mathematical crank*, which I avoid now for obvious reasons.

You can create rather precise metaphors for differential operators in classical physics, using references to softly rolling hills and things changing in time or (three-dimensional) space. You might be able to introduce the curly small d’s in partial derivatives when applying these concepts to three-dimensional space. More than three dimensions can be explained by resorting to the beetle-on-a-balloon or ant-in-a-hose metaphors.

But if it gets more advanced than that I frankly run out of metaphors I am comfortable with. You ought to explain some purely mathematical concepts before you continue to discuss physics.

I think comprehension of those popular texts on advanced topics works this way:

- You can understand anything perfectly if you have once developed a feeling for the underlying math. For example you can appreciate descriptions of physical macroscopic objects moving under the influence of gravity, such as in celestial mechanics. Even if you have forgotten the details of your high school calculus lectures, you might remember some facts on acceleration and speed you needed to study when cramming for your driver’s license test.
- When authors start to introduce new theoretical concepts there is a grey area of understanding – allowing for stretching your current grasp of math a bit. So it might be possible to understand a gradient vector as a slope of a three-dimensional hill even if you never studied vector calculus.
- Suddenly you are not sure if the content presented is related to anything you have a clue of or if metaphors rather lead you astray. This is where new mathematical concepts have been introduced silently.

The effect of silently introduced, cloaked math may be even worse, as readers believe they understand although they have been led astray. Theoretical physicist (and seasoned science blogger) Sabine Hossenfelder states in her post on metaphors in science:

Love: Analogies and metaphors build on existing knowledge and thus help us to understand something quickly and intuitively.

Hate: This intuition is eventually always misleading. If a metaphor were exact, it wouldn’t be a metaphor.

And while in writing, art, and humor most of us are easily able to tell when an analogy ceases to work, in science it isn’t always so obvious.

My plan has been to balance metaphors and rigor by reading textbooks in parallel with popular science books. I am mainly using Zee’s *Quantum Field Theory in a Nutshell*, Klauber’s *Student Friendly Quantum Field Theory*, and Tong’s lecture notes and videos.

But I also enjoyed Sean Carroll’s *The Particle at the End of the Universe* – my favorite QFT- / Higgs-related pop-sci book. Reading his chapters on quantum fields I felt he has boldly gone where no other physicist writing pop-sci had gone before. In many popular accounts of the Higgs boson and Higgs field we find somewhat poetic accounts of *particles that communicate forces*, such as the *photon being the intermediary of electromagnetic forces*.

Sean Carroll goes to the mathematical essence of the relationship of (rather abstract) symmetries, connection fields and forces:

The connection fields define invisible ski slopes at every point in space, leading to forces that push particles in different directions, depending on how they interact. There’s a gravitational ski slope that affects every particle in the same way, an electromagnetic ski slope that pushes positively charged particles one way and negatively charged particles in the opposite direction, a strong-interaction ski slope that is only felt by quarks and gluons, and a weak-interaction ski slope that is felt by all the fermions of the Standard Model, as well as by the Higgs boson itself.

Indeed, in his blog Carroll writes:

So in the end, recognizing that it’s a subtle topic and the discussion might prove unsatisfying, I bit the bullet and tried my best to explain why this kind of symmetry leads directly to what we think of as a force. Part of that involved explaining what a “connection” is in this context, which I’m not sure anyone has ever tried before in a popular book. And likely nobody ever will try again!

This is the best popular account of symmetries and forces I could find so far – yet I confess: I could not make 100% sense of it before I had plowed through the respective chapters in Zee’s book. This is the right place to add a disclaimer: Of course I hold myself accountable for possibly slow powers of absorption or a wrong approach to self-study, as well as for confusing my readers. My brain is just the only one I have access to for empirical analysis right now, and the whole QFT thing is an experiment. Maybe I should just focus on writing about current research in an accessible way, or keep a textbook-style learner’s blog similar to this one.

Back to metaphors: Symmetries are usually explained by invoking rotating regular objects and crystals, but I am not sure if this image will inspire anything close to *gauge symmetry* in readers’ minds. Probably worse: I had recalled gauge symmetry in electrodynamics, but it was not straightforward how to apply and generalize it to quantum fields – I needed to see some equations.

Sabine Hossenfelder says:

If you spend some time with a set of equations, pushing them back and forth, you’ll come to understand how the mathematical relationships play together. But they’re not like anything. They are what they are and have to be understood on their own terms.

Actually I had planned a post on the different routes to QFT – complementary to my post on the different ways to view classical mechanics. Unfortunately I feel the mathematically formidable path integrals would lend themselves more to metaphoric popularization – and thus more confusion.

You could start with fields and quantize them, which turns the classical fields (numbers attached to any point in space and time) into mathematical operators that actually create and destroy particles. Depending on the book you pick, this is introduced either as something straightforward or as a big conceptual leap. My initial struggles with re-learning QFT concepts were actually due to the fact that I had been taught the ‘dull’ approach (many years ago):

- Simple QM deals with single particles. Mathematically, such a system is described by states and the probabilities of a particle occupying them. Mathematical operators let you take the proverbial quantum leap – from one state to another. In QM lingo you destroy one state and create another.
- There are many particles in condensed matter, so we just extend our abstract space: the system is described not only by the properties of each particle, but also by the number of particles present. Special relativity might not matter.
- Thus it is somehow natural that our machinery now creates or annihilates particles.
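That ‘machinery’ can be written down compactly – a sketch only, using the standard ladder-operator notation of textbook QM (not tied to any particular book mentioned here). For a state containing n quanta:

```latex
a^{\dagger}\,|n\rangle = \sqrt{n+1}\,|n+1\rangle, \qquad a\,|n\rangle = \sqrt{n}\,|n-1\rangle
```

The operator a† creates one quantum and a destroys one; in QFT these quanta become the particles themselves.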

The applications presented in relation to this approach were all taken from solid state physics, where you deal with lots of particles anyway and creating and destroying some is not a big deal. It is more exciting if virtual particles are created from the vacuum, violating the conservation of energy for a short time in line with the uncertainty principle.

The alternative to this route (which is technically called canonical quantization) is the so-called path integral formalism. Zee introduces it via an anecdote of a wise-guy student (called Feynman) who pesters his teacher with questions on the classical double-slit experiment: A particle emitted from a source passes through one of two holes, and a detector records a spatially varying intensity caused by interference. Now the wise guy asks: What if we drill a third hole, a fourth hole, a fifth hole? What if we add a second screen, a third screen? The answer: as we add more paths the particle might take, the amplitudes related to these paths also contribute to the interference pattern.

Now the final question is: *What if we remove all screens – by drilling infinitely many holes into them?* Then all possible paths the particle can traverse from source to detector contribute. You *sum over all (potential) histories*.

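In the usual compact notation – again a sketch, not a rigorous definition – this sum over histories becomes an integral over all paths x(t), each path contributing a phase determined by the classical action S:

```latex
A(\text{source} \to \text{detector}) = \sum_{\text{paths}} e^{iS[\text{path}]/\hbar} \;\longrightarrow\; \int \mathcal{D}x(t)\, e^{iS[x(t)]/\hbar}
```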
I guess a reasonable pop-sci article would not go into further details of what it means to sum over an infinite number of paths and yet get reasonable – finite – results, or expound why on earth this should be equivalent to operators destroying particles. We should add that the whole amplitude-adding business was presented as an axiom. *This is weird, but this is how the world seems to work!* (Paraphrasing Feynman.)

Then we would insert an opaque black box [something about the complicated machinery – see details on path integrals if you really want to] and jump directly to things that can eventually be calculated, like scattering cross-sections and predictions of how particles will interact with each other in the LHC … and gossip about Nobel Prize winners.

Yet it is so tempting to ponder how the classical action (introduced here) is related to this path integral: Everything we ‘know about the world’ is stuffed into the field-theoretical counterpart of the action. The action defines the phase (‘angle’) attached to a path. (Feynman talks about rotating arrows, too!) Quantum phenomena emerge when the action becomes comparable to Planck’s constant. If the action is much bigger, most of the paths cancel out: when phases fluctuate wildly, the contributions of different paths cancel each other.
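This cancellation can be made tangible in a toy numerical experiment – a sketch only, with a made-up quadratic ‘action’ S/ħ = a·x² standing in for the real thing. Contributions from paths near the stationary (classical) path at x = 0 add up, while contributions far away oscillate so rapidly that they cancel:

```python
import numpy as np

def phase_factor(x, a):
    # e^{iS/hbar} for a toy quadratic action with S/hbar = a * x^2
    return np.exp(1j * a * x**2)

a = 100.0    # action >> hbar: wildly fluctuating phases
dx = 0.0002
near = np.arange(-1.0, 1.0, dx)   # paths close to the classical path at x = 0
far = np.arange(2.0, 4.0, dx)     # paths far away from it

amp_near = abs(np.sum(phase_factor(near, a)) * dx)
amp_far = 2 * abs(np.sum(phase_factor(far, a)) * dx)  # factor 2 for both symmetric far regions

print(amp_near, amp_far)  # the near-classical region dominates the sum
```

Making `a` smaller (action comparable to ħ) blurs this dominance – which is exactly where quantum phenomena emerge.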

“I am not gonna simplify it. If you don’t like it – that’s too bad!”

Thanks for this article.

That’s exactly why I stopped reading pop-sci books or articles in Scientific American. I have filed some SciAm articles from the 80s about gauge theories, written by Nobel laureates. They are full of metaphors but – at least for me – incomprehensible without math. Same with all the Higgs stuff last year.

I can only understand these things if they fall out of the math.

So I decided to do it the hard way:

I worked through Klauber’s *Student Friendly Quantum Field Theory* as well, and now I know what a gauge theory is.

This enables me to understand the SciAm article 8-).

Thanks a lot for your comment and the follow! For me it was also the pop-sci reports on the Higgs boson that ‘forced’ me to read about the real stuff.

My favorite part of this post is the Feynman clip … “I am not gonna simplify it. If you don’t like it – that’s too bad!” I don’t know how many times I wish I could have said that to my students. Sometimes I find that I spend too much time finding new ways to make things easier for students. I worry that taking these alternate approaches may not, ultimately, be best for the students – or, at least, not best for the best students. I’m torn … is it my job to get them to understand at all cost (even to the material itself) or is it my job to explain as I understand it and push them to catch up? The latter mode is what I experienced as an undergraduate and I turned out OK. What do you think? D

Thanks again for reading my wall of text, Dave! I fully agree with you – after all, university is also about learning to learn (actually this should start in high school already…). So – yes, students should be pushed. You only learn by trying to figure it out for yourself anyway. So you explain it “your way” (that will never please every student’s learning style anyway) and students need to make themselves familiar with the material afterwards (or probably even before, in preparation for a lecture).

Probably, as a teacher, you could vary the presentation style so that sometimes the “visual learners” are more pleased, and sometimes the “listening learners”? But you can’t present all the material in all possible ways all the time. I am not sure… but one thing I demand: Even if lectures are not easy to understand, the teacher as such should be inspiring. (I think Feynman was not that easy to understand.)

But you have pushed a button (again) by saying “… and I turned out OK”. This is something I hardly dare say in your typical party small-talk on educational matters. Learning and understanding in an academic setting always came easy to me – any subject, not only math and physics (I seriously considered majoring in philosophy and literature…). I was not a lazy student, but I definitely did not have to put in tons of hours to get excellent grades. Thus, when I utter an opinion like “students should be pushed a bit to figure it out for themselves” I always hear: “You cannot judge that because for you it was so easy!!”.

I disagree, as I wasn’t an autistic child able to manipulate enormous numbers in my head, or something. I believe it works this way: If you have some talent (and I think anybody has… in some field…) you can sort of amplify that talent by putting in some (moderate level of) deliberate self-discipline. Which does not mean you have to cram extremely hard for exams… but you need to put yourself in a mental state that makes you enjoy so-called hard work, like trying to solve a mathematical problem again and again… alone! This will pay off rather soon and save a lot of time when you encounter similar problems in the future. An Austrian philosophy professor (a sort of public intellectual) has written a book about the modern educational system that he himself calls a polemic – for me it was pure joy to read. He scathes management consulting B.S. and “quality management” at universities, as well as the myth that learning should always be joyful and involve play, group projects, 3D animations etc. He says: It won’t work without putting in some hard work yourself.

I believe it is key to make yourself love that so-called hard work of finding things out – instead of demanding that that part should go away. I think this can be expected of a student who picked a field to major in?

I conjecture any child initially has that curiosity combined with natural perseverance (?) What does the evolutionary biologist say about our curiosity – does it come naturally to our brains? Is it fostered more in ancient tribal cultures, for example? (…need to re-read Jared Diamond maybe…)

Thanks again for the comment!!

I wonder if part of the accessibility problem isn’t also that the more esoteric a subject the fewer common metaphors there are to draw on. There’ve been hundreds of efforts at coming up with accessible representations of (say) special relativity with balls bouncing up and down in railroad cars and lights being shined against mirrors aboard spaceships and so on, and the good ones have had the chance to propagate and the abysmal ones get forgotten. Phase space? … Not so well explored. Quantum field theory? … There’s a lot of mathematical underpinning that needs explanation before you even get to the stuff to be explained.

Thanks, Joseph – I think you are right.

For less esoteric subjects, so-called metaphors are often real isomorphisms. I probably didn’t use that term 100% correctly, but I hope you know what I mean: E.g. a rubber membrane stretched by differently shaped objects is an exact 1:1 model for electrostatic fields – such models have even been used in earlier times as “analog computers”. But the farther you move away from anything isomorphic to 3D space (and time independent of it), the more “poetic” and “vague” metaphors become.
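For what it’s worth, that membrane/electrostatics correspondence can be stated precisely: in charge-free regions the electrostatic potential φ obeys Laplace’s equation, and the height u of a lightly stretched membrane obeys the same equation – which is why the model is exact rather than merely poetic:

```latex
\nabla^2 \varphi = 0, \qquad \nabla^2 u = 0
```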

This recent article is an excellent example – it gives an overview of QFT and of the latest research in mathematical methods that would dramatically simplify tedious calculations (calculations that would be much more complicated using Feynman diagrams): https://www.simonsfoundation.org/quanta/20130917-a-jewel-at-the-heart-of-quantum-physics/

You have no chance of understanding what is really going on, although it is a well-written and well-researched article and some abstract concepts (the Grassmannian…) are introduced – you simply cannot make it more comprehensible without turning a single article into a series of lectures, I guess.

Just a few days ago, while browsing through the “Chapters” bookstore near my home, I noticed something peculiar in the “Math” section: None – not one – of the books in that section had any significant amount of mathematical notation in them. No variables, no equations, no standard mathematical sentences of any kind, and hardly any diagrams! I have been thinking about this while reading your post. Chapters is a huge bookstore chain and, if you consider that they have done their homework and stock what people want to buy, then it’s pretty clear that people do not want to buy mathematics. Despite what people such as myself have been trying to say to students, it is not fun at all for most people.

Now, of course, I could get going with a fine rant about:

1. How terrible it is that it’s become socially acceptable–almost desirable–to claim that you are no good at math.

2. How deceitful it is to claim that math or physics are easy to learn when, in fact, they are the product of insight and genius.

But I will not do that at this point. 🙂

I particularly like the message you have embedded here: when presenting difficult theoretical concepts it’s important to strike two kinds of balance: (1) between verbal descriptions and more precise mathematical ones, and (2) between the author’s understanding and the reader’s understanding. Based on my experience as a teacher, I can further weigh in with my own belief that there is no one best approach, since every reader/student comes to the topic with somewhat unique experiences, prior knowledge and strengths. The best thing, overall, is (1) for the writer to ensure that (s)he uses a varied approach and (2) for the reader to use multiple resources.

One final thing. In the early nineties I came across what I think is a decent introductory text for Calculus. My office-mate’s children were at university at the time and were using *Calculus* by Stewart. I’m happy to say that the books are still around. In fact the lineup has been expanded and, I believe, the author has become fairly wealthy from the sales. What do you know – a math best seller! Stewart did it the right way – he classroom-tested his methods and put together a readable series of books that are responsive to student needs. http://www.stewartcalculus.com/ We need more of that!

Thanks, Maurice!! I agree (re math-free books), but perhaps books like this one signal a new trend? I considered buying it for “research on explaining math”:

The Theoretical Minimum: What You Need to Know to Start Doing Physics, co-authored by famous Leonard Susskind: http://www.amazon.com/The-Theoretical-Minimum-Start-Physics-ebook/dp/B00BSEQM9U/ref=sr_1_2?ie=UTF8&qid=1383670105&sr=8-2 (developed from a series of lectures, thus classroom-tested).

I found it imperative to use different books, especially with QFT, because of the different philosophical approaches. If I had to pick one, I’d pick Klauber though, as – in contrast to most advanced texts – it is really written from the student’s perspective, which is rare. He emphasizes a lot which topics he had found impenetrable as a student and why – he is like a student going back to his old list of questions and filling in the answers.

Usually experienced researchers write introductory books not (only) for students but to show off their unique grasp of the subject. As brilliant and insightful as Feynman’s Physics Lectures are… he kind of knew it, and you could append “Wasn’t that brilliant?” to every paragraph 😉 Zee’s book is Feynman-y at times, so I like the combination with Klauber who, on the contrary, avoids terms such as *easy*, *obvious*, and *trivial*, and does all derivations in full detail.

Nothing worse in the middle of a derivation than seeing “the rest is left as an exercise.” arrrrrrrrgggghhhh 😦

You probably might not like Zee’s book 🙂

Nice article. As someone with an interest in science, but not much background in math, I’ve learned a lot from popular science books. However, I always suspected that the more comprehensible ones — the ones with the clever metaphors — were the equivalent of ‘physics for dummies.’ Also, when it comes to something like QM, which has multiple interpretations, metaphors can be misleading. If the author of the pop-sci is a scientist, I worry about this a little less. But if I’m reading science journalism, I worry about it a lot. How do I know the author has gotten the implications of the science right? I’m not a specialist. The same issue applies to philosophy of science. Unless philosophers also have advanced degrees in the relevant sciences — which some do, but many don’t — they may have to rely on essentially pop-sci material.

Thanks, Dan! I agree – the really clever metaphors are ‘physics for dummies’. Aatish Bhatia’s legendary post on so-called negative absolute temperature comes to my mind: http://www.empiricalzeal.com/2013/01/05/what-the-dalai-lama-can-teach-us-about-temperatures-below-absolute-zero/ But I believe there is a caveat: This metaphor was so ingenious because of the perfect 1:1 correspondence of e.g. ‘happiness’ and ‘entropy’. However, the whole point of the explanation (and of debunking the hype around these experiments) was that ‘temperature’ is actually derived backwards from a formula including other quantities – if you tweak the latter, you can make temperature come out negative. The metaphor in accessible language is the 1:1 counterpart of a very concise exponential function. You have lost all the compression upsides of math.

But if you need to write an article about ‘more than one formula’ you cannot apply this method. I had noticed this myself (which is not to say my attempts were particularly clever…): I spent walls of text trying to explain a rather simple math fact – the definition of that abstract space. It dawned on me: How will I ever deal with the translation of all the math I am reading now if I need to expand the material that way?

As you mention philosophy of science and credibility, I cannot resist posting (I assume for the 100th time) a link to the Sokal hoax: http://en.wikipedia.org/wiki/Sokal_affair. As the author was a physicist indulging in “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity” (I do love this title!!), the sociology experts simply believed his authority and didn’t detect the nonsense. I am not sure, however, if the editors were really unqualified, as Sokal said he used “the silliest quotations [by postmodernist academics] he could find about mathematics and physics” – it seems they were just gullible (or tried to save time on proof-reading).

Thanks for the lead on the Sokal Affair. Sadly, it doesn’t surprise me that the journal published the article. If you want a laugh, here’s a primer on how you can write and speak ‘postmodern’: http://vserver1.cscs.lsa.umich.edu/~crshalizi/how-to-talk-postmodern.html. It’s a parody, but it’s very close to the truth.

Thanks, this is awesome!! I would like to mention (Facebook-style): @MHatzel and @postmoderndonkey 🙂