Thursday, October 10, 2024

How we learn (Benedict Carey)

 
2014

Carey, Benedict.
How we learn: the surprising truth about when, where, and why it happens/ Benedict Carey. 
1. learning, psychology of.
2. learning.

BF318.C366 2014
153.1'5--dc23


Benedict Carey, How we learn, 2014                                          [ ]



Benedict Carey, science reporter at The New York Times 

from book, front cover, inner flap
learning tools like forgetting, sleeping, and daydreaming. 

   By road testing many of the counterintuitive techniques described in this book, Carey shows how we can flex the neural muscles that make deep learning possible. Along the way he reveals why teachers should give final exams on the first day of class, why it's wise to interleave subjects and concepts when learning any new skill, and when it's smarter to stay up late prepping for that presentation than to rise early for one last cram session. And if this requires some suspension of disbelief, that's because the research defies what we've been told, throughout our lives, about how best to learn.
   The brain is not like a muscle, at least not in any straightforward sense. It is something else altogether, sensitive to mood, to timing, to circadian rhythms, as well as to location and environment. It doesn't take orders well, to put it mildly. If the brain is a learning machine, then it is an eccentric one.


contents

introduction: broaden the margins                      ix 

part one: basic theory
  1. the story maker                                    3
     the biology of memory

  2. the power of forgetting                           21
     a new theory of learning 

part two: retention 
  3. breaking good habits                              45
     the effect of context of learning 

  4. spacing out                                       65
     the advantage of breaking up study time 

  5. the hidden value of ignorance                     80
     the many dimensions of testing 

part three: problem solving 
  6. the upside of distraction                        113
     the role of incubation in problem solving 

  7. quitting before you're ahead                     131
     the accumulating gifts of percolation 

  8. being mixed up                                   149
     interleaving as an aid to comprehension             

part four: tapping the subconscious
  9. learning without thinking                        175
     harnessing perceptual discrimination 

 10. you snooze, you win                              195 
     the consolidating role of sleep 

     conclusion: the foraging brain                   213 

appendix: eleven essential questions                  223 
acknowledgements                                      229
notes                                                 231
index                                                 245 


p.5
   In a way, the brain's modules are like specialists in a movie production crew. The cinematographer is framing shots, zooming in tight, dropping back, stockpiling footage. The sound engineer is recording, fiddling with volume, filtering background noise. There are editors and writers, a graphics person, a prop stylist, a composer working to supply tone, feeling──the emotional content──as well as someone keeping the books, tracking invoices, the facts and figures. And there's a director, deciding which pieces go where, braiding all these elements together to tell a story that holds up. Not just any story, of course, but the one that best explains the “material” pouring through the senses. The brain interprets scenes in the instants after they happen, inserting judgments, meaning, and context on the fly. It also reconstructs them later on──what exactly did the boss mean by that comment?──scrutinizing the original footage to see how and where it fits into the larger movie. 
   It's a story of a life──our own private documentary──and the film “crew” serves as an animating metaphor for what's happening behind the scenes. How a memory forms. How it's retrieved. Why it seems to fade, change, or grow more lucid over time. And how we might manipulate each step, to make the details richer, more vivid, clearer. 
   Remember, the director of this documentary is not some film school graduate, or a Hollywood prince with an entourage. It's you. 

p.6
It's as if these cells are bound in collective witness of that experience. The connections between the cells, called synapses, thicken with repeated use, facilitating faster transmission of signals. 

p.7
   Intuitively, this makes some sense; many remembered experiences feel like mental reenactments. 

p.8
   I have not thought about that for more than 35 years, and yet there it is. 

p.9
   This kind of time travel is what scientists call episodic, or autobiographical memory, for obvious reasons. It has some of the same sensual texture as the original experience, the same narrative structure. Not so with the capital of Ohio, or a friend's phone number: We don't remember exactly when or where we learned those things. Those are what researchers call semantic memories, embedded not in narrative scenes but in a web of associations. 

p.11
   In 1953, Scoville described his patient's struggles to a pair of doctors in Montreal, Wilder Penfield and Brenda Milner, a young researcher who worked with him [the patient: Henry Molaison of Hartford, Connecticut, a tinkerer and machine repairman who suffered devastating seizures; known from 1962 as H.M. to protect his privacy, he lost his ability to form new memories after a brain surgery that removed the medial temporal lobes, which contain the hippocampus].  

p.12
Brenda Milner, now a professor of cognitive neuroscience at the Montreal Neurological Institute and McGill University

p.13
The brain had specific areas that handled different types of memory formation. 

p.15
epilepsy patients (to whom brain science owes debts without end).

p.16
When split brain patients saw a picture of a fork with only their right hemisphere, they couldn't say what it was. They couldn't name it. Due to the severed connection, their left hemisphere, where language is centered, received no information from the right side. And the right hemisphere──which “saw” the fork──had no language to name it. 
   And here is the kicker: The right hemisphere could direct the hand it controls to draw the fork. 
   The Caltech trio didn't stop there. In a series of experiments with these patients, the group showed that the right hemisphere could also identify objects by touch, correctly selecting a mug or a pair of scissors by feel after seeing the image of one. 

p.17
Michael Gazzaniga, Roger Sperry, Joseph Bogen, 1960s, Caltech [California Institute of Technology] 
   “That was the question, ultimately”, said Michael Gazzaniga, who coauthored the Caltech studies with Roger Sperry and Joseph Bogen in the 1960s. “Why, if we have these separate systems, is it that the brain has a sense of unity?”  

p.18
(Remember, the left is where language skills are centered, and the right is holistic, sensual; it has no words for what it sees.) 

p.18
   The left hemisphere was just throwing out an explanation based on what it could see: the shovel. “It was just making up any old BS”, Gazzaniga told me, laughing at the memory of the experiment. “Making up a story.” 
   In subsequent studies he and others showed that the pattern was consistent. The left hemisphere takes whatever information it gets and tells a tale to conscious awareness. It does this continually in daily life, and we've all caught it in the act──overhearing our name being whispered, for example, and filling in the blanks with assumptions about what people are gossiping about. 

p.18
   The brain's cacophony of voices feels coherent because some module or network is providing a running narration. “It only took me 25 years to ask the right question to figure it out”, Gazzaniga said, “which was why? Why did you pick the shovel?”

p.19
   All we know about this module is it resides somewhere in the left hemisphere. No one has any idea how it works, or how it strings together so much information so fast. It does have a name. Gazzaniga decided to call our left brain narrating system “the interpreter”. 

p.19
   This is our director, in the film crew metaphor. The one who makes sense of each scene, seeking patterns and inserting judgments based on the material; the one who fits loose facts into a larger whole to understand a subject. Not only makes sense but makes up a story, as Gazzaniga put it──creating meaning, narrative, cause and effect.
   It's more than an interpreter. It's a story maker.  

p.19
   This module is vital to forming a memory in the first place. It's busy answering the question “What just happened?” in the moment, and those judgments are encoded through the hippocampus. That's only part of the job, however. It also answers the questions “What happened yesterday?”  “What did I make for dinner last night?”  And, for global religions class, “What were the four founding truths of Buddhism, again?” 

p.20
That is to say: The brain does not store facts, ideas, and experiences like a computer does, as a file that is clicked open, always displaying the identical image. It embeds them in networks of perceptions, facts, and thoughts, slightly different combinations of which bubble up each time. And the just-retrieved memory does not overwrite the previous one but intertwines and overlaps with it. Nothing is completely lost, but the memory trace is altered, for good. 
   As scientists put it, using our memories changes our memories. 

p.22
If learning is building up skills and knowledge, then forgetting is losing some of what was gained. It seems like the enemy of learning. 
   It's not. The truth is nearly the opposite. 

p.23
   We engage in this kind of focused forgetting all the time, without giving it much thought. To lock in a new computer password, for example, we must block the old one from coming to mind; to absorb a new language, we must hold off the corresponding words in our native tongue.  When thoroughly immersed in a topic or novel or computation, it's natural to blank on even common nouns──“could you pass me the whatyoucallit, the thing you eat with?”
   Fork. 

p.23
   As 19th-century American psychologist William James observed, “If we remembered everything, we should on most occasions be as ill off as if we remembered nothing.”

p.23
   The study of forgetting has, in the past few decades, forced a fundamental reconsideration of how learning works. In a way, it has also altered what the words “remember” and “forget” mean. 

pp.23-24
“The relationship between learning and forgetting is not so simple and in certain important respects is quite the opposite of what people assume”, Robert Bjork, a psychologist at the University of California, Los Angeles, told me. “We assume it's all bad, a failure of the system. But more often, forgetting is a friend to learning.” 

p.24
In many cases, they stumble because they remember too much. 

p.24
If recollecting is just that──a re-collection of perceptions, facts, and ideas scattered in intertwining neural networks in the dark storm of the brain──then forgetting acts to block the background noise, the static, so that the right signals stand out. The sharpness of the one depends on the strength of the other. 

p.24
   Another large upside of forgetting has nothing to do with its active filter property. 

p.24
Yet no complex memory comes back exactly the same way twice, in part because the forgetting filter blocks some relevant details along with many irrelevant ones. Features that previously were blocked or forgotten often reemerge. This drift in memory is perhaps most obvious when it comes to the sort of childhood tales we all tell and embellish. 

p.25
It's that retrieving any memory alters its accessibility, and often its content. 

p.25
One implication, for instance, is that forgetting a huge chunk of what we've just learned, especially when it's a brand new topic, is not necessarily evidence of laziness, attention deficits, or a faulty character. On the contrary, it is a sign that the brain is working as it should. 

p.25
The Forgetting Curve is exactly what it sounds like, a graph of memory loss over time. In particular, it charts the rate at which newly learned information fades from memory. It's a learning curve, turned upside-down: 
p.26
   This curve, first published in the late 1880s, ... 
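The curve is commonly modeled as simple exponential decay. A toy sketch of the idea (the decay form and the "stability" parameter are my illustration, not a formula from the book):

```python
import math

def retention(t_hours, stability=20.0):
    """Toy forgetting curve: fraction of new material still retained
    after t_hours, modeled as R = exp(-t / s). The 'stability' s is
    an invented parameter controlling how fast the curve falls."""
    return math.exp(-t_hours / stability)

# Retention starts at 100% and drops steeply, then levels off.
for t in (0, 1, 24, 72):
    print(f"{t:3d} h: {retention(t):.2f} retained")
```

The steep early drop and long shallow tail are what make the curve look like a learning curve turned upside-down.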

pp.26-27
Yet its creator, Hermann Ebbinghaus, wasn't one for idle guesswork. He was exacting by nature, compulsive about evidence. 
Paris bookstall, Elements of Psychophysics by Gustav Fechner
He must have glimpsed this future as well, right then and there, because he later dedicated his greatest work, Memory: A Contribution to Experimental Psychology, to Fechner. 

p.27
Our brain can impute meaning to almost anything. 

p.31
“We not only tend to forget what we have once remembered”, he wrote, “but we also tend to remember what we have once forgotten”. 
   Memory does not have just one tendency over time, toward decay. It has two. 

p.31
The brain doesn't hold on to nonsense syllables for long, then, because they are nonsense. 

pp.31-32
   Forgetting, remember, is not only a passive process of decay but also an active one, of filtering. It works to block distracting information, to clear away useless clutter. 

p.32
That could account for why there would be a difference in how well we remember nonsense syllables versus a poem, a short story, or other material that makes sense. 

p.34
By contrast, reminiscence is strong for imagery, for photographs, drawings, paintings──and poetry, with its word-pictures. And it takes time to happen. Ballard had identified the “bubbling up” of new verse in the first few days after study, when it's strongest. Other researchers had looked for it too early, minutes afterward, or too late, after a week or more. 

p.37
And the Forget to Learn theory says: If I stored it, it's in there for good. 
   That is, no memory is ever “lost” in the sense that it's faded away, that it's gone. Rather, it is not currently accessible. Its retrieval strength is low, or near zero. 

p.39
In its nomadic hominid youth, the brain was continually refreshing its mental map to adapt to changing weather, terrain, and predators. Retrieval strength evolved to update information quickly, keeping the most relevant details handy. It lives for the day. Storage strength, on the other hand, evolved so that old tricks could be relearned, and fast, if needed. Seasons pass, but they repeat; so do weather and terrain. Storage strength plans for the future. 

p.39
Kids who grow up in North American households, for example, learn to look people in the eye when speaking, especially a teacher or parent. Kids in Japanese homes learn the opposite: Keep your gaze down, especially when speaking to an authority figure. To move successfully from one culture to the other, people must block──or forget──their native customs to quickly absorb and practice the new ones. 

pp.39-40
   “Compared to some kind of system in which out-of-date memories were to be overwritten or erased”, Bjork writes, “having such memories become inaccessible but remain in storage has important advantages. Because those memories are inaccessible, they don't interfere with current information and procedures. But because they remain in memory they can──at least under certain circumstances──be relearned.” 

p.64
Daniel Willingham, a leading authority on the application of learning techniques in classrooms, advises his own students, when they're reviewing for an exam, not to work straight from their notes. “I tell them to put the notes aside and create an entirely new outline, reorganizing the material”, he told me. “It forces you to think about the material again, and in a different way.”

p.64
Each alteration of the routine further enriches the skills being rehearsed, making them sharper and more accessible for a longer period of time. This kind of experimenting itself reinforces learning, and makes what you know increasingly independent of your surroundings. 

pp.65-66
   The technique is called distributed learning or, more commonly, the spacing effect. People learn at least as much, and retain it much longer, when they distribute──or “space”──their study time than when they concentrate it. Mom's right, it is better to do a little today and a little tomorrow rather than everything at once. Not just better, a lot better. Distributed learning, in certain situations, can double the amount we remember later on. 

p.68
Hermann Ebbinghaus
Yet he could do just as well with only 38 repetitions in total if they were spaced out over three days.

p.68
Adolf Jost, an Austrian psychologist 
1897, Jost's Law
Translation: Studying a new concept right after you learn it doesn't deepen the memory much, if at all; studying it an hour later, or a day later, does.  

p.69
1970s
Vietnam War protests 
Harry P. Bahrick, a psychologist at Ohio Wesleyan University 

p.71
James Method
To implement this program, simply follow the example of American writers Henry and William James and grow up the child of wealthy, cultured parents who see to it that throughout your childhood you travel widely in Europe and the Americas and receive language tutoring along the way. The Jameses were determined that their sons have what Henry Sr. called a “sensuous education”. The most famous of the siblings, the novelist Henry, studied with tutors in Paris, Bologna, Geneva, and Bonn; he spent extended time in each place and returned periodically throughout his life. As a result, he became proficient in French, Italian, and German. 
   The James Method integrated foreign language and first-rate instruction into childhood development. That's not quite the same as growing up in a multilingual home, but it's a pretty close facsimile. Children absorb a new language quickly when forced to speak and understand it──when living with it──and that is what the James children did to some extent. They had to memorize non-English verbs and nouns like the rest of us but did so at a time when the language modules in their brain were still developing. 

pp.72-73
Piotr Wozniak 
1982, a 19-year-old Polish college student 
“These optimum intervals are calculated on the basis of two contradictory criteria”, he wrote at the time. “Intervals should be as long as possible to obtain the minimum frequency of repetitions, and to make the best use of the so-called spacing effect ... Intervals should be short enough to ensure that the knowledge is still remembered.”
SuperMemo teaches according to Wozniak's calculations. 
It's easy to use and──after Wozniak made it available as freeware in the 1990s──the program took off, especially among young people trying to learn English in places like China and Poland (it's now a commercial website and an app). 
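The tradeoff Wozniak describes (intervals as long as possible, but short enough that the item is still remembered) is typically implemented as an expanding schedule. A minimal sketch, not Wozniak's actual algorithm, with an invented starting gap and growth factor:

```python
def review_intervals(n_reviews, first_interval=1.0, ease=2.5):
    """Sketch of an expanding review schedule: each gap grows by a
    fixed 'ease' factor, so reviews land just before forgetting.
    SuperMemo's real algorithm adapts these numbers per item."""
    intervals, gap = [], first_interval
    for _ in range(n_reviews):
        intervals.append(round(gap))  # days since the previous review
        gap *= ease
    return intervals

print(review_intervals(5))  # → [1, 2, 6, 16, 39]
```

Each successful review buys a longer reprieve before the next one, which is the "minimum frequency of repetitions" half of Wozniak's criteria.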

p.73
To build and retain foreign vocabulary, scientific definitions, or other factual information, it's best to review the material one or two days after initial study; then a week later; then about a month later. After that, the intervals are longer. 
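That schedule is easy to turn into concrete calendar dates. A sketch, where the 2/7/30-day gaps are my reading of the passage and, for simplicity, every gap is counted from the first study day:

```python
from datetime import date, timedelta

def review_dates(first_study, gaps_days=(2, 7, 30)):
    """Review schedule per the passage: restudy a day or two after
    first study, then about a week later, then about a month later."""
    return [first_study + timedelta(days=g) for g in gaps_days]

for d in review_dates(date(2024, 10, 10)):
    print(d.isoformat())
```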

p.74
   Why spaced study sessions have such a large impact on learning is still a matter of debate. Several factors are likely at work, depending on the interval. With very short intervals──seconds or minutes, as in the early studies──it may be that the brain becomes progressively less interested in a fact when it's repeated multiple times in rapid succession. 

p.74
It has just heard, and stored, the fact that James Monroe was the 5th president. If the same fact is repeated again, and then a 3rd time, the brain pays progressively less attention.  

p.74
   For intermediate intervals of days or weeks, other factors might come into play. Recall the Forget to Learn Theory, which holds that forgetting aids learning in two ways: actively, by filtering out competing facts, and passively, in that some forgetting allows subsequent practice to deepen learning like an exercised muscle. 

p.75
   Spaced study──in many circumstances, including the neighbor example──also adds contextual cues, of the kind discussed in Chapter 3. You initially learned the names at the party, surrounded by friends and chatter, a glass of wine in hand. The second time, you heard them yelled out, over the hedges. The names are now embedded in two contexts, not just one. The same thing happens when reviewing a list of words or facts the second time (although context will likely be negligible, of course, if you're studying in the same place both days). 

p.75
   The effects described above are largely subconscious, running under the radar. We don't notice them. With longer intervals of a month or more, and especially with three or more sessions, we begin to notice some of the advantages that spacing allows, because they're obvious. 

p.75
For the Bahricks, the longer intervals helped them identify words they were most likely to have trouble remembering. “With longer spaces, you're forgetting more, but you find out what your weaknesses are and you correct them”, Bahrick told me. “You find out which mediators──which cues, which associations, or hints you used for each word──are working and which aren't. And if they're not working, you come up with new ones.”

p.75
Practically nothing. 

p.76
The words and ideas are so strange at first that my brain has no way to categorize them, no place to put them. So be it. I now treat that first encounter as a casual walk-through, a meet-and-greet, and put in just 20 minutes. I know that in round two (20 minutes) I'll get more traction, not to mention round three (also 20 minutes). I haven't used any more time, but I remember more. 

p.76
Results from classroom studies continued to roll in: Spaced review improves test scores for multiplication tables, for scientific definitions, for vocabulary. The truth is, nothing in learning science comes close in terms of immediate, significant, and reliable improvements to learning. 

p.76
“I get sick of people taking my psych intro class and coming back next year and not remembering anything”, Melody Wiseheart, a psychologist at York University in Toronto, told me. “It's a waste of time and money; people pay a lot for college. As a teacher, too, you want to teach so that people learn and remember: That's your job. You certainly want to know when it's best to review key concepts──what's the best time, given the spacing effect, to revisit material? What is the optimal schedule for students preparing for a test?” 

p.77
2008, [Melody] Wiseheart and Harold Pashler
Wiseheart and Pashler

p.79
Remember, spacing is primarily a retention technique. Foreign languages. Science vocabulary. Names, places, dates, geography, memorizing speeches. 

p.79
For now, though, this [spacing effect learning] is a memorization strategy. 

p.79
William James
1901 book Talks to Teachers on Psychology: And to Students on Some of Life's Ideals
“Cramming seeks to stamp things in by intense application before the ordeal. But a thing thus learned can form few associations. On the other hand, the same thing recurring on different days in different contexts, read, recited, referred to again and again, related to other things and reviewed, gets well wrought into mental structure.” 

p.82
fluency illusion
The fluency illusion is so strong that, once we feel we've nailed some topic or assignment, we assume that further study won't help. We forget that we forget. 

p.82
Fluency misperceptions are automatic. They form subconsciously and make us poor judges of what we need to restudy, or practice again. 

p.83
There's more to self-examination than you know. A test is not only a measurement tool, it alters what we remember and changes how we subsequently organize that knowledge in our minds. And it does so in ways that greatly improve later performance. 

p.84
Psalm 23 (The Lord is my shepherd, I shall not want...) 

p.86
“Recitation as a Factor in Memorizing”  
([ how do actors memorize their lines in theatre ])


p.87
Testing is studying, of a different and powerful kind. 


p.94
When the brain is retrieving studied text, names, formulas, skills, or anything else, it's doing something different, and harder, than when it sees the information again, or restudies. That extra effort deepens the resulting storage and retrieval strength. We know the facts or skills better because we retrieved them ourselves, we didn't merely review them. 
   Roediger goes further still. When we successfully retrieve a fact, he argues, we then re-store it in memory in a different way than we did before. Not only has storage level spiked; the memory itself has new and different connections. It's now linked to other related facts that we've also retrieved. The network of cells holding the memory has itself been altered. Using our memory changes our memory in ways we don't anticipate. 

p.97
   In plain English: The act of guessing engaged your mind in a different and more demanding way than straight memorization did, deepening the imprint of the correct answers. In even plainer English, the pretest drove home the information in a way that studying-as-usual did not. 
   Why? No one knows for sure. One possible explanation is that pretesting is another manifestation of desirable difficulty. You work a little harder by guessing first than by studying directly. A second possibility is that the wrong guesses eliminate the fluency illusion, the false impression that you knew the capital of Eritrea because you just saw or studied it. A third is that, in simply memorizing, you saw only the correct answer and weren't thrown off by the other four alternatives──the way you would be on a test. 


pp.97-98
“Let's say you're studying capitals and you see that Australia's is Canberra,” Robert Bjork told me. “Okay, that seems easy enough. But when the exam question appears, you see all sorts of other possibilities──Sydney, Melbourne, Adelaide──and suddenly you're not so sure. If you're studying just the correct answer, you don't appreciate all the other possible answers that could come to mind or appear on the test.” 


p.101
   Is it possible that one day teachers and professors will give “pre-finals” on the first day of class? Hard to say. A prefinal for an intro class in Arabic or Chinese might be a wash, just because the notations and symbols and alphabet are entirely alien. My guess is that prefinals are likely to be much more useful in humanities courses and the social sciences, because in those courses our minds have some scaffolding of languages to work with, before making a guess. 

p.102
Those applications remind me of what the great Argentine writer Jorge Luis Borges once said about his craft: “Writing long books is a laborious and impoverishing act of foolishness: expanding in five hundred pages an idea that could be perfectly explained in a few minutes. A better procedure is to pretend that those books already exist and to offer a summary, a commentary.”

p.102
   Pretend that the book already exists. Pretend you already know. 

p.102
Pretend you already are an expert and give a summary, a commentary──pretend and perform. That is the soul of self-examination: pretending you're an expert, just to see what you've got. 

p.102
Many teachers have said that you don't really know a topic until you have to teach it, until you have to make it clear to someone else. Exactly right. 

p.103
Better yet, those exercises will dispel the fluency illusion. They'll expose what you don't know, where you're confused, what you've forgotten──and fast. 



pp.113-116
p.113
   Graham Wallas
In 1926, at the end of his career, he published The Art of Thought, a rambling meditation on learning and education that's part memoir, part manifesto. 

p.113
He [Graham Wallas] also conducted a wide-ranging analysis of what scientists, poets, novelists, and other creative thinkers, throughout history, had written about how their own insights came about. 

p.113
   Wallas was not content to reprint those self-observations and speculate about them. He was determined to extract a formula of sorts: a specific series of steps that each of these thinkers took to reach a solution, a framework that anyone could use. Psychologists at the time had no language to describe these steps, no proper definitions to work with, and thus no way to study this most fundamental human ability. To Wallas, this was appalling. His goal was to invent a common language. 

p.113
For example, he [Graham Wallas] quotes French mathematician Henri Poincaré, who had written extensively about his experience trying to work out the properties of a class of forms called Fuchsian functions. “Often when one works at a hard question, nothing good is accomplished at the first attack,” Poincaré had observed. “Then one takes a rest, longer or shorter, nothing is found, and then all of a sudden the decisive idea presents itself to mind.”

pp.113-114
Wallas also quotes the German physicist Hermann von Helmholtz, who described how new ideas would bubble up after he'd worked hard on a problem and hit a wall: “Happy ideas come unexpectedly, without effort, like an inspiration”, he wrote. “So far as I am concerned, they have never come to me when my mind was fatigued, or when I was at my working table ... they came particularly readily during the slow ascent of wooded hills on a sunny day.” 

p.114
The Belgian psychologist Julien Varendonck traced his insights to daydreaming after a period of work, sensing that “there is something going on in my foreconsciousness which must be in direct relation to my subject. I ought to stop reading for a little while and let it come to the surface.”

p.114
   Wallas saw, however, that the descriptions had an underlying structure. The thinkers had stalled on a particular problem and walked away. They could not see an opening. They had run out of ideas. The crucial insights came after the person had abandoned the work and was deliberately not thinking about it. Each insight experience, as it were, seemed to include a series of mental steps, which Wallas called “stages of control”. 

p.114
preparation
   The first is preparation: the hours or days──or longer──that a person spends wrestling with whatever logical or creative knot he or she faces. Poincaré, for example, spent 15 days trying to prove that Fuchsian functions could not exist, an extensive period of time given his expertise and how long he'd played with the ideas before sitting down to construct his proof. “Every day I seated myself at my work table, stayed an hour or two, tried a great number of combinations and reached no result”, he wrote. Preparation includes not only understanding the specific problem that needs solving and the clues or instructions at hand; it means working to a point where you've exhausted all your ideas. You're not stalled, in other words. You're stuck──ending preparation.

pp.114-115
incubation
   The second stage is incubation, which begins when you put aside a problem. For Helmholtz, incubation began when he abandoned his work for the morning and continued as he took his walk in the woods, deliberately not thinking about work. For others, Wallas found, it occurred overnight, or during a meal, or when out with friends. 

p.115
incubation
   Some mental machinations were clearly occurring during this downtime, Wallas knew, and they were crucially important. Wallas was a psychologist, not a mind reader, but he ventured a guess about what was happening: “Some kind of internal mental process”, he wrote, “is operating that associates new information with past information. A type of internal reorganization of the information seems to be going on without the individual being directly aware of it.”  That is to say, the mind works on the problem off-line, moving around the pieces it has in hand and adding one or two it has in reserve but didn't think to use at first. 

p.115
incubation
   That's the general idea, at least, and in Wallas's conception, incubation has several components. One is that it's subconscious. We're not aware it's happening. Another is that the elements of the problem (the Pencil Problem, for example, presented at the school) are being assembled, taken apart, and reassembled. At some point “past information”, perhaps knowledge about the properties of triangles we hadn't initially recalled, is braided in. 
 
pp.115-116
illumination
   The third stage of control is called illumination. This is the aha! moment, the moment when the clouds part and the solution appears all at once. We all know that feeling, and it's a good one. Here's Poincaré again, on the Fuchsian functions problem giving up its secrets: “One evening, contrary to my custom, I drank black coffee and could not sleep. Ideas rose in crowds; I felt them collide until pairs interlocked, so to speak, making a stable combination. By the next morning ... I had only to write out the results.” 

p.116
verification
   The fourth and final stage in the paradigm is verification, checking to make sure those results, indeed, work. 

p.116
incubation
   Wallas's principal contribution was his definition of incubation. He did not see this as a passive step, as a matter of the brain resting and returning “fresh”.  He conceived of incubation as a less intense, subconscious continuation of the work.  The brain is playing with concepts and ideas, pushing some to the side, fitting others together, as if absentmindedly working on a jigsaw puzzle.  We don't see the result of the work until we sit down again and notice an entire corner of the jigsaw puzzle is now complete──revealing a piece of the picture that then tells us how to work with the remaining pieces. In a sense, the letting go allows people to get out of their own way, giving the subconscious a chance to toil on its own, without the conscious brain telling it where to go or what to do. 

   p.196
   It's not just me, either. The history of scientific discovery is 
   salted with hints that sleep fosters profound intellectual leaps. 
   The 19th-century German chemist Friedrich August Kekulé, for example, 
   claimed that he stumbled upon the chemical structure of benzene──in 
   which the molecule curls into a ring shape──after dreaming of 
   snakes biting their tails. 

   p.196
   The Russian scientist Dmitri Mendeleev reportedly pulled several 
   all-nighters, to no avail, trying to piece together what would 
   become his famous periodic table of the elements, but it was 
   only after nodding off, he told a colleague, that he saw “a 
   table where all the elements fell into place”. 

   p.196
   These kinds of stories always remind me of the Grimms' fairy 
   tale “The Golden Bird”, in which a young man on a mission to 
   find a magic bird with golden feathers falls in love with a 
   princess, whose father the king will grant her hand on one 
   condition: that the young man dig away the hill that stops the 
   view from his window in eight days. The only complication? 
   This is no hill, it's a mountain, and after seven days of 
   digging, the young man collapses in defeat. That's when his 
   friend the fox whispers, “Lie down and go to sleep; I will work 
   for you”.  And in the morning, the mountain is gone. 

   p.196
   What is the sleeping brain doing, exactly? 
      For that matter, why do we sleep at all? 
      The truth is, no one knows. Or, to be more precise, there's 
   no single, agreed-upon scientific explanation for it. 

   pp.204-205
   A full twenty-four hours later, each student took the test 
   yet again, and the sleep group's advantage had increased on the 
   most distantly related pair. That's a large difference on the 
   hardest questions──35 percent, separating one kind of student 
   from another──but it's not unusual in studies of sleep and 
   learning.  “We think what's happening during sleep is that you 
   open the aperture of memory and are able to see this bigger 
   picture”, the study's senior author, Matthew Walker, told me. 
   “There is evidence, in fact, that REM [rapid eye movement sleep; 
   six stages of sleep, p.202, pp.206-209: wake, stage 1, REM, 
   stage 2, stage 3 & 4] is this creative memory domain when you 
   build different associations, combine things in different ways 
   and so on.” 

p.119
Put another way, the glare of insight was so bright, it obscured the factors that led to it. 

p.119
People routinely generate creative solutions when no clues are available at all: with their eyes closed, in basement study rooms, in tucked-away cubicles. Successful incubation, then, must rely on other factors as well. Which ones? You can't ask people what they are, because the action is all offstage, and there's no easy way to pull back the curtain. 

p.119
  But what if you──you, the scientist──could block people from seeing a creative solution, in a way that was so subtle it went unnoticed? And what if you could also discreetly remove that obstacle, increasing the odds that the person saw the answer? Would that reveal anything about this hidden incubation?  Is it even possible? 

p.119
   A young German psychologist named Karl Duncker thought so. Duncker was interested in how people became “unblocked” when trying to crack a problem requiring creative thinking, too, and he'd read Maier's study. In that paper, remember, Maier had concluded, “The perception of the solution of a problem is like the perceiving of a hidden figure in a puzzle-picture.”  Duncker was familiar with picture puzzles. 

pp.119-120
Max Wertheimer, one of the founders of the Gestalt school of psychology. Gestalt──“shape”, or “form” in German──theory held that people perceive objects, ideas, and patterns whole, before summing their component parts.  

p.121
Duncker hadn't changed the instructions or the available materials one bit. Yet by emptying the boxes, he'd altered their mental representation. They were no longer merely containers, incidental to the problem at hand; they were seen as available for use. In Duncker's terminology, when the boxes were full, they were “functionally fixed”.  It was as if people didn't see them at all. 

p.121
   This idea of fixedness infects our perceptions of many problems we encounter. 

p.121
Mystery novelists are virtuosos at creating fixed ideas about characters, subtly prompting us to rule out the real killer until that last act (Agatha Christie's The Murder of Roger Ackroyd is a particularly devious specimen of this). 

p.122
   Between them, Maier and Duncker had discovered two mental operations that aid incubation: picking up clues from the environment, and breaking fixed assumptions, whether about the use of pliers or the gender of a doctor. Here's the rub: They had demonstrated those properties by helping their stumped subjects along with hints. Most of us don't have a psychologist on call, ready to provide deskside incubation assistance whenever we're stuck. We've got to make it happen on our own. The question is, how? 

p.125
   The authors attributed the finding to what they called “selective forgetting”.  A fixating (misleading) word temporarily blocks other possible answers, they argued, but “as more time elapses, after the initial failed attempts, the retrieval block may wear off”.  It's as if the students' brains were temporarily frozen by the bad hints and the five-minute break allowed for some thawing out. 

p.126
Romantic entanglements are another classic example:  We become infatuated, we think we're in love, but time loosens the grip of the fixation. We come to see exasperating flaws. Maybe she's not the one, after all. What was I thinking? 

p.127
   Thankfully, scientists have a method of stepping back to see the bigger picture, one they use when trying to make sense of a large number of varied results. The idea is to “pool” all the findings, positive and negative, and determine what the bulk of the evidence is saying. It's called meta-analysis, and it sometimes tells a clearer story than any single study, no matter how well done. 
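The mechanics of pooling can be sketched with a fixed-effect, inverse-variance weighted average, one common meta-analytic method. The effect sizes below are made up for illustration, not drawn from the studies Carey describes:

```python
import math

def pool_fixed_effect(effects, ses):
    """Inverse-variance weighted average of study effect sizes.

    Studies with smaller standard errors (more precise estimates)
    get proportionally more weight in the pooled result.
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical incubation-effect sizes from three small studies
effects = [0.45, 0.10, 0.32]   # standardized mean differences
ses     = [0.20, 0.15, 0.25]   # their standard errors
est, se = pool_fixed_effect(effects, ses)
print(f"pooled effect = {est:.2f} ± {1.96 * se:.2f}")
```

Note how the single most precise study (the 0.10 result) pulls the pooled estimate down: that is the sense in which a meta-analysis can tell a clearer story than any one study's headline number.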



p.127
2009, psychologists at Lancaster University in the United Kingdom 
Ut Na Sio and Thomas C. Ormerod 

p.128
They also emphasized that people don't benefit from an incubation break unless they have reached an impasse. Their definition of “impasse” is not precise, but most of us know the difference between a speed bump and a brick wall. Here's what matters: Knock off and play a videogame too soon and you get nothing. 

pp.128-129
As a rule, though, I find the third option works best. I lose myself in the kvetching, I get a dose of energy, I return 20 minutes or so later, and I find that the intellectual knot, whatever it was, is a little looser. 

pp.132-133
   Here's another, from the poet A. E. Housman, who would typically take a break from his work in the trough of his day to relax. “Having drunk a pint of beer at luncheon──beer is a sedative to the brain and my afternoons are the least intellectual portion of my life──I would go for a walk of two or three hours. As I went along, thinking of nothing in particular, only looking at things around me following the progress of the season, there would flow into my mind, with sudden unaccountable emotion, a line or two of verse, sometimes a whole stanza at once, accompanied, not preceded, by a vague notion of the poem which they were destined to form part of.”  Housman was careful to add that it was not as if the entire poem wrote itself.  There were gaps to be filled, he said, gaps “that had to be taken in hand and completed by the mind, which was apt to be a matter of trouble and anxiety, involving trial and disappointment, and sometimes ending in failure.”  



p.133
Creative leaps often come during downtime that follows a period of immersion in a story or topic, and they often come piecemeal, not in any particular order, and in varying size and importance. The creative leap can be a large, organizing idea, or a small, incremental step, like finding a verse, recasting a line, perhaps changing a single word. This is true not just for writers but for designers, architects, composers, mechanics──anyone trying to find a workaround, or to turn a flaw into a flourish. For me, new thoughts seem to float to the surface only when fully cooked, one or two at a time, like dumplings in a simmering pot. 

p.133
Mentally, our creative experiences are more similar than they are different.*
 * I'll leave it to others to explain Mozart. 

p.134
   This longer-term, cumulative process is distinct enough from the short-term incubation we described in the last chapter that it warrants another name. Let's call it percolation. 


pp.135-148
p.135
[Kurt] Lewin was meeting with a student of his, Bluma Zeigarnik, a young Lithuanian in search of a research project. On that afternoon one of the two──accounts vary──noticed something about the café's waiters: They never wrote down orders. They kept them in their head, adding items mentally──... another espresso ... a cup of tea ... a slice of kuchen ... ──until the bill was paid. 

p.135
   Yet once the bill was paid──if, after paying, you questioned what was on the tab──they'd have forgotten the entire order. No recollection at all. It was as if, once that order was settled, the waiter's mind checked off the box and moved on, dropping the entire experience from memory. 

p.136
The waiters could remember orders for a half hour, sometimes longer.
   What was going on here mentally? 
   Lewin and Zeigarnik came up with a hypothesis: Perhaps unfinished jobs or goals linger in memory longer than finished ones. If nothing else, Zeigarnik now had her research project. She put the question more specifically: What's the difference in memory between an interrupted activity and an uninterrupted one? 

p.137
“As everyone knows”, Zeigarnik wrote, “it is far more disturbing to be interrupted just before finishing a letter than when one has only begun.”

p.137
   Once people become absorbed in an assignment, they feel an urge to finish, and that urge builds as the job moves closer to completion. “The desire to complete the task may at first have been only a quasi-need”, she concluded, “but later, through losing oneself in the task, a genuine need arises.”

p.137
   In 1931, soon after publishing her work on interruption, Zeigarnik moved to Moscow with her husband, Albert, who had taken a position at the Soviet Ministry of Foreign Trade. 

p.137
   Yet the implication of her work survived, and then some. The Zeigarnik effect, as it's now known, became a foundational contribution to the study of goals and goal formation. 

p.138
The second is that interrupting yourself when absorbed in an assignment extends its life in memory and──according to her experiments──pushes it to the top of your mental to-do list. 

p.138
   This kind of interruption creates suspense and, according to the Zeigarnik effect, pushes the unfinished episode, chapter, or project to the top of our minds, leaving us to wonder what comes next. Which is exactly where we want it to be if we're working on something long-term and demanding. 

p.139
2001 experiment to measure the effect of goals on perception. 
Henk Aarts at Leiden University, psychologist 

p.139
The Bisaldrop Dubbel Zoute is a Dutch black licorice drop the size of a plug nickel. Bisaldrops are an acquired taste, slightly sweet and very salty, and best served with a cool glass of water. 

p.139
For our purposes, the important thing to know is that Bisals make you thirsty──and fast──which is why a group of scientists in the Netherlands used them in a 2001 experiment to measure the effects of goals on perception. 

p.139
The group, led by the psychologist Henk Aarts at Leiden University, began their trial the way so many scientists do: by lying. Researchers often attempt to disguise a study's true purpose so participants don't just play along or deliberately undermine the results. 

p.140
It was WHAT they wrote down that the psychologists were interested in, and that's where a significant difference became clear: The group that had been given the Bisaldrops remembered twice as many drink-related items as the control group. They were thirsty, and that influenced what they noticed in the office and remembered later, even if they weren't aware WHY they recalled those things. 

p.140
   The experiment was a clever demonstration of a fairly straightforward principle of social psychology: Having a goal foremost in mind (in this case, a drink) tunes our perception to fulfilling it. And that tuning determines, to some extent, where we look and what we notice. 

p.140
“The results suggest that basic needs and motives cause a heightened perceptual readiness to register environmental cues that are instrumental to satisfying those needs”, the authors concluded. “It can foster the reduction of thirst by helping us to detect a can of Coke or a cool glass of beer that would go unnoticed under other circumstances.”

p.140
Of course we look for a drinking fountain when we're thirsty, or a snack machine when hungry. 

p.140
Whether they were aware of it or not, their thirst activated a mental network that was scavenging the landscape for anything linked to liquid. 

p.141
   In dozens of studies going back decades, psychologists have shown that this principle of tuned perception applies not only to elemental needs like thirst, but to any goal we hold foremost in mind. 

p.141
Not only that, I started to notice other, more exotic colors, as well as different styles and laces. Within weeks, I had a detailed mental map of a particular subculture: pre-teen Converse wearers in 1971 suburban Chicago, a subtle, intricate universe that was previously invisible to me. 

p.141
Academic pursuits are goals, too, and they can tune our perceptions in the same way that a powerful thirst or a new pair of sneakers can. 

p.142
“Once a goal becomes activated, it trumps all others and begins to drive our perceptions, our thoughts, our attitudes”, as John Bargh, a psychologist at Yale University, told me. 
   So the question is: How, then, do we most effectively activate that goal? 

p.142
   By interrupting work on it at an important and difficult moment──propelling the assignment, via the Zeigarnik effect, to the top of our mind. 

p.142
“Chance favors the prepared mind.”  Seeing that quote always made me think, Okay, but how does one prepare for chance?  I have a better idea now, thanks to social psychology.  I'd put it differently than Pasteur, if less poetically:  Chance feeds the tuned mind. 

p.142
   My favorite articulation of how this happens comes from the novelist and short story writer Eudora Welty. In a 1972 interview, Welty was asked where her dialogue comes from. “Once you're into a story”, she replied, “everything seems to apply──what you hear on the city bus is exactly what your character would say on the page you were writing. Wherever you go, you meet part of your story. I guess you are tuned in for it, and the right things are sort of magnetized──if you can think of your ears as magnets.” 

p.143
The information we pick up isn't merely dumped into a mental ledger of overheard conversation. It also causes a ripple in our thinking about the story, our research paper, our design project, or our big presentation. 

p.144
Ronda Leathers Dively
   In 1992, a doctoral student in Illinois noticed the same tentative, deferential quality in her students' work. Ronda Leathers Dively, then finishing her degree in English at Illinois State University, was teaching a group of sophomores and juniors how to write for publication in an academic journal, using authoritative sources to make a cogent argument. 

p.144 
By the end of the course, however, she was discouraged. 

p.144
Most alarming, the work was no better at the end of the semester than at the beginning. That was her fault, not theirs. She was failing them. 

p.144
   Dively decided that the curriculum she followed was preventing percolation (or incubation, as she calls it) from happening. 

p.144
The course, in other words, allowed for no time to meditate on the topics, no real downtime at all. 

p.144
   So Dively decided to throw out the program. She would conduct an experiment of sorts. 

p.144
The course would demand the same amount of writing, but in a very different format. 

p.145
A third piece would be a response to a controversial school of thought on their topic. Dively also required them to keep journals along the way, tracking their personal reactions to the sources they were using.  Did the articles make sense?   Did they agree with the main point?  Was this expert or that consistent in his or her opinion? 

p.145
More time doesn't always add up to more authoritative writing, and sometimes means sinking deeper into indecision.  In this case, however, her students showed her something extra. 

p.145
The biggest improvement, she wrote, was that they took on “an expert persona, an authoritative presence capable of contributing to the scholarly exchange.”

p.145
   At the end of the semester she surveyed her students, asking about the new format. “As time goes by and I find more research, much of the information becomes embedded in me”, said one. “Now, I even question certain things which the author claims to be true. I realize I do not have to agree with everything in a professional journal.”  Another said, “I had a more complete understanding of the material I was dealing with because I was able to ask more questions of myself” in the journal. One student openly scoffed at an article “written for a beginner in environmental health in this somewhat prestigious journal. I would only recommend the reading of this article to someone with almost no knowledge of the subject.”
   In other words, her students were no longer looking to borrow someone else's opinion. They were working to discover their own. 

p.146
   She made percolation visible. 

p.146
Having that goal (the paper) continually active──unfinished──sensitized the students' minds consciously and subconsciously to relevant information all around them, like the thirsty participants in Henk Aarts's study. 

p.146
Those are the first two elements of percolation: interruption, and the tuned, scavenging mind that follows. 

p.146
The journal entries provided the third element, conscious reflection. Remember, Dively had the students make regular entries on what they thought about the sources they used, the journal articles and interviews. Their thinking evolved, entry by entry, as they accumulated more knowledge. 

p.146
   Assembled into a coherent whole, this research──from Zeigarnik, Aarts, Dively, and other social psychologists who've spent the past decades studying goal fulfillment──takes some of the mystery out of the “creative process”.  No angel or muse is whispering to anyone here. Percolation is a matter of vigilance, of finding ways to tune the mind so that it collects a mix of external perceptions and internal thoughts that are relevant to the project at hand. We can't know in advance what those perceptions and thoughts will look like──and we don't have to. Like the thirsty students in Aarts's study, the information flows in. 


p.151
   One of the first hints that there might be another way came in a 1978 experiment by a pair of researchers at the University of Ottawa. Robert Kerr and Bernard Booth were trained in kinetics, the study of human movement. 

p.151
In this case, Kerr and Booth wanted to know how two distinct kinds of practice affected a simple, if somewhat obscure, skill: beanbag tossing. (It was an inspired choice, as it turned out; it's a skill that most of us have tried, at a kid's birthday party or some amusement park game, but that no one works on at home.) 

p.151
They recruited thirty-six 8-year-olds who were enrolled in a 12-week Saturday morning PE course at a local gym and split them into two groups. 

p.152
The beanbag experiment was as obscure as they come. (So much so that it disappeared entirely from the website of the journal in which it originally appeared, Perceptual and Motor Skills; it took editors weeks to find it when I asked.) 

p.152
Kinetics and cognitive psychology are worlds apart in culture and in status. One is closer to brain science, the other to gym class. 

p.153
Psychologists who study learning tend to fall into one of two camps: the motor/movement, or the verbal/academic.  The former focuses on how the brain sees, hears, feels, develops reflexes, and acquires more advanced physical abilities, like playing sports or an instrument.  The latter investigates conceptual learning of various kinds: language, abstract ideas, and problem solving. Each camp has its own vocabulary, its own experimental paradigm, its own set of theories. In college, they are often taught separately, in different courses: “Motor and Perceptual Skills” and “Cognition and Memory”. 

p.153
   A major implication of the Molaison studies was that the brain must have at least two biological systems for handling memory. One, for declarative memories, is dependent on a functioning hippocampus. The other, for motor memories, is based in different brain organs; no hippocampus required. The two systems are biologically distinct, so it stood to reason that they're functionally distinct, too, in how they develop, strengthen, and fade. Picking up Spanish is not the same as picking up Spanish guitar, and so psychology has a separate tradition to characterize each. 

p.157
But repetition creates a powerful illusion. Skills improve quickly and then plateau. By contrast, varied practice produces a slower apparent rate of improvement in each single practice session but a greater accumulation of skill and learning over time. 

p.157
In the long term, repeated practice on one skill slows us down. 

pp.157-158
   Psychologists had been familiar with many of these findings, as isolated results, for years. But it was Schmidt and Bjork's paper, “New Conceptualizations of Practice”, published in 1992, that arranged this constellation of disparate pieces into a general principle that can be applied to all practice──motor and verbal, academic as well as athletic. 

p.158
Their joint class turned out not to be devoted to contrasts, after all, but to identifying key similarities. “We are struck by the common features that underlie these counterintuitive phenomena in such a wide range of skill-learning situations”, they concluded. “At the most superficial level, it appears that systematically altering practice so as to encourage additional, or at least different, information processing activities can degrade performance during practice, but can at the same time have the effect of generating greater performance capabilities.”
   Which activities are those? 


pp.158-159
They were simply alternating targets. It was a small variation, only a couple of feet, but that alteration represents a large idea, and one that has become the focus of intense study at all levels of education. 

pp.163-164
   Interleaving.  That's a cognitive science word, and it simply means mixing related but distinct material during study. Music teachers have long favored a variation on this technique, switching from scales, to theory, to pieces all in one sitting. So have coaches and athletic trainers, alternating endurance and strength exercises to ensure recovery periods for certain muscles. These philosophies are largely rooted in tradition, in a person's individual experience, or in concerns about overuse. Kornell and Bjork's painting study put interleaving on the map as a general principle of learning, one that could sharpen the imprint of virtually any studied material. It's far too early to call their study a landmark──that's for a better historian than I to say──but it has inspired a series of interleaving studies among amateurs and experts in a variety of fields. Piano playing. Bird watching. Baseball hitting. Geometry.
   What could account for such a big difference?  Why any difference at all?  Were the distinctions between styles somehow clearer when they were mixed? 
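Stripped to its scheduling skeleton, the blocked-versus-interleaved contrast is just a difference in trial order over the same total practice. A minimal sketch (the skill names are placeholders, not from any study):

```python
def blocked(skills, reps):
    """AAABBBCCC: massed practice, one skill finished before the next."""
    return [s for s in skills for _ in range(reps)]

def interleaved(skills, reps):
    """ABCABCABC: the same trials, rotated so no skill repeats in a run."""
    return [s for _ in range(reps) for s in skills]

session = ["scales", "theory", "pieces"]
print(blocked(session, 3))      # three runs of each skill, back to back
print(interleaved(session, 3))  # the identical nine trials, mixed
```

Both schedules contain exactly the same practice; only the ordering differs, which is what makes the learning advantage of the second so counterintuitive.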

p.164
   “That may be the most astounding thing about this technique”, said John Dunlosky, a psychologist at Kent State University, who has shown that interleaving accelerates our ability to distinguish between bird species. “People don't believe it, even after you show them they've done better.”  

p.164
   This much is clear: The mixing of items, skills, or concepts during practice, over the longer term, seems to help us not only see the distinctions between them but also to achieve a clearer grasp of each one individually. 


pp.177-178
But they could do one thing the novices could not:  memorize a chess position after seeing the board for less than five seconds. One look, and they could reconstruct the arrangement of the pieces precisely, as if they'd taken a mental snapshot. 
   In a follow-up study, a pair of researchers at Carnegie Mellon University──William G. Chase and Herbert A. Simon──showed that this skill had nothing to do with the capacity of the masters' memory. Their short-term recall of things like numbers was no better than anyone else's. Yet they saw the chessboard in more meaningful chunks than the novices did.*  “The superior performance of stronger players derives from the ability of those players to encode the position into larger perceptual chunks, each consisting of a familiar configuration of pieces”, Chase and Simon concluded. 

* “Chunking”, in psychology, is the facility to store studied items in meaningful clusters based on prior knowledge. Take the sequence of letters Y, N, B, C, B, B, C, F, I, F, A, C, I, A, M, B, A, Y.  Study those for a few minutes, then cover your eyes and try to remember as many as you can. The typical number most of us can remember is about seven. Now try it again after grouping the letters in this way: Y, NBC, BBC, FIFA, CIA, MBA, Y.  You remember more, because you've stored the letters in meaningful groups. 
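The regrouping in the footnote can be mimicked mechanically: a greedy longest-match split over a set of familiar acronyms. This is a toy sketch of the idea, not a model of memory:

```python
def chunk(letters, known):
    """Greedily split a letter stream into the longest familiar chunks,
    falling back to single letters when nothing in `known` matches."""
    out, i = [], 0
    max_len = max(len(k) for k in known)
    while i < len(letters):
        for size in range(max_len, 0, -1):      # try longest match first
            piece = letters[i:i + size]
            if size == 1 or piece in known:
                out.append(piece)
                i += size
                break
    return out

stream = "YNBCBBCFIFACIAMBAY"
acronyms = {"NBC", "BBC", "FIFA", "CIA", "MBA"}
print(chunk(stream, acronyms))
# → ['Y', 'NBC', 'BBC', 'FIFA', 'CIA', 'MBA', 'Y']
```

Eighteen letters collapse into seven chunks──right at the typical span of short-term memory──which is exactly the leverage prior knowledge gives the chess masters.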

p.178
Their eyes, and the visual systems in their brains, are extracting the most meaningful set of clues from a vast visual tapestry, and doing so instantaneously. I think of this ability in terms of infrared photography: You see hot spots of information, live information, and everything else is dark. 

p.178
Like chess and baseball prodigies, they do it through career-long experience, making mistakes, building intuition. 


p.179
   Eleanor Gibson came of age as a researcher in the middle of the 20th century, during what some call the stimulus-response, or S-R, era of psychology. 

p.179
Psychologists at the time were under the influence of behaviourism, which viewed  learning as a pairing of a stimulus and response: the ringing of a bell before mealtime and salivation, in Ivan Pavlov's famous experiment. 

pp.179-180
Most of us learn early in life, for instance, that making eye contact brings social approval, and screaming less so. We learn that when the family dog barks one way, it's registering excitement; another way, it senses danger.  In the S-R [stimulus and response] world, learning was a matter of making those associations──between senses and behaviours, causes and effects. 

p.180
The field, Gibson believed, was completely overlooking something fundamental: discrimination. How the brain learns to detect minute differences in sights, sounds, or textures. Before linking different names to distinct people, for example, children have to be able to distinguish between the sounds of those names, between Ron and Don, Fluffy and Scruffy. That's one of the first steps we take in making sense of the world. In hindsight, this seems an obvious point. Yet it took years for her to get anyone to listen. 

p.180
Gibson soon got the opportunity to study learning in young children, and that's when she saw that her gut feeling about discrimination learning was correct. 
 
p.183
Nor were their brains──as the English philosopher John Locke famously argued in the 17th century──empty vessels, passively accumulating sensations.  No, their brains came equipped with evolved modules to make important, subtle discriminations, and to put those differing symbols into categories.  

p.183
   “Let us consider the possibility of rejecting Locke's assumption altogether”, the Gibsons wrote. “Perhaps all knowledge comes through the senses in an even simpler way than John Locke was able to conceive──by way of variations, shadings, and subtleties of energy.”

p.183
   That is, the brain doesn't solely learn to perceive by picking up on tiny differences in what it sees, hears, smells, or feels. In this experiment and a series of subsequent ones──with mice, cats, children, and adults──Gibson showed that it also perceives to learn. It takes the differences it has detected between similar-looking notes or letters or figures, and uses those to help decipher new, previously unseen material. Once you've got middle-C nailed on the treble clef, you use it as a benchmark for nearby notes; when you nail the A an octave higher, you use that to read its neighbors; and so on.  This “discrimination learning” builds on itself, the brain hoarding the benchmarks and signatures it eventually uses to read larger and larger chunks of information. 

pp.183-184
   In 1969, Eleanor Gibson published Principles of Perceptual Learning and Development, a book that brought together all her work and established a new branch of psychology: perceptual learning.  Perceptual learning, she wrote, “is not a passive absorption, but an active process, in the sense that exploring and searching for perception itself is active. We do not just see, we look; we do not just hear, we listen. Perceptual learning is self-regulated, in the sense that modification occurs without the necessity of external reinforcement. It is stimulus oriented, with the goal of extracting and reducing the information stimulation. Discovery of distinctive features and structure in the world is fundamental in the achievement of this goal.”  

p.184
   Perceptual learning is active. Our eyes (or ears, or other senses) are searching for the right clues. Automatically, no external reinforcement or help required. We have to pay attention, of course, but we don't need to turn it on or tune it in.  It's self-correcting──it tunes itself.  The system works to find the most critical perceptual signatures and filter out the rest. Baseball players see only the flares of motion that are relevant to judging a pitch's trajectory──nothing else. The masters in Chase and Simon's chess study considered fewer moves than the novices, because they'd developed such a good eye that it instantly pared down their choices, making it easier to find the most effective parry. And these are just visual examples. Gibson's conception of perceptual learning applied to all the senses: hearing, smell, taste, and touch, as well as vision. 



pp.184-185
The flying conditions above Martha's Vineyard can change on a dime. Even when clouds are sparse, a haze often settles over the island that, after nightfall, can disorient an inexperienced pilot. That's apparently what happened just after 9:40 P.M. on July 16, 1999, when John Kennedy Jr. crashed his Piper Saratoga into the ocean seven miles offshore, killing himself, his wife, and her sister. “There was no horizon and no light”, said another pilot who had flown over the island that night. “I turned left toward the Vineyard to see if it was visible but could see no lights of any kind nor any evidence of the island. I thought the island might have suffered a power failure.” 

p.185
The official investigation into the crash found that Kennedy had 55 hours of experience flying at night, and that he didn't have an instrument rating at all. In pilot's language, that means he was still learning and not yet certified to fly in zero visibility, using only the plane's instrument panel as a guide. 

p.185
   The instruments on small aircraft traditionally include six main dials. One tracks altitude, another speed through the air. A third, the directional gyro, is like a compass; a fourth measures vertical speed (climb or descent). Two others depict a miniature airplane, showing the plane's banking and its rate of turn through space, respectively (newer models have five dials, with no separate banking dial). 

p.185
   Learning to read any one of them is easy, even if you've never seen an instrument panel before. It's harder, however, to read them all in one sweep and to make the right call on what they mean collectively. Are you descending? Are you level? This is tricky for amateur pilots to do on a clear day, never mind in zero visibility. Add in communicating with the tower via radio, reading aviation charts, checking fuel levels, preparing landing gear, and other vital tasks──it's a multitasking adventure you don't want to have, not without a lot of training. 

pp.185-186
   This point was not lost on Philip Kellman, a cognitive scientist at Bryn Mawr College, when he was learning to fly in the 1980s. As he moved through his training, studying for aviation tests──practicing on instrument simulators, logging air time with instructors──it struck him that flying was mostly about perception and action. Reflexes. Once in the air, his instructors could see patterns that he could not. “Coming in for landing, an instructor may say to the student, ‘You're too high!’”  Kellman, who's now at UCLA, told me. “The instructor is actually seeing an angle between the aircraft and the intended landing point, which is formed by the flight path and the ground. The student can't see this at all. In many perceptual situations like this one, the novice is essentially blind to patterns that the expert has come to see at a glance.” 

p.186
   That glance took into account all of the instruments at once, as well as the view out the windshield. Honing the ability took hundreds of hours of flying time, and Kellman saw that the skill was not as straightforward as it seemed on the ground. Sometimes a dial would stick, or swing back and forth, creating a confusing picture. Were you level, as one dial indicated, or in a banking turn, as another suggested?  Here's how Kellman describes the experience of learning to read all this data at once with an instructor: “While flying in the clouds, the trainee in the left seat struggles as each gauge seems to have a mind of its own. One by one, he laboriously fixates on each one. After a few seconds on one gauge, he comprehends how it has strayed and corrects, perhaps with a jerk guaranteed to set up the next fluctuation. Yawning, the instructor in the right seat looks over at the panel and sees at a glance that the student has wandered off of the assigned altitude by 200 feet but at least has not yet turned the plane upside down.”

p.186
The training short-cut Kellman developed is what he called a perceptual learning module, or PLM. 



p.203
One reason that palace intrigue makes for such page-turning fiction or addictive TV is what psychologists call “embedded hierarchy”.  The king is the king, the queen is the queen, and there are layers of princes, heirs, relatives, ladies-in-waiting, meddling patriarchs, ambitious newcomers, and consigliere types, all scheming to climb to the top. Which alliances are most important?  What's the power hierarchy?  Who has leverage over whom?  You have no idea until you see the individuals interact. And if you don't see them square off one-on-one, you play out different scenarios to see if you can judge the players' relative power. Could Grishilda have Thorian shackled and tossed in the moat if the two clashed?  She is a favorite of the king's, after all.  Yet Thorian might have some connections up his sleeve ... wait, who's his mother again? 
   Learning scientists like embedded hierarchy problems because they model the sort of reasoning we have to do all the time, to understand work politics as well as math problems. We have to remember individual relationships, which is straight retention.  

p.205
We have to use those to induce logical extensions: if A > B and B > C, then A must be > C.  


p.203
Finally, we need to incorporate those logical steps into a larger framework, to DEDUCE the relationships between people or symbols that are distantly related. When successful, we build a bird's-eye view, a system to judge the relationship between any two figures in the defined universe, literary or symbolic, that's invisible to the untrained mind. 
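The chain described above──retain direct relationships, induce transitive links, then deduce the full picture──can be sketched in a few lines of code. This is a hypothetical illustration, not anything from the book; the names simply echo the court example, and the direct rankings are invented.

```python
# A minimal sketch of "embedded hierarchy" reasoning: from a handful of
# directly observed rankings (retention), compute the transitive closure
# (induction), which lets us deduce relations between figures who never
# interact directly.

def transitive_closure(pairs):
    """Given direct observations (a outranks b), return every pair
    (x, y) such that x outranks y, directly or indirectly."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                # If a > b and b > d, then a > d.
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# Direct, one-on-one observations (hypothetical):
observed = {("king", "Grishilda"),
            ("Grishilda", "Thorian"),
            ("Thorian", "newcomer")}

ranks = transitive_closure(observed)

# Deduction: the king outranks the newcomer, though the two never met.
print(("king", "newcomer") in ranks)   # True
```

The closure is the “bird's-eye view” the text describes: once built, it can answer a ranking query between any two figures in the defined universe, not just the pairs that were directly observed.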

p.205
The window Eugene Aserinsky had opened, revealing REM sleep, seemed, for a time, to expose little more than another dark room. “You had this great excitement, basically followed by forty [40] years of nothing; it was just horrible”, Robert Stickgold, a neuroscientist at Harvard, told me. 

p.205
   My own theory is that sleep amplifies many of the techniques we've discussed in this book. 

p.206
These findings have coalesced into a remarkable hypothesis, first described in 1995 by Italian scientists led by Antonio Giuditta at the University of Naples Federico II. 

p.206
   Technically, I suppose, we should call this idea the Giuditta-Smith-Stickgold Model of Learning Consolidation. I prefer to call it, simply, the Night Shift Theory. The lights go out, and basic maintenance is done. 

pp.206-209



pp.214-215
p.214
Humans have been around for at least a million years, and for the vast majority of that time we've been preoccupied with food, shelter, and safety. We've been avoiding predators, ducking heavy weather, surviving by our wits, foraging. 

p.214
And life for foragers, as the Harvard psychologist Steven Pinker so succinctly puts it, “is a camping trip that never ends.”

p.214
Some of it would come from elders and peers, but most of it would be accumulated through experience. Listening. Watching. Exploring the world in ever-widening circles. That is how the brain grew up learning, piecemeal and on the fly, at all hours of the day, in every kind of weather. 

p.215
As we foraged for food, the brain adapted to absorb──at maximum efficiency──the most valuable cues and survival lessons along the way. 

p.215
   Humans fill what the anthropologists John Tooby and Irven DeVore called the “cognitive niche” in evolutionary history. Species thrive at the expense of others, each developing defenses and weapons to dominate the niche it's in. 


pp.216-217
p.216
   I remember bringing my younger daughter to my newspaper office one weekend a few years ago when she was 12. I was consumed with a story I had to finish, so I parked her at an empty desk near mine and logged her into the computer. And then I strapped in at my desk and focused on finishing──focused hard. Occasionally, I looked up and was relieved to see that she was typing and seemed engrossed, too. After a couple hours of intense work, I finished the story and sent it off to my editor. At which point, I asked my daughter what she'd been up to. She showed me. She'd been keeping a moment-to-moment log of my behavior as I worked. She'd been taking field notes, like Jane Goodall observing one of her chimpanzees: 

   10:46──types
   10:46──scratches head
   10:47──gets papers from printer 
   10:47──turns chair around
   10:48──turns chair back around
   10:49──sighs
   10:49──sip tea
   10:50──stares at computer
   10:51──puts on headset
   10:51──calls person, first word is “dude”
   10:52──hangs up
   10:52──puts finger to face, midway between mouth and chin, thinking pose? 
   10:53──friend comes to desk, he laughs
   10:53──scratches ear while talking

And so on, for three pages. I objected. She was razzing me, naturally, but the phone call wasn't true, was it? Did I make a call? Hadn't I been focused the whole time, locked in, hardly looking away from my screen? Hadn't I come in and cranked out my story without coming up for air? Apparently not, not even close. 

pp.216-217
The truth was, she could never have invented all those entries, all that detail. I did the work, all right, and I'd had to focus on it. Except that, to an outside observer, I looked fidgety, distracted──unfocused. 

p.217
Concentration may, in fact, include any number of breaks, diversions, and random thoughts. 

p.217
We're still in foraging mode to a larger extent than we know. 

p.217
Meaning Maintenance Model

p.217
One encompassing theory is called the Meaning Maintenance Model, and the idea is this: Being lost, confused, or disoriented creates a feeling of distress. To relieve that distress, the brain kicks into high gear, trying to find or make meaning, looking for patterns, some way out of its bind──some path back to the campsite. 

pp.217-218
“We have a need for structure, for things to make sense, and when they don't, we're so motivated to get rid of that feeling that our response can be generative”, Travis Proulx, a psychologist at Tilburg University in the Netherlands, told me. “We begin to hunger for meaningful patterns, and that can help with certain kinds of learning.” 

p.218
In one experiment, [Travis] Proulx and Steven J. Heine, a psychologist at the University of British Columbia, found that deliberately confusing college students──by having them read a nonsensical short story based on one by Franz Kafka──improved their performance by almost 30 percent on a test of hidden pattern recognition, similar to the colored egg test we discussed in Chapter 10.  The improvements were subconscious; the students had no awareness they were picking up more. 

p.218
On the contrary, disorientation flips the GPS setting to “hypersensitive”, warming the mental circuits behind incubation, percolation, even the nocturnal insights of sleep. If the learner is motivated at all, he or she is now mentally poised to find the way home. Being lost is not necessarily the end of the line, then. Just as often, it's a beginning. 

--
πόλλ' οἶδ' ἀλώπηξ, ἀλλ' ἐχῖνος ἓν μέγα  (“The fox knows many things, but the hedgehog knows one big thing.”)

NOTICE: In accordance with Title 17 U.S.C., section 107, some material is provided without permission from the copyright owner, only for purposes of criticism, comment, scholarship and research under the "fair use" provisions of federal copyright laws. These materials may not be distributed further, except for "fair use," without permission of the copyright owner. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml
