
Your Brain on Scrum

I. The roots of agile

Agile relies on the belief that individuals and interactions matter more than processes and tools. That belief turns out to be more than a slogan: individuals really do work more productively in teams, and social cognitive neuroscience research strongly suggests there are good brain-based reasons why agile is so effective.

The agile software development framework has been with us for over a decade. The classic principles were stated in 2001 in the Agile Manifesto:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

These principles identify agile’s differences with the standard top-down waterfall method of creating software. The waterfall method requires a large overall plan and a set of processes and standard tools to use in following the plan. The execution of the plan is the immediate purpose. Unstated, but clear, is that managers are needed to supervise the execution of all the steps of the plan, including the intermediate steps, in the proper order. The actual working software comes only at the end of the waterfall.

In sharp contrast, agile gives control to individuals, where people on the agile team, interacting and responding to changes, take responsibility for producing the software.

The same meeting that produced the Agile Manifesto also produced these Twelve Principles:

  1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
  2. Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.
  3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
  4. Business people and developers must work together daily throughout the project.
  5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
  7. Working software is the primary measure of progress.
  8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
  9. Continuous attention to technical excellence and good design enhances agility.
  10. Simplicity–the art of maximizing the amount of work not done–is essential.
  11. The best architectures, requirements, and designs emerge from self-organizing teams.
  12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.

Interestingly, five of those twelve principles mention time, which to me shows that speed, timing, and rhythm were agile’s focus from the start.

II. Agile methods are supported by cognitive neuroscience

Now let’s turn to the science. The Agile Manifesto established a milestone in the world of work.

Six years earlier, in 1995, the science of brain study had turned a corner, too. That was the year the mirror neuron was discovered in the primate brain. Giacomo Rizzolatti at the University of Parma discovered that mirror neurons in the brain light up when we see other people do things on purpose. If you see someone pick up a piece of fruit to eat, mirror neurons in your brain light up. This was, finally, the anatomy of empathy. Soon, new mirror-neuron studies were underway, and they led to new insights. A key insight was that mirror neurons not only pick up on intentional actions like grasping a pencil; they also pick up on emotional actions such as facial expressions.

When we see others’ facial expressions, we activate the same in our own motor cortex, but we also transmit this information to the insula, involved in our emotions. When I see your facial expression, I get the movement of your face, which drives the same motor response on my face, so a smile gets a smile. The motor resonance is also sent on to your own emotional centers, so you share the emotion of the person in front of you. (Rock, page 160)

If empathy had an anatomical location in the brain—a place where specific nerves were dedicated to empathic connection with another—then what about other aspects of social connection? What about liking and disliking? What about respect, inclusion, and ostracism? Where were they located?

Rizzolatti’s discovery of the mirror neuron ignited an explosion of research. The new tools, now well refined, included functional magnetic resonance imaging (fMRI), positron emission tomography (PET), and brain-wave analysis via quantitative EEGs (QEEG). Well-designed experiments produced at first a trickle and then a river of studies about what motivates and demotivates people, at the brain level, in family life and at work. The new field of social cognitive neuroscience was born. By one estimate, 250 researchers now work full-time in the field.

Membership in the group: hard-wired

The basic finding of the new science of social cognitive neuroscience is that humankind is a far more deeply social species than we generally assume. Our interactions and our social behaviors are hardwired into the brain.

Rather strikingly, the science shows that the classic Maslow five-step pyramid of needs is mistaken. In Abraham Maslow’s view, human survival needs pertain to an individual alone in the universe: first he needs food and water, then he needs safety, and only after that does he have need for social contact and the esteem of others. But it’s now clear that the Maslow hierarchy is not supported by what the brain scans are showing.

The fMRI studies of the human brain show we are a community first, and that we experience ostracism from the community, for example, as a basic threat. You may have seen the recent news that the experience of social exclusion (ostracism) lights up the same parts of the brain as physical pain.

New studies of chimps and monkeys bolster the case that social contact is basic to primates. Here is a striking passage from Frans de Waal of the Yerkes Primate Center in Atlanta:

Origin stories [that present] humans as loners who grudgingly came together are ignorant of primate evolution. We belong to a category of animals known among zoologists as “obligatorily gregarious,” meaning that we have no option but to stick together. This is why fear of ostracism lurks in the corners of every human mind: being expelled is the worst thing that can befall us. It was so in biblical times, and it remains so today. Evolution has instilled a need to belong and to feel accepted. We are social to our core. (de Waal, page 221)

Threats to our sense of membership in the group (such as being excluded from a meeting where we think we belong) register, for us as for other primates, as threats to our very existence. Both the animal studies and the fMRI studies make the point that avoiding disconnection from others matters just as much as avoiding physical pain.

Agile values the connections between people over the solutions to technical problems. We say it’s important to take care of people’s membership issues and resolve disputes, and the neuroscience seems to be telling us “damn right.”

Whenever there is an argument about some choice of technology, and the voices get raised, you know it’s not about which technology to choose. It never is. Sam is yelling at Bill, “You’re stupid, you don’t understand, you are incompetent.” As soon as there’s some intense emotional state, it’s about something else, about the relationship between those two people—about how they don’t respect each other, or one felt humiliated in the last meeting, or one is embarrassed to be criticized in front of someone they have a crush on.

Social experience is a “survival issue”

One who explains this new science well, without oversimplifying too much, is David Rock, a former business journalist. His book Your Brain at Work, well footnoted, summarizes in non-specialist terms the findings and their implications for the management of people at work. Rock creates a useful acronym he calls SCARF.

There are five domains of social experience that your brain treats the same as survival issues. [This is] the SCARF model, which stands for Status, Certainty, Autonomy, Relatedness, and Fairness. This model describes the interpersonal primary rewards or threats that are important to the brain. (Rock, page 196)

Here are Rock’s definitions and comments (Rock, page 276):

  • Status: Where you are in the social order of the communities you are involved in. A sense of your status going up, even in a small way, activates your reward circuits. A sense of status going down activates threat circuitry. Just speaking to a person of higher authority generally activates a status threat.
  • Certainty: This is the ability to predict the future. Increasing uncertainty is a threat, but increasing certainty is a reward.
  • Autonomy: Having control or choices. A sense of increasing autonomy is a pleasant reward, but a sense of decreasing autonomy is stressful.
  • Relatedness: Relatedness means being safely connected to the people around you. It involves sensing whether people are friend or foe, but other people are generally considered foe until proven otherwise.
  • Fairness: This is the state of being in which people act ethically and appropriately with one another.

Rock uses the terms “threat” and “reward” to refer to brain circuits that are automatic. When it comes to the big five SCARF values—status, certainty, autonomy, relatedness and fairness—our awareness of them is absolutely automatic and cannot be turned off. We can’t be talked out of our perception of threats to these five values. We live with these circuits always on.

Fairness. Fairness studies, such as those conducted by Golnaz Tabibnia at Carnegie Mellon, show that, as she says, “the tendency to prefer equity and resist unfair outcomes is deeply rooted in people.” In the brain, she reports, fair treatment produces a “primary reward” experience in the striatum, just like the experience of eating tasty food; unfair treatment, by contrast, excites the anterior insula, a region associated with our experience of disgusting tastes (quoted in Rock, page 175).

An increasing sense of fairness increases levels of dopamine, serotonin and oxytocin, and this emotional state “makes you open to new ideas and more willing to connect with other people” (Rock, page 178).

Common sense takes for granted that our needs for food, shelter and sex are more fundamental than our need for a sense of fairness. But the brain scans say otherwise. Rookie managers often undervalue fairness, and are therefore surprised when someone reporting to them becomes outraged at feeling unfairly treated. When you perceive that you’ve been fairly treated, your brain gives you endorphins and you feel calmed and good. When you perceive unfair treatment, your brain aches. In a way, fair treatment can be more important than food.

Autonomy. The need for autonomy is familiar to us all. Autonomy is the sense of control over the environment. Its presence feels good, and its absence feels bad—ask any prisoner in a jail.

Daniel H. Pink’s best-selling books on management and motivation (A Whole New Mind and Drive) stress the importance of a sense of autonomy for individual performance and attitude in school, sports and business. Among his many examples is a study of 320 small businesses, “half of which granted workers autonomy, the other half relying on top-down direction” (Pink, page 97). The businesses that offered autonomy grew at four times the rate of the control-oriented firms and had one-third the turnover.

David Rock also argues for the importance of autonomy for well-being, citing many sources, including animal studies of rats pressing levers to get cocaine, and human studies of British civil servants, small business owners, and residents of nursing homes in which “over and over, scientists see that the perception of control over a stressor alters the stressor’s impact” (Rock, page 124). Stressors arouse the limbic system (the zone where we perceive threats to our existence), and that sense of biological emergency diminishes the capacity of our “highest” mind, the pre-frontal cortex, to think and to learn. Amy Arnsten says, “Even if we have the illusion that we are in control, our cognitive functions are preserved” (quoted in Rock, page 124).

III. Learning time is the bottleneck in software development: why agile is the fastest way to learn

Creating software requires huge amounts of learning by the software developers, and the time spent learning is the main bottleneck limiting productivity and profit. I think agile’s particular charisma and grace stem from its single-minded focus on speed of learning, and I think there is good evidence that small-group social cohesion fosters happiness, and that happiness fosters learning.

Learning time is the bottleneck that limits invention, because we cannot invent more than we can learn; therefore the work of invention is mostly learning. That’s what Thomas Edison meant when he talked about investigating 999 dead ends on the path to inventing the light bulb. Here’s a thought experiment. Say you take ten days to write a paper—like this white paper—and then tear it up. How long would it take you to re-create it? Ten days? It might take only two days, because you will remember most of what you put into the first draft, and its structure and its key references. If it only takes two days to re-create the paper, that means eight of the ten days you put into that draft went to learning how to do it.

The same principle applies to software projects. If we spent six months writing software and then lost the code at the end of it, the work would not be totally lost. We could probably re-make the software in a fifth of the time spent. Even with the software gone, the brains still remain rewired as a result of the complex learning task. All the patterns and categories and intentions are still living in the brain, so the software can be re-created speedily.

Ambient learning is important

Agile is all about learning. Agile seeks to invent environments that speed up our ability to understand things. When agile works well, it does so because teams are learning efficiently, much more efficiently than software engineers in the standard waterfall method.

In an agile setup, all sorts of ambient learning takes place. You are all sitting at a table, working eight hours a day, and you overhear a conversation between two people and you pop on over. The threshold of activation is very low. You just turn to someone and say, “Hey can you look at this, and give me a second opinion?”

In agile, delays are diminished because issues get resolved quickly face-to-face. When your teammates work at a distance, things are much harder. A phone call has to be coordinated, you have to organize your thoughts first, maybe making notes, and then you pick up the phone and hope the communication is clear despite the fact that you can’t see the emotion and feeling on your teammate’s face.

There is something about the physicality of Agile work which ten years ago we might have called “the magic of physicality.” Now that neuroscience has discovered the brain basis of empathy, I’m inclined to call that magic the natural rapport of primate brains.

The evidence is strong that emotional state is infectious, in good and bad ways, in all zones of human life. Emotional states propagate, and it’s almost a cliché by now that if you know happy people you are more likely to be happy, and if your friends are overweight you are more likely to be overweight. Learning is infectious, too. The best learning locations are not just rooms—they are commonly shared emotional states, shared by small groups of people in the same room who feel safe with each other. We often call these shared emotional states “good working rapport.”

A researcher at Northwestern University named Dr. Mark Beeman has studied what happens in the brain just before, during, and after a moment of insight. He says just before you have an insight your brain goes quiet for a minute, and you have this feeling of waiting for the new idea to show up. We all know what this feels like, and Dr. Beeman has found that brain scans actually show this:

About a second and a half before people solve the problem with insight, they had a sudden and prolonged increase in alpha band activity over the right occipital lobe, the region that processes visual information coming into the brain. We think the alpha activity signifies people sort of had an inkling that they were getting close to solving the problem, that they had some fragile weak activation that was hinting at the solution somewhere in the brain. They want to shut down or attenuate the visual input, so they can decrease the noise in the brain, in order to allow them to see the solution better. (Rock, page 80)

Linda Rising, a mathematician and Scrum guru well known to many in the field, has said in the last few years how surprised she is by how much thinking is unconscious. She used to think that the conscious part of our brain was the tip of the iceberg, about 10 percent, and the unconscious was the 90 percent hidden from view. Now, she says, it’s clear that the conscious mind is even tinier.

A good deal of our unconscious thinking seems to be devoted to checking the social environment for safety, openness, and the ability to be free. The safer the environment, the more energy and space in the brain are freed up for original thinking.

This is why agile programming operationally values the connections between people over the solutions to problems. Agile values the connection, the care and feeding of the relationship between people, not because we have fuzzy teddy bear personalities, but because it is productive.

The fun of Agile is critical

Linda Rising once said, “I don’t think we understand the success of Agile development and that’s one thing that gets in the way of convincing others. They think [we are] a strange collection of people who don’t like documentation. [But] it is much more than that.” She points out that people call their periods of Agile the highlights of their career, and say, “We enjoy it” and “It’s a fun thing.”

The enjoyment of doing great work in the context of an energized, fast-moving team is crucial to agile, as far as I am concerned. As an agile coach, I try to move teams toward the experience of fun for several overlapping reasons. First, fun and relaxation help people be productive; second, productivity is inherently enjoyable; third, fun, productivity, and rapid reaction times are actually the sign that the agile team is about to break the “sound barrier.”

Agile is all about rapid iteration and speedy feedback from the environment, although agile proponents did not invent the idea. Effective workers have always done it. Creative leaders have always used the rapid feedback concept freely, according to Peter Sims, who calls the practice of rapid feedback “making little bets.” He has written an interesting book on that concept of the little bet you make, that you can afford to lose, in order to learn about the environment. Goethe famously said that there was “genius in beginnings,” and he probably meant that even small beginnings are enough to break the ground for big achievements.

Bill Hewlett’s calculator bet

Here’s an example of a famous “little bet.” Back in 1972, the first Hewlett-Packard handheld calculators had been invented, they sold for $400, and the market research at the time said that no one would buy them. They cost too much, there was no real need for them, and slide rules were quick, simple, and cheap. Rather than give up, though, according to Peter Sims, Bill Hewlett decided to make a small, affordable bet for the sake of learning. He is reported to have said, “Why don’t we build a thousand and see what happens?” Thus HP discovered that there was indeed a market for these things, and within five months the company was selling 1,000 of them a day (Sims, page 27).

IV. Answering objections

A couple of objections and discussion points are natural at this point. The first is about the science: How good is this science? Are there holes in it? How many experiments have been done? Is the science mature, or the work of self-promoters? The second set of objections would be about Agile itself, and we’ll get to them shortly.

As I’ve said, 1995, the year the discovery of mirror neurons set off a wave of new experiments, is widely acknowledged as a turning point by people in the field of social cognitive neuroscience.

Social cognitive neuroscience is a solid and established field now, according to Jeffrey M. Schwartz, MD, of UCLA. “There are arguments about the details of certain experiments, and disagreements in the margins,” he said in a recent conversation, but he added that the field as a whole is coherent and self-consistent. (A large database of research results in the field is available online.)

As an intrigued onlooker, I would say the scientific paradigm for researching human social interactions has permanently shifted. The last 16 years have been a Kuhnian moment, and a new scientific discourse about the causes and effects of social behavior is now firmly in place. No disconfirming experiment (one proving social interactions are not brain events) now seems imaginable. There is simply too much data, too well-linked.

Why don’t more companies use Agile?

Let’s move on to the more significant objection: If Agile is so natural and brain-friendly, why is it not more widespread? Why don’t more companies make the move to Agile?

Many great companies have done without agile. Walmart did not become the world’s largest retailer by using these methods, nor did Bill Gates roll up his riches with them, nor did Goldman Sachs become the world’s most powerful investment bank with them. These top-down companies often take the most grinding and callous approaches to their people and problems. They are not fun to work for, as one former Goldman worker comments:

I worked there as an analyst for three years in the early 90s, and I remember that most people couldn’t take advantage of the long line of black cars that waited until midnight outside 85 Broad Street to take them home. Instead, they had to call for cars, because they never got out early enough. I also recall being told that having a tan in the summer was a bad sign, because it meant that you weren’t working hard enough.

The top-down method of managing people has a long history of success; it has worked splendidly in many situations. Top-down is the way commanders have run armies, starting from the dawn of history. The best-organized and tightest hierarchies have usually won the military battles, two famous examples being Julius Caesar and Napoleon.

Top-down is also the way the great factories of the industrial revolution worked, and worked so well. Frederick Winslow Taylor’s theory of scientific management, which appeared about the time Henry Ford began mass production, epitomized this. Taylor’s view was that every worker sought to do the minimum amount of work possible in order not to be fired. He believed in putting people in a process and watching them strictly—otherwise they would do nothing. The manager discovers what needs to be done, writes it out in a checklist or recipe format leaving nothing to the imagination, times how long each step takes, and tells his workers, “Do it faster.”

That’s the view of human beings as cogs or mechanical parts in a larger machine, run by management. For top-down control, the key question is Who is the boss? It works for turning out 1,000 identical cars on an assembly line, but it doesn’t work so well when the work product must be powerful working software.

Top-down control impedes intellectual work because it prevents interaction. Talking with someone in a different part of the organization requires going up the chain of command and then down it.

When I am bothered by the fact that people-are-cogs companies succeed, I remember that some kinds of production don’t require a lot of invention. I also note to myself that only a small number of people, certainly no more than one in 1,000, are willing to work for long in highly toxic environments, and that most people are unwilling to subject others to these conditions. But I do admit that when these two types of people—the dominators and the submitters—unite in service of a market, sometimes highly profitable organizations result.

V. Conclusion

Agile is consistent with human nature; that’s what the research shows. Agile works well in the field of creative intellectual production because it fits our nature so well. Agile is realistic about the importance of putting “individuals and interactions” first and attending to the intellectual and emotional rapport of the working group. In a work structure that has autonomy and safety, agile teams learn quicker, respond quicker, communicate quicker, invent quicker, and have more fun. That’s why they get better results than teams working from a big plan with control from above.

About the Author

Michael de la Maza is an agile coach and trainer whose clients include EMC, Intuit, PayPal, Symantec, and Verizon Wireless.  He is the co-author of Professional Scrum with TFS and holds a PhD in Computer Science from MIT.  He can be reached at


  • The Agile Manifesto is found at
  • Baron-Cohen, Simon. The Science of Evil: On Empathy and the Origins of Cruelty. New York: Basic Books, 2011.
  • de Waal, Frans. Our Inner Ape. New York: Riverhead Books, 2005.
  • Pink, Daniel H. Drive: The Surprising Truth About What Motivates Us. New York: Riverhead Books, 2009.
  • Rising, Linda. Interview with Linda Rising, 2007.
  • Rock, David. Your Brain at Work: Strategies for Overcoming Distraction, Regaining Focus, and Working Smarter All Day Long. New York: HarperCollins, 2009.
  • Sims, Peter. Little Bets: How Breakthroughs Emerge from Small Discoveries. New York: Free Press, 2011.



My thanks to John Maguire, former director of the Program for Reporting on Science and Medicine at Boston University, who helped research and edit this paper. He is reachable at
