The Inner Workings of the Executive Brain

28. April 2014

Date: 28-04-2014
Source: The Wall Street Journal

New Research Shows the Best Business Minds Make Decisions Very Differently Than We Thought

Take much of what you know about how the best executives make decisions. Now, forget it.

For instance, we all “know” that tight deadlines lead to inspiration. Except they often don’t. Instead, they typically are counterproductive—making people less creative precisely when they need to be. Or most of us assume that when we try to solve problems, we’re drawing on the logical parts of our brains. But, in fact, great strategists seem to draw on the emotional and intuitive parts of their brains much more.

These are some of the insights coming from the world of neuroimaging, where scientists use sophisticated machines to map what’s going on inside the brain when people do jobs or ponder problems. The work is still in its early stages, but even now it offers an extraordinary opportunity that wasn’t possible before.

Researchers can now see how people’s brains react to a situation—a process that, obviously, the subjects themselves can’t see, let alone explain. That promises to provide a much clearer view of how leaders make good choices, and how other people can learn to follow their example.

Here’s a closer look at some of the discoveries researchers have made.

Want Innovation? Be Wary of Deadlines

We often think a deadline can help us shake off inertia and focus on getting a job done. But the brain research suggests precisely the opposite is true. A deadline, instead, more often limits our thinking and can lead to much worse decision making.

Richard Boyatzis—along with colleague Anthony Jack and others—has found that a tight deadline increases people’s urgency and stress levels. These people show more activity in the brain’s “task positive” network, which we use for problem solving. But it’s not the part of the brain that comes up with original ideas.

“The research shows us that the more stressful a deadline is, the less open you are to other ways of approaching the problem,” says Dr. Boyatzis, a professor in the departments of organizational behavior, psychology and cognitive science at Case Western Reserve University. “The very moments when in organizations we want people to think outside the box, they can’t even see the box.”

For example, an IT manager being pushed to launch a new software product quickly might rush to get all the bugs fixed. With less pressure, he or she might have taken a step back, asked why all those problems were cropping up in the first place, and come up with a completely different approach to writing the code that worked more smoothly and didn’t produce the glitches.

Does that mean companies should get rid of deadlines? In most cases, that’s not realistic. So Srini Pillay, an assistant clinical professor at Harvard Medical School and founder of the coaching firm NeuroBusiness Group, suggests that companies help employees reduce stress and access the creative parts of the brain even when they’re under pressure.

One such technique is learning to let the mind wander, with exercises like meditation. In that mental state, the creative part of the brain tends to be active. “When people hit a wall in their thinking, in general they start thinking harder,” says Dr. Pillay. “What the neuroscience research tells us is that it’s more important to think differently.”

Big Unknowns Lead to Bad Choices

The ticking clock of a deadline isn’t the only kind of pressure that makes for bad decisions. So does uncertainty, such as feeling that your job or your company’s future is under threat.

Dr. Pillay cites a study that discovered that feelings of uncertainty activated brain centers associated with anxiety and disgust, and that such concerns naturally lead to certain kinds of decisions. “In times of uncertainty,” he says, “you start acting out of that sense of doom and gloom.”

The problem, he says, is that the study also showed that 75% of people in uncertain situations erroneously predicted that bad things would happen. So the reactions and decisions that were made based on fear and anxiety could turn out to be exactly the wrong moves.

Let’s say a company is having a rough time navigating the weak economy. A manager who’s mired in doom-and-gloom thinking might be too pessimistic to hire new staff or invest in new equipment. But those might be exactly the moves the company needs to gain ground on competitors.

Given that uncertainty is a hallmark of many modern workplaces, the solution lies not in trying to avoid it, but in learning to accept it. “It’s important to be aware that your response is likely to be an exaggeration,” Dr. Pillay says.

The backlash against big data

21. April 2014

Date: 21-04-2014
Source: The Economist

“BOLLOCKS”, says a Cambridge professor. “Hubris,” write researchers at Harvard. “Big data is bullshit,” proclaims Obama’s reelection chief number-cruncher. A few years ago almost no one had heard of “big data”. Today it’s hard to avoid—and as a result, the digerati love to condemn it. Wired, Time, Harvard Business Review and other publications are falling over themselves to dance on its grave. “Big data: are we making a big mistake?” asks the Financial Times. “Eight (No, Nine!) Problems with Big Data,” says the New York Times. What explains the big-data backlash?

Big data refers to the idea that society can do things with a large body of data that weren’t possible when working with smaller amounts. The term was originally applied a decade ago to massive datasets from astrophysics, genomics and internet search engines, and to machine-learning systems (for voice recognition and translation, for example) that only work well when given lots of data to chew on. Now it refers to the application of data analysis and statistics in new areas, from retailing to human resources.

The backlash began in mid-March, prompted by an article in Science by David Lazer and others at Harvard and Northeastern University. It showed that a big-data poster child—Google Flu Trends, a 2009 project that identified flu outbreaks from search queries alone—had overestimated the number of cases for four years running, compared with reported data from the Centers for Disease Control (CDC). This led to a wider attack on the idea of big data.


Americans Aren’t Ready for the Future Google and Amazon Want to Build

19. April 2014

Date: 19-04-2014
Source: WIRED

A kidney structure being printed by the 3-D printer at Wake Forest Institute for Regenerative Medicine.

Americans are hopeful about the future of technology. But don’t release the drones just yet. And forget meat grown in a petri dish.

Pushing new tech on a public that isn’t ready can have real bottom-line consequences.

That’s the takeaway from a new study released by the Pew Research Center looking at how U.S. residents felt about possible high-tech advances looming in the not-too-distant future. Overall, a decisive majority of those surveyed believed new tech would make the future better. At the same time, the public doesn’t seem quite ready for many of the advances companies like Google and Amazon are pushing hard to make real.

If the stigma surrounding Google Glass (or, perhaps more specifically, “Glassholes”) has taught us anything, it’s that no matter how revolutionary a technology may be, its success or failure ultimately rides on public perception. Many promising technological developments have died because they were ahead of their time. During a cultural moment when the alleged arrogance of some tech companies is creating a serious image problem, pushing new tech on a public that isn’t ready could have real bottom-line consequences.


The Limits of Social Engineering

16. April 2014

Date: 16-04-2014

Source: Technology Review, by Nicholas Carr

Tapping into big data, researchers and planners are building mathematical models of personal and civic behavior. But the models may hide rather than reveal the deepest sources of social ills.

In 1969, Playboy published a long, freewheeling interview with Marshall McLuhan in which the media theorist and sixties icon sketched a portrait of the future that was at once seductive and repellent. Noting the ability of digital computers to analyze data and communicate messages, he predicted that the machines eventually would be deployed to fine-tune society’s workings. “The computer can be used to direct a network of global thermostats to pattern life in ways that will optimize human awareness,” he said. “Already, it’s technologically feasible to employ the computer to program societies in beneficial ways.” He acknowledged that such centralized control raised the specter of “brainwashing, or far worse,” but he stressed that “the programming of societies could actually be conducted quite constructively and humanistically.”

The interview appeared when computers were used mainly for arcane scientific and industrial number-crunching. To most readers at the time, McLuhan’s words must have sounded far-fetched, if not nutty. Now they seem prophetic. With smartphones ubiquitous, Facebook inescapable, and wearable computers like Google Glass emerging, society is gaining a digital sensing system. People’s location and behavior are being tracked as they go through their days, and the resulting information is being transmitted instantaneously to vast server farms. Once we write the algorithms needed to parse all that “big data,” many sociologists and statisticians believe, we’ll be rewarded with a much deeper understanding of what makes society tick. Read the rest of this entry »


Philosophy on Top

9. April 2014

Date: 09-04-2014
Source: Project Syndicate

PETER SINGER

Peter Singer, Professor of Bioethics at Princeton University and Laureate Professor at the University of Melbourne, is the author of Animal Liberation, Practical Ethics, One World, The Ethics of What We Eat (with Jim Mason), The Life You Can Save, and the forthcoming The Point of View of the Universe (with Katarzyna de Lazari-Radek). In 2013, he was named the world’s third “most influential contemporary thinker” by the Gottlieb Duttweiler Institute.

MELBOURNE – Last year, a report from Harvard University set off alarm bells, because it showed that the proportion of students in the United States completing bachelor’s degrees in the humanities fell from 14% to 7%. Even elite universities like Harvard itself have experienced a similar decrease. Moreover, the decline seems to have become steeper in recent years. There is talk of a crisis in the humanities.

I don’t know enough about the humanities as a whole to comment on what is causing enrollments to fall. Perhaps many humanities disciplines are not seen as likely to lead to fulfilling careers, or to any careers at all. Maybe that is because some disciplines are failing to communicate to outsiders what they do and why it matters. Or, difficult as it may be to accept, maybe it is not just a matter of communication: Perhaps some humanities disciplines really have become less relevant to the exciting and fast-changing world in which we live.

I state these possibilities without reaching a judgment about any of them. What I do know something about, however, is my own discipline, philosophy, which, through its practical side, ethics, makes a vital contribution to the most urgent debates that we can have.



Get Familiar With Big Data Now—or Face ‘Permanent Pink Slip’

9. April 2014

Date: 09-04-2014
Source: The Wall Street Journal

Demand Rises for Analytics Professionals, Data Scientists

Sick of hearing about Big Data? Get used to it. Whether you believe analytics is a tired corporate buzzword or the key to future business growth, hundreds of companies are searching, and paying richly, for hires with quantitative skills.

During her three decades at the helm of executive recruiter Burtch Works, Linda Burtch has tracked the rising demand for workers who can understand and manipulate data. She has worked with clients such as Darden Restaurants Inc., Jack in the Box, Leo Burnett Worldwide, Foot Locker Inc. and other big firms to staff the teams responsible for understanding, for example, how marketing affects consumer behavior.

“It’s a candidate’s market right now,” Ms. Burtch says. According to recent surveys of Burtch Works’ contacts, analytics professionals—from entry-level data analysts to executives—earn a median base salary of $90,000 annually, rising to a median of $145,000 for managers. And the group’s just-released study of data scientists, a subset of the larger group, found that nonmanagers earn a median base salary of $120,000. (Data scientists work with large, unstructured sets of data. Analytics professionals generally deal with structured data sets, Ms. Burtch says.)

Not yet a quantitative expert? Better brush up. In a recent interview, the Chicago-based Ms. Burtch talked about what companies want, how midcareer professionals can compete and why workers who are left behind could face a “permanent pink slip.” Edited excerpts:



McKinsey’s Matt Rogers on the next industrial revolution

1. April 2014

Date: 01-04-2014
Source: Fortune

Rogers’ new book with Stanford Professor Stefan Heck argues that the business world is fast approaching a shortage of valuable natural resources. Here’s what managers need to know.

Matt Rogers of McKinsey & Co.

FORTUNE — Over the next 15 years, another 2.5 billion people in the developing world will join the middle class. China will add 2½ new cities the size of Chicago every year for the foreseeable future and will have 221 cities with more than a million people by 2025 (compared with 35 cities of that size in Europe today). That kind of growth is going to create an unprecedented demand for oil, gas, steel, precious metals, water, and other natural resources. If we keep on our current course of consumption, commodity prices, food prices, and pollution levels are likely to spike, greatly increasing risks for business.

In their insightful new book Resource Revolution: How to Capture the Biggest Business Opportunity in a Century, McKinsey director Matt Rogers and Stanford professor Stefan Heck lay out a compelling road map for how managers need to change the way they think about resources if they want to not only survive but also thrive in the 21st century.

Fortune’s Brian Dumaine caught up with Matt Rogers recently to discuss the book, which will be published on April 1.

The conventional wisdom about resources is that we’re running out, and we’re all going to die. But you believe we’re about to enter what you call a resource revolution, and that it will be the biggest economic opportunity of the 21st century.

Over the next two decades global growth will stress our resources, and that has a lot of people concerned. What gave us confidence to write the book is that we saw that you could combine advances in nanotechnology, materials science, information technology, and biology with traditional industrial technologies and meet resource requirements more easily than most expect.