To boost your brain’s creativity, take a hike, according to new research. But consider leaving the electronic gadgets at home.

Backpacking in the wilderness for four days without toting a laptop, iPhone or other high-tech device increased people’s creative problem-solving performance by 50 percent.

The beneficial effects of nature on the mind have been known anecdotally for generations, perhaps most famously noted by author Henry David Thoreau, who spent two years living a rustic life by Walden Pond and published “Walden,” his back-to-nature account, in 1854. Previous research has shown that exposure to nature replenishes basic brain functions like attention span, but little has been known about its effect on higher-level thinking, such as that involved in solving complex problems. The current study is the first to measure nature’s influence on creative problem-solving, Strayer said.

“There’s some concern that being in a modern urban environment with horns and technology constantly depletes nature’s restorative properties,” Strayer said. His advice: If you’re going to go on a hike, don’t bring your iPhone or cellphone. “Instead, try to focus on being in the environment you’re in.”

So, how do you laugh, on the Internet, in other languages? Here – haaaaaaaaaahahaha – is a starting guide:

Thai: 55555
In Thai, the number 5 is pronounced “ha” – so instead of saying “hahahahaha,” Thai speakers will sometimes write “55555.”

Japanese: www
This abbreviation, not to be confused (which is to say, often to be confused) with the one for the World Wide Web, likely originates with the Kanji character for “laugh,” 笑, which is pronounced as “warai” in Japanese. “Warai,” in message boards and chat rooms, quickly became shortened to “w” as an indication of laughter. And then, much the same way “ha” begat “haha” begat “hahaha,” the sentiment became extended – to “ww” and then “www” (and also, if you’re so inclined, to “wwwwwww”).

Chinese (Mandarin): 哈哈 or 呵呵
Though laughter is written 笑声 and pronounced xiào shēng, Mandarin also relies on onomatopoeia for laughter: 哈哈, pronounced hā hā, and 呵呵, pronounced hē hē. Similarly, xī xī, 嘻嘻, suggests giggling.

Interestingly, the number 5, in Mandarin, is pronounced “wu” – meaning that Thai’s “55555” would, in Chinese, be pronounced “wuwuwuwuwu.” This is the sound equivalent, a Chinese-speaking redditor points out, of “boohoo” – meaning that laughter in one language is crying in another. Similarly, since the number 8 is pronounced “ba,” Chinese speakers sometimes use “88” to sign off, or say “ba ba” (“bye bye”). Along those lines, should you want to reward someone you’re chatting with not just with laughter, but with actual praise … 8888888888 in Japanese represents applause, since 八 (eight) is pronounced “hachi,” which sounds like “pachi pachi,” which is onomatopoeia for clapping.

Korean: kkkkk or kekekekeke
This comes from ㅋㅋㅋ, short for 크크크, or keu keu keu – the Korean equivalent of the English “hahaha.”

French: hahaha, héhéhé, hihihi, hohoho; also MDR
French uses onomatopoeic laughter variations much like those in English. It also, like many non-English languages, uses the universalized “LOL” to indicate laughter. But French also has a more delightful acronym: the French equivalent of LOL is MDR, which means “mort de rire,” or “dying of laughter.”

Spanish: jajaja
In Spanish, j is pronounced like the English h, so “jajaja” is the direct analog of the English “hahaha.”

Greek: xaxaxa
Same deal.

Hebrew: xà xà xà or חָה־חָה־חָה
Same.

Brazilian Portuguese: huehuehue, rsrsrsrs
Same, with the vowels varying rather than the consonants.

Danish: ha ha, hi hi, hæ hæ, ho ho, ti hi
Same deal.

Icelandic: haha, hehe, híhí
Same.

Russian: haha хаха, hihi хихи, hèhè хехе
Same.

Distraction at the office is hardly new, but as screens multiply and managers push frazzled workers to do more with less, companies say the problem is worsening and is affecting business.

While some firms make noises about workers wasting time on the Web, companies are realizing the problem is partly their own fault.

Even though digital technology has led to significant productivity increases, the modern workday seems custom-built to destroy individual focus. Open-plan offices and an emphasis on collaborative work leave workers with little insulation from colleagues’ chatter. A ceaseless tide of meetings and internal emails means that workers increasingly scramble to get their “real work” done on the margins, early in the morning or late in the evening. And the tempting lure of social-networking streams and status updates makes it easy for workers to interrupt themselves.

Not all workplace distractions harm productivity. Dr. Mark found that people tended to work faster when they anticipated interruptions, squeezing tasks into shorter intervals of time. Workers’ accuracy suffered little amid frequent interruptions, but their stress rose significantly.

Other studies have found that occasional, undemanding distractions, such as surfing the Web, can help increase creativity and reduce workplace monotony, which may help boost alertness.

Companies are experimenting with strategies to keep workers focused. Some are limiting internal emails—with one company moving to ban them entirely—while others are reducing the number of projects workers can tackle at a time.

The pair instructed workers to let the importance and complexity of their message dictate whether to use cellphones, office phones or email. Truly urgent messages and complex issues merited phone calls or in-person conversations, while email was reserved for messages that could wait.

After an internal study found that workers spent some two hours a day managing their inboxes, the company vowed to phase out internal email entirely.

Workers can still use email with outside customers, but managers have directed workers to communicate with colleagues via an internal social network, which the company began installing earlier this fall, says Robert Shaw, global program director for the “Zero Email” initiative.

Businesses have praised workers for multitasking, “but that isn’t necessarily a good thing,” says Mr. Keene. “When you are focused on just a few things, you tend to solve problems faster. You can’t disguise the problem by looking like you’re really busy.”

Knowing is great for watching Jeopardy. It’s not nearly as good for life.

So why is learning about improvement so easy and actually improving so damn hard?

Most any change that requires a lot of consistent mental effort is going to fail because you spend most of the day on autopilot. […]

Any change has to work when you’re on autopilot. The importance of self-control is one of the biggest myths about improvement.

Almost all the techniques for change that have been shown to work don’t rely on thought or willpower.

The article then highlights some quotes from various books describing successful techniques for achieving personal change.

All of these techniques remove mental effort and can easily be incorporated into the routine you already have. That’s their strength.

But they all require a little bit of planning ahead of time. With these systems most people don’t really fail — most people never really start. How do you plan?

Chip and Dan Heath distill effective behavior change down to three simple steps in their well-researched and enjoyable book Switch: How to Change Things When Change Is Hard.

1) First, direct the rational mind

Provide crystal-clear direction. What looks like resistance is often a lack of clarity.
Script the critical moves — don’t think big picture, think in terms of specific behaviors.
Point to the destination. Change is easier when you know where you’re going and why it’s worth it.

2) Second, motivate the emotional mind

Focus on emotions. Knowing something isn’t enough to cause change. Make people (or yourself) feel something.
Shrink the change. Break down the change so it’s not scary.
When leading a group, cultivate a sense of identity and instill a growth mindset. Believe change is possible.

3) “Shape the Path”

What looks like a people problem is often a situation problem.
Tweak the environment. When a situation changes, behavior changes. So change the situation.
Build habits. When a behavior is habitual it doesn’t tax our minds as much.

Pick one of the techniques. Plan for 20 minutes. You might struggle to implement it for a couple of days, but after that, it’s easier to stay the course than it is to deviate. That’s the secret.

Don’t try to reinvent yourself. You’ll fail. Fit the new into the old. Make the new easier than the old.

You change all the time. The TV shows you watch change, the products you buy change, and the projects at work change. Change is going to happen, no matter what.

The question is, will you be in control of the change or will the change control you?

Space travel is tough on the human body. But what does it do to the human mind? Gary Beven, a space psychiatrist at NASA, answers our questions about how humans adapt to space, and what we have to do to go to Mars.

[…]

With twenty missions under his belt, he has a lot of practical experience with astronauts. Since space travel and psychiatry are two areas that are mined, often incorrectly, for drama, I ask what he has found to be the biggest misconceptions of psychiatry and space travel. Generally, it is the drama that’s the problem. Even in the 1990s, when the first NASA astronauts went up on Mir, their main problems were isolation and depression, not dramatic behavioral issues or space-based illness.

Beven explained:

One misconception is a concern or theory that the spaceflight environment may be inherently harmful or hazardous, from a psychological standpoint. Sustained life in microgravity on board a space vehicle does not appear to cause psychological decrement or psychiatric symptoms unique to that environment.

Any previously reported behavioral health problems have appeared to occur because of common earthbound issues: for instance, placing crews that have potential personality conflicts in a smaller space station environment, with few recreational outlets, and then overworking them or not providing enough meaningful work to do.

Milky Way Over Quiver Tree Forest

In front of a famous background of stars and galaxies lie some of Earth’s more unusual trees. Known as quiver trees, they are actually succulent aloe plants that can grow to tree-like proportions. The quiver tree name is derived from the historical usefulness of their hollowed branches as dart holders. Occurring primarily in southern Africa, the trees pictured in the above 16-exposure composite are in Quiver Tree Forest located in southern Namibia. Some of the tallest quiver trees in the park are estimated to be about 300 years old. Behind the trees is light from the small town of Keetmanshoop, Namibia. Far in the distance, arching across the background, is the majestic central band of our Milky Way Galaxy. Even further in the distance, visible on the image left, are the Large and Small Magellanic Clouds, smaller satellite galaxies of the Milky Way that are prominent in the skies of Earth’s southern hemisphere.

Pablo Picasso once said, “We all know that Art is not truth. Art is a lie that makes us realize truth, at least the truth that is given us to understand. The artist must know the manner whereby to convince others of the truthfulness of his lies.”

If we didn’t buy into the “lie” of art, there would obviously be no galleries or exhibitions, no art history textbooks or curators; there would not have been cave paintings or Egyptian statues or Picasso himself. Yet we seem to agree as a species that it’s possible to recognize familiar things in art and that art can be pleasing.

To explain why, look no further than the brain.

The human brain is wired in such a way that we can make sense of lines, colors and patterns on a flat canvas. Artists throughout human history have figured out ways to create illusions such as depth and brightness that aren’t actually there but make works of art seem somehow more real.

And while individual tastes are varied and have cultural influences, the brain also seems to respond especially strongly to certain artistic conventions that mimic what we see in nature.

The article then discusses elements we recognize in art. Examples are included to help explain these points.

Lines

That a line drawing of a face can be recognized as a face is not specific to any culture. Infants and monkeys can do it. Stone Age peoples did line drawings; the Egyptians outlined their figures, too.

It turns out that these outlines tap into the same neural processes as the edges of objects that we observe in the real world. The individual cells in the visual system that pick out light-dark edges also happen to respond to lines, Cavanagh said. We’ll never know who was the first person to create the first “sketch,” but he or she opened the avenue to our entire visual culture.

Faces

This brings us to modern-day emoticons; everyone can agree that this :-) is a sideways happy face, even though it doesn’t look like any particular person and has only the bare minimum of facial features. Our brains have a special affinity for faces and for finding representations of them (some say they see the man in the moon, for instance). Even infants have been shown in several studies to prefer face-like patterns over patterns that don’t resemble anything.

Color vs. luminance

To trick the brain into thinking something looks three-dimensional and lifelike, artists add elements – lightness and shadows – that wouldn’t be present in real life but that tap into our hard-wired visual sensibilities.

Mona Lisa’s smile

The human visual system is organized such that the center of gaze is specialized for small, detailed things, and the peripheral vision has a lower resolution – it’s better at big, blurry things.

That’s why, as your eyes move around the Mona Lisa’s face, her expression appears to change, Livingstone says. The woman was painted such that, when you look directly at her mouth, she appears to smile less than when you stare into her eyes. When you look away from the mouth, your peripheral visual system picks up shadows from her cheeks that appear to extend the smile.

Shadows and mirrors

From a scientific standpoint, it’s possible to determine exactly how shadows are supposed to look based on the placements of light and how mirror reflections appear at given angles. But the brain doesn’t perform such calculations naturally. […]

Studies have shown that people don’t generally have a good working knowledge of how reflections should appear, or where, in relation to the original object, Cavanagh said. Paintings with people looking into mirrors or birds reflected in ponds have been fooling us for centuries.

Why we like art

There are certain aspects of art that seem universally appealing, regardless of the environment or culture in which you grew up, argues V.S. Ramachandran, a neuroscientist at the University of California, San Diego. He discusses these ideas in his recent book “The Tell-Tale Brain.”

Symmetry, for instance, is widely considered to be beautiful. There’s an evolutionary reason for that, he says: In the natural world, anything symmetrical is usually alive. Animals, for instance, have symmetrical shapes.

And then there’s what Ramachandran calls the “peak shift principle.” The basic idea is that animals attracted to a particular shape will be even more attracted to an exaggerated version of that form.

[T]he distorted faces of famous artists such as Pablo Picasso and Gustav Klimt may be hyperactivating our neurons and drawing us in, so to speak. Impressionism, with its soft brushstrokes, is another form of distortion of familiar human and natural forms.

Further research: Can we know what is art?

There’s now a whole field called neuroesthetics devoted to the neural basis of why and how people appreciate art and music, and of what beauty is.

Semir Zeki at University College London is credited with establishing this discipline and says it’s mushrooming. Many scientists who study emotion are collaborating in this area. Zeki is studying why people tend to prefer certain patterns of moving dots to others.

There have been several criticisms of neuroesthetics as a field. Philosopher Alva Noë wrote in The New York Times last year that this branch of science has not produced any interesting or surprising insights, and that perhaps it won’t because of the very nature of art itself – how can anyone ever say definitively what it is?

Zeki said many challenges against his field are based on the false assumption that he and colleagues are trying to explain works of art.

“We’re not trying to explain any work of art,” he said. “We’re trying to use works of art to understand the brain.”

How does the brain’s survival instinct prevent innovation – and what can you do about it?

  1. The brain wants pains solved first. The brain is wired to minimize loss. We want to keep what we already have. Equally, we are not interested in something new until we address our pains. The brain seeks preservation over pleasure.
  2. Expertise is the enemy of innovation. We build neural pathways to known solutions. What we know best (or in some cases have heard most recently) becomes our default answer. Unfortunately, once we find an answer to a problem, we stop looking for other possible solutions. As a result, the tried and true wins out and we get more of the same.
  3. The brain wants solutions, not problems. In the world of business, we hear the expression, “Don’t bring me problems, bring me solutions.” From a survival perspective this makes sense. When faced with the possibility of being eaten by a lion, we don’t want to study our navel. Action is critical. However, in the world of innovation, the “problem” is actually more important.
  4. The brain craves commonality. Contrary to conventional wisdom, opposites do not attract. It is safer to be in a tribe of people who think the same way. Things get done quickly. It feels effortless. But the downside is that it thwarts innovation.
  5. The brain sees what it believes. The brain uses a pattern-matching technique called “confirmation bias.” In a nutshell, it rejects anything that is inconsistent with your belief structure. […]
  6. Your brain only sees a fraction of reality. What you focus on expands, to the exclusion of everything else. The brain’s reticular activating system is designed to filter out 99.99 percent of the stimuli out there. This prevents the brain from being overwhelmed by information. Unfortunately, as a result, you miss out on opportunities because you cannot even see they are there. When you are a technology expert, the solution to every problem involves software/hardware. Opportunities are limited to your frame of reference.
  7. The brain thinks too much. The dorsolateral prefrontal cortex is the judgmental part of the brain. It is analytical and calculating. This is great for decision-making that requires logic. But it can kill innovation. When athletes choke, they are overthinking and constricting the neural pathways that allow access to their deeper capabilities.

I’ve quoted the seven challenges. Arguably, these issues apply to more than just our ability to be innovative.

The mysteries of the mind and brain are many and complex. Neuroscience, through the magic of technology like functional magnetic resonance imaging (fMRI) is just beginning to unravel some of them. Given that my livelihood revolves around creativity, I have become fascinated with neuroplasticity.

Neuroplasticity is the mind’s ability to change the brain. Yes, you read that right. Neuroplasticity radically reverses ages of scientific dogma, which held that mental experiences result only from physical goings-on in the brain, and that we can’t do much about it. But extensive studies by neuroscientists confirm that our mental machinations do actually alter the physical structure of our brain matter. So, when you change your mind, you change your brain. This is great news for most of us.

Step 1: Relabel

The first step is to Relabel a given thought or feeling or behavior as something else. For example, an unwanted thought could be relabeled “false message” or “brain glitch”. This amounts to training yourself to clearly recognize and identify what is real and what isn’t, refusing to be tricked by your own thoughts.

Step 2: Reattribute

The second step is to Reattribute, which answers the question, “Why do these thoughts keep coming back?” The answer is that the brain is misfiring, stuck in gear, creating mental noise, and sending false messages. In other words, if you understand why you’re getting those old thoughts, eventually you’ll be able to say, “Oh, that’s just a brain glitch.” That raises the natural next question: What can you do about it?

Step 3: Refocus

The third step, Refocus, is where the toughest work is, because it’s the actual changing of behavior. You have to do another behavior instead of the old one. Having recognized the problem for what it is and why it’s occurring, you now have to replace the old behavior with new things to do. This is where the change in brain chemistry occurs, because you are creating new patterns, new mindsets. By refusing to be misled by the old messages, by understanding they aren’t what they tell you they are, your mind is now the one in charge of your brain.

Step 4: Revalue

It all comes together in the fourth step, Revalue, which is the natural outcome of the first three. With a consistent way to replace the old behavior with the new, you begin to view old patterns as simple distractions. You devalue them, really, as being completely worthless. Eventually the old thoughts begin to fade in intensity, the brain works better and better, and the automatic transmission in the brain begins to work properly.

What all of this meant to me was that we can learn to improve our ability to defeat the traditional thinking traps we fall into when we try to change our view of whatever challenge we’re facing. We can override our default. We can retrain our brain by invoking the Apple tagline: Think different.

See also: Cognitive Behavioral Therapy and Acceptance and Commitment Therapy.