We’re ambivalent about work because in our capitalist system it means work-for-pay (wage-labor), not for its own sake. It is what philosophers call an instrumental good, something valuable not in itself but for what we can use it to achieve. For most of us, a paying job is still utterly essential - as masses of unemployed people know all too well. But in our economic system, most of us inevitably see our work as a means to something else: it makes a living, but it doesn’t make a life.

What, then, is work for? Aristotle has a striking answer: “we work to have leisure, on which happiness depends.” This may at first seem absurd. How can we be happy just doing nothing, however sweetly (dolce far niente)? Doesn’t idleness lead to boredom, the life-destroying ennui portrayed in so many novels, at least since “Madame Bovary”?

Everything depends on how we understand leisure. Is it mere idleness, simply doing nothing? Then a life of leisure is at best boring (a lesson of Voltaire’s “Candide”), and at worst terrifying (leaving us, as Pascal says, with nothing to distract from the thought of death). No, the leisure Aristotle has in mind is productive activity enjoyed for its own sake, while work is done for something else.

Given these assumptions, can capitalism help us achieve Aristotelian leisure?

But capitalism as such is not interested in quality of life. It is essentially a system for producing things to sell at a profit, the greater the better. If products sell because they improve the quality of our life, well and good, but it doesn’t in the end matter why they sell. The system works at least as well if a product sells not because it is a genuine contribution to human well-being but because people are falsely persuaded that they should have it. Often, in fact, it’s easier to persuade people to buy something that’s inferior than it is to make something that’s superior. This is why stores are filled with products that cater to fads and insecurities but meet no real human need.

It would seem, then, that we should increase leisure - and make life more worthwhile - by producing only what makes for better lives. In turn, workers would have the satisfaction of producing things of real value. (For a recent informed and vigorous defense of this view, see Robert and Edward Skidelsky, “How Much Is Enough?”)

But this raises the essential question: who decides what is of real value? The capitalist system’s own answer is consumers, free to buy whatever they want in an open market. I call this capitalism’s own answer because it is the one that keeps the system operating autonomously, a law unto itself. It especially appeals to owners, managers and others with a vested interest in the system.

Unfortunately, markets are almost never “perfect”: different participants in the market have different levels of knowledge, and the power of individual participants varies. In other words, markets are prone to failure due to information asymmetry.

This is why, especially in our capitalist society, education must not be primarily for training workers or consumers (both tools of capitalism, as Marxists might say). Rather, schools should aim to produce self-determining agents who can see through the blandishments of the market and insist that the market provide what they themselves have decided they need to lead fulfilling lives. Capitalism, with its devotion to profit, is not in itself evil. But it becomes evil when it controls our choices for the sake of profit.

Capitalism works for the good only when our independent choices determine what the market must produce to make a profit. These choices - of liberally educated free agents - will set the standards of capitalist production and lead to a world in which, as Aristotle said, work is for the sake of leisure. We are, unfortunately, far from this ideal, but it is one worth working toward.

During a series of ongoing experiments, functional magnetic resonance images track blood flow in the brains of subjects as they read excerpts of a Jane Austen novel. Experiment participants are first asked to leisurely skim a passage as they might do in a bookstore, and then to read more closely, as they would while studying for an exam.

Phillips said the global increase in blood flow during close reading suggests that “paying attention to literary texts requires the coordination of multiple complex cognitive functions.” Blood flow also increased during pleasure reading, but in different areas of the brain. Phillips suggested that each style of reading may create distinct patterns in the brain that are “far more complex than just work and play.”

The experiment focuses on literary attention, or more specifically, the cognitive dynamics of the different kinds of focus we bring to reading. This experiment grew out of Phillips’ ongoing research about Enlightenment writers who were concerned about issues of attention span, or what they called “wandering attention.”

Critical reading of humanities-oriented texts is recognized for fostering analytical thought, but if such results hold across subjects, Phillips said it would suggest “it’s not only what we read – but thinking rigorously about it that’s of value, and that literary study provides a truly valuable exercise of people’s brains.”

With the field of literary neuroscience in its infancy, Phillips said this project is helping to demonstrate the potential that neuroscientific tools have to “give us a bigger, richer picture of how our minds engage with art – or, in our case, of the complex experience we know as literary reading.”

There is often a big divide between what happens in the laboratory and the way laboratory findings are practically applied. The relationship between neuroscience research and education is no exception. While there are numerous educational products that claim to be based on neuroscience research (often quite dubiously so), the real impact of brain-based research on education has been much more subtle.

While neuroscience hasn’t yet radically changed the way we think about teaching and learning, it is helping to shape educational policies and influencing new ways of implementing technology, improving special education, and streamlining day-to-day interactions between teachers and students. There is still a long way to go before we truly understand the science of learning and how to use those findings in the real-world classroom, but it’s worth highlighting some of the key ways that neuroscience is changing today’s classroom for the better.

The article goes on to describe nine ways neuroscience research is influencing education:

  1. Cognitive tutoring: […] The cognitive tutoring programs allow students to learn by doing and are based on cognitive psychology theory, employing an AI system to adjust to student needs as well as to track student progress and thought processes so teachers can better help them learn.
  2. High schools starting later: Neuroscience research has demonstrated that sleep patterns change, often significantly, as individuals age. Multiple studies have found that adolescents need more sleep than other age groups and are unlikely to function at peak cognitive capacity early in the morning. In addition to needing more sleep, teens also simply have different circadian rhythms, which often makes them drowsy and moody in the morning. Many schools are starting to use this data to make changes, pushing back start times to allow students to sleep in a little later.
  3. Offering more variety: Repetition can be a valuable learning tool, no matter what you’re trying to learn, but neuroscience research has pinpointed a “spacing effect,” demonstrating that students learn more when episodes of learning are spaced out over time rather than crammed into a single session. One of the ways this manifests itself is by bringing greater variety into the classroom, with lessons extending over the course of a semester rather than being fit into a few days or weeks. Researchers have also found that variety is key in learning because, simply put, the brain craves it; variety boosts both attention and retention in students.
  4. Individualized education: While our general brain anatomy is similar, neuroscience is showing that no two brains work exactly alike. […] learning tools that are adaptable to individual needs are especially valuable in the classroom. New, highly plastic digital tools are filling part of that role, but neuroscience and education are taking this information in another direction as well. Teachers are being encouraged to expose students to novel experiences when presenting information […].
  5. Understanding that you use it or lose it: […] Anyone who has ever tried to remember lessons from grade school decades later can attest to this, but neuroscience backs it up, demonstrating that people who read more challenging books often have a greater variety and number of neural connections. […] Research has shown that the more time students spend away from school, the more they’ll forget, leading to more work to regain lost information. As a result, many schools are shortening summer breaks or going to a year-round schedule in order to reduce the amount of time students are away from their studies.
  6. Better identification and intervention for learning disorders: Neuroscience research is making it easier to identify which students have learning disabilities and to get those students interventions that can significantly help their academic performance. Through neuroscience research, new biomarkers and diagnostic strategies for disabilities like ADHD and dyslexia have been identified, in turn leading to more successful early interventions for students and some potentially amazing tools to help students learn. […]
  7. Making learning fun: Increasingly, neuroscience is demonstrating the importance of making learning a fun and positive experience. Pleasurable experiences cause the body to release dopamine, which in turn helps the brain remember facts. One great example of how this is making it into the classroom is Khan Academy, an online learning portal that challenges students to complete games and problem sets in order to win badges. […] Recent research has also shown just how much of an emotional experience learning can be, with negative emotional states like fear, anxiety, shame, or worry making it difficult or impossible for students to reason, learn, or store new memories. This data further stresses the need for developing learning environments that are not just fun but are also positive, safe places for students.
  8. Making learning social: Human beings are highly social creatures, so it should come as no surprise that neuroscience would point to a positive effect from social learning experiences. A study by teacher and neurologist Judy Willis in 2011 found that students who worked on writing in positive, supportive groups experienced a surge in dopamine (which we’ve already discussed the positive effects of), as well as a redirection and facilitation of information through the amygdala into the higher cognitive brain, allowing students to better remember information over the long term. She also found that learning in groups tended to reduce anxiety, which can frequently be a major roadblock to effective learning. […]
  9. Focus on neuroeducation: […] Through practice, it’s actually possible to change the way our brains are structured, adding more brain connections and changing neural pathways through the neuroplasticity afforded by our brain cells. […] Educators are increasingly encouraging administrators to move away from memorization-based learning to programs that ask students to solve problems, think critically, and explore creativity, as these methods not only build knowledge but also enhance and build brain pathways themselves, prepping the brain for future educational experiences.
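The “spacing effect” mentioned above can be sketched with a toy forgetting-curve model. This is my own illustration, not from the article: the function and its parameters are hypothetical, and the model simply assumes exponential decay of memory whose stability grows with the gap between reviews.

```python
import math

def retention_at_test(review_days, test_day, base_stability=1.0):
    """Toy model (illustrative only): memory strength decays as
    exp(-elapsed / stability), and each gap between reviews adds to
    stability, so spaced reviews build more durable memories."""
    stability = base_stability
    last_review = review_days[0]
    for day in review_days[1:]:
        stability += day - last_review  # longer gaps -> bigger stability gain
        last_review = day
    return math.exp(-(test_day - last_review) / stability)

# Three reviews crammed into one day vs. spread over eight days,
# both tested on day 10:
massed = retention_at_test([0, 0, 0], test_day=10)
spaced = retention_at_test([0, 4, 8], test_day=10)
print(f"massed: {massed:.3f}, spaced: {spaced:.3f}")
```

Under these assumptions the spaced schedule retains far more at test time, which is the qualitative pattern the research describes; real spaced-repetition systems use more elaborate stability models, but the shape of the argument is the same.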

The idea that the world could have been different is not only a matter for science fiction, but is also a matter of considerable interest in philosophy and science. Philosophers have long written about possible worlds and scientists got into the game fairly recently. From a philosophical standpoint, writers who create alternative histories are making use of counterfactuals. That is, they are describing a world that is counter to fact.

Interestingly enough, recent American politics seems to involve some interesting exercises in alternative reality fiction and counterfactual history.

While political narratives typically distort reality by including straw men, lies and partial truths, some narratives actually present entire counterfactual worlds. In some cases the extent to which the reality of the speech differs from the actual world would seem to qualify the speech as science fiction. After all, it is describing a world somewhat like our own that does not exist, except in the imagination of the creator and those who share the creator’s vision.

One interesting addition to politicians presenting limited counterfactuals is the creation of entire counterfactual narratives, some of which can be regarded as complete alternative histories and descriptions of alternative realities. […] Political people also spin positive narratives, typically creating fictional pasts of an ideal world that never was and also of a wonderful world that never shall be.

In the case of science fiction, the authors are aware they are creating fiction and, in general, the audience gets that the works are fictional. Of course, there can be some notable exceptions when fans lose the ability to properly distinguish counterfactuals and alternative histories from truth and history. […] This creates the fascinating idea of people living in fictional political worlds that are populated by fictional political characters. Naturally, it might be wondered how this would work.

One obvious explanation is that people who do not know better and who are not inclined to engage in even a modest amount of critical thinking (checking the facts, for example) can easily be deceived by such fiction and accept it as reality. These people will, in turn, attempt to convince others of the reality of these fictions, and they will also make decisions, such as whom to vote for, on the basis of these fictions. As might be imagined, such fiction-based decision making is unlikely to result in wise choices.

As I have argued in a previous essay, people tend not to be very rational when it comes to political matters. Even when a factual error is clearly shown to be an error, people who accepted the claim because it matches their ideology will tend to be more inclined to believe the claim because of (and not in spite of) the correction. This has the effect of making true believers almost immune to corrections in the case of factual errors.

While this is clearly a problem for those who are concerned about facts and truth, it supplies those who spin the counterfactual narratives with the perfect audiences: believers who will reject challenges to the narrative in which they dwell and thus are willful participants in their own political continuum, be that the Republican Continuum, the Democrat Continuum or another one. For these people, art does not imitate life nor does life imitate art. Life, at least the political life, is art—albeit science fiction.

Related: The Philosophical Roots of Science Fiction.

Determining a conductor’s influence is tricky. Does a “good” conductor wangle bravura performances from his players, or simply preside over a self-organising virtuoso ensemble? To find out, Dr D’Ausilio watched two (anonymous) conductors leading five excerpts from Mozart’s Symphony No. 40 played by eight violinists from the Città di Ferrara orchestra.

Crucially, the judges rated the dictatorial performance the more highly of the two. In the other excerpt, the despotic conductor was just as assertive, but the violinists seemed to pay as much attention to themselves as they did to him. This led to a performance that the panel liked less than the one under the meeker conductor, who exercised little influence over his players.

The findings are in harmony with what conductors knew all along: that baton-toting despots, like the late Herbert von Karajan, do add value—but only if they rein in the uppity musicians in front of them.

As pointed out by one of the commenters on the original post, the sample size is too small. But interesting nonetheless.

A growing body of evidence is making clear the links between what we taste and how we feel: Repulsion is repulsion, whether caused by a shameful act or a rotten egg. “Your brain can’t tell the difference between something that tastes bad and something that makes you feel morally violated,” says Kendall Eskine, a cognitive psychologist at Loyola University in New Orleans.

The findings suggest a link between having a sweet tooth and a sweet disposition — a link that the study documented in other ways too. People rated themselves as more agreeable and they were more generous with their time, for example, after eating a small piece of sweet chocolate than after eating a sour candy or a bland cracker. They also rated pictures of random faces more highly if captions explained that those people liked sweet foods.

People who were told in a study to drink a bitter-tasting herbal supplement offered particularly harsh judgments of morally questionable scenarios, such as a student stealing library books or a man eating his own already-dead dog. Participants who drank a sweet berry punch or water, Loyola’s Eskine and colleagues reported last year in the journal Psychological Science, weren’t so condemning. Disgust proved especially strong for people who described themselves as politically conservative.

On the flip side, Eskine’s group found more recently that thinking about morally loaded acts can also change the way food tastes. Given a neutral-tasting shot of diluted blue Gatorade, participants in a study in press at the journal PLoS One thought the beverage tasted more delicious after reading about someone being morally virtuous and more disgusting after reading about a moral transgression.

“All of these little incidental bodily experiences can change or shape our judgments. There’s more and more evidence of that popping up all the time,” he says. “I don’t think we are victims to our bodies, but awareness can help us from making really harsh judgments just because we are drinking something gross.”

Ignorance is degrading only when it is found in company with riches. Want and penury restrain the poor man; his employment takes the place of knowledge and occupies his thoughts: while rich men who are ignorant live for their pleasure only, and resemble a beast; as may be seen daily. They are to be reproached also for not having used wealth and leisure for that which lends them their greatest value.

When we read, another person thinks for us: we merely repeat his mental process. It is the same as the pupil, in learning to write, following with his pen the lines that have been pencilled by the teacher. Accordingly, in reading, the work of thinking is, for the greater part, done for us. This is why we are consciously relieved when we turn to reading after being occupied with our own thoughts. But, in reading, our head is, however, really only the arena of some one else’s thoughts. And so it happens that the person who reads a great deal — that is to say, almost the whole day, and recreates himself by spending the intervals in thoughtless diversion, gradually loses the ability to think for himself; just as a man who is always riding at last forgets how to walk. Such, however, is the case with many men of learning: they have read themselves stupid. … And just as one spoils the stomach by overfeeding and thereby impairs the whole body, so can one overload and choke the mind by giving it too much nourishment. For the more one reads the fewer are the traces left of what one has read; the mind is like a tablet that has been written over and over. Hence it is impossible to reflect; and it is only by reflection that one can assimilate what one has read. If one reads straight ahead without pondering over it later, what has been read does not take root, but is for the most part lost.

His argument is more nuanced than it might appear. For example, he offers:

One can never read too little of bad, or too much of good books: bad books are intellectual poison; they destroy the mind.

In order to read what is good one must make it a condition never to read what is bad; for life is short, and both time and strength limited.

And he indirectly argues for the ‘Great books’ …

It is because people will only read what is the newest instead of what is the best of all ages, that writers remain in the narrow circle of prevailing ideas, and that the age sinks deeper and deeper in its own mire.

And my favorite part of the essay:

It would be a good thing to buy books if one could also buy the time to read them; but one usually confuses the purchase of books with the acquisition of their contents. To desire that a man should retain everything he has ever read, is the same as wishing him to retain in his stomach all that he has ever eaten. He has been bodily nourished on what he has eaten, and mentally on what he has read, and through them become what he is. As the body assimilates what is homogeneous to it, so will a man retain what interests him; in other words, what coincides with his system of thought or suits his ends. Every one has aims, but very few have anything approaching a system of thought. This is why such people do not take an objective interest in anything, and why they learn nothing from what they read: they remember nothing about it.

And the best books should be read twice:

Any kind of important book should immediately be read twice, partly because one grasps the matter in its entirety the second time, and only really understands the beginning when the end is known; and partly because in reading it the second time one’s temper and mood are different, so that one gets another impression; it may be that one sees the matter in another light.

Still curious? Pick up a copy of Schopenhauer’s Essays and Aphorisms.

For many it is the first highlight of the day, just when you need it most: the scent of freshly brewed coffee wafting through the house first thing in the morning.

But scientists claim to have solved the mystery of why coffee never tastes as good as it smells.

The act of swallowing the drink sends a burst of aroma up the back of the nose from inside the mouth, activating a “second sense of smell” in the brain that is less receptive to the flavour, causing a completely different and less satisfying sensation.

In contrast, some cheeses smell revolting but taste delicious because their whiff seems more pleasant to us when passing out of the nose than in, experts explained.

The phenomenon is down to the fact that, although we have sensors on our tongue, eighty per cent of what we think of as taste actually reaches us through smell receptors in our nose.

The receptors, which relay messages to our brain, react to odours differently depending on the direction in which the odours are moving.

Only two known aromas - chocolate and lavender - are interpreted in exactly the same way whether they enter the nose from the inside or the outside.

In the case of coffee, the taste is also hampered by the fact that 300 of the 631 chemicals that combine to form its complex aroma are wiped out by saliva, causing the flavour to change before we swallow it, Prof Smith added.

With more than forty percent of American workers reporting chronic workplace stress, the long-term impact of that stress on creativity and on business can be detrimental, says Rick Hanson, PhD, a California-based neuropsychologist and author of Just One Thing: Developing a Buddha Brain One Simple Practice at a Time.

Chronic stress degrades a long list of capabilities with regard to creativity and innovation, notes Hanson. It’s harder to think outside of the box, nimbleness and dexterity take a hit, and the response to sudden change is more difficult to manage.

“Threat makes you productive, but not necessarily effective. It can make you productive if you don’t have to think broadly, widely or deeply,” says Rock. “A threat response, which we might think of as stress, increases motor function, while it decreases perception, cognition and creativity.”

Ultimately, on the surface, stress might seem a good kick starter for productivity. But getting the creative juices flowing has more to do with the engagement of the employee and his or her disposition, notes Rock.

“What neuroscience is telling us is that creativity and engagement are essentially about making people happier,” explains Rock, who adds, “It’s what is called a ‘toward state’ in the brain.” In that state, Rock explains, workers feel curious, open-minded, happier and interested in what they are doing.

A huge component of creating that state is to quiet the mind, and that means reducing stress. Rock discusses the neuroscience behind stress reduction here in my recent post at WorkLifeNation.com, Neuroscience Might Be New “it-strategy” to Boost Employee Creativity.

In my experience covering workplace issues for well over a decade, stress management programs in most companies, if they exist at all, are more of an ancillary stepchild in the wellness agenda. As David Ballard PhD told me, workplace flexibility, mental healthcare coverage and on-site fitness offerings certainly help to reduce stress, but it’s not enough. Perhaps a company will do more to help employees better manage stress, if the end-game is a more creative and engaged employee.

[W]hat, then, are the most important ideas ever put forward in social science? I’m not asking what are the best ideas, so the truth of them is only obliquely relevant: a very important idea may be largely false. (I think it still must contain some germ of truth, or it would have no plausibility.) Think of it this way: if you were teaching a course called “The Great Ideas of the Social Sciences,” what would you want to make sure you included?

• The state as the individual writ large (Plato)

• Man is a political/social animal (Aristotle)

• The city of God versus the city of man (Augustine)

• What is moral for the individual may not be for the ruler (Machiavelli)

• Invisible hand mechanisms (Hume, Smith, Ferguson)

• Class struggle (Marx, various liberal thinkers)

• The subconscious has a logic of its own (Freud)

• Malthusian population theory

• The labor theory of value (Ricardo, Marx)

• Marginalism (Menger, Jevons, Walras)

• Utilitarianism (Bentham, Mill, Mill)

• Contract theory of the state (Hobbes, Locke, Rousseau)

• Sapir-Whorf hypothesis

• Socialist calculation problem (Mises, Hayek)

• The theory of comparative advantage (Mill, Ricardo)

• Game theory (von Neumann, Morgenstern, Schelling)

• Languages come in families (Jones, Young, Bopp)

• Theories of aggregate demand shortfall (Malthus, Sismondi, Keynes)

• History as an independent mode of thought (Dilthey, Croce, Collingwood, Oakeshott)

• Public choice theory (Buchanan, Tullock)

• Rational choice theory (who?)

• Equilibrium theorizing (who?)