The Universe Conspiracy – Pronoia

And, when you want something, all the universe conspires in helping you to achieve it. ― Paulo Coelho, The Alchemist

I have been developing a unifying theory about success (I know that sounds a bold claim), partly influenced by Philip K. Dick’s book, Do Androids Dream of Electric Sheep? First published in 1968, the book served as the primary basis for my favourite film, Blade Runner. The novel is set in a post-apocalyptic near future, where Earth and its population have been greatly damaged by nuclear war during World War Terminus. Most types of animals are endangered or extinct due to extreme radiation poisoning from the war. To own an animal is a sign of status, but what is emphasised more is the empathic emotion humans experience towards an animal. But there is a problem with my theory; it is developing too easily. Someone told me that it was ‘cool’ because the Universe was conspiring in my favour. I am suffering from pronoia, apparently.

Joseph Heller’s line in Catch-22 – “Just because you’re paranoid doesn’t mean they aren’t after you” – might have to be turned on its head for Hollywood star Susan Sarandon. “Just because you’re pronoid, doesn’t mean they aren’t for you,” the actress might say. Sarandon talked about her belief in ‘pronoia’ as she revealed what a joyous experience it had been to make Cloud Atlas, the film adapted from the award-winning novel. Some might say this is nonsense because it is a Hollywood actress telling us this ‘fact’.

Pronoia is defined as the opposite state of mind to paranoia: having the sense that there is a conspiracy that exists to help the person. It is also used to describe a philosophy that the world is set up to secretly benefit people. It is almost a Zippie mantra, promoted by Sarandon. A Zippie is a person who does something for nothing: any supporter of free culture, free food, free books or free software is a Zippie – and the conspiring Universe is a central belief.

But does that make the proposition wrong? As students of logic should know, not every appeal to authority is a fallacious appeal to authority. A fallacy is committed only when the purported authority appealed to either does not in fact possess expertise on the subject at hand, or can reasonably be supposed to be less than objective.

Hence if you believed that PCs are better than Macs entirely on the say-so of either your technophobic orthodontist or the local PC dealer who has some overstock to get rid of, you would be committing a fallacy of appeal to authority — in the first case because your orthodontist, smart guy though he is, presumably hasn’t much knowledge of computers, in the second case because while the salesman might have such knowledge, there is reasonable doubt about whether he is giving you an unbiased opinion.

But if you believed that PCs are better than Macs because your computer science professor told you so, there would be no fallacy, because he presumably both has expertise on the matter and lacks any special reason to push PCs on you. That doesn’t necessarily mean he’d be correct, of course; an argument can be mistaken even if it is non-fallacious. Similarly, not every ad hominem attack – an attack against the man or woman – involves a fallacious ad hominem. Attacking the person can be entirely legitimate, and sometimes even called for in an argumentative context, when it is precisely the man or woman who is the problem.

Attacking a person involves a fallacy when what is at issue is whether some claim the person is making is true or some argument he is giving is cogent, and where the attacker either

  • essentially ignores the question of whether the claim is true or the argument cogent, and instead just attacks the person giving it, or
  • suggests, either explicitly or implicitly, that the claim can be rejected as false or the argument rejected as not cogent on the basis of some irrelevant purported fault of the person giving it.

So the question arises – does pronoia exist, ignoring who told us it might? I have been exploring the idea that if you do the right thing often enough, good things happen: the sneaking suspicion that others are conspiring to help you, and you them. Pronoia is also a prevalent theme in the 1988 novel The Alchemist, by Paulo Coelho. In it, the protagonist, a young boy, is told by an older man to pursue his dreams.

He tells the boy, “When you want something, all the universe conspires in helping you to achieve it.” The book also deals with omens, signs that the universe wants the boy to follow a specific path, which will lead to his goal of fulfilling a dream.

The writer and Electronic Frontier Foundation co-founder John Perry Barlow defined pronoia as the suspicion the Universe is a conspiracy on your behalf. The academic journal Social Problems published an article entitled “Pronoia” by Fred H. Goldner in 1982 (vol. 30, pp. 82–91). It received a good deal of publicity at the time, including references to it in Psychology Today. Wired magazine published an article in issue 2.05 (May 1994) titled “Zippie!”. The cover of the magazine featured a psychedelic image of a smiling young man with wild hair, a funny hat, and crazy eyeglasses.

The simplest definition of pronoia may be to say that it is the opposite of paranoia. A person suffering from paranoia suspects that persons or entities (e.g. governments / deities) conspire against them. A person enjoying pronoia feels that the world around them conspires to do them good.

The principal proponent of pronoia in the 21st century has been the astrologer, writer, poet, singer, and songwriter Rob Brezsny. Brezsny’s book Pronoia Is the Antidote for Paranoia: How the Whole World Is Conspiring to Shower You with Blessings, published in 2005, explores the philosophy of pronoia.


Can we reject it on the basis of the non-expert status of the writer? Well, maybe we can relax, suspend our disbelief and imagine that if we do good things, good things may happen to us in return. Maybe it does not matter in the long run. No act of kindness (no matter how small) is ever wasted.

Be Amazing Every Day.

How to chop wood then carry water.

Smile, breathe and go slowly, but don’t look back in anger, I heard you say. This combination (mash-up) of two quotations (one from Oasis and one from Thich Nhat Hanh) is having a profound influence on my life at this moment. You see, I am listening to Oasis and reading The Art of Power, and loving both. In this moment, right now, they matter deeply and profoundly to me.

I have come across many clients who are living in anger and hate (living in the past) and are only looking forward in fear, towards perceived uncertainty. I have learned a great deal over the last few years about looking around and being totally aware. Right now. You see, for me anger, hate, resentment, fear, jealousy, envy, worry, doubt, mistrust, conflict – these are all things that can feel very real at that (this) time. At the time I was experiencing them, they were the frame for my world. However, they are of the mind, and just excuses to hang on to yesterday or to live in tomorrow.

Thich Nhat Hanh has been one of my strongest influences over the last ten years. He is a Vietnamese Zen Buddhist monk and peace activist. His key teaching is that, through mindfulness, we can learn to live happily in the present moment – the only way to truly develop peace, both in one’s self and in the world. Thich Nhat Hanh has published over 100 titles on meditation, mindfulness and engaged Buddhism, as well as poems, children’s stories, and commentaries on ancient Buddhist texts. He has sold over three million books; some of the best known include Being Peace, Peace Is Every Step, The Miracle of Mindfulness, True Love and Anger.

His writings offered me some very practical methods of bringing mindfulness and loving kindness to the very centre of my being. You don’t need to be a Buddhist, or spiritual, to benefit from his teaching or from learning this technique.

If you haven’t come across him before, here is a quick biography. Thich was born in Vietnam in 1926. He became a Buddhist monk at the age of sixteen. During the Vietnam War, Thich chose to help villagers suffering from the bombings and the aftermath of war rather than sit and quietly meditate in his monastery. In the early 1960s he founded the School of Youth Social Service, rallying nearly 10,000 student volunteers to rebuild homes, organise agricultural cooperatives, and re-establish order in lives affected by the ravages of war. During travels to the United States in the 1960s, Thich spoke for peace in Vietnam. During one of his visits he spoke with Martin Luther King, Jr and convinced him to oppose the Vietnam War publicly. This helped to galvanise the peace movement that continued through the 70s until the war was finally ended. In 1967 Dr. King nominated Thich Nhat Hanh for the Nobel Peace Prize.

So Sally can wait, she knows it’s too late as we’re walking on by / Her soul slides away, “But don’t look back in anger”, I heard you say.

What I have come to accept (it has taken a lifetime) is that being present is an experience everyone should aim for, every day. It is a time when I feel completely at peace with the moment I am in, right now. It is the basis of being amazing every day. If you get into the habit of being present, then you may notice some (or all) of what I have noticed:

  • I have feelings of intense calmness (only good eustress).
  • I notice I am smiling more (as do other people).
  • I am kinder to myself (physically and emotionally).
  • I am trying to be kinder to other people (with no motive – altruism).
  • I am not rushing as much to meetings (I am not late, but not early).
  • My reflexes are faster and I join the dots quicker.
  • My mind is clearer and this clarity solves problems.
  • I am more decisive and take better decisions.
  • I know what I want, right now.
  • I know what is right for me, right now.
  • I am better at public speaking, training, coaching and performing.
  • My confidence is deeper without arrogance.
  • I am dealing with death and life as equals.
  • I know and accept that I am not perfect – but I am becoming more real.
  • I accept I have many faults and I own them (I eat too fast for one thing).
  • I feel stronger and more passionate about making a difference.
  • I am quieter and read more.

The old Zen standby, chop wood, carry water, simplifies this to a feeling of not multitasking, not running faster yet getting nowhere. I can now see how the past keeps creating my future, and when I am conscious of this, I get to make another choice. I get to forgive the past and embrace the now. When I was living in the past or future, I missed out on the freedom and peace of the now. Lately, I am becoming aware much sooner when this happens.

The simple truth is being present is when you choose to focus on a particular time frame. There are only three possible time frames: past, present and future. Once you become aware of the thoughts you are having and the content of those thoughts, you will notice which timeframe you are in at any given time. You will begin to notice how often your thoughts and feelings are focused on the past or the future. These thoughts are riddled with judgments, comparing the past or future to your present situation.

Most busy people spend very little of their time being fully present. The rest of the time, they drift in and out as their attention wanders. Your mind may even seem to be out of your own control. How can you be more present?

I start with the power of the breath. By taking many slow, rhythmic, even breaths, I concentrate on this cycle; no gaps, no holding the breath. Some people say do this through the nose, others through the mouth. I don’t mind, as long as it is slow, rhythmic and even. Breath, along with change, is the only constant. I believe being present starts with the breath.
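If it helps to have a pacer (it helped me), here is a tiny sketch in Python – the five-second in/out cadence and the cycle count are my own arbitrary choices, not a prescription:

    import time

    def breathe(cycles=5, seconds=5):
        """Pace slow, rhythmic, even breaths: no gaps, no holding."""
        for _ in range(cycles):
            print("breathe in...")
            time.sleep(seconds)   # an even, unhurried in-breath
            print("breathe out...")
            time.sleep(seconds)   # and an equal out-breath

    breathe()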

Now take a moment to consider what you are doing right now. Consider, as a correspondence to that moment of suspended breath-time, what you’re doing right at this moment.

  • Are you ‘just’ reading this post?
  • Where are your thoughts?
  • What are you thinking about?
  • What are your emotions?
  • Where are your hands?
  • What is the time?
  • Is it moving slowly or fast?
  • So you are reading – that’s it…so, just read.

Part of the answer to being present is to learn how to become an ethnographic observer. A witness, if you will. Become a witness by being aware of what you are doing – exactly what you are doing – in any given moment. Try to observe it, name it and stand away from it. When we cling to a now rather than simply bearing witness to it and letting it pass by, we become trapped in time as it passes. Then develop the routine of letting the rest go; much like bearing witness, whatever is not there in that moment, let it go.

Be there, right there, right then.

Then gently come back to the breath. When the world or your thoughts begin to intrude again, simply come back to the breath. The constancy of breath can create the constancy of presence for us, if we choose to show up. The act of being present is, in a sense, a meditation without meditating. The stillness here, though, comes from action – breathing, attending, witnessing, releasing and breathing again. This simple cycle can profoundly change the way that you experience the world.

Be Amazing Every Day

Go slowly, smile, breathe (slowly, evenly and rhythmically) and don’t look back in anger. Chop wood. Carry water.


Which is better: Aggression or Collaboration?

It must be obvious, surely? Or perhaps a trick question? Well, the answer is not as obvious as you may think. Human beings have probably killed more members of their own species in war than any other animal on this planet. It is undeniable that ours is a pretty aggressive species when compared to the rest of the animal kingdom. Aggression and war are hard-wired into the brain, but so are acceptance, empathy and collaboration. But which is better? ‘There’s only one way to find out: FIGHT!’ – a recurring feature of Harry Hill’s TV Burp.

The term aggression comes from the Latin aggressio, meaning attack. Every night on the news there are reports about murders, wars and rapes. You might want to start by not watching the news. But the news isn’t the only place where people encounter violent or aggressive behaviour. We see it at work, while commuting, on the tube and in the home. You can observe it in queues, shops, offices and in sport. It starts in the school yard and grows as we get older.

I am a huge fan of the psychologist Robert Plutchik, who identified eight primary emotions, which he coordinated in pairs of opposites: joy versus sadness; trust versus disgust; fear versus anger; and anticipation versus surprise. He created the 2D wheel and a conical 3D version in 1980 as a tool for understanding his psycho-evolutionary theory of emotion. Intensity of emotion and depth of indicator colour increase toward the centre of the wheel and decrease outward: moving outward from the centre, terror becomes fear and then apprehension; ecstasy becomes joy and then serenity. Secondary emotions are displayed as combinations of the primary ones. The crossover and closeness are revealing when we look at our emotions around aggression.
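For those who like their models explicit, here is a minimal sketch of the wheel as a simple lookup structure, written in Python purely as an illustration: the opposite pairs and the terror–fear–apprehension and ecstasy–joy–serenity scales are Plutchik’s, as described above; the code itself is just my own scaffolding.

    # Plutchik's four pairs of opposite primary emotions.
    OPPOSITES = {
        "joy": "sadness",
        "trust": "disgust",
        "fear": "anger",
        "anticipation": "surprise",
    }

    # Intensity decreases as you move outward from the centre of the wheel.
    INTENSITY = {
        "fear": ["terror", "fear", "apprehension"],  # centre -> rim
        "joy": ["ecstasy", "joy", "serenity"],
    }

    def opposite(emotion):
        """Return an emotion's opposite, whichever side of the pair it sits on."""
        reverse = {v: k for k, v in OPPOSITES.items()}
        return OPPOSITES.get(emotion) or reverse.get(emotion)

    print(opposite("fear"))     # anger
    print(opposite("sadness"))  # joy
    print(INTENSITY["fear"])    # ['terror', 'fear', 'apprehension']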

Researchers in ethology (the scientific and objective study of animal behaviour under natural conditions) believe that aggression confers some sort of biological advantage. It all comes down to economics, and the notion that aggression, much like anything else, has benefits and costs. Aggression may help an animal secure territory, including resources such as food and water. Aggression between males often occurs to secure mating opportunities, and results in selection of the healthier or more vigorous animal. Aggression may also occur for self-protection, or to protect offspring.

Konrad Lorenz stated in his 1963 classic, On Aggression, that human behaviour is shaped by four main, survival-seeking animal drives. Taken together, these drives – hunger, fear, reproduction, and aggression – achieve natural selection. Well, maybe. Humans share aspects of aggression with non-human animals, yet have specific aspects and complexity related to factors such as genetics, early development, social learning and flexibility, culture and morals. What are these benefits and costs of aggression? Aggression between groups of animals may also confer advantage; for example, hostile behaviour may force a population of animals into a new territory, where the need to adapt to a new environment may lead to an increase in genetic flexibility.

It is interesting to note that during the Cold War, politicians on both sides used their belief that war was highly likely to justify the manufacture and deployment of more and more nuclear weapons. Yet the belief in the near inevitability of war makes war more likely. In 1978 Harvard biologist E. O. Wilson published a groundbreaking book, On Human Nature. The book tries to explain how different characteristics of humans and society can be explained from the point of view of evolution. Aggression is, typically, a means of gaining control over resources, and is thus aggravated during times when high population densities generate resource shortages. According to Richard Leakey and his colleagues, aggression in humans has also increased as we have become more interested in ownership and in defending property.

With increased understanding of the relations between genes and environment, behavioural scientists have acquired a deeper understanding of the bases of aggression than was previously possible. The brain is awash in chemicals, including hormones and neurotransmitters, that accentuate or dampen its responses and influence its organisation and operations. Neurotransmitters are chemicals that relay, amplify, or modulate signals sent between neurones and other cells. There are many different hormones and neurotransmitters, of which the most important are glutamate and GABA, which respectively excite and inhibit synapses. From all of the possibilities explored and the theories made, you might think that a conclusion has been reached about exactly where aggression originates, and what might incite a particular aggressive action, but there has not been. However, the following compounds seem to be most active:

  • Adrenalin, which triggers the fight or flight response
  • Testosterone, which stimulates aggression
  • Oxytocin, which instils trust, increases loyalty, and promotes the tend-and-befriend response
  • Oestrogen, which triggers the release of oxytocin
  • Endorphins, which reinforce collaborative experiences with pleasure
  • Dopamine, which generates a reward response and fortifies addiction
  • Serotonin, which regulates moods
  • Phenylethylamine, which induces excitement and anticipation
  • Vasopressin, which encourages bonding in males in a variety of species

Out of the last few years of neurophysiological research has emerged a new hope that solutions may indeed be found to the chemical and biological sources of aggression. But there is a caveat. While war has yet to be reduced to a simple set of deterministic biochemical events taking place exclusively within the brain, research clearly demonstrates that basic neurological processes provide all of us with alternative sets of instructions that lead either toward impasse or resolution, stasis or transformation, isolation or collaboration. No one really knows the exact causes of aggression, or whether it can even be said that any one thing causes it. So, although there may not be one conclusive answer to why people are aggressive, it doesn’t mean that a combination of theories can’t be right, or that researchers won’t someday find the answer.

The Cold War, and the resonant fear of nuclear war, is now largely over, but old wars continue and new ones have been initiated in many parts of the world. You may hear the waging of war described as an inevitable consequence of human nature. This attitude is not only dangerous in encouraging the view that war is the method of choice for settling disputes, it is also very wrong. To get the right answer requires not only a profound understanding of how the brain works, but a global shift in our attitude toward conflict, an expanding set of scientifically informed techniques, and a humanistic and democratic prioritisation of ethics and values.

We don’t need a fight to know, we need to begin with a willingness to start with ourselves.

Be Amazing Every Day.

Unknown Pleasures

The title of one of my favourite (and iconic) albums is Unknown Pleasures by Joy Division. The title probably comes from Marcel Proust’s Remembrance of Things Past. I have (honestly) tried to read it, but it is a long novel in seven volumes, known both for its length and for its theme of involuntary memory, the most famous example being the episode of the madeleine. The narrator begins by noting, “For a long time, I went to bed early.” He comments on the way sleep seems to alter one’s surroundings, and the way habit makes one indifferent to them. As a neuroscience trainer, I love the idea of getting less sleep.

Listen to the silence, let it ring on. Eyes, dark grey lenses frightened of the sun. We would have a fine time living in the night, Left to blind destruction, Waiting for our sight. – Transmission (Joy Division)

Pleasure is usually described as the broad class of mental states that humans and other animals experience as positive, enjoyable, or worth seeking. It includes more specific mental states such as happiness, entertainment, enjoyment, ecstasy, and euphoria. In psychology, the pleasure principle describes pleasure as a positive feedback mechanism, motivating the organism to recreate in the future the situation which it has just found pleasurable. According to this theory, organisms are similarly motivated to avoid situations that have caused pain in the past. And then punk came along and I was inspired to know more.

Joy Division were formed in Salford, Greater Manchester in 1976, during the first wave of punk rock. Bernard Sumner and Peter Hook had separately attended the legendary Sex Pistols show at the Manchester Lesser Free Trade Hall on 4 June 1976, and both embraced that group’s simplicity, speed and aggression. In fact, according to legend, every one of the 200 people there formed a band. Ian Curtis, whom Sumner and Hook already knew, applied and, without having to audition, was taken on.

In 1979 I bought this amazing album. That year I went to see them play live at West Runton Pavilion (North Norfolk) and met Ian Curtis. I loved him, and loved what Jon Savage described their music as: a definitive Northern Gothic statement – guilt-ridden, romantic, claustrophobic. His life was brought to many people’s attention in the stunning film Control. Curtis, who suffered from epilepsy and depression, committed suicide on 18 May 1980, on the eve of Joy Division’s first North American tour, resulting in the band’s dissolution and the subsequent formation of New Order.

The cover of the Unknown Pleasures album stimulated my love of astronomy, pulsars and the Universe (I still have the T-shirt). The cover of their 1979 debut album is probably better known than the album or the band themselves. Famed cover art designer Peter Saville is credited with designing the cover, and as the story goes it shows a series of radio-frequency periods from the first pulsar discovered. I was studying brain science at the time, using complex mathematics like Fourier analysis to decode the data of action potentials in nerve transmission, and I took the image on the cover (as it is widely, and largely correctly, cited) to depict the first pulsar discovered, CP 1919. It is not, in fact, the first isolated plot of that pulsar, which was made in 1967. That honour goes to Jocelyn Bell Burnell of the Mullard Radio Astronomy Observatory in Cambridge, whom I was very lucky to meet when my father (as head of medical research) introduced us in Cambridge.

Radio pulsars are neutron stars: huge, spinning ‘nuclei’ that contain some 10^57 protons and neutrons. The large clump of nuclear matter, which has a mass about equal to that of the sun, is compressed into a sphere with a radius of the order of 10 kilometres. Consequently, the density of the star is enormous, slightly greater than the density of ordinary nuclear matter, which is itself some 10 trillion times denser than a lead brick. Currents of protons and electrons moving within the star generate a magnetic field. As the star rotates, a radio beacon, ignited by the combined effect of the magnetic field and the rotation, emanates from it and sweeps periodically through the surrounding space, rather like a lighthouse beam. Once per revolution the beacon cuts past the earth, giving rise to the beeping detected by radio telescopes.
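Those figures are easy to check on the back of an envelope; here is a quick sketch in Python, using rounded textbook constants:

    import math

    M_SUN = 1.989e30      # kg, mass of the sun
    R_STAR = 10e3         # m, a 10 km neutron-star radius
    RHO_NUCLEAR = 2.3e17  # kg/m^3, ordinary nuclear matter
    RHO_LEAD = 1.13e4     # kg/m^3, lead

    volume = (4 / 3) * math.pi * R_STAR ** 3
    rho_star = M_SUN / volume

    print(f"neutron star: {rho_star:.1e} kg/m^3")                    # ~4.7e17
    print(f"vs nuclear matter: {rho_star / RHO_NUCLEAR:.1f}x")       # slightly greater
    print(f"nuclear matter vs lead: {RHO_NUCLEAR / RHO_LEAD:.0e}x")  # ~1e13, '10 trillion'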

Peter Saville, who had previously designed posters for Manchester’s Factory club in 1978, designed the cover of the album. Saville reversed the image from black-on-white to white-on-black and printed it on textured card for the original version of the album. The image itself, according to Scientific American writer Jen Christiansen, was made by Harold D. Craft, Jr., a graduate student at Cornell University in the early 70s working with cosmic data at the massive Arecibo Radio Observatory in Puerto Rico. You can read Christiansen’s account of her investigation, and listen to her interviews with Craft, at Scientific American. He and his colleagues were experimenting with some of the first digital measurements of radio waves from pulsars (collapsed stars that flash like lighthouses), using radar equipment at the observatory. By chance, Craft ended up writing the computer program that would produce this iconic image.

Unknown Pleasures’ cover was computer generated.

Craft said he had no idea that his image was being widely used on the cover of a famous record. “I went to the record store and, son of a gun, there it was. So I bought an album, and then there was a poster that [they] had of it, so I bought one of those too, just for no particular reason, except that it’s my image, and I ought to have a copy of it.”

Unknown Pleasures was recorded at Strawberry Studios in Stockport, England between 1 and 17 April 1979, with Martin Hannett producing. Describing Hannett’s production techniques, Hook said that Hannett was only as good as the material he had to work with: “We gave him great songs, and like a top chef, he added some salt and pepper and some herbs and served up the dish. But he needed our ingredients.”

The experience of pleasure is subjective, and different individuals will experience different kinds and amounts of pleasure in the same situation. Many pleasurable experiences are associated with satisfying basic biological drives, such as eating, exercise, hygiene or sex. For real pleasure, try listening to Unknown Pleasures again, now.

Dance, dance, dance, dance, dance, to the radio.

Be Amazing Every Day.

Big Idea: Trivial Bikeshedding Management

Did you know that today is National Trivia Day* and that 50 years ago (last Wednesday, 5th February 1965) trivia was invented? Well, sort of true; a Columbia Spectator article appeared on this day and applied the term trivia to topics like:

  • Who played the Old Gypsy Woman in The Wolfman?
  • Answer: Maria Ouspenskaya (I did not know this either).

Columbia University students Ed Goodgold and Dan Carlinsky, who had proposed the new use of the term in their original article, swiftly created the earliest inter-collegiate quiz bowls, testing culturally (and emotionally) significant yet essentially unimportant facts, which they dubbed trivia contests. The expression has also come to suggest information of the kind useful almost exclusively for answering quiz questions, hence the brand name Trivial Pursuit (1982).

The word originates from the Latin neuter noun trivium (plural trivia), from tri- “triple” and via “way”, meaning a place where three ways meet. The word trivia was also used to describe a place where three roads met in Ancient Rome, often misquoted with the comedic line that two are irrelevant (trivial) as only the one leading back to Rome is important. The Romans did not, as some wag (Frank Skinner) suggested, pin pieces of rubbish information at these crossroads.

More accurately, trivia are the three lower Artes Liberales: grammar, logic and rhetoric. These were the topics of basic education, foundational to the quadrivium of higher education, and hence an important building block for all undergraduates. In management terms, I came across Parkinson’s law of triviality on my MBA course years ago. It is also known as ‘bikeshedding’ and was first described by C. Northcote Parkinson in 1957. His argument was that organisations give disproportionate weight to trivial issues.

Parkinson observed and illustrated that a committee whose job is to approve plans for a nuclear power plant spent the majority of its time with pointless discussions on relatively trivial and unimportant but easy-to-grasp issues, such as what materials to use for the staff bike-shed, while neglecting the less-trivial proposed design of the nuclear power plant itself, which is far more important but also a far more difficult and complex task to criticise constructively. As he put it:

The time spent on any item of the agenda will be in inverse proportion to the sum [of money] involved.

A reactor is used as the example because it is so vastly expensive and complicated that an average person cannot understand it, so one assumes that those who work on it understand it. On the other hand, everyone can visualise a cheap, simple bicycle shed, so planning one can result in endless discussions, because everyone involved wants to add a touch and show a personal contribution.
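As a toy rendering of that inverse proportion, here are a few lines of Python; the agenda items, costs and the constant are all invented for illustration, and the absurd output is exactly Parkinson’s satire:

    # Parkinson's law of triviality as a toy formula:
    # minutes of debate = K / sum of money involved.
    K = 25_000_000  # invented constant: pounds x minutes

    agenda = {
        "Nuclear reactor contract": 10_000_000,  # pounds
        "Staff bicycle shed": 350,
        "Committee refreshments": 21,
    }

    for item, cost in agenda.items():
        minutes = K / cost
        print(f"{item:26} £{cost:>10,} -> {minutes:>11,.1f} min of debate")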

Thus bikeshedding involves discussions about relatively unimportant issues which result in extensive debate. Know that feeling at many a management meeting?

It may be the result of individuals who wish to contribute feeling that they do not have the knowledge or expertise to contribute on more significant issues. Bikeshedding can result in discussions that, whilst on-topic, nevertheless effectively drown out other discussions on more significant issues.

My top 7 favourite pieces of trivia are currently:

  1. On Good Friday in 1930, the BBC reported, “There is no news.” Instead, they played piano music.
  2. In the 1980s, Pablo Escobar’s Medellin Cartel was spending $2,500 a month on rubber bands just to hold all their cash.
  3. M&M’s actually stands for “Mars & Murrie’s,” the last names of the candy’s founders.
  4. In 1907, an ad campaign for Kellogg’s Corn Flakes offered a free box of cereal to any woman who would wink at her grocer.
  5. The Arkansas School for the Deaf’s nickname is the Leopards.
  6. The Vatican Bank is the world’s only bank that allows ATM users to perform transactions in Latin.
  7. The unkempt Shaggy of Scooby-Doo fame has a rather proper real name of Norville Rogers.

*There is a National Trivia Day, but it is January 4th.

Be Amazing Every Day

Your Vast Prediction Machine

Think of the brain as a vast prediction machine. I drove my car to the station this morning; what colour is it? The brain’s desire to know the answer (I don’t have a car, but to help your brain, let’s call it red), and indeed what the future holds in general, is a powerful motivator in everyday life. We know that massive neuronal resources are devoted to predicting what will happen each moment.

Research by neuroscientists at Caltech is making it clearer that the brain needs to resolve some difficult and seemingly opposing issues in order to thrive.

Much is known about how people make decisions under varying levels of probability (risk). Less is known about the neural basis of decision-making when probabilities are uncertain because of missing information (ambiguity). Yet we know the brain loves certainty: the assurance that you can avoid pain and gain pleasure (or even comfort). Some people pursue this need by striving to control all aspects of their lives, while others obtain certainty by giving up control and adopting a philosophy, faith or belief system.

Your brain is doing something quite remarkable right now. There are around 40 environmental cues you can consciously pay attention to at any moment. Remember, we have at least 27 senses (see here). Subconsciously, this number is well over two million. That’s a huge amount of data that can be used for prediction. The brain likes to know what is going on by recognising patterns in the world. It likes to feel certain. We learn much more than we ever consciously understand. Most of the signals that are peripherally perceived enter the brain without our awareness and interact on unconscious levels. This is why we say that learners become their experience and remember what they experience, not just what they are told.

Jeff Hawkins, inventor of the Palm Pilot and more recently founder of a neuroscience institute, explains the brain’s predilection for prediction in his book On Intelligence:

Your brain receives patterns from the outside world, stores them as memories, and makes predictions by combining what it has seen before and what is happening now… Prediction is not just one of the things your brain does. It is the primary function of the neocortex, and the foundation of intelligence.
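As a loose analogy (and only an analogy, not a model of the neocortex), that idea of “store patterns, then predict by combining what came before with what is happening now” can be sketched in a few lines of Python as a first-order sequence predictor:

    from collections import Counter, defaultdict

    memory = defaultdict(Counter)  # pattern store: what followed what, and how often

    def observe(sequence):
        """Store each transition the 'brain' has seen."""
        for before, after in zip(sequence, sequence[1:]):
            memory[before][after] += 1

    def predict(current):
        """Combine stored memories with what is happening now."""
        seen = memory.get(current)
        return seen.most_common(1)[0][0] if seen else None

    observe("chop wood carry water chop wood carry water".split())
    print(predict("chop"))   # wood
    print(predict("carry"))  # water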

Meaning is not always available on the surface. Meaning often happens intuitively, in ways that we don’t understand. So, when we learn, we use both conscious and unconscious processes. In teaching, you may not reach a student immediately, but two years later he or she may be somewhere else, suddenly join the dots, and get it.

At the same time as this certainty, the brain requires a measure of uncertainty, which creates variety. This is to avoid the boredom reflex, and requires our brain to look for distraction. The evolution of play and the creation of novelty stem from this quest for uncertainty. The need for the unknown, for change and new stimuli, also makes us feel alive and engaged. This is in part caused by the hunger for information, just for the sake of it. Often that information doesn’t make us more effective or adaptive; it just reduces a sense of relative uncertainty.

Your brain loves the quick burst of dopamine we get when a circuit is completed. It feels good – but that doesn’t mean it’s good for us all the time. All of this explains many otherwise strange phenomena. Knowing that we automatically avoid uncertainty explains why any kind of change can be hard – it’s inherently uncertain. It explains why we prefer things we know over things that might be more fun, or better for us, but are new and therefore uncertain. It might also explain why we prefer the certainty of focusing on problems and finding answers in data from the past, rather than risking the uncertainty of new, creative solutions.

This means that we are naturally programmed to search for meaning. This principle is survival-oriented, and it is the basis of why your brain wanted to know the colour of my car (which I don’t have). The brain needs and automatically registers the familiar, while simultaneously searching for and responding to additional stimuli.

We want to know what things mean to us. The brain likes to think ahead and picture the future, mapping out how things will be, not just for each moment, but also for the longer term. The paradox of certainty and uncertainty combined with significance and meaning.

Be Amazing Every Day.

Silence Your Brain!

Peter was after a talking parrot, so he went to the local pet shop in the hope of securing such a find. He was in luck. The shop assistant assured him that the parrot would learn and repeat any word or phrase it heard. Peter was delighted. However, a week later, the parrot still hadn’t spoken a word. Peter returned to the shop to complain; however, it appeared that the assistant had been accurate in what he said, and he refused a refund. Why didn’t the parrot talk? [Answer at the end, but remember: the parrot repeats every single word it hears.]

Shut up! Like the mute button on the TV remote control, our brains filter out unwanted noise so we can focus on what we’re listening to. Most of us will be familiar with the experience of silently talking to ourselves in our head: that inner monologue, usually conducted in silence. Self-doubts, insecurities and a general soundtrack or commentary to life.

Have you ever been at the supermarket and realised that you’ve forgotten to pick up something you needed? You might say (out loud), ‘sausages!’ or whatever your temporary lapse of recall was. Or maybe you have an important meeting with your boss later in the day, and you’re simulating (silently in your head) how you think the conversation might go, possibly hearing both your own voice and your boss’s voice responding. This is the phenomenon that psychologists call inner speech, and they’ve been trying to study it pretty much since the dawn of psychology as a scientific discipline.

Our brains have a built-in filter for unwanted noise. When it comes to following our own speech, a brain study from the University of California, Berkeley, shows that instead of one homogenous mute button, we have a network of volume settings that can selectively silence and amplify the sounds we make and hear. The researchers discovered that neurones in one part of the patients’ hearing mechanism were dimmed when they talked, while neurones in other parts lit up. Their findings, published in the Journal of Neuroscience, offer clues about how we hear ourselves above the noise of our surroundings and monitor what we say. Previous studies have shown a selective auditory system in monkeys that can amplify their self-produced mating, food and danger alert calls, but until this study, it was not clear how the human auditory system is wired.

With this in mind, it might make more sense when we need to really listen to something that is important. Say you have to listen to fill a prescription, or to enter data that is potentially life-threatening if you get it wrong. When we want to listen carefully to someone, the first thing we do is stop talking. The second, more surprising thing we do is stop moving altogether. This strategy helps us hear better by preventing unwanted sounds generated by our own movements.

This interplay between movement and hearing also has a counterpart deep in the brain. Indeed, indirect evidence has long suggested that the brain’s motor cortex, which controls movement, somehow influences the auditory cortex, which gives rise to our conscious perception of sound. A new study, in Nature, combines cutting-edge methods in electrophysiology, optogenetics and behavioural analysis to reveal exactly how the motor cortex, seemingly in anticipation of movement, can tweak the volume control in the auditory cortex. The findings contribute to the basic knowledge of how communication between the brain’s motor and auditory cortexes might affect hearing during speech or musical performance.

And the parrot? The parrot was deaf. Therefore it couldn’t repeat a single word it had heard – as it had heard no words at all.

Be Amazing Every Day

Innovation is full of Paradox.

Whoever Makes the Most Mistakes Wins.

Tom Peters again, right on the money. Paradox can prove to be very revealing about human nature and leadership. Nobel Prize winner Niels Bohr was a Danish physicist who made foundational contributions to understanding atomic structure and quantum theory. He once said, ‘Now that we have met with paradox we have some hope of making progress.’

At the most basic level, a paradox is a statement that is self-contradictory, because it contains two statements that are each true but, in general, cannot both be true at the same time. What generates real innovation is actually understanding why (and how) paradox can inspire people.

The origins of innovation can be found in the evolution and development of the neocortex. These higher centres of the human brain are the source of abstract thought, and also of our very human quality of learning from failure. The ability to explore, play and create novelty in a safe environment becomes critical. The word ‘innovate’ can be traced all the way back to 1440. It comes from the Middle French word ‘innovacyon’ [which, apparently, my LinkedIn profile says I am an expert in], meaning ‘renewal’ or ‘new way of doing things’. This echoes Peter Drucker’s brilliant reflection on innovation:

Change that creates a new dimension of performance.

The act of introducing something new (innovation) begins with an internal brain process. We can look at where, by using tools like fMRI to determine which areas ‘light up’ during the process, but its origin is unclear. Somewhere there is a spark, a neurochemical reaction, and the beginning of fascination over an idea. This state of innovation – constant fascination and being intensely interested in something – is a primitive mechanism that might not even help survival (you might eat the wrong killer berry). Yet by making a safe environment where we can explore, play and create novelty, we create a spark that both motivates and innovates.

Daniel Pink beautifully describes (in his book Drive) the paradox of money as a motivator (watch the surprising results it delivers). Companies need to allow more autonomy and self direction. That’s why Google gives its workforce 20% of their time to explore projects on their own. That’s why 3M and W.L. Gore do something similar. They know that the root of innovation is fascination and failure.

Wise leaders accept their setbacks as necessary footsteps on the path towards success. In The Innovation Paradox, Richard Farson and Ralph Keyes argue that failure has its upside, success its downside. These two are not as distinct as we imagine. They co-exist, are even interdependent. Both are steps toward achievement.

It’s not success or failure, but success and failure.

Every company worth knowing has identified innovation as a core competency that needs to be developed. However, a large percentage of our time and our organisation’s energy is necessarily spent on activities that don’t require innovation. We also know that scaling up an innovation depends on the operation of relatively routine tasks and processes, many of which are in place and have already been proved effective. What’s needed in organisations that aspire to a culture of innovation is the energy to create a spark, and then to embrace success and failure as equals.

The key to the innovator’s paradox, then, is the development of this neurochemical spark within people. So what sparks people? If you analyse the most creative and innovative people in history, you will find that the spark lies deep in their brain. They are able to be curious and creative. They become fascinated, even obsessed, by ideas. While this can certainly be supported by systems, it can never be reduced to systems, because that’s where innovation starts: with the innovator and the inspired individual, compelled by their DNA to make a difference. Then all that person needs from you is time, some resources, meaningful collaboration, and periodic reality checks from someone who understands what fascination is all about.

One’s only rival is one’s own potentialities. One’s only failure is failing to live up to one’s own possibilities. In this sense, every man can be a king, and must therefore be treated like a king. – Abraham Maslow

If you study the lives of people who have had those Eureka moments, you may well note that their breakthroughs almost always came after extensive periods of intense, conscious effort. They worked, they struggled, they explored, played and created novelty. They gave up, they recommitted, and then the breakthrough came, often at unexpected moments. The conscious mind works overtime in an attempt to solve a problem or achieve a goal. Unable to come up with the breakthrough, it turns the challenge over to the subconscious mind, which then proceeds to figure it out on its own, without time pressure or forced focus.

Coming up with the right question is at least half of getting the right answer. If you want a breakthrough idea, begin by coming up with a breakthrough question. Find the one that communicates the essence of what you’re trying to create. Perhaps Einstein said it best when he declared, ‘Not everything that can be counted counts; and not everything that counts can be counted.’ He was referring, of course, to the part of the human brain that ‘knows’ intuitively; the part that is tuned in, connected, and innately creative.

If you, or the people who report to you, are not currently in a state of innate fascination, it’s time to turn things around. That is, of course, if you really want to spark some innovation. Throughout history, the best managers and leaders have always allowed this special space of paradox and innovation to exist. Since failures so often lead to successes, and vice versa, rather than try to sort these two out, wise managers focus on the innovation process and what can be learned from it.

What exists on the other side of failure is fuel for your untapped creativity.

Be Amazing Every Day.

[If you don’t get this message, call me; if you do get it, don’t call. Spread the word.]

Wrong-Brained

Sometimes I just want to give up. I really don’t know why I bother with my epic quest for truth, science and reason.

‘You are such a right-brain thinker’, she yelled.

I probably should not have said she was so wrong. Maybe I should not have added that she was being a ‘meme sustaining poptart psychologist’ and ‘both neuro-scientifically and anatomically inaccurate’. Like the time she came in when I was watching the cricket and said, “It’s over” and I replied, “No, 3 balls left”.

Despite what you may have been told, you are not left-brained or right-brained. From books to television programmes, you may have heard the phrase mentioned numerous times, or perhaps you’ve even taken an online test to determine which type best describes you. From self-help and business success books to job applications and smartphone apps, the theory that the different halves of the human brain govern different skills and personality traits is a popular one.

According to this (wrong) theory of left-brain or right-brain dominance, each side of the brain controls different types of thinking, and people are said to prefer one type of thinking over the other. For example, a person who is labelled left-brained is often said to be more logical, analytical, and objective, while a person who is labelled right-brained is said to be more intuitive, thoughtful, and subjective. So what exactly did this theory suggest?

The Right Brain Nonsense: According to the failed theory of left-brain/right-brain dominance, the right side of the brain is best at expressive and creative tasks. Some of the abilities popularly associated with the right side of the brain include:

  • Recognising faces
  • Expressing emotions
  • Music
  • Reading emotions
  • Colour
  • Images
  • Intuition
  • Creativity

The Left Brain Nonsense: The left side of the brain is (not) considered to be adept at tasks that involve logic, language and analytical thinking. The left brain is often described as being better at:

  • Language
  • Logic
  • Critical thinking
  • Numbers
  • Reasoning

Too bad it’s not true. Short of having undergone a hemispherectomy (removal of a cerebral hemisphere), no one is a left-brain-only or right-brain-only person. In pop psychology, the theory is based on what is known as the lateralisation of brain function. So does one side of the brain really control specific functions? Are people either left-brained or right-brained? Like many popular psychology myths, this one grew out of observations about the human brain that were then dramatically distorted and exaggerated.

To try to put this to bed forever, in a two-year study published in the journal PLOS ONE, University of Utah neuroscientists scanned the brains of more than 1,000 people, aged 7 to 29, while they were lying quietly or reading, measuring their functional lateralisation – the specific mental processes taking place on each side of the brain. They broke the brain into 7,000 regions, and while they did uncover patterns for why a brain connection might be strongly left- or right-lateralised, they found no evidence that the study participants had a stronger left- or right-sided brain network. Jeff Anderson, the study’s lead author and a professor of neuroradiology at the University of Utah, says:

It’s absolutely true that some brain functions occur in one or the other side of the brain; language tends to be on the left, attention more on the right.

But the brain isn’t as clear-cut as the myth makes it out to be. For example, the right hemisphere is involved in processing some aspects of language, such as intonation and emphasis. So where has the myth come from? Because I am pretty sure you will have heard it. Experts suggest it dates back to the 1800s, when scientists discovered that an injury to one side of the brain caused a loss of specific abilities. The concept gained ground in the 1960s based on the Nobel-prize-winning split-brain work of neuropsychologists Roger Sperry and Michael Gazzaniga. The researchers conducted studies with patients who had undergone surgery to cut the corpus callosum – the band of neural fibres that connects the hemispheres – as a last-resort treatment for epilepsy.

They discovered that when the two sides of the brain weren’t able to communicate with each other, they responded differently to stimuli, indicating that the hemispheres have different functions. Both of these bodies of research tout findings related to function; it was popular psychology enthusiasts who undoubtedly took this work a step further and pegged personality types to brain hemispheres.

Brain function lateralisation is evident in the phenomena of right- or left-handedness and of right or left ear preference, but a person’s preferred hand is not a clear indication of the location of brain function. Although 95% of right-handed people have left-hemisphere dominance for language, 18.8% of left-handed people have right-hemisphere dominance for language function. Additionally, 19.8% of the left-handed have bilateral language functions. Even within various language functions (e.g., semantics, syntax, prosody), degree (and even hemisphere) of dominance may differ.

Additionally, although some functions are lateralised, this is only a tendency. How any specific function is implemented may also vary significantly from individual to individual. The areas of exploration of this causal or effectual difference of a particular brain function include its gross anatomy, dendritic structure, and neurotransmitter distribution. The structural and chemical variance of a particular brain function, between the two hemispheres of one brain or between the same hemisphere of two different brains, is still being studied.

Researchers have demonstrated that right-brain/left-brain theory is a myth, yet its popularity persists. Unfortunately, many people are likely unaware that the theory is outdated. Today, students might continue to learn about the theory as a point of historical interest – to understand how our ideas about how the brain works have evolved and changed over time as researchers have learned more about how the brain operates. The important thing to remember, if you take one of the many left-brain/right-brain quizzes that you will likely encounter online, is that they are entirely for fun and you shouldn’t place much stock in your results. According to Anderson:

The neuroscience community has never accepted the idea of ‘left-dominant’ or ‘right-dominant’ personality types. Lesion studies don’t support it, and the truth is that it would be highly inefficient for one half of the brain to consistently be more active than the other.

We love simple solutions (see also ‘21 days to break a habit’). Human society is built around categories, classifications and generalisations, and there’s something seductively simple about labelling yourself and others as either a logical left-brainer or a free-spirited right-brainer. The problems start, however, when the left-brained/right-brained myth becomes a self-fulfilling prophecy. What research has yet to refute is the fact that the brain is remarkably malleable, even into late adulthood.

It has an amazing ability to reorganise itself by forming new connections between brain cells, allowing us to continually learn new things and modify our behaviour. Let’s not underestimate our potential by allowing a simplistic myth to obscure the complexity of how our brains really work.

Be Amazing Every Day.

Nielsen, J. A., Zielinski, B. A., Ferguson, M. A., Lainhart, J. E., & Anderson, J. S. (2013). An evaluation of the left-brain vs. right-brain hypothesis with resting state functional connectivity magnetic resonance imaging. PLOS ONE, 8(8), e71275.

Rogers, M. (2013). Researchers debunk myth of “right-brain” and “left-brain” personality traits. University of Utah, Office of Public Affairs. Retrieved from http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0071275

Less is More

The common phrase ‘familiarity breeds contempt’ is a familiar dictum to many managers and leaders, who have had the concept drilled into them since their earliest days of MBA school and management training. I’ve heard it from family members, teachers and employers. Indeed, I am sure you can recall horror stories about bad managers who lost control of their authority by becoming too familiar with their juniors and the people they were meant to be leading.

So a crucial leadership question is: does familiarity really breed contempt? If it does, how does the leader maintain the perfect balance (the so-called Goldilocks Syndrome), where he or she gets the level of familiarity just right? Is less more – offering camaraderie while avoiding potential contempt from the subordinate? In the first decade of the 20th century, an obscure British journalist came up with a newer version of the phrase. His name was Holbrook Jackson, and he was pretty well known among the journalistic intelligentsia of the time. He said,

Familiarity breeds not contempt, but indifference

The far better known (both at the time and subsequently) Gilbert Keith (G.K.) Chesterton had a particular preference for paradox, and was never hospitable to platitudes. Chesterton adds to the Jackson quotation, with an acidic,

But it can breed surprise. Try saying ‘Boots’ ninety times.

Excellent, and worth a try! Benjamin Franklin went further (not in a management sense) and proposed that fish and visitors have something in common: both begin to stink after three days. Recent research offers empirical support for Franklin’s quip. The more people learn about others (and anyone who has had houseguests knows all too well how much one can come to know in a short time), the less they like them, on average.

This research shows that although people believe that learning more about others leads to greater liking, more information about others leads, on average, to less liking. It seems ambiguity (i.e. lacking information about another) leads to liking, whereas familiarity – acquiring more information – can breed contempt. This less-is-more effect is due to the cascading nature of dissimilarity: once evidence of dissimilarity is encountered, subsequent information is more likely to be interpreted as further evidence of dissimilarity, leading to decreased liking.
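The cascade is easy to see in a toy simulation (a sketch in Python; all the probabilities are invented for illustration): once one trait reads as dissimilar, later traits are more likely to read the same way, so average liking falls as more traits are learned.

    import random

    def liking(n_traits, p_dissimilar=0.3, p_cascade=0.7):
        """Fraction of traits judged similar after learning n_traits."""
        cascaded, similar = False, 0
        for _ in range(n_traits):
            p = p_cascade if cascaded else p_dissimilar
            if random.random() < p:
                cascaded = True  # dissimilarity found; it colours later traits
            else:
                similar += 1
        return similar / n_traits

    random.seed(1)
    for n in (2, 4, 8, 16):
        avg = sum(liking(n) for _ in range(10_000)) / 10_000
        print(f"{n:2} traits learned -> average liking {avg:.2f}")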

The evidence on whether familiarity always breeds contempt is that it depends on a variety of factors. However, it seems that familiarity breeds contempt more often than not. To give some hard evidence and big data for this theory: Sirota Consulting surveyed the job satisfaction of 1.2 million employees at 52 companies over 30 years. According to Sirota’s research, there is a significant decline in overall job satisfaction after an employee has been working with an employer for an average of six months or more.

The leaders who maintained a safe distance from subordinates at all times, so that those employees did not cross the line of respect, earned more respect. It certainly involved non-transparency from the leader in various matters, but the show still went on successfully. There were absolutely no complaints, even though the annual raises were poor and performance ratings were below average. The subordinates often praised the leader, and even justified the low raises as not being the leader’s fault (Stockholm syndrome).

Meanwhile, the leaders who offered total transparency and camaraderie to their subordinates often found some of those subordinates being disrespectful and deceitful towards them, despite their good deeds and commitment to employee development and promotion. Interestingly, there were complaints from employees who had always received good raises, but received a single low raise and below-average rating (which was fair and just, because of the subordinate’s poor performance).

The four things that seem to matter most (and are the marks of a savvy leader) are:

  • Equity – to be treated fairly
  • Achievement – to be proud of the job and company
  • Confidentiality – knowing when not to share
  • Camaraderie – to have good, productive relationships with fellow employees

It always makes sense to keep the correct and careful balance in the professional relationship between leader and subordinate at all times. Socialising with those we lead should be approached with caution, for it can easily lead to contempt and loss of respect. Nonetheless, the leader can still opt to socialise with subordinates at company functions or special occasions, while always keeping the socialising at arm’s length.

In summary, there is no doubt that familiarity can breed contempt, but the savvy manager must understand how to develop a working camaraderie without crossing the line into revealing personal details.
The last word and best insight on this familiarity-breeds-contempt story comes from Mark Twain, who said it most appropriately:

Familiarity breeds contempt. How accurate that is. The reason we hold truth in such respect is because we have so little opportunity to get familiar with it.

Be Amazing Every Day.