Posts Tagged ‘psychology’

The Irrational Brain suggested reading

Tuesday, February 17th, 2015 | Life

For anyone attending my talk at Atheist Society tonight, here is the suggested reading list I will later be promising to post on my blog:

  • Michael Shermer – The Believing Brain
  • Duncan J. Watts – Everything Is Obvious
  • Daniel Kahneman – Thinking, Fast and Slow
  • Noreena Hertz – Eyes Wide Open
  • Nate Silver – The Signal and The Noise

Consciousness Explained

Monday, January 19th, 2015 | Books

In Consciousness Explained, Daniel Dennett puts forward his theory of consciousness. Below, I have done my best to explain my understanding of the concept, and several other interesting ideas that he puts forward. However, that is assuming I have understood it correctly, and I would not want to bet a significant amount on that.

Consciousness

Take two dots, each a different colour, that take it in turns to flash on and off. If you watch this you will see the dots changing from one colour to the other. However, they don’t. They just take it in turns to flash on and off. It’s known as the “colour phi phenomenon”.

There is an online demonstration here, though I have to admit that I was completely unable to recreate the effect.

Let’s assume the demo does work though. What is going on here? The continuous motion the brain sees must be invented by the brain. In a traditional Cartesian theatre model, in which Descartes suggests there is a mind inside our head watching everything, we have two options.

It could be an Orwellian revision. That is to say, our brain sees the two spots separately but then goes back and tampers with the memory to add the motion, just as in 1984 they went back and rewrote history. It could also be a Stalinesque revision. Much like Stalin’s show trials, our mind never sees the truth, but merely a fakery concocted by the brain for the purposes of the mind.

Dennett puts forward the Multiple Drafts model. This replaces the Cartesian theatre altogether and suggests that nobody is actually looking. We record it, but are not conscious of it until we actually look, at which point our brain has drawn a conclusion without actually filling in the rest. There is no tampering; our brain simply takes in the information of the two dots and assumes it must be motion because there is no evidence to contradict this.

Taste

We taste largely with our nose, as the tongue can only detect the five basic tastes (four, according to Dennett). The rest comes from the nose.

Hallucinations

Strong hallucinations are impossible. You cannot touch a ghost, for example. This is important because it is good evidence that the mind makes them up. Simply seeing a ghost is easy for the mind to fabricate. However, to actually touch it and get feedback would be far more difficult for the mind to do.

Beer

Beer is not an acquired taste. If the taste remained as bad as the first time you tried it, you would never drink it. What happens is that your perception of the taste changes. A subtle but important difference.

Pain

Pain is evolutionarily useful, but not all pain. What is the point of the pain from gallstones, for example? However, in general, pain is a result of evolution because it serves a useful purpose: it tells us to avoid harmful activities.

For this reason, it may be sensible to assume that trees do not feel pain. As they cannot run away, there seems to be no evolutionary purpose in developing the ability to feel it.

It is also worth noting that ideas cannot cause physical pain. Imagine yourself being kicked in the shins. It feels uncomfortable, but not physically painful. This is interesting because people often call anxiety “uncomfortable”. Whereas, as any anxiety sufferer knows, it causes physical pain. And there is a distinct difference, as this mental exercise shows.


Eyes Wide Open

Tuesday, December 30th, 2014 | Books

I recently read Noreena Hertz’s book “Eyes Wide Open: How to Make Smart Decisions in a Confusing World”. It’s quite a good read. In the book she puts forward some of the problems with decision making in the modern world and how we can improve our own thinking.

I have picked out some of my favourite quotes and ideas.

“We need to be better decision makers, have decision making classes in schools”

This I would totally agree with. If people had a better understanding of decision making, scientific analysis and statistics, you would hope that we would, at least some of the time, make better decisions, even if that doesn’t fix political bias.

However, I think some of the picture of the “modern world” being such a problem is unjustified.

“The average copy of the New York Times contains more information than you would have encountered 300 years ago.”

That, I would suggest, is nonsense. How do you measure information? I am sure the New York Times contains a lot of facts and figures, but if you think about the amount of information you pick up just by living your life, it’s a lot.

Take cooking, for example. There is so much knowledge in preparing ingredients, putting them all together, cooking, serving and tasting – tasting food alone must involve a huge amount of information. The human brain can store loads of information.

She probably means specific information in a specific context. However, it struck me as an odd thing to say. She then goes on to say that this is a lot given that we can only hold seven things in our memory. Though the latest research indicates it is only two or three things anyway.

“Our world is increasingly unstable and we cannot rely upon it anymore.”

Again, this seems like nonsense to me. Our world is the most stable it has ever been. On a global level, fewer people are being killed by war than ever before. It is on a personal level, however, where we have really seen the change.

Hundreds of years ago, if the crops failed, you were fucked. Totally fucked. There was a good chance you would die. Just ask the Irish. Today I can walk into Tesco and buy food 24 hours a day, 7 days a week. It will always be there. There is no time when Tesco do not have food.

So who cares if Lehman Brothers might collapse overnight? I will still be able to find food, clean water, shelter and medical care tomorrow, and the day after, and the day after that.

In short, I think she paints a much bleaker picture than the world we actually find ourselves in. However, there is plenty of room for improvement. Luckily there were some buzzwords to the rescue – let’s go about making some empowered decision making.

Intuitive thinking

One of the first things I liked was that she says intuitive thinking is often wrong, in contradiction to Gladwell. She even says something like you can’t just blink and make a good decision. Perhaps an intentional reference to the nonsense Gladwell wrote in his book entitled “Blink”, or perhaps not.

Social media

Hertz puts forward the idea that the constant ping of emails, phone calls and other distractions utterly ruins our train of thought. “Social media is distracting.” I have not seen the research on this, but it would be interesting to know if this is also true of the younger generation who have grown up with it.

Cult of the measurable

Hertz laments the rejection of anything that cannot be measured. She claims that domestic violence is ignored because it is hard to measure. This is a big claim, so I would like to see some evidence for it before I believe it. Of course it could be the case, and if it is, that is something we should address.

Measurables are important though. Maybe not with wine, the example Hertz uses, but they are with most things. How do you measure success without measurables? How do you make an evidence-based decision if you cannot measure the evidence? Measurement provides the justification for your decisions.

Positivity bias

Most people have a positivity bias. Ironically, it is depressed people who see the world most clearly. Everyone else overlooks the negative stuff. This should be taken into account when making decisions: you should adjust your perceptions to allow for the possibility that bad things will happen.

I discussed this idea with my friends and family. They said, in my case, I was probably adjusting far too much already lol.

Recency bias

I am not sure what the actual name for this is. However, Hertz tells the story of an ER doctor who had seen a lot of pneumonia cases recently. A patient came in with slightly odd symptoms that did not quite fit. Nevertheless, the doctor diagnosed it as pneumonia. Another doctor, who had not seen all the cases, immediately corrected the diagnosis to aspirin poisoning.

This is something I could definitely do with being more aware of at work. Often I will be trying to track down a bug, assuming it is the same thing I have seen before, but the usual fixes and debugging get me nowhere. Usually, it turns out to be something totally different, but because I am zoned in on a particular problem, I miss it.

Challenger in Chief

Hertz recommends you appoint someone to play “Challenger in Chief”. Their job is to challenge your ideas in an attempt to overcome your optimism bias. They can play Devil’s Advocate and put your ideas to the test.

Pick your historical lessons carefully

Do not get hung up on past successes and failures. Richard Zanuck, one of the producers of The Sound of Music, went on to commission several more musicals after the huge success of the first. They flopped. History is not always a good indicator.

This correlates with what Duncan J Watts writes. History only happens once, so it is a sample size of one. His classic example is the MiniDisc. Sony, hurt from losing the Betamax vs VHS war, really did learn its lessons and made an excellent product in the MiniDisc. But it still flopped, because the entirely unpredictable rise of file sharing made MP3 players popular. Do learn from your mistakes. But do not learn too much.

Thinking time

According to Eyes Wide Open, Barack Obama advised David Cameron to allocate large parts of his day to time where he does nothing but sits and thinks.

I cannot find any evidence to support the claim made in the book, but it is good advice anyway. At work, sometimes I just sit and think. That time is an investment, allowing me to work out the pros and cons of my ideas before I implement them, thus saving time in the long run.

It is also a good idea not to implement ideas straight away. When you first come up with an idea you are a) probably quite excited about it and b) have not had time to think it through. Put it at the back of your mind and mull it over for a while before doing anything.

This is something I already practice at work and at home. If I decide to take on a new project or get involved with a new charity, I will wait a few weeks and see if I am still as excited about it as I was when I first thought of the idea. Only after sustained interest in an idea will I pursue it.

Similarly, at work, if we need a new feature implementing, I will generally leave it to the next day so that my mind has time to process the pros and cons of my approach.

CV writing

Studies on CVs suggest that if you write them in the third person they are taken more seriously. So the next time you are updating your CV, replace “I led a team and implemented x” with “Leading a team and implementing x”.

Anchoring

Anchoring is a real problem, and something Kahneman writes a lot about in Thinking, Fast and Slow. If you are not familiar with it, the problem is this: if you ask someone how much a house is worth, they will probably give you a reasonable estimate. However, if you tell them the house recently sold for a huge amount, they will subconsciously anchor on this and give you a much higher estimate.

This is not always a problem, but is a massive problem when it comes to things like sentencing a convicted criminal to x number of years in prison. It is also one of the reasons why you can get a much better pay rise by switching companies.

Once you are aware of these potential anchors and biases, you can try to eliminate them. Hertz recommends painting a blank canvas. If you are looking round a new house, for example, and the current owner has baked some fresh bread to bias your senses, take the time to try and imagine the house without the smell.

Colours affect our judgement. This is something we saw a lot at Sky Bet. Just changing the colour of a button for example could have a significant impact on whether people clicked it or not.

Narrowcasting

Do not be so hasty to block people with different opinions on Facebook and Twitter. It is important to expose yourself to different points of view, otherwise you find yourself in a bubble where all you ever get is people reinforcing your existing opinions, regardless of their validity.

I have Facebook friends who post material from the far left, and occasionally from the right. I have religious friends and foreign friends with cultural differences, and many of their opinions I do not agree with. However, I am glad they share them with me to challenge my own point of view.

Eli Pariser also has a great TED talk about this.

Summary

While I think the introductory chapter perhaps over-emphasises the problems with modern society, this book is filled with good ideas. Of course, I would think that, as I already use a lot of them, but there were plenty of useful reminders and new ideas that, for me, made this book an excellent read.


The Tipping Point

Saturday, November 1st, 2014 | Books

Malcolm Gladwell is a man who lies for money. Actually, I do not know that. In fact, if I were to guess, I would guess that he genuinely believes what he writes. I, however, am far more skeptical about the claims he makes.

Take, for example, the 10,000 hours rule. This is based on a study done by Anders Ericsson. Ericsson, however, does not agree with Gladwell. In fact, in 2012 he wrote an entire paper on it entitled “The Danger of Delegating Education to Journalists”. Gladwell’s response? To claim that Ericsson has wrongly interpreted his own study.

Approaching with a sensible amount of skepticism then, I took on Gladwell’s book The Tipping Point: How Little Things Can Make a Big Difference.

The first section talks about the law of the few. This explains how a few key individuals (such as connectors who are people that know everybody, and mavens who know lots of information on say supermarket prices) are the key to many things in our society. He cites the popular idea of the six degrees of Kevin Bacon where you can connect almost all actors to each other through Kevin Bacon.

He then talks about stickiness. How sticky is the message? He cites an example of a leaflet telling students to get a tetanus shot. It turned out that it did not matter how many horrible photos and how much descriptive language they used in the leaflets – the percentage of students actually going and getting the shots remained at 3%. Yet when they included a map and the opening times of the on-campus health centre, this rose to 28%, even though all the students must have known where the health centre was.

In the third section, he goes on to talk about the power of context. Quoting the example of the drastic crime drop in New York City, he espouses the broken windows theory. This is the idea that if you leave a broken window, people will think nobody cares about the area and crime will increase, whereas if you fix it right away people will see that somebody cares and stop committing crime.

There are some strong rebuttals to what Gladwell writes however.

In the case of the law of the few, Gladwell cites a Milgram experiment in which people were asked to send packages on, via acquaintances, to reach a target person in a different city. Milgram found that most packages made it, and that most of them went through a few key individuals. Gladwell calls these people connectors. However, when Duncan Watts, author of Everything Is Obvious, replicated the study, he found that connectors were not important.

In the case of the broken windows theory, this was one of the case studies in Freakonomics, in which the book shows that while everyone in New York was patting themselves on the back for their brilliant new policing strategy that was cutting crime, what had actually happened was that two decades earlier abortion had been legalised, and now all the would-be criminals were simply never being born.


Why do some atheists become pagans?

Thursday, August 28th, 2014 | Humanism, Religion & Politics, Thoughts

Recently, I saw one of my friends post on Facebook about attending Pagan Pride. I found this interesting because they used to run an atheist society. When I think about it, I can name quite a few people who have flirted with paganism, either before they came to atheist society, or having left the society and then drifted over to paganism.

It seems to me that there is a stronger link between atheism and paganism than between atheism and other religious beliefs. I wonder why this is.

The simplest explanation could be the size of my dataset. While reviewing my personal experience revealed this connection, it could simply be chance, and if I looked at a wider variety of evidence I would see something different. In particular, cultural setting probably plays a large part, though if that were the case you would expect the dominant featured religion to be Christianity. Still, that seems a good explanation. However, in the interests of discussion, I want to consider the possibilities assuming that it is not the case.

My first instinct was that Paganism is easier to swallow than more dogmatic religions. It seems fair to say that in order to become religious, you probably have to swallow its bullshit to some degree. With the Abrahamic religions, that is quite well-defined bullshit. It is hard to wriggle out of because their god helpfully wrote it all down in a series of contradicting books that explained exactly what it was, then created a series of prolific institutions to further expand its claims.

Paganism does not have this. Nobody really knows what it is about. Thus from an intellectual point of view, it is easier to swallow their nonsense because you have more freedom to accept or reject specific claims and can water it down to taste.

However, I am not convinced by this explanation. Religion is not an intellectual argument. It is an emotional one. I am not sure who said “[the problem with convincing believers is that] you can’t reason yourself out of an argument you did not reason yourself into”. People do not make these choices using logic. If they did, nobody would be religious. It is a willing suspension of your disbelief in order to gain the emotional reward of religious adherence.

That is not to say that religious people cannot defend their ideology. They do, and come up with plenty of arguments for their belief. However, as Michael Shermer’s research shows, people form beliefs first and then come up with reasons for them afterwards.

Therefore, if we accept that religion is an emotional choice, the watering down of theology offers no benefit. Indeed, for me personally, it would be less appealing. If I were to ignore my rationality and choose on an emotional level, I would much rather have the loving, protective (if a little jealous and vengeful) Christian god watching over my life and occasionally listening to my prayers (I am rich and white, and would generally pray for curable things, after all) than the vague concept of a Mother Goddess which may or may not split down into a polytheistic set. I want the certainty that our human brains naturally crave. Otherwise what is the point?

Another explanation could be the similar, but importantly different, idea that we inherently have believing brains (referencing Michael Shermer once again). In a straightforward clash between emotion trying to override logic, it makes sense to go to one extreme or the other. But suppose that rather than craving the certainty of religion, we simply allow our rationality to slide to the point where we tolerate our inherent trait of building narratives and purposes where none exist.

If we were to subconsciously form this belief, which we are all somewhat predisposed to do, we would then go looking for a way to explain why we held it. Again, belief first, reasons second. But the key point is that we are still essentially acting on a rational, intellectual level, but from a faulty base premise that there is something greater out there. Retroactively fitting an explanation to this would lead us to settle on the belief system that causes the least conflict with that world view. Here, with its lack of doctrine and defined beliefs, Paganism probably has the edge.

The Believing Brain

Sunday, August 17th, 2014 | Books

Michael Shermer is the founder of The Skeptics Society and a psychology researcher. The Believing Brain brings together much of his research over the past few decades.

Shermer’s take-home message concerns how we form beliefs: namely, that we form our beliefs first, and then work out what evidence supports them. This is not the way we like to think we make decisions. We like to think that we gather the evidence, weigh it up, then make a decision. However, there is good evidence that we do not.

“The brain will almost always find ways to support what we want to believe, so we should be especially skeptical of things we want to believe.”

That is not actually an exact quote, but I think it is roughly it.

Evolution has given us pattern-detecting brains because false positives are far less harmful than false negatives. This leads us to see patterns that are not there.

This is true even of exaggerated patterns. For example, birds will prefer to sit on eggs with even more pronounced patterns than they are supposed to have. Shermer suggests this is also true of dating. Wearing high heels lengthens a woman’s legs, so men’s brains are tricked into thinking they are more attractive. Similarly, women like men who are tall with broad shoulders, so platform heels and shoulder pads might help.

We are also predisposed to think there is an agency behind everything. These innate evolutionary traits of patternicity and agenticity explain why so many of us are susceptible to believing there is a creator, even though there is no evidence for this.

He goes on to discuss the idea of SETI as a religion. People believe in it, even though there is no evidence for it. To be fair to him, he does go on to explain in detail why SETI is different from a religion; however, I still do not entirely agree with the comparison. SETI is at least consistent with a naturalistic world view and is therefore a plausible hypothesis that we are investigating, rather than believing in.

He spends a chapter making the case that conservatives are not that bad. But then, he is one. However, he makes a good case for its importance. We need a system to regulate altruism and freeloaders, and both conservative and liberal agendas can do this. He also points to a lot of evidence that egalitarianism and communism do not work, hence why we need such agendas.

The final few chapters of the book look at the development of the scientific method and how it can help to overcome the biases and failings of our believing brains. This includes a discussion of how the universe was created. It feels a bit out of place in what is essentially a psychology book, as it will probably become out of date independently of the rest of the book’s content. Most of it I knew, but it was an interesting recap nonetheless.

Overall, it is definitely worth a read, offering some powerful explanations for why people believe what they believe and its implications for how we live our lives and structure our society.


In defence of social science

Tuesday, August 12th, 2014 | Science, Thoughts

Like everyone with a degree in real science (that is I have a Bachelor of Science in a subject that does not contain the word “science” in the title), I have often mocked social sciences. The “soft” sciences. You know, the ones that are not real science.

I think that perhaps it is time for us to stop such mocking though.

I am not sure whether we actually believe our own jokes or not. I imagine that we do; that a lot of scientists actually think social science is a load of nonsense.

There are some understandable reasons for this. Physics gives us very definite answers. Even in the days of quantum physics, which you could argue has introduced greater uncertainty, our body of knowledge and the accuracy of our predictions have only increased. In comparison, psychology and sociology are not able to give us the definite answers or universal rules that the natural sciences bring to the table.

However, there are a number of good reasons for this. First of all, they are new. While you can trace anything back far enough if you loosen the definition, psychology as we know it today really only began 130 years ago. In comparison to the thousands of years physics has had, it is a baby. It has not had time to develop the body of knowledge that the natural sciences have.

Consider that it took Newton building on hundreds of years of research to bring together a unified theory of physics into a working body of knowledge. In his own words:

If I have seen further it is by standing on the shoulders of giants.

Similarly, it took another few centuries for Einstein to bring together relativity, with quantum mechanics being even newer – and these are summations we are only just building. It may be that there simply has not been time yet for psychology to have its scientist who brings it all together.

Or perhaps there may be no universally applicable laws, which brings me on to my second reason – social science might just be a lot more complicated than natural science! That is perhaps heretical to suggest, but I think I can make a case for it.

Natural science is very difficult. There are huge equations, our brains are not designed to cope with imagining the sub-atomic level, it is incredibly difficult to measure, and so on. Yet we have managed to work out the composition of stars millions of light years away. It is doable.

Social science on the other hand, is not rocket science. It is arguably a lot harder! It might be difficult to work out the composition of fuel you need in a rocket, especially without blowing yourself up, but once you have done it, it is done. The laws of chemistry hold and you can almost guarantee the same result every time.

Not so with social science. The brain is such a complex machine that everyone is slightly, or significantly, different. You cannot predict what a person will do. And that is on the micro level! Scale that up to the macro level, trying to make forecasts for global politics or economics, and you have to try and model the behaviour of 7,000,000,000 individuals who make almost entirely unpredictable decisions. That is difficult.

But why do we need to take social sciences more seriously?

I would argue that they are perhaps more important. Few people would deny that being able to bring back rocks from Mars is awesome. I am sure it is also valuable for scientists. However, consider the benefits of focusing on psychological research.

We, humans, are rubbish at making decisions. We use common sense, which is a collection of biases that we think is real knowledge. We build a world model that only somewhat reflects reality. When something does not fit our worldview, we ignore it. We form beliefs and then justify them. We are subconsciously prejudiced and we do not even know it.

Now imagine how much better hard science we could do if we learned to spot, mitigate and perhaps even remove these issues. Imagine the happier, more peaceful, progressive societies we could live in once we properly understand why people make all the stupid decisions that cause problems in the world. My guess is that it would be a massive improvement.

Thinking, Fast and Slow

Friday, August 1st, 2014 | Books

Daniel Kahneman is a psychologist who won the 2002 Nobel Prize in Economics. His book “Thinking, Fast and Slow” summarises a lot of the research he has done and proves to be a fascinating read.

As someone who isn’t a psychologist I found some of it heavy going, but very interesting. The book is arranged into sections and these are then broken down into short chapters, which made it more readable.

Some of it was shocking too. For example, when it comes to making parole decisions, one of the biggest factors is how recently the parole officer has eaten! Just after a meal they are far more likely to grant you parole than just before a meal.

His discussion on priming reminded me a lot of what Richard Wiseman talks about in his book Rip It Up. Behaviour can drive emotion, even though we always think of it as emotion that drives behaviour.

The idea that pure brand advertising is effective gains some support: “Familiarity is not easy to distinguish from truth”. The more you show something to people, the more confident they feel about it. Other times a lack of clarity is helpful. For example, using a bad font makes exam scores go up, because people have to concentrate more than they normally would and so make fewer mistakes.

Much of the book discusses the differences between System 1 (that does the fast thinking) and System 2 (that does the slower, more considered thinking). Elina often reproaches me for not noticing snails on the path, suggesting that I need to notice things or one day I will be eaten by a lion (metaphorically, these days). I now maintain that my System 1 is keeping an eye on things and simply not bothering to engage my System 2 because there is no danger.

Kahneman also adds weight to Burton Malkiel’s book A Random Walk Down Wall Street; both discuss how the stock market is almost entirely unpredictable and therefore stock market traders actually add no value in what they do. Indeed, as 60% of mutual funds do worse than guessing, they actually subtract value.

Ultimately, people are just really bad at making judgements. 90% of drivers rate themselves as above average. Similarly, the majority of new businesses fail, yet the people who start them almost always believe they are exempt from such rules.

The answer to many of these issues is to replace judgement with a formula. This is essentially the entire point of Michael Lewis’s book Moneyball. Even a simple formula will do; according to Kahneman, multiple regression does not even make it much more accurate.
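To make the simple-formula idea concrete, here is a minimal sketch of an equal-weight formula in the spirit of Kahneman’s advice (the factor names and scale are hypothetical, not from the book): rate a few relevant factors independently on a common scale, then just average them, instead of forming one overall intuitive judgement.

```python
def equal_weight_score(ratings):
    """Average a dict of factor -> score on a common scale (e.g. 1-5)."""
    return sum(ratings.values()) / len(ratings)

# A hypothetical job candidate, rated factor by factor rather than on gut feel:
candidate = {"experience": 4, "communication": 3, "reliability": 5}
print(equal_weight_score(candidate))  # 4.0
```

The point is not the arithmetic, which is trivial, but the discipline: scoring each factor separately stops one vivid impression from dominating the whole judgement.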

One of the most useful points I took away from the book (almost certainly not the objectively most useful) is the idea of taking small gambles. In one chapter, Kahneman describes how people are unwilling to take profitable one-off gambles, such as a 50/50 chance of winning £20 or losing £10, but would be willing to take it if they knew they could take it 100 times in a row. The larger sample size means they are very likely to come out on top. However, they fear doing it once because there is a 50% chance they will lose and that will go down in their mental accounting.

Kahneman makes the point that we are “not on our death bed” and thus we will get the chance to get even with the universe over time. Extended warranties are a great example of this. You pay a premium to insure your products, so it costs you money in the long term. A better strategy is not to buy the warranty and accept that sometimes you are going to have to replace a product – but over your lifetime you will almost certainly come out ahead.
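The arithmetic behind the small-gambles idea can be checked with a quick simulation (my own sketch, not from the book): a single 50/50 bet of win £20 / lose £10 is lost half the time, but repeated 100 times it almost never leaves you down overall.

```python
import random

def play_once(rng):
    # The gamble from the book: 50/50 chance of winning £20 or losing £10,
    # an expected value of +£5 per play.
    return 20 if rng.random() < 0.5 else -10

def chance_of_ending_ahead(plays, trials=10_000, seed=1):
    # Estimate the probability that `plays` repeats leave you in profit.
    rng = random.Random(seed)
    ahead = sum(
        1 for _ in range(trials)
        if sum(play_once(rng) for _ in range(plays)) > 0
    )
    return ahead / trials

print(chance_of_ending_ahead(1))    # about 0.5: one play loses half the time
print(chance_of_ending_ahead(100))  # nearly 1.0: the long run comes out on top
```

With 100 plays you only need to win 34 of them to be in profit, which is why the long-run version of the gamble is so hard to lose.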


Everything is Obvious: Why Common Sense is Nonsense

Saturday, April 26th, 2014 | Books

Everything is obvious – once you know the answer. That is the suggestion put forward by Duncan J Watts in his book. It is not available as an ebook, which is very annoying, so I had to read this one on paper. Like I am living in the nineties…

It was a phenomenal read. Watts first puts forward the case against common sense. Within the first twenty pages I felt like I could never trust myself to make a decision again. Luckily common sense is not the kind of thing that lets logic get in the way, as Watts explains.

He points out that common sense is not that common. If it was, we could all just think about a problem, and come to the same conclusion. But we do not. Common sense is built up from our experiences to explain how to deal with every day situations. That means that each of us has different common sense. Not to mention that many of our common sense rules are contradictory to each other.

This is a problem because when we try to solve a problem, we often use common sense. Common sense is built on our own experiences, which differ from other people’s, and so it does not translate directly. One of the most extreme cases of this is that the obvious solution to a politician from a rich Western country is not the actual solution that impoverished third-world countries need.

He then goes on to point out that when you realise you cannot trust your own common sense and go looking for lessons from history, these are useless too. History only plays out once, which, as any statistician will tell you, is a pretty poor sample size. The iPod may have been a huge success while the MiniDisc floundered, but was it due to Apple having a better strategy than Sony, or were they simply victims of circumstance? The honest answer is, we will probably never know.

Finally he presents some solutions to the problems put forward. We need to be aware of our biases. We need to do things that we can test and measure scientifically. Sometimes however, this simply is not possible. In those situations, we are basically screwed…

Still, at least we know that now.


Asch experiment

Sunday, March 23rd, 2014 | Video

What would you do if you walked into an elevator, and everyone else was facing the other way?