The Not So Rational Programmer
Formal Metadata
Title: The Not So Rational Programmer
Number of Parts: 66
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/37566 (DOI)
Production Place: San Antonio
Ruby Conference 2015, 9 / 66
Transcript: English (auto-generated)
00:19
I work for a test cloud in Berlin.
00:22
I live in Tokyo, and I work remotely from there. This is me, just in case you want to know and see like a second image of myself. All right, to start out with, I have a question for you. Or actually, I have two questions,
00:41
but I'll start with question number one. So who in here has ever had to work with a really weird old legacy system? Not even like a complete rewrite, just like had to deal with it somehow. Please raise your hand if you have. All right, second question.
01:01
Who here in this room has a brain? Please raise your hand if you do. Not everyone has a brain. That's kind of surprising, but I assume you all have one. All right, so everyone who raised their hand for question number two should, in fact, also have raised their hand for question number one, because what our brain really is is a big, fat, old legacy
01:23
system. And it's been a while since the last hardware update. The good news, our brain is extremely powerful, and it can do a lot of amazing things. The bad news, the documentation is pretty crappy.
01:45
The error handling is not that great either. And we can't even debug it, because we don't have access to the code base at all. So that sounds like the nightmare of every programmer, doesn't it?
02:01
The problem is we can't quite walk up to our project manager and just quit. We're kind of stuck with this obnoxious and brilliant heap of jelly in our skulls that helps us to process and react to our environment, that helps us to reason about abstract problems, and that actually lets us create,
02:23
communicate, even program. And on the other hand, it constantly keeps forgetting people's names. It reminds us of awkward situations three years ago in a very random fashion. And it constantly makes decisions for us without even asking.
02:43
So today, I would like to talk about how we can understand our brains better, the way they work, and also these weird little flaws that are called cognitive biases, and what to do about them.
03:02
You see, we as programmers, we really like to look at ourselves as a group of people that is somehow more rational than others. You know, because after all, we earn our living by talking to machines every day, and machines aren't exactly known for being super emotional.
03:20
So, if we make technical decisions, or if we plan projects, or if we assess capabilities or competencies, it's just fair to assume that we adhere to rational standards, and that's what we base our decisions on, right? Well, I have a surprise for you.
03:40
Programmers are human. And they have human brains. To be fair, most of the time, our brains do an amazing job. They have to process vast amounts of information, and they somehow have to come back to us with appropriate reactions, like how to deal with all this stuff that comes to us all the time.
04:03
You see, the human brain is really old, and many parts of it developed when social coherence of a group was actually really important for survival, and things like accurate and quick assessment of threats, when it really mattered what your peers think of you,
04:21
because being ostracized might well mean that you're gonna die. And race conditions were a completely different problem than what they are to us nowadays, most of the time. So, what is cognitive bias? Cognitive biases are heuristics that the brain uses
04:43
to process information in a very quick way and come up with appropriate reactions. They're pretty useful most of the time, but they are not perfect processes, so they can actually lead to some pretty suboptimal behavior or decisions.
05:02
The thing is this, we are all biased. That's something that's very important to understand. It's natural, it's how our brains work, and it's not really necessarily a bad thing. Our brain uses all these shortcuts, so it can actually deal with all that information
05:20
that's coming to us and give us back something reasonable, so we don't just get an information overflow because the brain gets stuck with all these details. In fact, our brain has different modes how it acts, and I'm gonna show you two really simple examples that will illustrate this. So, when you look at the next slide,
05:45
there are a couple of things that happen to you without even noticing. You recognize right away that this is a person, a child in fact, and also probably someone you haven't seen before, or maybe you've seen them but you don't know them personally.
06:02
You can also tell right away that she's currently not happy at all, and if she was standing right in front of you at this moment, she might be very close to starting to cry or actually shouting at you. This process of recognition and perception is something your brain does really quickly and effortlessly for evolutionary reasons.
06:22
It's important to understand, like, oh, this is another person like what I am, and to understand what they feel like. And in this mode, the brain also does a lot of very quick and automated decision making, where we often don't even realize that it's happening just because it's happening so fast.
06:42
And oftentimes, it sacrifices accuracy and correctness for speed and approximated results that are okay most of the time, and a sufficiently large number of times. When you look at this on the other hand,
07:02
unless you're really good at mental arithmetic, or you still remember your multiplication table from elementary school, which I don't, to be fair, your brain probably drew a blank here. Like, there's no evolutionary reason why our brain should be able to automatically process like semi-difficult multiplication problems.
07:21
So it can't, and there is not really any way for it to spontaneously come up with the result unless you actually memorized it. You can probably tell that this is a multiplication problem by looking at it, because that's something you've learned, and you can also tell that five, or five million, is probably not a very reasonable estimate for the result.
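As a loose analogy in code (hypothetical, not from the talk): exact multiplication is the slow, effortful path, while rounding each factor to one significant digit gives the kind of quick, roughly-right answer the fast thinking mode settles for. The numbers and method names here are made-up examples.

```ruby
# Analogy only: "slow thinking" computes the exact answer,
# "fast thinking" trades accuracy for speed with a rough estimate.

def slow_exact(a, b)
  a * b # deliberate, effortful, always correct
end

def fast_estimate(a, b)
  # Round each positive factor to one significant digit, then multiply:
  # cheap and usually close enough, but not guaranteed to be right.
  round_to_one_digit = lambda do |n|
    magnitude = 10**(n.to_s.length - 1)
    (n.to_f / magnitude).round * magnitude
  end
  round_to_one_digit.call(a) * round_to_one_digit.call(b)
end

puts slow_exact(17, 24)    # 408
puts fast_estimate(17, 24) # 400, close enough most of the time
```

The estimate is wrong, but it is in the right ballpark and arrives almost for free, which is exactly the trade-off described above.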
07:42
But if you really want the result, you have to actively start thinking about it, you have to start calculating. And that's a lot slower, a lot more difficult, and a lot more demanding than a fast thinking mode. Put simply, when our brain is in fast thinking mode,
08:02
it uses cognitive biases to approximate solutions. And like all approximation approaches, this does not prioritize optimal solutions, but it prioritizes coming up with a feasible solution in a reasonable time. You can't really turn this off.
08:22
It's hardwired into our brains. But in some situations, there are ways how you can work around it. That's not always possible. Simply because firstly, you might not even recognize it's happening because it's happening all the time.
08:41
And secondly, it might not be viable to do something about it all the time, just because if you were to question every single move that your brain makes, this would slow us down so much that it would be difficult to even act as humans. There are situations though where it is definitely viable
09:02
and also very good to try and work around these biases. And actively making decisions is one of them. We make decisions all the time, from really small ones like what to have for lunch to major ones like how to make our lives meaningful.
09:23
And most of the decisions that we make at work fall somewhere in between. How to implement this new feature, what kind of tools or frameworks to use for this new project, which applicant to hire for the job we're looking for.
09:40
These are important decisions and our brain's biases affect each and every single one of them. So what can we do to make our decisions as good as possible? I will start out with looking at cognitive biases that affect our personal decision making
10:01
like when we make decisions on our own and since we as programmers usually don't work alone all the time but we work in teams, I will also look at some cognitive biases that come into play when you're making decisions in a team of people.
10:23
Confirmation bias is one of the first things that we have to look at when we're trying to trick our brains into making better decisions. Confirmation bias means that when we search for information or when we interpret information, we tend to do it in a way that confirms the opinions we're already holding.
10:43
As I said, this affects both searching for information and interpreting information. We take what we already think is right or what seems to make sense or what seems to us as obvious and then we try to confirm this idea while at the same time, we're ignoring possible alternatives.
11:03
If you have a strong opinion on something or a very emotionally attached opinion, you might even get angry if someone challenges it. For example, many people have rather strong or emotional opinions about topics such as abortion or gun control, gun ownership, something like that.
11:24
So no matter wrong or right, that doesn't matter at all but if you read something or if you hear something that challenges this opinion, you're very prone to actually like waving it off as nonsense or even getting upset if somebody challenges it. While you will happily take in every piece of information
11:42
that seems to confirm what you're already believing and that confirms your opinion because it's obviously right, right? Or something more related to actual technical decision making and I would like to say here that this is a less emotionally charged topic but I don't really wanna lie.
12:02
So for example, if you already are convinced that Rails is the best this world has to offer for this new project you're starting, you're really very prone to not listening to people that come to you and tell you that Rails is a shitty framework for a reason X, Y, or Z.
12:21
You might listen to what they have to say but you'll probably discard it pretty quickly for the opinion you already have and you won't really dwell on it for very long. You're also much more prone to look for information that tells you why Rails would be a good choice
12:42
for this project and not why it would be a bad choice. The confirmation bias is a good example for the brain to recognize it's happening and sacrificing accuracy and correctness for speed and less effort.
13:03
In fact, we have many preconceptions about the world that are actually true, or close enough to true, in most cases. Assuming they're accurate enables us to act and think much faster, and therefore our brain doesn't really constantly check if our opinions are true or false.
13:20
It will just assume that this is the basis on which we're acting. So this is not necessarily a bad thing, but the problem comes up when the opinion that we are holding, and that we are trying to confirm in this way, is actually not a very good solution for the problem that we're trying to solve.
13:45
So what can we do about the confirmation bias? A good approach to counter it is to challenge your own opinion. Try to prove yourself wrong. When you're making an important decision, try to put yourself in the shoes of someone
14:02
whose job it is to actually show that your approach is not correct. This is not easy at all but it will certainly help you to take on a different viewpoint and maybe uncover some things that you hadn't thought about before. And if you are not sure that you can do this honestly enough, ask someone like a coworker or just someone you trust
14:22
to play devil's advocate for you and to actually challenge this opinion. Don't be defensive about it. Actually take into consideration what this person is saying. Change your opinion if required. I know this is not easy at all but if we're not ready to change our opinions, we don't really need to start with all this
14:41
working around cognitive bias because there's no point to it. And then if in the end it turns out that the original thought or idea that you had still looks like the best, then it's probably not such a bad choice.
15:04
All right, another cognitive bias that strongly influences our decision-making is the mere exposure effect. The mere exposure effect means that we tend to like things more if we are familiar with them. We have a preference for something just because we know it
15:21
and this is a bias that's also strongly rooted in survival. Things we know, things we understand, things we are familiar with create something in us that is called cognitive ease. Cognitive ease makes us feel good, it makes us feel safe in a given situation. And our brain uses this effect as a kind of dial
15:43
for constant situation assessment to make sure we're safe. If there's nothing that challenges us, if there's nothing that looks like a potential threat or that we have to direct a lot of attention to, it will assume that the situation is okay. Cognitive ease feels good, it feels comfortable
16:01
but it also makes us think in a much more casual and superficial way. Cognitive strain on the other hand happens when we encounter something that we don't really have any experience about, something we don't know, something we actively have to wrap our heads around
16:21
and our brain takes this as a clue that there might be a potential threat, there might be a problem that we have to solve, and therefore it gives us a little heads up where it says like, attention, attention, there is something you should think about. So cognitive strain makes us put a lot more effort into our thinking.
16:41
We make fewer errors in this state of thinking, but it also makes us a lot slower, less creative and less intuitive. So again, preferring things we like is a natural thing.
17:01
It just happens that way in our brains and we can't really turn it off. Our brain wants us to be safe so it seeks out situations that make it feel safe but as with the confirmation bias, we can actually work around it by asking ourselves about the world we dislike
17:20
or like something or someone, a certain language, a certain framework, a certain applicant. So for example, if we're thinking about a framework we're gonna use, do we like it just because we're familiar with it or is it actually the best tool for the job?
17:41
Do we doubt a certain applicant for actual reasons or just because they're not the kind of person we're used to interacting with? But at the same time, we shouldn't just toss familiarity out of the window. I mean, there's a point to this whole thing because being familiar with something enables us to hit the ground running
18:00
and to get started on something really quickly because we understand it. So the thing is just we should figure out beforehand if this is actually the direction we're wanting to run to and not do it just automatically because our brain throws it to us and says, take this, this is the easiest way.
18:20
So what do we do about the mere exposure effect? A good way of dealing with this is when you have a major decision to make, set up a list of objective criteria or like as objective as possible and use these to evaluate your options. Do this before you actually start looking at options
18:42
and when you're evaluating, stick to these criteria; that should help you, at least to some degree, to keep your personal impression out of the game. But once you're done with these criteria, write down a short note about your personal impression, like what you feel about this option. That way you can combine this objective evaluation
19:01
and your personal impression and there's been studies that show that this is actually a very good way to make decisions that brings good results. And when you decide, stick mostly to the objective criteria and then use this personal impression report as a support for decisions.
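The criteria-first technique described above can be sketched as a tiny weighted scoring matrix. Everything here is a hypothetical example (the criteria, weights, and option names are made up, not from the talk); the point is that per-criterion scores get recorded before any overall favorite is picked, and the personal impression is kept as a separate note rather than folded into the numbers.

```ruby
# Hypothetical example of "objective criteria first" decision making.
# Weights express how much each criterion matters (higher = more important).
CRITERIA_WEIGHTS = { team_experience: 3, documentation: 2, performance: 1 }

# Scores (1-5) assigned per criterion, filled in before ranking the options.
# Personal impressions are kept as free-text notes, separate from the scores.
OPTIONS = {
  "Framework A" => { scores: { team_experience: 5, documentation: 5, performance: 3 },
                     impression: "feels familiar and comfortable" },
  "Framework B" => { scores: { team_experience: 2, documentation: 4, performance: 4 },
                     impression: "exciting but unknown" },
}

def weighted_score(scores)
  CRITERIA_WEIGHTS.sum { |criterion, weight| weight * scores.fetch(criterion) }
end

# Rank by the objective score; consult the impression notes only afterwards,
# as support for the decision rather than as part of it.
def rank(options)
  options.map { |name, data| [name, weighted_score(data[:scores])] }
         .sort_by { |_name, score| -score }
end

rank(OPTIONS).each { |name, score| puts "#{name}: #{score}" }
```

Handing the filled-in matrix to someone who was not involved in scoring, as the talk suggests, is then just a matter of sharing `OPTIONS` and the ranking, not your favorite.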
19:20
If you have enough people for that, you can also separate the person that does the evaluation and the person that actually makes the final decision because that's a way to actually keep personal preferences of one person out of the game and if you're not super comfortable with making this hard separation, you can also just take your evaluation results and give them to a person that was not involved
19:42
in the evaluation process and get their independent opinion. Yep, that was that. The confirmation bias and the mere exposure effect
20:00
have strong influences on the way we personally make decisions, but a lot of the time we also make decisions in teams. Why do we make decisions in teams or groups? There are different benefits we can actually get from that. One of them is, for example, that we come in with a variety of perspectives on the problem and with a lot more information about possible alternatives or possible solutions
20:23
and it also makes for a better decision reliability which means that through being in a group, you can kind of even out the personal biases of people and dampen them. So let's look at a couple of cognitive biases that actually have an effect
20:41
when we're making decisions in a group and that stop us from getting these benefits. When discussing something in a team, it's really important that we're all on the same page about the topic we're talking about. I mean, that sounds like a no-brainer really but I personally have surely been on a lot
21:01
of project meetings where we all thought we had an agreement on something, and then somebody asks a question and suddenly the whole thing is in confusion, because like three people find out that they were actually thinking about something completely different, but each of them thought that was the topic we were discussing. So that happens a lot, and that is actually an example
21:21
of the false consensus effect. The false consensus effect means that people tend to assume that what they think is normal and so they overestimate how much other people actually think the same way that they do or how much other people agree with them.
21:41
This is due to a couple of different factors. One of them is that even though our brains are actually pretty good at people thinking, we often have surprisingly poor social judgments. We just get people wrong a lot. Secondly, we tend to project our own assumptions
22:00
and our own attitudes and opinions onto other people. This might in part be wishful thinking, but one of the reasons is also that our brain does not really have any very good way of looking into the heads of other people and actively knowing what these people think. So our brain sneakily and silently
22:20
replaces the actual question we're asking like what does this person think with another question that's easier to answer and that would be what would I think if I was this person? And then it returns the answer to us for the second question but we never know that was actually not the answer to the question we asked.
22:43
We tend to do this a lot more with people that we perceive as similar to us or that are members of the same group. So mostly our coworkers fall into that category. That means if we don't clearly communicate our thoughts and our opinions on something, everybody will think that everybody else thinks the same way that they do
23:02
and that's a very clear recipe for disaster and chaos. Something else that can happen with the false consensus effect is when you have a fairly dominant opinionator in your group like for example, someone who has a fairly senior role
23:21
or someone who is just very good at verbally leading discussions or yeah, conversations, is that when they state their opinion, they'll usually do it in a very convincing way which is per se obviously not a bad thing but it might lead to a certain dynamic where other people if they feel like they can't really go up against this opinion
23:40
or they're not qualified enough to do so, they'll just shut up and not say what they think. And then this dominant opinionator, and I'm not saying this because it's something bad, it's just a phrase to describe a certain communication style, will then assume that everybody else thinks the way they do, and that way, because nobody's speaking up,
24:02
decisions get made that do not actually reflect the opinion of the team. So what can we do about the false consensus effect? Be explicit, obviously: when we call a meeting
24:20
or for a discussion for a decision, be absolutely explicit what this is about, what is the topic of this, what is the goal of this, what are we going to talk about and what is the decision that we are going to make so that way everyone is on the same page and knows what we're talking about so we can go into the right direction right away.
24:41
For the second thing I described: encourage questions about the topic before you start discussing, which kind of goes hand in hand with what I already said, and collect opinions first.
25:01
So before you start out a discussion, before you get into this dynamic where one person's opinion kind of overshadows the other people's, let everyone write down their opinions on the topic at hand on little pieces of paper or sticky notes or something, because that way you can actually collect everybody's ideas and opinions without having them run through
25:22
the group consensus first and then you can actually take each one of those points that came up and discuss about them and make sure that everybody's opinion gets heard. There's something else that can happen in groups
25:40
that can severely undermine decision making and it's called group think. Group think means that to preserve the harmony of the group or the conformity in the group, members of this group or of this team will actually try to minimize conflict and reach a consensus without critical evaluation
26:00
of alternative viewpoints or even by oppressing or suppressing differing opinions from inside or outside the group. Inside the group, this can lead to things like people very quickly adapting their own opinions
26:21
to what they perceive as the majority opinion of the group or the opinion of a leading member of the group, a senior developer, for example. It can also take the shape of people actively or unconsciously suppressing differing opinions and discarding them really quickly
26:40
if somebody brings them up. So what is perceived as loyalty to the group actually makes people not bring up controversial issues or challenge opinions that have already been established. This generally leads to a very sharp decrease in individual creativity and in critical and independent thinking,
27:02
and it has a very negative effect on the decision making of the group. When it comes to the opinion of non-group members, group think can start out causing things like just not getting any input or feedback from members outside of the group,
27:22
consciously or unconsciously, all the way to actively or semi-actively trying to bar outside influences on the group. An example of the semi-active approach, something you might have heard before, is when people say things like,
27:42
or they don't know as much about this as we do, or any variation of this. Again, this has negative effects on the group's decision making, because it creates an echo chamber where only the group's own consensus, or artificial consensus, is reflected back at them and nothing else ever gets in.
28:07
Ironically, even though it provably worsens the decision making of a group, group think actually makes the group's members feel a lot more confident that their decisions are right
28:20
and that the quality of their decisions is very good because it creates some type of feeling of belonging together, group cohesion and invincibility within the group. Group think is a dynamic with an evolutionary background as well and its purpose is in fact to create cohesion within a social group and to avoid infighting
28:41
which is important for survival. There are three factors that play together that lead to group think. First of all is high group cohesiveness. If your group doesn't feel like it belongs together, you won't have an issue with group think at all because you don't have a cohesive group. The thing is, high cohesiveness alone
29:02
does not necessarily lead to group think. There are two other factors and at least one of them needs to be present to create this dynamic. One of them are structural faults. For example, an insulation of the group. So it's really isolated and does not really communicate with a lot of outside people.
29:21
Another one is actually if you have a group that has a very homogenous setup of members, if you have a group where every single member is very similar in their background, in where they are from, like what they are like, their opinions, this very strongly encourages group think. And the third factor that can also have some influence
29:42
is the situational context. Things like perceived threats from the outside that feel highly stressful to the group, or recent failures of the group, tend to encourage group think. So that's actually good news for us, because we really do want cohesive groups
30:01
but we want them without the group think effect. So what can we do to counter group think, to make our groups work together in a way that they are cohesive but not like sheep? First of all, a cohesive but diverse group starts you out with a really good bunch
30:21
of different viewpoints and opinions. So try to form teams that are diverse and not homogenous, where people come from different backgrounds, from different demographics, from different viewpoints, with different experiences. Also, oh yeah, I have a slide for that.
30:41
Also, encourage critical evaluation. You have to try and build an atmosphere that actually encourages people to voice their opinions, to evaluate ideas that come up in a critical way because you see, if you have an environment where people feel that if they actually say something that goes against the mainstream of the group,
31:00
it will be frowned upon or they will be punished in some way or the other, they won't do it because why would they? It will only have negative consequences for them. So try to build an environment that encourages critical thinking and the expression of your personal opinions in a constructive way.
31:20
If you're a leading or fairly senior member of the group, you might want to think about not starting out a discussion on decision making by stating your own opinion on the matter, because that way you're very prone to priming your team to stick at least to the general area of what you think about this topic.
31:41
They might just adopt what you think instead of bringing in their own ideas. The sticky-note opinion-collection technique I described earlier is very helpful here as well. And this doesn't mean that you can't state your opinion if you're the group leader
32:00
or a senior member, or that you shouldn't take part in the decision making. It just means: don't state your opinion first. Let the other people talk first, let them bring in their opinions, and then bring in yours. To avoid creating an echo chamber, actively invite outside experts
32:21
or other outside people into the group. Let them state their view on things, and then have your group members actively discuss the topic at hand with them. In general, encourage your team members to discuss the group's ideas with trusted people outside the group, to get feedback from outside
32:41
the echo chamber of the in-group. And last but not least, think about appointing a devil's advocate when you're making an important decision. Make this an explicit role in the team: one person is responsible for taking a critical stance against every idea that comes up
33:03
and questioning it, in a constructive way, obviously. The point of a devil's advocate is not to shoot down everything other people say, but this way you can institutionalize critical thinking in your group. Just make sure the devil's advocate is a different person every time you have a discussion,
33:21
because otherwise you end up with a member of the group whom the others really resent over time, because they constantly keep shooting down everyone's ideas. And remember this: when we all think alike, then no one is thinking. If somebody says something in a decision-making process
33:44
and everybody just agrees without any further discussion, you should become immediately suspicious and ask yourself why that is happening. Could this be a case of groupthink? All right, that's been a lot of information so far, so let's briefly recap the cognitive biases
34:02
we've looked at. First, the confirmation bias. This bias influences our personal decision-making and causes us to search for or interpret information in a way that confirms the opinions we already hold. The mere-exposure effect also influences
34:22
our personal decision-making processes; it means that we tend to like things more just because we're familiar with them. Next is the false-consensus effect. This one operates in team decision-making: we tend to overestimate the degree to which other people agree with us
34:43
or think like us. And then groupthink, where people, in an attempt to preserve group harmony, try to minimize conflict and reach a consensus without actually evaluating alternative viewpoints, even suppressing dissenting opinions.
35:02
I have one more thing that I would like to mention, and I think it's a very important one, because if we don't understand this, there's not really any point in starting to think about working around cognitive biases at all. And it's this:
35:22
it's okay to change your opinion based on new or updated information. It sounds very obvious, but if we're honest with ourselves, a lot of the time we hold onto the opinions
35:42
we've already formed out of pride or ego, because we really, really don't like admitting that we might have been wrong about something. In fact, reevaluating and updating opinions based on new information and new facts is not shameful at all, and it's not a sign that you can't make up your mind.
36:02
It's actually one of the only ways we can get closer to thinking and acting rationally. Daniel Kahneman is a psychologist who has done a lot of great work in the field of cognitive biases.
36:21
He wrote an awesome book, and I definitely recommend you read it. You can ask me for the title after the talk if you want to. Daniel Kahneman says: our comforting conviction that the world makes sense rests on a secure foundation, our almost unlimited ability to ignore our ignorance.
36:41
You see, we can't really stop cognitive biases. They are hardwired into our brains; they're just there, and it would probably not even be a good idea to try to stop them, because that would seriously interfere with the way our brains work. So there's no real way for us to become completely rational creatures. But what we can do is try to chip away
37:02
at ignoring our ignorance. We can learn about cognitive biases, learn about the situations in which they happen, and then recognize them and use techniques to counter or circumvent them when appropriate.
37:22
That way, we become better at understanding ourselves, and thereby also at understanding others. We become better teammates, we become better decision makers, and we generally become better and more successful at what we do. So in that spirit, let's all start working
37:42
at being less ignorant about ourselves. And I hope my talk has given you a good start for that. Thank you for listening.