
Overcoming Cognitive Bias

Transcript
Thank you, everyone, for showing up for the last session of the conference. I'm Anna Martelli Ravenscroft, and I am a Pythonista. I am not a programmer by profession; I use Python to get stuff done. I studied cognitive science at Stanford, so I want to talk today a bit about our brains. Our brains are very clever and very good at doing things, but they are also lazy: they want to reduce cognitive load. If you've seen any of my previous talks, you know that our brains use a lot of little tricks, heuristics and shortcuts, to reduce how much work they have to do, and this can sometimes cause problems. Our brains use heuristics, they do automatic pattern matching, and they fill in the blanks, which is great, except when it isn't. For example, lots of fluffy white puppies are super cute, and they set up a pattern in your brain, so that when you see a brick wall with a fluffy white animal behind it, your brain fills in the blanks and says: oh, I recognize that, that's a fluffy white puppy. Except sometimes it isn't a puppy at all.
So we have these pitfalls. We see patterns where they don't exist; that's where superstitions come from. We have faulty reasoning; this is why we keep having to remind people that correlation does not equal causation. And we do stereotyping: we build a pattern about a group of people and then apply that pattern to individuals, and sometimes the assumptions based on those patterns are faulty. So, speaking of patterns, let's look at some computers and programmers, because we're here at a programming conference.
This is Ada Lovelace. She wrote the first computer program: working with Charles Babbage on his Analytical Engine, she wrote a program for his computer to calculate Bernoulli numbers. Moving forward fifty years, how many computers do you see in this picture? I've circled them. Humans were the computers back then. These were women working at the observatory, helping to do computations for astronomers. Even in World War II, women were the human computers doing all the computations, and by the end of World War II they were building the electronic computers. Now, how many programmers do you see in this picture? Women were the first programmers: they took the hardware diagrams, the circuit diagrams, and figured out from there how to actually program the computers. At Bletchley Park they were doing the same thing. And we still had human computers working for places like the Jet Propulsion Laboratory. You'll notice that not all of them were white. This was before the civil rights movement in the US, and they were still hiring women of color, because they needed human computers to do the computation. How many people have seen the movie Hidden Figures? OK, if you haven't, it's an awesome movie, go see it. This is the protagonist: Katherine Johnson was a human computer at NASA, and John Glenn asked her to redo all of the computations that the electronic computer had done, because he didn't trust that new-fangled gadget; he wanted someone who knew the mathematics before he would trust his life to it. She has recently been awarded the Presidential Medal of Freedom for her work.
Moving forward to the sixties, Mary Allen Wilkes wrote the operating system for the LINC, an early personal computer that she used at home. In the seventies, Adele Goldberg helped develop Smalltalk. Elizabeth Feinler helped create the Network Information Center for the ARPANET. Frances Allen, an IBM Fellow and Turing Award winner, laid the groundwork for parallel computing. And of course there's the Amazing Grace Hopper; I'm sure we've all heard of Grace Hopper, who documented the first actual computer bug when a moth was found in the works. There is now the Grace Hopper Celebration of Women in Computing, where you see thousands of women around you who are all into programming and other forms of technical work. These four ladies are part of the Italian team working on the LIGO project that helped detect gravitational waves, using Python. This year we got to hear an awesome talk about how Python is being used in cosmology. There's also a Pythonista I met at PyCon Italia who founded her own company. And then there's my friend K. K is a scientist; K designs and builds lasers. Why do I bring up K?
Well, what do you ask someone you meet at a technical conference like PyCon? Where do you work? What's your favorite programming language? What's your favorite library? What do you do with Python, how do you use Python? And what do you ask a woman you meet at a technical conference like EuroPython? 'Do you program?' 'Are you here with your husband?' I thought so. Women don't fit the pattern of 'programmer' that we've built in our brains. Now, my friend K, the scientist: if you ask her, 'are you a programmer?', she'll say no. But if you ask, 'do you use Python?', it turns out she uses Python, NumPy and matplotlib all the time, and she could probably run circles around most of the people in this room with the tools she uses for her work. So is she a Pythonista? I'd say yes. So yes, this talk actually is about cognitive biases. Remember the fluffy white puppies? These are also cute: I have two dogs, one cat and eight chickens. So look around at tech conferences, look around at your work, look around at your meetups. If they look less like this level of diversity and more like this, your brain is going to build a pattern based on that sameness: this is what a programmer looks like. Our brains build patterns automatically. So if all the programmers you encounter at work and at conferences look like a white male (nothing wrong with white males, I'm married to one), then anyone who doesn't look like a white male doesn't fit the pattern. That's how our brains work.
And if this is your pattern of what a programmer looks like, then when you see a billboard with a platform-engineer quote, you don't expect to see this as the picture of the engineer. In fact, there was so much pushback on this picture that she created the hashtag #ILookLikeAnEngineer, because people needed to be reminded that engineers look like a lot of different things. So when you ask 'are you a programmer?' of a woman at a conference, that's cognitive bias at work. You might be objecting: 'I'm not biased.' 'I'm too smart to be biased.' 'We only care about performance; we're a meritocracy.' Well, there's something called the bias blind spot. Studies have shown that we are less likely to be aware of our own biases than of other people's. And because we're all really smart people here: intelligence is uncorrelated with how big your bias blind spot is, or with how aware you are of your own biases. People with a big bias blind spot also take less advice from others and learn less from bias training. There are many more cognitive biases. Confirmation bias is a big one: the tendency to want to prove that you're right, that your preconceptions are right. There's in-group bias, where we favor people who are members of our own group. We tend to think that others will think like us and have the same priorities; this is why we run into problems if we don't actually talk to real users when we're creating products, because we think, oh well, I know what they need. Maybe you do, maybe you don't; you may simply be projecting your own beliefs onto them. There's selective perception, there's status quo bias; there are a lot of biases. These are just a few.
So, confirmation bias. Here is a test they use for confirmation bias. The rule is: if a card has a vowel on one side, then it must have an even number on the other side. Which two cards do you turn over to test the rule? Obviously you turn over the card with the A, to see if there's an even number on the other side. But which other card do you turn over, the 4 or the 7? You say the 7; did you all already know this one? Yeah. If you ask a lot of people who haven't been warned about confirmation bias, they'll say we have to turn over the 4, because we want to prove that the rule is right.
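This card puzzle is the classic Wason selection task. As a small illustration for readers of this transcript, the falsification logic described above can be written out directly in Python; the code below is not from the talk, and the card representation is purely illustrative.

```python
# Wason selection task. Rule under test: "if a card has a vowel on one
# side, then it must have an even number on the other side."
# Only the visible face of each card is known.

def must_turn(visible: str) -> bool:
    """Return True if turning this card over could falsify the rule."""
    if visible.isalpha():
        # A visible vowel could hide an odd number, so it must be checked.
        # A consonant can never falsify the rule, whatever is behind it.
        return visible.lower() in "aeiou"
    # A visible odd number could hide a vowel, so it must be checked.
    # A visible even number can never falsify the rule.
    return int(visible) % 2 == 1

cards = ["A", "K", "4", "7"]
print([card for card in cards if must_turn(card)])  # ['A', '7'], not the 4
```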
Now, as programmers, we know that if we only test to prove that a program is right, that it works with the expected input, we're going to have problems. So we test for edge cases; we try to break the program. We've learned over time, because people have taught us, and sometimes the hard way, to test for breakage rather than only trying to prove that it's right. Now, what about 'we only care about performance'? This shows up in universities, in orchestras, in a lot of places you would think of as being very focused on diversity, very liberal, very much wanting to be unbiased. There was a study of university hiring committees where identical résumés were sent out to different committees; some had women's names and some had men's names. The résumés with women's names were judged much more critically, even by the women, than the résumés with men's names. These are people who really want to be unbiased, but subconsciously this was happening to them too.
An orchestra was trying to increase the diversity of its membership. They knew that women and people of color were just as talented, but they were having trouble hiring them, so they said: we'll do blind auditions. They put up a screen so that they could not see the performer who came in, and they could judge only on skill. This increased the number of people of color who were offered positions in the orchestra, but not so much the women. What do women wear on their feet when they're dressed up? High heels. The musicians could of course hear that the person was walking in high heels, and just that was enough of a clue that this was a woman, and they judged her performance more critically. So they put down a carpet so that they couldn't hear the person walking, and suddenly women were being offered positions at the same rate as the men. How does this happen?
Let me sum it up. When this is what we use to build our model of a programmer, and someone doesn't fit that model, your brain is going to stereotype and say: this person doesn't match. It will look for proof of the pattern you already have. Why doesn't this person fit? Does this person belong here? You're going to try to prove that they don't: you'll be more critical of their errors and weaknesses, you'll give less credit to their strengths and abilities, you will perceive them differently. This isn't conscious; it's subconscious, it's perceptual, it's how our brains work. It's not that we're bad people, it's just that we're human. So cognitive biases are getting in the way of our meritocracy. How do we overcome them? With awareness and conscious effort. I'm making you aware of some of these cognitive biases here, but you still need to put in the effort to overcome them, to short-circuit them, and to change the mental models you've built up. Welcome and mentor new Pythonistas. Reach out to marginalized groups; actively work to bring marginalized programmers and Pythonistas to conferences and user groups. EuroPython, I think, has done a really good job at this. Get organizations to sponsor people who are not like you to attend conferences and to speak at conferences. Actively challenge the patterns your subconscious filters by, by offering it a range of new and varied input. When you're hiring, watch the gendered language in your job ads: I believe these tools are English-only, but there are apps out there that will help you check how you phrase your ads, to make sure you're not turning women and other people away from applying. Use blind résumés where possible: remove the name, because that's a clue to gender and often to ethnicity, and remove universities and other clues to group membership. Do you really care whether this person got their skills from a CS degree, or do you care about the skills themselves, whether they came from a boot camp or from being self-taught?
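As a minimal sketch of what that blind-résumé step might look like in practice, assuming applications arrive as plain text with labelled fields; the field names and the helper below are hypothetical illustrations, not something prescribed in the talk.

```python
# Drop lines whose labels cue group membership (name, gender, university)
# rather than skill. The labels here are illustrative; adapt to your data.
REDACTED_FIELDS = {"name", "gender", "photo", "university"}

def blind_resume(text: str) -> str:
    """Return a resume with identity-cueing fields removed."""
    kept = []
    for line in text.splitlines():
        label = line.split(":", 1)[0].strip().lower()
        if label in REDACTED_FIELDS:
            continue  # drop the whole identifying line
        kept.append(line)
    return "\n".join(kept)

raw = """Name: Jane Doe
University: Somewhere Tech
Skills: Python, NumPy, matplotlib
Experience: 5 years building data pipelines"""

print(blind_resume(raw))  # only the skills and experience lines survive
```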
We all know awesome programmers who were self-taught. Focus more on skills than on 'culture fit'; culture fit tends to fall into in-group bias, where we give more weight to people who are already like our group. Use technology like GapJumpers to do blind auditions, to find out what kind of skills a person has before you bring them in for an interview. And when you're interviewing them face-to-face, short-circuit the cognitive bias by looking first for reasons to hire. You're already, subconsciously, going to be looking for reasons to rule them out because they don't fit your pattern, so focus on the reasons you want to hire this person and actively look for strengths and abilities. Watch out for impostor syndrome, too: if your candidate says 'oh, that was a team effort' or 'I just got lucky', that's probably impostor syndrome talking, and they will downplay their accomplishments, so take that into account when you're evaluating them.
At conferences like EuroPython and PyCon, and at your user groups and meetups, look out for the people who don't fit the stereotypes. Listen to them. Presume they're smart, presume they're technical. Believe them when they share their experiences. Look for things you have in common, and for their strengths and abilities. At a technical conference, presume everyone you meet is a programmer or does some programming. If you're at a Python conference, everyone here is a Pythonista until you're told otherwise. Stop asking 'are you a programmer?' Instead ask: what kind of programming do you do? How do you use Python? Or any other question you would ask someone you already assume is a programmer. You may find out that the person is a lot more technical than you expected, and you may learn about uses of Python that you didn't know about. I've talked to people who are doing all kinds of different things with Python. By working together to overcome the stereotypes and the other cognitive biases, we all get that much closer to actually achieving a true meritocracy, and to making the Python community, and maybe the world, a little better. And I guess I went really fast, because sometimes this takes longer. So, thank you.
[Moderator] Thank you. We now have time for questions.

[Audience] I'm just curious to hear your opinion about the gender pay gap, the fact that women get lower salaries, as we all know. How do you look at it, and what are potential improvements?

[Anna] So, the gender pay gap exists; there have been plenty of studies about it. There's a whole variety of things involved, everything from women not negotiating the way men do when they start a new job, or not being as willing to ask for raises as men are, to subconscious bias in the people who are making decisions about those raises. One of the things you can do, if someone is at the same level and doing the same work, is to go through and use programming to start looking for those gaps and try to close them. If people are getting the same 'meets or exceeds' evaluations, then they should be at the same pay level as the other people working at that same level.
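The "use programming to look for those gaps" suggestion can be as simple as grouping salary data by level and gender. Below is a hedged sketch with pandas, assuming a hypothetical HR export with level, gender and salary columns; the column names and numbers are made up for illustration and are not from the talk.

```python
import pandas as pd

# Hypothetical HR export: one row per employee (illustrative numbers only).
df = pd.DataFrame({
    "level":  ["senior", "senior", "senior", "mid", "mid", "mid"],
    "gender": ["f", "m", "m", "f", "f", "m"],
    "salary": [78000, 85000, 83000, 60000, 62000, 65000],
})

# Median salary per level and gender, side by side.
medians = df.groupby(["level", "gender"])["salary"].median().unstack("gender")

# Women's median relative to men's, per level, as a percentage gap.
medians["gap_pct"] = 100 * (medians["f"] - medians["m"]) / medians["m"]
print(medians.round(1))
```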
[Audience] Thank you for the presentation, first of all. Is there any way to rewire that part of our brains, and how long would it take? For instance, I'm lucky: in my team about forty percent of the developers are women, and they're really hardcore, but I still catch myself falling back on the stereotype. Is there a way to undo that learning?

[Anna] If you're talking about undoing the mental models we have of what a programmer is like, the best way is to keep challenging them, by meeting more people who don't fit the model and getting to know them as people; that helps a lot. Any women here, and the men are also welcome, who haven't been to the Grace Hopper Celebration of Women in Computing: it is an amazing conference, and it can really help, especially women, to overcome that feeling of 'I don't really belong in tech', because you're surrounded by thousands of women in tech, and it's great. It also helps to overcome the subconscious bias that we women carry ourselves, because we were raised in the same culture, so we need to overcome it just as much as our male peers do. And simply seeking out other people, women, people of color, people who are not like you, and saying, hey, let's talk about Python, or let's talk about programming, or whatever, really helps to change how you see things. One thing I ran into myself: it's not just color or gender, it's also disabilities and things like that. I worked at a history-of-medicine library when I was younger, and a woman came in who was speaking very, very slowly, and I found myself being kind of dismissive about her, thinking, well, OK, she's not so bright, because at that time I equated fluency of speech with intelligence. It turned out she was working on her PhD on muscular dystrophy, which she had, and which was why she was speaking so slowly. The woman was brilliant. I just hadn't personally encountered someone like her before, and it challenged my own mental model of intelligence and verbal fluency. So the more you do that, with whatever mental models you have, the better off you are. Now, if anyone here is wondering why we want more diversity at all, I've got a talk called 'Diversity as a Dependency' that you can look up on YouTube. I'm assuming everyone here has already gotten past that part, but if you want to review it, you're welcome to look it up. Other questions?
[Audience] At the beginning of your talk, you said a couple of times that these individuals 'helped to compute' and 'helped to create'. I kept wondering: why not just say that they computed and created? Is there some kind of impostor syndrome at work there?

[Anna] Sorry, it's really hard to hear up here. The question is about impostor syndrome; I was having trouble hearing it.

[Audience] Perhaps there was a bit of impostor syndrome in your presentation as well: in the beginning you described those women as having 'helped to create' and 'helped to compute', as opposed to 'created' and 'computed'.

[Anna] The women I mentioned as having 'helped create' something were co-creators. Grace Hopper created certain things and discovered certain things herself, but the women I described as having helped create were actually co-creators. But I could be wrong, and I know that I suffer from impostor syndrome, so it's always good to point things like that out. Thank you.
[Audience] Thank you, that was extremely interesting. You said earlier in your talk that there is no correlation between intelligence and the propensity to fall victim to blind spots. So why doesn't our definition, or conception, of intelligence include what seems to be quite an important cognitive ability, namely not falling victim to blind spots?

[Anna] The study I mentioned is actually fairly recent, so people are still figuring this stuff out. Emotional intelligence and social intelligence are different from cognitive intelligence; that's part of it. But it's possible that we all fall into these things simply because of how our brains are made, so they end up as orthogonal axes: whether you have a bias blind spot and whether you are really, really intelligent are completely uncorrelated. That's what they found in the study. If you want to redefine intelligence so that it incorporates these things, I don't know; you'd have to look at the study and see how they decided that.

[Audience] Actually, I'm thinking that a definition of intelligence which would say that Isaac Newton, one of the greatest geniuses who ever lived, was not intelligent seems odd.
He would fall foul of that criterion, and nevertheless he created modern science. There is also a practical question about whether to revise using university credentials as a hiring criterion. Suppose your company spends five to ten million euros a year on salaries; you can't ask your people to spend a huge amount of time interviewing hundreds of candidates, so where someone studied becomes one of the criteria you use. All other things being equal, wouldn't you rather hire somebody who graduated from Stanford, like you, than somebody you know less about?
[Anna] I agree that it's a hard problem. Having a degree from a certain university tells you where someone got their skills, which may or may not be relevant to how they do the job. For example, take someone with a degree in music or art or psychology from Stanford who then went to a boot camp to learn how to program: are they better or worse than someone who got a CS degree from IIT or from my university? I don't know; you would have to look at the actual skills. And one of the problems you run into, as we heard in yesterday's keynote about the many paths people follow into programming, is that women tend to get into programming through varied paths, rather than 'I started programming in high school, then I did a computer science degree, then I got my job.' Women tend to take a more circuitous path, so if you only go for people who got a CS degree at a university, then by that criterion alone you are going to weed out a lot more women than men, simply because of the way women get into technology. That's a problem you have to decide how to address. I can't tell you how to address it, other than to say: be aware of it, and think about what kinds of ways you can change things a little bit.
[Audience] (Question about bias in data and machine learning; largely inaudible.)

[Anna] If you didn't go to the 'despicable machines' talk yesterday, you should watch the video, because that is exactly something they ran into: problems with machine learning because of the corpus they used, so the results they got were not unbiased. We need to be aware of that as an issue. The results of our machine learning are only as good as the training data we put into it, and a lot of training data is going to be biased: if it's from news sources it's going to be biased, if it's from prison records it's going to be biased. Being aware of that, and figuring out how you're going to address it, is a bit above my pay grade. It is hard for us to recognize some of these things ourselves, so sometimes you need someone else to look at your data, someone who comes from a different perspective and who may be able to see things that you don't. Having people from different perspectives really does help. It's like testing: you want someone testing your results, or your program, who wasn't deeply involved in the problem, because when you are deep into the problem, busy programming or developing, you have your whole mental model of how this should look and what it really means, you have all these expectations already built in, and you're not going to see the problems. You need that second pair of eyes from someone who hasn't been so deeply involved. Open source is awesome for this; as Eric Raymond said, with enough eyeballs, all bugs are shallow. Maybe we need more eyeballs on these programs we use, and on the results we're getting, to make sure they're not coming up with biased results before we use them for decision-making about people.

[Moderator] Do we have time for one more question? Yes.
[Audience] As you can see, I am a woman of color. What can I do to help other people overcome these cognitive biases?

[Anna] You can give talks! Seriously: submit talks to conferences. We self-select out. I've heard the excuses: 'oh, I don't know enough about such-and-such, I didn't write the book on it, therefore I don't know enough to give a talk on it', whereas I've seen someone say 'hey, I've learned about this thing and used it once, so I can give a talk on it.' Some of us women need to be more like that sometimes. We need to be out there giving talks, technical talks, not just soft talks like this one, showing that we are part of the technical community and valid members of it. And when someone does come up to us and asks, 'so, are you a programmer?', just say yes: 'yes, and I use Python in this way and that way', and engage them in a conversation, and help them realize that maybe that was not a useful question and that there are better questions out there they could be asking. The more we get to know each other as people, the more that will help.
[Audience] Thank you so much for the great talk. You mentioned removing names and genders from CVs to prevent selection bias when you first see someone's CV. But a lot of the time you actually have to interview someone in person, or over Skype or something. Short of asking all candidates to put a bag over their head and disguise their voices, do you have any tips on how we can avoid this kind of bias when we're actually seeing them in person and asking these questions?

[Anna] Once you've got to the interview stage, you've presumably already put them through some level of skills testing, so you know they have some level of skills. If you haven't done that, maybe use GapJumpers or something like it to help you build ways to do that testing, so that you know they do have the skills before they get as far as an interview. And then, as I mentioned, you can help short-circuit your own bias by looking for the strengths, actively looking for the abilities, for the skills, for the reasons to hire, because you're automatically already going to be finding all the reasons not to. Start focusing on the reasons to hire this person, on why they are a good candidate, and that will help you get beyond the hypercriticality we have when we see someone who doesn't fit the model.

[Moderator] I think that was the last question. Thank you.

Metadata

Formal Metadata

Title: Overcoming Cognitive Bias
Series Title: EuroPython 2017
Author: Ravenscroft, Anna
License: CC Attribution - NonCommercial - ShareAlike 3.0 Unported:
You may use, modify, copy, distribute and make publicly available the work or its content, in unchanged or modified form, for any legal, non-commercial purpose, provided that you credit the author/rights holder in the manner specified and that you pass on the work or this content, including in modified form, only under the terms of this license.
DOI: 10.5446/33680
Publisher: EuroPython
Publication Year: 2017
Language: English

Content Metadata

Subject Area: Computer Science
Abstract: Overcoming Cognitive Bias [EuroPython 2017 - Talk - 2017-07-14 - Anfiteatro 2] [Rimini, Italy] Starting with a brief description of how built-in mechanisms in our brains lead to cognitive bias, the talk addresses how a variety of cognitive biases manifest in the Python and tech communities, and how to overcome them.
