Sometimes the questions are complicated
Formal Metadata

Number of Parts: 96
License: CC Attribution - NonCommercial - ShareAlike 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose, as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared, also in adapted form, only under the conditions of this license.
Identifiers: 10.5446/51846 (DOI)
NDC Oslo 2016, part 74 / 96
Transcript: English (auto-generated)
00:04
Hey everyone, can everyone hear me all right? Great, thanks. Some questions are like Facebook relationship statuses, right? "It's complicated," and it gets very interesting when you start adding your parents there. But in almost every company that I've worked at, every team that I've been part of, either I've been the only woman on the team or there's been maybe one other. And I'm talking about the course of 15 to 20 years of working in the tech industry. So at some point I started thinking: why is this? This question started to bug me a lot, and I really wanted to find the answer.

Before we get going, something about me, to give you a little bit of my nerd credentials. I've been in the tech industry for 15 to 20 years. I started out writing systems using the FoxPro screen builder. Yep, really old. I moved on to C++ and C#, and I've written systems that biologists still use today to analyze DNA: they run their qPCR equipment and get their data analysis from it. I've also been part of teams that developed 911 emergency response software, and it's still active in major US cities today, helping people. Somewhere along the line I totally fell in love with event-driven architecture and designing reliable systems, so I started looking more into messaging architectures, and then I joined Particular Software. I'm currently a developer at Particular Software, the makers of NServiceBus; they create APIs that help developers build very impactful software. So if you want to talk tech to me, please come over to the Particular Software booth. We can hang out and talk tech all day.

But today I want to share some of my experiences in how we can make the industry more diverse and more inclusive. So why is diversity important? There are so many studies showing that teams with more diverse people, people from different cultures, races, and genders, can find very creative solutions. And yet, when we look at our own teams, that's not the case. So
02:46
instead of talking stats and numbers, I want to share a few stories with you. This first one is from the early 1900s. Mary Anderson wanted to enjoy New York, so she went there, and she's in this little trolley car, looking out the window, trying to absorb all of magnificent New York. Then it started to rain, and what she noticed was that the driver of the trolley car had stuck his head out of the window and was frantically trying to wipe away the rain with his hand, all the while driving. This was a time when windshield wipers hadn't been invented yet. Mary saw a problem there, and she was an inventor. So she went back home and invented a very simple mechanical device: a spring-loaded arm with a rubber blade attached, mounted on the outside of the car, which the driver could operate from the inside with a simple turning handle. She patented this in 1903. But the groupthink, the common way of thinking at the time, was: woman, what are you thinking? This is very distracting to drivers. What is possibly so cool about this? We don't want it, go away. Then there was another woman, Charlotte Bridgewood, who looked at Mary's patent and thought it was a brilliant idea, but one that could be improved upon: the driver still had to turn the knob manually, so she automated it and made the patent better. Unfortunately, it wasn't until about 20 years later that the industry adopted this as a safety mechanism and said, yes, all automobiles should have this, this is important. Without Mary thinking outside the box, the common view at the time remained that it was perfectly acceptable for people to stick half their bodies out of their cars to wipe their windshields. So this sort of thinking outside the box is important.

Now let's talk about Dr. Alice Stewart. She was a fantastic doctor, a pioneer, and there's a book about her titled The Woman Who Knew Too Much. Now, she was a doctor in the 1950s;
05:44
there weren't a lot of women doctors in her time. She was dealing with a very interesting problem: she had found that children were dying of leukemia at an alarming rate, and her own godchild had died, so she wanted to get to the bottom of it. And she made an alarming discovery: the children who had died had parents with access to proper health care; these were well-off people. The common practice at the time was that, as part of prenatal care, pregnant women were X-rayed to check the wellness of the baby. She found this and published a study in 1956. Her study, again, was not accepted right away, and she had to keep proving it with more and more data, until the 1970s, when you started seeing those big signs in front of X-ray rooms saying, hey, if you're pregnant, watch out, and the techs started asking: are you pregnant? You shouldn't be being X-rayed. Again, the common thinking among doctors at the time was that X-rays were a cool new technology; they used X-rays to figure out problems, and they couldn't believe X-rays could cause one. Once more it took a person on the outside looking in to identify the problem, and it was a very big problem.

Now let's talk about our wonderful Dr. Grace Hopper. She's a woman in tech, right? She coined the word "debugging"; we're all devs, so we understand what that is. The common thinking at the time was that computers were fantastic machines for performing calculations. Dr. Grace Hopper challenged that idea: she felt computers could do so much more. She thought that if we could write instructions in a human-understandable language, then we as human beings could write cool new applications, and those sets of instructions could then be converted into machine-understandable language. And then you have compilers. So she paved the way toward a whole new world, a whole new line of apps. And she said that the most dangerous phrase in the language is "we've always done it this way." That's the problem, right? We get set in our ways, and when we are a group of the same type of people, we tend to look at things the same way; we don't look at problems differently.

Now, we might think this is all in the past. Well, here's one from the 2000s. How many of you have watched the
09:05
movie Concussion? Anybody? OK. There's a movie called Concussion, and it's based on Dr. Omalu's story. Dr. Bennet Omalu is originally from Nigeria and immigrated to the United States; he's a forensic pathologist. What he did in 2002 was conduct the autopsy of a very famous American football player, Mike Webster, who used to be very, very famous playing for the Steelers. Mike Webster was only 50 years old, and before he died his life was in shambles: he went from being famous and admired to, in his final years, literally living on the streets and losing his mind. But when he died and Dr. Omalu started digging into the reasons, he actually found a new disease. He named it CTE, chronic traumatic encephalopathy: chronic meaning repeated, traumatic meaning injuries, and it causes brain damage. If you look at it, it's common sense: bash your head repeatedly and you're going to end up with brain damage. But he published his study in a medical journal, and the NFL is a huge organization with a lot of money involved, billions of dollars; an ad spot for the Super Bowl alone runs several million dollars. It's a huge industry, and it took the NFL until about 2009 to actually put up some concussion guidelines for their players. You've got more than 80 ex-NFL players who have been diagnosed with CTE, and more than 5,000 ex-NFL players have sued the league. And yet today, Roger Goodell, the commissioner of the NFL, still claims that football is safe. But the point here is that Dr. Omalu didn't know anything about football until he actually performed Mike Webster's autopsy; then he dug deep, and he was able to find the problem. Again, looking at things from a different angle helps find new solutions.

Now, going back to tech: Stack Overflow conducted a survey in 2015, and in that survey
12:01
they surveyed about 26,000 developers from some 157 countries. Out of those 26,000 people, only 5.8 percent were women. That's pretty sad, and it wasn't always the case. In the 1980s, almost 40 percent of the people in tech were women; now, at every major company, you're talking less than 20 percent. Why is that? In the 1980s, personal computers started being marketed, and they were targeted more at boys. You had ads like this one: here's a personal computer, and it even says, "Because the sooner your child starts, the further he'll go." How much more targeted can you get? And here's another one: "ADAM helps prepare kids for college. And helps pay for it, too." And again, there's a picture of a boy. What happened was that, because personal computers were marketed as toys, and more for boys, boys were playing with them and getting involved with computers at an early age. So when college started, even in introductory-level courses, when women came in, a lot of the men in the class already knew this stuff. And when you're in a class where you feel like you're the only one who doesn't know the material, and you look around and everybody else seems to know what they're doing, you feel left out, and you start to question yourself: maybe this isn't for me. That sort of feeling sets in, and a lot of women did leave computer science classes.

Now I want to share my own personal story.
14:20
This was also in the '80s, when my school started a computer center. I'm originally from South India (I now live in the US), and in South India the temperature can be 30 degrees Celsius with 80 percent humidity. We didn't have air conditioners in classes, just ceiling fans. But when the computer center started, it was like a clean room: to go in, you had to wear little booties to cover your feet, even if you were wearing socks. It was so fascinating. So I walked in, and a teacher who taught computer science kind of pulled me in and talked to me about different summer classes. She wanted me to take the summer class in BASIC programming. I was like, oh my god, this is a lot of money. It was 300 rupees at the time, which converts to roughly five or six US dollars now, but back in the day it was a lot of money, and being from a middle-class family, I wasn't sure my family could afford it. But my parents were fully supportive of me doing the class, and they signed me up.

I took the class, and I was the smallest person in it; everybody else was a grown-up or a college kid. I didn't know a lot of what was being taught; everything was brand new, and I was the odd person in the room. I was miserable, because I had felt good about myself, I was a student who got good grades, and that class was horrible: I knew how to solve problems, but I had to write them down, I had to draw flowcharts, and that part of the training was challenging my brain to think a certain way, a different way. I hadn't exercised my brain muscles like that, so I found it very difficult, and I didn't do well in that class. And I felt the same way: this isn't for me. But then my teacher pulled me aside again and said, no, try a different class, and she signed me up for more classes. It worked out well for me, and when I took computer classes in high school, it was just so much easier, because it was the same material being taught and I already knew how to think that way; it all just started to fall into place. For me, that was the big, big change, and I had the right people in my life at the time to push me in and give me that encouragement.

But sadly, the culture and society aren't that way: we tend to associate women more with caregiving and family, and men more with career, as providers. Now, the thing is, our brain is awesome. It's like a
17:45
supercomputer: it's making millions of calculations a second, so many computations, and it can do that because of the amount of information it has been fed. Now, in computer science there are two famously hard problems for developers: naming things and cache invalidation. Well, bias is like a cache: your human brain takes certain shortcuts because of all its prior knowledge. And if you've fed it hundreds and hundreds of years of stereotypes, that's what's built in. Sometimes we don't even notice some of the decisions we make; we tend to make them unconsciously, and that's the problem.

Talking about stereotypes: I googled the word "CEO." Google gave me a wonderful definition of what a CEO is, the top person in the organization, and next to it a picture of a white dude in his early-to-late thirties. Then I hit the Images tab. For heaven's sake, even the icon and the clip art are of a dude with a tie. These are the kinds of stereotypes that get pushed into our heads, and we keep seeing images that reinforce them. This is what we're fed. I'm sure that if you google "programmer," the results won't be far off: you'll see a lot more dudes than you see minorities or women.

Now, I'm a last-minute parent: summer has started and school has ended.
19:44
So, frantically trying to figure out what the summer camps were, I did what a last-minute parent does: I googled for available summer camps, literally a day before I flew to NDC. What I got (I'm sure you can't read this) is from the City of Temecula's website, listing all the summer camps. One was "Gears to Robots," which had some really cool stuff: LEGO bots, building all kinds of robots and things like that. It seemed really cool. Then I saw two more camps, and they seemed to target girls: a girls' retreat camp and an "If I Were a Princess" camp. When I read the description, well, let me read it to you: "Join us for this fun-filled summer camp. Spend your mornings creating arts and crafts, decorating cookies, designing jewelry, painting and more." So this, I guess, is what we want our five-to-ten-year-old girls to learn in summer, while there are kids out there building bots, and this is what's being fed to our girls. And the "If I Were a Princess" camp has things like good manners, learning to play games, and "we'll work on our table manners." Again, it's material targeted at the stereotype of women as caregivers and men as providers, and this kind of thing starts very, very early. Our girls are more than just princesses.

And if you look at this kind of stereotyping and bias, it's very, very prevalent in the toy industry. You've got girl toys and boy toys: if you walk into a major toy store, there's a pink aisle full of fluffy stuff and frills, and if you go to the boys' aisle, you've got bots and building things. Girls start to make these associations at a very, very early age. My own little daughter is nine now, and when I took her to a toy store and tried, hey, how about this, how about this, she said: no, Mom, that's a boy toy. I don't want that. Don't you get it? I don't want that.
22:21
I was like, wow, she's nine and she's already got it in her head that these are toys she doesn't want to be associated with. This is where Debbie Sterling comes in. She's a woman in tech, and she wanted to challenge this bias in the toy industry, so she started a company called GoldieBlox. What she found was that how girls process information is different from how boys process information: girls love stories. So she built toolkits for girls where they can build things, challenging their spatial skills, and she created this really cool storyline and cast of characters: GoldieBlox, and Ruby Rails, who's a programmer, of course. So she's got this whole line, and when my daughter picked up the GoldieBlox set, it was kind of cool: there's the GoldieBlox invention diary, so building with it is like reading Goldie's secret diary, and she was all into it, trying to build stuff. So, with Debbie's permission, I want to play a short video for you.
25:10
[A video plays.] Toys play a very, very big role in how children see things, and this kind of thing starts at middle-school age. We have a pipeline problem, right? We feed these stereotypes, and in the '80s you had Revenge of the Nerds, WarGames, and all those kinds of movies that gave nerds a bad name, and girls didn't want to be associated with being called a nerd, so they kept away from computers, because that was a nerdy thing to do. So on one end you have the pipeline issue, where all these stereotypes push girls away from tech, and on the other hand you've got the inclusiveness problem: we already have women in tech, and they are leaving tech, because of various problems. Let's talk about how
26:02
the hiring process can affect this. Dr. Corinne Moss-Racusin, a social psychologist, conducted a very simple study. She took a fictional resume, an identical resume with just one change: the name, John versus Jennifer. She passed that resume around, for a lab manager position, to a bunch of scientists, both men and women, to see the feedback that Jennifer got versus John. People found John to be more assertive; they felt he was more confident and competent. Jennifer, sadly, didn't fare that well: even women felt that Jennifer didn't have that much experience. It's the same resume. Some of the scientists even went on to say things like, I'm not willing to mentor Jennifer, because I think it would be a waste of my time; I don't want to deal with this. Sadly, it's the same resume with just the name changed, and these are scientists, very objective people who look at data to make rational, objective decisions. And yet this happened, and it's because of our existing cognitive biases. So what's in a name? Unfortunately, everything.

Now, Norway, I have to thank you guys. I mean, you
27:45
have such a wonderful policy when it comes to parental leave; it's amazing. But sadly, in the rest of the world, that's not the case. When a woman gets pregnant, it's treated as a resourcing issue. When a man is about to have a child, he's seen as taking on more responsibility, so he gets promoted more easily; it's not the same story for women. A lot of women don't even go to their managers to say they're pregnant, because they know that if it's salary appraisal time, they could get dinged, seen as resources that can't work effectively during that period. This sort of thing happened to me when I was pregnant with my daughter. I had been asking my manager for C# work for a long time; I had several years of C# experience, and I said, hey, please let me have this. A big project came down the pipeline with a big feature, and I was assigned to it. Then I became pregnant with my daughter. I was just two and a half months pregnant, but I told my manager: hey, I just want you to be aware, so that a few months from now you can prepare. Immediately, I was taken off that assignment, and the task was assigned to somebody else. I mean, come on, people: I'm going to have the baby several months later, not that exact instant. I can write code even when I'm pregnant; it's not like I'm losing my skills. Unfortunately, this is sadly the case at many, many companies.

Then Fortune magazine published a study called the Hastings study.
29:42
Women of color in STEM: of all the subjects they had interviewed, every single one had faced bias. 100 percent of them. There was not one woman of color who had not faced bias; they had all faced some type of harassment or another. Asian women are seen as very demure, calm, less outspoken, while Latinas and Black women get labeled as the angry Black woman; it's very different. Now, when we're having conversations, it could be in Slack or HipChat, and you could just be commenting on something, and some of those comments get perceived as angry. How did you come to that conclusion from a GitHub comment or a Slack comment? How can you attach an emotion like anger to it? Where does that come from? But this tends to affect the whole line of conversation, because all of a sudden you lose all the information the other person is trying to convey; now you're associating this negative emotion with them and taking away all the good points this person is trying to highlight.

Then you've got things like #ILookLikeAnEngineer. There's the Twitter hashtag: this woman, Isis, is a full-stack engineer at a company, and all she did was appear in her company's recruitment ad. That ad ran in several places in San Francisco, and there was a whole bunch of sexist comments, which started the #ILookLikeAnEngineer movement, where a lot of women posted pictures of what they do. It highlighted that there are so many brilliant women out there doing amazing things, and at the same time it highlighted all the sexist stuff that goes on today. There's GamerGate, Shirtgate, #DistractinglySexy; you could just keep going with the Twitter hashtags.

Then, of course, the interruptions. When Kanye West goes and interrupts Taylor Swift, the whole world reacts: how dare you, Kanye? What were you thinking, man?
32:23
But this sort of thing happens regularly in meetings, on an everyday basis, and who's pulling out the Kanye card then? Unfortunately, people get talked over in meetings, interrupted; this happens day to day. So what are some of the things you can do today to effect change in the organizations where you work? First of all, you have to speak up, because if you see a problem and you don't highlight it within your organization, how can we effect change? People effect change. You can't just stuff in a policy and say, hey, follow this policy; that's not going to work. But when people start having meaningful conversations, that's when change happens. People change people; conversations change people; and that can lead to many more effective policies being created. We can't have the mindset of "this is not my problem." When we see something, we have to speak up. And we should also challenge our own selves. Harvard University came up with this
33:45
test it's the it's called the implicit association test it tests your own biases and there's several different tests that you can take regarding gender career iat skin tone test and so on now it's just a five to ten
34:03
minute test like like i said this this this is if you go to the website these are the several tests you can take and the skin tone test is also very interesting because we tend to associate fairness to right and like you know dark skin to like i don't know what i mean you see gandalf the gray he was gandalf the
34:24
great the wizard but it wasn't until he became gandalf the white he was like so awesome right he defeated balrog and then became gandalf the white now again if you look at elves like very fair people and they were standing up for what was right then you look on one side at
34:44
orcs: mean-looking, dark, ogre-looking people. Again, these sorts of stereotypes get built up. Unfortunately, in the U.S. there was a study that was conducted which found that teachers tend to punish black boys
35:03
more than other kids, even for the same type of offense, like talking in class; the black boy would get punished more than the white boy. And this is not just white teachers; it's even black teachers
35:20
in the classroom, and this, again, is because of bias. Yesterday in Troy's keynote there was a GIF that I found interesting. He was talking about bendable computers, and one guy, the white dude, was working with a computer that was perfectly bendable. Then you see the black guy
35:42
trying to do the same thing to a Mac. Again, look at how the black guy is being stereotyped in that way. You don't notice it, but once you start thinking about these biases, you start to pay attention, and then you start to see more problems. Now, the IAT itself is a very simple test.
36:04
In this test, the gender-career IAT, all you're asked to do is associate male names and female names with either career or family. The test itself is super simple. The first two rounds are just baseline: you have to press the E
36:21
and I keys really fast as names pop up in the middle. The first two rounds just want to establish, hey, do you recognize male names and female names? Then the same with words that are associated with family and career. Once this baseline is
36:41
established, now comes the interesting part. As names appear in the middle, you're supposed to associate male with career and female with family. This is our normal bias, our normal way of thinking, so we go fast; as the names pop up,
37:00
we press E or I. Now comes the very interesting part, where this is switched up. As names come up, you're asked to associate male specifically with family and female specifically with career. The test knows how fast you went in the previous round and how fast you
37:23
go here. Then, after a bunch more questions, you get the result. This is my own result. This is me, the person talking about gender bias, and my result says I am moderately biased. So this is something to be aware of.
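The comparison described above boils down to measuring how much slower you respond in the "incongruent" pairing than in the "congruent" one. As a rough illustration only (this is not the official IAT scoring algorithm, which uses the more involved Greenwald et al. D-score procedure, and all the reaction times below are made up), a simplified version could look like:

```python
# Simplified sketch of an IAT-style score: compare mean reaction times
# between the two pairing conditions, scaled by the pooled variability.
# NOTE: condition names and reaction-time data are hypothetical.
from statistics import mean, stdev

def iat_style_score(congruent_rts, incongruent_rts):
    """Positive score = slower in the incongruent block (e.g. male+family /
    female+career) than in the congruent block (male+career / female+family),
    suggesting a bias toward the congruent pairing."""
    pooled = congruent_rts + incongruent_rts
    return (mean(incongruent_rts) - mean(congruent_rts)) / stdev(pooled)

# Hypothetical reaction times in milliseconds
congruent = [620, 650, 600, 640, 610]      # male+career / female+family
incongruent = [780, 820, 760, 800, 790]    # male+family / female+career

score = iat_style_score(congruent, incongruent)
print(round(score, 2))  # a clearly positive value for this made-up data
```

The real test also handles error trials, discards outliers, and averages across several blocks, which is why a five-to-ten-minute session is needed rather than a handful of key presses.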
37:42
I'm aware of it, and I'm working on it. The thing about unconscious bias is that once you know you are biased, you can try to fix it; you can learn or do things that go against that bias, to rewire those mental shortcuts. So there are several different things you
38:04
can do. One: there are several companies that offer bias workshops; maybe you can have one of those companies come in and run one. And again, this is where Dr. Corinne Moss-Racusin, the social psychologist who conducted the hiring study,
38:22
did another experiment. She went to a set of scientists, but this time she told them about the John-versus-Jennifer study, and they also attended a gender-bias workshop. The results were vastly different:
38:43
these people became the biggest proponents of diversity in their companies, because they understood what was going on. They understood the bias, and so they were looking out for the things they had previously missed. And if that's not possible, you can
39:01
just take the Harvard test that's publicly available; you can take it anytime. So perhaps have a team IAT session, discuss the results, start a conversation. And there's this wonderful book called Predictably Irrational. It talks about how human behavior works,
39:23
with a whole bunch of data from a whole bunch of experiments. It talks about things like anchoring bias, where you're given three choices: the first choice doesn't have many features and is priced at one extreme; the one in the middle is kind of nice
39:44
and priced moderately; and the third option is priced extremely high. So you as a consumer look at the middle option and go, hey, that's the best choice, I'm going to get that. But that's exactly what the marketing guys want you to pick; that's why it's placed in the middle.
40:01
So you have all these different biases, and this book is wonderful; it talks about all of those things. Now, the other thing about bias is that you have to learn to work against it. There was a really cool experiment conducted by
40:22
Destin Sandlin; there's a YouTube video, I think it's called "The Backwards Bicycle." What he did was take a bicycle and make one change to it: if you turn the handlebars to the right, the bike goes left, and if you turn
40:40
to the left, the bike goes right. That was the one simple change. He told the people trying to ride the bicycle, hey, these are the adjustments; now can you ride it from this end of the stage to that end without putting your feet down? And people try, and it's so hilarious to watch them try. The thing is, they know this is how
41:04
the bike works. They know, but still it's so hard; it's so hard for the brain to make that switch. So dealing with our biases is hard, and it takes time. But the only way to deal with it is to challenge ourselves constantly, to feed ourselves more
41:21
information, and to step outside our own norms, right? The other thing is your screening process: how are you sourcing your candidates? When you have a position open, if you go to friends of friends of
41:42
friends, hey, this is a cool job, I think you should apply, then we tend to bring in people we know, and people we know are probably people who look like us, who are in our comfort circle, right? Then we end up with more people who think like us and look like us, and that's not how you build
42:01
diversity. Again, sometimes you have to go outside your comfort zone and look beyond the places where you would normally search for candidates. And then, meeting moderators; we talked about interruptions in meetings. Anybody can be a meeting moderator. When you see someone
42:22
speak over Jennifer, Sam can speak up and say, hey, can Jennifer finish? Can she finish her thought? Because when you're being interrupted in a meeting, you're trying to get your idea across, and when you can't, you're not being heard.
42:41
That's not proper collaboration. But when someone speaks up and says, hey, can you let her finish, and this could even be for a guy on the team, can Sam finish, right? That puts the onus back on the interrupter, and the person can finish their thought and get their idea across.
43:01
And that's wonderful. This sort of thing works very well where I work, at Particular, where one of our colleagues will say, hey, can Sean finish a sentence, please? We all call each other out when this happens. So the thing about bias is that
43:24
we're not bad people making bad judgments; sometimes, because of the stuff that's been fed to us over a period of time, we tend to make these bad decisions. But unfortunately, if you look at it, bias is nothing
43:40
but discrimination, and discrimination is just plain wrong. The reason I'm standing here today is that no matter what my daughter chooses, arts or tech, if she chooses tech, I want her to be in an environment where she is not discriminated against, where she has the fullest extent of
44:00
opportunities available to her. I know this is a very, very tough topic, but I really appreciate your time, and I would love to hear your stories. Send them to me; my email address is my first name dot last name at me.com, and it's on your program cards too, so you can email me. Thank you so much for your time. I really,
44:24
really appreciate it. Do you have any questions? Thank you. Oh,
44:42
sorry. Yeah, so, I mean, the interview
45:08
process itself is interesting, right? Sometimes you ask a question like, how would you rate your C# skills on a scale of, say, one to ten? I might give you
45:21
a seven, but that doesn't mean my C# skills are only seven-ish, and a guy might give you a nine, but that doesn't mean his C# skills are through the roof, right? So sometimes some of those questions that we ask
45:40
can make an impact, and so can how we evaluate. For example, in some interviews you're asked to go in and draw stuff on the board, to whiteboard a solution. Some people do really well at that, and some people may not be very comfortable with it, whereas if you give them a
46:02
little bit of time, they are perfectly capable of giving you that solution. So you've got to think about, if you want the best candidate, what are all the different avenues by which you can get that information out of the candidate? There are a lot of good articles; one of the things
46:21
I read was about Google's interview process and some of the questions they ask, behavioral-type questions. The interesting thing is that they ask the same type of questions to everyone, so that when you get the results, you can look at them in a meaningful way and make an objective decision.
46:47
Any other questions? Thank you so much for your time.