Code Reviews: Honesty, Kindness, Inspiration: Pick Three
Formal Metadata
Title: Code Reviews: Honesty, Kindness, Inspiration: Pick Three
Series: Ruby Conference 2017
Number of Parts: 69
License: CC Attribution - ShareAlike 3.0 Unported: You are free to use, adapt, copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose, as long as the work is attributed to the author in the manner specified by the author or licensor, and the work or content is shared, also in adapted form, only under the conditions of this license.
Identifiers: 10.5446/37754 (DOI)
Ruby Conference 2017, 63 / 69
Transcript: English (auto-generated)
00:12
So my name is Jacob Stoebel, and how's this sound? Is it too echoey? Is it OK? It's good? OK. The name of this talk is Honesty, Kindness, and Inspiration: Pick Three.
00:22
If you're a live tweeter, I would love it if you talked to me about this talk. If not, no worries. I'll just sort of confess before I get started. This is my first full-length talk, and I'm just honored to be here. It's really cool. Thank you. And I'm just blown away how many people are here at the very last sort of time slot.
00:41
So that's more than I could have ever hoped for. So here we go. I'm going to share a tweet with you that I saw back last spring. It's from a joke account called iamdevloper. The tweet is: code review, can be honest, nice, pick one.
01:02
And it got me thinking, this attitude seems to be the attitude among many developers: I can either give you feedback that is touchy-feely and makes you feel good but is useless, or we can get some real work done, but you'll feel like garbage,
01:21
and that's just the way it is. The tweet received many objections from people who were wondering out loud why feedback can't be both honest and nice. And that is sort of the inspiration for this talk. It's what got me to propose it. Many employers who are rightfully
01:42
seeking high code quality have inadvertently created cultures that encourage code reviews that leave developers feeling unappreciated, overly criticized, and ultimately burned out. Furthermore, women, people of color,
02:02
and developers from other marginalized groups have documented overly harsh, unproductive code reviews that seem more combative than productive. It's basically that the discrimination they face in the workplace comes to a point inside code reviews. And furthermore, when people identifying with those same groups
02:24
give a code review that is perceived as harsh or brutally honest, they can be dismissed as angry, pushy, or difficult to work with. So one of the reasons I wanted to do this talk was as an excuse to learn all about code reviews
02:42
and do research on them. I did a lot of Googling, and I did a lot of talking to many people at this conference, in fact. And there's a really great blog article by Erik Dietrich called How to Use Code Review to Execute Someone's Soul. And he lists several flavors of toxic code reviews.
03:03
I'm about to paraphrase some of them for you, and I've also added in some others from my ramblings around the internet. There's the nitpicking session. It's basically: we're going to talk about how to name variables, how to organize files, clever one-liners. Frankly, I don't think that code reviews are really
03:23
the best place for this sort of thing. We have linters. We have style guides. If an organization can really just agree on one and then also agree to enforce it, I think that's a much better place to do so. So much of this stuff also can come up
03:41
as opinion disguised as fact. So you can have an opinion that variables should be lowercase and underscored. And that is a popularly held opinion in this community, but it's still an opinion. And maybe we just need to call it an opinion and dispose of dressing it up as objective fact.
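To make that linters-over-nitpicks point concrete, here's a minimal sketch of what delegating those opinions to a tool could look like, assuming the team uses RuboCop, a common Ruby linter. The cops shown are real RuboCop cops, but the specific choices are purely illustrative, not a recommendation:

```yaml
# .rubocop.yml: codify the team's style opinions once,
# so code review doesn't have to relitigate them.
Naming/VariableName:
  EnforcedStyle: snake_case   # the "lowercase and underscored" opinion
Layout/IndentationWidth:
  Width: 2                    # indentation settled by the tool, not the reviewer
Metrics/MethodLength:
  Max: 15                     # flag long methods automatically
```

Run the linter in CI and the nitpicks get flagged before a human reviewer ever looks, leaving the review for the bigger questions.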
04:01
Also, and I've talked to several people about this, there can be this sort of pressure as a reviewer to find something, anything, wrong with the code. You can't just pass it off and say, looks good to me, right? It's the feeling that if I don't find something wrong with it, my peers and my supervisor will see me as incompetent or lazy,
04:20
like I couldn't find anything wrong with it. And so what you get is people taking the proverbial red pen, like your 11th grade English teacher, and circling the lower-order things: a formatting problem, indentation, a bad variable name. And you can get this deluge of comments that, while they're nice to have,
04:41
can really bury the lede. They bury the discussion about the more important stuff, like: how does this submission fit into the larger system that you're submitting it to? And I think that's what code reviews are really for. There's also the marathon, where it's like, we're gonna sit down in front of a projector,
05:01
we're gonna go over every single line ad nauseam to make sure it works and make sure it's right. I mean, that may seem like a good idea, and far be it from me to tell you that it's not a good idea for your team. But one, I'd just ask the question: what teams have time for that? And two, I'll just share from my professional life:
05:20
I have known people that are able to get their way simply by talking the longest. They're not rude in the way we would usually consider rude. But when an argument ensues, what they do is dig their heels in; they refuse to concede. If the meeting ends, they'll bring it up at the next meeting.
05:41
And then they win by default because the people they were arguing with got tired, had to leave, had to pick up their kids, whatever it is. So that doesn't sound like the most productive thing either. There's the firing squad and the exit exam and at this point, the relationship has become adversarial.
06:01
It's that I'm going to try to poke as many holes in your code as I can to find deficiencies. If I can find deficiencies, you failed. If I did not find deficiencies, you've passed. And I can understand that, particularly with concerns regarding security and making sure the code works. But I think this can have some adverse side effects.
06:25
You'll spot a toxic code review when it's defended using something along the lines of this statement. I'm just focused on the code, don't take it personally. I find this framework particularly concerning because I think it opens the door to toxic criticism.
06:40
I agree that as developers, we don't want to be too attached to our code, we heard that this morning. But I think the phrase don't take it personally is flawed. We're thought workers, after all. The code we write is a consequence of the way we see the problem, of the way we see the world.
07:02
So telling me that I need to compartmentalize myself and my world view doesn't sound entirely possible to me. Furthermore, in a toxic environment, I can imagine a team where people really don't trust each other. This phrase, don't take it personally,
07:22
could be used for all kinds of abusive behavior. You're incompetent, don't take it personally. Here's a quote from Linus Torvalds, the creator of Linux. Maybe you know what I'm about to say. I'm not a nice person and I don't care about you. I care about the technology and the kernel. That's what's important to me.
07:42
What a great leader. I've given this talk before, and I've been asked the question: but Linus Torvalds has created so much, right? He's given so much to all of us. For that matter, what about Steve Jobs? He was an infamous, brilliant jerk. Maybe brilliant jerks are just worth it? Sure, they're mean,
08:02
but look at their output. Isn't it worth it? For one thing, I think that this ignores the externalities involved when people in power are allowed to abuse that power under the excuse that their output is so great. Those externalities could include team burnout, people being afraid to speak their minds, people being afraid to make mistakes.
08:22
We're a creative profession. We have to have the freedom to make mistakes. We just have to. This behavior can come from the jerk, but it can also come from the people influenced by that jerk. And depending on how far up they are in the hierarchy of the company,
08:41
that bad behavior can spread downwards and outwards, right? And at its worst, we see things like sexual harassment, discrimination, workplace bullying, and sexual assault. And we've heard many examples of this, especially recently. If you add it all up, I'm not sure that brilliant jerks are worth it.
09:02
At its most fundamental level, though, I'm not convinced that we can ask teammates to just leave their emotions at the door. There's a large body of research out there, especially recently, that despite what economists say, we are not rational beings. We are emotional beings. It's really just the core of how our brains work.
09:25
There's more. Again, I've talked to a lot of people at this conference. Everyone I talk to has something to say about code reviews. They either love them or hate them, and if they hate them, I ask them why, because I wanna keep a running tally of all the things that don't work.
09:41
So a lot of you do like code reviews, so I'm not saying this is an awful thing. And I'm about to share with you some things that I think would make them go better. But there is also the bottleneck, and that's where a single person or a team, maybe the security team or something, is responsible for reviewing everyone's code. Nothing can go forward until that supervisor
10:00
or small team has approved everything, right? And as a result, we get people who start writing their code with the objective of getting it past that one person. You don't do it that way; so-and-so wouldn't like it, right? And then everyone starts trying to think a little bit more like that one person, and that's groupthink. And as creative workers,
10:21
that's probably not something that's good for us. Whoops, let's turn that off. Then there's the reviewer who doesn't have enough context; that's gonna be a problem too. I mean, I think that's interesting, because outside reviewers can bring fresh new perspectives in. But I think problems can happen
10:41
when that reviewer doesn't have enough context about the larger system that this thing they're reviewing relates to. So what they end up doing is, again, they start taking out the proverbial red pen, and they start identifying all the things that they can see, but there's probably a lot of other things that they're just not clued into, because they're just not close enough to the project.
11:03
Not necessarily that you shouldn't use outside people, but that can be an issue. Similarly, when someone's too close to the project, they've been working on it themselves for a very long time, and they just miss the forest for the trees. Then there are code reviews that are summative. Summative comes from the education world,
11:22
if you don't know. It's basically an evaluation of something that is finished. It's like a final exam at the end of a class in college. How much did you learn? So a summative evaluation is, how good is this now that you're ready to submit it? As opposed to a formative assessment, which is an evaluation of something as it is taking shape.
11:43
So think about it this way: would you rather get a code review about a bunch of code you wrote over the last month, or would you rather get a code review about a small chunk of code that is still a work in progress and you wrote yesterday? How much context
12:00
switching would you have to do just to get back into the headspace of something you wrote a whole month ago? Sometimes code reviews can be too asynchronous, and if you work on open source, you know exactly what I'm talking about. So let's say I make a commit on a Monday afternoon, and my colleague starts reviewing it Tuesday morning. But then she has a family emergency,
12:20
she has to leave work, and she doesn't get around to finishing it until Wednesday. But by then, things have changed, and of course, the code has to change too. And unfortunately, my poor teammate wasn't looped in; it was my fault. She wasn't looped into this change, and she finished that code review even though it was basically useless, right? Now that must be pretty disheartening to her,
12:40
because she invested all that time in something that ended up not being needed. A lot of us are remote workers, or asynchronous workers. I do some of my work that way too. We have to be able to work asynchronously, so I'm not saying we shouldn't do that. But it's a challenge, right? Humans are hardwired for face-to-face communication, doing things together with another human at the same time in the same place.
13:03
And frankly, there's the feeling that code review is just homework. It's just a bunch of work piled on your desk: on top of everything else you're supposed to do, you're supposed to review all this other stuff. And it just feels like too much. I've talked to people who have said that code review is required for every commit.
13:21
I think that's a great idea. But I've spoken to people that have felt that it sometimes goes a little bit too far, right? It's like, we can think of a human as another step in your CI server, right? A human will review every single change. But humans have to do context switching, and they don't do it as well as a machine.
13:43
So let's say there's a really trivial change, and I need to get someone to review it so we can push it out in a couple of hours. That means my colleague has to stop what they're doing and review it. And maybe it only takes five minutes, but they had to change context twice
14:00
just to review that code. And maybe it was trivial. So was it worth it? I don't know. But that's a price to pay. And then finally, all of the above can lead to this one. No one believing in the process, right? So it's sort of a self-fulfilling prophecy, right? It's like no one thinks that code review is working at their company,
14:21
and so it's definitely not gonna work at their company. And then it just gets worse and worse. So where does that leave us? The thesis of this talk is definitely not stop doing code reviews. I'm hoping there's a way we can find our way to better code reviews. We need to arrive at the best code possible.
14:42
So there's the poop sandwich, which maybe you've heard of. The basic idea is that you sandwich a not-so-good thing, or a hard-to-hear thing, between two good things. I really like how you did X, but the part where you did Y was not so good; you should do Z instead. On the whole, I loved how it was, blah, blah, blah. And I think this is a little bit better because
15:02
it helps develop trust within teams, because the stuff that's hard to hear is padded between two nice things. It would be a little tricky to be just uniformly mean in something like that. In a past life, a past career, I was a high school drama teacher. And balancing critical feedback
15:21
with the need to get better was really important to me, as was protecting the egos of fragile 14-year-olds. We're all 14-year-olds to some extent; deep down, we are. Let's just acknowledge that. The problem with that was that the receiver
15:41
would sort of hear their feedback and think of it like a report card. They'd say, okay, I got two good grades and one bad grade, right? I can take these two good grades and think of them like trophies and put them on my shelf. And that bad grade, I can put it in my closet and not think about it. So these two things, I don't have to get better at; I'm just good at them. That bad thing, I'll just forget about it
16:00
because I'll never get better at it. When in fact, the truth is they totally can get better at the bad thing, and the good things could be even better still. So I wanted something better. And fortunately, I learned about something better, and that's what this talk is about.
16:20
It's called the Liz Lerman Critical Response Process. The objective is to inspire the creator to go back to their work with fresh eyes, excited about making their creation even better. Liz Lerman proposes that the concept of balancing kindness and honesty is a false dichotomy. Really, it's about inspiration.
16:40
Inspiration to go back to your work and make it better. It's like you can't wait to go back to your keyboard and start working on all the suggestions you got. The story takes us, believe it or not, to the dance world. Liz Lerman is and was a dancer. And the dance world is and was notoriously harsh in the way it gave feedback to people.
17:00
She wanted a framework where creators were encouraged to think critically about their work and wanted to make it better as opposed to just earning a good grade or getting a job or earning approval from people who could further their career. It makes sense to me. If your number one objective is to get a good grade, get a job, what impact would you expect
17:23
that would have on your creativity? The critical response process started out being something used for artists, but it's not constrained to works we would call capital A art. It has been used by artists, but also administrators, scientists, academics, and even the corporate sector.
17:41
It's really for anything that's creative. And I think software is the perfect use case for this. There are three roles in this process. There's the creator, the responders, and the facilitator. The creator or creators are the people that are directly responsible for the work. The responders are people that are offering
18:01
their honest and encouraging feedback to the work. Not nice, encouraging. The facilitator keeps us on track by following the framework. So it's worth noting that roles can sometimes get blurred. So if there's not a facilitator in the room, great. We're all facilitators. Sometimes everyone in the room is a shared owner of the work.
18:23
They all worked on it. And that means we're all gonna play the role of both creator and responder. There are four steps. Step one is statements of meaning. The responders are gonna state what about the work had meaning or stood out. And statements should not be evaluative, good or bad.
18:42
There's a really great talk from RailsConf 2016 by Nadia Odunayo. It's called The Guest: A Guide to Code Hospitality. You should really see it if you haven't, or if you weren't there. And in it, she proposes that we think about introducing a person to a codebase in the same way that a host welcomes a guest
19:00
into their home. I mean, if you were a guest in someone's home, you probably wouldn't, upon first walking in, comment on the dirty clothes on the floor. It's rude. So here's a good example. I noticed that this code is written in a functional style. That is not evaluative. We can have a great discussion about that, but it's neither good nor bad.
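For instance, here's a hypothetical chunk of Ruby that might prompt that exact statement of meaning. The method and its data are invented for illustration; what matters is that every step is a side-effect-free transformation, which a responder can observe without judging it good or bad:

```ruby
# Sums the amounts of paid invoices per customer.
# Written in a functional style: the input array is never mutated;
# data flows through a chain of transformations instead.
def totals_by_customer(invoices)
  invoices
    .select { |inv| inv[:status] == :paid }
    .group_by { |inv| inv[:customer] }
    .transform_values { |invs| invs.sum { |inv| inv[:amount] } }
end
```

A statement of meaning here ("I noticed this is written as a pipeline of transformations") opens a discussion without pre-judging whether, say, an imperative loop would have been better.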
19:21
Here's a bad example. This version of Rails is out of date. You should update to Rails 5. Maybe that's true, but it's evaluative and we're gonna save that for later. To be clear, the process isn't trying to somehow silence important criticism. I just wanna make that clear. We're just trying to save that for later. So step two is questions from the creators.
19:43
Oh, excuse me, I said that wrong, I'm sorry. Creators are gonna ask questions about the work, and responders are allowed to give their opinion about things explicitly asked about. That's the key. So the process aims to put the creator
20:01
in the driver's seat. The creator gets to go first by asking questions that they're just dying to know about. You may have done this before, too. When you're writing something, you're like, oh, I just know that that part over there is a bit hacky. I just know that this part over here could run faster. And that's where you get to go first by saying,
20:20
is this part hacky, what do you think? And now responders, they get to answer the question, but only the question and nothing else. Here's a good example. In the docs, was it clear to you what this method does? If not, how could I make it more clear? And here's a good response. I think I understood how to call the method, but not why. Maybe think about including an example use case.
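As a sketch of what acting on that feedback might look like (this method and its behavior are invented for illustration), the revised docs explain the why and include an example use case, not just the how:

```ruby
# Returns a copy of the given filename that is safe to write to disk.
#
# Why: user-supplied names can contain path separators or other
# characters that are unsafe on most filesystems, so we replace
# anything outside a conservative whitelist with underscores.
#
# Example use case: normalizing an uploaded file before saving it.
#   sanitize_filename("q3 report/final?.txt")  # => "q3_report_final_.txt"
def sanitize_filename(name)
  name.gsub(/[^0-9A-Za-z.\-]/, "_")
end
```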
20:42
Hey, that's great, tangible feedback; that's something I can jot down and go fix. Here's a bad example. No, and in general, the docs were not so well organized; the way you should organize it is blah, blah, blah. That's out of scope of the original question. And we could imagine this conversation suddenly spinning out of control into a flame war
21:02
about how to organize documentation. That's not what the creator wanted feedback on. So we're not gonna do that. Step three is neutral questions from responders. The purpose here is to understand the context of the work. The questions should not have an embedded opinion.
21:20
No opinions that just happen to end in a question mark. So to continue that analogy of having a guest in your home, let's say you are taking a shower in the guest bathroom and there's no hot water. You probably wouldn't go to your host and say, why is there no hot water? Instead, you might say something like, is there something special I should know
21:41
about the guest shower? Because you are leaving a space for the possibility that there may be something about the guest shower that you don't know. And you could get a perfectly reasonable response. Hey, the hot water in the guest room, it just takes a while. Just turn it on, wait a few minutes, it'll come on for sure.
22:00
Or I already took a hot shower and we're out of hot water, just wait 30 minutes. That's great. Now you know something better and you can be a better guest. If you've ever given a talk at a conference and you've heard someone give their opinion with a question mark at the end, you know what I'm talking about. We don't hear it at this conference though.
22:21
Here's a good example: what ideas guided you to select FactoryBot for this project? Bad example: what were you thinking when you chose FactoryBot for this project? Okay, starting a sentence with what were you thinking makes your opinion clear. It's rude, but it's also out of bounds
22:42
because that's an opinion. We're not sharing opinions, we're just trying to understand the work better. Opinions come at the end. Which brings us to step four, opinions. Responders may give their opinions on the work with the consent of the creators. Okay, here's an example. I have an opinion about response times in production.
23:02
Would you like to hear it? Creator says, sure, go for it. And they give their opinion. Or the responder could say, I have an opinion about the use of FactoryBot in this project, would you like to hear it? No thanks. Moving on. Right? The process believes that the creator knows best
23:22
how the conversation should be steered in order to yield best results. It trusts that the creator can see when the conversation is going down an unproductive path. In this case, the creator may have their reasons. They may know that they're committed to that library at this time and it's really just not practical to change it.
23:42
Or maybe they could have all kinds of reasons. But it gives the creator the power to redirect the conversation to something that's going to be productive. The nice thing about this framework is that it's flexible. I've given this talk at my local meetup
24:00
and I've gotten several questions before. Here are some of them. Would this work on GitHub or Slack? GitHub, I don't think it would. GitHub, as in a conversation on a PR? Definitely not. That's just sort of a pile-on of comments. That's, again, the proverbial red pen. I think there's a time and place for that, and GitHub's probably great for that.
24:20
But not this. People need to be in the same space, real or virtual, to have that conversation together. Slack would work as long as people all agree to do it at the same time. If it's gonna be asynchronous, probably not. It needs to be a conversation. Again, we're humans, and we're hardwired
24:40
for synchronous conversation, and I think this process really needs that. Does this mean I can say no thanks to my manager's opinions? Probably not. But it does mean that you and your manager might be able to work together and establish a shared understanding of when and how feedback could be given.
25:00
Maybe, and again, it's entirely between you and your manager and your team. Maybe feedback at 4 p.m. on a Friday is not the best time. Or 8 a.m. on a Monday before anyone's even started work. Maybe we've discussed ahead of time that that's off limits. But if feedback is in bounds
25:20
of that shared understanding, then there's an implied understanding that, yeah, that opinion is welcome at that time, based on the working relationship and trust you have. The framework seems too rigid; does it have to be? Definitely not. Okay, now, all of you: I took a one-day training on this, but you don't need to.
25:41
All of you are now empowered to use this and I'm gonna share the slides with you too. So if you all are ever wanting to have a conversation like this, you all are empowered to say you know what, I think it's time, we're just gonna skip ahead to step four. Or now that we've talked about opinions, we're gonna skip back to step one. You can totally do that based on just what seems the most appropriate.
26:01
I'll also point out that not everyone in the process needs to know it by name. Let's say you're the only person in the conversation that knows about this, and you're responding to someone's work. If you start by asking them questions to understand the context of their work
26:23
before you start giving your opinion, that's a win, right? Because you now understand it better and you can probably give better opinions. On the other hand, if you're getting feedback and no one else knows this process, only you, if you're the creator and you're getting feedback and you start by asking particular questions about your code that you're just dying to know,
26:43
that's a win too, because you steered the conversation in a way that you know will be productive. Again, when I gave this talk recently, somebody commented, and I wrote it down: great teams do this naturally, they just don't have a name for it. I think that's totally true. For the rest of us, we have a process.
27:02
For the imperfect teams that we're a part of. Do you really expect jerks to go along with this? I don't. Honestly, I don't. But I also think and hope that in most work environments, there may be this many jerks, but there may be this many people
27:21
who would love to use a process like this if they ever knew about it. And I think maybe the issue is that we're designing code reviews without any kind of intentionality. And if a little intentionality were injected, I think we could all be a lot happier.
27:41
Are there ever times when this framework wouldn't be the most efficient? If the network's down, that's not a good time to use it. If there's a fire in the data center, that's not a good time to use it. But you're going to have a conversation afterwards about what happened and how you can make sure it won't happen again. And you want that environment
28:00
to be as blame-free as possible and really be focused on identifying the problem so it won't happen again and who will be responsible for what and what we need to change about our workflows so it doesn't happen. And this is a perfect time, this is a perfect place to do that. So just to recap, there's four steps.
It's statements of meaning, questions from the creator, neutral questions from responders, and then opinions with the consent of the creator. I'm gonna open the floor for questions in just a minute, but I also wanted to give a quick shout-out to the Greater Than Code podcast. If you don't know it, it's a weekly panel-style podcast about the human side of tech. This summer they put out a call for guest contributors to their blog, and because they just seemed so nice, I decided to submit an article, which eventually became this talk, even though I had never done anything like that before. Were it not for that podcast, I wouldn't be standing here right now, so I really appreciate everything they've done to support me, and I think you should check them out if you haven't already. These are some resources, which you might not be able to read from your seat, but you can go to my GitHub, download the whole talk, and see them there. And that's my talk, thank you very much.
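[Editor's note: the four steps recapped above can be sketched as a tiny state machine, for instance a Ruby class a facilitator might use to track which kind of contribution is allowed at each stage. This is purely illustrative; the class and method names are my own invention, not from the talk.]

```ruby
# A minimal sketch of the four-step feedback process described in the talk.
# The names here are hypothetical; they just encode the ordering of the steps.
class CriticalResponseProcess
  STEPS = [
    :statements_of_meaning,  # responders say what stood out, without judgment
    :questions_from_creator, # the creator asks what they're dying to know
    :neutral_questions,      # responders ask context questions, opinion-free
    :opinions_with_consent   # "I have an opinion about X, want to hear it?"
  ].freeze

  def initialize
    @index = 0
  end

  def current_step
    STEPS[@index]
  end

  # The facilitator may skip ahead or back as the group agrees.
  def move_to(step)
    idx = STEPS.index(step)
    raise ArgumentError, "unknown step: #{step}" unless idx
    @index = idx
    current_step
  end

  def advance
    @index += 1 if @index < STEPS.size - 1
    current_step
  end

  # Opinions are only allowed in the final step, and only with consent.
  def opinion_allowed?(creator_consents: false)
    current_step == :opinions_with_consent && creator_consents
  end
end

session = CriticalResponseProcess.new
session.current_step                              # => :statements_of_meaning
session.advance                                   # => :questions_from_creator
session.opinion_allowed?(creator_consents: true)  # => false, wrong step
session.move_to(:opinions_with_consent)
session.opinion_allowed?(creator_consents: true)  # => true
```

The point the sketch makes is the one from the talk: opinions come last, and only with the creator's consent, while the facilitator controls when the group moves between steps.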
I'm happy to take your questions. Oh yeah, I'll repeat it. That's actually great: over here we just had a comment that as soon as you make a PR, the first thing you do is mark up your own PR and say, this part looks hacky, what do you think? And direct people that way. That's actually a great idea, because as a reviewer that would give me a framework: oh, this person has these questions, okay, I'm gonna focus on those, and I'll feel like I've been helpful. Thank you. Other questions? Yeah, so I think the question, if I'm getting this right, is that
there are some questions, or some topics, where the creator just needs to learn something new. Liz Lerman, again, was a dancer, right? She talked about the ballet world, which is highly technical. I don't know anything about ballet, but there's right and there's wrong there, and I think what you're getting at is the same thing: some things are just right, or most likely a good idea, and other things are just probably not a good idea. And honestly, that's another beast, the way I see it. There are certain things that are like, no, please don't do that. What's important, though, is to be upfront about it and say, at this company we don't do that, or, at this company we need you to know that this is the way to do it. Be upfront about it so everyone can follow it.
Does that answer your question? Yeah, so I don't think this process is really for that sort of thing. That's awesome. The question was, what about communities that are always asynchronous, like open source communities? I'll give a shout-out to Agile Ventures, if anyone knows about them. You can give them a Google; they're basically a collection of numerous open source projects that people can jump into. They have a Slack community you can get invited to, and they have two daily standups on Google Hangouts that anyone can drop into to ask their questions and say, these are the blockers I've had. So hosting a daily standup, or even a weekly standup, or saying, I'm gonna be hacking on this on Sunday, if anyone wants to jump on a hangout with me at any point I would love to help you get started on this project. But it definitely takes some added intentionality for open source projects, for sure. Yeah, so thank you. The comment was, let's be intentional about saying
"I believe" and "I think" when that's the case, and not "this is the way it's done." As a relative newcomer, I appreciate knowing when something is objectively true and when something is a good idea but other good ideas exist too. I think that's a really terrific comment: organizations can be really intentional about saying, these are the things we hold to be true, but they are opinions, and these are the things that are just plain objective truth. Was there a question you wanted me to answer? I appreciate the comment, but was there any?
Yeah, the question was: if you're on the receiving end of a code review that's not so great, maybe toxic, what can you do to turn it into something less so? I'll first acknowledge that if you're getting burned by your coworkers, there's no way to wave a wand and make that not true. It hurts, and it can't just be conjured away. As a straight white cis male, I also have to acknowledge that there are probably a lot of comments I've never been on the receiving end of that some in this room may have. But with that huge caveat, the best I can offer is this. What was that? Oh, that was in the back; he can pipe up too if he wants.
There's hopefully something to learn from it, in terms of digging below the harsh comment that person is making: okay, I got this harsh comment, what's the underlying truth they've got? Maybe they're telling me I shouldn't use this library at all. Why is that? Maybe I'll want to Google it and find out. They definitely shouldn't have said it that way, but what can I learn from it? I don't think that's a really great answer, but I appreciate the question. So, what if the creator is the jerk? I think that's a terrific question.
Managers need to step up. They need to say, this is the way we're gonna do things: we're going to give feedback that's honest and useful, and we expect everyone to learn from it and work from it. But managers need to step up. Okay, so the question was: say there's a standoff between the creator and a responder, the responder isn't getting their way, so they recruit someone else on the team to agree with them, and all of a sudden it's a vote of two to one.
I guess one thing you could do is be intentional about who the team is, who is going to be participating in this process, and then close the door. Again, I think that goes back to intentionality. Pulling people in when you clearly have an agenda is really sneaky, toxic behavior, so maybe a closed door would be the thing to do on a team like that. Yeah, the facilitator is a role held by one human, or it could be shared among several humans, but the idea is, actually, that's really good feedback.
Thank you, that was a great clarifying question that answered something for me. I'm not joking, that was actually great. Let's say someone asks, what were you thinking when you chose factory for this project? The facilitator blows the whistle: don't, that's an opinion disguised as a question. Can you try to rephrase that, please? So they're objectively enforcing the rules, and they're also the one who says, okay, we've heard enough of step one, let's move on to step two. Because the creator, while they're in the driver's seat, is busy writing things down, busy thinking, busy asking clarifying questions, and we want a facilitator who's really gonna drive the conversation. Good question. Well, it is 3:20. I would love to talk more about this, so come find me. Thank you very much.