Do it right, then do the right thing
Formal Metadata
Title: Do it right, then do the right thing
Number of Parts: 150
License: CC Attribution - NonCommercial - ShareAlike 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal and non-commercial purpose as long as the work is attributed to the author in the manner specified by the author or licensor and the work or content is shared also in adapted form only under the conditions of this license.
Identifiers: 10.5446/51533 (DOI)
Transcript: English(auto-generated)
00:06
That's flying a kite. This talk is to some degree flying a kite. Are we on? Three, two, one, action. We'll go. This talk is to some degree flying a kite, and looking at this slide up here now,
00:23
I think I probably want a question mark right after that G, right at the end there. Oh, don't do that. Walk past that. There is a question mark on there, then. This is as much a question to you, the audience, and I'd really appreciate it after this talk.
00:44
Grab me, tweet me, whatever, and say, am I talking complete rubbish here? Or just possibly is there something in what I'm saying here? Because for the last few years I keep running across this thing where I actually start thinking it's more important to do things right
01:02
than it is to do the right thing, and I will attempt in the next 50 minutes, an hour, to explain why I keep wondering this and why I keep coming back to this and why I'm starting to find evidence and the way the thinking develops. Hopefully I'll keep your attention for that.
01:23
Quick word about me. This is me, my beautiful face there. Usually I'm associated with the word agile. Whatever the word agile means, whatever agile is these days, because anybody actively not agile here? Anybody know anyone who's proud not to be agile?
01:44
Yeah, yeah, the man down here, yeah, yeah. Proud to be a waterfall organisation. That's going to get you a lot of good recruits, isn't it? Yeah, so normally people talk about me as Agile Allan, would you believe it? A few of my clients actually call me Agile Allan, the places I go back to. I did actually write a book about agile a few years ago.
02:05
It's the blue book at the back there, Changing Software Development, Learning to be Agile, which I always tell people is not the first book to read when you want to know about agile. It's a much deeper look at agile and knowledge and learning and how that fits into the businesses and why I think agile is a better model for just the way we actually develop software
02:22
than whatever came before traditional waterfall stage gate, whatever you want to call it. So it's still very available. Please go out and buy it. Last year I wrote this book, Business Patterns for Software Developers, and despite me and the editor spending a lot of time over the title, we kind of got it wrong.
02:42
And everyone assumes software developers means coders. And if you're a coder, I'm sure you'll really enjoy this book. We meant software developers as wider groups, the organisations, the teams, everybody's involved in creating software products. And that is a book of about 36 patterns on how to run a software business.
03:02
So there's a card here, you can pick it up and take it home to remind you to buy it. There's some more on the way out. You'll also find me in some more technical books, the patterns book, Kevlin Henney's 97 Things book, and a book that's out later this year on business analysis. That's me. I make a living doing agile training
03:21
and consultancy type stuff. And I... yeah. That's me. There's a management commandment out there. If you talk to managers for more than a few minutes, or you've ever gone to a management course, before long you'll hear this said:
03:44
do the right thing, then do it right. Any of you ever heard that? A few heads nodding, yeah? You can tell you're developers — if this were a management audience, yeah, yeah. It's a management commandment. Every manager has been taught this. It's almost management 101 if you go on a course, you know,
04:01
the first thing, do the right thing and then do it right. And I'm here today to challenge that. I strongly suspect that while this might be right in some context, the context we increasingly live in, our IT-enabled world, means that it might often be the case that
04:21
you want to do things right, and then you can work out what the right thing to do is. I've got to try and persuade you of that. One thing I want to be clear of is I'm not, in saying this, I'm not actively saying do the wrong thing. I'm not actively saying, knowingly do the wrong thing. I'm not saying that. What I'm saying is, maybe we shouldn't
04:41
obsess so much about what the right thing is. Maybe we should take a general stab at the right thing, get good at trying to do something, and feed back and learn there. So what I am saying is: you only know the right thing by doing.
05:01
So let me give you some evidence here. This is the first piece of evidence, the first exhibit, if you will. And this is where my suspicion started to be aroused that maybe we should be doing things right and then doing the right thing. This is a study from Bain Management Consultants.
05:22
Bain — really good, second only to McKinsey. Any Bain people in the audience want to argue with that? And it was published in a seriously academic journal, the Sloan Management Review. So it should have legs. If you want, you can Google it, you can find the study. Yes, you can argue with some of the methodology. Yes, you can argue with some of the findings. But
05:41
let's just stick with it for the moment. These guys, classic management consultants, they divide the world into a two by two matrix. And they rate companies, largely corporate IT environments, on two criteria. Are they less effective at doing IT or more effective at doing IT? Are they less aligned to what the
06:01
business wants or more aligned to the business? Okay? So actually another way of saying, are we doing things right or are we doing the right thing? So you can divide the world up into a two by two matrix. Here we go. According to these guys, 74% of companies are in what they call the maintenance zone. This
06:21
explains why so many of you and so many of my friends work in IT shops which aren't particularly effective and where work is miserable. Three quarters of companies are stuck doing, well, they say maintenance. They don't actually mean maintenance work, though it probably is largely maintenance work. And when 74% of companies are in one box, that box is the average.
06:41
So your IT spending is average. Overall, over a three-year period, they think their sales are falling 2% a year. So while most of us are probably here, it's not a particularly good place to be, is it? Obviously, we'd all much rather be up there. IT
07:02
enabled growth. The promise that through IT — whether you are writing software like Microsoft, whether you're using video conferencing systems like the former Tandberg, whether you are drilling oil like Statoil but you're using your IT to do it better — IT is allowing you to grow.
07:22
Very few companies make it to this nirvana. Just 7%. Interestingly, they spend less on IT than average. Being effective at this is cheaper. That's a bit of a surprise, isn't it? Look at those sales figures. Their sales are roaring ahead.
07:42
35%. There's a 37% difference between those two quadrants. It's pretty obvious where you want to be and where you don't want to be. Unfortunately, most of us are not where we want to be. It's the other two quadrants which are really interesting, and which really laid the foundations of this suspicion. Well-oiled IT.
08:02
Again, not many companies, just 8%. IT spending is massively below average. Look at those sales figures. We're not saying these guys are knowingly doing the wrong thing, but they're less aligned to the business than they could be. But they're effective at what they do.
08:23
And now this one, the alignment trap. The second largest group of companies, and they're highly aligned to the business. They're following business strategy. One might think they're doing the right thing. Their IT spending has rocketed.
08:40
And their sales are falling. It's almost a mirror image of the well-oiled group. If you can't be in IT-enabled growth, where do you want to be? How many of you would rather do the right thing badly? And how many of you would rather do not quite the right thing
09:01
well? Choose for yourselves. Let's relate this to Agile. Agile does this. Most of the Agile literature is about doing things right. Test-driven development improves the quality of your code. Acceptance test-driven
09:20
development, or BDD, or whatever we're calling it today — doing the right thing? It's more like doing things right: ensuring you're getting the tests automated, ensuring you're getting quick feedback. Stand-up meetings, planning meetings. You can go through the Agile techniques, and almost every one of them is about doing things right. There's very little in Agile about
09:40
doing the right thing. Agile largely allows us to move to the left-hand side, as you're seeing of this quadrant. Once you get there, you can think about moving there. But managers are trained. I know, I've got a bloody management degree. You're trained to go from
10:01
bottom left-hand up, do the right thing, and then do it right. If you move maintenance up to alignment trap, it gets a damn sight more difficult to do things right. Now, you can go and read the original paper yourself, and you can read what the authors say about it. They talk about complexity and engineering complexity
10:21
out, and being 2007 and being management consultants, yes, the letters SOA were involved. My reading of this, trying to map it into the Agile world, is: where does this go wrong? And I think what happens is, if you're down there in the maintenance zone, and you follow the management commandment to do the
10:41
right thing before you do things right, what do you do? You spend more time in analysis. So you hire more analysts, more business analysts, more product managers, whatever you want to call them. You spend more time on architecture. You spend more time on reviewing your work. You get the requirements really pinned down. You elaborate on them.
11:01
You spend more time in meetings, reviewing them. And you need more people to do all this. So all those project managers, architects, business analysts, etc. push your costs up. But you don't get anything out the door. Eventually you get the right thing out the door, but while you've been getting something, while you've been deciding what the right thing is, one of your competitors has probably done it. You're playing
11:21
catch-up. When you can do things right, you can take a punt. You can run an experiment. That's what I think is going on. Doing the right thing costs money. It doesn't come free. Business analysis and project management are not benign
11:42
activities. They have a downside. They cost money. They take time. It also assumes there is a right answer, and it's knowable. How many of you are working on a project
12:01
that's more than four years old? One, two, that's a surprising number of hands. How many of you who work on this project that's four years old have requirements around mobile phones and iPads in the system? About half of you, I guess. The iPad didn't exist when you started
12:21
that project. If the right thing to do now is to build an iPad app, you couldn't have known that at the beginning. The world moves on. So the answer might not be knowable. Exhibit B in the evidence here. How many of you have come across this book?
12:41
Eric Ries's Lean Startup. It was last year's it book. It was last year's book for every entrepreneur and politician to read. And it's a damn good book. It takes a lot of the things we talk about in the agile world and escalates them up to a kind of enterprise level. And what it's basically saying is: knowing the right thing to do is difficult. But actually, get into
13:01
the market and understand what people want. So here's an example — I think it's a company from Texas. It's an online grocery type company, you know. You must have these in Norway: you order your food online and somebody drops it off in a van. This company in Texas took it one step further. They would write your menu. They'd get the
13:20
feeding details of your family. Mum, Dad, three kids, baby, whatever. And they would decide what your menu should be and send it to you to approve, and then they'd deliver the food to you. And this company started with no customers, few employees and no technology. If you signed up, if you were one of the first people to sign up for the service — I think they used to go to mother and baby groups, or maybe in Norway they'd go to daddy and
13:41
baby groups. And they'd talk to families and they'd make notes, and they'd come back to their offices and the chefs would analyse it and draw up a list. And then they might go down to the local supermarket and buy it, and they'd drop it off out of the back of their car, yeah. But they'd learnt about their market. They learnt more and more. And then they could start to work out what
14:01
IT, whether they had a viable business model and what IT actually needed to support that. They didn't try and plan it in advance. They just started to get into the market and understand it. This book's full of stories like that. AB testing. Any of you come across AB testing? Yeah. You don't know what the right thing is. You put something out there, you see what the market
14:21
says. A client of mine, they launched a new sideline. They tried six different buttons on their website to get people to click through to the new product. They found one of the buttons had an 80% higher click-through rate. That was the button with a drop shadow. How the hell could you know that? It's only by doing it that you can see.
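(A minimal sketch of what a multi-variant button test like that could look like, in Python. The variant names, traffic numbers and conversion rates here are invented for illustration — they are not the client's actual setup — and the bucketing and counting are deliberately simplistic.)

```python
import hashlib
import random
from collections import defaultdict

# Hypothetical variant names -- the talk only says six button designs were tried,
# one of which had a drop shadow.
VARIANTS = ["plain", "bold", "outline", "rounded", "icon", "drop_shadow"]

impressions = defaultdict(int)
clicks = defaultdict(int)

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into one of the variants."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

def record_impression(user_id: str) -> str:
    variant = assign_variant(user_id)
    impressions[variant] += 1
    return variant

def record_click(user_id: str) -> None:
    clicks[assign_variant(user_id)] += 1

def click_through_rates() -> dict:
    """Clicks divided by impressions, per variant."""
    return {v: clicks[v] / impressions[v] for v in VARIANTS if impressions[v]}

# Toy simulation: pretend the drop-shadow button really does convert better.
for i in range(60_000):
    user = f"user-{i}"
    variant = record_impression(user)
    rate = 0.05 * 1.8 if variant == "drop_shadow" else 0.05
    if random.random() < rate:
        record_click(user)

for variant, ctr in sorted(click_through_rates().items(), key=lambda kv: -kv[1]):
    print(f"{variant:12s} {ctr:.2%}")
```

The point is the shape of the loop, not the code: put all the variants in front of real users, count what they actually do, and let the numbers pick the winner.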
14:43
Doing it need not be expensive. You know, it may have been in the days of C++ and SCO servers, expensive to put a product out there. You know, we've got Ruby, Groovy, all those other things. You just rent the processing cycles from Amazon or Google or whoever.
15:01
It's not expensive to do this stuff. Most of the tools are free. Just download it from Google. Capital's less of an issue. You don't need so much capital to get started. In Europe, we're discovering the joys of angels and venture capitalists. They're not perfect, but you know, more readily available.
15:20
Exhibit C, me! That first book, what I wrote. You know, the argument I'm making there is that we have to learn. Everything we do in software development is about learning, learning about what the business wants, learning about what the technology can do, learning about the processes. And true learning leads to change. If we learn something, and we don't do something different, all
15:40
we've done is memorize the fact. You know, the Archduke Ferdinand was assassinated in Sarajevo in 1914, and it started a war. Fact. If, however, I learn the bus I take to work every day gets me to work late, and I start getting an earlier bus, or I take a different bus, I've changed.
16:00
And as a result of that change, I learn something. And that's what we do. We have to do something to learn, and we have to change from our learning. It's all about learning. Exhibit D. Arie de Geus was the chief planner at Shell for many years. I think about the time he retired, he wrote
16:21
this book, which is an excellent book if you want to get into management stuff. The key message from this book is in that quote. He says managers', but I think in the IT world it applies to us all. The only competitive advantage the company of the future will have is — and I also appreciate the word
16:41
employees — its employees' ability to learn faster than their competitors. Can you move faster than your competitors? Can you incorporate learning about what your customers want? About the technologies available? About the overall market? About the processes you're using? How fast can you learn?
17:00
How fast can you use that learning to create change? How do you learn? You come to conferences, yeah, yeah, and hopefully you'll learn some good stuff and you'll take it back. But when you get back to your business, your business is slightly different. I think we learn
17:21
by iterating. And the faster you iterate, the more you iterate, the faster you learn. And when I say learn I also mean you incorporate the change. Learning creates change and change creates learning. You iterate, you do something. As a result of that, you learn. As a result of that learning, you act on the learning and you change something and you
17:40
iterate again and again and again. We learn to iterate, we learn to iterate faster and we improve our aim. So the arrows there and the dartboard, not the dartboard, it's a bit like this. Which target should we aim for? Which is the right one to aim for?
18:00
How are you going to choose? We'll take a shot. Bang. Another one. Bang. Bang. Yeah, maybe we'll take six months. Between each shot at the market we'll spend six months thinking about where our next shot should be. And we'll change the aim a bit. And we'll do this. Yeah. If you do it faster.
18:23
Yeah. Ready, fire, fire, aim, fire. You've got to iterate. You've got to incorporate the learning into that. How many of you iterate? Hands up.
18:40
This end of the room seems to iterate more than that end of the room. They were the guys who came into the hall earlier, I think, so you self-divided. Yeah. Software was always iterative. The only software that stops changing is dead software. This is why the whole project
19:01
metaphor is just wrong in our industry. Go to SourceForge and you will find hundreds of thousands of software projects that don't change. They don't change. Nobody uses them. If people use software, if it's useful, you want something else from it, so you change it. It has to evolve. It has to move from Windows 2000 to Windows Vista to whatever
19:21
else. It has to change. And you want more features on it? When software stops being used, it stops changing. So in project terms, a project has to have a start and an end date. The only software that really satisfies the end criteria is when it's dead. We need to keep going around loops again and again. We always used to improve
19:41
our software, but we used to have random iterations. Nine months, six months, three weeks. Now we're talking about two weeks, two weeks, two weeks. Or maybe week, week, week, or day, day, day, or something. The faster we go, the more we can learn. Which weapon do you want? Who works in an
20:04
organisation using machine guns? And who works in an organisation that uses sniper rifles? Obviously, if I'm proposing a ready-fire-fire-aim-fire-fire-aim mechanism, a machine gun is going to be a little bit better.
20:21
A sniper's rifle takes time to target and to shoot and to reload. I want you all, in your minds, to think which you would compare your organisation to. Are you working in a machine gun organisation or a sniper's rifle one? Actually, I don't think that's the choice most of you face. I think for most organisations, this is the choice.
20:44
There's a few organisations out there that can do rapid work. They can do rapid iterations. They can push rapid products out. They can learn rapidly. They can improve their aim rapidly, rapidly, rapidly. And I'd suggest most of the rest aren't sniper's rifles
21:00
who can carefully take aim at the product, the market, spend their time aiming and then, using the optimal shot, the optimal amount of budget, just take it. No. Most of the ones that go slow have got something like a 200-year-old gun there. You're still working in COBOL on mainframes. Pushing a release out takes weeks.
21:21
And you're risk averse. You don't want to push releases out. Oh, no, no, no, because people don't like that. It might lead to some problem. I'd suggest that, you know, on the whole, in modern organisations, you're either a rapid-fire organisation — and there's very few of them — or you're working with one of these, not quite a flintlock.
21:40
I think it's a bit more than a flintlock gun. If you look up on Wikipedia, you can read all about that. I think this is a choice we face. And if you're if you're in this kind of old 200-year-old gun, of course you're not going to waste any shot. You're going to spend your time analysing, agonising,
22:01
planning, targeting, researching. While you're doing that, the competitors who can do rapid-fire are just going to do it. And they'll find out what the right thing is by just getting into the market and finding out. There was a time with market research, going down the line, reading about a domain, commissioning a research company,
22:21
going out and asking people in the street, do you prefer vanilla ice cream or strawberry? Vanilla ice cream or strawberry? Vanilla ice cream or strawberry? What's the way to do it? In IT, push out two websites, one sells vanilla, one sells strawberry, and see which gets the most hits, which gets the most purchases.
22:44
There's a question here. I always think of the software world as having a supply and a demand side. I'm guessing most of you work on the supply side: you're programmers, testers, people who are fairly close to the code, you're supplying the need.
23:01
And in the last ten years, we've got pretty good at improving our supply, and we've got pretty good at supplying more often. All those tools in the Agile toolkit, they do that for us, and all the tools in the Lean toolkit. I think a lot of the business side, the side that demands, the side that wants our products, our software, is still struggling with this. They're
23:22
still doing market research, they're still analysing markets, and they haven't quite realised yet that you can just push it out there cheaply if you take the time to develop that kind of capacity, capability. You know, choose your weapon.
23:42
A sniper's rifle is good when you've got a known target, when you've got a clear shot, you've got time to prepare, and there's a limited number of variables — there's not a strong wind, nobody's likely to surprise you or interrupt while you're getting ready. A machine gun might be better when there's lots of targets, where there's a confused environment,
24:02
where there's many variables, and where you're prepared to frequently miss. So here's another problem here. You've got to be prepared to miss. If you're not getting some failures, if you're not pushing some things out there that don't work, like my friends who pushed six versions of the button out on the website, five of them failed.
24:21
Five of them did nothing. One of them had an 80% better rate. That meant you had five failures for one success. Now if you don't like failure, if you can't tolerate failure, maybe we should just take our best guess at one of those six, and keep our costs down. Because, you know, doing six buttons is six times more expensive than doing one button, isn't it?
24:41
If you're concerned about costs, if you're concerned about not having failures, then maybe you should just take a shot once in a while. If, on the other hand, you're prepared to, you know, try a lot, see what works, and refine what you're doing, then go for it. Do it cheaply. You know, in
25:01
our terms, sniper-style development suits a slow-moving market. The market is known. You understand the market. You're in there, you're talking to the customers, and your competitors are slow as well. Maybe that's the important thing, the competitors. Capital is scarce. You can't afford to build six of these and see
25:20
which works. You'll only build one. Development is expensive, and that's actually the assumption that underlies a lot of the way business views IT. There's that old Woody Allen joke about the two women who go to a restaurant, and one says to the other, you know, the food here is so terrible, and the other one says, yes, and the portions are so small.
25:41
I think that's the way a lot of corporations view their IT people. They get small amounts of technology from them on an irregular basis, and it's not particularly tasty. They don't really like getting it. If you improve that, if you start to give high-quality software on a regular basis, it becomes very different.
26:01
Machine-gun development, if you like. You can cope in a fast-moving market. You can cope with competitors who are fast. Capital is cheap in this world. We're using tools like Ruby and Groovy and Python and JavaScript, not C or even C++ — although, God knows, I love C++.
26:21
And we're prepared to tolerate failure. So perhaps your tools dictate your approach. Having chosen to go with a 200-year-old rifle, you've got to make every shot count because it takes so long to do. You've got to make sure you aim
26:41
and you hit, and you reload, and you aim, and you hit. Because that way, every shot counts. Next, think about competitors. You and your competitors — what do you think the result will be? You've both got 200-year-old guns.
27:02
You and your competitors, what's the result going to be? Who wins? Any guesses? Random. Yeah, I think it could be random. It might depend on something else, I guess. Random. I think stalemate. You've both got a large share of the
27:21
market, you're just surviving. What if you've got a 200-year-old rifle and your competitor's got an M16? Who wins? I think your competitor does — you're toast. You've got a 200-year-old rifle and your competitor's got a sniper's rifle. I think
27:42
you're slow toast. Yeah? Who wins? I'd like to think it's me. I think it would be me. But there's some other factors coming into play.
28:01
We must never forget context. And this context is massive. I'm not even going to try and touch it. Or you can just change the game completely. Let's get away from guns. Iteration is the key.
28:20
This is why I think the ability to iterate, iterate fast and to learn is more important than doing the right thing. The capability to do something, learn from something and inform the next decision is the most important thing. Now over time, you will
28:41
do the right thing at some point. But you will base that on data and experience and doing something. If you can't push anything out, you will never be able to learn. You'll never be able to gather the data and you'll never be able to execute on your ideas. I don't think you need to know exactly what the right
29:01
thing is up front. But I think you need to iterate. You need to get good at iterating. And you need to get good at iterating fast. Anyone do three-week iterations? One. I bet he works for a bank.
29:21
Three-week iterations. Whenever I come across three-week iterations, they're a nightmare. People try to squeeze mini waterfalls into them, and they're scared of going fast. You do a short iteration to make yourself good at it, to make your life difficult. Get good at doing short iterations, delivering something and learning, and doing that fast. Once you can do that, you can test things with your customers.
29:43
You can test your output in the market. And we can evaluate. When you take the do the right thing approach, you gather all the data you can, you evaluate it, you decide what the right thing is, and then you execute.
30:01
What I'm suggesting here is that you look at the data you've got, you guess, you do something, as a result of which you get more data to evaluate, you evaluate it, you take a hopefully more informed guess, you do something else, you get some more data, you evaluate it again,
30:22
you do something else, and you keep on doing it. Close the loop. I think this is a step a lot of us are missing out on. We're not as good as we should be at evaluating, and maybe this is tied to the whole big data
30:41
boom, learning to evaluate what we're doing and what the feedback is here. There's a study I read a few years ago which said one of the major obstacles towards businesses really extracting value from IT is project management's desire to assess success on three
31:01
criteria: schedule, budget, features. If your project is on schedule and on budget and has all the features you asked for, does that make it a success in the market? Does that make you more successful than your competitors?
31:21
No! But that's how project managers have been trained to evaluate project success. The only true success is: does it bring in more money for your business? Okay, you might not be in a business, you might be in a government or a charity or something, but whatever your organisation is trying to achieve, does the IT thing you've done move it closer towards that goal?
31:44
Schedule, features, budget: if we keep evaluating our projects on those, we're evaluating the wrong criteria. Close the feedback loop; evaluate on what happens. To paraphrase Mao Zedong: let a thousand
32:01
flowers bloom. Get good at selecting those that succeed and cull the rest. This is not an argument for saying, let's let lots of meaningless, pointless costly development streams continue. This is an argument for saying
32:21
do a load of stuff, get good at doing stuff and periodically iterate at a higher level. So let's have several development teams all try and tackle the same problem and after a while we will evaluate the results and we will promote or move forward those that are more successful.
32:43
It's a bit like, you know, aerospace companies: for a long time they used multiple computers — the Space Shuttle has five computers or something, doesn't it? And they vote: every change to altitude or whatever, the computers vote on what to do. And you know they're programmed in different languages, with different algorithms, all of it, and they cross-check.
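(A toy sketch of that redundancy-and-voting idea in Python rather than real avionics code. The three "versions" and their numbers are hypothetical; the point is simply running independently written implementations of the same calculation and taking the majority answer.)

```python
from collections import Counter
from typing import Callable, Sequence

def voted_output(versions: Sequence[Callable[[float], float]], reading: float) -> float:
    """Run each independently written version and return the majority answer."""
    answers = [round(v(reading), 6) for v in versions]
    winner, count = Counter(answers).most_common(1)[0]
    if count <= len(answers) // 2:
        raise RuntimeError("no majority: the versions disagree")
    return winner

# Three hypothetical, independently written implementations of the same control law.
def version_a(x: float) -> float:
    return x * 0.5 + 1.0

def version_b(x: float) -> float:
    return (x + 2.0) / 2.0          # algebraically identical to version_a

def version_c(x: float) -> float:
    return x * 0.5 + 1.01           # a subtly faulty version

print(voted_output([version_a, version_b, version_c], reading=10.0))  # majority says 6.0
```

The faulty version loses the vote instead of steering the craft — the same spirit as running several teams at a problem and promoting whichever result holds up.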
33:05
Let's see what works. Let's try different things. And the rest of the stuff, stop doing it. Which leads to an interesting point. I've said do stuff and see what works. What is also implicit in that is that you stop doing stuff that isn't
33:20
so successful. It might be successful, but compared to the other stuff you're doing, it's not as successful. And if it's not successful at all, just stop doing it anyway. Brakes. Kevlin's on tomorrow, isn't he? Is Kevlin keynoting tomorrow? You know one of Kevlin's standard lines: why do you have brakes in cars?
33:41
So you can go fast. Any of you ever driven a car without brakes? I did it once. I inherited my mum's car and then one day the brakes didn't work. You just have to drive it to the garage to get it fixed. My god did I go slow. And the same goes at a higher level. Get good
34:01
at stopping things that aren't working. So you know we have things like test driven development and acceptance test driven development at a technical level. We need to replicate them and layer up at a management level. We need things like proper portfolio management. Portfolio management is kind of interesting. I say portfolio management, we talk about it.
34:21
And an awful lot of people say, that's a good idea, we should do that. A few people say, oh, we just do that, it's just second nature. But most organisations, in my experience, don't particularly actively do proper portfolio management and stop things. Do you guys get Dragons' Den in Norway? Where the venture capitalists sit there and all the entrepreneurs come in
34:42
and they say I'd like you to sponsor my automatic baby changer or something and they decide who they're going to give the money to. We should be doing that with our products. With our projects. You know your organisations should make the leaders of each team come in periodically and pitch to continue that work.
35:00
This is what venture capitalists do. This is what they do in Silicon Valley. Why don't we do it inside our organisations? Why don't we kill off the projects that aren't so successful? Put more money into those that work and stop those that don't? If you're going to let a thousand flowers bloom you've got to have a mechanism for stopping those that aren't delivering.
35:23
I notice there's a clock down here. I'm going a bit faster than I thought I was going to go. So we might finish early and have more coffee. You can't see into the future so stop trying to see into the future. You know maybe 20 years ago it doesn't seem like that long ago to me.
35:43
For some of you it seems like ages ago. Let's go back 40 years. 40 years ago, you know, any kind of IT system was expensive and slow. We were working in COBOL on mainframes. It was so expensive we used to write it out by hand and then code it up onto a deck of punched cards and put them through. You wouldn't use a compiler
36:01
or anything like that. Now we've got cheap processor cycles. We can do things like automated testing. 40 years ago using the computer to run tests. Oh my god. That is expensive CPU time. CPUs are cheap now. We can change the way we do things. We can't see the future but we can
36:21
start to construct tests. We can start to use that CPU time to try and understand the future. So I'm suggesting that maybe it's now cheaper to use our technology to experiment rather than spending our money in trying to understand what the market is, what the future holds.
36:43
We get good at probing, experimenting, doing a lot of things. Like my friends who put out six buttons to see which gets the highest number of clicks. Conduct a lot of experiments. Yeah. But how does this relate to doing the right thing? Here, you know, this is classic lean startup stuff, and there was somebody in the break I
37:01
overheard speaking, you know, talking about this, about doing experiments. Doing experiments is a sign that you don't know what the right thing to do is. You've got to experiment to find out the right thing. If doing an experiment is expensive and costly, how many experiments are you going to do? You're back in the COBOL days.
37:22
So, I want to show you the way I'd iterate. People think about development teams, two-week, three-week iterations of coding and testing, and perhaps demoing them or maybe even releasing them. We need to start pushing that up. Now, I've deliberately avoided putting time scales on here.
37:40
I think most teams that iterate do two-week iterations and they release at the end. I've heard of teams that release 50 or 70 times a week. They might be doing Scrum and they might be doing Kanban and they might be doing something else. I don't care. I don't care what your period is, but frequently, we are developing code and we're releasing code.
38:01
And probably slightly less frequently, we're evaluating what comes back. Maybe we're doing it just in real time, or maybe we're doing it slightly less. We are evaluating this stuff because it takes time to gather enough data. But we are evaluating, we're collecting the data, deciding what we do next. And the next level up, we've got
38:21
this portfolio-type process. We're evaluating what all these different products are doing, all these different experiments. Who's experimenting the most? Who's getting the best results? It doesn't automatically mean you cull projects which aren't pulling in the numbers or getting the revenue. But you need to have a damn good case for not doing so.
38:40
And that usually goes by the word strategy. There are some strategic things you will accept you'll lose a lot of money on, or won't get a lot of traction with, because you foresee some other benefit you will get from them. Just always watch out. When someone uses that word strategy, remember the words of the economist Professor John Kay: strategy is another word for
39:01
expensive. Every time someone in your organization says, we're doing this because it's strategy, we're doing this because it's strategic — substitute the word expensive and see if the logic still holds. Strategic initiatives are expensive initiatives that take a long time to deliver and frequently fail. Just apply that test. Just try it out for
39:21
yourselves. We're not just trying to iterate at the development level. We're talking about the analysts, the product managers, iterating on what they want. We're talking about the organization iterating at a portfolio level. Because ultimately, above your organization — unless you're in government — are the shareholders, and the shareholders are probably, on the whole,
39:41
pension funds and other sorts of trust funds, and they do iterate over who they're investing in. At the very highest level, iteration and evaluation are occurring. And at the bottom level we've got iteration and evaluation occurring. Do it at all levels. Don't be so obsessed about doing the right thing all the time. Just get good at evaluating
40:02
the data. But to do that you've got to be able to iterate, and you've got to use that iterative capacity. Doing things right is more important than ever. The capability to experiment is the important thing.
40:21
Let me leave you some commandments. I feel a bit strange preaching on this stage; I'd much rather not be dogmatic about all this. Do the right thing — or sorry, do it right. Do it fast. Learn and iterate. Go round the loop
40:40
again. The key word in there is learn. You know Einstein's definition of insanity: doing the same thing and expecting different results. Which we all do on some occasions, don't we? We've all done it. You know, I always do. Whenever I need to fly somewhere and I can't find exactly the flight I want at exactly the time I want at exactly the price I want, I just go to
41:02
another website and do the same search. Very rarely does it produce the flight I want at the price I want at the time I want — but I keep doing it. We've got to break out of that kind of behaviour. We've got to start doing things differently. Learn from what you're doing. Do it again. But unless you can do it right, you can't even try and do this.
41:22
This is key. Fail fast. Try something. Fail fast. Fail cheap. If it's expensive to fail, you're going to be scared of failing. And if you're scared of failing, you're going to put a lot of effort into not failing. So you end up
41:42
with things called pre-project phases. Which take a long time. They may exist formally, might not exist formally. They take a long time. They spend a lot of time analysing and understanding what could be done. And all the time they are in flight, the world is changing. There's some data, if any of you read
42:00
Capers Jones, he says the industry average is that requirements change and grow 2% per month. 2% per month — that's 24% over the course of a year. Add the effects of compound interest and that kind of good stuff, compound change, and you're talking about 27%. Close to a third of your requirements have changed.
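(A quick worked version of that arithmetic, taking the cited 2% per month figure at face value:)

```python
monthly_change = 0.02   # the "2% per month" requirements-change figure cited from Capers Jones
months = 12

simple = monthly_change * months                # 24% with no compounding
compound = (1 + monthly_change) ** months - 1   # about 26.8%, roughly 27%

print(f"simple growth:   {simple:.1%}")
print(f"compound growth: {compound:.1%}")
```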
42:22
If you spend a year on a pre-project phase understanding what the right thing to do is, a third of the requirements will have changed. Don't spend that time; just get into the market, try something, see what happens. But you've got to do it cheaply, because when it's expensive, people are scared of failure, they don't want to lose, so we must somehow try and offset that — so you end up trying to get this pre-project phase
42:42
to be bigger and better. But while you're doing that you're undercutting yourself, because the thing you're analysing is changing until you get into... You see? Even I get confused. Fail fast, fail cheap, evaluate and learn! Invest in brakes,
43:00
yes at a technical level, TDD and all that stuff, but at an organisational level, portfolio management, dragon's dens, find ways of evaluating what you're doing and saying is this worth doing some more of? Should we be getting rid of this? Should we be changing this? What are the prospects coming up here?
43:22
I don't think there's enough of this going on. I think companies still set up and charter or whatever projects which they still expect to run for a significant amount of time and then for a significant amount of budget and they're still measuring them on time and budget and features.
43:40
They're not measuring them on actual outputs, on actual improved business value or whatever your goal is. This doesn't, by the way, mean there's absolutely no planning. I'm not sounding off against planning. I'm all for planning. But I don't think planning is what you think it is. Planning is, you know,
44:00
Eisenhower's quote, plans are useless, planning is essential. Planning itself is a learning activity. Planning is when you sit down, you've got your people, you've got the data you've got and in planning you mentally rehearse what you're going to do. You think through it, you talk about it. The mistake is to try and execute those plans.
44:22
Planning is a lightweight, cheap, fast way of rehearsing what you're going to do and it might uncover some problems in what you're going to do. In doing the planning you might uncover something. Planning is a useful rehearsal exercise. But it's not the real thing. There's some research I read a few years ago about the
44:40
Canadian Army and the research discovered the Canadian Army do two things. They plan and they fight wars. Or they fight. Canada doesn't go to war, they're too nice for that. The Army fights in places like Afghanistan or wherever and they plan. And when the Army go into action the plans go out the window
45:01
and they just do it. And when they've not got a war or peacekeeping mission or whatever to do, they plan. Planning just fills up the time. I think if you try and plan your way out of a situation you'll just use up the time, you'll spend money, you'll employ planners, you'll employ whoever it is, project managers or whoever who writes the plans.
45:21
You'll spend that money if you plan for it. But where will you get to? So planning is a learning exercise in its own right and again, do it fast, do it cheap, see what you get out of it. My takeaways? Fast iterations allow for learning.
45:41
If you can't iterate fast, if you can't iterate, you can't learn. If you can't do it fast, you can't act on what you're doing, you can't do it in market time because your competitors might not be running fast today but your competitors might be sitting in this room and might take this to heart. Or your competitors might already be there.
46:00
Probably your competitors, more than anything else, dictate whether you need to do this and how fast you need to do it. Learn to iterate fast. Iterate in the market, don't hold back. Push something out there in some small way. Whether it's sending somebody into the market to show a product.
46:21
I know a company, they actually went down, they got a booth on Oxford Street for the day in London and actually stopped people and said come and see our product and talk about it. In some small way, test it with customers, test it in the market, because that's where the truth is. Only the market, only your customers know the absolute truth.
46:40
Your research reports don't tell you. Learn to evaluate the feedback. Look at what you're learning. Fail fast, fail cheap, learn, and put those brakes in place. That way you can really try this stuff. Otherwise, if you don't have the brakes in place, you'll spin off in a different direction.
47:05
God, we've got loads of time. 13 minutes. I'm not saying do the wrong thing. What I'm saying is, I think increasingly doing things right allows us to do the right thing.
47:25
While we obsess about doing the right thing, we are devoid of reality. We have to get good at doing things right so we can learn what the right thing is. That's, if you like, that's my hypothesis.
47:44
The argument stacks up in my mind. I can see some evidence here, but I guess part of me is scared of telling it to anybody. Don't tell anyone about this. Part of me is scared of this. It just flies in the face of management orthodoxy, and who am I to say that the rest of the management world,
48:02
all those Harvard business school professors and God knows who else, is wrong? I could be making a complete fool of myself here, but I think there's an argument. Anyone want to shoot me down? Anyone want to... Questions or anything?
48:21
You've got 11 minutes for questions, and I've been asked by the organizers to remind you, on the way out, to actually put the colour-coded cards in the box. Maybe I've been speaking too fast with my slightly odd accent and you didn't understand a word, mate.
48:48
Business has a goal, maybe. And that's what you're talking about. We are going to try this, over-complicate things,
49:00
maybe just moving out of scope. I mean, building the right thing, for me, is the fast and simple way to move towards the goal. Sometimes fast means not building something at all. You don't need space rockets to walk next door. For me, building the right thing
49:21
is keeping things simple and focused, because usually customers want to build a space rocket when they just need a bicycle. And that, for me, is building the right thing. You want to build a bicycle; I think they are seeing the right thing as the space rocket.
49:41
They think, you know, to go to market to sell this thing, we need a space rocket. Everyone has bicycles. Why would anyone buy a bicycle from us? We have to have a space rocket. It has to be a blue space rocket. And also, I want it to come to a point on the end. The great thing about Apollos, they're always nice and pointy at the end. And I don't want any of this Russian rubbish with the booster rockets.
50:01
No, no, no. I want one nice, straight, cylindrical thing. And maybe we should have an escape capsule. You need an escape capsule, you know. And we also need those nice sections that fall away, like in the nice footage from outer space. Must have all this, mustn't we? Yeah, yeah, yeah. Whereas a nice, simple rocket straight up would just do the job.
50:21
We could at least see whether there's anything out there. You know, the first rockets that went into space — we didn't know what was out there. You know, the first Sputnik that went round. Yeah, I thought... But why does your business over-complicate?
50:40
I was only following orders. More or less. Yeah. Do they ever accomplish it? They've got time on their hands. They've got time to write a hundred-page document of diagrams. So I have this client at the moment and they have very little... Well, they've got a massive backlog.
51:00
But they're not particularly good at planning out what's happening in the near future. I've been encouraging them to develop a kind of planning capability around that. At least think a few weeks, a few sprints ahead. And the report I got recently was that they've now taken it to the other extreme. Some people have invested lots and lots of detail in this. And I think they've invested because they've got time to.
51:21
Because you can. Ask a project manager to draw a project plan. It may all fit on one A4 sheet of paper. But if he's given a week to do it, does he want to turn in an A4 sheet of paper at the end? No! I think sometimes businesses ask us, or businesses ask for more,
51:40
because they've got time to, and because it's expected, and because they could do more research and they could do more analysis. And rather than trying to pin down exactly what flavour of red the space rocket will be painted, you need to put something out there.
52:07
Anyone else see that or something similar? I'm happy to take any more questions.
52:20
Or you can go have a coffee. I've stunned you into silence. Yeah. Drop me an email or tweet or something. I'd really like to know what you think. Whether you think there might be something in this or whether you think the rest of the world is right and I'm wrong. But please, just get good at evaluating.
52:44
Thank you very much.