
Designing Humanity


Formal Metadata

Title
Designing Humanity
Part Number
123
Number of Parts
188
Author
John Fass
License
CC Attribution - ShareAlike 3.0 Germany:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor, and the work or content is shared, also in adapted form, only under the conditions of this license.

Content Metadata

Abstract
This session proposes that the design of digital interfaces and interactions is a fundamentally moral and political enterprise. From weather forecasts to activity trackers, digital interactions shape the way we understand the world around us. We are shaped in turn by the interfaces we use. Swiping right to select a date or sharing selfies, digital interactions are designed to fulfil a vision of frictionless ubiquity. Is that the right thing to do? What would be better? How should we respond? Come and listen to a few suggestions…
Transcript: English(auto-generated)
Our next speaker is John Fass. He is a designer, researcher and lecturer at the Royal College of Art and the London College of Communication.
He has lived in Milan, here in Berlin, in Brussels, and in London. So please give a warm welcome to John Fass and Designing Humanity.
My name is John, thanks for the introduction. I'm a teacher and lecturer in London. I run a course on information and interface design at the London College of Communication in South London, and I'm a lecturer at the Royal College of Art in Kensington. I'm going to talk about design, principally because that's what I do: I'm a designer.
I'm also going to talk extremely fast and use a lot of English idioms, so that I can really test the person who's writing down what I'm going to say. So, that's me; send me a message, send me a tweet. So, digital interfaces: they are our doorway into the digital world, and designing them is my everyday activity and what I care about.
And I think that a lot of what we've heard at re:publica this week so far is really important stuff about data politics and digital politics.
But a lot of this debate just doesn't reach up to the interface level, to the details of designing interactions for the screen. So that's what I'm going to talk about a bit, because I think this ethical and political dimension of interface design should be a bit more widely understood. First, I'm going to talk about the concepts behind this idea, and how technologies can have a lot of unforeseen effects on the people that use them, both good and bad. Then I'm going to talk about how technologies, particularly interfaces, embody certain types of power relations that are implicit in how they're made and delivered to us, online or otherwise. And I'm going to give some suggestions for how we could design interfaces to be more moral, more ethical, and more useful for a just society.
This talk is called Designing Humanity after a Dutch philosopher of technology called Peter-Paul Verbeek, who says that technologies increasingly shape our behavior and our actions in the world in obvious but often quite unexpected ways.
Designing technologies then means designing humanity: designing behaviors into technologies. So this view takes technologies not as functional instruments for achieving a certain thing, but as active mediators in a set of relations. We don't really see technologies as objects that we pick up and use, but as ways that we get through to our lifeworld, the things that we experience in life. So that means I see digital technologies as indicative of this set of relations that we have with the world around us. So, who here wears any kind of activity or fitness tracker? Is anyone wearing one of those now?
I'd expected a few more. Quite weird devices. My wife used to wear one until she found herself climbing up and down the stairs every night, about six times, in order to reach her target goal. A weird type of behavior, but that's what that interface quite often elicits.
This is a sleep tracker, a headband; I wore one of these every night for about six months. It tracks what sleep phase you're in, and I was interested in tracking my own sleep. And I wore it until I started waking up every night, looking at the thing to try and find out what sleep phase I was in. Ridiculous. I threw it away after that.
Technologies can make us behave in quite strange ways and no one is immune to this, the Holy Father himself included. And I suppose I would argue that crowd behavior amplifies this effect. So I think here he's doing a walkabout somewhere and people want to get selfies with him.
Technologies also shape our identity. They allow us to be different types of people and assume different types of identities. Lawrence Lessig writes a little bit about this. So I suppose the question for a designer of ethical interfaces in this context is what types of identities does your interface encourage, inhibit or produce?
And are you even aware that technologies do this, that they allow for this kind of thing? Technologies of power intersect with technologies of the self. This is an idea from Foucault, which has, I think, a specific resonance for social media.
There's a lot of talk about social media at this conference, so this is quite a clear example for me. A lot of you have probably seen this; it's a very current meme from earlier this week. It's about a breast cancer campaign: you're not allowed to show women's bodies on Facebook, so you have to use men's breasts to do so.
Quite interesting. So this is an interface that forbids you to show your body except in very specific and prescribed forms: power relations in a very obvious dynamic. So what happens when technologies are no longer mediating our experience of everyday life, like, say, a camera phone that we might look at a sunset through,
but in fact constitute that life itself, like an Oculus Rift or a VR helmet? Then there might be some really different ethical dimensions to what an interface designer might do. If all of life is experienced through the technology, not simply as a passage through the technology to other life,
then there may be some other, more complicated moral questions. The external world in this situation is external only to the extent that it sits on your face in the form of a helmet or a mask. And if you go through to that labore:tory, I can't quite work out the colon or how to say the word, but I think you say laboratory, everything is VR over there.
And I suppose I would like to see a more critical approach to how those interfaces are designed. So this is Zuck's vision of how we should experience the world. I find it quite telling that everyone else is immersed in VR with only Zuck's head above water; I wonder what that might mean. There's no one in the audience that wants to take their headset off.
So, all designers act through materials. Furniture makers use wood and steel and aluminium, jewellery designers use gold and other metals, all this kind of thing. But the materials of interface design are a bit different; they're much less materially present. They include menus and, I think, a hierarchy of hardware: screens, processors, memory chips; then software: desktop applications, menu bars, tools, files; and at code level, perhaps HTML and JavaScript and CSS and all that, the code-level tools that an interface designer might use.
And all of those technologies are in some way holistically aligned towards an end that is usually functional: getting people through a system. And I suppose what I want to say is that we should harness those tools also for moral, ethical and political ends. I use these tools all the time, every day; designers become very familiar with them, and in some ways they disappear.
But the constraints of these materials somehow extend into implicit moral and ethical constraints as well. So again the question for an interface designer is: how do your design constraints evolve into behaviour constraints? And are you even aware that that's happening, and what can you do about it?
So interface design has developed under the influence of a very overt technological determinism, a very technocentric way of being and acting in the world. Everything's got to be seamless and smooth; there's no friction.
It doesn't look like punk; it looks like very smooth, seamless Californian stuff. But this illusion is very hard to maintain. So this is how Tinder responded when their system went down, about ten days ago this was. Really what they're saying is: we're really sorry, this system's totally broken, and all those interactions that you've been doing to find someone to go out with have totally disappeared.
But their interface doesn't really allow them to respond in a way that represents what's happening; it's all "oh, we're really sorry, we're going to be back really soon". So at the same time, the making of technology has changed from a response to human necessity (we need tools for communication) to the essential purpose of human effort.
Now the idea is that developing technologies is kind of all we do, often quite blindly I would say. And in the context of digital technologies this effort is shaped by market dynamics, which traditionally don't really value reflection or critical input.
So left alone, technological determinism will never provide for its own self-correction; it just can't do that. Left on its own, it must act towards its own ultimate end. So my argument is that interface designers should assume some of the moral responsibility for that correction to happen, because technology is never going to do it on its own.
Digital systems are designed to encourage certain behaviors and inhibit others. Usually their primary motivation is to increase the number of users and consequently monetize the actions of those users. So certain limits are always going to be built into the system, given that those are the prerogatives of the people behind it.
Here you can see that the number of Tinder likes is strictly limited: you want more likes, you've got to pay for them. The system creates its own needs, and then the interface is designed around fulfilling or denying those needs. So it's a fairly closed system.
So we access the world, the social world and the physical world, through technologies: from these glasses that I'm looking at you all through (not even that many people texting, quite surprising) to the phone screen that I read the news from. And this has a fairly significant effect on how we see the world. Our understanding of how the human brain works is profoundly shaped by magnetic resonance imaging technologies.
Our ideas of what the unborn child looks like are mediated by ultrasound technologies. And that influences the way we think about the world around us. Digital technologies, I think, always seem to tend towards invisibility, or certainly increasing miniaturization.
So, contactless payment: this is a huge thing at the moment. Where I live there are all these announcements saying contactless has arrived, as if we're all supposed to have been waiting for it for so long. Here it is, hooray. But might it not be better if financial transactions were in fact more visible, not less visible?
This is a big thing too: tapping in and out of the public transport system with your watch, which seems like a strange thing to do to me. It conceals a whole lot of more complex interactions that are not present in any kind of interface. Using your watch to control your access to public transport has significant implications for identity recognition, data capture, algorithmic profiling, all that other stuff.
So the ethical and moral thing for an interface designer to do here is to make those things apparent, make them clear, so that we know what's really happening when you just touch your wrist to that reader for a second. So this is pet facial recognition:
the ultimate endpoint, I would argue, of camera vision technology. Recognize your pet wherever he goes. So what I want to say is that digital life is life. There aren't two separate spheres of moral or political action: digital life is life. Moral and political decisions that govern individual and collective life are simply displaced onto digital technologies.
They're encompassed by them, many of them developed and maintained by large corporations that we don't know and never see. So we should insist that digital life is inseparable from what we might call real life, because if we don't, we're always going to get the argument: oh, don't worry, that doesn't have to be moral, that's only digital.
So that's another important part of this argument: digital life is life, and it needs a moral framework around it just like everyday life does. An interface designer should play a part in that argument. Digital technologies, in fact all technologies, are always being adapted and transformed by the people that use them.
They're actively shaped by people, and this is why hacker labs and maker spaces, and the GIG stand you'll see if you go into that main hall, are really important: they show that technologies aren't closed. We can enter into them, adapt them, and improvise them towards our own uses, uses that perhaps might be unintended.
But that means that the evaluation of the success of any design must not only include its functional qualities (how well does it do its job, how easy is it to use, how quickly does it download) but also its moral qualities, seen as a measure of how it encourages people to act and what kinds of acts it results in. And that's the distinction I make between ethical and moral:
ethical implies actions in the world, and moral implies behavior or inner understandings. Technologies are always adapted; people will do whatever they want with them. As technology becomes more physically closed, with tiny, tiny screws that you need specialist screwdrivers to unscrew, and fused plastic cases,
it also becomes more conceptually closed, with proprietary operating systems, commercially secret algorithms and other things that we can't legally enter into. We're not allowed to, and so our opportunities for adaptation and transformation become more
important as they become more limited, and we need to be aware of that. So one of the questions I might ask you guys is: this maker movement is already well established, we make stuff and take things apart and make new things out of it, so why are interfaces so rigid, so unchangeable and so persistent?
Interfaces seem to be the last thing in this argument to be open to change. So this is an interesting one: in India, people use the missed call as an important signifier. Without having to use your expensive minutes, your phone rings, you don't answer it, and the fact that you've missed the call has a meaning of its own.
This is a form of what I would call socio-technical behavior, and it's been exploited by political parties and global corporations. So the point I want to make here is that our assumptions about human activity, i.e. that people will answer the phone when it rings, are designed into the interface: the big button to answer it, or the swipe to answer it, or whatever.
But in fact people do very different things with technologies, and that means the decision about what to include shapes the possibilities for human action and human agency, and therefore control over what we do in the world. And it frames the design decision as a moral decision. So: what are we able to do? What does the system allow us to do? What actions are possible?
So one strategy for that is repurposing stuff. Here's someone who's taken the battery out of their phone and repurposed it as a lithium-ion rechargeable cell. So couldn't we build these possibilities into the interface itself?
Maybe not the physical object, some people are thinking about that already, but into the interface and how it works: this idea that it could be taken apart and you can adapt it for yourself. This is the crystal meth drone. There are unintended consequences to everything that we put into the world.
Once the design is baked in, though, it's pretty difficult to change, and this is why open technologies are so important. So how do these problems manifest themselves in interfaces? Dark patterns. I'm sure some of you already know about this, but they work on defaults:
it's when interface design deliberately sets out to deceive people and to present misleading information. Many of these are now very everyday experiences for a lot of people. So part of the design process is anticipating how people will use any particular designed object, virtual or physical.
Designers then build in those prescriptions: how should it be used, and what is possible. So some actions are invited and some are discouraged, and the ethical task then is to decide the balance between those two things. I'm not really saying that things should always be encouraged or always inhibited, but that there should be a moral balance between those decisions, and not just a functional or a visual one.
So dark patterns depend on defaults. And this is quite a well-known piece of research now, showing that if you set the default for organ donation to opt-out, so that you have to choose not to be an organ donor, you get far greater uptake.
In virtually every country that's ever done this, uptake is much, much larger. Oxfam: good organization, no? We could support that. A global NGO doing good things, distributing aid all around the world. Oh, but look, they've got a default. It's not a single donation, it's a regular monthly donation.
That would be pretty easy to overlook, and suddenly you're making a regular monthly donation to Oxfam. I'm not saying that's a bad idea, and in fact we probably should all do that. But the fact that they set this default, I would argue, is morally questionable. And if Oxfam is doing that, you can bet most other people are too.
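To make the mechanism concrete, here is a minimal sketch of what a default-based dark pattern looks like at code level. This is illustrative markup of my own, not Oxfam's actual form; the point is simply that the recurring option is pre-selected before the donor has chosen anything.

```html
<!-- Illustrative sketch of a default-based dark pattern.
     The recurring option carries the "checked" attribute,
     so doing nothing means donating monthly. -->
<form action="/donate" method="post">
  <label>
    <input type="radio" name="frequency" value="monthly" checked>
    Make this a regular monthly donation
  </label>
  <label>
    <input type="radio" name="frequency" value="single">
    Make a single donation
  </label>
  <button type="submit">Donate</button>
</form>
```

That one `checked` attribute is the entire ethical decision: the form works identically either way, but the default decides what an inattentive donor ends up doing.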
Here someone's bought an iPad, and at the last moment the system has slipped an iPad case into the basket without them ordering it or selecting it. A classic example of a dark pattern: when you receive your iPad, it comes with the case and you think, hold on, I didn't order that. But you did, and you paid for it. This is the UK postal service. How many of us have seen this kind of thing?
This one is especially tricky because the first two checkboxes are for if you don't want to receive mail, and the next two are for if you do. So you'd have to read all that text and tick the correct boxes to avoid your mailbox filling up with junk. And how often do you see this kind of thing? I mean, loads of websites do this. So here's a game, Two Dots, quite good fun to play. I downloaded it;
I think it was Four Dots then. Lots of games do this over time, and they're very good at it: the interface trains you to react in a certain way to an interface item, in this case the green button. So going from left to right: okay, play again, go, I want to play, I want to play.
Oh, hold on, now I'm paying to play, with no real indication that the interaction has changed. And if you play this game a lot, you're going to end up spending money that you didn't want and didn't intend to spend. This is a kind of famous example: Ryanair, otherwise known as the devil.
They try to get you to buy insurance, and in order not to buy insurance, you have to scroll down in a list of countries to between Denmark and Finland, to the letter D for "don't give me insurance". Tricky stuff; I don't know who designed that.
Very sneaky, very unethical. And a lot of this stuff isn't visible at all. Even if you did scroll down to between Denmark and Finland, there's lots of sneaky, unethical stuff going on without you even knowing. A few examples here, most of them the same kind of example: hidden geolocation.
So here's an emoji input, a kind of famous case from the Google Chrome store. You download an app and it lets you use all kinds of weird and wonderful emojis. But hold on: it reports your location every ten minutes. They've just snuck that into the app without letting anyone know it's happening.
So it turns out its main purpose is not to give you new and fun emojis; its main purpose is to tell advertisers where in the world you are, and to sell that information on. Same thing going on here with a weather app. What's the weather where I'm going, or where I'm likely to be at the weekend, what's going on? Hold on: it reports your location with incredible frequency. There's no real reason for it ever to do that.
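A minimal sketch of how such hidden geolocation could be implemented; the endpoint and payload here are invented for illustration, this is not the emoji app's actual code.

```html
<script>
  // Poll the user's position on a timer and quietly ship it off.
  // Nothing in the visible interface corresponds to this loop.
  setInterval(() => {
    navigator.geolocation.getCurrentPosition((pos) => {
      // Send coordinates to a hypothetical ad broker
      fetch("https://ads.example.com/track", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          lat: pos.coords.latitude,
          lon: pos.coords.longitude,
        }),
      });
    });
  }, 10 * 60 * 1000); // every ten minutes, as in the emoji case
</script>
```

The behaviour exists only at code level, which is exactly the problem: there is no interface element through which the user could ever discover it.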
Who here uses Instagram? Loads of people. Have you turned geolocated images off? You probably have. Instagram doesn't care about that: they geotag all your images whether you've got geotagging turned off or not.
Very sneaky, very immoral, very unethical; they shouldn't do it. Finally, a famous case: the torch app that reports where you are to advertisers around the world. So designers can use technology to persuade, and there's a great case for that: political advocacy.
Oxfam do it. To seduce people into buying things, or even to seduce people into doing good things in the world. And to force people to act in certain ways; you see this all the time, loads of apps and platforms force you to give your name. I was looking at Uber yesterday (I would never sign up to Uber, by the way, don't worry), and you're not allowed to use Uber until you've given them your full name, your phone number, and your credit card details.
That's the first thing they ask you for, before you can even see what the system does or open an account. Airbnb, with that incredibly complex system of feedback: don't worry, you can give this comment, but your host won't see it,
but we'll see it, all that business. I don't know who thought of that. So these technologies enforce invisible power relations, and you see this all the time. One of the jobs of the interface designer is to bring them to light, to make them apparent so everyone can see them. So how could things be better? Well, a few clear and easy ways.
This is quite a new field, I think, but the first is voluntary regulation: codes of ethics that are explicitly for user experience and interface design work. The DIA, an Australian design institute, has a code of ethics, and the user experience professionals' association has an ethics code too.
This kind of thing is well known in other fields, so why shouldn't interface designers have it? Well, they're doing it, and I would support this kind of thing. The next is what I would call resistance. When you're being invisibly tracked through your browser or your cookies or whatever: I use Ghostery.
It's pretty good, and there are lots of other tools that do this. They're tools explicitly designed to counteract hidden, enforced actions, i.e. "we're taking loads of data about you and we're not going to tell you". So my argument is that this is a category of ethical design that takes ethics and morality explicitly as its subject.
The people who design these things aren't really saying "we're going to design an ethical interface"; they're saying "we're going to interrupt this marketplace of tracking with a thing that we think is better and more ethical". I would support this kind of thing.
And finally, this kind of thing is happening more often as well: intervention directly into the interface. The first one you can see is for Kayak; it's an app that will say, these are the Kayak privacy settings, do you really want to do this? And it pops up through the process of using the system.
The second one is privacy protection that works on any app: you can say protect, or fake, so you can fake your ID. With the map one, you can fake your location by dragging that pin anywhere in the world you want to be. That would be a good one for the Instagram users, for example: you might think that I'm in Berlin, but I'm just going to say that I'm in Santa Monica or whatever.
Quite a clever idea, I think. And the fourth one: since Facebook were discovered to have been tracking people who just visit the site without even being members, this is a way of giving false names to the system, so that it tracks imaginary people and is deceived in that way.
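As a rough illustration of how that pin-dragging, location-faking tool could work in a browser, here is a minimal sketch that shadows the Geolocation API; the coordinates are just the Santa Monica example from above, and no real tool's code is being quoted.

```html
<script>
  // Make every page on this site see a chosen spot instead of the
  // user's real position, by overriding the Geolocation API methods.
  const fakePosition = {
    coords: { latitude: 34.0195, longitude: -118.4912, accuracy: 10 },
    timestamp: Date.now(),
  };
  navigator.geolocation.getCurrentPosition = (success) =>
    success(fakePosition);
  navigator.geolocation.watchPosition = (success) => {
    success(fakePosition);
    return 0; // dummy watch id, since nothing is really being watched
  };
</script>
```

In practice a tool like this would run as a browser extension's content script, but the principle is the same: the interface for lying to trackers is just a few lines sitting between the page and the API.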
So I'm not the only one who's thinking about this; I'm not really presenting it as a brand new idea. There's a designer called Gabriel White who's been doing interesting stuff. He says that designers should provide feedback on how much time people spend on a site,
and on how the system elicits a change in usage patterns over time. So when I start using an app, I might be using it in a certain way, maybe once a day or something; and as it starts to shape my behavior, my pattern of usage on that app will change.
I'm playing a game more often, or whatever, and the interface should explicitly show that: how has this system started to shape my behavior? You could also compare the way you use a system to the way other people use it, that kind of thing.
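Here is a minimal sketch of what that suggestion could look like in practice, assuming visits are simply logged in localStorage; this is my own illustration of the idea, not White's design.

```html
<script>
  // Record this visit, then compare this week's count to last week's
  // and surface the behavioural shift to the user instead of hiding it.
  const visits = JSON.parse(localStorage.getItem("visits") || "[]");
  visits.push(Date.now());
  localStorage.setItem("visits", JSON.stringify(visits));

  const week = 7 * 24 * 60 * 60 * 1000;
  const now = Date.now();
  const thisWeek = visits.filter((t) => t > now - week).length;
  const lastWeek = visits.filter(
    (t) => t > now - 2 * week && t <= now - week
  ).length;

  if (lastWeek > 0 && thisWeek > 2 * lastWeek) {
    alert("You're using this twice as often as last week: " +
          thisWeek + " visits versus " + lastWeek + ".");
  }
</script>
```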
We should also design features so that users are not encouraged into repeated and obsessive actions. Candy Crush is a great example. The ethical thing to do as a designer would be to consider the behavioral impact of what you're designing: have you deliberately designed it for addictive or obsessive behavior? The speaker after me, Mushon, has done some interesting work on this topic as well,
about how interfaces enforce obedience of a certain kind. So don't miss his talk directly after this; I think you're locked in here now, so you can't go anywhere anyway. All of this stuff could be built into interfaces. So what I want to end with is a call for designers like me.
I'm a teacher as well, of course, but my design work is quite head-down stuff; it's quite difficult to lift your head away from the coalface and ask, what am I doing and how am I doing it? So my call is that we should do exactly that:
consider the implications of all the stuff we design for humanity, for behavior, and for what people do. Thanks for not falling asleep, thanks for listening. I'm happy to talk to anyone who wants to talk more about this. Thanks a lot.