All your family secrets belong to us - Worrisome Security Issues in Tracker Apps
Formal Metadata
Number of Parts: 322
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers: 10.5446/39686 (DOI)
Transcript: English (auto-generated)
00:02
Yeah, thank you for the short introduction. I'm Siegfried, this is my colleague Stefan, and today we will talk about our tracking-application investigation. We are both from Germany, from a research institute called Fraunhofer SIT, located in Darmstadt, close to Frankfurt. A few words about ourselves.
00:25
I'm Siegfried, I'm leading a research group at this research institute called Secure Software Engineering, and our main focus is on static and dynamic code analysis, so writing new analyses in order to find vulnerabilities in binaries as well as source code. I'm also a founder of the team called TeamSIK, which I will say more about in a second,
00:44
and a founder of CodeInspect, which is a reverse engineering tool for Android. So Stefan, would you like to say a few words about yourself? Yeah, hello, my name is Stefan. I belong to the Test Lab Mobile Security group. I'm also developing static and dynamic analysis tools, and in my spare
01:02
time I'm digging around a bit with IoT stuff, and together with Siegfried I'm a co-founder of our hacking team. Good, thank you. We talked about this TeamSIK: what we present today is not the result of just the two of us, it's basically the result of our team, which is called TeamSIK.
01:22
So what is it? A few words about it. It's a hacking group: we meet once a week in our spare time, and the team consists of researchers from this research institute as well as students from around the university. We usually look into different interesting projects and then we
01:42
try to find some vulnerabilities, and that's basically the goal: to learn from each other. So the credits definitely go to those brilliant students and researchers mentioned below; two of them are actually here in the audience. Good. Before we start with the talk, a short beer announcement. Since we are, as
02:01
you already know, from Germany, and actually from Munich, or close to Munich, we thought it might be yet another cool idea to import some boxes of beer, and since this is our third DEF CON talk, we imported two boxes of beer this year. So after the talk, feel free to come and grab a cold beer. There are 40
02:21
bottles for you guys. Thank you. So, let's get started. A short agenda for today: I will start with a little bit of motivation, then give a little background information, and then we dig into the results of our security findings.
02:44
The first topic is client-side authorization; I will explain what I mean by that. Then we will talk about client-side vulnerabilities, and then about server-side vulnerabilities. At the end, a few words about responsible disclosure, because this was funny this year, and a summary. Good. Motivation.
03:04
When we started putting the slides together, I asked myself: how can I motivate tracker applications? Well, first of all you think about surveillance. For instance, in the 60s there were already radio receivers inside a pipe, very
03:25
small stuff, which was very interesting to see. Also in the 60s, a camera hidden inside a pack of cigarettes to audio or video record the environment, and in the 70s a microphone already fit into a dragonfly in
03:42
order to spy on people. So, I guess you already get it: this was the past. How is it right now? I guess we all have it in our pocket: the smartphone. There are a lot of sensors in it, like GPS and so on, so you get a lot of information about people, and this is the reason why there is already spyware
04:00
and there are RATs abusing this stuff, extracting all the information and using it for whatever reason. But we also asked ourselves: are there any benign or good reasons to use such surveillance apps or tracking applications? And we found three different topics, which was interesting. First of all, families: there are apps out
04:25
there where parents want to know where their children are, whether they are safe, based on the location information. Then there are couples, which was interesting, with apps like Track My Boyfriend or Track My Girlfriend. I don't know why they do this, but they mutually agree on installing the
04:44
application to check whether they are cheating on each other or not; there are many of those out there. And friends as well: you want to know where your buddy is in order to meet up or whatever. So there are benign reasons. The question is: how do you differentiate between the good and the bad now, right?
05:02
Because from an implementation perspective, both are implemented in the same way. So we looked on Google Play and found these kinds of apps, and we thought there might be a definition of what good apps and bad apps are. One of the definitions we found, from the Android Security Report, states that commercial spyware is any application that
05:25
transmits sensitive information off the device without user consent and does not display a persistent notification that this is happening. This means if you want to build a benign tracking application and upload it to the Play Store, you need to show the monitored person a persistent notification, something like "right now I'm accessing your
05:45
location information and sending it to your mom". If that is the case, then it's a legitimate app, and if not, it's considered spyware and shouldn't be in the Play Store. This is at least what we found. Good. In this talk, or in our project, we only focused on those legitimate apps.
06:04
Because they are collecting a lot of data, we asked ourselves the question: how well is the collected data protected on the client side and on the server side? For that, as I said, we looked in the Google Play Store, typed in "tracking application", "track my
06:21
boyfriend", "track my girlfriend", and we ended up with 19 different apps. Why is this an odd number? Well, we just downloaded a few of them, the first hits, and at some point we stopped; we found so many vulnerabilities that at some point we got bored, and that is the reason why there are 19. There is no special reason for 19. We made sure we at least got the ones with the most installations,
06:45
based on the Google Play Store statistics, and another point is that we only looked at free applications. I know there are a lot of commercial spyware applications out there; those were not the target of this project, only the ones that
07:00
you can download, get and use for free. As a spoiler: we found 37 different vulnerabilities in total, very sensitive ones, and in this talk we will show a few of them, or at least a few categories of them. Good. Now to the takeaways of this talk:
07:21
What will you learn today? I have to tell you: if you expect any sophisticated exploit in this talk, unfortunately I have to disappoint you. In this project it was very, very easy to get access to all this highly sensitive data and even to do mass surveillance in real time. We also usually play this game of: can
07:45
we upgrade the applications where you have to pay for premium features, can we upgrade them for free? We will say a few words about that too, and yes, it was possible again this year. Good. Then I will come to the background information, just a very small amount of background so that
08:04
we are all on the same page. How does this work? Usually you have this application, and you have an observer and a monitored person; both install the application and then there is some kind of pairing process where they know: okay, I belong to this person, or I can monitor this person. On the monitored
08:21
side, the app collects all this sensitive information, like the location and so on, and sends it to the backend, and from the backend the observer pulls the information, saying: hey, right now I want to know where my kid is. So on the backend side there is information like location data, call history, text messages, WhatsApp messages, whatever. And a
08:46
couple of applications also had the cool feature of building a messenger into the tracking application, which means you can chat with your girlfriend or whoever, and you can also send pictures and videos; this is important for the rest of the talk. So, all this data is stored in the backend.
09:05
What are the attack vectors here? Well, as I said, the usual game: can we upgrade premium features for free? We will say a few words about that. Then, obviously, the two communication channels: can we do a man-in-the-middle attack, and how was the protocol implemented? We will say a few words about
09:21
that. And the last attack vector is the backend, basically. Good. In the following we will talk about all three of those stages. First, client-side authorization. Before I do this, let me start with something we all know, just to be clear about what it means to access
09:43
sensitive data. You have an observer who would like to access sensitive data from the backend. There are usually two steps involved: first authentication, including identification, and then authorization, where a check on the backend verifies that you are
10:01
allowed to access this data. We all know this, but I'm just saying it. What we saw was that most of the time there was some kind of authentication process, many times broken, many times missing entirely, and then there was something we call client-side authorization, which I will explain in a second. I will show you four
10:23
different examples of what we found that were not okay. Good. The first one: as I said, the usual game of premium features. These kinds of applications contain some features that are disabled by default, and if you pay, whatever, five dollars or something like that, you get super cool premium
10:43
features. One of them is, for instance, removing advertisements, so that you're not seeing the ads anymore. Very easy. So we asked ourselves how this was implemented, for instance getting rid of the advertisements, and we looked into the code, and we found the following.
11:02
There was a SharedPreferences.getBoolean call for an ads flag ("l_ads", for instance), then a check whether to remove the ads, and if this flag is set to true, they basically disable this view on the client side. For those of you who don't know what shared preferences are in Android: shared
11:22
preferences is a file that comes with the application; it's an XML-based file holding key-value pairs. In this case, this "l_ads" flag was set to false, and if you set it to true, you basically get rid of the advertisements. The question, for those who don't know it, is: how
11:42
can we manipulate this file? There are basically two ways. First, if you have a rooted device, it's very easy to change this value. On an unrooted device, if the application allows backups, you back up the application including this file, modify the file, and then restore it. This is all well known, also from the past.
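To make the pattern concrete, here is a minimal sketch of what such a client-side premium check can look like; the preference file name, the key name, the default value and the view handling are illustrative assumptions, not code from any of the analyzed apps:

```java
import android.content.Context;
import android.content.SharedPreferences;
import android.view.View;

public class AdGate {
    // Illustrative only: key and file names are assumptions, not vendor code.
    private static final String PREF_NAME = "app_prefs";
    private static final String KEY_REMOVE_ADS = "l_ads";

    static void applyAdPolicy(Context ctx, View adBanner) {
        SharedPreferences prefs = ctx.getSharedPreferences(PREF_NAME, Context.MODE_PRIVATE);
        boolean removeAds = prefs.getBoolean(KEY_REMOVE_ADS, false);
        if (removeAds) {
            // "Premium" is decided purely on the client: anyone who can edit the
            // backed-up shared_prefs XML can flip this boolean and unlock the feature.
            adBanner.setVisibility(View.GONE);
        }
    }
}
```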
12:02
Then, when we looked into this shared preferences file, we found some other cool settings there. One of them was an "SMS full" setting. SMS full means that all text messages from the monitored person can be
12:21
accessed by the observer, the full text messages, because they want to know whether the girlfriend or boyfriend is cheating, so they want to know exactly what's going on. Oh, sorry, I forgot to say what this "full" means: if you did not pay, you only get the first x characters of each text message, and if you pay,
12:44
you get the complete text message as an observer. Well, if you set this flag from false to true (we just learned how), you get the full text message. But the question was: how was this implemented? It was implemented in such a way that the observer basically says, hey, please give me all text messages from this guy, from my
13:02
kid or from my girlfriend, and then the server says, yeah, okay, sure, you get the complete text messages, message one, two, three, the complete ones. And then on the client side there was a check: okay, if you did not pay, I only show the first 50 characters, and otherwise you see the complete text message. This was a little bit funny to see, and you really shouldn't do this.
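A hedged sketch of the truncation pattern described above; the flag name and response handling are assumptions, only the idea (the server sends everything, the client merely hides part of it) follows the talk:

```java
import android.content.SharedPreferences;

public class SmsView {
    // Sketch of the described pattern; the 50-character limit follows the talk,
    // everything else (names, plumbing) is assumed for illustration.
    static String displayText(SharedPreferences prefs, String fullMessageFromServer) {
        boolean paidFull = prefs.getBoolean("sms_full", false);
        if (paidFull) {
            return fullMessageFromServer;
        }
        // The server already sent the complete message; only the UI hides the rest,
        // so flipping the flag (or reading the raw response) reveals everything.
        int limit = Math.min(50, fullMessageFromServer.length());
        return fullMessageFromServer.substring(0, limit);
    }
}
```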
13:23
Yeah, because, I mean, come on. So, yeah: what the fuck? Good. The second stage of this kind of bug: as I said, there are basically two roles. You have a
13:45
parent, which has the admin role, and then you have your kid, which has fewer privileges. If you're an admin you can create a new admin, and you can monitor your kid. The question was: okay, how do these apps differentiate between an admin,
14:05
or a parent, and a child? The answer: there is a shared preferences file, and there is a setting called "is_parent", and if it is set to true, then you're a parent and you're an admin.
14:24
This means, if you are the kid and you flip this flag in your shared preferences file to true, you're an admin and can spy back on your parents if you want. Good. Next, another example of this kind; I guess you already get the idea. There were applications that contained additional security protection
15:01
mechanisms: once you open the application, it asks you for a PIN, and only if you enter the correct PIN can you access the application or the data in it. This is a good security feature. The question, again, was how
15:21
it was implemented, and I guess you already know this game by now. There was a flag in the shared preferences file, this time called "p_flag", so not so directly pointing to the lock screen or anything, and if you set it to false, you do not see any lock screen at all; even if you had set a PIN, you can directly
15:45
access the data. And last but not least, the same obviously also worked for login: there is an "is_login" flag, and if you were logged in before, they basically store the username and password, and if you set this flag to true, it automatically logs you in, even without typing the username and password.
15:45
Again, shared preferences. I mean, yeah. So the last slide for this part: please do not use shared preferences for authorization checks. For those of you who are bug hunters, please look into shared preferences, it's always fun and you find a lot of stuff; and for the developers, please don't do this again. We talked about this two
16:03
years ago, and last year, about shared preferences, and there are for sure more apps, or developers, that don't understand this, so please don't do this again. Good. This was it from my side. I will now hand over to Stefan, who will continue with the remaining slides.
16:22
Okay, thank you. I will explain the rest of our findings and vulnerabilities now. First, the client-side and communication vulnerabilities, and for the people who are not aware of the concept, a few words about man-in-the-middle attacks. The basic idea is that the attacker gets in between the
16:42
communication, so between the user and the backend, and tries to eavesdrop on or even manipulate the communication. If, for instance, the app communicates in plain text, this is very easy for an attacker: he can read everything, change data, and so on. Another case is when the app has implementation flaws, for example it
17:02
uses broken encryption or has errors so that the attacker can easily bypass the encryption. And the last case, which would be the only reliable protection against a man-in-the-middle attacker, is to implement secure, correct communication that is confidential, integrity protected, and authenticated.
17:28
So, our first finding. I mentioned man-in-the-middle attacks: we had an application where you were required to sign in, and we wanted to know how secure this login process is. As a user, you have to enter your credentials; as an attacker, we observed them. The first thing
17:44
you can see is that it's an HTTP connection, so it's plain text, and a man-in-the-middle attacker would be able to read the plain-text credentials. But as you can see in this GET request, these are not our plain-text credentials.
18:00
So we tried the login a few times to see some pattern. As you can see, we have different parameter names and different parameter positions, but always the same two parameter values. This looked interesting, so we dug into the code to find out more about the implementation. The first thing we saw was a hard-coded encryption key. Reverse
18:25
engineering the algorithm, we saw: okay, the user name is XORed with this key and Base64 encoded, and there is a predefined set of random values, one of which is randomly picked and combined with the user name. The
18:44
same procedure is used for the password. This means that if we, as a man-in-the-middle attacker, observe the traffic, we simply take the value, decode it, XOR it, and we get the credentials in plain text.
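For illustration, a small standalone decoder of the kind an attacker could write once the hard-coded key has been pulled out of the APK; the key value and parameter handling here are made up, only the Base64-plus-XOR scheme follows the description:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class CredentialDecoder {
    // Hypothetical hard-coded key; the real app shipped its own constant key,
    // which a reverse engineer can simply copy out of the APK.
    private static final byte[] KEY = "s3cr3t-key".getBytes(StandardCharsets.UTF_8);

    static String decode(String parameterValue) {
        byte[] cipher = Base64.getDecoder().decode(parameterValue);
        byte[] plain = new byte[cipher.length];
        for (int i = 0; i < cipher.length; i++) {
            // XOR is its own inverse, so "decryption" is the same operation as "encryption".
            plain[i] = (byte) (cipher[i] ^ KEY[i % KEY.length]);
        }
        return new String(plain, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Usage: pass a sniffed parameter value, get the plain-text credential back.
        System.out.println(decode(args[0]));
    }
}
```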
19:03
The other parameters we saw were just garbage: two additional parameters that were also randomly selected from a predefined set but carried no value at all. This is some kind of weird obfuscation; we don't know why the developer did it, but it's the wrong way. As I said, if you are able to eavesdrop on this data, you can
19:23
decrypt it, get the login data, and authenticate. How to do it right in Android? Secure communication is not so hard: you just have to use an HTTPS connection with TLS 1.2, or later 1.3. You need a valid server
19:40
certificate, which you can get for free, for instance from Let's Encrypt. And on the Android side, doing HTTPS is very easy: define a new URL object, open a connection, and that's all, then it's done.
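A minimal sketch of that approach, assuming a placeholder endpoint; with a valid certificate on the server, the platform's default TLS handling already performs certificate and hostname validation:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class SecureRequest {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; the default TLS setup validates the chain and hostname.
        URL url = new URL("https://backend.example.com/api/login");
        HttpsURLConnection conn = (HttpsURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.getOutputStream().write("user=alice&pass=secret".getBytes("UTF-8"));

        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        } finally {
            conn.disconnect();
        }
    }
}
```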
20:01
Okay, the next thing we saw: problems with authentication. As Siegfried already mentioned, the apps transfer all the tracking and location data to some kind of backend. In most cases this backend hosts a database, and if you want to connect to a database, for instance from Android, you have to instantiate the database driver and then
20:22
establish a connection. What we saw in the application is a typical pattern of how you should not do it: they establish the connection directly, so you need the URL, a login name and a password, and the problem is that the password is stored in the application. This means everybody who has
20:44
access to the application, or to this code, can simply extract the password, the user name and the URL, and then has complete access to the backend storage where all the data is stored.
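The following is an illustrative sketch of the anti-pattern described here, with placeholder connection values; it is not the vendor's code, just the shape of a client that ships database credentials and talks to the backend database directly:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DirectDbAccess {
    public static void main(String[] args) throws Exception {
        // Anti-pattern sketch: URL, user and password ship inside the APK, so anyone
        // who decompiles the app gets full access to the backend database.
        String url = "jdbc:mysql://backend.example.com:3306/tracker"; // placeholder
        String user = "appuser";                                      // placeholder
        String password = "hardcoded-password";                       // placeholder

        try (Connection con = DriverManager.getConnection(url, user, password);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT email, latitude, longitude FROM locations")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " " + rs.getDouble(2) + " " + rs.getDouble(3));
            }
        }
    }
}
```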
21:01
We had a few apps like this. Here you see a simplified database scheme: this backend stores the email address, a name, and especially also the location information. Across our findings we had, in total, 860,000 different tracking locations, and if you make regular queries you can also observe or track people in real time, because the app regularly
21:24
sends updated data to this database. And that's not all, of course. When we looked a step further into the code, we thought: okay, SQL. For SQL, the right way is to
21:42
use prepared statements, and you can see they already defined a prepared statement, so now we would expect some method that sets the values in this statement. But what we saw was that they overwrite the prepared statement with a concatenated statement, and as a user you can, for instance, control the email address.
22:02
This sample you see here is a classic SQL injection vulnerability. So the app is already broken by design, but we stumbled upon this additionally. I don't want to be impolite, but this is really stupid code.
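To contrast the two, a short sketch under assumed table and column names: the first method reproduces the described mistake (string concatenation despite a prepared statement being available), the second binds the value as a parameter:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserLookup {
    // Vulnerable variant, roughly the pattern described in the talk: the query is
    // concatenated from user input, so an email like "' OR '1'='1" dumps every row.
    static ResultSet findUserInsecure(Connection con, String email) throws SQLException {
        Statement st = con.createStatement();
        return st.executeQuery("SELECT * FROM users WHERE email = '" + email + "'");
    }

    // Correct variant: the value is bound as a parameter and never parsed as SQL.
    static ResultSet findUserSafe(Connection con, String email) throws SQLException {
        PreparedStatement ps = con.prepareStatement("SELECT * FROM users WHERE email = ?");
        ps.setString(1, email);
        return ps.executeQuery();
    }
}
```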
22:21
Okay, now to more exciting things: let's get to the servers, which are hosting all the data. Siegfried already introduced it: we have, and we need, an authentication process and an authorization process, and we tried to analyze whether there are problems, design flaws, or other vulnerabilities that allow bypassing or breaking these processes. We
22:45
found different, let's say, stages of vulnerabilities and findings, and I will now explain the different findings on the server side. The first thing is not really a vulnerability, it's more, let's say, a feature or a
23:02
usability thing. After installation, the application has, by design or by default, an option which says that all location and tracking data are sent to the backend and everybody has access; this means they are freely accessible. So this is a kind of,
23:22
let's say, design flaw. A better option would be an opt-in, where the default is that not everybody can access it, and only if the user agrees can they activate public access to this data. What does this data look like, and how can you access it? It's very easy: you just need to know the website and the username of the person you want to track or listen to.
23:42
For this we have prepared a short demo video. If you go to the website, as I said, you have to know the URL of the website. Then you have to insert a username; for this we chose a random username, let's call the
24:05
user "user sexy". If you open it, you already see that you get some tracking on Google Maps. Now you can open the details: you see the
24:22
track button, you get the exact location, when the track was stored, the starting time, you get some kind of profile information, the altitude and so on. Here you see the altitude diagram, and you can also replay the
24:43
track of the person. You can see: okay, he's entering his car, you see his speed, he's driving around. Because of the integration with Google Maps you can also zoom in and look at the details, so we can see, okay, the person is going to some
25:01
school, he's driving around and so on. It's also a bit curious: in this track the person is going to a school at 1 p.m., and he's moving between the school and the ATM several times. Don't ask what he or she is doing
25:22
there. And yeah, at the end, the person is going back to their home address. Okay. As I said, this was just a feature, not really a bug, but that is not all: we also stumbled upon
25:42
a bug in this website, this time not a feature but an XSS. So, next step: authentication problems. Sometimes you get the impression: authentication, what? We took another application, and if you look at the traffic of the application or reverse engineer the code, you see some HTTP requests in there. As already mentioned, nothing
26:05
new: plain-text communication. Then we have a user ID; the user ID is the ID of the user itself, so the person who wants to monitor someone. This user ID was "protected" by a Caesar cipher. I don't know why; it makes no real sense. Then we
26:26
have the child ID, which is the ID of the person you want to observe; this is a simple 10-digit number. And we have the current date, which is the date when the last tracking of the person was stored. As you can see, this is not a
26:42
very complex request, and you can also try to guess the child ID of the person you want to monitor. If you send this request, the host responds with the whole track of the person. We took this tracking data and plotted it on Google Maps, and here you see the tracking profile of one of our students. This is completely accessible without any authentication, login process or whatever; everybody who knows this URL can track other people.
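Since a Caesar cipher has only a handful of possible shifts, "decrypting" such a user ID amounts to trying all of them; the digit alphabet below is an assumption, and the same brute force works for letters with 26 shifts:

```java
public class CaesarBruteForce {
    // A Caesar cipher over digits has only ten possible shifts, so the "protection"
    // is defeated by simply trying them all and picking the plausible result.
    static String shiftDigits(String s, int shift) {
        StringBuilder out = new StringBuilder();
        for (char c : s.toCharArray()) {
            if (Character.isDigit(c)) {
                out.append((char) ('0' + (c - '0' + shift) % 10));
            } else {
                out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String obfuscatedUserId = args[0];
        for (int shift = 0; shift < 10; shift++) {
            System.out.println("shift " + shift + ": " + shiftDigits(obfuscatedUserId, shift));
        }
    }
}
```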
27:02
Okay, Siegfried already introduced some additional features: apps also have the possibility to
27:21
send or track text messages, and the question was: can we also get these messages? If you look into the traffic again, to get a message from the monitored person there's an API, and you have to make
27:42
a simple POST request: you give a number, how many SMS you want to have from this user, and his user ID. After that, you get a timestamp, the phone number, and the messages the monitored person sent. Now, what happens if you leave the user
28:02
ID empty? You get all stored text messages from the server. Yeah. Okay. So, as you see, this is no rocket science, we have no really complex exploits; you just have
28:22
to know how to use your browser and send a URL. Now we get into exploiting; here we have SQL injection, very simple. Again, we have another type of app, and in this app it's also possible to track a person. Here you have to know the mobile
28:40
number of the person, which means the backend provides an API: if you enter the mobile number, you get the longitude, latitude, the location timestamp of the person you want to monitor. Okay, a little spoiler, we're talking about SQL injection, so what do you think happens if you do this? Yes, you get all data,
29:03
phone numbers, location data from the backend. If you look at the history, the first recording started in the year 2016. That's all, a simple SQL injection. The next one is a bit more, let's say, complex SQL. Siegfried also mentioned additional features
29:22
like messenger functions, so you can, for instance, exchange images with your girlfriend. As in a usual messenger, these images are stored on a cloud system, and of course there's one cloud for all images, not a separate cloud per user. In this case the user needs to authenticate at the cloud backend, and there also needs to be filtering,
29:44
meaning: okay, this user has authenticated, these images belong to this user, so he only gets the images of his girlfriend and not images of other people. The question is now: can we somehow bypass this authentication, or were we able to compromise the cloud?
30:02
Spoiler: a little demo. If you take a look at this cloud, it already provides a simple web interface. By the way, we have to obfuscate the URL because this bug
30:20
was still not fixed by the vendor. So, the cloud storage provides some kind of simple web interface, but as you see, when you enter the URL in the browser we get no images, because we are not authenticated. And we are in the SQL injection section, so let's try a simple SQL injection on the parameter. There should
30:44
be, in the upper corner, a bigger image of the SQL injection; you can see it here. And surprise, surprise: we get a preview of the images stored on the cloud system, and we can also open these images and download them. As you can imagine,
31:04
if people have the possibility to exchange images, they will not just exchange burgers or selfies, they will also exchange more private or sensitive material, let's say from the area of adult entertainment and so on. So
31:25
we also found very sensitive data on this cloud. I cannot say how much data; we did not count or download it because of
31:43
privacy reasons and so on. And, as I said, the bug is still not fixed. So, in this way, we would be able to dump all images. Then, the last step:
32:01
these were just images. Now we want to go for the crown jewels: can we get the credentials? One of our applications had a strange, let's say, installation process. The app was able to recognize whether it had already been installed on the device, so they had a special reinstallation procedure. This means when you
32:23
install the app for the first time, it generates some kind of device ID and stores it on the backend. If you remove the application and reinstall it, it asks the server for the device ID, compares, and is able to detect: okay, I was already installed on this device. And if it realizes that it was already installed on the
32:42
device, the server sends the user name, the password, and the email address back to the application. Our first idea was: okay, the device ID, can we spoof it? The problem is that the device ID is a long number, very complex; it is a deterministic number, so you could perhaps
33:04
reproduce it, but it's not the best way. Our other trick, leaving the ID empty, also does not work. So let's try an SQL injection. Here you see a little curl command doing the request with the SQL injection, and what we
33:20
got was stored user credentials: the user name, the user ID, the password in plain text. Now, as you can imagine, we can iterate over all values, and all in all we were able to extract over 1.7 million records, passwords, credentials, everything in plain text. So, if you think: what the fuck? Okay.
34:00
First, a few words about Firebase, for those who are not aware of it. Firebase is a service
34:05
from Google supporting web and app developers. They provide services for crash analysis, cloud messaging, and storage. In our case we focus on two services: one is the Realtime Database and the other is an authentication process, or an API for it. Just imagine this Realtime
34:25
Database like a classic database if you are not familiar with this Firebase service. So, we have another app. They implemented an authentication process and hosted their own authentication server, and as a user, or as an attacker, you first
34:41
have to send a login request. For this you have to send the user's email; in our case, as an attacker, we send the victim's email. On this backend there is a users table that stores the user email and the corresponding user ID.
35:05
So if you send your POST request, the backend queries the database, and if it finds the corresponding email address, it replies with this user ID. In the next step, the app tries to access the Firebase database by sending this
35:24
queried, corresponding user ID to get access to the stored data. So we query this user ID in this publicly available table, and as a response we get the location data, the address, and the date when the request was sent.
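A hedged sketch of what such an unauthenticated lookup against a world-readable Realtime Database node could look like; the node path and field names are assumptions, the point is only that with open rules the read needs no sign-in at all:

```java
import com.google.firebase.database.DataSnapshot;
import com.google.firebase.database.DatabaseError;
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;
import com.google.firebase.database.ValueEventListener;

public class PublicDbLookup {
    // Sketch only: the node name "users" and the child fields are assumptions.
    // With default (open) security rules this read succeeds without any sign-in,
    // for any user ID the caller can guess or obtain.
    static void fetchVictim(String userId) {
        DatabaseReference ref = FirebaseDatabase.getInstance()
                .getReference("users")
                .child(userId);
        ref.addListenerForSingleValueEvent(new ValueEventListener() {
            @Override
            public void onDataChange(DataSnapshot snapshot) {
                Object location = snapshot.child("location").getValue();
                Object address = snapshot.child("address").getValue();
                System.out.println("location=" + location + ", address=" + address);
            }

            @Override
            public void onCancelled(DatabaseError error) {
                System.out.println("read denied or failed: " + error.getMessage());
            }
        });
    }
}
```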
35:43
This was the first thing, and as you can imagine, just by guessing the email address you get really easy access. I'm sorry, I did not find a better facepalm than this one. In the next step you see that the app, or rather the database backend, also replies with the user credentials back to the app. The question now is: why?
36:07
This is an example of how you should not do it. The developer implemented a client-side verification: they expect the credentials from the server and compare them in the application, and if they match, they allow access. I think you're aware that this is not the correct way to do it.
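Reduced to its core, the described client-side check boils down to something like the following sketch; method names and the response format are assumptions:

```java
public class ClientSideLogin {
    // Anti-pattern sketch of the described flow: the server hands the stored
    // credentials to the client, and the client decides whether the login is valid.
    // Anyone who can read the response (or patch the app) gets in, and the victim's
    // password leaves the server for no reason.
    static boolean verifyLocally(String enteredPassword, String passwordFromServer) {
        // The comparison that should happen on the server happens here instead.
        return enteredPassword.equals(passwordFromServer);
    }
}
```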
36:24
Another thing is our old trick: what do you think happens if you remove the user ID? Of course, we get all stored data from this database, containing locations, address data, user credentials,
36:43
security tokens, whatever. Yeah, shit happens. It's sometimes easy to bash people, so what's the actual problem here? The first thing is that they did not set any authorization rules on Firebase. This is
37:04
always a common problem: developers are not aware of it, they use some default configuration, and then authorization is disabled. The next thing, as I explained: if you're doing authentication, you don't do it on the client side, you have to do it on the server side. And especially if you want to work
37:24
with Firebase, use the SDK; don't build any strange code constructs yourself. The SDK supports Google sign-in, you can use custom email and password sign-in, Facebook, whatever. They already implemented it in the correct way in this SDK.
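As a hedged sketch of the SDK route the speakers recommend, an email/password sign-in with FirebaseAuth could look roughly like this; error handling and UI wiring are omitted, and the database must additionally be locked down with server-side security rules:

```java
import com.google.android.gms.tasks.Task;
import com.google.firebase.auth.AuthResult;
import com.google.firebase.auth.FirebaseAuth;
import com.google.firebase.auth.FirebaseUser;

public class FirebaseLogin {
    // Minimal sketch of SDK-based email/password sign-in. Combined with server-side
    // security rules on the database, the backend, not the app, decides who may
    // read which records.
    static void signIn(String email, String password) {
        FirebaseAuth auth = FirebaseAuth.getInstance();
        auth.signInWithEmailAndPassword(email, password)
            .addOnCompleteListener((Task<AuthResult> task) -> {
                if (task.isSuccessful()) {
                    FirebaseUser user = auth.getCurrentUser();
                    System.out.println("signed in as " + (user != null ? user.getUid() : "?"));
                } else {
                    System.out.println("sign-in failed: " + task.getException());
                }
            });
        // Database access is then restricted with rules such as "only auth.uid may
        // read /users/$uid" instead of a world-readable table.
    }
}
```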
37:43
It's also possible to use your own authentication backend; there's a good tutorial on the Firebase site, and if you follow it step by step you're on the right way. But don't do anything with publicly available databases, homemade constructions and other weird things. Okay, a few experiences from our responsible disclosure process. Of course we
38:05
informed all vendors and gave them 90 days to fix the issues, but the 90 days are not strict: if a vendor says, okay, I need longer because of development cycles or whatever, we say it's okay, as long as you fix it. This time we got a few strange
38:21
reactions. The first one is as expected: we will fix it, thank you, everybody is fine. The second: sometimes you simply get no reaction. One vendor reacted with "how much money do you want?"; they thought we wanted to blackmail them, but we clarified: no, we just want to give you the report, please fix it. And the last one: it's
38:45
not a bug, it's a feature. For those manufacturers who did not react to our emails, we tried to involve Google. Google has this app security improvement team and also a security team; we wrote an email to them and sent the advisories,
39:04
reports, everything, but we did not get any direct reply or reaction. Last week we checked the store, and 12 of the applications had been removed, so seven are still vulnerable. As you saw in the demo video, the backend is still active and nobody is
39:22
reacting. So, a short summary of our talk. As always, as you saw: don't use plain-text communication. Mobile means radio communication, and in most cases it's very easy to eavesdrop on, sniff, and manipulate your data. Use SQL prepared statements;
39:45
it's nothing new, and Android especially provides a huge API for SQL and prepared statements. If you're doing app development, don't just focus on the app: if you use a backend, the two are bound together, and you also have to consider the
40:03
security on the backend side. Also very important: don't store any user secrets like passwords, encryption keys or whatever on the client side. Everybody who has access to the app, and there are a lot of people reverse engineering Android apps, can very easily extract that information. Siegfried already explained the
40:26
shared preferences thing: if you have any special feature or need licensing, Google provides an API for this. And if you're working with Firebase, read the Firebase tutorials and use the authentication and authorization APIs they
40:44
provide. At the end, here you see again a list of the apps we analyzed. As you can see, the left column shows the apps with client-side vulnerabilities; the right column shows the apps where the backend is involved and we were able to access
41:01
location data or even all stored data. If you look at the table, nearly all apps, especially on the backend side, were vulnerable to some type of attack. This is the end of the talk, thank you for your attention.
41:26
Two last words. For all our findings we wrote advisories, also for the vendors, and the advisories are accessible on our website together with the findings. And the last thing: whoever wants to talk with us, discuss, or has a question, come to
41:44
us and grab a cold beer. We also have a bottle opener, so you don't have to be thirsty. Thank you again.