Analysis Techniques for Mobile Operating System Security
Prof. William Enck, NC State
Raleigh ISSA, April 5, 2012
A cautionary tale ...
Traditional computing vs. smartphones
• Smartphones: logical conclusion of access consolidation, service decentralization, and commoditization of computing
• Usage model is very different
‣ Multi-user single machine to single-user multiple machines
‣ Always-on, always-computing social instrument
‣ Enterprise: separates action from geography
• Changing Risk
‣ Necessarily contains secrets (often high value)
‣ Collects sensitive data as a matter of operation
‣ Drifts seamlessly between “unknown” networks
‣ Highly malleable development practices, largely unknown developers
Rethinking (host) Security
• Permissions define capabilities.
• Application markets deliver functionality (free or paid) via packaged applications.
• Users make permission decisions.
• Applications are run within sandboxes provided by the OS.
• Note: App markets don’t (and can’t) provide security for everything.
security ≠ users
security == permissions
Research Questions
• Questions:
‣ What permissions do applications ask for?
‣ What do applications do with the permissions?
‣ What can applications do with the permissions?
Example: Android Security
• Permissions are granted to applications and never changed
‣ Permissions are enforced when an application accesses a component, API, etc.
‣ Runtime decisions check assigned permissions (access is granted iff app A was assigned permission X at install)
• Example permissions: location, phone IDs, microphone, camera, address book, SMS, application “interfaces”
[Figure: Android's permission model. Application 1 is granted permission labels l1, ...; its component A calls into Application 2, whose component B is labeled l1 (access granted) and component C is labeled l2 (access denied). Components inherit their application's permission labels.]
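The model in the figure can be illustrated with a toy Python sketch (class and function names are my own; real enforcement happens inside the Android middleware):

```python
# Toy model of install-time permission assignment and runtime mediation
# as described on this slide (illustrative sketch only).

class App:
    def __init__(self, name, granted_labels):
        self.name = name
        # Permission labels are assigned at install time and never change.
        self.granted = set(granted_labels)

def can_access(caller, component_label):
    # Runtime decisions only look up the install-time assignment:
    # access is granted iff the caller holds the component's label.
    return component_label in caller.granted

app1 = App("Application1", {"l1"})
print(can_access(app1, "l1"))  # component B, labeled l1: True
print(can_access(app1, "l2"))  # component C, labeled l2: False
```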
Q1: What do applications ask for?
• Kirin certifies applications by vetting policies at install time (relies on runtime enforcement)
• Insight: an app's configuration and security policy give an upper bound on its runtime behavior.
• Kirin is a modified application installer
‣ Apps with unsafe policies are rejected
[Figure: Kirin installer flow. (1) A new application attempts installation; (2) the Android application installer invokes the Kirin security service, which evaluates the application against the Kirin security rules; (3) Kirin returns pass/fail to the installer. (4) Optional extension: display risk ratings to the user and prompt for override.]
Kirin Security Policy
• Kirin enforces security invariants at install time
• Local evaluation of two manifest artifacts
‣ The collection of requested permissions (uses-permission)
‣ The types of registered Intent message listeners
• Example:
‣ Do not allow an application that has the Location and Internet permissions and receives the “booted” event
restrict permission [ACCESS_FINE_LOCATION, INTERNET] and receive [BOOT_COMPLETE]
Policy Evaluation
• Policy evaluation is the satisfiability of invariants
‣ Invariant violations found in O(n) w.r.t. policy size
• Model:
‣ KSL rules are tuples: ri = (Pi, Ai), a set of permissions and a set of action strings
‣ Configuration policy is a tuple: ct = (Pt, At)
‣ fail(ct, ri) if Pi ⊆ Pt ∧ Ai ⊆ At
‣ Certified if no rule fails: FR(ct) = ∅
(1) An application must not have the SET_DEBUG_APP permission label.
(2) An application must not have PHONE_STATE, RECORD_AUDIO, and INTERNET permission labels.
(3) An application must not have PROCESS_OUTGOING_CALL, RECORD_AUDIO, and INTERNET permission labels.
(4) An application must not have ACCESS_FINE_LOCATION, INTERNET, and RECEIVE_BOOT_COMPLETE permission labels.
(5) An application must not have ACCESS_COARSE_LOCATION, INTERNET, and RECEIVE_BOOT_COMPLETE permission labels.
(6) An application must not have RECEIVE_SMS and WRITE_SMS permission labels.
(7) An application must not have SEND_SMS and WRITE_SMS permission labels.
(8) An application must not have INSTALL_SHORTCUT and UNINSTALL_SHORTCUT permission labels.
(9) An application must not have the SET_PREFERRED_APPLICATION permission label and receive Intents for the CALL action string.
Figure 4: Sample Kirin security rules to mitigate malware
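As a concrete illustration, Rules 4 and 6 above could be written in the Kirin Security Language defined in Section 5 (constant spellings follow Figure 4; this encoding is my reconstruction, not taken verbatim from the paper's rule file):

```text
restrict permission ['ACCESS_FINE_LOCATION', 'INTERNET', 'RECEIVE_BOOT_COMPLETE']
restrict permission ['RECEIVE_SMS', 'WRITE_SMS']
```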
sender could hide other activity performed by the malware. While this attack is also limited by notifications in the status bar, again, the message contents can be transformed as spam.
Rule 7 mitigates mobile bots sending SMS spam. Similar to Rule 6, this rule ensures the malware cannot remove traces of its activity. While Rule 7 does not prevent the SMS spam messages from being sent, it increases the probability that the user becomes aware of the activity.
Finally, Rule 8 makes use of the duality of some permission labels. Android defines separate permissions for installing and uninstalling shortcuts on the phone's home screen. This rule ensures that a third-party application cannot have both. If an application has both, it can redirect the shortcuts for frequently used applications to a malicious one. For instance, the shortcut for the web browser could be redirected to an identically appearing application that harvests passwords.
4.2.3 Permission and Interface Security Rules

Permissions alone are not always enough to characterize malware behavior. Rule 9 provides an example of a rule considering both a permission and an action string. This specific rule prevents malware from replacing the default voice call dialer application without the user's knowledge. Normally, if Android detects that two or more applications contain Activities to handle an Intent message, the user is prompted for which application to use. This interface also allows the user to set the current selection as the default. However, if an application has the SET_PREFERRED_APPLICATION permission label, it can set the default without the user's knowledge. Google marks this permission as “dangerous”; however, users may not fully understand the security implications of granting it. Rule 9 combines this permission with the existence of an Intent filter receiving the CALL action string. Hence, we can allow a third-party application to obtain the permission as long as it does not also handle voice calls. Similar rules can be constructed for other action strings handled by the trusted computing base.
5. KIRIN SECURITY LANGUAGE

We now describe the Kirin Security Language (KSL) used to encode security rules for the Kirin security service. Kirin uses an application's package manifest as input. The rules identified in Section 4 only require knowledge of the permission labels requested by an application and the action strings used in Intent filters. This section defines the KSL syntax and formally defines its semantics.
5.1 KSL Syntax

Figure 5 defines the Kirin Security Language in BNF notation. A KSL rule-set consists of a list of rules. A rule indicates combinations of permission labels and action strings that should not be used by third-party applications. Each rule begins with the keyword “restrict”. The remainder of the rule is the conjunction of sets of permissions and action strings received. Each set is denoted as either “permission” or “receive”, respectively.

⟨rule-set⟩ ::= ⟨rule⟩ | ⟨rule⟩ ⟨rule-set⟩ (1)
⟨rule⟩ ::= “restrict” ⟨restrict-list⟩ (2)
⟨restrict-list⟩ ::= ⟨restrict⟩ | ⟨restrict⟩ “and” ⟨restrict-list⟩ (3)
⟨restrict⟩ ::= “permission [” ⟨const-list⟩ “]” | “receive [” ⟨const-list⟩ “]” (4)
⟨const-list⟩ ::= ⟨const⟩ | ⟨const⟩ “,” ⟨const-list⟩ (5)
⟨const⟩ ::= “'”[A-Za-z0-9_.]+“'” (6)

Figure 5: KSL syntax in BNF.
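The grammar in Figure 5 is simple enough to parse with a short token scan. The following Python sketch (helper names are my own, assuming quoted constants as in production (6)) reduces each “restrict” rule to a (permissions, receives) pair, matching the rule-tuple construction described in Section 5.2:

```python
import re

# Hypothetical minimal KSL parser: one rule per line, each reduced to a
# tuple (P_i, A_i) = (union of "permission" sets, union of "receive" sets).
TOKEN = re.compile(r"restrict|and|permission|receive|\[|\]|,|'[A-Za-z0-9_.]+'")

def parse_ksl(text):
    rules = []
    for line in text.strip().splitlines():
        tokens = TOKEN.findall(line)
        assert tokens and tokens[0] == "restrict", "rule must start with 'restrict'"
        perms, receives = set(), set()
        i = 1
        while i < len(tokens):
            kind = tokens[i]                     # "permission" or "receive"
            assert tokens[i + 1] == "["
            i += 2
            consts = set()
            while tokens[i] != "]":
                if tokens[i] != ",":
                    consts.add(tokens[i].strip("'"))
                i += 1
            i += 1                               # skip "]"
            if i < len(tokens) and tokens[i] == "and":
                i += 1
            (perms if kind == "permission" else receives).update(consts)
        rules.append((frozenset(perms), frozenset(receives)))
    return rules
```

Building the rule list this way is linear in the size of the rule set, as the text claims for the real construction.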
5.2 KSL Semantics

We now define a simple logic to represent a set of rules written in KSL. Let ℛ be the set of all rules expressible in KSL. Let P be the set of possible permission labels and A be the set of possible action strings used by Activities, Services, and Broadcast Receivers to receive Intents. Then, each rule ri ∈ ℛ is a tuple (2^P, 2^A).³ We use the notation ri = (Pi, Ai) to refer to a specific subset of permission labels and action strings for rule ri, where Pi ∈ 2^P and Ai ∈ 2^A.

Let R ⊆ ℛ correspond to a set of KSL rules. We construct R from the KSL rules as follows. For each ⟨rule⟩ i, let Pi be the union of all sets of “permission” restrictions, and let Ai be the union of all sets of “receive” restrictions. Then, create ri = (Pi, Ai) and place it in R. The set R directly corresponds to the set of KSL rules and can be formed in time linear in the size of the KSL rule set (proof by inspection).

Next we define a configuration based on package manifest contents. Let C be the set of all possible configurations extracted from a package manifest. We need only capture the set of permission labels used by the application and the action strings used by its Activities, Services, and Broadcast Receivers. Note that the package manifest does not specify action strings used by dynamic Broadcast Receivers; however, we use this fact to our advantage (as discussed in Section 7). We define configuration c ∈ C as a tuple (2^P, 2^A). We use the notation ct = (Pt, At) to refer to a specific subset of permission labels and action strings used by a target application t, where Pt ∈ 2^P and At ∈ 2^A.

³We use the standard notation 2^X to represent the power set of a set X, which is the set of all subsets including ∅.
We now define the semantics of a set of KSL rules. Let fail : C × ℛ → {true, false} be a function that tests whether an application configuration fails a KSL rule. Let ct be the configuration for target application t and ri be a rule, with (Pt, At) = ct and (Pi, Ai) = ri. Then, fail(ct, ri) holds if and only if:

Pi ⊆ Pt ∧ Ai ⊆ At

Clearly, fail(·) operates in time linear in the size of its input, as a hash table can provide constant-time set membership checks.
Let FR : C → 2^R be a function returning the set of all rules in R for which an application configuration fails:

FR(ct) = {ri | ri ∈ R, fail(ct, ri)}

Then, we say the configuration ct passes a given KSL rule set R if FR(ct) = ∅. Note that FR(ct) operates in time linear in the size of ct and R. Finally, the set FR(ct) can be returned to the application installer to indicate which rules failed. This information facilitates the optional user override extension described in Section 2.
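The semantics above translate almost directly into code. A minimal Python sketch (function names are mine; Python sets provide the constant-time membership checks the text mentions):

```python
# Sketch of fail(.) and F_R(.) from Section 5.2. A rule and a configuration
# are both (permission_labels, action_strings) pairs.

def fail(config, rule):
    perms_t, actions_t = config
    perms_i, actions_i = rule
    # The app fails the rule when it holds every permission label and
    # every action string the rule names (subset tests).
    return perms_i <= perms_t and actions_i <= actions_t

def failed_rules(config, ruleset):
    """F_R(c_t): all rules in the rule set that the configuration fails."""
    return [rule for rule in ruleset if fail(config, rule)]

def certify(config, ruleset):
    """An application passes certification iff F_R(c_t) is empty."""
    return not failed_rules(config, ruleset)

# Rule 2 of Figure 4, as a (permissions, action strings) tuple:
rule2 = (frozenset({"PHONE_STATE", "RECORD_AUDIO", "INTERNET"}), frozenset())
print(certify(({"INTERNET"}, set()), [rule2]))  # True
```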
6. KIRIN SECURITY SERVICE

For flexibility, Kirin is designed as a security service running on the mobile phone. The existing software installer interfaces directly with the security service. This approach follows Android's design principle of allowing applications to be replaced based on manufacturer and consumer interests. More specifically, a new installer can also use Kirin.

We implemented Kirin as an Android application. The primary functionality exists within a Service component that exports an RPC interface used by the software installer. This service reads KSL rules from a configuration file. At install time, the installer passes the file path of the package archive (.apk file) to the RPC interface. Then, Kirin parses the package to extract the security configuration stored in the package manifest. The PackageManager and PackageParser APIs provide the necessary information. The configuration is then evaluated against the KSL rules. Finally, the pass/fail result is returned to the installer along with the list of violated rules. Note that the Kirin service does not access any critical resources of the platform and hence does not require any permissions.
7. EVALUATION

Practical security rules must both mitigate malware and allow legitimate applications to be installed. Section 4 argued that our sample security rules can detect specific types of malware. However, Kirin's certification technique conservatively detects dangerous functionality and may reject legitimate applications. In this section, we evaluate our sample security rules against real applications from the Android Market. While the Android Market does not perform rigorous certification, we initially assume it does not contain malware. Any application not passing a security rule requires further investigation. Overall, we found very few applications where this was the case. On one occasion, we found a rule could be refined to reduce this number further.

Our sample set consisted of a snapshot of a subset of popular applications available in the Android Market in late January 2009. We downloaded the top 20 applications from each of the 16 categories, producing a total of 311 applications (one category only had 11 applications). We used Kirin to extract the appropriate information from each package manifest and ran the FR(·) algorithm described in Section 5.
7.1 Empirical Results

Our analysis tested all 311 applications against the security rules listed in Figure 4. Of the 311 applications, only 12 failed to pass all 9 security rules. Of these, 3 applications failed Rule 2 and 9 applications failed Rules 4 and 5. These failure sets were disjoint, and no applications failed the other six rules.

Table 1: Applications failing Rule 2
Application                  Description
Walkie Talkie Push to Talk   Walkie-Talkie style voice communication.
Shazam                       Utility to identify music tracks.
Inauguration Report          Collaborative journalism application.
Table 1 lists the applications that fail Rule 2. Recall that Rule 2 defends against a malicious eavesdropper by failing any application that can read phone state, record audio, and access the Internet. However, none of the applications listed in Table 1 exhibit eavesdropper-like characteristics. Considering the purpose of each application, it is clear why they require the ability to record audio and access the Internet. We initially speculated that the applications stop recording upon an incoming call. However, this was not the case. We disproved our speculation for Shazam and Inauguration Report and were unable to determine a solid reason for the permission label's existence, as no source code was available.
After realizing that simultaneous access to phone state and audio recording is in fact beneficial (i.e., to stop recording on an incoming call), we decided to refine Rule 2. Our goal is to protect against an eavesdropper that automatically records a voice call on either an incoming or outgoing call. Recall that there are two ways to obtain the phone state: 1) register a Broadcast Receiver for the PHONE_STATE action string, and 2) register a PhoneStateListener with the system. If a static Broadcast Receiver is used in the former case, the application is automatically started on incoming and outgoing calls. The latter case requires the application to be already started, e.g., by the user, or on boot. We need only consider cases where it is started automatically. Using this information, we split Rule 2 into two new security rules. Each appends an additional condition. The first appends a restriction on receiving the PHONE_STATE action string. Note that since Kirin only uses Broadcast Receivers defined in the package manifest, we will not detect dynamic Broadcast Receivers, which cannot be used to automatically start the application. The second rule appends the boot complete permission label used for Rule 4. Rerunning the applications against our new set of security rules, we found that only the Walkie Talkie application failed our rules, thus reducing the number of failed applications to 10.
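One plausible KSL encoding of the two refined rules (my reconstruction from the prose above; the paper's exact rule file may differ):

```text
restrict permission ['PHONE_STATE', 'RECORD_AUDIO', 'INTERNET'] and receive ['PHONE_STATE']
restrict permission ['PHONE_STATE', 'RECORD_AUDIO', 'INTERNET', 'RECEIVE_BOOT_COMPLETE']
```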
Table 2 lists the applications that fail Rules 4 and 5. Recall that these security rules detect applications that start on boot and access location information and the Internet. The goal of these rules is to prevent location tracking software. Of the nine applications listed in Table 2, the first five provide functionality that directly contrasts with the rules' goal. In fact, Kirin correctly identified both AccuTracking and GPS Tracker as dangerous. Both Loopt and Twidroid are popular social networking applications; however, they do in fact provide potentially dangerous functionality, as they can be configured to automatically start on boot without the user's knowledge. Finally, Pintail is designed to report the phone's location in response to an SMS message with the correct password. While this may be initiated by the user, it may also be used by an adversary to track the user. Again, Kirin correctly identified potentially dangerous functionality.
The remaining four applications in Table 2 result from the limitations in Kirin's input. That is, Kirin cannot inspect how an ap-
Table 2 lists the applications that fail Rules 4 and 5. Recall thatthese security rules detect applications that start on boot and accesslocation information and the Internet. The goal of these rules is toprevent location tracking software. Of the nine applications listedin Table 2, the first five provide functionality that directly contrastwith the rule’s goal. In fact, Kirin correctly identified both Accu-Tracking and GPS Tracker as dangerous. Both Loopt and Twidroidare popular social networking applications; however, they do in factprovide potentially dangerous functionality, as they can be config-ured to automatically start on boot without the user’s knowledge.Finally, Pintail is designed to report the phone’s location in re-sponse to an SMS message with the correct password. While thismay be initiated by the user, it may also be used by an adversary totrack the user. Again, Kirin correctly identified potentially danger-ous functionality.
The remaining four applications in Table 2 result from the lim-itations in Kirin’s input. That is, Kirin cannot inspect how an ap-
We now define the semantics of a set of KSL rules. Let fail : C × R → {true, false} be a function to test if an application configuration fails a KSL rule. Let ct be the configuration for target application t and ri be a rule. Then, we define fail(ct, ri) to be true if and only if:

(Pt, At) = ct, (Pi, Ai) = ri, Pi ⊆ Pt, and Ai ⊆ At
Clearly, fail(·) operates in time linear to the input, as a hash tablecan provide constant time set membership checks.
Let FR : C → 2^R be a function returning the set of all rules in R for which an application configuration fails:

FR(ct) = {ri | ri ∈ R, fail(ct, ri)}
Then, we say the configuration ct passes a given KSL rule set R if FR(ct) = ∅. Note that FR(ct) operates in time linear in the size of ct and R. Finally, the set FR(ct) can be returned to the application installer to indicate which rules failed. This information facilitates the optional user override extension described in Section 2.
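The fail(·) and FR(·) semantics above reduce to subset tests over hash sets. The following is a minimal Java sketch of that evaluation; the rule contents are illustrative (modeled on the sample rules discussed later), and this is a sketch of the semantics, not the Kirin implementation:

```java
import java.util.*;

// Sketch of the KSL semantics: a rule (Pi, Ai) fails a configuration
// (Pt, At) iff Pi ⊆ Pt and Ai ⊆ At. Hash sets give constant-time
// membership, so evaluation is linear in the input.
public class KslCheck {
    // a rule pairs required permission labels with required action strings
    record Rule(Set<String> perms, Set<String> actions) {}

    // fail(ct, ri): both subset tests must hold
    static boolean fail(Set<String> pt, Set<String> at, Rule r) {
        return pt.containsAll(r.perms()) && at.containsAll(r.actions());
    }

    // F_R(ct): indices of all rules the configuration fails; empty = pass
    static List<Integer> failedRules(Set<String> pt, Set<String> at, List<Rule> rules) {
        List<Integer> failed = new ArrayList<>();
        for (int i = 0; i < rules.size(); i++)
            if (fail(pt, at, rules.get(i))) failed.add(i);
        return failed;
    }

    public static void main(String[] args) {
        List<Rule> rules = List.of(
            new Rule(Set.of("READ_PHONE_STATE", "RECORD_AUDIO", "INTERNET"), Set.of()),
            new Rule(Set.of("ACCESS_FINE_LOCATION", "INTERNET", "RECEIVE_BOOT_COMPLETE"), Set.of()));
        // an eavesdropper-like manifest fails only the first rule
        System.out.println(failedRules(
            Set.of("READ_PHONE_STATE", "RECORD_AUDIO", "INTERNET", "VIBRATE"),
            Set.of(), rules)); // [0]
    }
}
```

The failed-rule list is exactly what the installer receives to drive the optional user override.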
6. KIRIN SECURITY SERVICE
For flexibility, Kirin is designed as a security service running on the mobile phone. The existing software installer interfaces directly with the security service. This approach follows Android's design principle of allowing applications to be replaced based on manufacturer and consumer interests. More specifically, a new installer can also use Kirin.
We implemented Kirin as an Android application. The primary functionality exists within a Service component that exports an RPC interface used by the software installer. This service reads KSL rules from a configuration file. At install time, the installer passes the file path to the package archive (.apk file) to the RPC interface. Then, Kirin parses the package to extract the security configuration stored in the package manifest. The PackageManager and PackageParser APIs provide the necessary information. The configuration is then evaluated against the KSL rules. Finally, the passed/failed result is returned to the installer with the list of the violated rules. Note that the Kirin service does not access any critical resources of the platform and hence does not require any permissions.
7. EVALUATION
Practical security rules must both mitigate malware and allow legitimate applications to be installed. Section 4 argued that our sample security rules can detect specific types of malware. However, Kirin's certification technique conservatively detects dangerous functionality and may reject legitimate applications. In this section, we evaluate our sample security rules against real applications from the Android Market. While the Android Market does not perform rigorous certification, we initially assume it does not contain malware. Any application not passing a security rule requires further investigation. Overall, we found very few applications where this was the case. On one occasion, we found a rule could be refined to reduce this number further.
Our sample set consisted of a snapshot of a subset of popular applications available in the Android Market in late January 2009. We downloaded the top 20 applications from each of the 16 categories, producing a total of 311 applications (one category only had 11 applications). We used Kirin to extract the appropriate information from each package manifest and ran the FR(·) algorithm described in Section 5.
7.1 Empirical Results
Our analysis tested all 311 applications against the security rules listed in Figure 4. Of the 311 applications, only 12 failed to pass all 9 security rules. Of these, 3 applications failed Rule 2 and 9 applications failed Rules 4 and 5. These failure sets were disjoint, and no applications failed the other six rules.

Table 1: Applications failing Rule 2
Application                   Description
Walkie Talkie Push to Talk    Walkie-Talkie style voice communication.
Shazam                        Utility to identify music tracks.
Inauguration Report           Collaborative journalism application.
Table 1 lists the applications that fail Rule 2. Recall that Rule 2 defends against a malicious eavesdropper by failing any application that can read phone state, record audio, and access the Internet. However, none of the applications listed in Table 1 exhibit eavesdropper-like characteristics. Considering the purpose of each application, it is clear why they require the ability to record audio and access the Internet. We initially speculated that the applications stop recording upon an incoming call. However, this was not the case. We disproved our speculation for Shazam and Inauguration Report and were unable to determine a solid reason for the permission label's existence, as no source code was available.
After realizing that simultaneous access to phone state and audio recording is in fact beneficial (i.e., to stop recording on an incoming call), we decided to refine Rule 2. Our goal is to protect against an eavesdropper that automatically records a voice call on either an incoming or outgoing call. Recall that there are two ways to obtain the phone state: 1) register a Broadcast Receiver for the PHONE_STATE action string, and 2) register a PhoneStateListener with the system. If a static Broadcast Receiver is used for the former case, the application is automatically started on incoming and outgoing calls. The latter case requires the application to be already started, e.g., by the user, or on boot. We need only consider cases where it is started automatically. Using this information, we split Rule 2 into two new security rules. Each appends an additional condition. The first appends a restriction on receiving the PHONE_STATE action string. Note that since Kirin only uses Broadcast Receivers defined in the package manifest, we will not detect dynamic Broadcast Receivers that cannot be used to automatically start the application. The second rule appends the boot complete permission label used for Rule 4. Rerunning the applications against our new set of security rules, we found that only the Walkie Talkie application failed our rules, thus reducing the number of failed applications to 10.
Table 2 lists the applications that fail Rules 4 and 5. Recall that these security rules detect applications that start on boot and access location information and the Internet. The goal of these rules is to prevent location-tracking software. Of the nine applications listed in Table 2, the first five provide functionality that directly conflicts with the rules' goal. In fact, Kirin correctly identified both AccuTracking and GPS Tracker as dangerous. Both Loopt and Twidroid are popular social networking applications; however, they do in fact provide potentially dangerous functionality, as they can be configured to automatically start on boot without the user's knowledge. Finally, Pintail is designed to report the phone's location in response to an SMS message with the correct password. While this may be initiated by the user, it may also be used by an adversary to track the user. Again, Kirin correctly identified potentially dangerous functionality.
The remaining four applications in Table 2 result from the limitations in Kirin's input. That is, Kirin cannot inspect how an ap-
restrict permission [ACCESS_FINE_LOCATION, INTERNET] and receive [BOOT_COMPLETE]
Studying the (early) Market
• Evaluate 300+ popular Market apps (Jan 2009)
‣ 5 had both dangerous configuration and functionality (1.6%)
‣ 5 had dangerous configuration but not functionality (1.6%)
(1) An application must not have the SET_DEBUG_APP permission.
(2) An application must not have the READ_PHONE_STATE, RECORD_AUDIO, and INTERNET permissions.
(3) An application must not have the PROCESS_OUTGOING_CALL, RECORD_AUDIO, and INTERNET permissions.
(4) An application must not have the ACCESS_FINE_LOCATION, INTERNET, and RECEIVE_BOOT_COMPLETE permissions.
(5) An application must not have the ACCESS_COARSE_LOCATION, INTERNET, and RECEIVE_BOOT_COMPLETE permissions.
(6) An application must not have the RECEIVE_SMS and WRITE_SMS permissions.
(7) An application must not have the SEND_SMS and WRITE_SMS permissions.
(8) An application must not have the INSTALL_SHORTCUT and UNINSTALL_SHORTCUT permissions.
(9) An application must not have the SET_PREFERRED_APPLICATION permission and receive Intents for the CALL action string.
Q2: What do the applications do?
• TaintDroid is a system-wide integration of taint tracking into the Android platform
‣ VM Layer: variable tracking throughout Dalvik VM
‣ Native Layer: patches state after native method invocation
‣ Binder IPC Layer: extends tracking between applications
‣ Storage Layer: persistent tracking on files
• TaintDroid is a firmware modification, not an app
[Architecture figure: two applications run in separate virtual machines; variable-level tracking inside each VM, message-level tracking across Binder IPC between VMs, method-level tracking in native system libraries, and file-level tracking on secondary storage, with tainted data ultimately reaching the network interface]
Dynamic Taint Analysis
• Dynamic taint analysis is a technique that tracks information dependencies from an origin
• Conceptual idea:
‣ Taint source
‣ Taint propagation
‣ Taint sink
• Limitations: performance and granularity are a trade-off
c = taint_source()
...
a = b + c
...
network_send(a)
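The source/propagation/sink flow above can be sketched with a toy wrapper type; this is an illustrative miniature of variable-level tracking (TaintDroid itself tags variables inside the Dalvik interpreter rather than wrapping values, and these class and tag names are invented for the sketch):

```java
// Toy sketch of dynamic taint tracking: values carry a taint bitvector,
// binary operations union the operand tags, and the sink inspects the tag.
public class TaintSketch {
    static final int TAG_IMEI = 1, TAG_LOCATION = 2;

    record Tainted(int value, int tag) {
        // taint propagation: the result inherits the union of operand tags
        Tainted plus(Tainted other) {
            return new Tainted(value + other.value, tag | other.tag);
        }
    }

    // taint sink: report whether tainted data reaches the network
    static boolean networkSend(Tainted t) {
        return t.tag() != 0;
    }

    public static void main(String[] args) {
        Tainted c = new Tainted(42, TAG_IMEI); // taint source
        Tainted b = new Tainted(7, 0);         // untainted value
        Tainted a = b.plus(c);                 // a = b + c propagates c's tag
        System.out.println(networkSend(a));    // true: leak detected
    }
}
```

The granularity trade-off mentioned above shows up here directly: tagging whole values (or whole variables) is cheap but coarser than byte-level tracking.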
Performance
• Memory overhead: 4.4%
• IPC overhead: 27%
• Macro-benchmark:
‣ App load: 3% (2ms)
‣ Address book: 5.5% create, 18% read (< 20 ms)
‣ Phone call: 10% (10ms)
‣ Take picture: 29% (0.5s)
[Chart: CaffeineMark 3.0 benchmark scores (sieve, loop, logic, string, float, method, total) for unmodified Android vs. TaintDroid; higher is better. TaintDroid shows 14% overhead. The CaffeineMark score roughly corresponds to the number of Java instructions per second.]
Application Study
• Selected 30 applications with bias on popularity and access to Internet, location, microphone, and camera
• Of 105 flagged connections, only 37 clearly legitimate

Applications | # apps
The Weather Channel, Cestos, Solitaire, Movies, Babble, Manga Browser | 6
Bump, Wertago, Antivirus, ABC --- Animals, Traffic Jam, Hearts, Blackjack, Horoscope, 3001 Wisdom Quotes Lite, Yellow Pages, Datelefonbuch, Astrid, BBC News Live Stream, Ringtones | 14
Layar, Knocking, Coupons, Trapster, Spongebob Slide, ProBasketBall | 6
MySpace, Barcode Scanner, ixMAT | 3
Evernote | 1
Findings
• 15 of the 30 applications shared physical location with an ad server (admob.com, ad.qwapi.com, ads.mobclix.com, data.flurry.com)
‣ Most traffic was plaintext (e.g., AdMob HTTP GET):
...&s=a14a4a93f1e4c68&..&t=062A1CB1D476DE85B717D9195A6722A9&d%5Bcoord%5D=47.661227890000006%2C-122.31589477&...
• 7 applications sent the device ID (IMEI) and 2 apps sent phone information (Ph. #, IMSI*, ICC-ID) to a remote server without informing the user.
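Because the ad traffic is plaintext, anyone on the network path can recover the coordinates from the captured query string above with a standard URL decode, as this small sketch shows:

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;

// Decoding the d%5Bcoord%5D parameter from the captured AdMob query
// string recovers the phone's GPS coordinates in the clear.
public class CoordDecode {
    public static void main(String[] args) {
        String param = "d%5Bcoord%5D=47.661227890000006%2C-122.31589477";
        String decoded = URLDecoder.decode(param, StandardCharsets.UTF_8);
        System.out.println(decoded); // d[coord]=47.661227890000006,-122.31589477
    }
}
```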
Q3: What can the applications do?
• Static analysis: look at the possible paths and interaction of data
‣ Very, very hard (often undecidable), but the community has learned that we can do a lot with small analyses.
• Step 1: ded decompiler for Android applications
• Step 2: static source code analysis for both dangerous functionality and vulnerabilities
‣ What data could be exfiltrated from the application?
‣ Are developers safely using interfaces?
[Figure: the ded retargeting process. (1) DEX parsing; (2) Java .class conversion, comprising missing type inference (CFG construction, type inference processing, constant identification), constant pool conversion (constant pool translation), and method code retargeting (bytecode reorganization, instruction set translation); (3) Java .class optimization]
ded Decompiler
• Android applications are written in Java, but compiled for the optimized Dalvik VM language
‣ Non-trivial to retarget back to Java: register vs. stack architecture, constant pools, ambiguous scalar types, null references, etc.
• ded recovers source code from the application package
‣ Retargeting: type inference, instruction translation, etc.
‣ Optimization: use Soot to re-optimize for Java bytecode
‣ Decompilation: standard Java decompilation (Soot)
• Decompiled top 1,100 free apps from the Android Market: over 21 million lines of source code
Retargeting Process
Studying Application Security
• Queried for security properties using program analysis, followed by manual inspection to understand purpose
• Used several types of analysis to design security properties specific to Android using the Fortify SCA framework
Analysis for Dangerous Behavior:
‣ Misuse of Phone Identifiers: data flow analysis
‣ Exposure of Physical Location: data flow analysis
‣ Abuse of Telephony Services: semantic analysis
‣ Eavesdropping on Video: control flow analysis
‣ Eavesdropping on Audio: structural analysis (+CG)
‣ Botnet Characteristics (Sockets): structural analysis
‣ Harvesting Installed Applications: structural analysis

Analysis for Vulnerabilities:
‣ Leaking Information to Logs: data flow analysis
‣ Leaking Information to IPC: control flow analysis
‣ Unprotected Broadcast Receivers: control flow analysis
‣ Intent Injection Vulnerabilities: control flow analysis
‣ Delegation Vulnerabilities: control flow analysis
‣ Null Checks on IPC Input: control flow analysis
‣ Password Management*: data flow analysis
‣ Cryptography Misuse*: structural analysis
‣ Injection Vulnerabilities*: data flow analysis

* included with analysis framework
Also studied inclusion of advertisement and analytics libraries and associated properties
Phone Identifiers
• We've seen phone identifiers (Ph.#, IMEI, IMSI, etc.) sent to network servers, but how are they used?
‣ Program analysis pin-pointed 33 apps leaking Phone IDs
• Finding 2 - device fingerprints
• Finding 3 - tracking actions
• Finding 4 - along with registration and login
Device Fingerprints (1)

com.eoeandroid.eWallpapers.cartoon - SyncDeviceInfosService.getDevice_info()

r1.append((new StringBuilder("device_id=")).append(tm.getDeviceId()).toString())
  .append((new StringBuilder("&device_software_version=")).append(tm.getDeviceSoftwareVersion()).toString());
r1.append((new StringBuilder("&build_board=")).append(Build.BOARD).toString())
  .append((new StringBuilder("&build_brand=")).append(Build.BRAND).toString())
  .append((new StringBuilder("&build_device=")).append(Build.DEVICE).toString())
  .append((new StringBuilder("&build_display=")).append(Build.DISPLAY).toString())
  .append((new StringBuilder("&build_fingerprint=")).append(Build.FINGERPRINT).toString())
  .append((new StringBuilder("&build_model=")).append(Build.MODEL).toString())
  .append((new StringBuilder("&build_product=")).append(Build.PRODUCT).toString())
  .append((new StringBuilder("&build_tags=")).append(Build.TAGS).toString())
  .append((new StringBuilder("&build_time=")).append(Build.TIME).toString())
  .append((new StringBuilder("&build_user=")).append(Build.USER).toString())
  .append((new StringBuilder("&build_type=")).append(Build.TYPE).toString())
  .append((new StringBuilder("&build_id=")).append(Build.ID).toString())
  .append((new StringBuilder("&build_host=")).append(Build.HOST).toString())
  .append((new StringBuilder("&build_version_release=")).append(Build$VERSION.RELEASE).toString())
  .append((new StringBuilder("&build_version_sdk_int=")).append(Build$VERSION.SDK).toString())
  .append((new StringBuilder("&build_version_incremental=")).append(Build$VERSION.INCREMENTAL).toString());
r5 = mContext.getApplicationContext().getResources().getDisplayMetrics();
r1.append((new StringBuilder("&density=")).append(r5.density).toString())
  .append((new StringBuilder("&height_pixels=")).append(r5.heightPixels).toString())
  .append((new StringBuilder("&scaled_density=")).append(r5.scaledDensity).toString())
  .append((new StringBuilder("&width_pixels=")).append(r5.widthPixels).toString())
  .append((new StringBuilder("&xdpi=")).append(r5.xdpi).toString())
  .append((new StringBuilder("&ydpi=")).append(r5.ydpi).toString());
r1.append((new StringBuilder("&line1_number=")).append(tm.getLine1Number()).toString())
  .append((new StringBuilder("&network_country_iso=")).append(tm.getNetworkCountryIso()).toString())
  .append((new StringBuilder("&network_operator=")).append(tm.getNetworkOperator()).toString())
  .append((new StringBuilder("&network_operator_name=")).append(tm.getNetworkOperatorName()).toString())
  .append((new StringBuilder("&network_type=")).append(tm.getNetworkType()).toString())
  .append((new StringBuilder("&phone_type=")).append(tm.getPhoneType()).toString())
  .append((new StringBuilder("&sim_country_iso=")).append(tm.getSimCountryIso()).toString())
  .append((new StringBuilder("&sim_operator=")).append(tm.getSimOperator()).toString())
  .append((new StringBuilder("&sim_operator_name=")).append(tm.getSimOperatorName()).toString())
  .append((new StringBuilder("&sim_serial_number=")).append(tm.getSimSerialNumber()).toString())
  .append((new StringBuilder("&sim_state=")).append(tm.getSimState()).toString())
  .append((new StringBuilder("&subscriber_id=")).append(tm.getSubscriberId()).toString())
  .append((new StringBuilder("&voice_mail_number=")).append(tm.getVoiceMailNumber()).toString());
i0 = mContext.getResources().getConfiguration().mcc;
i1 = mContext.getResources().getConfiguration().mnc;
r1.append((new StringBuilder("&imsi_mcc=")).append(i0).toString())
  .append((new StringBuilder("&imsi_mnc=")).append(i1).toString());
r254 = (ActivityManager) mContext.getSystemService("activity");
$r255 = new ActivityManager$MemoryInfo();
r254.getMemoryInfo($r255);
r1.append((new StringBuilder("&total_mem=")).append($r255.availMem).toString());
Device Fingerprints (2)

com.avantar.wny - com/avantar/wny/PhoneStats.java

public String toUrlFormatedString(){
    StringBuilder $r4;
    if (mURLFormatedParameters == null) {
        $r4 = new StringBuilder();
        $r4.append((new StringBuilder("&uuid=")).append(URLEncoder.encode(mUuid)).toString()); // IMEI
        $r4.append((new StringBuilder("&device=")).append(URLEncoder.encode(mModel)).toString());
        $r4.append((new StringBuilder("&platform=")).append(URLEncoder.encode(mOSVersion)).toString());
        $r4.append((new StringBuilder("&ver=")).append(mAppVersion).toString());
        $r4.append((new StringBuilder("&app=")).append(this.getAppName()).toString());
        $r4.append("&returnfmt=json");
        mURLFormatedParameters = $r4.toString();
    }
    return mURLFormatedParameters;
}
Tracking

com.froogloid.kring.google.zxing.client.android - Activity_Router.java (Main Activity)

public void onCreate(Bundle r1){
    ...
    IMEI = ((TelephonyManager) this.getSystemService("phone")).getDeviceId();
    retailerLookupCmd = (new StringBuilder(String.valueOf(constants.server))).append("identifier=").append(EncodeURL.KREncodeURL(IMEI)).append("&command=retailerlookup&retailername=").toString();
    ...
}

http://kror.keyringapp.com/service.php

com.Qunar - net/NetworkTask.java

public void run(){
    ...
    r24 = (TelephonyManager) r21.getSystemService("phone");
    url = (new StringBuilder(String.valueOf(url))).append("&vid=60001001&pid=10010&cid=C1000&uid=").append(r24.getDeviceId()).append("&gid=").append(QConfiguration.mGid).append("&msg=").append(QConfiguration.getInstance().mPCStat.toMsgString()).toString();
    ...
}

http://client.qunar.com:80/QSearch
Registration and Login

com.statefarm.pocketagent - activity/LogInActivity$1.java (Button callback)

public void onClick(View r1){
    ...
    r7 = Host.getDeviceId(this$0.getApplicationContext()); // IMEI
    LogInActivity.access$1(this$0).setUniqueDeviceID(r7);
    this$0.loginTask = new LogInActivity$LoginTask(this$0, null);
    this$0.showProgressDialog(r2, 2131361798, this$0.loginTask);
    r57 = this$0.loginTask;
    r58 = new LoginTO[1];
    r58[0] = LogInActivity.access$1(this$0);
    r57.execute(r58);
    ...
}

Is this necessarily bad?
Location
• Found 13 apps with geographic location data flows to the network
‣ Many were legitimate: weather, classifieds, points of interest, and social networking services
• Several instances sent to advertisers (same as TaintDroid). More on this shortly.
• Code recovery error in AdMob library.
Phone Misuse
• No evidence of abuse in our sample set:
‣ Hard-coded numbers for SMS/voice (premium-rate)
‣ Background audio/video recording
‣ Socket API use (not HTTP wrappers)
‣ Harvesting list of installed applications
Ad/Analytics Libraries
• 51% of the apps included an ad or analytics library (many also included custom functionality)
• A few libraries were used most frequently
• Use of phone identifiers and location sometimes configurable by developer
[Chart: number of apps (log scale, 1 to 1000) vs. number of included ad/analytics libraries (1 to 8); 1 app has 8 libraries!]
Library Path # Apps Obtains
com/admob/android/ads 320 L
com/google/ads 206 -
com/flurry/android 98 -
com/qwapi/adclient/android 74 L, P, E
com/google/android/apps/analytics 67 -
com/adwhirl 60 L
com/mobclix/android/sdk 58 L, E
com/millennialmedia/android 52 -
com/zestadz/android 10 -
com/admarvel/android/ads 8 -
com/estsoft/adlocal 8 L
com/adfonic/android 5 -
com/vdroid/ads 5 L, E
com/greystripe/android/sdk 4 E
com/medialets 4 L
com/wooboo/adlib_android 4 L, P, I
com/adserver/adview 3 L
com/tapjoy 3 -
com/inmobi/androidsdk 2 E
com/apegroup/ad 1 -
com/casee/adsdk 1 S
com/webtrends/mobile 1 L, E, S, I
Total Unique Apps 561
L = Location; P = Ph#; E = IMEI; S = IMSI; I = ICC-ID
Probing for Permissions (1)

com/webtrends/mobile/analytics/android/WebtrendsAndroidValueFetcher.java

public static String getDeviceId(Object r0){
    Context r4;
    String r7;
    r4 = (Context) r0;
    try {
        r7 = ((TelephonyManager) r4.getSystemService("phone")).getDeviceId();
        if (r7 == null) { r7 = ""; }
    } catch (Exception $r8) {
        WebtrendsDataCollector.getInstance().getLog().d("Exception fetching TelephonyManager.getDeviceId value. ", $r8);
        r7 = null;
    }
    return r7;
}

Catches SecurityException
Probing for Permissions (2)

com/casee/adsdk/AdFetcher.java

public static String getDeviceId(Context r0){
    String r1;
    r1 = "";
    label_19: {
        if (deviceId != null) {
            if (r1.equals(deviceId) == false) { break label_19; }
        }
        if (r0.checkCallingOrSelfPermission("android.permission.READ_PHONE_STATE") == 0) {
            deviceId = ((TelephonyManager) r0.getSystemService("phone")).getSubscriberId();
        }
    } //end label_19:
    ...
}

Checks before accessing
Developer Toolkits
• We found identically implemented dangerous functionality in the form of developer toolkits.
‣ Probing for permissions (e.g., Android API, catch SecurityException)
‣ Well-known brands sometimes commission developers that include dangerous functionality.
• "USA Today" and "FOX News" were both developed by Mercury Intermedia (com/mercuryintermedia), which grabs the IMEI on startup
Custom Exceptions

v00032.com.wordplayer - CustomExceptionHandler.java

void init(){
    URLConnection r3;
    ...
    r3 = (new URL("http://www.word-player.com/HttpHandler/init.sample")).openConnection();
    ...
    try {
        $r27 = this.mkStr(((TelephonyManager) _context.getSystemService("phone")).getLine1Number());
    } catch (Exception $r81) { break label_5; }
    ...
}

Phone Number!?
Intent Vulnerabilities
• Similar analysis rules as independently identified by Chin et al. [Mobisys 2011]
• Leaking information to IPC - unprotected intent broadcasts are common, occasionally contain info
• Unprotected broadcast receivers - a few apps receive custom action strings w/out protection (lots of “protected bcasts”)
• Intent injection attacks - 16 apps had potential vulnerabilities
• Delegating control - pending intents are tricky to analyze (notification, alarm, and widget APIs) --- no vulns found
• Null checks on IPC input - 3925 potential null dereferences in 591 apps (53%) --- most were in activity components
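The null-dereference pattern behind that last bullet can be sketched in plain Java; here a Map stands in for an Intent's extras Bundle, and the method names are invented for illustration:

```java
import java.util.Map;

// Sketch of the "null checks on IPC input" finding: a component that
// dereferences an IPC-supplied extra without checking for null crashes
// when a sender omits that extra (a trivial denial of service).
public class IpcNullCheck {
    // unsafe: throws NullPointerException if the "query" extra is absent
    static int unsafeHandle(Map<String, String> extras) {
        return extras.get("query").length();
    }

    // defensive: validate IPC input before use
    static int safeHandle(Map<String, String> extras) {
        String q = extras.get("query");
        return (q == null) ? 0 : q.length();
    }

    public static void main(String[] args) {
        System.out.println(safeHandle(Map.of()));              // 0: handled gracefully
        System.out.println(safeHandle(Map.of("query", "ab"))); // 2
    }
}
```

Any application on the device can send an Intent to an exported component, which is why unchecked extras are an externally triggerable crash.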
Study Limitations
• The sample set
• Code recovery failures
• Android IPC data flows
• Fortify SCA language
• Obfuscation
Summary
• What permissions do applications ask for?
‣ Kirin demonstrated how permission combinations can be effectively used to certify applications at install-time.
• What do applications do with the permissions?
‣ TaintDroid "looks inside" of applications to understand how privacy-sensitive information is being used.
• What can applications do with the permissions?
‣ We used program analysis and manual inspection to characterize implemented application behavior.
Thank you!
William Enck
Assistant Professor
Department of Computer Science
NC State University
[email protected]
http://www.enck.org