
UBC Theses and Dissertations



Analysis of Data-At-Rest Security In Smartphones. Muslukhov, Ildar. 2018.


Full Text

Analysis of Data-At-Rest Security In Smartphones

by

Ildar Muslukhov

B. Information Technology, Ufa State Aviation and Technical University, 2003
M. Information Technology, Ufa State Aviation and Technical University, 2005
Ph.D., Ufa State Aviation and Technical University, 2008

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF

Doctor of Philosophy

in

THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES
(Electrical and Computer Engineering)

The University of British Columbia
(Vancouver)

August 2018

© Ildar Muslukhov, 2018

The following individuals certify that they have read, and recommend to the Faculty of Graduate and Postdoctoral Studies for acceptance, the dissertation entitled:

Analysis of Data-At-Rest Security In Smartphones

submitted by Ildar Muslukhov in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Electrical and Computer Engineering.

Examining Committee:

Prof. Konstantin Beznosov, Supervisor
Prof. Sidney Fels, Supervisory Committee Member
Prof. Julia Rubin, Supervisory Committee Member
Prof. Sathish Gopalakrishnan, University Examiner
Prof. Reid Holmes, University Examiner

Abstract

With almost two billion users worldwide, smartphones are used for almost everything – booking a hotel, ordering a cup of coffee, or paying in a shop. However, their small size and high mobility make these devices prone to theft and loss. In this work we aim to broaden our understanding of how smartphone users and application developers protect sensitive data on smartphones.

To understand how well users protect their data on smartphones, we conducted several studies. The results revealed that 50% of the subjects locked their smartphone with an unlocking secret, and 95% of them chose unlocking secrets that could be guessed within minutes.

To understand how well application developers protect sensitive data on smartphones, we analyzed 132K Android applications. We focused on identifying misuse of cryptography in applications and libraries.
The study results revealed that developers often misuse cryptographic APIs. In fact, 9 out of 10 Android applications contained code that used a symmetric cipher with a static encryption key. Further, source attribution revealed that libraries are the main consumers of cryptography and the major contributors of misuse cases. Finally, an in-depth analysis of the top libraries highlighted the need for improvement in the way we define and detect misuse of cryptography.

Based on these results, we designed and evaluated a system for encryption key management that uses wearable devices as an additional source of entropy. Evaluation results showed that the proposal introduces insignificant overhead in power consumption and latency.

Lay Summary

This thesis presents the results of research on how secure user data on smartphones is against data thieves. We studied this question from two perspectives, i.e., users and application developers. With end users, we focused on how they choose the passwords used to lock their smartphones. The results of the study revealed that more than 95% of users choose passwords that are easy to guess, i.e., a thief can guess them in under an hour. With application developers, we looked at how often developers put user data at risk by incorrectly using certain security functions. Our findings show that application developers do put user data at risk.
Overall, this research shows that when it comes to data security in smartphones, we are still far from having adequate protection.

Preface

This research was the product of a fruitful collaboration between the author of the dissertation and the following people: Yazan Boshmaf, San-Tsai Sun, Primal Wijesekera, Ivan Cherepau and Konstantin Beznosov (advisor) from the University of British Columbia, and Cynthia Kuo and Jonathan Lester from Nokia Research. I am deeply grateful to my mentors Michael Halcrow and Andrew Honig from Google for an opportunity to work on improving Linux kernel fuzzing and adding encryption to the EXT4 file system.

Work presented herein consists of research studies that have been published or are under review in peer-reviewed international conferences and workshops.

The user studies on characterization of smartphone end users presented in Chapter 2, and partly discussed in Chapter 5, led to the following publications:

• I. Muslukhov, Y. Boshmaf, C. Kuo, J. Lester, K. Beznosov. Understanding Users' Requirements for Data Protection in Smartphones. In Proceedings of Data Engineering Workshops of the 28th IEEE International Conference on Data Engineering (ICDEW '12), pp. 228–235, Arlington, VA, USA, 2012.

• I. Muslukhov, Y. Boshmaf, C. Kuo, J. Lester, K. Beznosov. Know Your Enemy: The Risk of Unauthorized Access in Smartphones by Insiders. In Proceedings of the 15th International Conference on Human-Computer Interaction with Mobile Devices and Services, Munich, Germany, 2013, pp. 271–280, 22% acceptance rate.

• I. Cherepau, I. Muslukhov, N. Asanka, and K. Beznosov. On The Impact of Touch ID on iPhone Passcodes. In Proceedings of the 11th Symposium On Usable Privacy and Security (SOUPS '15), Ottawa, ON, Canada, 2015, pp. 257–276, 24% acceptance rate.

• D. Marques, I. Muslukhov, T. Guerreiro, L. Carriço, K. Beznosov. Snooping on Mobile Phones: Prevalence and Trends.
In Proceedings of the 12th Symposium On Usable Privacy and Security (SOUPS '16), Denver, CO, USA, 2016, 28% acceptance rate. Distinguished Paper Award.

• A. Mahfouz, I. Muslukhov, and K. Beznosov. Android users in the wild: Their authentication and usage behavior. Pervasive and Mobile Computing. Special Issue on Mobile Security, Privacy and Forensics, Volume 32, pp. 50–61, Elsevier, October 2016.

I was responsible for designing and conducting the interview-based user study, where Yazan Boshmaf actively participated in the interviewing process. Yazan and I separately coded the interview data to reduce personal bias. Other project members actively participated in the discussion of the interview guide, the discussion of the data analysis, and the paper writing process (the first paper on the list above). For this study I obtained ethics approval from the Behavioural Research Ethics Board (BREB) at UBC: Approval H11-02230, titled "Mobile Data Protection."

After the completion of the interview-based study, I proceeded with the design of an online survey. Administration of the survey and data analysis were done by me. All co-authors actively participated in the discussion of the survey structure, questions, results and the paper writing process (the second paper on the list above). For this follow-up study I obtained ethics approval from the BREB: Approval H11-03512, titled "Mobile Data Protection - Follow up."

I significantly contributed to the study design, data analysis and paper writing process for papers three and five on the list above. For both of these publications we obtained approvals from the BREB: Approvals H12-02254, titled "Smartphone Unlock in a Wild," and H14-02759, titled "TouchID."

My contributions to the publication with Diogo Marques were limited to the discussion of research questions, study design and the paper writing process. I did not
I did notparticipate in data analysis and data collection processes.The measurement study on how smartphone applications (mis)use crypto-graphic API, presented in chapter 3, resulted in the following publication:• I. Muslukhov, Y. Boshmaf and K. Beznosov. Source Attribution of Crypto-graphic API Misuse in Android Applications. In Proceedings of the 2018 onAsia Conference on Computer and Communications Security (ASIACCS’18). Incheon, Republic of Korea, 2018, Pages 133-146, 20% acceptancerate.The cryptography (mis)use study was designed, implemented and executedby me. I was also responsible for all data analysis. Co-authors, Yazan Boshmafand Konstantin Beznosov, contributed to the preliminary research discussion andpaper writing process.The design of the Sidekick system – a user-space approach at decoupling data-at-rest encryption and smartphone unlocking, presented in Chapter 4, led to thefollowing publication in a peer-review journal:• I. Muslukhov, S-T. Sun, P. Wijesekera, Y. Boshmaf, and K. Beznosov. Us-ing Wearable Devices to Secure Data-At-Rest in Stolen Tablets and Smart-phones. Pervasive and Mobile Computing. Special Issue on Mobile Se-curity, Privacy and Forensics, Volume 32, Pages 26-34, Elsevier, October2016.Work on this project was mainly done during my collaboration with the Fu-sionpipe company through Engage and Engage+ grants. The idea of the projectwas conceived by me through discussion of the needs of Fusionpipe’s clients. Allco-authors contributed to the discussion of research questions, study design andpaper writing process.viiWhile working on the research presented in this thesis, I also participated inrelevant industry led projects, that resulted in the following patents:• H. Khosravi, I. Muslukhov, P. Luong. Method and System for DecouplingUser Authentication and Data Encryption on Mobile Devices. US PatentApplication. 13/943,070, Patent number US20140321641 A1. Publicationdate 16 July, 2013.• U. Savagaonkar, M. Halcrow, T. Y. Ts’o and I. 
Muslukhov. Method And System of Encrypting File System Directories. US Patent Application 14/829,095, Patent number US 9639708 B2. Publication date 5 Feb, 2017.

The discussion in Chapter 5 is partially influenced by ideas and findings that led to the following publication:

• Serge Egelman, Andreas Sotirakopoulos, Ildar Muslukhov, Konstantin Beznosov, and Cormac Herley. 2013. Does my password go up to eleven?: the impact of password meters on password selection. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, New York, NY, USA, pp. 2379–2388.

Table of Contents

Abstract
Lay Summary
Preface
Table of Contents
List of Tables
List of Figures
Acknowledgments
Dedication

1 Introduction
  1.1 Goals and Methodology
  1.2 Research Summary
    1.2.1 Smartphone Users' Experiences with Data Protection
    1.2.2 Analyzing Cryptographic API use in Android Applications
    1.2.3 Storing Encryption Keys on Wearable Devices
  1.3 Contributions Summary

2 Smartphone Users Experience with Data Protection
  2.1 Research Questions
  2.2 Approach
  2.3 Study 1 – Interviews
    2.3.1 Methodology
    2.3.2 Results
    2.3.3 Summary
  2.4 Study 2 – Online Survey
    2.4.1 Methodology
    2.4.2 Results
    2.4.3 Summary
  2.5 Limitations
  2.6 Related Work
  2.7 Discussion and Future Work
    2.7.1 All-or-Nothing Locking Approach
    2.7.2 Improving Security of Unlocking Methods
  2.8 Challenges
  2.9 Conclusion

3 Analyzing Cryptographic API use in Android Applications
  3.1 Motivation and related work
  3.2 Common rules in cryptography
    3.2.1 Symmetric key encryption
    3.2.2 Password-based encryption
    3.2.3 Random number generation
  3.3 Cryptography in Android
    3.3.1 Android applications ecosystem
    3.3.2 Java cryptography
  3.4 Datasets
  3.5 Crypto API linting with BinSight
    3.5.1 Preprocessing
    3.5.2 Linting
    3.5.3 Attribution
  3.6 Measuring Crypto API misuse
    3.6.1 Preprocessing
    3.6.2 Linting and attribution
    3.6.3 Crypto API misuse in Android Applications
    3.6.4 The impact of third-party libraries
    3.6.5 In-depth analysis of top libraries
    3.6.6 The impact of third-party libraries revisited
  3.7 Discussion and Future Work
    3.7.1 Extending the Crypto API analysis
    3.7.2 How Crypto API Misuse Rates Have Changed
  3.8 Conclusion

4 Storing Encryption Keys on Wearable Devices
  4.1 Introduction
  4.2 Threat Model
    4.2.1 Threats and Risks
    4.2.2 Attack
    4.2.3 General Assumptions
    4.2.4 Crypto-Attacker
    4.2.5 Network-Attacker
  4.3 Sidekick Design
    4.3.1 High Level Overview
    4.3.2 Securing Communications over BLE
  4.4 System Evaluation
    4.4.1 Experimental Setup
    4.4.2 Latency
    4.4.3 Power Consumption
    4.4.4 Session Key Renewal
    4.4.5 Summary
  4.5 Related Work
  4.6 Discussion and Future Work
  4.7 Conclusion

5 Discussion and Conclusion

Bibliography

A User Studies Questions
  A.1 Pre-screening Questions
  A.2 Interview Scenario and Coding Sheet
    A.2.1 Introduction
    A.2.2 Applications Types and Your Experience Today/Yesterday
    A.2.3 Application Specific Questions
    A.2.4 Data Types Specific Questions
    A.2.5 Current Practices
  A.3 Study 2 Questionnaire
    A.3.1 Part I: Consent Forms and Smartphone Task
    A.3.2 Part II: Demographic Questions
    A.3.3 Part III: Smartphone Experience
    A.3.4 Part IV: Smartphone Lock Use
    A.3.5 Part V: Applications and Data Being Used
    A.3.6 Part VI: Password Saving Habits
    A.3.7 Part VII: Data Types Sensitivity and Value

List of Tables

Table 2.1 Demographics of 22 Interview Participants in Study 1.
Table 2.2 Types of Data and their Sensitivity and Value from Users' Perspectives.
Table 2.3 Security Practices and Experience of Interviewed Participants.
Table 2.4 The 15 most used data types by the subjects.
All cases include only data types for personal use, since no work-related data types made it to the top 15.
Table 2.5 Distribution of reasons for using a locking system (N=379). Note that N ≠ ∑n, because the participants were able to provide multiple reasons. CI stands for confidence interval, given the number of subjects that were able to answer that question.
Table 2.6 Distribution of reasons for not using a locking system (N=345). Note that N ≠ ∑n, because the participants were able to provide multiple reasons. CI stands for confidence interval, given the number of subjects that were able to answer this question.
Table 2.7 The distribution of "negative" experience of the participants (N = 724).
Table 2.8 Parameters of logistic regression models, where a0 is the intercept, a1 is the coefficient in front of the Age variable, p is the biggest p-value for both a0 and a1, RD is the residual deviance, AIC is the Akaike Information Criterion, and R2 is the Nagelkerke R-squared.
Table 3.1 Summary of used datasets.
Table 3.2 Cryptographic API endpoints and related rules.
Table 3.3 Summary of duplicates and Crypto API use in all three datasets.
Table 3.4 Obfuscation analysis of class identifiers.
Table 3.5 Attribution of cryptographic API call sites.
Table 3.6 The top-6 ciphers used in Android applications. PDE was used with MD5 and 3DES.
Table 3.7 Summary of Top-2 libraries from each dataset that made use of Crypto API. Empty values imply that the library was not found in the dataset.
Table 4.1 Overall Latency for each of the four request/response message pairs for the default values for CInterval and CLatency.
Table 4.2 CR2032 battery life in days, depending on the acceptable overall latency for a request and on the number of requests per day.

List of Figures

Figure 2.1 CDF of new collected information across the interviewed participants.
Figure 2.2 The proportion of concerned users with sensitivity in the presence of a stranger (horizontal axis) and in the presence of an insider (vertical axis). Data labels across the vertical axis and circles in the plots represent data types for personal use; data labels across the horizontal axis and squares in the plots represent data types for work-related use. Filled shapes and red-colored data labels represent statistically significant differences between subjects' concerns with respect to a stranger and an insider (U-test for rates, WSRT for ranks, p < 0.05). The meanings for the abbreviated data type labels are in Table 2.4.
Figure 2.3 Distribution of the experiences E4-E7 (meanings for these labels are provided in Table 2.7) over participants' age groups. We removed all the subjects that were younger than 10 and those that were 50 or older for clarity purposes.
Figure 3.1 Cryptographic API linting for Android applications using BinSight. Gray components represent parts that were reimplemented from CryptoLint [57], and white components represent the extensions that we added.
Figure 3.2 Ratio of APK files that violated at least one Crypto API use rule per dataset. The "Any" category includes all call-sites for the analysis, without considering the source (i.e., library or an application). This approach was used in the CryptoLint study. The remaining categories (Libs, Libs?, Apps and ?) include call-sites that belong to the corresponding source only (i.e., a library, a possible library, an application or a fully obfuscated case).
The proportions are calculated as the ratio of APK files that contained at least one misuse from a specific category (or any category for "Any") against the total number of APK files that used Crypto API in the dataset. The total number of APK files that made at least one call to Crypto API for each dataset is provided in the legend.
Figure 3.3 Ratio of call-sites that violated one Crypto API use rule per dataset. The total number of call-sites to Crypto API for each dataset is provided in the legend.
Figure 3.4 Ratio of APK files that violated Rule 1 - "Do not use ECB mode for symmetric cipher." The total number of APK files that used symmetric cipher per dataset is provided in the legend.
Figure 3.5 Ratio of call-sites that used ECB mode for symmetric cipher. The total number of call-sites that created symmetric Cipher objects in Java per dataset is provided in the legend.
Figure 3.6 Ratio of APK files that violated Rule 2 - "Do not use static IV for CBC mode in symmetric cipher." The total number of APK files that used symmetric cipher in CBC mode per dataset is provided in the legend.
Figure 3.7 Ratio of call-sites that used static IV with CBC mode for symmetric cipher. The total number of call-sites that used Cipher objects in CBC mode per dataset is provided in the legend.
Figure 3.8 Ratio of APK files that violated Rule 3 - "Do not use static encryption key for a symmetric cipher." The total number of APK files that used symmetric cipher per dataset is provided in the legend.
Figure 3.9 Ratio of call-sites that used static encryption key for a symmetric cipher. The total number of call-sites that set an encryption key for a symmetric cipher per dataset is provided in the legend.
Figure 3.10 Ratio of APK files that violated Rule 4 - "Do not use static salt for PBKDF." The total number of APK files that used PBKDF per dataset is provided in the legend.
Figure 3.11 Ratio of call-sites that used a static salt for PBKDF. The total number of call-sites that provided a salt value for PBKDF per dataset is provided in the legend.
Figure 3.12 Ratio of APK files that violated Rule 5 - "Do not use less than 1,000 iterations for PBKDF." The total number of APK files that used PBKDF is provided in the legend.
Figure 3.13 Ratio of call-sites that used 1,000 or fewer iterations for PBKDF. The total number of call-sites that used PBKDF per dataset is provided in the legend.
Figure 3.14 Ratio of APK files that violated Rule 6 - "Do not use static seed for SecureRandom." The total number of APK files that used SecureRandom per dataset is provided in the legend.
Figure 3.15 Ratio of call-sites that used a static seed for SecureRandom. The total number of call-sites that seed SecureRandom per dataset is provided in the legend.
Figure 3.16 Proportion of APK files that would become Crypto API misuse-free depending on the number of fixed top-ranked libraries. The legend shows the total number of applications that had at least one misuse in the corresponding dataset. We identified 222, 507 and 198 libraries with misuse in the R12, R16 and T15 datasets, hence the end of the corresponding lines.
Figure 4.1 In currently deployed systems, a user needs to provide an unlocking secret to unlock his or her device. The unlocking secret, most probably, is an easy-to-guess one. That secret is then used to derive a Data Encryption Key (DEK), which is then used for data encryption/decryption.
When application developers need to encrypt data in smartphones, they usually use a static data encryption key, i.e., hard-code it into their application, and then also roll out their own implementation of the data encryption. Sidekick addresses both issues by randomly generating key encryption keys (KEKs) and then storing them on a wearable device. Sidekick makes data encryption independent from the unlocking secret, by mainly relying on KEKs while making the use of unlocking secrets optional (shown as a dashed line). It also provides a simpler API to application developers so that they do not need to roll out their own implementation of data encryption and an encryption key management system.
Figure 4.2 High-level design of the Sidekick System. A data-containing device (DCD) runs applications that link the Sidekick library. The library takes care of all communications with the KSD, e.g., storing or retrieving a KEK. Once a required KEK is retrieved, a corresponding DEK is decrypted and stored in the Decrypted DEKs Cache by the Sidekick Library. The DEK is then passed to the Data Encryption System in order to encrypt/decrypt data. Each application has a separate KEK List. The Reference Monitor on the KSD mitigates a misbonding attack by ensuring that each application has access only to its own KEK List.
Figure A.1 Different types of smartphone locks.

Acknowledgments

First and foremost, I would like to thank my advisor, Konstantin Beznosov, for giving me the opportunity to venture into different topics and disciplines, and for patiently guiding me through this journey.

Second, I would like to thank all of my collaborators, colleagues and supportive friends.
In no particular order, I give my special thanks to you all: Yazan Boshmaf, San-Tsai Sun, Primal Wijesekera, Ivan Cherapau, Cynthia Kuo, Jonathan Lester, Diogo Marques, Ahmed Mahfouz and Lina Qiu.

Third, I would like to thank all members of LERSSE for their feedback and constructive discussions. I would like to thank Ross Sheppard for his invaluable help with proofreading this dissertation.

Fourth, I would like to give special thanks to Dmitry Samosseiko from Sophos for providing vital support for the study of Android applications. I would also like to thank my internship hosts at Google, Andrew Honig and Michael Halcrow, for the opportunity to work on cool projects for Linux OS.

Last but not least, I would like to thank my wife Albina, who patiently supported me during this journey, and my kids David and Daniel, who surrounded me no matter what.

Dedication

To my wonderful family Albina, David and Daniel, who were always beside me and cheered me up no matter what. To my parents and Albina's parents, who were of significant help during Daniel's appearance into this world.

Chapter 1

Introduction

Smartphones have become ubiquitous, highly personal and versatile devices that are used by almost two billion users [15, 24]. Additionally, useful and diverse sets of applications and features, in combination with vast internal storage, have made these devices appealing for organizations [5]. The diverse sets of data that users can store on their smartphones may of course include sensitive or confidential data. For example, photos, videos, emails, or saved passwords are considered sensitive or confidential by various stakeholders. Unfortunately, due to their high mobility, smartphones are prone to theft and loss [2, 4], which implies that this sensitive data needs adequate protection.

Indeed, recent reports provide evidence that the threat of smartphone theft and the consequential risk of sensitive information disclosure have a significant impact on companies and users today.
In the US, every tenth smartphone owner has been a victim of smartphone theft at least once [4], more than 30% of all street robberies involve a smartphone [2], the number of stolen smartphones doubled in 2014, reaching 3.1 million devices [1], and in 96% of cases when a lost device is found, the person who finds it attempts to access sensitive information on the lost device [9].

One way to protect the confidentiality of sensitive data on smartphones is through file system level encryption. For example, the current implementation of full-disk encryption in Android provides such a service as part of the storage IO stack [91]. Another approach is to implement custom, application-specific solutions using a supported cryptographic application program interface (API), or Crypto API for short. For example, application developers can encrypt user data before storing it on a device or transmitting it over a network. Unfortunately, neither one of these approaches is problem-free. Users often choose unlocking secrets that are vulnerable to unsophisticated password guessing attacks [56, 85].

With significant improvements in general-purpose GPU technologies, such as CUDA by NVidia [26], and the availability of tools that can harness the GPU's hardware (e.g., HashCat [33]), password guessing attacks have become highly practical. For instance, a recent experiment demonstrated¹ that an NVidia GTX 1070 GPU can probe around four million secret candidates per second for Android OS. This allows attackers to try all possible combinations for a 6-digit PIN code in under a second. When it comes to iOS, Apple decided to slow down the password-based key derivation function by employing specialized hardware with an embedded key. Such a design allowed them to slow down a single encryption key derivation call to 80 milliseconds [37], which allows probing 1.08 million secrets in a day. Considering that an unlocking secret
Considering that unlocking secretguessing attack for iOS stack is significantly slower, I use iOS encryption keyderivation speed as a baseline to define easy-to-guess term, which is a secret thatcan be guessed by trying one million most probable secrets. Note, that if a userchose to use a 6-digit PIN as an unlocking secret, then this definition implies thatan attacker will be able to search the entire space.To mount a password guessing attack an attacker would first obtain a bit-by-bit image of the internal storage [104]. We can safely assume that attackers areaware of formats and data structures for the used file system, thus, knows howto extract the encrypted version of the master key. This assumption is sound,considering that both iOS and Android use well documented file systems. Oncethe attacker obtains encrypted master key, he launches password guessing attack1Experiment setup and all stats can be found at the following address external hardware, by trying most probable combinations of the secret anddecrypting the master key. The attacker verifies the correctness of the decryptedmaster key by decrypting content with known data structure. For iOS devices,unlocking secret guessing attack has to be partially executed on the stolen device,since key derivation process iOS relies on the embedded key, which is stored in thespecialized hardware. This requires that the attacker is able to hijack the bootingsequence, in order to gain control over specialized hardware.In addition to the aforementioned password guessing attack, easy-to-guess se-crets are also vulnerable to shoulder surfing attacks [93]. Recent improvementsin smartphone authentication technology have targeted certain usability problemswith the use of complex secrets. For instance, the Touch ID sensor in iPhonesreduces the frequency of secret-based authentications, thus, making it easier forusers to choose longer alphanumeric passwords for unlocking secrets. 
The results of a recent study showed, however, that even when users enable the Touch ID sensor, they still tend to choose easy-to-guess unlocking secrets [45].

As for application-specific solutions, the CryptoLint study from 2012 showed that 88% of Android applications misuse Crypto API [57]. In particular, application developers often use static encryption keys or initialization vectors (IVs), which violates cryptographic notions of security, such as indistinguishability under a chosen-plaintext attack (IND-CPA) [52]. The problem of Crypto API misuse, however, is still far from being fully understood. First, the CryptoLint study was limited in its analysis of libraries, which increases the risk of counting the same bug multiple times, especially in light of recent results from the LibScout study [39], which showed that third-party libraries also misuse Crypto API. Second, the results of the CryptoLint study are five years old, and it is unclear whether misuse rates have changed since then, and if they have, in which direction.

1.1 Goals and Methodology

The main objective of this research is threefold. First, this work aims to widen understanding of the issues and threats users face when it comes to protecting their smartphones with an unlocking secret. Second, we aim to provide a deeper analysis of the current state of the problem of Crypto API misuse in mobile applications, especially when it comes to the source of the code from which the misuse originates. Finally, we evaluate the technical feasibility of using wearable devices for improving the encryption key management system in mobile devices, such as tablets and smartphones.

To achieve these goals, I adopted the following research methodology. First, a set of user studies was conducted to gain a better understanding of the user experience with the existing data protection systems in smartphones.
Second, 135,590 Android applications were analyzed to (a) measure how the misuse rates of Crypto API have changed since the CryptoLint study [57], and (b) identify the responsible party for each misuse case, by attributing it to its source, i.e., a third-party library or the application itself. The results of both studies revealed that the key management subsystem of data encryption in smartphones is the weakest link, which renders data protection insecure. Third, a key management system based on wearable devices was designed, implemented, and evaluated in terms of its technical impact on data access latency and power consumption.

The results of the user studies showed that smartphone users tend to choose easy-to-guess unlocking secrets, which makes it trivial for an attacker to derive a proper encryption key. The analysis of Android applications revealed that 9 out of 10 applications use static encryption keys, i.e., an attacker can extract these encryption keys with any of the existing tools (such as ApkTool [17]) and, thus, decrypt data. The experimental evaluation of the proposed key management system for smartphones and tablets revealed that one can find a trade-off between increased data access latency and power consumption that allows the wearable device to run for more than a year on a single coin cell battery.

1.2 Research Summary

The research presented in this dissertation consists of three parts. The first part presents the results of the studies on user experience in regards to data protection in smartphones. The second part of this thesis presents the results of the analysis of 132K Android applications focusing on the misuse rates of Crypto API. This includes the source attribution analysis results and the results of the trend analysis based on the differences between the applications collected in 2012 and 2016.
The third part, contained in Chapter 4, explores a possible approach to addressing the identified limitations of data encryption in smartphones for specialized domains, such as health care. In what follows, I summarize the approach taken in each of the research projects and the key findings.

1.2.1 Smartphone Users' Experiences with Data Protection

With the user base of smartphones slowly approaching two billion, there are still open research questions on how to protect users' data in smartphones. In particular, it is not clear what kinds of experiences users have with the existing data protection systems today. To fill this knowledge gap we performed a set of user studies. First, we conducted a set of interviews. The study was designed to collect qualitative data on users' experiences with data protection systems, such as smartphone locking. In modern smartphones, the locking system is the cornerstone of data-at-rest protection, since the unlocking secret is used to protect the data encryption key. If the unlocking secret is easy to guess, then an attacker can mount an offline password-guessing attack, which eventually will allow the attacker to decrypt all encrypted data. In addition to user experience, we looked at how different types of attackers impacted user perception of the associated risks. These interviews allowed us to get a better understanding of the variety of issues that smartphone users face every day.
To corroborate the results of the interviews and to assess the prevalence of different experiences, we designed and administered a follow-up study in the form of an online survey.

The results of the studies revealed that users are divided into three categories: (a) those who store sensitive data on their devices and use unlocking secrets (around 50%), (b) those who store sensitive data but do not use unlocking secrets due to usability issues (20%), and, finally, (c) those who choose not to store sensitive data on smartphones (the remaining 30%).

Both of the above studies broadened our understanding of which data types are considered sensitive and why. In particular, the results showed that personal messages, account credentials, photos, and videos are among the most sensitive types of data. Further, the studies showed that data sensitivity depends on the type of adversary. For instance, SMS messages were considered to have a higher sensitivity when the intruder was someone from the victim's acquaintances, i.e., an insider. Contact details, on the other hand, were only considered sensitive with respect to strangers, i.e., someone who does not know the victim.

The results of the study revealed that while 50% of smartphone users employed authentication-based smartphone locking, 95% of them chose an easy-to-guess unlocking secret. Since unlocking secrets are used to protect data encryption keys, the use of easy-to-guess secrets makes it simple for an attacker to obtain the key by mounting an inexpensive password-guessing attack [56]. The results of a follow-up study, led by Ivan Cherapau [45], showed that even if a fingerprint sensor was available, smartphone users still preferred to use easy-to-guess unlocking secrets for various reasons, mainly rooted in usability issues.

Finally, the results of the studies showed that apart from the usual thieves, smartphone users are facing insider attackers.
In particular, the results obtained in the studies provide evidence that 1 in 10 smartphone users has accessed someone else's device without permission. The accuracy of these results was improved in a follow-up study, led by Diogo Marques [83]. In particular, the results of this study revealed a higher rate: 1 in 5 smartphone users had snooped into someone else's smartphone.

The main contributions of the conducted user studies are the following: (a) investigation of users' experiences with existing data protection systems in smartphones, (b) sensitivity assessment for various types of data in the presence of strangers and insiders, and (c) measured prevalence of snooping behavior. The results suggest that assumptions made by previous research studies about the safety of certain locations are, to say the least, questionable. For instance, while both Riva et al. [94] and Hayashi et al. [72] have suggested that work and home are safe locations and that smartphones should disable locking completely at these locations, the results from our studies suggest the opposite: these locations have plenty of insider attackers. Furthermore, because users tend to choose easy-to-guess unlocking secrets for their smartphones, existing data protection systems in smartphones are rendered insecure due to the central role of unlocking secrets in protecting encrypted data.

1.2.2 Analyzing Cryptographic API Use in Android Applications

While users can protect their data by enabling encryption (in iOS, disk encryption is always enabled, while in Android it is an option a user can enable or disable) and choosing sufficiently complex unlocking secrets, application developers can also play a role in data protection. In particular, application developers can choose to encrypt user data themselves, by calling Crypto API directly. With millions of applications available to smartphone users, it is important to understand how these applications (mis)use Crypto API. Unfortunately, a recent study showed that 88% of Android applications made at least one mistake while consuming Crypto API [57].
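A common class of such mistakes, static keys combined with static IVs, makes encryption deterministic, and the resulting IND-CPA failure is easy to demonstrate. The toy model below uses HMAC with a hard-coded key as a stand-in for a deterministic cipher; it is not real encryption, and all names are illustrative:

```python
import hashlib
import hmac
import os

KEY = b"hard-coded-key!!"  # the kind of static key such studies flag

def enc_static_iv(plaintext):
    # Fixed key + fixed IV: the ciphertext is a pure function of the plaintext.
    return hmac.new(KEY, b"\x00" * 16 + plaintext, hashlib.sha256).digest()

def enc_random_iv(plaintext):
    # Fresh random IV: each ciphertext is randomized; the IV travels in the clear.
    iv = os.urandom(16)
    return iv + hmac.new(KEY, iv + plaintext, hashlib.sha256).digest()

msg = b"attack at dawn"
print(enc_static_iv(msg) == enc_static_iv(msg))  # True: repeats are visible
print(enc_random_iv(msg) == enc_random_iv(msg))  # False: repeats are hidden
```

An eavesdropper who sees two equal ciphertexts under the static scheme learns that the same message was sent twice, which is precisely the kind of leakage IND-CPA rules out.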
Further, it is unclear whether these misuse rates have changed since the mentioned study took place. In addition, the report [57] had limited analysis of libraries and did not look into the security implications of the misuse cases.

To bridge this knowledge gap we designed and developed the BinSight system, a system that uses static analysis and program slicing in order to identify Crypto API misuse cases in Android applications. We analyzed 132K Android applications in total, which originated from three datasets collected in 2012, 2015, and 2016. The dataset from 2012 was given to us by the authors of the CryptoLint study [57], which allowed us to replicate the original study and assess the ratio of over-counted bugs in the CryptoLint report.

Our analysis results revealed that 9 out of 10 calls to Crypto API originated from third-party libraries and that the original CryptoLint study had missed 249 (or 96%) of the libraries in its dataset. Further, the results showed that 222 of the missed libraries were responsible for 70% of the flagged Android applications. This strongly suggests that source attribution is crucial for the accuracy of Crypto API misuse analysis. Comparison of the misuse rates between applications collected in 2012 and 2016 showed that while applications and libraries have improved in certain aspects of Crypto API use, they worsened in others. In particular, while libraries have significantly reduced the use of ECB mode for symmetric ciphers, they significantly increased their reliance on static IVs and static encryption keys. In addition, the RC4 cipher, a cipher with a known vulnerability, gained popularity in 2016 and became the third most commonly used cipher.

Analysis of the applications collected in 2016 revealed that 89.5% of the flagged applications had Crypto API misuse cases only in third-party libraries.
In other words, 507 libraries were responsible for introducing Crypto API misuses to 79,207 (out of 88,510) Android applications. Unfortunately, such dominance makes the current approach of measuring misuse rates of Crypto API highly biased towards libraries. The root cause of the bias is that libraries, especially the popular ones, inflate the ratio of APK files that misuse Crypto API. To address this limitation, we proposed to use the ratio of call-sites with and without mistakes for each source type. This allowed us to identify cases where the original metric, i.e., the ratio of Android applications with misuses, conveyed a misleading message. In particular, according to the ratio of Android applications, applications themselves have significantly improved in terms of not using static encryption keys. At the same time, the call-site ratio suggests the opposite, i.e., the code of applications worsened.

Finally, the results of a manual in-depth analysis of the top libraries revealed that a misuse of Crypto API does not necessarily imply a security vulnerability. In particular, one might use Crypto API for reasons other than confidentiality or integrity protection. As we show, the Google Play SDK library used a symmetric cipher to obfuscate code. In addition, we found an edge case in the use of ECB mode, where a single block of random data was encrypted.

The present study makes the following contributions: (1) replication of previously published research through obtaining the original data from the CryptoLint study, (2) comparative analysis of Crypto API misuse in Android applications between 2012 and 2016, (3) improvements to the analysis framework by introducing source attribution and de-duplication, and (4) analysis of the security implications of misuses in the top libraries.

In comparison with existing tools, e.g., Soot [32], neither CryptoLint [57] nor BinSight introduces anything novel to the field of static analysis itself.
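The bias of the application-level metric, and the effect of the proposed call-site ratio, can be illustrated with made-up numbers: a single bad call-site in one popular library flags every application that bundles it, while the per-call-site view counts it once.

```python
# Hypothetical data: one row per Crypto API call-site found in an APK.
# (app, source, call_site_id, has_misuse) -- all values are made up.
calls = [
    ("app1", "lib.popular", "cs1", True),   # the same library call-site,
    ("app2", "lib.popular", "cs1", True),   # bundled into many applications
    ("app3", "lib.popular", "cs1", True),
    ("app1", "app1", "cs2", False),         # the apps' own code is clean
    ("app2", "app2", "cs3", False),
    ("app3", "app3", "cs4", False),
]

# Metric 1: ratio of applications with at least one misuse.
apps = {app for app, _, _, _ in calls}
flagged = {app for app, _, _, bad in calls if bad}
print(len(flagged) / len(apps))  # 1.0 -- every application is flagged

# Metric 2: ratio of distinct call-sites with a misuse, de-duplicated by source.
distinct = {(src, cs): bad for _, src, cs, bad in calls}
print(sum(distinct.values()) / len(distinct))  # 0.25 -- one bad call-site of four
```

This is the same contrast as between the CryptoLint-style application ratio and the call-site ratio BinSight reports per source type.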
They are both highly specialized tools tailored towards the analysis of specific issues in Android applications. Nevertheless, while CryptoLint failed to analyze 23% of the APK files, BinSight was able to analyze all but six APK files out of a ten-times larger dataset. Furthermore, the results of the analysis demonstrated that the previously used method of measuring misuse rates is biased towards libraries. To address this issue, we proposed to use the ratio of call-sites with mistakes to all call-sites, which provides intuition into how probable it is that a call to Crypto API from an application itself or from a library would make a mistake.

Overall, when it comes to protecting user data in smartphones, both application and library developers are doing a poor job. In particular, 50% of all calls to the symmetric cipher API end up using a static, i.e., hard-coded, encryption key. This suggests that the encryption key management system is currently the weakest link, since (a) smartphone users tend to choose easy-to-guess unlocking secrets and (b) application and library developers rely on static encryption keys.

1.2.3 Storing Encryption Keys on Wearable Devices

The results from both the user studies and the analysis of Crypto API misuse rates in Android applications revealed that the encryption key is not effectively protected. On one hand, we see that users tend to choose easy-to-guess unlocking secrets, making password-guessing attacks trivial. On the other hand, application and library developers often use static encryption keys, which can be extracted from binaries with any of the existing reverse engineering tools.
To address this limitation we designed and evaluated a system we named Sidekick. The Sidekick system uses wearable devices to store encryption keys, eliminating the dependency of data encryption security in smartphones on the entropy of unlocking secrets, and providing a secure location for developers to store encryption keys.

The evaluation of the Sidekick system revealed that our proposal is both effective and efficient. In particular, the system allows fetching a 256-bit encryption key from the wearable device in under a second. This is a significant improvement over a commonly chosen 4-digit PIN code, which, in comparison, provides around 13 bits of entropy (assuming that a PIN is selected randomly). Sidekick's power consumption impact on smartphones was below 1% of battery capacity, and the wearable device could function for up to 400 days on a single coin-cell battery.

The contributions of this study include: (a) the design and evaluation of a system that decouples user authentication and data encryption in smartphones, (b) recommendations on the values of configuration parameters for the wireless communication stack, and (c) making the system available as open source and incorporating key parts of it in a real product. We envision the Sidekick system's inclusion in features provided by existing wearable devices, such as smart watches or fitness trackers, so that users would not need to use yet another device. Considering that the proposed system needs 20Kb of ROM and 4Kb of RAM on a wearable device and can run on an 8-bit CPU, such an integration should be simple.

1.3 Contributions Summary

To summarize, this thesis makes the following contributions to research:

First, we studied user experiences with data protection in smartphones. We show that half of smartphone users do not use locking systems due to various usability issues or security concerns.
We also show that the majority of the users who lock their smartphones choose easy-to-guess unlocking secrets, which makes data decryption a simple exercise for an attacker. Finally, our results suggest that the assumptions of recent research about certain environments are highly questionable. In particular, while several authors have suggested that work and home are safe environments, we showed that these locations commonly harbor insiders. The studies presented in this dissertation provide evidence that users experience attacks by insiders in real life. These results imply that the security of data-at-rest in smartphones needs to be re-evaluated without assuming that users will choose a hard-to-guess unlocking secret. Specifically, if developers of the data encryption layer use the unlocking secret to generate or protect their master key (the actual encryption key used for data protection), they should not use the unlocking secret as the single source of randomness and should use other sources as well.

Second, we replicated a previous study on Crypto API (mis)use rates in Android applications. By obtaining the set of applications used in the previously published research, we were able to replicate the original study and confirm its findings. While doing so, we identified certain limitations in the methodology. In particular, we showed that the authors of the CryptoLint study missed 96% of the libraries in their dataset, which resulted in a misleading message. That is, while they reported that 88% of the applications misused Crypto API, 70% of them were due to 222 libraries.

Third, we conducted an analysis of Crypto API (mis)use rates in Android applications. By collecting new applications in 2016 we were able to compare how the misuse rates have changed since 2012.
In particular, we showed that while applications and libraries have improved in certain areas, e.g., the use of ECB mode for symmetric ciphers, they have become worse in others, e.g., the use of static encryption keys.

Analysis of Crypto API misuse revealed that while application developers improved in certain aspects (e.g., random number generation), they made more mistakes when it comes to the use of symmetric ciphers. Specifically, the use of static encryption keys and initialization vectors increased between 2012 and 2016. Moreover, the popularity of ciphers long considered insecure (e.g., DES or RC4) has also increased. The combination of users' preferences in unlocking secret choices with the frequency of Crypto API misuse by application developers suggests that both parties fail to secure sensitive data in smartphones; thus, further research is needed into various aspects of the problem. First, the usable security research community should investigate whether it is possible to have a usable, yet secure, authentication method on smartphones that can be used as a source of entropy for an encryption key derivation function.

Finally, we evaluated the technical feasibility of an encryption key management system for smartphones based on wearable devices. We designed, implemented, and evaluated the technical aspects of using wearable devices for encryption key management in smartphones. The results of the lab experiment revealed how one can approach the trade-off between data access latency, the session key refreshing schedule, and power consumption on both wearable devices and smartphones. Such an approach can be used to address both users' preference for easy-to-guess unlocking secrets and the misuse of Crypto API. In particular, by acting as an additional source of entropy for the key derivation process, the proposed system can eliminate the sole dependency of the security of the master encryption key on the security of the unlocking secret.
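A minimal sketch of this "additional entropy source" idea, assuming an HKDF-style HMAC construction (the names and parameters are illustrative, not Sidekick's actual protocol): the master key is derived from both the unlocking secret and a random key held by the wearable, so guessing the PIN alone recovers nothing.

```python
import hashlib
import hmac
import os

def hkdf_extract(salt, ikm):
    """RFC 5869 extract step: condense input keying material into a PRK."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk, info, length=32):
    """RFC 5869 expand step: stretch the PRK into `length` output bytes."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

unlocking_secret = b"1234"     # low-entropy user input
wearable_key = os.urandom(32)  # high-entropy key stored off the phone
salt = os.urandom(16)

prk = hkdf_extract(salt, wearable_key + unlocking_secret)
master_key = hkdf_expand(prk, b"data-at-rest master key")

# Without the wearable's key, even the correct PIN yields an unrelated key:
wrong_key = hkdf_expand(hkdf_extract(salt, os.urandom(32) + unlocking_secret),
                        b"data-at-rest master key")
print(master_key != wrong_key)  # True
```

Under this sketch an attacker must obtain both the phone image and the wearable's key; an offline PIN search against the phone image alone no longer works.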
Eliminating this dependency can be achieved by encrypting the master encryption key with a randomly generated key encryption key (KEK), which is then stored on a wearable device. Second, by integrating such a system with an existing file system (e.g., EXT4), one can simplify data encryption for application developers. In such a design, developers would declare a file as encrypted and use the proposed encryption key management system to manage the encryption keys.

Chapter 2

Smartphone Users Experience with Data Protection

This chapter presents the results of two user studies conducted to gain a better understanding of how users perceive threats associated with the disclosure of data-at-rest stored on the smartphones they own. This chapter begins with a discussion of the research questions. It then proceeds to the design and results of the first user study, which was qualitative in nature and based on the results of semi-structured interviews. We then present the quantitative results of the second user study, which was based on the results of online surveys. The chapter concludes with an overview of related work, a summary of three follow-up studies, and a discussion of results and conclusions.

2.1 Research Questions

By studying smartphone users one can get a deeper understanding of how users perceive smartphone threats, such as loss and theft, and the risks associated with theft and loss, e.g., confidential and sensitive information disclosure or reputation damages.
The following research questions were defined in order to fill this knowledge gap:

• RQ1 - What types of sensitive data do users store on their smartphones?
• RQ2 - What practices do users employ, or not employ, for data confidentiality protection?
• RQ3 - Why do users choose to use (or choose not to use) certain security practices?
• RQ4 - How concerned are users with unauthorized access to their data or smartphone functionality by an attacker?
• RQ5 - How many users have experienced unauthorized access to sensitive data?

Throughout this study we define data as being sensitive if a user wants it to be available to a limited number of persons, including just to him/herself. For instance, while users might want to keep photos of their children accessible to family members, they might want to limit access to personal messages or browsing history to themselves.

Answering RQ1 provides a better understanding of the variety of sensitive data that users store on their smartphones. Knowing the types of data we need to protect makes it easier for the research community to design an effective and efficient data protection system. Answering RQ2 and RQ3 would improve our understanding of which security practices users employ or do not employ on smartphones and, most importantly, why they choose to do so. Having a better understanding of these two important research questions would give us and the wider research community a clearer picture of the everyday issues that users face when they try to protect their smartphone data.

Finally, answering RQ4 and RQ5 provides a better understanding of users' perceptions and previous experiences with theft and loss threats. In particular, RQ4 provides a deeper insight into whether or not users consider the risk of confidential data disclosure important.

Answering RQ5 provides empirical evidence for the question of whether or not users have experienced theft and loss of their smartphones resulting in confidential data disclosure.
Moreover, RQ5 is examined from two perspectives. First, subjects were treated as victims, i.e., asked if someone else had unauthorized access to their smartphones. Second, subjects were treated as attackers and were asked if they had accessed someone else's smartphone without permission.

2.2 Approach

To answer the research questions defined in the previous section, two user studies were conducted. First, a qualitative study based on a set of semi-structured interviews was conducted ("Study 1" throughout the rest of this chapter). Second, in order to corroborate the results of Study 1 and to gain statistical power, a quantitative study was administered in the form of an online survey ("Study 2" throughout the rest of this chapter).

2.3 Study 1 – Interviews

This section presents the methodology and the results of the qualitative study based on interviews with 22 subjects.

2.3.1 Methodology

In the exploratory study we used semi-structured interviews for data collection. The main objective of this study was to gather qualitative answers, rather than quantitative ones. The decision to begin with a qualitative study is based on the fact that qualitative studies give a better opportunity to explore the problem domain without restricting ourselves. That is, by starting with a qualitative study, we can then conduct a well-informed quantitative study. One crucial advantage of semi-structured interviews is that an interviewer can easily deviate from the initial interview structure, which allows researchers to dig deeper into unanticipated topics that emerge during the interviews.

We used theoretical sampling [67] rather than random sampling during the selection of participants. We made this choice because it was more important for us to recruit a diverse pool of subjects than to have a representative sample of the general population. Diversity is often more important in qualitative studies, especially when questions on the variability of some parameters need to be answered.
Accordingly, before scheduling an interview with a candidate, we asked each of them to fill out a pre-interview questionnaire. Once we obtained their answers, we checked whether their demographic parameters added to the diversity of our subject pool. In the questionnaire, we asked seven questions about age, gender, completed education, job position(s) and area of work, hobbies, annual household income, and native language. The list of questions is provided in Section A.1. Each interviewed participant was paid $25 CAD for a one-hour interview. We applied for and received approval from the UBC Behavioural Research Ethics Board to conduct this study (application H11-02230).

All interviews began with a set of simple questions, such as "what applications did you use during the last few days?" or "what was the first thing you did with your smartphone today?" Subjects were then asked about the applications they were using on their smartphones, while interviewers recorded the names of all mentioned applications. For applications that could have been used for business or work, such as calendars, email clients, and messengers, subjects were also asked to clarify whether they used these applications for work or business related activities. Afterwards, each subject was asked about how he or she used these applications and what kinds of data each application stored.

Next, subjects were asked if there was a user account associated with each application and, if so, whether they needed to authenticate every time they launched the application. This was asked in order to uncover applications where users chose to save their credentials, allowing the application to launch without re-entering login information. For example, if a participant used an email client, we asked her about the kinds of emails she would usually receive to the accounts she registered in that application.
We also asked whether she saved passwords from any of the used email accounts or typed them in each time she needed to access her emails. In most cases, to validate responses from subjects, we asked them to launch the application in front of us.

During a pilot study, we found that it is difficult for participants to provide a clear answer about the sensitivity and the value of their data without a scenario. To address this issue, we gave several scenarios to participants, aimed at communicating probable risks more clearly. For the sensitivity of each data type we asked participants about the consequences of disclosing the information to a stranger or an insider. We explained these two terms to subjects as follows: a stranger is someone you do not know and who does not know you (e.g., a thief on a bus), while an insider is someone from your social circle or someone who knows you (e.g., a coworker).

Finally, by the end of the interview, participants were asked about the practices they used to ensure that their sensitive data were kept confidential. Subjects were also asked why they did not use certain tools and features, e.g., unlocking secrets or regular data backup.

All interviews were conducted by two interviewers in order to ensure that all important questions were asked. Interviews were audio-recorded and transcribed verbatim. Later, the transcriptions of the interviews were coded, analyzed, and checked by both interviewers. To ensure a sufficient number of participants in our study we used information saturation analysis. That is, after each interview we categorized all additional unique pieces of information that the interview revealed. Once information saturation was observed, the recruitment of new subjects was stopped.

2.3.2 Results

Demographics of Recruited Subjects

In total, 22 interviews were conducted during October 2011.
Half of them were conducted at the University of British Columbia (UBC) Point Grey campus, and the rest at UBC's Robson Square campus in Vancouver. As shown in Table 2.1, the demographics of the recruited participants were diverse and included subjects from various occupations and age groups. Note that some of the participants had more than one job, resulting in the total number of subjects not equaling the number of participants per occupation.

After the 18th participant we observed that adding additional participants did not reveal new information. In accordance with the theoretical sampling approach, the recruitment of new subjects was stopped. The graph, shown in Figure 2.1, supports this decision and shows that saturation in data collection was reached.

Figure 2.1: CDF of new collected information across the interviewed participants.

RQ1 - Types of Data Stored on Smartphones

While analyzing the types of data users mentioned during the interviews, we observed that users refer to data from two different angles. First, they explicitly defined certain types of data as being sensitive for them. Second, they defined some types of data as being valuable.
The sensitive data included records that users wanted to keep to themselves, e.g., personal messages, while valuable data included data that users were worried about losing, e.g., memorable photos or videos. Considering users' needs, sensitive data require confidentiality protection, i.e., they should be accessible only by their owners, while valuable data require availability protection, i.e., being available to the user even if the device is lost or stolen. If, however, a data record is both sensitive and valuable, one should carefully design the availability of the system in order to avoid compromising confidentiality.

Table 2.1: Demographics of 22 Interview Participants in Study 1.

Parameter         Property                                Participants
Gender            Males                                   10
                  Females                                 12
Age               under 18                                1
                  19-24                                   7
                  25-30                                   5
                  31-35                                   2
                  36-40                                   3
                  41-45                                   3
                  46-50                                   1
Education         Still in High School                    1
                  High School                             6
                  Professional School or College Degree   4
                  University (Bachelor's)                 6
                  Graduate School (Master's or PhD)       5
Household income  under 15K                               6
                  15K-30K                                 3
                  30K-50K                                 3
                  50K-80K                                 7
                  more than 80K                           3
Smartphone OS     iOS                                     11
                  Android                                 4
                  Symbian                                 2
                  BlackBerry OS                           4
                  WebOS                                   1
Occupation        1 Caregiver, 1 Curator Assistant, 1 Entrepreneur, 2 Graduate Students, 1 High-school Student, 1 Language Teacher, 2 Marketing Specialists, 2 Municipal Workers, 1 Network Administrator, 1 Nurse, 1 Librarian, 1 Pilot Instructor, 1 Proof-reader, 4 Sales Workers, 1 Security Guard, 1 Software Engineer, 1 Tailor, 1 Undergraduate Student, and 1 Unemployed
Data Stored       Work Related                            9
                  Personal                                22
Phone Ownership   Personal                                19
                  Company                                 3

Table 2.2: Types of Data and their Sensitivity and Value from Users' Perspectives. Data types listed: SMS Messages, Photos/Videos, Voice Recordings, Notes, Contacts, Music, Passwords, Emails, Documents, Events in Calendar, and Recorded GPS Tracks, each marked for sensitivity and value.
Considering that most systems for availability protection rely on some sort of cloud storage, this creates another attack vector on users' sensitive data, and could have a devastating impact on the users themselves, e.g., the recent iCloud hacks that exposed personal images of celebrities.¹

A summary of data types and their sensitivity and value is provided in Table 2.2. Based on subjects' feedback, we mark sensitivity and value as none, partial or full. In Table 2.2 a fully filled circle in the sensitive and valuable columns means that most of the subjects agreed that the data type is sensitive or valuable. On the other hand, an empty circle means that a data type was not considered sensitive or valuable by any participant. Data types that were considered sensitive only by a minority, i.e., at least one participant, are shown as half-filled circles.

Again, considering that these results were obtained in a qualitative study, one should treat them with caution. This is why I refrain from reporting any statistics from our observations, aside from the diversity of opinions. In fact, I do not provide any descriptive statistics about the data types or their classes, as this was not the goal of Study 1. To provide such descriptive statistics one should use a different, quantitative approach. The following discusses the reasons subjects used to justify the sensitivity or value of data types.

Passwords: Some of our participants stored passwords on their smartphones using different means. One participant stored passwords for online banking as contact records in their address book. Another created notes for door PIN-codes for her workplace. A sub-group of the participants used special applications, such as password managers.

¹ Due to the vulnerability of the iCloud's web interface, attackers were able to mount a password guessing attack on celebrity accounts and eventually gained access to the backups of their photos from their smartphones.
Most participants opted to allow applications to save the associated credentials, so that they did not have to enter a password every time they opened the application (e.g., email clients, the Facebook application, etc.). All participants considered these passwords to be sensitive. It should be noted that the participants who used password managers were less worried, since such applications usually required an additional password. However, they did admit that the password they used was the name of a person or a simple word. Interestingly, some of the participants considered password lists highly valuable; these lists were stored only on their smartphones, where loss of the list would incur a significant amount of work to restore access to the corresponding accounts.

Music and Events in the Calendar: Music and events, on the other hand, were never mentioned as being sensitive or valuable. Most of the participants justified such judgment by the fact that they had a copy of such data on their computers or online, or that they could remember the information. In the case of losing appointment information or events, subjects reported that they also had a physical agenda book to keep the information.

Voice Recordings: Several subjects used their smartphones to record audio of conversations, which had the potential for being confidential and sensitive. For example, one of the subjects was a quality assurance specialist, and had recorded multiple conversations with employees that provided anonymous feedback on the internal affairs of the company. On the other end of the spectrum, subjects also used voice recordings for taking notes and memos. These types of recordings were not considered sensitive.
Most of the subjects that used voice recorders for note taking did consider these recordings valuable, mainly because they were not sure if they would be able to recover them if lost.

Photos and Videos: Some of the participants defined the photos and videos on their smartphone as both sensitive and valuable. Others considered their photos and videos sensitive for cultural reasons. For example, one of the participants stated that photos of his family were sensitive, as women in his culture wear a Hijab in public. Interestingly, most of the participants who took pictures and videos on their smartphones kept them there for some period of time in order to accumulate a considerable amount before transferring them to a PC. Several participants who recently lost or damaged their phones admitted that they had lost valuable pictures as well. Moreover, it was hard for participants to recall on the spot whether they had valuable or sensitive pictures, without first looking through their images and videos.

SMS Messages: The analysis of the interviews revealed that SMS messages have a short temporal value, which is lost once they have been read. Most of the participants stated that they do not use SMS for highly meaningful conversations, and rather use SMS messages for friendly chats and as a way to keep in touch with their friends. Certain subjects did, however, consider specific types of messages as being sensitive, but only if they were read by certain people, e.g., their parents: "I do not like the idea of someone, especially my parents, going through my messages...".
At the same time, these subjects were comfortable sharing these SMS messages with their friends.

Contacts: The participants sometimes considered contact details as being sensitive, mostly because they were not comfortable sharing such data with others. Reasons varied from reputation consequences, "If someone got hold of my contacts, I would feel uncomfortable, because I feel like those people trusted me to keep their phone numbers private," to expected threats to people, "I am not sure what those who got my contacts numbers will do with them, they could call them or send them spam." The value of contact details was justified mainly by the lack of synchronization with a PC or an online account. Interestingly, some of the participants stated that they had a copy of their contact details in paper form, which they carried around with them, as they had lost their smartphone previously or experienced other technical problems, such as their batteries dying.

Email Messages: Participants classified their emails as not valuable, because all of them were able to access emails either online or on their personal computers. A majority of the participants had multiple email accounts configured on their smartphones. They classified their emails as "junk-collecting" or "sign-up" email, personal, and work related. Nine of the participants used work email accounts on their phones and received confidential business emails which contained unreleased product details, marketing campaign budgets, business proposals, etc. The mix of unimportant and important emails defined email sensitivity as "could be sensitive".

Documents: Some participants uploaded work-related documents to their smartphones.
These documents often contained confidential information, such as descriptions of new products or sales figures. The necessity of having such data on their smartphones was justified by the need to access vital numbers during travels. These documents were not considered valuable, since they were also stored on the company's servers or work-related computers.

GPS Tracks: GPS tracks are usually recorded by training assistant applications, such as miCoach for iOS. Subjects used such applications to track their outdoor exercises and monitor performance. The tracking information that these applications collected was considered to be highly sensitive, mainly since most of these tracks lead to the subjects' homes.

RQ2 – RQ4 - Security Practices for Data Protection and Concerns

Subjects were asked whether or not they used any tools for data protection. In addition, the interview structure also included questions that focused on the reasons for and against using such tools. The results of the interviews on such practices are presented as follows. A short summary of the results is provided in Table 2.3.

Table 2.3: Security Practices and Experience of Interviewed Participants

Use pin-lock: Yes 7; No 11; No, but used to 4.
Had experience of: Losing phone 5; Breaking phone 4; Losing and Breaking phone 1.

Most of the participants, but not all of them, backed up valuable data whenever they "felt" that they needed to, which varied from once a week to once in six months. Those who lost their devices and valuable data, however, admitted that they started paying greater attention to this practice after the loss. The subjects
The subjects24often cited the following reasons for such infrequent and irregular backups (1)inconvenience of current systems, (2) lack of time, and (3) lack of information onwhat data needs to be backed up.Several participants stated that they do not trust the security of their smart-phones at all, and, thus, decided not to store any sensitive or valuable data on suchdevices. Their decisions were based on concerns they had with the security ofsmartphones, especially if they were lost, stolen or infected by malware. Inter-estingly, 20 participants stated that they considered smartphones to be less securethan PCs. The high mobility of smartphones was cited as the main justification,since this meant an increase in chance that the device could be lost or stolen.The participants were then asked about what they would do if they had just losttheir smartphone. A majority of the subjects told us that their first action wouldbe to try to recover their device, by going through places they visited in the lastfew hours. Four participants said that they use special applications to track theirdevices, such as “Find my iPhone” [3] and would try to locate their smartphonesthrough this approach first. The answers of those who had lost their mobile phonesbefore did not show any differences.In the case that subjects could not recover their smartphone within a coupleof hours, all participants told us that they would call their service providers andblock their line to avoid paying for someone else using their high cost services.All participants who stored phone passwords in any form said that they wouldchange their passwords in a day or two after the loss. Not surprisingly, the phoneitself was also mentioned as a financially valuable asset to lose.In the scenario when their smartphone had been stolen or used by someoneelse, participants showed a different perception of risks, depending on who the at-tacker was. 
Threat expectations were higher for 17 participants when the attacker was someone who knows them. The participants also stated that when lending their smartphone to their friends, they would like to keep an eye on them, because they had concerns about this person looking through their personal data, such as messages and pictures. Most of the time they did not care about showing some data, such as messages and emails, to complete strangers, but did care if such data were seen by someone within their social circle.

Not surprisingly, 21 participants stated that they would like to store backups of sensitive data at home on their PCs or external hard drives, rather than having them online. Two Android users decided to disable synchronization with their Gmail account completely because of privacy concerns. Moreover, half of the participants only used external hard drives as a backup solution for their home media files, such as videos, pictures and documents. Although most of them did use some form of "cloud" storage, such as Gmail, Facebook or Dropbox, they preferred to store only "shareable" content with these services. This is also consistent with the findings of Ion et al. [74], where they studied users' attitudes towards the adoption of cloud storage in general.

Out of 22 subjects, only seven participants used PIN-locks on their smartphones. Out of these seven, one subject used it only because of a company work policy, and told us that he would not use it otherwise. Another participant said that she only used a PIN-lock to protect her SMS messages from her parents, and found it annoying that she was not able to isolate and protect these messages.
All participants that used PIN-codes stated that they typed PIN-codes very often for data that is both not sensitive and not valuable to them, such as weather forecasts or games.

Those who previously used device locking with a PIN-code, but who switched it off at some point, justified this decision by needing quick access to specific data and functions of the device on the go or in specific circumstances. For instance, one of the participants said that she gave up on using PIN-codes to lock her device when she was at a party and needed constant access to the Internet to check certain information. She found it highly inconvenient to type her PIN-code each time, so she decided to switch off smartphone locking completely. Moreover, needing to type PIN-codes or passwords in on-the-go situations rendered some users' devices unusable, especially when users were in a rush.

Similarly, users who did not use any type of PIN-code agreed that typing a PIN-code for every application on their phone "does not make any sense". For those who did not use such locks, the main reasons were: (1) subjects did not have any sensitive data on their smartphones, (2) it was too inconvenient for them to type their PIN code or password, or (3) they felt "socially awkward" typing a password in front of their friends or family members.

RQ5 - Experience with Unauthorized Access

Two subjects stated that they had been victims of unauthorized access to their smartphones in the past.
One subject said that she had to lock her smartphone at home because her brother and parents tried to access her photos and messages: "I am enabling smartphone lock once I am home, to prevent my brother and my parents going through all my pictures and SMS messages." Another subject, a female student who shared an apartment with other students, said that while she was asleep, her roommates used her phone without her permission.

One subject admitted that she looked through all the messages and pictures on a phone that she found in a cinema theater: "I first called to the last number in call history [...] then I had to wait for them, so I decided to peek into photos and messages, just out of curiosity. Would you not do the same?".

2.3.3 Summary

The results of the qualitative study presented in this section provide a better understanding of users' perspective on threats to smartphones, especially those that are relevant to confidential data disclosure. In addition, the results of the study shed light on (1) the data types that smartphone users store on their devices, (2) participants' opinions on data sensitivity, and (3) the security practices users employ or do not employ, and why.

Overall, the results of the qualitative study suggest that users store significant amounts of sensitive data on their smartphones and that they are concerned with the disclosure of such data if the device is lost or stolen. The majority of them, however, tend not to take any actions to ensure confidentiality protection. In particular, only a few subjects used secure passwords, while the rest used 4-digit PIN-codes. Even more, the PIN-codes were considered unusable by a majority of the subjects and were often avoided completely.
Subjects justified such behavior by the need to have instant access to their data and applications, which often appeared to be not sensitive (e.g., games, weather forecast applications, and internet browsers).

Finally, this study provides evidence that users indeed experience unauthorized access to their devices and data. It is clear that these attacks are mounted by both strangers and insiders. In particular, several subjects admitted that they had been victims of such attacks, where one of them took proactive actions to defend herself by locking her phone at home (i.e., an insider attacker). Another subject admitted that she accessed photos and messages on a phone she found in a cinema theater (a stranger attacker).

2.4 Study 2 – Online Survey

Although Study 1 provided us with rich qualitative data, it did not allow us to quantify different opinions, and hence did not allow us to answer such quantitative questions as "How many users use PIN-codes for device locking?" or "How many users have encountered unauthorized access to their devices?" To address this knowledge gap, a quantitative study, based on an online survey, was performed.

2.4.1 Methodology

The design of Study 2 is based on the results of Study 1. In particular, the questions and structure of the online survey were defined around the data types identified in Study 1. In addition, subjects were questioned about their perception of threats and risks with two different types of adversaries in mind, i.e., a stranger and an insider. In Study 2, we used an online survey, which allowed us to recruit a larger and more diverse participant pool and measure the statistical prevalence of various opinions, practices and experiences. To ensure clarity of the questions and correctness of the data collection process, we conducted four pilot studies (between January and May 2012) with 60 subjects in total. Data from the pilot studies are not included in the analysis presented in this chapter.

The online survey consisted of four parts.
In the first part, general questions were asked about the use of smartphones. In particular, participants were asked when they locked their smartphones and if they also used a code (either PIN, Draw-a-Secret, or a password) to unlock them.² Next, the subjects were asked to visit a web page on their smartphones. Our data collection tool used this opportunity to record a UserAgent³ string from their smartphone. This allowed us to eliminate subjects that did not provide evidence that they owned a smartphone.

The second part of the survey included questions about respondents' previous experience with their smartphones, e.g., loss or damage. Subjects were also questioned if they had previously accessed someone's smartphone without permission, and if someone had accessed their smartphone without their permission.

The third part of the survey contained questions about the data types that subjects stored on their smartphones. Participants were provided with a pre-populated list of data types (compiled based on the results of Study 1) and were asked to select those that they stored. An option to add a new type was also available. These questions were asked twice, once for personal data and once for work-related data.

In the final part of the survey, subjects were required to rate their agreement with the following statement, "I would not have any concerns if Personal/Work Data Type could be viewed by such a thief," on a 5-point Likert scale for each data type. The following options were provided: Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree. The rating task was performed twice, once for a stranger scenario and once for an insider scenario. The stranger scenario was presented as, "Assume your smartphone was just stolen by a person who does not know you [sic]," and the insider scenario as, "Assume your smartphone was just stolen by a person who knows you [sic]."

Finally, subjects ranked the importance of each data type they stored on their smartphones. Ranking was performed twice, once for the stranger scenario and once for the insider scenario. In each ranking task, subjects were asked to rank the data types by their level of concern, with the most concerning at the top and the least concerning at the bottom. A drag-and-drop user interface was provided for this task.

We instrumented a custom survey website with tools that allowed us to track the following: how much time each subject spent on each question; IP addresses of the PC and smartphone used for the survey; and UserAgent strings for the PC and smartphone. Later, these data were used to remove subjects that either skimmed through the survey (23 subjects) or did not use a smartphone (942 subjects). The UserAgent string was also used to measure the representativeness of our subjects in terms of mobile platforms and OS versions.

In our data analysis, we used the Fisher Exact Test (FET) or Chi-Squared Test (CHI) for tests on contingency tables. To analyze the differences between sensitivity rates for strangers and insiders, we used the U-test (Wilcoxon rank sum test). To analyze the differences between sensitivity ranks for strangers and insiders, we used the Wilcoxon signed-rank test (WSRT).

Study 2 was conducted between May 16 and June 23, 2012. The survey was available in the US, UK, Australia, New Zealand, and Canada on Amazon's Mechanical Turk (MTurk); through other advertisement services, such as Kijiji and Craigslist; and through a "word of mouth" approach.

² Face and fingerprint recognition were not available at the time of this study.
³ A UserAgent is a string that every browser sends to the web server. For instance, the following string is sent from an HTC Sensation 4G that runs Android 2.3.4: "Mozilla/5.0 (Linux; U; Android 2.3.4; en-us; HTC Sensation 4G Build/GRJ22) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1".
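The test selection described above follows standard practice for categorical and ordinal data, and can be illustrated with common statistical tooling. The sketch below uses SciPy on small synthetic inputs (not the study's actual responses): Fisher's exact test on a 2x2 contingency table, the Mann-Whitney U-test for independent ordinal ratings, and the Wilcoxon signed-rank test for paired ranks.

```python
from scipy import stats

# Hypothetical 2x2 contingency table (e.g., lock use by gender); counts are made up
table = [[206, 173], [164, 181]]
odds_ratio, p_fisher = stats.fisher_exact(table)

# Synthetic 5-point Likert ratings from two independent groups of subjects
stranger_ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]
insider_ratings = [5, 5, 4, 4, 5, 3, 5, 5, 4, 4]
u_stat, p_u = stats.mannwhitneyu(stranger_ratings, insider_ratings,
                                 alternative="two-sided")

# Paired sensitivity ranks given by the same subjects under both adversary scenarios
stranger_ranks = [1, 2, 3, 4, 5, 6, 7, 8]
insider_ranks = [3, 1, 2, 6, 4, 8, 5, 7]
w_stat, p_w = stats.wilcoxon(stranger_ranks, insider_ranks)
```

The unpaired U-test fits the rating comparison because different subsets of subjects stored (and thus rated) each data type, while the signed-rank test fits the ranking comparison because each subject ranked the same data types under both scenarios.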
The study was approved by the UBC Behavioural Research Ethics Board (application H11-03512).

2.4.2 Results

In what follows we report the results of Study 2.

Demographics of Recruited Subjects

2,092 subjects were recruited for the online survey. 1,725 respondents successfully completed the survey, i.e., answered all required questions. Further investigation revealed that only 783 of the subjects used their smartphones as required in the smartphone ownership testing task. As was identified in the preliminary pilot studies, the survey required at least 10 minutes to finish. This is why 23 participants, who finished the study in less than 10 minutes, were excluded from the analyzed set. Finally, considering that the MTurk platform was the most successful recruitment tool, we decided to remove subjects recruited through other means, so as to avoid having a user study that was difficult to reproduce.

The remaining 724 participants completed the survey in 25 minutes on average (std. dev. s=12.5). The majority of the participants were from the US (634); the rest were divided between Canada (50), the UK (29), Australia (9), and New Zealand (2). The majority of subjects used Android OS (391, 51%) and iOS (278, 37%). We did not find a statistically significant difference between our sample's platform distributions and the distributions reported by Google and Kunzler [8, 68] (FET, p>0.08). Three hundred seventy of the subjects were male (51%) and 354 were female. The average age of the subjects was 25.6 years (s=5.98). The average annual income was $43k (s=$19k).

The list of occupations reported by the participants was diverse and included more than 500 different titles in 16 various industry fields, such as agriculture, business, construction, education, etc.

We compared the demographics of our subjects with the results reported by Smith [105].
To the best of our knowledge, Smith's study is the only study that provides statistics on a representative sample of the US population of smartphone users (n=2,253), and the majority of our subjects were from the US. For this part only, all non-US subjects were removed (90). For the rest of the analysis, all 724 subjects were used. The analysis of the differences between our US subjects and the ones reported by Smith's study [105] did not reveal a statistically significant difference in gender distribution. However, there was a statistically significant difference in age, income, and education. In particular, our participants appeared to be younger (29.6, σ=9.69; χ²=361.67, df=3, p<0.001). This, however, is not surprising, as it was previously shown that MTurk subjects tend to be younger [88]. Although the differences in education and income distributions were statistically significant (FET, p<0.001), we consider them practically insignificant due to the small relative values. The average income in Smith's study was higher by 6% ($46k, sd=$20k), and the difference in education levels revealed that our sample had 9% more subjects with a high school diploma and 9% fewer subjects with a college or higher degree.

Demographic data analysis suggests that the recruited subject pool is a diverse and representative sample, at least for the US, with a slight bias towards younger smartphone users.

RQ1 - Types of Data Stored on Smartphones

Subjects found the list of options for data types that we provided sufficient, since only three of them added new types. The 15 most used data types are provided in Table 2.4.
Note that each data type name has a corresponding code, e.g., photos and videos are coded as phv, which is later used in discussion and figures for brevity.

Table 2.4: The 15 most used data types by the subjects. All cases include only data types for personal use, since no work-related data types made it to the top 15.

1 - Photos and videos (phv): 94%
2 - SMS/MMS messages (sms): 93%
3 - Call history (cah): 90%
4 - Emails (eml): 87%
5 - Contacts details (cod): 87%
6 - Music (mus): 81%
7 - Browser search history (bsh): 74%
8 - Browsing history (bwh): 73%
9 - Events in calendar (evt): 73%
10 - Notes and memos (n&m): 72%
11 - Data in social networking applications (osn): 68%
12 - Progress in games (gam): 68%
13 - Documents (doc): 64%
14 - Voice recordings (voc): 42%
15 - Passwords saved in applications or password managers (pwd): 37%

RQ2 and RQ3 - Security Practices for Data Protection

The results of the online survey confirmed findings from the interview-based study, which suggested that most of the subjects did not take appropriate actions to protect their data on smartphones. In particular, only roughly half of the survey participants (379, 52%) locked their smartphones with a code. These subjects are referred to as the lock-using group (LOCK). The remaining (345) participants did not use a locking system. We refer to this group as OPEN.

Subjects justified the necessity to lock their device by needing to limit access to data or functionality. In particular, 64% (243) of subjects in the LOCK group did so to prevent unauthorized access to their data, while 73% (278) of them did so to avoid unauthorized use of the smartphone's functionality.

Similar to the results of Study 1, the OPEN group included 155 subjects that kept sensitive data on their smartphone and had used a locking system before, but had stopped due to various usability problems (too many authentication prompts, the necessity to authenticate even if non-sensitive data was accessed, etc.).
The other 190 subjects in the OPEN group did not have any sensitive data on their smartphones. Interestingly, most of the subjects in the LOCK group used either a PIN-code (206) or a Draw-a-Secret (DAS) (168) authentication method, whereas only 52 used alpha-numeric passwords. Note that subjects were able to select multiple types of authentication methods if they owned several smartphones, thus ∑n ≠ 379. The participants from Study 1 justified the choice of PIN or DAS with ease of use, in comparison to fully-fledged alphanumeric passwords. The distribution of subjects' justifications for using a smartphone lock is presented in Table 2.5, and the distribution of reasons for not using a smartphone lock is shown in Table 2.6.

Table 2.5: Distribution of reasons for using a locking system (N=379). Note that N ≠ ∑n, because the participants were able to provide multiple reasons. CI stands for the confidence interval (α=0.05, z_{α/2}=1.96), given the number of subjects that were able to answer that question.

Reason / n / % / CI
I feel comfortable having such protection / 334 / 88 / ±3.18
I do not want other people to use my phone services without my permission / 284 / 75 / ±4.28
I do not want other people sneaking into my smartphone when I do not see it / 246 / 65 / ±4.7
I have confidential and sensitive data on my smartphone(s) / 167 / 44 / ±4.89
My employer requires that / 25 / 7 / ±2.39
I do not want my smartphone to "pocket dial" / 6 / 2 / ±1.2
My smartphone got stolen and I did not have a lock on it / 1 / ≈0 / -
I lose (and later recover) my cell phone a lot / 1 / ≈0 / -
It is a default on my phone / 1 / ≈0 / -

To summarize, a majority of subjects in the LOCK group used smartphone locking to feel comfortable, to avoid others using their device or looking through their data, or because they stored sensitive data (in some cases as required by their employer). On the other hand, 58% of OPEN group subjects did not lock their device because they did not store any data that required protection.
Interestingly, 46% of subjects previously had locked their smartphones with a code, but stopped doing so due to usability issues. These results are consistent with the findings from the interview-based study, where subjects voiced concern about the lack of granularity in the current locking mechanisms of smartphones.

Table 2.6: Distribution of reasons for not using a locking system (N=345). Note that N ≠ ∑n, because the participants were able to provide multiple reasons. CI stands for the confidence interval (α=0.05, z_{α/2}=1.96), given the number of subjects that were able to answer this question.

Reason / n / % / CI
I do not have any data that I want to hide on my phone / 200 / 58 / ±4.86
I tried locks before and found them very inconvenient / 159 / 46 / ±4.9
I often need instant access to applications that do not store any sensitive data (e.g., weather forecast, news, games) / 145 / 41 / ±4.84
It is not worth it for me to use a smartphone lock, because the amount of data and applications that are sensitive is very small compared to those that are non-sensitive / 114 / 33 / ±4.62
I do not save my passwords in applications and type them every time I use an application that stores sensitive data (e.g., email application, Facebook application) / 66 / 19 / ±3.84
I do not care if my phone services will be used by someone / 55 / 16 / ±3.6
It's always with me or in my sight / 16 / 5 / ±2.08
I did not have time to set it up (new phone) or I am lazy / 5 / 1.38 / ±1.15
I did not know about this feature / 3 / 1 / ±0.89
Other / 6 / 2 / ±1.25

RQ4 - Security Concerns with Sensitive Data

To answer RQ4, the differences between users' concerns with the confidentiality of their data were analyzed. First, we analyzed the Likert scale ratings with the U-test, since the collected data were ordinal, and, thus, parametric tests (such as the t-test or ANOVA) were not applicable. The results of the U-test revealed that, out of the 32 data types, subjects rated their concerns differently for only six types.
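The CI columns in Tables 2.5 and 2.6 are of the form used for a normal-approximation (Wald) interval for a proportion, with half-width z_{α/2}·√(p(1−p)/N) expressed in percentage points. A minimal sketch follows; note that the exact table values depend on the per-question N, since, as the captions state, not every subject answered every question, so recomputing with the group totals will not reproduce every entry exactly.

```python
import math

def ci_half_width_pct(count: int, n: int, z: float = 1.96) -> float:
    """Half-width, in percentage points, of a Wald (normal-approximation)
    confidence interval for the proportion count/n, at critical value z."""
    p = count / n
    return 100.0 * z * math.sqrt(p * (1.0 - p) / n)

# Example: 200 "yes" answers out of 345 OPEN-group subjects (~58%)
half_width = ci_half_width_pct(200, 345)
```

For a sanity check, 50 of 100 gives p=0.5 and a half-width of exactly 1.96·√(0.25/100)·100 = 9.8 percentage points.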
In particular, subjects showed the highest concern about an insider threat for SMS messages, call history, browsing history, and search history in the browser. Additionally, subjects were more concerned about strangers for contact details and progress in games.

Figure 2.2: The proportion of concerned users with sensitivity in the presence of a stranger (horizontal axis) and in the presence of an insider (vertical axis). Panel (a) shows rates (r=0.91); panel (b) shows ranks (r=0.96). Data labels across the vertical axis and circles in the plots represent data types for personal use; data labels across the horizontal axis and squares in the plots represent data types for work-related use. Filled shapes and red-colored data labels represent statistically significant differences between subjects' concerns with respect to a stranger and an insider (U-test for rates, WSRT for ranks, p < 0.05). The meanings of the abbreviated data type labels are in Table 2.4.

Figure 2.2a shows the proportions of subjects that were concerned with strangers (x axis) and insiders (y axis) for every data type. The proportion of concerned subjects for a data type was estimated as the fraction of the number of subjects that were either concerned or highly concerned with unauthorized access to the total number of subjects that stored such data.
This plot shows that users' concerns with regard to both adversaries are highly correlated (r=0.91), which suggests that both types of adversaries are worth considering.

Statistical analysis of the ranks for data types revealed 11 statistically significant differences (WSRT, p < 0.05). Most of the differences, however, had small absolute values, and could be ignored. For each data type we calculated the average value of the user-provided rank and plotted the results in Figure 2.2b. Similar to the ratings, the correlation between the ranks of user concerns for both types of attackers was high (r=0.96).

- E1 - I have left my mobile phone at some place, but recovered it later (e.g., at my friends' place, in a restaurant, at parents' house, at school, etc.): 363 (50%)
- E2 - I have broken my mobile phone before, so that it was not usable: 335 (46%)
- E3 - I have lost my mobile phone before and did not find it: 165 (23%)
- E4 - Someone used my mobile phone without my permission with the intention to use its functionality (phone call, browsing the Internet, etc.): 100 (14%)
- E5 - I used someone's mobile phone without the owner's permission for some functions (phone call, browsing the Internet, etc.): 102 (14%)
- E6 - Someone used my mobile phone without my permission with the intention to look at some of my data: 89 (12%)
- E7 - I used someone's mobile phone without the owner's permission to look into his/her data: 66 (9%)

Table 2.7: The distribution of "negative" experiences of the participants (N = 724).

From these results, we can conclude that while users are concerned with unauthorized access to their data, these concerns are somewhat different for various data types. For example, users are more concerned with insiders gaining unauthorized access to their SMS messages, call history or browsing history.
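The correlation coefficients reported above (r = 0.91 for rates, r = 0.96 for ranks) are plain Pearson correlations over per-data-type pairs. The following self-contained sketch uses made-up proportions (the actual per-type values appear in Figure 2.2 and are not reproduced here):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical proportions (%) of concerned subjects, one pair per data
# type: stranger-related concern vs. insider-related concern (made up).
stranger = [80, 75, 60, 55, 40, 20]
insider = [78, 70, 65, 50, 35, 25]
print(round(pearson_r(stranger, insider), 2))
```

A value close to 1 indicates that data types that worry users with respect to one adversary tend to worry them with respect to the other as well.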
At the same time, users are more concerned with strangers if contact details are at stake.

RQ5 - Experience with Unauthorized Access To Smartphone

A summary of subjects' "negative" previous experiences is provided in Table 2.7. Half of the subjects had left their phones behind before. While such an experience does not necessarily translate into theft or loss, it does make their device an easy target, since an attacker would have plenty of time to go through data. Almost a quarter of subjects (23%) had lost a device and did not recover it.

Interestingly, 12% of the subjects had found that someone accessed sensitive data on their smartphones without their permission. Furthermore, 9% had admitted looking into someone else's smartphone without permission. These results provide empirical evidence that unauthorized access to data and functionality by insiders impacts about 10% of smartphone users. Subjects from Study 1 justified these invasions of privacy by simple curiosity (e.g., for partners or roommates) or by an urge to take care of and be informed about their children, i.e., parents' "snooping".

We performed a logistic regression analysis in order to identify groups of smartphone users that had higher chances of being a victim of unauthorized access. Logistic regression is best suited for models with a binomial dependent variable – in this case, those who have or do not have an experience. In this analysis, we only analyzed the experiences related to unauthorized access (i.e., E4-E7). We built a model for each experience separately, four models in total. If a subject had such an experience, then we coded it as 1, otherwise 0. For the independent variables, we considered the following: A - Age, G - Gender, and L - Lock Use. For binomial independent variables (Gender, Lock Use), we used bipolar representation (-1, 1).
Equation 2.1 shows the form of the model we investigated, where E_x stands for one of the experiences from E4-E7.

E_x = \frac{e^{a_0 + a_1 G + a_2 L + a_3 A + a_4 GL + a_5 GA + a_6 LA + a_7 GLA}}{1 + e^{a_0 + a_1 G + a_2 L + a_3 A + a_4 GL + a_5 GA + a_6 LA + a_7 GLA}}    (2.1)

The intuition of the model shown above is to assess whether a given experience is correlated with subjects' attributes, such as age or gender [108]. The goal of the logistic regression analysis is to eliminate attributes that do not have a significant impact on an experience. In addition to the analysis of attributes, we also had to consider any interaction effects that might arise from a combination of variables, e.g., younger females or adult males who do not lock their device. These interaction effects are represented as the GL, GA, LA and GLA variables in the equation above.

Logistic regression analysis revealed that, for all four models, all interaction effects were not statistically significant (p > 0.174), and thus could be removed from the model. Furthermore, Gender and Lock Use also showed statistically insignificant prediction power on the experience (p > 0.185). That is why we simplified our models to the form shown in Equation 2.2. The parameters of the models are shown in Table 2.8.

Experience   a0     a1     p         RD    AIC   R2
E4          -2.95  -0.53  < 0.001    546   550   0.09
E5          -2.90  -0.51  < 0.001    554   558   0.08
E6          -2.70  -0.36  < 0.001    521   525   0.05
E7          -3.13  -0.52  < 0.001    425   429   0.05

Table 2.8: Parameters of the logistic regression models, where a0 is the intercept, a1 is the coefficient in front of the Age variable, p is the biggest p-value for both a0 and a1, RD is the residual deviance, AIC is the Akaike Information Criterion, and R2 is Nagelkerke R-squared.

E_x = \frac{e^{a_0 + a_1 A}}{1 + e^{a_0 + a_1 A}}    (2.2)

First, the logistic regression analysis revealed that our models did not have strong predictive power, since the R2 values were low. However, the coefficients of the intercept and age showed a statistically significant difference from zero.
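To see what the fitted coefficients in Table 2.8 imply, Equation 2.2 can be evaluated directly. The sketch below uses the E4 coefficients (a0 = -2.95, a1 = -0.53); the coding of the Age variable is assumed here purely for illustration, since the exact units are not restated:

```python
from math import exp

def predicted_probability(a0, a1, age):
    """Evaluate Equation 2.2: E_x = e^(a0 + a1*A) / (1 + e^(a0 + a1*A))."""
    z = a0 + a1 * age
    return exp(z) / (1 + exp(z))

# Coefficients for experience E4 from Table 2.8 (intercept and Age).
a0, a1 = -2.95, -0.53
# The negative Age coefficient means the predicted probability of having
# experienced E4 decreases as the Age variable grows.
for age in (1, 3, 5):  # Age values on an assumed illustrative scale
    print(age, round(predicted_probability(a0, a1, age), 3))
```

The monotonic decrease of the predicted probability with age is exactly the pattern discussed in the text: younger subjects have higher chances of experiencing unauthorized access.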
Negative values of the intercept and the coefficient for age showed that the younger subjects had higher chances of experiencing unauthorized access. This is also depicted in Figure 2.3, where a larger ratio of younger subjects had experienced E4-E7. This might be attributed to various factors. For instance, younger smartphone users might tend to share their devices more frequently, or younger students often share accommodation with others while attending college or university.

2.4.3 Summary

The results of Study 2 confirmed the findings of Study 1. In particular, users were concerned with unauthorized access to their devices for both data and functionality. Furthermore, the results provided evidence that the threat of an insider attacker impacts about 10% of smartphone users. That is, 12% of smartphone users have experienced unauthorized access of their data or functionality, and 9% of the participants admitted that they had accessed someone else's smartphone without permission.

[Figure 2.3: a bar chart of the percentage of participants reporting each experience E4-E7, broken down by age group (10-14, 15-17, 18-24, 25-29, 30-34, 35-39, 40-44, 45-49).]

Figure 2.3: Distribution of the experiences E4-E7 (the meanings for these labels are provided in Table 2.7) over participants' age groups. We removed all the subjects that were younger than 10 and those that were 50 or older for clarity purposes.

Study results suggest that most subjects (95%) who locked their smartphones (i.e., subjects in the LOCK group, n=379) used PIN or DAS authentication methods. According to recent research [55, 71], these methods are not resistant to eavesdropping, especially when users are distracted by other factors [48]. Moreover, as was explained in Chapter 1, such unlocking secrets fall into the easy-to-guess category. This is why we argue that the effectiveness of data protection systems against attackers that steal a victim's device is at least questionable and requires further research.

2.5 Limitations

The design of this study has several limitations.
First, both Study 1 and Study 2 rely mainly on self-reporting, which is subjective, e.g., subjects might not have understood certain terminology. In order to reduce this risk to validity, we avoided security terminology and jargon in the questionnaire. In addition, a set of pilot studies for both Study 1 and Study 2 was carried out, in order to improve the clarity of the interview and survey questions.

Second, because the results of Study 2 are based solely on smartphone owners, the results in Table 2.7 should be treated as a lower bound. In addition, users might be reluctant to report on socially unacceptable behavior, such as snooping into someone's phone without permission. It is also possible that a user might not know if someone had accessed his or her smartphone without permission.

Finally, the participants of this study were recruited on the MTurk platform, which has been reported to differ [88] from the population of smartphone users [105]. Alternative recruiting methods were also used during subject recruitment, but they unfortunately proved to be less effective. The comparison of demographics between the recruited subjects and the previously reported population of smartphone users in the US [105] did reveal statistically significant differences. Most of them, however, were insignificant in practical terms.

2.6 Related Work

Several authors have investigated user concerns with the security and privacy of their smartphones. Chin et al. [46] conducted a user study to understand how concerned users are about their privacy when they use smartphone applications, especially for sensitive tasks like banking or shopping. The authors found that users do tend to reduce the frequency of such activities because they are highly concerned with privacy. The participants attributed the cause of their behavior to fear; interestingly, theft and loss of a smartphone were among users' top five indicated fears. Unfortunately, none of these fears were investigated further.
In particular, it is still unclear if these fears reflect real threats, and whether or not users had experienced such threats.

Dorflinger et al. [54] investigated users' attitudes towards gradual security levels and novel authentication methods. Although the authors provide a better understanding of user concerns with novel authentication methods for smartphones, the question of how concerned users are with sensitive data in their smartphones remains unanswered. Moreover, it is not clear what kinds of data users consider to be sensitive and whether such sensitivity depends on who the attacker is. The studies presented in this chapter, on the other hand, fill this knowledge gap.

Similarly, Ben-Asher et al. [42] studied users' attitudes towards alternative authentication methods and the sensitivity of data and smartphone functionality. The authors, however, considered a limited set of data types, which included only seven different types of data. The authors also did not investigate how data sensitivity varied with different types of attackers, such as insiders or strangers. To improve on these results, subjects in both user studies presented in this chapter were allowed to provide their own data types. In addition, subjects assessed data sensitivity for two scenarios: (1) when data is leaked to a stranger, and (2) when data is leaked to an insider.

The research community has paid a lot of attention to evaluating novel authentication methods in recent years. For example, Shi et al. [102] and Riva et al. [94] evaluated implicit authentication methods for smartphones based on user behavior or context. Hayashi et al. [72] discuss a possible extension to the whole approach of how we lock our mobile devices. In particular, the authors suggest that we could automatically detect specific environments and disable the smartphone lock if the environment is considered to be safe.
In all the aforementioned papers, the authors made a crucial assumption: that environments can be easily classified into safe (e.g., home or work) and unsafe. By being able to detect safe environments, the authors were able to reduce the number of authentication attempts for a user, and thus improve device locking usability.

In September 2013 Apple unveiled the Touch ID sensor in the new iPhone 5S smartphone. The fingerprint sensor was designed to improve user experience with unlocking smartphones by simply pressing the home button. Such a design, however, raises an interesting research question: "How does technology like Touch ID affect the strength of locking secrets that users choose?" Although Touch ID addresses certain usability problems (e.g., being able to unlock devices on the go), the strength of unlocking secrets is still crucial for the confidentiality protection of data in smartphones. When a user fails to unlock his iPhone with a fingerprint, the device will ask for the unlocking secret. Furthermore, the underlying data encryption sub-system also uses unlocking secrets in the encryption key derivation process. Having a guessable unlocking secret increases the chance that an attacker will mount a successful password guessing attack, and subsequently gain unauthorized access to data-at-rest.

To understand how the use of Touch ID impacts users' selection of unlocking secrets, Cherapau et al. [45] conducted three user studies based on interviews and surveys. The authors began with a study based on in-person surveys, which allowed them to measure the adoption of Touch ID and the strength of unlocking secrets. They then proceeded with interviews to understand the justification for the chosen unlocking secrets used with Touch ID. Finally, an online survey was conducted to measure the prevalence of the various adoption strategies established in the first two studies.
Overall, the results of the study revealed that users do not take full advantage of the Touch ID sensor and still use guessable unlocking secrets. In particular, there was no statistically significant difference observed between the size of the search space of unlocking secrets chosen by users that used Touch ID and those that did not. That is, both groups relied mainly on 4-digit PINs. Similarly to Study 1, presented in Section 2.3, participants stated that they adopted short PIN codes instead of alphanumeric passwords because of the better usability of the former. These results suggest that the addition of biometric sensors, such as Touch ID, has not changed the current state of affairs on data security in lost or stolen smartphones. After we published our results, Apple introduced an option for using 6-digit PIN codes. It is still unclear, however, how many users would switch to this option, given that more than 30% of subjects in the Touch ID study were unaware that they could choose to use alphanumeric passwords.

Asking participants to admit socially undesirable behavior has its limitations, since users might be reluctant to share such information with others. To address this limitation, Marques et al. [83] used an anonymity-preserving list-experiment survey, which allowed users to share their experience without explicitly admitting their actions [84]. List experiments employ a between-subjects design, i.e., two distinct groups of subjects. Both groups are presented with a list of activities or opinions, and the subjects are required to report only the number of items that they have done or share. The lack of explicit selection provides subjects with a sense of anonymity. The control group has a list of four items, where two of the items chosen are highly likely to be selected by participants, e.g., brushing teeth or drinking water. The other two options chosen are highly unlikely to be selected by the participants, e.g., flying to the Moon.
The treatment group also includes an extra option, which in our case was the act of snooping on someone else's smartphone. For each of the groups, the authors analyzed the reported numbers of items users selected. The difference between the averages corresponds to the ratio of subjects in the treatment group that selected the extra option. The results of the analysis revealed that the snooping rate was significantly higher than previously self-reported rates, i.e., 31% in the recruited subject pool, or 1 in 5 adults if the results are prorated for the general population of smartphone users in the US. In addition, the results of this study confirmed the correlation between age and snooping behavior; younger smartphone users had a higher chance of being a victim of a snooping attack.

To gain a better understanding of how often users unlock their devices on a daily basis, Mahfouz et al. [82] conducted a field study with 41 smartphone users. Each participant installed a custom-built monitoring application and ran it for at least 20 days. The application collected various data about smartphone usage, including the number of successful and unsuccessful unlocking attempts, the length of unlocking attempts, the number of failed unlocking attempts and other events. The results of the study revealed that users who used an unlocking secret unlocked their smartphone more frequently, i.e., 51 times a day on average against 41 times a day. Users who relied on PIN-codes were also less prone to make a mistake during the authentication process (0.5% error rate) in comparison to users who used Draw-A-Secret (3.5%) or an alphanumeric password (4%). Although Draw-A-Secret (DAS) showed a significantly higher error rate than PIN-codes, it was used by 69% of the subjects that used unlocking secrets, whereas PIN-codes were adopted by 22%. This suggests that users are willing to compromise, to some extent, the error rate of the authentication methods in smartphones for usability.
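Returning to the list experiment described above: its estimator is simply the difference between the mean reported item counts of the treatment and control groups. A minimal sketch with fabricated counts (not the data from [83]):

```python
def list_experiment_estimate(control_counts, treatment_counts):
    """Estimated share of subjects selecting the sensitive extra item:
    the difference between the mean reported counts of the two groups."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(treatment_counts) - mean(control_counts)

# Reported numbers of applicable items (fabricated for illustration).
# The control list has four neutral items; the treatment list has the
# same four plus the sensitive one (snooping on someone's smartphone).
control = [2, 2, 1, 2, 3, 2, 2, 2]
treatment = [2, 3, 2, 3, 2, 2, 3, 3]
print(list_experiment_estimate(control, treatment))  # -> 0.5
```

Because subjects report only a count, never which items apply, the sensitive behavior is inferred in aggregate without any individual admitting to it.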
Yet, it is still unclear which usability properties of an authentication method are the main factors for such decisions, e.g., the ability to remember and recall secrets easily, the ability to enter the secret easily, or the ability to use the method while distracted and on the go. It is also not clear to what extent users are willing to tolerate the error rate.

2.7 Discussion and Future Work

The results of Studies 1 and 2 revealed that users do store various types of data on their smartphones, including both relatively small data items, such as SMS messages, and large data items, such as photos and videos. Sensitivity of data varied and also depended on the type of attacker. For instance, while contact details were considered sensitive if a stranger accessed them, users were not concerned with an insider reading them. On the other hand, users had higher concerns with an insider accessing certain personal data records, such as SMS messages. This suggests that a data protection system for smartphones should be both (1) efficient at supporting data types of various sizes, and (2) effective at protecting those data types against both insiders and strangers.

This chapter also presented an analysis of the security practices users employ today in order to protect confidential data in smartphones. In particular, while half of the recruited subjects used an unlocking secret, 95% of them chose either DAS or PIN-codes. These methods, however, allow the attacker to mount a relatively inexpensive password guessing attack. For example, it takes less than a second to go through all combinations of 4-digit PIN-codes for Android OS [104] and about 14 minutes for iOS [34, 37]. Moreover, research by Raguram et al. [93] showed that these unlocking secrets can be reconstructed from recording reflections of a smartphone screen.

The results of the study on the impact of Touch ID on users' choice of unlocking secrets revealed that smartphone users still preferred weaker, easier to use authentication methods.
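The guessing times quoted above follow directly from the size of the secret's search space and the attacker's guessing rate. A back-of-the-envelope sketch (the rate below is derived from the cited ~14-minute iOS figure; all other numbers are illustrative):

```python
# Search-space sizes for common unlocking secrets.
pin4 = 10 ** 4        # 4-digit PIN
pin6 = 10 ** 6        # 6-digit PIN
alnum8 = 36 ** 8      # 8 characters, lower-case letters and digits

# Guessing rate implied by the iOS figure cited above:
# roughly 10^4 PIN candidates in about 14 minutes.
guesses_per_second = pin4 / (14 * 60)  # about 11.9 guesses/second

def hours_to_exhaust(space, rate):
    """Worst-case time, in hours, to try every candidate secret."""
    return space / rate / 3600

print(round(hours_to_exhaust(pin6, guesses_per_second), 1))  # 6-digit PIN
print(round(hours_to_exhaust(alnum8, guesses_per_second)))   # 8-char alphanumeric
```

At this rate a 6-digit PIN falls within a day, while even a short alphanumeric password pushes exhaustive guessing into millions of hours, which is why the search-space size of the chosen secret matters so much.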
Moreover, about a quarter of the surveyed subjects stated that they previously used a locking secret, but decided to disable it due to various usability issues. Interestingly, a follow-up study revealed that a higher authentication error rate does not necessarily correspond to a lower adoption rate (i.e., PIN-codes and DAS). This shows the importance of the usability of authentication methods in smartphones for choosing which method to adopt. Considering that the same set of authentication methods is still being used today, and that the addition of fingerprint sensors, such as Touch ID, has not increased the entropy of the unlocking secrets users choose, a data protection system should not assume that users will employ a hard-to-guess authentication secret to keep data in smartphones protected.

Another important result of the studies presented in this chapter is that they provide evidence that smartphone users experience attacks by insiders. In particular, 89 subjects (or 12%) in Study 2 had caught someone from their social circle snooping through their smartphone without permission. Moreover, 9% of the surveyed subjects admitted that they had accessed data in someone else's smartphone without permission. The accuracy of this estimate was improved in the follow-up study [83], which revealed that 1 in 5 users in the US had looked through someone's smartphone without permission.

These results suggest that novel authentication methods, especially those that are proposed to be used in smartphones, have to be evaluated against attackers that are as capable as insiders. In addition, the assumptions that the research community makes about the safety of certain locations should be reevaluated. For example, both Riva et al. [94] and Hayashi et al. [72] proposed approaches to reduce the frequency of required authentications for smartphone unlocking based on location type.
In particular, these authors assumed that home is safer and proposed disabling smartphone locking at home completely. The studies presented in this chapter, however, suggest the opposite: the home can be full of insider attackers. This is especially true when a user has a roommate or a family member willing to peek into their smartphone.

Finally, the results presented in this chapter show that younger demographic groups have a higher risk of experiencing unauthorized access by insiders. It is still not clear, however, which factors increase or decrease this likelihood. The fact that younger users often share accommodation while in school could be one of such factors. Living with parents and siblings might also contribute to the increase of the insider threat. Finally, more relaxed social norms, such as over-sharing, could also increase the chances of unwanted access to data in a smartphone by an insider.

2.7.1 All-or-Nothing Locking Approach

The results presented in this chapter suggest that there is a gap between what the current smartphone locking systems provide and what smartphone users actually need, especially to protect themselves from insiders. Future research should focus on how to deter and prevent such attackers from gaining unauthorized access. For example, in addition to smartphone locking, one can use facial recognition to detect when the current user is not the owner of the smartphone. The recently introduced Face ID unlocking mechanism in the iPhone X could be the enabling technology for such an approach. In cases when a smartphone owner needs to share his device with someone, researchers might propose an easy-to-configure interface that unlocks certain parts of the smartphone that the owner considers public.

Recent research has also studied contextual awareness for the unlocking process [73, 79, 92]. In such proposals, the context usually defines the complexity and usability of the unlocking process.
In safe environments the user is required to go through a simple authentication mechanism, such as a PIN-code or DAS. In untrusted environments the unlocking process relies on stronger authentication methods. Such approaches, however, might increase the mental load on users, and will have to be more carefully studied "in the wild".

Other researchers have tried to improve the current "all-or-nothing" model for smartphone locking. In such a model, the locking of a smartphone is either fully enabled or completely disabled. The all-or-nothing model, as the results from our studies showed, pushed 20% of smartphone users to disable smartphone locking. To address these limitations, Riva et al. [94] and Hayashi et al. [72] proposed automatic disabling of locking in certain assumed-to-be-safe locations, such as work or home. The results from our user studies, however, showed that these assumptions are questionable. In particular, the results revealed that these locations are full of insiders and that users do experience invasions of privacy by such attackers. Thus, it is still not clear how to design a more granular access control system that takes context into account and provides greater flexibility to the users, while defending the users against potential insiders.

Finally, our studies also revealed that younger smartphone users have a higher risk of experiencing unauthorized access by insiders. It is still not clear which particular factors increase or decrease this likelihood. The fact that younger users often share accommodation while in school could be one of such factors. Living with parents and siblings might increase the insider threat. Further, more relaxed social norms, such as over-sharing, could also be a factor that increases the chances of unwanted insider access.
Future research needs to aim at improving our understanding of the factors that impact such experiences.

2.7.2 Improving Security of Unlocking Methods

Increasing the complexity of the unlocking secrets that smartphone users choose leads to stronger security of full-disk data encryption, since the latter derives its encryption key from the unlocking secret. This can be achieved either by nudging users to pick stronger unlocking secrets, or by improving the authentication methods themselves in terms of security and usability. Persuading users to choose more complex secrets has been extensively studied in the usable security domain (e.g., see [58, 66, 77, 106]). For example, Egelman et al. [58] and Ur et al. [106] studied the effect of password strength meters on users' choices. While the results from both studies showed that password meters can be effective in improving users' choices, it is still unclear how these results would translate to a mobile context, such as a smartphone or typing on the go. Forget et al. [66] studied the effectiveness of persuasive text passwords (PTP). In PTP, once a user chooses a password, the system automatically adds a random character in a random place. The user is allowed to shuffle the character and the position until he finds a combination that he accepts. The results of the user study showed this approach was mostly effective, with the exception of passwords that were randomly chosen to begin with. Finally, Komanduri et al. [77] studied the effect password selection policies have on the actual security of the passwords that users choose. Surprisingly, the study revealed that commonly adopted practices, such as requiring a special character in the password, lead to less secure passwords, while simple policies, such as requiring 16 lower-case characters, allow users to choose less guessable passwords.

Other researchers have focused on improving the authentication methods themselves. For instance, both De Luca et al. [49, 50] and von Zezschwitz et al.
[107] have proposed novel authentication methods that improved the usability of user authentication in smartphones, while addressing specific types of attacks, e.g., shoulder surfing [59, 99, 113]. These proposals, however, are still prone to password guessing attacks, due to a relatively small search space for authentication secrets, which is comparable to that of a commonly adopted 4-digit PIN-code.

Recent improvements in sensor capabilities provide additional opportunities for improving the usability of existing authentication methods that are believed to be unusable. For instance, the fingerprint scanner in iPhones (the Touch ID sensor) significantly reduces the frequency of user authentication based on secrets. This makes it possible for a user to choose a more complex alphanumeric password instead of relying on a 4-digit PIN-code. Unfortunately, one of the follow-up studies [45] revealed that smartphone users still prefer easy-to-guess authentication secrets, such as 4-digit PIN-codes. The main reasons for such preferences were unawareness that a more complex option was available and the need to share unlocking secrets with others.

Finally, it is still unclear to what extent easy-to-guess unlocking secrets are being exploited. While there is some anecdotal evidence that attackers try to access private data on a stolen device (e.g., the Honey Stick Project by Symantec [9]), such results were not obtained in a scientifically sound manner. In this work I assumed an opportunistic attacker – an attacker that aims to profit from the stolen smartphone itself, but who opportunistically tries to access data as well. There is, however, no scientifically sound evidence on whether such attackers pose a real threat.
This is why future research should also attempt to uncover the impact of data breaches that originate from stolen smartphones for which users choose easy-to-guess unlocking secrets.

2.8 Challenges

Conducting user research is challenging, but doing research on sensitive matters, such as private data in smartphones, adds another dimension of complexity: ethics. Researchers are bound by high ethics standards that often make certain kinds of research impossible. For example, when researchers attack existing systems, they often are limited to using the researchers themselves as subjects, e.g., the evaluation of a shoulder surfing attack [93], or must clearly state their intent before the attack. Both of these approaches impact outcomes through bias.

In particular, in late 2012 I tried to evaluate the assumption that users are vulnerable to eavesdropping attacks in public places, such as coffee shops and while using public transit. The study design was based on observing users in real settings to assess the (in)security of their unlocking secrets. To limit subject bias, we planned to approach subjects for debriefing and consent for including their data in the analysis after the observations were made. While initially our study was approved by the ethics board, the application was rejected two months later.4 This made it impossible for me to assess how easy it was for an attacker to eavesdrop a user's unlocking secret before stealing their smartphone.

4 Application H12-02254, titled "Smartphone Unlock in a Wild". Approved on December 19, 2012. Rejected on February 25, 2013.

Similarly, it is challenging to assess how often user data is being accessed by insiders, since it is difficult to obtain consent from the insider attackers without impacting their behavior.
Furthermore, when it comes to strangers, it is unclear how often they actually try to guess unlocking secrets and decrypt data. All we have at the moment is anecdotal evidence that attackers are interested in accessing private data and that users employ easy-to-guess unlocking secrets.

2.9 Conclusion

This chapter presents the results of user studies that focused on understanding users' concerns with sensitive data in smartphones in the presence of two different types of attackers, insiders and strangers. It provides evidence that the insider threat is real and that 1 in 5 users in the US had peeked into someone else's smartphone without permission. In addition, the studies revealed that the vast majority of smartphone users employ unlocking secrets that can be guessed within minutes. It also appears that the introduction of novel authentication mechanisms, such as Touch ID, did not have a considerable effect on password complexity.

These results suggest that research on novel authentication methods for smartphones needs to account for an attacker as capable as an insider. An insider is an attacker who aims to eavesdrop an unlocking secret and mount a so-called lunchtime attack, where a victim leaves her device unattended for a brief span of time, sufficient for the attacker to unlock it and gain unauthorized access to sensitive data. In addition, researchers should not assume that smartphone users will choose an unlocking secret that is complex enough to keep their sensitive data protected.
As the results of the conducted studies suggest, the opposite is true: users prefer unlocking secrets that are easy to memorize and type, which, unfortunately, are also easy for attackers to guess.

Chapter 3

Analyzing Cryptographic API use in Android Applications

While smartphone users can protect their data by choosing an unlocking secret that is hard for an attacker to guess in a reasonable amount of time (e.g., tens or hundreds of years), developers control the secrecy of their applications' data by using encryption algorithms and protocols in a secure fashion. In this chapter I present the results of the analysis of how application developers use and misuse a set of cryptographic functions that are commonly employed for encryption key derivation, secure random number generation and symmetric ciphers.

3.1 Motivation and related work

The research community has paid a lot of attention to the (mis)use of cryptography in smartphone applications. For instance, Lazar et al. [78] studied the cryptography-related Common Vulnerabilities and Exposures (CVE) reports published between January 2011 and May 2014. The results of their analysis showed that 83% of the CVEs were introduced by application developers that incorrectly used Crypto API. To understand how this issue can be addressed, Acar et al. [36] studied the usability of several cryptographic libraries. The results of their user study suggested that while making the Crypto API simpler had its benefits, application developers still required proper documentation, code samples and certain features to be available for the library to be used properly.

Several researchers have used static analysis methods and tools to analyze Crypto API misuse in Android application binaries. For example, Fahl et al. [62] studied the misuse of asymmetric cryptography for SSL/TLS protocols, and certificate validation in particular.
The analysis of 13,500 top free Android applications revealed that 8% of the analyzed applications misused the SSL/TLS API, which made these applications potentially exploitable.

Egele et al. [57] designed and implemented the CryptoLint system based on the AndroGuard [23] framework. CryptoLint used static analysis to identify misuses of Crypto API in Android applications. The authors defined six rules for the correct use of Crypto API, which were based either on formal definitions, such as the indistinguishability under chosen-plaintext attack (IND-CPA) notion of security [52], or on recently published reports. For instance, the use of the Electronic Code Book (ECB) mode of operation for symmetric ciphers is considered insecure under IND-CPA, since symmetric ciphers in the ECB mode produce exactly the same ciphertext for two identical plaintexts, allowing attackers to identify a plaintext by the corresponding ciphertext.

Other rules, e.g., the one covering the SecureRandom class, require that developers do not use static seed values, since static seeds make a random number generator (RNG) predictable. If a predictable RNG is used for encryption key generation, then an attacker can seed his RNG with the same static value, generate the same encryption key, and, eventually, decrypt the data.

When it comes to password-based encryption key derivation functions (PBKDF), one should be careful with two parameters: (a) the salt value, and (b) the number of iterations. The use of a static salt value allows attackers to employ a rainbow table approach [87], which can significantly reduce the computational effort required to mount a successful password guessing attack. The number of iterations defines how much computational work an attacker needs to do for a single password candidate.
Choosing a number below the recommended value of 1,000 iterations results in faster computation for attackers.

The results of the analysis based on the CryptoLint system revealed that 88% of Android applications that used Crypto API violated at least one rule. Similarly to the CryptoLint study, we focus on the same set of rules (replicated in Section 3.2), while introducing source attribution to the analysis pipeline. In addition, we extended the original dataset of the CryptoLint study by adding newly collected applications from 2015 and 2016. To make reproduction of similar studies easier in the future, we made the BinSight tool available as open source.

Other researchers focused on libraries or on the sources of information the developers used. For instance, Derr et al. [39] studied how promptly application developers adopt new versions of libraries, especially when there is a known vulnerability in the library. While doing so, the authors also evaluated the six rules defined in the CryptoLint study for the identified libraries. Unsurprisingly, the results of the analysis revealed that libraries violated these rules too. In comparison, we studied violations that originated from either of the sources, i.e., a library or an application itself. Acar et al. [35] studied how the sources of information that application developers used during implementation impacted code security. The results of the studies showed that developers with no security background often use sources, such as Stack Overflow, that frequently contain insecure snippets. Furthermore, the majority of the applications on Google Play contained security-related errors that were common in the wild, including in discussion threads on Stack Overflow. Similarly, Fisher et al.
[64] showed that 15.4% of applications on Google Play contained security-related code snippets from Stack Overflow, 97.9% of which were insecure.

To summarize, while previous research has looked into either libraries or Android applications as a whole, there are still several open research questions on the misuse of Crypto API. That is, it is unclear how similar or different the misuse of Crypto API is in applications themselves and in libraries. In addition, it is not clear if either of these sources has changed since 2012 and how. Finally, it is unclear whether libraries or applications contribute most of the Crypto API misuse cases. To answer these questions we ought to be able to attribute calls to Crypto API to libraries or to applications. From a practical perspective, attributing a Crypto API misuse to its source has several important implications. First, one needs to clearly identify the party responsible for fixing the bug. Second, identifying the source of a misuse allows researchers to reduce over-counting of bugs, by identifying those that originate from libraries. In addition, by being able to analyze binaries, the BinSight tool allows application developers to obtain an insight into how a library (mis)uses Crypto API. This allows them to make an informed decision on whether or not they want to use this library in their application.

3.2 Common rules in cryptography

Similarly to the CryptoLint report [57], we analyzed the same set of rules for secure use of Crypto API. Throughout the rest of this chapter I use the term APK file to refer to an Android application binary as a whole, i.e., when the origin of a call to Crypto API is not taken into account. Such treatment of Android applications as a whole resembles the reporting approach from the CryptoLint study [57]. I use the term applications to refer to the cases when Crypto API calls originate from Android applications themselves.
Finally, I use the term libraries to refer to the cases where Crypto API is called from libraries.

An APK file was flagged as misusing Crypto API if it contained a violation of any of the rules from any source. While one can declare these APK files insecure, we note that misuse of Crypto API does not imply an exploitable vulnerability. The main reason for this argument is that developers might be using cryptography for purposes other than data confidentiality or integrity. For example, one might use Crypto API for obfuscation. In the rest of this section, the Crypto API use rules are explained in more detail. For a complete description of the rules and their in-depth justification, please refer to the CryptoLint study report [57].

3.2.1 Symmetric key encryption

A block cipher is a deterministic algorithm that operates on fixed-length groups of bits, called blocks, with an unvarying transformation specified by a symmetric key. A stream cipher, on the other hand, is a stateful algorithm that combines plaintext digits with a pseudo-random keystream, which is generated from a symmetric key. Block and stream ciphers are used in symmetric key encryption to encrypt messages of arbitrary length. It is important to know that a symmetric key encryption scheme must be either probabilistic or stateful to be IND-CPA secure [40].

In block ciphers, a mode of operation defines the security properties the cipher provides, such as confidentiality. A popular mode is electronic codebook (ECB), which is a stateless, deterministic algorithm defined over a block cipher. As such, the ECB mode is not IND-CPA secure. The major problem with ECB mode is that identical messages encrypt to identical ciphertexts, which represents an information leak that is often intolerable. Still, ECB mode is commonly considered secure if the message is smaller than the size of the block in the underlying cipher, and all messages are unique.
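The identical-ciphertext property of ECB can be demonstrated directly against the JCA Cipher class. The following is a self-contained sketch; the class name, the constant key, and the all-zero plaintext are illustrative, chosen only to expose the leak:

```java
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class EcbLeakDemo {
    // Encrypts two identical 16-byte plaintext blocks with AES in ECB mode
    // and reports whether the two resulting ciphertext blocks are identical.
    // Under ECB they always are, which is exactly the leak described above.
    public static boolean ecbRepeatsBlocks() {
        try {
            byte[] key = new byte[16];        // demonstration-only constant key
            byte[] plaintext = new byte[32];  // two identical all-zero blocks
            Cipher cipher = Cipher.getInstance("AES/ECB/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"));
            byte[] ct = cipher.doFinal(plaintext);
            return Arrays.equals(Arrays.copyOfRange(ct, 0, 16),
                                 Arrays.copyOfRange(ct, 16, 32));
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Encrypting the same two blocks in a randomized mode, such as CBC under a fresh random IV, would instead yield differing ciphertext blocks.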
Therefore,

Rule 1: Do not use ECB mode for encryption.

Another popular mode of operation is ciphertext block chaining (CBC), which is an encryption algorithm built from a block cipher, where each block of plaintext is XORed with the previous block of ciphertext before being encrypted with the block cipher. The first block of plaintext is XORed with an initialization vector (IV). Using a constant IV results in a deterministic, stateless cipher, which is not IND-CPA secure. Thus,

Rule 2: Do not use a constant IV for CBC mode.

Any symmetric encryption scheme, defined using a block or a stream cipher, should not reveal its key. If the key is hard-coded into a publicly-available application as a constant, then the key is not private, and so the resulting encryption does not provide confidentiality. Symmetric encryption schemes commonly assume a randomized key generation algorithm that should be used instead. Accordingly,

Rule 3: Do not use constant encryption keys.

3.2.2 Password-based encryption

User-created passwords are often predictable and prone to offline password guessing attacks [101]. For technical details on how an offline password guessing attack is usually mounted, please refer to Chapter 1. To address this issue, password-based encryption (PBE) schemes try to increase the costs of such attacks by requiring significant amounts of computation to derive an encryption key. For example, in iOS the key derivation process is calibrated to take about 80 milliseconds [37]. This results in a significant increase in effort for attackers, since they need to try thousands of different password candidates, while having virtually no effect on end-user experience. PBE schemes achieve this by using a random salt value and applying multiple iterations of a cryptographic hash function, typically using a key derivation algorithm.

The salt and the iteration count entail a multiplicative increase in the work required for a password guessing attack [41].
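These two parameters appear directly in Java's PBKDF2 API. The following sketch shows how they are supplied; the class name, the 16-byte salt size, and the iteration count passed by callers are illustrative choices, not values prescribed by this study:

```java
import java.security.SecureRandom;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class PbeDemo {
    // Derives a 128-bit key from a password with PBKDF2-HMAC-SHA1.
    // An attacker must redo `iterations` hash computations per password
    // guess, and a fresh random salt defeats precomputed (rainbow) tables.
    public static byte[] deriveKey(char[] password, byte[] salt, int iterations) {
        try {
            PBEKeySpec spec = new PBEKeySpec(password, salt, iterations, 128);
            return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1")
                                   .generateSecret(spec).getEncoded();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // A salt should be freshly generated at random, never a constant.
    public static byte[] freshSalt() {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        return salt;
    }
}
```

Because the derivation is deterministic for a fixed password, salt, and iteration count, only the salt and iteration count need to be stored alongside the ciphertext to re-derive the same key later.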
Using constant salt values enables attackers to use the rainbow table approach [87], which significantly reduces the computational burden in cases when passwords for multiple accounts are being guessed. Choosing low iteration counts results in less work per guess for attackers, which, again, speeds up the password guessing attack. For this reason, we chose to use 1,000 iterations as a minimum value, as suggested by RFC 2898 [76]. Hence,

Rule 4: Do not use constant salts for PBE, and

Rule 5: Do not use fewer than 1,000 iterations for PBE.

Note that although several recent publications proposed different approaches to increasing the computational cost of the key derivation process, e.g., scrypt [89] or using 10,000 iterations for PBKDF2 [70], in this work we decided to use 1,000 as our bare minimum in order to be able to compare with the results from 2012. Future research, however, should strongly consider increasing the number of iterations.

3.2.3 Random number generation

Android provides an API to a seeded, cryptographically-strong pseudo-random number generator (PRNG) via the SecureRandom class [31]. This PRNG is designed to produce non-deterministic output, but if seeded using a constant value, it will produce a constant, known output across all implementations. If such a PRNG is used to derive keys, the resulting keys will not be random, making the encryption insecure. As such,

Rule 6: Do not use a constant to seed SecureRandom.

3.3 Cryptography in Android

As discussed in Section 3.1, there are various reasons to use cryptography in Android, in both applications and third-party libraries. In the following, a brief introduction is provided to the application ecosystem in Android, focusing on packaging and the Java runtime, and to the use of cryptography in Java.

3.3.1 Android applications ecosystem

Android applications are authored as either native C/C++ or Java source code.
In this study, only applications written in Java are considered, because Java has had a stable Crypto API interface since the release of Java 1.4 in 2002. An Android Java application is compiled to Dalvik executable (DEX) bytecode. This bytecode is packaged with additional resources, such as images, third-party libraries, and configuration files, into an application package (APK) file. The APK file is then uploaded to the Google Play Store, and when a user installs the application, the APK file is downloaded and installed on their device.

Even though DEX bytecode is compiled from Java, the Dalvik virtual machine (DVM) is considerably different from the Java virtual machine. For example, while the Oracle Java virtual machine is stack-based, DVM is register-based, with a dedicated assembly language called Smali. However, it is possible to convert DEX bytecode to Oracle's Java bytecode with the Dex2Jar tool [19], albeit with some limitations, such as the inability to decode specific classes. We note that DVM was recently replaced by the Android runtime (ART), which translates the DEX bytecode into the CPU's native instructions for faster execution.

3.3.2 Java cryptography

Android provides a rich execution framework that offers access to various subsystems, including the Java cryptography architecture (JCA). The JCA standardizes how developers make use of many cryptographic algorithms by defining a stable API. Accordingly, a cryptographic service provider (CSP) is required to register with the JCA in order to provide the actual implementation of these algorithms. This abstraction allows developers to replace the default CSP, which is BouncyCastle [18] in Android, with a custom CSP that satisfies their demands. For example, SpongyCastle [22] is a popular third-party CSP that supports a wider range of cryptographic algorithms.

Symmetric and asymmetric encryption schemes are accessible to developers through the Cipher class, as described in Listing 3.1.
To use a specific encryption scheme, the developer calls the Cipher.getInstance factory method and provides a transformation as an argument. A transformation is a string that specifies the name of an algorithm, a cipher mode, and a padding scheme to use. In Listing 3.1, the returned cipher instance uses AES in CBC mode with PKCS#5 padding. Only the algorithm name is mandatory, while the cipher mode as well as the padding scheme are optional. Unfortunately, all CSPs default to the ECB mode of operation if only the cipher name is specified, which is insecure [30].

Listing 3.1: Simplified symmetric key encryption in Java

    // values of iv and key should be randomly generated
    public byte[] encrypt(byte[] iv, byte[] key, byte[] data) throws Exception {
        IvParameterSpec iv_spec = new IvParameterSpec(iv);
        SecretKeySpec key_spec = new SecretKeySpec(key, "AES");
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5PADDING");
        cipher.init(Cipher.ENCRYPT_MODE, key_spec, iv_spec);
        return cipher.doFinal(data);
    }

3.4 Datasets

As summarized in Table 3.1, three datasets were used for the analysis, 132,590 APK files in total. R12 is a subset of the CryptoLint dataset with 10,990 APK files. The original dataset had 145,095 APK files and was collected between May and July of 2012 [57] by crawling the Google Play marketplace. First, the authors of CryptoLint excluded all APK files that did not use Crypto API. Second, the authors also excluded all APK files in which all Crypto API calls originated from 11 white-listed libraries. This resulted in a subset with 15,134 APK files. The CryptoLint tool, however, failed to analyze 3,386 files from this subset, and 758 files were lost since 2012, resulting in 10,990 APK files in the R12 dataset. Considering that the 758 lost files are a random sample of the set that was presented in the CryptoLint report [57], such loss does not have a significant impact on our results.

The R16 dataset was collected in May of 2016 with the help of Sophos.
To select APK files for the R16 dataset, we first generated a random sample of 120,000 APK files that were available on the Google Play market at that time and then downloaded that set from Sophos servers. Unfortunately, some of the files were corrupted during the downloading process, leaving us with 117,320 APK files.

For trend analysis we focused only on the R12 and R16 sets. In addition, we considered two versions of the R12 and R16 datasets. First, we analyzed subsets of R12 and R16 from which we removed all APK files that had all calls to Crypto API originating from the 11 libraries selected by the authors of CryptoLint. We denote these subsets as R12* and R16*, respectively. Second, we analyzed both R12 and R16 as-is.

Table 3.1: Summary of used datasets

Name | Number of APKs | Sampling | Year
R16  | 117,320        | Random   | 2016
R12  | 10,990         | Random   | 2012
T15  | 4,280          | Top-100  | 2015

Figure 3.1: Cryptographic API linting for Android applications using BinSight. Gray components represent parts that were reimplemented from CryptoLint [57], and white components represent the extensions that we added. (Pipeline stages: 1. Preprocessing: deduplication, disassembly; 2. Linting: sCFG extraction, static program slicing, rule evaluation; 3. Attribution: obfuscation analysis, third-party library detection.)

Finally, the T15 dataset includes the Top 100 Android applications in each category from June 2015. For this dataset, a list of the Top 100 applications was first obtained through the Google Play store API. Then each APK file was separately downloaded through the ApkDownloader tool [25]. The downloading process was completed between June 13–28, 2016. As 20 applications were removed from the Google Play Store before we were able to download them, the final size of the dataset is 4,280.¹
We compared T15 to R16 only for additional insight into differences between average and top Android applications.

¹ Due to the large size of the T15 and R16 datasets we cannot make them available online, but we will share them upon request. For the R12 dataset we refer readers to the authors of the CryptoLint study.

3.5 Crypto API linting with BinSight

At a high level, the rules defined in §3.2 represent temporal properties that can be validated using automated program analysis in a task known as linting [61]. Linting is a process of validating certain formal conditions on source code through static analysis of the code or binary. Usually, it implies that one converts an application into a super control flow graph (sCFG) representation and then analyzes the structure of the graph to validate the defined conditions. For example, one can trace all the inputs to a variable that holds an encryption key and check whether that variable holds static values, i.e., values that are hard-coded, or dynamically generated values, e.g., produced by a random number generator.

While previous research has proposed various linters for Android Crypto API [57, 103], they suffer from various limitations. In particular, the state-of-the-art linter CryptoLint is not available as open source and is unable to analyze over 23% of APK files [57]. In addition, none of these tools provide any code navigation, which is valuable for manual in-depth analysis. Finally, existing tools do not support attribution of the source of misuse, i.e., by using these tools one cannot tell whether a misuse is due to application code or a third-party library.

To overcome these limitations, we developed the BinSight framework based on the technical description of CryptoLint [57].
Although the BinSight tool does not introduce any novel ideas to the field of static analysis, it nevertheless improves analysis stability over CryptoLint, provides a rich graphical UI for manual inspection of an APK file, and attributes a Crypto API call to an application or a library. To improve the accuracy of the analysis conducted in the CryptoLint study, we introduced two additional stages to the APK analysis pipeline, as illustrated in Figure 3.1. We released BinSight as an open source project and included implementation details as comments in the source code.

In what follows, each stage of the analysis pipeline is discussed.

3.5.1 Preprocessing

Each downloaded APK file undergoes a two-step pre-processing stage before it is linted. The goal of this stage is to filter out all applications that do not use Crypto API and to remove all duplicate APK files, as described below.

Disassembly

Similar to CryptoLint, our analysis operates on a higher-level representation of the Dalvik bytecode. In particular, we use ApkTool [17] to decode an APK file and disassemble it into a set of Smali files. Each Smali file represents a class definition, and uses DEX operation codes to represent instructions [20]. We picked ApkTool over AndroGuard [23], which was used by CryptoLint, to improve analysis reliability. As shown in §3.6, BinSight was unable to analyze only six applications out of 95K, while CryptoLint failed to analyze 23% of applications out of 15K.

After an application was disassembled, we searched all of its generated Smali files to locate entry points to Crypto API. If such entry points were not found, the application was disregarded from further analysis. Otherwise, we proceeded to the de-duplication step.

Deduplication

Downloading thousands of APK files from Google Play is technically challenging. First, it has to span weeks or months in order to avoid account blocking. Second, an application might be listed in multiple categories.
These challenges lead to duplicates in a dataset, and thus removing duplicates is important for the validity of the results. For de-duplication we relied on the application Id (stored in the manifest file).

For each dataset we separately generated a list of all APK filenames, the corresponding application Id, and its download time (for the T15 and R16 sets) or, when available, the application version (R12). We then identified all duplicates within a dataset by grouping files with the same application Id. For the identified duplicates within a dataset we kept the latest version of the application, based on its download date or version.

3.5.2 Linting

Once the interesting pool of APK files was identified, BinSight evaluated the common cryptographic rules defined in Section 3.2. In particular, it computed static program slices that terminate in calls to the Java Crypto API, and then extracted the necessary information from these slices to evaluate the rules.

In what follows I provide a brief overview of the three main steps involved in this stage. I refer the reader to related work for further details [57, 103].

Super Control Flow Graph extraction

It is typical for an application to use Crypto API in multiple methods. For example, a cipher object could be instantiated in an object constructor and then used in two different methods to encrypt and decrypt the data. If the two methods were analyzed in isolation, we would not be able to extract the encryption scheme that was used when the cipher object was instantiated. Fortunately, the super control-flow graph (sCFG) of an application allows us to perform inter-procedural analysis, which is required to correlate the use of a cipher object for encryption and decryption with its instantiation.

BinSight constructs the sCFG of a preprocessed application as follows. First, it extracts the intra-procedural CFGs of all methods from the decoded Smali class files.
This task also involves translating all methods into single static assignment (SSA) form [47], and extracting the class hierarchy of all classes in the application. Next, BinSight superimposes a control-flow graph over the CFGs of the individual methods, resulting in the sCFG. In this sCFG, call edges are added between call instructions and method entry points, and method exit points are connected with exit edges back to the call site.

On one hand, similar to CryptoLint, BinSight constructs an over-approximated sCFG of the application. That is, BinSight extracts calls that might never be executed when the application is running. On the other hand, BinSight does not analyze dynamic edges, which could be created through the Java Reflection API [29]. This creates a risk of missing crucial calls to Crypto API, hence the resulting sCFG would become under-approximated.

Static program slicing

Static program slicing is the computation of a set of program statements, called slices, that may affect the values of certain variables at a particular program point of interest, referred to as a slicing criterion [44]. BinSight applies static program slicing on the sCFG to identify if the analyzed application uses any of the Crypto API. In particular, BinSight searches the sCFG for nodes that belong to Java's Crypto API endpoints. If these nodes are found, it uses their incoming edges to locate all call sites in the application. Note that this search depends on the type of the Crypto API endpoint in the sCFG. Table 3.2 shows the relevant API endpoints and their corresponding cryptographic rules.

Table 3.2: Cryptographic API endpoints and related rules.

Endpoint signature       | Rule
Cipher.getInstance()     | 1
cipher.init()            | 2
secureRandom.setSeed()   | 6
new SecretKeySpec()      | 3
new PBEKeySpec()         | 4
new PBEParameterSpec()   | 5
new SecureRandom()       | 6

Rule evaluation

Rule evaluation depends on the values assigned to the parameters of a Crypto API, where value assignment can be either local or external to the containing method.
For the former case, BinSight computes a backward slice of the program to all possible locations where the involved parameter is set, after which we apply validation logic to its value. As for the latter case, the evaluation depends on the origin of the value assignment outside the method. As such, BinSight computes backward slices to all locations where this value can be assigned. BinSight stops the computation if it reaches a dead end, where a node does not have any incoming edge, or if it reaches an assignment to a static value.

3.5.3 Attribution

After the linting stage, every call site that terminates in a Crypto API is attributed to its source, which could be the application code itself or a third-party library. Our attribution approach relies on the package names of classes that contain API call sites, and cross-references them with an exhaustive list of third-party libraries. The attribution has to overcome obfuscated package names in order to correctly map call sites to libraries. This is done in the following two steps.

Obfuscation analysis

Although de-obfuscating Android applications has been studied recently [43, 81], the underlying techniques, while effective for manual forensics, are inefficient for analyzing applications on a large scale. Moreover, it is unclear how prominent the use of obfuscation is in the real world, especially in the classes that use Crypto API. To automatically detect the level of obfuscation, a rule-based classifier was developed that identifies whether a given package name is fully obfuscated or not. The manual analysis of the results of the classifier revealed that even if a package was only partially obfuscated, one can still use it to identify the library it belongs to. Section 3.6 shows that less than 4% of package names were fully obfuscated, requiring sophisticated de-obfuscation techniques.

The following rules were defined as a result of several manual iterations over the results the classifier provided.
Our main objective at this point was to find patterns that one can use in the classifier. Eventually, seven rules (listed below) were assembled, which allowed automatic assignment of the level of class identifier renaming (CIR), i.e., none, class, partial, or full CIR obfuscation.

1. If all parts of the identifier are of length one, then it is a case of full obfuscation. An identifier of a class is the combination of the class name and its package name, e.g., an identifier of the form com.company.ads.ShowAd defines a class ShowAd in the package com.company.ads.

2. If all but the first part of the identifier are of length one and the first part is in the set {com, ch, org, io, jp, net}, then it is a case of full obfuscation. For instance, the example provided in rule 1 would translate to com.a.a.B, where com.a.a would be the renamed package name and B would be the renamed class name.

3. If none of the package-name parts in the identifier are of length one, then it is a case of either no or class-level obfuscation. The intuition behind this rule was based on the observation that obfuscating software tends to rename parts to a single character.

4. If at least one part, but not all parts, of the identifier are of length one, then it is a case of partial obfuscation. The intuition behind this rule was the same as for rule 3.

5. If the class name is longer than 3 characters, then it is a case of no obfuscation. Similarly to rules 3 and 4, we observed that renamed class names consisted of one or two characters.

6. If the class name length is 1 character, then it is a case of class-name obfuscation.

7. If the class name length is 2 or 3 characters and the first character is in lower case, then it is a case of class-name obfuscation.
We observed that the naming convention for Java classes uses an upper-case letter for the first character; thus, in the cases where the first character was in lower case and the length of the name was 2 or 3 characters, the name was considered obfuscated.

Third-party library detection

As mentioned above, almost all call sites that terminate in cryptographic APIs correspond to package names that were identifiable. The fully obfuscated package names were labeled as "obfuscated", meaning that BinSight was unable to attribute them to a library or the application.

For the remaining majority of the package names, a frequency analysis was conducted. All non-obfuscated package names that were seen in a single APK file were assigned to the application category. For the remaining set of package names, a manual analysis was conducted, starting with the most frequently encountered package names first. The manual analysis was finished once it covered 95% of the call sites. The remaining package names, which were not covered by the manual analysis, were labeled as "possibly library". As a result of this analysis, a list of package names was compiled that allowed BinSight to identify libraries and possible libraries. Note that this approach complements a recent proposal that relies on a list of library signatures generated from a large database of third-party SDKs [39].

3.6 Measuring Crypto API misuse

This section presents the results of the analysis of the 109,642 APK files (out of 132,590) that had at least one call to Crypto API.

Table 3.3: Summary of duplicates and Crypto API use in all three datasets (all values are numbers of APKs)

Name  | Total   | Unique  | Dups         | Use Crypto
R12   | 10,990  | 10,222  | 768 (7%)     | 10,222 (100%)
R16   | 117,320 | 115,683 | 1,637 (1.4%) | 95,775 (82.8%)
R16*  | 117,320 | 115,683 | 1,637 (1.4%) | 93,994 (81.3%)
T15   | 4,280   | 4,067   | 213 (5%)     | 3,645 (89.6%)
Total | 132,590 | 129,972 | 2,618        | 109,642
To the best of our knowledge, this is the largest dataset analyzed for Crypto API misuse (e.g., the CryptoLint study is based on the analysis of only 11,748 APK files). First, we discuss duplicates, obfuscation detection, and source attribution for each of the datasets. Then we present the overall statistics on Crypto API misuse, and proceed with the analysis of each rule separately. In our analysis we compare R12 to R16 in order to understand what may have changed between 2012 and 2016, and T15 to R16 in order to understand how a top application differs from an average one.

For each comparison we conducted a statistical significance test (Chi-square) to check whether the found difference was statistically significant with 99% confidence. In what follows we discuss only statistically significant results, and all figures show 99% confidence interval whiskers.

3.6.1 Preprocessing

Unsurprisingly, every application in R12 made at least one Crypto API call, confirming the analysis and the white-listing performed by Egele et al. [57]. Interestingly, while Egele et al. found that only 10.4% of the applications in their original dataset of 145K APK files made a call to Crypto API, this ratio has changed significantly for R16 and T15. In particular, we found that 83% and 90% of the applications in R16 and T15, respectively, made at least one call to Crypto API. Such a significant increase in the use of Crypto API in Android applications can be attributed to many reasons, including the white-listing that the authors of CryptoLint applied or an increased necessity to protect user data.

Our analysis revealed that while all datasets contained duplicates, R12 had the largest ratio, 7%. We removed all duplicates from the analyzed datasets.
The summary of the datasets after de-duplication is shown in Table 3.3.

Unlike CryptoLint, BinSight was able to disassemble and analyze all but six of the 132,590 APKs, which represents a significant improvement over the CryptoLint tool, which failed to analyze 3,386 APK files (23% of the analyzed set) due to technical problems.³ BinSight completed the analysis in about 14 days on a dual Xeon CPU computer with 128GB RAM, i.e., processing about 7,500 APK files a day, which suggests that BinSight is not only robust, but also scalable. We made BinSight available as an open source project.

³ According to the CryptoLint authors, there were two major problems: (a) the tool did not finish analysis within 30 minutes, and (b) the analysis infrastructure ran out of memory.

3.6.2 Linting and attribution

Obfuscation analysis

As noted in Section 3.5, it is unclear how prominent obfuscation is, in particular class identifier renaming (CIR) [39]. To understand this, the reliability of using package names for source attribution was analyzed by quantifying CIR in each dataset. The analysis was limited to only those classes that made at least one call to Crypto API. While this served the needs of this study, these results on the prevalence of obfuscation should not be considered a generalization to all Android applications.

There are different levels at which CIR can be applied by an obfuscator like DexGuard. For instance, for class com.domain.package.Class, an obfuscator might not change the identifier at all. It might rename the class name only, parts of the package name, or sometimes the whole class identifier.

Table 3.4: Obfuscation analysis of class identifiers (class identifier renaming level).

Dataset | None             | Class            | Partial          | Full           | Total
R16     | 509,643 (60.64%) | 203,447 (24.21%) | 106,091 (12.62%) | 21,279 (2.53%) | 840,460 (100%)
R12     | 78,883 (77.12%)  | 14,513 (14.19%)  | 6,882 (6.73%)    | 2,002 (1.96%)  | 102,280 (100%)
T15     | 26,821 (59.40%)  | 12,907 (28.59%)  | 3,620 (8.02%)    | 1,804 (3.99%)  | 45,152 (100%)
For the first three levels, we can map the class to a library or an application if the package name has an identifiable prefix. As for the fourth level, we cannot use the package name for source attribution.

        Call sites   Libs   Apps   Libs?   ?
                     (%)    (%)    (%)     (%)
R16     840,460      90.7   4.9    1.9     2.5
R12     102,280      79.5   14.5   4.0     2.0
T15     45,152       80.6   10.7   4.7     4.0

Table 3.5: Attribution of cryptographic API call sites.

The goal of this analysis was to quantify how many class identifiers fall into each of the four CIR levels, as follows. First, we automatically compiled a list of all unique class identifiers that made at least one call to Crypto API. Then we manually inspected the list in order to derive patterns for full and partial obfuscation. After updating the set of identified rules, we recompiled the list of package names that made at least one call to Crypto API. At this point we also added the category to each package name, so that further analysis focused on package names without a category. The process was completed in four runs, which resulted in the list of seven classification rules presented in Section 3.5.3. While the presented rules were simple to evaluate, they suffered from false positive and false negative cases. The analysis of those cases showed that their number was below 1% of the total number of cases (i.e., false positive and false negative rates combined).

The results of the analysis revealed that using package names for source attribution is still reliable. In particular, for applications in R16 we were able to identify the source for 97.5% of the classes that made calls to Crypto API. The results of the analysis for all datasets are provided in Table 3.4.

Third-party library detection

We classified package names into one of four categories: applications (apps), libraries (libs), possible libraries (libs?), and obfuscated (?). We now describe how we performed this classification. First, we assigned all package names that had been fully obfuscated to the obfuscated category.
We then assigned all package names that were found in a single application to the applications category. The remaining packages, which were found in two or more applications, we ranked by how many applications used them in each dataset, and then manually inspected them in decreasing order of rank. In particular, we labeled a package name as a library if we were able to find the library's source code or website, or if our manual inspection revealed that it was indeed a library. We stopped the manual analysis once we had identified enough package names to cover 95% of the call sites. We assigned the remaining unclassified package names to the possible libraries category.

In total, we manually analyzed 12,165 package names from the three datasets, out of which 3,622 (29.7%) belonged to libraries. Overall, we identified 638, 260, and 265 libraries in R16, R12, and T15, respectively. The top-2 libraries from each dataset and their package names are shown in Table 3.7. The fact that our analysis revealed 260 libraries in the R12 dataset suggests that the CryptoLint study suffers from over-counting misuse cases.

Source-attribution-based analysis revealed that libraries were responsible for the majority of calls to Crypto API in all three datasets, as summarized in Table 3.5. Furthermore, 79.5% of all calls to Crypto API in the R12 dataset originated from 260 libraries. While the authors of the CryptoLint study did white-list 11 libraries, our analysis shows that this white-listing approach was inadequate. In particular, the authors missed 249 libraries, which accounted for 79.5% of the calls in their dataset. This suggests that the results reported in the CryptoLint study suffer from over-counting, due to including the same library in their statistics multiple times.
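The classification procedure above can be sketched as follows. The single-character heuristic for full obfuscation and the library prefix list are illustrative stand-ins, not BinSight's actual seven rules (which are described in Section 3.5.3):

```java
import java.util.Set;

// Hypothetical sketch of package-name-based source attribution.
// The library prefixes and the full-obfuscation heuristic are illustrative.
public class AttributionSketch {
    // Stand-in for the manually compiled list of known library prefixes.
    static final Set<String> LIBRARY_PREFIXES =
        Set.of("com.google.ads", "org.apache.http", "com.inmobi");

    // Crude heuristic: treat an identifier as fully obfuscated if every
    // dot-separated segment is a single character (e.g., "a.b.c.d").
    static boolean isFullyObfuscated(String classId) {
        for (String seg : classId.split("\\.")) {
            if (seg.length() > 1) return false;
        }
        return true;
    }

    // Map a class identifier to one of the four source categories.
    static String classify(String classId, Set<String> appPackages) {
        if (isFullyObfuscated(classId)) return "?";          // obfuscated
        String pkg = classId.contains(".")
            ? classId.substring(0, classId.lastIndexOf('.')) : classId;
        for (String prefix : LIBRARY_PREFIXES) {
            if (pkg.startsWith(prefix)) return "libs";
        }
        if (appPackages.contains(pkg)) return "apps";
        return "libs?";                        // unresolved: possible library
    }

    public static void main(String[] args) {
        Set<String> appPkgs = Set.of("com.example.myapp");
        System.out.println(classify("com.google.ads.AdUtil", appPkgs));  // libs
        System.out.println(classify("com.example.myapp.Main", appPkgs)); // apps
        System.out.println(classify("a.b.c.d", appPkgs));                // ?
    }
}
```

Only the fully obfuscated ("?") category defeats this scheme, which is why the 2.5% full-CIR rate reported above matters for the reliability of the attribution.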
Overall, the missed libraries and the ratio of calls that originate from libraries suggest that researchers who analyze Android applications for Crypto API misuse should be careful about misuse inflation caused by counting the same misuse cases from a library multiple times.

To this end, we showed that (a) one can reliably use package names for source attribution, since this covers 97.5% of the calls to Crypto API, and (b) libraries are the major contributor to Crypto API calls and should be properly identified.

3.6.3 Crypto API misuse in Android Applications

In what follows we present the main findings on Crypto API misuse rates across all source categories (applications or libraries) for all three datasets. We begin with the results of the analysis of overall misuse rates across all rules, i.e., whether at least one rule is violated. Then we proceed with an in-depth analysis of misuse rates for each rule separately.

To understand the extent to which the white-listing approach used in the CryptoLint study has impacted results, we additionally analyzed two sub-sets, R12* and R16*, which were generated by applying the same white-listing approach to the R12 and R16 datasets, respectively.

We measured misuse rates from two complementary perspectives. First, similar to the CryptoLint study, we assessed the ratio of APK files that contained a misuse. Second, we assessed the ratio of Crypto API call-sites that made a mistake. While the APK-files ratio provides intuition into how many APK files contain at least one misuse of Crypto API, such an approach is biased towards libraries, especially popular ones. The call-site ratio provides an assessment of the likelihood that a call from an application or a library will make a mistake.
And since we separate calls from libraries and applications, this measure provides a stronger intuition into how misuse rates have changed within libraries and applications.

In the following, we report the ratio of APK files with misuse for each category separately (named accordingly) and overall (the "Any" category). The ratio for each category is computed against the total number of APK files in the dataset. This, however, does not imply that the sum of the ratios for all four categories equals the "Any" category, since an APK file might have misuses contributed by several sources. Similarly, the ratio of call-sites with misuse contains the "Any" category, which shows the ratio for all call-sites. Unlike the ratio of APK files, the ratio of call-sites with misuse is assessed separately for calls within each category.

Figure 3.2: Ratio of APK files that violated at least one Crypto API use rule per dataset. The "Any" category includes all call-sites for the analysis, without considering the source (i.e., library or application); this approach was used in the CryptoLint study. The remaining categories (Libs, Libs?, Apps and ?) include call-sites that belong to the corresponding source only (i.e., a library, a possible library, an application, or a fully obfuscated case). The proportions are calculated as the ratio of APK files that contained at least one misuse from the specific category (or any category for "Any") against the total number of APK files that used Crypto API in the dataset. The total number of APK files that made at least one call to Crypto API for each dataset is provided in the legend: R12* (10,222), R16* (93,994), R12 (10,222), R16 (95,775), T15 (3,645).

Overall Crypto API misuse rate

The ratios of APK files with at least one violation of the rules per category are shown in Figure 3.2.
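The two complementary measures can be computed as follows; the call-site records below are invented for illustration and show why a single popular library skews the APK-files ratio:

```java
import java.util.List;

// Toy illustration of the two complementary misuse metrics.
// A call site is an (apkId, source, misuse) record; the data are invented.
public class MisuseMetrics {
    record CallSite(String apk, String source, boolean misuse) {}

    // Ratio of APK files containing at least one misused call site.
    static double apkRatio(List<CallSite> sites) {
        long total = sites.stream().map(CallSite::apk).distinct().count();
        long flagged = sites.stream().filter(CallSite::misuse)
                            .map(CallSite::apk).distinct().count();
        return (double) flagged / total;
    }

    // Ratio of call sites from a given source category that misuse the API.
    static double callSiteRatio(List<CallSite> sites, String source) {
        List<CallSite> fromSource = sites.stream()
            .filter(s -> s.source().equals(source)).toList();
        long bad = fromSource.stream().filter(CallSite::misuse).count();
        return (double) bad / fromSource.size();
    }

    public static void main(String[] args) {
        // A popular library with one misused call appears in both APKs, so
        // the APK ratio is 100% even though only two of the three library
        // call sites are actually wrong.
        List<CallSite> sites = List.of(
            new CallSite("app1.apk", "libs", true),
            new CallSite("app1.apk", "apps", false),
            new CallSite("app2.apk", "libs", true),
            new CallSite("app2.apk", "libs", false));
        System.out.println(apkRatio(sites));              // 1.0
        System.out.println(callSiteRatio(sites, "libs")); // ~0.667
    }
}
```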
Unsurprisingly, our results for the R12* subset were in line with previously reported results, i.e., 94.5% in our study and 88% in the CryptoLint study [57]. We attribute the difference to two factors: (a) we removed 7% of duplicates, and (b) 768 APK files from the original R12 dataset were lost. As expected, we found that the white-listing approach used in the CryptoLint study reduced the ratio of APK files flagged due to libraries that had introduced misuses. However, it did not have any impact on the call-sites ratio, as shown in Figure 3.3.

Figure 3.3: Ratio of call-sites that violated at least one Crypto API use rule per dataset. The total number of call-sites to Crypto API for each dataset is provided in the legend: R12* (113,571), R16* (1,185,970), R12 (138,465), R16 (1,274,445), T15 (66,492).

Overall, we found that since 2012 the ratio of APK files with at least one misuse has decreased from 94.5% to 92.4%. At the same time, the overall likelihood of a call-site to Crypto API making a mistake remained around 28%; that is, roughly every fourth call to Crypto API makes a mistake. Per-category analysis, however, showed that while libraries increased the ratio of APK files to which they introduced Crypto API misuses to 90% (from 80%), the likelihood of a call-site from a library making a mistake did not change significantly. The increase in the ratio of APK files to which libraries introduced misuses of Crypto API can be explained by the overall increase in the number of libraries. In particular, while the R12 dataset contained only 260 libraries, R16 had 638 identified libraries.

Unlike libraries, applications improved in both the ratio of APK files and the likelihood of a call-site making a mistake.
In particular, the ratio of APK files decreased from 21% to 5%, and the ratio of call-sites from 31.8% to 27.7%. The increase in the total number of libraries, however, might also have impacted the ratio of APK files for applications.

Comparing T15 with R16 revealed a single statistically significant difference, namely, the ratio of APK files to which applications introduced misuses. While the ratio for R16 was 5%, for T15 it was 14.6%. This difference can be attributed to various factors, such as the difference in the total number of libraries (265 in T15 against 638 in R16).

Symmetric key encryption

Data analysis revealed that the overall use of ECB mode for symmetric ciphers has significantly decreased since 2012, as shown in Figures 3.4 and 3.5. That is, the ratio of APK files that were flagged as using ECB mode dropped from 77% in R12 to 30% in R16. Similarly, the ratio of relevant call-sites that use ECB mode dropped from 53% to 29%. Source attribution, however, revealed that this decrease can be mainly attributed to improvements in libraries. In particular, while applications decreased the ratio of relevant call-sites that use ECB mode from 63% to 47%, libraries reduced this ratio from 52% to 26%, i.e., a two-fold improvement. Comparison of the T15 and R16 datasets revealed that an average application is less likely to use ECB mode than a top application. The white-listing approach used by the CryptoLint study had a negligible impact on the results, i.e., most of the introduced differences were either not statistically significant or practically negligible.

Despite the positive outlook on the use of ECB mode, we found a statistically significant increase in the use of static IVs, as shown in Figures 3.6 and 3.7. In particular, since 2012 the ratio of APK files that use a symmetric
cipher in CBC mode with static IVs increased from 32% to 96% by 2016. The ratio of relevant call-sites to symmetric cipher APIs increased from 31% to 71%. The increases were mainly due to libraries rather than applications, since the latter actually improved and decreased their use of static IVs. In particular, applications reduced the ratio of call-sites that violate Rule 2 (use of a static IV with CBC mode) from 64% to 52%, while libraries increased that ratio from 26% to 72%. Comparison of the T15 and R16 datasets did not reveal practically significant results, i.e., the T15 dataset was comparable to the R16 dataset. While the white-listing approach did have a statistically significant impact on the R12* sub-set, it did not have a similar effect on the R16* sub-set, and it did not affect the overall results, since all differences remained statistically significant.

Figure 3.4: Ratio of APK files that violated Rule 1 - "Do not use ECB mode for symmetric cipher." The total number of APK files that used a symmetric cipher per dataset is provided in the legend: R12* (10,102), R16* (86,309), R12 (10,144), R16 (88,323), T15 (3,519).

Figure 3.5: Ratio of call-sites that used ECB mode for symmetric cipher. The total number of call-sites that created symmetric Cipher objects in Java per dataset is provided in the legend: R12* (25,061), R16* (230,184), R12 (31,192), R16 (251,021), T15 (14,105).

The decrease in the use of ECB mode and the increase of static IV use with CBC mode suggest that while developers tried to move away from insecure ECB mode, they failed to adopt a secure mode for symmetric ciphers. This failure might be explained by a lack of understanding. Another factor that may have impacted the shift is the warning message that the Android Lint tool began to show after the CryptoLint study was conducted.
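The call patterns behind Rules 1 and 2 can be illustrated directly with javax.crypto. The snippet below is a minimal sketch (key handling is simplified for brevity) contrasting the two flagged forms with a CBC call that uses a fresh random IV:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.security.SecureRandom;

// Sketch of the call patterns behind Rules 1 and 2 (key handling simplified).
public class SymmetricCipherRules {
    // Compliant form: CBC with a fresh random IV for every encryption.
    // The IV is not secret and is typically prepended to the ciphertext.
    static byte[] encryptCbc(SecretKey key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        return cipher.doFinal(plaintext);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();

        // FLAGGED by Rule 1: "AES" alone falls back to the provider default,
        // which is ECB mode ("AES/ECB/PKCS5Padding") on Android.
        Cipher ecb = Cipher.getInstance("AES");
        ecb.init(Cipher.ENCRYPT_MODE, key);

        // FLAGGED by Rule 2: CBC mode, but with a constant (all-zero) IV.
        Cipher badCbc = Cipher.getInstance("AES/CBC/PKCS5Padding");
        badCbc.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(new byte[16]));

        // Not flagged: fresh random IV per encryption.
        byte[] ct = encryptCbc(key, "user data".getBytes("UTF-8"));
        System.out.println("ciphertext length: " + ct.length);
    }
}
```

Note that the flagged and compliant forms differ only in the transformation string and the IV source, which is why a lightweight syntactic analysis such as BinSight's can detect them.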
The warning message highlights that ECB mode is the default and is insecure ("...because the default mode on android is ECB, which is insecure."). In fact, Crypto Stack-Exchange5 is full of questions and suggestions on how to "fix" this warning message by replacing "ECB" mode with "CBC".

Figure 3.6: Ratio of APK files that violated Rule 2 - "Do not use static IV for CBC mode in symmetric cipher." The total number of APK files that used a symmetric cipher in CBC mode per dataset is provided in the legend: R12* (2,516), R16* (79,895), R12 (6,007), R16 (82,092), T15 (3,265).

Figure 3.7: Ratio of call-sites that used static IV with CBC mode for symmetric cipher. The total number of call-sites that used Cipher objects in CBC mode per dataset is provided in the legend: R12* (7,457), R16* (136,425), R12 (12,697), R16 (152,280), T15 (7,565).

The analysis of Rule 3 violations (not using static encryption keys) revealed that, again, the overall rates of static key usage have increased (see Figures 3.8 and 3.9). In particular, since 2012 the ratio of APK files that use a symmetric cipher with a static key increased from 70% to 93%. The ratio of call-sites that use a symmetric cipher with a static key increased from 45% to 57%. Unlike with the use of static IVs, both applications and libraries were to blame. Although applications did decrease the ratio of APK files that contribute to a violation of Rule 3, the ratio of call-sites that violate Rule 3 and originate from applications showed the opposite, i.e., the likelihood that a call from an application would use a static encryption key increased from 40% to 44%. This highlights that relying solely on the ratio of APK files with misuses might be misleading, due to the impact libraries have on this measurement.

Figure 3.8: Ratio of APK files that violated Rule 3 - "Do not use static encryption key for a symmetric cipher." The total number of APK files that used a symmetric cipher per dataset is provided in the legend: R12* (8,559), R16* (87,524), R12 (9,339), R16 (89,438), T15 (3,495).

Figure 3.9: Ratio of call-sites that used a static encryption key for a symmetric cipher. The total number of call-sites that set an encryption key for a symmetric cipher per dataset is provided in the legend: R12* (20,820), R16* (225,202), R12 (26,090), R16 (236,506), T15 (12,635).

In addition to validating the formal rules of using cryptography, we extracted the top-6 most-used symmetric ciphers from each dataset, as summarized in Table 3.6. While the use of AES has increased and the DES cipher has barely changed, triple DES, which is a more secure version of DES, has significantly decreased, from 9% in R12 down to about 1% in both R16 and T15. Surprisingly, we found that the RC4 cipher made it into the top-3 used ciphers in both T15 and R16, even though it is considered insecure [65] and the security community has suggested removing it from cryptography libraries [90].

                     Cipher (%)
        Call sites   AES    DES    3DES   PDE*   RC4   Blowfish   Others
R16     251,021      64.4   13.6   1.1    0.7    2.1   0.9        17.2
R12     31,192       58.9   12.5   8.8    6.5    0.4   1.9        10.9
T15     14,105       67.8   8.9    0.8    0.9    1.1   0.8        19.7

Table 3.6: The top-6 ciphers used in Android applications. *PDE was used with MD5 and 3DES.

Assuming that the warning message might have been the root cause of the drastic decrease in ECB mode use, it is worth investigating in future research whether adding similar messages for static IVs and encryption keys would have a similar effect. In addition, to supplement these warning messages, Google could provide "ready-to-use" code snippets to application developers in the Android Studio IDE. This would eliminate the need for developers to search online for code examples that might potentially have implementation issues.

Password-based encryption

Since 2012, the rates of misuse of password-based encryption (PBE) have overall decreased for both static salts (Rule 4) and the number of iterations (Rule 5), as shown in Figures 3.10-3.13. In particular, the ratio of APK files that provided static salts for PBKDF decreased from 81% to 74%. The ratio of APK files that used less than 1,000 iterations decreased from 58% to 51%. The ratio of calls to relevant Crypto API that violate either Rule 4 or 5 has also decreased (as shown in Figures 3.11 and 3.13). Source-attribution-based analysis showed that both libraries and applications have improved.

Figure 3.10: Ratio of APK files that violated Rule 4 - "Do not use static salt for PBKDF." The total number of APK files that used PBKDF per dataset is provided in the legend: R12* (1,867), R16* (8,380), R12 (2,355), R16 (11,840), T15 (714).

Comparison of the T15 and R16 datasets showed that, on average, 19% and 24% fewer APK files from T15 violated Rules 4 and 5, respectively. Further, per-source-category analysis revealed that this was mainly due to improvements in libraries used by the applications in the T15 dataset. Applications, however, violated Rule 4 more frequently (10% more). We omit discussion of the R12* and R16* sub-sets since our analysis did not reveal any statistically or practically significant results.

While these results suggest a downward trend in the misuse of Crypto API, future research should focus on how to improve these results even further.
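Rules 4 and 5 can be illustrated with the JDK's own PBKDF2 implementation; the snippet below is a minimal sketch (parameter values are illustrative) contrasting a flagged invocation with a compliant one:

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;

// Minimal sketch of Rules 4 and 5: derive a key with PBKDF2 using a
// random, per-user salt and a sufficiently high iteration count.
public class PbkdfRules {
    static byte[] deriveKey(char[] password, byte[] salt, int iterations)
            throws Exception {
        PBEKeySpec spec = new PBEKeySpec(password, salt, iterations, 256);
        return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1")
                               .generateSecret(spec).getEncoded();
    }

    public static void main(String[] args) throws Exception {
        // FLAGGED: a hard-coded salt (Rule 4) and too few iterations (Rule 5).
        byte[] staticSalt = "salt1234".getBytes("UTF-8");
        deriveKey("hunter2".toCharArray(), staticSalt, 20);

        // NOT flagged: fresh random salt and at least 1,000 iterations. The
        // salt is not secret and is stored alongside the derived-key material.
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        byte[] key = deriveKey("hunter2".toCharArray(), salt, 10_000);
        System.out.println("derived key bytes: " + key.length); // 256 bits = 32
    }
}
```

A static salt lets an attacker precompute a single password-to-key table for all users, and a low iteration count makes brute-force guessing proportionally cheaper, which is what the two rules guard against.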
For example, one might consider showing a message to application developers explaining the implications of using static salts or fewer than 1,000 iterations. Such a warning message might include time estimates of how long a password-guessing attack would take on today's hardware to go through the entire password space.

Figure 3.11: Ratio of call-sites that used a static salt for PBKDF. The total number of call-sites that provided a salt value for PBKDF per dataset is provided in the legend: R12* (5,318), R16* (33,283), R12 (6,398), R16 (38,773), T15 (1,765).

Figure 3.12: Ratio of APK files that violated Rule 5 - "Do not use less than 1,000 iterations for PBKDF." The total number of APK files that used PBKDF is provided in the legend: R12* (1,867), R16* (8,380), R12 (2,355), R16 (11,840), T15 (714).

Random number generation

The use of static seed values for SecureRandom has significantly decreased since 2012 (Figures 3.14 and 3.15). In particular, while the ratio of APK files that provide a static seed to SecureRandom dropped from 73% to 67%, the ratio of relevant call-sites that use a static seed value decreased from 69% to 43%. Although the ratio of APK files where libraries introduced a violation of Rule 6 grew by 3%, call-site analysis revealed that libraries significantly reduced the likelihood that a call to SecureRandom provides a static value (from 72% in 2012 to 42% in 2016). This, again, shows that relying on the APK-files ratio as the only way of measuring Crypto API misuse might convey an incorrect message.

Comparison of the T15 and R16 datasets revealed that libraries used by the top applications had a much lower impact on the APK files with violations of Rule 6 and had a lower rate of calls to Crypto API that provided static values.
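The determinism that Rule 6 guards against can be demonstrated with the legacy SHA1PRNG provider; this is an illustrative sketch, not code from any of the analyzed applications:

```java
import java.security.SecureRandom;
import java.util.Arrays;

// Sketch of the Rule 6 hazard. With SHA1PRNG, a seed supplied before any
// output is requested completely determines the "random" stream.
public class SecureRandomRule {
    static byte[] firstTenBytes(byte[] seed) throws Exception {
        SecureRandom rng = SecureRandom.getInstance("SHA1PRNG");
        rng.setSeed(seed); // FLAGGED: static seed set before first nextBytes()
        byte[] out = new byte[10];
        rng.nextBytes(out);
        return out;
    }

    public static void main(String[] args) throws Exception {
        byte[] staticSeed = "s3cret-seed".getBytes("UTF-8");
        // Two independently created generators yield identical output.
        System.out.println(Arrays.equals(firstTenBytes(staticSeed),
                                         firstTenBytes(staticSeed))); // true

        // Safe pattern: let the generator self-seed first; a later setSeed()
        // only supplements the existing entropy, it never replaces it.
        SecureRandom rng = new SecureRandom();
        byte[] out = new byte[10];
        rng.nextBytes(out);       // forces self-seeding from the OS
        rng.setSeed(staticSeed);  // harmless: adds to, does not reset, state
    }
}
```

Any key or IV derived from a statically seeded generator is therefore reproducible by anyone who recovers the seed from the APK.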
We omit discussion of the R12* and R16* sub-sets, since the analysis did not reveal any statistically or practically significant results.

Considering that the SecureRandom class can seed itself and that re-seeding does not decrease its entropy, we would suggest that this class always seed itself, even if an application developer provides a static seed value to the constructor.

Figure 3.13: Ratio of call-sites that used 1,000 or fewer iterations for PBKDF. The total number of call-sites that used PBKDF per dataset is provided in the legend: R12* (5,318), R16* (33,283), R12 (6,398), R16 (38,773), T15 (1,765).

Figure 3.14: Ratio of APK files that violated Rule 6 - "Do not use static seed for SecureRandom." The total number of APK files that used SecureRandom per dataset is provided in the legend: R12* (3,173), R16* (7,176), R12 (3,239), R16 (7,190), T15 (579).

3.6.4 The impact of third-party libraries

Another important factor to consider for libraries is popularity. That is, a popular library with a misuse will impact a significantly larger set of APK files. To understand how popularity impacts misuse rates, we proceed with the following analysis. We measured the number of APK files that would be misuse-free if one began fixing misuses in libraries, starting with the most popular libraries. Figure 3.16 shows this impact for each dataset.

                                                                Rank in dataset
Company   Library       Package                                Violated rules   R16   R12   T15
Google    Play SDK                                             2, 3             1     –     1
Apache    HTTP Auth     org.apache.http.impl.auth              1                2     3     5
InMobi    Advertising   com.inmobi.commons.core.utilities.a    –                3     –     2
Google    Advertising                                          3                38    1     36
VPon      Advertising                                          1, 3             –     2     –

Table 3.7: Summary of the top-2 libraries from each dataset that made use of Crypto API. Empty values imply that the library was not found in the dataset.
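The measurement behind Figure 3.16 can be sketched as follows. The APK-to-offending-sources map below is invented for illustration; the marker "<app>" stands for a misuse in the application's own code, which no library fix can remove:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Toy sketch of the Figure 3.16 measurement: how many APK files become
// misuse-free if one fixes misuses in libraries, most popular first.
public class LibraryFixImpact {
    // Rank libraries by how many APKs contain one of their misuses.
    static List<String> rankLibraries(Map<String, Set<String>> apkMisuses) {
        Map<String, Integer> counts = new HashMap<>();
        for (Set<String> sources : apkMisuses.values())
            for (String s : sources)
                if (!s.equals("<app>")) counts.merge(s, 1, Integer::sum);
        return counts.entrySet().stream()
            .sorted((a, b) -> b.getValue() - a.getValue())
            .map(Map.Entry::getKey).toList();
    }

    // Number of APKs that are misuse-free once the given libraries are fixed.
    static int curedApks(Map<String, Set<String>> apkMisuses, Set<String> fixed) {
        int cured = 0;
        for (Set<String> sources : apkMisuses.values())
            if (fixed.containsAll(sources)) cured++; // "<app>" is never fixed
        return cured;
    }

    public static void main(String[] args) {
        Map<String, Set<String>> apks = Map.of(
            "a.apk", Set.of("libX"),
            "b.apk", Set.of("libX", "libY"),
            "c.apk", Set.of("libY"),
            "d.apk", Set.of("libX", "<app>")); // app-level misuse: never cured
        System.out.println(rankLibraries(apks));                    // [libX, libY]
        System.out.println(curedApks(apks, Set.of("libX")));        // 1
        System.out.println(curedApks(apks, Set.of("libX", "libY"))); // 3
    }
}
```

Sweeping `curedApks` over growing prefixes of the ranked list yields the cumulative curves of Figure 3.16; the APKs that also contain application-level misuses account for the gap below 100%.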
In particular, by fixing the top library in R16, 50,015 APK files would become misuse-free (or 56% of all APK files with misuses), and by fixing all libraries, 79,207 APK files would be fixed (or 89.5% of APK files with misuses).

Figure 3.15: Ratio of call-sites that used a static seed for SecureRandom. The total number of call-sites that seed SecureRandom per dataset is provided in the legend: R12* (3,377), R16* (11,930), R12 (3,480), R16 (11,944), T15 (833).

Figure 3.16: Proportion of APK files that would become Crypto API misuse-free depending on the number of fixed top-ranked libraries. The legend shows the total number of applications that had at least one misuse in the corresponding dataset: R12 (9,886), R16 (88,510), T15 (3,491). We identified 222, 507, and 198 libraries with misuse in the R12, R16, and T15 datasets, hence the end of the corresponding lines.

3.6.5 In-depth analysis of top libraries

Considering that libraries made 90% of all calls to cryptographic APIs and, if fixed, can potentially reduce the number of APK files with API misuse by a factor of 10 (see Figure 3.16), we performed an in-depth manual analysis of the top-2 libraries from each dataset. After selecting the top-2 libraries from each dataset, we found that one library was in the top 2 in both T15 and R16, resulting in 5 libraries in total.

In what follows we provide the details of the results of our manual in-depth analysis of these five libraries. The analysis was mainly focused on the reasons for the use of Crypto API and the security impact of the identified misuses, if any. Overall, the results revealed that four out of the five top libraries violated some of the rules. Three of them were identified as false positives, i.e., formal violations of Crypto API use rules that did not introduce a security vulnerability. Most of the analyzed misuse cases originated from obfuscation. In addition, three libraries implemented their own data encryption layer over the HTTP or HTTPS protocols. This issue, however, can be trivially addressed by switching to HTTPS for all communications and dropping the libraries' implementations of data encryption.

Google advertisement

This library was the top library in the R12 dataset, and made the top 40 in both the T15 and R16 datasets. It provides advertisement services to applications. It makes use of the data encryption API in the AdUtil class, located in the package. The implementation uses a static encryption key (i.e., violating Rule 3), which is hard-coded in the AdUtil class. The encryption function receives plaintext as a string and returns ciphertext, also as a string. The encryption uses the AES cipher in CBC mode with PKCS5Padding. The encryption function is later used to encrypt a string representation of a user's location before it is sent back to Google's servers. Considering that the communication happens over the HTTPS protocol, the use of a static key does not impact confidentiality in the presence of a network attacker. However, we would recommend fixing this to avoid exploitability if for some reason the HTTPS protocol were not available.

In addition, we found that in R16 and T15 this library had changed significantly. In particular, the newer version no longer used encryption, and the structure of the library was significantly simplified as well. There were, however, several applications in both the T15 and R16 datasets where the old version of the library was used and had the AdUtil class with the same key and the same misuse. Interestingly, these applications had been updated relatively recently (2 – 3 months prior to the data collection of T15 and R16). This observation confirms findings from a recent report [39], which showed that application developers are slow to adopt new versions of libraries.

VPon advertisement

This library is also an advertisement library, present only in the R12 dataset.
Similarly, it uses a cipher to encrypt and decrypt data. All identified call-sites were located in the CryptUtils class in the package. This library violates two rules: the use of ECB mode (Rule 1) and a static encryption key (Rule 3). The CryptUtils class exposes two types of encryption functions: one that uses javax.crypto.SealedObject as an input for encryption, and one that accepts a key and data as strings and returns a string as a result. The functions that work with SealedObject are used to encrypt requests that are sent back to the server and to decrypt responses from it. This suggests that the static key is shared between the library and VPon's servers. The requests are sent over both HTTP and HTTPS. Unfortunately, we were unable to determine exactly what data are sent over which protocol. The second function, based on strings as input and output, is only used to decrypt obfuscated string literals. Decrypting string literals in Android applications is a common obfuscation approach, which implies that data confidentiality is not the primary objective. To summarize, this library violates two rules (use of ECB mode and use of a static key) to communicate with the advertisement server and to obfuscate data.

Apache library

This library provides the ability for applications to communicate over the HTTP and HTTPS protocols. It was the only library in the top 5 in all three datasets. The library uses Crypto API in many locations, but one specific call site, which used ECB mode, drew our attention. In particular, this library implements a suite of NT LAN Manager (NTLM) authentication protocols, which are commonly used for authentication over HTTP(S). This protocol, by design, uses a DES cipher in ECB mode (i.e., violating Rule 1) to implement challenge-response validation. However, it only encrypts a single cipher block (8 bytes for DES). Considering such
Considering suchuse, this misuse was classified as a false positive, since the encrypted content israndom and fits into a single block.Google Play SDKThis library provides services of the Google Play platform, such as In-App pur-chases or authentication with Google accounts. This library was the top mostlibrary in both T15 and R16, and was absent from the R12 dataset. It violatedtwo rules, the use of static IV and static keys (rules 2 and 3). Interestingly, thislibrary implemented only a decryption function that accepts a byte array as a keyand a string as cipher-text. It outputs plain-text as a byte array. The key is Base64encoded and hard-coded as a property in a static class, which is located in package. The same static class contains all thecipher-texts that get decrypted. All cipher texts are hard-coded in Base64 for-mat. Further analysis revealed that one of the cipher texts is actually an encryptedDEX file, which upon decryption (about 3K in size) loaded into application space91through Java Reflection API [29]. The remaining cipher texts are properties of theclass (such as name of class and name of its fields or functions). This case fallsinto the obfuscation category, and was thus considered as a false positive.InMobi advertisementThis library (the second-most popular library in the T15 dataset) provides in appadvertisement capabilities to the applications. The call-sites to Cipher facilitieswere found in InternlSDKUtil, located in the com.inmobi.commons.internalpackage. This class uses an AES cipher in CBC mode with PKCS7 padding.It also uses an RSA cipher, to exchange a symmetric key with the server. Wefound that this library, similarly to VPon, uses encryption facilities to encryptcommunications with their advertisement server. Interestingly, we saw the useof both HTTP and HTTPS protocols for communication to the same domain ad-dress, thus, it is unclear why the library developers had not switched all com-munications to HTTPS. 
This library generates an encryption key once, stores it in SharedPreferences, and then reuses it for all subsequent communications. Formally, InMobi's implementation did not violate any of the evaluated rules of Crypto API use.

3.6.6 The impact of third-party libraries revisited

The results of the in-depth analysis of the top libraries revealed that the current approach used for identification of Crypto API misuses in APK files suffers from a significant ratio of false positives. We classify a misuse case as a false positive if the actual use of the Crypto API was not meant to provide integrity or confidentiality protection, i.e., IND-CPA is not a concern. For example, while the Google Play SDK violated Rules 2 and 3 (it used a static IV for CBC mode and a static encryption key), it did so for obfuscation purposes only. Another limitation of the current approach is that it misses certain edge cases, e.g., encryption of a single block of random data in ECB mode. Such cases significantly inflate misuse rates and thus convey a wrong picture of the actual misuse of cryptography in Android applications. Future research should focus on expanding BinSight's ability to classify whether cryptographic APIs are used for obfuscation purposes.

3.7 Discussion and Future Work

The results of the analysis of more than 132K Android applications revealed that 9 in 10 calls to Crypto API originate from third-party libraries. Libraries are also the main source of misuse cases: 89.5% of the APK files collected in 2016 were flagged only due to the libraries they used. By re-analyzing the dataset from the CryptoLint study we found that its authors had missed 249 out of the 260 libraries in their dataset, which significantly contributed to over-counting in their results. In particular, 222 of the missed libraries were responsible for 70% of the flagged APK files in their dataset.
These results suggest that future research on Crypto API (mis)use must use source attribution and analyze libraries and applications separately.

Our implementation of source attribution relies on package names. For this approach to work, the classes with calls to Crypto API must not be fully obfuscated, i.e., full renaming of class identifiers must not be used. Although our analysis revealed that only 2.5% of the classes were fully obfuscated among the applications in 2016, future research should focus on improving the ability of analysis tools to identify libraries and applications. One can achieve this objective by exploring the de-obfuscation methods proposed by Bichsel et al. [43] or Backes et al. [39].

To help developers choose secure libraries, the research community should also invest time in establishing a centralized repository for sharing identification data for libraries. Application developers would be able to consult such a repository to make informed decisions on which libraries to use, while library developers would also be able to respond to the discovered issues and explain misuse cases.

By using static analysis we inherited all the limitations that come with such an approach. In particular, our sCFG is both an over- and under-estimation of the actual sCFG. On one hand, we overestimated sCFGs by including all detected edges; such an approach might include edges that will never be executed during an application's run-time. On the other hand, our analysis did not include edges that are created dynamically. Such edges are usually created through the Java Reflection API [29]. These limitations create risks for the validity of the presented results. Future research should consider complementing BinSight with dynamic analysis capabilities and including the analysis of the Reflection API in the sCFG construction process.

Although one cannot obfuscate calls to platform APIs, such as Crypto API, it is still possible to hide them through late binding.
In particular, one can use the Java Reflection API to side-load a binary that makes the actual call to Crypto API. This, as mentioned above, can be addressed by augmenting BinSight's analysis pipeline with the analysis of calls to the Java Reflection API, which can be based on the same static analysis approach we used for Crypto API calls.

Even though the ratio of fully obfuscated classes in our datasets was negligible (2.5% in R16), understanding how obfuscated applications differ from non-obfuscated ones is still an important and interesting research question. Such low adoption of full obfuscation, on the other hand, allowed us to use trivial yet efficient and effective source attribution based on package names.

While looking into the top libraries we found that not all misuses of Crypto API necessarily have security implications. However, our analysis was exploratory in nature and does not provide a precise assessment of the ratio of identified Crypto API misuses that are false positives. Considering that the top library from R16 was responsible for 56% of flagged APK files and that it used Crypto API for obfuscation, we suspect that a significant portion of misuse cases are indeed false positives. This suggests that the current approach to analyzing Crypto API misuse is inadequate in several respects.

First, as we showed, the currently defined rules miss certain edge cases, which inflates the number of misuse cases. Second, without understanding why the Crypto API is used in the first place, it is impossible to say when we actually encounter a misuse, given that Crypto API might be used for reasons other than confidentiality or integrity protection.
Finally, without understanding the types of data that are handled in Crypto API calls, it is impossible to say to what extent a misuse might correspond to a security vulnerability.

To this end, our analysis showed that there are still plenty of open research questions that need to be addressed. Given the identified limitations of the current approach to the analysis of Crypto API misuse, it is impossible to say whether the detected misuses actually result in security issues. In what follows, I discuss two research areas for future work.

3.7.1 Extending the Crypto API analysis

In our analysis we used the same six rules for secure use of Crypto API that the authors of the CryptoLint study defined. This set, however, is far from complete. First, future research should consider adding new rules based on recently reported attacks on the use of cryptographic APIs or modes of operation [95, 112]. Second, extending BinSight's coverage to asymmetric ciphers, similarly to what Shaui et al. [103] proposed, would provide the research community and application developers with a single tool for an overall evaluation of how an application uses symmetric and asymmetric cryptography. Such an evaluation would have to separate public keys from private ones, since a static public key does not pose the same security threat as a static encryption key for a symmetric cipher. Third, considering that recent research has shown that applications incorrectly validate SSL certificates [62], one could also include analysis of the implementation of secure protocols, such as TLS or SSL.

While identifying misuse cases is important, it is also highly important to understand what happens with data once it is encrypted. In addition, insight into what kind of data is being encrypted would allow ranking the uncovered issues by severity level.
That is, if highly sensitive data are handled by code that misuses Crypto API, one should fix that issue first, compared to a benign misuse such as obfuscation.

One can achieve such a goal by analyzing program slices that gather data before submitting it for encryption. Furthermore, these slices should also include the flows where cipher-text is passed through an input/output (IO) API. Including IO flows is important for a proper assessment of security, since the attack surface is significantly different for network and on-host attackers. Finally, one can use dynamic analysis methods and data tainting to track all sensitive data records; for example, see [60] and [38].

Although one cannot obfuscate calls to platform APIs, such as Crypto API, it is still technically possible to hide such calls from static analysis. For instance, an application can use the Java Reflection API to side-load a binary that contains a class definition making the actual calls to Crypto API. To uncover such cases, future research should introduce analysis of the Reflection API into the BinSight pipeline. Once such call sites are found and the corresponding binaries are downloaded, one can feed these binaries to the existing BinSight pipeline and proceed with the analysis as usual.

More importantly, our manual in-depth study of the top libraries revealed that the identified misuse cases did not impact the actual security of the applications. In particular, while static linting was efficient in detecting Crypto API misuses, it failed to capture actual security implications, since most of the manually analyzed libraries used cryptography for reasons other than confidentiality or integrity protection. Reporting too many false positives would make it hard to convince application and library developers to change their coding practices. Future research should first focus on identifying these false positives.
For instance, one could white-list known benign misuse cases and share them with the community. An example is the use of ECB mode in the Apache library's NTLM protocol implementation, which is secure since it encrypts a single block of random data. Another approach is to employ dynamic analysis and data tainting in order to uncover the kinds of data involved. This would allow ranking all misuses by how sensitive the involved data are, and thus would allow the research community to focus on what is most important.

3.7.2 How Crypto API Misuse Rates Have Changed

By using the original dataset from the CryptoLint study [57], we were able to (a) replicate the original study and (b) compare how misuse rates changed between 2012 and 2016. The analysis of applications from the CryptoLint study showed comparable results, i.e., about 90% of Android applications contained at least one misuse of Crypto API. The analysis of applications collected in 2016 revealed similar results, i.e., around 90% also had at least one case of Crypto API misuse.

In contrast to the CryptoLint study itself, we analyzed calls from libraries and applications separately. Source attribution analysis revealed that 9 out of 10 calls to Crypto API originated from libraries. Such library domination makes the misuse rate measured by the CryptoLint study highly biased towards libraries. In particular, we showed that 507 libraries in the 2016 dataset were the only reason why 80.5% of APK files were flagged as misusing Crypto API. To provide a better understanding of trends, we used the ratio of call-sites with mistakes to all call-sites as a complementary metric.
While the ratio of APK files with Crypto API misuse provides insight into the impact of libraries and applications on the overall number of APK files with misuses, the call-site ratio provides an intuitive probability of how often a call from a library or an application makes a mistake.

Trend analysis showed that while both applications and libraries have improved in certain aspects, e.g., the use of ECB mode for symmetric ciphers, they have significantly worsened in other areas, such as the use of static encryption keys. As for the decrease in the use of the ECB mode, a possible reason is the introduction of a warning message in Google's Android Integrated Development Environment (IDE), which highlights the insecurity of the ECB mode. Interestingly, there are plenty of "suggestions" online for how to "fix" this warning message by replacing the ECB mode with CBC.

In addition, our analysis revealed that RC4, a symmetric cipher with known vulnerabilities [65], has become the third most-used cipher among applications collected in 2016. Future research should focus on studying whether warning messages, similar to the "insecure ECB mode" message, would change how developers use Crypto API. Further, recent research showed that application developers also need code samples to make sure they use an API properly [36]. Thus, one approach would be to incorporate samples into the IDE as ready-to-use code snippets.

3.8 Conclusion

This chapter presented the results of a study on the misuse of cryptographic APIs in Android applications. Although other researchers had previously measured the spread of misuse, this study differs in that it focused on source attribution, i.e., understanding whether a misuse originates from a library or an application. Such focus revealed that 9 out of 10 calls to Crypto APIs originate from libraries and that libraries are the major contributor to misuse, both in terms of APK files and the ratio of call-sites.
In particular, third-party libraries were the only source of Crypto API misuses for 89.5% of flagged APK files.

While replicating the CryptoLint study and confirming its results, we showed that the study's analysis missed most of the libraries (249 out of 260). This led to over-counting, i.e., counting the same misuse multiple times, since 70% of the APK files identified by CryptoLint had misuses that originated from only 222 libraries. That is, 222 libraries were solely responsible for the flagging of 6,932 APK files in the R12 dataset (out of 9,886).

This chapter also provided insights into how Crypto API misuse rates changed between 2012 and 2016. We found that the trends were mixed. While the overall ratio of APK files with misuse attributable to libraries had significantly worsened, libraries managed to improve in some areas (decreasing the use of ECB mode and of static seeds for SecureRandom). This chapter also demonstrated that using the ratio of APK files is biased towards libraries, especially the popular ones. To address this limitation we proposed using the ratio of call-sites with a mistake, and we demonstrated that in certain cases the ratio of APK files conveys a misleading message.

Finally, this study also provides insights into misuse cases by presenting the results of a manual in-depth analysis of the top-2 libraries from each dataset. The results showed that while static linting is efficient at detecting Crypto API misuses, it fails to capture actual security implications. In particular, manual analysis revealed that the investigated libraries used cryptography for reasons other than confidentiality or integrity protection. Another observation was the edge case for Rule 1, i.e., encrypting a single cipher block of random data in ECB mode.
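The Rule 1 edge case can be sketched in a few lines (a hypothetical minimal example, not the Apache NTLM code itself): encrypting exactly one block of fresh random data in ECB mode never exercises ECB's weakness, because that weakness only manifests when two equal plaintext blocks are encrypted under the same key.

```java
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class EcbEdgeCase {
    // Encrypts exactly one 16-byte block of fresh random data in ECB
    // mode. A rule that flags every ECB call-site reports this as a
    // misuse, although ECB's weakness (equal plaintext blocks mapping
    // to equal ciphertext blocks) cannot manifest on a single
    // uniformly random block.
    public static byte[] encryptSingleRandomBlock() {
        try {
            SecretKey key = KeyGenerator.getInstance("AES").generateKey();
            byte[] block = new byte[16];
            new SecureRandom().nextBytes(block);
            Cipher c = Cipher.getInstance("AES/ECB/NoPadding");
            c.init(Cipher.ENCRYPT_MODE, key);
            return c.doFinal(block);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```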
This is why future research on Crypto API misuse should focus on improving the ability of analysis tools to identify use cases, to detect edge cases, and, more importantly, to identify the types of data being processed by calls to Crypto API. Understanding all of this allows assessing the severity of a misuse case, i.e., whether it potentially leads to a breach of sensitive data, or whether it is a mere functional false positive, e.g., obfuscation.

Chapter 4

Storing Encryption Keys on Wearable Devices

In this chapter we present the results of a feasibility evaluation of a system that uses wearable devices to manage encryption keys. The main intuition for why such a system might help address the issues presented in the previous chapters is the proliferation of various wearable technologies (e.g., smart watches and fitness trackers).

4.1 Introduction

Public and private organizations see plenty of benefits in the adoption of smartphones and tablets for their businesses. For example, the bring your own device (BYOD) policy has become a norm [5]. While some businesses have fewer constraints on how well company data need to be protected, certain types of data are under stricter requirements. For example, health-related data in the US are required to follow the Health Insurance Portability and Accountability Act of 1996 (HIPAA) [28], and protection of any personally identifiable information in Canada has to follow the Personal Information Protection and Electronic Documents Act (PIPEDA).
Adoption of the BYOD policy creates certain challenges in complying with HIPAA or PIPEDA. In particular, as shown in Chapter 2, 90% of users tend to rely on easy-to-guess unlocking secrets, which makes password guessing attacks practical and enables data decryption. Furthermore, 1 in 3 users do not lock their devices at all, which makes any data stored on the device immediately readable.

Although it is hard to measure how often an attacker actually tries to access confidential data on users' smartphones, there is some anecdotal evidence to consider. In the US alone, every tenth smartphone owner has experienced theft at least once [4]. More than 30% of all street robberies involve smartphone theft [2]. Finally, 46% of companies from North America and Europe stated that theft of a data-bearing device, such as a smartphone, was the key factor in the data breaches they experienced [5].

All of the above makes it challenging, if not impossible, for certain organizations to adopt smartphones and tablets in their businesses. In the following, we design and evaluate a system, called Sidekick, that aims to address two issues. First, it aims to make unlocking secrets optional for data-at-rest security by using wearable devices as key storage devices. Second, it aims to give organizations full control over how their data are encrypted, while still allowing the BYOD policy.

Evaluation results revealed that this proposal is practical from a technical point of view. That is, one can use Sidekick on all existing platforms and on devices that have a Bluetooth Low Energy (BLE) stack. In addition, the system imposes negligible latency and power consumption overhead. There are still, however, plenty of open research questions, especially on the usability of this proposal.

4.2 Threat Model

This section describes the threats, risks, and attacker capabilities considered during the design and development of the Sidekick system.

4.2.1 Threats and Risks

The Sidekick system was designed with a focus on smartphone loss and theft threats, or theft, for brevity. In our adversarial model we consider an opportunistic attacker, i.e., an attacker whose main objective is to profit from selling the device itself.
As an additional source of revenue, such an attacker might attempt to access data. Such an opportunistic attacker would only spend enough time to find an unlocking secret that is easy to guess, as defined in Chapter 1. To achieve this, the attacker would use available tools for offline password guessing, e.g., HashCat [33], in order to find the unlocking secret. Attackers whose primary objective is accessing data are beyond our scope, since they can coerce smartphone owners into giving up their unlocking secrets through physical threats.

Once the device is in the hands of an attacker, there is a risk of confidential data disclosure. If an attacker gains access to confidential data, the owner of the data might suffer losses, such as reputational and/or financial damages. For example, the victim might have to pay fees for leaking private customer information. Our main focus is on organizations that have adopted the BYOD policy but need control over the protection of their data's confidentiality. To achieve this, such organizations are willing to require their employees to carry a wearable device. This is why in our evaluation we focused only on the technical aspects and not on the usability of the proposal. The reason for focusing on the technical aspects, rather than usability or security, is threefold. First, we aimed to use the BLE communication stack, which is available on all mobile platforms but whose security was deliberately compromised for energy efficiency [10]. Second, we evaluated a well-known and well-researched mutual authentication protocol on top of the BLE stack, with the Elliptic Curve Diffie-Hellman protocol for establishing session keys. Third, we envisioned Sidekick being implemented as a background service on wearable devices that are already in use, thus having minimal impact on user interactions with the wearable device.

4.2.2 Attack

Accessing confidential data that are not encrypted by either PBE or the application itself is trivial.
If, however, PBE is enabled, the attacker first needs to extract a bit-by-bit image of the internal storage, which can be achieved with existing tools (e.g., [7]). The main reason for extracting the image first is to bypass the limit on the number of failed unlocking attempts. This limit is enforced at the OS level and often leads to a complete data wipe-out. For example, in iOS a user can enable device wipe-out after 10 unsuccessful unlocking attempts. With the storage image in hand, the attacker mounts a password guessing attack in order to recover the key encryption key (KEK), which protects the actual data encryption key (DEK).

The stolen device might use specialized hardware for PBE. For instance, iPhones and iPads use a cryptographic chip for the key derivation process. This chip has an embedded hardware key, which is used in key stretching. If such specialized hardware is used, the attacker needs to run certain computations on that hardware during the password guessing attack. Such attacks are called on-device brute-force attacks and, in general, are significantly slower than off-device attacks, since massive parallelization becomes unavailable. Both on-device and off-device attacks are considered offline, as they bypass the enforced limits on the number of allowed failed unlocking attempts. In the off-device scenario, the storage image is processed off the device, hence none of the restrictions apply. In the on-device scenario, an attacker uses the specialized hardware directly, which allows him to circumvent the limits enforced by the operating system or drivers. On-device attacks, however, are significantly harder to mount and require certain types of vulnerabilities in the booting sequence.
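The key hierarchy targeted by such an attack can be sketched as follows (a simplified illustration with hypothetical KDF parameters; real platforms add hardware-bound keys and different iteration counts):

```java
import javax.crypto.Cipher;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

public class PbeHierarchy {
    // Derives the key encryption key (KEK) from the unlocking secret.
    // An offline attacker repeats exactly this derivation for every
    // candidate secret; a 4-digit PIN gives only 10^4 candidates.
    public static byte[] deriveKek(char[] secret, byte[] salt) {
        try {
            PBEKeySpec spec = new PBEKeySpec(secret, salt, 10_000, 256);
            return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1")
                    .generateSecret(spec).getEncoded();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // The KEK protects the randomly generated data encryption key
    // (DEK); only the wrapped DEK is stored on flash.
    public static byte[] wrapDek(byte[] kek, byte[] dek) {
        try {
            Cipher c = Cipher.getInstance("AESWrap");
            c.init(Cipher.WRAP_MODE, new SecretKeySpec(kek, "AES"));
            return c.wrap(new SecretKeySpec(dek, "AES"));
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Because the derivation is deterministic for a given salt, every PIN candidate yields one KEK candidate to test, which is what makes the offline search feasible for short secrets.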
Once the attacker recovers the KEK, it becomes trivial to recover the DEK and obtain access to the encrypted data.

If an attacker needs to bypass data encryption implemented by an application, a binary of the application must first be obtained and reverse engineered. For example, Android applications are distributed as APK files, which can be downloaded from the Google Play store with existing tools such as APK Downloader [25]. If a misuse of a cryptographic API is found, the attacker then uses this knowledge to decrypt the data.

Because Sidekick relies on wearable devices, one must evaluate attacks on the wireless communication stack used and their implications for the overall security of the encrypted data. An attacker might attempt to obtain KEKs transferred over the BLE channel. If such an attack is successful, the attacker can decrypt the DEK and then decrypt the encrypted data. An attacker might also aim to corrupt a KEK during transmission over BLE, which, if successful, would make the corresponding DEK cryptographically inaccessible. This capability might be exploited by ransomware – malware that encrypts data and demands a payment for the victim to get his or her data back.

4.2.3 General Assumptions

The design of Sidekick makes several general assumptions about the capabilities of an attacker. First of all, perfect cryptography is assumed, i.e., attackers cannot distinguish the used cipher (AES) from a random permutation. Considering that non-generic attacks have yet to be found for the AES cipher, this assumption is sound. It is also assumed that there are no security bugs in the implementation of the data encryption system that would introduce a shortcut for encryption and decryption (e.g., a hard-coded KEK or DEK, or a biased and predictable random number generator).

In addition, confidential data disclosures through a compromised OS kernel are not considered, for the following reasons.
First, having a secure OS kernel does not prevent password guessing attacks or attacks on misused cryptographic APIs. Second, unless a trusted secure platform is used, a compromised OS kernel renders any data encryption ineffective. That is, by virtue of controlling the OS kernel, the attacker can read and write any memory page, and thus can extract DEKs and KEKs directly from RAM.

The use of wireless communication stacks enables attackers to track users based on device addresses (i.e., Media Access Control addresses). It is assumed, however, that tracking users is not one of the attacker's objectives. Such an objective is not only unrelated to data-at-rest security, but can also be trivially addressed with existing methods and tools, e.g., by enabling the BLE privacy feature [10]. The ability to deny the existence of data is not considered either, since (a) it is complementary to confidentiality protection, and (b) it can be addressed using one of the available systems (e.g., [104]).

Finally, we assume that the attacker's main focus is the smartphone itself and that the data stored on the device are secondary. That is, an attacker would attempt to access data only if the data protection is inadequate, e.g., the device is unlocked or the unlocking secret can be guessed within a time frame that the attacker considers reasonable. Considering that the main objective of Sidekick is to increase the search space for the encryption keys, we focus the evaluation on the costs associated with the use of such an external device. We do not focus on the security of the wearable device itself, in terms of either physical security or its operating system, since we assume that the attacker either will not have access to this wearable device or that a secure and tamper-resistant device is being used.

4.2.4 Crypto-Attacker

The crypto-attacker aims to obtain confidential data by recovering the KEK through password search. It is assumed that the crypto-attacker has the following capabilities.
First, he has physical access to the victim's smartphone. Second, the crypto-attacker knows the design of the data encryption system and knows how the KEK is generated. Third, he can obtain a bit-by-bit image of the internal storage of the stolen smartphone, which allows bypassing the file system's access control. Techniques for acquiring a raw storage image have been widely discussed in recent years [104, 114]. Fourth, if the stolen device uses special hardware for data encryption, e.g., the crypto-chip in iPhones, he knows how to mount an on-device password guessing attack. Finally, it is assumed that the crypto-attacker has limited time for the attack and is not capable of mounting a successful guessing attack on a pseudo-randomly generated 128-bit KEK.

4.2.5 Network-Attacker

A network-attacker might have several objectives. First, he might be interested in the data-at-rest stored on the smartphone. In this case the attacker plans to steal the smartphone later, but first aims to obtain all KEKs transmitted over BLE in order to eliminate the need to later perform a KEK search. Second, the network-attacker might be interested in corrupting KEKs to cryptographically lock the user's valuable data. The attacker can then use his knowledge of how he corrupted the KEK and demand a ransom payment from the victim.

To compromise the wireless channel, the network-attacker can use one of two approaches. First, the attacker can focus on the wireless messages themselves by exploiting insecure protocols, gaining the ability to recover or corrupt KEKs. In particular, we assume that the network-attacker is able to exploit the vulnerabilities reported in the BLE stack thus far [96–98]. These attacks showed that recovering the established pairing keys for the BLE stack is practical.
This gives the attacker the ability to decrypt and modify any message transmitted over the BLE stack.

Second, if the attacker controls an application with access to the Bluetooth stack on the victim's smartphone, he can communicate with the wearable device and retrieve the required KEKs before stealing the device. This type of attack is called a misbonding attack [86]. To mount a misbonding attack, an attacker needs to obtain access to the Bluetooth stack by requesting the android.permission.BLUETOOTH permission for an Android application. This allows the application to communicate with all Bluetooth devices that are paired and connected, including the wearable device used for KEK storage. From the wearable device's perspective, all applications that communicate with it have the same identity – that of the paired smartphone. That is why a wearable device running Sidekick must be able to identify each application in order to enforce access control on the stored KEKs.

4.3 Sidekick Design

In this section we present the design of the Sidekick system. We begin with a high-level overview of the system, then proceed with a discussion of the counter-measures used to mitigate attacks by a network-attacker. We conclude with a security analysis of the proposal in the presence of network and crypto attackers.

4.3.1 High Level Overview

The Sidekick system relies on a wearable device for the encryption key management task. This allows the decoupling of user authentication and data encryption by making the dependency on the unlocking secret optional in the Data Encryption Key derivation process. Instead, Sidekick generates all KEKs randomly and stores them on an external device. Systems that separate key storage in this way have already been proposed and deployed for personal computers and laptops, e.g., TrueCrypt and BitLocker [12, 14].
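The decoupling can be sketched in a few lines (a hypothetical illustration, not Sidekick's actual code): the KEK is drawn uniformly at random rather than derived from an unlocking secret, so the password search that weak secrets enable simply has no target.

```java
import java.security.SecureRandom;
import java.util.Arrays;

public class KeyDecoupling {
    public static final int KEK_BYTES = 32; // 256-bit KEK

    // Generates a uniformly random KEK. Because no unlocking secret
    // is involved, an offline attacker faces the full 2^256 key space.
    public static byte[] generateKek() {
        byte[] kek = new byte[KEK_BYTES];
        new SecureRandom().nextBytes(kek);
        return kek;
    }

    // After the KEK is shipped to the key storing device, the local
    // copy is zeroized; only the KEK identifier and the DEK encrypted
    // under the KEK remain on the smartphone.
    public static void eraseLocalCopy(byte[] kek) {
        Arrays.fill(kek, (byte) 0);
    }
}
```

This is the same key-separation idea used by desktop systems such as TrueCrypt and BitLocker with external key files, with the wearable device playing the role of the external storage.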
With the evaluation of the Sidekick system we aim to determine whether it is practically feasible to use an insecure BLE stack while achieving both (a) keeping the latency and power consumption reasonable, and (b) mitigating the misbonding attack by introducing mutual authentication between applications on the smartphone side and the Sidekick service on the wearable device side.

The key technical difference between the design of the Sidekick system and all existing proposals is the process of fetching the KEK from a wireless wearable device, shown in Figure 4.1. Thus, our technical evaluation of Sidekick is focused on the performance of the KEK fetching process.

Figure 4.1: In currently deployed systems, a user needs to provide an unlocking secret to unlock his or her device. The unlocking secret, most probably, is an easy-to-guess one. That secret is then used to derive a Data Encryption Key (DEK), which is then used for data encryption/decryption. When application developers need to encrypt data in smartphones, they usually use a static data encryption key, i.e., hard-code it into their application, and then also roll out their own implementation of the data encryption. Sidekick addresses both issues by randomly generating key encryption keys (KEKs) and storing them on a wearable device. Sidekick makes data encryption independent from the unlocking secret by relying mainly on KEKs while making the use of unlocking secrets optional (shown as a dashed line). It also provides a simpler API to application developers so that they do not need to roll out their own implementation of data encryption and an encryption key management system.

There are several reasons to choose a wireless stack and a wearable device over existing physical connections. Physically attached external devices, such as memory cards or USB flash drives, are not well suited for smartphones, since not all modern smartphones have a USB port or allow using external memory cards. Physically attached external devices also require constant user attention, since forgetting to unplug the external device from the smartphone destroys all the security properties that a data encryption system provides. The proliferation of wearable devices, such as smartwatches and fitness trackers, suggests that there will be plenty of options for users to choose from for storing KEKs.

Sidekick uses the BLE stack for the following reasons, which are mostly practical. First, BLE hardware and BLE APIs are available on all platforms today, while other Personal Area Network (PAN) stacks, notably Near Field Communication (NFC), have limited support. Second, the use of BLE in wearable devices is energy efficient, since recent research has shown that a BLE-based System-On-Chip (SoC) can work for months off a single coin-cell battery [6, 69]. Finally, most of the released wearable devices already rely on the BLE stack for their communications with smartphones (e.g., Nike+ [16]). Sidekick was prototyped on the CC2540 SoC developed by Texas Instruments.

Figure 4.2: High-level design of the Sidekick system. A data containing device (DCD) runs applications that link the Sidekick library. The library takes care of all communications with the KSD, e.g., storing or retrieving a KEK. Once a required KEK is retrieved, the corresponding DEK is decrypted and stored in the Decrypted DEKs Cache by the Sidekick library. The DEK is then passed to the Data Encryption System in order to encrypt/decrypt data. Each application has a separate KEK List. The Reference Monitor on the KSD mitigates the misbonding attack by ensuring that each application has access only to its own KEK List.
Although several other, more capable BLE-enabled SoCs were available at the time, the CC2540 was chosen since it was the most popular and least capable BLE-enabled SoC at the time of the experiments [13].

The overall design of Sidekick is shown in Figure 4.2. A smartphone that stores sensitive data is called a data containing device (DCD). A wearable device that stores KEKs is called a key storing device (KSD). When a user tries to access sensitive data on the DCD, Sidekick fetches the corresponding KEK from the KSD and recovers the DEK. Once the data is decrypted with the recovered DEK, the user can read or modify that data on the DCD.

There are four requests that Sidekick can send to a KSD, namely get, store, update, and delete on a KEK. Each of the four requests (Req) has a corresponding response (Resp) from the KSD. For instance, when the DCD needs to store a new KEK on the KSD, it sends a StoreReq to the KSD with the new KEK in the payload. Once the KSD has processed that request, it responds with a StoreResp, which contains a KEKID, a unique KEK identifier, in the payload. Later, when the DCD needs to fetch that KEK, it provides the KEKID it received in the StoreResp payload.

4.3.2 Securing Communications over BLE

While the main objective of Sidekick is to mitigate attacks by the crypto-attacker, we have to address the risks that arise from the use of the BLE stack. In particular, a network-attacker can exploit one of the previously reported vulnerabilities [96–98]1. The existence of such attacks is not surprising, since the security of BLE was compromised on purpose to make BLE-based SoCs power-efficient [10]. In addition, Sidekick needs to overcome the limitations of mobile operating systems' access control for the BLE stack. In particular, Sidekick needs to control access to each KEK so that two different applications cannot fetch each other's KEKs.

Transport Layer Security.
To ensure integrity and confidentiality protection for all of Sidekick's communications over BLE (to protect against the network-attacker who aims to corrupt KEKs), we used the Counter with CBC-MAC (CCM) mode [109], since (a) all BLE-enabled SoCs implement it in hardware [10], and (b) it has been proven to be secure [75].

1As of this writing, all these attacks are still practical.

Mutual Authentication. On top of the transport layer, which is the Attribute Protocol (ATT) in the BLE stack, a well-known and studied mutual authentication protocol based on a shared secret was used [63]. The mutual authentication protocol was used for two reasons: (a) to establish a pairing key during the initial pairing between the KSD and an application on the DCD, and (b) since each application established its own key, the pairing key was used to authenticate applications in order to properly enforce access control to KEKs.

Pairing KSD and DCD. For mutual authentication to work, one should first establish a bootstrapping shared secret. Unfortunately, wearable devices are often limited in their input/output capabilities. For instance, the CC2540 SoC has only two LEDs in the default circuit design, which is why a blinking LED (BLED) approach, proposed by Saxena et al. [100], was adopted in Sidekick to establish the initial shared secret. In BLED, one device generates a pseudo-random key and shares it by controlling how an LED blinks, while the other, significantly more capable device uses a camera and converts the blinking LED into a bit stream. The security of a secret established this way matters only until a new, significantly stronger shared key is established through a key establishment protocol, such as Elliptic Curve Diffie-Hellman (ECDH). The results of benchmarking experiments with the Samsung S3 and the iPhone 4S revealed that both devices can reliably handle a bit-stream of 3 bits/s.
To make the wait time shorter for end users, while maintaining a sufficient level of security, Sidekick uses BLED to establish a 32-bit secret in about 10 seconds, which corresponds, approximately, to a six-digit PIN-code. The limited amount of RAM and processing power available in modern wearable devices makes it impossible to use the original DH protocol [51]. To overcome this limitation, Sidekick uses the ECDH protocol, based on the P128 curve.

Other Considerations. Sidekick needs to mitigate Replay and Retry attacks as well. To mitigate Replay attacks, Sidekick uses a Nonce in each message and verifies that Nonce on the recipient side. To mitigate Retry attacks, Sidekick uses a monotonically increasing number as the Nonce, which allows a recipient to detect retried old messages by comparing the number of the message in question with that of the last message sent/received thus far.

4.4 System Evaluation

4.4.1 Experimental Setup

Sidekick was evaluated with an iPhone 4S and a Samsung S3 smartphone as the DCDs, and a CC2540 SoC [11] as the KSD. The KSD side was implemented in C as firmware for the CC2540. The DCD side was implemented in the native languages of the given platforms, i.e., Objective-C for iOS and Java for Android. Sidekick had low memory requirements on the KSD side (20 KB of ROM and 4 KB of RAM), and negligible impact on smartphones. The evaluation of the system was conducted under the assumption that KEKs and DEKs are 256 bits long.

4.4.2 Latency

The overall latency is defined as the time span from the moment an application on the DCD submits a request to the moment the application receives the corresponding response from the KSD. The overall latency consists of two parts: (a) communication latency – the time spent on completing a request over BLE, and (b) computation latency – the time spent on all required calculations, such as message encryption and decryption.
Considering that the benchmarking experiments revealed that communication latency is several orders of magnitude higher, the computation latency is omitted from further discussion.

There are several fundamental factors in BLE that impact communication latency. First, the maximum transmission unit size at the ATT layer, referred to as MTU_ATT in the BLE specification [10], limits the number of bytes one can fit into a single ATT packet. Second, the BLE specification defines a connection interval (CInterval), i.e., a time window for a single packet. Finally, BLE defines a connection interval latency (CLatency), which defines the maximum number of allowed connection intervals without a message before the connection is considered closed. Note that, to maintain the connection, the BLE stack sends so-called empty PDUs in unused connection intervals. CLatency allows devices to skip sending empty PDUs to conserve energy, by allowing the wearable device to stay longer in the most power-efficient modes.

The results of sniffing on the BLE connection setup process revealed that the CC2540 SoC and Android OS smartphones supported up to 23 bytes in each packet, while the iPhone 4S allowed up to 132 bytes in a single packet. The default values for CInterval were 30 ms and 48.5 ms for Android and iOS, respectively. Finally, both platforms used zero as the default value for CLatency, i.e., skipping connection intervals was not allowed by default.

Table 4.1 provides a summary of latency for each of the four supported requests. These results suggest that with the default configuration of the BLE stack in both iOS and Android, Sidekick introduces less than a second of delay into the DEK recovery process.
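As a rough illustration of how these factors combine, one can model the air time of a one-way transfer as one ATT packet per connection interval. This first-order sketch is an assumption for intuition only; the measured latencies in Table 4.1 also include the request/response round trip and BLE stack overheads, which this model ignores.

```python
# First-order BLE air-time model (illustrative assumption, not the thesis's
# measurement methodology): a payload is split into ATT packets of at most
# MTU_ATT bytes, and one packet is sent per connection interval.
import math

def ble_air_time_ms(payload_bytes: int, mtu_att: int, cinterval_ms: float) -> float:
    """Lower bound on one-way transfer time for a single payload."""
    packets = max(1, math.ceil(payload_bytes / mtu_att))
    return packets * cinterval_ms

# A 32-byte KEK over the 23-byte Android MTU at the default 30 ms interval
# needs two packets, i.e., at least 60 ms of air time before any overheads.
assert ble_air_time_ms(32, 23, 30) == 60
# A 4-byte KEKID fits in one packet.
assert ble_air_time_ms(4, 23, 30) == 30
```

The gap between these lower bounds and the measured values in Table 4.1 reflects connection setup, the response leg, and per-request protocol overheads.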
Considering that this delay is significantly smaller than the process of unlocking a smartphone with a secret, as was shown in Chapter 2, one can hide this delay by fetching the KEK during the unlocking process.

Table 4.1: Overall latency for each of the four request/response message pairs, for the default values of CInterval and CLatency.

  Req/Resp                    Overall Latency, ms
  (Payload Length, bytes)     Android OS    iOS
  Store (32/4)                873           540
  Retrieve (4/32)             873           540
  Update (32/0)               873           540
  Delete (4/0)                776           480

4.4.3 Power Consumption

Smartphones. Power consumption is a crucial property of a system that is meant to be used in smartphones. Draining too much power would make the proposal less appealing to end users. The results of the laboratory experiment on power consumption revealed that retrieving a single 256-bit KEK consumes approximately 5.7 µAh on smartphones, or about 0.0004% of their battery capacity. That is, if a user unlocks their smartphone 100 times a day and a KEK is fetched during each unlock, that would consume 0.04% of battery capacity. Considering that users tend to charge their smartphones on a daily basis, one can completely ignore the power consumption overhead that Sidekick introduces.

Wearables. The CC2540 SoC is based on the 8-bit 8051 CPU, which provides great flexibility in power consumption through four power modes: Active Mode, and Modes 1-3. In theory, the CC2540 can run for 9 hours in Active Mode and up to 30 years in Mode 3 on a single CR2032 battery [69]. Of course, in practice the battery lifespan depends on the specific firmware. Power-consumption experiments revealed that retrieving a single KEK consumes approximately 0.12 µAh of battery capacity on the CC2540, which corresponds to approximately 0.00005% of CR2032 capacity. That is, a single CR2032 battery allows the KSD to receive and process about 2 million requests from the DCD.

To assess battery lifespan more precisely, one needs to define a daily workload, i.e., the number of requests sent per day to the KSD.
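The back-of-the-envelope percentages above can be reproduced directly. The battery capacities used below (a 1,440 mAh smartphone battery and a 240 mAh CR2032 cell) are assumed nominal values consistent with the quoted percentages, not figures taken from the experiments.

```python
# Consistency check of the power-consumption figures quoted above.
# Battery capacities are assumed nominal values, not experimental data.
SMARTPHONE_MAH = 1440.0        # assumed smartphone battery capacity
CR2032_MAH = 240.0             # assumed CR2032 coin-cell capacity

KEK_FETCH_PHONE_MAH = 5.7e-3   # 5.7 uAh per KEK fetch on the smartphone
KEK_FETCH_KSD_MAH = 0.12e-3    # 0.12 uAh per request on the CC2540

# 100 unlocks/day, one KEK fetch per unlock, as a share of phone battery:
daily_phone_pct = 100 * KEK_FETCH_PHONE_MAH * 100 / SMARTPHONE_MAH
# Total requests a single CR2032 can serve on the KSD side:
total_ksd_requests = CR2032_MAH / KEK_FETCH_KSD_MAH

assert round(daily_phone_pct, 2) == 0.04        # ~0.04% of battery per day
assert round(total_ksd_requests) == 2_000_000   # ~2 million requests
```

These per-request costs feed directly into the daily-workload lifespan estimates that follow.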
For example, if we consider that the KSD stores a single KEK for the entire smartphone, then we can set the expected daily workload to approximately 100 requests a day [110]. On the other hand, a banking or a business application might store multiple keys on the KSD. Four workloads were used, with 1, 10, 100, and 1,000 requests a day, to cover various Sidekick usage scenarios.

In addition, several values for CLatency were used to show how a slight increase in the overall latency impacts battery lifespan. In particular, the following four values {0, 15, 32, 48} for the CLatency parameter were evaluated. These values correspond to 540, 1000, 1500, and 2000 ms of overall latency for a KEK fetching request.

Table 4.2: CR2032 battery life in days, depending on the acceptable overall latency for a request and on the number of requests per day.

                      Maximum Latency, ms
  Requests per Day    540    1,000    1,500    2,000
  1                   14     217      443      652
  10                  14     217      442      651
  100                 14     215      434      633
  1,000               14     197      366      496

The results shown in Table 4.2 suggest that the default configuration of the BLE channel (CInterval = 30 ms and CLatency = 0) allows a CC2540-based KSD to run for only two weeks. If, however, we increase the maximum allowed latency up to 2,000 ms, then the KSD can last more than 600 days on a single battery, assuming a workload of 100 requests per day.

4.4.4 Session Key Renewal

Secrecy of the session keys is crucial for the overall security of Sidekick. Power consumption experiments on wearable devices revealed that establishing a single 128-bit session key consumes 0.044% (or 0.1 mAh) of battery. That is, a single CR2032 battery allows at most 2,272 session keys to be established with ECDH-P128.
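The session-key figures just quoted (0.1 mAh per ECDH-P128 key establishment, at most 2,272 keys per battery) can be cross-checked against the 633-day, 100-requests-per-day row of Table 4.2, assuming two key renewals per day:

```python
# Cross-check of the session-key renewal budget. The CR2032 capacity is the
# one implied by "2,272 keys at 0.1 mAh each"; the two-renewals-per-day rate
# is an assumed example workload.
capacity_mah = 2272 * 0.1                  # ~227 mAh implied CR2032 capacity
daily_requests_mah = capacity_mah / 633    # draw implied by a 633-day lifespan
daily_renewal_mah = 2 * 0.1                # two session-key renewals per day

lifespan_days = capacity_mah / (daily_requests_mah + daily_renewal_mah)
key_share = lifespan_days * daily_renewal_mah / capacity_mah

assert round(lifespan_days) == 406         # lifespan with renewals enabled
assert round(key_share * 100) == 36        # % of capacity spent on keys
```

The arithmetic reproduces the 406-day lifespan and the 36% key-establishment share discussed in this section.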
This is why it is also important to factor the energy consumption of session key establishment and renewal into the battery lifespan assessment for wearable devices.

For demonstration purposes, let us consider the following example²: a user unlocks his smartphone 100 times a day, and, in parallel, Sidekick makes a request to the KSD to fetch the KEK. Considering that the fastest unlocking secret [82] requires about 2 seconds to enter, we also assume that a two-second overall latency is acceptable. With these parameters set, the results shown in Table 4.2 suggest that the CR2032 battery will last for 633 days. Now, if we renew the session key twice a day, that corresponds to 406 days of battery lifespan, i.e., 36% of battery capacity will be spent on key establishment.

²This example closely matches the requirements of the current data protection systems in iOS and Android OS.

4.4.5 Summary

Overall, the results of the benchmark, latency, and power consumption experiments revealed that the use of wearable devices to decouple user authentication and data encryption in smartphones is a practical proposal. In particular, the latency of retrieving a KEK can be completely hidden behind the smartphone unlocking process. The evaluated proposal had insignificant impact on battery life, and the current implementation allows a KSD device to run for more than a year on a single coin-cell battery.

4.5 Related Work

There are two complementary ways of improving the security of password-based encryption. One can increase the cost of password guessing attacks either by increasing the cost of the key derivation process or by nudging users to choose passwords that are harder to guess.

The research community has also proposed other KDFs, such as ones that focus on substantially increasing the cost of each step for attackers.
For example, Boyen [111] developed a halting key derivation function, which forces an attacker to perform a substantial amount of additional computation for each guess, and thus significantly increases the overall cost of the attack, while keeping users' costs relatively low. Other proposals, e.g., [70, 104], have suggested increasing the number of PBKDF iterations to keep up with the recent improvements in the computational capabilities of modern processors. This, however, is a never-ending arms race.

Other proposals focused on designing and evaluating novel authentication methods for smartphones (e.g., [49]) that are usable yet secure. While the presented evaluation results suggest that the proposed authentication methods are usable and resilient against specific attacks, such as shoulder-surfing attacks, users still choose easy-to-guess unlocking secrets, which are comparable to 4-digit PIN-codes in complexity. In contrast, Sidekick is a KDF-agnostic system; that is, it eliminates the dependency of data encryption security on the unlocking secret. Although one can entangle a randomly generated KEK with an unlocking secret, that would still provide at least the same amount of entropy that the randomly generated KEK does.

Finally, researchers have proposed protection techniques for data-at-rest on mobile devices. For instance, D'Orazio et al. [53] proposed an approach to conceal or delete unprotected data on iPhones. In particular, the proposal generates a new key C and then uses this key to encrypt the per-file key (stored in the file's metadata block on iOS). This solution renders the file cryptographically unreadable without the key C. If the user does not store that key anywhere, then the data practically becomes cryptographically deleted. Their solution, however, requires substantial expertise from end users in order to set up the concealment. For instance, users are required to jailbreak their devices.
Sidekick, on the other hand, was designed to work transparently, with minimal user involvement. If a user can use a fitness tracker, she should be able to use Sidekick as well.

4.6 Discussion and Future Work

Overall, the evaluation results suggest that the Sidekick system is practical in terms of added latency and power consumption. Low power consumption makes the proposal attractive due to minimal maintenance effort, i.e., infrequent battery replacement and no need to change how often one charges a smartphone. Considering that the current implementation does not require any changes in the mobile OS, one can use the proposal right away. In fact, the secure-communication implementation from Sidekick is being used by the company FusionPipe.

The Sidekick system was designed for organizations that want to take control over protecting the confidentiality of the data on their employees' smartphones. Hence we have focused our evaluation on the primary technical aspects – low cost and low maintenance. There are, however, plenty of open research questions and challenges remaining.

First, having yet another device might push users away, and it is not clear whether users are embracing existing wearable devices or keeping them close by at all times. Thus far it is not clear whether a system design based on a wearable device offers better usability than a strict requirement to pick a hard-to-guess unlocking secret. Second, if such a system is adopted by end users for themselves, one should extend Sidekick with a usable fall-back mechanism that would allow recovery in case the wearable device becomes unavailable. Finally, the physical and system security of the wearable devices might need to be properly evaluated, especially when such protection is required.

While the technical evaluation of the Sidekick system showed promising results, there are still plenty of remaining research questions.
For instance, it is unclear whether users are willing to trust such a system in the first place, considering that a user will not be able to access data if something happens to the wearable device. Also, it is still unknown how often users actually use wearable devices and wear them. Finally, the usability of all user interactions involved in the configuration process of Sidekick has not been studied, and thus might potentially hinder adoption.

Fortunately, there are ways to make the proposal simpler, at least for application developers. Existing cryptographic libraries and APIs are often hard to use [36], so to address issues related to the misuse of cryptographic APIs, such as the use of ECB mode, static IVs, and static encryption keys, future research should consider integrating Sidekick with the recent EXT4 file system driver [27]. This file system has recently received an update adding support for per-file encryption [80].³ To encrypt a file in EXT4, an application developer or a user needs to declare the file as encrypted. This can be achieved either by setting file attributes through system calls or by using the e4crypt command line tool. The driver takes care of properly using cryptographic primitives, thus simplifying data encryption for developers.

When declaring a file as encrypted in EXT4, one needs to provide a so-called "encryption policy". This policy can define how the file-specific key encryption key (KEK) is derived in the Linux kernel. The actual data encryption key is then wrapped with the KEK and stored in the file's inode. Before proceeding with the requested I/O operation, the driver attempts to reconstruct the KEK and, subsequently, recover the DEK. As of this writing, the EXT4 driver supports kernel keyrings as the only source of data chunks for the KEK derivation process. As a result, both the driver and the e4crypt tool need to be extended to support Sidekick's wearables.

³The set of patches that enabled encryption in the EXT4 driver was released in 2015 with Linux kernel 4.1. The author of this thesis worked on its implementation during his internship at Google in 2015.

Integration of the Sidekick system with the EXT4 driver would provide several advantages. First, application developers will not be required to implement data encryption themselves, since it will be provided by the EXT4 driver. Second, the security research community would be able to focus on a smaller code base, i.e., the file system driver. Finally, considering that drivers often have direct access to hardware, they are able to take advantage of hardware acceleration, such as AES-NI.⁴

⁴AES-NI is a hardware implementation of AES by Intel CPUs.

4.7 Conclusion

This chapter presented the design and evaluation of Sidekick – a system that decouples data encryption and user authentication in smartphones by using wearable devices for encryption key management. The evaluation results showed that the proposed system is effective from a technical standpoint; it can be deployed across all mobile platforms right away, works on new and old devices, and does not noticeably increase power consumption or latency. However, there are still plenty of open research questions remaining, especially with regard to usability.

Sidekick was evaluated on the iOS and Android platforms and a commonly used BLE-enabled SoC (the CC2540 [11]). The results of experiments on latency and power consumption revealed that, with the session key renewed twice a day, a Key Storing Device based on the CC2540 can work for more than 400 days on a single coin-cell battery.

Chapter 5

Discussion and Conclusion

In this thesis we looked at data-at-rest security from various angles. In particular, we studied how end users use and misuse smartphone locking systems.
We then studied how application developers employ Crypto API in their applications and what kinds of mistakes they make. We then looked into how practical the use of wearable devices is for protecting sensitive data in smartphones. In what follows, we summarize the implications of the results presented in this dissertation.

The set of user studies presented in this work provides deeper insight into how and why users use (or do not use) smartphone locking systems (see Chapter 2 for more details). The results of these studies revealed that there is a gap between the design of smartphone locking systems and users' needs. First, authentication methods are still far from being able to provide both usability and security, since almost all subjects chose an easy-to-guess unlocking secret. Second, 20% of the subjects found it cumbersome to unlock their device when all they needed was a weather forecast. Although recent updates to both iOS and Android allow certain features of the phone to be used while in the locked state (e.g., the camera), there are still plenty of other services and applications that could move into the same accessibility domain, e.g., games or an anonymous browser. Finally, the results of the studies revealed that users are targeted by attackers from their social circle; thus, home might not be as safe as previously thought.

These results warrant further research into usable authentication methods and more flexible application access control systems for smartphones. While new authentication methods should be usable, they also need to be secure in environments where shoulder-surfing attacks are highly probable.

Another vector of attack on users' smartphones is through the smartphone applications that victims use. For example, if an application that stores and handles sensitive data misuses Crypto API, data might be decrypted while in transit or while stored, if a bit-by-bit image is obtained. To understand the extent of the exposure, we conducted an analysis of 132K Android applications.
The results of the analysis (see Chapter 3 for more details) revealed that the current data protection systems in smartphones are inadequate in the presence of an attacker with physical access to the device. In both cases, whether full-disk encryption is used or an application itself encrypts data, an attacker has a high chance of being able to recover the encryption key, and thus, decrypt the data.

An attacker's ability to recover an encryption key arises mainly from two limitations. First, as we showed in Chapter 2, users tend to choose unlocking secrets that are easy to guess within minutes [34, 37, 104]. Second, more than half of the calls to symmetric cipher API by developers rely on a static encryption key, which can be trivially extracted with such tools as Dex2Jar and Java Decompiler [19, 21].

Addressing both of these limitations is challenging. On the user side, one needs to understand how to improve the complexity of unlocking secrets without degrading usability. Unfortunately, while new technologies have promised to address certain limitations, they appear not to be as effective [45]. On the application developers' side of the issue, not much progress has been made. To complicate matters, the problem of Crypto API misuse is still far from being fully understood by the research community.

While our studies present interesting results, e.g., that almost all calls to Crypto API are made by a small set of libraries, there is still a gap in our understanding of what that means from a practical perspective. Furthermore, manual in-depth analysis showed that the existing approach based on static analysis suffers from a new type of false positive – the functional false positive. A functional false positive is a case where a misuse has no security implications.
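To illustrate why a static key is such a weak point: once an APK is decompiled (e.g., with Dex2Jar and a Java decompiler), a hardcoded key is just a string literal next to the cipher call. The pattern scan below is a toy sketch over a fabricated snippet, not the static-analysis pipeline used in Chapter 3.

```python
# Toy illustration of static-key extraction from decompiled Java source.
# The snippet and the regex are fabricated for demonstration; real analysis
# (as in Chapter 3) works on Dalvik bytecode, not a string search.
import re

decompiled = '''
SecretKeySpec key = new SecretKeySpec("s3cr3t_static_key".getBytes(), "AES");
Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
'''

# Find string literals passed directly into SecretKeySpec constructors.
pattern = re.compile(r'new\s+SecretKeySpec\("([^"]+)"')
hardcoded = pattern.findall(decompiled)
assert hardcoded == ["s3cr3t_static_key"]
```

Even this naive scan recovers the key, which is why static keys offer no protection against an attacker holding the application binary.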
For example, in recent years binary code obfuscation has become standard practice, and while an obfuscator encrypts certain parts of a binary with a static key (i.e., formally misuses Crypto API), the goal is not to protect confidentiality, but rather to make the reverse engineering task more complex. Future research should not only expand the set of rules for Crypto API misuse, but should also look into techniques that can be used to detect functional false positives.

While application developers used static encryption keys, end users struggled with creating hard-to-guess passwords. Both of these issues rendered sensitive data-at-rest insecure. With the design of the Sidekick system (presented in Chapter 4) we aimed to evaluate whether using wearable devices for encryption key management is viable from a practical point of view. We define practical as (a) a solution that is unnoticeable from a latency point of view, and (b) one that does not drain a significant amount of power from either the smartphone or the wearable. We envisioned Sidekick being deployed as a service on wearable devices that are already in use, such as a FitBit or an Apple Watch. While the results of the experiments showed that wearable devices could provide a practical solution, it is still unknown whether smartphone users would adopt such an approach.

Bibliography

[1] Smart phone thefts rose to 3.1 million last year, Consumer Reports finds. Last accessed April 22, 2015. → pages 1
[2] Announcement of new initiatives to combat smartphone and data theft. Last accessed May 12, 2015. → pages 1, 101
[3] Find My iPhone. Last accessed February 4, 2012. → pages 25
[4] Phone Theft In America. Last accessed April 22, 2015. → pages 1, 101
[5] 2014 Cost of Data Breach Study. Last accessed April 22, 2015. → pages 1, 100, 101
[6] Bluetooth SIG Analyst Digest Q4 2012. → pages 109
[7] Elcomsoft iOS Forensic Toolkit, 2012. Accessed February 15, 2013. → pages 103
[8] Dashboards | Android Developers, 2012. Accessed July 18, 2012. → pages 31
[9] Symantec Smartphone Honey Stick Project, 2012.
→ pages 1, 50
[10] Specification of the Bluetooth System 4.1, 2013. Accessed Feb 08, 2013. → pages 102, 105, 110, 112
[11] Bluetooth Low Energy System on Chip CC2540, 2013. Accessed February 15, 2013. → pages 112, 119
[12] TrueCrypt - Free Open-Source On-The-Fly Disk Encryption Software for Windows 7/Vista/XP, Mac OS X and Linux, 2013. Accessed February 15, 2013. Version 7.1a. → pages 107
[13] Bluetooth Smart CC2541 SensorTag, 2014. → pages 109
[14] BitLocker Drive Encryption, 2014. Accessed February 26, 2014. → pages 107
[15] 2 Billion Consumers Worldwide to Get Smart(phones) by 2016, 2014. → pages 1
[16] Nike+, 2014. Accessed February 26, 2014. → pages 109
[17] Apktool - A tool for reverse engineering Android apk files (Version 2.01), July 2015. Last accessed June 29, 2015. → pages 4, 63
[18] The Legion of the Bouncy Castle, July 2015. Last accessed June 29, 2015. → pages 59
[19] Tools to work with Android .dex and Java .class files, July 2015. Last accessed June 29, 2015. → pages 59, 121
[20] Dalvik bytecode, July 2015. Last accessed October 29, 2016. → pages 63
[21] Yet another fast Java Decompiler, July 2015. Last accessed June 29, 2015. → pages 121
[22] Spongy Castle - repackage of Bouncy Castle for Android, July 2015. Last accessed June 29, 2015. → pages 59
[23] Reverse engineering, Malware and goodware analysis of Android applications ... and more (ninja !), November 2016. Last accessed November 16, 2016. → pages 53, 63
[24] Android now has 1.4 billion 30-day active users globally, 2016. Accessed August 2, 2016. → pages 1
[25] Direct APK Downloader, 2017. → pages 61, 104
[26] CUDA | GeForce, 2017. → pages 2
[27] Ext4 file system, 2017. → pages 118
[28] Summary of the HIPAA Security Rule, 2017. → pages 100
[29] java.lang.reflect (Java Platform SE 8) - Provides classes and interfaces for obtaining reflective information about classes and objects, 2017.
→ pages 64, 92, 94
[30] Java Cryptography Architecture Oracle Providers Documentation for Java Platform Standard Edition 7, May 2017. Last accessed May 15, 2017. → pages 59
[31] SecureRandom (Java Platform SE 7), 2017. → pages 58
[32] Soot - a framework for analyzing and transforming Java and Android applications, 2017. → pages 9
[33] hashcat - Advanced Password Recovery, 07 2018. → pages 2, 102
[34] D. Abalenkovs, P. Bondarenko, V. K. Pathapati, A. Nordbø, D. Piatkivskyi, J. E. Rekdal, and P. B. Ruthven. Mobile Forensics: Comparison of extraction and analyzing methods of iOS and Android. Master Thesis, Gjøvik University College, 2012. → pages 46, 121
[35] Y. Acar, M. Backes, S. Fahl, D. Kim, M. L. Mazurek, and C. Stransky. You get where you're looking for: The impact of information sources on code security. In Security and Privacy (SP), 2016 IEEE Symposium on, pages 289–305. IEEE, 2016. → pages 54
[36] Y. Acar, M. Backes, S. Fahl, S. Garfinkel, D. Kim, M. L. Mazurek, and C. Stransky. Comparing the usability of cryptographic APIs. In Proceedings of the 38th IEEE Symposium on Security and Privacy, 2017. → pages 52, 98, 118
[37] Apple. iOS Security, 8.1 and up, 2014. Accessed April 26, 2015. → pages 2, 46, 57, 121
[38] S. Arzt, S. Rasthofer, C. Fritz, E. Bodden, A. Bartel, J. Klein, Y. Le Traon, D. Octeau, and P. McDaniel. FlowDroid: Precise context, flow, field, object-sensitive and lifecycle-aware taint analysis for Android apps. In Proceedings of the 35th ACM SIGPLAN Conference on Programming Language Design and Implementation, PLDI '14, pages 259–269, New York, NY, USA, 2014. ACM. ISBN 978-1-4503-2784-8. doi:10.1145/2594291.2594299. → pages 96
[39] M. Backes, S. Bugiel, and E. Derr. Reliable third-party library detection in Android and its security applications. In Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, CCS '16, pages 356–367, New York, NY, USA, 2016. ACM. ISBN 978-1-4503-4139-4. doi:10.1145/2976749.2978333.
→ pages 3, 54, 68, 70, 90, 93
[40] M. Bellare and P. Rogaway. Introduction to modern cryptography, 2017. → pages 56
[41] M. Bellare, T. Ristenpart, and S. Tessaro. Multi-instance security and its application to password-based cryptography. In Advances in Cryptology–CRYPTO 2012, pages 312–329. Springer, 2012. → pages 57
[42] N. Ben-Asher, N. Kirschnick, H. Sieger, J. Meyer, A. Ben-Oved, and S. Möller. On the need for different security methods on mobile phones. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, MobileHCI '11, pages 465–473, New York, NY, USA, 2011. ACM. ISBN 978-1-4503-0541-9. doi:10.1145/2037373.2037442. → pages 42
[43] B. Bichsel, V. Raychev, P. Tsankov, and M. Vechev. Statistical deobfuscation of Android applications. In Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, CCS '16, pages 343–355, New York, NY, USA, 2016. ACM. ISBN 978-1-4503-4139-4. doi:10.1145/2976749.2978422. → pages 66, 93
[44] D. W. Binkley and K. B. Gallagher. Program slicing. Advances in Computers, 43:1–50, 1996. → pages 65
[45] I. Cherapau, I. Muslukhov, N. Asanka, and K. Beznosov. On the impact of Touch ID on iPhone passcodes. In Proceedings of the Symposium on Usable Privacy and Security, SOUPS '15, page 20, July 22-24 2015. → pages 3, 6, 43, 49, 121
[46] E. Chin, A. P. Felt, V. Sekar, and D. Wagner. Measuring user confidence in smartphone security and privacy. In Proceedings of the Eighth Symposium on Usable Privacy and Security, SOUPS '12, pages 1:1–1:16, New York, NY, USA, 2012. ACM. ISBN 978-1-4503-1532-6. doi:10.1145/2335356.2335358. → pages 41
[47] R. Cytron, J. Ferrante, B. K. Rosen, M. N. Wegman, and F. K. Zadeck. Efficiently computing static single assignment form and the control dependence graph. ACM Transactions on Programming Languages and Systems (TOPLAS), 13(4):451–490, 1991. → pages 64
[48] A. De Luca, M. Langheinrich, and H. Hussmann.
Towards understanding ATM security: a field study of real world ATM use. In Proceedings of the Sixth Symposium on Usable Privacy and Security, SOUPS '10, pages 16:1–16:10, New York, NY, USA, 2010. ACM. ISBN 978-1-4503-0264-7. → pages 40
[49] A. De Luca, A. Hang, F. Brudy, C. Lindner, and H. Hussmann. Touch me once and I know it's you!: implicit authentication based on touch screen patterns. In Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems, CHI '12, pages 987–996, New York, NY, USA, 2012. ACM. ISBN 978-1-4503-1015-4. doi:10.1145/2208516.2208544. → pages 49, 116
[50] A. De Luca, M. Harbach, E. von Zezschwitz, M.-E. Maurer, B. E. Slawik, H. Hussmann, and M. Smith. Now you see me, now you don't: Protecting smartphone authentication from shoulder surfers. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '14, pages 2937–2946, New York, NY, USA, 2014. ACM. ISBN 978-1-4503-2473-1. doi:10.1145/2556288.2557097. → pages 49
[51] W. Diffie and M. Hellman. New directions in cryptography. IEEE Transactions on Information Theory, IT-22:644–654, 1976. → pages 111
[52] D. Dolev, C. Dwork, and M. Naor. Non-malleable cryptography. In SIAM Journal on Computing. Citeseer, 1998. → pages 3, 53
[53] C. D'Orazio, A. Ariffin, and K. K. R. Choo. iOS anti-forensics: How can we securely conceal, delete and insert data? In System Sciences (HICSS), 2014 47th Hawaii International Conference on, pages 4838–4847, Jan 2014. doi:10.1109/HICSS.2014.594. → pages 116
[54] T. Dorflinger, A. Voth, J. Kramer, and R. Fromm. "My Smartphone is a Safe!" - The User's Point of View Regarding Novel Authentication Methods and Gradual Security Levels on Smartphones. In SECRYPT 2010 - Proceedings of the International Conference on Security and Cryptography, Athens, Greece, July 26-28, 2010. SECRYPT is part of ICETE - The International Joint Conference on e-Business and Telecommunications, pages 155–164. SciTePress, 2010. → pages 42
[55] P. Dunphy, A. P.
Heiner, and N. Asokan. A closer look at recognition-based graphical passwords on mobile devices. In Proceedings of the Sixth Symposium on Usable Privacy and Security, SOUPS '10, pages 3:1–3:12, New York, NY, USA, 2010. ACM. ISBN 978-1-4503-0264-7. doi:10.1145/1837110.1837114. → pages 40
[56] M. Dürmuth, T. Güneysu, M. Kasper, C. Paar, T. Yalcin, and R. Zimmermann. Evaluation of Standardized Password-Based Key Derivation against Parallel Processing Platforms, pages 716–733. Springer Berlin Heidelberg, Berlin, Heidelberg, 2012. ISBN 978-3-642-33167-1. doi:10.1007/978-3-642-33167-1_41. → pages 2, 6
[57] M. Egele, D. Brumley, Y. Fratantonio, and C. Kruegel. An empirical study of cryptographic misuse in Android applications. In Proceedings of the 2013 ACM SIGSAC Conference on Computer & Communications Security, pages 73–84. ACM, 2013. → pages xv, 3, 4, 7, 8, 9, 53, 55, 60, 61, 62, 64, 69, 74, 97
[58] S. Egelman, A. Sotirakopoulos, I. Muslukhov, K. Beznosov, and C. Herley. Does my password go up to eleven?: The impact of password meters on password selection. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '13, pages 2379–2388, New York, NY, USA, 2013. ACM. ISBN 978-1-4503-1899-0. doi:10.1145/2470654.2481329. → pages 49
[59] M. Eiband, M. Khamis, E. von Zezschwitz, H. Hussmann, and F. Alt. Understanding shoulder surfing in the wild: Stories from users and observers. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI '17, pages 4254–4265, New York, NY, USA, 2017. ACM. ISBN 978-1-4503-4655-9. doi:10.1145/3025453.3025636. → pages 49
[60] W. Enck, P. Gilbert, B.-G. Chun, L. P. Cox, J. Jung, P. McDaniel, and A. N. Sheth. TaintDroid: an information-flow tracking system for realtime privacy monitoring on smartphones. In Proceedings of the 9th USENIX Conference on Operating Systems Design and Implementation, OSDI '10, pages 1–6, Berkeley, CA, USA, 2010. USENIX Association. → pages 96
[61] W. Enck, D. Octeau, P.
McDaniel, and S. Chaudhuri. A study of androidapplication security. In USENIX security symposium, volume 2, page 2,2011. → pages 61[62] S. Fahl, M. Harbach, T. Muders, L. Baumgärtner, B. Freisleben, andM. Smith. Why eve and mallory love android: An analysis of android ssl(in) security. In Proceedings of the 2012 ACM conference on Computerand communications security, pages 50–61. ACM, 2012. → pages 53, 95[63] N. Ferguson, B. Schneier, and T. Kohno. Cryptography Engineering:Design Principles and Practical Applications. John Wiley & Sons, 2011.→ pages 110[64] F. Fischer, K. Böttinger, H. Xiao, C. Stransky, Y. Acar, M. Backes, andS. Fahl. Stack overflow considered harmful? the impact of copy&paste onandroid application security. In Security and Privacy (SP), 2017 IEEESymposium on, pages 121–136. IEEE, 2017. → pages 54130[65] S. R. Fluhrer, I. Mantin, and A. Shamir. Weaknesses in the key schedulingalgorithm of rc4. In Revised Papers from the 8th Annual InternationalWorkshop on Selected Areas in Cryptography, SAC ’01, pages 1–24,London, UK, UK, 2001. Springer-Verlag. ISBN 3-540-43066-0. URL → pages 81, 97[66] A. Forget, S. Chiasson, P. C. van Oorschot, and R. Biddle. Improving textpasswords through persuasion. In Proceedings of the 4th Symposium onUsable Privacy and Security, SOUPS ’08, pages 1–12, New York, NY,USA, 2008. ACM. ISBN 978-1-60558-276-4.doi:10.1145/1408664.1408666. URL → pages 49[67] B. G. Glaser. Theoretical sensitivity : advances in the methodology ofgrounded theory. Sociology Press, Mill Valley, CA, 1978. → pages 16[68] K. Glen. iOS 5.1 Reaches 61% Adoption in Just 15 Days.,2012. Accessed July 18, 2012. → pages 31[69] C. Gomez, J. Oller, and J. Paradells. Overview and evaluation of bluetoothlow energy: An emerging low-power wireless technology. Sensors, 12(9):11734–11753, 2012. ISSN 1424-8220. doi:10.3390/s120911734. →pages 109, 114[70] P. A. Grassi, E. M. Newton, R. A. Perlner, A. R. Regenscheid, W. E. Burr,J. P. Richer, N. B. Lefkovitz, J. M. 
Danker, Y.-Y. Choong, K. Greene, et al.Digital identity guidelines: Authentication and lifecycle management.Technical report, 2017. URL →pages 58, 116[71] E. Hayashi, J. Hong, and N. Christin. Security through a different kind ofobscurity: evaluating distortion in graphical authentication schemes. InProceedings of the 2011 annual conference on Human factors incomputing systems, CHI ’11, pages 2055–2064, New York, NY, USA,2011. ACM. ISBN 978-1-4503-0228-9.doi: URL → pages 40131[72] E. Hayashi, O. Riva, K. Strauss, A. J. B. Brush, and S. Schechter.Goldilocks and the two mobile devices: going beyond all-or-nothingaccess to a device’s applications. In Proceedings of the Eighth Symposiumon Usable Privacy and Security, SOUPS ’12, pages 2:1–2:11, New York,NY, USA, 2012. ACM. ISBN 978-1-4503-1532-6.doi:10.1145/2335356.2335359. URL → pages 7, 42, 46, 48[73] E. Hayashi, S. Das, S. Amini, J. Hong, and I. Oakley. Casa: Context-awarescalable authentication. In Proceedings of the Ninth Symposium on UsablePrivacy and Security, SOUPS ’13, pages 3:1–3:10, New York, NY, USA,2013. ACM. ISBN 978-1-4503-2319-2. doi:10.1145/2501604.2501607.URL → pages 47[74] I. Ion, N. Sachdeva, P. Kumaraguru, and S. Capkun. Home is Safer thanthe Clould! Privacy Concerns for Consumer Cloud Storage. InProceedings of Symposium on Usable Privacy and Security, pages 1–20,Pittsburgh, PA, USA, July 2011. URL →pages 26[75] J. Jonsson. On the security of CTR+ CBC-MAC. In selected Areas inCryptography, pages 76–93. Springer, 2003. → pages 110[76] B. Kaliski. Pkcs #5: Password-based cryptography specification version2.0, 2000. → pages 57[77] S. Komanduri, R. Shay, P. G. Kelley, M. L. Mazurek, L. Bauer,N. Christin, L. F. Cranor, and S. Egelman. Of passwords and people:Measuring the effect of password-composition policies. In Proceedings ofthe SIGCHI Conference on Human Factors in Computing Systems, CHI’11, pages 2595–2604, New York, NY, USA, 2011. ACM. ISBN978-1-4503-0228-9. doi:10.1145/1978942.1979321. 
URL → pages 49[78] D. Lazar, H. Chen, X. Wang, and N. Zeldovich. Why does cryptographicsoftware fail?: A case study and open problems. In Proceedings of 5thAsia-Pacific Workshop on Systems, APSys ’14, pages 7:1–7:7, New York,NY, USA, 2014. ACM. ISBN 978-1-4503-3024-4.132doi:10.1145/2637166.2637237. URL → pages 52[79] H. Lu and Y. Li. Gesture on: Enabling always-on touch gestures for fastmobile access from the device standby mode. In Proceedings of the 33rdAnnual ACM Conference on Human Factors in Computing Systems, CHI’15, pages 3355–3364, New York, NY, USA, 2015. ACM. ISBN978-1-4503-3145-6. doi:10.1145/2702123.2702610. URL → pages 47[80] Ext4 encryption [].,2017. → pages 118[81] Z. Ma, H. Wang, Y. Guo, and X. Chen. Libradar: Fast and accuratedetection of third-party libraries in android apps. In Proceedings of the38th International Conference on Software Engineering Companion,pages 653–656. ACM, 2016. → pages 66[82] A. Mahfouz, I. Muslukhov, and K. Beznosov. Android users in the wild:Their authentication and usage behavior. Pervasive and MobileComputing, 32:50–61, 2016. → pages 44, 115[83] D. Marques, I. Muslukhov, T. Guerreiro, L. Carriço, and K. Beznosov.Snooping on mobile phones: Prevalence and trends. In Twelfth Symposiumon Usable Privacy and Security (SOUPS 2016), Denver, CO, June 2016.USENIX Association. URL → pages 6, 44, 46[84] S. McNeeley. Sensitive issues in surveys: Reducing refusals whileincreasing reliability and quality of responses to sensitive survey items. InHandbook of survey methodology for the social sciences, pages 377–396.Springer, 2012. → pages 44[85] I. Muslukhov, Y. Boshmaf, C. Kuo, J. Lester, and K. Beznosov. Knowyour enemy: the risk of unauthorized access in smartphones by insiders.In Proceedings of the 15th international conference on Human-computerinteraction with mobile devices and services, MobileHCI ’13, pages271–280, New York, NY, USA, 2013. ACM. ISBN 978-1-4503-2273-7.doi:10.1145/2493190.2493223. → pages 2133[86] M. Naveed, X. 
Zhou, S. Demetriou, X. Wang, and C. Gunter. Inside job:Understanding and mitigating the threat of external device mis-bonding onandroid. In Proceedings of the 21th Annual Network and DistributedSystem Security Symposium, NDSS Symposium’14, San Diego, CA, USA,2014. → pages 106[87] P. Oechslin. Making a faster cryptanalytic time-memory trade-off. InCrypto, volume 2729, pages 617–630. Springer, 2003. → pages 53, 57[88] G. Paolacci, J. Chandler, and P. G. Ipeirotis. Running experiments onamazon mechanical turk. Judgment and Decision Making, 5(5):411–419,2010. URL → pages32, 41[89] C. Percival, Tarsnap, and S. Josefsson. The scrypt Password-Based KeyDerivation Function draft-josefsson-scrypt-kdf-02, Aug 2015. URL → pages 57[90] A. Popov. RFC7465 - prohibiting RC4 cipher suites., Feb 2015. URL → pages 81[91] A. D. Portal. Encryption | android developers, May 2015. URL→ pages 2[92] D. Preuveneers and W. Joosen. Smartauth: Dynamic contextfingerprinting for continuous user authentication. In Proceedings of the30th Annual ACM Symposium on Applied Computing, SAC ’15, pages2185–2191, New York, NY, USA, 2015. ACM. ISBN 978-1-4503-3196-8.doi:10.1145/2695664.2695908. URL → pages 47[93] R. Raguram, A. M. White, D. Goswami, F. Monrose, and J.-M. Frahm.iSpy: automatic reconstruction of typed input from compromisingreflections. In Proceedings of the 18th ACM conference on Computer andcommunications security, CCS ’11, pages 527–536, New York, NY, USA,2011. ACM. ISBN 978-1-4503-0948-6. doi:10.1145/2046707.2046769.URL → pages 3, 46, 50134[94] O. Riva, C. Qin, K. Strauss, and D. Lymberopoulos. Progressiveauthentication: deciding when to authenticate on mobile phones. InProceedings of the 21st USENIX Security Symposium, Usenix Security’12, pages 301–316, Berkeley, CA, USA, 2012. USENIX Association. →pages 7, 42, 46, 48[95] J. Rizzo and T. Duong. Practical Padding Oracle Attacks. 
In Proceedingsof the 4th USENIX Conference on Offensive Technologies, WOOT’10,pages 1–8, Berkeley, CA, USA, 2010. USENIX Association. URL → pages 95[96] T. Rosa. Bypassing passkey authentication in bluetooth low energy.Cryptology ePrint Archive, Report 2013/309, 2013.→ pages 106, 110[97] M. Ryan. Bluetooth: With low energy comes low security. In Presented aspart of the 7th USENIX Workshop on Offensive Technologies, Berkeley,CA, 2013. USENIX. URL → pages[98] M. Ryan. Bluetooth Smart: The Good, The Bad, The Ugly and The Fix., 2013. AccessedFebruary 26, 2014. → pages 106, 110[99] H. Sasamoto, N. Christin, and E. Hayashi. Undercover: Authenticationusable in front of prying eyes. In Proceedings of the SIGCHI Conferenceon Human Factors in Computing Systems, CHI ’08, pages 183–192, NewYork, NY, USA, 2008. ACM. ISBN 978-1-60558-011-1.doi:10.1145/1357054.1357085. URL → pages 49[100] N. Saxena, J.-E. Ekberg, K. Kostiainen, and N. Asokan. Secure devicepairing based on a visual channel (short paper). In Proceedings of the2006 IEEE Symposium on Security and Privacy, SP ’06, pages 306–313,Washington, DC, USA, 2006. IEEE Computer Society. ISBN0-7695-2574-1. doi:10.1109/SP.2006.35. → pages 111[101] R. Shay, S. Komanduri, P. G. Kelley, P. G. Leon, M. L. Mazurek, L. Bauer,N. Christin, and L. F. Cranor. Encountering stronger passwordrequirements: User attitudes and behaviors. In Proceedings of the Sixth135Symposium on Usable Privacy and Security, SOUPS ’10, pages 2:1–2:20,New York, NY, USA, 2010. ACM. ISBN 978-1-4503-0264-7.doi:10.1145/1837110.1837113. URL → pages 57[102] E. Shi, Y. Niu, M. Jakobsson, and R. Chow. Implicit authenticationthrough learning user behavior. In Information Security, volume 6531 ofLecture Notes in Computer Science, pages 99–113. Springer Berlin /Heidelberg, 2011. ISBN 978-3-642-18177-1. URL → pages 42[103] S. Shuai, D. Guowei, G. Tao, Y. Tianchang, and S. Chenjie. Modellinganalysis and auto-detection of cryptographic misuse in androidapplications. 
In Dependable, Autonomic and Secure Computing (DASC),2014 IEEE 12th International Conference on, pages 75–80. IEEE, 2014.→ pages 62, 64, 95[104] A. Skillen and M. Mannan. On implementing deniable storage encryptionfor mobile devices. In Proceedings of the 20th Annual Network andDistributed System Security Symposium, NDSS Symposium’13, SanDiego, CA, USA, 2013. → pages 2, 46, 105, 116, 121[105] A. Smith. Nearly half of american adults are smartphone owners. March 5, 2012. → pages 31, 41[106] B. Ur, P. G. Kelley, S. Komanduri, J. Lee, M. Maass, M. L. Mazurek,T. Passaro, R. Shay, T. Vidas, L. Bauer, N. Christin, and L. F. Cranor. Howdoes your password measure up? the effect of strength meters onpassword creation. In Proceedings of the 21st USENIX Conference onSecurity Symposium, Security’12, pages 5–5, Berkeley, CA, USA, 2012.USENIX Association. URL → pages 49[107] E. von Zezschwitz, A. De Luca, B. Brunkow, and H. Hussmann. Swipin:Fast and secure pin-entry on smartphones. In Proceedings of the 33rdAnnual ACM Conference on Human Factors in Computing Systems, CHI’15, pages 1403–1406, New York, NY, USA, 2015. ACM. ISBN978-1-4503-3145-6. doi:10.1145/2702123.2702212. URL → pages 49136[108] S. H. Walker and D. B. Duncan. Estimation of the probability of an eventas a function of several independent variables. Biometrika, 54(1-2):167–179, 1967. → pages 38[109] D. Whiting, N. Ferguson, and R. Housley. Counter with CBC-MAC(CCM)., 2003. → pages 110[110] V. Woollaston. How often do you check your phone? the average persondoes it 110 times a day, October 2013. URL → pages 114[111] B. Xavier. Halting password puzzles – hard-to-break encryption fromhuman-memorable keys. In 16th USENIX SecuritySymposium—SECURITY 2007, pages 119–134. Berkeley: The USENIXAssociation, 2007. Available at → pages 116[112] A. K. L. Yau, K. G. Paterson, and C. J. Mitchell. 
Padding Oracle Attackson CBC-Mode Encryption with Secret and Random IVs, pages 299–319.Springer Berlin Heidelberg, Berlin, Heidelberg, 2005. ISBN978-3-540-31669-5. doi:10.1007/11502760_20. URL → pages 95[113] N. H. Zakaria, D. Griffiths, S. Brostoff, and J. Yan. Shoulder surfingdefence for recall-based graphical passwords. In Proceedings of theSeventh Symposium on Usable Privacy and Security, SOUPS ’11, pages6:1–6:12, New York, NY, USA, 2011. ACM. ISBN 978-1-4503-0911-0.doi:10.1145/2078827.2078835. URL → pages 49[114] J. Zdziarski. Identifying back doors, attack points, and surveillancemechanisms in iOS devices. Digital Investigation, 11(1):3–19, 2014. →pages 105137Appendix AUser Studies QuestionsA.1 Pre-screening Questions1. What is your gender?1. Male2. Female2. What is your age?1. under 182. 19 – 243. 25 – 304. 31 – 355. 36 – 406. 41 – 457. 46 – 501388. 51 – 559. 56 – 6010. 61 – 6511. over 653. What is your highest level of completed education?1. High-school2. University (Bachelor?s)3. Graduate School (Master?s, PhD)4. Professional School (College degree)5. Other4. How many jobs do you have and what are they? (Record each job title, industrysector)5. What is your household income?1. under 15K2. [15K,30K)3. [30K,50K)4. [50K,80K)5. more than 80K6. What is your native language?139A.2 Interview Scenario and Coding SheetA.2.1 IntroductionHello Mr/Ms Participant, thank you for taking part in our study. We appreciateyour time. This study will be in the form of an interview, and is going to beaudio recorded. The audio record will be used only for further analysis and willbe securely stored at UBC before being deleted.In this study we are investigating the use of mobile phones.If you don’t have any questions please read this consent form, and if you areagree to be interviewed today, please sign the form.Before we start the interview, could I ask you to show us your phone(s)? 
(If the participant agrees, photograph their phone(s), find out the exact model(s) and storage capacity, and write this information down.)*

* All instructions to the interviewer are in italics.

____________________

For what purposes do you use your computer and smartphone?

A.2.2 Application Types and Your Experience Today/Yesterday

Could you please describe how you have used your phone(s) today, from the moment you woke up.

How would you describe your daily smartphone usage? (Give an example if necessary.) For example, you could say that you use Application A a lot during your usual day, Application B occasionally, and so on.

What other applications do you use on your phone?

____________________

A.2.3 Application Specific Questions

Repeat these questions for each application identified in the previous section.

Let's talk about each of these applications.

1. Why do you use application XYZ on your smartphone?
2. What kind of data do you use with that application?
3. Do you use this application for personal matters or for your work?

Note: sometimes the interviewer should ask explicitly whether or not the participant saved their username and password in the application. Likewise, if you can infer what type of data an application might use and the participant didn't mention this type of data, ask him/her explicitly.

A.2.4 Data Types Specific Questions

Repeat these questions for each data type identified in the previous sections.

How confidential or sensitive are data records of type XYZ for you?

Scenario 1: Assume your smartphone got stolen by a person who knows you and whom you know. Can you answer the following questions:

Do you see any risks for you, your family, or your friends if this person can see data records of type XYZ?

Do you see any risks for you, your family, or your friends if this person can see data records of type XYZ and the corresponding application?

Scenario 2: Assume your smartphone is stolen by a person who doesn't know you and whom you don't know. Can you answer the following questions:

Do you see any risks for you, your family, or your friends if this person can see data records of type XYZ?

Do you see any risks for you, your family, or your friends if this person can see data records of type XYZ and the corresponding application?

A.2.5 Current Practices

1. How many computers do you use at home and at work?
2. How many of those computers do you connect your smartphone(s) to?
3. What actions do you take to protect your valuable, confidential, and sensitive data from the risks associated with your smartphone being stolen, broken, or lost?
4. Do you password protect your smartphone? Why?
5. Assume you have just lost your smartphone. What would you do in the first hours?
6. How soon would you get yourself a replacement phone/smartphone?
7. What would you do with the old smartphone before giving it away?
8. Have you ever lost your smartphone or mobile phone?
9. Have you ever lost any data on any device, such as a laptop, desktop, or smartphone?

IF 8 or 9 IS YES THEN ask question 10.

10. How did it change your practices in keeping data safe?

A.3 Study 2 Questionnaire

A.3.1 Part I: Consent Forms and Smartphone Task

First, participants were asked to consent to the study and the kinds of data being collected. If, however, a participant stated that his/her age was less than 19 years, a separate consent form was presented for parents/guardians. We then asked participants to follow a link on their smartphones and fill out some contact details so that we could contact them if they won the raffle. For this purpose the following message was displayed:

Please follow the link on your smartphone

As you know, we do a raffle among participants of this study for one iPad 3 (WiFi, 32GB). In order to be considered for this raffle you have to follow this link on your smartphone and provide your contact details in the form. Otherwise, because this study is anonymous, we will have no means to communicate with you if you win the prize. Enter your email and phone. Warning: you should visit this link from your smartphone; only those who did so will be considered for the raffle.

Your activation code is:

CODE: ABC123

Please click the "Continue" button after you submit your contact details from your smartphone.

A.3.2 Part II: Demographic Questions

The demographic section of the survey included the following questions:

Question A: What is your gender?

1. Male
2. Female

Question B: What is your age?

1. Under 10
2. 10-14
3. 15-17
4. 18-24
5. 25-29
6. 30-34
7. 35-39
8. 40-44
9. 45-49
10. 50-54
11. 55-59
12. 60-64
13. 65+

Question C: What is your highest level of completed education?

1. Less than or still in High School
2. High School
3. University (Bachelor's)
4. Graduate School (Master or PhD)
5. Community College or Professional School (College degree)
6. Other

Question D: List any work for which you have been paid in the past 3 months. Provide position title for each job.
(Open-ended question.)

Question E: Select in what industry(ies) you have worked for the past 3 months. (Mark all applicable)

1. None or Unemployed
2. Agriculture
3. Forestry, fishing, mining, quarrying, oil and gas (also referred to as natural resources)
4. Utilities
5. Construction
6. Manufacturing
7. Trade
8. Transportation and warehousing
9. Finance, insurance, real estate and leasing
10. Professional, scientific and technical services
11. Business, building and other support services
12. Educational services
13. Health care and social assistance
14. Information, culture and recreation
15. Accommodation and food services
16. Public administration
17. Other services

Question F: What is your annual household income in US Dollars?

1. I prefer not to answer
2. Under 5,000 USD
3. From 5,000 USD, up to 9,999 USD
4. From 10,000 USD, up to 14,999 USD
5. From 15,000 USD, up to 29,999 USD
6. From 30,000 USD, up to 49,999 USD
7. From 50,000 USD, up to 74,999 USD
8. From 75,000 USD, up to 99,999 USD
9. From 100,000 USD, up to 149,999 USD
10. More than 150,000 USD

A.3.3 Part III: Smartphone Experience

Question A: How many smartphones do you currently have and use?

1. One
2. Two
3. Three
4. More than Three

Question B: Describe the ownership of the smartphones you currently use. (Select all that apply)

1. I bought a new smartphone for personal use
2. I bought a used smartphone for personal use
3. My friend/relative gave me a smartphone as a gift
4. My smartphone was given to me by my Employer/Company for work
5. My smartphone was given to me by my Employer/Company as a gift
6. Other

Question C: Please select what kind of previous experience you have with mobile phones and smartphones. (Select all that apply)

1. I have lost my mobile phone before and didn't find it
2. I have broken my mobile phone before, so that it was not usable
3. I have left my mobile phone at some place, but recovered it later (e.g., at my friends' place, in a restaurant, at my parents' house, at school, etc.)
4. Someone used my mobile phone without my permission with the intention to look at some of my data
5. Someone used my mobile phone without my permission with the intention to use its functionality
6. Someone used my mobile phone without my permission with no bad intentions
7. I used someone's mobile phone without the owner's permission to look into his/her data
8. I used someone's mobile phone without the owner's permission for some functions (phone call, browsing the Internet)

Finally, we asked participants if they used any locks on their smartphone. In order to define what a smartphone lock is, we first provided them with the following explanation: In the following question we will ask you about your phone lock. By "phone lock" we mean a protection of the smartphone that requires some "secret", such as a password or PIN code, to unlock it. Here are some examples of phone locks (Figure A.1).

Figure A.1: Different types of smartphone locks.

Question D: Do you use any type of locks, shown above, on your smartphone(s)?

1. Yes, I use a lock
2. No, I don't use a lock

A.3.4 Part IV: Smartphone Lock Use

In this section of the survey we asked participants about the reasons they used or did not use a smartphone lock. The type of question a participant saw depended on whether they answered that they used a lock or not. We asked the following question to all of the participants who did not use a lock.

Question A: Why do you not use a lock on your smartphone(s)? (Mark all applicable)

1. I do not have any data that I want to hide on my phone
2. I do not care if my phone services will be used by someone
3. I tried locks before and found them very inconvenient
4. I often need instant access to applications that do not store any sensitive data (e.g., weather forecast, news, games)
5. I do not save my passwords in applications and type them every time I use an application that stores sensitive data (e.g., an email application, or the Facebook application)
6. It is not worth it for me to use a smartphone lock, because the amount of data and applications that are sensitive is very small compared to those that are non-sensitive

The participants who did use a lock were asked the following set of questions:

Question A: Which of the locks do you use on your smartphone(s)? (Mark all applicable)

1. PIN code (only digits)
2. Password (could have digits and letters)
3. Draw a Secret (Pattern)
4. Face recognition
5. Fingerprint scan
6. Other

Question B: How does the lock on each of your smartphones work? (Mark all applicable)

1. It locks my smartphone after I push the power button
2. It locks my smartphone after the smartphone's display switches off
3. It locks my smartphone when I am not using it for some period of time
4. It locks my smartphone after it switched off completely or rebooted (which might happen because the battery got fully discharged or the smartphone got rebooted)
5. It locks my SIM card if the phone switched off completely or rebooted (which might happen because the battery got fully discharged or the smartphone got rebooted)
6. Other

Question C: Why do you use a lock on your smartphone(s)? (Mark all applicable)

1. My employer requires that
2. I feel comfortable having such protection
3. I have confidential and sensitive data on my smartphone(s)
4. I do not want other people to use my phone services without my permission
5. I do not want other people sneaking into my smartphone when I do not see it

Question D: If your employer requires you to use a phone lock, assume for a moment that this is not the case and you can decide on your own. Rate your agreement with the following statement: "I would rather use a smartphone lock for my entire smartphone than a lock for specific applications or data items"

1. Strongly Disagree
2. Disagree
3. Neutral
4. Agree
5. Strongly Agree
6. Not Applicable

Question E: If your employer requires you to use a phone lock, assume for a moment that this is not the case and you can decide on your own.
Rate youragreement with the following statement: “My need to get fast access to someapplications or data on my smartphone influenced or will influence my decisionon phone lock use”1. Strongly Disagree2. Disagree3. Neutral4. Agree5. Strongly Agree6. Not ApplicableA.3.5 Part V: Applications and Data Being UsedQuestion A and B was asked separately for work and personal use cases. Oursurvey also allowed the participants to add new application or data types, in caseif the list was incomplete.Question A: In the past year, which applications or features have you used in yoursmartphone(s) for work or personal use? (Mark all applicable)1. SMS/MMS messages2. Voice Calling1513. Email Client4. Calendar5. Notes6. Instant messenger (e.g. GTalk, MSN Messenger, ICQ etc.)7. Social Networking application (e.g. Facebook, Tweeter, Google+, What’sUP etc.)8. Voice recorder9. Photo Camera10. Music Player11. Video Player12. Maps13. Training Assistant that helps you to track your exercise performance on themap.14. Password Keeper/Manager15. Games16. Documents viewers or editors (e.g. Word, Excel, Adobe Acrobat etc.)Question B: Select data which you created, or received, or stored on your smart-phone during last 12 months? (Mark all applicable)1. SMS/MMS Messages2. Call history3. Browser Search History1524. Browsing History5. Photos and Videos6. Voice Recordings7. Notes and Memos8. Contacts Details9. Music10. Emails11. Documents12. Events in Calendar13. Data in Social Networking Applications14. Recorded GPS tracks, from such applications as training assistants15. Progress in Games16. Passwords (that includes password managers, passwords in notes and savedpasswords in applications)Note, that in Question C as available options we used which ever data type aparticipants selected or added as an answer to Question B.Question C: In the past year, select data which you preferred to delete from yoursmartphone(s) immediately after reading/using it? 
(Mark all applicable)

A.3.6 Part VI: Password Saving Habits

Question A: How do you check your email on your smartphone(s)?

1. I open the application and see my emails immediately; the application does not ask me for a password
2. I open the application, then I type my email account password, and after that I see my emails
3. I use an Internet browser to check my emails
4. I do not check email on my phone
5. Other

Question B: With other applications, where an account is required, I usually...

1. Save my account password, so that I can open the application faster
2. Do not save the account password and type it every time
3. I do both
4. Other

A.3.7 Part VII: Data Types Sensitivity and Value

All data types that were identified in Part V of this questionnaire were listed in this section, and users had to rate their agreement with various statements depending on the proposed scenario and type of data use, i.e., personal or work. We used the following set of statements to assess a data type's sensitivity:

1. I would not have any concerns if Personal/Work DataType could be viewed by such a thief
2. I will have some concerns if Personal/Work DataType could be viewed by such a thief
3. I think such a thief will use my Personal/Work DataType for some purpose that might be detrimental to me
4. I think such a thief will not use my Personal/Work DataType for any purposes

For statements 1 and 3 we proposed the following scenario:

• Scenario 1: Assume your mobile phone has just been stolen by a thief who does not know you.

For statements 2 and 4 we proposed the following scenario:

• Scenario 2: Assume your mobile phone has just been stolen by a thief who knows you. It could be anyone from your social circle, but you do not know who exactly he or she is.

In order to compare different data types with each other, we asked the participants to rank their data based on its value. This task was done through drag and drop, where the participants modified the ranked list by moving items with their mouse up or down the list.
The participants saw the following task statement:

• Rank Task 1: Scenario 1: Assume your mobile phone has just been stolen by a thief who does not know you. Rank the data types so that you would have the biggest concern with the first item being revealed and the least concern with the last item being revealed. Use drag and drop for ordering.

• Rank Task 2: Scenario 2: Assume your mobile phone has just been stolen by a thief who knows you. It could be anyone from your social circle, but you do not know who exactly he or she is. Rank the data types so that you would have the biggest concern with the first item being revealed and the least concern with the last item being revealed. Use drag and drop for ordering.

