Semester of Graduation

Summer 2021

Degree

Master of Computer and Information Science (MCIS)

Department

Division of Computer Science and Engineering

Document Type

Thesis

Abstract

Smartphones continue to proliferate throughout our daily lives, not only in sheer quantity but also in their ever-growing list of uses. They are no longer just for communication and the occasional phone game: smartphones can be used to open garage doors, transfer money, see who is at your front door, and much more. With this increased dependence, smartphone security is critical. In this paper we propose a system that verifies a user’s identity by applying a convolutional neural network (CNN) model to an image of the user’s hand as it holds the device. This model aims to address situations where other, more common biometrics are inaccessible or inconvenient, such as when wearing a face mask or when contact-free verification is necessary. It is also designed for secure smartphone reader transactions such as mobile payments, boarding public transport with mobile tickets, or building access using phone-based identification.

Our proposed system uses an image of the hand gripping the smartphone, together with the smartphone's user ID token, to verify that the person holding the device is the person the device claims them to be. To do this, the system is built from two major components: image preprocessing and our CNN model. Upon receiving the image and ID token as input, the system first runs the image through its preprocessing procedure: the image is resized, its background is removed, and its illumination is normalized, which eliminates background inconsistencies, evens out the lighting, and emphasizes the hand features. The preprocessed image is then fed to our CNN model, which outputs a user classification prediction. Finally, the prediction is compared against the user ID token to determine whether the identity of the user holding the phone matches the token presented by the smartphone. We adapted the system for both user classification and user verification, achieving up to 100% accuracy with a 0.02 loss for verification and 94% accuracy with a 0.049 loss for classification.
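The pipeline above (preprocess, classify, compare against the ID token) can be sketched in miniature. Everything in this sketch is a hypothetical stand-in: the nearest-neighbour resize, the thresholding "background removal", the brightness rescaling, and the `TinyModel` nearest-template classifier are placeholders for the thesis's real preprocessing and CNN, shown only to illustrate how the stages compose into a verification decision.

```python
# Illustrative sketch only -- none of these functions are the thesis's
# actual implementation; they are minimal stand-ins for each stage.

def resize(image, size):
    # Naive nearest-neighbour resize of a 2-D list of pixel values.
    h, w = len(image), len(image[0])
    th, tw = size
    return [[image[r * h // th][c * w // tw] for c in range(tw)]
            for r in range(th)]

def remove_background(image, threshold=30):
    # Crude background removal: zero out pixels darker than a threshold.
    return [[p if p >= threshold else 0 for p in row] for row in image]

def normalize_illumination(image):
    # Rescale so the brightest remaining pixel maps to 255.
    peak = max(max(row) for row in image) or 1
    return [[p * 255 // peak for p in row] for row in image]

class TinyModel:
    # Stand-in for the CNN: predicts the enrolled user whose stored
    # template image is closest (sum of absolute differences).
    def __init__(self, templates):
        self.templates = templates  # {user_id: preprocessed image}

    def predict(self, image):
        def dist(template):
            return sum(abs(a - b)
                       for trow, irow in zip(template, image)
                       for a, b in zip(trow, irow))
        return min(self.templates, key=lambda u: dist(self.templates[u]))

def verify(image, id_token, model, size=(4, 4)):
    # Preprocessing: resize -> background removal -> illumination.
    x = normalize_illumination(remove_background(resize(image, size)))
    # Verification: the model's prediction must match the phone's token.
    return model.predict(x) == id_token
```

As a usage example, enrolling two toy "users" and verifying a capture against each token shows the classification-then-comparison step: `verify(img, "alice", model)` is true only when the predicted user equals the presented token.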

Committee Chair

Chen Wang

DOI

10.31390/gradschool_theses.5423
