Semester of Graduation
4
Degree
Master of Science in Computer Science (MSCS)
Department
Department of Computer Science
Document Type
Thesis
Abstract
In recent years, video conferencing has seen a significant increase in usage due to the COVID-19 pandemic. When casting a user's video to other participants, video conferencing applications (e.g., Zoom, FaceTime, Skype) mainly rely on 1) the webcam's LED indicator, 2) the user's video feedback in the software, and 3) the software's video on/off icons to remind the user whether the camera is in use. However, these methods all place the responsibility on the user to check the camera status, and numerous cases have been reported in which users inadvertently exposed their privacy because they did not realize their camera was still on. Users may simply forget that they have turned on their camera, or even temporarily forget that they are in a conference and being watched by others, let alone actively check the camera status. In this work, we explore equipping a camera with motors and using its motions to advertise its working status to the user. We implement a prototype privacy-preserving motorized camera system on a Raspberry Pi, which uses a pre-trained Histogram of Oriented Gradients (HOG) detector and a pre-trained CNN model to track the face location and estimate the user's head orientation angles in real time. A point of focus is then generated for the motorized camera to rotate to, using motors driven by a proportional-integral-derivative (PID) controller. When the camera rotates about its vertical and horizontal axes to either track the user's face or mimic the user's head orientation, its status is broadcast by the movement itself. A survey of 85 participants was conducted on their satisfaction with existing methods, and user studies with varying numbers of participants evaluated their perception of the camera with respect to privacy protection and usability. Results show that the motorized camera effectively leverages the inherent motion awareness of human users to actively remind them of its status.
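
As a minimal illustration of the control pipeline described above, the sketch below pairs dlib's pre-trained HOG face detector with a simple PID loop that converts the face's pixel offset from the image center into pan/tilt angle updates. The gains, angle limits, and the print-out in place of actual servo commands are assumptions for illustration only; the thesis's actual Raspberry Pi implementation and its CNN head-orientation model may differ.

    # Hypothetical sketch: HOG face tracking driving pan/tilt angles via PID control.
    # Gains and angle limits below are illustrative assumptions, not the thesis's values.
    import cv2
    import dlib
    import time

    class PID:
        """Basic proportional-integral-derivative controller."""
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0
            self.prev_time = time.time()

        def update(self, error):
            now = time.time()
            dt = max(now - self.prev_time, 1e-3)
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error, self.prev_time = error, now
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    detector = dlib.get_frontal_face_detector()   # pre-trained HOG face detector
    cap = cv2.VideoCapture(0)                     # Raspberry Pi camera or USB webcam
    pan_pid = PID(0.02, 0.0, 0.005)
    tilt_pid = PID(0.02, 0.0, 0.005)
    pan_angle, tilt_angle = 0.0, 0.0              # current motor angles in degrees

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray, 0)
        if faces:
            face = faces[0]
            cx = (face.left() + face.right()) / 2
            cy = (face.top() + face.bottom()) / 2
            h, w = gray.shape
            # The face's pixel offset from the image center is the PID error signal.
            pan_angle += pan_pid.update(cx - w / 2)
            tilt_angle += tilt_pid.update(cy - h / 2)
            pan_angle = max(-90.0, min(90.0, pan_angle))
            tilt_angle = max(-90.0, min(90.0, tilt_angle))
            # A real system would command the pan/tilt servos here (hardware-specific).
            print(f"pan={pan_angle:.1f} deg, tilt={tilt_angle:.1f} deg")

In this sketch the camera's visible rotation toward the detected face is what advertises that the camera is on; the same loop could instead target angles derived from a head-pose model to mimic the user's head orientation.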
Date
11-3-2022
Recommended Citation
Shrestha, Anish, "Enabling the Human Perception of a Working Camera In Web Conferences via Its Movement" (2022). LSU Master's Theses. 5691.
https://repository.lsu.edu/gradschool_theses/5691
Committee Chair
Dr. Chen Wang
DOI
10.31390/gradschool_theses.5691
Included in
Artificial Intelligence and Robotics Commons, Graphics and Human Computer Interfaces Commons, Other Computer Engineering Commons, Robotics Commons