Rideshare scooters are increasingly popular in cities, but rider safety remains a key challenge. This paper presents a situation-aware Urban Rideshare Safety (URS) framework that merges smartphone sensing with deep learning–based visual analytics. The system applies a YOLO-based object detection algorithm to the smartphone’s built-in camera feed, identifying road surfaces, sidewalks, and curbs in real time. Detection models are trained and validated on a labeled dataset of urban ride images. The framework has two layers: smartphone sensors collect data, and cloud servers run the deep learning models, generating structured visual outputs that support safety indicators. The results show that deep learning object detection is feasible in urban rideshare contexts, establishing a foundation for future multimodal systems that combine visual, vibrational, and reasoning models. By relying on accessible smartphone technology, the system lays the groundwork for intelligent assistance that improves rideshare safety.
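The mapping from per-frame detections to safety indicators could work as follows. This is a minimal sketch under stated assumptions: the class labels ("road", "sidewalk", "curb"), the confidence threshold, and the indicator messages are illustrative and not taken from the paper, and the detector output is stubbed rather than produced by an actual YOLO model.

```python
# Illustrative sketch: turning YOLO-style detections into a coarse safety
# indicator. Labels, threshold, and messages are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # detected class, e.g. "road", "sidewalk", "curb"
    confidence: float  # detector confidence in [0, 1]

def safety_indicator(detections, min_conf=0.5):
    """Return a coarse surface indicator from one frame's detections."""
    surfaces = {d.label for d in detections if d.confidence >= min_conf}
    if "sidewalk" in surfaces:
        return "warning: riding on sidewalk"
    if "curb" in surfaces:
        return "caution: curb ahead"
    if "road" in surfaces:
        return "ok: on road surface"
    return "unknown: no surface detected"

frame = [Detection("road", 0.91), Detection("curb", 0.62)]
print(safety_indicator(frame))  # → caution: curb ahead
```

In a deployment matching the paper's two-layer design, the detections would arrive from the cloud-side model and a function like this would run as the final, lightweight step that produces rider-facing indicators.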
Published in: 2nd IEOM World Congress on Industrial Engineering and Operations Management, Windsor, Canada
Publisher: IEOM Society International
Date of Conference: October 14–16, 2025
ISBN: 979-8-3507-4450-7
ISSN/E-ISSN: 2169-8767