People with low vision (LV) experience challenges in visually tracking balls and players in sports like basketball and tennis, which can adversely impact their health. While existing research has studied video analysis for sports viewing, proposed camera-based assistance for casual non-ball sports, and contributed third-person-perspective sports datasets, none of this work is suitable for enhancing first-person sports playability. We present ARSports, a wearable AR research prototype that overlays instance segmentation masks in real time to improve sports accessibility. ARSports also includes first-person-perspective sports datasets, which we manually collected and annotated, along with instance segmentation models fine-tuned on them. Our evaluations suggest that combining real-time computer vision and augmented reality to create scene-aware visual augmentations is a promising approach to enhancing sports participation for LV individuals. We contribute open-sourced egocentric basketball and tennis datasets and models, as well as insights and design recommendations from a pilot study with an LV member of our research team.