Description
This open-ended project invites students to explore Quantization-Aware Training (QAT) with PyTorch to optimize computer vision models for Arm-based mobile devices (e.g., Android smartphones).
The project centers on training a model using a non-restrictively licensed dataset and deploying it on Arm-powered mobile devices (leveraging the Android Neural Networks API).
Students will apply QAT to maintain accuracy while reducing model size and inference latency, making it suitable for real-time applications like:
- Sign language recognition for accessibility.
- Visual anomaly detection in manufacturing.
- Personal health and activity monitoring from camera feeds.
The project encourages building on existing work by contributing optimized, quantized models for Arm platforms on HuggingFace. The final quantized model will be uploaded to HuggingFace and may be submitted for listing in the Arm on HuggingFace space, encouraging open, community-supported contributions.
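As a starting point, the QAT workflow described above can be sketched with PyTorch's eager-mode quantization API. This is a minimal illustration using a toy CNN and random tensors (both assumptions, not part of the project spec); the `qnnpack` backend is the quantized engine that targets Arm CPUs.

```python
import torch
import torch.nn as nn

# Toy CNN for illustration; QuantStub/DeQuantStub mark the region to quantize
# in PyTorch's eager-mode QAT workflow.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = torch.ao.quantization.QuantStub()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.relu = nn.ReLU()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(8, 10)
        self.dequant = torch.ao.quantization.DeQuantStub()

    def forward(self, x):
        x = self.quant(x)
        x = self.pool(self.relu(self.conv(x)))
        x = self.fc(x.flatten(1))
        return self.dequant(x)

model = TinyNet().train()
# "qnnpack" is the quantized backend for Arm Cortex-A CPUs.
model.qconfig = torch.ao.quantization.get_default_qat_qconfig("qnnpack")
torch.ao.quantization.prepare_qat(model, inplace=True)

# Stand-in training loop with random data; in the real project this is
# normal training on the chosen dataset, with fake-quant observers active.
opt = torch.optim.SGD(model.parameters(), lr=0.01)
for _ in range(2):
    loss = model(torch.randn(4, 3, 32, 32)).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Convert to a true int8 model for deployment.
model.eval()
torch.backends.quantized.engine = "qnnpack"
quantized = torch.ao.quantization.convert(model)
```

The resulting `quantized` module carries int8 weights and can be exported (e.g., via TorchScript) for on-device inference; accuracy is typically close to the float model because the fake-quant observers were active during training.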
Hardware / Software Requirements
- Languages: Python, Java/Kotlin (if Android), Shell
- Frameworks: PyTorch
- Tooling: PyTorch Lightning, Android Studio
- Hardware Options:
  - Android phone with Arm Cortex-A CPU
- Deployment Targets:
  - Android
Resources
Benefits / Prizes
- Standout projects could be internally referred for relevant positions at Arm!
- If your submission is approved, you could receive a recognised badge that you can list on your CV and share on LinkedIn. A great way to stand out from the crowd!
- It's a great way to demonstrate your initiative and commitment to your field.
- It offers the opportunity to learn valuable skills that are highly relevant to a successful career at Arm!