ACADEMIC RESEARCH
IEEE-2024-0847
ROLE: Lead Author / Researcher
CONTEXT: IEEE Conference / Research Paper
YEAR: 2024
Optimizing Neural Network Inference on Edge Devices
Proposed a novel quantization technique that reduces model size by 75% while maintaining 98% accuracy, enabling real-time inference on IoT devices.
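
The reported 75% size reduction is what one would expect from converting FP32 weights to INT8. The sketch below is a minimal, generic example of symmetric post-training INT8 quantization in Python; it is illustrative only and does not reproduce the paper's specific technique (all names and parameters are assumptions).

    import numpy as np

    def quantize_int8(weights: np.ndarray):
        """Symmetric per-tensor quantization of FP32 weights to INT8.

        Returns the INT8 weights and the scale needed to dequantize.
        (Illustrative sketch, not the paper's actual method.)
        """
        # Map the largest absolute weight onto the INT8 range [-127, 127].
        scale = np.max(np.abs(weights)) / 127.0
        q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
        """Recover an FP32 approximation of the original weights."""
        return q.astype(np.float32) * scale

    if __name__ == "__main__":
        w = np.random.randn(256, 256).astype(np.float32)  # toy weight matrix
        q, scale = quantize_int8(w)
        # FP32 -> INT8 stores 1 byte per weight instead of 4: a 75% reduction.
        print(f"size reduction: {1 - q.nbytes / w.nbytes:.0%}")
        print(f"max abs error:  {np.max(np.abs(w - dequantize(q, scale))):.4f}")

Running the script prints the 75% storage reduction and the worst-case rounding error introduced by quantization, which is the quantity a technique like the one described must keep small to preserve accuracy.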