Mobile robots suffer from sensor data corruption due to body oscillations and disturbances. In particular, motion blur on images captured by onboard cameras can cause severe information loss; such loss may be irreversible, and deblurring is computationally costly. In this paper, a novel method is proposed to minimize the average motion blur of images captured by mobile cameras. A real-time computable motion blur metric (MMBM) is derived using only inertial sensor measurements and validated by comparison against optic flow results. To demonstrate the applicability of MMBM, a motion blur minimizing system is built on the RHex hexapod robot: an onboard camera is externally triggered based on the MMBM computed in real time while the robot walks straight on a flat surface. The resulting motion blur is compared to the blur levels of a regular, fixed frame-rate image acquisition schedule by qualitative inspection of the captured images.
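The triggering idea described above can be sketched minimally as follows. This is an illustrative approximation, not the paper's actual MMBM derivation: it uses the hypothetical stand-in metric angular speed times exposure time, reflecting the same intuition that blur grows with camera motion during the exposure window. The function names and threshold value are assumptions for illustration.

```python
# Sketch: trigger a camera only when an inertial-sensor-based blur
# estimate is low. NOT the authors' MMBM; a simplified stand-in.

def blur_metric(gyro_xyz, exposure_s):
    """Approximate expected blur as angular speed (rad/s) * exposure (s)."""
    wx, wy, wz = gyro_xyz
    speed = (wx**2 + wy**2 + wz**2) ** 0.5
    return speed * exposure_s

def should_trigger(gyro_xyz, exposure_s, threshold=0.002):
    """Fire the external camera trigger only when predicted blur is low.
    The threshold (radians of rotation during exposure) is a made-up
    value for illustration; it would be tuned per camera and lens."""
    return blur_metric(gyro_xyz, exposure_s) < threshold

# A quiet moment in the gait (low angular rates) permits capture,
# while fast body oscillation suppresses the trigger.
print(should_trigger((0.05, 0.02, 0.01), 0.01))  # True
print(should_trigger((2.0, 1.5, 0.8), 0.01))     # False
```

In a real system, the metric would be evaluated at the inertial sensor's sampling rate and the camera's external trigger line pulsed whenever the condition holds, which is the scheduling role MMBM plays in the experiments described above.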