Mobile robots suffer from sensory data corruption caused by body oscillations and motion disturbances. In particular, information loss in images captured with on-board cameras can be severe, and may be irreversible or computationally costly to compensate for. In this thesis, a novel method to minimize the average motion blur incurred by such mobile visual sensors is proposed. To this end, a motion blur metric based on inertial sensor data, MMBM, is derived. The metric can be computed in real time, and its accuracy is validated through a comparison with optic-flow-based motion blur measures. The applicability of MMBM is illustrated through a motion blur minimizing system implemented on the experimental SensoRHex hexapod robot platform, where an on-board camera is externally triggered based on MMBM values computed in real time while the robot walks straight on a flat surface. The resulting motion blur is compared to the blur levels obtained with a regular, fixed frame rate image acquisition schedule, both by qualitative inspection and by a blind, image-based blur metric computed on the captured images. The MMBM-based motion blur minimization system, through an appropriate modulation of the frame acquisition timing, not only reduces average motion blur but also avoids frames with extreme motion blur, resulting in a promising, real-time motion blur compensation approach.
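
The sketch below illustrates the triggering idea described above, not the thesis's actual MMBM derivation: predicted image-plane blur is approximated from instantaneous angular and linear body rates scaled by focal length, scene depth, and exposure time, and a frame is requested only when the prediction falls below a threshold. All names (read_gyro, read_accel_velocity, trigger_camera) and all parameter values are hypothetical stand-ins, assumed here for illustration only.

    import math
    import random
    import time

    # Assumed camera and scene parameters (illustrative values, not from the thesis).
    FOCAL_LENGTH_PX = 500.0   # focal length in pixels
    SCENE_DEPTH_M = 1.0       # average scene depth in meters
    EXPOSURE_S = 0.01         # exposure time in seconds
    BLUR_THRESHOLD_PX = 2.0   # capture only when predicted blur is below this

    def read_gyro():
        """Hypothetical IMU read: body angular rates (rad/s). Simulated here."""
        return [random.gauss(0.0, 0.5) for _ in range(3)]

    def read_accel_velocity():
        """Hypothetical body-velocity estimate (m/s), e.g. from integrated accelerometer data."""
        return [random.gauss(0.0, 0.2) for _ in range(3)]

    def predicted_blur_px(omega, v):
        """Crude predicted motion blur extent (pixels) over one exposure.

        Rotational blur scales with angular speed and focal length; translational
        blur scales with linear speed projected through the scene depth. This is
        a simplified proxy, not the MMBM formula itself.
        """
        ang = math.sqrt(sum(w * w for w in omega))
        lin = math.sqrt(sum(x * x for x in v))
        return (ang * FOCAL_LENGTH_PX + lin * FOCAL_LENGTH_PX / SCENE_DEPTH_M) * EXPOSURE_S

    def trigger_camera():
        """Hypothetical external camera trigger."""
        print("frame captured")

    def acquisition_loop(duration_s=1.0, imu_rate_hz=200):
        """Poll the IMU and fire the camera only at low predicted-blur instants."""
        deadline = time.time() + duration_s
        while time.time() < deadline:
            blur = predicted_blur_px(read_gyro(), read_accel_velocity())
            if blur < BLUR_THRESHOLD_PX:
                trigger_camera()
            time.sleep(1.0 / imu_rate_hz)

    if __name__ == "__main__":
        acquisition_loop()

The key design point this mirrors is the shift from a fixed frame rate schedule to an inertially informed one: frames are shifted toward low-disturbance instants of the gait, which is what allows both the average blur reduction and the avoidance of extreme-blur frames reported above.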