You can try exploring the execution with GDB.
First, running in calibration mode, set a breakpoint at
double LSM9DS1<IO>::calibrateGyroAcel(float max_slope)
in the file LSM9DS1_impl.h.
A good test point would be line #451, just before the sample averaging is performed inside this loop:
for (ii = 0; ii < 3; ii++) {
this->gBiasRaw[ii] = gBiasRawTemp[ii] / samples;
this->aBiasRaw[ii] = aBiasRawTemp[ii] / samples;
}
Do gBiasRaw and aBiasRaw get filled with reasonable values?
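For reference, such a session could look roughly like this. It is only a sketch: notochord is an assumed binary name, it must be built with debug symbols, the exact line number depends on your checkout, and the # lines are annotations for the reader rather than GDB commands.

gdb ./notochord
(gdb) break LSM9DS1_impl.h:451
# start with the arguments you normally use for calibration mode
(gdb) run
# once the breakpoint hits, check the inputs to the average
(gdb) print samples
(gdb) print gBiasRawTemp
(gdb) print aBiasRawTemp
# step through the three loop iterations with "next", then check the results
(gdb) print this->gBiasRaw
(gdb) print this->aBiasRaw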
Now, in normal running mode, take a look at Chordata::K_Ceptor::Bang()
(this function is rather messy right now, and it would be hard for outsiders to follow most of what is happening there; refactoring it is one of the first steps on the notochord roadmap). At line #350 it calls the actual reading of the sensors:
imu->readGyro();
imu->readAccel();
Inside these functions, aBiasRaw and gBiasRaw should be subtracted from the incoming vectors. Are these offsets consistent with what you read in the first stage, or with what's stored in the EEPROM?
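A similar session, under the same assumptions, works for this second check. If GDB cannot resolve the qualified name of Bang() directly (for instance because K_Ceptor is templated), rbreak Bang or a file:line breakpoint does the same job; imu here is the object that Bang() calls readGyro()/readAccel() on.

gdb ./notochord
(gdb) break Chordata::K_Ceptor::Bang
# start in normal running mode this time
(gdb) run
# when the breakpoint hits, inspect the offsets the reading functions will subtract
(gdb) print imu->gBiasRaw
(gdb) print imu->aBiasRaw
# let it run to the next Bang() call and compare again
(gdb) continue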