STMicroelectronics

谢芳芳

[Q&A]

MotionMC calibration fails, all parameters keep their default values

Hi,
I'm currently trying to use the MotionEC library for an e-compass application and to calibrate the magnetometer with the MotionMC library. I'm using an STM32L1-DISCO with an LSM303AGR. My general idea is to run the calibration algorithm for x seconds, then get the calibration parameters and apply them to the e-compass calculation.
The MotionEC library works perfectly fine, but I ran into trouble with the MotionMC library.
First, I run the MotionMC_Update function with the values I read from the magnetometer. After that, I call MotionMC_GetCalParams to get the calibration parameters. But the library always returns the following default values:
HI_Bias = {0, 0, 0},
SF_Matrix = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}},
CalQuality = MMC_CALQSTATUSUNKNOWN
Here is my code (based on the ST examples).
// COMPASS_REPORT_INTERVAL = 20
// gets called every 20ms
void doCalibrationStep()
{
    SensorAxes_t mag_data_uncompensated;
    getMagneto(&mag_data_uncompensated);

    MMC_Input_t data_in;
    data_in.Mag[0] = (float) mag_data_uncompensated.AXIS_X / 10.0f;
    data_in.Mag[1] = (float) mag_data_uncompensated.AXIS_Y / 10.0f;
    data_in.Mag[2] = (float) mag_data_uncompensated.AXIS_Z / 10.0f;
    data_in.TimeStamp = timestamp * COMPASS_REPORT_INTERVAL;
    // example data_in: Mag = {-64, 5.5, 17.1000004}, TimeStamp = 0

    MotionMC_Update(&data_in);

    MMC_Output_t data_out;
    MotionMC_GetCalParams(&data_out);
    dbg("CalParams X: %f, Y: %f, Z: %f, Quality: %d\r\n",
        data_out.HI_Bias[0], data_out.HI_Bias[1],
        data_out.HI_Bias[2], data_out.CalQuality);
    // everything has default values, no matter how often I run the algo

    timestamp++;
}

void doCalibration()
{
    MotionMC_Initialize(COMPASS_REPORT_INTERVAL, 1);

    char lib_version[200];
    MotionMC_GetLibVersion(lib_version);
    dbg(lib_version);   // here the correct version is printed

    // <doCalibrationStep() gets called repeatedly at 20ms intervals until x seconds have passed>

    MotionMC_GetCalParams(&cal_params);   // all parameters are zero
}
I have absolutely no clue what's wrong. Am I missing something?
Thanks

Replies (2)

曾玲娟

2019-3-29 10:16:57
Did you rotate the sensor slowly in a figure-eight pattern through 3D space?
Or you can rotate it along each axis.
After this movement you should get the calibration values.

谢芳芳

2019-3-29 10:27:54
I did not have time in the last few weeks to look at this issue, but found time today to investigate further.
For users that might encounter this problem:
The problem was that the calibration algorithm has to run quite often before one can read calibration values. Depending on the interval this takes some time, in my case more than 10 seconds. I had run the calibration algorithm for only a few seconds and thus did not get results.
