I took the Y-axis value of the accelerometer and converted it to degrees using abs(y) * 90 plus an offset. So, for example, if the accelerometer reads 0.5 or -0.5 on the Y-axis, that converts to 45°. If I zero out the accelerometer, that 45° is then treated as 0°, and any forward tilt is negative while any backward tilt is positive. I use that value for the slope calculation.
Now, that works perfectly fine when standing still: when I angle the phone at 45°, the slope reads 100%.
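For concreteness, that conversion can be sketched like this (the `zero_offset` parameter and the tan-based grade formula are my reading of the setup, not code from the question):

```python
import math

def tilt_degrees(y_axis, zero_offset=0.0):
    # Linear mapping described above: |y| * 90 gives the tilt angle
    # (y normalized to +-1 g), and the calibration offset re-centers it
    # so the phone's mounted position reads 0 degrees. With the offset
    # set at 45 degrees, tilting forward goes negative, backward positive.
    return abs(y_axis) * 90.0 - zero_offset

def slope_percent(angle_deg):
    # Road grade in percent: tan(45 degrees) = 1.0, i.e. a 100% slope.
    return math.tan(math.radians(angle_deg)) * 100.0
```

So `tilt_degrees(0.5)` gives 45°, and `slope_percent(45.0)` gives the 100% grade mentioned above.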
But problems arise in the real world. Since I'm using an accelerometer for angle detection, the readings go all over the place when I hit potholes and bumps in the road: one second I read 80%, the next -300%, and so on.
What can I do to smooth out the readings and make them semi-accurate?
I could try to discard any invalid value, such as anything beyond ±20% of the current slope, and average the rest over 10 seconds, but I feel that would be inaccurate.
I was looking into the Kalman filter, based on a thesis on slope detection using an accelerometer for Scania trucks, but the jerking of a truck is much less violent than that of a mountain bike.
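For what it's worth, a scalar Kalman filter for a slowly-changing slope is only a few lines; the `q` and `r` noise values below are made-up tuning knobs (not taken from the thesis) and would need to be tuned to the bike's vibration profile:

```python
class Kalman1D:
    """Minimal 1-D Kalman filter with a constant-state model."""

    def __init__(self, q=0.001, r=4.0, initial=0.0):
        self.q = q        # process noise: how fast the true slope can change
        self.r = r        # measurement noise: how jumpy the accelerometer is
        self.x = initial  # current slope estimate
        self.p = 1.0      # estimate uncertainty

    def update(self, measurement):
        # Predict: slope assumed roughly constant, so only uncertainty grows.
        self.p += self.q
        # Update: blend prediction and measurement via the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (measurement - self.x)
        self.p *= (1.0 - k)
        return self.x
```

With `r` much larger than `q`, a single pothole spike barely moves the estimate, while a sustained change in slope is tracked within a few seconds; raising `q` makes it more responsive but lets more vibration through.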
What can I do to predict and smooth out sensor values over time?