Non-Constant Acceleration due to Gravity
Recently, I had the first lab for my university physics course. The lab itself was fairly simple: we used a computer and an ultrasonic distance sensor to graph the position, velocity, and acceleration of a cart as it moved along a linear track.
One of the situations we captured data for involved starting the cart at the bottom of an inclined ramp and giving it a push upwards. As expected, it rolled up, came to a stop, and then rolled back down the track to its starting position. The position-vs-time graph was essentially parabolic, the velocity-vs-time graph was essentially linear, and the acceleration-vs-time graph was essentially constant. So far, so good.
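For reference, these are exactly the shapes constant-acceleration kinematics predicts, writing $x$ for position along the track and $a$ for the (supposedly constant) acceleration:

$$x(t) = x_0 + v_0 t + \tfrac{1}{2} a t^2, \qquad v(t) = v_0 + a t, \qquad a(t) = a \ \text{(constant)},$$

that is, a quadratic position graph, a linear velocity graph, and a flat acceleration graph.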
At this point in the lab, the instructor pointed out that, if the data were examined closely, the magnitude of the cart's acceleration was slightly greater while the cart was traveling upwards than while it was traveling downwards (approximately [value] and [value], respectively), and asked us to determine why in our lab report.
Now, gravity was the only force acting upon the cart, and thus its acceleration should have been constant, at least at the scale of our experiment, so these results completely baffled my lab group. So far, we have proposed the following ideas, but none of them seem very plausible:
- Doppler effect on the ultrasonic distance sensor
- Friction
- Air resistance
- Human error
The first seems highly improbable, and the last three feel more like obfuscation and hand-waving than actual theories.
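That said, the friction idea does at least make a concrete, checkable prediction. Here is the standard textbook model, assuming idealized kinetic friction with coefficient $\mu$ on a track inclined at angle $\theta$ (both symbols are assumptions for illustration; we measured neither). Since kinetic friction opposes the motion, it points down-slope while the cart moves up and up-slope while it moves down, giving (with up-slope taken as positive)

$$a_{\text{up}} = -g\,(\sin\theta + \mu\cos\theta), \qquad a_{\text{down}} = -g\,(\sin\theta - \mu\cos\theta),$$

so the magnitude of the acceleration would be larger on the way up than on the way down, which matches the direction of the discrepancy we observed, though I can't say whether the size works out.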
Why does our experimental data show the acceleration due to gravity changing based on the direction the object is moving?