A question jumped into my head while processing data from MEMS sensors and Vicon: is there a better way of estimating the first-order derivative of a function using samples taken at a fixed time interval, i.e. a "table"?

To be precise, suppose we have a function $f(t)$ which is smooth enough, and a series of samples $f_n$ where $f_n = f(nT)$ for a fixed sampling interval $T$.

The most intuitive way would be the forward difference:

$$\hat{f}'_n = \frac{f_{n+1} - f_n}{T}.$$

By looking into the Taylor series

$$f_{n+1} = f_n + f'_n T + \frac{1}{2} f''_n T^2 + O(T^3),$$

we find that the estimation error is of the same order as $T$:

$$\hat{f}'_n = \frac{f_{n+1} - f_n}{T} = f'_n + \frac{1}{2} f''_n T + O(T^2).$$
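As a quick numerical check (a sketch; the test function $\sin$ and evaluation point are my own choices, not from the original), the forward-difference error shrinks roughly linearly with $T$: halving the interval roughly halves the error.

```python
import math

def forward_diff(f, t, T):
    # Forward difference: (f_{n+1} - f_n) / T, with t = nT
    return (f(t + T) - f(t)) / T

t = 1.0
true = math.cos(t)  # exact derivative of sin at t
err1 = abs(forward_diff(math.sin, t, 0.1) - true)
err2 = abs(forward_diff(math.sin, t, 0.05) - true)
print(err1 / err2)  # roughly 2, consistent with an O(T) error
```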

But if we examine another Taylor series,

$$f_{n-1} = f_n - f'_n T + \frac{1}{2} f''_n T^2 + O(T^3),$$

it is obvious that we can eliminate the $\frac{1}{2} f''_n T^2$ term by subtracting $f_{n-1}$ from $f_{n+1}$:

$$f_{n+1} - f_{n-1} = 2 f'_n T + O(T^3).$$
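The cancellation can be seen numerically (again a sketch with $\sin$ as an assumed test function): the residual $f_{n+1} - f_{n-1} - 2Tf'_n$ should shrink like $T^3$, so halving $T$ divides it by roughly $2^3 = 8$.

```python
import math

def residual(f, fprime, t, T):
    # f_{n+1} - f_{n-1} - 2*T*f'_n, which should be O(T^3)
    return f(t + T) - f(t - T) - 2 * T * fprime(t)

t = 1.0
r1 = abs(residual(math.sin, math.cos, t, 0.1))
r2 = abs(residual(math.sin, math.cos, t, 0.05))
print(r1 / r2)  # roughly 8, consistent with an O(T^3) residual
```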

Thus we obtain an estimation whose error is of the same order as $T^2$:

$$\frac{f_{n+1} - f_{n-1}}{2T} = f'_n + O(T^2).$$

So,

$$\hat{f}'_n = \frac{f_{n+1} - f_{n-1}}{2T}$$

seems to be a better estimation. But it depends on the smoothness of $f$: the central-difference error is roughly $\frac{1}{6} f'''_n T^2$ while the forward-difference error is roughly $\frac{1}{2} f''_n T$, so it is possible that $f'''$ is much larger than $f''$ and the advantage disappears.
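Putting the two estimators side by side (once more a sketch with $\sin$ as an assumed test function), the central difference is far more accurate at the same sampling interval:

```python
import math

def forward(f, t, T):
    # (f_{n+1} - f_n) / T: error O(T)
    return (f(t + T) - f(t)) / T

def central(f, t, T):
    # (f_{n+1} - f_{n-1}) / (2*T): error O(T^2)
    return (f(t + T) - f(t - T)) / (2 * T)

t, T = 2.0, 0.01
true = math.cos(t)  # exact derivative of sin at t
err_fwd = abs(forward(math.sin, t, T) - true)
err_cen = abs(central(math.sin, t, T) - true)
print(err_fwd, err_cen)  # the central-difference error is orders of magnitude smaller
```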