Do you know anything about the distribution that the data points are sampled from? Without that, you would have to do a least-squares regression...
Ultimately, any sinusoidal curve can be written in the form $y(t) = A\cos(\omega t + \phi)$.
Thus, you take the three partial derivatives of the least-squares error and set each equal to zero. I worked through the calculus, and the three resulting equations cannot be solved analytically for $(A, \omega, \phi)$.
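To make that concrete: writing the squared error over data points $(t_i, y_i)$ as $E(A,\omega,\phi) = \sum_i \bigl(A\cos(\omega t_i + \phi) - y_i\bigr)^2$, the three stationarity conditions are

$$\frac{\partial E}{\partial A} = 2\sum_i \bigl(A\cos(\omega t_i + \phi) - y_i\bigr)\cos(\omega t_i + \phi) = 0$$

$$\frac{\partial E}{\partial \omega} = -2\sum_i \bigl(A\cos(\omega t_i + \phi) - y_i\bigr)\,A\,t_i\sin(\omega t_i + \phi) = 0$$

$$\frac{\partial E}{\partial \phi} = -2\sum_i \bigl(A\cos(\omega t_i + \phi) - y_i\bigr)\,A\sin(\omega t_i + \phi) = 0$$

Each equation mixes $A$, $\omega$, and $\phi$ inside the trigonometric terms, which is why there is no closed form.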
You could run a solver for the 3 variables and get approximations. Alternatively, you could try doing the following:
1. Guess an amplitude. Divide all your data points by it, then take the inverse cosine of these new values. Run a linear regression on this new data set to get $\omega$ and $\phi$.
2. Adjust the amplitude, repeat (1), and compare errors. The $L_2$ error is a continuous function of $A$, so you should be able to find a local minimum without too much work.
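The two steps above can be sketched as follows. This is a sketch under one important assumption: the phase $\omega t + \phi$ stays inside a single branch $[0, \pi]$, where $\arccos$ inverts the cosine cleanly (otherwise the regression targets would need unwrapping). The function name `fit_sinusoid` and the synthetic data are my own illustration:

```python
import numpy as np

def fit_sinusoid(t, y, amplitudes):
    """For each candidate amplitude A, linearize via arccos and
    pick the (A, omega, phi) triple with the smallest L2 error."""
    best = None
    for A in amplitudes:
        # arccos is only defined on [-1, 1]; clip to stay in range
        theta = np.arccos(np.clip(y / A, -1.0, 1.0))
        # Step 1: linear regression, theta ~ omega * t + phi
        omega, phi = np.polyfit(t, theta, 1)
        # Step 2: L2 error of the reconstructed sinusoid
        err = np.sum((A * np.cos(omega * t + phi) - y) ** 2)
        if best is None or err < best[0]:
            best = (err, A, omega, phi)
    return best

# Synthetic data whose phase stays inside [0, pi], so arccos is clean
t = np.linspace(0.1, 2.9, 40)
y = 2.0 * np.cos(1.0 * t + 0.1)
err, A, omega, phi = fit_sinusoid(t, y, np.linspace(1.5, 2.5, 21))
```

The grid of candidate amplitudes here is coarse; in practice you would refine it around whichever value minimizes the error.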
You could even make an educated first guess at $A$... perhaps half the difference between the largest and smallest values, or the average of the absolute values. Those are the two best possibilities that come to mind.
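For the solver route mentioned earlier, a general-purpose nonlinear least-squares routine such as `scipy.optimize.curve_fit` works well, seeded with an amplitude guess of half the data's spread. The synthetic data and starting values below are my own illustration, not part of the original:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, A, omega, phi):
    """The three-parameter sinusoid being fitted."""
    return A * np.cos(omega * t + phi)

# Hypothetical noisy measurements of a known sinusoid
rng = np.random.default_rng(0)
t = np.linspace(0.0, 6.0, 120)
y = model(t, 2.0, 1.3, 0.4) + 0.02 * rng.normal(size=t.size)

# Initial guesses: half the data's spread for A, rough values for the rest
A0 = 0.5 * (y.max() - y.min())
popt, _ = curve_fit(model, t, y, p0=[A0, 1.2, 0.0])
A_hat, omega_hat, phi_hat = popt
```

As with any local solver, a poor starting frequency can land in the wrong local minimum, so the initial guesses matter.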