However, this point overlooks the difference between a "computing machine" and a "computer".
In its broadest sense, anything that provides a convenient abstraction for making inferences about the world is a "computing machine". By that definition, we've had computing machines for tens of thousands of years: that's the age of the earliest known counting implements, the Lebombo and Ishango bones.
What differentiates a "computer" (in the sense we've used the word since Turing) from a computing machine is generality: a computing machine is designed for a specific class of problems, however large that class may be, whereas a computer can, given enough time and memory, carry out any computation that can be mechanically specified, including the simulation of other computers and computing machines, both simpler and more sophisticated than itself. This is not a meaningless distinction: it is the root of the computing revolution, since relatively simple silicon processors can run software programs far more complex than themselves.
The Antikythera mechanism is certainly a remarkable device, but it is by no means a computer; it is merely a computing machine. You cannot, for example, use it to solve differential equations, and you certainly cannot use it to simulate another device, say an arithmetic calculator. Likewise, even the most sophisticated calculators still fall short of the Turing threshold, as they cannot be used to simulate the operation of other calculators.
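To make the simulation point concrete, here is a minimal Python sketch (the function name and the machine encoding are my own, purely illustrative): a short general-purpose program simulating a fixed-function machine, in this case a tiny Turing machine whose only job is to add one to a binary number. The host computer simulates this machine trivially; the machine itself could never return the favor.

```python
# Illustrative sketch: a table-driven Turing machine interpreter.
# (Hypothetical example code, not any particular historical machine.)

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Run a Turing machine described by `rules` on the string `tape`.

    rules maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (left) or +1 (right). The machine stops in state 'halt'.
    """
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    # Render the visited portion of the tape back into a string.
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Transition table for "add 1 to a binary number": scan right to the end
# of the number, then propagate the carry back toward the left.
increment_rules = {
    ("start", "0"): ("0", +1, "start"),
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1", -1, "done"),
    ("carry", "_"): ("1", -1, "done"),
    ("done",  "0"): ("0", -1, "done"),
    ("done",  "1"): ("1", -1, "done"),
    ("done",  "_"): ("_", +1, "halt"),
}

print(run_turing_machine(increment_rules, "1011"))  # prints "1100"
```

The point is not the incrementer itself but the asymmetry: the same interpreter loop will happily host any transition table you feed it, while the incrementer's entire "hardware" is that one table.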
For a very enlightening account of Turing's breakthrough paper "On Computable Numbers, with an Application to the Entscheidungsproblem", which introduced Turing machines and much of the theoretical foundation of modern computing, I recommend Charles Petzold's excellent The Annotated Turing. It offers a wealth of explanations without which I could never hope to understand Turing's paper, along with important historical context on its development and relevance. A must-read for any serious computer professional.