Calculating Running Time from Time Complexity
I have read about big O notation for time complexity and for counting functions such as the prime-counting function. Recently on Stack Overflow I read:
The problem with defining how long an algorithm takes to run is that you usually can't give an answer in milliseconds because it depends on the machine, and you can't give an answer in clock cycles or as an operation count because that would be too specific to particular data to be useful.
My question is: if we consider an algorithm with a known time complexity (polynomial, linear, etc.) running on a machine whose parameters are known, how can we calculate its running time in seconds? Essentially, how can time complexity be translated into real time for a given machine?
I ask because I have seen instances where people have said that algorithm X will take Y time to run.
From what I understand after reading the Wikipedia page on time complexity, I would think the running time is the number of operations the algorithm performs divided by the number of operations the machine can perform per unit time. Is this correct? Is there a general answer?
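To make my understanding concrete, here is a rough sketch of the estimate I have in mind. The O(n^2) complexity, the constant factor, and the ops-per-second figure are all illustrative assumptions on my part, not measured values:

```python
def estimated_seconds(n, ops_per_second, constant_factor=1.0):
    """Back-of-the-envelope runtime estimate for an O(n^2) algorithm.

    n: input size.
    ops_per_second: assumed number of simple operations the machine
        executes per second (an assumption; real throughput varies).
    constant_factor: the hidden constant in the big-O bound, which is
        generally unknown without profiling (assumed here to be 1).
    """
    operations = constant_factor * n ** 2  # model the cost as ~c * n^2 operations
    return operations / ops_per_second

# Example: n = 100,000 items on a machine doing 10^9 simple ops/sec:
# (10^5)^2 / 10^9 = 10^10 / 10^9 = 10 seconds with constant_factor = 1.
print(estimated_seconds(100_000, 1_000_000_000))  # → 10.0
```

Is this kind of division (operation count over machine throughput) the right mental model, or does it break down in practice because of hidden constants, memory hierarchy, and so on?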