At work, there is a statistical process going on, which I feel is probably mathematically incorrect

jistefaftexia99kq6

Answered question

2022-04-12

At work, there is a statistical process going on, which I feel is probably mathematically incorrect but can't quite put my finger on what is wrong:
They are totalling up the number of hours people work per week (in minimum units of 15 minutes), and then producing averages for the whole department per week. Obviously, the results come out to be non-integer numbers of hours with many decimal places.
Then, they are judging the results of certain productivity-boosting techniques and displaying the findings in "minutes gained/lost"...in some cases reporting productivity gains of as little as a minute or two per week.
So to summarise, they are measuring in units of quarter hours, but then presenting the average productivity gains in minutes...is this presuming an accuracy which is not present in the initial measurement? I think it is, but don't know how to argue it to my boss.

Answer & Explanation

necrologo9yh43

Beginner · 2022-04-13 · Added 23 answers

The measurement procedure you describe may be about the best available. Unless employees punch time clocks, determining time worked per week with greater precision than the nearest quarter hour may not be possible.
Averages tend to be more precise than individual observations, and the precision improves as the number of values averaged increases. To estimate the precision of your averages, one would need to know (a) the number of people in the department, and (b) the variance or standard deviation of the hours worked per week within the department.
Based on actual data, statistical tests could determine whether a difference of 2 min is statistically significant. (Judging whether such a small difference, if real, is of practical importance is a managerial issue, not a statistical one.)
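For instance, one such test is a paired t-test on before/after weekly hours for the same employees. Here is a minimal sketch in Python (assuming SciPy is installed; the numbers are invented purely for illustration, not real data):

```python
from scipy import stats

# Hypothetical before/after weekly hours for the same ten employees.
# Invented purely for illustration.
before = [38.25, 40.00, 39.50, 41.25, 37.75, 40.50, 39.00, 38.50, 40.25, 39.75]
after  = [38.50, 39.75, 39.75, 41.25, 37.50, 40.75, 39.00, 38.75, 40.00, 40.00]

# Paired t-test: the same people are measured twice, so pair the observations.
t_stat, p_value = stats.ttest_rel(after, before)

mean_gain_min = sum(a - b for a, b in zip(after, before)) / len(before) * 60
print(f"mean gain = {mean_gain_min:.1f} min")     # ~3 min/week with this toy data
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")     # p is well above 0.05 here, so
                                                  # the small gain is indistinguishable
                                                  # from noise at this sample size
```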
Very roughly speaking, if the number of hours a randomly chosen person works per week is approximately normal with standard deviation 30 min, then a 95% confidence interval based on 25 employees would estimate the mean number of hours to within about ±2(30)/√25 = ±12 min. But if the SD is 15 min and there are 100 employees, then the margin of error for estimating the mean would be about ±2(15)/√100 = ±3 min. [Don't try using this formula (without adjustments) for fewer than about 25-30 employees.]
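That rule of thumb (±2·SD/√n) is easy to check numerically; here is a quick sketch reproducing the two scenarios above:

```python
import math

def margin_of_error_minutes(sd_minutes, n_employees):
    """Rough 95% margin of error for the department mean: +/- 2*SD/sqrt(n).
    Only reasonable for roughly 25-30 or more employees."""
    return 2 * sd_minutes / math.sqrt(n_employees)

print(margin_of_error_minutes(30, 25))   # 12.0 min (SD 30 min, 25 employees)
print(margin_of_error_minutes(15, 100))  # 3.0 min (SD 15 min, 100 employees)
```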
