Looking for Knowledge wrote:
superjawes wrote: A "." when used in mathematics is a decimal point, meaning that .000 = 0. For that reason, we either express "two thousand" as 2,000 or 2000 so that 2.000 is not confused with 2.

I was making that point.

Actually, the number of digits (0's included) past the decimal point indicates the number of significant figures (sig figs) in the measurement. Generally that number is based on the precision inherent in the tool(s) used to measure the quantity. That said, I don't know why that many sig figs were used in this example, but .000 does not equal 0: .000 is an amount measured to the nearest thousandth by something capable of measuring coders at that level of precision. Honestly, it's silly. 0.001 coders couldn't do shit.
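As a quick illustration (my own Python sketch, not anything from the thread): Python's decimal module preserves trailing zeros, so it can carry the written precision of 2.000 even though 2.000 and 2 compare as numerically equal.

```python
from decimal import Decimal

# Numerically, 2.000 and 2 are the same value...
assert Decimal("2.000") == Decimal("2")

# ...but Decimal remembers the trailing zeros, i.e. the precision
# the number was written with (exponent -3 = measured to thousandths).
print(Decimal("2.000"))             # 2.000
print(Decimal("2.000").as_tuple())  # digits (2, 0, 0, 0), exponent -3
```

So "2.000 coders" and "2 coders" are the same count, written with different claimed precision.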

Even at the extremes, humans are (generally) discrete objects, so 2.4 coders will not do you any more good than 2.0, and although 2.000 is more precisely two coders, the significant figures past the "2" don't change that. Although it could be an interesting problem if we go the other way and try to compare 1.5 coders with 1.9995 coders...
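That discreteness is easy to sketch (again, just my own toy example): if only whole coders can do work, any fractional measurement floors to the same integer, so 1.5 and 1.9995 coders are equally (un)useful.

```python
import math

# Humans are discrete: only whole coders can actually write code,
# so a fractional headcount floors to the usable number of coders.
for coders in (1.5, 1.9995, 2.0, 2.4):
    print(f"{coders} measured -> {math.floor(coders)} usable coder(s)")
```

Both 1.5 and 1.9995 floor to 1 usable coder, and 2.4 does no better than 2.0.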