When divided by a given mutation rate, the ρ (rho) statistic provides a simple estimator of the age of a clade within a phylogenetic tree: the average number of mutations from each sample in the clade to its root. However, a long-standing critique of the use of ρ in genetic dating has been widely cited. Here we show that this critique is unfounded. We demonstrate with a formal mathematical argument, and illustrate with a simulation study, that ρ estimates are unbiased and that ρ and maximum likelihood (ML) estimates do not differ in any systematic fashion. We also show that the claim that the associated confidence intervals commonly misrepresent the uncertainty is flawed, since it relies on a means of calculating standard errors that is not used by other researchers, whereas an established expression for the standard error is largely unproblematic. We conclude that ρ dating, alongside approaches such as ML and Bayesian inference, remains a useful tool for genetic dating.
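To make the estimator concrete, the following is a minimal illustrative sketch, not the paper's implementation: it computes ρ as the mean number of mutations from each tip to the clade root and converts it to an age. The mutation counts, the rate `mu`, and the star-like (independent Poisson counts per tip) simplification used for the standard error are all assumptions made for this example; the general standard-error expression depends on the tree topology.

```python
import math

# Hypothetical data: number of mutations from each sampled sequence
# in the clade to the clade's root node.
tip_mutation_counts = [3, 5, 4, 6, 2]

# Hypothetical mutation rate, in mutations per lineage per year.
mu = 2.5e-5

n = len(tip_mutation_counts)
rho = sum(tip_mutation_counts) / n   # average distance to the root
age = rho / mu                       # age estimate, in years

# Simplifying assumption: a star-like genealogy with independent
# Poisson mutation counts per tip, so the standard error of rho
# is sqrt(rho / n); real trees require a topology-aware expression.
se_rho = math.sqrt(rho / n)
ci_years = (max(rho - 1.96 * se_rho, 0.0) / mu,
            (rho + 1.96 * se_rho) / mu)

print(f"rho = {rho}, age = {age:.0f} years")
print(f"approx. 95% CI: {ci_years[0]:.0f} - {ci_years[1]:.0f} years")
```

With these example values, ρ is simply (3+5+4+6+2)/5 = 4.0, giving an age of 4.0 / 2.5e-5 = 160,000 years; the point is only that the estimator itself is a plain average, which is what makes its statistical behavior straightforward to analyze.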