A core tenet of many modern Agile approaches is estimating work effort in story points or other abstract units instead of time. Proponents argue that points are more accurate because they account for variability in developer productivity. But the concept is deeply flawed.
Points are always eventually deconstructed into dates on a calendar for planning purposes. No business can operate solely in an abstract point system detached from real-world time constraints. Dates are the true atomic unit of planning in software development, not arbitrary points.
Beyond that, point-based estimation treats all developers as interchangeable widgets, which is absurd. In reality, developers have distinct tendencies when estimating work. Some overestimate tasks out of an abundance of caution, while others are perpetually optimistic and underestimate.
As long as a developer is consistent in their tendencies, you can apply a multiplier to translate their estimates into realistic dates. For example, if Mindy habitually estimates double the actual time required, her estimates get a multiplier of 0.5. Dave underestimates by 80%, meaning his estimates come in at about 20% of the actual time, so his get a multiplier of 5.
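The multiplier idea above can be sketched in a few lines. This is a hypothetical illustration, not a real tool: the function names, the sample history pairs, and the Mindy/Dave numbers are all made up to mirror the examples in the text, with the multiplier derived as the average ratio of actual to estimated days.

```python
from datetime import date, timedelta

def calibration_multiplier(history):
    """history: list of (estimated_days, actual_days) pairs from past tasks.
    Returns the average actual/estimated ratio for one developer."""
    return sum(actual / est for est, actual in history) / len(history)

def projected_date(start, estimated_days, multiplier):
    """Scale a raw estimate by the developer's multiplier, then
    add the corrected duration to the start date."""
    return start + timedelta(days=round(estimated_days * multiplier))

# Mindy estimates double the actual time -> multiplier of 0.5
mindy = calibration_multiplier([(4, 2), (10, 5), (6, 3)])   # 0.5

# Dave's estimates come in at ~20% of actual -> multiplier of 5
dave = calibration_multiplier([(1, 5), (2, 10)])            # 5.0

# An 8-day estimate from Mindy becomes a 4-day plan
print(projected_date(date(2024, 1, 1), 8, mindy))           # 2024-01-05
```

The point of the sketch is that the calibration comes from each developer's own track record, not from a team-wide conversion table.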
By tailoring estimates to individuals and converting points to dates, we can do realistic planning and remove the guesswork and abstraction of points. The variability between developers is real and ignoring it leads to bad plans.
Let’s stop pretending that a point is some universal constant across developers. Time is the one true constant in software development. Dates are real; points are imaginary. The sooner we return to date-based planning, the more predictability and honesty we can achieve in estimation and delivery.