I think even the famed Grady Booch and other software engineering gurus have pretty much given up on accurately doing estimation and project management at a large scale. Their attempts at capturing the variables (individual and cumulative developer experience, the type of problem space, industry averages for lines of code on similar projects, industry averages of bugs per line of code, average time spent debugging those bugs, etc.) still end up being a big crystal ball exercise with people putting together Gantt charts.
Hence the shift to a smaller scale with Agile, where work is tackled in short sprints of about 2-3 weeks and only features, or parts of features, that will fit in that time slot are taken on. Devs do planning poker to guesstimate how long a particular story will take to implement. Burn-down charts then track velocity within each sprint, and a running average of past sprint velocities gives a guess at how much the team can realistically tackle given past performance.
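That last step, forecasting capacity from past velocity, is just simple arithmetic. A minimal sketch, with made-up story-point numbers and a hypothetical `forecast_capacity` helper:

```python
# Hypothetical data: story points completed in each of the last few sprints.
past_velocities = [21, 18, 25, 20, 22]

def forecast_capacity(velocities, window=3):
    """Rolling average of the most recent sprints, used as a guess
    at how many story points the team can take on next sprint."""
    recent = velocities[-window:]
    return sum(recent) / len(recent)

# Average of the last three sprints (25, 20, 22) suggests ~22 points.
print(forecast_capacity(past_velocities))
```

Teams usually weight or window this (as `window=3` does here) so that a hiring change or a vacation-heavy sprint doesn't skew the forecast forever.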