Everybody agrees that improving software development speed is a good thing. It's a simple math game: the benefit only starts when the result is done. Also, software development costs are almost entirely people, and people are paid by the hour. So if you are done sooner, you get the profit sooner and pay for fewer hours.
However, there is often no real metric defined for software development speed. Sure, there are SLOC counts or function points, but these are volume metrics. Any metric that measures speed must also include start and finish times.
Also, speed often gets sacrificed on the altar of perceived efficiency. For instance, individual developers are slowed down by poor networks, outdated computers, or overly constraining policies. Keeping a slow network or old computers running longer requires no investment, so it is perceived as a cost saving. However, a slow network means everyone is waiting instead of working. Outdated computers block speed improvements and feature upgrades, and they break more often, putting more strain on your support staff.
How long does it take to replace your broken computer? In many organizations, this can take days, because someone from support has to pick up a box and install it with a company image. The support person is often paid less and is efficient at this task, so this is seen as a cost saving. But every second my computer is broken is a waste for me. Even if the support person responded the second I called, I would still just be waiting for the replacement. Instead of involving two people (one waiting, one working), it is much cheaper to just hand me the replacement with some very simple installation instructions. And it is faster too. (I know this is possible because it's roughly how it works at Info Support.)
The above examples are pretty low level, but in the grand scheme of things, speed is often inhibited by policies and practices adopted without much thought. Mary "lean grandma" Poppendieck has a wonderful story on this called "Competing on the Basis of Speed". First of all, Poppendieck provides a concrete speed metric:
The time from the moment a user first says "I need this / this is not working right" to a real working solution being available to that user (in production).
This seems like a good metric: from a user perspective, this is how long the software development organization takes to do any work.
However, from an organizational standpoint, this is huge. The entire chain impacts your development speed: allocating resources (money, people), preparing bids, gearing up a project team, doing the actual implementation, getting it into production, and everything in between. Measuring speed with volume metrics like LOCs or function points no longer makes much sense from this perspective. Also, delaying policies like "we only deploy to production once a year" have a huge impact on overall speed. If you want to optimize, you need to look at the entire chain: what is needed, where is something delayed, and why? Poppendieck has some interesting pointers for doing this.
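To make the metric concrete: it boils down to a simple timestamp difference between the user's first request and the moment a working solution is in production. A minimal sketch (function and field names are my own, not Poppendieck's):

```python
from datetime import datetime

def lead_time_days(requested_at: datetime, in_production_at: datetime) -> float:
    """Speed metric in the spirit of Poppendieck: days from the user's
    first request to a working solution available in production."""
    return (in_production_at - requested_at).total_seconds() / 86400

# Hypothetical example: request logged March 3, solution live June 11.
print(lead_time_days(datetime(2011, 3, 3), datetime(2011, 6, 11)))  # 100.0
```

The point of measuring it this way is that every delay in the chain, from bidding to the yearly deployment window, shows up in the number, whereas volume metrics hide them entirely.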
Poppendieck also has another message: speed is a competitive advantage. If you are able to move faster, you will outperform your competitors. If you do not focus on speed and your competition does, you will be left behind. Since software plays a crucial role in most organizations these days, software development speed is very important. It's not just that being faster is being cheaper. It's also about getting the big bucks (if you are faster, you are first) or going extinct.