(Originally published by me on www.servicevirtualization.com)
Because I work closely with application development professionals on an ongoing basis, I am fairly in tune with the happenings of that profession. (It doesn't hurt that I, too, was in an application-development-related role for 18 years.) So when I heard more and more people extol the virtues of Test Driven Development (TDD), I wanted to look into it myself to see what the hullabaloo was all about.
Application code is written to fulfill the requirements outlined by the Line of Business. Taken as a whole, the result is an entire application that provides a business service, ultimately allowing an organization to either add new revenue streams or expand the capacity of existing ones.
The problem that often occurs is that "this isn't your father's application development job" anymore. The need to remain competitive in the marketplace often adds the requirement of being incredibly agile (resulting in more aggressive, shorter release cycles) while at the same time supporting the latest trends in technology as a business enabler. Currently, big data, cloud computing, mobile device support and "the Facebook effect" (meaning highly interactive applications taking great advantage of asynchronous processing to provide nearly instantaneous results) are the darlings of the industry, but it could be anything.
As a result, the applications that are being demanded by the Lines of Business are increasing in their complexity. And that means the task of managing the resulting application quality has also become more complex. This spawned the Agile development movement, which in turn gave rise to practices like TDD. Both were devised to manage the complexity so that the rate of change does not make validating the correctness of the result time- and cost-prohibitive.
(Figure: Architectural complexity increases with time)

For those of you who have not been exposed to TDD, the primary difference between Agile (we'll use Scrum here as the reference, since it is arguably the most prevalent Agile methodology in use) and TDD is the following:
- Scrum defines success as the successful implementation of a set of features and functionality to be completed by the end of the next sprint, and the developers write code to meet those goals
- TDD, however, defines success as the implementation of code that satisfies a set of (initially) failing tests, which the developers write before or alongside the code itself
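To make that second point concrete, here is a minimal sketch of the TDD loop in Python. The function name and behavior (`apply_discount`) are hypothetical, chosen purely for illustration: the test is written first and fails because nothing implements it yet, and then just enough code is written to make it pass.

```python
# Step 1 ("red"): write the test first. Before any implementation
# exists, running this test fails -- the failing test is the
# specification of what success looks like.
def test_apply_discount():
    assert apply_discount(100.0, 10) == 90.0   # 10% off
    assert apply_discount(50.0, 0) == 50.0     # no discount

# Step 2 ("green"): write just enough code to make the test pass.
def apply_discount(price, percent):
    return price * (1 - percent / 100)

# Step 3: run the test again; it now passes, and the code can be
# refactored freely as long as the test keeps passing.
test_apply_discount()
```

The point is the order of operations: the test exists, and fails, before the production code does, so "done" is defined by the tests rather than by a feature list.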