by Pete Carapetyan
An Emergency Deadline
Engaged in March of 2004 to assist a very small company with a past-due deadline, dataFundamentals had to rely on my 20 years in product delivery to even make the resources fit the time available. The project was already seven months past its one-month schedule, and requirements had been so loosely defined as to make failure almost assured if one more misstep were allowed. It took laser-sharp focus and strong powers of persuasion to get everyone rowing in the same direction, but we made the deliverable.
Putting a Process in Place
When engaged to carry the project to the next milestone as project architect, we lacked almost everything normally found in a mature development process. One at a time, dataFundamentals brought in infrastructure and systems for everything from proper source control procedures to build and test processes, requirements and documentation, and phased development cycles.
From previous projects, we found that agile development practices, most notably test-first and small iterative development cycles, were the most effective way to work, so these were the core approaches used to build the processes.
Building a Team
Change is tough on any team, and this team was facing many challenges that more experienced teams didn't have to wrestle with. So in addition to lots of training on the basics, we had to slowly and purposefully integrate the kinds of cooperative processes and trust that are more common on the many open source projects we had worked with over the years.
The results have been remarkable. The team that emerged is cohesive, efficient, and works well within the boundaries set by our roles and task lists. This is an accomplishment as satisfying as the quality of the software itself.
Database Design and Normalization
Several previous iterations of this software had become very hard to maintain due to database design problems, and it was a hard sell to convince all involved that the investment of time necessary to normalize the database first was worthwhile.
Interestingly enough, though this ended up being a significant investment of time on the part of the company, very little time had to be spent by dataFundamentals once the initial period of persuasion had passed. Most of the work was done by existing staff, with me providing initial training, objectives, and occasional input and feedback. The existing team knew instinctively what good design looked like, but they had never been forced to take the time to implement it.
Great Software Takes a Good Foundation
DataFundamentals' years-long involvement with the Keel Framework was tapped to bring it in as the foundation for all software. Maintainability and feature sets had always been this company's largest challenges, and these were things Keel provided in spades.
Doing the math on the cost of converting and maintaining previously built applications, a decision was made to re-architect the suite of applications from scratch rather than attempting to convert a large amount of questionable code. The usable portions of the old code were identified, made workable, and wrapped as single components before being brought back into the new development. This allowed for the best combination of new and old development without losing the investment in the old code.
Several months were spent building and tuning each of the relevant components into compliant, cohesive Keel components. Existing staff began to learn the powerful effect of having very consistent, simplified components designed to do one thing very well, and then testing that one thing against a battery of automated tests.
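The idea of a single-purpose component wrapping older code can be sketched in a few lines of Java. This is purely illustrative; the names (TaxCalculator, LegacyTaxRoutines) and the rates are hypothetical, not from the actual project or the Keel API. The point is the narrow contract: the rest of the system sees one small interface that can be tested, and later swapped out, in isolation.

```java
// The narrow, single-purpose contract the rest of the system sees.
interface TaxCalculator {
    double taxFor(double amount);
}

// Legacy code, kept as-is but no longer called directly from elsewhere.
class LegacyTaxRoutines {
    static double computeTax(double amt, int mode) {
        // Simplified stand-in for older, harder-to-follow logic.
        return mode == 1 ? amt * 0.08 : amt * 0.05;
    }
}

// The wrapper component: does one thing, and is easy to run
// against a battery of automated tests.
class LegacyTaxCalculator implements TaxCalculator {
    public double taxFor(double amount) {
        return LegacyTaxRoutines.computeTax(amount, 1);
    }
}
```

Because callers depend only on the interface, a rewritten calculator can later replace the legacy wrapper without touching any other code.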
Because of Keel's maturity as a framework, much of what the customer needed didn't have to be built; it was already there. From clustered boxes working together as a set, to failover and role-based security, these were areas we didn't have to focus on.
Automated Code Generation
After a year of developing the undergirding routines, the team was ready to extend the GUIs to the many areas of the database they had not yet exercised. This meant creating input and retrieval forms for each of the many new tables: hundreds of files, all very prone to syntax errors and consistency problems.
I spent a few weeks automating this process, and thousands of lines of code appeared with one click of the mouse, providing boilerplate, customizable forms and table views for everything in the database.
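The heart of that kind of generator is simple string templating over the database's table descriptions. The sketch below is a hypothetical reduction, not the actual generator: given a table name and its columns, it emits boilerplate source for an input form, which a developer can then customize.

```java
import java.util.List;

// Illustrative core of a form generator: table metadata in, boilerplate source out.
class FormGenerator {
    static String generateForm(String table, List<String> columns) {
        String cls = Character.toUpperCase(table.charAt(0)) + table.substring(1) + "Form";
        StringBuilder src = new StringBuilder();
        src.append("public class ").append(cls).append(" {\n");
        for (String col : columns) {
            // One text field per column; consistent naming avoids the
            // hand-written syntax and consistency errors.
            src.append("    private javax.swing.JTextField ")
               .append(col).append("Field = new javax.swing.JTextField();\n");
        }
        src.append("}\n");
        return src.toString();
    }
}
```

Running this over every table in the schema is what turns "hundreds of files" into one click: the generator, not a person, enforces naming and structural consistency.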
Swing Development
Early in the process it became apparent that we needed the deployment flexibility of our browser-based web application, but that, even with AJAX, the browser wasn't going to cut it as a client.
A move was made to Swing as the client. Server-side code stayed virtually unchanged, but dataFundamentals used its experience with Java Swing and Visual Basic to build the kind of base-level functionality that would extend, as a component framework, across all applicable views. This allowed the client to use things such as rendering engines as widgets, rather than customizing every line of code into the hard-to-maintain spaghetti bowl of previous versions of the software.
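A concrete example of a "rendering engine as a widget" in Swing is a table cell renderer: formatting logic written once and plugged into any table column, instead of being re-coded in every view. The currency renderer below is a hypothetical illustration of the pattern, not code from the project.

```java
import javax.swing.*;
import javax.swing.table.DefaultTableCellRenderer;
import java.awt.Component;
import java.text.NumberFormat;
import java.util.Locale;

// A reusable rendering widget: any table column can adopt currency
// formatting by installing this one renderer.
class CurrencyRenderer extends DefaultTableCellRenderer {
    private final NumberFormat fmt = NumberFormat.getCurrencyInstance(Locale.US);

    @Override
    public Component getTableCellRendererComponent(JTable table, Object value,
            boolean isSelected, boolean hasFocus, int row, int column) {
        String text = (value instanceof Number) ? fmt.format(value) : String.valueOf(value);
        return super.getTableCellRendererComponent(
                table, text, isSelected, hasFocus, row, column);
    }
}
```

A view would attach it with something like `table.getColumnModel().getColumn(2).setCellRenderer(new CurrencyRenderer())`, leaving the view's own code untouched by formatting concerns.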
Advanced Component Development
One of the notable issues for this customer is the contrast between the sophistication of its software needs and the size of its market. It is not uncommon to have to write a component that will be used by only one or two customers, yet it needs to plug in to the system as if it were any other component.
DataFundamentals brought with it high standards for "Black Box" design and insisted that the interface to each component allow for switching out components, much as one might switch out stereo hi-fi components. It took the organization a while to adjust, but the results have been tremendously successful, often surprising us by helping in situations we had not even anticipated.
One example is the use of public-domain libraries, placed in proprietary "Black Box" wrappers within the company's system. By providing this level of abstraction at such a concrete level, the company has discovered that it can safely take chances with libraries it did not write, because they are carefully segregated and maintained as units that can be tested against everything else with automated testing.
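The wrapper idea can be sketched as follows. Here a compression library is hidden behind a small proprietary interface; the interface name (Compressor) and the choice of the JDK's built-in GZIP support as the "outside" library are illustrative assumptions, standing in for whichever public-domain libraries the project actually wrapped.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

// The proprietary contract. Callers never see the wrapped library's types.
interface Compressor {
    byte[] compress(byte[] data);
}

// The "Black Box" wrapper: all knowledge of the outside library lives here,
// so swapping in a different library touches only this one class.
class GzipCompressor implements Compressor {
    public byte[] compress(byte[] data) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
            gz.finish(); // flush all compressed bytes before reading the buffer
            return bos.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```

Because the wrapper is a sealed unit with its own automated tests, a risky or unfamiliar library can be adopted, exercised, and, if it disappoints, replaced, without any ripple through the rest of the system.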