[The photo shows Jim Long, Chief Methodologist at Vitech, and Fred Knopf, Vice President of Marketing, staffing the Vitech booth at a Software Productivity Consortium event in the mid-90s.]
This is Part III in a series of posts about the history of Vitech. Part I recounts the company’s beginning, in 1992. Part II tells the story of an influential mentor. In Part IV, Vitech enters the new millennium, and a classic infantry carrier vehicle gets a boost. In Part V, Vitech contributes to thought leadership for the developing discipline of systems engineering.
When David Long sat down to consider a name for his company back in the summer of 1992, he did not have the luxury of searching the Internet, either for inspiration or simply to avoid names already in use. After some deliberation, Long hit on Vitech, short for “vital technologies.” Vitech was thus Vitech from the very beginning—a name that has served the company well.
The story of naming the software was a somewhat winding road as well. “People always ask what CORE stands for, believing it’s an acronym for systems engineering concepts,” David notes. “From the earliest days, I referred to the base capability being developed as ‘the core,’ knowing that we would continue to deliver greater capability over time. Though I explored other names, ‘the core’ stuck, so in 1993 the product officially became known as CORE, which represented the center and essence.”
Vitech’s first commercial customer was the National Security Agency, which was doing security analysis of hardware. “Our product allowed them to model security requirements, external threats, vectors for cyber-attacks, and corresponding tests to verify performance,” Long recalled. “The NSA team had prior exposure to RDD-100 from Ascent Logic, so they understood the concepts, but were looking for an easier-to-use desktop implementation. As they learned about the development of CORE, they felt it was exactly what they needed. In fact, to best serve their needs, CORE 1.0 was released significantly before the planned launch date. NSA was our first customer and remains a customer to this day.” From there, growth was organic and gradual, much of it via word of mouth.
CORE would go on to achieve such renown within the systems engineering community that it became the go-to product used to teach model-based systems engineering. Today, the software is used as a base around which exercises are written in systems engineering textbooks such as Dennis Buede and William Miller’s book, The Engineering Design of Systems: Models and Methods (published by John Wiley & Sons, 2016). The software has in fact been embedded in this classic systems engineering textbook since its first edition in 2000.
The growth of Vitech as a company paralleled the growth of systems engineering more generally. In the mid-1990s, the systems engineering community was still a small, interconnected world. “You knew who was doing systems engineering. You understood their problems,” Long said. What would become the international professional association for systems engineers, the International Council on Systems Engineering (INCOSE), had been founded in the United States in 1990 as the National Council on Systems Engineering; it did not become the international body INCOSE until 1995. At the time, systems engineering under that name was practiced almost exclusively in aerospace and defense; only later would automotive and other industries recognize similar practices and begin to align under the title “systems engineering.”
In the early years, the company grew in customers and capabilities. Vitech delivered multiple point releases in the 1.x series to meet internal expectations of the capabilities necessary to support a model-driven systems design process. As the team grew, in 1995 it moved into corporate office space in Vienna, Virginia, outside of Washington, D.C. In 1998, CORE 2.0 was released, enabling systems engineering teams to collaborate live, working from a single source of truth for their project as they addressed system requirements, behavior, architecture, and test.
As the dawn of a new millennium loomed, many around the world became concerned about the threat of Y2K: the potential for legacy software that stored years as two digits to malfunction at the turn of the century. While Y2K was not in Vitech’s traditional systems design space, it did lead to an interesting project that complemented Vitech’s portfolio of aerospace and defense work.
A national flood insurance provider approached Vitech deeply concerned about Y2K. They had been preparing for midnight on December 31st, 1999, but in September of 1996 they realized their deadline would arrive three years earlier. They had overlooked the fact that flood insurance is written on a three-year term: any policy issued after December 31st, 1996 would extend into the year 2000. Suddenly scrambling to meet a December 31st, 1996 deadline, company representatives turned to Vitech for systems engineering expertise to quickly understand their processes and underlying systems, and then to develop an implementation and test strategy to meet the looming and immovable date.
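The mechanics of that deadline can be sketched with a toy example. The source does not describe the insurer’s actual systems, so the function names and two-digit-year storage below are purely hypothetical illustrations of the classic Y2K defect: a policy issued in 1997 with a three-year term expires in “00”, which naive two-digit arithmetic orders *before* the issue year.

```python
# Hypothetical sketch of the two-digit-year (Y2K) defect combined with a
# three-year policy term. Nothing here reflects the insurer's real code.

def expiry_two_digit(issue_yy: int, term_years: int = 3) -> int:
    """Expiry year as a legacy system might store it: two digits, wrapping at 100."""
    return (issue_yy + term_years) % 100

def policy_is_active(issue_yy: int, current_yy: int, term_years: int = 3) -> bool:
    """Naive comparison on two-digit years -- the source of the bug."""
    return issue_yy <= current_yy <= expiry_two_digit(issue_yy, term_years)

# A policy issued in '96 expires in '99: the naive check still works in '97.
print(policy_is_active(96, 97))  # True -- correctly treated as in force

# A policy issued in '97 expires in '00: the wrap makes the expiry appear to
# precede the issue year, so the naive check wrongly treats it as lapsed.
print(policy_is_active(97, 98))  # False -- an active policy rejected
```

Because every policy written from January 1st, 1997 onward would carry an expiry date in 2000 or later, the insurer’s systems had to be Y2K-safe three years before the millennium itself.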
The fix for the Y2K problem uncovered a larger issue, though fortunately one that Vitech’s methodology and CORE software could address: the overall structure of the company’s various flood insurance policy pathways and supporting groups was extremely complicated. The company had 70 to 80 data subsystems distributed across 27 locations. Moreover, the system architecture was poorly documented, and the structure was far too complex for any one person to hold in their head.
Recognizing the reality of the schedule, the systems engineers at Vitech realized that a multi-pronged strategy was required. First, the only way to manage final certification of the system was via an interface control document. Each data center manager would ultimately certify that if they received Y2K-compliant data, they would generate Y2K-compliant data (allowing each data center to be treated as a black box subsystem with the internal implementations ignored). Second, process models were developed to clearly capture the processing steps and, more importantly, the associated data as flood insurance policies moved through the system.
Team members on this project, from both the company and Vitech, made giant maps of company processes that they then taped to the wall so they could visualize the program and discover any hiccups. Testers took colored pencils and followed the process on the maps around the room, identifying duplicate test paths that could be dropped to save time, and unaddressed paths for which new tests were written.
By using CORE, the company was able to reengineer their systems in only a couple of months, addressing both the Y2K problem and the issue of their unwieldy and uncoordinated insurance policy processes. In addition to spending less time on the problem, the company was able to design better coverage by addressing the gaps they discovered. Without systems engineering, there is little doubt that they would have been unprepared to serve clients on January 1st, 1997.