Computer software development is typically an expensive, error-prone and time-consuming exercise.
Large computer projects more often than not run behind schedule and over budget, and fail to deliver all the requested features. Typically, this is due to many underlying factors, such as:
- Poorly Understood and/or Insufficiently Defined Needs.
- Changing Software Platforms, Interfaces and Operating Systems.
- Complex Program Interface Designs and Excessive Features.
- A Shortage of Qualified Development Personnel.
- Implementation and Related Training/Re-Training Costs.
- Evolving User and Business System(s) Requirements.
Top-Down and Bottom-Up Design
A well-planned computer application has many factors that will contribute to the overall degree of success of the project. Typically, however, computer software is easier to create than it is to modify.
Therefore it is essential to understand all aspects of the system design BEFORE committing to a particular direction and/or platform.
The only thing worse than software that doesn't work is software that cannot be easily fixed, updated or modified in the light of evolving requirements.
Even the most rigorous, lengthy and expensive studies (including those employing needs and system analysis with sophisticated Computer-Aided Software Engineering (CASE) Tools, Modeling Techniques, etc.) are often found to have missed the mark when the final version is first presented to the actual users.
While these advanced (CASE) methodologies do provide a short-cut to creating a logical system design from a computer software perspective, they often do little in terms of practical integration or long-term flexibility, and may actually add to overall application deployment time and costs.
Changing Management and User Specifications and Directives, combined with evolving Platforms and System Needs, provide a Moving Target for Developers or Information Management Departments, and typically account for a major portion of delays and cost overruns.
A Solid Management Theory and Objectives Outline is
essential and ensures a proper and consistent Direction for the Project.
An Application's End Users should be included in any successful program design, as only they understand details that may have been filtered out at higher Management levels.
Additionally, the End Users often assist the evolving project by providing basic data collection and data entry, and therefore can help to define a proper working structure for both the data relationships and the program flow.

An Application Framework Approach
Most large computer applications can be broken into three distinct components or building blocks:
- Platform (Back-End Software) Server-Side Operating System(s) and Database Standards.
- Interface (Front-End Software) Client-Side Operating System(s) and User Needs.
- Application (or Middle-Ware) Actual Structure, Business Logic and Rules.
These building blocks are not static and are all subject to frequent change. By separating a program into these three components, Change Management and Continuous Improvement become possible.
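The three-way separation above can be sketched in code. The layer names below (CustomerStore, CustomerRules, render_report) are purely illustrative assumptions, not taken from any real product:

```python
# Illustrative sketch only: class and function names are invented.

class CustomerStore:
    """Platform layer: back-end data storage (swappable)."""

    def __init__(self):
        self._rows = []

    def add(self, name, balance):
        self._rows.append({"name": name, "balance": balance})

    def all(self):
        return list(self._rows)


class CustomerRules:
    """Application (Middle-Ware) layer: business logic and rules."""

    def __init__(self, store):
        self.store = store

    def overdue(self, limit=0):
        # A sample "business rule": any balance below the limit is overdue.
        return [r for r in self.store.all() if r["balance"] < limit]


def render_report(rows):
    """Interface layer: front-end presentation (swappable)."""
    return "\n".join(f'{r["name"]}: {r["balance"]}' for r in rows)


store = CustomerStore()
store.add("Acme Ltd.", -250)
store.add("Widget Co.", 100)
rules = CustomerRules(store)
print(render_report(rules.overdue()))
```

Because each layer talks to the next only through a small interface, the data store or the report renderer can be replaced without touching the business rules, which is the Change Management the text describes.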
A Client/Server computing environment logically separates the front end (Client) from the back end (Server).
This allows application developers to focus on the Real Application (or Middle-Ware) i.e. the Actual Structures, Business Logic and Rules, as well as the basic Data Management and Report Definitions, etc.
Although a Client/Server computing environment is very powerful from an Integration perspective, it is also expensive.
Application development on this platform is likewise costly, as it is often complex and relatively slow compared to a local (PC) environment.
Anything that can be done to eliminate unnecessary details and simplify the project could make the difference between success and failure.
While Server Side software is still undergoing rapid evolution, it is (more-or-less) an inter-operable and stable platform due to established and entrenched standards such as SQL and ODBC.
Typically IS departments handle most of these details and the end user is (hopefully) blissfully ignorant about what goes on at this level.
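The portability that such entrenched standards buy can be illustrated with standard SQL issued through Python's DB-API; sqlite3 below is only a stand-in assumption for whatever back-end server an installation actually runs:

```python
import sqlite3

# Standard SQL through Python's DB-API. sqlite3 is a stand-in here:
# the same statements would run against any SQL-compliant server,
# which is the stability the established standards provide.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                 [(19.95,), (42.50,), (7.25,)])
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)
```

Code written to the SQL standard, rather than to one vendor's extensions, survives a change of server with little or no modification.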
The Windows Based Interface is still quite volatile at present, as is the overall platform.
Microsoft NT (Server and Workstation) is the only (Windows) Platform that will provide the needed stability for your mission critical applications.
Unfortunately, NT has relatively high overhead and requires a fast computer and lots of memory to run effectively.
NT Workstation, once it has stabilized with upcoming networking features, should become the user platform of choice in most situations.
The Windows GUI, while currently very popular as an Application Development Environment, will lose ground overall to Intranet Style Applications, due to many factors such as: Portability, Reliability, Centralized Management, Ease of Use, Universal Access, and Overall Cost.
So, What's it going to be?
Intranet based strategies (using a simple Navigator over TCP-IP etc.) are more advanced conceptually, as well as having the advantage of being based on Open Standards, but are still very immature, volatile and unproved for mission critical applications. (see our Intranet
Platforms are certainly very important, especially from an Information Management (I.M. or M.I.S.) perspective, and the core of most systems has more or less already been defined (for the near future at least).
i.e. ODBC / SQL (using TCP-IP) for the back end Servers, along with Windows and Intranet browser based systems for the front end (Clients), plus additional specialty tools in the middle tier as needed.
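That front end/back end split can be sketched with Python's standard library; the report payload below is invented, and a thin, browser-style client fetches it from a server over TCP/IP:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Invented report payload -- stands in for middle-tier output.
REPORT = {"rows": 3, "status": "ok"}

class ReportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(REPORT).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port.
server = HTTPServer(("127.0.0.1", 0), ReportHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any browser (or thin client) speaking HTTP over TCP/IP can read this.
url = f"http://127.0.0.1:{server.server_port}/report"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data)
```

The client needs nothing beyond a Navigator and a network connection, which is what gives the Intranet approach its Universal Access and Centralized Management advantages.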
The bottom line is that even by using so-called Standards, a complex (heterogeneous) and dynamic (continuously changing) computer environment appears inevitable.
Therefore the central issue for application design becomes one of correct underlying structure and on-going change management, rather than of which platform will be the final one.
©Micromax Information Services Ltd. 1999