IT Systems Modeling


System Analysis
System architecture

Heeding the warning that a picture is worth a thousand words, and keeping in mind the Credo of Humility, which states that "all models are wrong, but some are useful," the best way to capture information system requirements is through various modeling techniques.  From these, the right architecture, a soft or hard systems development methodology, and/or CASE tools can be applied as the situation may warrant.





Behavioral Modeling
Behavioral modeling is concerned with identifying things that happen, the activities associated with what happens, and the response of the system to those activities.
Event:  something that happens at a point in time.  For example, in a company an event can be:  employee is hired.
Activity:  describes what is done when an event occurs.  For the event "employee is hired," an activity can be:  maintain employee data.
Response:  the output produced when an activity is complete.
Table 1 summarizes selected events, activities, and responses for a business system concerned with producing payroll.  Also refer to the process improvement, business process re-engineering, and brainstorming documents.
Table 1:  Summary of Produce Payroll Behavior Model

Event | Activity | Response
Employee is hired | Maintain employee data | Update database
Tax rates change | Maintain tax tables | Update database
Tax calculations change | Maintain payroll system | Modify payroll system
End of pay period | Produce payroll | Produce paychecks, direct deposit transfers, deduction withholding report, federal tax direct deposit transactions, general ledger transactions, cash requirements report, departmental payroll expense report
End of month, end of quarter, end of year | Initiate reporting appropriate for the period | Initialize MTD, QTD, YTD values
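As a rough sketch, the event/activity/response structure of Table 1 can be thought of as a dispatch table that maps each event to its activity and response. The following Python sketch is illustrative only; the function and event names are assumptions, not part of any actual payroll system:

```python
# Hypothetical sketch: the Table 1 behavior model as an event-dispatch table.
# Each event maps to an (activity name, activity function) pair; the activity
# function returns the system's response.

def maintain_employee_data(event):
    return "Update database"

def produce_payroll(event):
    return "Produce paychecks, direct deposits, and payroll reports"

BEHAVIOR_MODEL = {
    "employee is hired": ("maintain employee data", maintain_employee_data),
    "end of pay period": ("produce payroll", produce_payroll),
}

def handle(event):
    """Look up the activity for an event and return (activity, response)."""
    activity, action = BEHAVIOR_MODEL[event]
    return activity, action(event)
```

In this style, adding a new row to the behavior model is just adding a new entry to the table, which keeps the model and the system response logic in one place.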
Use Case Analysis
The Use Case model describes WHAT (as opposed to HOW) the system will do, at a high level and with a user focus, for the purpose of capturing system requirements.
Use cases document the user-system interactions required to perform tasks and define the user and system actions for each "user visible" task performed by the information system.  As such, they effectively define all user interfaces and the sequence of interactions within them.  Use cases focus on "typical" interactions first, describing the case when "everything goes right," and separately describe exception processing as necessary to accomplish the task.
Use cases name the major "processes" and show the actors with which they interact.  The process of developing use cases is iterative and begins by identifying the actors involved in the activity.

Actors
An actor may be a person or an external system.  A single person may play a number of different roles in any given activity.  It is important to identify what tasks the different roles are trying to accomplish and their contribution to the activity.  These dictate what the information system must do and what the actor must do.  There are typically different levels of roles within a business activity supported by an information system.  These include the data entry role, monitoring role, analysis role, and decision-making role.  The actors and tasks are organized into a use case diagram, which shows actors as named stick figures, tasks as named ovals, and interactions as lines.  In particular, it is important to document when different actors interact with the same task to ensure that each is consulted when developing the use case for that task.

It can be said that a use case is a collection of possible sequences of interactions between the system under discussion and its Users (or Actors), relating to a particular goal.  The collection of use cases should define all system behavior relevant to the actors to assure them that their goals will be carried out properly.  Any system behavior that is irrelevant to the actors should not be included in the use cases.
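A use case, as described above, pairs a main "everything goes right" scenario with separately listed exception paths. A minimal sketch of that structure follows; the use case name, actor, and steps are hypothetical examples, not taken from the source:

```python
# Hypothetical sketch: a use case as a named goal with a main scenario
# ("everything goes right") and separately documented exception paths.
from dataclasses import dataclass, field

@dataclass
class UseCase:
    name: str
    actor: str
    main_scenario: list                         # ordered user/system interactions
    exceptions: dict = field(default_factory=dict)  # condition -> handling steps

enter_timecard = UseCase(
    name="Enter Timecard",
    actor="Payroll Clerk",
    main_scenario=[
        "User selects Enter Timecard",
        "System prompts for employee number and hours worked",
        "User enters the data",
        "System validates and stores the timecard",
    ],
    exceptions={
        "unknown employee number": ["System displays an error and re-prompts"],
    },
)
```

Keeping exceptions out of the main scenario, as the text recommends, keeps the typical interaction easy to read while still capturing every path the system must handle.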

Business Process Modeling
Business processes are modeled to identify bottlenecks and inefficiencies in those processes, and opportunities to address them via the reorganization of work and the application of information technology.
Business Process Modeling Representation
Business processes are represented using stick figure diagrams (see Figure 1) or workflow diagrams to document what is actually done in a business process.  The purpose is to identify bottlenecks and inefficiencies so the process can be reengineered to be more effective and more efficient.  When modeling business processes, the goal is to identify opportunities for information technology to enable the business organization to be more productive, to enhance its ability to serve its customers, or in some other way improve the way in which it does business, rather than making an already inefficient process inefficient faster!
To model business processes one can use icons that represent "things" in the business and in its environment that:
1.  Cause the business to perform
2.  Impose rules on the way in which the business performs
3.  Require being informed of the occurrence or results
4.  In any other way interact with the business process under consideration
These include employees, managers, information and mechanical systems, other organizational units, stakeholders within the business, customers, government, stockholders, etc.
Consider applying the above framework to the Sales Order Processing model of Figure 1.
In a typical business, the questions would likely be addressed to a person in Marketing or Sales with a good understanding of the Customer view.  Furthermore, the analyst must determine whether all Customers can be considered homogeneous for purposes of the analysis or whether different Customer segments must be analyzed separately.
Following the framework established above:
1.  From the Customer perspective, the mission is to acquire the most appropriate products for the business situation at a good price, delivered in an appropriate timeframe.
2.  Tasks necessary to accomplish that mission are: determine what products are on the market, their characteristics, their appropriateness, their cost, available suppliers, their reliability, their delivery and service capability, their reputation, and the potential to purchase from the supplier or establish a strategic alliance with the supplier.
3.  Problems that might have been encountered include poor quality or overpriced purchases, late or improper deliveries, and inadequate supplier payment terms or response for returns and allowances.  Solutions include finding a high-quality supplier and negotiating appropriate pricing and terms.
4.  The decisions are: what to purchase and from whom.
The Customer's critical success factors include acquiring appropriate products at a fair price, measured by how well the products perform and the availability of comparable products at a lower cost.  Also refer to the process improvement and business process re-engineering documents.
Information Systems Process Modeling (Dataflow Diagrams)
Given a well-specified business process and the information requirements to support it, Data Flow Diagrams (DFDs) can be used to represent the data acquisition, transformation, and storage and the information delivery processes within an information system, and then processed via Computer Assisted Software Engineering (CASE) tools to generate the necessary programs.  Alternatively, these DFDs can be used by programmers to maximize the probability of successfully implementing all the specified information system requirements.  DFDs make use of External Entities, Data Flows, Processes, and Data Stores.
External Entities represent  people or external systems with which the current (or proposed) system must interact to acquire its input data or to which the current system must deliver its output data.  External Entities typically represent the “responsible party” for the data flow.  For example, while a payroll clerk may actually enter the employee hours worked into a payroll system, the employee is the responsible party for the data and would be a better choice for the External Entity from which this data flow occurs.
Data Flows represent the data content.  It can  be acquired from an External Entity, output to an External Entity, required as input to an information system Process or produced as output from an information system Process.  For example, the data on an employee’s timecard could be represented as a Data Flow named, Timecard.  The data elements included in each data flow must be documented in a data dictionary associated with the data flow.  The Timecard data flow, likely contains data elements such as: employee number, employee name, period ending, and hours worked.
Processes transform data according to business rules (see Table 1).  The business rules and processing logic must be documented for each process.  Processes transform data; they do not create it.  All outputs produced from a Process must be calculated from its inputs according to its processing logic, ultimately expressed in a computer programming language, but initially expressed using a process narrative, Structured English, decision tables, or decision trees.  Processes that are too complicated to be easily described in this way are decomposed or exploded into 5 to 9 processes at the next lower level of detail.
Data Stores can be viewed as an "inventory" of data that lives within the system when no processes are active.  Data Stores do not transform data; they are simply repositories.  Exactly the same data must flow out of a Data Store as flows into it.  As with Data Flows, the contents of each Data Store must be documented in a data dictionary.  Often they are used to "decouple" processes over time.  For example, if employees fill out and submit timecards each day, but payroll is produced only once per week, the information system may show a process to enter and validate timecard data which outputs the data into a Timecard Data Store.  Another process inputs data from that data store to produce paychecks, update payroll records, and produce required payroll reports.
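The timecard example above, where a data store decouples daily timecard entry from weekly paycheck production, can be sketched as follows. This is a minimal illustration; the function names, validation rule, and in-memory list standing in for the Timecard Data Store are all assumptions:

```python
# Hypothetical sketch: a Data Store decoupling two processes over time.
# Timecards are entered daily; paychecks are produced once per week.

timecard_store = []   # stands in for the Timecard Data Store

def enter_timecard(employee_number, hours_worked):
    """Daily process: validate timecard data and deposit it into the store."""
    if hours_worked < 0 or hours_worked > 24 * 7:
        raise ValueError("invalid hours worked")
    timecard_store.append({"employee": employee_number, "hours": hours_worked})

def produce_paychecks(wage_rates):
    """Weekly process: read the accumulated timecards back out of the store
    and compute gross pay for each; exactly the data that flowed in flows out."""
    checks = [(t["employee"], t["hours"] * wage_rates[t["employee"]])
              for t in timecard_store]
    timecard_store.clear()   # processed timecards leave the store
    return checks
```

Note that neither function transforms data *inside* the store; the store simply holds the timecards between the daily and weekly process runs.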
DFDs are “leveled” to show increasing detail of the transformation processes. The top few DFD levels typically represent the top few menus in the information system application.  Both are designed to identify the organization of functions within the system.  There exist numerous CASE tools to assist in producing and analyzing DFDs.  These are often combined with code generators to produce menus, screens and algorithms for the system implementation.
At the highest level the information system is represented by a single process called a Context diagram.  The Context diagram defines what data will be input to the system and what data will be output.  It represents a contract with the client.  It essentially says, "I will develop software to capture the inputs and transform them into the outputs shown.  The system will not capture any inputs or produce any outputs not shown."  It is the analyst's job to be sure that the outputs shown on the Context Diagram are sufficient to meet the needs of the client and that the inputs shown on the Context Diagram are sufficient to produce them.
DFD representation:
There are two popular types of notations for DFDs: Yourdon & Coad or Gane & Sarson as shown below:
Figure 1 shows a Context (or Level 0) Data Flow Diagram for a simplified Payroll System.
  • The process is Produce Payroll. 
  • The External Entities specified are:  Employee and Payroll Manager. 
  • Six data flows are specified: Timecards, Pay Checks, W2 Forms, Employee Data, Tax Tables and Parameters, and Payroll Reports. 
Employee provides Timecards and receives Pay Checks and W2 Forms.  Payroll Manager provides Employee Data and Tax Tables and Parameters and receives Payroll Reports. 
The contents of each data flow must be defined in a data dictionary.  Figure 2 shows the data dictionary for the six data flows included in the Context Diagram. Employee Data includes employee number, name, social security number, and wageRate. 
Although the source of data such as name and social security number actually come from the employee, the Payroll Manager is deemed to be the “responsible party” for this data.  Similarly, the Federal Government is the source of Tax Tables and Parameters (e.g., MaxFICAWages, FICARate), however, again, the Payroll Manager is deemed to be the “responsible party.” 
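A data dictionary like the one in Figure 2 can be sketched as a simple mapping from data flow name to data elements. The element lists below follow only what the text states (Employee Data and two Tax Tables and Parameters items); the rest are deliberately left undefined, and `None` is an assumed convention for "contents not yet defined":

```python
# Hypothetical sketch: the Context Diagram data dictionary as a mapping.
# None marks a data flow whose contents have not yet been defined,
# flagging it for further analysis.

DATA_DICTIONARY = {
    "Timecards": ["employeeNumber", "employeeName", "periodEnding", "hoursWorked"],
    "Employee Data": ["employeeNumber", "name", "socialSecurityNumber", "wageRate"],
    "Tax Tables and Parameters": ["maxFICAWages", "FICARate"],  # partial, per text
    "Pay Checks": None,        # to be defined during analysis
    "W2 Forms": None,          # to be defined during analysis
    "Payroll Reports": None,   # undefined: requirements analysis still needed
}

# Flows still needing analysis can be listed mechanically:
undefined = [flow for flow, elements in DATA_DICTIONARY.items() if elements is None]
```

Keeping the dictionary in a machine-readable form makes it easy to report which flows, like Payroll Reports below, still lack defined contents.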
Note:  At this stage in the analysis, the contents of the data flow, Payroll Reports, has not been defined.  This informs the development team that analysis must be done to determine the exact requirements.  General areas for investigation such as General Ledger and Federal and State reporting requirements are noted.
If the analyst can describe the processing logic and business rules to transform each input into each output, then the process analysis is complete.  Typically, however, the context process is "exploded" into five to nine subprocesses, each of which accomplishes some part of the transformation.  Two techniques are typically used to explode a process: Transaction Analysis and Functional Decomposition.

Table 2:  Typical Dataflow Diagram Notations
(Table 2 contrasts the Yourdon and Coad notations with the Gane and Sarson notation for Processes, Datastores, Dataflows, and External Entities; the symbol images are not reproduced here.)
In Transaction Analysis, each output data flow is analyzed as follows.  First a process is created to produce that output.  The input data flows needed to produce that output are determined and then a source must be determined for each of these inputs.  If the source is not an External Entity, then a process must be created to produce that data flow and the process is repeated.  At this point data stores are introduced to hold data that “lives” within the system, beyond being used by a process.  In Functional Decomposition, five to nine major subfunctions are identified and tied together using data flows.  Frequently Transaction Analysis is used as a means to validate Functional Decomposition.
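Transaction Analysis as described above is essentially a worklist algorithm: start from each output, create a process to produce it, then find a source for each of its inputs, recursing until every flow traces back to an External Entity or Data Store. The sketch below is an illustrative rendering of that procedure; the data structures and naming convention are assumptions:

```python
# Hypothetical sketch of Transaction Analysis as a worklist algorithm.
# outputs:    the output data flows to be produced
# inputs_for: mapping from a data flow to the input flows needed to produce it
# sources:    flows already supplied by External Entities or Data Stores

def transaction_analysis(outputs, inputs_for, sources):
    processes = {}
    worklist = list(outputs)
    while worklist:
        flow = worklist.pop()
        if flow in processes or flow in sources:
            continue                            # already has a producer or source
        processes[flow] = "Produce " + flow     # create a process for this flow
        worklist.extend(inputs_for.get(flow, []))  # now find a source for each input
    return processes
```

For example, analyzing the single output Pay Checks, whose inputs all come from data stores, yields just one process:

```python
transaction_analysis(
    ["Pay Checks"],
    {"Pay Checks": ["Timecards", "Employee Check Data", "Tax Tables and Parameters"]},
    {"Timecards", "Employee Check Data", "Tax Tables and Parameters"},
)
```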
Figure 3 illustrates an explosion of Process Payroll into six subprocesses:
  1.  Enter Timecards
  2.  Produce Pay Checks
  3.  Produce W-2s
  4.  Maintain Tax Data
  5.  Produce Payroll Reports
  6.  Update Employee Data

Typically the first explosion represents the menu choices for an application; that is, the major functions the user can initiate.  From a Transaction Analysis perspective, the output Data Flow Pay Checks is produced by the process Produce Pay Checks.  To produce that output requires input data flows containing Timecard data, Employee Check Data, and Tax Tables and Parameters.  Each input data flow must have a source; in this case each comes from a data store, since the data is deemed to be "long lived."  The input data needed by a process is determined by developing the logic and business rules needed to produce the output, as illustrated by the processing logic in Table 1 to calculate FICA Tax, one of the data items in the Pay Checks Data Flow.  If the processing logic is too complex, that process must be exploded.
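Processing logic like the FICA Tax calculation referenced above could be sketched as follows. The rate and wage cap are illustrative assumptions (the actual tax table values are not reproduced in this document), but the structure, applying the rate only up to the annual wage cap, follows the MaxFICAWages/FICARate parameters named earlier:

```python
# Hypothetical sketch of FICA Tax processing logic.
# FICA_RATE and MAX_FICA_WAGES are illustrative assumptions, not current law;
# in the real system they would come from the Tax Tables and Parameters flow.

FICA_RATE = 0.062          # assumed Social Security rate
MAX_FICA_WAGES = 140000.0  # assumed annual wage cap

def fica_tax(gross_pay, ytd_wages):
    """FICA applies only to wages up to MAX_FICA_WAGES for the year,
    so only the portion of this check under the cap is taxable."""
    taxable = max(0.0, min(gross_pay, MAX_FICA_WAGES - ytd_wages))
    return round(taxable * FICA_RATE, 2)
```

Once an employee's year-to-date wages reach the cap, the taxable portion is zero, which is exactly the kind of business rule that must be documented for the Produce Pay Checks process before it can be implemented.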

Business Modeling
Business Management Tip (February 23, 2004): Modeling Really Quickly While Interviewing a Subject Matter Expert
Simulation Tip (May 27, 2002): Simulation Graphs and Reports
Simulation Tip (May 21, 2001): Simulation Cheat Sheet
Business Modeling (August 21, 2000): What's in a Role?
e-Business Modeling (July 17, 2000): Using Value Chains to Model an e-Commerce Business
Business Modeling (May 15, 2000): Impact Analysis Through the Browser
Business and UML Modeling (April 17, 2000): Building an Auto-Decomposition Diagram
UML and Business Modeling (November 22, 1999): Creating UML Class Models from Business Objects

General Modeling
General Tip (September 27, 2004): Coloring the Background of the Diagram Workspace
Presentation Tip (September 6, 2004): Using Mouseover Highlighting to Make a Point
General Management Tip (August 9, 2004): Breaking Symbol Names Into Multiple Lines
Help Tip (December 8, 2003): Creating Your Own Custom Help File
General Tip (September 8, 2003): Adding a Professional Look to a Hierarchy Diagram
General Modeling (June 25, 2001): Singling Out Symbols for Rename
General Modeling (January 29, 2001): Using Notes on Diagrams, Symbols, and Definitions
Using New System Architect Feature (December 11, 2000): Using the Show/Hide Sub-Tree Option for Hierarchy Diagrams
General Modeling Techniques (March, 1999): The System Architect Meta-Model
General Modeling Techniques (February, 1999): Inserting Pictures into System Architect

Data Modeling
Data Modeling Tip (October 14, 2002): Propagating Foreign Keys in a Relational Data Model
Data Modeling Tip (February 26, 2001): Attaching an Entity Definition to a Data Store Definition
Data Modeling (July 31, 2000): Setting Multiplicity Notation
Data Modeling (January 10, 2000): Ordering Compound Keys, Indexes, and Access Paths

Use Case Driven Object Modeling with UML: A Practical Approach
Presents an approach to UML modeling that includes a minimal set of diagrams and techniques to get from use cases to code quickly. By Doug Rosenberg and Kendall Scott, Addison-Wesley. A few chapters of the book are available for free online.
The goal of this white paper is to demonstrate how the Borland® Suite of Tools can automate use case modeling activities.
As web services move from the cutting edge to the mainstream, identification of the opportunities to use this technology become as important as understanding the technology itself.
This white paper provides an overview of what is new in the UML 2.0 standard.
This edition of the Coad Letter takes a look at the dynamics of use cases.
This tutorial provides a quick introduction to the Unified Modeling Language™
This MSDN Webcast demonstrates how to use Borland Together Edition for Visual Studio .NET to design your application. The session will discuss how to integrate design into the entire application lifecycle from requirements to deployment.
Here is an update on when you can expect to see the UML 2.0 specifications and some ideas on how you can get your first glimpse.
Author/consultant/trainer Scott Ambler talks about Agile Modeling, eXtreme Programming, silver bullets, embracing change, virtually clueless high school guidance counsellors, and what he'll be doing at BorCon this year.
This article looks at business rules in a modeling and development environment.
This article examines a new element of the use case diagram in UML 2.0 including multiplicities and conditions on "extends" relationships

This article examines the way that profiles have changed in the UML 2.0 spec.
This article examines the way that exceptions are modeled in the UML 2.0 specification.
The UML 2.0 Specification has been approved by the OMG. The final editing process is going on and the specification is set to be released to the public by the end of the year. Here is what is new.
Resources on how to get up-to-speed with JDataStore
Domain Analysis with Color Modeling
Read chapter 1 from the book "Java Modeling in Color with UML" by Peter Coad, Eric Lefebvre, and Jeff De Luca.
Todd Olsen, Chief Scientist of Borland's Together Business Unit, will show you how to Accelerate your productivity with modeling using the integration of Borland JBuilder and Together Edition for JBuilder
