The feasibility study and evaluation criteria

  • Created by: Jasmin
  • Created on: 21-04-13 10:41


FEASIBILITY STUDY - carried out to find out whether a new system can be developed at a reasonable cost and in a sensible amount of time

  • A check that the benefits of the new system outweigh its costs
  • A decision is made on whether to create the new system or improve the old one

Methods used for fact finding (investigating a system to understand how it works, or how it should work)

QUESTIONNAIRES - given to each user, with questions about how the job is done now and about the information the new system will need

INTERVIEWS - take longer than questionnaires, but give a fuller picture of how the existing system works and what is required from the new system

OBSERVATION - watching users at work with the existing system, so problems encountered with the old system can be seen first-hand

INSPECTION OF RECORDS - looking at any paperwork involved with the current system



EVALUATION CRITERIA - used at the end of the project to determine whether it has been a success or not

  • Desired outcomes: a list of what the system must be able to do
  • Performance criteria: measurable criteria showing whether the desired outcomes have been achieved

Stages of the systems life cycle

ANALYSIS: looking in detail at the current system, or at the requirements for a task that has never been performed before.

  • Identifying the problem that needs solving
  • Understanding the existing system/proposed system
  • Identifying the desired outcomes
  • Setting up performance criteria


DESIGN: designing the system in line with the desired outcomes.

  • Choosing methods for input, storage and output
  • Deciding what processing needs to be performed on the data
  • Designing validation tests/test plans
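The design decisions above can be sketched in code. This is a hypothetical example (a marks system, invented for illustration): the input method, the processing performed on the data, and the output method are each chosen at the design stage.

```python
# Hypothetical design sketch for a marks system, showing input, processing and output.

def process_marks(marks):
    # Processing chosen at design: compute the average of the entered marks
    return sum(marks) / len(marks)

# Input method: values entered by the user (hard-coded here for illustration)
marks = [62, 75, 88]

# Output method: a printed report
print(f"Average mark: {process_marks(marks):.1f}")  # prints "Average mark: 75.0"
```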

Test plans: a detailed list of the tests to be conducted on the system once it has been developed, to check that it is working properly. Produced during the design stage. Make sure that:

  • Tests are numbered
  • Each test clearly specifies the data to be used
  • The reason for each test is stated
  • The expected result is recorded
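The points above can be sketched as a simple test plan held as data. The tests, values and expected results here are hypothetical examples, not taken from any real system:

```python
# A test plan: each test is numbered and records its data, reason and expected result.
test_plan = [
    {"number": 1, "data": 57,  "reason": "typical data is accepted",   "expected": "accepted"},
    {"number": 2, "data": 100, "reason": "extreme data is accepted",   "expected": "accepted"},
    {"number": 3, "data": 101, "reason": "erroneous data is rejected", "expected": "rejected"},
]

for test in test_plan:
    print(f"Test {test['number']}: data={test['data']!r} - {test['reason']} "
          f"(expected: {test['expected']})")
```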

Typical data: normal data that passes all validation checks

Extreme data: data on the borderline of what the system will accept

Erroneous data: unacceptable data that should be rejected by the validation checks
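The three kinds of test data can be illustrated with a hypothetical validation check (invented for this sketch): suppose an exam mark must be a whole number from 0 to 100.

```python
# Hypothetical validation check: an exam mark must be an integer from 0 to 100.
def is_valid_mark(mark):
    return isinstance(mark, int) and 0 <= mark <= 100

# Typical data: normal values that pass every validation check
print(is_valid_mark(57))     # True

# Extreme data: values on the borderline of what the system will accept
print(is_valid_mark(0))      # True
print(is_valid_mark(100))    # True

# Erroneous data: unacceptable values the validation check must reject
print(is_valid_mark(101))    # False
print(is_valid_mark(-5))     # False
print(is_valid_mark("ten"))  # False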



IMPLEMENTATION: the system is actually built according to the design

  • Using software tools to produce a working version
  • Producing a working system that meets the desired outcomes

TESTING: carried out once the system has been implemented.

  • Entering the test data specified in the test plan
  • Comparing the actual results with what should have happened
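The testing stage above can be sketched as: run each test in the plan, then compare the actual result with the expected one. The validation rule and the tests are hypothetical examples:

```python
# Hypothetical validation rule under test: a mark must be an integer from 0 to 100.
def is_valid_mark(mark):
    return isinstance(mark, int) and 0 <= mark <= 100

# Test plan drawn up at the design stage: (test number, test data, expected result)
test_plan = [
    (1, 57, True),    # typical data should be accepted
    (2, 0, True),     # extreme data (lower borderline) should be accepted
    (3, 101, False),  # erroneous data should be rejected
]

# Testing stage: enter each item of test data and compare with the expected result.
for number, data, expected in test_plan:
    actual = is_valid_mark(data)
    outcome = "PASS" if actual == expected else "FAIL"
    print(f"Test {number}: data={data!r} expected={expected} actual={actual} -> {outcome}")
```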

USER TRAINING AND DOCUMENTATION: users are trained in how to use the new system. Documentation is produced that users can turn to when learning a new procedure or dealing with a problem

EVALUATION AND MONITORING: finding out about any problems with the new system.

  • Checking that the original user requirements and performance criteria have been fully met
  • Assessing how happy users are with the new system

MAINTENANCE: extra functions that need to be added to the existing system are identified



