Capability Maturity Model (CMM)

  • Developed by the software community in 1986 under the leadership of the SEI (Software Engineering Institute).
  • Has become the de facto standard for assessing and improving processes related to software development.
  • Has evolved into a process maturity framework that provides guidance for measuring software process maturity and helps establish process improvement programs.

Maturity levels

  • Initial
  • Repeatable
  • Defined
  • Managed
  • Optimizing

Level 1: Initial

Each maturity level decomposes into several key process areas that indicate where an organization should focus to improve its software process. Level 1 (Initial) has no key process areas of its own.

Level 2 - Repeatable: Key process areas

Requirements management
Software project planning
Software project tracking & oversight
Software subcontract management
Software quality assurance
Software configuration management


Level 3 - Defined: Key process areas

Organization process focus
Organization process definition
Training program

Integrated software management
Software product engineering
Intergroup coordination
Peer reviews

Level 4 - Managed: Key process areas

Quantitative Process Management
Software Quality Management


Configuration Management

Configuration management covers the processes used to control, coordinate, and track: code, requirements, documentation, problems, change requests, designs, tools/compilers/libraries/patches, changes made to them, and who makes the changes.
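
As a loose sketch (the record fields below are illustrative, not a prescribed schema), a configuration management tool essentially keeps track of what changed, in which item, by whom, and under which change request:

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ChangeRecord:
    """One tracked change to a configuration item (code, document, design, tool...)."""
    item: str            # e.g. "requirements.doc" or "billing_module.py" (examples only)
    change_request: str  # identifier of the approved change request (assumed format)
    author: str          # who made the change
    description: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# A minimal in-memory change log; real CM systems persist and version this.
change_log: list[ChangeRecord] = []
change_log.append(ChangeRecord("billing_module.py", "CR-104", "alice",
                               "Fixed rounding in invoice totals"))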


Bug

If the software exhibits any of the following, it is considered a bug:

  • Anything that is not defined by the client
  • Excess things are added to the software
  • It does not produce the expected result (see the example after this list)
  • It is not user friendly
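
A minimal example of the third case (the discount function here is made up and contains a deliberate defect): an automated check flags a bug because the actual result does not match the expected result.

def apply_discount(price, percent):
    # Hypothetical implementation with a defect: it subtracts the raw
    # percentage instead of a percentage of the price.
    return price - percent


def test_apply_discount_produces_expected_result():
    # Requirement (assumed): a 10% discount on 200 should yield 180.
    expected = 180
    actual = apply_discount(200, 10)
    # The mismatch between actual (190) and expected (180) is a bug:
    # the software does not produce the expected result.
    assert actual == expected, f"expected {expected}, got {actual}"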


Difference between User Acceptance Testing and System Testing

User Acceptance Testing is performed by the client of the application to determine whether the application has been developed as per the requirements he or she specified. It can be performed within the development organization or at the client's site. Alpha testing and beta testing are examples of acceptance testing.
System testing is performed by the testers to determine whether the application satisfies the requirements specified in the SRS.
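
As a rough illustration (the requirement ID and the authenticate stub are hypothetical), a system test is typically written and traced against a specific SRS requirement:

import unittest


def authenticate(username, password):
    # Stand-in for the real application entry point exercised end to end.
    return password == "secret"


class LoginSystemTest(unittest.TestCase):
    """System test traced to a hypothetical SRS requirement."""

    def test_srs_req_4_2_rejects_wrong_password(self):
        # SRS-4.2 (assumed): the system shall reject invalid credentials.
        self.assertFalse(authenticate("alice", "wrong-password"))


if __name__ == "__main__":
    unittest.main()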


Integration Testing

Integration Testing is defined as testing of combined parts of an application to determine whether they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.
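
A minimal sketch, assuming two hypothetical modules (a tax calculator and an invoice builder), of an integration test that checks the combined parts work together:

# Hypothetical module A: tax calculation.
def compute_tax(amount, rate=0.2):
    return round(amount * rate, 2)


# Hypothetical module B: invoice building, which depends on module A.
def build_invoice(amount):
    tax = compute_tax(amount)
    return {"net": amount, "tax": tax, "total": amount + tax}


def test_invoice_and_tax_integrate_correctly():
    # Integration test: exercises both modules together rather than in isolation.
    invoice = build_invoice(100)
    assert invoice["total"] == 120.0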


Regression Testing

Regression Testing is re-testing after fixes or modifications to the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing tools can be especially useful for this type of testing.
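
As a small sketch (the slugify function is hypothetical), an automated regression suite simply re-runs existing checks after every fix so that earlier behavior is not broken:

import unittest


def slugify(title):
    # Hypothetical function under test; an earlier fix handled extra whitespace.
    return "-".join(part for part in title.lower().split() if part)


class SlugifyRegressionSuite(unittest.TestCase):
    # Re-run automatically (e.g. in CI) after every change to catch regressions.
    def test_basic_title(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_extra_spaces_fixed_in_earlier_release(self):
        # Guards a previously fixed defect so it does not reappear.
        self.assertEqual(slugify("  Hello   World "), "hello-world")


if __name__ == "__main__":
    unittest.main()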


Testing Techniques

White Box Testing
  • Aims to establish that the code works as designed
  • Examines the internal structure and implementation of the program
  • Targets specific paths through the program
  • Needs accurate knowledge of the design, implementation and code
Black box testing
  • Aims to establish that the code meets the requirements
  • Tends to be applied later in the life cycle
  • Mainly aimed at finding deviations in behavior from the specification or requirements
  • Causes are inputs, effects are observable outputs (a small white box vs. black box sketch follows this section)
Alpha Testing: A customer conducts alpha testing at the developer's site. The software is used in a natural setting, with the developer recording errors and usage problems. Alpha tests are conducted in a controlled environment by the developer.
Beta Testing: Beta testing is conducted at one or more customer sites by the end user(s) of the software. The developer is not present at the customer's site, so the beta test is a 'live' application of the software in an environment that cannot be controlled by the developer. The customer records all the problems (real or apparent) encountered during beta testing and reports them to the developer at regular intervals.
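
A rough sketch, using a hypothetical absolute-value function, of the two viewpoints: the white box tests are written with knowledge of the code and deliberately exercise each branch, while the black box test only checks the specified input/output behavior.

def absolute(x):
    # Implementation under test (hypothetical).
    if x < 0:
        return -x
    return x


# White box: written with knowledge of the code, targeting each path.
def test_white_box_negative_branch():
    assert absolute(-3) == 3   # exercises the 'x < 0' path


def test_white_box_non_negative_branch():
    assert absolute(7) == 7    # exercises the fall-through path


# Black box: written only from the specification "returns the magnitude of x".
def test_black_box_specification():
    for value, expected in [(-10, 10), (0, 0), (10, 10)]:
        assert absolute(value) == expected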


Test Planning - Test Strategy

Writing a strategy involves defining a number of issues agreed between the testing organization and the client. Testers must understand the issues discussed in the strategy document, which are outlined below.
1. End-to-End: The test path uses the entire flow provided in the application for completing a specified task. Within this process, various test conditions and values are covered and the results analyzed. There may be a possibility of reporting several defects relating to the segments while covering the test path. The advantage of using this approach is that it minimizes combinations and permutations of conditions/values while ensuring coverage and integration.
2. Automation Strategy: The testing process is automated to reduce the effort spent during regression testing. In some cases, automating the entire testing process may not be possible due to technical and time constraints.
3. Performance Strategy: The client specifies the standards for performance testing. They generally contain:
• Response time
• Number of virtual users
Using the above information, a usage pattern of the application is derived and documented in the strategy.
4. Risk Analysis: Risks associated with the project are analyzed and their mitigations documented:
Schedule Risk: Factors that may affect the testing schedule are discussed.
Technology Risk: Risks concerning the hardware and software of the application are discussed.
Resource Risk: Test team availability in case of slippage of the project schedule is discussed.
Support Risk: Clarifications required on the specification, and the availability of personnel to provide them, are discussed.
5. Effort Estimation: The function points in the functional specification are used as the basis for estimating the effort needed for the project. The average of the different estimates from the peers in the test team is taken as the basis for calculating the required effort (see the sketch after this list).
6. Infrastructure: The hardware and software requirements for testing the application are documented. Apart from this, any other requirements should also be documented. Infrastructure that has to be provided by the client is also specified.
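
As a simple sketch of the effort estimation step (the peer estimates below are made-up numbers), the different estimates derived from the function points are averaged:

# Hypothetical peer estimates, in person-days, each derived from the
# function points counted in the functional specification.
peer_estimates = {"tester_a": 42, "tester_b": 50, "tester_c": 46}

# The average of the peer estimates is taken as the basis for the
# testing effort of the project.
estimated_effort = sum(peer_estimates.values()) / len(peer_estimates)
print(f"Estimated testing effort: {estimated_effort:.1f} person-days")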


Test Case Execution Process

Targets for completion of phases: Time frames for the passes have to be decided and committed to the client well in advance of the start of testing. Some of the factors considered in doing so are:
Number of cases/scripts: Depending on the number of test scripts and the resources available, completion dates are prepared (see the sketch after this list).
Complexity of testing: In some cases the number of test cases may be small, but the complexity of the tests may be a factor. The testing may involve time-consuming calculations or responses from external interfaces, etc.
Number of errors: This is done very exceptionally; pre-IST testing is done to check the health of the application soon after the preparations are done. The number of errors that were reported should be taken as a benchmark.
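
A minimal sketch, with made-up numbers, of how a completion target can be derived from the number of scripts, the testers available, and an assumed daily execution rate:

total_scripts = 240               # test scripts planned for the pass (assumed)
testers = 4                       # resources available for execution (assumed)
scripts_per_tester_per_day = 10   # assumed average execution rate

days_needed = total_scripts / (testers * scripts_per_tester_per_day)
print(f"Estimated duration of the pass: {days_needed:.0f} working days")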


Allocation of test cases on different passes

It may not be possible to execute all test scripts in the first pass. Some of the reasons for this could be:
  • Functionality may sometimes be introduced at a later stage and the application may not support it yet, or the test team may not be ready with the preparation
  • External interfaces to the application may not be ready
  • The client might choose to deliver some parts of the application for testing, with the rest delivered during other passes


Allocation of test cases among the team

The test team should decide on the resources that will execute the test scripts. Ideally, the tester who designed the test script for a module executes it. In some cases, due to a shortage of time or resources at that point, additional test scripts might have to be executed by other members of the team. Clear documentation of responsibilities is done in the test plan.


Test Execution Sequence

Test scripts can be executed either in a random order or in a sequential fashion. Some applications have concepts that require sequencing of the test cases before actual execution. The details of the execution are documented in the test plan. Sequencing can also be done on the modules of the application, as one module may populate or formulate information required by another (see the sketch below).
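
A small sketch, with hypothetical module names, of deriving an execution sequence so that a module that populates data runs before the modules that consume it:

from graphlib import TopologicalSorter

# Hypothetical module dependencies: "orders" needs data populated by
# "customers", and "billing" needs data from "orders".
depends_on = {
    "customers": set(),
    "orders": {"customers"},
    "billing": {"orders"},
}

# Derive an execution order that respects the dependencies.
execution_order = list(TopologicalSorter(depends_on).static_order())
print("Run test scripts in this order:", execution_order)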


Test Plan

This document is a deliverable to the client. It contains the actual plan for test execution, worked out in minute detail.


Software Development Life Cycle (SDLC)

1. System Information/Engineering and Modeling
a. The initial proposal phase, to understand and identify the project requirements and the main features proposed in the application
b. By the end of the study of the entire client system, the team furnishes a document that holds specific recommendations for the candidate system. Output of this phase:

  • A rough estimation of effort and price for the entire project
  • A rough schedule covering all phases
  • Initial business functions in the project

2. System Requirement Analysis & Specification
The objective of this phase is to identify and document the user requirements for the proposed system. This phase captures complete information on the functions, the information domain of the software, behavior, performance, and interfacing. Output of this phase:

  • An estimate of effort and price for the remaining project phases (Design, Development, Testing, Deployment, Support)
  • A detailed schedule of the remaining phases (Design, Development, Testing, Deployment, Support)
  • The final functional specification and analysis for the other phases

3. System Analysis and Design
a. This is the process of designing exactly how the specifications are to be implemented
b. How the software is to be written, including the object model with properties and methods, database design, and the client/server architecture. Output of this phase: a final design specification
4. Code Generation
a. The design must be translated into a machine-readable form
b. With respect to the application, the right programming language is chosen. Output of this phase:

  • A beta version of the application
  • Preliminary manuals and user documentation
  • Preliminary technical documents

5. Testing
Once code is generated, program testing begins. Different testing methodologies are available to uncover the bugs. Output of this phase:

  • A final version of the complete application
  • A final manual
  • A final technical document

6. Deployment & Maintenance
In this phase the software is ready to be deployed at the customer's site, and the maintenance period for the application starts. The application should accommodate changes during this phase.


Some Major Computer System Failures Caused by Bugs

1. In August of 2006, a U.S. government student loan service erroneously made public the personal data of as many as 21,000 borrowers on its web site, due to a software error. The bug was fixed and the government department subsequently offered to arrange free credit monitoring services for those affected.
2. A September 2006 news report indicated problems with software utilized in a state government's primary election, resulting in periodic unexpected rebooting of voter checking machines, which were separate from the electronic voting machines, and resulted in confusion and delays at voting sites. The problem was reportedly due to insufficient testing.
3. Software bugs in a Soviet early-warning monitoring system nearly brought on nuclear war in 1983, according to news reports in early 1999. The software was supposed to filter out false missile detections caused by Soviet satellites picking up sunlight reflections off cloud tops, but failed to do so. Disaster was averted when a Soviet commander, based on what he said was a '...funny feeling in my gut', decided the apparent missile attack was a false alarm. The filtering software code was rewritten.


Software Testing Axioms

1. It is impossible to test a program completely (see the sketch after this list)
2. Software testing is a risk-based exercise
3. Testing cannot show that bugs don't exist
4. The more bugs you find, the more bugs there are
5. Not all the bugs you find will be fixed
6. Product specifications are never final
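
As a rough illustration of the first axiom (the function signature and test rate are assumed), even a trivial function taking two 32-bit integers has far too many input combinations to test exhaustively:

# A function of two 32-bit integers has 2**32 * 2**32 possible inputs.
combinations = (2 ** 32) ** 2

# Even at a billion test executions per second, exhaustive testing
# would still take roughly 585 years of machine time.
seconds = combinations / 1_000_000_000
years = seconds / (60 * 60 * 24 * 365)
print(f"{combinations:.3e} combinations ≈ {years:,.0f} years at 1e9 tests/sec")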
