Saturday, October 11, 2008

Jignesh Resume

Jignesh Patel

Contact: jignesh6969@gmail.com


PROFESSIONAL SUMMARY

· Over 5 years of work experience in manual and automated testing of both client-server and web-based applications.

· Good experience in the Facebook application, real estate, shopping cart, education, community, banking, and finance domains.

· Excellent understanding of the Software Development Life Cycle (SDLC), use-case documentation, and testing methodology.

· Expertise in analyzing functional requirements and breaking them down into test scenarios; preparing test cases and test scripts from the test scenarios, SRS, and functional requirements; and maintaining quality assurance documents such as the Requirements Traceability Matrix (RTM) and summary reports.

· Actively participated in the preparation of test plans and in documenting test scenarios, test cases, and test scripts.

· Expertise in identifying discrepancies between actual and expected results, documenting defects with a high level of detail, accuracy, and informative reproduction steps, and facilitating defect resolution with functional and technical groups.

· Expertise in analyzing bug severity, logging bugs in bug-tracking tools, tracking them through to closure, and bug reporting.

· Excellent knowledge of QuickTest Professional script development, execution, and maintenance.

· Excellent knowledge of bug-tracking tools such as Mantis, ClearQuest, and Bugzilla.

· Expertise in performing regression, retest, system, and security testing, and in applying test design techniques such as equivalence partitioning, boundary value analysis, and error guessing.

· Proficient in performance, load, and stress testing.

· Good knowledge of the UNIX and Linux operating systems.

· Proficient programming skills in VBScript, Test Script Language (TSL), Structured Query Language (SQL), and PL/SQL.

· Experienced in working with Oracle and MS Access databases and Windows operating systems.

· Proficient in the MS Office suite, including MS Access, MS Excel, MS Word, and PowerPoint.

· Quick learner and good team player with excellent written and verbal communication and interpersonal skills. Coaches fellow team members on their deliverables and reviews them as required. Able to handle multiple tasks and to work independently as well as in a group.

TECHNICAL SKILLS:

Test Automation Tools: QuickTest Professional 9.5/9.2, WinRunner 7.0/6.0

Bug Tracking Tools: Mantis 1.1.7, Bugzilla 3.4/3.0.1

Languages: VB, SQL, PL/SQL, C

Scripting Languages: VBScript, JavaScript

Application Software: MS Excel, MS Word, MS PowerPoint, MS Outlook

DBMS & RDBMS: MySQL, Oracle 8i/9i, MS Access

Operating Systems: Windows XP/NT/2003/2000, UNIX, Linux, Mac

PROFESSIONAL EXPERIENCE:

Company: Digi-Corp Pvt Ltd, Ahmedabad, India Mar 2006 – Jan 2008

Position: QA Analyst

PROJECT DETAILS:

Century21ame Group

Century 21 Amlak Middle East was a leading real estate brokerage and franchise system in the region. The site was developed in two languages, English and Arabic. Its workflow cycles between the regional office, franchisees, agents, and buyers and sellers. It is a real estate site used to upload property details, create franchisees and agents, maintain seller and buyer details, and so on.

RESPONSIBILITIES:

· Developed test plans, test procedures, and test cases according to the given business specifications, and tested GUI functionality according to the guidelines of the test plan.

· Performed extensive manual testing of each module.

· Performed SQL-based backend database testing.

· Logged and tracked defects in Mantis and produced defect reports.

· Developed and reviewed test cases for positive and negative test scenarios, conducted baseline testing, and generated reports.

· Performed regression, retest, system, and security testing, and applied test design techniques such as equivalence partitioning, boundary value analysis, and error guessing.

· Performed Graphical User Interface (GUI), Functional, Security and Performance Testing.

· Tested the database to make sure that storing data in the system did not introduce errors and that the database contained the expected data (a query-comparison sketch follows this list).

· Interacted with developers on various issues relating to software defects and builds.

· Suggested Enhancements to the application.

· Prepared Traceability Matrix document to map the test cases to the requirements.

· Initiated and helped in process improvements.
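To illustrate the backend database checks above in a hedged way: the core idea is to compare the values entered through the application with the values actually stored in the database. The sketch below uses an invented property_listings table, and sqlite3 stands in for whatever database the project actually used.

import sqlite3

def fetch_listing(conn, listing_id):
    """Return the stored row for a listing so it can be compared with the UI input."""
    cur = conn.execute(
        "SELECT city, price FROM property_listings WHERE id = ?", (listing_id,)
    )
    return cur.fetchone()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE property_listings (id INTEGER PRIMARY KEY, city TEXT, price REAL)")
    # Simulate the record the application under test would have written.
    conn.execute("INSERT INTO property_listings VALUES (1, 'Dubai', 250000.0)")
    conn.commit()

    expected = ("Dubai", 250000.0)   # values entered through the UI
    actual = fetch_listing(conn, 1)  # values actually stored in the backend
    assert actual == expected, f"Backend mismatch: expected {expected}, got {actual}"
    print("Stored data matches the values entered through the UI.")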

ENVIRONMENT:

Manual/automated testing, WinRunner, PHP, MVC architecture, SQL, StarTeam, MS Visio.

Company: Vyas infotect Ltd, Ahmedabad, India July 2005 – Mar 2006

Position: Software Tester

PROJECT DETAILS: SAISMS

The project is for Defense and deals with NSE and BSE data together with office-automation records. The application follows a client-server model. The scope of the project included identifying business threads for the NSE, BSE, and MCX system display (Add, Edit, Delete, Subscribe, Package, Compose SMS, Payment Detail, and Report), writing test cases for functional testing, creating test data, and GUI testing using WinRunner. Mobile communication is handled using a modem and gateway system.

RESPONSIBILITIES:

· Developed test plans, test procedures, and test cases according to the given business specifications, and tested GUI functionality according to the guidelines of the test plan.

· The project was executed in two rounds.

Round 1

· Execution of the test cases using sample test data.

Round 2

· Identification of business threads.

· Writing test cases for functional testing.

· Creating test data based on the test cases and executing the test cases.

· Using SQL for checking the output of various reports.

· GUI testing using WinRunner.

· Mobile communication using a modem and gateway system.

· Performed extensive manual testing of each module.

· Logged and tracked defects in Mantis and produced defect reports.

· Developed and reviewed test cases for positive and negative test scenarios, conducted baseline testing, and generated reports.

· Performed regression, retest, system, and security testing, and applied test design techniques such as equivalence partitioning, boundary value analysis, and error guessing.

· Performed Graphical User Interface (GUI), Functional, Security and Performance Testing.

· Interacted with developers on various issues relating to software defects and builds.

· Initiated and helped in process improvements.

ENVIRONMENT:

Manual/automated testing, .NET, MVC architecture, SQL, StarTeam, MS Visio.

EDUCATION:

Bachelor of Computer Application (B.C.A.), India

Tuesday, December 4, 2007

Software Testing

Jignesh Patel (Quality Assurance Executive)
Software testing is the process used to measure the quality of developed computer software. Usually, quality is constrained to such topics as correctness, completeness, and security, but it can also include more technical requirements as described under the ISO standard, such as capability, reliability, efficiency, portability, maintainability, compatibility, and usability.

Testing is a process of technical investigation, performed on behalf of stakeholders, that is intended to reveal quality-related information about the product with respect to the context in which it is intended to operate. This includes, but is not limited to, the process of executing a program or application with the intent of finding errors. Quality is not an absolute; it is value to some person. With that in mind, testing can never completely establish the correctness of arbitrary computer software; testing furnishes a criticism or comparison that compares the state and behavior of the product against a specification.

An important point is that software testing should be distinguished from the separate discipline of Software Quality Assurance (SQA), which encompasses all business process areas, not just testing.

Structure of test case

Formal, written test cases consist of three main parts with subsections (a worked example follows this list):
Ø Information contains general information about the test case.
o Identifier is a unique identifier of the test case for further reference, for example, when describing a found defect.
o Test case owner/creator is the name of the tester or test designer who created the test or is responsible for its development.
o Version of the current test case definition.
o Name of the test case should be a human-oriented title that makes it possible to quickly understand the test case's purpose and scope.
o Identifier of the requirement covered by the test case; this can also be the identifier of a use case or a functional specification item.
o Purpose contains a short description of the test's purpose and the functionality it checks.
o Dependencies lists any test cases or preconditions that must be satisfied before this test case can be executed.

Ø Test case activity
o Testing environment/configuration contains information about the hardware or software configuration that must be in place while executing the test case.
o Initialization describes actions that must be performed before test case execution starts; for example, opening a particular file.
o Finalization describes actions to be done after the test case is performed; for example, if the test case crashes the database, the tester should restore it before other test cases are run.
o Actions lists the step-by-step actions to be done to complete the test.
o Input data description.
Ø Results
o Expected results contains a description of what the tester should see after all test steps have been completed.
o Actual results contains a brief description of what the tester saw after the test steps were completed. This is often replaced with a Pass/Fail status. Quite often, if a test case fails, a reference to the defect involved should be listed in this column.
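To make the structure above concrete, here is a small sketch of a hypothetical login test case laid out with the same sections (Information, Test case activity, Results). Every identifier, name, and value in it is invented for illustration.

# Illustrative only: a hypothetical "login" test case organized into the
# three sections described above.
login_test_case = {
    "information": {
        "identifier": "TC-LOGIN-001",
        "owner": "Jignesh Patel",
        "version": "1.0",
        "name": "Valid user can log in",
        "requirement": "REQ-AUTH-01",
        "purpose": "Verify that a registered user can log in with valid credentials.",
        "dependencies": ["TC-REG-001 (user account must already exist)"],
    },
    "activity": {
        "environment": "Windows XP, Internet Explorer 7, test database restored from baseline",
        "initialization": "Open the login page; make sure the test user is not already logged in.",
        "actions": [
            "Enter the test user's email address.",
            "Enter the matching password.",
            "Click the Login button.",
        ],
        "input_data": {"email": "user@example.com", "password": "Secret123"},
        "finalization": "Log out and close the browser.",
    },
    "results": {
        "expected": "The user's home page is displayed with a welcome message.",
        "actual": "",   # filled in during execution
        "status": "",   # Pass / Fail, plus a defect reference if it fails
    },
}

print(login_test_case["information"]["identifier"], "-", login_test_case["information"]["name"])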

White box, black box, and grey box testing

o White box and black box testing are terms used to describe the point of view a test engineer takes when designing test cases. Black box testing treats the software as a black-box without any understanding as to how the internals behave. Thus, the tester inputs data and only sees the output from the test object. This level of testing usually requires thorough test cases to be provided to the tester who then can simply verify that for a given input, the output value (or behavior), is the same as the expected value specified in the test case.
o White box testing, however, is when the tester has access to the internal data structures, code, and algorithms. For this reason, unit testing and debugging can be classified as white-box testing, and it usually requires writing code, or at a minimum stepping through it, and thus requires more skill than black-box testing. If the software under test is an interface or API of any sort, white-box testing is almost always required.
o In recent years the term grey box testing has come into common usage. This involves having access to internal data structures and algorithms for purposes of designing the test cases, but testing at the user, or black-box level. Manipulating input data and formatting output do not qualify as grey-box because the input and output are clearly outside of the black-box we are calling the software under test. This is particularly important when conducting integration testing between two modules of code written by two different developers, where only the interfaces are exposed for test.
o Grey box testing could be used in the context of testing a client-server environment when the tester has control over the input, inspects the value in a SQL database and the output value, and then compares all three (the input, the SQL value, and the output) to determine whether the data got corrupted on database insertion or retrieval.
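A minimal sketch of that grey-box comparison, under the assumption of a simple orders table: submit an input value, read it back directly from the database, read it back through the application's output, and compare all three. The helper functions are stand-ins for the real front end, and sqlite3 stands in for the real server-side database.

import sqlite3

def app_insert(conn, order_id, amount):
    """Stand-in for submitting input through the application's front end."""
    conn.execute("INSERT INTO orders (id, amount) VALUES (?, ?)", (order_id, amount))
    conn.commit()

def app_display(conn, order_id):
    """Stand-in for the value the application shows back to the user."""
    return conn.execute("SELECT amount FROM orders WHERE id = ?", (order_id,)).fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")

input_value = 199.99
app_insert(conn, 1, input_value)
db_value = conn.execute("SELECT amount FROM orders WHERE id = 1").fetchone()[0]
output_value = app_display(conn, 1)

# Compare the input, the stored SQL value, and the displayed output.
assert input_value == db_value == output_value, (
    f"Data corrupted somewhere: input={input_value}, db={db_value}, output={output_value}"
)
print("Input, stored value, and displayed output all agree.")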

Levels of testing


o Unit testing tests the minimal software component, or module. Each unit (basic component) of the software is tested to verify that the detailed design for the unit has been correctly implemented. In an object-oriented environment, this is usually at the class level, and the minimal unit tests include the constructors and destructors (a minimal unit-test sketch follows this list).
o Integration testing exposes defects in the interfaces and interaction between integrated components (modules). Progressively larger groups of tested software components corresponding to elements of the architectural design are integrated and tested until the software works as a system.
o Functional testing tests at any level (class, module, interface, or system) for proper functionality as defined in the specification.
o System testing tests a completely integrated system to verify that it meets its requirements.
o System integration testing verifies that a system is integrated to any external or third party systems defined in the system requirements.
o Acceptance testing can be conducted by the end-user, customer, or client to validate whether or not to accept the product. Acceptance testing may be performed as part of the hand-off process between any two phases of development.
o Alpha testing is simulated or actual operational testing by potential users/customers or an independent test team at the developers' site. Alpha testing is often employed for off-the-shelf software as a form of internal acceptance testing, before the software goes to beta testing. The term alpha implies that the software is functionally complete and development will go into bug-fix mode only afterwards and no new features will be added.
o Beta testing comes after alpha testing. Versions of the software, known as beta versions, are released to a limited audience outside of the company. The software is released to groups of people so that further testing can ensure the product has few faults or bugs. Sometimes, beta versions are made available to the open public to increase the feedback field to a maximal number of future users.
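As a concrete illustration of the unit level described at the top of this list, here is a minimal unit-test sketch using Python's built-in unittest module. The discount_price function is a made-up unit, not part of any real project.

import unittest

def discount_price(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountPriceTests(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(discount_price(200.0, 10), 180.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(discount_price(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            discount_price(100.0, 150)

if __name__ == "__main__":
    unittest.main()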

Manual Testing Tips

Web Testing
When testing websites, the following areas should be considered.
  • Functionality
  • Performance
  • Usability
  • Server side interface
  • Client side compatibility
  • Security

Functionality:
When testing the functionality of a web site, the following should be tested (a broken-link check sketch follows the list).

  • Links
  o Internal links
  o External links
  o Mail links
  o Broken links
  • Forms
  o Field validation
  o Functional chart
  o Error messages for wrong input
  o Optional and mandatory fields
  • Database
  o Testing will be done on database integrity.
  • Cookies
  o Testing will be done on the client system side, on the temporary internet files.
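A rough sketch of the broken-link check listed above: fetch a page, collect its anchor links, and flag any link that does not respond successfully. The start URL is a placeholder, only standard-library modules are used, and a real test would also need to handle redirects, logins, and JavaScript-generated links.

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import URLError, HTTPError

class LinkCollector(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and not value.startswith(("mailto:", "#")):
                    self.links.append(value)

def check_links(start_url):
    page = urlopen(start_url, timeout=10).read().decode("utf-8", errors="ignore")
    collector = LinkCollector()
    collector.feed(page)
    for href in collector.links:
        url = urljoin(start_url, href)
        try:
            status = urlopen(url, timeout=10).status
            print("OK" if status == 200 else "CHECK", status, url)
        except (HTTPError, URLError) as exc:
            print("BROKEN", url, exc)

if __name__ == "__main__":
    check_links("http://example.com/")   # placeholder start page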

Performance:
Performance testing can be applied to understand the web site's scalability, or to benchmark the performance in the environment of third party products such as servers and middleware for potential purchase.
Connection speed:

o Tested over various networks, such as dial-up, ISDN, etc.

Load

o How many users access the site at a time?
o Check peak loads and how the system behaves under them.
o Large amounts of data accessed by users.

Stress

o Continuous load
o Performance of memory, CPU, file handling, etc.

Usability:

Usability testing is the process by which the human-computer interaction characteristics of a system are measured, and weaknesses are identified for correction.
Usability can be defined as the degree to which a given piece of software assists the person sitting at the keyboard to accomplish a task, as opposed to becoming an additional impediment to such accomplishment. The broad goal of usable systems is often assessed using several criteria:
  • Ease of learning
  • Navigation
  • Subjective user satisfaction
  • General appearance

Server side interface:

In web testing the server side interface should be tested.
This is done by verifying that communication is carried out properly.
The compatibility of the server with software, hardware, network, and database should be tested.
Client-side compatibility is also tested on various platforms, using various browsers, etc.

Security:

The primary reason for testing the security of a web application is to identify potential vulnerabilities and subsequently repair them.
The following types of testing are described in this section:

  • Network Scanning
  • Vulnerability Scanning
  • Password Cracking
  • Log Review
  • Integrity Checkers
  • Virus Detection

Performance Testing

Performance testing is a rigorous evaluation of a working system under realistic conditions to identify problems and to compare measures such as success rate, task time, and user satisfaction against requirements.
The goal of performance testing is not to find bugs, but to eliminate bottlenecks and establish a baseline for future regression testing.

To conduct performance testing is to engage in a carefully controlled process of measurement and analysis. Ideally, the software under test is already stable enough so that this process can proceed smoothly.
A clearly defined set of expectations is essential for meaningful performance testing.
For example, for a Web application, you need to know at least two things (a small load-check sketch follows the list):

  • expected load in terms of concurrent users or HTTP connections
  • acceptable response time
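Below is a very small sketch of checking those two numbers. The URL, the assumed load of 20 concurrent users, and the 2-second acceptable response time are all placeholders; a real load test would use a dedicated tool, but the basic measurement idea is the same.

import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET_URL = "http://example.com/"   # placeholder
CONCURRENT_USERS = 20                # assumed expected load
ACCEPTABLE_SECONDS = 2.0             # assumed acceptable response time

def timed_request(_):
    """Fetch the page once and return how long it took."""
    start = time.perf_counter()
    urlopen(TARGET_URL, timeout=30).read()
    return time.perf_counter() - start

if __name__ == "__main__":
    # Fire the requests concurrently to approximate simultaneous users.
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        timings = list(pool.map(timed_request, range(CONCURRENT_USERS)))
    average = sum(timings) / len(timings)
    worst = max(timings)
    print(f"average={average:.2f}s worst={worst:.2f}s")
    if worst > ACCEPTABLE_SECONDS:
        print("Response time exceeds the acceptable threshold under this load.")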

Load testing:

Load testing is usually defined as the process of exercising the system under test by feeding it the largest tasks it can operate with. Load testing is sometimes called volume testing or longevity/endurance testing.
Examples of volume testing:

  • testing a word processor by editing a very large document
  • testing a printer by sending it a very large job
  • testing a mail server with thousands of users' mailboxes

Examples of longevity/endurance testing:
testing a client-server application by running the client in a loop against the server over an extended period of time
Goals of load testing:
  • Expose bugs that do not surface in cursory testing, such as memory management bugs, memory leaks, buffer overflows, etc.
  • Ensure that the application meets the performance baseline established during performance testing. This is done by running regression tests against the application at a specified maximum load.
Although performance testing and load testing can seem similar, their goals are different. Performance testing uses load-testing techniques and tools for measurement and benchmarking purposes at various load levels, whereas load testing operates at a predefined load level, usually the highest load that the system can accept while still functioning properly.

Stress testing:
Stress testing is a form of testing that is used to determine the stability of a given system or entity. This is designed to test the software with abnormal situations. Stress testing attempts to find the limits at which the system will fail through abnormal quantity or frequency of inputs. Stress testing tries to break the system under test by overwhelming its resources or by taking resources away from it (in which case it is sometimes called negative testing).
The main purpose is to make sure that the system fails and recovers gracefully; this quality is known as recoverability.

Sunday, October 28, 2007

Quality Assurance Plan

Quality Assurance Plan

1. Introduction
[The introduction of the Quality Assurance Plan provides an overview of the entire document. It includes the purpose, scope, definitions, acronyms, abbreviations, references, and overview of this Quality Assurance Plan.]
1.1 Purpose
[Specify the purpose of this Quality Assurance Plan.]
1.2 Scope
[A brief description of the scope of this Quality Assurance Plan; what Project(s) it is associated with and anything else that is affected or influenced by this document.]
1.3 Definitions, Acronyms and Abbreviations
[This subsection provides the definitions of all terms, acronyms, and abbreviations required to properly interpret the Quality Assurance Plan. This information may be provided by reference to the project's Glossary.]
1.4 References
[This subsection provides a complete list of all documents referenced elsewhere in the Quality Assurance Plan. Identify each document by title, report number (if applicable), date, and publishing organization. Specify the sources from which the references can be obtained. This information may be provided by reference to an appendix or to another document. For the Quality Assurance Plan, this should include:
• Documentation Plan
• Measurement Plan
• Test Plan
• Software Development Plan
• Problem Resolution Plan
• Configuration Management Plan
• Subcontractor Management Plan
• Risk Management Plan]
1.5 Overview
[This subsection describes what the rest of the Quality Assurance Plan contains and explains how the document is organized.]
2. Quality Objectives
[This section needs to reference the section of the Software Requirements Specification that deals with quality requirements.]
3. Management
3.1 Organization
[Describe the structure of the organization responsible for Quality Assurance. The Rational Unified Process recommends that the Software Engineering Process Authority (SEPA) be responsible for the process component of Quality Assurance. The Rational Unified Process further recommends that the evaluation of product be done within the project (most notably by an independent test team) and by joint customer/developer review.]
3.2 Tasks and Responsibilities
[Describe the various Quality Assurance tasks that will be carried out for this project and indicate how they are synchronized with the project's major and minor milestones. These tasks will include:
• Joint Reviews
• Process Audits
• Process Reviews
• Customer Audits
For each task, identify the team member responsible for its execution.]
4. Documentation
[Enclose the Documentation Plan artifact by reference.
Also, list the minimum documentation that must be produced during the project to ensure that the software product that is developed satisfies the requirements. The suggested minimum set is:
• Software Development Plan (SDP)
• Test Plan
• Iteration Plans
• Software Requirements Specification (SRS)
• Software Architecture Document
• User Documentation (for example, manuals, guides)
• Configuration Management Plan
Provide pointers to the Development Case to show where in the process the adequacy of these documents is evaluated.]
5. Standards and Guidelines
[This section references any standards and guidelines that are expected to be used on the project, and addresses how compliance with these standards and guidelines is to be determined. The relevant artifacts are enclosed by reference. The suggested set for the Rational Unified Process is:
• Development Case
• Business Modeling Guidelines
• User-Interface Guidelines
• Use-Case Modeling Guidelines
• Design Guidelines
• Programming Guidelines
• Test Guidelines
• Manual Style Guide]
6. Metrics
[This section describes the product, project, and process metrics that are to be captured and monitored for the project. This is usually addressed by enclosing the Measurement Plan artifact by reference.]
7. Review and Audit Plan
[This section contains the Review and Audit Plan. The Review and Audit Plan specifies the schedule, resources, and methods and procedures to be used in conducting project reviews and audits. The plan details the various types of reviews and audits to be carried out during the project, and identifies any external agencies that are expected to approve or regulate the artifacts produced by the project.
This section should identify:
• Review and Audit Tasks
Describe briefly each type of review and audit that will be carried out on the project. For each type, identify the project artifacts that will be the subject of the review or audit. These may include Joint Customer-Developer Technical and Management Reviews, Process Reviews and Audits, Customer Audits, Internal Technical and Management Reviews.
• Schedule
Detail here the schedule for the reviews and audits. This should include reviews and audits scheduled at project milestones, as well as reviews that are triggered by delivery of project artifacts. This subsection may reference the project or iteration plan.
• Organization and Responsibilities
List here the specific groups or individuals to be involved in each of the identified review and audit activities. Describe briefly the tasks and responsibilities of each. Also, list any external agencies that are expected to approve or regulate any product of the project.
• Problem Resolution and Corrective Action
This subsection describes the procedures for reporting and handling problems identified during project reviews and audits. The Problem Resolution Plan may be referenced.
• Tools, Techniques and Methodologies
Describe here any specific tools, techniques or methodologies that are to be used to carry out the review and audit activities identified in this plan. You should describe the explicit process to be followed for each type of review or audit. Your organization may have a standard Review and Audit Procedures Manual, which may be referenced. These procedure descriptions should also address the collection, storage and archiving of the project’s Review Records.
A suggested set of reviews and audits (drawn from the Rational Unified Process) to use as a basis for planning is:
• Requirements Review (maps to the traditional Software Specification Review)
• Architecture Review (maps to the traditional Preliminary Design Review)
• Design Review (maps to the traditional Critical Design Review)

Note that the product-, technique-, criteria-, and metrics-related aspects of these reviews are addressed in the Rational Unified Process itself and instantiated in the Evaluation Plan section of the Software Development Plan. The Review and Audit Plan section of the Quality Assurance Plan will concern itself with the Joint (customer, developer) Review aspects, for example, artifacts required, responsibilities, conduct of the review meeting, and pass or fail criteria.
• Functional Configuration audit (to verify all requirements in the SRS have been met)
• Physical configuration audit (to verify that the software and its documentation are complete and ready for delivery)
• Process audits
• Process reviews
• Managerial reviews (Project Approval Review, Project Planning Review, Iteration Plan Review, PRA Project Review)
• Post-mortem reviews (Iteration Acceptance Review, Lifecycle Milestone Review, Project Acceptance Review).]
8. Evaluation and Test
[This section references the Software Development Plan (Evaluation Plan section) and the Test Plan.]