17 October 2012

Defect types and their different names

Defect:

            While executing test cases, a tester may observe that the actual results do not match the expected results. This variation between the expected and actual results is known as a defect. Different organizations use different names for this variation; commonly, defects are also known as bugs, problems, incidents, or issues. In short, a variation identified by a tester is called a bug.

Defect Severity:

Def1: Defect Severity or Impact is a classification of software defect (bug) to indicate the degree of negative impact on the quality of software.

Def2: Severity is the degree of impact that a defect has on the development or operation of a component or system.

Def3: A defect is a product anomaly or flaw, which is a variance from the desired product specification. The classification of a defect based on its impact on the operation of the product is called defect severity.

DEFECT SEVERITY CLASSIFICATION:

         The actual terminologies, and their meanings, can vary depending on people, projects, organizations, or defect-tracking tools, but the following is a commonly accepted classification.

Critical: 

 Def1: The defect affects critical functionality or critical data. It does not have a workaround. Example: Unsuccessful installation, complete failure of a feature.
 
Def2: There is a functionality block. The application is not able to proceed any further.

Def3: The defect results in the failure of the complete software system, of a subsystem, or of a software unit (program or module) within the system.

Crash: 

The defect causes crashes or loss of data.

Block:

       Other bugs cannot be fixed until this one is; it blocks further development and/or testing work. (Unlikely unless testing directly from SVN.)

Fatal:

Def1: A defect that will cause the system to crash or close abruptly, or affect other applications.

Def2: Fatal defects are defects which result in the failure of the complete software system, of a subsystem, or of a software unit, so that no work or testing can be carried out after the occurrence of the defect.

Showstopper:

Def1: As the name itself suggests, a bug that does not allow the user or customer to proceed with further testing steps is called a showstopper.
Def2: A bug which does not allow further testing to be carried out.

Major: 

Def1: The defect affects major functionality or major data. It has a workaround, but the workaround is not obvious and is difficult.

Example: A feature is not functional from one module but the task is doable if 10 complicated indirect steps are followed in another module/s.

Def2: The application is not working as desired. There are variations in the functionality.

Def3: A defect, which will cause an observable product failure or departure from requirements.

Def4: The defect results in the failure of the complete software system, of a subsystem, or of a software unit (program or module) within the system. There is no way to make the failed component work; however, there are acceptable processing alternatives which will yield the desired result.

Def5: Major defects are ones which also cause failure of the entire system or part of it, but there are some processing alternatives which allow further operation of the system.

Hotfix: 

A bug found at the customer's place for which you are going to send the solution or fix immediately.

Average: 

The defect does not result in a failure, but causes the system to produce incorrect, incomplete, or inconsistent results, or the defect impairs the system's usability.

Minor: 

Def1: The defect affects minor functionality or non-critical data. It has an easy workaround. Example: A minor feature that is not functional in one module but the same task is easily doable from another module.

Def2: There is no failure reported due to the defect, but certainly needs to be rectified.

Def3: A defect that will not cause a failure in execution of the product.

Def4: The defect does not cause failure, does not impair usability, and the desired processing results are easily obtained by working around the defect.

Def5: Minor defects do not result in failure but cause the system to produce incorrect, incomplete, or inconsistent results, or impair the system's usability.

Cosmetic: 

Def1: Defects in the User Interface or Navigation.

Def2: The defect is the result of non-conformance to a standard, is related to the aesthetics of the system, or is a request for an enhancement. Defects at this level may be deferred or even ignored.

Def3: Cosmetic Defects are small errors that do not prevent or hinder functionality.

Text:

 consistency of font

Tweak: 

Bug affects everyone.

Trivial:

Def1: The defect does not affect functionality or data. It does not even need a workaround. It does not impact productivity or efficiency. It is merely an inconvenience.

Example: Petty layout discrepancies, spelling/grammatical errors.

Def2: This is a bug, but it doesn't really cause problems; a cosmetic problem like misspelled words or misaligned text.

Suggestion:

 A feature which can be added for the betterment of the product.

Feature/Enhancement:

A feature request: a request for a new feature or a change in the functionality of an existing feature.

As mentioned earlier, the actual terminologies and their meanings can vary depending on people, projects, organizations, or defect-tracking tools; severity levels are commonly denoted in one of the following ways.

Severity is also denoted as in Method 1:
  • S1 = Critical
  • S2 = Major
  • S3 = Minor
  • S4 = Trivial
Severity is also denoted as in Method 2:
  • S1 = Major
  • S2 = Minor
  • S3 = Fatal

Severity is also denoted as in Method 3:
  • S1 = Critical
  • S2 = Major
  • S3 = Minor
  • S4 = Cosmetic
  • S5 = Suggestion

Severity is also denoted as in Method 4:
  • S1 = Critical
  • S2 = Major
  • S3 = Average
  • S4 = Minor
  • S5 = Cosmetic

Severity is also denoted as in Method 5:
  • S1 = Blocker
  • S2 = Critical
  • S3 = Major
  • S4 = Normal
  • S5 = Minor
  • S6 = Trivial

Severity is also denoted as in Method 6:
  • S1 = Critical
  • S2 = Major
  • S3 = Minor
  • S4 = Text
  • S5 = Tweak
  • S6 = Trivial = Cosmetic
  • S7 = Suggestion 
Severity is also denoted as in Method 7:
  • S1 = Show Stopper
  • S2 = Hot Fix
  • S3 = Critical
 Following are examples of the types of defects which fall under each category.
Fatal Defects
  • Functionality does not permit further testing.
  • Runtime Errors like JavaScript errors etc.
  • Functionality Missed out / Incorrect Implementation (Major Deviation from Requirements).
  • Performance Issues (If specified by Client).
  • Browser incompatibility and Operating systems incompatibility issues depending on the impact of error.
  • Dead Links.
  • Recursive Loop.
Major Defects
  • Functionality incorrectly implemented (Minor Deviation from Requirements).
  • Performance Issues (If not specified by Client).
  • Mandatory Validations for Mandatory Fields.
  • Images, Graphics missing which hinders functionality.
  • Front End / Home Page Alignment issues.
Minor Defects
  • Screen Layout Issues
  • Spelling Mistakes / Grammatical Mistakes.
  • Documentation Errors
  • Page Titles Missing.
  • Alt Text for Images.
  • Background Color for the Pages other than Home page.
  • Default Value missing for the fields required.
  • Cursor Set Focus and Tab Flow on the Page.
  • Images, Graphics missing, which does not hinder functionality.
Cosmetic Defects
  • Suggestions
  • GUI image colours etc.
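Since severity schemes vary by tool, it can help to model one explicitly. Below is a minimal Python sketch of Method 1 (S1–S4); the enum names and the "lower number = more severe" ordering are assumptions for illustration, not any tool's actual API:

```python
from enum import IntEnum

class Severity(IntEnum):
    """Method 1 severity scale: lower value = more severe."""
    CRITICAL = 1  # S1: no workaround, critical functionality affected
    MAJOR = 2     # S2: workaround exists but is difficult
    MINOR = 3     # S3: easy workaround exists
    TRIVIAL = 4   # S4: cosmetic inconvenience only

def label(sev: Severity) -> str:
    """Render a severity as the conventional S-number, e.g. S1."""
    return f"S{sev.value}"

# Severities compare directly because IntEnum preserves integer ordering.
assert Severity.CRITICAL < Severity.MAJOR
assert label(Severity.TRIVIAL) == "S4"
```

Such an ordered type makes it trivial to sort a defect list with the most severe items first.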

16 October 2012

Severity of Defect

Severity:

          Severity means how severe the particular defect is in the application, i.e. how it affects the functionality of the application.
Def:  The degree of impact that a defect has on the development or operation of a component or system.

Levels of Severity:

  • Feature
  • Trivial
  • Text
  • Tweak
  • Minor
  • Major
  • Block
  • Crash

Feature/Enhancement: 
            A feature request: a request for a new feature or a change in the functionality of an existing feature.

Trivial:
           This is a bug, but it doesn't really cause problems; a cosmetic problem like misspelled words or misaligned text.

Text: consistency of font

Tweak: Bug affects everyone.

Minor:

           A bug that bothers a few people: minor loss of function, or another problem where an easy workaround is present.

Major:
            A bug that affects an area of MediaWiki that is important on WMF sites; major loss of function.

Block: 
          Other bugs cannot be fixed until this one is; it blocks further development and/or testing work. (Unlikely unless testing directly from SVN.)

Crash/Critical: crashes, loss of data (internally, not your edit preview!)

Priority of defect

Priority:

             Priority means the importance and urgency of fixing the defect, as seen by the developers, i.e. which defect should be fixed first and which can be fixed in later versions.

Def:    Defect Priority (Bug Priority) indicates the importance or urgency of fixing a defect. Though priority may be initially set by the Software Tester, it is usually finalized by the Project/Product Manager.

Priority Levels:

  • Low
  • Normal
  • High
  • Urgent
  • Immediate

Immediate: The defect must be fixed immediately; it needs to be fixed ASAP, within a week at the most.

Urgent: The defect must be fixed in the next build; it should be fixed within the next month.

High: The defect must be fixed in any of the upcoming builds, but must be included in this release.

Normal: The defect may be fixed after the release / in the next release; it should be fixed in this quarter, or by the next release.

Low: The defect may or may not be fixed at all; it should be addressed within 6 months, or in the release after next.

Severity and Priority:

           Generally, we have to set both a severity level and a priority level for a defect. The severity level is set by the testing team, and the priority level is set by the development team.

Severity:

           Severity means how severe the particular defect is in the application, i.e. how it affects the functionality of the application.

Def: The degree of impact that a defect has on the development or operation of a component or system.

Levels of Severity:

  • Feature
  • Trivial
  • Text
  • Tweak
  • Minor
  • Major
  • Block
  • Crash

Priority:

              Priority means the importance and urgency of fixing the defect, as seen by the developers, i.e. which defect should be fixed first and which can be fixed in later versions.

Def:   Defect Priority (Bug Priority) indicates the importance or urgency of fixing a defect. Though priority may be initially set by the Software Tester, it is usually finalized by the Project/Product Manager.

Priority Levels:

  • Low
  • Normal
  • High
  • Urgent
  • Immediate
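The division of responsibility described above (tester sets severity when logging the bug, manager finalizes priority later) can be sketched as a simple record. The class and field names below are illustrative assumptions, not any real bug tracker's schema:

```python
from dataclasses import dataclass
from typing import Optional

SEVERITIES = ["Critical", "Major", "Minor", "Trivial"]
PRIORITIES = ["Low", "Normal", "High", "Urgent", "Immediate"]

@dataclass
class Defect:
    summary: str
    severity: str                   # set by the tester when the bug is logged
    priority: Optional[str] = None  # finalized later during triage

    def __post_init__(self):
        if self.severity not in SEVERITIES:
            raise ValueError(f"unknown severity: {self.severity}")

    def triage(self, priority: str) -> None:
        """Project/product manager assigns the final priority."""
        if priority not in PRIORITIES:
            raise ValueError(f"unknown priority: {priority}")
        self.priority = priority

bug = Defect("Install fails on a clean machine", severity="Critical")
bug.triage("Immediate")
```

The point of keeping `priority` optional is that a freshly logged defect legitimately has a severity but no priority yet.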


15 October 2012

Versions of QTP

Version history of QTP

Astra Quicktest (First version)         -   May, 1998
Astra QuickTest 3.0                        -   Feb, 2000
Astra QuickTest 5.0                        -   Feb, 2001
(Astra QuickTest Professional 5.5)
QuickTest Professional 6.5              -   Sep, 2003
QuickTest Professional 8.0              -   2004
QuickTest Professional 8.2              -   early 2005
QuickTest Professional 9.0              -   April, 2006.
QuickTest Professional 9.1/9.2        -   Feb, 2007
QuickTest Professional 9.5              -   Jan, 2008
QuickTest Professional 10.0            -   Jan, 2009
QuickTest Professional 11.0            -   Sep 2010

Astra Quicktest (First version):

         The first version of QTP was named Astra QuickTest and was released by Mercury Interactive (MI) in May 1998.

Astra QuickTest 3.0:

          Astra QuickTest 3.0 was released in Feb 2000. It needed IE4.0 or higher to run.

Astra QuickTest 5.0 (Astra QuickTest Professional 5.5):

         Astra QuickTest 5.0 was released in Feb 2001. This version was able to test multimedia elements like Real Audio/Video and Macromedia Flash etc.
         After the addition of various patches, Astra Quicktest 5.0 was renamed Astra QuickTest Professional 5.5. This version was able to test websites.

QuickTest Professional 6.5:

        QuickTest Professional 6.5 was released in Sep 2003 and lost Astra as part of its name. 

QuickTest Professional 8.0

      QuickTest Professional 8.0 was released in late 2004.
 The major new features added in this version were:
  • Unicode Support
  • Keyword View
  • Business Process Testing
  • Action/Test Parameters

QuickTest Professional 8.2:

     QuickTest Professional 8.2 was released in early 2005. The major new features added in this version were:
  • Patches on top of version QTP 8.0
  • Ability for Auto-Documentation
  • Step Generator
  • Enhanced Expert View

QuickTest Professional 9.0:

          QuickTest Professional 9.0 was released in April 2006. This was the time when Mercury started phasing out another popular product WinRunner since the company had integrated all its capabilities in QTP. The major new features added in this version were:
  • Object Repository Manager
  • Object Repository Merge Tool
  • Multiple Object Repositories per Action or Component
  • XML Object Repository Format
  • Function Library Editor
  • Handling Missing Actions and Resources

QuickTest Professional 9.1/9.2:

          QuickTest Professional 9.2 was released in Feb 2007. During this time HP completed its acquisition of Mercury Interactive (MI), which had started in late 2006. 
The major new features added in this version were:
  • Mercury Screen Recorder
  • Dynamic Management of Object Repositories

QuickTest Professional 9.5:

           QuickTest Professional 9.5 was released in Jan 2008. The major new features added in this version were:
  • Support for tabbed browsing
  • Bitmap checkpoint tolerance level through UI itself
  • WebAddin Extensibility

QuickTest Professional 10.0:

          QuickTest Professional 10.0 was released in Jan 2009.  The major new features introduced in this version were:
  • Centrally Manage and Share Testing Assets, Dependencies, and Versions in Quality Center 10.00
  • Perform Single-User Local System Monitoring While Running Your Tests
  • Improve Portability by Saving Copies of Tests Together with Their Resource Files
  • Call Actions Dynamically During the Test Run
  • Develop Your Own Bitmap Checkpoint Comparison Algorithm
  • Centrally Manage Your Work Items and ToDo Tasks in the To Do Pane
  • Improve Test Results Analysis with New Reporting Functionality
  • Test Standard and Custom Delphi Objects Using the Delphi Add-in and Delphi Add-in Extensibility

QuickTest Professional 11.0:

         QuickTest Professional 11.0 was released in Sep 2010.  This is the latest version available in the market as of August 2012. The major new features added in this version  were:
  • XPath and CSS based object identification
  • Good Looking and Enhanced Results Viewer
  • Easy Regular Expressions
  • Visual Relation Identifier: identify objects not just by their own properties but in relation to neighboring objects
  • Load Function Libraries at Run Time
  • Test Your GUI and UI-Less Application Functionality in One Test
  • Record Support For FireFox is now available
  • QTP 11 is capable of receiving Java or .NET log framework messages from your application which can then be embedded in the run results
  • Embed/Run Javascript in web pages
  • Improved test data management when integrated with Quality Center
  • QTP 11 now supports Web 2.0 Toolkit Applications out-of-the-box similar to any other add-ins.

12 October 2012

Test Cases for Text Editor

Test Cases for Text Editor:

  • Check for the successful message by entering characters
  • Check for the successful message by entering numbers
  • Check for the successful message by entering special characters
  • Check whether the selected text becomes bold or not by clicking on the Bold icon
  • Check whether the selected text becomes bold or not by pressing Ctrl+B
  • Check whether the selected text becomes italic or not by clicking on the Italic icon
  • Check whether the selected text becomes italic or not by pressing Ctrl+I
  • Check whether the selected text becomes strikethrough or not by clicking on the Strikethrough icon
  • Check whether undo is working or not
  • Check whether redo is working or not after an undo
  • Check whether redo is working or not before an undo
  • Check whether bullets are applied to the selected text lines
  • Check whether numbers are applied to the selected text lines
  • Check whether the selected bold text becomes unbold or not by clicking on the Bold icon
  • Check whether the selected bold text becomes unbold or not by pressing Ctrl+B
  • Check whether the selected italic text becomes normal or not by clicking on the Italic icon
  • Check whether the selected italic text becomes normal or not by pressing Ctrl+I
  • Check whether the selected strikethrough text becomes normal or not by clicking on the Strikethrough icon
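Several of the checks above are toggle checks: applying bold makes the selection bold, applying it again removes it. A tiny Python model of that toggling behavior, purely illustrative (the class and method names are made up):

```python
class TextSelection:
    """Toy model of a selection whose Bold/Italic/Strikethrough state
    toggles, mirroring the icon-click and Ctrl+B / Ctrl+I test cases."""

    def __init__(self, text: str):
        self.text = text
        self.styles = set()

    def toggle(self, style: str) -> None:
        # Clicking the icon (or pressing the shortcut) flips the style.
        if style in self.styles:
            self.styles.remove(style)
        else:
            self.styles.add(style)

sel = TextSelection("hello")
sel.toggle("bold")         # selected text becomes bold
assert "bold" in sel.styles
sel.toggle("bold")         # toggling again makes it unbold
assert "bold" not in sel.styles
```

Writing the test cases in pairs (apply, then re-apply) is exactly what catches editors whose toggle only works in one direction.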

Gray Box testing

Gray Box Testing:

       Gray Box Testing is a software testing method which is a combination of the Black Box Testing method and the White Box Testing method. In Black Box Testing, the internal structure of the item being tested is unknown to the tester, and in White Box Testing the internal structure is known.
        In Gray Box Testing, the internal structure is partially known. This involves having access to internal data structures and algorithms for purposes of designing the test cases, but testing at the user, or black-box level.

       Gray Box Testing is named so because the software program, in the eyes of the tester is like a gray/semi-transparent box; inside which one can partially see.

EXAMPLE

An example of Gray Box Testing would be when the codes for two units/modules are studied (White Box Testing method) for designing test cases and actual tests are conducted using the exposed interfaces (Black Box Testing method).
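As a concrete sketch of that idea: suppose studying a module's code (white-box knowledge) reveals it memoizes results internally; a gray-box test then calls only the public interface, but asserts on behavior we only know about from reading the code. The function here is hypothetical, chosen to keep the sketch self-contained:

```python
import functools

@functools.lru_cache(maxsize=None)
def lookup(key: str) -> str:
    """Public interface; internally known (white-box) to cache results."""
    return key.upper()

# Gray-box test: exercise only the public interface (black-box style),
# but verify the caching behavior learned from the implementation.
lookup("a")
lookup("a")
info = lookup.cache_info()
assert info.hits == 1 and info.misses == 1
```

A pure black-box test could only check the return value; the internal knowledge is what lets this test assert that the second call was served from the cache.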

LEVELS APPLICABLE TO

           Though Gray Box Testing method may be used in other levels of testing, it is primarily useful in Integration Level Testing

11 October 2012

25-points for Website Usability Checklist

25-points for Website Usability Checklist:

The list is split into 4 roughly equal sections: (I) Accessibility, (II) Identity, (III) Navigation, and (IV) Content.

I. Accessibility:

          This section contains not only traditional accessibility issues, but anything that might keep a visitor from being able to access the information on a website. If no one can load your site, or the type is too small to read, all of the usability in the world won't matter.
  1.  Site Load-time Is Reasonable
  2.  Adequate Text-to-Background Contrast
  3.  Font Size/Spacing Is Easy to Read
  4.  Flash & Add-ons Are Used Sparingly
  5.  Images Have Appropriate ALT Tags
  6. Site Has Custom Not-found/404 Page

II. Identity:

     7. Company Logo Is Prominently Placed
     8. Tagline Makes Company's Purpose Clear
     9. Home-page Is Digestible In 5 Seconds
     10. Clear Path to Company Information
     11. Clear Path to Contact Information

III. Navigation:

     12. Main Navigation Is Easily Identifiable
     13. Navigation Labels Are Clear & Concise
     14. Number of Buttons/Links Is Reasonable
     15. Company Logo Is Linked to Home-page
     16. Links Are Consistent & Easy to Identify
     17. Site Search Is Easy to Access

IV. Content:

     18. Major Headings Are Clear & Descriptive
     19. Critical Content Is Above The Fold
     20. Styles & Colors Are Consistent
     21. Emphasis (bold, etc.) Is Used Sparingly
     22. Ads & Pop-ups Are Unobtrusive
     23. Main Copy Is Concise & Explanatory
     24. URLs Are Meaningful & User-friendly
     25. HTML Page Titles Are Explanatory





Check list for usability testing

Usability Testing :

It is the extent to which a particular application can be used effectively and without much difficulty.
             Usability testing is done from the user's point of view. This testing type attempts to describe the "look and feel" and usage aspects of a product. Many types of testing are objective in nature; many people disagree about whether usability testing really belongs in the software testing domain.


Usability testing forms a part of non functional testing.

Benefits:

1. Increased productivity
2. Speed of work
3. Ease of using
4. Customer support reduced
5. Decreased Training costs
6. Increased customer satisfaction
7. Increased market life

Check List:

1. Page look and feel should be consistent (on all pages)
2. All controls should fit in the screen size
3. All controls should be visible in the screen
4. All Controls should be clearly aligned in the screen
5. All labels should be clear for vision
6. All labels should be readable
7. Windows tool bar support should be there
8. Fonts/Font size used in the screen should be consistent across the page
9. Colors used in the page should be consistent
10. Ability for the user to maximize, minimize, restore and close the windows using those buttons
11. Window cascading should exist
12. Ability to maximize and minimize via page header bar
13. There should be tool tip for each control
14. There should be support for key board inputs (mouse disconnected and not disconnected)
15. Support for short cut keys
16. Check if ‘Tab’ is working properly for cursor movement
17. Cursor focus over different controls should be appropriate (ie. Cursor focus should move from 1st control to 2nd control and so on when tab key is used)
18. Tool tips should be there
19. Tool tips should be clear to user to understand
20. Scroll bars (horizontal, vertical, any internal scroll bars) should work properly
21. Scroll bars should appear when page is minimized
22. Scroll bars should appear when the controls extend beyond the page size
23. All the content should fit in the page and should be visible when the window is resized
24. Any part of the data including data in the sub sections should not go invisible when window resizes
25. Drop downs should have scroll bar when content is out of the visible region
26. Drop down values should appear sorted properly
27. Scroll bar should appear for text boxes when content is long
28. Combo boxes should be flexible enough to display long values (automatically resizes based on the size of the content)
29. User should be able to edit the text
30. Cut, copy and paste functionality should work
31. Should be able to perform re do on most recent undo task
32. Undo, Redo functionality should exist
33. Search functionality in the page
34. All the links to other pages should work properly
35. Whenever the list is displayed it should get displayed in sorted format
36. Sort functionality should work and it should be correct
37. Mandatory fields should appear with proper indication
38. Alert message should be clear and understandable
39. Error messages should be understandable
40. Information should be placed relevantly in the page (ie. personal details in one section, account details in one section, credit details in one section…)
41. User should be allowed to see the data in the order in which they want (i.e. ascending or descending, date wise, ...)

Test cases for Lift

Test cases for Lift:

  • Check that when you press the button from outside the lift, the door opens when the lift is on the ground floor
  • Verify it gives a beep when it reaches you and opens
  • Check that the display outside shows which floor the lift is coming from
  • Check that after you step into the lift, the door closes automatically after about 10 seconds when nothing is in the doorway
  • Check that when you press a floor number, the lift travels to that floor
  • Check that when you press different numbers, they are highlighted
  • Check that the lift carries only up to its rated load capacity
  • Verify that while going upwards, if anyone on the first floor presses the button, the lift stops there
  • Verify that on reaching the correct floor the lift stops automatically and opens
  • Verify that when the lift is empty and anyone presses the button, it responds quickly
  • Verify whether the appropriate buttons are available or not
  • Check whether the door opens or not when clicking the open button twice
  • Check whether the lift door opens or not when pressing the open button while the lift is moving between two floors
  • Check whether the lift door opens or not by pressing and holding the open button for some time
  • Check the direction by pressing the up and down buttons
  • Enter the lift, specify a floor (say 4) and check that it reaches the corresponding floor
  • If the lift is moving and the power fails, does an alarm sound in the control room to indicate the power failure?
  • Also check how a person inside can inform the lift operator that they are stuck in the lift (if there is a phone button, does it work?)
  • Also check the response time to start the lift again
  • When a person is in the lift, is there a light or not?
  • How many persons can the lift carry at a time while moving?
  • Without load, if we press the up button, does the lift move or not?
  • What happens if we press up or down without specifying a floor number?
  • What happens if the lift is overloaded (does an overload alarm sound before moving, or does it move anyway)?
  • Without load, if we press the up button and specify the floor number, what happens?
  • Does the door close by itself or does it have to be closed manually?
  • If the door closes automatically, check the closing time
  • If the door closes automatically, also check the sensor: suppose the closing time is 2 minutes, it has elapsed, and people are still getting in; does the door close or wait for them?
  • What happens when you press the number of the floor on which the lift already is?
  • One person is already in the lift and wants to go up; we want to go down. He specified his floor number before we entered. After we specify our floor number, what will be the priority of the lift?
  • The lift is empty and moving downwards; I enter and want to go up. Is there a button to change the direction of the lift, and does it work?
  • Is the floor number displayed inside the lift correct?
  • What happens when the person outside wants to open the door but the person inside wants to close it at the same time?



TEST PLAN GUIDELINES

TEST PLAN GUIDELINES:

  • Make the plan concise. Avoid redundancy and superfluousness. 
  • If you think you do not need a section that has been mentioned in the template above, go ahead and delete that section in your test plan.
  • Be specific. For example, when you specify an operating system as a property of a test environment, mention the OS Edition/Version as well, not just the OS Name.
  • Make use of lists and tables wherever possible. Avoid lengthy paragraphs.
  • Have the test plan reviewed a number of times prior to baselining it or sending it for approval. 
  • The quality of your test plan speaks volumes about the quality of the testing you or your team is going to perform.
  • Update the plan as and when necessary. An outdated and unused document stinks and is worse than not having the document in the first place.

Test Cases for Date Field

Test Cases For Date Field:

There may be many cases depending on whether the text box is editable or not, the purpose of the date field, etc.

Test Cases if the Date field is not text box: 

  • Ensure that the calendar window is displayed and active when the calendar is invoked by pressing the calendar icon. (Once we faced an issue where the calendar window was in a minimized state when we invoked the calendar.)
  • Ensure that the calendar date defaults to the system date
  • Ensure that when a date is selected in the calendar (by double click or some other method), the selected date is displayed in the text box and the calendar disappears
  • Check the format as per the requirement, i.e. mm-dd-yy
  • Check that the list of years shown on clicking the down arrow displays the 10 years before and after the selected year
  • Check the list of 12 months shown on clicking the month down arrow
  • Check that clicking the Clear text link in the calendar clears the previously selected date
  • Check that the month changes to the previous month on clicking the Prev text link
  • Check that the month changes to the next month on clicking the Next text link
  • Check whether today's date is highlighted or not on clicking the Today text link

Test Cases if the Date field is editable  text box: 

 

  • Enter a valid date, month, and year
  • Enter an invalid value or blank space for the date, a valid month, and an invalid value or blank space for the year
  • Enter invalid values or blank spaces in all three fields
  • Enter zeros in the date field, an invalid value or blank space in the month, and zeros in the year
  • Enter alphabetic or alphanumeric data in the field
  • Enter special characters
  • Enter a decimal point
  • Enter a valid date with an invalid value or blank space in the month and year
  • Enter an invalid date, a valid month, and an invalid value or blank space in the year
  • Enter an invalid date, an invalid value or blank space in the month, and a valid year
  • Enter an invalid date and a valid month and a valid year
  • Enter a valid date and a valid month and an invalid year
  • Enter a valid date and an invalid month and a valid year
  • Enter a valid date and an invalid month and an invalid year
  • Enter an invalid date and an invalid month and an invalid year
  • Enter an invalid date and a valid month and an invalid year
  • Leave all the fields blank
  • Check whether the year entered is a leap year or an ordinary year
  • For an ordinary year, the maximum limit of the day field in February should be 28
  • For a leap year, the maximum limit of the day field in February should be 29
  • For the other months, the corresponding maximum day values (30/31) should be accepted
  • Enter a date below/beyond the range, i.e. 32 or 0, etc.
  • Enter a month below/beyond the range, i.e. 13 or 0, etc.
  • Enter a zero before a single-digit number in the date/month
  • Check without entering a zero before the single digit
  • Enter a valid date and month and only the last two digits of the year
  • Enter alphabets in these fields
  • Enter special characters and check
  • Enter zeros in all the fields
  • Check whether the control passes on to the month field after entering the date
  • Also check whether the control passes to the year field after entering the date and month
  • Check the format of date, month, and year
  • Check whether it accepts the date in the date field, the month in the month field, and the year in the corresponding field; it should not accept 000000 as a date
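Many of the cases above (leap years, month lengths, range checks) reduce to a single validation routine. A sketch using Python's standard library, assuming the numeric day/month/year split used in the list; the function name is illustrative:

```python
import calendar

def is_valid_date(mm: int, dd: int, yyyy: int) -> bool:
    """Validate a date the way the test cases above probe it:
    month in 1..12, day within that month's length, leap years honored."""
    if not (1 <= mm <= 12) or yyyy <= 0:
        return False
    # monthrange returns (weekday_of_first_day, number_of_days_in_month)
    return 1 <= dd <= calendar.monthrange(yyyy, mm)[1]

# Feb 29 is accepted only in leap years; out-of-range months/days are rejected.
assert is_valid_date(2, 29, 2012) is True    # leap year
assert is_valid_date(2, 29, 2011) is False   # ordinary year: Feb max is 28
assert is_valid_date(13, 1, 2012) is False   # month beyond range
assert is_valid_date(4, 31, 2012) is False   # April has only 30 days
```

Each bullet in the list then becomes one call with a chosen input and an expected True/False.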




10 October 2012

Test Cases for User Registration form

Test cases for User Registration form: 

 

  • Check the availability of the fields or objects: email, first name, last name, gender, nationality, description
  • Check the spellings of all objects
  • Check the consistency of all objects
  • Check the mandatory fields in the form
  • Check for a success message when all mandatory fields are entered with proper inputs
  • Check the error messages when mandatory fields are left blank
  • Check for the error message when the email id is left blank
  • Check for the error message when the email address is entered as “swetha”
  • Check for the error message when the email address is entered as “#@%^%#$@#$@#.com”
  • Check for the error message when the email address is entered as “@gmail.com”
  • Check for the error message when the email address is entered as “swetha <swetha@gmail.com>”
  • Check for the error message when the email address is entered as “swetha.gmail.com”
  • Check for the error message when the email address is entered as “swetha@gmail@gmail.com”
  • Check for the error message when the email address is entered as “.swetha@gmail.com”
  • Check for the error message when the email address is entered as “swetha.@gmail.com”
  • Check for the error message when the email address is entered as “swetha..s@gmail.com”
  • Check for the error message when the email address is entered as “swetha@gmail.com (swetha)”
  • Check for the error message when the email address is entered as “swetha@gmail”
  • Check for the error message when the email address is entered as “swetha@gmail..com”
  • Check for the error message when the email address is entered as “swetha@-gmail.com”
  • Check for the error message when the email address is entered as “swetha@111.222.333.444”
  • Check the error message for special characters in the first name
  • Check the error message for more than 20 characters in the first name
  • Check the error message for special characters in the last name
  • Check the error message for more than 20 characters in the last name
  • Check for the error message when characters are entered in the mobile number field
  • Check for the error message when fewer than 10 digits are entered in the mobile number field
  • Check for the error message when more than 10 digits are entered in the mobile number field
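Many of the email cases above can be collapsed into assertions against one validation routine. The `is_valid_email` function and its pattern below are illustrative assumptions only; real-world address grammar (RFC 5321/5322) is considerably looser than this sketch.

```python
import re

# Illustrative pattern only -- real address rules are more permissive than this.
LOCAL = r"[A-Za-z0-9_]+(?:\.[A-Za-z0-9_]+)*"          # dots allowed, but not leading/trailing/doubled
LABEL = r"[A-Za-z0-9](?:[A-Za-z0-9-]*[A-Za-z0-9])?"   # domain label: no leading/trailing hyphen
EMAIL_RE = re.compile(rf"{LOCAL}@{LABEL}(?:\.{LABEL})*\.[A-Za-z]{{2,}}")

def is_valid_email(addr):
    return EMAIL_RE.fullmatch(addr) is not None

assert is_valid_email("swetha@gmail.com")
for bad in ["swetha", "@gmail.com", "swetha.gmail.com", "swetha@gmail@gmail.com",
            ".swetha@gmail.com", "swetha.@gmail.com", "swetha..s@gmail.com",
            "swetha@gmail", "swetha@gmail..com", "swetha@-gmail.com",
            "swetha@111.222.333.444"]:
    assert not is_valid_email(bad)
```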

9 October 2012

White Box Testing

WHITE BOX TESTING -

                             Testing based on an analysis of the internal workings and structure of a piece of software. Includes techniques such as Branch Testing and Path Testing.

White Box Testing is also called “Structural Testing” or “Glass Box Testing”.

                            White box testing involves looking at the structure of the code. When you know the internal structure of a product, tests can be conducted to ensure that the internal operations are performed according to the specification and that all internal components have been adequately exercised.

EXAMPLE
                           A tester, usually a developer as well, studies the implementation code of a certain field on a webpage, determines all legal (valid and invalid) and illegal inputs, and verifies the outputs against the expected outcomes, which are also determined by studying the implementation code.

LEVELS APPLICABLE TO

White Box Testing method is applicable to the following levels of software testing:
  • Unit Testing: For testing paths within a unit
  • Integration Testing: For testing paths between units
  • System Testing: For testing paths between subsystems
However, it is mainly applied to Unit Testing.

White Box Testing Technique

Introduction

Software is tested from two different perspectives:
  1. Internal program logic is exercised using “white box” test case design techniques.
  2. Software requirements are exercised using “black box” test case design techniques.
                In both cases, the intent is to find the maximum number of errors with the minimum amount of effort and time.
               White-box testing of software is predicated on close examination of procedural detail. Logical paths through the software are tested by providing test cases that exercise specific sets of conditions and/or loops. The "status of the program" may be examined at various points to determine if the expected or asserted status corresponds to the actual status.

White box testing measures coverage of the specification in the code:

  • Code coverage:
                   Measure how much of the code is exercised by the tests; the variants below differ in what they count.
  • Segment coverage:
                   Ensure that each code statement is executed at least once.
  • Branch Coverage or Node Testing:
                  Cover each code branch from every possible direction.
  • Compound Condition Coverage:
                   For multiple conditions, test each condition with multiple paths and combinations of the different paths that reach that condition.
  • Basis Path Testing:
                  Each independent path in the code is taken for testing.
  • Data Flow Testing (DFT):
                  In this approach you track specific variables through each possible calculation, thus defining the set of intermediate paths through the code. DFT tends to reflect dependencies, mainly through sequences of data manipulation. In short, each data variable is tracked and its use is verified.
This approach tends to uncover bugs like variables used but not initialized, or declared but not used, and so on.
  • Path Testing:
                 Path testing is where all possible paths through the code are defined and covered. It is a time-consuming task.
  • Loop Testing:
                  These strategies relate to testing single loops, concatenated loops, and nested loops. Independent and dependent code loops and values are tested by this approach.
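To make branch coverage concrete, here is a toy function (hypothetical, not from the text) with two decision points; the four assertions below exercise every outcome of every branch.

```python
def classify(amount, is_member):
    """Toy function with two branch points -- four paths in total."""
    if amount > 100:
        discount = 10
    else:
        discount = 0
    if is_member:
        discount += 5
    return discount

# Branch coverage: every outcome of every decision is exercised at least once.
assert classify(150, True) == 15   # amount > 100 true,  is_member true
assert classify(150, False) == 10  # amount > 100 true,  is_member false
assert classify(50, True) == 5     # amount > 100 false, is_member true
assert classify(50, False) == 0    # amount > 100 false, is_member false
```

Basis path testing and path testing would enumerate these same routes explicitly; loop testing would add cases once the function contained loops.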

WHITE BOX TESTING ADVANTAGES

  • Testing can be commenced at an earlier stage. One need not wait for the GUI to be available.
  • Testing is more thorough, with the possibility of covering most paths.
WHITE BOX TESTING DISADVANTAGES
  • Since tests can be very complex, highly skilled resources are required, with thorough knowledge of programming and implementation.
  • Test script maintenance can be a burden if the implementation changes too frequently.
  • Since this method of testing is closely tied to the application being tested, tools to cater to every kind of implementation/platform may not be readily available.
  • White Box Testing is like the work of a mechanic who examines the engine to see why the car is not moving.


Black Box Testing

BLACK BOX TESTING -

                 Testing without knowledge of the internal workings of the application being tested. Tests are usually functional. Black Box Testing is also known as Behavioral Testing.

                The internal structure/design/implementation of the item being tested is not known to the tester. These tests can be functional or non-functional, though usually functional. This method is named so because the software program, in the eyes of the tester, is like a black box, inside which one cannot see.




This method attempts to find errors in the following categories:
  • Incorrect or missing functions
  • Interface errors
  • Errors in data structures or external database access
  • Behavior or performance errors
  • Initialization and termination errors
LEVELS APPLICABLE TO
Black Box Testing method is applicable to all levels of the software testing process:
  • Module Level Testing
  • Integration Level testing
  • System Level Testing
  • Acceptance Testing
The higher the level, and hence the bigger and more complex the box, the more black box testing method comes into use.

EXAMPLE
            A tester, without knowledge of the internal structures of a website, tests the web pages by using a browser; providing inputs (clicks, keystrokes) and verifying the outputs against the expected outcome.


BLACK BOX TESTING TECHNIQUES

Following are some techniques that can be used for designing black box tests.

Equivalence partitioning

             Equivalence Partitioning is a software test design technique that involves dividing input values into valid and invalid partitions and selecting representative values from each partition as test data.

Boundary Value Analysis

             Boundary Value Analysis is a software test design technique that involves determination of boundaries for input values and selecting values that are at the boundaries and just inside/outside of the boundaries as test data.
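The two techniques above can be illustrated on a hypothetical input rule, say an age field that accepts 18 through 60 inclusive (the rule and the `is_eligible` name are assumptions for the sketch).

```python
def is_eligible(age):
    """Hypothetical rule: ages 18 through 60 inclusive are accepted."""
    return 18 <= age <= 60

# Equivalence partitioning: one representative value from each partition.
assert not is_eligible(10)   # invalid partition: below the range
assert is_eligible(35)       # valid partition
assert not is_eligible(70)   # invalid partition: above the range

# Boundary value analysis: values at and just inside/outside each boundary.
assert not is_eligible(17)
assert is_eligible(18)
assert is_eligible(19)
assert is_eligible(59)
assert is_eligible(60)
assert not is_eligible(61)
```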

Cause Effect Graphing

            Cause Effect Graphing is a software test design technique that involves identifying the cases (input conditions) and effects (output conditions), producing a Cause-Effect Graph, and generating test cases accordingly.

BLACK BOX TESTING ADVANTAGES
  • Tests are done from a user’s point of view and will help in exposing discrepancies in the specifications
  • Tester need not know programming languages or how the software has been implemented
  • Tests can be conducted by a body independent from the developers, allowing for an objective perspective and the avoidance of developer-bias
  • Test cases can be designed as soon as the specifications are complete
  • Well suited and efficient for large code segments
BLACK BOX TESTING DISADVANTAGES
  • Only a small number of possible inputs can be tested and many program paths will be left untested
  • Without clear specifications, which is the situation in many projects, test cases will be difficult to design
  • Tests can be redundant if the software designer/ developer has already run a test case.
  • Ever wondered why a soothsayer closes the eyes when foretelling events? So is almost the case in Black Box Testing.
  • Limited Coverage since only a selected number of test scenarios are actually performed.

Testing Methods

Testing Methodology:

There are three testing methodologies. They are:
  • Black Box Testing
  • White Box Testing
  • Gray Box Testing

Definition for testing Methodologies:

BLACK BOX TESTING -

              Testing without knowledge of the internal workings of the application being tested. Tests are usually functional. Black Box Testing is also known as Behavioral Testing.

             The internal structure/design/implementation of the item being tested is not known to the tester. These tests can be functional or non-functional, though usually functional. This method is named so because the software program, in the eyes of the tester, is like a black box, inside which one cannot see.



EXAMPLE :
             A tester, without knowledge of the internal structures of a website, tests the web pages by using a browser; providing inputs (clicks, keystrokes) and verifying the outputs against the expected outcome.

WHITE BOX TESTING -

             Testing based on an analysis of the internal workings and structure of a piece of software. Includes techniques such as Branch Testing and Path Testing.

White Box Testing is also called “Structural Testing” or “Glass Box Testing”.

             White box testing involves looking at the structure of the code. When you know the internal structure of a product, tests can be conducted to ensure that the internal operations are performed according to the specification and that all internal components have been adequately exercised.





EXAMPLE :
            A tester, usually a developer as well, studies the implementation code of a certain field on a webpage, determines all legal (valid and invalid) and illegal inputs, and verifies the outputs against the expected outcomes, which are also determined by studying the implementation code.

GRAY BOX TESTING:-

             This is a Testing method which is a combination of Black Box Testing method and White Box Testing method.

             In Gray Box Testing, the internal structure is partially known. This involves having access to internal data structures and algorithms for the purpose of designing the test cases, but testing at the user, or black-box, level. Gray Box Testing is named so because the software program, in the eyes of the tester, is like a gray/semi-transparent box, inside which one can partially see.





EXAMPLE:
             An example of Gray Box Testing would be when the codes for two units/modules are studied (White Box Testing method) for designing test cases and actual tests are conducted using the exposed interfaces (Black Box Testing method).







Where Exactly Testing Comes into the picture?

Where Exactly Testing Comes into the picture?

    

Conventional Testing:- 

                      Generally, in conventional testing, test engineers test the developed application to check whether the related parts are working according to the requirements or not.

Unconventional Testing:-

                      In unconventional testing, the QA (Quality Assurance) people check each and every role in the organization in order to verify whether they are doing their work according to the company's guidelines or not.


8 October 2012

Test Plan


Test Plan:

Definition:

 A document describing the scope, approach, resources, and schedule of intended test activities. It identifies, amongst others, the test items, the features to be tested, the testing tasks, who will do each task, the degree of tester independence, the test environment, the test design techniques, the entry and exit criteria to be used and the rationale for their choice, and any risks requiring contingency planning. It is a record of the test planning process.
Master test plan: A test plan that typically addresses multiple test levels.

Phase test plan: A test plan that typically addresses one test phase.

 

TEST PLAN TEMPLATE

                  The format and content of a software test plan vary depending on the processes, standards, and test management tools being implemented. Nevertheless, the following format, which is based on the IEEE standard for software test documentation, provides a summary of what a test plan can/should contain.
Test Plan Identifier:
  • Provide a unique identifier for the document. (Adhere to the Configuration Management System if you have one.)
Fields in the test plan are

  • Introduction
            ->  Objective
            ->  Reference Documents
  • Coverage of Testing                    
            ->  Features to be tested    
            ->  Features not to be tested
  • Test Strategy  
            ->  Levels of testing
            ->  Types of testing
  • Test design techniques
  • Configuration Management
  • Test Metrics
  • Terminology
  • Automation Plan
             ->  List of Automated tools
  • Base Criteria
             ->  Acceptance Criteria
             ->  Suspension Criteria
  • Test Deliverables
  • Test Environment
  • Resource Planning
  • Scheduling
  • Risks and contingencies
  • Assumptions
  • Approval Information
Introduction:
  • Provide an overview of the test plan.
  • Specify the goals/objectives.
  • Specify any constraints.
References:
  • List the related documents, with links to them if available, including the following:
    • Project Plan
    • Configuration Management Plan

Test Items:
  • List the test items (software/products) and their versions.
Features to be Tested:
  • List the features of the software/product to be tested.
  • Provide references to the Requirements and/or Design specifications of the features to be tested
Features Not to Be Tested:
  • List the features of the software/product which will not be tested.
  • Specify the reasons these features won’t be tested.
Approach:
  • Mention the overall approach to testing.
  • Specify the testing levels [if it's a Master Test Plan], the testing types, and the testing methods [Manual/Automated; White Box/Black Box/Gray Box]
Item Pass/Fail Criteria:
  • Specify the criteria that will be used to determine whether each test item (software/product) has passed or failed testing.
Suspension Criteria and Resumption Requirements:
  • Specify criteria to be used to suspend the testing activity.
  • Specify testing activities which must be redone when testing is resumed.
Test Deliverables:
  • List test deliverables, and links to them if available, including the following:
    • Test Plan (this document itself)
    • Test Cases
    • Test Scripts
    • Defect/Enhancement Logs
    • Test Reports

Test Environment:
  • Specify the properties of test environment: hardware, software, network etc.
  • List any testing or related tools.
Estimate:
  • Provide a summary of test estimates (cost or effort) and/or provide a link to the detailed estimation.
Schedule:
  • Provide a summary of the schedule, specifying key test milestones, and/or provide a link to the detailed schedule.
Staffing and Training Needs:
  • Specify staffing needs by role and required skills.
  • Identify training that is necessary to provide those skills, if not already acquired.
Responsibilities:
  • List the responsibilities of each team/role/individual.
Risks:
  • List the risks that have been identified.
  • Specify the mitigation plan and the contingency plan for each risk.
Assumptions and Dependencies:
  • List the assumptions that have been made during the preparation of this plan.
  • List the dependencies.
Approvals:
  • Specify the names and roles of all persons who must approve the plan.
  • Provide space for signatures and dates. (If the document is to be printed.)

test cases for ATM

Test Cases for ATM machine:

  • Check that the machine accepts the ATM card; check the card inserted properly; check the card inserted in reverse.
  • Check that the machine rejects an expired card
  • Check the successful entry of the PIN number
  • Check the unsuccessful operation due to entering a wrong PIN number 3 times
  • Check the successful selection of language
  • Check the successful selection of account type
  • Check the unsuccessful operation due to an invalid account type
  • Check the successful selection of the amount to be withdrawn
  • Check the successful withdrawal.
  • Check the expected message when the amount is greater than the day limit
  • Check the unsuccessful withdrawal operation due to lack of money in the ATM
  • Check the unsuccessful withdrawal operation due to wrong denominations
  • Check the unsuccessful withdrawal operation when the number of transactions is greater than the day limit
  • Check the unsuccessful withdrawal operation due to clicking cancel after inserting the card
  • Check the unsuccessful withdrawal operation due to clicking cancel after inserting the card & PIN number
  • Check the unsuccessful withdrawal operation due to clicking cancel after inserting the card, PIN number & language
  • Check the unsuccessful withdrawal operation due to clicking cancel after inserting the card, PIN number, language & account type
  • Check the unsuccessful withdrawal operation due to clicking cancel after inserting the card, PIN number, language, account type & withdrawal operation
  • Check the unsuccessful withdrawal operation due to clicking cancel after inserting the card, PIN number, language, account type, withdrawal operation & amount to be withdrawn
  • Check the expected message when the amount to withdraw is greater than the available balance.
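The wrong-PIN case above implies a retry limit. A minimal sketch of that state machine, assuming a limit of three attempts; the `AtmSession` class is hypothetical:

```python
class AtmSession:
    """Sketch of the PIN-retry rule above (assumed limit: 3 attempts)."""
    MAX_ATTEMPTS = 3

    def __init__(self, correct_pin):
        self.correct_pin = correct_pin
        self.attempts = 0
        self.blocked = False

    def enter_pin(self, pin):
        if self.blocked:
            return "blocked"
        if pin == self.correct_pin:
            self.attempts = 0
            return "ok"
        self.attempts += 1
        if self.attempts >= self.MAX_ATTEMPTS:
            self.blocked = True
            return "blocked"
        return "retry"

session = AtmSession(correct_pin="1234")
assert session.enter_pin("0000") == "retry"
assert session.enter_pin("1111") == "retry"
assert session.enter_pin("2222") == "blocked"   # third wrong PIN blocks the card
assert session.enter_pin("1234") == "blocked"   # even the correct PIN is refused now
```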

Test Cases For PEN

Test Cases for PEN: 

  • Check the pen type
  • Check whether the pen cap is present or not
  • Check whether the pen ink is filled or not
  • Check whether the pen writes or not
  • Check the ink color, i.e. black or blue
  • Check the pen color
  • Check whether the pen can be used to write on all types of paper or not
  • Check the ink capacity of the pen
  • Check whether the pen is made of fiber or plastic
  • Check the dimensions of the pen
  • Check the logo of the pen maker
  • Check the total length that the pen can write
  • Check the strength of the nib
  • Check the grip of the pen
  • Check the capacity against the surfaces it can write on
  • Check the hanger provided in the cap
  • Check that the body of the pen fits perfectly with the cap of the pen
  • Check the ink in the pen: at what temperature does the ink get blocked?
  • Check the size of the nib, like 0.5mm/0.6mm
  • Check that when the pen is pressed hard against a hard surface, the refill does not come out of the pen from the backside. [Load Testing]
  • Check whether it can write on leaves or not
  • Check by dipping the pen in water and then writing; do this for each letter to be written
  • Bend the refill at multiple points and then try to write with it
  • Soak the refill upward and then write on a paper
  • Rub the pen ball on a stone and then write
  • Write with a pen without ink in the refill
  • Fill the refill with any other liquid instead of normal writing ink, and try to write with that pen
  • Remove the cap of the pen and keep it in a pocket to check whether there is any leakage of ink
  • Throw the pen with rapid speed onto the floor and check whether the pen breaks
  • Check whether the pen writes in any climate
  • Check how much time the pen is able to write continuously without rest in the same way
  • Check the pen size as per the specification
  • Check whether the pen is ball or ink as per the specification
  • It should not be too large.
  • It should not be too small.
  • Check whether we can see the liquid from outside
  • Check whether the pen can write upside down, i.e. if we sign or write anything by holding it against the wall and writing

Test Cases for FAN

Test Cases For Fan:

  • It should have a hook for hanging from the roof.
  • Check the ceiling fan company (logo, branded)
  • It should have a minimum of three blades.
  • It should move once electricity passes into it.
  • Check the regulator: is it user-friendly for adjusting the SPEED?
  • The speed of the fan should be controlled by the regulator.
  • Check whether it makes any sound while rotating or stopping
  • With the switch on, the fan is on (position of regulator: on)
  • With the switch on, the fan is off (position of regulator: off)
  • It should stop once the electric switch is off
  • The fan should run with minimum noise.
  • The blades should have a proper distance from the ceiling.
  • The fan, while in motion, should not vibrate.
  • The color of the fan should be dark.
  • The fan should work in the clockwise direction
  • Check the ceiling fan running with voltage problems.
  • Check the ceiling fan's life by running it a certain number of hours per day.
  • Check the ceiling fan's sound after several hours of running

 Test cases for the fan are given below in test case document format:

Test case1: Check whether it moves or not.
Description: Ensure that the fan moves properly.
Expected result: The fan should move.

Test case2: Check that it has a minimum of 3 blades.
Description: Ensure that the number of fan blades is at least 3.
Expected result: The fan should not have fewer than 3 blades.

Test case3: Check that it turns on when the electric switch is on.
Description: Ensure that the fan starts working when the electric switch is on.
Expected result: The fan should be on when the electric switch is on.

Test case4: Check whether the speed of the fan is controlled by the regulator.
Description: Ensure that the speed of the fan can be controlled.
Expected result: The fan speed should be controlled by the regulator.

Test case5: Check that it stops working once the electric switch is off.
Description: Ensure that the fan stops working once the electric switch is off.
Expected result: The fan should be off once the electric switch is off.

Test case6: Check whether the proper company name is displayed on the fan or not.
Description: Ensure that the name of the company is properly displayed on the fan.
Expected result: The proper name of the company should be displayed on the fan.

Test case7: Check that the fan always works in the clockwise direction.
Description: Ensure that the direction of the fan is clockwise.
Expected result: The fan should work in the clockwise direction.

Test case8: Check the color of the fan blades.
Description: Ensure that all the blades of the fan have the same color.
Expected result: All the blades of the fan should be of the same color.

Test case9: Check that the fan, while in motion, does not vibrate.
Description: Ensure that the fan, while in motion, does not vibrate.
Expected result: The fan should not vibrate.

Test case10: Check whether the blades have a decent distance from the ceiling.
Description: Ensure that the fan blades have a decent distance from the ceiling.
Expected result: The fan blades should have a decent distance from the ceiling.

Test case11: Check the size of the fan blades.
Description: Ensure that all the blades of the fan have the same size.
Expected result: All the blades of the fan should be of the same size.

Test case12: Check whether it operates at low voltage.
Description: Ensure that the fan operates properly at low voltage.
Expected result: The fan should operate properly at low voltage.

Test case13: Check whether the speed varies when the regulator is adjusted.
Description: Ensure that the speed of the fan varies when we adjust the regulator.
Expected result: The speed of the fan should vary while adjusting the regulator.



6 October 2012

Test Cases For Shopping Cart

Test Cases For Shopping Cart:

 

  • Adding an item to the shopping cart
  • Deleting an item from the shopping cart
  • Verifying the item count after adding and deleting
  • Verifying coupon codes, if your application supports them (coupons help you redeem points)
  • Verifying the sales price and MRP as per the data you have been provided
  • Verifying payment methods: PayPal, Google Checkout, other payment methods
  • Verifying purchasing of an item when it is not in stock
  • Verifying shipping methods and shipping charges, if applicable
  • Verifying purchasing with taxable states (Minnesota and Wisconsin are taxable states in the US) and non-taxable states
  • Verify with different credit card types: Visa, Master, Discover, Amex
  • Verify with an expired credit card.
  • Check whether clicking on the Add to Cart button takes the user to the cart page or not
  • The shopping cart holds the items while the session is active. As soon as the session is closed, the shopping cart should be refreshed.
  • The shopping cart should not contain duplicate items, although the user can add quantities of the product to be purchased
  • The shopping cart should contain the price of the product, its name, and a link to the product's details
  • Check whether there is any cross-selling product, that is, if we buy one product, then we must buy that cross-selling product.
  • Check whether the special offers (optional), if selected, are also included in the cart.
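The cart rules above (no duplicate line items, but adjustable quantities; item count after add and delete) can be modeled with a small sketch; the `Cart` class and its method names are hypothetical.

```python
class Cart:
    """Toy cart: no duplicate line items, but quantities may be increased."""
    def __init__(self):
        self.items = {}   # product name -> quantity

    def add(self, product, qty=1):
        # Adding the same product again raises its quantity, never a duplicate row.
        self.items[product] = self.items.get(product, 0) + qty

    def remove(self, product):
        self.items.pop(product, None)

    def count(self):
        return sum(self.items.values())

cart = Cart()
cart.add("pen")
cart.add("pen")          # same product again: quantity rises to 2
cart.add("notebook")
assert len(cart.items) == 2   # two distinct line items, no duplicates
assert cart.count() == 3      # total quantity across the cart
cart.remove("notebook")
assert cart.count() == 2
```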


Test Cases For Check box and Radio button

Test Cases For Check Box and Radio button

Test Case to test Check Box:-


1. Check the check box control and press the submit button, and test whether it redirects to the next page.
2. Uncheck the check box control and press the submit button, and test whether it redirects to the next page.
3. Check the alignment of the check box, i.e. its size, color, and style.
4. Check the alignment of the label for the check box, i.e. whether it is on the left or the right.
5. Check and uncheck the check box control and test whether it automatically submits to the server or not if the submit button is not present.

Test Case to test Radio button:-

 

1. If the radio button control is present in a group, then check one of the radio buttons and press the submit button to check whether it redirects to the next page or not.
2. Try to check multiple radio button controls at a time and press the submit button, and test whether it redirects. It must not be possible to check multiple radio buttons in the same group at the same time.
3. Check the alignment of all the radio button controls, i.e. whether they are in the same direction (left or right). All the radio button controls must be in the same direction.
4. Check the alignment of the radio buttons, i.e. same size, color, and style.
5. Check whether the radio button control can be checked or unchecked through the Tab key or with the help of the mouse cursor (GUI Testing).
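Case 2 above rests on the radio-group invariant that only one button in a group can be selected at a time. A toy model of that invariant (the `RadioGroup` class is hypothetical):

```python
class RadioGroup:
    """Toy model: selecting one radio button deselects the others in the group."""
    def __init__(self, options):
        self.options = list(options)
        self.selected = None

    def select(self, option):
        if option not in self.options:
            raise ValueError(f"unknown option: {option}")
        self.selected = option   # the previous selection is implicitly cleared

group = RadioGroup(["male", "female"])
group.select("male")
group.select("female")
assert group.selected == "female"   # only the latest choice is active
```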

5 October 2012

Test Cases for Mouse

Test Cases For Mouse:

  • Check the mouse company
  • Check whether it is a PS/2, USB, serial-port, or cordless mouse
  • It should be pluggable into the corresponding ports of different manufacturers
  • Check the basic functionality (smoke test): whether it is connected to your PC or not
  • Properly finished plastic body, to hold the mouse, with right and left keys and a scroll wheel
  • It should be platform independent
  • Now check the functionality: there should be a pointer on the screen, and when it is moved over links, it should turn into a hand symbol
  • Check whether the pointer turns into a vertical bar in a textbox
  • Right-clicking the mouse should display the context menu
  • Double-clicking on any folder should open it
  • Should be able to scroll up or down using the scroll button on the mouse
  • Should be able to change the functionality of the right button
  • Should be able to point to the scrollbar and then drag up and down
  • Should always point to the right place, where it is intended to point

Test cases for LogIn window

Test cases for sample Login Form 

The basic objective of writing test cases is to validate the testing coverage of the application. Well-written test cases can make the testing cycle smooth and efficient. A good test case makes it easy to determine whether a feature of an application is working correctly.


  • Verify the following text fields in the Login screen: Username, Password.
  • Verify the following buttons in the Login screen: Login, Reset.
  • Verify the mandatory fields in the Login screen (without entering data in the text fields).
  • Check whether the Username and Password fields appear in the proper position on the page or not.
  • Check whether the Username and Password fields have the same font and a proper look and feel or not.
  • Check that the Username and Password fields are properly aligned.
  • Check whether the spellings are proper or not.
  • Check whether the button fields are active and do the desired work or not.
  • Verify the validation message shown when trying to log in leaving both the Username and Password fields blank.
  • Verify the validation message shown when trying to log in leaving the Username field blank.
  • Verify the validation message shown when trying to log in leaving the Password field blank.
  • Verify the error message when logging in with a wrong username.
  • Verify the error message when logging in with a wrong password.
  • Verify the error message when logging in with a wrong username and password.
  • Verify the error message when logging in with a long username that exceeds the character limit.
  • Verify the error message when logging in with a long password that exceeds the character limit.
  • Try copy/paste in the password text box.
  • Verify that the user is able to log in to the application when using correct credentials.
  • Verify the functionality of the RESET button.
  • Verify the look and feel of the Login screen.
  • After a successful sign-out, try the "Back" option in your browser. Check whether it gets you to the "signed-in" page.
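The validation-message cases above can be captured in a small routine. The `validate_login` function is a hypothetical sketch; the message strings follow the expected values used in these notes.

```python
def validate_login(username, password, users):
    """Hypothetical validation mirroring the cases above; `users` maps username -> password."""
    if not username and not password:
        return "Enter Username and Password"
    if not username:
        return "Enter Username"
    if not password:
        return "Enter Password"
    if users.get(username) != password:
        return "Username and password are wrong"
    return "ok"

users = {"swetha": "test"}
assert validate_login("", "", users) == "Enter Username and Password"
assert validate_login("", "test", users) == "Enter Username"
assert validate_login("swetha", "", users) == "Enter Password"
assert validate_login("swathi", "test", users) == "Username and password are wrong"
assert validate_login("swetha", "test", users) == "ok"
```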

Test Case Document for LogIn Form:

 

Project Name: Website Name        Page Name: Login Screen
Module Name: Name of the module   User: End User
Author: Swetha                    Execution Date:
Requirement Id | Test Case Id | Pre-Requisite | Description/Test Steps | Expected Value | Test Data
R01 | TC-01 | Login screen | Verify the following text fields in the Login screen: Username, Password. | Username and Password fields must be available | NA
R01 | TC-02 | Login screen | Verify the following buttons in the Login screen: Login, Reset. | Login and Reset buttons must be available | NA
R01 | TC-03 | Login screen | Verify the mandatory fields in the Login screen. | Username and Password fields must be mandatory | NA
R01 | TC-04 | Login screen | Check whether the Username and Password fields appear in the proper position on the page or not. | Username and Password fields must appear in the proper position | NA
R01 | TC-05 | Login screen | Check whether the Username and Password fields have the same font and a proper look and feel or not. | Username and Password fields must have the same font | NA
R01 | TC-06 | Login screen | Check that the Username and Password fields are properly aligned. | Username and Password fields must be properly aligned | NA
R01 | TC-07 | Login screen | Check whether the spellings are proper or not. | Spellings must be correct | NA
R01 | TC-08 | Login screen | Check whether the button fields are active and do the desired work or not. | Button fields must be enabled | NA
R01 | TC-09 | Username and Password | Verify the validation message shown when trying to log in leaving both the Username and Password fields blank. | Error message must be "Enter Username and Password" | NA
R01 | TC-10 | Password | Verify the validation message shown when trying to log in leaving the Username field blank. | Error message must be "Enter Username" | Password: test
R01 | TC-11 | Login button | Verify the validation message shown when trying to log in leaving the Password field blank. | Error message must be "Enter Password" | Username: swetha
R01 | TC-12 | Username | Verify the error message when logging in with a wrong username. | Error message must be "Username and password are wrong" | Username: swathi
R01 | TC-13 | Password | Verify the error message when logging in with a wrong password. | Error message must be "Username and password are wrong" | Password: test12
R01 | TC-14 | Username, Password | Verify the error message when logging in with a wrong username and password. | Error message must be "Username and password are wrong" | Username: swathi, Password: test12
R01 | TC-15 | Username | Verify the error message when logging in with a long username that exceeds the character limit. | Error message must be "Username and password are wrong" | Username: swethaaaaaaaaaaaaa
R01 | TC-16 | Password | Verify the error message when logging in with a long password that exceeds the character limit. | Error message must be "Username and password are wrong" | Password: test12aaaaaaaa
R01 | TC-17 | Password field | Try copy/paste in the password text box. | Error message must be "Wrong password" | Password: test
R01 | TC-18 | Login button | Verify that the user is able to log in to the application when using correct credentials. | Must display the appropriate page | Username: swetha, Password: test
R01 | TC-19 | Reset button | Verify the functionality of the RESET button. | Must reset the text in the Username and Password fields | NA
R01 | TC-20 | Login screen | Verify the look and feel of the Login screen. | The Login screen must look good | NA
R01 | TC-21 | Login button | After a successful sign-out, try the "Back" option in your browser; check whether it gets you to the "signed-in" page. | Must display the Login screen | NA

(The Actual Value, Result, Build No, and Priority columns are filled in during test execution.)