Monday 18 February 2013

5 Whys

Quality Assurance minded people will love this..

Benjamin Franklin's 5-Why Analysis: For want of a nail a shoe was lost, for want of a shoe a horse was lost, for want of a horse a rider was lost, for want of a rider an army was lost, for want of an army a battle was lost, for want of a battle the war was lost, for want of the war the kingdom was lost, and all for the want of a little horseshoe nail.

<:-) No hidden messages, just something to think about. Posted because I found this somewhere.>

Wednesday 13 February 2013

Basic Steps to Get Started with JMeter


Apache JMeter
Download Apache JMeter from http://jakarta.apache.org/jmeter
System requirements: Java must be installed.
  1. Extract the downloaded binary (zipped) folder.
  2. Go to the bin folder, then run jmeter.bat (Windows batch file).
  3. The JMeter window will appear.



  1. To create a test script:
     Test Plan > Add > Threads (Users) > Thread Group

  2. Thread Group > Add > Logic Controller > Recording Controller

  3. Thread Group > Add > Config Element > HTTP Cache Manager
                                         > HTTP Cookie Manager
                                         > HTTP Request Defaults

  4. Check both check boxes in the HTTP Cache Manager.

  5. Check both check boxes in the HTTP Cookie Manager.

  6. HTTP Request Defaults
     If a particular server name or IP is entered in the Server Name or IP field, scripts will be recorded for that server only. Likewise, if a port number is mentioned, only requests the browser sends through that particular port will be recorded.
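For illustration, the HTTP Request Defaults fields might look like this (placeholder values — substitute your own application's host and port):

```
Server Name or IP : www.example.com
Port Number       : 8080
```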

  7. Thread Group > Add > Listener > View Results Tree
                                   > Aggregate Report

  8. Add a Proxy Server to the WorkBench:
     Test Plan > WorkBench > Add > Non-Test Elements > HTTP Proxy Server

  9. In the Proxy Server, add a port number such as 7070, 7080, or 9091.
     The Target Controller should be Thread Group > Recording Controller, so that scripts are recorded inside the Recording Controller.
     Also add URL patterns to exclude, for example: .*\.css, .*\.js, .*\.jpg, .*\.png, .*\.ico, .*\.gif
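As a quick sanity check (a Python sketch, not part of JMeter), note that the exclude entries are regular expressions matched against the full request URL; the example.com URLs below are placeholders:

```python
import re

# Exclude patterns as typed into JMeter's "URL Patterns to Exclude"
exclude_patterns = [r".*\.css", r".*\.js", r".*\.jpg",
                    r".*\.png", r".*\.ico", r".*\.gif"]

def is_excluded(url):
    # Each pattern must match the whole URL, as JMeter does
    return any(re.fullmatch(p, url) for p in exclude_patterns)

print(is_excluded("http://example.com/styles/main.css"))  # True
print(is_excluded("http://example.com/login"))            # False
```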

  10. Then do the proxy setting in the browser you want to use.
      E.g. Firefox: Tools > Options > Advanced > Network > Settings

  11. Select Manual Proxy Configuration.
      HTTP Proxy: localhost, Port: 7070 (the port provided in the JMeter Proxy Server).
      Clear the "No Proxy for" field.
      Then click OK, and OK again.

  12. Then start the JMeter Proxy Server.

  13. Then record the scripts from the browser (whatever is done in the browser is captured) and stop the Proxy Server.

  14. After recording, disable the unwanted samples, such as image files (jpg, png, etc.), JavaScript files, and asmx/jsdebug requests.

  15. Then go to Run > Clear All.

  16. To play back the recorded script, first save it; the saved script has the extension .jmx.











  17. Apply load:
      Thread Group > Number of Threads (Users), e.g. 100
                   > Ramp-Up Period, e.g. 420 sec (the start-up of the 100 threads will be spread over the 420 seconds)
                   > Loop Count (to repeat)
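A small Python sketch of how the ramp-up period spreads thread starts, using the example numbers above:

```python
# Values from the example above
threads = 100
ramp_up = 420  # seconds

# JMeter starts threads evenly across the ramp-up period
interval = ramp_up / threads           # gap between consecutive thread starts
start_offsets = [i * interval for i in range(threads)]

print(interval)           # 4.2 -> a new thread starts every 4.2 seconds
print(start_offsets[:3])  # [0.0, 4.2, 8.4]
```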


  18. Find the performance results in the Aggregate Report.











  19. To run it:
      1. Start (runs on the local server)
      2. Start no pauses (runs on the local server without pausing)
      3. Remote Start (runs on a selected remote server)
      4. Remote Start All (runs on all remote servers)

  20. Find any discrepancies (if any) in the View Results Tree (HTML view).
      A sample shown in red in the View Results Tree indicates a failure/error.



  21. If you are getting a memory issue:
      Go to jmeter.bat > open with Notepad > find "HEAP" > increase the heap size (default 512 MB).
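In jmeter.bat the heap is set on a line like the one below (the exact wording varies by JMeter version — treat this as a sketch); raising -Xmx increases the maximum heap:

```
rem Default in jmeter.bat (approximately):
rem set HEAP=-Xms512m -Xmx512m
rem Increased, e.g. to 1 GB:
set HEAP=-Xms1024m -Xmx1024m
```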
  22. How to add a remote server:
      In the bin folder, open jmeter.properties (PROPERTIES file) with Notepad and add the remote server's IP.
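Remote machines are listed in the remote_hosts property of jmeter.properties; the IPs below are placeholders for your own servers:

```
# Comma-separated list of remote JMeter servers
remote_hosts=192.168.0.10,192.168.0.11
```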


  23. Results from the Aggregate Report:
      Avg Page Response Time (seconds)
      90th Percentile Page Response Time (seconds)
      Max Page Response Time (seconds)
      Total Pages Viewed (# Samples)
      Errors Total Count
      Errors %
      Throughput
      Hits/Second
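To make these columns concrete, here is a Python sketch computing a few of them from hypothetical samples (illustration only — the numbers come from JMeter's listeners, and JMeter's own percentile calculation may differ slightly):

```python
# Hypothetical samples: (response time in ms, success flag)
samples = [(230, True), (180, True), (410, False), (95, True), (300, True)]

times = sorted(t for t, ok in samples)
avg = sum(times) / len(times)                     # Avg Response Time
pct90 = times[max(0, int(0.9 * len(times)) - 1)]  # rough 90th percentile
maximum = times[-1]                               # Max Response Time
errors = sum(1 for _, ok in samples if not ok)
error_pct = 100.0 * errors / len(samples)         # Errors %

print(avg, pct90, maximum, error_pct)  # 243.0 300 410 20.0
```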

















How is the testing going? An elevator/corridor question!!


Senior Mgmt Exec: How is the testing going with ... product?

Test manager: Fine. So far so good .. 

Senior Mgmt Exec: Hmm. < Silence endures. No more questions.>

More often than you imagine, this conversation will occur in an elevator or in the corridors of the office. Does it give any understanding of the testing effort to the executive in the senior management? Unfortunately, No.

A formal setting is not needed for any senior exec to get an understanding of the effort. This elevator / corridor question is enough to cause jitters. Yes :-).

Let us revisit this conversation.

Senior Mgmt Exec: How is the testing going with ... product?

Test manager: We have completed 65% of the functional testing. Another 4 weeks to go and 2 more regression cycles to complete. We are on-schedule.

Senior Mgmt Exec: That is great. How is the bug fixing going on?

Test Manager: Bug Fix rate is 8.4. We still have a couple of P1 defects to be fixed by the development. It will be done and tested before the final regression testing starts. There are no P0 defects.

Senior Mgmt Exec: Awesome! That is very nice to hear. Good Job!

The above conversation gives a lot of confidence to the senior executive on the testing effort for the product. The second question seems like a question for the Dev manager to answer but test managers need to be on top of all related metrics and overall status.

It is extremely important to be able to cite metrics to boost confidence. The first conversation would have left the exec wondering and given no clue of the status. It would, in fact, have made his/her ulcers flare up, don't you think? :)



<:-) No hidden messages, just something to think about. Posted because I found this somewhere.>

Positive vs Negative Testing



Positive testing determines that your application works as expected. If an error is encountered during positive testing, the test fails.

Negative testing ensures that your application can gracefully handle invalid input or unexpected user behavior. For example, if a user tries to type a letter in a numeric field, the correct behavior in this case would be to display the “Incorrect data type, please enter a number” message. The purpose of negative testing is to detect such situations and prevent applications from crashing. Also, negative testing helps you improve the quality of your application and find its weak points.

The core difference between positive testing and negative testing is that throwing an exception is not an unexpected event in the latter. When you perform negative testing, exceptions are expected – they indicate that the application handles improper user behavior correctly.

It is generally considered good practice to combine both the positive and the negative testing approaches. This strategy provides higher test coverage of the application than using only one of these methodologies.
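The numeric-field example above can be sketched in Python; parse_quantity is a hypothetical validator invented for this illustration:

```python
def parse_quantity(text):
    """Hypothetical numeric-field validator used only for this illustration."""
    if not text.isdigit():
        raise ValueError("Incorrect data type, please enter a number")
    return int(text)

# Positive test: valid input must succeed -- an exception here means failure.
assert parse_quantity("42") == 42

# Negative test: invalid input is EXPECTED to raise; the exception shows the
# application handles improper input gracefully instead of crashing.
try:
    parse_quantity("abc")
except ValueError:
    print("negative test passed: bad input was rejected")
else:
    raise AssertionError("negative test failed: bad input was accepted")
```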

Quote of The Day

Any day can be a Kiss day like 13th Feb...!!