Sunday, 30 June 2013

Basic Web Performance Part 5

As we progress on our journey to performance test our web application, I think it would be appropriate to introduce a few key terms we use in performance testing. It is important to know what the expectations from our performance testing are, along with certain dos and don'ts. Let me get started; I am not going to explain these terms here, so you might have to figure them out yourself :)
Key Terminology
  •  Ramp Up/Ramp down
  •  Users/Threads
  • Iterations
  • Throughput
  • WorkLoad/Business Flow
  • Request/Response
  • Load/Stress
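To make a couple of these terms concrete, here is a minimal sketch in plain Python with made-up numbers (all values below are illustrative assumptions, not from a real test) showing how ramp-up, users, and iterations relate to throughput:

```python
# Hypothetical load profile: every number here is illustrative, not from a real test.
users = 50                # concurrent users/threads
ramp_up_s = 100           # ramp-up period: one new user starts every 2 seconds
iterations = 10           # each user repeats the workload 10 times
test_duration_s = 600     # total test duration in seconds

# Rate at which users are started during the ramp-up period
start_rate = users / ramp_up_s            # users started per second

# Throughput = completed requests per unit of time.
# Assume each iteration of the business flow issues 4 requests:
requests_per_iteration = 4
total_requests = users * iterations * requests_per_iteration
throughput_rps = total_requests / test_duration_s

print(f"user start rate: {start_rate:.1f}/s")
print(f"total requests:  {total_requests}")
print(f"throughput:      {throughput_rps:.2f} req/s")
```

The same arithmetic works in reverse: given a target throughput and a known per-iteration request count, you can estimate how many users you need.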
Why Do Performance testing
  1. Assessing release readiness
  2. Assessing infrastructure adequacy
  3. Assessing software performance adequacy
  4. Collecting metrics for Performance Tuning
Dos and Don'ts

There are certain dos and don'ts to observe before you perform the performance test. These are guidelines, and not all of them may be applicable.

Do
  • Load test at a suitable time, when parameters like the network, user access, and application access are under control.
  • Define the performance metrics, accepted levels or SLAs, and goals.
  • Know the objective of the test.
  • Know the Application Under Test's architecture and details like protocols supported, how cookies are managed, etc.
  • Know the workload scenarios and the peak and normal usage times before the test.
  • Use meaningful, real-life test scenarios to construct the Test Plan.
  • Run the tool on a machine other than the one running the application.
  • The machine running the tool should have sufficient network bandwidth and resources (memory, CPU).
  • Run the test for a long duration to minimize deviations.
  • Ensure the application is stable, with no errors logged in the log files, by manually browsing the application.
  • Incorporate think time to emulate real-life scenarios.
  • Keep a close watch on the processor, memory, disk, and network.
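The think-time point deserves a concrete illustration. A minimal sketch in Python (the 1-3 second range and the `fetch_page` callback are assumptions for illustration; derive real values from your own workload analysis):

```python
import random
import time

def think_time(min_s=1.0, max_s=3.0):
    """Pause for a random interval, emulating a real user reading the page.

    The 1-3 second default range is an assumption; pick values from your
    own workload analysis.
    """
    pause = random.uniform(min_s, max_s)
    time.sleep(pause)
    return pause

def run_user_session(fetch_page, pages):
    """Hypothetical driver loop: fetch_page is a placeholder for your HTTP call."""
    for page in pages:
        fetch_page(page)    # issue the request
        think_time()        # wait like a human would before the next click
```

Without think time, every virtual user hammers the server back-to-back, which inflates the load well beyond what the same number of real users would generate.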
Don't
  • Never run the test against servers that are not assigned for testing; otherwise you may be accused of launching a DoS attack.
You can read more about performance testing on the blog http://prashantbansode.blogspot.in/

In the next article we will see how we can follow these guidelines and start performance testing with JMeter!!

Saturday, 29 June 2013

Basic Web Performance Part 4

In previous articles we looked at website request/response in a mostly single-user scenario. This may not be realistic, as most company websites and web products are used by thousands or millions of users. Some transactions might be used by a few thousand users, while others might be used by millions. It is advisable to properly test your website with a more real-life user distribution, and this cannot happen manually; we need to automate the process. We need tool support, and we can broadly categorize the tools into two classes:
1. Open Source
2. Commercial
There are more subtle differences to consider before you choose a tool, like protocol support, virtual user (vuser) support, learning curve, etc. Since our discussion is limited to web applications, we will focus on the HTTP protocol here. The tools in the open source category include OpenSTA and JMeter, while on the commercial side there are tools like LoadRunner, WebLOAD, and the Rational tools. We chose JMeter for our testing for the reasons below:
  1. JMeter can be used for testing static and dynamic resources
  2. Can support heavy load, i.e. many users
  3. Great analysis of the performance results/metrics
  4. Highly extensible, with support for JavaScript/Groovy etc. to enhance the scripts
  5. And lastly, no license cost!!

How to Install
Since it is open source, you can do a quick googling or binging and get the download. More precisely, you can go to
http://jmeter.apache.org/index.html; under Download on the left pane there is a Download Releases link. The latest is version 2.9, with Java 6 as a prerequisite. Under the binaries you can get the .zip or .tgz, based on the operating system you are planning to install on. If for some reason you want an older version, you can go to Archives and download the version you are interested in.

Jmeter Launch
Once you have extracted the files, you will see bin, docs, extras, lib, printable_docs, and a few auxiliary files. Go to the bin folder and launch the jmeter.bat file; this will launch JMeter and you will see something like the screen below. The left pane contains the JMeter Elements, and the right pane is for configuring the settings of these elements.
We will talk more about the elements in JMeter in our next session and also touch on some key terminology in performance testing.

Wednesday, 26 June 2013

ETL Performance - Data loading and Snapshot

Of late we had a requirement to test the performance of our ETL. ETL is a process which consists of:

E - Extraction
T - Transformation
L - Loading

The data is acquired from various source systems (aka Upstream), which can be relational databases, flat files, etc. The data is then transformed by various business logic and loaded into Marts. We had to acquire data from a SQL database; the acquisition was done by a tool developed by MS: Replication. The Replication performance was good enough with the default settings. We calculated the latency from Publisher to Distributor and from Distributor to Subscriber, and it was less than 3-5 seconds. With that performing well, we focused on the performance of the Transformation and Loading parts; we had an SSIS framework and were using Control Flow and Data Flow tasks to load and transform the data.

The Loading and Transformation performance can be divided into the following three parts:

1. Full/initial run
2. No data change in Upstream; perform the run
3. Populate the delta records in Upstream; perform the run

The initial run took less than 2 hours, with 25 GB of data populated successfully into the various facts and dimensions. We then re-ran the job without any modification to the source data, and the ETL took 1 hour and 15 minutes; since there was no data change, I think this time was spent just checking the records against the last checkpoint where data was processed successfully, and/or re-populating the data by truncating the existing records.

The last part was to populate data in the source systems, do the run again, and capture the performance. The data loading had to be done in key tables based on the business requirements and the metrics collected. The loading was done with a separate utility in the source system, and we found that to load all the data successfully into the source we might have to wait for 3 days. Since we could not wait that long, we verified the data that had already loaded, took a snapshot of the source DB, ran the ETL job, and captured the benchmark. A snapshot in SQL DB is a process which captures the DB state at a point in time, so later we can compare the performance benchmarks against the equivalent data present in the snapshot DB. Later we will talk about the ETL performance reports. Thanks all, and happy ETL perf testing :-)
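As a rough sanity check on run times like these, it helps to turn them into a data rate. A small back-of-the-envelope sketch in Python, using the figures quoted above (the derived rates are approximations, not precise benchmarks):

```python
# Figures from the runs described above; treat the derived rates as
# back-of-the-envelope numbers, not precise benchmarks.
data_gb = 25
initial_run_min = 120        # < 2 hours for the full/initial load
no_change_run_min = 75       # 1 h 15 min with no source data change

initial_rate = data_gb / (initial_run_min / 60)   # GB loaded per hour
overhead_ratio = no_change_run_min / initial_run_min

print(f"initial load rate: {initial_rate:.1f} GB/hour")
print(f"a no-change run still costs {overhead_ratio:.1%} of a full run")
```

The striking number is the second one: a run with zero changed rows still consumes well over half the time of a full load, which suggests most of the cost is in checkpoint comparison and truncate/reload work rather than in moving data.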