Sunday 30 June 2013

Basic Web Performance Part 5

As we progress on our journey to performance test our web application, I think it would be appropriate to introduce a few key terms we use in performance testing. It is important to know what to expect from performance testing, along with certain dos and don'ts. Let me get started; I am not going to explain all of these terms here, so you may have to look some of them up yourself :)
Key Terminology
  • Ramp-Up/Ramp-Down
  • Users/Threads
  • Iterations
  • Throughput
  • Workload/Business Flow
  • Request/Response
  • Load/Stress
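To make a couple of these terms concrete, here is a rough sketch in Python, with made-up example numbers, of how ramp-up and throughput relate: users are started gradually over a ramp-up period, and throughput is simply requests completed per unit time:

```python
# Made-up example numbers, not from any real test run.
users = 50          # total users/threads
ramp_up_s = 100     # ramp-up period in seconds
requests_done = 1200
test_duration_s = 300

# During ramp-up, one new user starts every (ramp_up / users) seconds.
start_interval_s = ramp_up_s / users
print(f"a new user starts every {start_interval_s:.1f} s")

# Throughput: requests completed per second over the whole run.
throughput = requests_done / test_duration_s
print(f"throughput: {throughput:.1f} requests/s")
```
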
Why Do Performance Testing?
  1. Assessing release readiness
  2. Assessing infrastructure adequacy
  3. Assessing software performance adequacy
  4. Collecting metrics for performance tuning
Dos and Don'ts

There are certain dos and don'ts to keep in mind before you perform a performance test. These are again guidelines, and not all of them may be applicable in every situation.

Do
  • Load test at a suitable time, when parameters like the network, user access, and application access are under control.
  • Define the performance metrics, accepted levels or SLAs, and goals.
  • Know the objective of the test.
  • Know the architecture and details of the Application Under Test, such as the protocols supported, how cookies are managed, etc.
  • Know the workload scenarios and the peak and normal usage times before the test.
  • Construct the test plan from meaningful test scenarios that reflect real life.
  • Run the tool on a machine other than the one running the application.
  • Ensure the machine running the tool has sufficient network bandwidth and resources (memory, CPU).
  • Run the test for a long duration to minimize deviations.
  • Ensure the application is stable, with no errors logged in the log files, by manually browsing the application first.
  • Incorporate think time to emulate real-life scenarios.
  • Keep a close watch on the following: processor, memory, disk, and network.
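One of the points above, think time, can be illustrated with a tiny Python sketch: a pause between requests drawn at random from a range, so virtual users don't hammer the server back-to-back. The 3-10 second range here is just an example, not a recommendation from any tool:

```python
import random
import time

def think_time(min_s=3.0, max_s=10.0):
    """Pause for a random 'think time', emulating a real user reading a page."""
    pause = random.uniform(min_s, max_s)
    time.sleep(pause)
    return pause

# In a load script you would call think_time() between successive requests:
# response = send_request(...)   # hypothetical request function
# waited = think_time()
```
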
Don't
  • Never run the test against servers which are not assigned to test, else you may be accused of DoS attack.
You can find more on performance testing on the blog http://prashantbansode.blogspot.in/

In the next article we will see how to follow these guidelines and get started on performance testing with JMeter!

Saturday 29 June 2013

Basic Web Performance Part 4

In previous articles we looked at the website request/response mostly in a single-user scenario. This is rarely the case in practice, as most company websites and web products are used by thousands or millions of users. Some transactions might be used by a few thousand users while others might be used by millions. It is advisable to test your website with a realistic user distribution, and this cannot be done manually; we need to automate the process. We need tool support, and we can broadly categorize the tools into two classes:
1. Open Source
2. Commercial
There are more subtle differences to consider before you choose a tool, like protocol support, virtual user (vuser) support, learning curve, etc. Since our discussion is limited to web applications, we will focus on the HTTP protocol here. Tools in the open source category include OpenSTA and JMeter, while on the commercial side there are tools like LoadRunner, WebLoad, and the Rational tools. We chose JMeter for our testing for the reasons below:
  1. JMeter can be used for testing both static and dynamic resources
  2. It can support heavy load, i.e. many concurrent users
  3. It offers good analysis of performance results/metrics
  4. It is highly extensible, with support for JavaScript/Groovy etc. to enhance scripts
  5. And lastly, no license cost!

How to Install
Since it is open source, a quick Google or Bing search will get you to the download. More precisely, go to http://jmeter.apache.org/index.html; under Download Releases in the left pane, the latest is version 2.9, which requires Java 6. Under the binaries you can get the .zip or .tgz, depending on the operating system you are planning to install on. If for some reason you want an older version, you can go to the Archives and download the version you are interested in.

JMeter Launch
Once you have extracted the files, you will see bin, docs, extras, lib, printable_docs, and a few auxiliary files. Go to the bin folder and launch jmeter.bat; this will start JMeter and you will see something like the screen below. The left pane contains the JMeter elements, and the right pane is for configuring the settings of those elements.
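As a small sketch, the launch step can also be scripted. The extraction directory below (and the JMETER_HOME variable) is an assumption for illustration; adjust it to wherever you unpacked JMeter:

```python
import os
import sys

# Assumed extraction directory; adjust to where you unpacked JMeter.
JMETER_HOME = os.environ.get("JMETER_HOME",
                             os.path.expanduser("~/apache-jmeter-2.9"))

# jmeter.bat on Windows, the jmeter shell script elsewhere.
launcher = "jmeter.bat" if sys.platform.startswith("win") else "jmeter"
cmd = os.path.join(JMETER_HOME, "bin", launcher)
print("Would launch:", cmd)

# Uncomment to actually start the JMeter GUI:
# import subprocess
# subprocess.Popen([cmd])
```
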
We will talk more about the elements of JMeter in our next session, and also touch on some key terminology in performance testing.

Wednesday 26 June 2013

ETL Performance - Data loading and Snapshot

Of late, we had a requirement to test the performance of our ETL. ETL is a process consisting of:

E - Extraction
T - Transformation
L - Loading

The data is acquired from various source systems (aka upstream), which can be relational databases, flat files, etc. The data is then transformed by various business logic and loaded into marts. We had to acquire data from a SQL Server database; the acquisition was done by a tool from Microsoft, Replication. Replication performance was good enough with the default settings. We calculated the latency from Publisher to Distributor and from Distributor to Subscriber, and it was less than 3-5 seconds. With acquisition performing well, we focused on the performance of the Transformation and Loading parts; we had an SSIS framework and were using Control Flow and Data Flow tasks to load and transform the data.

The Loading and Transformation performance can be divided into the following three parts:

1. Full/initial run
2. Run with no data change in the upstream
3. Run after populating delta records in the upstream

The initial run took less than 2 hours for 25 GB of data, with the various facts and dimensions populated successfully. We then re-ran the job without any modification to the source data, and the ETL took 1 hour and 15 minutes; since there was no data change, I think this time was spent checking records against the last checkpoint where data was processed successfully, and/or re-populating the data by truncating existing records. The last part was to populate data into the source systems, run again, and capture the performance. The data load had to be done on key tables based on the business requirements and the metrics collected. The loading was done with a separate utility on the source system, and we found that loading all the data successfully into the source would take about 3 days. We could not wait that long, so we verified the data that had already been loaded, took a snapshot of the source DB, ran the ETL job, and captured the benchmark. A snapshot in SQL Server captures the state of a database at a point in time, so later we can compare performance benchmarks against runs with the equivalent data present in the snapshot DB. Later we will talk about the ETL performance reports. Thanks all, and happy ETL perf testing :-)

Friday 21 June 2013

Data Distribution in the Performance DB - Internal Testing


We had a performance issue which we found in our internal performance lab. The issue was logged as a blocker and the release was pushed off! The whole team was on fire! The release went into red! Round-the-clock focus went on the bug. The development team pushed back after a few rounds of analysis, for the reasons below:
1. "It's an environment issue; this task completed very quickly in my environment."
2. "We are not seeing this issue in unit testing" (mind you, unit testing in a DW is transaction-heavy and can sometimes bring down your system).

We finally nailed the issue down to one ETL package taking more time, and it was because of the data in one of our transaction system tables, which we had increased 3 times over. We were short on time to fix the bug in the cycle, so we decided to review our data against existing customers in production. On analysis, the largest current customer's data in the said table was less than the data we originally had in the performance database. Then we started analyzing potential customers, and of all the 10+ biggest customer databases we analyzed, only 1 customer DB had more data than was originally present in the performance DB.

Sometimes you might be investing effort in a one-off requirement whose frequency of occurrence is less than 1%. We changed our test data strategy, moved back to the original data volume in this table, and plan to include the extra data only when we onboard that customer.
For general testing, the data populated in the database should be closer to the mean than to the extremes, to avoid such situations. We can plot the distribution of customer data and populate the performance DB with data closer to the median.
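As a small illustration of why the median is the safer sizing target, here is a Python sketch over made-up row counts for the key table across a set of customer DBs (these numbers are hypothetical, not our actual customer data):

```python
import statistics

# Hypothetical row counts for the key table across analyzed customer DBs;
# the last entry plays the role of the single outsized customer.
customer_rows = [120_000, 150_000, 180_000, 200_000, 210_000,
                 230_000, 250_000, 300_000, 340_000, 2_500_000]

median = statistics.median(customer_rows)
mean = statistics.mean(customer_rows)

# Sizing the performance DB near the median keeps the single outlier
# customer from dictating the test data volume for everyone.
print(f"median rows: {median:,.0f}")
print(f"mean rows:   {mean:,.0f}")  # the outlier pulls the mean far above the median
```
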


Thursday 13 June 2013

Basic Web Performance Part 3

In the last article we spoke about measuring performance; in this part we will continue our discussion and talk about measuring performance by analyzing web logs. We will see how to configure and capture data in the IIS web server logs. This raw data can then be analyzed with a free tool from Microsoft called Log Parser. So let's get started.

Let me first show the steps to configure the logs. Go to your web server in inetmgr (IIS Manager), select the website in the left pane, then click on Logging. Below is a pictorial representation of how to configure it. Mostly we will select the following fields in the logs:
1. time-taken
2. cs-bytes (bytes received)
3. sc-bytes (bytes sent)

time-taken is the total time taken for the request; it includes the time the request was queued, the execution time, and the time to render the response to the client. An .aspx request will include all three components, while a static resource will only include the time to deliver to the client. The data transferred between client and web server can be calculated from the bytes sent/received.
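As a rough preview of what we will do with Log Parser below, here is a plain-Python pass over some made-up W3C log lines that averages time-taken per URL. The field order is an assumption, matching a hypothetical "#Fields: date time cs-uri-stem sc-status sc-bytes cs-bytes time-taken" header:

```python
from collections import defaultdict

# Made-up IIS W3C log lines; field order assumed as described above.
log_lines = [
    "2013-06-13 10:00:01 /default.aspx 200 5120 340 187",
    "2013-06-13 10:00:02 /default.aspx 200 5120 340 213",
    "2013-06-13 10:00:03 /logo.png 200 2048 300 15",
]

totals = defaultdict(lambda: [0, 0])  # url -> [total time-taken (ms), hits]
for line in log_lines:
    fields = line.split()
    url, time_taken = fields[2], int(fields[6])
    totals[url][0] += time_taken
    totals[url][1] += 1

for url, (total_ms, hits) in totals.items():
    print(f"{url}: avg time-taken {total_ms / hits:.0f} ms over {hits} request(s)")
```
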



Once you have the raw data, we will use Log Parser to analyze it. Log Parser is a free utility from Microsoft and can be downloaded from their site. You can run queries on the raw data through Log Parser to get the status codes, time taken, etc. Here are a few examples:

200 status codes
logparser -rtp:-1 "SELECT cs-uri-stem, cs-uri-query, date, sc-status, cs(Referer) INTO 200sReport.txt FROM ex0902*.log WHERE (sc-status >= 200 AND sc-status < 300) ORDER BY sc-status, date, cs-uri-stem, cs-uri-query"
400 status codes
logparser -rtp:-1 "SELECT cs-uri-stem, cs-uri-query, date, sc-status, cs(Referer) INTO 400sReport.txt FROM ex0811*.log WHERE (sc-status >= 400 AND sc-status < 500) ORDER BY sc-status, date, cs-uri-stem, cs-uri-query"
Bandwidth usage: Returns bytes (as well as conversions to KB and MB) received and sent, per date, for a website.
logparser -rtp:-1 "SELECT date, SUM(cs-bytes) AS [Bytes received], DIV(SUM(cs-bytes), 1024) AS [KBytes received], DIV(DIV(SUM(cs-bytes), 1024), 1024) AS [MBytes received], SUM(sc-bytes) AS [Bytes sent], DIV(SUM(sc-bytes), 1024) AS [KBytes sent], DIV(DIV(SUM(sc-bytes), 1024), 1024) AS [MBytes sent], COUNT(*) AS Requests INTO Bandwidth.txt FROM ex0811*.log GROUP BY date ORDER BY date"
Bandwidth usage by request: Returns pages sorted by the total number of bytes transferred, as well as the total number of requests and average bytes.
logparser -i:iisw3c -rtp:-1 "SELECT DISTINCT TO_LOWERCASE(cs-uri-stem) AS [Url], COUNT(*) AS [Requests], AVG(sc-bytes) AS [AvgBytes], SUM(sc-bytes) AS [Bytes sent] INTO Bandwidth.txt FROM ex0909*.log GROUP BY [Url] HAVING [Requests] >= 20 ORDER BY [Bytes sent] DESC"

You can refer to http://logparserplus.com/Examples for more examples. You can also output the IIS log file data to CSV and do further analysis.
In the next part we will talk about performance tools and dig more into the open source tool JMeter and its capabilities.

Thursday 6 June 2013

Tester Planning (MPP vs Excel)

Last week I discussed the plan with my manager. He wanted me to come up with a plan for my testing in two days. It was not tough, since I knew the testing involved would be mostly regression plus a few additional scenarios based on the changes. The initial step was to go through the requirements document and understand the changes; once I understood them and knew which scenarios to include, the next step was to put all the analysis into a plan document and come up with the dates. I needed a tool. So I approached my IT team requesting MPP (Microsoft Project); it is quite a tool for your planning needs, taking care of business days, holidays, and resources in your plan, though I was more focused on coming up with effort and dates. Getting a Microsoft Project license approved was difficult, as it would cost $300-600, and I was sure I would not get approval. So I decided to explore the obvious alternative, Excel, and there I came to know a few functions that are handy for putting together a basic plan.

Here are a few functions in Excel (available with the Analysis ToolPak add-in in older versions):
1. WORKDAY
2. NETWORKDAYS
3. DATE
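For readers outside Excel, the logic behind WORKDAY (the date N business days from a start date) and NETWORKDAYS (the count of business days between two dates) can be sketched in Python. The holiday list here is a made-up example, not part of Excel's behavior:

```python
from datetime import date, timedelta

HOLIDAYS = {date(2013, 7, 4)}  # example holiday list (assumption)

def workday(start, days, holidays=HOLIDAYS):
    """Excel-style WORKDAY: the date `days` business days after `start`."""
    current, step = start, 1 if days >= 0 else -1
    remaining = abs(days)
    while remaining:
        current += timedelta(days=step)
        if current.weekday() < 5 and current not in holidays:
            remaining -= 1
    return current

def networkdays(start, end, holidays=HOLIDAYS):
    """Excel-style NETWORKDAYS: business days from start to end, inclusive."""
    span = (end - start).days + 1
    return sum(
        1
        for i in range(span)
        if (d := start + timedelta(days=i)).weekday() < 5 and d not in holidays
    )

print(workday(date(2013, 6, 28), 3))                      # Friday + 3 business days
print(networkdays(date(2013, 7, 1), date(2013, 7, 5)))    # week containing a holiday
```
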

I used WORKDAY and DATE, and the Data -> Group function was also handy for building a task and sub-task structure as in MPP, using only business days. Below is a screenshot of my plan. There are many more functions which I will explore later, to come up with a complete tool for my test planning.