Friday, 21 June 2013

Data Distribution on Performance DB - Internal Testing


We hit a performance issue in our internal performance lab. The issue was logged as a blocker and the release was pushed out; the whole team was on fire, the release went red, and round-the-clock focus went on the bug. After a few rounds of analysis, the development team pushed back for the reasons below:
1. Environment issues: "this task completed very quickly in my environment."
2. "We are not seeing this issue in unit testing" (mind you, unit testing in a DW is transaction-heavy and can sometimes bring the system down).

We finally nailed the issue down to one ETL package taking more time, caused by data in one of our transaction systems that we had increased threefold. With little time left in the cycle to fix the bug, we decided to review our test data against existing customers in production. Our analysis showed that the largest current customer's data in the said table was smaller than the data we originally had in the performance database. We then started analyzing potential customers, and of the 10+ biggest customer databases we looked at, only one had more data in that table than was originally present in the performance DB.

Sometimes you may be investing effort in a one-off requirement whose frequency of occurrence is less than 1%. We changed our test-data strategy, moved back to the original data volume in this table, and plan to include the larger data set only when we onboard that customer.
For general testing, the data populated in the database should sit closer to the mean than at the extremes, to avoid such situations. We can plot the distribution of customer data and populate the performance DB with volumes closer to the median.
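To make the "populate towards the median" idea concrete, here is a minimal Python sketch. The per-customer row counts are purely illustrative, not the actual numbers from our analysis; one artificial outlier stands in for the single oversized customer DB:

```python
# Hypothetical row counts (in millions) for the same table across customer DBs.
# These values are illustrative only; 42.0 represents the one outlier customer.
customer_rows = [3.2, 4.1, 4.8, 5.0, 5.5, 6.0, 6.3, 7.1, 7.9, 42.0]

customer_rows.sort()
n = len(customer_rows)

# Median: the middle value (or the mean of the two middle values for an even count).
if n % 2:
    median = customer_rows[n // 2]
else:
    median = (customer_rows[n // 2 - 1] + customer_rows[n // 2]) / 2

maximum = customer_rows[-1]

print("median volume: %.2f M rows" % median)   # baseline for general perf testing
print("max volume:    %.2f M rows" % maximum)  # covered only when that customer onboards
```

Sizing the performance DB near the median covers the common case, while the extreme volume becomes a targeted, one-off test when that customer is actually onboarded.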


Thursday, 13 June 2013

Basic Web Performance Part 3

In the last articles we spoke about measuring performance; in this one we will continue the discussion and talk about measuring performance by analyzing web logs. We will see how to configure and capture data in IIS web server logs. This raw data can then be analyzed with another free tool from Microsoft called Log Parser. So let's get started.

Let me first show the steps to configure logging. Open inetmgr on your web server, select the website in the left pane, and then open the Logging feature. Below is a pictorial representation of the configuration. Mostly we will select the following fields in the logs:
1. time-taken
2. cs-bytes (bytes received)
3. sc-bytes (bytes sent)

time-taken is the total time for the request: it includes the time the request was queued, the execution time, and the time to render the response to the client. An .aspx request includes all three components, while a static file includes only the time to the client. The data transferred between client and web server can be calculated from the bytes sent/received fields.
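Before reaching for any tool, it helps to see how little structure there is in the raw file: an IIS W3C log is just a `#Fields:` header naming the columns, followed by space-separated rows. Here is a small Python sketch (the sample lines are made up, with the fields above enabled) that parses such lines and picks out the slowest request:

```python
# Minimal sketch: parse IIS W3C extended log lines by hand.
# Field order comes from the "#Fields:" directive; the sample below is
# invented, assuming time-taken, sc-bytes and cs-bytes were enabled.

def parse_w3c(lines):
    fields, rows = [], []
    for line in lines:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]           # field names after "#Fields:"
        elif line.startswith("#") or not line.strip():
            continue                            # skip other directives / blanks
        else:
            rows.append(dict(zip(fields, line.split())))
    return rows

sample = [
    "#Fields: date time cs-uri-stem sc-status sc-bytes cs-bytes time-taken",
    "2013-06-13 10:01:02 /default.aspx 200 5120 410 187",
    "2013-06-13 10:01:03 /site.css 200 2048 300 15",
]

rows = parse_w3c(sample)
# time-taken is in milliseconds; for .aspx it spans queueing + execution + response.
slowest = max(rows, key=lambda r: int(r["time-taken"]))
print(slowest["cs-uri-stem"], slowest["time-taken"])
```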



Once you have the raw data, we can use Log Parser to analyze it. Log Parser is a free utility from Microsoft and can be downloaded from the Microsoft site. You can run SQL-like queries on the raw data to get status codes, time taken, and so on. Here are a few examples:

200 status codes
logparser -rtp:-1 "SELECT cs-uri-stem, cs-uri-query, date, sc-status, cs(Referer) INTO 200sReport.txt FROM ex0902*.log WHERE (sc-status >= 200 AND sc-status < 300) ORDER BY sc-status, date, cs-uri-stem, cs-uri-query"
400 status codes
logparser -rtp:-1 "SELECT cs-uri-stem, cs-uri-query, date, sc-status, cs(Referer) INTO 400sReport.txt FROM ex0811*.log WHERE (sc-status >= 400 AND sc-status < 500) ORDER BY sc-status, date, cs-uri-stem, cs-uri-query"
Bandwidth usage: Returns bytes (as well as converted to KB and MB) received and sent, per date, for a Web site.
logparser -rtp:-1 "SELECT date, SUM(cs-bytes) AS [Bytes received], DIV(SUM(cs-bytes), 1024) AS [KBytes received], DIV(DIV(SUM(cs-bytes), 1024), 1024) AS [MBytes received], SUM(sc-bytes) AS [Bytes sent], DIV(SUM(sc-bytes), 1024) AS [KBytes sent], DIV(DIV(SUM(sc-bytes), 1024), 1024) AS [MBytes sent], COUNT(*) AS Requests INTO Bandwidth.txt FROM ex0811*.log GROUP BY date ORDER BY date"
Bandwidth usage by request: Returns pages sorted by the total number of bytes transferred, along with the total number of requests and average bytes.
logparser -i:iisw3c -rtp:-1 "SELECT DISTINCT TO_LOWERCASE(cs-uri-stem) AS [Url], COUNT(*) AS [Requests], AVG(sc-bytes) AS [AvgBytes], SUM(sc-bytes) AS [Bytes sent] INTO Bandwidth.txt FROM ex0909*.log GROUP BY [Url] HAVING [Requests] >= 20 ORDER BY [Bytes sent] DESC"

You can refer to http://logparserplus.com/Examples for more examples. You can also output the IIS log file data to CSV and do further analysis.
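Once the data is in CSV form, any scripting language can take over. Here is a hedged Python sketch: the CSV text is a made-up stand-in for what the bandwidth-by-request query might emit (the column names are assumptions, not Log Parser's exact output), and it re-ranks pages by total bytes sent using only the standard library:

```python
import csv
import io

# Hypothetical CSV resembling the bandwidth-by-request query's output.
# Column names and values are illustrative assumptions.
csv_text = """Url,Requests,AvgBytes,BytesSent
/default.aspx,120,5400,648000
/images/banner.png,300,18200,5460000
/site.css,450,2100,945000
"""

reader = csv.DictReader(io.StringIO(csv_text))
rows = list(reader)

# Rank pages by total bytes sent, mirroring ORDER BY [Bytes sent] DESC.
rows.sort(key=lambda r: int(r["BytesSent"]), reverse=True)
for r in rows:
    print(r["Url"], r["BytesSent"])
```

In practice you would point `csv.DictReader` at the file Log Parser produced instead of an in-memory string.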
In the next article in this series we will talk about performance tools and dig deeper into the open source tool JMeter and its capabilities.

Thursday, 6 June 2013

Tester Planning (MPP vs Excel)

Last week I discussed the plan with my manager. He wanted me to come up with a plan for my testing within two days. That was not tough, since I knew the testing involved: mostly regression, plus a few additional scenarios based on the changes. The initial step was to go through the requirement document and understand the changes; once I understood them and knew which scenarios to include, the next step was to put all the analysis into a plan document and come up with dates. I needed a tool, so I approached my IT team to request MPP (Microsoft Project), which is quite a tool for planning needs: it takes care of business days, holidays, and resources in your plan, though I was mostly focused on coming up with effort and dates. Getting approval for a Microsoft Project license was difficult, as it costs $300-600, and I was sure I would not get approval. So I decided to explore the obvious alternative, Excel, and there I came across a few functions that are handy for putting together a basic plan.

Here are a few Excel functions, some of which ship with the Analysis ToolPak add-in:
1. WORKDAY
2. NETWORKDAYS
3. DATE

I used WORKDAY and DATE, and the Data -> Group feature was handy for building a task/sub-task structure as in MPP, counting only business days. Below is a screenshot of my plan. There are many more functions, which I will explore later to come up with a complete tool for my test planning.
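The WORKDAY calculation the plan leans on can be sketched in Python. This is a simplified illustration of the idea (weekends only, no holiday list), not Excel's exact implementation:

```python
from datetime import date, timedelta

def workday(start, days):
    """Rough equivalent of Excel's WORKDAY(start, days): step forward
    the given number of business days, skipping Saturdays and Sundays.
    Holidays are deliberately left out of this sketch."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:   # Mon=0 .. Fri=4 count; Sat/Sun do not
            days -= 1
    return current

# A task starting Thursday 6 June 2013 with 3 working days of effort
# ends on Tuesday 11 June 2013, since the weekend is skipped.
print(workday(date(2013, 6, 6), 3))
```

Chaining this over a task list (each task's start = the previous task's end) gives the same business-day date arithmetic the Excel plan uses.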