Why Performance Testing?
If you work on a team responsible for delivering SharePoint into your organization, there's usually a mountain of tasks to complete before making it live. One of the most vital is making sure that SharePoint is "fast" enough to meet the requirements of your users. Performance problems after go-live can easily ruin all of your hard work and - even worse - convince your users that SharePoint sucks!
So, by the time you "go live" you should be confident of the following:
- You know the maximum load your service can take.
- You know your service can run for a prolonged period of time.
- You know the point at which you may need to add more servers, so you can plan.
- When you do reach maximum load, you know where the bottlenecks in your service are (e.g. SQL, network, SharePoint servers).
If you don't know these metrics up front, then you are going to have to react as problems occur. If you have to react, it will probably be done under extreme pressure, usually because your boss is yelling at you!
So how do you find out these performance metrics?
One of the best ways to discover how well your environment can cope is to carry out some performance tests using a tool that will emulate the load and expected journeys of your user base.
Not every user will use SharePoint in the same way. You will have some users that download a document, some that will publish documents, some that will use Search, or even some that will be 100% socialites. In the testing world these usage patterns can be mapped onto something called a User Journey. To get started with Performance Testing, you will need to know what these journeys are likely to be and also what the mix will be. You can either take an educated guess ("guestimate") at what these journeys will be or, even better, run a pilot on a good cross-section of staff before going live. You can then use the actual usage data from the pilot phase and feed that into a typical user journey. Both SharePoint analytics and the IIS logs will give you all the data you need.
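To make that concrete, here's a rough sketch (in Python, not any SharePoint tooling) of turning the URLs in your IIS logs into a journey-step mix. The URL patterns and step names here are made-up placeholders; your own site structure will dictate the real ones.

```python
import re
from collections import Counter

# Hypothetical URL patterns for each journey step - replace with
# patterns that match your own site structure.
STEP_PATTERNS = {
    "home":     re.compile(r"^/Pages/default\.aspx", re.I),
    "search":   re.compile(r"^/search/", re.I),
    "document": re.compile(r"\.(docx?|xlsx?|pdf)$", re.I),
    "mysite":   re.compile(r"^/my/", re.I),
}

def classify(url: str) -> str:
    """Map a requested URL to a journey step (or 'other')."""
    for step, pattern in STEP_PATTERNS.items():
        if pattern.search(url):
            return step
    return "other"

def journey_mix(urls):
    """Return the share of each step type across the sampled requests."""
    counts = Counter(classify(u) for u in urls)
    total = sum(counts.values())
    return {step: n / total for step, n in counts.items()}

sample = ["/Pages/default.aspx", "/search/results.aspx",
          "/docs/report.docx", "/Pages/default.aspx"]
print(journey_mix(sample))  # -> {'home': 0.5, 'search': 0.25, 'document': 0.25}
```

The point is simply that a few hours of log data gives you evidence-based percentages to feed into your test mix, rather than a pure guess.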
Once you have this data you can then list all of the types of journeys. For example:
- Hit the home page
- Do a search
- Read a document
Another journey may be:
"The Social User"
- Hit the home page
- Go to their MySite (some pre-provisioned, some not)
- Add a status update
- Add a forum post
- Like a document
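A journey mix like this is really just a weighted list. As a minimal sketch (plain Python, with assumed percentages you would replace with figures from your own pilot or IIS logs), assigning journeys to a population of virtual users might look like:

```python
import random

# Illustrative journey mix - the names and percentages are assumptions;
# use the actual usage data from your pilot instead.
JOURNEYS = {
    "reader":      0.60,   # home page -> search -> read a document
    "publisher":   0.25,   # home page -> publish a document
    "social_user": 0.15,   # home page -> MySite -> status update -> like
}

def pick_journeys(n_users: int, seed: int = 42):
    """Assign each simulated user a journey according to the weighted mix."""
    rng = random.Random(seed)  # fixed seed so runs are repeatable
    names = list(JOURNEYS)
    weights = list(JOURNEYS.values())
    return rng.choices(names, weights=weights, k=n_users)

assignment = pick_journeys(600)
```

Visual Studio's load test scenarios do this weighting for you, but it's worth understanding that this is all a "test mix" is: a weighted random assignment of journeys to virtual users.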
Visual Studio Performance Testing
If you have Visual Studio Ultimate, you can take advantage of the testing tools that ship with it. There are other tools, such as LoadRunner, but they can be pricey. If you take the time to learn the Visual Studio test tools, you will soon be able to perform adequate tests. The other great side effect is that you are also functionally testing your application, so once you automate your testing you can use it to repeatedly test your app.
One really cool thing about Visual Studio testing is that it's really easy to record a user journey by simply using SharePoint in the browser. Once you have completed the journey, it is saved as a web test so that it can automatically be reused under load in your performance tests.
To read more about Visual Studio Testing tools go here.
Visual Studio Test Agents
For large implementations, it's impossible to emulate more than about 50 users from one machine (aka a Test Agent); you will be constrained by the machine's CPU, memory or network I/O. To increase the load you need to install the Visual Studio Test Agent service on more than one machine. As an example, to emulate 600 concurrent users I used 10 machines to spread the load. (It's quite easy to see when a machine is constrained, as Visual Studio collects performance data from the agents and things go RED.)
Once configured, each Test Agent will emulate several users using SharePoint. Each agent logs its performance data to a common SQL Server where it can be analysed.
Note! You don't need to buy specific machines for this, you should consider using your existing developer machines. It's VERY likely you will be doing this kind of testing out of hours, so it's silly not to use them.
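Sizing the agent pool is simple arithmetic. A quick back-of-the-envelope sketch (the per-agent capacity is an assumption; measure what your machines can actually sustain before they go RED):

```python
import math

def agents_needed(target_users: int, users_per_agent: int) -> int:
    """How many test agent machines are needed for a target concurrent load."""
    return math.ceil(target_users / users_per_agent)

# 600 concurrent users at roughly 60 virtual users per agent:
print(agents_needed(600, 60))   # -> 10
```

Leave yourself headroom: if an agent is itself maxed out, it stops generating the load you asked for, and your results understate what the farm can take.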
What performance data do you need to gather?
To understand what to look for does require a good understanding of machine architecture and performance counters. However, Visual Studio really helps, as it ships with a pre-configured set of "important" counters. Each counter has a pre-defined threshold, so it will be obvious if there is a problem. For example, if your SQL Server CPU hits 95%, this will be clear to see.
The main things you want to record for each type of test are:
- Test Duration
- WFE’s in Load
- Agents used
- Avg. User Load
- Pages / Sec
- Avg. Page Time (Sec)
- Requests / Sec
- Requests Cached %
- Avg. Response Time
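If you ever want to sanity-check the numbers Visual Studio reports, the headline metrics are simple arithmetic over the raw samples. A minimal sketch (the field names and sample values are my own, purely for illustration):

```python
def summarise(page_times_s, request_count, cached_count, duration_s):
    """Compute a few headline load-test metrics from raw samples.

    page_times_s  - response time of each page served, in seconds
    request_count - total HTTP requests issued during the run
    cached_count  - requests served from cache
    duration_s    - test duration in seconds
    """
    return {
        "avg_page_time_s":     sum(page_times_s) / len(page_times_s),
        "pages_per_sec":       len(page_times_s) / duration_s,
        "requests_per_sec":    request_count / duration_s,
        "requests_cached_pct": 100.0 * cached_count / request_count,
    }

stats = summarise([0.8, 1.2, 1.0], request_count=1200,
                  cached_count=300, duration_s=60)
```

Keep these per-test-run summaries side by side; the trends across runs (e.g. average page time creeping up as user load rises) tell you far more than any single number.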
Please note, you should gather performance data from all servers involved in providing your SharePoint service. This includes application servers, WFE's, SQL Servers and also non-SharePoint servers. There could be a bottleneck in any one of these that causes knock-on effects and slow performance.
Always test customised code.
Microsoft have already tested standard SharePoint functionality. If there were performance issues they would already be well known and probably fixed. However, Microsoft cannot test any customizations (i.e. code) you have done. Make sure anywhere you have new code (e.g. web parts, event receivers, custom pages etc.), you include it under load. Custom code is the most likely cause of performance issues, as it's unlikely your developers will have tested it at the loads it will be used at.
Having said that, you should still test standard SharePoint, as you may have under-powered kit. I am just saying pay special attention to custom code.
Vary the users, browsers and network mix.
Visual Studio allows you to log in as different types of users. Do this. Different types of users can cause different code paths to fire. You may have personalisation features, or different security models that come into play depending on the user.
You can also emulate different browsers and networks. Try to match your organization's mix closely.
What type of tests do you need to perform?
The tests you carry out will generally fall into one of these types:
Goal-based Test
The intention of the goal-based test is to identify the number of pages/requests that can be served while the WFE's are running at around 70% utilisation. The test runs for 10 minutes and readjusts the load based on CPU utilisation.
Soak Test
The intention of this test is to hit the production farm at about 50-75% of expected peak usage over long periods of time. This will hopefully identify problems that are time related, such as memory leaks or scheduled tasks that disrupt service.
Step Load Test
The purpose of this test is to gradually ramp up the user load to find the point at which response rates start to fall away.
Constant Load Tests
This type of testing hits the servers with a constant load that is expected at peak performance.
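The goal-based test described above is essentially a feedback loop: sample the WFE CPU counter, then add or shed virtual users to hold utilisation near the target. Here's a toy simulation of that idea (plain Python with an assumed linear CPU model - not the real Visual Studio controller):

```python
TARGET_CPU = 70.0    # percent WFE utilisation to hold
CPU_PER_USER = 0.5   # assumed: each virtual user adds ~0.5% CPU (measure this!)

def measure_cpu(users: int) -> float:
    """Stand-in for sampling the WFE CPU counter; linear model for the sketch."""
    return min(100.0, users * CPU_PER_USER)

def goal_based_run(steps: int = 20, start_users: int = 10) -> int:
    """Adjust the user load each interval until CPU settles near the target."""
    users = start_users
    for _ in range(steps):
        cpu = measure_cpu(users)
        error = TARGET_CPU - cpu
        # Proportional step: add/remove users based on how far off we are.
        users = max(1, users + int(error / CPU_PER_USER * 0.5))
    return users

final_users = goal_based_run()
```

The converged user count is the interesting output: it tells you roughly how many concurrent users your farm supports at the target utilisation, which feeds directly into capacity planning.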
What are the performance steps ?
In summary, here are the high-level steps you need to go through:
- Identify your user journeys.
- Record your user journeys using Record and Play with a browser and Visual Studio.
- Create test scenarios in Visual Studio (vary the users, time, browsers, networks, journeys).
- Configure "Test Agents" to increase the amount of load.
- Run the Tests.
- Analyse the results.
- Publish findings and make recommendations.
The Sample Report
This sample performance testing report is a 25-page report taken from an actual SharePoint 2010 farm serving 15,000 users. The farm has 2 App Servers and 4 WFE's. If you need to do Performance Testing in your organization, you may wish to use it as a reference point.