So, you’ve spent months building a new Sitecore platform for a client: templates, components, custom functionality. You’re almost done, the deadline is closing in and you’re approaching the final sprint. Time to run load tests and see how much traffic the platform can handle...
In this case it was a worldwide corporate site running Sitecore 9.1, Helix compliant, hosted on Azure WebApps: one CM server for content editing and two frontend servers (CD1 and CD2) to handle the main site traffic (the live site is running a different setup at the moment).
During sprint 0, performance numbers were agreed upon with the client and set in stone in the Service Level Agreement. The first BlazeMeter load test came in at 16 hits/second. Ouch, that hurts. Nowhere near the required 80 hits/second.
How to improve performance? A couple of tips:
Rendering caching
I hope you’ve already thought about turning on caching for your component renderings. If you haven’t, that is step one. Go through every feature rendering and turn on caching, and check whether you need the ‘Vary by Data’ and ‘Vary by Parameter’ options. Typically, you can cache everything except functionality that requires user interaction, such as search, forms and login/my-environment pages. Keep personalization in mind as well where it applies.
We use nested renderings: a Section contains Grids, and grid columns contain features. Try turning on caching on the Section; this greatly improved performance for us. Of course it also broke the forms and search results, so I created a separate non-cached Section component and moved the broken functionality into it. Pay special attention to components that are used on every page, like the main navigation, header and footer.
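One prerequisite for all of this: HTML caching has to be enabled on the site definition itself, or the rendering caches do nothing. A config patch along these lines takes care of it — the filename, site name and 50MB size are just examples, tune them for your solution:

```xml
<!-- Hypothetical patch file, e.g. App_Config/Include/zProject/EnableHtmlCache.config -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <sites>
      <site name="website">
        <!-- Turn on the HTML (output) cache for this site and give it room to grow -->
        <patch:attribute name="cacheHtml">true</patch:attribute>
        <patch:attribute name="htmlCacheSize">50MB</patch:attribute>
      </site>
    </sites>
  </sitecore>
</configuration>
```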
Also check the Sitecore cache size configuration. Information on this can be found here: https://doc.sitecore.com/developers/91/platform-administration-and-architecture/en/configuring-caching.html
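The default cache sizes are conservative; a patch like the sketch below bumps the data and item caches. The values here are illustrative — watch the cache utilization on the /sitecore/admin/cache.aspx page under load and size accordingly:

```xml
<!-- Hypothetical patch file: increase default cache sizes (tune to your own measurements) -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="Caching.DefaultDataCacheSize">
        <patch:attribute name="value">200MB</patch:attribute>
      </setting>
      <setting name="Caching.DefaultItemCacheSize">
        <patch:attribute name="value">100MB</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```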
We ran a new load test and got to 32 hits/sec. We had doubled the performance, but were still nowhere near the requirement.
xDB and Tracking
Tracking and xDB are enabled by default. Check with your customer whether they actually need these features, because if they don’t: turn them off! Over time they gather gigabytes of data and consume quite some resources. If your client does require them, configure them to measure and track only what you need.
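In our case a patch along these lines did the trick. Note that in Sitecore 9.x fully retiring xDB can also involve the xConnect services themselves; check the official documentation for the complete procedure for your version:

```xml
<!-- Hypothetical patch file: disable xDB and tracking on the CD servers -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="Xdb.Enabled">
        <patch:attribute name="value">false</patch:attribute>
      </setting>
      <setting name="Xdb.Tracking.Enabled">
        <patch:attribute name="value">false</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```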
Turning off xDB and Tracking immediately improved our test results. But we weren’t there yet.
HTTP/2
Use HTTP/2 instead of HTTP/1.1. It supports multiplexing and header compression. All details can be found here: https://www.upwork.com/hiring/development/the-http2-protocol-its-pros-cons-and-how-to-start-using-it/
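On Azure WebApps, HTTP/2 is a per-app toggle (in the portal under Configuration, or via the Azure CLI). A sketch with the CLI — the app and resource group names are placeholders:

```sh
# Enable HTTP/2 for a WebApp (names are hypothetical)
az webapp config set --name my-cd-app --resource-group my-rg --http20-enabled true
```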
Scaling
Throwing more money at the cloud setup usually buys better performance, but know what you’re doing. It turned out our choice to run two S3 (Standard) frontend servers behind a Traffic Manager was not the right one. We switched to a single frontend server running a P2v2 (Premium) plan with three instances and downscaled the other CD server. We can still deploy without downtime, but had to add some extra steps to Octopus Deploy to scale up and down, manage the Traffic Manager and run warm-up scripts.
Switching to a P2v2 with three instances made a big difference. The strange thing is that an S3 costs the same as a P2v2, yet the change mattered a lot for Sitecore: the Premium tier runs faster processors and SSD storage, and has double the memory-to-core ratio compared to Standard.
Tier   Cores   RAM     Storage   Price
S3     4       7 GB    50 GB     €0.338/hour
P2v2   2       7 GB    250 GB    €0.338/hour
The load test results finally exceeded the requirement: 100 hits/sec!
Note that load tests do not directly reflect user experience: this type of test only measures the time it takes to receive the HTML. It does not load images or wait for the page to render.
Now that we were in the green on hits/sec, we turned to improving the actual user experience. We already had HTTP/2 in place, but we noticed that images and static files were taking some time to load.
Use a CDN
If you are not using a CDN (Content Delivery Network), you should. I'm using Cloudflare. Besides the typical CDN features, it also adds some security extras. Go through all the options with a fine-tooth comb.
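Whichever CDN you pick, it can only cache what the origin tells it to. Long-lived cache headers on static assets help both the CDN and browsers; in an IIS/Azure WebApps web.config that looks like this (the 30-day max-age is just an example value):

```xml
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Serve static files (css/js/images) with a 30-day max-age so the CDN and browsers cache them -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="30.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
```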
With the CDN in place, user experience and load times comfortably exceed the requirements. After some time we even scaled the CD Azure WebApp down from three to two instances, saving the client some money.
Contact me if you need help and please comment if you know more tips and tricks.
Come by for a visit sometime!
...but just asking a question is fine too.