For various reasons, we host our own Jira and Confluence instances in AWS. We have been talking about putting a CDN (CloudFront) in front of them to see if it helps with performance, especially for our end users in other countries. Rather than just add it and claim victory, I wanted to actually quantify the impact, so I started looking for a way to measure performance before and after the change.
As I started looking around, I came across a number of tools that Atlassian provides for performance testing. I settled on their Data Center App Performance Toolkit. They have other tools available, such as Jira Performance Tests, but I wanted something I could run against both Jira and Confluence.
Setup
I started by launching an Ubuntu instance in AWS. I went with Ubuntu because the site I downloaded older versions of Chrome from only had Debian packages. Configuring the server was pretty simple. For extra packages, you need to install python3, openjdk-8-jdk, python3-venv, python3-pip, unzip, git, and libpq-dev.
apt-get install -y python3 openjdk-8-jdk python3-venv python3-pip unzip git libpq-dev
Download and install the pinned older version of Chrome.
wget -O chrome64_80.0.3987.149.deb \
  "https://www.slimjet.com/chrome/download-chrome.php?file=files%2F80.0.3987.149%2Fgoogle-chrome-stable_current_amd64.deb"
apt install ./chrome64_80.0.3987.149.deb
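Before moving on, it's worth confirming the pinned build actually installed; my understanding is that the toolkit drives Chrome through chromedriver, and the browser's major version needs to match the chromedriver the toolkit uses.
# Confirm the pinned Chrome build installed correctly.
google-chrome --version
# Should report: Google Chrome 80.0.3987.149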
Clone the GitHub repository.
git clone https://github.com/atlassian/dc-app-performance-toolkit.git
Create the python3 virtual environment and install the requirements.
python3 -m venv atlassian-testing
. ./atlassian-testing/bin/activate
cd dc-app-performance-toolkit
pip install -r requirements.txt
Configuration
Configuring the application is pretty straightforward. Change directories to app/ and edit the jira.yml (or confluence.yml) file, updating the env settings:
env:
  application_hostname: jira.test.com
  application_protocol: https
  application_port: 443
  application_postfix: # e.g. /jira in case of url like http://localhost:2990/jira
  admin_login: jira-test-admin
  admin_password: 123456789012
  concurrency: 200
  test_duration: 45m
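Before kicking off a 45-minute run, I like to confirm the host in the env block is actually reachable. Jira (and Confluence) expose a /status endpoint that reports application state; the hostname below is just the placeholder from the config above.
# Sanity-check the target before a long run; jira.test.com is the
# example hostname from the env block above.
curl -s https://jira.test.com/status
# A healthy instance returns: {"state":"RUNNING"}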
Since we use Okta for authentication and require MFA, I needed to use a local user for testing. To do that, I had to update a few files so that the tests would use the non-SSO login page. I updated the jmeter/jira.xml and selenium_ui/jira/pages/selectors.py files and replaced all instances of login.jsp with login.jsp?nosso. For jmeter/confluence.xml I replaced dologin.action with dologin.action?nosso, and for selenium_ui/confluence/pages/selectors.py I replaced login.action with login.action?nosso.
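If you would rather script those edits, a few sed one-liners from the app directory do the same thing (same files as above). Note that they are not idempotent: running them twice would append ?nosso a second time.
sed -i 's/login\.jsp/login.jsp?nosso/g' jmeter/jira.xml selenium_ui/jira/pages/selectors.py
sed -i 's/dologin\.action/dologin.action?nosso/g' jmeter/confluence.xml
sed -i 's/login\.action/login.action?nosso/g' selenium_ui/confluence/pages/selectors.py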
Running it
Once the configuration is done, running it is straightforward. In the app directory, run the bzt command.
bzt jira.yml
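Since the profile above runs for 45 minutes, I run it under nohup (tmux or screen work just as well) so a dropped SSH session doesn't kill the test.
# Keep the run alive if the SSH session drops, and watch the output.
nohup bzt jira.yml > bzt-run.log 2>&1 &
tail -f bzt-run.log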
Creating a Performance Chart
After you have run your tests, you can create a performance chart using the built-in report generator. In the reports directory, update the performance_profile.yml file with a name and results path for each run. You can also give the chart a title.
# Defines which column from test runs is used for aggregated report. Default is "90% Line"
column_name: "90% Line"

runs:
  # fullPath should contain a full path to the directory with run results.
  # E.g. /home/$USER/dc-app-performance-toolkit/jira/results/2019-08-06_17-41-08
  - runName: "With CASB"
    fullPath: "/home/ubuntu/dc-app-performance-toolkit/app/results/confluence/CASB-2020-06-19_14-06-06"
  - runName: "With NLB"
    fullPath: "/home/ubuntu/dc-app-performance-toolkit/app/results/confluence/NLB-2020-06-19_13-43-03"

# Chart generation config
index_col: "Action"
title: "Confluence Performance Testing"
image_height_px: 1000
image_width_px: 1200
With the configuration in place, you can generate the report by running the following command.
python csv_chart_generator.py performance_profile.yml
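To find the output, I just list the newest files in the reports directory; the exact output location is an assumption on my part and may differ between toolkit versions.
# The newest files here should be the freshly generated chart.
ls -lt | head -n 5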
Getting this set up has greatly improved my ability to make informed decisions about any changes I want to make to the infrastructure. I am already making plans to use it to better size my instances after I get the CloudFormation work done.