We put docs in the app and engaged more evaluators
TL;DR
We ran an experiment to put help documentation into the JIRA Service Desk Cloud app. We wrote four articles meant to address four specific product goals.
15% of the experiment cohort interacted with in-product documentation.
We learned that, across the whole cohort, in-app help has no direct effect on conversion or activation. More importantly, it has no negative impact either.
For the instances that interacted with the docs, we saw a:
- 19 percentage point increase in instances creating requests in the customer portal
- 24 percentage point increase in instances creating requests by email
- 24 percentage point increase in instances inviting customers
- 22 percentage point increase in instances adding agents to the project
If an instance interacted with goal-oriented content, its users were more likely to complete product activities, and they completed them more often.
This has many implications for future information experience (IX) work. And, we recommend moving this type of solution into production.
Curing customer pain with ShipIts
Complexity is one of the top JIRA and JIRA Service Desk pain points. Customers experience pain when using our products. What's more, they experience pain when finding help in external documentation.
JIRA Service Desk designers and IX writers started tackling this problem. They used their ShipIt time to experiment. They wanted to try out the idea of providing contextual in-app help. Read more about the ShipIt project called Docs that Rock.
This is not a new idea at Atlassian. Read more about three other in-app documentation experiments (there are probably more).
These ShipIts were successful hackathon projects. But, we needed more data to confirm whether the idea is worth investing in and implementing in our products.
Moving from hackathon to evaluator experiment
The A-team picked up from the ShipIt projects. We experimented with providing goal-driven content to help evaluators of the JIRA Service Desk Cloud app. We used a tool called Elevio.
The A-team pored over previous data. We identified some key points of confusion that prevent activation in JIRA Service Desk. Read more about the top three activation blockers for JIRA Service Desk evaluators.
With these activities in mind, we drafted four articles. We crafted each article with specific goals in mind. We wrote specific calls to action. We sparred and cut down as much text as possible. And, we laid them out in Elevio.
We wanted to see if putting information directly in front of users would make them engage more. Read more about the experiment details.
Our implementation allowed users to click article links and interact directly with JIRA Service Desk. For example, if the user clicked an "invite team" link in the doc, we popped the Invite team dialog in the project sidebar.
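For context on the mechanics, here's a minimal sketch of how that link-to-dialog wiring can work. This isn't our production code: the dialog helpers and the `data-action` attribute are hypothetical stand-ins for illustration, and the real experiment rendered the articles through Elevio.

```typescript
// Minimal sketch (not production code) of wiring article links to
// in-product actions. The dialog helpers below are hypothetical stand-ins.
const openInviteTeamDialog = () => console.log("Invite team dialog opened");
const openCustomerChannelsDialog = () => console.log("Customer channels dialog opened");

// Map link "actions" to the dialogs they should pop.
const actions: Record<string, () => void> = {
  "invite-team": openInviteTeamDialog,
  "customer-channels": openCustomerChannelsDialog,
};

// Articles tag their links like: <a href="#" data-action="invite-team">invite team</a>
document.addEventListener("click", (event) => {
  const link = (event.target as HTMLElement | null)?.closest<HTMLAnchorElement>("a[data-action]");
  if (!link) return;

  const action = actions[link.dataset.action ?? ""];
  if (action) {
    event.preventDefault(); // keep the article open while the dialog pops
    action();
  }
});
```

Keeping the article open while the dialog pops means users can read the steps and act on them at the same time.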
I'm happy to report the results of this experiment.
Disclaimers
- Our analyst recommends we take these with a grain of salt, as the volumes of the experiment were quite low.
- There's some concern about correlation and causation that needs further investigation. We don't know right now whether the user read the article before performing the action, or if they are just a sort of "click all the buttons" type evaluator.
- Bear in mind that early signal testing is usually conducted with between 5 and 30 users. If we're able to draw conclusions from those tests, then surely 126 users can give us some direction. But, I'm not an analyst.
- There are a lot of tables in this post.
How many people interact with in-app help?
We showed the experiment to 835 instances. 126 of them clicked the help button. That means that 15% of the group interacted with in-app help.
To be fair, Elevio's help button is much more discoverable than how we currently showcase help to our users.
Currently, users have to click the ? button in the global nav. Then, choose from nine or ten different help options. Most likely, users want to read through CAC docs [our documentation website] using the JIRA Service Desk help link in this list. Once they get to CAC, they still have to search for what they need help with.
Using Elevio, agents and admins get a nice, big help button that they can clearly see. And, what's more, they get contextual help. When they click the button, they get help topics related to the page they are viewing.
The button had a stealthy side, too: it collected unsolicited insights for us. We recorded which page the user was looking at when they clicked the button. It's reasonable to assume this indicates where users have trouble understanding the product.
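Here's a rough sketch of both behaviors: the contextual topic lookup and the click logging. The page-to-article map, the article slugs, and the `sendAnalytics` and `showHelpPanel` helpers are all assumptions for illustration, not Elevio's actual API.

```typescript
// Sketch of the two behaviors described above, using plain DOM APIs
// rather than Elevio's widget. All helpers and slugs are hypothetical.
const sendAnalytics = (name: string, payload: Record<string, string>) =>
  console.log("analytics:", name, payload);
const showHelpPanel = (articleIds: string[]) =>
  console.log("showing help topics:", articleIds);

// Contextual help: each page maps to the articles most relevant to it.
const articlesByPage: Record<string, string[]> = {
  "/queues": ["users-email-and-portal", "get-to-know-your-email-channel"],
  "/customers": ["invite-your-team-and-customers"],
};

document.querySelector("#help-button")?.addEventListener("click", () => {
  const page = window.location.pathname;

  // The "unsolicited insight": record which page the user was on
  // when they asked for help.
  sendAnalytics("help-button-clicked", { page });

  // Show only the topics related to the current page.
  showHelpPanel(articlesByPage[page] ?? []);
});
```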
Here are the top three JIRA Service Desk pages users had the most trouble with:
- Queues (35% of clicks)
- Customers (10% of clicks)
- Project settings > Request types (10% of clicks)
We passed this information to the team formerly known as Success to help focus their work.
We're also using this information to rank which documents to write first if we move forward with Elevio.
Did they do more things in the product?
Short answer: yes. Long answer: yes-ish.
Instances that interacted with the in-app help performed the associated product actions in higher numbers. But, these evaluators might have done so anyway. It's the age-old correlation ≠ causation question that needs further analysis.
That said, I have reason to believe that users better understood JIRA Service Desk.
Results to consider
We wanted to measure our articles' effect on four product actions and how often users did them:
- Creating requests in the customer portal
- Creating requests via email
- Inviting customers to the project
- Adding agents to the project
These goals align with activation activities.
Here's how the experiment performed against these goals:
| | Didn't interact with help | Did interact with help | Delta (pct. points) |
|---|---|---|---|
| Created a portal request | 270 instances out of 709 (38%) | 72 instances out of 126 (57%) | ↑19 |
| Created an email request | 105 instances out of 709 (14%) | 48 instances out of 126 (38%) | ↑24 |
| Invited customers | 96 instances out of 709 (14%) | 48 instances out of 126 (38%) | ↑24 |
| Added agents | 73 instances out of 709 (10%) | 40 instances out of 126 (32%) | ↑22 |
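To be explicit about what the Delta column means: we're comparing the share of instances that performed each action in each group, so the deltas are percentage point differences, not relative increases. A quick sketch of the arithmetic, using the portal request row:

```typescript
// The deltas above are percentage point differences between group shares.
// Numbers are from the "Created a portal request" row of the table.
const share = (did: number, total: number): number => (did / total) * 100;

const withoutHelp = share(270, 709); // ≈ 38%
const withHelp = share(72, 126);     // ≈ 57%

console.log(`delta: ${(withHelp - withoutHelp).toFixed(0)} percentage points`); // ≈ 19
```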
For the instances that interacted with the docs, we saw a:
- 19 percentage point increase in instances creating requests in the customer portal
- 24 percentage point increase in instances creating requests by email
- 24 percentage point increase in instances inviting customers
- 22 percentage point increase in instances adding agents to the project
The overview numbers are compelling enough. But, the aim was to learn about how in-app content influences product behavior. Let's look at results article by article.
Article 1: Users, email and portal
28 instances interacted with this article.
The content of the article discussed user types and the channels customers use to raise requests.
We wrote the article with a few goals in mind:
- Teach basic concepts about our user types (such as agents and customers)
- Better explain licensing and billing (highlighting that customers are free)
- Teach basic concepts for how customers send requests to the service desk through email and the customer portal
- Get users to read related articles
Some of these goals are tricky to test because we have no way of checking comprehension.
But, there was a link in the article that popped JIRA Service Desk's customer channel dialog. The article explained: "Select Customer channels from the sidebar to find your service desk's email address or portal URL."
Here's what the instances did:
| | Didn't view the article | Viewed the article | Delta (pct. points) |
|---|---|---|---|
| Created a portal request | 327 instances out of 807 (41%) | 15 instances out of 28 (53%) | ↑12 |
| Created an email request | 143 instances out of 807 (18%) | 10 instances out of 28 (35%) | ↑17 |
| Invited customers | 135 instances out of 807 (17%) | 9 instances out of 28 (32%) | ↑15 |
| Added agents | 107 instances out of 807 (13%) | 6 instances out of 28 (21%) | ↑8 |
Breaking it down: instances that interacted with this article in context were:
- 12 percentage points more likely to create a request in the customer portal
- 17 points more likely to create requests through their respective email channels
- 15 points more likely to invite customers to the project
- 8 points more likely to add additional agents to the project
Article 2: Get to know your email channel
15 instances interacted with this article.
The content informed evaluators:
- that they have a built-in support email address
- how to find that address from the sidebar (with a link that popped the customer channels dialog out of the project sidebar)
- how to use their email channel
It had one specific goal: get the evaluator to create a request via email.
Here's what the instances did:
| | Didn't view the article | Viewed the article | Delta (pct. points) |
|---|---|---|---|
| Created an email request | 146 instances out of 820 (18%) | 7 instances out of 15 (47%) | ↑29 |
18% of instances that didn't view the article created requests via email. Viewing the article increased this product action to 47%. That's a 29 percentage point increase.
Let's dig a bit further. How many requests did they create via email?
Setting aside the instances that didn't create any email requests, here's the number of email requests created:
| | Didn't view the article | Viewed the article | Delta |
|---|---|---|---|
| Average number of email requests created per instance | 5.19 | 60.71 | ↑1070% |
Instances that created email requests without viewing the article created roughly five email requests on average.
Instances that both viewed the article and created email requests created many more: 61 email requests on average. That's roughly 12 times as many requests, a 1070% increase.
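Note that this delta works differently from the tables above: it's a relative increase on the per-instance averages, not a percentage point difference. The arithmetic, spelled out:

```typescript
// Unlike the share tables, this delta is a relative increase on averages.
const avgWithoutArticle = 5.19; // email requests per instance, no article view
const avgWithArticle = 60.71;   // email requests per instance, viewed article

const ratio = avgWithArticle / avgWithoutArticle; // ≈ 11.7x
const increase = (ratio - 1) * 100;               // ≈ 1070%

console.log(`${ratio.toFixed(1)}x as many, a ${increase.toFixed(0)}% increase`);
```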
Article 3: View the customer portal
11 instances interacted with this article.
The content pointed out that the service desk has a customer portal. It discussed where to find the URL. Again, it included a link that popped the customer channels dialog in the project sidebar.
We wanted the content to get users to:
- View the customer portal
- Create requests via the portal
- Invite customers to the portal
Here's how the article performed:
| | Didn't view the article | Viewed the article | Delta (pct. points) |
|---|---|---|---|
| Created a portal request | 336 out of 824 (41%) | 6 out of 11 (55%) | ↑14 |
| Invited customers | 137 out of 824 (17%) | 7 out of 11 (64%) | ↑47 |
More instances that viewed the article created portal requests than those that didn't. 55% of viewers created at least one portal request, compared with 41% of instances that didn't view any in-app help.
Similarly, more instances that viewed the article invited customers than those that didn't. 64% invited a customer, compared to only 17% of instances that didn't view the article.
Setting aside the instances that didn't create any portal requests, here's how many portal requests were created:
| | Didn't view the article | Viewed the article | Delta |
|---|---|---|---|
| Average number of portal requests created per instance | 2.41 | 2.5 | ↑4% |
Instances that viewed the article created slightly more requests than those that didn't. We saw a 4% increase in the average number of portal requests in these cases.
Leaving aside the instances that didn't invite customers, here's how many customers the instances invited:
| | Didn't view the article | Viewed the article | Delta |
|---|---|---|---|
| Average number of customers invited per instance | 2.93 | 4.85 | ↑65% |
Admins and agents who viewed the article invited many more customers, on average.
Article 4: Invite your team and customers
12 instances interacted with this article.
The article discussed how to add an agent. It included a link that popped the invite team dialog in the project sidebar. It also discussed how to add a customer to a service desk project. Another link navigated users to the Customers page.
We had two goals in mind for this content. Get the evaluator to:
- Invite a customer
- Add an agent
Here's how the article performed:
| | Didn't view the article | Viewed the article | Delta (pct. points) |
|---|---|---|---|
| Invited customers | 137 out of 823 (17%) | 7 out of 12 (58%) | ↑41 |
| Added agents | 109 out of 823 (13%) | 4 out of 12 (33%) | ↑20 |
Instances that viewed the article were 41 percentage points more likely to invite a customer, and 20 points more likely to add an agent.
Not only did more instances invite customers, they invited more customers on average.
Setting aside the instances that didn't invite any customers, here's the number of invited customers:
| | Didn't view the article | Viewed the article | Delta |
|---|---|---|---|
| Average number of customers invited per instance | 2.93 | 5.00 | ↑71% |
Strangely, even though the article spurred more instances to add agents, those instances added fewer agents on average:
| | Didn't view the article | Viewed the article | Delta |
|---|---|---|---|
| Average number of agents added per instance | 1.39 | 1.25 | ↓10% |
Conclusions
Even if the volumes are low, these results hint at the value in-app help can provide to the company. In-app help can:
- Better engage users who read documentation
- Track what those users do in the product
- Focus IXers on writing docs that affect business goals, instead of shooting in the dark
- Reduce the pain of finding help when users need it
Ultimately, it keeps customers active in the product.
What this means for IX
A goal-driven writing approach makes more sense than "document all the things". We need to write with a goal in mind to maximize our limited IX resources. We need to use a ground-up approach. We should map IX writing goals to feature goals to product goals to business goals.
Tracking documentation is a good thing. We can see how our content performs in a tangible way. We should stop trying to interpret page views and bounce rates, and start thinking about how documentation increases product outcomes.
It's all part of the feedback loop IX writers form: we learn from users, write articles to ease their pain, and then feed our findings back into our product's simplification efforts.
The easiest way to track documentation against product goals is to put the docs in the product.
Our goal for IX should never be "go away from the product to a documentation space and do things there so IX seems relevant." Our goals should be the same as the collective Atlassian goal: "Keep users in the product longer and engage them more."
What's next?
JIRA Service Desk's IX writers are looking to put this in place on a larger scale. We hope to write for top pain points across the product and push these articles to existing instances.
I'm especially keen to see in-app help's effect on churn for the product that has the highest churn rate.
Thanks
Heaps of people to thank for helping out with this experiment. A, R, R, and J took on the original ShipIt. A lent us her Elevio account to set this experiment up. S hacked it to work on evaluator instances. The A-team let me convince them to use their experiment pipeline time for this. W and the product advocates gave us great insights into customers. N never stops supporting the A-team's efforts. A looked into the results and ran queries for us. A and I looked into analytics for us. B and the Help and Support team in Austin sparred our content and experiment approach (I hope the results are useful to them). Thanks to you all for supporting this effort.