Template - Content design scorecard - minified
Document | Design scorecard |
---|---|
Status | |
Evaluators | The team members who are evaluating the design |
Experience | This scorecard measures the content quality of the [insert feature or experience link] experience. |
User story or job to be done | The main user story or job to be done being captured in this flow, or link to functional requirements specification. |
Related links | |
How to use this template
1. Copy this template into your project space.
2. Work your way through each step of your selected experience or flow.
3. At each step, consider the checklist below and score your designs, adding notes about why you've scored a certain way.
4. Once you've considered these points, collate the key points in the good and bad sections below.
5. Repeat steps 2-4 with a peer, setting the right amount of context around the journey and role of the scorecard.
6. Collate and prioritize the next steps identified from this round of assessment. Complete this together with your team.
7. Share any decisions and/or red flags you've identified at your next team catch-up.
Summary
Complete this section last.
Overall score: A/B/C/D
The good | The bad | Next steps |
---|---|---|
Summarize good practices | Summarize where improvements can be made | Assign tasks and due dates for improvements |
Content design bandwidth and engagement: 0-5 plus notes
Bandwidth | Engagement | Notes |
---|---|---|
Was the content designer balancing competing priorities (0) or fully focused on this work in a sustainable way (5)? | Was the content designer operating as a self-serve reviewer (0), as an internal service (3), or fully embedded with the development team (5)? | Include notes about the assigned content designer's bandwidth and engagement |
How to score
A | B | C | D |
---|---|---|---|
Excellent. Represents the current best practice. Maximizes clarity and simplicity of the content. | Good. Generally represents good practice and contributes clear, simple content, but with exceptions. Capable of further improvement. | Fair. Of average quality. May have some good facets but capable of significant further improvement. Not representative of current good practice. | Poor. Unclear, confusing. |
Scorecard
Criterion name | What is this criterion? | How to measure | Score and notes |
---|---|---|---|
Table stakes | | | |
Legibility | Use of legible fonts and text layout | All applicable met = A. Missing some = B. Missing most = C. None that apply are met = D. | Score: Notes: |
Structure | Quality of the UI/UX flow or the long-form document's organization in relation to its function | All applicable met = A. Missing some = B. Missing most = C. None that apply are met = D. | Score: Notes: |
Action | Clarity about what action is required of the user | All applicable met = A. Missing some = B. Missing most = C. None that apply are met = D. | Score: Notes: |
Other crucial criteria | | | |
Plain words | Extent to which the vocabulary is easily understood | All applicable met = A. Missing some = B. Missing most = C. None that apply are met = D. | Score: Notes: |
Readability | Ease with which the reader can follow the argument of the text | You can use a formula if needed, for example the Flesch-Kincaid grade level calculator (see the note below the scorecard). Be warned that readability scores can be misleading and may not correlate to user performance. In concept tests, try a single ease question: "How easy or difficult was it to read the information provided?" | Score: Notes: |
Audience fit | Appropriateness to the knowledge and skill level of the user | All applicable met = A. Missing some = B. Missing most = C. None that apply are met = D. | Score: Notes: |
Relevance | How relevant the content is to the recipient | For product designs: all applicable met = A. Missing some = B. Missing most = C. None that apply are met = D. For documentation: "Was this helpful?" scores and the rate of "not relevant" responses. | Score: Notes: |
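For reference, the Flesch-Kincaid grade level mentioned in the Readability row is calculated from average sentence length and average syllables per word; most word processors and online readability calculators report it automatically. The standard formula is:

Flesch-Kincaid grade level = 0.39 × (total words / total sentences) + 11.8 × (total syllables / total words) - 15.59

The result roughly corresponds to the US school grade needed to follow the text, so lower numbers generally indicate easier reading.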