This guide walks new UXArmy users through setting up a clear, effective usability test – from choosing task types to running checks, launching the study, and analysis. It’s based on the questions first-time users ask most often and reflects the latest UXArmy platform updates (Nov 2025).
The focus is purely on the usability testing tools. It does not cover UXArmy’s participant panel or the moderated research tool DeepDive®, which will be covered separately.
1. Choosing the Right Tasks for Your Test
Let’s start by understanding which usability testing tasks are available in UXArmy and what each one does. You can mix many of these tasks inside a single test; based on the core interaction task you choose, the compatible tasks are shown to you.
Core interaction tasks
- Mobile App Task (iOS & Android) – recordings*, task duration, task success rate
- Website Task – recordings, navigation path analysis, heatmap, task duration, task success rate
- Prototype Task – recordings (optional), navigation path analysis, heatmap, task duration, task success rate
*Note: UXArmy provides ‘full test recording’ instead of task-specific recording. This means your test is recorded from the start of the test to the very end of the last task. Screen and audio recording are available for Desktop, Tablet, and Phones (both iOS & Android). Face recording is available for Desktop.
Behavioral and impression-based tasks
- First Click – tests instinctive navigation
- 5-Second Test – shows an uploaded image for a fixed duration
- Time-Limited Task – limits how long participants can view a prototype or webpage
Verbal tasks
- Speaking Task – collects only verbal responses. A great alternative to Open Feedback when you want rich feedback that isn’t limited by typing speed. Auto-transcriptions in the UXArmy video player make it easy to skim through long feedback.
- Image Test – a Speaking Task with an image shown
Survey-style questions
- Open Feedback
- Single Select & Multi-Select
- Rating Scale
- Ranking
- Matrix Questions
- Simple Input (with email, number, date, or text validation)
- Yes/No
For Transitions & Instructions
- Note – transition screen suitable for setting the scene, instructions, warnings
Important image-related rules
- You can add images to almost all question types and answer options, including Note
- Matrix questions are the only exception
Resources:
- Check out the UXArmy Templates for Usability Tests and Surveys in platform
- Learn more about the different usability testing task types available in UXArmy
2. Meet ‘Note’: your secret ingredient for clearer tasks
Note is not an actual task; it is a transition screen with a friendly “Continue” button instead of “Start Task.” And it’s surprisingly powerful for keeping your test easy to follow.
One of the most common early mistakes is putting everything – the scenario, instructions, warnings, and the actual mission – into a single screen. When that happens, it creates cognitive overload for participants.
A better approach is to let Note handle the setup, and let the task handle the action.
Use Note to:
- set context to shape the participant’s mindset
- give instructions
- add warnings or reminders
- prepare them for what’s coming next
Then use the task itself to state only the mission.
This creates a clean rhythm:
(1) Understand → (2) Act.
When the mission stands alone, everything feels smoother for participants, which often means better insights for you.
3. How to Write Effective Usability Testing Tasks
1) When you ask for feedback, help them tap into something real
“Tell us what you think of this design” sounds simple… but it’s actually a tough question for participants. It puts all the pressure on them to figure out how to respond, and you often get vague answers, or short, unhelpful ones like “It’s nice. I like it.”
A much better approach is to anchor your request in a concrete moment so participants know exactly what lens to use.
For example:
Instead of:
“Tell us what you think of this design.”
Try something like:
“Think back to the last time you were planning an overseas holiday. As you look at this page, tell us what catches your eye first, what feels helpful, and what feels confusing for that situation.”
2) Avoid UX/design jargon
Use everyday language. Participants don’t speak in “flows,” “hero sections,” or “UI.”
Notice how in the previous example, we changed ‘design’ in ‘tell us what you think of this design’ to ‘page.’
3) Don’t bury the task in a wall of text
Avoid stuffing a scenario, instructions, and the mission all in one screen. Use the Note + Task combination to present instructions and the actual mission in digestible tidbits.
4) Use formatting + emojis
Use font sizes, formatting, highlights, emojis (🚫,⛔, ⚠️, ✅) to draw attention to crucial points.
4. Build Tests Faster Using ‘Duplicate’
Duplicate Task
Useful when repeating the same question style (e.g., rating scale after each navigation task).
Duplicate Test
Once a test collects its first response, many things become uneditable. Use Duplicate Test to quickly iterate after dry runs or pre-tests without rebuilding from scratch.
5. Pre-Launch Checklist (Must-Do Before You Hit Launch)
Before launching your usability test, check your Test Setting:
- Public Test Name
- Transcription Language (choose correct accent for accuracy)
- Test Language (participant UI language)
- Invitation Language
- Welcome Message (editing required for non-English tests)
- Thank You Messages (editing required for non-English tests)
A clean setup improves participant experience and ultimately the quality of insights.
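If your team manages many launches, the checklist above can be scripted as a quick sanity check over your own launch notes. A minimal sketch, assuming a plain settings dictionary; the field names are illustrative labels for the checklist items in this guide, not a UXArmy API:

```python
# Hypothetical pre-launch sanity check. Field names mirror this guide's
# checklist and are NOT part of any UXArmy API.
REQUIRED_SETTINGS = [
    "public_test_name",
    "transcription_language",
    "test_language",
    "invitation_language",
    "welcome_message",
    "thank_you_message",
]

def missing_settings(settings):
    """Return the checklist items that are empty or absent."""
    return [field for field in REQUIRED_SETTINGS if not settings.get(field)]

# A draft that still needs its languages and messages filled in:
draft = {"public_test_name": "Holiday booking study"}
print(missing_settings(draft))
```

Running the check before you hit Launch catches forgotten non-English welcome and thank-you messages, which are the items most often missed.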
6. What Happens at Launch: How Pooled Credits Work
UXArmy uses pooled credits across all research types.
Credits are deducted only when a participant submits a response – not at test launch.
Pre-launch, UXArmy shows an estimated credit requirement based on your target sample size.
Actual deduction = per completed response.
This gives you flexibility and makes budgeting more predictable.
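The budgeting arithmetic behind this is straightforward. A sketch of the estimate-versus-deduction difference; the per-response credit cost here is an assumed figure for illustration, since actual rates depend on your plan:

```python
# Illustrative credit math for pooled credits. CREDITS_PER_RESPONSE is
# an assumed figure, not an actual UXArmy rate.
CREDITS_PER_RESPONSE = 10  # hypothetical rate

def estimated_credits(target_sample_size):
    """Pre-launch estimate shown for your target sample size."""
    return target_sample_size * CREDITS_PER_RESPONSE

def credits_deducted(completed_responses):
    """Actual deduction: only completed responses cost credits."""
    return completed_responses * CREDITS_PER_RESPONSE

print(estimated_credits(20))  # budget shown before launch → 200
print(credits_deducted(14))   # only 14 participants finished → 140
```

The gap between the two numbers is why launch itself costs nothing: unfilled slots never deduct from the pool.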
7. Post-Launch: What You Can and Cannot Edit
Editing the test
- We don’t recommend it, but you can edit tasks before the first participant submits a response.
- After the first participant response is collected, editing is restricted for a number of items. This is necessary for consistency in test results. Here’s the complete list of what you can and cannot edit after test launch, and from which point each restriction applies.
Adjusting sample size
- You can increase target responses for Draft or Active tests.
- You cannot increase the sample size once the test is Completed (launch a Duplicate Test instead). Test status changes to ‘Completed’ once the desired number of responses you set in the Participant tab is met.
8. When Participants Face Issues (And How to Fix Them)
1) Video not uploading
Top causes for recording submission failures are:
- Weak Wi-Fi or use of mobile data. Ask participants to connect to strong, stable Wi-Fi.
- App closed before upload completed
How do you advise a self-sourced participant who experiences a submission failure?
Ask the participant to reconnect to strong Wi-Fi and reopen the UXArmy app. This will trigger an auto-retry of the upload.
2) Recording stopped mid-test unexpectedly
On iPhones, tapping the very top of the screen during testing can accidentally stop the screen broadcast. When that happens, the screen and audio recording stop, even though the app itself stays open. As a result, only a partial recording will be submitted for that participant.
This is caused by the way iOS handles screen broadcasting, so remind participants not to tap the top edge.
3) Wrong browser for recorded test
For desktop recording, always use Chrome.
“No-recording” tests can use any browser.
4) Fold/Flip phones
- Foldables should be folded before starting the test
- Flip phones must be fully open before starting the test
This ensures the correct form-factor is detected.
Resources:
Useful participant video guides for self-sourced unmoderated usability testing are available in YouTube playlists by language: English, Korean, Japanese, Thai, Arabic
9. Managing Your Team on UXArmy
UXArmy has a number of features to support collaborative research. One of these is Team Management & Workspaces, available on Pro and Enterprise plans:
Type of Seats:
- Full Seats: Owner, Editor
- Collaborator Seats: Access to test results & ability to analyze together
- View-Only Sharing: No login required. URL share with password protect option.
Teams can work together efficiently regardless of geography, with Transcriptions in 23 languages, in-platform Translation of transcriptions into 67 languages, and AI translation of tasks during creation. Check which languages are supported for unmoderated testing.
10. Technical Limits to Keep in Mind
- Up to 70 tasks per usability test
- Usability Test Recording limits: 50 minutes (mobile), 60 minutes (desktop)
- Moderated interviews/focus groups Session length: up to 3 hours
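If you plan studies programmatically, the limits above can be encoded as constants so a planning script flags a study that exceeds them. A small sketch using the values from this guide (Nov 2025); the dictionary keys are illustrative, not a UXArmy API:

```python
# Platform limits from this section, encoded for quick reference.
# Values are from this guide (Nov 2025); keys are illustrative.
LIMITS = {
    "max_tasks_per_test": 70,
    "max_recording_minutes": {"mobile": 50, "desktop": 60},
    "max_moderated_session_hours": 3,
}

def recording_fits(device, planned_minutes):
    """True if a planned recording fits the per-device limit."""
    return planned_minutes <= LIMITS["max_recording_minutes"][device]

print(recording_fits("mobile", 45))   # within the 50-minute mobile cap
print(recording_fits("desktop", 75))  # exceeds the 60-minute desktop cap
```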
11. Best Practices for Overseas Research
UXArmy supports usability testing in many languages because people give richer, clearer feedback in their native tongue. The participant interface alone is available in 17 languages: English, Arabic, Bengali, Chinese (Simplified), Chinese (Traditional), Danish, Dutch, Filipino (Philippines), French, German, Hindi (India), Indonesian (Bahasa Indonesia), Italian, Japanese, Korean, Portuguese, Spanish.
When testing in markets where English isn’t widely used:
- Have a native speaker review your tasks, welcome message, and thank-you message.
- Pre-test on real devices with native speakers to catch OS-level language quirks or confusing phrasing.
- Remember that machine translations can feel literal – always ask a native speaker to validate AI-generated text.
- Give your local reviewer Editor access so they can update wording directly and speed up iterations.
These small steps dramatically improve clarity, cultural fit, and the overall quality of your test.
Resources:
- Check which languages UXArmy supports
- If you need help with overseas testing, UXArmy offers in-house support across many Asian markets. Contact Us to learn more.
Frequently asked questions
Does UXArmy offer AI-powered UX research tools?
Yes. UXArmy includes several AI-powered tools that automate analysis and enhance insights such as AI summaries, sentiment analysis, smart highlights and tagging, follow-up questions, and automatic transcription and translation. These features help you get clearer insights, faster.
What is a Note task?
A ‘Note’ is a transition screen used to give context, instructions, or reminders before the real task. It’s not an actual task – participants simply tap ‘Continue’ instead of ‘Start Task’. Note helps separate setup from action, reducing cognitive load and making the test flow clearer.
Can I test my website and a competitor’s website in one test? What about mobile apps?
Yes, you can for both websites and mobile apps. UXArmy supports multi-asset testing. In addition, if you select mobile app testing, you can even add a prototype navigation task to the same test.
Can I attach images to questions?
Yes, you can add images to questions as well as answer options. The only exception is the Matrix question.
Can I test videos on UXArmy?
Yes – provide the URL of the video you want to test, such as a YouTube video link, to a Website Task in place of the website URL.
What metrics can I get in UXArmy?
UXArmy provides a rich set of behavioral, performance, and qualitative insight metrics across both unmoderated and moderated usability tests. These metrics are designed to help researchers understand what users did, how they behaved, and why they struggled or succeeded – all in one analysis workspace.