
Study Lifecycle

Understanding study statuses and the journey from draft to completion


Every study goes through a series of stages from initial creation to final completion. Understanding these stages helps you manage your research projects effectively.

Study Status Overview

| Status | Stage | What's Happening |
|---|---|---|
| Draft | Creation | Building your survey |
| Testing | QA | Verifying everything works |
| Translation | Localization | Adding language versions |
| Feasibility | Pre-launch | Confirming recruitment viability |
| Published | Live | Collecting real responses |
| Completed | Closed | Data collection finished |
| Archived | Storage | Historical reference |

Status Flow

[Draft] → [Testing] → [Translation] → [Feasibility] → [Published] → [Completed] → [Archived]

Not all studies follow every step. You can skip stages based on your needs:

  • No translations? Skip the Translation stage

  • Small internal study? Go directly from Testing to Published
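This forward-only flow with optional stages behaves like a simple state machine. A minimal sketch in Python, based only on the stages listed in this article (the `can_advance` helper and its skip rules are illustrative, not part of the product's API):

```python
# Ordered lifecycle stages; Translation and Feasibility may be skipped.
STAGES = ["Draft", "Testing", "Translation", "Feasibility",
          "Published", "Completed", "Archived"]
OPTIONAL = {"Translation", "Feasibility"}

def can_advance(current: str, target: str) -> bool:
    """A study moves forward only, skipping at most optional stages."""
    i, j = STAGES.index(current), STAGES.index(target)
    if j <= i:
        return False  # no reverting (unarchiving is a separate special case)
    # Every stage jumped over must be optional.
    return all(stage in OPTIONAL for stage in STAGES[i + 1:j])

print(can_advance("Testing", "Published"))  # True: skips Translation and Feasibility
print(can_advance("Draft", "Published"))    # False: Testing can't be skipped
```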


Draft

Your study begins in draft status when you create it.

What You Can Do

  • Create and edit questions

  • Build your survey structure

  • Add screening and conditional logic

  • Configure AI follow-ups

  • Add stimulus materials

Moving Forward

To move to Testing:

  1. Complete your survey design

  2. Ensure validation passes (no errors)

  3. Enable test mode


Testing

The testing phase lets you verify your survey works as expected before reaching real participants.

What You Can Do

  • Generate test links

  • Walk through the survey yourself

  • Share test links with colleagues for QA

  • Verify conditional logic paths

  • Check screening rules

  • Test on mobile and desktop

Test Participant Features

  • Test responses don't count toward quotas

  • Test responses are marked separately from real responses

  • You can generate unlimited test links

  • Test responses are visible in the Responses tab (filtered)

Moving Forward

When testing is complete:

  1. Review all test responses

  2. Fix any issues discovered

  3. Clear test data (optional)

  4. Proceed to Translation (if needed) or Feasibility


Translation

If your study needs multiple languages, the translation stage manages localization.

What You Can Do

  • Add language versions

  • Translate question text

  • Translate options and labels

  • Review translations for accuracy

  • Preview in each language

Translation Workflow

  1. Complete the primary language version

  2. Add additional languages

  3. Translate or auto-translate content

  4. Review and refine translations

  5. Test each language version

Moving Forward

When all translations are complete:

  1. Review each language version

  2. Test the survey in each language

  3. Proceed to Feasibility

See Translation Features for detailed guidance.


Feasibility

The feasibility stage confirms you can recruit your target participants before full launch.

What You Can Do

  • Configure recruitment criteria

  • Set up quotas

  • Check estimated costs

  • Review feasibility estimates

  • Activate panel links for testing

Feasibility Check

The system estimates:

  • Likelihood of reaching target respondents

  • Expected recruitment difficulty

  • Estimated cost per complete

  • Time to fill quotas
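The exact model behind these estimates isn't documented here, but the inputs are familiar from panel research. As a back-of-envelope illustration (the incidence rate, completion rate, and function name below are assumptions for this sketch, not product fields):

```python
def recruitment_estimate(target_completes: int, incidence_rate: float,
                         completion_rate: float, cost_per_complete: float):
    """Rough feasibility math: how many invitations a quota implies,
    and what filling it would cost at a given price per complete."""
    # Of everyone invited, only incidence_rate qualify, and only
    # completion_rate of those finish the survey.
    invites_needed = target_completes / (incidence_rate * completion_rate)
    total_cost = target_completes * cost_per_complete
    return round(invites_needed), round(total_cost, 2)

# 400 completes at 20% incidence and 80% completion, $6.50 per complete:
print(recruitment_estimate(400, 0.20, 0.80, 6.50))  # (2500, 2600.0)
```

Narrow targeting drives the incidence rate down, which is why studies with very specific demographics benefit most from a feasibility check before launch.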

When to Use

  • Large-scale studies with specific demographics

  • Studies with narrow targeting criteria

  • Panel-based recruitment

  • Budget-sensitive projects

Moving Forward

When feasibility is confirmed:

  1. Review and accept cost estimates

  2. Finalize recruitment configuration

  3. Publish the study


Published

Your study is live and collecting real responses.

What's Happening

  • Participants can access the survey

  • Responses are collected and analyzed

  • Quotas are tracked and enforced

  • Quality checks are applied

What You Can Monitor

  • Response count and completion rate

  • Quota progress by segment

  • Response quality scores

  • Individual participant status

Making Changes

While published, you can:

  • Monitor responses in real-time

  • View partial analysis as responses come in

  • Generate reports (once you have sufficient responses)

While published, avoid:

  • Changing questions (affects data consistency)

  • Modifying screening logic (affects quotas)

  • Altering quota structure significantly

Moving Forward

The study moves to Completed when:

  • All quotas are filled

  • You manually close the study

  • The deadline is reached (if set)
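These three closing conditions amount to a single check. A sketch of that logic (the quota representation and parameter names are assumptions for illustration, not the platform's actual data model):

```python
from datetime import datetime
from typing import Optional

def should_complete(quotas: dict[str, tuple[int, int]],
                    manually_closed: bool,
                    deadline: Optional[datetime],
                    now: datetime) -> bool:
    """True once a published study should move to Completed:
    every quota's (count, target) pair is filled, the study was
    closed by hand, or an optional deadline has passed."""
    all_filled = all(count >= target for count, target in quotas.values())
    past_deadline = deadline is not None and now >= deadline
    return all_filled or manually_closed or past_deadline

quotas = {"18-34": (120, 120), "35-54": (80, 80)}
print(should_complete(quotas, False, None, datetime.now()))  # True: all quotas filled
```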


Completed

Data collection is finished. Time for analysis.

What You Can Do

  • Run full analysis

  • Generate comprehensive reports

  • Export all response data

  • Create presentation slides

  • Share findings with stakeholders

Data Availability

  • All responses are final

  • Transcripts are fully processed

  • AI analysis is complete

  • Export formats are available

Making Reports

With collection complete:

  1. Generate the analysis report

  2. Review themes and personas

  3. Explore data with Report Chat

  4. Build slides for presentation

  5. Export as needed


Archived

Older studies can be archived to keep your workspace organized.

When to Archive

  • Analysis is complete

  • Findings have been delivered

  • The study is no longer needed for active reference

  • You want to declutter your study list

What Happens

  • Study moves to Archived section

  • Data remains accessible

  • Reports can still be viewed

  • Can be unarchived if needed


Participant Status

Individual participants have their own status tracking:

| Participant Status | Meaning |
|---|---|
| Pending | Invited but hasn't started |
| Started | Began the survey |
| Completed | Finished successfully |
| Screened Out | Failed screening questions |
| Aborted | Started but quit |
| Quality Stopped | Stopped for quality issues |

Managing Status Transitions

Reverting Status

Status transitions are mostly one-way, with a few exceptions:

  • A completed study cannot be reopened for responses

  • Published studies can be paused (contact support)

  • Archived studies can be unarchived

Emergency Changes

If you need to make urgent changes to a published study:

  1. Consider the impact on data quality

  2. Document any changes made

  3. Note the response count when changes occurred

  4. Analyze impacted responses separately if needed

Best Practices

Before Publishing

  • Complete all questions and logic

  • Test all conditional paths

  • Verify screening rules work correctly

  • Review quota configuration

  • Complete translations (if applicable)

  • Preview on mobile devices

  • Have colleagues test the survey

  • Clear test responses before launch

During Fieldwork

  • Monitor quota progress daily

  • Check response quality metrics

  • Address any technical issues quickly

  • Communicate with panel providers (if applicable)

  • Track timeline against goals

After Completion

  • Review overall response quality

  • Generate and review analysis

  • Export raw data for backup

  • Create presentation materials

  • Archive when no longer needed
