Response Quality

Evaluate response quality, understand scoring, and manage data quality

Response Quality Dialog

Access detailed quality information for any response.

Opening Quality Details

  1. Click on a response row

  2. Click View Quality Details

  3. Review the quality breakdown

Quality Overview

The dialog shows:

  • Overall quality score

  • Individual quality factors

  • Specific flags or issues

  • Recommendations

Quality Scoring Methodology

Deepfield automatically evaluates responses across multiple dimensions.

Scoring Factors

| Factor          | Weight | What It Measures                 |
|-----------------|--------|----------------------------------|
| Completion time | 25%    | Reasonable survey duration       |
| Response length | 25%    | Open-ended answer depth          |
| Consistency     | 20%    | Logical coherence across answers |
| Engagement      | 15%    | Thoughtful, varied responses     |
| Media quality   | 15%    | Clear audio/video recordings     |
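
The weighting above implies the overall score is a weighted sum of per-factor scores. A minimal sketch, assuming each factor is rated on a 0-100 scale (the weights come from the table; the function and factor names are illustrative, not Deepfield's actual formula):

```python
# Weights mirror the scoring-factors table; each factor is assumed
# to be rated 0-100. This is an illustrative sketch only.
WEIGHTS = {
    "completion_time": 0.25,
    "response_length": 0.25,
    "consistency": 0.20,
    "engagement": 0.15,
    "media_quality": 0.15,
}

def overall_score(factor_scores: dict) -> float:
    """Weighted sum of per-factor scores (each 0-100)."""
    return sum(WEIGHTS[name] * factor_scores.get(name, 0.0)
               for name in WEIGHTS)

# Example: strong on most factors, weak media quality.
score = overall_score({
    "completion_time": 90,
    "response_length": 80,
    "consistency": 85,
    "engagement": 70,
    "media_quality": 40,
})
```

Because media quality carries only 15% of the weight, a single weak factor drags the overall score down modestly rather than dominating it.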

Score Ranges

| Score  | Rating   | Meaning                                |
|--------|----------|----------------------------------------|
| 80-100 | High     | Excellent quality, use with confidence |
| 60-79  | Medium   | Acceptable, minor concerns             |
| 40-59  | Low      | Review before including                |
| 0-39   | Very Low | Likely unusable                        |
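
The bands above map directly to a threshold check. A small sketch (the function name is illustrative):

```python
def rating(score: float) -> str:
    """Map a 0-100 quality score to the rating bands above."""
    if score >= 80:
        return "High"
    if score >= 60:
        return "Medium"
    if score >= 40:
        return "Low"
    return "Very Low"
```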

Quality Flags

Specific issues that affect quality scores:

Common Flags

| Flag            | Description                     | Impact               |
|-----------------|---------------------------------|----------------------|
| Speeding        | Completed too quickly           | Major: -20 points    |
| Straightlining  | Same answer for all matrix rows | Moderate: -15 points |
| Short responses | Brief open-ended answers        | Moderate: -10 points |
| Inconsistency   | Contradictory answers           | Major: -20 points    |
| Poor audio      | Unclear recording               | Minor: -5 points     |
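
The point values above suggest flags act as deductions from a base score. A hypothetical sketch of that mechanic, clamped so the score never goes below zero (penalty values are from the table; the combination rule is an assumption):

```python
# Penalty points from the flags table; how Deepfield actually
# combines flags with the base score is an assumption here.
PENALTIES = {
    "speeding": 20,
    "straightlining": 15,
    "short_responses": 10,
    "inconsistency": 20,
    "poor_audio": 5,
}

def apply_flags(base_score: float, flags: list) -> float:
    """Subtract each flag's penalty, clamping the result at 0."""
    deduction = sum(PENALTIES[f] for f in flags)
    return max(0.0, base_score - deduction)
```

Under this model, two major flags alone (-40) are enough to push a mid-range response into the "Very Low" band.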

Flag Details

Each flag includes:

  • Specific issue description

  • Evidence (e.g., completion time, answer examples)

  • Severity level

  • Suggested action

Interpreting Scores

High Quality Responses

Characteristics:

  • Reasonable completion time

  • Detailed open-ended responses

  • Consistent answer patterns

  • Clear media recordings

Low Quality Indicators

Warning signs:

  • Completed in under 2 minutes

  • Single-word open-ended answers

  • All "Strongly Agree" for matrices

  • Inaudible recordings
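
The warning signs above can be checked mechanically. A rough heuristic detector, using the 2-minute threshold mentioned in this section (the other thresholds and names are illustrative):

```python
# Heuristic checks for the warning signs listed above. Only the
# 2-minute speeding threshold comes from this document; the rest
# are illustrative assumptions.
def warning_flags(duration_seconds, open_answers, matrix_answers):
    flags = []
    if duration_seconds < 120:  # completed in under 2 minutes
        flags.append("speeding")
    if any(len(a.split()) <= 1 for a in open_answers):
        flags.append("short_responses")  # single-word answers
    if matrix_answers and len(set(matrix_answers)) == 1:
        flags.append("straightlining")  # same answer for every row
    return flags
```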

Context Matters

Consider:

  • Survey length (longer = more time expected)

  • Question difficulty

  • Topic engagement

  • Respondent demographics

Managing Low-Quality Responses

Review Process

  1. Filter to show low-quality responses

  2. Review each individually

  3. Check specific quality flags

  4. Decide: include, exclude, or investigate

Exclusion Criteria

Consider excluding if:

  • Multiple major flags

  • Score below 40

  • Clear evidence of gaming

  • Unusable media

Inclusion Despite Low Score

May include if:

  • Only minor flags

  • Valuable qualitative content

  • Specific segment representation

  • Context explains anomalies

Quality in Analysis

Impact on Reports

Low-quality responses can:

  • Skew quantitative results

  • Add noise to themes

  • Reduce insight accuracy

  • Affect persona validity

Quality-Based Filtering

When generating reports:

  • Set minimum quality threshold

  • Exclude specific flags

  • Balance quality vs. sample size
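
These filtering rules translate to a simple predicate over responses. An illustrative sketch; the response fields (`score`, `flags`) and defaults are assumptions, not Deepfield's actual API:

```python
# Keep responses that meet a minimum score and carry none of the
# excluded flags. Field names and defaults are illustrative.
def filter_responses(responses, min_score=60,
                     excluded_flags=("speeding", "inconsistency")):
    excluded = set(excluded_flags)
    return [r for r in responses
            if r["score"] >= min_score and not excluded & set(r["flags"])]

sample = [
    {"score": 85, "flags": []},
    {"score": 72, "flags": ["poor_audio"]},   # minor flag, kept
    {"score": 90, "flags": ["speeding"]},     # excluded flag, dropped
    {"score": 45, "flags": []},               # below threshold, dropped
]
kept = filter_responses(sample)
```

Raising `min_score` improves data quality but shrinks the sample, so check the retained count before generating a report.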

Improving Quality

Study Design

| Strategy        | Effect                          |
|-----------------|---------------------------------|
| Shorter surveys | Less fatigue, better engagement |
| Clear questions | More thoughtful responses       |
| Varied formats  | Maintains attention             |
| Logical flow    | Easier to answer consistently   |

Recruitment

| Strategy           | Effect                            |
|--------------------|-----------------------------------|
| Target audience    | More relevant, engaged respondents |
| Fair incentive     | Motivates quality completion      |
| Clear expectations | Sets appropriate effort level     |

Monitoring

| Strategy      | Effect                      |
|---------------|-----------------------------|
| Early review  | Catch issues before scale   |
| Quality gates | Stop low-quality sources    |
| Feedback loop | Inform providers of issues  |

💡 Tip: Review the first 10-20 responses closely. Quality issues at the start often indicate systemic problems that will continue.
