The Hidden Cost of Tool Proliferation in Qualitative Research Operations

By Sago


For research agencies managing multiple concurrent projects, the technology stack often grows organically from everyday needs rather than strategic planning.  

A video conferencing licence here, an online community platform there, cloud storage for ethnographic materials, and separate software for analysis. Each tool solves a specific problem, but the cumulative effect creates an invisible operational tax that many agencies fail to measure until they examine the end-to-end workflow.

At De La Riva, where we regularly run multiple qualitative projects at once, this became especially visible in our daily workflow. Our experience illustrates how tool fragmentation affects not just efficiency metrics, but team autonomy, collaboration patterns, and the ability to scale operations.

This article is based on the presentation “Unlocking Efficiency and Insight: The Qual Transformation Story”, presented by Sago at the Insights to Action Summit in October 2025. The full video replay is free to watch here:


The Anatomy of a Fragmented Workflow

Ana Bravo, who leads the qualitative fieldwork and analysis process at De La Riva, describes the pre-consolidation workflow as a manual integration challenge. We used separate vendors for video conferencing, online communities, file storage for ethnographic materials such as videos and photos, and analysis software, and each vendor delivered its outputs in isolation.

“As researchers, we had to manually connect the information through documents and isolated supports”, Ana explains. “Everything was disaggregated, and in our hands was the job to create support for the communications to connect all the suppliers’ output”.

The operational implications extended beyond simple inconvenience. With at least five people involved in each project across the qualitative value chain, dependencies multiplied. For example, a researcher might need focus group links from one colleague and recordings from another, then queue a request for video downloads before analysis could begin.

The Dependency Chain Problem

Tool fragmentation creates dependency chains that undermine team autonomy. In our case at De La Riva, team members could not complete their assigned tasks without waiting for others to provide access, transfer files, or complete preliminary steps in different systems. We felt this in the everyday workflow: waiting on links, files, or access before moving forward.

This might seem manageable for a single project. But at scale, with dozens of projects in motion, these dependencies compound. Queue times extend, and project managers spend hours coordinating handoffs rather than focusing on research quality.

The bottleneck is not any individual tool’s performance. Most specialised research platforms excel at their designated function. The problem emerges at the seams between tools, where human effort must bridge the gaps.

The Real Cost Goes Beyond Time

Agencies typically evaluate technology investments through cost-per-seat calculations or feature comparisons. But tool fragmentation imposes costs that rarely appear in procurement spreadsheets.

  • Knowledge fragmentation: When project materials live in multiple systems, institutional knowledge becomes siloed. New team members face a steeper learning curve, and experienced researchers cannot easily reference past work as materials are scattered across platforms.
  • Quality control challenges: Reviewing work requires accessing multiple systems, each with different interfaces and permission structures. This friction makes comprehensive quality assurance more difficult, particularly under deadline pressure.
  • Client experience variability: Some clients adapt easily to multiple platforms, whilst others struggle with each additional login credential and interface. Agencies cannot standardise the client experience when the underlying technology varies by project type.
  • Innovation constraints: Developing new methodologies requires understanding what combinations of techniques are operationally feasible. When each technique lives in a separate platform, experimenting with hybrid approaches becomes logistically complex before it even reaches the conceptual design phase.

The Integration Value Proposition

Our consolidation at De La Riva onto QualBoard, a unified qualitative research platform, achieved 98% project migration over approximately 18 months. The quantitative efficiency gains were significant, but we emphasise that the transformation in team dynamics proved equally valuable.

“What we gained with QualBoard that we didn’t have before was autonomy amongst the collaborators”, Ana notes. “Everyone can do their job without depending on someone else’s job. Since every single result is held in QualBoard as the centre of the value chain, everyone can get into it and perform their task without making a request or sending an email or making a call”. Day to day, this meant fewer back-and-forth messages and less time waiting to begin tasks.

This autonomy enabled us to decentralise process execution to the people performing each task. Rather than routing requests through gatekeepers who controlled access to different platforms, team members gained direct access to the materials they needed within their scope of responsibility.

Integration Enables Methodology Innovation

Platform consolidation delivered an unexpected benefit. Once we established operational stability with integrated tools at De La Riva, we began designing new methodologies that would have been impractical in a fragmented environment.

We developed a real-time communication testing approach that combines crowd surveys with immediate qualitative follow-up sessions. Moderators survey 90 to 120 respondents, then, within 30 minutes, move selected participants into live qualitative discussions based on their survey responses.

This methodology compressed fieldwork that previously required four days into two hours, whilst delivering preliminary results within 24 hours. The approach was only feasible because both quantitative and qualitative components operated within the same platform, eliminating the integration work that would have made such rapid transitions impossible.

Evaluating Your Own Tool Stack

Agencies considering platform consolidation should examine specific operational patterns rather than conducting abstract cost-benefit analyses. The real question is where time, attention, and momentum are being lost.

Map the actual information flow for a typical project. How many times do team members request access, wait for file transfers, or manually move data between systems? How often do projects experience delays because materials are temporarily inaccessible or in the wrong format?

Survey your team about task dependencies. Which activities cannot proceed without outputs from different systems? Where do queue times develop? Which team members function as bottleneck points simply because they control access to specific platforms?

Assess your ability to experiment with new approaches. When you design a new methodology, how much of the planning discussion focuses on technical integration challenges rather than research design? Do you avoid certain technique combinations because the operational complexity outweighs the research value?

The goal is not to achieve perfect consolidation. Some specialised requirements will always demand dedicated tools. But when tool proliferation creates more operational overhead than research value, the fragmentation cost has become too high.


Moving Forward

Tool consolidation represents a significant change management challenge, which we will explore in depth in a companion article. But the first step is recognising that operational efficiency in qualitative research depends not just on individual tool capabilities, but on how seamlessly those tools work together across the entire value chain. Ultimately, the measure of success is whether the work feels clearer and lighter for the people doing it.

For agencies managing substantial project volume, the hidden costs of tool proliferation may be the single largest constraint on growth and innovation.


Authors

Sago
Sago is a data collection and research technology company that provides qualitative and quantitative research solutions and services.