How to identify which stem cell workflows your product actually serves

Reading time: 9 minutes

Our Pillar 3 anchor argued that positioning is a product decision rather than a marketing output, and that it rests on five connected questions. This article takes the first of those questions, which logically precedes the rest. Before you can describe your buyer, your competitors, or your value proposition, you need a defensible view of which workflows in developmental and stem cell science your product actually serves. Most positioning errors in this field trace back to a weak answer to this question, often supplied implicitly by the research context in which the tool was first conceived rather than explicitly by the team building it for market.

The short version of the argument is this. A tool is not positioned against a field. It is positioned against specific steps in specific workflows run by specific people for specific purposes. The field as a whole — "stem cell science", "cell therapy", "regenerative medicine" — is too large and too varied to serve as a target. Two laboratories that both work on pluripotent stem cells may run workflows that share almost no operational steps. A tool that fits one may be irrelevant to the other, not because the science differs but because the process does.

Why the research-context default goes wrong

Many ancillary technologies for stem cell science originate in a specific laboratory solving a specific problem in a specific protocol. This is both a strength, because the tool is born out of real need, and a structural risk, because the need that motivated the tool may not be the need that matters at commercial scale. The Pillar 2 article on product-market fit described this pattern as the translation from academic novelty to commercial workflow. The question this article takes up is what the other side of that translation actually looks like.

When a team skips workflow identification, they inherit the workflow of the originating laboratory by default. That laboratory's protocols, cell types, scales, and quality metrics become the implicit reference for what the tool is good for. If the originating laboratory ran small-batch academic experiments on a single iPSC line under research-grade conditions, the tool's natural fit will be with other academic laboratories doing similar work. Attempts to position it for clinical manufacturing, industrial bioprocessing, or agricultural applications will founder on differences the team did not originally design for. The tool was never built for those workflows. It was built for one workflow, and assumed to generalise.

Generalisation is possible, but it has to be earned, not assumed. Earning it starts with making the originating workflow explicit, then mapping it against the workflows of users the team hopes to reach.

Workflow, not application

A common first attempt at workflow identification confuses the workflow with the application. A team will describe their tool as serving "cell therapy manufacturing" or "disease modelling" or "organoid research" and consider the question answered. These are application domains, not workflows. They tell you which biology the user cares about; they do not tell you what the user actually does on a given day, in what order, with what inputs, producing what outputs, assessed by what criteria.

The distinction matters because workflows within an application domain vary enormously. Consider autologous cell therapy, where cells are collected from a patient, manipulated, and returned to the same patient. The workflow differs from allogeneic cell therapy, where cells from a single donor are expanded to produce doses for many patients, even though both are "cell therapy manufacturing". The scale logic is opposite: autologous workflows scale out (many small batches in parallel), allogeneic workflows scale up (few large batches). This distinction has direct consequences for which ancillary technologies each workflow can accommodate [1]. A tool that fits allogeneic large-batch characterisation may have no place in an autologous point-of-care workflow, and vice versa, even though both are part of the same application.

A working definition of a workflow for positioning purposes is the end-to-end sequence of steps a specific team takes to convert inputs (cells, reagents, consumables, information) into a defined output (a characterised cell batch, a dataset, a qualified intermediate, a shipped product), together with the quality criteria at each step, the decision points where the workflow can branch or halt, and the roles responsible for each action. This is more specific than an application, more specific than a protocol, and more specific than a unit operation. It is what a tool actually plugs into.
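That definition is concrete enough to sketch as a data structure. The following is a minimal illustration in Python, not a standard schema: the class and field names (Step, Workflow, quality_criteria, handoff_gaps, and so on) are hypothetical, chosen only to show that a workflow is a sequence of steps with inputs, outputs, quality criteria, decision points, and responsible roles, and that adjacent steps must hand off cleanly.

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    """One unit operation in a workflow (illustrative fields, not a standard schema)."""
    name: str                    # e.g. "expansion", "characterisation"
    inputs: list[str]            # cells, reagents, consumables, information
    outputs: list[str]           # a characterised batch, a dataset, a qualified intermediate
    quality_criteria: list[str]  # acceptance criteria checked at this step
    decision_point: bool = False # can the workflow branch or halt here?
    responsible_role: str = ""   # who performs or signs off the action

@dataclass
class Workflow:
    """End-to-end sequence converting defined inputs into a defined output."""
    steps: list[Step] = field(default_factory=list)

    def handoff_gaps(self) -> list[tuple[str, str]]:
        """Adjacent step pairs where upstream outputs do not cover downstream inputs."""
        gaps = []
        for a, b in zip(self.steps, self.steps[1:]):
            if set(b.inputs) - set(a.outputs):
                gaps.append((a.name, b.name))
        return gaps
```

Even a toy model like this makes one point visible: if a downstream step needs an input that no upstream step produces, the gap is filled off-process by an operator, and that is exactly the kind of friction a written protocol tends not to record.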

Mapping the workflow as it is, not as it is described

The workflow described in a published protocol, a company's standard operating procedures, or a conference presentation is frequently not the workflow actually run. Operators take shortcuts. Equipment substitutions happen. Quality checks that were designed to occur at step seven get moved to step four because of how the lab scheduled its work. Data is exported, annotated by hand, and re-imported elsewhere. The written workflow is an idealised map; the real workflow is the territory.

This matters because a tool that fits the written workflow may conflict with the real one. A product that requires a specific sample format can be defeated if the operator habitually batches samples in a different format. A software tool that expects data to arrive at a particular step can be bypassed if the team has developed a parallel manual pipeline. The tool does not fail because the science is wrong; it fails because the workflow reality is different from the workflow diagram.

Three practical methods help surface the real workflow.

Direct observation. Spending time in the laboratories and manufacturing facilities of potential customers reveals friction that interviews miss. A national survey of FACT-accredited cell processing facilities in the United States found that a slight majority had adopted automation in their workflows, but the specific practices, quality checks, and infrastructure varied substantially from facility to facility even within a common regulatory framework [2]. The variability is real and material. A tool evaluated on a facility's documented SOPs may perform differently on its actual operations.

Process shadowing. Sitting with the operator who performs the task the tool is meant to change, end to end, from sample receipt to output, without interrupting, reveals the decision rules that are not written down. Which measurements does the operator trust? Which do they re-run because of known variability? Where do they spend disproportionate time on problems that appear trivial on paper? These questions are answered by watching, not by asking.

Failure-mode inventory. Asking the team about the last three times the workflow failed, what caused each failure, and what the fix was, surfaces the parts of the process that are actually vulnerable. A tool that addresses the source of the most common or most costly failures has a clear positioning argument. A tool that addresses a step the team does not consider problematic has to create the problem before it can sell the solution — a much harder commercial path.

Identifying your product's real position in the workflow

Once the workflow is mapped, the question becomes where your product sits in it. Three sub-questions tighten the answer.

Which step does your product change? This is the specific unit operation the tool touches. Cell isolation, media preparation, seeding, expansion, harvest, characterisation, fill-finish, release testing, shipping, reconstitution, documentation. Your tool does something at one or more of these steps. Identify them explicitly. Vague claims that the tool "improves the workflow" without specifying the step are a symptom that the analysis has not been done.

What does the step look like before and after your product is introduced? A genuine position is described as a transformation: before your product, step N takes six hours, requires a skilled operator, produces a yield of 40% with 15% variability, and generates data that must be manually transcribed. After your product, step N takes two hours, runs unattended, produces a yield of 65% with 5% variability, and generates data that flows automatically to the next step. If you cannot describe the before-and-after in concrete, measurable terms, the position is a claim, not a product.
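The before-and-after claim can itself be expressed as measurable deltas rather than adjectives. A minimal sketch, using the purely illustrative numbers from the paragraph above (the metric names and values are hypothetical, not from any real product):

```python
# Illustrative before/after metrics for a single workflow step.
# Values are the hypothetical ones from the text, not real product data.
before = {"hours": 6.0, "yield_pct": 40.0, "cv_pct": 15.0, "attended": True}
after  = {"hours": 2.0, "yield_pct": 65.0, "cv_pct": 5.0,  "attended": False}

def transformation_summary(before: dict, after: dict) -> dict:
    """Express the positioning claim as concrete deltas, not adjectives."""
    return {
        "time_saved_hours": before["hours"] - after["hours"],
        "yield_gain_points": after["yield_pct"] - before["yield_pct"],
        "variability_reduction_points": before["cv_pct"] - after["cv_pct"],
        "hands_off": before["attended"] and not after["attended"],
    }
```

The discipline the sketch enforces is the point: every entry in the summary must be a number or a boolean a customer could verify, which is precisely the test distinguishing a position from a claim.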

What does the rest of the workflow require of your product? The step you change is embedded in steps you do not change. The inputs to your step come from upstream processes with their own specifications. The outputs of your step feed downstream processes with their own requirements. A tool that transforms step N but cannot accept the input formats step N-1 produces, or cannot emit outputs step N+1 needs, introduces new friction even as it removes old friction. The Pillar 2 article on toolchain fragmentation examined this interoperability problem in detail; its conclusions apply with equal force at the positioning stage. A workflow fit is only real if it includes the adjacent steps.

A worked example: the microfluidic separation case

One way to see how this analysis plays out in practice is to look at a published example. A 2024 study in Cytotherapy described an inertial spiral microfluidic device for enriching T-cells from leukaemic blood samples, proposed as an improvement to the starting-material preparation step in chimeric antigen receptor T-cell (CAR-T) manufacturing [3]. The paper is instructive not because the microfluidic principle is new — spiral inertial separation has existed for years — but because the authors explicitly positioned the tool against a specific step in a specific workflow, with quantified performance against concrete criteria.

The authors identified T-cell enrichment from leukaemic starting material as a bottleneck in CAR-T workflows because leukaemic samples contain elevated B-cells and blasts that must be depleted to produce a workable T-cell product. They demonstrated the tool on both healthy donor samples (used to optimise) and patient samples from two disease subtypes. They reported enrichment data (T-cell purity rising from 45% to 73% in healthy donors) and depletion data (80–89% reduction in non-target populations in one disease subtype) and acknowledged that the second disease subtype, which presented with lower starting T-cell numbers, was a less favourable fit for the method [3].

This is what workflow-level positioning looks like in practice. A specific step (starting-material enrichment). A specific workflow (CAR-T manufacturing from leukaemic blood). A specific comparator (conventional enrichment methods). Quantified before-and-after performance. Honest acknowledgement of the subpopulation where the method works less well. A TechBio company producing a similar device would not need to invent this framing; they would inherit it from the scientific report itself. The gap between a paper positioned this cleanly and a product positioned this cleanly is often smaller than founders expect, but it only closes if the team treats workflow identification as the first product-development decision, not the last marketing decision.

When workflow identification changes the product

A common outcome of doing this analysis rigorously is that the team realises the workflow they thought they were serving is not the workflow they should be serving. The tool turns out to fit an adjacent workflow better. The step they thought was the bottleneck is not the bottleneck customers actually experience. The user they imagined as the buyer is not the one who makes the purchase decision.

This realisation is uncomfortable, because it often implies product changes. But it is much less expensive than the alternative, which is discovering the same information after launch, through failed sales cycles and churning pilots. Autologous iPSC-based cell therapies, for instance, have been constrained by the fact that their workflows scale out rather than scale up, which makes many tools developed for allogeneic large-batch manufacturing poorly suited to them [1]. Tools designed from the outset for distributed, per-patient manufacturing are differently positioned from tools adapted from allogeneic contexts. The workflow decision shapes the product, not the other way around.

What this leads to next

Once the workflows your product serves are mapped, the remaining positioning questions become tractable. Mapping the competitive landscape becomes a question of who else operates at the steps you have identified, not who else mentions the application domain. Validating demand becomes a question of whether the specific users running those workflows will pay to change them. Defining the buyer becomes a question of who authorises the purchase for teams running those workflows. Framing the value becomes a question of describing the before-and-after transformation in language the workflow's practitioners recognise.

For applications where developmental and stem cell biology is itself the ancillary technology — species preservation, cultured food, human ageing and rejuvenation — the workflow identification question takes a different form, because the workflows are less mature and are often still being invented from prior art in other species. The article on prior art in laboratory and domesticated species addresses that case.

If the biology in this article is unfamiliar, the Pillar 1 articles on key stem cell methods and current applications give the foundation.

References

[1] Madrid M, Sumen C, Aivio S, Saklayen N. Autologous Induced Pluripotent Stem Cell-Based Cell Therapies: Promise, Progress, and Challenges. Curr Protoc. 2021;1(3):e88. DOI: 10.1002/cpz1.88

[2] Elsallab M, Bourgeois F, Maus MV. National Survey of FACT-Accredited Cell Processing Facilities: Assessing Preparedness for Local Manufacturing of Immune Effector Cells. Transplant Cell Ther. 2024;30(6):626.e1-626.e11. DOI: 10.1016/j.jtct.2024.03.016

[3] Elsemary MT, Maritz MF, Smith LE, Warkiani ME, Thierry B. Enrichment of T-lymphocytes from leukemic blood using inertial microfluidics toward improved chimeric antigen receptor-T cell manufacturing. Cytotherapy. 2024;26(10):1264-1274. DOI: 10.1016/j.jcyt.2024.05.005

About StemCells.Help

StemCells.Help is an advisory consultancy that supports innovation and real-world impact in life science applications built on developmental and stem cell biology. Founded by Dr Paul De Sousa, it draws on over four decades of experience spanning early embryo development, animal cloning, pluripotent stem cell manufacturing, and technology commercialisation. If you build tools for these domains or work in an emerging application where the biology is the enabling technology, StemCells.Help can provide experienced scientific counsel to ground your decisions. To discuss your needs, talk to Paul.

ORCID: 0000-0003-0745-2504

Web: stemcells.help
