Measure Twice, Prompt Once: Why Process Still Wins in the Age of AI
There’s something enduring about a well-organized workshop. When I was growing up, my father was a carpenter, and his workspace was filled with every tool you could imagine—saws, planes, clamps, chisels. To an outsider, it looked like everything you’d ever need to build anything. But as I spent more time around him, I began to understand that the tools themselves weren’t what made him a master carpenter. It was how he approached the work. He used each tool with purpose—measuring, preparing, and following a plan before ever making the first cut.
That lesson has never been more relevant than it is today, as our industry navigates the rapid rise of generative AI.
The Illusion of Speed: The Prefabricated Mindset
With the emergence of tools like Relativity aiR, there’s a natural temptation to believe that technology alone can solve the complexity of modern discovery. The promise is compelling: faster review, smarter categorization, fewer human hours.
And to be clear, these tools are powerful.
But what we are seeing is a growing tendency to treat generative AI like a prefabricated build—something you can assemble quickly and expect to perform—when in reality, complex matters still require the same planning, measurement, and disciplined process as any lasting structure.
Prefabrication has its place. It can accelerate timelines and create efficiency in the right context. But when applied without consideration for the underlying structure, it often leads to gaps, weaknesses, and the need for rework.
The same is true here.
From Raw Material to Finished Product
If we extend the workshop analogy, the data is the raw material—the wood brought into the shop. It may be rough, inconsistent, and full of imperfections. Before anything meaningful can be built, that material needs to be understood, measured, and shaped.
The matter itself is the project—the end goal that defines what you are trying to build and why.
Process is the plan—the blueprint that defines how to get from raw material to finished product. Without it, even the best tools will produce inconsistent results.
Analytics is the measuring discipline—ensuring that what you’re working with is properly assessed, aligned, and ready for the next step. It reduces error, increases precision, and creates confidence before action is taken.
And generative AI is the power tool—capable of accelerating work at a scale we’ve never seen before, but only when applied to material that has been properly prepared.
Each component has a role. But just as in a workshop, using the wrong tool at the wrong time—or skipping steps altogether—can compromise the entire outcome.
The Role of Tools—Used with Purpose
In any workshop, different tools serve different functions, and their value is defined by when and how they are used. The same is increasingly true in modern discovery environments.
Some tools help you understand the material in front of you—surfacing patterns, highlighting context, and identifying what matters versus what doesn’t. Others help isolate and shape the data—cutting away noise, refining scope, and focusing effort. Still others help bring clarity and finish to the work—ensuring that what is ultimately delivered is consistent, defensible, and aligned with the objectives of the matter.
In practice, this is where thoughtfully designed applications begin to play a meaningful role. Capabilities such as contextual search and term-in-context analysis—like those found in tools such as Lineal’s Amplify™ Snippets—help teams better understand how information appears within the broader dataset, rather than relying on isolated hits. Tools that identify and remove non-human or low-value communications—such as BotDetector—serve to eliminate noise early, allowing teams to focus on what truly matters.
Similarly, technologies that enable conceptual identification and categorization of visual data—like Images—allow teams to work more effectively across data types that have historically been difficult to analyze at scale. And structured process visibility—through solutions like Workflow—ensures that teams are not only executing tasks, but doing so in the right sequence and with clear alignment to the overall objective.
These types of tools are not meant to replace judgment—they enhance it.
Individually, each capability is useful. But their real value emerges when they are applied in sequence, as part of a broader system.
This is where many organizations struggle. The tendency is to reach for the most powerful tool first—to start with acceleration before understanding. But without context, even the most advanced tools can produce inconsistent or incomplete results.
The craft lies in knowing what to do first.
Preparation Is the Multiplier
The real power of generative AI emerges when it is embedded within a broader framework of process and analytics.
Before a single prompt is executed, there is critical work to be done: understanding the data landscape, reducing noise through early case assessment, identifying patterns and outliers, and structuring data in a way that aligns with the objectives of the matter.
This is where an analytics-led approach—often operationalized through structured methodologies such as Amplify™ Review—begins to take shape. By prioritizing early insight and decision-making, teams can significantly reduce the volume of data requiring downstream review while increasing overall precision.
By the time generative AI is introduced, the dataset should already be refined and contextualized. At that point, AI is no longer searching for signal in noise—it is accelerating decisions on data that already matters.
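The sequencing argument above can be sketched in a few lines of code. This is an illustrative toy only—the function names (`is_noise`, `dedupe`, `ai_classify`) are hypothetical stand-ins, not Lineal or Relativity APIs—but it shows the shape of the idea: noise removal and deduplication shrink the dataset first, so the expensive AI step runs only on material that has already been prepared.

```python
# Illustrative sketch: sequence matters in a review pipeline.
# All names here are hypothetical stand-ins, not real product APIs.

def is_noise(doc: dict) -> bool:
    """Stand-in for early noise removal (e.g., automated/bot messages)."""
    return doc.get("sender", "").endswith("@bot.example")

def dedupe(docs: list[dict]) -> list[dict]:
    """Stand-in for analytics-driven deduplication by normalized text."""
    seen, unique = set(), []
    for doc in docs:
        key = doc["text"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

def ai_classify(doc: dict) -> str:
    """Placeholder for a generative-AI relevance call, applied last."""
    return "responsive" if "contract" in doc["text"].lower() else "not responsive"

def review_pipeline(docs: list[dict]) -> list[tuple[str, str]]:
    # Measure and shape the material first; accelerate last.
    prepared = dedupe([d for d in docs if not is_noise(d)])
    return [(d["id"], ai_classify(d)) for d in prepared]

docs = [
    {"id": "1", "sender": "alerts@bot.example", "text": "Build passed"},
    {"id": "2", "sender": "a@corp.com", "text": "Please review the contract"},
    {"id": "3", "sender": "b@corp.com", "text": "please review the contract"},
    {"id": "4", "sender": "c@corp.com", "text": "Lunch on Friday?"},
]

results = review_pipeline(docs)
print(results)  # the AI step runs on 2 documents, not 4
```

Run with the ordering reversed—AI first, cleanup later—and the model burns effort classifying bot alerts and duplicates before anyone decides whether they matter. That is the blunt-instrument failure mode in miniature.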
This is the difference between using AI as a blunt instrument and using it as a precision tool.
Process Defines Outcomes
One of the most important shifts we are seeing is the realization that process is not just operational—it is strategic.
The sequence matters. Apply generative AI too early, and you introduce inconsistency and noise. Apply it too late, and you limit its value. Apply it at the right moment—after analytics, within a defined process—and you unlock exponential efficiency.
This is where organizations are beginning to separate themselves—not based on whether they have access to AI, but on how they deploy it.
Purposeful orchestration is becoming the differentiator.
Learning Through Iteration
Back in that workshop, not every cut was perfect. Adjustments were made. Techniques improved. Over time, the process became more refined.
The same is true with generative AI.
There is a learning curve—prompting strategies evolve, processes are refined, and feedback loops become critical. The organizations that will succeed are not the ones expecting perfection on day one, but those willing to iterate, measure, and adapt.
Generative AI is not a static tool; it is part of a dynamic system that improves with use.
From Tools to Outcomes
What we are seeing across the industry is a shift away from technology-first thinking and toward outcome-driven design.
When process, analytics, and generative AI are aligned, data volumes are reduced before review begins, generative AI operates on high-value datasets, human review becomes more focused and efficient, and costs decrease while defensibility increases.
This is not about choosing between process, analytics, or AI. It is about bringing them together in the right order, with the right intent.
Conclusion: The Craft Still Matters
Having the tools is not enough.
Success—whether in a carpenter’s workshop or a modern eDiscovery environment—comes from preparation, planning, and purposeful execution. Generative AI is one of the most powerful tools we’ve ever had, but its impact is defined by the environment in which it operates.
Measure the material. Build the plan. Apply the right tools at the right time.
Then get to work.
__
About Author
Marco Nasca is the Vice President of Sales at Lineal and a 2001 graduate of DePaul University College of Law. For more than two decades, he has worked at the forefront of eDiscovery and legal technology, advising corporations and law firms on the defensible application of technology in complex litigation, investigations, and regulatory matters. His work focuses on structured data, emerging technologies, and the practical implications of evolving judicial doctrine.
__
About Lineal
Lineal is an innovative eDiscovery and legal technology solutions company that empowers law firms and corporations with modern data management and review strategies. Established in 2009, Lineal specializes in comprehensive eDiscovery services, leveraging its proprietary technology suite, Amplify™, to enhance efficiency and accuracy in handling large volumes of electronic data. With a global presence and a team of experienced professionals, Lineal is dedicated to delivering custom-tailored solutions that drive optimal legal outcomes for its clients. For more information, visit lineal.com
