Ingestion controls are now as critical as core technology capabilities
When ingestion breaks, trust breaks. And lately, trust is in short supply. That was the undercurrent running through Shield’s recent expert panel, where industry leaders pulled back the curtain on why data ingestion controls are make-or-break for compliance.
In our recent webinar, “Data you can Defend: Operationalizing trust at the point of ingestion”, I had a conversation with Therese Craparo, partner at Reed Smith, and Daniel Ihrig, eDiscovery & Surveillance Architect at Macquarie Group.
If you missed the live discussion, we've captured the most important insights below. And if you want to watch the full session, click here.
The foundation shifted
As communication channels multiply, so do the risks. “There’s a communication feature in almost every application that our clients are using today,” said Craparo. From WhatsApp to Bloomberg to Salesforce, every new touchpoint is a potential weak link—and each demands its own ingestion rules.
Teams can’t rely on internal policies or manual oversight to keep up. More channels mean more failure points. What worked five years ago simply doesn’t scale.
Successful ingestion is a day one requirement
Ihrig, who has decades of hands-on experience ensuring data integrity in complex regulatory environments, sees a clear shift: “Completeness and reconciliation controls are no longer considered a day two activity.” These controls now belong in implementation plans, not post-launch audits.
Alex de Lucena made the stakes clear with an analogy: "It's as if, in the home-buying process, the inspection became equal to, or happened at the same time as, actually seeing the house. You love the closet space, but does the roof leak?"
In other words, verifiable ingestion can’t be a box checked later. It must be foundational from the start. Institutions expect end-to-end monitoring baked in—not bolted on.
Strong governance begins with three essential checks (sketched in code just after this list):
- Did we get the data? This first checkpoint confirms data arrived from upstream systems. Some sources offer detailed manifests. Others don’t. Good ingestion frameworks adapt to both.
- Did it reach the archive? Automated pipeline checks verify data made it into your systems intact and deduplicated. If 400 messages become 385, is that noise or signal?
- Can we search and act on it? Precision matters. “If I’m searching for everything for ‘Tom Rimmer,’ and the system records it as ‘Trimmer,’ I’m going to miss a whole chunk of data,” said Rimmer. “You’ve got to test your use cases at onboarding—not after the fact.”
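To make the reconciliation idea concrete, here is a minimal sketch of the kind of automated check the panel described. Every name in it (the IngestionReport record, the reconcile helper, the sample counts) is hypothetical and for illustration only; it is not Shield's product code or Macquarie's implementation.

```python
# Illustrative sketch only: names, fields, and counts are hypothetical.
from dataclasses import dataclass

@dataclass
class IngestionReport:
    channel: str
    expected: int        # count from the upstream manifest, if one exists
    received: int        # messages that landed in the archive
    deduplicated: int    # exact duplicates dropped on ingestion

def reconcile(report: IngestionReport) -> list[str]:
    """Run the first two checks from the panel: arrival and completeness."""
    findings = []

    # 1. Did we get the data? With no manifest there is no baseline to reconcile against.
    if report.expected == 0:
        findings.append(f"{report.channel}: no upstream manifest, completeness unverified")

    # 2. Did it reach the archive? Every dropped message must be explained by
    #    deduplication; anything left over is a gap someone has to own.
    accounted_for = report.received + report.deduplicated
    if report.expected and accounted_for != report.expected:
        findings.append(
            f"{report.channel}: {report.expected - accounted_for} messages unaccounted for"
        )

    return findings

# 3. Can we search and act on it? That check belongs in onboarding test cases,
#    e.g. asserting that a search for a seeded name returns the seeded messages.
print(reconcile(IngestionReport(channel="WhatsApp", expected=400, received=380, deduplicated=5)))
```

The point is not the code itself: it is that a gap of 15 messages out of 400 is either explained by deduplication or surfaced as a finding that gets escalated, not quietly absorbed.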
From vendor to partner
The best vendors aren’t just tech providers. They’re proactive collaborators.
“You’re not a technology provider, you’re a service provider,” said Craparo. That includes flagging systemic ingestion issues early. Vendors see patterns across firms—and must share them.
Some don’t. One vendor reportedly said, “Why would I ever tell you if I knew there was a problem with my system?” That’s not just a trust issue. It’s a risk to your entire compliance program.
Know your channels, know your controls
For compliance teams under pressure, Craparo offers clear advice: “Know your channels. Have a record of what feeds your archive. Know the controls that govern each one.”
It’s not about being flawless. It’s about being defensible. “I can defend a process,” she said. “I can’t defend a guess.”
Rimmer added that many institutions are moving beyond reactive monitoring. “There’s more demand for flexibility in reports. Execs want more detail. They’re asking: Can we build our own dashboards? Can we query it with APIs?”
The tools are here. AI agents can map data flows, flag anomalies, and even retrieve missing records. But trust in the tech—and transparency about how it works—are lagging.
“The industry would be interested in seeing that,” Craparo noted. But teams need to understand what they’re adopting. Accountability doesn’t stop at the user interface.
The window for readiness is closing
As de Lucena put it: “Clients are excited about AI, about false positive reduction—but they want to see the numbers first. If they can’t account for everything going into the platform, they don’t trust what comes out.”
Rimmer agreed: “The legacy approach was: is the channel working because I’m receiving something? People now understand—that’s not good enough.”
History shows enforcement takes time. But when it comes, it hits hard. Off-channel communication violations were years in the making.
“Don’t bury your head in the sand,” Craparo warned. “Understand what it is and be ready, because you will see it.”
De Lucena reinforced the urgency: “The tools exist. What’s missing is confidence in how they work. We can visualize every error, every failed message, everything we dedupe. But adoption lags because trust lags.”
The bottom line? If you can’t prove what you ingested, you can’t protect your institution.
Now is the time to fix it.
Shield makes this possible—giving firms real-time ingestion monitoring, defensible audit trails, and data you can actually trust. Because compliance starts with knowing what’s coming in—and being ready to prove it.
Reach out now to see how we can help you make sure data ingestion is never an afterthought.