
Bruce Broussard: Why insurance data infrastructure must come before AI adoption

The Percipience co-founder and managing partner has spent four decades watching insurers make the same mistake: installing the next big thing while ignoring legacy issues.

Throughout his four-decade career in insurance technology, Percipience co-founder and Managing Partner Bruce Broussard has watched a dismal pattern repeat itself relentlessly. To break the cycle, Broussard and the company's other co-founder and managing partner, Ajay Kelshiker, started Percipience five years ago.

Fifteen years at IBM designing enterprise data solutions for carriers like Allstate, State Farm, and Progressive, followed by six and a half years launching Insurity's first data software product, led Broussard to question why insurers invest millions in policy administration systems, then billions more in time, staffing, and training trying to consolidate them. Often those systems failed, growing obsolete by the time they were completed.

Now, with artificial intelligence having passed the emergence stage at insurance organizations large and small, Broussard is watching the old tech habits kick in again.

Many insurance organizations are investing heavily in AI, but without a strong insurance data infrastructure, these initiatives often fail. Instead of building a unified data foundation, insurers attempt to layer AI on top of fragmented systems—policy, billing, and claims platforms that were never designed to work together.

This approach leads to inconsistent insights, unreliable models, and ultimately failed implementations. AI does not fix operational or data fragmentation; it amplifies it. When data is siloed, incomplete, or misaligned, AI simply scales those inefficiencies across the organization.

The reality is that insurance data infrastructure is not just a technical prerequisite—it is a strategic foundation. Without a unified, governed data layer that creates a single source of truth, insurers cannot fully unlock the value of AI, analytics, or digital transformation initiatives.

“A lot of insurance organizations are skipping the essential enterprise data layer and are going straight to launching their AI systems,” says Broussard. “They try to deploy AI as a use case before they’ve established an enterprise data foundation, building point solutions on top of siloed data buried in policy, billing, and claim systems.”

The result is a litany of failures, Broussard says: insights that don't reconcile across the organization, models working from inconsistent data, and ultimately, failed implementations that join the graveyard of ambitious technology projects.

The necessary grunt work nobody wants

Broussard doesn’t mince words about the role Percipience, a Mandeville, Louisiana-based insurtech data and analytics provider, is meant to play. “It’s not the sexiest kind of business,” he says. “But it’s the plumbing that you need to get to the sexier piece.”

That plumbing is made manifest in the form of Percipience’s Data Magnifier platform. It’s intended to act as “the Rosetta Stone for insurance,” Broussard says. In essence, Data Magnifier translates data from disparate transaction systems into a unified enterprise foundation. That it is unglamorous work may explain why there’s so little competition in the space, Broussard says.

"The core vendors are primarily dealing with data coming out of their core system, as opposed to all of the data coming out of all of the systems," Broussard explains. Beyond policy, billing, and claims systems, insurers operate accounting platforms, investment systems, underwriting tools, and actuarial modeling constructs, and often receive Excel files or bordereaux in PDF format from Managing General Agents (MGAs).

“The ability to take all of the data from every source and bring it in — that’s not something those packages have typically done well,” Broussard contends.
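To illustrate what that kind of translation involves, here is a toy sketch in Python. The field names, schema, and systems are invented for illustration; they are not Percipience's actual data model. The idea is simply that each source system speaks its own dialect, and a unified layer maps them onto one canonical record keyed to the same policy.

```python
from dataclasses import dataclass

# Hypothetical source rows: each transaction system uses its own field names.
policy_system_row = {"pol_no": "PA-1001", "insured_nm": "J. Smith", "prem": 1200.0}
claims_system_row = {"claimPolicyRef": "PA-1001", "claimant": "J. Smith", "paid_amt": 450.0}

@dataclass
class UnifiedRecord:
    """One row in an illustrative canonical enterprise data model."""
    policy_number: str
    party_name: str
    written_premium: float = 0.0
    paid_losses: float = 0.0

def load_policy(row: dict) -> UnifiedRecord:
    # Map the policy system's field names onto the canonical schema.
    return UnifiedRecord(row["pol_no"], row["insured_nm"], written_premium=row["prem"])

def apply_claim(rec: UnifiedRecord, row: dict) -> UnifiedRecord:
    # Reconcile the claim against the same policy key before merging it in.
    assert row["claimPolicyRef"] == rec.policy_number, "claim does not match policy"
    rec.paid_losses += row["paid_amt"]
    return rec

record = apply_claim(load_policy(policy_system_row), claims_system_row)
print(record.policy_number, record.written_premium, record.paid_losses)
# PA-1001 1200.0 450.0
```

In a real platform the mappings number in the thousands and come with governance, lineage, and reconciliation rules, but the core move is the same: one agreed-upon schema that every source is translated into.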

The distinction matters because transaction systems and data systems are fundamentally different. A personal auto insurance application “hasn’t changed very much in the last 50 years,” Broussard notes. The data coming in remains relatively static. But the data insurers need in order to evaluate risk, design products, and manage claims evolves constantly, sometimes within months.

“Dealing in data systems is a very different beast than dealing with policy, billing, or claim systems,” he says. “Just because you’re good at one doesn’t mean you’re good at another.”

Where policy systems were in 1989

The insurance data platform market today occupies roughly the same stage of maturity that policy administration systems did in the late 1980s and early 1990s. In Broussard’s view, it’s nascent, fragmented, and dominated by custom builds.

Before 2013, none of the major core vendors offered a packaged data solution, he says. Insurity pioneered the space with Insurance Enterprise View (later rebranded DataHouse), a product Broussard helped bring to market. Other vendors quickly followed. So did consolidation.

The big problem, as Broussard tells it, is that these consolidated offerings weren't engineered from the ground up as enterprise data platforms. "They were really trying to solve reporting layers on top of their transaction processing capabilities," Broussard says.

As a result, more than 50% of insurers still build their own data solutions, either from scratch or through consulting engagements with firms that construct custom systems. It's an expensive, time-consuming approach that often produces static solutions unable to keep pace with new data sources, analytics tools, and AI capabilities.

“Custom builds are problematic once they’re delivered,” Broussard explains. “They exist as they exist, and if you don’t have the resources to continue to invest in and advance them, it’s hard to keep up with all the changes and opportunities that present themselves in data.”

For insurers buying vendor solutions, the challenges are different but equally frustrating.

Proprietary systems lock customers into specific platforms and tools. Source code remains hidden. Changes require going back to the vendor or certified system integrators, which is generally an expensive proposition. And nothing guarantees that the vendor's priorities will align with the customer's needs.

“In the data and analytics world, new data sources, new analytics capabilities, new tooling, and new platforms are expanding monthly,” Broussard says. “If you don’t keep up with that and you don’t have a mechanism to keep up with that, your solution isn’t going to be very current for very long.”

The MSIG proof point

In June 2025, MSIG USA won a Celent Model Insurer Award for enterprise-wide digital transformation powered by Data Magnifier. The recognition validated Percipience’s approach: the carrier integrated 17 systems across 16 lines of business and 30-plus years of history in 10 months, achieving significant reductions in underwriting processing time, measurable improvements in underwriting performance, and substantial cost savings.

The timeline stands in stark contrast to traditional enterprise data implementations.

Traditionally, organizations spend 12 to 18 months building a data model before integration work can begin, Broussard says. And that time frame assumes they have the talent; Broussard estimates fewer than two dozen people in the U.S. are capable of doing the work at enterprise scale. "I don't think we've had any of our customers take longer than 10 months to get into production. And we've had customers in production in 10 weeks," Broussard says.

The acceleration comes from “productizing” the most challenging work. “Getting the data model is the hardest part and the most important part of the construct,” Broussard says. “Nothing really happens in a data warehouse, data lakehouse, or analytics foundation build without a solid data model.”

Data Magnifier delivers a fully baked data model out of the box, Broussard says, allowing mapping and integration to start on day one. The platform includes industry-specific accelerators and APIs that understand personal lines, commercial lines, specialty products, and regulatory reporting requirements. “This is a product that works,” Broussard emphasizes. “It’s been proven at enterprise scale. It’s not a proof of concept that someone wants to see if they can get value from.”

Beyond implementation: total cost of ownership

Because Broussard and his co-founder Kelshiker are both former insurance chief information officers, they developed Percipience's platform for today's CIOs, designing Data Magnifier with long-term viability in mind. "We're not looking for something we can sell to somebody," Broussard says. "We tried to build something that would work for people. We wanted this to be a solution that would last 20, 25, 30 years or more."

That philosophy reveals itself in several ways. Unlike proprietary vendor platforms, Percipience provides full source code, complete data model designs, and comprehensive documentation. That gives customers everything they need to control their environment without incurring the usual "tech debt" that demands more and more software add-ons, Broussard says. The company has trained clients to implement the product entirely themselves after two to three weeks of instruction.

Platform independence is another critical differentiator. Data Magnifier runs on Google Cloud, AWS, and Azure with various databases and integration tools. “We know that we can move from platform to platform and tool to tool,” Broussard says. “People are making huge investments in data and analytics right now. They don’t want it wiped out and lost when platforms change.”

A decade ago, nobody in insurance discussed cloud deployment. Now it dominates every conversation. Broussard doesn’t know what the landscape will look like in another six to eight years, but he’s confident Data Magnifier will work on it.

The AI foundation thesis

For MSIG USA, the data foundation enabled the next phase: integrating a new policy system and AI-based underwriting platform from Convrt. Percipience structured the integration to accommodate new capabilities as Convrt continues expanding its end-to-end functionality.

It’s the sequence Broussard believes insurers must follow. Establish the enterprise foundation first with a governed, reconciled single source of truth across all systems. Only then does AI become far more powerful and far less risky, because every model is working from consistent, explainable data that can be verified and validated.

Can insurers derive some AI benefit without fixing all their data problems? “Absolutely,” Broussard acknowledges. But significant enterprise transformation requires a solid, broad enterprise-level data foundation.

The foundation metaphor isn’t accidental. Just as buildings require structural integrity before adding floors, insurance organizations need data infrastructure before layering on advanced analytics.

Owning the competitive advantage

Percipience set out to solve three core problems: integrating data from multiple vendors and homegrown systems into a coherent foundation; enabling insurers to own their data constructs without requiring specialized expertise; and protecting investments in analytics, machine learning, and AI from becoming obsolete when underlying platforms change.

The urgency has only intensified. Margins continue to tighten, and competition is increasing. Organizations that can use information effectively will gain a competitive advantage.

That advantage requires speed. Five-year project planning cycles don’t work in data and analytics, where requirements evolve within months.

With the right platform in place, insurers can respond quickly to new MGA partnerships, regulatory requirements, third-party data sources, or evolving business needs.

Without that foundation, insurers face a harder truth: no matter how sophisticated their AI ambitions, they’re building on sand.

“Only God knows what the business needs are going to be three or four years from now,” Broussard says. “But if you’ve got a foundation with all of your data aggregated in a single source of truth that is auditable and trusted, answering the next set of questions becomes much easier.”
