What’s changing for data management in 2026? [Q&A]


Data is the lifeblood of enterprises, but it's only of value if it can be managed effectively and made available to support decision making when it's needed. The increased use of AI systems has stood the data management world on its head, so the old rules increasingly no longer apply.

We spoke to Vitor Avancini, CTO of AI and data consultancy Indicium, to find out how data management is changing and what’s ‘in’ and ‘out’ this year.

BN: What’s driving the biggest changes in enterprise data management as we head into 2026?

VA: Two forces are finally colliding in a productive way. On one side, enterprise data management has matured. On the other, AI systems now demand coherence, consistency, and trust in the data they consume. That pressure is forcing companies to stop tinkering on the edges and start transforming the core. The result is a very clear picture of what’s in and what’s out for 2026.

BN: Governance has always been a pain point. What’s in now that’s changing the game?

VA: Native governance. Tools like Unity Catalog, Snowflake Horizon, and AWS Glue Catalog are embedding governance into the platform itself rather than bolting it on. Automated data quality monitoring, anomaly detection, and usage intelligence now run continuously. But the important shift is philosophical. Automation handles detection, but humans still decide severity, escalation, and accountability. We’re rejecting the fantasy of fully automated governance and embracing a balanced model where machines inform and humans interpret.
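As a rough illustration of that "machines inform, humans interpret" split, consider an automated freshness check that flags an anomalous daily row count but only escalates it for human review rather than acting on its own. This is a minimal sketch, not any particular platform's API; the table and threshold are hypothetical.

```python
import statistics

def flag_anomaly(history, today, threshold=3.0):
    """Return True if today's row count is anomalous vs. history.

    The automation only *detects*; a True result means "escalate to a
    human," who decides severity and accountability.
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z_score = (today - mean) / stdev
    return abs(z_score) > threshold

daily_row_counts = [1000, 1020, 980, 1010, 995]
print(flag_anomaly(daily_row_counts, 400))   # flagged for review
print(flag_anomaly(daily_row_counts, 1005))  # within normal range
```

Real platforms run checks like this continuously across thousands of tables; the point is that the output is a signal to a person, not an automated remediation.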

BN: Are organizations really consolidating platforms, or is that just vendor marketing?

VA: It’s very real. The era of stitching together ten or fifteen separate tools is over. Complexity finally caught up with the Modern Data Stack. That’s why Databricks, Snowflake, and Microsoft are expanding into unified environments. The Lakehouse is emerging as the architecture of choice because it gives companies one environment for structured data, unstructured data, analytics, ML, and AI training. This isn’t vendor lock-in. It’s survival in a world where data volumes are exploding and AI requires consistency.

BN: The extract, transform, load (ETL) process has been declared ‘dead’ many times. What’s actually happening in 2026?

VA: What’s happening now is different. Hand-coded ETL is genuinely entering its final chapter. Python scripts and custom SQL jobs are simply too brittle for today’s needs. Managed orchestration systems like Databricks Lakeflow, Snowflake Openflow, and AWS Glue now cover ingestion through monitoring and recovery. And zero-ETL patterns are emerging as the ideal for real-time use cases. Many organizations are bypassing traditional pipelines entirely and replicating data instantly from operational to analytical systems.
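To make the "brittle" point concrete, here is a minimal sketch of the kind of hand-coded ETL job the industry is moving away from (the data and helpers are invented for illustration). Note how a malformed record is silently dropped, with no monitoring, retries, or lineage, which is exactly what managed orchestration systems handle for you.

```python
# A toy hand-coded extract-transform-load job.
def extract():
    # In a real script this would query a source system.
    return [{"id": 1, "amount": "10.50"},
            {"id": 2, "amount": "n/a"},
            {"id": 3, "amount": "7.25"}]

def transform(rows):
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amount": float(row["amount"])})
        except ValueError:
            continue  # bad row silently dropped: the brittleness in question
    return clean

def load(rows, target):
    # In a real script this would write to a warehouse table.
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # record 2 vanished with no alert or audit trail
```

Zero-ETL replication sidesteps this entire pattern by mirroring operational data into the analytical system directly, with no custom code in between.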

BN: Dashboards have been the standard for decades. Are they really out?

VA: Not overnight, but their dominance is fading fast. Business users don’t want static charts. They want answers, explanations, and context. Conversational analytics and agentic BI are stepping into that gap. Instead of clicking through filters, users can describe what they want or ask an AI agent why a metric changed. Early text-to-SQL tools tried to automate query writing. The new generation acts more like analysts, synthesizing insights and generating visualizations on demand.
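The gap between the early tools and the new generation is easiest to see in a toy example. A naive text-to-SQL translator like the sketch below (the table name and pattern are invented for illustration) can turn one phrasing of a question into a query, but it cannot explain *why* a metric changed, handle follow-ups, or synthesize an answer, which is where agentic BI differs.

```python
import re

SALES_TABLE = "sales"  # hypothetical table name

def naive_text_to_sql(question):
    """Map questions of the form 'total <metric> by <dimension>' to SQL.

    Anything outside that rigid pattern returns None, illustrating why
    early text-to-SQL tools frustrated business users.
    """
    match = re.match(r"total (\w+) by (\w+)", question.lower())
    if not match:
        return None
    metric, dimension = match.groups()
    return f"SELECT {dimension}, SUM({metric}) FROM {SALES_TABLE} GROUP BY {dimension}"

print(naive_text_to_sql("Total revenue by region"))
print(naive_text_to_sql("Why did revenue drop last quarter?"))  # None
```

An agent-style system, by contrast, would run the query, compare periods, and return a narrative answer with a supporting chart rather than a SQL string.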

BN: AI is reshaping storage strategy. What's changing under the hood?

VA: Two big shifts. First, vector-native storage is becoming essential because retrieval-augmented generation depends on embeddings. Databases must store vectors as first-class citizens. Second, Apache Iceberg is becoming the default open table format. It creates interoperability across engines, reduces duplication, and finally turns object storage into a unified foundation. Iceberg future-proofs data architecture in a way the industry has wanted for a decade.
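What "vectors as first-class citizens" means in practice is that retrieval becomes a similarity search over embeddings rather than an exact-match lookup. Here is a minimal, dependency-free sketch of that retrieval step, with toy three-dimensional embeddings standing in for the high-dimensional vectors a real model would produce.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# A toy "vector table": each row stores a document and its embedding.
rows = [
    {"doc": "quarterly revenue report", "embedding": [0.9, 0.1, 0.0]},
    {"doc": "office holiday schedule",  "embedding": [0.0, 0.2, 0.9]},
    {"doc": "annual sales summary",     "embedding": [0.8, 0.3, 0.1]},
]

def retrieve(query_embedding, k=2):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(rows,
                    key=lambda r: cosine_similarity(query_embedding, r["embedding"]),
                    reverse=True)
    return [r["doc"] for r in ranked[:k]]

print(retrieve([1.0, 0.0, 0.0]))  # finance documents rank highest
```

A vector-native database does exactly this, but with approximate-nearest-neighbor indexes so the search stays fast at millions of rows; that index support is what makes embeddings "first-class" rather than blobs bolted onto a relational table.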

BN: Let’s flip to what’s out. What are organizations actively moving away from?

VA: The extremes. Monolithic warehouses can’t handle unstructured data or real-time needs. Hyper-decentralized stacks created governance chaos. Hand-coded ETL, passive catalogs, manual stewardship, and static dashboards are collapsing under their own weight. Even rigid interpretations of Data Mesh have cooled as companies focus less on theory and more on AI readiness. And on-premises Hadoop clusters are disappearing because object storage with serverless compute is simply better in every dimension.

BN: If you had to summarize the mindset shift for 2026 in one idea, what would it be?

VA: Clarity. Companies are rejecting fragmentation, manual intervention, and analytics that can’t communicate. The future belongs to unified platforms, native governance, vector-ready storage, conversational analytics, and self-maintaining pipelines. AI isn’t replacing data management. It’s rewriting the rules to reward simplicity, openness, and integrated design.

Image credit: Islander11/depositphotos.com