Precisely, a global leader in data integrity, has unveiled the results of its latest study, “Observability for AI Innovation,” conducted in collaboration with the Business Application Research Centre (BARC). The study surveyed over 250 data and AI stakeholders across the globe, revealing key insights into how organizations are leveraging observability to achieve trusted AI and analytics outcomes. The research emphasizes the growing importance of AI observability and its role in building transparent, accountable, and high-performance AI systems.
Findings: Observability is Gaining Ground, but Challenges Remain
Progress in Observability Programs
According to the study, 76% of organizations have formalized or optimized data quality and data pipeline observability programs, demonstrating a strong commitment to building trustworthy AI foundations. AI/ML model observability lags slightly behind at 70%, and maturity levels still vary widely across organizations.
Despite these positive trends, the study found that many organizations still struggle with inconsistent or underdeveloped observability programs. The lack of defined metrics and of alignment with enterprise-wide governance frameworks presents a significant challenge: without well-structured observability measures, organizations risk falling short of their AI objectives.
Measuring Success: A Need for Clear Metrics
The research revealed that 68% of respondents use both qualitative and quantitative metrics to measure the success of their observability efforts. However, 32% still rely on ad hoc methods or have no formal measurement practices at all. This gap in measurement can impede the optimization of observability efforts and limit the effectiveness and scalability of AI and data initiatives.
The Rise of Unstructured Data and Its Impact on Observability
One of the emerging trends in the research is the growing focus on unstructured data. 62% of organizations are exploring the use of semi-structured data (e.g., JSON or log files), with 28% actively using it. Additionally, 60% are currently evaluating the use of unstructured documents such as text, images, video, and sound. These findings underscore the rising importance of handling diverse data types, particularly as advanced use cases like predictive machine learning and Generative AI increasingly depend on them.
Observing unstructured data requires more sophisticated techniques than those used for traditional structured data, including careful management of object metadata so that it is tracked and appended accurately.
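To make that idea concrete, here is a minimal sketch, not taken from the study or any Precisely product, of how a data team might record and append object metadata for an unstructured file: a content hash to detect silent changes, an ingestion timestamp, and a running log of processing steps. All file names, fields, and functions are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from hashlib import sha256
from pathlib import Path


@dataclass
class ObjectMetadata:
    # Hypothetical metadata record for one unstructured object (image, PDF, audio, ...).
    source_path: str
    content_sha256: str
    ingested_at: str
    processing_steps: list[str] = field(default_factory=list)

    def append_step(self, step: str) -> None:
        """Append a timestamped processing step so the object's history can be audited."""
        self.processing_steps.append(
            f"{datetime.now(timezone.utc).isoformat()} {step}"
        )


def ingest(path: Path) -> ObjectMetadata:
    """Hash the raw bytes at ingestion so later reads can detect silent modification."""
    digest = sha256(path.read_bytes()).hexdigest()
    return ObjectMetadata(
        source_path=str(path),
        content_sha256=digest,
        ingested_at=datetime.now(timezone.utc).isoformat(),
    )


if __name__ == "__main__":
    # Hypothetical usage: each pipeline stage appends to the same metadata record.
    meta = ingest(Path("contract_scan.pdf"))  # illustrative file name
    meta.append_step("ocr_extracted_text")
    meta.append_step("embedded_for_vector_search")
    print(meta)
```

In practice, such records would typically live in a data catalog or observability platform rather than in application code, but the principle is the same: every transformation appends to the object’s metadata so the asset remains traceable end to end.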
Regional Insights: North America Leads in AI and Observability Maturity
The study also uncovered significant regional differences in AI adoption and observability maturity. North American firms reported notably higher adoption rates and maturity in their observability programs than their European counterparts: 88% of North American organizations have formalized observability efforts across various domains, compared to only 47% in Europe.
Moreover, North American companies place heavier emphasis on regulatory compliance and data privacy, despite the absence of federal AI regulations comparable to the EU’s AI Act. They also prioritize model accuracy, with twice as many organizations having formal observability measurements in place as their European counterparts.
The Growing Importance of Observability for AI
Cameron Ogden, Senior Vice President of Product Management at Precisely, stated, “As AI and the rise of agentic use cases increase the risks and rewards of analytics, data teams are fortifying their observability programs to improve data governance and quality. This research reinforces that observability is not just a ‘nice-to-have,’ but a foundational capability for ensuring the integrity of enterprise data, especially when powering AI models for trustworthy and scalable outcomes.”
The findings from Precisely’s latest global research underscore the growing importance of AI observability as organizations scale their AI and analytics initiatives. While progress is being made, many companies still face challenges in ensuring consistent observability, particularly in managing unstructured data and defining clear success metrics. The study highlights that robust observability programs are essential for fostering trusted AI and ensuring data integrity, both foundational to scalable, AI-driven success.