The promise of AI remains immense, but one thing may yet curb it. “The infrastructure that powers AI today will not sustain the demands of tomorrow,” a recent CIO.com article leads. “CIOs need to rethink how to scale smarter, not just bigger.”
CrateDB agrees – the database company is betting on solving the problem by becoming a “unified data layer for analytics, search, and AI.”
“The challenge is that most IT systems rely on, or are built around, batch or asynchronous pipelines. Today, there is a need to reduce the time between data production and consumption,” explains Stephane Castellani, SVP of Marketing. “CrateDB is a very good fit because it can really deliver insight from the right data across a huge variety of formats.”
A company blog post describes a four-stage process in which CrateDB functions as “connective tissue between operational data and AI systems”: from ingestion, to real-time aggregation and insights, to feeding data into AI pipelines, to enabling feedback loops between models and data. Both the speed and the diversity of data matter. Castellani emphasizes reducing query times from minutes to milliseconds. In manufacturing, for example, telemetry can be collected from machines in real time, enabling predictive-maintenance models to learn faster.
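As a rough illustration of the ingestion and real-time-aggregation stages, the sketch below talks to CrateDB’s documented HTTP `/_sql` endpoint. The URL, the `machine_telemetry` table, and its columns are assumptions for the example, not details from the article.

```python
import json
import urllib.request

# Assumed local CrateDB instance; adjust host/port for a real deployment.
CRATEDB_URL = "http://localhost:4200/_sql"

def make_sql_payload(stmt, args=None):
    """Build the JSON body that CrateDB's HTTP /_sql endpoint expects."""
    body = {"stmt": stmt}
    if args is not None:
        body["args"] = args
    return json.dumps(body).encode("utf-8")

def execute(stmt, args=None):
    """POST a parameterized SQL statement to CrateDB and return the JSON reply."""
    req = urllib.request.Request(
        CRATEDB_URL,
        data=make_sql_payload(stmt, args),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Ingestion stage: insert one telemetry reading (hypothetical table/columns).
# execute(
#     "INSERT INTO machine_telemetry (machine_id, ts, temperature) VALUES (?, ?, ?)",
#     ["press-01", "2025-01-01T00:00:00Z", 87.5],
# )

# Aggregation stage: average temperature per machine over the last minute.
# execute(
#     "SELECT machine_id, AVG(temperature) FROM machine_telemetry "
#     "WHERE ts > NOW() - INTERVAL '1 minute' GROUP BY machine_id"
# )
```

Because every statement is a single stateless HTTP call, the same pattern serves both the write-heavy ingest path and the millisecond-scale query path the article describes.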
As Castellani explains, there are other benefits. “Some customers use CrateDB in factories to support knowledge assistants,” he says. “If something goes wrong, the machine shows a specific error message, and an operator can ask: ‘I am not an expert on this machine. What does this mean, and how can I fix it?’”
AI, however, does not stand still for long. “We don’t know what it will look like in a few months or even weeks,” Castellani points out. Organizations aim to increase autonomy and move toward fully agentic AI workflows, but manufacturing is lagging behind the wider product and service industry, according to recent PYMNTS Intelligence research. CrateDB is partnering with Tech Mahindra on this front to deliver agentic AI solutions for automotive, manufacturing, and smart factories.
Castellani points to the excitement around the Model Context Protocol (MCP), which standardizes how applications provide context to large language models (LLMs). He compares it to the rise of enterprise APIs 12 years ago. Still experimental, CrateDB’s MCP server acts as a bridge between AI tools and the analytical database. “When talking about MCP, it’s pretty much the same approach (as with APIs), but with LLMs,” he explains.
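To make the API comparison concrete, here is a minimal sketch of the request/response shapes an MCP server exchanges, using MCP’s JSON-RPC 2.0 framing. The `query_database` tool, its schema, and the stubbed `run_query` results are hypothetical; a real server (such as CrateDB’s) would forward the SQL to the database.

```python
import json

# Hypothetical tool definition advertised to the LLM client via tools/list.
TOOLS = [{
    "name": "query_database",
    "description": "Run a read-only SQL query against the analytical store.",
    "inputSchema": {
        "type": "object",
        "properties": {"sql": {"type": "string"}},
        "required": ["sql"],
    },
}]

def run_query(sql):
    # Stub: a real MCP server would execute this against the database.
    return [{"machine_id": "press-01", "avg_temp": 87.5}]

def handle(request):
    """Dispatch one JSON-RPC request for tools/list or tools/call."""
    method = request["method"]
    if method == "tools/list":
        result = {"tools": TOOLS}
    elif method == "tools/call":
        rows = run_query(request["params"]["arguments"]["sql"])
        # Tool results are returned as content blocks the LLM can read.
        result = {"content": [{"type": "text", "text": json.dumps(rows)}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}
```

The parallel to enterprise APIs is visible in the structure: the server publishes a typed contract (`inputSchema`), and the client – here an LLM rather than another service – calls it with validated arguments.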
Tech Mahindra is just one of CrateDB’s important partnerships for the future. “We’re continuing to focus on the basics,” adds Castellani. “Performance, scalability… investing in the ability to ingest data from more and more data sources, and always keeping latency to a minimum, on both the ingest and query side.”
Stephane Castellani will speak on the topic of bringing AI to real-time data (with CrateDB) at AI & Big Data Expo Europe. You can see the full interview with Stephane below.