Things are getting smaller. Not just in terms of gadget miniaturization, medical nanotechnology, increasingly sophisticated industrial electromechanical units and the so-called shrinkflation that leaves our candy bars thinner or shorter at the same price. Data is getting smaller too.
Data is getting smaller in two key ways: a) we decompose application data streams into smaller containerized elements so they can work inside similarly compartmentalized and containerized application services; and b) the time windows in which a business must react to data events are shrinking.
It is this second, temporal constraint that brings us back to the reality of real-time data and the need to be able to work with it.
There’s no real-time data, really
As far as the actual workings of the space-time universe we live in are concerned, real-time data is something of a misnomer: there is always a cost in time that must be paid for data to exist and to travel. Data can move at the speed of light, but that is still a finite speed. When we talk about real time, we mean data transports that operate fast enough that a human cannot perceive any lag. Real time, then, expresses a human perception of time rather than a machine perception or definition.
All of this is important because we are now supposed to be adopting the world of Industry 4.0, where our factories are run by AI-augmented intelligence and intelligent automation. But manufacturers may not be ready for Industry 4.0 if they face complex data issues caused by production bottlenecks stemming from disparate information systems within an organization, many of which will still require human intervention: from manually keying sensor readings into databases, to inefficient out-of-the-box health monitoring systems, to a lack of integration with enterprise resource planning (ERP) systems.
KX, headquartered in Palo Alto, is keen to right a few wrongs in this space. Known as both KX and KX Systems, the company is recognized for its work analyzing high-speed, real-time data streams inside intelligent systems that can simultaneously take on workloads involving historical data.
The analytical maturity curve
Examining the speed of today's industrial data processing and the drive to reach a Nirvana state of continuous, rapid and intensive data analysis, KX describes the evolutionary state of a given company as its point on the data "analytics maturity curve." Marketing's frantic naming attempts notwithstanding, KX has a point: the business window for creating differentiated value is shrinking for organizations in all markets and industries. Logically, the faster they can act on information derived from data created in the moment, the better the outcome.
As KX CTO Eric Raab has previously stated: "The opportunities for streaming analytics have never been greater. In fact, according to my company's research, 90% of companies believe that to stay competitive over the next three years, they need to increase their investment in real-time data analytics solutions. Whether it's a financial institution that needs to adjust client portfolio metrics based on ever-changing stock prices, a utility monitoring electric grid throughput or an e-commerce site that needs to generate a monthly report, achieving data accuracy at speed is extremely difficult."
What kind of data analysis can we get from enterprise software platforms that can run at this kind of speed? According to KX, finding (and acting on) anomalous data will be a key use case.
Generally defined as data points, events or observations that fall outside the normal behavior of a dataset, anomalous data can be a key indicator and flag, alerting a business that something has already caused (or is likely to cause) a problem somewhere in the company.
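KX does not detail its detection methods here, but the idea of spotting data points outside a dataset's normal behavior can be sketched with a simple rolling z-score over a stream of sensor readings. This is a minimal illustration only, with hypothetical names and thresholds, not KX's actual implementation:

```python
from collections import deque
import math

def detect_anomalies(stream, window=20, threshold=3.0):
    """Flag readings that deviate more than `threshold` standard
    deviations from the rolling mean of the previous `window` readings.
    A simple illustrative technique, not a production detector."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(stream):
        if len(recent) == window:
            mean = sum(recent) / window
            var = sum((x - mean) ** 2 for x in recent) / window
            std = math.sqrt(var)
            # Only flag once we have a stable baseline to compare against
            if std > 0 and abs(value - mean) / std > threshold:
                anomalies.append((i, value))
        recent.append(value)
    return anomalies

# A steady, slightly oscillating sensor feed with one spike at index 25
readings = [10.0 + 0.1 * (i % 5) for i in range(50)]
readings[25] = 42.0
print(detect_anomalies(readings))  # → [(25, 42.0)]
```

In a real streaming deployment this comparison would run continuously against live data rather than a finished list, which is exactly the time-window pressure the article describes.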
“The ability to quickly detect and respond to abnormal incidents is essential, particularly because gaining the ability to react in real time can limit the cost of anomalies. In addition to preventing problems from persisting within the business, embracing real-time data can also improve process efficiency. The types of positive [advancements and innovations possible here include] faster services, increased sales, higher product quality and lower prices, showing how wide and varied the impact of real-time data can be,” notes KX in its research report on the speed of business value creation.
The company emphasizes that using real-time data systems brings productivity gains by reducing the man-hours spent processing and managing data. This type of platform allows users to automate complex workflows that would otherwise be time-consuming, and to use tried-and-tested machine learning (ML) models that provide a level of practical, actionable insight to drive direct commercial action.
The Path to Microsecond Business
If, collectively, we have walked through this argument and agreed (even by a percentage point) that we need to focus more on real-time data and on analytics technologies that can work with complex, high-throughput information sources, then we might be on the way to implementing platforms such as KX and/or its competitors.
KX is not the only fruit on this tree. A list of data streaming specialists worth noting today might include Confluent for its fully managed Kafka services, Tibco for its Tibco Spotfire product, Amazon Web Services for Kinesis, Microsoft Azure for its IoT offerings and, of course, Apache Kafka itself for open source purists. That's not to say KX isn't special; it simply underscores, and perhaps validates, the company's position in what is clearly a well-defined technology discipline working to solve a critical need.
Companies in any industry vertical implementing this level of technology are on the path to what we may soon call “microsecond business operations,” a term that may stick.