
Towards a Next-Generation Foundation for Managing Real-Time Data

Hello world! My name is Paul Appleby and I’m thrilled to be taking on the role of CEO at Kinetica. I couldn’t be more excited about the opportunity, because how we manage data is at the epicenter of the 4th Industrial Revolution, impacting every industry and every region across the globe.

When I was at Salesforce several years ago, a journalist somewhat facetiously asked me if I believed social and cloud were fads. To his shock, I responded with an emphatic “YES!” At the time, I believed that social was the gateway to the digital marketing revolution and that cloud computing would simply become computing. I see a similar trend around big data, a term thrown around so often it has lost meaning.

Simply put, using data for deep learning and insight is no longer a science project. The next wave of data evolution is about ingesting huge oceans of data and using deep learning to create transformative digital relationships. Enterprises are doubling down on using data to build machine learning algorithms, creating core IP that drives growth and differentiation. According to the World Economic Forum, half of the companies on the Fortune 500 have disappeared from the list since the year 2000. New upstarts able to extract real-time value from data have taken the opportunity to disrupt and supplant enterprise giants.

The opportunities are vast – whether it is processing retail data to drive a next-generation omni-channel experience; using health data for predictive analytics and preventive medicine; improving logistics companies’ route efficiency by analyzing location data in real time; or analyzing user preferences to recommend new products or services. In all these examples, effectively using data in real time is the essential ingredient that separates the winners from the losers.

However, there’s much work to be done. The big data models of yesteryear are fragmented and not designed for today’s challenges. Traditional approaches endorsed a bimodal model that separated operational data from analytical data, and unstructured data from structured data. These systems were designed on the assumptions that memory was expensive and compute was limited, neither of which holds true today. As data platforms evolved, they grew in complexity and relied solely on human insight. Every time a new problem arose, a new add-on component was developed that had to be pieced together with the rest. The end result was a brittle, “duct-taped” architecture that could not keep pace with the needs of the modern business.

Bringing this full circle, we as a community need to rethink how businesses approach big data and deep learning: not only collecting and analyzing data, but also extracting real-time value through machine learning. Customers want a unified data platform that serves business owners, data scientists, and technologists. A platform purpose-built for the IoT and for levels of data velocity that grow to unprecedented heights year by year. The Kinetica vision is to deliver a next-generation foundation that combines blazing-fast data ingestion, real-time analysis, geospatial visualization, and machine learning, running on an advanced compute foundation that harnesses GPU and advanced CPU power. Essentially, we want to arm our customers with the ability to transform data into a digital currency that yields deep insights, driving better, smarter, faster decisions that grow top-line revenue and competitive differentiation.

We are here to help you in this transformation, and I welcome your thoughts.
