Real-time Database for Analytics & Decisioning
Combine live streaming data with stateful relational data, all in one database – without the need to tie together a collection of piecemeal technologies.
Combine Stream Processing with Relational Insight
Other data systems have critical flaws when it comes to real-time analysis on moving data: most aren't designed to handle high-cardinality joins or the constant aggregation and re-aggregation required as new data changes the picture. Stream processors lack relational sophistication. Batch processing in data warehouses is too slow and only gives a rear-view picture. And assembling a variety of specialized technologies to build real-time systems quickly becomes complex and fragile.

Example: A Bike Sharing Tracker
Take this example: a bike sharing company wants to track the status of bikes out for rent and generate notifications when a docking station is low on bikes.
You can combine a Kafka feed from the bike docking stations with static information on the stations themselves – data that might even be held externally. Brought together, these create a live chart of inventory and generate alerts when inventory falls below a set level.

CREATE OR REPLACE DATA SOURCE bk
LOCATION = ''
WITH OPTIONS (
    'kafka_topic_name' = 'station_status',
    credential = 'MY_CONFLUENT_CRED'
);

LOAD DATA INTO station_status
FROM FILE PATHS ''
FORMAT JSON
WITH OPTIONS (
    DATA SOURCE = 'bk',
    SUBSCRIBE = TRUE,
    'type_inference_mode' = 'SPEED'
);
Ingest Streaming Data from a Variety of Sources
Kinetica can ingest data at high speed from a variety of sources, and ingestion can be distributed over multiple nodes. Kinetica's Kafka Connector makes it simple to attach to high-volume streaming feeds.
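The data source above references a credential named MY_CONFLUENT_CRED. As an illustrative sketch (the identity and secret values here are placeholders – check the CREATE CREDENTIAL reference for your Kinetica version), a Kafka credential might be defined once and then reused by name:

```sql
-- Sketch: store the Kafka/Confluent auth details once,
-- then reference them from any data source by name.
CREATE OR REPLACE CREDENTIAL MY_CONFLUENT_CRED
TYPE = 'kafka',
IDENTITY = '<api key>',
SECRET = '<api secret>';
```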
Aggregate, Fuse, and Analyze
Streaming data can easily be fused with stateful data for context or aggregations.
CREATE OR REPLACE MATERIALIZED VIEW regression_view AS
SELECT
    DECIMAL(REGR_INTERCEPT(s1.total_capacity - s2.num_available, s2.last_reported)) AS b,
    DECIMAL(REGR_SLOPE(s1.total_capacity - s2.num_available, s2.last_reported)) AS a,
    s1.station_id
FROM station_information s1
JOIN station_status_historical s2
    ON s1.station_id = s2.station_id
GROUP BY s1.station_id;

CREATE OR REPLACE MATERIALIZED VIEW streaming_demand
REFRESH EVERY 1 MINUTE AS
SELECT
    ROUND(r.b + (r.a * DECIMAL(s2.last_reported))) AS bike_demand,
    s2.station_id,
    s2.num_available
FROM streaming_station_status s2
JOIN regression_view r
    ON s2.station_id = r.station_id;
Streaming Materialized Views
Create continuously updated views, graphs, or models – or simply make data available for query. Materialized views can be refreshed on query, on a time-based schedule, or on change, producing a real-time, high-throughput view of constantly changing data.
Here we create a new view to show the running demand for bikes at a station.
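The view above refreshes on a one-minute timer, but that is only one of the refresh modes. As a sketch (verify the exact clauses against the CREATE MATERIALIZED VIEW reference for your Kinetica version), the same view could instead be recomputed as the underlying streaming table changes:

```sql
-- Refresh modes (illustrative):
--   REFRESH ON QUERY       -- recompute when the view is queried
--   REFRESH EVERY 1 MINUTE -- recompute on a timed schedule
--   REFRESH ON CHANGE      -- recompute as source data changes
CREATE OR REPLACE MATERIALIZED VIEW streaming_demand
REFRESH ON CHANGE AS
SELECT
    ROUND(r.b + (r.a * DECIMAL(s2.last_reported))) AS bike_demand,
    s2.station_id,
    s2.num_available
FROM streaming_station_status s2
JOIN regression_view r
    ON s2.station_id = r.station_id;
```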
Event Triggers Based on Decisioning Rules
Create alerts on moving data. It's super simple to build sophisticated event-driven pipelines using just SQL, straight from Kinetica.
CREATE STREAM demand_alert ON TABLE streaming_demand
WHERE bike_demand > num_available
WITH OPTIONS (
    event = 'insert',
    datasink_name = '',
    increasing_column = 'last_reported'
);
More Examples...
Real-Time Risk Analysis
See how Kinetica is being used in financial services to provide a continuously running picture of exposure and risk.
Common Operational Picture
Defense and public safety organizations use Kinetica to provide real-time interactive dashboards for insights on rapidly evolving situations.
Cyber Threat Analysis
Watch how Kinetica can analyze over 2.5 billion rows of fast moving network data to understand and identify malicious threats at scale.
Book a Demo!
Sometimes marketing copy can sound too good to be true. The best way to appreciate the possibilities that Kinetica brings to real-time analytics at scale is to see it in action, or try it with your own data, your own schemas, and your own queries.
Contact us, and we can set you up with a demo and a trial environment for you to experience it for yourself.