
Kinetica Developer Edition

Kinetica can be quickly installed and run on your local machine as a Docker container. The Developer Edition has most of the essential core functionality of Kinetica, and comes complete with pre-loaded examples and datasets.

Follow the steps below:

  1. Get the Container

    Windows (requires Docker with at least 8GB of RAM allocated, and Windows 10 (2017) or later):

    curl -o kinetica.bat <download URL> && .\kinetica.bat start

    Linux/Mac (requires Docker with at least 8GB of RAM allocated):

    curl -o kinetica <download URL> && chmod u+x kinetica && ./kinetica start

    Kinetica can also be set up in the cloud on AWS or Azure. The managed service starts at only $1.50/hr and allows you to scale as you grow.

  2. Start Kinetica

    The kinetica shell script manages your installation: it starts and stops the database.

    Start Kinetica:
    ./kinetica start

    Stop Kinetica:
    ./kinetica stop

    For more commands and configuration options:
    ./kinetica --help
  3. Open Kinetica Admin

    Access localhost:8080/gadmin using your browser to set your password.

  4. Launch Workbench

    Workbench is an interactive notebook environment for querying, analyzing, and visualizing your data with SQL. Launch it in your browser at localhost:8000/workbench/. The default username is admin.
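The kinetica script in step 2 is essentially a thin management layer over Docker (the Developer Edition runs as a Docker container). As a rough sketch of the idea only, where the image name, container name, and port mappings are illustrative assumptions and not taken from the actual script, a minimal start/stop dispatcher could look like this:

```shell
#!/bin/sh
# Sketch of a start/stop wrapper around Docker, in the spirit of the
# kinetica script. IMAGE, NAME, and the port mappings are hypothetical
# placeholders for illustration only.
IMAGE="kinetica/kinetica-dev"   # hypothetical image name
NAME="kinetica-dev"             # hypothetical container name

# Run a command, or just print it when DRY_RUN=1.
run() {
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "+ $*"
    else
        "$@"
    fi
}

kinetica_ctl() {
    case "$1" in
        start) run docker run -d --name "$NAME" -p 8080:8080 -p 8000:8000 "$IMAGE" ;;
        stop)  run docker stop "$NAME" ;;
        *)     echo "usage: kinetica_ctl {start|stop}" >&2; return 1 ;;
    esac
}

# Dry-run demo: print the docker commands instead of executing them.
DRY_RUN=1
kinetica_ctl start
kinetica_ctl stop
```

The real script adds more (installation management, configuration options, a --help listing), but the shape above is why a single file can start and stop the whole database.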
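After ./kinetica start returns, the container can take a little while before GAdmin and Workbench answer requests. A small helper like the one below, which is just a sketch of one way to wait for the two endpoints from steps 3 and 4, polls each URL until it responds:

```python
# Sketch: poll Kinetica's local endpoints until they answer HTTP requests,
# so you know when it is safe to open GAdmin or Workbench in a browser.
# The ports and paths are the defaults from the steps above; adjust them
# if you have remapped ports in Docker.
import time
import urllib.error
import urllib.request


def wait_for(url: str, attempts: int = 30, delay: float = 2.0) -> bool:
    """Return True once `url` answers an HTTP request, False after `attempts` tries."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except urllib.error.HTTPError:
            return True  # the server answered, even if with an error status
        except (urllib.error.URLError, OSError):
            time.sleep(delay)
    return False


if __name__ == "__main__":
    for name, url in [
        ("GAdmin", "http://localhost:8080/gadmin"),
        ("Workbench", "http://localhost:8000/workbench/"),
    ]:
        status = "up" if wait_for(url, attempts=3, delay=1.0) else "not responding"
        print(f"{name}: {status}")
```

An HTTP error status still counts as "up" here, since it proves the service is listening; a redirect to a login page, for example, is a normal healthy response.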

Feature Matrix

How does the Developer Edition compare to Kinetica Enterprise Edition?
| Feature                           | Developer Edition | Enterprise Edition |
| --------------------------------- | ----------------- | ------------------ |
| Free for Personal Development Use | ✓                 |                    |
| SQL Analytics                     | ✓                 | ✓                  |
| Spatial Functions                 | ✓                 | ✓                  |
| Graph Server / Solvers            | ✓                 | ✓                  |
| WMS Visualization                 | ✓                 | ✓                  |
| Row/Column Level Security         | ✓                 | ✓                  |
| Cluster/Ring Resilience (HA)      |                   | ✓                  |
| Community Support                 | ✓                 | ✓                  |
| Enterprise Support                |                   | ✓                  |
| Support Portal & Knowledge Base   |                   | ✓                  |
| API Access                        | ✓                 | ✓                  |
| Intel/CPU Instance                | ✓                 | ✓                  |
| Single Node Computing             | ✓                 | ✓                  |
| GPU-Enabled Instance              |                   | ✓                  |
| Cluster Computing                 |                   | ✓                  |

Book a Demo!

The best way to appreciate the possibilities that Kinetica brings to high-performance real-time analytics is to see it in action.

Contact us and we'll give you a tour of Kinetica. We can also help you get started using it with your own data, your own schemas, and your own queries.