
Moving AI From Science Experiment To The Mainstream

For AI to become mainstream, it will need to move beyond small-scale experiments run ad hoc by data scientists.

The complexity of the technologies behind data-driven machine learning and deep learning means that data scientists spend less time developing algorithms and more time automating and operationalizing the process. Business analysts, on the other hand, find it difficult to leverage trained models and incorporate AI findings into their data sets.

Because of this divide, a new role, the data engineer, has emerged between the data scientist and the business analyst. But most organizations struggle to find skilled people for this role. In fact, Gartner predicts that by 2020, 50% of organizations will lack sufficient artificial intelligence and data literacy skills to achieve business value.

These siloed roles make it difficult to derive insight, whether from human or artificial intelligence.

As enterprises move forward with operationalizing AI, they will instead need to look for products and tools that automate, manage, and streamline the entire machine learning life cycle, from data ingestion through algorithm development, model training, and deployment.

To get value out of this investment in integrated, data-driven AI, organizations would be well served by a single engine capable of streamlining, automating, and managing the entire analytical and machine learning life cycle, as well as the data that these models and code operate on. The platform will also need to support the custom, open-source, and packaged software used by both data scientists and business analysts.

A GPU-optimized analytical database achieves this. It not only supports core database workloads but also runs analytical workloads in-database and manages the entire life cycle with a single engine. The massively parallel processing power of the GPU makes it well suited to accelerating traditional analytical database workloads as well as the compute-intensive operations of advanced analytics and machine learning.

Next-generation data analytics and model management tools built on top of the GPU-powered database deliver automated processes, so data scientists can focus on algorithms, business analysts can put AI findings to work, and data engineers can streamline the entire workflow.
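To make the in-database idea concrete, here is a minimal, vendor-neutral sketch of the pattern described above: feature aggregation runs as SQL inside the database engine, and only the reduced result is pulled out for model training. The table and column names (a hypothetical "transactions" table with customer, amount, and churn fields) are assumptions for illustration, and the standard-library sqlite3 module and scikit-learn stand in for the SQL and Python interfaces a GPU-accelerated analytical database would actually expose; this is not any specific vendor's API.

```python
# Sketch: push the heavy scan/aggregation work into the database via SQL,
# then train a model on the much smaller aggregated result.
# All table/column names below are hypothetical placeholders.
import sqlite3
from sklearn.linear_model import LogisticRegression

# Stand-in for a connection to the analytical database.
conn = sqlite3.connect("analytics.db")

# Feature engineering expressed as SQL, so the per-row work stays in the
# database engine instead of the client.
rows = conn.execute("""
    SELECT customer_id,
           COUNT(*)    AS txn_count,
           AVG(amount) AS avg_amount,
           MAX(churned) AS churned   -- label, 0 or 1
    FROM transactions
    GROUP BY customer_id
""").fetchall()

X = [[r[1], r[2]] for r in rows]   # aggregated features per customer
y = [r[3] for r in rows]           # churn label

# Train on the reduced feature set; scoring could likewise be written back
# into the database as a table of predictions.
model = LogisticRegression().fit(X, y)
print("training accuracy:", model.score(X, y))
```

In a GPU-accelerated engine, the aggregation step above is where the parallel hardware does its work; the client-side code only sees a compact feature table, which is what makes the workflow manageable for data scientists and analysts alike.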

Getting ROI from AI

AI is well suited to analyzing the extreme volumes of data generated by the ever-expanding universe of digital devices. High-ROI applications include customer 360 analysis to improve customer experience, product recommendations in retail, and value-at-risk analysis in financial services, for starters. To get the best results, businesses need to apply AI across the entire data corpus and integrate the results into their traditional analytics.

Based on this enormous potential, investment in AI will increase. A single engine leveraging GPUs can bring AI to traditional analytics cost-effectively, achieving a healthy return on investment.

For those who don’t invest, Gartner predicts that by 2021, 80% of data science labs that have failed to establish measurable operationalization practices will be outsourced.

For artificial intelligence, the future is now.

Editor’s Note: This article was originally published in Forbes on 8/22/18

Dipti Borkar is VP of Product Marketing at Kinetica.

 
