Alpine Data Supplies Missing Link for Predictive Analytics

 
 
By Chris Preimesberger  |  Posted 2015-11-30
Startup develops big data predictive solutions that simplify the process of building predictive models for data sets.

Predictive business analytics initiatives are tricky to set up: unless each analysis uses exactly the right query and data sources, it won't produce the desired results. This is where data scientists can be extremely valuable. But what if you don't have a data scientist on staff?

In most enterprises, similar queries come up again and again. What if there were a kind of institutional memory to capture them all, so that analytics work could be referenced rather than repeated, wasting time, energy and salaries?

San Francisco-based Alpine Data Labs, a four-year-old startup, provides answers to those questions. This kind of solution is becoming especially valuable as more and more analytics options are added to workflows in new-generation IT systems.

Alpine Data Labs develops big data predictive solutions that simplify the process of building predictive models for data sets. This week, the company announced the general availability of its Alpine Custom Operator Framework, a flexible methodology for developing custom algorithms that plug directly into Alpine's parallel machine-learning engine.

As a result, working in conjunction with Alpine's Touchpoints control-ware, the Custom Operator Framework enables data science and business analyst teams to create, manage and distribute frequently requested analytic assets to business users, delivered directly into their existing activities and workflows.

Data science teams receive requests to perform the same function against different data sets time and time again. While these functions create a meaningful difference for business users, they are complex and multi-faceted, and data science teams are forced to deal with them tactically. What often results is a lot of wasted time for the data science teams -- and their salaries generally aren't trivial.

Here's one use case example: A customer management team at a financial institution is building credit models and needs to fill in missing fields for individuals with incomplete profiles. A data scientist might approximate these fields with aggregates from other individuals with more complete profiles. The function to compute these aggregates might be quite complex, repetitive, and time-consuming to build.
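The aggregation approach described above can be sketched in a few lines of code. This is a minimal illustration, not Alpine's implementation; the column names and sample values are hypothetical, and the imputation rule (fill a missing value with the mean of peers in the same region) is just one plausible choice of aggregate.

```python
import pandas as pd

# Hypothetical customer profiles; income is missing for some individuals.
profiles = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5, 6],
    "region": ["west", "west", "west", "east", "east", "east"],
    "income": [80000.0, 90000.0, None, 55000.0, None, 65000.0],
})

# Fill each missing income with the mean income of peers in the same region,
# leaving already-known values untouched.
profiles["income"] = profiles.groupby("region")["income"].transform(
    lambda s: s.fillna(s.mean())
)
```

In practice this function grows far more complex (multiple fields, weighting, similarity matching), which is exactly why re-building it per team is wasteful.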

In many cases, teams in different parts of the organization re-create the same function over and over again, introducing inconsistencies and rework, Chief Product Officer Steven Hillion said.

Alpine Data's Custom Operator Framework enables a data scientist to perform this function once, then operationalize it as a Custom Operator so that it can be discovered and reused by other teams -- or even accessed by business users to perform future analyses on their own. The Custom Operator Framework fulfills a key role in helping organizations free up valuable data science resources and place the power of predictive models in the hands of business users, Hillion said.

The Custom Operator Framework provides a visual development environment for users to easily operationalize proprietary methods and open-source algorithms to enhance common business functions.

The flexible nature of the Custom Operator Framework means data science teams and business analysts can add their proprietary and open-source algorithms, models and code to the Alpine platform, and make them available as visual elements in analytics workflows, Hillion said.
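The register-once, reuse-everywhere pattern Hillion describes can be sketched as a simple operator registry. This is a hypothetical illustration only: the registry, the `register_operator` decorator and the operator name below are invented for the example and are not Alpine's actual API, which the article does not detail.

```python
from typing import Callable, Dict
import pandas as pd

# Hypothetical shared registry of reusable operators, keyed by name.
OPERATORS: Dict[str, Callable[[pd.DataFrame], pd.DataFrame]] = {}

def register_operator(name: str):
    """Decorator that publishes a function as a discoverable, reusable operator."""
    def wrap(fn: Callable[[pd.DataFrame], pd.DataFrame]):
        OPERATORS[name] = fn
        return fn
    return wrap

@register_operator("impute_income_by_region")
def impute_income_by_region(df: pd.DataFrame) -> pd.DataFrame:
    """Fill missing incomes with the regional mean, without mutating the input."""
    out = df.copy()
    out["income"] = out.groupby("region")["income"].transform(
        lambda s: s.fillna(s.mean())
    )
    return out

# Any other team can now look the operator up by name instead of rewriting it.
df = pd.DataFrame({"region": ["a", "a", "b"], "income": [10.0, None, 5.0]})
result = OPERATORS["impute_income_by_region"](df)
```

The design choice worth noting is that the operator is addressed by name, not by import path, which is what lets a non-programmer pick it out of a catalog and drop it into a visual workflow.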


 

 
 
 
 
Chris Preimesberger

Chris Preimesberger is Editor of Features & Analysis at eWEEK. Twitter: @editingwhiz