The Infer Platform

Digging into our thinking around our free platform, where it is heading, and how it will grow into the Inference Layer of the Modern Data Stack.

This has nothing to do with the Infer Platform but I love this facade - the astronaut Lego takes me right back to my childhood.

Hello again, I am Erik 👋 Welcome to the third post in The Beginning of Infer series. In this post we will dig a little deeper into the product itself. We will look at the outline of our product strategy and the reasons for some of the choices we have made - in particular, our thinking around our initial free platform and how we plan to develop it into what will become the Inference Layer of the Modern Data Stack.

This concludes The Beginning of Infer story for now. More will come later, but the plan from here is to start posting about the product itself, starting with a sneak peek of the free platform in the next post.

As with all things startup, this is very much a work in progress - if there is something you think we should do differently or better, please drop me a line 🙏

The Inference Layer and SQL-inf

The aim of Infer is to build the Inference Layer for the Modern Data Stack: a layer that sits between the Data Layer and the Consumer Layer of the data stack, enabling consumers to easily perform advanced analytics on their data as if this functionality were native to their Data Layer.

From a user perspective, the main component of the Inference Layer is SQL-inf. SQL-inf is our variant of SQL that enables analysts to continue using the programming language they love and hate the most, SQL, while giving them the ability to perform advanced analysis without having to learn a new technology.
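To give a flavour of what this looks like, here are a couple of illustrative SQL-inf commands. The table and column names are hypothetical, and the exact syntax may evolve:

```sql
-- Predict which users will churn, using the other columns
-- of the (hypothetical) users table as features
SELECT * FROM users PREDICT(churned)

-- Ask which features drive that prediction
SELECT * FROM users EXPLAIN(PREDICT(churned))
```

The point is that everything stays within familiar SQL: the analyst writes a SELECT statement and adds one command, rather than switching to a separate ML toolchain.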

The function of the Inference Layer is to parse SQL-inf, fetch data from the underlying Data Layer, execute the SQL-inf commands on that data and return the results to the Consumer Layer. When fully built out, users will be able to use SQL-inf within any tool of their choice - Tableau, Looker, Excel, dbt etc - and with any data source - Snowflake, BigQuery, Redshift etc - by integrating Infer into their data stack.

However, along the way to the fully fledged Inference Layer we plan a few stops that will enable us to provide value sooner and with less friction. The first of these stops is the free Infer App.


The Infer Inference Layer's place in the Data Stack

The Infer App and The Cold Start Problem

Ultimately, our mission is to help businesses use their data better by providing analysts with more powerful tools and methods for analysis, as well as resources for learning how to apply them.

To achieve this in an easy and accessible way, our first product won’t be the stand-alone, "all-singing, all-dancing" Inference Layer but a free, easy-to-use, self-contained platform, the Infer App.

The Infer App allows users to perform advanced analysis without having to integrate anything directly into their data stack. Behind the scenes this platform is powered by the Inference Layer – allowing us to build out the Inference Layer while providing immediate value to analysts. Accompanying the Infer App will be extensive educational resources to better enable users to take advantage of the advanced analytics and machine learning of Infer within the context of their specific use cases.

By building the Inference Layer out this way, i.e. paired with a free, easy-to-use frontend platform, we can provide value to analysts on Day One and avoid the “cold start problem”. Integrating a fully featured inference layer comes with setup and integration costs – investments that potential users must make upfront before any value is accrued. This becomes a barrier to entry for our users and, hence, a barrier for us to enable analysts to help their stakeholders make better, more data-driven decisions. By instead approaching our users "bottom-up", we can focus from the beginning on the people we ultimately want to help – the analysts.

Over time we will offer premium features within the platform as we build out the Inference Layer, such as connectors to external data sources (Snowflake, BigQuery etc), connectors to external BI tools and other systems (Looker, Tableau etc), API/DB interfaces for other external connections, integration with dbt, unlimited compute, better support, on-premise deployment and so on. This is how we expand our platform from the Infer App to the fully featured Inference Layer, building out the value we provide to our users at each step along the way.


Rough timeline of the build out of Infer

The Infer App: Concepts

Before we finish, let's give a quick introduction to the Infer App. We will do so by covering a few core concepts that illustrate the user workflow, accompanied by a few screenshots of the platform in its current incarnation 🥳

The Infer App consists of three elements: Data Management, Analysis, Results.

Data Management: Here users can upload datasets - initially as CSV files - and manage their uploaded datasets. A dataset forms the basis of a project, meaning a set of analyses run on that dataset. In future versions the concept of a project will include multiple datasets and eventually, as our first premium feature, a direct connection to the user's data warehouse.

Analysis: This is where the magic happens. Within a project, which for now maps one-to-one to a dataset, users can perform analysis using SQL-inf. Each SQL-inf command is executed against the dataset of the project and is saved along with its results. Each of these saved queries can be named, deleted, duplicated etc. This means that you can always go back to previous steps in the analysis and reference between them. These collections of commands and results, related to the same data, are what form a project.

Results: When a piece of analysis has been performed, users can explore the results in-app and either choose to iterate on it, i.e. change and rerun the analysis, or export it for use in other applications, e.g. dashboards. The results section presents data visualisations specific to the type of analysis run, as well as the raw output in table form. The idea is not to have a dashboard-like interface but a simple and quick interface for eye-balling results, in order to decide on the next step of the analysis.


The three components highlighted within the Infer platform. This is the current ALPHA version - much will change, but the overall layout will remain.

In summary, the user flow for a given analysis becomes:

  1. upload a dataset
  2. construct your SQL-inf statement, either from scratch using the editor or using use-case-specific templates
  3. run the analysis and review the results
  4. iterate on the analysis by adapting the previous query or starting a new one from scratch
  5. export the finished analysis to other visualisation and dashboard tools for sharing and further analysis.
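As a sketch of steps 2-4, an iteration in SQL-inf might look like the following. The dataset and column names are hypothetical and the syntax is illustrative:

```sql
-- First pass: segment the uploaded (hypothetical) customers dataset
SELECT * FROM customers CLUSTER()

-- Iterate: switch to predicting churn and reviewing the results
SELECT * FROM customers PREDICT(churned)
```

Because each query is saved alongside its results, an analyst can move between these steps freely, comparing one pass of the analysis against the next before exporting the version they settle on.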

If you have read the previous post in The Beginning of Infer series, Our Big Mission, you will recognise this flow. This is essentially the flow of exploratory analysis and research described there. As we build out the Inference Layer, and add the ability to connect directly to the data and consumer layers, this flow will become even more powerful for exploratory analysis.

But Infer isn't just about exploration and ad-hoc analysis. By providing an API and native integration with dbt, we also allow users to use SQL-inf within their data pipelines and to build data assets based on our advanced analytics. The API will be available from the beginning of our BETA programme, with the dbt integration coming later.
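To make the pipeline use case concrete, here is a sketch of what a dbt model using SQL-inf could look like. The model and column names are hypothetical, and the final shape of the integration may differ:

```sql
-- models/churn_predictions.sql (hypothetical dbt model)
-- Builds a persistent data asset of churn predictions
-- on top of an upstream feature table
SELECT * FROM {{ ref('user_features') }} PREDICT(has_churned)
```

In this setup the prediction becomes just another model in the dbt DAG, rebuilt on the pipeline's schedule and available downstream like any other table.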

Bye, Bye! See you next time 👋

That was it for this time.

In upcoming posts we will look at the product in detail. We will also share some thoughts on where we sit in the data stack, how we think about “exploratory data analysis” and our belief that a disproportionate focus on asset and pipeline creation has held back good research and better exploratory tools.
