Getting beyond the ABC’s of Data & Analytics 

At LoQutus we are guiding our clients to become digital leaders. With Analytics & Insights we help our customers shape the right environment for leveraging data and analytics. It used to be all about the ABC here: building an Analytical model to deliver Business Intelligence and Customer Intelligence. In this blog post we share our vision on the importance of these ABC’s and on how companies can get to the next level by organizing for the ‘D’: Data DevOps. 

A – First things first – Analytical Model 

Any organization that wants to do more with its data has to realize that it is all about building an analytical model. Throughout its business operations, data is captured that describes just a tiny fraction of the reality of the business. It used to be just the core transactions: getting the basics right to be able to deliver the core products or services to the clients. Lately, more fast-moving contextual data is being captured to get more insight into the customer, the ‘things’ that are used in delivering these customer services, and the metrics that describe the business operations. 

Building an analytical model is about translating this transactional, ‘row-based’ view of the business into an analytical or ‘column-based’ view. In analytics we are not interested in the individual transaction; we have many questions that target specific columns across the entire dataset. What age groups buy our product? Which product categories thrive? What impact does a price change have? 
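
As a minimal sketch of that shift (all column names and data are made up for illustration), the same row-based transactions can be pivoted into a column-oriented, aggregated view that answers a question like ‘what age groups buy our product?’:

```python
import pandas as pd

# Hypothetical row-based transactional data: one row per individual sale.
transactions = pd.DataFrame({
    "transaction_id": [1, 2, 3, 4, 5],
    "customer_age": [23, 37, 45, 29, 52],
    "product_category": ["fitness", "fitness", "garden", "garden", "fitness"],
    "amount": [19.99, 24.99, 89.00, 45.50, 19.99],
})

# The analytical, column-oriented view: questions are answered across
# all transactions at once, not per individual record.
transactions["age_group"] = pd.cut(
    transactions["customer_age"],
    bins=[0, 30, 50, 120],
    labels=["<30", "30-50", "50+"],
)
analytical_view = (
    transactions
    .groupby(["age_group", "product_category"], observed=True)["amount"]
    .agg(total_revenue="sum", purchases="count")
    .reset_index()
)
print(analytical_view)
```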

Organizations need this analytical model, and they need it to be forward-looking (predictive) rather than merely describing the past. 

What matters today is that building these analytical models is no longer an exclusively handcrafted, manual process: experts use machine learning algorithms to automate part of the workload. The existing data environment often needs to be modernized to support these capabilities. 
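
To make the predictive side concrete, here is an illustrative sketch on synthetic data (scikit-learn is just one possible toolkit, and every column name is made up): instead of hand-crafting rules, a model is fitted from the data.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Synthetic stand-in for an analytical dataset: one row per product/period.
rng = np.random.default_rng(0)
n = 500
sales = pd.DataFrame({
    "product_category": rng.choice(["fitness", "garden", "kitchen"], size=n),
    "price": rng.uniform(10, 100, size=n).round(2),
    "promo_active": rng.integers(0, 2, size=n),
})
# Fabricated target: demand falls with price and rises with promotions.
sales["units_sold"] = (200 - 1.5 * sales["price"]
                       + 30 * sales["promo_active"]
                       + rng.normal(0, 10, size=n)).round()

X = sales[["product_category", "price", "promo_active"]]
y = sales["units_sold"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# The predictive model is learned from the data rather than hand-built.
model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"), ["product_category"])],
        remainder="passthrough",
    )),
    ("regress", GradientBoostingRegressor()),
])
model.fit(X_train, y_train)
print("Holdout R^2:", round(model.score(X_test, y_test), 3))
```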

We are currently helping a large industry client build a modern data landscape that provides this kind of analytical model to the business. It is already being leveraged for dynamic pricing analytics and competitor insights. 

B – Getting smart – Business Intelligence 

Business Intelligence used to be about a BI team delivering tailored reports to the business. In today’s fast-changing landscape, data teams are being organized around delivering data assets that the business can explore through self-service BI. 

If you ask us what the key component is in delivering self-service BI, our answer is a validated semantic model. This model should capture all the necessary business logic and calculations, not only validated records and transactions. That way a KPI definition, for example, is calculated consistently regardless of the end-user tool. These semantic models should be delivered in a fast, reusable and easy-to-use way. 
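
As a small, hypothetical illustration of what ‘one validated definition, many consumers’ means (the KPI and column names are made up):

```python
import pandas as pd

# Illustrative semantic layer: the validated business logic lives in one place,
# so every report, dashboard or export computes the KPI the same way.
def gross_margin_pct(df: pd.DataFrame) -> float:
    """Gross margin % = (revenue - cost of goods sold) / revenue."""
    revenue = df["revenue"].sum()
    cogs = df["cogs"].sum()
    return round(100 * (revenue - cogs) / revenue, 1)

# Any consuming tool reuses the same definition instead of redefining it.
orders = pd.DataFrame({"revenue": [1200.0, 800.0, 450.0],
                       "cogs": [700.0, 500.0, 300.0]})
print("Gross margin %:", gross_margin_pct(orders))
```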

These semantic models are key to enabling the business to create dashboards that support their day-to-day operations. 

We are helping multiple clients in the public sector build these ‘golden datasets’, which are then explored through multiple reports and dashboards. 

C – Delivering better products & services – Customer Intelligence 

Within Data & Analytics, special attention needs to be given to customer data: not only to handle it with the right security, but also to obtain the rich value it provides. Customer segmentation analysis and personalization are key differentiators for any business. 
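
As an illustrative sketch (synthetic data; k-means is just one possible technique), customer segmentation can start from clustering customers on a few behavioural features:

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic customer features; in practice these would come from the
# analytical model (recency, frequency, monetary value, ...).
rng = np.random.default_rng(1)
customers = pd.DataFrame({
    "orders_per_year": rng.poisson(6, size=300),
    "avg_basket_value": rng.gamma(4, 15, size=300).round(2),
    "days_since_last_order": rng.integers(1, 365, size=300),
})

# Scale the features, then cluster customers into a handful of segments.
features = StandardScaler().fit_transform(customers)
customers["segment"] = KMeans(n_clusters=4, n_init=10,
                              random_state=0).fit_predict(features)

# Average profile per segment, as input for personalization.
print(customers.groupby("segment").mean().round(1))
```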

It is also worth considering that your data may enable you to explore new markets, namely the market for the data itself! Retailers, for example, have data that their suppliers find valuable for optimizing their own business. 

For one of our retail clients we are building a data-sharing platform that delivers valuable insights to their partners. 

D – Productizing all this – Data DevOps 

DevOps is about continuously improving the business with technology. Developers work with the business to deliver small technology increments, and operations is organized to quickly apply these changes in a stable and secure manner, delivering value fast. 

In this ‘transactional’ world, DevOps practices have already gained a lot of traction: organizations that build web applications for their clients run heavily automated continuous integration and deployment pipelines. 

On the data and analytics side, this traction is only just getting started. In the past, data warehouse and reporting projects were about big initial investments that captured the core data sets in a number of corporate data marts. After the initial setup, only small changes were made, or periodic ones when new systems were introduced on the transactional side. 

Today this is no longer the reality. Much more data is used for analytics, and the data landscape of companies is constantly in motion. The focus of data teams has shifted from delivering tailor-made reports to delivering tailor-made datasets with the right business logic, quality and performance. Data warehouses are being enriched with data lakes, and the volume of data being analyzed is constantly growing. To keep all of these moving parts under control while enabling continuous change, Data DevOps practices need to be introduced. 

The end-to-end pipeline of connecting to data, transforming, cleaning and shaping it into the right analytical models, and delivering these assets as modern data products, needs to be put under the right governance, by the right people, using the right processes and the right technology. These DevOps practices need to be applied to the data platform that enables building these data assets, to the pipeline that these data assets go through, and to the delivery process of the data products that the business uses. 
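
One concrete ingredient of such practices is automated testing of the data assets themselves. Below is a minimal sketch of a data-quality gate that a CI/CD pipeline could run before a dataset is promoted to the business; the dataset and column names are hypothetical, and in practice a dedicated data-quality framework could take this role.

```python
import pandas as pd

def check_orders_dataset(orders: pd.DataFrame) -> list[str]:
    """Return a list of data-quality problems; an empty list means the gate passes."""
    required = {"order_id", "customer_id", "order_date", "amount"}
    missing = required - set(orders.columns)
    if missing:
        return [f"missing columns: {sorted(missing)}"]
    errors = []
    if orders["order_id"].duplicated().any():
        errors.append("order_id is not unique")
    if (orders["amount"] < 0).any():
        errors.append("negative amounts found")
    return errors

if __name__ == "__main__":
    # In a real pipeline this would load the freshly built dataset;
    # a tiny in-memory sample keeps the sketch self-contained here.
    orders = pd.DataFrame({
        "order_id": [1, 2, 3],
        "customer_id": [10, 11, 10],
        "order_date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-01-07"]),
        "amount": [49.99, 15.00, 89.50],
    })
    problems = check_orders_dataset(orders)
    if problems:
        raise SystemExit("Data quality gate failed: " + "; ".join(problems))
    print("Data quality gate passed - the dataset can be promoted.")
```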

We will detail how these Data DevOps practices can be set up in our next blog posts, so stay tuned!

Written by our LoQutus A&I Team

Interested in learning more or in how we can help you with these steps in analytics & insights? Contact us!