
From BI to AI: The Evolutionary Path of Enterprise Analytics


As we progress along the road of this pandemic-driven recession, CIOs and IT departments need to keep a clear-eyed view of the future and of the tasks we are expected to manage. Even as we deal with the challenges of remote work, distributed decision making, and an uncertain economic environment, we must also support future business needs and the next generation of technology, which continues to be created and launched.

This means that we need to continue following the path of COVID IT.

If you’ve followed the stages and actions we recommended in our webinar series or at our Technology Expense Management Expo, you’ve passed through the stages of pure survival, securing remote work, and auditing your environment. Now we are at Stage 4: gathering best practices, celebrating successes, and training employees on the New Normal.

A key aspect of Stages 4 and 5 is using data and analytics to support better decision making, improved forecasting, more nuanced automation, and more accurate models and workflows for making sense of complex business phenomena as your organization continues to move from BI (Business Intelligence) to AI (Artificial Intelligence).

To prepare business data for future needs, Amalgam Insights recommends the following steps:

First, improve data collection. This means treating all data in your business as something that will be reused to provide value, and cleaning up existing data through data prep, data quality, and data transformation tasks. It also means putting data into the right format: the age of the relational database as the only tool for analytics is ending as non-relational and NoSQL databases come to the forefront, and graph databases like Neo4j and Amazon Neptune finally begin their ascent as relationship analytics and semantic search start to eclipse standard Boolean (AND, OR, NOT) and SQL-based logic.
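To make that data prep and quality work concrete, here is a minimal sketch in Python using pandas. The table, column names, and cleanup rules are purely illustrative, not a prescription for any particular toolchain:

```python
import pandas as pd

# Hypothetical customer export; column names and values are invented.
raw = pd.DataFrame({
    "customer_id": [101, 102, 102, 103],
    "email": ["a@example.com", "B@EXAMPLE.COM ", "b@example.com", None],
    "signup_date": ["2020-01-15", "2020-02-03", "2020-02-03", "not a date"],
})

clean = (
    raw
    .drop_duplicates(subset="customer_id", keep="first")  # remove duplicate records
    .assign(
        # Normalize casing and whitespace so joins and dedupes behave.
        email=lambda df: df["email"].str.strip().str.lower(),
        # Coerce unparseable dates to NaT instead of failing the whole load.
        signup_date=lambda df: pd.to_datetime(df["signup_date"], errors="coerce"),
    )
    .dropna(subset=["email"])  # rows without a usable key get dropped
)
print(clean)
```

The point is less the specific transformations than the habit: every dataset gets deduplicated, normalized, and validated before anything downstream consumes it.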

This isn’t to say that SQL is going anywhere. I still recommend that anybody working with data start with a strong foundational knowledge of SQL, as it is probably the only skill I learned 20 years ago that I still use on a regular basis. Skilled relational data querying will always have an important role in the business world. But for business analysts and data managers trying to figure out what is next, consider how to expand your data sources, data quality, and data formats to fit what your company will ask for next.
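For readers building that SQL foundation, here is a self-contained example using Python's built-in sqlite3 module; the table and values are invented for illustration, but the Boolean filtering and aggregation are exactly the logic referred to above:

```python
import sqlite3

# An in-memory database with an invented orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "EMEA", 120.0), (2, "APAC", 75.5), (3, "EMEA", 42.0)],
)

# Classic Boolean AND logic plus aggregation: the bread and butter of SQL.
for region, total in conn.execute(
    "SELECT region, SUM(total) FROM orders "
    "WHERE region = 'EMEA' AND total > 50 "
    "GROUP BY region"
):
    print(region, total)  # EMEA 120.0
```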

Second, contextualize your data. One of my favorite sayings, first attributed to Jason Scott, is that “Metadata is a love note to the future.” The ability to prioritize, categorize, and contextualize data sources, fields, and relationships is vital to supporting the future of machine learning and natural language analysis. This means supporting data catalogs, data unification, and master data management tools to bring data together. This stage of data maturity is easy to ignore because it requires getting business context from relevant stakeholders to manage and define data. Given that it can be hard enough to get business users simply to enter data accurately and consistently, the effort to gather data definitions and context can be intimidating. But this is a necessary precursor to having “smarter” data and to making the “smarter” decisions that analytic and machine learning solutions promise businesses. And the combination of data prep, cleansing, and context makes up the majority of the work that data scientists end up doing as they try to create relevant models. Solutions that Amalgam Insights recommends most often in this area include Alation, Atlan, Collibra, Informatica, Qlik, Talend, Unifi, and the offerings from megavendors SAP, Oracle, and IBM.
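What does “context” look like in practice? Below is a toy, vendor-neutral Python sketch of the kind of field-level metadata a catalog captures. The record structure and field names are hypothetical; real catalog products from the vendors above model far richer lineage and governance context:

```python
from dataclasses import dataclass, field

# A toy metadata record, not any vendor's actual schema.
@dataclass
class FieldMetadata:
    name: str
    description: str        # the business definition, from a stakeholder
    owner: str              # who to ask when the definition drifts
    tags: list = field(default_factory=list)

catalog = {
    "orders.total": FieldMetadata(
        name="total",
        description="Order value in USD, including tax, excluding shipping",
        owner="finance-team",
        tags=["revenue", "non-pii"],
    ),
}

# The "love note to the future": months later, anyone can check what a
# field actually means before building a report or model on top of it.
print(catalog["orders.total"].description)
```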

Third, make visualization tools and outputs ubiquitous. Every person in the company should have access to the relevant metrics that drive the company. It’s 2020: we’re beyond the time of Skynet and the Terminator, Blade Runner, HAL, and other iconic cinematic visions of the future. The very least we can do is make basic charts and graphs available and accessible to all of our co-workers. Find out what prevents line-level employees from accessing and using data, and break down those barriers. Amalgam Insights’ experience is that this challenge comes from a combination of not knowing how to find the right data and not knowing how to build the right charts. The answer will likely come from a combination of natural language enhancements driven by the likes of ThoughtSpot, Narrative Science, and Tellius, as well as visualization and reporting specialists such as Yellowfin and Toucan Toco and embedded analytics specialists such as Logi Analytics and Izenda.
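As a baseline for what “basic charts available to everyone” means, here is a minimal Python sketch using matplotlib; the metric and numbers are invented, and in practice the data would come from a shared reporting source:

```python
import matplotlib.pyplot as plt

# Invented monthly metric; in practice this comes from a shared data source.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
active_users = [1200, 1350, 1100, 1500, 1650, 1800]

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(months, active_users, marker="o")
ax.set_title("Monthly Active Users")
ax.set_ylabel("Users")
fig.tight_layout()
fig.savefig("monthly_active_users.png")  # a PNG anyone can open
```

If a dozen lines like these feel out of reach for line-level employees, that is exactly the barrier the tools above are meant to remove.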

Fourth, shift from reporting and discovery to predictive analytics. Over the past decade, Tableau has been a fantastic tool for data discovery, and it continues to lead the market in helping companies find out what is in their data. However, companies must start thinking of data not only in terms of what it tells us about the present, but in terms of how it helps to structure our work and forecast what is next. Data can be used to build descriptive and predictive models through iterative and guided machine learning. Google’s work with TensorFlow stands out as an end-to-end machine learning solution. During Amalgam Insights’ short existence, DataRobot has quickly risen to become a leader in automated machine learning, and its acquisitions of Nexosis for accessibility, ParallelM for MLOps, and Paxata for data prep have stood out. Microsoft Azure, Amazon Web Services, and IBM Watson offer their own services as well: there is no shortage of options for modeling data.
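To show how small the first step into predictive modeling can be, here is a minimal TensorFlow (Keras) regression sketch. The features and target are synthetic stand-ins for historical business metrics, not a production forecasting model:

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in for historical metrics: predict a target
# from three made-up numeric features.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3)).astype("float32")
y = (2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2]
     + rng.normal(scale=0.1, size=500)).astype("float32")

# A deliberately tiny regression model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=32, verbose=0)

# "What is next": a forecast for a new observation.
print(model.predict(X[:1], verbose=0))
```

Automated platforms like DataRobot exist precisely to take this kind of modeling loop and handle the feature engineering, model selection, and operations around it.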

By taking these steps, you can ensure that your data does not suffer a premature death, rendered obsolete or buried under enough technical debt to become functionally useless. If you have any questions on how to better support your data from a future-facing perspective, please contact us at research @ amalgaminsights.com to set up a consultation.
