
Managing Inventory for Kubernetes Cost Management

Last week, we discussed why Kubernetes is an important management concern for cloud-based cost management. To manage Kubernetes from a financial perspective, Amalgam Insights believes the foundational starting point is discovering and documenting the relevant technical, financial, and operational aspects of the container inventory.

Kubernetes Cost Management requires a complete inventory of containers that includes the documentation of namespaces, clusters, and pods associated with each node. This accounting allows companies to see how their Kubernetes environment is currently structured and provides the starting point for building a taxonomy for Kubernetes.
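
As a minimal illustration of what such an inventory pull can look like, the sketch below uses the official Kubernetes Python client to list every pod with its namespace and node; cluster names, cloud accounts, and financial attributes would be layered on top of this. The kubeconfig context is an assumption for illustration.

    # Minimal inventory sketch using the official Kubernetes Python client.
    # Assumes a local kubeconfig; use config.load_incluster_config() inside a cluster.
    from collections import defaultdict
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    inventory = defaultdict(list)  # node name -> list of (namespace, pod name)
    for pod in v1.list_pod_for_all_namespaces(watch=False).items:
        node = pod.spec.node_name or "unscheduled"
        inventory[node].append((pod.metadata.namespace, pod.metadata.name))

    for node, pods in inventory.items():
        print(f"{node}: {len(pods)} pods")
        for namespace, name in pods:
            print(f"  {namespace}/{name}")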

In addition, a container-based inventory needs to include the technical context associated with each container. Containers must be tracked along with the cloud-based storage and compute services and resources associated with each container across its lifespan. Since the portability of containers is a key value proposition, companies must focus on time-series tracking of the assets, services, and resource allocations associated with each container.

These changes must also be tracked on an ongoing basis, as containers are not static assets like physical servers. Although IT organizations are used to looking at usage itself on a time-series basis, IT assets and services are typically tracked only when they are moved, added, changed, or deleted. Now, assets and services must also be tracked when they are reassigned and reallocated across containers and workloads, and these time-based assignments are difficult to follow without a strategy for capturing the changes over time.
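
One low-effort way to capture those changes is to snapshot the inventory on a schedule and append each snapshot with a timestamp, so that reassignments and reallocations can be compared later. A minimal sketch, assuming the inventory is already available as a list of dictionaries (the record structure and file path are illustrative):

    # Append timestamped inventory snapshots to a JSON Lines file so that
    # container-to-node and resource assignments can be diffed over time.
    import json
    from datetime import datetime, timezone

    def record_snapshot(inventory, path="k8s_inventory.jsonl"):
        snapshot = {
            "captured_at": datetime.now(timezone.utc).isoformat(),
            # e.g. [{"pod": "web-1", "namespace": "prod", "node": "node-a",
            #        "cpu_request": "500m", "memory_request": "512Mi"}, ...]
            "pods": inventory,
        }
        with open(path, "a") as f:
            f.write(json.dumps(snapshot) + "\n")

A later job can load consecutive snapshots and diff pod-to-node assignments or resource requests to see what moved, scaled, or was reallocated.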

Inventories must also be tagged from an operational perspective, where containers and clusters are associated with relevant applications, functions, resources, and technical issues. This is an opportunity to tag containers, namespaces, and clusters with relevant monitoring tools, technical dependencies, cloud providers, applications, and other requirements for supporting containers.
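
In Kubernetes terms, this operational and financial tagging typically shows up as labels. Below is a minimal sketch of attaching illustrative cost and ownership labels to a namespace with the Kubernetes Python client; the label keys, values, and namespace name are assumptions rather than a prescribed taxonomy.

    # Sketch: attach operational and financial tags to a namespace as labels.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    labels = {
        "app": "checkout",
        "environment": "production",
        "cost-center": "ecommerce-123",
        "owner": "platform-team",
    }
    v1.patch_namespace("checkout-prod", {"metadata": {"labels": labels}})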

From a practical perspective, this combination of operational, financial, and technical tagging ensures that a container can be managed, duplicated, altered, migrated, or terminated without unintended effects on the relevant working environments. There is no point in saving a few dollars, euros, or yuan only to impair an important test or production environment.

Kubernetes inventory management requires a combination of operational, financial, and technical information tracked over time to fully understand both the business dependencies and cost efficiencies associated with containerizing applications.

To learn more about Kubernetes Cost Management and key vendors to consider, read our groundbreaking report on the top Kubernetes Cost Management vendors.


Market Alert: Box Acquires SignRequest to Develop Internal Electronic Signatures

Key Takeaway: “This opportunity for existing Box customers to embed e-signature more deeply into their document approval processes is a multi-billion dollar opportunity when the analytics, automation, workflows, and business process optimization opportunities are all taken into account.”


On February 3, 2021, Box announced its intention to acquire SignRequest, a Dutch e-signature vendor founded in 2014, for an estimated $55 million to develop Box Sign. Box plans to launch Box Sign in the summer of 2021 and to make it available both for personal and enterprise plans. Amalgam Insights believes this is an interesting opportunity for multiple reasons.

First, consider that Box’s entire go-to-market strategy is built around placing enterprise standards on cloud-based content. This has always been its key driver and was the foundational starting point that allowed Box to succeed at a time when cloud-based content management startups were popping up like weeds a decade ago. As a starting point, let’s use “enterprise standards” as shorthand for the governance, security, analytics, and automation necessary to translate basic data and activity into the context and foundation needed to support businesses. Adding e-signatures allows Box to better serve its pharmaceutical, healthcare, government, legal, and other regulated clients with contractual and personal information transfers.

Second, the emergence of the COVID pandemic has driven the need to develop remote work capabilities and highlighted weaknesses in paper-based workflows that organizations have avoided for decades. The disease-driven digital transformation happening now is forcing companies to conduct the operational equivalent of changing the tires on a car while driving on a highway and requires complex problem-solving solutions that are well-packaged and readily available. This need drove the revenue of enterprise Software-as-a-Service companies in 2020 and will continue to drive growth as the majority of companies still need to fill gaps in their digital work toolsets.

Third, with internal e-signature, Box can now add human trust, activity, response time, and human-driven automation to a variety of documents and activities where it was previously dependent on partners. Human sign-off is a key data component, but it is not the be-all, end-all of work. This is an opportunity to add signature-based approval as a foundational metadata component to every document, workroom, and content-based collaboration that Box supports, a vital area that no company has fully conquered. Looking at the enterprise market, companies that have started taking on this challenge include Workiva and ServiceNow, both cloud SaaS darlings from a revenue growth and valuation perspective.

I’m hoping this is a step toward Box becoming a Workiva (and eventually ServiceNow) competitor and starting to push activity analytics, machine learning-driven optimizations, and workflow capabilities to the market. The content activity Box supports has immense latent value in benchmarking, authorizing, and rationalizing work. This trusted activity was one of the areas that some analysts, including myself, hoped that blockchain would serve. But reality has proven that unlocking value from trusted activity requires hybrid activity that includes people, documents, and transactions.

This hybrid activity management, along with the analytics, automation, trust, and force-multiplier productivity that could result from combining human trust, document context, temporal context, and related documents and workflows, is the true promise of this acquisition. Existing document management vendors lack the enterprise governance, platform standardization, automation, or functional capabilities needed to bring authorization and work together for the masses in a cost-efficient manner. Box’s business model, which spans both freemium and enterprise tiers, provides a unique opportunity to bridge the gaps in e-signature adoption, content, and business scale and to deliver both a better e-signature product and a next-generation trust platform driven by e-signature.

The takeaways here are two-fold. First, look closely at Box to see how it brings Box Sign to market in 2021. This opportunity for existing Box customers to embed e-signature more deeply into their document approval processes is a multi-billion dollar opportunity when the analytics, automation, workflows, and business process optimization opportunities are all taken into account. Second, expect enterprise workflow and content vendors ranging from ServiceNow to Workiva to OpenText both to change their e-signature offerings and to start a product war over signature-based capabilities, data management, and analytics as Box threatens to change the game.


Databricks and DataRobot Funding Rounds Highlight a Rising Trend in Tech Funding: the Investipartner

In the tech era, one of the key buzzwords to describe businesses going to market is “coopetition,” the idea that companies choose to work together toward common goals while still competing with each other.

Coined by Novell founder Raymond Noorda, this neologism now describes a common occurrence in the technology world and is a key operational aspect of Microsoft’s rapid ascent since Satya Nadella became CEO. Under Nadella, Microsoft is happy to sell its cloud infrastructure services while supporting competing applications, as shown by the 2019 announcement that Salesforce would run on Azure. Needless to say, coopetition is both a mature and expected business practice.

In the 2020s, this idea of coopetition has transformed and evolved as several tech trends have accelerated the pace of business.

  • Large tech companies have billions of dollars in cash on hand, see their stock trading at record highs, and need to continue growing rapidly.
  • Venture capital and private equity-backed companies have improved their ability to build “unicorns,” startups that grow to billion-dollar valuations within a few years. This size increasingly prevents even relatively large companies from purchasing these startups.
  • Data, analytics, and machine learning are growing radically, with both data volumes and algorithmic models expanding at a triple-digit pace year over year.
  • Customers are interested in purchasing best-in-breed point solutions to solve specific problems.
  • Customers are increasingly comfortable in quickly knitting these solutions together through a shared platform, use of APIs, virtualization, and containerization.

This combination of technology creation and consumption makes it difficult for incumbent vendors to build and bring tools to market in a relevant time frame before startups pop up and rapidly gain market share. In light of this challenge, Amalgam Insights notes that a number of recent funding announcements show a modernization of coopetition in which vendors in competitive or adjacent markets invest in a quickly emerging and growing partner that solves problems related to their own solutions.

Rather than purchasing the company outright or creating their own version, these vendors choose to take a minority stake while maintaining a shared go-to-market or partnering strategy. This additional step beyond coopetition, adding an equity investment to the partnership, is a trend that Amalgam Insights calls “Investipartnering,” where companies make equity commitments to go-to-market partners. Recent examples include:

DataRobot, an automated machine learning solution that has quickly acquired and developed machine learning preparation, operations, and deployment capabilities. In December 2020, DataRobot raised a $320 million Series F round that added investipartners Snowflake, Salesforce Ventures, and Hewlett Packard Enterprise, accompanied by go-to-market approaches that pair analytics and cloud infrastructure with DataRobot’s ability to develop and operationalize machine learning.

Databricks, a unified analytics platform built by the creators of Apache Spark, announced its $1 billion Series G round in February 2021. This round included new investors Amazon Web Services and Salesforce Ventures, as well as additional investment from Microsoft. In addition, Databricks took investment from the Canada Pension Plan Investment Board, which is currently a private equity owner of Informatica. Databricks competes to some extent in data management, machine learning, and analytics against each of these investors, but is also seen as a strategic partner.

Of course, this approach requires that the startup be willing to partner with an established company in a space where the startup may itself be positioned for further growth. It also requires that the large investing company have the humility to realize either that it may not be best suited to create the solution in question or that it should diversify its holdings in a particular market.

And this is not a unique or especially new trend. Microsoft’s investments in Apple in 1997 and Facebook in 2007 are prior examples of investipartnering. What is new is the increased frequency with which high-flying companies such as Microsoft, Amazon, Adobe, Salesforce, PayPal, ServiceNow, Zoom, Snowflake, and Workday are playing this role in building fast-growing startups.

As large technology companies continue to need growth and startups seek strategic smart money to facilitate their transition from private to public companies, Amalgam Insights expects that the Investipartner route will remain an attractive one for savvy technology companies that realize that the power of building markets matters more than a basic winner-take-all strategy.


The Need to Manage Kubernetes Cost in the Age of COVID

Kubernetes has evolved from an interesting project into a core aspect of the next generation of software platforms. As a result, enterprise IT departments need to manage the costs associated with Kubernetes to be responsible financial stewards. This pressure to be financially responsible is exacerbated by the COVID-driven pandemic recession currently affecting the majority of the world.

Past recessions have shown that the companies best suited to increase sales after a recession are those that

  • Minimize financial and technical debt
  • Invest in IT
  • Support decentralized and distributed decision-making, and
  • Avoid permanent layoffs.

Although Kubernetes is a technology and not a human resources management capability, it does help support increased business flexibility. Kubernetes cost management is an important exercise to ensure that the business flexibility created by Kubernetes is handled in a financially responsible manner. Technology investments must support multiple business goals: optimizing current business initiatives, supporting new business initiatives, and allowing new business initiatives to scale. Without a financial management component, technology deployments cannot be fully aligned to business goals.

From a cost perspective, Amalgam Insights believes that the IT Rule of 30 also applies to Kubernetes.

The IT Rule of 30 states that any unmanaged IT spend category averages 30% in wasted spend, due to a combination of resource optimization, demand-based scaling, time-based resource scaling, and pricing optimization opportunities that technical buyers often miss.

IT engineers and developers are not typically hired for their sourcing, procurement, and finance skills, so it is understandable that their focus is not going to be on cost optimization. However, as Kubernetes-related technology deployments start exceeding $300,000, companies begin to see six-figure savings opportunities just from optimizing Kubernetes and the related cloud and hardware resources used to support containerized workloads.
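
As a rough illustration of that arithmetic (the spend figure below is hypothetical, and the 30% rate comes from the IT Rule of 30 above):

    # Back-of-the-envelope estimate using the IT Rule of 30.
    annual_kubernetes_spend = 350_000  # hypothetical annual spend in USD
    waste_rate = 0.30                  # unmanaged spend categories average ~30% waste

    savings_opportunity = annual_kubernetes_spend * waste_rate
    print(f"Estimated savings opportunity: ${savings_opportunity:,.0f}")
    # -> Estimated savings opportunity: $105,000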

To learn more about Kubernetes Cost Management and key vendors to consider, read our groundbreaking report on the top Kubernetes Cost Management vendors.


Informatica Supports the Data-Enabled Customer Experience with Customer 360

On January 11, 2021, Informatica announced its Customer 360 SaaS solution for customer Master Data Management. Built on the Informatica Intelligent Cloud Services platform, Informatica’s Customer 360 solution provides data integration, data governance, data quality, reference data management, process orchestration, and master data management as an integrated SaaS (Software as a Service)-based solution.

It can be easy to take the marketing of master data management solutions for granted, as every solution focused on customer data seems to claim that it helps manage relationships, provides personalization, and supports data privacy and compliance while delivering a single version of the truth. The similarity of these marketing positions creates confusion about how new offerings in the master data space differ. And the idea of providing a “single version of the truth” is becoming less relevant in an era where data grows and changes faster than ever before; access to relevant data based on a shared version of the truth matters more than having one monolithic and complete version of the truth documented and reified in a single master data repository.

Customer master data also presents challenges to companies, as this data has to be considered in the context of the expanded role that data now plays in defining the customer. In the era of COVID, customer interactions and relationships have largely been driven by remote, online transactions as in-person shopping has been sharply curtailed by a combination of health concerns, regulations, and, in some cases, supply chain interruptions that have affected the availability of commodities and finished goods. In this context, customer data management plays an increasingly important role in driving customer relationships, not only by personalizing data but also by managing the metadata needed to build appropriate hierarchies and relationships. Both now and as we move past the current time of Coronavirus, companies must support the data-enabled customer and reduce barriers to commerce and purchasing.

In exploring the Informatica Customer 360 solution, Amalgam Insights found several compelling aspects that enterprise organizations should consider as they build out their customer master data and seek a solution to maintain data consistency across all applications.

First, Amalgam Insights considers the combination of data management, metadata management, and data cleansing capabilities in Customer 360 to be important. Customer data is notorious for becoming dirty and inaccurate because it is linked to the characteristics of human lives: home addresses, email addresses, phone numbers, purchase activities, preferences, and more.

In this context, a master data solution focused on customer data must support clean and relevant data along with the business context provided by reference data and other relevant metadata. Rather than treating master data as a static and standalone record, Customer 360 brings together the context and cleansing needed to maximize both the value and the accuracy of master data.

Second, Customer 360’s use of artificial intelligence and machine learning will help businesses to maintain an accurate and shared version of the truth. AI is used in this solution to

  • assist with data matching across data sets (a generic matching sketch follows this list)
  • provide “smart” field governance to auto-correct data in master data fields with defined formats such as zip codes or country abbreviations
  • use Random Forests to support active learning for blocking and matching
  • support deep learning techniques for text matching and cleansing text that may be difficult to parse
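
To make the data matching item concrete, here is a generic sketch of the kind of fuzzy comparison that sits underneath customer record matching, using only the Python standard library. The sample records, field weights, and threshold are invented for illustration, and this is not Informatica’s implementation; production MDM matching adds blocking, survivorship rules, and learned models such as the Random Forests mentioned above.

    # Generic illustration of fuzzy customer-record matching (not Informatica's code).
    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        """Return a 0..1 similarity score between two normalized strings."""
        return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

    record_a = {"name": "Jonathan Smith", "email": "jon.smith@example.com"}
    record_b = {"name": "Jon Smith", "email": "jon.smith@example.com"}

    name_score = similarity(record_a["name"], record_b["name"])
    email_score = similarity(record_a["email"], record_b["email"])
    combined = 0.4 * name_score + 0.6 * email_score  # illustrative field weights

    print(f"name={name_score:.2f} email={email_score:.2f} combined={combined:.2f}")
    if combined > 0.85:  # illustrative threshold
        print("Likely the same customer: candidate for merge and survivorship review")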

Third, Informatica’s Customer 360 solution provides a strong foundation for application development, based both on its shared microservices architecture and on investments in DevOps-friendly capabilities, including metadata versioning supported by automated testing and defined DevOps pipelines. The ability to open up both the services available on the Customer 360 solution and the underlying data to custom applications and integrations will help the data-driven business make relevant data more accessible.

The Customer 360 product also includes simplified consumption-based pricing, a user interface designed for a simpler user experience, and improved integration capabilities including real-time, streaming, and batch functionality that reflect the changing nature of data. Amalgam Insights looks forward to seeing how a pay-as-you-go SaaS approach to Customer 360 is received, as this combination is relatively new in a master data management market where implementations are often treated as massive CapEx projects.

Overall, Amalgam Insights sees Informatica’s Customer 360 as a valuable step forward for both the master data management and customer data markets.

This combined vision of providing consumption-based pricing, a contextual and intelligently augmented version of the truth, and a combination of data capabilities designed to maximize the accuracy of customer data within a modern UX is a compelling offering for organizations seeking a customer data management solution.

As a result, Amalgam Insights recommends Customer 360 for organizations interested in minimizing the amount of time spent in cleansing, fixing, and finding customer data while accelerating time-to-context. Organizations focused on progressing beyond standard non-AI-enabled data cleansing and governance processes are best positioned to maximize the value of this offering. 


Why Babelfish for Aurora PostgreSQL is a Savage and Aggressive Announcement by AWS

On December 1 at AWS re:Invent, Amazon announced its plans to open source Babelfish for PostgreSQL in Q1 2021 under the Apache 2.0 license. Babelfish for PostgreSQL allows PostgreSQL databases to accept SQL Server requests and communication without requiring schema rewrites or custom SQL.

As those of you who work with data know, this is an obvious shot across the bow by Amazon to make it easier than ever to migrate away from SQL Server and towards PostgreSQL. Amazon is targeting Microsoft in yet another attempt to push database migration.

Over my 25 years in tech (and beyond), there have been many, many attempts to push database migration, and the vast majority have failed. Nothing in IT has the gravitational pull of the enterprise database, mostly because the potential operational and cost savings have almost never warranted the business risks of migration.

So, what makes Babelfish for PostgreSQL different? PostgreSQL is more flexible than traditional relational databases in managing geospatial data and is relatively popular, placing fourth in the DB-Engines ranking as of December 2, 2020. The demand to use PostgreSQL as a transactional database fundamentally exists at a grassroots level.

In addition, the need to create and store data is continuing to grow exponentially. There is no longer a “single source of truth” as there once was in the days of monolithic enterprise applications. Today, the “truth” is distributed, multi-faceted, and rapidly changing based on new data and context, which is often better set up in new or emerging databases rather than retrofitted into an existing legacy database tool and schema.

The aspect I consider most fundamentally important is that Babelfish for PostgreSQL allows PostgreSQL to understand SQL Server’s proprietary T-SQL. This removes the need to rewrite schemas and code for the applications that are linked to SQL Server prior to migration.
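
To illustrate what this means in practice, the sketch below sends unmodified T-SQL to a hypothetical Babelfish-enabled Aurora PostgreSQL cluster through a standard SQL Server Python driver (pymssql). The endpoint, credentials, database, and table are placeholders, and details may shift as the preview matures.

    # Sketch: an existing SQL Server client talking to a Babelfish TDS endpoint.
    # Endpoint, credentials, database, and table names are placeholders.
    import pymssql

    conn = pymssql.connect(
        server="my-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com",
        port=1433,  # Babelfish listens on a TDS port, 1433 by default
        user="babelfish_admin",
        password="********",
        database="orders_db",
    )
    cursor = conn.cursor()

    # T-SQL syntax (TOP) that vanilla PostgreSQL would reject, but that
    # Babelfish translates for the underlying PostgreSQL engine.
    cursor.execute("SELECT TOP 5 order_id, total FROM orders ORDER BY created_at DESC")
    for row in cursor.fetchall():
        print(row)
    conn.close()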

And it doesn’t hurt that the PostgreSQL community has traditionally been both open and not dominated by any single vendor. So, although this project will help Amazon, Amazon will not drive the majority of the project or supply the majority of its contributors.

My biggest caveat is that Babelfish is still a work in progress. For now, it’s an appropriate tool for standard transactional database use cases, but you will want to closely check data types. And if you have a specialized industry vertical or use case associated with the application, you may need an industry-specific contributor to help with developing Babelfish for your migration.

As for the value, there is both operational value and financial value. From an operational perspective, PostgreSQL is typically easier to manage than SQL Server and provides more flexibility in how the database is migrated and hosted. There is also an obvious cost benefit: removing SQL Server’s license cost will likely cut the cost of the database itself by roughly 60%, give or take, on Amazon Web Services. For companies that are rapidly spinning up services and creating data, those savings can be significant over time.

For now, I think the best move is to start looking at the preview of Babelfish on Amazon Aurora to get a feel for the data translations and transmissions, since Babelfish for PostgreSQL likely won’t be open sourced for another couple of months. This will allow you to gauge the maturity of Babelfish for your current and rapidly growing databases. Given the gaps that likely exist in Babelfish at the moment, the best initial use cases are databases where fixed text values make up the majority of the data being transferred.

As an analyst, I believe this announcement is one of the few in my lifetime that will result in a significant migration of relational database hosting. I’m not predicting the death of SQL Server by any means, and this tool is really best suited for smaller transactional databases (terabyte-scale and below) at this point. (Please don’t think of this as a potential tool for your SQL Server data warehouse just yet!)

But the concept, the proposed execution, and the value proposition of Babelfish all line up in a way that is client and customer-focused, rather than a heavy-handed attempt to force migration for vendor-related revenue increases.


Underspecification, Deep Evidential Regression, and Protein Folding: Three Big Discoveries in Machine Learning

This past month has been a banner month for machine learning, as three key reports have come out that change the way the average lay person should think about the field. Two of these papers are about conducting machine learning while considering underspecification and using deep evidential regression to estimate uncertainty. The third report covers a stunning result in machine learning’s role in protein structure prediction.

The first report, Underspecification Presents Challenges for Credibility in Modern Machine Learning, was written by a team of 40 Google researchers. Behind the title is the basic problem that different predictors can produce nearly identical results in a testing environment but vastly different results in production. It can be easy to simply train a model or optimize it for a strong initial fit. However, savvy machine learning analysts and developers will realize that their models need to be aligned not only to good test results, but to the full context of the environment, language, risk profile, and other aspects of the problem in question.

The paper suggests conducting additional real-world stress tests for models that may seem similar and understanding the full scope of requirements associated with the model in question. As with much of the data world, the key to avoiding underspecification seems to come back to strong due diligence and robust testing rather than simply trusting the numbers.
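
To make the stress-testing idea concrete, here is a small sketch using scikit-learn and synthetic data; the data, model, and shift are my own illustrative choices, not the paper’s experimental setup. Several models trained with different seeds land at similar held-out accuracy, and the shifted stress set reveals how differently they can behave once the input distribution moves away from the training data.

    # Sketch: models that look equivalent on i.i.d. test data can diverge under shift.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=4000, n_features=20, n_informative=5,
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # A crude "stress test": shift the test features and add noise.
    rng = np.random.default_rng(0)
    X_stress = X_test + 1.5 + rng.normal(scale=0.5, size=X_test.shape)

    for seed in range(5):
        model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=seed)
        model.fit(X_train, y_train)
        print(f"seed={seed}  test acc={model.score(X_test, y_test):.3f}  "
              f"stress acc={model.score(X_stress, y_test):.3f}")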

The second report is Deep Evidential Regression, written by a team of MIT and Harvard authors, who describe their approach as follows:

In this paper, we propose a novel method for training non-Bayesian NNs to estimate a continuous target as well as its associated evidence in order to learn both aleatoric and epistemic uncertainty. We accomplish this by placing evidential priors over the original Gaussian likelihood function and training the NN to infer the hyperparameters of the evidential distribution

http://www.mit.edu/~amini/pubs/pdf/deep-evidential-regression.pdf

From a practical perspective, this method provides a relatively simple way to understand how “uncertain” your neural net is compared to the reality that it is trying to reflect. This paper moves beyond the standard measures of variance and accuracy to start trying to understand how confident we can be in the models being created. From my perspective, this concept couples well with the problem of underspecification. Together, I believe these two papers will help data scientists go a long way towards cleaning up models that look superficially good, but fail to reflect real world results.
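
For readers who want to see the mechanics, below is a minimal NumPy sketch of the loss terms and uncertainty estimates described in the paper, assuming a model head that outputs the four evidential parameters (gamma, nu, alpha, beta) for each prediction. This is my paraphrase of the paper’s equations rather than the authors’ code, so verify against the original before relying on it.

    # Sketch of the evidential regression quantities from the paper, assuming
    # a regression head that outputs (gamma, nu, alpha, beta) per example.
    import numpy as np
    from scipy.special import gammaln

    def evidential_nll(y, gamma, nu, alpha, beta):
        """Negative log-likelihood of y under the Normal-Inverse-Gamma evidence."""
        omega = 2.0 * beta * (1.0 + nu)
        return (0.5 * np.log(np.pi / nu)
                - alpha * np.log(omega)
                + (alpha + 0.5) * np.log(nu * (y - gamma) ** 2 + omega)
                + gammaln(alpha) - gammaln(alpha + 0.5))

    def evidence_regularizer(y, gamma, nu, alpha):
        """Penalize confident evidence on large errors: |error| * total evidence."""
        return np.abs(y - gamma) * (2.0 * nu + alpha)

    def uncertainties(nu, alpha, beta):
        aleatoric = beta / (alpha - 1.0)         # expected data noise
        epistemic = beta / (nu * (alpha - 1.0))  # uncertainty in the prediction itself
        return aleatoric, epistemic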

Finally, I would be remiss if I didn’t mention the success of DeepMind’s program, AlphaFold, in the Critical Assessment of Structure Prediction challenge, which focuses on protein-structure predictions.

Although DeepMind has been working on AlphaFold for years, the current version, tested in this year’s challenge, provided results that were a quantum leap compared to prior years.

From DeepMind: https://deepmind.com/blog/article/alphafold-a-solution-to-a-50-year-old-grand-challenge-in-biology

The reason that protein folding is so difficult to calculate is that a protein has multiple levels of structure. We learn about amino acids, the building blocks of proteins, which are essentially defined by DNA: the A’s, T’s, C’s, and G’s provide an alphabet that defines the linear sequence of a protein, with each group of three nucleotides defining an amino acid.

But then there’s a secondary structure, where internal bonding can make parts of the chain line up as alpha helices or beta sheets. The overall folding of the chain, including this combination of helix and sheet shapes, makes up the tertiary structure.

And then multiple chains of tertiary structure can come together into a quaternary structure, which is the end game for building a protein. If you really want to learn the details, Khan Academy has a nice video to walk you through the details, as I’ve skipped all of the chemistry.

But the big takeaway: there are four levels of increasingly complicated chemical structure for a protein, each with its own set of interactions that make it very computationally challenging to guess what a protein would look like based just on having the basic DNA sequence or the related amino acid sequence.

Billions of computing hours have been spent trying to produce even a rough idea of what a protein might look like, and billions of lab hours have then been spent testing whether that guess is accurate or, more likely, not. This is why it is an amazing game-changer to see that DeepMind has essentially nailed what the final folded structure looks like.

This version of AlphaFold is an exciting, Nobel Prize-caliber discovery. I think this will lead to the first Nobel Prize driven by deep learning, and this discovery is an exciting validation of the value of AI at a practical level. At this point, AlphaFold is the “data prep” tool for protein folding, with the same potential to greatly reduce the effort needed simply to make sure that a protein is feasible.

This discovery will improve our ability to create drugs, explore biological systems, and fundamentally understand how mutations affect proteins on a universal scale.

This is an exciting time to be a part of the AI community and to see advances being made literally on a weekly basis. As an analyst in this space, I look forward to seeing how these, and other discoveries, filter down to tools that we are able to use for business and at home.


Updated Analysis: ServiceNow Acquires Element AI

(Note: Last Updated January 15, 2021 to reflect the announced purchase price.)

On November 30, 2020, ServiceNow announced an agreement to purchase Element AI, which was one of the top-funded and fastest-growing companies in the AI space.

Element AI was founded in 2016 by a supergroup of AI and technology executives with prior exits including Jean-Francois Gagne, Anne Martel, Nicolas Chapados, Philippe Beaudoin, and Turing Award winner Yoshua Bengio. This team was focused on helping non-technical companies to develop AI software solutions and expectations were only raised after a $102 million Series A round in 2017 followed by a $151 million funding round in 2019.

Element AI’s business model was similar to the likes of Pivotal Labs from a software development perspective or Slalom from an analytics perspective in that Element AI sought to provide the talent, skills, resources, and development plans to help companies adopt AI. The firm was often brought in to support AI projects that were beyond the scope of larger, more traditional consultancies such as Accenture and McKinsey.

However, Element AI found itself at a crossroads in 2020, caught between several key market trends. First, the barrier to entry for AI has dropped considerably due to the development of AutoML solutions combined with the increased adoption of Python and R. Second, management consulting revenue growth slowed in 2020, which reduced the velocity of Element AI’s pipeline and made it harder to project the “hockey stick” exponential growth expected of highly funded companies, especially in light of COVID-related contract delays. And third, the ROI associated with AI projects is now better understood to come largely from the automation and optimization of processes tied to already-existing digital transformation projects, which makes separate AI efforts duplicative, as Amalgam Insights has documented in our Business Value Analysis reports over time.

In the face of these trends, the acquisition of Element AI by ServiceNow is a very logical exit. This acquisition allows investors to get their money back relatively quickly.

(Update: On January 14, 2021, ServiceNow disclosed in a filing that the purchase price was approximately US $230 million, or CAD $295 million. This was a massive discount from the estimated $600 million-plus valuation implied by the September 2019 funding announcement.)

Not every bet on building a multi-billion dollar company works out as planned, but this exercise was successful in creating a team of AI professionals with experience in building enterprise solutions. Amalgam Insights expects that over 200 Element AI employees will end up moving over to ServiceNow to build AI-driven pipelines and solutions under ServiceNow’s chief AI officer Vijay Narayanan. This team was ultimately the key reason for ServiceNow to make the acquisition, as Element AI’s commercial work is expected to be shut down after the close of this acquisition so that Element AI can focus on the ServiceNow platform and the enterprise transformation efforts associated with the million-dollar contracts that ServiceNow creates.

With this acquisition, ServiceNow has also stated that it intends to maintain the Element AI Montreal office as an “AI innovation hub,” which Amalgam Insights highly approves of. Montreal has long been a hub of analytics, data science, and artificial intelligence efforts; maintaining a hub there helps ServiceNow from a technical perspective and could help assuage some wounds the Canadian government may have from losing a government-funded top AI company to a foreign acquirer. Given ServiceNow’s international approach to business and Canada’s continued importance to the data, analytics, and AI spaces, this acquisition could be an unexpected win-win relationship between ServiceNow and Canada.

What To Expect Going Forward

With this acquisition, ServiceNow has quickly gained access to a large team of highly skilled AI professionals at a time when its revenues are growing 30% year over year. At this point, ServiceNow must scale quickly simply to keep up with its customers and this acquisition ended up being a necessary step to do so. This acquisition is the fourth AI-related acquisition made by ServiceNow after purchases of Loom Systems for log analytics, Passage AI to support conversational and natural language understanding, and Sweagle to support configuration management (CMDB) for IT management.

At the same time, Amalgam Insights believes this acquisition will provide focus to the Element AI team, which was dealing with the challenge of growing rapidly while trying to solve AI problems ranging from AI for Good to defining ethical AI to building AI tools to discovering product-revenue alignment. The demands of trying to solve multiple problems as a startup, even an ambitious and well-funded one, can be problematic. This acquisition allows Element AI to serve as a professional services arm and a development resource for ServiceNow’s ever-evolving platform roadmap as ServiceNow continues to expand from its IT roots to take on service, HR, finance, and other business challenges as a business-wide transformational platform.


Analysis: Qlik acquires Blendr.io to Enhance Data Integration Capabilities

Key Stakeholders: Chief Information Officers, Chief Technology Officers, Chief Data Officers, Data Management Managers, Analytics Managers, Enterprise Information Managers

Why It Matters: SaaS and public cloud data sources are both rapidly proliferating and playing a bigger role in enterprise analytics. It is no longer enough to build a Single Source of Truth without being able to share and integrate data across all relevant sources.

Key Takeaway: This is an evolutionary time for analytics solutions as AI, process automation, public cloud, & SaaS proliferation are quickly changing the demands for analytics. This iPaaS (Integration Platform as a Service) acquisition shows how Qlik is augmenting its core capabilities to keep up with quickly-changing market demands.

About the Acquisition

On October 22, 2020, Qlik acquired Blendr.io, an integration Platform as a Service that supports data integration across more than 500 Software as a Service applications and cloud data sources. This research note analyzes why Qlik acquired Blendr.io, what this means for current and potential clients of both companies, and key recommendations for data and analytics professionals to consider based on this acquisition.

Blendr.io was founded in 2017 in Belgium by tech entrepreneur Niko Nelissen, who had previous experience building marketing automation, event ticketing, data hosting, and data center operations businesses. In short, his background spanned all aspects of providing data as a service, from backend data infrastructure to the front-end data used by sales and marketing technologies.

This background led to the creation of Blendr.io, which quickly arose as a tool to support data automation and workflows across data, alerts, caches, applications, and databases. This capability has proven especially valuable at a time when application proliferation has occurred and the “trusted business platform” has been replaced by a federation of best-in-breed applications that need to be connected from data, process, and synchronization perspectives.

Based on this need, Qlik’s acquisition of Blendr.io makes sense as a functional addition that both strengthens Qlik’s sales and marketing support and allows Qlik to play a greater role in delivering what they call “Active Intelligence.”

This acquisition is also accretive to Qlik’s prior acquisitions made since it became a Thoma Bravo portfolio company in 2016:

July 2018: Qlik acquires Podium Data to gain a Big Data-fluent data catalog and governance capability.

March 2019: Qlik acquires Attunity to support cloud data synchronization and to validate the advice I had provided at Attunity’s 2013 Analyst Day on the future valuation of Attunity.

January 2020: Qlik acquires RoxAI to support real-time alerts associated with data and analytic changes.

August 2020: Qlik acquires Knarr Analytics assets to strengthen its supply chain analytics capabilities and to bring in key talent.

This acquisition also comes as Qlik is in the process of retiring a prior acquisition, Qlik DataMarket. That 2014 acquisition was originally intended to help clients combine internal business data with external geographic, financial, or political data. But as access to public external data has become easier and business clients have found that simply managing internal data across a wide variety of private sources is the bigger challenge, Qlik has shifted its focus accordingly with this acquisition.

What to Expect from this Acquisition

Qlik customers with significant SaaS portfolios should be excited by this acquisition, as it now allows Qlik to develop native analytic products across a variety of marketing, sales, productivity, machine learning, and public cloud platforms. Qlik states that it will start launching products based on the Blendr.io acquisition in 2021. Amalgam Insights expects that the combination of iPaaS, data governance, and analytics will allow Qlik to create secure amalgams of data and process automation.

This step of integrating SaaS and cloud data with analytics is necessary for Qlik to deliver on the promise of the “data-driven enterprise” that we have all heard so much about over the past several years. To get beyond the hype, data must be contextualized, analyzed, and used both to accelerate basic rules-based actions and to support decisions based on more complex scenarios, politics, and strategies. Qlik’s acquisition of Blendr.io, alongside the Podium Data and Attunity acquisitions, is an important step forward in allowing Qlik to support a multi-app, multi-cloud, and multi-data environment where there is no longer a “single source of truth.”

For Blendr.io customers, expect Qlik to continue development on Blendr as a standalone product. It is in Qlik’s best interest to continue the development of Blendr’s iPaaS, as this is a competitive space. Blendr.io competes favorably with the likes of Dell Boomi, Jitterbit, MuleSoft, SnapLogic, Workato, and Zapier. Qlik will be pressured to maintain functionality on par or ahead of its competitors. However, similar to prior acquisitions, expect Blendr to see a name change and an expanded focus on supporting the Qlik portfolio of products. Over time, it would not be surprising to see this “Qlik iPaaS” being more integrated with the Qlik Data Catalyst catalog product and the replication and ingestion components of the Qlik Data Integration platform.

Recommendations

For Qlik customers, this is a good time to start putting pressure on your sales and account team about the types of products you would like to see from a combination of iPaaS integration, governed data, and analytics. Why build it yourself when you can make Qlik do a fair bit of the heavy lifting?

In addition, Qlik customers may want to share their current SaaS portfolios with their account teams to drive the development of the iPaaS. The challenge with managing SaaS-based data is not in connecting one app to one data source: APIs make that task fairly straightforward. The real challenge, based on estimates published by security vendors such as Symantec and Netskope, is that the average enterprise uses over 1,000 discrete apps and data-driven services, which leads to a near-infinite number of potential connections. Companies need a “starter kit” of connectors that handles the Pareto 20% of integrations representing 80% of their day-to-day work.
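
To put “near-infinite” in rough numbers (the app count below is simply a round figure in line with the published estimates):

    # Rough count of potential point-to-point integrations across an app portfolio.
    apps = 1_000                    # roughly in line with published enterprise estimates
    pairs = apps * (apps - 1) // 2  # undirected app-to-app connections
    print(f"{pairs:,} potential point-to-point connections")  # -> 499,500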

For Blendr.io customers, expect to be introduced to the Qlik portfolio of products. For those who have not looked at Qlik in a few years and think of it mainly as the QlikView visual discovery solution, take a look at Qlik’s portfolio across data management, data lake and warehouse management, data replication, and the Qlik Sense data engine, which includes more modern geospatial, search, and natural language analytic capabilities. Since Thoma Bravo’s acquisition of Qlik in 2016, Qlik has expanded its portfolio to support data management challenges.

Conclusion

Overall, Amalgam Insights is bullish on this latest acquisition, as it both fills a gap in Qlik’s existing portfolio of data and analytics capabilities and brings in a technology that is still relatively new and flexible for Qlik to integrate into its quickly growing portfolio. With this acquisition, Qlik gets one step closer to becoming an enterprise data solution.

From a market perspective, it is hard not to compare the moves Qlik makes to those made by fellow private equity-owned data companies TIBCO (purchased by Vista Equity Partners in September 2014) and Informatica (purchased by Permira and the Canada Pension Plan Investment Board in August 2015). Qlik’s focus on expanding data discovery and access across frequently used business data has been fairly consistent across its acquisitions. As these companies race toward the financial timelines and outcomes required by private equity, Amalgam Insights believes that Qlik is well on the path to creating a whole that is more than the sum of its parts.

If you have any additional questions about this acquisition, the current state of the business analytics market, or how to work with Amalgam Insights, please contact us at info@amalgaminsights.com to set up a consultation.


Analysis: CoreView Raises $10 Million Series B Round for SaaS Management

Key Stakeholders: CIO, CTO, CFO, Software Asset Managers, IT Asset Managers, IT Procurement Managers, Technology Expense Managers, Sales Operations Managers, Marketing Operations Managers.

Why It Matters: The investment in CoreView comes at a time when SaaS proliferation and management are becoming a core IT problem. CoreView’s leadership position in managing Microsoft 365 and enterprise SaaS portfolios makes it a vendor to consider to solve the SaaS mess.

Top Takeaway: Enterprises and IT suppliers managing large SaaS portfolios, whether from a financial or an operational perspective, must find a solution to manage the hundreds or thousands of SaaS apps and services under management or risk both security breaches and financial bloat, with millions of dollars at stake.

About the Funding Round

On October 5, 2020, CoreView raised a $10 million Series B round led by Insight Partners. CoreView provides a Software as a Service management platform that secures and optimizes Microsoft 365 and additional applications as an augmentation of the Microsoft 365 Admin Center.

CoreView was founded in 2014 in Milan, Italy by a team with experience as Microsoft system integrators to provide governance for Office 365 deployments. In October 2019, CoreView augmented its solution with the acquisition of Alpin, a SaaS management solution used to monitor SaaS activity and manage costs.

With this funding, CoreView is expected to increase both its direct clientele as well as its global reseller and service provider base. Having grown almost three-fold over the past year, CoreView is acquiring this funding at a time when SaaS management is becoming an increasingly important part of IT management.

From Amalgam Insights’ perspective, this funding is interesting for two reasons: the quality of the investor and the growing challenge of managing SaaS.

First, this round was led by Insight Partners, which has a strong history of investing in fast-growing DevOps and data companies in line with CoreView’s enterprise software management needs, including Ambassador, Carbon Relay, JFrog, Resolve, Dremio, and OneTrust. Because this investor has been deeply involved with investments in the future of software development and management, Amalgam Insights believes that Insight Partners provides value to CoreView as an investor.

Second, this funding is coming at a time when SaaS proliferation has become an increasingly challenging problem, and it indicates where the next wave of growth in IT management is going to occur. After a decade of stating that “there’s an app for that,” companies must now face the challenge of standardizing and optimizing their app environments. Security vendors such as Symantec and Netskope have published estimates that the average enterprise uses between 900 and 1,200 discrete apps and services on a regular basis, which creates a logistical nightmare.

A decade ago, I wrote about the challenges of Bring Your Own Device and the expense and inventory management issues associated with those $500 devices. But with the emergence of Bring Your Own App, multiplied by the sheer proliferation of productivity, sales and marketing, and other line-of-business applications, SaaS management was already coming of age as its own complicated challenge for IT while the SaaS market was growing 20-25% per year. With the challenges of COVID-19, SaaS has only become more important for keeping remote and work-from-home employees connected to key tools and data.

Recommendations

Based on this funding round, Amalgam Insights makes one key recommendation for IT departments: get control of your SaaS portfolio, which is likely scattered across line-of-business administrators, expense reports, and the half of SaaS tied to enterprise software that IT already manages. Even if app administration remains in the hands of line-of-business teams, IT needs to be aware of data governance, data integration and duplication, and zero trust-based management of least-privilege access across apps. IT still has a job in the SaaS era: SaaS represents roughly a quarter of all software in 2020, and Amalgam Insights projects that by 2025 the SaaS market will triple to approximately $300 billion and become half of the enterprise software market.

An additional recommendation for all IT agents, resellers, and service providers is to gain a SaaS management capability as soon as possible. At this point, this means looking at two areas: SaaS Operations Management focused on the governance and configuration of SaaS and SaaS Expense Management focused on the inventory and cost optimization of SaaS. There is some overlap between the two as well as some areas of specialization. From Amalgam Insights’ perspective, CoreView is recommended as both an Operations Management and an Expense Management solution with a specialization in supporting Microsoft 365.

If you have any questions about this research note or on the SaaS Management market, please contact Amalgam Insights at info@AmalgamInsights.com to set up time to speak with our analysts.