
Observable raises a $35 million B round for data collaboration

On January 13, 2022, Observable raised a $35.6 million Series B round led by Menlo Ventures with participation from existing investors Sequoia Capital and Acrew Capital. The round brings Observable's total funding to $46.1 million. Observable matters to the enterprise analytics community because it provides a platform that helps data users collaborate throughout the data workflow of discovery, analysis, and visualization.

Traditionally, data discovery, contextualization, analytics, and visualization may each be supported by a different solution within an organization. This complexity is multiplied by the variety of data sources and platforms that must be supported and by the number of people who need to be involved at each stage, which leads to an unwieldy number of handoffs, the risk of using the wrong tool for the job, and an extended development process caused by the inability of multiple people to work simultaneously on creating a better version of the truth. Observable provides a single solution that helps data users connect, analyze, and display data, along with a library of data visualizations that offers guidance on new ways to present data.

From a business perspective, one of the biggest challenges of business intelligence and analytics has traditionally been the inability to engage relevant stakeholders in sharing and contextualizing data for business decisions. The 2020s are going to be a decade of consolidation for analytics, in which enterprises have to make thousands of data sources available and contextualized. Businesses have to bridge the gaps between business intelligence and artificial intelligence, gaps that are mainly associated with the human aspects of data: departmental and vertical context, categorization, decision intelligence, and merging business logic with analytic workflows.

This is where the opportunity lies for Observable: allowing the smartest people across all aspects of the business to translate, annotate, and augment a breadth of data sources into directional and contextualized decisions, starting from the visualizations and analytic processes shared by a community of over five million users. And by letting users embed these insights across all relevant applications and websites, Observable brings insights to the places where decisions are actually made.

Observable goes to market with a freemium model that allows companies to try out Observable for free and then to add editors at tiers of $12/user/month and $40/user/month (pricing as of January 13, 2022). This level of pricing makes Observable relatively easy to try out.

Amalgam Insights currently recommends Observable for enterprises and organizations with three or more data analysts, data scientists, and developers who are collaboratively working on complex data workflows that lead to production-grade visualization. Although it can be more generally used for building analytic workflows collaboratively, Observable provides one of the most seamless and connected collaborative experiences for creating and managing complex visualizations that Amalgam Insights has seen.


Reviewing 2021 IT Cost Trends

IT Cost Management is one of the core practices at Amalgam Insights. This practice focuses on tracking both vendors and product offerings that help enterprises fight off the IT Rule of 30, Amalgam Insights’ observation that every unmanaged IT category averages 30% in bloat and waste and that this can be even greater for emerging technology areas such as cloud computing.

From our perspective, demand for a more holistic technology expense capability has been building at the enterprise level since the mid-2010s, and companies narrowly focused on managing telecom, mobility, software, and cloud computing as four separate IT silos will miss out on a variety of opportunities to optimize and rationalize costs.

In this practice, we tactically look at technology expense management vendors, including specialists in telecom expense, managed mobility services, cloud cost management, cloud FinOps (Financial Operations), Software as a Service management, IT finance solutions, hybrid cloud subscriptions and financing, and other new IT strategies that can lead to a minimum of 20-30% cost reduction in one or more key IT areas. In each of these areas, Amalgam Insights maintains a list of recommended vendors that have proven their ability to identify and fix the issues associated with the IT Rule of 30; these recommendations are provided both in our published research and in our end-user inquiries with enterprise clients.

With that out of the way, 2021 was a heck of a year from an IT management perspective. Although many pundits predicted that IT spend would go down in a year when COVID-driven uncertainty was rampant, these cost control concerns ended up being less relevant than the need to continue getting work done and the resilience of a global workforce ready and willing to do so. As a result, 2021 saw the true birth of the hybrid worker, one who is just as comfortable working in the office or at home as long as they have the right tools in hand. In the face of this work environment, we saw the following things happen.

The Rise of the Remote Employee – Amalgam Insights estimates that 30% of employees will never be full-time in-office employees again, as they have either moved home full-time or plan to only come into the office one or two times per week as necessary to attend meetings and meet with new colleagues and partners. Although many of us may take this for granted, one of the issues we still face is that in 2019, only 5% of employees worked remotely and many of our offices, technology investments, and management strategies reflect the assumption that employees will be centrally located. And, of course, COVID-19 has proven to be both a highly mutating virus and a disease fraught with controversies regarding treatment and prevention strategies and policies, which only adds to the uncertainty and volatility of in-office work environments.

Legacy networking and computing approaches fall flat – On-premises solutions showed their age as VPNs and the on-site management of servers became passé. At a time when a pandemic was running rampant, people found that VPNs did not provide the protection that had been assumed, as ransomware attacks more than doubled in the United States and more than tripled in the United Kingdom from 2020 to 2021. It turns out that unpatched servers and insecure on-premises ports ended up being more dangerous than many companies had considered. We also saw the Death of Copper, as copper-wired telecom services were finally cut off by multiple telecom vendors, leaving branch offices and the "Things" associated with operational technology rudely forced to move quickly to fiber or wireless connections. BlackBerry also finally decided to discontinue support of BlackBerry OS, forcing the last of the original BlackBerry users to migrate off of that sweet, sweet keyboard and join the touch-screen autocorrect world of smartphone typists. It was a tough year for legacy tech.

Core Mobility Grew Rapidly in 2021 – Core spend was up 8% due to device purchases and increased data use. In particular, device revenue was up nearly 30% year-over-year at major carriers such as AT&T, Verizon, and T-Mobile (now the largest carrier in the United States). However, spend on customized and innovative projects disappeared, both because 5G buildouts happened more slowly than initially expected and because 5G projects froze due to the inability to fulfill complex mesh computing and bandwidth backfill projects. This led to an interesting top-level result: overall enterprise mobility spend was fairly steady even though the shape of that spend was quite different from the year before.

Cloud Failures Demonstrated the Need for Hybrid and Multi-Cloud Management – Although legacy computing had its issues, cloud computing had its black eyes as well. With 8,760 hours in a year, a single hour of downtime takes you from 100% to roughly 99.99% uptime (four nines). Recent Amazon failures in November and December of 2021 demonstrated the challenges of depending on overstressed resources, especially us-east-1. This is not meant to put all the blame on Amazon, as Microsoft Azure is known for its challenges in maintaining service uptime as well and Google Cloud still has a reputation for deprecating services. No single cloud vendor has been dependable at the "five nines" level of uptime (about five minutes of downtime per year) that used to define high-end IT quality. Cloud has changed the fundamental nature of IT from "rock-solid technology" to a new mode of experimental "good enough IT" where the quality and value of new technology can excuse some small uptime failures. But cloud failures by giants including Akamai, Amazon, AT&T, Comcast, Fastly, and every other cloud leader show the importance of having failover and continuity capabilities that are at least multi-region in nature for mission-critical technologies.
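To make that availability math concrete, here is a quick back-of-the-envelope check, written as a short Python sketch, of how much downtime each "nines" tier actually allows:

    # Sanity-check the uptime math: with 8,760 hours in a year, how much
    # downtime does each "nines" availability tier allow?
    HOURS_PER_YEAR = 24 * 365  # 8,760

    for nines in range(2, 6):
        availability = 1 - 10 ** -nines            # e.g. 4 nines -> 0.9999
        downtime_minutes = HOURS_PER_YEAR * (1 - availability) * 60
        print(f"{nines} nines ({availability:.5f}): "
              f"{downtime_minutes:.1f} minutes of downtime per year")

Four nines allows roughly 53 minutes of downtime per year, so a single one-hour outage is enough to break it; five nines allows only about five minutes.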

Multi-cloud Emergence – One of the interesting trends that Amalgam Insights noticed in our inquiries was that Google Cloud replaced Microsoft Azure as the #2 cloud for new projects, behind market leader Amazon. In general, there was interest in using the right cloud for the job. Also, the cloud failures of leading vendors allowed Oracle Cloud to start establishing a toehold, as its networking and bare-metal support provided a ramp for mature enterprises seeking a path to the cloud. As I've been saying for a decade now, the cloud service provider market is going the way of the telcos, both in terms of the number of vendors and the size of the market. Public cloud is now a $350 billion global market, based on Amalgam Insights' current estimates, which amounts to less than 7% of the total global technology market. As we'll cover in our predictions, there is massive room for growth in this market over the next decade.

SD-WAN continues to be a massive growth market – From a connectivity perspective, Software-Defined Wide Area Networks (SD-WAN) continue to grow due to their combination of performance and cost-cutting. This market saw 40% growth in 2021, and vendors now use security as a differentiator now that the performance and cost benefits are already well understood. From an IT cost management perspective, this means there continues to be a need for holistic project management, including financial and resource management, for these network transformation projects. Without support from technology expense management solutions with strong network inventory capabilities, this won't happen.

As we can see, there were a variety of key IT trends that affected technology expenses and sourcing in 2021. In our next blog on this topic, we'll cover some of our expectations for 2022 based on these trends. If you'd like a sneak peek at our 2022 predictions, just email us at info@amalgaminsights.com.


About Amalgam Insights

Amalgam Insights is working on a new website experience for you in 2022 to help with your Technology Expense Management, Data and Analytics, and Business Planning Management challenges.

In the meantime, if you need to reach us, please contact Hyoun Park (hyoun@amalgaminsights.com) or Lisa Lincoln (lisa@amalgaminsights.com). We thank you for your patience as we get set up!


Domino Data Lab Raises $100 Million F Round to Enable the Model-Driven Enterprise

On October 5, 2021, Domino Data Lab announced a $100 million F round led by private equity firm Great Hill Partners and joined by existing investors Coatue, Highland Capital, and Sequoia Capital. Domino Data Lab is a company we have covered since the inception of Amalgam Insights in 2017. From the start, it was obvious that Domino Data was designed to support data science teams seeking to manage data science exploration and machine learning outputs with enterprise governance.

This investment is obviously an eye catcher and is in line with other massive rounds that data science and machine learning solutions have been raising, such as DataRobot’s July 2021 G round of $300 million, Dataiku’s August 2021 $400 million round, or Databricks’ gobsmacking August 2021 round of $1.6 billion. In light of these funding rounds, one might be tempted to ask the seemingly absurd question of whether $100 million is enough!

Fortunately, even in these heady economic times, $100 million is still a significant amount of cash to fund growth and the other funding rounds demonstrate that this is a hot market. In addition, Domino Data’s focus on mature data science practices and teams means that the marketing, sales, and product teams can focus on high-value applications for developers and data analysts rather than having to try to be everything for everyone.

In addition, the new lead investor Great Hill Partners is a firm that Amalgam Insights considers "smart money" in that it specializes in investments around this $100 million size with the goal of pushing data-savvy companies beyond the billion-dollar valuation. A quick look at Great Hill Partners shows that it has assigned both founder Chris Gaffney and long-time tech executive Derek Schoettle to this investment, both of whom have deep expertise in data and analytics.

With this investment, Amalgam Insights expects that Domino Data will continue to solve a key problem that exists in enterprise machine learning and artificial intelligence: orchestrating and improving models and AI workloads over time. As model creation and hosting have become increasingly simple to initiate, enterprises now face the potential issues of technology debt associated with AI. Effectively, enterprises are replacing “Big Data” issues with “Big Model” issues where the breadth and complexity of models become increasingly difficult to govern and support without oversight and AI strategy. This opportunity cannot be solved through automated model creation or traditional analytic and business intelligence solutions as the combinations of models, workflows, and governance associated with data science require a combination of testing, collaboration, and review that is lacking in standard analytic environments. With mature data science teams now becoming an early majority capability at the enterprise level, Domino Data’s market has now caught up to the product.

Domino Data's funding announcement also mentioned the launch of a co-selling agreement with NVIDIA. Although this agreement isn't novel, as NVIDIA has a variety of agreements with other software companies, this particular agreement allows NVIDIA and Domino Data to provide both the hardware and software to develop optimized machine learning at scale. Amalgam Insights expects that this agreement will allow enterprises to accelerate their development of machine learning models while providing a management foundation for the ongoing governance and support of data science. Enterprise-grade data science ultimately requires not only the technical capability to deploy a model, but also the ability to audit and review models for ongoing improvement or disconnection.

From an editorial perspective, it is amazing to see how quickly Domino Data Lab has grown over the past four years. When we first briefed Domino Data in 2017, we frankly stated that the solution was ahead of its time, as enterprises typically lacked the formal teamwork and organizational structure to support data science. It wasn't that businesses shouldn't have been thinking about data science teams, but rather that IT and analytics teams simply were not keeping up with the state of technology. In response, Domino Data launched a data science framework to define collaborative data science efforts.

Recommendation for Amalgam Insights’ Data and Analytics Community

Funding announcements typically come with growth expectations: the bigger the round, the higher the sales and marketing expectations. Domino Data is raising this money now both because it is seen as a market leader in supporting data science and because companies have reached a tipping point in requiring solutions for collaborative and compliant data science management.

Amalgam Insights’ key recommendation based on this funding round as well as recent funding from other vendors is to review current data science capabilities within your organization and ensure that the compliance, governance, and collaborative capabilities are on par with your current analytics, business intelligence, and application development capabilities. The toolkits for collaborative data science have evolved massively over the past couple of years and data science is no longer a task for the “lone-wolf genius” but for an enterprise team expected to provide high-value digital assets. Compare current data science operationalization and management solutions to existing in-house capabilities and conduct a realistic analysis of the time, risk, and total cost of ownership savings associated with each approach. With a mature vendor landscape now in place to help support data science, this is the time for early majority data science adopters to take full advantage of their capabilities over market competitors by creating a mature data science environment and quickly building AI where competitors still depend on manual or static black-box processes.


Market Alert: NetApp Agrees to Acquire CloudCheckr to Improve Cloud Cost Environments

On October 4th, 2021, NetApp announced a definitive agreement to acquire CloudCheckr, a market leader in cloud financial and operational optimization. NetApp positions this acquisition as accretive to its June 2020 acquisition of Spot (now called Spot by NetApp) to support hybrid cloud optimization and usage management. This Market Alert explains why NetApp agreed to purchase CloudCheckr and provides recommendations for cloud professionals seeking to make a decision on purchasing or evaluating a cloud cost, cloud optimization, or Cloud FinOps (Financial Operations) solution.

About CloudCheckr

CloudCheckr was founded in 2011 in Rochester, New York in the United States as a solution to support the cost management, operational automation, compliance, and security of cloud Infrastructure as a Service. CloudCheckr was founded by Aaron Klein and Aaron Newman, who currently serves as Chairman. Over the past decade, CloudCheckr has grown to over $4 billion in spend under management, supporting over 600 clients and 10,000 employee users.

CloudCheckr raised its first significant round of funding in 2017, when it announced a massive $50 million Series A round from Level Equity. (Note: This funding round occurred a couple of months before Amalgam Insights was founded, but I covered the announcement at my previous firm.)

This unusually large round of funding was justified by CloudCheckr's status as a profitable bootstrapped organization with the opportunity to scale in a high-growth area. At the time, CloudCheckr had over 150 clients and $1 billion in spend under management, meaning that the organization has grown roughly fourfold in the four years since that initial round of funding. CloudCheckr also raised a second round of $15 million in 2019 from Level Equity to support product and engineering capabilities, around the same time that the firm appointed Tim McKinnon as CEO.

Contextualizing the CloudCheckr Acquisition

Cloud optimization has been a rapidly growing market for several reasons: the IT Rule of 30, the growth of the IaaS market, and the nascent and emerging nature of best practices for managing cloud computing.

First, Amalgam Insights' IT Rule of 30, which states that every unmanaged IT subcategory averages 30% waste, is definitely true for Infrastructure as a Service (IaaS), where cloud spend is poorly governed and where end users, procurement, accounting, and finance rarely work together as a team to manage these costs in a coordinated fashion. From a cloud perspective, this percentage roughly equates to moving from about 50% utilization to 80%+ utilization of provisioned services based on active monitoring of services.
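As a back-of-the-envelope illustration of that utilization math (a sketch with arbitrary capacity units, not vendor pricing), raising utilization from 50% to 80% shrinks the provisioned footprint by an amount in line with the Rule of 30:

    # If workloads need a fixed amount of useful capacity, raising utilization
    # from ~50% to ~80% shrinks the provisioned (and billed) footprint.
    useful_capacity = 100                        # units of work actually needed

    provisioned_at_50 = useful_capacity / 0.50   # 200 units provisioned
    provisioned_at_80 = useful_capacity / 0.80   # 125 units provisioned

    savings = 1 - provisioned_at_80 / provisioned_at_50
    print(f"Provisioned capacity reduction: {savings:.0%}")  # ~38%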

Second, the IaaS market as a whole continues to grow roughly 25% per year, as roughly 60% of institutional or enterprise-grade storage and compute now lives in the cloud rather than in asset-intensive data center investments.

Third, cloud IaaS billing and product deployment are still fairly immature or agile (depending on your point of view) with rapid launches, updates, changes, and obsolescence based on adoption trends and customer requests. From a functional perspective, this rapid change can often provide great value, but it also means that the financial expectations associated with instances can often change without any formal change management, billing review, or contractual review. All of these trends lead to a volatile billing, usage management, and compliance environment that is difficult to manage without a combination of proactive analysis, alerts, and holistic visualization.

How CloudCheckr Augments Spot by NetApp

NetApp's acquisition of CloudCheckr fits well into the trends of this space and can be seen as part of a wave of acquisitions that includes Apptio's 2019 acquisition announcement of Cloudability, Flexera's 2018 acquisition of RightScale, and VMware's 2018 acquisition of CloudHealth Technologies. All of these acquisitions filled the needs of IT management providers to support multi-cloud management for enterprises and managed service providers seeking to manage large pools of cloud spend and resources. From Amalgam Insights' perspective, these are still early days for cloud computing as a whole, as cloud currently makes up roughly a third of enterprise data infrastructure spend. From a market perspective, Amalgam Insights believes that we are in a period where cloud infrastructure cost management vendors represent growth assets now that multi-cloud best practices are starting to emerge and cloud service providers are treated more along the lines of telecom carriers providing services with utility pricing and capacity.

At a time when cloud computing is obviously the highest growth area for IT spend, with IaaS spend expected to double every three years for the rest of this decade, IT systems management firms see the complexity of cloud as a fundamental challenge to the ongoing management of cloud services.

This acquisition builds on Spot by NetApp's existing capabilities in usage and resource tracking as well as NetApp's recent acquisition of Data Mechanics to support big data analytics. Although initial press releases and interviews position CloudCheckr as an acquisition to help support Spot by NetApp, Amalgam Insights notes that these two technology solutions are different in nature.

Spot by NetApp excels in providing a software-driven capability for monitoring and optimizing storage and compute infrastructure. This optimization provides a lot of value and can often be seen as a be-all and end-all for infrastructure cost management in identifying the portfolio of on-demand, reserved, and spot instances used to support infrastructure.
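To illustrate why that portfolio mix matters, here is a minimal sketch with purely hypothetical hourly rates; real rates vary widely by provider, region, and instance type:

    # Hypothetical $/hour rates -- illustrative only, not actual provider pricing.
    rates = {"on_demand": 0.40, "reserved": 0.25, "spot": 0.12}

    all_on_demand = {"on_demand": 100, "reserved": 0, "spot": 0}
    blended_mix = {"on_demand": 20, "reserved": 50, "spot": 30}

    def hourly_cost(mix):
        return sum(rates[kind] * count for kind, count in mix.items())

    baseline = hourly_cost(all_on_demand)  # 100 instances, all on-demand
    optimized = hourly_cost(blended_mix)   # same 100 instances, blended
    print(f"Blended portfolio saves {1 - optimized / baseline:.0%} per hour")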

However, experienced IT expense managers have seen that IT cost management requires a holistic lifecycle approach involving a combination of usage optimization, service order automation, resource governance, inventory management, multi-cloud sourcing, invoice and payment management, and effective alignment of services with business-driven demand. This level of analysis requires a view into the products, cost centers, projects, and comparative cloud usage patterns that may require changing services and providers or using alternative billing approaches such as setting up reserved instances or savings plans for ongoing operationalization.

At the same time, cost management in the cloud is also often related to managing access and governance associated with existing resources. A basic example of this issue is Amazon S3 bucket governance, which can be both a security issue and a potential cost issue based on what is placed within the bucket.
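As a simple illustration of this kind of governance check (a sketch assuming the boto3 library and configured AWS credentials, not a representation of how Spot by NetApp or CloudCheckr implement it), the following flags buckets that lack a full public access block:

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    # Flag buckets whose public access block is missing or incomplete.
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            config = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"]
            fully_blocked = all(config.values())
        except ClientError:
            # No public access block configured at all
            fully_blocked = False
        status = "OK" if fully_blocked else "REVIEW: public access not fully blocked"
        print(f"{name}: {status}")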

Recommendations for the Cloud, FinOps, and NetApp Communities

As we consider this acquisition, it is important for us to not simply recommend a purchase, but to provide a course of action that will help IT departments to optimize their cloud environments. Based on this acquisition, Amalgam Insights provides the following recommendations based on our experience in tracking cloud cost and Kubernetes cost management over the past four years.

  1. To manage cloud costs, resource optimization is just the starting point. To fully tackle the IT Rule of 30 and regain all of the misplaced IT costs created in less governed times, it is important to make sure that all orders are governed with business logic. The goal here is not to prevent developers from quickly building but to make sure that every service is accounted for, effectively governed, and disconnected in a timely and appropriate manner. From a practical perspective, this monitoring requires some level of centralization that allows all developers and architects to have a shared version of the truth and a consistent inventory that brings together all accounts and services used by IT.
  2. For CloudCheckr customers, this acquisition provides an opportunity to take advantage of the Spot by NetApp cost optimization capability, especially in selecting spot instances that can greatly reduce the cost of managing standard cloud workloads. This spot management capability requires a combination of process modeling and price monitoring that is typically outside the core skills of cloud architects or IT expense professionals that are looking at cloud costs.
  3. For Spot by NetApp customers, consider both the value of presenting cloud costs for accounting and finance audiences as well as the power of governing resources to drive additional cost savings and increase the maturity of treating cloud as a strategic business resource. These are capabilities that CloudCheckr provides for enterprise cloud environments. From a practical perspective, IT departments should check and see if they have already covered these important aspects of cloud management either with homegrown or other third-party solutions. Amalgam Insights recommends that organizations that have not filled these gaps should consider adopting CloudCheckr capabilities.

Analyst Insight: Raindrop Systems

Executive Summary

Key Stakeholders: Chief Procurement Officer, Procurement Directors and Managers, Controllers, Vice Presidents of Accounting, Accounting Directors and Managers, Legal Directors, Finance Directors and Managers, Sales Operations Directors and Managers, Marketing Operations Directors and Managers

Why Raindrop Matters: Mid-market organizations between $100 million and $1 billion in annual revenue are in a tricky situation: they have the advantage of being relatively small and nimble, but they must also face enterprise-grade operational challenges in supporting global supply chains, customer bases, and vendor ecosystems. Raindrop provides organizations at this size and above with a freemium SaaS product that supports the adoption of mature enterprise spend practices while addressing the governance, compliance, and security issues that enterprises face.

Top Takeaway: Amalgam Insights recommends Raindrop as a business spend management solution to be considered for organizations with over $20 million in revenue that currently have fragmented or non-existent sourcing management, supplier management, and payment management capabilities.

Introduction to Raindrop Systems

Amalgam Insights recently briefed with Raindrop Systems, an emerging startup focused on Enterprise Spend Management. This vendor got our attention because it has come up in inquiries as a potential solution for managing contracts and spend in mid-market organizations between 100 million and 5 billion dollars in annual revenue, though Raindrop does support enterprise-sized clients exceeding 10 billion dollars in annual revenue.

Founded in 2019 and based in Santa Clara, California in the United States, Raindrop is part of the SaaS trend of "Built by X, for X" companies: it was built by procurement professionals to solve enterprise procurement lifecycle demands from sourcing to payment. Amalgam Insights notes that this is a trend in the finance, accounting, and sourcing markets, where subject matter experts have come in to build SaaS solutions that have quickly become competitive with legacy solutions developed by programmers or technicians lacking deep enterprise practitioner experience. Raindrop was founded by Vijay Caveripakkam and Ward Karson, who bring both procurement practitioner and consulting backgrounds to the company.

Contextualizing Raindrop in the Enterprise Spend Market

In exploring the Raindrop solution, Amalgam Insights notes that Raindrop's key differentiation points are a modern user experience that supports collaboration, workflow automation capabilities, analytic tracking of savings, clear calls to action to support cost savings, and supplier risk management. The last of these is of particular note, as Amalgam Insights has traditionally seen spend management solutions focus on being systems of record, which led to the practical outcome of a data repository that was both complete and overwhelming to analyze on a regular basis. In the 2020s, as Big Data has become regular data and every large data source overwhelms the human ability to manually audit and query data effectively, people need a combination of workflow automation, machine learning, natural language parsing and processing, and rapid contextualization of data to keep up with the constant increases in contract text, transactions, and payments.

Raindrop currently has two key focus areas. The first is organizations with less than $50 million in business spend, based on the product's ease of implementation and a freemium approach that allows clients to sign up for a production-ready instance of the service at no cost and see how contractual obligations can be administered through a formalized solution. The second is larger mid-market to enterprise organizations with $100 million to $500 million in business spend.

Raindrop is designed to be a general spend solution to support planning, supplier management, sourcing, contract management, and payables within a single solution. This integrated lifecycle approach is intended to ensure that data capture associated with contract, vendor, and stakeholder management occurs during transactional execution of payments, contract execution and renewal, and supplier evaluation. As part of this process, Raindrop includes a semi-automated contract loading process that currently automates about half of the contract term entry. From a data privacy perspective, Raindrop has placed a focus on GDPR and on not keeping payment information to support privacy and governance issues such as PCI DSS and FedRAMP compliance.

This solution comes to market at a time when the mid-market procurement market faces a bit of a vacuum. When Scout RFP was purchased by Workday for $540 million, it had made inroads into the mid-market to support cloud-based sourcing and supplier management. Post-acquisition, however, Scout RFP development has been more focused on the enterprise customers that Workday serves. This trajectory is similar to what planning and budgeting solution Adaptive Insights experienced after its own acquisition by Workday.

At the same time, mid-market organizations face greater complexity in their sourcing environments, as competitive markets, lowered barriers to purchase, and the proliferation of suppliers in a variety of markets have led to the need to manage spend more closely. As just one example, Amalgam Insights estimates that the average 500-employee company with approximately $100 million in revenue is supporting 250 SaaS vendors, but that fewer than 100 of them are being formally sourced or managed through procurement. This haphazard approach leads to spend leakage, missed renewal opportunities, and the inability to aggregate and negotiate licenses over time. And although mid-market companies have typically started to invest in spend management, it is typically in a piecemeal fashion where contract management, purchasing, invoices, payments, and budgeting can each live in a separate application or spreadsheet, with no direct coordination between silos other than manual employee changes made on an ad-hoc basis.

Because of both the increasing complexity of mid-market procurement management and the potential for cost savings through competitive and bulk purchasing exercises, Amalgam Insights believes that mature sourcing and spend management capabilities are now necessary for mid-market organizations that are responsible for good stewardship. Amalgam Insights estimates that the total cost of ownership savings from sourcing management typically matches the expense of an employee or full-time equivalent at about 100 employees or $20 million in annual revenue, and that this cost crossover can occur even earlier in a company's progression if its operations are tech-heavy.

Amalgam Insights’ Recommendations

Amalgam Insights believes that Raindrop’s offering provides a low barrier to entry for business spend as a solution that is as-a-Service, free to start, and provides a modern user interface on par with current web-based applications while providing the breadth of services needed to support business spend management and a standard set of security and compliance requirements needed to support data privacy. In light of the increasing importance of managing the source-to-pay lifecycle for mid-sized organizations, Amalgam Insights recommends Raindrop as a business spend management solution to be considered for organizations with over $20 million in revenue that currently have fragmented or non-existent sourcing management, supplier management, and payment management capabilities.


Market Alert – Zoho Launches its Business Intelligence Platform

On July 13, 2021, Zoho announced its Business Intelligence Platform, which combines data preparation, machine learning, analytics, presentation, and application development into a single suite of services. The platform brings in Zoho's new DataPrep capability and combines it with the existing Zoho Analytics product. The combined platform provides a single solution to clean, augment, analyze, and present data as a visual presentation, an application, or an answer to a question asked in standard English. This analysis is based on both briefings and discussions with Zoho customers.

By way of background, Zoho is a private company founded in 1996 to provide business applications in the cloud. Today, Zoho supports a suite of 45 business applications available as an integrated bundle called Zoho One. Zoho's analytics offering, Zoho Analytics, was first launched as Zoho Reports in 2009 to help users understand their data. Zoho Analytics supports over 10,000 customers across both cloud and on-premises environments. Zoho also supports over 300 white-labeled embedded BI customers across telco, supply chain, and other enterprise environments.

Since 2009, Zoho Analytics has evolved to build in-house capabilities supporting data management, a data warehouse, data discovery, collaboration, and a breadth of APIs, all used to serve a combination of self-service users and embedded analytics use cases. Zoho's analytic platform is based on four key components:

  1. Data Preparation through Zoho DataPrep, Zoho's self-service data preparation application supporting data quality, enrichment, transformation, and data modeling. DataPrep uses machine learning to provide guidance on data enrichment and modeling.
  2. Visualization: Zoho enhances traditional dashboarding, reporting, and charting capabilities by providing both access and integration to Zoho Sites, Zoho’s website and portal building product, and Zoho Show, which focuses on presentations. By providing the tools to present and publicly share data outputs, Zoho Analytics extends beyond the dashboard to provide standard access to the digital tools that are typically used to share information: presentations, meetings, intranets, and the internet.
  3. Augmented Analytics, where Zoho's presentation of descriptive and predictive analytics is enhanced both by the Ask Zia conversational AI (described later in this report) and by Zia Insights, which provides a natural language interface for asking questions about the data.
  4. Applications, where Zoho provides a low-code platform called Zoho Creator to build analytic applications that can then be shared through Zoho Marketplace with internal or external customers.

As can be seen from this description, Zoho provides a number of applications in its business suite (including a recently released Zoho Contracts product focused on sourcing) that expand beyond analytics, providing a full business suite of applications for small and medium-sized businesses.

Zoho Analytics includes native data management capabilities including data preparation, cleansing, integration, transformation, enrichment, and modeling. This combination of capabilities allows Zoho to keep data relevant over time and to ensure that data is aligned to business terminologies and taxonomies.

Zoho also has its own data warehouse used to model data for analytics. This data warehouse is native to Zoho and includes both in-memory and columnar services to accelerate access to data. Although Zoho typically targets the SMB market, this warehouse has been proven to scale to terabyte-sized data with billions of rows for clients that have needed to grow their analytic environments. This warehouse is currently only available as a part of Zoho Analytics and can be mapped to other cloud data warehouses such as Snowflake or Amazon Relational Database Service (RDS) through metadata mapping.

Zoho Analytics works with Zoho's applications but also includes over 250 native integrations to other vendors used by its clients, including HubSpot, Microsoft Dynamics, QuickBooks, Salesforce, ServiceNow, Shopify, and Xero, with the understanding that some Zoho clients may use other applications as well. With these integrations, Zoho Analytics rivals many standalone providers, as demonstrated by the fact that 60% of Zoho Analytics usage is associated with data from outside the Zoho One suite.

Zoho Analytics provides all of these services on its own cloud, Zoho Cloud, which has both GDPR and CCPA compliance. Based on customer demand, Zoho has also made Zoho Analytics available on-premises for specific geographies, markets, and use cases, which is an interesting departure from Zoho’s typical focus on cloud apps. Based on this combination of data management capabilities, Amalgam Insights considers Zoho Analytics to take a full-stack approach that allows companies to build an analytic environment solely within the Zoho ecosystem.

The pricing for a paid Zoho Analytics account starts at $24 per month as an entry point and can increase up to $455 per month to support up to 50 users, 50 million rows of data, and unlimited workspaces. Zoho Analytics can also be purchased as part of the Zoho One Suite, which brings almost all of Zoho’s applications into a single subscription which ranges from $37 to $105 per user per month depending on whether a company chooses to enroll every employee or seeks a more flexible model for bringing a portion of employees onto Zoho.

Zoho Analytics' Support for Modern Analytic Capabilities

All of Zoho Analytics' capabilities across data management, integration, and analytic distribution help to scale the core capability of visualization and analysis. Amalgam Insights views these visualization and analytic capabilities for reports and dashboards as on par with the current analytic market, but today's market requires analytic companies to push forward to support natural language queries, collaboration, and machine learning. In this light, Amalgam Insights looked at Zoho Analytics' capabilities across six areas: natural language processing, statistical and advanced analytics, collaboration, mobility, APIs, and future-facing roadmap.

Zoho has taken steps toward this next generation of analytic delivery through its Ask Zia capability. Zia serves as Zoho's AI assistant. Initially launched in February 2017, Zia was originally designed as a sales assistant for the Zoho CRM product, providing reminders for contacting prospects and taking relevant actions at the right time. Since then, Zia has evolved into an assistant that understands natural language questions across multiple applications, including the Analytics product. The word "questions" is used here rather than "queries" because the inputs to Ask Zia are plain-language questions and requests such as "show me the sales for last year" rather than query-based text strings such as "Search sum of sales amount where Year equals 2020." Although the two phrases are semantically similar, the former is based on a standard use of language while the latter requires an understanding of querying logic that is not natural. With Ask Zia, Zoho Analytics is a conversational analytics solution capable of providing insights and results to any employee based on the language they use in their everyday workplace.

Business analytics solutions are also increasingly asked to support more statistical and advanced analytics capabilities to help more advanced data users understand probabilities, forecasting, data relationships, trending and regression analysis, seasonality and time-series relationships, and model choices to support optimal forecasting and root-cause analysis. Zoho Analytics first announced forecasting capabilities in November of 2018 and continues to increase support for relevant models to improve results for regression, exponential smoothing, and seasonal-trend decomposition using Loess (STL).
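For readers curious what that last technique looks like in practice, here is a minimal sketch of STL decomposition on a synthetic monthly series, using Python's statsmodels library rather than Zoho's own implementation:

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.seasonal import STL

    # Synthetic monthly sales: upward trend + yearly seasonality + noise
    months = pd.date_range("2018-01-01", periods=48, freq="MS")
    rng = np.random.default_rng(0)
    sales = (100 + 2 * np.arange(48)
             + 10 * np.sin(2 * np.pi * np.arange(48) / 12)
             + rng.normal(0, 3, 48))
    series = pd.Series(sales, index=months)

    # Decompose into trend, seasonal, and residual components
    result = STL(series, period=12).fit()
    print(result.trend.tail(3))     # smoothed long-run trend
    print(result.seasonal.tail(3))  # repeating seasonal pattern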

From a consumption perspective, Zoho Analytics supports embedded analytics, web-based apps, as well as the ability to share linked data through collaboration platforms such as Slack, Microsoft Teams, or Zoho’s own Zoho Cliq. Zoho Analytics results can be embedded in webpages, shared as a permalink, placed into slideshows, and provided as public resources. Zoho Analytics also supports analytic storytelling through its slideshows, which work similarly to storyboards that exist in other analytics solutions and are used to present Zoho Analytics workspaces.

All of these analytic outputs are available through the Zoho Analytics mobile app, which is designed to emphasize the workspaces, dashboards, and reports created within Zoho Analytics. The app also supports exporting and commenting on analytic views.

From a programmatic perspective, Zoho Analytics functionality is also available through an API that is accessed by a variety of other Zoho applications, such as Zoho CRM, Zoho Creator (a low-code application platform), Zoho Projects (project management), and Zoho Books (accounting), and that can also be used to build add-on capabilities for third-party and custom-built applications. The API can be used with a variety of commonly used programming languages, including Java, C#, Python, PHP, Go, and Google Applications, through prebuilt "Client Libraries" created to support each language, and includes a variety of interfaces to access data, modeling, metadata, collaboration, embedded analytics, and Single Sign-On integration. The API also includes SQL access to Zoho Services data for those seeking query-based access, through what Zoho calls CloudSQL, which supports ANSI, Oracle, Microsoft SQL, IBM DB2, MySQL, PostgreSQL, and Informix-flavored SQL queries.

Amalgam Insights notes that Zoho has stated that future roadmap items for Zoho Analytics include further improvements in self-service analytics, the data catalog, and the analytic formula engine to increase access to computational tools needed for advanced analytics. Self-service analytics has been a key goal for business intelligence over the past decade, driven by solutions such as Tableau and Qlik. Amalgam Insights considers self-service analytics an intermediate stage between the analyst-controlled reports and dashboards that defined the initial generation of analytic solutions and the current era of contextualized insight, where analytic outputs are guided by natural language, pre-contextualized for users through role definitions and machine learning, and augmented by a combination of human judgment and third-party data sources to guide business-relevant actions.

The data catalog is increasingly important as companies seek to contextualize data for machine learning and data science. By creating a catalog that effectively categorizes and indexes the relationships within relevant business data, businesses can be more prepared for the near-future where they use advanced algorithms and neural nets to support better decision-making and automate high-volume transactional judgment calls.

And Amalgam Insights believes that access to statistical tools, machine learning algorithms, and production-ready machine learning models will provide strategic advantages to the quantitatively savvy organizations that realize that behind the math are opportunities to improve organizational processes, increase business transaction volume and size, and proactively support customer issues.

Amalgam Insights’ Recommendations

Based on the pricing and new functionality of Zoho Analytics, Amalgam Insights recommends Zoho Analytics both as a starting point for using business intelligence and as a tool to bring together data across small and medium businesses that are starting to bring together a variety of applications and have outgrown their initial data organization strategies.

The variety of capabilities Zoho provides across prep, presentation, and application development also makes Zoho a useful and cost-effective augmentation for organizations that have already invested in visualization, self-service analytics, and data discovery. In large organizations, where it is not uncommon to invest in multiple analytics and business intelligence products as long as they drive productivity, Amalgam Insights believes Zoho Analytics should also be considered as an end-to-end analytics solution that can quickly take new data sources from prep to scalable, cloud-based app.


The Emerging Age of Decision Intelligence

Amalgam Insights recently caught up with diwo, a decision intelligence solution that provides companies with a consistent approach for contextualizing recommendations and developing portfolios of strategies, scenarios, and actions. We first spoke with diwo in 2017 at its launch at the Strata Conference.

Based on discussions with the vendor and with enterprises facing challenges with their analytics, machine learning, Big Data, and data management environments, Amalgam Insights believes that decision intelligence will be a foundational evolutionary stage for enterprise analytics environments in the 2020s and that diwo has an opportunity to be a leading player in this market.

To explain why, consider how the value chain from data recognition and contextualization all the way to recommendation has been largely ignored in the enterprise analytics world, as analytics products have been overly focused on the process of "self-service" analytics that drives employees into constant and unending cycles of discovery to identify data that might be of value.

There are four core issues in enterprise data and analytics that decision intelligence solves:

  1. Analytic recommendations that lack a human dimension; decision intelligence makes these recommendations more human.
  2. "Analysis paralysis," where the search for results and self-driven discovery can lead to a never-ending set of analyses.
  3. Solutions that support natural language and search-based interfaces for asking data questions typically lack "memory."
  4. Even when end users ask the correct questions, it is hard for analytics solutions to provide a semantic or contextual sense of whether the fields involved are relevant. "Correlation is not causation" is a standard Statistics 101 truism, but in traditional analytics solutions correlation is often conflated with and presented as causation.

To read the rest of this piece, read our new report diwo and the Emerging Age of Decision Intelligence available at no cost until July 2nd. This report covers key aspects of this emerging era and how Amalgam Insights believes diwo can play a role in this new era.


Neo4j Takes on the Battle for Context with a $325 Million F Round

On June 17th, 2021, Neo4j, a graph database company, announced a $325 million investment led by a $100 million investment from Eurazeo and joined by new investors GV (previously named Google Ventures), DTCP, and Lightrock as well as existing investors Creandum, Greenbridge Partners, and One Peak. Eurazeo is a private equity company with over 15 billion euros in assets under management as part of a larger investment portfolio of over 22 billion euros. With this round, Eurazeo Managing Director Nathalie Kornhoff-Brüls joins the Neo4j Board of Directors.

This monster funding round speaks to the confidence that investors have in the future of Neo4j. But in this particular instance, Amalgam Insights believes that this large funding amount is especially important because of what it means for breaking the status quo of enterprise analytics.

Analytics and data management in the business world have been built around the relational database focused on controlling and governing individual data inputs. This fundamental framework has been very useful in creating an environment that can be configured to present a single shared source of truth. However, it is not especially good at supporting and processing data relationships, which is a challenge in today’s data environment as data grows quickly and data relationships increasingly represent some level of transaction or behavior aligned with a business activity that needs to be tracked or analyzed in near-real-time.

In addition, the hype regarding artificial intelligence and machine learning has finally crossed over into practical reality as the toolkits for operationalizing models have reached mainstream availability. Enterprises may not fully understand machine learning, but they can easily purchase access or use open source projects to obtain the data management, model creation, storage, and compute capabilities needed to support machine learning projects. But for companies to fully execute on the promise of machine learning, they need to create more efficient relationship-based data environments that allow models to be tested and to provide results. Building relationship-centered data is part of what I originally called the Battle for Context when Amalgam Insights was first founded.

And now four years later, Neo4j has a chance to deliver on this challenge for context at a global scale. Neo4j has been a graph data leader for years, especially since it started back in 2007 before the need for graph database management was fully clear to the enterprise market at large. Since then, Neo4j has been a stalwart in its market education of graph data. But it has fundamentally been fighting a status quo where companies have been either unwilling or unable to translate their key transactional data environments into the relationship-based models that will be necessary for broad machine learning. With this round of funding, Neo4j finally has a chance to conduct the volume of marketing and sales needed to educate the data and analytics audience. In contrast to other large rounds of funding announced in the data world, such as Snowflake’s $479 million round in February 2020 or Databricks’ $1 billion round in February 2021, Amalgam Insights believes that Neo4j’s funding round serves a slightly different purpose.

Those previously mentioned funding rounds were all seen as final rounds before an upcoming IPO, with participation by software vendor partners in each company's ecosystem. In contrast, Neo4j has both a more foundational opportunity and a more foundational challenge: graph should be the foundation of enterprise machine learning and relationship-driven data environments, but the ecosystem and platform maturity are still not quite where the data warehouse market is. Amalgam Insights sees this round as more similar to DataRobot's $270 million round raised in November 2020, which allowed DataRobot to continue acquiring companies and building out its platform to fit enterprise challenges.

Ultimately, enterprises should associate two goals with graph data: unlocking relationships within data that would take orders of magnitude more time, money, and skill to discover in relational data, and unlocking tens to hundreds of millions of dollars in value through machine learning and artificial intelligence operationalization opportunities that have already been identified but cannot run at high performance without a better data environment. The acquisition and use of graph databases addresses a technological bottleneck that would otherwise prevent enterprises from fully unlocking AI, and we are only now reaching a point where the understanding of relationship data, training data, machine learning feedback, and transactional data is sufficient for business managers to understand the value of dedicated graph databases rather than simply placing a graph structure on relational or multimodel data.

Recommendation to the Amalgam Insights database and analytics community

At the very least, start learning about graph data structures as combinations of edges, vertices, and relationships, as well as the linear algebra that shows how graph data differs from the standard high school algebraic logic of relational databases. Yes, learning math and a new set of data relationships is not as easy as downloading a library or learning a new software functionality. But graph relationships are a fundamental change in the way data will be managed over the next couple of decades, and there will be a great deal of work needed both to ETL/ELT relational data into graph databases and to manage graph databases for an upcoming world in which AI replaces aspects of standard business analytics.
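To give a feel for the shift, here is a minimal sketch using the official neo4j Python driver against a hypothetical customer-product graph; the connection details and schema are placeholders, not a specific production model:

    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))

    # Products bought by peers who share purchases with a given customer --
    # a two-hop relationship query that would take several joins in SQL.
    query = """
    MATCH (c:Customer {id: $customer_id})-[:PURCHASED]->(p:Product)
          <-[:PURCHASED]-(peer:Customer)-[:PURCHASED]->(rec:Product)
    WHERE rec <> p
    RETURN rec.name AS recommendation, count(*) AS strength
    ORDER BY strength DESC LIMIT 5
    """

    with driver.session() as session:
        for record in session.run(query, customer_id="c-123"):
            print(record["recommendation"], record["strength"])
    driver.close()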

If your organization is looking at relationship analytics or machine learning initiatives beyond a single project, look at Neo4j, which currently has a dominant position as a standalone graph database and is available as open source under GNU General Public License (GPL v3).

And if you have questions about the current state of Neo4j or are trying to bridge gaps from BI to AI in your organization, please contact research@amalgaminsights.com to schedule time to speak with our analysts. We look forward to serving you in our continued role in helping you to understand the future of your data.


Alation Raises $110 Million D Round to Help Businesses Better Understand Their Data

On June 3rd, Alation announced a $110 million D round led by Riverwood Capital with participation from new investors Sanabil Investments and Snowflake Ventures. Current investors Costanoa Ventures, Dell Technologies Capital, Icon Ventures, Salesforce Ventures, Sapphire Ventures, and Union Grove Partners also contributed to the round. With this round, Alation has raised a total of $217 million and is valued at $1.2 billion.

One of the first things that stands out with this investor list is how Alation serves as an example of “investipartnering” where business partners also become limited equity partners. With Snowflake, Salesforce, and Dell all on the cap table, Alation stands out as being a strategic partner for some of the biggest cloud players on the planet. 

Here’s why this investment makes sense in today’s data environment.
One of the biggest challenges for data in 2021 is effectively governing and defining data across a wide variety of sources. Alation has been a pioneer and is now a consistent market leader in data catalogs from both a revenue and a functionality perspective.

Yet there is still a massive greenfield opportunity to rationalize taxonomies, naming conventions, integrations, and data-centric decisioning processes within the larger enterprise data ecosystem. These challenges were already difficult enough for analytics, where businesses have had data warehouse, master data, and data integration technology for decades. But now this data also has to be prepped for machine learning and AI, where these structures are less useful. One of the reasons that a variety of industry estimates state that data scientists spend as much as 80% of their time cleansing data is that data scientists have either eschewed traditional enterprise data structures or are simply unaware of the analytic data ecosystem built in enterprises over the past several decades as they seek to tackle problems.

In an agile, "Post-Big Data" world, the true hub of data intelligence is either at the catalog or the datalake level, depending on how data is used and organized. In today's data world, the data warehouse is an important piece of core infrastructure for enterprise data, but it is not typically agile enough to support the rapid data selection, augmentation, transformation, and analysis associated with both self-service analytics and machine learning efforts. In this modern data context, Alation is a vital player in advancing the cause of referencing, contextualizing, and linking datasets together rapidly.

And the investment by Snowflake Ventures reflects Snowflake's recognition that it needs more control over less-structured data. Snowflake is under pressure to justify its massive valuation as a cloud data leader and now has to meet the growth expectations of being worth well over $50 billion, having reached a peak valuation of over $125 billion in its brief tenure as a public company. Alation will be a vital part of Snowflake's story in providing a more agile environment for the entirety of enterprise data as Snowflake moves closer to a variety of datalake capabilities that allow for more flexible data transfer.

I've covered Alation since its Series A in 2015, led by Costanoa Ventures (which has since established itself as a premier early-stage investor in data-driven startups), back when I was the Chief Research Officer at Blue Hill Research. Alation's focus on data navigation at a time when Hadoop was seen as The Big Data Answer ended up being prescient, and Alation's value is now established with over 250 enterprise clients.

But there is a larger opportunity. As Alation has expanded from a data catalog solution to a broader data discovery, context, governance, and collaboration solution and as the challenges of data and metadata management move downmarket, Alation’s capabilities are increasingly aligned with fundamental market needs to categorize and share data effectively.

To become a global solution, Alation needs to get into thousands of organizations, a goal that I think is now realistic with this latest round of funding that both boosts sales in the short term and sets a path to ongoing scalable growth.

Recommendation for the Data Management Community

The key takeaway for the data community is that legacy data management tools typically lack the speed and functionality necessary to identify, classify, and organize new data for new analytics, machine learning, and AI use cases. This includes everything from unstructured documents to relevant binary files to time-series, graph, and geographic data. This problem has driven both the commercial and investor interest in Alation and this problem is moving downmarket as more organizations seek to start building scalable and repeatable machine learning, data science, and analytic application development environments. Organizations that are not actively planning to improve their metadata and data collaboration efforts will find themselves fundamentally hampered in trying to make the leap from BI to AI and in keeping up with the new business world of augmented, automated, process mapped, natural language-based, and iterative feedback-driven transformation. Behind all the buzzwords, companies must first understand their existing data and contextualize their new data.