
Calero-MDSL Acquires Network Control to Support Mid-Market TEM Demand

On August 2, 2022, Calero-MDSL announced the acquisition of Network Control, a telecom expense and managed mobility services vendor based in Waverly, Iowa. This acquisition continues Calero-MDSL's acquisitive streak and reinforces its position as the largest telecom expense management provider in terms of spend under management.

Founded in 1998 and headquartered in Waverly, Iowa, Network Control provides telecom expense management and managed mobility services and was privately held with no outside investment. The company is owned by Mark Hearn, a long-time TEM executive who purchased it in 2011. Amalgam Insights estimates that Network Control roughly doubled in headcount to approximately 100 employees between the 2011 purchase and the 2022 sale to Calero-MDSL.

With this acquisition, Calero-MDSL makes greater strides into the mid-market, gaining a client base of more than 75 customers that collectively represents over 200,000 devices and $300 million in spend under management. From a pure spend perspective, Network Control does not represent a substantial addition to Calero-MDSL's estimated $22 billion under management. However, Network Control brings several important capabilities to Calero-MDSL that will be vital for the continued growth of the combined company.

First, Network Control has shown the ability to consistently win new business in the mid-market enterprise and is known for its retention. In Amalgam Insights' CIO Guide for Wireless Expense Management, Network Control was listed as one of Amalgam Insights' Distinguished Vendors based on its 98%+ customer retention rate, with the majority of account losses over time coming from merger and acquisition activity or from the cessation of business activities. The mid-market, enterprises with between $1 million and $20 million in annual telecom spend, is an increasingly competitive space for the large TEM vendors, which are reaching the practical limits of saturation among the Global 2000 where they have traditionally focused. As TEM has become an established business practice over the past 15-20 years, TEM vendors have polished both their software platforms and managed services capabilities and are now better positioned to take these capabilities downmarket to support the next $200 billion in global mid-market telecom and technology spend, a segment that has traditionally been almost a greenfield market.

In addition, Network Control brings strong managed mobility services capabilities, with approximately 100 employees trained to support a managed mobility services organization across operations, logistics, sales, and other business functions. This experience will be valuable to Calero-MDSL in bolstering its existing managed mobility capabilities. Network Control is also known for its flexibility and client-centric focus in bringing new services to clients, as well as for the quality of its customer service.

Network Control also has a sustained record of winning deals against the likes of Tangoe and Sakon, two of Calero-MDSL's largest rivals in the TEM space. In our CIO Guide, we saw that Network Control faced competition in approximately 80% of its deals, which is indicative both of a relatively educated pool of potential customers and of Network Control's ability to win against larger vendors.

What to Expect?

First, for mid-market businesses between $100 million and $5 billion in annual revenue, expect increased attention from TEM companies seeking to manage your telecom spend. They are looking for environments that have been managed manually or with spreadsheets and that fall under the IT Rule of 30, which states that any unmanaged IT spend category averages 30% in duplication and waste. This will also be a shift for TEM and MMS vendors that have traditionally sold into the mid-market and found that their biggest competition was the status quo. As larger vendors push into what is being called the "mid-market" or the "mid-enterprise," expect to see more competitive deals. Calero-MDSL has acquired a company with a history of winning mid-market business against Calero-MDSL's biggest rivals based on its understanding of mid-market pain points and service needs. By adding marketing and sales muscle to Network Control's operational capabilities, Calero-MDSL has an opportunity to support the mid-market in an unprecedented way.

Second, this acquisition looks like it could kick off a second wave of TEM consolidation. In the early 2010s, there was a massive wave of consolidation in the TEM market driven by venture capital-backed vendors seeking exits or running out of funding. In the 2020s, the situation is slightly different: the firms that remain tend to be privately owned, profitable companies that have established both best practices and processes to support loyal customer bases. We have started to see the acquisition of these private firms with Motus' purchases of Vision Wireless and Wireless Analytics and now with this deal, but there are at least a half dozen additional firms with strong mid-market experience that would be strong candidates for a similar acquisition or rollup. The big caveat is that any acquisition of these companies needs to be coupled with a strong customer service culture, as the mid-market TEMs Amalgam Insights covers frequently average 98% retention or higher with over 100% wallet share; this is a demanding market where technology, services, and client management must be aligned.

Third, this acquisition shows that the cost of acquiring talent is still significant in the TEM world. Calero-MDSL would have needed an extra year to find the volume of high-level talent that it is gaining all at once with the acquisition of Network Control. Personnel with experience managing the spend and procurement of millions of dollars in annual technology spend are still relatively rare, and this skill will become increasingly necessary in the recessionary times we are currently facing. Companies cannot simply eliminate technology, so they will need to financially reconcile their environments with both in-house and third-party resources. Network Control has proven its ability to maintain a high level of service by maintaining a high staff-to-client ratio, a practice that Amalgam Insights recommends keeping, as the relative cost of labor is smaller than the cost of finding a new customer.

Fourth, it is safe to assume that Network Control was purchased for its talent, capabilities, and client base rather than its software platform. Although Network Control's TEMNet is a functional platform, the investment that Calero-MDSL has put into its own platform ensures that customers will eventually be migrated to it. As long as this migration is handled carefully, it should not be a challenge; Calero-MDSL has prior experience migrating clients from previous acquisitions, including A&B Groep and Comview.

Overall, Amalgam Insights believes that this acquisition will be accretive to Calero-MDSL both in providing greater capacity to support managed mobility services and in learning the demands of mid-market clients from an experienced team. The acquisition will also eventually provide Network Control clients with access to the Calero-MDSL platform, which has been built to support global environments and now also includes Unified Communications as a Service and Software as a Service support. Amalgam Insights believes this acquisition demonstrates Calero-MDSL's continued commitment to expanding its market share and providing telecom and technology expense savings to a wider range of organizations.


Market Alert: Vendr Raises $150 Million B Round to Help Enterprises Purchase SaaS More Efficiently

On June 16, 2022, Vendr, a SaaS (Software-as-a-Service) purchasing platform, announced a $150 million Series B round co-led by prior investor Craft Ventures and new investor SoftBank Vision Fund 2 and joined by Sozo Ventures, F-Prime Capital, Sound Ventures, Tiger Global, and Y Combinator. The company states that this funding will drive platform enhancements.

Why this funding announcement matters

To fully contextualize this announcement, Amalgam Insights will dig into the macroeconomic issues driving its importance, the tactical importance of a SaaS purchasing solution within Technology Lifecycle Management (TLM), and the nature of the investment compared to other historical funding announcements in the TLM space.

Macro Trends for Corporate Spend Reduction

First, this announcement comes at a time when the United States is facing inflation that approaches double digits. The current 8.6% inflation rate threatens to devour the average 8.19% net margin that publicly traded companies (excluding financial services) currently achieve. In addition, we are facing a global recessionary trend driven by COVID, supply chain issues, geopolitical strife including the invasion of Ukraine, strained Sino-US relations, inconsistent oil and gas policies, and an excess of money supply created over the past several years. In the face of these global challenges, it is prudent for companies to reduce discretionary costs where possible and to shift those costs to strategic growth areas. Traditionally, recessions have been a time when strong companies invest in their core so that they can execute when the economy picks up again.

SaaS as a Strategic and Expanding Complex Spend Category

In this context, SaaS is a massive, but complex, opportunity to cut costs. Amalgam Insights estimates that the SaaS market has grown 25% per year in each of the last two years. Multiple studies show that enterprises that have reached the billion-dollar annual revenue threshold average over 300 apps directly purchased by the organization and over 900 apps running across their environments, whether on in-office networks or on employee devices. These hundreds of apps equate to hundreds, possibly thousands, of accounts and bills that can be consolidated, negotiated, and potentially rationalized to concentrate spend on strategic vendors and gain purchasing power. To look at just one SaaS subcategory, it is not uncommon to find large enterprises using 20 or more different project management solutions.

This rationalization is vital if enterprises are to take the IT Rule of 30 seriously: Amalgam Insights' observation that any unmanaged IT category averages a 30% opportunity to cut costs. But capturing that 30% requires following the technology lifecycle to fully uncover those opportunities.

Technology Lifecycle Management

The majority of companies that Amalgam Insights speaks to in the IT expense role limit their diligence to the right side of this lifecycle, including timely bill payment, possibly cross-charging to relevant business entities and cost centers, and right-sizing expenses by finding duplicate or over-provisioned accounts. While this is necessary to execute on the IT Rule of 30, it is not sufficient. In the SaaS space, Amalgam Insights believes there is conservatively a $24 billion spend reduction opportunity globally based on improved SaaS purchasing and negotiations. At the micro level, this equates to roughly $2 million for the average billion-dollar-plus enterprise, with results varying widely based on SaaS adoption (SaaS only makes up 30% of overall enterprise software spend globally), company size, and level of internal software contract knowledge.

Putting The Investment in Perspective

Amalgam Insights understands the scale of this business opportunity. Even so, this $150 million B round represents a massive round in the Technology Lifecycle Management space. Consider other large funding rounds in this space including:

Zylo’s 2019 $22.5 million B Round for SaaS Management

BetterCloud’s 2020 $75 million F Round for SaaS Management

Productiv’s 2021 $45 million C Round for SaaS Management

Beamy’s 2022 $9 million A Round for European SaaS Management

Torii’s 2022 $50 million B Round for SaaS Management

and looking further across the Technology Management spectrum

Cloudability’s 2016 $24 million B Round for IaaS Management (later acquired by Apptio)

CloudCheckr’s 2017 $50 million A Round for IaaS Management (later acquired by NetApp)

CloudHealth’s 2017 $46 million D Round for IaaS Management (later acquired by VMware)

MOBI’s 2015 $35 million investment round for Managed Mobility (later acquired by Tangoe)

I hasten to add here that more is not always better. But this range of funding rounds is meant to show the amount of investment that typically goes into solutions designed to manage technology expenses, inventory, and sourcing. At first glance, to those who do not cover this space closely, Vendr's funding round may seem like just another announcement amid the billions and trillions of dollars flowing through the tech sector. But as someone who has covered telecom, cloud, and SaaS expense management closely for the last 14 years, I can say this round stands out as a massive investment in this space.

In addition, the investors involved in this round are top-tier, including Craft Ventures, whose founder David Sacks (formerly of PayPal) has been a proponent of Vendr, and the combination of Tiger Global and SoftBank, which may be the two most aggressive funds on the planet in terms of placing big bets on the future. The presence of both smart money and aggressive money in this investment during a quasi-recessionary period speaks to the opportunity that exists here.

What to expect from this round?

The official word from Vendr so far is that this funding round is about data and platform. Vendr acquired SaaS cost and usage monitoring firm Blissfully in February 2022 to bring sourcing and expense management together and support the full lifecycle for SaaS. Amalgam Insights expects that some of these funds will be spent to better integrate Blissfully into Vendr's operations. In addition, the contract information that Vendr holds represents a massive data and analytics opportunity, but this will likely require some investment in non-standard document management, database, machine learning, and data science technologies to integrate documents, tactics, terms, and results. Whether this investment takes the form of a multi-modal database, graph database, sentiment analysis, custom modeling, process mining, process automation, or other technologies remains to be seen, but the opportunity to gain visibility into the full SaaS lifecycle and optimize agreements continuously is massive not only from a cost perspective but also from a digital transformation perspective. The data alone represents an immediate opportunity to either productize the benchmarks or to provide clients with ongoing guidance on aligning SaaS usage and acquisition trends with other key operational, revenue, and employee performance trends.

This part is editorializing, but Vendr has the opportunity to dig deeper into tech-driven process improvement compared to current automation platforms that focus on documenting and driving process, but have to abstract the technologies used to support the process. In the short term, Vendr has enough work to do in creating the first SaaS Lifecycle Management company that brings buying, expense, and operations management together. But with this level of funding, Vendr has the opportunity to go even further in aligning SaaS to business value not only from a cost-basis perspective, but from a top-line revenue contribution perspective. Needless to say, Amalgam Insights looks forward to seeing Vendr deliver on its vision for managing and supporting SaaS management at scale and to tracking the investments Vendr makes in its people, products, and data ecosystem.


HP Offers to Acquire Poly

On March 28, HP announced an offer of $3.3 billion to acquire integrated communications vendor Poly. Poly, created from the merger of Plantronics and Polycom, is an interesting target because both firms have a long history of supporting remote and home offices, and both have dealt with the challenges of the digital office. But this acquisition also hints at a potential split for HP.

HP is obviously known as a printer company and printer ink prices ($3,000 per gallon) make even the most expensive gas pumps look like amazing bargains. But HP also has its Z by HP workstation brand, which is well-aligned to the Poly portfolio. It would be great to see that combined Poly/Z portfolio come together as the future of the digital office and to create that new “office in a box” or “office in a browser” that is always a goal for tech companies. There are still a few gaps in the portfolio, though.

The starting point is good spatial audio. As Poly has known since its telepresence days, the two big secrets to optimal video conferencing are life-sized video and spatial audio. Both are hardware accessory issues: camera and speakers. Poly is great at the former and so-so at the latter. To take this a step further, HP Poly can be the smart accessory (and maybe even the programmable accessory) company providing all of the accessories beyond the phone and PC to support a better office, but this also requires continued API investment. Poly could have been the smartwatch and VR headset company, but didn't keep up. The opportunity is still there if Poly takes the immersive home office seriously and provides the one-stop shop for transforming the kitchen/guest bedroom/garage/remote office room into a communications hub.

And all that video and audio data is an obvious fit with the data science portfolio of Z by HP. So, if all this makes sense, what is the issue?

Printer Ink.

For HP to pursue this path, it must embrace a business model with one eye towards the actual Metaverse: VR, AR, workflow digitization, and eliminating the need for print. Z/Poly provides an obvious set of next steps: smart accessories, continued growth of the developer community, process automation, and workflow orchestration. Printers can be a part of this future if they are "iPhoned" to support higher dpi and eliminate the need for constant ink, but anybody who has ever tried to implement a printer from scratch knows just how prehistoric this experience is compared to the mobile, SaaS, Big Data world that is pervasive in our consumer lives, where even our refrigerators and light bulbs can now give us recommendations.

Does HP have the stomach to truly disrupt itself over the next decade, as Netflix did when it wiped out its mail business and destroyed the value of its DVD library? Or will it spin out Z/Poly to maximize value? Or will Poly become a cash cow held back by legacy HP? HP now has more tools to truly reinvent the digital home office at a time when remote employees can dip into the real estate budget. It will be fairly clear within this calendar year which of these options reflects HP's true intentions: wither, cash cow, or innovate.

For the sake of the innovative geniuses who have worked at Plantronics and Poly over the years, I really hope their technology gets a chance to reach the next level. And as an analyst who has dabbled in their market, I look forward to seeing what big brains @blairplez, @DaveMichels, and @zkerravala have to say about this proposed acquisition, as I have found their guidance and perspective invaluable over the years.

From a Technology Expense Management perspective, the big takeaway here is that the telecom environment is moving farther and farther away from the dedicated phone systems, and now even the mobile devices, that have traditionally been the hub of voice and video. HP's acquisition of Poly will be part of a trend of creating more focused home office solutions as the future of the hybrid workplace requires less investment in 100,000 square foot (10,000 square meter) headquarters spaces and more investment in the 20 square feet (2 square meters) that we choose to work in at any given point. These accessories will require purchasing and tracking just as all business assets do, and they may have additional connectivity or computational support demands over time, just as smartwatches, connected Internet of Things devices, and edge computing devices do. Connected devices belong in a unified endpoint management solution, but this HP acquisition may start raising questions as to whether remote office management is part of a managed print strategy, an enterprise mobility strategy, or a general IT asset strategy. Amalgam Insights recommends that remote office tech investment, which will eventually match enterprise mobility at a total cost of ownership of $2,000 per employee per year for all relevant hybrid and home employees, be handled as part of an enterprise mobility strategy where device management and logistics have already been defined.


Russia Invades Ukraine: 5 Considerations for the IT Community

As anyone who has checked the news today is aware, Russia invaded Ukraine early this morning as the United Nations was holding an emergency meeting seeking to persuade Russia not to invade. The initial results have included stunning pictures of Russian military vehicles and missiles entering Ukraine, the Moscow Stock Exchange falling over 30% in one day, and new international sanctions.

Although the subtleties of geopolitical complexity, NATO, the historical Russian Empire, Ukrainian governmental changes, European oil and gas supplies, and nuclear arms are far, far beyond the scope of what we cover at Amalgam Insights, we absolutely hope for a quick and peaceful end to this attack.

In the meantime, we live in a global economy, and there are aspects of this invasion that specifically affect the IT world.

First, plan for potential delays in software development. Ukraine had established itself as an important nearshore and offshore application development source with over 200,000 skilled developers. Many top software companies and enterprises employ developers from Kyiv and other Ukrainian cities. With this invasion, developers are either moving west to Lviv, Ivano-Frankivsk, and Lutsk, crossing into Poland, or being conscripted into defense forces. From a practical perspective, this is going to delay development of new versions and features. Check in with your key vendors to see whether delays are expected based on this issue. Obviously, no feature is more important than these lives; this is just about being able to manage expectations and to keep in touch with the people who are building the tools you use at work.

Second, check up on cybersecurity. With current sanctions and financial access locked down, Russia will be looking for liquid funds by any means necessary. This includes ransomware, hijacking computing resources for cryptomining, and using remote computing to mask trails to access other digital assets. This is a good time to update your patches and passwords and to be vigilant against social engineering schemes designed to get employees to click through or give away passwords over the phone. Clicking unknown links is always bad, but this is an especially good time to be paranoid about updates, even from trusted vendors and suppliers.

Third, keep your cryptocurrency and NFTs (non-fungible tokens) safe. Crypto has been an enabler for black market activity because it is a relatively liquid asset that is easy to transfer. Make sure that any digital assets you or your organization have are backed up on a well-governed store such as ClubNFT. And make sure your crypto is safe in a wallet you own.

Fourth, budget for cloud costs to increase quickly over the rest of the year as the cost of computing increases. Russia and Ukraine are the primary producers and purifiers of the noble gas neon, which is used to etch semiconductors at nodes from 180 nm down to 1x nm, nodes that make up roughly 75% of the total market. Ukraine provides 90% of the world's supply of purified neon, with Iceblick alone estimated to provide over 60% of the world's neon. As strategic Ukrainian targets are attacked, the supply of neon will decrease in the short term, pushing chip prices up. Even if Russia manages to create its own purification capacity, sanctions will make neon extremely expensive. As an example, when Ukraine was initially invaded in 2014, neon prices went up 6x.

Fifth, expect a flood of disinformation across all areas. Modern war is conducted not only as a military exercise, but as a financial, digital, informational, and political exercise. Putin and the Russian government are interested in controlling information for their own specific reasons, which can lead to non-factual announcements. This is going to be, in technical terms, "a pain in the ass" to manage as fact checking becomes more important. The disinformation may touch cybersecurity, healthcare, politics, or any number of other areas with the goal of providing distractions. As a key ally of Ukraine and a core member of NATO, the United States will likely be a target of the rumor mill in a variety of ways. Ironically, I'll use a Russian proverb for this recommendation: Доверяй, но проверяй (Doveryay, no proveryay – Trust, but verify).

And, obviously, make sure that your organization is not dependent on Russian computing and financial resources; as cyber and financial conflict escalates, the risk that those resources will be cut off from the rest of the world is unfortunately real.

This invasion is a sad and worrisome time for the world. In our roles as technologists and IT shepherds, there is only so much we can do. But it is up to us to make sure that the assets and services that we manage are kept safe and in control in challenging times. Stay safe and keep your organization as safe as possible.


Alteryx Acquires Trifacta: Considerations for DataOps, MLOps, & the Analytic Community

On February 7, 2022, Alteryx completed its acquisition of Trifacta, a data engineering company known for promoting "data wrangling" and for bringing to the forefront the challenge of cleansing data to make Big Data useful and to support machine learning. Alteryx announced its intention to acquire Trifacta on January 6 for $400 million, with an additional $75 million dedicated to an employee retention pool.

Trifacta was founded in 2012 by Stanford Ph.D. Sean Kandel, then-Stanford professor Jeffrey Heer, and Berkeley professor Joe Hellerstein as a data preparation solution at a time when Big Data was starting to become a common enterprise technology. The company was formed based on Wrangler, a visual data transformation tool that tackled a fundamental problem: reducing the estimated 50-80% of work time that data analysts and data scientists spend preparing data for analytical use.

Over the past decade, Trifacta raised $224 million with its last round being a $100 million round raised in September 2019. Trifacta quickly established itself as a top solution for data professionals seeking to cleanse data. In a report I wrote in 2015, one of my recommendations was “Consider Trifacta as a general data cleansing and transformation solution. Trifacta is best known for supporting both Hadoop and Big Data environments, including support for JSON, Avro, ORC, and Parquet.” (MarketShare Selects a Data Transformation Platform to Enhance Analyst Productivity, Blue Hill Research, February 2015)

Over the next seven years, Trifacta continued to advance as a data preparation and data engineering solution as it evolved to support major cloud platforms. During this time, three key trends emerged in the data preparation space starting in 2018.

First, data preparation companies focused on the major cloud platforms starting with Amazon Web Services, then Microsoft Azure and Google Cloud. This focus reflected the gravity of net-new analytic and AI data shifting from on-premises resources into the cloud and was a significant portion of Trifacta’s product development efforts over the past few years.

Second, data preparation firms started to be acquired by larger analytic and machine learning providers, as seen in Altair's 2018 acquisition of Datawatch and DataRobot's 2019 acquisition of Paxata. Having helped develop the data preparation and wrangling market, Trifacta was the last market-leading data preparation company still available for acquisition.

Third, the task of data preparation evolved into a new role of data engineering as enterprises grew to understand that the structure, quality, and relationships of data had to be well defined to get the insights and directional guidance that Big Data had been presumed to hold. As this role became more established, data preparation solutions had to shift towards workflows defined by DataOps and data engineering best practices. It was no longer enough for data cleansing and preparation to be done, but for them to be part of governed process workflows and automation within a larger analytic ecosystem.

All this is to provide guidance on what to expect as Trifacta now joins Alteryx. Although Trifacta and Alteryx are both often grouped as “data preparation” solutions, their roles in data engineering are significantly different enough that I rarely see situations where both solutions are equally suited for a specific use case. Trifacta excels as a visual tool to support data preparation and transformation on the top cloud platforms while Alteryx has long been known for its support of low-code and no-code analytic workflows that help automate complex analytic transformations of data. Alteryx has developed leading products across process automation, the analytic blending in Designer, location-based analytics in Location, as well as machine learning support and Alteryx Server to support analytics at scale.

Although Alteryx provides data cleansing capabilities, its interface does not provide the same level of immediate visual feedback at scale that Trifacta provides, which is why organizations often use both Trifacta and Alteryx. With this acquisition, Trifacta can be used by technical audiences to identify, prepare, and cleanse data and develop highly trusted data sources so that line-of-business data analysts can spend less time finding data and more time providing guidance to the business at large.

Recommendations and Insights for the Data Community

Alteryx clients that consider using Trifacta should be aware that this will likely result in an increased number of analytically accessible data sources. More always sounds better, but this also means that from a practical perspective, your organization may require a short-term reassessment of the data sources, connections, and metrics that are being used for business analysis based on this new data preparation and engineering capability. In addition, this merger can be used as an opportunity to bring data engineering and data analyst communities closer together as they coordinate responsibilities for data cleansing and data source curation. Trifacta provides some additional scalability in this regard that can be leveraged by organizations that optimize their data preparation capabilities.

This acquisition will also accelerate Alteryx's move to the cloud, as Trifacta provides both an entry point for accessing a variety of cloud data sources and a team of developers, engineers, and product managers with deep knowledge of the major cloud data platforms. Given that Trifacta was purchased for roughly 10% of Alteryx's market capitalization, the value of moving to the cloud more quickly could potentially justify this acquisition all on its own as an acquihire.

Look at DataOps, analytic workflows, and MLOps as part of a continuum of data usage rather than a set of silos. Trifacta brings 12,000 customers with an average of four seats per customer focused on data preparation and engineering. With this acquisition, the Trifacta and Alteryx teams can work together more closely in aligning those four data engineers with the roughly 30 analytic users that Alteryx averages for each of its 7,000+ customers. The net result is an opportunity to bring DataOps, RPA, analytic workflows, and MLOps together into an integrated environment rather than the current set of silos that often prevents companies from understanding how data changes can affect analytic results.

It has been a pleasure seeing Trifacta become one of the few startups to successfully define an emerging market, data preparation, and to coin a term, "data wrangling," that gained acceptance with both users and competitors. Many firms try to do this with little success; Trifacta is the notable exception whose efforts will outlive its time as a standalone company. Trifacta leaves a legacy of establishing the importance of data quality, preparation, and transformation in the enterprise data environment, in a world where raw data is imperfect but necessary to support business guidance. And as Trifacta joins Alteryx, this combined ability to support data from its raw starting point to machine learning models and outputs across a hybrid cloud will continue to be a strong starting point for organizations seeking to provide employees with more control and choice over their analytic inputs and outputs.

If you are currently evaluating Alteryx or Trifacta and need additional guidance, please feel free to contact us at research@amalgaminsights.com to discuss your current selection process and how you are estimating the potential business value of your purchase.


Taking a More Analytic Approach to Wordle

The hottest online game of January 2022 is Wordle, a deceptively addictive online game where one tries to guess a five-letter word starting from scratch. Perhaps you've started seeing a lot of posts sharing those telltale grids of green, yellow, and gray squares.

In the unlikely case you haven’t tried Wordle out yet, let me help enable you with this link: https://www.powerlanguage.co.uk/wordle/

OK, that said, the rules of this game are fairly simple: you have six chances to guess the word of the day. The game was adorably created by software developer Josh Wardle for his partner to enjoy, and its simplicity has made it a welcome online escape in the New Year. The website isn't trying to sell you anything. It isn't designed to "go viral." All it does is ask you to guess a word.

But for those who have played the game, the question quickly comes up on how to play this game better. Are there quantitative tricks that can be used to make our Wordle attempts more efficient? How do we avoid that stressful sixth try where the attempt is “do or die?”

For the purposes of this blog, we will not be going directly into any Wordle sources, because what fun would that be?

Here’s a few tips for Wordle based on some basic analytic data problem solving strategies.

Step 1: Identify the relevant universe of data

One way to model an initial guess is to think about the distribution of letters in the English language. Any fan of the popular game show “Wheel of Fortune” has learned to identify R, S, T, L, N, and E as frequently used letters. But how common are those letters?

One analysis of the Oxford English Dictionary done by Lexico.com shows that the relative frequency of letters in the English language is as follows:

Letter   Frequency      Letter   Frequency
A        8.50%          N        6.65%
B        2.07%          O        7.16%
C        4.54%          P        3.17%
D        3.38%          Q        0.20%
E        11.16%         R        7.58%
F        1.81%          S        5.74%
G        2.47%          T        6.95%
H        3.00%          U        3.63%
I        7.54%          V        1.01%
J        0.20%          W        1.29%
K        1.10%          X        0.29%
L        5.49%          Y        1.78%
M        3.01%          Z        0.27%

This is probably a good enough starting point. Or is it?

Step 2: Augment or improve data, if possible

Stanford GraphBase has a repository of 5,757 five-letter words that we can use as a starting point for analysis. We know this isn't exactly the Wordle word bank, as the New York Times wrote an article describing how Wardle and his partner Palak Shah whittled the word bank down to a pool of roughly 2,500 words. Still, we can use the Stanford list to come up with a more specific distribution of letters. So, how does that differ?

Surprisingly, there’s enough of a difference that we need to decide on which option to use. We know that a lot of plural worlds end in s, for instance, which is reflected in the Stanford data. If I were doing this for work, I would look at all of the s-ending words and determine which of those were plural, then cleanse that data since I assume Wordle does not have duplicate plurals. But since Wordle is not a mission-critical project, I’ll stick with using the Stanford data as it has a number of other useful insights.

Step 3: Identify the probable outcomes

So, what are the chances that a specific letter will show up in each word? Wordle isn't just about the combination of potential letters that can be translated into words. In a theoretical sense, there are 26^5, or 11,881,376, potential five-letter combinations. But in reality, we know that AAAAA and ZZZZZ are not words.

Here’s a quick breakdown of how often each letter shows up in each position in the Stanford five-letter data along with a few highlights of letter positions that stand out as being especially common or especially rare.

The 30.64% of words ending in "s" are overwhelmingly plural nouns or singular verbs, which leads to the big question of whether one believes that "s-ending" words are in Wordle or not. If they are, this chart works well. If not, we can use the Oxford estimate instead, which gives us less granular information: treating each position as an independent draw from the overall letter frequencies, the chance that a letter appears at least once in a five-letter word is

1 – (1 – [letter frequency])^5

But with the Stanford data, we can do one better and look at the probability of each letter in each position, and then estimate the overall odds that a letter appears somewhere in the word by computing

1 – [(1 – P(first)) * (1 – P(second)) * (1 – P(third)) * (1 – P(fourth)) * (1 – P(fifth))]

where each term is the probability of that letter appearing in that position. That calculation gives us the following table and chart.
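
As a rough sketch of that calculation, the per-position probabilities and the "appears anywhere" estimate can be computed in a few lines of Python. Again, words5.txt is a placeholder for a local five-letter word list, and the formula mirrors the independence approximation described above rather than an exact count of words containing each letter.

    from collections import Counter

    with open("words5.txt") as f:  # placeholder file name
        words = [w.strip().lower() for w in f if len(w.strip()) == 5]

    n = len(words)
    # position_counts[i] tallies the letters seen in position i across all words
    position_counts = [Counter(word[i] for word in words) for i in range(5)]

    def probability_anywhere(letter):
        # 1 - product of (1 - P(position)) over the five positions,
        # matching the formula in the text (positions treated as independent)
        miss = 1.0
        for pos in range(5):
            miss *= 1 - position_counts[pos][letter] / n
        return 1 - miss

    for letter in "abcdefghijklmnopqrstuvwxyz":
        print(f"{letter}: {probability_anywhere(letter):.2%}")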

I highlighted the three letters most likely to show up. I didn’t show off the next tier only because I was trying to highlight what stood out most. In general, I try to highlight the top 10% of data that stands out just because I assume that more than that means that nothing really stands out. My big caveat here is that I’m not a visual person and have always loved data tables more than any type of visualization, but I realize that is not common.

Step 4: Adjust analysis based on updated conditions

As we gain a better understanding of our Wordle environment, the game provides clues on which letters are associated with the word in question. Letters that are in the word of the day but are not in the right position are highlighted in yellow. Based on the probabilities we have, we can now adjust our assumptions. For instance, let's look at the letter "a."

If we are looking at a word that has the letter "a" but we know it is not in the first position, we now know we've cut down the percentage of words we're thinking of by about 10%. We can also see that if that "a" isn't in the second position, it's probably in the third position.
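
As an illustration of this narrowing step, here is a small Python sketch that filters the word bank after a guess reveals a yellow "a" in the first position, meaning the answer contains an "a" somewhere other than position one. As before, words5.txt is a placeholder for your local word list.

    from collections import Counter

    with open("words5.txt") as f:  # placeholder file name
        words = [w.strip().lower() for w in f if len(w.strip()) == 5]

    # Keep only words that contain "a" but not in the first position
    candidates = [w for w in words if "a" in w and w[0] != "a"]
    share = len(candidates) / len(words)
    print(f"{len(candidates)} candidates remain ({share:.1%} of the word bank)")

    # Within the remaining candidates, where do the a's tend to sit?
    a_positions = Counter(i + 1 for w in candidates for i, ch in enumerate(w) if ch == "a")
    print(a_positions)  # position -> count, using 1-based positions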

Step 5: Provide results that will lead to making a decision

Based on the numbers, we can now estimate that there's roughly a 50% chance that "a" is in the second position, since 16% of five-letter words have an "a" in the second position out of the 31.57% of words that have an "a" somewhere other than the first position. That is just one small example of the level of detail that can be derived from the numbers. But if I am providing this information with the goal of helping with guidance, I am probably not going to provide these tables as a starting point. Rather, I would start by providing guidance on what action to take. The starting point would likely be something like:

The letters used more than 20% of the time in five-letter words are the vowels a, e, i, and o and the consonants l, n, r, s, and t, much as one would expect from watching Wheel of Fortune. Top words to start with based on these criteria include "arise," "laser," and "rates."

In contrast, if one wishes to make the game more challenging, one should start with words that are unlikely to provide an initial advantage. Words such as “fuzzy” and “jumpy” are relatively poor starting points from a statistical perspective.
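
For those who want to quantify this, one simple way to compare starting words is to score each candidate by the share of the word bank that contains each of its distinct letters. This is only a sketch of one possible scoring approach, not the methodology behind the recommendations above, and words5.txt remains a placeholder file name.

    from collections import Counter

    with open("words5.txt") as f:  # placeholder file name
        words = [w.strip().lower() for w in f if len(w.strip()) == 5]

    n = len(words)
    # Share of words that contain each letter at least once
    contains = Counter()
    for w in words:
        for letter in set(w):
            contains[letter] += 1
    coverage = {letter: count / n for letter, count in contains.items()}

    def score(word):
        # Sum coverage over distinct letters; repeated letters add nothing
        return sum(coverage.get(letter, 0.0) for letter in set(word))

    for guess in ["arise", "laser", "rates", "fuzzy", "jumpy"]:
        print(f"{guess}: {score(guess):.2f}")

With any reasonable five-letter word list, starters like "arise" should land near the top of this kind of scoring while "fuzzy" and "jumpy" land near the bottom, which matches the intuition above.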

Conclusion

This common approach to data definitely showed me a lot about Wordle that I wouldn't have known otherwise. I hope it helps you both in thinking about your own Wordle strategy and in further exploring Wordle and other data. And it all started with some basic steps: identifying the relevant universe of data, augmenting or improving that data where possible, identifying the probable outcomes, adjusting the analysis based on updated conditions, and providing results that lead to a decision.

So, having done all this analysis, how much do analytics help the Wordle experience? One of the things that I find most amazing about the process of playing Wordle is how our brains approximate the calculations made here from a pattern recognition perspective that reflects our use of language. Much as our brain is effectively solving the parallax formula every time we catch a ball thrown in the air, our brains also intuitively make many of these probabilistic estimates based on our vocabulary every time we play a game of Wordle.

I think that analytic approaches like this help to demonstrate the types of “hidden” calculations that often are involved in the “gut reactions” that people make in their decision-making. Gut reactions and analytic reactions have often been portrayed as binary opposites in the business world, but gut reactions can also be the amalgamation of intelligence, knowledge, past experiences, and intuitive feelings all combined to provide a decision that can be superior or more innovative in comparison to pure analytic decisions. Analytics are an important part of all decision-making, but it is important not to discount the human component of judgment in the decision-making process.

And as far as Wordle goes, I think it is fun to try the optimized version of Wordle a few times to see how it contrasts with your standard process. On the flip side, this data also provides guidance on how to make Wordle harder by using words that are less likely to be helpful. But ultimately, Wordle is a way for you to have fun and analytics is best used to help you have more fun and not to just turn Wordle into an engineering exercise. Happy word building and good luck!


Observable raises a $35 million B round for data collaboration

On January 13, 2022, Observable raised a $35.6 million Series B round led by Menlo Ventures with participation from existing investors Sequoia Capital and Acrew Capital. This round increases the total amount raised by Observable to $46.1 million. Observable is interesting to the enterprise analytics community because it provides a platform that helps data users collaborate throughout the data workflow of data discovery, analysis, and visualization.

Traditionally, data discovery, contextualization, analytics, and visualization can each be supported by different solutions within an organization. This complexity is multiplied by the variety of data sources and platforms that have to be supported and the number of people who need to be involved at each stage, which leads to an unwieldy number of handoffs, the risk of using the wrong tool for the job, and an extended development process resulting from the inability of multiple people to work simultaneously on creating a better version of the truth. Observable provides a single solution to help data users connect, analyze, and display data, along with a library of data visualizations that provides guidance on potentially new ways to present data.

From a business perspective, one of the biggest challenges of business intelligence and analytics has traditionally been the inability to engage relevant stakeholders to share and contextualize data for business decisions. The 2020s are going to be a decade of consolidation for analytics where enterprises have to make thousands of data sources available and contextualized. Businesses have to bridge the gaps between business intelligence and artificial intelligence, which are mainly associated with the human aspects of data: departmental and vertical context, categorization, decision intelligence, and merging business logic with analytic workflows.

This is where the opportunity lies for Observable: allowing the smartest people across all aspects of the business to translate, annotate, and augment a breadth of data sources into directional and contextualized decisions, while taking advantage of the head start offered by visualizations and analytic processes shared by a community of over five million users. And by allowing users to share these insights across all relevant applications and websites, Observable brings insights to the users so those insights can drive decisions wherever they are made.

Observable goes to market with a freemium model that allows companies to try out Observable for free and then to add editors at tiers of $12/user/month and $40/user/month (pricing as of January 13, 2022). This level of pricing makes Observable relatively easy to try out.

Amalgam Insights currently recommends Observable for enterprises and organizations with three or more data analysts, data scientists, and developers who are collaboratively working on complex data workflows that lead to production-grade visualization. Although it can be more generally used for building analytic workflows collaboratively, Observable provides one of the most seamless and connected collaborative experiences for creating and managing complex visualizations that Amalgam Insights has seen.


Reviewing 2021 IT Cost Trends

IT Cost Management is one of the core practices at Amalgam Insights. This practice focuses on tracking both vendors and product offerings that help enterprises fight off the IT Rule of 30, Amalgam Insights’ observation that every unmanaged IT category averages 30% in bloat and waste and that this can be even greater for emerging technology areas such as cloud computing.

From our perspective, a more holistic technology expense capability has been in demand at the enterprise level since the mid-2010s, and companies narrowly focused on managing telecom, mobility, software, and cloud computing as four separate IT silos will miss out on a variety of opportunities to optimize and rationalize costs.

In this practice, we tactically look at technology expense management vendors, including specialists in telecom expense, managed mobility services, cloud cost management, cloud FinOps (Financial Operations), Software as a Service management, IT finance solutions, hybrid cloud subscriptions and financing, and other new IT strategies that can lead to a minimum of 20-30% cost reduction in one or more key IT areas. In each of these IT areas, Amalgam Insights maintains a list of recommended vendors that have proven their ability to deliver on both identifying and fixing the issues associated with the IT Rule of 30, which are provided both in our published research as well as in our end-user inquiries with enterprise clients.

With that out of the way, 2021 was a heck of a year from an IT management perspective. Although a lot of pundits predicted that IT spend would go down in a year when COVID-driven uncertainty was rampant, these cost control concerns ended up being less relevant than the need to continue getting work done and the resilience of a global workforce ready and willing to get things done. Along the way, 2021 saw the true birth of the hybrid worker, one who is just as comfortable working in the office or at home as long as they have the right tools in hand. In this work environment, we saw the following things happen.

The Rise of the Remote Employee – Amalgam Insights estimates that 30% of employees will never be full-time in-office employees again, as they have either moved home full-time or plan to only come into the office one or two times per week as necessary to attend meetings and meet with new colleagues and partners. Although many of us may take this for granted, one of the issues we still face is that in 2019, only 5% of employees worked remotely and many of our offices, technology investments, and management strategies reflect the assumption that employees will be centrally located. And, of course, COVID-19 has proven to be both a highly mutating virus and a disease fraught with controversies regarding treatment and prevention strategies and policies, which only adds to the uncertainty and volatility of in-office work environments.

Legacy networking and computing approaches fall flat – On-premises solutions showed their age as VPNs and the on-site management of servers became passé. At a time when a pandemic was running rampant, people found that VPNs did not provide the protection that was assumed, as ransomware attacks more than doubled in the United States and more than tripled in the United Kingdom from 2020 to 2021. It turns out that unpatched servers and insecure on-premises ports ended up being more dangerous than companies had accounted for. We also saw the Death of Copper, as copper-wired telecom services were finally cut off by multiple telecom vendors, leaving branch offices and the "Things" associated with operational technology rudely forced to quickly move to fiber or wireless connections. BlackBerry also finally decided to discontinue support of BlackBerry OS, forcing the last of the original BlackBerry users to finally migrate off of that sweet, sweet keyboard and join the touch screen auto-correct world of smartphone typers. It was a tough year for legacy tech.

Core Mobility Grew Rapidly in 2021 – Core spend was up 8% due to device purchases and increased data use. In particular, device revenue was up nearly 30% over last year with some of the major carriers, such as AT&T, Verizon, and T-Mobile (now the largest carrier in the United States). However, spend for customized and innovative projects disappeared both as 5G buildouts happened more slowly than initially expected and 5G projects froze due to the inability to fulfill complex mesh computing and bandwidth backfill projects. This led to an interesting top-level result of overall enterprise mobility spend being fairly steady although the shape of the spend was quite different from the year before.

Cloud Failures Demonstrated the Need for Hybrid and Multi-Cloud Management – Although legacy computing had its issues, cloud computing had its black eyes as well. A year has 8,760 hours, so a single hour of downtime takes availability from 100% down to roughly 99.99% (four nines). Recent Amazon failures in November and December of 2021 demonstrated the challenges of depending on overstressed resources, especially US-East-1. This is not meant to put all the blame on Amazon, as Microsoft Azure is known for its challenges in maintaining service uptime as well and Google Cloud still has a reputation for deprecating services. No one cloud vendor has been dependable at the "five nines" level of uptime (roughly 5 minutes per year of downtime) that used to define high-end IT quality. Cloud has changed the fundamental nature of IT from "rock-solid technology" to a new mode of experimental "good enough IT" where the quality and value of new technology can excuse some small uptime failures. But cloud failures by giants including Akamai, Amazon, AT&T, Comcast, Fastly, and every other cloud leader show the importance of having failover and continuity capabilities that are at least multi-region in nature for mission-critical technologies.
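
For readers who like to see the availability arithmetic spelled out, here is a minimal sketch of the "nines" math referenced above; the figures are simple arithmetic, not vendor-reported uptime.

    # Downtime allowed per year at each availability level ("nines")
    HOURS_PER_YEAR = 24 * 365  # 8,760 hours

    for nines in range(2, 6):
        availability = 1 - 10 ** (-nines)  # e.g. 4 nines -> 0.9999
        downtime_hours = HOURS_PER_YEAR * (1 - availability)
        print(f"{availability:.3%} ({nines} nines): "
              f"{downtime_hours:.2f} hours (~{downtime_hours * 60:.0f} minutes) of downtime per year")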

Multi-cloud Emergence – One of the interesting trends that Amalgam Insights noticed in our inquiries was that Google Cloud replaced Microsoft Azure as the #2 cloud for new projects behind the market leader Amazon. In general, there was interest in using the right cloud for the job. Also, the cloud failures of leading vendors allowed Oracle Cloud to start establishing a toehold, as its networking and bare-metal support provided a ramp for mature enterprises seeking a path to the cloud. As I've been saying for a decade now, the cloud service provider market is going the way of the telcos, both in terms of the number of vendors and the size of the market. Public cloud is now a $350 billion global market, based on Amalgam Insights' current estimates, which amounts to less than 7% of the total global technology market. As we'll cover in our predictions, there is massive room for growth in this market over the next decade.

SD-WAN continues to be a massive growth market – From a connectivity perspective, Software Defined Wide Area Networks (SD-WAN) continue to grow due to their combination of performance and cost-cutting. This market saw 40% growth in 2021 and now uses security as a differentiator on top of the performance and cost benefits that buyers already know. From an IT cost management perspective, this means that there continues to be a need for holistic project management, including financial and resource management, for these network transformation projects. Without support from technology expense management solutions with strong network inventory capabilities, this won't happen.

As we can see, there were a variety of key IT trends that affected technology expenses and sourcing in 2021. In our next blog on this topic, we'll cover some of our expectations for 2022 based on these trends. If you'd like a sneak peek of our 2022 predictions, just email us at info@amalgaminsights.com.


About Amalgam Insights

Amalgam Insights is working on a new website experience for you in 2022 to help with your Technology Expense Management, Data and Analytics, and Business Planning Management challenges.

In the meantime, if you need to reach us, please contact Hyoun Park (hyoun@amalgaminsights.com) or Lisa Lincoln (lisa@amalgaminsights.com). We thank you for your patience as we get set up!


Domino Data Lab Raises $100 Million F Round to Enable the Model-Driven Enterprise

On October 5, Domino Data Lab announced a $100 million F round led by private equity firm Great Hill Partners and joined by existing investors Coatue, Highland Capital, and Sequoia Capital. Domino Data Lab is a company we have covered since the inception of Amalgam Insights in 2017. From the start, it was obvious that Domino Data was designed to support data science teams that sought to manage data science exploration and machine learning outputs with enterprise governance.

This investment is obviously an eye catcher and is in line with other massive rounds that data science and machine learning solutions have been raising, such as DataRobot’s July 2021 G round of $300 million, Dataiku’s August 2021 $400 million round, or Databricks’ gobsmacking August 2021 round of $1.6 billion. In light of these funding rounds, one might be tempted to ask the seemingly absurd question of whether $100 million is enough!

Fortunately, even in these heady economic times, $100 million is still a significant amount of cash to fund growth and the other funding rounds demonstrate that this is a hot market. In addition, Domino Data’s focus on mature data science practices and teams means that the marketing, sales, and product teams can focus on high-value applications for developers and data analysts rather than having to try to be everything for everyone.

In addition, the new lead investor Great Hill Partners is a firm that Amalgam Insights considers "smart money" in that it specializes in investments of roughly this $100 million size with the goal of pushing data-savvy companies beyond a billion-dollar valuation. A quick look at Great Hill Partners shows that it has assigned both founder Chris Gaffney and long-time tech executive Derek Schoettle to this investment, both of whom have deep expertise in data and analytics.

With this investment, Amalgam Insights expects that Domino Data will continue to solve a key problem that exists in enterprise machine learning and artificial intelligence: orchestrating and improving models and AI workloads over time. As model creation and hosting have become increasingly simple to initiate, enterprises now face the potential issues of technology debt associated with AI. Effectively, enterprises are replacing “Big Data” issues with “Big Model” issues where the breadth and complexity of models become increasingly difficult to govern and support without oversight and AI strategy. This opportunity cannot be solved through automated model creation or traditional analytic and business intelligence solutions as the combinations of models, workflows, and governance associated with data science require a combination of testing, collaboration, and review that is lacking in standard analytic environments. With mature data science teams now becoming an early majority capability at the enterprise level, Domino Data’s market has now caught up to the product.

Domino Data’s funding announcement also mentioned the launch of a co-selling agreement with NVIDIA. Although this agreement isn’t novel and NVIDIA has a variety of agreements with other software companies, this particular agreement allows NVIDIA and Domino Data to provide both the hardware and software to develop optimized machine learning at scale. Amalgam Insights expects that this agreement will allow enterprises to accelerate their development of machine learning models while providing a management foundation for the ongoing governance and support of data science. Enterprise-grade data science ultimately requires not only the technical capability to deploy a model, but the ability to audit and review models for ongoing improvement or disconnections

From an editorial perspective, it is amazing to see how quickly Domino Data Lab has grown over the past three years. When we first briefed Domino Data in 2017, we frankly stated that the solution was ahead of its time as enterprises typically lacked the formal teamwork and organizational structure to support data science. It wasn’t that businesses shouldn’t have been thinking about data science teams, but rather that IT and analytics teams simply were not keeping up with the state of technology. And in response, Domino Data actually launched a data science framework to define collaborative data science efforts.

Recommendation for Amalgam Insights’ Data and Analytics Community

Funding announcements typically are associated with growth expectations: the bigger the round, the higher the sales and marketing expectations. Domino Data is raising this money now both because it is seen as a market leader in supporting data science and that companies have reached a tipping point in requiring solutions for collaborative and compliant data science management.

Amalgam Insights’ key recommendation based on this funding round as well as recent funding from other vendors is to review current data science capabilities within your organization and ensure that the compliance, governance, and collaborative capabilities are on par with your current analytics, business intelligence, and application development capabilities. The toolkits for collaborative data science have evolved massively over the past couple of years and data science is no longer a task for the “lone-wolf genius” but for an enterprise team expected to provide high-value digital assets. Compare current data science operationalization and management solutions to existing in-house capabilities and conduct a realistic analysis of the time, risk, and total cost of ownership savings associated with each approach. With a mature vendor landscape now in place to help support data science, this is the time for early majority data science adopters to take full advantage of their capabilities over market competitors by creating a mature data science environment and quickly building AI where competitors still depend on manual or static black-box processes.