Tangoe Acquires MOBI to Strategically Expand Enterprise Mobility Capabilities

On December 5th, 2018, Tangoe announced the acquisition of MOBI, a leading managed mobility services organization based in Indianapolis, Indiana. With this acquisition, Tangoe increases its IT spend under management to over $40 billion, extending its lead over other spend management vendors with multiple billions of dollars of enterprise technology under management, including Flexera, Snow Software, Microsoft Azure Cost Management, CloudHealth by VMware, Calero, MDSL, Cass Information Systems, and Sakon.

Key questions to consider for this acquisition include:

  • Why did Tangoe decide to buy MOBI at this time? For its customer base? Corporate culture? Technology?
  • How does this acquisition affect enterprises seeking toolsets to assist with the orchestration and accounting of digital transformation initiatives?
  • How will work be split and coordinated between Tangoe’s Austin logistics warehouse and MOBI’s Indianapolis-based facilities?
  • What will Tangoe do with MOBI’s Robotic Process Automation initiative of Mobots?
  • Will Tangoe retain MOBI’s staff, or will a wave of high-quality mobility and support professionals become available to the market?
  • What happens to MOBI partners who may compete with Tangoe?
  • Will MOBI customers be moved to the Tangoe Matrix platform immediately?
  • Will Tangoe contribute to the burgeoning Indianapolis tech scene that is currently one of the hottest startup spots in the country?

To learn more about which of these questions can be answered and which of these questions require greater due diligence, please read my full analysis, which is available at: https://www.amalgaminsights.com/product/amalgam-insights-market-milestone-tangoe-acquires-mobi-to-enhance-mobility-management-capabilities


Market Milestone – Looker Raises $103 Million E Round to Expand the Looker Data Platform

On December 6, 2018, Looker announced that it closed a $103 million E round led by Premji Invest, a private equity firm owned by Wipro chairman Azim Premji. The round includes new funds from Cross Creek Advisors, a venture capital firm focused on late-stage investments whose current portfolio includes tech darlings such as Anaplan, Datastax, Docker, DocuSign, Pluralsight, and Sumo Logic. Investors from prior rounds (such as Looker’s Series D covered by Amalgam Insights) also participated.

This round is expected to be a final round before a potential Initial Public Offering. Of course, in recent times, IPO plans from hot companies have been interrupted at the last moment, as with Workday’s $1.55 billion acquisition of Adaptive Insights (see Amalgam Insights’ coverage for more information) and SAP’s $8 billion acquisition of Qualtrics. A Looker IPO may not be guaranteed, but Amalgam Insights expects Looker’s valuation to hold whether the company ends up going public or being acquired for strategic reasons.

Why is Looker worth $1.6 billion?

With this funding, Looker crosses into “unicorn” territory with a valuation of approximately $1.6 billion. This valuation is based on Looker’s run-rate of over $100 million in annual revenue supported by an employee base that is approaching 600, year-over-year growth exceeding 70%, and expansion into Tokyo to support the Asia-Pacific region.
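As a back-of-the-envelope check on that valuation, the implied revenue multiple can be computed directly. The sketch below assumes the run-rate sits right at the stated $100 million floor (the article says only "over $100 million"), so the actual multiple is somewhat lower:

```python
# Rough sanity check on Looker's implied revenue multiple.
# Assumption: run-rate of exactly $100M; the stated figure is "over $100 million".
run_rate = 100e6        # annual revenue run-rate, in dollars
valuation = 1.6e9       # approximate post-round valuation, in dollars
multiple = valuation / run_rate
print(f"Implied revenue multiple: {multiple:.0f}x")  # → 16x
```

A mid-teens revenue multiple is aggressive but not unheard of for a SaaS company growing over 70% year-over-year.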

Amalgam Insights believes that this valuation reflects several key trends and intelligent strategic decisions made by Looker in supporting enterprise-grade business intelligence and data supply chain needs.

First, Looker was developed to support a variety of data preparation, cataloging, governance, querying, and presentation capabilities at scale across a wide variety of data sources and formats. This approach reflects Looker’s purpose as a data platform built for a new generation of data analysts facing the volume and variety of data that drive current analytic challenges.

Taking a step back, I remember when I first ran into Looker on the trade show circuit. I was looking at a variety of BI solutions in 2013, including the likes of Tableau, Qlik, and MicroStrategy, when I ran into a small booth manned by a guy named Keenan Rice, who currently serves as Looker’s VP of Strategic Alliances. As a jaded analyst, I asked how his solution was different from the 30+ other solutions being shown on that show floor. At the time, everyone was bragging about their pixel-perfect visualizations, report building capabilities, and basic data presentation, which were interesting to see but rarely provided significant value, competitive differentiation, or Return on Investment.

However, Keenan’s pitch differed substantially. Although Looker also provided visualizations for large data tables, Keenan started by talking about data analyst challenges in preparing data for self-service analytics, using Looker’s SQL-based LookML as a modelling layer to access a variety of data, bringing new data into existing data and analytic workflows, and delivering it all as Software-as-a-Service. As a former data analyst, this was all music to my ears, but I wondered if Looker would be able to stand out as a solution against all of the dashboarding solutions, report builders, and the massive marketing budgets of incumbent BI vendors. Looker was just starting as a company and had just announced its Series A, so it faced significant odds in standing out based on the $10 million to $40 million range of Series A or Series B funding that a company like Looker would typically get at this point.

But it has been a pleasure both to watch Looker grow over the years and to see it lead a new generation of solutions focused on simplifying data supply chains and pipelines. It turns out that Looker was just early enough to avoid comparable competition while solving a market need that businesses understood, especially companies seeking a new generation of BI capabilities and wanting to develop a more data-driven organization. So, that’s a long way of saying that Looker took a big and laborious bet against the grain six years ago and has been rewarded for a combination of good product and good timing.

But this isn’t the only reason that Looker has been successful.

As Looker has achieved market fit and success, the solution has evolved from a data workflow solution into an emerging application development solution that allows analysts and developers to work collaboratively in building secure and embeddable data models. Because Looker has truly been built as a platform, rather than simply calling itself a platform as an aspirational goal based on a set of APIs, Amalgam Insights expects that Looker will become increasingly valuable for data analysts and the “citizen” developers who seek to increase appropriate access to enterprise data and analytic outputs.

And Looker has now taken an important step forward in providing department-specific apps in the most recent Looker 6 launch. Starting with digital marketing and web analytics, Looker is now taking advantage of its analytics capabilities to provide out-of-the-box support for key enterprise challenges rather than forcing clients to rebuild foundational analytics that have been built over and over. Both these apps and Looker’s development focus build on Looker’s prior investment in Looker Blocks, first launched in 2015 as a collection of SQL, visualizations, and pre-built analytic models designed to accelerate analytic projects.

So, what is next for Looker? IPO? Global domination? 

All kidding aside, with expansion into Asia-Pacific to accompany Looker’s existing European offices in London and Dublin and Looker’s current 70% growth rate, it is possible that Looker could increase net-new revenues in 2019 faster than BI stalwarts such as Information Builders and Qlik. From a financial perspective, Amalgam Insights notes that Looker’s growth mirrors that of Alteryx, which has roughly quadrupled its stock price since its March 2017 IPO, an offering based on Alteryx’s 2016 revenue of roughly $86 million with 59% year-over-year growth.

Looker has stated that it believes that this Series E round should be its last funding round before IPO and, given recent market valuations for successful software companies, there should be no reason to expect otherwise from Looker. Looker’s progress both as a software development platform as well as a platform of pre-built applications and services bodes well for Looker as it continues to evolve as an enterprise platform focused on expanding access to data insights.

Finally, Amalgam Insights believes that Looker’s success will lead to continued success for other data and analytics companies focusing on rapid data modelling, data mapping, and analysis of multiple data sources. Looker’s agile BI approach, which avoids the challenges of traditional BI and ETL solutions in supporting multiple data sources, has become a new standard for accelerating the value of data. This means both that traditional BI companies will need to accelerate their own data pipeline efforts or partner with other vendors, and that Looker will start to be targeted in the same way that the likes of Tableau, Qlik, and MicroStrategy have been in the past. The price of success is increased competition and innovation, which is good news for the BI, data, and analytics markets and should provide Looker with enough challenges to avoid resting on its laurels.


Red Hat Hybrid Cloud Management Gets Financial with Cloud Cost Management

Key Stakeholders: CIO, CFO, Accounting Directors and Managers, Procurement Directors and Managers, Telecom Expense Personnel, IT Asset Management Personnel, Cloud Service Managers, Enterprise Architects

Why It Matters: As enterprise cloud infrastructure continues to grow 30-40% per year and containerization becomes a top enterprise concern, IT must have tools and a strategy for managing the cost of storage and compute associated with both hybrid cloud and container spend. With Cloud Cost Management, Red Hat provides an option for its considerable customer base.

Key Takeaways: Red Hat OpenShift customers seeking to manage the computing costs associated with hybrid cloud and containers should start trialing Cloud Cost Management when it becomes available in 2019. Effective cost management strategies and tools should be considered table stakes for all enterprise-grade technologies.

Amalgam Insights is a top analyst firm in the analysis of IT subscription cost management, as can be seen in our prior research.

In this context, Red Hat’s intended development of multi-cloud cost management integrated with CloudForms is an exciting announcement for the cloud market. This product, scheduled to come out in early 2019, will allow enterprises supporting multiple cloud vendors to support workload-specific cost management, which Amalgam Insights considers to be a significant advancement in the cloud cost management market.

And this product comes at a time when cloud infrastructure cost management has seen significant investment including VMware’s $500 million purchase of Boston-based CloudHealth Technologies, the 2017 $50 million “Series A” investment in CloudCheckr, investments in this area by leading Telecom and Technology Expense Management vendors such as Tangoe and Calero, and recent acquisitions and launches in this area from the likes of Apptio, BMC, Microsoft, HPE, and Nutanix.

However, the vast majority of these tools currently lack granular management of cloud workloads that can be tracked at a service level and then appropriately cross-charged to a project, department, or location. This capability will be increasingly important as application workloads become increasingly nuanced and revenue-driven accounting of IT becomes increasingly important. Amalgam Insights believes that, despite the significant activity in cloud cost management, this market is just starting to reach a basic level of maturity as enterprises continue to increase their cloud infrastructure spend by 40% per year or more and start using multiple cloud vendors to deal with a variety of storage, computing, machine learning, application, service, integration, and hybrid infrastructure needs.
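To put that growth rate in perspective, a quick illustration (not a forecast): at 40% compound annual growth, cloud spend doubles in roughly two years, which is why workload-level cost tracking is becoming urgent.

```python
import math

# Illustration: doubling time of cloud spend at the 40% annual
# growth rate cited above, using the standard compound-growth formula.
growth_rate = 0.40
years_to_double = math.log(2) / math.log(1 + growth_rate)
print(f"Spend doubles in about {years_to_double:.1f} years")  # prints ~2.1 years
```

At that pace, a cost line item that is tolerable today becomes a board-level concern within a single budget cycle or two.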

Red Hat Screenshot of Hybrid Cloud Cost Management

As can be seen from the screenshot, Red Hat’s intended Hybrid Cloud Cost Management offering reflects modern design and support for both cloud and container spend. Given the enterprise demand for third-party and hybrid cloud cost management solutions, it makes sense to have an OpenShift-focused cost management solution.

Amalgam Insights has consistently promoted the importance of formalized technology cost management initiatives and their ability to reduce IT cost categories by 30% or more. We believe that Red Hat’s foray into Hybrid Cloud Cost Management has an opportunity to compete with a crowded field of competitors in managing multi-cloud and hybrid cloud spend. Despite the competitive landscape already in play, Red Hat’s focus on the OpenShift platform as a starting point for cost management will be valuable for understanding cloud spend at container, workload, and microservices levels that are currently poorly understood by IT executives.

My colleague Tom Petrocelli has noted that “I would expect to see more and more development shift to open source until it is the dominant way to develop large scale infrastructure software.” As this shift takes place, the need to manage the financial and operational accounting of these large-scale projects will become a significant IT challenge. Red Hat is demonstrating its awareness of this challenge and has created a solution that should be considered by enterprises that are embracing both Open Source and the cloud as the foundations for their future IT development.

Recommendations

Companies already using OpenShift should look forward to trialing Cloud Cost Management when it comes out in early 2019. This product provides an opportunity to effectively track the storage and compute costs of OpenShift workloads across all relevant infrastructure. As hybrid and multi-cloud management becomes increasingly common, IT organizations will need a centralized capability to track their increasingly complex usage associated with the OpenShift Container Platform.

Cloud Service Management and Technology Expense Management solutions focused on tracking Infrastructure as a Service spend should consider integration with Red Hat’s Cloud Cost Management solution. Rather than reinvent the wheel, these vendors can take advantage of the work already done by Red Hat to track container spend.

And for Red Hat, Amalgam Insights suggests that Cloud Cost Management become more integrated with CloudForms over time. The most effective expense management practices for complex IT spend categories always include a combination of contracts, inventory, invoices, usage, service orders, service commitments, vendor comparisons, and technology category comparisons. To gain the holistic view that optimizes infrastructure expenses, cloud procurement and expense specialists will increasingly demand this complete view across the entire lifecycle of services.

Although this Cloud Cost Management capability has room to grow, Amalgam Insights expects this tool to quickly become a mainstay, either as a standalone tool or as integrated inputs within an enterprise’s technology expense or cloud service management solution. As with all things Red Hat, Amalgam Insights expects rapid initial adoption within the Red Hat community in 2019-2020, which will drive down enterprise infrastructure total cost of ownership and increase visibility for the enterprise architects, financial controllers, and accounting managers responsible for IT cost management.


Observations on the Future of Red Hat from Red Hat Analyst Day

On November 8th, 2018, Amalgam Insights analysts Tom Petrocelli and Hyoun Park attended the Red Hat Analyst Day in Boston, MA. We had the opportunity to visit Red Hat’s Boston office in the rapidly-growing Innovation District, which has become a key tech center for enterprise technology companies. In attending this event, my goal was to learn more about the Red Hat culture that is being acquired as well as to see how Red Hat was taking on the challenges of multi-cloud management.

Across Red Hat’s presentations throughout the day, there was a constant theme of effective cross-selling, growing deal sizes including a record 73 deals of over $1 million in the last quarter, over 600 accounts with over $1 million in business in the last year, and increased wallet share year-over-year for top clients, with 24 of the 25 largest clients increasing spend by an average of 15%. The current health of Red Hat is undeniable, regardless of the foibles of the public market. And the consistency of Red Hat’s focus on Open Source was undeniable across infrastructure, integration, application development, IT automation, IT optimization, and partner solutions, which demonstrated how synchronized and focused the entire roster of Red Hat executive presenters was, including Continue reading Observations on the Future of Red Hat from Red Hat Analyst Day


Torchbearer Case Study: OceanX Delivers the Consumer Subscription Experience with Oracle Cloud Infrastructure

(Note: Torchbearer Case Studies provide enterprises with a strong example of the “Art of the Possible” in using technology to enhance their business environment. Amalgam Insights provides emerging best practices based on the experience of the Torchbearer to inspire and educate Early Adopter and Early Majority buyers seeking to develop a strategic advantage.)

At Oracle Open World, I was interested in learning more about the Oracle Cloud and its role in data and analytics. Although Oracle has admittedly been relatively late to the enterprise cloud, the Oracle Cloud was front and center throughout Oracle Open World, and it was interesting to see how Oracle was able to highlight enterprise technology deployments across its portfolio. One example of enterprise success in working with the Oracle Cloud that stood out was from OceanX.

At Oracle Open World, OceanX won the 2018 Oracle Innovation Award for Data Management. OceanX is a spinoff of Guthy-Renker started in 2016 to provide an integrated subscription commerce platform which brings together e-commerce, fulfilment, customer service, and business analytics associated with direct-to-consumer subscription programs. This allows consumer brands to provide personalized and bespoke products in their subscription offerings and to build consumer relationships based on a holistic view of the customer’s preferences, purchases, and interactions.
Continue reading Torchbearer Case Study: OceanX Delivers the Consumer Subscription Experience with Oracle Cloud Infrastructure


Is IBM’s Acquisition of Red Hat the Biggest Acquihire of All Time?

Estimated Reading Time: 11 minutes

Internally, Amalgam Insights has been intensely discussing why IBM chose to acquire Red Hat for $34 billion. Our key questions included:

  • Why would IBM purchase Red Hat when they’re already partners?
  • Why purchase Red Hat when the code is Open Source?
  • Why did IBM offer a whopping $34 billion, $20 billion more than IBM currently has on hand?

As a starting point, we posit that IBM’s biggest challenge is not an inability to understand its business challenges, but a fundamental consulting mindset that runs from the top down. By this, we mean that IBM is great at identifying and finding solutions on a project-specific basis. For instance, SoftLayer, Weather Company, Bluewolf, and Promontory Financial are all relatively recent acquisitions that made sense and were mostly applauded at the time. But even as IBM makes smart investments, IBM has either forgotten or not learned the modern rules for how to launch, develop, and maintain software businesses. At a time when software is eating everything, this is a fundamental problem that IBM needs to solve.

The real question for IBM is whether IBM can manage itself as a modern software company.

Continue reading Is IBM’s Acquisition of Red Hat the Biggest Acquihire of All Time?


From Calero World Online: From TEM to ITEM: Leveraging TEM for Non-Traditional Expenses

On October 18th, I presented a webinar at Calero World Online on the future of IT cost and subscription management. In this presentation, I challenge existing telecom and IT expense management managers to accept their destiny as pilots and architects of enterprise digital subscriptions.

Telecom expense has traditionally been the most challenging of IT costs to manage. With the emergence of software-as-a-service, cloud computing, the Internet of Things, and software-defined networks, the rest of the IT world is quickly catching up.

In this webinar, you will learn:

  • How the latest trends and technology are driving change to enterprise management strategies
  • How the challenges of traditional TEM and cloud expense management are similar in nature (and why TEM is a good place to start)
  • How organizations are benefiting from ITEM best practices using sample use cases

To learn more about the upcoming challenges of IT expense management, aligning technology supply to digital demand, and being the shepherd for your organization’s technology sourcing, utilization, and optimization, click here to watch this webinar on-demand.


ICYMI: On Demand Webinar – Four Techniques to Run AI on Your Business Data

On October 17th, I presented a webinar with Incorta’s Chief Evangelist, Matthew Halliday, on the importance of BI architectures in preparing for AI. This webinar is based on a core Amalgam Insights belief that all enterprise analytics and data science activity should be based on a shared core of trusted and consistent data so that Business Intelligence, analytics, machine learning, data science, and deep learning efforts are all based on similar assumptions and can build off each other.

While AI is beginning to impact every aspect of our consumer lives, business data-driven AI seems to be lower on the priority list of most enterprises. The struggle to understand the practical value of AI starts with the inability to make business data easily accessible to data science teams. Today’s BI tools have not kept up with this need and often are the bottlenecks that stifle innovation.

In this webinar, you will learn from Hyoun Park and Matthew Halliday about:
  • key data and analytic trends leading to the need to accelerate analytic access to data.
  • guidance for challenges in implementing AI initiatives alongside BI.
  • practical and future-facing business use cases that can be supported by accelerating analytic access to large volumes of operational data.
  • techniques that accelerate AI initiatives on your business data.

Watch this webinar on-demand by clicking here.


Why It Matters that IBM Announced Trust and Transparency Capabilities for AI


Note: This blog is a followup to Amalgam Insights’ visit to the “Change the Game” event held by IBM in New York City.

On September 19th, IBM announced its launch of a portfolio of AI trust and transparency capabilities. This announcement got Amalgam Insights’ attention because of IBM’s relevance and focus in the enterprise AI market throughout this decade. To understand why IBM’s specific launch matters, take a step back to consider IBM’s considerable role in building out the current state of the enterprise AI market.

IBM AI in Context

Since IBM’s public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence and spent billions of dollars in establishing both IBM Watson and AI. This has been a challenging path to travel as IBM has had to balance this market-leading innovation with the financial demands of supporting a company that brought in $107 billion in revenue in 2011 and has since seen this number shrink by almost 30%.

In addition, IBM had to balance its role as an enterprise technology company focused on the world’s largest workloads and IT challenges with launching an emerging product better suited for highly innovative startups and experimental enterprises. And IBM also faced the “cloudification” of enterprise IT in general, where the traditional top-down purchase of multi-million dollar IT portfolios is being replaced by piecemeal and business-driven purchases and consumption of best-in-breed technologies.

Seven years later, the jury is still out on how AI will ultimately end up transforming enterprises. What we do know is that a variety of branches of AI are emerging, including Continue reading Why It Matters that IBM Announced Trust and Transparency Capabilities for AI


IBM Presents “Change the Game: Winning with AI” in New York City


(Note: This blog is part of a multi-part series on this event and the related analyst event focused on IBM’s current status from an AI perspective.)

On September 13th, 2018, IBM held an event titled “Change the Game: Winning with AI.” The event was hosted by ESPN’s Hannah Storm and held in Hell’s Kitchen’s Terminal 5, better known as a music venue where acts ranging from Lykke Li to Kali Uchis to Good Charlotte perform. In this rock star atmosphere, IBM showcased its current perspective on AI (artificial intelligence, not Amalgam Insights!).

IBM’s Rob Thomas and ESPN’s Hannah Storm discuss IBM Cloud Private for Data

This event comes at an interesting time for IBM. Since IBM’s public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence and spent billions of dollars in establishing both IBM Watson and AI. However, part of IBM’s challenge over the past several years was that the enterprise understanding of AI was so nascent that there was no good starting point to develop machine learning, data science, and AI capabilities. In response, IBM built out many forms of AI including

  • IBM Watson as a standalone, Jeopardy!-like solution for healthcare and financial services,
  • Watson Developer Cloud to provide language, vision, and speech APIs,
  • Watson Analytics to support predictive and reporting analytics,
  • chatbots and assistants to support talent management and other practical use cases,
  • Watson Studio and Data Science Experience to support enterprise data science efforts to embed statistical and algorithmic logic into applications, and
  • Evolutionary neural network design at the research level.

And, frankly, the velocity of innovation was difficult for enterprise buyers to keep up with, especially as diverse products were all labelled Watson and as buyers were still learning about technologies such as chatbots, data science platforms, and IBM’s hybrid cloud computing options at a fundamental level. The level of external and market-facing education needed to support relatively low-revenue and experimental investments in AI was a tough path for IBM to support (and, in retrospect, may have been better supported as a funded internal spin-off with access to IBM patents and technology). Consider that extremely successful startups can justify billion dollar valuations based on $100 million in annual revenue while IBM is being judged on multi-billion dollar revenue changes on a quarter-by-quarter basis. That juxtaposition makes it hard for public enterprises to support audacious and aggressive startup goals that may take ten years of investment and loss to build.

IBM’s perspective at this “Change the game” event was an important checkpoint in learning more about IBM’s current positioning on AI. Amalgam Insights was interested in learning more about IBM’s current positioning and upcoming announcements to support enterprise AI.

With strategies telestrated by IBM’s Janine Sneed and Daniel Hernandez, this event was an entertaining breakdown of IBM case studies demonstrating the IBM analytics and data portfolio, interspersed with additional product and corporate presentations by IBM’s Rob Thomas, Dinesh Nirmal, Reena Ganga, and Madhu Kochar. The event was professionally presented as Hannah Storm interviewed a wide variety of IBM customers including:

  • Mark Vanni, COO of Trūata
  • Joni Rolenaitis, Vice President of Data Development and Chief Data Officer for Experian
  • Dr. Donna M. Wolk, System Director of Clinical and Molecular Microbiology for Geisinger
  • Guy Taylor, Executive Head of Data-Driven Intelligence at Nedbank
  • Sreesha Rao, Senior Manager of IT Applications at Niagara Bottling LLC
  • James Wade, Director of Application Hosting for GuideWell Mutual Holding Company
  • Mark Lack, Digital Strategist and Data Scientist for Mueller, Inc
  • Rupinder Dhillon, Director of Machine Learning and AI at Bell Canada

In addition, IBM demonstrated aspects of IBM Cloud Private for Data, their cloud platform built for using data for AI, as well as Watson Studio, IBM’s data science platform, and design aspects for improving enterprise access to data science and analytic environments.

Overall, Amalgam Insights saw this event as a public-friendly opportunity to introduce IBM’s current capabilities in making enterprises ready to support AI by providing the data, analytics, and data science products needed to prepare enterprise data ecosystems for machine learning, data science, and AI projects. In upcoming blogs, Amalgam Insights will cover IBM’s current AI positioning in greater detail and the IBM announcements that will affect current and potential AI customers.