3 Big 2019 Trends and 4 Strategic Tips for Managing the Transforming Cost of Technology

Amalgam Insights estimates that the total technology spend formally and centrally managed by enterprises with over $1 billion in revenue across telecom, network, mobility, Software-as-a-Service (SaaS), and Infrastructure-as-a-Service (IaaS) will double between June 2019 and the end of 2021, driven by the massive growth of cloud computing and the need to manage a variety of “shadow IT” costs that grow to a size at which formal management is required. By “formally and centrally managed,” Amalgam Insights assumes full visibility of inventory, contracts, billing, and service orders across all vendors with active usage and supplier optimization efforts.

In 2019, Amalgam Insights notes several key trends in the world of technology expense management that IT organizations should be aware of.

First, every IT expense management solution is increasingly focused on cloud-based expenses, whether Software as a Service or Infrastructure as a Service. Established Software Asset Management (SAM) companies, including Aspera, Flexera, ServiceNow, and Snow Software, are building out their SaaS expense capabilities, while a variety of standalone vendors, including Alpin, Binadox, Cleanshelf, Intello, Torii, and Zylo, are emerging. Amalgam Insights is planning a SmartList for November 2019 focusing on key vendors that manage SaaS across the SaaS expense, SAM, and Technology Expense markets.

On the IaaS side, every major global Technology Expense Management solution has launched IaaS management capabilities, also known as FinOps or Cloud FinOps, including Asignet, Calero, Cass Information Systems, Cimpl, Dimension Data, MDSL, Sakon, and Tangoe. In addition, this market has seen rapid acquisition over the past couple of years, including Microsoft’s acquisition of Cloudyn, Apptio’s acquisition of Cloudability (which calls this practice “FinOps”), and VMware’s acquisition of CloudHealth Technologies. Standalone players such as CloudCheckr remain in this space as well. This crowded market, seeking to manage the next $100 billion of public cloud spend, represents an interesting set of choices for IT departments deciding how to aggregate and manage IT spend.

(As an aside, Amalgam Insights finds the use of the term “FinOps” by Apptio and Cloudability to be an interesting way to coordinate multiple departments and provide guidance on how to manage cloud expenses. At the same time, FinOps seems to be reinventing the wheel to some extent by rebuilding a set of practices and cross-departmental teams that have already been managing telecom expenses for a number of years. Amalgam Insights is quite interested in seeing how this duplication of effort within IT departments will sort itself out, whether through the establishment of separate Cloud FinOps departments, the integration of Cloud FinOps and “Telecom FinOps” (a.k.a. Telecom Expense Management), or the integration of both cloud and telecom into a larger IT expense or finance role. This is an interesting transitional period for IT expense as more spend moves to subscription, usage, feature, user, department, and project-based spend and chargeback models.)

A third key trend is the move to Europe. European IT has traditionally been informally managed, or managed in geographic silos that prevent strong global management and alignment with strategic enterprise efforts. To solve this problem, there have been a variety of acquisitions and office launches across Europe by the likes of Calero, MDSL, Tangoe, Flexera, Snow Software, CloudCheckr, ServiceNow, VMware, and others. Amalgam Insights notes that European IT management challenges have been poorly supported in the past by global vendors that have not adequately accounted for the differences in managing data, connectivity, and compliance in each European country, as well as their relative lack of geographic footprint to support services. In light of this, Amalgam Insights has been tracking European IT management successes and provides guidance on this topic for end-user, investor, and vendor advisory clients.

To prepare for this evolution in technology expense management, Amalgam Insights provides the following guidance for Chief Information Officers, Chief Procurement Officers, Chief Accounting Officers, and related IT procurement, finance, and expense managers to prepare for the second half of 2019 and beyond based on prior guidance and research.

$100K and 30% are key benchmarks for specialized IT spend categories. Once an IT spend management category exceeds $100,000 per month, organizations have the potential to save at least one employee’s worth of payroll through optimization by pursuing the 30% savings that typically exist in an environment that has not been formally managed or audited over the last five years. At this point, your company should start looking for a dedicated solution for expense management, whether it be manual support from an in-house employee with telecom experience, a software platform to support management, or managed services to support contract, invoice, inventory, and usage management. These rules of thumb are especially true in SaaS and IaaS management, which are currently rife with poor governance, duplicate spending, and unmonitored usage patterns associated with decentralized cloud computing purchases.
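The arithmetic behind this rule of thumb can be sketched quickly. The $100,000-per-month threshold and 30% typical savings come from the guidance above; the $120,000 fully loaded employee cost used here is an illustrative assumption, not a figure from the research.

```python
# Rule-of-thumb check: does a spend category justify dedicated expense management?
# The $100K/month threshold and 30% savings rate are from the guidance above;
# the $120K fully loaded employee cost is an illustrative assumption.

def savings_case(monthly_spend, savings_rate=0.30, employee_cost=120_000):
    """Return estimated annual savings and whether it covers one employee's payroll."""
    annual_savings = monthly_spend * 12 * savings_rate
    return annual_savings, annual_savings >= employee_cost

annual, justified = savings_case(100_000)
# At $100K/month, roughly $360K/year in potential savings -- comfortably
# more than one employee's fully loaded cost, so dedicated management pays for itself.
print(f"annual savings: ${annual:,.0f}, justified: {justified}")
```

At lower spend levels the same check shows why informal management may suffice: at $10,000 per month, the typical savings pool is roughly $36,000 per year, well short of a dedicated headcount.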

Enterprise buyers at Global 2000 companies should review their current strategy for IT spend categories with the goal of supporting all cloud, software, hardware, network, and mobility spend from a usage- and subscription-based perspective. IT is moving to a subscription and usage-based paradigm that started with enterprise telecom and then evolved through the enterprise adoption of cloud computing. For firms currently considering new or replacement IT expense vendors, due diligence in understanding prospective vendors’ roadmaps and experience with new technology categories is vital for futureproofing this investment. This is especially true in an “As-a-Service” world where all aspects of IT are increasingly sourced and billed in a telecom-like fashion, which shifts these vendor relationships from traditional asset-based approaches to subscription- and relationship-based approaches. Look for depth in the following functional areas: inventory, invoice line items, service orders, disputes, optimization, governance, and security.

Customer satisfaction, geographic expertise, vertical expertise, and depth of managed and professional services are key differentiators. Amalgam recommends that companies looking at TEM solutions focus not only on the technical aspects of invoice and inventory management, but also on the alignment between the vendor’s expertise and the potential buyer’s geographic footprint and business model. This alignment is often more important than the extremely granular Requests for Proposals that Amalgam has seen in this industry. In 2019, there have been a number of specific trends in this regard, such as the telecom expense management market’s push toward aggregating European spend among market leaders, a focus on providing additional managed services such as security and managed mobility, and vertical-specific cost management approaches that also include specific asset management and IT management strategies. Customer retention and satisfaction are also key metrics. For mature solutions, it is not uncommon to see annual customer retention above 95% and a year-over-year increase in wallet share as new cloud and app spend is brought into a solution.

Rather than asking hundreds of questions where there is little to no differentiation, such as how invoices are processed or whether a specific type of inventory can be stored within the solution, focus on how the vendor supports the usage and management of value-added technology. The goal of IT is not to restrict the use of helpful technologies, but to increase the use of productivity-driving and outcome-improving technologies and then to provide that optimal level of utilization in a cost-effective manner. Spend management vendors that can help identify technology associated with revenue growth or customer satisfaction provide a competitive edge in understanding the cost basis of strategic IT. By taking these steps, companies can start to better understand how to manage IT spend more practically.

(Note: This piece is an excerpt from Amalgam Insights’ upcoming SmartList for Technology Expense Management Market Leaders scheduled to publish in August 2019. If you would like more information about this topic, if you are considering a net-new or replacement purchase for IT expense management, or are interested in the upcoming report, please feel free to contact us at info@Amalgaminsights.com)

Strategic Presentation for the Amalgam Insights Community – 5G Context for the Strategic Enterprise

I’ve recently had the opportunity to present on the present and future of 5G as a business enabler. Based on the past 20 years I’ve spent on the carrier, reseller, IT management, and industry analyst sides of the business, I’m looking forward to sharing part of my presentation with the Amalgam Insights audience and showing why 5G introduces a new Age of Mobility to follow the Age of Voice, the Age of Text, the Age of Apps, and the Age of Streaming!

If you’re interested in further discussing any aspect of this deck or the repercussions of 5G for your business, please feel free to get in contact with us at info@amalgaminsights.com! Please click on the link below to download the slide deck and learn more about what 5G practically means for the world of business over the next year or two.

Amalgam Insights – 5G Context for the Strategic Enterprise

In Case You Missed It: Why Augmented Reality is an Effective Tool in Manufacturing: A Brain Science Analysis

As you may know, PTC LiveWorx 2019 featured my live presentation “Why Augmented Reality is an Effective Tool in Manufacturing: A Brain Science Analysis”.

For those of you who couldn’t make it to LiveWorx or to the presentation, we are providing the slides so you can catch up! Simply click on the title slide or link below to download the slides.

Why Augmented Reality is Effective in Manufacturing

If you’d like to discuss this presentation in greater detail or have me speak to your organization on this topic, please follow up at info@amalgaminsights.com.

Context:

This presentation focuses on a brain science evaluation of augmented reality tools in manufacturing and their roles in:

    • reducing the cognitive load on the learner
    • providing the opportunity for limitless practice, and
    • accelerating learning and retention by broadly recruiting multiple learning systems in the brain in synchrony.

Specifically, AR tools simultaneously recruit cognitive, behavioral, and experiential learning systems in the brain, thus creating multiple distinct but highly interconnected memory traces.

In comparison, traditional tools focus almost exclusively on the cognitive skills learning system. In this presentation, I examine a number of use cases for AR including supply chain, environmental health and safety (EHS), product development, equipment operation, and field service. Finally, I summarize the brain science underpinnings for the effectiveness of AR tools in each use case.

Please take a look and follow up with any questions you may have!

Why Augmented Reality is Effective in Manufacturing

The Death of Big Data and the Emergence of the Multi-Cloud Era

RIP Era of Big Data
April 1, 2006 – June 5, 2019

The Era of Big Data passed away on June 5, 2019 with the announcement of Tom Reilly’s upcoming resignation from Cloudera and the subsequent drop in its market capitalization. Coupled with MapR’s recent announcement that it intends to shut down in late June unless it can find a buyer to continue operations, June of 2019 accentuated that the initial Era of Hadoop-driven Big Data has come to an end. Big Data will be remembered for its role in enabling the beginning of social media dominance, in fundamentally changing the mindset of enterprises working with multiple orders of magnitude more data, and in clarifying the value of analytic data, data quality, and data governance for the ongoing valuation of data as an enterprise asset.

As I give a eulogy of sorts to the Era of Big Data, I do want to emphasize that Big Data technologies are not actually “dead,” but that the initial generation of Hadoop-based Big Data has reached a point of maturity where its role in enterprise data is established. Big Data is no longer part of the breathless hype cycle of infinite growth, but is now an established technology.
Continue reading “The Death of Big Data and the Emergence of the Multi-Cloud Era”

Inside our Slack Channel: A Conversation on Salesforce acquiring Tableau

As you may know, analysts typically only have the time to share a small fraction of the information that they have on any topic at any given time, with the majority of our time spent speaking with clients, technologists, and each other.

When Salesforce announced its acquisition of Tableau Monday morning, we at Amalgam Insights obviously started talking to each other about what this meant. Below is an edited excerpt of some of the topics we were going through as I was preparing for PTC LiveWorx in Boston, Data Science analyst Lynne Baer was in Nashville for Alteryx, and DevOps Research Fellow Tom Petrocelli was holding down the fort in Buffalo after several weeks of travel. We hope you enjoy a quick look behind the scenes at how we started informally thinking about this in the first hour or so after the announcement.

When the Salesforce-Tableau topic came up, Tom Petrocelli kicked it off.
Continue reading “Inside our Slack Channel: A Conversation on Salesforce acquiring Tableau”

Market Milestone – Google to Buy Looker to Transform Business Analytics

Key Stakeholders:

Chief Information Officers, Chief Technical Officers, Chief Digital Officers, Chief Analytics Officers, Data Monetization Directors and Managers, Analytics Directors and Managers, Data Management Directors and Managers, Enterprise Architects

Why It Matters:

Google’s proposed $2.6 billion acquisition of Looker provides Google with a core data engagement, service, and application environment to support Google Cloud Platform. This represents an impressive exit for Looker, which was expected to IPO after its December 2018 Series E round. This report covers key considerations for Looker customers, GCP customers, and enterprises seeking to manage data and analytics in a multi-cloud or hybrid cloud environment.

Top Takeaway:

Google Cloud Platform intends to acquire a Best-in-Breed platform for cloud analytics, embedded BI, and native analytic applications in Looker. By filling this need for Google customers, GCP has strengthened its positioning for enterprise cloud customers at a time when Amalgam Insights expects rapid and substantial growth of 25%+ CAGR (Compound Annual Growth Rate) across cloud markets for the next few years. This acquisition will help Google to remain a significant and substantial player as an enterprise cloud provider and demonstrates the latitude that Google Cloud CEO Thomas Kurian has in acquiring key components to position GCP for future growth.

To read the rest of this piece, please visit Looker, which has acquired a commercial license for this research.

Please register or log into your Amalgam Insights Community account to read more.

Data Science and Machine Learning News Roundup, May 2019

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Amazon, Anaconda, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, Elastic, Google, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta, TROVE.

Domino Data Lab Champions Expert Data Scientists While Outpacing Walled-Garden Data Science Platforms

Domino announced key updates to its data science platform at Rev 2, its annual data science leader summit. For data science managers, the new Control Center provides information on what an organization’s data science team members are doing, helping managers address any blocking issues and prioritize projects appropriately. The Experiment Manager’s new Activity Feed supplies data scientists with better organizational and tracking capabilities on their experiments. The Compute Grid and Compute Engine, built on Kubernetes, will make it easier for IT teams to install and administer Domino, even in complex hybrid cloud environments. Finally, the beta Domino Community Forum will allow Domino users to share best practices with each other, as well as submit feature requests and feedback to Domino directly. With governance becoming a top priority across data science practices, Domino’s platform improvements around monitoring and making experiments repeatable will make this important ability easier for its users.

Informatica Unveils AI-Powered Product Innovations and Strengthens Industry Partnerships at Informatica World 2019

At Informatica World, Informatica publicized a number of key partnerships, both new and enhanced. Most of these partnerships involve additional support for cloud services. This includes storage, both data warehouses (Amazon Redshift) and data lakes (Azure, Databricks). Informatica also announced a new Tableau Dashboard Extension that enables Informatica Enterprise Data Catalog from within the Tableau platform. Finally, Informatica and Google Cloud are broadening their existing partnership by making Intelligent Cloud Services available on Google Cloud Platform, and providing increased support for Google BigQuery and Google Cloud Dataproc within Informatica. Amalgam Insights attended Informatica World and provides a deeper assessment of Informatica’s partnerships, as well as CLAIRE-ity on Informatica’s AI initiatives.

Microsoft delivers new advancements in Azure from cloud to edge ahead of Microsoft Build conference

Microsoft announced a number of new Azure Machine Learning and Azure AI capabilities. Azure Machine Learning has been integrated with Azure DevOps to provide “MLOps” capabilities that enable reproducibility, auditability, and automation of the full machine learning lifecycle. This marks a notable increase in making the machine learning model process more governable and compliant with regulatory needs. Azure Machine Learning also has a new visual drag-and-drop interface to facilitate codeless machine learning model creation, making the process of building machine learning models more user-friendly. On the Azure AI side, Azure Cognitive Services launched Personalizer, which provides users with specific recommendations to inform their decision-making process. Personalizer is part of the new “Decisions” category within Azure Cognitive Services; other Decisions services include Content Moderator, an API to assist in moderation and reviewing of text, images, and videos; and Anomaly Detector, an API that ingests time-series data and chooses an appropriate anomaly detection model for that data. Finally, Microsoft added a “cognitive search” capability to Azure Search, which allows customers to apply Cognitive Services algorithms to search results of their structured and unstructured content.

Microsoft and General Assembly launch partnership to close the global AI skills gap

Microsoft also announced a partnership with General Assembly to address the dearth of qualified data workers, with the goal of training 15,000 workers by 2022 for various artificial intelligence and machine learning roles. The two companies will found an AI Standards Board to create standards and credentials for artificial intelligence skills. In addition, Microsoft and General Assembly will develop scalable training solutions for Microsoft customers, and establish an AI Talent network to connect qualified candidates to AI jobs. This continues the trend of major enterprises building internal training programs to bridge the data skills gap.

Kubernetes Grows Up – The View from KubeCon EU 2019

Our little Kubernetes is growing up.

By “growing up” I mean it is almost in a state where a mainstream company can consider it fit for production. While there are several factors that act as a drag on mainstream reception, a lack of completeness has been a major force against Kubernetes’ broader acceptance. Completeness, in this context, means that all the parts of an enterprise platform are available off the shelf and won’t require a major engineering effort on the part of conventional IT departments.

The good news from KubeCon+CloudNativeCon EU 2019 in Barcelona, Spain (May 20 – 23 2019) is that the Kubernetes and related communities are zeroing in on that ever so important target. There are a number of markers pointing toward mainstream acceptance. Projects are filling out the infrastructure – gaining completeness – and the community is growing.

Project Updates

While Kubernetes may be at the core, there are many supporting projects that are striving to add capabilities to the ecosystem that will result in a more complete platform for microservices. Some of the projects featured in the project updates show the drive for completeness. For example, OpenEBS and Rook are two projects striving to make container storage more enterprise friendly. Updates to both projects were announced at the conference. Storage, like networking, is an area that must be tackled before mainstream IT can seriously consider container microservices platforms based on Kubernetes.

Managing microservices performance and failure is a big part of the ability to deploy containers at scale. For this reason, the announcement that two projects that provide application tracing capabilities, OpenTracing and OpenCensus, were merging into OpenTelemetry is especially important. Ultimately, developers need a unified approach to gathering data for managing container-based applications at scale. Removing duplication of effort and competing agendas will speed up the realization of that vision.

Also announced at KubeCon+CloudNativeCon EU 2019 were updates to Helm and Harbor, two projects that tackle thorny issues of packaging and distributing containers to Kubernetes. These are necessary parts of the process of deploying Kubernetes applications. Securely managing container lifecycles through packaging and repositories is a key component of DevOps support for new container architectures. Forward momentum in these projects is forward movement toward the mainstream.

There were other project updates as well, including updates to Kubernetes itself and CRI-O. Clearly, the community is filling in the blank spots in container architectures, making Kubernetes a more viable application platform for everyone.

The Community is Growing

Another gauge pointing toward mainstream acceptance is the growth of the community. The bigger the community, the more hands there are to do the work and the better the chances of achieving feature critical mass. This year in Barcelona, KubeCon+CloudNativeCon EU saw 7,700 attendees, nearly twice the attendance of last year’s event in Copenhagen. The core Kubernetes project has 164K commits and 1.2M comments on GitHub. This speaks to broad involvement in making Kubernetes better. Completeness requires a lot of work, and that is more achievable when more people are involved.

Unfortunately, as Cheryl Hung, Director of Ecosystems at CNCF, notes, only 3% of contributors are women. The alarming lack of diversity in the IT industry shows up even in Kubernetes, despite the high-profile women involved in the conference, such as Janet Kuo of Google. Diversity brings more and different ideas to a project, and it would be great to see the participation of women grow.

Service Mesh Was the Talk of the Town

The number of conversations I had about service mesh was astounding. It’s true that I had released a pair of papers on the subject, one just before KubeCon+CloudNativeCon EU 2019. That may explain why people wanted to talk to me about it, but not the general buzz. There was service mesh talk in the halls, at lunch, in sessions, and from the mainstage. It’s pretty much what everyone wanted to know about. That’s not surprising, since a service mesh is going to be a vital part of large scale-out microservices applications. What was surprising was that even attendees who were new to Kubernetes were keen to know more. This was a very good omen.

It certainly helped that there was a big service-mesh-related announcement from the mainstage on Tuesday. Microsoft, in conjunction with a host of companies, announced the Service Mesh Interface (SMI). It’s a common API for different vendor and project service mesh components. Think of it as a lingua franca of service mesh. There were shout-outs to Linkerd and Solo.io; the latter especially had much to do with creating SMI. The fast maturation of the service mesh segment of the Kubernetes market is another stepping stone toward the completeness necessary for mainstream adoption.
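To give a sense of what this common API looks like in practice, here is a minimal TrafficSplit manifest sketched against the SMI v1alpha1 specification. The service names and weights are illustrative, not from any vendor's documentation; the point is that the same manifest should drive traffic shifting on any conforming mesh.

```yaml
# Illustrative SMI TrafficSplit (v1alpha1): send roughly 90% of traffic for
# the "website" service to v1 and 10% to a v2 canary, regardless of which
# service mesh implementation is underneath. Names and weights are examples.
apiVersion: split.smi-spec.io/v1alpha1
kind: TrafficSplit
metadata:
  name: website-canary
spec:
  service: website        # root service that clients address
  backends:
  - service: website-v1
    weight: 900m          # ~90% of traffic
  - service: website-v2
    weight: 100m          # ~10% canary
```

The design choice here is the interesting part: because the split is expressed as a vendor-neutral Kubernetes resource, teams can swap or mix mesh implementations without rewriting their traffic policies.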

Already Way Too Many Distros

There were a lot of Kubernetes distributions at KubeCon+CloudNativeCon EU 2019. A lot. Really. A lot. While this is a testament to the growth of Kubernetes as a platform, it’s confusing to IT professionals making choices. Some are managed cloud services; others are distributions for on-premises use or for when you want to install your own on a cloud instance. Here are some of the Kubernetes distros I saw on the expo floor. I’m sure I missed a few:

Microsoft Azure, Google, DigitalOcean, Alibaba, Canonical (Ubuntu), Oracle, IBM, Red Hat, VMware, SUSE, Rancher, Pivotal, Mirantis, and Platform9.

From what I hear, this is a sample, not a comprehensive list. The dark side of this enormous choice is confusion. Choosing is hard once you get beyond a handful of options. Still, only five years into the evolution of Kubernetes, it’s a good sign to see this much commercial support for it.

The Kubernetes and Cloud Native architecture is like a teenager. It’s growing rapidly but not quite done. As the industry fills in the blanks and as communities deliver better networking, storage, and deployment capabilities, it will go mainstream and become applicable to companies of all sizes and types. Soon. Not yet, but very soon.

Scenario-Based Learning and Behavior Change: A Brain Science Analysis

Key Stakeholders: Chief Learning Officers, Chief Human Resource Officers, Learning and Development Directors and Managers, Corporate Trainers, Content and Learning Product Managers, Leadership Trainers, Cybersecurity Trainers, Compliance Officers, Environmental Health and Safety Trainers, Sales Managers.

Why It Matters: People skills, compliance skills, safety skills and other skills involve choosing the right behavior in real-time or near real-time. It is behavior change that is the gold standard for Learning and Development, and many L&D vendors utilize scenario-based approaches to elicit behavior change. In this report, we use brain science to evaluate the effectiveness of scenario-based learning approaches in eliciting behavior change, and determine whether this approach helps employees to choose appropriate behaviors and to be more effective managers and employees.

Top Takeaway: Real-time interactive scenario-based learning approaches optimally elicit behavior change by directly engaging the behavioral skills learning system in the brain. Non-interactive scenario-based approaches are effective for behavior change (although to a lesser degree) because they engage emotional learning centers in the brain that draw learners in, and make them feel like they are part of the training. Non-interactive scenario-based approaches are practical and cost-effective alternatives to real-time interactive scenario-based approaches.

Overview

Please register or log into your Amalgam Insights Community account to read more.

Knowledge 2019 and ServiceNow’s Vision for Transforming the World of Work

In May 2019, Amalgam Insights attended Knowledge 2019, ServiceNow’s annual end-user conference. Since ServiceNow’s founding in 2004, the company has evolved from its roots as an IT asset and service management company to a company that supports digital workflow across IT, HR, service, and finance with the goal of making work better for every employee. In attending this show, Amalgam Insights was especially interested in seeing how ServiceNow was evolving its message to reflect what Amalgam Insights refers to as “Market Evolvers,” companies that have gained market dominance in their original market and taken advantage of modern mobile, cloud, and AI technology to expand into other markets. (Examples of Market Evolvers include, but are not excluded to, Salesforce, ServiceNow, Workday, Informatica, and Tangoe.) Continue reading “Knowledge 2019 and ServiceNow’s Vision for Transforming the World of Work”