We are in the midst of another change-up in the IT world. Every 15 to 20 years there is a radical rethink of the platforms that applications are built upon. Over the history of IT, we have moved from batch-oriented, pipelined systems (predominantly written in COBOL) to the client-server and n-tier systems that are the standards of today. These platforms were developed in the last century and designed for last-century applications. After years of putting shims into systems to accommodate the scale and diversity of modern applications, IT has just begun to deploy new platforms based on containers and Kubernetes. These new platforms promise greater resiliency and scalability, as well as greater responsiveness to the business. Continue reading Canonical Takes a Third Path to Support New Platforms
Augmented Reality in Product Development: A Neuroscience Perspective
Product development is a collaborative process in which the product evolves from an idea, to drawings, and ultimately to a physical prototype. It is an iterative process in which two-dimensional (2D) static images and schematics drive development early on, with a physical three-dimensional (3D) prototype emerging only later. This approach places a heavy load on the brain's cognitive system because dynamic 3D representations and imagery must be constructed in the brain from a series of static 2D images. Continue reading Augmented Reality in Product Development: A Neuroscience Perspective
Look Beyond The Simple Facts of the Cimpl Acquisition
(Note: This blog was co-written by Hyoun Park and Larry Foster, an Enterprise Technology Management Association Hall of Famer and an executive who has shaped the Technology Expense Management industry. Please welcome Larry’s first contribution to Amalgam Insights!)
On August 22, 2019, Upland Software announced the acquisition of Cimpl (f.k.a. Etelesolv), a Montreal-based telecom expense management platform that led the Canadian market and had expanded into the United States. With this acquisition, Cimpl will become part of Upland's Project & Financial Management Solution Suite and add approximately $8 million in annual revenue.
Context for the Acquisition
The TEM (Technology Expense Management) industry has experienced a continual ebb and flow of acquisitions and mergers over the past twelve years. The typical TEM deal involves two or more independent entities within the realm of TEM, WEM (Wireless Expense Management), or MMS (Managed Mobility Services) merging to create a more comprehensive expense management portfolio with superior global delivery capabilities.
The reality is that many of these mergers are driven by economics: one or both entities can reduce overhead by eliminating duplicate services, unifying back-office operations, and amalgamating technology platforms. These consolidation mergers are typical of a maturing industry that eventually comes to be dominated by a few leading solution providers holding the majority of market share. All of the leading TEM solution providers, including Tangoe, MDSL, and Calero, have long histories of such "like-minded mergers."
Cimpl as an outlier in the TEM market
Until this acquisition, Cimpl had maintained the persona of the TEM industry's independent dark horse, quietly residing in Quebec, Canada while refining its multi-tenant cloud platform and progressively building its market share.
Unlike most TEM managed service providers, Cimpl decided to focus on being mainly a pure software company, providing a white-label technology platform for its delivery partners. In early 2018, Cimpl quietly began expanding its physical presence into the United States. Since its inception, Cimpl has achieved steady, incremental success and stayed profitable, in contrast to a number of TEM vendors that have gone through boom-or-bust cycles driven by external funding (or the lack thereof).
The Challenge for TEM
The traditional acquisition playbook has kept the TEM industry from being recognized by organizations as a strategic asset. At the same time, the industry is experiencing a dramatic paradigm shift as organizations replace legacy communication services with an ever-growing spectrum of cloud-based services. Traditionally, TEM solutions have focused on validating the integrity of invoice charges across multiple vendors prior to payment and allocating expenses to the cost centers using each leased service. Enterprises derive value from TEM solutions by enabling a centralized ICT (Information and Communications Technology) shared service to automate the lifecycle from provisioning through payment and to manage the resolution of disputed invoice charges for what are essentially static services.
However, as organizations adopt more ephemeral cloud services spanning multi-vendor private, public, and hybrid leased environments for compute, storage, API-enabled integrations, connectivity, input/output, and telecommunications, the centralized ICT business operation is being transformed from a manager of daily operations into a fiduciary broker focused on optimizing technology investments. Unlike the recurring charges that make up the majority of traditional telecom bills, cloud services are consumption-based: it is the client's responsibility to deactivate services and to manage the appropriate configuration of contracted services based on statistical analysis and forecasts of actual usage.
In the cloud world, provisioning activities such as activations, changes, and deactivations happen "on demand," completely independent of the ICT operation, while the primary focus of ITEM (IT Expense Management) solutions has been managing recurring and non-recurring invoice charges in arrears. As ICT operations evolve into technology brokers, they need real-time insight, underpinned by ML and AI algorithms, that makes cost optimization recommendations to add, consolidate, change, or deactivate services based on usage trends.
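As a sketch of what such usage-driven recommendations might look like in their simplest form, consider the following illustration. The service fields, the 20% utilization threshold, and the recommendation rules are all hypothetical stand-ins; a production ITEM platform would rely on statistical forecasting and ML models rather than a fixed cutoff.

```python
# Simplified sketch of usage-based cost optimization for consumption-
# priced cloud services. Field names and the 20% threshold are
# hypothetical; real systems would forecast usage with ML models.
from dataclasses import dataclass

@dataclass
class CloudService:
    name: str
    monthly_cost: float       # recurring charge for the current configuration
    provisioned_units: float  # capacity the client is paying for
    consumed_units: float     # capacity actually used this period

def recommend(services: list[CloudService], min_utilization: float = 0.20) -> list[str]:
    """Flag services whose measured usage no longer justifies their cost."""
    actions = []
    for svc in services:
        utilization = svc.consumed_units / svc.provisioned_units
        if utilization == 0:
            actions.append(f"Deactivate {svc.name}: no usage, saves ${svc.monthly_cost:,.2f}/mo")
        elif utilization < min_utilization:
            actions.append(f"Downsize {svc.name}: only {utilization:.0%} utilized")
    return actions

if __name__ == "__main__":
    inventory = [
        CloudService("dev-cluster", 4200.0, 64, 6),
        CloudService("archive-storage", 900.0, 500, 0),
    ]
    for action in recommend(inventory):
        print(action)
```

The design point is the shift from auditing invoices after the fact to acting on measured consumption before the next invoice arrives.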
Why the Cimpl acquisition will help Upland
This context brings us to the real ingenuity of the Cimpl acquisition. In the typically quiet financial days of August, when everyone is away on vacation, Upland Software announced an accretive acquisition of Cimpl for a purchase price of $23.1M in cash plus a $2.6M cash holdback payable in 12 months. Upland expects the acquisition to generate annual revenue of approximately $8M, of which $7.4M is recurring. The keyword buried within those financial statistics is "accretive," meaning that the deal is expected to add to Upland's earnings from the start rather than dilute them.
Upland already has an impressive complementary portfolio of profitable software solutions. A closer look at the acquisition of Cimpl shows how Upland is formulating a solution strategy to manage all aspects of the Information and Communication Technology business operations.
The strategic value of the Cimpl acquisition becomes very clear when you recognize that Upland is the first company to combine an IT Financial Management (ITFM) platform, ComSci, with an IT Expense Management (ITEM) solution, Cimpl. Upland already owns additional complementary solutions, including document and workflow automation, a BI platform, a customer engagement platform, and a knowledge management platform. With these components, Upland is working to create an industry-leading ERP-type solution framework to automate, manage, and rationalize all aspects of ICT business operations.
Although both ITFM and ITEM support ICT business operations, they focus on different aspects. ITFM is predominantly used on the front end to manage budgets and, on a monthly basis, to support internal billing and chargeback activities; it is leveraged primarily by the IT CFO's office. ITEM solutions like Cimpl, by contrast, are used by analysts and operational managers because they focus on the high volumes of transactional operations and data that flow throughout the month, including the provisioning and payment of leased services, from landline and mobile communication services to the ever-expanding array of cloud services.
Looking Forward: Our Recommendations
In this context, we offer the following recommendations based on this acquisition:
Expect other leading TEM, ITFM, and CEM/CMP (Cloud Expense Management and Cloud Management Platform) solution providers to develop competitive solution frameworks that bring multiple IT categories together from a finance and expense management perspective.
ICT managers need to evolve their solution due diligence beyond pursuing and leveraging independent ITFM, ITEM, and CEM/CMP point solutions toward choosing solutions with comprehensive IT management frameworks. As IT becomes increasingly based on subscriptions, project-based spend, and on-demand peak usage across a variety of categories, ICT managers should aim for a single management control plane for finance and expenses rather than depending on a patchwork of management solutions.
Real-time management is the future of IT expense management. The next level of operational efficiency will be underpinned by more comprehensive real-time insight that helps organizations understand the optimal way to acquire, configure, and consume interrelated cloud services and pay their invoices. This will require insights on usage, project management, service management, and real-time status updates associated with expense and finance. By combining financial and operational data, ICT managers will gain greater insight into the current and ongoing ROI of the technology under management.
The Amalgam Insiders have 5 Key Questions for VMworld
(Editor’s Note: This week, Tom Petrocelli and Hyoun Park will be blogging and tweeting on key topics at VMworld at a time when multi-cloud management is a key issue for IT departments and Dell is spending billions of dollars. Please follow our blog and our twitter accounts TomPetrocelli, Hyounpark, and AmalgamInsights for more details this week as we cover VMworld!)
As Amalgam Insights prepares to attend VMworld, it is an especially interesting time from both an M&A and a strategic perspective, as VMware completes acquisitions of its sibling company Pivotal and endpoint security startup Carbon Black. With these acquisitions in progress and the opportunity to question executives at the top of Dell Technologies, including Pat Gelsinger and Michael Dell, Amalgam Insights will be looking for answers to the following questions:
1. How will VMware accelerate Pivotal's growth post-acquisition? Back in 2013, when Pivotal was first founded, I stated in an interview that the potential for Pivotal was immense. Even in light of The Death of Big Data, Pivotal still has both the toolkits and the methodology to support intelligent analytic and algorithm-based application architectures at a time when VMware needs to increase its support there in light of the capabilities of IBM-Red Hat, Oracle, and other competitors. We're looking forward to getting some answers!
2. How will the Carbon Black acquisition be integrated into VMware's security and end-user computing offerings? Carbon Black is a Boston-area security startup focused on discovering malicious activity on endpoints, and it will be a strong contributor to Workspace ONE as VMware seeks to manage and secure the mobile-cloud ecosystem. Along with NSX Cloud for networking and CloudHealth Technologies for multi-cloud management, Carbon Black will help VMware tell a stronger end-to-end cloud story. But the potential and timeline for integration will end up defining the success of this $2 billion+ acquisition.
3. Where does CloudHealth Technologies fit into VMware's multi-cloud management story? Although this $500 million acquisition looked promising when it occurred last year, the Dell family previously invested in Enstratius to manage multi-cloud environments, and that acquisition ended up going nowhere. What did VMware learn from last time, and how will CloudHealth Technologies stay top of mind amid all these other acquisitions?
4. Where is VMware going with its machine learning and AI capabilities for data center management? I can’t take credit for this one, as the great Maribel Lopez brought this up (go ahead and follow her on LinkedIn!). But VMware needs to continue advancing the Software-Defined Data Center and to ease client challenges in supporting hybrid cloud environments.
5. How is VMware bringing virtualization and Kubernetes together? With its acquisitions of Heptio and Bitnami, VMware has put itself right in the middle of the Kubernetes universe. But virtualization and Kubernetes are the application-support equivalents of data center and cloud: two points on the spectrum of what is possible. How will VMware simplify this componentization for clients seeking hybrid cloud help?
We’ll be looking for answers to these questions and more as we roam the halls of Moscone and put VMware and Dell executives to the test! Stay tuned for more!
VMware plus Pivotal Equals Platforms
(Editor’s Note: This week, Tom Petrocelli and Hyoun Park will be blogging and tweeting on key topics at VMworld at a time when multi-cloud management is a key issue for IT departments and Dell is spending billions of dollars. Please follow our blog and our twitter accounts TomPetrocelli, Hyounpark, and AmalgamInsights for more details this week as we cover VMworld!)
On August 22, 2019, VMware announced the acquisition of Pivotal. The term “acquisition” seems a little weird here since both are partly owned by Dell. It’s a bit like Dell buying Dell. Strangeness aside, this is a combination that makes a lot of sense.
For nearly eight years now, the concept of a microservices architecture has been taking shape. Microservices is an architectural idea wherein applications are broken up into many small bits of code – or services – each providing a limited set of functions and operating independently. Applications are assembled, Lego-like, from component microservices. The advantages of microservices are that different parts of a system can evolve independently, updates are less disruptive, and systems become more resilient because components are less likely to harm each other. The primary vehicle for microservices is the container (which I've covered in my Market Guide: Seven Decision Points When Considering Containers), deployed in clusters to enhance resiliency and more easily scale up resources.
Kubernetes, the open-source software that has emerged as the major orchestrator for containers, provides a stable base on which to build microservice platforms. These platforms must deploy not only the code that represents the business logic but also a set of system services such as networking, tracing, logging, and storage. Container cluster platforms are, by nature, complex assortments of many moving parts – hard to build and hard to maintain.
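To ground the idea, here is a minimal sketch of what one such single-purpose service might look like, using only the Python standard library. The pricing domain, routes, and port are hypothetical illustrations, not part of any platform discussed here; in practice a service like this would be packaged into a container image and handed to Kubernetes to schedule, restart, and scale.

```python
# A minimal sketch of a single-purpose microservice, using only the
# Python standard library. Service name, routes, and data are
# illustrative, not drawn from any specific platform discussed above.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

PRICES = {"sku-1": 19.99, "sku-2": 5.49}  # stand-in for the service's own data store

class PricingHandler(BaseHTTPRequestHandler):
    """One narrowly scoped responsibility: answer price lookups."""

    def do_GET(self):
        if self.path == "/healthz":
            # Liveness endpoint: lets an orchestrator such as Kubernetes
            # replace the container if the service stops responding.
            self._reply(200, {"status": "ok"})
        elif self.path.startswith("/price/"):
            sku = self.path.rsplit("/", 1)[-1]
            if sku in PRICES:
                self._reply(200, {"sku": sku, "price": PRICES[sku]})
            else:
                self._reply(404, {"error": "unknown sku"})
        else:
            self._reply(404, {"error": "not found"})

    def _reply(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Each microservice runs in its own container and scales independently.
    HTTPServer(("0.0.0.0", 8080), PricingHandler).serve_forever()
```

Note that resiliency comes from the orchestrator, not the application: Kubernetes probes an endpoint like /healthz and restarts instances that stop answering.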
The big problem has been that most container technology is open source and deployed piecemeal, leaving forward-looking companies to assemble their own container cluster microservices platforms. Building out and then maintaining these DIY platforms requires continued investment in people and other resources. Most companies either can't afford or are unwilling to invest in that much engineering talent and training. As a result, a lot of companies have been left out of the container platform game.
The big change has been the emergence of commercial platforms (many of which were discussed in my SmartList Market Guide on Service Mesh and Building Out Microservices Networking), based on open-source projects, that bring to IT everything it needs to deploy container-based microservices. All the cloud companies, especially Google, the original home of Kubernetes, and open-source software vendors such as Red Hat (recently acquired by IBM) with its OpenShift platform, have some form of Kubernetes-based platform. There may be as many as two dozen commercial platforms based on Kubernetes today.
This brings us to VMware and Pivotal. Both companies are in the platform business. VMware is still the dominant player in Virtual Machine (VM) hypervisors, which underpin most systems today, and is marketing a Kubernetes distribution. It also recently purchased Bitnami, a company that makes technology for bundling containers for deployment. At the time, I said:
“This is VMware doubling down on software for microservices and container clusters. Prima facie, it looks like a good move.”
Pivotal markets a Kubernetes distribution as well, but it is also one of the major vendors behind Cloud Foundry, another platform that runs containers, VMs, and now Kubernetes (which I discuss in my Analyst Insight: Cloud Foundry and Kubernetes: Different Paths to Microservices). The Pivotal portfolio also includes Spring Boot, one of the primary frameworks for building microservices in Java, and an extensive Continuous Integration/Continuous Deployment capability based on BOSH (part of Cloud Foundry), Concourse, and other open-source tools.
Taken together, VMware and Pivotal offer a variety of platforms, for newer microservices and legacy VM architectures alike, that will fit the needs of a big swath of large enterprises. This gives them both reach and depth in large enterprise accounts and allows their sales teams to sell whichever platform a customer needs at the moment while providing a path to newer architectures. From a product portfolio perspective, VMware plus Pivotal is a massive platform play that will help them compete more effectively against the likes of IBM/Red Hat or the big cloud vendors.
On their own, neither VMware nor Pivotal had the capacity to compete against Red Hat OpenShift, especially now that Red Hat has access to IBM's customer base and sales force. Together, they will have a full range of technology to bring to bear as the Fortune 500 moves into microservices. Older architectures are also likely to remain in place, either for legacy reasons or because they simply fit the applications they serve. VMware/Pivotal will be in a position to serve those companies as well.
VMware could easily have decided to pick up any number of Kubernetes distribution companies such as Rancher or Platform9. None of them would have provided the wide range of platform choices that Pivotal brings to the table. And besides, this keeps it all in the Dell family.
Amalgam Insights Analysis: Incorta Raises $30 Million C Round
On August 15, 2019, Incorta announced the closing of a $30 million Series C round led by Sorenson Capital with participation from existing investors GV, Kleiner Perkins, M12 (formerly known as Microsoft Ventures), Telstra Ventures, and Ron Wohl. Incorta has raised $75 million in funding since its founding.
Incorta is an innovator in the enterprise analytics space because of its Direct Data Mapping capabilities, which allow organizations to build data warehouse-like capabilities without having to create ETL jobs or data models. This lets organizations dramatically accelerate the incorporation of new data into analytic environments and represents one of the new paradigms in data and analytics bringing organizations closer to the Holy Grail of ubiquitous analytics. This approach also provides performance benefits for analytic queries compared to traditional database approaches to organizing large stores of data for analytic usage, such as columnar data stores.
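To make the contrast concrete, here is a deliberately simplified illustration of analytics run directly over source extracts joined at query time, rather than through a pre-built star schema. To be clear, this is not Incorta's implementation or API; the tables, columns, and the use of SQLite are hypothetical stand-ins for the general idea of skipping the ETL modeling step.

```python
# Conceptual illustration only: analytics run directly over source
# extracts mapped together at query time, with no dimensional modeling
# or ETL job in between. NOT Incorta's actual implementation; all
# table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two "source system" extracts landed as-is.
cur.execute("CREATE TABLE erp_orders (order_id INT, customer_id INT, amount REAL)")
cur.execute("CREATE TABLE crm_customers (customer_id INT, region TEXT)")
cur.executemany("INSERT INTO erp_orders VALUES (?, ?, ?)",
                [(1, 10, 250.0), (2, 11, 90.0), (3, 10, 40.0)])
cur.executemany("INSERT INTO crm_customers VALUES (?, ?)",
                [(10, "EMEA"), (11, "AMER")])

# The analytic question is answered by joining the raw sources directly;
# no fact/dimension model was materialized first.
for region, revenue in cur.execute("""
        SELECT c.region, SUM(o.amount)
        FROM erp_orders o JOIN crm_customers c USING (customer_id)
        GROUP BY c.region"""):
    print(region, revenue)
```

The appeal of this style is that adding a new source means landing another extract, not redesigning a warehouse schema.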
This round of funding was validated by Incorta's rapid recent growth, including 284% year-over-year revenue growth (with key deals in verticals such as financial services, retail, and manufacturing), 200 new employees added over the last six months with new offices in Chicago, Dubai, and Bangalore, and a strengthened marketing and sales partnership with Microsoft.
Incorta is expected to use this funding to acquire top technical, sales, marketing, and operational talent, continue investing in current technologies, and work with public cloud providers on supporting effective and efficient deployments.
Why It Matters
Enterprise analytics has gone through several generations of evolution, starting with the initial development of reporting on structured databases and ETL with the likes of Business Objects, Cognos, and MicroStrategy, continuing with the growth of self-service and data discovery driven by Qlik and Tableau, and then moving through a variety of Big Data solutions including Hadoop, MongoDB, Oracle Exadata, and SAP HANA. We also saw a set of cloud BI solutions that accelerated the ability to create data models and star schemas, such as Birst, Power BI, and Oracle Analytics Cloud.
Although many of these technologies remain valid and effective ways to manage analytic access, we are entering a new era of analytics solutions that seek to accelerate analytic access and ubiquity not simply by improving current paradigms of data querying, modeling, and integration, but by fundamentally changing the way we access and structure analytic data, taking advantage of the cloud and modern methods of supporting complex data. Amalgam Insights believes that Incorta is an emerging game changer in this era because its approach actively avoids traditional data modeling and ETL in favor of bringing data sources together directly and conducting analytics on the data as it stands. This approach is fundamentally important because it allows businesses to bring in new data sources as trusted analytic data without having to work through the complexities of Kimball-esque data warehouse mapping.
With this round of funding, Amalgam Insights believes that Incorta is equipped to sell its unique approach to analytics to a larger audience. However, if the recent funding rounds of Looker, Thoughtspot, and Sisense are any indication, Incorta will likely look for at least one more round of funding, either to support global marketing and selling efforts or to prepare for an IPO. Either way, Amalgam Insights expects Incorta to continue its success as enterprise IT organizations seek a faster and more efficient way to bring more data sources into their analytic environments. Amalgam Insights highly recommends Incorta as a solution for enterprise analytics departments struggling to keep up with the pace of readying new data sources for analytics. In addition, we recommend Incorta as a solution worth an initial free trial and sandbox installation even in mature and stable analytic environments, to ensure that analytics teams stay up to date with the Art of the Possible in modern analytic tools.
(Note: Amalgam Insights currently recommends Incorta, Looker, MicroStrategy, Sisense, and Thoughtspot as analytic vendors with novel approaches for developing ubiquitous analytics.)
The Neuroscience of Corporate Retention Training
The Brain is Hardwired to Forget
High-quality Learning and Development (L&D) solutions provide organizations with the competitive advantage they need to address digital transformation in a rapidly changing workplace. These solutions enable effective upskilling and reskilling of employees and also help attract and retain the best talent. Organizations recognize that it is more time- and cost-effective to develop talent from within than to recruit, hire, and onboard external talent. In addition, research suggests that a commitment to developing existing talent for today and the future enhances employee satisfaction, engagement, and productivity, all while reducing turnover: a win for employers and employees alike.
The goal of any L&D solution is to speed initial learning and to enhance long-term retention. Although initial learning is important, the ultimate goal is long-term retention and mastery. It does no good to train to perfection today only to have that information forgotten tomorrow. Unfortunately, the brain is hardwired to forget, which makes long-term retention a challenge; we know this from 100 years of research. To meet this challenge, L&D solutions must construct, design, and deliver training content in a way that trains for retention.
From a neuroscience perspective, when you first study and begin to learn, you use short-term (working) memory and attention to process the information. This processing recruits the prefrontal cortex with the goal of building a memory representation and transferring it (through repetition) into the long-term memory store. Long-term memory resides in the hippocampus and other medial temporal lobe structures. During initial learning these memory traces are strong, but they are highly susceptible to weakening (that is, forgetting) over time. The goal of retention training is to reduce the rate of forgetting.
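The century of research behind that claim traces back to Ebbinghaus, whose forgetting curve is commonly summarized as a simple exponential decay (a standard textbook formalization, offered here for illustration rather than taken from the original post):

R(t) = e^{-t/S}

Here R(t) is the probability of recall after a delay t, and S is the stability of the memory trace. Retention training works by increasing S: a stronger trace flattens the curve, so knowledge decays more slowly between refreshers.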
There are three critical tools that organizations should look for in an L&D solution to enhance retention:
- Periodic Testing: Effective L&D solutions periodically test the learner's knowledge of the relevant subject matter. Knowledge that is poorly retained should be identified and retrained, and new information should be introduced to build the learner's knowledge base. With this approach, new material is layered onto prior material, reinforcing prior knowledge and allowing it to serve as the scaffolding on which to build. This layering enhances retention, and the testing procedure opens the door to the personalized L&D methodologies that will ultimately be developed.
- Spaced Microlearning Training: Effective L&D solutions use spaced training delivered in brief bursts of compelling, engaging content focused on a specific topic (also called microlearning). Brief bursts of training increase the odds that attentional processes remain engaged and that the learner's attention span is not exceeded. These microlearning boosts should be spaced over time so that any forgetting that occurs can be addressed; spacing also provides time for memory consolidation in the hippocampus and medial temporal lobes, as well as for experiential application and learning on the job. (A minimal scheduling sketch appears after this list.)
- Scenario-based Storytelling: Effective L&D solutions use scenario-based storytelling to convey information. Storytelling engages emotion and motivation centers in the brain that draw learners in and allow them to see themselves in the learning. Engaging these centers increases the effectiveness of the prefrontal cortex and hippocampus, speeding initial learning while reducing the rate of forgetting.
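As referenced above, here is a minimal sketch of how periodic testing and spacing can be combined into a review schedule. The doubling rule, the 60-day cap, and the item fields are hypothetical choices for illustration, not parameters from any specific L&D product or study.

```python
# Minimal illustrative spaced-review scheduler. The intervals and the
# doubling rule are hypothetical demonstration choices, not a
# prescription from the research discussed above.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Item:
    topic: str
    interval_days: int = 1                           # current spacing between reviews
    next_review: date = field(default_factory=date.today)

def record_result(item: Item, correct: bool, today: date) -> None:
    """Grow the spacing after success; shrink it after forgetting."""
    if correct:
        item.interval_days = min(item.interval_days * 2, 60)  # cap the spacing
    else:
        item.interval_days = 1                       # forgetting detected: retrain soon
    item.next_review = today + timedelta(days=item.interval_days)

def due_today(items: list[Item], today: date) -> list[Item]:
    """The day's microlearning queue: only items whose spacing has elapsed."""
    return [i for i in items if i.next_review <= today]

if __name__ == "__main__":
    items = [Item("objection handling"), Item("pricing policy")]
    today = date.today()
    for item in due_today(items, today):
        record_result(item, correct=True, today=today)  # pretend the quiz was passed
        print(item.topic, "-> next review on", item.next_review)
```

The periodic test supplies the `correct` signal; the growing intervals are what give consolidation, and on-the-job application, time to work.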
By combining periodic testing with spaced microlearning and scenario-based storytelling, L&D solutions will "Train for Retention." This approach develops well-rounded employees who are ready to meet the needs of the modern workplace today and the new challenges that await tomorrow.
If, like so many organizations, you are struggling with digital transformation and simply want L&D solutions that "stick," look for solutions that embrace periodic testing, spaced microlearning, and scenario-based storytelling.
Mobile Solutions Launches New Robotic Process Automation Capabilities
[Edited July 25th to reflect Mobile Solutions’ current public-facing offerings]
Key Takeaway: Mobile Solutions is providing Robotic Process Automation for Managed Mobility Services with a focus on mid-market enterprises and organizations. This capability provides Mobile Solutions with a starting point for handling basic service orders.
On July 10th, Mobile Solutions announced the launch of two Robotic Process Automation (RPA) assistants, named Maxine and Maximus Brain, that assist with service orders entered into Mobile Solutions' mobility management platform, MAX.
Maxine serves as Continue reading Mobile Solutions Launches New Robotic Process Automation Capabilities
Understanding Microsoft’s Investment in OpenAI
On July 22, Microsoft announced a $1 billion investment in OpenAI, a lab focused on "artificial general intelligence": the goal of creating artificial intelligence with human-like observation and learning capabilities. With this announcement, Microsoft becomes the "exclusive" cloud computing provider for OpenAI and gains the right to productize OpenAI capabilities as they come to market.
Key Takeaways: Microsoft is making a long-term investment in "general intelligence" to get a head start on the next generation of AI coming to market in five to ten years, and it will be able to recoup some of its costs as OpenAI's cloud provider and as a monetizer of OpenAI technologies.
Continue reading Understanding Microsoft’s Investment in OpenAI
The Neuroscience of Effective Sales Coaching
It’s About the How, Not the What
As any sales manager will tell you, one of the most challenging aspects of his or her job is sales coaching. It is one thing for your sales professionals to have a cognitive understanding of what to say during a pitch – the words to use, the appropriate script, what body language to display and which tactics to utilize when handling objections or other nuanced situations.
It is another thing (and mediated by completely different systems in the brain) for your sales professionals to have the people skills, the intuition if you will, to know how to deliver the pitch, how to display the appropriate body language, and how to handle a broad range of situations. Although a cognitive understanding – an understanding of the “what” – is useful, it is people skills – the “how” – that closes the deal. Continue reading The Neuroscience of Effective Sales Coaching