
The Amalgam Insiders have 5 Key Questions for VMworld

(Editor’s Note: This week, Tom Petrocelli and Hyoun Park will be blogging and tweeting on key topics at VMworld at a time when multi-cloud management is a key issue for IT departments and Dell is spending billions of dollars on acquisitions. Please follow our blog and our Twitter accounts @TomPetrocelli, @Hyounpark, and @AmalgamInsights for more details this week as we cover VMworld!)

As Amalgam Insights prepares to attend VMworld, it is an especially interesting time from both an M&A and a strategic perspective, as VMware completes its acquisitions of sibling company Pivotal and endpoint security startup Carbon Black. With these acquisitions in progress and the opportunity to question executives at the top of Dell Technologies, including Pat Gelsinger and Michael Dell, Amalgam Insights will be looking for answers to the following questions:

1. How will VMware accelerate Pivotal’s growth post-acquisition? Back in 2013 when Pivotal was first founded, I stated in an interview that

“Pivotal is the first application platform that combines cloud, Big Data, and rapid application development and it represents a fundamental shift in enterprise IT. By creating an enterprise-grade Big Data application platform, Pivotal has the opportunity to quickly unlock value from transactional data that has traditionally been archived and ignored without requiring a long period of up training, integration, and upfront development time.”

The potential for Pivotal was immense. Even in light of The Death of Big Data, Pivotal still has both the toolkits and the methodology to support intelligent analytic and algorithm-based application architectures at a time when VMware needs to strengthen its capabilities in that area to match IBM-Red Hat, Oracle, and other competitors. We’re looking forward to getting some answers!

2. How will the Carbon Black acquisition be integrated into VMware’s security and end-user computing offerings? Carbon Black is a Boston-area security startup focused on discovering malicious activity on endpoints and will be a strong contributor to WorkspaceONE as VMware seeks to manage and secure the mobile-cloud ecosystem. And along with NSX Cloud for networking and CloudHealth Technologies for multi-cloud management, Carbon Black will help VMware tell a stronger end-to-end cloud story. But the potential and timeline for integration will end up defining the success of this $2 billion-plus acquisition.

3. Where does CloudHealth Technologies fit into VMware’s multi-cloud management story? Although this $500 million acquisition looked promising when it occurred last year, the Dell family previously acquired Enstratius to manage multi-cloud environments, and that acquisition ended up going nowhere. What did VMware learn from the last time around, and how will CloudHealth Technologies stay top of mind with all these other acquisitions going on?

4. Where is VMware going with its machine learning and AI capabilities for data center management? I can’t take credit for this one, as the great Maribel Lopez brought this up (go ahead and follow her on LinkedIn!). But VMware needs to continue advancing the Software-Defined Data Center and to ease client challenges in supporting hybrid cloud environments.

5. How is VMware bringing virtualization and Kubernetes together? With its acquisitions of Heptio and Bitnami, VMware has put itself right in the middle of the Kubernetes universe. But virtualization and Kubernetes are the application support equivalent of data center and cloud: two ends of the spectrum of what is possible. How will VMware simplify this componentization for clients who are seeking hybrid cloud help?

We’ll be looking for answers to these questions and more as we roam the halls of Moscone and put VMware and Dell executives to the test! Stay tuned for more!


VMware plus Pivotal Equals Platforms

(Editor’s Note: This week, Tom Petrocelli and Hyoun Park will be blogging and tweeting on key topics at VMworld at a time when multi-cloud management is a key issue for IT departments and Dell is spending billions of dollars on acquisitions. Please follow our blog and our Twitter accounts @TomPetrocelli, @Hyounpark, and @AmalgamInsights for more details this week as we cover VMworld!)

On August 22, 2019, VMware announced the acquisition of Pivotal. The term “acquisition” seems a little weird here since both are partly owned by Dell. It’s a bit like Dell buying Dell. Strangeness aside, this is a combination that makes a lot of sense.

For nearly eight years now, the concept of a microservices architecture has been taking shape. Microservices is an architectural idea wherein applications are broken up into many small bits of code – or services – each of which provides a limited set of functions and operates independently. Applications are assembled, Lego-like, from component microservices. The advantages of microservices are that different parts of a system can evolve independently, updates are less disruptive, and systems become more resilient because system components are less likely to harm each other. The primary vehicle for microservices is the container (which I’ve covered in my Market Guide: Seven Decision Points When Considering Containers), deployed in clusters to enhance resiliency and scale up resources more easily.
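
To make this concrete, here is a minimal sketch of what one such narrowly scoped service could look like: a hypothetical order-status microservice written in Python with Flask. The service name, route, and sample data are illustrative assumptions rather than anything drawn from VMware’s or Pivotal’s products; the point is simply that each service owns one small piece of functionality, runs independently, and is typically packaged into a container image for deployment in a cluster.

```python
# A minimal, hypothetical "order status" microservice (illustrative only).
# Each microservice owns one narrow capability and runs on its own.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in data; a real service would call its own datastore.
_ORDERS = {"1001": "shipped", "1002": "processing"}

@app.route("/orders/<order_id>/status")
def order_status(order_id):
    # Return the status for a single order and nothing else;
    # other concerns (billing, inventory) live in other services.
    return jsonify({"order_id": order_id,
                    "status": _ORDERS.get(order_id, "unknown")})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```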

The Kubernetes open-source software has emerged as the major orchestrator for containers and provides a stable base on which to build microservice platforms. These platforms must deploy not only the code that represents the business logic but also a set of system services, such as networking, tracing, logging, and storage. Container cluster platforms are, by nature, complex assortments of many moving parts – hard to build and hard to maintain.
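
As a rough illustration of what orchestration means in practice, the sketch below uses the official Kubernetes Python client to ask a cluster to run three replicas of the hypothetical order-status service from the earlier example. The image name, labels, namespace, and replica count are assumptions for illustration; real platforms layer much more (networking, tracing, logging, storage) on top of this basic step.

```python
# Minimal sketch: ask Kubernetes to run 3 replicas of a containerized service.
# The image, labels, and namespace are illustrative assumptions.
from kubernetes import client, config

config.load_kube_config()  # reads the local kubeconfig
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="order-status"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # the cluster keeps three copies running for resiliency
        selector=client.V1LabelSelector(match_labels={"app": "order-status"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "order-status"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="order-status",
                    image="registry.example.com/order-status:1.0",
                    ports=[client.V1ContainerPort(container_port=8080)],
                )
            ]),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)
```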

The big problem has been that most container technology has been open source and deployed piecemeal, leaving forward-looking companies to assemble their own container cluster microservices platforms. Building out and then maintaining these DIY platforms requires continued investment in people and other resources. Most companies either can’t afford or are unwilling to invest in this amount of engineering talent and training. Consequently, many companies have been left out of the container platform game.

The big change has been the emergence of commercial platforms (many of which were discussed in my SmartList Market Guide on Service Mesh and Building Out Microservices Networking), based on open-source projects, that bring to IT everything it needs to deploy container-based microservices. All the cloud companies, especially Google, which was the original home of Kubernetes, and open-source software vendors such as Red Hat (recently acquired by IBM) with its OpenShift platform, have some form of Kubernetes-based platform. There may be as many as two dozen commercial platforms based on Kubernetes today.

This brings us to VMware and Pivotal. Both companies are in the platform business. VMware is still the dominant player in virtual machine (VM) hypervisors, which underpin most systems today, and it is marketing a Kubernetes distribution. It also recently purchased Bitnami, a company that makes technology for bundling containers for deployment. At the time, I said:

“This is VMware doubling down on software for microservices and container clusters. Prima facie, it looks like a good move.”

Pivotal markets a Kubernetes distribution as well, but it is also one of the major vendors behind Cloud Foundry, another platform that runs containers, VMs, and now Kubernetes (which I discuss in my Analyst Insight: Cloud Foundry and Kubernetes: Different Paths to Microservices). The Pivotal portfolio also includes Spring Boot, one of the primary frameworks for building microservices in Java, and an extensive Continuous Integration/Continuous Deployment capability based on BOSH (part of Cloud Foundry), Concourse, and other open-source tools.

Taken together, VMware and Pivotal offer a variety of platforms for newer microservices and legacy VM architectures that will fit the needs of a big swath of large enterprises. This will give them both reach and depth in large enterprise companies and allow their sales teams to sell whichever platform a customer needs at the moment while providing a path to newer architectures. From a product portfolio perspective, VMware plus Pivotal is a massive platform play that will help them compete more effectively against the likes of IBM/Red Hat or the big cloud vendors.

On their own, neither VMware nor Pivotal had the capacity to compete against Red Hat OpenShift, especially now that Red Hat has access to IBM’s customer base and sales force. Together they will have a full range of technology to bring to bear as the Fortune 500 moves into microservices. The older architectures are also likely to remain in place, either for legacy reasons or because they just fit the applications they serve. VMware/Pivotal will be in a position to service those companies as well.

VMware could easily have decided to pick up any number of Kubernetes distribution companies such as Rancher or Platform9. None of them would have provided the wide range of platform choices that Pivotal brings to the table. And besides, this keeps it all in the Dell family.


Amalgam Insights Analysis: Incorta Raises $30 Million C Round

On August 15, 2019, Incorta announced the closing of a $30 million Series C round led by Sorenson Capital and including participation from existing investors GV, Kleiner Perkins, M12 (formerly known as Microsoft Ventures), Telstra Ventures, & Ron Wohl. Incorta has raised $75 million in funding since its founding.

Incorta is an innovator in the enterprise analytics space because of its Direct Data Mapping capability, which allows organizations to build data warehouse-like capabilities for organizing data without having to create ETL jobs or data models. This allows organizations to massively accelerate the time it takes to incorporate new data into analytic environments and represents one of the new paradigms in data and analytics that is getting organizations closer to the Holy Grail of supporting ubiquitous analytics. This approach also provides query performance benefits compared to traditional database approaches to organizing large stores of data for analytic use, such as columnar data stores.

This round of funding was validated by Incorta’s rapid recent growth, including 284% year-over-year revenue growth (with key deals in verticals such as financial services, retail, and manufacturing), 200 new employees added over the last six months with new offices in Chicago, Dubai, and Bangalore, and a strengthened marketing and sales partnership with Microsoft.

Incorta is expected to use this funding to acquire top technical, sales, marketing, and operational talent, continue investing in current technologies, and work with public cloud providers on supporting effective and efficient deployments.

Why It Matters

Enterprise analytics have gone through several generations of evolution: starting with the initial development of reporting on structured databases and ETL with the likes of Business Objects, Cognos, and Microstrategy; continuing with the growth of self-service and data discovery driven by Qlik and Tableau; and then moving to a variety of Big Data solutions including Hadoop, MongoDB, Oracle Exadata, and SAP HANA. We also saw a set of cloud BI solutions that accelerated the ability to create data models and star schemas, such as Birst, Power BI, and Oracle Analytics Cloud.

Although many of these technologies continue to be valid and effective ways to manage analytic access, we are entering a new era of analytics solutions that seek to accelerate analytic access and ubiquity not simply by improving current paradigms of data querying, modeling, and integration, but by fundamentally changing the way we access and structure analytic data, taking advantage of the cloud and modern methods of supporting complex data. Amalgam Insights believes that Incorta is an emerging game changer in the current era of analytics because its approach actively avoids traditional data modeling and ETL in favor of a more direct approach of bringing data sources together and conducting analytics on that data. This approach is fundamentally important because it allows businesses to bring in new data sources as trusted analytic data without having to work through the complexities of Kimball-esque data warehouse mapping.

With this round of funding, Amalgam Insights believes that Incorta is equipped to sell its unique approach to analytics to a larger audience. However, if the recent funding rounds of Looker, Thoughtspot, and Sisense are any indication, Incorta will likely look for at least one more round of funding, either to support global marketing and sales efforts or to prepare for an IPO. Either way, Amalgam Insights expects Incorta to continue its success as enterprise IT organizations seek a faster and more efficient way to keep bringing more data sources into their analytic environments. Amalgam Insights highly recommends Incorta as a solution for enterprise analytics departments struggling to keep up with the pace of readying new data sources for analytics. In addition, we recommend Incorta as worth an initial free trial and sandbox installation even in mature and stable analytic environments, to ensure that analytics teams are up to date with the Art of the Possible with modern analytic tools.

(Note: Amalgam Insights currently recommends Incorta, Looker, Microstrategy, Sisense, and Thoughtspot as analytic vendors with novel approaches for developing ubiquitous analytics.)


The Neuroscience of Corporate Retention Training

The Brain is Hardwired to Forget

High-quality Learning and Development solutions provide organizations with the competitive advantage they need to address digital transformation in a rapidly changing workplace. These solutions not only allow effective upskilling and reskilling of employees but also help attract and retain the best talent. Organizations recognize that it is more time- and cost-effective to develop talent from within than to recruit, hire, and onboard external talent. In addition, research suggests that a commitment to developing existing talent for today and the future enhances employee satisfaction, engagement, and productivity, all while reducing turnover: a win for employers and employees.

The goal of any L&D solution is to speed initial learning and to enhance long-term retention. Although initial learning is important, the ultimate goal is long-term retention and mastery. It does no good to train to perfection today, only to have that information forgotten tomorrow. Unfortunately, the brain is hardwired to forget, and thus long-term retention is a challenge. We know this from 100 years of research. To meet this challenge, L&D solutions must construct, design, and deliver training content in a way that trains for retention.

From a neuroscience perspective (see schematic brain figure), when you first study and begin to learn, you are using short-term (working) memory and attention to process the information. This processing recruits the prefrontal cortex with the goal of building a memory representation and transferring this representation (through repetition) into the long-term memory store. Long-term memory resides in the hippocampus and other medial temporal lobe structures. During initial learning, these memory traces are strong, but they are highly susceptible to weakening (aka forgetting) over time. The goal of retention training is to reduce the rate of forgetting.

There are three critical tools that organizations should look for in an L&D solution if they want to enhance retention.

  1. Periodic Testing: Effective L&D solutions periodically test the learner on their knowledge of the relevant subject matter. Knowledge that is poorly retained should be identified and retrained. New information should also be introduced to build the learner’s knowledge base. With this approach, we layer new material onto prior material, reinforcing that prior knowledge and allowing it to serve as the scaffolding upon which to build. This layering enhances retention. This testing procedure also opens the door to the personalized L&D methodologies that will ultimately be developed.

  2. Spaced Microlearning Training: Effective L&D solutions use spaced training that comes in brief bursts of compelling and engaging content focused on a specific topic (also called microlearning). Brief bursts of training increase the odds that attentional processes will remain engaged and that the learner’s attention span will not be exceeded. These microlearning bursts should be spaced over time so that any forgetting that occurs can be addressed (a minimal scheduling sketch follows this list). Spaced training provides time for memory consolidation in the hippocampus and medial temporal lobes, as well as for experiential application and learning on the job.
  3. Scenario-based Storytelling: Effective L&D solutions use scenario-based storytelling to convey information. Storytelling engages emotion and motivation centers in the brain that draw the learner in and allow them to see themselves in the learning. Engaging these emotion and motivation centers increases the effectiveness of the prefrontal cortex and hippocampus, speeding initial learning while reducing the rate of forgetting.
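
As promised above, here is a minimal sketch, in Python, of how spaced retraining might be scheduled in principle: the interval before the next microlearning burst grows while periodic tests show the material is retained, and resets when it is not. The 0.8 pass threshold and the doubling factor are illustrative assumptions, not parameters from any particular L&D platform or from the research discussed here.

```python
# Illustrative spacing scheduler: grow the review interval after a passed
# retention check, reset it after a failed one. Thresholds are assumptions.
from datetime import date, timedelta

def next_interval(last_interval_days, test_score, minimum=1, growth=2.0):
    """Return the number of days until the next microlearning burst."""
    if test_score >= 0.8:  # retained well: space the next burst further out
        return max(minimum, round(last_interval_days * growth))
    return minimum         # forgotten: retrain soon

# Example: a learner passes two periodic tests, then fails one.
interval = 1
for score in (0.9, 0.85, 0.5):
    interval = next_interval(interval, score)
    print("next burst in", interval, "day(s):",
          date.today() + timedelta(days=interval))
```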

By combining periodic testing with spaced microlearning and scenario-based storytelling, L&D solutions will “Train for Retention.” This approach will develop well-rounded employees who are ready to meet the needs of the modern workplace today and the new challenges that await tomorrow.

If, like so many organizations, you are struggling with the challenges of digital transformation and simply want L&D solutions that “stick,” look for solutions that embrace periodic testing, spaced microlearning, and scenario-based storytelling.