Why Extended Reality (xR) is Poised to Disrupt Corporate Learning and Development – Part II: The Brain Science

Note: If you missed Part I of this blog series, catch up and read Part I: The Problem. This is part of a four-blog series exploring the psychology and brain science behind the potential for extended reality tools to disrupt corporate Learning & Development.

Four Dissociable Learning Systems in the Brain

The human brain comprises at least four distinct learning systems. A schematic of these learning systems is provided in the figure below.

Five Vital Sourcing and Vendor Management Recommendations for Complex Technology Categories

As part of Amalgam Insights’ coverage of the Technology Expense Management market, we provide the following guidance to sourcing, procurement, and operations professionals seeking to better understand how to manage technology expenses.

In immature or monopoly markets where one dominant vendor provides technology services, vendor management challenges are limited. Although buyers can potentially purchase services outside of the corporate umbrella, enterprises can typically work both with the vendor and with corporate compliance efforts to consolidate spend. However, vendor management becomes increasingly challenging in a world where multiple vendors provide similar but not equivalent technology services. To effectively optimize services across multiple vendors, organizations must be able to manage all relevant spend in a single location.

In Telecom Expense Management, this practice has been a standard for over a decade as companies manage AT&T, Vodafone, Verizon, and many other telecom carriers with a single solution. For Software-as-a-Service, a number of solutions are starting to emerge that solve this challenge. And with Infrastructure-as-a-Service, this challenge is only now emerging in earnest as Microsoft Azure and Google Cloud Platform rise as credible competitors to Amazon Web Services.
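To make the single-pane-of-glass idea concrete, below is a minimal, hypothetical sketch of rolling spend records from multiple vendors into one consolidated view. The vendor names, record fields, and amounts are illustrative assumptions only and do not reference any specific TEM product.

```python
# Hypothetical sketch: consolidating multi-vendor technology spend
# into a single view for cross-vendor comparison and optimization.
# Field names and figures are illustrative only.
from collections import defaultdict

spend_records = [
    {"vendor": "AT&T",     "category": "wireless",   "amount": 48_000.00},
    {"vendor": "Verizon",  "category": "wireless",   "amount": 35_500.00},
    {"vendor": "Vodafone", "category": "wireless",   "amount": 21_250.00},
    {"vendor": "AT&T",     "category": "fixed-line", "amount": 12_400.00},
]

# Roll spend up by category across all vendors so usage can be
# compared in one place rather than vendor by vendor.
by_category = defaultdict(float)
for record in spend_records:
    by_category[record["category"]] += record["amount"]

for category, total in sorted(by_category.items()):
    print(f"{category}: ${total:,.2f}")
```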

To effectively manage sourcing and vendor management in complex technology categories, Amalgam suggests starting with the following contractual steps:

Align vendor and internal Service Level Agreements. There is no reason that any vendor should provide a lower level of service than the IT department has committed to the enterprise and other commercial partners.

Define bulk and tiered discounts for all major subcategories of spend within a vendor contract. Vendors are typically willing to discount any usage category where a business buys in bulk, but there is no reason for them to simply hand over discounts without being asked. This step sounds simple, but typically requires a basic understanding of service and usage categories to identify which ones are relevant (see the tiered-discount sketch after this list).

Avoid “optional” fees. For instance, on telecom bills, there are a number of carrier fees that are included in the taxes, surcharges, and fees part of the bill. These charges are negotiable and will vary from vendor to vendor. Ensure that the enterprise is negotiating all fees that are carrier-based, rather than assuming that these fees are all mandatory government taxes or surcharges which can’t be avoided.

Renegotiate contracts as necessary, not just based on your “scheduled” contract dates. There is no need to constantly renegotiate contracts just for the sake of getting every last dime, but companies should seek to renegotiate if they significantly increase the size of their purchase or significantly change the shape of their technology portfolio. For instance, an Amazon contract may not be well-suited for a significant usage increase of a service due to business demand.

Embed contract agreements and terms into service order, invoice processing, and service management areas. It is not uncommon to see elegant contract negotiations go to waste because the terms are not enforced during operational, financial, or service management processes. Structure the business relationship to support the contract, then place relevant contract terms within other processes and management solutions so that these terms are readily available to all stakeholders, not just the procurement team.
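As a concrete illustration of the tiered-discount recommendation above, the sketch below computes the cost of a usage category under hypothetical volume tiers, applied marginally like tax brackets. The tier boundaries, discount rates, and prices are invented for illustration and will differ in any real vendor contract.

```python
# Hypothetical sketch: applying tiered volume discounts to a usage
# category. Tier thresholds and rates are illustrative assumptions.

# Each tier: (units up to this threshold, discount off list price)
TIERS = [
    (1_000, 0.00),          # first 1,000 units at list price
    (5_000, 0.10),          # units up to 5,000 at 10% off
    (float("inf"), 0.20),   # everything beyond 5,000 at 20% off
]

def tiered_cost(units: int, list_price: float) -> float:
    """Return total cost for `units` at `list_price`, applying each
    discount tier marginally, like tax brackets."""
    total, prev_threshold = 0.0, 0
    for threshold, discount in TIERS:
        units_in_tier = max(0, min(units, threshold) - prev_threshold)
        total += units_in_tier * list_price * (1 - discount)
        prev_threshold = threshold
        if units <= threshold:
            break
    return total

# Example: 8,000 wireless lines at a $40 list price -> $280,000.00
print(f"${tiered_cost(8_000, 40.0):,.2f}")
```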

Effective vendor and contract management is an important starting point to support each subsequent element of the technology lifecycle to enforce effective management at scale. In future blogs, we will cover best practices for inventory, invoice, service order, and lifecycle management across telecom, mobility, network, SaaS, and IaaS spend.

Why Extended Reality (xR) is Poised to Disrupt Corporate Learning and Development – Part I: The Problem

(Note: This is the first of a four-part series on the Learning Science of xR in Corporate Learning.)

Executive Summary

Key Stakeholders: Chief Learning Officers, Chief Human Resource Officers, Chief People Officers, Chief Talent Officers, Learning & Development Directors and Managers, Corporate Trainers, Content and Learning Product Managers, Hiring Directors, Hiring Managers, Human Resource Directors, Human Resource Managers.

Why It Matters: A major goal of corporate Learning and Development (L&D) is to build scalable tools that facilitate mastery of hard, people (aka soft), and technical skills. Mastery is most effectively achieved through experiential learning and repetition that engages multiple learning systems in the brain in synchrony and facilitates the development of situational awareness.

Top Takeaway: Compared to traditional learning tools, extended reality (xR) technologies, such as virtual and augmented reality, speed the development of mastery and expertise through repeated experiential learning that broadly engages multiple learning systems in the brain in synchrony and is scalable.

Overview

“Learning is an experience. Everything else is just information” – Albert Einstein

This is a powerful quote from Albert Einstein and is supported by learning science—the marriage of psychology and brain science. As I elaborate below, experiential learning is effective because it engages multiple learning systems in the brain in synchrony.

Taken a step further, if one can obtain multiple related, but distinct experiences, then one can begin to develop mastery, expertise, and broad-based situational awareness. It is one thing to have knowledge and behavioral skills, but it is another to be able to apply that knowledge and behavior under time or social pressure, when you are well or poorly rested, or with a team that is new or familiar to you.

When attempting to master hard skills, such as the ability to identify the signs of harassment in the workplace, it is important to experience multiple signs of harassment and from different points of view, such as that of the harasser, the target of the harassment, or a bystander.

When attempting to master people skills, such as the verbal and non-verbal communication skills needed to be an effective leader, it is important to gain experience with multiple verbal and non-verbal skills and from different points of view, such as that of the manager or employee, an interview or performance review setting, or a large contentious team meeting.

When attempting to master a technical skill, such as the ability to maintain and run a large piece of equipment, it is important to gain experience with multiple aspects of the equipment and under different scenarios, such as routine maintenance, emergency maintenance under time pressure, or a situation in which you are training or supervising a new technician.

In each of these examples, multiple related, but distinct experiences provide not only the opportunity for learning, but also for developing situational awareness, a hallmark of mastery and expertise.

The need for experience-based learning and repetition to build mastery and expertise provides the foundational principles for the effectiveness of extended reality (xR) technologies in Learning and Development (L&D) in general, and corporate L&D in particular. xR technologies include virtual reality (VR) in which the learner is immersed in a completely new virtual environment and augmented reality (AR) in which the learner is in a combined real and virtual environment where digital information is overlaid onto the learner’s field of view. These technologies are grounded in experience-based learning and offer repeatable, broad-based practice that builds situational awareness and is scalable.

This report focuses on a learning science evaluation of the potential for xR technologies to disrupt corporate L&D. xR technologies have the potential to improve the quality and quantity of training, and to speed learning and enhance retention in all aspects of corporate learning. This follows because xR technologies broadly engage multiple learning systems in the brain in synchrony, especially experiential learning systems, and allow the training to be repeated many times to enhance situational awareness and facilitate the development of mastery and expertise.

The Importance of Training in the Corporate Sector

The pace of change in the corporate sector is such that high-quality training is a necessity. Employees must constantly obtain new hard skills, whether learning rules and regulations in the workplace, definitions of appropriate and inappropriate behavior, or how to interpret data science applications. Employees must continually acquire and refine their people (aka soft) skills to be more effective communicators, collaborators, and leaders. Whether it is growing concerns with automation, the extensive data suggesting that diversity increases revenue and workplace harmony, the #metoo movement, or more likely the combination of all three, employees and management must acknowledge the need for effective people skills training in organizations large and small.

Employees are constantly gaining new behavioral and technical skills, such as learning new digital technologies, upskilling on an existing piece of software, or learning to use and maintain a piece of equipment. With each of these classes of skills–hard, people, and technical–employees must not only become proficient; the goal is mastery and expertise. Mastery and expertise lead to situational awareness: the ability to perform effectively under any condition, whether routine or non-routine involving stress, pressure, or anxiety, and the ability to anticipate the future. One can have a catalog of facts and a repertoire of behaviors, but in the end, one has to extract the appropriate information and engage the appropriate behavior in each distinct situation. Experience and repetition build the hard, people, and technical skills that situational awareness requires.

In Part 2, we’ll explore the brain science in greater detail and go over four distinct learning systems that affect learning.

If you enjoyed this blog, please share it on your social networks! And if you would like to learn more about how to use this information to support your corporate learning efforts, please email us at research@amalgaminsights.com to speak with me.

Data Science and Machine Learning News Roundup, February 2019

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Amazon, Anaconda, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, Elastic, Google, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta, TROVE.

Four Key Announcements from H2O World in San Francisco

At H2O World in San Francisco, H2O.ai made several important announcements. Partnerships with Alteryx, Kx, and Intel will extend Driverless AI’s accessibility, capabilities, and speed, while improvements to Driverless AI, H2O, Sparkling Water, and AutoML focused on expanding support for more algorithms and heavier workloads. Amalgam Insights covered H2O.ai’s H2O World announcements.

IBM Watson Now Available Anywhere

At IBM Think in San Francisco, IBM announced the expansion of Watson’s availability “anywhere” – on-prem and in any cloud configuration, whether private or public, single or multi-cloud. Data no longer has to be hosted on the IBM Cloud to use Watson on it – instead, a connector from IBM Cloud Private for Data permits organizations to bring various Watson services to data that cannot be moved for privacy and security reasons. Update: Amalgam Insights now has a more in-depth evaluation of IBM Watson Anywhere.

Databricks’ $250 Million Funding Supports Explosive Growth and Global Demand for Unified Analytics; Brings Valuation to $2.75 Billion

Databricks has raised $250M in a Series E funding round, bringing its total funding to just shy of $500M. The funding round raises Databricks’ valuation to $2.75B in advance of a possible IPO. Microsoft joins this funding round, reflecting continuing commitment to the Azure Databricks collaboration between the two companies. This continued increase in valuation and financial commitment demonstrates that funders are satisfied with Databricks’ vision and execution.

Big Changes in the Cloud Data Migration Market: Attunity and Alooma Get Acquired

Mid-February (Feb. 17 – 23) was a hot week for data and cloud migration companies with two big acquisitions.

Google announced on Tuesday, Feb. 19 the acquisition of Alooma to assist with cloud data migration issues. The purchase aligns well with Google’s 2018 acquisition of Velostrata, which supports cloud workload migration, and reflects Google’s continued interest in acquiring technology solutions from the Israeli cloud tech ecosystem. It also shows that Google is pushing for enterprise cloud data and workloads and will be better positioned to migrate storage and compute from other clouds, such as Amazon Web Services and Microsoft Azure. As Google Cloud Platform continues to grow, this acquisition better positions Google to

  • become a primary solution for more enterprises, as Google is now better able to acquire structured data en masse from public and private competitors
  • be a credible backup or business continuity solution for enterprises that may want a multi-cloud or hybrid cloud solution
  • be a competitive provider that helps enterprises reduce costs, either by serving as a foil in contract negotiations or by optimizing costs in areas where Google resources and services end up being either more cost-effective or easier to manage than similar services from other cloud vendors

This acquisition allows enterprises treating the cloud as a primary or significant compute and storage tool to consider Google Cloud Platform to replace, replicate, and/or back up existing cloud environments. It is also a step forward for Google Cloud Platform in an Infrastructure-as-a-Service market that Amalgam Insights estimates is currently growing at over 50% year-over-year, driven by AWS growing 45%, Microsoft growing 76%, Google Cloud Platform’s assumed growth of 30-40%, and Alibaba Cloud’s growth of 84% based on the last quarter. The last of these figures shows Alibaba’s potential to leapfrog Google Cloud Platform in the near future even though it has not significantly expanded past the Chinese market.
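As a back-of-envelope check on how those per-vendor rates blend into an over-50% market figure, the sketch below weights each growth rate by an assumed revenue base. The growth rates come from the paragraph above; the revenue bases are hypothetical assumptions purely for illustration.

```python
# Back-of-envelope blended IaaS growth. YoY growth rates are from the
# text above; the revenue bases ($B) are hypothetical assumptions.
providers = {
    # name: (assumed trailing-year revenue in $B, YoY growth rate)
    "AWS":     (26.0, 0.45),
    "Azure":   (10.0, 0.76),
    "GCP":     ( 4.0, 0.35),  # midpoint of the 30-40% assumption
    "Alibaba": ( 2.5, 0.84),
}

base = sum(rev for rev, _ in providers.values())
grown = sum(rev * (1 + g) for rev, g in providers.values())
blended = grown / base - 1

# Prints roughly 54% with these assumed bases - consistent with an
# "over 50% year-over-year" market growth estimate.
print(f"Blended YoY growth: {blended:.1%}")
```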

And there’s more!

On Thursday, February 21, Qlik announced its intention to acquire Attunity for $560 million. Attunity is based in the Boston area and has been a market leader in change data capture and data replication with a considerable enterprise and cloud provider customer list. The scale of data transfer that Attunity’s customers require has made it a proven solution for the management, transfer, backup, and recovery of data across on-prem, private cloud, and public cloud environments.

When I first covered Attunity in 2012, the stock was trading at around $5 per share after having survived both the dot-com and 2008 recessions. At the time, the company was repositioning itself as a cloud enabler and I had the opportunity to speak at their 2013 Analyst Day. At that time, I told the investing crowd that Attunity was aligned to a massive cloud opportunity because of its unique role in data replication and in supporting Cloud-based Big Data.

In 2018, Attunity’s stock finally reaped the benefits of this approach as the stock tripled based on rapidly growing customer counts and revenue driven by the need to manage data across multi-region cloud environments, multiple cloud vendors, and hybrid cloud environments. In light of this rapid growth, it is no surprise that Attunity was a strong acquisition target in early 2019’s cash-rich, data-rich, and cloud-dependent world. Looking at Attunity’s income statements, it is easy to see why Qlik made this acquisition from a pure financial perspective as Attunity has crossed the line into profitability and developed a scalable and projectable business that now needs additional sales and marketing resources to fully execute.

Amalgam Insights believes that Attunity provides Qlik with a strong data partner to go with last year’s acquisition of Podium Data (also a Boston-area startup) as a data catalog. With this acquisition, Qlik continues to build itself out as a broad enterprise data solution post-Thoma Bravo acquisition.

This acquisition puts Qlik users in the comfortable position of having a next-generation data ecosystem to support their move to the cloud and a broad range of data sources, formats, and use cases. Qlik is taking a step forward to support mature enterprise needs at a time when a number of its Business Intelligence competitors are focusing on incremental changes in usability, data preparation, or performance.

Amalgam Insights sees the acquisition of Attunity as a competitive advantage for Qlik in net-new deals and believes that it gives companies considering investments in cloud data or broad cloud migrations a reason to immediately add Qlik to the list of vendors in the enterprise toolkit needed to fully manage these projects.

The big picture for enterprises is that cloud data migration is a core capability to support BCDR (Business Continuity and Disaster Recovery), hybrid cloud, and multi-cloud environments. Google Cloud Platform is now more enterprise-ready and competitive with its larger competitors, Amazon Web Services and Microsoft Azure, at a time when Thomas Kurian is taking the reins. Qlik is establishing itself as a powerful Big Data and Cloud Data vendor at a time when enterprise data volumes continue to multiply year after year. The enterprise data world is changing quickly and both Google and Qlik made moves to respond to burgeoning market demand.

Tom Petrocelli Clarifies How Cloud Foundry and Kubernetes Provide Different Paths to Microservices

DevOps Research Fellow Tom Petrocelli has just published a new report describing the roles that Cloud Foundry Application Runtime and Kubernetes play in supporting microservices. This report explores when each solution is appropriate and provides a set of vendors that provide resources and solutions to support the development of these open source projects.

Organizations and Vendors mentioned include: Cloud Foundry Foundation, Cloud Native Computing Foundation, Pivotal, IBM, SUSE, Atos, Red Hat, Canonical, Rancher, Mesosphere, Heptio, Google, Amazon, Oracle, and Microsoft.

To download this report, which has been made available at no cost until the end of February, go to https://www.amalgaminsights.com/product/analyst-insight-cloud-foundry-and-kubernetes-different-paths-to-microservices

Todd Maddox Publishes Brain Science Analysis of Augmented Reality for Product Lifecycle Management

Today, Todd Maddox published the report “Why Augmented Reality is Effective in Product Lifecycle Management: A Brain Science Analysis.” This report provides best practices for implementing augmented reality across product development, supply chain management, equipment operation, troubleshooting, and field service.

Recommendations are based on Maddox’ research, which has been cited over 10,000 times by his academic peers.

Based on the generosity of our clients, this report is available at no cost through the end of February. To download it, please go to https://www.amalgaminsights.com/product/analyst-insight-why-augmented-reality-is-effective-in-product-lifecycle-management.

Four Key Announcements from H2O World San Francisco

Last week at H2O World San Francisco, H2O.ai announced a number of improvements to Driverless AI, H2O, Sparkling Water, and AutoML, as well as several new partnerships for Driverless AI. The product updates are incremental improvements across the platform, while the partnerships reflect H2O.ai expanding their audience and capabilities. This piece is intended to provide guidance to data analysts, data scientists, and analytics professionals working to include machine learning in their workflows.

Announcements

H2O.ai has integrated H2O Driverless AI with Alteryx Designer; the connector is available for download in the Alteryx Analytics Gallery. This will permit Alteryx users to implement more advanced and automatic machine learning algorithms in their analytic workflows in Designer, as well as to perform automatic feature engineering for their machine learning models. In addition, Driverless AI models can be deployed to Alteryx Promote for model management and monitoring, reducing time to deployment. Both of these new capabilities give Alteryx-using business analysts and citizen data scientists more direct and expanded access to machine learning via H2O.ai.

H2O.ai is integrating Kx’s time-series database, kdb+, into Driverless AI. This will extend Driverless AI’s ability to process large datasets, resulting in faster identification of more performant predictive capabilities and machine learning models. Kx users will be able to perform feature engineering for machine learning models on their time series datasets within Driverless AI, and create time-series specific queries.

H2O.ai also announced a collaboration with Intel that will focus on accelerating H2O.ai technology on Intel platforms globally, including the Intel Xeon Scalable processor and H2O.ai’s implementation of XGBoost. Accelerating H2O on Intel will help establish Intel’s credibility in machine learning and artificial intelligence for heavy compute loads. Other aspects of this collaboration will include expanding the reach of data science and machine learning by supporting efforts to integrate AI into analytics workflows and using Intel’s AI Academy to teach relevant skills. The details of the technical projects will remain under wraps until spring.

Finally, H2O.ai announced numerous improvements to both Driverless AI and their open-source H2O, Sparkling Water, and AutoML offerings, mostly focused on expanding support for more algorithms and heavier workloads across the product suite. Among the improvements that caught my eye was the new ability to thoroughly inspect trees for all of the tree-based algorithms that the open-source H2O platform supports. Given concerns about “black-box” models and the lack of insight into how a given model performs its analysis and why it yields the results it does for any given experiment, providing an API for tree inspection is a practical step towards making the logic behind model performance and output more transparent for at least some machine learning models.
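As a rough illustration of what tree inspection looks like in practice, the sketch below pulls a single tree out of an H2O gradient boosting model via the h2o.tree module. Treat this as a sketch based on the H2O Python client circa early 2019; attribute names and constructor signatures may vary across H2O versions.

```python
# A sketch of H2O's tree-inspection API (h2o.tree.H2OTree); details
# reflect the H2O Python client circa early 2019 and may vary by
# version.
import h2o
from h2o.tree import H2OTree
from h2o.estimators.gbm import H2OGradientBoostingEstimator

h2o.init()

# Train a small GBM on H2O's public iris demo dataset
iris = h2o.import_file(
    "http://h2o-public-test-data.s3.amazonaws.com/smalldata/iris/iris_wheader.csv"
)
gbm = H2OGradientBoostingEstimator(ntrees=5, max_depth=3)
gbm.train(x=iris.columns[:-1], y="class", training_frame=iris)

# Pull the first tree for one response class and inspect its structure
tree = H2OTree(model=gbm, tree_number=0, tree_class="Iris-setosa")
print(tree.features)     # split feature at each node
print(tree.thresholds)   # numeric split threshold at each node
print(tree.predictions)  # prediction held at each node
```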

Recommendations

Alteryx users seeking to implement machine learning models in analytic workflows should take advantage of this increased access to H2O Driverless AI. Providing more machine learning capabilities to business analysts and citizen data scientists enhances their data analytics workflows; Driverless AI’s existing AutoDoc capability will be particularly useful for ensuring Alteryx users understand the results of the more advanced techniques they now have access to.

If your organization collects time-series data but has not yet pursued machine learning analytics on that data, consider trialing Kx’s kdb+ and H2O’s Driverless AI. With this integration, Driverless AI will be able to quickly and automatically process time-series data stored in kdb+, allowing swift identification of performant models and predictive capabilities.

If your organization is considering significant investments in heavy-duty computing assets for heavy machine learning loads in the medium-term future, keep an eye on the work Intel will be doing to design chips for specific types of machine learning workloads. NVIDIA has its GPUs and Google its TPUs; by partnering with H2O.ai, Intel is declaring its intention to remain relevant in this market.

If your organization is concerned about the effects of “black box” machine learning models, the ability to inspect tree-based models in H2O, along with the AutoDoc functionality in Driverless AI, is starting to make the logic behind machine learning models in H2O more transparent. This new ability to inspect tree-based algorithms is a key step towards more thorough governance surrounding the results of machine learning endeavors.

Leveraging Psychology and Brain Science to Optimize Retention and Behavior Change

Amalgam Insights’ Learning Science Research Fellow Todd Maddox has recently published an Analyst Insight focused on exploring how psychology and brain science can inform learning practitioners and provide tools that optimize information retention and behavior change. The workplace is changing rapidly and the modern employee needs continuous learning of hard skills, people (aka soft) skills and situational awareness. Neuroscience reveals that each of these skill sets is mediated by a distinct learning system in the brain, each of which has its own unique operating characteristics. The modern employee expects learning in the flow of work, available 24/7 on any device, with engaging content and experience.

Maddox’ key finding was that Qstream’s mobile microlearning solution meets these challenges by delivering content in a way that engages the cognitive skills learning system in the brain during hard skills training, the behavioral skills learning system in the brain during people skills training, and the emotional skills learning system in the brain during situational awareness training. The user experience engages employees through scenario-based challenges that stimulate critical thinking, provides real-time feedback, explains answers, supports personalized coaching, and delivers learning in minutes per day.

For a complimentary copy of the complete report, visit the Qstream website at: https://info.qstream.com/leveraging-learning-science-how-qstreams-mobile-microlearning-solution-changes-behavior.

Data Science and Machine Learning News Roundup, January 2019

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Amazon, Anaconda, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, Elastic, Google, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta, TROVE.

Cloudera and Hortonworks Complete Planned Merger

In early January, Cloudera and Hortonworks completed their planned merger. With this, Cloudera becomes the default machine learning ecosystem for Hadoop-based data, while providing Hortonworks customers an easy pathway to expand into machine learning and analytics capabilities.

Study: 89 Percent of Finance Teams Yet to Embrace Artificial Intelligence

A study conducted by the Association of International Certified Professional Accountants (AICPA) and Oracle revealed that 89% of organizations have not deployed AI to their finance groups. Although a correlation exists between companies with revenue growth and companies that are using AI, the key takeaway is that artificial intelligence is still in the early adopter phase for most organizations.

Gartner Magic Quadrant for Data Science and Machine Learning Platforms

In late January, Gartner released its Magic Quadrant for Data Science and Machine Learning Platforms. New to the Data Science and Machine Learning MQ this year are both DataRobot and Google – two machine learning offerings with completely different audiences and scope. DataRobot offers an automated machine learning service targeted towards “citizen data scientists,” while Google’s machine learning tools, though part of Google Cloud Platform, are more of a DIY data pipeline targeted towards developers. By contrast, I find it curious that Amazon’s SageMaker machine learning platform – and its own collection of task-specific machine learning tools, despite their similarity to Google’s – failed to make the quadrant, given this quadrant’s large umbrella.

While data science and machine learning are still emerging markets, the contrasting demands that citizen data scientists and cutting-edge developers make of these technologies warrant splitting the next Data Science and Machine Learning Magic Quadrant into separate reports targeted to the considerations of each audience. In particular, the continued growth of automated machine learning technologies will likely drive such a split, as citizen data scientists pursue “good enough” solutions that provide quick results.