Dual Learning Systems in the Brain: Implications for Corporate Training

Effective training is critical in all business sectors. In 2017, over $360 billion was spent on training worldwide, with over $160 billion spent in the U.S. alone. Given the ever-changing nature of the corporate landscape, as new technologies are introduced (e.g., AI) or upgraded (e.g., constant software updates), and as new challenges arise (e.g., sexual harassment in the workplace), corporate training must evolve to meet the growing need.

Corporate training can be loosely classified into two categories: hard skills and soft skills. An extensive body of behavioral and neuroscientific research (much of it from the author’s laboratory) suggests that the human brain contains at least two distinct systems that are recruited during learning. One, the cognitive skills learning system, is optimized for hard skills learning. The other, the behavioral skills learning system, is optimized for soft skills learning.

Cognitive Skills Learning System

The cognitive skills learning system in the brain comprises the prefrontal cortex and the medial temporal lobes. This system is optimally tuned to learn hard skills such as mastering new software, learning a company’s rules and regulations, or memorizing the steps required to complete a task. The neural architecture of this system determines the set of training procedures that optimize learning. The scientifically-validated best practices for optimized hard skills learning are many and will be outlined in subsequent articles, but suffice it to say that learning in this system is passive: it involves observing, learning by watching, and mentally repeating information. For example, suppose you are learning a new CRM. You might read the manual, watch a few slide shows, or watch a video of someone performing specific functions within the CRM. You might study this information multiple times and practice the tasks in your head. Eventually, you will launch the CRM software to try out some of the things that you learned.

Behavioral Skills Learning System

The behavioral skills learning system in the brain resides in the basal ganglia. This system is optimally tuned to learn soft skills (also called people skills, 21st-century skills, or socio-emotional skills). Soft skills include showing empathy, embracing diversity, and minimizing unconscious biases. These are all reflected in one’s behavior, such as active listening, making eye contact, praising employees and co-workers when appropriate, avoiding overt punishment, and showing respect. Soft skills are relevant in all aspects of the corporate world including management, collaborative communication, and customer service, to name a few. Soft skills received significant attention in 2017 and likely will receive even more in 2018 as corporations work to reduce the incidence of sexual harassment. I have written extensively on this topic.

Like the cognitive skills learning system, the neural architecture of the behavioral skills learning system determines the set of training procedures that optimize learning. The scientifically-validated best practices for optimized soft skills learning are many and will be outlined in subsequent articles, but suffice it to say that learning in this system is active: it involves learning by doing and physical repetition.

Without going into the detailed neuroanatomy, soft skill learning relies critically on interactivity in the form of real-time immediate corrective feedback. You generate a behavior and receive feedback (literally within a few hundred milliseconds, no more). If the behavior is rewarded with a smile or nod, then that behavior will be more likely to occur next time you are in the same situation. If the behavior is punished with a frown or head shake, then that behavior will be less likely to occur next time you are in the same situation. This interactive back-and-forth in real-time is what leads to behavior change.
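The feedback loop described above can be illustrated with a toy model. This is purely an illustrative sketch, not the author’s model: the `update` function, the 0.2 learning rate, and the feedback sequence are all hypothetical.

```python
# Toy model of the feedback loop described above: reward nudges a
# behavior's probability upward, punishment nudges it downward.
# All numbers here are hypothetical illustrations.

def update(prob, feedback, rate=0.2):
    """Move behavior probability toward 1 on reward, toward 0 on punishment."""
    target = 1.0 if feedback == "reward" else 0.0
    return prob + rate * (target - prob)

# Start at chance and apply a short sequence of immediate feedback.
p = 0.5
for fb in ["reward", "reward", "punish", "reward"]:
    p = update(p, fb)
print(round(p, 3))  # behavior is now somewhat more likely than chance
```

The key point the sketch captures is directionality: each rewarded trial makes the behavior more likely in the same situation, each punished trial less likely, and change accumulates only through repeated interactive trials.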

Optimal Delivery System for Hard and Soft Skills Training

With a few exceptions, the most common delivery system for corporate training is computer-based. Whether for hard or soft skills, the training content generally comes in the form of written text, slide shows, or perhaps video. Notice that all of these content media involve passive observation on the part of the learner. This suggests a strong bias in corporate training toward optimized hard skills training, but sub-optimal soft skills training. Given the growing recognition of the importance of soft skills training in the workplace (e.g., the #MeToo phenomenon), this is unacceptable. Corporate training must expand to include interactivity of the form outlined above in order to optimize soft skills training.

Most Tasks Require a Mixture of Cognitive and Behavioral Skills Learning

Although I have described hard skill learning as being mediated by the cognitive skills learning system in the brain, and soft skill learning as being mediated by the behavioral skills learning system in the brain, the reality is that nearly all tasks involve a mixture of cognitive and behavioral skills. For example, learning a new CRM might begin with reading the manual, watching some slide shows, or watching videos of someone performing specific functions within the CRM, but ultimately you need to use your mouse, touchpad, tablet, or arrow keys to navigate the CRM and the keyboard to enter data. These are all behaviors, and with enough practice these behaviors will become habits through behavioral skills learning in the brain. Analogously, when training to behave as an effective manager, it makes sense to begin by reading descriptions of appropriate and inappropriate leadership behaviors, and even passively observing video interactions that portray an array of appropriate and inappropriate behaviors. This will set the stage, and likely facilitate the subsequent training targeted directly at increasing effective leadership behaviors and decreasing ineffective leadership behaviors.

A number of vendors offer corporate training, including Salesforce, SAP, Cornerstone, Saba, Skillsoft, PageUp, Halogen, PeopleFluent, Talentsoft, Haufe, Oracle, SilkRoad, Deltek, IBM, Lumesse, Cegid, and many more. All of these vendors offer hard and soft skills training within a single platform. In most cases, the same training procedures are used for hard and soft skills training, with only the content changing. As outlined above, this approach is disadvantageous for soft skills training.

Hyoun Park Discusses Cloud Pricing on CIO.com

Money Bubbles in the Clouds

On CIO.com, analyst Hyoun Park discusses recent cloud pricing changes by Oracle, Amazon, and Google in the context of understanding who is actually providing the cheapest cloud. In this blog, Park posits that Oracle’s new Universal Credits for IaaS and PaaS usage are fundamentally different from traditional cloud pricing models and shows that the enterprise cloud is coming of age.

One of Park’s assertions is that the most granular pricing may not be the cheapest because the complexity of detailed pricing prevents companies from optimizing their costs. Will this trend affect your cloud costs?

To learn more, click through to CIO.com and read this article: “Is the cheapest cloud pricing flexible or granular?”

Also, join Hyoun’s webinar to learn more about managing cloud costs on BrightTALK: Cloud Service Management: Managing Cost, Resources, and Security

W. Todd Maddox Speaks On Virtual Reality and Sexual Harassment Training in Forbes

Recently, Amalgam Analyst W. Todd Maddox was interviewed by Forbes for his innovative take on the potential use of virtual reality in improving training for sexual harassment.

In speaking with CBS News analyst Larry Magid, Maddox points out the experiential and emotional gap in current soft skills and sexual harassment training modules and why unconscious bias is not being tested by traditional training methods.

To learn more and hear Todd’s interview, please click here to visit Forbes.com.

Tom Petrocelli Introduces NoOps on InformationWeek


In case you missed it, at re:Invent Amazon launched a mind-numbing number of new services, including a managed Kubernetes service, more AWS Lambda extensions, Aurora Serverless, the AWS Serverless Application Repository, and Amazon SageMaker.

Based on this, Amalgam Analyst Tom Petrocelli recently contributed a thought-provoking article on InformationWeek about how Amazon Web Services is working on killing off IT Ops and bringing in a new age of “NoOps.” For IT Ops, Winter is definitely coming.

Do you agree or disagree? Take a look at Tom’s POV and let us know what you think.

Click here to read Tom’s article: AWS Ignites Debate About the Death of IT Ops

What’s On Tap for 2018 from Tom Petrocelli

Tom Petrocelli, Amalgam Insights Contributing Analyst

As the year comes to a close, I have had the opportunity to reflect on what has transpired in 2017 and look ahead to 2018. Some of my recent thoughts on 2017 have been published in:

These articles provide a peek ahead at emerging 2018 trends.

In the two areas I cover, collaboration and DevOps/Developer Trends, I plan to continue to look at:
The continued transformation of the collaboration market. I am expecting a “mass extinction event” of products in this space. That doesn’t mean the collaboration market will evaporate. Instead, I am looking for niche products that address specific collaboration segments to thrive while a handful of large collaboration players consume the general market.
The emergence of NoOps, for No Operations, in the mid-market. The Amazon push to serverless products is a bellwether of the upcoming move toward cloud vendor operations supplanting company IT sysops.
2018 will be the year of the container. Containers have been growing in popularity over the past several years, but 2018 will be the year when they become truly mass market. The growth in the ecosystem, especially the widespread availability of cloud Kubernetes services, will make containers more palatable to a wider market.
Integrated DevOps pipelines will make DevOps more efficient… if we can get the politics out of IT.
Machine learning will continue to be integrated into developer tools, which, in turn, will make more complex coding and deployment jobs easier.

As you know, I joined Amalgam Insights in September. Amalgam Insights, or AI, is a full-service market analyst firm. I’d welcome the opportunity to learn more about what 2018 holds for you. Perhaps we can schedule a quick call in the next couple of weeks. Let me know what works best for you. As always, if I can provide any additional information about AI, I’d be happy to do so!

Thanks, and have a happy holiday season.

For more predictions on IT management at scale, check out Todd Maddox’s 5 Predictions That Will Transform Corporate Training.

Amazon SageMaker: A Key to Accelerating Enterprise Machine Learning Adoption

On November 29th, Amazon Web Services announced SageMaker, a managed machine learning service that handles the authoring, model training, and hosting of algorithms and frameworks. These capabilities can be used individually or as an end-to-end production pipeline.

SageMaker is currently available with a Free tier providing 250 hours of t2.medium notebook usage, 50 hours of m4.xlarge training usage, and 125 hours of m4.xlarge hosting usage for the first two months. After two months, or for additional hours, the service is billed per instance, per GB of storage, and per GB of data transfer.
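To make the tier math concrete, here is a minimal cost sketch. The free-tier hour counts come from the announcement above; the per-hour rates in the example are hypothetical placeholders, not published AWS prices.

```python
# Sketch: estimate SageMaker spend beyond the free tier.
# Free-tier hours are from the announcement; the rates below are
# HYPOTHETICAL illustrations, not actual AWS pricing.

FREE_TIER = {"notebook": 250, "training": 50, "hosting": 125}  # hours

def billable_cost(usage_hours, rates, free_tier=FREE_TIER):
    """Sum the cost of hours that exceed each service's free-tier allotment."""
    total = 0.0
    for service, hours in usage_hours.items():
        overage = max(0, hours - free_tier.get(service, 0))
        total += overage * rates[service]
    return total

# Example with hypothetical USD/hour rates.
rates = {"notebook": 0.05, "training": 0.28, "hosting": 0.28}
print(billable_cost({"notebook": 300, "training": 40, "hosting": 200}, rates))
```

Only the 50 notebook hours and 75 hosting hours above the allotments are billed in this example; the training usage stays inside the free tier. Storage and data-transfer charges would be added on top under the same per-GB logic.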

Amalgam Insights will be watching the adoption of SageMaker closely, as it solves several basic problems in machine learning.

Continue reading “Amazon SageMaker: A Key to Accelerating Enterprise Machine Learning Adoption”

Amazon Aurora Serverless vs. Oracle Autonomous Database: A Microcosm for The Future of IT

On November 29th, Amazon Web Services made a variety of interesting database announcements at Amazon re:Invent: Amazon Neptune, DynamoDB enhancements, and Aurora Serverless. Amalgam found both the Neptune and DynamoDB announcements to be valuable, but believes Aurora Serverless was the most interesting of these events, both for its direct competition with Oracle and for its embodiment of a key transitional challenge that all enterprise IT organizations face.

Amazon Neptune is a managed graph database service that Amalgam believes will be important for analyzing relationships, networked environments, process and anomaly charting, pattern sequencing, and random walks (such as solving the classic “traveling salesman” problem). Amazon Neptune is currently in limited preview with no scheduled date for production. Over time, Amalgam expects that Neptune will be an important enhancer for Amazon Kinesis’ streaming data, IoT Platform, Data Pipeline, and EMR (Elastic MapReduce) as graph databases are well-suited to find the context and value hiding in large volumes of related data.

For the DynamoDB NoSQL database service, Amazon announced two new capabilities. The first is global tables, which are automatically replicated across multiple AWS regions and will be helpful for global support of production applications. The second is on-demand backups for DynamoDB tables that do not impact availability or speed. With these announcements, DynamoDB comes closer to being a dependable and consistently governed global solution for unstructured and semistructured data.

But the real attention-getter was the announcement of Aurora Serverless, an upcoming relational database offering that will allow end users to pay for database usage and access on a per-second basis. This change is made possible by Amazon’s existing Aurora architecture, which separates storage from compute on a functional basis. This capability will be extremely valuable in supporting highly variable workloads.

How much will Aurora Serverless affect the world of relational databases?

Taking a step back, the majority of business data value is still created by relational data. Relational data is the basis of the vast majority of enterprise applications, the source for business intelligence and business analytics efforts, and the standard format that enterprise employees understand best for creating data. For the next decade, relational data will still be the most valuable form of data in the enterprise and the fight for relational data support will be vital in driving the future of machine learning, artificial intelligence, and digital user experience. To understand where the future of relational data is going, we have to first look at Oracle, who still owns 40+% of the relational database market and is laser-focused on business execution.

In early October, Oracle announced the “Autonomous Database Cloud,” based on Database 18c. The Autonomous Database Cloud was presented as a solution for managing the tuning, updating, performance optimization, scaling, and recovery tasks that database administrators are typically tasked with, and was scheduled to launch in late 2017. This announcement came with two strong guarantees: 1) a telco-like 99.995% availability guarantee, including scheduled downtime, and 2) a promise to provide the database at half the price of Amazon Redshift based on the processing power of the Oracle database.

In doing so, Oracle is using a combination of capabilities based on existing Oracle tuning, backup, and encryption automation and adding monitoring, failure detection, and automated correction capabilities. All of these functions will be overseen by machine learning designed to maintain and improve performance over time. The end result should be that Oracle Autonomous Database Cloud customers would see an elimination of day-to-day administrative tasks and reduced downtime as the machine learning continues to improve the database environment over time.

IT Divergence In Motion: Oracle vs. Amazon

In providing two very different next-generation database services, Oracle and Amazon have taken divergent paths. These decisions lead to an interesting head-to-head choice for companies seeking enterprise-grade database solutions.

On the one hand, IT organizations that are philosophically seeking to manage IT as a true service have, in Oracle, an automated database option that will remove the need for direct database and maintenance administration. Oracle is removing a variety of traditional corporate controls and replacing them with guaranteed uptime, performance, maintenance, and error reduction. This is an outcome-based approach that is still relatively novel in the IT world.

For those of us who have spent the majority of our careers handling IT at a granular level, it can feel somewhat disconcerting to see many of the manual tuning, upgrading, and security responsibilities being both automated and improved through machine learning. In reality, highly repetitive IT tasks will continue to be automated over time as the transactional IT administration tasks of the 80s and 90s finally come to an end. The Oracle approach is a look toward a future where the goal of database planning is to immediately enact analytic-ready data architecture rather than to coordinate efforts between database structures, infrastructure provisioning, business continuity, security, and networking. Oracle has also answered questions regarding the “scale-out” management of its database by providing this automated management layer with price guarantees.

In this path of database management evolution, database administrators must be architects who focus on how the wide variety of data categories (structured, semi-structured, unstructured, streaming, archived, binary, etc.) will fit into the human need for structure, context, and worldview verification.

On the other hand, Amazon’s approach is fundamentally about customer control at extremely granular levels. Aurora is easy to spin up and gives administrators a great deal of choice between instance size and workload capacity. With the current preview of Amazon Aurora Serverless, admins will have even more control over both storage and processing consumption, using the database endpoint as the starting point for provisioning and production. Amazon targets MySQL compatibility in the first half of 2018, with PostgreSQL to follow later in the year. Billing will occur in Aurora Capacity Units, a combination of storage and memory metered in one-second increments. This granularity of consumption and flexibility of computing will be very helpful in supporting on-demand applications with highly variable or unpredictable usage patterns.
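Per-second metering of capacity units can be sketched as follows. The `serverless_cost` helper, the per-ACU-hour rate, and the workload trace are hypothetical illustrations of the billing model described above, not actual Aurora pricing.

```python
# Sketch of per-second capacity-unit billing for a spiky workload.
# The rate and trace below are HYPOTHETICAL, not published Aurora pricing.

def serverless_cost(acu_seconds, price_per_acu_hour):
    """Convert metered ACU-seconds into a dollar cost."""
    return acu_seconds * price_per_acu_hour / 3600.0

# One hour of second-by-second capacity: mostly idle, one burst at 8 ACUs.
trace = [0] * 3000 + [8] * 600
acu_seconds = sum(trace)  # 4800 ACU-seconds
print(round(serverless_cost(acu_seconds, price_per_acu_hour=0.06), 4))
```

The point of the sketch is the contrast with instance-based pricing: an always-on instance would bill for the full hour at peak capacity, while per-second metering bills only the 600 seconds of actual burst usage, which is exactly what makes highly variable workloads cheap and also what makes spend hard to predict.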

But my 20+ years in technology cost administration also lead me to believe that there is an illusory quality to the control in the cost and management structure that Amazon is providing. There is nothing wrong with providing pricing at an extremely detailed level, but Amalgam already finds that the vast majority of enterprise cloud spend is unmonitored on a month-to-month basis at all but the most cursory levels. (For those of you in IT: who is the accountant or expense manager who cross-checks and optimizes your cloud resources on a monthly basis? Oh, you don’t have one?)

Because of that, we at Amalgam believe that additional granularity is more likely to result in billing disputes or complaints. We will also be interested in understanding the details of compute: there can be significant differences in resource pricing based on reserved instances, geography, timing, security needs, and performance needs. Amazon will need to reconcile these compute costs to prevent this service from being an uncontrolled runaway cost. This is the reality of usage-based technology consumption: decades of telecom, network, mobility, and software asset consumption have all demonstrated the risks of pure usage-based pricing.

Amalgam believes that there is room for both, as Ease-of-Use vs. Granular Management continues to be a key IT struggle in 2018. Oracle represents the DB option for enterprises seeking governance, automation, and strategic scale, while Amazon provides the DB option for enterprises seeking to scale while tightly managing and tracking consumption. The more important issue here is that the Oracle DB vs. Amazon DB announcements represent a microcosm of the future of IT. In one corner is the need to support technology that “just works” with no downtime, no day-to-day administration, and cost reduction driven by performance. In the other corner is the ultimate commoditization of technology, where customers have extremely granular consumption options, can get started at minimal cost, and can scale out with little-to-no management.

Recommendations

1) Choose your IT model: “Just Works” vs. “Granular Control.” The Oracle and Amazon announcements show how both models have valid aspects. But inherent in both is the need to both scale up and scale out to fit business needs.

2) For “Just Works” organizations, actively evaluate machine learning and automation-driven solutions that reduce or eliminate day-to-day administration. For these organizations, IT no longer represents the management of technology, but the ability to supply solutions that increase in value over time. 2018 is going to be a big year in terms of adding new levels of automation in your organizations.

3) For “Granular Control” organizations, define the technology components that are key drivers or pre-requisites to business success and analyze them extremely closely. In these organizations, IT must be both analytics-savvy and maintain constant vigilance in an ever-changing world. If IT is part of your company’s secret sauce and a fundamental key to differentiated execution, you now have more tools to focus on exactly how, when, and where inflection points take place for company growth, change, or decline.

Why Did TIBCO Acquire Alpine Data?

On November 15, 2017, TIBCO announced the acquisition of Alpine Data, a data science platform long known for its goals of democratizing data science and simplifying access to data, analytic workflows, parallel compute, and tools.

With this acquisition, TIBCO makes its second major foray into the machine learning space, after its June 5th acquisition of Statistica. In doing so, TIBCO has significantly upgraded its machine learning support capabilities, which will be especially useful as TIBCO continues to position itself as a full-range data and analytics solution.

When this acquisition occurred, Amalgam received questions on how Alpine Data and Statistica would be expected to work together and how Alpine Data would fit into TIBCO’s existing machine learning and analytics portfolio. Amalgam has provided favorable recommendations for both Alpine Data and Statistica in 2017 and plans to continue providing a positive recommendation for both solutions, but sought to explore the nuances of these recommendations.

In our Market Milestone, we explore why Alpine Data was a lower-ranked machine learning solution in analyst landscapes despite being early-to-market in providing strong collaborative capabilities and supporting a wide variety of data sources. We also wanted to explore the extent to which Alpine Data posed any conflict for existing TIBCO customers. Finally, we also wanted to provide guidance on how TIBCO’s acquisition would potentially change Alpine Data’s positioning and capabilities.

To read Amalgam Insights’ view and recommendations regarding this report, use the following link to acquire this report.

How Does myTrailhead Excel and How Can Science Make myTrailhead Even Better

Salesforce’s Trailhead branding is both on-point and adorable.

Key Takeaways:

  • myTrailhead allows customized training content and incorporates useful motivational and performance testing tools.
  • myTrailhead could be enhanced by incorporating scientifically-validated best practices in training, which suggest that hard skills are best trained by a cognitive skill learning system in the brain and soft skills by a behavioral skill learning system in the brain.
  • In its current implementation, myTrailhead is well optimized for hard skill training but sub-optimal for soft skills training.

Technology is progressing at an accelerating rate. Jobs are constantly being updated or redefined by, and with the help of, technology. Employees are constantly being asked to learn new skills, whether in the same job or in a new position. Constant training is the rule, not the exception, and training platforms must be built with this in mind.
Continue reading “How Does myTrailhead Excel and How Can Science Make myTrailhead Even Better”