Cloud Vendors Race to Release Continuous Integration and Continuous Deployment Tools

Tom Petrocelli, Amalgam Insights Research Fellow

Development organizations continue to feel increasing pressure to produce better code more quickly. To support that faster-better philosophy, a number of methodologies have emerged that help organizations quickly merge individual code, test it, and deploy it to production. While DevOps is actually a management methodology, it is predicated on an integrated pipeline that moves code smoothly from development to production deployment. To achieve these goals, companies have adopted continuous integration and continuous deployment (CI/CD) tool sets. These tools, from companies such as Atlassian and GitLab, help developers merge individual code into the deployable code bases that make up an application and then push them out to test and production environments.
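To make the pipeline idea concrete, the sketch below models a CI/CD pipeline as an ordered series of stages where any failing stage stops code from reaching production. It is an illustrative toy only; the stage names and checks are hypothetical and do not represent any vendor's actual tool or API.

```python
# Illustrative sketch of a CI/CD pipeline: an ordered series of stages,
# where any failing stage halts the push to production.
# Stage names and checks are hypothetical, not any vendor's API.

def run_pipeline(stages):
    """Run each (name, check) stage in order; stop at the first failure."""
    for name, check in stages:
        if not check():
            return f"pipeline failed at stage: {name}"
    return "deployed to production"

# Hypothetical stages: each check returns True on success.
pipeline = [
    ("merge",  lambda: True),   # merge individual code into the main branch
    ("build",  lambda: True),   # compile and package the application
    ("test",   lambda: False),  # run the automated test suite (fails here)
    ("deploy", lambda: True),   # push to the production environment
]

print(run_pipeline(pipeline))  # the failing test stage blocks deployment
```

The point of the sketch is the gating behavior: deployment only happens if every earlier stage succeeds, which is what makes the pipeline, rather than any single tool, the unit cloud vendors want to own.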

Cloud vendors have lately been releasing their own CI/CD tools to their customers. In some cases, these are extensions of existing tools, such as Microsoft Visual Studio Team Services on Azure. Google's recently announced Cloud Build, as well as AWS CodeDeploy and CodePipeline, are CI/CD tools developed specifically for their cloud environments. Cloud CI/CD tools are rarely all-encompassing and often rely on other open source or commercial products, such as Jenkins or Git, to achieve a full CI/CD pipeline.

These products represent more than just new entries into an increasingly crowded CI/CD market. They are clearly part of a longer-term strategy by cloud service providers to become so integrated into the DevOps pipeline that moving to a new vendor or adopting a multi-cloud strategy would be much more difficult. Many developers start with a single cloud service provider in order to explore cloud computing and deploy their initial applications. Adopting the cloud vendor's CI/CD tools embeds that vendor deeply in the development process. The cloud service provider is no longer sitting at the end of the development pipeline; it is integrated into, and vital to, the development process itself. Even where cloud service provider CI/CD tools support hybrid cloud deployments, they are always designed first for the cloud vendor's own offerings. Google Cloud Build and Microsoft Visual Studio certainly follow this model.

There is danger here for commercial vendors of CI/CD products outside these cloud platforms. They are now competing with native products integrated into the sales and technical environment of the cloud vendor. Purchasing products from a cloud vendor is as easy as buying anything else from the cloud portal, and buyers are immediately aware of the services the cloud vendor offers. No fuss, no muss.

This isn’t a problem for companies committed to a particular cloud service provider. Using native tools designed for the primary environment offers better integration, less work, and ease of use that is hard to achieve with external tools. The cost of these tools is often utility-based and, hence, elastic based on the amount of work product flowing through the pipeline. The trend toward native cloud CI/CD tools also helps explain Microsoft’s purchase of GitHub. GitHub, while cloud agnostic, will be much more powerful when completely integrated into Azure – for Microsoft customers, anyway.

Building tools that strongly embed a particular cloud vendor into the DevOps pipeline is clearly strategic, even if it promotes monoculture. There will be advantages for customers as well as cloud vendors. It remains to be seen whether the advantages to customers outweigh the inevitable vendor lock-in that these CI/CD tools are meant to create.

Data Science Platforms News Roundup, July 2018

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Anaconda, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta.


Domino Deploys SAS Analytics Into a Model-Driven Cloud

The announcement: On July 10, Domino Data Lab announced a partnership with SAS that will let Domino users run SAS Analytics for Containers in the public cloud on AWS, using Domino’s data science platform as the orchestration layer for infrastructure provisioning and management. The partnership will allow SAS customers to use Domino as an orchestration layer to access multiple SAS environments for model building, deploy multiple SAS applications on AWS, track each SAS experiment in detail, and reproduce prior work.

What does this mean?

Domino customers with SAS Analytics workloads currently running on-prem will now be able to deploy those workloads to the public cloud on AWS by using SAS Analytics for Containers via the Domino platform. Domino plans to follow up with support for Microsoft Azure and Google Cloud Platform to further enable enterprises to offload containerized SAS workloads to the cloud. By running SAS Analytics for Containers via Domino, Domino users will be able to track, provide feedback on, and reproduce their containerized SAS experiments the same way they do with other experiments they’ve constructed using Python, R, or other tools within Domino.


Google BigQuery ML Extends the Power of (Some) Modeling to Data Analysts

Last week at Google Next ‘18, Google announced a new beta capability in their BigQuery cloud data warehouse: BigQuery ML, which lets data analysts apply simple machine learning models to data residing in BigQuery data warehouses.

Data analysts know databases and SQL, but generally don’t have much experience building machine learning models in Python or R. Moving data out of storage to run it through machine learning models raises additional problems: expense, lost time, and possible regulatory violations. BigQuery ML aims to address these issues by letting data analysts push data through linear regression models (to predict a numeric value) or binary logistic regression models (to classify a value into one of two categories, such as “high” or “low”), using simple extensions of SQL on Google databases, run in place.
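To illustrate the kind of SQL extension involved, the snippet below assembles a BigQuery ML-style CREATE MODEL statement in Python. The dataset, table, and column names are invented for illustration, and while the general CREATE MODEL / model_type='linear_reg' shape reflects BigQuery ML's documented syntax at launch, Google's current documentation should be treated as the authority.

```python
# Sketch: composing a hypothetical BigQuery ML statement in Python.
# Dataset, table, and column names below are invented; consult Google's
# BigQuery ML documentation for the authoritative, current syntax.

def create_model_sql(dataset, model, source_table, label_col, feature_cols):
    """Build a CREATE MODEL statement for a linear regression model."""
    cols = ", ".join(feature_cols + [f"{label_col} AS label"])
    return (
        f"CREATE MODEL `{dataset}.{model}`\n"
        "OPTIONS(model_type='linear_reg') AS\n"
        f"SELECT {cols} FROM `{dataset}.{source_table}`"
    )

sql = create_model_sql(
    dataset="sales_dw",            # hypothetical dataset name
    model="revenue_model",
    source_table="orders",
    label_col="order_total",       # numeric value to predict
    feature_cols=["region", "channel", "units"],
)
print(sql)
# The statement would then be run in place, e.g. via the BigQuery console
# or a client library, without moving data out of the warehouse.
```

The notable design point is that the whole workflow stays in SQL: the analyst never exports data to a separate modeling environment, which is exactly the cost and compliance problem BigQuery ML targets.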

Amalgam Insights Releases List of 2018’s Best Enablement Solution Vendors for “Building Better Sales Brains”

Optimized sales training requires the combination of three critical aspects: effective sales enablement, people skills, and situational awareness

July 31, 2018 08:30 ET | Source: Amalgam Insights

BOSTON, July 31, 2018 — It turns out that the traits common to the best salespeople aren’t necessarily innate; research indicates that salespeople can be scientifically coached to become top producers. A new report, issued today by Amalgam Insights, evaluates the technology vendors who are leading the industry in optimizing this kind of training.

Todd Maddox, the most-cited researcher in corporate learning and Learning Science Research Fellow at Amalgam Insights, authored this report. This research uniquely identifies learning science approaches to optimizing sales training, which Maddox says requires a combination of effective sales enablement, people skills, and situational awareness.

The companies recognized in this research are (in alphabetical order): Allego, Brainshark, CrossKnowledge, Gameffective, Highspot, JOYai, Lessonly, Mindtickle, myTrailhead, Qstream, Seismic, and Showpad.

“Companies are continually looking for a competitive edge; leveraging technology to develop a top-tier sales force is a key area for improvement,” Maddox says. “If you want sales enablement and training tools that are highly effective, they must be grounded in learning science – the marriage of psychology and brain science. Many sales teams and sales-focused vendors have access to these tools. What is needed is an effective way to leverage these tools to maximum advantage,” he added.

Maddox’s report notes that successful companies work to map sales processes to three distinct learning systems in the brain:

  • The cognitive skills learning system is about knowing facts (the “what”) and is directly linked to sales enablement;
  • The behavioral skills learning system is about behavior (the “how”) and is directly linked to people skills training; and
  • The emotional learning system is about “reading” people and situations (the “feel”) and is directly linked to situational awareness.

“When these three traits are combined, salespeople become far more effective in understanding both the tangible and intangible needs of their customers and can craft and deliver solutions that continually produce solid results for their companies,” Maddox said.

Maddox’s report, “2018’s Best Sales Enablement Solutions for Building Better Sales Brains,” is available for Sales VPs, Directors, and Managers at www.amalgaminsights.com.

About Amalgam Insights:
Amalgam Insights (www.amalgaminsights.com) is a consulting and strategy firm focused on the transformative value of Technology Consumption Management. AI believes that all businesses must fundamentally reimagine their approach to data, design, cognitive augmentation, pricing, and technology usage to remain competitive. AI provides marketing and strategic support to enterprises, vendors, and institutional investors for conducting due diligence in Technology Consumption Management.

For more information:
Hyoun Park
Amalgam Insights
hyoun@amalgaminsights.com
415.754.9686

Steve Friedberg
MMI Communications
steve@mmicomm.com
484.550.2900

Azure Advancements Announced at Microsoft Inspire 2018

Last week, Microsoft Inspire took place, which meant that Microsoft made a lot of new product announcements regarding the Azure cloud. In general, Microsoft is trying to catch up to Amazon from a market share perspective while defending its current #2 place in the Infrastructure as a Service world against the rapidly growing Google Cloud Platform as well as IBM and Oracle. Microsoft Azure is generally regarded, along with Amazon, as a market-leading cloud platform that provides storage, computing, and security and is moving toward analytics, networking, replication, hybrid synchronization, and blockchain support.

Microsoft announced several key new Azure functionalities at the event.

The Learning Science of Situational Awareness and Patient Safety

An Interactive Webinar with Qstream CEO, Rich Lanchantin

On Wednesday, July 17, 2018, Amalgam’s Learning Scientist and Research Fellow Todd Maddox, Ph.D., and Qstream CEO Rich Lanchantin conducted a live, interactive webinar focused on the critically important topic of situational awareness and patient safety. Achieving the highest quality in patient care requires a careful balance between efficiency and situational awareness in order to prevent medical errors. Healthcare professionals must be able to observe the current environment while also keeping the various potential outcomes top of mind in order to avoid unnecessary complications.

In the webinar, we discussed the learning science—the marriage of psychology and brain science—of situational awareness and patient safety and showed how to most effectively help clinicians learn and retain information, which results in long-term behavior change. We focused specifically on the challenges faced in optimally training the “what”, “feel” and “how” learning systems in the brain that mediate situational awareness, and how the Qstream platform effectively recruits each of these learning systems.

A replay of the webinar is available for all interested parties at the following link. Simply click the “Webcasts & Slideshare” button and the webcast is there. You will also see a second webcast that I recorded with Rich Lanchantin focused on “The Psychology of Hard vs. Soft Skills Training”. Enjoy!

If you would be interested in retaining Todd Maddox, the most-cited researcher in corporate learning, for a webinar, speaking engagement, or workshop, please contact us at sales@amalgaminsights.com

The Adoption Gap in Learning and Development: Brandon Hall Hosts A Podcast with Amalgam Insights’ Todd Maddox

On Thursday, July 19, 2018, Brandon Hall Group released a podcast discussion between Amalgam Insights’ Learning Scientist and Research Fellow, Todd Maddox, Ph.D., and Brandon Hall’s COO, Rachel Cooke. The podcast focused on the “adoption gap” in Learning & Development that results when users are presented with a large number of tools and technologies, but little if any guidance on which tool to use when.

Todd and Rachel discuss the importance of leveraging learning science—the marriage of psychology and brain science—to provide best practices for mapping tools onto learning problems in the most effective manner. They also discuss the challenges faced in optimally training hard and people (aka soft) skills.

A replay of the podcast is available for all interested parties at the following link.

If you would be interested in retaining Todd Maddox, the most-cited researcher in corporate learning, for a podcast, webinar, speaking engagement, or workshop, please contact us at sales@amalgaminsights.com

Market Milestone: Informatica and Google Cloud Partner to Open Up Data, Metadata, Processes, and Applications as Managed APIs

[This Research Note was co-written by Hyoun Park and Research Fellow Tom Petrocelli]

Key Stakeholders: Chief Information Officers, Chief Technical Officers, Chief Digital Officers, Data Management Managers, Data Integration Managers, Application Development Managers

Why It Matters: This partnership demonstrates how Informatica’s integration Platform as a Service brings Google Cloud Platform’s Apigee products and Informatica’s machine-learning-driven connectivity into a single solution.

Key Takeaway: This joint Informatica-Google API management solution gives customers a single solution that provides data, process, and application integration as well as API management. As data challenges evolve into workflow and service management challenges, this solution bridges key gaps for data and application managers and demonstrates how Informatica can partner with other vendors as a neutral third party to open up enterprise data and support next-generation data challenges.

The Brain Science View on Why Microlearning Is Misused & Misapplied in Enterprise Learning Environments

Microlearning is taking the Learning and Development world by storm. Although many incorrectly identify microlearning as simply short duration training sessions, leaders in the field define microlearning as an approach to training that focuses on conveying information about a single, specific idea. The goal is to isolate the idea that is to be trained and then to focus all of the training effort on explaining that single idea with engaging and informative content. For example, with respect to sexual harassment, one might watch a brief piece of video content focused on the qualities of an inclusive leader, or ways to identify the symptoms of hostility in the workplace. The information would be presented in an engaging format that stimulates knowledge acquisition in the learner. The microlearning training goal is clear: train one idea succinctly with engaging content, and with as few “extras” as possible.

The Psychological and Brain Science of Microlearning: Training the Hard Skills of People Skills

Learning science—the marriage of psychology and brain science—suggests that microlearning is advantageous for at least two reasons. First, the emphasis on training a single idea as concisely and succinctly as possible increases the likelihood that the learner will remain engaged and attentive during the whole microlearning session. Put another way, the likelihood that the learner’s attention span will be exceeded is low.

Second, because microlearning eliminates ancillary information that is not directly relevant to the target idea, the cognitive machinery (i.e., working memory and executive attention) available to process the information can focus on the idea to be learned, with minimal effort expended on filtering out irrelevant information that could lead the learner astray. All of the learner’s cognitive load is focused on the idea to be trained.

Because microlearning techniques are targeted at working memory, executive attention, and attention span in general, microlearning strongly affects processing in the cognitive skills learning system. The cognitive skills learning system in the brain recruits the prefrontal cortex, a region of cortex directly behind the forehead that mediates the learning of hard skills. These include learning rules and regulations, new software, and critical skills such as math and coding. Hard skill learning requires focused attention and the ability to process and rehearse the information. One learns by reading, watching, and listening, and information is ultimately retained through mental repetitions.

Thus, microlearning is optimal for hard skill training. Microlearning can revolutionize, and appears to be revolutionizing, online eLearning of hard skills.

The Psychological and Brain Science of Microlearning and People Skills Training

I showed in a recent article that online eLearning approaches to corporate training use the same one-size-fits-all delivery platform and procedures when training hard skills and people (aka soft) skills. Although generally effective for hard skills training, especially when tools like microlearning are incorporated, this one-size-fits-all approach is only marginally effective at training people skills, because people skills are ultimately behavioral skills. People skills are about behavior: they are what we do, how we do it, and our intent. These are the skills one needs for effective interpersonal communication and interaction, for showing genuine empathy, for embracing diversity, and for avoiding situations in which unconscious biases drive behavior.

Behavioral skill learning is not mediated by the cognitive skills learning system in the brain, but rather by the behavioral skills learning system. Whereas the cognitive skills learning system recruits the prefrontal cortex and relies critically on working memory and executive attention, the behavioral skills learning system recruits the basal ganglia, a subcortical brain structure that does not rely on working memory and executive attention for learning. Rather, the basal ganglia learn behaviors gradually and incrementally via dopamine-mediated error-correction learning. When the learner generates a behavior that is followed in real time, literally within hundreds of milliseconds, by feedback that rewards the behavior, dopamine is released, and that behavior becomes incrementally more likely to occur the next time the learner is in the same context. Conversely, when the learner generates a behavior that is followed in real time by feedback that punishes the behavior, dopamine is not released, and that behavior becomes incrementally less likely to occur the next time the learner is in the same context.
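The incremental, feedback-driven updating described above can be sketched as a simple reinforcement-learning-style rule. The model below is deliberately simplified for illustration: the 0.1 learning rate and the starting probability are arbitrary values, not claims about actual neural parameters.

```python
# Simplified model of the incremental, feedback-driven learning described
# above: rewarded behaviors become slightly more likely in a given context,
# punished behaviors slightly less likely. The 0.1 learning rate is an
# arbitrary illustrative value, not a neural parameter.

def update(prob, rewarded, rate=0.1):
    """Nudge the probability of repeating a behavior after feedback."""
    if rewarded:
        return prob + rate * (1.0 - prob)  # move incrementally toward 1
    return prob - rate * prob              # move incrementally toward 0

p = 0.5  # initial likelihood of producing the behavior in this context
for _ in range(10):            # ten rewarded repetitions
    p = update(p, rewarded=True)
print(round(p, 3))             # now well above the starting 0.5

for _ in range(10):            # ten punished repetitions
    p = update(p, rewarded=False)
print(round(p, 3))             # pushed back down toward zero
```

The key property the sketch captures is gradualism: no single reward or punishment flips the behavior, which matches why behavioral skills need many physical repetitions rather than one explanation.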

People skills are learned by doing and involve physical repetitions.

Microlearning: The Hard Skills of People Skills

So how effective is microlearning for people skills training? The answer is that microlearning is very effective in the early epochs of people skills training, when the focus is on learning the hard skills of people skills. It is also effective when learning to identify good and bad people skills. For example, you might be learning the definition of empathy, watching a demonstration of unconscious bias, or learning some of the advantages of a diverse workplace. In these cases, microlearning is very useful because you are gaining a cognitive understanding of various aspects of people skills.

When microlearning content is grounded in rich scenario-based training, its effectiveness is enhanced. This is because scenario-based training engages emotional learning centers in the brain that benefit not only hard skills but also people skills learning. Rich scenarios allow learners to “see themselves” in the training, which primes the system for behavior change.

Microlearning: The Behavioral Skills of People Skills

Microlearning approaches are effective for training hard skills and, when supplemented with rich scenarios, for engaging emotion centers. But people skills are ultimately behavioral skills, and the ultimate goal is behavior change. All of the cognitive skills training is in the service of preparing the learner for effective behavior change.

How effective is microlearning for behavioral skills learning and for effective behavior change?

The behavioral science is clear: behavioral skills training is optimized when you train the learner on multiple different behaviors across multiple different settings. Ideally, the learner has no idea what is coming next. They could be placed in a routine situation, such as a weekly team meeting, or a non-routine situation in which an angry client is on the phone and the learner has only a few minutes to de-escalate. In other words, if you want to train multiple leadership situations, such as leading an effective meeting, giving an effective performance review, or demonstrating active listening skills, then generalization, transfer, and long-run behavior change are most effective if you present those settings to the learner in random order. This teaches the leader to “think on their feet” and to be confident that they can handle any situation at any time. Put another way, it is optimal to train several people-skill “ideas” simultaneously and in random order. You don’t want to focus on one idea and just train it, then switch to another and just train it.

You also want to incorporate a broad set of environmental contexts. Although the context is not central to the skill being trained, including a broad range of contexts leads to more robust behavior change. For example, in leadership training focused on effective performance reviews, it would be ideal for the office setting to change across scenarios from modern to retro to minimalist. Similarly, it is best to practice with a range of employees who differ in age, gender, ethnicity, and so on. The broader-based the training, the better.
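The interleaved, varied-context schedule described above can be sketched as follows. The skill and context names are hypothetical examples drawn from the text, and the cross-and-shuffle approach is one simple way to realize "the learner has no idea what is coming next", not a prescription for any particular training product.

```python
import random

# Sketch of the interleaved training schedule described above: several
# people-skill scenarios crossed with varied environmental contexts,
# then presented in a random order. Skill and context names are
# hypothetical examples, not any vendor's curriculum.

skills = ["run a meeting", "give a performance review", "active listening"]
contexts = ["modern office", "retro office", "minimalist office"]

def build_schedule(skills, contexts, seed=None):
    """Cross every skill with every context, then shuffle the order."""
    sessions = [(s, c) for s in skills for c in contexts]
    random.Random(seed).shuffle(sessions)  # learner never knows what's next
    return sessions

for skill, context in build_schedule(skills, contexts, seed=42):
    print(f"practice '{skill}' in a {context}")
```

Crossing skills with contexts before shuffling guarantees every skill is practiced in every setting, while the shuffle prevents the blocked, one-idea-at-a-time ordering the text argues against.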

Summary

Microlearning is one of the most important advances in corporate training in decades. It directly addresses the need for continuous on-the-job learning. Microlearning’s focus on a single idea, with as little ancillary information as possible, is advantageous for hard, cognitive skill learning. It effectively recruits the cognitive machinery of working memory and attention and focuses those resources on the idea to be trained. It is efficient with respect to both time and performance.

On the other hand, microlearning is less effective for behavioral skill learning. Behavioral skills are learned by recruiting the basal ganglia and its dopamine-mediated incremental learning system. Behaviors are learned most effectively, and with greater generalization and transfer, when ancillary information is present and varies from training epoch to training epoch. This leads to robust behavioral skill development that is less context-sensitive and more broad-based. It facilitates an ability to “think on one’s feet” and builds the confidence necessary to feel prepared in any situation.

As I have outlined in recent research reports, microlearning represents one of the many exciting new tools and technologies available to L&D clients. That said, one size does not fit all, and different tools and technologies are optimal for different learning problems. Learning scientists are needed to map the appropriate tool onto the appropriate learning problem.