
Domino Deploys SAS Analytics Into a Model-Driven Cloud

The announcement: On July 10, Domino Data Lab announced a partnership with SAS that will let Domino users run SAS Analytics for Containers in the public cloud on AWS, with Domino’s data science platform serving as the orchestration layer for infrastructure provisioning and management. The partnership will allow SAS customers to use Domino to access multiple SAS environments for model building, deploy multiple SAS applications on AWS, track each SAS experiment in detail, and reproduce prior work.

What does this mean?

Domino customers with SAS Analytics workloads currently running on-prem will now be able to deploy those workloads to the public cloud on AWS by using SAS Analytics for Containers via the Domino platform. Domino plans to follow up with support for Microsoft Azure and Google Cloud Platform to further enable enterprises to offload containerized SAS workloads to the cloud. By running SAS Analytics for Containers via Domino, users will be able to track, provide feedback on, and reproduce their containerized SAS experiments the same way they do with other experiments they’ve constructed using Python, R, or other tools within Domino.

Continue reading Domino Deploys SAS Analytics Into a Model-Driven Cloud


Google BigQuery ML Extends the Power of (Some) Modeling to Data Analysts

Last week at Google Next ‘18, Google announced a new beta capability in their BigQuery cloud data warehouse: BigQuery ML, which lets data analysts apply simple machine learning models to data residing in BigQuery data warehouses.

Data analysts know databases and SQL, but generally have little experience building machine learning models in Python or R. A further problem is that moving data out of storage to run it through machine learning models is expensive, time-consuming, and potentially a regulatory violation. BigQuery ML aims to address these problems by letting data analysts push data through linear regression models (to predict a numeric value) or binary logistic regression models (to classify a value into one of two categories, such as “high” or “low”) using simple extensions of SQL on Google databases, run in place.

Continue reading Google BigQuery ML Extends the Power of (Some) Modeling to Data Analysts
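
To make this concrete, here is a minimal sketch of what BigQuery ML’s SQL extensions look like, wrapped in the google-cloud-bigquery Python client purely so the example is self-contained; an analyst would typically type the SQL directly into the BigQuery console. The dataset, table, and column names below are hypothetical.

from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials

# Train a binary logistic regression model in place -- the data never
# leaves the warehouse.
create_model_sql = """
CREATE OR REPLACE MODEL `my_dataset.churn_model`
OPTIONS (model_type = 'logistic_reg') AS
SELECT
  tenure_months,
  monthly_spend,
  churned AS label  -- BigQuery ML treats the column named `label` as the target
FROM `my_dataset.customers`
"""
client.query(create_model_sql).result()  # blocks until training finishes

# Classify new rows with ML.PREDICT, again purely in SQL.
predict_sql = """
SELECT predicted_label, predicted_label_probs
FROM ML.PREDICT(
  MODEL `my_dataset.churn_model`,
  (SELECT tenure_months, monthly_spend FROM `my_dataset.new_customers`))
"""
for row in client.query(predict_sql).result():
    print(row.predicted_label)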


Amalgam Insights Releases List of 2018’s Best Enablement Solution Vendors for “Building Better Sales Brains”

Optimized sales training requires combining three critical aspects: effective sales enablement, people skills, and situational awareness

July 31, 2018 08:30 ET | Source: Amalgam Insights

BOSTON, July 31, 2018 — It turns out that the traits common to the best salespeople aren’t necessarily innate; research indicates that salespeople can be scientifically coached to become top producers. A new report, issued today by Amalgam Insights, evaluates the technology vendors leading the industry in optimizing this kind of training.

Todd Maddox, the most-cited researcher in corporate learning and Learning Science Research Fellow at Amalgam Insights, authored the report. The research uniquely identifies learning science approaches to optimizing sales training, which Maddox says requires a combination of effective sales enablement, people skills, and situational awareness.

The companies recognized in this research are (in alphabetical order): Allego, Brainshark, CrossKnowledge, Gameffective, Highspot, JOYai, Lessonly, Mindtickle, myTrailhead, Qstream, Seismic, and Showpad.

“Companies are continually looking for a competitive edge; leveraging technology to develop a top-tier sales force is a key area for improvement,” Maddox says. “If you want sales enablement and training tools that are highly effective, they must be grounded in learning science – the marriage of psychology and brain science. Many sales teams and sales-focused vendors have access to these tools. What is needed is an effective way to leverage these tools to maximum advantage,” he added.

Maddox’s report notes that successful companies work to map sales processes to three distinct learning systems in the brain:

  • The cognitive skills learning system is about knowing facts (the “what”) and is directly linked to sales enablement;
  • The behavioral skills learning system is about behavior (the “how”) and is directly linked to people skills training; and
  • The emotional learning system is about “reading” people and situations (the “feel”) and is directly linked to situational awareness.

“When these three traits are combined, salespeople become far more effective in understanding both the tangible and intangible needs of their customers and can craft and deliver solutions that continually produce solid results for their companies,” Maddox said.

Maddox’s report, “2018’s Best Sales Enablement Solutions for Building Better Sales Brains,” is available for Sales VPs, Directors, and Managers at www.amalgaminsights.com.

About Amalgam Insights:
Amalgam Insights (www.amalgaminsights.com) is a consulting and strategy firm focused on the transformative value of Technology Consumption Management. AI believes that all businesses must fundamentally reimagine their approach to data, design, cognitive augmentation, pricing, and technology usage to remain competitive. AI provides marketing and strategic support to enterprises, vendors, and institutional investors for conducting due diligence in Technology Consumption Management.

For more information:
Hyoun Park
Amalgam Insights
hyoun@amalgaminsights.com
415.754.9686

Steve Friedberg
MMI Communications
steve@mmicomm.com
484.550.2900


Azure Advancements Announced at Microsoft Inspire 2018

Last week, Microsoft Inspire took place, which meant that Microsoft made a number of new product announcements regarding the Azure cloud. In general, Microsoft is trying to catch up to Amazon from a market share perspective while defending its current #2 position in the Infrastructure as a Service world against the rapidly growing Google Cloud Platform as well as IBM and Oracle. Microsoft Azure is generally regarded, along with Amazon, as a market-leading cloud platform that provides storage, computing, and security, and it is moving toward analytics, networking, replication, hybrid synchronization, and blockchain support.

Key functionalities that Microsoft has announced include:
Continue reading Azure Advancements Announced at Microsoft Inspire 2018


The Learning Science of Situational Awareness and Patient Safety

An Interactive Webinar with Qstream CEO Rich Lanchantin

On Wednesday, July 17, 2018, Amalgam’s Learning Scientist and Research Fellow, Todd Maddox, Ph.D., and Qstream’s CEO, Rich Lanchantin, conducted a live, interactive webinar focused on the critically important topic of situational awareness and patient safety. Achieving the highest quality in patient care requires a careful balance between efficiency and situational awareness in order to prevent medical errors. Healthcare professionals must be able to observe the current environment while also keeping the various potential outcomes top of mind in order to avoid unnecessary complications.

In the webinar, we discussed the learning science—the marriage of psychology and brain science—of situational awareness and patient safety and showed how to most effectively help clinicians learn and retain information, which results in long-term behavior change. We focused specifically on the challenges faced in optimally training the “what”, “feel” and “how” learning systems in the brain that mediate situational awareness, and how the Qstream platform effectively recruits each of these learning systems.

A replay of the webinar is available for all interested parties at the following link. Simply click the “Webcasts & Slideshare” button and the webcast is there. You will also see a second webcast that I recorded with Rich Lanchantin focused on “The Psychology of Hard vs. Soft Skills Training”. Enjoy!

If you would be interested in retaining Todd Maddox, the most-cited researcher in corporate learning, for a webinar, speaking engagement, or workshop, please contact us at sales@amalgaminsights.com.


The Adoption Gap in Learning and Development: Brandon Hall Hosts A Podcast with Amalgam Insights’ Todd Maddox

On Thursday, July 19, 2018, Brandon Hall Group released a podcast discussion between Amalgam Insights’ Learning Scientist and Research Fellow, Todd Maddox, Ph.D., and Brandon Hall’s COO, Rachel Cooke. The podcast focused on the “adoption gap” in Learning & Development that results when users are presented with a large number of tools and technologies, but little if any guidance on which tool to use when.

Todd and Rachel discuss the importance of leveraging learning science—the marriage of psychology and brain science—to provide best practices for mapping tools onto learning problems in the most effective manner. They also discuss the challenges faced in optimally training hard and people (aka soft) skills.

A replay of the podcast is available for all interested parties at the following link.

If you would be interested in retaining Todd Maddox, the most-cited researcher in corporate learning, for a podcast, webinar, speaking engagement, or workshop, please contact us at sales@amalgaminsights.com.


Market Milestone: Informatica and Google Cloud Partner to Open Up Data, Metadata, Processes, and Applications as Managed APIs

[This Research Note was co-written by Hyoun Park and Research Fellow Tom Petrocelli]

Key Stakeholders: Chief Information Officers, Chief Technical Officers, Chief Digital Officers, Data Management Managers, Data Integration Managers, Application Development Managers

Why It Matters: This partnership demonstrates how Informatica’s integration Platform as a Service brings Google Cloud Platform’s Apigee products and Informatica’s machine-learning-driven connectivity into a single solution.

Key Takeaway: This joint Informatica-Google API management solution gives customers a single offering that provides data, process, and application integration as well as API management. As data challenges evolve into workflow and service management challenges, this solution bridges key gaps for data and application managers and demonstrates how Informatica can partner with other vendors as a neutral third party to open up enterprise data and support next-generation data challenges.
Continue reading Market Milestone: Informatica and Google Cloud Partner to Open Up Data, Metadata, Processes, and Applications as Managed APIs


The Brain Science View on Why Microlearning Is Misused & Misapplied in Enterprise Learning Environments

Microlearning is taking the Learning and Development world by storm. Although many incorrectly identify microlearning as simply short duration training sessions, leaders in the field define microlearning as an approach to training that focuses on conveying information about a single, specific idea. The goal is to isolate the idea that is to be trained and then to focus all of the training effort on explaining that single idea with engaging and informative content. For example, with respect to sexual harassment, one might watch a brief piece of video content focused on the qualities of an inclusive leader, or ways to identify the symptoms of hostility in the workplace. The information would be presented in an engaging format that stimulates knowledge acquisition in the learner. The microlearning training goal is clear: train one idea succinctly with engaging content, and with as few “extras” as possible.

The Psychological and Brain Science of Microlearning: Training the Hard Skills of People Skills

Learning science—the marriage of psychology and brain science—suggests that microlearning is advantageous for at least two reasons. First, the emphasis on training a single idea as concisely and succinctly as possible increases the likelihood that the learner will remain engaged and attentive during the whole microlearning session. Put another way, the likelihood that the learner’s attention span will be exceeded is low.

Second, microlearning’s aim of eliminating any ancillary information that is not directly relevant to the target idea means that the cognitive machinery (i.e., working memory and executive attention) available to process the information can focus on the idea to be learned, with minimal effort expended on filtering out irrelevant information that could lead the learner astray. The learner’s cognitive load is focused entirely on the idea to be trained.

Because microlearning techniques are targeted at working memory, executive attention, and attention span in general, microlearning strongly affects processing in the cognitive skills learning system. The cognitive skills learning system in the brain recruits the prefrontal cortex, a region of cortex directly behind the forehead that mediates the learning of hard skills. These include learning rules and regulations, new software, and critical skills such as math and coding. Hard skill learning requires focused attention and the ability to process and rehearse the information. One learns by reading, watching, and listening, and information is ultimately retained through mental repetitions.

Thus, microlearning is optimal for hard skill training. Microlearning can revolutionize, and appears to be revolutionizing, online eLearning of hard skills.

The Psychological and Brain Science of Microlearning and People Skills Training

I showed in a recent article that online eLearning approaches to corporate training use the same one-size-fits-all delivery platform and procedures when training hard skills and people (aka soft) skills. Although generally effective for hard skills training, especially when tools like microlearning are incorporated, this one-size-fits-all approach is only marginally effective at training people skills because people skills are ultimately behavioral skills. People skills are about behavior: they are what we do, how we do it, and our intent. These are the skills one needs for effective interpersonal communication and interaction, for showing genuine empathy, for embracing diversity, and for avoiding situations in which unconscious biases drive behavior.

Behavioral skill learning is not mediated by the cognitive skills learning system in the brain; it is mediated by the behavioral skills learning system. Whereas the cognitive skills learning system recruits the prefrontal cortex and relies critically on working memory and executive attention, the behavioral skills learning system recruits the basal ganglia, a subcortical brain structure that does not rely on working memory and executive attention for learning. Rather, the basal ganglia learn behaviors gradually and incrementally via dopamine-mediated error-correction learning. When the learner generates a behavior that is followed in real time, literally within hundreds of milliseconds, by feedback that rewards the behavior, dopamine is released, and that behavior becomes incrementally more likely to occur the next time the learner is in the same context. Conversely, when the learner generates a behavior that is followed in real time by feedback that punishes the behavior, dopamine is not released, and that behavior becomes incrementally less likely to occur the next time the learner is in the same context.
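
This error-correction principle can be illustrated with a toy model. The sketch below is purely illustrative, not drawn from the research: the behavior names, learning rate, and feedback scheme are invented, and the strength of whichever behavior is emitted is simply nudged toward 1 on rewarding feedback and toward 0 on punishing feedback.

import random

# Toy model of dopamine-mediated error-correction learning: the strength
# of whichever behavior the learner actually emits is nudged up when the
# feedback is rewarding (dopamine released) and down when it is punishing
# (no dopamine). All names and numbers are invented for illustration.

LEARNING_RATE = 0.1  # size of each incremental update

strengths = {"de-escalate calmly": 0.5, "argue back": 0.5}

def choose_behavior():
    # Emit a behavior with probability proportional to its current strength.
    behaviors, weights = zip(*strengths.items())
    return random.choices(behaviors, weights=weights)[0]

for trial in range(50):
    behavior = choose_behavior()
    rewarded = (behavior == "de-escalate calmly")  # feedback from the scenario
    target = 1.0 if rewarded else 0.0
    # Incremental update, gated by feedback that arrives immediately after
    # the behavior (in the brain, within hundreds of milliseconds).
    strengths[behavior] += LEARNING_RATE * (target - strengths[behavior])

print(strengths)  # the rewarded behavior ends up markedly stronger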

People skills are learned by doing and involve physical repetitions.

Microlearning: The Hard Skills of People Skills

So how effective is microlearning for people skills training? Microlearning is very effective during the early epochs of people skills training, when the focus is on learning the hard skills of people skills. It is also effective for learning to identify good and bad people skills: for example, learning the definition of empathy, being shown a demonstration of unconscious bias, or learning some of the advantages of a diverse workplace. In these cases, microlearning is very useful because you are gaining a cognitive understanding of various aspects of people skills.

When microlearning content is grounded in rich scenario-based training, its effectiveness is enhanced. This follows because scenario-based training engages emotional learning centers in the brain that affect both hard and people skills learning. Rich scenarios allow learners to “see themselves” in the training, which primes the system for behavior change.

Microlearning: The Behavioral Skills of People Skills

Microlearning approaches are effective for training hard skills and, when supplemented with rich scenarios, for engaging emotion centers. Even so, people skills are ultimately behavioral skills. The ultimate goal is behavior change, and all of the cognitive skills training is in the service of preparing the learner for effective behavior change.

How effective is microlearning for behavioral skills learning and for effective behavior change?

The behavioral science is clear: behavioral skills training is optimized when you train the learner on multiple different behaviors across multiple different settings. Ideally, the learner has no idea what is coming next. They could be placed in a routine situation, such as a weekly team meeting, or a non-routine situation in which an angry client is on the phone and the learner has only a few minutes to de-escalate the situation. In other words, if there are multiple leadership situations to train, such as leading an effective meeting, giving an effective performance review, or demonstrating active listening skills, then generalization, transfer, and long-run behavior change are maximized when the learner is presented with these settings in random order. This teaches the leader to “think on their feet” and builds confidence that they can handle any situation at any time. Put another way, it is optimal to train several people skills “ideas” simultaneously and in random order; you don’t want to focus on one idea and just train it, then switch to another and just train it.

You also want to incorporate a broad set of environmental contexts. Although context is not central to the skill being trained, including a broad range of contexts leads to more robust behavior change. For example, in leadership training focused on effective performance reviews, it would be ideal for the office setting to change across scenarios from modern, to retro, to minimalist. Similarly, it is best to practice with a range of employees who differ in age, gender, ethnicity, and so on. The broader the training, the better.
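
As a rough illustration, the sketch below builds such an interleaved, context-varied practice schedule: several skills are trained simultaneously, in random order, across randomly varying settings, rather than one skill at a time in one setting. The skill and context names are invented examples.

import random

# Build an interleaved, context-varied practice schedule: every pairing of
# skill and setting, shuffled so the learner never knows what comes next.
# Skill and context names are hypothetical.

skills = [
    "lead an effective meeting",
    "give an effective performance review",
    "demonstrate active listening",
]
contexts = ["modern office", "retro office", "minimalist office"]

def interleaved_schedule(n_trials):
    pairs = [(skill, context) for skill in skills for context in contexts]
    schedule = []
    while len(schedule) < n_trials:
        random.shuffle(pairs)  # randomize order on every pass
        schedule.extend(pairs)
    return schedule[:n_trials]

for skill, context in interleaved_schedule(9):
    print("practice '%s' in a %s setting" % (skill, context))

Contrast this with blocked practice, which would drill one skill in one setting until mastery before moving on; the randomized schedule trades slower initial progress for stronger generalization and transfer.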

Summary

Microlearning is one of the most important advances in corporate training in decades, and it directly addresses the need for continuous on-the-job learning. Microlearning’s focus on a single idea, with as little ancillary information as possible, is advantageous for hard and cognitive skill learning: it effectively recruits the cognitive machinery of working memory and attention and focuses those resources on the idea to be trained. It is efficient in both time and performance.

On the other hand, microlearning is less effective for behavioral skill learning. Behavioral skills are learned by recruiting the basal ganglia and its dopamine-mediated incremental learning system. Behaviors are learned most effectively, and with greater generalization and transfer, when ancillary information is present and varies from training epoch to training epoch. This leads to robust behavioral skill development that is less context-sensitive and more broad-based. It facilitates the ability to “think on one’s feet” and builds the confidence necessary to feel prepared in any situation.

As I have outlined in recent research reports, microlearning represents one of the many exciting new tools and technologies available to L&D clients. That said, one size does not fit all: different tools and technologies are optimal for different learning problems, and learning scientists are needed to map the appropriate tool onto the appropriate learning problem.



What Wall Street is missing regarding Broadcom’s acquisition of CA Technologies: Cloud, Mainframes, & IoT

(Note: This blog contains significant contributions from long-time software executive and Research Fellow Tom Petrocelli)

On July 11, Broadcom ($AVGO) announced an agreement to purchase CA for $18.9 billion. If this acquisition goes through, this will be the third largest software acquisition of all time behind only Microsoft’s $26 billion acquisition of LinkedIn and Facebook’s $19 billion acquisition of WhatsApp. And, given CA’s focus, I would argue this is the largest enterprise software acquisition of all time, since a significant part of LinkedIn’s functionality is focused on the consumer level.

But why did Broadcom make this bet? The early reviews have shown confusion, with headlines such as:

  • Broadcom deal to buy CA makes little sense on the surface
  • 4 Reasons Broadcom’s $18.9B CA Technologies Buy Makes No Sense
  • Broadcom Buys CA – Huh?

All of these articles basically home in on the fact that Broadcom is a hardware company and CA is a software company, which leads to the conclusion that the two companies have nothing to do with each other. But to truly understand why Broadcom and CA can fit together, let’s look at the context.

In November 2017, Broadcom purchased Brocade for $5.5 billion to build out data center and networking markets. This acquisition expanded on Broadcom’s strengths in supporting mobile and connectivity use cases by extending Broadcom’s solution set beyond the chip and into actual connectivity.

Earlier this year, Broadcom tried to purchase Qualcomm for over $100 billion. Given Broadcom’s lack of cash on hand, this would have been a debt-based purchase with the obvious goal of rolling up the chip market. When the United States blocked the acquisition in March, Broadcom was likely left with a whole lot of money ready to deploy, money that needed to be used or lost, and no obvious target.

So, add these two together: Broadcom had both the cash to spend and a precedent showing that it wanted to expand its value proposition beyond the chip and into larger integrated solutions for two little trends called “the cloud,” especially private cloud, and “the Internet of Things.”

Now, in that context, take a look at CA. CA’s bread and butter comes from its mainframe solutions, which make up over $2 billion in revenue per year. Mainframes are large computers that handle high-traffic and dedicated workloads and increasingly need to be connected to more data sources, “things,” and clients. Although CA’s mainframe business is a legacy business, that legacy is focused on some of the biggest enterprise computational processing needs in the world. Thus, this is an area that a chipmaker would be interested in supporting over time. The ability to potentially upsell or replace those workloads over time with Broadcom computing assets, either through custom mainframe processors or through private cloud data centers, could add some predictability to the otherwise cyclical world of hardware manufacturing. Grab enterprise computing workloads at the source and then custom build to their needs.

This means that there’s also a potential hyperscale private cloud play here for Broadcom: bringing Broadcom’s data center networking business together with CA’s server management capabilities, which look at technical monitoring issues from both top-down and bottom-up perspectives.

CA is also strong in supporting mobile development, developer operations (DevOps), API management, IT operations, and service level management in its enterprise solutions business, which generated $1.75 billion in revenue over the past year. On the mobile side, this means that CA is a core toolset for building, testing, and monitoring the mobile and Internet of Things applications that will run through Broadcom’s chips. To optimize computing environments, especially in mobile and IoT edge environments where computing and storage resources are limited, applications need to be optimized for the available hardware. If Broadcom is going to take over the IoT chip market over time, its chips need to support the relevant app workloads.

I would also expect Broadcom to increase investment in CA’s Internet of Things and mobile app dev departments once it completes this transaction. Getting CA’s dev tools closer to silicon can only help performance and help Broadcom provide out-of-the-box IoT solutions. This acquisition may even push Broadcom into the solutions and services market, which would blow the minds of hardware analysts and market observers but would also be a natural extension of Broadcom’s current acquisitions to move through the computing value stack.

From a traditional OSI perspective, this acquisition feels odd because Broadcom is skipping multiple layers between its core chip competency and CA’s core competency. But the Brocade acquisition helps close the gaps even after spinning off Ruckus Wireless, Lumina SDN, and data center networking businesses. Broadcom is focused on processing and guiding workloads, not on transport and other non-core activities.

So, between the mainframe, private cloud, mobile, and IoT markets, there are a number of adjacencies between Broadcom and CA. It will be challenging to knit all of these pieces together accretively. But because so much of CA’s software is focused on the monitoring, testing, and security of hardware and infrastructure, this acquisition isn’t quite as crazy as a variety of pundits seem to think. In addition, the relative consistency of CA’s software revenue, compared to the highs and lows of chip building, may also benefit Broadcom by providing predictable cash flow to manage debt payments and to fund the next acquisition that Hock Tan seeks to hunt down.

All this being said, this is still very much an acquisition out of left field, and I’ll be fascinated to see how the transaction ends up. It is somewhat reminiscent of Oracle’s 2009 acquisition of Sun to bring hardware and software together. That precedent does not necessarily inspire confidence, since hardware/software mergers have traditionally been tricky, but it doesn’t disprove the synergies that do exist. In addition, Oracle’s move points out that Broadcom seems to have skipped the step of purchasing a relevant casing, device, or server company. Could that be a future acquisition to bolster existing investments and push further into the world of private cloud?

A key challenge and important point that my colleague Tom Petrocelli brings up is that CA and Broadcom sell to very different customers. Broadcom has been an OEM-based provider while CA sells directly to IT. As a result, Broadcom will need to be careful in maintaining CA’s IT-based direct and indirect sales channels and would be best served to keep CA’s go-to-market teams relatively intact.

Overall, the Broadcom acquisition of CA is a very complex puzzle with several potential outcomes:

1. The diversification efforts will work to smooth out Broadcom’s revenue over time and provide more predictable revenues to support Broadcom’s continuing growth through acquisition. This will help their stock in the long run and provide financial benefit.
2. Broadcom will fully integrate the parts of CA that make the most sense for it to have, especially the mobile security and IoT product lines, and sell or spin off the rest to help pay for the acquisition. Although the Brocade spinoffs occurred prior to that acquisition, there are no forces that prevent Broadcom from spinning off non-core CA assets and products, especially those that are significantly outside the IoT and data center markets.
3. In a worst case scenario, Broadcom will try to impose its business structure on CA, screw up the integration, and kill a storied IT company over time through mismanagement. Note that Amalgam Insights does not recommend this option.

But there is some alignment here, and it will be fascinating to see how Broadcom takes advantage of CA’s considerable IT monitoring capabilities, uses CA’s business to increase chip sales, and uses CA’s cash flow to continue Broadcom’s massive M&A efforts.


“Walking a Mile in My Shoes” With Skillsoft’s Leadership Development Program: A Market Milestone

In a recently published Market Milestone, Todd Maddox, Ph.D., Learning Scientist and Research Fellow for Amalgam Insights, evaluated Skillsoft’s Leadership Development Program (SLDP) from a learning science perspective, which involves evaluating the content as well as the learning design and delivery. Amalgam’s overall evaluation is that SLDP content is highly effective. The content is engaging and well constructed, with a nice mix of high-level commentary from subject matter experts, dramatic and pragmatic storytelling from a consistent cast of characters faced with real-world problems, and a mentor to guide the leader-in-training through the process. Each course is approximately one hour in length and is composed of short 5-10 minute video segments built with single-concept microlearning in mind.

From a learning design and delivery standpoint, the offering is also highly effective. Brief, targeted, 5 to 10 minute content is well suited to the working memory and attentional resources available to the learner. Each course begins with a brief reflective question that primes the cognitive system in preparation for the subsequent learning and activates existing knowledge, thus providing a rich context for learning. The Program is grounded in a storytelling, scenario-based training approach with a common set of characters and a “mentor” who guides the training. This effectively recruits the cognitive skills learning system in the brain while simultaneously activating emotion and motivation centers, drawing learners into the situation so that they begin to see themselves as part of the story. This “walk a mile in my shoes” experience increases information retention and primes the learner for experiential behavior change.

For more information, read the full Market Milestone on the Skillsoft website.