
ICYMI: On-Demand Webinar – Four Techniques to Run AI on Your Business Data

On October 17th, I presented a webinar with Incorta's Chief Evangelist, Matthew Halliday, on the importance of BI architectures in preparing for AI. This webinar is grounded in a core Amalgam Insights belief: all enterprise analytics and data science activity should be built on a shared core of trusted and consistent data, so that Business Intelligence, analytics, machine learning, data science, and deep learning efforts start from similar assumptions and can build on each other.

While AI is beginning to impact every aspect of our consumer lives, business data-driven AI seems to sit lower on the priority list of most enterprises. The struggle to understand the practical value of AI starts with the inability to make business data easily accessible to data science teams. Today's BI tools have not kept up with this need and often become the bottlenecks that stifle innovation.

In this webinar, you will learn from Hyoun Park and Matthew Halliday about:
  • key data and analytic trends leading to the need to accelerate analytic access to data.
  • guidance for challenges in implementing AI initiatives alongside BI.
  • practical and future-facing business use cases that can be supported by accelerating analytic access to large volumes of operational data.
  • techniques that accelerate AI initiatives on your business data.

Watch this webinar on-demand by clicking here.


Now Available: Infrastructure as Code: Managing Hybrid, Multi-cloud Infrastructure at Scale

Research Fellow Tom Petrocelli recently recorded an on-demand BrightTALK webinar on Infrastructure as Code (IaC): a key trend for any company seeking to manage infrastructure at scale.

Petrocelli has previously covered this topic in additional research.

This webinar is a short introduction designed for IT professionals who are exploring Infrastructure as Code, with a focus on SysOps professionals, managers (from mid-level to the C-suite), and system architects. As modern IT systems become more diverse and complex, managing them, especially at scale, can become very difficult. Infrastructure as Code presents a new way of thinking about infrastructure management that alleviates many of these challenges.
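To make the idea concrete, below is a minimal sketch of the declarative, idempotent model behind IaC tools: infrastructure is described as data, and a reconciler converges the live environment toward that description. The resource names and the in-memory "inventory" are invented for illustration; real tools such as Terraform or Ansible supply the equivalent machinery against actual provider APIs.

```python
# Toy illustration of Infrastructure as Code: desired state is declared
# as data, and a reconciler idempotently converges reality toward it.
# The inventories below stand in for a real cloud provider's API.

desired_state = {
    "web-01": {"type": "vm", "size": "medium", "region": "us-east"},
    "web-02": {"type": "vm", "size": "medium", "region": "us-east"},
    "db-01":  {"type": "vm", "size": "large",  "region": "us-east"},
}

# What currently exists in the environment (hypothetical).
live_state = {
    "web-01": {"type": "vm", "size": "small", "region": "us-east"},
    "old-01": {"type": "vm", "size": "small", "region": "us-east"},
}

def reconcile(desired, live):
    """Plan the create/update/delete actions that converge live toward desired."""
    plan = []
    for name, spec in desired.items():
        if name not in live:
            plan.append(("create", name, spec))
        elif live[name] != spec:
            plan.append(("update", name, spec))
    for name in live:
        if name not in desired:
            plan.append(("delete", name, None))
    return plan

for action, name, spec in reconcile(desired_state, live_state):
    print(action, name, spec or "")
```

Because the description is plain data, it can be version-controlled, reviewed, and re-applied safely, which is what makes the approach tractable at scale.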

The webinar goes on to discuss these challenges, and the Infrastructure as Code approach to them, in more detail.


Data Science Platforms News Roundup, September 2018

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Anaconda, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, Elastic, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta, TROVE.



Qstream Leverages Science to Train Situational Awareness in the Sales Brain

Consider a sales manager's all-too-common nightmare…

"I wake up in the middle of the night in a pool of sweat, worried about my sales teams' ability to 'think on their feet' and to 'read' the ever-changing sales landscape. They know the product inside and out. I know I quiz them frequently. During role play they also perform well. However, to a person, when faced with an objection, time pressure, or a resistant client, they choke. It is as if they have forgotten everything that they know. I am at a loss for what to do."

This is a problem of "situational awareness," and it is common in sales and many other domains (e.g., healthcare). A sales professional with strong situational awareness has an almost intuitive feel for the current situation and a very good idea of what is coming next. This individual remains calm and can retrieve critical product information in routine and non-routine situations alike (e.g., under time pressure), knows how to act in any situation, and always puts their best foot forward, maximizing the chances of a sale regardless of the circumstances.


Why Percipio Excels as a Learning Experience Platform: A Market Milestone

In a recently published Market Milestone, Todd Maddox, Ph.D., Learning Scientist and Research Fellow for Amalgam Insights, evaluated Percipio, Skillsoft's Learning Experience Platform, from a learning science perspective—the marriage of psychology and brain science. This involves evaluating the training content and delivery to determine whether it effectively engages psychological processes and learning systems in the brain.

Amalgam’s overall evaluation is that Percipio is highly effective.
Percipio's ELSA (Embedded Learning Synchronized Assistant) addresses the need for organized and easily accessible content as well as or better than most competitors' platforms. ELSA provides just-in-time, searchable, and suggested content, all seamlessly integrated into the employee's flow of work.

Percipio also engages the task-appropriate system in the brain. The "watch," "read," "listen" framework allows learners to use the medium that is most effective in the current context, and it facilitates microlearning for a targeted, concise understanding of a concept as well as macrolearning for a deeper dive. The emphasis on storytelling and scenario-based content, when appropriate, engages emotional systems that upregulate processing in both the cognitive and behavioral skills systems in the brain. The addition of "practice" (to be released in 2019) will facilitate the development of behavioral skills directly in the workplace.

For more information, read the full Market Milestone at http://learn.skillsoft.com/Website-AR-Amalgam-Insights-The-Learning-Science-Perspective-View-Full-Report.html.


Oracle Delivers a FOSS Surprise

Tom Petrocelli, Amalgam Insights Research Fellow

An unfortunate side effect of being an industry analyst is that it is easy to become jaded. There is a tendency to fall back into stereotypes about technology and companies. Add to this nearly 35 years in computer technology, and it would surprise no one to hear an analyst say, "Been there, done that, got the t-shirt." Some companies elicit this reaction more than others. Older tech companies with roots in the '80s or earlier tend to get in a rut and focus on incremental change (so as not to annoy their established customer base) instead of the exciting new trends. This makes it hard to be impressed by them.

Oracle is one of those companies. It has a reputation for being behind the market (cloud anyone?) and as proprietary as can be. Oracle has also had a difficult time with developers. The controversy over Java APIs (which is really a big company spat with Google) hasn’t helped that relationship. There are still hard feelings left over from the acquisition of Sun Microsystems (a computer geek favorite) and MySQL that have left many innovative developers looking anywhere but Big Red. Oracle’s advocacy of Free and Open Source Software (FOSS) has been at best indifferent. When the FOSS community comes together, one expects to see Red Hat, Google, and even Microsoft and IBM but never Oracle.

Which is why my recent conversation with Bob Quillin of Oracle came as a complete surprise. It was like a bucket of cold water on a hot day, both shocking and at the same time, refreshing.

Now, it's important to get some context right up front. Bob came to Oracle via an acquisition, StackEngine. So his and his team's DNA is more FOSS than Big Red. And, like an infusion of new DNA, the StackEngine crew has succeeded in changing Oracle on some level. They have launched Oracle Kubernetes and Registry Services, which bring a container engine, Kubernetes, and a Docker V2-compatible registry to the Oracle Cloud Service. That's a lot of open source for Oracle.

In addition, Bob talked about how they were helping Oracle customers move to a Cloud Native strategy. Cloud Native almost always means embracing FOSS, since so many of its components are FOSS. Add to the mixture a move into serverless with Fn. Fn is also an open source project (Apache 2.0 licensed), but one that originated in Oracle. That's not to say there aren't other Oracle open source projects (Graal, for example), but they aren't at the very edge of computing like Fn. In this part of the FOSS world, Oracle is leading, not following. Oracle even plans to have a presence at KubeCon + CloudNativeCon 2018 in Seattle this December, an open source-oriented conference run by The Linux Foundation, where it will be a Platinum Sponsor. In the past this would be almost inconceivable.
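For readers who haven't seen Fn, a function is just a small handler packaged into a container. The sketch below follows the pattern of the Fn project's Python FDK "hello world" template; treat the fdk import and the handler signature as assumptions based on that template rather than a definitive reference.

```python
# A minimal Fn-style Python function, modeled on the Fn project's
# Python FDK template (fnproject.io). The fdk API shown here is an
# assumption based on that template; check the project docs.
import io
import json

from fdk import response


def handler(ctx, data: io.BytesIO = None):
    # Parse an optional JSON body such as {"name": "Oracle"}.
    name = "world"
    try:
        body = json.loads(data.getvalue())
        name = body.get("name", name)
    except (ValueError, AttributeError):
        pass  # no body or malformed JSON: keep the default greeting
    return response.Response(
        ctx,
        response_data=json.dumps({"message": "Hello, {}!".format(name)}),
        headers={"Content-Type": "application/json"},
    )
```

Deployment is roughly `fn init --runtime python` followed by `fn deploy`; the platform then handles routing the function and scaling its container.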

The big question is how this will affect the rest of Oracle. Will this be a side project for Oracle, or will they rewrite the Oracle DNA in the same way that Microsoft has? Can they find the balance between the legacy business, which is based on high-priced, proprietary software – the software that is paying the bills right now – and the community-run, open source world that is shaping the future of IT? Only time will tell, but there will be a big payoff for IT if it happens. Say what you will about Oracle, they know how to do enterprise software. Security, performance, and operating at scale are Oracle's strengths, and a big reason their customers keep buying from Oracle instead of from an open source startup or even AWS. An infusion of that type of knowledge into the FOSS community would help overcome many of the downsides that IT experiences when trying to implement open source software in large enterprise production environments.

Was I surprised? To say the least. I've never had a conversation like this with Oracle. Am I hopeful? A bit. There are forces within companies like Oracle that can crush an initiative like this. As the market continues to shift in the direction of microservices, containers, and open source in general, Oracle risks becoming too out of step with the current generation of developers. Even if FOSS doesn't directly move the needle on Oracle revenue, it can have a profound effect on how Oracle is viewed by the developer community. If the attitude of people like Bob Quillin becomes pervasive, then younger developers may start to see Oracle as more than just their father's software company. In my opinion, the future of Oracle may depend on that change in perception.


Why It Matters that IBM Announced Trust and Transparency Capabilities for AI


Note: This blog is a followup to Amalgam Insights’ visit to the “Change the Game” event held by IBM in New York City.

On September 19th, IBM announced the launch of a portfolio of AI trust and transparency capabilities. This announcement got Amalgam Insights' attention because of IBM's relevance and focus in the enterprise AI market throughout this decade. To understand why IBM's specific launch matters, take a step back and consider IBM's considerable role in building out the current state of the enterprise AI market.

IBM AI in Context

Since IBM’s public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence and spent billions of dollars in establishing both IBM Watson and AI. This has been a challenging path to travel as IBM has had to balance this market-leading innovation with the financial demands of supporting a company that brought in $107 billion in revenue in 2011 and has since seen this number shrink by almost 30%.

In addition, IBM had to balance its role as an enterprise technology company focused on the world’s largest workloads and IT challenges with launching an emerging product better suited for highly innovative startups and experimental enterprises. And IBM also faced the “cloudification” of enterprise IT in general, where the traditional top-down purchase of multi-million dollar IT portfolios is being replaced by piecemeal and business-driven purchases and consumption of best-in-breed technologies.

Seven years later, the jury is still out on how AI will ultimately end up transforming enterprises. What we do know is that a variety of branches of AI are emerging.


IBM Presents “Change the Game: Winning with AI” in New York City


(Note: This blog is part of a multi-part series on this event and the related analyst event focused on IBM’s current status from an AI perspective.)

On September 13th, 2018, IBM held an event titled "Change the Game: Winning with AI." The event was hosted by ESPN's Hannah Storm and held in Hell's Kitchen's Terminal 5, better known as a music venue where acts ranging from Lykke Li to Kali Uchis to Good Charlotte perform. In this rock star atmosphere, IBM showcased its current perspective on AI (artificial intelligence, not Amalgam Insights!).

IBM's Rob Thomas and ESPN's Hannah Storm discuss IBM Cloud Private for Data

This event comes at an interesting time for IBM. Since IBM’s public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence and spent billions of dollars in establishing both IBM Watson and AI. However, part of IBM’s challenge over the past several years was that the enterprise understanding of AI was so nascent that there was no good starting point to develop machine learning, data science, and AI capabilities. In response, IBM built out many forms of AI including

  • IBM Watson as a standalone, Jeopardy!-like solution for healthcare and financial services,
  • Watson Developer Cloud to provide language, vision, and speech APIs,
  • Watson Analytics to support predictive and reporting analytics,
  • chatbots and assistants to support talent management and other practical use cases,
  • Watson Studio and Data Science Experience to support enterprise data science efforts to embed statistical and algorithmic logic into applications, and
  • evolutionary neural network design at the research level.

And, frankly, the velocity of innovation was difficult for enterprise buyers to keep up with, especially as diverse products were all labelled Watson and as buyers were still learning about technologies such as chatbots, data science platforms, and IBM’s hybrid cloud computing options at a fundamental level. The level of external and market-facing education needed to support relatively low-revenue and experimental investments in AI was a tough path for IBM to support (and, in retrospect, may have been better supported as a funded internal spin-off with access to IBM patents and technology). Consider that extremely successful startups can justify billion dollar valuations based on $100 million in annual revenue while IBM is being judged on multi-billion dollar revenue changes on a quarter-by-quarter basis. That juxtaposition makes it hard for public enterprises to support audacious and aggressive startup goals that may take ten years of investment and loss to build.

IBM's perspective at this "Change the Game" event was an important checkpoint for Amalgam Insights in understanding IBM's current positioning on AI and the upcoming announcements meant to support enterprise AI.

With strategies telestrated by IBM's Janine Sneed and Daniel Hernandez, this event was an entertaining breakdown of IBM case studies demonstrating the IBM analytics and data portfolio, interspersed with additional product and corporate presentations by IBM's Rob Thomas, Dinesh Nirmal, Reena Ganga, and Madhu Kochar. The event was professionally presented as Hannah Storm interviewed a wide variety of IBM customers, including:

  • Mark Vanni, COO of Trūata
  • Joni Rolenaitis, Vice President of Data Development and Chief Data Officer for Experian
  • Dr. Donna M. Wolk, System Director of Clinical and Molecular Microbiology for Geisinger
  • Guy Taylor, Executive Head of Data-Driven Intelligence at Nedbank
  • Sreesha Rao, Senior Manager of IT Applications at Niagara Bottling LLC
  • James Wade, Director of Application Hosting for GuideWell Mutual Holding Company
  • Mark Lack, Digital Strategist and Data Scientist for Mueller, Inc
  • Rupinder Dhillon, Director of Machine Learning and AI at Bell Canada

In addition, IBM demonstrated aspects of IBM Cloud Private for Data, its cloud platform built for using data for AI, as well as Watson Studio, IBM's data science platform, and design aspects for improving enterprise access to data science and analytic environments.

Overall, Amalgam Insights saw this event as a public-friendly opportunity to introduce IBM’s current capabilities in making enterprises ready to support AI by providing the data, analytics, and data science products needed to prepare enterprise data ecosystems for machine learning, data science, and AI projects. In upcoming blogs, Amalgam Insights will cover IBM’s current AI positioning in greater detail and the IBM announcements that will affect current and potential AI customers.


Todd Maddox Ph.D.’s Top Four Scientific Observations on HR Tech 2018

HR Tech gets bigger and bigger every year. HR Tech 2018 was no exception. It broke all of the previous records. More importantly, the quality of the offerings, presentations and savvy of the clients continues to grow. I had a great time at HR Tech 2018, and I am already looking forward to 2019.

It took some time to dig out from three days away from my desk, and more time to let my thoughts coalesce, but I am now ready to provide my insights on the event.

As a psychological and brain scientist with 25 years of basic science research under my belt, and a particular interest in personality, motivation, learning, and all things data, I resonate with events like HR Tech because my passion is all things "people." I find topics such as recruitment, interviewing, learning and development, leadership, succession planning, and talent management in general fascinating. Although I am sure others who attended HR Tech will highlight different topics, and acknowledging that I was only able to speak in detail with a dozen or so vendors, here are my Top Four Scientific Observations.

The Impact of Science Continues to Grow

Relevant Vendors That I Spoke With: Cornerstone, IBM, Infor, LTG, PeopleFluent, Phenom People, Saba, Skillsoft/Sumtotal, TalentQuest, Workday

I am happy to say that I see a growing interest in the importance of science, not only among vendors but also among their potential customers. Whether talent science, learning science, or data science, the methodology called "science" is growing in HR. Importantly, this is not just being used as a buzzword. In hiring and recruitment, the science of assessment is central. Combine assessment with "big data" and amazing recruitment offerings follow. In Learning and Development, tools and technologies are being built that battle the brain's natural tendency to forget and that present information within the attention span of the learner; these include microlearning, spaced training, and testing (a simple sketch of the spacing logic follows below). In talent management, the science of personality is growing in importance. Personality is directly related to motivation and incentivization, and all three are being leveraged. The science of data and the use of analytics, machine learning, and artificial intelligence to garner actionable insights continue to grow in use and sophistication.
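As an illustration of the kind of logic behind spaced training, here is a toy scheduler loosely in the spirit of classic spaced-repetition algorithms such as SM-2: items answered correctly are reviewed at stretching intervals, while missed items come back quickly. The parameters are illustrative, not those of any particular vendor's product.

```python
# Toy spaced-repetition scheduler, loosely inspired by SM-2.
# Correct answers stretch the review interval; misses reset it.
from datetime import date, timedelta

def next_review(interval_days: float, ease: float, correct: bool):
    """Return (new_interval_days, new_ease) after one review."""
    if not correct:
        return 1.0, max(1.3, ease - 0.2)  # missed: review again tomorrow
    return max(1.0, interval_days * ease), min(2.8, ease + 0.05)

# Simulate one training item over a few review sessions.
interval, ease = 1.0, 2.0
today = date(2018, 10, 1)
for outcome in [True, True, False, True]:
    interval, ease = next_review(interval, ease, outcome)
    today += timedelta(days=round(interval))
    print("next review on {} (interval {:.1f} days)".format(today, interval))
```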

From a psychological and brain science perspective, we still have a long way to go, and we need to do a better job of mapping tools onto tasks but progress is being made and vendors are listening to their clients.

The Growing Importance of People Skills and Leadership Training

Relevant Vendors That I Spoke With: LeadX, LTG, PeopleFluent, Saba, Skillsoft/Sumtotal, TalentQuest

I have written extensively on the difference between hard skills and people (aka soft) skills, the psychological processes and brain systems relevant to each, and the fact that very different training tools are needed for each. At times, I have felt like people hear, but are not listening. The tide is turning. More and more HR professionals (vendors and customers) are embracing the importance of people skills. Whether it is growing concern about automation, the extensive data suggesting that diversity increases revenue and workplace harmony, the #metoo movement, or more likely the combination of all three, HR is embracing the need for effective people skills training in organizations large and small. This is not just about compliance and box checking, but a recognition that people skills are critical in the workplace.

The growing emphasis on leadership training is also exciting, and people skills are central to this mission. I visited numerous vendors from small, innovative startups to much larger companies who are working to tackle the leadership development problem head-on. Although more work is needed, some of these offerings are truly remarkable.

I should also note that some vendors are developing leadership training programs focused on developing Leadership in Women. For those of you who have read the likes of “Lean In”, you know the challenges facing women in the corporate world, especially leadership. Offerings targeted to these specific problems are welcome.

The Growing Recognition of the Value of Personality

Relevant Vendors That I Spoke With: IBM, Infor, PeopleFluent, Phenom People, LeadX, TalentQuest

As I alluded to earlier, the science of personality is growing in its relevance and sophistication in HR. I have conducted a number of studies of my own and have shown convincingly that personality is critical and affects many aspects of HR, including team dynamics, learning and development, effective management, and succession planning, to name a few. The number of vendors embracing the importance of personality is growing, and the quality of some of the offerings is truly amazing. Personality assessment blends psychology and data science and can offer insights that are immediately actionable and can make the difference between workplaces that are high in engagement, satisfaction, and retention, and those that are weak on these same metrics. With the constant battle to find, develop, nurture, and retain the right employees, factors such as personality will continue to grow in value.

There is still room for improvement and some intersections that the majority of HR vendors are not seeing, such as the intersection between personality, motivation, learning and incentivization, but I expect that those will come with time.

Virtual Reality Has Arrived

Relevant Vendors That I Spoke With: STRIVR, SumTotal, TalentQuest

For the past several years, I have been following the emergence of the immersive technology of virtual reality. My interest in this technology comes from my interest in Learning and Development. From a brain science perspective, it is clear that this technology is highly effective at engaging learning centers in the brain. Until recently, technological issues and equipment costs constrained virtual reality's commercial use. For example, a year or so ago, a good virtual reality system cost several thousand dollars and required a laptop and a tether cable connected to the head-mounted display. Recently, Oculus released the Go, a standalone head-mounted display that costs under $300. A game changer!

Although the number of virtual reality offerings at HR Tech is still relatively small, the buzz around this technology is growing exponentially, and some key players are driving engagement. In addition, the quality of the offerings that are out there is impressive. These tools will be invaluable for collaboration and training. I am very excited to see this sector grow in 2019, and I fully expect the number of offerings at HR Tech 2019 to be significantly higher.

Conclusions

It is an exciting time for HR. That was clearly reflected in the enthusiasm, innovation, and quality of the offerings at HR Tech 2018. Skills that have often been given lip service (e.g., people skills) are getting the attention they deserve. Constructs that derive directly from people, such as personality, are being leveraged for good. New and exciting technologies such as virtual reality are being introduced for more than gaming. And all of these are being built on a solid foundation of data and science. The future is bright for HR and HR Tech.


Learning Elastic’s Machine Learning Story at Elastic{ON} in Boston

Why is a Data Science and Machine Learning Analyst at Elastic’s road show when they’re best known for search? In early September, Amalgam Insights attended Elastic{ON} in Boston, MA. Prior to the show, my understanding of Elastic was that they were primarily a search engine company. Still, the inclusion of a deep dive into machine learning interested me, and I was also curious to learn more about their security analytics, which were heavily emphasized in the agenda.

In exploring Elastic's machine learning capabilities, I got a deep dive with Rich Collier, a Senior Principal Solutions Architect and machine learning specialist at Elastic. Elastic acquired Prelert, an incident management company with unsupervised machine learning capabilities, in September 2016, with the goal of incorporating real-time behavioral analytics into the Elastic Stack. In the two years since, integrating Prelert has grown Elastic's ability to act on time-series data anomalies found in the Elasticsearch data store, yielding an extension to the Elastic Stack called "Machine Learning" that is offered as part of their Platinum-level SaaS offerings.

Elastic Machine Learning users no longer have to define rules to identify abnormal time-series data, nor do they even need to code their own models – the Machine Learning extension analyzes the data to understand what "normal" looks like in that context, including what kinds of shifts can be expected over different periods of time, from point-to-point all the way up to seasonal patterns. From that, it learns when to throw an alert on encountering atypical data in real time, whether that data is log data, metrics, analytics, or a sudden upsurge in search requests for "$TSLA." Learning from the data rather than configuring blunt rules makes for more granular precision and fewer alerts on false positives.
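To give a feel for what "learning normal" means, here is a deliberately simplified sketch of baseline-based anomaly detection: it learns a typical value per hour of day from history and flags points that deviate by more than a few standard deviations. Elastic's actual models are unsupervised and far more sophisticated, so treat this purely as an illustration of the idea; the data and threshold are invented.

```python
# Simplified illustration of baseline-based anomaly detection on a
# time series: learn per-hour-of-day "normal" from history, then flag
# new points that sit several standard deviations away from it.
import statistics
from collections import defaultdict

def build_baseline(history):
    """history: iterable of (hour_of_day, value) pairs."""
    buckets = defaultdict(list)
    for hour, value in history:
        buckets[hour].append(value)
    return {hour: (statistics.mean(vals), statistics.pstdev(vals) or 1.0)
            for hour, vals in buckets.items()}

def is_anomaly(baseline, hour, value, threshold=3.0):
    mean, stdev = baseline.get(hour, (0.0, 1.0))
    return abs(value - mean) / stdev > threshold

# Two weeks of synthetic hourly request counts: quiet nights, busy days.
history = [(h, 100 + 400 * (9 <= h <= 17) + (d * 3) % 7)
           for d in range(14) for h in range(24)]
baseline = build_baseline(history)
print(is_anomaly(baseline, 3, 620))   # True: a spike at 3 a.m. is unusual
print(is_anomaly(baseline, 14, 505))  # False: normal afternoon load
```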

The configuration for the Machine Learning extension is simple and requires no coding experience; front-line business analysts can customize the settings via pull-down menus and other graphical form fields to suit their needs. To simplify the setup process even further, Elastic offers a list of "machine learning recipes" on their website for several common use cases in IT operations and security analytics; given how graphically oriented the Elastic Stack is, I wouldn't be surprised to see these "recipes" implemented as default configuration options in the future. Doing so would reduce the configuration from several minutes of tweaking individual settings to selecting a common profile in a click or two.
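For those who prefer automation to menus, anomaly detection jobs can also be created through Elasticsearch's REST API. The sketch below is modeled on the Elasticsearch 6.x X-Pack machine learning job API; the endpoint path, host, job name, and settings are assumptions to verify against the documentation for your version.

```python
# Hedged sketch: creating an anomaly detection job via Elasticsearch's
# REST API, modeled on the 6.x X-Pack ML API. The host, job id, and
# settings here are hypothetical; verify paths and fields per version.
import json
import requests

job_config = {
    "description": "Alert on unusual request rates",
    "analysis_config": {
        "bucket_span": "15m",
        "detectors": [
            # "count" models event volume per bucket; no field needed.
            {"function": "count", "detector_description": "event rate"}
        ],
    },
    "data_description": {"time_field": "@timestamp"},
}

resp = requests.put(
    "http://localhost:9200/_xpack/ml/anomaly_detectors/request-rate-demo",
    headers={"Content-Type": "application/json"},
    data=json.dumps(job_config),
)
print(resp.status_code, resp.json())
```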

Elastic also stated that one long-term goal is to "operationalize data science for everyone." At the moment, that's a fairly audacious claim for a data science platform, let alone a company best known for search and search analytics. One relevant initiative that Steve Kearns, Elastic's Senior Director of Product Management, mentioned in his keynote was the debut of the Elastic Common Schema, a common set of fields for ingesting data into Elasticsearch. Standardizing data collection makes it easier to correlate and analyze data through these relational touch points, and opens up the potential for data science initiatives in the future, possibly through partnerships or acquisitions. But they're not trying to be a general-purpose data science company right now; they're building on their core of search, logging, security, and analytics, and machine learning ventures are likely to fall within this context. Currently, that offering is anomaly detection on time series data.
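To illustrate what a common schema buys you, the sketch below normalizes two differently shaped log events into one ECS-style document so that a single query can span both sources. The dotted field names echo ECS conventions such as `@timestamp` and `source.ip`, but the exact mapping here is illustrative; ECS was only just debuting at the time.

```python
# Illustration of why a common schema matters: two sources that name
# the same concepts differently are normalized into one ECS-style
# shape, so queries and correlations can be written once.
def normalize_apache(event):
    return {"@timestamp": event["time"],
            "source": {"ip": event["remote_addr"]},
            "http": {"response": {"status_code": int(event["status"])}},
            "event": {"dataset": "apache.access"}}

def normalize_firewall(event):
    return {"@timestamp": event["ts"],
            "source": {"ip": event["src"]},
            "event": {"dataset": "firewall.log", "action": event["action"]}}

docs = [
    normalize_apache({"time": "2018-09-05T10:01:00Z",
                      "remote_addr": "10.0.0.8", "status": "500"}),
    normalize_firewall({"ts": "2018-09-05T10:01:02Z",
                        "src": "10.0.0.8", "action": "deny"}),
]

# One correlation over both sources: which datasets saw each source IP?
by_ip = {}
for doc in docs:
    by_ip.setdefault(doc["source"]["ip"], []).append(doc["event"]["dataset"])
print(by_ip)  # {'10.0.0.8': ['apache.access', 'firewall.log']}
```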

Providing users who aren’t data scientists with the ability to do anomaly detection on time series data may be just one step in one category of data modeling, but having that sort of specialized tool accessible to data and business analysts would help organizations better understand the “periodicity” of typical data. Retailers could track peaks and valleys in sales data to understand purchasing patterns, for example, while security analysts could focus on responding to anomalies without having to define what anomalous data looks like ahead of time as a collection of rules.

Elastic's focus on making this one specific machine learning tool accessible to non-data-scientists reminded me of Google's BigQuery ML initiative: take one very specific type of machine learning query and operationalize it for easy use by data and business analysts to address common business questions. Then, once they've perfected that tool, they'll move on to building the next one.
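For comparison, the BigQuery ML pattern looks like the hedged sketch below: a model is trained and queried entirely in SQL, so an analyst never leaves the query console. The dataset, table, and column names are hypothetical; the CREATE MODEL syntax follows Google's published examples from the 2018 beta.

```python
# Hedged sketch of the BigQuery ML pattern: train and use a model in
# SQL. Dataset/table/column names are hypothetical; the syntax follows
# Google's 2018 beta examples. Requires the google-cloud-bigquery
# package and GCP credentials.
from google.cloud import bigquery

client = bigquery.Client()

train_sql = """
CREATE OR REPLACE MODEL `mydataset.purchase_model`
OPTIONS (model_type = 'logistic_reg') AS
SELECT is_purchase AS label, pageviews, time_on_site
FROM `mydataset.sessions`
"""
client.query(train_sql).result()  # training runs inside BigQuery itself

predict_sql = """
SELECT predicted_label, pageviews
FROM ML.PREDICT(MODEL `mydataset.purchase_model`,
                (SELECT pageviews, time_on_site
                 FROM `mydataset.new_sessions`))
"""
for row in client.query(predict_sql).result():
    print(row.predicted_label, row.pageviews)
```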

Improving the quality of the data acquired and stored in Elasticsearch from search results will be key to improving the user experience. I also spoke with Kearns, who delivered the keynote speech with a sharp focus on "scale, speed, and relevance" for improving search results. Better search data can be used to optimize the machine learning applied to that data. Given that Elastic has focused the Machine Learning extension on anomaly detection across time series data – data Elasticsearch specializes in collecting, such as log data – this can support more accurate data analysis and better business results for data-driven organizations.

Overall, it was intriguing to see how machine learning is being incorporated into IT solutions that aren't directly supporting data science environments. Enabling the growth of machine learning tactics effectively spreads the use of data across an organization, bringing companies closer to the advantages of the data-driven ideal. Elastic's Machine Learning capability potentially opens up a specific class of machine learning to a broader spectrum of Elastic users without requiring them to acquire coding and statistics backgrounds; this positions Elastic as a provider of a specific type of machine learning service today, and makes it more plausible to consider them a broader machine learning provider in the future.