Posted on

Why Infor’s Talent Science Solution Enhances Recruitment and Retention of Top-Performers: A Psychological and Neuroscience Evaluation Overview

Amalgam Insights recently attended Infor’s Innovation Summit in New York City. Senior executives outlined a number of innovations being incorporated into Infor’s suite of products, including ERP, Supply Chain, CRM, and others. In addition to the growing influence of the Cloud, Coleman AI, and Birst analytics, the executives described an impressive array of broad-based solutions.

In my role as a Psychological and Neuroscience Analyst at Amalgam Insights, my focus is on Infor’s Human Capital Management (HCM) solution. I have followed the solution for quite some time, with a particular interest in Infor’s Talent Science product. Dr. Jill Strange, Vice President, HCM Science Applications, provided an update, and Tricia Engel, Director of Talent Acquisition at Wyndham Destinations, outlined a specific use case and customer success story.

Infor’s Talent Science Solution

Infor’s Talent Science solution uses talent assessment and analytics to address the recruiting needs of the modern workplace. A one-size-fits-all approach to recruitment leads to weak employee performance and engagement, poor job satisfaction, and high turnover rates. What is needed are approaches that blend psychology and data science to more accurately assess, place, and retain high-performing talent. Talent Science, that combination of psychology and data science, can achieve these aims; when incorporated into HCM platforms, it can offer actionable insights that increase engagement, performance, satisfaction, and retention.

Infor’s Talent Science team works with the client to identify and quantify the Key Performance Indicators (KPIs) associated with success in a particular job. Incumbent employees are administered a single, standardized assessment tool that yields a custom performance profile that captures cognitive, behavioral, and cultural components of the employee in the workplace. Infor’s Talent Science team then builds models that predict the KPIs for incumbent employees from their custom performance profiles. New job applicants are administered the same talent assessment tool, a custom performance profile is constructed for each, and a measure of “fit” is derived that reflects the overlap between the candidate’s performance profile and the ideal profile for the relevant job. Interestingly, but not surprisingly, in many cases the applicant is found to be a better fit for a job that they did not apply for, but for which the company has a need.
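Infor does not disclose the mechanics of its fit calculation, but the general idea of scoring a candidate profile against a benchmark built from high-performing incumbents can be illustrated with a minimal sketch. The dimension names, numbers, and use of cosine similarity below are illustrative assumptions, not Infor’s method.

```python
import numpy as np

# Illustrative only: the dimensions, scores, and similarity measure are
# assumptions standing in for Infor's proprietary performance profiles.
DIMENSIONS = ["cognitive", "behavioral", "cultural_emotional"]

def fit_score(candidate: dict, benchmark: dict) -> float:
    """Return a 0-100 score for the overlap between a candidate and a benchmark profile."""
    c = np.array([candidate[d] for d in DIMENSIONS], dtype=float)
    b = np.array([benchmark[d] for d in DIMENSIONS], dtype=float)
    cosine = float(c @ b / (np.linalg.norm(c) * np.linalg.norm(b)))
    return round(100 * cosine, 1)

# Benchmark profile derived from high-performing incumbents (toy numbers).
sales_benchmark = {"cognitive": 0.7, "behavioral": 0.9, "cultural_emotional": 0.8}
applicant = {"cognitive": 0.8, "behavioral": 0.5, "cultural_emotional": 0.9}

print(fit_score(applicant, sales_benchmark))  # roughly 95 on this toy data
```

In practice, the same score can be computed against the benchmark for every open role, which is how a candidate can surface as a stronger fit for a job they did not apply for.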

The Psychology and Neuroscience of Workplace Performance

Workplace performance is directly affected by the psychological and brain processing of the employee. Thus, it is important to understand the nature of this processing. The brain comprises at least three learning and performance systems: the cognitive, the behavioral, and the cultural/emotional (as outlined in the schematic below).

  • The cognitive system relies on the prefrontal cortex, is limited by working memory and attentional processes, and is the primary system in the brain that drives workplace performance centered around utilizing information and fact-based knowledge. These are often referred to as hard skills.
  • The behavioral system relies on the striatum and is the primary system in the brain that drives workplace behaviors. The detailed processing characteristics of this system are beyond the scope of this report, but suffice it to say that workplace behavior is distinct from cognition and information. It is one thing to know “what” to do or to have information and knowledge, it is another thing (and driven by a different system in the brain) to know “how” to do it and to act appropriately on information and knowledge. These behavioral skills are often referred to as “soft skills”, and they are difficult to assess.
  • The cultural/emotional system relies on the amygdala and other limbic structures and is the primary driver of what makes each of us unique. Social, emotional and personality characteristics are directly relevant here. The detailed processing characteristics of this system are less well understood than the cognitive and behavioral skills learning systems, but cultural and emotional processes strongly affect both cognitive and behavioral performance.

Figure: Cognitive, Behavioral and Cultural/Emotional Systems in the Brain

Infor’s Talent Science Performance Profiles Align with the Psychology and Neuroscience of Workplace Performance

Infor’s talent assessment tool yields a custom performance profile for each individual that characterizes and quantifies their performance along these three factors. Therefore, the solution quantifies hard- and soft-skill abilities, as well as cultural/emotional abilities, all of which directly affect workplace performance. This is advantageous from a Human Capital Management perspective because it means that Infor’s solution has the potential to provide actionable insights about all of the important aspects of an employee’s performance through the full employee lifecycle, from hire to retire. These insights could provide employers with the information that they need to increase employee performance, engagement, satisfaction, and retention – the goal of HR departments everywhere.

Conclusions and Recommendations

Infor’s Talent Science solution uses talent assessment and analytics to drive change through behavioral analysis. It is grounded in the psychology and neuroscience of learning and performance. Thirty years of foundational research suggests that this is the optimal approach to net long-term gains.

By capturing and quantifying the talent profile associated with each workforce position, Infor provides its clients with the ability to make fast and accurate hiring decisions and increase the long-term success of any new hire.

Posted on

IBM and Cloudera Join Forces to Expand Data Science Access

On June 21, IBM and Cloudera jointly announced that they were expanding their existing relationship to bring more advanced data science solutions to Hadoop users by developing a shared go-to-market program. IBM will now resell Cloudera Enterprise Data Hub and Cloudera DataFlow, while Cloudera will resell IBM Watson Studio and IBM BigSQL.

In bulking up their joint go-to-market programs, IBM and Cloudera are reaffirming their pre-existing partnership to amplify each other’s capabilities, particularly in heavy data workflows. Cloudera Hadoop is a common enterprise data source, but Cloudera’s existing base of data science users is small despite the growing demand for data science options, and its Data Science Workbench is coder-centric. Being able to offer the more user-friendly IBM Watson Studio gives Cloudera’s existing data customers a convenient option for doing data science without necessarily needing to know Python, R, or Scala. IBM, in turn, can now sell Watson Studio, BigSQL, and IBM consulting and services more deeply into Cloudera’s customer base, broadening its ability to upsell additional offerings.

Because IBM and Cloudera each hold significant amounts of on-prem data, it’s interesting to look at this partnership in terms of the 800-pound gorilla of cloud data: AWS. IBM, Cloudera, and Amazon are all leaders when it comes to the sheer amount of data each holds. But Amazon is the biggest cloud provider on the planet; it holds the plurality of the cloud hosting market, while most of IBM’s and Cloudera’s customers’ data is on-prem. Because that data is hosted on-prem, it’s data Amazon doesn’t have access to; IBM and Cloudera are teaming up to sell their own data science and machine learning capabilities on that on-prem data, where there may be security or policy reasons to keep it out of the cloud.

A key differentiator in comparing AWS with the IBM-Cloudera partnership lies in AWS’ breadth of machine learning offerings. In addition to having a general-purpose data science and machine learning platform in SageMaker, AWS also offers task-specific tools like Amazon Personalize and Textract that address precise use cases for a number of Amazon customers who don’t need a full-blown data science platform. IBM has some APIs for visual recognition, natural language classification, and decision optimization, but AWS has developed its APIs into higher-level services. Cloudera customers building custom machine learning models may find that IBM’s Watson Studio suits their needs. However, IBM lacks the variety of off-the-shelf machine learning applications that AWS provides; IBM supplies its machine learning capabilities as individual APIs that an application development team will need to fit together to create their own in-house apps.

Recommendations

  • For Cloudera customers looking to do broad data science, IBM Watson Studio is now an option. This offers Cloudera customers an alternative to Data Science Workbench; in particular, an option that has a more visual interface, with more drag-and-drop capabilities and some level of automation, rather than a more code-centric environment.
  • IBM customers can now choose Cloudera Enterprise Data Hub for Hadoop. IBM and Hortonworks had a long-term partnership; IBM supporting and cross-selling Enterprise Data Hub demonstrates that IBM will continue to sell enterprise Hadoop in some flavor.
Posted on

The Second Half of 2019 Has Already Begun! Amalgam Insights Highlights

We’ve reached July 1, 2019. It has been an amazing first half of the year, both for Amalgam Insights and the tech world in general! From our perspective, it has been a good half as we’ve written 53 blogs, published 13 reports, and grown bookings 66% over the second half of 2018. Special thanks to our corporate clients for the financial support that allows us to continue being a voice for changing the future of technology.

And this gives you an idea of the companies that align with our perspective of technology being more global, usable, efficient, and financially sustainable in the here and now.

It is easy to be overwhelmed by the sheer hype of tech news cycles, but the past few months have been part of a fundamental shift in the world of IT that seems to happen once a decade. At the same time, our audience has shown broad interest in topics across data management, cloud management, the future of finance, the neuroscience of learning, and enterprise-grade data science over the last six months. Here’s a quick summary of the topics that Amalgam’s audience found most compelling in the first half of this year.

Amalgam Insights’ Top 10 for the First Half of 2019

  1. The Death of Big Data and the Emergence of the Multi-Cloud Era – Author: Hyoun Park

    Just as we saw the emergence of the Internet as a powerful business enabler around 2000 and the rise of Big Data and Analytics in 2010, we now face the emergence of Multi-cloud replacing CapEx as a fundamental basis for tech this year as we enter the 2020s. Based on that, it was no surprise that The Death of Big Data and the Emergence of the Multi-Cloud Era has been the most popular piece on Amalgam Insights in the first half of 2019.

  2. Docker Enterprise 3.0 is the Docker We’ve Been Waiting For – Author: Tom Petrocelli

    DevOps Research Fellow Tom Petrocelli describes in this research piece how Docker has been moving away from the commodity business of container infrastructure and reinventing itself as a developer tools company. It provides context to the DevOps community on why Docker Enterprise 3.0 addresses one of the largest problems in DevOps today.

  3. Microsoft “Early Adopts” New ASC 606 Revenue Recognition Standard – Author: Hyoun Park

    This piece continues to provide guidance to companies on how businesses prepared for ASC 606 accounting and has been a starting point for some of you to ask us about the likes of Zuora, Aria, Oracle BRM, SAP Hybris Billing, Sage Intacct, FinancialForce, Flexera Software Monetization Platform, Gemalto On-Demand Subscription Manager, and other subscription business platforms.

  4. Analyst Insight: 7 Key Technology Expense Management Predictions for 2019 – Author: Hyoun Park

    This report, published at the beginning of this year, provides 7 predictions to help financially-minded technology managers gain 30% savings on operational cloud, network, and telecom expenses while gaining the visibility and governance needed to responsibly manage digital change and technology as a competitive advantage. This report, which comes with free analyst inquiry time, served as a strategic kickstart for enterprise IT and procurement teams in 2019.

  5. Technical Guide: A Service Mesh Primer – Author: Tom Petrocelli

    This groundbreaking Amalgam Insights Technical Guide on the Service Mesh provides enterprise architects, CTOs, and developer teams with the guidance they need to understand the microservices architecture, service mesh architecture, and OSI model context necessary to conceptualize Service Mesh.

  6. 2019 Top 6 Trends in Learning & Development and Talent Management – Author: W. Todd Maddox Ph.D. 

    Our resident Neuroscientist of Technology, Todd Maddox, provided Chief Learning Officers and enterprise training organizations with a head start on 2019 with this overview of the six major trends that Amalgam Insights’ research suggests will dominate the Learning & Development and Talent Management landscape, including: the Impact of Psychology and Brain Science, AI and machine learning as innovation drivers, the Emphasis on Empathy, the need for Scenario-enhanced Microlearning, best practices for using immersive and augmented reality, and the Power of Personality.

  7. SmartList Market Guide on Service Mesh and Building Out Microservices Networking – Author: Tom Petrocelli

    This piece, a companion to the Technical Guide for Service Mesh, is a comprehensive guide to the roles that top technology vendors play in the world of microservices and service mesh in 2019 including their roles in Istio vs. Linkerd, modern microservice architecture considerations, and the three segments of the service mesh market: Control Plane, Data Plane, and Value-Add.

  8. Amazon Aurora Serverless vs. Oracle Autonomous Database: A Microcosm for The Future of IT – Author: Hyoun Park

    This research document continues to provide guidance on the fundamental choice that IT departments need to make in the world of cloud. Ease-of-Use vs. Granular Management continues to be a key IT struggle as the need for business agility creates conflict between the need for speed of implementation and management vs. the demand for individualized and customized business model construction.

  9. Amazon Expands Toolkit of Machine Learning Services at AWS re:Invent – Author: Lynne Baer

    Interest in data science and machine learning analyst Lynne Baer’s piece on AWS re:Invent was driven by Amazon Textract, a service that extracts text and data from scanned documents without requiring manual data entry or custom code. The promise of Textract in providing Role-based Expert Enhancement by automating manual work continues to be of interest to our enterprise IT audience.

  10. The CLAIRE-ion Call at Informatica World 2019: AI Needs Managed Data and Data Management Needs AI – Author: Lynne Baer

    This research note reflects the synergy between modern data management strategies and the evolution of artificial intelligence, along with Amalgam Insights’ recommendations for the data managers, executives, and enterprises in the Informatica ecosystem trying to figure out what to do next in preparing for the exponentially expanding challenge of billions of daily interactions, billions of daily searches, and billions of devices and sensors, combined with a shortage of over 800,000 data science professionals.

As you prepare for the second half of 2019, please keep an eye out for our upcoming research and review any of our top pieces that have been influencing technology decisions for our subscribers and advisory clients over the first part of 2019. If you are seeking guidance on the Finance of Tech, the Neuroscience of Tech, or the current state of ITOps and DevOps, please send us a note at info@amalgaminsights.com to set up a discussion. We look forward to supporting a better future for managing technology with you.

Posted on

3 Big 2019 Trends and 4 Strategic Tips for Managing the Transforming Cost of Technology

Amalgam Insights estimates that the total technology spend formally and centrally managed by enterprises with over $1 billion in revenue from telecom, network, mobility, Software-as-a-Service (SaaS), and Infrastructure-as-a-Service will double between June 2019 and the end of 2021, driven by the massive growth of cloud computing and the need to manage a variety of “shadow IT” costs that grow large enough to require formal management. By “formally and centrally managed,” Amalgam Insights assumes full visibility of inventory, contracts, billing, and service orders across all vendors with active usage and supplier optimization efforts.

In 2019, Amalgam Insights notes several key trends in the world of technology expense management that IT organizations should be aware of.

First, every IT expense management solution is increasingly focused on cloud-based expenses, either in terms of Software as a Service or Infrastructure as a Service. Established Software Asset Management (SAM) companies, including Aspera, Flexera, ServiceNow, and Snow Software, are working on their SaaS expense capabilities, and a variety of standalone vendors, including Alpin, Binadox, Cleanshelf, Intello, Torii, and Zylo, are emerging. Amalgam Insights is planning a SmartList for November 2019 to focus on key vendors to manage SaaS across the SaaS expense, SAM, and Technology Expense markets.

On the IaaS side, every major global Technology Expense Management solution has launched IaaS management capabilities, also known as FinOps or Cloud FinOps, including Asignet, Calero, Cass Information Systems, Cimpl, Dimension Data, MDSL, Sakon, and Tangoe. In addition, this market has been an area of rapid acquisition over the past couple of years including Microsoft’s acquisition of Cloudyn, Apptio’s acquisition of Cloudability (which calls this practice “FinOps”), and VMware’s acquisition of CloudHealth Technologies. And there are still standalone players such as Cloudcheckr in this space as well. This crowded market seeking to manage the next $100 billion of public cloud spend represents an interesting set of choices for IT departments in choosing how to aggregate and manage IT spend.

(As an aside, Amalgam Insights finds the use of the term “FinOps” by Apptio and Cloudability to be an interesting way to coordinate multiple departments and provide guidance on how to manage cloud expenses. At the same time, FinOps seems to be reinventing the wheel to some extent by rebuilding a set of practices and cross-departmental teams that have already been managing telecom expenses for a number of years. Amalgam Insights is quite interested in seeing how this duplication of effort within IT departments will sort itself out, either through the establishment of separate Cloud FinOps departments, the integration of Cloud FinOps and “Telecom FinOps” a.k.a. Telecom Expense Management, or the integration of both cloud and telecom into a larger IT expense or finance role. This is an interesting transitional period for IT expense as more spend moves to subscription, usage, feature, user, department, and project-based spend and chargeback models.)

A third key trend is the move to Europe. European IT is being targeted as an area that has traditionally been informally managed or managed in geographic silos that prevent strong global management and alignment with strategic enterprise efforts. To address this problem, there has been a variety of acquisitions and office launches across Europe by the likes of Calero, MDSL, Tangoe, Flexera, Snow Software, Cloudcheckr, ServiceNow, VMware, and others. Amalgam Insights notes that European IT management challenges have been poorly supported in the past by global vendors that have not adequately accounted for the differences in managing data, connectivity, and compliance in each European country and that have lacked the geographic footprint to support services. In light of this, Amalgam Insights has been tracking European IT management successes and provides guidance on this topic for end user, investor, and vendor advisory clients.

To prepare for this evolution in technology expense management, Amalgam Insights provides the following guidance for Chief Information Officers, Chief Procurement Officers, Chief Accounting Officers, and related IT procurement, finance, and expense managers to prepare for the second half of 2019 and beyond based on prior guidance and research.

$100K and 30% are key benchmarks for specialized IT spend categories. Once an IT spend management category exceeds $100,000 per month, organizations start having the potential to save at least one employee’s worth of payroll through optimization by pursuing the 30% savings that typically exist in an environment that has not been formally managed or audited over the last five years. At this point, your company should start looking for a dedicated solution for expense management, whether it be manual support from an in-house employee with telecom experience, a software platform to support management, or managed services to support contract, invoice, inventory, and usage management. These rules of thumb are especially true in SaaS and IaaS management, which are currently rife with poor governance, duplicate spending, and unmonitored usage patterns associated with decentralized cloud computing purchases. The quick arithmetic behind this rule of thumb is sketched below.
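As a back-of-the-envelope illustration of that benchmark (the fully loaded payroll figure below is an assumed number for illustration, not an Amalgam benchmark):

```python
# Rough arithmetic behind the $100K-per-month / 30% rule of thumb.
monthly_spend = 100_000                       # threshold monthly spend in the category ($)
annual_spend = monthly_spend * 12             # $1,200,000 per year
typical_savings_rate = 0.30                   # savings commonly found in unmanaged spend
potential_savings = annual_spend * typical_savings_rate   # $360,000 per year

fully_loaded_employee_cost = 150_000          # assumed fully loaded cost of one dedicated hire
print(f"Potential annual savings: ${potential_savings:,.0f}")
print(potential_savings >= fully_loaded_employee_cost)     # True: savings cover the hire
```

At or above this threshold, the recoverable savings comfortably exceed the cost of dedicating a person, a platform, or a managed service to the category.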

Enterprise buyers at Global 2000 companies should review their current strategy for IT spend categories with the goal of supporting all cloud, software, hardware, network, and mobility spend from a usage and subscription-based perspective. IT is moving to a subscription and usage-based paradigm that started with enterprise telecom and then evolved through the enterprise adoption of cloud computing. For firms currently considering new or replacement IT expense vendors, due diligence in understanding prospective vendors’ roadmaps and experience with new technology categories is vital for futureproofing this investment. This is especially true in an “As-a-Service” world where all aspects of IT are increasingly being sourced and billed in a telecom-like fashion, which shifts these vendor relationships from traditional asset-based approaches to subscription and relationship-based approaches. Look for depth in the following functional areas: inventory, invoice line-items, service orders, disputes, optimization, governance, and security.

Customer satisfaction, geographic expertise, vertical expertise, and depth of managed and professional services are key differentiators. Amalgam recommends that companies looking at TEM solutions focus not only on the technical aspects of invoice and inventory management, but also on the alignment between the vendor’s expertise and the potential buyer’s geographic footprint and business model. This alignment is often more important than the extremely granular Requests for Proposals that Amalgam has seen in this industry. In 2019, there have been a number of specific trends in this regard, such as the telecom expense management market’s push towards aggregating European spend among market leaders, a focus on providing additional managed services such as security and managed mobility, and vertical-specific cost management approaches that also include specific asset management and IT management strategies. Customer retention and satisfaction are also key metrics. For mature solutions, it is not uncommon to see annual customer retention metrics above 95% and a year-over-year increase in wallet share as new cloud and app spend is brought into a solution.

Rather than ask hundreds of questions where there is little to no differentiation, such as how invoices are processed and whether a specific type of inventory can be stored within the solution, focus on how the vendor supports the usage and management of value-added technology. The goal of IT is not to restrict the use of helpful technologies, but to increase the use of productivity-driving and outcome-improving technologies and then to provide that optimal level of utilization in a cost-effective manner. Spend management vendors that can help identify technology associated with revenue growth or customer satisfaction provide a competitive edge in understanding the cost basis of strategic IT. By taking these steps, companies can start to better understand how to manage IT spend more practically.

(Note: This piece is an excerpt from Amalgam Insights’ upcoming SmartList for Technology Expense Management Market Leaders scheduled to publish in August 2019. If you would like more information about this topic, if you are considering a net-new or replacement purchase for IT expense management, or are interested in the upcoming report, please feel free to contact us at info@Amalgaminsights.com)

Posted on

Strategic Presentation for the Amalgam Insights Community – 5G Context for the Strategic Enterprise

I’ve recently had the opportunity to present on the present and future of 5G as a business enabler. Based on the past 20 years I’ve spent around the carrier, reseller, IT management, and industry analyst sides of the business, I’m looking forward to sharing part of my presentation with the Amalgam Insights audience and showing why 5G introduces a new Age of Mobility to follow the Age of Voice, the Age of Text, the Age of Apps, and the Age of Streaming!

If you’re interested in further discussing any aspect of this deck or the repercussions of 5G for your business, please feel free to get in contact with us at info@amalgaminsights.com! Please click on the link below to download the slide deck and learn more about what 5G practically means for the world of business over the next year or two.

Amalgam Insights – 5G Context for the Strategic Enterprise

Posted on

In Case You Missed It: Why Augmented Reality is an Effective Tool in Manufacturing: A Brain Science Analysis

As you may know, PTC LiveWorx 2019 featured my live presentation “Why Augmented Reality is an Effective Tool in Manufacturing: A Brain Science Analysis“.

For those of you who couldn’t make it to LiveWorx or to the presentation, we are providing the slides so you can catch up! Simply click on the title slide or link below to download the slides.

Why Augmented Reality is Effective in Manufacturing

If you’d like to discuss this presentation in greater detail or have me speak to your organization on this topic, please follow up at info@amalgaminsights.com.

Context:

This presentation focuses on a brain science evaluation of augmented reality tools in manufacturing and their roles in:

    • reducing the cognitive load on the learner
    • providing the opportunity for limitless practice, and
    • accelerating learning and retention by broadly recruiting multiple learning systems in the brain in synchrony.

Specifically, AR tools simultaneously recruit cognitive, behavioral, and experiential learning systems in the brain, thus creating multiple distinct but highly interconnected memory traces.

In comparison, traditional tools focus almost exclusively on the cognitive skills learning system. In this presentation, I examine a number of use cases for AR including supply chain, environmental health and safety (EHS), product development, equipment operation, and field service. Finally, I summarize the brain science underpinnings for the effectiveness of AR tools in each use case.

Please take a look and follow up with any questions you may have!

Why Augmented Reality is Effective in Manufacturing

Posted on

The Death of Big Data and the Emergence of the Multi-Cloud Era

RIP Era of Big Data
April 1, 2006 – June 5, 2019

The Era of Big Data passed away on June 5, 2019 with the announcement of Tom Reilly’s upcoming resignation from Cloudera and the subsequent drop in the company’s market capitalization. Coupled with MapR’s recent announcement that it intends to shut down in late June unless it can find a buyer to continue operations, June of 2019 accentuated that the initial Era of Hadoop-driven Big Data has come to an end. Big Data will be remembered for its role in enabling the beginning of social media dominance, its role in fundamentally changing the mindset of enterprises in working with multiple orders of magnitude increases in data volume, and its role in clarifying the value of analytic data, data quality, and data governance for the ongoing valuation of data as an enterprise asset.

As I give a eulogy of sorts to the Era of Big Data, I do want to emphasize that Big Data technologies are not actually “dead,” but that the initial generation of Hadoop-based Big Data has reached a point of maturity where its role in enterprise data is established. Big Data is no longer part of the breathless hype cycle of infinite growth, but is now an established technology.
Continue reading The Death of Big Data and the Emergence of the Multi-Cloud Era

Posted on

Inside our Slack Channel: A Conversation on Salesforce acquiring Tableau

As you may know, analysts typically only have the time to share a small fraction of the information that they have on any topic at any given time, with the majority of our time spent speaking with clients, technologists, and each other.

When Salesforce announced their acquisition of Tableau Monday morning, we at Amalgam Insights obviously started talking to each other about what this meant. Below is an edited excerpt of some of the topics we were going through as I was preparing for PTC LiveWorx in Boston, Data Science analyst Lynne Baer was in Nashville for Alteryx, and DevOps Research Fellow Tom Petrocelli was holding down the fort in Buffalo after several weeks of travel. We hope you enjoy a quick look behind the scenes at how we started informally thinking about this in the first hour or so after the announcement.

When the Salesforce-Tableau topic came up, Tom Petrocelli kicked it off.
Continue reading Inside our Slack Channel: A Conversation on Salesforce acquiring Tableau

Posted on

Market Milestone – Google to Buy Looker to Transform Business Analytics

Key Stakeholders:

Chief Information Officers, Chief Technical Officers, Chief Digital Officers, Chief Analytics Officers, Data Monetization Directors and Managers, Analytics Directors and Managers, Data Management Directors and Managers, Enterprise Architects

Why It Matters:

Google’s proposed $2.6 billion acquisition of Looker provides Google with a core data engagement, service, and application environment to support Google Cloud Platform. This represents an impressive exit for Looker, which was expected to IPO after its December 2018 Series E round. This report covers key considerations for Looker customers, GCP customers, and enterprises seeking to manage data and analytics in a multi-cloud or hybrid cloud environment.

Top Takeaway:

Google Cloud Platform intends to acquire a Best-in-Breed platform for cloud analytics, embedded BI, and native analytic applications in Looker. By filling this need for Google customers, GCP has strengthened its positioning for enterprise cloud customers at a time when Amalgam Insights expects rapid and substantial growth of 25%+ CAGR (Compound Annual Growth Rate) across cloud markets for the next few years. This acquisition will help Google to remain a significant and substantial player as an enterprise cloud provider and demonstrates the latitude that Google Cloud CEO Thomas Kurian has in acquiring key components to position GCP for future growth.

To read the rest of this piece, please visit Looker, which has acquired a commercial license for this research.
Continue reading Market Milestone – Google to Buy Looker to Transform Business Analytics

Posted on

Data Science and Machine Learning News Roundup, May 2019

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Amazon, Anaconda, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, Elastic, Google, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta, TROVE.

Domino Data Lab Champions Expert Data Scientists While Outpacing Walled-Garden Data Science Platforms

Domino announced key updates to its data science platform at Rev 2, its annual data science leader summit. For data science managers, the new Control Center provides information on what an organization’s data science team members are doing, helping managers address any blocking issues and prioritize projects appropriately. The Experiment Manager’s new Activity Feed supplies data scientists with better organizational and tracking capabilities for their experiments. The Compute Grid and Compute Engine, built on Kubernetes, will make it easier for IT teams to install and administer Domino, even in complex hybrid cloud environments. Finally, the beta Domino Community Forum will allow Domino users to share best practices with each other, as well as submit feature requests and feedback to Domino directly. With governance becoming a top priority across data science practices, Domino’s platform improvements around monitoring and making experiments repeatable will make this important capability easier for its users to deliver.

Informatica Unveils AI-Powered Product Innovations and Strengthens Industry Partnerships at Informatica World 2019

At Informatica World, Informatica publicized a number of key partnerships, both new and enhanced. Most of these partnerships involve additional support for cloud services. This includes storage, both data warehouses (Amazon Redshift) and data lakes (Azure, Databricks). Informatica also announced a new Tableau Dashboard Extension that enables access to Informatica Enterprise Data Catalog from within the Tableau platform. Finally, Informatica and Google Cloud are broadening their existing partnership by making Intelligent Cloud Services available on Google Cloud Platform and providing increased support for Google BigQuery and Google Cloud Dataproc within Informatica. Amalgam Insights attended Informatica World and provides a deeper assessment of Informatica’s partnerships, as well as CLAIRE-ity on Informatica’s AI initiatives.

Microsoft delivers new advancements in Azure from cloud to edge ahead of Microsoft Build conference

Microsoft announced a number of new Azure Machine Learning and Azure AI capabilities. Azure Machine Learning has been integrated with Azure DevOps to provide “MLOps” capabilities that enable reproducibility, auditability, and automation of the full machine learning lifecycle. This marks a notable increase in making the machine learning model process more governable and compliant with regulatory needs. Azure Machine Learning also has a new visual drag-and-drop interface to facilitate codeless machine learning model creation, making the process of building machine learning models more user-friendly. On the Azure AI side, Azure Cognitive Services launched Personalizer, which provides users with specific recommendations to inform their decision-making process. Personalizer is part of the new “Decisions” category within Azure Cognitive Services; other Decisions services include Content Moderator, an API to assist in moderation and reviewing of text, images, and videos; and Anomaly Detector, an API that ingests time-series data and chooses an appropriate anomaly detection model for that data. Finally, Microsoft added a “cognitive search” capability to Azure Search, which allows customers to apply Cognitive Services algorithms to search results of their structured and unstructured content.
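For readers unfamiliar with how these Cognitive Services APIs are consumed, the sketch below shows the general pattern of posting a time series to an anomaly detection endpoint. The resource URL, subscription key, payload fields, and response fields are placeholders and assumptions modeled on Azure’s documented REST style, not values verified from this announcement.

```python
import requests

# Assumed endpoint and field names for illustration; substitute your own
# Azure resource URL and subscription key before running.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/anomalydetector/v1.0/timeseries/entire/detect"
API_KEY = "<your-subscription-key>"

payload = {
    "granularity": "daily",
    "series": [
        {"timestamp": "2019-05-01T00:00:00Z", "value": 120.0},
        {"timestamp": "2019-05-02T00:00:00Z", "value": 118.5},
        {"timestamp": "2019-05-03T00:00:00Z", "value": 430.0},  # suspicious spike
        {"timestamp": "2019-05-04T00:00:00Z", "value": 121.3},
    ],
}

response = requests.post(
    ENDPOINT,
    headers={"Ocp-Apim-Subscription-Key": API_KEY, "Content-Type": "application/json"},
    json=payload,
)
response.raise_for_status()
result = response.json()

# The service selects an appropriate detection model and flags anomalous points,
# for example via an array of booleans aligned with the submitted series.
print(result.get("isAnomaly"))
```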

Microsoft and General Assembly launch partnership to close the global AI skills gap

Microsoft also announced a partnership with General Assembly to address the dearth of qualified data workers, with the goal of training 15,000 workers by 2022 for various artificial intelligence and machine learning roles. The two companies will found an AI Standards Board to create standards and credentials for artificial intelligence skills. In addition, Microsoft and General Assembly will develop scalable training solutions for Microsoft customers, and establish an AI Talent network to connect qualified candidates to AI jobs. This continues the trend of major enterprises building internal training programs to bridge the data skills gap.