
The Death of Big Data and the Emergence of the Multi-Cloud Era

RIP Era of Big Data
April 1, 2006 – June 5, 2019

The Era of Big Data passed away on June 5, 2019 with the announcement of Tom Reilly’s upcoming resignation from Cloudera and the subsequent drop in the company’s market capitalization. Coupled with MapR’s recent announcement that it intends to shut down in late June unless it can find a buyer to continue operations, June 2019 made clear that the initial era of Hadoop-driven Big Data has come to an end. Big Data will be remembered for its role in enabling the rise of social media dominance, its role in fundamentally changing how enterprises think about working with multiple orders of magnitude more data, and its role in clarifying the value of analytic data, data quality, and data governance for the ongoing valuation of data as an enterprise asset.

As I give a eulogy of sorts to the Era of Big Data, I do want to emphasize that Big Data technologies are not actually “dead,” but that the initial generation of Hadoop-based Big Data has reached a point of maturity where its role in enterprise data is established. Big Data is no longer part of the breathless hype cycle of infinite growth, but is now an established technology.


Inside our Slack Channel: A Conversation on Salesforce acquiring Tableau

As you may know, analysts typically have time to share only a small fraction of the information they hold on any topic at any given time; the majority of our time is spent speaking with clients, technologists, and each other.

When Salesforce announced its acquisition of Tableau Monday morning, we at Amalgam Insights obviously started talking to each other about what this meant. Below is an edited excerpt of some of the topics we were going through as I was preparing for PTC LiveWorx in Boston, Data Science analyst Lynne Baer was in Nashville for Alteryx, and DevOps Research Fellow Tom Petrocelli was holding down the fort in Buffalo after several weeks of travel. We hope you enjoy a quick look behind the scenes at how we started informally thinking about this in the first hour or so after the announcement.

When the Salesforce-Tableau topic came up, Tom Petrocelli kicked it off.


Market Milestone – Google to Buy Looker to Transform Business Analytics

Key Stakeholders:

Chief Information Officers, Chief Technology Officers, Chief Digital Officers, Chief Analytics Officers, Data Monetization Directors and Managers, Analytics Directors and Managers, Data Management Directors and Managers, Enterprise Architects

Why It Matters:

Google’s proposed $2.6 billion acquisition of Looker provides Google with a core data engagement, service, and application environment to support Google Cloud Platform. This represents an impressive exit for Looker, which was expected to IPO after its December 2018 Series E round. This report covers key considerations for Looker customers, GCP customers, and enterprises seeking to manage data and analytics in a multi-cloud or hybrid cloud environment.

Top Takeaway:

Google Cloud Platform intends to acquire, in Looker, a best-of-breed platform for cloud analytics, embedded BI, and native analytic applications. By filling this need for Google customers, GCP has strengthened its positioning with enterprise cloud customers at a time when Amalgam Insights expects rapid and substantial growth of 25%+ CAGR (Compound Annual Growth Rate) across cloud markets for the next few years. This acquisition will help Google remain a significant and substantial player as an enterprise cloud provider and demonstrates the latitude that Google Cloud CEO Thomas Kurian has in acquiring key components to position GCP for future growth.

To read the rest of this piece, please visit Looker, which has acquired a commercial license for this research.


Data Science and Machine Learning News Roundup, May 2019

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Amazon, Anaconda, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, Elastic, Google, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta, TROVE.

Domino Data Lab Champions Expert Data Scientists While Outpacing Walled-Garden Data Science Platforms

Domino announced key updates to its data science platform at Rev 2, its annual data science leader summit. For data science managers, the new Control Center provides information on what an organization’s data science team members are doing, helping managers address any blocking issues and prioritize projects appropriately. The Experiment Manager’s new Activity Feed supplies data scientists with better organizational and tracking capabilities for their experiments. The Compute Grid and Compute Engine, built on Kubernetes, will make it easier for IT teams to install and administer Domino, even in complex hybrid cloud environments. Finally, the beta Domino Community Forum will allow Domino users to share best practices with each other, as well as submit feature requests and feedback to Domino directly. With governance becoming a top priority across data science practices, Domino’s platform improvements around monitoring and experiment repeatability will make governance easier for its users.

Informatica Unveils AI-Powered Product Innovations and Strengthens Industry Partnerships at Informatica World 2019

At Informatica World, Informatica publicized a number of key partnerships, both new and enhanced. Most of these partnerships involve additional support for cloud services, including storage, both data warehouses (Amazon Redshift) and data lakes (Azure, Databricks). Informatica also announced a new Tableau Dashboard Extension that enables access to the Informatica Enterprise Data Catalog from within the Tableau platform. Finally, Informatica and Google Cloud are broadening their existing partnership by making Intelligent Cloud Services available on Google Cloud Platform and providing increased support for Google BigQuery and Google Cloud Dataproc within Informatica. Amalgam Insights attended Informatica World and provides a deeper assessment of Informatica’s partnerships, as well as CLAIRE-ity on Informatica’s AI initiatives.

Microsoft delivers new advancements in Azure from cloud to edge ahead of Microsoft Build conference

Microsoft announced a number of new Azure Machine Learning and Azure AI capabilities. Azure Machine Learning has been integrated with Azure DevOps to provide “MLOps” capabilities that enable reproducibility, auditability, and automation of the full machine learning lifecycle. This marks a notable advance in making the machine learning model process more governable and compliant with regulatory needs. Azure Machine Learning also has a new visual drag-and-drop interface to facilitate codeless machine learning model creation, making the process of building machine learning models more user-friendly. On the Azure AI side, Azure Cognitive Services launched Personalizer, which provides users with specific recommendations to inform their decision-making process. Personalizer is part of the new “Decisions” category within Azure Cognitive Services; other Decisions services include Content Moderator, an API to assist in moderating and reviewing text, images, and videos; and Anomaly Detector, an API that ingests time-series data and chooses an appropriate anomaly detection model for that data. Finally, Microsoft added a “cognitive search” capability to Azure Search, which allows customers to apply Cognitive Services algorithms to search results of their structured and unstructured content.

Microsoft and General Assembly launch partnership to close the global AI skills gap

Microsoft also announced a partnership with General Assembly to address the dearth of qualified data workers, with the goal of training 15,000 workers by 2022 for various artificial intelligence and machine learning roles. The two companies will found an AI Standards Board to create standards and credentials for artificial intelligence skills. In addition, Microsoft and General Assembly will develop scalable training solutions for Microsoft customers, and establish an AI Talent network to connect qualified candidates to AI jobs. This continues the trend of major enterprises building internal training programs to bridge the data skills gap.


Kubernetes Grows Up – The View from KubeCon EU 2019

Our little Kubernetes is growing up.

By “growing up” I mean it is almost in a state where a mainstream company can consider it fit for production. While several factors act as a drag on mainstream adoption, a lack of completeness has been a major force against Kubernetes’ broader acceptance. Completeness, in this context, means that all the parts of an enterprise platform are available off the shelf and won’t require a major engineering effort on the part of conventional IT departments.

The good news from KubeCon+CloudNativeCon EU 2019 in Barcelona, Spain (May 20–23, 2019) is that the Kubernetes and related communities are zeroing in on that all-important target. There are a number of markers pointing toward mainstream acceptance. Projects are filling out the infrastructure – gaining completeness – and the community is growing.

Project Updates

While Kubernetes may be at the core, there are many supporting projects that are striving to add capabilities to the ecosystem that will result in a more complete platform for microservices. Some of the projects featured in the project updates show the drive for completeness. For example, OpenEBS and Rook are two projects striving to make container storage more enterprise friendly. Updates to both projects were announced at the conference. Storage, like networking, is an area that must be tackled before mainstream IT can seriously consider container microservices platforms based on Kubernetes.

Managing microservices performance and failure is a big part of the ability to deploy containers at scale. For this reason, the announcement that two projects that provide application tracing capabilities, OpenTracing and OpenCensus, were merging into OpenTelemetry is especially important. Ultimately, developers need a unified approach to gathering data for managing container-based applications at scale. Removing duplication of effort and competing agendas will speed up the realization of that vision.

Also announced at KubeCon+CloudNativeCon EU 2019 were updates to Helm and Harbor, two projects that tackle thorny issues of packaging and distributing containers to Kubernetes. These are necessary parts of the process of deploying Kubernetes applications. Securely managing container lifecycles through packaging and repositories is a key component of DevOps support for new container architectures. Forward momentum in these projects is forward movement toward the mainstream.

There were other project updates as well, including updates to Kubernetes itself and CRI-O. Clearly, the community is filling in the blank spots in container architectures, making Kubernetes a more viable application platform for everyone.

The Community is Growing

Another gauge pointing toward mainstream acceptance is the growth of the community. The bigger the community, the more hands to do the work and the better the chances of achieving feature critical mass. This year in Barcelona, KubeCon+CloudNativeCon EU saw 7,700 attendees, nearly twice the turnout of last year’s event in Copenhagen. The core Kubernetes project has 164K commits and 1.2M comments on GitHub. This speaks to broad involvement in making Kubernetes better. Completeness requires a lot of work, and that is more achievable when more people are involved.

Unfortunately, as Cheryl Hung, Director of Ecosystems at the CNCF, notes, only 3% of contributors are women. The alarming lack of diversity in the IT industry shows up even in Kubernetes, despite the high-profile women involved in the conference such as Janet Kuo of Google. Diversity brings more and different ideas to a project, and it would be great to see the participation of women grow.

Service Mesh Was the Talk of the Town

The number of conversations I had about service mesh was astounding. It’s true that I had released a pair of papers on the subject, one just before KubeCon+CloudNativeCon EU 2019. That may explain why people wanted to talk to me about it, but not the general buzz. There was service mesh talk in the halls, at lunch, in sessions, and from the mainstage. It’s pretty much what everyone wanted to know about. That’s not surprising, since a service mesh is going to be a vital part of large scale-out microservices applications. What was surprising was that even attendees who were new to Kubernetes were keen to know more. This was a very good omen.

It certainly helped that there was a big service mesh announcement from the mainstage on Tuesday. Microsoft, in conjunction with a host of companies, announced the Service Mesh Interface (SMI). It’s a common API for different vendors’ and projects’ service mesh components. Think of it as a lingua franca of service mesh. There were shout-outs to Linkerd and Solo.io. The latter especially had much to do with creating SMI. The fast maturation of the service mesh segment of the Kubernetes market is another stepping stone toward the completeness necessary for mainstream adoption.

Already Way Too Many Distros

There were a lot of Kubernetes distributions at KubeCon+CloudNativeCon EU 2019. A lot. Really. A lot. While this is a testament to the growth of Kubernetes as a platform, it’s confusing to IT professionals making choices. Some are managed cloud services; others are distributions for on-premises use or for when you want to install your own on a cloud instance. Here are some of the Kubernetes distros I saw on the expo floor. I’m sure I missed a few:

Microsoft Azure, Google, Digital Ocean, Alibaba, Canonical (Ubuntu), Oracle, IBM, Red Hat, VMware, SUSE, Rancher, Pivotal, Mirantis, and Platform9.

From what I hear, this is a sample, not a comprehensive list. The dark side of this enormous choice is confusion. Choosing is hard once you get beyond a handful of options. Still, only five years into the evolution of Kubernetes, it’s a good sign to see this much commercial support for it.

The Kubernetes and Cloud Native architecture is like a teenager. It’s growing rapidly but not quite done. As the industry fills in the blanks and as communities deliver better networking, storage, and deployment capabilities, it will go mainstream and become applicable to companies of all sizes and types. Soon. Not yet, but very soon.


Scenario-Based Learning and Behavior Change: A Brain Science Analysis

Key Stakeholders: Chief Learning Officers, Chief Human Resource Officers, Learning and Development Directors and Managers, Corporate Trainers, Content and Learning Product Managers, Leadership Trainers, Cybersecurity Trainers, Compliance Officers, Environmental Health and Safety Trainers, Sales Managers.

Why It Matters: People skills, compliance skills, safety skills and other skills involve choosing the right behavior in real-time or near real-time. It is behavior change that is the gold standard for Learning and Development, and many L&D vendors utilize scenario-based approaches to elicit behavior change. In this report, we use brain science to evaluate the effectiveness of scenario-based learning approaches in eliciting behavior change, and determine whether this approach helps employees to choose appropriate behaviors and to be more effective managers and employees.

Top Takeaway: Real-time interactive scenario-based learning approaches optimally elicit behavior change by directly engaging the behavioral skills learning system in the brain. Non-interactive scenario-based approaches are effective for behavior change (although to a lesser degree) because they engage emotional learning centers in the brain that draw learners in, and make them feel like they are part of the training. Non-interactive scenario-based approaches are practical and cost-effective alternatives to real-time interactive scenario-based approaches.



Knowledge 2019 and ServiceNow’s Vision for Transforming the World of Work

In May 2019, Amalgam Insights attended Knowledge 2019, ServiceNow’s annual end-user conference. Since ServiceNow’s founding in 2004, the company has evolved from its roots as an IT asset and service management company to a company that supports digital workflow across IT, HR, service, and finance with the goal of making work better for every employee. In attending this show, Amalgam Insights was especially interested in seeing how ServiceNow was evolving its message to reflect what Amalgam Insights refers to as “Market Evolvers,” companies that have gained dominance in their original market and taken advantage of modern mobile, cloud, and AI technology to expand into other markets. (Examples of Market Evolvers include, but are not limited to, Salesforce, ServiceNow, Workday, Informatica, and Tangoe.)


The CLAIRE-ion Call at Informatica World 2019: AI Needs Managed Data and Data Management Needs AI

From May 20-23, Amalgam Insights attended Informatica World 2019, Informatica’s end user summit dedicated to the world of data management. Over the years, Informatica has transformed from a data integration and data governance vendor to a broad-based enterprise data management vendor offering what it calls the “Intelligent Data Platform,” consisting of data integration, Big Data Management, Integration Platform as a Service, Data Quality and Governance, Master Data Management, Enterprise Data Catalog, and Data Security & Privacy. Across all of these areas, Informatica has built market-leading products with the goal of providing high-quality point solutions that can work together to solve broad data challenges.

To support this holistic enterprise data management approach, Informatica has developed an artificial intelligence layer, called CLAIRE, to support metadata management, machine learning, and artificial intelligence services that provide context, automation, and anomaly recognition across Informatica’s varied data offerings. And at Informatica World 2019, CLAIRE was everywhere as AI served as a core theme of the event.

CLAIRE was mentioned in Informatica’s Intelligent Cloud Services, automating redundant, manual data processing steps. It was in Informatica Data Quality, cleansing and standardizing incoming data. It was in Data Integration, speeding up the integration process for non-standard data. It was in the Enterprise Data Catalog, helping users to understand where data was going across their organization. And it was in Data Security, identifying patterns and deviations in user activities that could indicate suspicious behavior, while contextually masking sensitive data for secure use.

What’s meant by CLAIRE? It’s the name tying together all of Informatica’s automated smart contextual improvements across its Intelligent Data Platform, surfacing relevant data, information, and recommendations just-in-time throughout various parts of the data pipeline. By bringing “AI” to data management, Informatica hopes to improve efficiency throughout the whole process, helping companies manage the growing pace of data ingestion.


Analyst Insight – Assessment in Talent and Human Capital Management: A Psychological Science Evaluation

Key Stakeholders:

Chief Human Resource Officers, Chief People Officers, Chief Talent Officers, Chief Technology Officers, Chief Digital Officers, Human Capital Directors and Managers, Human Resource Directors and Managers, Learning and Education Managers, Learning Project Directors and Managers, Organizational Change Directors and Managers, Talent Directors and Managers, Training and Development Directors and Managers, Training Officers.

Why It Matters:

Generational differences and continuous, rapid changes in the workplace place a heavy burden on talent and human capital management platforms to effectively guide individuals through the full employee lifecycle from hire to retire. One-size-fits-all approaches to recruitment, onboarding, learning and development, succession planning, and incentive compensation lead to weak employee engagement, poor job satisfaction, and high turnover rates.

Top Takeaway:

Talent assessment blends psychology and data science, and when incorporated into talent and human capital management platforms, offers insights that are actionable and can increase engagement, satisfaction, and retention. Talent assessment is most commonly applied to recruitment and onboarding, and many vendors have developed impressive and effective offerings. However, talent assessment is less commonly applied in learning and development, succession planning and incentive compensation, which represents a missed opportunity.

Relevant Assessment Vendors: AllyO, Hirevue, IBM, Infor, Phenom People, PSI Services, TalentQuest

Relevant Talent and Human Capital Management Vendors: Bamboo HR, Cegid, Ceridian, Cornerstone on Demand, Kronos, Oracle NetSuite, PageUp People, PeopleFluent, Reflektive, Saba, SAP SuccessFactors, SumTotal, Talentsoft, Ultimate Software, Workday.


BlackBerry Successfully Transitions into a Software Company: Mission Accomplished

In April 2019, Amalgam Insights attended BlackBerry’s Analyst Summit, a gathering of high-profile industry and financial analysts who were briefed on the key highlights of BlackBerry’s accomplishments over the past year. The day included BlackBerry’s top executives, including CEO John Chen, President and Chief Operating Officer Bryan Palma, Chief Financial Officer Steve Cappelli, Chief Marketing Officer Mark Wilson, and Chief Technology Officer Charles Eagan, as well as a collection of subject matter experts across security, the Enterprise of Things, mass communications, and BlackBerry’s key verticals including automotive, government, and healthcare.

Above all, the key takeaway from BlackBerry’s Analyst Summit is that BlackBerry’s transformation into a software and services company is complete.

A Successful Transformation

When John Chen first joined BlackBerry roughly 2,000 days ago, BlackBerry was a $7 billion mobility company focused on its once-iconic handsets, but losing money hand over fist in the era of the iPhone and Android. Although Apple and Google had taken over the handset market, BlackBerry’s leadership at the time was reluctant to take the hard steps necessary to transform into a digital company and to take full advantage of its intellectual property. In the 2011-2012 time period, I was among the analysts criticizing BlackBerry for its inability to separate devices, software, and services, and I hoped that BlackBerry would move to Android, QNX (a 2010 acquisition), or another operating system that would be more flexible and app-friendly than BlackBerry OS.