In HPE’s General Session and subsequent presentations, several key themes emerged in HPE’s positioning. The most obvious is that, by consolidating its offerings to servers, storage, and networking, the company is now focused on being the arms dealer for hybrid IT support. This positioning rests both on the core HPE portfolio of technology and services and on the divestiture of the business services and complementary technologies that previously put HPE in competition with potential partners. This fundamental change should serve HPE well.
In mid-May, Amalgam Insights (AI) attended IBM Vision, an event focused on business performance, both as an attendee and a presenter. This has been my favorite IBM tradeshow for several years, as it focuses directly on key concerns that I have looked at throughout my career: financial management, enterprise governance, and compliance. Because everyone at this show is focused on some form of BI, performance management, or risk, it is easy to speak with a business user, consultant, or IBM professional at this show and to quickly find common professional ground.
This year, I took three key findings away from IBM Vision that should be of ongoing value for financial departments within the enterprise.
Amalgam Insights (AI) recently attended Informatica World 2017, where executives, partners, and customers provided backing for Informatica’s ability to support “The Disruptive Power of Data” (an Informatica-trademarked phrase) as well as its positioning as the Enterprise Cloud Data Management leader.
The show demonstrated a change in positioning, ranging from a rebranding led by CMO Sally Jenkins around the claim of being the “hottest pre-IPO company” to the hosting of the event by legendary reporter and Executive Editor of Recode, Kara Swisher. Jenkins set the stage with a core Amalgam Insights belief: IT must be transformational for digital disruption to succeed. A core challenge businesses face is that IT is asked to be functional and to worry about simply keeping the lights and services on at a time when technology and data management challenges are growing exponentially.
As an industry observer, my focus was on how Informatica planned to publicly support customers on an ongoing basis through intended packaging and a forward-facing roadmap. From that perspective, this summary will focus on the guidance that CEO Anil Chakravarthy and Chief Product Officer Amit Walia provided to the audience during the executive keynote.
In kicking off Informatica World, CEO Anil Chakravarthy walked the 2300+ attendees through the three generations of data-driven market disruption:
1) Data used in specific business applications: the introduction of custom enterprise applications in the 1980s and 1990s, as software emerged to replace both paper and spreadsheets.
2) Data used to support enterprise-wide business processes: the integration of applications associated with bringing ERP, CRM, supply chain management, knowledge management, and other enterprise applications together into an integrated suite.
3) Data powers digital transformation: the current generation of data where businesses realize that the proprietary and operational data that they have been collecting, archiving, and often deleting as digital exhaust can now be used to deeply understand customers, benchmark specific business tasks, and serve as new sources of revenue.
As a starting point, this set of generational definitions made sense and also immediately pointed out how the majority of data management and integration solutions are still focused on the second generation. From AI’s perspective, this positioning was a good starting point for Informatica to describe its suite of capabilities.
With this “3.0” emergence of data, Informatica made the case that it was increasingly important to have a supplier capable of supporting Enterprise Cloud Data Management across six different areas:
Big Data Management
Cloud Data Management
Master Data Management
Data management is now a foundational enabler for digital integration. In light of this, Amalgam Insights believes that this positioning is important. Separating the tactical aspects of data quality and data integration from the corporate aspects of data governance and data security and then further distancing those tasks from the business data definitions of Big Data management, Cloud Data management, and Master Data Management is a long-term recipe for disaster. The result of this disaggregated approach will be that enterprises build a new set of silos for the IT departments of the future.
AI believes that Informatica’s established leadership stance in each of these markets also provides launching points to additional markets over time. For instance, Big Data management leads to potential expansion into key Big Data sources such as the Internet of Things, video, and content management. Cloud Data could lead to support of cloud brokerages or multi-cloud resource management. Master Data Management will increasingly require machine-learning-aided automation of ontologies and taxonomies, which will accelerate business mergers, talent recruitment, and value-based business processes in which companies seek to effectively articulate and support the highest-value use cases associated with their current capabilities.
AI’s perspective is that this consumption-based pricing is increasingly important both to support ad-hoc data management needs and to provide evergreen support and upgrades for the ongoing needs of Informatica clients. In particular, AI believes that providing PowerCenter as a usage-based purchase is an important step forward because PowerCenter is fundamentally a consumption-based service at this point.
To provide greater detail, Amit Walia later provided guidance on Informatica’s view of five key imperatives for data management, which AI found useful in contextualizing Informatica’s view of the future:
Leverage existing investments
Bridge to hybrid clouds
Future-proof your business
Innovate at enterprise scale
The first three bullet points are key in that enterprises have invested in foundational IT over the past 30+ years, including collecting massive data sets and defining technology-enabled business processes. That foundational, on-premises data can be used to benchmark and optimize the present, but it must first be unlocked. AI believes that, to go forward, businesses must look back at their “legacy” data and learn from the time stamping, metadata-based ontologies, and semantic contextualization that a generation of workers has already defined for enterprise data and processes. There is no reason to rediscover the past when enterprises have literally invested billions of dollars in technology and employee time in creating this data.
Pricing flexibility can also be thought of as “value flexibility,” a core practice that AI looks at from a financial management and pricing perspective. Value-based pricing, a topic AI has previously covered in a primer, must ultimately be defined based on the core use of technologies and services as either consumption-based services or as value-based packaged products. Based on the portfolio of products that Informatica provides, AI believes that Informatica must provide a nuanced range of pricing options from the per-hour approach that works best for data integration to the per-user approach needed for data quality and cleansing tasks to value-based pricing approaches for newer artificial intelligence and Master Data Management capabilities that can greatly accelerate business execution.
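To make the contrast between these pricing approaches concrete, the following is a toy Python sketch. All rates, usage figures, and scenarios are hypothetical illustrations invented for this example; they are not Informatica’s actual pricing.

```python
# Toy comparison of three pricing models for a data management portfolio.
# All rates and usage figures are hypothetical, not actual vendor pricing.

def consumption_cost(hours_used: float, rate_per_hour: float) -> float:
    """Per-hour pricing: pay only for the compute hours consumed."""
    return hours_used * rate_per_hour

def per_user_cost(users: int, rate_per_user_month: float, months: int = 12) -> float:
    """Per-user pricing: pay a flat rate for each named user."""
    return users * rate_per_user_month * months

def value_based_cost(estimated_business_value: float, capture_rate: float) -> float:
    """Value-based pricing: price as a fraction of the value delivered."""
    return estimated_business_value * capture_rate

# Hypothetical annual scenario: ad-hoc integration jobs, a data quality
# team, and an MDM rollout expected to accelerate a merger.
integration = consumption_cost(hours_used=2_000, rate_per_hour=4.0)
quality = per_user_cost(users=25, rate_per_user_month=150.0)
mdm = value_based_cost(estimated_business_value=2_000_000, capture_rate=0.05)

for name, cost in [("integration", integration), ("data quality", quality), ("MDM", mdm)]:
    print(f"{name}: ${cost:,.0f}")
```

The point of the sketch is that each product line maps naturally to a different pricing lever: hours for bursty integration work, seats for steady-state quality tasks, and a share of delivered value for capabilities that change business outcomes.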
Finally, innovation at enterprise scale is highly dependent on a data strategy. One way to consider the scale of innovation is to consider how often startups and technology businesses stall out somewhere around $10 million in annual revenue. This is not necessarily due to a lack of service and product quality, but to the inability to support ongoing data and personnel management at scale. By providing a consistent set of data definitions and management tools to employees, companies can maintain effective business definitions of products, services, workforce, contracts, components, revenue events, and performance obligations without having to depend on tribal knowledge across the organization to maintain basic business operations.
Walia also went in-depth into the importance of broad-based, unified enterprise metadata across all data sources as a core value proposition for the Informatica Enterprise Information Catalog, part of the Intelligent Data Platform. In a previous incarnation, AI wrote about the importance of metadata as a powerful force for doing good, and the tools to unleash this value are finally coming into place.
This metadata capability was announced as a precursor to Informatica’s launch of CLAIRE, a unified metadata intelligence engine embedded into the Informatica Intelligent Data Platform. Because of the depth of this announcement, AI will be tackling CLAIRE in a separate post detailing how this engine differs from traditional metadata management approaches and evaluating its technology approach.
As a starting point, AI notes that CLAIRE’s combination of data clustering, semantic domain discovery, entity discovery, ontological mapping, data structure parsing, and anomaly detection represents an important core set of capabilities for unlocking the value of business data and amalgamates a variety of technologies that have been brought to market as standalone capabilities by a variety of startups and established enterprise application companies.
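To give a feel for what one of these capabilities means in practice, here is a deliberately minimal Python sketch of semantic domain discovery: inferring what kind of business entity a column holds from its values. Engines like CLAIRE combine such heuristics with clustering and learned models; the patterns and the 80% threshold below are simplified assumptions for illustration, not Informatica’s implementation.

```python
import re

# Minimal illustration of semantic domain discovery. Patterns are ordered
# from most to least specific, since dict order determines which domain
# wins when multiple patterns could match.
DOMAIN_PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "iso_date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "phone": re.compile(r"^\+?[\d\-\s().]{7,15}$"),
}

def discover_domain(values, threshold=0.8):
    """Return the domain whose pattern matches at least `threshold`
    of the non-empty values, or 'unknown' if none does."""
    values = [v for v in values if v]
    if not values:
        return "unknown"
    for domain, pattern in DOMAIN_PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(v))
        if hits / len(values) >= threshold:
            return domain
    return "unknown"

print(discover_domain(["ana@example.com", "bo@example.org", "cy@example.net"]))
print(discover_domain(["2017-05-16", "2017-05-17", "2017-05-18"]))
```

Real metadata intelligence engines go well beyond regular expressions, but even this sketch shows why tagging columns with business meaning is the precondition for automated cataloging, governance, and anomaly detection.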
At Informatica World, AI saw that Informatica’s focus remained strong on supporting the role that data plays in accretive and profitable business disruption. In general, AI notes that even enterprises that consider themselves data-savvy are still at an early adopter phase of effectively monetizing legacy operational data stores because they are still supporting legacy requests for data and analytics. Informatica’s continued positioning of the evolving Intelligent Data Platform is important both in expanding Informatica’s brand beyond PowerCenter and in extending it towards the product information, master data management, data security, and emerging data governance capabilities associated with the recent Diaku acquisition.
AI saw Informatica World 2017 as an important event both for Informatica to place stakes in the ground on the future of enterprise data and to define its role as a data management platform to support the current era of business disruption. In light of the themes of this event, AI makes the following recommendations to enterprises exploring data integration, management, security, and governance solutions:
1) Explore Informatica’s subscription pricing options across its entire portfolio. Informatica has had a reputation for expensive and CapEx-heavy solutions based on traditional PowerCenter purchase models. However, with Informatica’s pursuit both of cloud-based solutions and of subscription pricing, enterprises should be able to develop a head-to-head, apples-to-apples comparison of Informatica’s products against existing cloud integration, management, and governance solutions.
2) Consider how both legacy and emerging cloud data environments will be supported in a holistic fashion across data cataloging, quality, governance, security, integration, management, and metadata definitions. All of these capabilities must come together or else businesses risk setting up a new set of silos that will impede the basic operations of a data-driven business. Digital Transformation is not just a one-time commitment to creating a data-supported business process, but a fundamental change in treating all business data as an ongoing asset that must be available and contextualized for employees to use, augment, productize, and analyze. If employees don’t trust and understand their data, the ongoing analytics and application support don’t matter. Garbage In, Garbage Out is still true in a Big Data Cloud world.
3) Translate IT into a department focused on transformational change. Fundamentally, this means that IT must support the services that drive value, not the commoditized services that are poorly differentiated. Twenty years ago, IT had to focus on data center management and asset-based security methods. Today, IT must radically expand this approach to include new sources and accept that not every device can be fully secured, leading to a data-centric and service-specific approach to IT. As businesses identify new capabilities that must be supported and traditional IT is subsumed into a set of apps and established APIs, IT must transform. To do so, all data will need to be treated as both analytically and business relevant across all areas of IT. Any part of IT that does not understand data will become increasingly irrelevant. This means that every part of IT needs to be an active part of the corporate data management strategy, including networking, telecom, business process management, and application-specific personnel.
[Updated May 3rd with links to additional coverage from AOTMP, Blue Hill Research, Oracle Dispatch, StraTEM Consulting, and the Wall Street Journal]
On April 28, Marlin Equity Partners, an investment firm with over $3 billion in capital under management and the current owner of Telecom Expense and Enterprise Mobility Management vendor Asentinel, announced entering an agreement to purchase Tangoe for $6.50 per share for a transaction estimated at $242.6 million in cash.
This agreement is being structured as a merger. Tangoe is the market leader in Telecom Expense Management with oversight of over $34 billion in IT spend. Tangoe will be combined with Asentinel to create a company that will manage over $38 billion in telecom and IT spend. This transaction is scheduled to close late in Q2 2017, pending all relevant conditions being met. With this merger, Tangoe CEO Jim Foy is expected to be CEO of the combined company and Asentinel CEO Tim Whitehorn is expected to become Chief Product Officer of the combined company. The combined company will be called Tangoe.
tl;dr: In the world of 2017, where these practical BI issues still reign supreme, a practical Michael Saylor has shown up to preach on MicroStrategy’s capabilities. Both the stock market and MicroStrategy competitors should take notice.
On April 19th, MicroStrategy World 2017 had its executive keynote session in DC. I’ve attended MicroStrategy (NASDAQ:MSTR) World in the past as an industry analyst and was interested in seeing how the keynote would come across from afar as an Amalgam Insights (AI) Principal Investigator.
One of my favorite topics in enterprise software is pricing. Despite the work done in value-based pricing over the past 50 years, the vast majority of pricing exercises still start with either a very basic cost-plus or percentage-based ROI model. This assumption has a key issue: it assumes that your product is a commodity. To explain why and to explain how to take a more value-based approach, consider what a price is.
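The difference between the two starting points can be shown in a few lines of arithmetic. The following Python sketch uses made-up numbers purely for illustration; the point is that cost-plus anchors the price to what the product costs to make, while value-based anchors it to what the product is worth to the buyer.

```python
# Toy contrast between cost-plus and value-based pricing.
# All numbers are hypothetical illustrations.

def cost_plus_price(unit_cost: float, markup: float) -> float:
    """Cost-plus: start from what the product costs to deliver."""
    return unit_cost * (1 + markup)

def value_based_price(customer_value: float, share_captured: float) -> float:
    """Value-based: start from what the product is worth to the buyer."""
    return customer_value * share_captured

# A software seat costs $20/month to deliver; a 50% markup prices it at $30.
print(cost_plus_price(20.0, 0.50))       # 30.0
# The same seat saves the customer $500/month; capturing 25% of that
# value prices it at $125.
print(value_based_price(500.0, 0.25))    # 125.0
```

The gap between $30 and $125 for the same seat is exactly why a cost-plus default implicitly treats the product as a commodity: it ignores the value the buyer actually receives.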
When I attended Hub17 in San Francisco, representing Amalgam Insights (AI), I was looking forward to seeing how Anaplan’s go-to-market approach had changed, keeping an eye out for key announcements, and looking for clues from the executive team on where Anaplan was heading next. In the process, AI also got some unexpected highlights and guidance on the future of the company.
Anaplan caught AI’s attention a number of years ago when it officially launched the Hyperblock, originally built by Michael Gould, to provide a combination of cube, cell-based, and columnar database architectures. This approach provided a foundational technology that was well-suited to massive and enterprise-scaled models. Once this technology was combined with a go-to-market productization that allowed business users to access the planning and modeling aspects of Anaplan in 2013, Anaplan became a strong solution in the enterprise planning market.
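For readers unfamiliar with cell-based multidimensional modeling, the toy Python sketch below shows, in greatly simplified form, the kind of structure a planning engine works over: a sparse cube of cells keyed by dimension members, aggregated on demand. This is emphatically not Anaplan’s Hyperblock implementation, just an illustration of the cell-based half of the cube/cell/columnar combination described above.

```python
# Toy sparse multidimensional cube: cells are keyed by dimension members
# and only populated cells consume memory. Illustrative only; a real
# planning engine adds indexing, columnar storage, and formula recalc.

class SparseCube:
    def __init__(self, dimensions):
        self.dimensions = dimensions  # e.g. ("region", "product", "month")
        self.cells = {}               # (member, member, ...) -> value

    def set(self, value, **coords):
        key = tuple(coords[d] for d in self.dimensions)
        self.cells[key] = value

    def slice_sum(self, **filters):
        """Aggregate all cells matching the given dimension filters."""
        total = 0.0
        for key, value in self.cells.items():
            coords = dict(zip(self.dimensions, key))
            if all(coords[d] == member for d, member in filters.items()):
                total += value
        return total

cube = SparseCube(("region", "product", "month"))
cube.set(100.0, region="EMEA", product="widgets", month="Jan")
cube.set(150.0, region="EMEA", product="gadgets", month="Jan")
cube.set(200.0, region="AMER", product="widgets", month="Jan")

print(cube.slice_sum(region="EMEA"))     # 250.0
print(cube.slice_sum(product="widgets")) # 300.0
```

The scaling challenge hinted at in the text comes from exactly this structure: enterprise models multiply dimensions and members into billions of potential cells, which is why a purpose-built engine matters.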
But between the hype, the party, the music, the free-flowing drinks, and the bright lights, Domo also has an excited customer base that was hungry for product announcements and gave strong feedback on new Domo features.
And there were some significant announcements, such as:
Domo’s planned “Mr. Roboto,” which will use predictive analytics and machine learning to support both an Alert Center for anomaly detection and a data science capability that currently looks like a predictive analytics and algorithm toolkit for business performance challenges.
Domo Business-in-a-Box, a set of pre-built dashboards created to support major business departments, functions, and use cases across the entire organization. AI believes these dashboards will provide a shortcut for enterprises to quickly translate enterprise data into relevant and contextualized departmental insights.
Domo Everywhere, which serves as Domo’s foray into embedded BI with White Label, Embed, and Publish options. AI believes that this capability is important in providing ubiquitous analytics and to allow end users to take advantage of business insights without having to always go back to any specific platform or software solution.
There were also feature improvements such as increased chart options, time-series and period-based views, data slicing, and the industry pundits’ favorite: Domo Data Lineage, which got a fair amount of attention for its ability to track data sources, actions, quality, and timeliness. Although Domo is portraying Data Lineage as a feature enhancement for Domo Analyzer, AI believes that Domo will be pleasantly surprised at the enterprise need and interest for Data Lineage, as data governance and data trust have been increasingly trendy concerns for enterprise analytics.
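To illustrate what a lineage feature actually records, here is a minimal Python sketch of lineage events and an upstream walk from a derived dataset back to its original source. This is an illustrative data structure invented for this post, not Domo’s actual Data Lineage schema; all dataset and actor names are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Minimal sketch of the kind of record a data-lineage feature tracks:
# where a dataset came from, what was done to it, by whom, and when.

@dataclass
class LineageEvent:
    dataset: str   # the dataset this event produced
    source: str    # upstream system or parent dataset
    action: str    # e.g. "ingest", "join", "filter", "aggregate"
    actor: str     # user or service that performed the action
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def upstream_chain(events, dataset):
    """Walk lineage events backwards from a dataset to its origin."""
    by_dataset = {e.dataset: e for e in events}
    chain, current = [], dataset
    while current in by_dataset:
        event = by_dataset[current]
        chain.append(event)
        current = event.source
    return chain

events = [
    LineageEvent("raw_sales", "salesforce_api", "ingest", "etl-service"),
    LineageEvent("clean_sales", "raw_sales", "filter", "analyst@example.com"),
    LineageEvent("sales_by_region", "clean_sales", "aggregate", "analyst@example.com"),
]

for e in upstream_chain(events, "sales_by_region"):
    print(f"{e.dataset} <- {e.source} ({e.action})")
```

Even a trail this simple answers the governance questions that make lineage valuable: which source fed a dashboard, which transformations touched it, and who to ask when a number looks wrong.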
In speaking with Domo executives, salespeople, and customers, AI also started to see a consistent playbook emerge that demonstrated how, beyond the hype, Domo actually works as a business insight platform compared to other cloud BI or traditional BI products. Here is what seems to be happening, at a high level, for Domo to gain enterprise adoption.
1) Domo speaks to an executive or key business manager who is stuck with some manual process that requires excessive spreadsheet or Microsoft Access usage. These use cases tend to be focused on marketing, sales, operations, or finance, aligning with current trends in enterprise performance management.
2) Domo is initially implemented through self-service capabilities by line-of-business decision makers who are able to integrate data with little to no IT support. Once Domo conducts deeper due diligence on the enterprise-wide need for analytics, an analytics or IT manager takes the lead within the organization to connect Domo with data from the rest of the company.
3) Domo product deployment and implementation is generally accepted by customers to be simpler than traditional performance management systems such as Hyperion or Cognos as well as simpler than other traditional BI systems.
4) Once Domo is in place, the executive stakeholder and IT manager work together in bringing all relevant departmental data into Domo by hunting down the spreadsheets and local dark data that have traditionally driven the manual process.
5) After this initial implementation and win, Domo gets additional attention internally based on the ease of creating reports, the efficacy that these departments see in supporting analytic insights, and the usage rates associated with Domo.
This roadmap may not sound like rocket science, but the devil has always been in the details. By connecting the dots between executives, IT, implementation roadblocks, data ingestion, and employee utilization rates, Domo has quickly grown to a $120 million+ annual run rate over the past several years.
AI Observations on the State of Domo
AI notes that Domo has some very specific strengths as a business-oriented insight solution. Its DNA makes it very focused on user interaction, collaboration, and graphic design, which results in a front-end product that can be extremely engaging compared to other perceived competitors in the cloud BI space, such as Birst, GoodData, and Looker, as well as data discovery competitors such as Qlik and Tableau. One of the cleverest things Domo has done is to create “Cards” to display specific data, where each card shows how often the data is being accessed and provides guidance on whether end users are engaging with the data they should be aware of. Domo’s App Design Studio can also publish with Adobe Illustrator, which provides massive graphic advantages over a variety of other analytic app studios (and was highlighted on the keynote stage with an application built by GE Digital’s Kim Schuhman).
However, Domo has also invested mightily in its own back-end technologies, including a high-performance massively parallel processing columnar database, data warehousing, and 450+ native integrations. AI wonders whether Domo needs to continue investing in all of these areas on an ongoing basis or whether it would be more fruitful for Domo to create high-value named partnerships, such as Tableau has created with Informatica or GoodData has created with HP Vertica, to solve some of the back-end and integration challenges. At the end of the day, AI is impressed with Domo’s focus on data collection, process improvement, and user engagement, areas where it is truly excellent.
That aside, Domo has built a full-fledged business intelligence platform with a strong focus on supporting usability and adoption. With a loyal customer base, a user experience that seems popular both with end users and with report builders, and an aggressive product roadmap to accelerate time-to-value and integrate machine learning into the platform, AI believes that Domo is well positioned to continue competing in the business intelligence and analytics markets by combining analytic consumption, business process alignment, data aggregation, and data integration.