Market Milestone – Looker Raises $103 million E Round to Expand the Looker Data Platform

On December 6, 2018, Looker announced that it had closed a $103 million Series E round led by Premji Invest, a private equity firm owned by Wipro chairman Azim Premji. The round also included new investor Cross Creek Advisors, a venture capital firm focused on late-stage investments whose current portfolio includes tech darlings such as Anaplan, Datastax, Docker, DocuSign, Pluralsight, and Sumo Logic, as well as participation from existing investors from prior rounds (such as Looker’s Series D, covered by Amalgam Insights).

This round is expected to be the final round before a potential Initial Public Offering. Of course, IPO plans for high-profile companies have recently been interrupted at the last moment, as with Workday’s $1.55 billion acquisition of Adaptive Insights (see Amalgam Insights’ coverage for more information) and SAP’s $8 billion acquisition of Qualtrics. A Looker IPO may not be guaranteed, but Amalgam Insights expects Looker’s valuation to hold up whether the company ends up going public or being acquired for strategic reasons.

Why is Looker worth $1.6 billion?

With this funding, Looker crosses into “unicorn” territory with a valuation of approximately $1.6 billion. This valuation is based on Looker’s run rate of over $100 million in annual revenue, an employee base approaching 600, year-over-year growth exceeding 70%, and expansion into Tokyo to support the Asia-Pacific region.

Amalgam Insights believes that this valuation reflects several key trends and intelligent strategic decisions made by Looker in supporting enterprise-grade business intelligence and data supply chain needs.

First, Looker was developed to support a variety of data preparation, cataloging, governance, querying, and presentation capabilities at scale across a wide variety of data sources and formats. This approach reflects Looker’s purpose as a data platform built for a new generation of data analysts facing the volume and variety of data that drive today’s analytic challenges.

Taking a step back, I remember when I first ran into Looker on the trade show circuit. I was looking at a variety of BI solutions in 2013, including the likes of Tableau, Qlik, and MicroStrategy, when I ran into a small booth manned by a guy named Keenan Rice, who currently serves as Looker’s VP of Strategic Alliances. As a jaded analyst, I asked how his solution was different from the 30+ other solutions being shown on that show floor. At the time, everyone was bragging about pixel-perfect visualizations, report-building capabilities, and basic data presentation, which were interesting to see but rarely provided significant value, competitive differentiation, or return on investment.

However, Keenan’s pitch differed substantially. Although Looker also provided visualizations for large data tables, Keenan started by talking about the challenges data analysts face in preparing data for self-service analytics, using Looker’s SQL-based LookML as a modelling layer to access a variety of data, bringing new data into existing data and analytic workflows, and delivering it all as Software-as-a-Service. As a former data analyst, this was all music to my ears, but I wondered whether Looker could stand out against all of the dashboarding solutions, report builders, and massive marketing budgets of incumbent BI vendors. Looker was just getting started as a company and had only recently announced its Series A, so it faced long odds in standing out with the $10 million to $40 million of Series A or Series B funding that a company at that stage would typically raise.

But it has been a pleasure to watch Looker grow over the years and lead a new generation of solutions focused on simplifying data supply chains and pipelines. It turns out that Looker was just early enough to avoid comparable competition while solving a market need that businesses understood, especially companies seeking a new generation of BI capabilities and wanting to build a more data-driven organization. So, that’s a long way of saying that Looker took a big, contrarian bet six years ago and has been rewarded for a combination of good product and good timing.

But this isn’t the only reason that Looker has been successful.

As Looker has achieved market fit and success, the solution has evolved from a data workflow solution into an emerging application development solution that allows analysts and developers to work collaboratively in building secure and embeddable data models. Because Looker has been truly built as a platform rather than simply labeled one as an aspirational goal based on a set of APIs, Amalgam Insights expects that Looker will become increasingly valuable for data analysts and the “citizen” developers who seek to increase appropriate access to enterprise data and analytic outputs.

And Looker has now taken an important step forward by providing department-specific apps in the most recent Looker 6 launch. Starting with digital marketing and web analytics, Looker is taking advantage of its analytics capabilities to provide out-of-the-box support for key enterprise challenges rather than forcing clients to rebuild foundational analytics that have been built over and over. Both these apps and Looker’s development focus build on Looker’s prior work on Looker Blocks, first launched in 2015, a collection of SQL, visualizations, and pre-built analytic models designed to accelerate analytic projects.

So, what is next for Looker? IPO? Global domination? 

All kidding aside, with expansion into Asia-Pacific to accompany Looker’s existing European offices in London and Dublin and Looker’s current 70% growth rate, it is possible that Looker could add net-new revenue in 2019 faster than BI stalwarts such as Information Builders and Qlik. From a financial perspective, Amalgam Insights notes that Looker’s growth mirrors that of Alteryx, which has roughly quadrupled its stock price since its March 2017 IPO, an offering based on Alteryx’s 2016 revenue of roughly $86 million and 59% year-over-year growth.

Looker has stated that it expects this Series E to be its last funding round before an IPO and, given recent market valuations for successful software companies, there is no reason to expect otherwise. Looker’s progress both as a software development platform and as a platform of pre-built applications and services bodes well for Looker as it continues to evolve as an enterprise platform focused on expanding access to data insights.

Finally, Amalgam Insights believes that Looker’s success will lead to continued success for other data and analytics companies focused on rapid data modelling, data mapping, and analysis of multiple data sources. Looker’s agile BI approach, which avoids the challenges traditional BI and ETL solutions face in supporting multiple data sources, has become a new standard for accelerating the value of data. This means both that traditional BI companies will need to accelerate their own data pipeline efforts or partner with other vendors, and that Looker will start to be targeted in the same way that the likes of Tableau, Qlik, and MicroStrategy have been in the past. The price of success is increased competition and innovation, which is good news for the BI, data, and analytics markets and should give Looker enough challenges to avoid resting on its laurels.

Red Hat Hybrid Cloud Management Gets Financial with Cloud Cost Management

Key Stakeholders: CIO, CFO, Accounting Directors and Managers, Procurement Directors and Managers, Telecom Expense Personnel, IT Asset Management Personnel, Cloud Service Managers, Enterprise Architects

Why It Matters: As enterprise cloud infrastructure continues to grow 30-40% per year and containerization becomes a top enterprise concern, IT must have tools and a strategy for managing the cost…

Please register or log into your Free Amalgam Insights Community account to read more.

Observations on the Future of Red Hat from Red Hat Analyst Day

On November 8th, 2018, Amalgam Insights analysts Tom Petrocelli and Hyoun Park attended the Red Hat Analyst Day in Boston, MA. We had the opportunity to visit Red Hat’s Boston office in the rapidly growing Innovation District, which has become a key center for enterprise technology companies. In attending this event, my goal was to learn more about the Red Hat culture that IBM is acquiring, as well as to see how Red Hat is taking on the challenges of multi-cloud management.

Throughout the day’s presentations, there was a constant theme of effective cross-selling, growing deal sizes (including a record 73 deals of over $1 million in the last quarter), over 600 accounts with more than $1 million in business in the last year, and increased wallet share year-over-year for top clients, with 24 of the 25 largest clients increasing spend by an average of 15%. The current health of Red Hat is undeniable, regardless of the foibles of the public market. And the consistency of Red Hat’s focus on Open Source was equally clear across infrastructure, integration, application development, IT automation, IT optimization, and partner solutions, which demonstrated how synchronized and focused the entire roster of Red Hat executive presenters was, including

Please register or log into your Free Amalgam Insights Community account to read more.

Torchbearer Case Study: OceanX Delivers the Consumer Subscription Experience with Oracle Cloud Infrastructure

(Note: Torchbearer Case Studies provide enterprises with a strong example of the “Art of the Possible” in using technology to enhance their business environment. Amalgam Insights provides emerging best practices based on the experience of the Torchbearer to inspire and educate Early Adopter and Early Majority buyers seeking to develop a strategic advantage.)

At Oracle Open World, I was interested in learning more about the Oracle Cloud and its role in data and analytics. Although Oracle has admittedly been relatively late to the enterprise cloud, the Oracle Cloud was front and center throughout Oracle Open World, and it was interesting to see how Oracle was able to showcase enterprise technology deployments across its portfolio. One example of enterprise success with the Oracle Cloud that stood out came from OceanX.

At Oracle Open World, OceanX won the 2018 Oracle Innovation Award for Data Management. OceanX is a spinoff of Guthy-Renker started in 2016 to provide an integrated subscription commerce platform which brings together e-commerce, fulfilment, customer service, and business analytics associated with direct-to-consumer subscription programs. This allows consumer brands to provide personalized and bespoke products in their subscription offerings and to build consumer relationships based on a holistic view of the customer’s preferences, purchases, and interactions.

Please register or log into your Free Amalgam Insights Community account to read more.

Is IBM’s Acquisition of Red Hat the Biggest Acquihire of All Time?

Estimated Reading Time: 11 minutes

Internally, Amalgam Insights has been discussing fairly intensely why IBM chose to acquire Red Hat for $34 billion. Our key questions included:

  • Why would IBM purchase Red Hat when they’re already partners?
  • Why purchase Red Hat when the code is Open Source?
  • Why did IBM offer a whopping $34 billion, $20 billion more than IBM currently has on hand?

As a starting point, we posit that IBM’s biggest challenge is not an inability to understand its business challenges, but a fundamental consulting mindset that runs from the top down. By this, we mean that IBM is great at identifying and finding solutions on a project-specific basis. For instance, SoftLayer, Weather Company, Bluewolf, and Promontory Financial are all relatively recent acquisitions that made sense and were mostly applauded at the time. But even as IBM makes smart investments, IBM has either forgotten or never learned the modern rules for how to launch, develop, and maintain software businesses. At a time when software is eating everything, this is a fundamental problem that IBM needs to solve.

The real question for IBM is whether IBM can manage itself as a modern software company.

Please register or log into your Free Amalgam Insights Community account to read more.

From Calero World Online: From TEM to ITEM: Leveraging TEM for Non-Traditional Expenses

On October 18th, I presented a webinar at Calero World Online on the future of IT cost and subscription management. In this presentation, I challenge today’s telecom and IT expense managers to accept their destiny as pilots and architects of enterprise digital subscriptions.

Telecom expense has traditionally been the most challenging of IT costs to manage. With the emergence of software-as-a-service, cloud computing, the Internet of Things, and software-defined networks, the rest of the IT world is quickly catching up.

In this webinar, you will learn:

  • How the latest trends and technology are driving change to enterprise management strategies
  • How the challenges of traditional TEM and cloud expense management are similar in nature (and why TEM is a good place to start)
  • How organizations are benefiting from ITEM best practices using sample use cases

To learn more about the upcoming challenges of IT expense management, aligning technology supply to digital demand, and being the shepherd for your organization’s technology sourcing, utilization, and optimization, click here to watch this webinar on-demand.

ICYMI: On Demand Webinar – Four Techniques to Run AI on Your Business Data

On October 17th, I presented a webinar with Incorta’s Chief Evangelist, Matthew Halliday, on the importance of BI architectures in preparing for AI. The webinar is grounded in a core Amalgam Insights belief that all enterprise analytics and data science activity should be built on a shared core of trusted and consistent data, so that Business Intelligence, analytics, machine learning, data science, and deep learning efforts all start from similar assumptions and can build off each other.

While AI is beginning to impact every aspect of our consumer lives, business data-driven AI seems to be lower on the priority list of most enterprises. The struggle to understand the practical value of AI starts with the inability to make business data easily accessible to data science teams. Today’s BI tools have not kept up with this need and often become the bottlenecks that stifle innovation.

In this webinar, you will learn from Hyoun Park and Matthew Halliday about:
  • Key data and analytic trends leading to the need to accelerate analytic access to data
  • Guidance for challenges in implementing AI initiatives alongside BI
  • Practical and future-facing business use cases that can be supported by accelerating analytic access to large volumes of operational data
  • Techniques that accelerate AI initiatives on your business data

Watch this webinar on-demand by clicking here.

Why It Matters that IBM Announced Trust and Transparency Capabilities for AI


Note: This blog is a followup to Amalgam Insights’ visit to the “Change the Game” event held by IBM in New York City.

On September 19th, IBM announced its launch of a portfolio of AI trust and transparency capabilities. This announcement got Amalgam Insights’ attention because of IBM’s relevance and focus in the enterprise AI market throughout this decade. To understand why IBM’s specific launch matters, it helps to take a step back and consider IBM’s considerable role in building out the current state of the enterprise AI market.

IBM AI in Context

Since IBM’s public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence, spending billions of dollars to establish both IBM Watson and the broader enterprise AI market. This has been a challenging path to travel, as IBM has had to balance this market-leading innovation with the financial demands of supporting a company that brought in $107 billion in revenue in 2011 and has since seen that number shrink by almost 30%.

In addition, IBM had to balance its role as an enterprise technology company focused on the world’s largest workloads and IT challenges with launching an emerging product better suited for highly innovative startups and experimental enterprises. IBM also faced the “cloudification” of enterprise IT in general, in which the traditional top-down purchase of multi-million dollar IT portfolios is being replaced by piecemeal, business-driven purchases and consumption of best-in-breed technologies.

Seven years later, the jury is still out on how AI will ultimately end up transforming enterprises. What we do know is that a variety of branches of AI are emerging, including

Please register or log into your Free Amalgam Insights Community account to read more.

IBM Presents “Change the Game: Winning with AI” in New York City

(Note: This blog is part of a multi-part series on this event and the related analyst event focused on IBM’s current status from an AI perspective.) On September 13th, 2018, IBM held an event titled “Change the Game: Winning with AI.” The event was hosted by ESPN’s Hannah Storm and held in Hell’s Kitchen’s Terminal…

Please register or log into your Free Amalgam Insights Community account to read more.

EPM at a Crossroads: Big Data Solutions

Key Stakeholders: Chief Information Officers, Chief Financial Officers, Chief Operating Officers, Chief Digital Officers, Chief Technology Officers, Accounting Directors and Managers, Sales Operations Directors and Managers, Controllers, Finance Directors and Managers, Corporate Planning Directors and Managers

Analyst-Recommended Solutions: Adaptive Insights, a Workday Company, Anaplan, Board, Domo, IBM Planning Analytics, OneStream, Oracle Planning and Budgeting, SAP Analytics Cloud

In 2018, the Enterprise Performance Management market is at a crossroads. This market has emerged from a foundation of financial planning, budgeting, and forecasting solutions designed to support basic planning and has evolved as the demands for business planning, risk and forecasting management, and consolidation have increased over time. In addition, the EPM market has expanded as companies from the financial consolidation and close markets, business performance management markets, and workflow and process automation markets now play important roles in effectively managing Enterprise Performance.

In light of these challenges, Amalgam Insights is tracking six key areas where Enterprise Performance Management is fundamentally changing: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience.

Supporting Big Data for Enterprise Performance Management

Amalgam Insights has identified two key drivers repeatedly mentioned by finance departments seeking to support Big Data in Enterprise Performance Management. First, EPM solutions must support larger stores of data over time to fully analyze financial data and the plethora of additional business data needed to support strategic business analysis. Growing data has become an increasingly important issue as enterprises face billion-row tables and outgrow the traditional cubes and datamarts used to manage basic financial data. The sheer scale of financial and commerce-related transactional data requires a Big Data approach at the enterprise level to support timely analysis of planning, consolidation, close, risk, and compliance.

In addition, these large data sources need to integrate with other data sources and references to support integrated business planning to align finance planning with sales, supply chain, IT, and other departments. As the CFO is increasingly asked to be not only a financial leader, but a strategic leader, she must have access to all relevant business drivers and have a single view of how relevant sales, support, supply chain, marketing, operational, and third-party data are aligned to financial performance. Each of these departments has its own large store of data that the strategic CFO must also be able to access, allocate, and analyze to guide the business.

New EPM solutions must evolve beyond traditional OLAP cubes to hybrid data structures that scale effectively to the immense volume and variety of data involved. Amalgam Insights notes that EPM solutions focused on large data environments take a variety of relational, in-memory, columnar, cloud computing, and algorithmic approaches to define categories on the fly and to store, structure, and analyze financial data.

To manage these large stores of data and support them effectively from a financial, strategic, and analytic perspective, Amalgam Insights recommends the following companies, which have been innovative in supporting immense and varied planning and budgeting data environments, based on briefings and discussions held in 2018:

  • Adaptive Insights, a Workday Company
  • Anaplan
  • Board
  • Domo
  • IBM Planning Analytics
  • OneStream
  • Oracle Planning and Budgeting
  • SAP Analytics Cloud

Adaptive Insights

Adaptive Insights’ Elastic Hypercube is an in-memory, dynamic caching and scaling solution announced in July 2018. Amalgam Insights saw a preview of this technology at Adaptive Live and was intrigued by the efficiency it brings to models: selectively recalculating only the dependent changes as a model is edited, using a dynamic caching approach that consumes memory and computational cycles only when data is being accessed, and supporting both tabular and cube data structures. This data format will also be useful to Adaptive Insights as a Workday company in building out the various departmental planning solutions that will be accretive to Workday’s positioning as an HR and ERP solution after Workday’s June acquisition (covered in June in our Market Milestone).
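To make the idea of selective recalculation concrete, the minimal Python sketch below shows the general technique of recomputing only the cells downstream of a changed input. This is a conceptual illustration only; the class, cell names, and formulas are hypothetical and make no claim about how the Elastic Hypercube itself is implemented.

```python
# Conceptual sketch of dependency-aware recalculation: when an input cell
# changes, only the cells that (transitively) depend on it are recomputed.
# Hypothetical example -- not Adaptive Insights' Elastic Hypercube engine.
from collections import defaultdict

class PlanningModel:
    def __init__(self):
        self.values = {}                    # cell name -> current value
        self.formulas = {}                  # cell name -> (function, input cells)
        self.dependents = defaultdict(set)  # input cell -> cells derived from it

    def set_input(self, cell, value):
        """Set a raw input cell and recalculate only its downstream dependents."""
        self.values[cell] = value
        self._recalculate(cell)

    def define(self, cell, func, inputs):
        """Define a derived cell as func(*input values)."""
        self.formulas[cell] = (func, inputs)
        for i in inputs:
            self.dependents[i].add(cell)
        self._recalculate_cell(cell)

    def _recalculate_cell(self, cell):
        func, inputs = self.formulas[cell]
        if all(i in self.values for i in inputs):
            self.values[cell] = func(*(self.values[i] for i in inputs))

    def _recalculate(self, changed):
        # Walk only the affected subgraph instead of recomputing the whole model.
        for dependent in self.dependents[changed]:
            self._recalculate_cell(dependent)
            self._recalculate(dependent)

model = PlanningModel()
model.set_input("units", 1000)
model.set_input("price", 25.0)
model.set_input("cost_per_unit", 9.0)
model.define("revenue", lambda u, p: u * p, ["units", "price"])
model.define("cogs", lambda u, c: u * c, ["units", "cost_per_unit"])
model.define("gross_margin", lambda r, c: r - c, ["revenue", "cogs"])

model.set_input("price", 27.5)        # only revenue and gross_margin recompute
print(model.values["gross_margin"])   # 18500.0
```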

Anaplan

Anaplan’s Hyperblock is an in-memory engine combining columnar, relational, and OLAP approaches. This technology is the basis of Anaplan’s platform and allows Anaplan to rapidly support large planning use cases. By developing composite dimensions, Anaplan users can pre-build a broad array of combinations that can be used to repeatably deploy analytic outputs. As noted in our March blog, Anaplan has been growing quickly based on its ability to rapidly support new use cases. In addition, Anaplan has recently filed its S-1 to go public.

Board

Board goes to market as both an EPM and a general business intelligence solution. Its core technology is the Hybrid Bitwise Memory Pattern (HBMP), a proprietary in-memory data management solution designed to algorithmically map each bit of data and then store this map in-memory. In practice, this approach lets many users access and edit information concurrently without lags or processing delays. It also allows Board to choose which aspects of the data to keep in-memory or handle dynamically in order to prioritize computing assets.

Domo

Domo describes its Adrenaline engine as an “n-dimensional, highly concurrent, exo-scale, massively parallel, and sub-second data warehouse engine” for storing business data. This is accompanied by VAULT, Domo’s data lake, which supports data ingestion and serves as a single store of record for business analysis. Amalgam Insights covered the Adrenaline engine as one of Domo’s “Seven Samurai” in our March report Domo Hajimemashite: At Domopalooza 2018, Domo Solves Its Case of Mistaken Identity. Behind the buzzwords, these technologies allow Domo to provide executive reporting capabilities across a wide range of departmental use cases in near-real time. Although Domo is not a budgeting solution, it focuses on portraying enterprise performance for executive consumption and should be considered by organizations seeking business-wide visibility into key performance metrics.

IBM Planning Analytics

IBM Planning Analytics runs on Cognos TM1’s in-memory OLAP cubes. To increase performance, these cubes use sparse memory management, in which missing values are ignored and empty values are not stored. In conjunction with IBM’s caching of analytic outcomes in-memory, this allows IBM to outperform standard OLAP approaches, and the design has been validated at scale by a variety of IBM Planning Analytics clients. Amalgam Insights presented on the value of IBM’s approach at IBM Vision 2017 both from a data perspective and from a user interface perspective, the latter of which will be covered in a future blog.
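As a rough illustration of why sparse storage matters at scale, the short Python sketch below stores only populated cube intersections, so empty cells consume no memory and aggregations touch only the cells that exist. The class, dimensions, and data are hypothetical; this is not a representation of the TM1 engine itself.

```python
# Conceptual sketch of a sparse OLAP cube: only populated cells are stored,
# so empty intersections take no memory and aggregations scan only real data.
class SparseCube:
    def __init__(self, dimensions):
        self.dimensions = dimensions   # e.g. ["account", "region", "month"]
        self.cells = {}                # coordinate tuple -> value

    def write(self, coords, value):
        if value:                      # empty/zero values are simply not stored
            self.cells[tuple(coords)] = value
        else:
            self.cells.pop(tuple(coords), None)

    def total(self, **filters):
        """Aggregate over stored cells only, optionally filtering by dimension."""
        positions = {d: i for i, d in enumerate(self.dimensions)}
        return sum(
            v for coords, v in self.cells.items()
            if all(coords[positions[d]] == member for d, member in filters.items())
        )

cube = SparseCube(["account", "region", "month"])
cube.write(("Revenue", "EMEA", "Jan"), 120_000)
cube.write(("Revenue", "APAC", "Jan"), 80_000)
cube.write(("Revenue", "EMEA", "Feb"), 0)   # not stored at all

print(cube.total(account="Revenue"))        # 200000
print(cube.total(region="EMEA"))            # 120000
print(len(cube.cells))                      # 2 (a dense cube would allocate every intersection)
```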

OneStream

OneStream provides in-memory processing and stateless servers to support scale, but its approach to analytic scale is based on virtual cubes and extensible dimensions. These allow organizations to keep building dimensions over time that remain tied back to a corporate standard and to create logical views over a larger data store to support specific financial tasks such as budgeting, tax reporting, or financial reporting. OneStream’s approach is focused on financial use rather than general business planning.

Oracle Planning and Budgeting Cloud

Oracle Planning and Budgeting Cloud Service is based on Oracle Hyperion, the market leader in Enterprise Performance Management from a revenue perspective. The Oracle Cloud is built on Oracle Exalogic Elastic Cloud, Oracle Exadata Database Machine, and the Oracle Database, which together provide a strong in-memory foundation for the Planning and Budgeting application along with an algorithmic approach to managing storage, compute, and networking. This approach effectively allows Oracle to support planning models at massive scale.

SAP Analytics Cloud

SAP Analytics Cloud, SAP’s umbrella product for planning and business intelligence, uses SAP HANA, an in-memory columnar relational database, to provide real-time access to data and to accelerate both modelling and analytic outputs based on all relevant transactional data. This approach is part of SAP’s broader HANA strategy of encapsulating both analytic and transactional processing in a single database, effectively making all data reportable, modellable, and actionable. SAP has also recently partnered with Intel to support Optane DC persistent memory for enterprises requiring larger persistent data stores for analytic use.
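For readers who want intuition on why columnar, in-memory storage accelerates analytic workloads, the brief Python sketch below contrasts a row-oriented and a column-oriented layout of the same made-up transactional records. It is purely illustrative and does not depict SAP HANA internals.

```python
# Conceptual sketch: an analytic aggregate over a column store scans one
# contiguous array instead of picking a field out of every row.
# Hypothetical data -- not a depiction of any vendor's implementation.
transactions = [
    {"order_id": 1, "region": "EMEA", "amount": 1200.0},
    {"order_id": 2, "region": "APAC", "amount": 800.0},
    {"order_id": 3, "region": "EMEA", "amount": 450.0},
]

# Row store: each record kept together -- ideal for transactional reads and writes.
total_row_store = sum(t["amount"] for t in transactions)

# Column store: each field kept as its own array -- ideal for scans and aggregates,
# and far more compressible since a column holds values of a single type.
columns = {
    "order_id": [t["order_id"] for t in transactions],
    "region":   [t["region"] for t in transactions],
    "amount":   [t["amount"] for t in transactions],
}
total_column_store = sum(columns["amount"])

assert total_row_store == total_column_store == 2450.0
```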

This blog is part of a multi-part series on the evolution of Enterprise Performance Management and key themes that the CFO office must consider in managing holistic enterprise performance: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience. If you would like to set up an inquiry to discuss EPM or provide a vendor briefing on this topic, please contact us at info@amalgaminsights.com to set up time to speak.

Last Blog: EPM at a Crossroads
Next Blog: Robotic Process Automation and Machine Learning in EPM