
EPM at a Crossroads: Big Data Solutions

Key Stakeholders: Chief Information Officers, Chief Financial Officers, Chief Operating Officers, Chief Digital Officers, Chief Technology Officers, Accounting Directors and Managers, Sales Operations Directors and Managers, Controllers, Finance Directors and Managers, Corporate Planning Directors and Managers

Analyst-Recommended Solutions: Adaptive Insights, a Workday Company, Anaplan, Board, Domo, IBM Planning Analytics, OneStream, Oracle Planning and Budgeting, SAP Analytics Cloud

In 2018, the Enterprise Performance Management (EPM) market is at a crossroads. The market emerged from a foundation of financial planning, budgeting, and forecasting solutions designed to support basic planning, and it has evolved as demands for business planning, risk management, forecasting, and consolidation have increased over time. In addition, the EPM market has expanded as companies from the financial consolidation and close, business performance management, and workflow and process automation markets now play important roles in managing enterprise performance effectively.

In light of these challenges, Amalgam Insights is tracking six key areas where Enterprise Performance Management is fundamentally changing: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience.

Supporting Big Data for Enterprise Performance Management

Amalgam Insights has identified two key drivers repeatedly mentioned by finance departments seeking to support Big Data in Enterprise Performance Management. First, EPM solutions must support ever-larger stores of data over time to fully analyze financial data and the plethora of additional business data needed to support strategic business analysis. This has become increasingly important as enterprises face billion-row tables and outgrow the traditional cubes and datamarts used to manage basic financial data. The sheer scale of financial and commerce-related transactional data requires a Big Data approach at the enterprise level to support timely analysis of planning, consolidation, close, risk, and compliance.

In addition, these large data sources need to integrate with other data sources and references to support integrated business planning that aligns finance with sales, supply chain, IT, and other departments. As the CFO is increasingly asked to be not only a financial leader but a strategic leader, she must have access to all relevant business drivers and a single view of how relevant sales, support, supply chain, marketing, operational, and third-party data align to financial performance. Each of these departments has its own large store of data that the strategic CFO must also be able to access, allocate, and analyze to guide the business.

New EPM solutions must evolve beyond traditional OLAP cubes toward hybrid data structures that scale effectively to the immense volume and variety of data now involved. Amalgam notes that EPM solutions focused on large-data challenges take a variety of relational, in-memory, columnar, cloud computing, and algorithmic approaches to store, structure, and analyze financial data and to define categories on the fly.

To manage these large stores of data effectively from a financial, strategic, and analytic perspective, Amalgam Insights recommends the following companies, which, based on briefings and discussions held in 2018, have been innovative in supporting immense and varied planning and budgeting data environments:

  • Adaptive Insights, a Workday Company
  • Anaplan
  • Board
  • Domo
  • IBM Planning Analytics
  • OneStream
  • Oracle Planning and Budgeting
  • SAP Analytics Cloud

Adaptive Insights

Adaptive Insights’ Elastic Hypercube is an in-memory, dynamic caching and scaling technology announced in July 2018. Amalgam Insights saw a preview of this technology at Adaptive Live and was intrigued by the efficiency it brings to models: selectively recalculating only the dependent changes as a model is edited, using a dynamic caching approach that consumes memory and computational cycles only when data is being accessed, and supporting both tabular and cube data structures. This data format will also be useful to Adaptive Insights as a Workday company in building out the departmental planning solutions that will be accretive to Workday’s positioning as an HR and ERP solution after Workday’s June acquisition (covered in June in our Market Milestone).
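To make the selective-recalculation idea concrete, below is a minimal Python sketch of dependency-aware recalculation in a planning model. It is an illustration of the general technique, not Adaptive Insights' actual Elastic Hypercube implementation: each cell tracks its dependents, an edit marks only downstream cells dirty, and values are recomputed lazily on access.

    # Minimal sketch of dependency-aware recalculation (illustrative only;
    # not Adaptive Insights' Elastic Hypercube implementation).
    class Cell:
        def __init__(self, name, formula=None, value=0):
            self.name, self.formula, self.value = name, formula, value
            self.dependents = set()              # cells that read this cell
            self.dirty = formula is not None     # formula cells start uncomputed

    class Model:
        def __init__(self):
            self.cells = {}

        def add(self, name, value=0, formula=None, inputs=()):
            self.cells[name] = Cell(name, formula, value)
            for dep in inputs:                   # register reverse edges
                self.cells[dep].dependents.add(name)

        def set_value(self, name, value):
            self.cells[name].value = value
            self._mark_dirty(name)               # invalidate only downstream cells

        def _mark_dirty(self, name):
            for dep in self.cells[name].dependents:
                if not self.cells[dep].dirty:
                    self.cells[dep].dirty = True
                    self._mark_dirty(dep)

        def get(self, name):
            cell = self.cells[name]
            if cell.formula and cell.dirty:      # recompute lazily, on access
                cell.value = cell.formula(self)
                cell.dirty = False
            return cell.value

    m = Model()
    m.add("units", 100)
    m.add("price", 25)
    m.add("revenue", formula=lambda mdl: mdl.get("units") * mdl.get("price"),
          inputs=("units", "price"))
    print(m.get("revenue"))    # 2500, computed once and cached
    m.set_value("units", 120)  # only 'revenue' is flagged for recalculation
    print(m.get("revenue"))    # 3000, recomputed on demand

Editing an input touches only the cells downstream of it, which is the behavior that matters when a planning model has millions of cells and a user changes a handful of drivers.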

Anaplan

Anaplan’s Hyperblock is an in-memory engine combining columnar, relational, and OLAP approaches. This technology is the basis of Anaplan’s platform and allows Anaplan to support large planning use cases rapidly. By developing composite dimensions, Anaplan users can pre-build a broad array of combinations that can be used to deploy analytic outputs repeatably. As noted in our March blog, Anaplan has been growing quickly based on its ability to support new use cases rapidly. In addition, Anaplan has recently filed its S-1 to go public.

Board

Board goes to market both as an EPM solution and as a general business intelligence solution. Its core technology is the Hybrid Bitwise Memory Pattern (HBMP), a proprietary in-memory data management approach that algorithmically maps each bit of data and stores this map in memory. In practice, this allows many users to access and edit information simultaneously without lag or processing delays. It also allows Board to choose which aspects of data to hold in memory or handle dynamically in order to prioritize computing assets.

Domo

Domo describes its Adrenaline engine as an “n-dimensional, highly concurrent, exo-scale, massively parallel, and sub-second data warehouse engine” for storing business data. This is accompanied by VAULT, Domo’s data lake, which supports data ingestion and serves as a single store of record for business analysis. Amalgam Insights covered the Adrenaline engine as one of Domo’s “Seven Samurai” in our March report Domo Hajimemashite: At Domopalooza 2018, Domo Solves Its Case of Mistaken Identity. Behind the buzzwords, these technologies allow Domo to provide executive reporting capabilities across a wide range of departmental use cases in near-real time. Although Domo is not a budgeting solution, it is focused on portraying enterprise performance for executive consumption and should be considered by organizations seeking business-wide visibility into key performance metrics.

IBM Planning Analytics

IBM Planning Analytics runs on Cognos TM1 OLAP in-memory cubes. To increase performance, these cubes use sparse memory management, in which missing values are ignored and empty values are not stored. In conjunction with IBM’s approach of caching analytic outcomes in memory, this allows IBM to improve performance compared with standard OLAP approaches, and the approach has been validated at scale by a variety of IBM Planning Analytics clients. Amalgam Insights presented on the value of IBM’s approach at IBM Vision 2017 from both a data perspective and a user interface perspective; the latter will be covered in a future blog.
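As a rough illustration of how a sparse cube avoids paying for empty intersections (a generic Python sketch, not TM1's internal design), consider a structure that stores only populated cells keyed by their dimension coordinates:

    # Illustrative sparse cube: only populated intersections are stored, so
    # empty cells consume no memory. A generic sketch, not TM1 internals.
    class SparseCube:
        def __init__(self, dimensions):
            self.dimensions = dimensions       # e.g. ("account", "region", "month")
            self.cells = {}                    # {(coordinate, ...): value}

        def write(self, coords, value):
            if value:                          # skip zeros and empties entirely
                self.cells[coords] = value
            else:
                self.cells.pop(coords, None)

        def read(self, coords):
            return self.cells.get(coords, 0)   # a missing cell behaves as zero

        def slice_total(self, **fixed):
            # Aggregate every stored cell that matches the fixed coordinates.
            idx = {d: i for i, d in enumerate(self.dimensions)}
            return sum(v for c, v in self.cells.items()
                       if all(c[idx[d]] == val for d, val in fixed.items()))

    cube = SparseCube(("account", "region", "month"))
    cube.write(("revenue", "EMEA", "2018-06"), 1_200_000)
    cube.write(("revenue", "APAC", "2018-06"), 800_000)
    print(cube.read(("revenue", "LATAM", "2018-06")))             # 0, nothing stored
    print(cube.slice_total(account="revenue", month="2018-06"))   # 2000000

Because financial cubes are typically mostly empty at the intersection level, skipping unpopulated cells is what keeps memory use proportional to real data rather than to the theoretical size of the cube.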

OneStream

OneStream provides in-memory processing and stateless servers to support scale, but its approach to analytic scale is based on virtual cubes and extensible dimensions. These allow organizations to continue building dimensions over time that tie back to a corporate level and to create logical views of a larger data store to support specific financial tasks such as budgeting, tax reporting, or financial reporting. OneStream’s approach is focused on financial use rather than general business planning.

Oracle Planning and Budgeting Cloud

Oracle Planning and Budgeting Cloud Service is based on Oracle Hyperion, the market leader in Enterprise Performance Management from a revenue perspective. The Oracle Cloud is built on Oracle Exalogic Elastic Cloud, Oracle Exadata Database Machine, and the Oracle Database, which provide a strong in-memory foundation for the Planning and Budgeting application along with an algorithmic approach to managing storage, compute, and networking. This approach effectively allows Oracle to support planning models at massive scale.

SAP Analytics Cloud

SAP Analytics Cloud, SAP’s umbrella product for planning and business intelligence, uses SAP HANA, an in-memory columnar relational database, to provide real-time access to data and to accelerate both modelling and analytic outputs based on all relevant transactional data. This approach is part of SAP’s broader HANA strategy to encapsulate both analytic and transactional processing in a single database, effectively making all data reportable, modellable, and actionable. SAP has also recently partnered with Intel to support Optane DC persistent memory for enterprises requiring larger persistent data stores for analytic use.

This blog is part of a multi-part series on the evolution of Enterprise Performance Management and key themes that the CFO office must consider in managing holistic enterprise performance: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience. If you would like to set up an inquiry to discuss EPM or provide a vendor briefing on this topic, please contact us at info@amalgaminsights.com to set up time to speak.

Last Blog: EPM at a Crossroads
Next Blog: Robotic Process Automation and Machine Learning in EPM


FloQast Supports ASC 606 Compliance by Providing a Multi Book Close for Accountants

On September 11, 2018, FloQast announced multi-book accounting capabilities designed to help organizations support ASC 606-compliant financial closes by supporting dual reporting on revenue recognition and related expenses. As Amalgam Insights has covered in prior research, ASC 606/IFRS 15 standards for recognizing revenue on subscription services are currently required for all public companies and will be the standard for private companies as of the end of 2019.

Currently, FloQast supports multi-book accounting for Oracle NetSuite and Sage Intacct, two strong mid-market finance solutions. This capability is available to FloQast Business, Corporate, and Enterprise customers at no additional cost. The choice of these two solutions also reflects the investment that each ERP vendor has made in subscription billing: NetSuite’s 2015 acquisition of subscription billing solution Monexa eventually led to the launch of NetSuite SuiteBilling, while Sage Intacct developed its subscription billing capabilities organically in 2015.

Why This Matters For Accounting Teams

Currently, accounting teams compliant with ASC 606 are required to provide two sets of books with each financial close. Organizations seeking to reflect their finances accurately from both a legacy and a current perspective must either duplicate effort to produce compliant accounting outputs or use an accounting solution that accurately creates separate sets of close results. By creating dual-level close outputs simultaneously, organizations avoid having to create detailed journal entries to explain discrepancies within a single close instance.
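As a simplified, hypothetical illustration of why the two books can diverge for the same contract (the contract terms and the accounting treatment below are illustrative assumptions, not accounting guidance), consider a subscription with a non-refundable setup fee:

    # Hypothetical, simplified example of why legacy and ASC 606 books diverge
    # for the same contract. Contract terms and treatment are illustrative
    # assumptions only, not accounting guidance.
    subscription_fee = 12_000   # annual subscription
    setup_fee = 1_200           # one-time, non-refundable setup fee
    months = 12

    # Legacy book (assumed treatment): setup fee recognized when billed.
    legacy_month_1 = setup_fee + subscription_fee / months

    # ASC 606 book (assumed treatment): setup fee spread over the contract term
    # because it does not transfer a distinct service to the customer.
    asc606_month_1 = (setup_fee + subscription_fee) / months

    print(f"Month 1 revenue, legacy book:  ${legacy_month_1:,.0f}")    # $2,200
    print(f"Month 1 revenue, ASC 606 book: ${asc606_month_1:,.0f}")    # $1,100
    # Same cash, same contract -- two different revenue lines to close and explain.

Multiply that single-contract gap across every contract in a period and the need for a solution that produces both close outputs automatically becomes obvious.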

Recommendations for Accounting Teams with ASC 606 Compliance Requirements

This announcement has a couple of ramifications for mid-market enterprises and organizations that are either currently supporting ASC 606 as public companies or preparing to support ASC 606 as private companies.

First, Amalgam Insights believes that accounting teams using either Oracle NetSuite or Sage Intacct should adopt FloQast as a relatively low-cost solution to the challenge of duplicate ASC 606 closes. Currently, this functionality is most relevant to Oracle NetSuite and Sage Intacct customers with significant ASC 606 accounting challenges. To understand why, consider the basic finances of this decision.

Amalgam Insights estimates that, based on FloQast’s current pricing of $125 per month for business accounts or $150 per month for corporate accounts, FloQast will pay for itself through productivity gains in any accounting department that spends four or more man-hours per month creating duplicate closes. This return is in addition to the existing ROI associated with financial close that Amalgam Insights has previously tracked for FloQast customers in our Business Value Analysis. In that document, we found that FloQast customers interviewed saw a 647% ROI in their first year of deployment by accelerating and simplifying their close workflows and improving team visibility into the current status of financial close.
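A back-of-the-envelope version of that breakeven math is below; the fully loaded hourly cost of an accountant is an illustrative assumption, not a FloQast or Amalgam Insights figure.

    # Back-of-the-envelope breakeven for eliminating duplicate-close work.
    # Pricing is as cited above; the loaded hourly cost is an assumption.
    monthly_price = 150          # corporate tier per month (business tier: $125)
    loaded_hourly_cost = 40      # assumed fully loaded cost of an accountant hour

    breakeven_hours = monthly_price / loaded_hourly_cost
    print(f"Breakeven: {breakeven_hours:.1f} hours saved per month")   # ~3.8 hours

    hours_saved = 10             # assumed duplicate-close hours avoided per month
    net_monthly_savings = hours_saved * loaded_hourly_cost - monthly_price
    print(f"Net monthly savings at {hours_saved} hours saved: ${net_monthly_savings}")  # $250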

Second, accounting teams should generally expect to support multi-book accounting for the foreseeable future. Although ASC 606 is now a current standard, financial analysts and investors seeking to conduct historical analysis of a company for investment, acquisition, or partnership will want a consistent “apples-to-apples” comparison of finances across multiple years. Until your organization has three full years of audited financial statements under ASC 606, it will likely have to maintain multiple sets of books. Given that most public organizations started using ASC 606 in 2018, this means having a plan for multiple sets of books until 2020. For private organizations, this may mean an additional year or two, given that mandatory compliance starts at the end of 2019. Companies that avoid preparing for the reality of dual-level closes over the next couple of years will spend significant accountant hours on easily avoidable work.

If you would like to learn more about FloQast, the Business Value Analysis, or the current vendor solution options for financial close management, please contact Amalgam Insights at info@amalgaminsights.com.


VMware Purchases CloudHealth Technologies to support Multicloud Enterprises and Continue Investing in Boston


Vendors and Solutions Mentioned: VMware, CloudHealth Technologies, Cloudyn, Microsoft Azure Cloud Cost Management, Cloud Cruiser, HPE OneSphere, Nutanix Beam, Minjar, Botmetric

Key Stakeholders: Chief Financial Officers, Chief Information Officers, Chief Accounting Officers, Chief Procurement Officers, Cloud Computing Directors and Managers, IT Procurement Directors and Managers, IT Expense Directors and Managers

Key Takeaway: As best-of-breed vendors continue to emerge, new technologies are invented, existing services continue to evolve, vendors pursue new and innovative pricing and delivery models, cloud computing remains easy to procure, and IaaS doubles every three years as a spend category, cloud computing management will only increase in complexity, and the need for Cloud Service Management will only grow. VMware has made a wise choice in buying into a rapidly growing market and now has a greater opportunity to support and augment complex, decentralized, and hybrid IT environments.

About the Announcement

On August 27, 2018, VMware announced a definitive agreement to acquire CloudHealth Technologies, a Boston-based startup company focused on providing a cloud operations and expense management platform that supports enterprise accounts across Amazon Web Services, Microsoft Azure, and Google Cloud Platform.
Continue reading VMware Purchases CloudHealth Technologies to support Multicloud Enterprises and Continue Investing in Boston


Oracle Autonomous Transaction Processing Lowers Barriers to Entry for Data-Driven Business

I recently wrote a Market Milestone report on Oracle’s launch of Autonomous Transaction Processing, the latest in a string of Autonomous Database announcements from Oracle, following the Autonomous Data Warehouse announcement and the initial Autonomous Database announcement late last year.

This string of announcements takes advantage of Oracle’s investments in infrastructure, distributed hardware, data protection, security, and index optimization to create a new set of database services that seek to automate basic support and optimization capabilities. These announcements matter because, as transactional and data-centric business models continue to proliferate, both startups and enterprises should seek a data infrastructure that will remain optimized, secure, and scalable over time without becoming cost- and resource-intensive. With Oracle Autonomous Transaction Processing, Oracle provides its answer: an enterprise-grade data foundation for this next generation of businesses.

One of Amalgam Insights’ key takeaways in this research is the analyst estimate that Oracle ATP could reduce the cost of cloud-based transactional database management by 65% compared to similar services managed on Amazon Web Services. Frankly, companies that need to support net-new transactional databases that must be performant and scalable for Internet of Things, messaging, and other new data-driven businesses should consider Oracle ATP and should do due diligence on Oracle Autonomous Database Cloud for reducing long-term Total Cost of Ownership. The estimate is based on the costs of a 10 TB Oracle database on a reserved instance on Amazon Web Services versus a similar database on the Oracle Autonomous Database Cloud.
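For readers who want to rerun this kind of comparison with their own quotes, the structure of the estimate looks roughly like the sketch below. Every input is a placeholder to be replaced with your own infrastructure pricing and staffing assumptions; these are not the figures behind the 65% estimate.

    # Skeleton of a three-year TCO comparison for a managed transactional
    # database. All inputs are placeholders, not the figures behind the
    # 65% estimate cited above.
    def three_year_tco(monthly_infrastructure, dba_hours_per_month,
                       dba_hourly_rate, one_time_migration=0):
        recurring = (monthly_infrastructure
                     + dba_hours_per_month * dba_hourly_rate) * 36
        return recurring + one_time_migration

    aws_option = three_year_tco(monthly_infrastructure=20_000,   # placeholder quote
                                dba_hours_per_month=80,          # manual tuning, patching
                                dba_hourly_rate=90)
    autonomous_option = three_year_tco(monthly_infrastructure=15_000,  # placeholder quote
                                       dba_hours_per_month=20,   # automation absorbs admin
                                       dba_hourly_rate=90)

    savings = 1 - autonomous_option / aws_option
    print(f"Three-year TCO reduction: {savings:.0%}")   # with these placeholders, ~38%

The point of the exercise is that the administrative labor line, not just the infrastructure line, is where autonomous services change the math.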

One of the most interesting aspects of the Autonomous Database in general, and one that Oracle will need to explain further, is how to guide companies with existing transactional databases and data warehouses to an Autonomous environment. It is no secret that every enterprise IT department is its own special environment driven by a combination of business rules, employee preferences, governance, regulation, security, and business continuity expectations. At the same time, IT is used to automation and rapid processing for some aspects of technology management, such as threat management and logging for patches and other basic transactions. But considering IT’s need for extreme customization, how does IT gain enough visibility into the automated decisions made in indexing and ongoing optimization?

At this point, Amalgam Insights believes that Oracle is pushing a fundamental shift in database management that will likely lead to the automation of manual technical management tasks. This change will be especially helpful for net-new databases, where organizations can use the Autonomous Database Cloud to help establish business rules for data access, categorization, and optimization. This is likely a no-brainer decision, especially for Oracle shops that are strained in their database management resources and seeking to handle more data for new transaction-based business needs or machine learning.

For established database workloads, enterprises will have to think about how, or whether, to transfer existing enterprise databases to the Autonomous Database Cloud. Although enterprises will likely gain some initial performance improvements and potentially reduce the support costs associated with large databases, they will also likely spend time double-checking the decisions and lineage associated with Autonomous Database decisions, both in test and in deployment settings. Amalgam Insights would expect Autonomous Database management to lead to indexing, security, and resource management decisions that may be more optimal than human-led decisions, but with a logic that may not be fully transparent to IT departments with strongly defined and governed business rules and processes.

Although Amalgam Insights is convinced that Oracle Autonomous Database is the beginning of a new stage of Digitized and Automated IT, we also believe that a next step for Oracle Autonomous Database Cloud will be to create governance, lineage, and audit packages to support regulated industries, legislative demands, and documentation describing the business rules behind Autonomous logic. Amalgam Insights expects that Oracle would want to keep specific algorithms and automation logic as proprietary trade secrets. But without some level of documentation that is traceable and auditable, large enterprises will have to conduct significant work on their own to figure out whether they can transfer large databases to Oracle Autonomous Database Cloud, which Amalgam Insights would expect to be an important part of Oracle’s business model and cloud revenue projections going forward.

To read the full report with additional insights and details on the Oracle Autonomous Transaction Processing announcement, please download the full report on Oracle’s launch of Autonomous Transaction Processing, available at no cost for a limited time.


Azure Advancements Announced at Microsoft Inspire 2018

Last week, Microsoft Inspire took place, which meant that Microsoft made a lot of new product announcements regarding the Azure cloud. In general, Microsoft is both looking up at and trying to catch up to Amazon from a market share perspective while trying to keep its current #2 place in the Infrastructure as a Service world ahead of the rapidly growing Google Cloud Platform as well as IBM and Oracle. Microsoft Azure, along with Amazon, is generally regarded as a market-leading cloud platform that provides storage, computing, and security and is moving toward analytics, networking, replication, hybrid synchronization, and blockchain support.

Key functionalities that Microsoft has announced include:
Continue reading Azure Advancements Announced at Microsoft Inspire 2018


Market Milestone: Informatica and Google Cloud Partner to Open Up Data, Metadata, Processes, and Applications as Managed APIs

[This Research Note was co-written by Hyoun Park and Research Fellow Tom Petrocelli]

Key Stakeholders: Chief Information Officers, Chief Technical Officers, Chief Digital Officers, Data Management Managers, Data Integration Managers, Application Development Managers

Why It Matters: This partnership demonstrates how Informatica’s integration Platform as a Service brings Google Cloud Platform’s Apigee products and Informatica’s machine-learning-driven connectivity into a single solution.

Key Takeaway: This joint Informatica-Google API management solution provides customers with a single solution that provides data, process, and application integration as well as API management. As data challenges evolve into workflow and service management challenges, this solution bridges key gaps for data and application managers and demonstrates how Informatica can partner with other vendors as a neutral third-party to open up enterprise data and support next-generation data challenges.
Continue reading Market Milestone: Informatica and Google Cloud Partner to Open Up Data, Metadata, Processes, and Applications as Managed APIs


What Wall Street is missing regarding Broadcom’s acquisition of CA Technologies: Cloud, Mainframes, & IoT

(Note: This blog contains significant contributions from long-time software executive and Research Fellow Tom Petrocelli)

On July 11, Broadcom ($AVGO) announced an agreement to purchase CA for $18.9 billion. If this acquisition goes through, this will be the third largest software acquisition of all time behind only Microsoft’s $26 billion acquisition of LinkedIn and Facebook’s $19 billion acquisition of WhatsApp. And, given CA’s focus, I would argue this is the largest enterprise software acquisition of all time, since a significant part of LinkedIn’s functionality is focused on the consumer level.

But why did Broadcom make this bet? The early reviews have shown confusion with headlines such as:
Broadcom deal to buy CA makes little sense on the surface
4 Reasons Broadcom’s $18.9B CA Technologies Buy Makes No Sense
Broadcom Buys CA – Huh?

All of these articles basically home in on the fact that Broadcom is a hardware company and CA is a software company, which leads to the conclusion that these two companies have nothing to do with each other. But to truly understand why Broadcom and CA can fit together, let’s look at the context.

In November 2017, Broadcom purchased Brocade for $5.5 billion to build out its position in the data center and networking markets. This acquisition expanded on Broadcom’s strengths in supporting mobile and connectivity use cases by extending Broadcom’s solution set beyond the chip and into actual connectivity.

Earlier this year, Broadcom had tried to purchase Qualcomm for over $100 billion. Given Broadcom’s lack of cash on hand, this would have been a debt-based purchase with the obvious goal of rolling up the chip market. When the United States blocked this acquisition in March, Broadcom was likely left with a whole lot of money ready to deploy that needed to be used or lost and no obvious target.

So, add these two together and Broadcom had both the cash to spend and a precedent for showing that it wanted to expand its value proposition beyond the chip and into larger integrated solutions for two little trends called “the cloud,” especially private cloud, and “the Internet of Things.”

Now, in that context, take a look at CA. CA’s bread and butter comes from its mainframe solutions, which make up over $2 billion in revenue per year. Mainframes are large computers that handle high-traffic and dedicated workloads and increasingly need to be connected to more data sources, “things,” and clients. Although CA’s mainframe business is a legacy business, that legacy is focused on some of the biggest enterprise computational processing needs in the world. Thus, this is an area that a chipmaker would be interested in supporting over time. The ability to potentially upsell or replace those workloads over time with Broadcom computing assets, either through custom mainframe processors or through private cloud data centers, could add some predictability to the otherwise cyclical world of hardware manufacturing. Grab enterprise computing workloads at the source and then custom build to their needs.

This means there’s a potential hyperscale private cloud play here as well for Broadcom, bringing Broadcom’s data center networking business together with CA’s server management capabilities, which end up looking at technical monitoring issues from both a top-down and a bottom-up perspective.

CA is also strong in supporting mobile development, developer operations (DevOps), API management, IT Operations, and service level management in its enterprise solutions business, which generated $1.75 billion in revenue over the past year. On the mobile side, this means that CA is a core toolset for building, testing, and monitoring the mobile apps and Internet of Things applications that will be running through Broadcom’s chips. To optimize computing environments, especially in mobile and IoT edge environments where computing and storage resources are limited, applications need to be optimized for the available hardware. If Broadcom is going to take over the IoT chip market over time, the chips need to support relevant app workloads.

I would also expect Broadcom to increase investment in CA’s Internet of Things and mobile app dev departments once Broadcom completes this transaction. Getting CA’s dev tools closer to silicon can only help performance and help Broadcom provide out-of-the-box IoT solutions. This acquisition may even push Broadcom into the solutions and services market, which would blow the minds of hardware analysts and market observers but would also be a natural extension of Broadcom’s current acquisitions to move through the computing value stack.

From a traditional OSI perspective, this acquisition feels odd because Broadcom is skipping multiple layers between its core chip competency and CA’s core competency. But the Brocade acquisition helps close the gaps even after spinning off Ruckus Wireless, Lumina SDN, and data center networking businesses. Broadcom is focused on processing and guiding workloads, not on transport and other non-core activities.

So, between mainframe, private cloud, mobile, and IoT markets, there are a number of adjacencies between Broadcom & CA. It will be challenging to knit together all of these pieces accretively. But because so much of CA’s software is focused on the monitoring, testing, and security of hardware and infrastructure, this acquisition isn’t quite as crazy as a variety of pundits seem to think. In addition, the relative consistency of CA’s software revenue compared to the highs and lows of chip building may also provide some benefits to Broadcom by providing predictable cash flow to manage debt payments and to fund the next acquisition that Hock Tan seeks to hunt down.

All this being said, this is still very much an acquisition out of left field. I’ll be fascinated to see how this transaction ends up. It is somewhat reminiscent of Oracle’s 2009 acquisition of Sun to bring hardware and software together. This does not necessarily create confidence in the acquisition, since hardware/software mergers have traditionally been tricky, but it doesn’t disprove the synergies that do exist. In addition, Oracle’s move points out that Broadcom seems to have skipped the step of purchasing a relevant casing, device, or server company. Could this be a future acquisition to bolster existing investments and push further into the world of private cloud?

A key challenge and important point that my colleague Tom Petrocelli brings up is that CA and Broadcom sell to very different customers. Broadcom has been an OEM-based provider while CA sells directly to IT. As a result, Broadcom will need to be careful in maintaining CA’s IT-based direct and indirect sales channels and would be best served to keep CA’s go-to-market teams relatively intact.

Overall, the Broadcom acquisition of CA is a very complex puzzle with several potential options.

1. The diversification efforts will work to smooth out Broadcom’s revenue over time and provide more predictable revenues to support Broadcom’s continuing growth through acquisition. This will help their stock in the long run and provide financial benefit.
2. Broadcom will fully integrate the parts of CA that make the most sense for it to have, especially the mobile security and IoT product lines, and sell or spin off the rest to help pay for the acquisition. Although the Brocade spinoffs occurred prior to the acquisition, there are no forces preventing Broadcom from spinning off non-core CA assets and products, especially those that are significantly outside the IoT and data center markets.
3. In a worst case scenario, Broadcom will try to impose its business structure on CA, screw up the integration, and kill a storied IT company over time through mismanagement. Note that Amalgam Insights does not recommend this option.

But there is some alignment here, and it will be fascinating to see how Broadcom takes advantage of CA’s considerable IT monitoring capabilities, uses CA’s business to increase chip sales, and applies CA’s cash flow to continue Broadcom’s massive M&A efforts.


Amalgam Provides 4 Big Recommendations for Self-Service BI Success

 

Recently, my colleague Todd Maddox, Ph.D., the most-cited analyst in the corporate training world, and I were looking at the revolution of self-service BI, which has allowed business analysts and data scientists to quickly and easily explore and analyze their own data. At this point, any BI solution lacking a self-service option should not be considered a general business solution.

However, businesses still struggle to teach and onboard employees on self-service solutions, because self-service represents a new paradigm for administration and training, including the brain science challenges of training for IT. In light of these challenges, Dr. Maddox and I have the following four recommendations for better BI adoption.

  1. Give every employee a hands-on walkthrough. If Self-Service is important enough to invest in, it is important enough to train as well. This doesn’t have to be long, but even 15-30 minutes spent on having each employee understand how to start accessing data is important.
  2. Drive a Culture of Curiosity. Self-Service BI is only as good as the questions that people ask. In a company where employees are set in their ways and not focused on continuous improvement, Self-Service BI just becomes another layer of shelfware.

    Maddox adds: The “shelfware” comment is spot on. I was a master of putting new technology on the shelf! If what I have now works for my needs, then I need to be convinced, quickly and efficiently, that this new approach is better. I suggest asking users what they want to use the software for. If you can put users into one of 4 or 5 bins of business use cases, then you can customize the training and onboard more quickly and effectively.

  3. Build short training modules for key challenges in each department. This means that departmental managers need to commit to recording, say, 2-3 short videos that will cover the basics for self-service. Service managers might be looking for missed SLAs while sales managers look for close rates and marketing managers look for different categories of pipeline. But across these areas, the point is to provide a basic “How-to” so that users can start looking for the right answers.

    Maddox adds: Businesses are strongly urged to include 4 or 5 knowledge check questions for each video. Knowledge testing is one of the best ways to reinforce training. It also provides quick insight into which aspects of your video are effective and which are not. Train by testing!

  4. Analytics knowledge must become readily available. As users start using BI, they need to figure out the depth and breadth of what is possible with BI, formulas, workflows, regression, and other basic tools. This might range from a simple aggregation of useful YouTube videos to a formal program developed in a corporate learning platform.

By taking these training tips from one of the top BI influencers and the top-cited training analyst on the planet, we hope you are better equipped to support self-service BI at scale for your business.


Mapping Multi-Million Dollar Business Value from Machine Learning Projects

Amalgam has just posted a new report: The Roadmap to Multi-Million Dollar Machine Learning Value with DataRobot. I’m especially excited about this report for a couple of reasons.

First, this report documents multiple clear value propositions for machine learning that led to documented annual value of over a million dollars. This is an important metric to demonstrate at a time when many enterprises are still asking why they should be putting money into machine learning.

Second, Amalgam introduces a straightforward map for understanding how to construct machine learning projects designed to create multi-million dollar value. Rather than simply hope and wish for a good financial outcome, companies can actually model whether a project is likely to justify the cost of machine learning (especially the specialized mathematical and programming skills needed to make it work).

Amalgam provides the following starting point for designing multi-million dollar machine learning value:

Stage One is discovering the initial need for machine learning, which may sound tautological. “To start machine learning, find the need for machine learning…” More specifically, look for opportunities to analyze hundreds of variables that may be related to a specific outcome, but where relationships cannot be quickly analyzed by gut feel or basic business intelligence. And look for opportunities where employees already have gut feelings that a new variable may be related to a good business outcome, such as better credit risk scoring or higher quality supply chain management. Start with your top revenue-creating or value-creating department and then deeply explore.

Stage Two is about financial analysis and moving to production. Ideally, your organization will find a use case involving over $100 million in value. This does not mean that your organization is making $100 million in revenue, as activities such as financial loans, talent recruiting, and preventative maintenance can potentially lead to billions of dollars in capital or value being created even if the vendor only collects a small percentage as a finder’s fee, interest, or maintenance fee. Once the opportunity exists, move on it. Start small and get value.

Then finally, take those lessons learned and start building an internal Machine Learning Best Practices or Center of Excellence organization. Again, start small and focus on documenting what works within your organization, including the team of employees needed to get up and running, the financial justification needed to move forward, and the technical resources needed to operationalize machine learning on a scalable and predictable basis. Drive the cost of Machine Learning down internally so that your organization can tackle smaller problems without being labor, cost, and time-prohibitive.

This blog is just a starting point for the discussion of machine learning value Amalgam covers in The Roadmap to Multi-Million Dollar Machine Learning Value with DataRobot. Please check out the rest of the report as we discuss the Six Stages of moving from BI to AI.

This report also defines a financial ROI model associated with a business-based approach to machine learning.

If you have any questions about this blog, the report, or how to engage Amalgam Insights in providing strategy and vendor recommendations for your data science and machine learning initiatives, please feel free to contact us at info@amalgaminsights.com.


5 Stages of The Technology Expense Management Market (re: Calero Acquires Veropath)

In my recent Market Milestone, Calero Acquires Veropath to Bolster its Global Role in Technology Expense Management, I made a quick comment about Veropath as an “accretive acquisition target.” But then I realized that I hadn’t explained what that meant from an Amalgam perspective.

From Amalgam’s perspective, the Technology Expense Management market (aka Telecom Expense Management, although these solutions now regularly manage a wide variety of IT assets, services, and subscriptions) roughly breaks out into companies of five sizes, each with capabilities that could be considered “accretive” to larger organizations. I should add that there are a number of additional TEM companies at these sizes that do not fit these profiles. Outlying companies might be very profitable, stable, and good providers, but they are not typically considered great acquisition targets.

The first size is 1 – 10 employees. These are companies that are usually good at a specific task or have a single product custom-suited to managing a specific capability, such as automated invoice processing, rate plan optimization, or network data management. The companies in this space tend to combine specialization with technical subject matter expertise, but lack the support staff to manage a large number of clients. These are a combination of technology acquisitions and acqui-hires.

The second size is 10 – 30 employees. These Technology Expense Management companies have found a specific geographical, market, or service niche and tend to have some combination of technology and services. There is a long tail of TEM companies in this category that lack the scale to go national, but may have built strong geographic, technical, or process management capabilities. However, these companies typically lack the sales and marketing engine to expand beyond their current size, meaning that further growth will often require outside capital and additional investment in revenue-creating activities.

The third size is roughly between 30 and 75 employees. At this size, the TEM vendor has found a strong go-to-market message and is regularly supporting both mid-market and enterprise customers. These vendors have built their own platform, have a significant internal support team, and typically have a strong sales leader who is either the CEO or a VP of Sales. At this point, Amalgam notes that the biggest challenge for these vendors is creating a management team empowered to make good decisions, which requires the CEO to let go of decisions. This management transition is quite difficult because it adds a lot of complexity to the business with very little immediate benefit to the CEO or the firm’s employees. However, at this scale, TEM businesses are also a good target for acquisition, as they have built out every business function needed to be a successful and stable long-term business. Roughly speaking, these companies tend to have about $100 million to $500 million in spend under management and run as stable, profitable businesses. There are a number of strong TEM vendors in this space including, but not limited to, Avotus, Ezwim, ICOMM, Mobile Solutions, Network Control, SaaSwedo, SmartBill, Tellennium, Valicom, vCom, VoicePlus, and Wireless Analytics.

The fourth size is between 75 and 1,000 employees. These TEM companies are rarely acquired and start becoming acquirers of other TEM companies because they have successfully built an organization that can scale and run multiple business units. At this size, TEM companies start to manage over a billion dollars a year in spend and tend either to be publicly traded or backed by private equity. At this point, TEM companies also start running into adjacent competitors in markets such as Managed Mobility Services, SaaS Vendor Management, Cloud Service Management, IT Asset Management, and other related IT management areas. This is an interesting area for TEM because, after several years of watching Tangoe acquire businesses at this scale in the early 2010s, multiple new vendors reached this scale in the mid-to-late 2010s. Currently, Amalgam considers Calero, Cass, Cimpl, Dimension Data, MDSL, Mobichord, Mer Telemanagement Systems (MTS), One Source Communications, Sakon, and TNX to be representative of this tier of large TEM providers.

Currently, the fifth size of 1,000+ employees is a market of one: Tangoe. This company has grown both organically and acquisitively to manage over $38 billion in technology spend, making it roughly six times larger than its nearest competitor. At this size, Tangoe focuses on large enterprise and global management challenges and is positioned to start pursuing adjacent markets more aggressively. Amalgam believes that there is sufficient opportunity in this market for additional firms of this scale, however, and expects one or more of Calero, Cass Information Systems, MDSL, or Sakon to leap into this scale in the next three-to-five years.

So, when Amalgam refers to “accretive opportunities” in the TEM space from an acquisition perspective, this is the rough context that we use as a starting point. Of course, with the 100+ firms that we track in this market, any particular category has both nuance and personalization in describing individual firms. If you have any questions regarding this blog, please feel free to follow up by emailing info@amalgaminsights.com, and if you’d like to learn more about what Calero has done with this acquisition of Veropath, one of the largest UK-headquartered TEM vendors, please download our Market Milestone, available this week (or while supplies last) for free.