
IBM Plans to Acquire IT Financial Management Leader Apptio: Consequences for the Enterprise IT Market

On June 26, 2023, IBM announced its intention to acquire IT Financial Management vendor Apptio for $4.6 billion. The acquisition is intended to strengthen IBM's IT automation and business value documentation capabilities. It also raises the big questions: is this deal good for IBM and Apptio customers, and who benefits most from it?

As an industry analyst who has covered the IT expense management space and first coined the terms Technology Expense Management and Technology Lifecycle Management as evolutions of the IT Asset Management and Telecom Expense Management markets, I have been looking at these markets and vendors for the past 15 years. In that time, IBM has gone through a variety of investments in the Technology Lifecycle Management space to manage the assets, projects, and costs associated with IT environments, and Apptio has evolved from a nascent startup into a market leader.

When Amalgam Insights is asked, "What do you think of IBM's acquisition of Apptio?" the answer requires exploring the back story and the starting points for consideration, as there is much more to this acquisition than simply declaring it "good" or "bad." Apptio is a market-leading vendor across IT financial management, SaaS Management, Cloud Cost Management (where Apptio is a current Amalgam Insights Distinguished Vendor), and Project Management. But there is a multi-decade history leading up to this acquisition, including both IBM's pursuit of Technology Lifecycle Management solutions and Apptio's long road to becoming a market leader in IT financial management.

Contextualizing the Acquisition

To understand this acquisition in its full context, let’s explore a partial timeline of the IBM, IBM partner, and Apptio journeys to get to this point:

1996 – IBM purchases Tivoli Systems for $743 million (approximately $1.4 billion in 2023 dollars) to substantially enter the IT asset management and monitoring business. Tivoli goes on to become a market standard for IT asset management.

2002 – IBM acquires Rational Software for $2.1 billion to support software development and monitoring.

2007 – Apptio is founded as an IT financial management solution to support the planning, budgeting, and forecasting needs of CIOs and CFOs seeking to better understand their holistic IT ecosystem. At the time, it is seen as a niche capability compared to Tivoli's broad set of functionalities, but it is promising enough to attract Andreessen Horowitz's attention as the firm's first investment in 2009.

February 2012 – IBM acquires Emptoris, which includes a leading telecom expense management solution called Rivermine, to support sourcing, inventory management, and supply chain management as part of its Smarter Commerce initiative.

May 2015 – IBM divests its Rivermine operations, selling the technology expense management business unit to Tangoe. Tangoe uses the customizability of the Rivermine platform to support complex IT expense and payment management environments for large enterprises.

November 2015 – IBM acquires Gravitant, a hybrid cloud brokerage solution used to help companies purchase cloud computing services across cloud environments. Later renamed IBM Cloud Brokerage, this capability was intended to support IBM's Global Technology Services unit in supporting multi-cloud and complex enterprise hybrid cloud environments. The logic behind this acquisition proved accurate in the long run, but the timing was too early considering that the multi-cloud era is really only beginning now in the 2020s.

December 2018 – HCL purchases a variety of IBM software products for $1.8 billion, including AppScan and BigFix. Although these Rational and Tivoli products provided enterprise value for many years, they eventually became outdated and came to be seen as legacy monitoring products.

January 2019 – Apptio is acquired by Vista Equity Partners for $1.94 billion. At the time, I thought this was a bargain even though it represented a 53% premium to the trading price. Apptio's stock had fallen rapidly due to public market overreaction, and Vista Equity came in with a strong offer that pleased institutional investors. With investments in IT and financial software companies including BetterCloud, Jamf, Trintech, and Vena, Vista Equity was seen as an experienced buyer capable of providing value to Apptio.

May 2019 – Apptio acquires Cloudability, entering the cloud cost management or Cloud FinOps (Financial Operations) space. With this acquisition, Apptio answered one of my long-time criticisms of the vendor: that it did not directly manage IT spend, having held out on directly managing a trillion dollars of enterprise telecom, network, and mobility spend. This transaction brought visibility into $9 billion in multi-cloud spend across the Big 3 providers under Apptio's supervision while maintaining Apptio's vendor-neutral approach to IT finances.

December 2020 – IT Asset Management vendor Flexera is acquired by private equity firm Thoma Bravo. Over the next couple of years, Flexera develops a strong relationship with IBM to support IT Asset Management.

December 2020 – IBM acquires Instana to support observability and Application Performance Management. As real-time continuity, remediation, and observability have become increasingly important for monitoring the health of enterprise IT, this acquisition provides a crucial granular perspective for IBM clients.

February 2021 – Apptio acquires Targetprocess to support agile product and portfolio management. The ability to plan and budget projects and products allows Apptio to support IT at a more granular, contingent, and business-contextual level.

June 2021 – IBM acquires Turbonomic, an application resource, network performance, and cloud resource management solution. With this acquisition, IBM enters the FinOps space. In our 2022 Cloud Cost and Optimization SmartList, we listed IBM Turbonomic as a Distinguished Vendor noting that it focused “on application performance” and that the “software learns from organizations’ actions, so recommendations improve over time.”

October 2022 – Flexera One with IBM Observability aggregates cloud spend across multiple clouds. This offering, combined with Flexera One's status as an IBM partner, gives IBM customers an option for multi-cloud spend management and the ability to purchase cost optimization based on cloud spend.

June 2023 – We come back to the present day, when IBM has agreed to purchase Apptio. The trend is now visible: IBM has invested in IT management solutions over the past couple of decades but has struggled to maintain market-leading status in those applications over time for a variety of reasons, including market timing, market shifts, and strategic positioning.

Concerns and Considerations

What is happening here? The problem isn't that IBM is targeting bad companies, as IBM has consistently chosen top-tier companies and strong enterprise-grade solutions. This trend continues with Apptio, which has managed over $450 billion in IT spend and provides a statistically significant lens for IT spend trends across a wide variety of verticals and geographies. From an acquisition perspective, Apptio makes perfect sense as a market-leading solution executing on sales, marketing, and targeted inorganic growth to provide financial visibility and operational automation across global IT departments.

And the problem is not a lack of interest, as IBM has consistently targeted IT sourcing, expense, and performance management solutions with some success. IBM usually knows what it is trying to accomplish in purchasing solutions (with the exception of the missed Rivermine opportunity) and has done a good job of identifying where it needs to go next. As an example, IBM was early, perhaps too early, in pursuing multi-cloud brokerage services, but in retrospect there is no doubt that multi-cloud management was the future of IT.

Based on my long market perspective of the Technology Lifecycle Management market, I think IBM has run into two main issues in this market: market size and partnership opportunities.

First, look at market size. The Technology Lifecycle Management market simply has not traditionally been an extremely large multi-billion dollar market on the scale of analytics, mainframes, or services. ITFM and related IT cost management services will always struggle to be much larger than a couple of billion dollars in revenue, as proven by market leaders across IT finance and cost such as Apptio, Tangoe, Calero, Zylo, Cass Information Systems, Flexera, Snow Software, CloudHealth (now VMware Aria), and Spot by NetApp. All of these solutions have grown to the point of managing billions of dollars, but none of these standalone businesses or business units have come close to reaching a billion dollars in annual recurring revenue. This is not a problem in itself, but it is traditionally hard for behemoth global enterprises with $100 billion+ in annual revenue expectations to stay fully committed to businesses of this size without trying to turn them into "larger" solutions that often lose focus.

A second issue is that IBM has a lot of internal pressure to play nicely with partners. The recent Flexera One partnership announcements are a good example: Flexera has quickly emerged as a strong partner for IT asset management and multi-cloud cost management challenges, and that relationship will now have to be rationalized against the capabilities that Apptio brings to market once this acquisition is completed. When IBM has made commitments and plans to build significant services practices around a large partnership, it can be difficult to shift away from those plans no matter how significant the acquisition is. The challenge here is that even if the direct software revenue pales in comparison to the services wrapped around it, the service revenue still depends on the quality of the software used to provide those services.

And despite any internal concerns about these issues, this is not a deal that Apptio and Vista Equity could refuse. The basic math of adding $2.66 billion in market value (the $4.6 billion sale price against the $1.94 billion purchase price) in four and a half years, or roughly $600 million per year minus the cost of acquisitions, makes this a no-brainer. Anyone who did not seriously consider this transaction would be considered negligent.

In addition, there are good reasons for Apptio to join a larger organization. There are limits to the organic development that Apptio can pursue across the Technology Lifecycle Management cycle, spanning sourcing, observability, contingent resources and services, continuity planning, and MACH (Microservices, APIs, Cloud-Native, Headless) architecture support, compared to what IBM (including Red Hat OpenShift and IBM Consulting) can provide. And IBM is obviously still a core provider when it comes to global IT support, with a vested interest in helping global enterprises and highly regulated organizations with their IT planning capabilities.

Recommendations

So, what does this mean for IBM and Apptio customers? This is a nuanced decision where every current client will have specific exceptions associated with the customization of their IT portfolio. But here are some general starting points that we are providing as guidelines for considering this transaction.

For IBM: this is an acquisition where IBM is making a good decision, but success is not guaranteed simply by choosing the right vendor in the right space. As a starting point, there will be additional work needed to rationalize Apptio's portfolio in light of how Turbonomic goes to market and how the Flexera One partnership is currently structured. Amalgam Insights hopes that Apptio will be the umbrella brand for IT oversight in the near future, just as IBM Rational, IBM Tivoli, and IBM Lotus served as strong brands and focal points. IBM already has a variety of cloud and AIOps capabilities across Turbonomic, Instana, and Red Hat OpenShift management tools, positioning Apptio to serve as both a FinOps and CloudOps hub and a strong go-to-market brand.

There is room for mutual success in this vision, as Flexera One's ITAM capabilities are outside the scope of Apptio's core concerns. This likely means that Flexera's cloud cost capabilities will be shelved in favor of Apptio Cloudability, and this needs to be a firm commitment. IBM needs to be a bit more greedy in supporting its direct software products than it traditionally has been over the last decade, maintaining the best-in-breed capabilities that Apptio brings to market, as the talent and vision of the current Apptio team are a significant portion of the value being acquired. IBM can be a challenging environment for software solutions, as every decision is seen through a multitude of lenses with the goal of finding some level of consensus across a variety of conflicting stakeholders. As this balance is sought, Amalgam Insights hopes that IBM focuses on building its direct software business and keeping Apptio's finance, cost, and project management capabilities at a market-leadership level that will be championed by customers and analysts, even if this comes at the cost of growing partnerships. It can be easy for IBM software solutions to get short shrift when their direct revenue pales in comparison to larger services contracts, but the newest generation of IT to support new data stacks, hybrid cloud, and AI-enabled decisions and generative assets is in its infancy, and IBM has acquired both the solutions and a product and service team prepared to take this challenge head-on.

For Apptio: The past five years have been a strong validation of the continued opportunities that exist in IT Financial Management across hybrid cloud, software, and project management. There are still massive opportunities in contingent labor and traditional telecom and data center cost management markets as well as the opportunity to get more granular with API, transactional logs, and technological behavior that can be used to align the cost, budget, and health of the IT ecosystem. Amalgam Insights hopes that Apptio is treated similarly to Red Hat as a growth engine for the company and that Apptio has the operational flexibility to continue operating on its current path, but with more ambition matching the scale of IBM’s technology relationships and goals of solving the world’s biggest challenges.

For Apptio customers: You are working with a market leader in some area of IT finance or multi-vendor public cloud management and should hold firm on demands to retain the technology and support structure currently in place. As you move to IBM contractual terms, make sure that Apptio-related service terms, commitments, and responsibilities stay in place. This is an area where Amalgam Insights expects the Technology Business Management (TBM) Council to prove useful as a collective voice of executive demands to drive future Apptio development and evolution. Be aware that there are additional stakeholders at the table when it comes to the future of Apptio, and it will be increasingly important for direct Apptio customers to maintain and increase demands in light of the increased complexity that will inevitably become part of the management of Apptio.

For IBM customers: You are likely already an Apptio customer, given the significant overlap and synergy between the two customer bases. But if not, this is a good time to evaluate Apptio as part of the overall IBM relationship as a dedicated solution for finance and cost management. In doing so, get IBM executive commitment regarding the core features and functionality that will be strategically important for aligning IT activity to business growth. To live up to the cliches that every company is now a "software company" or a "data-driven company," companies must have strong financial controls over the technology components that drive corporate change. At the same time, it is important to maintain a best-in-breed approach rather than be locked into an aging ERP-like experience, as many companies experienced over the past decade.

These considerations are all a starting point for how to take action as IBM moves towards acquiring Apptio. Amalgam Insights expects little to no concern with the acquisition moving forward, as it is beneficial to all parties and lacks the monopoly or antitrust issues that have slowed down larger acquisitions.

If you are seeking additional guidance on more granular aspects of considering Apptio, Flexera, IBM Turbonomic, or other vendors across IT finance, cloud FinOps, SaaS Management, or related Technology Lifecycle Management topics, please feel free to contact Amalgam Insights to schedule an inquiry or briefing time.


Instant Mediocrity: a Business Guide to ChatGPT in the Enterprise

In June of 2023, we are firmly in the midst of the highest of hype levels for Generative AI, as ChatGPT has taken over the combined hype of the Metaverse, cryptocurrency, and cloud computing. We now face a deluge of self-proclaimed "prompt engineering" experts ready to provide guidance on the AI tools that will make your life easier. At the same time, we are running into a cohort of technologists who warn that AI is only one step away from achieving full sentience and taking over the world as an apocalyptic force.

In light of this extreme set of hype drivers, the rest of us face some genuine business concerns associated with generative AI. But our issues are not in worshipping our new robot overlords or in pursuing the next generation of "digital transformation" focused on an AI-driven evolution of our businesses that lays off half the staff. Rather, we face more prosaic concerns regarding how to actually use Generative AI in a business environment and take advantage of the productivity gains that are possible with ChatGPT and other AI tools.

Anybody who has used ChatGPT in their areas of expertise has quickly learned that ChatGPT has a lot of holes in its "knowledge" of a subject that prevent it from providing complete answers, timely answers, or productive outputs that can truly replace expert advice. Although Generative AI provides rapid answers with a response rate, cadence, and confidence that mimic human speech, it is often missing either the context or the facts to provide the level of feedback that a colleague would. Rather, what we get is "instant mediocrity": an answer that matches what a mediocre colleague would provide if given a half-day, full day, or week to reply. If you're a writer, you will quickly notice that the essays and poems that ChatGPT writes are often structurally accurate but lack the insight and skill needed to complete a university-level assignment.

And the truth is that instant mediocrity is often a useful level of skill. If one is trying to answer a question that has one of three or four answers, a technology that is mediocre at that skill will probably give you the right answer. If you want to provide a standard answer for structuring a project or setting up a spreadsheet to support a process, a mediocre response is good enough. If you want to remember all of the standard marketing tools used in a business, a mediocre answer is just fine. As long as you don’t need inspired answers, mediocrity can provide a lot of value.

Here are a few things to consider as your organization starts using ChatGPT. Just as when the iPhone launched 16 years ago, you don't really have a choice on whether your company is using ChatGPT or not. All you can do is figure out how to manage and govern its use. Our recommendations typically fall into one of three major categories: Strategy, Productivity, and Cost. Given the relatively low price of ChatGPT both as a consumer-grade tool and as an API, where current pricing is typically a fraction of the cost of doing similar tasks without AI, the focus here will be on strategy and productivity.

Strategy – Every software company now has a ChatGPT roadmap. Even mid-sized companies typically have hundreds of apps under management, so there will now be 200, 500, or however many potential ways for employees to use ChatGPT over the next 12 months. Figure out how GPT is being integrated into the software and whether GPT is being used directly to process data or indirectly to help query, index, or augment data.

Strategy – Identify the value of mediocrity. For the average large enterprise, mediocrity from a query-writing or indexing perspective is often a much higher standard than the mediocrity of text autocompletion. Seek mediocrity in tasks where the average online standard is already higher than the average skill within your organization.

Strategy – How will you keep proprietary data out of Gen AI? Most famously, Samsung recently had a scare when it saw how AI tools were echoing and using proprietary information. How are companies ensuring both that they have not put new proprietary data into generative AI tools for potential public use and that their existing proprietary data was not used to train generative AI models? This governance will require greater visibility from AI providers, with detail on the data sources that were used to build and train the models we are using today.

Strategy – On a related note, how will you keep commercially used intellectual property from being used by Gen AI? Most intellectual property covered by copyright or patent does not allow for commercial reuse without some form of license. Do companies need to figure out some way of licensing data that is used to train commercial models? Or can models verify that they have not used any copyrighted data? Even if users have relinquished copyright to the specific social networks and websites that they initially wrote for, this does not automatically give OpenAI and other AI providers the same license to use that data for training. And can AIs own copyright? Once large content providers such as music publishers, book publishers, and entertainment studios realize the extent to which their intellectual property is at risk with AI, and somebody starts making millions with AI-enabled content that strongly resembles existing IP, massive lawsuits will ensue. If you are an original IP provider, be ready to defend that IP. If you are using AI, be wary of actively commercializing or claiming ownership of AI-enabled work for anything other than parody or stock work that can be easily replaced.

Productivity – Is the code enterprise-grade: secure, compliant, and free of private corporate metadata? One of the most interesting new use cases for generative AI is the ability to create working code without prior knowledge of the programming language. Although generative AI currently cannot create entire applications without significant developer engagement, it can quickly provide specifically defined snippets, functions, and calls that may have been a challenge to explicitly search for or to find on Stack Overflow or in public git repositories. As this use case proliferates, coders need to understand their auto-generated code well enough to check for security issues, performance challenges, appropriate metadata and documentation, and reusability based on corporate service and workload management policies. But this will increasingly allow developers to shift from directly coding every line to editing and proofing the quality of code. In doing so, we may see a renaissance of cleaner, more optimized, and more reusable code for internal applications as the standard for code becomes "instantly mediocre."
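To make this review step concrete, here is a small, hypothetical sketch of the kind of hardening a generated snippet often needs. The function name, schema, and injection example below are our own invention, not output from any particular AI tool: a first draft built with string formatting is vulnerable to SQL injection, while the reviewed version uses a parameterized query.

```python
import sqlite3

def find_user(conn: sqlite3.Connection, username: str):
    """Look up a user by name.

    A generated first draft will often build the query with string
    formatting, which is vulnerable to SQL injection:
        conn.execute(f"SELECT id, name FROM users WHERE name = '{username}'")
    The reviewed version below uses a parameterized query instead.
    """
    cursor = conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    )
    return cursor.fetchone()

# Minimal demonstration against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

print(find_user(conn, "alice"))         # (1, 'alice')
print(find_user(conn, "x' OR '1'='1"))  # None: the injection attempt fails
```

The point is not this particular vulnerability, but that every generated snippet deserves this kind of line-by-line review before it enters a corporate codebase.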

Productivity – Quality, not quantity. There are hundreds of AI-enabled tools out there providing chat, search-based outputs, generative text and graphics, and other AI capabilities. Measure twice and cut once in choosing the tools that help you. It is better to find the five tools that matter and maximize the mediocre output you receive than to spread effort across the 150 tools that don't.

Productivity – Are employees trained on fact-checking and proofing their Gen AI outputs? Whether employees are creating text, getting sample code, or prompting new graphics and video, the outputs need to be verified against a fact-based source to ensure that the generative AI has not “hallucinated” or autocompleted details that are incorrect. Generative AI seeks to provide the next best word or the next best pixel that is most associated with the prompt that it has been given, but there are no guarantees that this autocompletion will be factual just because it is related to the prompt at hand. Although there is a lot of work being done to make general models more factual, this is an area where enterprises will likely have to build their own, more personalized models over time that are industry, language, and culturally specific. Ideally, ChatGPT and other Generative AI tools are a learning and teaching experience, not just a quick cheat.

Productivity – How will Generative AI help accelerate your process and workflow automation? Currently, automation tends to be a rules-driven set of processes that lead to the execution of a specific action. But generative AI can do a mediocre job of translating intention into a set of directions or tasks that need to be completed. While generative AI may get the order of actions wrong or make other text-based errors that need to be fact-checked by a human, it can accelerate the initial discovery and staging of the steps needed to complete business processes. Over time, this natural language generation-based approach to process mapping is going to become the standard starting point for process automation. Process automation engineers, workflow engineers, and process miners will all need to learn how prompts can be optimized to quickly define processes.
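As a minimal sketch of that staging step, assuming an invented vendor-onboarding example (the step text and parsing rules here are ours, not any product's), generated directions can be converted into a machine-readable task list that a human then verifies:

```python
import re

def parse_steps(generated_text: str) -> list[str]:
    """Extract numbered steps from generated text into a task list.

    The parsing is deliberately simple; a human still needs to verify
    that the extracted steps are factually correct and in the right order.
    """
    steps = []
    for line in generated_text.splitlines():
        match = re.match(r"\s*\d+[.)]\s+(.*\S)", line)
        if match:
            steps.append(match.group(1))
    return steps

# A sample response of the kind a generative model might produce.
sample = """To onboard a new vendor:
1. Collect the vendor's W-9 and banking details.
2. Run compliance and sanctions screening.
3. Create the vendor record in the ERP system.
"""

for task in parse_steps(sample):
    print("TODO:", task)
```

In practice, the extracted tasks would feed a workflow tool, with the human review step catching wrong ordering or hallucinated steps before anything executes.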

Cost – What will you need to do to build your own AI models? Although the majority of ChatGPT announcements focus on some level of integration between an existing platform or application and some form of GPT or other generative AI tool, there are exceptions. BloombergGPT provides a model based on all of the financial data that it has collected to help support financial research efforts. Both Stanford University Alpaca and Databricks Dolly have provided tools for building custom large language models. At some point, businesses are going to want to use their own proprietary documents, data, jargon, and processes to build their own custom AI assistants and models. When it comes time for businesses to build their own billion-parameter, billion-word models, will they be ready with the combination of metadata definitions, comprehensive data lake, role definitions, compute and storage resources, and data science operationalization capabilities to support these custom models? And how will companies justify the model creation cost compared to using existing models? Amalgam Insights has some ideas that we’ll share in a future blog. But for now, let’s just say that the real challenge here is not in defining better results, but in making the right data investments now that will empower the organization to move forward and take the steps that thought leaders like Bloomberg have already pursued in digitizing and algorithmically opening up their data with generative AI.

Although we somewhat jokingly call ChatGPT "instant mediocrity," this phrase should be taken seriously as an acknowledgment of both the cadence and the response quality being created. Mediocrity can actually be a high level of performance by some standards. Getting an immediate response at the level an average "1x" employee could provide is valuable, as long as it is seen for what it is rather than unnecessarily glorified or exaggerated. Treat it as an intern or student-level output that requires professional review rather than an independent assistant, and it will greatly improve your professional output. Treat it as an expert, and you may end up needing legal assistance. (But maybe not from this lawyer.)


Hybrid Workforce Management: Navigating the Complexities of a Diverse Workforce in the Modern Era

Businesses face the challenge of managing a wide variety of workers: freelancers, outsourcing firms, consultants, contingent labor, and full-time employees. The United States Government Accountability Office estimates that about 40% of workers are not full-time employees, falling into a variety of roles including contractors, part-time workers, and on-call workers who may track time through different systems and methods. With this challenge in mind, companies need a comprehensive management view that automates processes and helps them focus on conducting work more quickly rather than being mired in a sea of paperwork and processes. Workforce management is no longer simply a matter of managing active full-time employees; it is a comprehensive practice that consolidates workforce management across contingent, part-time, full-time, and other categories of workers.

To effectively manage their hybrid workforce across financial, operational, and management capacities, companies must consolidate workforce management tasks onto a single platform and a consistent set of data to avoid constant switching back and forth across inconsistent sources. This platform should include contingent labor, internal labor, time and payroll, workforce scheduling, financial budgeting, employee engagement, and onboarding capabilities, including governance, risk, and compliance management across all areas. Data across all of these areas should ideally live in a single data store that provides a shared version of the truth for all stakeholders in workforce management across HR, finance, and line-of-business management roles.

Amalgam Insights believes the following capabilities should be considered in developing a comprehensive workforce management system.

Manage payroll, performance, and relevant benefits for employees, consultants, and freelancers.

Workforce management efforts must consider the combination of standard payroll systems, time and attendance systems, scheduling systems, contingent labor management, on-demand services, third-party temporary labor and consulting firms, and self-employed contractors. In doing so, companies must decide which benefits and services are consistent across various labor types and what resources are needed to maximize the productivity of each class of workers. Regardless of labor type, compensation must be timely, accurate, and provided according to contractual agreements and relevant labor law. By managing all classes of workers across a shared and consistent set of characteristics, companies are better positioned to see whether there are part-time or contingent workers who should be made full-time employees, or which tasks are better supported by specific workers, skillsets, geographies, shifts, and other identifying work characteristics.

Support differing compliance requirements based on geography, status, and corporate asset access.

Workers with privileged access to trade secrets or classified information must all be treated with relevant compliance and confidentiality standards regardless of their work categorization. At the same time, companies must manage differing standards across wages, benefits, and tax obligations that exist in each jurisdiction where a worker is located.

Standardize Key Performance Indicators (KPIs) and Management by Objectives (MBOs) across different work categories by focusing on the quality and quantity of relevant outputs and deliverables.

Even within a single department, the combination of roles, geographies, seniority, and employee status can lead to widely disparate individual goals. As companies identify appropriate KPIs and MBOs on an individual level to maximize the value that each person brings to the workplace, they must also ensure that teams are aligned to shared corporate success metrics rather than disparate and disconnected metrics that may inadvertently pit workers against each other in pursuit of personal success.

Using a feedback-based set of processes to create a consistent employee experience and corporate culture that provides all workers with a shared set of expectations, goals, worker preferences, and employee support.

Employee feedback is only as useful as the corpus of data created and the management response associated with the suggestions and criticisms provided. At the same time, feedback can also be part of a continuous learning and continuous improvement initiative if feedback is stored as analytic and decision-guiding data that is tracked and monitored over time. Feedback can also be analyzed to see if workers are engaged in processes that are designed to improve the worker or corporate experience.

Understanding the top-line and bottom-line financial contribution of contract and contingent work.

Although revenue per full-time employee is an outward-facing metric used by public companies to show efficiency, the business reality is that contractors and part-time employees also represent investments that should be reflected in workforce costs in determining corporate productivity and profitability. If companies are effectively replacing skills with contingent labor, this should be noted and tracked. Conversely, if there are significant gaps between full-time and other employees, companies should figure out the cause of these gaps and whether they can be closed through training, onboarding, or technical augmentation.
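To make this gap concrete, here is a minimal sketch, with purely hypothetical figures, comparing the outward-facing revenue-per-FTE metric against a blended revenue-per-worker view that counts contingent labor:

```python
# Hypothetical figures illustrating revenue per FTE vs. a blended
# per-worker view that includes contingent labor in the denominator.
revenue = 50_000_000     # annual revenue (hypothetical)
fte_count = 200          # full-time employees
contingent_count = 80    # contractors, part-time, and on-demand workers

revenue_per_fte = revenue / fte_count
revenue_per_worker = revenue / (fte_count + contingent_count)

print(f"Revenue per FTE:    ${revenue_per_fte:,.0f}")     # $250,000
print(f"Revenue per worker: ${revenue_per_worker:,.0f}")  # $178,571
```

The spread between the two numbers is a simple first signal of how much output actually depends on non-FTE labor.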

Taking Steps to Create a Consolidated Workforce Management Environment

Ultimately, companies have a responsibility to support their stakeholders and shareholders. However, this responsibility cannot be met if the company lacks consistent visibility into every worker who contributes to corporate work output, regardless of employment status, geography, department, or role. As companies seek to improve productivity and to allow executives to take a more strategic approach to supporting productive workers while maintaining all compliance responsibilities and a shared version of relevant data, Amalgam Insights provides the following recommendations for human resources, finance, and managerial roles tasked with creating a better work environment.

First, ensure that you have the data necessary to maintain consistency of work expectations. Workers should be able to expect some baseline of employee experience even as they differ in location, employment status, and compensation if for no other reason than to provide every worker with a standard set of expectations and professional responsibilities.

Second, measure profitability and revenue across the entire workforce based on a holistic view of hours, skills, geographies, and business goals. This capability can potentially uncover whether specific hiring or labor sourcing strategies are more profitable, or at least aligned to higher revenue, rather than treating all hiring and contracting simply as an exercise in cost management.

Finally, manage contingent labor with metrics and standards similar to those used for traditional employee labor. When 40% of labor consists of part-time, contract, or on-demand workers, a workforce management solution that only looks at full-time payroll, onboarding, time, attendance, and benefits is no longer sufficient to understand the financial and operational details of the holistic workforce. Frontline and hourly workers seeking to manage their scheduling and time need a consistent and mobile experience on par with full-time workers. Regardless of how these metrics are presented publicly, companies must have an internal basis for tracking the skills and work of every person who conducts work for the company, regardless of formal employment status.

By taking these steps, companies can fully empower all workers, acknowledge their contributions, manage skill portfolios, and further invest in the success of the complete workforce.

Posted on

Why Are Spreadsheets Still A Common FP&A Tool?

“The status quo is not a neutral state, but a mindset to uphold the decisions of the past.”

In 2023, effective business planning, budgeting, and forecasting is a necessary capability for keeping organizations running. Already this year, we have seen unexpected banking failures, unpredictable labor markets, and continued supply chain and logistics disruptions driven by geopolitical conflict. In light of these challenges, Amalgam Insights believes that businesses must have a shared version of the truth that they use as they look at their budget and finances.

And, in this case, we specifically say a “shared” version of the truth rather than the “single version of the truth” typically associated with data warehouses and enterprise applications. Data changes quickly, and every stakeholder can make different decisions in defining and augmenting their data; even basic changes such as language and currency translation can lead to different versions of the truth. In this analytically enhanced and globally complicated world, it makes more sense to have a shared version of the truth that is augmented with personalized or localized data and assumptions. However, this consistently shared version of the truth can be hard to accomplish in organizations where planning is handled in a distributed and personalized manner through spreadsheets. In the enterprise world, finance professionals are inured to the basic realities of auditable data, processes, and results, and they are often asked to provide reports and memos used at the executive level or by external investors and public markets to ascertain the health of the business. Given the assumed importance of this formality, why would experienced professionals use spreadsheets in the first place?

Let’s face it: spreadsheets are easy to use. They are the lingua franca of data, a format that every experienced data user has been trained on. And with plug-ins and Visual Basic, spreadsheets can now handle relatively complex analytic use cases. Even if they aren’t quite data science tools, spreadsheets can provide structured analytic outputs. Spreadsheets are also accessible on every computer through Excel, Google Sheets, or other common spreadsheet software. And with the emergence of cloud-based spreadsheets, it is now possible for two or more people to collaborate within a single spreadsheet.

Spreadsheets also provide users with the ability to customize their own analytic views with personalized views of data and to hypothesize by building their own models. Who hasn’t looked at data and wondered “what if the data looked a bit different?” or “what if a drastic scenario suddenly increases or decreases a fundamental aspect of the business?” In light of COVID, rapid interest rate hikes, global shortages in commodities production, trained labor shortages, and the increasingly unstable banking environment, it is important to be able to test potential extreme assumptions and support a wide variety of scenarios. Between the ease of use, availability, and personalization aspects of spreadsheets, it is not hard to figure out why spreadsheets are still a leading tool for financial planning and analysis.

Even so, Amalgam Insights has found that once organizations pass Dunbar’s number (approximately 150 employees), they start to struggle with collaborative tasks simply because it becomes difficult for any one employee to know all of the other employees who need to be involved in the business planning process, and spreadsheets have been designed for decades to maximize individual productivity rather than collaborative work. From a practical perspective, people tend to work with the people they know best. This is fine for a small company with a dedicated office where everyone knows each other, but according to US Census data, the typical 1,000-person company has 19 locations, making it highly unlikely that all of the key budget stakeholders will be in one office. In this regard, Amalgam Insights finds the following challenges in supporting spreadsheet-based planning at scale.

The distributed nature of work makes spreadsheet governance a challenge, as it is easy for spreadsheets to suffer from version control issues, to lack a structured workflow process, and for file owners to lose control of the inputs and outputs they are responsible for supporting. The lack of version control, workflow, and activity tracking is especially challenging in industries and geographies that require tracking of any personal data related to either employees or customers.

Spreadsheets also struggle in large data environments, which are quickly becoming commonplace in the business planning world. Although a core enterprise database may only be a few gigabytes, accurate planning now often includes access to sales, operations, and potentially even IT transactional data sources that can quickly expand beyond the memory and data size limits that spreadsheets are designed to handle. From Amalgam Insights’ perspective, the size and variety of data are the biggest technical constraints that spreadsheets face as planning solutions.
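One concrete boundary: an Excel worksheet caps out at 2^20 (1,048,576) rows per sheet, so even a modest transactional extract can exceed what a single spreadsheet can hold. A minimal pre-check sketch (the function name and sample row counts are illustrative):

```python
# Excel's hard per-sheet limit is 2**20 rows; larger extracts will be
# truncated or rejected when opened in a spreadsheet.
EXCEL_MAX_ROWS = 2 ** 20  # 1,048,576 rows

def fits_in_spreadsheet(row_count: int) -> bool:
    """Return True if a dataset of row_count rows fits in one Excel sheet."""
    return row_count <= EXCEL_MAX_ROWS

print(fits_in_spreadsheet(500_000))    # True: a small sales extract
print(fits_in_spreadsheet(5_000_000))  # False: a year of transaction logs
```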

Spreadsheets lack advanced analytic and machine learning capabilities. Although algorithmic, statistical, and machine learning tools are increasingly becoming part of the FP&A world, especially in forecasting, Amalgam Insights finds in practice that most organizations have not yet embraced complex analytics as a core part of their FP&A approach. Based on current job site metrics, Amalgam Insights estimates that less than 2% of FP&A professionals currently have a machine learning or data science certification or degree, making this an early innovator capability that has still not crossed the chasm to become a standard job requirement for FP&A.

But perhaps the most significant challenge with spreadsheet models is that they are often fragile. They are created based on the logic of a single person rather than on defined business logic, with little to no documentation of the plans, forecasting algorithms, and multi-tab complexity that inevitably accumulates when a spreadsheet is the primary planning tool for a business, which can lead to costly data accuracy issues. The model is only as adaptable as its creator’s knowledge of the industry and is dependent on that employee staying employed. Considering that it is unrealistic to expect an FP&A senior analyst to remain in that role for more than five years before either getting promoted or getting a better offer, this human risk is a significant challenge for business planning.

As organizations grow to support more than a handful of locations and a set of workers that exceeds Dunbar’s number of 150 colleagues, Amalgam Insights believes that it becomes necessary to adopt a formalized planning solution that supports collaboration, scale, continuous planning across many scenarios, and advanced forecasting analytics. Otherwise, it is difficult for businesses to maintain a consistent and shared version of the truth across financial planning and analysis personnel that can drive both departmental and executive planning efforts.

Ultimately, using spreadsheets as a formal system of record for business planning is risky for any organization with a formal corporate structure, a governed industry or geography, or a significantly distributed business. The ubiquity of the spreadsheet makes it an easy place to start modeling a budget, and the value of the spreadsheet in helping users structure small datasets will exist for the foreseeable future. But the fragility of the data structure, the lack of user and version control governance, the inability to scale, and the difficulty of verifying data against other sources while avoiding human error all lead to the need for a more formalized planning solution over time. As organizations face a future of keeping distributed groups focused on a shared version of the truth while collectively considering a variety of scenarios at any given time, the risk of spreadsheet fragility needs to be weighed against the value of a formalized FP&A solution designed to analyze, govern, and protect all relevant business data, formulas, and outcomes.

Posted on

Navigating The Road to Retail Analytic Success

Analytics in the Retail and Consumer Packaged Goods (CPG) markets is more complex than the average corporate data ecosystem because of the variety of analytic approaches needed to support these organizations. Every business has operational management capabilities for core human resources and financial management, but retail adds the complexities of hybrid workforce management, scheduling, and operational analytics as well as the front-end data associated with consumer marketing, e-commerce, and transactional behavior across every channel.

In contrast, when retail organizations look at middle-office and front-office analytics, they are trying to support a variety of timeframes ranging from intraday decisions associated with staffing and customer foot traffic to the year-long cycles that may be necessary to fulfill large wholesale orders for highly coveted goods in the consumer market. Over the past three years, operational consistency has become especially challenging to achieve as COVID, labor skill gaps, logistical bottlenecks, commodity shortages, and geopolitical battles have all made supply chain a massive dynamic risk factor that must be consistently monitored across both macro and microeconomic business aspects.

The lack of alignment and connection between front-office, middle-office, and administrative analytic outputs can potentially lead to three separate silos of activity in the retail world, connected only by some basic metrics, such as receipts and inventory turnover, that are interpreted in three different ways. Like the parable of the blind men and an elephant, where each person feels one part of the elephant and imagines a different creature, the disparate parts of retail organizations must figure out how to come together, as the average net margin for general retail companies is about 2%, and that margin only gets lower for groceries and for online stores.

Analytic opportunities to increase business value exist across the balance sheet and income statement. Even though consumer sentiment, foot traffic, and online behavior are still key drivers for retail success, analytic and data-driven guidance can provide value across infrastructure, risk, and real-time operations. Amalgam Insights suggests that each of these areas requires a core analytic focus that is different and reflects the nature of the data, the decisions being made, and the stakeholders involved.

Facing Core Retail Business Challenges

First, retail and CPG organizations face core infrastructure, logistics, and data management challenges that typically require building out the historical analysis and quantitative visibility capabilities often associated with what is called descriptive or historical analytics. When looking at infrastructure factors such as real estate, warehousing, and order fulfillment issues, organizations must have access to past trends, costs, transactions, and the breadth of relevant variables that go into real estate costs or into complex order fulfillment tracked through the perfect order index.
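The perfect order index mentioned above is commonly calculated as the product of four component rates, which is why small per-stage misses compound quickly. A minimal sketch with hypothetical figures:

```python
# Perfect order index: the fraction of orders delivered on time, in full,
# damage-free, and with accurate documentation. Figures are hypothetical.
def perfect_order_index(on_time: float, in_full: float,
                        damage_free: float, docs_accurate: float) -> float:
    """Multiply the four component rates to get the perfect-order rate."""
    return on_time * in_full * damage_free * docs_accurate

poi = perfect_order_index(0.95, 0.97, 0.99, 0.98)
print(f"{poi:.1%}")  # 89.4% -- four strong stages still miss 1 in 10 orders
```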

This pool of data ideally combines public data, industry data, and operational business data that includes, but is not limited to, sales trends, receipts, purchase orders, employee data, loyalty information, customer information, coupon redemption, and other relevant transactional data. This set of data needs to be available as analytic and queryable data that is accessible to all relevant stakeholders to provide business value. In practice, this accessibility typically requires some infrastructure investment either by a company or a technology vendor willing to bear the cost of maintaining a governed and industry-compliant analytic data store. By doing so, retail organizations have the opportunity to improve personalization and promotional optimization.

A second challenge that retail analytics can help with is the set of risk and compliance issues that retail and CPG organizations face, including organized theft, supplier risk, and balancing risk and reward tradeoffs. A 2022 National Retail Federation (NRF) survey showed that organized retail crime had increased by over 26% year over year, driving the need to identify and counter organized theft efforts and tactics more quickly. Retailer risk for specific goods and brands also needs to be quantified to identify potential delays and challenges or to determine whether direct store delivery and other direct-to-market tactics may end up being a profitable approach for key SKUs. Risk also matters from a profitability analysis perspective, as retail organizations seek to make tradeoffs between the low-margin nature of retail business and the consumer demand for availability, personalization, automation, brand expansion, and alternative channel delivery that may provide exponential benefits to profits. From a practical perspective, this risk analysis requires investment in a combination of predictive analytics and the ability to translate the variance and marginal cost associated with new investments into projected returns.

A third challenge for retail analytics is to support real-time operational decisions. This use case requires access to streaming and log data associated with massive volumes of rapid transactions, frequently updated time-series data, and contextualized scenarios based on multi-data-sourced outcomes. From a retail outcome perspective, the practical challenge is to make near-real-time decisions, such as same-day or in-shift decisions to support stocking, scheduling, product orders, pricing and discounting decisions, placement decisions, and promotion. In addition, these decisions must be made in the context of broader strategic and operational concerns, such as brand promise, environmental concerns, social issues, and regulatory governance and compliance associated with environmental, social, and governance (ESG) concerns.

As an example, supply chain shortages often come from unexpected sources. An unexpected geopolitical example occurred in the United States when the government’s use of shipping containers as a temporary barrier on the US-Mexico border led to container shortages at US ports. This delay in accessing containers was not predictable based solely on standard retail metrics and behavior and demonstrates one example of how unexpected political issues can affect a hyperconnected logistical supply chain.

Recommendations for Upgrading Retail Analytics in the 2020s

To solve these analytic problems, retail and CPG organizations need to allow line-of-business, logistics, and sourcing managers to act quickly with self-service and on-demand insights based on all relevant data. This ultimately means that to take an analytic approach to retail, Amalgam Insights recommends the following three best practices in creating a more data-driven business environment.

  • Create and implement an integrated finance, operational, and workforce management environment. Finance, inventory, and labor must be managed together in an integrated business data store and business planning environment or the retail organization falls apart. Whether companies choose to do this by knitting together multiple applications with data management and integration tools or by choosing a single best-in-breed suite, retail businesses have too many moving parts to split up core operational data across a variety of functional silos and business roles that do not work together. In the 2020s, this is a massive operational disadvantage.
  • Adopt prescriptive analytics, decision intelligence, and machine learning capabilities above and beyond basic dashboards. When retail organizations look at analytics and data outputs, it is not enough to gain historical visibility. In today’s AI-enabled world, companies must have predictive analytics and statistical analysis, detect anomalies quickly, and be able to translate business data into machine learning and language models for the next generation of analytics and decision intelligence. Retail can be more proactive and prescriptive with AI and ML models trained on enterprise data to support more personalized and contextualized purchasing experiences.
  • Implement real-time alerts with relevant and actionable retail information. Finally, timely and contextual alerts are also now part of the analytic process. As retail organizations have moved from seasonal purchases and monthly budgeting to daily or even hourly decisions, regional and branch managers need to be able to move quickly if there are signs of business danger: coordinated revenue leakage, brand damage across any of the products held within the store, unexpected weather phenomena, labor issues, or other incipient macro or microeconomic threats.
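The alerting pattern in that last recommendation can be sketched with a simple statistical trigger. This is a minimal illustration with hypothetical data and thresholds, not a production anomaly detector:

```python
# Flag an hourly sales figure that falls far below the trailing mean,
# a common trigger for revenue-leakage or operational alerts.
# Data and the 3-sigma threshold are hypothetical.
from statistics import mean, stdev

def alert_on_drop(history, latest, z_threshold=3.0):
    """Return True if `latest` is an anomalous drop versus `history`."""
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and (mu - latest) / sigma > z_threshold

hourly_sales = [1200, 1150, 1300, 1250, 1180, 1220, 1270, 1210]
print(alert_on_drop(hourly_sales, 400))   # True: page the branch manager
print(alert_on_drop(hourly_sales, 1190))  # False: within normal variation
```

In practice, the same check would run per store and per metric, feeding the contextual alerts described above rather than a raw console print.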

Posted on

Workday AI and ML Innovation Summit: Chasing the Eye of the AI Storm

We are in a time of transformational change as awareness of artificial intelligence (AI) grows during a period of global uncertainty. The labor supply chain is fluctuating quickly, the economy is on rocky ground, the commodity supply chain is in turmoil, and geopolitical strife is creating currency challenges. Rising interest rates and a higher cost of money only add to the challenges faced in the global business arena. In this environment, where technology dominates business, the global economic foundation is shifting, and the worlds of finance and talent are up for grabs, Workday stepped up to hold its AI and ML Innovation Summit to show a way forward for the customers of its software platform, including the majority of the Fortune 500 that already use Workday as a system of record.

The timing of this summit will be remembered as a time of rapid AI change, with new major announcements happening daily. OpenAI’s near-daily announcements regarding working with Microsoft, launching ChatGPT, supporting plug-ins, and asking for guidance on AI governance are transforming the general public’s perception of AI. Google and Meta are racing to translate their many years of research in AI into products. Generative AI startups already focused on legal, contract, decision intelligence, and revenue intelligence use cases are happy to ride the coattails of this hype. Universities are showing how to build large language models such as Stanford’s Alpaca. And existing machine learning and AI companies such as Databricks are showing how to build custom models based on existing data for a fraction of the cost needed to build GPT.

In the midst of this AI maelstrom, Workday decided to chase the eye of the hurricane and put stakes in the ground on its current approach to innovation, AI, and ML. From our perspective, we were interested both in the executive perspective and in the product innovation associated with this Brave New World of AI.

Enter the Co-CEO – Carl Eschenbach

Workday’s AI and ML Innovation Summit commenced with an introduction of the partners and customers present at the event, followed by a conversation between Workday’s Co-CEOs, Aneel Bhusri and Carl Eschenbach, in which Eschenbach talked about his focus on innovation and growth for the company. Eschenbach is not new to Workday, having served on its board during his time at Sequoia Capital, where he also led investments in Zoom, UiPath, and Snowflake. Having seen his work at VMware, Amalgam Insights was interested to see Eschenbach take this role and help Workday evolve its growth strategy from an executive level. From the start, both Bhusri and Eschenbach made it clear that the Co-CEO arrangement is intended to be temporary, with Eschenbach taking the reins in 2024 while Bhusri becomes Executive Chair of Workday.

Eschenbach emphasized in this session that Workday has significant opportunities in providing a full platform solution and that its international reach requires additional investment in both technology and go-to-market efforts. Workday partners are essential to the company’s success, and Eschenbach pointed out a recent partnership with Amazon to provide Workday as a private offering that can use Amazon Web Services contract dollars to purchase Workday products once the work is scoped by Workday. Workday executives also mentioned the need for consolidation, which is one of Amalgam Insights’ top themes and predictions for enterprise software in 2023. The trend in tech is shifting toward best-in-suite and strategic partnering opportunities rather than a scattered best-in-breed approach that may sprawl across tens or even hundreds of vendors.

The Co-CEOs also explored what Workday will become over the next three to five years as it enters the next stage of its development, after Bhusri evolved Workday from an HR platform into a broader enterprise software platform. Bhusri sees Workday as a system of record that uses AI to serve customer pain points. He posits that ERP is an outdated term, but that, in practice, Workday is categorized as a “services ERP” platform when positioned as a traditional software vendor. Eschenbach added that Workday is a management platform across people and finances on a common multi-tenant platform.

From Amalgam Insights’ perspective, this is an important positioning as Workday is establishing that its focus is on two of the highest value and highest cost issues in the company: skills and money. Both must exist in sufficient quantities and quality for companies to survive.

The Future of AI and Where Workday Fits

We then heard from Co-President Sayan Chakraborty, who took the stage to discuss the “Future of Work” across machine learning and generative AI. Given Chakraborty’s membership on the National Artificial Intelligence Advisory Committee, the analysts in the audience expected him to have a strong mastery of the issues and challenges Workday faces in AI, and this expectation was confirmed by the ensuing discussion.

Chakraborty started by saying that Workday is monomaniacally focused on machine learning to accelerate work and points out that we face a cyclical change in the nature of the working age across the entire developed world. As we deal with a decline in the percentage of “working-age” adults on a global scale, machine learning exists as a starting point to support structural challenges in labor structures and work efforts.

To enable these efforts, Chakraborty brought up the technology, data, and application platforms based on a shared object model, starting with the Workday Cloud Platform and including analytics, Workday experience, and machine learning as specific platform capabilities. Chakraborty referenced FDIC requests for daily liquidity reporting as a capability now being asked for in light of banking stresses such as the recent Silicon Valley Bank failure.

Workday has four areas of differentiation in machine learning: data management, autoML (automated machine learning, including feature abstraction), federated learning, and a platform approach. Workday’s advantage in data is stated across quantity, quality associated with a single data model, structure and tenancy, and the amplification of third-party data. As a starting point, this approach allows Workday to support models based on regional or customer-specific data supported by transfer learning. At this point, Chakraborty was asked why Workday has Prism in a world of Snowflake and other analytic solutions capable of scrutinizing data and supporting analytic queries and data enrichment. Prism is currently positioned as an in-platform capability that allows Workday to enrich its data, a vital capability as companies face the battle for context across data and analytic outputs.

Amalgam Insights will dig into this in greater detail in our recommendations and suggestions, but at this point we’ll note that this set of characteristics is fairly uncommon at the global software platform level and presents opportunities to execute on recent AI announcements in ways that Workday’s competitors will struggle to match.

Workday currently supports federated machine learning at scale out to the edge of Workday’s network, which is part of Workday’s differentiation in providing its own cloud. This ability to push the model out to the edge is increasingly important for supporting geographically specific governance and compliance needs (dubbed by some as the “Splinternet”) as Workday has seen increased demand for supporting regional governance requests, leading to separate US and European Union machine learning training teams each working on regionally created data sources.
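Workday’s federated implementation is proprietary, but the general pattern it describes (training locally and aggregating only model parameters centrally) can be sketched with a simplified federated averaging step. All names and figures below are illustrative, not Workday’s actual method:

```python
# Simplified federated averaging (FedAvg) sketch, not Workday's actual
# implementation: each region trains on local data, and only model
# weights (never raw records) are aggregated centrally, which is what
# lets regional data stay resident in its own jurisdiction.
def federated_average(regional_weights, regional_sizes):
    """Average regional model weights, weighted by local dataset size."""
    total = sum(regional_sizes)
    dims = len(regional_weights[0])
    return [
        sum(w[i] * n for w, n in zip(regional_weights, regional_sizes)) / total
        for i in range(dims)
    ]

us_model = [0.25, 0.75]  # weights trained on US-resident data (hypothetical)
eu_model = [0.50, 0.50]  # weights trained on EU-resident data (hypothetical)
global_model = federated_average([us_model, eu_model], [3000, 1000])
print(global_model)  # [0.3125, 0.6875]
```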

Chakraborty compared Workday’s platform approach to machine learning, which yields a variety of features from a shared foundation, with traditional feature-building approaches in which each feature is built through a separate data generation process. The canonical Workday example is Workday’s Skills Cloud platform, where Workday currently has close to 50,000 canonical skills and 200,000 recognized skills and synonyms scored for skill strength and validity. This Skills Cloud is a foundational differentiator for Workday and one that Amalgam Insights references regularly as an example of a differentiated syntactic and semantic layer of metadata that can provide differentiated context to a business trying to understand why and how data is used.

Workday mentioned six core principles for AI and ML, including people and customers, built to ensure that its machine learning capabilities are developed through ethical approaches. In this context, Chakraborty also mentioned generative AI and large language models, which are starting to provide human-like outputs across voice, art, and text. He pointed out that a major inflection point for AI came in 2006, when NVIDIA opened up its GPUs, processors built around the matrix math used to constantly redraw images, to general-purpose computation through CUDA. Once GPUs were used from a computational perspective, they made massively large parameter models possible. Chakraborty also pointed out the 2017 Google paper on transformers, which solve problems in parallel rather than sequentially and led to the massive models that could be supported in the cloud. The 1000x growth of these models in two years is unprecedented even from a tech perspective. Models have reached a level of scale where they can address emergent challenges that they have not been explicitly trained on. This does not imply consciousness, but it does demonstrate the ability to analyze complex patterns and systems behavior. Amalgam Insights notes that this reflects a common trend in technology, where new approaches often take a number of years to come to market, only to be treated as instant successes once they reach mainstream adoption.

The exponential growth of AI usage was accentuated in March 2023, when OpenAI, Microsoft, Google, and others provided an unending stream of AI-based announcements, including OpenAI’s GPT-4 and GPT plugins, Microsoft 365 Copilot and Microsoft Security Copilot, Google providing access to its generative AI Bard, Stanford’s $600 Alpaca generative AI model, and Databricks’ Dolly, which allows companies to build custom GPT-like experiences. This set of announcements, some of which were made during the Workday Innovation Summit, shows the immense nature of Workday’s opportunity as one of the premier enterprise data sources in the world that will be integrated into all of these AI approaches.

Chakraborty pointed out that the weaknesses of GPT include bad results and a lack of explainability, bad actors (including IP and security concerns), and the potential environmental, social, and governance costs of these models. As with all technology, GPT and other generative AI models consume a great deal of energy and resources without any awareness of how to throttle down in a sustainable and still functional manner. From a practical perspective, this means that current AI systems will be challenged to manage uptime as all of these new services attempt to benchmark and define their workloads and resource utilization. These problems are especially acute in enterprise technology, as the perceived reliability of enterprise software is often based on near-perfect accuracy in calculating traditional data and analytic outputs.

Amalgam Insights noted in our review of ChatGPT that factual accuracy and intellectual property attribution have been largely missing in recent AI technologies that have struggled to understand or contextualize a question based on surroundings or past queries. The likes of Google and Meta have focused on zero-shot learning for casual identification of trends and images rather than contextually specific object identification and topic governance aligned to specific skills and use cases. This is an area where both plug-ins and the work of enterprise software companies will be vital over the course of this year to augment the grammatically correct responses of generative AI with the facts and defined taxonomies used to conduct business.

Amalgam also found it interesting that Chakraborty mentioned that the future of models would include high-quality data and smaller models custom-built to industry and vertical use cases. This is an important statement because the primary discussion in current AI circles is often about how bigger is better and how models compete on having hundreds of billions of parameters to consider. In reality, we have reached the level of complexity where a well-trained model will provide responses that reflect the data that it has been trained on. The real work at this point is on how to better contextualize answers and how to separate quantitative and factual requests from textual and grammatical requests that may be in the same question. The challenge of accurate tone and grammar is very different from the ability to understand how to transform an eigenvector and get accurate quantitative output. Generative AI tends to be good at grammar but is challenged by quantitative and fact-based queries that may have answers that differ from its grammatical autocompletion logic.

Chakraborty pointed out that reinforcement learning has proven to be more useful than either supervised or unsupervised training for machine learning, as it allows models to learn from user behavior rather than requiring direct user interaction. This focus both provides efficiency at scale and takes advantage of Workday’s existing platform activity. The combination of reinforcement training and Workday’s ownership of its Skills Cloud will provide a sizable advantage over most of the enterprise AI world in aligning general outputs to the business world.

Amalgam Insights notes here that another challenge of the AI discussion is the attempt to create an “unbiased” approach for training and testing models, when the more useful task is to document the existing biases and assumptions that are being made. The sooner we can move from the goal of being “unbiased” to the goal of accurately documenting bias, the better we will be able to trust the AI we use.

Recommendations for the Amalgam Community on Where Workday is Headed Next

This summit provided Amalgam Insights with plenty of food for thought from Workday’s top executives. The introductory remarks summarized above were followed by insight and guidance on Workday’s product roadmap across both the HR and finance categories where Workday has focused its product efforts, as well as visibility into the go-to-market and positioning approaches that Workday plans to pursue in 2023. Although many of these discussions were held under a non-disclosure agreement, Amalgam Insights will use this guidance to help companies understand what is next from Workday and what customers should request. From an AI perspective, Amalgam Insights believes that customers should push Workday in the following areas based on Workday’s ability to deliver and provide business value.

  1. Use the data model to both create and support large language models (LLMs). The data model is a fundamental advantage in setting up machine learning and chat interfaces. Done correctly, this is a way to have a form of Ask Me Anything for the company based on key corporate data and the culture of the organization. This is an opportunity to use trusted data to provide relevant advice and guidance to the enterprise. As one of the largest and most trusted data sources in the enterprise software world, Workday has an opportunity to quickly build, train, and deploy models on behalf of customers, either directly or through partners. With this capability, “Ask Workday” may quickly become the HR and finance equivalent of “Ask Siri.”
  2. Use Workday’s Skills Cloud as a categorization to analyze the business, similar to cost center, profit center, geographic region, and other standard categories. Workforce optimization is not just about reducing TCO, but aligning skills, predicting succession and future success potential, and market availability for skills. Looking at the long-term value of attracting valuable skills and avoiding obsolete skills is an immense change for the Future of Work. Amalgam Insights believes that Workday’s market-leading Skills Cloud provides an opportunity for smart companies to analyze their company below the employee level and actually ascertain the resources and infrastructure associated with specific skills.
  3. Workday still has room to improve regarding consolidation, close, and treasury management capabilities. In light of the recent Silicon Valley Bank failure and the relatively shaky ground that regional and niche banks currently stand on, it’s obvious that daily bank risk is now an issue to take into account as companies check whether they can access cash and pay their bills. Finance departments want to consolidate their work into one place and augment a shared version of the truth with individualized assumptions. Workday has an opportunity to innovate in finance, as comprehensive vendors in this space are often outdated or rigidly customized on a per-customer basis in ways that do not scale out as financially responsibly as the Intelligent Data Core allows. And Workday’s direct non-ERP planning competitors mostly lack Workday’s scale, both in its customer base and in its consultant partner relationships, to provide comprehensive financial risk visibility across macroeconomic, microeconomic, planning, budgeting, and forecasting capabilities. Expect Workday to continue making this integrated finance, accounting, and sourcing experience even more cohesive over time and to pursue more proactive alerts and recommendations to support strategic decisions.
  4. Look for Workday Extend to be accessed more by technology vendors to create custom solutions. The current gallery of solutions is only a glimpse of the potential of Extend in establishing Workday-based custom apps. It only makes sense for Workday to be a platform for apps and services as it increasingly wins more enterprise data. From an AI perspective, Amalgam Insights would expect to see Workday Extend increasingly working with more plugins (including ChatGPT plugins), data models, and machine learning models to guide the context, data quality, hyperparameterization, and prompts needed for Workday to be an enterprise AI leader. Amalgam Insights also expects this will be a way for developers in the Workday ecosystem to take more advantage of the machine learning and analytics capabilities within Workday that are sometimes overlooked as companies seek to build models and gain insights into enterprise data.
Posted on

Analyst Insight: The Decision to Replace Legacy Planning Solutions with Workday Adaptive Planning

Today, Amalgam Insights has published the following Analyst Insight: The Decision to Replace Legacy Planning Solutions with Workday Adaptive Planning. This report explores the decisions of eight Workday Adaptive Planning customers interviewed in 2022 to understand why companies chose to switch to Workday Adaptive Planning from another financial planning and budgeting solution.

The decision to choose a planning, budgeting, and forecasting solution is a complex one in 2023. Organizations have had to adjust to more agile planning cycles using a wider range of data, the shift from purely financial planning to a broader array of business planning demands, and the need to create more scenarios based on the wide variety of potential business drivers and outcomes that must now be anticipated. Planning is treated less as a fixed, deterministic exercise and increasingly as a stochastic, broadly variable process that is ongoing and continuous.

In that light, when does it make sense to consider another planning solution? Our research shows that the following traits were most common in organizations that ended up switching to Workday Adaptive Planning.

Key drivers for switching to Workday Adaptive Planning

To learn more about why these traits showed up, and about the best practices these companies discovered for replacing a solution that supports executive demands and manages the cash flow lifeblood of the company, visit the Workday website for a free copy of the report.

This report is also available for purchase on the Amalgam Insights website.

Posted on

14 Key Trends for Surviving 2023 as an IT Executive

2023 is going to be a tough year for anybody managing technology. As we face the repercussions of inflation and high interest rates and the tech bubble starts to burst, we are seeing a combination of hiring freezes, an increased focus on core business activities, and the hoary request to “do more with less.”

Behind the cliché of doing more with less is the need to actually become more efficient with tech usage. This means adopting a FinOps (Financial Operations) strategy for cloud to go with your existing Telecom FinOps (aka telecom expense management) and SaaS FinOps (aka SaaS Management) strategies. And it means being prepared for new spend category challenges, as companies will need to invest in technology to get work done at a time when it is harder to hire the right person at the right time. Here is a quick preview of our predictions.

 

14 Key Predictions for the IT Executive in 2023

To get the details on each of these trends and predictions and understand why they matter in 2023, download this report at no cost by filling out this quick form to join our low-volume bi-monthly mailing list. (Note: If you do not wish to join our mailing list, you can also purchase a personal license for this report.)

Posted on

Evaluating the Selection of Platform-Based and Best-in-Breed Apps for Financial Planning

“Innovation distinguishes between a leader and a follower.”

Steve Jobs

In 2023, we face a series of global planning challenges across accounting, finance, supply chain, workforce management, information technology, and data management. Each of these challenges involves a different set of stakeholders, data structures, key performance indicators, and broader economic and environmental drivers.

In light of this increasingly complex and nuanced set of categories that now make up the responsibilities associated with financial performance management (also known as enterprise performance management; corporate performance management; budgeting, planning, and forecasting; and other buzzwords, but all basically coming back to the financial planning and analysis (FP&A) role that we have known for decades), companies face a technology-related challenge for managing business plans. Is it better to work with a platform-based approach that allows every user to use the same application to support a variety of accounting and finance use cases, including consolidation, close, and planning? Or is it better to use a Best-in-Breed application for business planning?

The basic starting point for evaluating this decision starts with a common sense question for enterprises: is it worth spending money on a standalone planning application or is it better to bundle planning with consolidation and transactional accounting such as an ERP or an accounting platform? In making this decision, companies should look at the following considerations:

Is the solution easy to use? In the 2020s, planning apps should be fairly easy to use, including ease of data entry, the ability to analyze data once it is entered, collaborative planning with other colleagues or budget-holding executives, mobile app support, and the ability to drill into planning data to explore specific deltas, outliers, and budget categories that are of specific interest. Ease of use should also extend to model and scenario management as financial professionals seek to bring a wide variety of potential considerations to enterprise forecasting environments. This ease of use is especially important as planning and forecasting exercises have accelerated in the 2020s based on COVID, supply chain challenges, currency value shifts, inflation, and the looming threat of a potential recession. The need to support flexible planning scenarios can be challenging to accomplish within the accounting framework of creating a fixed and defined set of data that is fully consolidated and auditable.

Is the current solution integrated with all of the data – including operational data – that is needed from a planning perspective? If spreadsheets are considered, this immediately leads to potential governance and consistency problems, as each individual will probably have their own specific assumptions. Suppose companies are using a planning solution as part of their ERP. In that case, the planning solution will likely have access to the majority of accounting data associated with planning. Still, companies then have to see how much of their semi-structured data, third-party data (such as weather, government, or market-based data), and other external data is integrated into the solution. And do these integrations require significant IT support, or can they be supported by the vendor, a line-of-business operations manager, or even by the end users themselves?

Is the current planning solution flexible enough to provide each department with the level of planning it is trying to perform while providing a consistent and shared version of the truth? Over the past few decades, the worlds of enterprise analytics and business accounting have both focused on the idea of a rigid “single version of the truth,” but the reality is that there is no single version of the truth, as each individual and each department typically has goals, assumptions, terminology, and performance drivers tied to their particular job roles. And the moment that data is officially published or defined as “clean,” it immediately starts becoming outdated.

Accordingly, planning data needs to be organized so that every person involved in planning is able to access a consistent set of metrics while also having specialized views of the operational benchmarks and drivers associated with their specific goals as well as the ability to explore specific “what-if” hypothetical scenarios related to the variability of business situations that the organization may encounter. The operational data needed to support this level of flexibility is not always included as part of a core ERP suite and may need to come from a variety of transactional, payment, process automation systems, workflow management, and project management solutions to provide the level of clarity needed to support enterprise planning.
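The shared-but-flexible planning data described above can be illustrated with a toy driver model — a hypothetical sketch in which every scenario overlays its what-if assumptions on one consistent set of base drivers. All driver names and figures here are invented for illustration, not drawn from any vendor’s product.

```python
# Hypothetical sketch: a shared driver-based plan with per-scenario overrides.
# Everyone sees the same base drivers; each what-if scenario only overlays
# the assumptions it changes. All names and figures are illustrative.

BASE_DRIVERS = {
    "units_sold": 10_000,
    "price_per_unit": 25.0,
    "cost_per_unit": 14.0,
}

SCENARIOS = {
    "baseline": {},
    "recession": {"units_sold": 8_000, "price_per_unit": 23.0},
    "supply_shock": {"cost_per_unit": 17.5},
}

def project_margin(scenario: str) -> float:
    """Overlay a scenario's what-if assumptions on the shared drivers."""
    drivers = {**BASE_DRIVERS, **SCENARIOS[scenario]}
    revenue = drivers["units_sold"] * drivers["price_per_unit"]
    cost = drivers["units_sold"] * drivers["cost_per_unit"]
    return revenue - cost

for name in SCENARIOS:
    print(f"{name}: projected margin = {project_margin(name):,.0f}")
```

The design point is that scenarios never fork the underlying data: the base drivers remain the consistent metrics everyone shares, while each department or planner can layer its own hypotheticals on top without becoming outdated relative to the source.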

From Amalgam Insights’ perspective, this initial question of planning application vs. platform is a bit of a red herring. Consolidation, close, and accounting audits are based on the need to lock down every transaction and document what has happened in the past. This historical view provides guidance and can be reviewed as necessary. But planning and forecasting are exercises in constructing the present and future of a business: they require viewing the company through multiple lenses and scenarios, and they must be altered based on possible business or global events that may never happen. By nature, financial planning and analysis activities involve some level of uncertainty.

Organizations seeking to accelerate the pace of planning and to extend planning beyond pure finance into sales, workforce, supply chain, information technology, and project portfolio management will likely find that the need for near real-time analytics and data management increasingly requires an application that combines analytic speed, collaboration, and the ability to experiment in ways that may conflict with or outpace the cadence of accounting. Business planning needs to be a Best-in-Breed capability that allows for the flexibility of what-if analysis, the real-time feedback associated with new data and business considerations, the scale of modern data challenges, and the ability to work collaboratively with relevant business stakeholders. Without these supporting capabilities, which help organizations independently adjust to the future, financial planning is ultimately a compliance exercise that lacks the impact and strategic guidance that executive teams need to make hard decisions.

Posted on

Key Planning Trends for 2023 In the Face of Economic Uncertainty

“The will to win means nothing without the will to prepare”

Boston Marathon champion Juma Ikangaa

2023 is undoubtedly a challenging year to forecast from an economic perspective as the tempest of inflation, stock market volatility, foreign exchange challenges, hiring freezes, supply chain delays, and geopolitical conflicts are creating pressure for companies of all sizes and industries. As companies seek to make sense of a complex world and forecast performance, it is important to take full advantage of planning and forecasting capabilities to provide guidance. Of course, it is important to provide visibility and report to business stakeholders. But beyond the basics, what should you be thinking about as we prepare for a bumpy ride? Here are five key recommendations Amalgam Insights is providing for the business community.

  1. Build a planning process that can be changed on a monthly basis. Even if your organization does not need to plan on a continuous basis, there will be at least one or two unexpected planning events that happen this year that will require widespread reconsiderations of the “annual plan.” The “annual planning cycle” concept is dead at companies after the past three years of working through COVID, supply chain issues, and workforce shortages. This means that planning often has to be updated with new and unexpected data to support a wide variety of scenarios. Locking the plan to a specific structure, schedule, or level of data consolidation is increasingly challenging for companies seeking better guidance throughout the year. If you are not building out a variety of scenarios and tweaking changes throughout the year based on business issues and changes, your business is working at a disadvantage to more nimble and agile organizations.

  2. Identify planning anomalies quickly. As businesses review their plans, they will find that they are off-plan more quickly than they have historically been. One example is cloud computing, a spend area that is expected to grow 18-22% in 2023, far above general IT spend or the expected rate of inflation. Other commodities, such as complex manufactured goods and food stocks, may fall into this category as well as production delays, logistical shortages, and novel diseases interrupt supply chains. The ability to quickly identify spend anomalies that exceed budgetary expectations allows companies to adjust spend, procurement, and technology strategies to further optimize these environments. By identifying these anomalies quickly, finance can work with procurement both to find opportunities to reduce spend and to find alternative providers that can either reduce cost or ensure business continuity to meet consumer demand.

  3. Interest rates and the cost of money may incentivize longer sourcing contracts to lock in costs. This lesson comes from the sports world, where baseball players are signing long contracts this year. Why? Because the cost of money is increasing and baseball teams can’t play games without players, leading teams to lock in costs. Of course, to do this, companies must budget for the potential upfront costs associated with taking on new contracts. This is a story of Haves and Have-Nots, where the Haves now possess an opportunity to lock in costs for the next few years and take advantage of the value of money, while the Have-Nots, struggling to visualize their spend, may be stuck in short-term contracts that will cost more over time. However, this ability to make decisions based on the current cost of money depends on the ability to forecast the potential ramifications of locking in costs, especially when those costs represent the variable cost of goods needed to meet demand for consumer purchases and services.

  4. Cross-departmental business planning requires a data strategy that allows organizations to bring in multiple data sources. Finance must start learning about the value of a data pipeline and potentially a data lake for bringing data into a planning environment, processing and formatting the data properly, and maintaining a consistent store of data that includes all relevant information for modern business planning use cases. In the past, it may have been enough for finance to know that there was a database to support financial and payment information and then an OLAP cube to provide high-performance analytics for business planning. But in today’s planning world where finance is increasingly asked to be a strategic hub based on its view of the entire business, planning data now potentially includes everything from weather trends to government-provided data to online sentiment and even social media. These new data sources and formats require finance to both store and interact with data in ways that exceed the challenges of simply having massive row-based tables of business data.

  5. Look for arbitrage opportunities across currencies, geographies, and even internal departments. Mission-critical skills and resources can be valued very differently across different areas. 2023 is an environment where corporate equity and stock values are lower, the US dollar is strong against the majority of global currencies, and skills and commodities can be hard to find. These are both challenges and opportunities, as they allow FP&A professionals to dig into forecasted costs and see if there are opportunities to go abroad, or to look internally, for skills, goods, and resources that may be less expensive than the typical markets businesses participate in. Finance can work with sourcing, human resources, information technology, and other departments to proactively identify specific areas where the business may have an opportunity to improve.
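The call in recommendation 2 to identify spend anomalies quickly can be reduced to a simple statistical check. Below is a minimal sketch, assuming a z-score threshold against prior months and entirely made-up monthly cloud spend figures; a real FinOps implementation would pull from actual billing feeds and likely use more robust methods.

```python
import statistics

# Hypothetical sketch: flag monthly spend that exceeds budget expectations.
# Figures are illustrative ($K); a real FinOps billing feed would replace them.
monthly_cloud_spend = [100, 104, 99, 103, 101, 148]  # last month spikes

def flag_anomalies(series, z_threshold=2.0):
    """Return indices whose z-score versus the prior months exceeds the threshold."""
    baseline = series[:-1]  # treat all but the latest month as the baseline
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [i for i, x in enumerate(series)
            if stdev and abs(x - mean) / stdev > z_threshold]

print(flag_anomalies(monthly_cloud_spend))  # only the final spike is flagged
```

The point is speed of detection: a check like this can run as each month closes, so finance can engage procurement while an alternative provider or renegotiation is still an option.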
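The lock-in logic in recommendation 3 is ultimately a present-value comparison. Here is a hedged sketch, assuming purely illustrative discount and escalation rates (not market figures), that compares locking a multi-year rate today against renewing annually at escalating prices.

```python
# Hypothetical sketch: compare locking a 3-year contract rate today versus
# renewing annually with expected price escalation, discounted at the cost
# of money. All rates and amounts are illustrative assumptions.

DISCOUNT_RATE = 0.06       # assumed annual cost of money
ESCALATION = 0.10          # assumed annual price increase on renewal
LOCKED_ANNUAL_COST = 100_000.0
YEARS = 3

def npv(cashflows, rate):
    """Present value of year-end cash flows (paid at end of years 1, 2, ...)."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

locked = npv([LOCKED_ANNUAL_COST] * YEARS, DISCOUNT_RATE)
renewed = npv([LOCKED_ANNUAL_COST * (1 + ESCALATION) ** t for t in range(YEARS)],
              DISCOUNT_RATE)

print(f"Locked-in NPV: {locked:,.0f}")
print(f"Renewal NPV:   {renewed:,.0f}")
```

Under these assumptions the locked contract is cheaper in present-value terms; the Have-Nots’ risk is that without spend visibility they cannot run this comparison before the renewal window closes.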

As we plan for 2023, it is time to prepare sagaciously so that we are ready to execute when challenges and opportunities emerge. By planning now for a wide variety of potential situations, businesses can make better decisions in critical moments that can define careers and the future of the entire organization.