
What Wall Street is missing regarding Broadcom’s acquisition of CA Technologies: Cloud, Mainframes, & IoT

(Note: This blog contains significant contributions from long-time software executive and Research Fellow Tom Petrocelli)

On July 11, Broadcom ($AVGO) announced an agreement to purchase CA for $18.9 billion. If this acquisition goes through, this will be the third largest software acquisition of all time behind only Microsoft’s $26 billion acquisition of LinkedIn and Facebook’s $19 billion acquisition of WhatsApp. And, given CA’s focus, I would argue this is the largest enterprise software acquisition of all time, since a significant part of LinkedIn’s functionality is focused on the consumer level.

But why did Broadcom make this bet? The early reviews have shown confusion with headlines such as:
• Broadcom deal to buy CA makes little sense on the surface
• 4 Reasons Broadcom’s $18.9B CA Technologies Buy Makes No Sense
• Broadcom Buys CA – Huh?

All of these articles basically home in on the fact that Broadcom is a hardware company and CA is a software company, which leads to the conclusion that the two companies have nothing to do with each other. But to truly understand why Broadcom and CA can fit together, let’s look at the context.

In November 2017, Broadcom purchased Brocade for $5.5 billion to build out its data center and networking business. This acquisition expanded on Broadcom’s strengths in supporting mobile and connectivity use cases by extending Broadcom’s solution set beyond the chip and into actual connectivity.

Earlier this year, Broadcom had tried to purchase Qualcomm for over $100 billion. Given Broadcom’s lack of cash on hand, this would have been a debt-based purchase with the obvious goal of rolling up the chip market. When the United States blocked this acquisition in March, Broadcom was likely left with a whole lot of committed capital that needed to be deployed or lost, and no obvious target.

So, add these two together and Broadcom had both the cash to spend and a precedent for showing that it wanted to expand its value proposition beyond the chip and into larger integrated solutions for two little trends called “the cloud,” especially private cloud, and “the Internet of Things.”

Now, in that context, take a look at CA. CA’s bread and butter comes from its mainframe solutions, which make up over $2 billion in revenue per year. Mainframes are large computers that handle high-traffic and dedicated workloads and increasingly need to be connected to more data sources, “things,” and clients. Although CA’s mainframe business is a legacy business, that legacy is focused on some of the biggest enterprise computational processing needs in the world. Thus, this is an area that a chipmaker would be interested in supporting over time. The ability to potentially upsell or replace those workloads over time with Broadcom computing assets, either through custom mainframe processors or through private cloud data centers, could add some predictability to the otherwise cyclical world of hardware manufacturing. Grab enterprise computing workloads at the source and then custom build to their needs.

This means that there’s a potential hyperscale private cloud play here as well for Broadcom by bringing Broadcom’s data center networking business together with CA’s server management capabilities, which look at technical monitoring issues from both a top-down and a bottom-up perspective.

CA is also strong in supporting mobile development, development and operations (DevOps), API management, IT operations, and service level management in its enterprise solutions business, which generated $1.75 billion in revenue over the past year. On the mobile side, this means that CA is a core toolset for building, testing, and monitoring the mobile apps and Internet of Things applications that will be running through Broadcom’s chips. To optimize computing environments, especially in mobile and IoT edge environments where computing and storage resources are limited, applications need to be optimized for the available hardware. If Broadcom is going to take over the IoT chip market over time, its chips need to support relevant app workloads.

I would also expect Broadcom to increase investment in CA’s Internet of Things and mobile app dev teams once Broadcom completes this transaction. Getting CA’s dev tools closer to the silicon can only help performance and help Broadcom provide out-of-the-box IoT solutions. This acquisition may even push Broadcom into the solutions and services market, which would blow the minds of hardware analysts and market observers but would also be a natural extension of Broadcom’s current acquisitions to move through the computing value stack.

From a traditional OSI perspective, this acquisition feels odd because Broadcom is skipping multiple layers between its core chip competency and CA’s core competency. But the Brocade acquisition helps close the gaps, even after the spinoffs of the Ruckus Wireless, Lumina SDN, and data center networking businesses. Broadcom is focused on processing and guiding workloads, not on transport and other non-core activities.

So, between mainframe, private cloud, mobile, and IoT markets, there are a number of adjacencies between Broadcom & CA. It will be challenging to knit together all of these pieces accretively. But because so much of CA’s software is focused on the monitoring, testing, and security of hardware and infrastructure, this acquisition isn’t quite as crazy as a variety of pundits seem to think. In addition, the relative consistency of CA’s software revenue compared to the highs and lows of chip building may also provide some benefits to Broadcom by providing predictable cash flow to manage debt payments and to fund the next acquisition that Hock Tan seeks to hunt down.

All this being said, this is still very much an acquisition out of left field. I’ll be fascinated to see how this transaction ends up. It is somewhat reminiscent of Oracle’s 2009 acquisition of Sun to bring hardware and software together. This does not necessarily create confidence in the acquisition, since hardware/software mergers have traditionally been tricky, but it does not negate the synergies that do exist. In addition, Oracle’s move points out that Broadcom seems to have skipped the step of purchasing a relevant casing, device, or server company. Could this be a future acquisition to bolster existing investments and push further into the world of private cloud?

A key challenge and important point that my colleague Tom Petrocelli brings up is that CA and Broadcom sell to very different customers. Broadcom has been an OEM-based provider while CA sells directly to IT. As a result, Broadcom will need to be careful in maintaining CA’s IT-based direct and indirect sales channels and would be best served to keep CA’s go-to-market teams relatively intact.

Overall, the Broadcom acquisition of CA is a very complex puzzle with several potential options.

1. The diversification efforts will work to smooth out Broadcom’s revenue over time and provide more predictable revenues to support Broadcom’s continuing growth through acquisition. This will help Broadcom’s stock in the long run and provide financial benefit.
2. Broadcom will fully integrate the parts of CA that make the most sense for it to have, especially the mobile security and IoT product lines, and sell or spin off the rest to help pay for the acquisition. Although the Brocade spinoffs occurred prior to the acquisition, there are no forces that prevent Broadcom from spinning off non-core CA assets and products, especially those that are significantly outside the IoT and data center markets.
3. In a worst case scenario, Broadcom will try to impose its business structure on CA, screw up the integration, and kill a storied IT company over time through mismanagement. Note that Amalgam Insights does not recommend this option.

But there is some alignment here, and it will be fascinating to see how Broadcom takes advantage of CA’s considerable IT monitoring capabilities, uses CA’s business to increase chip sales, and uses CA’s cash flow to continue Broadcom’s massive M&A efforts.


Amalgam Provides 4 Big Recommendations for Self-Service BI Success

 

Recently, my colleague Todd Maddox, Ph.D., the most-cited analyst in the corporate training world, and I were looking at the revolution of self-service BI, which has allowed business analysts and scientists to quickly explore and analyze their own data. At this point, any BI solution lacking a self-service option should not be considered a general business solution.

However, businesses still struggle to teach and onboard employees on self-service solutions, because self-service represents a new paradigm for administration and training, including the brain science challenges of training for IT. In light of these challenges, Dr. Maddox and I have the following four recommendations for better BI adoption.

  1. Give every employee a hands-on walkthrough. If Self-Service is important enough to invest in, it is important enough to train as well. This doesn’t have to be long, but even 15-30 minutes spent on having each employee understand how to start accessing data is important.
  2. Drive a Culture of Curiosity. Self-Service BI is only as good as the questions that people ask. In a company where employees are either set in their ways or not focused on continuous improvement, Self-Service BI just becomes another layer of shelfware.

    Maddox adds: The “shelfware” comment is spot on. I was a master of putting new technology on the shelf! If what I have now works for my needs, then I need to be convinced, quickly and efficiently, that this new approach is better. I suggest asking users what they want to use the software for. If you can put users into one of 4 or 5 bins of business use cases, then you can customize the training and onboard more quickly and effectively.

  3. Build short training modules for key challenges in each department. This means that departmental managers need to commit to recording, say, 2-3 short videos that will cover the basics for self-service. Service managers might be looking for missed SLAs while sales managers look for close rates and marketing managers look for different categories of pipeline. But across these areas, the point is to provide a basic “How-to” so that users can start looking for the right answers.

    Maddox adds: Businesses are strongly urged to include 4 or 5 knowledge check questions for each video. Knowledge testing is one of the best ways to reinforce training. It also provides quick insight into which aspects of your video are effective and which are not. Train by testing!

  4. Analytics knowledge must become readily available. As users start using BI, they need to figure out the depth and breadth of what is possible with BI, formulas, workflows, regression, and other basic tools. This might range from a simple aggregation of useful YouTube videos to a formal program developed in a corporate learning platform.

With these training tips from one of the top BI influencers and the most-cited training analyst on the planet, we hope you are better equipped to support self-service BI at scale for your business.


Mapping Multi-Million Dollar Business Value from Machine Learning Projects

Amalgam has just posted a new report: The Roadmap to Multi-Million Dollar Machine Learning Value with DataRobot. I’m especially excited about this report for a couple of reasons.

First, this report documents multiple clear value propositions for machine learning that led to annual value of over a million dollars. This is an important metric to demonstrate at a time when many enterprises are still asking why they should be putting money into machine learning.

Second, Amalgam introduces a straightforward map for understanding how to construct machine learning projects that are designed to create multi-million dollar value. Rather than simply hope and wish for a good financial outcome, companies can actually model whether their project is likely to justify the cost of machine learning (especially the specialized mathematical and programming skills needed to make this work).

Amalgam provides the following starting point for designing Multi-Million dollar machine learning value:

Stage One is discovering the initial need for machine learning, which may sound tautological. “To start machine learning, find the need for machine learning…” More specifically, look for opportunities to analyze hundreds of variables that may be related to a specific outcome, but where relationships cannot be quickly analyzed by gut feel or basic business intelligence. And look for opportunities where employees already have gut feelings that a new variable may be related to a good business outcome, such as better credit risk scoring or higher quality supply chain management. Start with your top revenue-creating or value-creating department and then deeply explore.
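
To make that screening step a bit more concrete, here is a minimal, hypothetical sketch in Python. The data file, column names, and the use of a simple correlation screen are illustrative assumptions on my part, not anything prescribed in the report:

```python
import pandas as pd

# Hypothetical first pass: rank hundreds of candidate variables by how
# strongly they correlate with a business outcome (e.g., credit risk).
# The CSV file and column names below are placeholders for illustration.
df = pd.read_csv("loan_applications.csv")

outcome = "defaulted"  # assumed 0/1 outcome column
candidates = df.drop(columns=[outcome]).select_dtypes("number")

# Absolute Pearson correlation is a crude screen, but it quickly surfaces
# variables worth a closer look before investing in a full model.
ranked = candidates.corrwith(df[outcome]).abs().sort_values(ascending=False)

print(ranked.head(20))  # top 20 candidate variables to investigate further
```

The specific statistic matters less than the point: a broad, cheap screen across hundreds of variables is exactly the kind of question that gut feel and standard BI dashboards struggle to answer.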

Stage Two is about financial analysis and moving to production. Ideally, your organization will find a use case involving over $100 million in value. This does not mean that your organization is making $100 million in revenue, as activities such as financial loans, talent recruiting, and preventative maintenance can potentially lead to billions of dollars in capital or value being created even if the vendor only collects a small percentage as a finder’s fee, interest, or maintenance fee. Once the opportunity exists, move on it. Start small and get value.
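
As a back-of-the-envelope illustration of how a small slice of a large pool of capital can still justify the effort, here is a hypothetical sketch; all of the figures are invented for illustration and are not drawn from the report:

```python
# Hypothetical value model: even capturing a small percentage of a large
# pool of capital can produce multi-million dollar annual value.
# All figures below are invented for illustration.

capital_influenced = 2_000_000_000  # e.g., $2B in loans touched by the model
fee_rate = 0.02                     # organization keeps ~2% as fees or interest margin
ml_improvement = 0.05               # machine learning lifts that outcome by 5%

revenue_at_stake = capital_influenced * fee_rate        # $40,000,000
incremental_value = revenue_at_stake * ml_improvement   # $2,000,000 per year

print(f"Revenue at stake:  ${revenue_at_stake:,.0f}")
print(f"Incremental value: ${incremental_value:,.0f}")
```

Modeling the project in this rough way up front is what lets you compare the expected value against the specialized skills and infrastructure costs described above before any data science work starts.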

Then, finally, take those lessons learned and start building an internal Machine Learning Best Practices or Center of Excellence organization. Again, start small and focus on documenting what works within your organization, including the team of employees needed to get up and running, the financial justification needed to move forward, and the technical resources needed to operationalize machine learning on a scalable and predictable basis. Drive the cost of machine learning down internally so that your organization can tackle smaller problems without prohibitive labor, cost, and time requirements.

This blog is just a starting point for the discussion of machine learning value Amalgam covers in The Roadmap to Multi-Million Dollar Machine Learning Value with DataRobot. Please check out the rest of the report as we discuss the Six Stages of moving from BI to AI.

This report also defines a financial ROI model associated with a business-based approach to machine learning.

If you have any questions about this blog, the report, or how to engage Amalgam Insights in providing strategy and vendor recommendations for your data science and machine learning initiatives, please feel free to contact us at info@amalgaminsights.com.


5 Stages of The Technology Expense Management Market (re: Calero Acquires Veropath)

In my recent Market Milestone, Calero Acquires Veropath to Bolster its Global Role in Technology Expense Management, I made a quick comment about Veropath as an “accretive acquisition target.” But then I realized that I hadn’t explained what that meant from an Amalgam perspective.

From Amalgam’s perspective, the Technology Expense Management market (aka Telecom Expense Management, although these solutions now regularly manage a wide variety of IT assets, services, and subscriptions) roughly breaks out into companies of five sizes, each with capabilities that could be considered “accretive” to larger organizations. I should add that there are a number of additional TEM companies at these sizes that do not fit these profiles. Such outlying companies might be very profitable, stable, and good providers, but they are not typically considered great acquisition targets.

The first size is 1 – 10 employees. These are companies that are usually good at a specific task or have a single product that is custom-suited to managing a specific capability, such as automated invoice processing, rate plan optimization, or network data management. The companies in this space tend to have a combination of specialization and subject matter expertise from a technical perspective, but lack the support staff to manage a large number of clients. As acquisition targets, these are a combination of technology purchases and acqui-hires.

The second size is 10 – 30 employees. These Technology Expense Management companies have found a specific geographical, market, or service niche and tend to have some combination of technology and services. There is a long tail of TEM companies in this category that lack the scale to go national, but may have built strong geographic, technical, or process management capabilities. However, these companies typically lack the sales and marketing engine to expand beyond their current size, meaning that further growth will often require outside capital and additional investment in revenue-creating activities.

The third size is roughly between 30 and 75 employees. At this size, the TEM vendor has found a strong go-to-market message and is supporting both mid-market and enterprise clients regularly. These vendors have built their own platform, have a significant internal support team, and typically have a strong sales leader who is either the CEO or a VP of Sales. At this point, Amalgam notes that the biggest challenge for these vendors is creating a management team empowered to make good decisions and, for the CEO, letting go of those decisions. This management challenge is quite difficult to surmount because it adds a lot of complexity to the business with very little immediate benefit to the CEO or the firm’s employees. However, at this scale, TEM businesses are also a good target for acquisition, as they have built out every business function needed to be a successful and stable long-term business. Roughly speaking, these companies tend to have about $100 million to $500 million in spend under management and run as stable, profitable businesses. There are a number of strong TEM vendors in this space including, but not limited to, Avotus, Ezwim, ICOMM, Mobile Solutions, Network Control, SaaSwedo, SmartBill, Tellennium, Valicom, vCom, VoicePlus, and Wireless Analytics.

The fourth size is between 75 and 1,000 employees. These TEM companies are rarely acquired and start becoming the acquirers of other TEM companies because they have successfully built an organization that can scale and run multiple business units. At this size, TEM companies start to manage over a billion dollars a year in spend and tend to either be publicly traded or backed by private equity. And at this point, TEM companies start running into adjacent competitors in markets such as Managed Mobility Services, SaaS Vendor Management, Cloud Service Management, IT Asset Management, and other related IT management areas. This is an interesting area for TEM because, after several years of watching Tangoe acquire businesses at this scale in the early 2010s, multiple new vendors appeared at this scale in the mid-to-late 2010s. Currently, Amalgam considers Calero, Cass, Cimpl, Dimension Data, MDSL, Mobichord, Mer Telemanagement Systems (MTS), One Source Communications, Sakon, and TNX to be representative of this tier of large TEM providers.

Currently, the fifth size of 1,000+ employees is a market of one: Tangoe. This company has grown both organically and through acquisition to manage over $38 billion in technology spend, making it roughly six times larger than its nearest competitor. At this size, Tangoe focuses on large enterprise and global management challenges and is positioned to start pursuing adjacent markets more aggressively. Amalgam believes that there is sufficient opportunity in this market for additional firms of this scale, however, and expects one or more of Calero, Cass Information Systems, MDSL, or Sakon to reach this scale in the next three to five years.

So, when Amalgam refers to “accretive opportunities” in the TEM space from an acquisition perspective, this is the rough context that we use as a starting point. Of course, with the 100+ firms that we track in this market, any particular category has both nuance and personalization in describing individual firms. If you have any questions regarding this blog, please feel free to follow up by emailing info@amalgaminsights.com, and if you’d like to learn more about what Calero has done with this acquisition of Veropath, one of the largest UK-headquartered TEM vendors, please download our Market Milestone, available this week (or as supplies last) for free.


Workday Surprises the IPO Market and Acquires Adaptive Insights

Key Stakeholders: Chief Information Officers, Chief Financial Officers, Chief Operating Officers, Chief Digital Officers, Chief Technology Officers, Accounting Directors and Managers, Sales Operations Directors and Managers, Controllers, Finance Directors and Managers, Corporate Planning Directors and Managers

Why It Matters: Workday snatched Adaptive Insights away from the public markets only days before IPO, acquiring a proven enterprise planning company, a trained enterprise sales team, and a team deeply skilled in ERP and enterprise application integration.

Top Takeaway: Amalgam believes Workday’s acquisition of Adaptive Insights provides a net win for current Adaptive Insights customers, Workday customers seeking more resources dedicated to both financial planning and workforce planning, and Adaptive Insights partners looking for enterprise product enhancements.

On June 11, 2018, Workday announced the signing of a definitive agreement to purchase Adaptive Insights, a cloud-based business planning platform. Workday will pay about $1.55 billion to fully acquire all shares of Adaptive Insights. This acquisition came only three days before Adaptive Insights was scheduled for a $115 million IPO on Thursday, June 14th, which was estimated to value the company at $705 million. The IPO was expected to be successful based on Adaptive Insights’ strong subscription revenue and revenue renewal metrics described in the S-1. With this acquisition, Tom Bogan will continue to lead the Adaptive Insights business unit and report to Workday CEO Aneel Bhusri, and the Adaptive Insights team is expected to remain relatively intact.

Amalgam provides further details in the Market Milestone Workday Acquires Adaptive Insights and Gets a Leg Up On Oracle NetSuite where we explore:

  • Adaptive Insights as a pure-play Cloud EPM player
  • Adaptive Insights’ relationship with NetSuite
  • What Workday gains by purchasing Adaptive Insights
  • What Adaptive Insights’ customers and partners should expect
  • Who wins and loses from this acquisition

SaaS Vendor and Expense Management on Display at Oktane 18

Key Stakeholders: CIO, CFO, Chief Digital Officer, Chief Technology Officer, Chief Mobility Officer, IT Asset Directors and Managers, Procurement Directors and Managers, Accounting Directors and Managers

Why It Matters: Okta is a key enabler for the discovery and management of SaaS, which is a necessary step for establishing the SaaS inventory and user identities needed to optimize costs. Amalgam noted that several leading SaaS vendor management solutions, including Flexera, Torii, and Zylo, attended Oktane 18 to speak to savvy IT executives seeking to control SaaS and cloud expenses.

Top Takeaways: In the past year, the SaaS Vendor Management market has quickly expanded in quality, showing the increasing demand for solutions and the maturity of this market. As Oktane has evolved into a hub for SaaS management, this show allows SaaS Vendor Management providers to speak to relevant digital innovators and architects responsible for being IT financial stewards.

At Oktane 18, Amalgam spoke with three leading SaaS Vendor Management solutions: Flexera/Meta SaaS, Torii, and Zylo. From Amalgam’s perspective, one of the most interesting aspects of these meetings was how each vendor had a different focus and perspective despite the fact that these vendors theoretically “do the same thing.” The side-by-side comparison of these vendors provided interesting differentiation in a market that is quickly expanding. To see how these vendors differed and to find out about an additional baker’s dozen of vendors that Amalgam tracks as SaaS Vendor Management solutions or products with a SaaS Vendor Management component, click here to read the full report.


Why Oktane Has Become The Most Important SaaS Event of the Year

Key Stakeholders: Chief Information Officers, Chief Digital Officers, Chief Information Security Officers, Security Directors and Managers, Security Operations Directors, IT Architects, IT Strategists, Identity and Access Directors and Managers, Software Engineers, Cloud Strategists

Why This Matters: Cloud and digital strategists who are not attending Oktane risk missing out on key SaaS and IT management strategies emerging in an end-user centric model of IT.


Informatica Prepares Enterprise Data for the Era of Machine Learning and the Internet of Things


From May 21st to May 24th, Amalgam Insights attended Informatica World. Both my colleague Tom Petrocelli and I were able to attend and gain insights on the present and future of Informatica. Based on discussions with Informatica executives, customers, and partners, I gathered the following takeaways.

Informatica made a number of announcements that fit well into the new era of Machine Learning that is driving enterprise IT in 2018. Tactically, Informatica’s announcement of providing its Intelligent Cloud Services, its Integration Platform as a Service offering, natively on Azure represents a deeper partnership with Microsoft. Informatica’s data integration, synchronization, and migration services go hand-in-hand with Microsoft’s strategic goal of getting more data into the Azure cloud and shifting the data gravity of the cloud. Amalgam believes that this integration will also help increase the value of Azure Machine Learning Studio, which now will have more access to enterprise data.


Tangoe Makes Its Case as a Change Agent at Tangoe LIVE 2018


Key Stakeholders: CIO, CFO, Controllers, Comptrollers, Accounting Directors and Managers, IT Finance Directors and Managers, IT Expense Management Directors and Managers, Telecom Expense Management Directors and Managers, Enterprise Mobility Management Directors and Managers, Digital Transformation Managers, Internet of Things Directors and Managers

On May 21st and 22nd, Amalgam Insights attended and presented at Tangoe LIVE 2018, the global user conference for Tangoe. Tangoe is the largest technology expense management vendor with over $38 billion in technology spend under management.

Tangoe is in an interesting liminal position as it has achieved market dominance in its core area of telecom expense management where Amalgam estimates that Tangoe manages as much technology spend as its five largest competitors combined (Cass, Calero, Cimpl, MDSL, and Sakon). However, at the same time, Tangoe is being asked by its 1200+ clients how to better manage a wide variety of spend categories, including Software-as-a-Service, Infrastructure-as-a-Service, IoT, managed print, contingent labor, and outsourced contact centers. Because Tangoe is in this middle space, Amalgam finds Tangoe to be an interesting vendor. In addition, Tangoe’s average client is a 10,000 employee company with roughly $2 billion in revenue, placing Tangoe’s focus squarely on the large, global enterprise.

With this context in mind, Amalgam analyzed both Tangoe’s themes and presentations. First, Tangoe LIVE started with a theme of “Work Smarter” with the goal of helping clients to better understand changes in telecom, mobility, cloud, IoT, and other key IT areas.

Tangoe Themes and Keynotes

CEO Bob Irwin kicked off the keynotes by describing Tangoe’s customer base as 1/3rd fixed telephony, 1/3rd mobile-focused, and 1/3rd combined fixed and mobile telecom. This starting point shows Tangoe’s potential to grow simply by focusing on existing clients. In theory, Tangoe could double its revenue and spend under management without adding another client, based on this point alone. This fact makes Tangoe difficult to measure, as much of its potential growth is based on execution rather than the market penetration that most of Tangoe’s competitors are trying to achieve. Irwin also brought up three key disruptive trends of Globalization, Uberization (real-time and localized shared services), and Digital Transformation. On the last point, Irwin noted that 65% of Tangoe customers were currently going through some level of Digital Transformation.

Next, Chief Product Officer Ivan Latanision took the stage. Latanision is relatively new to Tangoe, with prior experience in clinical development, quality management, e-commerce, and financial services, which should prove useful in supporting Tangoe’s product portfolio and the challenges of enterprise governance. In his keynote, Latanision introduced the roadmap to move Tangoe clients to one of three platforms: Rivermine, Asentinel, or the new Atlas platform being developed by Tangoe. This initial migration will be based on whether the client is looking for deeply configurable workflows, process automation, or net-new deployments with a focus on broad IT management, respectively. Based on the rough timeline provided, Amalgam expects that this three-platform strategy will be used for the next 18-24 months. Then, starting sometime in 2020, Amalgam expects that Tangoe will converge customers onto the new Atlas platform.

SVP of Product Mark Ledbetter followed up with a series of demos to show the new Tangoe Atlas platform. These demos were run by product managers Amy Ramsden, Cynthia Flynn, and Chris Molnar and demonstrated how Atlas would work from a user interface, asset and service template, and cloud perspective. Amalgam’s initial perspective was that this platform represented a significant upgrade over Tangoe’s CMP platform and was on par with similar efforts that Amalgam has seen from a front-end perspective.

Chris Molnar Presents Cloud Management

Product Management guru Michele Wheeler also came out to thank the initial customers who have been testing Atlas in a production environment. Amalgam believes this is an important step forward for Tangoe in that it shows that Atlas is a workable environment and has a legitimate future as a platform going forward. The roadmap for Tangoe customers has been somewhat of a mystery in the past, largely because there was a wide variety of platforms to integrate and aggregate from Tangoe’s previous M&A activity. Chief of Operations Tom Flynn completed the operational presentation by providing updates on Tangoe’s service delivery and upgrades. Flynn had previously acted as Tangoe’s Chief Administrative Officer and General Counsel, which makes this shift in responsibility interesting, as Flynn now gets to focus on service delivery and process management rather than the governance, risk management, and compliance issues where he previously guided Tangoe.

Thought Leadership Keynotes at Tangoe LIVE

Tangoe also featured two celebrity keynotes focused on the importance of innovation and preparing for the future. Amalgam believes these keynotes were well-suited for the fundamental challenge that much of the audience faced in figuring out how to expand their responsibilities beyond the traditional view of supporting network, telecom, and mobility spend to the larger world of holistic enterprise technology spend. Innovation expert Stephen Shapiro provided the audience with a focus on how to think outside of the boundaries that we lock ourselves into by looking at other departments, industries, and categories for new solutions. Business advisor and industry analyst Maribel Lopez followed up on this theme the next day by framing our current technology challenge as both the Best of Times and the Worst of Times in determining how technology can effectively empower workers. By providing a starting point for achieving the goal of digital mastery, Lopez challenged the audience to take on the challenge of controlling technology and maximizing its value rather than becoming overwhelmed by the variety, scale, and scope of technology.

Amalgam’s Role at Tangoe LIVE

Amalgam Insights also provided two presentations at Tangoe LIVE: “Budgeting and Forecast for Technology Trends” and “The Convergence of Usage Management and IoT,” which was co-presented with Tangoe’s Craig Riegelhaupt.

In the Budgeting and Forecast presentation, Amalgam shared key trends that will affect telecom and mobility budgeting over the rest of 2018 and 2019 including:
• USTelecom’s Petition for Forbearance to the FCC, which would allow large carriers to immediately raise the resale rate of circuits by 15% and potentially allow them to stop reselling circuits to smaller carriers altogether
• Universal Service Fee variances and the potential to avoid these costs by changing carriers or procuring circuits more effectively
• The 2 GB limit for rate plans and how this will affect rate plan optimization
• The T-Mobile/Sprint Merger and its potential effects on enterprise telephony
• The wild cards of cloud and IoT as expense categories
• Why IoT Management is the ultimate Millennial Reality Check

Amalgam’s Hyoun Park speaks at Tangoe Live

Amalgam also teamed up with Tangoe to dig more deeply into the Internet of Things and why IoT will be a key driver for transforming mobility management over the next five years. By providing representative use cases, connectivity options, and key milestone metrics, this presentation provided Tangoe end users with an introductory guide to how their mobility management efforts needed to change to support IoT.

Tangoe’s Center of Excellence and Amalgam Insights’ Role

At Tangoe LIVE, Tangoe also announced the launch of the Tangoe Center of Excellence, which will provide training courses in topics including inventory management, expense management, usage management, soft skills, & billing disputes with a focus on providing guidance from industry experts.

As a part of this launch, Tangoe announced that Amalgam Insights is a part of Tangoe’s COE Advisory Board. In this role, we look forward to helping Tangoe educate its customer base on the key market trends associated with managing digital transformation and creating an effective “manager of managers” capability to effectively support the quickly-expanding world of enterprise technology. These classes will start in the Fall of 2018 and Amalgam is excited to support this effort to teach best practices to enterprise IT organizations.

Conclusion

Tangoe LIVE lived up to its billing as the largest Telecom and Technology Expense Management user show. But even more importantly, Tangoe was able to show its Atlas platform and provide a framework for users who are being pushed to support change, transformation, and new spend categories. Amalgam believes that the greatest challenge over the rest of this decade for TEM managers is how they will support the emerging trends of cloud, IoT, outsourcing, and managed services, as all of these spend categories require management and optimization. At this show, Tangoe made its case that it intends to be a platform solution to support this future of IT management and support its customers across invoice, expense, service order, and usage management.

Although Tangoe’s official theme for this show was “Work Smarter,” Amalgam believes that the true theme of the show, based on keynotes, roadmap, sessions, and end user questions, was “Prepare for Change,” with the tacit assumption of doing this with Tangoe. In facing this question head-on and providing clear guidance and advice, Tangoe made a compelling case for managing the future of IT at Tangoe LIVE.


Outsourcing Core IT When You Are Core IT

Today, I provided a quick presentation on the BrightTALK channel on Outsourcing Core IT when you ARE Core IT. It turns out that one of the takeaways from this webinar is about your risk of being outsourced based on your IT model. First, your fear of and approach to outsourcing really depend on whether you are part of operational IT, enabling IT, or transformational IT:

Operational IT is a cost-driven, commoditized, and miserable IT environment that does not realize that technology can provide a strategic advantage. In this world, a smartphone, a server running an AI algorithm, and a pile of paper clips are potentially all of equal value and defined by their cost. (Hint: This is not a great environment to work in. You are at risk of being outsourced!)

Enabling IT identifies that technology can affect revenues, rather than just overhead costs. This difference is not trivial, as the average US employee in 2012 made about $47,000 per year, but companies brought in over $280,000 per employee each year. The difference between calculating the value of tech at roughly $20 an hour vs. roughly $140 an hour for each employee is immense!
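
To show the gap explicitly, here is a minimal back-of-the-envelope calculation, assuming a standard 2,080-hour work year (that hours figure is my assumption, not one from the webinar):

```python
# Back-of-the-envelope comparison of a cost-based vs. revenue-based view
# of an employee hour, using the rough figures cited above.

hours_per_year = 40 * 52              # assumes a standard 2,080-hour work year
average_salary = 47_000               # approximate average US salary (2012)
revenue_per_employee = 280_000        # approximate revenue brought in per employee

cost_view = average_salary / hours_per_year           # ~$23 per hour
revenue_view = revenue_per_employee / hours_per_year  # ~$135 per hour

print(f"Cost-based value of an hour:    ${cost_view:,.2f}")
print(f"Revenue-based value of an hour: ${revenue_view:,.2f}")
```

The exact hourly figures depend on the assumptions, but that order-of-magnitude gap is what separates Enabling IT from Operational IT.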

And finally, there is Transformational IT, where the company’s future is literally bet on new Digital Transformation and IT efforts are focused on moving fast.

These are three different modes of IT, each requiring its own approach to IT outsourcing. To learn more, check out the embedded video below to watch the entire 25-minute webinar. I look forward to your thoughts and opinions and hope this helps you on your career path. If you have any questions, please reach out to me at hyoun @ amalgaminsights . com!