
July 23: From BI to AI (Cube Dev, Dremio, Google Cloud, Julia Computing, Lucata, Palantir, Redpoint, Sisense, Vertica, Zoom)

If you would like your announcement to be included in these data platform-focused roundups, please email lynne@amalgaminsights.com.

Product Launches and Updates

Dremio Launches SQL Lakehouse Service to Accelerate BI and Analytics

On July 21, at Subsurface Live, Dremio debuted Dremio Cloud, a cloud-native SQL-based data “lakehouse” service. The service marries aspects of data lakes and data warehouses into a SQL lakehouse, enabling high-performance SQL workloads in the cloud and expediting the process of getting started. Dremio Cloud is now available in the AWS Marketplace.

Google Cloud Announces Healthcare Data Engine to Enable Interoperability in Healthcare

On July 22, Google Cloud announced Healthcare Data Engine, now in private preview. Healthcare Data Engine integrates healthcare and life sciences data from multiple sources such as medical records, claims, clinical trials, and research data, enabling a more longitudinal view of patient health along with advanced analytics and AI in a secure environment. With the introduction of Amazon HealthLake last week, it’s clear that expanding healthcare and life sciences analytics capabilities continues to be a top priority among data services providers.

Palantir Introduces Foundry for Builders

Dipping a toenail into the waters outside its usual base of large, established organizations, Palantir announced the launch of Foundry for Builders, which provides access to the Palantir Foundry platform for startups under a fully managed subscription model. Foundry for Builders is starting off with limited availability; the initial group of startups granted access are all connected to Palantir alumni, with the hope of expanding to other early-stage “hypergrowth” companies down the road.

Redpoint Global Announces In Situ

On July 20, Redpoint announced In Situ, a service that provides data quality and identity resolution. In Situ uses Redpoint’s data management technology to supply identity resolution and data integration services in real time within an organization’s virtual private cloud, without needing to transfer said private data across the internet.

Sisense Announces Sisense Extense Framework

On July 21, Sisense debuted the Sisense Extense Framework, a way to deliver interactive analytics experiences within popular business applications. Initially supported apps include Slack, Salesforce, Google Sheets, Google Slides, and Google Chrome, now available on the Sisense Marketplace. The Sisense Extense Framework will be released more broadly later this year to partners looking to build similar “infusion” apps.

Vertica Announces Vertica 11

On July 20, at Vertica Unify 2021, Vertica announced the Vertica 11 Analytics Platform. Key improvements include broader deployment support, strengthened security, increased analytical performance, and enhanced machine learning capabilities.

Funding

Cube Dev Raises $15.5 Million to Help Companies Build Applications with Cloud Data Warehouses

On July 19, Cube Dev announced that it had raised $15.5M in Series A funding. Decibel led this round, with participation from Bain Capital Ventures, Betaworks, and Eniac Ventures. The funding will be used to scale go-to-market activities and accelerate R&D on the company’s first commercial product. Cube Dev also brought aboard Jonathan E. Cowperthwait of npm as Head of Marketing and Jordan Philips of Dashbase as Head of Revenue Operations to support its commercial expansion.

Julia Computing Raises $24M in Series A, Former Snowflake CEO Bob Muglia Joins Board

Julia Computing announced the completion of a $24M Series A funding round on July 19. Dorilton Ventures led the round, with participation from Menlo Ventures, General Catalyst, and HighSage Ventures. Julia Computing will use the funding to further develop JuliaHub, its secure, high-performance cloud platform for scientific and technical modeling, and to grow the Julia ecosystem overall. Bob Muglia, the former CEO of Snowflake, joined the Julia Computing board on the same day.

Lucata Raises $11.9 Million in Series B Funding to Introduce Next-Generation Computing Platform

Lucata, whose platform scales and accelerates graph analytics, AI, and machine learning capabilities, announced on July 19 that it had raised $11.9M in Series B funding. Notre Dame, Middleburg Capital Development, Blu Ventures Inc., Hunt Holdings, Maulick Capital, and Varian Capital all participated in the round. The funding will fuel an “aggressive” go-to-market strategy.

Acquisitions

Zoom to Acquire Five9

On July 18, Zoom announced that it had entered into a definitive agreement to acquire Five9, a cloud contact center service provider, for $14.7B in stock. In welcoming Five9 to the Zoom platform, Zoom expects to build a better “customer engagement platform,” complementary to its Zoom Phone offering. Later in the week, Zoom also announced the launch of Zoom Apps and Zoom Events, further enhancing the collaboration capabilities of the primary Zoom video communications suite.


July 9: From BI to AI (AnyVision, Google Cloud, IBM, Immuta, Obviously AI, Opaque)

If you would like your announcement to be included in Amalgam Insights’ weekly data and analytics roundups, please email lynne@amalgaminsights.com.

Funding

AnyVision Raises $235M from SoftBank Vision Fund 2 and Eldridge

AnyVision, a facial recognition AI company, has closed a $235M series C funding round led by SoftBank’s Vision Fund 2 and Eldridge. Amit Lubovsky, director of SoftBank Investment Advisors, will join the board as part of the transaction. Funding will be directed towards further development of AnyVision’s Access Point AI software, as well as further innovation of its SDKs for edge computing functionality. AnyVision’s funding announcement comes at an interesting time for facial recognition startups; concerns around data privacy are subjecting companies creating and using facial recognition to growing scrutiny.

Obviously AI Increases Seed Round Funding to $4.7M

Obviously AI, a no-code AutoML startup, has raised an additional $1.1M from the University of Tokyo Edge Capital Partners, as well as Trail Mix Ventures and B-Capital. The funding will go towards extending Obviously AI to serve more use cases, as well as expanding Obviously AI’s presence in Asian markets. The concept of no-code AI model building is the unicorn everyone dipping into data science is seeking, but Obviously AI is currently limited to supervised learning use cases, and broadening their scope to cover unsupervised learning is the next … obvious step.

Opaque Raises $9.5 Million Seed to Unlock Encrypted Data with Machine Learning

Opaque, a secure data analytics platform, announced July 7 that it had raised a seed round of $9.5M led by Intel Capital. Race Capital, The House Fund, and FactoryHQ also participated in this round. Opaque lets companies analyze encrypted cloud-based data without exposing the data to the cloud provider. Funding will go towards Opaque’s open source contributions to the data security community.

Product Launches and Updates

Immuta Becomes First Data Access Control Solution for Snowflake Partner Connect

On July 7, Immuta, a cloud data access control provider, announced its availability in the Snowflake Partner Connect portal. Snowflake users will now be able to use Immuta to configure automated data access control around their data. The Immuta Snowflake integration launches as an Immuta instance preconfigured with a Snowflake user’s connection credentials, minimizing setup complexity and time needed.

Hiring and Departing

Google announces Adaire Fox-Martin as its new EMEA Cloud president

Google Cloud has appointed Adaire Fox-Martin as its new EMEA Cloud president. Fox-Martin moves over from a 14-year tenure at SAP, most recently as an Executive Board Member leading Global Customer Success. Prior to that, Fox-Martin spent nearly two decades at Oracle.

IBM’s Jim Whitehurst Says He’s Leaving to Find a New Chance to Run Something

Over the holiday weekend, IBM announced that Jim Whitehurst would be stepping down as president, though he would remain in an advisory role for the time being. In an interview this week with Barron’s, Whitehurst explained that he wants to be a CEO again, and with Arvind Krishna holding that spot at IBM, his own chances of landing the position there were slim. Whitehurst had come over to IBM with the Red Hat acquisition, having held the CEO position there since 2007.


June 4, 2021: From BI to AI featuring Alation, Cazena, Cloudera, Datacoral, Dataiku, Iterative, and Stemma

This week’s roundup From BI to AI features Alation, Cazena, Cloudera, Datacoral, Dataiku, Iterative, and Stemma. If you would like your announcement to be included in Amalgam Insights’ weekly data and analytics roundups, please email lynne@amalgaminsights.com.

Acquisitions

Cloudera Acquires Datacoral and Cazena, is Acquired by Clayton, Dubilier, and Rice and KKR for $5.3 Billion

On June 1, Cloudera announced that it had agreed to be acquired by investment companies Clayton, Dubilier, and Rice, and KKR for a $5.3B sum, transitioning to a private company. Financial results for Q1 2021 were released at the same time, with subscription revenue up 7% year over year.

Cloudera also acquired two SaaS companies in separate transactions. Datacoral enables data transformations and data integration, while Cazena implements quick cloud data lakes. Both companies provide fully managed services that facilitate data preparation for self-service analytics.

Funding

Alation Announces $110 Million Series D to Accelerate Growth

On Thursday, June 3, Alation, an enterprise data intelligence platform, announced that it had raised a $110M Series D funding round. Riverwood Capital led this round of funding. Other participants included existing investors Costanoa Ventures, Dell Technologies Capital, Icon Ventures, Salesforce Ventures, Sapphire Ventures, and Union Grove Partners, along with new investments from Sanabil Investments and Snowflake Ventures. Amalgam Insights’ Hyoun Park wrote about this example of “investipartnering” and provides recommendations for the data management community.

Stemma Launches, Reports Seed Funding of $4.8 Million

On Thursday, June 3, Stemma announced that it had raised $4.8M in seed funding, led by Sequoia, and subsequently officially launched their data catalog product. Built atop the open-source data catalog Amundsen, Stemma provides enterprise-scale management capabilities and an intelligence layer based on relevant context.

MLOps Company Iterative Raises $20 Million Series A Funding Led by 468 Capital

Iterative.ai, an MLOps platform, announced Wednesday, June 2 that it had raised a $20M Series A round. 468 Capital and Florian Leibert led the round, which also included prior investors True Ventures and Afore Capital. Iterative.ai also debuted its first commercial product, DVC Studio, a visual front end for its open-source projects DVC (Data Version Control) and CML (Continuous Machine Learning), intended to enhance collaboration above and beyond data scientists’ usual Git-based methods.

Product Launches and Updates

Dataiku Now Available in the Microsoft Azure Marketplace

On June 1, Dataiku announced availability through the Azure Marketplace. Azure customers can now purchase Dataiku with their existing Azure cloud budget and relationship, taking advantage of integrated access to Azure’s cloud storage and compute resources for their data science workflows.


ThoughtSpot Acquires Diyotta to Accelerate Access to Cloud Data

Key Stakeholders: Chief Information Officers, Chief Technical Officers, Digital Transformation Heads, Director of Engineering, Enterprise Architecture Directors and Managers, Application Architecture Directors and Managers, Financial Systems Directors and Managers

Why It Matters: This document serves as introductory guidance for Amalgam Insights’ subscribers considering ThoughtSpot and Diyotta for their data environments. Cloud data warehouses and data lakes present data integration and transformation challenges for enterprises seeking to analyze the massive volumes of data in these stores. Diyotta has a strong track record of supporting analytic data at massive cloud scale over the past decade and will provide ThoughtSpot with both the technology and talent necessary to continue innovating in making data more accessible for business analysis and distribution.

Top Takeaway: This acquisition makes ThoughtSpot better prepared to support enterprise data ecosystems, and it should be accretive to current and future ThoughtSpot customers seeking to access data more quickly.

About the Announcement

On May 4th, 2021, ThoughtSpot purchased Diyotta, an integration platform as a service (iPaaS) vendor known for its ability to support “Big Data” sources, including the market leaders in cloud data warehouses. Diyotta also has a data pipeline SaaS product supporting over 120 data sources to simplify integration. With this purchase, ThoughtSpot both acquires a strong data integration platform and closes gaps for customers seeking to rapidly deploy ThoughtSpot across cloud data sources and machine learning services. With this acquisition, over 60 Diyotta employees will be joining ThoughtSpot, which Amalgam Insights believes represents more than half of the company’s workforce.

About Diyotta

Diyotta was founded in 2011 by Sanjay Vyas, Ravindra Punuru, and Sripathi Tumati to build a cloud-based data integration platform at a time when cloud computing was just starting to gain traction. At the time, infrastructure as a service (IaaS) was a $5 billion global market (compared to the more than $70 billion market it represents in 2021) and data was only beginning to move into the cloud.

Over time, Diyotta built out a code-free data integration platform that allowed companies to connect a wide variety of scalable data sources, and it forged partnerships with leading data and analytics vendors.

A notable example was the October 2019 announcement of a strategic partnership between Diyotta and ThoughtSpot to support search-driven analytic insights. This partnership accelerated enterprise access to data both by allowing companies to build data pipelines more quickly and by supporting ETL (Extract, Transform, and Load) of data from a variety of sources into ThoughtSpot.

What to Expect

ThoughtSpot has quickly evolved in 2021, both through acquisitions including Diyotta and SeekWell and through the organic creation of the ThoughtSpot Modern Analytics Cloud to provide a SaaS platform for search-driven analytics and the launch of ThoughtSpot Everywhere to provide a low-code application development platform. As ThoughtSpot seeks to continue its growth as a company, one of its bottlenecks has been providing access to the increasingly diverse, distributed, and messy data ecosystems that every enterprise now has. ThoughtSpot had already created ThoughtSpot Embrace to query a variety of data sources, including Amazon Redshift, Google BigQuery, Microsoft Azure Synapse, SAP HANA, Snowflake, and Teradata. With the acquisition of Diyotta, ThoughtSpot Embrace development should accelerate and the two solutions should come together more quickly.

With Diyotta, ThoughtSpot now owns an in-house solution for bridging the data access gap for enterprises that lack mature ETL capabilities for cloud data. ThoughtSpot had already been licensing Diyotta technology within its solution, but the acquisition gives ThoughtSpot fuller access to Diyotta’s combination of ease of use, scale, and support for a variety of cloud-based data sources. This acquisition should allow the two technologies to support more synergistic development going forward.

In particular, both Diyotta and ThoughtSpot are strong partners with Snowflake. ThoughtSpot has even taken a $20 million investment from Snowflake Ventures. ThoughtSpot is now better positioned to optimize its data integration and analytics solutions for Snowflake.

ThoughtSpot and Diyotta already partnered to support an easy way to both access and analyze data through their respective technologies. With this acquisition, Amalgam Insights expects to see Diyotta integrated into ThoughtSpot over time as analytics and business intelligence companies are driven to become business data companies capable of handling not only analytic needs, but the curation and orchestration of data sources and the programmatic delivery of analytics back to both applications and data sources.

As for the Diyotta standalone products, Amalgam Insights believes that the revenue from these products is relatively small considering that ThoughtSpot has raised approximately $564 million, with the most recent round coming from Snowflake and the prior round of $248 million closing in August 2019. Given that, it is likely a distraction for Diyotta to continue supporting the standalone iPaaS solution while also supporting ThoughtSpot’s broader product and sales goals.

Amalgam Insights’ Recommendations

For ThoughtSpot customers, this acquisition should be a welcome addition as it will accelerate ThoughtSpot’s ability to support data sources. Diyotta’s technical team has deep experience in supporting rapid data connectors and the deeper ETL processes needed to support data analytics across a wide variety of data lakes and data warehouses. The biggest task for ThoughtSpot customers is to find out how quickly Diyotta will be available as a broader ThoughtSpot Embrace solution and whether Diyotta will be made available to current customers as a standalone product or as an embedded product going forward.

For Diyotta customers, start tracking support announcements to see what commitments ThoughtSpot is making to support the product. Diyotta was purchased to support ThoughtSpot’s massive research, development, and product roadmap, effectively making ThoughtSpot Diyotta’s biggest customer.

For the analytics community in general, this acquisition demonstrates that data integration and analytics companies are coming together based on market demand for solutions that make data easier to access and analyze. Both Diyotta and ThoughtSpot were developed to handle massive analytic data. This is a trend that will continue. Expect to see more best-in-class integration and BI solutions coming together to provide you with integrated options.

This acquisition also speaks to the increased demand for usability. Diyotta consistently ranked high in every measure of usability and ease-of-use as an integration platform, which was an important aspect of this acquisition. ThoughtSpot continues to work on creating an Apple-like environment for data where end-user and analyst access to data remains simple by putting substantial work and investment into analytic performance, backend search, natural language processing, and data management.

Overall, Amalgam Insights believes that this acquisition makes ThoughtSpot better prepared to support enterprise data ecosystems and that it should be accretive to current and future ThoughtSpot customers seeking to access data more quickly.


UPDATE – Quick Take: Is Oracle Buying TikTok? (Hint: It’s all about the cloud)

Last Updated January 20th, 2021

(Update: As of January 20th, with the presidential inauguration of Joe Biden, it seems unlikely that the Biden administration will continue to pursue the US ban on TikTok. This follows U.S. District Court Judge Carl Nichols’ Dec. 7 ruling that the Commerce Department had “likely overstepped” its authority in placing the ban. An earlier injunction against shutting down TikTok services, issued on October 30th by U.S. District Judge Wendy Beetlestone, is currently scheduled to be appealed before the United States Court of Appeals for the Third Circuit in February 2021.)

Key Takeaway: Master tactician Larry Ellison gains a feather in the Oracle Cloud by playing the long game and positioning TikTok as a significant Oracle Cloud customer. Well played, Mr. Ellison.

As if 2020 hasn’t been weird enough, many of us are finding out that enterprise stalwart Oracle is apparently going to purchase Gen Z (born after 1995) and Gen Alpha (born 2010 or later) social media darling TikTok.

What? Is this actually happening?

Well, not quite. But to explain, first we need to look at the context.

Last month, President Trump issued an executive order to ban TikTok in the United States based on security and censorship issues. This was seen both as a move against the Chinese economy and as a way to protect global social media platforms based in the United States, such as Facebook and Twitter.

In response, a number of potential suitors showed up with either bids or proposals to support TikTok in the United States. Microsoft showed interest in purchasing TikTok to support its Azure cloud and gain a massive source of video content that would be useful across Microsoft’s marketing (Bing), gaming (Xbox, Minecraft), augmented reality (HoloLens), and artificial intelligence (Azure AI) businesses. And at one point, retail giant Walmart was associated with this bid, perhaps in an attempt to fend off Amazon on this digital front. But this bid was rejected on September 13.

Oracle came in after Microsoft, showing interest in TikTok. At the time, there was massive confusion in the market at large about why Oracle would be interested. But, as someone who has written about the tight relationship between social technologies and the cloud for many years, my immediate thought was that it’s all about the cloud.

Oracle has been forcefully marketing Oracle Cloud Infrastructure as an enterprise solution after making significant investments to improve connectivity and usability. These recent changes have led to significant logo wins including Zoom and 8×8, both of which chose Oracle for its performance and 80% savings on outbound network traffic. The cost of connectivity has traditionally been a weak point for leading cloud providers, both because of a lack of focus on networking and because cloud vendors have wanted to gate data within their own platforms and have little to no incentive to make inter-cloud transfers and migrations cheaper and easier. But Oracle’s current market position, combined with its prior investments in high performance computing and network performance, means that it makes good business sense for Oracle to be the most efficient cloud from a per-node and bandwidth perspective and to attack where other cloud vendors are weak.

Social media and communications vendors are massive cloud customers in their own right. Pinterest has a six-year, $750 million commitment with Amazon Web Services and is easily on pace to spend far more. Lyft has its own $300 million commitment with AWS. And Citrix has a $1 billion commitment with its cloud vendor of choice, presumably Microsoft Azure. The cloud contract sizes of large and dynamic social and video-centric vendors are enormous. Every cloud provider would be glad to support the likes of TikTok as a customer, or potentially even as a massive operations write-off that would be countered by the billions of dollars in revenue TikTok provides.

And, of course, TikTok creates a massive amount of data. Similar to Microsoft’s interest in TikTok, Oracle obviously has both expertise and a large business focused on the storage and analysis of data. Managing TikTok content, workloads, and infrastructure would provide Oracle with technical insights into video creation trends and management that no company other than perhaps Alphabet’s YouTube could provide. Over the past couple of years, Oracle has put a lot of effort into both database automation and cloud administration with its Gen2 offering.

In addition to bolstering Oracle’s cloud, TikTok also could make sense as a tie-in to Oracle’s Marketing Cloud. At a time when large marketing suites are struggling to support new platforms such as TikTok, what better way to develop support than to own or to access the underlying technology? But wait, does Oracle have access to TikTok’s code and algorithms?

Apparently not. Current stories suggest that Oracle will be the hosting partner or “Trusted Technology Provider” for TikTok America, while TikTok parent company ByteDance maintains majority ownership of the company. It looks like Oracle has positioned itself to be the cloud provider for a massive social media platform, as the United States alone has over 100 million active TikTok users. And the speculation behind Microsoft’s rejected bid is that Microsoft sought to purchase the source code and algorithms of TikTok, which ByteDance refused to provide.

So, the net-net appears to be that in response to Trump’s executive order, Oracle will gain an anchor client for Oracle Cloud Infrastructure while making some investment into the new TikTok US organization. Oracle’s reputation for security and tight US government relations are expected to paper over any current concerns about data sovereignty and governance, such as Chinese access to US user data. Current TikTok investors, such as General Atlantic and Sequoia Capital, may also have stakes in the new US company. This activity effectively puts more money into a Chinese company. Most importantly, this action will allow TikTok to remain operational in the United States after September 20th, the original deadline set by the executive order.

Congratulations to Oracle and Larry Ellison on a game well played.


Quick Thoughts on Cloud Cost and Cloud FinOps Tagging

One of the tactical problems I get asked about most often is how to manage cloud Infrastructure as a Service within an IT cost management environment. As someone who recommends both telecom expense management and cloud cost management solutions, I’ve seen that the paradigm typically used for telecom, servers, on-prem software, and other traditional IT assets and services doesn’t work as well for cloud, both because of the transient nature of cloud services and because public cloud is often purchased solely by technical buyers, with sourcing and finance professionals left out entirely.

This has led to a new practice that has been called Cloud Cost Management or, alternatively, FinOps (even though the abbreviation FinOps does not refer to the operations of finance or the CFO office, but that’s a debate for another time…).

In practice, this means that even the most basic general ledger or Active Directory taxonomy used for the vast majority of business costs is not applied to the cloud because the people involved don’t know where to start. As a result, cloud buyers often don’t get to take advantage of the business structure that the rest of IT purchasing enjoys and end up having to recreate basic business categories from scratch.

As you start managing cloud services, you will most likely have to tag your resources within your management solution of choice because of the relative financial immaturity of cloud management solutions.

(There are exceptions; Apptio Cloudability, Calero-MDSL, CloudHealth by VMware, and Tangoe are the best examples.)

The basic starting point for tagging is to look both at financial management and operational management.

For financial management, ask your controller or accounting team how they break down IT costs, then use the same categories for your tags. It’s usually some combo of employee ID, cost center, profit center, geography, project ID, General Ledger ID, but every company does its books a little differently.

The operational management side is then based on your IT org’s view of technology management, which could include applications, projects, technologies, staging environments and software supported, cloud service categories, and functional IT tasks. This process is well aligned to an IT Finance or Technology Business Management approach, where technology is mapped to specific operational and functional tasks and responsibilities. But you may also need more granular tags that assign each resource to automated governance, security, data transfer, or architecturally defined tasks. Each task or function should roll up to a functional manager, project manager, or stakeholder.
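
As a rough illustration, here is a minimal Python sketch (using boto3 against AWS EC2) of how a combined financial and operational tag set might be applied to a resource. The tag keys, values, and instance ID are hypothetical examples rather than a prescribed taxonomy; substitute whatever categories your controller and IT organization actually use.

```python
import boto3

# Hypothetical tag taxonomy combining the financial and operational
# categories described above; every key and value here is illustrative.
FINANCIAL_TAGS = {
    "cost-center": "CC-1042",
    "profit-center": "PC-EMEA-03",
    "gl-account": "6500-CLOUD",
    "project-id": "PRJ-2021-17",
}

OPERATIONAL_TAGS = {
    "application": "claims-portal",
    "environment": "staging",         # dev / staging / production
    "owner": "jane.doe@example.com",  # functional or project manager
    "service-category": "compute",
}

def tag_instance(instance_id, region="us-east-1"):
    """Apply the combined financial and operational taxonomy to one EC2 instance."""
    ec2 = boto3.client("ec2", region_name=region)
    tags = [{"Key": k, "Value": v}
            for k, v in {**FINANCIAL_TAGS, **OPERATIONAL_TAGS}.items()]
    ec2.create_tags(Resources=[instance_id], Tags=tags)

if __name__ == "__main__":
    tag_instance("i-0123456789abcdef0")  # placeholder instance ID
```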

In thinking about the operational side of tagging, we recommend looking at Apptio Cloudability, CloudHealth by VMware, CloudCheckr, and Replex as starting points.

These tags end up being the taxonomy for your cloud environment and should ideally match up with existing IT taxonomies across IT asset management, project management, service management, and financial management. Otherwise, you risk reinventing the wheel and using up tags on categorization that only makes sense for yourself or your immediate team.

In addition, after creating these tags, you may also want to group these tags into larger dimensions that are associated with a specific use case, solution, or output with the goal of having shortcuts to manage what can be an intimidating number of services, resources, and tag combinations.
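
To make the “dimension” idea concrete, the sketch below shows one way to group tagged resources into larger, named dimensions. The dimension names and tag filters are assumptions for illustration; most cloud cost management tools offer similar grouping natively.

```python
# Hypothetical "dimensions": named groupings of tag filters that act as
# shortcuts over many services, resources, and tag combinations.
DIMENSIONS = {
    "claims-portal-prod": {"application": "claims-portal", "environment": "production"},
    "emea-finance": {"profit-center": "PC-EMEA-03", "gl-account": "6500-CLOUD"},
}

def matches(resource_tags, dimension_filter):
    """A resource belongs to a dimension when it carries every tag in the filter."""
    return all(resource_tags.get(k) == v for k, v in dimension_filter.items())

def group_by_dimension(resources):
    """resources: [{'id': ..., 'tags': {...}}, ...] -> {dimension name: [resource ids]}"""
    groups = {name: [] for name in DIMENSIONS}
    for resource in resources:
        for name, dimension_filter in DIMENSIONS.items():
            if matches(resource["tags"], dimension_filter):
                groups[name].append(resource["id"])
    return groups
```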

Over the next couple of months, Amalgam Insights will be providing more guidance in this space both with our SmartLists on Kubernetes Cost Management and Market Leaders in Technology Expense as well as releasing our videos on managing cloud costs from our recently completed Technology Expense Management Expo. If you have any suggestions for key issues we should include in these reports, please let us know at research@amalgaminsights.com.

And in case you missed it, here are our recommendations for managing cloud costs from earlier this year.


The Complexities of Managing Cloud Spend

COVID-19 shows no signs of letting up in the United States. For IT, finance, and procurement professionals supporting remote staff, this continues to present expense management challenges. In recent blogs and webinars, Amalgam Insights has showcased ways to navigate these issues as they relate to telecom and networking, mobility, and SaaS; we’ve also provided in-depth recommendations for understanding the six stages of COVID IT.

Now we go into detail about a particularly difficult, yet critical, area to assess: cloud IaaS. Even if you have decades of experience evaluating telecommunications and IT invoices, cloud is a whole different animal. And I say “critical” because of all IT categories, cloud IaaS stands out as the one that will experience spending growth in 2020, given that it best meets the needs of a distributed workforce. Both of these realities add pressure to the expense management team’s responsibility to uncover and control costs tied to the organization’s technology environment. 

Managing IaaS: 3 Core Challenges and Their Solutions

The influx of cloud services during the COVID-19 pandemic is highlighting issues that already existed but that expense managers may not have yet tackled. Takeaway? In a recessionary climate, you can’t put off addressing these challenges.

1. Huge Growth

Again, cloud spending will soar this year. Amalgam Insights expects public cloud spend to increase by an average of 30% across all enterprises. This may cause budget problems, if it hasn’t already. But operating according to the IT Rule of 30 should help. That’s our calculation that any unmanaged category of IT contains about 30% waste.

So even though you’ll see about 30% growth in cloud, you may be able to reduce spending by roughly the same amount with mature oversight. For example, $1 million of unmanaged cloud spend that grows 30% reaches $1.3 million; cutting 30% waste from that total brings it back down to about $910,000, below the original baseline.

2. Extremely Detailed Billing  

Compared to telecom, cloud features even more granular invoicing. This applies to every cloud service or component the organization uses. Expense managers have to scour and inspect cloud invoices line by line to avoid missing anything, ideally with programmatic tools or algorithms to help manage the Brobdingnagian challenges of cloud bills.
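
As a simple illustration of what programmatic inspection can look like, here is a hedged Python sketch that aggregates line items from a billing export by service. The column names (“service”, “cost”) and file name are simplified assumptions; real exports such as AWS Cost and Usage Reports use far more detailed, provider-specific fields.

```python
import csv
from collections import defaultdict

def summarize_bill(path):
    """Aggregate line-item cost by service from a simplified billing export.

    Assumes a CSV with 'service' and 'cost' columns; real exports carry many
    more provider-specific fields and naming conventions.
    """
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["service"]] += float(row["cost"])
    return dict(totals)

if __name__ == "__main__":
    for service, cost in sorted(summarize_bill("july_bill.csv").items(),
                                key=lambda item: item[1], reverse=True):
        print(f"{service:40s} ${cost:,.2f}")
```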

3. Lack of Standardization

Cloud is no Ma Bell. The various vendors have never worked together and do not plan to work together. This means there is no standardization of billing terminology or structure. Your enterprise may benefit from creating or obtaining a glossary and ontology that brings together, correlates, and defines the providers’ differing terminology.
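
A glossary of this kind can be as simple as a lookup table that maps each provider’s billing terms onto your own categories. The Python mapping below is a minimal, assumed example; the real service names and shared categories would come from your own invoices and taxonomy.

```python
# Illustrative glossary mapping provider-specific billing terms to a shared
# internal vocabulary; the real terms come from your own bills.
TERM_MAP = {
    ("aws", "Amazon Elastic Compute Cloud"): "compute",
    ("azure", "Virtual Machines"): "compute",
    ("gcp", "Compute Engine"): "compute",
    ("aws", "Amazon Simple Storage Service"): "object-storage",
    ("azure", "Blob Storage"): "object-storage",
    ("gcp", "Cloud Storage"): "object-storage",
}

def normalize(provider, service_name):
    """Translate a provider's service name into the shared internal category."""
    return TERM_MAP.get((provider, service_name), "uncategorized")
```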

Spotting Opportunities for Cloud Cost Management

Organizations must get a handle on their 2020 cloud expenses now. COVID-19 has upended budgets, forecasts and consumption. Following these near-term suggestions will help IT, finance, and procurement regain control.


Identify the Cloud Boss(es). When it comes to the business side of cloud computing, most environments don’t have someone in charge. Now is the time to designate a person or team – executive and managerial stakeholders in charge of planning and budgeting – to oversee the business of cloud. Amalgam Insights has noticed cloud expense and planning tends to be a hybrid role. The ideal candidates usually have expertise in IT, finance/accounting, and procurement. Knowing that, some titles to consider are: Chief Information Officer; Controller; Chief Digital Officer; Vice President of Cloud; Chief Architect. Identifying an executive responsible for cloud and gaining the attention of this champion makes cloud accountability a higher priority.


Analyze Service Usage. Cloud features myriad buckets and use cases. Therefore, IT has to pinpoint what goes where, why, and whether to tweak any ancillary resources (networking, as the primary example). As an example of the latter statement, consider Zoom’s recent partnership with Oracle Cloud. Since the beginning of COVID-19, demand for Zoom has rocketed into the hundreds of millions of users. Service degradation was inevitable. Zoom needed help and turning to Oracle helped it save what we estimate to be over 80% on its cloud networking costs, while achieving necessary failover and business continuity requirements. But speaking to our assertion that IT has to figure out how cloud resources are allocated, the answer isn’t always “off premises.” If you’re archiving core applications on-site, and even with legacy tools, you can probably keep operating that way. Financially and otherwise, this may still be the wisest choice.

Optimize Cloud Services. Businesses adopted a lot of cloud services between March and June of this year, often without realizing it as staff scrambled to work from home (shadow IT, anyone?). That created a situation ripe for optimization. Here are our top recommendations for saving money on cloud spend:

  • Check bills for duplicate resources and eliminate any that are doing the same job (if doing so won’t impact workflows).
  • Rationalize, and potentially turn off, idle services.
  • Right-size resources. In other words, understand how the cloud environment will change as the organization grows or shrinks. Pro Tip: Have a contingency plan and a backup vendor in case usage doubles or triples. The goal isn’t to wholesale migrate all your services to a new vendor, but to be able to add overflow or additional computing and services that may be more cost-efficient or agile onto another vendor.
  • Review discounts to make sure they are actually showing up on the bill. Cloud pricing is almost always accurate but providers do seem to have issues getting the agreed-upon discounts right. 
  • Look at workload times. Turn off workloads when employees aren’t using them, or at least scale them down during non-peak times (a minimal scheduling sketch follows this list).
  • Assess expiration dates. Which cloud resources have an expiration date and which don’t? Find out whether any cloud platforms are used for testing or development. Wherever it makes sense, ask cloud providers to remove expiration dates. 
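
To show what automating the workload-times recommendation might look like, here is a minimal Python sketch (boto3 against AWS EC2) that stops running non-production instances tagged for off-hours shutdown outside an assumed 07:00-19:00 UTC window. The tag keys, environment values, schedule, and region are all assumptions to adapt to your own taxonomy and change-control process.

```python
import boto3
from datetime import datetime, timezone

# Assumed business hours in UTC; tagged non-production instances are stopped
# outside this window. Adjust the schedule, tags, and region to your needs.
BUSINESS_HOURS_UTC = range(7, 19)

def stop_off_hours_workloads(region="us-east-1"):
    if datetime.now(timezone.utc).hour in BUSINESS_HOURS_UTC:
        return []  # inside business hours: leave everything running
    ec2 = boto3.client("ec2", region_name=region)
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:environment", "Values": ["dev", "staging"]},
            {"Name": "tag:off-hours-shutdown", "Values": ["true"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return instance_ids
```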

Ensure Project Governance. Don’t just bring in more cloud resources on a whim. This will create more mess. Instead, take a step back. You want to do right by the organization, avoiding waste rather than adding to it. The goal is to “measure twice, cut once.” Start by tagging and categorizing all existing cloud services, tracking both technical details as well as relevant business categories based on the general ledger and project management solutions. Tagging will enable essential tracking capabilities, and we explore this idea in greater depth below. Then assign expiration dates and vendor commitments – this is also where having a cloud boss comes in handy. After that, conduct a thorough review before launching any new cloud service into production mode. Turn off all test platforms so the organization does not keep paying for them. 

Tag Categories. This practice is vital to cloud expense management best practices. How well the organization tags and categorizes cloud services plays directly into the efficacy and clarity of IT spending. IT needs to know why things happen in the cloud environment, and that won’t be apparent without tagging.

Here are the areas Amalgam Insights has identified as the most useful for tagging and categorizing (a validation sketch follows the list):

  • Cross-charging: Link all cloud spending to the general ledger.
  • Project ownership: Every project and resource should have an owner and be assigned to that person. This works best if that person holds some level of IT responsibility within the business. Be sure to link this information to the human resources system, too.
  • Service priority: Make sure all cloud platforms used for testing and production are identified and have the appropriate service prioritization in place.
  • Region: Follow all cloud governance risk management best practices for every geography in which the organization operates. You don’t want to breach any compliance requirements.
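
One way to keep these four categories honest is a periodic check that every resource carries the required tag keys. The sketch below is a hypothetical Python example; the key names map to the categories above but should be replaced with whatever your own taxonomy uses.

```python
# Required tag keys corresponding to the four categories above; the key names
# are hypothetical and should match the taxonomy you actually adopt.
REQUIRED_TAG_KEYS = {
    "gl-account",         # cross-charging
    "owner",              # project ownership
    "service-priority",   # testing vs. production prioritization
    "region-compliance",  # governance regime for the deployment geography
}

def find_tag_violations(resources):
    """resources: [{'id': ..., 'tags': {...}}, ...] -> {resource id: missing keys}"""
    violations = {}
    for resource in resources:
        missing = REQUIRED_TAG_KEYS - set(resource.get("tags", {}))
        if missing:
            violations[resource["id"]] = sorted(missing)
    return violations
```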

Study Contractual Commitments. There are six main buckets to review for opportunities to save money on cloud expenses:

  1. Time commitments: Cloud vendors often extend more discounts or more flexible terms to organizations that agree to use their services for multiple years.
  2. Payment terms: Will paying upfront or over time serve the best interests of the business? It may be time to negotiate some flexibility depending upon the answer.
  3. Potential growth or reduction: Build a number of scenarios based on different expectations; for example, operational usage may stay the same, but software development or research teams may need to add machine learning workloads. That will affect pricing. Make contractual agreements based on those changes (see the scenario sketch after this list).
  4. Potential investment in apps: What cloud usage is projected for the new apps being created? What data will they create and what services will they need to access? Although developers cannot fully predict usage patterns, the business needs to have a basic idea of potential cloud cost impacts and how app demand will change cloud costs.
  5. Regional concerns: Figure out which regions need the most access to the cloud, as regional pricing for services can vary significantly, leading to potential arbitrage opportunities.
  6. Discounts: As discussed already, cloud vendors often get the pricing right yet omit the discounts or tiering changes. Make sure the organization gets the agreed-upon concessions.
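
As a toy illustration of the scenario-building step, the Python sketch below projects annual spend under a few assumed growth, workload, and discount combinations. The baseline, scenario names, and numbers are placeholders, not benchmarks.

```python
# Placeholder baseline and scenarios; every number here is an assumption
# meant to show the mechanics, not a benchmark.
BASELINE_ANNUAL_SPEND = 1_200_000  # USD

SCENARIOS = {
    "steady-state": {"growth": 0.00, "ml_workloads": 0, "discount": 0.10},
    "moderate-growth": {"growth": 0.15, "ml_workloads": 150_000, "discount": 0.12},
    "aggressive-growth": {"growth": 0.40, "ml_workloads": 400_000, "discount": 0.15},
}

def project_spend(baseline=BASELINE_ANNUAL_SPEND):
    """Projected net annual spend per scenario after growth, add-ons, and discounts."""
    return {
        name: round((baseline * (1 + s["growth"]) + s["ml_workloads"]) * (1 - s["discount"]), 2)
        for name, s in SCENARIOS.items()
    }

if __name__ == "__main__":
    for name, spend in project_spend().items():
        print(f"{name:20s} ${spend:,.0f}")
```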

Conclusion: Recommendations for the Future

Think about managing cloud expenses, especially during COVID-19, as doing your part to act as a steward of the business. As we’ve said before, every $100,000-$200,000 in IT expenses saved equates to a job saved or reinstated. When it comes to the cloud side of the house, introducing automation and reducing total cost of ownership are two additional ways to achieve that goal. Remember, cloud itself doesn’t just represent a cost-cutting measure compared to on-premises data centers; it’s also a tool that saves labor and provides for ongoing business agility and access to services that are more resilient to technical debt. The current economic climate is tough. People have a lot less time (and patience) for activities such as setup, administration, business continuity/disaster recovery, upgrades, and performance tuning. Automate as many of these tasks as possible. Sure, that might mean opting for a more expensive cloud that comes with better Key Performance Indicators and Service Level Agreements. Incrementally, though, this will provide more value than a cheaper counterpart.

Alongside automation and total cost of ownership, don’t overlook the benefits of data and application development skills. As cloud vendors show their reluctance to budge on discounts or payment terms, companies with skills in writing more optimized code and supporting better data management will have advantages in optimizing and cleaning up the cloud environment.

Above all, you don’t have to do all this alone. There are a number of vendors Amalgam Insights recommends that specialize in cloud expense management. Here they are, in alphabetical order:

  • Apptio Cloudability
  • BMC
  • Calero-MDSL
  • CloudCheckr
  • CloudHealth by VMware
  • Flexera
  • MobiChord
  • Snow Software
  • Tangoe
  • Upland Software
  • vCom

Keep in mind, each vendor takes different approaches and has different areas of strength. We recommend investigating each one to see how it fits your environment and needs. If you need unbiased help assessing the options, call on Amalgam Insights. 

***

If you are seeking outside guidance and a deeper dive on your IT environment, Amalgam Insights is here to help. Click here to schedule a consultation.

Join us at TEM Expo, currently available on-demand until August 13 at no cost, to learn more about how to prepare for COVID IT and take immediate action to cut costs. In particular, check out sessions by Robert Lee Harris and Corey Quinn on managing cloud costs and avoiding the biggest mistakes that cloud vendors won’t tell you about.

And if you’d like to learn more about this topic now, please watch our webinar.


IBM and Cloudera Join Forces to Expand Data Science Access

On June 21, IBM and Cloudera jointly announced that they were expanding their existing relationship to bring more advanced data science solutions to Hadoop users by developing a shared go-to-market program. IBM will now resell Cloudera Enterprise Data Hub and Cloudera DataFlow, while Cloudera will resell IBM Watson Studio and IBM BigSQL.

In bulking up their joint go-to-market programs, IBM and Cloudera are reaffirming their pre-existing partnership to amplify each other’s capabilities, particularly in heavy data workflows. Cloudera Hadoop is a common enterprise data source, but Cloudera’s existing base of data science users is small despite the growing demand for data science options, and its Data Science Workbench is coder-centric. Being able to offer the more user-friendly IBM Watson Studio gives Cloudera’s existing data customers a convenient option for doing data science without necessarily needing to know Python, R, or Scala. IBM can now sell Watson Studio, BigSQL, and IBM consulting and services into Cloudera accounts more deeply; this broadens its ability to upsell additional offerings.

Because IBM and Cloudera each hold significant amounts of on-prem data, it’s interesting to look at this partnership in terms of the 800-pound gorilla of cloud data: AWS. IBM, Cloudera, and Amazon are all leaders when it comes to the sheer amount of data each holds. But Amazon is the biggest cloud provider on the planet; it holds the plurality of the cloud hosting market, while most of IBM and Cloudera’s customers’ data is on-prem. Because that data is hosted on-prem, it’s data Amazon doesn’t have access to; IBM and Cloudera are teaming up to sell their own data science and machine learning capabilities on that on-prem data, where there may be security or policy reasons to keep it out of the cloud.

A key differentiator in comparing AWS with the IBM-Cloudera partnership lies in AWS’ breadth of machine learning offerings. In addition to having a general-purpose data science and machine learning platform in SageMaker, AWS also offers task-specific tools like Amazon Personalize and Textract that address precise use cases for a number of Amazon customers who don’t need a full-blown data science platform. IBM has some APIs for visual recognition, natural language classification, and decision optimization, but AWS has developed its own APIs into higher-level services. Cloudera customers building custom machine learning models may find that IBM’s Watson Studio suits their needs. However, IBM lacks the variety of off-the-shelf machine learning applications that AWS provides; IBM supplies its machine learning capabilities as individual APIs that an application development team will need to fit together to create their own in-house apps.

Recommendations

  • For Cloudera customers looking to do broad data science, IBM Watson Studio is now an option. This offers Cloudera customers an alternative to Data Science Workbench; in particular, an option that has a more visual interface, with more drag-and-drop capabilities and some level of automation, rather than a more code-centric environment.
  • IBM customers can now choose Cloudera Enterprise Data Hub for Hadoop. IBM and Hortonworks had a long-term partnership; IBM supporting and cross-selling Enterprise Data Hub demonstrates that IBM will continue to sell enterprise Hadoop in some flavor.

The Death of Big Data and the Emergence of the Multi-Cloud Era

RIP Era of Big Data
April 1, 2006 – June 5, 2019

The Era of Big Data passed away on June 5, 2019 with the announcement of Tom Reilly’s upcoming resignation from Cloudera and the subsequent drop in Cloudera’s market capitalization. Coupled with MapR’s recent announcement that it intends to shut down in late June unless it can find a buyer to continue operations, June of 2019 accentuated that the initial Era of Hadoop-driven Big Data has come to an end. Big Data will be remembered for its role in enabling the beginning of social media dominance, its role in fundamentally changing the mindset of enterprises in working with multiple orders of magnitude increases in data volume, and in clarifying the value of analytic data, data quality, and data governance for the ongoing valuation of data as an enterprise asset.

As I give a eulogy of sorts to the Era of Big Data, I do want to emphasize that Big Data technologies are not actually “dead,” but that the initial generation of Hadoop-based Big Data has reached a point of maturity where its role in enterprise data is established. Big Data is no longer part of the breathless hype cycle of infinite growth, but is now an established technology.


Big Changes in the Cloud Data Migration Market: Attunity and Alooma Get Acquired

Mid-February (Feb. 17 – 23) was a hot week for data and cloud migration companies with two big acquisitions.

Google announced on Tuesday, Feb. 19 the acquisition of Alooma to assist with cloud data migration issues. This acquisition aligns well with the 2018 acquisition of Velostrata to support cloud workload migration, and it reflects Google’s continued interest in acquiring technology solutions from the Israeli cloud tech ecosystem. It also shows that Google is pushing for enterprise cloud data and workloads and will be better positioned to migrate storage and compute from other clouds, such as Amazon Web Services and Microsoft Azure. As Google Cloud Platform continues to grow, this allows Google to be better positioned to

  • become a primary enterprise solution for more enterprises as Google is better positioned to acquire structured data en masse from public and private competitors
  • be a credible backup or business continuity solution for enterprises that may want a multi-cloud or hybrid cloud solution
  • be a competitive provider to help enterprises with cost reduction either through being a foil in contract negotiations or to simply optimize cost in areas where Google resources and services end up being either more cost-effective or easier to manage than similar services from other cloud vendors

This acquisition will allow enterprises looking at cloud as a primary or significant compute and storage tool to consider Google Cloud Platform to replace, replicate, and/or back up existing cloud environments. It also provides a step forward for Google Cloud Platform to continue growing in the Infrastructure-as-a-Service market, which Amalgam Insights estimates is currently growing at over 50% year-over-year, driven by AWS growing 45%, Microsoft growing 76%, Google Cloud Platform’s assumed growth of 30-40%, and Alibaba Cloud’s growth of 84% based on the last quarter. That last figure shows Alibaba’s potential to leapfrog Google Cloud Platform in the next year even as it has not significantly expanded past the Chinese market.

And there’s more!

On Thursday, February 21, Qlik announced its intention to acquire Attunity for $560 million. Attunity is based in the Boston area and has been a market leader in change data capture and data replication with a considerable enterprise and cloud provider customer list. The scale of data transfer Attunity supports has made it a proven solution for the management, transfer, backup, and recovery of data across on-prem, private cloud, and public cloud environments.

When I first covered Attunity in 2012, the stock was trading at around $5 per share after having survived both the dot-com and 2008 recessions. At the time, the company was repositioning itself as a cloud enabler, and I had the opportunity to speak at their 2013 Analyst Day. At that time, I told the investing crowd that Attunity was aligned to a massive cloud opportunity because of its unique role in data replication and in supporting Cloud-based Big Data.

In 2018, Attunity’s stock finally reaped the benefits of this approach as the stock tripled based on rapidly growing customer counts and revenue driven by the need to manage data across multi-region cloud environments, multiple cloud vendors, and hybrid cloud environments. In light of this rapid growth, it is no surprise that Attunity was a strong acquisition target in early 2019’s cash-rich, data-rich, and cloud-dependent world. Looking at Attunity’s income statements, it is easy to see why Qlik made this acquisition from a pure financial perspective as Attunity has crossed the line into profitability and developed a scalable and projectable business that now needs additional sales and marketing resources to fully execute.

Amalgam Insights believes that Attunity provides Qlik with a strong data partner to go with last year’s acquisition of Podium Data (also a Boston-area startup) as a data catalog. With this acquisition, Qlik continues to build itself out as a broad enterprise data solution post-Thoma Bravo acquisition.

With this acquisition, Qlik users are in a comfortable position of being provided with a next-generation data ecosystem to support their move to the cloud and to support a broad range of data sources, formats, and use cases. Qlik is taking a step forward to support mature enterprise needs at a time when a number of its Business Intelligence competitors are focusing on incremental changes in usability, data preparation, or performance.

Amalgam Insights sees the acquisition of Attunity as a competitive advantage for Qlik in net-new deals and believes that companies considering investments in cloud data or broad cloud migrations should immediately add Qlik to the list of vendors in the enterprise toolkit for fully managing these projects.

The big picture for enterprises is that cloud data migration is a core capability to support BCDR (Business Continuity and Disaster Recovery), hybrid cloud, and multi-cloud environments. Google Cloud Platform is now more enterprise-ready and competitive with its larger competitors, Amazon Web Services and Microsoft Azure, at a time when Thomas Kurian is taking the reins. Qlik is establishing itself as a powerful Big Data and Cloud Data vendor at a time when Big Data continues to triple year after year. The enterprise data world is changing quickly and both Google and Qlik made moves to respond to burgeoning market demand.