
Amazon SageMaker: A Key to Accelerating Enterprise Machine Learning Adoption

On November 29th, Amazon Web Services announced SageMaker, a managed machine learning service that covers the authoring, model training, and hosting of algorithms and frameworks. These capabilities can be used individually or as an end-to-end production pipeline.
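To make that train-then-host flow concrete, here is a minimal sketch using the boto3 SDK. The role ARN, S3 paths, container images, and job names below are placeholders rather than real resources, and a production pipeline would add hyperparameters, security settings, and the endpoint creation steps.

```python
# A minimal sketch of SageMaker's train-then-host flow via boto3.
# The role ARN, S3 paths, container images, and names are placeholders.
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

# 1) Run a managed training job on an ml.m4.xlarge instance.
sm.create_training_job(
    TrainingJobName="demo-training-job",
    AlgorithmSpecification={
        "TrainingImage": "<algorithm-container-image>",  # placeholder
        "TrainingInputMode": "File",
    },
    RoleArn="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
    InputDataConfig=[{
        "ChannelName": "train",
        "DataSource": {"S3DataSource": {
            "S3DataType": "S3Prefix",
            "S3Uri": "s3://my-bucket/train/",
            "S3DataDistributionType": "FullyReplicated",
        }},
    }],
    OutputDataConfig={"S3OutputPath": "s3://my-bucket/output/"},
    ResourceConfig={"InstanceType": "ml.m4.xlarge",
                    "InstanceCount": 1,
                    "VolumeSizeInGB": 10},
    StoppingCondition={"MaxRuntimeInSeconds": 3600},
)

# 2) Hosting is a separate, optional step: register the trained model,
#    then create an endpoint configuration and an HTTPS endpoint from it.
sm.create_model(
    ModelName="demo-model",
    PrimaryContainer={
        "Image": "<inference-container-image>",  # placeholder
        "ModelDataUrl": "s3://my-bucket/output/demo-training-job/output/model.tar.gz",
    },
    ExecutionRoleArn="arn:aws:iam::123456789012:role/SageMakerRole",
)
# sm.create_endpoint_config(...) and sm.create_endpoint(...) would follow.
```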

SageMaker is currently available with a Free Tier providing 250 hours of t2.medium notebook usage, 50 hours of m4.xlarge training usage, and 125 hours of m4.xlarge hosting usage for the first two months. After two months, or for additional hours, the service is billed per instance, per GB of storage, and per GB of data transfer.

Amalgam Insights will be watching the adoption of SageMaker closely, as it solves several basic problems in machine learning.

Continue reading Amazon SageMaker: A Key to Accelerating Enterprise Machine Learning Adoption


Amazon Aurora Serverless vs. Oracle Autonomous Database: A Microcosm for The Future of IT

On November 29th, Amazon Web Services made a variety of interesting database announcements at AWS re:Invent: Amazon Neptune, DynamoDB enhancements, and Aurora Serverless. Amalgam found both the Neptune and DynamoDB announcements to be valuable, but believes Aurora Serverless was the most interesting of the three, both in its direct competition with Oracle and in its embodiment of a key transitional challenge that all enterprise IT organizations face.

Amazon Neptune is a managed graph database service that Amalgam believes will be important for analyzing relationships, networked environments, process and anomaly charting, pattern sequencing, random walks, and path optimization problems such as the classic “traveling salesman” problem. Amazon Neptune is currently in limited preview with no scheduled date for general availability. Over time, Amalgam expects that Neptune will be an important enhancer for Amazon Kinesis’ streaming data, IoT Platform, Data Pipeline, and EMR (Elastic MapReduce), as graph databases are well-suited to finding the context and value hiding in large volumes of related data.
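As a rough illustration of the kind of relationship query a graph database handles naturally, the sketch below uses the open-source gremlinpython client, which speaks the Gremlin protocol that Neptune supports. Since Neptune is still in limited preview, the endpoint, port, and data model here are assumptions for illustration only.

```python
# Illustrative only: Neptune is in limited preview, so the endpoint, port,
# and data model below are assumptions. Neptune supports the Gremlin protocol,
# which the open-source gremlinpython package implements.
from gremlin_python.process.anonymous_traversal import traversal
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection

conn = DriverRemoteConnection("wss://my-neptune-endpoint:8182/gremlin", "g")
g = traversal().withRemote(conn)

# A relationship query of the kind graph databases excel at:
# who is connected to "Alice" within two hops of the "knows" edge?
two_hop_contacts = (
    g.V().has("person", "name", "Alice")
         .out("knows").out("knows")
         .dedup().values("name").toList()
)
print(two_hop_contacts)
conn.close()
```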

For the DynamoDB NoSQL database service, Amazon announced two new capabilities. The first is global tables, which are automatically replicated across multiple AWS regions and will be helpful for global support of production applications. The second is on-demand backups of DynamoDB tables that do not impact table availability or performance. With these announcements, DynamoDB comes closer to being a dependable and consistently governed global solution for unstructured and semi-structured data.
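For readers who want to see what these two capabilities look like in practice, here is a brief sketch using boto3. The table, backup, and region names are illustrative, and global tables require the table to already exist (with streams enabled) in each listed region.

```python
# A brief sketch of the two new DynamoDB capabilities via boto3.
# The table, backup, and region names are illustrative; global tables require
# the table to already exist (with streams enabled) in each listed region.
import boto3

ddb = boto3.client("dynamodb", region_name="us-east-1")

# Replicate an existing table across AWS regions as a global table.
ddb.create_global_table(
    GlobalTableName="Orders",
    ReplicationGroup=[
        {"RegionName": "us-east-1"},
        {"RegionName": "eu-west-1"},
    ],
)

# Take an on-demand backup without affecting table availability or performance.
ddb.create_backup(TableName="Orders", BackupName="Orders-2017-11-29")
```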

But the real attention-getter was the announcement of Aurora Serverless, an upcoming relational database offering that will allow end users to pay for database usage and access on a per-second basis. This change is made possible by Amazon’s existing Aurora architecture, which functionally separates storage from compute. This capability will be extremely valuable in supporting highly variable workloads.

How much will Aurora Serverless affect the world of relational databases?

Taking a step back, the majority of business data value is still created by relational data. Relational data is the basis of the vast majority of enterprise applications, the source for business intelligence and business analytics efforts, and the format that enterprise employees understand best for creating data. For the next decade, relational data will still be the most valuable form of data in the enterprise, and the fight for relational data support will be vital in driving the future of machine learning, artificial intelligence, and digital user experience. To understand where relational data is going, we first have to look at Oracle, which still owns 40+% of the relational database market and is laser-focused on business execution.

In early October, Oracle announced the “Autonomous Database Cloud,” based on Database 18c. The Autonomous Database Cloud was presented as a solution for handling the tuning, updating, performance optimization, scaling, and recovery tasks that database administrators are typically responsible for, and was scheduled to launch in late 2017. This announcement came with two strong guarantees: 1) a telco-like 99.995% availability guarantee, including scheduled downtime, and 2) a promise to provide the database at half the price of Amazon Redshift based on the processing power of the Oracle database.

In doing so, Oracle is combining capabilities based on its existing tuning, backup, and encryption automation with new monitoring, failure detection, and automated correction capabilities. All of these functions will be overseen by machine learning designed to maintain and improve performance over time. The end result should be that Oracle Autonomous Database Cloud customers see the elimination of day-to-day administrative tasks and reduced downtime as the machine learning continues to improve the database environment over time.

IT Divergence In Motion: Oracle vs. Amazon

Oracle and Amazon have taken divergent paths in providing their next-generation relational databases, leading to an interesting head-to-head decision for companies seeking enterprise-grade database solutions.

On the one hand, IT organizations that are philosophically seeking to manage IT as a true service have, in Oracle, an automated database option that removes the need for direct database administration and maintenance. Oracle is removing a variety of traditional corporate controls and replacing them with guaranteed uptime, performance, maintenance, and error reduction. This is an outcome-based approach that is still relatively novel in the IT world.

For those of us who have spent the majority of our careers handling IT at a granular level, it can feel somewhat disconcerting to see many of the manual tuning, upgrading, and security responsibilities being both automated and improved through machine learning. In reality, highly repetitive IT tasks will continue to be automated over time as the transactional IT administration tasks of the 80s and 90s finally come to an end. The Oracle approach is a look towards a future where the goal of database planning is to immediately enact analytic-ready data architecture rather than to coordinate efforts between database structures, infrastructure provisioning, business continuity, security, and networking. Oracle has also answered the question of how it will handle the “scale-out” management of its database by pairing this automated management layer with price guarantees.

In this path of database management evolution, database administrators must be architects who focus on how the wide variety of data categories (structured, semi-structured, unstructured, streaming, archived, binary, etc.) will fit into the human need for structure, context, and worldview verification.

On the other hand, Amazon’s approach is fundamentally about customer control at extremely granular levels. Aurora is easy to spin up and gives administrators a great deal of choice regarding instance size and workload capacity. With the current preview of Amazon Aurora Serverless, admins will have even more control over both storage and processing consumption by using the endpoint as the starting point for provisioning and production. Amazon is targeting MySQL compatibility in the first half of 2018, followed by PostgreSQL compatibility later in 2018. Billing will be based on Aurora Capacity Units, a combination of storage and memory metered in one-second increments. This granularity of consumption and flexibility of compute will be very helpful in supporting on-demand applications with highly variable or unpredictable usage patterns.
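To illustrate why per-second, capacity-unit metering matters for spiky workloads, here is a small back-of-the-envelope sketch. The ACU rate below is an assumed placeholder for illustration, not a published price.

```python
# Back-of-the-envelope comparison of per-second, capacity-unit billing versus
# an always-on instance. The ACU rate is an assumed placeholder, not a price.
ASSUMED_USD_PER_ACU_HOUR = 0.06  # placeholder rate for illustration

def burst_cost(acus: float, seconds_active: int) -> float:
    """Cost of holding `acus` capacity units for `seconds_active` seconds."""
    return acus * (seconds_active / 3600.0) * ASSUMED_USD_PER_ACU_HOUR

# A spiky workload: 8 ACUs for 15 minutes per day, idle the rest of the time.
daily_serverless = burst_cost(acus=8, seconds_active=15 * 60)
daily_always_on = 8 * 24 * ASSUMED_USD_PER_ACU_HOUR

print(f"Serverless burst:     ${daily_serverless:.2f}/day")  # $0.12/day
print(f"Always-on equivalent: ${daily_always_on:.2f}/day")   # $11.52/day
```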

But my 20+ years in technology cost administration also lead me to believe that there is an illusory quality to the control in the cost and management structure Amazon is providing. There is nothing wrong with providing pricing at an extremely detailed level, but Amalgam already finds that the vast majority of enterprise cloud spend goes unmonitored on a month-to-month basis at all but the most cursory levels. (For those of you in IT: who is the accountant or expense manager who cross-checks and optimizes your cloud resources on a monthly basis? Oh, you don’t have one?)

Because of that, we at Amalgam believe that additional granularity is more likely to result in billing disputes or complaints. We will also be interested in understanding the details of compute: there can be significant differences in resource pricing based on reserved instances, geography, timing, security needs, and performance needs. Amazon will need to reconcile these compute costs to prevent this service from being an uncontrolled runaway cost. This is the reality of usage-based technology consumption: decades of telecom, network, mobility, and software asset consumption have all demonstrated the risks of pure usage-based pricing.
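As a sketch of the kind of monthly cross-check referenced above, the snippet below pulls service-level spend from the AWS Cost Explorer API via boto3 and flags anything over an assumed review threshold. The date range and threshold are placeholders, and a real reconciliation would also fold in reserved instance commitments, regional pricing, and negotiated discounts.

```python
# A minimal monthly cross-check using the AWS Cost Explorer API via boto3.
# The date range and review threshold are placeholder policy values.
import boto3

ce = boto3.client("ce", region_name="us-east-1")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2017-11-01", "End": "2017-12-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Flag any service whose monthly spend exceeds the assumed review threshold.
REVIEW_THRESHOLD_USD = 1000.0  # placeholder
for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    cost = float(group["Metrics"]["UnblendedCost"]["Amount"])
    if cost > REVIEW_THRESHOLD_USD:
        print(f"Review: {service} spent ${cost:,.2f} last month")
```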

Amalgam believes that there is room for both approaches as Ease-of-Use vs. Granular Management continues to be a key IT struggle in 2018. Oracle represents the DB option for enterprises seeking governance, automation, and strategic scale, while Amazon provides the DB option for enterprises seeking to scale while tightly managing and tracking consumption. The more important point is that the Oracle DB vs. Amazon DB announcements represent a microcosm of the future of IT. In one corner is the need to support technology that “just works” with no downtime, no day-to-day administration, and cost reduction driven by performance. In the other corner is the ultimate commoditization of technology, where customers have extremely granular consumption options, can get started at minimal cost, and can scale out with little-to-no management.

Recommendations

1) Choose your IT model: “Just Works” vs. “Granular Control.” The Oracle and Amazon announcements show how both models have valid aspects, but inherent in both is the need to both scale up and scale out to fit business needs.

2) For “Just Works” organizations, actively evaluate machine learning and automation-driven solutions that reduce or eliminate day-to-day administration. For these organizations, IT no longer represents the management of technology, but the ability to supply solutions that increase in value over time. 2018 is going to be a big year in terms of adding new levels of automation in your organizations.

3) For “Granular Control” organizations, define the technology components that are key drivers or pre-requisites to business success and analyze them extremely closely. In these organizations, IT must be both analytics-savvy and maintain constant vigilance in an ever-changing world. If IT is part of your company’s secret sauce and a fundamental key to differentiated execution, you now have more tools to focus on exactly how, when, and where inflection points take place for company growth, change, or decline.

For additional insights on Amazon’s impact on the future of IT, read Amalgam analyst Tom Petrocelli’s perspective on Amazon Web Services and the Death of IT Ops on InformationWeek.

Posted on 2 Comments

Why Did TIBCO Acquire Alpine Data?

On November 15, 2017, TIBCO announced the acquisition of Alpine Data, a data science platform long known for its goals of democratizing data science and simplifying access to data, analytic workflows, parallel compute, and tools.

With this acquisition, TIBCO makes its second major foray into the machine learning space after its June 5th acquisition of Statistica. In doing so, TIBCO has significantly upgraded its machine learning capabilities, which will be especially useful as TIBCO continues to position itself as a full-range data and analytics solution.

When this acquisition occurred, Amalgam received questions on how Alpine Data and Statistica would be expected to work together and how Alpine Data would fit into TIBCO’s existing machine learning and analytics portfolio. Amalgam has provided favorable recommendations for both Alpine Data and Statistica in 2017 and plans to continue providing a positive recommendation for both solutions, but sought to explore the nuances of these recommendations.

In our Market Milestone, we explore why Alpine Data was a lower-ranked machine learning solution in analyst landscapes despite being early-to-market in providing strong collaborative capabilities and supporting a wide variety of data sources. We also wanted to explore the extent to which Alpine Data provided some sort of conflict to existing TIBCO customers. Finally, we also wanted to provide guidance on how TIBCO’s acquisition would potentially change Alpine Data’s positioning and capabilities.

To read Amalgam Insights’ view and recommendations regarding this report, use the following link to acquire this report.


With myEinstein, Salesforce Embraces that “AI is the New UI”

Salesforce Einstein Airplane – Courtesy of Salesforce

Key Takeaway: Amalgam believes that the go-live date of myEinstein will be the most important date for Enterprise AI in 2018 as it represents the day that AI will become practical and available to a broad business audience across industries, verticals, company sizes, and geographies.

On November 6, 2017, Salesforce [NYSE:CRM] announced the launch of myEinstein: services based on Salesforce’s Einstein machine learning platform to support point-and-click, codeless AI app development. This was one of several new services that Salesforce announced across its platform (mySalesforce and myIoT), training (myTrailhead), and user interface development (myLightning) offerings.

myEinstein consists of two services: Continue reading With myEinstein, Salesforce Embraces that “AI is the New UI”


Amalgam Insights Analyzes Sage Intacct and Pacioli AI

Amalgam Insights recently attended Sage Intacct Advantage. In the past, Intacct got AI’s attention for a strong technology foundation that positions it well for a future of predictive analytics, ease of integration, and machine learning while maintaining the core financial responsibilities associated with being a mid-market ERP solution.

Sage has long been known as a more traditional application company with an established customer base and geographic footprint, which left industry observers with questions of whether there would be significant cultural conflict and whether observer favorite Intacct would be absorbed into a Sage organization that has historically been quiet in its interactions with industry analysts.

During a discussion with analysts, Sage Intacct EVP Rob Reid explained the key business reason for the acquisition. Intacct had been considering an IPO that would have likely resulted in roughly $100 million being raised to support Intacct’s subsequent international expansion. Although this would have been a strong monetization event for all stockholders, the work needed to build a footprint in a variety of new countries would have been both significant and challenging. In contrast, being acquired by Sage for $850 million allowed Intacct investors and stockholders to monetize shares and to access Sage’s significant global footprint.

In speaking with Sage Intacct’s executive team, both Reid and Sage President Blair Crump emphasized that Sage Intacct would be left as an independent organization from a product development perspective. AI believes this is especially important in allowing Sage Intacct to further expand its cloud-based capabilities for core verticals of SaaS, non-profit, and professional services and to accelerate time-to-value for new Sage Intacct capabilities.

With Sage’s acquisition of Intacct in July and the renaming of Intacct to Sage Intacct, AI was curious to see how Sage Intacct differed from the previous Intacct in four areas:

  • Innovation
  • Executive Commitment
  • Partner Relationships
  • Customer for Life Initiatives

Innovation

Since Sage Intacct initially got AI analysts’ attention for its pace of innovation and solution flexibility relative to its competitors, AI was interested in seeing how the software provider sought to maintain its innovation post-acquisition.

At the event, Sage Intacct announced enhancements across key verticals to support board meetings in nonprofit organizations, project management for professional services, and contract management for SaaS organizations. The nonprofit Board Book, in particular, was differentiated for its integration with GuideStar to provide benchmark and best practice metrics for nonprofit finance and operations tracking.

Sage Intacct Pursues AI with Pacioli

In addition, Sage Intacct announced its pursuit of artificial intelligence with Pacioli, named after the inventor of double-entry accounting. (Amalgam believes this is the second most inspired name for an AI, behind only Infor’s Coleman, named after Katherine Coleman Johnson.)

CTO Aaron Harris stated that Pacioli is intended to be “the first digital assistant designed for CFOs,” including a bot built to support the CFO with the ability to self-learn and to provide relevant context for revenue and financial anomalies. Although it’s too early to see how successful Sage Intacct will be in developing Pacioli, Amalgam believes that Sage Intacct’s strong focus on supporting dimensionality and related metadata across its entire product will accelerate development. In addition, Harris revealed that Sage had already been working on AI, such as the Pegg chatbot, that will be brought into the Pacioli development process. Between the announcements by Intacct and Infor and the work done by Salesforce, IBM, and Oracle, it has been a big year for bringing AI into the world of enterprise applications.

Amalgam notes that with the integration of AI, the user interface for applications is going to go through a sea-change that will make the rise of mobility look trivial in comparison. AI provides the opportunity to enhance expert judgment, develop individualized interfaces based on personality and work habit preferences, and to serve as a true assistant to the employee rather than a complex system that has to be continually re-configured and interfaced to provide repeatable business outputs. Based on the AI projects announced in 2017, Amalgam sees that 2018 and 2019 will be as important to the infancy of true enterprise AI as the late 2000s were for the early establishment of the app development, security, governance, and management of mobile apps.

Executive Commitment

It is not uncommon to see executives leave an organization after an IPO or acquisition. One of the most interesting moments at the industry analyst portion of the event was when Rob Reid called out his entire management team to show that they were all committed to the organization. Although nothing in life is guaranteed, it seemed that key Sage Intacct executives had incentives to stay on board for the next two to three years, meaning that there should be no immediate changes to either the roadmap or executive commitments to the Sage Intacct community.

Partner Relationships

With the acquisition by Sage, it was valid for Sage Intacct customers to wonder whether Sage products would replace the ecosystem of partners that Sage Intacct had previously developed. To answer this question, at least in part, Sage Intacct invited ADP to the product roadmap portion of the Analyst event to show continued support of this relationship even as Sage provides Sage People based on the Fairsail acquisition earlier this year.

Amalgam looks forward to seeing this continued openness with partners such as ADP, Adaptive Insights, and FloQast, as the integration capabilities of Sage Intacct in supporting best-in-breed mid-market solutions are an important aspect of Sage Intacct’s strength as a cloud solution. With this acquisition, it makes sense for Sage Intacct to support the use of Sage One if a potential client is not ready for Intacct and for Sage People when appropriate. If anything, Amalgam believes that Sage must be patient in allowing Sage Intacct customers to choose their own technology portfolios.

Customer for Life Initiatives

Amalgam tracks Customer for Life programs as part of its coverage of SaaS management and the changing role of the CIO office in managing technology as a set of strategic alliances. Amalgam has been analyzing Sage Intacct’s Customer for Life program and is currently working on a document regarding this specific approach as a result of the unusually successful renewal and broad-based approach that this software provider takes.

When questioned by Amalgam about the current stage of the Customer for Life program, Kathy Lord replied that the Sage Intacct program has not been affected by the acquisition and that the Sage Intacct approach is actually seen as a standard to be pursued across Sage. Amalgam looks forward to seeing whether this results in increased customer success investment or a perceived change in sales or service from Sage in general and looks forward to both briefings and public filings that make mention of any changes in this regard.

Conclusion

Overall, Amalgam believes that Sage Intacct continued to show its leadership as a mid-market ERP solution and, more important to AI, as a cloud vendor providing leadership in building the next generation of app platforms. Amalgam is bullish on Sage Intacct, but believes it is integral for Sage to allow Intacct to continue on its own business path while providing the developmental and organizational resources needed for Sage Intacct to maintain its growth.


With Oracle Universal Credits, the Cloud Wars Are Truly On

In late September, prior to Oracle OpenWorld, Oracle (NYSE: ORCL) held an event to announce its Universal Credits consumption pricing model and the ability to reuse existing software licenses across Oracle’s Platform as a Service (PaaS) middleware, analytics, and database offerings. Universal Credits represent a fundamental change in cloud pricing, as they will allow Oracle Cloud customers to switch between Oracle’s IaaS and PaaS services. In addition, Larry Ellison unveiled a “self-driving” database that would greatly reduce the cost of administration.
Continue reading With Oracle Universal Credits, the Cloud Wars Are Truly On


28 Hours as an Industry Analyst at Strata Data 2017


Companies Mentioned: Aberdeen Group, Actian, Alation, Arcadia Data, Attunity, BMC, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataKitchen, Datameer, Datarobot, Domino Data Lab, EMA, HPE, Hurwitz and Associates, IBM, Informatica, Kogentix, LogTrust, Looker, Mesosphere, Micro Focus, Microstrategy, Ovum, Paxata, Podium Data, Qubole, SAP, Snowflake, Strata Data, Tableau, Tamr, Tellius, Trifacta.

Last week, I attended the Strata Data Conference at the Javits Center in New York City to catch up with a wide variety of data science and machine learning users, enablers, and thought leaders. In the process, I had the opportunity to listen to some fantastic keynotes, chat with 30+ companies looking for solutions and 30+ vendors presenting at the show, and spend time with a number of luminary industry analysts and thought leaders including Ovum’s Tony Baer, EMA’s John Myers, Aberdeen Group’s Mike Lock, and Hurwitz & Associates’ Judith Hurwitz.

From this whirlwind tour of executives, I came away with a number of takeaways from the keynotes and vendors that I can share, and from end users that I unfortunately have to keep confidential. To give you an idea of what an industry analyst notes at a show like this, below is a short summary of takeaways from the keynotes and from each vendor I spoke to:

Keynotes: The key themes that really got my attention were the idea that AI requires ethics, raised by Joanna Bryson, and the idea that all data is biased, discussed by danah boyd. The idea that data and machine learning have their own weaknesses that require human intervention, training, and guidance is incredibly important. Over the past decade, technologists have put their trust in Big Data and the idea that data will provide answers, only to find that a naive and “unbiased” analysis of data has its own biases. Context and human perspective are inherent to translating data into value: this does not change just because our analytic and data training tools are increasingly nuanced and intelligent.

Behind the hype of data science, Big Data, analytic modeling, robotic process automation, DevOps, DataOps, and artificial intelligence is this fundamental need to understand that data, algorithms, and technology all have inherent biases, as the following tweet shows:
Continue reading 28 Hours as an Industry Analyst at Strata Data 2017


Riverside Partners Acquires Calero: TEM in Transition


On September 13th, 2017, Riverside Partners, a Boston-based private equity firm, announced the acquisition of Calero Software from Clearlake Capital. Calero manages more than $6 billion of annual telecom, mobility, and cloud spend for more than 3,000 customers in 40+ countries and provides managed mobility services for more than 400,000 devices, making it one of the largest technology expense management solutions overall behind Tangoe’s $38 billion+ in technology expense management and Flexera’s $13 billion+ in software expense management. (Cass does not break out its telecom spend, but Amalgam believes it to be similar in scale to Calero.)

This blog covers Amalgam’s perspective on:

  1. Why did Clearlake sell Calero?
  2. Who is Riverside Partners, a relatively new player in the TEM space?
  3. What should we expect from Calero going forward?

Continue reading Riverside Partners Acquires Calero: TEM in Transition


Amalgam Insights’ Technology Consumption Management Webinar Schedule for 2017


Over the rest of 2017, Amalgam Insights will be holding monthly webinars on our BrightTalk Technology Consumption Management Channel.

1) Eight Telecom Expense Solutions Gartner Missed
2) Machine Learning, Design Thinking, & the Role-Based Expert Enhancement Platform
3) Making the Leap from TEM to IT Management
4) Key Vendors for Managing $40+ Billion in SaaS Costs
5) Cloud Service Management: Managing Cost, Resources, and Security

If you are interested in any of these topics, please sign up to attend by clicking on the webinar title link. And if you are interested in a deeper dive of any of these topics, please email Amalgam’s Director of Client Services, Lisa Lincoln.

For more information on each webinar, please look below:

Upcoming Webinars (All Times in Eastern Time Zone):

1) Eight Telecom Expense Solutions Gartner Missed
Date/Time: August 31, 2017, 1 PM

Recommended Audience: Telecom Directors and Managers, Network Directors and Managers, Mobility Directors and Managers, IT Procurement, IT Finance, IT Architects

Amalgam Insights has been covering the Telecom Expense Management (TEM) market for over a decade. In May of 2017, Gartner released its “Market Guide for Telecom Expense Management Services, 2017”. Amalgam believes that this guide overlooked eight key vendors that provide enterprise-grade services and represent billions of dollars in technology spend, including:

* A pioneer in Robotic Process Automation
* The largest standalone TEM in Europe
* A SaaS solution partnering with IBM, Unisys, and Verizon
* A Fortune 1000-focused vendor with 100% referenceable clients
* A fast-growing technology management platform built on ServiceNow
* A commercial and government-focused vendor with over 4 million lines under management
* The largest Australian provider with over 500,000 items under management
* One of the Bay Area’s best rated places to work with IoT, data center, & SD-WAN management offerings

2) Machine Learning, Design Thinking, & the Role-Based Expert Enhancement Platform
Date/Time: September 28, 2017 at 1 PM

Recommended Audience: Chief Data Officers, Chief Analytics Officers, Analytics & BI Managers, Analytics & BI Program Leaders, Enterprise Information Management Managers, Enterprise Architects, Enterprise Applications Managers

Amalgam Insights believes that the key to success for artificial intelligence is embedded AI aligned to role-specific and industry-specific challenges. The goal is to provide focused outputs that enhance the best judgment of subject matter experts. This leads to a core mission of Amalgam Insights: improving the consumption of enterprise technology. Based on these assumptions, Role-Based Expert Enhancement Platforms (REEPs) are the future of embedded Artificial Intelligence.

Based on interviews with dozens of enterprise application platform users, application vendors, and machine learning providers, Amalgam Insights describes how lessons from embedded BI and application analytics can be used to create the next generation of embedded AI and embedded machine learning applications.

3) Making the Leap from TEM to IT Management
Date/Time: October 26, 2017 at 1 PM

Recommended Audience: CIO, CFO, Procurement officers, IT Finance, IT Sourcing.

Telecom Expense has traditionally been the most challenging of IT costs to manage. With the emergence of Software-as-a-Service, Cloud Computing, the Internet of Things, and software defined networks, the rest of the IT world is quickly catching up.

This webinar will provide best practices for expanding your existing telecom expense management program into a bigger IT management program to take advantage of the robust capabilities and vendor management experience already in TEM. This presentation will include anonymized end user examples and a list of IT, cloud, and telecom management vendors with experience in managing non-telecom expense categories.

4) Key Vendors for Managing $40+ Billion in SaaS Costs
Date/Time: November 16, 2017 at 1 PM

Recommended audience: CIO, CFO, Chief Procurement Officers, Finance Directors and Managers, Controllers, IT Procurement, Procurement Directors and Managers, Sales and Marketing Operations Directors and Managers

In this webinar, Amalgam introduces a new set of solutions focused on enterprise SaaS expense management, a $40 billion+ global market that is largely unmanaged at this point.

Amalgam estimates that less than 5% of SaaS spend is currently centrally managed. The other 95% is hiding in expense reports and one-off accounts that escape corporate control. No rules, no bulk discounts, no disputes, no contract enforcement or negotiations.

Stop the insanity! Learn which vendors to consider to manage an expense that may now represent $5,000 or more per employee in your organization.

5) Cloud Service Management: Managing Cost, Resources, and Security
Date/Time: December 14, 2017

Recommended Audience: CIOs, CFOs, Enterprise Architects, IT Project Managers, IT Procurement, IT Service, IT Finance

Cloud Infrastructure-as-a-Service is growing rapidly as companies replace obsolete data center servers and storage with “The Cloud.” As companies use more services from multiple regions and even multiple vendors, Cloud Computing becomes yet another management headache where discounts, service levels, and IT governance can go unenforced.

This webinar provides key tips on the Cloud Service Management market, including the drivers, best practices and top vendors including, but not limited to:

Cloudcheckr
CloudHealth Technologies
HP New Stack
IBM Cost and Asset Management
Microsoft Cloudyn


Sumeru Equity Partners Acquires MDSL and Combines with Telesoft

Title Page: SEP Acquires MDSL

Note: Updated August 2nd with coverage links from Sumeru, MDSL, Telesoft, AltAssets, AOTMP, Crunchbase, PEHub, and Pitchbook.

In case you missed the announcement, as of July 31st, Sumeru Equity Partners has acquired MDSL and is joining Telesoft with MDSL to form a single Technology Expense Management company. I had a few opinions about the new MDSL that didn’t fit into my more formal analysis because they’re observations rather than directional guidance. If you haven’t read the formal Market Milestone yet, please check it out.

First, Sumeru Equity Partners has an interesting portfolio of companies that could be accretive to the new MDSL company. Take a look at their current portfolio:
Continue reading Sumeru Equity Partners Acquires MDSL and Combines with Telesoft