
GSG Rebrands as Sakon and Launches the Sakon Platform

Note: This blog contains excerpts from Amalgam’s Market Milestone document. For the full story, including additional context and recommendations for Global 2000 organizations, purchase the full report.

Key Stakeholders: CIO, CFO, IT Finance, Telecom Managers, Network Managers, Mobility Managers, Software Asset Managers.

Key Announcement

On January 17th, Sakon, formerly known as GSG, announced the launch of the Sakon platform, a Software-as-a-Service suite of six applications supporting the following areas: Mobility and Internet of Things Service Management, Network Services, Cloud Applications Management, Expense Management, Sourcing & Transformation Management, and Insights and Intelligence. The platform will be available as an annual subscription to support the telecom, network, IoT, and SaaS management needs of enterprise IT organizations.


FloQast Announces Visualizations to Improve the Financial Close


Industry: Accounting and Audit Automation
Key Stakeholders: CIO, CFO, IT Controllers, Finance Managers, Accounting Managers

On January 23rd, 2018, FloQast announced new visualization capabilities for its Close Analytics solution, initially launched in April 2017. These visualizations include views of retrospective and close trends, progress in entity management, and a filtered isolation of high-risk processes and root cause drivers.

Amalgam Context for FloQast Visualizations

FloQast is a close management solution created to improve the financial close process. Through a combination of workflows, cloud storage, and deep accounting subject matter expertise, FloQast provides a cloud-based environment to coordinate and accelerate the financial close.

In April of 2017, FloQast launched its Close Analytics solution, an additional module that allowed accounting teams to add visual tracking and metrics to improve close management. These analytics have been enhanced by this most recent announcement to improve process visibility for controllers, finance and accounting executives, and other key executives tracking the accounting close process. Amalgam Insights believes this announcement and its focus on risk contextualization and process specificity reflects the culmination of multiple strategic needs and trends in the accounting world.

First, these visualizations reflect a key market trend: accounting and financial automation can itself become a Black Box of risk when the process is not completely transparent. Although FloQast was already a market leader in providing visibility into the close process, its clear visualizations on an entity- and process-specific basis continue FloQast’s leadership in making the financial close faster, easier, and more transparent.

Second, these visualizations offer accounting and executive teams a greater shared basis for trust in their financial close. Amalgam Insights believes that Trust is the key theme for enterprise technology in 2018, superseding speed, automation, and productivity. Fortunately for FloQast and its clients, close process visualization can simultaneously accelerate both productivity and trust. However, modern solutions must focus on trusted productivity, as productivity in and of itself is not sufficient.

Third, agility requires real-time visibility, or at least a reasonable facsimile of the current state of affairs. To minimize bottlenecks in the close process, accounting leads must be aware of the potential problems and delays currently in place. Through these added visualizations, FloQast provides a level of visibility that is not currently available in the vast majority of mid-sized and large enterprises, due to the disaggregation that is pervasive in accounting departments.

Fourth, visualizations must reflect the past, present, and future of any given process to be truly useful. Historical visualizations of the past are important for identifying patterns and viewing evolutionary changes in process and activity. Current visualizations are important in taking action in the moment and allocating resources appropriately. And expected visualizations of future processes and results are vital in forecasting the need for additional resources and time that may be necessary due to external demands such as new revenue recognition and leasing requirements, mergers and acquisitions activity, or significant portfolio reallocations.

Recommendations for the Accounting Community

The visualization of financial close processes and progress is still relatively novel because of the traditional distribution of work in managing the enterprise close. Because of this, there are very few best practices for accounting teams to follow in tracking the close other than to simply visualize all processes. As accounting departments seek to integrate visualization into their close tracking process, Amalgam provides the following recommendations.

Use cloud-based storage for managing financial close documents, including relevant reconciliation and consolidation reports that must be viewed by two or more people. Without this shared view of key documents and standardization of processes and document names, businesses will be unable to support the data access and real-time updates necessary to visualize financial close process in the first place. The ongoing creation and governance of these documents can also be improved by creating repeatable monthly, quarterly, and annual templates that are also made available to all relevant financial close stakeholders.

Ensure that close visualizations include views of the past, present, and future, but be aware that each of these sets of visualizations serves different use cases. Visualizations focused on the past are important for compliance, pattern recognition, and a historical record of evolutionary change in financial close and data collection. Visualizations portraying data in the present need to be linked with alerts, high-risk processes, and the need to take action when necessary. Visualizations showing future forecasts and potential outcomes allow departments to prepare for key events, regulatory changes, or business transactions that may require reallocation, additional preparation, or additional investment to ensure consistent delivery of accounting services.

Take care that the financial close does not become a black box of activity. It is not sufficient to simply create a set of financial documents that reflect the business activity of the past fiscal period. Without sufficient explanatory documentation, progress tracking, and identification of key drivers that affect financial close efficacy, the financial close is not repeatable and companies will be stuck reinventing the wheel every month simply to conduct a basic business task. This lack of repeatability from Black Box thinking will prevent accountants from taking on higher business value tasks, such as resource optimization, sales operations, and other analysis that can elevate accountants from operational bean counters to strategic business consultants.

By aligning accounting process visibility to these key industry trends, Amalgam believes that FloQast’s focus on visibility has enhanced the Close Analytics module that this vendor offers and met a key need for improving accounting environments. FloQast’s improved visualizations reflect holistic business needs for increased trust, real-time agility, and risk contextualization in accounting departments. Mid-sized and large enterprises must adopt visualizations that portray retrospective trends, entity-specific close processes, and risk-prioritized isolation to truly gain control of their close environments.


Data and Analytic Strategies for Developing Ethical IT: a BrightTALK webinar

BI to AI on Trusted Data - An Amalgam Insights Research Theme

Recommended Audience: CIOs, Enterprise Architects, Data Managers, Analytics Managers, Data Scientists, IT Managers

Vendors Mentioned: Trifacta, Paxata, Datameer, Datawatch, Lavastorm, Alation, Tamr, Unifi, 1010Data, Podium Data, IBM, Domo, Microsoft, Information Builders, Board, Microstrategy, Cloudera, H20.ai, RapidMiner, Domino Data Lab, Dataiku, TIBCO, SAS, Amazon Web Services, Google, DataRobot.

In case you missed it, I just finished up my webinar on Data and Analytic Strategies for Developing Ethical IT. We are headed into a new algorithmic, statistical, and heterogeneous data-defined model of IT where IT ethics and relevance are being challenged. In this webinar, we discussed:

  • Why IT is broken from a support and business perspective
  • The aspects of IT that can be fixed
  • What we can do as IT managers to fix IT
  • Data Prep, Data Unification, Business Intelligence, Data Science, and Machine Learning vendors that can help unlock the Black Boxes and Opt-Out disasters in IT
  • Key Recommendations

This webinar provides context to my ongoing research tracks of “BI to AI on Shared Data” and “IT Management at Scale.” To attend the webinar, please check the embedded view below or click to watch on BrightTALK.



Calero Purchases European TEM Leader A&B Groep: What to Expect?


Note: This blog contains excerpts from Amalgam’s Due Diligence Dossier on Calero. To get the full report, click here.

In January 2018, Calero announced two acquisitions: Comview and A&B Groep. These acquisitions have increased Calero’s headcount by over 70 employees, added geographic footprint, demonstrated a specific profile for acquisition, and shown the willingness of new owner Riverside Partners to take action within 120 days of acquiring Calero. This combination of acquisition, execution, and stated focus results in the need to re-evaluate Calero in the context of these significant changes.


4 Predictions on the Future of IT Consumption

At Amalgam Insights, we have been focused on the key 2018 trends that will change our ability to manage technology at scale. In Part 1 of this series, Tom Petrocelli provided his key Developer Operations and enterprise collaboration predictions for 2018 in mid-December. In Part 2, Todd Maddox provided five key predictions that will shape enterprise learning in 2018. In this third and final set of predictions, I’m taking on key themes of cloud, mobility, telecom, and data management that will challenge IT in terms of management at scale.

  1. Cloud IaaS and SaaS spend under formal management will double in 2018, but the total spend under formalized management will still be under 25% of total business spend.
  2. The number of cellular-connected IoT devices will double to over one billion between now and 2020.
  3. Technology Lifecycle Management will start to emerge as a complex spend management strategy for medium and large enterprises.
  4. Ethical AI will emerge as a key practice for AI Governance.



Calero Flexes PE Muscle in Acquiring TEM Stalwart Comview

On December 29th, 2017, Calero Software acquired Comview, an experienced telecom expense management and call accounting software and services provider. Financial terms were not disclosed. From Amalgam’s perspective, this acquisition is important in demonstrating the plans of Calero under its new ownership, Riverside Partners, and in establishing Comview’s role in the Technology Expense Management market. With this acquisition, all Comview personnel other than Founder John Perri are expected to move to Calero. Calero has committed to supporting Comview’s solution for the immediate future. Comview is expected to run as “a Calero company” with independent development and operations in the short term.

Overall, Amalgam is bullish on this acquisition as it combines the strengths of a veteran TEM provider, Comview, with strong customer service and an integrated platform with the scale and breadth of Calero’s capabilities.



The Enterprise Opportunity in Apple’s Slowdown


On December 28th, Apple released a statement announcing that it would lower battery replacement costs for the iPhone 6s and 7s from $79 to $29, a follow-up to a December 20th statement made to TechCrunch in which Apple described how it was slowing down iPhone 6s and 7s. This piece provides guidance to enterprise IT, mobility, and procurement managers on what the issue is and how to take full advantage of the situation to improve your enterprise mobility environment.

Recommended Audience: CIO, Chief Procurement Officer, IT Procurement Directors and Managers, Mobility Directors and Managers, Telecom Directors and Managers, IT Service Desk Directors and Managers

Vendors Mentioned: Apple, Samsung, Huawei, Calero, DMI, DXC, Ezwim, G Squared Wireless, Honeywell, IBM, Intratem, MDSL, MOBI, Mobichord, MobilSense, Mobile Solutions, NetPlus, Network Control, One Source Communications, Peak-Ryzex, RadiusPoint, Stratix, Tangoe, vCom Solutions, Visage, Vox Mobile, Wireless Analytics


Amazon SageMaker: A Key to Accelerating Enterprise Machine Learning Adoption

On November 29th, Amazon Web Services announced SageMaker, a managed machine learning service that handles the authoring, model training, and hosting of algorithms and frameworks. These capabilities can be used individually or as an end-to-end production pipeline.

SageMaker is currently available with a Free tier providing 250 hours of t2.medium notebook usage, 50 hours of m4.xlarge training usage, and 125 hours of m4.xlarge hosting usage for the first two months. After two months, or for additional hours, the service is billed per instance, per GB of storage, and per GB of data transferred.
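As a rough illustration of how that tiering works, the sketch below estimates a bill after free-tier allowances are deducted. The free-tier hours come from the announcement above, but the hourly rates are hypothetical placeholders, not published AWS prices:

```python
# Illustrative sketch: estimating billable SageMaker hours once the free
# tier is exhausted. Free-tier allowances match the announcement above;
# the hourly rates are placeholders for illustration only.

FREE_TIER_HOURS = {
    "notebook_t2_medium": 250,
    "training_m4_xlarge": 50,
    "hosting_m4_xlarge": 125,
}

# Hypothetical per-hour rates -- check the AWS pricing page for real figures.
HOURLY_RATE = {
    "notebook_t2_medium": 0.05,
    "training_m4_xlarge": 0.28,
    "hosting_m4_xlarge": 0.28,
}

def estimated_cost(usage_hours: dict) -> float:
    """Return the estimated bill after deducting free-tier allowances."""
    total = 0.0
    for resource, hours in usage_hours.items():
        billable = max(0.0, hours - FREE_TIER_HOURS.get(resource, 0))
        total += billable * HOURLY_RATE[resource]
    return round(total, 2)

# Example: 300 notebook hours and 40 training hours in the first two months
# only the 50 notebook hours above the allowance are billed.
print(estimated_cost({"notebook_t2_medium": 300, "training_m4_xlarge": 40}))
```

The point of the sketch is the mechanic, not the numbers: each resource class carries its own allowance, so teams should track notebook, training, and hosting hours separately.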

Amalgam Insights anticipates watching the adoption of SageMaker as it solves several basic problems in machine learning.



Amazon Aurora Serverless vs. Oracle Autonomous Database: A Microcosm for The Future of IT

On November 29th, Amazon Web Services made a variety of interesting database announcements at Amazon re:Invent: Amazon Neptune, DynamoDB enhancements, and Aurora Serverless. Amalgam found both the Neptune and DynamoDB announcements to be valuable, but believes Aurora Serverless was the most interesting of these events, both in its direct competition with Oracle and in its personification of a key transitional challenge that all enterprise IT organizations face.

Amazon Neptune is a managed graph database service that Amalgam believes will be important for analyzing relationships, networked environments, process and anomaly charting, pattern sequencing, and random walks (such as solving the classic “traveling salesman” problem). Amazon Neptune is currently in limited preview with no scheduled date for production. Over time, Amalgam expects that Neptune will be an important enhancer for Amazon Kinesis’ streaming data, IoT Platform, Data Pipeline, and EMR (Elastic MapReduce) as graph databases are well-suited to find the context and value hiding in large volumes of related data.

For the DynamoDB NoSQL database service, Amazon announced two new capabilities. The first is global tables that will be automatically replicated across multiple AWS regions, which will be helpful for global support of production applications. The second is on-demand backups for DynamoDB tables without impacting their availability or speed. With these announcements, DynamoDB comes closer to being a dependable and consistently governed global solution for unstructured and semi-structured data.
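For readers who want a sense of what these capabilities look like in practice, the sketch below builds the request for the then-new global tables API and notes the on-demand backup call. The table name and regions are hypothetical, and the boto3 calls that would actually reach AWS are left commented out:

```python
# Sketch of exercising the two new DynamoDB capabilities via boto3.
# "Orders" and the region list are hypothetical examples.

def global_table_request(table_name: str, regions: list) -> dict:
    """Build the create_global_table request replicating across regions."""
    return {
        "GlobalTableName": table_name,
        "ReplicationGroup": [{"RegionName": r} for r in regions],
    }

params = global_table_request("Orders", ["us-east-1", "eu-west-1"])

# The live calls, assuming boto3 is installed and credentials configured:
# import boto3
# client = boto3.client("dynamodb", region_name="us-east-1")
# client.create_global_table(**params)        # multi-region replication
# client.create_backup(TableName="Orders",    # on-demand backup, no
#                      BackupName="Orders-2018-01")  # availability impact
```

Note that each regional replica must already exist as a table with streams enabled before the global table can be created, which is part of the governance work these announcements still leave to the customer.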

But the real attention-getter was the announcement of Aurora Serverless, an upcoming relational database offering that will allow end users to pay for database usage and access on a per-second basis. This change is made possible by Amazon’s existing Aurora architecture, which separates storage from compute on a functional basis. This capability will be extremely valuable in supporting highly variable workloads.

How much will Aurora Serverless affect the world of relational databases?

Taking a step back, the majority of business data value is still created by relational data. Relational data is the basis of the vast majority of enterprise applications, the source for business intelligence and business analytics efforts, and the standard format that enterprise employees understand best for creating data. For the next decade, relational data will still be the most valuable form of data in the enterprise and the fight for relational data support will be vital in driving the future of machine learning, artificial intelligence, and digital user experience. To understand where the future of relational data is going, we have to first look at Oracle, who still owns 40+% of the relational database market and is laser-focused on business execution.

In early October, Oracle announced the “Autonomous Database Cloud,” based on Database 18c. The Autonomous Database Cloud was presented as a solution for managing the tuning, updating, performance, scaling, and recovery tasks that database administrators are typically tasked with, and was scheduled to launch in late 2017. This announcement came with two strong guarantees: 1) a telco-like 99.995% availability guarantee, including scheduled downtime, and 2) a promise to provide the database at half the price of Amazon Redshift based on the processing power of the Oracle database.
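To put that availability guarantee in perspective, a quick back-of-the-envelope calculation (our own sketch, not Oracle's math) shows the yearly downtime budget a given availability percentage implies:

```python
# Convert an availability percentage into a yearly downtime budget.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_budget_minutes(availability_pct: float) -> float:
    """Minutes of downtime per year permitted at a given availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

print(round(downtime_budget_minutes(99.995), 1))  # -> 26.3 minutes/year
```

Roughly 26 minutes of downtime per year, with scheduled maintenance included in that budget, is an aggressive commitment by any enterprise database standard; a more conventional "three nines" (99.9%) guarantee would allow over 525 minutes.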

In doing so, Oracle is using a combination of capabilities based on existing Oracle tuning, backup, and encryption automation and adding monitoring, failure detection, and automated correction capabilities. All of these functions will be overseen by machine learning designed to maintain and improve performance over time. The end result should be that Oracle Autonomous Database Cloud customers would see an elimination of day-to-day administrative tasks and reduced downtime as the machine learning continues to improve the database environment over time.

IT Divergence In Motion: Oracle vs. Amazon

Oracle and Amazon have taken divergent paths in providing their next-generation relational databases, leading to an interesting head-to-head decision for companies seeking enterprise-grade database solutions.

On the one hand, IT organizations that are philosophically seeking to manage IT as a true service have, in Oracle, an automated database option that will remove the need for direct database and maintenance administration. Oracle is removing a variety of traditional corporate controls and replacing them with guaranteed uptime, performance, maintenance, and error reduction. This is an outcome-based approach that is still relatively novel in the IT world.

For those of us who have spent the majority of our careers handling IT at a granular level, it can feel somewhat disconcerting to see many of the manual tuning, upgrading, and security responsibilities being both automated and improved through machine learning. In reality, highly repetitive IT tasks will continue to be automated over time as the transactional IT administration tasks of the 80s and 90s finally come to an end. The Oracle approach is a look towards a future where the goal of database planning is to immediately enact analytic-ready data architecture rather than to coordinate efforts between database structures, infrastructure provisioning, business continuity, security, and networking. Oracle has also answered the question of how it will handle the “scale-out” management of its database by providing this automated management layer with price guarantees.

In this path of database management evolution, database administrators must be architects who focus on how the wide variety of data categories (structured, semi-structured, unstructured, streaming, archived, binary, etc…) will fit into the human need for structure, context, and worldview verification.

On the other hand, Amazon’s approach is fundamentally about customer control at extremely granular levels. Aurora is easy to spin up and allows administrators a great deal of choice between instance size and workload capacity. With the current preview of Amazon Aurora Serverless, admins will have even more control over both storage and processing consumption, using the endpoint as the starting point for provisioning and production. Amazon will target MySQL compatibility in the first half of 2018, followed by PostgreSQL later in 2018. Billing will occur in Aurora Capacity Units, a combination of storage and memory metered in one-second increments. This granularity of consumption and flexibility of computing will be very helpful in supporting on-demand applications with highly variable or unpredictable usage patterns.
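To illustrate what per-second metering in Aurora Capacity Units (ACUs) implies for a spiky workload, here is a minimal sketch; the per-ACU-hour rate is an assumed placeholder, since Amazon had not published Aurora Serverless pricing at announcement time:

```python
# Illustrative sketch of per-second ACU metering for Aurora Serverless.
# The rate below is hypothetical -- pricing was unannounced at this writing.

ASSUMED_RATE_PER_ACU_HOUR = 0.06  # placeholder rate for illustration only

def serverless_cost(acu_seconds: float,
                    rate_per_acu_hour: float = ASSUMED_RATE_PER_ACU_HOUR) -> float:
    """Cost of a workload metered in one-second ACU increments."""
    return round(acu_seconds / 3600 * rate_per_acu_hour, 4)

# A spiky workload: 2 ACUs for 15 minutes, then idle (no charge while idle)
print(serverless_cost(2 * 15 * 60))
```

The appeal is obvious for bursty applications: idle time costs nothing. The management challenge, discussed below, is that thousands of such per-second charges are far harder to audit than a fixed monthly instance fee.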

But my 20+ years in technology cost administration also lead me to believe that there is an illusory quality of control in the cost and management structure that Amazon is providing. There is nothing wrong with providing pricing at an extremely detailed level, but Amalgam already finds that the vast majority of enterprise cloud spend goes unmonitored on a month-to-month basis at all but the most cursory levels. (For those of you in IT, who is the accountant or expense manager who cross-checks and optimizes your cloud resources on a monthly basis? Oh, you don’t have one?)

Because of that, we at Amalgam believe that additional granularity is more likely to result in billing disputes or complaints. We will also be interested in understanding the details of compute: there can be significant differences in resource pricing based on reserved instances, geography, timing, security needs, and performance needs. Amazon will need to reconcile these compute costs to prevent this service from being an uncontrolled runaway cost. This is the reality of usage-based technology consumption: decades of telecom, network, mobility, and software asset consumption have all demonstrated the risks of pure usage-based pricing.

Amalgam believes that there is room for both as Ease-of-Use vs. Granular Management continues to be a key IT struggle in 2018. Oracle represents the DB option for enterprises seeking governance, automation, and strategic scale, while Amazon provides the DB option for enterprises seeking to scale while tightly managing and tracking consumption. The more important issue here is that the Oracle DB vs. Amazon DB announcements represent a microcosm of the future of IT. In one corner is the need to support technology that “just works” with no downtime, no day-to-day administration, and cost reduction driven by performance. In the other corner is the ultimate commoditization of technology where customers have extremely granular consumption options, can get started at minimal cost, and can scale out with little-to-no management.

Recommendations

1) Choose your IT model: “Just Works” vs. “Granular Control.” The Oracle and Amazon announcements show how both models have valid aspects. But inherent in both is the need to both scale up and scale out to fit business needs.

2) For “Just Works” organizations, actively evaluate machine learning and automation-driven solutions that reduce or eliminate day-to-day administration. For these organizations, IT no longer represents the management of technology, but the ability to supply solutions that increase in value over time. 2018 is going to be a big year in terms of adding new levels of automation in your organizations.

3) For “Granular Control” organizations, define the technology components that are key drivers or pre-requisites to business success and analyze them extremely closely. In these organizations, IT must be both analytics-savvy and maintain constant vigilance in an ever-changing world. If IT is part of your company’s secret sauce and a fundamental key to differentiated execution, you now have more tools to focus on exactly how, when, and where inflection points take place for company growth, change, or decline.

For additional insights on Amazon’s impact on the future of IT, read Amalgam analyst Tom Petrocelli’s perspective on Amazon Web Services and the Death of IT Ops on InformationWeek.


Why Did TIBCO Acquire Alpine Data?

On November 15, 2017, TIBCO announced the acquisition of Alpine Data, a data science platform long known for its goals of democratizing data science and simplifying access to data, analytic workflows, parallel compute, and tools.

With this acquisition, TIBCO makes its second major foray into the machine learning space, after its June 5th acquisition of Statistica. In doing so, TIBCO has significantly upgraded its machine learning support capabilities, which will be especially useful to TIBCO in continuing to position itself as a full-range data and analytics solution.

When this acquisition occurred, Amalgam received questions on how Alpine Data and Statistica would be expected to work together and how Alpine Data would fit into TIBCO’s existing machine learning and analytics portfolio. Amalgam has provided favorable recommendations for both Alpine Data and Statistica in 2017 and plans to continue providing a positive recommendation for both solutions, but sought to explore the nuances of these recommendations.

In our Market Milestone, we explore why Alpine Data was a lower-ranked machine learning solution in analyst landscapes despite being early to market in providing strong collaborative capabilities and supporting a wide variety of data sources. We also wanted to explore the extent to which Alpine Data posed a potential conflict for existing TIBCO customers. Finally, we wanted to provide guidance on how TIBCO’s acquisition would potentially change Alpine Data’s positioning and capabilities.

To read Amalgam Insights’ full view and recommendations, use the following link to acquire the report.