
Data and Analytic Strategies for Developing Ethical IT: a BrightTALK webinar

BI to AI on Trusted Data – An Amalgam Insights Research Theme

Recommended Audience: CIOs, Enterprise Architects, Data Managers, Analytics Managers, Data Scientists, IT Managers

Vendors Mentioned: Trifacta, Paxata, Datameer, Datawatch, Lavastorm, Alation, Tamr, Unifi, 1010Data, Podium Data, IBM, Domo, Microsoft, Information Builders, Board, MicroStrategy, Cloudera, H2O.ai, RapidMiner, Domino Data Lab, Dataiku, TIBCO, SAS, Amazon Web Services, Google, DataRobot.

In case you missed it, I just finished up my webinar on Data and Analytic Strategies for Developing Ethical IT. We are headed into a new algorithmic, statistical, and heterogeneous data-defined model of IT where IT ethics and relevance are being challenged. In this webinar, we discussed:

  • Why IT is broken from a support and business perspective
  • The aspects of IT that can be fixed
  • What we can do as IT managers to fix IT
  • Data Prep, Data Unification, Business Intelligence, Data Science, and Machine Learning vendors that can help unlock the Black Boxes and Opt-Out disasters in IT
  • Key Recommendations

This webinar provides context to my ongoing research tracks of “BI to AI on Trusted Data” and “IT Management at Scale.” To view the webinar, check the embedded player below or click through to watch it on BrightTALK.



4 Predictions on the Future of IT Consumption

At Amalgam Insights, we have been focused on the key 2018 trends that will change our ability to manage technology at scale. In Part 1 of this series, Tom Petrocelli provided his key Developer Operations and enterprise collaboration predictions for 2018 in mid-December. In Part 2, Todd Maddox provided 5 key predictions that will shape enterprise learning in 2018. In this third and final set of predictions, I’m taking on key themes of cloud, mobility, telecom, and data management that will challenge IT in terms of management at scale.

  1. Cloud IaaS and SaaS spend under formal management will double in 2018, but the total spend under formalized management will still be under 25% of total business spend.
  2. The number of cellular-connected IoT devices will double to over one billion between now and 2020.
  3. Technology Lifecycle Management will start to emerge as a complex spend management strategy for medium and large enterprises.
  4. Ethical AI will emerge as a key practice for AI Governance.

Continue reading 4 Predictions on the Future of IT Consumption


Calero Flexes PE Muscle in Acquiring TEM Stalwart Comview

On December 29th, 2017, Calero Software acquired Comview, an experienced telecom expense management and call accounting software and services provider. Financial terms were not disclosed. From Amalgam’s perspective, this acquisition is important in demonstrating the plans of Calero under its new ownership, Riverside Partners, and in establishing Comview’s role in the Technology Expense Management market. With this acquisition, all Comview personnel other than Founder John Perri are expected to move to Calero. Calero has committed to supporting Comview’s solution for the immediate future. Comview is expected to run as “a Calero company” with independent development and operations in the short term.

Overall, Amalgam is bullish on this acquisition, as it combines the strengths of Comview, a veteran TEM provider with strong customer service and an integrated platform, with the scale and breadth of Calero’s capabilities.

Continue reading Calero Flexes PE Muscle in Acquiring TEM Stalwart Comview


As API Management Problem Grows, Informatica Jumps into the Market

Tom Petrocelli, Amalgam Insights Contributing Analyst

API management is a necessary but boring practice. As developers make use of a mix of public cloud, purchased or open source libraries, and homegrown services, the number of APIs in play quickly renders poring through documentation impractical.

Microservices, usually accessed via RESTful APIs, cause API calls to rapidly proliferate. Even modest-sized microservices-based systems experience API overload quickly. Agile development can exacerbate the problem of understanding and using APIs. The rapid pace of Agile, especially Scrum, leaves little time for proper documentation of APIs. Documentation often takes a back seat to continuous deployment.
Continue reading As API Management Problem Grows, Informatica Jumps into the Market


Tom Petrocelli Introduces NoOps on InformationWeek


In case you missed it, at re:Invent Amazon launched a mind-numbing number of new services, including a managed Kubernetes service, more AWS Lambda extensions, Aurora Serverless, the AWS Serverless Application Repository, and Amazon SageMaker.

Based on this, Amalgam Analyst Tom Petrocelli recently contributed a thought-provoking article on InformationWeek about how Amazon Web Services is working on killing off IT Ops and bringing in a new age of “NoOps.” For IT Ops, Winter is definitely coming.

Do you agree or disagree? Take a look at Tom’s POV and let us know what you think.

Click here to read Tom’s article: AWS Ignites Debate About the Death of IT Ops


What’s On Tap for 2018 from Tom Petrocelli

Tom Petrocelli, Amalgam Insights Research Fellow

As the year comes to a close, I have had the opportunity to reflect on what has transpired in 2017 and look ahead to 2018. Some of my recent thoughts on 2017 have been published in:

These articles provide a peek ahead at emerging 2018 trends.

In the two areas I cover, collaboration and DevOps/Developer Trends, I plan to continue to look at:
  • The continued transformation of the collaboration market. [Click to Tweet] I am expecting a “mass extinction event” of products in this space. That doesn’t mean the collaboration market will evaporate. Instead, I am looking for niche products that address specific collaboration segments to thrive while a handful of large collaboration players will consume the general market.
  • The emergence of NoOps, for No Operations, in the mid-market. [Click to Tweet] The Amazon push to serverless products is a bellwether of the upcoming move toward cloud vendor operations supplanting company IT sysops.
  • 2018 will be the year of the container. [Click to Tweet] Containers have been growing in popularity over the past several years, but 2018 will be the year when they become truly mass market. The growth in the ecosystem, especially the widespread availability of cloud Kubernetes services, will make containers more palatable to a wider market.
  • Integrated DevOps pipelines will make DevOps more efficient… if we can get the politics out of IT. [Click to Tweet]
  • Machine learning will continue to be integrated into developer tools [Click to Tweet], which, in turn, will make more complex coding and deployment jobs easier.

As you know, I joined Amalgam Insights in September. Amalgam Insights, or AI, is a full-service market analyst firm. I’d welcome the opportunity to learn more about what 2018 holds for you. Perhaps we can schedule a quick call in the next couple of weeks. Let me know what works best for you. As always, if I can provide any additional information about AI, I’d be happy to do so!

Thanks, and have a happy holiday season.

For more predictions on IT management at scale, check out Todd Maddox’s 5 Predictions That Will Transform Corporate Training.


Amazon Aurora Serverless vs. Oracle Autonomous Database: A Microcosm for The Future of IT

On November 29th, Amazon Web Services made a variety of interesting database announcements at Amazon re:Invent: Amazon Neptune, DynamoDB enhancements, and Aurora Serverless. Amalgam found both the Neptune and DynamoDB announcements to be valuable but believes Aurora Serverless was the most interesting of these events, both in its direct competition with Oracle and in its personification of a key transitional challenge that all enterprise IT organizations face.

Amazon Neptune is a managed graph database service that Amalgam believes will be important for analyzing relationships, networked environments, process and anomaly charting, pattern sequencing, and random walks (such as solving the classic “traveling salesman” problem). Amazon Neptune is currently in limited preview with no scheduled date for production. Over time, Amalgam expects that Neptune will be an important enhancer for Amazon Kinesis’ streaming data, IoT Platform, Data Pipeline, and EMR (Elastic MapReduce) as graph databases are well-suited to find the context and value hiding in large volumes of related data.
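
As a toy illustration of the kind of traversal a graph database handles natively, here is a minimal random walk over a small in-memory graph using Python's networkx library. This is purely illustrative: Neptune itself is queried through Gremlin or SPARQL, and the nodes and edges below are invented for the example.

```python
import random
import networkx as nx

# A small relationship graph of the kind a graph database models natively.
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("bob", "carol"), ("carol", "dave"),
    ("dave", "alice"), ("bob", "dave"),
])

def random_walk(graph, start, steps):
    """Hop to a uniformly random neighbor at each step."""
    path = [start]
    for _ in range(steps):
        path.append(random.choice(list(graph.neighbors(path[-1]))))
    return path

print(random_walk(G, "alice", 5))  # e.g. ['alice', 'dave', 'bob', ...]
```

Walks like this underpin many relationship-analysis and pattern-sequencing techniques, which is why they are far cheaper to express against a graph store than against join-heavy relational tables.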

For the DynamoDB NoSQL database service, Amazon announced two new capabilities. The first is global tables that will be automatically replicated across multiple AWS regions, which will be helpful for global support of production applications. Secondly, Amazon now provides on-demand backups for DynamoDB tables without impacting their availability or speed. With these announcements, DynamoDB comes closer to being a dependable and consistently governed global solution for unstructured and semistructured data.
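
For readers who want to see what these two capabilities look like in practice, here is a minimal boto3 sketch. The table name, backup name, and regions are hypothetical, and creating a global table assumes DynamoDB Streams is already enabled on identically named tables in each region.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Global tables: register replicas of an existing table across regions
# so writes in one region propagate automatically to the others.
dynamodb.create_global_table(
    GlobalTableName="orders",
    ReplicationGroup=[
        {"RegionName": "us-east-1"},
        {"RegionName": "eu-west-1"},
    ],
)

# On-demand backup of a table without affecting its availability or speed.
dynamodb.create_backup(TableName="orders", BackupName="orders-2017-12")
```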

But the real attention-getter was the announcement of Aurora Serverless, an upcoming relational database offering that will allow end users to pay for database usage and access on a per-second basis. This change is made possible by Amazon’s existing Aurora architecture, which separates storage from compute at a functional level. This capability will be extremely valuable in supporting highly variable workloads.

How much will Aurora Serverless affect the world of relational databases?

Taking a step back, the majority of business data value is still created by relational data. Relational data is the basis of the vast majority of enterprise applications, the source for business intelligence and business analytics efforts, and the standard format that enterprise employees understand best for creating data. For the next decade, relational data will still be the most valuable form of data in the enterprise and the fight for relational data support will be vital in driving the future of machine learning, artificial intelligence, and digital user experience. To understand where the future of relational data is going, we have to first look at Oracle, who still owns 40+% of the relational database market and is laser-focused on business execution.

In early October, Oracle announced the “Autonomous Database Cloud,” based on Database 18c. The Autonomous Database Cloud was presented as a solution for handling the tuning, updating, performance, scaling, and recovery tasks that typically fall to database administrators, and was scheduled to launch in late 2017. This announcement came with two strong guarantees: 1) a telco-like 99.995% availability guarantee, including scheduled downtime, and 2) a promise to provide the database at half the price of Amazon Redshift based on the processing power of the Oracle database.
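
To put the 99.995% figure in concrete terms, a quick back-of-the-envelope calculation shows the downtime budget it implies:

```python
# 99.995% availability leaves a downtime budget of 0.005% of the year.
minutes_per_year = 365 * 24 * 60                    # 525,600 minutes
downtime_budget = minutes_per_year * (1 - 0.99995)
print(round(downtime_budget, 1))                    # ~26.3 minutes per year
```

Roughly 26 minutes of total downtime per year, with scheduled maintenance counted inside that budget, is an aggressive commitment for a relational database.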

In doing so, Oracle is building on its existing tuning, backup, and encryption automation and adding monitoring, failure detection, and automated correction capabilities. All of these functions will be overseen by machine learning designed to maintain and improve performance over time. The end result should be that Oracle Autonomous Database Cloud customers see the elimination of day-to-day administrative tasks and reduced downtime as the machine learning continues to improve the database environment.

IT Divergence In Motion: Oracle vs. Amazon

Oracle and Amazon have taken divergent paths in providing their next-generation relational databases, leading to an interesting head-to-head decision for companies seeking enterprise-grade database solutions.

On the one hand, IT organizations that are philosophically seeking to manage IT as a true service have, in Oracle, an automated database option that will remove the need for direct database and maintenance administration. Oracle is removing a variety of traditional corporate controls and replacing them with guaranteed uptime, performance, maintenance, and error reduction. This is an outcome-based approach that is still relatively novel in the IT world.

For those of us who have spent the majority of our careers handling IT at a granular level, it can feel somewhat disconcerting to see many of the manual tuning, upgrading, and security responsibilities being both automated and improved through machine learning. In reality, highly repetitive IT tasks will continue to be automated over time as the transactional IT administration tasks of the 80s and 90s finally come to an end. The Oracle approach is a look toward a future where the goal of database planning is to immediately enact analytic-ready data architecture rather than to coordinate efforts between database structures, infrastructure provisioning, business continuity, security, and networking. Oracle has also answered questions regarding the “scale-out” management of its database by providing this automated management layer with price guarantees.

In this path of database management evolution, database administrators must be architects who focus on how the wide variety of data categories (structured, semi-structured, unstructured, streaming, archived, binary, etc…) will fit into the human need for structure, context, and worldview verification.

On the other hand, Amazon’s approach is fundamentally about customer control at extremely granular levels. Aurora is easy to spin up and allows administrators a great deal of choice between instance size and workload capacity. With the current preview of Amazon Aurora Serverless, admins will have even more control over both storage and processing consumption, using the endpoint level as the starting point for provisioning and production. Amazon will target support for MySQL compatibility in the first half of 2018, then follow with PostgreSQL later in 2018. Billing will occur in Aurora Capacity Units, a combination of storage and memory metered in one-second increments. This granularity of consumption and flexibility of computing will be very helpful in supporting on-demand applications with highly variable or unpredictable usage patterns.
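
Since the service was only announced in preview at the time of writing, the following boto3 sketch is a hypothetical illustration of what provisioning a serverless Aurora cluster might look like under the announced ACU model; the identifiers, credentials, and scaling values are invented for the example.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Hypothetical cluster settings; MySQL compatibility is targeted first,
# so the engine here is the MySQL-compatible "aurora".
rds.create_db_cluster(
    DBClusterIdentifier="demo-serverless",
    Engine="aurora",
    EngineMode="serverless",           # pay-per-use mode, no fixed instances
    MasterUsername="admin",
    MasterUserPassword="change-me-now",
    ScalingConfiguration={
        "MinCapacity": 2,              # floor, in Aurora Capacity Units (ACUs)
        "MaxCapacity": 16,             # ceiling the cluster can scale up to
        "AutoPause": True,             # pause entirely when idle...
        "SecondsUntilAutoPause": 300,  # ...after 5 idle minutes
    },
)
```

Note how the request describes consumption bounds rather than instances: the admin sets a capacity envelope and lets the service scale within it.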

But my 20+ years in technology cost administration also lead me to believe that there is an illusory quality of control in the cost and management structure that Amazon is providing. There is nothing wrong with providing pricing at an extremely detailed level, but Amalgam already finds that the vast majority of enterprise cloud spend is unmonitored on a month-to-month basis at all but the most cursory levels. (For those of you in IT: who is the accountant or expense manager who cross-checks and optimizes your cloud resources on a monthly basis? Oh, you don’t have one?)

Because of that, we at Amalgam believe that additional granularity is more likely to result in billing disputes or complaints. We will also be interested in understanding the details of compute: there can be significant differences in resource pricing based on reserved instances, geography, timing, security needs, and performance needs. Amazon will need to reconcile these compute costs to prevent this service from being an uncontrolled runaway cost. This is the reality of usage-based technology consumption: decades of telecom, network, mobility, and software asset consumption have all demonstrated the risks of pure usage-based pricing.
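
As a minimal sketch of the kind of routine monthly cross-check the preceding paragraphs argue is missing, the following uses AWS’s Cost Explorer API via boto3 to pull a month of spend broken out by service. The dates are illustrative, and the Cost Explorer API must be enabled on the account.

```python
import boto3

# Cost Explorer is served out of us-east-1 regardless of workload region.
ce = boto3.client("ce", region_name="us-east-1")

# Pull last month's spend grouped by service -- the routine cross-check
# most IT organizations never actually perform.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2017-11-01", "End": "2017-12-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service}: ${amount:,.2f}")
```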

Amalgam believes that there is room for both as Ease-of-Use vs. Granular Management continues to be a key IT struggle in 2018. Oracle represents the DB option for enterprises seeking governance, automation, and strategic scale while Amazon provides the DB option for enterprises seeking to scale while tightly managing and tracking consumption. The more important issue here is that the Oracle DB vs. Amazon DB announcements represent a microcosm of the future of IT. In one corner is the need to support technology that “just works” with no downtime, no day-to-day administration, and cost reduction driven by performance. In the other corner is the ultimate commoditization of technology where customers have extremely granular consumption options, can get started at minimal cost, and can scale out with little-to-no management.

Recommendations

1) Choose your IT model: “Just Works” vs. “Granular Control.” The Oracle and Amazon announcements show how both models have valid aspects, but inherent in both is the need to both scale up and scale out to fit business needs.

2) For “Just Works” organizations, actively evaluate machine learning and automation-driven solutions that reduce or eliminate day-to-day administration. For these organizations, IT no longer represents the management of technology, but the ability to supply solutions that increase in value over time. 2018 is going to be a big year in terms of adding new levels of automation in your organizations.

3) For “Granular Control” organizations, define the technology components that are key drivers or pre-requisites to business success and analyze them extremely closely. In these organizations, IT must be both analytics-savvy and maintain constant vigilance in an ever-changing world. If IT is part of your company’s secret sauce and a fundamental key to differentiated execution, you now have more tools to focus on exactly how, when, and where inflection points take place for company growth, change, or decline.

For additional insights on Amazon’s impact on the future of IT, read Amalgam analyst Tom Petrocelli’s perspective on Amazon Web Services and the Death of IT Ops on InformationWeek.


Amalgam Insights Analyzes Sage Intacct and Pacioli AI

Amalgam Insights recently attended Sage Intacct Advantage. In the past, Intacct got AI’s attention for a strong technology foundation that positions it well for a future of predictive analytics, ease of integration, and machine learning while maintaining the core financial responsibilities expected of a mid-market ERP solution.

Sage has long been known as a more traditional application company with an established customer base and geographic footprint. This left industry observers asking whether there would be significant conflict between cultures and whether observer favorite Intacct would be absorbed into a Sage organization that has traditionally been quiet in its interactions with industry analysts.

During a discussion with analysts, Sage Intacct EVP Rob Reid explained the key business reason for the acquisition. Intacct had been considering an IPO that would have likely resulted in roughly $100 million being raised to support Intacct’s subsequent international expansion. Although this would have been a strong monetization event for all stockholders, the work needed to build a footprint in a variety of new countries would have been both significant and challenging. In contrast, being acquired by Sage for $850 million allowed Intacct investors and stockholders to monetize shares and to access Sage’s significant global footprint.

In speaking with Sage Intacct’s executive team, both Reid and Sage President Blair Crump emphasized that Sage Intacct would be left as an independent organization from a product development perspective. AI believes this is especially important in allowing Sage Intacct to further expand its cloud-based capabilities for core verticals of SaaS, non-profit, and professional services and to accelerate time-to-value for new Sage Intacct capabilities.

With Sage’s acquisition of Intacct in July and the renaming of Intacct to Sage Intacct, AI was curious to see how Sage Intacct differed from the previous Intacct in four areas:

  • Innovation
  • Executive Commitment
  • Partner relationships
  • Customer for Life Initiatives

Innovation

Since Sage Intacct initially got AI analysts’ attention for its pace of innovation and solution flexibility relative to its competitors, AI was interested in seeing how the software provider sought to maintain its innovation post-acquisition.

At the event, Sage Intacct announced enhancements across key verticals to support board meetings in nonprofit organizations, project management for professional services, and contract management for SaaS organizations. The nonprofit Board Book, in particular, was differentiated for its integration with GuideStar to provide benchmark and best practice metrics for nonprofit finance and operations tracking.

Sage Intacct Pursues AI with Pacioli

In addition, Sage Intacct also announced its pursuit of Artificial Intelligence in Pacioli, named after Luca Pacioli, the father of double-entry accounting. (Amalgam believes this is the second most inspired name for an AI, behind only Infor’s Coleman, named after Katherine Coleman Johnson.)

CTO Aaron Harris stated that Pacioli is intended to be “the first digital assistant designed for CFOs,” including a bot built to support the CFO with the ability to self-learn and to provide relevant context for revenue and financial anomalies. Although it’s too early to see how successful Sage Intacct will be in developing Pacioli, Amalgam believes that Sage Intacct’s strong focus on supporting dimensionality and related metadata across its entire product will accelerate development. In addition, Harris revealed that Sage had already been working on AI, such as the Pegg chatbot, that will be brought into the Pacioli development process. Between the announcements by Intacct and Infor and the work done by Salesforce, IBM, and Oracle, it has been a big year for bringing AI into the world of enterprise applications.

Amalgam notes that with the integration of AI, the user interface for applications is going to go through a sea change that will make the rise of mobility look trivial in comparison. AI provides the opportunity to enhance expert judgment, to develop individualized interfaces based on personality and work-habit preferences, and to serve as a true assistant to the employee rather than a complex system that has to be continually re-configured and interfaced to provide repeatable business outputs. Based on the AI projects announced in 2017, Amalgam sees 2018 and 2019 being as important to the infancy of true enterprise AI as the late 2000s were for the early establishment of app development, security, governance, and management for mobile apps.

Executive Commitment

It is not uncommon to see executives leave an organization either after IPO or acquisition. One of the most interesting moments at the industry analyst portion of the event was when Rob Reid called out his entire management team to show that they were all committed to the organization. Although nothing in life is guaranteed, it seemed that key Sage Intacct executives had incentive to stay on board for the next two to three years, meaning that there should be no immediate changes either to the roadmap or to executive commitments to the Sage Intacct community.

Partner Relationships

With the acquisition by Sage, it was valid for Sage Intacct customers to wonder whether Sage products would replace the ecosystem of partners that Sage Intacct had previously developed. To answer this question, at least in part, Sage Intacct invited ADP to the product roadmap portion of the Analyst event to show continued support of this relationship even as Sage provides Sage People based on the Fairsail acquisition earlier this year.

Amalgam looks forward to seeing this continued openness with partners such as ADP, Adaptive Insights, and FloQast, as the integration capabilities of Sage Intacct in supporting best-of-breed mid-market solutions are an important aspect of Sage Intacct’s strength as a cloud solution. With this acquisition, it makes sense for Sage Intacct to support the use of Sage One if a potential client is not ready for Intacct and of Sage People when appropriate. If anything, Amalgam believes that Sage must be patient in allowing Sage Intacct customers to choose their own technology portfolios.

Customer for Life Initiatives

Amalgam tracks Customer for Life programs as part of its coverage of SaaS management and the changing role of the CIO office in managing technology as a set of strategic alliances. Amalgam has been analyzing Sage Intacct’s Customer for Life program and is currently working on a document on this specific approach, prompted by the software provider’s unusually successful renewals and broad-based approach.

When questioned by Amalgam about the current stage of the Customer for Life program, Kathy Lord replied that the Sage Intacct program has not been affected by the acquisition and that the Sage Intacct approach is actually seen as a standard to be pursued across Sage. Amalgam looks forward to seeing whether this results in increased customer success investment or a perceived change in sales or service from Sage in general, and will watch for briefings and public filings that mention any changes in this regard.

Conclusion

Overall, Amalgam believes that Intacct continued to show its leadership as a mid-market ERP solution and, more importantly to AI, as a cloud vendor providing leadership in building the next generation of app platforms. Amalgam is bullish on Sage Intacct but believes that it is integral for Sage to allow Intacct to continue on its own business path while providing the developmental and organizational resources needed for Sage Intacct to maintain its growth.


28 Hours as an Industry Analyst at Strata Data 2017


Companies Mentioned: Aberdeen Group, Actian, Alation, Arcadia Data, Attunity, BMC, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataKitchen, Datameer, DataRobot, Domino Data Lab, EMA, HPE, Hurwitz & Associates, IBM, Informatica, Kogentix, LogTrust, Looker, Mesosphere, Micro Focus, MicroStrategy, Ovum, Paxata, Podium Data, Qubole, SAP, Snowflake, Strata Data, Tableau, Tamr, Tellius, Trifacta.

Last week, I attended the Strata Data Conference at the Javits Center in New York City to catch up with a wide variety of data science and machine learning users, enablers, and thought leaders. In the process, I had the opportunity to listen to some fantastic keynotes, to chat with 30+ companies looking for solutions and 30+ vendors presenting at the show, and to attend with a number of luminary industry analysts and thought leaders, including Ovum’s Tony Baer, EMA’s John Myers, Aberdeen Group’s Mike Lock, and Hurwitz & Associates’ Judith Hurwitz.

From this whirlwind tour of executives, I came away with many takeaways from the keynotes and vendors that I can share, and from end users that I unfortunately have to keep confidential. To give you an idea of what an industry analyst takes note of, here is a short summary of takeaways from the keynotes and from each vendor that I spoke to:

Keynotes: The key themes that really got my attention are the idea that AI requires ethics, brought up by Joanna Bryson, and the idea that all data is biased, which danah boyd discussed. This idea that data and machine learning have their own weaknesses requiring human intervention, training, and guidance is incredibly important. Over the past decade, technologists have put their trust in Big Data and the idea that data will provide answers, only to find that a naive and “unbiased” analysis of data has its own biases. Context and human perspective are inherent to translating data into value: this does not change just because our analytic and data training tools are increasingly nuanced and intelligent in nature.
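
A toy simulation makes the point concrete: even an arithmetically “neutral” average inherits whatever skew exists in how the data was collected. The numbers below are invented purely for illustration.

```python
import random

random.seed(7)

# A "population" where an attribute differs across two groups.
population = [("urban", random.gauss(70, 5)) for _ in range(5000)] + \
             [("rural", random.gauss(55, 5)) for _ in range(5000)]

true_mean = sum(v for _, v in population) / len(population)

# A convenience sample that over-represents one group -- the kind of
# collection skew that makes a "neutral" average quietly biased.
sample = [v for g, v in population if g == "urban"][:900] + \
         [v for g, v in population if g == "rural"][:100]
sample_mean = sum(sample) / len(sample)

print(round(true_mean, 1), round(sample_mean, 1))  # the naive estimate skews high
```

Nothing in the averaging step is "wrong"; the bias was baked in before any analysis ran, which is exactly the weakness that requires human intervention and guidance.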

Behind the hype of data science, Big Data, analytic modeling, robotic process automation, DevOps, DataOps, and artificial intelligence is this fundamental need to understand that data, algorithms, and technology all have inherent biases, as the following tweet shows:
Continue reading 28 Hours as an Industry Analyst at Strata Data 2017


Microsoft: The New Player in Quantum Computing


On the week of September 25th, 2017, Microsoft made a huge announcement at its annual Ignite and Envision conference: Microsoft has become one of a small number of companies demonstrating quantum computing. IBM is another company pursuing this rather futuristic computing model.

For those who are not up-to-date on quantum computing, it uses quantum properties such as superposition and entanglement to enable a new way of computing. Current computers are built around tiny electron switches called transistors that allow for two states, which represent the binary system we have today. Quantum computers leverage quantum states that give us ones, zeros, and combinations of one and zero. This means a single qubit, the quantum equivalent of a bit, can represent many more states than a bit can. This is, of course, a gross oversimplification, but quantum computing promises to deliver denser and exponentially faster computing.
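
For readers who want to see the oversimplification in slightly more concrete terms, here is a minimal numpy sketch of a single qubit in equal superposition; it models only the state vector and the measurement rule, nothing more.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a unit vector
# a|0> + b|1>, where |a|^2 + |b|^2 = 1 and the amplitudes may be complex.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Equal superposition of |0> and |1>.
psi = (zero + one) / np.sqrt(2)

# Measurement probabilities follow the Born rule: P(x) = |amplitude|^2.
probabilities = np.abs(psi) ** 2
print(probabilities)  # [0.5 0.5] -- a 50/50 chance of reading 0 or 1

# n qubits are described by 2**n amplitudes, which is where the
# exponentially larger state space comes from.
n = 10
print(2 ** n)  # 1024 amplitudes for just 10 qubits
```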

There are a number of problems with practical quantum computing. The hardware is still in a nascent stage and must be cooled to a temperature quite a bit colder than deep space. This makes it much more likely that quantum computing will be purchased via a cloud model than on-premises. The other inhibitor is that there is no standard programming model for quantum computing. IBM has demonstrated a visual programming model that shows how quantum computing works but is clearly not a serious way to write real programs. Microsoft, on the other hand, showed a more standard-looking curly-bracket programming language. This application layer makes quantum computing more accessible to programmers who are used to the current model of computing.

When quantum computing becomes practical – I would predict that is at least 5 years away, perhaps longer – it won’t be for everyday computing tasks. The current model is already more than adequate for those tasks. It’s also unlikely that the capabilities and costs of quantum computers, especially the information-dense qubit, will have much of a place in transactional computing. Instead, quantum computing will be used for analyzing very large and complex data sets for simulation and AI. That’s fine, because the AI and analytics market is still new and future needs are not yet completely known. Those future computing needs are what quantum computing is meant to address. Even today’s big data applications can stretch computing capabilities and force batch analytics instead of real-time analysis for some use cases.

Microsoft’s entry into what has been an otherwise esoteric corner of the computing world signals that quantum computing is on the path to being real. It has a long way to go and many obstacles to overcome, but it’s no longer just science fiction or academic. It will be years, but it is on the way to becoming mainstream.

Note: This post was originally posted on Tom’s Take