From Calero World Online: From TEM to ITEM: Leveraging TEM for Non-Traditional Expenses

On October 18th, I presented a webinar at Calero World Online on the future of IT cost and subscription management. In this presentation, I challenge telecom and IT expense managers to accept their destiny as pilots and architects of enterprise digital subscriptions.

Telecom expense has traditionally been the most challenging of IT costs to manage. With the emergence of software-as-a-service, cloud computing, the Internet of Things, and software-defined networks, the rest of the IT world is quickly catching up.

In this webinar, you will learn:

  • How the latest trends and technology are driving change to enterprise management strategies
  • How the challenges of traditional TEM and cloud expense management are similar in nature (and why TEM is a good place to start)
  • How organizations are benefiting from ITEM best practices using sample use cases

To learn more about the upcoming challenges of IT expense management, aligning technology supply to digital demand, and being the shepherd for your organization’s technology sourcing, utilization, and optimization, click here to watch this webinar on-demand.

ICYMI: On-Demand Webinar – Four Techniques to Run AI on Your Business Data

On October 17th, I presented a webinar with Incorta’s Chief Evangelist, Matthew Halliday, on the importance of BI architectures in preparing for AI. This webinar is based on a core Amalgam Insights belief that all enterprise analytics and data science activity should be based on a shared core of trusted and consistent data so that Business Intelligence, analytics, machine learning, data science, and deep learning efforts are all based on similar assumptions and can build off each other.

While AI is beginning to impact every aspect of our consumer lives, business data-driven AI seems to be lower on the priority list of most enterprises. The struggle to understand the practical value of AI starts with the inability to make business data easily accessible to data science teams. Today’s BI tools have not kept up with this need and are often the bottlenecks that stifle innovation.

In this webinar, you will learn from Hyoun Park and Matthew Halliday about:
  • key data and analytic trends leading to the need to accelerate analytic access to data.
  • guidance for challenges in implementing AI initiatives alongside BI.
  • practical and future-facing business use cases that can be supported by accelerating analytic access to large volumes of operational data.
  • techniques that accelerate AI initiatives on your business data.

Watch this webinar on-demand by clicking here.

Why It Matters that IBM Announced Trust and Transparency Capabilities for AI


Note: This blog is a followup to Amalgam Insights’ visit to the “Change the Game” event held by IBM in New York City.

On September 19th, IBM announced the launch of a portfolio of AI trust and transparency capabilities. This announcement got Amalgam Insights’ attention because of IBM’s relevance and focus in the enterprise AI market throughout this decade. To understand why IBM’s specific launch matters, take a step back and consider IBM’s considerable role in building out the current state of the enterprise AI market.

IBM AI in Context

Since IBM’s public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence and has spent billions of dollars establishing both IBM Watson and enterprise AI. This has been a challenging path to travel, as IBM has had to balance this market-leading innovation with the financial demands of supporting a company that brought in $107 billion in revenue in 2011 and has since seen that number shrink by almost 30%.

In addition, IBM had to balance its role as an enterprise technology company focused on the world’s largest workloads and IT challenges with launching an emerging product better suited for highly innovative startups and experimental enterprises. And IBM also faced the “cloudification” of enterprise IT in general, where the traditional top-down purchase of multi-million dollar IT portfolios is being replaced by piecemeal and business-driven purchases and consumption of best-in-breed technologies.

Seven years later, the jury is still out on how AI will ultimately end up transforming enterprises. What we do know is that a variety of branches of AI are emerging.

IBM Presents “Change the Game: Winning with AI” in New York City


(Note: This blog is part of a multi-part series on this event and the related analyst event focused on IBM’s current status from an AI perspective.)

On September 13th, 2018, IBM held an event titled “Change the Game: Winning with AI.” The event was hosted by ESPN’s Hannah Storm and held at Terminal 5 in Hell’s Kitchen, better known as a music venue where acts ranging from Lykke Li to Kali Uchis to Good Charlotte perform. In this rock star atmosphere, IBM showcased its current perspective on AI (artificial intelligence, not Amalgam Insights!).

IBM’s Rob Thomas and ESPN’s Hannah Storm discuss IBM Cloud Private for Data

This event comes at an interesting time for IBM. Since IBM’s public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence and has spent billions of dollars establishing both IBM Watson and enterprise AI. However, part of IBM’s challenge over the past several years was that the enterprise understanding of AI was so nascent that there was no good starting point to develop machine learning, data science, and AI capabilities. In response, IBM built out many forms of AI, including:

  • IBM Watson as a standalone, Jeopardy!-like solution for healthcare and financial services,
  • Watson Developer Cloud to provide language, vision, and speech APIs,
  • Watson Analytics to support predictive and reporting analytics,
  • chatbots and assistants to support talent management and other practical use cases,
  • Watson Studio and Data Science Experience to support enterprise data science efforts to embed statistical and algorithmic logic into applications, and
  • evolutionary neural network design at the research level.

And, frankly, the velocity of innovation was difficult for enterprise buyers to keep up with, especially as diverse products were all labelled Watson and as buyers were still learning about technologies such as chatbots, data science platforms, and IBM’s hybrid cloud computing options at a fundamental level. The level of external and market-facing education needed to support relatively low-revenue and experimental investments in AI was a tough path for IBM to support (and, in retrospect, may have been better handled as a funded internal spin-off with access to IBM patents and technology). Consider that extremely successful startups can justify billion-dollar valuations based on $100 million in annual revenue, while IBM is judged on multi-billion-dollar revenue changes on a quarter-by-quarter basis. That juxtaposition makes it hard for public enterprises to support audacious and aggressive startup goals that may take ten years of investment and loss to build.

This “Change the Game” event was an important checkpoint for Amalgam Insights in learning more about IBM’s current positioning on AI and the upcoming announcements meant to support enterprise AI.

With strategies telestrated by IBM’s Janine Sneed and Daniel Hernandez, this event was an entertaining breakdown of IBM case studies demonstrating the IBM analytics and data portfolio, interspersed with additional product and corporate presentations by IBM’s Rob Thomas, Dinesh Nirmal, Reena Ganga, and Madhu Kochar. The event was professionally presented as Hannah Storm interviewed a wide variety of IBM customers, including:

  • Mark Vanni, COO of Trūata
  • Joni Rolenaitis, Vice President of Data Development and Chief Data Officer for Experian
  • Dr. Donna M. Wolk, System Director of Clinical and Molecular Microbiology for Geisinger
  • Guy Taylor, Executive Head of Data-Driven Intelligence at Nedbank
  • Sreesha Rao, Senior Manager of IT Applications at Niagara Bottling LLC
  • James Wade, Director of Application Hosting for GuideWell Mutual Holding Company
  • Mark Lack, Digital Strategist and Data Scientist for Mueller, Inc
  • Rupinder Dhillon, Director of Machine Learning and AI at Bell Canada

In addition, IBM demonstrated aspects of IBM Cloud Private for Data, its cloud platform built for using data for AI, as well as Watson Studio, IBM’s data science platform, and design aspects for improving enterprise access to data science and analytic environments.

Overall, Amalgam Insights saw this event as a public-friendly opportunity to introduce IBM’s current capabilities in making enterprises ready to support AI by providing the data, analytics, and data science products needed to prepare enterprise data ecosystems for machine learning, data science, and AI projects. In upcoming blogs, Amalgam Insights will cover IBM’s current AI positioning in greater detail and the IBM announcements that will affect current and potential AI customers.

EPM at a Crossroads: Big Data Solutions

Key Stakeholders: Chief Information Officers, Chief Financial Officers, Chief Operating Officers, Chief Digital Officers, Chief Technology Officers, Accounting Directors and Managers, Sales Operations Directors and Managers, Controllers, Finance Directors and Managers, Corporate Planning Directors and Managers

Analyst-Recommended Solutions: Adaptive Insights, a Workday Company, Anaplan, Board, Domo, IBM Planning Analytics, OneStream, Oracle Planning and Budgeting, SAP Analytics Cloud

In 2018, the Enterprise Performance Management market is at a crossroads. This market has emerged from a foundation of financial planning, budgeting, and forecasting solutions designed to support basic planning and has evolved as the demands for business planning, risk and forecasting management, and consolidation have increased over time. In addition, the EPM market has expanded as companies from the financial consolidation and close markets, business performance management markets, and workflow and process automation markets now play important roles in effectively managing Enterprise Performance.

In light of these challenges, Amalgam Insights is tracking six key areas where Enterprise Performance Management is fundamentally changing: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience.

Supporting Big Data for Enterprise Performance Management

Amalgam Insights has identified two key drivers repeatedly mentioned by finance departments seeking to support Big Data in Enterprise Performance Management. First, EPM solutions must support larger stores of data over time to fully analyze financial data and the plethora of additional business data needed to support strategic business analysis. Growing data has become an increasingly important issue as enterprises face billion-row tables and outgrow the traditional cubes and datamarts used to manage basic financial data. The sheer scale of financial and commerce-related transactional data requires a Big Data approach at the enterprise level to support timely analysis of planning, consolidation, close, risk, and compliance.

In addition, these large data sources need to integrate with other data sources and references to support integrated business planning to align finance planning with sales, supply chain, IT, and other departments. As the CFO is increasingly asked to be not only a financial leader, but a strategic leader, she must have access to all relevant business drivers and have a single view of how relevant sales, support, supply chain, marketing, operational, and third-party data are aligned to financial performance. Each of these departments has its own large store of data that the strategic CFO must also be able to access, allocate, and analyze to guide the business.

New EPM solutions must evolve beyond traditional OLAP cubes to hybrid data structures that scale effectively to the immense volume and variety of data involved. Amalgam Insights notes that EPM solutions focused on large data take a variety of relational, in-memory, columnar, cloud computing, and algorithmic approaches to store, structure, and analyze financial data and to define categories on the fly.

To manage these large stores of data effectively from a financial, strategic, and analytic perspective, Amalgam Insights recommends the following companies, which have been innovative in supporting immense and varied planning and budgeting data environments, based on briefings and discussions held in 2018:

  • Adaptive Insights, a Workday Company
  • Anaplan
  • Board
  • Domo
  • IBM Planning Analytics
  • OneStream
  • Oracle Planning and Budgeting
  • SAP Analytics Cloud

Adaptive Insights

Adaptive Insights’ Elastic Hypercube is an in-memory, dynamic caching and scaling solution announced in July 2018. Amalgam Insights saw a preview of this technology at Adaptive Live and was intrigued by the efficiency that Adaptive Insights provided to models by selectively recalculating only the dependent changes as a model was edited, using a dynamic caching approach that consumes memory and computational cycles only when data is being accessed, and using both tabular and cube formats to support data structures. This data format will also be useful to Adaptive Insights as a Workday company in building out the various departmental planning solutions that will be accretive to Workday’s positioning as an HR and ERP solution after Workday’s June acquisition (covered in our June Market Milestone).
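
For readers unfamiliar with selective recalculation, the general idea resembles spreadsheet-style dependency tracking: when one value changes, only the formulas downstream of that value are recomputed. The Python sketch below is a minimal illustration of that general technique; the class and cell names are our own illustrative assumptions, not Adaptive Insights’ implementation.

    # Minimal sketch of dependency-driven selective recalculation.
    # A model is a set of cells; each formula cell lists the cells it
    # reads. Editing one input recomputes only its downstream dependents.
    from collections import defaultdict

    class Model:
        def __init__(self):
            self.values = {}        # cell -> current value
            self.formulas = {}      # cell -> (function, list of input cells)
            self.dependents = defaultdict(set)  # cell -> cells that read it

        def define(self, cell, func, inputs):
            self.formulas[cell] = (func, inputs)
            for src in inputs:
                self.dependents[src].add(cell)
            self.values[cell] = func(*(self.values[i] for i in inputs))

        def set_input(self, cell, value):
            self.values[cell] = value
            # Breadth-first walk of the dependency graph: untouched
            # branches of the model are never recomputed.
            queue = list(self.dependents[cell])
            while queue:
                dep = queue.pop(0)
                func, inputs = self.formulas[dep]
                self.values[dep] = func(*(self.values[i] for i in inputs))
                queue.extend(self.dependents[dep])

    m = Model()
    m.values.update(units=100, price=20.0, fixed_costs=500.0)
    m.define("revenue", lambda u, p: u * p, ["units", "price"])
    m.define("profit", lambda r, f: r - f, ["revenue", "fixed_costs"])
    m.set_input("units", 120)   # recomputes revenue and profit only
    print(m.values["profit"])   # 1900.0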

Anaplan

Anaplan’s Hyperblock is an in-memory engine combining columnar, relational, and OLAP approaches. This technology is the basis of Anaplan’s platform and allows Anaplan to quickly support large planning use cases. By developing composite dimensions, Anaplan users can pre-build a broad array of combinations that can be used to repeatably deploy analytic outputs, as illustrated in the sketch below. As noted in our March blog, Anaplan has been growing rapidly based on its ability to support new use cases. In addition, Anaplan has recently filed its S-1 to go public.
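
To illustrate the general concept of a composite dimension, the short Python sketch below pre-builds the cross-product of member lists into one flat index that plan data can reference repeatedly. This is a simplified analogy of the concept, not a representation of Anaplan’s Hyperblock internals.

    # Illustrative sketch of a "composite dimension": pre-building the
    # cross-product of member lists into one stable flat index so
    # repeated analytic outputs can reuse the same combinations.
    from itertools import product

    regions = ["EMEA", "APAC", "Americas"]
    products = ["Basic", "Pro"]
    quarters = ["Q1", "Q2", "Q3", "Q4"]

    # Each combination gets a stable position that models can reference.
    composite = {combo: idx for idx, combo in
                 enumerate(product(regions, products, quarters))}

    # A plan line item then becomes a flat array keyed by that index.
    revenue_plan = [0.0] * len(composite)
    revenue_plan[composite[("EMEA", "Pro", "Q2")]] = 125_000.0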

Board

Board goes to market both as an EPM solution and as a general business intelligence solution. Its core technology is the Hybrid Bitwise Memory Pattern (HBMP), a proprietary in-memory data management solution designed to algorithmically map each bit of data and then store this map in-memory. In practice, this approach allows many users to both access and edit information without dealing with lagging or processing delays. It also allows Board to choose which aspects of data to keep in-memory or handle dynamically in order to prioritize computing assets.
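
Board does not publicly document HBMP’s internals, but the description above resembles the general family of bitmap indexing, in which each distinct value maps to a compact bit vector held in memory so that filters become cheap bitwise operations. The Python sketch below illustrates that generic technique only; it is an assumption-laden analogy, not Board’s proprietary implementation.

    # Generic bitmap-index sketch: one bit vector per distinct value.
    # Set membership and filtering become cheap bitwise operations,
    # which is one reason many concurrent readers can query without
    # locking large row structures. (Illustrative only, not Board's HBMP.)
    rows = ["open", "closed", "open", "pending", "open", "closed"]

    bitmaps = {}
    for i, value in enumerate(rows):
        bitmaps[value] = bitmaps.get(value, 0) | (1 << i)

    # Rows that are open OR pending: a single bitwise OR, no row scan.
    mask = bitmaps["open"] | bitmaps["pending"]
    matching_rows = [i for i in range(len(rows)) if mask & (1 << i)]
    print(matching_rows)  # [0, 2, 3, 4]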

Domo

Domo describes its Adrenaline engine as an “n-dimensional, highly concurrent, exo-scale, massively parallel, and sub-second data warehouse engine” to store business data. This is accompanied by VAULT, Domo’s data lake to support data ingestion and serve as a single store of record for business analysis. Amalgam Insights covered the Adrenaline engine as one of Domo’s “Seven Samurai” in our March report Domo Hajimemashite: At Domopalooza 2018, Domo Solves Its Case of Mistaken Identity. Behind the buzzwords, these technologies allow Domo to provide executive reporting capabilities across a wide range of departmental use cases in near-real time. Although Domo is not a budgeting solution, it is focused on portraying enterprise performance for executive consumption and should be considered for organizations seeking to gain business-wide visibility to key performance metrics.

IBM Planning Analytics

IBM Planning Analytics runs on Cognos TM1 OLAP in-memory cubes. To increase performance, these cubes use sparse memory management, in which missing values are ignored and empty values are not stored. In conjunction with IBM’s approach of caching analytic outcomes in-memory, this allows IBM to improve performance compared to standard OLAP approaches, and the approach has been validated at scale by a variety of IBM Planning Analytics clients. Amalgam Insights presented on the value of IBM’s approach at IBM Vision 2017 from both a data perspective and a user interface perspective, the latter of which will be covered in a future blog.
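
As a minimal illustration of sparse storage in general, consider the Python sketch below, in which only populated cells are stored and missing coordinates are simply treated as zero. This is a generic analogy for readers new to sparse cubes, not TM1’s actual engine.

    # Minimal sketch of sparse cube storage: only populated cells are
    # stored, so a mostly-empty planning cube costs memory proportional
    # to its data, not its dimensionality.
    cube = {}  # (account, region, month) -> value; empty cells absent

    cube[("travel", "EMEA", "2018-09")] = 42_000.0
    cube[("travel", "APAC", "2018-09")] = 17_500.0

    def cell(account, region, month):
        # Missing coordinates are treated as zero and never stored.
        return cube.get((account, region, month), 0.0)

    print(cell("travel", "EMEA", "2018-09"))    # 42000.0
    print(cell("hardware", "EMEA", "2018-09"))  # 0.0, costs no storage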

OneStream

OneStream provides in-memory processing and stateless servers to support scale, but its approach to analytic scale is based on virtual cubes and extensible dimensions. These allow organizations to continue building dimensions over time that are tied back to a corporate level and to create logical views of data based on a larger data store to support specific financial tasks such as budgeting, tax reporting, or financial reporting. OneStream’s approach is focused on financial use rather than general business planning.
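
As a rough illustration of the virtual cube concept, the Python sketch below defines task-specific logical views over a single shared store without duplicating records. The data and function names are illustrative assumptions, not OneStream’s implementation.

    # Illustrative sketch of a "virtual cube" as a logical, task-specific
    # view over one shared data store, rather than a physical copy.
    corporate_store = [
        {"entity": "US", "account": "revenue", "scenario": "actual", "value": 9.0},
        {"entity": "US", "account": "revenue", "scenario": "budget", "value": 10.0},
        {"entity": "UK", "account": "tax",     "scenario": "actual", "value": 1.2},
    ]

    def virtual_cube(store, **filters):
        # Each financial task (budgeting, tax, reporting) sees only the
        # slice it needs; the underlying records are never duplicated.
        return [row for row in store
                if all(row.get(k) == v for k, v in filters.items())]

    budget_view = virtual_cube(corporate_store, scenario="budget")
    tax_view = virtual_cube(corporate_store, account="tax")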

Oracle Planning and Budgeting Cloud

Oracle Planning and Budgeting Cloud Service is based on Oracle Hyperion, the market leader in Enterprise Performance Management from a revenue perspective. The Oracle Cloud is built on Oracle Exalogic Elastic Cloud, Oracle Exadata Database Machine, and the Oracle Database, which together provide a strong in-memory foundation for the Planning and Budgeting application through an algorithmic approach to managing storage, compute, and networking. This approach effectively allows Oracle to support planning models at massive scale.

SAP Analytics Cloud

SAP Analytics Cloud, SAP’s umbrella product for planning and business intelligence, uses SAP HANA, an in-memory columnar relational database, to provide real-time access to data and to accelerate both modelling and analytic outputs based on all relevant transactional data. This approach is part of SAP’s broader HANA strategy to encapsulate both analytic and transactional processing in a single database, effectively making all data reportable, modellable, and actionable. SAP has also recently partnered with Intel to use Optane DC persistent memory to support larger data volumes for enterprises requiring larger persistent data stores for analytic use.
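
For readers new to columnar storage, the Python sketch below contrasts row and columnar layouts: an aggregate over one attribute touches a single contiguous array rather than scanning every full row. This is a generic illustration of the columnar idea behind HANA-style engines, not SAP’s implementation.

    # Row layout: one record per transaction.
    row_store = [
        {"order_id": 1, "region": "EMEA", "amount": 250.0},
        {"order_id": 2, "region": "APAC", "amount": 410.0},
        {"order_id": 3, "region": "EMEA", "amount": 180.0},
    ]

    # Columnar layout: one array per attribute.
    col_store = {
        "order_id": [1, 2, 3],
        "region": ["EMEA", "APAC", "EMEA"],
        "amount": [250.0, 410.0, 180.0],
    }

    # Total EMEA revenue reads just two columns, no row reconstruction.
    total = sum(a for a, r in zip(col_store["amount"], col_store["region"])
                if r == "EMEA")
    print(total)  # 430.0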

This blog is part of a multi-part series on the evolution of Enterprise Performance Management and key themes that the CFO office must consider in managing holistic enterprise performance: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience. If you would like to set up an inquiry to discuss EPM or provide a vendor briefing on this topic, please contact us at info@amalgaminsights.com to set up time to speak.

Last Blog: EPM at a Crossroads
Next Blog: Robotic Process Automation and Machine Learning in EPM

FloQast Supports ASC 606 Compliance by Providing a Multi-Book Close for Accountants

On September 11, 2018, FloQast announced multi-book accounting capabilities designed to help organizations to support ASC 606 compliant financial closes by supporting dual reporting on revenue recognition and related expenses. As Amalgam Insights has covered in prior research, ASC 606/IFRS 15 standards for recognizing revenue on subscription services are currently required for all public companies and will be the standard for private companies as of the end of 2019.

Currently, FloQast supports multi-book accounting for Oracle NetSuite and Sage Intacct, two strong mid-market finance solutions, and this capability is available to FloQast Business, Corporate, and Enterprise customers at no additional cost. The support of these two solutions reflects the investment that each of these ERP vendors has made in subscription billing: NetSuite’s 2015 acquisition of subscription billing solution Monexa eventually led to the launch of NetSuite SuiteBilling, while Sage Intacct developed its subscription billing capabilities organically in 2015.

Why This Matters For Accounting Teams

Currently, accounting teams compliant with ASC 606 are required to provide two sets of books with each financial close. Organizations seeking to accurately reflect their finances from both a legacy and a current perspective must either duplicate efforts to provide compliant accounting outputs or use an accounting solution that accurately creates separate sets of close results. By simultaneously creating dual sets of close outputs, organizations can avoid the challenge of creating detailed journal entries to explain discrepancies within a single close instance.
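
To make the mechanics concrete, the deliberately simplified Python sketch below applies two different recognition schedules to the same $12,000 annual subscription invoice, producing the diverging book values that a dual close must reconcile. Real ASC 606 and legacy treatments are far more nuanced; this only shows why parallel close outputs are needed.

    # Deliberately simplified illustration of why two books diverge:
    # the same annual subscription invoice, recognized ratably over the
    # service period in one book and at invoicing in the other.
    INVOICE = 12_000.0
    TERM_MONTHS = 12

    def close_month(months_elapsed):
        ratable_book = INVOICE / TERM_MONTHS * months_elapsed  # spread evenly
        point_in_time_book = INVOICE                           # all at billing
        return ratable_book, point_in_time_book

    for month in (1, 6, 12):
        ratable, upfront = close_month(month)
        print(f"month {month}: ratable={ratable:,.0f} vs upfront={upfront:,.0f}")
    # month 1: ratable=1,000 vs upfront=12,000
    # month 6: ratable=6,000 vs upfront=12,000
    # month 12: ratable=12,000 vs upfront=12,000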

Recommendations for Accounting Teams with ASC 606 Compliance Requirements

This announcement has a couple of ramifications for mid-market enterprises and organizations that are either currently supporting ASC 606 as public companies or preparing to support ASC 606 as private companies.

First, Amalgam Insights believes that accounting teams using either Oracle NetSuite or Sage Intacct should adopt FloQast as a relatively low-cost solution to the challenge of duplicate ASC 606 closes. Currently, this functionality is most relevant to Oracle NetSuite and Sage Intacct customers with significant ASC 606 accounting challenges. To understand why, consider the basic finances of this decision.

Amalgam Insights estimates that, based on FloQast’s current pricing of $125 per month for business accounts or $150 per month for corporate accounts, FloQast will pay for itself through productivity gains in any accounting department where an organization spends four or more man-hours per month creating duplicate closes. This return is in addition to the existing ROI associated with financial close that Amalgam Insights has previously tracked for FloQast customers in our Business Value Analysis, in which we found that the FloQast customers interviewed saw a 647% ROI in their first year of deployment by accelerating and simplifying their close workflows and improving team visibility into the current status of financial close.
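
As a back-of-the-envelope check of that break-even claim, assume an illustrative fully loaded accountant cost of $40 per hour (our assumption for this sketch, not a FloQast or customer figure):

    # Break-even sketch for the four-hours-per-month claim.
    hours_saved_per_month = 4
    loaded_hourly_cost = 40.0              # assumed, illustrative only
    monthly_savings = hours_saved_per_month * loaded_hourly_cost  # $160
    print(monthly_savings >= 150.0)        # True: covers the $150 corporate tier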

Second, accounting teams should generally expect to support multi-book accounting for the foreseeable future. Although ASC 606 is now a current standard, financial analysts and investors seeking to conduct historical analysis of a company for investment, acquisition, or partnership will want a consistent “apples-to-apples” comparison of finances across multiple years. Until your organization has three full years of audited financial statements under ASC 606, it will likely have to maintain multiple sets of books. Given that most public organizations started using ASC 606 in 2018, this means having a plan for multiple sets of books until 2020. For private organizations, this may mean an additional year or two, given that mandatory compliance starts at the end of 2019. Companies that avoid preparing for the reality of dual-level closes over the next couple of years will spend significant accountant hours on easily avoidable work.

If you would like to learn more about FloQast, the Business Value Analysis, or the current vendor solution options for financial close management, please contact Amalgam Insights at info@amalgaminsights.com

VMware Purchases CloudHealth Technologies to support Multicloud Enterprises and Continue Investing in Boston


Vendors and Solutions Mentioned: VMware, CloudHealth Technologies, Cloudyn, Microsoft Azure Cloud Cost Management, Cloud Cruiser, HPE OneSphere, Nutanix Beam, Minjar, Botmetric

Key Stakeholders: Chief Financial Officers, Chief Information Officers, Chief Accounting Officers, Chief Procurement Officers, Cloud Computing Directors and Managers, IT Procurement Directors and Managers, IT Expense Directors and Managers

Key Takeaway: As best-of-breed vendors continue to emerge, new technologies are invented, existing services continue to evolve, vendors pursue new and innovative pricing and delivery models, cloud computing remains easy to procure, and IaaS doubles as a spend category every three years, cloud computing management will only increase in complexity and the need for Cloud Service Management will only grow. VMware has made a wise choice in buying into a rapidly growing market and now has a greater opportunity to support and augment complex peak, decentralized, and hybrid IT environments.

About the Announcement

On August 27, 2018, VMware announced a definitive agreement to acquire CloudHealth Technologies, a Boston-based startup company focused on providing a cloud operations and expense management platform that supports enterprise accounts across Amazon Web Services, Microsoft Azure, and Google Cloud Platform.

Oracle Autonomous Transaction Processing Lowers Barriers to Entry for Data-Driven Business

I recently wrote a Market Milestone report on Oracle’s launch of Autonomous Transaction Processing, the latest in a string of Autonomous Database announcements made by Oracle, following the Autonomous Data Warehousing announcement and the initial launch of the Autonomous Database late last year.

This string of announcements takes advantage of Oracle’s investments in infrastructure, distributed hardware, data protection and security, and index optimization to create a new set of database services that seek to automate basic support and optimization capabilities. These announcements matter because, as transactional and data-centric business models continue to proliferate, both startups and enterprises should seek a data infrastructure that will remain optimized, secure, and scalable over time without becoming cost- and resource-intensive. With Oracle Autonomous Transaction Processing, Oracle provides its answer: an enterprise-grade data foundation for this next generation of businesses.

One of Amalgam Insights’ key takeaways in this research is the analyst estimate that Oracle ATP could reduce the cost of cloud-based transactional database management by 65% compared to similar services managed on Amazon Web Services. Frankly, companies that need to support net-new transactional databases that must be performant and scalable for Internet of Things, messaging, and other new data-driven businesses should consider Oracle ATP and should conduct due diligence on Oracle Autonomous Database Cloud for reducing long-term Total Cost of Ownership. This estimate is based on the costs of a 10 TB Oracle database on a reserved instance on Amazon Web Services versus a similar database on the Oracle Autonomous Database Cloud.

One of the most interesting aspects of the Autonomous Database in general, and one that Oracle will need to further explain, is how to guide companies with existing transactional databases and data warehouses to an autonomous environment. It is no secret that every enterprise IT department is its own special environment driven by a combination of business rules, employee preferences, governance, regulation, security, and business continuity expectations. At the same time, IT is used to automation and rapid processing in some aspects of technology management, such as threat management and the logging of patches and other basic transactions. But considering IT’s need for extreme customization, how does IT gain enough visibility into the automated decisions made in indexing and ongoing optimization?

At this point, Amalgam Insights believes that Oracle is pushing a fundamental shift in database management that will likely lead to the automation of manual technical management tasks. This change will be especially helpful for net-new databases, where organizations can use the Autonomous Database Cloud to help establish business rules for data access, categorization, and optimization. This is likely a no-brainer decision, especially for Oracle shops that are strained in their database management resources and seeking to handle more data for new transaction-based business needs or machine learning.

For established database workloads, enterprises will have to think about how, or whether, to transfer existing enterprise databases to the Autonomous Database Cloud. Although enterprises will likely gain some initial performance improvements and potentially reduce the support costs associated with large databases, they will also likely spend time double-checking the decisions and lineage associated with Autonomous Database choices, both in test and in deployment settings. Amalgam Insights would expect autonomous database management to lead to indexing, security, and resource management decisions that may be more optimal than human-led decisions, but with a logic that may not be fully transparent to IT departments that have strongly defined and governed business rules and processes.

Although Amalgam Insights is convinced that Oracle Autonomous Database is the beginning of a new stage of digitized and automated IT, we also believe that a next step for Oracle Autonomous Database Cloud will be to create governance, lineage, and audit packages to support regulated industries, legislative demands, and documentation describing the business rules for autonomous logic. Amalgam Insights expects that Oracle will want to keep specific algorithms and automation logic as proprietary trade secrets. But without some level of documentation that is traceable and auditable, large enterprises will have to conduct significant work on their own to figure out whether they can transfer large databases to Oracle Autonomous Database Cloud, which Amalgam Insights would expect to be an important part of Oracle’s business model and cloud revenue projections going forward.

To read the full report with additional insights and details on the Oracle Autonomous Transaction Processing announcement, please download the full report on Oracle’s launch of Autonomous Transaction Processing, available at no cost for a limited time.

Azure Advancements Announced at Microsoft Inspire 2018

Last week, Microsoft Inspire took place, which meant that Microsoft made a lot of new product announcements regarding the Azure cloud. In general, Microsoft is looking up at Amazon from a market share perspective and trying to catch up, while keeping its current #2 place in the Infrastructure as a Service world ahead of the rapidly growing Google Cloud Platform as well as IBM and Oracle. Microsoft Azure is generally regarded, along with Amazon, as a market-leading cloud platform that provides storage, computing, and security and is moving toward analytics, networking, replication, hybrid synchronization, and blockchain support.

Key functionalities that Microsoft has announced include:

Market Milestone: Informatica and Google Cloud Partner to Open Up Data, Metadata, Processes, and Applications as Managed APIs

[This Research Note was co-written by Hyoun Park and Research Fellow Tom Petrocelli]

Key Stakeholders: Chief Information Officers, Chief Technical Officers, Chief Digital Officers, Data Management Managers, Data Integration Managers, Application Development Managers

Why It Matters: This partnership demonstrates how Informatica’s integration Platform as a Service brings Google Cloud Platform’s Apigee products and Informatica’s machine-learning-driven connectivity into a single solution.

Key Takeaway: This joint Informatica-Google API management solution provides customers with a single solution that provides data, process, and application integration as well as API management. As data challenges evolve into workflow and service management challenges, this solution bridges key gaps for data and application managers and demonstrates how Informatica can partner with other vendors as a neutral third-party to open up enterprise data and support next-generation data challenges.