
Outsourcing Core IT When You Are Core IT

Today, I provided a quick presentation on the BrightTALK channel on Outsourcing Core IT when you ARE Core IT. It turns out that one of the takeaways from this webinar is about your risk of being outsourced based on your IT model. First, your fear of and approach to outsourcing really depend on whether you are part of operational (core) IT, enabling IT, or transformational IT:

Operational IT is a cost-driven, commoditized, and miserable IT environment that does not realize that technology can provide a strategic advantage. In this world, a smartphone, a server running an AI algorithm, and a pile of paper clips are potentially all of equal value and defined by their cost. (Hint: This is not a great environment to work in. You are at risk of being outsourced!)

Enabling IT identifies that technology can affect revenues, rather than just overhead costs. This difference is not trivial, as the average US employee in 2012 made about $47,000 per year, but companies brought in over $280,000 in revenue per employee each year. The difference between calculating the value of technology at roughly $20 an hour vs. roughly $140 an hour for each employee is immense!
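The arithmetic behind those hourly figures is just annual dollars divided by hours worked. A minimal sketch, assuming a standard 2,080-hour work year (40 hours x 52 weeks; the hours and rounding are illustrative conventions, not from the webinar):

```python
# Rough per-hour value of an employee, viewed two ways: cost vs. revenue.
# Assumes a conventional 2,080-hour US work year for illustration.
HOURS_PER_YEAR = 2080

avg_salary = 47_000             # average US employee salary in 2012 (from the text)
revenue_per_employee = 280_000  # annual revenue brought in per employee (from the text)

cost_per_hour = avg_salary / HOURS_PER_YEAR              # roughly $20/hour
revenue_per_hour = revenue_per_employee / HOURS_PER_YEAR  # roughly $140/hour

print(f"Cost view:    ${cost_per_hour:.2f}/hour")
print(f"Revenue view: ${revenue_per_hour:.2f}/hour")
```

The point of the comparison: a cost-driven IT shop values an employee-hour near the salary figure, while an enabling IT shop values the same hour near the revenue figure, a roughly sixfold difference.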

And finally, there is Transformational IT, where the company has literally bet its future on Digital Transformation and IT efforts are focused on moving fast.

These are three different modes of IT, each requiring its own approach to outsourcing. To learn more, watch the entire 25-minute webinar embedded below. I look forward to your thoughts and opinions and hope this helps you on your career path. If you have any questions, please reach out to me at hyoun @ amalgaminsights . com!


EPM at a Crossroads Part 1: Why Is EPM So Confusing?

This blog is the first in a multi-part series explaining the challenges of Enterprise Performance Management, aka Financial Performance Management, Business Performance Management, Corporate Performance Management, Financial Planning and Analysis, and Planning, Budgeting, and Forecasting. Frankly, this list of names alone explains a lot of the confusion. But one of the strangest aspects of this constant market renaming is that there are software and service vendors that literally provide performance management for enterprises, yet refuse to call themselves Enterprise Performance Management because of the confusion created by other industry analysts.

This blog series and the subsequent report seek to be a cure to this disease by exploring and defining Enterprise Performance Management more carefully and specifically so that companies can start using common-sense terms to describe their products once more.

Key Stakeholders: CFO, Chief Accounting Officer, Controllers, Finance Directors and Managers, Treasury Managers, Financial Risk Managers, Accounting Directors and Managers

Why It Matters: The need for true Enterprise Performance Management (EPM) has increased with the emergence of the strategic CFO and demands for cross-departmental business visibility. But constant renaming and market fragmentation have made EPM a muddled term. Amalgam Insights is stepping up to fix this term and more accurately define the tasks and roles of EPM.

So, Why is Enterprise Performance Management So Confusing?

The market of Enterprise Performance Management has been a confusing one to follow over the past few years. During this decade, the functions of enterprise planning and budgeting have quickly evolved as the role of the CFO has transformed from chief beancounter to a chief strategist who has eclipsed the Chief Operating Officer and Chief Revenue Officer as the most important C-Level officer in the enterprise. The reason for this is simple: a company only goes as far as money allows it to.

The emergence of the strategic CFO has led to four interesting trends in the Finance Office.

  1. Financial officers now need to access non-financial data to support strategic needs and to track business progress in areas that are not purely financial in nature. This means that the CFO is looking for better data, improved integration between key data sources and applications, and more integrated planning between multiple departments.

  2. “Big Data” also becomes a big deal for the CFO, who must integrate relevant supply chain and manufacturing data, audio & video documents, external weather and government data, and other business descriptors that go outside of the traditional accounting entries used to build the balance sheet, income statement, and other traditional financial documents. To integrate the gigantic data volumes and non-traditional data formats associated with Big Data, planning tools must be scalable and flexible. This may mean going beyond the traditional OLAP cube to support semi-structured and unstructured data that provides business guidance.

  3. A quantitative approach to business management also requires a data-driven and analytic view of the business. After all, what is the point of data if it is not aligned and contextualized to the business? Behind the buzzwords, this means that the CFO must upgrade her tech skills and review college math including statistics and possibly combinatorics and linear algebra. In addition, the CFO needs to track all aspects of strategic finance, including consolidation & close, treasury management, risk management, and strategic business modeling capabilities.

  4. Finally, the CFO must think beyond data and consider the workflows associated with each key financial and operational task. This means the CFO is now also the Chief Process Officer, a role once assigned to the Chief Operating Officer, because of the CFO’s role in governance and strategy. Although accountants are not considered masters of operational management and supply chain, the modern CFO is slowly being cornered into this environment.

These drivers of strategy, data, and workflows force CFOs outside of their accounting and finance backgrounds to be true business managers. And in this context, it makes sense that Enterprise Performance Management is breaking up into multiple different paths to support the analytics, machine learning, workflows, document management, strategy, qualitative data, and portfolio management challenges that the modern strategic CFO faces.

Finance and accounting are evolving quickly in a world with new business models, massive data capacities, and complex analytical tools. But based on this fragmentation, it is difficult to imagine at first glance how any traditional planning and budgeting solution could effectively support all of these new tasks. So, in response, “EPM” solutions have started to go down four different paths. In upcoming reports, I’ll explain the paths that modern EPM solutions are taking to solve the strategic CFO’s dilemmas.


Market Milestone: Oracle Builds Data Science Gravity By Purchasing DataScience.com

Industry: Data Science Platforms

Key Stakeholders: IT managers, data scientists, data analysts, database administrators, application developers, enterprise statisticians, machine learning directors and managers, current DataScience.com customers, current Oracle customers

Why It Matters: Oracle released a number of AI tools in Q4 2017, but until now, it lacked a data science platform to support complete data science workflows. With this acquisition, Oracle now has an end-to-end platform to manage these workflows and support collaboration among teams of data scientists and business users, and it joins other major enterprise software companies in being able to operationalize data science.

Top Takeaways: Oracle acquired DataScience.com to retain customers with data science needs in-house rather than risk losing their data science-based business to competitors. However, Oracle has not yet defined a timeline for rolling out the unified data science platform, or its future availability on the Oracle Cloud.

Oracle Acquires DataScience.com

On May 16, 2018, Oracle announced that it had agreed to acquire DataScience.com, an enterprise data science platform that Oracle expects to add to the Oracle Cloud environment. With Oracle’s debut of a number of AI tools last fall, this latest acquisition telegraphs Oracle’s intent to expedite its entrance into the data science platform market by buying its way in.

Oracle is reviewing DataScience.com’s existing product roadmap and will supply guidance in the future, but it intends to provide a single unified data science platform in concert with Oracle Cloud Infrastructure and its existing SaaS and PaaS offerings, empowering customers with a broader suite of machine learning tools and a complete workflow.


4 Key Business & Enterprise Recommendations for Google Duplex

This week, everybody is talking about Google Duplex, announced at Google I/O. Based on my previous interactions with IVRs when calling vendors for customer support, Duplex is an impressive leap forward in natural language AI and offers hope of making some clerical tasks easier to complete. Duplex will be tested further by a limited number of users in Google Assistant this summer, refining its ability to complete specific tasks: getting holiday hours for a business, making restaurant reservations, and scheduling hair salon appointments.

So what does this mean for most businesses?



Revealing the Learning Science for Improving IT Onboarding

Key Stakeholders: IT Managers, IT Directors, Chief Information Officers, Chief Technology Officers, Chief Digital Officers, IT Governance Managers, and IT Project and Portfolio Managers.

Top Takeaways: Information technology is innovating at an amazing pace. These technologies hold the promise of increased effectiveness, efficiency, and profits. Unfortunately, the training tools developed to onboard users are often constructed as an afterthought and are ineffective. Technologies with great potential are underutilized because their users are poorly trained. Learning scientists can remedy this problem and can help IT professionals build effective training tools. Onboarding will become more efficient, and profits will follow in short order.

IT Onboarding Has an Adoption Problem

In my lifetime I have seen a number of amazing new technologies emerge. I remember the first handheld calculators and the first “flip” phones. Fast forward to 2018 and the majority of Americans now carry some of the most sophisticated technology that exists in their palm.

In the corporate world technology is everywhere. Old technologies are being updated regularly, and new innovative, disruptive technologies are being created every day. With every update or new technology comes a major challenge that is being poorly addressed.

How do we get users to adopt the technology, and to use it efficiently and effectively?

This is a problem in training. As a learning scientist I find it remarkable that so much time and money is allocated to developing these amazing technologies, but training is treated as an afterthought. This is a recipe for poor adoption, underutilization of these powerful tools, and reduced profits.

The IT Onboarding Issue

I experienced this dozens of times in my 25-year career as a University Professor. I routinely received emails outlining in excruciating detail a new set of rules, regulations, or policies. The email would be laced with threats for non-compliance, but poorly designed instructions on how to obtain compliance. The usual approach was to ignore the instructions in the email and instead use the grapevine to identify which buttons to click, and in what order, to achieve compliance. I also received emails explaining (again, usually in excruciating detail) how a new version of an existing piece of software had changed, or how some new technology was going to be unveiled that would replace a tool that I currently used. I was provided with a schedule of “training workshops” to attend, or a link to some unintelligible “training manual”. I either found a way to use the old technology in secret, made a formal request to continue using the “old” technology to avoid having to learn the new one, or asked the “tech” guy to come to my office and show me how to do the 5 things that I needed the new technology to do. I would take copious notes and save them for future reference.

If I got an email detailing a new technology that did not affect my current system, I deleted it with relief.

My story is common. It suggests a broken and ineffective system, and it all centers on a lack of quality training.

This is not restricted to the University setting. I have a colleague who builds mobile apps for large pharmacy chains. The mobile app reminds patients to refill prescriptions, lets pharmacists adjust prescriptions, and offers several other features. It is a great offering and adds value for his clients. As with most IT, he rolls out a new release every few months. His main struggles are determining what features to improve, what new features to add, and how to effectively onboard users.

He gets some guidance on how to improve an existing feature, or what new features to add, but he often complains that these suggestions sound more like a client’s personal preference and are not driven by objectively-determined customer needs. With respect to training, he admits that the training manual is an afterthought and he is frustrated because his team is constantly fielding questions that are answered in the manual.

The result: “Improved” or new features no one wants, and difficulty training users of the app.

Experts in IT development should not be expected to have the skills to build an effective training manual, but they do need to understand that onboarding is critical, and effective onboarding requires an effective training tool.

Key Recommendations for Improving IT Onboarding

This is where learning science should be leveraged. An extensive body of psychological and brain science research (including much of my own) conducted over the past several decades provides specific guidelines on how to effectively train users. Here are some suggestions, derived from learning science, for improving IT training and development.

Recommendation 1 – Survey Your Users: All too often, technology is “improved” or a new feature is released and few clients see its value. Users know what they want. They know what problems they need to solve and want tools for solving those problems. Users should be surveyed to determine what features they like, what features could be improved, and what problems they would like solved with a new feature. The simplest, cheapest, and most effective way to do this is to ask them via an objective survey. Don’t just ask the CEO; ask the users.

Recommendation 2 – Develop Step-by-Step Instructions for Each Feature and Problem it Solves: Although your new technology may have a large number of features and can solve a large number of problems, most technology users take advantage of a small handful of features to solve a small number of problems. Step-by-step instructions should be developed that train the user on each specific feature and how it helps them solve a specific problem. If I only use five features to solve two main problems, I should be able to train on those features within the context of these two problems. This approach will be fast and effective.

Recommendation 3 – Training Content Should be Grounded in Microlearning: Microlearning is an approach to training that leverages the brain science of learning. Attention spans are short and working memory capacity is limited. People learn best when training content comes in small chunks (5–10 minutes), each focused on a single coherent concept. If you need to utilize two independent features to solve a specific problem, training will be most effective if you train on each feature separately, then train on how to use those two features in concert to solve the specific problem.

Recommendation 4 – Develop Training Materials in Multiple Mediums: Some learners prefer to read, some prefer to watch videos, and some prefer to listen. Training materials should be available in as many of these mediums as possible.

Recommendation 5 – Incorporate Knowledge Checks: One of the best ways to train our brains is to test our knowledge of material that we have learned. Testing a learner’s knowledge of the steps needed to solve a problem, or of the features of your app, requires the learner to use cognitive effort to retrieve that information from memory. This process strengthens information already learned and can identify areas of weakness in the learner’s knowledge, cueing them to revisit the training material.

How to Implement These Recommendations & Representative Vendors

Now that we have identified the problem and offered some solutions, key stakeholders need to know how to implement these recommendations. Several avenues are available. Key stakeholders can work with corporate learning and internal learning and development solutions, as well as with corporate communications from an internal marketing perspective. We at Amalgam provide guidance on the best solutions and how to implement them.

Solutions are available from companies such as: Accord LMS, Agylia Group, Axonify, Cornerstone, Crossknowledge, Degreed, Docebo, EdCast, Expertus, Fivel, GamEffective, Grovo, Litmos, Lumesse, MindTickle, Mzinga, PageUp, Pathgather, PeopleFluent, Qstream, Reflektive, Saba, Salesforce, SAP SuccessFactors, Skillsoft, Talentquest, TalentSoft, Thought Industries, Totara Learn, and Zunos. Over the coming months, I will be researching each of these and other offerings in greater detail and will be writing about how they support IT education.


Cloud Foundry Provides an Open Source Vision for Enterprise Software Development

From April 18-20, Amalgam Insights attended Cloud Foundry Summit 2018 in our hometown of Boston, MA. Both Research Fellow Tom Petrocelli and Founder Hyoun Park attended as we explored the current positioning of Cloud Foundry as an application development platform in light of the ever-changing world of technology. The timing of Cloud Foundry Summit this year coincided with Pivotal’s IPO, which made this Summit especially interesting. Through our attendance of keynote sessions, panels, and the analyst program, we took away several key lessons.

First, the conflict between Cloud Foundry and Kubernetes is disappearing as each solution has found its rightful place in the DevOps world. My colleague Tom Petrocelli goes into more detail in explaining how the perceived conflict between Cloud Foundry’s initial containerization efforts and Kubernetes is not justified. This summit made very clear that Cloud Foundry is a very practical solution focused on supporting enterprise-grade applications that abstracts both infrastructure and scale. Amalgam takes the stance that this conflict in software abstraction should never have been there in the first place. Practitioners have been creating an artificial conflict that the technology solutions are seeking to clarify and ameliorate.

At the event, Cloud Foundry also announced heavy-hitting partners. Alibaba Cloud is now a Gold member of the Cloud Foundry Foundation. With this announcement, Cloud Foundry goes to China and joins the fastest growing cloud in the world. This announcement mirrors Red Hat’s announcement of Alibaba becoming a Red Hat Certified Cloud and Service Provider last October and leads to an interesting showdown in China as developers choose between Cloud Foundry and OpenStack to build China’s future of software.

In addition, the Cloud Foundry Foundation announced Cloud.gov as the 8th certified provider of Cloud Foundry. This step forward will allow federal agencies to use a certified and FedRAMP Authorized Cloud Foundry platform. The importance of this announcement was emphasized in an Air Force-led session on Project Kessel Run, which is focused on agile software for the Air Force. This session showed how Cloud Foundry accelerated the ability to execute in an environment where the average project took over 8 years to complete due to challenges such as the need to ask for Congressional approval on a yearly basis. By using Cloud Foundry, the Air Force has identified opportunities to build applications in a couple of months and get these tools directly to soldiers, creating a culture that is #AgileAF (which obviously stands for “Agile Air Force”). The role of Cloud Foundry in accelerating one of the most challenging and heavily governed application development environments in the world accentuated its value in effectively enabling the vaunted goal of digital transformation.

From a technical perspective, the announcement that really grabbed our attention was the demonstration of cfdev providing a full Cloud Foundry development experience on a local laptop or workstation using native hypervisors. This will make adoption far easier for developers seeking to quickly develop and debug applications, as well as help developers build test and sandbox environments for Cloud Foundry.

Overall, this event demonstrated the evolution of Cloud Foundry. The themes of Cloud Foundry as a government enabler and its move into China were front and center throughout this event. Combined with the Pivotal IPO and Cloud Foundry’s ability to work with Kubernetes, it is hard to deny Cloud Foundry’s progress as an enterprise-grade and global solution for accelerating application development and for partnering with other appropriate technologies.


Lynne Baer: Clarifying Data Science Platforms for Business

My name is Lynne Baer, and I’ll be covering the world of data science software for Amalgam Insights. I’ll investigate data science platforms and apps to solve the puzzle of getting the right tools to the right people and organizations.

“Data science” is on the tip of every executive’s tongue right now. The idea that new business initiatives (and improvements to existing ones) can be found in the data a company is already collecting is compelling. Perhaps your organization has already dipped its toes in the data discovery and analysis waters – your employees may be managing your company’s data in Informatica, or performing statistical analysis in Statistica, or experimenting with Tableau to transform data into visualizations.

But what is a Data Science Platform? Right now, if you’re looking to buy software for your company to do data science-related tasks, it’s difficult to know which applications will actually suit your needs. Do you already have a data workflow you’d like to build on, or are you looking to the structure of an end-to-end platform to set your data science initiative up for success? How do you coordinate a team of data scientists to take better advantage of the resources they’ve already created? Do you have coders in-house who can work with a platform designed for people writing in Python, R, Scala, or Julia? Are there more user-friendly tools out there your company can use if you don’t? What do you do if some of your data requires tighter security protocols around it? Or if some of your data models themselves are proprietary and/or confidential?

All of these questions are part and parcel of the big one: How can companies tell what makes a good data science platform for their needs before investing time and money? Are traditional enterprise software vendors like IBM, Microsoft, SAP, and SAS dependable in this space? What about companies like Alteryx, H2O.ai, KNIME, and RapidMiner? Other popular platforms under consideration should also include Anaconda, Angoss (recently acquired by Datawatch), Domino, Databricks, Dataiku, MapR, Mathworks, Teradata, and TIBCO. And then there are new startups like Sentenai, focused on streaming sensor data, and slightly more established companies like Cloudera looking to expand from their existing offerings.

Over the next several months, I’ll be digging deeply to answer these questions, speaking with vendors, users, and investors in the data science market. I would love to speak with you, and I look forward to continuing this discussion. And if you’ll be at Alteryx Inspire in June, I’ll see you there.


The Software Abstraction Disconnect is Silly

Tom Petrocelli, Amalgam Insights Research Fellow

Over the past two weeks, I’ve been to two conferences that are run by an open source community. The first was the CloudFoundry Summit in Boston followed by KubeCon+CloudNativeCon Europe 2018 in Copenhagen. At both, I found passionate and vibrant communities of sysops, developers, and companies. For those unfamiliar with CloudFoundry and Kubernetes, they are open source technologies that abstract software infrastructure to make it easier for developers and sysops to deliver applications more quickly.

Both serve similar communities and have a generally similar goal. There is some overlap – CloudFoundry has its own container and container orchestration capability – but the two technologies are mostly complementary. It is possible, for example, to deploy CloudFoundry as a Kubernetes cluster and use CloudFoundry to deploy Kubernetes. I met with IT professionals that are doing one or both of these. The same is true for OpenStack and CloudFoundry (and Kubernetes for that matter). OpenStack is used to abstract the hardware infrastructure, in effect creating a cloud within a data center. It is a tool used by sysops to provision hardware as easily scalable resources, creating a private cloud. So, like CloudFoundry does for software, OpenStack helps to manage resources more easily so that a sysop doesn’t have to do everything by hand. CloudFoundry and OpenStack are clearly complementary. Sysops use OpenStack to create resources in the form of a private cloud; developers then use CloudFoundry to pull together private and public cloud resources into a platform they deploy applications to. Kubernetes can be found in any of those places.

Fake News, Fake Controversies

Why, then, is there this constant tension between the communities and adopters of these technologies? It’s as if carpenters had hammer people and saw people who argued over which was better. According to my carpenter friends, they don’t. The foundations and vendors avoid this type of talk, but these kinds of discussions are happening at the practitioner and contributor level all the time. During KubeCon+CloudNativeCon Europe 2018, I saw a number of tweets that, in essence, said: “Why is Cloud Foundry Executive Director Abby Kearns speaking at KubeCon?” They questioned what one had to do with the other. Why not question what peanut butter and jelly have to do with each other?

Since each of these open source projects (and the products based on them) has a different place in a modern hybrid cloud infrastructure, how is it that very smart people are being so short-sighted? Clearly, there is a problem in these communities that limits their point of view. One theory lies in what it takes to proselytize these projects within an organization and the wider community. To put it succinctly, to get corporate buy-in and widespread adoption, community members have to become strongly focused on their specific project. So focused that some put on blinders and can no longer see the big picture. In fact, in order to sell the world on something that seems radical at first, you trade real vision for tunnel vision.

People become invested in what they do, and that’s good for these types of community-developed technologies. They require a commitment to a project that can’t be driven by any one company and may not pan out. It turns toxic when the separate communities become so ensconced in their own little corner of the tech world that they can’t see the big picture. The very nature of these projects defies an overriding authority that demands that everyone get along, so they don’t always.

It’s time to get some perspective, to see the big picture. We have an embarrassment of technology abstraction riches. It’s time to look up from individual projects and see the wider world. Your organizations will love you for it.


KubeCon+CloudNativeCon Europe 2018 Demonstrates The Breadth and Width of Kubernetes

Standing in the main expo hall of KubeCon+CloudNativeCon Europe 2018 in Copenhagen, the richness of the Kubernetes ecosystem is readily apparent. There are booths everywhere, addressing all the infrastructure needs for an enterprise cluster. There are meetings everywhere for the open source projects that make up the Kubernetes and Cloud Native base of technology. The keynotes are full. What was a 500-person conference in 2012 is now, 6 years later, a 4300-person conference even though it’s not in one of the hotbeds of American technology such as San Francisco or New York City.

What is amazing is how much Kubernetes has grown in such a short amount of time. It was only a little more than a year ago that Docker released its Kubernetes competitor, Swarm. While Swarm still exists, Docker also supports, and arguably is betting its future on, Kubernetes.

Kubernetes came out of Google, but that doesn’t really explain why it expanded like the early universe after the Big Bang. Google is not the market leader in the cloud space – it’s one of the top vendors but not the top vendor – and wouldn’t have provided enough market pull to drive the Kubernetes engine this hot. Google is also not a major enterprise infrastructure software vendor the way IBM, Microsoft, or even Red Hat and Canonical are.

Kubernetes benefited from the first-mover effect. It was early to market with container orchestration, was fully open source, and had a large amount of testing in Google’s own environment. Docker Swarm, on the other hand, was too closely tied to Docker, the company, to appease the open source gods.

Now, Kubernetes finds itself like a new college graduate: all grown up, but needing to prepare for the real world. The basics are in place and it is mature, but an enormous amount of refinement is needed, and holes must be filled, for it to become a common part of every enterprise software infrastructure. KubeCon+CloudNativeCon shows that this is well underway. The focus now is on security, monitoring, network improvement, and scalability. There doesn’t seem to be a lot of concern about stability or basic functionality.

Kubernetes has eaten the container world and didn’t get indigestion. That’s rare and wonderful.


Market Milestone: ServiceNow Buys VendorHawk and SaaS Management Comes of Age

Industry: Software Asset Management

Key Stakeholders: CIO, CFO, Chief Digital Officer, Chief Technology Officer, Chief Mobility Officer, IT Asset Directors and Managers, Procurement Directors and Managers, Accounting Directors and Managers

Why It Matters: Software-as-a-Service (SaaS) is now a strategic IT component. As enterprise SaaS doubles in market size over the next three years, this complex spend category will continue to expand beyond the ability to manually manage it.

Top Takeaways: With this acquisition, ServiceNow will have a cutting-edge & converged Software Asset Management solution for both SaaS and on-premises applications in 2019. Enterprise organizations managing over $25,000 a month in SaaS spend should consider an enterprise SaaS vendor management solution to optimize licenses, de-duplicate vendor categories, and gain enterprise-grade governance.

With ServiceNow’s acquisition of VendorHawk, the era of SaaS Vendor Management has arrived.

ServiceNow Acquires VendorHawk

On April 25th, 2018, ServiceNow announced its acquisition of SaaS Vendor Management solution VendorHawk in an all-cash transaction scheduled to close in April. This acquisition highlights the increasingly strategic role of SaaS from an IT service management perspective and validates the need for Software Asset Management solutions to support SaaS. In addition, it continues a string of acquisitions that ServiceNow has made over the past year, including:

• Qlue, an artificial intelligence framework for customer service
• Telepathy, a design firm focused on massive adoption of applications
• SkyGiraffe, a no-code mobile app development platform used to make all ServiceNow applications mobile-friendly

The VendorHawk acquisition falls in line with these acquisitions in that VendorHawk will help enterprises to support the widespread adoption of SaaS.