Posted on

EPM at a Crossroads: Big Data Solutions

Key Stakeholders: Chief Information Officers, Chief Financial Officers, Chief Operating Officers, Chief Digital Officers, Chief Technology Officers, Accounting Directors and Managers, Sales Operations Directors and Managers, Controllers, Finance Directors and Managers, Corporate Planning Directors and Managers

Analyst-Recommended Solutions: Adaptive Insights, a Workday Company, Anaplan, Board, Domo, IBM Planning Analytics, OneStream, Oracle Planning and Budgeting, SAP Analytics Cloud

In 2018, the Enterprise Performance Management (EPM) market is at a crossroads. This market emerged from a foundation of financial planning, budgeting, and forecasting solutions designed to support basic planning and has evolved as the demands for business planning, forecasting and risk management, and consolidation have increased over time. In addition, the EPM market has expanded as companies from the financial consolidation and close, business performance management, and workflow and process automation markets now play important roles in effectively managing enterprise performance.

Against this backdrop, Amalgam Insights is tracking six key areas where Enterprise Performance Management is fundamentally changing: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience.

Supporting Big Data for Enterprise Performance Management

Amalgam Insights has identified two key drivers repeatedly mentioned by finance departments seeking to support Big Data in Enterprise Performance Management. First, EPM solutions must support ever-larger stores of data to fully analyze financial data along with the plethora of additional business data needed for strategic business analysis. This challenge has become increasingly pressing as enterprises face billion-row tables and outgrow the traditional cubes and datamarts used to manage basic financial data. The sheer scale of financial and commerce-related transactional data requires a Big Data approach at the enterprise level to support timely analysis of planning, consolidation, close, risk, and compliance.

In addition, these large data sources need to integrate with other data sources and references to support integrated business planning, aligning finance planning with sales, supply chain, IT, and other departments. As the CFO is increasingly asked to be not only a financial leader but also a strategic leader, she must have access to all relevant business drivers and a single view of how sales, support, supply chain, marketing, operational, and third-party data align to financial performance. Each of these departments has its own large store of data that the strategic CFO must also be able to access, allocate, and analyze to guide the business.

New EPM solutions must evolve beyond traditional OLAP cubes to hybrid data structures that scale to the immense volume and variety of data now involved. Amalgam notes that EPM vendors focused on large-scale data take a variety of relational, in-memory, columnar, cloud computing, and algorithmic approaches to define categories on the fly and to store, structure, and analyze financial data.

To manage these large stores of data and use them effectively from a financial, strategic, and analytic perspective, Amalgam Insights recommends the following companies, which have been innovative in supporting immense and varied planning and budgeting data environments, based on briefings and discussions held in 2018:

  • Adaptive Insights, a Workday Company
  • Anaplan
  • Board
  • Domo
  • IBM Planning Analytics
  • OneStream
  • Oracle Planning and Budgeting
  • SAP Analytics Cloud

Adaptive Insights

Adaptive Insights’ Elastic Hypercube is an in-memory, dynamic caching and scaling solution announced in July 2018. Amalgam Insights saw a preview of this technology at Adaptive Live and was intrigued by the efficiency it brings to models: as a model is edited, only the dependent changes are selectively recalculated; a dynamic caching approach uses memory and computational cycles only when data is being accessed; and both tabular and cube formats are supported as data structures. This data format will also be useful to Adaptive Insights, as a Workday company, in building out the departmental planning solutions that will be accretive to Workday’s positioning as an HR and ERP solution following Workday’s June acquisition (covered in our June Market Milestone).
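
To illustrate the general pattern behind selective, dependency-aware recalculation with lazy caching, here is a minimal Python sketch. It shows the generic technique only, not Adaptive Insights’ implementation, and all names in it are hypothetical:

```python
# Minimal sketch of dependency-aware recalculation with lazy caching.
# This illustrates the general technique, not Adaptive Insights' engine.

class Cell:
    def __init__(self, name, formula=None, value=None):
        self.name = name
        self.formula = formula      # callable over a dict of input values, or None for inputs
        self.value = value
        self.inputs = []            # cells this cell depends on
        self.dependents = []        # cells that depend on this cell
        self.dirty = False

    def set_value(self, value):
        """Edit an input cell and mark only downstream cells as dirty."""
        self.value = value
        for cell in self.dependents:
            cell.invalidate()

    def invalidate(self):
        if not self.dirty:
            self.dirty = True
            for cell in self.dependents:
                cell.invalidate()

    def get(self):
        """Recompute only when accessed and only if marked dirty (lazy caching)."""
        if self.formula and (self.dirty or self.value is None):
            self.value = self.formula({c.name: c.get() for c in self.inputs})
            self.dirty = False
        return self.value


def link(output, *inputs):
    output.inputs = list(inputs)
    for cell in inputs:
        cell.dependents.append(output)


# Example: revenue = price * units; only cells downstream of an edit are recomputed.
price = Cell("price", value=10.0)
units = Cell("units", value=500)
revenue = Cell("revenue", formula=lambda v: v["price"] * v["units"])
link(revenue, price, units)

print(revenue.get())   # 5000.0, computed on first access
units.set_value(600)   # marks only 'revenue' dirty
print(revenue.get())   # 6000.0, recomputed because a dependency changed
```

The point of the sketch is that an edit touches only the cells downstream of it, and nothing is recomputed until it is actually read.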

Anaplan

Anaplan’s Hyperblock is an in-memory engine combining columnar, relational, and OLAP approaches. This technology is the basis of Anaplan’s platform and allows Anaplan to support large planning use cases quickly. By developing composite dimensions, Anaplan users can pre-build a broad array of combinations that can be used to repeatably deploy analytic outputs. As noted in our March blog, Anaplan has been growing rapidly based on its ability to quickly support new use cases. In addition, Anaplan has recently filed its S-1 to go public.

Board

Board goes to market as both an EPM and a general business intelligence solution. Its core technology is the Hybrid Bitwise Memory Pattern (HBMP), a proprietary in-memory data management solution designed to algorithmically map each bit of data and then store this map in memory. In practice, this approach lets many users access and edit information simultaneously without lags or processing delays. It also allows Board to choose which aspects of the data to hold in memory or handle dynamically in order to prioritize computing assets.
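
As a rough illustration of the general idea of mapping data to bits, the sketch below uses a generic bitmap-index approach; it is not Board’s proprietary HBMP, just an example of how bitwise maps make filtering cheap for many concurrent readers:

```python
# Generic bitmap-index sketch: one bit per record for each distinct value.
# Illustrates the broad concept of bitwise data mapping only; Board's HBMP
# is proprietary and almost certainly far more sophisticated.

records = ["EMEA", "APAC", "EMEA", "AMER", "EMEA", "APAC"]

# Build a bit vector per distinct value: bit i is set if record i holds that value.
bitmaps = {}
for i, value in enumerate(records):
    bitmaps[value] = bitmaps.get(value, 0) | (1 << i)

# Filtering becomes cheap bitwise arithmetic instead of scanning every record.
emea_or_apac = bitmaps["EMEA"] | bitmaps["APAC"]
print(bin(bitmaps["EMEA"]))   # 0b10101  -> records 0, 2, and 4
print(bin(emea_or_apac))      # 0b110111 -> records 0, 1, 2, 4, and 5
print(bin(emea_or_apac).count("1"))  # 5 matching records, counted without a scan
```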

Domo

Domo describes its Adrenaline engine as an “n-dimensional, highly concurrent, exo-scale, massively parallel, and sub-second data warehouse engine” for storing business data. This is accompanied by VAULT, Domo’s data lake, which supports data ingestion and serves as a single store of record for business analysis. Amalgam Insights covered the Adrenaline engine as one of Domo’s “Seven Samurai” in our March report, Domo Hajimemashite: At Domopalooza 2018, Domo Solves Its Case of Mistaken Identity. Behind the buzzwords, these technologies allow Domo to provide executive reporting capabilities across a wide range of departmental use cases in near-real time. Although Domo is not a budgeting solution, it is focused on portraying enterprise performance for executive consumption and should be considered by organizations seeking business-wide visibility into key performance metrics.

IBM Planning Analytics

IBM Planning Analytics runs on Cognos TM1 OLAP in-memory cubes. To increase performance, these cubes use sparse memory management: missing values are ignored and empty values are not stored. In conjunction with IBM’s practice of caching analytic outcomes in memory, this allows IBM to outperform standard OLAP approaches, and the approach has been validated at scale by a variety of IBM Planning Analytics clients. Amalgam Insights presented on the value of IBM’s approach at IBM Vision 2017 from both a data perspective and a user interface perspective; the latter will be covered in a future blog.
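
As a rough illustration of why sparse storage matters for planning cubes, the sketch below stores only populated cells keyed by their coordinates. It shows the general concept only and is not a description of TM1’s internal engine; all dimension and member names are hypothetical:

```python
# Sketch of sparse vs. dense storage for an OLAP-style cube.
# Illustrates the general concept only; IBM TM1's actual engine is proprietary.

accounts = [f"acct_{i}" for i in range(100)]
regions = [f"region_{i}" for i in range(50)]
months = [f"2018-{m:02d}" for m in range(1, 13)]

# Dense representation: one slot per (account, region, month) combination.
dense_cells = len(accounts) * len(regions) * len(months)   # 60,000 slots

# Sparse representation: store only the cells that actually hold a value.
sparse_cube = {
    ("acct_3", "region_7", "2018-01"): 125_000.0,
    ("acct_3", "region_7", "2018-02"): 131_500.0,
    ("acct_9", "region_2", "2018-01"): 48_250.0,
}

def lookup(cube, account, region, month):
    # Missing coordinates are treated as empty rather than stored as zeros.
    return cube.get((account, region, month), 0.0)

print(f"Dense slots: {dense_cells:,}; sparse cells stored: {len(sparse_cube)}")
print(lookup(sparse_cube, "acct_3", "region_7", "2018-02"))  # 131500.0
print(lookup(sparse_cube, "acct_1", "region_1", "2018-03"))  # 0.0, never stored
```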

OneStream

OneStream provides in-memory processing and stateless servers to support scale, but its approach to analytic scale is based on virtual cubes and extensible dimensions. These allow organizations to keep building dimensions over time that tie back to a corporate level, and to create logical views of data from a larger data store to support specific financial tasks such as budgeting, tax reporting, or financial reporting. OneStream’s approach is focused on financial use rather than general business planning.

Oracle Planning and Budgeting Cloud

Oracle Planning and Budgeting Cloud Service is based on Oracle Hyperion, the market leader in Enterprise Performance Management from a revenue perspective. The Oracle Cloud is built on Oracle Exalogic Elastic Cloud, Oracle Exadata Database Machine, and the Oracle Database, which together provide a strong in-memory foundation for the Planning and Budgeting application and an algorithmic approach to managing storage, compute, and networking. This allows Oracle to support planning models at massive scale.

SAP Analytics Cloud

SAP Analytics Cloud, SAP’s umbrella product for planning and business intelligence, uses SAP S/4HANA, an in-memory columnar relational database, to provide real-time access to data and to accelerate both modelling and analytic outputs based on all relevant transactional data. This approach is part of SAP’s broader HANA strategy of encapsulating both analytic and transactional processing in a single database, effectively making all data reportable, modellable, and actionable. SAP has also recently partnered with Intel on Optane DC persistent memory to support larger data volumes for enterprises requiring larger persistent data stores for analytic use.
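
For a sense of why a columnar layout favors analytic workloads, here is a minimal, generic sketch comparing row and column layouts. It illustrates the broad concept only and is not a description of HANA’s engine; the tiny dataset is invented for illustration:

```python
# Generic sketch of row-oriented vs. column-oriented layouts for analytics.
# Illustrative only; SAP HANA's in-memory columnar engine is far more involved.

# Row store: each transaction kept together, as an OLTP system would write it.
rows = [
    {"order_id": 1, "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "region": "APAC", "amount": 75.5},
    {"order_id": 3, "region": "EMEA", "amount": 310.0},
]

# Column store: each attribute kept contiguously, which an aggregation can
# scan without touching the other columns at all.
columns = {
    "order_id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
    "amount": [120.0, 75.5, 310.0],
}

# An analytic query such as "total amount" reads one compact array in the
# column store, but must walk every full row in the row store.
total_columnar = sum(columns["amount"])
total_row = sum(row["amount"] for row in rows)
print(total_columnar, total_row)  # both 505.5
```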

This blog is part of a multi-part series on the evolution of Enterprise Performance Management and the key themes that the CFO office must consider in managing holistic enterprise performance: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience. If you would like to set up an inquiry to discuss EPM or provide a vendor briefing on this topic, please contact us at info@amalgaminsights.com to schedule time to speak.

Last Blog: EPM at a Crossroads
Next Blog: Robotic Process Automation and Machine Learning in EPM

Posted on

The “Unlearning” Dilemma in Learning and Development

Key Stakeholders: IT Managers, IT Directors, Chief Information Officers, Chief Technology Officers, Chief Digital Officers, IT Governance Managers, and IT Project and Portfolio Managers.

Top Takeaways: One critical barrier to full adoption is the poorly addressed problem of unlearning. Anytime a new piece of software achieves some goal with a set of motor behaviors that is at odds with some well-established, habitized motor program, the learner struggles, evidences frustration, and is less likely to effectively onboard. Learning scientists can remedy this problem and can help IT professionals build effective training tools.

Introduction

In my lifetime I have seen amazing advances in technology. I remember the days of overhead projectors, typewriters and white out. Now our handheld “phone” can project a high-resolution image, can convert spoken word into text, and can autocorrect errors.

The corporate world is dominated by new technologies that are making our lives easier and our workplaces more effective. Old technologies are being updated regularly, and new, innovative, disruptive technologies are replacing them. It is an exciting time. Even so, this fast-paced technological change requires continuous learning and unlearning, and this is where we often fall short.

Despite the fact that we all have experience with new technologies that have made our lives easier, and we applaud those advances, a large proportion of us (myself included) fear the introduction of a new technology or the next release of our favorite piece of software. We know that new and improved technologies generally make us more productive and make our lives easier, at least in the long run, but at the same time, we dread that new software because the training usually “sucks”.

This is a serious problem. New and improved technology development is time-consuming and expensive. The expectation is that all users will onboard effectively and reap the benefits of the new technology together. When many users actively avoid the onboarding process (yours truly included), the result is poor adoption, underutilization of these powerful tools, and reduced profits. I refer to this as the “adoption gap”.

Why does this “adoption gap” exist and what can we do about it?

There are two reasons for the “adoption gap” and both can be addressed if training procedures are developed that embrace learning science—the marriage of psychology and brain science.

First, all too often a software developer, or someone on their team or another team, is tasked with developing a training manual or training tool. Although these individuals are amazing at their jobs, they are not experts in training content development and delivery and often build ineffective tools. The relevant stories I can tell from my 25-year career as a University Professor are many. I can’t count the number of times I received an email explaining (in paradoxically excruciating yet incomprehensible detail) how a new version of an existing piece of software had changed, or how some new technology was going to be unveiled that would replace a tool I currently used. I was provided with a schedule of “training workshops” to attend, or a link to some unintelligible “training manual”.

Because the training materials were not easy to use and not immediately actionable, my colleagues and I did everything we could to stick with the legacy (and currently workable) solution or to get small-group training from someone who knew how to use the technology. Although this worked for us, it was highly sub-optimal from the perspective of the University and it adversely affected productivity.

When the training tool is ineffective, employees will fail to use the new technology effectively and will quickly revert to the legacy software that works and feels comfortable. I discussed this problem in a recent blog post and offered some specific suggestions for improving technology training and development, including surveying users, embracing microlearning, using multiple media, and incorporating knowledge checks. Even so, that blog ignores the second major cause of the “adoption gap” and obstacle to IT onboarding: unlearning.

The Importance of Unlearning

In this report I take a deeper dive and explore a problem that is poorly understood in Learning and Development. This is the central problem of unlearning. Simply put, all too often a novel technology or software release will introduce new ways of achieving a goal that are very different from, or at odds with, the old way of achieving that goal.

Consider a typical office setting in which an employee spends several hours each day using some technology (e.g., Photoshop, a word processor, some statistics package). The employee has been using this technology on a daily basis for months, possibly years, and using it has become second nature. The employee has become so proficient with this tool that their behaviors and motor interactions with the technology have become habitized. The employee does not even have to “think” about how to cut and paste, or conduct a simple regression, or add a layer to their project. They have performed these actions so many times that they have developed “muscle memory”. The brain develops “muscle memory” and habits that reduce the working memory and attention load, leaving those valuable resources for more complex problems like interpreting the outcome of a regression or visualizing the finalized Photoshop project that we have in mind.

Now suppose that the new release changes the motor program associated with cutting and pasting, the drop-down menu selections needed to complete a regression, or the button clicks to add, delete, or move project layers. In this case, your habits and muscle memory are telling you one thing, but you have to do something else with the new software. This is very challenging, frustrating, and demanding of working memory and attention. One has to use inhibitory control to keep from initiating the habit, think hard to initiate the new set of behaviors instead, and do this over and over again until the new behaviors become habitized. This takes time and effort. Many (yours truly included) will abandon this process and fall back on what “works”. Unlearning habits is much more challenging than learning new behaviors.

Key Recommendations to Support Unlearning

This is an area where learning science can be leveraged. An extensive body of psychological and brain science research (much of my own) has been conducted over the past several decades that provides specific guidelines on how to solve the problem of unlearning. Here are a few suggestions for addressing this problem.

Recommendation 1: Identify Technology Changes That Impact High-Frequency “Habits”. When onboarding a new software solution or upgrading an existing one, the IT team should audit IT behavior to identify high-frequency functionality and monitor users’ behavior. Users should also be encouraged to provide feedback on their interactions with the software and to identify functions that they believe have changed. Of course, IT personnel could be proactive and audit high-frequency behaviors before purchasing new software; this information could guide the purchasing process. IT professionals must understand that although the technology as a whole may improve with each new release, there is often at least one high-frequency task that changes and requires extensive working memory and attentional resources to overcome. Every such instance is a chance for an onboarding failure.

Recommendation 2: Apply Spaced Training and Periodic Testing to Unlearn High-Frequency Habits. Once the high-frequency habits that have changed are identified, unlearning procedures should be incorporated to speed the unlearning and new-learning process. Spaced training and periodic testing can be implemented to speed this process. Details can be found here, but briefly, learning (and unlearning) procedures should be developed that target these habits for change. These training modules should be introduced across training sessions spaced over time (usually hours or days apart). Each training session should be preceded by a testing protocol that identifies areas of weakness requiring additional training. This provides critical information for the learner and allows them to see objective evidence of progress. In short, habits cannot be overcome in a single training session, but the speed of learning and unlearning can be increased when spaced training and testing procedures are introduced.

Recommendation 3: Automate High-Frequency Tasks to Avoid the Need for Unlearning. The obvious solution to the learning and unlearning problem is to minimize the number of motor procedural changes across software releases or with new technology. A straightforward method is to ask employees which tasks they have locked into “muscle memory”. Once these are identified, software developers and UI/UX experts can work to automate or optimize them through tools such as optimized macros, machine learning-based optimization, or new functionality. The time saved onboarding users should be significant, and the number of users abandoning the process should be minimized. Although aspirational, with the amount of “big data” available to developers and the rich psychological literature on motor behavior, this is a solvable problem. We simply need to recognize the problem that employees have been aware of for decades, and acknowledge that it must be solved.

By taking these recommendations into account, technology onboarding will become more efficient, technology users will become more efficient, and companies will be better positioned to extract maximum value from their investment in new and transformative technologies.

Posted on

FloQast Supports ASC 606 Compliance by Providing a Multi Book Close for Accountants

On September 11, 2018, FloQast announced multi-book accounting capabilities designed to help organizations to support ASC 606 compliant financial closes by supporting dual reporting on revenue recognition and related expenses. As Amalgam Insights has covered in prior research, ASC 606/IFRS 15 standards for recognizing revenue on subscription services are currently required for all public companies and will be the standard for private companies as of the end of 2019.

Currently, FloQast supports multi-book accounting for Oracle NetSuite and Sage Intacct, two strong mid-market finance solutions that have invested in their subscription billing capabilities. This capability is available to FloQast Business, Corporate, and Enterprise customers at no additional cost. The support for these solutions also reflects the investment that each of these ERP vendors has made in subscription billing: NetSuite’s 2015 acquisition of subscription billing solution Monexa eventually led to the launch of NetSuite SuiteBilling, while Sage Intacct developed its subscription billing capabilities organically in 2015.

Why This Matters For Accounting Teams

Currently, accounting teams compliant with ASC 606 are required to provide two sets of books with each financial close. Organizations seeking to accurately reflect their finances from both a legacy and a current perspective either need to duplicate efforts to produce compliant accounting outputs or use an accounting solution that accurately creates separate sets of close results. By simultaneously creating dual-level close outputs, organizations can avoid the challenge of creating detailed journal entries to explain discrepancies within a single close instance.

Recommendations for Accounting Teams with ASC 606 Compliance Requirements

This announcement has a couple of ramifications for mid-market enterprises and organizations that are either currently supporting ASC 606 as public companies or preparing to support ASC 606 as private companies.

First, Amalgam Insights believes that accounting teams using either Oracle NetSuite or Sage Intacct should adopt FloQast as a relatively low-cost solution to the challenge of duplicate ASC 606 closes. Currently, this functionality is most relevant to Oracle NetSuite and Sage Intacct customers with significant ASC 606 accounting challenges. To understand why, consider the basic finances of this decision.

Amalgam Insights estimates that, based on FloQast’s current pricing of $125 per month for business accounts or $150 per month for corporate accounts, FloQast will pay for itself in productivity gains in any accounting department where an organization spends four or more man-hours per month creating duplicate closes. This return is in addition to the existing ROI associated with financial close that Amalgam Insights has previously tracked for FloQast customers in our Business Value Analysis, in which we found that FloQast customers interviewed saw a 647% ROI in their first year of deployment by accelerating and simplifying their close workflows and improving team visibility into the current status of financial close.
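
As a rough back-of-the-envelope check on that break-even estimate (the loaded hourly cost below is our illustrative assumption, not a FloQast figure):

```python
# Back-of-the-envelope break-even for FloQast's multi-book close capability.
# The loaded hourly accountant cost is an assumption for illustration only.

floqast_monthly_cost = 125.0      # Business tier, per month (Corporate is $150)
loaded_hourly_cost = 35.0         # assumed fully loaded cost of an accountant hour

breakeven_hours = floqast_monthly_cost / loaded_hourly_cost
print(f"Break-even: {breakeven_hours:.1f} hours of duplicate-close work per month")
# ~3.6 hours at $35/hour, consistent with the four-or-more-hours estimate above
```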

Second, accounting teams should generally expect to support multi-book accounting for the foreseeable future. Although ASC 606 is now a current standard, financial analysts and investors seeking to conduct historical analysis of a company for investment, acquisition, or partnership will want a consistent “apples-to-apples” comparison of finances across multiple years. Until your organization has three full years of audited financial statements under ASC 606, it will likely have to maintain multiple sets of books. Given that most public organizations started using ASC 606 in 2018, this means having a plan for multiple sets of books until 2020. For private organizations, this may mean an additional year or two, given that mandatory compliance starts at the end of 2019. Companies that avoid preparing for the reality of dual-level closes for the next couple of years will spend significant accountant hours on easily avoidable work.

If you would like to learn more about FloQast, the Business Value Analysis, or the current vendor solution options for financial close management, please contact Amalgam Insights at info@amalgaminsights.com.

Posted on

Amalgam Insights Issues Vendor SmartList™ of Software Solutions That Help Build Better Leadership Brains

BOSTON, September 11, 2018 — A new report from industry analyst firm Amalgam Insights cites nine leaders in software solutions designed to effectively train the hard skills, people skills, and situational awareness of leadership.

The Vendor SmartList™, authored by Amalgam Insights Research Fellow Todd Maddox, Ph.D., takes on added significance given the increasing number of news reports about inappropriate conduct by corporate leaders or the lack of trained behavioral skills among a company’s employees.

“Companies must continuously educate their workers, top to bottom, about the ‘what,’ ‘how,’ and ‘feel’ of effective leadership by leveraging brain science to ensure that their behavior doesn’t land them in trouble,” Maddox says. “The companies in our report have succeeded in mapping these systems to the leadership processes needed in today’s business environment.”

Maddox lists nine companies as leaders, with comments about each company’s strength:
• CrossKnowledge: “Path to Performance uses digital solutions that can combine with facilitator training and a dedicated dashboard to provide effective leadership training paths.”
• Development Dimensions International: “DDI uses a broad range of online and offline solutions, tools for practice and simulation, and ‘what if’ scenarios to train leadership skills that meet tomorrow’s business challenges.”
• Fuse Universal: “(Its) use of science and analytics helps clients build content that reflects their company DNA, vision and brand.”
• Grovo: “Its use of microlearning content and training programs that power modern learning and improve business outcomes.”
• Learning Technologies Group: “Its broad portfolio of providers, combined with its emphasis on rich scenario-based training and the importance of business outcomes.”
• Rehearsal: “The use of video role-play training that helps leaders improve their people skills through practice, coaching and collaboration.”
• Skillsoft: “Its combination of content and delivery tools that effectively train hard skills, people skills and situational awareness in leadership.”
• TalentQuest: “The use of personality and behavioral science to empower leaders to better manage, coach and develop their people.”
• Valamis: “Its use of technology and data science to build a system for each client that delivers learning aligned with the client’s business goals.”

Maddox cautions, however, that companies deploying any of the systems listed above, or others, must be prepared to do so on an ongoing basis. “One-off training does not work,” he notes.

The report is available for download at https://www.amalgaminsights.com/product/vendor-smartlist-2018s-best-leadership-training.

Posted on

Webinar On Demand: Optimizing Leadership Training and Development by Leveraging Learning Science

On September 6, 2018, Amalgam Insights Learning Scientist and Research Fellow Todd Maddox, Ph.D., presented a webinar focused on “The What, How, and Feel of Leadership Brain Training”.

By watching this talk, you can bring back to your organization a better understanding of how psychology and brain science can be leveraged to provide a roadmap for successfully training leaders and managers at all levels. In this era of digital transformation, where organizations rely increasingly on cross-functional and deeply collaborative teams, leadership is becoming more distributed and employees are taking on leadership roles much earlier in their careers.

Combine this with some of the recent corporate crises (#metoo, unconscious bias, discrimination) and effective leadership training becomes even more important. The overriding aim of this talk is to examine leadership training and development from a learning science perspective—the marriage of psychology and brain science—and to identify procedures that optimize leadership training.

To watch this webinar, view it below on the embedded viewer or click through to watch “The What, How, and Feel of Leadership Brain Training”.

Posted on

Google Grants $9 Million in Google Cloud Platform Credits to Kubernetes Project

Tom Petrocelli, Amalgam Insights Research Fellow

Kubernetes has, in the span of a few short years, become the de facto orchestration software for containers. As recently as two years ago, more than a half-dozen orchestration tools were vying for the top spot; now there is only Kubernetes. Even the Linux Foundation’s other orchestrator project, Cloud Foundry Diego, is starting to give way to Kubernetes. Part of the success of Kubernetes can be attributed to the support of Google. Kubernetes emerged out of Google, and Google has continued to bolster the project even as it fell under the auspices of the Linux Foundation’s CNCF.

On August 29, 2018, Google announced that it is giving $9M in Google Cloud Platform (GCP) credit to the CNCF Kubernetes project. This is being hailed by both Google and the CNCF as an announcement of major support. $9M is a lot of money, even if it is credits. However, let’s unpack this announcement a bit more and see what it really means.
Continue reading Google Grants $9 Million in Google Cloud Platform Credits to Kubernetes Project

Posted on

Data Science Platforms News Roundup, August 2018

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Anaconda, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta.

Continue reading Data Science Platforms News Roundup, August 2018

Posted on

VMware Purchases CloudHealth Technologies to support Multicloud Enterprises and Continue Investing in Boston


Vendors and Solutions Mentioned: VMware, CloudHealth Technologies, Cloudyn, Microsoft Azure Cloud Cost Management, Cloud Cruiser, HPE OneSphere, Nutanix Beam, Minjar, Botmetric

Key Stakeholders: Chief Financial Officers, Chief Information Officers, Chief Accounting Officers, Chief Procurement Officers, Cloud Computing Directors and Managers, IT Procurement Directors and Managers, IT Expense Directors and Managers

Key Takeaway: As best-of-breed vendors continue to emerge, new technologies are invented, existing services evolve, vendors pursue new and innovative pricing and delivery models, cloud computing remains easy to procure, and IaaS doubles as a spend category every three years, cloud computing management will only grow more complex and the need for Cloud Service Management will only increase. VMware has made a wise choice in buying into a rapidly growing market and now has a greater opportunity to support and augment complex, decentralized, and hybrid IT environments.

About the Announcement

On August 27, 2018, VMware announced a definitive agreement to acquire CloudHealth Technologies, a Boston-based startup company focused on providing a cloud operations and expense management platform that supports enterprise accounts across Amazon Web Services, Microsoft Azure, and Google Cloud Platform.
Continue reading VMware Purchases CloudHealth Technologies to support Multicloud Enterprises and Continue Investing in Boston

Posted on

Code-Free to Code-Based: The Power Spectrum of Data Science Platforms

The spectrum of code-centricity on data science platforms ranges from “code-free” to “code-based.” Data science platforms frequently boast that they provide environments that require no coding and that are also code-friendly. Where a given platform falls along this spectrum affects who can successfully use it, and what tasks they can perform at what level of complexity and difficulty. Codeless interfaces supply drag-and-drop simplicity and relatively quick responses to straightforward questions at the expense of customizability and power. Code-based interfaces require specialized coding, data, and statistics skills, but supply the flexibility and power to answer cutting-edge questions.

Codeless and hybrid code environments furnish end users who may lack a significant coding and statistics background with some level of data science capability. If a problem is relatively simple, such as a straightforward clustering question to identify customer personas for marketing, graphical interfaces provide the ability to string together data workflows from a pool of available algorithms without needing to know Python or other coding languages. Even for data scientists who do know how to code, pulling together relatively simple models in a drag-and-drop GUI can be faster than coding them manually; it also avoids typos and reduces the need to debug code technicalities, freeing the modeler to focus on the pure logic without distraction.
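
For a sense of what such a drag-and-drop workflow represents under the hood, here is a minimal code-based sketch of the customer-persona clustering example; the file name, column names, and cluster count are illustrative assumptions rather than anything prescribed by a particular platform:

```python
# Minimal code-based equivalent of a "customer persona" clustering workflow
# that a GUI tool might let a user assemble as drag-and-drop nodes.
# The file path, column names, and cluster count are illustrative assumptions.

import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Node 1: ingest data
customers = pd.read_csv("customers.csv")          # hypothetical input file
features = customers[["annual_spend", "orders_per_year", "tenure_months"]]

# Node 2: scale features so no single metric dominates the distance measure
scaled = StandardScaler().fit_transform(features)

# Node 3: cluster customers into a handful of candidate personas
kmeans = KMeans(n_clusters=4, random_state=42, n_init=10)
customers["persona"] = kmeans.fit_predict(scaled)

# Node 4: summarize each persona for the marketing team
print(customers.groupby("persona")[features.columns].mean())
```

In a codeless tool, each of these steps would typically appear as a configurable node on a canvas rather than a line of code.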

Answering a more advanced question may require some level of custom coding. Your data workflow may be constructed in a hybrid manner, composed of some pre-built models connected to nodes that can include bespoke code. This permits more adaptable models and makes them more powerful than those restricted solely to what a given data science platform supplies out of the box. However, even if a data science platform includes the option to embed custom code in a hybrid model, taking advantage of this feature requires somebody with coding knowledge to create the code.
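
A hybrid workflow of this kind might look something like the following sketch, in which pre-built components are chained together with one bespoke code node; the custom capping rule is a hypothetical example of user-written code extending platform-supplied building blocks:

```python
# Sketch of a hybrid workflow: pre-built components plus one bespoke code node.
# The custom transformation is a hypothetical example only.

import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler
from sklearn.linear_model import LogisticRegression

def custom_business_rule(X):
    # Bespoke node: cap extreme values at the 99th percentile of each column,
    # a domain-specific step no out-of-the-box component provides.
    caps = np.percentile(X, 99, axis=0)
    return np.minimum(X, caps)

workflow = Pipeline([
    ("bespoke_capping", FunctionTransformer(custom_business_rule)),  # custom code node
    ("scale", StandardScaler()),                                      # pre-built node
    ("model", LogisticRegression(max_iter=1000)),                     # pre-built node
])

# Usage with stand-in data:
X = np.random.rand(200, 3)
y = (X[:, 0] + X[:, 1] > 1).astype(int)
workflow.fit(X, y)
print(workflow.score(X, y))
```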

If the problem being addressed is complex enough, sharper coding, statistics, and data skills are necessary to create appropriately tailored models. At this level of complexity, a code-centric interactive development environment is necessary so that the data scientist can put their advanced skills into model construction and customization.

Data science platforms can equip data science users and teams with multiple interfaces for creating machine learning models. The interfaces included say a fair bit about what kind of end users a given platform aims to serve best, and the level of skill expected of the various members of your data science team. A fully inclusive data science platform includes both a GUI environment for data analysts to construct simple workflows (and for project managers and line-of-business users to understand what a model is doing from a high-level perspective) and a proper coding environment for data scientists to code more complex custom models.

Posted on

Microsoft Loves Linux and FOSS Because of Developers

Tom Petrocelli, Amalgam Insights Research Fellow

For much of the past 30 years, Microsoft was famous for its hostility toward Free and Open Source Software (FOSS). They reserved special disdain for Linux, the Unix-like operating system that first emerged in the 1990s. Linux arrived on the scene just as Microsoft was beginning to batter Unix with Windows NT. The Microsoft leadership at the time, especially Steve Ballmer, viewed Linux as an existential threat. They approached Linux with an “us versus them” mentality that was, at times, rabid.

It’s not news that times have changed and Microsoft with them. Instead of looking to destroy Linux and FOSS, Microsoft CEO Satya Nadella has embraced them.

Microsoft has begun to meld with the FOSS community, creating Linux-Windows combinations that were unthinkable in the Ballmer era.

In just the past few years Microsoft has:
Continue reading Microsoft Loves Linux and FOSS Because of Developers