
IBM Presents “Change the Game: Winning with AI” in New York City


(Note: This blog is part of a multi-part series on this event and the related analyst event focused on IBM’s current status from an AI perspective.)

On September 13th, 2018, IBM held an event titled "Change the Game: Winning with AI." The event was hosted by ESPN's Hannah Storm and held at Terminal 5 in Hell's Kitchen, better known as a music venue where acts ranging from Lykke Li to Kali Uchis to Good Charlotte perform. In this rock star atmosphere, IBM showcased its current perspective on AI (artificial intelligence, not Amalgam Insights!).

IBM's Rob Thomas and ESPN's Hannah Storm discuss IBM Cloud Private for Data

This event comes at an interesting time for IBM. Since the public launch of IBM Watson on Jeopardy! in 2011, IBM has been a market leader in enterprise artificial intelligence and has spent billions of dollars establishing both IBM Watson and the broader AI market. However, part of IBM's challenge over the past several years was that enterprise understanding of AI was so nascent that there was no good starting point for developing machine learning, data science, and AI capabilities. In response, IBM built out many forms of AI, including:

  • IBM Watson as a standalone, Jeopardy!-like solution for healthcare and financial services,
  • Watson Developer Cloud to provide language, vision, and speech APIs,
  • Watson Analytics to support predictive and reporting analytics,
  • chatbots and assistants to support talent management and other practical use cases,
  • Watson Studio and Data Science Experience to support enterprise data science efforts to embed statistical and algorithmic logic into applications, and
  • evolutionary neural network design at the research level.

And, frankly, the velocity of innovation was difficult for enterprise buyers to keep up with, especially as diverse products were all labelled Watson and as buyers were still learning about technologies such as chatbots, data science platforms, and IBM's hybrid cloud computing options at a fundamental level. The level of external and market-facing education needed to support relatively low-revenue and experimental investments in AI was a difficult path for IBM to sustain (and, in retrospect, may have been better handled as a funded internal spin-off with access to IBM patents and technology). Consider that extremely successful startups can justify billion-dollar valuations based on $100 million in annual revenue, while IBM is judged on multi-billion-dollar revenue changes on a quarter-by-quarter basis. That juxtaposition makes it hard for public enterprises to support audacious and aggressive startup goals that may take ten years of investment and loss to build.

This "Change the Game" event was an important checkpoint for understanding IBM's current positioning on AI, and Amalgam Insights attended to learn about that positioning and the upcoming announcements intended to support enterprise AI.

With strategies telestrated by IBM's Janine Sneed and Daniel Hernandez, this event was an entertaining breakdown of case studies demonstrating the IBM analytics and data portfolio, interspersed with additional product and corporate presentations by IBM's Rob Thomas, Dinesh Nirmal, Reena Ganga, and Madhu Kochar. The event was professionally presented as Hannah Storm interviewed a wide variety of IBM customers, including:

  • Mark Vanni, COO of Trūata
  • Joni Rolenaitis, Vice President of Data Development and Chief Data Officer for Experian
  • Dr. Donna M. Wolk, System Director of Clinical and Molecular Microbiology for Geisinger
  • Guy Taylor, Executive Head of Data-Driven Intelligence at Nedbank
  • Sreesha Rao, Senior Manager of IT Applications at Niagara Bottling LLC
  • James Wade, Director of Application Hosting for GuideWell Mutual Holding Company
  • Mark Lack, Digital Strategist and Data Scientist for Mueller, Inc
  • Rupinder Dhillon, Director of Machine Learning and AI at Bell Canada

In addition, IBM demonstrated aspects of IBM Cloud Private for Data, its platform built for putting data to work for AI, as well as Watson Studio, IBM's data science platform, and design work aimed at improving enterprise access to data science and analytic environments.

Overall, Amalgam Insights saw this event as a public-friendly opportunity to introduce IBM's current capabilities in making enterprises AI-ready by providing the data, analytics, and data science products needed to prepare enterprise data ecosystems for machine learning, data science, and AI projects. In upcoming blogs, Amalgam Insights will cover IBM's current AI positioning in greater detail, along with the IBM announcements that will affect current and potential AI customers.


Todd Maddox Ph.D.’s Top Four Scientific Observations on HR Tech 2018

HR Tech gets bigger and bigger every year, and HR Tech 2018 was no exception: it broke all of the previous records. More importantly, the quality of the offerings and presentations, and the savvy of the clients, continue to grow. I had a great time at HR Tech 2018, and I am already looking forward to 2019.

It took some time to dig out from three days away from my desk, and more time to let my thoughts coalesce, but I am now ready to provide my insights on the event.

As a psychological and brain scientist with 25 years of basic science research under my belt, and a particular interest in personality, motivation, learning, and all things data, I resonate with events like HR Tech because my passion is all things "people." I find topics such as recruitment, interviewing, learning and development, leadership, succession planning, and talent management in general fascinating. Although I am sure others who attended HR Tech will highlight different topics, and acknowledging that I was only able to speak in detail with a dozen or so vendors, here are my Top Four Scientific Observations.

The Impact of Science Continues to Grow

Relevant Vendors That I Spoke With: Cornerstone, IBM, Infor, LTG, PeopleFluent, Phenom People, Saba, Skillsoft/Sumtotal, TalentQuest, Workday

I am happy to say that I see a growing interest in the importance of science, not only among vendors but also among their potential customers. Whether talent science, learning science, or data science, the methodology called "science" is growing in HR. Importantly, this is not just being used as a buzzword. In hiring and recruitment, the science of assessment is central; combine assessment with "big data" and amazing recruitment offerings follow. In Learning and Development, tools and technologies are being built that battle the brain's natural tendency to forget and that focus on presenting information within the attention span of the learner; these include microlearning, spaced training, and testing. In talent management, the science of personality is growing in importance. Personality is directly related to motivation and incentivization, and all three are being leveraged. The science of data and the use of analytics, machine learning, and artificial intelligence to garner actionable insights continue to grow in use and sophistication.

From a psychological and brain science perspective, we still have a long way to go, and we need to do a better job of mapping tools onto tasks, but progress is being made and vendors are listening to their clients.

The Growing Importance of People Skills and Leadership Training

Relevant Vendors That I Spoke With: LeadX, LTG, PeopleFluent, Saba, Skillsoft/Sumtotal, TalentQuest

I have written extensively on the difference between hard skills and people (aka soft) skills, the psychological processes and brain systems relevant to each, and the fact that very different tools are needed to train each. At times, I have felt like people hear but are not listening. The tide is turning. More and more HR professionals (vendors and customers) are embracing the importance of people skills. Whether it is growing concerns with automation, the extensive data suggesting that diversity increases revenue and workplace harmony, the #metoo movement, or more likely the combination of all three, HR is embracing the need for effective people skills training in organizations large and small. This is not just about compliance and box checking, but a recognition that people skills are critical in the workplace.

The growing emphasis on leadership training is also exciting, and people skills are central to this mission. I visited numerous vendors from small, innovative startups to much larger companies who are working to tackle the leadership development problem head-on. Although more work is needed, some of these offerings are truly remarkable.

I should also note that some vendors are developing leadership training programs focused specifically on developing leadership in women. For those of you who have read the likes of "Lean In," you know the challenges facing women in the corporate world, especially in leadership. Offerings targeted to these specific problems are welcome.

The Growing Recognition of the Value of Personality

Relevant Vendors That I Spoke With: IBM, Infor, PeopleFluent, Phenom People, LeadX, TalentQuest

As I alluded to earlier, the science of personality is growing in its relevance and sophistication in HR. I have conducted a number of studies of my own and have shown convincingly that personality is critical and affects many aspects of HR, including team dynamics, learning and development, effective management, and succession planning, to name a few. The number of vendors embracing the importance of personality is growing, and the quality of some of the offerings is truly amazing. Personality assessment blends psychology and data science and can offer insights that are immediately actionable and can make the difference between workplaces that are high in engagement, satisfaction, and retention, and those that are weak on these same metrics. With the constant battle to find, develop, nurture, and retain the right employees, factors such as personality will continue to grow in value.

There is still room for improvement and some intersections that the majority of HR vendors are not seeing, such as the intersection between personality, motivation, learning and incentivization, but I expect that those will come with time.

Virtual Reality Has Arrived

Relevant Vendors That I Spoke With: STRIVR, SumTotal, TalentQuest

For the past several years, I have been following the emergence of the immersive technology of virtual reality. My interest in this technology comes from my interest in Learning and Development. From a brain science perspective, it is clear that this technology is highly effective at engaging learning centers in the brain. Until recently, technological issues and equipment costs constrained virtual reality's commercial use. For example, a year or so ago, a good virtual reality system cost several thousand dollars and required a laptop and a cable tethered to the head-mounted display. Recently, Oculus released the Go, a standalone head-mounted display that costs under $300. A game changer!

Although the number of virtual reality offerings at HR Tech is still relatively small, the buzz around this technology is growing exponentially, and some key players are driving engagement. In addition, the quality of the offerings that are out there is impressive. These tools will be invaluable for collaboration and training. I am very excited to see this sector grow in 2019, and I fully expect the number of offerings at HR Tech 2019 to be significantly higher.

Conclusions

It is an exciting time for HR. That was clearly reflected in the enthusiasm, innovation, and quality of the offerings at HR Tech 2018. Skills that have often been given lip service (e.g., people skills) are getting the attention that they deserve. Constructs that derive directly from people, such as personality, are being leveraged for good. New and exciting technologies such as virtual reality are being introduced for more than gaming. And all of these are being built on a solid foundation of data and science. The future is bright for HR and HR Tech.


Learning Elastic’s Machine Learning Story at Elastic{ON} in Boston

Why is a Data Science and Machine Learning Analyst at Elastic’s road show when they’re best known for search? In early September, Amalgam Insights attended Elastic{ON} in Boston, MA. Prior to the show, my understanding of Elastic was that they were primarily a search engine company. Still, the inclusion of a deep dive into machine learning interested me, and I was also curious to learn more about their security analytics, which were heavily emphasized in the agenda.

In exploring Elastic's machine learning capabilities, I got a deep dive with Rich Collier, a Senior Principal Solutions Architect and machine learning specialist at Elastic. Elastic acquired Prelert, an incident management company with unsupervised machine learning capabilities, in September 2016 with the goal of incorporating real-time behavioral analytics into the Elastic Stack. In the two years since, integrating Prelert has grown Elastic's ability to act on time-series anomalies found in the Elasticsearch data store, resulting in an extension to the Elastic Stack called "Machine Learning" that is offered as part of their Platinum-level SaaS offerings.

Elastic Machine Learning users no longer have to define rules to identify abnormal time-series data, nor do they even need to code their own models – the Machine Learning extension analyzes the data to understand what "normal" looks like in that context, including what kinds of shifts can be expected over different periods of time, from point-to-point all the way up to seasonal patterns. From that, it learns when to throw an alert on encountering atypical data in real time, whether that data is log data, metrics, analytics, or a sudden upsurge in search requests for "$TSLA." Learning from the data rather than configuring blunt rules yields more granular precision and fewer false-positive alerts on anomalous data.
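
To make the general approach concrete, below is a minimal Python sketch of learn-the-baseline anomaly detection on time-series data. This is an illustration of the concept only, not Elastic's implementation; the hour-of-day seasonal profile, the three-sigma threshold, and the synthetic data are all assumptions made for the example.

```python
# Illustrative sketch only: model "normal" (including a simple hour-of-day
# seasonal profile) from historical data, then flag new points that deviate
# from it. This is not Elastic's implementation.
import random
from collections import defaultdict
from statistics import mean, stdev

def build_profile(history):
    """history: list of (hour_of_day, value) pairs from past metric/log data."""
    buckets = defaultdict(list)
    for hour, value in history:
        buckets[hour].append(value)
    # Per-hour baseline: mean and spread of the values observed at that hour.
    return {hour: (mean(vals), stdev(vals) if len(vals) > 1 else 0.0)
            for hour, vals in buckets.items()}

def is_anomalous(profile, hour, value, sigma=3.0):
    """Flag a point more than `sigma` deviations from its hour-of-day baseline."""
    baseline, spread = profile.get(hour, (value, 0.0))
    if spread == 0.0:
        return value != baseline
    return abs(value - baseline) > sigma * spread

# Two weeks of synthetic hourly request counts: higher during business hours.
random.seed(0)
history = [(h % 24, 100 + (20 if 9 <= h % 24 < 18 else 0) + random.gauss(0, 5))
           for h in range(24 * 14)]
profile = build_profile(history)

print(is_anomalous(profile, hour=3, value=450))   # True  -> unexpected 3am spike
print(is_anomalous(profile, hour=10, value=118))  # False -> within the normal range
```

A production system would also have to handle trend, multiple seasonalities, and continuous model updates as new data arrives, which is exactly the kind of work the extension takes off the analyst's plate.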

The configuration for the Machine Learning extension is simple and requires no coding experience; front-line business analysts can customize the settings via pull-down menus and other graphical form fields to suit their needs. To simplify the setup process even further, Elastic offers a list of “machine learning recipes” on their website for several common use cases in IT operations and security analytics; given how graphically oriented the Elastic stack is, I wouldn’t be surprised to see these “recipes” implemented as default configuration options in the future. Doing so would simplify the configuration from several minutes of tweaking individual settings to selecting a common profile in a click or two.

Elastic also stated that one long-term goal is to "operationalize data science for everyone." At the moment, that's a fairly audacious claim for a data science platform, let alone a company best known for search and search analytics. One relevant initiative mentioned in the keynote was the debut of the Elastic Common Schema, a common set of fields for ingesting data into Elasticsearch. Standardizing data collection makes it easier to correlate and analyze data through these relational touch points, and it opens up the potential for data science initiatives in the future, possibly through partnerships or acquisitions. But Elastic is not trying to be a general-purpose data science company right now; it is building on its core of search, logging, security, and analytics, and machine learning ventures are likely to fall within this context. Currently, that offering is anomaly detection on time series data.
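
As a rough illustration of why a shared schema helps with correlation, the sketch below normalizes two differently shaped log records into one set of field names before indexing. The target field names are placeholders invented for the example, not the actual Elastic Common Schema definitions.

```python
# Illustrative only: map two differently shaped log records onto one shared
# set of field names so they can be correlated in a single index. The target
# field names are placeholders, not the actual Elastic Common Schema fields.
def normalize_firewall(record):
    return {"timestamp": record["ts"],
            "source_ip": record["src"],
            "event_action": record["action"]}

def normalize_webserver(record):
    return {"timestamp": record["time"],
            "source_ip": record["client_addr"],
            "event_action": record["method"]}

events = [
    normalize_firewall({"ts": "2018-09-01T12:00:00Z", "src": "10.0.0.5", "action": "deny"}),
    normalize_webserver({"time": "2018-09-01T12:00:01Z", "client_addr": "10.0.0.5", "method": "GET"}),
]

# With a common schema, one query can correlate both sources by source_ip.
by_ip = {}
for event in events:
    by_ip.setdefault(event["source_ip"], []).append(event["event_action"])
print(by_ip)  # {'10.0.0.5': ['deny', 'GET']}
```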

Providing users who aren’t data scientists with the ability to do anomaly detection on time series data may be just one step in one category of data modeling, but having that sort of specialized tool accessible to data and business analysts would help organizations better understand the “periodicity” of typical data. Retailers could track peaks and valleys in sales data to understand purchasing patterns, for example, while security analysts could focus on responding to anomalies without having to define what anomalous data looks like ahead of time as a collection of rules.

Elastic's focus on making this one specific machine learning tool accessible to non-data-scientists reminded me of Google's BigQuery ML initiative – take one very specific type of machine learning query and operationalize it for easy use by data and business analysts to address common business questions. Then, once they've perfected that tool, they can move on to building the next one.

Improving the quality of data acquired and stored in Elasticsearch from search results will be key to improving the user experience. I spoke with Steve Kearns, the Senior Director of Product Management at Elastic, who delivered the keynote speech with a sharp focus on "scale, speed, and relevance" for improving search results. Better search data can be used to optimize machine learning applied to that data. Because Elastic has focused the Machine Learning extension on anomaly detection across time series data – data Elasticsearch specializes in collecting, such as log data – this can support more accurate data analysis and better business results for data-driven organizations.

Overall, it was intriguing to see how machine learning is being incorporated into IT solutions that aren't directly supporting data science environments. Enabling growth in the use of machine learning tactics effectively spreads the use of data across an organization, bringing companies closer to the advantages of the data-driven ideal. Elastic's Machine Learning capability potentially opens up a specific class of machine learning to a broader spectrum of Elastic users without requiring them to acquire coding and statistics backgrounds; this positions Elastic as a provider of a specific type of machine learning service today and makes it more plausible to consider them as a provider of broader machine learning services in the future.


EPM at a Crossroads: Big Data Solutions

Key Stakeholders: Chief Information Officers, Chief Financial Officers, Chief Operating Officers, Chief Digital Officers, Chief Technology Officers, Accounting Directors and Managers, Sales Operations Directors and Managers, Controllers, Finance Directors and Managers, Corporate Planning Directors and Managers

Analyst-Recommended Solutions: Adaptive Insights, a Workday Company, Anaplan, Board, Domo, IBM Planning Analytics, OneStream, Oracle Planning and Budgeting, SAP Analytics Cloud

In 2018, the Enterprise Performance Management market is at a crossroads. This market has emerged from a foundation of financial planning, budgeting, and forecasting solutions designed to support basic planning and has evolved as the demands for business planning, risk and forecasting management, and consolidation have increased over time. In addition, the EPM market has expanded as companies from the financial consolidation and close markets, business performance management markets, and workflow and process automation markets now play important roles in effectively managing Enterprise Performance.

In light of these challenges, Amalgam Insights is tracking six key areas where Enterprise Performance Management is fundamentally changing: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience.

Supporting Big Data for Enterprise Performance Management

Amalgam Insights has identified two key drivers repeatedly mentioned by finance departments seeking to support Big Data in Enterprise Performance Management. First, EPM solutions must support larger stores of data over time to fully analyze financial data along with the plethora of additional business data needed to support strategic business analysis. Growing data has become an increasingly important issue as enterprises face billion-row tables and outgrow the traditional cubes and datamarts used to manage basic financial data. The sheer scale of financial and commerce-related transactional data requires a Big Data approach at the enterprise level to support timely analysis of planning, consolidation, close, risk, and compliance.

In addition, these large data sources need to integrate with other data sources and references to support integrated business planning to align finance planning with sales, supply chain, IT, and other departments. As the CFO is increasingly asked to be not only a financial leader, but a strategic leader, she must have access to all relevant business drivers and have a single view of how relevant sales, support, supply chain, marketing, operational, and third-party data are aligned to financial performance. Each of these departments has its own large store of data that the strategic CFO must also be able to access, allocate, and analyze to guide the business.

New EPM solutions must evolve beyond traditional OLAP cubes to hybrid data structures that scale effectively to the immense volume and variety of data involved. Amalgam Insights notes that EPM solutions focused on large data environments take a variety of relational, in-memory, columnar, cloud computing, and algorithmic approaches to define categories on the fly and to store, structure, and analyze financial data.

To manage these large stores of data effectively from a financial, strategic, and analytic perspective, Amalgam Insights recommends the following companies, which have been innovative in supporting immense and varied planning and budgeting data environments, based on briefings and discussions held in 2018:

  • Adaptive Insights, a Workday Company
  • Anaplan
  • Board
  • Domo
  • IBM Planning Analytics
  • OneStream
  • Oracle Planning and Budgeting
  • SAP Analytics Cloud

Adaptive Insights

Adaptive Insights' Elastic Hypercube is an in-memory, dynamic caching and scaling solution announced in July 2018. Amalgam Insights saw a preview of this technology at Adaptive Live and was intrigued by the efficiency that Adaptive Insights brings to models: selectively recalculating only the dependent changes as a model is edited, using a dynamic caching approach so that memory and computational cycles are consumed only when data is being accessed, and supporting both tabular and cube data structures. This data format will also be useful to Adaptive Insights as a Workday company in building out the various departmental planning solutions that will be accretive to Workday's positioning as an HR and ERP solution after Workday's June acquisition (covered in our Market Milestone).

Anaplan

Anaplan's Hyperblock is an in-memory engine combining columnar, relational, and OLAP approaches. This technology is the basis of Anaplan's platform and allows Anaplan to rapidly support large planning use cases. By developing composite dimensions, Anaplan users can pre-build a broad array of combinations that can be used to repeatably deploy analytic outputs. As noted in our March blog, Anaplan has been growing quickly based on its ability to support new use cases, and it has recently filed its S-1 to go public.

Board

Board goes to market both as an EPM solution and as a general business intelligence solution. Its core technology is the Hybrid Bitwise Memory Pattern (HBMP), a proprietary in-memory data management solution designed to algorithmically map each bit of data and then store this map in memory. In practice, this approach lets many users access and edit information without lag or processing delays. It also lets Board choose which aspects of the data to handle in-memory or dynamically in order to prioritize computing assets.

Domo

Domo describes its Adrenaline engine as an "n-dimensional, highly concurrent, exo-scale, massively parallel, and sub-second data warehouse engine" to store business data. This is accompanied by VAULT, Domo's data lake, which supports data ingestion and serves as a single store of record for business analysis. Amalgam Insights covered the Adrenaline engine as one of Domo's "Seven Samurai" in our March report Domo Hajimemashite: At Domopalooza 2018, Domo Solves Its Case of Mistaken Identity. Behind the buzzwords, these technologies allow Domo to provide executive reporting capabilities across a wide range of departmental use cases in near-real time. Although Domo is not a budgeting solution, it is focused on portraying enterprise performance for executive consumption and should be considered by organizations seeking business-wide visibility into key performance metrics.

IBM Planning Analytics

IBM Planning Analytics runs on Cognos TM1 OLAP in-memory cubes. To increase performance, these cubes use sparse memory management, where missing values are ignored and empty values are not stored. In conjunction with IBM's approach of caching analytic outcomes in-memory, this allows IBM to improve performance compared to standard OLAP, an approach that has been validated at scale by a variety of IBM Planning Analytics clients. Amalgam Insights presented on the value of IBM's approach at IBM Vision 2017 both from a data perspective and from a user interface perspective, the latter of which will be covered in a future blog.
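
To illustrate why sparsity matters in planning cubes, here is a conceptual Python sketch in which only populated cells are stored, keyed by their dimension coordinates, so empty intersections consume no memory. This is an illustration of the general idea, not the TM1 storage engine; the dimensions and figures are invented for the example.

```python
# Conceptual sketch: a sparse cube stores only populated cells, keyed by
# dimension coordinates, so empty intersections cost nothing. This is not
# the TM1 engine, just an illustration of sparse memory management.
class SparseCube:
    def __init__(self, dimensions):
        self.dimensions = dimensions   # e.g. ("account", "region", "month")
        self.cells = {}                # {(coordinate, ...): value}

    def write(self, coords, value):
        if value:                      # empty/zero values are simply not stored
            self.cells[coords] = value
        else:
            self.cells.pop(coords, None)

    def read(self, coords):
        return self.cells.get(coords, 0.0)   # missing cells read back as zero

    def slice_total(self, **fixed):
        """Aggregate all stored cells matching the fixed coordinates."""
        index = {dim: i for i, dim in enumerate(self.dimensions)}
        return sum(value for coords, value in self.cells.items()
                   if all(coords[index[dim]] == val for dim, val in fixed.items()))

# A planning cube with millions of possible intersections may hold only a
# handful of populated cells; only those consume memory.
cube = SparseCube(("account", "region", "month"))
cube.write(("Travel", "EMEA", "2018-09"), 12500.0)
cube.write(("Travel", "APAC", "2018-09"), 8300.0)
print(cube.read(("Travel", "LATAM", "2018-09")))             # 0.0 -> nothing stored
print(cube.slice_total(account="Travel", month="2018-09"))   # 20800.0
```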

OneStream

OneStream provides in-memory processing and stateless servers to support scale, but its approach to analytic scale is based on virtual cubes and extensible dimensions. These allow organizations to keep building dimensions over time that tie back to a corporate level and to create logical views of a larger data store to support specific financial tasks such as budgeting, tax reporting, or financial reporting. OneStream's approach is focused on financial use rather than general business planning.

Oracle Planning and Budgeting Cloud

Oracle Planning and Budgeting Cloud Service is based on Oracle Hyperion, the market leader in Enterprise Performance Management from a revenue perspective. The Oracle Cloud is built on Oracle Exalogic Elastic Cloud, Oracle Exadata Database Machine, and the Oracle Database, which give the Planning and Budgeting application a strong in-memory foundation and an algorithmic approach to managing storage, compute, and networking. This effectively allows Oracle to support planning models at massive scale.

SAP Analytics Cloud

SAP Analytics Cloud, SAP's umbrella product for planning and business intelligence, uses SAP HANA, an in-memory columnar relational database, to provide real-time access to data and to accelerate both modelling and analytic outputs based on all relevant transactional data. This approach is part of SAP's broader HANA strategy to encapsulate both analytic and transactional processing in a single database, effectively making all data reportable, modellable, and actionable. SAP has also recently partnered with Intel on Optane DC persistent memory to support larger data volumes for enterprises requiring larger persistent data stores for analytic use.

This blog is part of a multi-part series on the evolution of Enterprise Performance Management and key themes that the CFO office must consider in managing holistic enterprise performance: Big Data, Robotic Process Automation, API connectivity, Analytics and Data Science, Vertical Solutions, and Design Thinking for User Experience. If you would like to set up an inquiry to discuss EPM or provide a vendor briefing on this topic, please contact us at info@amalgaminsights.com to set up time to speak.

Last Blog: EPM at a Crossroads
Next Blog: Robotic Process Automation and Machine Learning in EPM


The “Unlearning” Dilemma in Learning and Development

Key Stakeholders: IT Managers, IT Directors, Chief Information Officers, Chief Technology Officers, Chief Digital Officers, IT Governance Managers, and IT Project and Portfolio Managers.

Top Takeaways: One critical barrier to full adoption is the poorly addressed problem of unlearning. Anytime a new piece of software achieves some goal with a set of motor behaviors that is at odds with a well-established, habitized motor program, the learner struggles, shows frustration, and is less likely to onboard effectively. Learning scientists can remedy this problem and can help IT professionals build effective training tools.

Introduction

In my lifetime I have seen amazing advances in technology. I remember the days of overhead projectors, typewriters and white out. Now our handheld “phone” can project a high-resolution image, can convert spoken word into text, and can autocorrect errors.

The corporate world is dominated by new technologies that are making our lives easier and our workplaces more effective. Old technologies are being updated regularly, and new, innovative, disruptive technologies are replacing them. It is an exciting time. Even so, this fast-paced technological change requires continuous learning and unlearning, and this is where we often fall short.

Despite the fact that we all have experience with new technologies that have made our lives easier, and we applaud those advances, a large proportion of us (myself included) fear the introduction of a new technology or next release of our favorite piece of software. We know that new and improved technologies generally make us more productive and make our lives easier, at least in the long-run, but at the same time, we dread that new software because the training usually “sucks”.

This is a serious problem. New and improved technology development is time-consuming and expensive. The expectation is that all users will onboard effectively and reap the benefits of the new technology together. When many users actively avoid the onboarding process (yours truly included) this leads to poor adoption, underutilization of these powerful tools, and reduced profits. I refer to this as the “adoption gap”.

Why does this “adoption gap” exist and what can we do about it?

There are two reasons for the “adoption gap” and both can be addressed if training procedures are developed that embrace learning science—the marriage of psychology and brain science.

First, all too often a software developer or someone on their team (or another team) is tasked with developing a training manual or training tool. Although these individuals are amazing at their jobs, they are not experts in training content development and delivery and often build ineffective tools. The relevant stories that I can tell from my 25-year career as a University Professor are many. I can't count the number of times I received an email explaining (in paradoxically excruciating yet incomprehensible detail) how a new version of an existing piece of software had changed, or how some new technology was going to be unveiled that would replace a tool that I currently used. I was provided with a schedule of "training workshops" to attend, or a link to some unintelligible "training manual".

Because the training materials were not easy to use and not immediately actionable, I and my colleagues did everything that we could to stick with the legacy (and currently workable) solution or to get small-group training from someone who knew how to use the technology. Although this worked for us, it was highly sub-optimal from the perspective of the University and it adversely affected productivity.

When the training tool is ineffective, employees will fail to use the new technology effectively and will quickly revert back to the legacy software that works and feels comfortable. I discussed this problem in a recent blog post and offered some specific suggestions for improving technology training and development, including surveying users, embracing microlearning, using multiple media, and incorporating knowledge checks. Even so, that blog ignored the second major cause of the "adoption gap" and obstacle to IT onboarding: unlearning.

The Importance of Unlearning

In this report I take a deeper dive and explore a problem that is poorly understood in Learning and Development. This is the central problem of unlearning. Simply put, all too often a novel technology or software release will introduce new ways of achieving a goal that are very different from, or at odds with, the old way of achieving that goal.

Consider a typical office setting in which an employee spends several hours each day using some technology (e.g., Photoshop, a word processor, some statistics package). The employee has been using this technology on a daily basis for months, possibly years, and using it has become second nature. The employee has become so proficient with this tool that their behaviors and motor interactions with the technology have become habitized. The employee does not even have to "think" about how to cut and paste, or conduct a simple regression, or add a layer to their project. They have performed these actions so many times that they have developed "muscle memory." The brain develops "muscle memory" and habits that reduce the working memory and attention load and leave those valuable resources for more complex problems, like interpreting the outcome of a regression or visualizing the finalized Photoshop project that we have in mind.

Now suppose that the new release changes the motor program associated with cutting and pasting, the drop-down menu selections needed to complete a regression, or the button clicks to add, delete, or move project layers. In this case, your habits and muscle memory are telling you one thing, but you have to do something else with the new software. This is challenging, frustrating, and demanding of working memory and attention. One has to use inhibitory control so as not to initiate the habit, instead think hard to initiate the new set of behaviors, and do this over and over again until the new behaviors become habitized. This takes time and effort. Many (yours truly included) will abandon this process and fall back on what "works". Unlearning habits is much more challenging than learning new behaviors.

Key Recommendations to Support Unlearning

This is an area where learning science can be leveraged. An extensive body of psychological and brain science research (much of it my own) has been conducted over the past several decades that provides specific guidelines on how to solve the problem of unlearning. Here are a few suggestions for addressing this problem.

Recommendation 1: Identify Technology Changes That Impact High-Frequency "Habits". When onboarding a new software solution or upgrading an existing one, the IT team should audit usage to identify high-frequency functionality and monitor users' behavior. Users should also be encouraged to provide feedback on their interactions with the software and to identify functions that they believe have changed. Of course, IT personnel could be proactive and audit high-frequency behaviors before purchasing new software; this information could guide the purchasing process. IT professionals must understand that although the technology as a whole may be improved with each new release, there is often at least one high-frequency task that changes and requires extensive working memory and attentional resources to overcome. Every such instance is a chance for an onboarding failure.
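
As a minimal sketch of what such an audit could look like, the Python snippet below tallies action frequencies from usage telemetry to surface the habits most at risk. The log format, action names, and threshold are hypothetical; real telemetry would come from the application or desktop analytics tooling in use.

```python
# Hypothetical sketch: tally usage telemetry to surface the high-frequency
# actions most likely to be "habitized". Log format, action names, and the
# threshold are invented for illustration.
from collections import Counter

usage_log = [
    {"user": "a.chen",  "action": "copy_paste"},
    {"user": "a.chen",  "action": "copy_paste"},
    {"user": "a.chen",  "action": "run_regression"},
    {"user": "b.ortiz", "action": "copy_paste"},
    {"user": "b.ortiz", "action": "add_layer"},
    {"user": "b.ortiz", "action": "copy_paste"},
]

action_counts = Counter(entry["action"] for entry in usage_log)

# Actions above an (arbitrary) frequency threshold get flagged for review:
# if a new release changes how any of them work, plan targeted unlearning.
HABIT_THRESHOLD = 3
habit_candidates = [action for action, count in action_counts.most_common()
                    if count >= HABIT_THRESHOLD]
print(habit_candidates)  # ['copy_paste']
```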

Recommendation 2: Apply Spaced Training and Periodic Testing to Unlearn High-Frequency Habits. Once the high-frequency habits that have changed are identified, unlearning procedures should be incorporated to speed the unlearning and new-learning process. Spaced training and periodic testing can be implemented to speed this process. Details can be found here, but briefly, learning (and unlearning) procedures should be developed that target these habits for change. These training modules should be introduced across training sessions spaced over time (usually hours or days apart). Each training session should be preceded by a testing protocol that identifies areas of weakness requiring additional attention. This provides critical information for the learner and allows them to see objective evidence of progress. In short, habits cannot be overcome in a single training session. However, the speed of learning and unlearning can be increased when spaced training and testing procedures are introduced.
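
For concreteness, here is a minimal sketch of what a spaced training-and-testing schedule might look like in code. The expanding intervals, the pre-test step, and the dates are illustrative assumptions, not a prescription drawn from the research literature.

```python
# Minimal sketch of a spaced training-and-testing schedule for a habit that
# must be unlearned. The expanding intervals and session contents are
# illustrative assumptions only.
from datetime import date, timedelta

def schedule_sessions(start, gaps_in_days=(1, 2, 4, 7)):
    """First session at `start`, then one follow-up after each expanding gap.
    Each session begins with a short test that targets remaining weak spots."""
    plan = [{"date": start, "steps": ["pre-test", "targeted training"]}]
    current = start
    for gap in gaps_in_days:
        current += timedelta(days=gap)
        plan.append({"date": current, "steps": ["pre-test", "targeted training"]})
    return plan

for session in schedule_sessions(date(2018, 10, 1)):
    print(session["date"], "->", ", ".join(session["steps"]))
# Sessions land on 2018-10-01, 10-02, 10-04, 10-08, and 10-15.
```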

Recommendation 3: Automate High-Frequency Tasks to Avoid the Need for Unlearning. The obvious solution to the learning and unlearning problem is to minimize the number of motor procedural changes across software releases or with new technology. A straightforward method is to ask employees which tasks they have locked into "muscle memory". Once these are identified, software developers and experts in UI/UX could work to automate or optimize those processes through tools such as optimized macros, machine learning-based optimization, or new functionality. The time saved onboarding users should be significant, and the number of users abandoning the process should be minimized. Although aspirational, with the amount of "big data" available to developers and the rich psychological literature on motor behavior, this is a solvable problem. We simply need to recognize the problem that employees have been aware of for decades, and acknowledge that the problem must be solved.

By taking these recommendations into account, technology onboarding will become more efficient, technology users will become more efficient, and companies will be better positioned to extract maximum value from their investment in new and transformative technologies.


FloQast Supports ASC 606 Compliance by Providing a Multi Book Close for Accountants

On September 11, 2018, FloQast announced multi-book accounting capabilities designed to help organizations support ASC 606-compliant financial closes by supporting dual reporting on revenue recognition and related expenses. As Amalgam Insights has covered in prior research, the ASC 606/IFRS 15 standards for recognizing revenue on subscription services are currently required for all public companies and will be the standard for private companies as of the end of 2019.

Currently, FloQast supports multi-book accounting for Oracle NetSuite and Sage Intacct, two strong mid-market finance solutions that have invested in their subscription billing capabilities. This capability is available to FloQast Business, Corporate, and Enterprise customers at no additional cost. The support of these solutions also reflects the investment that each of these ERP vendors has made in subscription billing: NetSuite's 2015 acquisition of subscription billing solution Monexa eventually led to the launch of NetSuite SuiteBilling, while Sage Intacct developed its subscription billing capabilities organically in 2015.

Why This Matters For Accounting Teams

Currently, accounting teams compliant with ASC 606 are required to provide two sets of books associated with each financial close. Organizations seeking to accurately reflect their finances from both a legacy and a current perspective either need to duplicate efforts to provide compliant accounting outputs or use an accounting solution that will accurately create separate sets of close results. By simultaneously creating dual-level close outputs, organizations can avoid the challenge of creating detailed journal entries to explain discrepancies within a single close instance.
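
As a highly simplified sketch of the dual-book idea, the snippet below runs the same contract through two recognition policies to produce two parallel sets of period figures from one close process. The policies and numbers are placeholders for illustration; they are not a statement of what ASC 605 or ASC 606 requires for any particular contract.

```python
# Highly simplified sketch: the same contract run through two recognition
# policies yields two parallel books from one close. The policies below are
# placeholders, not an interpretation of ASC 605 or ASC 606.
def recognize_over(total, periods, term):
    """Spread the contract value over `periods` months of a `term`-month year."""
    per_period = round(total / periods, 2)
    return [per_period] * periods + [0.0] * (term - periods)

contract_value, term = 12000.00, 12
books = {
    "legacy basis": recognize_over(contract_value, periods=12, term=term),
    "new basis":    recognize_over(contract_value, periods=6, term=term),
}

# One close process, two sets of period figures reported side by side.
for name, schedule in books.items():
    print(f"{name}: month 1 = {schedule[0]:.2f}, full year = {sum(schedule):.2f}")
```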

Recommendations for Accounting Teams with ASC 606 Compliance Requirements

This announcement has a couple of ramifications for mid-market enterprises and organizations that are either currently supporting ASC 606 as public companies or preparing to support ASC 606 as private companies.

First, Amalgam Insights believes that accounting teams using either Oracle NetSuite or Sage Intacct should adopt FloQast as a relatively low-cost solution to the challenge of duplicate ASC 606 closes. Currently, this functionality is most relevant to Oracle NetSuite and Sage Intacct customers with significant ASC 606 accounting challenges. To understand why, consider the basic finances of this decision.

Amalgam Insights estimates that, based on FloQast's current pricing of $125 per month for business accounts or $150 per month for corporate accounts, FloQast will pay for itself with productivity gains in any accounting department where an organization spends four or more man-hours per month creating duplicate closes. This return is in addition to the existing ROI associated with financial close that Amalgam Insights has previously tracked for FloQast customers in our Business Value Analysis. In that document, we found that the FloQast customers interviewed saw a 647% ROI in their first year of deployment by accelerating and simplifying their close workflows and improving team visibility into the current status of financial close.
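
For readers who want to check that break-even claim, here is a back-of-the-envelope version of the arithmetic. The fully loaded hourly cost of an accountant is an assumed figure for illustration, not a number taken from FloQast or from the Business Value Analysis.

```python
# Back-of-the-envelope break-even check. The loaded hourly cost is an
# assumption for illustration, not a figure from FloQast or Amalgam Insights.
floqast_monthly_cost = 150.00   # corporate-tier pricing cited above
assumed_hourly_cost = 40.00     # assumed fully loaded accountant cost per hour

breakeven_hours = floqast_monthly_cost / assumed_hourly_cost
print(f"Break-even at about {breakeven_hours:.2f} hours of duplicate-close work per month")
# ~3.75 hours, consistent with the four-or-more-hours threshold above
```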

Second, accounting teams should generally expect to support multi-book accounting for the foreseeable future. Although ASC 606 is now a current standard, financial analysts and investors seeking to conduct historical analysis of a company for investment, acquisition, or partnership will want to conduct a consistent "apples-to-apples" comparison of finances across multiple years. Until your organization has three full years of audited financial statements under ASC 606, it will likely have to maintain multiple sets of books. Given that most public organizations started using ASC 606 in 2018, this means having a plan for multiple sets of books until 2020. For private organizations, this may mean an additional year or two, given that mandatory compliance starts at the end of 2019. Companies that avoid preparing for the reality of dual-level closes over the next couple of years will spend significant accountant hours on easily avoidable work.

If you would like to learn more about FloQast, the Business Value Analysis, or the current vendor solution options for financial close management, please contact Amalgam Insights at info@amalgaminsights.com.


Amalgam Insights Issues Vendor SmartList™ of Software Solutions That Help Build Better Leadership Brains

BOSTON, September 11, 2018 — A new report from industry analyst firm Amalgam Insights cites nine leaders in software solutions designed to effectively train the hard skills, people skills, and situational awareness of leadership.

The Vendor SmartList™, authored by Amalgam Insights Research Fellow Todd Maddox, Ph.D., takes on added significance given the increasing number of news reports about inappropriate conduct by corporate leaders or the lack of trained behavioral skills among a company’s employees.

“Companies must continuously educate their workers, top to bottom, about the ‘what,’ ‘how,’ and ‘feel’ of effective leadership by leveraging brain science to ensure that their behavior doesn’t land them in trouble,” Maddox says. “The companies in our report have succeeded in mapping these systems to the leadership processes needed in today’s business environment.”

Maddox lists nine companies as leaders, with comments about each company’s strength:
• CrossKnowledge: “Path to Performance uses digital solutions that can combine with facilitator training and a dedicated dashboard to provide effective leadership training paths.”
• Development Dimensions International: “DDI uses a broad range of online and offline solutions, tools for practice and simulation, and ‘what if’ scenarios to train leadership skills that meet tomorrow’s business challenges.”
• Fuse Universal: “(Its) use of science and analytics help clients build content that reflects their company DNA, vision and brand.”
• Grovo: “Its use of microlearning content and training programs that power modern learning and improve business outcomes.”
• Learning Technologies Group: “Its broad portfolio of providers, combined with its emphasis on rich scenario-based training and the importance of business outcomes.”
• Rehearsal: “The use of video role-play training that helps leaders improve their people skills through practice, coaching and collaboration.”
• Skillsoft: “Its combination of content and delivery tools that effectively train hard skills, people skills and situational awareness in leadership.”
• TalentQuest: “The use of personality and behavioral science to empower leaders to better manage, coach and develop their people.”
• Valamis: “Its use of technology and data science to build a system for each client that delivers learning aligned with the client’s business goals.”

Maddox cautions, however, that companies deploying any of the systems listed above, or others, must be prepared to do so on an ongoing basis. “One-off training does not work,” he notes.

The report is available for download at https://www.amalgaminsights.com/product/vendor-smartlist-2018s-best-leadership-training.


Webinar On Demand: Optimizing Leadership Training and Development by Leveraging Learning Science

On September 6, 2018, Amalgam Insights Learning Scientist Research Fellow Todd Maddox, Ph.D., presented a webinar focused on "The What, How, and Feel of Leadership Brain Training".

By watching this talk, you can bring back to your organization a better understanding of how psychology and brain science can be leveraged to provide a roadmap for successfully training leaders and managers at all levels. In this era of digital transformation, where organizations rely increasingly on cross-functional and deeply collaborative teams, leadership is becoming more distributed and employees are taking on leadership roles much earlier in their careers.

Combine this with some of the recent corporate crises (#metoo, unconscious bias, discrimination) and effective leadership training becomes even more important. The overriding aim of this talk is to examine leadership training and development from a learning science perspective—the marriage of psychology and brain science—and to identify procedures that optimize leadership training.

To watch this webinar, view it below in the embedded viewer or click through to watch "The What, How, and Feel of Leadership Brain Training".


Google Grants $9 Million in Google Cloud Platform Credits to Kubernetes Project

Tom Petrocelli, Amalgam Insights Research Fellow

Kubernetes has, in the span of a few short years, become the de facto orchestration software for containers. As recently as two years ago there were more than a half-dozen orchestration tools vying for the top spot; now there is only Kubernetes. Even the Linux Foundation's other orchestrator project, Cloud Foundry Diego, is starting to give way to Kubernetes. Part of the success of Kubernetes can be attributed to the support of Google. Kubernetes emerged out of Google, and Google has continued to bolster the project even as it fell under the auspices of the Linux Foundation's CNCF.

On August 29, 2018, Google announced that it is giving $9M in Google Cloud Platform (GCP) credits to the CNCF Kubernetes project. This is being hailed by both Google and the CNCF as a major show of support, and $9M is a lot of money, even if it is in credits. However, let's unpack this announcement a bit more and see what it really means.


Data Science Platforms News Roundup, August 2018

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Anaconda, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta.
