
14 Key Trends for Surviving 2023 as an IT Executive

2023 is going to be a tough year for anyone managing technology. As we face the repercussions of inflation and high interest rates and the tech bubble starts to burst, we are seeing a combination of hiring freezes, increased focus on core business activities, and the hoary request to “do more with less.”

Behind the cliche of doing more with less is the need to actually become more efficient with tech usage. This means adopting a FinOps (Financial Operations) strategy for cloud to go with your existing Telecom FinOps (aka telecom expense) and SaaS FinOps (aka SaaS Management) strategies. And it means being prepared for new spend category challenges, as companies will need to invest in technology to get work done at a time when it is harder to hire the right person at the right time. Here is a quick preview of our predictions.

 

14 Key Predictions for the IT Executive in 2023

To get the details on each of these trends and predictions and understand why they matter in 2023, download this report at no cost by filling out this quick form to join our low-volume bi-monthly mailing list. (Note: If you do not wish to join our mailing list, you can also purchase a personal license for this report.)


What To Watch Out For As GPT Leads a New Technological Revolution

2022 was a banner year for artificial intelligence technologies reaching the mainstream. After years of frustration with the likes of Alexa, Cortana, and Siri, and an inability to understand the value of machine learning other than as a vague backend technology for the likes of Facebook and Google, 2022 brought us AI-based tools that were understandable at a consumer level. From our perspective, the most meaningful of these were two products created by OpenAI: DALL-E and ChatGPT, which expanded the concept of consumer AI from a simple search or networking capability to a more comprehensive and creative approach for translating sentiments and thoughts into outputs.

DALL-E (and its successor DALL-E 2) is a system that can create visual images based on text descriptions. The models behind DALL-E look at relationships between existing images and the text metadata used to describe those images. Based on these titles and descriptions, DALL-E uses diffusion models that start with random pixels and iteratively refine them into generated images matching the descriptions. This area of research is by no means unique to OpenAI, but opening up a creative tool such as DALL-E to the public is novel. Although the outputs are often both interesting and surprisingly different from what one might have imagined, they are not without their issues. For instance, the legal ownership of DALL-E-created graphics is not clear, since OpenAI claims to own the images produced, but those images are often based on other copyrighted images. One can imagine that, over time, an artistic sampling model may emerge similar to the music industry's, where licensing contracts are used to manage the usage of copyrighted material. But this will require greater visibility into the lineage of AI-based content creation and the data used to support graphic diffusion. Until this significant legal question is solved, Amalgam Insights believes that the commercial usage of DALL-E will be challenging to manage. This is somewhat reminiscent of the challenges Napster faced at the end of the 20th century as a technology that both transformed the music industry and had to deal with the challenges of a new digital frontier.
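The iterative-refinement idea behind diffusion can be sketched at a toy level. This is an illustration only, not OpenAI's actual implementation: the blend-toward-a-target `denoise_step` below is a hypothetical stand-in for the trained, text-conditioned denoising network a real model uses.

```python
import random

def denoise_step(pixels, step, total_steps):
    # Hypothetical stand-in for the denoiser: a real diffusion model uses a
    # trained neural network, conditioned on the text prompt, to predict and
    # remove noise at each step. Here we simply blend every pixel toward a
    # fixed target value to show the shape of the loop.
    target = 0.5
    blend = (step + 1) / total_steps
    return [p * (1 - blend) + target * blend for p in pixels]

def generate(n_pixels=64, total_steps=10, seed=0):
    rng = random.Random(seed)
    pixels = [rng.random() for _ in range(n_pixels)]  # start from pure noise
    for step in range(total_steps):
        pixels = denoise_step(pixels, step, total_steps)
    return pixels
```

The key structural point survives the simplification: generation begins with random values and ends with a coherent output, with the text description steering each refinement step in the real system.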

But the technology that has taken over the zeitgeist of technology users is ChatGPT, along with related use cases associated with GPT (Generative Pre-trained Transformer), an autoregressive language model trained on roughly 500 billion words drawn from the web, Wikipedia, and books. And it has become the favorite plaything of many a technologist. What makes ChatGPT attractive is its ability to take requests from users asking questions with some level of subject matter specificity or formatting and to create responses in real time. Here are a couple of examples from both a subject matter and creative perspective.

Example 1: Please provide a blueprint for bootstrapping a software startup.

This is a bit generic and lacks some important details on how to find funding or sell the product, but it is in line with what one might expect to see in a standard web article on how to build a software product. The ending of this answer shows how the autogenerated text is likely drawing on prior web content built for search engine optimization, seeking to provide a polite conclusion based on junior high school lessons in writing the standard five-paragraph essay rather than a meaningful conclusion that provides insight. In short, it is basically a status quo average article with helpful information that should not be overlooked, but it is neither comprehensive nor particularly insightful for anyone who has ever actually started a business.

A second example of ChatGPT is in providing creative structural formats for relatively obscure topics. As you know, I’m an expert in technology expense management with over two decades of experience, and one of the big issues I see is, of course, the lack of poetry associated with this amazing topic. So, again, let’s go to ChatGPT.

Example 2: Write a sonnet on the importance of reducing telecom expenses

As a poem, this is pretty good for something written in two seconds. But it’s not a sonnet: sonnets are 14 lines, written in iambic pentameter (10-syllable lines split into 5 iambs, each an unstressed syllable followed by a stressed syllable) and split into three sections of four lines followed by a two-line section, with a rhyme scheme of ABAB, CDCD, EFEF, GG. So, there’s a lot missing there.

So, based on these examples, how should ChatGPT be used? First, let’s look at what this content reflects. The content here represents the average web and text content associated with the topic. With 500 billion words in the GPT-3 corpus, there is a lot of context to show what should come next for a wide variety of topics. Initial concerns about GPT-3 have centered on the challenge of answering questions on extremely specific topics that fall outside its training data. So let’s consider a topic I worked on in some detail back in my college days, using appropriate academic language to ask a version of Gayatri Spivak’s famous (in academic circles) question, “Can the subaltern speak?”

Example 3: Is the subaltern allowed to fully articulate a semiotic voice?

Considering that the language and topic here are fairly specialized, the introductory assumptions are descriptive but not incisive. The answer struggles with the “semiotic voice” aspect of the question in discussing the ability and agency to use symbols from a cultural and societal perspective. Again, the text provides a feeling of context that is necessary, but not sufficient, to answer the question. The focus here is on providing a short summary that introduces the issue before taking the easy way out by telling us what is “important to recognize” without really taking a stand. And, again, the conclusion sounds like something out of an antiseptic human resources manual, asking the reader to consider “different experiences and abilities” rather than the actual question regarding the ability to use symbols, signs, and assumptions. This is probably enough analysis at a superficial level, as the goal here isn’t to deeply explore postmodern semiotic theory but to test ChatGPT’s response on a specialized topic.

Based on these three examples, one should be careful about counting on ChatGPT to provide a comprehensive or definitive answer to a question. Realistically, we can expect ChatGPT to provide representative content for a topic based on what is on the web. The completeness and accuracy of a ChatGPT answer will depend on how often the topic has been covered online. The more complete an answer is, the more likely it is that the topic has already been covered in detail.

ChatGPT will provide a starting point for a topic and typically provide information that should be included to introduce the topic. Interestingly, this means that ChatGPT is significantly influenced by the preferences that have built online web text over the past decade of content explosion. The quality of ChatGPT outputs seems to be most impressive to those who treat writing as a factual exercise or content creation channel while those who look at writing as a channel to explore ideas may find it lacking for now based on its generalized model.

From a topical perspective, ChatGPT will probably have some basic context for whatever text is used in a query. It would be interesting to see the GPT-3 model augmented with specific subject matter texts that could prioritize up-to-date research, coding, policy, financial analysis, or other timely new content either as a product or training capability.

In addition, don’t expect ChatGPT to provide strong recommendations or guidance. The auto-completion that ChatGPT does is designed to show how everyone else has followed up on this topic. And, in general, people do not tend to take strong stances on web-based content or introductory articles.

Fundamentally, ChatGPT will do two things. First, it will make mediocre content ubiquitous. There is no need to hire people to write an “average” post for your website anymore, as ChatGPT and other technologies designed either to compete with it or to augment it will be able to do this easily. If your skillset is writing grammatically sound articles with little to no subject matter experience or practical guidance, that skill is now obsolete, as status quo and often-repeated content can now be created on command. This also means there is a huge opportunity to combine ChatGPT with common queries and use cases to create new content on demand. However, in doing so, users will have to be very careful not to plagiarize content unknowingly. This is an area where, just like with DALL-E, OpenAI will have to work on figuring out data lineage, trademark and copyright infringement, and appropriation of credit to support commercial use cases. ChatGPT also struggles with what are called “hallucinations,” where it makes up facts or sources because those words appear close to the topic in the various websites and books it was trained on. ChatGPT is a text generation tool that picks words based on how frequently they show up with other words. Sometimes that result will be extremely detailed and current; other times, it will look very generic and mix up related topics that are often discussed together.
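That frequency-driven word picking can be illustrated with a toy sketch. A bigram table here is a drastically simplified stand-in for GPT’s billions of learned parameters, and the mini-corpus is invented for illustration; the point is that the model continues text by sampling likely next words, not by evaluating whether claims are true.

```python
import random
from collections import defaultdict

# Invented mini-corpus standing in for GPT's web-scale training text.
corpus = "reduce telecom expenses by auditing invoices and auditing contracts".split()

# Count how often each word follows each other word in the corpus.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def generate(prompt, n_tokens=5, seed=0):
    # Repeatedly sample the next word in proportion to how often it followed
    # the previous word -- frequency-based completion, not reasoning.
    rng = random.Random(seed)
    tokens = prompt.split()
    for _ in range(n_tokens):
        choices = counts.get(tokens[-1])
        if not choices:
            break  # no continuation ever observed in the corpus
        words = list(choices)
        nxt = rng.choices(words, weights=[choices[w] for w in words])[0]
        tokens.append(nxt)
    return " ".join(tokens)
```

For instance, `generate("reduce")` deterministically continues “reduce telecom expenses by auditing …” because each of those words has only one observed successor; after “auditing,” the model picks “invoices” or “contracts” purely by observed frequency, which is the toy analog of a hallucination-prone completion.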

Second, this tool now provides a much stronger starting point for writers seeking to say something new or different. If your point of view is something ChatGPT can provide in two seconds, it is neither interesting nor new. To stand out, you need to provide greater insight, better perspective, or stronger directional guidance. This is an opportunity to improve your skills or to determine where your professional skills lie. ChatGPT still struggles with timely analysis, directional guidance, practical recommendations beyond surface-level perspectives, and combining mathematical and textual analysis (i.e., doing word problems, math-related case studies, or code review), so there is still an immense amount of opportunity for people to write better.

Ultimately, ChatGPT is a reflection of the history of written text creation, both analog and digital. Like all AI, it provides a view of how questions were answered in the past and produces an aggregate composite based on auto-completion. For topics with a basic consensus, such as how to build a product, this tool will be an incredible time saver. For topics with multiple conflicting opinions, ChatGPT will try to play both sides or all sides in a neutral manner. And for niche topics, ChatGPT will fake an answer at roughly a high school student’s understanding of the topic. Amalgam Insights recommends that all knowledge workers experiment with ChatGPT in their realm of expertise, as this tool and the market of products built on autogenerated text will play an important role in supporting the next generation of tech.


December 2: From BI to AI, Part 1 (AtScale, Dremio, Matillion, Informatica, Starburst)

Launches and Updates

AtScale’s Semantic Layer Launches on Google Cloud Marketplace

Semantic layer platform AtScale announced Thursday that it was now available on Google Cloud Marketplace. Mutual customers will be able to use AtScale on Google Cloud with services such as Google BigQuery, where they can run BI and OLAP workloads without needing to extract or move data. 

Dremio Announces Updates to Lakehouse Capabilities

Open data lakehouse Dremio announced a number of improvements this week. Among the updates: new SQL functionality, including support for the MAP data type so users can query map data from Parquet, Iceberg, and Delta Lake; security enhancements such as row and column-level policy-defined access control for users; support for INSERT, DELETE, and UPDATE on Iceberg tables, and for “time travel” to query historical data in place; as well as usability and performance improvements. Dremio also added a number of connectors, including dbt, Snowflake, MongoDB, DB2, OpenSearch, and Azure Data Explorer. 

Informatica Reveals AWS-Specific Cloud Data Management Services

At AWS re:Invent 2022, Informatica announced three new capabilities for Informatica within AWS. Informatica Data Loader is embedded within Amazon Redshift so that mutual customers will be able to ingest data from a wide variety of systems, including AWS. The Informatica Data Marketplace now supports AWS Data Exchange, allowing customers to access and use third-party data hosted on the Data Exchange. And Informatica INFACore, INFA’s new development and data science framework, simplifies the process of developing and maintaining complex data pipelines: users can pull prepared data from INFACore into Amazon SageMaker Studio as a simple function for further use in building, training, and deploying machine learning models on SageMaker.

Matillion Accelerates Productivity for Data Teams with Key Ecosystem Integrations

Data productivity platform Matillion announced a number of integrations with technical partners. Most of these integrations are accelerators that speed up some aspect of data processing between Matillion and its partners. FHIR Data, built by Matillion and Hakkoda, is a Snowflake healthcare data integrator that simplifies the process of loading FHIR data in Snowflake, then transforming it into a structured format for analytics processing. AWS Redshift Serverless Scale will let mutual Matillion-AWS customers run analytics without needing to manually provision or manage data warehouse clusters. Matillion One Click within AllCloud automates the setup and maintenance of data pipelines. Finally, the Matillion-Collibra integration creates data lineage, mapping inbound and outbound data flows, and attaches data objects to assets in the Collibra data catalog.

Starburst Grows Galaxy with Data Products Capabilities

Analytics engine Starburst launched new capabilities for Starburst Galaxy, the managed service version of its primary Starburst Enterprise offering.  The new Data Products capabilities include a catalog explorer for users to search through their data more easily and understand what they have; schema discovery, which can help users find new datasets regardless of their storage format; and enhanced security and access controls. 

Starburst Enterprise Now Supports AWS Lake Formation and Further Data Federation

Starburst also announced support for AWS Lake Formation via Starburst Enterprise, which will allow joint customers to more easily implement a data mesh framework across all of an organization’s data sources.


November 11: From BI to AI (Anaconda, Coefficient, Databricks, dbt, Domino Data Lab, Matillion, Neo4J, PwC, Sagence, Snowflake, Tellius)

Editorial note: While Twitter and Meta aren’t precisely on the BI to AI spectrum per se, given the sheer prevalence and use of these massive data sources and the fast-moving news around layoffs and security issues for these properties lately, Amalgam Insights would recommend confirming that you are performing backups of relevant data on a regular schedule; that you are taking available security precautions including the use of two-factor authentication where possible; and that your communications strategies are nimble enough to respond to issues of impersonation. 

Funding and Finances

Coefficient Secures $18M Series A Funding Round

Spreadsheet data automation company Coefficient has raised an $18M A round. Battery Ventures led the funding round, with participation from existing investors Foundation Capital and S28 Capital. Coefficient will use the money to scale up global operations and expand its offerings.

Databricks Ventures Invests in Matillion

Strategic investment firm Databricks Ventures has taken an equity stake for an undisclosed sum in Databricks partner Matillion, a data integration solution. In doing so, Databricks extends its existing partnership with Matillion, providing financial support for Matillion’s Data Productivity Cloud and its integration with Databricks’ Lakehouse Platform.

Updates and Launches

Tellius Improves User Experience and Ease of Visual Analysis with Version 4.0

Decision intelligence platform Tellius announced version 4.0 on Wednesday. Key new features include Multi-Business View Vizpads, allowing users to view and analyze across multiple data sources without needing to constantly switch between individual dashboards for each; an enhanced onboarding and ongoing user experience with walkthroughs and in-app chat; and more robust search functionality.

Neo4j Announces General Availability of its Next-Generation Graph Database Neo4j 5

Graph data platform Neo4j made Neo4j 5, the next version of its cloud-ready graph database, generally available earlier this week. Among the notable improvements: new syntax making complex queries easier to write; query performance improvements of up to 1000x; automatic scale-out to handle sudden massive bursts of query activity; and the debut of Neo4j Ops Manager to monitor and manage continuous updates across global deployments.

Partnerships

Anaconda Will Integrate with Domino’s Enterprise MLOps Platform

Anaconda in Snowpark for Python Enters Public Preview

Anaconda made two partnership announcements this week. First, Anaconda is collaborating with Domino Data Lab to incorporate the Anaconda repository into Domino’s Enterprise MLOps Platform. Domino users will be able to access Anaconda’s Python and R packages without requiring a separate Anaconda enterprise license.

Second, Snowpark for Python, the Anaconda repository and package manager within Snowflake Data Cloud, has entered public preview. Snowflake users will be able to use Python to build data science workflows and data pipelines within Snowpark.  

On a related note, dbt Labs also announced support for data transformation in Python within dbt, allowing dbt customers to take advantage of Python capabilities on major cloud data platforms such as Snowflake. Joint dbt and Snowflake customers will be able to use Python capabilities for both analytics and data science projects on Snowpark.

Acquisitions

PwC Acquires Data Strategy Consulting Firm Sagence 

Data management and analytics consulting firm Sagence has been acquired by PwC, adding to PwC’s existing data strategy and digital transformation capabilities. Sagence provides additional expertise and experience in creating action plans for proposed data strategies.



November 4: From BI to AI (Alation, Cloudera, Collibra, IBM, Informatica, Qlik)

Funding

Alation Raises $123M Series E

Enterprise data intelligence platform Alation announced a $123M Series E round of funding this week. Thoma Bravo, Sanabil Investments, and Costanoa Ventures led the round, with additional participation from new investor Databricks Ventures and existing investors Dell Technologies Capital, Hewlett Packard Enterprise, Icon Ventures, Queensland Investment Corporation, Riverwood Capital, Salesforce Ventures, Sapphire Ventures, and Union Grove. Alation will use the capital to continue accelerating product innovation and global expansion.

Partnerships

Cloudera Expands Partner Opportunities, Accelerates Go to Market

On November 2, Cloudera debuted the Cloudera Partner Network, redesigning their existing partner program’s approach. CPN members will see improved tools for supporting go-to-market initiatives, a FastTrack Onboarding Program to shorten time-to-market capabilities, programs for rebates and market development funds to demonstrate financial commitment, a Partner Success Team to improve training, and additional benefits to support the new CDP One SaaS solution.

Launches and Updates

Collibra Announces Updates to Collibra Data Intelligence Cloud

At Data Citizens ’22, Collibra announced new capabilities for Collibra Data Intelligence Cloud. A new data marketplace will make it faster and easier to find curated and approved data, speeding up decision making and action. The Workflow Designer, now in beta, will help teams automate business processes by creating new workflows, and usage analytics will show which assets are most widely and frequently used. On the compliance side, Collibra also released Collibra Protect, available through their Snowflake partnership, to provide greater insight into how protected and sensitive data is being used, as well as to protect said data and maintain compliance. Collibra Data Quality and Observability, when deployed in an organization’s cloud, will help organizations scale and secure their data quality operations.

IBM Announces IBM Business Analytics Enterprise  

On November 3, IBM launched IBM Business Analytics Enterprise, a new suite that includes business intelligence planning, budgeting, reporting, forecasting, and dashboarding capabilities. Among the key new features is the IBM Analytics Content Hub, which will let users assemble planning and analytics dashboards from a number of vendor sources. In addition, the Hub tracks and analyzes usage patterns to recommend role-based content to users.

Up Next for Informatica: Intelligent Data Management Cloud for State and Local Government 

Informatica released the State and Local Government version of their Intelligent Data Management Cloud this week. This continued expansion of vertical-specific IDMCs demonstrates Informatica’s commitment to serving a variety of government clients beyond just the federal level.

Qlik Launches Real-Time Enterprise Data Fabric Qlik Cloud Data Integration

Qlik released Qlik Cloud Data Integration, a set of SaaS services that form a data fabric. Among the major cloud platforms QCDI will integrate with are AWS, Databricks, Google Cloud, Microsoft Azure Synapse, and Snowflake. Data from these sources will be sent through QCDI, where it can be transformed from its raw state to analytics-ready, allowing for automated workflows for apps and APIs while accounting for metadata management and lineage.


October 28: From BI to AI (Amazon Web Services, Axelera AI, Bloomberg, data.world, Informatica, IBM, LatticeFlow, Microsoft, OpenAI, Shutterstock, ThoughtSpot)

Funding and Finance

Edge AI Company Axelera AI Announces $27M Series A

Edge AI startup Axelera AI closed a $27M Series A funding round this week. Innovation Industries led the round, with participation from imec.xpand and SFPI-FPIM. Axelera AI will use the funding for launching and producing its AI acceleration platform, as well as hiring.

LatticeFlow Claims $12M in Series A Funding

AI platform LatticeFlow secured $12M in Series A funding. Atlantic Bridge and OpenOcean led the funding round, with participation from new investor FPV Ventures and existing investors btov Partners and Global Founders Capital. The funding will go towards expanding LatticeFlow’s capacity to diagnose and repair errors in AI data and models.

OpenAI Invests Tens of Millions of Dollars in Audio and Video Editing App Descript

AI company OpenAI, most recently in the news for its AI text and image generators, is investing tens of millions of dollars in audio and video editing app Descript at a valuation of $550M. Between this, the recent expansion of its partnership with content platform Shutterstock, and more money potentially flowing OpenAI’s way from Microsoft, OpenAI is making moves to secure its position in modern AI-based content creation.

Launches and Updates

AWS Releases Amazon Neptune Serverless 

AWS announced Amazon Neptune Serverless, a serverless version of their graph database service which automatically provisions and scales resources for unpredictable graph database workloads. Amazon Neptune Serverless is available today to AWS customers running Neptune in specific regions; availability in other regions is coming soon.

Bloomberg Carbon Emissions Dataset Now Covers 100,000 Companies

Bloomberg announced that it had enlarged its carbon emissions dataset to cover 100,000 companies. The dataset includes company-reported carbon data, as well as Bloomberg-generated estimates of carbon data for companies that do not have or provide carbon emissions data, and accompanying data reliability scores.

Informatica Launches Intelligent Data Management Cloud for Higher Education

Adding to their collection of vertical-specific releases of its Intelligent Data Management Cloud, Informatica launched their Intelligent Data Management Cloud for higher education this week. IDMC helps integrate educational data from a wide variety of decentralized sources while ensuring data remains secure and compliance and privacy standards are respected.

IBM Adds Natural Language Processing, Text to Speech, Speech to Text Libraries to Embeddable AI Portfolio

IBM released three new libraries this week, expanding their embeddable AI portfolio. These libraries are the IBM Watson Natural Language Processing Library, the IBM Watson Text to Speech Library, and the IBM Watson Speech to Text Library. IBM Ecosystem partners will be able to use these libraries to develop and scale AI apps more quickly.

ThoughtSpot for Sheets Brings Self-Service Analytics to Google Sheets

Analytics company ThoughtSpot debuted ThoughtSpot for Sheets, a web plugin for Google Sheets. Users will be able to install and run ThoughtSpot for Sheets directly in their web browser, capable of analyzing the data available in their Google Sheets spreadsheets while minimizing the technical knowledge necessary to do so. ThoughtSpot will be compatible with additional partners in the near future.

Hiring

data.world Names New CMO, SVP of Sales, VP of Finance

Enterprise data catalog company data.world made several hiring announcements this week. New Chief Marketing Officer Stephanie McReynolds joined data.world from Ambient.ai, where she served as head of marketing. Prior to that, McReynolds was the SVP of Marketing and first marketing executive at fellow data catalog company Alation. New SVP of Sales Richard Yonkers came to data.world from Knoema, an enterprise data hub provider, where he served as the senior vice president of sales. Mineo Sakan was the VP of Global Finance at data protection firm HYCU prior to joining data.world as their new VP of Finance.


October 21: From BI to AI (Alteryx, Bloomberg, Dataiku, Google Cloud, Oracle, Stability AI, Tableau, Tellius, TigerGraph)

Oracle CloudWorld Announcements

Oracle made a number of announcements at Oracle CloudWorld this week.

Oracle Database 23c Now in Beta
Version 23c of Oracle Database, code-named “App Simple,” is now available in beta. As highlighted by the code name, improvements focus on simplifying application development, particularly for apps written using JSON, Graph, or microservices. JSON Relational Duality is Oracle’s new approach that allows data to be simultaneously used and understood both as app-friendly JSON documents and as database-friendly relational tables, allowing JSON app data to be queried directly.

Oracle Launches MySQL HeatWave Lakehouse
Oracle expanded the MySQL HeatWave portfolio with the addition of MySQL HeatWave Lakehouse, which will allow customers to process and query data in object store at multi-terabyte scale. This will directly compete with Redshift and Snowflake in providing a cost-efficient, performant lakehouse offering.

Oracle Innovates Across Data and Analytics Portfolio
Oracle also announced product innovations across its data and analytics portfolio. Oracle Analytics Cloud added a semantic modeler to present the semantic model to business users in an appropriate manner; advanced composite visualizations to organize content and present data patterns and signals in a more easily understood manner; one-click automated insights that provide recommendations for visualizations; and AI and ML enhancements to connect Oracle Cloud Infrastructure cognitive services to Oracle Analytics Cloud. Oracle Fusion Analytics improved their existing ERP, SCM, and HCM analytics solutions with vertical-specific enhancements, and added Oracle Fusion CX Analytics to the OFA portfolio to give sales, marketing, service, and finance users KPIs and dashboards.

Oracle and NVIDIA Team Up to Accelerate Enterprise AI Adoption
Finally, Oracle and NVIDIA announced a partnership that will bring NVIDIA’s complete accelerated computing stack, including tens of thousands of GPUs, to Oracle Cloud Infrastructure to permit AI training and deep learning inference at scale.

Funding

Stability AI Announces $101M Seed Round
Open source AI company Stability AI announced a $101M seed round at a nearly $900M valuation. Coatue, Lightspeed Venture Partners, and O’Shaughnessy Ventures LLC led the round. Stability AI will use the funding to speed up the development of open AI models across a wide variety of use cases, consumer and enterprise alike.

Tellius Raises $16M B Round
AI decision intelligence platform Tellius announced a $16M Series B funding round on October 20. Baird Capital led the round, and all existing investors also participated: Grotech Ventures, Sands Capital Ventures, and Veraz Investments. Tellius will use the funding to expand go-to-market efforts; hire across sales, marketing, and product engineering; and fund R&D for their platform.

Launches and Updates

Alteryx Announces New Analytics Cloud Capabilities, Version 22.3
At Inspire EMEA 2022, Alteryx announced the 22.3 Alteryx product release, along with additional enhancements to Alteryx Analytics Cloud. Alteryx Machine Learning is now running on Alteryx Analytics Cloud, and Designer Cloud has improved Snowflake data processing performance when moving data from AWS. As for Alteryx 22.3, Alteryx’s Data Connection Manager supports Azure Active Directory group authentication for Databricks and Snowflake, and integrations with third-party vaults like CyberArk and HashiCorp. 22.3 also features enhanced Google BigQuery connectivity and performance improvements.

Bloomberg Data License Content Now Available on Google Cloud
Bloomberg made its Data License content available on Google Cloud. Clients will be able to integrate this cloud data into their Google Cloud-specific workloads directly.

Tableau Launches Version 2022.3
Tableau announced version 2022.3 this week. Key new features include a Data Guide guided experience to Tableau, Table Extensions that will let customers enrich their data with advanced analytics and predictions, and adding “dynamic zone visibility” functionality to dashboards, allowing users to see only the dashboard contents that are relevant without needing to manually design multiple individual dashboards.

TigerGraph Will Support openCypher in GSQL
TigerGraph announced this week that it will support the openCypher query language within GSQL, TigerGraph’s own graph query language. This support will make it easier for developers to build graph applications on TigerGraph databases, or migrate existing graph applications to them.
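For readers unfamiliar with openCypher, a query such as `MATCH (a:Person)-[:FRIEND]->(b:Person) RETURN b.name` declaratively matches patterns in a graph. As a rough illustration only (this is not TigerGraph’s API or any real driver), the following Python sketch shows the kind of traversal such a query describes, over a toy in-memory graph:

```python
# Toy in-memory graph: nodes with properties, plus typed directed edges.
# Purely illustrative of what an openCypher pattern like
#   MATCH (a:Person)-[:FRIEND]->(b:Person) RETURN b.name
# expresses; node names and data here are hypothetical.
nodes = {
    1: {"label": "Person", "name": "Ada"},
    2: {"label": "Person", "name": "Grace"},
    3: {"label": "Person", "name": "Alan"},
}
edges = [(1, "FRIEND", 2), (1, "FRIEND", 3), (2, "FRIEND", 3)]

def friends_of(name):
    """Names of nodes reachable from `name` via one outgoing FRIEND edge."""
    ids = [i for i, n in nodes.items() if n["name"] == name]
    return sorted(nodes[dst]["name"]
                  for src, rel, dst in edges
                  if rel == "FRIEND" and src in ids)

print(friends_of("Ada"))  # ['Alan', 'Grace']
```

The appeal of openCypher support is that developers write the declarative pattern rather than traversal code like this, and can reuse queries written for other openCypher-compatible graph databases.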

Partnerships

Dataiku and Slalom Team Up on AI Strategy
Dataiku announced a partnership with Slalom, a global consulting firm. Slalom will provide strategic guidance to Dataiku enterprise customers on implementing AI and MLOps projects.

October 7: From BI to AI (Apollo GraphQL, AWS, CelerData, Domino Data Lab, Komprise, Kyndryl, SingleStore, Teradata)

Funding

SingleStore Closes Additional $30M in Series F Funding, Total Now at $146M

Database company SingleStore closed an additional $30M in F-round funding earlier this week, with Prosperity7 as a new investor. This additional investment brings SingleStore’s total for their F round up to $146M, after an initial round in September 2021 for $80M and an additional $36M in July 2022. In the interim, SingleStore has doubled its headcount, with geographic expansions to Ireland, Singapore, and Australia, and hiring continues.

Updates

Apollo GraphQL Launches GraphOS to Scale “Supergraphs”

On October 5, Apollo GraphQL debuted Apollo GraphOS. GraphOS is a platform to build, connect, and scale “supergraphs”: architectures that bring together a company’s data, microservices, and other digital capabilities into one network to simplify data access and sourcing for app building. Key features of GraphOS include a centralized, up-to-date repository for schemas and pipelines; new GraphQL capabilities such as live queries and edge caching; both cloud and self-hosting options; and security and governance capabilities to control who can access your supergraphs, when, and why.

CelerData Launches Quick Start for StarRocks on Amazon Web Services 

On October 6, analytics platform CelerData released a Quick Start for StarRocks on AWS. The Quick Start deploys StarRocks on the AWS cloud, supporting rapid deployment of real-time, high-concurrency analytics while guiding users to follow AWS best practices.

Domino 5.3 Previews Nexus Hybrid and Multi-Cloud Capabilities
Domino Data Lab released version 5.3 of its Enterprise MLOps platform yesterday. Key features include a private preview of Domino’s Nexus hybrid and multi-cloud capabilities, announced back in June; additional data connectors for Amazon S3 tabular data, Teradata warehouses, and Trino; and GPU-based model inference, extending GPU support beyond model training to model deployment and helping bring complex deep learning models into production. Domino 5.3 is generally available now; companies that would like to preview Nexus can request access on the Domino Nexus website.

Komprise Rolls Out Fall 2022 Release 

Unstructured data management company Komprise released the Fall 2022 version of Komprise Intelligent Data Management. New capabilities include Komprise Smart Data Workflows, which let IT teams automate key parts of the data tagging and discovery process; and Deep Analytics, which permits authorized users outside of IT to view certain characteristics of their data and work with IT for better data management.

Partnerships

Kyndryl and Teradata Team Up for Cloud Migration
IT infrastructure services provider Kyndryl and data platform Teradata announced a strategic partnership earlier this week. The companies will combine Kyndryl’s data and AI services and Teradata’s cloud analytics and data platform to help customers migrate from on-prem data warehouses to the cloud.

September 30: From BI to AI (AWS, Databricks, DataRobot, Domino, Microsoft Azure, Qlik, Salesforce, SAS) 

Dreamforce

Salesforce Premieres Genie, the New Realtime Data Platform Behind the Customer 360 

At Dreamforce, Salesforce launched Salesforce Genie, a realtime data platform undergirding Salesforce Customer 360. Genie ingests and stores streams of realtime data and combines it with transactional Salesforce data, then turns the combined data into a unified customer record. In addition, Einstein AI can provide additional relevant personalizations and predictions based on this realtime data, and Flow can trigger actions automatically based on the same realtime data. 
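Conceptually, building a unified customer record means resolving streaming and transactional data to a shared identity and folding both into one profile. As a minimal, hypothetical sketch of that merge step (not Salesforce Genie’s actual implementation or schema), in Python:

```python
from collections import defaultdict

# Hypothetical records keyed by a shared customer id; field names are
# illustrative only, not Salesforce Genie's actual data model.
transactional = [
    {"customer_id": "c1", "email": "ada@example.com", "lifetime_value": 1200},
]
realtime_events = [
    {"customer_id": "c1", "last_page": "/pricing"},
    {"customer_id": "c1", "last_page": "/checkout"},
]

def unify(base_records, events):
    """Fold realtime events onto the transactional base record for each
    customer id; for each field, later events overwrite earlier values."""
    profiles = defaultdict(dict)
    for rec in base_records:
        profiles[rec["customer_id"]].update(rec)
    for ev in events:
        profiles[ev["customer_id"]].update(ev)
    return dict(profiles)

print(unify(transactional, realtime_events)["c1"]["last_page"])  # /checkout
```

The unified profile is what downstream systems such as Einstein AI (for predictions) and Flow (for triggered actions) would then consume.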

Salesforce and AWS Now Let You “Bring Your Own AI” to Customer 360 

Salesforce and AWS teamed up to reveal new integrations between Salesforce and Amazon SageMaker. Joint customers will be able to use SageMaker and Einstein AI in concert to build custom AI models, letting them “bring your own AI” and use it in real time across the Customer 360. Cleansed and unified realtime Salesforce customer data, alongside an organization’s data from an AWS data lake or data warehouse, will be jointly accessible for building and training machine learning models in SageMaker. The underlying technology enabling these integrations is Salesforce Genie, mentioned above.

Salesforce Appoints Robin Washington Lead Independent Director of the Board 

Salesforce also announced the appointment of Robin Washington as Lead Independent Director of the company’s Board of Directors, succeeding Sanford Robertson. Washington has served as a director of Salesforce since 2013, and currently also chairs the Board’s Audit Committee. She has previously served on the board of Gilead Sciences, and currently serves on the boards of Alphabet, Honeywell International, and Vertiv Holdings Co.

Launches and Updates

DataRobot Debuts Dedicated Managed AI Cloud

DataRobot Dedicated Managed AI Cloud is now publicly available, DataRobot announced earlier this week. The Dedicated Managed AI Cloud is a dedicated hosted version of DataRobot’s AI Cloud, managed by DataRobot experts, allowing organizations to outsource aspects of AI platform deployment, configuration, management, and maintenance when staffing constraints limit in-house capabilities.

Domino Data Lab Announces New Ecosystem Solutions with NVIDIA

Domino Data Lab announced two solutions with NVIDIA ecosystem partners last week. Domino and NVIDIA have created an on-prem GPU reference architecture and an integrated MLOps solution for high-performance processing needs on NVIDIA DGX systems.

Partnerships

Qlik Launches Two More Products in Partnership with Databricks  

Qlik debuted two more solutions in its partnership with Databricks. The first is the launch of the Databricks Lakehouse (Delta) Endpoint within Qlik Data Integration, which will make it easier for customers to get data into their Delta Lakehouse. The second is the integration of Qlik Cloud with Databricks Partner Connect, improving the “trial experience” of Qlik Data Analytics within Databricks. 

SAS Viya Now Available On Azure Marketplace

SAS Viya is now available on the Microsoft Azure Marketplace, expanding on the existing SAS-Microsoft partnership. Users will be able to access the full Viya package on Azure, including SAS Visual Analytics, SAS Visual Statistics, SAS Visual Data Mining and Machine Learning, and SAS Model Manager, with no-code and code-centric interfaces available.

Hiring

Debanjan Saha Named Chief Executive Officer of DataRobot

Last week, DataRobot announced that Debanjan Saha had been appointed Chief Executive Officer. Saha originally joined DataRobot in February 2022 as President and Chief Operating Officer, and served as interim CEO when Dan Wright stepped down in July. Prior to DataRobot, Saha was VP/GM of Data Analytics at Google, and VP/GM of Aurora and RDS at Amazon.

September 16: From BI to AI (Altair, AWS, Dataiku, Oracle, RapidMiner, Salesforce, Sigma Computing, Snowflake, Tableau)

Updates and Launches

Oracle Brings MySQL HeatWave to AWS

Oracle announced this week that MySQL HeatWave is now available on AWS. AWS users will be able to use MySQL HeatWave for transaction processing, analytics, and machine learning automation within a single MySQL database on AWS. As part of this release, AWS customers will also have access to MySQL HeatWave ML, which provides machine learning capabilities inside MySQL databases, at no additional charge.

Presenting the PyTorch Foundation

Meta announced this week that core AI research framework PyTorch will transition to being managed by the new PyTorch Foundation, part of the Linux Foundation. Initial PyTorch Foundation board members will include representatives from cloud platform providers AWS, Google Cloud, and Microsoft Azure, as well as chipmakers AMD and NVIDIA, and Meta itself. Meta will continue to invest in PyTorch and use it as its primary AI research framework.

Syniti Adds Data Quality and Catalog Capabilities to Syniti Knowledge Platform

Enterprise data management provider Syniti announced that it has added data quality and data cataloging capabilities to the Syniti Knowledge Platform, broadening the platform’s range of data management proficiencies. These new features will allow for more centralized and consistent data management practices.

Partnerships

Real-Time Data Sharing Comes to the Salesforce-Snowflake Partnership 

Salesforce and Snowflake revealed upcoming data sharing innovations, available in pilot in Fall ’22. Specifically, a new integration between Salesforce and Snowflake will allow seamless access between Salesforce data and the Snowflake Data Cloud in real time, without needing to move data or perform any copying or syncing operations. This will permit faster cross-platform unified analytics and personalization of customer profiles. In addition, data from both Salesforce and Snowflake can be visualized together in Tableau, building on Tableau’s long-standing relationship with Snowflake.

Sigma Computing Debuts Snowflake Healthcare & Life Sciences Data Cloud Integration

BI alternative Sigma Computing revealed this week that it had partnered with Snowflake to launch an integration for joint customers on the Snowflake Healthcare and Life Sciences Data Cloud. The integration will give healthcare companies a unified data platform, eliminating data silos and keeping them compliant with healthcare data regulations while allowing business users to safely analyze sensitive data.

Acquisitions

Altair Expands Data Analytics Portfolio with RapidMiner Acquisition

On September 13, Altair agreed to acquire low-code machine learning and analytics platform RapidMiner for an undisclosed amount. RapidMiner will be integrated with existing Altair tools, including machine learning solution Altair Knowledge Studio, enterprise data platform Altair SmartWorks, and SAS language compiler Altair SLC. 

Hiring

Dataiku Appoints Daniel Brennan as Chief Legal Officer

Dataiku announced Daniel Brennan as its new Chief Legal Officer. Prior to joining Dataiku, Brennan was Vice President, Deputy General Counsel at Twitter, where he spent over a decade of his legal career and saw the company through its IPO as well as more recent legal challenges. Before that, Brennan was Executive Director, Legal for Dell’s Global Services business unit.