IBM and Cloudera Join Forces to Expand Data Science Access

On June 21, IBM and Cloudera jointly announced that they were expanding their existing relationship to bring more advanced data science solutions to Hadoop users by developing a shared go-to-market program. IBM will now resell Cloudera Enterprise Data Hub and Cloudera DataFlow, while Cloudera will resell IBM Watson Studio and IBM BigSQL.

In bulking up their joint go-to-market programs, IBM and Cloudera are reaffirming their pre-existing partnership to amplify each other’s capabilities, particularly in heavy data workflows. Cloudera Hadoop is a common enterprise data source, but Cloudera’s existing base of data science users is small despite growing demand for data science options, and its Data Science Workbench is coder-centric. Being able to offer the more user-friendly IBM Watson Studio gives Cloudera’s existing data customers a convenient option for doing data science without necessarily needing to know Python, R, or Scala. IBM, in turn, can now sell Watson Studio, BigSQL, and IBM consulting and services more deeply into Cloudera’s customer base, broadening its ability to upsell additional offerings.

Because IBM and Cloudera each hold significant amounts of on-prem data, it’s interesting to look at this partnership in terms of the 800-pound gorilla of cloud data: AWS. IBM, Cloudera, and Amazon are all leaders when it comes to the sheer amount of data each holds. But Amazon is the biggest cloud provider on the planet; it holds the plurality of the cloud hosting market, while most of IBM’s and Cloudera’s customers’ data sits on-prem. Because that data is hosted on-prem, it’s data Amazon doesn’t have access to; IBM and Cloudera are teaming up to sell their own data science and machine learning capabilities against that on-prem data, where there may be security or policy reasons to keep it out of the cloud.

A key differentiator in comparing AWS with the IBM-Cloudera partnership lies in AWS’ breadth of machine learning offerings. In addition to having a general-purpose data science and machine learning platform in SageMaker, AWS also offers task-specific tools like Amazon Personalize and Textract that address precise use cases for the many Amazon customers who don’t need a full-blown data science platform. IBM has APIs for visual recognition, natural language classification, and decision optimization, but AWS has built its own APIs out into higher-level services. Cloudera customers building custom machine learning models may find that IBM’s Watson Studio suits their needs. However, IBM lacks the variety of off-the-shelf machine learning applications that AWS provides; it supplies its machine learning capabilities as individual APIs that an application development team will need to fit together to create its own in-house apps.
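
To make the contrast concrete, here is a minimal sketch of what one of those task-specific AWS services looks like to a development team: a few lines of boto3 against Amazon Textract to pull text out of a scanned document, with no data science platform in sight. The file name is a placeholder, and AWS credentials and region are assumed to already be configured.

```python
# Minimal sketch: extract the lines of text from a scanned document with
# Amazon Textract. Assumes boto3 is installed and AWS credentials/region
# are already configured; "invoice.png" is a hypothetical input file.
import boto3

textract = boto3.client("textract")

with open("invoice.png", "rb") as f:
    document_bytes = f.read()

response = textract.detect_document_text(Document={"Bytes": document_bytes})

# Keep only the detected LINE blocks and print their text.
lines = [b["Text"] for b in response["Blocks"] if b["BlockType"] == "LINE"]
print("\n".join(lines))
```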

Recommendations

  • For Cloudera customers looking to do broad data science, IBM Watson Studio is now an option. This offers Cloudera customers an alternative to Data Science Workbench; in particular, an option that has a more visual interface, with more drag-and-drop capabilities and some level of automation, rather than a more code-centric environment.
  • IBM customers can now choose Cloudera Enterprise Data Hub for Hadoop. IBM and Hortonworks had a long-term partnership; IBM’s decision to support and cross-sell Enterprise Data Hub demonstrates that IBM will continue to sell enterprise Hadoop in some flavor.

Kubernetes Grows Up – The View from KubeCon EU 2019

Our little Kubernetes is growing up.

By “growing up” I mean it is almost in a state where a mainstream company can consider it fit for production. While several factors act as a drag on mainstream reception, a lack of completeness has been a major force against Kubernetes’ broader acceptance. Completeness, in this context, means that all the parts of an enterprise platform are available off the shelf and won’t require a major engineering effort on the part of conventional IT departments.

The good news from KubeCon+CloudNativeCon EU 2019 in Barcelona, Spain (May 20–23, 2019) is that the Kubernetes and related communities are zeroing in on that ever-so-important target. There are a number of markers pointing toward mainstream acceptance. Projects are filling out the infrastructure – gaining completeness – and the community is growing.

Project Updates

While Kubernetes may be at the core, there are many supporting projects striving to add capabilities to the ecosystem that will result in a more complete platform for microservices. Some of the projects featured in the project updates show that drive for completeness. For example, OpenEBS and Rook are two projects striving to make container storage more enterprise-friendly, and updates to both were announced at the conference. Storage, like networking, is an area that must be tackled before mainstream IT can seriously consider container microservices platforms based on Kubernetes.
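
As an illustration of what “enterprise-friendly” container storage means in practice, here is a minimal sketch (using the Python kubernetes client) of the interface an application team sees: request a volume through a standard PersistentVolumeClaim and let a storage project such as OpenEBS or Rook satisfy it through a StorageClass. The class name "openebs-hostpath" is an assumption; substitute whatever class your cluster actually exposes.

```python
# Minimal sketch: request storage through the standard PersistentVolumeClaim
# interface and let the cluster's storage project (OpenEBS, Rook/Ceph, etc.)
# fulfill it. Assumes the `kubernetes` package and a cluster reachable via
# ~/.kube/config; the StorageClass name below is hypothetical.
from kubernetes import client, config

config.load_kube_config()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="demo-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="openebs-hostpath",  # assumed class provided by OpenEBS
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```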

Managing microservices performance and failure is a big part of the ability to deploy containers at scale. For this reason, the announcement that two projects that provide application tracing capabilities, OpenTracing and OpenCensus, were merging into OpenTelemetry is especially important. Ultimately, developers need a unified approach to gathering data for managing container-based applications at scale. Removing duplication of effort and competing agendas will speed up the realization of that vision.
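
For a sense of what that unified approach looks like to a developer, here is a minimal sketch using the Python opentelemetry-api and opentelemetry-sdk packages. Module and class names have shifted between releases, so treat the details as illustrative; the service and span names are hypothetical.

```python
# Minimal sketch: create nested trace spans with OpenTelemetry and print them
# to the console. Assumes the opentelemetry-api and opentelemetry-sdk packages;
# class names reflect recent releases and have changed over time.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Wire the vendor-neutral API to a concrete SDK that exports spans to stdout.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("checkout-service")  # hypothetical service name

# One span per unit of work; nested spans capture downstream microservice calls.
with tracer.start_as_current_span("place-order"):
    with tracer.start_as_current_span("charge-card"):
        pass  # call the payment service here
```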

Also announced at KubeCon+CloudNativeCon EU 2019 were updates to Helm and Harbor, two projects that tackle thorny issues of packaging and distributing containers to Kubernetes. These are necessary parts of the process of deploying Kubernetes applications. Securely managing container lifecycles through packaging and repositories is a key component of DevOps support for new container architectures. Forward momentum in these projects is forward movement toward the mainstream.

There were other project updates as well, including updates to Kubernetes itself and to CRI-O. Clearly, the community is filling in the blank spots in container architectures, making Kubernetes a more viable application platform for everyone.

The Community is Growing

Another gauge pointing toward mainstream acceptance is the growth of the community. The bigger the community, the more hands there are to do the work and the better the chances of achieving feature critical mass. This year in Barcelona, KubeCon+CloudNativeCon EU saw 7,700 attendees, nearly twice as many as last year in Copenhagen. The core Kubernetes project counts 164K commits and 1.2M comments on GitHub. This speaks to broad involvement in making Kubernetes better. Completeness requires lots of work, and that is more achievable when more people are involved.

Unfortunately, as Cheryl Hung, Director of Ecosystems at CNCF, notes, only 3% of contributors are women. The alarming lack of diversity in the IT industry shows up even in Kubernetes, despite the high-profile women involved in the conference, such as Janet Kuo of Google. Diversity brings more and different ideas to a project, and it would be great to see the participation of women grow.

Service Mesh Was the Talk of the Town

The number of conversations I had about service mesh was astounding. It’s true that I had released a pair of papers on it, one just before KubeCon+CloudNativeCon EU 2019. That may explain why people wanted to talk to me about it, but not the general buzz. There was service mesh talk in the halls, at lunch, in sessions, and from the mainstage. It’s pretty much what everyone wanted to know about. That’s not surprising, since a service mesh is going to be a vital part of large scale-out microservices applications. What was surprising was that even attendees who were new to Kubernetes were keen to know more. This was a very good omen.

It certainly helped that there was a big service-mesh-related announcement from the mainstage on Tuesday. Microsoft, in conjunction with a host of companies, announced the Service Mesh Interface (SMI), a common API for service mesh components from different vendors and projects. Think of it as a lingua franca of service mesh. There were shout-outs to Linkerd and Solo.io; the latter especially had much to do with creating SMI. The fast maturation of the service mesh segment of the Kubernetes market is another stepping stone toward the completeness necessary for mainstream adoption.
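
To give a feel for that lingua franca, here is a minimal, hedged sketch of an SMI TrafficSplit resource created through the Python kubernetes client. The service names are hypothetical, the fields follow the early split.smi-spec.io/v1alpha1 draft and may differ in later SMI revisions, and the cluster is assumed to run a mesh with the SMI CRDs installed.

```python
# Minimal sketch: shift 10% of traffic to a canary backend with an SMI
# TrafficSplit. Field names follow the early v1alpha1 draft; service names
# and namespace are hypothetical. Requires the `kubernetes` package, a
# reachable cluster, and a mesh that implements SMI.
from kubernetes import client, config

config.load_kube_config()

traffic_split = {
    "apiVersion": "split.smi-spec.io/v1alpha1",
    "kind": "TrafficSplit",
    "metadata": {"name": "reviews-canary"},
    "spec": {
        "service": "reviews",  # root service that clients call
        "backends": [
            {"service": "reviews-v1", "weight": "900m"},  # 90% of traffic
            {"service": "reviews-v2", "weight": "100m"},  # 10% canary
        ],
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="split.smi-spec.io",
    version="v1alpha1",
    namespace="default",
    plural="trafficsplits",
    body=traffic_split,
)
```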

Already Way Too Many Distros

There were a lot of Kubernetes distributions at KubeCon+CloudNativeCon EU 2019. A lot. Really. A lot. While this is a testament to the growth of Kubernetes as a platform, it’s confusing to IT professionals making choices. Some are managed cloud services; others are distributions for on-premises use or for when you want to install your own on a cloud instance. Here are some of the Kubernetes distros I saw on the expo floor; I’m sure I missed a few:

Microsoft Azure, Google, DigitalOcean, Alibaba, Canonical (Ubuntu), Oracle, IBM, Red Hat, VMware, SUSE, Rancher, Pivotal, Mirantis, and Platform9.

From what I hear, this is a sample, not a comprehensive list. The dark side of this enormous choice is confusion: choosing is hard once you get beyond a handful of options. Still, only five years into the evolution of Kubernetes, it’s a good sign to see this much commercial support for it.

The Kubernetes and Cloud Native architecture is like a teenager: growing rapidly but not quite done. As the industry fills in the blanks and as the communities deliver better networking, storage, and deployment capabilities, it will go mainstream and become applicable to companies of all sizes and types. Soon. Not yet, but very soon.

How Red Hat Runs

This past week at Red Hat Summit 2019 (May 7 – 9 2019) has been exhausting. It’s not an overstatement to say that they run analysts ragged at their events, but that’s not why the conference made me tired. It was the sheer energy of the show, the kind of energy that keeps you running…

Please register or log into your Amalgam Insights Community account to read more.

Amalgam Insights Publishes Highly Anticipated SmartList on Service Mesh and Microservices Management

Amalgam Insights has just published my highly anticipated SmartList Market Guide on Service Mesh. It is available this week at no cost as we prepare for KubeCon + CloudNativeCon Europe 2019, which I’ll be attending.

Before you go to the event, get prepared by catching up on the key strategies, trends, and vendors associated with microservices and service mesh. For instance, consider how the Service Mesh market is currently constructed.

For a deep dive into the three key sectors of the Service Mesh market, insights describing the current State of the Market for service mesh, and a view of where key vendors and products including Istio, Linkerd, A10, Amazon, Aspen Mesh, Buoyant, Google, Hashicorp, IBM, NGINX, Red Hat, Solo.io, Vamp, and more fit into today’s microservices management environment, download my report today.

At IBM Think, Watson Expands “Anywhere”

At IBM Think in February, IBM made several announcements around the expansion of Watson’s availability and capabilities, framing these announcements as the launch of “Watson Anywhere.” This piece is intended to provide guidance to data analysts, data scientists, and analytic professionals seeking to implement machine learning and artificial intelligence capabilities and evaluating the capabilities of…

Please register or log into your Amalgam Insights Community account to read more.

Data Science and Machine Learning News Roundup, February 2019

On a monthly basis, I will be rounding up key news associated with the Data Science Platforms space for Amalgam Insights. Companies covered will include: Alteryx, Amazon, Anaconda, Cambridge Semantics, Cloudera, Databricks, Dataiku, DataRobot, Datawatch, Domino, Elastic, Google, H2O.ai, IBM, Immuta, Informatica, KNIME, MathWorks, Microsoft, Oracle, Paxata, RapidMiner, SAP, SAS, Tableau, Talend, Teradata, TIBCO, Trifacta,…

Please register or log into your Amalgam Insights Community account to read more.

Tom Petrocelli Clarifies How Cloud Foundry and Kubernetes Provide Different Paths to Microservices

DevOps Research Fellow Tom Petrocelli has just published a new report describing the roles that Cloud Foundry Application Runtime and Kubernetes play in supporting microservices. This report explores when each solution is appropriate and provides a set of vendors that provide resources and solutions to support the development of these open source projects.

Organizations and Vendors mentioned include: Cloud Foundry Foundation, Cloud Native Computing Foundation, Pivotal, IBM, SUSE, Atos, Red Hat, Canonical, Rancher, Mesosphere, Heptio, Google, Amazon, Oracle, and Microsoft.

To download this report, which has been made available at no cost until the end of February, go to https://amalgaminsights.com/product/analyst-insight-cloud-foundry-and-kubernetes-different-paths-to-microservices

CES 2019 Ramifications for Enterprise IT

Vendors and Organizations Mentioned: IBM, Ose, WindRiver, Velodyne, UV Partners, TDK Corporation, Chirp Microsystems, Qualcomm, Intel, Zigbee Alliance, Thread Group, Impossible Foods

The CES (Consumer Electronics Show) is traditionally known as the center of consumer technology. Run by the CTA (Consumer Technology Association) in Las Vegas, this show brings out enormous volumes of new technology ranging from smart cars to smart homes to smart sports equipment to smart… well, you get the picture. But within all of these announcements, there were also a number of important announcements that affect the enterprise IT world and the definition of IT that will be important for tech professionals to think about in 2019. Amalgam Insights went through hundreds of different technology press releases and announcements to find the most important announcements that will affect your professional career.

Come along with me as we look at Quantum Computing, Gender Equality, Autonomous Vehicles, Disinfected Smartphones, Low Power Virtual Reality, Neural Net Chips, Internet of Things Interoperability, and, yes, the Impossible Burger.

Quantum Computing

On January 8th, 2019, IBM announced IBM Q System One, the “first integrated universal approximate quantum computing system” designed for commercial use. From a practical perspective, this will allow R&D departments to actually have their own quantum computers. Today, the vast majority of quantum computing work is done based on remote access either to quantum computers or quantum computing emulators, which provide limits on the experimenters’ abilities to customize and configure their computing environments.
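
For a sense of what that remote or emulated workflow looks like in practice, here is a minimal sketch using IBM’s Qiskit SDK: build a two-qubit Bell-state circuit and run it on a local simulator. The Aer backend and execute() helper reflect the Qiskit releases of this period and have since been superseded, so treat the API details as illustrative.

```python
# Minimal sketch: a two-qubit Bell-state circuit run on Qiskit's local
# simulator. Swap the Aer backend for an IBM Q backend to run the same
# circuit remotely on real hardware. API details (Aer, execute) track the
# Qiskit releases of this era and have since changed.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # measure both qubits into classical bits

backend = Aer.get_backend("qasm_simulator")      # local emulator
result = execute(qc, backend, shots=1024).result()
print(result.get_counts())   # expect roughly half '00' and half '11'
```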

To create a quantum computing system, IBM had to bring together hardware that provides high-quality, low-error-rate qubits, cryogenic equipment to cool the hardware and sustain quantum activity, and the electronics, firmware, and traditional computing capabilities needed to support a quantum environment. Of course, IBM is not new to quantum computing and has been a market leader in this emerging category.

Quantum computing fundamentally matters because we are running up against the physical limits of materials science that allow microprocessors to get smaller and faster, limits we typically sum up as Moore’s Law. In addition, quantum computing potentially allows both for more secure encryption and for the ability to quickly decrypt extremely secure technologies, depending on whether one takes a white-hat or black-hat approach. The ramification is that security organizations need to start understanding quantum computing now, whether to stay ahead of black-hat quantum computing efforts or to provide white-hat security answers.

Gender Equality at CES

At CES, a woman-designed sex toy originally given an innovation award (Warning: may not be Safe For Work) had its award revoked. The Ose vibrator designed by Lora DiCarlo was entered in the robotics and drone category based on its design by a robotics lab at Oregon State University and eight patents pending for a variety of robotic and biomimicry capabilities.

The product was undoubtedly risque. But CES has previously allowed virtual reality pornography to be shown within the show as well as other anatomical simulations designed for sex.

Given that CES has historically allowed other exhibitors to present similar products, revoking this award looks biased. This is an important lesson that the answer to providing a gender-equal environment is not necessarily to simply remove all sexual content. The goal is to eliminate harassment and abuse while providing equal opportunity across genders. As long as sex is a part of consumer technology, CES needs to provide equal opportunity for all genders to present.

Autonomous Vehicles

There were a number of announcements associated with Lidar sensors and edge computing innovations. Two that got Amalgam Insights’ attention included:

WindRiver’s integration of its Chassis automotive software with its TitaniumCloud virtualization software. This announcement hints at the need for the car, as a computing system, to be integrated with the cloud. This integration will be important as car manufacturers seek to upgrade car capabilities. As we continue to think about the car both as an autonomous data center of its own and as a set of computing and processing workloads that need to be upgraded on a regular basis, we will need to consider how the operational technologies associated with autonomous vehicles and other “Things” integrate with carrier-grade and public clouds.

Velodyne announced an end-to-end Lidar solution that includes both a hemispherical Lidar sensor called VelaDome and its Velia software. This launch reflects the need for hardware components and software to be integrated in the vehicle world, just as they are in the appliances and virtual machines we often use in the world of IT. This is another data point showing how autonomous vehicles are coming closer to our world of IT, both in creating integrated solutions and in requiring IT-like support in the future.

Disinfected Smartphones

UV Partners announced a new product called the UV Angel Aura Clean & Charge, which combines wireless charging with ultraviolet light disinfection. This product matters because, quite frankly, mobile phones tend to be filthy. That’s what happens when people hold them for hours a day and rarely wash or disinfect them. So, this device will be useful for germophobes.

But there is also the practical aspect of being able to clean phone surfaces with this object more easily. This may lead to being able to use the phone to detect biological matter or changes more effectively without additional dirt and biocontaminants. This could make phones or other sensors more accurate in trying to detect trace elements or compounds and increase the functionality of both phones and “Things” as a result.

Low Power Virtual Reality

TDK Corporation announced its work with Qualcomm, through its group company Chirp Microsystems, to improve controller tracking for mobile virtual reality and augmented reality headsets. Most importantly, the tracking system used for these devices draws only a few milliwatts, a small fraction of the total power within a standard smartphone battery; this compares to several hundred milliwatts for a standard optical tracking system. With this technology in development, both AR and VR experiences become more usable simply because they will take significantly less power to support.

This change may not sound exciting, but Amalgam Insights believes that one of the key challenges to the adoption of AR and VR is simply the battery life needed to use these applications for any extended amount of time. This breakthrough could significantly extend the life of AR and VR apps.

Artificial Intelligence

Intel made a number of chip announcements. Amalgam Insights is not a hardware analyst firm, so most of the mobile and laptop-based announcements are beyond our coverage. But the announcement that got our attention was the Intel Nervana Neural Network Processor. This chip, developed with Facebook, is designed to accelerate the inference associated with the algorithmic processing of neural nets and will drive higher-performance machine learning and artificial intelligence efforts.

At a time when every chip player is trying to get ahead with GPUs and TPUs, Intel is making its mark by focusing on inference, which is a necessary part of the “intelligence” of AI. Amalgam Insights looks forward to seeing how the Nervana processor is made available for commercial use and as a cloud-based capability for the enterprise world.

Internet of Things Interoperability

The Zigbee Alliance and Thread Group announced completing the Dotdot 1.0 specification, which will improve interoperability across smart home devices and networks made by different vendors. By providing a standard application layer that works across a wide variety of vendors and works on an IP networking standard, Dotdot brings a level of standardization to application-level configuration, testing, and certification.

This standard is an important step forward for companies working on Smart Home devices or related Smart Office devices and seeking a common way to ensure that new devices will be able to communicate with existing device investments. Amalgam Insights looks forward to seeing how this standard revolutionizes Smart Buildings and the Future of Work.

And, the Impossible Burger

The belle of the ball, so to speak, at CES was the Impossible Burger 2.0, a soy-based protein held together by heme, with iron and protein content similar to beef.

So, this is very cool, but why is this relevant to IT? First, this burger reminds us that food is now tech. Think about both how interesting and weird this is. A company has made custom proteins to build a new type of food designed to replace the taste and role of beef. Or at least that’s where they are today.

Meanwhile in the IT world, identity is increasingly based on biometrics: eyes, fingerprints, facial recognition. It is only a matter of time before either protein or DNA profiles are added to this mix. There will undoubtedly be some controversies and hiccups as this happens, but it is almost inevitable given the types of sensors we have and the evolution of DNA technologies like CRISPR that rapidly sequence and cut up DNA.

So, as we get better at replicating the nutrition and texture of meat with plant-based proteins at the same time that our physical bodies are increasingly used to provide access to our accounts… yes, this gets weird. But we’re probably five to ten years away from being hacked by some combination of these technologies as the DNA, protein, and biometric worlds keep coming closer and closer together.

For now, this is just cool to watch. And the Impossible Burger 2.0 sounds like a great vegan alternative to a burger. But putting the pieces together, identity in 2030 is going to be extremely difficult to manage.

Observations on the Future of Red Hat from Red Hat Analyst Day

On November 8th, 2018, Amalgam Insights analysts Tom Petrocelli and Hyoun Park attended the Red Hat Analyst Day in Boston, MA. We had the opportunity to visit Red Hat’s Boston office in the rapidly-growing Innovation District, which has become a key tech center for enterprise technology companies. In attending this event, my goal was to learn more about the Red Hat culture that is being acquired as well as to see how Red Hat was taking on the challenges of multi-cloud management.

Throughout the day, Red Hat’s presentations carried a constant theme of effective cross-selling and growing deal sizes, including a record 73 deals of over $1 million in the last quarter, over 600 accounts with over $1 million in business in the last year, and increased wallet share year-over-year for top clients, with 24 of the 25 largest clients increasing spend by an average of 15%. The current health of Red Hat is undeniable, regardless of the foibles of the public market. And the consistency of Red Hat’s focus on Open Source was undeniable across infrastructure, integration, application development, IT automation, IT optimization, and partner solutions, which demonstrated how synchronized and focused Red Hat’s executive team presenters were, including

Please register or log into your Amalgam Insights Community account to read more.

Is IBM’s Acquisition of Red Hat the Biggest Acquihire of All Time?

Estimated Reading Time: 11 minutes

Internally, Amalgam Insights has been discussing fairly intensely why IBM chose to acquire Red Hat for $34 billion. Our key questions included:

  • Why would IBM purchase Red Hat when they’re already partners?
  • Why purchase Red Hat when the code is Open Source?
  • Why did IBM offer a whopping $34 billion, $20 billion more than IBM currently has on hand?

As a starting point, we posit that IBM’s biggest challenge is not an inability to understand its business challenges, but a fundamental consulting mindset that runs from the top on down. By this, we mean that IBM is great at identifying and finding solutions on a project-specific basis. For instance, SoftLayer, Weather Company, Bluewolf, and Promontory Financial are all relatively recent acquisitions that made sense and were mostly applauded at the time. But even as IBM makes smart investments, IBM has either forgotten or never learned the modern rules for how to launch, develop, and maintain software businesses. At a time when software is eating everything, this is a fundamental problem that IBM needs to solve.

The real question for IBM is whether IBM can manage itself as a modern software company.

Continue reading “Is IBM’s Acquisition of Red Hat the Biggest Acquihire of All Time?”