March 3, 2022

Knative Now an Incubating Project at CNCF

If you’re looking for ways to ease the deployment of serverless, event-driven applications atop Kubernetes in your organization, then you’ll be pleased to hear that Knative–an open source platform designed to simplify and automate such deployments–was accepted as an incubating project yesterday by the Cloud Native Computing Foundation (CNCF).

Knative (pronounced “kay-native”) was originally created by Google in 2018 as a way to simplify the deployment and management of serverless and event-driven applications on Kubernetes, the container orchestration software that Google released as open source back in 2014. While Kubernetes has proven to be a popular way to scale containers up and down–indeed more than 96% of organizations in a recent CNCF survey say they’re using it or evaluating it–K8S itself is a complex bit of technology that is difficult to master, and Knative helps to ease its adoption for two popular types of applications: serverless and event-driven.

The software has two main components: Knative Serving and Knative Eventing. Knative Serving enables users to run serverless containers on Kubernetes, and automates much of the work that revolves around networking, autoscaling, and revision tracking. Knative Eventing, meanwhile, provides a way to automate the deployment of an event-driven architecture, and makes it easy for developers to attach event-driven data streams (perhaps from Apache Kafka or a similar event broker) to Knative apps. The two systems can be used independently, but they’re typically deployed together.
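To give a concrete feel for what that looks like in practice, here is a minimal sketch (ours, not from Knative’s documentation) that creates a Knative Service and wires an Eventing Trigger to it using the official Kubernetes Python client. It assumes a cluster that already has Knative Serving and Eventing installed; the names, namespace, event type, and container image are illustrative.

    # Minimal sketch: deploy a Knative Service and route events to it via a Trigger.
    # Assumes Knative Serving and Eventing are installed; names/image are illustrative.
    from kubernetes import client, config

    config.load_kube_config()          # use the local kubeconfig
    api = client.CustomObjectsApi()    # Knative resources are custom resources (CRDs)

    service = {
        "apiVersion": "serving.knative.dev/v1",
        "kind": "Service",
        "metadata": {"name": "hello", "namespace": "default"},
        "spec": {
            "template": {
                "spec": {
                    "containers": [
                        {"image": "gcr.io/knative-samples/helloworld-go"}
                    ]
                }
            }
        },
    }

    trigger = {
        "apiVersion": "eventing.knative.dev/v1",
        "kind": "Trigger",
        "metadata": {"name": "orders-to-hello", "namespace": "default"},
        "spec": {
            "broker": "default",
            # Deliver only CloudEvents of this (illustrative) type
            "filter": {"attributes": {"type": "dev.example.order.created"}},
            "subscriber": {
                "ref": {
                    "apiVersion": "serving.knative.dev/v1",
                    "kind": "Service",
                    "name": "hello",
                }
            },
        },
    }

    # group, version, namespace, plural, body
    api.create_namespaced_custom_object(
        "serving.knative.dev", "v1", "default", "services", service)
    api.create_namespaced_custom_object(
        "eventing.knative.dev", "v1", "default", "triggers", trigger)

The same manifests could just as easily be applied with kubectl or the kn CLI; the point is that a deployable, autoscaled service and its event wiring fit in a few dozen lines, with Knative handling the networking and scaling machinery underneath.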

Datanami recently caught up with Naina Singh, a principal product manager for serverless at Red Hat–which is one of the major contributors to the Knative project along with Google, IBM, VMware, and SAP–for an overview of this new and important piece of infrastructure technology.

Knative components (Image source: Knative.dev)

“Knative is one such technology that provides a layer of abstraction that can simplify a lot of concerns for a developer or even the operator, what the ops-person goes through while using Kubernetes,” Singh says. “I would say the abstraction layer is so fantastic that you can even call it ‘Kubernetes-Plus,’ because it brings tons of features, like request-based autoscaling. It gives you the concurrency control.”
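Those two features map to specific knobs in Knative’s API. As an illustration (ours, not from the interview), the revision-template fragment below sets a request-based autoscaling target and a hard concurrency cap; the numbers are placeholders that would be tuned per workload, and the fragment would slot into the spec.template of the Service shown earlier.

    # Illustrative revision template showing Knative's request-based autoscaling
    # and concurrency controls; the values are placeholders, not recommendations.
    template = {
        "metadata": {
            "annotations": {
                # Aim for ~50 in-flight requests per replica before scaling out
                "autoscaling.knative.dev/target": "50",
                # Scale to zero when idle, but never beyond ten replicas
                "autoscaling.knative.dev/min-scale": "0",
                "autoscaling.knative.dev/max-scale": "10",
            }
        },
        "spec": {
            # Hard limit on concurrent requests a single container will accept
            "containerConcurrency": 50,
            "containers": [{"image": "gcr.io/knative-samples/helloworld-go"}],
        },
    }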

Knative also produces reproducible deployments, called revisions, that can simplify the management of serverless applications, Singh says. For example, it provides traffic splitting, rollbacks, and canary deployments. It also supports what are termed “blue-green” deployments, where an administrator deploys new changes on the green side of the cluster, then gradually turns off the blue side and keeps the green, she says.
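To make the traffic-splitting piece concrete, here is an illustrative traffic block for a Knative Service spec (the revision names are hypothetical): a canary split that keeps 90 percent of requests on a known-good revision while the newest revision receives 10 percent. Shifting the percentages to 0/100 completes the rollout; shifting back to 100/0 is the rollback.

    # Hypothetical canary-style traffic split between two revisions of a Service.
    # Knative keeps old revisions around, which is what makes rollbacks cheap.
    traffic = [
        {"revisionName": "hello-00001", "percent": 90, "tag": "stable"},
        {"latestRevision": True, "percent": 10, "tag": "canary"},
    ]
    # This list goes under the Service's spec["traffic"]; each tag also gets its
    # own addressable URL, which is handy for testing the canary directly.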

“All those things Knative makes it very easy,” Singh says. “It makes it very easy for the operators to not go into the weeds of Kubernetes either because they can try to control it at a serverless layer.”

Any organization that is looking to build event-driven, serverless applications that run in a distributed manner in the cloud would do well to learn about Knative and what it can provide them, Singh says.

“What it brings to you is distributed applications, loosely coupled, reactive,” she continues. Knative helps to solve “all your modern-day challenges that you get from the cloud-native. Because you have to be always on in this global world. It has to be highly available.”

The 1.0 version of Knative was released in October 2021, marking the first time the code was considered generally available. It is being adopted by vendors such as Red Hat, which offers Knative as part of its OpenShift platform. VMware and Google are also using Knative to automate the deployment of serverless, event-driven applications atop Kubernetes.

However, if customers were to deploy those types of applications atop AWS or Microsoft Azure, they would not be using Knative, because those clouds don’t support it yet. That limits the ability for customers to easily move their event-based serverless applications to other environments, Singh says.

“Let’s say right now, you are using Red Hat OpenShift Serverless, which has Knative 1.0 compliance, and then tomorrow you want to go to another vendor who’s also saying we have Knative 1.0, you should be able to run your workload because that’s the promise of it,” she explains. “So what I see in open source and in the community, we are working very hard to keep this democratization of serverless. For example, if you use an AWS Lambda solution or an Azure Functions solution or their container services–those are proprietary solutions.”

Knative is currently at version 1.2, with a new release expected about every six weeks. The Knative community consists of more than 1,800 individuals, with 94 contributors who are eligible to vote. According to the CNCF, the project’s governance structure includes 17 Working Group Leads across 11 Working Groups, five Knative Steering Committee (KSC) members, five Technical Oversight Committee (TOC) members, and three Knative Trademark Committee (KTC) members.

Carlos Santana, a Knative Steering Committee member and DOCS-UX Lead, says the delivery of Knative 1.0 and joining the CNCF will enable the project to grow.

“Becoming an incubating project will encourage additional companies to adopt, contribute to, and evangelize the project,” Santana says in a press release. “It will also bring the Knative community closer to other cloud native projects in the ecosystem–including all the projects it builds on–helping to establish a virtuous cycle for feedback and features.”

Related Items:

Kubernetes Adoption Widespread for Big Data, But Monitoring and Tuning Are Issues, Survey Finds

Most AWS Analytics Customers Will Go Serverless, VP Says

Becoming an Event-Driven Enterprise
