Today marks the one-year anniversary of Knative, an open-source project initiated by Google that helps developers build, deploy and manage modern serverless workloads. What started as a Google-led project now has a rich ecosystem with partners from around the world, and together, we’ve had an amazing year! Here are just a few notable stats and milestones that Knative has achieved this year:
- Seven releases since launch
- A thriving, growing ecosystem: over 3,700 pull requests from 400+ contributors associated with over 80 different companies, including industry leaders like IBM, Red Hat, SAP, TriggerMesh and Pivotal.
- Addition of non-Google contributors at the approver, lead, and steering committee level
- 20% monthly growth in contributions
With all this momentum for the project, we thought now would be a good time to reflect on why we initially created Knative, the project’s ecosystem, and how it relates to Google Cloud’s serverless vision.
Why we created Knative
Serverless computing provides developers with a number of benefits: the ability to run applications without having to worry about managing the underlying infrastructure, to execute code only when needed, to autoscale workloads from zero to N depending on traffic, and many more. But while traditional serverless offerings provide the velocity that developers love, they lack flexibility. Serverless traditionally requires developers to use specific languages and proprietary tools. It also locks developers into a cloud provider and prevents them from easily moving their workloads to other platforms.
In other words, most serverless offerings force developers to choose between the velocity and simple developer experience of serverless, and the flexibility and portability of containers. We asked ourselves, what if we could offer the best of both worlds?
Kubernetes has become the de facto standard for running containers. Even with all that Kubernetes offers, many platform providers and operators were implementing their own platforms to solve common needs like building code, scaling workloads, and connecting services with events. Not only was this a duplicative effort for everyone, it led to vendor lock-in and proprietary systems for developers. And thus, Knative was born.
What is Knative?
Knative offers a set of components that standardize mundane but difficult tasks such as building applications from source code to container images, routing and managing traffic during deployment, auto-scaling of workloads, and binding running services to a growing ecosystem of event sources.
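As a sketch of what this looks like in practice, a single Knative Service resource covers deployment, traffic routing, and autoscaling that would otherwise require several hand-written Kubernetes objects. The service name and container image below are hypothetical, and the exact API version varies across Knative releases:

```yaml
# Minimal Knative Service manifest (names and image are hypothetical).
# From this one resource, Knative Serving creates the underlying
# Deployment, Route, and autoscaling configuration for you.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello                # hypothetical service name
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/hello:latest   # hypothetical image
          env:
            - name: TARGET
              value: "Knative"
```

Applying this with `kubectl apply -f service.yaml` yields a routable service that scales with traffic, including down to zero, without hand-authoring Deployments, Services, and autoscalers.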
Idiomatic developer experience
Knative provides an idiomatic developer experience: Developers can use any language or framework, such as Django, Ruby on Rails, Spring and many more; common development patterns such as GitOps, DockerOps, or ManualOps; and easily plug into existing build and CI/CD toolchains.
A growing Knative ecosystem
When we first announced Knative, it included three main components: build, eventing, and serving, all of which have received significant investment and adoption from the community. Recently, the build component was spun out of Knative into a new project, Tekton, which focuses on a much broader set of continuous integration use cases than Knative originally set out to solve.
But perhaps the biggest indicator of Knative’s momentum is the increase in commercial Knative-based products on the market. Our own Cloud Run is based on Knative, and several members of the community also have products based on Knative, including IBM, Red Hat, SAP, TriggerMesh and Pivotal.
“We are excited to be partnering with Google on the Knative project. Knative enables us to build new innovative managed services in the cloud, easily, without having to recreate the essential building blocks. Knative is a game-changer, finally making serverless workload portability a reality.” – Sebastien Goasguen, Co-Founder, TriggerMesh
“Red Hat has been working alongside the community and innovators like Google on Knative since its inception. By adding the Knative APIs to Red Hat OpenShift, our enterprise Kubernetes platform, developers have the ability to build portable serverless applications. We look forward to enabling more serverless workloads with Red Hat OpenShift Serverless based on Knative as the project nears general availability. This has the potential to improve the general ease of Kubernetes for developers, helping teams to run modern applications across hybrid architectures.” – William Markito Oliveira, senior principal product manager, Red Hat
To learn more about Knative and the community look out for an upcoming interview with Evan Anderson, Google Cloud engineer, and a Knative technical lead on the SAP Customer Experience Labs podcast.
Knative: the basis of Google Cloud Run
At Google Cloud Next 2019, we announced Cloud Run, our newest serverless compute platform that lets you run stateless, request-driven containers without having to worry about the underlying infrastructure—no more configuring, provisioning, patching, or managing servers. Cloud Run autoscales your application from zero to N depending on traffic, and you only pay for the resources that you use. Cloud Run is available both as a fully managed offering and as an add-on in Google Kubernetes Engine (GKE).
We believe Cloud Run is the best way to use Knative. With Cloud Run, you choose how to run your serverless workloads: fully managed on Google Cloud or on GKE. You can even choose to move your workloads on-premises running on your own Kubernetes cluster or to a third-party cloud. Knative makes it easy to start with Cloud Run and later move to Cloud Run on GKE, or start in your own Kubernetes cluster and migrate to Cloud Run in the future. Because it uses Knative as the underlying platform, you can move your workloads freely across platforms, while significantly reducing switching costs.
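As an illustrative sketch of that portability (the project, image, cluster, and service names here are hypothetical), the same container image can be deployed to either target with the gcloud CLI by switching the `--platform` flag:

```shell
# Deploy to fully managed Cloud Run (hypothetical project and image).
gcloud run deploy hello \
  --image gcr.io/my-project/hello \
  --platform managed \
  --region us-central1

# Deploy the same image to Cloud Run on GKE instead
# (hypothetical cluster name and location).
gcloud run deploy hello \
  --image gcr.io/my-project/hello \
  --platform gke \
  --cluster my-cluster \
  --cluster-location us-central1-a
```

Because both targets implement the Knative Serving API, the workload itself is unchanged; only the deployment target differs.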
Customers such as Percy.io use both Cloud Run and Cloud Run on GKE, and love that they get the same experience and UI wherever they run.
“We first started running our workloads on Cloud Run as fully managed on Google Cloud, but then wanted to leverage some of the benefits of Google Kubernetes Engine (GKE), so we decided to move some services to Cloud Run on GKE. The fact we can seamlessly move from one platform to another by just changing the endpoint is amazing, and that they both have the same UI and interface makes it extremely easy to manage.” – David Jones, Director of Engineering, Percy.io
Get started with Knative today!
Knative brings portability to your serverless workloads and the simple and easy developer experience to your Kubernetes platform. It is truly the best of both worlds. If you operate your own Kubernetes environment, check out Knative today. If you’re a developer, check out Cloud Run as an easy way to experience the benefits of Knative. Get started with your free trial on Google Cloud—we can’t wait to see what you will build.
At one year old, Google’s open-source Knative project is helping to realize the dream of portable serverless workloads.