April 4, 2024

Cloudflare Announces Major Updates for R2 Including Event Notifications and GCS Support

Cloudflare, a global connectivity cloud company, announced key updates for R2, the company’s S3-compatible object storage service. The expanded capabilities are part of Cloudflare’s broader goal of continually improving its zero-egress object storage to provide reliable, cost-effective storage and powerful workflows.

One of the key updates is event notifications for R2 storage, now in open beta. Each time data in an R2 bucket changes, an event notification is delivered to a Cloudflare Queue, which a Cloudflare Worker can then consume to take further action as needed. This new feature enables developers to receive notifications for different types of events, including changes to Worker namespaces and updates to the D1 database.
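A minimal sketch of what consuming these notifications in a Worker might look like is shown below. It assumes a Queues consumer binding has been configured in wrangler.toml, and the message field names (action, bucket, object.key) are illustrative assumptions rather than the documented beta schema.

```ts
// Sketch of a Queues consumer Worker reacting to R2 event notifications.
// Field names in R2EventBody are assumptions about the beta payload.

interface R2EventBody {
  action?: string;                          // assumed: e.g. "PutObject", "DeleteObject"
  bucket?: string;                          // assumed: source bucket name
  object?: { key?: string; size?: number }; // assumed: affected object metadata
}

export default {
  // Cloudflare invokes queue() with batches of messages from the queue
  // this Worker is bound to as a consumer.
  async queue(batch: MessageBatch<R2EventBody>): Promise<void> {
    for (const msg of batch.messages) {
      const { action, bucket, object } = msg.body;
      console.log(`R2 event: ${action} on ${bucket}/${object?.key}`);
      // React here: invalidate a cache, kick off processing, notify a webhook, etc.
      msg.ack(); // acknowledge so the message is not redelivered
    }
  },
};
```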

Cloudflare also introduced a private beta of the Infrequent Access storage class for data that isn’t accessed often, such as logs and long-tail user-generated content. With Infrequent Access, developers pay less to store rarely accessed data; retrieval is priced by volume, and there are no egress fees when the data does need to be used. This pricing model allows Cloudflare to reduce storage costs for developers.

In addition, users can define object lifecycle rules to move data to the Infrequent Access class after a specified period of time. Cloudflare also shared its vision of offering automatic optimization of storage classes in the future. With this enhancement, users won’t have to set rules manually, offering a more streamlined way to adapt to changing data access patterns.
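Lifecycle rules can be set in the dashboard, and because R2 exposes an S3-compatible API, a transition rule could plausibly be configured programmatically along the lines of the sketch below. The endpoint, credentials, and the "STANDARD_IA" storage-class identifier are assumptions; confirm the exact values R2 accepts against its documentation.

```ts
import {
  S3Client,
  PutBucketLifecycleConfigurationCommand,
} from "@aws-sdk/client-s3";

// R2 exposes an S3-compatible endpoint; the account ID and keys are placeholders.
const s3 = new S3Client({
  region: "auto",
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: "<R2_ACCESS_KEY_ID>",
    secretAccessKey: "<R2_SECRET_ACCESS_KEY>",
  },
});

// Move objects under logs/ to Infrequent Access 30 days after creation.
// "STANDARD_IA" is an assumed storage-class identifier, not confirmed by R2's docs.
await s3.send(
  new PutBucketLifecycleConfigurationCommand({
    Bucket: "my-bucket",
    LifecycleConfiguration: {
      Rules: [
        {
          ID: "move-logs-to-infrequent-access",
          Status: "Enabled",
          Filter: { Prefix: "logs/" },
          Transitions: [{ Days: 30, StorageClass: "STANDARD_IA" }],
        },
      ],
    },
  }),
);
```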

Super Slurper, a tool released by Cloudflare last year to enable easy and efficient migration of data to R2, has also been updated: it now supports migrating data from Google Cloud Storage (GCS). Users can track the status of a migration from the dashboard.

A new storage ingestion service called Pipelines has also been introduced. It supports streaming data in over HTTP, WebSockets, and Apache Kafka, is designed to ingest data at scale, and can aggregate records and write them directly to R2, eliminating the need to manage partitions or infrastructure, or to worry about durability.
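Since Pipelines was announced as an upcoming service, there is no published client API to show; the hypothetical sketch below only illustrates the shape of sending batched records to an HTTP ingestion endpoint. The endpoint URL, auth header, and payload format are all assumptions.

```ts
// Hypothetical sketch of pushing records to a Pipelines-style HTTP ingestion
// endpoint. The endpoint, auth header, and payload shape are assumptions.

interface ClickEvent {
  userId: string;
  url: string;
  ts: string;
}

async function sendEvents(events: ClickEvent[]): Promise<void> {
  const res = await fetch("https://<PIPELINE_ENDPOINT>/ingest", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer <API_TOKEN>", // placeholder credential
    },
    // Batch records client-side; the service aggregates them and writes to R2.
    body: JSON.stringify(events),
  });
  if (!res.ok) throw new Error(`Ingest failed with status ${res.status}`);
}

await sendEvents([
  { userId: "u_123", url: "/pricing", ts: new Date().toISOString() },
]);
```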

As part of Cloudflare Developer Week 2024, Cloudflare also announced new updates that empower developers to deploy AI applications on Cloudflare’s global network in one click directly from Hugging Face, an open-source hosting platform for natural language processing (NLP) and other machine learning (ML) domains.


Cloudflare is now the first serverless GPU preferred partner for deploying Hugging Face models. There are 14 curated Hugging Face models now optimized for Cloudflare, supporting different tasks such as embeddings, text generation, and sentence similarity.
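Once deployed, one of these curated models can be called from a Worker through the Workers AI binding, roughly as in the sketch below. The binding name and the model ID are illustrative assumptions; the Workers AI catalog lists the exact identifiers of the curated models.

```ts
// Minimal sketch of running inference on a Hugging Face-curated model from a
// Worker via the Workers AI binding. The model ID below is an assumed example.

export interface Env {
  AI: Ai; // Workers AI binding configured in wrangler.toml (assumed name)
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Text generation with a curated Hugging Face model (assumed ID).
    const result = await env.AI.run("@hf/thebloke/mistral-7b-instruct-v0.1-awq", {
      prompt: "Summarize why zero-egress object storage matters in one sentence.",
    });
    return Response.json(result);
  },
};
```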

“The recent generative AI boom has companies across industries investing massive amounts of time and money into AI. Some of it will work, but the real challenge of AI is that the demo is easy, but putting it into production is incredibly hard,” said Matthew Prince, CEO and co-founder, Cloudflare. 

According to Prince, Cloudflare can solve this problem by minimizing the complexity and cost of building AI-powered apps. Workers AI is already one of the most affordable and accessible solutions to run inference, and now with the Hugging Face integration, Cloudflare is in a better position to democratize AI. This will help reduce barriers to AI adoption and give more freedom to developers looking to scale their AI apps.

Related Items

Cloudflare Powers Hyper-Local AI Inference with NVIDIA Accelerated Computing

Inside AWS’s Plans to Make S3 Faster and Better

AWS Launches High-Speed Amazon S3 Express One Zone
