
azure-eventhub-rust skill

/skills/azure-eventhub-rust

This skill helps you integrate the Azure Event Hubs SDK for Rust to send and receive events efficiently across producers and consumers.

npx playbooks add skill sickn33/antigravity-awesome-skills --skill azure-eventhub-rust

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
---
name: azure-eventhub-rust
description: |
  Azure Event Hubs SDK for Rust. Use for sending and receiving events, streaming data ingestion.
  Triggers: "event hubs rust", "ProducerClient rust", "ConsumerClient rust", "send event rust", "streaming rust".
package: azure_messaging_eventhubs
---

# Azure Event Hubs SDK for Rust

Client library for Azure Event Hubs — big data streaming platform and event ingestion service.

## Installation

```sh
cargo add azure_messaging_eventhubs azure_identity
```

## Environment Variables

```bash
EVENTHUBS_HOST=<namespace>.servicebus.windows.net
EVENTHUB_NAME=<eventhub-name>
```
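
The snippets in this file use placeholder literals for the host and hub name; in practice you would typically read the variables above at startup. Here is a minimal sketch, assuming an async context and the builder call shown under "Create Producer" below (the `producer_from_env` helper and the boxed error type are illustrative, not part of the SDK):

```rust
use azure_identity::DeveloperToolsCredential;
use azure_messaging_eventhubs::ProducerClient;

// Illustrative helper: build a producer from the environment variables above.
async fn producer_from_env() -> Result<ProducerClient, Box<dyn std::error::Error>> {
    let host = std::env::var("EVENTHUBS_HOST")?;
    let eventhub = std::env::var("EVENTHUB_NAME")?;

    // Same credential used in the snippets below.
    let credential = DeveloperToolsCredential::new(None)?;

    let producer = ProducerClient::builder()
        .open(host.as_str(), eventhub.as_str(), credential.clone())
        .await?;
    Ok(producer)
}
```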

## Key Concepts

- **Namespace** — container for Event Hubs
- **Event Hub** — stream of events partitioned for parallel processing
- **Partition** — ordered sequence of events
- **Producer** — sends events to Event Hub
- **Consumer** — receives events from partitions
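
To make these concepts concrete, here is a minimal sketch (assuming an async context; the host and hub name are placeholders) that opens one Event Hub in a namespace and walks its partitions using the metadata calls shown later in this file:

```rust
use azure_identity::DeveloperToolsCredential;
use azure_messaging_eventhubs::ConsumerClient;

// Namespace host + Event Hub name identify the stream; the credential authenticates.
let credential = DeveloperToolsCredential::new(None)?;
let consumer = ConsumerClient::builder()
    .open("<namespace>.servicebus.windows.net", "eventhub-name", credential.clone())
    .await?;

// An Event Hub is split into partitions, each an ordered sequence of events.
let properties = consumer.get_eventhub_properties(None).await?;
for partition_id in &properties.partition_ids {
    let partition = consumer.get_partition_properties(partition_id.as_str(), None).await?;
    println!(
        "partition {partition_id}: last enqueued sequence number {}",
        partition.last_enqueued_sequence_number
    );
}
```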

## Producer Client

### Create Producer

```rust
use azure_identity::DeveloperToolsCredential;
use azure_messaging_eventhubs::ProducerClient;

let credential = DeveloperToolsCredential::new(None)?;
let producer = ProducerClient::builder()
    .open("<namespace>.servicebus.windows.net", "eventhub-name", credential.clone())
    .await?;
```

### Send Single Event

```rust
producer.send_event(vec![1, 2, 3, 4], None).await?;
```

### Send Batch

```rust
let batch = producer.create_batch(None).await?;
batch.try_add_event_data(b"event 1".to_vec(), None)?;
batch.try_add_event_data(b"event 2".to_vec(), None)?;

producer.send_batch(batch, None).await?;
```

## Consumer Client

### Create Consumer

```rust
use azure_identity::DeveloperToolsCredential;
use azure_messaging_eventhubs::ConsumerClient;

let credential = DeveloperToolsCredential::new(None)?;
let consumer = ConsumerClient::builder()
    .open("<namespace>.servicebus.windows.net", "eventhub-name", credential.clone())
    .await?;
```

### Receive Events

```rust
// Open receiver for specific partition
let receiver = consumer.open_partition_receiver("0", None).await?;

// Receive events
let events = receiver.receive_events(100, None).await?;
for event in events {
    println!("Event data: {:?}", event.body());
}
```

### Get Event Hub Properties

```rust
let properties = consumer.get_eventhub_properties(None).await?;
println!("Partitions: {:?}", properties.partition_ids);
```

### Get Partition Properties

```rust
let partition_props = consumer.get_partition_properties("0", None).await?;
println!("Last sequence number: {}", partition_props.last_enqueued_sequence_number);
```

## Best Practices

1. **Reuse clients** — create once, send many events
2. **Use batches** — more efficient than individual sends
3. **Check batch capacity** — `try_add_event_data` returns `false` when an event does not fit; see the sketch after this list
4. **Process partitions in parallel** — each partition can be consumed independently
5. **Use consumer groups** — isolate different consuming applications
6. **Handle checkpointing** — use `azure_messaging_eventhubs_checkpointstore_blob` for distributed consumers
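
A minimal sketch of practices 2 and 3, assuming (per the note above) that `try_add_event_data` returns `Ok(false)` when an event does not fit; `send_all` is a hypothetical helper, not part of the SDK:

```rust
// Hypothetical helper: sends `payloads` using as few batches as capacity allows.
// Assumes, per practice 3 above, that try_add_event_data returns Ok(false)
// when an event does not fit in the current batch.
async fn send_all(
    producer: &azure_messaging_eventhubs::ProducerClient,
    payloads: Vec<Vec<u8>>,
) -> Result<(), Box<dyn std::error::Error>> {
    let mut batch = producer.create_batch(None).await?;
    for payload in payloads {
        if !batch.try_add_event_data(payload.clone(), None)? {
            // Current batch is full: flush it and start a new one.
            producer.send_batch(batch, None).await?;
            batch = producer.create_batch(None).await?;
            // Retry the event that did not fit.
            // (A payload larger than an entire batch is not handled here.)
            batch.try_add_event_data(payload, None)?;
        }
    }
    // Send whatever is left in the final batch.
    producer.send_batch(batch, None).await?;
    Ok(())
}
```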

## Checkpoint Store (Optional)

For distributed consumers with checkpointing:

```sh
cargo add azure_messaging_eventhubs_checkpointstore_blob
```

## Reference Links

| Resource | Link |
|----------|------|
| API Reference | https://docs.rs/azure_messaging_eventhubs |
| Source Code | https://github.com/Azure/azure-sdk-for-rust/tree/main/sdk/eventhubs/azure_messaging_eventhubs |
| crates.io | https://crates.io/crates/azure_messaging_eventhubs |

Overview

This skill provides a concise guide to using the Azure Event Hubs SDK for Rust to send and receive streaming events. It covers creating Producer and Consumer clients, sending single or batched events, receiving from partitions, and optional checkpointing for distributed consumers. Practical tips and code snippets help integrate high-throughput event ingestion into Rust services.

How this skill works

The SDK connects to an Event Hubs namespace and opens a named Event Hub. ProducerClient sends raw event payloads or batches to partitions; ConsumerClient reads events from specific partitions and exposes event metadata. Authentication is handled via Azure credentials (e.g., DeveloperToolsCredential). For distributed processing, a checkpoint store (Blob) records progress so consumers can resume reliably.
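
As a rough end-to-end sketch of that flow, using only the calls shown in SKILL.md above (the Tokio runtime, host, and hub name are assumptions and placeholders):

```rust
use azure_identity::DeveloperToolsCredential;
use azure_messaging_eventhubs::{ConsumerClient, ProducerClient};

// Assumes a Tokio async runtime; replace host and hub name with your own values.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let host = "<namespace>.servicebus.windows.net";
    let eventhub = "eventhub-name";
    let credential = DeveloperToolsCredential::new(None)?;

    // Produce: send one raw payload to the Event Hub.
    let producer = ProducerClient::builder()
        .open(host, eventhub, credential.clone())
        .await?;
    producer.send_event(b"hello".to_vec(), None).await?;

    // Consume: read a page of events back from partition "0".
    let consumer = ConsumerClient::builder()
        .open(host, eventhub, credential.clone())
        .await?;
    let receiver = consumer.open_partition_receiver("0", None).await?;
    for event in receiver.receive_events(100, None).await? {
        println!("received: {:?}", event.body());
    }
    Ok(())
}
```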

When to use it

  • High-throughput telemetry or event ingestion from devices or services.
  • Real-time streaming pipelines where partitioned, ordered processing is required.
  • When you need reliable delivery with batching and parallel consumers.
  • Integrating Rust services with Azure event-driven architectures.
  • Distributed consumers that require checkpointing and load balancing.

Best practices

  • Create and reuse ProducerClient and ConsumerClient instances rather than recreating per operation.
  • Send events in batches to improve throughput and reduce network overhead.
  • Check try_add_event_data when building batches to avoid overfilling them.
  • Consume partitions in parallel and use consumer groups to isolate workloads.
  • Use the checkpoint store (Blob) to persist offsets for distributed, fault-tolerant consumers.

Example use cases

  • A telemetry pipeline ingesting IoT sensor readings into an Event Hub, batching writes for efficiency.
  • A log aggregation service that streams application logs to downstream processors.
  • A real-time analytics app that consumes partitioned events in parallel and updates dashboards.
  • A background Rust worker that checkpoints progress to Blob storage and resumes after restarts.
  • Microservices publishing domain events to feed event-driven workflows.

FAQ

How do I authenticate the Rust client against Azure?

Use an Azure credential implementation such as DeveloperToolsCredential for local dev or a production credential from azure_identity. Pass the credential when building ProducerClient or ConsumerClient.

When should I use batches versus single-event sends?

Use batches for high-volume workloads to reduce RPC overhead and increase throughput. Single-event sends are fine for low-volume or ad-hoc events.
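
As a small illustration of the two paths, assuming a `producer` built as in the SKILL.md snippets above:

```rust
// Low-volume or ad-hoc: send a single raw payload per call.
producer.send_event(b"heartbeat".to_vec(), None).await?;

// High-volume: pack events into one batch to amortize the round trip.
let batch = producer.create_batch(None).await?;
batch.try_add_event_data(b"reading 1".to_vec(), None)?;
batch.try_add_event_data(b"reading 2".to_vec(), None)?;
producer.send_batch(batch, None).await?;
```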