Streaming Data to Cloud Providers
This guide explains how to configure Samsara's Data Connectors to stream real-time fleet data to your cloud infrastructure. Samsara supports streaming to Google Cloud Pub/Sub, Azure Event Hubs, and AWS MSK (Amazon Managed Streaming for Apache Kafka).
Overview
Samsara's Data Connectors push fleet telemetry, events, and entity changes directly to your message broker or streaming platform. This enables real-time data pipelines for analytics, alerting, and integration with your existing systems.
Supported Destinations
| Platform | Protocol | Authentication |
|---|---|---|
| Google Cloud Pub/Sub | Pub/Sub API | Service Account JSON |
| Azure Event Hubs | Kafka | SASL/PLAIN with Connection String |
| AWS MSK | Kafka | SASL/SCRAM-SHA-512 |
Google Cloud Pub/Sub
Google Cloud Pub/Sub is a fully managed messaging service. Samsara connects directly using the native Pub/Sub API.
Prerequisites
- Google Cloud project with Pub/Sub API enabled
- Service account with Pub/Sub Publisher role
- Pub/Sub topic created
Step 1: Create a Pub/Sub Topic
- Go to Google Cloud Console > Pub/Sub > Topics.
- Click Create Topic.
- Enter a Topic ID (e.g., `samsara-fleet-data`).
- Click Create.
Step 2: Create a Service Account
- Go to IAM & Admin > Service Accounts.
- Click Create Service Account.
- Enter a name (e.g., `samsara-pubsub-publisher`).
- Click Create and Continue.
- Grant the Pub/Sub Publisher role.
- Click Done.
Step 3: Generate a Service Account Key
- Click on the service account you created.
- Go to the Keys tab.
- Click Add Key > Create new key.
- Select JSON format.
- Click Create and save the downloaded JSON file.
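Before pasting the key into Samsara, it can be worth sanity-checking the downloaded file. The sketch below (an assumption, not part of Samsara's tooling) verifies the standard service account key fields; the sample values are placeholders, not real credentials:

```python
import json

# Standard fields present in every GCP service account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_key(key: dict) -> dict:
    """Raise ValueError unless the dict looks like a service account key."""
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"expected type 'service_account', got {key['type']!r}")
    return key

# In practice, load the file you downloaded in this step, e.g.:
#   key = validate_key(json.load(open("samsara-pubsub-publisher.json")))
sample = {  # placeholder values for illustration only
    "type": "service_account",
    "project_id": "my-gcp-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n<key material>",
    "client_email": "samsara-pubsub-publisher@my-gcp-project.iam.gserviceaccount.com",
}
validate_key(sample)
```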
Step 4: Configure in Samsara
- In Samsara, go to Settings > Developer > Data Streaming.
- Click the Clusters tab.
- Click Create Cluster Connection.
- Select Google Cloud Pub/Sub as the connector type.
- Enter:
- GCP Project ID: Your Google Cloud project ID
- Service Account JSON: Paste the entire contents of the JSON key file
- Cluster Name: A descriptive name for this connection
- Click Save.
Step 5: Create a Stream
- Go to the Streams tab.
- Click Create Stream.
- Select your Pub/Sub connector.
- Enter the Topic Name (must match the topic created in GCP).
- Select the Entity Types you want to stream.
- Click Save.
Message Format
Messages are published with:
- Data: JSON payload containing the entity data
- Attributes: When the message has a key, it is surfaced as the `message_key` attribute and used for Pub/Sub ordering
Configuration Summary
| Setting | Value |
|---|---|
| Protocol | Google Cloud Pub/Sub API |
| Authentication | Service Account JSON with Pub/Sub Publisher role |
| GCP Project ID | Your Google Cloud project ID |
| Topic Name | Pub/Sub topic created in Step 1 |
Azure Event Hubs
Azure Event Hubs provides a Kafka-compatible endpoint, allowing the Samsara Kafka Connector to connect using its standard Kafka producer.
Tier Requirements
Samsara requires Premium tier or higher. Standard tier supports the Kafka protocol but does not support Gzip compression, which Samsara uses for efficient data transfer.
| Tier | Kafka Support | Gzip Compression | Samsara Compatible |
|---|---|---|---|
| Basic | Not supported | Not supported | Not compatible |
| Standard | Supported | Not supported | Not compatible |
| Premium | Supported | Supported | Compatible |
| Dedicated | Supported | Supported | Compatible |
Prerequisites
- Azure Event Hubs namespace (Premium or Dedicated tier)
- Event Hub (topic) created
- Shared Access Policy with Send permission
Step 1: Create an Event Hubs Namespace
- Go to Azure Portal and search for Event Hubs.
- Click + Create.
- Configure:
- Subscription: Select your subscription
- Resource Group: Create new or select existing
- Namespace Name: Enter a unique name (e.g., `mycompany-samsara`)
- Pricing Tier: Select Premium or Dedicated
- Region: Select your preferred region
- Click Review + Create > Create.
Step 2: Create an Event Hub (Topic)
- Navigate to your new namespace.
- Click + Event Hub.
- Enter a name (e.g., `samsara-fleet-data`).
- Configure partitions (the default is sufficient for most use cases).
- Click Create.
Step 3: Get the Connection String
- In your namespace, go to Settings > Shared access policies.
- Click RootManageSharedAccessKey (or create a policy with Send permission).
- Copy the Connection string–primary key.
The connection string format is:
Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>
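The Kafka settings you need in Step 4 can all be derived from this string. A sketch of that mapping (the namespace, policy name, and key below are placeholders):

```python
def parse_connection_string(conn: str) -> dict:
    """Split an Event Hubs connection string and derive the Kafka bootstrap server."""
    # Each segment is Name=Value; split only on the first '=' since
    # SharedAccessKey values often contain '=' padding.
    parts = dict(p.split("=", 1) for p in conn.rstrip(";").split(";"))
    host = parts["Endpoint"].removeprefix("sb://").rstrip("/")
    return {
        "bootstrap_server": f"{host}:9093",  # Kafka endpoint, port 9093
        "policy": parts["SharedAccessKeyName"],
        "key": parts["SharedAccessKey"],
    }

conn = ("Endpoint=sb://mycompany-samsara.servicebus.windows.net/;"
        "SharedAccessKeyName=RootManageSharedAccessKey;"
        "SharedAccessKey=abc123=")
cfg = parse_connection_string(conn)
```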
Step 4: Configure in Samsara
- In Samsara, go to Settings > Developer > Data Streaming.
- Click the Clusters tab.
- Click Create Cluster Connection.
- Select Kafka as the connector type.
- Enter:
- URL: `<namespace>.servicebus.windows.net:9093`
- SASL Mechanism: `SASL/PLAIN`
- Key: `$ConnectionString` (this literal string)
- Secret: Your full connection string from Step 3
- Cluster Name: A descriptive name for this connection
- Click Save.
Step 5: Create a Stream
- Go to the Streams tab.
- Click Create Stream.
- Select your Azure Event Hubs connector.
- Enter the Topic Name (the Event Hub name, e.g., `samsara-fleet-data`).
- Select the Entity Types you want to stream.
- Click Save.
Configuration Summary
| Setting | Value |
|---|---|
| Bootstrap Server | <namespace>.servicebus.windows.net:9093 |
| Security Protocol | SASL_SSL (TLS required) |
| SASL Mechanism | SASL/PLAIN |
| Username | $ConnectionString (literal string) |
| Password | Full Azure Event Hubs connection string |
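If you want to sanity-check the endpoint with your own Kafka client before pointing Samsara at it, the summary above maps onto librdkafka-style settings (as used by confluent-kafka) roughly as follows. This is a sketch under the assumption that you test with confluent-kafka; the namespace and key are placeholders:

```python
# Placeholder connection string in the format shown in Step 3; substitute
# the value copied from the Azure portal.
CONNECTION_STRING = (
    "Endpoint=sb://mycompany-samsara.servicebus.windows.net/;"
    "SharedAccessKeyName=RootManageSharedAccessKey;"
    "SharedAccessKey=<key>"
)

conf = {
    "bootstrap.servers": "mycompany-samsara.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",  # the literal string, not a variable
    "sasl.password": CONNECTION_STRING,    # full connection string from Step 3
    "compression.type": "gzip",            # matches Samsara's use of Gzip
}
# To actually connect:
#   from confluent_kafka import Producer
#   producer = Producer(conf)
```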
AWS MSK (Amazon Managed Streaming for Apache Kafka)
Amazon MSK runs native Apache Kafka, providing full protocol compatibility with the Samsara Kafka Connector. Samsara connects to MSK via public endpoints using SASL/SCRAM authentication.
Prerequisites
Before connecting Samsara, you need an MSK cluster with the following configuration:
- SASL/SCRAM authentication enabled
- TLS encryption enabled (both client-broker and broker-broker)
- Public access enabled with `SERVICE_PROVIDED_EIPS`
- A SCRAM user stored in AWS Secrets Manager (secret name must start with `AmazonMSK_`, encrypted with a customer-managed KMS key)
- A Kafka topic created for Samsara data
- ACLs granting the SCRAM user `Write`, `Describe`, and `Create` permissions on the topic, and `Describe` on the cluster
For detailed instructions on setting up MSK with public access and SCRAM authentication, see:
- Creating an Amazon MSK cluster
- Sign-in credentials authentication with AWS Secrets Manager
- Configuring public access
- Apache Kafka ACLs
Note: Samsara uses Gzip compression. Ensure your MSK cluster configuration does not restrict compression types.
Step 1: Get Your Bootstrap Servers
After your MSK cluster is set up with public access, retrieve the public SCRAM bootstrap servers:
```shell
aws kafka get-bootstrap-brokers --cluster-arn <your-cluster-arn>
```

Use the `BootstrapBrokerStringPublicSaslScram` value (port 9196).
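The command returns JSON; the sketch below pulls out the public SASL/SCRAM broker list from a response shaped like the CLI's output (the broker hostnames are placeholders):

```python
import json

# Simulated `aws kafka get-bootstrap-brokers` response; real responses
# include additional broker-string fields for other auth modes.
response = json.loads(
    '{"BootstrapBrokerStringPublicSaslScram": '
    '"b-1-public.demo.abc123.kafka.us-east-1.amazonaws.com:9196,'
    'b-2-public.demo.abc123.kafka.us-east-1.amazonaws.com:9196"}'
)

# Comma-separated broker list; every public SASL/SCRAM broker uses port 9196.
brokers = response["BootstrapBrokerStringPublicSaslScram"].split(",")
```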
Step 2: Configure in Samsara
- In Samsara, go to Settings > Developer > Data Streaming.
- Click the Clusters tab.
- Click Create Cluster Connection.
- Select Kafka as the connector type.
- Enter:
- URL: Your bootstrap servers (comma-separated if multiple)
- SASL Mechanism: `SASL/SCRAM-SHA-512`
- Key: Your SCRAM username
- Secret: Your SCRAM password
- Cluster Name: A descriptive name for this connection
- Click Save.
Step 3: Create a Stream
- Go to the Streams tab.
- Click Create Stream.
- Select your MSK connector.
- Enter the Topic Name (e.g., `samsara-fleet-data`).
- Select the Entity Types you want to stream.
- Click Save.
Configuration Summary
| Setting | Value |
|---|---|
| Bootstrap Server | b-1-public.<cluster>.<region>.amazonaws.com:9196 |
| Security Protocol | SASL_SSL (TLS required) |
| SASL Mechanism | SASL/SCRAM-SHA-512 |
| Username | SCRAM username from Secrets Manager |
| Password | SCRAM password from Secrets Manager |
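As with Event Hubs, you can verify the endpoint with your own Kafka client before configuring Samsara. The summary above translates to librdkafka-style settings roughly like this (a sketch assuming confluent-kafka; hostname and credentials are placeholders):

```python
conf = {
    "bootstrap.servers": "b-1-public.demo.abc123.kafka.us-east-1.amazonaws.com:9196",
    "security.protocol": "SASL_SSL",         # TLS is required for public MSK access
    "sasl.mechanisms": "SCRAM-SHA-512",
    "sasl.username": "samsara-user",         # SCRAM user from Secrets Manager (placeholder)
    "sasl.password": "<scram-password>",     # SCRAM password from Secrets Manager
}
# To actually connect:
#   from confluent_kafka import Producer
#   producer = Producer(conf)
```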
Troubleshooting
| Error | Cause | Solution |
|---|---|---|
| `unexpected EOF` | TLS not enabled | Ensure TLS is enabled on the cluster |
| `Authentication failed` | Wrong credentials | Verify username/password in Secrets Manager |
| `SCRAM authentication failed` | Secret not associated | Run `batch-associate-scram-secret` |
| `TopicAuthorizationException` | No ACL | Grant ACLs using IAM authentication |
| `Unknown Topic Or Partition` | Topic missing | Create the topic manually |
| `encrypted with default key` | Wrong KMS key | Use a customer-managed KMS key |
Verifying the Connection
After configuring any provider, verify data is flowing:
- In Samsara, go to Settings > Developer > Data Streaming > Streams.
- Check that the stream status shows Active.
- On your cloud provider side, confirm messages are arriving in the topic or Event Hub.
