Streaming Data to Cloud Providers

This guide explains how to configure Samsara's Data Connectors to stream real-time fleet data to your cloud infrastructure. Samsara supports streaming to Google Cloud Pub/Sub, Azure Event Hubs, and AWS MSK (Amazon Managed Streaming for Apache Kafka).

Overview

Samsara's Data Connectors push fleet telemetry, events, and entity changes directly to your message broker or streaming platform. This enables real-time data pipelines for analytics, alerting, and integration with your existing systems.

Supported Destinations

Platform | Protocol | Authentication
Google Cloud Pub/Sub | Pub/Sub API | Service Account JSON
Azure Event Hubs | Kafka | SASL/PLAIN with Connection String
AWS MSK | Kafka | SASL/SCRAM-SHA-512

Google Cloud Pub/Sub

Google Cloud Pub/Sub is a fully managed messaging service. Samsara connects directly using the native Pub/Sub API.

Prerequisites

  • Google Cloud project with Pub/Sub API enabled
  • Service account with Pub/Sub Publisher role
  • Pub/Sub topic created

Step 1: Create a Pub/Sub Topic

  1. Go to Google Cloud Console > Pub/Sub > Topics.
  2. Click Create Topic.
  3. Enter a Topic ID (e.g., samsara-fleet-data).
  4. Click Create.

Step 2: Create a Service Account

  1. Go to IAM & Admin > Service Accounts.
  2. Click Create Service Account.
  3. Enter a name (e.g., samsara-pubsub-publisher).
  4. Click Create and Continue.
  5. Grant the Pub/Sub Publisher role.
  6. Click Done.

Step 3: Generate a Service Account Key

  1. Click on the service account you created.
  2. Go to the Keys tab.
  3. Click Add Key > Create new key.
  4. Select JSON format.
  5. Click Create and save the downloaded JSON file.
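Before pasting the key into Samsara in the next step, it can be worth sanity-checking the downloaded file. A minimal Python sketch (the required field names are the standard ones Google writes into service account key files):

```python
import json

# Fields a Google service account key file is expected to contain.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_key_file(path):
    """Return the parsed key if it looks like a valid service account key."""
    with open(path) as f:
        key = json.load(f)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']}")
    return key
```

The `project_id` field of the parsed key is the same value you will enter as the GCP Project ID in Step 4.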

Step 4: Configure in Samsara

  1. In Samsara, go to Settings > Developer > Data Streaming.
  2. Click the Clusters tab.
  3. Click Create Cluster Connection.
  4. Select Google Cloud Pub/Sub as the connector type.
  5. Enter:
    • GCP Project ID: Your Google Cloud project ID
    • Service Account JSON: Paste the entire contents of the JSON key file
    • Cluster Name: A descriptive name for this connection
  6. Click Save.

Step 5: Create a Stream

  1. Go to the Streams tab.
  2. Click Create Stream.
  3. Select your Pub/Sub connector.
  4. Enter the Topic Name (must match the topic created in GCP).
  5. Select the Entity Types you want to stream.
  6. Click Save.

Message Format

Messages are published with:

  • Data: JSON payload containing the entity data
  • Attributes: When the message has a key, it is surfaced as the message_key attribute and used for Pub/Sub ordering
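As a sketch of what a consumer sees, assuming a message shaped the way the Pub/Sub API delivers it (a `data` bytes payload plus an `attributes` dict, per the description above):

```python
import json

def handle_message(message):
    """Decode a Samsara Pub/Sub message into (entity_dict, ordering_key).

    `message` is assumed to be a dict with a `data` bytes payload and an
    `attributes` dict. The `message_key` attribute is absent when Samsara
    published the message without a key.
    """
    entity = json.loads(message["data"].decode("utf-8"))
    ordering_key = message.get("attributes", {}).get("message_key")
    return entity, ordering_key
```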

Configuration Summary

Setting | Value
Protocol | Google Cloud Pub/Sub API
Authentication | Service Account JSON with Pub/Sub Publisher role
GCP Project ID | Your Google Cloud project ID
Topic Name | Pub/Sub topic created in Step 1

Azure Event Hubs

Azure Event Hubs provides a Kafka-compatible endpoint, allowing the Samsara Kafka Connector to connect using its standard Kafka producer.

Tier Requirements

Samsara requires the Premium tier or higher. The Standard tier supports the Kafka protocol but does not support Gzip compression, which Samsara uses for efficient data transfer.

Tier | Kafka Support | Gzip Compression | Samsara Compatible
Basic | Not supported | Not supported | Not compatible
Standard | Supported | Not supported | Not compatible
Premium | Supported | Supported | Compatible
Dedicated | Supported | Supported | Compatible

Prerequisites

  • Azure Event Hubs namespace (Premium or Dedicated tier)
  • Event Hub (topic) created
  • Shared Access Policy with Send permission

Step 1: Create an Event Hubs Namespace

  1. Go to Azure Portal and search for Event Hubs.
  2. Click + Create.
  3. Configure:
    • Subscription: Select your subscription
    • Resource Group: Create new or select existing
    • Namespace Name: Enter a unique name (e.g., mycompany-samsara)
    • Pricing Tier: Select Premium or Dedicated
    • Region: Select your preferred region
  4. Click Review + Create > Create.

Step 2: Create an Event Hub (Topic)

  1. Navigate to your new namespace.
  2. Click + Event Hub.
  3. Enter a name (e.g., samsara-fleet-data).
  4. Configure partitions (the default is sufficient for most use cases).
  5. Click Create.

Step 3: Get the Connection String

  1. In your namespace, go to Settings > Shared access policies.
  2. Click RootManageSharedAccessKey (or create a policy with Send permission).
  3. Copy the Connection string-primary key value.

The connection string format is:

Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>
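A small Python sketch for splitting that string into its parts and deriving the Kafka endpoint used in the next step (it assumes the standard `Endpoint`/`SharedAccessKeyName`/`SharedAccessKey` segments shown above):

```python
def parse_connection_string(conn_str):
    """Split an Event Hubs connection string into its key=value segments
    and derive the Kafka bootstrap server Samsara needs."""
    parts = dict(
        segment.split("=", 1)  # maxsplit=1: the key value may contain "="
        for segment in conn_str.rstrip(";").split(";")
    )
    # Endpoint looks like sb://<namespace>.servicebus.windows.net/
    host = parts["Endpoint"].removeprefix("sb://").rstrip("/")
    return {
        "bootstrap_server": f"{host}:9093",
        "policy": parts["SharedAccessKeyName"],
        "key": parts["SharedAccessKey"],
    }
```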

Step 4: Configure in Samsara

  1. In Samsara, go to Settings > Developer > Data Streaming.
  2. Click the Clusters tab.
  3. Click Create Cluster Connection.
  4. Select Kafka as the connector type.
  5. Enter:
    • URL: <namespace>.servicebus.windows.net:9093
    • SASL Mechanism: SASL/PLAIN
    • Key: $ConnectionString (this literal string)
    • Secret: Your full connection string from Step 3
    • Cluster Name: A descriptive name for this connection
  6. Click Save.

Step 5: Create a Stream

  1. Go to the Streams tab.
  2. Click Create Stream.
  3. Select your Azure Event Hubs connector.
  4. Enter the Topic Name (the Event Hub name, e.g., samsara-fleet-data).
  5. Select the Entity Types you want to stream.
  6. Click Save.

Configuration Summary

Setting | Value
Bootstrap Server | <namespace>.servicebus.windows.net:9093
Security Protocol | SASL_SSL (TLS required)
SASL Mechanism | SASL/PLAIN
Username | $ConnectionString (literal string)
Password | Full Azure Event Hubs connection string
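For reference, the same settings expressed as a kafka-python style producer configuration dict; the namespace and key below are placeholders, not real values:

```python
# Placeholder connection string; substitute the one copied from the
# Azure portal in Step 3.
connection_string = (
    "Endpoint=sb://mycompany-samsara.servicebus.windows.net/;"
    "SharedAccessKeyName=RootManageSharedAccessKey;"
    "SharedAccessKey=<key>"
)

producer_config = {
    "bootstrap_servers": "mycompany-samsara.servicebus.windows.net:9093",
    "security_protocol": "SASL_SSL",             # TLS is required by Event Hubs
    "sasl_mechanism": "PLAIN",
    "sasl_plain_username": "$ConnectionString",  # the literal string
    "sasl_plain_password": connection_string,
    "compression_type": "gzip",                  # matches what Samsara sends
}
```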

AWS MSK (Amazon Managed Streaming for Apache Kafka)

Amazon MSK runs native Apache Kafka, providing full protocol compatibility with the Samsara Kafka Connector. Samsara connects to MSK via public endpoints using SASL/SCRAM authentication.

Prerequisites

Before connecting Samsara, you need an MSK cluster with the following configuration:

  • SASL/SCRAM authentication enabled
  • TLS encryption enabled (both client-broker and broker-broker)
  • Public access enabled with SERVICE_PROVIDED_EIPS
  • A SCRAM user stored in AWS Secrets Manager (secret name must start with AmazonMSK_, encrypted with a customer-managed KMS key)
  • A Kafka topic created for Samsara data
  • ACLs granting the SCRAM user Write, Describe, and Create permissions on the topic, and Describe on the cluster
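The naming and ACL rules above can be expressed as a quick checklist in code. This helper is purely illustrative (it is not part of any AWS SDK) and only encodes the rules in the list:

```python
# Topic-level ACL operations the SCRAM user needs, per the list above.
REQUIRED_TOPIC_ACLS = {"Write", "Describe", "Create"}

def check_msk_prereqs(secret_name, topic_acls, cluster_acls):
    """Return a list of problems with the MSK prerequisites (empty = OK)."""
    problems = []
    if not secret_name.startswith("AmazonMSK_"):
        problems.append("secret name must start with AmazonMSK_")
    missing = REQUIRED_TOPIC_ACLS - set(topic_acls)
    if missing:
        problems.append(f"missing topic ACLs: {sorted(missing)}")
    if "Describe" not in cluster_acls:
        problems.append("missing Describe ACL on the cluster")
    return problems
```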

For detailed instructions on setting up MSK with public access and SCRAM authentication, see the AWS documentation for Amazon MSK.

Note: Samsara uses Gzip compression. Ensure your MSK cluster configuration does not restrict compression types.

Step 1: Get Your Bootstrap Servers

After your MSK cluster is set up with public access, retrieve the public SCRAM bootstrap servers:

aws kafka get-bootstrap-brokers --cluster-arn <your-cluster-arn>

Use the BootstrapBrokerStringPublicSaslScram value (port 9196).
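The CLI prints JSON, so the relevant field can be extracted programmatically. A small sketch (the broker hostnames in the test are made up):

```python
import json

def public_scram_brokers(cli_output):
    """Pull the public SASL/SCRAM bootstrap servers out of the JSON
    printed by `aws kafka get-bootstrap-brokers`."""
    brokers = json.loads(cli_output)["BootstrapBrokerStringPublicSaslScram"]
    # Sanity-check that every broker uses the public SCRAM port.
    for broker in brokers.split(","):
        if not broker.endswith(":9196"):
            raise ValueError(f"unexpected port on {broker}")
    return brokers
```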

Step 2: Configure in Samsara

  1. In Samsara, go to Settings > Developer > Data Streaming.
  2. Click the Clusters tab.
  3. Click Create Cluster Connection.
  4. Select Kafka as the connector type.
  5. Enter:
    • URL: Your bootstrap servers (comma-separated if multiple)
    • SASL Mechanism: SASL/SCRAM-SHA-512
    • Key: Your SCRAM username
    • Secret: Your SCRAM password
    • Cluster Name: A descriptive name for this connection
  6. Click Save.

Step 3: Create a Stream

  1. Go to the Streams tab.
  2. Click Create Stream.
  3. Select your MSK connector.
  4. Enter the Topic Name (e.g., samsara-fleet-data).
  5. Select the Entity Types you want to stream.
  6. Click Save.

Configuration Summary

Setting | Value
Bootstrap Server | b-1-public.<cluster>.<region>.amazonaws.com:9196
Security Protocol | SASL_SSL (TLS required)
SASL Mechanism | SASL/SCRAM-SHA-512
Username | SCRAM username from Secrets Manager
Password | SCRAM password from Secrets Manager
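The same settings as a kafka-python style configuration dict; the broker host, username, and password below are placeholders:

```python
# Client-side equivalent of the summary above. Substitute your own
# bootstrap servers and the SCRAM credentials from Secrets Manager.
msk_config = {
    "bootstrap_servers": "b-1-public.democluster.abc123.kafka.us-east-1.amazonaws.com:9196",
    "security_protocol": "SASL_SSL",        # TLS is mandatory on public MSK endpoints
    "sasl_mechanism": "SCRAM-SHA-512",
    "sasl_plain_username": "samsara-user",  # SCRAM user from Secrets Manager
    "sasl_plain_password": "<password>",
    "compression_type": "gzip",             # Samsara publishes gzip-compressed data
}
```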

Troubleshooting

Error | Cause | Solution
unexpected EOF | TLS not enabled | Ensure TLS is enabled on the cluster
Authentication failed | Wrong credentials | Verify username/password in Secrets Manager
SCRAM authentication failed | Secret not associated | Run batch-associate-scram-secret
TopicAuthorizationException | No ACL | Grant ACLs using IAM authentication
Unknown Topic Or Partition | Topic missing | Create the topic manually
encrypted with default key | Wrong KMS key | Use a customer-managed KMS key

Verifying the Connection

After configuring any provider, verify data is flowing:

  1. In Samsara, go to Settings > Developer > Data Streaming > Streams.
  2. Check that the stream status shows Active.
  3. On your cloud provider side, confirm messages are arriving in the topic or Event Hub.