View source: R/kinesis_operations.R
kinesis_put_record | R Documentation
Writes a single data record into an Amazon Kinesis data stream. Call put_record
to send data into the stream for real-time ingestion and subsequent processing, one record at a time. Each shard can support writes of up to 1,000 records per second, up to a maximum data write rate of 1 MiB per second.
See https://www.paws-r-sdk.com/docs/kinesis_put_record/ for full documentation.
kinesis_put_record(
  StreamName = NULL,
  Data,
  PartitionKey,
  ExplicitHashKey = NULL,
  SequenceNumberForOrdering = NULL,
  StreamARN = NULL
)
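
A minimal sketch of calling this operation through a paws client. The stream name, payload, and partition key below are placeholders, and AWS credentials are assumed to be configured in the environment:

```r
library(paws)

svc <- kinesis()

# Put one record; the data blob can be a raw vector or a string
resp <- svc$put_record(
  StreamName   = "my-stream",                 # hypothetical stream name
  Data         = charToRaw('{"temp": 21.5}'),
  PartitionKey = "sensor-1"
)

resp$ShardId         # the shard the record was written to
resp$SequenceNumber  # unique per shard, increases over time
```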
StreamName
  The name of the stream to put the data record into.

Data
  [required] The data blob to put into the record, which is base64-encoded when the blob is serialized. When the data blob (the payload before base64-encoding) is added to the partition key size, the total size must not exceed the maximum record size (1 MiB).

PartitionKey
  [required] Determines which shard in the stream the data record is assigned to. Partition keys are Unicode strings with a maximum length limit of 256 characters for each key. Amazon Kinesis Data Streams uses the partition key as input to a hash function that maps the partition key and associated data to a specific shard. Specifically, an MD5 hash function is used to map partition keys to 128-bit integer values and to map associated data records to shards. As a result of this hashing mechanism, all data records with the same partition key map to the same shard within the stream.

ExplicitHashKey
  The hash value used to explicitly determine the shard the data record is assigned to by overriding the partition key hash.

SequenceNumberForOrdering
  Guarantees strictly increasing sequence numbers for puts from the same client and to the same partition key. Usage: set the SequenceNumberForOrdering parameter to the sequence number of the preceding record (record n-1, as returned in the result when putting record n-1). If this parameter is not set, records are coarsely ordered based on arrival time.

StreamARN
  The ARN of the stream.
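
The MD5 mapping described under PartitionKey can be observed directly. A sketch, assuming the openssl package is available (any MD5 implementation gives the same result):

```r
library(openssl)

# MD5 of the partition key yields a 128-bit value (32 hex digits).
# Kinesis assigns the record to the shard whose hash-key range
# contains this value, so equal keys always land on the same shard.
md5("user-42")

# Same key, same hash, same shard:
identical(md5("user-42"), md5("user-42"))
```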
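
Chaining SequenceNumberForOrdering across successive puts can be sketched as follows. The stream name and payloads are placeholders, and configured AWS credentials are assumed:

```r
svc <- paws::kinesis()

# First put for this partition key
resp1 <- svc$put_record(
  StreamName   = "my-stream",        # hypothetical stream name
  Data         = charToRaw("event-1"),
  PartitionKey = "user-42"
)

# Feed the returned sequence number into the next put, so the second
# record's sequence number is guaranteed to be strictly greater
resp2 <- svc$put_record(
  StreamName                = "my-stream",
  Data                      = charToRaw("event-2"),
  PartitionKey              = "user-42",
  SequenceNumberForOrdering = resp1$SequenceNumber
)
```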