**Pipeline ID** - 8428c7b24c4b44609b11c8bd9319f7b1
**Event production method** - Worker Binding - tesdash-signal-processor
**Sample events** - `{"signal":"FAAAAAAAD.....=","received":1759446232076,"session_id":"c985bc9d-d018-4176-ab42-6e04c84e770b"}`
**Wrangler version** - 4.41.0
I am trying to publish events to a pipeline for simple R2 archival. I have tried various iterations of the pipeline (different schemas, batch sizes, rollover sizes), but the pipeline reports no events and creates no files.
I am using the default SQL insert - I do not intend for any transformation to occur.
```sql
INSERT INTO tesdash_signal_archive_sink SELECT * FROM tesdash_signal_archive_stream;
```
I previously had required fields but have since made them all optional.
Stream input schema:
```text
┌────────────┬───────────┬─────────────┬──────────┐
│ Field Name │ Type      │ Unit/Items  │ Required │
├────────────┼───────────┼─────────────┼──────────┤
│ signal     │ string    │             │ No       │
├────────────┼───────────┼─────────────┼──────────┤
│ received   │ timestamp │ millisecond │ No       │
├────────────┼───────────┼─────────────┼──────────┤
│ session_id │ string    │             │ No       │
└────────────┴───────────┴─────────────┴──────────┘
```
I also had the `signal` field set as binary but moved it to a string, and still no events show up.
The worker is not throwing an exception when invoking the binding.
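For reference, the binding call is essentially this shape (a simplified sketch, not the actual source; `STREAM` is an illustrative binding name, and I'm assuming the documented `send()` signature, which takes an array of JSON-serializable records):

```ts
// Simplified sketch of the producing worker (tesdash-signal-processor).
// The binding name STREAM is illustrative; the real name comes from the
// "pipelines" binding declared in the wrangler config.
interface Env {
  STREAM: { send(records: object[]): Promise<void> };
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const event = {
      signal: "FAAAAAAAD.....=", // base64-encoded payload, as in the sample event
      received: Date.now(),      // millisecond timestamp, matching the schema
      session_id: crypto.randomUUID(),
    };

    // send() resolves without throwing, so the worker believes the
    // stream accepted the event.
    await env.STREAM.send([event]);
    return new Response("ok");
  },
};
```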
Ideally I would like a long and large batch (6 hours and 1024 MB), but I can't even get this working with a short-lived roll interval.
Setup flow:

```text
✔ The base prefix in your bucket where data will be written (optional): …
✔ Time partition pattern (optional): … year=%Y/month=%m/day=%d
✔ Output format: › JSON
✔ Roll file when size reaches (MB, minimum 5): … 5
✔ Roll file when time reaches (seconds, minimum 10): … 10
✔ Automatically generate credentials needed to write to your R2 bucket? … yes

Stream: tesdash_signal_archive_stream
• HTTP: Disabled
• Schema: 3 fields
Sink: tesdash_signal_archive_sink
• Type: R2 Bucket
• Bucket: tesdash-signal-archive
• Format: json

✔ Create stream and sink? … yes
```
Metrics are flat despite publishing a couple dozen events.

I tried a couple of other things, also with no luck:
1.) Created a new pipeline, stream, and sink with the HTTP endpoint enabled and posted multiple requests to it (see the sketch after this list). Every request returned `{"success":true,"result":{"committed":1}}`, and I posted at least 5 MB of data, but the dashboard still shows no events and there are no file operations in R2.
2.) Created a new stream with the config mentioned above, but with the `received` field set to `int32` instead of `timestamp`. Same behaviour as above.
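The HTTP test in (1) was roughly the following (a minimal sketch; the ingest URL and bearer token are placeholders for the values shown in the dashboard, and I'm assuming a JSON-array request body, which matches the `committed: 1` responses I got):

```ts
// Minimal repro sketch for the HTTP ingest test. INGEST_URL and API_TOKEN
// are placeholders for the stream's values from the dashboard.
const INGEST_URL = "https://<stream-ingest-endpoint>"; // placeholder
const API_TOKEN = "<token>"; // placeholder

async function postEvent(): Promise<void> {
  const event = {
    signal: "FAAAAAAAD.....=", // base64-encoded payload, as in the sample event
    received: Date.now(),      // millisecond timestamp
    session_id: crypto.randomUUID(),
  };

  const res = await fetch(INGEST_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_TOKEN}`,
    },
    body: JSON.stringify([event]), // one record per request
  });

  // Each request returned: {"success":true,"result":{"committed":1}}
  console.log(res.status, await res.text());
}

await postEvent();
```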
I should also note that when creating these in the UI, I did receive an error message on create; however, when I looked in the UI afterwards, I saw that all three instances (pipeline, stream, and sink) had been created with the right settings.
Created test instances:
095dfcdd6ead4e618d297d6b04d6a2e2
2a4d34cca2d34864914797d4242c70fb