Hey
I'm unable to create streams with dashes in the name (like the example in https://developers.cloudflare.com/pipelines/streams/manage-streams/):
I believe it's the same issue reported here: https://discord.com/channels/595317990191398933/1359448593110663248/1421027714546143293
It may be something related to my account - I can't seem to be able to create streams at all:
Nothing happens after this:
Yeah currently pipelines has issues with dashes in the name - the team is working on better validation here
The second issue is odd let me check behind the scenes real quick
Thanks Marc. Here's my schema: https://gist.github.com/albt-revolucao/70f9b9ec9b4ee62c61867c888c76a2f0
Thanks for that. I was able to reproduce ... checking if there's something obvious on our side going on
I think I figured it out - it's the brands field inside context.userAgentData. If you remove it you'll be able to create the stream.
yup - that's it, though this appears to be a bug. Trying a quick workaround for you...
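For context, the shape that tripped up stream creation is a list of nested structs. A hedged sketch of a Segment-style event carrying that field (the values are illustrative, not from the actual events):

```python
import json

# Sketch of the payload shape in question: context.userAgentData.brands is a
# list of structs, i.e. an array of objects inside a nested object.
event = {
    "type": "track",
    "event": "Button Clicked",
    "context": {
        "userAgentData": {
            "mobile": False,
            "platform": "macOS",
            # The list-of-structs field that stream creation choked on:
            "brands": [
                {"brand": "Chromium", "version": "124"},
                {"brand": "Google Chrome", "version": "124"},
            ],
        }
    },
}

print(json.dumps(event["context"]["userAgentData"]["brands"][0]))
```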
All good! I don't need that field; was just trying to match Segment's event format.
I appreciate your help!
roger - still a weird bug we'll need to fix. This list is being stubborn
Marc, sorry for the double ping - this isn't urgent (and can wait until Monday): I think I set everything up correctly, but I'm not seeing events flowing to R2 (bucket is empty other than the metadata file, and trying to select everything using the SQL API returns "Query executed successfully with no results").
In the dashboard I see 340kb of data in but no data delivered, events delivered, or events dropped (except for a few that I intentionally sent with fake data).
What's interesting is that I used the cURL provided by the Wrangler CLI to send a few events:
And those aren't showing up either.
I'll check later today and will let you know if anything changed.
No worries bert - thanks for putting it through its paces. First thing, re the original issue, it looks like we expect an arbitrary name field inside of items - so this worked:
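For anyone following along, a hedged sketch of the workaround described here (the field layout and type names are assumptions for illustration, not Pipelines' documented schema syntax):

```python
# Hedged sketch only: declaring a list-of-structs column whose items entry
# carries its own (arbitrary) name field, per the workaround above.
list_field = {
    "name": "brands",
    "type": "list",
    "items": {
        "name": "element",  # arbitrary name that let stream creation succeed
        "type": "struct",
        "fields": [
            {"name": "brand", "type": "string"},
            {"name": "version", "type": "string"},
        ],
    },
}

print(list_field["items"]["name"])
```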
for the second thing - can you share the command you used to create the sink?
Sure thing -
aha
There are two settings that control the time it takes for data to show up:
so by default, it's set to 5 minutes (300 seconds) - you can set this to be much lower if you'd like. The only issue is we don't support editing a sink yet, so you'd have to delete/recreate
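As a rough mental model of those two settings (the parameter names and the size default here are assumptions, not actual sink options): a sink rolls a file when either the time interval elapses or the buffered size crosses a threshold, so worst-case delivery latency is roughly the roll interval.

```python
def should_flush(seconds_since_last_flush: float,
                 buffered_bytes: int,
                 roll_interval_seconds: int = 300,          # default: 5 minutes
                 roll_size_bytes: int = 100 * 1024 * 1024) -> bool:
    """Toy model: flush when either the time or the size threshold is hit.

    Parameter names and the 100 MB size default are illustrative assumptions;
    the real sink settings may differ.
    """
    return (seconds_since_last_flush >= roll_interval_seconds
            or buffered_bytes >= roll_size_bytes)

# A small batch sent 30 seconds ago won't have been written out yet...
print(should_flush(30, 340 * 1024))    # False
# ...but it will be once the default 300 s interval elapses.
print(should_flush(301, 340 * 1024))   # True
```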
Right! I don't think this is quite it, though. Check this out:

I sent those events ~20-30 minutes ago.
oh
I'm pretty sure something's wrong with my schema, but it's interesting that I only see 8 events dropped in the dashboard.
(My events aren't 25kb each)
hmm
looking
No worries - happy to chat Monday. This isn't urgent at all.
Ok! the last thing I'll ask you for is the schema - you said you essentially copied and pasted the output from the wrangler command to post the messages?
It looks like there were some deserialization errors ... which has me wondering if the example data Wrangler outputs is incorrect
Yep! cURL + schema: https://gist.github.com/albt-revolucao/1f4580d94c70b8aefe6906e6a5c0a6e6
you're awesome thank you
yeah this one will need me to do a bit more debugging so it might be a bit - maybe Monday. What's in the gist is valid - I'm assuming whatever Wrangler output was not
Looks like the team identified the issue and has a fix on the way soon. Interestingly enough, the bug was found thanks to the specific shape of your schema, so thank you for trying this and finding it (also sorry you ran into the bug)
Sweet! No worries. I ended up changing context to json since Segment doesn't specify any format in their documentation, and everything's working on my end.
If I may, one last bug report: for some reason when I cmd+click on "Pipelines", "Streams", or "Sinks" in the sidebar I see this

If I click to open in the same tab it works.
This is what I see in the console.

Ooo good find. Will get that one fixed
Hi Bert, wanted to let you know that we've deployed a fix for that issue. Nested structs should now work properly.