Firecrawl•6mo ago
Alp
hey! I am using the crawl endpoint with a 1000-page limit, but I can see I'm being charged for 10k pages. That's a massive hit on my quotas. How can I get some help?
31 Replies
mogery•6mo ago
hey Alp! sorry, we're a bit behind on tickets and today is a company holiday. what formats are you using? also can you send me a crawl ID with this issue?
AlpOP•6mo ago
Here are some:
- 153c46bd-d229-4f0f-8449-9460ffa3251f
- b04e64f0-56f1-4f03-a7d6-035ea70060df
- 2d7a6689-f0ab-4ac7-90e8-d4d9ee1fa4b1
- f551920c-b116-47b7-93bd-fc256a51e6b3
I am using the html format. If you look at my usage, you'll see a massive spike that then goes away. Thanks @mogery!
mogery•6mo ago
you're setting a limit of 10k, not 1k
[image attachment]
mogery•6mo ago
at least that's what's arriving on our end. Can you send/review the code you're using to make the requests?
AlpOP•6mo ago
I didn't move to v1 (didn't know about it). Could it be a v0/v1 discrepancy that is auto-setting 10k? Because I'm not setting that anywhere
mogery•6mo ago
that shouldn't be the issue, but you should move to v1 ASAP: the v0 API will be disabled relatively soon, and we are no longer shipping fixes or running tests for it
AlpOP•6mo ago
got it. Is there a way to double-check the problem? I don't have a single place where 10k could have been set. What is the default value? Also, should I have expected a deprecation email about this? I totally missed it and was wondering how I would have noticed the v0 => v1 change
mogery•6mo ago
it actually looks like you're using v1 already, since you're on version 2.0.2 of the Python SDK
AlpOP•6mo ago
I think it auto-bumped the version, but the input is still v0 style (surprised it doesn't fail btw)
[image attachment]
mogery•6mo ago
looks like you are not setting a limit at all in the request you're sending us
[image attachment]
mogery•6mo ago
that is v1 style input :)
AlpOP•6mo ago
oh, it is? Then I'm extra confused 😄 thanks for the clarification
mogery•6mo ago
but yeah, none of the params are making it to us -- only the URL and the origin, which is enforced by the SDK. Let me check the SDK code to make sure we didn't mess anything up. Ah, so -- the major version bump changed the API in a way that doesn't trigger an error but silently drops the crawl parameters as you're passing them
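The failure mode described here can be sketched in plain Python. This is an illustration of how a keyword-only signature can silently swallow an old-style nested `params` dict, not the actual SDK code; the 10k fallback is an assumption based on the charges discussed above.

```python
# Hypothetical sketch: a new-style signature that reads `limit` as a
# keyword argument. An old-style `params={...}` dict lands in **kwargs
# and is never consulted, so no error is raised and a server-side
# default applies instead.

def crawl_url(url, limit=None, **kwargs):
    # Assumed default of 10k for illustration, matching the thread.
    effective_limit = limit if limit is not None else 10000
    return {"url": url, "limit": effective_limit, "ignored": sorted(kwargs)}

# Old v0-style call: the limit is buried inside `params` and dropped.
old_style = crawl_url("https://example.com", params={"limit": 1000})

# New v1-style call: the limit is a top-level keyword argument.
new_style = crawl_url("https://example.com", limit=1000)
```

Because the extra keyword is absorbed by `**kwargs` rather than rejected, the old call "works" but crawls with the default limit, which matches the 10k charges seen in the thread.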
mogery•6mo ago
this is the new function signature
[image attachment]
mogery•6mo ago
so you would need to do firecrawl.crawl_url(url='whatever', limit=1000). I'm forwarding this internally -- it should be fixed -- but for now it's better to move over to the new signature
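A minimal sketch of the corrected call shape. Only the `url` and `limit` keyword arguments come from the thread; the stub client and its return value are stand-ins so the snippet runs without the real SDK.

```python
# Stub standing in for an initialized Firecrawl SDK client
# (illustration only -- the real client sends the request to the API).

class _StubClient:
    def crawl_url(self, url, limit=None, **kwargs):
        # Echo back what would be sent, so the call shape is visible.
        return {"url": url, "limit": limit}

firecrawl = _StubClient()

# v1-style call: `limit` is a top-level keyword argument,
# not nested inside a `params` dict.
result = firecrawl.crawl_url(url="https://example.com", limit=1000)
```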
AlpOP•6mo ago
moving over asap 😒 thanks so much for such a quick investigation
mogery•6mo ago
ofc! how many credits would you estimate were wasted? would be happy to add those back to your account
AlpOP•6mo ago
I want to follow up about all the credits I lost (I ended up spending $55+ because I ran out 😦). Can we sync about that?
mogery•6mo ago
of course! can you send me your firecrawl e-mail in DMs?
AlpOP•6mo ago
I saw 5 instances of 10k -- probably 50k. Either way -- that'll be plenty for me until my next cycle 😄 Just did
mogery•6mo ago
gotchu -- sent 50k. Was that $55+ from auto-recharges?
AlpOP•6mo ago
yeah I'd say 55k-ish πŸ˜„ given my usage the rest of the time on average.
[image attachment]
AlpOP•6mo ago
thanks so much!!!
AlpOP•6mo ago
I double-checked. It's actually slightly more than $55. Here they are
[image attachment]
mogery•6mo ago
alright -- do you want me to refund those packs and cancel them?
AlpOP•6mo ago
yes please πŸ™‚
mogery•6mo ago
done -- refund should arrive in 5-10 business days according to Stripe
AlpOP•6mo ago
I'm hotfixing this in production, so it should be live in 15 minutes
[image attachment]
AlpOP•6mo ago
I'll fix everything else in the normal release train. Just making sure: does this guarantee the limit isn't impacted?
mogery•6mo ago
looks good, but it also needs ignore_query_parameters=True and scrape_options={...} in the method args. By the way, the 50k credits I issued will not reset at your billing cycle, so you can continue to use them even if they don't run out this cycle
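A hedged sketch of the fully migrated call. The parameter names `limit`, `ignore_query_parameters`, and `scrape_options` come from this thread; the stub client and the contents of the `scrape_options` dict are assumptions for illustration (html is the format mentioned earlier).

```python
# Stub standing in for the real SDK client so the snippet runs
# standalone; the real client would send these fields to the API.

class _StubClient:
    def crawl_url(self, url, limit=None,
                  ignore_query_parameters=False, scrape_options=None):
        return {
            "url": url,
            "limit": limit,
            "ignore_query_parameters": ignore_query_parameters,
            "scrape_options": scrape_options or {},
        }

firecrawl = _StubClient()

# All crawl options as top-level method arguments (v1 style),
# rather than a nested v0-style `params` dict.
request = firecrawl.crawl_url(
    url="https://example.com",
    limit=1000,
    ignore_query_parameters=True,
    scrape_options={"formats": ["html"]},  # hypothetical options shape
)
```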
AlpOP•6mo ago
oh woah. Thanks so much! appreciate it