Saving data in an Apify Actor and cleaning it
I've tried saving the data I scrape from my Actors to a rawData.json file,
but I don't get a JSON output even though the scraping works.
How would I save the data to the Apify Console so that I can then use MongoDB to take that data and put it in my database?
I have my MongoDB schema already set up, so how would I save the data to the Apify Console and access it?
Would I have to save it to an Apify dataset? If so, how? And how would I also put it through a cleaning process in the same Actor (or, if possible, a different Actor) and THEN save it to a MongoDB database?
Here's what I have for saving the JSON file so far:
conscious-sapphireOP•2y ago
Hmm... this should generally work. The question might be where the file is actually saved. You can find examples for working with the Dataset here: https://crawlee.dev/api/core/class/Dataset (this will generate a new file in the storage folder for each item in the dataset). You should even be able to send the data to MongoDB directly, depending on your use case.
conscious-sapphireOP•2y ago
Would I have to install the fs dependency? If so, how?
No, the fs module is part of the Node.js installation.
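For reference, a minimal sketch of writing a local JSON file with the built-in fs module (nothing to install; the data here is just a placeholder):

```js
// fs ships with Node.js, so there is no dependency to add to package.json.
import { writeFileSync } from 'fs';

// Example data only - in the Actor this would be the scraped items.
writeFileSync('rawData.json', JSON.stringify([{ title: 'example' }], null, 2));
```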
conscious-sapphireOP•2y ago
Does this work in an Actor? Because it only seems to work on my local computer.
@harish So I am not sure where you run it. This is #crawlee-js, so you should be fully in control of where and how you run it. Are you running it on the Apify Platform? Then you may send me a link to the run in a DM so I can check it.
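For the cleaning-then-MongoDB part of the original question, a rough sketch of what a follow-up step might look like, either in the same Actor after scraping or in a separate one. The MONGO_URI environment variable, the cleanItem helper, the database and collection names are all assumptions for illustration, not anything from this thread:

```js
// Rough sketch: read items back from a dataset, clean them, insert into MongoDB.
import { Actor } from 'apify';
import { MongoClient } from 'mongodb';

await Actor.init();

// Hypothetical cleaning step - adapt to your own MongoDB schema.
const cleanItem = (item) => ({
    title: item.title?.trim(),
    price: Number(String(item.price).replace(/[^0-9.]/g, '')),
    url: item.url,
});

// Open the dataset produced by the scraping step (the default dataset here).
const dataset = await Actor.openDataset();
const { items } = await dataset.getData();

// MONGO_URI is an assumed environment variable / Actor secret.
const client = new MongoClient(process.env.MONGO_URI);
await client.connect();

try {
    const cleaned = items.map(cleanItem);
    // 'my_db' and 'scraped_items' are placeholder names.
    await client.db('my_db').collection('scraped_items').insertMany(cleaned);
} finally {
    await client.close();
}

await Actor.exit();
```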