Is there a `wrangler dev` command with hot reload?

`react-router dev` should automagically run wrangler under the hood (you might need to clone the template to see what's going on).

`_redirects` can do this if you set the code to 200.
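For reference, that kind of rule in a `_redirects` file looks something like the sketch below; a 200 status makes it a rewrite (the browser keeps the original URL) rather than a redirect:

```
# serve the contents of /ping.txt when /ping is requested, without redirecting
/ping /ping.txt 200
```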
I mean, there's nothing preventing users from hitting the API in many ways.

Yes, there is? That's why I want to check CF-Worker (and enforce 2a06:98c0:3600::103 as the source IP).

So maybe offer an API key for tracking/limiting?

That isn't nearly as frictionless. I already do that, and it's just one more reason why people don't want to try it out.
> Yes, there is? That's why I want to check CF-Worker (and enforce 2a06:98c0:3600::103 as the source IP).

I can make 20 free Cloudflare accounts and abuse the API while looking legitimate. Even if I have to buy a domain for each, that's what, $4/yr each?
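For what it's worth, the check being discussed would look roughly like the sketch below on the receiving end. CF-Worker and CF-Connecting-IP are real Cloudflare request headers, but `handleApiRequest` and the 403 response are placeholders, not anything from the actual setup:

```js
export default {
  async fetch(request) {
    // CF-Worker is added by Cloudflare to subrequests made from a Worker;
    // CF-Connecting-IP carries the source IP the request arrived from.
    const fromWorker = request.headers.get('cf-worker');
    const sourceIp = request.headers.get('cf-connecting-ip');

    // only accept requests that come from a Worker via the IP mentioned above
    if (!fromWorker || sourceIp !== '2a06:98c0:3600::103') {
      return new Response('Forbidden', { status: 403 });
    }

    return handleApiRequest(request);
  },
};

// hypothetical stand-in for the real API logic
async function handleApiRequest(request) {
  return new Response('ok');
}
```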
As for the API key, it's going to be needed for the paid plan.

Mhm, that's already done.
I could require that CF-Worker be an actual domain, but then that completely breaks anybody using Email or Cron Workers.

`/ping` works to show the contents of `/ping.txt`, but `/online` keeps giving a 307 redirect to `/`.

`/online.html` is served at `/online` (the file exists, but I don't want to serve it; I may delete it, but didn't want to if that's not going to fix it).

`/online / 200`

next-auth isn't compatible with edge runtime.

`if ( request.headers.get( 'cookie' ).includes( ... ) ) return fetch( request )` (aware we can check cookies in the rule, but that only 'fixes' things for a small percentage of our traffic).

That `request.headers.get( 'cookie' ).includes(...)` call is going to throw an exception for any request not containing the cookie header.
Use `( request.headers.get( 'cookie' ) || '' ).includes(...)` instead.
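Put together, a snippet with that guard in place might look like this; the cookie value being checked (`session=`) is just a placeholder:

```js
export default {
  async fetch(request) {
    // headers.get('cookie') returns null when there is no Cookie header,
    // so fall back to an empty string before calling .includes()
    const cookies = request.headers.get('cookie') || '';

    if (cookies.includes('session=')) {
      // requests carrying the cookie are passed straight through
      return fetch(request);
    }

    // ...whatever the snippet should do for requests without the cookie
    return fetch(request);
  },
};
```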
You can catch the error inside the snippet and return it in a response header; the stack in that header then looks like `Error: bleh\n at fetch (52624/snippet.js:6:13)`:

```js
export default {
  async fetch(request) {
    try {
      // deliberately throw so the catch branch below runs
      throw new Error("bleh");
      return fetch(request);
    } catch (err) {
      // pass the request through, but attach the error details to a response header
      const response = await fetch(request);
      const modifiedResponse = new Response(response.body, response);
      modifiedResponse.headers.set("error", JSON.stringify({ message: err.message, name: err.name, stack: err.stack, ...err }));
      return modifiedResponse;
    }
  },
};
```

With `@cloudflare/playwright` you can acquire a session and then connect to it. When closing a browser obtained that way, it will still keep the session available for additional connect calls.

```ts
import { acquire, connect } from '@cloudflare/playwright';

export default {
  async fetch(request: Request, env: Env) {
    // ensures the session is acquired and kept alive for 20 seconds
    const { sessionId } = await acquire(env.MYBROWSER, { keep_alive: 20_000 });
    const browser = await connect(env.MYBROWSER, sessionId);

    // use the browser normally

    // this closes the browser instance but keeps the session alive
    await browser.close();

    // another browser instance can connect to the same session
    const otherBrowser = await connect(env.MYBROWSER, sessionId);

    // use the other browser instance with the same session
    await otherBrowser.close();

    // the session is automatically closed after 20 seconds of inactivity;
    // a fetch handler still needs to return a Response
    return new Response('ok');
  },
};
```
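To make the "use the browser normally" steps concrete, here's a sketch assuming the standard playwright page API is available on the connected browser; the URL is just an example:

```js
import { acquire, connect } from '@cloudflare/playwright';

export default {
  async fetch(request, env) {
    const { sessionId } = await acquire(env.MYBROWSER, { keep_alive: 20_000 });
    const browser = await connect(env.MYBROWSER, sessionId);

    // open a page, navigate, and read something back
    const page = await browser.newPage();
    await page.goto('https://example.com');
    const title = await page.title();

    // closing the browser keeps the session alive for later connect() calls
    await browser.close();

    return new Response(title);
  },
};
```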