Help with running Wasp Mage GPT locally on Ubuntu

I am looking to get some help running WaspGPT locally with my own OpenAI key so I can use GPT-4 for the whole app. My question is: where do I add my API key on the command line to generate the app?

wasp new-ai:disk MyAwesomeApp "Description of my awesome app." "{ "defaultGptModel": "gpt-4" }"

I have Node 18 installed via nvm and ran the curl command to install the AI version.
6 Replies
Filip
Filip•8mo ago
Hey, thanks for trying us out. Here's how you do it:
1. Open a terminal window.
2. Define your API key with export OPENAI_API_KEY=...
3. Execute your command.
One more fix: you must escape the quotes inside the JSON to avoid parsing errors. Here's the working command:
wasp new-ai:disk MyAwesomeApp "Description of my awesome app." "{ \"defaultGptModel\": \"gpt-4\" }"
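Putting steps 2 and 3 together, your terminal session should look roughly like this (the key value is a placeholder, use your real OpenAI key):

export OPENAI_API_KEY=<your-openai-api-key>
wasp new-ai:disk MyAwesomeApp "Description of my awesome app." "{ \"defaultGptModel\": \"gpt-4\" }"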
Thanks for jumping in, but @foxtrotunicorn is actually talking about Mage (Wasp's GPT web app generator), he just used the wrong name 🙂
foxtrotunicorn
foxtrotunicorn•8mo ago
Thank you for the help. I followed your instructions so far and they worked. Right now I am a little lost, but I think the problem might be trying to open the port for the server.
I ran the migrate command; it suggested an update to Prisma, but I did not proceed with that. When running wasp for "name of app" I got the following error message:

[Client!] Error: Failed to scan for dependencies from entries:
[Client!]   /wasp/SynergyERP/.wasp/out/web-app/index.html
[Client!]
[Client!] ✘ [ERROR] No matching export in "src/ext-src/pages/Dashboard.jsx" for import "default"
[Client!]
[Client!]     src/router.jsx:9:7:
[Client!]       9 │ import DashboardPage from './ext-src/pages/Dashboard.jsx'
[Client!]         ╵        ~~~
[Client!]
[Client!] ✘ [ERROR] No matching export in "src/ext-src/pages/Tenant.jsx" for import "default"
[Client!]
[Client!]     src/router.jsx:10:7:
[Client!]       10 │ import TenantPage from './ext-src/pages/Tenant.jsx'
[Client!]          ╵        ~~~~
[Client!]
[Client!]     at failureErrorWithLog (/wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:1649:15)
[Client!]     at /wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:1058:25
[Client!]     at runOnEndCallbacks (/wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:1484:45)
[Client!]     at buildResponseToResult (/wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:1056:7)
[Client!]     at /wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:1068:9
[Client!]     at new Promise (<anonymous>)
[Client!]     at requestCallbacks.on-end (/wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:1067:54)
[Client!]     at handleRequest (/wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:729:19)
[Client!]     at handleIncomingPacket (/wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:755:7)
[Client!]     at Socket.readFromStdout (/wasp/SynergyERP/.wasp/out/web-app/node_modules/esbuild/lib/main.js:679:7)

It looks like some code or files failed to be generated. I am running this in Docker via Portainer in an Ubuntu app stack.
Filip
Filip•8mo ago
Hey, it doesn't look like a port problem. The error says that the file pages/Dashboard.jsx either doesn't exist or doesn't contain the default export your app expects. I recommend double-checking whether Mage created the file - sometimes it hallucinates and forgets to create it.
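For reference, the router imports each page as a default export, so src/ext-src/pages/Dashboard.jsx has to provide one. A minimal sketch of what the file needs to contain (the component body here is just a placeholder, not what Mage actually generates):

import React from 'react'

// Placeholder page body - the real content depends on what Mage generated for your app.
const DashboardPage = () => {
  return <div>Dashboard</div>
}

// The router's `import DashboardPage from './ext-src/pages/Dashboard.jsx'`
// only resolves if the file has a default export like this.
export default DashboardPage

If the file exists but only has a named export (e.g. export const DashboardPage = ...), the import in src/router.jsx fails with exactly this error.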
foxtrotunicorn
foxtrotunicorn•8mo ago
I keep getting the following error:

Caught retryable HTTP exception while doing ChatGPT request: HttpExceptionRequest Request {
  host = "api.openai.com"
  port = 443
  secure = True
  requestHeaders = [("Content-Type","application/json; charset=utf-8"),("Authorization","<REDACTED>")]
  path = "/v1/chat/completions"
  queryString = ""
  method = "POST"
  proxy = Nothing
  rawBody = False
  redirectCount = 10
  responseTimeout = ResponseTimeoutMicro 90000000
  requestVersion = HTTP/1.1
  proxySecureMode = ProxySecureWithConnect
MEE6
MEE6•8mo ago
Wohooo @foxtrotunicorn, you just became a Waspeteer level 1!
martinsos
martinsos•8mo ago
Any more info before or after it? The OpenAI API can sometimes return an error, usually because it is overloaded with requests, or it may be rate-limiting you because you sent it too many requests.
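If you want to rule out a problem with the key or quota itself, one thing you could try is hitting the same endpoint directly and checking the status code. A small sketch, assuming Node 18+ (global fetch) and OPENAI_API_KEY exported in your shell - the file name check-openai.mjs is just an example:

// check-openai.mjs - minimal sanity check of the OpenAI key against the chat completions endpoint
const response = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Say hello' }],
  }),
})

// 401 points at the key, 429 at rate limits or quota, 5xx at OpenAI being overloaded.
console.log(response.status)
console.log(await response.json())

Run it with node check-openai.mjs - if you also get 429s there, it is rate limiting or quota on OpenAI's side rather than anything in Wasp.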