Beta test Llama3 serving and GUI on MAX Nightly
You can beta test the upcoming MAX serving and Llama3 chatbot GUI using the nightly release; I'd love feedback if you run into any problems.

Start the GUI by running these commands on arm64 macOS or x86/arm64 Linux (note that the smallest Llama3 model is 4.5GB):
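The exact commands were not captured in this draft. Purely as an illustrative sketch of the general shape (the repository layout, the `examples/gui` path, and the use of Streamlit here are assumptions, not the official instructions), the flow looks something like:

```shell
# Illustrative sketch only -- not the official MAX instructions.
# 1. Install the MAX nightly build via the modular CLI.
modular install nightly/max

# 2. Get the example GUI code (the examples/gui path is an assumption).
git clone https://github.com/modularml/max.git
cd max/examples/gui

# 3. Install Python dependencies into a virtual env and launch the GUI.
python3 -m venv venv && source venv/bin/activate
pip install -r requirements.txt
streamlit run app.py
```

Check the nightly docs for the current commands before relying on any of the names above.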
You can also SSH into a machine with VS Code and run the above commands in the terminal; VS Code will forward the ports so you can open the GUI in your local browser.
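If you are not using VS Code's automatic port forwarding, plain `ssh` can forward the port manually. The port number (8501, Streamlit's default) and the host name below are assumptions; substitute whatever port the GUI reports:

```shell
# Forward the remote GUI port to localhost so a browser on your
# local machine can reach it (-L local_port:remote_host:remote_port).
ssh -L 8501:localhost:8501 user@remote-host
# then browse to http://localhost:8501 on your local machine
```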
Work is being done to vastly improve the experience of getting up and running with MAX, so stay tuned for that.

