Jack

Beta test Llama3 serving and GUI on MAX Nightly

You can beta test the upcoming MAX serving and Llama3 chatbot GUI using the nightly release. I'd love feedback if you run into any problems.

Start the GUI by running these commands on arm64 macOS or x86/arm64 Linux (note: the smallest Llama3 model is 4.5 GB):
# Start from a clean slate by removing any existing Modular installation
rm -rf ~/.modular
# Install the modular CLI and authenticate
curl -s https://get.modular.com | sh -
modular auth
# Install the MAX nightly package
modular install nightly/max
# Point your shell at the nightly install
MAX_NIGHTLY_PATH=$(modular config max-nightly.path)
SHELL_RC=~/.$(basename "$SHELL")rc
echo 'export MODULAR_HOME="'$HOME'/.modular"' >> $SHELL_RC
echo 'export PATH="'$MAX_NIGHTLY_PATH'/bin:$PATH"' >> $SHELL_RC
# Install pixi (used to run the GUI example) and reload your shell config
curl -fsSL https://pixi.sh/install.sh | $SHELL
source "$SHELL_RC"
# Fetch the examples on the nightly branch and launch the GUI
git clone https://github.com/modularml/max.git ~/max
cd ~/max
git checkout nightly
cd examples/gui
pixi run gui
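
If something goes wrong, a quick sanity check (my addition, not part of the original steps) is to confirm, in the same shell session, that the environment variables and the nightly install landed where expected:

# Both values are set by the commands above
echo "$MODULAR_HOME"
ls "$MAX_NIGHTLY_PATH/bin"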

You can also SSH into a remote machine with VS Code and run the above commands in its integrated terminal; VS Code will forward the port so you can open the GUI in your local browser.
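
If you'd rather use plain ssh than VS Code, you can forward the port yourself. A minimal sketch, assuming the GUI serves on Streamlit's default port 8501 (substitute whatever port pixi run gui actually prints, plus your own user and host):

# Forward the remote GUI port to your local machine
ssh -L 8501:localhost:8501 user@remote-host
# Then open http://localhost:8501 in your local browser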

We're working to vastly improve the experience of getting up and running with MAX; stay tuned for that.
[Attached screenshot: Screenshot_2024-06-27_at_3.16.32_PM.png]