GooeyGUI - Write production grade web apps in pure Python

GooeyGUI is an alternative to Streamlit, Dash, and other Python UI frameworks. See what it's capable of at /explore.

The main innovation in this framework is the complete removal of websockets.

You bring your own server, whether it's FastAPI, Flask, or Django, which lets you stay flexible and scale horizontally like you would a classic website.

It also takes full advantage of server-side rendering (SSR), which means you get static HTML rendering and SEO goodness out of the box.

Prerequisites

  1. Install Node v18 (We recommend using nvm)

nvm install 18.12.0

  2. Install Python 3.10+ (We recommend using pyenv)

  3. Install & start redis.

E.g. on Mac - https://redis.io/docs/getting-started/installation/install-redis-on-mac-os/

brew install redis
brew services start redis

Setup

  1. Clone this repo

git clone https://github.com/GooeyAI/gooey-gui

  2. Install Node dependencies

cd gooey-gui
npm install

  3. Install Python dependencies

cd your-python-project
pip install gooey-gui

Usage

from fastapi import FastAPI
import gooey_gui as gui

app = FastAPI()

@gui.route(app, "/")
def root():
    gui.write("""
    # My first app
    Hello *world!*
    """)

Copy that into a file named main.py.

Run the Python server:

cd your-python-project
uvicorn main:app --reload

Run the frontend:

cd gooey-gui
npm run dev

Open your browser at localhost:3000 and you should see your first app 🎉


Adding interactivity

@gui.route(app, "/temp")
def temp():
    temperature = gui.slider("Temperature", 0, 100, 50)
    gui.write(f"The temperature is {temperature}")

Sending realtime updates to the frontend

Here's a simple counter that updates every second:

from time import sleep


@gui.route(app, "/counter")
def counter():
    count, set_count = gui.use_state(0)

    start_counter = gui.button("Start Counter")
    if start_counter:
        for i in range(10):
            set_count(i)
            sleep(1)

    gui.write(f"### Count: {count}")


Let's break this down:

First, we create a state variable called count and a setter function called set_count. gui.use_state(<default>) is similar in spirit to React's useState, but the implementation uses redis pub/sub & server-sent events to send updates to the frontend.

count, set_count = gui.use_state(0)

Next, we create a button using gui.button(), which returns True when the button is clicked.

start_counter = gui.button("Start Counter")

If the button is clicked, we start our blocking loop, which updates the count every second.

if start_counter:
    for i in range(10):  
        set_count(i)
        sleep(1)

Finally, we render the count using gui.write()

gui.write(f"### Count: {count}")
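
Note that gui.write() renders Markdown, which is why the ### prefix shows up as a heading (just like the # My first app heading in the first example).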

GooeyGUI is always interactive

Unlike other UI frameworks that block the main loop of your app, GooeyGUI always keeps your app interactive.

Let's add a text input and show the value of the text input below it. Try typing something while the counter is running.

from time import sleep

@gui.route(app, "/counter")
def counter():
    count, set_count = gui.use_state(0)

    start_counter = gui.button("Start Counter")
    if start_counter:
        for i in range(10):
            set_count(i)
            sleep(1)

    gui.write(f"### Count: {count}")

    text = gui.text_input("Type Something here...")
    gui.write("**You typed:** " + text)


This works because, by default, FastAPI runs synchronous endpoints in a thread pool. So while that counter is running, the other threads are free to handle requests from the frontend.
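
If lots of users kick off long blocking loops at the same time, that pool can fill up. Here's a minimal sketch (not from this repo, and assuming the current Starlette/anyio behaviour where sync endpoints share anyio's default thread limiter) of raising the pool size at startup:

import anyio.to_thread

@app.on_event("startup")
async def raise_thread_pool_limit():
    # anyio's default limiter allows 40 worker threads; raise it so more
    # blocking handlers can run at once.
    limiter = anyio.to_thread.current_default_thread_limiter()
    limiter.total_tokens = 100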

In production, you can scale horizontally by running multiple instances of your server behind a load balancer, and hand long-running tasks off to a task queue like Celery or to FastAPI's BackgroundTasks.
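
As a minimal sketch of that idea (not from the README), the counter loop can be moved out of the request handler and into a plain background thread; a Celery task or a BackgroundTasks callback would have the same shape. This assumes the setter returned by gui.use_state() is safe to call from a worker thread, since it only publishes the new value over redis:

import threading
from time import sleep


def run_counter(set_count):
    # the long-running work, outside the request handler
    for i in range(10):
        set_count(i)
        sleep(1)


@gui.route(app, "/counter-bg")
def counter_bg():
    count, set_count = gui.use_state(0)

    if gui.button("Start Counter"):
        # return immediately; updates reach the frontend via set_count
        threading.Thread(target=run_counter, args=(set_count,), daemon=True).start()

    gui.write(f"### Count: {count}")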

OpenAI Streaming

It's pretty easy to integrate OpenAI's streaming API with GooeyGUI. Let's build a poem generator.

import os

import openai  # this example uses the pre-1.0 openai SDK (openai.ChatCompletion)


@gui.route(app, "/poems")
def poems():
    text, set_text = gui.use_state("")

    gui.write("### Poem Generator")

    prompt = gui.text_input("What kind of poem do you want to generate?", value="john lennon")

    if gui.button("Generate πŸͺ„"):
        set_text("Starting...")
        generate_poem(prompt, set_text)

    gui.write(text)


def generate_poem(prompt, set_text):
    openai.api_key = os.getenv("OPENAI_API_KEY")

    completion = openai.ChatCompletion.create(
      model="gpt-3.5-turbo",
      messages=[
        {"role": "system", "content": "You are a brilliant poem writer."},
        {"role": "user", "content": prompt}
      ],
      stream=True,
    )

    text = ""
    for i, chunk in enumerate(completion):
        text += chunk.choices[0].delta.get("content", "")
        if i % 50 == 1: # stream to user every 50 chunks
            set_text(text + "...")

    set_text(text) # final result
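
Pushing an update only every 50 chunks keeps the poem feeling live without publishing a redis message (and triggering a re-render) for every single token.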


File uploads
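
gui.file_uploader() uploads the selected file to /__/file-upload/ on your server, and the url returned by that endpoint becomes the value your route receives. A minimal handler that saves uploads into a local static/ directory: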

import os

from fastapi import Depends
from fastapi.staticfiles import StaticFiles
from starlette.datastructures import FormData
from starlette.requests import Request

if not os.path.exists("static"):
    os.mkdir("static")

app.mount("/static", StaticFiles(directory="static"), name="static")

async def request_form_files(request: Request) -> FormData:
    return await request.form()

@app.post("/__/file-upload/")
def file_upload(form_data: FormData = Depends(request_form_files)):
    file = form_data["file"]
    data = file.file.read()
    filename = file.filename
    with open("static/" + filename, "wb") as f:
        f.write(data)
    return {"url": "http://localhost:8000/static/" + filename}


@gui.route(app, "/img")
def upload():
    uploaded_file = gui.file_uploader("Upload an image", accept=["image/*"])
    if uploaded_file is not None:
        gui.image(uploaded_file)
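
Note that the handler above saves files under whatever filename the client sent, so in anything beyond local development you'd want to sanitize or uniquify the name (e.g. prefix it with a uuid) before writing it to disk.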

