Project 1: llmDraft: LLM-assist Chatter

DUE Wed, 09/24, 2 pm

This project may be completed individually or in teams of at most 2. You can partner differently for each project.

Treat your messages sent to chatterd and Ollama as public utterances with no reasonable expectation of privacy: they are recorded, shared with everyone using chatterd, and used to carry out contextual interactions with Ollama.

Objectives

In addition to the objectives listed in the llmPrompt and Chatter tutorials, this project has the following objectives for the frontend:

LLM-assist chatting

We will build an app called Chatter and its corresponding backend server called chatterd. Users use the Chatter app to post text messages, called chatts, to the backend server, which stores each chatt in a database and shares all posted chatts with everyone else accessing the backend (be mindful of what you post!). The backend server, aside from a database to store chatts, also has access to an LLM, which can assist users in writing a chatt.

The LLM can assist in two ways:

  1. the user can draft a chatt and ask the LLM for a rewrite, and
  2. the user can request a draft reply to a chatt posted by someone else.

We call the first the rewrite feature and the second the reply feature. When building Chatter, you can hard-code a “personality” for the LLM to assume in performing both the rewrite and reply features. For example, you can instruct the LLM to assume the personality of a poet when writing a draft.
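The personality can be prepended to the text sent to the LLM. The sketch below shows one way a frontend might assemble the prompt text for the two features; the function names, prompt wording, and the poet personality are illustrative choices, not mandated by this spec:

```python
# Hypothetical prompt assembly for the rewrite and reply features.
# The personality string is hard-coded by the developer at build time.
PERSONALITY = "You are a poet."

def rewrite_prompt(draft: str) -> str:
    """Prompt asking the LLM to rewrite the user's own draft chatt."""
    return f"{PERSONALITY} Rewrite the following message:\n{draft}"

def reply_prompt(chatt: str) -> str:
    """Prompt asking the LLM to draft a reply to another user's chatt."""
    return f"{PERSONALITY} Draft a reply to the following message:\n{chatt}"
```

Either prompt would then be sent to the backend’s LLM endpoint, and the returned text placed in the text box for the user to edit before posting.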

Expected behavior

Displaying posted chatts, requesting a rewrite of a draft chatt, and requesting a draft reply to another user’s chatt:

DISCLAIMER: the video demos show you one aspect of the app’s behavior. They are not a substitute for the spec. If there are any discrepancies between a demo and this spec, please follow the spec. The spec is the single source of truth. If the spec is ambiguous, please consult the teaching staff for clarification.

Features and requirements

Your Chatter app must provide the following features and satisfy the following requirements, including those in any applicable “Implementation guidelines” documents, to receive full credit.

Front-end UI

As can be seen in the video above, the Chatter app consists of a single screen with the following UI elements:

  1. a title bar showing the title Chatter,
  2. a timeline of posted chatts with the current user’s chatt shown on the right and those of other user(s) shown on the left,
    • each chatt is shown in a message “bubble”, with its timestamp at the bottom of the “bubble” in a smaller font and a different color from the message text,
    • each chatt on the left additionally shows the posting user’s name above the “bubble”, again in a smaller font and a different color.
  3. these UI elements at the bottom of the screen:
    • a text box in the middle,
    • an “AI” button to the left of the text box showing an “AI Star” icon. This button is enabled only when the text box is not empty and there is no networking session in progress,
    • a “Send” button to the right of the text box showing a “paper plane” icon. This button is likewise enabled only when the text box is not empty and no networking session is in progress.

    When a button is “disabled”, it is grayed out and tapping on it has no effect.
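The enabling rule above is the same for both buttons and can be captured in a single predicate. A minimal sketch (the function and parameter names are illustrative, not part of the spec):

```python
def button_enabled(text_box: str, session_in_progress: bool) -> bool:
    # Both the "AI" and "Send" buttons are enabled only when the text box
    # is non-empty AND no networking session is in progress; a disabled
    # button is grayed out and ignores taps.
    return len(text_box) > 0 and not session_in_progress
```

On your platform of choice, you would bind this predicate to the buttons’ enabled state so it is re-evaluated whenever the text box contents or the networking state change.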

While there is a networking session in progress, that is, while waiting for Ollama’s response on a rewrite or reply request, or while waiting for the sending of a chatt to complete, the “Send” button’s icon changes from showing a “paper plane” to showing an animated “loading” circle.

UI Design

One can easily spend a whole weekend (or longer) getting the UI “just right.”

:point_right: Remember: we won’t be grading you on how beautiful your UI looks nor how precisely it matches the one shown in the video demo. You’re free to design your UI differently, so long as all indicated UI elements are fully visible on the screen, non-overlapping, and functioning as specified.

Front-end UX

As demonstrated in the video above:

Back-end infrastructures

The chatterd backend accepts HTTP/2 connections from the Chatter frontend, secured with HTTPS. The backend is set up with a self-signed certificate to provide HTTPS service and with a PostgreSQL database to store posted chatts. It is further set up to forward user prompts to an Ollama server and return Ollama’s response to the requesting user. Prompts to Ollama and the corresponding responses are not stored in the PostgreSQL database unless users subsequently post Ollama’s responses as their chatts.
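Because the certificate is self-signed, a stock TLS client will reject the connection. One common development-time workaround is to disable certificate verification on the client. This is a sketch only, suitable for a test client against your own server (pinning the actual certificate is the safer approach for the app itself):

```python
import ssl

def insecure_dev_context() -> ssl.SSLContext:
    """TLS context that accepts a self-signed server certificate.

    For development/testing against your own chatterd only; it performs
    no verification of the server's identity.
    """
    ctx = ssl.create_default_context()
    ctx.check_hostname = False        # must be cleared before verify_mode
    ctx.verify_mode = ssl.CERT_NONE   # accept the certificate as-is
    return ctx
```

On iOS and Android, the equivalent is a per-host trust exception in your URLSession delegate or OkHttp trust manager, as covered in the tutorials.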

APIs

The chatterd backend has three API endpoint URLs: getchatts, postchatt, and llmprompt.

Implementation guidelines

Backend

The chatterd backend is fully covered in the two tutorial specifications:

Completing these two tutorials completes the backend. The backend you built cumulatively across the two tutorials can be used as-is in this project, with no further work required. Any of the alternative backend stacks provided will work. The host mada.eecs.umich.edu will not be available for this programming assignment beyond the later of the two tutorial deadlines. To receive full credit, your frontend MUST work with your own backend.

Frontend

The UI and UX for posting chatts on the Chatter frontend, along with the front-end interactions with the postchatt and getchatts APIs, are all fully provided in the Chatter tutorial.

The rewrite and reply functionalities and UI/UX are not covered in the two tutorials. Front-end interaction with the llmprompt API is shown in the llmPrompt tutorial; your remaining job in this project is to adapt it to provide the rewrite and reply functionalities, along with the accompanying UI/UX. Step-by-step guidelines are provided below (follow the link to your platform of choice):
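As a rough illustration of the adaptation, the sketch below builds an HTTP POST request carrying a rewrite prompt. The URL path /llmprompt/ and the JSON field name "prompt" are assumptions for illustration only; match the schema and path you used in your llmPrompt tutorial code:

```python
import json
import urllib.request

def make_llmprompt_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build a POST request to the llmprompt API (schema assumed, see above)."""
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        f"{base_url}/llmprompt/",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

The frontend would send this request while a networking session is marked in progress (disabling the buttons and showing the loading spinner), then place the returned text into the text box.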

:point_right: WARNING: You will not get full credit if your frontend is not set up to work with your backend!

Every time you rebuild your Go or Rust server or make changes to your JavaScript or Python files, you need to restart chatterd:

server$ sudo systemctl restart chatterd

:warning: Leave your chatterd running until you have received your grade.

:point_right:TIP:

server$ sudo systemctl status chatterd

is your BEST FRIEND in debugging your server. If you get HTTP error code 500 Internal Server Error, or if you just don’t know whether your HTTP request has made it to the server, the first thing to do is run sudo systemctl status chatterd on your server and study its output, including any error messages and debug printouts from your server.


Prepared by Chenglin Li, Xin Jie ‘Joyce’ Liu, Sugih Jamin Last updated: August 24th, 2025