Project 1: llmDraft: LLM-assisted Chatter
We will build an app called Chatter and its corresponding back-end server, chatterd. Users post text
messages, called chatts, to the back-end server, which stores each chatt in a database and shares all
posted chatts with everyone accessing the back end (be mindful of what you post!). The back-end server,
aside from a database to store chatts, also has access to an LLM. A user can ask the LLM to rewrite
a chatt they’ve drafted prior to posting, or a user can ask the LLM to draft a reply to a chatt
posted by someone else.
Expected behavior
Displaying posted chatts, requesting rewrite of a draft chatt, requesting draft reply of another user’s chatt:
DISCLAIMER: the video demos show you one aspect of the app’s behavior. They are not a substitute for the spec. If there are any discrepancies between the demo and this spec, please follow the spec. The spec is the single source of truth. If the spec is ambiguous, please consult the teaching staff for clarification.
Treat messages you send to chatterd and Ollama as public utterances with no reasonable expectation of
privacy: they are recorded, shared with everyone who uses chatterd, and used to provide Ollama with
context.
Partial credits
To help you with time management, to break your approach down into smaller tasks, and to help structure your solution, you can earn partial credit by completing the following two tutorials by their deadlines, as listed in the Course Schedule:
Completing and submitting the tutorials by their respective deadlines is optional, though the features and functionality embodied in the tutorials are REQUIRED of this project.
This project may be completed individually or in teams of at most 2. You can partner differently for each project. Only ONE member of a team needs to submit the project and its tutorials.
Objectives
In addition to the objectives listed in the llmPrompt and
Chatter tutorials, this project has the following objectives for
the front end:
- Practice and apply the objectives learned in the two tutorials
- Learn how to implement the long-press gesture
Features and requirements
To receive full credit, your solution app must provide the following features and satisfy the following requirements, including those in any applicable “Implementation guidelines” documents.
Front-end UI
As can be seen in the video above, the Chatter app consists of a single
screen with the following UI elements:
- a title bar showing the title Chatter,
- a timeline of posted chatts, with the current user’s chatts shown on the right and those of other user(s) shown on the left,
  - each chatt is shown in a message “bubble”, with its timestamp at the bottom of the “bubble” in a smaller font and a different color from the message text,
  - each chatt on the left is shown with the posting user’s name above the “bubble”, again in a smaller font and different color,
- and at the bottom of the screen:
  - a text box in the middle,
  - an “AI” button to the left of the text box showing an “AI Star” icon. This button is enabled only when the text box is not empty and there is no networking session in progress,
  - a “Send” button to the right of the text box showing a “paper plane” icon. This button also is enabled only when the text box is not empty and no networking session is in progress.
When a button is “disabled”, it is grayed out and tapping on it has no effect.
While there is a networking session in progress, that is, while waiting
for Ollama’s response on a rewrite or reply request, or while waiting
for the sending of a chatt to complete, the “Send” button’s icon changes
from showing a “paper plane” to showing an animated “loading” circle.
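The enable/disable and icon rules above reduce to two small predicates. Here is a minimal, platform-neutral sketch in Python of that logic only (the function names and icon labels are ours, not part of the spec; your actual implementation will live in your Android/iOS UI code):

```python
def button_enabled(draft: str, networking: bool) -> bool:
    """Both the "AI" and "Send" buttons are enabled only when the
    text box is non-empty AND no networking session is in progress."""
    return len(draft) > 0 and not networking

def send_button_icon(networking: bool) -> str:
    """The "Send" button shows an animated loading circle while a
    rewrite, reply, or post request is in flight."""
    return "loading-circle" if networking else "paper-plane"

# Idle with a non-empty draft: both buttons are tappable.
assert button_enabled("hello", networking=False)
# During any networking session: both buttons gray out, icon animates.
assert not button_enabled("hello", networking=True)
assert send_button_icon(True) == "loading-circle"
```

Keeping this as a single pair of pure functions, driven by two pieces of state (draft text and an in-flight flag), makes the UI rules easy to test independently of the widgets.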
UI Design
One can easily spend a whole weekend (or longer) getting the UI “just right.”
Remember: we won’t be grading you on how beautiful your UI looks nor how
precisely it matches the one shown on the video demo. You’re free to design your UI differently,
so long as all indicated UI elements are fully visible on the screen, non-overlapping,
and functioning as specified.
Front-end UX
As demonstrated in the video above:
- Upon loading, the Chatter front end retrieves all posted chatts from the chatterd back end and displays each according to the UI description above.
- User can enter text in the text box and click the “Send” button to post a chatt to the back end.
- Instead of clicking the “Send” button after entering text, user can click the “AI” button to ask for a rewrite suggestion from Ollama, through the chatterd back end. The rewrite suggestion is displayed in the text box, replacing the original draft. Once the rewrite suggestion is fully displayed, the user can edit the text box, click the “Send” button to post the chatt currently in the text box, or click the “AI” button again for another rewrite.
- User can long-press on another user’s posted chatt, displayed on the left of the timeline, to request a draft reply from Ollama, through the chatterd back end. The reply draft will be shown in the text box at the bottom of the screen. Once the reply draft is fully displayed, the user can edit the text box, click the “Send” button to post the chatt currently in the text box, or click the “AI” button for a rewrite.
- Ollama’s response will be streamed from the back end. As each piece of the stream arrives, it must be immediately displayed in the text box, appended to what’s already there, instead of waiting for the full reply to complete.
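The streaming requirement means the text box is updated once per arriving chunk, not once at the end. A minimal Python sketch of that accumulation, where the chunk iterable and the `update_text_box` callback are stand-ins for your platform’s streaming API and UI update call:

```python
from typing import Callable, Iterable

def stream_into_text_box(chunks: Iterable[str],
                         update_text_box: Callable[[str], None]) -> str:
    """Append each arriving piece of the streamed response to the text
    box immediately, instead of waiting for the full reply to complete."""
    draft = ""
    for piece in chunks:
        draft += piece              # append to what's already there
        update_text_box(draft)      # redraw the text box right away
    return draft

# Simulated stream: the UI sees three incremental states, not one.
states = []
final = stream_into_text_box(["How ", "about ", "this?"], states.append)
assert states == ["How ", "How about ", "How about this?"]
assert final == "How about this?"
```

On a real device, `update_text_box` must run on the UI/main thread; the per-chunk loop is otherwise the same shape.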
Back-end infrastructures
The chatterd back end accepts HTTP/2 connections from the Chatter front end, secured with HTTPS.
The back end is set up with a self-signed certificate to provide HTTPS service. The back end is also
set up with a PostgreSQL database to store posted chatts. The back end is further set up to
forward user prompts to an Ollama server and return Ollama’s response to the requesting user. Prompts
to Ollama and the corresponding responses are not stored in the PostgreSQL database unless users
subsequently post Ollama’s responses as their chatts.
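For the forwarding step, Ollama’s HTTP API streams its reply as newline-delimited JSON objects, each carrying a `response` text fragment and a `done` flag (this is the format of Ollama’s `/api/generate` streaming responses; verify against the Ollama version you installed in the llmPrompt tutorial). A hedged sketch of extracting just the text fragments to relay to the client:

```python
import json
from typing import Iterable, Iterator

def relay_ollama_stream(lines: Iterable[str]) -> Iterator[str]:
    """Yield the text fragment from each NDJSON line of a streamed
    Ollama response, stopping at the final record marked done=true.
    Nothing is written to the database here, matching the requirement
    that prompts/responses are stored only if later posted as chatts."""
    for line in lines:
        obj = json.loads(line)
        if obj.get("response"):
            yield obj["response"]   # forward this piece immediately
        if obj.get("done"):
            break                   # final record: stream complete

# Example NDJSON lines as Ollama streams them:
raw = [
    '{"response": "Sure", "done": false}',
    '{"response": ", here you go.", "done": false}',
    '{"response": "", "done": true}',
]
assert "".join(relay_ollama_stream(raw)) == "Sure, here you go."
```

Yielding fragment by fragment is what lets chatterd pass the stream straight through to the front end, which in turn appends each piece to the text box as it arrives.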
APIs
The chatterd back end has three API endpoint URLs:
- llmprompt: as documented in the llmPrompt tutorial, and
- postchatt and getchatts: as documented in the Chatter tutorial.
Implementation guidelines
Back end
The chatterd back end is fully covered in the two tutorial specifications:
- llmPrompt: for setting up an AWS/GCP instance, installing an Ollama server, creating a self-signed certificate, and setting up an HTTPS server. It also provides the implementation of the llmprompt API. You will also need to install your self-signed certificate on your front-end platform, following the instructions in the first tutorial for Android or iOS.
- Chatter: for setting up a PostgreSQL database. It provides the implementation of both the postchatt and getchatts APIs.
Completing these two tutorials completes the back end. The back end you built cumulatively across the
two tutorials can be used as is in this project. Any of the alternative back-end stacks provided
will work. The host mada.eecs.umich.edu will not be available for this programming assignment
beyond the later of the two tutorial deadlines.
To receive full credit, your front end MUST work with your own back end.
Front end
The UI and UX for posting chatts on the Chatter front end, along with the front-end
interactions with the postchatt and getchatts APIs, are fully provided in the
Chatter tutorial.
Front-end interaction with the llmprompt API is shown in the llmPrompt
tutorial. You are to provide the rewrite and reply functionalities and UI/UX after
you’ve integrated the two tutorials into the project. Step-by-step guidelines are provided
below (follow the link to your platform of choice):
WARNING: You will not get full credit if your front end is not set up
to work with your back end!
Every time you rebuild your Go or Rust server, or make changes to any of your JavaScript
or Python files, you need to restart chatterd:
server$ sudo systemctl restart chatterd
Leave your chatterd running until you have received your project grade.
TIP:
server$ sudo systemctl status chatterd
is your BEST FRIEND in debugging your server. If you get an HTTP error code
500 Internal Server Error, or if you just don’t know whether your HTTP request
has made it to the server, the first thing to do is run sudo systemctl status chatterd
on your server and study its output, including any error messages and debug printouts
from your server.
| Prepared by Chenglin Li, Xin Jie ‘Joyce’ Liu, Sugih Jamin | Last updated: December 28th, 2025 |