Project 2: llmPlay SwiftUI
Cover Page
Expected behavior
DISCLAIMER: the video demos show only one aspect of the app’s behavior. They are not a substitute for the spec. If there are any discrepancies between a demo and this spec, please follow the spec. The spec is the single source of truth. If the spec is ambiguous, please consult the teaching staff for clarification.
Preparing your GitHub repo
If you have not completed the llmchat tutorial, please do so first.
In the following, replace /YOUR:TUTORIALS/ with the name of your tutorials folder.
- On your laptop, navigate to `/YOUR:TUTORIALS/`
- Create a zip of your `llmchat` folder
- Rename your `llmchat` folder `llmplay`
- We’ll need to request permission to access location (again). Please follow the instructions in the `Maps` tutorial’s section on requesting permission
- If you have completed the `Maps` tutorial, you can copy your `LocManager.swift` file over from the `Maps` tutorial: open both `maps` and `llmplay` projects in Xcode, then alt-drag the file from one project to the other. If you have not completed the `Maps` tutorial, please follow the instructions in the Location manager section to set it up. Then update the `Location` struct in `LocManager.swift` to:

  ```swift
  struct Location: Decodable {
      var lat: CLLocationDegrees
      var lon: CLLocationDegrees
      var speed: CLLocationSpeed = 0.0

      enum CodingKeys: String, CodingKey { // to ignore other keys
          case lat, lon
      }
  }
  ```

- In your `swiftUIChatterApp.swift`:
  - Set the `message` property in your `ChattViewModel` to an empty string instead of `"howdy?"`.
  - Start collecting location updates in your `swiftUIChatterApp` struct. Add to its initializer: `LocManager.shared.startUpdates()`
- Delete the file `ChattScrollView.swift` from your project
- Push your local `/YOUR:TUTORIALS/` repo to GitHub (`git push`) and make sure there are no git issues:
  - Open GitHub Desktop and click on `Current Repository` on the top left of the interface
  - Click on your `reactive` GitHub repo
  - Add a Summary to your changes and click `Commit to main`
  - If you have pushed other changes to your Git repo, click `Pull Origin` to sync up the clone on your laptop
  - Finally, click `Push Origin` to push the changes to GitHub
- Open GitHub Desktop and click on
Go to the GitHub website to confirm that your folders follow this structure outline:
reactive
|-- chatterd
|-- chatterd.crt
|-- llmchat
|-- llmplay
    |-- swiftUIChatter
        |-- swiftUIChatter.xcodeproj
        |-- swiftUIChatter
        # and other files or folders
If the folders in your GitHub repo do not have the above structure, we will not be able to grade your assignment and you will get a ZERO.
Interacting with Ollama
I think of the interaction with Ollama as divided into three phases:

- When the app launches, it starts a game by prompting the model, through `chatterd`’s `/llmprep` API, for a set of hints about a city. In the same prompt, it also specifies what the model must do to announce that the user has guessed correctly.
- As the game progresses, the app interacts with the model through `chatterd`’s modified `/llmchat` API: receiving hints, sending guesses. We’ll be using a modified `llmChat(appID:chatt:errMsg:)`, which I call `llmPlay(appID:chatt:hints:winner:errMsg:)`, to handle this phase.
- Finally, the app must be able to receive, recognize, and handle a `LatLon` SSE event that `chatterd` sends when Ollama announces a winner.
When the app launches, in the initializer of swiftUIChatterApp, after I’ve started the location
updates, I send the system prompt as described in the Front-end UX section
using llmPrep(appID:chatt:errMsg:showOk:) (see the llmPrep
TODO in the llmChat tutorial for an example usage).
After sending the system prompt, I follow with sending a single “Yes” string using
llmPlay(appID:chatt:hints:winner:errMsg:) to prime the model and get it to return the first set of
hints about the mystery city.
Be careful with the wording of your prompt. For example, opening your session with a “Test” or “This is a test” prompt could lead to confusing subsequent interaction with the model.
I collect the hints returned by Ollama in an observable property hints in my ChattViewModel,
also in swiftUIChatterApp.swift. Since this initial prompt will never result in any winning
notification, I give nil as the argument for the winner parameter.
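Putting the launch sequence together, a minimal sketch of the app initializer might look like the following. The `appID`, `username`, `systemPrompt`, and binding helpers are hypothetical placeholders; use the actual names from your own llmChat tutorial code.

```swift
// Hypothetical sketch of swiftUIChatterApp's initializer.
// appID, username, systemPrompt, and the view-model binding helpers
// are illustrative placeholders, not names prescribed by the spec.
init() {
    LocManager.shared.startUpdates()   // start location updates first
    Task {
        // Phase 1: send the system prompt via /llmprep.
        await ChattStore.shared.llmPrep(appID: appID,
            chatt: Chatt(username: username, message: systemPrompt),
            errMsg: viewModel.errMsgBinding, showOk: false)
        // Prime the model with a single "Yes" to get the first hints.
        // winner is nil: this prompt can never announce a winner.
        await ChattStore.shared.llmPlay(appID: appID,
            chatt: Chatt(username: username, message: "Yes"),
            hints: viewModel.hintsBinding,
            winner: nil,
            errMsg: viewModel.errMsgBinding)
    }
}
```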
llmPlay(appID:chatt:hints:winner:errMsg:)
Here’s the full signature I use for llmPlay(appID:chatt:hints:winner:errMsg:) in ChattStore:
```swift
func llmPlay(appID: String, chatt: Chatt,
             hints: Binding<String>,
             winner: ((Location) -> ())?,
             errMsg: Binding<String>) async { }
```
The first, second, and last parameters are the same as those of llmChat(appID:chatt:errMsg:). The
hints parameter is an observable string we use to hold hints Ollama returns about the city for the
user to guess. We update hints similarly to how we update errMsg: appending newly arriving chunks
to hints accumulates the hints. SwiftUI automatically re-renders any View observing an observable
variable, such as hints and errMsg. The winner parameter is a closure that takes a Location
as argument and handles the winning notification. Location is defined in LocManager.swift.
I will build llmPlay(appID:chatt:hints:winner:errMsg:) by modifying
llmChat(appID:chatt:errMsg:), as follows. Unlike in the llmChat tutorial, we do not need to show
a timeline of user exchanges with Ollama. Thus, instead of creating a placeholder chatt message to
append to a chatts array, you can decode each Message data line of the returning stream into
an OllamaResponse struct and append its content to the hints parameter directly. Remember to
clear the hints parameter before you start processing Ollama’s reply and start appending to it.
Handling of the SSE Error event and all other error-handling code can be adapted from
llmChat(appID:chatt:errMsg:).
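As a rough illustration, the data-line handling described above might be shaped like this, assuming the stream’s lines arrive as in the llmChat tutorial and that your `OllamaResponse` exposes each chunk’s text via `message?.content` (check your own decoder’s shape; this is a sketch, not the required implementation):

```swift
// Sketch only: accumulate streamed hint chunks into the hints binding.
hints.wrappedValue = ""  // clear hints before processing Ollama's reply
for try await line in stream.lines {
    guard line.hasPrefix("data: ") else { continue }
    if let data = line.dropFirst("data: ".count).data(using: .utf8),
       let chunk = try? JSONDecoder().decode(OllamaResponse.self, from: data) {
        // appending accumulates the hints; SwiftUI re-renders observers
        hints.wrappedValue += chunk.message?.content ?? ""
    }
}
```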
To recognize and handle a LatLon SSE event, you may want to read closely the
llmChat(appID:chatt:errMsg:) code for handling SSE Error event, and the accompanying explanation
in the llmChat tutorial. To implement SSE LatLon event
handling, first add a LatLon case variant to your SseEventType at the top of ChattStore file.
Then, for each line of the incoming stream, when you detect an event tag, in addition to transitioning to an
Error event, you want to recognize and transition to a LatLon event.
When parsing a data line, if it’s part of a LatLon event, decode the line into a Location
struct, not an OllamaResponse struct, and call the winner closure with the decoded
Location as its argument. Since winner can be nil, to invoke it, you must first check that it
is indeed not nil:
winner?(location)
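Combining the event-tag transition with the data-line decoding, the LatLon handling could be sketched as below. The exact prefixes, case names, and enum spelling should match your existing `SseEventType` code from the llmChat tutorial; the names here are my assumptions.

```swift
// Sketch: recognizing and handling the LatLon SSE event.
enum SseEventType { case message, error, latlon }  // add a LatLon case
var eventType: SseEventType = .message

// ... inside the per-line loop of llmPlay:
if line.hasPrefix("event: ") {
    switch line.dropFirst("event: ".count) {
    case "Error":  eventType = .error
    case "LatLon": eventType = .latlon
    default:       eventType = .message
    }
} else if line.hasPrefix("data: "), eventType == .latlon {
    // LatLon data lines decode into Location, not OllamaResponse
    if let data = line.dropFirst("data: ".count).data(using: .utf8),
       let location = try? JSONDecoder().decode(Location.self, from: data) {
        winner?(location)  // winner may be nil; optional-chain the call
    }
}
```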
We will discuss what goes into the winner closure at the llmPlay(appID:chatt:hints:winner:errMsg:)
call site in SubmitButton later.
Game UI
The Game UI is simply the ContentView with a Map(position:) taking the place of
ScrollViewReader(). We want to move the map’s camera around when there’s
a winner notification.
First add to your ContentView.swift:
import MapKit
then add an observable cameraPosition property to your ContentView().
@State var cameraPosition: MapCameraPosition = .userLocation(fallback: .automatic)
As we did when calling Map(position:) in the Maps tutorial, call it with the cameraPosition
defined above. The map’s camera will automatically move to match where this property points.
You can remove the scrollProxy property as we will no longer be showing a timeline of chatts.
Below all that, we want to show the hints returned by Ollama in a Text View above the TextField
View where user enters their guesses. The content of the Text View should be the observable
property hints in ChattViewModel where you’ve collected the hints from Ollama. Put these two
text boxes in a VStack().
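The layout described so far, minus styling, might be arranged roughly like this (the `viewModel` access and placeholder text are assumptions carried over from the llmChat tutorial; this is only a sketch):

```swift
// Sketch: Map(position:) replaces ScrollViewReader(); the hints
// Text sits above the guess TextField in a VStack.
Map(position: $cameraPosition)
VStack {
    Text(viewModel.hints)                              // hints from Ollama
    TextField("Your guess", text: $viewModel.message)  // user's guesses
}
```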
Next to these two text boxes, we want to show the SubmitButton:
- You want to update the `SubmitButton` from the `llmChat` tutorial to take an argument of type `MapCameraPosition`. So that `SubmitButton` can make changes to this parameter, it should be declared `@Binding`.
- When calling `SubmitButton`, since we don’t have a timeline of `chatt`s to show anymore, replace the parameter `scrollProxy` with the `cameraPosition` we defined above. Then in `SubmitButton` remove the call to `withAnimation { scrollTo(_:anchor:) }`, along with its surrounding `Task(priority:)`.
- In `SubmitButton`, call `llmPlay(appID:chatt:hints:winner:errMsg:)` in lieu of `llmChat(appID:chatt:errMsg:)`, passing it the `hints` property in `ChattViewModel`.
- As for the `winner` parameter of `llmPlay(appID:chatt:hints:winner:errMsg:)`, pass it the following closure, which moves the camera on the `Map` View to the city’s lat/lon:

  ```swift
  { loc in
      print("lat: \(loc.lat), lon: \(loc.lon)")
      cameraPosition = .camera(MapCamera(
          centerCoordinate: CLLocationCoordinate2D(latitude: loc.lat,
                                                   longitude: loc.lon),
          distance: 14000, heading: 0, pitch: 60)
      )
  }
  ```
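Pulling these bullet points together, the updated SubmitButton might be sketched as below. The `appID`, `username`, and view-model binding helpers are placeholders from my reading of the llmChat tutorial, not prescribed names.

```swift
// Hypothetical sketch of the updated SubmitButton; only the pieces
// the spec changes are shown, with placeholder names.
struct SubmitButton: View {
    @Binding var cameraPosition: MapCameraPosition  // replaces scrollProxy

    var body: some View {
        Button {
            Task {
                await ChattStore.shared.llmPlay(appID: appID,
                    chatt: Chatt(username: username,
                                 message: viewModel.message),
                    hints: viewModel.hintsBinding,
                    winner: { loc in
                        // move the Map's camera to the winning city
                        cameraPosition = .camera(MapCamera(
                            centerCoordinate: CLLocationCoordinate2D(
                                latitude: loc.lat, longitude: loc.lon),
                            distance: 14000, heading: 0, pitch: 60))
                    },
                    errMsg: viewModel.errMsgBinding)
            }
        } label: {
            Image(systemName: "paperplane")
        }
    }
}
```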
Finally, as usual, ContentView must pop up an alert dialog box if there
were any error messages.
Additional UX (optional)
The following UX feature is intended to increase the perceived responsiveness and interactivity of the app. You can choose to implement it to match the demo video, but no points will be deducted if you don’t (nor will there be extra credit if you do!).
While waiting for the first batch of hints from Ollama, the text box holding the
hints should show “Waiting for hints…”, and the SubmitButton should be disabled
and show a ProgressView(), which changes back to the “paper plane” icon once the
hints have finished arriving.
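One possible (hypothetical) way to implement this optional behavior, assuming you add an `isWaiting` flag to your view model that you set before sending the priming prompt and clear when the stream ends:

```swift
// Sketch: show a spinner and disable submission until the first
// batch of hints has arrived. isWaiting is a hypothetical flag.
Text(viewModel.hints.isEmpty ? "Waiting for hints…" : viewModel.hints)

Button {
    // submit the guess ...
} label: {
    if viewModel.isWaiting {
        ProgressView()                   // spinner while waiting
    } else {
        Image(systemName: "paperplane")  // back to the paper plane
    }
}
.disabled(viewModel.isWaiting)           // no submissions while waiting
```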
That’s all for Project 2!
Run and test to verify and debug
Be sure to run your front end against your back end. You will not get full credit if your front end is not set up to work with your back end!
Submission guidelines
Be sure you have submitted your modified back end in addition to submitting
your updated front end. As usual, we will only grade files committed to the
main branch. If you use multiple branches, please merge them all to the
main branch for submission.
Push your front-end code to the same GitHub repo you’ve submitted your back-end code:
- Open GitHub Desktop and click on `Current Repository` on the top left of the interface
- Click on the GitHub repo you created at the start of this tutorial
- Add a Summary to your changes and click `Commit to main` at the bottom of the left pane
- If you have pushed code to your repo, click `Pull Origin` to sync up the repo on your laptop
- Finally, click `Push Origin` to push all changes to GitHub
Go to the GitHub website to confirm that your front-end files have been uploaded to your GitHub
repo under the folder llmplay. Confirm that your repo has a folder structure outline similar to the following.
If your folder structure is not as outlined, our script will not pick up your submission and, further, you may
have problems getting started on later tutorials. There could be other files or folders in your local folder
not listed below; don’t delete them. As long as you have installed the course .gitignore as per the instructions
in Preparing GitHub for Reactive Tutorials, only
files needed for grading will be pushed to GitHub.
reactive
|-- chatterd
|-- chatterd.crt
|-- llmchat
|-- llmplay
    |-- swiftUIChatter
        |-- swiftUIChatter.xcodeproj
        |-- swiftUIChatter
        # and other files or folders
Verify that your Git repo is set up correctly: on your laptop, grab a new clone of your repo, then build and run your submission to make sure that it works. You will get a ZERO if your project doesn’t build, run, or open.
IMPORTANT: If you work in a team, put your teammate’s name and uniqname in your repo’s README.md (click the pencil icon at the upper right corner of the README.md box on your git repo) so that we know. Otherwise, we could mistakenly think that you were cheating and report you to the Honor Council, which would be a hassle to undo. You don’t need a README.md if you work by yourself.
Review your information on the Tutorial and Project Links sheet. If you’ve changed your teaming arrangement from the previous lab’s, please update your entry. If you’re using a different GitHub repo from the previous lab’s, invite eecsreactive@umich.edu to your new GitHub repo and update your entry.
| Prepared by Chenglin Li, Xin Jie ‘Joyce’ Liu, Sugih Jamin | Last updated: February 6th, 2026 |