Tutorial: llmTools SwiftUI
Cover Page
This tutorial can be completed on the iOS simulator.
We assume that you’re building off your llmChat tutorial’s front end.
If you haven’t done that tutorial, you should complete it first. You
will also need the LocManager from the Maps tutorial.
To access your back end, you will need your self-signed certificate
installed on your front-end.
The front-end work involves mostly:
- preparing the toolbox and tool-invocation infrastructure,
- loading the tool schema file,
- incorporating the Location Manager from the Maps tutorial and calling it from the get_location tool,
- adding llmTools(appID:chatt:errMsg:) to ChattStore to process tool calls in the SSE stream:
  - adding tool management to Ollama message handling,
  - handling the tool_calls SSE event,
  - calling the tool(s) if available and returning the results to Ollama,
  - or reporting an error to the user if the tool called is not available.
Preparing your GitHub repo
In the following, replace /YOUR:TUTORIALS/ with the name of your tutorials folder.
- On your laptop, navigate to /YOUR:TUTORIALS/
- Unzip your llmchat.zip file. Double check that you still have a copy of the zipped file for future reference!
- Rename your newly unzipped folder llmtools
- Push your local /YOUR:TUTORIALS/ repo to GitHub and make sure there are no git issues:
  - Open GitHub Desktop and click on Current Repository on the top left of the interface
  - Click on your assignment GitHub repo
  - Add Summary to your changes and click Commit to main
  - If you have pushed other changes to your Git repo, click Pull Origin to synch up the clone on your laptop
  - Finally click on Push Origin to push changes to GitHub
Go to the GitHub website to confirm that your folders follow this structure outline:
reactive
|-- chatterd
|-- chatterd.crt
|-- llmtools
    |-- swiftUIChatter
        |-- swiftUIChatter.xcodeproj
        |-- swiftUIChatter
        |-- tools
# and other files or folders
We added the tools folder in the Tool definition JSON section earlier.
Go ahead and create it now if you haven't. We will need it to complete the front end.
/YOUR:TUTORIALS/ folder on your laptop should contain zipped files from other tutorials in addition.
If the folders in your GitHub repo do not have the above structure, we will not be able to grade your assignment and you will get a ZERO.
Toolbox
Let us start by creating a toolbox to hold our tools. Create a new Empty file and name
it Toolbox.swift. Add import Foundation to the top of the file.
The contents of this file can be categorized into three purposes: tool/function definition, the toolbox itself, and tool use (or function calling).
Tool/function definition
Ollama tool schema: at the top of Ollama’s JSON tool definition is a JSON Object representing a tool schema. The tool schema is defined using nested JSON Objects and JSON Arrays. Add the full nested definitions of Ollama’s tool schema to your file:
struct OllamaToolSchema: Codable {
    let type: String
    let function: OllamaToolFunction
}

struct OllamaToolFunction: Codable {
    let name: String
    let description: String
    let parameters: OllamaFunctionParams?
}

struct OllamaFunctionParams: Codable {
    let type: String
    let properties: [String: OllamaParamProp]?
    let required: [String]?
}

struct OllamaParamProp: Codable {
    let type: String
    let description: String
    let enum_: [String]?

    enum CodingKeys: String, CodingKey {
        // to map JSON field to property;
        // if you specify one, you must specify all
        case type = "type"
        case description = "description"
        case enum_ = "enum"
    }
}
Location tool schema: in this tutorial, we have only one tool on device, get_location. Instead
of manually instantiating an OllamaToolSchema for each tool, we use Swift’s Codable protocol to
create one for us from a JSON schema file. First we must add the file to Xcode’s Copy Bundle Resources.
So as not to have to add and remove files manually, we make this process “reactive,” by having Xcode
watch a folder for file additions and deletions:
- From the Finder, find and drag the tools folder you created earlier to your project (first item in the Project Navigator, the left sidebar) in Xcode
- In the dialog box that pops up, select in the first two drop-down menus (if you don’t see the following choices, it could mean you dropped the folder onto the second item in the Project Navigator instead of the first. Click Cancel and try the first step again):
  - Action: “Reference files in place”
  - Groups: “Create folders”
  - Targets: should have your app name already checked; if not, check it.
- Click Finish
- Confirm that on your Project Navigator pane, the folder shows up in blue, not yellow, along with an “alias” arrow on its bottom left.
To further verify that you’ve successfully added the get_location.json schema file to your Copy Bundle Resources:
- Click your project (top item on the Project Navigator)
- Under TARGETS in the pane next to the Project Navigator, your app should already be selected. If not, select it.
- Click on the Build Phases tab along the top of the pane next to TARGETS
- Expand the Copy Bundle Resources item (usually the last one)
- Confirm that get_location.json is listed.
Next we define the jsonToSchema(_:) function to load a schema file from Xcode’s Bundle and decode the
schema into an instance of OllamaToolSchema. You can put this function definition at the top level of your
Toolbox.swift file:
func jsonToSchema(_ tool: String) -> OllamaToolSchema {
    guard let url = Bundle.main.url(forResource: tool, withExtension: "json"), // prior to Xcode 16: subdirectory: "tools"),
          let data = try? Data(contentsOf: url) else {
        fatalError("Failed to find \(tool).json in bundle")
    }
    do {
        return try JSONDecoder().decode(OllamaToolSchema.self, from: data)
    } catch {
        fatalError("Failed to decode \(tool).json: \(error)")
    }
}
Location tool function: we implement the get_location tool as a getLocation(_:) function
that reads the device’s latitude and longitude data off the Location Manager from the Maps
tutorial. Here’s the definition of the getLocation(_:) function. Put it at the top-level of your
Toolbox.swift file also:
func getLocation(_ argv: [String]) async -> String? {
    "latitude: \(LocManagerViewModel.shared.location.lat), longitude: \(LocManagerViewModel.shared.location.lon)"
}
Location Manager
We don’t need all of the functionality of the LocManager, but copying the whole LocManager.swift
file from the Maps tutorial is the least amount of work with the lowest chance of introducing bugs:
open both the Maps and llmTools projects in Xcode, then alt-drag the
LocManager.swift file from the Project Navigator pane of the Maps project to the llmTools
project’s. You will also need to request permission to read the location as you did in the
Maps tutorial.
If you have not completed the Maps tutorial, please follow the instructions in its Location manager section to set up the LocManager. You don’t need to complete the rest of the Maps tutorial.
Update the Location struct in LocManager.swift to conform to the Decodable protocol
and give the speed property a default value of 0.0.
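The updated Location struct might look like the following sketch, assuming the lat and lon property names from the Maps tutorial’s LocManager; adjust it to match your own LocManager.swift. Omitting speed from CodingKeys lets the synthesized decoder skip it, so it keeps its default:

```swift
import Foundation

// Sketch only: field names other than speed are assumptions
// based on the Maps tutorial's Location struct.
struct Location: Decodable {
    let lat: Double
    let lon: Double
    var speed: Double = 0.0   // not decoded; keeps its default value

    enum CodingKeys: String, CodingKey {
        case lat, lon   // speed omitted, so the decoder leaves it at 0.0
    }
}
```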
Then in your swiftUIChatterApp.swift file, add the following initializer to the start of the
init() block of your swiftUIChatterApp:
LocManager.shared.startUpdates()
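In context, the call goes at the start of your app struct’s init(); a minimal sketch, assuming the swiftUIChatterApp structure carried over from the earlier tutorials:

```swift
// Sketch only: your init() likely contains other setup from llmChat.
@main
struct swiftUIChatterApp: App {
    init() {
        LocManager.shared.startUpdates()  // begin location updates at launch
        // ... existing initialization from the llmChat tutorial ...
    }

    var body: some Scene {
        WindowGroup { ContentView() }
    }
}
```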
The toolbox
Even though we have only one resident tool in this tutorial, we build a generalized architecture that can hold multiple tools and invoke the right tool dynamically. To that end, we’ll use a switch table (or jump table or, more fancily, a service-locator registry) as the data structure for our toolbox. We implement the switch table as a dictionary. The “keys” in the dictionary are the names of the tools/functions. Each “value” is a record containing the tool’s definition/schema and a pointer to the function implementing the tool. To send a tool as part of a request to Ollama, we look up its schema in the switch table and copy it to the request. To invoke a tool called by Ollama in its response, we look up the tool’s function in the switch table and invoke the function.
Back in your Toolbox file, add the following async tool-function type. Also add a record type
comprising two properties: a tool definition schema (schema) and an async tool function (function):
typealias ToolFunction = ([String]) async -> String?

struct Tool {
    let schema: OllamaToolSchema
    let function: ToolFunction
}
Now create a switch-table toolbox and populate it with the location tool using the jsonToSchema(_:)
function we created earlier:
let TOOLBOX = [
    "get_location": Tool(schema: jsonToSchema("get_location"), function: getLocation),
]
Tool use or function calling
Ollama tool call: Ollama’s JSON tool call comprises a JSON Object containing a nested JSON
Object carrying the name of the function and the arguments to pass to it. Add these nested struct
definitions representing Ollama’s tool call JSON to your Toolbox.swift file:
struct OllamaToolCall: Codable {
    let function: OllamaFunctionCall
}

struct OllamaFunctionCall: Codable {
    let name: String
    let arguments: [String: String]
}
Tool invocation: finally, here’s the tool invocation function. We call this function to execute
any tool call we receive in an Ollama response. It looks up the tool name in the toolbox. If the
tool is resident, it runs it and returns the result; otherwise it returns nil.
func toolInvoke(function: OllamaFunctionCall) async -> String? {
    if let tool = TOOLBOX[function.name] {
        var argv = [String]()
        for label in tool.schema.function.parameters?.required ?? [] {
            // collect arguments in order: Dictionary doesn't preserve insertion
            // order, and arguments may also arrive out of order from the back end
            if let arg = function.arguments[label] {
                argv.append(arg)
            }
        }
        return await tool.function(argv)
    }
    return nil
}
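As a hypothetical usage sketch (not part of the tutorial code), here is how a tool call arriving from Ollama would be dispatched. get_location declares no required parameters, so the arguments dictionary is empty:

```swift
// Hypothetical: somewhere in an async context handling Ollama's response.
let call = OllamaFunctionCall(name: "get_location", arguments: [:])
if let result = await toolInvoke(function: call) {
    // result holds the "latitude: ..., longitude: ..." string from getLocation
    print(result)
} else {
    // nil means the named tool is not resident in TOOLBOX
}
```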
That concludes our toolbox definition.
ChattViewModel
In your ChattViewModel in swiftUIChatterApp.swift file:
- change its onTrailingEnd property to qwen3:8b (you will be using qwen3:0.6b instead when testing your back end).
- We will not be sending any system prompt in this tutorial. Set sysmsg to "".
- Set message to "What is the weather at my location?"
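The resulting property values might look like the following sketch; the declarations (let vs. var, exact names) follow the llmChat tutorial and may differ in your code:

```swift
// Sketch only: adjust to your ChattViewModel's actual declarations.
let onTrailingEnd = "qwen3:8b"   // "qwen3:0.6b" when testing your back end
var sysmsg = ""                  // no system prompt in this tutorial
var message = "What is the weather at my location?"
```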
ChattStore
structs
In your ChattStore.swift, add
- a ToolCalls case arm to your SseEventType,
- thinking and toolCalls properties, along with the CodingKeys, to your OllamaMessage, and
- a tools property to your OllamaRequest; also change the messages property of OllamaRequest to be mutable (var):
enum SseEventType { case Error, Message, ToolCalls }

struct OllamaMessage: Codable {
    // ...
    let thinking: String?
    let toolCalls: [OllamaToolCall]?

    enum CodingKeys: String, CodingKey {
        // to map JSON field to property;
        // if one is specified, must specify all
        case role = "role"
        case content = "content"
        case thinking = "thinking"
        case toolCalls = "tool_calls"
    }
}

struct OllamaRequest: Encodable {
    // ...
    // change messages property to var
    // ...
    var tools: [OllamaToolSchema]?
}
With the addition of the thinking and toolCalls properties to OllamaMessage, you’d need to
add thinking: nil, toolCalls: nil arguments to existing instantiations of OllamaMessage in
ChattStore.swift.
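For example, an existing instantiation like the following would gain the two nil arguments (the prompt variable name here is illustrative):

```swift
// before: OllamaMessage(role: "user", content: prompt)
OllamaMessage(role: "user", content: prompt, thinking: nil, toolCalls: nil)
```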
llmTools(appID:chatt:errMsg:)
Rename your llmChat(appID:chatt:errMsg:) function to llmTools(appID:chatt:errMsg:).
Update your apiUrl to point to /llmtools.
Find your instantiation of OllamaRequest, make the ollamaRequest variable mutable (var),
and add the following tools property to the instantiation:
tools: TOOLBOX.isEmpty ? nil : []
Once ollamaRequest is instantiated, populate its tools property with on-device tools:
// append all on-device tools to ollamaRequest
for (_, tool) in TOOLBOX {
ollamaRequest.tools?.append(tool.schema)
}
Next remove the following lines (we will put them inside a loop below):
guard let requestBody = try? JSONEncoder().encode(ollamaRequest) else {
    errMsg.wrappedValue = "llmChat: JSONEncoder error"
    return
}
and about six lines down (keep the intervening lines):
request.httpBody = requestBody
To accommodate on-device tool calls, we use a flag, sendNewPrompt, to indicate whether we have a
prompt to send to Ollama. Initially, sendNewPrompt is set to true so that the user’s initial
prompt is always sent to Ollama. Subsequently, if Ollama makes a call to an on-device tool, we send
the result of the tool call as a new prompt to Ollama. Add the following code right before the start
of the do-catch block, and move the do-catch block inside the new while loop:
var sendNewPrompt = true
while sendNewPrompt {
    sendNewPrompt = false
    guard let requestBody = try? JSONEncoder().encode(ollamaRequest) else {
        errMsg.wrappedValue = "llmTools: JSONEncoder error"
        return
    }
    request.httpBody = requestBody

    // leave existing do-catch block here
} // while sendNewPrompt
Back in the do-block, since we now have an additional .ToolCalls arm in SseEventType, when we
get an empty incoming line, we need to set sseEvent back to the default .Message type regardless
of the current sseEvent type. Move the sseEvent = .Message line out of the if sseEvent == .Error
{} block and put it right before continue.
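The empty-line handling then looks roughly like this sketch, assuming the parsing-loop structure from the llmChat tutorial:

```swift
if line.isEmpty {
    // an empty line terminates an SSE event: reset to the
    // default event type no matter which event we were in
    sseEvent = .Message
    continue
}
```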
In parsing an SSE event line, in addition to the existing two cases, if we see event ==
"tool_calls", set sseEvent = .ToolCalls.
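Assuming your llmChat code parses the event name into a variable called event, the three cases might look like the sketch below; the "error" and "message" event names are assumptions based on the existing two cases in your own code:

```swift
// Sketch only: match the event-name strings your llmChat code already uses.
if event == "error" {
    sseEvent = .Error
} else if event == "message" {
    sseEvent = .Message
} else if event == "tool_calls" {
    sseEvent = .ToolCalls
}
```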
In parsing an SSE data line, replace the content of the else if parts[0].starts(with: "data") {} block
with:
if sseEvent == .Error {
    errMsg.wrappedValue += String(describing: parts[1].trimmingCharacters(in: .whitespaces).utf8)
    line = ""
    continue
}
let data = Data(parts[1].trimmingCharacters(in: .whitespaces).utf8)
do {
    let ollamaResponse = try decoder.decode(OllamaResponse.self, from: data)
    if let token = ollamaResponse.message.content, !token.isEmpty {
        resChatt.message?.append(token)
    } else if let token = ollamaResponse.message.thinking, !token.isEmpty {
        resChatt.message?.append(token)
    }
    // check tool call and make the tool call
} catch {
    errMsg.wrappedValue += "\(error)\n\(apiUrl)\n\(String(data: data, encoding: .utf8) ?? "decoding error")"
}
If we’re in the .ToolCalls SSE event state, we handle the tool call. Replace the comment // check tool call and make the tool call with:
if sseEvent == .ToolCalls, let toolCalls = ollamaResponse.message.toolCalls {
    // message.content is usually empty
    for toolCall in toolCalls {
        let toolResult = await toolInvoke(function: toolCall.function)
        if toolResult != nil {
            // reuse OllamaMessage to carry the tool result
            // to be sent back to Ollama
            ollamaRequest.messages = [OllamaMessage(role: "tool", content: toolResult, thinking: nil, toolCalls: nil)]
            // don't send tools multiple times
            ollamaRequest.tools = nil
            // send result back to Ollama
            sendNewPrompt = true
        } else {
            // tool unknown, report to user as error
            errMsg.wrappedValue += "llmTools ERROR: tool '\(toolCall.function.name)' called"
            resChatt.message?.append("\n\n**llmTools Error**: tool '\(toolCall.function.name)' called\n\n")
        }
    }
}
And we’re done with llmTools(appID:chatt:errMsg:) and with ChattStore!
SubmitButton
Finally, in SubmitButton() in ContentView.swift, replace the call to
llmChat(appID:chatt:errMsg:) with a call to llmTools(appID:chatt:errMsg:).
That should do it for the front end!
Run and test to verify and debug
Please see the End-to-end testing section of the spec to test your front-end implementation.
Once you’ve finished testing, change your serverUrl back to YOUR_SERVER_IP so that
we know what your server IP is. You will not get full credit if your front end is
not set up to work with your back end!
Front-end submission guidelines
We will only grade files committed to the main branch. If you’ve created multiple
branches, please merge them all to the main branch for submission.
Push your front-end code to the same GitHub repo you’ve submitted your back-end code:
- Open GitHub Desktop and click on Current Repository on the top left of the interface
- Click on the GitHub repo you created at the start of this tutorial
- Add Summary to your changes and click Commit to main at the bottom of the left pane
- Since you have pushed your back-end code, you’ll have to click Pull Origin to synch up the repo on your laptop
- Finally click Push Origin to push all changes to GitHub
Go to the GitHub website to confirm that your front-end files have been uploaded to your GitHub repo
under the folder llmtools. Confirm that your repo has a folder structure outline similar to the following. If
your folder structure is not as outlined, our script will not pick up your submission and, further, you may have
problems getting started on later tutorials. There could be other files or folders in your local folder not listed
below; don’t delete them. As long as you have installed the course .gitignore as per the instructions in Preparing
GitHub for Reactive, only files needed for grading will
be pushed to GitHub.
reactive
|-- chatterd
|-- chatterd.crt
|-- llmtools
    |-- swiftUIChatter
        |-- swiftUIChatter.xcodeproj
        |-- swiftUIChatter
        |-- tools
# and other files or folders
Verify that your Git repo is set up correctly: on your laptop, grab a new clone of your repo, then build and run your submission to make sure that it works. You will get ZERO points if your tutorial doesn’t build, run, or open.
IMPORTANT: If you work in a team, put your teammate’s name and uniqname in your repo’s README.md (click the pencil icon at the upper right corner of the README.md box on your git repo) so that we’d know. Otherwise, we could mistakenly think that you were cheating and accidentally report you to the Honor Council, which would be a hassle to undo. You don’t need a README.md if you work by yourself.
Review your information on the Tutorial and Project Links sheet. If you’ve changed your teaming arrangement from previous tutorial’s, please update your entry. If you’re using a different GitHub repo from previous tutorial’s, invite eecsreactive@umich.edu to your new GitHub repo and update your entry.
| Prepared by Xin Jie ‘Joyce’ Liu, Chenglin Li, and Sugih Jamin | Last updated March 8th, 2026 |