
Invoke - Local LLM Client

kazuhiko sugimoto

Store rating: 3.2

Price: Free

AppRecs review analysis

AppRecs rating: 3.0. Trustworthiness: 74 out of 100. Review manipulation risk: 26 out of 100. Based on an analyzed sample of reviews.

AppRecs Rating: 3.0 ★★☆☆☆

Ratings breakdown

5 star: 17%
4 star: 50%
3 star: 0%
2 star: 0%
1 star: 33%

What to know

Low review manipulation risk: 26%

Credible reviews: 74% trustworthiness score from analyzed reviews

About Invoke - Local LLM Client

Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.

Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience!

This iOS app connects to a locally hosted Large Language Model (LLM) server and enables seamless, natural conversations.
Compatible with Ollama and LM Studio via HTTP, it provides real-time message streaming and intuitive chat history management.
The app operates entirely within a local network—no internet connection required—making it ideal for those who prioritize privacy and security.
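
For context on the HTTP compatibility mentioned above: Ollama exposes a documented /api/chat endpoint (port 11434 by default), and LM Studio serves an OpenAI-compatible endpoint (port 1234 by default). The sketch below is a hypothetical illustration of how an iOS client might stream a reply from an Ollama server on the LAN; the host address, model name, and types are placeholders, not this app's actual implementation.

import Foundation

// Hypothetical sketch: stream a chat reply from a local Ollama server on the LAN.
// Assumes Ollama's documented /api/chat endpoint (default port 11434) and a model
// name ("llama3") that has already been pulled on the server machine.
struct OllamaChunk: Decodable {
    struct Message: Decodable { let content: String }
    let message: Message?
    let done: Bool
}

func streamChat(prompt: String,
                host: String = "http://192.168.1.10:11434") async throws {
    var request = URLRequest(url: URL(string: "\(host)/api/chat")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let payload: [String: Any] = [
        "model": "llama3",
        "messages": [["role": "user", "content": prompt]],
        "stream": true
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: payload)

    // Ollama streams newline-delimited JSON objects; decode each line as it arrives.
    let (bytes, _) = try await URLSession.shared.bytes(for: request)
    for try await line in bytes.lines {
        let chunk = try JSONDecoder().decode(OllamaChunk.self, from: Data(line.utf8))
        if let token = chunk.message?.content {
            print(token, terminator: "")  // a real UI would append this to the chat bubble
        }
        if chunk.done { break }
    }
}

Reading the body as an async byte stream rather than waiting for the full response is what makes token-by-token, real-time display possible over plain HTTP. Note that plain-HTTP requests to LAN hosts on iOS also require an App Transport Security exception (for example, NSAllowsLocalNetworking) in Info.plist.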

Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Auto-saving and browsing chat history (see the sketch after this list)
- Server and model selection via settings screen
- Supports Dark Mode
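
The listing does not say how the auto-saved chat history is stored. As a purely hypothetical sketch, a client could persist each conversation as Codable JSON in the app's Documents directory; the type and file names below are illustrative only.

import Foundation

// Hypothetical sketch of auto-saving chat history; the real app's storage
// format is not documented in this listing.
struct ChatMessage: Codable {
    let role: String      // "user" or "assistant"
    let content: String
    let timestamp: Date
}

struct ChatHistoryStore {
    // One JSON file in the app's Documents directory (illustrative path).
    private let fileURL: URL = {
        let docs = FileManager.default.urls(for: .documentDirectory,
                                            in: .userDomainMask)[0]
        return docs.appendingPathComponent("chat-history.json")
    }()

    func save(_ messages: [ChatMessage]) throws {
        let data = try JSONEncoder().encode(messages)
        try data.write(to: fileURL, options: .atomic)
    }

    func load() -> [ChatMessage] {
        guard let data = try? Data(contentsOf: fileURL) else { return [] }
        return (try? JSONDecoder().decode([ChatMessage].self, from: data)) ?? []
    }
}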