Invoke - Local LLM Client

Price
Free
Category
Developer Tools, Utilities
Last update
Sep 09, 2025
Publisher
kazuhiko sugimoto Contact publisher

Ratings & Reviews performance

Ratings & Reviews performance provides an overview of what users think of your app. Here are the key metrics to help you identify how users rate your app and how successful your review management strategy is.

Number of reviews, total: 1
Avg rating, total: ⭐4.0

Description


Note: This app is intended for users who can set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.

Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience. This iOS app connects to a locally hosted Large Language Model (LLM) server and enables natural conversations. Compatible with Ollama and LM Studio via HTTP, it provides real-time message streaming and intuitive chat history management. The app operates entirely within a local network, with no internet connection required, making it ideal for those who prioritize privacy and security.

Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Auto-saving and browsing of chat history
- Server and model selection via settings screen
- Dark Mode support
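To illustrate the kind of real-time streaming the description refers to: Ollama's chat endpoint streams its reply as newline-delimited JSON, one object per line, with a final object marked "done". The sketch below shows how a client might reassemble such a stream into a full message. The sample chunks and model name are illustrative, not captured from a real server.

```python
import json

# Illustrative NDJSON chunks in the shape streamed by Ollama's /api/chat
# endpoint: one JSON object per line, final object has "done": true.
sample_stream = [
    '{"model":"llama3","message":{"role":"assistant","content":"Hel"},"done":false}',
    '{"model":"llama3","message":{"role":"assistant","content":"lo!"},"done":false}',
    '{"model":"llama3","message":{"role":"assistant","content":""},"done":true}',
]

def assemble_reply(lines):
    """Concatenate the content fragments of a streamed chat reply."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk["message"]["content"])
        if chunk.get("done"):  # last chunk of the reply
            break
    return "".join(parts)

print(assemble_reply(sample_stream))  # -> Hello!
```

A real client would read these lines from the HTTP response body as they arrive, appending each fragment to the chat bubble for a typewriter-style display.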

Screenshots

https://is1-ssl.mzstatic.com/image/thumb/PurpleSource211/v4/ff/0f/3a/ff0f3a9f-c37b-0842-0eca-601ba5d5179c/Settings-iPad-Pro-13inch.png/2064x2752.png
https://is1-ssl.mzstatic.com/image/thumb/PurpleSource211/v4/b7/e4/01/b7e40189-65b3-d8b1-55aa-2201e39c4d1d/Simulator_Screenshot_-_iPad_Pro_13-inch__U0028M4_U0029_-_2025-08-07_at_18.00.45.png/2064x2752.png
https://is1-ssl.mzstatic.com/image/thumb/PurpleSource211/v4/90/54/df/9054df69-1e83-98a6-3e10-517dc48fa34d/Simulator_Screenshot_-_iPad_Pro_13-inch__U0028M4_U0029_-_2025-08-07_at_18.02.50.png/2064x2752.png
