Bono AI
5 min read · Bono AI Team

Oh my AI! is coming to desktop

Oh my AI! is preparing its desktop version powered by Tauri. A local, fast and private AI, soon installable directly on your machine — no browser required.

You know Oh my AI!, our AI chat application that runs entirely in the browser. Today, we’re excited to share what’s next: we’re building a desktop version powered by Tauri.

Why go desktop?

The browser version works great, but it has its limitations. Browsers impose restrictions on system resource access, memory management, and local storage. With a native application, we’ll be able to push the boundaries:

  • Better performance — Direct GPU access without the browser’s abstraction layers. Inference will be faster and more stable.
  • Memory management — No more limits imposed by browser tabs. Larger models will run comfortably.
  • Native experience — A real application in your taskbar, with keyboard shortcuts, notifications, and instant startup.
  • Still private — Just like the web version, everything will stay on your machine. Zero data sent externally.

Why Tauri?

We could go with Electron, but Tauri is the natural choice for our project:

  • Lightweight — Tauri uses your system’s native web engine (WebView2 on Windows, WebKit on macOS/Linux) instead of bundling an entire Chromium. The result: an app that’s a few MB instead of hundreds.
  • Rust under the hood — The backend is written in Rust, bringing memory safety, performance, and low-level system access.
  • Open source — Tauri is an open-source project, aligned with Bono AI’s values.
  • Cross-platform — A single codebase for Windows, macOS, and Linux.

What we’re planning

The architecture will be straightforward: Oh my AI!'s SvelteKit frontend will run inside the native WebView, while Tauri provides the bridge to the system. WebLLM (MLC) will continue to handle model inference via WebGPU, with the advantage of more direct access to hardware resources.
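To make the frontend–backend bridge concrete, here is a minimal sketch of what a Tauri command could look like. The command name `system_report` and its use are purely illustrative, not Oh my AI!'s actual API; the function is written as plain Rust (with the Tauri wiring shown in comments) so the sketch stands on its own.

```rust
// Sketch of a Tauri "command" -- the bridge the WebView frontend calls.
// In a real Tauri app this function would carry the #[tauri::command]
// attribute and be registered via tauri::generate_handler![system_report];
// it is shown as a plain function here so the example compiles standalone.
// The name `system_report` is a hypothetical example.
fn system_report() -> String {
    // std::env::consts exposes the compile-time target OS and architecture,
    // which the UI could use, e.g., to suggest a suitable model build.
    format!("{} / {}", std::env::consts::OS, std::env::consts::ARCH)
}

fn main() {
    // The SvelteKit side would call this roughly as:
    //   const report = await invoke("system_report");
    println!("{}", system_report());
}
```

The point of the command layer is that the frontend never touches the system directly: every privileged operation goes through an explicitly declared Rust function.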

On first launch, you’ll pick a model and the app will download it to a local directory. After that, everything happens locally — no internet needed to chat.
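The first-launch flow above boils down to a simple check: is the chosen model already on disk? A minimal sketch, assuming models live under an app data directory (the directory layout and file names here are hypothetical, not Oh my AI!'s actual storage scheme):

```rust
use std::path::{Path, PathBuf};

// Hypothetical on-disk layout: <data_dir>/models/<model_id>
fn model_path(data_dir: &Path, model_id: &str) -> PathBuf {
    data_dir.join("models").join(model_id)
}

// True when the model still has to be downloaded on first launch.
fn needs_download(data_dir: &Path, model_id: &str) -> bool {
    !model_path(data_dir, model_id).exists()
}

fn main() {
    let data = std::env::temp_dir().join("ohmyai-demo");
    let model = "demo-model.bin";
    if needs_download(&data, model) {
        // A real app would stream the weights from a model hub here;
        // this sketch just creates an empty placeholder file.
        let path = model_path(&data, model);
        std::fs::create_dir_all(path.parent().unwrap()).unwrap();
        std::fs::write(&path, b"").unwrap();
    }
    // Once cached, the model resolves locally -- no network needed to chat.
    println!("model cached: {}", !needs_download(&data, model));
}
```

Every launch after the first hits the cached path, which is what makes fully offline chat possible.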

Target platforms

We’re planning to support:

  • macOS (Apple Silicon and Intel)
  • Windows (x64)
  • Linux (AppImage, deb)

Upcoming features

The Tauri version will open the door to features we can’t offer in the browser:

  • Advanced conversation management with persistent local storage
  • Support for larger models
  • System integration (global shortcuts, context menu)
  • Automatic updates
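Persistent local storage is the feature the browser makes hardest, so here is one way it could look on desktop: one file per conversation in a local directory. This is a sketch under assumed names (`save_conversation`, `load_conversation`, plain-text files); a real implementation would more likely use JSON or SQLite.

```rust
use std::fs;
use std::path::{Path, PathBuf};

// Hypothetical layout: one plain-text file per conversation.
fn conversation_file(dir: &Path, id: &str) -> PathBuf {
    dir.join(format!("{id}.txt"))
}

fn save_conversation(dir: &Path, id: &str, text: &str) -> std::io::Result<()> {
    fs::create_dir_all(dir)?; // ensure the storage directory exists
    fs::write(conversation_file(dir, id), text)
}

fn load_conversation(dir: &Path, id: &str) -> std::io::Result<String> {
    fs::read_to_string(conversation_file(dir, id))
}

fn main() -> std::io::Result<()> {
    let dir = std::env::temp_dir().join("ohmyai-conversations");
    save_conversation(&dir, "hello", "user: hi\nassistant: hello!")?;
    println!("{}", load_conversation(&dir, "hello")?);
    Ok(())
}
```

Because the files never leave the machine, this keeps the same privacy guarantee as the in-browser version while surviving tab closes and browser storage eviction.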

Follow the progress

Development is happening in the open on GitHub. Subscribe to our newsletter to be notified when the first release drops. Local, open-source AI is coming soon!