DataVyn Labs × Ollama Agents multi-model AI chat workspace

We built a clean, minimal AI chat interface powered by Ollama Cloud Models, designed as a fast workspace for trying multiple frontier LLMs in one place.

You only need a single Ollama Cloud API key to chat with 19+ top models—no separate OpenAI or Gemini keys, no billing setup, and no credit card required.

This project was developed by DataVyn Labs (https://github.com/DataVyn-labs).

What it does

  • Talk to 19+ Ollama cloud models (OpenAI, DeepSeek, Qwen, Gemini, Mistral, Kimi, GLM, MiniMax, and more) from a single UI.
  • Upload .txt, .pdf, .json, .py, .csv and send the content to the model.
  • Voice input via mic with automatic transcription.
  • Secure API key handling (session-only, never saved to disk).
  • Dark, Claude-style interface built entirely with Streamlit.
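The file-upload flow above boils down to one step: prepend the uploaded file's text as context before the user's prompt in the chat history. A minimal sketch of that idea (the helper name `build_messages` is illustrative, not the project's actual code):

```python
# Sketch: turn an uploaded file plus a user prompt into an
# Ollama-style chat `messages` list. Illustrative only.

def build_messages(filename, file_text, user_prompt, history=None):
    """Prepend uploaded file content as context before the user's prompt."""
    messages = list(history or [])
    if file_text:
        messages.append({
            "role": "user",
            "content": f"Contents of {filename}:\n\n{file_text}",
        })
    messages.append({"role": "user", "content": user_prompt})
    return messages

msgs = build_messages("notes.txt", "hello world", "Summarize this file.")
```

The same pattern works for any of the supported file types once their content has been extracted to plain text.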

Deployed app

You just need an Ollama Cloud API key (Settings → API Keys on ollama.com) and you’re ready to go.
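Under the hood, a single key is enough because every model goes through the same cloud chat endpoint. A hedged sketch of what such a call might look like with the standard library, assuming Ollama's `/api/chat` convention (the model name is a placeholder; check the Ollama Cloud docs for the current API shape):

```python
import json
import os
import urllib.request

# Sketch: call Ollama Cloud's chat endpoint with one API key.
# Endpoint and payload shape assume the /api/chat convention;
# verify against the current Ollama docs before relying on this.
API_KEY = os.environ.get("OLLAMA_API_KEY", "your-key-here")

def chat_request(model, messages):
    """Build the HTTP request for a non-streaming chat call."""
    payload = json.dumps({
        "model": model,          # placeholder model name
        "messages": messages,
        "stream": False,
    }).encode("utf-8")
    return urllib.request.Request(
        "https://ollama.com/api/chat",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = chat_request("deepseek-v3.1", [{"role": "user", "content": "Hi"}])
# To actually send it (requires a valid key):
# urllib.request.urlopen(req)
```

Keeping the key in an environment variable (or, as in the app, in session state only) means it never has to touch disk.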
