# AIChat: All-in-one LLM CLI Tool

[![CI](https://github.com/sigoden/aichat/actions/workflows/ci.yaml/badge.svg)](https://github.com/sigoden/aichat/actions/workflows/ci.yaml)
[![Crates](https://img.shields.io/crates/v/aichat.svg)](https://crates.io/crates/aichat)
[![Discord](https://img.shields.io/discord/1226737085453701222?label=Discord)](https://discord.gg/mr3ZZUB9hG)

AIChat is an all-in-one LLM CLI tool featuring Shell Assistant, CMD & REPL Mode, RAG, AI Tools & Agents, and more.

## Install

### Package Managers

- **Rust Developers:** `cargo install aichat`
- **Homebrew/Linuxbrew Users:** `brew install aichat`
- **Pacman Users:** `pacman -S aichat`
- **Windows Scoop Users:** `scoop install aichat`
- **Android Termux Users:** `pkg install aichat`

### Pre-built Binaries

Download pre-built binaries for macOS, Linux, and Windows from [GitHub Releases](https://github.com/sigoden/aichat/releases), extract them, and add the `aichat` binary to your `$PATH`.

## Features

### Multi-Providers

Integrate seamlessly with over 20 leading LLM providers through a unified interface. Supported providers include OpenAI, Claude, Gemini (Google AI Studio), Ollama, Groq, Azure-OpenAI, VertexAI, Bedrock, GitHub Models, Mistral, Deepseek, AI21, XAI Grok, Cohere, Perplexity, Cloudflare, OpenRouter, Ernie, Qianwen, Moonshot, ZhipuAI, MiniMax, Deepinfra, VoyageAI, and any OpenAI-compatible API provider.

### CMD Mode

Explore powerful command-line functionalities with AIChat's CMD mode.

![aichat-cmd](https://github.com/user-attachments/assets/6c58c549-1564-43cf-b772-e1c9fe91d19c)

### REPL Mode

Experience an interactive Chat-REPL with features like tab autocompletion, multi-line input support, history search, configurable keybindings, and custom REPL prompts.

![aichat-repl](https://github.com/user-attachments/assets/218fab08-cdae-4c3b-bcf8-39b6651f1362)

### Shell Assistant

Elevate your command-line efficiency. Describe your tasks in natural language, and let AIChat transform them into precise shell commands. AIChat intelligently adjusts to your OS and shell environment.

![aichat-execute](https://github.com/user-attachments/assets/0c77e901-0da2-4151-aefc-a2af96bbb004)

### Multi-Form Input

Accept diverse input forms such as stdin, local files and directories, and remote URLs, allowing flexibility in data handling.

| Input             | CMD                                  | REPL                             |
| ----------------- | ------------------------------------ | -------------------------------- |
| CMD               | `aichat hello`                       |                                  |
| STDIN             | `cat data.txt \| aichat`             |                                  |
| Last Reply        |                                      | `.file %%`                       |
| Local files       | `aichat -f image.png -f data.txt`    | `.file image.png data.txt`       |
| Local directories | `aichat -f dir/`                     | `.file dir/`                     |
| Remote URLs       | `aichat -f https://example.com`      | `.file https://example.com`      |
| External commands | ```aichat -f '`git diff`'```         | ```.file `git diff` ```          |
| Combine Inputs    | `aichat -f dir/ -f data.txt explain` | `.file dir/ data.txt -- explain` |

### Role

Customize roles to tailor LLM behavior, enhancing interaction efficiency and boosting productivity.

![aichat-role](https://github.com/user-attachments/assets/023df6d2-409c-40bd-ac93-4174fd72f030)

> The role consists of a prompt and model configuration.
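Roles are selected with the `--role` flag in CMD mode, or with the `.role` command inside the REPL. A minimal sketch; the role name and file below are illustrative and assume such a role has already been defined in your configuration:

```sh
# Apply a predefined role to a one-off query
# (the role name "emoji-translator" is illustrative; define your own roles first)
aichat --role emoji-translator "Have a nice day"

# Roles combine with other inputs, e.g. attaching a local file
aichat --role emoji-translator -f notes.txt "Summarize this"
```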
### Session

Maintain context-aware conversations through sessions, ensuring continuity in interactions.

![aichat-session](https://github.com/user-attachments/assets/56583566-0f43-435f-95b3-730ae55df031)

> The left side uses a session, while the right side does not.

### Macro

Streamline repetitive tasks by combining a series of REPL commands into a custom macro.

![aichat-macro](https://github.com/user-attachments/assets/23c2a08f-5bd7-4bf3-817c-c484aa74a651)

### RAG

Integrate external documents into your LLM conversations for more accurate and contextually relevant responses.

![aichat-rag](https://github.com/user-attachments/assets/359f0cb8-ee37-432f-a89f-96a2ebab01f6)

### Function Calling

Function calling supercharges LLMs by connecting them to external tools and data sources, enabling them to go beyond their core capabilities and tackle a wider range of tasks.

We have created the [https://github.com/sigoden/llm-functions](https://github.com/sigoden/llm-functions) repository to help you make the most of this feature.

#### AI Tools & MCP

Integrate external tools to automate tasks, retrieve information, and perform actions directly within your workflow.

![aichat-tool](https://github.com/user-attachments/assets/7459a111-7258-4ef0-a2dd-624d0f1b4f92)

#### AI Agents (CLI version of OpenAI GPTs)

AI Agent = Instructions (Prompt) + Tools (Function Calling) + Documents (RAG).

![aichat-agent](https://github.com/user-attachments/assets/0b7e687d-e642-4e8a-b1c1-d2d9b2da2b6b)

### Local Server Capabilities

AIChat includes a lightweight built-in HTTP server for easy deployment.

```
$ aichat --serve
Chat Completions API: http://127.0.0.1:8000/v1/chat/completions
Embeddings API:       http://127.0.0.1:8000/v1/embeddings
Rerank API:           http://127.0.0.1:8000/v1/rerank
LLM Playground:       http://127.0.0.1:8000/playground
LLM Arena:            http://127.0.0.1:8000/arena?num=2
```

#### Proxy LLM APIs

AIChat can act as a proxy for all integrated LLM providers, exposing them through a single OpenAI-compatible API so existing clients can switch providers and models without code changes.

Test with curl:

```sh
curl -X POST -H "Content-Type: application/json" -d '{
  "model":"claude:claude-3-5-sonnet-20240620",
  "messages":[{"role":"user","content":"hello"}],
  "stream":true
}' http://127.0.0.1:8000/v1/chat/completions
```
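The embeddings endpoint listed above can be exercised the same way. A minimal sketch, assuming the endpoint accepts an OpenAI-style embeddings request body and that an embedding-capable model is configured; the model name below is illustrative:

```sh
# Request embeddings through the local server
# (model name is illustrative; use one configured in your setup)
curl -X POST -H "Content-Type: application/json" -d '{
  "model":"openai:text-embedding-3-small",
  "input":["hello world"]
}' http://127.0.0.1:8000/v1/embeddings
```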
#### LLM Playground

A web application to interact with supported LLMs directly from your browser.

![aichat-llm-playground](https://github.com/user-attachments/assets/aab1e124-1274-4452-b703-ef15cda55439)

#### LLM Arena

A web platform to compare different LLMs side-by-side.

![aichat-llm-arena](https://github.com/user-attachments/assets/edabba53-a1ef-4817-9153-38542ffbfec6)

## Custom Themes

AIChat supports custom dark and light themes, which highlight response text and code blocks.

![aichat-themes](https://github.com/sigoden/aichat/assets/4012553/29fa8b79-031e-405d-9caa-70d24fa0acf8)

## Documentation

- [Chat-REPL Guide](https://github.com/sigoden/aichat/wiki/Chat-REPL-Guide)
- [Command-Line Guide](https://github.com/sigoden/aichat/wiki/Command-Line-Guide)
- [Role Guide](https://github.com/sigoden/aichat/wiki/Role-Guide)
- [Macro Guide](https://github.com/sigoden/aichat/wiki/Macro-Guide)
- [RAG Guide](https://github.com/sigoden/aichat/wiki/RAG-Guide)
- [Environment Variables](https://github.com/sigoden/aichat/wiki/Environment-Variables)
- [Configuration Guide](https://github.com/sigoden/aichat/wiki/Configuration-Guide)
- [Custom Theme](https://github.com/sigoden/aichat/wiki/Custom-Theme)
- [Custom REPL Prompt](https://github.com/sigoden/aichat/wiki/Custom-REPL-Prompt)
- [FAQ](https://github.com/sigoden/aichat/wiki/FAQ)

## License

Copyright (c) 2023-2025 aichat-developers.

AIChat is made available under the terms of either the MIT License or the Apache License 2.0, at your option.

See the LICENSE-APACHE and LICENSE-MIT files for license details.