<p align="center"><img src="tgpt.svg"></p>
# Terminal GPT (tgpt) 🚀
[![GitHub go.mod Go version](https://img.shields.io/github/go-mod/go-version/aandrew-me/tgpt)](https://github.com/aandrew-me/tgpt)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/aandrew-me/tgpt)](https://github.com/aandrew-me/tgpt/releases/latest)
![Arch Linux package](https://img.shields.io/archlinux/v/extra/x86_64/tgpt)
![Chocolatey Version](https://img.shields.io/chocolatey/v/tgpt)
tgpt is a cross-platform command-line interface (CLI) tool that lets you use AI chatbots in your terminal without requiring API keys.
### Currently available providers:
- [Blackbox AI](https://www.blackbox.ai/) (Blackbox model)
- [DuckDuckGo](https://duckduckgo.com/aichat) (Supports several models)
- [Groq](https://groq.com/) (Requires a free API key. LLaMA2-70b & Mixtral-8x7b)
- [KoboldAI](https://koboldai-koboldcpp-tiefighter.hf.space/) (koboldcpp/HF_SPACE_Tiefighter-13B)
- [Ollama](https://www.ollama.com/) (Supports many models)
- [OpenAI](https://platform.openai.com/docs/guides/text-generation/chat-completions-api) (All models; requires an API key; supports custom endpoints)
- [Phind](https://www.phind.com/agent) (Phind Model)
**Image Generation Model**: Blackbox AI
## Usage
```
Usage: tgpt [Flags] [Prompt]

Flags:
  -s, --shell        Generate and execute shell commands. (Experimental)
  -c, --code         Generate code. (Experimental)
  -q, --quiet        Give the response back without the loading animation
  -w, --whole        Give the response back as whole text
  -img, --image      Generate images from text
  --provider         Set the provider. Detailed information is given below. (Env: AI_PROVIDER)

Some additional options can be set. However, not all options are supported by every provider. Unsupported options are simply ignored.
  --model            Set the model
  --key              Set the API key
  --url              Set the OpenAI API endpoint URL
  --temperature      Set temperature
  --top_p            Set top_p
  --max_length       Set the maximum response length
  --log              Set a filepath to log the conversation to (for interactive modes)
  --preprompt        Set a preprompt
  -y                 Execute shell commands without confirmation

Options:
  -v, --version      Print version
  -h, --help         Print help message
  -i, --interactive  Start normal interactive mode
  -m, --multiline    Start multi-line interactive mode
  -cl, --changelog   See the changelog of versions
  -u, --update       Update the program

Providers:
  The default provider is phind. The AI_PROVIDER environment variable can be used to specify a different provider.
  Available providers: blackboxai, duckduckgo, groq, koboldai, ollama, openai and phind

  Provider: blackboxai
  Uses the Blackbox model. Great for developers

  Provider: duckduckgo
  Available models: gpt-4o-mini (default), meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo, mistralai/Mixtral-8x7B-Instruct-v0.1, claude-3-haiku-20240307

  Provider: groq
  Requires a free API key. Supports LLaMA2-70b & Mixtral-8x7b

  Provider: koboldai
  Uses koboldcpp/HF_SPACE_Tiefighter-13B only, answers from novels

  Provider: ollama
  Needs to be run locally. Supports many models

  Provider: openai
  Needs an API key to work and supports various models. Recognizes the OPENAI_API_KEY and OPENAI_MODEL environment variables. Supports custom URLs with --url

  Provider: phind
  Uses the Phind model. Great for developers

Examples:
  tgpt "What is internet?"
  tgpt -m
  tgpt -s "How to update my system?"
  tgpt --provider duckduckgo "What is 1+1"
  tgpt --provider openai --key "sk-xxxx" --model "gpt-3.5-turbo" "What is 1+1"
  cat install.sh | tgpt "Explain the code"
```
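A few more invocations built from the flags listed above. The prompts, file paths, and keys are placeholders, and the endpoint shown for `--url` is the standard OpenAI chat-completions URL; adjust it if you point tgpt at a different OpenAI-compatible server:

```bash
# Generate an image from a text prompt
tgpt -img "a lighthouse at sunset"

# Interactive mode, logging the conversation to a file
tgpt -i --log ~/tgpt-chat.log

# Pick a provider for the whole shell session instead of passing --provider each time
export AI_PROVIDER=duckduckgo
tgpt "Summarize the plot of Hamlet in two sentences"

# openai provider configured via environment variables and a custom endpoint
export OPENAI_API_KEY="sk-xxxx"
export OPENAI_MODEL="gpt-3.5-turbo"
tgpt --provider openai --url "https://api.openai.com/v1/chat/completions" "What is 1+1"
```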
![demo](https://user-images.githubusercontent.com/66430340/233759296-c4cf8cf2-0cab-48aa-9e84-40765b823282.gif)
## Installation ⏬
### Download for GNU/Linux 🐧 or macOS 🍎
The default download location is `/usr/local/bin`, but you can change it in the command to use a different location. Just make sure the chosen location is in your PATH environment variable so the binary can be run from anywhere.
You can download it with the following command:
```bash
curl -sSL https://raw.githubusercontent.com/aandrew-me/tgpt/main/install | bash -s /usr/local/bin
```
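The target directory is simply the argument after `bash -s`, so you can install to a user-writable location instead. A minimal sketch using `~/.local/bin` (the directory choice is an assumption; pick any directory on your PATH):

```bash
# Install into a user-writable directory instead of /usr/local/bin
curl -sSL https://raw.githubusercontent.com/aandrew-me/tgpt/main/install | bash -s ~/.local/bin

# Make sure the directory is on PATH (add this line to your shell profile to persist it)
export PATH="$HOME/.local/bin:$PATH"
```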
If you are using Arch Linux, you can install with pacman:
```bash
pacman -S tgpt
```
### FreeBSD 😈
To install the [port](https://www.freshports.org/www/tgpt):
```
cd /usr/ports/www/tgpt/ && make install clean
```
To install the package, run one of these commands:
```
pkg install www/tgpt
pkg install tgpt
```
### Install with Go
You need to [add the Go install directory to your system's shell path](https://go.dev/doc/tutorial/compile-install).
```bash
go install github.com/aandrew-me/tgpt/v2@latest
```
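The Go install directory is typically `$(go env GOPATH)/bin`; a minimal sketch for putting it on your PATH and verifying the install:

```bash
# Put Go-installed binaries on PATH (add this line to your shell profile to persist it)
export PATH="$PATH:$(go env GOPATH)/bin"

# Verify that tgpt is reachable
tgpt -v
```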
### Windows 🪟
- **Scoop:** Package installation with [Scoop](https://scoop.sh/) can be done using the following command:
```bash
scoop install https://raw.githubusercontent.com/aandrew-me/tgpt/main/tgpt.json
```
- **Chocolatey**
```bash
choco install tgpt
```
## Updating ⬆️
If you installed the program with the installation script, you may update it with
```bash
tgpt -u
```
**It may require admin privileges.**
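For example, if the binary lives in a root-owned directory such as the default `/usr/local/bin`, run the update with elevated privileges:

```bash
sudo tgpt -u
```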
### Proxy
tgpt supports proxies in two ways:
1. Environment variable: set `http_proxy` or `HTTP_PROXY` in one of the following formats:
   - HTTP proxy: `http://ip:port`
   - HTTP proxy with authentication: `http://user:pass@ip:port`
   - SOCKS5 proxy: `socks5://ip:port`
   - SOCKS5 proxy with authentication: `socks5://user:pass@ip:port`
2. Configuration file, looked up in the following order:
   - `./proxy.txt` (in the directory you are executing from)
   - `~/.config/tgpt/proxy.txt`

Example `proxy.txt` content:
```bash
http://127.0.0.1:8080
```
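As a quick sketch of the environment-variable route (the proxy addresses and credentials are placeholders):

```bash
# One-off run through a local HTTP proxy
http_proxy="http://127.0.0.1:8080" tgpt "What is internet?"

# Or export a SOCKS5 proxy with authentication for the whole session
export HTTP_PROXY="socks5://user:pass@127.0.0.1:1080"
tgpt "What is 1+1"
```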
### From Release
You can download the executable for your operating system, rename it to `tgpt` (or any other desired name), and then run it by typing `./tgpt` from that directory. Alternatively, you can place it in a directory that is in your PATH environment variable and run it by simply typing `tgpt`.
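A minimal sketch for GNU/Linux on amd64; the asset name below is an assumption, so check the [releases page](https://github.com/aandrew-me/tgpt/releases/latest) for the exact file for your operating system and architecture:

```bash
# Download the release binary (adjust the asset name to match your platform)
curl -LO https://github.com/aandrew-me/tgpt/releases/latest/download/tgpt-linux-amd64
mv tgpt-linux-amd64 tgpt
chmod +x tgpt

# Run it from the current directory, or move it into a directory on your PATH
./tgpt -v
```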
## Uninstalling
If you installed with the install script, you can run the following command to remove the tgpt executable:
```
sudo rm $(which tgpt)
```
The configuration file is usually located in `~/.config/tgpt` on GNU/Linux systems and in `~/Library/Application Support/tgpt` on macOS.
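If you also want to remove that configuration, a sketch for the paths mentioned above:

```bash
# GNU/Linux
rm -rf ~/.config/tgpt

# macOS
rm -rf ~/Library/Application\ Support/tgpt
```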
", Assign "at most 3 tags" to the expected json: {"id":"4292","tags":[]} "only from the tags list I provide: [{"id":77,"name":"3d"},{"id":89,"name":"agent"},{"id":17,"name":"ai"},{"id":54,"name":"algorithm"},{"id":24,"name":"api"},{"id":44,"name":"authentication"},{"id":3,"name":"aws"},{"id":27,"name":"backend"},{"id":60,"name":"benchmark"},{"id":72,"name":"best-practices"},{"id":39,"name":"bitcoin"},{"id":37,"name":"blockchain"},{"id":1,"name":"blog"},{"id":45,"name":"bundler"},{"id":58,"name":"cache"},{"id":21,"name":"chat"},{"id":49,"name":"cicd"},{"id":4,"name":"cli"},{"id":64,"name":"cloud-native"},{"id":48,"name":"cms"},{"id":61,"name":"compiler"},{"id":68,"name":"containerization"},{"id":92,"name":"crm"},{"id":34,"name":"data"},{"id":47,"name":"database"},{"id":8,"name":"declarative-gui "},{"id":9,"name":"deploy-tool"},{"id":53,"name":"desktop-app"},{"id":6,"name":"dev-exp-lib"},{"id":59,"name":"dev-tool"},{"id":13,"name":"ecommerce"},{"id":26,"name":"editor"},{"id":66,"name":"emulator"},{"id":62,"name":"filesystem"},{"id":80,"name":"finance"},{"id":15,"name":"firmware"},{"id":73,"name":"for-fun"},{"id":2,"name":"framework"},{"id":11,"name":"frontend"},{"id":22,"name":"game"},{"id":81,"name":"game-engine "},{"id":23,"name":"graphql"},{"id":84,"name":"gui"},{"id":91,"name":"http"},{"id":5,"name":"http-client"},{"id":51,"name":"iac"},{"id":30,"name":"ide"},{"id":78,"name":"iot"},{"id":40,"name":"json"},{"id":83,"name":"julian"},{"id":38,"name":"k8s"},{"id":31,"name":"language"},{"id":10,"name":"learning-resource"},{"id":33,"name":"lib"},{"id":41,"name":"linter"},{"id":28,"name":"lms"},{"id":16,"name":"logging"},{"id":76,"name":"low-code"},{"id":90,"name":"message-queue"},{"id":42,"name":"mobile-app"},{"id":18,"name":"monitoring"},{"id":36,"name":"networking"},{"id":7,"name":"node-version"},{"id":55,"name":"nosql"},{"id":57,"name":"observability"},{"id":46,"name":"orm"},{"id":52,"name":"os"},{"id":14,"name":"parser"},{"id":74,"name":"react"},{"id":82,"name":"real-time"},{"id":56,"name":"robot"},{"id":65,"name":"runtime"},{"id":32,"name":"sdk"},{"id":71,"name":"search"},{"id":63,"name":"secrets"},{"id":25,"name":"security"},{"id":85,"name":"server"},{"id":86,"name":"serverless"},{"id":70,"name":"storage"},{"id":75,"name":"system-design"},{"id":79,"name":"terminal"},{"id":29,"name":"testing"},{"id":12,"name":"ui"},{"id":50,"name":"ux"},{"id":88,"name":"video"},{"id":20,"name":"web-app"},{"id":35,"name":"web-server"},{"id":43,"name":"webassembly"},{"id":69,"name":"workflow"},{"id":87,"name":"yaml"}]" returns me the "expected json"