Prompting major LLMs via CLI.
Supported Models
- OpenAI
- MistralAI
- Anthropic
- Cohere
- Nvidia
- Fireworks
- Groq
- Together
Installation
- Local:
  pip install --editable .
- PyPI:
  pip install prompt_llm
prompt_llm --install-completion
source ~/.bashrc
Supported configs:
- --api-key
- --model
- --system
- --temperature
prompt_llm config_add --api-key "<api_key>" --system "<system_message>" --temperature <temperature>
prompt_llm config_save_to --profile "openai"
prompt_llm config_load_from --profile "openai"
prompt_llm config-rm temperature
prompt_llm config-ls
prompt_llm config-ls --profile="openai"
prompt_llm openai "Tell me something funny!"
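To drive the CLI from a script rather than the shell, a small Python wrapper around subprocess might look like the sketch below. The executable name and the `<provider> "<prompt>"` argument order are taken from the example above; the `--model` and `--temperature` flag placement and the plain-text stdout are assumptions, not documented behavior.

```python
import shutil
import subprocess

def build_command(provider, prompt, model=None, temperature=None):
    """Assemble the argv list; flag names assumed from the docs above."""
    cmd = ["prompt_llm", provider, prompt]
    if model is not None:
        cmd += ["--model", model]
    if temperature is not None:
        cmd += ["--temperature", str(temperature)]
    return cmd

def ask(provider, prompt, **kwargs):
    """Run the CLI and return its stdout, or None if it is not on PATH."""
    if shutil.which("prompt_llm") is None:
        return None
    result = subprocess.run(
        build_command(provider, prompt, **kwargs),
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Example (only works once prompt_llm is installed):
# ask("openai", "Tell me something funny!", temperature=0.7)
```

The `shutil.which` guard keeps the wrapper safe to import on machines where the tool is not yet installed.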
Publishing to PyPI
python -m build
python3 -m twine upload --repository pypi dist/* --verbose