Word Loom
A convention for expressing language text and templates for AI language model-related uses, especially prompt templates. The format is based on TOML, and word looms are meant to be kept in resource directories for use with code invoking LLMs.
When working with LLMs, we've found ourselves needing better ways to manage prompts. Traditional code doesn't quite fit—prompts are natural language, not code. But they're also not just static text—they need templating, versioning, metadata and, crucially, internationalization.
Word Loom addresses some gaps that become clear once you start building real LLM applications:
- Separation of concerns: Keep your prompts out of your code, making them easier to iterate, version, and review
- Multilingual by design: LLM prompt engineering isn't just translation—a prompt that works well in English may need significant changes to achieve similar results in Japanese or Spanish. Word Loom lets you keep all language variants together, test them independently, and maintain metadata about their performance
- Template composition: Build complex prompts from reusable pieces, with clear markers for runtime values
- Diff-friendly: TOML's structure makes it easy to track changes in version control
- Compatible with traditional i18n: Works alongside gettext, Babel, and other localization tools, while respecting the unique needs of LLM prompting
```toml
# prompts.toml
lang = 'en'

[system_instruction]
_ = 'You are a helpful assistant that provides concise and accurate answers.'

[greeting_multilang]
_ = 'Hello, how can I help you today?'
_fr = "Bonjour, comment puis-je vous aider aujourd'hui?"
_es = '¡Hola! ¿Cómo puedo ayudarte hoy?'
_de = 'Hallo, wie kann ich Ihnen heute helfen?'
_ja = 'こんにちは、今日はどのようにお手伝いできますか?'

[code_review_prompt]
_ = '''
Review the following code and provide feedback on:

1. Code quality and readability
2. Potential bugs or issues
3. Suggestions for improvement

Code:
{code_snippet}
'''
_m = ['code_snippet']  # Declare template variables
```

An example of using Word Loom with an LLM API. This one uses OpenAI, but Word Loom can work with any integration.
```python
from openai import OpenAI

import wordloom

# Load your prompts
with open('prompts.toml', 'rb') as fp:
    loom = wordloom.load(fp)

client = OpenAI()

# Select language based on user preference
user_lang = 'fr'
greeting = loom['greeting_multilang']
greeting_text = greeting.in_lang(user_lang) or str(greeting)

# Use with OpenAI
response = client.chat.completions.create(
    model='gpt-4',
    messages=[
        {'role': 'system', 'content': greeting_text},
        {'role': 'user', 'content': 'How does an LLM work?'}
    ]
)
```

Install with uv:

```shell
uv pip install wordloom
```

Or without uv:

```shell
pip install wordloom
```

See wordloom_spec.md for the complete specification, including:
- Detailed format description
- Template marker syntax
- Internationalization features
- More usage examples
- Integration patterns
This is an under-considered area in AI prompting. When dealing with multiple languages, prompt engineering requires more than just translation. A prompt carefully tuned for English may perform very differently when naively translated to other languages. Word Loom helps by:
- Keeping all language variants in one place for easy comparison
- Allowing independent tuning of each language version
- Supporting metadata to track prompt performance across languages
- Enabling traditional i18n workflows while respecting LLM-specific needs
Contributions welcome! We're interested in feedback from the community about what works and what doesn't in real-world usage. For help contributing to the code, see CONTRIBUTING.md.
- Code (Python library): Apache 2.0 - See LICENSE
- Specification (wordloom_spec.md): Creative Commons Attribution 4.0 International (CC BY 4.0) - See LICENSE-spec
The specification is under CC BY 4.0 to encourage broad adoption and derivative work while ensuring attribution. We want the format itself to be as open and reusable as possible, allowing anyone to create implementations in any language or adapt the format for their specific needs.
Word Loom is primarily developed by the crew at Oori Data. We offer LLMOps, data pipelines and software engineering services around AI/LLM applications. Word Loom emerged from our work building LLM applications with sophisticated prompt management needs and multilingual imperatives.
Since we started work on Word Loom, some other projects have emerged with a degree of overlap:
- IBM's Prompt Declaration Language - A more comprehensive language for prompt engineering
- PromptL
