Transform your Jira backlog into actionable insights with AI-generated refinement questions and test cases.
Features • Demo • Installation • Usage • Architecture
Product and QA teams spend countless hours in refinement sessions trying to:
- Identify edge cases and missing requirements
- Write comprehensive test cases
- Ensure tickets are implementation-ready
ProRef automates this process by analyzing your Jira tickets and generating intelligent questions and test cases using AI.
- Automatic sync with your Jira backlog via REST API
- Smart JQL builder with project, board, and sprint selectors
- Publish back to Jira: generated content appears as formatted comments
- OpenAI (GPT-4, GPT-3.5)
- Anthropic (Claude 3.5 Sonnet, Haiku)
- Google (Gemini 1.5 Pro, Flash)
AI analyzes each ticket to generate clarifying questions that uncover:
- Edge cases and boundary conditions
- Implicit assumptions
- Missing acceptance criteria
- Integration dependencies
Generates QA-ready test cases in a structured format:
TC-1: User login with valid credentials
PRE: User account exists and is active
STEPS:
1. Navigate to login page
2. Enter valid email and password
3. Click "Sign In"
EXPECTED:
- User is redirected to dashboard
- Welcome message displays user's name
- Embedding-based matching finds related tickets
- Cross-ticket awareness prevents duplicate work
- Smart suggestions based on similarity
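
Embedding-based matching of this kind typically ranks tickets by cosine similarity between their embedding vectors. A minimal sketch (the function names are illustrative, not ProRef's actual API in `app/logic/matching.py`):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_related(query_vec: list[float],
                 ticket_vecs: dict[str, list[float]],
                 top_k: int = 3) -> list[str]:
    """Return ticket keys ranked by similarity to the query embedding."""
    scored = [(key, cosine_similarity(query_vec, vec))
              for key, vec in ticket_vecs.items()]
    scored.sort(key=lambda kv: kv[1], reverse=True)
    return [key for key, _ in scored[:top_k]]
```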
AI-powered ticket quality assessment (1-10 scale):
- Ready (8-10): Well-defined, implementation-ready
- Needs Work (5-7): Minor improvements needed
- Not Ready (1-4): Requires significant refinement
Evaluates: title clarity, description detail, acceptance criteria, edge cases
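
The score-to-status mapping above can be expressed as a small function (an illustrative sketch of the thresholds; `quality_status` is not ProRef's actual API):

```python
def quality_status(score: int) -> str:
    """Map a 1-10 quality score to a refinement status label."""
    if not 1 <= score <= 10:
        raise ValueError("score must be between 1 and 10")
    if score >= 8:
        return "Ready"
    if score >= 5:
        return "Needs Work"
    return "Not Ready"
```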
Context-aware prompts for different industries:
- Healthcare: HIPAA compliance, clinical workflows, EHR integration
- Fintech: Transaction integrity, PCI-DSS, fraud prevention
- E-commerce: Inventory management, payments, promotions
- SaaS: Multi-tenancy, RBAC, API versioning
- Generic: General software development
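
A domain preset of this kind usually boils down to a prompt prefix keyed by industry. A hypothetical sketch of the shape `app/logic/prompts.py` might take (the dictionary keys and wording here are assumptions):

```python
# Assumed preset table; the actual wording lives in app/logic/prompts.py
DOMAIN_PRESETS = {
    "healthcare": "Consider HIPAA compliance, clinical workflows, and EHR integration.",
    "fintech": "Consider transaction integrity, PCI-DSS, and fraud prevention.",
    "ecommerce": "Consider inventory management, payments, and promotions.",
    "saas": "Consider multi-tenancy, RBAC, and API versioning.",
    "generic": "Consider general software development concerns.",
}

def build_prompt(ticket_summary: str, domain: str = "generic") -> str:
    """Prepend the domain-specific context to the base refinement prompt."""
    context = DOMAIN_PRESETS.get(domain, DOMAIN_PRESETS["generic"])
    return f"{context}\n\nGenerate refinement questions for: {ticket_summary}"
```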
Visual progress tracking through the refinement pipeline:
┌──────────┐    ┌──────────┐    ┌──────────┐    ┌──────────┐
│  FETCH   │ →  │  EMBED   │ →  │ GENERATE │ →  │ PUBLISH  │
│  ✓ 21    │    │  ✓ 21    │    │  ⏳ 15    │    │  8/21    │
└──────────┘    └──────────┘    └──────────┘    └──────────┘
Modern dark-themed UI built with Streamlit:
- Dashboard: Workflow progress at a glance
- Tickets: Browse with filters, quality scores, and change indicators
- Generate: Create questions and test cases with domain presets
- Publish: Review and push to Jira
- Reports: Sprint summaries, quality breakdown, export to Excel/Markdown
- Settings: Configure AI providers and Jira connection
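
As a rough illustration of the Markdown side of the Reports export (a hypothetical helper; the real logic lives in `app/logic/exporter.py`, and the ticket field names here are assumptions):

```python
def export_markdown(tickets: list[dict]) -> str:
    """Render tickets as a simple Markdown sprint report."""
    lines = ["# Sprint Report", ""]
    for ticket in tickets:
        # "key", "summary", and "score" are assumed field names
        lines.append(f"## {ticket['key']}: {ticket['summary']}")
        lines.append(f"Quality score: {ticket.get('score', 'n/a')}")
        lines.append("")
    return "\n".join(lines)
```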
$ proref status
ProRef Status
========================================
Tickets:
Total: 21
With questions: 15
With test cases: 12
Publication:
Questions published: 8
Test cases published: 6
Pending: 13

- Python 3.10+
- Jira Cloud account with API access
- OpenAI/Anthropic/Google AI API key
# Clone the repository
git clone https://github.com/yourusername/proref.git
cd proref
# Create virtual environment
python -m venv .venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
# Install dependencies
pip install -e .
# Copy and configure environment
cp .env.example .env
cp data/config.example.json data/config.json
# Edit .env or use the web UI to configure

You can configure ProRef via environment variables or the web UI:
# .env
JIRA_BASE_URL=https://your-org.atlassian.net
JIRA_USER=your-email@example.com
JIRA_API_TOKEN=your-api-token
OPENAI_API_KEY=sk-your-key

Or launch the UI and go to Settings:
proref ui
# Opens http://localhost:8501

| Command | Description |
|---|---|
| `proref fetch` | Import tickets from Jira |
| `proref embed` | Generate embeddings for semantic search |
| `proref questions` | Generate refinement questions |
| `proref testcases` | Generate test cases |
| `proref publish` | Interactively publish to Jira |
| `proref status` | Show processing statistics |
| `proref chat` | Interactive Q&A about tickets |
| `proref ui` | Launch web interface |
# 1. Fetch tickets from Jira
proref fetch
# 2. Generate embeddings for semantic search
proref embed
# 3. Generate questions (with auto-publish)
proref questions --publish
# 4. Generate test cases
proref testcases --publish
# 5. Check status
proref status

proref/
├── app/
│   ├── cli.py                  # Typer CLI application
│   ├── ui.py                   # Streamlit web interface
│   ├── config.py               # Configuration management
│   ├── paths.py                # Path constants
│   │
│   ├── db/
│   │   ├── model.py            # SQLAlchemy models
│   │   ├── save.py             # Data persistence + quality scores
│   │   └── embedding.py        # Vector storage
│   │
│   ├── jira/
│   │   ├── fetcher.py          # Jira API client
│   │   └── publisher.py        # ADF comment formatting
│   │
│   ├── logic/
│   │   ├── embedder.py         # Text embeddings
│   │   ├── matching.py         # Semantic search
│   │   ├── question_generator.py
│   │   ├── test_case_generator.py
│   │   ├── related_tickets.py
│   │   ├── quality_scorer.py   # AI quality assessment
│   │   ├── prompts.py          # Domain presets
│   │   └── exporter.py         # Excel/Markdown export
│   │
│   └── utils/
│       └── retry.py            # Retry decorator
│
├── data/
│   ├── proref.db               # SQLite database
│   └── config.json             # User configuration
│
├── tests/                      # 106 unit tests
└── scripts/                    # Legacy CLI scripts
| Layer | Technology |
|---|---|
| Frontend | Streamlit with custom CSS |
| CLI | Typer + Rich |
| Database | SQLite + SQLAlchemy |
| AI | OpenAI / Anthropic / Google APIs |
| Embeddings | text-embedding-3-small (1536 dims) |
| External API | Jira REST API v3 |
┌─────────┐      ┌─────────────┐      ┌──────────┐
│  Jira   │─────▶│   ProRef    │─────▶│  SQLite  │
│  Cloud  │◀─────│   Engine    │◀─────│    DB    │
└─────────┘      └─────────────┘      └──────────┘
                        │
                        ▼
                 ┌─────────────┐
                 │   AI APIs   │
                 │ (GPT/Claude)│
                 └─────────────┘
# Run all tests
pytest
# Run with coverage
pytest --cov=app --cov-report=html
# Run specific test file
pytest tests/test_generators.py -v

106 tests covering:
- Configuration management
- Database models
- Question/test generation
- Jira integration
- Embedding operations
- Quality scoring
- Domain prompts
- Export functionality
- Retry logic
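
The retry logic exercised by those tests can be sketched as a decorator with exponential backoff (an illustrative sketch of the kind of helper in `app/utils/retry.py`; the parameter names are assumptions):

```python
import time
from functools import wraps

def retry(times: int = 3, delay: float = 1.0, backoff: float = 2.0):
    """Retry a flaky call, doubling the wait after each failure."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(times):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    # Re-raise once the final attempt has failed
                    if attempt == times - 1:
                        raise
                    time.sleep(wait)
                    wait *= backoff
        return wrapper
    return decorator
```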
- Multi-provider AI support
- Structured test case format
- Web UI with modern design
- Jira comment publishing (ADF format)
- Semantic ticket search
- Ticket quality scoring
- Domain-specific prompts
- Export to Excel/Markdown
- Sprint reports
- Change detection
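
Publishing comments uses Jira's Atlassian Document Format (ADF), a JSON document tree. A minimal comment body looks roughly like this (the `adf_comment` helper name is illustrative, not ProRef's actual API):

```python
def adf_comment(paragraph: str) -> dict:
    """Build a minimal ADF payload for a single-paragraph Jira comment."""
    return {
        "body": {
            "type": "doc",       # ADF documents are rooted in a "doc" node
            "version": 1,
            "content": [
                {
                    "type": "paragraph",
                    "content": [{"type": "text", "text": paragraph}],
                },
            ],
        }
    }
```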
- Epic-level documentation generation
- Slack/Teams integration
- PDF export with styling
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ for QA and Product teams