We appreciate your interest in contributing to the PostgreSQL AI Query Extension! This document provides guidelines for contributing to the project.
- Getting Started
- Using the gh-issue Script
- Development Environment
- Code Style and Standards
- Testing
- Submitting Changes
- Releases (Maintainers)
- Reporting Issues
- Feature Requests
- Documentation
- Community
Before you start contributing, make sure you have:
- PostgreSQL 14+ with development headers
- CMake 3.16 or later
- C++20 compatible compiler (GCC 10+, Clang 10+, or MSVC 2019 v16.11+)
- Git for version control
- API key from OpenAI or Anthropic for testing
1. Fork the repository on GitHub

2. Clone your fork:

   ```sh
   git clone https://github.com/benodiwal/pg_ai_query.git
   cd pg_ai_query
   ```

3. Initialize submodules:

   ```sh
   git submodule update --init --recursive
   ```

4. Create a development branch:

   ```sh
   git checkout -b feature/your-feature-name
   ```

5. Set up the build environment:

   ```sh
   mkdir build && cd build
   cmake -DCMAKE_BUILD_TYPE=Debug ..
   make
   ```
We provide a Nushell script (gh-issue) to help analyze and plan solutions for GitHub issues without writing code immediately. This tool is particularly useful for complex issues that require careful analysis.
- Nushell installed on your system
- GitHub CLI (`gh`) installed and authenticated
The script accepts either an issue number or a full GitHub issue URL:
```sh
# Using issue number (for this repository)
./gh-issue 1

# Using issue URL
./gh-issue https://github.com/benodiwal/pg_ai_query/issues/1

# For a different repository (override default)
./gh-issue 123 --repo "owner/repo"
```

The script will:
- Fetch the issue details from GitHub (title, description, comments)
- Generate a comprehensive prompt for analyzing the issue
- Guide you through:
- Reviewing the issue context
- Examining relevant parts of the codebase
- Explaining the problem and root cause
- Creating a detailed implementation plan
The script generates a plan that includes:
- Required code changes (SQL, C, or other relevant languages)
- Impact on PostgreSQL extension functionality
- Database compatibility considerations
- Necessary tests and documentation updates
- Performance and security implications
- PostgreSQL version backwards compatibility
- AI model integration considerations
Note: The script is designed to create a plan only, not to write code. This helps ensure thorough analysis before implementation.
We provide a comprehensive Nix flake that includes all required tools and dependencies. This ensures a consistent, reproducible development environment across all platforms.
Why use Nix?
- Same environment as CI/CD
- All tools pre-installed (CMake, PostgreSQL, Rust, mdbook, etc.)
- Reproducible builds
- No manual dependency installation
- Works on macOS and Linux
Quick Start with Nix:
See NIX.md for complete installation and setup guide.
```sh
# Install Nix (if not already installed)
curl -L https://nixos.org/nix/install | sh

# Enable flakes
mkdir -p ~/.config/nix
echo "experimental-features = nix-command flakes" >> ~/.config/nix/nix.conf

# Enter development environment
nix develop

# Or use direnv for automatic loading (recommended)
# See DIRENV_SETUP.md for details
```

What's included in the Nix environment:
- CMake, Make, Clang/LLVM (C++20)
- PostgreSQL 16 with development headers
- Rust toolchain (cargo, rustc, rust-analyzer)
- Documentation tools (mdbook + plugins)
- Code quality tools (clang-format, markdownlint, yamllint)
- Utilities (ripgrep, fd, jq)
Build with Nix:
```sh
# Enter Nix environment
nix develop

# Build extension
mkdir -p build && cd build
cmake ..
make

# Build documentation
cd docs && mdbook serve

# Run formatting
make format
```

For more details:
- NIX.md - Complete Nix setup and usage guide
- DIRENV_SETUP.md - Automatic environment loading with direnv
If you prefer not to use Nix, you can install dependencies manually:
- C++ Compiler: GCC 10+, Clang 10+, or MSVC 2019 v16.11+
- CMake: Version 3.16 or later
- PostgreSQL: Development headers and libraries
- Git: For version control
The project uses these key dependencies:
- ai-sdk-cpp: For AI provider integration (included as submodule)
- nlohmann/json: For JSON processing (included with ai-sdk-cpp)
- OpenSSL: For secure HTTP communications
- PostgreSQL: Extension development headers
For development, use Debug build:
```sh
cmake -DCMAKE_BUILD_TYPE=Debug ..
```

- Language Standard: C++20
- Header Style: Use `#pragma once` for header guards
- Naming Conventions:
  - Classes: `PascalCase` (e.g., `QueryGenerator`)
  - Functions: `camelCase` (e.g., `generateQuery`)
  - Variables: `snake_case` (e.g., `api_key`)
  - Constants: `SCREAMING_SNAKE_CASE` (e.g., `MAX_RETRIES`)
The project uses clang-format with Chromium style:
With Nix (recommended):
```sh
# Nix environment includes all tools
nix develop

# Format all C++ source files
make format

# Check formatting (without modifying)
make format-check

# Lint other files
markdownlint docs/src/*.md
yamllint .github/workflows/*.yml
```

Without Nix:
```sh
# Install clang-format (version may vary)
# macOS: brew install clang-format
# Ubuntu: apt install clang-format

# Format all source files
make format

# Check formatting
make format-check
```

- Include Order: Always include `postgres.h` first in PostgreSQL extensions
- Error Handling: Use PostgreSQL's `ereport()` for error reporting
- Memory Management: Use PostgreSQL's memory contexts
- Function Signatures: Follow PostgreSQL's function signature conventions
- Header Comments: Include brief descriptions for all public functions
- Inline Comments: Explain complex logic and business rules
- API Documentation: Document all public interfaces
- Configuration: Document all configuration options
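As an illustration of the extension-coding conventions above, here is a hypothetical function (not part of the project; it only compiles against the PostgreSQL server development headers):

```c
/* Hypothetical example following the conventions above; requires the
 * PostgreSQL server development headers to compile. */
#include "postgres.h"   /* always first */
#include "fmgr.h"
#include "utils/builtins.h"

PG_MODULE_MAGIC;  /* defined once per loadable module */

PG_FUNCTION_INFO_V1(pg_ai_example);

Datum
pg_ai_example(PG_FUNCTION_ARGS)
{
    text *input = PG_GETARG_TEXT_PP(0);

    /* Error handling via ereport(), not exceptions or abort() */
    if (VARSIZE_ANY_EXHDR(input) == 0)
        ereport(ERROR,
                (errcode(ERRCODE_INVALID_PARAMETER_VALUE),
                 errmsg("input must not be empty")));

    /* cstring_to_text() allocates from the current memory context,
     * so the result is freed automatically when the context resets. */
    PG_RETURN_TEXT_P(cstring_to_text("ok"));
}
```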
We have two types of tests:
- Unit Tests - C++ tests using GoogleTest (test core logic without PostgreSQL)
- PostgreSQL Tests - SQL tests that verify extension functions inside PostgreSQL
```sh
# Setup and run all unit tests (one command!)
make test-unit

# Run PostgreSQL extension tests
make test-pg

# Run both unit and PostgreSQL tests
make test
```

Unit tests verify the core C++ logic without requiring a PostgreSQL instance.
```sh
# Setup test environment and run unit tests (does everything automatically)
make test-unit

# Or do it step by step:
make test-setup   # Build test executable (only needed once)
make test-unit    # Run tests

# Run a specific test suite
make test-suite SUITE=ConfigManagerTest
make test-suite SUITE=QueryParserTest

# Clean test artifacts
make test-clean
```

Available Test Suites:

- `ConfigManagerTest` - Config file parsing, defaults, providers
- `ProviderSelectorTest` - API key resolution, provider selection
- `ResponseFormatterTest` - JSON/plain text output formatting
- `QueryParserTest` - SQL extraction, response parsing
- `UtilsTest` - File reading, error formatting
Note: These unit tests verify individual components (config parsing, response formatting, etc.) without testing the full extension functionality or AI integration. For testing the actual extension with real queries, see the Basic Usage section in the README.
These tests verify the actual extension functions inside PostgreSQL.
Prerequisites: PostgreSQL running, extension installed.
```sh
# Build and install extension first
make
sudo make install
psql -d your_database -c "CREATE EXTENSION IF NOT EXISTS pg_ai_query;"

# Run PostgreSQL tests (uses PGDATABASE env var, defaults to 'postgres')
make test-pg

# Or specify a database
PGDATABASE=mydb make test-pg
```

```
tests/
├── unit/                 # C++ unit tests
│   ├── test_config.cpp
│   ├── test_provider_selector.cpp
│   ├── test_response_formatter.cpp
│   ├── test_query_parser.cpp
│   └── test_utils.cpp
├── fixtures/             # Test data
│   ├── configs/          # Sample config files
│   └── responses/        # Sample AI responses
├── sql/                  # PostgreSQL tests
│   ├── setup.sql
│   ├── test_extension_functions.sql
│   └── teardown.sql
├── CMakeLists.txt
└── test_helpers.hpp
```
Adding Unit Tests:

- Create/edit a test file in `tests/unit/`
- Include required headers:

  ```cpp
  #include <gtest/gtest.h>
  #include <gmock/gmock.h>
  #include "../test_helpers.hpp"
  ```

- Write tests:

  ```cpp
  TEST(YourTestSuite, TestName) {
    // Arrange, Act, Assert
    EXPECT_EQ(expected, actual);
  }
  ```

- Rebuild:

  ```sh
  make pg_ai_query_tests
  ```
Adding PostgreSQL Tests:
Add to `tests/sql/test_extension_functions.sql`:

```sql
DO $$
BEGIN
    IF some_condition THEN
        RAISE NOTICE 'PASS: Test description';
    ELSE
        RAISE EXCEPTION 'FAIL: Test description';
    END IF;
END $$;
```

- Update documentation if needed

- Create descriptive commit messages:

  ```
  feat: add response formatting configuration

  - Add support for JSON response format
  - Implement configurable explanation and warnings
  - Update documentation for new features
  ```

- Submit pull request:
  - Use a clear, descriptive title
  - Include a detailed description of changes
  - Reference any related issues
  - Include screenshots for UI changes
Use conventional commits format:
```
type(scope): description

[optional body]

[optional footer]
```

Types:

- `feat`: New feature
- `fix`: Bug fix
- `docs`: Documentation changes
- `style`: Code style changes
- `refactor`: Code refactoring
- `test`: Test additions or changes
- `chore`: Build process or auxiliary tool changes
All submissions require review:
- Automated Checks: CI/CD pipeline runs tests and code quality checks
- Peer Review: At least one maintainer reviews the code
- Testing: Changes are tested in development environment
- Documentation: Documentation is updated if necessary
This section is for project maintainers who have permissions to create releases.
Releases are automated via GitHub Actions. There are two ways to trigger a release:
```sh
# Ensure you're on main with latest changes
git checkout main
git pull origin main

# Create and push a version tag
git tag v1.0.0
git push origin v1.0.0
```

- Go to Actions → Release workflow
- Click Run workflow
- Enter the version (e.g., `v1.0.0`)
- Optionally mark as pre-release
- Click Run workflow

Tag formats:

- Stable releases: `v1.0.0`, `v1.2.3`
- Pre-releases: `v1.0.0-beta.1`, `v1.0.0-rc.1`, `v2.0.0-alpha.2`
Pre-releases are automatically detected based on the tag format (the version contains a `-`).
The release workflow builds artifacts for:
| Platform | PostgreSQL Versions |
|---|---|
| Linux (x86_64) | 14, 15, 16, 17, 18 |
| macOS (latest) | 14, 15, 16, 17, 18 |
Each artifact is a `.tar.gz` containing:

- Extension library (`.so` or `.dylib`)
- SQL and control files
- `install.sh` script
- README with quick start instructions
Before creating a release:
- Update CHANGELOG.md with the new version entry
- Verify CI passes on main branch
- Test locally with a clean build
- Review open issues for any blockers
The workflow automatically:
- Creates a GitHub Release with release notes
- Uploads all platform artifacts
- Generates a supported platforms table
- Links to documentation
Release workflow failed?
- Check the Actions tab for error logs
- Ensure the tag format is valid (`v1.0.0`)
- Verify all CI checks pass on the tagged commit
Need to re-release?
```sh
# Delete the tag locally and remotely
git tag -d v1.0.0
git push origin :refs/tags/v1.0.0

# Fix the issue, then re-tag
git tag v1.0.0
git push origin v1.0.0
```

Artifacts missing?
- Check if the build job completed for all platforms
- Review build logs for compilation errors
When reporting bugs, please use the bug report template in .github/ISSUE_TEMPLATE/bug_report.yml. The template will guide you through providing:
- PostgreSQL version and OS
- Extension version
- Configuration file (remove API keys)
- Steps to reproduce
- Expected vs actual behavior
- Error messages and logs
When requesting features, please use the feature request template in .github/ISSUE_TEMPLATE/. The template will guide you through:
- Describing the use case
- Explaining the expected behavior
- Providing examples
- Considering implementation complexity
- Discussing potential alternatives
- Code Documentation: Inline comments and header documentation
- User Documentation: Installation, configuration, and usage guides
- Developer Documentation: Architecture and contribution guidelines
- API Documentation: Function and configuration reference
- User Docs: `docs/` directory (mdBook format)
- Code Docs: Inline comments in source files
- README: Project overview and quick start
- Contributing: This file
With Nix (recommended):
```sh
# Nix environment includes mdbook and all plugins
nix develop

# Build and serve
cd docs
mdbook serve   # Visit http://localhost:3000

# Build static files
mdbook build
```

Without Nix:

```sh
# Install mdBook
cargo install mdbook

# Optional plugins
cargo install mdbook-mermaid
cargo install mdbook-linkcheck

# Build documentation
cd docs && mdbook build

# Serve locally
mdbook serve
```

- GitHub Issues: Bug reports and feature requests
- GitHub Discussions: Questions and community discussion
- Pull Requests: Code review and collaboration
We are committed to providing a welcoming and inclusive environment. Please:
- Be respectful and constructive in discussions
- Focus on what is best for the community
- Show empathy towards other community members
- Be open to feedback and different perspectives
If you need help:
- Documentation: Check the official documentation
- Issues: Search existing issues for similar problems
- Discussions: Start a discussion for questions
- Community: Engage with other contributors