A Rails application for exploring and testing different Large Language Model (LLM) integrations and frameworks. This playground is meant to showcase how to build similar features across different gems.
- RubyLLM Integration - Direct integration with the `ruby_llm` gem
  - Standard chat interface with model selection
  - Asynchronous message processing
  - Tool integration support
- Raix Integration - Example implementation of the `raix` gem
  - Standard chat interface
  - Asynchronous message processing
  - Tool integration support
  - Structured workflow
- Raif Engine - Mounted engine for a convention-over-configuration approach to LLM integration
  - Standard chat interface with model selection
  - Asynchronous message processing
  - Tool integration support
- Basic Auth - Simple Rails-generated authentication
- Real-time Updates - Turbo-powered real-time message updates
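All three integrations advertise tool support. The underlying pattern is the same everywhere: the model returns a tool name plus arguments, and the application dispatches to a registered handler. A simplified plain-Ruby sketch of that dispatch (not any gem's actual API — `ruby_llm`, `raix`, and `raif` each wrap this differently):

```ruby
# Simplified illustration of LLM tool calling: the model's response names a
# tool and supplies arguments; the app looks up and invokes a handler.
# This is NOT the API of ruby_llm, raix, or raif -- just the shared idea.
class ToolRegistry
  def initialize
    @tools = {}
  end

  # Register a named tool with a callable handler.
  def register(name, &handler)
    @tools[name.to_s] = handler
  end

  # Dispatch a tool call as it might come back from an LLM response.
  def call(name, **args)
    handler = @tools.fetch(name.to_s) { raise ArgumentError, "unknown tool: #{name}" }
    handler.call(**args)
  end
end

registry = ToolRegistry.new
registry.register(:weather) { |city:| "Sunny in #{city}" }

# Simulate the model requesting a tool call with parsed arguments.
puts registry.call(:weather, city: "Lisbon") # => "Sunny in Lisbon"
```

Each gem layers its own DSL over this (tool classes, parameter schemas, automatic result round-tripping), but the registry-and-dispatch core is what "tool integration support" boils down to.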
- OpenAI
- Additional providers can be added; see each gem's documentation for details
- Framework: Ruby on Rails 8.0+
- Database: SQLite3
- Frontend: Turbo, Stimulus, Tailwind CSS
- Authentication: Custom authentication with bcrypt
- Background Jobs: Solid Queue
- Caching: Solid Cache
- Real-time: Solid Cable (Action Cable)
- Ruby 3.2+ (required by Rails 8)
- SQLite3
- Clone the repository

  ```bash
  git clone <repository-url>
  cd rails_llm_playground
  ```
- Install dependencies

  ```bash
  bundle install
  npm install
  ```
- Environment Configuration

  ```bash
  cp .env.example .env
  ```

  Configure your environment variables:

  ```bash
  OPENAI_API_KEY=your_openai_api_key_here
  # Add other LLM provider API keys as needed
  ```
- Database Setup

  ```bash
  rails db:create
  rails db:migrate
  rails db:seed
  ```
- Start the Application

  ```bash
  bin/dev
  ```

  The application will be available at `http://localhost:3000`.
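The `.env` file copied during setup is a plain `KEY=value` file. As a sketch of how dotenv-style loaders read it (the app presumably uses a dotenv gem; this is an illustration, not that gem's implementation):

```ruby
# Minimal illustration of parsing a dotenv-style file: one KEY=value pair
# per line, with blank lines and "#" comments skipped. Real dotenv gems
# handle more (quoting, interpolation); this is just the core idea.
def parse_dotenv(text)
  text.each_line.with_object({}) do |line, env|
    line = line.strip
    next if line.empty? || line.start_with?("#") # skip blanks and comments
    key, value = line.split("=", 2)              # split on the FIRST "=" only
    env[key] = value if key && value
  end
end

env = parse_dotenv(<<~ENVFILE)
  # LLM provider keys
  OPENAI_API_KEY=your_openai_api_key_here
ENVFILE
puts env["OPENAI_API_KEY"] # => "your_openai_api_key_here"
```

Splitting on the first `=` only matters in practice: API keys and connection strings frequently contain `=` characters of their own.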
- Register an Account - Create a new user account or sign in
- Explore Models - Visit `/models` to see available LLM providers and models (these are specific to RubyLLM but are helpful for seeing what options exist in the other gems)
- Refresh Models - Use the refresh button to seed the database with new models from the RubyLLM registry
- Select a model from the available providers
- Start conversations with your chosen LLM
- Messages are processed asynchronously
- Supports tool integration
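In the app, asynchronous processing means the controller enqueues a Solid Queue job and a background worker makes the LLM call. The flow can be simulated in plain Ruby with a queue and a worker thread (a sketch of the shape, not the app's actual job code):

```ruby
# Simulated version of the async chat flow: the "controller" enqueues the
# user message, a background "worker" produces the assistant reply.
# In the real app this is a Solid Queue job; here it's a Thread + Queue sketch.
jobs    = Queue.new
replies = Queue.new

worker = Thread.new do
  while (message = jobs.pop)          # nil signals shutdown
    # Stand-in for the actual LLM API call made inside the job.
    replies << "assistant reply to: #{message}"
  end
end

jobs << "Hello, model!"               # controller enqueues and returns at once
puts replies.pop # => "assistant reply to: Hello, model!"

jobs << nil                           # shut the worker down cleanly
worker.join
```

The point of the indirection is the same in both versions: the web request returns immediately, and the reply arrives whenever the slow LLM call finishes.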
- Create structured conversations using RAIX framework
- Choose from different conversation types
- Enhanced conversation management
- Access advanced conversation features through RAIF engine
- Sophisticated conversation flows
- Extended functionality through mounted engine
- View Models - Browse available models organized by provider
- Model Details - View specific model information and capabilities
- Refresh - Dynamically discover new models from providers
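Organizing models by provider, as the `/models` page does, is a straightforward grouping over the model registry. A sketch with made-up model data:

```ruby
# Illustration of grouping a model registry by provider for display,
# as the /models page does. The model records here are invented examples.
models = [
  { id: "gpt-4o",            provider: "openai" },
  { id: "gpt-4o-mini",       provider: "openai" },
  { id: "claude-3-5-sonnet", provider: "anthropic" }
]

by_provider = models.group_by { |m| m[:provider] }
puts by_provider["openai"].map { |m| m[:id] }.inspect
# => ["gpt-4o", "gpt-4o-mini"]
```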
```bash
# Required
OPENAI_API_KEY=your_openai_api_key

# Optional - add as needed for other providers
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_API_KEY=your_google_key
```

The application is designed to be extensible. To add new providers:
- Add the provider gem to your Gemfile
- Configure the provider in your LLM integration
- Update the Model refresh logic if needed
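One way to keep those three steps cheap is a small registry that maps each provider to the env var that enables it, so "configure the provider" is one entry plus its gem setup. This is a hypothetical pattern for illustration, not code from this repository:

```ruby
# Hypothetical provider registry: each provider declares the env var that
# enables it, so adding a provider is one hash entry plus gem configuration.
# Not taken from the playground's codebase.
PROVIDERS = {
  "openai"    => "OPENAI_API_KEY",
  "anthropic" => "ANTHROPIC_API_KEY",
  "google"    => "GOOGLE_API_KEY"
}.freeze

# Returns the providers whose API key is actually set.
def enabled_providers(env = ENV)
  PROVIDERS.select { |_name, key| env[key] && !env[key].empty? }.keys
end

puts enabled_providers({ "OPENAI_API_KEY" => "sk-test" }).inspect
# => ["openai"]
```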
```bash
rails test
rails test:system
```

```bash
bundle exec rubocop
bundle exec brakeman
```

- Foreman - Process management (`bin/dev`)
- Web Console - In-browser debugging
- Debug - Enhanced debugging capabilities
A Dockerfile is included for containerized deployment (but it has not been tested):

```bash
docker build -t rails-llm-playground .
docker run -p 3000:3000 rails-llm-playground
```

The application also includes Kamal configuration for deployment (likewise untested):

```bash
kamal deploy
```

At this time I am not looking for contributions. Feel free to fork or clone and expand this app however you like. If you have any ideas or suggestions, feel free to open an issue!
This playground demonstrates three different approaches to LLM integration in Rails:
- Framework-Agnostic - Fully custom implementation using Raix and defining persistence in callbacks
- Railsy-but-flexible - Dead-simple setup using RubyLLM-generated models and controllers
- Hardcore Rails - Using a mountable engine with Raif
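The first approach's "persistence in callbacks" idea is worth making concrete: the chat object knows nothing about storage, and the caller supplies a hook that persists each message. A simplified sketch of that shape (not Raix's actual API):

```ruby
# Simplified sketch of "persistence in callbacks": the chat session knows
# nothing about storage; the caller injects a hook that persists messages.
# This mirrors the framework-agnostic approach, but is NOT Raix's API.
class ChatSession
  attr_reader :transcript

  def initialize(&on_message)
    @transcript = []
    @on_message = on_message # persistence hook supplied by the caller
  end

  # Records the user message, runs the completion, records the reply.
  def ask(content, &llm)
    record(role: "user", content: content)
    reply = llm.call(content) # stand-in for the real LLM completion call
    record(role: "assistant", content: reply)
    reply
  end

  private

  def record(message)
    @transcript << message
    @on_message&.call(message) # in Rails: e.g. Message.create!(message)
  end
end

saved = []
chat = ChatSession.new { |msg| saved << msg }
chat.ask("Hi") { |_prompt| "Hello!" }
puts saved.length # => 2
```

Because persistence lives in the callback rather than the chat object, the same session logic works against Active Record, an in-memory store, or nothing at all in tests.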
- Chat responses are processed asynchronously using Solid Queue
- Real-time updates delivered via Turbo Streams
- Graceful handling of long-running LLM requests with chunked responses
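Handling a chunked response reduces to accumulating streamed fragments and pushing an incremental update for each one. A sketch of that loop (in the real app the update would be a Turbo Stream broadcast; here it's just a block):

```ruby
# Sketch of handling a chunked LLM response: each chunk is appended to a
# buffer, and each partial state would be broadcast over Turbo Streams.
def stream_response(chunks, &on_update)
  buffer = +"" # mutable string to accumulate chunks
  chunks.each do |chunk|
    buffer << chunk
    on_update.call(buffer.dup) # in Rails: broadcast a Turbo Stream replace
  end
  buffer
end

updates = []
final = stream_response(["Hel", "lo ", "world"]) { |partial| updates << partial }
puts final          # => "Hello world"
puts updates.length # => 3
```

Broadcasting the full partial text (rather than only the delta) keeps the client trivially simple: each update just replaces the message body.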
- API keys stored in environment variables
- User authentication required for all LLM interactions
- Input validation and sanitization
- Rate limiting considerations for production use
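Since every LLM call costs money, the rate-limiting consideration deserves a concrete shape. A minimal sliding-window limiter in plain Ruby (illustrative only; a production Rails app would more likely reach for middleware such as Rack::Attack):

```ruby
# Minimal sliding-window rate limiter, the kind of guard you'd put in front
# of LLM endpoints. Illustrative sketch; production apps would typically use
# middleware like Rack::Attack instead of hand-rolling this.
class RateLimiter
  def initialize(limit:, window:)
    @limit  = limit   # max requests allowed per window
    @window = window  # window length in seconds
    @hits   = Hash.new { |h, k| h[k] = [] }
  end

  # Returns true and records the hit if the key is under its limit.
  def allow?(key, now = Time.now)
    hits = @hits[key]
    hits.reject! { |t| t < now - @window } # drop hits outside the window
    return false if hits.size >= @limit
    hits << now
    true
  end
end

limiter = RateLimiter.new(limit: 2, window: 60)
puts limiter.allow?("user:1") # => true
puts limiter.allow?("user:1") # => true
puts limiter.allow?("user:1") # => false
```

Keying on the authenticated user (which this app requires for all LLM interactions) rather than IP address keeps the limit fair behind shared networks.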
This project is available as open source under the terms of the MIT License.
For questions and support:
- Open an issue on GitHub
- Check the documentation for each integrated gem
- Review the example implementations in the codebase
This project was created as a learning exercise and as a proof of concept for different LLM integration approaches. I found it challenging to answer these questions: "What are the best patterns for integrating an LLM into a Rails application? Are there any production-ready gems available?" I have no intention of maintaining this project or making any changes to it (but maybe I will eventually). Use this as a reference or a jumping off point for your own exploration.