A local CLI tool for tracking a Pakistani stock exchange (PSX) portfolio. Automatically syncs trading confirmations from broker emails, stores everything in a local SQLite database, and provides portfolio analytics with live prices.
- Fetches daily confirmation emails from Gmail (Next Capital broker)
- Extracts transactions from PDF attachments, calculates net price per share
- Stores trades, dividends, and deposits in a local SQLite database
- Live prices scraped from dps.psx.com.pk (cached after market close)
- Portfolio dashboard with P&L, CAGR, XIRR, cash balance, and sector allocation
- Shariah compliance view — debt ratio and dividend purification amounts
- Terminal charts for NLV history and cumulative deposits vs KSE100
- Manual entry for dividends and deposits
- Google Drive sync — push/pull data and config across machines
```
psx-auto-update/
├── src/
│   ├── cli.py                # Thin CLI entry point
│   ├── helpers.py            # Shared utilities (DB, config, Shariah helpers)
│   ├── commands/
│   │   ├── sync.py           # sync command
│   │   ├── fetch.py          # fetch command (prices + indices for cron)
│   │   ├── dashboard.py      # dashboard command + sector allocation
│   │   ├── positions.py      # positions command
│   │   ├── trades.py         # trades + history commands
│   │   ├── dividends.py      # dividends command
│   │   ├── chart.py          # chart group (nlv, deposits)
│   │   ├── add.py            # add group (dividend, deposit)
│   │   ├── import_.py        # import + import-kse commands
│   │   └── drive.py          # push/pull commands (Google Drive sync)
│   ├── database.py           # SQLite wrapper
│   ├── drive_client.py       # Google Drive API client
│   ├── price_fetcher.py      # Live price scraping with caching
│   ├── portfolio.py          # P&L and summary calculations
│   ├── gmail_client.py       # Gmail API integration
│   ├── pdf_parser.py         # Broker PDF parsing
│   ├── state_manager.py      # Last-run timestamp tracking
│   └── models.py             # Data models
├── scripts/
│   └── import_from_sheets.py # One-time migration from Google Sheets
├── config/
│   ├── config.yaml           # API credentials paths, email filters
│   └── sectors.yaml          # Symbol → sector mapping
├── credentials/              # OAuth tokens (gitignored)
├── data/                     # SQLite DB, state file, failed PDFs (gitignored)
├── logs/                     # Rotating logs (gitignored)
├── psx                       # Shell script — run from project root
└── pyproject.toml
```
- Python 3.10+
- uv package manager
- Google Cloud project with Gmail API enabled

Install dependencies:

```
uv sync
```

Set up Google Cloud credentials:

- Go to Google Cloud Console
- Enable the Gmail API and Google Drive API
- Create an OAuth 2.0 Desktop credential
- Download and save to `credentials/gmail_credentials.json`
- Add your email as a test user on the OAuth consent screen
On first run, a browser window will open for authorization. The token is saved automatically.
The Drive sync (push/pull) reuses the same credentials file but stores its token separately at `credentials/drive_token.json`.
Edit `config/config.yaml`:

```yaml
gmail:
  sender_email: broker@example.com
  subject_filter: "Confirmation Note"
  credentials_path: credentials/gmail_credentials.json
  token_path: credentials/gmail_token.json
drive:
  credentials_path: credentials/sheets_credentials.json
  token_path: credentials/drive_token.json
state:
  state_file: data/state.json
  lookback_days: 30
```

Edit `config/sectors.yaml` to map your stock symbols to sectors (used in the dashboard allocation view).
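As a hypothetical sketch of how `state_file` and `lookback_days` might combine (the real `state_manager.py` may differ, and `gmail_after_date` is an invented name): the sync would search Gmail from the stored `last_run` timestamp, falling back to the lookback window when no state exists yet.

```python
import json
from datetime import date, timedelta

def gmail_after_date(state_file: str, lookback_days: int = 30, today=None):
    """Date for a Gmail 'after:' search term: the stored last_run when
    available, otherwise today minus the configured lookback window."""
    today = today or date.today()
    try:
        with open(state_file) as f:
            return date.fromisoformat(json.load(f)["last_run"][:10])
    except (FileNotFoundError, KeyError, ValueError):
        return today - timedelta(days=lookback_days)

print(gmail_after_date("missing.json", 30, today=date(2025, 1, 31)))  # → 2025-01-01
```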
All commands are run via the `./psx` script from the project root.
```
./psx sync           # fetch new broker emails and write trades to DB
./psx sync --dry-run # preview without writing
```

```
./psx dashboard
```

Shows net liquidating value, invested amount, cash balance, total deposits, P&L (absolute, CAGR, XIRR), daily P&L, dividends, and sector allocation with per-sector CAGR and XIRR. Saves a daily portfolio snapshot after market close.
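The XIRR figure shown here is the annualized rate that zeroes the net present value of all dated cash flows (deposits negative, current value positive). A minimal sketch using bisection, assuming a 365-day year — not necessarily the solver this project uses:

```python
from datetime import date

def xirr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
    """Annualized IRR for irregularly dated cash flows, by bisection on NPV."""
    t0 = min(d for d, _ in cashflows)

    def npv(rate):
        return sum(a / (1 + rate) ** ((d - t0).days / 365.0) for d, a in cashflows)

    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo) * npv(mid) <= 0:  # root lies in the lower half
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

flows = [(date(2023, 1, 1), -100_000), (date(2024, 1, 1), 110_000)]
print(round(xirr(flows), 4))  # → 0.1 (a 10% annual return)
```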
```
./psx positions                       # sorted by market value (default)
./psx positions --sort [value|symbol|day|abs|cagr|xirr]
./psx positions --shariah             # add D/A ratio column (AAOIFI compliance)
./psx positions --shariah path/to.pdf # use a specific Shariah screening PDF
```

Shows each open position with average buy price, current live price, market value, day P&L, unrealized P&L, CAGR, and XIRR. The `--shariah` flag adds a debt-to-assets ratio column — green if below 33.33% (AAOIFI compliant), red if above.
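The compliance coloring described above is simple arithmetic against the 33.33% threshold; a sketch (function name is illustrative):

```python
AAOIFI_DEBT_LIMIT = 0.3333  # debt-to-assets threshold used for the screen

def debt_ratio_status(total_debt: float, total_assets: float):
    """Classify a position's debt-to-assets ratio against the AAOIFI limit."""
    ratio = total_debt / total_assets
    return ratio, "compliant" if ratio < AAOIFI_DEBT_LIMIT else "non-compliant"

print(debt_ratio_status(25.0, 100.0))  # → (0.25, 'compliant')
```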
```
./psx trades               # all trades
./psx trades --symbol DGKC # one symbol
./psx trades --mode BUY    # filter by mode
```

```
./psx history DGKC
```

```
./psx dividends                     # detail view — all dividends
./psx dividends --symbol EFERT      # filter by symbol
./psx dividends --summary           # one row per symbol with yield on cost
./psx dividends --summary --shariah # add purification % and amount
```

The `--shariah` flag reads non-compliant income percentages from the KMIALL screening PDF and calculates purification amounts per dividend received. Parsed data is cached in SQLite for 90 days.
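The purification amount reduces to multiplying each after-tax dividend by that symbol's non-compliant income percentage from the screening PDF; a sketch:

```python
def purification(after_tax_dividend: float, non_compliant_pct: float) -> float:
    """Portion of a dividend to give away, given the screening PDF's
    non-compliant income percentage for the symbol."""
    return after_tax_dividend * non_compliant_pct / 100

print(round(purification(12_500, 2.4), 2))  # → 300.0
```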
```
./psx chart nlv              # all-time NLV, invested amount, and total deposits
./psx chart nlv --period 7d  # last 7 days
./psx chart nlv --period 30d # last 30 days
./psx chart nlv --period 6m  # last 6 months
./psx chart nlv --period 1y  # last 1 year
./psx chart deposits         # cumulative deposits overlaid with KSE100 index (extended to today)
```

```
# Dividend: symbol date(YYYY-MM-DD) after-tax-total shares
./psx add dividend DGKC 2025-06-30 12500 1000

# Deposit: amount date [broker]
./psx add deposit 500000 2025-01-15 --broker NextCapital
```

```
./psx import --dry-run # preview what would be imported
./psx import           # write to DB
```

Reads trades and deposits from the Entry tab (columns B–G, J–L) and dividends from the Dividends tab (columns B–G).
```
./psx fetch         # fetch live index values + portfolio prices, save to DB
./psx fetch --quiet # suppress output (suitable for cron)
```

Stores KSE100/KMI30 values in `index_history` and refreshes `price_cache` for all portfolio symbols. Intended to be run every 30 minutes during market hours via cron:

```
*/30 4-10 * * 1-5 cd /path/to/psx-auto-update && ./psx fetch --quiet >> logs/fetch.log 2>&1
```
Sync the database, config, and credentials across machines via Google Drive.
```
./psx push       # upload local files to Drive (overwrites remote)
./psx pull       # download files from Drive to local paths (overwrites local)
./psx pull --yes # skip confirmation prompt
```

Files synced: `data/portfolio.db`, `data/state.json`, `config/config.yaml`, and all credential files. The machine-specific `drive_token.json` is intentionally excluded.
```
./psx import-kse                    # default: ~/Downloads/Karachi 100 Historical Data.csv
./psx import-kse path/to/kse100.csv # custom path
```

Download historical KSE100 data from Investing.com as a CSV, then run this command to import it for use in `chart deposits`.
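The importer's parsing job can be sketched with the stdlib `csv` module. The layout below (quoted `Date` and `Price` columns, US-style dates, comma thousands separators) is an assumption about the Investing.com export, not a documented contract:

```python
import csv
import io

# Inline sample mimicking the assumed Investing.com export format.
sample = '"Date","Price"\n"01/15/2025","115,127.16"\n'

for row in csv.DictReader(io.StringIO(sample)):
    close = float(row["Price"].replace(",", ""))  # strip thousands separators
    print(row["Date"], close)  # → 01/15/2025 115127.16
```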
Prices are fetched live from dps.psx.com.pk in parallel (one request per symbol). After market close (15:30 PKT, Mon–Fri), the last fetched prices are cached in the database and reused until the next market open.
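The caching rule above can be sketched as a pure time check (the 09:00 PKT open is an assumption here; the text only states the 15:30 close):

```python
from datetime import datetime, time

MARKET_OPEN = time(9, 0)     # assumed PKT open
MARKET_CLOSE = time(15, 30)  # PKT close (Mon–Fri)

def use_cached_price(now_pkt: datetime) -> bool:
    """True when the market is closed, so the last fetched price is reused."""
    if now_pkt.weekday() >= 5:  # Saturday/Sunday
        return True
    return not (MARKET_OPEN <= now_pkt.time() <= MARKET_CLOSE)

print(use_cached_price(datetime(2025, 1, 13, 16, 0)))  # Monday 16:00 PKT → True
```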
For full functionality — including the `chart nlv` and `chart deposits` commands — three cron jobs should be configured. Add them all via `crontab -e`:

```
# 1. Fetch live prices + KSE100/KMI30 index values every 30 min during market hours
# Required for: chart deposits (KSE100 overlay), up-to-date price cache
# 04:00–10:30 UTC = 09:00–15:30 PKT (Mon–Fri)
*/30 4-10 * * 1-5 cd /path/to/psx-auto-update && ./psx fetch --quiet >> logs/fetch.log 2>&1

# 2. Save daily portfolio snapshot after market close
# Required for: chart nlv (NLV history over time)
# 11:30 UTC = 16:30 PKT — runs after Friday's extended close
30 11 * * 1-5 cd /path/to/psx-auto-update && ./psx dashboard >> logs/dashboard.log 2>&1

# 3. Sync broker emails after market close
# 11:00 UTC = 16:00 PKT
0 11 * * 1-5 cd /path/to/psx-auto-update && ./psx sync >> logs/cron.log 2>&1
```
Without these cron jobs:

- `chart nlv` will show no data (portfolio snapshots are only saved when `dashboard` runs after market close)
- `chart deposits` KSE100 overlay will be sparse or missing (index values are only recorded when `fetch` or `dashboard` runs)
- Authentication error — delete `credentials/*_token.json` and re-run to re-authorize.
- PDF parsing failures — failed PDFs are saved to `data/failed_pdfs/` for manual review.
- No emails found — check `gmail.sender_email` and `gmail.subject_filter` in `config.yaml`, and verify `data/state.json` has the expected `last_run` timestamp.
- Stale prices — delete the `price_cache` rows in `data/portfolio.db` to force a fresh fetch: `sqlite3 data/portfolio.db "DELETE FROM price_cache;"`
- Shariah PDF not found — place the KMIALL screening PDF in the project root or pass its path explicitly via `--shariah path/to/file.pdf`. Cached data can be cleared with: `sqlite3 data/portfolio.db "DELETE FROM shariah_cache;"`