
Commit 781172d

Fix isExhaustive pattern checking
Kudos to ChatGPT, which explains:

The bug is in the "record" branch of `isExhaustive`. Right now, when the first column contains (only) records (i.e. `numSeen == 0` and `extractRecordPatterns` succeeds), the code does this:

1. Collect all field names seen anywhere in the matrix (`baseRecord`).
2. For **each field name independently**, check exhaustiveness of that one field (`specializeRowByRecordField fieldName`), then wrap the resulting missing patterns back into a single-field record.

That is logically wrong for records: it checks **each field in isolation** (like an OR), but exhaustiveness requires the **cartesian product of fields** (like an AND across fields). So your first example slips through because:

- for field `a`, both `True` and `False` appear somewhere
- for field `b`, both `True` and `False` appear somewhere

…but not all **combinations** appear.

This is already handled correctly by `simplify`: `Can.PVar _` becomes `Anything`. So destructured names like `{ key = lKey }` do *not* introduce additional constraints. That's why the `Just { first = { key = lKey, value = lValue }, rest }` pattern should not force enumeration of all `key/value/rest` combinations; those are `Anything` patterns. The fix below preserves that behavior.

---

Instead of iterating fields one by one with `specializeRowByRecordField`, we must treat a record pattern as a product of its fields: build a consistent "base" field set, specialize the matrix into a vector of field patterns (one per field), and recurse on that vector. You already have the machinery for this in `isUseful`:

- it uses `collectRecordFieldsWithAnyPattern` to get a base map `{fieldName -> Anything}`
- then uses `specializeRowByRecord baseMap` to expand a record into `Map.elems specializedMap`

`isExhaustive` should do the same. No other code changes are required.
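The OR-versus-AND distinction can be sketched with a toy model over two `Bool` fields. The encoding and names below are illustrative only, not the compiler's actual types:

```haskell
import Data.List (nub)

-- The two record patterns from the example, encoded as the concrete
-- (a, b) value pairs the match covers.
coveredRows :: [(Bool, Bool)]
coveredRows = [(False, True), (True, False)]

-- Buggy per-field check (OR semantics): each field, taken alone,
-- covers both Bool values somewhere in the matrix.
perFieldExhaustive :: Bool
perFieldExhaustive =
  length (nub (map fst coveredRows)) == 2
    && length (nub (map snd coveredRows)) == 2
-- => True, even though the match is not exhaustive

-- Correct product check (AND semantics): every combination of field
-- values must be covered by some row.
missingCombos :: [(Bool, Bool)]
missingCombos =
  [ (a, b)
  | a <- [False, True]
  , b <- [False, True]
  , (a, b) `notElem` coveredRows
  ]
-- => [(False,False),(True,True)]
```

The per-field check wrongly reports exhaustive, while the product check surfaces exactly the two missing rows `{ a = False, b = False }` and `{ a = True, b = True }`.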
---

Patterns:

- `{ a = False, b = True }`
- `{ a = True, b = False }`

After specialization by the base record `{ a = _, b = _ }`, the algorithm checks exhaustiveness on the **2-column** matrix `[aPattern, bPattern]` and will find missing rows such as:

- `{ a = False, b = False }`
- `{ a = True, b = True }`

So it correctly reports `Incomplete`.

Inside `Just { first = { key = lKey, value = lValue }, rest }`, all those `lKey/lValue/rest` are `PVar`, so `simplify` turns them into `Anything`. That makes the record subpatterns maximally general, so `Ctor Just [...]` covers all `Just _` cases. Together with `Nothing`, the `Maybe` match is exhaustive, and the record-product recursion won't introduce spurious missing combinations.
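A minimal sketch of the specialize/recover round trip the fix relies on, using a stripped-down `Pattern` type (these definitions are simplified stand-ins, not the compiler's actual ones):

```haskell
import qualified Data.Map as Map

data Pattern
  = Anything
  | Record (Map.Map String Pattern)
  deriving (Eq, Show)

-- Expand a leading record column into one column per base field,
-- filling fields the pattern does not mention with Anything
-- (the role specializeRowByRecord plays in the compiler).
specializeRow :: Map.Map String Pattern -> [Pattern] -> Maybe [Pattern]
specializeRow baseMap row =
  case row of
    Record fields : rest ->
      -- Left-biased union: explicit fields win, baseMap fills the gaps.
      -- Map.elems yields values in the base map's key order.
      Just (Map.elems (Map.union fields baseMap) ++ rest)
    Anything : rest ->
      Just (Map.elems baseMap ++ rest)
    [] ->
      Nothing

-- Fold the first fieldCount columns of a counterexample row back into
-- a single record (the role recoverRecord plays in the fix).
recoverRecord :: Int -> [String] -> [Pattern] -> [Pattern]
recoverRecord fieldCount names patterns =
  let (fieldPats, rest) = splitAt fieldCount patterns
   in Record (Map.fromList (zip names fieldPats)) : rest
```

With base map `{ a = Anything, b = Anything }`, a one-column record row specializes into a two-column row, and any missing two-column row folds back into a record counterexample. Row length goes from `n` to `fieldCount + n - 1`, which is why the recursive call in the fix passes `(fieldCount + n - 1)`.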
1 parent 2e61355 commit 781172d

1 file changed: +16 −38 lines changed


compiler/src/Nitpick/PatternMatches.hs

Lines changed: 16 additions & 38 deletions
@@ -253,21 +253,22 @@ isExhaustive matrix n =
           (:) Anything
             <$> isExhaustive (Maybe.mapMaybe specializeRowByAnything matrix) (n - 1)
         Just baseRecord ->
-          let fieldNames = Map.keys baseRecord
-
-              isAltExhaustive fieldName =
-                map (asRecordPattern fieldName) $
-                  isExhaustive
-                    (Maybe.mapMaybe (specializeRowByRecordField fieldName) matrix)
-                    n
-
-              asRecordPattern fieldName ptn =
-                case ptn of
-                  firstValue : _ ->
-                    [Record $ Map.singleton fieldName firstValue]
-                  _ ->
-                    ptn
-           in concatMap isAltExhaustive fieldNames
+          -- Treat records as a product of fields (cartesian combination),
+          -- not each field independently.
+          let fieldCount = Map.size baseRecord
+              baseFieldsInOrder = Map.keys baseRecord
+
+              -- Rebuild a record from the first `fieldCount` patterns in a counterexample row
+              recoverRecord :: [Pattern] -> [Pattern]
+              recoverRecord patterns =
+                let (fieldPats, rest) = splitAt fieldCount patterns
+                 in Record (Map.fromList (zip baseFieldsInOrder fieldPats)) : rest
+           in
+            map recoverRecord $
+              isExhaustive
+                (Maybe.mapMaybe (specializeRowByRecord baseRecord) matrix)
+                (fieldCount + n - 1)
   else
     let alts@(Can.Union _ altList numAlts _) = snd (Map.findMin ctors)
      in if numSeen < numAlts
@@ -430,29 +431,6 @@ specializeRowByRecord baseMap row =
     [] ->
       error "Compiler error! Empty matrices should not get specialized."
 
--- INVARIANT: (length row == N) ==> (length result == arity + N - 1)
-specializeRowByRecordField :: Name.Name -> [Pattern] -> Maybe [Pattern]
-specializeRowByRecordField fieldName row =
-  case row of
-    Ctor _ _ _ : _ ->
-      Nothing
-    Anything : patterns ->
-      Just (Anything : patterns)
-    Array _ : _ ->
-      Nothing
-    Record namedPatterns : patterns ->
-      case Map.lookup fieldName namedPatterns of
-        Just pattern ->
-          Just (pattern : patterns)
-        Nothing ->
-          Nothing
-    Literal _ : _ ->
-      error $
-        "Compiler bug! After type checking, constructors and literals\
-        \ should never align in pattern match exhaustiveness checks."
-    [] ->
-      error "Compiler error! Empty matrices should not get specialized."
-
 -- INVARIANT: (length row == N) ==> (length result == N-1)
 specializeRowByLiteral :: Literal -> [Pattern] -> Maybe [Pattern]
 specializeRowByLiteral literal row =

0 commit comments
