Comments (28)
In my notes I require `**rest` for a match to succeed, but I agree that mappings are actually different from sequences: they have "structural subtyping" behavior, i.e. passing a dictionary with extra keys somewhere will likely just work, while passing a tuple with extra items somewhere will likely just crash.
I also proposed allowing name patterns in key positions, but that was mostly to enable fixed-length matching. Since that can easily be checked with a guard instead, I also withdraw this idea.
from patma.
If we were to do it, some kind of distributive law should apply, like {X|Y} should be like {X}|{Y}. But fine to move to Rejected/Postponed for now.
from patma.
Also, you could have:
```python
case {"a": 1, "b": 2}:         # match, ignoring extra values
case Exact({"a": 1, "b": 2}):  # don't allow extra values
```
from patma.
Oh, neat! I guess the implementation could be:

```python
from collections.abc import Mapping

class Exact:
    @staticmethod
    def __match__(target):
        # Only mappings participate; anything else fails the match.
        if isinstance(target, Mapping):
            return ExactProxy(target)

class ExactProxy:
    __match_args__ = ["target"]

    def __init__(self, target):
        self.target = target
```
This sure beats the work of implementing support for:

```python
case {"a": 1, "b": 2, **rest} if not rest: ...
```
from patma.
(...and explaining why, if you leave out `**rest`, it ignores extra keys, whereas for lists, if you leave out `*rest`, it insists there's nothing extra.)
from patma.
I think there's enough agreement here to mark it as 'accepted'.
(The minor issue of whether to support `dict(key1=a, key2=b)` is separate. Let's postpone that.)
from patma.
Whoa, I take it back. Ivan's PEP currently doesn't support `**rest`, and it allows `|` in keys.
from patma.
Okay, in the end we opted for `|` in keys and for `**rest`. So marking as "accepted" again.
from patma.
I've begun implementing this, and have come across a couple of questions that the PEP does not explicitly address. The answers seem clear to me, but they should probably be included in the spec:
- Is using `**x` multiple times in the same pattern illegal? I think yes.
- Is using `**x` anywhere other than at the end of the pattern illegal? I think yes.
- Does order matter? I think no.

Finally, it's not clear to me how `|` in key patterns is supposed to work. Does it stop after a successful key match, or does it keep trying keys until it gets a value match as well?
```python
match {"x": False, "y": True}:
    case {"x" | "y": True}:  # Does this match?
        ...
```
from patma.
Maybe skip the OR in keys for now?
Yeah, I think that's a good idea. It's also worth considering omitting them from the proposal, since it seems they have few real-world use cases and could easily be added later.
from patma.
Flagging for revision, since we're dropping `|` for keys.
from patma.
I've come across a couple of questions that I'd like input on. Consider the following example:
```python
from collections import defaultdict

match defaultdict(int):
    case {"XXX": _}:
        ...
```

Does this case match? The current implementation copies the target into a `dict` and pops keys from that. As a result, the above case doesn't match.
But I'm starting to wonder if it should. It would be pretty straightforward to turn the key pops into regular lookups on the target itself. There would just be more bookkeeping involved.
Which leads me to my second question. How should we handle duplicate keys?
```python
xxx = "XXX"
match d:
    case {"XXX": _, "XXX": _}:
        pass
    case {"XXX": _, .xxx: _}:
        pass
```
Currently, neither of these cases matches. That seems fine to me, but perhaps it makes more sense to raise... I'm not sure. Again, we would just need more bookkeeping to catch these.
Thoughts?
from patma.
- I think the first example should only match if the key already exists in the target, as found by `target.__contains__(key)`.
- I think duplicate keys should raise. (Do we allow `{.key: pat}`? Then `{.key1: pat1, .key2: pat2}` should also raise if `key1` and `key2` happen to be equal.)
from patma.
I think I agree.
We do currently allow `.key`, so we will need to keep track of previously-seen keys at runtime.
from patma.
In particular, the PEP still shows `{"route" | "Route": route}` as valid syntax. We decided to defer that.
from patma.
In my code review you pointed out that some cases don't catch duplicate keys. It turns out that we do in fact check for duplicate keys, but many cases short-circuit and fail before we can get that far. Some cases are when:
- The target isn't a mapping:
```pycon
>>> match None:
...     case {"x": _, "x": _}:
...         ...
...
>>>
```
- The target is too small (so it couldn't possibly match):
```pycon
>>> match {}:
...     case {"x": _, "x": _}:
...         ...
...
>>>
```
- An earlier key lookup fails:
```pycon
>>> match {"y": None, "z": None}:
...     case {"x": _, "x": _}:
...         ...
...
>>>
```
It's only when all of these steps succeed that we actually get to the duplicate and raise:
```pycon
>>> match {"x": None, "y": None}:
...     case {"x": _, "x": _}:
...         ...
...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: mapping pattern checks duplicate key ('x')
>>>
```
I think that the first three cases are best caught by linters, since they have no effect on the match at runtime. If a particular match gets far enough that we actually hit the second lookup (like the last example), then sure, we should absolutely raise. But a lot of the design choices in the implementation have favored doing as little extra work as possible when failing a match, so it seems wasteful to check the keys of a pattern that isn't going to match anyway. This could impose a significant runtime penalty on heavily nested cases, even ones with no duplicates!
Further, checking this in the compiler feels wrong. It would only work when literals (not name lookups) conflict. We also don't do this anywhere else: the statement `x = {[]}` will obviously fail at runtime, but that's outside the scope of a compile error.
So I vote we leave the current behavior as-is. It's performant, and still catches the cases that matter.
from patma.
Hm, I still want to push back a little here.
Outside patterns, duplicate dict keys are not errors -- we use "last one wins".
In patterns, we reject duplicate keys when we get to matching. So I think we have decided that mapping patterns with duplicate keys are statically invalid, and for cases where we can detect this just by parsing (i.e. when both keys are literals), making this a compilation error seems right. (Just like we do for duplicate keyword args in calls, which match should also reject statically.)
Obviously we'd still need the runtime check to catch cases like
```python
foo = 'foo'
match x:
    case {'foo': 1, .foo: 2}:
        print("Quantum mechanics is real!")
```
from patma.
Hm. Would the compiler raise a `SyntaxError`? Because if so, this would have different exception types when caught at compile time vs. at run time.
I'm not aware of any cases where the compiler raises anything besides `SyntaxError` and `SystemError` (except one pathological check for name-mangling overflows). And I don't think `SyntaxError` ever occurs at runtime.
from patma.
If the compiler detects it, it should raise `SyntaxError`. When it's detected at runtime, it should raise some other exception of your choice.
But frankly I could also live with only the runtime detection -- it's unlikely that people write the same literal key twice, and it will still be caught when actually matching. (That's why I wrote "push back a little.")
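For the record, the released CPython implementation went with the compile-time check: duplicate literal keys in a mapping pattern are rejected with a `SyntaxError` before any matching happens, which can be demonstrated via `compile()`:

```python
# Duplicate literal keys are a compile-time error in CPython 3.10+.
src = """
match d:
    case {"x": 1, "x": 2}:
        pass
"""
try:
    compile(src, "<example>", "exec")
except SyntaxError as exc:
    print("SyntaxError:", exc.msg)
```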
from patma.
I see in the PEP that only constant value patterns or literals are allowed as keys, but I don't see a justification there or here for why (I can make several guesses, and it seems reasonable). Is there any reason that tuples of these are also disallowed? They should be unchangeable, unnamed, and, at least in principle, possible to check for uniqueness.
As additional feedback on the wording in the PEP: as noted above, it states that constant value patterns or literals can be used. This sends the reader to the section on constant value patterns, which says they are used for comparing against actual constants and Enum values. It may not be obvious to readers that this pattern can also check for the existence of a key that is neither a constant nor an Enum. For instance:
```python
class Foo:
    '''Fixed hash, otherwise non-constant'''
    def __hash__(self):
        return 12

f = Foo()
tup_key = (1, 2)
mapping = {f: 'hello', tup_key: "world"}

match mapping:
    case {.f: first, .tup_key: second}:
        print(first, second)
    case _:
        print('default')
```
from patma.
Yeah, this went through a few revisions and the text is not entirely consistent between sections. We hadn't thought of using (constant) tuples for keys, maybe we could add that in the future (or if there's a groundswell of demand during the review phase :-).
PS. In order to make the example valid, you'd have to add an `__eq__` to class `Foo` that returns true for any two `Foo` instances.
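A sketch of that fix, for concreteness: once every `Foo` compares equal and hashes alike, distinct instances find each other's dict entries:

```python
class Foo:
    """Every Foo is equal to (and hashes like) every other Foo."""
    def __hash__(self):
        return 12
    def __eq__(self, other):
        return isinstance(other, Foo)

# Two distinct instances now hit the same dict entry:
mapping = {Foo(): "hello"}
print(mapping[Foo()])  # hello
```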
from patma.
Oh, good point on equality; I was typing too fast when writing this up. It happened to work in my interpreter, but I guess that's just because I didn't hit any collisions in the hash table.
I personally am a +1 on constant tuples, as I use them all the time. However, I am also a +1 in that it probably is not a reason to delay, as there is a way to get the behavior with no changes.
from patma.
> I guess that's just because I didn't hit any collisions in the hash table.
It's because the `dict` lookup will try an identity check first, and the same object `f` is used in both places here. It wouldn't work between different `Foo` instances:
```pycon
>>> class Foo:
...     '''Fixed hash, otherwise non-constant'''
...     def __hash__(self):
...         return 12
...
>>> f1 = Foo()
>>> f2 = Foo()
>>> tup_key = (1, 2)
>>> mapping = {f1: 'hello', tup_key: "world"}
>>>
>>> match mapping:
...     case {.f2: first, .tup_key: second}:
...         print("f2")
...     case {.f1: first, .tup_key: second}:
...         print("f1")
...
f1
```
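The identity-first shortcut in `dict` lookup shows up even more starkly with NaN, which is not even equal to itself; a minimal demonstration:

```python
nan = float("nan")
d = {nan: "found"}

# Same object: the identity check short-circuits, so equality is never asked.
print(d[nan])             # found

# Different object: identity fails, and nan != nan, so the lookup fails too.
print(float("nan") in d)  # False
```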
from patma.
Yeah, but if you only use one object you don't need to define `__hash__` to make the example work either. Anyway, the intention here is to show that "constant pattern" is a misnomer, and I must agree. Maybe "value pattern"?
from patma.
Yeah, I must admit “constant” never really sat right with me. “Value” captures the meaning well.
from patma.
Yeah, the hash was not really necessary to convey the idea of "constness"; sorry for the extra noise. I was playing in interactive mode and just copied what I had. (It wasn't even working the way I thought anyway, as you two kindly pointed out: it was falling back to identity.) I will try to be clearer and slow down a bit in future posts. "Value pattern" does seem an easier term to teach. +1
from patma.
See #40 (comment) about the terminology.
from patma.