digicert / pkilint
A framework for verifying PKI structures
License: MIT License
It would be nice to have a reference table which summarizes the information about the issues discovered by pkilint, something similar to this page for pylint.
I found some CSV files in the repository which provide these details, but I am not sure whether this information is comprehensive, and I think it should reside in a more prominent place.
Now that SMC-03 has passed, the required format for state/provinces in organizationIdentifier values has been relaxed. Currently, the linter requires exactly 2 letters; the ballot relaxes this to 1-3 alphanumeric characters. This should be relaxed to match the updated language.
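A minimal sketch of the relaxed check, assuming a regex-based validation; the patterns below are illustrative and not pkilint's actual code:

```python
import re

# Hypothetical patterns illustrating the change, not pkilint's actual implementation.
OLD_STATE_PROVINCE_RE = re.compile(r'[A-Z]{2}')          # exactly 2 letters (pre-SMC-03)
NEW_STATE_PROVINCE_RE = re.compile(r'[A-Za-z0-9]{1,3}')  # 1-3 alphanumeric (post-SMC-03)

def is_valid_state_province(value: str) -> bool:
    # fullmatch anchors the pattern to the entire value
    return NEW_STATE_PROVINCE_RE.fullmatch(value) is not None
```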
QcCompliance and QcSSCD statements are mapped to None, which means that the required absence of the statementInfo field is not checked. These should instead be mapped to the sentinel value ValueDecoder.VALUE_NODE_ABSENT to indicate to the decoding validator that the statementInfo field must be absent.
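A toy illustration of the proposed mapping; apart from the VALUE_NODE_ABSENT name, the classes and helpers here are invented for illustration and differ from pkilint's actual API:

```python
# Toy model of a decoder type map; only VALUE_NODE_ABSENT comes from pkilint.
class ValueDecoder:
    VALUE_NODE_ABSENT = object()  # sentinel: the value field must be absent

# Statement OID -> expected statementInfo handling. Mapping to None (the
# current behavior) means "don't care"; the sentinel enforces absence.
QC_STATEMENT_MAP = {
    '0.4.0.1862.1.1': ValueDecoder.VALUE_NODE_ABSENT,  # id-etsi-qcs-QcCompliance
    '0.4.0.1862.1.4': ValueDecoder.VALUE_NODE_ABSENT,  # id-etsi-qcs-QcSSCD
}

def check_statement(oid: str, statement_info_present: bool) -> str:
    expected = QC_STATEMENT_MAP.get(oid)
    if expected is ValueDecoder.VALUE_NODE_ABSENT and statement_info_present:
        return 'error: statementInfo must be absent'
    return 'ok'
```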
Currently, KeyUsagePresenceValidator raises a WARNING-level "pkix.ee_certificate_no_ku_extension" finding if the KU extension is not present in end-entity certificates. This warning is not grounded in any text in RFC 5280 and is likely an overeager reading of one of the CABF documents.
This validation should be removed from KeyUsagePresenceValidator.
Not sure if intentional or not, but the SMBRs have a SHOULD NOT-level prohibition on making the CRLDP extension critical, whereas it's a MUST NOT for TLS and code signing.
We should raise this in the SMCWG and see whether this deviation from the other BR docs is intentional and make the appropriate changes here (if needed).
The CABF internal domain name validators report "base.unhandled_exception" when the URI or email address are malformed (missing scheme for URIs, no "@" in email addresses, etc.).
Switch to using urlparse and convert NoneType scheme values to the empty string.
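A sketch of the suggested approach; the helper names here are hypothetical:

```python
from urllib.parse import urlparse

def uri_scheme(uri: str) -> str:
    # urlparse does not raise on a missing scheme; normalizing a falsy scheme
    # to '' lets the caller report a finding instead of an unhandled exception.
    return urlparse(uri).scheme or ''

def email_domain(address: str):
    # A missing "@" yields None rather than raising, so the caller can emit
    # a malformed-address finding instead of "base.unhandled_exception".
    _, sep, domain = address.rpartition('@')
    return domain if sep else None
```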
Andrew Ayer pointed out on the CCADB mailing list that RFC 6962 requires that at least one SCT MUST be present in the list.
While SCT lists are currently being parsed to check for correctness in terms of the encoding, it was planned to add such semantic checks later when more comprehensive CT policy checks are added. However, it appears that such restrictions are not checked by any publicly available linter. Thus, it would be valuable to the community to add at least minimal checking.
To add:
The Cryptography library is currently being used to parse certificates when determining the certificate type for S/MIME and serverauth certificates. This is problematic, for three reasons:
The Pyasn1 and pkilint APIs should be used instead of the Cryptography library when determining certificate types.
If a validator reports multiple findings, then duplicate instances of these findings will be output in the JSON report: for a validator that reports N findings, each finding is written N times. In other words, the validator's full set of findings is repeated once per finding in the JSON report.
Currently, the Docker entrypoint script (entrypoint.py) calls subprocess.run to execute commands and then returns the child process's exit code as the container main process's exit code. This is fine for short-lived programs that do not handle signals, but potentially problematic for long-running processes (such as gunicorn) that have their own signal processing to ensure orderly shutdown (closing log files, etc.).
To ensure that invoked processes have a chance to perform orderly shutdowns in response to SIGTERM, etc., the call to subprocess.run should be replaced with os.execvp. The entrypoint script doesn't need to continue running, so there should be no ill effect with this change.
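A minimal sketch contrasting the two approaches (not the actual entrypoint script):

```python
import os
import subprocess
import sys

def run_child(argv):
    # Current approach: the entrypoint stays alive as the container's main
    # process and forwards only the exit code, not signals such as SIGTERM.
    return subprocess.run(argv).returncode

def exec_child(argv):
    # Proposed approach: the child image replaces the entrypoint process, so
    # it becomes the container's main process and receives SIGTERM directly.
    os.execvp(argv[0], argv)  # does not return on success
```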
To reduce the amount of noise/verbose output that may not be helpful, only unhandled exceptions should be logged.
Hi, thank you for your cool tool.
SMIME BR legacy profiles permit other attribute types such as domainComponent in subject distinguished name.
https://github.com/cabforum/smime/blob/main/SBR.md#71425-subject-dn-attributes-for-sponsor-validated-profile
However, pkilint raises a parse error for such a certificate:
% lint_cabf_smime_cert lint -t SPONSORED-LEGACY sponsored-validated_legacy_dconly.json
NameDecodingValidator @ certificate.tbsCertificate.subject.rdnSequence.2.0
itu.invalid_asn1_syntax (FATAL): ASN.1 decoding failure occurred at "certificate.tbsCertificate.subject.rdnSequence.2.0.value" with schema "DomainComponent", OID 0.9.2342.19200300.100.1.25: <TagSet object, tags 0:0:19> not in asn1Spec: <DomainComponent schema object, tagSet <TagSet object, tags 0:0:22>, encoding us-ascii>
NameDecodingValidator @ certificate.tbsCertificate.subject.rdnSequence.3.0
itu.invalid_asn1_syntax (FATAL): ASN.1 decoding failure occurred at "certificate.tbsCertificate.subject.rdnSequence.3.0.value" with schema "DomainComponent", OID 0.9.2342.19200300.100.1.25: <TagSet object, tags 0:0:19> not in asn1Spec: <DomainComponent schema object, tagSet <TagSet object, tags 0:0:22>, encoding us-ascii>
Tested certificate is here:
-----BEGIN CERTIFICATE-----
MIIHEzCCBPugAwIBAgIUBKdDVm4KxQ4u7Pu9Oe1mGCGySUowDQYJKoZIhvcNAQEL
BQAwSDELMAkGA1UEBhMCVVMxHzAdBgNVBAoMFkZvbyBJbmR1c3RyaWVzIExpbWl0
ZWQxGDAWBgNVBAMMD0ludGVybWVkaWF0ZSBDQTAeFw0yMzA0MDEwMDAwMDBaFw0y
NjA2MjgyMzU5NTlaMIHXMSMwIQYDVQRhExpMRUlYRy1BRVlFMDBFS1hFU1ZaVVVF
QlA2NzEeMBwGA1UEChMVQWNtZSBJbmR1c3RyaWVzLCBMdGQuMRMwEQYKCZImiZPy
LGQBGRMDY29tMRcwFQYKCZImiZPyLGQBGRMHZXhhbXBsZTEPMA0GA1UEBAwGWWFt
YWRhMQ8wDQYDVQQqDAZIYW5ha28xFjAUBgNVBAMMDVlBTUFEQSBIYW5ha28xKDAm
BgkqhkiG9w0BCQEWGWhhbmFrby55YW1hZGFAZXhhbXBsZS5jb20wggEiMA0GCSqG
SIb3DQEBAQUAA4IBDwAwggEKAoIBAQCw+egZQ6eumJKq3hfKfED4dE/tL4FI5sjq
ont9ABVI+1GSqyi1bFBgsRjM0THllIdMbKmJtWwnKW8J+5OgNN8y6Xxv8JmM/Y5v
Qt2lis0fqXmG8UTz0VTWdlAXXmhUs6lSADvAaIe4RVrCsZ97L3ZQTryY7JRVcbB4
khUN3Gp0yg+801SXzoFTTa+UGIRLE66jH51aa5VXu99hnv1OiH8tQrjdi8mH6uG/
icq4XuIeNWMF32wHqIOOPvQcWV3M5D2vxJEj702Ku6k9OQXkAo17qRSEonWW4HtL
btmS8He1JNPc/n3dVUm+fM6NoDXPoLP7j55G9zKyqGtGAWXAj1MTAgMBAAGjggJj
MIICXzAMBgNVHRMBAf8EAjAAMA4GA1UdDwEB/wQEAwIHgDAfBgNVHSMEGDAWgBTW
RAAyfKgN/6xPa2buta6bLMU4VDAdBgNVHQ4EFgQUiRlZXg7xafXLvUfhNPzimMxp
MJEwFAYDVR0gBA0wCzAJBgdngQwBBQMBMD0GA1UdHwQ2MDQwMqAwoC6GLGh0dHA6
Ly9jcmwuY2EuZXhhbXBsZS5jb20vaXNzdWluZ19jYV9jcmwuY3JsMEsGCCsGAQUF
BwEBBD8wPTA7BggrBgEFBQcwAoYvaHR0cDovL3JlcG9zaXRvcnkuY2EuZXhhbXBs
ZS5jb20vaXNzdWluZ19jYS5kZXIwHQYDVR0lBBYwFAYIKwYBBQUHAwQGCCsGAQUF
BwMCMIIBAwYDVR0RBIH7MIH4gRloYW5ha28ueWFtYWRhQGV4YW1wbGUuY29toCkG
CisGAQQBgjcUAgOgGwwZaGFuYWtvLnlhbWFkYUBleGFtcGxlLmNvbaAmBggrBgEF
BQcICaAaDBjlsbHnlLDoirHlrZBAZXhhbXBsZS5jb22kgYcwgYQxIzAhBgNVBGET
GkxFSVhHLUFFWUUwMEVLWEVTVlpVVUVCUDY3MSQwIgYDVQQKDBvjgqLjgq/jg5/l
t6Xmpa3moKrlvI/kvJrnpL4xDzANBgNVBAQMBuWxseeUsDEPMA0GA1UEKgwG6Iqx
5a2QMRUwEwYDVQQDDAzlsbHnlLDoirHlrZAwIwYJKwYBBAGDmCoBBBYTFEFFWUUw
MEVLWEVTVlpVVUVCUDY3MBIGCSsGAQQBg5gqAgQFEwNDRU8wDQYJKoZIhvcNAQEL
BQADggIBAEoUeOBUyS8TZnupriBTl6+fQtIbaGtgzZJ3E6pxQu5shdmw3pv751hW
IjAZIQhYkOV1Ymq5L51/jXyWk2S353AnMFL2rAxirRYbzZ4YygnxWVoy7Mi7DdfJ
mqtb/nIVz1JsjtC2y9YJCR0FP2ZC1bw5j/RFNlC30DcRrn6Pz9oTxBeD4Fa5uMjs
pQH5GXKSO4v0nqjuE95gte6gvre/mOIrM//uZ+DfNAPczgVe0PCGPLseycvsC8N/
oEFe6yeiXIVl1j2OwwUIchxt3yGVdugt/ocpBxO3SCaLhl6pmIACpj2oEtJPhncs
xy385ZFYRXkWBliW12goFFBuSZqj/XmL+PGRXZw6U8nFt3puZjzc4D2pKFZxk2+T
KDE75UgfupXZAbo1wDCpBu5Ghs5tAWsJnQEeZ6Dg71hzFT4BLEqXMOvpT89aCe52
QByUrCp1gAFsdTPt0MBk01WMMaCMygC71kue/v/g2muDQriU/+H8HjjB6ziRS777
KUEkbO0HK0Ko4Go97oJJov3dEhh+ToZmpOWwf3r5jJUpOoriYkEkMzAmG0vgmzDG
oKbmlfasezOfvTuInFWwBLYFDg9j6mWDhVbNmwWE405NNJ20UVjb/7k+a7rW3ck0
c0yTPWAgMNFpk/sHwalbtQG8/1hD/c/6UXnF4+aKKtqgD9P8P6HP
-----END CERTIFICATE-----
I'm afraid other attribute types not in the list will also raise errors.
Currently, the REST API does not gracefully handle the case where the certificate type determination logic throws an exception due to bad ASN.1 encoding: a HTTP 500 response is returned.
ASN.1 decoding errors should be caught and returned as HTTP 422 errors instead.
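A framework-neutral sketch of the proposed handling; decode_certificate is a hypothetical stand-in for the real decoding logic:

```python
def decode_certificate(pem: str):
    # Hypothetical stand-in for the real decoder; raises ValueError on bad input.
    if 'BEGIN CERTIFICATE' not in pem:
        raise ValueError('invalid ASN.1 encoding')
    return object()

def lint_endpoint(pem: str):
    # Catch decoding errors and map them to HTTP 422 instead of letting the
    # exception propagate as an unhandled error (HTTP 500).
    try:
        decode_certificate(pem)
    except ValueError as e:
        return 422, {'detail': str(e)}
    return 200, {'results': []}
```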
I'm adding pkilint to our integration test environment, to show that the signed artifacts produced during testing lint cleanly. We already use docker-compose, so simply adding the pkilint docker container to our setup and making http requests to it seemed like the easiest solution.
However, the REST API does not appear to have endpoints for linting CRLs, which means we need to install pkilint directly in our primary container to accomplish that task. It would be great for the CLI and API to have the exact same set of capabilities.
Similar to zlint's list of users, users who agree to be listed should be added to a section of the README file.
The validation-level heuristic logic does not correctly compare CN and O attribute values, which leads to incorrectly detecting sponsored-level certificates in some cases.
Modify and extend serverauth lints to cover all SC-62 profiles.
OrganizationIdentifierAttributeValidator incorrectly requires registration references for GOV and INT identifiers. For example, an identifier of "GOVUS+TX" is incorrectly flagged as an error, whereas "GOVUS+TX-123" is not.
OrganizationIdentifierAttributeValidator needs to prohibit the inclusion of a registration reference for GOV and INT scheme identifiers.
At "Usage" section in README, the command name "lint_cabf_servercert_cert" is a typo of "lint_cabf_serverauth_cert" ?
Currently, there is no enforcement that a validator includes all findings that can be reported in its validations property. A unit test should be added to check this, and any missing validations should be added to the appropriate lists.
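The proposed unit test could be sketched against a toy validator model; the Validator class below is illustrative and differs from pkilint's actual API:

```python
# Toy validator whose declared `validations` property is incomplete.
class Validator:
    validations = ('a.finding', 'b.finding')

    def validate(self, node):
        return ['a.finding', 'c.finding']  # 'c.finding' is not declared

def undeclared_findings(validator, reported):
    # Any reported finding code missing from the declared `validations`
    # property indicates an incomplete list that the unit test should flag.
    return sorted(set(reported) - set(validator.validations))
```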
In order to make finding messages more readable, all string values that are sourced from "documents under lint" should be surrounded by double quotes.
Currently, lint_cabf_smime_cert's -d option throws a ValueError if the certificate type can't be determined. To clean up the output, the stack trace should be suppressed and only the error message "Could not determine validation level and generation" be output to stderr.
The tool should return with exit code 1 in this case.
When we used pkilint to test our new profile for an SMIME certificate with the ORGANIZATION and STRICT profile type, we got this error:
cabf.smime.common_name_value_unknown_source (ERROR): Unknown CN value source: "Disig, a.s."
Our test file is attached.
Can you explain to us what this error means?
Regards
Peter Miskovic
Creating this issue for tracking. There's a discussion on https://bugzilla.mozilla.org/show_bug.cgi?id=1872374 whether it is compliant to use "EL" as the country code for the VAT registration scheme.
If the outcome of the discussion is that "EL" is allowed, then the TLS and SMIME orgId validators need to be modified to explicitly allow "EL" as an exception to ISO 3166-1.
Changes to the profile:
It was agreed on the August 30th call of the SMCWG that country codes included in the OrgID attribute must match those present in the countryName attribute.
Add a validator that enforces this requirement.
OrganizationIdentifierCountryNameConsistentValidator currently does a case-sensitive country code comparison of the countryName and orgId country values. Other validators use a case-insensitive comparison of ISO 3166-1 country codes, as do other open-source linters.
SMBR Appendix A says:
The country code used in the Registration Scheme identifier SHALL match that of the subject:countryName in the Certificate as specified in Section 7.1.4.2.2.
While it does not specify how strict matching is to be done in this particular case, in all other ISO 3166 country code comparisons, it is case-insensitive. We should follow that convention here as well.
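A minimal sketch of the case-insensitive comparison, assuming plain string attribute values:

```python
def country_codes_match(org_id_country: str, country_name: str) -> bool:
    # Compare ISO 3166-1 alpha-2 codes case-insensitively, matching the
    # convention used elsewhere for country-code comparisons.
    return org_id_country.strip().upper() == country_name.strip().upper()
```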
We should publish Docker images for each release alongside the Python packages. The Docker image should have the following functionality:
Target platforms:
To make it easier to embed pkilint in other projects, a REST API should be created.
Must haves:
The API design should be extensible such that no API contract changes are needed to add later support for CRL linting.
Currently, the CRLDP DistributionPoint validator only flags an error if the SEQUENCE is empty. This is not sufficient, as the incorrect case of a "reasons"-only DistributionPoint MUST also be flagged as incorrect.
Now that Python 3.12 has been released with several improvements, the Docker image should be updated to be based on Python 3.12.
The OpenAPI schema for the REST API specifies that the response value when HTTP 422 errors are returned is a list of ValidationErrors.
In the case of the "determine certificate type and lint" endpoints for TLS BR and SMIME BR certificates, there are a few instances where this contract is not implemented correctly and a scalar string response value in the body is returned instead.
These instances should be corrected to consistently return a list of ValidationErrors.
Now that it appears at least one publicly trusted CA will be using one of the methods described in RFC 7093 for SKI calculation [1], SubjectKeyIdentifierValidator should be updated to detect the usage of these methods.
[1] https://groups.google.com/a/mozilla.org/g/dev-security-policy/c/L7XoAXt_s1c/m/Hil4RxrYAAAJ
NOT A CONTRIBUTION
The CABF requirements for SMIME states:
"The Certificate MAY also contain additional policy identifier(s) defined by the Issuing CA. The Issuing CA SHALL document in its CP and/or CPS that the Certificates it issues containing the specified policy identifier(s) are managed in accordance with these Requirements."
We would like to see pkilint enhanced to also check additional policy identifiers. Specifically, the ability to configure the tool with one or more of the following inputs:
[--additional-required-policy-id POLICY_OID]
[--additional-allowed-policy-id POLICY_OID]
When provided, the tool would ensure that all additional required policy OIDs were present in the Certificate Policies extension, and that any remaining policy OIDs found are allowed.
The CABF requirements for TLS contains similar statements/requirements around policy identifiers, so it would be ideal if similar capability could be added there as well.
Note: Longer term, instead of (or perhaps in addition to) specifying these inputs via the command line, you might consider an input configuration file (perhaps in YAML) that could contain these values (and more).
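A sketch of how the requested options might look with argparse; the option names follow the issue text, and the check_policies helper is hypothetical:

```python
import argparse

# Hypothetical CLI options illustrating the request; not part of pkilint today.
parser = argparse.ArgumentParser(prog='lint_cabf_smime_cert')
parser.add_argument('--additional-required-policy-id', action='append',
                    default=[], metavar='POLICY_OID')
parser.add_argument('--additional-allowed-policy-id', action='append',
                    default=[], metavar='POLICY_OID')

def check_policies(cert_policy_oids, required, allowed, reserved):
    # All required OIDs must be present; any other OID must be either a
    # reserved CABF OID or explicitly allowed.
    missing = set(required) - set(cert_policy_oids)
    unexpected = set(cert_policy_oids) - set(required) - set(allowed) - set(reserved)
    return sorted(missing), sorted(unexpected)
```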
Some ASN.1 definitions for "open type maps" (ANY, etc.) require that the value be absent. For example, an AlgorithmIdentifier with an OID of "SHA256withECDSA" must have an absent parameters field.
Add the ability to express this absence in decoder type maps.
SC-72 removed the requirement for the cPSUri policyQualifier to be present in EV Subscriber and externally operated Subordinate CA certificates.
The absence of the cPSUri should no longer be flagged for these certificate types.
The Docker entrypoint script currently has a list of available CLI linter scripts, which is duplicative of the contents of the "bin" package. In order to keep the list of available linter scripts DRY, this list should be removed from the entrypoint script.
Instead, the entrypoint script should attempt to import the named module and execute the "main" function in that module. If this fails, then the supplied arguments should be used to call os.execvp (for execution of other applications, such as uvicorn).
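A sketch of the proposed import-first dispatch; the exact module path is an assumption:

```python
import importlib
import os
import sys

def dispatch(argv):
    # Try to treat argv[0] as a pkilint CLI module first; fall back to
    # exec'ing it as an external program (e.g. uvicorn). The
    # "pkilint.bin.<name>" module path is an assumption for illustration.
    try:
        module = importlib.import_module(f'pkilint.bin.{argv[0]}')
    except ImportError:
        os.execvp(argv[0], argv)  # does not return on success
    else:
        sys.argv = argv
        sys.exit(module.main())
```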
Python 3.12 is slated for release on October 2nd. Steps to prepare:
The POST endpoint for each individual linter is used to lint a certificate; there is currently no corresponding GET endpoint. It would be useful to expose the set of validations/possible findings that each linter can return, so a GET endpoint should be added to return this information.
CRLs such as the following:
-----BEGIN X509 CRL-----
MIIBzzCBuAIBATANBgkqhkiG9w0BAQsFADAiMQswCQYDVQQGEwJYWDETMBEGA1UE
CgwKQ1JMcyAnciBVcxcNMjQwMzI1MTg0NzAwWhcNMjQwNDAxMTg0NzAwWjAAoGAw
XjAKBgNVHRQEAwIBATAfBgNVHSMEGDAWgBT80TS3y6SVsbZZ6gsFYh7omo+0OjAv
BgNVHRwBAf8EJTAjoB6gHIYaaHR0cDovL2Zvby5leGFtcGxlL2NybC5kbGyEAf8w
DQYJKoZIhvcNAQELBQADggEBAA3ygNK9ayDcm9QngyRe9zNoVvvZUfQn9Uvk9XZw
FKYx9pUEqdSw2vUhzFPTooXyA2sNOkqon19607b3SzBl9w+32wRVQFVBSl2VBze3
JrWNhaYWQDe2SmofmsEJBrcIGbUKDxxsUG3BaBavvQ2xW98Pp62MAQ/1l74xJecZ
XSnD0R45XTsedIP88ZDCoKJpKfkN0gNWgfx4VIaaA0vD5vhLBSObPw+qxnC8fEm+
V5XUPxyPvN1HjLCBEyMR8lnwGn5N27rga/R2gNLSIQkiyEsdiRxbWiSmr+0+byQo
BsOt6lqt481DCFJxVwOJIvvd4QEg+iSAbdCGiGK5w/kx4XU=
-----END X509 CRL-----
result in a DER decoding error when loaded. This is due to the explicitly encoded empty SEQUENCE for the "revokedCertificates" field. The ASN.1 module in RFC 5912 adds a size constraint requiring one or more elements, and RFC 5280 section 5.1.2.6 requires that this field be omitted when empty; however, the document complies with the RFC 5280 ASN.1 module and thus should not result in a decoding error.
The decoding error is due to the pyasn1 encoder omitting the field when encoding an empty revokedCertificates field and thus the loaded document and its pyasn1-generated DER encoding are not binary equal. The re-encoding step is required to catch encoding errors that the pyasn1 DER decoder currently tolerates. This has two downsides:
This is an inherent limitation of the current decoding logic; moving to cbonnell/pyasn1-fasder to provide the DER decoding function will obviate the need for the re-encode step. Once migrated, a new validator to flag empty revokedCertificates SEQUENCES can be added to the PKIX layer to explicitly catch this case.
If a PrintableString is implicitly or explicitly tagged, then PrintableStringConstraintValidator does not flag any invalid characters within that PrintableString.
For example, the implicitly tagged "registrationStateOrProvince" field of the CABFOrgId extension is not properly checked for containing solely characters within the PrintableString repertoire.
Currently, the finding with code "cabf.smime.email_address_in_attribute_not_in_san" is a WARNING. However, SMBR 7.1.4.2.1 says:
All Mailbox Addresses in the subject field or entries of type dirName of this extension SHALL be repeated as rfc822Name or otherName values of type id-on-SmtpUTF8Mailbox in this extension.
Given this language, the severity of this finding should be changed to ERROR-level.
SaneValidityPeriodValidator currently reports an incorrect finding code of "pkix.invalid_time_syntax" for negative validity periods. The correct finding code is the document type-specific code specified in the constructor. For example, "pkix.certificate_negative_validity_period" is the correct code for certificates.
Non-TLS CA certificates will not have serverauth reserved policy OIDs, so the SHOULD-level requirement for TLS certs to have a reserved OID in the first position isn't applicable.
NamedBitStringMinimalEncodingValidator always outputs a DER encoding that uses universal tag 0x03 and does not account for any implicit/explicit tagging of the named BIT STRING. This manifests as an incorrect error for DistributionPoint's reasons field (which is a named BIT STRING with an IMPLICIT tag).
The validator should be modified to use the ASN.1 schema of the ASN.1 node under test instead of assuming a "raw" BIT STRING.