
Comments (15)

darynwhite commented on July 19, 2024

I'm receiving the same error sequence:

#### Initializing recipe for plotly-resampler (pypi) ####

Recovering metadata from pypi...
Starting the download of the sdist package plotly-resampler
plotly-resampler 100% Time:  0:00:00 1022.4 KiB/s|#############################################################################################################################################|
Checking for pyproject.toml
pyproject.toml found in /var/folders/n1/82bp3xw12y50t8kgdv3bj17h0000ng/T/grayskull-plotly-resampler-ok03_0ki/plotly_resampler-0.9.1/pyproject.toml
Traceback (most recent call last):
  File "/Users/white/conda/envs/smithy/bin/grayskull", line 10, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/__main__.py", line 276, in main
    generate_recipes_from_list(args.pypi_packages, args)
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/__main__.py", line 299, in generate_recipes_from_list
    recipe, config = create_python_recipe(
                     ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/__main__.py", line 334, in create_python_recipe
    GrayskullFactory.create_recipe(
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/base/factory.py", line 46, in create_recipe
    GrayskullFactory.REGISTERED_STRATEGY[repo_type.lower()].fetch_data(
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/pypi.py", line 60, in fetch_data
    update_recipe(recipe, config, sections or ALL_SECTIONS)
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/pypi.py", line 485, in update_recipe
    metadata = get_metadata(recipe, config)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/pypi.py", line 347, in get_metadata
    sdist_metadata, pypi_metadata = get_origin_wise_metadata(config)
                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/pypi.py", line 232, in get_origin_wise_metadata
    sdist_metadata = get_sdist_metadata(
                     ^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/py_base.py", line 771, in get_sdist_metadata
    pyproject_metadata = get_all_toml_info(pyproject_toml)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/py_toml.py", line 263, in get_all_toml_info
    add_poetry_metadata(metadata, toml_metadata)
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/py_toml.py", line 195, in add_poetry_metadata
    req_run, req_run_constrained = encode_poetry_deps(poetry_deps)
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/py_toml.py", line 180, in encode_poetry_deps
    constrained_dep = get_constrained_dep(dep_spec, dep_name)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/functools.py", line 909, in wrapper
    return dispatch(args[0].__class__)(*args, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/white/conda/envs/smithy/lib/python3.11/site-packages/grayskull/strategy/py_toml.py", line 158, in get_constrained_dep
    raise InvalidPoetryDependency(
grayskull.strategy.py_toml.InvalidPoetryDependency: Expected Poetry dependency specification to be of type str or dict, received list

xylar commented on July 19, 2024

@kcpevey, yeah, I get that this can be a bit confusing. What grayskull produces is a meta.yaml used in a recipe for building a conda package. That's different from a YAML file used to produce an environment. In a meta.yaml, you would be very unlikely to have a spec like:

- python=3.10

Instead, you would have something like:

...
requirements:
  host:
    - python >=3.8
    - pip
  run:
    - python >=3.8
    - scikit-image >=0.18.1  # py<311
    - scikit-image >=0.20.0  # py>=311

The recipe will be built with various Python versions (currently 3.8 to 3.12), and the # py<311 and # py>=311 selectors will determine which spec to use for the Python version currently being used for the package build.

I think this shouldn't be too hard to fix in grayskull but I haven't had time to look into it myself.

xylar commented on July 19, 2024

@dlqqq, is this one you could have a look at?

darynwhite commented on July 19, 2024

Digging a little further, I think the error arises from how the numpy dependency is specified in the file:

[tool.poetry.dependencies]
python = "^3.7.1"
jupyter-dash = { version = ">=0.4.2", optional = true }
plotly = "^5.5.0"
dash = "^2.11.0"
pandas = ">=1"
trace-updater = ">=0.0.8"
numpy = [
    { version = ">=1.14", python = "<3.11" },
    { version = ">=1.24", python = ">=3.11" }
]
orjson = "^3.8.0"  # Faster json serialization
# Optional dependencies
Flask-Cors = { version = "^3.0.10", optional = true }
# Lock kaleido dependency until https://github.com/plotly/Kaleido/issues/156 is resolved
kaleido = {version = "0.2.1", optional = true}
tsdownsample = "0.1.2"
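
To illustrate (a minimal sketch of my own, not grayskull code): once the pyproject.toml is parsed, that numpy entry comes back as a list of dicts, and judging by the traceback, grayskull's singledispatch-based get_constrained_dep only has handlers for str and dict, hence the InvalidPoetryDependency.

import tomllib  # standard library in Python 3.11+

PYPROJECT = """
[tool.poetry.dependencies]
numpy = [
    { version = ">=1.14", python = "<3.11" },
    { version = ">=1.24", python = ">=3.11" },
]
"""

deps = tomllib.loads(PYPROJECT)["tool"]["poetry"]["dependencies"]
print(type(deps["numpy"]))  # <class 'list'> -> neither str nor dict, so grayskull raises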

darynwhite commented on July 19, 2024

While learning how Poetry defines dependencies, I found this reference that is specific to this bug:

Poetry Multiple Constraints

kcpevey commented on July 19, 2024

I don't think it's possible to handle it the same way via conda, but it seems like grayskull should make some assumption and move on instead of failing.

It looks like we need an enhancement here. I started to think through a fix, something like this:

        if isinstance(dep_spec, list):
            for dep in dep_spec:
                constrained_dep = get_constrained_dep(dep, dep_name)

but that would just use the last constraint specified. I think a proper solution would be to make an assumption like "always use the later Python version" (and tell the user about it), and then go back and adjust the minimum Python version for the recipe.

I don't know anything about this codebase, but it seems like there might be some functions in here that could help with version comparisons?

For example, I have python = "^3.8" and then a dep_spec that looks like:

scikit-image = [
    # Hinge because minimum support 0.20.0 for py3.11
    { version = ">=0.18.1", python = "<3.11" },
    { version = ">=0.20.0", python = ">=3.11" }
]

So I need to compare the two Python versions in the dep spec and find the later one, then ensure that the overall Python version is compatible or adjusted. Is there a tool in grayskull that can help with those comparisons?
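
For what it's worth, here is a rough sketch of the kind of comparison I have in mind, using the packaging library (an assumption on my part; I don't know whether grayskull already has a helper for this):

from packaging.specifiers import SpecifierSet
from packaging.version import Version

dep_spec = [
    {"version": ">=0.18.1", "python": "<3.11"},
    {"version": ">=0.20.0", "python": ">=3.11"},
]

# "always use the later python version": keep the entry whose python marker
# still matches the newest Python the recipe might build for (3.12 here,
# which is just an assumed upper bound for the example)
newest_python = Version("3.12")
applicable = [d for d in dep_spec if newest_python in SpecifierSet(d["python"])]
print(applicable)  # [{'version': '>=0.20.0', 'python': '>=3.11'}]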

xylar commented on July 19, 2024

I don't agree that conda can't handle this. It should be possible to do:

scikit-image >=0.18.1  # py<311
scikit-image >=0.20.0  # py>=311

This obviously doesn't allow the package to be noarch: python, but that is something the feedstock maintainers should intervene on if they want to make that call. grayskull needs to default to not using noarch: python in these circumstances.

kcpevey commented on July 19, 2024

I'm not sure I understand what you mean, @xylar. I meant that conda can't handle that same level of detail in a single environment file (or meta.yaml spec?).

For example if I have an environment.yml file like this:

channels:
- conda-forge
dependencies:
- python=3.10
- scikit-image >=0.18.1  # py<311
- scikit-image >=0.20.0  # py>=311

This will solve to python=3.10.13 and scikit-image=0.22.0, which is in disagreement with what the Poetry spec was attempting to specify.

darynwhite commented on July 19, 2024

I've discovered that conda can handle a set of requirements; see this example I have working in conda-forge/willow-feedstock/recipe/meta.yaml:

pyproject.toml

heif = [
    "pillow-heif>=0.10.0,<1.0.0; python_version < '3.12'",
    "pillow-heif>=0.13.0,<1.0.0; python_version >= '3.12'",
]

recipe.yaml

  - pillow-heif >=0.10.0,<1.0.0<py312|>=0.13.0,<1.0.0>=py312

I adapted the list based on the documentation here: Conda Package Match Specs

xylar commented on July 19, 2024

@darynwhite, I don't think we want something as complex as:

  - pillow-heif >=0.10.0,<1.0.0<py312|>=0.13.0,<1.0.0>=py312

That isn't how this kind of case is handled in meta.yaml files across conda-forge. I think this would be better:

  - pillow-heif >=0.10.0,<1.0.0  # py<312
  - pillow-heif >=0.13.0,<1.0.0  # py>=312

I don't think conda-smithy is going to be able to handle the syntax you suggested correctly.

kcpevey commented on July 19, 2024

Thanks for the explanation! I didn't know about that!

I'm working on a solution. I have both deps added to the requirements list, but then when I get here, that one-liner grabs only the FIRST dep in the list.

If I fix that, then I hit this reduction of package names to a set, thereby preventing duplicates.

I'll keep pressing on, but I'm wondering exactly how many things I'll break down the line, since there seems to be a strong assumption that there will only be ONE spec per package?
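
To make it concrete, this is roughly the output I am aiming for (a standalone sketch with a made-up helper, not the actual grayskull functions), so that no entry gets dropped by a first-item grab or de-duplicated into a set:

def expand_multi_constraint(name: str, dep_spec: list[dict]) -> list[str]:
    # emit one selector-tagged requirement line per {"version": ..., "python": ...}
    # entry instead of keeping only the first one
    lines = []
    for entry in dep_spec:
        selector = "py" + entry["python"].replace(".", "")
        lines.append(f"{name} {entry['version']}  # [{selector}]")
    return lines

print(expand_multi_constraint("scikit-image", [
    {"version": ">=0.18.1", "python": "<3.11"},
    {"version": ">=0.20.0", "python": ">=3.11"},
]))
# ['scikit-image >=0.18.1  # [py<311]', 'scikit-image >=0.20.0  # [py>=311]']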

darynwhite commented on July 19, 2024

@xylar, I was successful in building the latest willow-feedstock update with that complex string, albeit with a complicated package spec. Here's the latest PR.

Quoting from the conda-build docs Requirements section

Specifies the build and runtime requirements. Dependencies of these requirements are included automatically.

Versions for requirements must follow the conda match specification. See Package match specifications.

And the Package match specifications say:

...
The build string constraint "numpy=1.11.2=nomkl" matches the NumPy 1.11.2 packages without MKL but not the normal MKL NumPy 1.11.2 packages.

The build string constraint "numpy=1.11.1|1.11.3=py36_0" matches NumPy 1.11.1 or 1.11.3 built for Python 3.6 but not any versions of NumPy built for Python 3.5 or Python 2.7.
...

That said...

I am unable to find a reference for breaking the package spec into two lines, so I don't know what the behavior would be if that is done. I only have the above-mentioned Package match specifications documentation to go off of.

xylar commented on July 19, 2024

@darynwhite, the multi-line approach uses preprocessing selectors; see:
https://docs.conda.io/projects/conda-build/en/stable/resources/define-metadata.html#preprocessing-selectors

Your approach might work fine with conda-build. My concerns are:

  • it's pretty hard to read
  • it might not work with conda-smithy, which is the tool used by conda-forge to render its recipes.

The latter is my bigger concern, since conda-forge uses grayskull extensively and has tens of thousands of packages. I think this would be a problem. I don't know for certain that the syntax you are proposing wouldn't work, but I have never seen it before, whereas the preprocessing selector syntax is ubiquitous throughout conda-forge. So I would strongly prefer a solution that uses that syntax if one can be devised.

darynwhite commented on July 19, 2024

@xylar Thanks for the terminology! I hadn't stumbled my way into that FAQ yet.

If I'm understanding this correctly, then these are equivalent:

  - pillow-heif >=0.10.0,<1.0.0<py312|>=0.13.0,<1.0.0>=py312
  - pillow-heif >=0.10.0,<1.0.0 # py<312
  - pillow-heif >=0.13.0,<1.0.0 # py>=312

And...

I say they are equivalent because I've seen the single-line approach pass the linter and build properly. The multi-line approach should also pass the linter and build properly, based on the documentation I have read and what you just shared.

So...

Based on the original issue here, I agree that the multi-line approach is the best. Thus I believe this:

numpy = [
    { version = ">=1.14", python = "<3.11" },
    { version = ">=1.24", python = ">=3.11" }
]

would translate to:

  - numpy >=1.14 # py<311
  - numpy >=1.24 # py>=311

Thanks for the conversation and resources!

DhanshreeA commented on July 19, 2024

Hey @xylar, why is it that grayskull has no complaints with a multi-constraint dependency specification when it's not from a pyproject.toml using a Poetry backend?

So something like this makes grayskull raise an error:

numpy = [
    { version = ">=1.14", python = "<3.11" },
    { version = ">=1.24", python = ">=3.11" }
]

but this does not:

heif = [
    "pillow-heif>=0.10.0,<1.0.0; python_version < '3.12'",
    "pillow-heif>=0.13.0,<1.0.0; python_version >= '3.12'",
]
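
For reference, a minimal illustration of what I mean (my own sketch, not grayskull internals): the heif entries are plain PEP 508 requirement strings with environment markers, which the packaging library can parse directly, whereas the Poetry form is a list of TOML tables that needs Poetry-specific handling.

from packaging.requirements import Requirement

req = Requirement("pillow-heif>=0.10.0,<1.0.0; python_version < '3.12'")
print(req.name)       # pillow-heif
print(req.specifier)  # the >=0.10.0,<1.0.0 version constraints
print(req.marker)     # python_version < "3.12"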
