pymatting / pymatting

A Python library for alpha matting

Home Page: https://pymatting.github.io

License: MIT License

Python 93.44% Shell 0.43% C++ 2.05% C 4.08%
alpha-matting image-processing foreground python3

pymatting's Introduction

PyMatting: A Python Library for Alpha Matting


We introduce the PyMatting package for Python which implements various methods to solve the alpha matting problem.

Lemur

Given an input image and a hand-drawn trimap (top row), alpha matting estimates the alpha channel of a foreground object which can then be composed onto a different background (bottom row).

PyMatting provides:

  • Alpha matting implementations for:
    • Closed Form Alpha Matting [1]
    • Large Kernel Matting [2]
    • KNN Matting [3]
    • Learning Based Digital Matting [4]
    • Random Walk Matting [5]
    • Shared Sampling Matting [6]
  • Foreground estimation implementations for:
    • Closed Form Foreground Estimation [1]
    • Fast Multi-Level Foreground Estimation (CPU, CUDA and OpenCL) [7]
  • Fast multithreaded KNN search
  • Preconditioners to accelerate the convergence rate of conjugate gradient descent:
    • The incomplete thresholded Cholesky decomposition (Incomplete is part of the name. The implementation is quite complete.)
    • The V-Cycle Geometric Multigrid preconditioner
  • Readable code leveraging NumPy, SciPy and Numba
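
The estimators listed above share a simple functional interface. The following is a minimal sketch using the top-level names exported by the pymatting package (paths and the choice of estimators are illustrative; see the documentation for the full list of keyword arguments):

# Load the inputs as float64 arrays in [0, 1]
from pymatting import load_image, estimate_alpha_cf, estimate_alpha_knn, estimate_foreground_ml

image = load_image("data/lemur/lemur.png", "RGB")
trimap = load_image("data/lemur/lemur_trimap.png", "GRAY")

# Every alpha estimator takes (image, trimap) and returns an alpha matte
alpha = estimate_alpha_cf(image, trimap)      # Closed Form Alpha Matting [1]
# alpha = estimate_alpha_knn(image, trimap)   # KNN Matting [3]

# Foreground estimation takes (image, alpha) and returns foreground colors
foreground = estimate_foreground_ml(image, alpha)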

Getting Started

Requirements

Minimal requirements

  • numpy>=1.16.0
  • pillow>=5.2.0
  • numba>=0.47.0
  • scipy>=1.1.0

Additional requirements for GPU support

  • cupy-cuda90>=6.5.0 or similar
  • pyopencl>=2019.1.2

Requirements to run the tests

  • pytest>=5.3.4

Installation with PyPI

pip3 install pymatting

Installation from Source

git clone https://github.com/pymatting/pymatting
cd pymatting
pip3 install .

Example

# First import will take a minute due to compilation
from pymatting import cutout

cutout(
    # input image path
    "data/lemur/lemur.png",
    # input trimap path
    "data/lemur/lemur_trimap.png",
    # output cutout path
    "lemur_cutout.png")

More advanced examples

Trimap Construction

All implemented methods rely on trimaps which roughly classify the image into foreground, background and unknown regions. Trimaps are expected to be numpy.ndarrays of type np.float64 having the same shape as the input image with only one color-channel. Trimap values of 0.0 denote pixels which are 100% background. Similarly, trimap values of 1.0 denote pixels which are 100% foreground. All other values indicate unknown pixels which will be estimated by the algorithm.
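
For illustration, a minimal trimap can be constructed directly as an array (the region boundaries below are arbitrary):

import numpy as np

# 0.0 = definite background, 1.0 = definite foreground, everything else = unknown
h, w = 512, 512                                  # must match the input image size
trimap = np.full((h, w), 0.5, dtype=np.float64)  # start with everything unknown
trimap[:100, :] = 0.0                            # mark the top rows as background
trimap[-100:, :] = 1.0                           # mark the bottom rows as foreground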

Testing

Run the tests from the main directory:

 python3 tests/download_images.py
 pip3 install -r requirements_tests.txt
 pytest

Currently 89% of the code is covered by tests.

Upgrade

pip3 install --upgrade pymatting
python3 -c "import pymatting"

Bug Reports, Questions and Pull-Requests

Please see our community guidelines.

Authors

  • Thomas Germer
  • Tobias Uelwer
  • Stefan Conrad
  • Stefan Harmeling

See also the list of contributors who participated in this project.

Projects using PyMatting

  • Rembg - an excellent tool for removing image backgrounds.
  • PaddleSeg - a library for a wide range of image segmentation tasks.
  • chaiNNer - a node-based image processing GUI.
  • LSA-Matting - improving deep image matting via local smoothness assumption.

License

This project is licensed under the MIT License - see the LICENSE.md file for details.

Citing

If you found PyMatting to be useful for your work, please consider citing our paper:

@article{Germer2020,
  doi = {10.21105/joss.02481},
  url = {https://doi.org/10.21105/joss.02481},
  year = {2020},
  publisher = {The Open Journal},
  volume = {5},
  number = {54},
  pages = {2481},
  author = {Thomas Germer and Tobias Uelwer and Stefan Conrad and Stefan Harmeling},
  title = {PyMatting: A Python Library for Alpha Matting},
  journal = {Journal of Open Source Software}
}

References

[1] Anat Levin, Dani Lischinski, and Yair Weiss. A closed-form solution to natural image matting. IEEE transactions on pattern analysis and machine intelligence, 30(2):228–242, 2007.

[2] Kaiming He, Jian Sun, and Xiaoou Tang. Fast matting using large kernel matting laplacian matrices. In 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2165–2172. IEEE, 2010.

[3] Qifeng Chen, Dingzeyu Li, and Chi-Keung Tang. Knn matting. IEEE transactions on pattern analysis and machine intelligence, 35(9):2175–2188, 2013.

[4] Yuanjie Zheng and Chandra Kambhamettu. Learning based digital matting. In 2009 IEEE 12th international conference on computer vision, 889–896. IEEE, 2009.

[5] Leo Grady, Thomas Schiwietz, Shmuel Aharon, and Rüdiger Westermann. Random walks for interactive alpha-matting. In Proceedings of VIIP, volume 2005, 423–429. 2005.

[6] Eduardo S. L. Gastal and Manuel M. Oliveira. "Shared Sampling for Real-Time Alpha Matting". Computer Graphics Forum. Volume 29 (2010), Number 2, Proceedings of Eurographics 2010, pp. 575-584.

[7] Germer, T., Uelwer, T., Conrad, S., & Harmeling, S. (2020). Fast Multi-Level Foreground Estimation. arXiv preprint arXiv:2006.14970.

Lemur image by Mathias Appel from https://www.flickr.com/photos/mathiasappel/25419442300/ licensed under CC0 1.0 Universal (CC0 1.0) Public Domain License.


pymatting's Issues

Tests require missing images

Several tests (for example, the one in test_estimate_alpha.py) fail because a required image is missing:

FAILED tests/test_estimate_alpha.py::test_alpha - FileNotFoundError: [Errno 2] No such file or directory: 'data/input_training_lowres/GT01.png'
FAILED tests/test_laplacians.py::test_laplacians - FileNotFoundError: [Errno 2] No such file or directory: 'data/input_training_lowres/GT01.png'
FAILED tests/test_lkm.py::test_lkm - FileNotFoundError: [Errno 2] No such file or directory: 'data/input_training_lowres/GT01.png'
FAILED tests/test_preconditioners.py::test_preconditioners - FileNotFoundError: [Errno 2] No such file or directory: 'data/input_training_lowres/GT01.png'

Could/should GT01.png be included?

How to get hand-drawn trimap?

The inputs are an input image and a hand-drawn trimap, but often we don't have a hand-drawn trimap. Is there code to generate one?

cutout(
    # input image path
    "data/lemur/lemur.png",
    # input trimap path
    "data/lemur/lemur_trimap.png",
    # output cutout path
    "lemur_cutout.png")

[Question❓] Can successfully install, but cannot run example code.

installed pymatting 1.1.1

But when I run this code:

from pymatting import cutout

cutout(
    # input image path
    "../data/lemur/lemur.png",
    # input trimap path
    "../data/lemur/lemur_trimap.png",
    # output cutout path
    "lemur_cutout.png")

RuntimeError: Attempted to compile AOT function without the compiler used by numpy.distutils present. Cannot find suitable msvc.

[Question❓] Question regarding output of advanced example

Thank you for your efforts. I have successfully built the package and was able to execute a few examples by following the advanced example on https://pymatting.github.io/examples.html, and I have a request for information.

I am attaching a set of images to show what I am seeing and would like advice on how to improve the results.

1: original.png: the original file
2: target.png: the manual alpha matte from Photoshop (what I was aiming for; it might be ambitious)
3: color_bleeding.png, grid.png and cutout.png: the outputs of the pymatting package
4: initial_mask.jpg: the rough mask generated by the segmentation framework, i.e. detectron2
5: trimap.png: generated by the following function:
import random
import numpy as np
import cv2

def gen_trimap(alpha):
    k_size = random.choice(range(1, 5))
    iterations = np.random.randint(1, 20)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (k_size, k_size))
    # Note: iterations must be passed as a keyword argument, otherwise it is
    # interpreted as the dst parameter of cv2.dilate/cv2.erode.
    dilated = cv2.dilate(alpha, kernel, iterations=iterations)
    eroded = cv2.erode(alpha, kernel, iterations=iterations)
    trimap = np.zeros(alpha.shape)
    trimap.fill(128)
    trimap[eroded >= 255] = 255
    trimap[dilated <= 0] = 0
    return trimap

Q1: Is there a way the internal contours can be removed, or is that impossible as long as they are present in the trimap as well?

Q2: What do I need to do in the pymatting package to improve the alpha around external contours, hairs, etc., given the input image, the rough mask from the segmentation framework, and a trimap computed as above?

Q3: Could it be that my test case is too ambitious and PyMatting is better suited for objects with sharp edges, e.g. cars?

Thank you.

`setup.py` should define actual dependencies

Currently, the packages specified in requirements.txt are copied into setup.py:

    install_requires=load_text("requirements.txt").strip().split("\n"),

This is bad practice and can cause problems downstream.

requirements.txt should be used to define a repeatable installation, such as a development environment or a production environment. As such, versions of dependencies contained therein should be as specific as possible.

install_requires should be used to indicate dependencies necessary to run the package. As such, versions of dependencies contained therein should be as broad as possible.

Please see “install_requires vs requirements files” on python.org or “requirements.txt vs setup.py” on stackoverflow for more information.

I'd be happy to contribute a PR with loose dependency specifications in setup.py and concrete specifications in requirements.txt.
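
A possible shape for such a change, sketched with the minimal requirements from the README as loose lower bounds (the exact pins are of course up to the maintainers):

# setup.py sketch: loose runtime dependencies instead of copying requirements.txt
from setuptools import setup, find_packages

setup(
    name="pymatting",
    packages=find_packages(),
    install_requires=[
        "numpy>=1.16.0",
        "pillow>=5.2.0",
        "numba>=0.47.0",
        "scipy>=1.1.0",
    ],
)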

Questions about parameters during testing

I'm using the sample code provided in the documentation for testing, but the final matting result is not good. What parameters should I adjust to improve it?
lemur_grid

Got Segmentation Fault when calling estimate_alpha_knn

Got this error both on macOS 10.14 and Ubuntu 16.04

When installing the package I used the --ignore-installed llvmlite flag for pip because I got
Cannot uninstall 'llvmlite'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall. I am not sure if this is relevant.

pytest
============================================================================== test session starts ===============================================================================
platform darwin -- Python 3.7.4, pytest-5.2.1, py-1.8.0, pluggy-0.13.0
rootdir: /Users/user/pymatting-master
plugins: arraydiff-0.3, remotedata-0.3.2, doctestplus-0.4.0, openfiles-0.4.0
collected 11 items                                                                                                                                                               

tests/test_boxfilter.py .                                                                                                                                                  [  9%]
tests/test_cg.py .                                                                                                                                                         [ 18%]
tests/test_estimate_alpha.py F                                                                                                                                             [ 27%]
tests/test_foreground.py .                                                                                                                                                 [ 36%]
tests/test_ichol.py .                                                                                                                                                      [ 45%]
tests/test_kdtree.py Fatal Python error: Segmentation fault

Current thread 0x00000001086d7dc0 (most recent call first):
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pymatting/util/kdtree.py", line 280 in __init__
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pymatting/util/kdtree.py", line 347 in knn
  File "/Users/user/pymatting-master/tests/test_kdtree.py", line 20 in run_kdtree
  File "/Users/user/pymatting-master/tests/test_kdtree.py", line 46 in test_kdtree
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/python.py", line 170 in pytest_pyfunc_call
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 86 in <lambda>
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 92 in _hookexec
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/python.py", line 1423 in runtest
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/runner.py", line 125 in pytest_runtest_call
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 86 in <lambda>
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 92 in _hookexec
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/runner.py", line 201 in <lambda>
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/runner.py", line 229 in from_call
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/runner.py", line 201 in call_runtest_hook
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/runner.py", line 176 in call_and_report
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/runner.py", line 95 in runtestprotocol
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/runner.py", line 80 in pytest_runtest_protocol
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 86 in <lambda>
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 92 in _hookexec
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/main.py", line 256 in pytest_runtestloop
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 86 in <lambda>
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 92 in _hookexec
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/main.py", line 235 in _main
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/main.py", line 191 in wrap_session
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/main.py", line 228 in pytest_cmdline_main
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 86 in <lambda>
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/manager.py", line 92 in _hookexec
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/pluggy/hooks.py", line 286 in __call__
  File "/Users/user/opt/anaconda3/lib/python3.7/site-packages/_pytest/config/__init__.py", line 90 in main
  File "/Users/user/opt/anaconda3/bin/pytest", line 11 in <module>
[1]    99661 segmentation fault  pytest

Bad result when using scribble.png

I used scribble.png instead of a trimap to compute alpha, but the result is not good.
lemur
lemur_caotu
lemur_alpha
What is wrong here? Hoping for your kind reply @99991

TypeError in Numba

Hi, I encountered the following problem:

File "C:\ProgramData\Anaconda3\envs\work\lib\site-packages\pymatting\alpha\estimate_alpha_cf.py", line 35, in estimate_alpha_cf
A, b = make_linear_system(cf_laplacian(image, **laplacian_kwargs), trimap)
File "C:\ProgramData\Anaconda3\envs\work\lib\site-packages\pymatting\laplacian\cf_laplacian.py", line 156, in cf_laplacian
_cf(image, epsilon, radius, values, indices, indptr)
File "C:\ProgramData\Anaconda3\envs\work\lib\site-packages\numba\dispatcher.py", line 574, in _explain_matching_error
raise TypeError(msg)
TypeError: No matching definition for argument type(s) array(uint8, 3d, C), float64, int64, array(float64, 3d, C), array(int64, 1d, C), array(int64, 1d, C)


If you can help, thank you very much

ValueError on import

Hi,
I installed the library, but there is a problem when importing your package.

I am using python 3.8.1 with :
numpy=1.18.1 (>=1.16.0)
pillow=6.2.1 (>=5.2.0)
numba=0.47.0 (>=0.44.0)
scipy=1.3.3 (>=1.1.0)

*** ValueError: Failed in nopython mode pipeline (step: convert to parfors)
Cannot add edge as dest node 26 not in nodes {130, 132, 262, 264, 528, 30, 418, 302, 564, 565, 566, 568, 322, 450, 196, 452, 324, 340, 212, 86, 214, 348, 94, 228, 356, 494, 118, 246, 248, 378, 380}

(you can read all here: https://gyazo.com/b6b9756f0c8d75a30a63dada09c5f82e)

Thank you for your work 👍

Trimap Values?

Hi, what are the values in the trimap?

Is 0 for background, 102 for uncertain, and 255 for foreground?

[BUG 🐛] No module named 'pymatting_aot.aot'

Bug description

$ python test2.py
Failed to import ahead-of-time-compiled modules. This is expected on first import.
Compiling modules and trying again (this might take a minute).
Traceback (most recent call last):
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting_aot/cc.py", line 21, in <module>
    import pymatting_aot.aot
ModuleNotFoundError: No module named 'pymatting_aot.aot'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "test2.py", line 1, in <module>
    from pymatting import cutout
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting/__init__.py", line 2, in <module>
    import pymatting_aot.cc
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting_aot/cc.py", line 28, in <module>
    compile_modules()
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting_aot/cc.py", line 6, in compile_modules
    cc = CC("aot")
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/numba/pycc/cc.py", line 65, in __init__
    self._toolchain = Toolchain()
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/numba/pycc/platform.py", line 78, in __init__
    self._raise_external_compiler_error()
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/numba/pycc/platform.py", line 121, in _raise_external_compiler_error
    raise RuntimeError(msg)
RuntimeError: Attempted to compile AOT function without the compiler used by numpy.distutils present. If using conda try:

#> conda install gcc_linux-64 gxx_linux-64

To Reproduce

Installed pymatting with pip install rembg on Fedora 33 within a venv for Python 3.8.
Create test2.py:
from pymatting import cutout

cutout(
# input image path
"data/lemur/lemur.png",
# input trimap path
"data/lemur/lemur_trimap.png",
# output cutout path
"lemur_cutout.png")
Launch:
python test2.py within the venv for Python 3.8

Expected behavior

Runs without errors.


Library versions:

(Run the following commands and paste the result here.)

python --version --version
Python 3.8.6 (default, Sep 25 2020, 00:00:00) 
[GCC 10.2.1 20200826 (Red Hat 10.2.1-3)]

python -c "import numpy; numpy.show_config()"
blas_mkl_info:
  NOT AVAILABLE
blis_info:
  NOT AVAILABLE
openblas_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
blas_opt_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
lapack_mkl_info:
  NOT AVAILABLE
openblas_lapack_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
lapack_opt_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]

python -c "import scipy;scipy.show_config()"
lapack_mkl_info:
  NOT AVAILABLE
openblas_lapack_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
lapack_opt_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
blas_mkl_info:
  NOT AVAILABLE
blis_info:
  NOT AVAILABLE
openblas_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]
blas_opt_info:
    libraries = ['openblas', 'openblas']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None)]

python -c "import numba;print('Numba version:', numba.__version__)"
Numba version: 0.51.2

python -c "import PIL;print('PIL version:', PIL.__version__)"
PIL version: 8.0.1

python -c "from pymatting.__about__ import __version__;print('PyMatting version:', __version__)"
Failed to import ahead-of-time-compiled modules. This is expected on first import.
Compiling modules and trying again (this might take a minute).
Traceback (most recent call last):
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting_aot/cc.py", line 21, in <module>
    import pymatting_aot.aot
ModuleNotFoundError: No module named 'pymatting_aot.aot'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting/__init__.py", line 2, in <module>
    import pymatting_aot.cc
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting_aot/cc.py", line 28, in <module>
    compile_modules()
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/pymatting_aot/cc.py", line 6, in compile_modules
    cc = CC("aot")
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/numba/pycc/cc.py", line 65, in __init__
    self._toolchain = Toolchain()
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/numba/pycc/platform.py", line 78, in __init__
    self._raise_external_compiler_error()
  File "/home/sav/pytest/envrembg/lib64/python3.8/site-packages/numba/pycc/platform.py", line 121, in _raise_external_compiler_error
    raise RuntimeError(msg)
RuntimeError: Attempted to compile AOT function without the compiler used by `numpy.distutils` present. If using conda try:

#> conda install gcc_linux-64 gxx_linux-64

[Question❓] When I run the examples on macOS, I get the following error. Any help?

/Users/mac/.conda/envs/yolov5/bin/python /Users/mac/PycharmProjects/aiphoto/pymatting/examples/advanced_example.py
Failed to import ahead-of-time-compiled modules.
This is expected on first import.
Compiling modules and trying again.
This might take a minute.
Traceback (most recent call last):
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/pymatting_aot/cc.py", line 36, in <module>
    import pymatting_aot.aot
ModuleNotFoundError: No module named 'pymatting_aot.aot'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/mac/PycharmProjects/aiphoto/pymatting/examples/advanced_example.py", line 1, in <module>
    from pymatting import *
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/pymatting/__init__.py", line 2, in <module>
    import pymatting_aot.cc
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/pymatting_aot/cc.py", line 54, in <module>
    compile_modules()
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/pymatting_aot/cc.py", line 18, in compile_modules
    cc.compile()
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numba/core/compiler_lock.py", line 32, in _acquire_compile_lock
    return func(*args, **kwargs)
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numba/pycc/cc.py", line 217, in compile
    objects += self._compile_mixins(build_dir)
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numba/pycc/cc.py", line 187, in _compile_mixins
    objects = self._toolchain.compile_objects(sources, build_dir,
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numba/pycc/platform.py", line 137, in compile_objects
    objects = self._compiler.compile(sources,
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numpy/distutils/ccompiler.py", line 90, in <lambda>
    m = lambda self, *args, **kw: func(self, *args, **kw)
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numpy/distutils/ccompiler.py", line 356, in CCompiler_compile
    pool.map(single_compile, build_items)
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/multiprocessing/pool.py", line 364, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/multiprocessing/pool.py", line 771, in get
    raise self._value
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/multiprocessing/pool.py", line 48, in mapstar
    return list(map(*args))
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numpy/distutils/ccompiler.py", line 321, in single_compile
    self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numpy/distutils/ccompiler.py", line 90, in <lambda>
    m = lambda self, *args, **kw: func(self, *args, **kw)
  File "/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numpy/distutils/unixccompiler.py", line 54, in UnixCCompiler__compile
    raise CompileError(msg)
distutils.errors.CompileError: Command "gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/mac/.conda/envs/yolov5/include -arch x86_64 -I/Users/mac/.conda/envs/yolov5/include -arch x86_64 -DPYCC_MODULE_NAME=aot -DPYCC_USE_NRT=1 -I/Users/mac/.conda/envs/yolov5/include/python3.8 -I/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numpy/core/include -c /Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numba/pycc/modulemixin.c -o /var/folders/qg/j35fnxd56tn3mzf6sr_tw9q80000gn/T/pycc-build-aot-n738hml5/Users/mac/.conda/envs/yolov5/lib/python3.8/site-packages/numba/pycc/modulemixin.o" failed with exit status 1

[BUG 🐛]Just a small typo in estimate_alpha_lkm.py

In file estimate_alpha_lkm.py, L9

Estimate alpha from an input image and an input trimap using Learning Based Digital Matting as proposed by :cite:he2010fast.

I guess the paper name was intended to be Fast matting using large kernel matting laplacian matrices?😉

pytest Error: tests/test_lkm.py:81: AssertionError

=== warnings summary ===

tests/test_foreground.py::test_foreground
/home/ferg/git/pymatting/tests/test_foreground.py:31: UserWarning: Tests for GPU implementation skipped, because of missing packages.

I'm on Fedora 31 and here are my pip3 package versions, which are above the required dependencies.

Requirement already satisfied: numpy in /usr/lib64/python3.7/site-packages (1.17.4)
Requirement already satisfied: pillow in /usr/lib64/python3.7/site-packages (6.1.0)
Requirement already satisfied: numba in /home/ferg/.local/lib/python3.7/site-packages (0.48.0)
Requirement already satisfied: scipy in /home/ferg/.local/lib/python3.7/site-packages (1.4.1)

Can't find "tests/download_images.py"

When I try to run python tests/download_images.py, it gives me the following error.

can't open file 'tests/download_images.py': [Errno 2] No such file or directory

I am in my Python projects folder, something like src/github/BatDev0/pymatting, which is where I do all my Python tests and projects. Do I have to run the command in another folder?

[BUG 🐛] division by zero error in estimate_foreground_ml

I am getting division by zero errors in estimate_foreground_ml().

What I tried:

  • pymatting 1.1.1 and 1.1.3
  • making sure both the image and the mask are not uniform (I've seen the error when both have min_val=0 and max_val=1)
  • default parameters and different variations

The environment is Google Colab. Also, sometimes this (or something else in PyMatting) causes the Colab runtime itself to crash and disconnect.

[BUG 🐛] PyMatting crashes when I use it in torch dataloader.

Bug description

I used pymatting in torch data preprocessing, but the new version of pymatting does not seem to support multi-threading.
In addition, 1.0.4 works.

To Reproduce

PyMatting 1.1.4, Torch 1.10, a 5900X with a 3090, CUDA 11.4. torch Dataset/DataLoader with num_workers >= 1.

typo in knn_laplacian?

This repo really helps, Thank you!

But I just found that the docstring says the distance_weights are "Weight of distance in feature vector, defaults to [2.0, 1.0].", while in the function definition the default is distance_weights=[2.0, 0.1]. Is the typo in the definition or in the comment? Which one is the more general setting?

[BUG 🐛] RuntimeError: Attempted to compile AOT function without the compiler used by `numpy.distutils` present. Cannot find suitable msvc.

If you experience the error message

RuntimeError: Attempted to compile AOT function without the compiler used by `numpy.distutils` present. Cannot find suitable msvc.

on Windows when importing the PyMatting library, then you need to install a C compiler.

Currently, Build Tools for Visual Studio 2019 are available here: https://visualstudio.microsoft.com/downloads/#build-tools-for-visual-studio-2019

When installing, select Desktop Development with C++ or whatever it is called in English. The language selection is broken unfortunately.

VisualStudioBuildTools

I tried only installing specific components, but it seems like the Windows 10 SDK is required. Otherwise, cl.exe will fail because it cannot find basetsd.h.

A several GB install is of course less than ideal. It would be nice if we could get rid of the dependency, which is currently required by Numba for ahead-of-time compilation.

Thresholded incomplete Cholesky decomposition failed

When running with an 8K file and trimap it shows:

Thresholded incomplete Cholesky decomposition failed due to insufficient positive-definiteness of matrix A with parameters:
discard_threshold = 1.000000e-04
shift = 1.000000e-04
Try decreasing discard_threshold or start with a larger shift

and

Thresholded incomplete Cholesky decomposition failed because more than max_nnz non-zero elements were created. Try increasing max_nnz or discard_threshold.

Also, please tell me which of these methods has the best accuracy.
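
For reference, the decomposition parameters named in the error messages can be passed through the preconditioner argument of estimate_alpha_cf, for example via functools.partial. This is only a sketch: the keyword names below (discard_threshold, shift, max_nnz) are taken directly from the error messages and may not match the actual ichol keyword arguments, so check the ichol docstring before use.

from functools import partial
from pymatting import estimate_alpha_cf
from pymatting.preconditioner.ichol import ichol

# Keyword names taken from the error messages above; verify against the docstring.
relaxed_ichol = partial(
    ichol,
    discard_threshold=1e-5,  # "Try decreasing discard_threshold ..."
    shift=1e-3,              # "... or start with a larger shift"
    max_nnz=200_000_000,     # "Try increasing max_nnz ..."
)

# image and trimap are assumed to be loaded already
alpha = estimate_alpha_cf(image, trimap, preconditioner=relaxed_ichol)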

numba Error: No matching definition for argument type(s)

This line:
foreground = pymatting.estimate_foreground_ml(image, alpha)

gives an error

--> 214 foreground, background = _estimate_fb_ml(
215 image.astype(np.float32),
216 alpha.astype(np.float32),

~\AppData\Local\Continuum\anaconda3\envs\pytorch3d\lib\site-packages\numba\dispatcher.py in _explain_matching_error(self, *args, **kws)

TypeError: No matching definition for argument type(s) array(float32, 3d, C), array(float32, 3d, C), float64, int64, int64, int64

I checked that image and alpha are np.float64 between 0.0 and 1.0.
How can this be resolved?

image
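
Judging from the error message, both arguments are 3-D arrays, while the compiled function most likely expects a 2-D alpha matte. A sketch of normalizing the inputs before the call (the array shapes are an assumption here):

import numpy as np
import pymatting

# Ensure the image is (h, w, 3) float64 in [0, 1] and alpha is a 2-D (h, w) array
image = np.asarray(image, dtype=np.float64)
alpha = np.asarray(alpha, dtype=np.float64)
if alpha.ndim == 3:
    alpha = alpha[:, :, 0]  # drop an extra channel dimension

foreground = pymatting.estimate_foreground_ml(image, alpha)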

[Question❓]

How can I convert the Cholesky decomposition object to a NumPy array? I want to convert the object returned by pymatting.preconditioner.ichol.ichol(A).

[BUG 🐛] AttributeError: module 'numba.core.serialize' has no attribute '_numba_unpickle'

Bug description

Importing PyMatting triggers the following error:

AttributeError: module 'numba.core.serialize' has no attribute '_numba_unpickle'

To Reproduce

Update PyMatting from 1.0.6? to 1.1.2? and import the library.

Cause

This is likely an issue with Numba. It should probably recompile old modules or not crash when importing them.

Temporary fix

Locate the ahead-of-time compiled module and delete it. A new import will trigger a recompilation. In my case, the file was located at ~/miniconda3/lib/python3.7/site-packages/pymatting_aot/aot.cpython-37m-x86_64-linux-gnu.so

[Question❓] What exactly is a trimap?

I was reading about trimaps and found two different definitions:

  • An image consisting of only 3 colors: black, white and a single shade of grey
  • An image consisting of black, white and shades of grey (where all shades of grey correspond to unknown region)

Which one is correct?

"Trimap has 102 colors" on a 3 colors image

Hello, I painted a trimap in Photoshop with colors picked from lemur_trimap.png. Then I used the cutout function, and this error appeared:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-28-ce857cc21222> in <module>()
      7     "data/nik/nik_trimap.png",
      8     # output cutout path
----> 9     "nik_cutout.png")

3 frames
/content/drive/My Drive/Colab/pymatting/pymatting/cutout/cutout.py in cutout(image_path, trimap_path, cutout_path)
     28         raise ValueError("Input image and trimap must have same size")
     29 
---> 30     alpha = estimate_alpha_cf(image, trimap)
     31 
     32     foreground = estimate_foreground_ml(image, alpha)

/content/drive/My Drive/Colab/pymatting/pymatting/alpha/estimate_alpha_cf.py in estimate_alpha_cf(image, trimap, preconditioner, laplacian_kwargs, cg_kwargs)
     44         preconditioner = ichol
     45 
---> 46     A, b = make_linear_system(cf_laplacian(image, **laplacian_kwargs), trimap)
     47 
     48     x = cg(A, b, M=preconditioner(A), **cg_kwargs)

/content/drive/My Drive/Colab/pymatting/pymatting/laplacian/laplacian.py in make_linear_system(L, trimap, lambda_value, return_c)
     36     h, w = trimap.shape[:2]
     37 
---> 38     is_fg, is_bg, is_known, is_unknown = trimap_split(trimap)
     39 
     40     c = lambda_value * is_known

/content/drive/My Drive/Colab/pymatting/pymatting/util/util.py in trimap_split(trimap, flatten)
    447             "Make sure that your trimaps are stored as PNG instead of JPG.\n"
    448             "If you scaled the trimap, make sure to use nearest filtering:\n"
--> 449             '    load_image("trimap.png", "GRAY", 0.5, "nearest")' % n_colors
    450         )
    451 

ValueError: Trimap has 102 colors, but should have no more than 3 (black, white, gray).
Make sure that your trimaps are stored as PNG instead of JPG.
If you scaled the trimap, make sure to use nearest filtering:
    load_image("trimap.png", "GRAY", 0.5, "nearest")

This is the trimap

nik_trimap
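
Anti-aliased brush edges or JPG compression are the usual cause of this error. Besides loading the trimap with nearest filtering as the message suggests, the painted trimap can be snapped back to exactly three values before matting. A small sketch (the thresholds are illustrative):

import numpy as np
from pymatting import load_image

trimap = load_image("data/nik/nik_trimap.png", "GRAY")
snapped = np.full(trimap.shape, 0.5)  # unknown by default
snapped[trimap < 0.1] = 0.0           # definite background
snapped[trimap > 0.9] = 1.0           # definite foreground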

Error in local installation

I am trying to install the repo through:
git clone https://github.com/pymatting/pymatting.git
cd pymatting
pip install .
but I got the error below:

  File "setup.py", line 20, in <module>
    long_description=load_text("README.md"),
  File "setup.py", line 6, in load_text
    return f.read()
  File "/ldap_home/bohang.li/.conda/envs/pymatting/lib/python3.6/encodings/ascii.py", line 26, in decode
    return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 3305: ordinal not in range(128)
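
The README contains non-ASCII characters, so reading it with an ASCII default encoding fails. A local workaround is to make the load_text helper in setup.py read the file as UTF-8 explicitly (a sketch, assuming load_text simply opens and reads the file, as the traceback suggests):

def load_text(filename):
    # Read the file as UTF-8 regardless of the system's default encoding
    with open(filename, encoding="utf-8") as f:
        return f.read()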

pixelated edges?

Wondering why the results are so poor from images like this. Any tips/ideas?

Original:

00016

Trimap:

00016

The cutout:

00016

[Question❓] Tests for GPU implementation skipped, because of missing packages

Hi

I have set up PyMatting in a container environment and executed the tests. Pytest was able to complete, however I got the following warnings:

tests/test_foreground.py::test_foreground
/pymatting/tests/test_foreground.py:32: UserWarning: Tests for GPU implementation skipped, because of missing packages.
"Tests for GPU implementation skipped, because of missing packages."

-- Docs: https://docs.pytest.org/en/stable/warnings.html

I noticed that a similar issue was reported earlier as well, but I couldn't find a conclusion.

I have NVIDIA GPUs, but somehow they are not being detected. I have individually installed CuPy, pyopencl, libcutensor and more. Here is some output on the installed CUDA packages:

root@10e16f343455:/pymatting# dpkg --list | grep cuda
ii  cuda-command-line-tools-10-2  10.2.89-1                           amd64        CUDA command-line tools
ii  cuda-compat-10-2              440.95.01-1                         amd64        CUDA Compatibility Platform
ii  cuda-compiler-10-2            10.2.89-1                           amd64        CUDA compiler
ii  cuda-cudart-10-2              10.2.89-1                           amd64        CUDA Runtime native Libraries
ii  cuda-cudart-dev-10-2          10.2.89-1                           amd64        CUDA Runtime native dev links, headers
ii  cuda-cufft-10-2               10.2.89-1                           amd64        CUFFT native runtime libraries
ii  cuda-cufft-dev-10-2           10.2.89-1                           amd64        CUFFT native dev links, headers
ii  cuda-cuobjdump-10-2           10.2.89-1                           amd64        CUDA cuobjdump
ii  cuda-cupti-10-2               10.2.89-1                           amd64        CUDA profiling tools runtime libs.
ii  cuda-cupti-dev-10-2           10.2.89-1                           amd64        CUDA profiling tools interface.
ii  cuda-curand-10-2              10.2.89-1                           amd64        CURAND native runtime libraries
ii  cuda-curand-dev-10-2          10.2.89-1                           amd64        CURAND native dev links, headers
ii  cuda-cusolver-10-2            10.2.89-1                           amd64        CUDA solver native runtime libraries
ii  cuda-cusolver-dev-10-2        10.2.89-1                           amd64        CUDA solver native dev links, headers
ii  cuda-cusparse-10-2            10.2.89-1                           amd64        CUSPARSE native runtime libraries
ii  cuda-cusparse-dev-10-2        10.2.89-1                           amd64        CUSPARSE native dev links, headers
ii  cuda-driver-dev-10-2          10.2.89-1                           amd64        CUDA Driver native dev stub library
ii  cuda-gdb-10-2                 10.2.89-1                           amd64        CUDA-GDB
ii  cuda-libraries-10-2           10.2.89-1                           amd64        CUDA Libraries 10.2 meta-package
ii  cuda-libraries-dev-10-2       10.2.89-1                           amd64        CUDA Libraries 10.2 development meta-package
ii  cuda-license-10-2             10.2.89-1                           amd64        CUDA licenses
ii  cuda-memcheck-10-2            10.2.89-1                           amd64        CUDA-MEMCHECK
ii  cuda-minimal-build-10-2       10.2.89-1                           amd64        Minimal CUDA 10.2 toolkit build packages.
ii  cuda-misc-headers-10-2        10.2.89-1                           amd64        CUDA miscellaneous headers
ii  cuda-npp-10-2                 10.2.89-1                           amd64        NPP native runtime libraries
ii  cuda-npp-dev-10-2             10.2.89-1                           amd64        NPP native dev links, headers
ii  cuda-nvcc-10-2                10.2.89-1                           amd64        CUDA nvcc
ii  cuda-nvdisasm-10-2            10.2.89-1                           amd64        CUDA disassembler
ii  cuda-nvgraph-10-2             10.2.89-1                           amd64        NVGRAPH native runtime libraries
ii  cuda-nvgraph-dev-10-2         10.2.89-1                           amd64        NVGRAPH native dev links, headers
ii  cuda-nvjpeg-10-2              10.2.89-1                           amd64        NVJPEG native runtime libraries
ii  cuda-nvjpeg-dev-10-2          10.2.89-1                           amd64        NVJPEG native dev links, headers
ii  cuda-nvml-dev-10-2            10.2.89-1                           amd64        NVML native dev links, headers
ii  cuda-nvprof-10-2              10.2.89-1                           amd64        CUDA Profiler tools
ii  cuda-nvprune-10-2             10.2.89-1                           amd64        CUDA nvprune
ii  cuda-nvrtc-10-2               10.2.89-1                           amd64        NVRTC native runtime libraries
ii  cuda-nvrtc-dev-10-2           10.2.89-1                           amd64        NVRTC native dev links, headers
ii  cuda-nvtx-10-2                10.2.89-1                           amd64        NVIDIA Tools Extension
ii  cuda-sanitizer-api-10-2       10.2.89-1                           amd64        CUDA Sanitizer API
hi  libcudnn7                     7.6.5.32-1+cuda10.2                 amd64        cuDNN runtime libraries
ii  libcudnn7-dev                 7.6.5.32-1+cuda10.2                 amd64        cuDNN development libraries and headers
hi  libnccl-dev                   2.7.8-1+cuda10.2                    amd64        NVIDIA Collectives Communication Library (NCCL) Development Files
hi  libnccl2                      2.7.8-1+cuda10.2                    amd64        NVIDIA Collectives Communication Library (NCCL) Runtime

Could you please advise on what package might be missing? Thank you.

[Question❓] Foreground/background estimation for a TensorFlow version

Hi,

Thank you for your amazing repo. I am trying to convert estimate_fg_bg_numpy.py to TensorFlow. However, the inference speed is not satisfactory: on a GTX 1080 Ti GPU, the cupy version only costs 2 ms, while the TensorFlow version costs 20 ms at 144x256 resolution. Do you know how to correctly port the numpy code to TensorFlow? Thank you very much.

import numpy as np
from PIL import Image
import time
import tensorflow as tf


def inv2(mat):
    a = mat[..., 0, 0]
    b = mat[..., 0, 1]
    c = mat[..., 1, 0]
    d = mat[..., 1, 1]

    inv_det = 1 / (a * d - b * c)

    inv00 = inv_det * d
    inv01 = inv_det * -b
    inv10 = inv_det * -c
    inv11 = inv_det * a
    inv00 = inv00[:, tf.newaxis, tf.newaxis]
    inv01 = inv01[:, tf.newaxis, tf.newaxis]
    inv10 = inv10[:, tf.newaxis, tf.newaxis]
    inv11 = inv11[:, tf.newaxis, tf.newaxis]
    inv_temp1 = tf.concat([inv00, inv10], axis=1)
    inv_temp2 = tf.concat([inv01, inv11], axis=1)
    inv = tf.concat([inv_temp1, inv_temp2], axis=2)

    return inv


def pixel_coordinates(w, h, flat=False):
    x, y = tf.meshgrid(np.arange(w), np.arange(h))

    if flat:
        x = tf.reshape(x, [-1])
        y = tf.reshape(y, [-1])

    return x, y


def vec_vec_outer(a, b):
    return tf.einsum("...i,...j", a, b)

def estimate_fb_ml(
        input_image,
        input_alpha,
        min_size=2,
        growth_factor=2,
        regularization=1e-5,
        n_iter_func=2,
        print_info=True,):

    h0, w0 = 144, 256

    # Find initial image size.
    w = int(np.ceil(min_size * w0 / h0))
    h = min_size

    # Generate initial foreground and background from input image
    F = tf.image.resize_nearest_neighbor(input_image[tf.newaxis], [h, w])[0]
    B = F * 1.0
    while True:
        if print_info:
            print("New level of size: %d-by-%d" % (w, h))
        # Resize image and alpha to size of current level
        image = tf.image.resize_nearest_neighbor(input_image[tf.newaxis], [h, w])[0]
        alpha = tf.image.resize_nearest_neighbor(input_alpha[tf.newaxis, :, :, tf.newaxis], [h, w])[0, :, :, 0]
        # Iterate a few times
        n_iter = n_iter_func
        for iteration in range(n_iter):
            x, y = pixel_coordinates(w, h, flat=True) # w: 4, h: 2
            # Make alpha into a vector
            a = tf.reshape(alpha, [-1])
            # Build system of linear equations
            U = tf.stack([a, 1 - a], axis=1)
            A = vec_vec_outer(U, U) # 8 x 2 x 2
            b = vec_vec_outer(U, tf.reshape(image, [w*h, 3])) # 8 x 2 x 3
            # For each neighbor
            for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                x2 = tf.clip_by_value(x + dx, 0, w - 1)
                y2 = tf.clip_by_value(y + dy, 0, h - 1)
                # Vectorized neighbor coordinates
                j = x2 + y2 * w
                # Gradient of alpha
                a_j = tf.nn.embedding_lookup(a, j)
                da = regularization + tf.abs(a - a_j)
                # Update matrix of linear equation system
                A00 = A[:, 0, 0] + da
                A01 = A[:, 0, 1]
                A10 = A[:, 1, 0]
                A11 = A[:, 1, 1] + da
                A00 = A00[:, tf.newaxis, tf.newaxis]
                A01 = A01[:, tf.newaxis, tf.newaxis]
                A10 = A10[:, tf.newaxis, tf.newaxis]
                A11 = A11[:, tf.newaxis, tf.newaxis]
                A_temp1 = tf.concat([A00, A10], axis=1)
                A_temp2 = tf.concat([A01, A11], axis=1)
                A = tf.concat([A_temp1, A_temp2], axis=2)
                # Update rhs of linear equation system
                F_resp = tf.reshape(F, [w * h, 3])
                F_resp_j = tf.nn.embedding_lookup(F_resp, j)
                B_resp = tf.reshape(B, [w * h, 3])
                B_resp_j = tf.nn.embedding_lookup(B_resp, j)
                da_resp = tf.reshape(da, [w * h, 1])
                b0 = b[:, 0, :] + da_resp * F_resp_j
                b1 = b[:, 1, :] + da_resp * B_resp_j
                b = tf.concat([b0[:, tf.newaxis, :], b1[:, tf.newaxis, :]], axis=1)
                # Solve linear equation system for foreground and background
            fb = tf.clip_by_value(tf.matmul(inv2(A), b), 0, 1)

            F = tf.reshape(fb[:, 0, :], [h, w, 3])
            B = tf.reshape(fb[:, 1, :], [h, w, 3])

        # If original image size is reached, return result
        if w >= w0 and h >= h0:
            return F, B

        # Grow image size to next level
        w = min(w0, int(np.ceil(w * growth_factor)))
        h = min(h0, int(np.ceil(h * growth_factor)))

        F = tf.image.resize_nearest_neighbor(F[tf.newaxis], [h, w])[0]
        B = tf.image.resize_nearest_neighbor(B[tf.newaxis], [h, w])[0]



######################################################################
def estimate_foreground_background_tf():
    image_np = np.array(Image.open("./image.png").resize([256, 144]))[:, :, :3] / 255
    alpha_np = np.array(Image.open("./alpha.png").resize([256, 144])) / 255
    image = tf.placeholder(tf.float32, [144, 256, 3])
    alpha = tf.placeholder(tf.float32, [144, 256])
    foreground, background = estimate_fb_ml(image, alpha, n_iter_func=2)
    sess = tf.Session()
    for i in range(10):
        s = time.time()
        sess.run(foreground, feed_dict={image: image_np, alpha: alpha_np})
        e = time.time()
        print("time: ", e - s)


######################################################################
def main():
    estimate_foreground_background_tf()


if __name__ == "__main__":
    main()

Matting without Trimap

Thanks for your amazing library. I would like to know if any method can work with only an RGB input, without a trimap. If not, may I ask why there is such a strong need for trimaps, and why the algorithms have difficulty treating the entire image as the unknown region? I am working on a deep learning trimap-less approach, so I am interested in your thoughts on this issue.

[BUG 🐛] The results of "estimate_foreground_ml" and "estimate_foreground_ml_cupy" differ

Bug description
I estimated the foreground with "pymatting.foreground.estimate_foreground_ml" and "pymatting.foreground.estimate_foreground_ml_cupy", but I found that the results of "estimate_foreground_ml" and "estimate_foreground_ml_cupy" were different, and there is something wrong in the output that estimate_foreground_ml_cupy produced!

Images

  • input image :
    test

  • output image by estimate_foreground_ml:
    human_ml

  • output image by estimate_foreground_ml_cupy:
    human_cupy

  • incorrect region in the estimate_foreground_ml_cupy output: "the ear with black contours"
    image

[Question❓] How to generate trimaps from u2net output?

Hi @99991,

I am using pymatting in my lib. Basically, I use the output of u2net and chain it into pymatting.

I generate the trimap based on foreground and background thresholds followed by an erosion:
https://github.com/danielgatis/rembg/blob/master/src/rembg/bg.py#L16

I have an issue asking about the optimal values for thresholds and erosion size:
danielgatis/rembg#14 (comment)

Is there any technique to generate an optimal trimap based on u2net output?

thanks.
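
For reference, a sketch of the threshold-plus-erosion approach described above, using SciPy (the thresholds and erosion size are illustrative and need tuning per dataset):

import numpy as np
from scipy.ndimage import binary_erosion

def probability_to_trimap(prob, fg_threshold=0.9, bg_threshold=0.1, erosion_size=10):
    # prob: u2net output as a float array in [0, 1]
    structure = np.ones((erosion_size, erosion_size))
    is_fg = binary_erosion(prob > fg_threshold, structure=structure)
    is_bg = binary_erosion(prob < bg_threshold, structure=structure)
    trimap = np.full(prob.shape, 0.5, dtype=np.float64)  # unknown by default
    trimap[is_fg] = 1.0
    trimap[is_bg] = 0.0
    return trimap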

typo in util.py

In file util/util.py, lines 487-491:

    if is_fg.sum() == 0:
        raise ValueError("Trimap did not contain background values (values = 0)")

    if is_bg.sum() == 0:
        raise ValueError("Trimap did not contain foreground values (values = 1)")

It looks like the two error messages should be swapped.
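
For reference, the corrected pairing would read:

    if is_fg.sum() == 0:
        raise ValueError("Trimap did not contain foreground values (values = 1)")

    if is_bg.sum() == 0:
        raise ValueError("Trimap did not contain background values (values = 0)")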

[Question❓] Speedup

Thank you very much for your work.
I want to know whether the GPU can be used to accelerate the matrix operations.
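
For reference, the README lists CUDA and OpenCL implementations of the foreground estimation step, so that part can run on the GPU if cupy or pyopencl is installed. A sketch, using the function name referenced in the bug report above (the exact import location may differ by version; image and trimap are assumed to be loaded already):

from pymatting import estimate_alpha_cf
# Name as referenced in the issue above; may also be exported at the package top level.
from pymatting.foreground.estimate_foreground_ml_cupy import estimate_foreground_ml_cupy

alpha = estimate_alpha_cf(image, trimap)           # runs on the CPU
foreground = estimate_foreground_ml_cupy(image, alpha)  # runs on the GPU via cupy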
