
Scientific computing library for optics, computer graphics and visual perception

Home Page: https://kaanaksit.com/odak

License: Mozilla Public License 2.0

Python 97.78% HTML 0.18% TeX 2.05%
raytracing optics wave-optics jones-calculus cluster-computing computational-optics computational-display computational-imaging computer-graphics holography

odak's Introduction

Odak

Odak (pronounced "O-dac") is the fundamental library for scientific computing in optical sciences, computer graphics, and visual perception. Odak is also the toolkit for the research conducted in Computational Light Laboratory. To learn more about what Odak can do to help your design, experimentation, and development, consult our documentation!

Getting Started

Installing

We encourage users to use virtual environments in their development pipeline when working with or developing odak. You can simply create and activate a virtual environment by using the following syntax:

python3 -m venv odak
source odak/bin/activate

Once activated, you can install odak into the virtual environment using any of the installation methods highlighted below. To deactivate the virtual environment, you can always use the deactivate command in your terminal.

For the most recent guidance on installing Odak, please consult installation documentation. Odak can be installed using pip:

pip3 install odak

or you can install directly from the repository; note that this installs the latest development version:

pip3 install git+https://github.com/kaanaksit/odak --upgrade

or this:

git clone git@github.com:kaanaksit/odak.git
cd odak
pip3 install -r requirements.txt
pip3 install -e .

Usage and examples

You can import Odak and start designing your next project in optics, computer graphics, or perception! We have prepared documentation on usage and much more. Absolute beginners can learn light, computation, and odak with our Computational Light course.

Here is a simple example of raytracing with odak:

import odak
import torch

# Define a ray from two points.
starting_point = torch.tensor([0., 0., 0.])
end_point = torch.tensor([1., 1., 5.])
rays = odak.learn.raytracing.create_ray_from_two_points(
                                                        starting_point,
                                                        end_point
                                                       )

# Define a triangle by its three vertices.
triangle = torch.tensor([[
                          [-5., -5., 5.],
                          [ 5., -5., 5.],
                          [ 0.,  5., 5.]
                         ]])

# Intersect the ray with the triangle.
normals, distance, _, _, check = odak.learn.raytracing.intersect_w_triangle(
                                                                            rays,
                                                                            triangle
                                                                           )
print('intersection point is {}. Surface normal cosines are {}.'.format(normals[0, 0], normals[0, 1]))

Here is a simple example of computer-generated holography with odak:

import odak
import torch


# Propagation settings.
wavelength = 532e-9
pixel_pitch = 8e-6
distance = 5e-3
propagation_type = 'Angular Spectrum'
k = odak.learn.wave.wavenumber(wavelength)


# Build a complex hologram from a square amplitude and a random phase.
amplitude = torch.zeros(500, 500)
amplitude[200:300, 200:300] = 1.
phase = torch.randn_like(amplitude) * 2 * odak.pi
hologram = odak.learn.wave.generate_complex_field(amplitude, phase)


image_plane = odak.learn.wave.propagate_beam(
                                             hologram,
                                             k,
                                             distance,
                                             pixel_pitch,
                                             wavelength,
                                             propagation_type,
                                             zero_padding = [True, False, True]
                                            )
image_intensity = odak.learn.wave.calculate_amplitude(image_plane) ** 2 
odak.learn.tools.save_image(
                            'image_intensity.png', 
                            image_intensity, 
                            cmin = 0., 
                            cmax = 1.
                           )

Here is a simple example of color conversion with odak:

import odak
import torch

input_rgb_image = torch.randn((1, 3, 256, 256))
ycrcb_image = odak.learn.perception.color_conversion.rgb_2_ycrcb(input_rgb_image)
rgb_image = odak.learn.perception.color_conversion.ycrcb_2_rgb(ycrcb_image)

Here is a simple example on deep learning methods with odak:

import odak
import torch


# Sample one-dimensional data: pos_x1 holds input positions, x1 holds targets.
x1 = torch.arange(10).unsqueeze(-1) * 30.
pos_x1 = torch.arange(x1.shape[0]).unsqueeze(-1) * 1.
model_mlp = odak.learn.models.multi_layer_perceptron(
                                                     dimensions = [1, 5, 1],
                                                     bias = False,
                                                     model_type = 'conventional'
                                                    )


optimizer = torch.optim.AdamW(model_mlp.parameters(), lr = 1e-3)
loss_function = torch.nn.MSELoss()
for epoch in range(10000):
    optimizer.zero_grad()
    estimation = model_mlp(pos_x1)
    ground_truth = x1
    loss = loss_function(estimation, ground_truth)
    loss.backward()
    optimizer.step()
print('Training loss: {}'.format(loss.item()))


with torch.no_grad(): # torch.no_grad() must wrap the evaluation; calling it bare has no effect.
    for item_id, item in enumerate(pos_x1):
        ground_truth = x1[item_id]
        estimation = model_mlp(item)
        print('Input: {}, Ground truth: {}, Estimation: {}'.format(item, ground_truth, estimation))

For more of these examples, you can either check our course documentation or visit our unit tests to get inspired.

Sample Projects that use Odak

Here are some sample projects that use odak:

How to cite

To add the link to this repository in your publication, please use Zenodo's citation. If you have used odak in your research project, please consider citing any of the following works:

@inproceedings{akcsit2023flexible,
  title={Flexible modeling of next-generation displays using a differentiable toolkit},
  author={Ak{\c{s}}it, Kaan and Kavakl{\i}, Koray},
  booktitle={Practical Holography XXXVII: Displays, Materials, and Applications},
  volume={12445},
  pages={131--132},
  year={2023},
  organization={SPIE}
}
@inproceedings{kavakli2022introduction,
  title={Introduction to Odak: a Differentiable Toolkit for Optical Sciences, Vision Sciences and Computer Graphics},
  author={Kavakl{\i}, Koray and Ak{\c{s}}it, Kaan},
  booktitle={Frontiers in Optics},
  pages={FTu1A--1},
  year={2022},
  organization={Optica Publishing Group}
}
@incollection{kavakli2022optimizing,
  title={Optimizing vision and visuals: lectures on cameras, displays and perception},
  author={Kavakli, Koray and Walton, David Robert and Antipa, Nick and Mantiuk, Rafa{\l} and Lanman, Douglas and Ak{\c{s}}it, Kaan},
  booktitle={ACM SIGGRAPH 2022 Courses},
  pages={1--66},
  year={2022}
}

odak's People

Contributors

actions-user, aguzel, albertgary, askaradeniz, david-morales-norato, drwalton, jeannebeyazian, kaanaksit, koraykavakli, kymer0615, liangs111, praneethc, rongduo, yaio4109, yilmazdoga, yutaitoh


odak's Issues

Test Output Directory and .gitignore Configuration

Hello,

First, I would like to express my gratitude for this amazing library.

I am preparing a pull request for Odak and encountered an issue when running tests. A significant number of files are generated in the main folder, which clutters the directory.

Proposed Solution:

  1. Test Output Directory: Implement a standard practice where each test saves its results in a designated subfolder within the test directory (a minimal sketch follows this list).
  2. Ignore Configuration: Ensure that this subfolder is included in the .gitignore file to prevent undesired files from being included in commits.
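
A minimal sketch of how the proposed fixture could look, assuming pytest and a hypothetical test/output/ destination; the matching .gitignore entry would then simply be test/output/:

# conftest.py -- hypothetical fixture; test/output/ is an assumed destination.
import os
import pytest


@pytest.fixture
def output_directory():
    directory = os.path.join('test', 'output')
    os.makedirs(directory, exist_ok = True)
    return directory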

Additional Information:

For reference, I have attached the git status output after cloning the main branch and running pytest.

Thank you for considering this enhancement; it will improve the organization of the project. I believe I could refactor the tests, but it will take me some time, and first I want to make a pull request about something else.

Best regards,

David Morales-Norato

On branch light_asm_fresnel_light_propagators
Untracked files:
  (use "git add <file>..." to include in what will be committed)
	Impulse_Response_Fresnel.png
	Transfer_Function_Fresnel.png
	heights.ply
	hologram_intensity.png
	image.png
	image0_cropped.png
	image0_padded.png
	image1_padded.png
	image_0000.png
	image_16_bit_color.png
	image_16_bit_monochrome.png
	image_8_bit_color.png
	image_8_bit_monochrome.png
	image_intensity.png
	out.ply
	output.ply
	output_amplitude.png
	output_amplitude_0.png
	output_amplitude_1.png
	output_hologram.png
	phase.png
	reconstruction.png
	reconstruction_image_000.png
	reconstruction_image_001.png
	reconstruction_image_002.png
	status.txt
	target.png
	targets_0000.png
	test/heights.pt

nothing added to commit but untracked files present (use "git add" to track)

Pytorch and numpy/cupy comparison for beam propagation

Currently, beam propagation functions of odak.wave and odak.learn.wave give different outputs for the same input field.

As discussed in #7 and #10, these are some of the functions that we can compare:

  • odak.wave.set_amplitude and odak.learn.set_amplitude,
  • fftn and ifftn in torch against fft2 and ifft in numpy and cupy.

An example output for comparison from test_learn_beam_propagation.py with the Bandlimited Angular Spectrum method:

AssertionError: 
Arrays are not almost equal to 3 decimals

Mismatched elements: 250000 / 250000 (100%)
Max absolute difference: 9.45276863
Max relative difference: 129.77451864
 x: array([[-301.293   +2.111j,  317.095 -225.884j, -209.758 +613.112j, ...,
         232.588  +45.735j,  250.056  +82.582j,  181.349 +147.647j],
       [-239.635  -60.857j, -221.058 +125.061j,   11.215 -475.677j, ...,...
 y: array([[-294.892   -1.521j,  310.806 -225.954j, -204.181 +611.055j, ...,
         230.09   +46.909j,  252.016  +83.05j ,  176.521 +149.347j],
       [-245.039  -56.122j, -215.765 +124.03j ,    6.634 -472.519j, ...,...
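
A minimal sketch of such a comparison on a shared random field, assuming the torch.fft module of recent pytorch releases (the tolerance mirrors the assertion above):

import numpy as np
import torch

# Use the same complex field for both backends.
field_numpy = np.random.rand(500, 500) + 1j * np.random.rand(500, 500)
field_torch = torch.from_numpy(field_numpy)

# Compare the two-dimensional forward transforms.
result_numpy = np.fft.fft2(field_numpy)
result_torch = torch.fft.fftn(field_torch, dim = (-2, -1)).numpy()
np.testing.assert_array_almost_equal(result_numpy, result_torch, decimal = 3)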

Band-limited angular spectrum method

It would be good if we had an alternative beam propagation method in the wave and learn submodules.

The Band-Limited Angular Spectrum Method is described in this paper and also discussed here. It seems to work well in practical applications.
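
A minimal numpy sketch of the band-limiting idea, assuming a square field and the frequency limit derived in the paper; the function and variable names are illustrative:

import numpy as np

def band_limited_angular_spectrum_kernel(resolution, pixel_pitch, wavelength, distance):
    # Spatial frequency grid.
    fx = np.fft.fftfreq(resolution, d = pixel_pitch)
    FX, FY = np.meshgrid(fx, fx, indexing = 'ij')
    # Standard angular spectrum transfer function.
    argument = 1. / wavelength ** 2 - FX ** 2 - FY ** 2
    H = np.exp(1j * 2 * np.pi * distance * np.sqrt(np.maximum(argument, 0.)))
    # Band limit from the paper: frequencies beyond this alias for the
    # given distance and field size, so they are zeroed out.
    length = resolution * pixel_pitch
    f_limit = 1. / (wavelength * np.sqrt((2. * distance / length) ** 2 + 1.))
    H[(np.abs(FX) > f_limit) | (np.abs(FY) > f_limit)] = 0.
    return H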

Finding the nearest point on a geometric ray with respect to an another geometric ray

A definition is needed to find the nearest point on a geometric ray with respect to another geometric ray. Praneeth Chakravarthula provided this piece of code in the past for this purpose:

import numpy as np

def findNearestPoints(vec1, vec2, ray):
    # Written by Praneeth Chakravarthula.
    # Refer to the concept of skew lines and line-plane intersection for the following math.
    p1 = vec1[0].reshape(3,)
    d1 = vec1[1].reshape(3,)
    p2 = vec2[0].reshape(3,)
    d2 = vec2[1].reshape(3,)
    # Normal to both direction vectors.
    n = np.cross(d1, d2)
    if np.all(n == 0):
        # The cross product vanishes, so the directions are parallel;
        # fall back to the intersection routine.
        point, distances = ray.CalculateIntersectionOfTwoVectors(vec1, vec2)
        c1 = c2 = point
    else:
        # Normal to the plane formed by vectors n and d1.
        n1 = np.cross(d1, n)
        # Normal to the plane formed by vectors n and d2.
        n2 = np.cross(d2, n)
        # The nearest point to vec2 along vec1 is the intersection of
        # vec1 with the plane formed by vec2 and normal n.
        c1 = p1 + (np.dot((p2 - p1), n2) / np.dot(d1, n2)) * d1
        # The nearest point to vec1 along vec2 is the intersection of
        # vec2 with the plane formed by vec1 and normal n.
        c2 = p2 + (np.dot((p1 - p2), n1) / np.dot(d2, n1)) * d2
    return c1, c2

The raytracing submodule can take advantage of this definition. The ask is to place this piece of code into ray.py, which can be found at odak/raytracing/ray.py in Odak. The steps are:

  • Fork odak on github.
  • Add the definition to odak/raytracing/ray.py.
  • Commit on your fork.
  • Start a pull request attached to this issue.

Thanks!

Zero padding and cropping in beam propagation

While propagating coherent optical beams with Odak using odak.wave.propagate_beam or odak.learn.wave.propagate_beam, we often have to zero pad the input field in space first and crop the propagated field in space afterwards. Rather than rewriting this in every project, it makes perfect sense to have a flag variable as an input to the beam propagation functions.
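
A minimal torch sketch of the pad-propagate-crop pattern that such a flag would wrap (the propagation step is elided):

import torch

field = torch.ones(500, 500, dtype = torch.complex64)

# Zero pad the field to double its size before propagation.
m, n = field.shape
padded = torch.zeros(m * 2, n * 2, dtype = field.dtype)
padded[m // 2 : m // 2 + m, n // 2 : n // 2 + n] = field

# ... propagate `padded` here ...

# Crop the propagated field back to the original size.
cropped = padded[m // 2 : m // 2 + m, n // 2 : n // 2 + n]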

A question about calculating the pooling area in make_pooling_size_map_pixels

If I understand it right, in order to calculate the major_axis of the ellipse here
https://github.com/kaanaksit/odak/blob/master/odak/learn/perception/foveation.py#L130
a multiplication is required according to simple trigonometric function as
major_axis = (torch.tan(angle_max) - torch.tan(angle_min)) * real_viewing_distance
But a division is used in the your code.
I'm wondering if it is a bug or I misunderstand your code.

Adding Stochastic Gradient Descent based hologram calculation

Odak supports torch based hologram calculation routines, but odak.learn.wave lacks a definition that can calculate holograms using Stochastic Gradient Descent. The latest and greatest in combining torch and odak is at a branch called torch_1_8_0; at the time of writing this issue, any development should go on top of that branch. A minimal sketch of the optimization loop follows the list below.

  • Add a stochastic_gradient_descent definition to odak.learn.wave.classical.
  • Add a test script to the test folder to run a test for the newly added stochastic_gradient_descent definition.
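
A minimal sketch of the loop such a definition could wrap, assuming the odak.learn.wave calls shown earlier in this README; Adam stands in here for any torch optimizer, including torch.optim.SGD, and the hyperparameters are illustrative:

import odak
import torch

wavelength = 532e-9
pixel_pitch = 8e-6
distance = 5e-3
k = odak.learn.wave.wavenumber(wavelength)
target = torch.zeros(500, 500)
target[200:300, 200:300] = 1.

# Optimize the phase so that the propagated intensity matches the target.
phase = torch.zeros_like(target, requires_grad = True)
optimizer = torch.optim.Adam([phase], lr = 1e-1)
loss_function = torch.nn.MSELoss()
for step in range(200):
    optimizer.zero_grad()
    hologram = odak.learn.wave.generate_complex_field(torch.ones_like(target), phase)
    reconstruction = odak.learn.wave.propagate_beam(
                                                    hologram,
                                                    k,
                                                    distance,
                                                    pixel_pitch,
                                                    wavelength,
                                                    'Angular Spectrum'
                                                   )
    loss = loss_function(odak.learn.wave.calculate_amplitude(reconstruction) ** 2, target)
    loss.backward()
    optimizer.step()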

Thinning the dependencies

I want to drop dependencies as much as possible. The current list of dependencies can be found here. This issue will track the progress towards dropping dependencies.

Wide-window angular spectrum method for diffraction propagation in far and near field

According to this paper, although the bandlimited angular spectrum method is valid for a wider propagation range than the angular spectrum method, its accuracy decreases when the propagation distance is larger than 40 times the wavelength. The same paper explains the reasoning as follows: the band limits of the anti-aliasing filter are chosen according to the propagation distance of a beam, and at large distances the chosen limits degrade the accuracy of the calculation. The paper claims superior accuracy with respect to the strict solution, so replicating it would be helpful in the long run.

Loading images with Odak in a normalized manner

Odak has two functions to load images as Numpy and Torch variables: odak.tools.load_image and odak.learn.tools.load_image. Given that many of the recipes within Odak use normalized values between zero and one, it makes perfect sense to load images in a normalized way by default rather than representing them in 8-bit pixel depth.
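
A minimal sketch of the normalization in question, assuming an 8-bit input image; a 16-bit image would be divided by 65535 instead:

import numpy as np
from PIL import Image

image = np.asarray(Image.open('input.png')).astype(np.float64)
# Map 8-bit pixel values into the zero-to-one range.
image = image / 255.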

Fresnel and Fraunhofer beam propagator with a test routine

Recently, a beam propagator was added to Odak, containing Fresnel Impulse Response (FIR), Fresnel Transfer Function (FTF), and Fraunhofer propagation routines. They are all added to the odak.wave.classical script that can be found here.

A test routine is also added accordingly, as seen here.

The outcome of the test routine can be visualized as three images: the initial field, the propagated field, and the back propagated field at the source. The same code is also visualized with a sample image.

Migrating learn submodule to pytorch 1.8.0

Currently, some methods in the learn submodule use external functions for some operations. The current list of these functions is as follows:

  • fftshift in toolkit.py
  • ifftshift in toolkit.py

These functions are included in the fft module of pytorch 1.8.0. We can update them when pytorch 1.8.0 becomes stable; a minimal sketch of the replacement follows below.
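
A minimal sketch of the replacement, assuming the external functions follow the usual roll-based implementation:

import torch

field = torch.randn(4, 4, dtype = torch.complex64)

# Roll-based implementation, as typically carried in a toolkit module.
def fftshift(tensor):
    shifts = (tensor.shape[-2] // 2, tensor.shape[-1] // 2)
    return torch.roll(tensor, shifts = shifts, dims = (-2, -1))

# Built-in equivalent available from pytorch 1.8.0 onwards.
assert torch.allclose(fftshift(field), torch.fft.fftshift(field, dim = (-2, -1)))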

Fraunhofer beam propagation

Fraunhofer beam propagation simulations are essential when working with the far field. Odak needs a verified Fraunhofer beam propagation code; the current one is here:

https://github.com/kunguz/odak/blob/59a1c71978d677edd2bf611c30bb497833348fba/odak/wave/classical.py#L47-L80

@praneethc stated that he can help with providing a verified Fraunhofer beam propagation code. His future code can replace the one provided above.

There is also a test routine that can be used for testing Fraunhofer beam propagation code. Once implemented, the line below can be changed in this script from IR Fresnel to Fraunhofer, and the code can be tested this way.
https://github.com/kunguz/odak/blob/59a1c71978d677edd2bf611c30bb497833348fba/test/test_beam_propagation.py#L12

To visualize the outcome, the following lines can be uncommented within the same test script:
https://github.com/kunguz/odak/blob/59a1c71978d677edd2bf611c30bb497833348fba/test/test_beam_propagation.py#L36-L43

How to set up parallel computing

Holographic calculation and Sommerfeld diffraction integration take a lot of time. Parallel computation could speed this up, but I do not know how to set it up.

Lens phase function generation

odak needs a lens phase function generation routine to generate phase patterns for various lenses (quadratic phase functions); a minimal sketch follows the steps below. The steps required to implement it are as follows:

  • Fork odak,
  • In your own fork, implement a phase function generation routine in odak.wave, specifically here.
  • Add a test routine named test_lens_phase.py to the test folder found here. Other test routines, such as this one, can help in writing a test routine for this case.
  • Commit all the changes to your fork.
  • Once you are done at your fork, add a pull request to main repository of odak using this link.

Do not hesitate to ask any questions during your implementation.
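
A minimal numpy sketch of the quadratic phase function in question, assuming a thin lens in the paraxial approximation, where $\phi(x, y) = -\frac{k (x^2 + y^2)}{2 f}$:

import numpy as np

def quadratic_phase_function(resolution, pixel_pitch, wavelength, focal_length):
    # Spatial coordinates centered on the optical axis.
    x = (np.arange(resolution) - resolution / 2.) * pixel_pitch
    X, Y = np.meshgrid(x, x, indexing = 'ij')
    # Quadratic (paraxial) phase of a thin lens with the given focal length.
    k = 2 * np.pi / wavelength
    phase = -k * (X ** 2 + Y ** 2) / (2 * focal_length)
    return np.exp(1j * phase)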

AttributeError: 'display_color_hvs' object has no attribute 'rgb_to_lms'

Hello,

I intended to use the __call__ function of the display_color_hvs class in odak.learn.perception.color_conversion.py. However, since the self.rgb_to_lms function no longer exists, I encountered the following error:

lms_image_second = self.rgb_to_lms(input_image.to(self.device))
AttributeError: 'display_color_hvs' object has no attribute 'rgb_to_lms'

To reproduce the issue, I have prepared an isolated example below:

import torch
from PIL import Image
from torchvision import transforms
from odak.learn.perception.color_conversion import display_color_hvs

image_path = 'data/parrot.png'
preprocess = transforms.Compose([
    transforms.ToTensor(),
])

image = Image.open(image_path)
tensor_image = preprocess(image)

dc_hvs = display_color_hvs(read_spectrum = 'default')
loss = dc_hvs(tensor_image, tensor_image)

With the current design, is there a workaround to obtain equivalent functionality?

Gerchberg-Saxton phase retrieval method

This issue will track the work on migrating code from @askaradeniz that implements Gerchberg-Saxton to Odak. There are two veins to this migration. One of them is migrating the code in a way that is suitable to work with Numpy and Cupy. The second deals with the torch implementation, which I believe @askaradeniz can immediately initiate as his code is already applicable to the torch case.

  • The Numpy/Cupy case will be hosted in odak/wave/classical.py,
  • The torch case will be hosted in odak/learn/classical.py.

We will also add test cases to the test folder for both methods; a minimal sketch of one iteration follows below.
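
A minimal torch sketch of a single Gerchberg-Saxton iteration, assuming the odak.learn.wave calls shown earlier in this README and that calculate_phase mirrors calculate_amplitude:

import odak

def gerchberg_saxton_step(field, target_amplitude, k, distance, pixel_pitch, wavelength):
    # Propagate to the image plane and impose the target amplitude there.
    reconstruction = odak.learn.wave.propagate_beam(
                                                    field,
                                                    k,
                                                    distance,
                                                    pixel_pitch,
                                                    wavelength,
                                                    'Angular Spectrum'
                                                   )
    phase = odak.learn.wave.calculate_phase(reconstruction)
    constrained = odak.learn.wave.generate_complex_field(target_amplitude, phase)
    # Propagate back to the hologram plane; the caller keeps only the phase.
    return odak.learn.wave.propagate_beam(
                                          constrained,
                                          k,
                                          -distance,
                                          pixel_pitch,
                                          wavelength,
                                          'Angular Spectrum'
                                         )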

Double phase hologram method

The paper titled Reconstruct Holographic 3D Objects by Double Phase Hologram describes a phase-only hologram calculation method using double phase information. It can either be realized using two phase-only Spatial Light Modulators (SLMs) or with an interferometric method that modulates the input field twice with one phase-only SLM. Adding support for this method to Odak is a good idea, as it has been used in the literature in recent years; a minimal sketch of the decomposition follows the list below.

  • The Numpy/Cupy case will be hosted in odak/wave/classical.py,
  • The torch case will be hosted in odak/learn/classical.py,
  • Add test cases to the test folder for both implementations.
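
A minimal torch sketch of the decomposition itself: a complex field $A e^{i \phi}$ with amplitude normalized to the unit range splits into two phase-only components $\phi \pm \cos^{-1}(A)$, since $\frac{1}{2}(e^{i(\phi + \theta)} + e^{i(\phi - \theta)}) = A e^{i \phi}$ for $\theta = \cos^{-1}(A)$:

import torch

def double_phase(amplitude, phase):
    # Normalize the amplitude to the unit range so that arccos is defined.
    normalized = amplitude / amplitude.max()
    offset = torch.acos(normalized)
    phase_low = phase - offset
    phase_high = phase + offset
    return phase_low, phase_high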

Incorrect Kernel Equations in `odak.learn.wave.classical`

Hello,

I have been working with the light propagators in your library and noticed a discrepancy in the equations for the kernels of the angular spectrum and Fresnel methods within odak.learn.wave.classical.

Issue Details:

According to "Fourier Optics" by Goodman (2007), the Fresnel transfer function should be as follows:

$H = e^{i k z} \, e^{-i \pi \lambda z (f_x^2 + f_y^2)}$

Note that there is no square root function because of the small angle approximation.

However, the equation implemented in the library for the Fresnel kernel is:

$H = e^{i \cdot z \cdot \frac{2\pi}{\lambda} \cdot \sqrt{1 - (\lambda \cdot f_x)^2 - (\lambda \cdot f_y)^2}} $

This equation corresponds to the kernel for the angular spectrum method.

Proposed Solution:

To correct this issue, the equation for the Fresnel transfer function should be updated to match the standard form without the square root, reflecting the small angle approximation. I will make a pull request about it. A minimal sketch of the two kernels is below.
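
A minimal numpy sketch of the two transfer functions side by side, following Goodman's definitions (the frequency grid here is illustrative):

import numpy as np

wavelength = 532e-9
distance = 5e-3
k = 2 * np.pi / wavelength

fx = np.fft.fftfreq(500, d = 8e-6)
FX, FY = np.meshgrid(fx, fx, indexing = 'ij')

# Angular spectrum transfer function (carries the square root).
H_angular_spectrum = np.exp(
    1j * k * distance * np.sqrt(1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2 + 0j)
)
# Fresnel transfer function (small angle approximation, no square root).
H_fresnel = np.exp(1j * k * distance) * np.exp(
    -1j * np.pi * wavelength * distance * (FX ** 2 + FY ** 2)
)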

References:

  • Goodman, J. W. (2007). Introduction to Fourier Optics.

Thank you for your attention to this matter.

Best regards,
David Morales-Norato

Wirtinger hologram generation routine

This issue will track the implementation of the Wirtinger hologram generation routine described in the paper Wirtinger holography for near-eye displays, published by Chakravarthula et al. (@praneethc) at SIGGRAPH 2019. Here is the link for a copy of that paper. There are two veins to this migration. One of them is migrating the code in a way that is suitable to work with Numpy and Cupy. The second deals with the torch implementation.

  • The Numpy/Cupy case will be hosted in odak/wave/classical.py,
  • The torch case will be hosted in odak/learn/classical.py.

Test cases will also be added to the test folder for both methods.

Implementing beam propagation in pytorch

It would be useful to have a beam propagation function written in pytorch for deep learning.

We can rewrite the Fresnel and Fraunhofer beam propagators defined here with torch.
