
mipi_camera's Introduction

Driver deprecation declaration

Raspberry Pi is transitioning from a legacy camera software stack based on proprietary Broadcom GPU code to an open-source stack based on libcamera. This driver is deprecated on Bullseye and later systems; it can only be used on the legacy Buster release or earlier.

Introduction

MIPI cameras are widely used nowadays in smartphones and on many open-source platforms such as the Raspberry Pi and the Nvidia Jetson series boards. In order to add support for more MIPI cameras (or other video streaming devices with a MIPI interface) on these maker hardware platforms, while keeping the complex MIPI interface and protocol hidden from the user, the Arducam team developed several camera drivers and demo programs based on different driver frameworks and their implementations.

Driver Framework

MIPI camera driver is a closed-source userland camera driver with no kernel version dependency. It supports any MIPI camera module from Arducam. Since this driver only supports RAW sensors, it receives RAW images without hardware ISP processing, so users have to implement a software ISP in their own code.
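For a concrete picture of the workflow this implies, here is a minimal capture sketch using the Python wrapper, with the colour conversion done in software via OpenCV. The method names are taken from the demo code quoted in the issues below and may differ between releases, so treat this as an illustration rather than a reference:

import cv2
import arducam_mipicamera as arducam

def align_up(value, alignment):
    # round up to the next multiple of alignment (buffer rows are padded)
    return (value + alignment - 1) & ~(alignment - 1)

camera = arducam.mipi_camera()
camera.init_camera()
camera.set_resolution(1280, 720)            # arbitrary example resolution
camera.software_auto_exposure(enable=True)
fmt = camera.get_format()
width = align_up(fmt.get("width"), 32)
height = align_up(fmt.get("height"), 16)

frame = camera.capture(encoding="i420")     # request an I420 frame from the library
image = frame.as_array.reshape(int(height * 1.5), width)
image = cv2.cvtColor(image, cv2.COLOR_YUV2BGR_I420)   # convert I420 to BGR for OpenCV
cv2.imwrite("frame.jpg", image)
camera.close_camera()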

Jetvariety is a V4L2 kernel camera driver framework for the Nvidia Jetson platform that can support any MIPI camera Arducam provides. A single camera driver for all modules is the main goal of the Jetvariety project: users don't need to develop their own camera driver for Nvidia Jetson boards, and they can even switch between different Arducam cameras without switching drivers. Software compatibility of the Jetvariety V4L2 driver is another consideration of this project. Arducam_OBISP_MIPI_Camera_Module uses this driver on Jetson.

Similar to Jetvariety, Pivariety follows the same idea of providing a single camera driver for all modules on the Raspberry Pi platform. It is also a V4L2 kernel driver, for software compatibility with most popular media player software and OpenCV. Arducam_OBISP_MIPI_Camera_Module also uses this driver on Raspberry Pi.
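Because both Jetvariety and Pivariety register the camera as a standard V4L2 node (typically /dev/video0), existing V4L2 software can open it without any Arducam-specific code. A minimal sketch with OpenCV's V4L2 backend (RAW sensors may still need a conversion step, as the Jetvariety utilities in the issues below show):

import cv2

cap = cv2.VideoCapture(0, cv2.CAP_V4L2)   # open /dev/video0 through the V4L2 backend
ok, frame = cap.read()
if ok:
    cv2.imwrite("v4l2_frame.png", frame)
cap.release()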

  • Libcamera-arducam for Raspberry Pi

Coming soon.

mipi_camera's People

Contributors

arducam edward-ardu gernot-hochmuth glddiv hermann-sw iliesaya ketai-dhr laxmanchoudhary ljp-while nstrmb uctronics


mipi_camera's Issues

Impossible to open the camera

Hello,
I bought the IMX298 camera module a few weeks ago and am trying to use it.
I installed the complete OpenCV git on my Raspberry Pi Zero W, then I downloaded the driver for the Arducam MIPI camera, following the instructions.
When I launch an example with ./preview, it returns:
Open camera...
init camera status = -1

Could you help me please ?

Manual White Balance (Brightness Parameter)

Kindly solve the issue (or gap) in your API regarding the camera's manual white balance parameter, since the auto mode is quite bizarre and not appropriate for several situations.

Get Full Circle Image with Lower resolution

Here are supported formats of my stereo camera hat:

pi@raspberrypi:~/MIPI_Camera/RPI $ ./list_format 
Found sensor ov5647 at address 36
mode: 0, width: 2592, height: 1944, pixelformat: pBAA, desc: (null)
mode: 1, width: 1920, height: 1080, pixelformat: pBAA, desc: (null)
mode: 2, width: 1280, height: 960, pixelformat: pBAA, desc: (null)
mode: 3, width: 1280, height: 720, pixelformat: pBAA, desc: (null)
mode: 4, width: 640, height: 480, pixelformat: pBAA, desc: (null)
mode: 5, width: 5184, height: 1944, pixelformat: pGAA, desc: Used for Arducam Synchronized stereo camera HAT, 2592x1944 * 2
mode: 6, width: 3840, height: 1080, pixelformat: pGAA, desc: Used for Arducam synchronized stereo camera HAT, 1920x1080 * 2
mode: 7, width: 2592, height: 972, pixelformat: pGAA, desc: Used for Arducam synchronized stereo camera HAT, 1296x972 * 2
mode: 8, width: 2560, height: 720, pixelformat: pBAA, desc: Used for Arducam synchronized stereo camera HAT, 1280x720 * 2
mode: 9, width: 2592, height: 730, pixelformat: pGAA, desc: Used for Arducam synchronized stereo camera HAT, 1296x730 * 2
mode: 10, width: 1280, height: 480, pixelformat: pGAA, desc: Used for Arducam synchronized stereo camera HAT, 640x480 * 2
index: 0, CID: 0x00980914, desc: V4L2_CID_HFLIP, min: 0, max: 1, default: 0, current: 0
index: 1, CID: 0x00980915, desc: V4L2_CID_VFLIP, min: 0, max: 1, default: 0, current: 0
close camera status = 0

If I execute ./arducamstill -t 2000 -m 5 -e jpg -o test_5.jpg, I get this image:
image

But if I execute ./arducamstill -t 2000 -m 9 -e jpg -o test_9.jpg, the image is cropped:
image

Is there any way to get the full circle images with lower resolution?
Especially, with the python wrapper?

Need to compile libarducam_mipicamera.so ?

Hello!
I'm trying to build these test programs on my RPi4 running Ubuntu 18.04.
But when I run make I get the error message:

/usr/bin/ld: skipping incompatible //usr/lib/libarducam_mipicamera.so when searching for -larducam_mipicamera

I did run the install, and I can see the file there; I can ls the above path+name.

Do I need to compile that library because I'm on Ubuntu?

Cheers
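The "skipping incompatible" message from the linker usually means an architecture mismatch rather than a missing file: the prebuilt libarducam_mipicamera.so is most likely built for 32-bit Raspbian (armhf), so a 64-bit (aarch64) Ubuntu userland cannot link against it. A quick generic check:

import platform
import subprocess

# 'aarch64' indicates a 64-bit userland, 'armv7l' a 32-bit one
print(platform.machine())
# 'file' reports the architecture the shared library was built for
print(subprocess.check_output(["file", "/usr/lib/libarducam_mipicamera.so"]).decode())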

arducam B0174 make error

I'm trying to install the Arducam B0174 but am having some issues: arducamstill can take a photo and save it to a JPG file, but I don't have any live preview.

pi@raspberrypi:~/MIPI_Camera/RPI $ make clean && make

rm -f .o
rm -f video4cameras preview_setMode arducamstill capture video list_format capture_raw raw_callback yuv_callback read_write_sensor_reg ov9281_external_trigger 2MPGlobalShuterDemo preview-camera0 preview-dualcam capture-dualcam video2stdout capture2opencv qrcode_detection opencvGui
gcc -I. -g -O0 -std=gnu11 -o video4cameras video4cameras.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o preview_setMode preview_setMode.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o arducamstill arducamstill.c -larducam_mipicamera -lpthread
arducamstill.c: In function ‘main’:
arducamstill.c:408:50: warning: passing argument 3 of ‘pthread_create’ from incompatible pointer type [-Wincompatible-pointer-types]
 int ret = pthread_create(&processCmd_pt, NULL, prcessCmd, &processData);
                                                ^~~~~~~~~

In file included from arducamstill.c:9:
/usr/include/pthread.h:236:15: note: expected ‘void * (*)(void *)’ but argument is of type ‘void (*)(PROCESS_STRUCT *)’ {aka ‘void (*)(struct <anonymous> *)’}
       void *(*__start_routine) (void *),
             ~~~~~~~~^~~~~~~~~~~~~~~~~~~

gcc -I. -g -O0 -std=gnu11 -o capture capture.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o video video.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o list_format list_format.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o capture_raw capture_raw.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o raw_callback raw_callback.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o yuv_callback yuv_callback.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o read_write_sensor_reg read_write_sensor_reg.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o ov9281_external_trigger ov9281_external_trigger.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o 2MPGlobalShuterDemo 2MPGlobalShuterDemo.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o preview-camera0 preview-camera0.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o preview-dualcam preview-dualcam.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o capture-dualcam capture-dualcam.c -larducam_mipicamera -lpthread
gcc -I. -g -O0 -std=gnu11 -o video2stdout video2stdout.c -larducam_mipicamera -lpthread
g++ -I. -g -std=gnu++11 `pkg-config --cflags --libs opencv` -o capture2opencv capture2opencv.cpp -larducam_mipicamera -lpthread
g++ -I. -g -std=gnu++11 `pkg-config --cflags --libs opencv` -o qrcode_detection qrcode_detection.cpp -larducam_mipicamera -lpthread -lzbar
g++ -I. -g -std=gnu++11 `pkg-config --cflags --libs opencv` -o opencvGui opencvGui.cpp -larducam_mipicamera -lpthread

Please advise; as of now this camera is just electronic trash if the software cannot be compiled on a fresh Raspbian on an RPi 4.

OV9281 and Raspberry pi 4

Hi,

HW: OV9281 and Raspberry Pi 4B 4GB

SW: fresh Buster install + deps

Could you please provide some info about this issue:

pi@raspberrypi:/home/pi/MIPI_Camera/RPI/python $ python capture.py
Open camera...
init_camera: Unexpected result.
pi@raspberrypi:/home/pi/MIPI_Camera/RPI/python $ i2cdetect -y -r 0
0 1 2 3 4 5 6 7 8 9 a b c d e f
00: -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: 60 -- -- -- -- -- -- -- -- -- -- -- 6c -- -- --
70: 70 -- -- -- -- -- -- --
pi@raspberrypi:/home/pi/MIPI_Camera/RPI $ ./capture2opencv
Open camera...
init camera status = -1
pi@raspberrypi:/home/pi/MIPI_Camera/RPI $ ./ov9281_external_trigger
Open camera...
init camera status = -1
pi@raspberrypi:/home/pi/MIPI_Camera/RPI $ ./preview-camera0
Open camera...
Found sensor ov9281 at address 60

Could you please provide a solution? Thanks in advance.

Note: seems ok on raspberry pi 3B

Jetvariety kernel driver for Xavier NX and L4T revision 4.3

The current 4.2 driver does not work:

jetson@xavier:~$ sudo dpkg -i arducam-nvidia-l4t-kernel_4.9.140-32.4.2-20200520093606_arm64.deb
Selecting previously unselected package arducam-nvidia-l4t-kernel.
(Reading database ... 173761 files and directories currently installed.)
Preparing to unpack arducam-nvidia-l4t-kernel_4.9.140-32.4.2-20200520093606_arm64.deb ...
Unpacking arducam-nvidia-l4t-kernel (4.9.140-32.4.2-20200520093606) ...
dpkg: dependency problems prevent configuration of arducam-nvidia-l4t-kernel:
 arducam-nvidia-l4t-kernel depends on nvidia-l4t-kernel (= 4.9.140-tegra-32.4.2-20200428140713); however:
  Version of nvidia-l4t-kernel on system is 4.9.140-tegra-32.4.3-20200625213407.

dpkg: error processing package arducam-nvidia-l4t-kernel (--install):
 dependency problems - leaving unconfigured
Errors were encountered while processing:
 arducam-nvidia-l4t-kernel
jetson@xavier:~$ uname -a
Linux xavier 4.9.140-tegra #1 SMP PREEMPT Thu Jun 25 21:22:12 PDT 2020 aarch64 aarch64 aarch64 GNU/Linux
jetson@xavier:~$ cat /etc/nv_tegra_release
# R32 (release), REVISION: 4.3, GCID: 21589087, BOARD: t186ref, EABI: aarch64, DATE: Fri Jun 26 04:34:27 UTC 2020
jetson@xavier:~$

Nvidia just released L4T revision 4.3; how can we get the drivers?

IMX298 unusual behaviour

I bought the IMX298, tested it, and developed code which works. At home everything works flawlessly, but at the customer's workshop I cannot even shoot a simple picture. Running my program or any other Arducam sample results in a freeze; I lose the SSH connection and have to restart.
This behaviour of the camera is very bizarre. I even ordered another camera (same model), which also works flawlessly at home but doesn't work at the customer's site.
Can anybody give me help?

tvservice-client: Failed to connect to TV service: -1

My program's goal is to take pictures when a sensor is triggered. I use the latest Raspbian version and I have done all updates. It saves pictures directly to a network drive. My program works fine for days and then suddenly it gets to “tvservice-client: Failed to connect to TV service: -1”. Only a reboot seems to work; restarting the program won't help. But how do I catch that error? I have tried running the program in lxterminal and systemd.

preview-camera0 SegmentationFault on CM3+ devkit

Running the preview-camera0 executable gives

Open camera...
segfault

on a CM3+ dev kit with an OV7251 camera.

I have followed the steps here. Are there any other steps to take other than what has been mentioned in this link?

raw10 to jpg tool

Hello,

Could you please provide some assistance to use your code :)

I just want to load our data set (RAW10) from python3/opencv2, but your tool seems bogus...

Cheers,

pi@raspberrypi:~/MIPI_Camera/RPI/python $ python3 capture_raw.py 
Open camera...
Found sensor ov9281 at address 60
Setting the resolution...
Can't open the file
mmal: Failed to fix lens shading, use the default mode!
Current resolution is (640, 400)
Reset the focus...
reset_control: Unexpected result.
The camera may not support this control.
Enable Auto Exposure...
Enable Auto White Balance...
software_auto_white_balance: Unexpected result.
Close camera...
pi@raspberrypi:~/MIPI_Camera/RPI/python $ cd ../utils/
pi@raspberrypi:~/MIPI_Camera/RPI/utils $ python3 mipi_raw10_to_jpg.py ../python/640x400.raw 640x400.jpg 640 400 gray
Notice:This program only support csi-2 raw 10bit packet data convert to jpg.

Traceback (most recent call last):
  File "mipi_raw10_to_jpg.py", line 74, in <module>
    img = unpack_mipi_raw10(remove_padding(data, width, height, 10)).reshape(height, width, 1)
  File "mipi_raw10_to_jpg.py", line 48, in remove_padding
    buff = buff.reshape(align_height, align_width)
ValueError: cannot reshape array of size 256000 into shape (400,800)
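For reference, CSI-2 RAW10 packs four pixels into five bytes, so a 640x400 RAW10 frame should be 640 * 400 * 10 / 8 = 320000 bytes before any row padding, and the tool's reshape expects 800 bytes per 640-pixel row. The 256000-byte array in the traceback is exactly 640 * 400 bytes, i.e. one byte per pixel, which suggests the file was saved as 8-bit data rather than packed RAW10. A generic unpacking sketch for correctly packed data (an illustration, not the repository's tool, and it ignores per-row padding):

import numpy as np

def unpack_mipi_raw10(packed):
    # CSI-2 RAW10: every 5 bytes carry 4 pixels (4 MSB bytes followed by 1 LSB byte)
    data = packed.reshape(-1, 5).astype(np.uint16)
    msb = data[:, :4]                                  # upper 8 bits of each pixel
    lsb = (data[:, 4:5] >> (2 * np.arange(4))) & 0x3   # two low bits per pixel
    return ((msb << 2) | lsb).reshape(-1)

raw = np.fromfile("640x400.raw", dtype=np.uint8)
assert raw.size == 640 * 400 * 10 // 8, "not packed RAW10 at this resolution"
img16 = unpack_mipi_raw10(raw).reshape(400, 640)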

error in "make clean && make"

While just following the instructions in the README file, I get the following error after 'make clean && make':

g++ -I. -g -std=gnu++11 `pkg-config --cflags --libs opencv` -o capture2opencv capture2opencv.cpp -larducam_mipicamera -lpthread
/usr/bin/ld: //usr/lib/arm-linux-gnueabihf/libavcodec.so.58: undefined reference to `bcm_host_is_fkms_active'
collect2: error: ld returned 1 exit status
make: *** [Makefile:45: capture2opencv] Fout 1

I am using the new RPi 4 model.

Exposure (Focus) & white balance (brightness) Adjustment for Arducam 16 MP cam module

How do I set manual values for exposure or focus and white balance or brightness? It seems that if we do not position the camera appropriately, it doesn't provide anything better than a poor quality image. I have tried multiple times to set the index and focus_val attributes, but it doesn't help. Kindly make a Python-based application for the Raspbian platform, or at least provide enough information to edit the focus and brightness values by hand, because there is no way to adjust the parameters during a photography session.

Exposure failure with ov9281_external_trigger.c

I am trying to run a handful of ArduCam's OV9281 modules with a synchronized external trigger, but I am running into issues when I run the ov9281_external_trigger.c code to change the trigger mode.

At the moment the write_regs() function is called (tested by changing the argument of the preceding usleep() function), the camera exposure seems to drop to nearly zero, outputting all black except when pointed at a bright light source. The below images are taken of my ceiling, before and after the change.

Reducing the clock pulse frequency (supplied via pigpio on a Raspberry Pi separate from the one controlling the camera) from 60Hz to 30Hz returns the camera to normal exposure but results in alternating frames of normal video and all black--presumably because the software is expecting twice as many frames as it is receiving.

My best guess is that at 60Hz, the camera's exposure isn't complete when the next trigger pulse comes in, leading to the described failure, but manually changing V4L2_CID_EXPOSURE and V4L2_CID_GAIN has no effect, regardless of the status of arducam_software_auto_exposure() and arducam_set_resolution(). Is there an undocumented loss of user control over exposure time associated with the register changes in write_regs()?
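As a rough sanity check of that timing hypothesis, a small sketch (the exposure and readout figures are assumptions for illustration, not measured OV9281 values):

trigger_hz = 60.0
frame_period_ms = 1000.0 / trigger_hz   # ~16.7 ms between trigger pulses
exposure_ms = 10.0                      # assumed exposure time
readout_ms = 9.0                        # assumed sensor readout time
if exposure_ms + readout_ms > frame_period_ms:
    print("trigger pulses arrive before the previous frame has finished")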

Other diagnostics notes:

  • I have tried swapping out both the OV9281 and the Raspberry Pi it is connected to with no change
  • I have checked the clock signal with an oscilloscope and it is a clean 3.3V pulse at the expected frequency

Screenshot from 2019-08-02 11-37-58
Screenshot from 2019-08-02 11-38-34

parameter exposure stereo hat / ros

hi

I wrote a ros wrapper for the arducam stereo hat and I was wondering if you could "expose" more parameters from the library.
My particular interest is (if possible) the binning factor, in order to stream lower resolution images but higher fps.
The currently implemented formats (1600x600, mode 7 if I am not wrong) are still too big for any real-time operation in ROS / OpenCV.

Also: why are only specific formats available? Is there any documentation on this?

Any comment / help is hugely appreciated !

P.

How is it that the hardware is out, but the software featured prominently on the advertising blog and video is not posted anywhere here on GitHub under the Arducam account?

There are virtually no instructions for building and installing the driver, but I am experienced, so that much was a non-issue. But no matter what, why is there no reference to preprocess.sh or loaddriver.sh?

Crucially, I don't have any /dev/video/* handles.

So why aren't these new products even listed on the main ArduCam website?
Why do the buried product pages for them have blank documentation and software pages?

Please just post the proper code here on GitHub.

Originally posted by @UberEclectic in ArduCAM/RaspberryPi#7 (comment)

Cannot install Mipi driver on Jetson Nano with JetPack 4.4

The instructions to get going with the Jetvariety camera boards (UC-599) don't work on a brand new Jetson Nano with the latest downloaded SD card image. See below.

$ sudo dpkg -i ./arducam-nvidia-l4t-kernel_4.9.140-32.3.1-20200325151719_arm64.deb
dpkg: regarding .../arducam-nvidia-l4t-kernel_4.9.140-32.3.1-20200325151719_arm64.deb containing arducam-nvidia-l4t-kernel, pre-dependency problem:
arducam-nvidia-l4t-kernel pre-depends on nvidia-l4t-ccp-t210ref
nvidia-l4t-ccp-t210ref is not installed.

dpkg: error processing archive ./arducam-nvidia-l4t-kernel_4.9.140-32.3.1-20200325151719_arm64.deb (--install):
pre-dependency problem - not installing arducam-nvidia-l4t-kernel
Errors were encountered while processing:
./arducam-nvidia-l4t-kernel_4.9.140-32.3.1-20200325151719_arm64.deb

$ uname -a
Linux nano 4.9.140-tegra #1 SMP PREEMPT Wed Apr 8 18:10:49 PDT 2020 aarch64 aarch64 aarch64 GNU/Linux
$ cat /etc/nv_tegra_release
R32 (release), REVISION: 4.2, GCID: 20074772, BOARD: t210ref, EABI: aarch64, DATE: Thu Apr 9 01:22:12 UTC 2020

Errors in arducam_mipicamera.py

Hello,
in the MIPI_Camera/RPI/python/arducam_mipicamera.py

lines 182-184:
arducam_stop_preview = camera_lib.arducam_stop_preview
arducam_start_preview.argtypes = [c_void_p]
arducam_start_preview.restype = c_int

should be:
arducam_stop_preview = camera_lib.arducam_stop_preview
arducam_stop_preview.argtypes = [c_void_p]
arducam_stop_preview.restype = c_int

It also seems that the functions for the rgain and bgain values are missing:
arducam_get_gain
arducam_manual_set_awb_compensation

How to set exposure on 18MP AR1820HS using the python script.

I am doing

sudo python3 rw_sensor.py -d 0 -r 12306 -v 500

and also setting it to 100, but my captures get exactly the same exposure.

Here's my python code that's capturing the image.

import cv2
import argparse
from utils import ArducamUtils
import RPi.GPIO as GPIO
import _thread
import uuid
import time
import v4l2


def fourcc(a, b, c, d):
    return ord(a) | (ord(b) << 8) | (ord(c) << 16) | (ord(d) << 24)


def pixelformat(string):
    if len(string) != 3 and len(string) != 4:
        msg = "{} is not a pixel format".format(string)
        raise argparse.ArgumentTypeError(msg)
    if len(string) == 3:
        return fourcc(string[0], string[1], string[2], ' ')
    else:
        return fourcc(string[0], string[1], string[2], string[3])


def show_info(arducam_utils):
    _, firmware_version = arducam_utils.read_dev(ArducamUtils.FIRMWARE_VERSION_REG)
    _, sensor_id = arducam_utils.read_dev(ArducamUtils.FIRMWARE_SENSOR_ID_REG)
    _, serial_number = arducam_utils.read_dev(ArducamUtils.SERIAL_NUMBER)
    print("Firmware Version: {}".format(firmware_version))
    print("Sensor ID: 0x{:04X}".format(sensor_id))
    print("Serial Number: 0x{:08X}".format(serial_number))


# for executing a thread to save the image.
def save_image(data):
    cv2.imwrite("/home/dlinano/photos/imagetest" + str(uuid.uuid1()) + ".png", data)
    print("finished execution")

# Initialize the camera
print("Initializing camera")
images = []
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
# set pixel format
if not cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc('Y', '1', '6', ' ')):
    print("Failed to set pixel format.")

arducam_utils = ArducamUtils(0)

show_info(arducam_utils)
# turn off RGB conversion
if arducam_utils.convert2rgb == 0:
    cap.set(cv2.CAP_PROP_CONVERT_RGB, arducam_utils.convert2rgb)

# set width
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 4896)
# set height
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 3684)

ret, frame = cap.read()


print("Starting demo now! Press CTRL+C to exit")

print("Capture")
ret, frame = cap.read()
frame = arducam_utils.convert(frame)
_thread.start_new_thread(save_image, tuple([frame]))

I am really new to using these kinds of cameras, so sorry if there is something obvious I missed here. Also, I am using the Jetson version of the camera with the Jetson drivers. What is the best way to get proper control of the camera settings in Python?
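With the Jetvariety V4L2 driver, exposure and gain are usually adjusted through V4L2 controls on the /dev/video0 node rather than by writing sensor registers directly. A hedged sketch using the same v4l2 bindings the script above already imports; whether the driver maps exposure to V4L2_CID_EXPOSURE, and in which units, is an assumption, so list the available controls first (e.g. with v4l2-ctl):

import fcntl
import os
import v4l2

# Assumed control id and value; check the driver's actual control list first.
fd = os.open("/dev/video0", os.O_RDWR)
ctrl = v4l2.v4l2_control(id=v4l2.V4L2_CID_EXPOSURE, value=500)
fcntl.ioctl(fd, v4l2.VIDIOC_S_CTRL, ctrl)
os.close(fd)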

Autofocus on a specific point or region

With the Autofocus cameras (Specifically [B0163] 13MP Autofocus IMX135) is focusing on a specific point or region supported? I do not see that anywhere in example code or documentation.

Thanks!

Gstreamer support for Jetvariety cameras

I'm finding the lack of code examples for the jetvariety cameras I bought for the Jetson Nano hard to swallow compared to the Raspberry Pi's examples and support. The lack of examples wouldn't be as bad if the cameras simply worked with the Nvidia examples here: https://github.com/dusty-nv/jetson-inference

However, pretty much every Nvidia example and tool for inference (the whole point of using Arducams with the Jetson platform) uses gstreamer (1.14.5.0) pipelines and those all fail with jetvariety cameras (/dev/video0 and /dev/video1). See screenshot below.

Neither gst-launch-1.0 with v4lsrc nor nvarguscamerasrc works...
Is this a bug? If not, how can I use GStreamer with Jetvariety cameras?

gstreamer-error

OV9281 Raw 10bit data

I'm currently evaluating the OV9281 sensor (Arducam B0162) in combination with a Raspberry Pi for machine vision applications.
Therefore we need uncompressed image data (RAW) with 10-bit resolution. Additionally, frame capturing should be externally triggered to synchronize the global shutter sensor with a light source.
Your example applications are working fine (changing resolution, gain, exposure, flip, ...).

Further investigation shows that the API functionality is very limited (arducam_mipicamera.h).

For example, I'm not able to switch to a raw format:
arducam_mipicamera.h line 193:
@note Currently supported image formats are:
IMAGE_ENCODING_JPEG, IMAGE_ENCODING_I420.

Also, I can't find any way to enable 10-bit resolution.

Is it correct that the API restricts configuration to a handful of setups of resolution, exposure, gain and horizontal/vertical flip, plus two compressed image formats?

Is there any possibility to get RAW 8-bit and/or 10-bit data to the Raspberry Pi using the MIPI interface and the Arducam?

TypeError: VideoCapture() takes at most 1 argument (2 given)

I'm trying to run the examples, after fixing the TypeError: unsupported operand type(s) for +: 'range' and 'list' error, but still no luck.

Traceback (most recent call last):
  File "arducam_displayer.py", line 74, in <module>
    cap = cv2.VideoCapture(args.device, cv2.CAP_V4L2)
TypeError: VideoCapture() takes at most 1 argument (2 given)

I'm using Python 3.6.9 and python3-opencv arm64 3.2.0+dfsg-4ubuntu0.1

Any hints?

Side-question: can we use AR1820HS in gstreamer?

Edit: it works if one removes the second argument.
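A small compatibility sketch that keeps the script working on both OpenCV builds (the two-argument form with an explicit backend only exists in newer OpenCV releases):

import cv2

device = 0  # /dev/video0
try:
    cap = cv2.VideoCapture(device, cv2.CAP_V4L2)   # newer OpenCV: explicit V4L2 backend
except TypeError:
    cap = cv2.VideoCapture(device)                 # older OpenCV (e.g. 3.2): single argument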

Driver for Xavier NX doesn't work

Hi, thank you for your recent effort to launch a camera driver for the Jetson Xavier NX.
I'm trying to connect an OV2311 camera to the Xavier NX using an Arducam board, but currently it's not working.

I can't find any /dev/video* device after I installed the driver and rebooted the system.
My kernel version is [4.9.140] and the L4T release number is [R32 revision 4.2].
I'm not sure if this is related to the driver or to a mistake in my process of connecting the camera.
If you have any suggestions, I would really appreciate them.

Need additional Jetvariety drivers

Hello! I'm wondering if there are plans to release the Jetvariety driver for other and older versions of the L4T kernel.

Specifically, perhaps including one for 4.9.140-32.2.0 would be helpful, as it would pair with the excellent deep learning tutorial by Adrian Rosebrock, which uses the Jetson 4.2 image:
tutorial

I had the misfortune of spending a day installing and configuring opencv, tensorflow, keras, etc., only to find that the specific version he used is not supported by the Arducam drivers. I can't help but feel that there will be other beginners out there who will make the same mistake and struggle to make their configuration match a supported kernel.

Thank you very much!

Connecting monochromatic cameras through MIPI/CSI-2

Recently I found some information about your project on the arducam-website.

I was curious how well monochromatic cameras could be integrated into the current MIPI/CSI-2 interface provided by the Raspberry Pi. I see a growing number of people demanding such a feature for low-photon imaging in astronomy and microscopy.
As far as I understand, one could use your USB-to-camera interface to connect a sensor to the Raspberry Pi. But would it also be possible to use your driver in order to achieve this with a dedicated camera PCB/connector?

You can see a discussion on Twitter.

I saw that the Arducam 10MP MT9J001 has a monochrome chip - but I need to use the USB converter shield in order to use it with the Raspberry Pi, correct?

Thanks for the great project! I love your camera-multiplexer! :)

Best
Bene

Ethernet Link Down on OV9281 Cam Access on Raspi 4

I'm using an ArduCam OV9281 with a Raspberry Pi 4 B 4GB. The camera is working fine. However, on first access to the camera, the wired Ethernet interface loses its link. It is not possible to re-activate the link by re-plugging the cable or running "ifconfig eth0 up". After a reboot, the Ethernet link works again - until the next camera access.

Please see the attached text files for details.

If there is any information missing or there is something I should try in my setup, please let me know.

boot_config.txt
cam_access.txt
dmesg.txt
info.txt
lsmod.txt
uname_a.txt

Separate this repo to two for Raspi and Jetson

Jetson is good! But in a bad network environment, I failed to clone this repo several times, even with depth=1. I don't need the drivers for Jetson (165MB) since what I want is just the RPI folder (12MB). Please consider separating this into two individual repos!

Arducam IMX 298 on Jetson Nano

Hello team of ArduCAM!
Today my Arducam IMX298 (https://www.amazon.com/Arducam-Camera-IMX298-Raspberry-Plugged/dp/B07W6LTFZC) arrived. I'm trying to use it on an Nvidia Jetson Nano through the CSI port, but apparently it can't be recognized. I read that it is compatible with the RPi, but I wanted to try it on the Jetson Nano because I'm doing some deep learning research; I've used other CSI cameras but none of them have the quality that this Arducam IMX298 has, and I really wanted to try it.
Is it possible to make it work on my Jetson Nano?
Thanks in advance!

OV9281 occasional capture timeout on CM3

I am using a CM3 which interfaces with the OV9281 using a custom carrier board.

I have a python script based heavily off of the capture2opencv.py demo - it sits in a while loop capturing images.

I have set my gpu allocation as following:

pi@raspberrypi:~ $ vcgencmd get_mem gpu
gpu=128M

My wiring is a little bit different from a standard CM3 carrier, so I have reflected this change in my setup:

self._camera = arducam.mipi_camera()

# define hw wiring for OV9281
self.camera_interface = arducam.CAMERA_INTERFACE()
self.camera_interface.i2c_bus = 0
self.camera_interface.camera_num = 1
self.camera_interface.sda_pins = (28, 0)
self.camera_interface.led_pins = (44, 5)
self.camera_interface.shutdown_pins = (44,5)

self._camera.init_camera2(self.camera_interface)
self._fmt = self._camera.set_resolution(640, 400)
self._camera.software_auto_exposure(enable = True)

My main loop is pretty much identical to capture2opencv.py:

 while True:
            # update frame
            frame = self._camera.capture(encoding = 'i420')
            height = int(self.align_up(fmt['height'], 16))
            width = int(self.align_up(fmt['width'], 32))
            image = frame.as_array.reshape(int(height * 1.5), width)
            image = cv2.cvtColor(image, cv2.COLOR_YUV2BGR_I420)

My problem is that after 50 or so images, the arducam_capture() method will time out and return a NullPtr. (Definition located here)

My current workaround is the following very hacky code to catch the buffer being a null pointer. It just reinitialises the camera, which then operates normally for another ~50 images before the error occurs again.

try:
    image = frame.as_array.reshape(int(height * 1.5), width)
except ValueError as e:
    print("ValueError: {}".format(e))
    self._camera.close_camera()
    self._camera.init_camera2(self.camera_interface)
    continue                                         

I have been using this camera with a 3B+ for several weeks and haven't had this problem.

Support for OV9281 on Jetson Nano

I bought an OV9281 for evaluation with a Jetson Nano. I checked the Jetvariety repo and discovered that there is still no support for it in the device tree.

Please could you give me the relevant section of the device tree for the Jetson Nano and OV9281?

I can build my own device tree but you would save me a lot of trouble if you could just give me the .DTS

Thank you.

No image from sensor - NULL pointer access

Dear friends,
I managed to set up the hardware (stereo HAT with 2x 8MP Camera V2) and the software necessary to proceed with the Arducam, but I cannot manage to get any image preview. When I run the Python demo, I get a NULL pointer access although I can see
Found sensor imx219 at address 10
Can somebody help me please?
Thanks

IMX477 B0251 cannot handle 120fps when IMX219 B0207 can ?

Hello,

I'm trying the IMX477 B0251 camera on a Xavier NX. First, let me say that the image quality is very good, compared to IMX219 B0207, congrats!

However, my application requires 120FPS.

I was able to do it successfully on a Jetson Xavier NX + IMX219 B0207 with the following command line:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1280, height=720,framerate=120/1' ! nvvidconv ! xvimagesink

However with the IMX477 camera, I get the following error:

ARGUS_ERROR: Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute: 679 Frame Rate specified is greater than supported

I am pretty sure the CSI bandwidth is sufficient, since I lowered the image size, and since it was working with the older camera.

Is there a solution for this?

Thanks

IMX477 FPS driver limit

First of all thank you for compiling the driver for the sensor, much appreciated!

Originally when we bought the Arducam IMX477 cameras the description stated:

 Full resolution@60fps(normal), 4K2K@60fps, 1080p@240fps

As far as I understand, it is a limitation of the driver now, since the arducam version uses all 4 lanes and could handle the bandwidth.

Since the driver is open-source, can you please do the necessary changes to unlock the full potential of the camera and enable those modes too?

Thank you in advance!

exposure > 65535

Is the maximum exposure time imposed by the hardware or just because it's a 16-bit int? I am using the IMX298 MIPI on an RPi and need long exposure times. Since mobile phones using this sensor usually go up to 32s, I was hoping to achieve the same. Could libarducam_mipicamera.so simply be rebuilt with a larger exposure variable?

MJPEG video

In this example an H264 video is captured and streamed.
I would like to do the same but with MJPEG video through gstreamer. Is it possible?

Moreover I would like to know if it is possible to set the framerate.

Frame rate read is 51 fps while Capture is 134fps

The components I am using for my Arducam are a UC-545 Rev.C (OV7251) camera, a UC-489 Rev.C MIPI adaptor, and a UC-425 Rev.C USB shield. Normally, read shows 49 fps, but when an object is brought very close to the camera the read goes over 100 fps. I want the read to be stable around 130 fps. I have attached the pictures and cfg files below.

49
image

;**************************************************************************************/
; ----- camera parameter -----
; [camera parameter]	Camera parameter set for USB2.0 & USB3.0 mode
;
; -- Keyname description:
; CFG_MODE  = set the config mode for camera: 0 -> User define(UI)
;											  1 -> This config File
; TYPE      = set the name of the camera module
; SIZE		= set the width and height of the image generated by camera 
; BIT_WIDTH = set the bit width of the image generated by camera 
; FORMAT    = set the format of the image generated by camera:
;				| 0 -> RAW	  | 1 -> RGB565  | 2 -> YUV422   | 3 -> JPG  | 4 -> MONO  | 5 -> ST_RAW	| 6 -> ST_MONO |  
;				| -- 0 -> RG  | -- 0 -> RGB  | -- 0 -> YUYV  |           | 			  | -- 0 -> RG  | 			   |  
;				| -- 1 -> GR  | -- 1 -> BGR  | -- 1 -> YVYU  |           | 			  | -- 1 -> GR  | 			   |  
;				| -- 2 -> GB  |              | -- 2 -> UYVY  |           |			  | -- 2 -> GB  |			   |  
;				| -- 3 -> BG  |              | -- 3 -> VYUY  |           | 			  | -- 3 -> BG  | 			   |  
; I2C_MODE  = set the bit width of the address and data of I2C communication: 
;				0 ->  8 bit address &  8 bit value				
;				1 ->  8 bit address & 16 bit value
;				2 -> 16 bit address &  8 bit value
;				3 -> 16 bit address & 16 bit value		
; I2C_ADDR  = set the I2C address for register config of camera 
; G_GAIN    = set the address for green1_gain register config of camera	( RAW & RGB565 & ST_RAW mode )
; B_GAIN    = set the address for blue_gain register config of camera	( RAW & RGB565 & ST_RAW mode )
; R_GAIN    = set the address for red_gain register config of camera	( RAW & RGB565 & ST_RAW mode )
; G2_GAIN   = set the address for green2_gain register config of camera	( RAW & ST_RAW mode )
; Y_GAIN    = set the address for Y_gain register config of camera		( YUV422 mode )
; U_GAIN    = set the address for U_gain register config of camera		( YUV422 mode )
; V_GAIN    = set the address for V_gain register config of camera		( YUV422 mode )
; GL_GAIN   = set the address for global_gain register config of camera
; 
; -- Keyname format:
; CFG_MODE  = <value1>							;<comment>
; TYPE      = <value1>
; SIZE		= <width>, <height>
; BIT_WIDTH = <bitWidth>
; FORMAT    = <value1>[, <value2>]
; I2C_MODE  = <value1>
; I2C_ADDR  = <i2cAddress> 
; G_GAIN    = [<page>,] <address>, <minValue>, <maxValue>
; B_GAIN    = [<page>,] <address>, <minValue>, <maxValue>
; R_GAIN    = [<page>,] <address>, <minValue>, <maxValue>
; G2_GAIN   = [<page>,] <address>, <minValue>, <maxValue>
; Y_GAIN    = [<page>,] <address>, <minValue>, <maxValue>
; U_GAIN    = [<page>,] <address>, <minValue>, <maxValue>
; V_GAIN    = [<page>,] <address>, <minValue>, <maxValue>
; GL_GAIN   = [<page>,] <address>, <minValue>, <maxValue>
; 
; <valueN>		Index value representing certain meanings 
; <width>		Width of the image generated by camera
; <height>		Height of the image generated by camera
; <bitWidth>	Bit width of the image generated by camera
; <i2cAddress>	I2C address for register config of camera
; <page>        Optional address space for this register. Some sensors (mostly SOC's)
;               have multiple register pages (see the sensor spec or developers guide)
; <address>     The register address 
; <minValue>	Minimale value of certain address
; <maxValue>	Maximale value of certain address
; <comment>    	Some form of C-style comments are supported in this .cfg file
; 
;**************************************************************************************/
[camera parameter]
CFG_MODE  = 1	
TYPE      = OV7251
SIZE      = 640, 480
BIT_WIDTH = 8 
FORMAT    = 4, 0
I2C_MODE  = 2					
I2C_ADDR  = 0xC0	

;**************************************************************************************/
; ----- board parameter -----
;[board parameter]					Board parameter set for USB2.0 & USB3.0 mode	
;[board parameter][dev2]			Board parameter set for USB2.0 mode
;[board parameter][dev3][inf2]		Board parameter set for USB3.0 mode and USB2.0 interface
;[board parameter][dev3][inf3]		Board parameter set for USB3.0 mode and USB3.0 interface
;
; -- Keyname description:
; VRCMD = set board parameter by vendor command 
; 
; -- Keyname format:
; VRCMD = <command>, <value>, <index>, <dataNumber>[, <data1>[, <data2>[, <data3>[, <data4>]]]] 		//<comment>
;
; <command>    Value representing certain command 
; <value>      value representing certain meanings
; <index>      Index representing certain meanings   
; <dataNumber> Number of <dataN>
; <dataN>      Data representing certain meanings
; <comment>    Some form of C-style comments are supported in this .cfg file
;
;**************************************************************************************/
[board parameter]
VRCMD = 0xD7, 0x4600, 0x0100, 1, 0x05
VRCMD = 0xD7, 0x4600, 0x0200, 1, 0x00
VRCMD = 0xD7, 0x4600, 0x0300, 1, 0x40
VRCMD = 0xD7, 0x4600, 0x0400, 1, 0x00
VRCMD = 0xD7, 0x4600, 0x0A00, 1, 0x02
VRCMD = 0xD7, 0x4600, 0x0C00, 1, 0x80
VRCMD = 0xD7, 0x4600, 0x0D00, 1, 0x02
VRCMD = 0xD7, 0x4600, 0x0E00, 1, 0x80
VRCMD = 0xD7, 0x4600, 0x0F00, 1, 0x01
VRCMD = 0xD7, 0x4600, 0x1000, 1, 0xE0
VRCMD = 0xD7, 0x4600, 0x1100, 1, 0x07
VRCMD = 0xD7, 0x4600, 0x2300, 1, 0x03

VRCMD = 0xF6, 0x0000, 0x0000, 3, 0x03, 0x04, 0x0C

[board parameter][dev2]


[board parameter][dev3][inf2]
VRCMD = 0xF3, 0x0000, 0x0000, 0           
VRCMD = 0xF9, 0x0000, 0x0000, 0   //8 bit
;VRCMD = 0xF9, 0x0001, 0x0000, 0    //12 bit

[board parameter][dev3][inf3]
VRCMD = 0xF3, 0x0000, 0x0000, 0           
VRCMD = 0xF9, 0x0000, 0x0000, 0   //8 bit
;VRCMD = 0xF9, 0x0001, 0x0000, 0    //12 bit

;**************************************************************************************/
; ----- register parameter -----
;[register parameter]					Register parameter set for USB2.0 & USB3.0 mode	
;[register parameter][dev2]				Register parameter set for USB2.0 mode
;[register parameter][dev3][inf2]		Register parameter set for USB3.0 mode and USB2.0 interface
;[register parameter][dev3][inf3]		Register parameter set for USB3.0 mode and USB3.0 interface
;
; -- Keyname description:
; REG    = assign a new register value
; DELAY  = delay a certain amount of milliseconds before continuing
;
; -- Keyname format:
; REG    = [<page>,] <address>, <value>             //<comment>
; DELAY  = <milliseconds>
;
; <page>         Optional address space for this register. Some sensors (mostly SOC's)
;                have multiple register pages (see the sensor spec or developers guide)
; <address>      the register address
; <value>        the new value to assign to the register
; <milliseconds> wait for this ammount of milliseconds before continuing 
; <comment>      Some form of C-style comments are supported in this .cfg file
;
;**************************************************************************************/
[register parameter]
;DELAY  = 0x100												 
;REG    = 0x10, 0x00FF
;BITSET = 0x10, 0xF000
;BITCLR = 0x10, 0x8000

REG = 0x0100, 0x00 
REG = 0x3005, 0x00 
REG = 0x3012, 0xC0 
REG = 0x3013, 0xD2 
REG = 0x3014, 0x04 
REG = 0x3016, 0xF0 
REG = 0x3017, 0xF0 
REG = 0x3018, 0xF0 
REG = 0x301A, 0xF0 
REG = 0x301B, 0xF0 
REG = 0x301C, 0xF0 
REG = 0x3023, 0x07 
REG = 0x3037, 0xF0 
REG = 0x3098, 0x04 
REG = 0x3099, 0x36	//PLL   
REG = 0x309A, 0x05 
REG = 0x309B, 0x04 
REG = 0x30B0, 0x0A 
REG = 0x30B1, 0x01 
REG = 0x30B3, 0x70
REG = 0x30B4, 0x03 
REG = 0x30B5, 0x05 
REG = 0x3106, 0x12 
REG = 0x3500, 0x00 
REG = 0x3501, 0x1F 
REG = 0x3502, 0x80 
REG = 0x3503, 0x07 
REG = 0x3509, 0x10 
REG = 0x350B, 0x28  //AEC AGC ADJ
REG = 0x3600, 0x1C 
REG = 0x3602, 0x62 
REG = 0x3620, 0xB7 
REG = 0x3622, 0x04 
REG = 0x3626, 0x21 
REG = 0x3627, 0x30 
REG = 0x3634, 0x41 
REG = 0x3636, 0x00 
REG = 0x3662, 0x01 
REG = 0x3664, 0xF0 
REG = 0x3669, 0x1A 
REG = 0x366A, 0x00 
REG = 0x366B, 0x50 
REG = 0x3705, 0xC1 
REG = 0x3709, 0x40 
REG = 0x373C, 0x08 
REG = 0x3742, 0x00 
REG = 0x3788, 0x00 
REG = 0x37A8, 0x01 
REG = 0x37A9, 0xC0 
REG = 0x3800, 0x00 
REG = 0x3801, 0x04 
REG = 0x3802, 0x00 
REG = 0x3803, 0x04 
REG = 0x3804, 0x02 
REG = 0x3805, 0x8B 
REG = 0x3806, 0x01 
REG = 0x3807, 0xEB 
REG = 0x3808, 0x02 
REG = 0x3809, 0x80 
REG = 0x380A, 0x01 
REG = 0x380B, 0xE0 
REG = 0x380C, 0x03 
REG = 0x380D, 0xA0 
REG = 0x380E, 0x02 
REG = 0x380F, 0x04 
REG = 0x3810, 0x00 
REG = 0x3811, 0x04 
REG = 0x3812, 0x00 
REG = 0x3813, 0x05 
REG = 0x3814, 0x11 
REG = 0x3815, 0x11 
REG = 0x3820, 0x44 
REG = 0x3821, 0x00 
REG = 0x382F, 0xC4 
REG = 0x3832, 0xFF 
REG = 0x3833, 0xFF 
REG = 0x3834, 0x00 
REG = 0x3835, 0x05 
REG = 0x3837, 0x00 
REG = 0x3B80, 0x00 
REG = 0x3B81, 0xA5 
REG = 0x3B82, 0x10 
REG = 0x3B83, 0x00 
REG = 0x3B84, 0x08 
REG = 0x3B85, 0x00 
REG = 0x3B86, 0x01 
REG = 0x3B87, 0x00 
REG = 0x3B88, 0x00 
REG = 0x3B89, 0x00 
REG = 0x3B8A, 0x00 
REG = 0x3B8B, 0x05 
REG = 0x3B8C, 0x00 
REG = 0x3B8D, 0x00 
REG = 0x3B8E, 0x00 
REG = 0x3B8F, 0x1A 
REG = 0x3B94, 0x05 
REG = 0x3B95, 0xF2 
REG = 0x3B96, 0x40 
REG = 0x3C00, 0x89 
REG = 0x3C01, 0xAB 
REG = 0x3C02, 0x01 
REG = 0x3C03, 0x00 
REG = 0x3C04, 0x00 
REG = 0x3C05, 0x03 
REG = 0x3C06, 0x00 
REG = 0x3C07, 0x05 
REG = 0x3C0C, 0x00 
REG = 0x3C0D, 0x00 
REG = 0x3C0E, 0x00 
REG = 0x3C0F, 0x00 
REG = 0x4001, 0xC2 
REG = 0x4004, 0x04 
REG = 0x4005, 0x20 
REG = 0x404E, 0x01 
REG = 0x4300, 0xFF 
REG = 0x4301, 0x00 
REG = 0x4600, 0x00 
REG = 0x4601, 0x4E 
REG = 0x4801, 0x0F 		//ECC, PH order
REG = 0x4806, 0x0F 
REG = 0x4819, 0xAA 
REG = 0x4823, 0x3E 
REG = 0x4837, 0x19 
REG = 0x4A0D, 0x00 
REG = 0x5000, 0x85 
REG = 0x5001, 0x80 
REG = 0x3503, 0x07 
REG = 0x3662, 0x03      //RAW8
 
REG = 0x30B0, 0x08
REG = 0x30B4, 0x06

;REG = 0x5E00, 0x8C      //test pattern bar


DELAY  =  0x10

[register parameter][dev3][inf2]


[register parameter][dev3][inf3]

[board parameter]
;;REM	*********************************************		
;;REM	Start up sequence		
;;REM	*********************************************		
;REM	**************************************************		
;REM	TC358746XBG Software Reset		
;REM	**************************************************		
//REG = 0x0002, 0x0001	//SYSctl, S/W Reset
VRCMD = 0xE1, 0x1C00, 0x0002, 2, 0x00, 0x01
DELAY = 	10		
//REG = 0x0002, 0x0000	//SYSctl, S/W Reset release
VRCMD = 0xE1, 0x1C00, 0x0002, 2, 0x00, 0x00
			
;REM	**************************************************		
;REM	TC358746XBG PLL,Clock Setting		
;REM	**************************************************		
//REG = 0x0016, 0x3077	//PLL Control Register 0 (PLL_PRD,PLL_FBD)
VRCMD = 0xE1, 0x1C00, 0x0016, 2, 0x10, 0x63
//REG = 0x0018, 0x0403	//PLL_FRS,PLL_LBWS, PLL oscillation enable
VRCMD = 0xE1, 0x1C00, 0x0018, 2, 0x04, 0x03
DELAY = 10		
//REG = 0x0018, 0x0413	//PLL_FRS,PLL_LBWS, PLL clock out enable
VRCMD = 0xE1, 0x1C00, 0x0018, 2, 0x04, 0x13
//REG = 0x0020, 0x0011	//CLK control register: Clock divider setting
VRCMD = 0xE1, 0x1C00, 0x0020, 2, 0x00, 0x11
			
;REM	**************************************************		
;REM	TC358746XBG MCLK Output		
;REM	**************************************************		
//REG = 0x000C, 0x0101	//MCLK duty setting
VRCMD = 0xE1, 0x1C00, 0x000C, 2, 0x01, 0x01
			
;REM	**************************************************		
;REM	TC358746XBG GPIO2,1 Control (Example)		
;REM	**************************************************		
;REM	0010	FFF9	GPIO Direction, GPIO2,1 output
;REM	0014	0000	GPIO output data. GPIO2="L", GPIO1="L"
;REM	000E	0006	GPIO enable. GPIO2,1 enable
;REM	0014	0006	GPIO output data. GPIO2="H", GPIO1="H"
			
;REM	**************************************************		
;REM	TC358746XBG Format configuration, timing Setting		
;REM	**************************************************		
//REG = 0x0060, 0x800A	//PHY timing DELAY =  setting
VRCMD = 0xE1, 0x1C00, 0x0060, 2, 0x80, 0x11
//REG = 0x0006, 0x0032	//FIFO control
VRCMD = 0xE1, 0x1C00, 0x0006, 2, 0x00, 0x64
//REG = 0x0008, 0x0011	//Data format control
VRCMD = 0xE1, 0x1C00, 0x0008, 2, 0x00, 0x01
//REG = 0x0004, 0x8145	//Configuration control
VRCMD = 0xE1, 0x1C00, 0x0004, 2, 0x81, 0x44


[register parameter]
REG = 0x0100, 0x01 

errors and fixes for python wrapper

Hello! Just sharing some stuff here in case other people encounter the same problem as I did.

Running the python wrapper on python3 throws a few errors, mostly because the v4l2 python wrapper is quite old and has not been updated for python3. The bug fix is available here.

Also, running the video.py file throws the error below. The fix is quite easy because the function is trying to flush a closed file.

Traceback (most recent call last):
  File "_ctypes/callbacks.c", line 232, in 'calling callback function'
  File "video.py", line 27, in callback
    buff.as_array.tofile(file)
ValueError: flush of closed file

Swapping the lines so that the callback is set to None before the file is closed fixes this:

camera.set_video_callback(None, None)
file.close()

type error in arducam_mipicamera.py : function remove_padding

Hello,
Calling the function remove_padding yields an error in Python 3 (version 3.7.3):
unsupported operand type(s) for &: 'float' and 'int'.
In Python 3 the result of the calculation (i.e. the division) in line 533
real_width = width / 8 * bit_width
is a float. The variable real_width is passed as an argument to the function align_up, which expects integer arguments in order to perform bitwise arithmetic. But bitwise arithmetic does not work with operands of type float.
The issue may be different in Python 2, where the data type of the result of an operation depends on the data types of the operands (e.g. dividing an integer by an integer results in an integer), so real_width may be of type integer in Python 2. But in Python 3 the division of an integer by an integer may result in a float.
When I change the calculation in line 533 of remove_padding to use floor division,
real_width = width // 8 * bit_width
the variable real_width is of integer type, so it can be used safely by align_up.
To keep the proposed solution compatible with Python 2 you can add the following import at the beginning of the file (line 5):
from __future__ import division
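For illustration, a minimal sketch of the failure, assuming align_up uses the usual bitmask idiom:

def align_up(value, alignment):
    # only valid for integers: bitwise AND is not defined for floats
    return (value + alignment - 1) & ~(alignment - 1)

align_up(800, 32)     # 800
align_up(800.0, 32)   # TypeError: unsupported operand type(s) for &: 'float' and 'int'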

regards
Gernot

Will IMX477 driver work with Original RPi HQ camera also?

Hi,

I see that you have shared a compiled version of the IMX477 driver from RidgeRun for the Jetson. Since it works for the Arducam IMX477 sensor, would it also work for the original RPi HQ cam (v3)? Assuming the R8 removal modification is done as needed.

Thanks.

AR1820HS on Jetson Nano, little FPS issue

Camera: AR1820HS
Board: Jetson Nano

Hi, I am testing AR1820HS camera on Jetson Nano following this article: camera-demonstration

I am running the example like this:
python arducam_displayer.py --width 4912 --height 3684

I am getting only 5-6 FPS, but the spec says it can give:

4:3 – 18 Mp full resolution at 15 fps (MIPI-4L)

Where is the problem?

Constant framerate video streaming

Hello,
I need to stream video from the Arducam IMX298 camera module with a Raspberry Pi 4. I wrote a simple program based on video2stdout.c to be able to adjust camera parameters live during the stream, but I found out that the output framerate depends on the exposure value. Is it possible to make the encoder output a constant framerate? Currently I send the H264 encoded video to ffmpeg through a pipe and then send the video to an RTMP server, but due to the changing framerate it doesn't work properly.

Thank you

Preview video from Stereo Camera Hat is really DARK

I am using RPi4 & Stereo Camera Hat.

I've tried ./arducamstill -t 0 -m 10 -awb 1 -ae 1, but the preview video is pretty dark:
IMG_20200214_184533
If I increase the brightness, it turns yellow...
Is there anything I can do to get a normal video feed?

No camera available in jetson nano

Hi,

I ran into a problem when trying to run the IMX477 on a Jetson Nano.

First I downloaded the newest SD card image for the Nano and burned it onto an empty SD card.

Then I downloaded the driver of the IMX477 here for the Nano, with the corresponding version.

When I run the command to bring up the camera, it shows 'no cameras available'.

PS: my kernel version is a little bit strange, ending with '-tegra'.

It would be kind if someone could tell me how to fix this.

Thanks!

error using capture_raw10_opencv.py when not explicitly setting a camera mode with 10bit pixel format

capture_raw10_opencv.py yields an error
cannot reshape array of size 1024000 into shape (800, 1600)
when not explicitly setting a camera mode with 10bit pixel format.
The python example originally just sets the camera resolution

fmt = camera.set_resolution(1280,800)

but this does not ensure that a camera mode is chosen which returns data in a raw10 pixel format.
Setting a raw10 camera mode, e.g. by replacing camera.set_resolution(...) with the following code, works (in my case mode no. 6 with an ov9281 camera; the mode has to be adjusted by the user to match the specific camera):

camera.set_mode(6)
fmt = camera.get_format()
height = fmt.get("height")
width = fmt.get("width")

and later on in the while-loop

    frame = arducam.remove_padding(frame.data, width, height, 10)
    ....
    frame = frame.reshape(height, width) << 6

missing opencv headers white_balance.hpp (library libopencv-dev is installed already)

In short, when issuing

make clean && make

I get the following error:

capture2opencv.cpp:14:44: fatal error: opencv2/xphoto/white_balance.hpp: No such file or directory
#include <opencv2/xphoto/white_balance.hpp>
^
compilation terminated.
Makefile:45: recipe for target 'capture2opencv' failed
make: *** [capture2opencv] Error 1

libopencv-dev and libopencv-contrib-dev are installed; otherwise the error would be different, as opencv.hpp is of course available:
pi@trinium:~/spectrometer/camera_driver/MIPI_Camera/RPI $ locate opencv.hpp
/usr/include/opencv2/opencv.hpp

My /etc/apt/sources.list is:
deb http://raspbian.raspberrypi.org/raspbian/ stretch main contrib non-free rpi
and the packages have been updated and upgraded today.

Can you recommend a version of OpenCV, or an actual package, or just whatever you're using, where the missing headers are available? Thank you in advance for your help.
