Everything posted by @lex

  1. I came across the MickMake formula for a WS2812 LED strip: (spi_freq / 24) / FPS = MAX_leds. The idea was to build an LED panel and stream video, keeping the 4:3 aspect ratio. I could save some money by changing it to 16:9. That would be 240 x 135, but still a lot of money. 24 FPS is the ideal, but I think 23 FPS is enough. With FPS = 23, MAX_leds would be ~1440 LEDs, so at 60 LEDs per meter I could drive 6 rows of 4 m with a single board. I was thinking of using NetSync to sync the rows, so 30 boards would do the job for 240 x 180 (4:3). Mike also talks about issues when using multiple power supplies, but I have seen a DIY LED panel almost as big as this with multiple power supplies and it works. The final thought... how to wire the things...
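As a sanity check on the numbers above, here is a quick shell calculation of the formula. The spi_freq value of 800000 is an assumption on my part (the WS2812's nominal 800 kbps data rate), not something stated in the post:

```shell
# Sketch of the MickMake formula: MAX_leds = (spi_freq / 24) / FPS
# spi_freq = 800000 is an assumed value: the WS2812 800 kbps data rate.
spi_freq=800000
fps=23
max_leds=$(( spi_freq / 24 / fps ))
echo "max LEDs at ${fps} FPS: ${max_leds}"   # 1449, matching the ~1440 estimate
```

At 60 LEDs per meter, 1449 LEDs is about 24 m of strip, which is where the "6 rows of 4 m per board" figure comes from.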
  2. I would like to drive a 240x180 LED strip panel over SPI. I have zero experience with SPI and LEDs. I have chosen the WS2813 IC (four wires) for the LEDs. The seller has an SPI controller, and it drives a maximum of 2048 LEDs. Does anyone have experience driving as many as 2048 LEDs over SPI? Please share your experience and/or projects with LEDs and SPI in this post. The question is how many LEDs I can drive with SPI and how to calculate it.
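For what it's worth, a back-of-envelope estimate for that panel size, using the 2048-LED-per-controller limit from the post (the 800 kbps data rate in the last comment is my assumption, carried over from the WS2812 discussion above):

```shell
# 240x180 panel split across controllers that each drive at most 2048 LEDs
total=$(( 240 * 180 ))                               # 43200 LEDs in total
per_controller=2048
controllers=$(( (total + per_controller - 1) / per_controller ))  # ceiling division
echo "${total} LEDs -> ${controllers} controllers"   # 43200 LEDs -> 22 controllers
# Refresh ceiling for one full 2048-LED chain: 800000 / (24 * 2048) = ~16 FPS
```

So a single 2048-LED chain already drops below the 23 FPS target, which is why splitting the panel into many shorter, synced chains matters.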
  3. Price: US$ 119.00. Promotion price: US$ 65.00. This lasted a few seconds, then the price rose to US$ 95.00. Did anyone get that board for US$ 65.00? Ouch, sorry. I realized it was for the 2GB version.
  4. The suggestion was to change to:
powerdown-gpios = <&pio 4 15 1>; /* CSI-STBY-R: PE15 */
vfe_v4l2 is for legacy only. Can you set PG11 in u-boot and see if it works? I think it is PD14. Do it at the u-boot prompt:
gpio set PG11
gpio status PG11
  5. I don't have the board with me right now, but I still think the i2c is not wired correctly. Perhaps someone else can measure and give the answer. Last thought: try GPIO_ACTIVE_LOW if you have GPIO_ACTIVE_HIGH in your DTS.
  6. Ouch, it is working then. The error is expected. In mainline you need to tell the sensor (csi) the format and size you want. Here is one example for streaming JPEG 1280x720:
media-ctl --device /dev/media1 --set-v4l2 '"ov5640 1-003c":0[fmt:JPEG_1X8/1280x720]'
For mjpeg-streamer the fix is here: https://github.com/avafinger/bananapi-zero-ubuntu-base-minimal/issues/56
PS: I overlooked your output; I still think you wired i2c1 and not i2c2.
  7. Not sure I can give good advice in this case; it is a bit confusing for me (software guy here). It looks like your sensor is detected on i2c-1 and it should be on i2c-2, so I think it is the way you wired this module.
PE12/CSI_SCK/TWI2_SCK --> CSI-SCK
PE13/CSI_SDA/TWI2_SDA --> CSI-SDA
And DOVDD, AVDD and DVDD should be in your DTS the same way as on the OPi One. How you wired these pins to get the right voltage is a mystery to me.
  8. In order to use the imx219 sensor (or any other sensor) in mainline kernel 5.x, you need to tell the sensor which format and size it must deliver the frames in, using v4l2-ctl prior to grabbing the frames. Something like:
v4l2-ctl --set-subdev-fmt pad=0,width=1280,height=960,code=0x3001 -d /dev/v4l-subdev1
v4l2-ctl --set-subdev-fmt pad=0,width=1280,height=960,code=0x3001 -d /dev/v4l-subdev0
v4l2-ctl --set-subdev-fmt pad=2,width=800,height=600,code=0x2008 -d /dev/v4l-subdev0
Collabora has a complete setup and a patched libcamera with which you can pull some images from the sensor in mainline 5.x. For this, you need an updated rkisp driver, an imx219 device node in the DTS (if you don't have it yet!) and a freshly compiled v4l2-ctl. You can refer to: https://gitlab.collabora.com/koike/linux/-/commit/7842ca07b75828785ffcd217362a82eaa9cc1e21 You can also refer to the work done here for further information: https://github.com/initBasti/NanoPC-T4_armbian_configuration#testing-the-camera-
  9. You have the wrong pix_fmt. Try: -pix_fmt yuv420p. Be aware you don't get HW encoding with this version. Search the forum; someone has already worked on the HW encoding side.
  10. I was an early backer of Pine64. It took so long to get my hands on one, plus the learning curve before I could contribute something useful, around 6 months; today things have evolved and that much time is an eternity. Not to mention the restriction on shipping the battery from the US. I just think they missed a great opportunity to create a kind of "convergence dock" that would export the GPIOs so you could drive other things and create an ecosystem around the device, just like the Raspberry Pi has. The makers should take this tip and launch a similar device that exports all the pins. An RK chip, of course, but I don't know if they have a mobile chip or whether the new chip could be used. Rockchip seems to be committed to helping the Linux community; at least they contribute to the kernel. Anyway, Manjaro + Phosh is really a good alternative; I don't know if they have improved their edition or not. You should give it a try.
  11. Hi, I haven't followed the progress, but I like the Mobian w/ Phosh design; if you could report your findings, that would be great. Thanks. I have a few (noob?) questions about the device and interfaces.
* Would you know if it is possible to access and use the UART to interface with some peripherals? I know there is an extension with HDMI, ethernet and USB. I don't mean USB-to-serial.
* Can you attach a barcode reader, or something similar like a proximity reader (if you have that extension)?
* Last time I checked (a really long time ago) there was no HW acceleration; today there is A64 HW support in kernel 5.x. What is the status? I know GNOME is a bit slow, but maybe with HW accel it would be great.
If you have time, can you try to:
* rebuild the Mobian w/ Phosh image and provide some inside info on what is needed?
* disclose the blobs that are still needed?
I was expecting to see a similar device but with a Rockchip inside... and then join the club. BR
  12. That's why I asked. How would you wire the touch? I don't have an OPi4, but I am curious where it should go on the OPi4. I am thinking of trying it out on my NanoPi M4. And it would be interesting to know how to connect all these things; some wires do not show where they go... Make sure you have:
&vopb { status = "okay"; };
&vopb_mmu { status = "okay"; };
&vopl { status = "okay"; };
&vopl_mmu { status = "okay"; };
  14. You can follow this thread and add the ov5640 overlay; use the ov5640 params from the FE dts node.
  15. Security cameras
If you don't want to code, you could try this: https://www.claudiokuenzler.com/blog/999/kodi-tv-add-video-stream-ip-surveillance-camera-add-on
  16. Here are the instructions to run Htop remotely using a web browser. I often use Htop to monitor the health of my boards and servers (amd64). It is a good tool for a sysadmin to monitor servers in real time: it needs few resources and is not very intrusive. Recipe:
1 - Clone shellinabox
root@cubieboard2:~# git clone https://github.com/shellinabox/shellinabox
Cloning into 'shellinabox'...
remote: Enumerating objects: 3073, done.
remote: Total 3073 (delta 0), reused 0 (delta 0), pack-reused 3073
Receiving objects: 100% (3073/3073), 4.31 MiB | 1.89 MiB/s, done.
Resolving deltas: 100% (2418/2418), done.
root@cubieboard2:~# cd shellinabox/
2 - Build and install shellinabox
The shellinabox makefile is outdated for OpenSSL 1.1.y, so we need to bypass the linking step and do it manually. During the configure step you get some errors; bypass them like so:
root@cubieboard2:~/shellinabox# apt-get install libtool
root@cubieboard2:~/shellinabox# autoreconf -i
root@cubieboard2:~/shellinabox# autoconf
root@cubieboard2:~/shellinabox# autoreconf -i
root@cubieboard2:~/shellinabox# ./configure
root@cubieboard2:~/shellinabox# make
3 - Bypass the linking error
During the link step the build fails on a missing OpenSSL 1.1 library, so we link shellinabox manually:
root@cubieboard2:~/shellinabox# gcc -g -std=gnu99 -Wall -Os -o shellinaboxd shellinabox/shellinaboxd.o shellinabox/externalfile.o shellinabox/launcher.o shellinabox/privileges.o shellinabox/service.o shellinabox/session.o shellinabox/usercss.o ./.libs/liblogging.a ./.libs/libhttp.a -ldl -lutil -lssl -lcrypto
4 - Running Htop in the browser
I chose not to install shellinabox, just to run it as a service so Htop is available in the browser. Tested on Google Chromium and Firefox (Linux).
root@cubieboard2:~/shellinabox# ./shellinaboxd -t -b -p 8888 --no-beep -s '/htop_app/:alex:alex:/:htop -d 10'
where 8888 is the port and alex:alex is the user:group to run Htop with. Use your own [user] and [group].
Note: for security reasons, if you run with nobody:nogroup you won't be able to add or change any config in Htop.
Now point your browser at http://ip_address:8888/htop_app/ and that's it, enjoy.
5 - Credits
Stack Overflow, user ofstudio. Screenshot:
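To start this at boot rather than by hand, a minimal systemd unit could look like the sketch below. This is my own assumption, not part of the original recipe: the unit name htop-web.service and the /usr/local/bin path are hypothetical, so adjust them to wherever you copied the shellinaboxd binary.

```ini
# /etc/systemd/system/htop-web.service  (hypothetical unit name and path)
[Unit]
Description=Htop in the browser via shellinabox
After=network.target

[Service]
# Same flags as the recipe, minus -b: systemd expects the process
# to stay in the foreground instead of daemonizing itself.
ExecStart=/usr/local/bin/shellinaboxd -t -p 8888 --no-beep -s '/htop_app/:alex:alex:/:htop -d 10'
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Then enable it with: systemctl enable --now htop-web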
  17. And the nice thing, you can have Htop in a Browser...
  18. Welcome to Armbian 20.08.1 Focal with Linux 5.8.5-sunxi (Cubieboard 2 banner)
No end-user support: community creations
System load: 0.27 0.62 0.34
Up time: 5 min
Memory usage: 8 % of 990MB
IP:
CPU temp: 44°C
Usage of /: 4% of 29G
Last login: Wed Nov 25 15:45:51 2020
root@cubieboard2:~# htop
root@cubieboard2:~# htop --version
htop 2.2.0 - (C) 2004-2020 Hisham Muhammad
Released under the GNU GPL.
root@cubieboard2:~#
  19. Try removing the config file with Htop closed:
sudo rm /root/.config/htop/htoprc
Then start Htop and make your changes.
  20. This is classic memory corruption; Htop has possibly crashed. During a crash Htop emits a backtrace with some info. If you have the backtrace info, please post it here along with your Htop version. You can also try a few things:
* Remove every meter (F2, then delete all the meters) and exit. Start again and add one CPU bar. If that is OK, proceed with the rest.
* If you have some skills, build Htop with debug info; the backtrace will then show the function that was called just before the free() of the corrupted memory.
  21. Can you post the format info for your USB camera?
v4l2-ctl -d 4 --list-formats
or
v4l2-ctl -d 5 --list-formats
I have seen that some USB cameras have a YUY2 format. CPU usage of ~52% looks good for OpenCV.
  22. OK. In the JPEG_1X8/640x480@30FPS case you should expect a bit more CPU usage than with the USB camera, say ~5% more, and not the ~35% increase you see. The reason is that you get 640x480x3 pixels from the sensor instead of only the compressed image. I would build OpenCV with debugging info and try to find the bottleneck, and why there is an image conversion in the DVP case. Push the limits to 720p/30fps or even 1080p and see what you get.
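To put a number on that difference, here is the raw bandwidth at 640x480 implied by the 640x480x3 figure above (3 bytes per pixel, assuming an uncompressed RGB frame, as in the post):

```shell
# Raw frame bandwidth: width * height * bytes-per-pixel * frames-per-second
w=640; h=480; bpp=3; fps=30
bytes_per_sec=$(( w * h * bpp * fps ))
echo "${bytes_per_sec} bytes/s"   # 27648000 bytes/s, roughly 26 MiB/s of uncompressed data
```

A compressed JPEG stream of the same scene is typically a small fraction of that, which is why moving that much raw data around (and converting it) costs extra CPU.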
  23. No GPU/VPU involved. I think there is a conversion from YUV to RGB in OpenCV and possibly a decompression from JPEG to RGB; I am not an OpenCV expert, maybe someone can give more details about what's going on inside OpenCV. There is still room to optimize the JPEG_1X8 case. That's pretty good; there must be no conversion of the image format to achieve it. It would be interesting to find out more about this. Can you share more about your application, your setup and your USB camera?
  24. What compiler options are in use? You could try to put a breakpoint before you call **WrTSpec** and check the stack. You can also use valgrind to check for stack corruption.