sgjava

  • Posts

    311
  • Joined

  • Last visited

Recent Profile Visitors

The recent visitors block is disabled and is not being shown to other users.

sgjava's Achievements

  1. How are i2c1 and i2c2 mapped on the Nanopi Duo? I can configure them in armbian-config, but all the Duo pin diagrams only show i2c0.
  3. Java U8g2 is a high-performance library based on U8g2: Library for monochrome displays, version 2. Rather than try to code the JNI by hand, I used HawtJNI to generate the JNI wrappers. I also used some custom code to generate the setup methods and font constants.

     final var u8g2 = U8g2.initU8g2();
     // Change this to your actual display
     U8g2.setupSsd1306I2c128x64NonameF(u8g2, U8G2_R0, u8x8_byte_arm_linux_hw_i2c, u8x8_arm_linux_gpio_and_delay);
     U8g2.initDisplay(u8g2);
     logger.debug(String.format("Size %d x %d, draw color %d", U8g2.getDisplayWidth(u8g2), U8g2.getDisplayHeight(u8g2), U8g2.getDrawColor(u8g2)));
     U8g2.setPowerSave(u8g2, 0);
     U8g2.clearBuffer(u8g2);
     U8g2.setFont(u8g2, u8g2_font_t0_15b_mf);
     U8g2.drawStr(u8g2, 1, 18, "Java U8g2");
     U8g2.sendBuffer(u8g2);
     try {
         TimeUnit.SECONDS.sleep(5);
     } catch (InterruptedException ie) {
         Thread.currentThread().interrupt();
     }
     U8g2.setPowerSave(u8g2, 1);
     U8g2.done(u8g2);
  4. https://redirect.armbian.com/nanopiduo/Focal_current Tried to flash it a couple of times, but it never makes it to the heartbeat LED. Will try other releases. This one boots: https://armbian.systemonachip.net/archive/nanopiduo/archive/Armbian_21.05.1_Nanopiduo_focal_current_5.10.34.img.xz
  5. Java Periphery is a high-performance library for GPIO, LED, PWM, SPI, I2C, MMIO, and Serial peripheral I/O access in userspace Linux. Rather than try to build this from scratch, I used c-periphery and HawtJNI to generate the JNI wrappers. This saves a lot of hand coding and allows for easier synchronization with c-periphery changes moving forward. I believe this is the only userspace I/O library that supports the new JDK 17 LTS (and still supports JDK 11). I had to fork HawtJNI to make it work with JDK 17. The only caveat is that ARM32 only supports JDK 11; for some reason I cannot find a JDK 17 build for ARM32, or information on why it was not supported. The install script handles this automatically. ARM64/x86/x86_64 uses JDK 17.
  6. I agree, but I went with Annke 4K cameras at around $80 US. So far, so good.
  7. I'm running the latest XU4 focal, and ffmpeg doesn't support the latest H265+ codec even though I'm only doing a copy (no transcoding). Is there a way to build the latest ffmpeg without losing hardware acceleration?

     [rtsp @ 0x556f60] [warning] Multi-layer HEVC coding is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.

     BOARD=odroidxu4
     BOARD_NAME="Odroid XU4"
     BOARDFAMILY=odroidxu4
     BUILD_REPOSITORY_URL=https://github.com/armbian/build
     BUILD_REPOSITORY_COMMIT=428a20876-dirty
     DISTRIBUTION_CODENAME=focal
     DISTRIBUTION_STATUS=supported
     VERSION=21.05.6
     LINUXFAMILY=odroidxu4
     ARCH=arm
     IMAGE_TYPE=stable
     BOARD_TYPE=conf
     INITRD_ARCH=arm
     KERNEL_IMAGE_TYPE=Image
     BRANCH=current
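For context, the copy-only capture described above comes down to launching ffmpeg with a stream copy (no transcoding). Here is a minimal Java sketch that builds such a command line; the class name, camera URL, and output path are hypothetical, while the flags themselves (-rtsp_transport tcp, -i, -c copy) are standard ffmpeg options.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: build an ffmpeg argument list that copies an RTSP stream to a
// file without transcoding, which keeps CPU use minimal on an SBC.
public class RtspCopy {
    public static List<String> buildCommand(final String rtspUrl, final String outFile) {
        final var cmd = new ArrayList<String>();
        cmd.add("ffmpeg");
        cmd.add("-rtsp_transport");
        cmd.add("tcp");          // TCP transport avoids UDP packet-loss artifacts
        cmd.add("-i");
        cmd.add(rtspUrl);
        cmd.add("-c");
        cmd.add("copy");         // stream copy: no transcoding
        cmd.add(outFile);
        return cmd;
    }

    public static void main(String[] args) {
        // Hypothetical camera URL and output file
        final var cmd = buildCommand("rtsp://camera.local/stream1", "buffer.mp4");
        // To actually run it (requires ffmpeg on the PATH):
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
        System.out.println(String.join(" ", cmd));
    }
}
```

Keeping the argument list as a List<String> and handing it to ProcessBuilder avoids shell quoting issues with URLs that contain credentials.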
  8. OK, so here's what looks like it's happening: based on the system load, the NICs eventually go offline. I used the 5V/8A PSU and saw no difference. What I will most likely do is split three 4K cams per XU4, or use one board as an rtsp/mjpeg proxy and one to do detection recording. As a proxy there's very little CPU or system load.
  9. I'll report back once I see the issue resolved. I think a couple days was the maximum time without losing the NICs.
  10. 6A PSU: https://www.hardkernel.com/shop/5v-6a-power-supply-unit/ "When you connect an external 2.5 inch HDD/SSD to the XU4 or use the CloudShell, the bundled 5V/4A power supply is not enough to supply stable power. We strongly recommend using this 5V/6A PSU to improve system stability." So maybe my PSU theory is correct. That would also explain why there's really nothing in the logs. Just ordered this guy; no use troubleshooting a software problem until I rule out hardware: https://www.amazon.com/gp/product/B07H9X8FHM
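The PSU theory above is just power-budget arithmetic on the 5 V rail. The sketch below illustrates it; the draw figures are illustrative assumptions, not measurements of any particular XU4 setup.

```java
// Sketch: rough 5 V rail power budget for an XU4 NVR node.
// All draw values below are assumed for illustration only.
public class PowerBudget {
    static final double VOLTS = 5.0;

    // Headroom in watts given PSU capacity and total estimated draw, in amps.
    public static double headroomWatts(final double psuAmps, final double drawAmps) {
        return VOLTS * (psuAmps - drawAmps);
    }

    public static void main(String[] args) {
        final double boardAmps = 2.0; // assumed XU4 under load
        final double ssdAmps = 0.9;   // assumed 2.5" SSD peak
        final double nicAmps = 0.5;   // assumed USB 3 GbE dongle
        final double draw = boardAmps + ssdAmps + nicAmps;
        // 5V/4A bundle PSU vs the recommended 5V/6A PSU
        System.out.printf("4A PSU headroom: %.1f W%n", headroomWatts(4.0, draw));
        System.out.printf("6A PSU headroom: %.1f W%n", headroomWatts(6.0, draw));
    }
}
```

With these assumed draws the 4A supply leaves only a few watts of margin, which is consistent with Hardkernel's recommendation to move to the 6A unit when an external SSD is attached.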
  11. Yeah, it looks like https://github.com/aler9/rtsp-simple-server might have some issues, but that shouldn't kill the NICs. Memory averages around 1.3G free, which is excellent for five 4K cameras. Nothing obvious sticks out. I have an SSD and a 1G Ethernet dongle connected to USB 3. I believe I have a 4A power supply. I have another XU4 with two SSDs on USB 3 that works fine, but it has less network activity (around 20 Mbps). The weird thing is the heartbeat is still cranking, and I have to power cycle to get the NICs back. If it makes any difference, I have 5 to 7 ffmpeg processes running all the time. Processes average around 150. I didn't tweak any kernel parameters. Anyway, I tried swapping power supplies with another board just for the heck of it.
  12. What should I be looking for in output? dmesg.txt
  13. Howdy, I'm running Armbian_21.05.1_Odroidxu4_focal_current_5.4.116.img on my XU4 for security cameras. It averages about 60 Mbps 24/7. After a couple of days, one adapter usually disappears (from ip a and nmtui) and sometimes both do (I have a USB 1 Gbps adapter too). I'm not seeing anything in the logs to indicate an error. I monitor this system with Zabbix and see no unusual activity before this happens. Is there a way to get diagnostics for this event?
  14. OK, so I have things dialed in a bit better now. FFmpeg seems to like TCP over UDP for these streams; I'm seeing a lot fewer artifacts in the videos. Below is 12 hours of network and CPU activity for three 4K cams at 12.5 FPS and one 4K cam at 15 FPS (four 4K cams total) at the highest quality. CPU maxes out around 10% per camera, and network maxes out around 45 Mbps total. I'm using a 1G PoE switch, so bandwidth will never be an issue. The question then becomes how much more real-time processing I can do. CV routines are typically expensive, even though I've learned tricks using an ROI (region of interest). This isn't as much of a problem with a single camera, but I'm planning on six cameras. Obviously the most important thing for me now is to capture motion videos. Once the bread-and-butter stuff is done, I can look at adding various detection code (besides motion). The question becomes whether I want real-time detection or perhaps to offload it to another SBC. The fact that I can record this many 4K cameras with an Odroid XU4 is pretty amazing. Look at the Blue Iris hardware requirements (https://ipcamtalk.com/wiki/choosing-hardware-for-blue-iris/), and that doesn't even cover electrical costs and heat dissipation. I'm working on a Pine64 with a USB cam today, so I'll test single-camera encoding. Basically the code will scale from a single camera to many cameras.
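The ROI trick mentioned above can be sketched as frame differencing restricted to a region of interest, the kind of cheap check you would run on a low-resolution substream. The class name, raw grayscale frame format, and threshold below are hypothetical, not the actual project code.

```java
// Sketch: motion detection by mean absolute pixel difference over an ROI.
// Frames are raw grayscale byte arrays of size width * height.
public class RoiMotion {
    public static boolean motion(final byte[] prev, final byte[] cur, final int width,
            final int roiX, final int roiY, final int roiW, final int roiH,
            final double threshold) {
        long sum = 0;
        for (int y = roiY; y < roiY + roiH; y++) {
            for (int x = roiX; x < roiX + roiW; x++) {
                final int i = y * width + x;
                // Mask to treat bytes as unsigned 0..255 pixel values
                sum += Math.abs((prev[i] & 0xff) - (cur[i] & 0xff));
            }
        }
        // Mean absolute difference over the ROI only, not the whole frame
        return (double) sum / (roiW * roiH) > threshold;
    }

    public static void main(String[] args) {
        final int w = 640, h = 480; // substream resolution
        final var a = new byte[w * h];
        final var b = new byte[w * h];
        b[100 * w + 100] = (byte) 255; // one changed pixel: mean diff stays tiny
        System.out.println(motion(a, b, w, 90, 90, 20, 20, 10.0));
    }
}
```

Scanning only the ROI keeps the cost proportional to the region size rather than the frame size, which is why this scales to several cameras on one board.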
  15. So I have two 4K @ 12 FPS cameras that stream to a buffer file. Currently I'm only doing motion detection, but that obviously drives everything else detection-wise. On an Odroid XU4 each camera uses only 10% CPU. Motion videos are just copied from the buffer file; I'm using H265+ without any transcoding. So far, so good. Not quite ready to post it up to GitHub yet, though. I doubt you'll get better CPU utilization than this. I'm using the substream at 4 FPS 640x480 for motion detection. One other thing to note is that I proxy the mjpeg and h265+ streams, so for cameras that only allow one stream you can have security software running and still stream live. I'm using PoE cameras and will have six 4K cameras running soon. So in essence you can build a cheap NVR without the hardware requirements of Blue Iris et al., or just build a single smart camera. Armbian allows hardware encoding with ffmpeg on some models, so you can even encode USB cams with low overhead.