atomic77

  • Content Count

    10
  • Joined

  • Last visited

About atomic77

  • Rank
    Member

Profile Information

  • Location
    Toronto

Contact Methods

  • Github
    https://github.com/atomic77

  1. Did anyone make any progress on this? I'm also trying to connect a Pi One Plus to a newer 4K-enabled TV and none of the tweaks to boot.cmd or armbianEnv.txt I've found on the forum seem to be working. Everything is fine on my 1920x1080 monitor over HDMI.
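Not a fix, but for anyone else comparing notes: on mainline kernels one thing worth trying is forcing a known-good mode from the kernel command line in /etc/armbianEnv.txt. This is only a sketch; the connector name HDMI-A-1 is an assumption and should be checked against /sys/class/drm/ on the actual board.

```shell
# /etc/armbianEnv.txt (sketch; the connector name is an assumption,
# verify with: ls /sys/class/drm/)
extraargs=video=HDMI-A-1:1920x1080@60
```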
  2. Confirmed that I can't reproduce this on Stretch 5.4.45. I'm leaning towards just disabling zram on 5.8 kernels. As far as I can tell, the main thing to watch out for is not filling up 50MB of logs within the 15-minute interval between runs of armbian-truncate-logs, now that the log is no longer compressed?
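For context, the back-of-envelope math on that 50MB / 15-minute window works out to a sustained write rate of roughly 56 KB/s, which is a lot of logging:

```shell
# Rough check: sustained log rate needed to fill the 50MB cap before
# armbian-truncate-logs runs again (integer KB/s, so approximate)
cap_kb=$((50 * 1024))        # 50 MB expressed in KB
window_s=$((15 * 60))        # 15-minute interval in seconds
rate=$((cap_kb / window_s))
echo "$rate KB/s"            # ~56 KB/s
```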
  3. Awesome, this does look related. I'm going to check a fresh image of Armbian 20.05 with stretch 5.4.45 kernel and confirm that this doesn't occur. Thanks!!
  4. Hi all, I've been doing some stress tests to trigger the watchdog on an Orange Pi Zero, and I discovered that I can reliably reproduce a kernel oops with a simple fork bomb on a fresh install of the latest Armbian Buster image (Armbian_20.11_Orangepizero_buster_current_5.8.16.img.xz). The command, run as root, is: :(){ :|: & };: I noticed zs_malloc in the stack trace, and interestingly enough, it does not happen if I disable zram in /etc/default/armbian-zram-config. I get a flood of errors like the one below, and the device eventually recovers. -bash: fork:
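For reference, this is a sketch of how zram can be ruled out; it assumes the stock config file uses an ENABLED switch, so inspect yours before running the sed:

```shell
# Turn zram off to rule it out (assumption: /etc/default/armbian-zram-config
# contains an ENABLED=true line; check the file first)
sed -i 's/^ENABLED=.*/ENABLED=false/' /etc/default/armbian-zram-config
systemctl restart armbian-zram-config
```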
  6. I got my hands on a "Set 9" Orange Pi Lite + GC2035 camera a while back, and I've finally been able to put together a self-contained object detection device using TensorFlow, without sending any image data outside for processing. Basically, it's a Python Flask application that captures frames from the camera using a GStreamer pipeline, runs them through a TensorFlow object detection model, and spits out the same frame with extra metadata about the objects it found, rendering a box around each one. Using all four cores of the H2 it can do about 2-3 fps. The app keeps track of the count of all
  7. I was trying to do the same thing with my pihole running Debian. Disabling the wpa_supplicant service didn't work for me because it seemed to get dragged back up by polkit. Did you solve the problem? The only way I could figure out to prevent wpa_supplicant from coming up was to disable network-manager entirely. Since I set my IP statically with the interfaces file it's OK for me, but I imagine there is a better way.
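One thing worth trying here: masking the unit rather than just disabling it, since a masked unit cannot be started by anything, including D-Bus activation:

```shell
# 'disable' only removes the autostart symlinks; 'mask' points the unit at
# /dev/null so NetworkManager/polkit cannot pull it back up
systemctl disable --now wpa_supplicant
systemctl mask wpa_supplicant
```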
  8. Hello all, I recently went through the exercise of getting the Armbian build system set up to create an image with some of my user customizations, similar to the example with OMV. The documentation and forums have been a huge help! As I am running Fedora and Windows as my main OSes, I did struggle a bit at first. I tried, in order: ignoring the advice of the documentation and running ./compile.sh on Fedora (didn't get far); running the build with Docker on Fedora (I ran into a number of different problems and ultimately gave up); running on Ubu
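For anyone searching later, a non-interactive invocation along the lines the documentation describes looks roughly like this (the board and release values here are example placeholders, not necessarily what I built):

```shell
# Sketch of a non-interactive Armbian build; parameter values are examples
./compile.sh BOARD=orangepizero BRANCH=current RELEASE=buster \
  BUILD_DESKTOP=no KERNEL_ONLY=no
```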
  9. Thanks for the reply. As soon as I get my hands on one of these SD cards, I'll post my findings here on whether I get any useful information out of it with smartctl.
  10. Hello all, I was recently burned by a failing SD card (armbianmonitor -v possibly saved me a much worse fate!) and I'm now trying to proactively avoid a similar situation. I found that WD has a line of cards that claim a health status feature that "Helps in preventive maintenance by signaling when the card needs to be replaced". One of them is the Purple QD101, which seems reasonably priced at $15 for 64GB. I'm not finding much in the way of details on how this is exposed, though. Is there any way to get access to this information on Linux?
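In the meantime, one point of comparison: eMMC parts (unlike most SD cards) already expose JEDEC health registers through sysfs, so it is easy to check whether a given card reports anything similar:

```shell
# eMMC 5.0+ health attributes in sysfs (most SD cards do not provide these;
# the device path is an example, adjust the mmcblk number to your board)
cat /sys/block/mmcblk0/device/life_time     # wear-level estimate
cat /sys/block/mmcblk0/device/pre_eol_info  # reserved-block status
```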
  11. I know this thread is over a year old, but I was finally able to get decent camera output out of the GC2035 thanks to the cedar H264 gstreamer plugin linked here! I'm no gstreamer expert, but what I've been able to figure out is that after the cedar_h264enc stage of the pipeline, you need to add an h264parse stage, which you can then follow with something like matroskamux if you want to write a .mkv file, or rtph264pay if you want to send the data over the network via RTP. E.g.: gst-launch-1.0 -ve v4l2src device=/dev/video0 \ ! video/x-raw,format=NV12,width=800,height=600,f
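Put together, a complete file-writing pipeline along those lines looks like this (a sketch: the caps values mirror the snippet above, the framerate and filename are example values, and cedar_h264enc comes from the third-party plugin linked in the thread):

```shell
# Capture from the GC2035, encode with the cedar plugin, and write a .mkv
gst-launch-1.0 -e v4l2src device=/dev/video0 \
  ! video/x-raw,format=NV12,width=800,height=600,framerate=30/1 \
  ! cedar_h264enc ! h264parse \
  ! matroskamux ! filesink location=capture.mkv
```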