Camera

This is only in the idea stage. No development on this has started.

Making a remote camera controlled by Ironman has been an interesting problem. Many solutions use Arduinos, but then I ran across an interesting idea that places an RPi Zero W inside a dummy camera housing. That led me to think about how I could use PiBox directly instead of an Arduino-based solution.

[Image: High Level Camera Design]

The camera is based on an RPi Zero W running a new PiBox Lite release (PiBox stripped of X and other unnecessary components). A web API is built using Mongoose (a web server written in C that is easily extended; the original source is also an option) that basically just starts mjpeg-streamer, just as the PiCam app currently does. mjpeg-streamer provides a URL that the monitor will access.
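
As a rough sketch of what that API might look like, the following assumes the Mongoose 7.x event API (the exact calls differ between Mongoose releases) and exposes a single /camera/start endpoint that launches the streamer. The endpoint path, port, and mjpeg_streamer command line are placeholders, not settled design.

/*
 * Minimal sketch of the camera's web API, assuming the Mongoose 7.x event
 * API (calls differ between Mongoose releases).  The endpoint path, port,
 * and the mjpeg_streamer command line are placeholders.
 */
#include <stdlib.h>
#include "mongoose.h"

static void handler(struct mg_connection *c, int ev, void *ev_data, void *fn_data) {
  if (ev == MG_EV_HTTP_MSG) {
    struct mg_http_message *hm = (struct mg_http_message *) ev_data;
    if (mg_http_match_uri(hm, "/camera/start")) {
      /* Hand off to the streamer.  On PiBox this would more likely be a
       * request to piboxd than a direct system() call. */
      system("mjpeg_streamer -i input_raspicam.so -o 'output_http.so -p 8080' &");
      mg_http_reply(c, 200, "", "streaming started\n");
    } else {
      mg_http_reply(c, 404, "", "not found\n");
    }
  }
  (void) fn_data;
}

int main(void) {
  struct mg_mgr mgr;
  mg_mgr_init(&mgr);                                  /* set up event manager */
  mg_http_listen(&mgr, "http://0.0.0.0:8000", handler, NULL);
  for (;;) mg_mgr_poll(&mgr, 1000);                   /* serve requests forever */
}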

The Mongoose server is used to reduce overhead: Mongoose is self-contained and doesn't need a bunch of libraries. On the flip side, Node.js with restify (for a REST API) is easy to implement but carries lots of weight (as all scripting languages do) and would require external programs for running some components.

The camera doesn't need to run full time. We can attach a PIR sensor to a GPIO and wake the camera when motion is detected. The definition of "wake" is critical: if mjpeg_streamer takes too long to start up and begin acquiring video, then we miss the image we want to capture, but if we run mjpeg_streamer all the time, then we use too much power. So the PIR has to be able to start recording, and mjpeg_streamer needs to start acquiring video quickly. An alternative may be to have ffmpeg handle video capture if mjpeg_streamer takes too long to start. Note that piboxd normally starts mjpeg_streamer, and there may be a lag between getting the GPIO notification from the PIR and sending the request to piboxd. Hopefully the recent updates to thread synchronization will help speed this up.
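
A minimal sketch of the PIR wake-up, using the legacy sysfs GPIO interface, is shown below. The GPIO number (17) and the action taken on motion are assumptions; a real implementation would hand off to piboxd or start capture directly.

/*
 * Sketch: block on a PIR-driven GPIO edge via the legacy sysfs interface,
 * then kick off capture.  GPIO 17 and the action taken are assumptions.
 * Assumes the pin was already exported and configured, e.g.:
 *   echo 17     > /sys/class/gpio/export
 *   echo in     > /sys/class/gpio/gpio17/direction
 *   echo rising > /sys/class/gpio/gpio17/edge
 */
#include <fcntl.h>
#include <poll.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void) {
  int fd = open("/sys/class/gpio/gpio17/value", O_RDONLY);
  if (fd < 0) { perror("gpio17/value"); return 1; }

  char buf[8];
  read(fd, buf, sizeof(buf));                 /* clear the initial state */

  struct pollfd pfd = { .fd = fd, .events = POLLPRI | POLLERR };
  for (;;) {
    if (poll(&pfd, 1, -1) > 0 && (pfd.revents & POLLPRI)) {
      lseek(fd, 0, SEEK_SET);
      read(fd, buf, sizeof(buf));             /* consume the edge */
      /* Motion detected: ask piboxd (or start mjpeg_streamer directly)
       * to begin acquiring video as quickly as possible. */
      system("logger 'PIR: motion detected, start capture here'");
    }
  }
}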

The monitor is the standard Ironman Monitor with the addition of a new app: IMCameras. This is based on PiCam but supports selection of which camera to visually monitor. It also supports the following commands, issued to the selected cameras (see the sketch after this list).

  • Start recording
  • Stop recording
  • Play a recording
  • Live view (no recording)
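
One way IMCameras might represent these commands is a simple table mapping each one to a camera-side endpoint. Everything in this sketch (enum names, URL paths, port, host) is hypothetical; the real command transport hasn't been designed yet.

/*
 * Sketch: map the monitor-side commands to camera endpoints.  The enum
 * names, URL paths, port, and host are hypothetical, not a defined API.
 */
#include <stdio.h>

typedef enum {
  CAM_CMD_START_RECORDING,
  CAM_CMD_STOP_RECORDING,
  CAM_CMD_PLAY_RECORDING,
  CAM_CMD_LIVE_VIEW
} cam_cmd_t;

static const char *cam_cmd_path[] = {
  [CAM_CMD_START_RECORDING] = "/camera/record/start",
  [CAM_CMD_STOP_RECORDING]  = "/camera/record/stop",
  [CAM_CMD_PLAY_RECORDING]  = "/camera/play",
  [CAM_CMD_LIVE_VIEW]       = "/camera/live"
};

/* Build the request URL the monitor would issue to a given camera. */
static void cam_cmd_url(char *buf, size_t len, const char *host, cam_cmd_t cmd) {
  snprintf(buf, len, "http://%s:8000%s", host, cam_cmd_path[cmd]);
}

int main(void) {
  char url[128];
  cam_cmd_url(url, sizeof(url), "192.168.1.50", CAM_CMD_START_RECORDING);
  printf("%s\n", url);                        /* e.g. issue an HTTP GET here */
  return 0;
}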

Recording of the mjpeg stream can be done with a command similar to the following.

ffmpeg -i http://x.x.x.x/axis-cgi/mjpg/video.cgi?resolution=320x240 -an -vcodec flv file.flv

Recording should be on demand via the monitor or initiated by the camera via its motion sensor. The latter implies that recording is started on the camera and written to an NFS or Samba mount. Alternatively, the sensor could signal the monitor to start recording; this approach is preferred but more susceptible to error.
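
A camera-side sketch of the first option: fork ffmpeg to capture the local stream into a timestamped file on an already-mounted share. The stream URL, mount point, and container format are assumptions.

/*
 * Sketch: camera-side recording by forking ffmpeg to write the local
 * mjpeg stream to a timestamped file on an already-mounted NFS/Samba
 * share.  The stream URL, mount point, and container are assumptions.
 */
#include <stdio.h>
#include <sys/types.h>
#include <time.h>
#include <unistd.h>

static pid_t start_recording(void) {
  char outfile[128];
  time_t now = time(NULL);
  strftime(outfile, sizeof(outfile),
           "/mnt/ironman/cam-%Y%m%d-%H%M%S.flv", localtime(&now));

  pid_t pid = fork();
  if (pid == 0) {
    /* Child: capture video only (-an), matching the command above. */
    execlp("ffmpeg", "ffmpeg", "-i", "http://127.0.0.1:8080/?action=stream",
           "-an", "-vcodec", "flv", outfile, (char *) NULL);
    _exit(127);                               /* exec failed */
  }
  return pid;                                 /* kill(pid, SIGINT) stops it */
}

int main(void) {
  pid_t rec = start_recording();
  printf("recording as pid %d\n", (int) rec);
  return 0;
}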

A stretch goal is to allow display of up to four cameras at once. This would work only if omxplayer can be convinced to display into just a subsection of the framebuffer. Alternatively, a different player with hardware acceleration, such as mpv, might be found.
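
A sketch of the four-camera layout: split a 1920x1080 framebuffer into quadrants and launch one player per stream, relying on omxplayer's --win option to confine each instance to a rectangle (assuming that option is available in the installed build; the stream URLs are placeholders).

/*
 * Sketch: split a 1920x1080 framebuffer into four quadrants and launch one
 * player per camera.  Relies on omxplayer's --win option (assumed to be
 * available and to accept an "x1 y1 x2 y2" rectangle); stream URLs are
 * placeholders.
 */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
  const char *streams[4] = {
    "http://192.168.1.50:8080/?action=stream",
    "http://192.168.1.51:8080/?action=stream",
    "http://192.168.1.52:8080/?action=stream",
    "http://192.168.1.53:8080/?action=stream"
  };
  const int W = 1920, H = 1080;

  for (int i = 0; i < 4; i++) {
    int x = (i % 2) * (W / 2);                /* column: left or right */
    int y = (i / 2) * (H / 2);                /* row: top or bottom */
    char cmd[256];
    snprintf(cmd, sizeof(cmd),
             "omxplayer --win '%d %d %d %d' '%s' &",
             x, y, x + W / 2, y + H / 2, streams[i]);
    system(cmd);                              /* fire and forget; real code would track PIDs */
  }
  return 0;
}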

The monitor would need to be extended to support storage, possibly using a new Ironman storage device that connects via Samba or NFS.
