h1. Camera

_This is only in the idea stage. No development on this has started._

Making a remote camera controlled by Ironman has been an interesting problem. Many solutions use Arduinos, but then I ran across an interesting idea using a "RPi Zero W set inside a dummy camera housing":https://youtu.be/H7p5YEOrlSc. That led me to think about how I could use PiBox directly instead of using an Arduino-based solution.

table{align=center}.
|!{width:640px}ironman-camera.png!|
|={font-size:120%;margin-bottom:15px;background-color:#dff}. *High Level Camera Design* |
The camera is based on an RPi Zero W running a new _{color:blue}PiBox Lite_ release (PiBox stripped of X and other unnecessary components). A web API is built using "Mongoose":https://github.com/cesanta/mongoose, a web server written in C that is easily extendable (the "original source":https://storage.googleapis.com/google-code-archive-source/v2/code.google.com/mongoose/source-archive.zip is also available). The API basically just starts up mjpeg-streamer, just as the PiCam app currently does. mjpeg-streamer provides a URL that the monitor will access.
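
As a very rough sketch, the web API could be little more than the fragment below: a Mongoose event handler exposing two hypothetical endpoints, @/camera/start@ and @/camera/stop@, that fork and signal mjpeg-streamer. The endpoint names, listen port and streamer command line are placeholders, not decisions, and the calls follow the Mongoose 7 style API, so the exact signatures would need adjusting to whatever Mongoose release PiBox Lite ends up bundling.

<pre><code class="c">
/* Sketch only: a tiny camera-side web API that starts/stops mjpeg-streamer.
 * Endpoint names, listen port and streamer command line are placeholders. */
#include <signal.h>
#include <sys/types.h>
#include <unistd.h>
#include "mongoose.h"

static pid_t streamer_pid = 0;

static void ev_handler(struct mg_connection *c, int ev, void *ev_data, void *fn_data) {
  if (ev != MG_EV_HTTP_MSG) return;
  struct mg_http_message *hm = (struct mg_http_message *) ev_data;

  if (mg_http_match_uri(hm, "/camera/start")) {
    if (streamer_pid == 0 && (streamer_pid = fork()) == 0) {
      /* Child: hand off to the streamer. */
      execlp("mjpg_streamer", "mjpg_streamer",
             "-i", "input_raspicam.so", "-o", "output_http.so -p 8081",
             (char *) NULL);
      _exit(1);                               /* Only reached if exec fails. */
    }
    mg_http_reply(c, 200, "", "started\n");
  } else if (mg_http_match_uri(hm, "/camera/stop")) {
    if (streamer_pid > 0) { kill(streamer_pid, SIGTERM); streamer_pid = 0; }
    mg_http_reply(c, 200, "", "stopped\n");
  } else {
    mg_http_reply(c, 404, "", "unknown request\n");
  }
  (void) fn_data;
}

int main(void) {
  struct mg_mgr mgr;
  mg_mgr_init(&mgr);
  mg_http_listen(&mgr, "http://0.0.0.0:8080", ev_handler, NULL);
  for (;;) mg_mgr_poll(&mgr, 1000);           /* Event loop; never returns. */
}
</code></pre>
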
The Mongoose server is used to reduce overhead: it is self-contained and doesn't need a bunch of libraries. On the flip side, "nodejs":https://nodejs.org/en/ with "restify":http://restify.com/ (for a REST API) is easy to implement but carries lots of weight (as all scripting languages do) and would require external programs for running some components.

The camera doesn't need to run full time. We can attach a "PIR sensor":https://www.mysensors.org/build/motion to a GPIO and use it to wake the camera. The definition of "wake" is critical: if it takes mjpeg_streamer too long to start up and begin acquiring video, we miss the image we want to capture, but if we run mjpeg_streamer all the time, we use too much power. So the PIR has to be able to start recording, yet mjpeg_streamer needs to begin acquiring video quickly. An alternative may be to have ffmpeg handle video capture if mjpeg_streamer takes too long to start. Note that piboxd normally starts mjpeg_streamer, so there may be a lag between getting the GPIO notification from the PIR and sending the request to piboxd. Hopefully the recent updates to thread synchronization will help speed this up.
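
One way to keep the wake latency down is to watch the PIR's GPIO directly rather than routing the notification through piboxd. A minimal sketch using the legacy sysfs GPIO interface is below; it assumes the pin (17 here, purely a placeholder) has already been exported and had its edge set to @rising@, and the "start capture" step is left as a stub.

<pre><code class="c">
/* Sketch only: block until the PIR sensor raises its GPIO line.
 * Assumes /sys/class/gpio/gpio17 is already exported with edge=rising;
 * the pin number is a placeholder. */
#include <fcntl.h>
#include <poll.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
  int fd = open("/sys/class/gpio/gpio17/value", O_RDONLY);
  if (fd < 0) { perror("open gpio"); return 1; }

  char buf[8];
  read(fd, buf, sizeof(buf));                 /* Clear any stale state. */

  for (;;) {
    struct pollfd pfd = { .fd = fd, .events = POLLPRI | POLLERR };
    if (poll(&pfd, 1, -1) > 0) {              /* Sleep until an edge arrives. */
      lseek(fd, 0, SEEK_SET);
      read(fd, buf, sizeof(buf));             /* Consume the new value. */
      /* Motion detected: start capture here, e.g. hit the local web API
       * or spawn the capture process directly to avoid the piboxd hop. */
      printf("PIR triggered\n");
    }
  }
}
</code></pre>
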
The monitor is the standard Ironman Monitor with the addition of a new app: IMCameras. This is based on PiCam but supports selecting which camera to visually monitor. It also supports the following commands, issued to the selected cameras.

* Start recording
* Stop recording
* Play a recording
* Live view (no recording)

Recording of the mjpeg stream can be done with a command similar to the following.

<pre>
ffmpeg -i http://x.x.x.x/axis-cgi/mjpg/video.cgi?resolution=320x240 -an -vcodec flv file.flv
</pre>

Recording should be on demand via the monitor or initiated by the camera via its motion sensor. In the motion-sensor case, recording is started on the camera and written to an NFS or Samba mount. Alternatively, the sensor could signal the monitor and let it start the recording; that approach is preferred but more susceptible to error.
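
For the camera-initiated case, starting a recording amounts to spawning ffmpeg against the camera's own stream with its output on the network mount. A rough sketch, where the stream URL and the @/mnt/recordings@ mount point are placeholders for whatever the camera actually serves and mounts:

<pre><code class="c">
/* Sketch only: record the local MJPEG stream to a file on an NFS/Samba
 * mount. The stream URL and mount path are placeholders. */
#include <stdio.h>
#include <sys/types.h>
#include <time.h>
#include <unistd.h>

pid_t start_recording(void) {
  char outfile[128];
  /* One file per recording, named by start time, written to the mount. */
  snprintf(outfile, sizeof(outfile), "/mnt/recordings/cam-%ld.flv",
           (long) time(NULL));

  pid_t pid = fork();
  if (pid == 0) {
    execlp("ffmpeg", "ffmpeg",
           "-i", "http://127.0.0.1:8081/?action=stream",
           "-an", "-vcodec", "flv", outfile, (char *) NULL);
    _exit(1);                                 /* Only reached if exec fails. */
  }
  return pid;   /* Caller keeps the pid so a Stop recording command can signal it. */
}
</code></pre>
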
A stretch goal is to allow display of up to four cameras at once. This would work only if omxplayer can be convinced to display into just a subsection of the framebuffer. Alternatively, a "different player with hardware acceleration":https://www.raspberrypi.org/forums/viewtopic.php?t=199775 might be found, such as "mpv":https://mpv.io/.

The monitor would need to be extended to support storage, possibly using a new Ironman storage device that connects via Samba or NFS.