After asking for help on the Linux Audio Users list and on Stream-ring about skipped chunks of video, and after running some tests of my own at lower resolutions and quality settings, it seems there are two factors disturbing the flow of the stream: CPU spikes and bandwidth spikes.
As Robin Gareus writes via lau-list:
> Panning and zooming puts some additional strain on the encoder
> (motion compensation calc) and also produces bandwidth spikes….
> At LAC, we were able to kick out some remote participants with quick
> camera panning due to bandwidth spikes, so we tried to avoid those
Robin also writes about CPU-friendly scaling to multiples of 16 (or 8):
> Down-scaling the video decreases CPU usage of the encoder, but
> requires a few CPU cycles to perform the scaling, in particular if
> scaling to non multiples of 16: try `-x 384 -y 288 -aspect 4:3`
> instead of ` -x 393 -y 288` (it /should/ not make a difference since
> Theora requires the geometry to be multiples of 8 (or 16?), but
> ffmpeg2theora may just be scaling/cropping/expanding twice?!).
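A side note of my own (not from Robin's mail) on why `384x288` works out so neatly: for a frame that is exactly 4:3 with both sides multiples of 16, the width must be a multiple of 4 × 16 = 64, since the height is 3/4 of the width. A small Python sketch that snaps a rough target width to the nearest such geometry:

```python
def snap_4_3(target_width, unit=16):
    """Nearest 4:3 geometry with width and height both multiples of `unit`.

    height = width * 3/4, so for the height to also be a multiple of
    `unit`, the width must be a multiple of 4 * unit (here: 64).
    """
    step = 4 * unit
    width = max(step, round(target_width / step) * step)
    return width, width * 3 // 4

print(snap_4_3(393))  # -> (384, 288), the geometry Robin suggests
```

So a "wrong" width like 393 snaps down to 384, and the matching 4:3 height of 288 is automatically a multiple of 16.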
Lluis Gomez i Bigorda via Stream-ring also pointed at CPU usage and suggested the `--speedlevel` option. So we are now using the following command line to stream from a DV camera, and it looks quite stable:
```bash
dvgrab --format raw - | ffmpeg2theora -f dv -x 384 -y 288 -v 3 --speedlevel 2 --no-skeleton -o /dev/stdout - | oggfwd icecastserver 8000 pass /mount.ogv
```
simple slideshow in PD/GEM
To display various images from a folder, I wrote a simple slideshow "player" in GEM. The part I'm particularly proud of is the code that takes the dimensions of an image (WxH) in pixels, calculates its aspect ratio, and from that derives the resizing factor needed to keep the image within the bounds of a traditional computer screen or projection (4:3). For an experienced coder this might be trivial, but for a self-taught artist-programmer like me it took a bit of time.
Here’s a patch and a screenshot.
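The logic of the patch boils down to one comparison: whichever of the two ratios (screen width over image width, screen height over image height) is smaller is the binding constraint. Here is a rough Python equivalent of what the GEM patch computes (the 1024x768 screen size is my assumption, not something fixed in the patch):

```python
def fit_factor(img_w, img_h, screen_w=1024, screen_h=768):
    """Largest uniform scale that keeps an img_w x img_h image on screen.

    Taking min() of the two per-axis ratios preserves the image's own
    aspect ratio while guaranteeing it fits the 4:3 bounds.
    """
    return min(screen_w / img_w, screen_h / img_h)

# A 1600x900 (16:9) image on a 1024x768 (4:3) screen is width-limited:
f = fit_factor(1600, 900)
print(f)                  # 0.64
print(1600 * f, 900 * f)  # 1024.0 576.0 -- fits, letterboxed vertically
```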
I was playing with various transformations of images, in particular one distortion produced with the `metapixel` tool. Since metapixel is a command-line tool, finding a good combination of parameters means experimenting, so I wrote a bash script that cycles through some values and produces about 30 permutations (metapixelations) of each image it processes. The results are, most of the time, quite interesting (to me at least).
Here’s the script and an image.
```bash
#!/bin/bash
# Usage: ./metapixelate.sh image.jpg
# Produces ~30 metapixel variants of the input with varying parameters.

f=1
while [ $f -lt 256 ]; do
    let f=f*2
    #echo f = $f
    for n in 1 2 3 4; do
        # randomly constrain either width or height (100..499), leave the other at 1
        w=$RANDOM
        let w%=2
        if [ "$w" == "1" ]; then
            h=$RANDOM
            let h%=400
            let h=h+100
        else
            w=$RANDOM
            let w%=400
            let w=w+100
            h=1
        fi
        # randomize the remaining parameters (0..2)
        i=$RANDOM
        y=$RANDOM
        q=$RANDOM
        let i%=3
        let y%=3
        let q%=3
        # output filename records all parameters plus a timestamp
        metapixel -x $1 -w $w -h $h -f $f -y $y -i $i -q $q --metapixel \
            $1 ${1}_`date +%y%m%d_%H%M%S`__w$w-h$h-f$f-y$y-i$i-q$q.jpg
    done
done
```