Tuesday 19 July 2022

FFmpeg, virtual webcams and processing

Following on from the previous post on revisiting convolution in Processing, and trying to tie that in with my previous explorations of the desktop as a performance space and feedback generator: wouldn't it be good if there was a way to pass the desktop as video into Processing scripts, in a similar way to how FFmpeg's x11grab lets us open a window which follows the mouse focus around the desktop, like a virtual webcam?

Turns out there is. A friend had told me about v4l2loopback previously, a way of creating virtual webcams in Linux, and then thinking about it the other day I came across this post, which describes how to do it: https://narok.io/creating-a-virtual-webcam-on-linux/. That author streams a video file, though, rather than what I want, which is the same input as I get from x11grab. Why? If we can create a virtual webcam we can use it as an input for Processing, so we can extend the possibilities of desktop feedback loops.

TLDR: do this on Debian-based systems.

Install this:

sudo apt install v4l2loopback-dkms

Run this command:

sudo modprobe v4l2loopback devices=1 video_nr=10 max_buffers=2 exclusive_caps=1 card_label="Default WebCam"
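The modprobe above only lasts until the next reboot. If you want the loopback device back every boot, one way (a sketch, assuming a systemd-style Debian setup; the file names are just suggestions) is to drop the module name and its options into config fragments:

```shell
# /etc/modules-load.d/v4l2loopback.conf (loads the module at boot)
v4l2loopback
```

```shell
# /etc/modprobe.d/v4l2loopback.conf (same options as the modprobe line above)
options v4l2loopback devices=1 video_nr=10 max_buffers=2 exclusive_caps=1 card_label="Default WebCam"
```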

Then this, to pipe the screen grab into the new device:

ffmpeg -f x11grab -follow_mouse centered -framerate 10 -video_size 640x480 -i :0.0 -f v4l2 -vcodec rawvideo -s 640x480 /dev/video10

Then open a script in Processing which uses a webcam for input (i.e. the convolution scripts I've been working on) and experiment.
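To check the virtual webcam is visible from Processing at all, a minimal sketch like this should do it (assuming the Processing video library is installed; the device should appear in Capture.list() under the card_label set above, "Default WebCam"):

```processing
import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  // Print the cameras Processing can see; the v4l2loopback
  // device should be listed among them
  printArray(Capture.list());
  cam = new Capture(this, 640, 480);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0);
}
```

Once the desktop shows up in the sketch window, swap the plain image() call for whatever convolution or feedback processing you want to run on the frames.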
