I want to first carry on from what we were doing yesterday and introduce a few new ideas, then we will get into using ffmpeg to prepare and re-encode videos, extract sound and so on.
Windows users: if you don't already have it turned on, open Windows File Explorer, click on View and tick the box that says 'file name extensions'. It makes it easier to see what files actually are, rather than just a list of names and thumbnails.
Cutting files to size
If you don't know how to do this already, first we are going to cut a video you downloaded yesterday down to a minute or two. Using a file explorer, go to the place you downloaded your file and open a terminal there (right click and 'Git Bash here' on Windows, 'open in terminal' on Linux). Use ffprobe to find out its length, i.e. ffprobe yourfile, and look for the part of the output that says Duration. Now open the file you want to work on with vlc or mpv, or any media player, find a section you like and make a note of the time where it starts. We are going to enter that into ffmpeg and slice out a segment of that file to a new, smaller file with this basic command:
ffmpeg -ss 00:00:00 -t 00:00:00 -i yourfile.extension -c:v copy -c:a copy yourfileshort.extension
-ss gives us the start point (the point in the file you found a moment ago) and -t gives us the duration to copy from that point (hours:minutes:seconds, a length rather than an end time).
So doing this:
ffmpeg -ss 00:19:00 -t 00:01:00 -i yourfile.extension -c:v copy -c:a copy yourfileshort.extension
will give us one minute of video starting from 19 minutes in (or whatever start point and duration you choose).
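As an aside, if you only want the duration without the rest of ffprobe's output, a variant like this (standard ffprobe options) prints just the length in seconds:

ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 yourfile.extension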
Do that and we now have a short section of video to play with that we can live hex edit quickly, so stay in the folder where your shortened video is.
Live Hex editing (again)
We are using a slightly different method than yesterday. We are doing it this way because, although it's easier to hex edit raw video live, with raw video we don't get any of the artefacts that the different file formats give us, and those artefacts are what produce the lovely crunchy textures. It's a slightly less forgiving method, though, and you have to choose values quite carefully. As we are using ffplay we can also press the left arrow on the keyboard to flick through the film if it gets stuck (but not if it stubbornly refuses to play).
Where I have somefile.extension below, replace that with the short file you just created:
xxd -p somefile.extension | sed 's/cade/6ebe/g' | xxd -r -p | ffplay -i -
The values to change are the cade and 6ebe in sed 's/cade/6ebe/g'. Change them to any combination of hexadecimal values, i.e. 0 to f, and experiment with different values on different formats.
So with that all entered into the terminal, hit enter and wait for the video to appear. Check the output in the terminal for signs of life; sometimes it will show no movement and you get no video output. If that's the case, click on the terminal and press ctrl+c to kill that process, wait for the terminal to give you back control, then press the up arrow to recall the command.
Some formats are easier than others to find interesting values for, and some files just won't play; we can try different combinations of values until we find something that works. The advantage of doing it this way is that we don't have to write to a file to view results and we can tune it as we go along. We also don't have to worry about matching the input and output dimensions of the file to the script.
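You can also stack more than one substitution in a single sed call to tune several values at once; the hex pairs here are purely illustrative:

xxd -p somefile.extension | sed -e 's/cade/6ebe/g' -e 's/55aa/aa55/g' | xxd -r -p | ffplay -i -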
Compare this to the method we used yesterday, which would give us this:
ffmpeg -i somefile.extension -f rawvideo -vcodec rawvideo -pix_fmt yuv420p -s 640x480 - | xxd -p | sed 's/76/888/g' | xxd -p -r | ffplay -f rawvideo -vcodec rawvideo -pix_fmt yuv420p -s 640x480 -
Unlike the new method this doesn't allow us to flick through sections using the left arrow, and it's slower. If you want to write the result to a file instead of playing it live, use:
xxd -p somefile.extension | sed 's/cade/6ebe/g' | xxd -r -p > somefilehex.extension
This takes the output and writes it to a new file. How you name it is up to you, but name it differently to the input file, otherwise you will overwrite that file and end up with nothing. Also note that the terminal will not display any output during this process, just a blinking cursor after you press enter.
Now we could change the codec of the video to something more interesting, like magicyuv, using ffmpeg (your ffmpeg may not have magicyuv; if not, choose msvideo1 and substitute that in the script where I have magicyuv). If you are not familiar with ffmpeg, this script takes your video as input (the -i in ffmpeg -i stands for input), changes the codec from the original via the -c:v part, and keeps the original audio via the -c:a copy part.
So do this on your shortened file:
ffmpeg -i yourfile.extension -c:v magicyuv -c:a copy yourfilenewname.avi
(magicyuv needs an .avi extension)
Once that has encoded, run the live hex editing script on that file.
Experiment with different codecs such as zmbv, dirac or snow (though zmbv takes a long time to encode).
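For example, a zmbv version might look like this (putting zmbv in an .avi container is an assumption that works with my build; try .mkv if yours complains):

ffmpeg -i yourfileshort.extension -c:v zmbv -c:a copy yourfilezmbv.avi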
Baking
If you output to a file, sometimes that file will be unplayable or only partially playable, so you might want to bake it so it's usable later on.
There are a couple of different ways of doing that. My main choice these days is just to use handbrake, as it seems to be able to handle even heavily damaged files. There is an advantage, though, to using a live hex editing script via ffplay and capturing that with obs-studio, especially if you are using the previous day's method of rawvideo as a format, as that is damn near unbreakable. Try using handbrake on files hex edited using other methods though. Sometimes handbrake will over-fix and remove errors that we want (codecs like snow are more susceptible to that), but it's easy enough to use ffmpeg on a file you have hex edited and saved, like this:
ffmpeg -i yourhexedfile.extension -c:v formatofchoice yourhexedfilefixed.extension
Displacement mapping
From the simple script we used yesterday to activate and glitch our webcams, there is something else we can do with a similar script: using a displacement map. If you've seen the work of Ras Alhague (see www.curatedbygirls.com/ras-alhagues-glitch-world and Instagram at https://www.instagram.com/ras.alhagve/) or Yebu tonu on Instagram (https://www.instagram.com/yebutonu/), you will know (apart from it being some of the most amazing work you will see on the inter-webs) that displacement mapping is the process of taking two inputs, still or video, and using the texture of one to move the other against itself, so that elements from one bleed into and affect the other; see the Wikipedia article at https://en.wikipedia.org/wiki/Displacement_mapping. Doing a similar thing with video on Windows and Linux is possible using ffmpeg, which has a filter that can achieve this in real time.
On Linux/Windows we can do this by using a video against itself, but displacement mapping is very sensitive to input sizes, so use ffprobe first to check the dimensions of your video; if you are using different sources, the first and second files must match in dimensions or this won't work. Also note that if you are using longer videos the lengths of each must match, so you will need to cut the longer file's length to match the shorter (using the handy combination of ffprobe and the cutting command illustrated earlier).
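For example (the filenames and the two-minute figure are just placeholders), check both lengths and then cut the longer one down:

ffprobe shorterfile.mp4
ffprobe longerfile.mp4
ffmpeg -ss 00:00:00 -t 00:02:00 -i longerfile.mp4 -c:v copy -c:a copy longerfileshort.mp4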
Displacing a film against itself
So first ffprobe yourfile.extension, then change the output size at the end of the script below to your result, i.e. alter the bit that says 640x360 to your file's dimensions, and replace the appropriate parts in the script, where we are displacing the same film against itself.
ffmpeg -i yourfile.extension -i yourfile.extension -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | ffplay -f rawvideo -pix_fmt yuv420p -s 640x360 -
Now that's pretty simple, but you can also use two different sources, starting with two different films of the same dimensions. But how do we change different dimensions to match each other?
Changing dimensions with handbrake
If the films aren't the same dimensions we can use Handbrake to change that. Also, when using films with different lengths this will sometimes fail depending on the order of the films, so first try changing round the order the films are processed in (I don't think it really matters if we are using a live webcam feed as one of the inputs). To avoid that entirely we can alter the lengths of the films using the cutting command I showed earlier.
So with handbrake open, and having obtained the dimensions of your first film with ffprobe, encode your second film with the same dimensions. Handbrake on Linux and Windows is pretty similar, but the dimension settings are in a slightly different arrangement on Windows (I'll demo this), and not the same as on Linux at all; especially make sure the 'allow upscaling' tick box is on, otherwise it won't work.
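If you'd rather stay in the terminal, ffmpeg's scale filter can do the same resize; a sketch (note this re-encodes the video with ffmpeg's default codec for whatever output extension you pick):

ffmpeg -i yourfile.extension -vf scale=480:320 -c:a copy yourfileresized.extension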
To give an idea of how this works, practice with different films.
Displace one video against another
I prepared two films earlier with handbrake, first matching the aspect ratios (3:2) and then the dimensions (480x320).
The first, 'Horror Hotel' from 1960, you can find here: https://archive.org/details/HorrorHotel1960. It is published under a Creative Commons licence which allows reuse, adaptation and sharing with attribution; more info here: https://creativecommons.org/licenses/by/3.0/
The second source is called 'Prisoners of the Lost Universe', found here: https://archive.org/details/PrisonersOfTheLostUniverse1983_201511. According to its licence it is completely within the public domain, so I can do pretty much what I like with it (taking into consideration the things we discussed yesterday): https://creativecommons.org/publicdomain/mark/1.0/
Do this on Windows and Linux (it's the same):
ffmpeg -i Horror_Express.mp4 -i prisoners.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | ffplay -f rawvideo -pix_fmt yuv420p -s 480x320 -
Or with the videos shortened and the order swapped:
ffmpeg -i prisonersshort.mp4 -i Horror_expressshort.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | ffplay -f rawvideo -pix_fmt yuv420p -s 480x320 -
Depending on the source we can achieve different textures. This technique works really well if you already have glitched video to work with, say displacing glitched against non-glitched. Bear colour in mind as well: some strong colours like bright red/orange or blue can come through exceptionally well depending on the material you start with, and if you have a particular video with a particular texture it can work really well.
If you don't want to watch this happen live, or are happy that you are getting a file, you can output to a file like this; be aware that these files can be quite large and may need re-encoding via handbrake to make them manageable for later editing, especially on resource-limited machines:
ffmpeg -i prisonersshort.mp4 -i Horror_expressshort.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' filename.mp4
And then maybe start adding back over that, by using that file as input alongside other videos.
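If you'd rather not open handbrake for that re-encode, a plain ffmpeg pass like this will also shrink the file (a sketch; it assumes your ffmpeg build has libx264, and filename.mp4 is the output from the command above):

ffmpeg -i filename.mp4 -c:v libx264 -crf 23 -c:a copy filenamesmall.mp4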
Because we are working live in rawvideo format we can hex edit it as well, by modifying our earlier script:
ffmpeg -i prisonersshort.mp4 -i Horror_expressshort.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | xxd -p | sed 's/cade/6eb/g' | xxd -r -p | ffplay -f rawvideo -pix_fmt yuv420p -s 480x320 -
And as we are using ffplay, hitting the spacebar pauses or resumes the video, so we can screenshot stills we like along the way.
So we are starting to chain ideas and techniques together.
To capture this with the hex editing running, use obs-studio to record the piped ffplay output, or use your screen recorder software of choice.
Displacing live video over found video
We could also use live sources like webcams to mix with video, if the output of the camera matches the dimensions of the video. How would we do that? On Windows, like this:
ffmpeg -i prisoners640x360.mp4 -f dshow -i video="Live! Cam Sync HD VF0770" -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | ffplay -f rawvideo -pix_fmt yuv420p -s 640x360 -
Or reversing the order, i.e. camera first and video input after. Notice I've slowed the frame rate of the camera (the -framerate option before the dshow input; your camera may only support certain rates) as it seems to stutter, but the first method seems to work better:
ffmpeg -f dshow -framerate 5 -i video="Live! Cam Sync HD VF0770" -i prisoners640x360.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | ffplay -f rawvideo -pix_fmt yuv420p -s 640x360 -
The equivalent on Linux would be:
ffmpeg -i prisoners640x360.mp4 -f v4l2 -i /dev/video0 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | ffplay -f rawvideo -pix_fmt yuv420p -s 640x360 -
And reversing the input:
ffmpeg -f v4l2 -i /dev/video0 -i prisoners640x360.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | ffplay -f rawvideo -pix_fmt yuv420p -s 640x360 -
(Again, alter the input and output dimensions to reflect your webcam resolution and the dimensions of the video.)
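If you're not sure what resolutions your webcam offers, ffmpeg can list them; these are standard device queries (the camera name is just the one from the examples above). On Windows:

ffmpeg -f dshow -list_options true -i video="Live! Cam Sync HD VF0770"

and on Linux:

ffmpeg -f v4l2 -list_formats all -i /dev/video0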
Grabbing the desktop and using that to displace video
Grab a part of the desktop as a virtual camera and use that to displace the video playing
on Linux
ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size 640x360 -i :0.0 -i prisoners640x360.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | xxd -p | sed 's/808/181/g'| xxd -p -r | ffplay -f rawvideo -pix_fmt yuv420p -s 640x360 -
on Windows
ffmpeg -f gdigrab -framerate 6 -offset_x 10 -offset_y 20 -video_size 640x360 -i desktop -i prisoners640x360.mp4 -lavfi '[1]split[x][y],[0][x][y]displace' -f rawvideo -pix_fmt yuv420p - | xxd -p | sed 's/808/181/g'| xxd -p -r | ffplay -f rawvideo -pix_fmt yuv420p -s 640x360 -
On Windows 10, to make this work well, open another video in vlc or ffplay and place it in the upper left-hand corner of your screen; the window opened previously uses that as a displacement map, which in itself can be quite interesting. It doesn't have the flexibility of the Linux option, but I'm going to deal with that in the fifth and final session, 'What is the desktop anyway'.
Before we get to ffglitch: extracting sound from video
This is kind of a by-the-by, as we'll be dealing with sound in another session, but it's a good idea to cover this now so you have files to play with later.
So in the same directory as the files you are working with (on Linux and Windows), just type in:
ffmpeg -i yourfile.extension yourfile.wav
It really is that simple. Why wav and not mp3 or something else? Wav is nearly raw, so we can use it in more interesting ways and it tends to break less.
For instance, if we wanted to visualize sound we could do this with the wav we just created.
Find your wav file and change the extension to .yuv (e.g. mv yourfile.wav yourfile.yuv in the terminal), then:
ffmpeg -f rawvideo -s 640x480 -r 25 -pix_fmt yuv420p -i yourfile.yuv -c:v libx264 -preset ultrafast -qp 0 output.mp4
which will save it to a file. Or we could do it live using ffplay, like this:
ffplay -f rawvideo -pix_fmt yuv420p -s 640x480 -i yourfile.yuv
If you want to remove sound from video, which is handy for some kinds of datamoshing like tomato (the old version, which I still prefer), do this:
ffmpeg -i yourfile.extension -an -c:v copy newfile.extension
Datamoshing with ffgac (aka ffglitch)
We've downloaded some content, hex edited it, and maybe run a few displacement maps over it.
If you haven't used it before: what is FFglitch? Ramiro Polla's FFglitch is basically the best tool for datamoshing there is. Kaspar Ravel's tomato v2 is a beautiful and simple tool but it has limits; FFglitch is more complicated to use but gives much more fine-grained control of the process. Where to get it: https://ffglitch.org/ (it comes in Linux, Mac and Windows flavors). Download it and unpack it; there's no need to install it, as it's a stand-alone binary, but to use it we will need to take a few steps first.
FFglitch is essentially a command line tool, and this is where Windows sucks: powershell is okay but we need a bit more power, which is why we have git-bash installed. I'm not going to run through using python scripts with ffgac, as Ramiro Polla states he is moving away from python to javascript. Now, I'm presuming everyone in Windows land has either Notepad++ or geany installed (check). Okay.
With all of that out of the way, I'm going to walk through the example I was actually asked about: the move sink and rise example that Ramiro Polla has on his website. First things first, we need to change the format of any file we are using so it suits ffgac (the new version is called ffgac, not ffglitch), and handily there is an example script to do just that. One of the misconceptions people have about using ffglitch/ffgac is that they can use standard ffmpeg to prep their files, which leads to trying and failing to use ffglitch/ffgac, and I think that explains the renaming: ffgac is a standalone version of ffmpeg designed to be used with this set of tools. All the scripts reference it, and not any other version you may have installed, which is why we work within the ffgac/ffglitch directory.
We are also going to be using shellscripts today. If you don't know what a shellscript is, think of it as a list of commands put into a text file that the computer will execute in order unless it finds an error. We run them from the terminal (thus git-bash), and they enable us to put together long and complex sequences of commands that would otherwise be tedious to type in or call up constantly from the terminal interface. It's job automation, similar in some ways to DOS .bat scripts but way more powerful.
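If you have never seen one before, the smallest useful shellscript is just a couple of lines; a minimal sketch (the contents are purely illustrative):

#!/bin/bash
# print a line, then list the current folder
echo "hello from a shellscript"
ls

Save it as hello.sh, make it executable with chmod u+x hello.sh, and run it with ./hello.sh.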
So prep your file with the script Ramiro Polla has on his website, but modified for use with Windows. Open Notepad++ and copy-paste what I'm about to paste into the shared notes, from #!/bin/bash to done. Save that script in the folder where you have ffgac as ffgacprep.sh, and also copy the shortened file you made earlier into this folder, for simplicity's sake.
#!/bin/bash
mkdir mpeg2
# change mp4 below to whatever file extension your source file has
for I in *.mp4; do
    # on linux change the line below to ./ffgac -i "$I" \
    ./ffgac.exe -i "$I" \
        -an -vcodec mpeg2video -f rawvideo \
        -mpv_flags +nopimb \
        -qscale:v 6 \
        -r 30 \
        -g 90 \
        -s 1280x720 \
        -y mpeg2/"${I/.mp4/.mpg}";
done
Open the git bash terminal in that folder, i.e. right click and click on 'Git Bash here', and the terminal window will appear. Before we can use that script we have to make it executable, so in the terminal type in exactly this (without quote marks): 'chmod u+x ffgacprep.sh'. If all goes well there will be no error messages and the command line returns. So, to prep your file, now type exactly this in the terminal:
./ffgacprep.sh
The script will create a folder within that folder called mpeg2 and output your file there, all ready for the next stage; move it into the main ffglitch folder.
You could also use the command line commands shown on the website. So, in the git bash terminal and with your video in the ffglitch folder, copy and paste that command into the terminal (to paste into the git bash terminal press shift and Ins). Change where it says input_file to the name of your file, and leave temp_file.mpg as your output file name unless you're working on a lot of files. Alter the first part of the script from ffgac to ./ffgac.exe (because we are working on Windows and not Linux), using the left and right arrow keys to navigate through the command and change it, so it will look like this (read these commands as being on one line):
./ffgac.exe -i input_file -an -mpv_flags +nopimb+forcemv -qscale:v 0 -g 1000
-vcodec mpeg2video -f rawvideo -y temp_file.mpg
For Linux just remove the .exe so it reads:
./ffgac -i input_file -an -mpv_flags +nopimb+forcemv -qscale:v 0 -g 1000
-vcodec mpeg2video -f rawvideo -y temp_file.mpg
On the ffgac website there is a script called 'mv_sink_and_rise.js': https://ffglitch.org/2020/07/mv.html. Look for the section about halfway down where he says 'With all that being said, here's the script that we will be using'. Open Notepad++, copy the script, and save it in the ffglitch folder as 'mv_sink_and_rise.js' (without the quotes).
function glitch_frame(frame)
{
    // bail out if we have no motion vectors
    let mvs = frame["mv"];
    if ( !mvs )
        return;
    // bail out if we have no forward motion vectors
    let fwd_mvs = mvs["forward"];
    if ( !fwd_mvs )
        return;
    // clear horizontal element of all motion vectors
    for ( let i = 0; i < fwd_mvs.length; i++ )
    {
        // loop through all rows
        let row = fwd_mvs[i];
        for ( let j = 0; j < row.length; j++ )
        {
            // loop through all macroblocks
            let mv = row[j];
            // THIS IS WHERE THE MAGIC HAPPENS
            mv[0] = 0; // this sets the horizontal motion vector to zero
            // mv[1] = 0; // you could also change the vertical motion vector
        }
    }
}
Now for the magic: run the command he has on his website, but for Windows we change it to:
./ffedit.exe -i temp_file.mpg -f mv -s mv_sink_and_rise.js -o glitched_file.mpg
and for Linux again just remove the .exe so it reads:
./ffedit -i temp_file.mpg -f mv -s mv_sink_and_rise.js -o glitched_file.mpg
Copy and paste (or type!) the above into the git bash terminal and press enter. Once that has run successfully, look for the file in the ffglitch folder and play it!
Once we have this running and you are happy with this process, go back to Ramiro Polla's tutorials and start messing about with values in the mv_sink_and_rise.js script.
We can do more complex things if we mess with this bit:

mv[0] = 0; // this sets the horizontal motion vector to zero
// mv[1] = 0; // you could also change the vertical motion vector

to something like this:

//mv[0] = 0; // this sets the horizontal motion vector to zero
mv[1] = -1; // you could also change the vertical motion vector
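Another variation worth trying in the same spot inside the inner loop; a sketch (treat the exact numbers as starting points to experiment with, and check Ramiro Polla's tutorials for what the values mean):

mv[0] = mv[0] * 2; // exaggerate horizontal motion
mv[1] = -mv[1]; // flip vertical motion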
Mess about with these values until you have something you like, and then go and read more on Ramiro's website. I've used this in part to replicate some of the effects I can get using the Avance Logic card that I have, which gives the sliding horizontals effect.
Black and white dithering
Sometimes we want to convert colour to black and white. There are a number of ways to do that, some simple, some complicated; ffmpeg gives us a way to do it simply (it looks complex but it isn't):
ffmpeg -i yourfile.extension -f lavfi -i color=gray:s=1920x1080 -f lavfi -i color=black:s=1920x1080 -f lavfi -i color=white:s=1920x1080 -filter_complex threshold -sws_dither a_dither yourfiledithered.extension
Change where I have 1920x1080 to the dimensions of your file; again, find that using ffprobe.
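If you want just the width and height without reading through the full ffprobe output, a one-liner like this (standard ffprobe options) prints them as WxH:

ffprobe -v error -select_streams v:0 -show_entries stream=width,height -of csv=s=x:p=0 yourfile.extension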
We could try this on the file we just generated using ffgac. Or we could do it more simply: instead of making it pure black and white, we could just desaturate the file:
ffmpeg -i yourfile.extension -vf hue=s=0 output.extension
True greyscale would be:
ffmpeg -i input -vf format=gray output