Trippy ImageMagick

by: Artur Dziedziczak

June 14, 2020

Introduction

I sent my thesis off for its final review, so I had some time to work on something different. A couple of days ago my girlfriend and I were looking through some videos from our trip to Paris, and I found one in particular that made me think "Damn, that was trippy".

But I decided to make it even trippier. Since I like bash, I wanted to write a simple script that converts a movie clip into a crazier version of itself.

Tools needed

  • bash
  • cool programs for transforming photos [1]
  • GNU parallel to speed things up [2]
  • ffmpeg [3]

Processing images

Let’s first see what a trippy image looks like. To generate the images below I used the disperse filter.

From the documentation of the filter you can read that (a single example invocation follows this list):

  • ( s ) Spread - spread distance of dispersion
  • ( d ) Density - density or frequency of detail
  • ( c ) Curviness - curviness/clumpiness of dispersion
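
For example, one image can be dispersed from the command line like this. This is just a minimal sketch: it assumes the disperse script sits in the current directory, and input.png/output.png are placeholder file names, not files from the post.

# Hypothetical single invocation of the disperse script: spread 16, density 1, curviness 16
./disperse -s 16 -d 1 -c 16 input.png output.png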

[Example outputs: test_1_61_46, test_31_1_1, test_31_16_16, test_31_16_1, test_76_1_16, test_16_1_16, named after their spread, density and curviness values]

As you can see, the first four images look distorted, but what I was looking for is the effect made by the last two of them.


I tried to make it look more like the Fear and Loathing in Las Vegas movie poster, so I was aiming for more curvy transformations and generally smooth lines.

Script that generates images with different configurations of spread, density and curviness
#!/bin/bash
start=`date +%s`

# Generate every combination of spread, density and curviness from 1 to 100 in steps of 15
parallel ../disperse -s {1} -d {2} -c {3} movie290.png test_{1}_{2}_{3}.png ::: $(seq 1 15 100) ::: $(seq 1 15 100) ::: $(seq 1 15 100)

end=`date +%s`
runtime=$((end-start))

echo With parallel: $runtime >> output.txt

If you would like to run this script, keep in mind that on my machine, even with parallel instances of disperse, it took 17 minutes to compute all 344 images.

Processing videos

Before I started making the script I checked different approaches. First I looked at some Python libraries that had a similar disperse implementation. In the end I didn’t choose them, since I had no idea how to write multithreaded code in Python and I discovered there were other solutions. I could have written small scripts in Python and still used parallel, but since everything could be done in bash, I decided to do just that.

In the end I made a script that increases the spread parameter for disperse every 50 frames, so that over time the movie gets more and more of the circular-shape distortion.
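
To make the cycling easier to follow, here is a small, purely illustrative helper that just prints which spread value each batch of 50 frames would get. The spread values and batch size mirror the scripts below, but the indexing is a simplified modulo version and the total frame count is made up.

#!/bin/bash
# Hypothetical illustration only: map batches of 50 frames to spread values
s_parameter=($(seq 1 6 50))   # 1 7 13 19 25 31 37 43 49
batch_size=50
total_frames=500              # assumed number of frames, for illustration
for (( frame=0; frame<total_frames; frame+=batch_size )); do
	index=$(( (frame / batch_size) % ${#s_parameter[@]} ))
	echo "frames ${frame}-$((frame + batch_size - 1)) -> spread ${s_parameter[$index]}"
done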

parser.sh
#!/bin/bash
while getopts m:o:rs option
do
	case "${option}"
		in
		m) MOVIE_FILE=${OPTARG};;
		r)	# -r takes no argument: just clean up output of previous runs
			echo "Removing old directories"
			rm -R images
			rm -R transformed_images
			;;
		o) OUTPUT_FILE=${OPTARG};;
		s) SERIAL=1;;
	esac
done

if [[ -z "$MOVIE_FILE" || ! -f "$MOVIE_FILE" ]]; then
	echo "Movie file was not provided or dont't exist"
	exit -1;
fi
if [[ -z "$OUTPUT_FILE" ]]; then
	echo "Output file was not selected"
	exit -1;
fi

mkdir -p images
mkdir -p transformed_images
IMAGES="images"
TRANSFORMED_IMAGES="transformed_images"

function extract_images_from_movie() {
	ffmpeg -i "$MOVIE_FILE" -vf fps=24 ./$IMAGES/%d.png
}

function disperse_images() {
	echo "Dispersing images"
	if [[ "$SERIAL" -eq 1 ]]; then
		bash serial_videos.sh $IMAGES $TRANSFORMED_IMAGES 50
	else
		bash parallel_videos.sh $IMAGES $TRANSFORMED_IMAGES 50
	fi
}

function make_video() {
	echo "Making video"
	ffmpeg -r 24 -i ${TRANSFORMED_IMAGES}/%d.png -c:v libx264 -pix_fmt yuv420p $OUTPUT_FILE
}

extract_images_from_movie
disperse_images
make_video
parallel_videos.sh
#!/bin/bash
start=`date +%s`

IMAGES=$1
TRANSFORMED_IMAGES=$2
files_in_groups=$3

files=""
num_of_files_for_batch_parallel=0

# Spread values to cycle through, one per batch of frames
s_parameter=($(seq 1 6 50))
s_parameter_index=1
for file_name in $(ls $IMAGES | sort -n); do
	if (( $num_of_files_for_batch_parallel >= $files_in_groups )); then
		echo $files
		parallel --null --bar ./disperse -s ${s_parameter[$s_parameter_index]} -d 1 -c 16 ./${IMAGES}/{1} ./${TRANSFORMED_IMAGES}/{1} ::: $files
		((s_parameter_index+=1))
		num_of_files_for_batch_parallel=0
		files=""
		# Wrap around once every spread value has been used
		if (( $s_parameter_index >= ${#s_parameter[@]} )); then
			s_parameter_index=0
		fi
	fi

	((num_of_files_for_batch_parallel+=1))
	files+=" $(basename $file_name)"
done

# Process the last, possibly incomplete, batch of frames
parallel --null --bar ./disperse -s ${s_parameter[$s_parameter_index]} -d 1 -c 16 ./${IMAGES}/{1} ./${TRANSFORMED_IMAGES}/{1} ::: $files

end=`date +%s`
echo parallel time: $((end-start)) >> time_check.txt
serial_videos.sh
#!/bin/bash
start=`date +%s`

IMAGES=$1
TRANSFORMED_IMAGES=$2
files_in_groups=$3

files=""
num_of_files_for_batch_parallel=0

# Spread values to cycle through, one per batch of frames
s_parameter=($(seq 1 6 50))
s_parameter_index=1
for file_name in $(ls $IMAGES | sort -n); do
	if (( $num_of_files_for_batch_parallel >= $files_in_groups )); then
		for fname in $files; do
			./disperse -s ${s_parameter[$s_parameter_index]} -d 1 -c 16 ./${IMAGES}/$fname ./${TRANSFORMED_IMAGES}/$fname
		done

		((s_parameter_index+=1))
		num_of_files_for_batch_parallel=0
		files=""
		# Wrap around once every spread value has been used
		if (( $s_parameter_index >= ${#s_parameter[@]} )); then
			s_parameter_index=0
		fi
	fi

	((num_of_files_for_batch_parallel+=1))
	files+=" $(basename $file_name)"
done

# Process the last, possibly incomplete, batch of frames
for fname in $files; do
	./disperse -s ${s_parameter[$s_parameter_index]} -d 1 -c 16 ./${IMAGES}/$fname ./${TRANSFORMED_IMAGES}/$fname
done

end=`date +%s`
echo serial time: $((end-start)) >> time_check.txt

To use it, run bash parser.sh -r -m '/path/to/video/to/convert' -o output_file.mp4, or add the -s flag (bash parser.sh -r -s -m '/path/to/video/to/convert' -o output_file.mp4) to run the disperse program serially instead of in parallel.

Output results

Below you can see two videos side by side. The one on the right was transformed with the disperse filter.

When it comes to execution time, the parallel version was faster:

time_check.txt
parallel time: 481
serial time: 652

So the parallel run took about 8 minutes and the serial one about 11 minutes, a speedup of roughly 1.35x.

Final notes

I hope you had fun reading/watching this. To wrap up, here are a few points I would like to share:

  • Working with the ffmpeg command-line tool is really easy. If you have a bunch of images and want to make a movie out of them, I would say this is one of the easiest options to pick (see the one-liner after this list).
  • parallel can speed up computation, but it’s not always easy to estimate up front how big the improvement from parallel processing will be.
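
For reference, a minimal images-to-video command in the style of [3], essentially what make_video in parser.sh does; the 24 fps rate, the frames/%d.png input pattern and the output name are placeholders, not values fixed by the post.

# Minimal sketch: turn numbered PNG frames into an H.264 video
ffmpeg -r 24 -i frames/%d.png -c:v libx264 -pix_fmt yuv420p output.mp4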

All of the source code can be found on my GitHub.

Sources

[1] “Fred’s ImageMagick Scripts: DISPERSE.” http://www.fmwconcepts.com/imagemagick/disperse/index.php

[2] “GNU Parallel.” https://www.gnu.org/software/parallel/

[3] “Using Ffmpeg to Convert a Set of Images into a Video.” https://hamelot.io/visualization/using-ffmpeg-to-convert-a-set-of-images-into-a-video/