Create image slideshow with ffmpeg

The following command takes all *.jpg images from the current folder and creates an MP4 slideshow. The images can have different sizes; ffmpeg will rescale or pad them.

ffmpeg -framerate 1/3 \
       -pattern_type glob -i '*.jpg' \
       -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1" \
       -c:v libx264 -crf 14 -r 25 -pix_fmt yuv422p \
       output.mp4
  • -framerate 1/3 the image changes every 3 seconds
  • -pattern_type glob -i '*.jpg' uses all *.jpg images from the current folder
  • -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1" rescales or pads each image to 1280x720
  • -c:v libx264 use the H.264 codec
  • -crf 14 CRF (Constant Rate Factor) is between 0 and 51, where 0 is lossless, 23 is the default, and 51 is the worst possible quality
  • -r 25 set the output frame rate
  • -pix_fmt set the output pixel format (yuv420p is the most widely compatible choice)
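At -framerate 1/3 every image stays on screen for 3 seconds, so the clip length is easy to predict; a trivial helper to estimate it (the function is made up for illustration, not part of the ffmpeg command):

```shell
# 3 seconds per image, matching -framerate 1/3
estimate_duration() {
    echo "$(( $1 * 3 )) seconds"
}

estimate_duration 20 # 60 seconds
```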

With background music:

ffmpeg -framerate 1/3 \
       -pattern_type glob -i '*.jpg' \
       -i audiofile.mp3 \
       -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1" \
       -c:v libx264 -crf 14 -r 25 -pix_fmt yuv422p \
       -shortest \
       output.mp4
  • -shortest stops encoding when the shortest input ends, so either the images or the MP3 limits the video length

#bash#ffmpeg#2019

Convert MYSQL HEX to string with PHP

Hexadecimal literal values are written using X'string' or 0xstring notation, where string contains hexadecimal digits (0..9, A..F).

SELECT HEX('EXAMPLE'), X'4558414D504C45';
+----------------+-------------------+
| HEX('EXAMPLE') | X'4558414D504C45' |
+----------------+-------------------+
| 4558414D504C45 | EXAMPLE           |
+----------------+-------------------+

You can also convert the HEX value with PHP instead of using SELECT HEX(column) AS column_name ... in SQL.

<?php
function hexToString($hex) {
    $string = '';
    // each byte is encoded as two hex digits
    for ($i = 0; $i < strlen($hex) - 1; $i += 2) {
        $string .= chr(hexdec(substr($hex, $i, 2)));
    }
    return $string;
}

echo hexToString('4558414D504C45'); // EXAMPLE
// note: PHP also has a built-in for this: hex2bin()

Inserting a string into the table is easy: either "INSERT INTO table (column) VALUES (UNHEX(?))" with the hex digits as the parameter, or with a hexadecimal literal starting with 0x or X'string'.
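For comparison, the same hex-to-string conversion can be sketched in plain bash with printf's \x escapes (the input is the example value from above):

```shell
hex=4558414D504C45
string=""

# consume the hex input one byte (two digits) at a time
for ((i = 0; i < ${#hex}; i += 2)); do
    string+=$(printf "\x${hex:i:2}")
done

echo "$string" # EXAMPLE
```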

#PHP#mysql#2019

Install Arduino on Raspberry PI

  1. Download the Arduino IDE for Linux ARM

  2. Untar

    sudo tar xvJf ~/Downloads/arduino-1.8.9-linuxarm.tar.xz -C /opt
    
  3. Install

    sudo -E /opt/arduino-1.8.9/install.sh
    

Now, you have to add the Debian 10 login user to the dialout, tty, uucp and plugdev groups. Otherwise, you won't be able to upload your Arduino code to the Arduino microcontroller.

sudo usermod -aG dialout $(whoami)
sudo usermod -aG tty $(whoami)
sudo usermod -aG uucp $(whoami)
sudo usermod -aG plugdev $(whoami)
sudo reboot
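After the reboot you can check that the membership took effect; this is just a sanity check, not part of the setup (dialout is the group that matters most for serial access):

```shell
# print the current user's groups and look for dialout
if id -nG "$(whoami)" | grep -qw dialout; then
    echo "serial port access OK"
else
    echo "missing dialout group"
fi
```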

#Raspberry#iot#2019

Bonjour service for IOT

mDNS is installed by default on most operating systems or is available as a separate package. On macOS it is installed by default and is called Bonjour. Apple releases an installer for Windows that can be found on Apple’s support page. On Linux, mDNS is provided by avahi and is usually installed by default.

#Bonjour#iot#2019

Subreddit Image Downloader

The subreddit downloader is a bash script which:

  • downloads all images in full size
  • downloads only new images
  • supports paging
  • is macOS/Linux/Windows compatible

#!/usr/bin/env bash

###############################################################################
# Config
###############################################################################

# default subreddit=catpictures :)
subreddit=${1-catpictures} && json=${subreddit}

# default dir=<subreddit name>
dir=$(realpath ${2-${subreddit}}) && mkdir -p ${dir}

# default page=1
pages=${3-1} 

###############################################################################
# Downloading images
###############################################################################

printf "Download all subreddit \e[1;31m/r/${subreddit}\e[m images to \e[1;31m${dir}\e[m\n"

for i in $(seq ${pages});
do

    # download subreddit json file
    curl -sS "https://www.reddit.com/r/${subreddit}.json?limit=100&after=${after}" -A 'random' | json_pp -json_opt utf8,pretty > ${dir}/${json}.json

    printf "\e[1;35mProcessing data from ${dir}/${json}.json\e[m\n"

    images=$(cat ${dir}/${json}.json | jq -r ".data.children[].data.preview.images[0].source.url" | egrep '\.jpg|\.png|\.gif' )

    # download all images from file
    for img in ${images}
    do
        # getting filename from URL
        file=${img##*/} && file=${file%\?*}

        # Download only new images
        if [[ ! -f "${dir}/${file}" ]]; then
            printf "\e[1;90m- ${file}\e[m\n"
            curl -sS -A 'random' "${img//&amp;/&}" -o ${dir}/${file} &
        fi
    done;

    # go to next page
    after=$(cat ${dir}/${json}.json | jq -r ".data.after") && json=${after}

    if [[ ${after} == "null" || -z ${after} ]]; then
        break # there are no more pages
    fi

done;

###############################################################################
# Cleanup
###############################################################################

wait # wait for all background downloads to finish

rm ${dir}/*.json

Usage

./subreddit-download.sh <subreddit name> <directory> <pages>

Download all images from catpictures subreddit:

./subreddit-download.sh catpictures ./catpictures 5
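The filename extraction inside the script is plain bash parameter expansion; here it is in isolation (the URL below is made up):

```shell
img='https://i.redd.it/abc123.jpg?width=640&auto=webp'

file=${img##*/}  # drop everything up to the last slash
file=${file%\?*} # drop the query string

echo "$file" # abc123.jpg
```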

Requirements

The script relies on curl, jq and json_pp being available on PATH.

Source code: https://github.com/OzzyCzech/subreddit-image-downloader

#bash#reddit#curl#2019

Fix casks with depends_on that reference pre-Mavericks

If you get an error of the type Error: Cask 'xxx' definition is invalid: invalid 'depends_on macos' value: ":mountain_lion", where xxx can be any cask name and :mountain_lion any pre-Mavericks macOS release name, run the following command:

/usr/bin/find "$(brew --prefix)/Caskroom/"*'/.metadata' -type f -name '*.rb' -print0 \
| /usr/bin/xargs -0 /usr/bin/perl -i -pe 's/depends_on macos: \[.*?\]//gsm;s/depends_on macos: .*//g'
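To see what the second substitution does, you can run it on a sample stanza (the line below is made up, not taken from a real cask):

```shell
# the substitution blanks out single-line depends_on stanzas
echo "depends_on macos: '>= :lion'" | perl -pe 's/depends_on macos: .*//g'
# prints an empty line
```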

#macOS#brew#Catalina#2019

Backup full git repository as single bundle file

Git is capable of "bundling" its data into a single file. The bundle command will package up everything that would normally be pushed over the wire with a git push command into a binary file that you can email to someone or put on a flash drive, then unbundle into another repository.

The following bash function clones the repository and creates a single bundle file with a nice name:

#!/bin/bash

function git_backup() {    
    target=$(echo ${1#*:} | tr / _)        
    git clone --mirror $1 ${target} && cd ${target}
    git bundle create ${2-../}/${target%%.git}.bundle --all
    cd - && rm -rf ${target}
}

Usage:

git_backup git@github.com:OzzyCzech/dotfiles.git ~/Downloads/

PS: Note that git bundle only copies commits that are reachable from some reference (branch or tag) in the repository, so dangling commits are not stored in the bundle.

You can also create nice alias in .gitconfig file:

[alias]
  backup="!gb() { target=$(echo ${1#*:} | tr / _); git clone --mirror $1 ${target} && cd ${target}; git bundle create ${2-../}/${target%%.git}.bundle --all; cd - && rm -rf ${target}; }; gb"

For more information see https://github.com/OzzyCzech/dotfiles

Backup whole GitHub account

You can use the GitHub API to get a list of all of a user's repos (note that the API paginates, so add ?per_page=100 to get up to 100 repositories per request). Then a bit of bash magic is needed to extract the right names from the response.

curl -s https://api.github.com/users/OzzyCzech/repos | json_pp | grep full_name | cut -d\" -f4

Or there are a number of tools specifically designed for manipulating JSON from the command line. One of the best is jq:

for repo in $(curl -s https://api.github.com/users/OzzyCzech/repos | jq -r ".[].ssh_url")
do  
  git backup $repo /Volumes/Backup/git
done;

Restore

You can directly clone a repository from a bundle file:

git clone my-super-file.bundle directory
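The whole backup-and-restore round trip can be sketched end to end with a throwaway repository (all paths below are temporary and made up):

```shell
tmp=$(mktemp -d) && cd "$tmp"

# create a tiny repository with a single commit
git init -q repo && cd repo
git -c user.name=test -c user.email=test@example.com \
    commit -q --allow-empty -m "init"

# pack everything reachable from any ref into one file
git bundle create ../repo.bundle --all

# check the bundle is valid, then clone from it
git bundle verify ../repo.bundle
git clone -q ../repo.bundle ../restored
```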

#bash#git#2019

How to backup iCloud drive with rclone

All iCloud Drive data is located in the ~/Library/Mobile\ Documents/com~apple~CloudDocs/ folder. You can easily sync it with rclone to a backup hard drive connected to your Mac. In my case it is mounted at /Volumes/Backup/.

rclone sync ~/Library/Mobile\ Documents/com~apple~CloudDocs/ /Volumes/Backup/iCloudDriveBackup --copy-links

If you also need a backup of deleted files (sync normally removes files that were deleted from the source), there is the --backup-dir parameter:

rclone sync ~/Library/Mobile\ Documents/com~apple~CloudDocs/ /Volumes/Backup/iCloudDriveBackup --copy-links \
       --backup-dir="/Volumes/Backup/iCloudDriveArchive/$(date +%Y)/$(date +%F_%T)"
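The --backup-dir value is just a path built with date, one folder per year and one per run; you can preview what it expands to (the prefix is the example path from this note):

```shell
# e.g. /Volumes/Backup/iCloudDriveArchive/2019/2019-11-05_14:30:00
echo "/Volumes/Backup/iCloudDriveArchive/$(date +%Y)/$(date +%F_%T)"
```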

#iCloud#backup#macOS#2019