6. 12. 2019

Create image slideshow with FFMPEG

The following command takes all *.jpg images from the current folder and creates an mp4 image slideshow. The images can have different sizes; ffmpeg will rescale or pad them.

ffmpeg -framerate 1/3 \
       -pattern_type glob -i '*.jpg' \
       -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1" \
       -c:v libx264 -crf 14 -r 25 -pix_fmt yuv422p \
       output.mp4
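If you need a different display time per image, ffmpeg's concat demuxer is an alternative. A sketch with hypothetical file names (one.jpg, two.jpg); note the last file is listed twice because the final duration directive would otherwise be ignored:

```shell
# build a concat list; each "duration" sets how long the preceding image is shown
cat > slides.txt <<'EOF'
file 'one.jpg'
duration 5
file 'two.jpg'
duration 2
file 'two.jpg'
EOF

# encode only when ffmpeg is available; same scale/pad filter as above
if command -v ffmpeg >/dev/null 2>&1; then
  ffmpeg -f concat -i slides.txt \
         -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1" \
         -c:v libx264 -crf 14 -r 25 -pix_fmt yuv422p \
         output.mp4
fi
```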

With background music:

ffmpeg -framerate 1/3 \
       -pattern_type glob -i '*.jpg' \
       -i audiofile.mp3 \
       -vf "scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1" \
       -c:v libx264 -crf 14 -r 25 -pix_fmt yuv422p \
       -shortest \
       output.mp4

#bash #ffmpeg

27. 11. 2019

Convert MYSQL HEX to string with PHP

Hexadecimal literal values are written using X'string' or 0xstring notation, where string contains hexadecimal digits (0..9, A..F).

SELECT HEX('EXAMPLE'), X'4558414D504C45';
+----------------+-------------------+
| HEX('EXAMPLE') | X'4558414D504C45' |
+----------------+-------------------+
| 4558414D504C45 | EXAMPLE           |
+----------------+-------------------+

You can also do the HEX conversion in PHP instead of using SELECT *, HEX(column) AS column_name ... in SQL.

<?php
function hexToString($hex) {
    $string = '';
    // read the hex string two digits at a time and convert each pair to a character
    for ($i = 0; $i < strlen($hex); $i += 2) {
        $string .= chr(hexdec(substr($hex, $i, 2)));
    }
    return $string;
}

// since PHP 5.4 the built-in hex2bin() does the same:
// hex2bin('4558414D504C45') === 'EXAMPLE'

Inserting a string into the table is easy: just use "INSERT INTO table (column) VALUES (HEX(?))", or pass a hexadecimal literal starting with 0x or written as X'string'.

#PHP #mysql

24. 10. 2019

Install Arduino on Raspberry PI

  1. Download https://www.arduino.cc/en/Main/software (Linux ARM 64 bit)
  2. Untar: sudo tar xvJf ~/Downloads/arduino-1.8.9-linux64.tar.xz -C /opt
  3. Install: sudo -E /opt/arduino-1.8.9/install.sh

Now, you have to add the Debian 10 login user to the dialout, tty, uucp and plugdev groups. Otherwise, you won’t be able to upload your Arduino code to the Arduino microcontroller.

sudo usermod -aG dialout $(whoami)
sudo usermod -aG tty $(whoami)
sudo usermod -aG uucp $(whoami)
sudo usermod -aG plugdev $(whoami)
sudo reboot
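After the reboot you can verify the membership with id (a quick sketch; exact group names may vary per distribution):

```shell
# check that the current user is in each required group
for g in dialout tty uucp plugdev; do
  if id -nG "$(whoami)" | tr ' ' '\n' | grep -qx "$g"; then
    echo "$g: ok"
  else
    echo "$g: missing"
  fi
done
```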

#Raspberry #iot

15. 10. 2019

Subreddit Image Downloader

Subreddit downloader is a bash script which downloads all images from a given subreddit:

#!/usr/bin/env bash

###############################################################################
# Config
###############################################################################

# default subreddit=catpictures :)
subreddit=${1-catpictures} && json=${subreddit}

# default dir=<subreddit name>
dir=$(realpath ${2-${subreddit}}) && mkdir -p ${dir}

# default pages=1
pages=${3-1}

###############################################################################
# Downloading images
###############################################################################

printf "Download all subreddit \e[1;31m/r/${subreddit}\e[m images to \e[1;31m${dir}\e[m\n"

for i in $(seq ${pages});
do

    # download subreddit json file
    curl -sS "https://www.reddit.com/r/${subreddit}.json?limit=100&after=${after}" -A 'random' | json_pp -json_opt utf8,pretty > ${dir}/${json}.json

    printf "\e[1;35mProcessing data from ${dir}/${json}.json\e[m\n"

    images=$(jq -r ".data.children[].data.preview.images[0].source.url" ${dir}/${json}.json | grep -E '\.jpg|\.png|\.gif')

    # download all images from file
    for img in ${images}
    do
        # getting filename from URL
        file=${img##*/} && file=${file%\?*}

        # Download only new images
        if [[ ! -f "${dir}/${file}" ]]; then
            printf "\e[1;90m- ${file}\e[m\n"
            curl -sS -A 'random' "${img//&amp;/&}" -o ${dir}/${file} &
        fi
    done;

    # go to next page
    # go to next page (jq prints "null" when there is no next page)
    after=$(jq -r ".data.after" ${dir}/${json}.json) && json=${after}

    if [[ ${after} == "null" || ${after} == "" ]]; then
        break # there are no more pages
    fi

done;

###############################################################################
# Cleanup
###############################################################################

rm ${dir}/*.json

wait #wait for all background jobs to terminate
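The filename extraction in the script is plain bash parameter expansion; on a sample (made-up) Reddit preview URL it works like this:

```shell
img='https://i.redd.it/abc123.jpg?width=640&amp;s=token'
file=${img##*/}   # drop everything up to the last slash
file=${file%\?*}  # drop the query string after the '?'
echo "$file"      # -> abc123.jpg
```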

Usage

./subreddit-download.sh <subreddit name> <directory> <pages>

Download all images from catpictures subreddit:

./subreddit-download.sh catpictures ./catpictures 5

Requirements

The script requires curl, jq and json_pp to be installed.

Source code: https://github.com/OzzyCzech/subreddit-image-downloader

#bash #reddit #curl

11. 10. 2019

Fix casks with depends_on that reference pre-Mavericks

If you get an error of the type Error: Cask 'xxx' definition is invalid: invalid 'depends_on macos' value: ":mountain_lion", where xxx can be any cask name and :mountain_lion any macOS release name, run the following command:

/usr/bin/find "$(brew --prefix)/Caskroom/"*'/.metadata' -type f -name '*.rb' -print0 \
| /usr/bin/xargs -0 /usr/bin/perl -i -pe 's/depends_on macos: \[.*?\]//gsm;s/depends_on macos: .*//g'

#macOS #brew #Catalina

27. 9. 2019

Backup full git repository as single bundle file

Git is capable of "bundling" its data into a single file. The bundle command will package up everything that would normally be pushed over the wire with a git push command into a binary file that you can email to someone or put on a flash drive, then unbundle into another repository.

The following bash function will clone a repository and create a single bundle file with a nice name:

#!/bin/bash

function git_backup() {    
    target=$(echo ${1#*:} | tr / _)        
    git clone --mirror $1 ${target} && cd ${target}
    git bundle create ${2-../}/${target%%.git}.bundle --all
    cd - && rm -rf ${target}
}
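How the bundle name is derived from the remote URL (pure parameter expansion, shown on the dotfiles URL from the usage example below):

```shell
repo='git@github.com:OzzyCzech/dotfiles.git'
target=$(echo ${repo#*:} | tr / _)  # strip the host part, replace slashes with underscores
echo "${target%%.git}"              # -> OzzyCzech_dotfiles
```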

Usage:

git_backup git@github.com:OzzyCzech/dotfiles.git ~/Downloads/

PS: Note that git bundle only copies commits that lead to some reference (branch or tag) in the repository, so dangling commits are not stored in the bundle.

You can also create a nice alias in the .gitconfig file:

[alias]
  backup="!gb() { target=$(echo ${1#*:} | tr / _); git clone --mirror $1 ${target} && cd ${target}; git bundle create ${2-../}/${target%%.git}.bundle --all; cd - && rm -rf ${target}; }; gb"

For more information see https://github.com/OzzyCzech/dotfiles

Backup whole GitHub account

You can use the GitHub API to get a list of all user repos. Then it takes a bit of bash magic to extract the right names from the response.

curl -s https://api.github.com/users/OzzyCzech/repos | json_pp | grep full_name | cut -d\" -f4

Or there are a number of tools specifically designed for manipulating JSON from the command line. One of the best seems to be jq:

for repo in $(curl -s https://api.github.com/users/OzzyCzech/repos | jq -r ".[].ssh_url")
do  
  git backup $repo /Volumes/Backup/git
done;

Restore

You can directly clone a repository from the bundle file:

git clone my-super-file.bundle directory

#bash #git

12. 9. 2019

How to backup iCloud drive with rclone

All iCloud drive data is located in the ~/Library/Mobile\ Documents/com~apple~CloudDocs/ folder. You can easily sync it with rclone to a backup hard drive connected to your Mac. In my case it is mounted at /Volumes/Backup/.

rclone sync ~/Library/Mobile\ Documents/com~apple~CloudDocs/ /Volumes/Backup/iCloudDriveBackup --copy-links

If you also need a backup of all deleted files (sync normally removes files that were deleted from the source), there is the --backup-dir parameter.

rclone sync ~/Library/Mobile\ Documents/com~apple~CloudDocs/ /Volumes/Backup/iCloudDriveBackup --copy-links \
       --backup-dir="/Volumes/Backup/iCloudDriveArchive/$(date +%Y)/$(date +%F_%T)"

#iCloud #backup #macOS

12. 9. 2019

Download all images from URL into a single folder

There are plenty of options, but the easiest one is to use the command line. wget is a command line utility that allows you to download whole web pages, files and images from a specific URL.

The following command works just fine:

wget -nd -nc -np \
     -e robots=off \
     --recursive -p \
     --level=1 \
     --accept jpg,jpeg,png,gif \
     example.website.com

What does all that mean?

  -nd saves all files to the current directory, without recreating the site's directory hierarchy
  -nc skips files that have already been downloaded (no clobber)
  -np never ascends to the parent directory when retrieving recursively
  -e robots=off ignores the robots.txt exclusion rules
  --recursive -p turns on recursive retrieving, including page requisites (images, styles, ...)
  --level=1 limits the recursion to one level deep
  --accept downloads only files with the listed extensions

Other useful download options are -H (span hosts), --random-wait (pause between retrievals) and -U (set the User-Agent header), used in the example below. Read more on the wget manual page.

Real world example

Download all Homophones, Weakly images since 2011

wget -nd -nc -np \
     -e robots=off \
     --recursive -p \
     --level=1 \
     --accept jpg,jpeg \
     -H --random-wait \
     -U "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36" \
     http://homophonesweakly.blogspot.com/{2011..2019}
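Note that {2011..2019} is bash brace expansion, not a wget feature — the shell expands the range and passes one URL per year to wget:

```shell
# echo shows what wget actually receives (shortened range for brevity)
echo http://homophonesweakly.blogspot.com/{2011..2013}
```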

#bash #wget