Categories
timesavers troubleshooting

Recovering the Config of a Running Xen DomU

For those “oh poop” moments

I was in a situation where I had a running Xen guest, but the config file that defined the DomU was missing.

Fortunately, the listing command (xl list) has a long option, xl list -l, which prints out domain information in JSON format. This includes config information, from which the DomU configuration can be rebuilt.
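As a rough illustration — assuming the guest is called mydomu, and that you'd rather poke at the output with python than by eye — you can capture and pretty-print it like so:

#!/usr/bin/env python3
# dump-domu.py - capture `xl list -l` output for one domain and pretty-print it
# (illustrative sketch; the domain name "mydomu" is a placeholder)
import json
import subprocess

out = subprocess.run(["xl", "list", "-l", "mydomu"],
                     capture_output=True, text=True, check=True).stdout
info = json.loads(out)

# write a nicely indented copy to work from when rebuilding the cfg file
with open("mydomu-recovered.json", "w") as f:
    json.dump(info, f, indent=2)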

Categories
automation python timesavers video

Rescheduling YouTube Videos using Python

More ‘exactly what it says on the tin’

A couple weeks ago, I had to renumber some Hunt: Showdown videos in a playlist:

Well, now I have another issue. When we started playing Hunt: Showdown, I was publishing the videos a couple per day on Mondays, Wednesdays and Fridays. Putting them all out at once is a bit of a crass move as it floods subscribers with notifications, so spreading them out is the Done Thing.1

However, we’re now above 150 videos, and even after adding weekends to the schedule that still takes us up to, umm, May.

What I’d like to do is go back and redo the schedule so that all pending videos use Saturdays and Sundays, and maybe think about doing three or four per day, which would bring us down to about eight or six weeks’ worth. That is still a lot, quite frankly, but pushing up the frequency further would be detrimental.

Changing the scheduled publish date would be even more painful than renumbering: it requires more clicks, I’d have to keep track of which video was next and figure out when it was supposed to go out, and there are more to do (120-odd).

So back to python! I have already written a schedule-determiner for automating the generation of the pre-upload json template, so I can reuse that — read: from genjson import next_scheduled_date — for this task.

The filtering logic is straightforward: ignore anything that isn’t a Hunt video, and skip anything before a defined start date (ie videos already published). From there, replace each remaining video’s ‘scheduled’ date with the next date from the new schedule.
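In outline it looks something like the sketch below. This is not the actual script: the file name, the ‘title’/‘scheduled’ field names, the date format and the valid days/times are all assumptions based on my own template, as is the exact signature of next_scheduled_date.

# reschedule.py - a sketch of the rescheduling pass, not the actual script.
# Field names, file names, the date format and the valid days/times are assumptions.
import json
from datetime import datetime
from genjson import next_scheduled_date    # the schedule-determiner mentioned above

FMT = "%Y-%m-%d %H:%M"
START = datetime(2020, 3, 9, 20, 0)        # anything scheduled before this is already out
VALID_DAYS = [0, 2, 4, 5, 6]               # Mon, Wed, Fri, Sat, Sun (weekday numbers)
VALID_TIMES = [(17, 30), (20, 0)]          # illustrative publish times

with open("videos.json") as f:
    videos = json.load(f)

current = START
for video in videos:
    if "Hunt" not in video.get("title", ""):
        continue                           # ignore anything that isn't a Hunt video
    if datetime.strptime(video["scheduled"], FMT) < START:
        continue                           # skip videos already published
    current = next_scheduled_date(current, VALID_DAYS, VALID_TIMES)
    video["scheduled"] = current.strftime(FMT)

with open("videos-rescheduled.json", "w") as f:
    json.dump(videos, f, indent=2)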

For the current set of scheduled videos that are not already published, the new schedule of 3 videos on each of 5 days (15 per week) gives:

Current date: 2020-04-06 17:30
New date:     2020-03-09 20:00

So we’ve saved a month! Plus the pending videos (~40) will be done in two and a half weeks instead of four.

From here it’s straightforward to rewrite the scheduled field and use shoogle as before to change the dates, this time setting publishAt under status. Note that privacyStatus needs to be explicitly set to private, even if it is already set! This avoids a “400 The request metadata specifies an invalid scheduled publishing time” error.
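For reference, the request body handed to shoogle ends up looking something like this (the timestamp is illustrative; the important bits are publishAt and the explicitly-set privacyStatus):

{ "body": {
    "id": <ID>,
    "status": {
        "publishAt": "2020-03-09T20:00:00.0Z",
        "privacyStatus": "private"
    }
  },
  "part": "status"
}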

Another thing done quickly with python!


1: On the note of ‘Done Things’, the thing to do would be to upload fewer videos in the first place.

I’ve considered that, and if a video is truly mundane and missable, I will omit it. But as well as being fun/interesting videos of individual rounds, the playlist should serve as a demonstration of our progress as players. The Dead by Daylight playlist does this: we start with no idea what’s going on or how to play properly, and by the final video — somewhere north of 300 — we are pretty competent.

Categories
automation python timesavers video

Renumbering Ordered Videos in a YouTube Playlist with Python

Doing exactly what it says on the tin

I’ve been playing Hunt: Showdown with friends recently. With these kinds of things I like to stream and record the footage of us playing so that others can share our enjoyment — highs and lows! — and so we can watch them back later.

The videos are compiled in a playlist on YouTube, in the order recorded. The tools that I’ve written to help automate the process of getting the videos from a file on a hard drive to a proper YouTube video include numbering.

I realised that I had missed out three videos, which would throw off the numbering. The easy options would be to:

  • add them to the end of the playlist; downside: the video number wouldn’t reflect the order and progression
  • insert them in the right place manually; downside: it would take a long time to manually renumber the subsequent videos (about 60)
  • write a script to do this for me

Guess which one I picked?

Interacting with YouTube programmatically comes in two main forms: the APIs directly, or a wrapper like shoogle. The latter is what I am familiar with, and it has the benefit o’ being a braw Scottish word to boot!

The list of video files I’ve uploaded is in json format, which makes interaction a cinch. The list is loaded, anything that isn’t a Hunt: Showdown video is skipped*, a regex matches the video number, and if that number is over a cutoff (59 in this case) the number in the title is increased by 4 (I also had a duplicate number in the list!).
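The renumbering itself boils down to a small regex helper, roughly like this (the ‘#NNN’ pattern and the cutoff/offset are specific to my titles):

# renumber.py - sketch of the title renumbering (pattern/cutoff specific to my titles)
import re

def bump_title(title, cutoff=59, offset=4):
    """Increase the '#NNN' number in a title by `offset` if it is above `cutoff`."""
    match = re.search(r"#(\d+)", title)
    if match and int(match.group(1)) > cutoff:
        new_number = int(match.group(1)) + offset
        return title[:match.start(1)] + str(new_number) + title[match.end(1):]
    return title

# e.g. bump_title("Golden Battle (Hunt: Showdown #99)")
#      -> "Golden Battle (Hunt: Showdown #103)"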

This title is then set using shoogle. The API has certain things it expects, so I had to ‘update’ both the title and the categoryId, though the latter remained the same. You also have to tell the API which parts you are updating, which in this case is the snippet.

As an example, the json passed to shoogle might look like:

{ "body": {
    "id": <ID>,
    "snippet": {
        "title": "Golden Battle (Hunt: Showdown #103)",
        "categoryId": "20"
        },
    },
 "part": "snippet"
}

From here it’s a simple matter to invoke shoogle (I use subprocess) to update the video title on YouTube.
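Roughly, that looks like the sketch below — check shoogle’s own help for the exact service string and credential flags, as those depend on your setup:

# update_title.py - sketch of pushing one title update through shoogle
import json
import subprocess
import tempfile

def update_snippet(video_id, title, category_id="20"):
    request = {
        "body": {
            "id": video_id,
            "snippet": {"title": title, "categoryId": category_id},
        },
        "part": "snippet",
    }
    # shoogle reads the request body from a JSON file
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        json.dump(request, f)
        request_file = f.name
    # service string / auth flags may differ depending on your shoogle setup
    subprocess.run(["shoogle", "execute", "youtube:v3.videos.update", request_file],
                   check=True)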

The one caveat I would mention is that you only get 10 000 API credits per day by default. Updating the video costs 50 units per update, plus the cost of the resource (for snippet this is 2), which works out to 192 videos per day, max.

Once the list has been updated, I dump out the new list.

Much quicker than doing it manually, and the videos all have the right number!

Categories
automation python timesavers video

Improving Generated JSON Template for YouTube Uploads

Further automation automation

On a few of my Europa Universalis series, I’ve used a quick little python script to take care of some of the predictable elements of the series — tags, title and video number — and work out a schedule.

Having gone through the process of uploading a lot of Dead by Daylight videos in the past, and with a large and growing set of Hunt: Showdown videos building up, it seems like a good time to start adapting that script.

There is a significant hidden assumption here: my video file names are in ISO 8601 format, so we can sort based on filename.

As the previous uses had been EUIV videos, the parameters were coded in as variables. This is obviously undesirable for a general-purpose script, so we need some way of passing in the things we want. And since we’re outputting JSON, why not use JSON formatting for the parameters file too?

We look for a supplied directory and file pattern, os.path.join them, and pass the result to glob.glob to build the file list. We then use a sorted() copy of that list, which — see the assumption above — will have the videos in the correct order for the playlist.
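That part is only a few lines; a sketch, with the parameter names invented for illustration:

# build the sorted file list (sketch; params["directory"]/params["pattern"]
# come from the JSON parameters file described above)
import glob
import os

def build_file_list(params):
    pattern = os.path.join(params["directory"], params["pattern"])  # e.g. dir + *.mp4
    files = glob.glob(pattern)
    return sorted(files)  # ISO 8601 filenames sort into recording order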

Iterating through this sorted list, we can set the basics that uploadytfootage expects.

The only ‘fancy’ work here is in figuring out the schedule dates. Quoting my own docstring:

"""Based on:
    - the current scheduled date
    - valid days [M,Tu,W,Th,F,Sa,Su]
    - valid times (eg [1600, 1745, 2100])
    return the next scheduled date"""

I debated whether to make this a generator; and in the end I avoided it for reasons I can’t quite remember.

First we look at hours: if there’s a valid time later in the current day, use that. If not, we set the new hours part to the earliest of the valid times.

Next, days: if there’s a valid day later in the current week, move to that one. If not, take the difference between the current day and the earliest valid day away from 7, and add that to get the new day. That one might need a bit of explaining:

Valid: Monday (1) || Current: Friday (5):
7 – (5 – 1) = 3.

Using 3 for the days component of the timedelta gives us the Monday following the current Friday. We can also set the hours and minutes component of the time in that timedelta object.

Then it’s simply a matter of returning the value of the current scheduled date plus the timedelta!
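Putting the above together, a sketch of the same logic (not the actual genjson code), using weekday numbers (Monday = 0) and (hour, minute) tuples for the valid times:

from datetime import timedelta

def next_scheduled_date(current, valid_days, valid_times):
    """Return the next slot after `current`, given valid weekdays (Mon=0..Sun=6)
    and valid times as (hour, minute) tuples. A sketch of the logic described
    above, not the genjson implementation itself."""
    # Hours first: if a valid time remains later in the current day, use it.
    for hour, minute in sorted(valid_times):
        if (hour, minute) > (current.hour, current.minute):
            return current.replace(hour=hour, minute=minute, second=0, microsecond=0)

    # Otherwise roll over to the earliest valid time on the next valid day.
    hour, minute = min(valid_times)
    later_days = [d for d in sorted(valid_days) if d > current.weekday()]
    if later_days:
        days_ahead = later_days[0] - current.weekday()
    else:
        # Wrap to next week: 7 - (current day - earliest valid day),
        # e.g. valid Monday (0), current Friday (4): 7 - (4 - 0) = 3 days.
        days_ahead = 7 - (current.weekday() - min(valid_days))

    new = current + timedelta(days=days_ahead)
    return new.replace(hour=hour, minute=minute, second=0, microsecond=0)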

In addition, I skip changing the scheduled date for any video that has “part” in the filename, on the basis that if it’s just been split for length — such as a three-hour EUIV video split into hour segments — the different parts should all go out on the same day.

Having all the dates in the schedule figured out and set automatically is a huge timesaver.

The JSON provided by genjson is valid as far as uploadytfootage goes, but the only things that really need done are setting a title (if the videos in the series have different titles; EUIV playlists tend not to, Hunt ones do), a description, a thumbnail title and a thumbnail frame time.

Doing those few things is much quicker than redoing the metadata for each and every video.

Categories
coding timesavers video

Quick Hacks: A Script to Extract a Single Image/Frame From Video

Long ago, I posted the simple way to get a frame of a video using ffmpeg. I’ve been using that technique for a long time.

It can be a bit unwieldy for iteratively finding a specific frame, as in a terminal you have to move the cursor back into the middle of the command to change the time specification. So I wrote a very small wrapper script to put the time part at (or towards) the end:


#!/bin/bash
# f.sh - single frame

USAGE="f.sh infile timecode [outfile]"

if [ "$#" == "0" ]; then
        echo "$USAGE"
        exit 1
fi

if [ -e "$1" ]; then
        video="$1"
else
        echo "file not found: $1"
        exit 1
fi

if [ ! -z "$2" ]; then
        time="$2"
else
        echo "Need timecode!"
        exit 1
fi

# if we have a filename write to that, else imagemagick display

if [ ! -z "$3" ]; then
        echo "ffmpeg -i \"$video\" -ss $time  -vframes 1 -f image2 \"$3\""
        ffmpeg -loglevel quiet -hide_banner -ss $time -i "$video" -vframes 1 -f image2 "$3"
else
        echo "ffmpeg -i \"$video\" -ss $3  -vframes 1 -f image2 - | display"
        ffmpeg -hide_banner -loglevel quiet -ss $time  -i "$video" -vframes 1 -f image2 - | display
fi

Most of that is usage explanation, but broadly it has two modes:

  • display an image (f.sh video time)
  • write an image (f.sh video time image)

It’s more convenient to use it, hit ↑ (up arrow) and amend the time than to move the cursor into the depths of an ffmpeg command.

Categories
coding timesavers

Quick Hacks: A script to import photos to month-based directories (like Lightroom)

tl;dr: A bash script written in 15 minutes imports files as expected!

I was clearing photos off an SD card so that I have space to photograph a friend’s event this evening. Back on Windows, I would let Lightroom handle imports. Darktable is my photo management software of choice, but it leaves files where they are during import:

Importing a folder does not mean that darktable copies your images into another folder. It just means that the images are visible in lighttable and thus can be developed.

I had photos ranging from July last year until this month, so I needed to put them in directories from 2017/07 to 2018/02. Looking up metadata and copying and pasting seemed like a tedious misuse of my time*, so I wrote a little script to do it. It is not robust due to some assumptions (eg that the ‘year’ directory already exists) but it got the job done.

#!/bin/bash
# importcanon.sh - import from (mounted) sd card to directories based on date

CARD_BASEDIR="/tmp/canon"
PHOTO_PATH="DCIM/100CANON/"

TARGET_BASEDIR="/home/robert/mounts/storage/photos"

function copy_file_to_dir() {
    if [ ! -d "$2" ]; then
        echo "$2 does not exist!"
        mkdir "$2"
    fi
    cp "$1" "$2"
}

function determine_import_year_month() {
    #echo "exiftool -d "%Y-%m-%d" -S -s -DateTimeOriginal $1"
    yearmonth=$(exiftool -d "%Y/%m/" -S -s -DateTimeOriginal "$1")
    echo $yearmonth
}

printf "%s%sn" "$CARD_BASEDIR" "$PHOTO_PATH"

i=0
find "$CARD_BASEDIR/$PHOTO_PATH" -type f | while read file
do
    ym=$(determine_import_year_month "$file")
    copy_file_to_dir "$file" "$TARGET_BASEDIR/$ym"
    if let "$i %10 == 0"; then
        echo "Processed file $i ($file)"
    fi
    let i++

done

This uses exiftool to extract the year and month (in the form YYYY/MM), and that is used to give a target to cp.

The enclosing function has a check to see if the directory exists ([ ! -d "$2" ]) before copying. Using rsync would have achieved the effect of auto-creating a directory if needed, but that i) involves another tool ii) probably slows things down slightly due to invocation time iii) writing it this way let me remind myself of how to check for directory existence.

I still occasionally glance at how to iterate over files in bash, even though there are other ways of doing so!

There is also a little use of modulo in there to print some status output.

Not pretty, glamorous or robust but it got the job done!


*: Golden rule: leave computers to do things that they are good at

Categories
linux timesavers

Timesaver: import and combine GoPro Footage with FFmpeg

I’ve been taking my GoPro to Sunday Morning Football (as it is known) for a while now, so I figured I’d automate the process of importing the footage (moving it from microSD) and combining it into one file (GoPro splits recordings by default).

So I have the following script:


#!/bin/bash

GOPRO="/tmp/gopro"
DATE="$(date +%Y-%m-%d)"
VIDEO_BASE="/home/robert/mounts/storage/video/unsorted"
VIDEO_DEST="$VIDEO_BASE/$DATE"

if [ -e $GOPRO ]; then
        echo "Copying..."
        rsync -aP --info=progress2 --remove-source-files --include='*.MP4' $GOPRO/DCIM/100GOPRO/ $VIDEO_DEST/
        echo "Joining..."
        cd $VIDEO_DEST
        #cd $GOPRO/DCIM/100GOPRO/
        for file in `ls *.MP4`; do echo "file '$file'" >> stitch.txt; done
        #RECORD_DATE="$(ffprobe -v quiet `ls *.MP4 | head -n1` -show_entries stream=index,codec_type:stream_tags=creation_time:format_tags=creation_time | grep creation_time | head -n1| cut -d '=' -f 2| cut -d ' ' -f1)"
        # new format:
        RECORD_DATE="$(ffprobe -v quiet `ls *.MP4 | head -n1` -show_entries stream=index,codec_type:stream_tags=creation_time:format_tags=creation_time | grep creation_time | head -n1| cut -d '=' -f 2| cut -d ' ' -f1| cut -d 'T' -f1)"
        #echo "$RECORD_DATE"
        ffmpeg -y -f concat -i stitch.txt -c copy $RECORD_DATE.mp4
else
        echo "GoPro microSD not mounted?"
fi

Assumptions:

  • the microSD is already mounted before running (under /tmp/gopro) – I had considered automating this, but I figured running a script in response to insertion of removable media was a bad idea; I could add the mkdir and mount commands here, but since the latter requires root privileges I’d rather not, and it is quickly recalled from bash history in any case
  • the $VIDEO_BASE directory is mounted and created (this is pretty stable)
  • the GoPro won’t number directories higher than 100GOPRO (eg 101GOPRO) – it possibly would if dealing with eg timelapse, but I am not covering that case
  • the GoPro will set creation time correctly; so far it has reset to the default date a few times, probably related to battery
  • I want to keep the source files around after creation (the script could remove them)

Given the above, the script may seem a bit fragile – and it is definitely tightly coupled to my assumptions – but it’s done the job for a few weeks at least, and the commands it was based on have been pretty stable since I started recording football last year.