Closing in (95%)

Nearly there!

It’s getting annoyingly grindy now! I spend most of the time in the chopper, either:

  • deploying to an area to put down capture cages then immediately leaving
  • returning to the medical platform on mother base to hand over photographs

The latter is especially irritating as there’s no indication that you need to do it, and you can’t give all ten photographs at once! So chopper in, run to room, cutscene into room, hand over photo, run back to chopper, leave; repeat.

krusty_groan.wav

I have a sheet of paper on which I’m crossing things off as I do them. It’s slightly illegible due to my broken fingers, but usable.

On “Backup, Back Down”

The gods of irony got together with the gods of gaming after my recent gripe about having to do and redo things in MGSV:TPP:

Some missions have mutually-exclusive objectives – I’m looking at you, Backup, Back Down – so may require more

Well, I played through the “Extreme” version of Backup, Back Down to do the additional objectives, and the team searching for the prisoner got there, stood around him and then… very kindly didn’t execute him.

So I ran up, stunned them all with a non-lethal assault rifle and fultoned him out! All optional objectives complete.

Seeking 100% in MGS V: The Phantom Pain

I’m getting close to 100% completion of Metal Gear Solid V: The Phantom Pain. I’ve completed the story (more on that in a follow up retrospective when I’m done) and achieved S-rank in all of the missions, which is easier than it seems at first glance. So I’m chasing the other things that need done:

  • achieve S-rank on every main mission

    wait, I said this already, weren’t you paying attention?

  • complete 157 side ops

    most of the time on this is spent getting to the side-op location; highly repetitive

  • capture / save a specimen of every animal

    Can you say ‘fetch quest’? I thought you could!

  • complete all important combat deployments

    click a button on the menu and wait? sure

  • collect all blueprints

    should be achieved if doing all the other stuff anyway

  • collect all key items

    similarly, most should be gained in the course of things, except the first aid kit (off the top of my head)

  • collect all 10 Paz photographs

    all but one gained via side ops

  • complete all mission tasks

    last but not least! Practically speaking, this means missions need ‘around 3’ plays: the first time it’s new, the second for S-rank, and the third for the remaining objectives. Some missions have mutually-exclusive objectives – I’m looking at you, Backup, Back Down – so may require more

The final one is probably the biggest time sink, though the side ops come close. At a rough guess, I reckon I’ve done all the mission objectives in somewhere between 25% and 40% of missions, maybe more. Some of the objectives take a while, particularly the ones which involve following a target and listening to their conversations.

The Good

Playing without care for rank or speed generally means more fun! The optional objectives push you towards things that are a little more out of the way (like capturing patrolling armoured vehicles, or recovering a blueprint) but are rewarding in themselves.

But the most time-consuming ones – the ‘listen to a series of conversations’ objectives – I’ve found the most interesting, as they reveal more about the plot. Having completed the story, it’s a bit like rereading a book and going “oh, so that’s what they were foreshadowing!”.

The Bad

It’s a grind. Redoing things you’ve already done, watching the Boss (in the guise of whichever character) fly into the AO over and over again gets repetitive quickly. I’ve seen Hideo Kojima’s name in the credits more times than I can count – it appears a few times in every mission’s credits, which is pretty repetitive in itself.

There’s also the nagging feeling that chasing a meaningless number in a game is a huge waste of life, but I try to push that to the back of my mind. I’m having fun!

The Ugly

There are occasional bugs; one mission (Lingua Franca) is particularly prone to them. I’ve also had people/outposts/bases get spooked and never leave the heightened security state, necessitating a restart.

Am I still enjoying playing the game, even though I’ve completed the story? Yes, but only just.

[solved] js52: /usr/lib/libmozjs-52.so.0 exists in filesystem

Living on the edge in Arch Linux land is a fun activity everyone should try (at least once). However, a full system package upgrade caused the following today:

# pacman -Syyu
(...)
error: failed to commit transaction (conflicting files) 
js52: /usr/lib/libmozjs-52.so.0 exists in filesystem

I’m not the only one to have the issue. Seems the official way of getting past this is to rename the file, at least per this bug report.

Update: There’s an Arch news post that adds a modicum more information:

Due to the SONAME of /usr/lib/libmozjs-52.so not matching its file name, ldconfig created an untracked file /usr/lib/libmozjs-52.so.0. This is now fixed and both files are present in the package.

To pass the upgrade, remove /usr/lib/libmozjs-52.so.0 prior to upgrading.
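In other words, something like this before retrying the upgrade (the path being the one from the error above):

# rm /usr/lib/libmozjs-52.so.0
# pacman -Syu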

I think this is the first time I’ve needed to do a manual intervention for a package upgrade in the time I’ve been running Arch, so all in all, not bad.

Quick Hacks: A Script to Extract a Single Image/Frame From Video

Long ago, I posted the simple way to get a frame of a video using ffmpeg. I’ve been using that technique for a long time.

It can be a bit unwieldy for iteratively finding a specific frame, as you have to move the cursor back into the middle of the command to change the time specification. So I wrote a very small wrapper script to put the time part at or towards the end:


#!/bin/bash
# f.sh - single frame

USAGE="f.sh infile timecode [outfile]"

if [ "$#" == "0" ]; then
        echo "$USAGE"
        exit 1
fi

if [ -e "$1" ]; then
        video="$1"
else
        echo "file not found: $1"
        exit 1
fi

if [ ! -z "$2" ]; then
        time="$2"
else
        echo "Need timecode!"
        exit 1
fi

# if we have a filename write to that, else imagemagick display

if [ ! -z "$3" ]; then
        echo "ffmpeg -i \"$video\" -ss $time  -vframes 1 -f image2 \"$3\""
        ffmpeg -loglevel quiet -hide_banner -ss $time -i "$video" -vframes 1 -f image2 "$3"
else
        echo "ffmpeg -i \"$video\" -ss $3  -vframes 1 -f image2 - | display"
        ffmpeg -hide_banner -loglevel quiet -ss $time  -i "$video" -vframes 1 -f image2 - | display
fi

Most of that is usage explanation, but broadly it has two modes:

  • display an image (f.sh video time)
  • write an image (f.sh video time image)
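
For example (filename and timecodes here are made up):

# preview the frame at 1m23s via ImageMagick's display
f.sh holiday.mp4 00:01:23
# happy with the frame? write it out to a file instead
f.sh holiday.mp4 00:01:23.250 frame.png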

It’s more convenient to use it, hit ↑ and amend the time than to move the cursor into the depths of an ffmpeg command.

Better Backups: Decide What You’re Going To Back Up

tl;dr: Picking what you are going to back up helps to (i) keep backup space usage to a minimum and (ii) inform your choice of backup program

Following on from picking a backup system in the backups series, now that you’ve picked a system, what exactly should you back up?

You could make the argument that really, what you’re going to back up is part of your requirements gathering. Frequently-changing data (eg documents) is different from a snapshot of a Windows installation is different from an archive of the family photos.

In my case, I want to back up my home directory, which is a mix of things:

  • documents of all sorts
  • code (some mine, some open source tools)
  • application configuration data
  • browser history etc
  • miscellaneous downloads

It totals less than 20 GB, most of which is split between downloads, browser and code (around 3:1:1, according to ncdu). Some things like documents, code and browser data will change semi-frequently and old versions are useful; others will stay relatively static and version history is not so important (like downloads).

Some downloads were for a one-off specific purpose and removed. It would be possible to pare down further by removing some downloads and some code — wine is the largest directory in ~/code/, and I don’t remember the last time I used it — but it’s not enough that I feel it’s a priority to do.

Is there anything in this set of data that doesn’t need kept? Frequently-changing-but-low-utility files like browser cache would be worth excluding as they will cause the (incremental) backups to grow in size. Incidentally, cache was the next largest item in the ratio above!

Some of the files will change relatively frequently, and I’d like to keep their history. I have decided that I want to keep my entire home directory, minus browser cache. This helps to inform what I need my backup program to do, and what to point it at once I’ve decided.
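
As a sketch of what that decision looks like in practice (rsync here purely as an illustration – I haven’t settled on a program yet – and the destination path is made up):

# whole home directory, minus browser/application cache under ~/.cache
rsync -a --exclude='.cache/' "$HOME/" /mnt/backup/home/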

Quick Hacks: A script to import photos to month-based directories (like Lightroom)

tl;dr: A bash script written in 15 minutes imports files as expected!

I was clearing photos off an SD card so that I have space to photograph a friend’s event this evening. Back on Windows, I would let Lightroom handle imports. Darktable is my photo management software of choice, but it leaves files where they are during import:

Importing a folder does not mean that darktable copies your images into another folder. It just means that the images are visible in lighttable and thus can be developed.

I had photos ranging from July last year until this month, so I needed to put them in directories from 2017/07 to 2018/02. Looking up metadata and copying and pasting by hand seemed like a tedious misuse of my time*, so I wrote a little script to do it. It is not robust due to some assumptions (eg that the ‘year’ directory already exists) but it got the job done.

#!/bin/bash
# importcanon.sh - import from (mounted) sd card to directories based on date

CARD_BASEDIR="/tmp/canon"
PHOTO_PATH="DCIM/100CANON/"

TARGET_BASEDIR="/home/robert/mounts/storage/photos"

function copy_file_to_dir() {
    if [ ! -d "$2" ]; then
        echo "$2 does not exist!"
        mkdir "$2"
    fi
    cp "$1" "$2"
}

function determine_import_year_month() {
    #echo "exiftool -d "%Y-%m-%d" -S -s -DateTimeOriginal $1"
    yearmonth=$(exiftool -d "%Y/%m/" -S -s -DateTimeOriginal "$1")
    echo $yearmonth
}

printf "%s%sn" "$CARD_BASEDIR" "$PHOTO_PATH"

i=0
find "$CARD_BASEDIR/$PHOTO_PATH" -type f | while read file
do
    ym=$(determine_import_year_month "$file")
    copy_file_to_dir "$file" "$TARGET_BASEDIR/$ym"
    if let "$i %10 == 0"; then
        echo "Processed file $i ($file)"
    fi
    let i++

done

This uses exiftool to extract the year and month (in the form YYYY/MM), and that is used to give a target to cp.

The enclosing function has a check to see if the directory exists ([ ! -d "$2" ]) before copying. Using rsync would have achieved the effect of auto-creating a directory if needed, but that i) involves another tool ii) probably slows things down slightly due to invocation time iii) writing it this way let me remind myself of how to check for directory existence.
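
For reference, that rsync variant would have looked something like this (untested sketch):

# untested: rsync creates the final component of the destination path if it is
# missing, which covers the month directory (the year is assumed to exist)
rsync -a "$file" "$TARGET_BASEDIR/$ym"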

I still occasionally glance at how to iterate over files in bash, even though there are other ways of doing so!

There is also a little use of modulo in there to print some status output.

Not pretty, glamorous or robust but it got the job done!


*: Golden rule: leave computers to do things that they are good at

QA: How can I stop on-screen volume indicator from showing every ~30s?

Note: This is a backup of a QA over at Super User which was auto-deleted, so is preserved here for posterity. Users with >10k rep can see the QA using the link.

This question was one of mine, which didn’t receive any attention, sadly!


I am using crunchbangplusplus (#!++, cbpp) Linux, a Debian-based lightweight distro which uses xfce4-notifyd to provide desktop notifications. One such notification is the humble volume indicator:

the notification for volume, isn't it lovely?

The indicator pops up in response to changes in volume and muting/unmuting. This is grand, but when I have vlc running, the volume indicator pops up every 30-45 seconds, which is rather distracting.

Some searching led me to a crunchbang forum thread about disabling the volume indicator; but I don’t fancy losing all notifications just to rid myself of this turbulent volume display.

It did however bring me to xfce4-notifyd-config:

xfce4-notifyd-config dialog – lovely, but useless here

but unfortunately it doesn’t have an option to configure individual notifications. I also checked the volume mixer (PNmixer) preferences:

a triptych of volume preferences

but nothing of help there.

Interestingly, I have observed that when the volume shows, it jumps from one volume (vlc‘s?) to another (system volume?). It also doesn’t happen if both vlc and system volume match at 100%. Since under Linux vlc can set the system volume, I am wondering if there’s a conflict here.

tl;dr:

Volume notification appears every ~30s when vlc is running – why, and how can I stop it?


1: A long gif for the patient:

a very long gif of the notification, captured using silentcast

QA: Identifying multimedia connectors

Note: This is a backup of a QA over at Super User which was auto-deleted, so is preserved here for posterity. Users with >10k rep can see the QA using the link.


pentix asked:

I need some help identifying a connector which at first I thought was an HDMI Type A connector.

The unknown connector is black on both sides, but has a small arrow pointing to the plug on one side. It is a short cable converting SCART to Unknown.


Mystery Connector

Looks like DFP (VESA Digital Flat Panel):


DFP

QA: Is there a way to make a backup of the difference with DD or PV?

Note: This is a backup of a QA over at Super User which was auto-deleted, so is preserved here for posterity. Users with >10k rep can see the QA using the link.


So this is an odd question, I know. Here is what I need. I ran a dd backup in Linux Mint last weekend to back up over 1.8 TB of data to an external 4 TB HDD from my server. This week, people have been using the server again and I haven’t had the time to upload that data to the new drives (they are server-specific and in a RAID-10 configuration, so I am unable to load the backup to them without using the server). What I need to do is figure out how to use dd, or possibly pv, to back up the difference in the data. I want it to skip over existing data from the backup and then back up the new data from this last week without taking the entire weekend to do so. Is there any easy method to do this?


Continuing a dd transfer

If and only if you are confident that what you have is a contiguous piece of data that has been appended to, you can use seek= to start dd from an offset.

just use seek= to resume.

dd if=/dev/urandom of=/dev/disk/by-uuid/etc bs=512 seek=464938971

GNU dd also supports seeking in bytes, so you can resume exactly, regardless of blocksize:

dd if=/dev/urandom of=/dev/disk/by-uuid/etc bs=1M seek=238048755782 oflag=seek_bytes

Credit to frostschutz for his answer over on U&L.
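
Applied to this question – copying from the server’s disk into the image on the external drive – the same offset is needed on the input side too. Something like the following (device, image path and offset are purely illustrative):

# resume a device-to-image copy at a byte offset on both input and output;
# conv=notrunc keeps the existing image from being truncated
dd if=/dev/sdX of=/mnt/usb4tb/server.img bs=1M \
   skip=238048755782 seek=238048755782 \
   iflag=skip_bytes oflag=seek_bytes conv=notrunc status=progress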

But to be clear, this only really works if the data has been appended to, as that is akin to a resume. If you think people have written data like the following:

before: ABCDEF

after: ABCDEFGHIJKLM

and you want to get the GHIJKLM bit, great, dd will do that for you. But dd operates in terms of bytes (well, records) in and out. What you likely have is analogous to:

before: ABCDEF

after: AGCDMLEKFHI

and dd will not help you here!

Frame challenge: dd isn’t the right tool to accomplish what you are trying to do

You want to copy only changed data*? rsync (or something based on the rsync protocol, like rdiff-backup or rsnapshot) is what you want:

Rsync finds files that need to be transferred using a “quick check” algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file’s data does not need to be updated.

from the rsync man page
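
As a sketch (source and destination paths are illustrative, and this assumes the data on the server is reachable as ordinary files rather than only as a raw block device):

# copy only new/changed files, and remove files deleted on the source
rsync -a --delete /srv/data/ /mnt/usb4tb/data-backup/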

Further Considerations on Backups

Without knowing what you are trying to accomplish I am reluctant to suggest you do something else, but for anyone else looking at this question and thinking ‘man, it would be useful to have a backup program I could easily get new changes with in a space-efficient manner’, it’s definitely worth having a look through the backup program comparison page on the Arch wiki (the general principles are system-agnostic).

In particular, BorgBackup, bup and Obnam (among others) have favourable characteristics in being generally intelligent about backups and being quite efficient in terms of disk space due to block-level deduplication. In brief, a full 1.8 TB backup need not necessarily read and write the full 1.8 TB of data each time, unlike with dd.
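
As a flavour of what that looks like in practice with BorgBackup (repository path is illustrative), repeated runs only store chunks the repository hasn’t seen before:

# one-time repository setup
borg init --encryption=repokey /mnt/usb4tb/borg-repo
# each run deduplicates against everything already in the repository
borg create --stats /mnt/usb4tb/borg-repo::server-{now} /srv/data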

*: Some further reading on synchronisation and comparisons: https://wiki.archlinux.org/index.php/Synchronization_and_backup_programs