Dr. Lawlor's Code, Robots, & Things

April 27, 2020

Video masking in Kdenlive

Filed under: Random Thoughts, Video Editing — Dr. Lawlor @ 12:24 am

A key trick for fancy video effects is masking, which lets you combine parts of one video with parts of another video (this is easier if they’re all shot on the same tripod setup). I figured out how to do basic image-driven masking in Kdenlive today.

Open the GIMP and draw a “mask”: black for the parts of the frame you want to be the background video, and transparent (gray checkerboard) for those you want kept as the foreground video. Blur out the boundary to make it much harder to see the edge between shots. Save your mask as PNG and import into Kdenlive as a clip.

Drag the mask image clip to your timeline on top of your foreground and background video clips. Stretch the time out to cover your scene.

Right click the mask clip, “Add Transition” -> “Composite and Transform”. Set the “Compositing” operator to “Destination out” to have the mask’s black parts chop out your foreground. Drag out the composite and transform transition to cover your whole shot.

Now right click the foreground clip, “Add Transition” -> “Composite” to composite the result over the backdrop (the default operator). Drag out the transition to cover the whole shot.


You should immediately be able to play your composited scene. Set the mask-to-foreground compositing operator to “Destination in” if you want black as foreground, transparent as background. Tweak your brightness levels (I prefer “Add Effects” -> “Curves”) and start working on the next Orphan Black.

(Or at least you can impress your mom!)

February 11, 2020

Starting generator at -26F

Filed under: Alaska, Random Thoughts — Dr. Lawlor @ 2:32 pm

We had a fairly long power outage starting early this morning just after 3am.  Because the outside temperature was -26F this morning in Fox, Alaska, by 9am the house had cooled to about 54F and we were starting to get concerned.  Our frozen foods were OK just being stored outside on the deck, and our refrigerator was likely to be OK as the house interior slowly cooled to near refrigerator temperature, but all our plumbing was cooling off rapidly without heat tapes, and it’s annoying to live without running water.

But this September I’d bought a Firman propane / gasoline generator–a solid machine, but at 220lb, they’re not kidding about the two-person lift!  I’d also wired up an interlocked main-breaker generator transfer switch when I reworked our electrical service for solar this summer.  (I’m not paid by any manufacturer, just a happy customer!)

I have the generator fed from a portable 20lb propane cylinder so we can leave it fueled and ready to go (stale gasoline won’t start in the cold), and had installed full synthetic 5W-30 motor oil that the generator manual recommends for cold weather, but it still took a few tries to get the generator running in the cold.  The choke seems to behave strangely in cold weather, but after a few tense minutes of alternating between the onboard battery powered starter and the hand-pull rope, the generator lit.  As soon as I flipped the transfer switch to power the house, I could see the furnace kick on, and our radiators were soon warm, and our plumbing safe.

Our electricity is incredibly reliable here because our local utility GVEA is awesome, and the Fox electrical substation also serves a gold mine–one that loses about $1M/day if they don’t have electricity.  So GVEA got the power back on less than an hour after I got the generator running.  But it was nice to verify that our generator can keep the house running–in the extreme cold, the difference between shivering in the dark and basically having a normal day can be a modest up-front investment!

October 14, 2019

Making UNIX do what you want

Filed under: Linux, Random Thoughts — Dr. Lawlor @ 10:15 am

hal> open the pod bay doors
I'm sorry, Dave, I'm afraid I can't do that: permission denied.
hal> open the pod bay doors please
I'm still getting permission denied, Dave.
hal> sudo open the pod bay doors
They're opening now, Dave.

September 11, 2019

Architectural Chroot for Raspberry Pi

Filed under: Linux, Programming, RaspberryPi, Servers, Sysadmin — Dr. Lawlor @ 8:59 pm

One easy way to do sysadmin, apply updates, or other maintenance on a Raspberry Pi or BeagleBone Black is to pull out the microSD storage card, and put it into a full size Linux computer.  The tricky part about this is your big computer uses x86, and the Pi/BeagleBone uses an ARM chip, so even if you mount the ARM filesystem you can’t directly run its programs.  There’s a neat fix (hat tip: Sentry’s Tech Blog) called architectural chroot that emulates the ARM binaries using QEMU, registered directly with your regular x86 kernel using binfmt_misc:

sudo apt-get install qemu-user-static
mkdir arm_mnt
sudo mount /dev/mmcblk0p2 arm_mnt   # second microSD partition: the Pi's root filesystem
cd arm_mnt
sudo mount -o bind /dev dev    # expose the host's devices, processes, and sysfs inside the chroot
sudo mount -o bind /proc proc
sudo mount -o bind /sys sys
echo ':arm:M::\x7fELF\x01\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x28\x00:\xff\xff\xff\xff\xff\xff\xff\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff:/usr/bin/qemu-arm-static:' | sudo tee /proc/sys/fs/binfmt_misc/register   # register QEMU as the kernel's handler for ARM ELF binaries
sudo cp /usr/bin/qemu-arm-static usr/bin/   # the interpreter must also exist inside the chroot
sudo chroot .

From inside the chroot, you can now run programs and do sysadmin as if you were on the emulated ARM machine, but you still have the RAM and network access of the host (the emulation slowdown roughly cancels the faster x86 CPU, so compute is about the same speed).  To make network access work, I needed to add my host computer’s hostname to the chroot’s etc/hosts file, and you might need to copy in /etc/resolv.conf from the host.
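That long binfmt_misc magic is just the first 20 bytes of an ARM ELF header–the key byte is e_machine at offset 18, which is 0x28 for ARM.  You can use that to check which architecture a binary was built for without running it (a little sketch of mine, not part of the chroot recipe; the arch_byte helper name is made up):

```shell
# Read the ELF header's e_machine low byte (offset 18):
# 40 (0x28) means ARM, 62 (0x3e) means x86-64.
arch_byte() {
  od -An -t u1 -j 18 -N 1 "$1" | tr -d ' '
}

arch_byte /bin/ls   # 40 for an ARM binary, 62 for an x86-64 one
```

Handy for double-checking you really are looking at the card’s ARM binaries and not the host’s.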

To clean up, exit the chroot with ctrl-D, and:

sudo umount dev proc sys
cd ..
sudo umount arm_mnt

(If it won’t let you unmount, use "lsof | grep arm_mnt" to find the offenders, like a dbus fired up by doing an update.)

This is a handy way to add user accounts, configure a headless machine, or grab data from your Raspberry Pi even if it won’t boot!

March 24, 2019

SteamVR 2.2 Input: Tiny Example

Filed under: Programming, Unity, VR — Dr. Lawlor @ 5:39 pm

They keep changing the SteamVR VIVE input system, most recently for the knuckles controller.

As of SteamVR 2.2, you the game programmer define the space of possible Actions in Unity “Window -> SteamVR Input”.  The user is theoretically supposed to be able to customize these in SteamVR “Devices -> Controller Input Binding”, although you provide a JSON file of hopefully working defaults.  You the programmer connect the user-visible Actions to your scripts in Unity Inspector, which seems like a lot of moving pieces to get right.  (More “Linux gaming” than “console gaming”.)

For example, here’s some code to read a boolean Action in Unity 2018.3 via a new Project -> Assets -> Create -> C# Script:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Valve.VR;

public class DumpVR : MonoBehaviour {
  public SteamVR_Action_Boolean Teleport = SteamVR_Input.GetAction<SteamVR_Action_Boolean>("default", "Teleport");

  void Start() {
    Debug.Log("Started DumpVR");
  }

  void Update() {
    if (Teleport.GetState(SteamVR_Input_Sources.Any))
      Debug.Log("Teleport hit");
  }
}

Add the SteamVR Plugin for Unity and drag a SteamVR -> Prefabs -> CameraRig into the scene to replace the default camera. VR head tracking should work already.

Add a Sphere (Scene -> right click -> 3D Object -> Sphere) and drag your new DumpVR script on top of your new sphere.

The first time you run this, you might get the following weird error when you try to GetState on your Action:

NullReferenceException: Object reference not set to an instance of an object
Valve.VR.SteamVR_Action_Boolean.GetState (Valve.VR.SteamVR_Input_Sources inputSource) (at Assets/SteamVR/Input/SteamVR_Action_Boolean.cs:94)
DumpVR.Update () (at Assets/VRtests/DumpVR.cs:17)

To make it work, you just need to select an already-defined action for this instance of the script.  Look at where DumpVR is bound to your Sphere in the Unity Inspector.

For me, the script’s action dropdown in the Inspector was initially set to “None”, causing that NullReferenceException error at runtime.  (But sometimes it autofills correctly.)

That’s a button. For location transforms, I was able to pull the position and rotation (“pose”) from the left hand and change the containing object’s transform with:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Valve.VR;

public class HandyPoser : MonoBehaviour {
  public SteamVR_Action_Pose Poser = SteamVR_Input.GetAction<SteamVR_Action_Pose>("default", "Pose");
  public SteamVR_Input_Sources source = SteamVR_Input_Sources.LeftHand;

  void Update() {
    Vector3 pos = Poser.GetLocalPosition(source);
    Quaternion rot = Poser.GetLocalRotation(source);
    Debug.Log("Left hand: pos " + pos + " rot " + rot);
  }
}

There’s a bit more description at Valve’s samples site, and hairier examples in Assets/SteamVR/InteractionSystem/Samples.  The source code for the user-visible classes is in Assets/SteamVR/Input.

January 29, 2019

LIDAR Selfie

Filed under: Graphics, Programming, Robotics — Dr. Lawlor @ 6:24 pm

Today I unpacked and hooked up an Ouster OS-1-64 LIDAR that they generously donated to UAF’s robotics team.  This is one of the few LIDAR units capable of actually taking a selfie, because it also captures reflectance (top) along with distance (bottom) data.  I’m looking forward to building autonomous robots using this accurate, robust, and capable sensor!


(This is about 1/4 of the full 360 degree horizontal field of view.)

Linux cheatcodes: to start off, you need to find the unit’s .local name.  Plug the lidar into ethernet and power it up, wait until it grabs a DHCP address and starts spinning.  Either look through the server’s DHCP leases, or portscan your network for the control port:

nmap --open -sT -p 7501 <your subnet>/24

nmap will report the lidar’s IP address.  Add an http:// to visit the config page, which gives you the “os1-XXXXXX.local” address that you can use thereafter.
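If you want to script that discovery step, nmap’s “grepable” output (-oG -) is easy to parse.  Here’s a hypothetical helper (the lidar_ip name and the 192.168.1.0/24 subnet are my own illustration, not from Ouster):

```shell
# Print the host IP from nmap grepable-output lines showing port 7501 open:
lidar_ip() { awk '/7501\/open/ {print $2}'; }

# Real usage would be:
#   nmap --open -sT -p 7501 -oG - 192.168.1.0/24 | lidar_ip

# Demo on a canned line of grepable output:
printf 'Host: 10.0.0.42 ()\tPorts: 7501/open/tcp//unknown///\n' | lidar_ip   # prints 10.0.0.42
```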

To view the data (like above), do:

git clone https://github.com/ouster-lidar/ouster_example
cd ouster_example/ouster_viz/
sudo apt-get install libeigen3-dev libvtk6-dev libjsoncpp-dev
mkdir build
cd build
cmake ..
make
./viz os1-XXXXXXXX.local <your machine's IP address>

This will show the lidar returns onscreen!

November 11, 2018

UAF Hackathon 2018 Results

Filed under: Random Thoughts — Dr. Lawlor @ 8:22 pm

I wrote up the results from the excellent 2018 UAF Hackathon, a one-weekend build session that combines computer code, embedded hardware, and awesomeness!

October 19, 2018

Free up RAM on a Raspberry Pi by stopping X

Filed under: Linux, Programming, RaspberryPi — Dr. Lawlor @ 1:56 pm

There’s a very simple trick to free up several hundred megs of RAM on a Raspberry Pi: switch to a text console (ctrl-alt-F1), log in, and shut down the graphical user interface:

sudo service lightdm stop

This liberates over 150 megabytes of RAM, allowing huge programs to compile without hitting swap (although all of OpenCV still takes about 2 hours to compile from scratch).  If you need more terminals, ctrl-alt-F2 (through F6) are available.
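To put a number on it, you can read MemAvailable before and after stopping lightdm (a quick sketch; MemAvailable appears in /proc/meminfo on any reasonably recent kernel):

```shell
# Report available memory in megabytes:
awk '/^MemAvailable/ {printf "%d MB available\n", $2/1024}' /proc/meminfo
```

Run it once before and once after the "service lightdm stop" and compare.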

To get back to the graphical interface, just:

sudo service lightdm start

Press ctrl-alt-F7 to see the GUI again once it’s running.

September 13, 2018

Observational x86 Instruction Usage in 32 and 64 bit code

Filed under: Programming — Dr. Lawlor @ 3:23 pm

Several years ago I wrote a tiny script to measure the frequencies of x86 instructions used by actual programs.  The big surprise for me was that in 32-bit programs, the most common instructions are just doing data movement and control flow:

42.4% mov instructions (data movement)
5.0% lea instructions (quick arithmetic)
4.9% cmp instructions
4.7% call instructions
4.5% je instructions
4.4% add instructions
4.3% test instructions
4.3% nop instructions (for alignment)
3.7% jmp instructions
2.9% jne instructions

Today, in 64-bit mode,  the overall pattern is the same, with a slight drop in mov probably due to more registers being available:

33.6% mov instructions
6.9% cmp instructions
5.3% call instructions
5.1% je instructions
4.7% jmp instructions
4.4% nop instructions
4.4% lea instructions
4.1% test instructions
3.9% add instructions
3.9% push instructions

Script to measure instruction frequencies and register usage on a 64 bit executable:


#!/bin/sh
file="$1"        # the executable to analyze
d=dis.txt        # scratch file: one disassembled instruction per line

objdump -drC -M intel "$file" | \
awk -F: '{print substr($2,24);}' | \
grep -v "^$" > "$d"
tot=`wc -l $d | awk '{print $1}'`
echo "$tot instructions total"

echo "Instruction usage breakdown:"
awk '{print $1}' $d | sort | awk '{
if ($1==last) {count++;}
else {print count, last; count=0; last=$1;}
}' | \
sort -n -r | \
awk '{printf(" %.1f%% %s instructions\n",$1*100.0/'$tot',$2);}' \
> dis_instructions.txt
head -15 dis_instructions.txt

echo "Register and feature usage:"
for reg in eax ebx ecx edx esp ebp esi edi \
  rax rbx rcx rdx rsp rbp rsi rdi r8 r9 r10 r11 r12 r13 r14 r15 \
  xmm ymm zmm \
  "0x" "," "+" "*" "\["
do
  c=`grep "$reg" "$d" | wc -l | awk '{print $1}'`
  echo | awk '{printf(" %.1f%% \"'"$reg"'\" lines\n",'$c'*100.0/'$tot');}'
done

August 22, 2018

Screen Sharing to Extron ShareLink from Linux

Filed under: Linux, Sysadmin — Dr. Lawlor @ 5:12 pm

Like many hardware vendors, the network-to-HDMI Extron ShareLink boxes clearly support Windows and Mac, but there’s no sign of how to make them work on Linux.

But they do have a Chrome app called MirrorOp Sender that seems to work in Chrome from my Ubuntu 18.04 machine, and can successfully connect and share your Linux desktop across the network, with a nice easy to use graphical user interface.  It even successfully downscales my 4K display to 1080p for streaming.

Internally, the Chrome app does all the network communication inside a 2.7MB PNaCl binary.  The network protocol doesn’t look like anything I’ve ever seen: it starts with a connection to TCP port 389 but doesn’t send any data (a port knock?), then opens the control plane channel on TCP 3268 and exchanges a bunch of setup and metadata packets that start with “wppcmd” before settling down to a “wppaliveROCK” / “wppaliveROLL” 3-second keepalive ping cycle, and finally streams the video data in JFIF format over TCP port 8080.  (A vaguely similar protocol is discussed for the Crestron AirMedia hardware here.)
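The keepalive part is simple enough to mimic.  Here’s a toy re-enactment of the ROCK/ROLL cycle–purely my reconstruction from watching packets, not any official spec, and the newline-delimited framing is a guess:

```shell
# A fake ShareLink box: answer each "wppaliveROCK" ping with "wppaliveROLL".
keepalive_reply() {
  while read -r msg; do
    if [ "$msg" = "wppaliveROCK" ]; then
      echo "wppaliveROLL"
    fi
  done
}

echo "wppaliveROCK" | keepalive_reply   # prints wppaliveROLL
```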
