Dr. Lawlor's Code, Robots, & Things

October 14, 2019

Making UNIX do what you want

Filed under: Linux, Random Thoughts — Dr. Lawlor @ 10:15 am
hal> open the pod bay doors
I'm sorry, Dave, I'm afraid I can't do that: permission denied.
hal> open the pod bay doors please
I'm still getting permission denied, Dave.
hal> sudo open the pod bay doors
They're opening now, Dave.

September 11, 2019

Architectural Chroot for Raspberry Pi

Filed under: Linux, Programming, RaspberryPi, Servers, Sysadmin — Dr. Lawlor @ 8:59 pm

One easy way to do sysadmin tasks, apply updates, or perform other maintenance on a Raspberry Pi or BeagleBone Black is to pull out its microSD storage card and put it into a full-size Linux computer.  The tricky part is that your big computer uses x86 while the Pi/BeagleBone uses an ARM chip, so even if you mount the ARM filesystem you can’t directly run its programs.  There’s a neat fix (hat tip: Sentry’s Tech Blog) called an architectural chroot that emulates the ARM binaries using QEMU, registered directly with your regular x86 kernel via binfmt_misc:

sudo apt-get install qemu-user-static
mkdir arm_mnt
sudo mount /dev/mmcblk0p2 arm_mnt
cd arm_mnt
sudo mount -o bind /dev dev
sudo mount -o bind /proc proc
sudo mount -o bind /sys sys
echo ':arm:M::\x7fELF\x01\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x28\x00:\xff\xff\xff\xff\xff\xff\xff\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff:/usr/bin/qemu-arm-static:' | sudo tee /proc/sys/fs/binfmt_misc/register
sudo cp /usr/bin/qemu-arm-static usr/bin/
sudo chroot .
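The long echo line above is just an ELF header fingerprint: the magic bytes are \x7fELF followed by a 32-bit little-endian ELF header whose e_machine field is 0x28 (ARM), and the mask tells the kernel which bytes to ignore when matching (the OS/ABI byte, and the low bit of e_type so both executables and shared objects match).  A quick standalone Python sketch (my own decoding, not part of the setup) verifies what the magic encodes:

```python
# The binfmt_misc magic from the register line above, as raw bytes:
magic = b"\x7fELF\x01\x01\x01" + b"\x00" * 9 + b"\x02\x00\x28\x00"

assert magic[:4] == b"\x7fELF"   # ELF file signature
ei_class = magic[4]              # 1 = ELFCLASS32 (32-bit)
ei_data = magic[5]               # 1 = ELFDATA2LSB (little-endian)
e_type = int.from_bytes(magic[16:18], "little")     # 2 = ET_EXEC
e_machine = int.from_bytes(magic[18:20], "little")  # 0x28 = EM_ARM

print(ei_class, ei_data, e_type, hex(e_machine))  # prints: 1 1 2 0x28
```

When the kernel sees a file starting with these (masked) bytes, it hands it to /usr/bin/qemu-arm-static instead of trying to execute it natively.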

From inside the chroot, you can now run programs and do sysadmin work as if you were on the emulated ARM machine, but you still have the RAM and network access of the host (the emulation slowdown roughly cancels out the faster x86 CPU, so compute is about the same speed).  To make network access work, I needed to add my host computer’s hostname to the chroot’s etc/hosts file, and you might need to copy in /etc/resolv.conf from the host.

To clean up, exit the chroot with ctrl-D, and:

sudo umount dev proc sys
cd ..
sudo umount arm_mnt

(If it won’t let you unmount, use “lsof | grep arm_mnt” to find the offenders, like a dbus fired up by doing an update.)

This is a handy way to add user accounts, configure a headless machine, or grab data from your Raspberry Pi even if it won’t boot!

March 24, 2019

SteamVR 2.2 Input: Tiny Example

Filed under: Programming, Unity, VR — Dr. Lawlor @ 5:39 pm

They keep changing the SteamVR VIVE input system, most recently for the Knuckles controllers.

As of SteamVR 2.2, you the game programmer define the space of possible Actions in Unity via “Window -> SteamVR Input”.  The user can theoretically customize these in SteamVR under “Devices -> Controller Input Binding”, though you provide a JSON file of hopefully-working defaults.  You the programmer then connect the user-visible Actions to your scripts in the Unity Inspector, which is a lot of moving pieces to get right.  (More “Linux gaming” than “console gaming”.)

For example, here’s some code to read a boolean Action in Unity 2018.3 via a new Project -> Assets -> Create -> C# Script:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Valve.VR;

public class DumpVR : MonoBehaviour {
  // Look up the "Teleport" boolean Action from the "default" action set.
  public SteamVR_Action_Boolean Teleport = SteamVR_Input.GetAction<SteamVR_Action_Boolean>("default", "Teleport");

  void Start() {
    Debug.Log("Started DumpVR");
  }

  void Update() {
    // True while the bound button is held down, on any tracked controller.
    if (Teleport.GetState(SteamVR_Input_Sources.Any))
      Debug.Log("Teleport hit");
  }
}

Add the SteamVR Plugin for Unity and drag in a SteamVR -> Prefabs -> CameraRig to replace the default camera. VR head tracking should work already.

Add a Sphere (Scene -> right click -> 3D Object -> Sphere) and drag your new DumpVR script on top of your new sphere.

The first time you run this, you might get the following weird error when you try to GetState on your Action:

NullReferenceException: Object reference not set to an instance of an object
Valve.VR.SteamVR_Action_Boolean.GetState (Valve.VR.SteamVR_Input_Sources inputSource) (at Assets/SteamVR/Input/SteamVR_Action_Boolean.cs:94)
DumpVR.Update () (at Assets/VRtests/DumpVR.cs:17)

To make it work, you just need to select an already-defined action for this instance of the script.  Look at the Teleport dropdown where DumpVR is bound to your Sphere in the Unity Inspector.

For me that dropdown was initially set to “None”, causing that NullReferenceException error at runtime.  (But sometimes it autofills correctly.)

That’s a button. For location transforms, I was able to pull the position and rotation (“pose”) from the left hand and change the containing object’s transform with:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Valve.VR;

public class HandyPoser : MonoBehaviour {
  // Look up the "Pose" Action from the "default" action set.
  public SteamVR_Action_Pose Poser = SteamVR_Input.GetAction<SteamVR_Action_Pose>("default", "Pose");
  public SteamVR_Input_Sources source = SteamVR_Input_Sources.LeftHand;

  void Update() {
    // Read the controller's tracked position and orientation each frame.
    Vector3 pos = Poser.GetLocalPosition(source);
    Quaternion rot = Poser.GetLocalRotation(source);
    Debug.Log("Left hand: pos " + pos + " rot " + rot);
    // Drive this object's transform from the hand pose:
    transform.localPosition = pos;
    transform.localRotation = rot;
  }
}

There’s a bit more description at Valve’s samples site, and hairier examples in Assets/SteamVR/InteractionSystem/Samples.  The source code for the user-visible classes is in Assets/SteamVR/Input.

January 29, 2019

LIDAR Selfie

Filed under: Graphics, Programming, Robotics — Dr. Lawlor @ 6:24 pm

Today I unpacked and hooked up an Ouster OS-1-64 LIDAR that the company generously donated to UAF’s robotics team.  This is one of the few LIDAR units capable of actually taking a selfie, because it also captures reflectance (top) along with distance (bottom) data.  I’m looking forward to building autonomous robots using this accurate, robust, and capable sensor!


(This is about 1/4 of the full 360 degree horizontal field of view.)

Linux cheatcodes: to start off, you need to find the unit’s .local name.  Plug the lidar into Ethernet and power it up, then wait until it grabs a DHCP address and starts spinning.  Either look through the server’s DHCP leases, or port-scan your network for the control port:

nmap --open -sT -p 7501 <your subnet>/24

nmap will report the lidar’s IP address.  Add an http:// to visit the config page, which gives you the “os1-XXXXXX.local” address that you can use thereafter.

To view the data (like above), do:

git clone https://github.com/ouster-lidar/ouster_example
cd ouster_example/ouster_viz/
sudo apt-get install libeigen3-dev libvtk6-dev libjsoncpp-dev
mkdir build
cd build
cmake ..
make
./viz os1-XXXXXXXX.local <your machine's IP address>

This will show the lidar returns onscreen!

November 11, 2018

UAF Hackathon 2018 Results

Filed under: Random Thoughts — Dr. Lawlor @ 8:22 pm

I wrote up the results from the excellent 2018 UAF Hackathon, a one-weekend build session that combines computer code, embedded hardware, and awesomeness!

October 19, 2018

Free up RAM on a Raspberry Pi by stopping X

Filed under: Linux, Programming, RaspberryPi — Dr. Lawlor @ 1:56 pm

There’s a very simple trick to free up well over a hundred megabytes of RAM on a Raspberry Pi: switch to a text console (ctrl-alt-F1), log in, and shut down the graphical user interface:

sudo service lightdm stop

This liberates over 150 megabytes of RAM, allowing huge programs to compile without hitting swap (although all of OpenCV still takes about 2 hours to compile from scratch).  If you need more terminals, ctrl-alt-F2 (through F6) are available.
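To measure the savings yourself, compare /proc/meminfo before and after stopping lightdm.  A small standalone sketch (the helper function and the sample text are mine, not from this post) that pulls out the MemAvailable figure:

```python
def mem_available_kb(meminfo_text):
    """Return the MemAvailable value (in kB) from /proc/meminfo-style text."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemAvailable:"):
            return int(line.split()[1])
    raise ValueError("MemAvailable not found")

# A shortened sample; on a real Pi, use open("/proc/meminfo").read() instead.
sample = """MemTotal:         948304 kB
MemFree:          210000 kB
MemAvailable:     610000 kB"""

print(mem_available_kb(sample))  # prints: 610000
```

Run it once before `sudo service lightdm stop` and once after; the difference is the RAM the GUI was holding.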

To get back to the graphical interface, just:

sudo service lightdm start

Press ctrl-alt-F7 to see the GUI again once it’s running.

September 13, 2018

Observational x86 Instruction Usage in 32 and 64 bit code

Filed under: Programming — Dr. Lawlor @ 3:23 pm

Several years ago I wrote a tiny script to measure the frequencies of x86 instructions used by actual programs.  The big surprise for me was that in 32-bit programs, the most common instructions are just doing data movement and control flow:

42.4% mov instructions (data movement)
5.0% lea instructions (quick arithmetic)
4.9% cmp instructions
4.7% call instructions
4.5% je instructions
4.4% add instructions
4.3% test instructions
4.3% nop instructions (for alignment)
3.7% jmp instructions
2.9% jne instructions

Today, in 64-bit mode, the overall pattern is the same, with a slight drop in mov probably due to more registers being available:

33.6% mov instructions
6.9% cmp instructions
5.3% call instructions
5.1% je instructions
4.7% jmp instructions
4.4% nop instructions
4.4% lea instructions
4.1% test instructions
3.9% add instructions
3.9% push instructions

Script to measure instruction frequencies and register usage on a 64-bit executable:

file="$1"      # executable to analyze (these two setup lines are assumed; the original omitted them)
d=$(mktemp)    # scratch file holding one instruction per line

# Disassemble and keep only the instruction text (strip addresses and hex bytes):
objdump -drC -M intel "$file" | \
awk -F: '{print substr($2,24);}' | \
grep -v "^$" > "$d"
tot=`wc -l < "$d"`
echo "$tot instructions total"

echo "Instruction usage breakdown:"
awk '{print $1}' "$d" | sort | awk '{
  if ($1==last) {count++;}
  else {if (last!="") print count, last; count=1; last=$1;}
} END {print count, last;}' | \
sort -n -r | \
awk '{printf(" %.1f%% %s instructions\n",$1*100.0/'$tot',$2);}' \
> dis_instructions.txt
head -15 dis_instructions.txt

echo "Register and feature usage:"
for reg in eax ebx ecx edx esp ebp esi edi \
  rax rbx rcx rdx rsp rbp rsi rdi r8 r9 r10 r11 r12 r13 r14 r15 \
  xmm ymm zmm \
  "0x" "," "+" "*" "\[" ; do
  c=`grep -c "$reg" "$d"`
  echo | awk '{printf(" %.1f%% \"'"$reg"'\" lines\n",'$c'*100.0/'$tot');}'
done
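The heart of that script is a sort-and-tally of the first token (the instruction mnemonic) on each disassembly line.  The same counting logic in Python, on a made-up four-instruction sample:

```python
from collections import Counter

# A tiny fake disassembly; the first token on each line is the mnemonic.
lines = [
    "mov    eax,0x1",
    "mov    ebx,eax",
    "add    eax,ebx",
    "mov    ecx,eax",
]
counts = Counter(line.split()[0] for line in lines)
total = len(lines)
for mnemonic, n in counts.most_common():
    print("%.1f%% %s instructions" % (100.0 * n / total, mnemonic))
# prints:
# 75.0% mov instructions
# 25.0% add instructions
```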

August 22, 2018

Screen Sharing to Extron ShareLink from Linux

Filed under: Linux, Sysadmin — Dr. Lawlor @ 5:12 pm

As with many hardware vendors, Extron’s network-to-HDMI ShareLink boxes clearly support Windows and Mac, but there’s no sign of how to make them work on Linux.

But they do have a Chrome app called MirrorOp Sender that seems to work in Chrome from my Ubuntu 18.04 machine, and can successfully connect and share your Linux desktop across the network, with a nice, easy-to-use graphical user interface.  It even successfully downscales my 4K display to 1080p for streaming.

Internally, the Chrome app does all the network communication inside a 2.7 MB PNaCl binary.  The network protocol doesn’t look like anything I’ve ever seen: it starts with a connection to TCP port 389 but doesn’t send any data (a port knock?), then opens the control-plane channel on TCP 3268 and exchanges a bunch of setup and metadata packets that start with “wppcmd” before settling down to a “wppaliveROCK” / “wppaliveROLL” keepalive ping cycle every 3 seconds, and finally streams the video data in JFIF format over TCP port 8080.  (A vaguely similar protocol for the Crestron AirMedia hardware is discussed here.)
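Based purely on the traffic observed above, the keepalive appears to be a fixed request/response token pair.  A hypothetical sketch of just that exchange (the function and framing are my guesses from the packet strings; none of this comes from Extron documentation):

```python
# Tokens seen on the wire on the TCP 3268 control channel, roughly every 3 s.
KEEPALIVE_PING = b"wppaliveROCK"
KEEPALIVE_PONG = b"wppaliveROLL"

def keepalive_reply(message):
    """Answer a ShareLink-style keepalive ping; ignore anything else.

    Guesswork from packet captures, not a documented protocol.
    """
    if message == KEEPALIVE_PING:
        return KEEPALIVE_PONG
    return None

print(keepalive_reply(b"wppaliveROCK"))  # prints: b'wppaliveROLL'
```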

August 10, 2018

Navier-Stokes GPU Fluids via Multigrid

Filed under: Graphics, Web — Dr. Lawlor @ 10:23 am

Check out my live demo of GPU fluids via multigrid.  It’s swirly!  I still need to write up the paper describing the multigrid pressure-free update, but here’s a 2015 lecture note describing the general approach.

(I’m reposting some of my old demos so I can show them off during SIGGRAPH 2018)

June 24, 2018

Legit Websites that Track Everything You Do

Filed under: Mass Surveillance, Random Thoughts — Dr. Lawlor @ 9:44 pm

TL;DR Version: many legitimate websites record extremely detailed data about everything you do on the site, including your mouse position and where you scroll, using a tool like ForeSee Replay.  Since there is no global opt-out, you need to install an ad blocker to prevent this kind of bandwidth-wasting privacy intrusion.

This afternoon I opened a few eBay tabs, and I have a vague recollection of seeing one of those standard ForeSee popups asking if I’d like to participate in a survey.  I almost always say “No Thanks”, and I’m 99% sure I did not agree to a survey today.

This evening, I noticed my laptop’s outbound network traffic was heavy, which seriously slows down our rural DSL (4 Mbit down, 1 Mbit up).  The more I looked at this, the less I liked it: Chrome was sending piles of HTTPS data off to four different AWS-hosted servers that list themselves as “ForeSee Record Status (cxReplayRecorder) v2.4.11” (the IP addresses vary, since they’re load balanced).  I grabbed some of the traffic with Wireshark, but it’s encrypted, so I still have no way of telling what exactly was sent.

I knew the traffic was coming from Chrome, but I have a bad habit of keeping about a hundred tabs open (!), so I carefully watched the bandwidth usage as I incrementally closed tabs.  As soon as I closed the eBay tabs, the traffic stopped.  Reopening the tabs didn’t bring the traffic back; it only resumes when the tracker code decides it has enough data to be worth sending back out.

ForeSee seems particularly evasive in the way it phrases the ‘survey’: not only will you (maybe) answer questions, the ForeSee® Replay code also tracks every mouse click, mouse *hover*, and scroll that you do on the site.  Their Replay viewer lets them see heat maps showing where people click, or even hover the mouse.  (This is useful data for building the site, but it’s not disclosed that it is part of the ‘survey’.)

The tracking servers dump a pile of metadata, from which I can see:

  • There are about 10 active replay recorder servers right now.
  • Each server is receiving about 1,000 ‘transmits’ per minute.
  • Each ‘transmit’ occupies about 100 kilobytes (on average), which is a lot of bandwidth, and a lot of information captured.
  • A typical server seems to capture over a terabyte per week, from over 10 million users.
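Those figures are roughly self-consistent; a quick back-of-the-envelope check (my arithmetic, using decimal units):

```python
# Figures from the metadata dump above; 1 TB taken as 1e9 kB (decimal units).
transmits_per_min = 1000   # per server
kb_per_transmit = 100      # average size, kilobytes

kb_per_week = transmits_per_min * kb_per_transmit * 60 * 24 * 7
tb_per_week = kb_per_week / 1e9
print(round(tb_per_week, 2))  # prints: 1.01
```

So 1,000 transmits/minute at ~100 kB each works out to almost exactly a terabyte per week per server, matching the last bullet.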

To stop this, install an ad blocker like the free Adblock Plus, and add “/foresee/*” to the filter list.
