Mixed-Precision Neural Network Training with APEX

TLDR: Just make these changes:

    from apex import amp
    # add this after net and optimizer are defined:
    net, optimizer = amp.initialize(net, optimizer, opt_level='O1')
    # replace 'loss.backward()' with this:
    with amp.scale_loss(loss, optimizer) as scaled_loss:
        scaled_loss.backward()

Background: I have a Turing GPU, which contains hardware optimized for efficient FP16 (half-precision floating point) processing. This is useful because GPU memory is often a bottleneck in deep learning - doubling the size of a network or doubling the batch size can have a sizable impact.
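The scale_loss wrapper exists because many gradients are too small for FP16 and underflow to zero. This is not the author's code, just a NumPy sketch of the arithmetic behind loss scaling, with an assumed scale factor of 2^16 (amp chooses the factor dynamically):

```python
import numpy as np

grad = 1e-8                        # a gradient value too small for FP16
assert np.float16(grad) == 0.0     # underflows to zero in half precision

scale = 2.0 ** 16                  # loss-scale factor (illustrative choice)
scaled = np.float16(grad * scale)  # scaling the loss scales every gradient
assert scaled != 0.0               # the value now survives in FP16

recovered = float(scaled) / scale  # unscale in FP32 before the optimizer step
```

Because scaling the loss scales every gradient by the same factor, dividing it back out in FP32 before the optimizer step recovers the original update (up to FP16 rounding).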

Staff Removal in PyTorch (Revisiting ICDAR 2013)

2012 was a significant year for computer vision, as AlexNet smashed past records (and same-year competitors) on the ImageNet recognition challenge. In the following months and years, the field embraced CNN-based techniques, and a vast number of tasks and benchmarks saw major improvements in performance. Because of this, and thanks to the maturity of modern deep learning frameworks, it is quite often the case that pre-deep-learning challenges and benchmarks can be trivially surpassed, often with huge margins, simply by using basic out-of-the-box deep learning techniques.

Email Bomb

On August 12, for about 24 hours, my email inbox was flooded with emails, peaking at over 1 email/second. This type of attack is known as an email bomb, and the intent is to overwhelm email providers and/or user attention as cover for other simultaneous attacks (which might generate their own notification emails - password changes, online purchases, etc. - that get buried in the flood). The attacker did not use their own computing resources to send emails - instead, the attacker had a list of mailing lists, and used a script to subscribe my email address to each one.

Graph Cuts on Markov Random Fields

Submodular, binary: Exact polynomial-time solution via min-cut/max-flow
Submodular, multi-label: Exact polynomial-time solution via min-cut/max-flow
Metric, binary: N/A
Metric, multi-label: NP-hard; polynomial-time alpha-expansion reaches a local min within a factor of 2 of the global min
Neither, binary: NP-hard; polynomial-time quadratic pseudo-boolean optimization can produce an exact partial solution
Neither, multi-label: NP-hard; polynomial-time alpha-beta swap reaches a local min

Submodularity: Binary submodular cost functions satisfy Cost(a,b) + Cost(b,a) - Cost(a,a) - Cost(b,b) >= 0. Multi-label submodular cost functions satisfy:
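The binary submodularity condition above is easy to check mechanically. A minimal sketch (the function name and the two example costs are my own; the Potts model is the standard submodular example):

```python
def is_binary_submodular(cost):
    """Check Cost(0,1) + Cost(1,0) - Cost(0,0) - Cost(1,1) >= 0
    for a pairwise cost function over binary labels."""
    return cost(0, 1) + cost(1, 0) - cost(0, 0) - cost(1, 1) >= 0

# Potts penalty: charges 1 for disagreeing neighbors -> submodular
potts = lambda a, b: 0 if a == b else 1
assert is_binary_submodular(potts)

# Rewarding disagreement instead violates the condition -> not submodular
anti = lambda a, b: 1 if a == b else 0
assert not is_binary_submodular(anti)
```

Submodular pairwise costs are exactly the ones that can be represented as edge capacities in a min-cut graph, which is why the binary submodular cell of the table admits an exact polynomial-time solution.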

Serial Access for R8000/AC3200 (and other) Routers

So you bricked your router. Or maybe you just want a more convenient way to manage and monitor firmware upgrades (wiping settings via command is a lot more pleasant than holding down power buttons). Either way, adding serial access is pretty easy for many routers. I first did this a couple years ago, but I had to do it again recently, so I documented the process here for my current router (Netgear R8000/AC3200).

Hungarian Matching Demo

Back in 2013, as a class project, we built a JavaScript demo of the Hungarian algorithm. The basic idea is that it’s a polynomial-time method to obtain the optimal matching between two sets of objects (e.g. matching people to resources), where every pairing has some cost (or reward) associated with it. I had never used JavaScript before this project, and I never used it again afterwards, so I have no idea whether the code itself is any good, but it was a fun project.
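To make the problem concrete: given a cost matrix where entry (i, j) is the cost of pairing person i with resource j, the goal is the permutation minimizing total cost. This brute-force reference (exponential, unlike the Hungarian algorithm's polynomial time; the cost matrix is a made-up example) shows what "optimal matching" means:

```python
from itertools import permutations

def min_cost_assignment(cost):
    """Brute-force optimal one-to-one assignment of rows to columns.
    The Hungarian algorithm computes the same answer in polynomial time;
    this exponential version just defines the objective."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][best[i]] for i in range(n))

# Hypothetical costs: cost[i][j] = cost of assigning person i to resource j
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
assignment, total = min_cost_assignment(cost)
# assignment pairs person 0 with resource 1, person 1 with 0, person 2 with 2
```

In practice, scipy.optimize.linear_sum_assignment solves the same problem efficiently.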

Building Meshlab from Source in Ubuntu

Every time I build Meshlab, it’s always a little more work than it really should be. So here are my notes from my most recent build (June 2018, Ubuntu 18.04).

Clone the repositories (this is for building master; switch to a release branch/tag if you prefer):

    git clone git@github.com:cnr-isti-vclab/meshlab.git
    git clone git@github.com:cnr-isti-vclab/vcglib.git -b devel

Install dependencies (you may need other dependencies; these are just the ones that I needed at this point in time)

Dual-booting Ubuntu 18.04 with macOS (including full disk encryption)

Introduction: I’ve been running Ubuntu on MacBook Pros for a couple of years now, and while the ease of installation, driver support, and general stability have greatly improved in recent years, it can be difficult to find up-to-date guides. I’ve recently set up a mid-2015 MacBook Pro dual-booting macOS with Ubuntu 18.04, so I figured I’d document my steps. First some overall notes and warnings, then simple instructions for a non-encrypted install, followed by slightly longer instructions for an encrypted install.

Publishing a Website from Emacs and Hugo

Introduction: After 5 years, it’s time to give the site a bit of a refresh, now with fewer images and more words. Previously I used Bootstrap plus a bit of manual editing. This time I’ll be using a pipeline of Emacs org-mode -> ox-hugo -> hugo -> nearlyfreespeech.net. This post will self-document my steps to get all that up and running. The last time I did any web-related things was over 5 years ago, and I wasn’t an expert then, so these steps should be taken with a grain of salt.
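Once ox-hugo has exported the org files to Hugo-flavored markdown, the last two stages of the pipeline are plain shell commands. A hedged sketch - the username, remote host, and paths are placeholders, and the exact NFSN deploy target depends on your account setup:

```shell
# Build the static site; Hugo writes rendered output to ./public by default
hugo

# Preview locally before publishing (-D also renders draft posts)
hugo server -D

# One way to publish to nearlyfreespeech.net, assuming ssh access is set up
# (placeholder username/host; --delete removes remote files gone from ./public)
rsync -avz --delete public/ username@ssh.phx.nearlyfreespeech.net:/home/public/
```

Any static-file transfer works here; rsync is just one common choice for pushing a Hugo public/ directory to a host.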