Alexei's pile o'stuff, featuring writing on software development, travel, photography, and more.

    My late-2013 15” MacBook Pro’s discrete GPU — an NVIDIA GeForce GT 750M — was pretty good for gaming during the first year of its life. But around the time that the new generation of consoles dropped, AAA games on the PC started becoming unplayable, even at postage-stamp resolutions with the lowest possible settings. I lived on a strict diet of indie games from 2015 to 2016 — thank goodness for well-tuned titles like Overwatch and The Witness! — but the itch to try games like the new Mirror’s Edge and Deus Ex became too great. Initially, I thought it might be time to switch out my MacBook for the upcoming 2016 model, but the winter reveal wasn’t particularly tempting: CPU performance was about the same as mine and the GPU was — at best — 3 times as powerful. (Still need to see the benchmarks on that — educated guess.) Worth it for a few hundred bucks, but $2000? No way!

    Building a gaming PC wasn’t an option due to my mobile lifestyle, and in any case the kind of CPU I could buy for cheap would be comically underpowered compared to the i7 4850HQ I already had in front of me. So I started looking into the scary world of external Thunderbolt GPUs, colloquially known as eGPU. Modern Thunderbolt 3 (allegedly) supports external GPUs in an official capacity, but older Thunderbolt 2 can get the job done as well, even though it’s unsanctioned by Intel. I’m usually reluctant to pursue these sorts of under-the-radar hobbyist projects, but there was enough prior art to make it worth a shot!

    Unlike many gaming enthusiasts, my goal was to optimize for simplicity over power: the fewer hacks and workarounds I had to use, the better. I already knew I’d have to use an external monitor and do my gaming in BootCamp, which was already the case. I knew there would be some performance loss from the limited bandwidth of TB2. I gathered that there may be timing issues and other problems that would require a bevy of software hacks to fix — mostly on the Windows side of things. But I was most concerned about the hardware hacking required to get the thing up and running in the first place.

    The majority of published eGPU builds involve enormous graphics cards connected to hotwired desktop PSUs, sitting in unseemly, torn-apart Thunderbolt-to-PCI chassis. It was clear that the anointed case for the job was the AKiTiO Thunder2. The Thunder2 wasn’t designed for eGPU use, but dozens of eGPU enthusiasts on forums like TechInferno demonstrated that it ran stable and performed admirably. (AKiTiO engineers even popped in on occasion to offer under-the-table eGPU advice — off-warranty, of course.) It was also one of the cheapest options on the market at around $200: very fair considering that a barebones development Thunderbolt 2 board cost nearly as much!

    Most eGPU builders buy this case to hack up, not to use as-is. Usually, the front panel is bent back or removed to fit larger cards, and then a desktop PSU is made to turn on with a paperclip and adapted to fit the DC plug. There are also arcane startup rituals to get everything powered and running with the right timing. I really didn’t want to have a PSU octopus and a ragged hunk of metal sitting bare on my table, though it sadly seemed inevitable. Then I discovered an alternate route.

    Most GPUs are power hogs that rely on one or two extra power ports on top of the card, but there are a few designed to pull power straight from the PCI slot. These aren’t super-extreme gaming cards, but these days they more than get the job done. For example, the just-released NVIDIA GeForce GTX 1050 Ti can pull 1080p at medium-high settings in many recent games and currently benchmarks as the ~40th best video card on the market! Better yet, many of these single-slot offerings are short, often half the length of the monster enthusiast cards, easily fitting into AKiTiO’s compact case without any modifications. Using this type of card, I’d be able to keep my Thunder2 in one piece and avoid using a PSU entirely. No hacks required!

    At peak, these slot-powered cards can draw 75W from the PCI Express slot. Unfortunately, the AKiTiO Thunder2 only comes with a 60W adaptor, 30W of which is allocated to the circuitry. A dead-end? Not so fast: as stated in the official docs and verified by employees, the Thunder2 can actually pull as much as 120W from a more powerful adaptor. To be compatible, the new power brick needs to sport a 5.5×2.5mm barrel plug, provide 12V output, and have center positive polarity. (Practically every power adaptor has these last two items listed on the back.) My hope was to find a laptop power brick with these same specs, but it turned out that most laptops used chargers with an all-too-high output of 20V. Surprisingly, well-reviewed 12V/10A bricks weren’t common at all on Amazon (unless you lived in the UK or Europe), with most of the listings taken up by rebranded versions of a sketchy-looking adaptor with model number CT-1250. Eventually, I discovered one vendor who was selling bricks with model number CD120100A, which had a more confident label and looked identical to a power brick I saw in another successful closed-case AKiTiO build. (The Amazon listing was full of typos and the product photos didn’t match the user photos, but it just so happened that the adaptor in the user photos was exactly the one I was hoping to find — and Prime allowed for painless returns in any case.) If the US 12V/10A adaptor market was really dominated by CT-1250 and CD120100A, the latter just seemed like a better bet.

    For the graphics card, I decided to give the EVGA factory-overclocked version of the 1050 Ti a try, since one eGPU enthusiast mentioned that their EVGA card handled boot timing issues a bit better. (True or not, I’ve also had positive experiences with EVGA warranty and support in the past, so it was an easy decision.) Potentially, the overclock was a problem: the AKiTiO Thunder2 wouldn’t provide more than 75W of power to the slot, and any excess power pulled by the card could destabilize the system or even fry the circuitry (as reported by one user). But from everything I read, factory-overclocked EVGA cards were designed to never exceed the 75W threshold, and any instability could simply be fixed by underclocking the card slightly using EVGA’s (or possibly NVIDIA’s) own tools. Factor in the fact that the non-overclocked version cost exactly the same as the overclocked one while probably having lower resale value, and it became clear that the SC model was almost certainly the better buy — even if you dropped the clocks right from the get-go.

    (Note: many reviews will point out that the regular 1050 is a much better deal than the 1050 Ti from a price/performance perspective. Still, the Ti is about 20% faster than the non-Ti for just $20 more, and for the sake of future-proofing as well as TB2 performance loss it just makes sense to wring as much power from the purchase as possible.)

    Trawling eGPU forums for installation instructions was quite frustrating. Most users preferred to write about how they got their eGPUs working with their laptop displays (using Optimus drivers — possible with NVIDIA GTX cards) and/or in OSX. Both tasks involved copious scripts and hacks. I was only interested in the bare minimum — BootCamp on an external display — but most guides simply skipped that “easy” part. Would I need to make a custom build of Windows? Edit drivers? Install a custom bootloader? Nothing was clear, so I decided to just jump into it.

    Once I got all the parts assembled, I plugged the Thunder2 into my laptop and my monitor into the Thunder2, crossed my fingers, and turned on the computer while holding down the Alt key (for the boot menu — I already had BootCamp with the latest Windows 10 installed). At first… nothing. Just a black screen and no chime. I tried unplugging the cable, turning the machine on, waiting for the chime, and then plugging it in. The boot menu showed up, but froze when I selected Windows. I tried one more time to boot with the cable plugged in and it worked! Or — at least, it booted into Windows. Nothing showed up on the external display, but the Windows Device Manager had a tempting entry named “Microsoft Basic Display Adapter”. Hopeful, I searched for other eGPU users who had gotten to this step, and it became apparent that all I had to do was install the latest NVIDIA drivers. One reboot later (with no issues this time) and I was seeing “NVIDIA GTX 1050 Ti” in my Device Manager. I gave Overwatch a quick run on the highest settings, but performance didn’t seem particularly great; my suspicion was that the laptop defaulted to the discrete 750M instead of the eGPU. I returned to Device Manager and disabled the 750M, restarted Overwatch, and… 60fps! It actually worked! Holy cow!

    eGPU setup can be daunting depending on your hardware, but I seem to have gotten away with a problem-free configuration. The “hardest” part is getting the computer to chime on boot, presumably indicating that POST went correctly. This involves turning the computer off and on again one or two times in the worst case: if it chimes and the boot menu appears, everything is sure to work fine. (Recently, I’ve been getting the boot menu on first try 100% of the time. Maybe I was just impatient before!) Once booted into Windows, I’ve learned that simply changing the display settings to only use the external monitor, or to extend the desktop and use the external monitor as the main monitor, ensures that the eGPU is used over the discrete chip. (And I believe Windows remembers this preference when you launch with the eGPU connected.)

    Now for some benchmarks! The main bottleneck in this setup is the TB2 connection. TB2 doesn’t allow for the full PCIe x16 throughput, potentially crippling graphics card performance. In practice, this isn’t really that big of a deal: users have reported at most a 20% performance loss over native, and usually a bit less. Let’s see how well we do.

                              GTX 1050 Ti SC   GT 750M   Improvement
    3DMark Fire Strike
      Graphics Score                    6993      1911         3.66×
      Graphics FPS 1                   32.28      8.74         3.69×
      Graphics FPS 2                   28.74      7.96         3.61×
    3DMark Time Spy
      Graphics Score                    2040       450         4.53×
      Graphics FPS 1                   13.67      3.00         4.56×
      Graphics FPS 2                   11.43      2.54         4.50×
    3DMark Sky Diver
      Graphics Score                   22564      5602         4.03×
      Graphics FPS 1                  102.25     26.41         3.87×
      Graphics FPS 2                  103.83     24.80         4.19×
    3DMark 11 (free edition)
      Graphics Score                    8802      2445         3.60×
      Graphics FPS 1                   42.83     11.27         3.80×
      Graphics FPS 2                   42.18     11.40         3.70×
      Graphics FPS 3                   54.32     15.52         3.50×
      Graphics FPS 4                   25.13      7.39         3.40×

    Quite an upgrade! According to PassMark and other benchmark listings, a 1050 Ti should, under normal circumstances, be about 4.5× as powerful as a 750M. Factor in a 10–20% performance loss from the TB link and that’s exactly what we see in our results: a roughly 4× boost on average.

    Even without any underclocking, stability has not been an issue. I’ve been playing hours of Crysis 3, Far Cry 4, and Mirror’s Edge Catalyst over the past few days and everything’s still working great. I’m keeping the case closed, but I don’t think there’s any real risk of overheating: the GPU fan is designed to funnel heat right out through the back and there’s an extra front fan built into the case anyway. According to 3DMark, temperature during benchmarking has been stable.

    I’m not interested in running any weird scripts to get Optimus drivers for the internal display working, but I learned something interesting while fiddling with the Windows display settings. If you set the multiple display setting to “Duplicate these displays”, it seems that somehow the eGPU gets used for both the internal and external display! Assuming I’m interpreting this finding correctly, this means that theoretically you could buy something like this HDMI display emulator and use the eGPU on the internal display without an external monitor and without having to go through the hacky process of getting Optimus up and running. Unfortunately, there’s a performance penalty of about 20%-25% (according to my benchmarks) as well as approximately 0.25 seconds of latency, making this approach untenable for first-person shooters and other twitchy games. (I wonder if this is also the case with the Optimus driver route?)

    Another interesting finding: if you keep the discrete GPU enabled, there’s a setting in the NVIDIA control panel to dedicate one of the GPUs to PhysX. I’m not sure if this will make a real difference in performance or cause stability issues, but it might be worth investigating in the future.

    To summarize, using only…

    …you can assemble a painless, hack-less eGPU build and use it with your late-2013 15” dGPU MacBook as a relatively inexpensive graphics upgrade compared to building a PC from scratch or buying a console. (Cheaper still if you wait for rebates or use an older/weaker X50 card.) Caveat emptor: the same build might not work so well — or at all! — on other MacBook models or even with a different driver version. In other words, what worked for me might not work for you! Remember that eGPU on TB2 is not officially supported and mostly works by accident, though clearly it can work very well.

    (Also, there’s some great information in the HN thread for this post about new and upcoming TB3 enclosures. If you can get one working with a TB3-to-TB2 adaptor, it might be the best option of all for upgradability, reliability, and future-proofing. On the other hand, you’ll probably spend more money and the case will be a lot bigger. Do your research!)

    In time, I hope somebody releases a Thunderbolt 3 eGPU the size of one of those Square credit card readers — maybe sporting a GTX 980M caliber chip? — that plugs into a USB-C port and works seamlessly with the internal display. But for now, this lovely little eGPU will do just fine. I’m confident that my trusty MacBook can now serve me for another few years, especially if NVIDIA continues to release excellent and inexpensive PCI-powered cards on the regular.

    Let’s hope that the eGPU revolution is just beginning!


    Last month, I released an unusual little app for iMessage. It’s called MusicMessages!, and it’s a collaborative step sequencer that lets you work on short pieces of music together with your friends. As far as I can tell, it’s the only app of its kind in the iMessage App Store. (Probably for good reason!)

    The app presents you with a grid of buttons, each corresponding to a musical note. Time is horizontal and pitch is vertical, and the entire grid can be panned like any other iOS scroll view. To place a note, simply tap one of the buttons; tap it again to erase the note. (If you have a 3D Touch capable device, you can depress the button using finger pressure. On an iPhone 7, there’s even a bit of haptic feedback at the end.) The tabs on top of the screen represent independent layers of notes, and if you tap their icons, you can pick a new instrument out of 40+ different ones (including percussion). Once you’re happy with your portion of the piece, you can send it off to one or more fellow iMessage users for their contributions. Each participant’s notes show up in their own unique color, making it easy to track the changes to a piece over time.

    Why iMessage? Since releasing Composer’s Sketchpad, I’ve wanted to create a companion app that would make it even easier to play around with simple musical ideas, though at the expense of expressiveness. Initially, I envisioned this as a tabbed, pannable, Minesweeper-like step sequencer for OSX. But when I started investigating the new iMessage frameworks in iOS 10, I realized that iMessage might be as good a place as any to work out this idea. No sync issues, no file I/O, a format that incentivized short experiments, plus the social aspect just seemed neat! Wouldn’t it be fun to riff on a melody or percussion line with your friends?

    Total development lasted exactly two months and involved approximately 8000 new lines of Swift code, plus 1000 lines and a bunch of assets borrowed from Composer’s Sketchpad.

    Favorite tech bit? The data format! I hate spinning up and maintaining servers, so my aim was to avoid any outside dependencies by sending data strictly through the iMessage APIs. Unfortunately, iMessage sends data via NSURL, which in this case had a hidden limit of 5120 characters. I hit this limit with plain old NSArchiver after about a dozen notes. To solve the problem, I had to compress all my data — 5+ layers, 5+ participants, and as many notes as possible — into approximately 3.75 KB, assuming base64 encoding for the data string. Swift is pretty terrible at dealing with tightly-packed data structures (a 256-element static array can only be represented by a non-iterable 256-member tuple) and so I designed a struct and corresponding helper functions for my data in straight C. Lots of fun counting bits and optimizing for maximum data density… eventually, I settled on a maximum of 12 layers, 8 participants, and 1120 notes, along with a ton of extra data and even some room to spare. Nothing terribly complex, but it’s still fun to optimize within tight constraints.

    Another feature I enjoyed integrating was the perceptually-balanced HSLUV color space for all my user-selected colors. Normally, if you generate colors in the usual HSB color space by varying the hue and keeping saturation and brightness constant, you get colors that are perceived as unequally bright by the human eye. (An artifact of biology, alas.) Perceptually-accurate color spaces like CIELUV attempt to compensate for this, but most of them have large swaths of empty space where impossible colors lie, making it very difficult to create linear ranges of color parametrized by hue. HSLUV goes one step further and stretches the chroma to fill in these gaps. Not perceptually perfect, but just a ton more convenient and usable in practice!

    Since there’s an element of self-marketing in iMessage apps — recipients of app messages are automatically prompted to download the corresponding apps — it was important to make my app free. As I really didn’t want to plaster my interface with ugly ads, I decided to lock some non-critical features behind an in-app purchase. I’d never dealt with this payment model before, and as a complete novice in cryptography, I found the code samples for receipt decryption and validation quite daunting! Fortunately, I discovered an excellent OSX application called Receigen that generated auto-obfuscated receipt and IAP validation headers for my app. It ended up saving what probably would have been several days of frustrating, unrewarding work for just $30. Highly recommended!

    As before, designing the icon was a lot of fun. Just like last time, there was a long period in the middle where I was sure that the right design — one that would equally hint at the interface, functionality, and ambiance of the app — would elude me. And just as before, after a chain of prototype designs that I wasn’t crazy about, the right pieces suddenly snapped into place all at once. On a lark, I even spent a few days parametrizing and animating the icon for my trailer, adding another 900 lines of code through Swift Playgrounds. (Next time, I should probably use something like After Effects or Flash. Keyframing in code is a huge pain, and performance in Playgrounds is hardly sufficient.) The thrill of creative experimentation and discovery is something I sorely miss in my day-to-day programming and makes me all the more eager to get started on my game project.

    Speaking of Adobe, I finally moved on from iMovie to Premiere Elements for my trailer. What a relief! Although deceptively simple at first, PE conceals enormous power in its effects and keyframing features. In trademark Adobe fashion, the program does its best to infuriate you into almost paying for the full CC; but with some clunky zoomed-in Bézier adjustments and begrudging cut-and-paste alignment of keyframe positions, it’s easy to create a video that moves, changes color, and feels very dynamic. The trailer I saw in my head came together in just a few days, and now iMovie feels like a joke in comparison. Well worth the $50 I paid for it on sale.

    MusicMessages! was an attempt at a speed project, so there are many stones left unturned. The UI takes up too much room. The instrument tabs in horizontal mode are too hard to reach. Transitions are jittery and some of the UI glitches out on rotation. There should probably be a chord option for beginners. Percussion is in MIDI order, which is… a little bit crazy. But overall, I’m quite happy with the result! I hope people get a kick out of this weird project and enjoy sending their oddball musical ideas to each other.

    One more thing. There’s a good chance I’ll be releasing a standalone, file-based version of the app in the future (with MIDI, IAA, Audiobus and all that good stuff). If you’d be interested in using such an app, do let me know!

    An Even Better Travel Gaiwan (01/08/17)

    Previously, I wrote about the Asobu Travel Mug as an excellent (if unintentional) travel gaiwan. Now, there’s a new leader in the not-a-gaiwan-but-almost-better-than-one category: the Klean Kanteen 8oz insulated tumbler.

    This mug is a bit thicker than the Asobu, but in trademark Klean Kanteen fashion the quality is simply superb. Heat is retained perfectly: there are no hot spots around the lip or anywhere on the body. Compared to the flaky finish of the Asobu, the matte black of the Klean Kanteen is slick and feels like it’ll last for ages. The shape is a little odd at first glance but feels great in the hand, and the rounded lip is perfect to drink from.

    Like the Asobu, the Klean Kanteen has a rubber-lined lid that can double as a strainer. For the most part, I use the sipping hole to strain: the lid snaps on very tightly and most loose-leaf teas expand enough to avoid going through the hole. (You might get a few stragglers, but the same thing happens with my regular gaiwan technique anyway.) If that doesn’t work, you can just pop the lid off and use the rubber seal as a makeshift strainer. As with the Asobu, the “lever” on the back of the lid can serve as a stopper while tilting it back. Admittedly, I did prefer the Asobu lid for its looser fit — the Klean Kanteen takes some strength to pop open! — but it’s a very minor ding on an otherwise excellent product. (Also, this might entirely be in the realm of personal preference. The Klean Kanteen lid looks and feels like it was precisely machined to fit the tumbler, which is a far cry from the ramshackle Asobu construction.)

    The mug fits about 7.7 ounces of water when filled right up to the lid, though you’ll get less when factoring in the tea leaves. It’s the ideal size for a single-serving cup of tea and about twice as big as your typical gaiwan. (Of course, there’s no issue using it for smaller steepings.)

    (As an aside: it took me way too long to realize this, but in addition to using a gram scale to measure out the exact amount of tea, you can also use it to measure the precise volume of water desired. This is because 1ml of water normally weighs 1g. Before, I used to eyeball the water; now, I just pour the water into the mug right after weighing the tea. This might seem super-finicky, but I’ve internalized Eco-Cha’s recommendation to use 9g of tea to 175ml of water for oolongs as a starting point, and it’s really nice to have reproducible results when comparing different steepings. The only question is whether to subtract the weight of the tea from the weight of the water, especially as the leaves expand. My hunch is yes.)

    As I mentioned in the previous article, one of the major reasons to use an insulated mug as a “gaiwan” is for its heat retention properties. Very little heat escapes the mug while making tea, maintaining the water at a stable temperature for the entire duration of the brew. My understanding is that certain kinds of teaware are especially prized for this property, but it’s almost impossible to beat vacuum-insulated steel in this race!

    Of course, it’s great that you can just throw this mug into your backpack or suitcase and not have to worry about it breaking or weighing you down. And since Klean Kanteen is such an entrenched brand, you can even find a number of accessories for it.

    The Klean Kanteen 8oz insulated tumbler: highly recommended as a surrogate travel gaiwan!


    Update: I added a companion article with latency graphs for all three of my mice. I have also revised my conclusion to no longer recommend the MX Master for Bluetooth use or use cases sensitive to latency.

    We all know that Bluetooth has an abundance of flaws, ranging from frustrating latency to arcane pairing rituals. By many measures, it still feels like a technology stuck in the early ’90s. And yet, once you’ve experienced the freedom of going wireless, it’s very hard to go back to the old ways. Reaching to unplug your headphones when leaving your desk, only to realize that you can simply walk away? Bliss!

    For several years, I’ve been on the lookout for a Bluetooth mouse that could also be used for non-casual gaming. At minimum, the mouse needed to be on par with my trusty MX 518 at 1600 DPI and have little to no latency. Unfortunately, the vast majority of reputable Bluetooth mice maxed out at around 1000 DPI and had a reputation for being a bit laggy. The Razer Orochi was one of the few models that supported high DPI over Bluetooth, but it was a cramped little thing that felt rather unpleasant to use.

    There were a few wireless gaming mice that used proprietary USB adaptors to improve performance, including my latest mouse, the Logitech G602. This model did what it said on the tin, but despite the praise it garnered from gamers, I ended up somewhat disappointed with it. The USB receiver was pretty weak and would routinely cut out if you moved more than a few feet from the port. The fact that you had to use the receiver at all meant that you still gave up one of your USB ports, a significant setback with a two-port MacBook. (Hubs helped, but not while trying to use two USB-powered hard drives at the same time.) I was also unimpressed with the design and build in general: the body creaked in several prominent areas (including under the main buttons), the side buttons were unpleasant and hard to press, and the scroll wheel felt a bit mushy. After using it for about a year, I just ended up switching back to the MX 518.

    Recently, I’ve been working more in cafés, and the endless dance of the wire once again started to irk me. At first, I thought about getting an extra, cheapie Bluetooth mouse for use on the go, but then my “optimization sense” kicked in. It’s been 4 years since the G602, and technology moves quickly. Surely, I thought, there now had to be a mouse that could solve my wireless needs and also work for gaming! Besides, I deeply enjoyed finding tools for my life that could optimally solve multiple problems at once.

    Sure enough, Logitech had two new headlining models in the Bluetooth category: the MX Master and MX Anywhere 2. These were clearly top-shelf devices, sporting sleek designs, several color choices, and Logitech’s free-spinning MicroGear Precision scroll wheel. Interestingly, they also reached 1600 DPI and shared the ability to connect to Bluetooth or a Logitech Unifying USB receiver at the user’s discretion. (Update: the newly-released MX Master 2S goes up to 4000 DPI.) Based on my experience with the G602, I figured Bluetooth might be handy for everyday use while the USB receiver would work well for lag-free gaming. Were these the first Bluetooth mice that could actually fit the bill? I had to give them a spin!

    Eventually, I got my hands on both models and did some side-by-side testing. The MX Master was love at first touch, fixing almost everything I hated about the G602 and even adding a few extra features to win me over. Meanwhile, the MX Anywhere 2 was marred by one awful design decision and just felt too small for ergonomic comfort. (Update: unfortunately, I had to eventually give up the Master due to latency and connectivity issues. But the hardware remains spectacular!)

    Below is a discussion of several aspects of these mice that haven’t been covered in most reviews, including handfeel, clickiness, gaming use, and latency measurements.

    MX Anywhere 2

    The MX Anywhere 2 is a cute little mouse. Some reviewers have been comfortable switching to it as their primary work mouse, but in my testing, I found it just a bit too small. This is definitely a travel mouse in form and function. The weight, however, is great for usability, as it’s just hefty enough to stick a little to the mousepad without losing its high mobility.

    Click-wise, the two main buttons feel pretty good while the rest aren’t particularly notable. I was happy that the side navigation buttons were fairly normal sized compared to the scrunched side buttons on the Master. The coating feels grippy but maybe a tiny bit less premium than I’d hoped.

    Clicking every button on the MX Anywhere 2.

    In case you’re not aware, many Logitech mice now feature a scroll wheel that can also be clicked side-to-side. In reviews of Logitech mice, I often see praise for this sideways-clicking mouse wheel, and some go as far as to call it a “premium feature”. But I think I’ve come to realize that most people just don’t use their middle click all that much. Me? I’m a compulsive middle-clicker. I use that button for everything. New links. Closing tabs. Panning. Reloading. In fact, it’s possibly the second most important button on my mouse! Unfortunately, sideways-click cripples this button thoroughly, making it rattle from side to side with every minor push.

    If I otherwise loved the Anywhere, I figured I could get accustomed to this annoying hardware quirk. But Logitech really screwed up the wheel here. Incomprehensibly, there’s no middle click; instead, you get a tiny button right below the wheel that could be rebound to this function. (By default, it serves as the “gesture” button, which lets you show Exposé and whatnot.) The wheel itself, when depressed, mechanically toggles between traditional ratchet and free spin modes for scrolling, resulting in a heavy, chunky “clunk” that feels like you’re squishing something deep inside the mouse’s guts. Is there any other Logitech mouse that behaves this way? The middle-click has been a staple feature on mice since the 70’s, so why is changing scroll wheel modes suddenly more important? Considered together with the usual sideways-click complaints, this scroll wheel disappointed me in practically every respect.

    A demonstration of the janky scroll wheel.

    For a while, I tried rebinding the square button and sideways-click buttons to middle click. It felt OK… in the sense that I could probably get used to it over time. But I knew I’d never be happy with this compromise, and it’s what ultimately pushed me to give the Master a try.

    MX Master

    I’m delighted that tech companies have started to inject fashion into even their most pragmatic products. Both MX models come in black, navy, and white (“stone”). I liked the idea of white in honor of my old favorite Microsoft Intellimouse, and it’s the color I chose for my initial Anywhere purchase. But seeing it in person didn’t impress me as much as I had hoped. It was attractive but a little business casual, and in any case, it didn’t mesh with my recent black Logitech K380 keyboard purchase. (Peripheral matching, whaddaboutit?) So I decided to seek a different color with the MX Master.

    Between the other two options, navy looked svelte in pictures while black appeared to have some ugly beige accents that screamed “HP peripheral”. And yet… Amazon Prime Now had a promotion going where I could chip $10 off the purchase of just the black model, bringing the price down to a mere $50 and delivering it the very same day. Meanwhile, navy would cost me close to $70 and arrive several days later! Friends, I must admit I did not pass the marshmallow test on that day.

    Fortunately, this turned out to be a great decision: the black model looks fantastic in person. Despite what the photos might show, the accents are actually not beige at all but more along the lines of Apple’s space gray, perfectly complementing the darker matte gray of the body. In addition, the buttons have a slightly different coating from the rest of the mouse, giving them a pleasant sheen under certain lighting conditions.

    As most reviews have stated, the ergonomic comfort of this mouse is close to perfect. You lay your hand down and it feels like it was sculpted just for you. What’s more, the main buttons feel incredible to click — perhaps more so than any other mouse I’ve used! Seriously, I can’t stop clicking these buttons.

    Clicking every button on the MX Master.

    The Master’s sideclick-less wheel intrigued me when I first saw it. Most Logitech mice either feature sideclicking and free spinning together, or otherwise just throw in a plain old scroll wheel and call it a day. This was the first mouse I found which omitted sideways-clicking while still retaining the free spin mode, a feature I thought might come in handy as a substitute for the trackpad’s inertial scrolling. Prior to handling the Master, I hoped this setup might finally allow me to have an uncompromised middle click while still benefitting from Logitech’s fancy scroll wheel tech. And… that’s exactly what happened! The middle click on this mouse feels excellent, to the point where it’s very nearly as pleasing as the main buttons. (There’s a slight bit of wobble before the click is triggered, but I don’t think that can be helped on account of the complex mechanism.)

    There’s a subtle issue I noticed with the middle click that might be worth mentioning. When Smooth Scrolling is enabled in Logitech Options, if you click the mouse wheel and then immediately start scrolling, your scroll won’t actually register until several seconds later. This happens in both Windows and OS X. I assume this is some sort of hardware safeguard to prevent accidental scroll triggering, but it’s noticeable on occasion.

    My main issue with the build is the very poor layout of the back and forward buttons. I use these buttons quite frequently for navigation, and I miss the old Intellimouse days when the side buttons were enormous and clicked just as well as the main buttons. Here? The buttons are quiet and super annoying to differentiate. Why couldn’t they have spread them out just a little bit? The horizontal scroll wheel feels nice, but I don’t see myself getting much mileage out of it, especially now that I’ve learned you can simply Shift-scroll in OS X to get native horizontal scrolling.

    There’s one hidden button on this mouse: the “gesture” button, which can be activated by smashing down on the mesh pad next to your thumb. Unlike the other buttons, this button feels mushy and difficult to press, similar to those membrane buttons you find on cheap remotes. I guess they had to design it this way to avoid accidental clicks, but I wish they thought of something else or eliminated it altogether. I’ve been trying to use it as a surrogate back button instead of the tiny default one, but it’s not particularly pleasant or responsive to use. Oh well.

    Weight-wise, this mouse is pretty hefty, but not overbearing. I’ll have to get used to the inertia compared to my MX 518, which barely feels like it has any weight at all.

    The MX Master in regular use.

    I was worried when I was first looking at this mouse that it would just be a minor iteration on the G602, but these fears have been unfounded. The Master fixes every problem I had with the G602 (aside from perhaps the weight) and adds a bunch of great features to boot. I feel immediately at home with this device.

    Common Issues

    There are a few issues common to both mice that should be addressed.

    Both of these mice can be used while charging, but they don’t register as USB devices even when directly connected to your computer. You still have to use them via Bluetooth or the Unifying receiver, which means that there’s no zero-latency mode. In practice, as I demonstrate below, the mice are pretty darn close to lag-free. Most people didn’t consider wireless-only to be an issue with the G602, and I don’t see it as an issue here either. (The feature would have been appreciated, though.)

    Update: as it turns out, the latency is still an issue for gaming. Read on below.

    Second, there’s some scrolling weirdness, which seems to be a mix of OS issues as well as user habits. On the OS side, when smooth scrolling is enabled in Logitech Options, it doesn’t always seem to work right. Fairly frequently, you get some weird acceleration or momentum before things get going. (Both OS X and Windows have this issue, though manifested in different ways.) Most unfortunately, the wheel in free spin mode doesn’t seem to have a 1:1 mapping to page scrolling, which feels a lot less physically correct than using the trackpad. I think I could get used to this behavior, but even my ancient MX 518’s scrolling felt more natural. In terms of habits, if you’re used to trackpad momentum scrolling in OS X, you’ll be surprised when you’re free-scrolling a page and then find other pages continuing to scroll when switching windows! It might take a while to internalize the fact that the mouse has a mechanical component that needs to be stopped before switching tasks.

    These mice worry me a little with their reliance on mechanical trickery. On the MX Master, whenever the lever (or whatever it is) stops the wheel when switching to ratchet mode, I can feel the entire mouse shudder slightly. At least one user has already demonstrated that this part can get stuck. (This has apparently been quietly fixed by Logitech.) How long will it take for the mechanism to break or wear out? Fortunately, Logitech has an exceptional warranty department, so I don’t doubt that they’ll send me a replacement if anything bad happens. Still, I don’t like the idea of having to pamper my mouse.

    The Unifying receiver, unfortunately, tends to have a very short range if there’s any sort of interference nearby. (For example, I can hardly move the mouse a foot away if a Thunderbolt cable is attached to the port next to the receiver. Or maybe it’s the eGPU itself?) As a result, I’ve resorted to plugging the receiver into a USB extender. With Bluetooth, this is not an issue at all, so it comes up fairly infrequently.


    Now, for my personal dealbreaker with wireless mice: latency. I had a bit of a misconception when I first set my eyes on these two MX models. My assumption was that the Unifying receiver was identical to the one used by my G602, meaning that the adaptor would be highly optimized for reduced latency. But according to a Logitech representative, only Logitech’s gaming peripherals use the improved, custom-designed adaptor to get the “precision report rate”, whereas Unifying technology is less fancy and reserved for use with the business lineup. My question was: did “precision report rate” only refer to the polling rate, or were the gaming adaptors additionally less laggy? In other words, was I missing out with my Unifying receiver?

    I knew I wouldn’t have peace of mind until I had solid numbers, so I decided to measure the latency myself. There were two data points I needed to capture: the moment the mouse started moving, and the subsequent moment that the computer registered mouse activity. Both actions had to be on the same clock. My iPhone’s camera could record at 240 FPS, so precision wasn’t an issue; the problem was that my laptop display only refreshed at 60 Hz, meaning that I couldn’t rely on a recording of the screen alone to figure out how fast the mouse signal was going through. (There was only one display frame for every four video frames.)

    I ended up writing a small, single-window Mac application to help me along. On the left side, the window has a running millisecond timer, refreshing at the exact frequency of the display. This gave me the precise timestamp of each display cycle. (Well — with a possible delta of 1 frame or ~17ms, depending on how the labels spaced out their updates under the hood. But I was only interested in relative latencies between the mice, not the absolute latency, so the only important detail was that this offset was consistent.) The app also captured the timestamp for the precise moment mouse movement was first detected. This was displayed in a label on the right side. Both timestamps were generated using the same time function, CACurrentMediaTime.
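    As an aside, the only crucial requirement here is that both labels read from a single monotonic clock. Here’s a tiny Python analog of that idea, using time.perf_counter in place of CACurrentMediaTime (the structure is purely illustrative; the actual helper was an AppKit application):

```python
import time

START = time.perf_counter()

def now_ms():
    """Milliseconds since launch, read from one monotonic clock."""
    return (time.perf_counter() - START) * 1000

# Left label: refreshed once per display cycle with now_ms().
# Right label: frozen to now_ms() when the first mouse-moved event arrives.
# Because both stamps share a clock, their difference is meaningful.
display_stamp = now_ms()  # what the timer label would show this frame
mouse_stamp = now_ms()    # what the app records on first mouse movement
assert mouse_stamp >= display_stamp  # monotonic clocks never run backwards
```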

    Next, I placed a mousepad next to my display along with a small box to evenly and consistently push the mouse along. I set up my phone to show both the laptop display (with the timer app running) and a side view of the mouse and box contact point. I filmed three trials each of the MX 518, MX Master with the USB adaptor, and MX Master in Bluetooth mode, resetting the righthand timer between each trial.

    Finally, I went through the videos frame-by-frame in VLC. (The ‘e’ key: highly convenient!) The left timestamp was used to determine the exact moment when the mouse started moving. If the movement occurred between two timestamps, I could simply interpolate the precise value based on which intermediary frame out of four I landed on. After that, I noted the righthand (“mouse was detected”) timestamp and did a bit of math to arrive at the latency value. Perhaps not a perfect system, but as accurate as I could manage with the tools I had at hand!
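    The arithmetic behind that step is straightforward. Here’s a rough Python rendition of what my spreadsheet was doing (the function names are my own, for illustration; 240 FPS video over a 60 Hz display gives four video frames per display update):

```python
VIDEO_FPS = 240
DISPLAY_HZ = 60
FRAMES_PER_REFRESH = VIDEO_FPS // DISPLAY_HZ  # 4 video frames per display frame

def movement_start_ms(last_label_ms, frames_after_label):
    """Interpolate the true movement time between two display updates.

    last_label_ms: value shown by the on-screen timer at its last change
    frames_after_label: number of 240 FPS video frames (0-3) between that
        change and the visible start of mouse movement
    """
    return last_label_ms + frames_after_label * (1000 / VIDEO_FPS)

def latency_ms(detected_ms, last_label_ms, frames_after_label):
    """Latency = app's mouse-detected timestamp minus interpolated start."""
    return detected_ms - movement_start_ms(last_label_ms, frames_after_label)

# Mouse starts moving 2 video frames after the timer label read 1000 ms,
# and the righthand label reports detection at 1060 ms:
print(round(latency_ms(1060.0, 1000.0, 2), 1))  # 51.7
```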

    Update: since publishing this article, I have run a more thorough and accurate suite of tests on all three of my mice. The conclusions are a bit different from the initial ones greyed out below, namely concerning Bluetooth accuracy (notably worse than originally tested) and MX Master latency characteristics (spiky and 10-20ms slower than wired).

    The results were: 55ms/58ms/50ms for the wired MX 518; 63ms/74ms/51ms for the MX Master in USB receiver mode; and 70ms/58ms/68ms for the MX Master in Bluetooth mode. (Keep in mind that these values were not a measure of absolute latency and were only meant to be compared to each other, since the test did not deduct OS latency, monitor latency, etc.)

    To my great surprise, not only was wireless latency very close to wired (~55ms vs. ~65ms), but Bluetooth was practically as performant as the USB receiver! I don’t know how Logitech managed it, but somehow the Bluetooth performance of these mice is nearly flawless, to the point where perhaps the dongle is basically unnecessary. (Except for edge cases like BIOS use.) You could make the argument that wireless performance is less consistent than wired, but I’d need to do more tests to figure this out. (And it’s probably more effort than it’s worth.)

    So is 10ms of lag a dealbreaker when it comes to precision gaming? I strongly suspect it won’t be noticeable — especially given how much latency already exists in the long chain from mouse to display — but I’d love to see some empirical evidence backing this up.


    There’s some mild consternation around these two MX models when it comes to gaming. Whenever people ask, some enthusiast always shows up and levies the following grievances against them:

    • They have no wired mode, and thus always feature some latency.
    • They have built-in acceleration and angle snapping.
    • They only poll at 125 Hz.
    • They only go up to 1600 DPI. (Update: the newly-released MX Master 2S goes up to 4000 DPI.)

    In contrast, they suggest, Razer and Logitech themselves make gaming-tailored wireless mice (the Logitech G900, or the new Razer Lancehead) that go up to 12000 DPI at 1000 Hz and use proprietary receivers for optimal performance. All technically true! However, the mouse I’ve loved the longest, and gamed the most with, has been my trusty MX 518, a classic model popular with gamers even today. And it turns out that this mouse also has built-in angle snapping, also only goes up to 1600 DPI, and also polls at a mere 125 Hz. The horror!

    In practice, none of these quirks are dealbreakers. 1600 DPI is more than enough for the vast, vast majority of people; it was a high standard a decade ago and accuracy-per-inch demands in humans have not suddenly spiked during that time. (DPI is more of an issue with enormous monitors and insane resolutions, but it doesn’t matter for my use case.) Same goes for 125Hz polling, which is effectively 2x the refresh rate of most monitors. On top of that, you’ll get about 30x less battery life (30 hours vs. 40 days!) with gaming wireless mice — not to mention losing all the benefits of Bluetooth. Unless you’re a pro, I don’t think it’s nearly worth the tradeoff.

    However… I have to admit that something about these mice definitely feels off when playing FPS. Side-by-side with the MX 518, the difference is immediately noticeable. With the 518, I feel like I’m directly inside the character’s head. With the Master, there’s a bit of a “cockpit effect”, or a very subtle sense that my movements aren’t perfectly mapped to the camera. Accordingly, things like rapid 180 degree turns and flick shots feel more hesitant and unnatural.

    For a while, I assumed this was due to wireless latency, but my experiments showed that this was unlikely to be the case. (Besides, my setup was a mess and there was plenty of latency coursing through the system already.) I also thought it might be the weight of the Master, but no dice: the Anywhere had the same issue at half the weight. So my working hypothesis is that this issue is caused by some subtle differences in mapping of mouse movement between the Master and the 518, meaning that I’ll have to reprogram my brain a little before I’m fully comfortable with it. (I think I could also customize this curve in software using various third-party tools, but this might be too finicky even for my tastes.) I actually remember having this exact response to the G602, so maybe that 10ms does make a critical difference in FPS gameplay after all? Or perhaps the G602 shares its motion curve with the Master? Who knows!

    Update: after using the Master for another week and doing some thorough testing, I have concluded that, unfortunately, latency is almost certainly the culprit here. That’s not something that you can really train yourself to ignore, at least not in fast-paced multiplayer games.


    The MX Master is so very close to a perfect all-arounder mouse. It supports Bluetooth. It works for gaming. It feels incredible in the hand and even features a free-scrolling mouse wheel with a solid middle click. Unfortunately, despite my initial excitement in the first version of this article, I’ve decided to return it in favor of the gamer-centric G403 Wireless. (The lack of Bluetooth support is a bummer, but this mouse hits it out of the park in every other respect.)

    There are two primary flaws that sealed the Master’s fate for me.

    First, the Bluetooth functionality is lackluster. I don’t know which company is responsible, but compared to my wireless Apple trackpad, the Master’s cursor movement under Bluetooth feels rough and jittery. On occasion, I’ve even seen it stop tracking altogether until the on/off switch is toggled. (Dozens of users have reported the same problem on Logitech forums.) Furthermore, Bluetooth performance seems to vary dramatically depending on software conditions. For example, if there are lots of windows on the screen and I free-spin the scroll wheel, the cursor might only be able to move once every half-second. One of my primary goals with this mouse was to have it immediately start working when setting up at a café, and this doesn’t quite pass my baseline of “working”.

    The other issue is gaming performance. This feels like a very minor and nebulous nitpick, but at this point I’m reasonably certain that I’m not imagining it. Presumably on account of the additional 10-20ms of latency, I just don’t feel as in-control with this mouse as I do with my G602. When playing FPS with the G602, I have no problem running down a hallway, executing a precise about-face, and then turning right back in the span of a half-second. With the Master, this gesture simply feels sluggish and unnatural. I often have to make a concerted effort and then end up taking longer or missing the target altogether. The effect is immediately noticeable when using the two mice side-by-side over the course of a long gaming session. The G602 just feels much more precise.

    I really wish it was practical for me to keep this mouse, but it’s not quite the master-of-everything I was hoping to get. Still, I have no doubt most people would be very happy with the MX Master, especially now that the 2S revision has been released.

    If you decide to get this mouse, I recommend grabbing a Hermitshell case. It fits the mouse perfectly and keeps it safe for tossing into your backpack or bag.


    In the course of doing latency testing for my previous article on the Logitech MX Master, I discovered a couple of flaws in my helper app, and I also realized that I should have probably recorded a few more sample points. So now, as a followup, I have devised a better testing methodology and run a full suite of tests. Unfortunately, with this new data in hand, I must now retract my original recommendation. The Master is still a good mouse for the average user, but its wireless performance is just too unreliable for precise gaming or Bluetooth use.

    If you’re looking for a great all-arounder, I would instead give my highest recommendation to the G403 Wireless, which I’ve been happily using for several months with zero issues. While this mouse does require a dongle and only has a tenth of the Master’s battery life, its best-in-class performance, non-existent latency, svelte form factor, and incredible clicky side buttons more than make up for these downsides. Better yet, you can routinely find it on sale for $50 or lower on Amazon and at Best Buy. I’ll try to post a fuller account sometime in the near future.

    In the meantime, here are the new test results for the MX Master, G602, and MX 518.


    Note: I also ran a few tests in microe’s MouseTester to compare the motion graphs of the three mice, but they looked pretty much the same to my eye. So I think the difference in feel of these mice is mostly due to latency and, to a lesser degree, weight and shape.


    Since last time, my helper app has been revised to explicitly update the left label every frame instead of implicitly relying on AppKit’s timing. I’ve also switched my scrubbing program from VLC to QuickTime, as the latter additionally allows you to step backwards frame-by-frame. (Extremely useful if you happen to overshoot the mouse movement point!) Combined with a Numbers spreadsheet for processing, the sampling process took maybe a minute or two per data point.

    In every test, the cursor begins moving two screen frames ahead of the right label, so there’s on the order of two frames of extra latency (~33ms) in these measurements. Subtracting this amount from the recorded values will get you closer to the absolute latency of the mouse. But again, if you’re only comparing these numbers to each other (which I am) then the extra latency doesn’t really matter. You may as well just subtract the latency of the wired mouse since that’s as close as you’re going to get to zero.

    I ended up testing both USB ports on my MacBook because I’ve had USB peripherals behave differently depending on which side they used. Not sure if the resulting variance is due to the ports themselves (power issues?) or simply reception.

    OK, on to the results!

    As the wired “control”, the MX 518 showed 33ms of average latency with the left USB port and 38ms with the right. Theoretically, none of the other results should have surpassed this value — though the G602 stood a slight chance with its higher 500Hz polling rate.

    During its worst run, with the adaptor plugged into the left USB port, the MX Master had 62ms of average latency, or 30ms more than the wired MX 518. However, every subsequent run resulted in significantly quicker average values. Two more tests with the left USB port — one using a USB extender and one while simultaneously charging — gave me a better average of 54ms for both. And with the right port, things got better still, with two runs sporting an average of 45ms (including dips down to the thirties) and the other two responding at a respectable 47ms and 49ms on average.

    With Bluetooth, the Master responded at an average of 65ms. So my conclusion in the original article was overly optimistic: there can be up to 20ms difference between Bluetooth and the USB adaptor.

    During its first trial, the G602 reported an astounding 34ms of latency — just 1ms more than wired! However, each subsequent run (including one with the very same setup as the first) only gave me 50ms on average.

    What can we conclude from these results?

    The main issue with the Unifying receiver seems to be that the latency is rather inconsistent and spiky. With the G602, regardless of whether it’s averaging 35ms or 50ms, the latency curve is always baby-butt smooth. In contrast, the Unifying receiver needs to be pampered to attain optimal performance.

    It seems that a variety of minute factors can drastically affect the latency of these mice, ranging from adaptor placement to USB port selection. The G602 might have a lower baseline than the MX Master, but the Master can still come within a respectable 10ms of that baseline. And in any case, it seems the G602 can’t be guaranteed to perform in this range. I wish I knew what caused the G602 to spike up to 50ms for all its subsequent trials!

    The first sample point in several of the MX Master runs was much higher than the rest. (A few are omitted since they’re not representative of the average running latency.) I assume this is the result of some energy-saving feature. Doesn’t really matter for games since you’re constantly moving the mouse anyway.

    The battery level in the MX Master doesn’t seem to have much of an effect on performance.


    I spent another week with the MX Master in daily use, and unfortunately, I had to concede that the numbers were right: the Master was noticeably 1-2 frames behind my other mice. Frankly, I was really surprised by how much this affected gameplay. With the MX Master in CS:GO and Overwatch, I always felt like I was a little drunk. My cursor would constantly overshoot and I would miss many of my flick shots. Hot-swapping the G602 brought an instant wave of relief: my sense of immersion immediately returned and I felt like I could aim almost twice as well. (Maybe this is what happens when you hammer your synapses with FPS gameplay over the course of two decades!) I tried to account for the placebo effect as best as I could without doing a completely blind test, but I could easily see my performance suffer even when running around and shooting bots in the Overwatch training area.

    I followed up with a few more informal measurements, and all of them continued to show the MX Master trailing the G602 in performance — mostly on account of the lag spikes, but sometimes pretty drastically even on average. I also discovered that Bluetooth performance was quite unreliable on the Mac side, frequently dropping off or disconnecting altogether and requiring a hard mouse reset. Given that the Master was intended as an all-arounder for both gaming and Bluetooth use, this was a huge disappointment. It clearly wasn’t up to snuff in either respect, and I decided to send it back.

    As a last-ditch stop in my mousing hunt, I visited my local Best Buy to take a gander at Logitech’s gaming mice. The lineup had all the problems I was expecting: tacky designs, an overabundance of buttons, horrible tilt-click scroll wheels… except for the lone G403. As soon as I put this mouse in my hand, I knew it was the one. This was the only wireless gaming mouse that had just the five standard buttons in a classic body. Its scroll wheel was the normal kind, not the mushy tilt-wheel kind. Its internal hardware was the same as that of the possibly-best-in-class G900. And most surprising of all, its side buttons were actually clicky! (I know it’s such a small detail, but I hadn’t used a new mouse with clicky side buttons in years.) Before me was a phenomenal gaming mouse in the guise of a business accessory, evocative of the classic Microsoft Intellimouse — and USB dongle or not, this was exactly the mix I was searching for. I took it home and haven’t had a single complaint in the three months since. (Bonus: it fits snugly in my MX Master Hermitshell case.)

    The Master is 80% of the way to being an ideal all-arounder, but sadly, it’s killed for power users by inconsistent performance.


    I am once again in the market for Bluetooth headphones. My last pair, the first edition Sony MDR-1RBT, served me very well for the last four-and-a-half years. I didn’t (and still don’t) have much experience with the high end of the audio market, but when I was picking them out, they were among the best sounding headphones I’d heard. The feeling of space was especially startling: for the first time in a headphone, I felt like the music was all around me instead of being localized to a point between my ears. Now, parts of it were held together with glue and the remaining pleather was flaking all over my jacket. The time was right for an upgrade.

    I started to research comparable, $200 to $400 over-ear wireless replacements. My wishlist included noise cancellation, multi-device pairing, volume controls that interfaced with your device, tactile controls with previous/next buttons, and foldability. The only real technical requirements were audio quality, a zero-latency wired connection, and at least some kind of physical control. But non-technical, quality-of-life attributes were also very important. It’s hard to deny that headphones play many important roles in our lives apart from simply reproducing audio. They are most certainly a fashion item. They serve as earmuffs in cold weather. They block out the outside world when we need to retreat into our work. For many of us, they’ve become one of the most important and frequently used accessories in our wardrobe! Soon, we’ll be seeing health and fitness sensors incorporated right into the earcups. Especially with the cord cut, there’s a lot more to headphones than just audio these days.

    Based on a wide and thorough reading of the field — Head-Fi, InnerFidelity, Les Numériques, /r/headphones, and more — I eventually whittled down my list to four pairs of headphones: the Audio-Technica ATH-DSR7BT, the Sony WH‑1000xM2, the Bowers & Wilkins PX, and the V-MODA Crossfade II Wireless. Since it was impossible to proceed past this point based on stats alone, I decided to get them all together in a room and give them a thorough, comparative workout.


    The Audio-Technica ATH-DSR7BT came widely recommended, but my first impressions were sorely disappointing. What I took out of the box was a creaky, plasticky set that clamped down hard on my head. The earpads don’t feel comfortable at all — certainly nothing like the pillows on my old MDR-1RBT. The cover over the USB port will not stay seated. The strange IR play/pause “button” positioned next to the regular buttons screams “budget design” and frequently misfires. The headphone volume controls appear to act independently of your paired device, forcing you to juggle two different sets of volume. There is also a quiet, tape-like hiss audible over silent passages. My 1RBT has occasional buzzing and popping over silence, but not to this degree. Perhaps the only technical advantage here is the fact that the volume rocker performs double duty as previous and next: very useful for jogging back and forward in audiobooks and podcasts.

    The Sony WH‑1000xM2 also didn’t amaze me in the build quality department. Though these are in fact the lightest set of the bunch, the materials here simply feel cheap. The plastic is on the same level I would expect to see in a bundled TV headphone, not a $350 product. Comfort-wise, they are OK but not great. The pads in particular are strictly pragmatic and lack the gentleness of my MDR-1RBT. The earcups are roomy enough and the clamping force is light enough that the headphones slide around on your head when you move. In terms of fashion, I’m just not fond of the design. It’s hard to describe these headphones as anything but office-grey boring, though the parallel h.ear on 2 models do come in different colors. (On that note, I’ve actually seen some scattered accounts of people who prefer the sound of the h.ear on 2 over the 1000xM2, even though h.ears look identical and cost $50 less. Reviews are still scarce, though, so it’s hard to know for sure.) On the upside, these headphones fold and come with a carrying case.

    Aside from the two buttons for switching power and ANC, the controls here are touch based. You can double-tap on the right earcup to play/pause and swipe in one of four directions for previous/next and volume adjustment. I didn’t expect to prefer these controls over physical buttons, but in practice they worked flawlessly. Buttons are often tricky to find on earcups anyway, so having a wide berth for gestures is a big plus. (No pun intended.) And in any case, it’s certainly worth having for the discrete previous/next gestures. You can also place your hand over the right earcup to pipe sound in during conversations. This feature is not useful to me since I’d prefer to just take my headphones off out of politeness, but maybe handy for noisy offices with ANC enabled.

    Unfortunately, it doesn’t seem that these headphones have the ability to pair with multiple devices simultaneously. (I did not test this myself, but it’s been mentioned in a number of forum posts.) This means you have to do the Bluetooth disconnection dance if you’re switching between laptop and phone, which is rather annoying.

    As reviews have stressed, ANC does not seem to affect sound quality in any major way. I didn’t visit any noisy environments with these headphones so I can’t vouch for the ANC’s effectiveness, though reviewers have reported it to be best-in-class. The Sony Connect app is a sight to behold, featuring several pages of settings mostly related to ANC behavior. There’s even a built-in EQ. (I tried out some of the surround effect settings, but the results were frankly underwhelming.)

    In terms of background noise, there is surprisingly very little here. If you strain very hard you can detect a slight hiss when the amp is on, but it’s basically imperceptible.

    The Bowers & Wilkins PX is certainly the most interesting headphone of the bunch. In terms of build quality, it really can’t be faulted. Every surface features lovely premium materials, including metal and real leather. The design is very attractive too, though I confess that I’m not too fond of the gaps the headband forms with the sides of my head. The pads aren’t cushy, but I don’t find them to be actively uncomfortable like some reviewers have. In fact, coupled with the high clamping force, I can really appreciate just how firmly these headphones sit on my ears. No amount of motion is likely to dislodge them. This also gives the PX the best passive isolation I’ve heard in a headphone. You don’t even need to enable ANC to block out most of the outside world. The earcups are fairly small, so this pair might not be a good choice if you have large ears. Personally, I find them cozy.

    Tech-wise, the PX attempts some very interesting things. First, the headphones pause your music when you take them off and then resume playing when you put them back on. In essence, this works almost the same as with the AirPods. Some people have complained about the sensitivity (which you can adjust in the B&W Control app) but I found it to be tuned just right. The feature works just as well on Mac and Windows, and on a technical level I was surprised by how many different kinds of media it was able to pause. (YouTube videos, for example.) Unfortunately, there are several flaws that might compel you to disable the feature altogether. (This is also possible through the app.) One, if you remove a single earcup to talk to someone, the headphones might pause and then immediately resume after detecting the back of your neck. Two, if you pause your media before taking off your headphones, it will still start playing after you put them back on. This can create some loud and unpleasant surprises.

    Next, these headphones have the ability to connect to multiple devices simultaneously. This doesn’t mean that your laptop and phone can interleave their audio, but it does allow you to switch back and forth between devices without any friction. The execution here is a bit questionable, though. If you’re connected to two devices and one of them produces a sound, the other device will automatically pause its media in the same way as the auto-pausing feature. In practice, this means that you have to put your phone on mute if you’re also connected to your laptop, since any notification sounds will immediately pause your music. If you prefer not to switch devices this way, you also have the option of quickly disconnecting from the currently paired device by way of a brief hold of the power button. This is significantly improved over the usual Bluetooth headphone switching hullabaloo, which involves going to your paired device and manually disconnecting the headphones in the Bluetooth device list. Having a disconnect option right on the earcup is a great convenience.

    ANC has its own switch, and you can adjust the sensitivity from inside the app. Reviewers have pointed out that sound quality suffers when ANC is set to the highest setting, and from cursory testing I have to agree. The loss is diminished when ANC is switched to the lower setting (Office), though it does seem quite weak. The highest setting (Flight) surprised me with just how serene it made the surrounding world. Riding on a train felt just like sitting in a quiet room. Plus, there was no sense of pressure at all.

    Battery saving on the PX is fairly aggressive. If there’s no audio stream detected for about a second, you can hear the amplifier turning off. (You can tell because there’s a very, very faint hiss when the amplifier is on. I’d say it’s on par with the 1000xM2 and only really detectable in contrast to absolute silence. These headphones have a very clean sound.) Then, when streaming resumes, the audio takes a second to fade back in. This is much improved over my MDR-1RBT, which simply switches off on audible silence and causes cleaned-up audiobooks and podcasts to glitch out¹. Here, even if there’s no sound coming from the drivers, the amplifier will remain on as long as your software has an active audio session going. (I believe this is the case with the other three headphones as well.)

    Very annoyingly, the PX seems to go into “standby” after 2 minutes of inactivity. That’s what the manual says; in practice, they seem to shut off, since I can’t get them to reconnect without hitting the power switch first. None of the other headphones do this: once connected, they stay connected until you turn your computer or headphones off. I ought to be able to leave these on a table, then pick them up after 15 minutes and resume listening immediately.

    The PX, really, has three modes: wireless, USB wired (which turns the headphones into a USB output device), and “analog” 3.5mm. That last mode is in quotes because even though the PX has a 3.5mm jack, the headphones still have to be charged in order to use it. A bit annoyingly, when in USB mode (though thankfully not in 3.5mm mode), the headphones apply the same power-saving, amp-switching logic as in wireless mode. In practice it’s not really a problem, but you might be surprised if you hit the volume button expecting a beep and don’t hear anything because the amp hasn’t warmed up yet. I perceive the USB wired mode as having very slightly more latency than the 3.5mm mode — maybe 1 to 2 extra milliseconds — though I only tested this by switching back and forth in a game of Overwatch. I was hoping to gain access to the microphone over USB in order to use the PX as a quick-fix gaming headset, but this functionality is unfortunately locked away. The microphone does technically show up as an input device in both macOS and Windows, but it doesn’t produce any sound.

    Meanwhile, the 3.5mm connection is very audibly noisy — far more than either of the other modes, including wireless. Usually this manifests as a hum, but sometimes you can even hear crackling or static. This interference is amplified twofold when the USB cable is attached simultaneously, making me think that the analog lines inside the headphones haven’t been shielded for some reason. (Other users on Head-Fi have suggested that this might be a grounding issue, and indeed the hiss does change depending on the device and even the workload, but there doesn’t seem to be any way to make it disappear completely. And in any case, plenty of ungrounded devices sound great with my other headphones, so it’s a sorry excuse.) Since 3.5mm is the only analog, near-zero latency connection to this headphone, this strikes me as a pretty big oversight. The quality of the analog connection might not matter for games, but if you’re interested in musical performance, having any sort of noise on that line is simply unacceptable.

    Along with the ANC and power buttons, there are three controls on the right earcup: one multi-use button for play/pause/forward/back and two buttons for volume adjustment. I wish it was possible to make the volume buttons act like previous/next instead, since double- and triple-clicking the multi-use button for navigation can be finicky. (It’s been pretty consistent for me, but navigating in the wrong direction even one time out of twenty is really frustrating.) At least the multi-use button is contoured and very clicky so you can definitely tell when you’ve pressed it.

    The PX earcups turn flat for travel and come with a magnetic pouch, but the space savings are minimal. Actually, I wish the cups turned in the opposite direction, since wearing them over your neck with the earcups facing up is a bit awkward.

    Of the four headphones, the V-MODA Crossfade II Wireless is certainly the most comfortable and (to my taste) the best looking. There’s some plastic here, but it’s the good kind of plastic. You get the sense that you’d be able to throw these headphones around without much fear of damage. The design is fun and a bit ’90s, and I particularly love the subtle gold accents in my color scheme². The earcups are soft and pillowy and fit snugly over your ears. Many people suggest getting XL earcups for these headphones, but I actually like the default size.

    Much like the PX, you can pair these headphones with multiple devices. Unlike the PX, audio does not pause when sound comes in on a second device. My empirical understanding is that you’re only able to start listening on a different device once the active audio session on your current device finishes playing. In practice, pausing your media seems to do the trick, though in the worst case you may have to close your music or video player. I prefer this behavior to the PX’s, since I don’t have to work around notifications interrupting my playback.

    Much like in the ATH-DSR7BT, there’s a very noticeable hiss in wireless mode over silent passages. In practice you can’t hear it over most music (except classical), but it is always there in the background, subtly shifting the quality of your music. I’ve heard it said that these headphones were designed first and foremost to be great wired cans, and at least two people have corroborated that they do in fact sound terrific in analog mode. (Many other Bluetooth headphones don’t sound nearly as good without their circuitry switched on.) The included 3.5mm braided cable features a microphone and a multi-use button, though the microphone quality is very poor.

    Oddly enough, iOS seems to track the battery level of these headphones in 20% increments. (The PX appears to use 10% increments, but at least you can check the precise battery level from the app.) For some boneheaded reason, V-MODA decided that it would be a great idea to have the headphones beep once a minute for the entirety of the last half-hour of battery life. The effect only serves to shorten the battery life even further since it’s just so difficult to tolerate.

    The physical controls work the same as on the PX: three buttons, two for volume and one multi-use. Here, the multi-use button sits on top of the earcup’s hexagonal inlay so it’s fairly easy to feel out. The buttons are a bit mushier than on the PX, though, making it easier to misclick if you’re going for those double or triple clicks.

    V-MODA has a couple out-of-band perks that are worth noting. First, you can send them your headphones even if they get completely trashed (if they’re still on sale) and receive a 50% coupon for a comparable model in return. (Especially important in the fast-moving world of wireless audio!) Second, the company is closely affiliated with the Head-Fi userbase and seems very much in touch with the audio enthusiast community. This gives me a bit more faith in their product and support than I would otherwise have.


    First, I should state for the record that I’m not really an audiophile. To be sure, I enjoy high quality audio and keep a hard drive worth of FLACs on standby. But the most expensive pair of headphones I’ve ever owned is the MDR-1RBT, and I’ve not had an opportunity to test and especially ABX any true audiophile cans. My only high-end frame of reference is a Sennheiser HD 800 S I briefly demoed at the SF Sennheiser store. (Incidentally, this experience was an excellent calibrator for all my future audio expectations. Never thought audio could sound so crisp and spacious through a pair of headphones!)

    Second, I’m an audiophilia skeptic. Though I do love my FLACs as a collector, I believe that V2 MP3s (or equivalent) are universally transparent for the vast majority of music. I’m also very wary of magical-sounding terms used to describe audio equipment, since the placebo effect is so darn powerful when it comes to sound. For this test, I tried my best to describe exactly what I heard and to compare headphones as analytically as possible. This required switching sets many times over the course of a single piece of music and sometimes even a single section. I don’t have much hands-on experience with the vocabulary used in audio enthusiast circles (e.g. “dark”, “warm”, “detailed”, “analytical”, etc.) so I tried to describe the sound in my own words. Even as a skeptic, I was surprised by just how unique each headphone sounded on close listen!

    For testing, I got four iOS devices and paired a headphone to each one. I tried to equalize the volumes as best as possible. Then, I would pick the same piece of music on each device and hit play simultaneously, switching between headphones as the piece went along. As I listened, I took notes on each headphone. My playlist featured old and new favorites across a diverse set of genres as well as a few fancy masterings. I tried to select pieces that were well-mastered and featured a variety of instruments, frequencies, and sounds. Here’s what I ended up listening to over the course of several hours:

    • Dire Straits — Sultans of Swing
    • Deep Purple — Space Truckin’
    • The Clash — London Calling
    • Prince — Purple Rain
    • Brian Wilson — Good Vibrations
    • Sufjan Stevens — Decatur
    • Nickel Creek — Reasons Why
    • Punch Brothers — Passepied
    • Acoustic Alchemy — Mr. Chow
    • Acoustic Alchemy — No Messin’
    • Paul Gilbert — Three Times Rana
    • Rebecca Pidgeon — Spanish Harlem
    • Elliott Smith — Bottle Up and Explode
    • Poe — Hey Pretty
    • Tori Amos — A Sorta Fairytale
    • Radiohead — Let Down
    • Radiohead — How to Disappear Completely
    • Radiohead — Kid A
    • The Bug — Poison Dart
    • Porcupine Tree — Mellotron Scratch
    • The Derek Trucks Band — Sahib Teri Bandi/Maki Madni
    • Polyphia — Finale
    • Galneryus — Lament
    • Tipper — Gulch
    • Tipper — It’s Like
    • Ott — Adrift in Hilbert Space
    • Opiuo — Axolotl Throttle
    • Yori Horikawa — Letter
    • Песни Нашего Века — Контрабандисты
    • Песни Нашего Века — Гренада
    • Песни Нашего Века — Купола
    • Pat Metheny — Last Train Home
    • Beethoven, Josef Bulva — Piano Sonata No. 14 in C# Minor, Op. 27/2 “Moonlight”: III. Presto agitato
    • Rachmaninoff — Étude-tableau in B Minor, Op. 39/4
    • David Bowie — Lady Stardust (Ryko Au20)
    • David Bowie — Moonage Daydream (Ryko Au20)
    • Supertramp — Take the Long Way Home (MFSL)
    • Pink Floyd — Time (MFSL)
    • Pink Floyd — Wish You Were Here (Mastersound Gold)
    • Pink Floyd — Hey You (MFSL)
    • Metallica — Master of Puppets (DCC)
    • Miles Davis — All Blues (Japan DSD)
    • Stan Getz & João Gilberto — Doralice (MFSL)
    • Stan Getz & João Gilberto — O Grande Amor (MFSL)

    The Audio-Technica ATH-DSR7BT was the obvious straggler of the four. Although the soundstage was reasonably spacious (more so than the Crossfade) with good instrument separation, crispness, and detail, I frequently noted that the tone was shrill, sibilant, and harsh — even tinny. This was the only pair that was actually unpleasant to listen to in some vocal sections. It often felt like the low end was attenuated, leaving behind jagged and unpleasant remains in the higher registers. Which is not to say that this pair sounded bad, but compared to the competition, it felt like it lagged a few generations behind. Coupled with the hardware issues, bizarre design, and build quality, I would definitely avoid this one.

    Next in ranking was the Sony WH‑1000xM2. The soundstage here was very wide, but it felt, for lack of a better word, spherical. The sound lacked dimensionality and detail. The instruments sounded kind of murky and blended with each other instead of sticking out in space. (I wrote down: rounded, hollow, dull, muted, faded.) The bass was quite boomy and exaggerated in a way that just didn’t feel very realistic. Songs would often sound overly resonant or reverby. I realized while listening to this set that none of my favorite musical moments had any pop or excitement to them. These headphones sounded fine, but they kind of sucked the life out of everything. They simply felt cold. (Surprisingly, even my MDR-1RBT sounded comparatively warm and lively despite having a flatter and much tinnier sound.)

    The last two headphones were just so different in their tone that I couldn’t assign them an order.

    The B&W PX was definitely the odd one of the bunch. The first thing that jumped out at me was its sheer spatiality. Much like my memory of the 800 S, recordings that used to feel relatively flat suddenly sounded almost binaural with this pair. Instruments would hang in the air with a wonderful amount of empty space between them. If you closed your eyes, you almost had the sense of being in the middle of a concert stage. (I wrote down: crisp, balanced, layered, subtly detailed, like a band playing around you.) For genres like jazz and classical (especially piano), this effect was simply transcendent. However, in comparison to the other headphones, there was something odd about the tone of certain instruments. Vocals in particular sounded a bit metallic, compressed, or rounded in a way I couldn’t quite pin down. I don’t know if I would have noticed if I were just listening to this pair in isolation, but in contrast especially to the Crossfade, the difference was quite stark. Maybe it’s simply a matter of preference; maybe it’s a sonic flaw; or maybe some longer burn-in is required. I know people on Head-Fi have complained about a boxy or tunnel-like character to vocals, but I don’t know if they were referring to the same effect since the soundstage and detail are otherwise amazing. (I couldn’t get the pair to sound any different when pressing down on both earcups, as one user suggested. The earcup seals hold very well for me even with glasses on.)

    I also found that the headphones had a very neutral sound. Vocals fell in line with the other instruments. Plucky guitars lost their warmth and reverb. Bass was certainly there, but reserved. Many songs sounded very immersive but lacked “wow” in their most kinetic passages. And especially combined with a hint of harshness in the upper registers (cymbals, occasional vocals), bassy electronic music tended to fall flat and lose its fuzzy, enveloping effect.

    The V-MODA Crossfade II, on the other hand, simply sounded fun. Incredibly fun. My notes repeatedly described it as sinking into a pillow of sound. Its signature felt intimate and warm. The soundstage was relatively small, but you could hear each instrument plucking away around you. And oh, those plucks! You could almost physically feel every string on the guitar. (My notes mentioned a rich, cozy, lush, plump, and enveloping sound.) Color-wise, bass is clearly emphasized (to the point of occasionally being boomy) and vocals float forward. In terms of detail, I’d say they beat the Sonys but defer to the B&Ws. You can still pick out the smaller details, but it’s not quite as clear and the instruments don’t have their near-binaural separation. (In fact, I noted that the soundstage occasionally felt a bit flat and that the vocals sometimes sounded a bit muddy or fuzzy.) Nonetheless, whereas the PX astounds in certain genres (jazz, classical) but flubs in others (downtempo, trip-hop), the Crossfade excels in music featuring bass, vocals, and acoustic guitars and does a good job with just about everything else. Best of all, this pair is generous with its “wow” moments. Crescendos, bass drops, and other dramatic passages hit you just as hard as the artists intended. At its worst, the Crossfade produces a hyperactive mass of sound. At its best, it grabs you and all but forces you to move your feet along. And unlike in the PX, the sound doesn’t demand your attention. You can sit back and just let it wash over you. (Incidentally, I knew going in that these were considered basshead cans, but I tried my best to avoid the trap of conflating bass with sound quality. Indeed, I thought the Sony headphones actually had a bassier sound, but there it felt hollow and reverby while here it was resonant, impactful, and lively.)

    An Inconclusive Conclusion

    Choosing between the PX and Crossfade has been rather difficult. I love the Icarus-like reach of the PX towards a higher audio plane but feel burned by its glitches and oversights. The aggressive power saving causing the unit to frequently turn off is quite annoying, though the long battery life (30-40 hours) is certainly a huge plus. Meanwhile, the Crossfade keeps things very simple and elegant on a technical level but suffers from a pretty short battery life (10-15 hours), along with incessant beeping towards the tail end. The Crossfade also has that frustrating hiss over silence in wireless mode, though wired performance is quite excellent. The PX stays perfectly silent when used wirelessly or over USB, but has an even worse hiss in 3.5mm wired mode. In terms of materials, the PX feels more premium but I think the Crossfade just looks better when worn. It’s also significantly more comfortable in a direct comparison. But despite the cushier materials, I’ve found the clamping force to be more annoying on the Crossfade than the PX over several hours. The PX also clearly wins out in terms of sound isolation, both passively and with the help of ANC. Sound-wise, the two headphones couldn’t be more different. The PX is spacious, detailed, and sterile, while the Crossfade is enveloping, warm, and resonant. Jazz and classical music spring to life with the PX, while plucky guitars, vocals, and electronic music reverberate through the Crossfade and make you want to get out of your seat. In terms of support, the PX are set to get frequent firmware updates, but V-MODA has an excellent reputation and a lucrative trade-in program. The choice is made even harder by the fact that I got the Crossfade on sale for $240 while the PX is fixed at $400 with no sales in sight.

    I think I’ll have to spend a few more hours listening to each set before I make up my mind!

    1. I even wrote an open source iOS app to help me with this: ios-bluetooth-headphone-unsleeper. All it does is play a near-silent audio stream when you hit the switch, forcing the MDR-1RBT to stay on while the app is running.

    2. Incidentally, the rose gold model costs $20 more than the others and happens to be the only one that supports the high resolution aptX codec. Why? I have no idea!


    (Sorry about the length! At some point in the distant past, this was supposed to be a short blog post. If you like, you can skip straight to the demo section which will get to the point faster than anything else.)

    Embarrassingly, most of my app development to date has been confined to local devices. Programmers like to gloat about the stupendous mental castles they build of their circuitous, multi-level architectures, but not me. In truth, networks leave me quite perplexed. I start thinking about data serializing to bits, servers performing secret handshakes and negotiating history, merge conflicts pushing into app-space and starting the whole process over again—and it all just turns to mush in my head. For peace of mind, my code needs to be locally provable, and this means things like idempotent functions, immediate mode rendering, contiguous data structures, immutable objects. Networks, unfortunately, throw a giant wrench in the works.

    Sometime last year, after realizing that most of my ideas for document-based apps would probably require CloudKit for sync and collaboration, I decided to finally take a stab at the problem. Granted, there were tons of frameworks that promised to do the hard work of data model replication for me, but I didn’t want to black-box the most important part of my code. My gut told me that there had to be some arcane bit of foundational knowledge that would allow me to network my documents in a more refined and functional way, without the stateful spaghetti of conventional network architectures. Instead of downloading a GitHub framework and smacking the build button, I wanted to develop a base set of skills that would allow me to easily network any document-based app in the future, even if I was starting from scratch.

    Continue Reading...


    The challenge: fit a rotating art gallery somewhere into my life.

    I love visual art and find it hugely inspiring. Unfortunately, reading art books is too much of a context switch to be a regular distraction, while museums are only appropriate for the occasional excursion. Instagram helps, but it only lets you see content from artists you follow. There’s still the 99% of art history beyond that sliver!

    Sourcing art wasn’t the problem. For years, I had been keeping a fairly large folder of inspiring images from places such as Imgur albums, RuTracker museum collections, /r/ImaginaryNetwork, and /r/museum. But leafing through them wasn’t enough: I needed to put them into a regular and random rotation in a place that was just out of eyeshot, but without becoming an overt distraction.

    In 2015, I finally solved the problem by building an app called Backgroundifier, which converted arbitrary-size images into wallpapers by superimposing them onto attractive, blurred backgrounds. By pairing an Automator Folder Action with the native wallpaper cycling functionality of macOS, I could now drop arbitrary images into a directory on my desktop and have them automatically show up in my wallpaper rotation. Peeking at an image was as simple as invoking the Show Desktop shortcut, and if I wanted to see something new, all I had to do was switch to a new Space.
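    For the curious, the gist of that Folder Action can be sketched in a few lines of Python. (This is purely illustrative: the real pipeline uses Automator and the Backgroundifier app, and the polling loop, function names, and the plain file copy standing in for the actual image conversion are all my own invention.)

```python
import shutil
import time
from pathlib import Path


def watch_inbox(inbox, wallpapers, poll_seconds=2.0, max_polls=None):
    """Poll `inbox` for newly dropped images and place processed copies
    into the folder that the OS cycles wallpapers from.

    `max_polls` limits the number of polling passes (None = run forever),
    which keeps the sketch testable.
    """
    inbox, wallpapers = Path(inbox), Path(wallpapers)
    seen = set()
    polls = 0
    while True:
        for image in sorted(inbox.glob("*.jpg")):
            if image.name not in seen:
                seen.add(image.name)
                # Backgroundifier composites the image onto a blurred
                # backdrop at this step; a plain copy stands in for it.
                shutil.copy(image, wallpapers / image.name)
        polls += 1
        if max_polls is not None and polls >= max_polls:
            break
        time.sleep(poll_seconds)
```

The real Folder Action is event-driven rather than polled, but the shape is the same: new file appears, conversion runs, result lands in the wallpaper rotation folder.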

    For the past few years, this scheme had been working fine. But recently, my collection had grown to over 500 images, and I found myself bumping into some slight annoyances. For example, I had no way to retrieve the filename of the current wallpaper, to remove an image from rotation, or to mark it as a favorite. Every maintenance task had to be performed manually.

    Finally, I decided to build a menu bar app that would solve all my problems through a unified interface: BackgroundifierBuddy.

    Continue Reading...


    It’s been a year and a half since I wrote my article on building a hassle-free Thunderbolt 2 eGPU for my late-2013 15” MacBook Pro. In all this time—with the caveat that I was exclusively using an external monitor through Boot Camp—I’ve had essentially zero issues with the setup. Booting would occasionally require one or two attempts to hear the Mac chime, but once you were in, performance was flawless. The GTX 1050 Ti was quite an excellent little card.

    However, my power requirements recently changed. When the Oculus Rift went on sale for $350, I just couldn’t resist picking one up. VR was new to me and I wasn’t sure if this was something that would stick, but my first experience with Beat Saber and The Climb left me in awe. This was, very clearly, a new category of human experience that was being offered for just a few hundred bucks! Unfortunately, latency was now an issue with some of the higher-caliber games such as Batman: Arkham VR, and 60–90fps was essentially a hard requirement for VR immersion. Together with the fact that many newer titles were starting to see poor performance even at lower settings and resolutions, I decided to explore upgrade options.

    Until recently, I had my eye on the single-fan EVGA GTX 1060 SC to eventually replace my 1050 Ti, since the dimensions were very similar. But in doing this round of research, I was surprised to discover that Gigabyte offered an actual, full-fledged GTX 1080 in almost the same form factor! Though slightly too tall to fit, the card was just the right length for my AKiTiO case, meaning that I could simply take the top off and pop it in without having to do any metal-bending.

    Continue Reading...


    I just released a drink-tracking app called Good Spirits. The code is available under the GPL.

    Following my CRDT explorations earlier this year, I wanted to build an app on top of a persistence layer that adopted the same kinds of resilient sync patterns, even if it didn’t use the actual low-level “ORDT” data structures described in the article. Having used a spreadsheet to track my drinking over the past few years, a dedicated drink tracking app seemed like the perfect little project.

    The main problem I was aiming to solve was architectural. Many apps establish rather unclear relationships between the main app, the persistent store, and the cloud. Does the local database or the cloud count as the primary store? When a change comes down from the UI layer, how does it eventually get synced to the local database and to other devices, and what code keeps track of all this state? Which system does the cloud talk to? How is offline mode dealt with? How do critical errors percolate through this pipework? And what does the in-memory cache have to say about all of this? A common result is that the data layer becomes a monstrous thing that deals with persistence, sync, caching, reachability changes, UI notifications, and many other systems, all at once.

    The reason for this mess, in my mind, is state. Sync often relies on systems catching every change and remembering their exact place in the whole process. If an update falls through the cracks, the data has a high chance of desyncing. Naturally, this leads to monoliths that try to claim ownership over every last bit of data.

    Ergo, my goal was to get rid of as much state as possible.
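    To make the idea concrete, here is a toy sketch, in Python purely for illustration (Good Spirits itself is written in Swift, and all the names below are invented): if every drink entry is an immutable record with a unique ID, then sync reduces to a set union, which is commutative and idempotent, so no part of the pipeline needs to remember which changes have already been applied.

```python
import uuid
from dataclasses import dataclass


@dataclass(frozen=True)
class DrinkEntry:
    # Frozen dataclass: immutable and hashable, so entries can live in sets.
    id: str
    timestamp: float
    name: str


def new_entry(timestamp, name):
    # Each entry gets a globally unique ID at creation time.
    return DrinkEntry(str(uuid.uuid4()), timestamp, name)


def merge(local, remote):
    # Sync is just set union: applying it twice, or in either order,
    # yields the same log, so no sync bookkeeping state is required.
    return local | remote


# Two devices with partially overlapping histories converge:
shared = DrinkEntry("a", 2.0, "Stout")
device1 = {DrinkEntry("b", 1.0, "IPA"), shared}
device2 = {shared, DrinkEntry("c", 3.0, "Cider")}
assert merge(device1, device2) == merge(device2, device1)  # commutative
assert merge(merge(device1, device2), device2) == merge(device1, device2)  # idempotent
```

A real data layer also has to handle edits and deletions (which is where the CRDT machinery from the earlier article comes in), but even this grow-only set shows why removing state simplifies the local-database/cloud relationship: a dropped or repeated sync pass can never corrupt the log.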

    Continue Reading...