Channel Description:

Alexei's pile o'stuff, featuring writing on software development, travel, photography, and more.



    Composer’s Sketchpad 1.2 is out! This is a major update with several new features, including audio export (via AAC), a new tool for shifting notes along the time axis, and a one-finger drawing mode. I figured this might be a good opportunity to write about something a bit more on the creative side: icon design!

    Having no practical design experience, I am very proud of the icon I created for Composer’s Sketchpad. A good icon is absolutely essential for marketing, so most app developers would recommend contracting out this delicate task to a real designer. But I’m stubborn: one of my higher-level goals in creating Composer’s Sketchpad was to get better at art and design, and I wanted the icon in particular — the thesis of my app! — to be my own invention.

    Going along with the idea that creativity flourishes under harsh constraints, these were the requirements I laid out for the icon:

    • It had to feature a reference to music.
    • It had to hint at the functionality, aesthetics, and interface of the app.
    • It had to roughly align within the iOS 7 icon grid while somehow subverting it.
    • It had to exhibit some dimensionality and flow. I didn’t want it to look flat or overly vectory.
    • It had to be logo-like: symbolic, bold, and simple.
    • But most importantly, it had to immediately catch the eye. As a frequent App Store customer, I knew well enough that even a slightly uninteresting app icon would warrant a pass, while an interesting icon might make people peek at the app description without even knowing anything about it. The icon was absolutely critical to my passive marketing. It was my calling card — the entirety of my app wrapped up in 512×512 pixels. No pressure!

    Weeks before starting work on the icon, I began to keep tabs on other app icons that I found interesting. I was already following musicappblog.com religiously for music app news, so I scoured their archives for inspiration. I also carefully looked through all my home screens as well as the App Store top charts for non-music influences. In truth, even among the cream of the crop, there weren’t many icons that I outright loved. Most of the ones that caught my eye kept things relatively simple — outlines, primary colors, subtle gradients — while preserving the circular motif of the iOS 7 icon grid. (Many of these happened to be Apple icons.) There were also plenty of icons that failed at either extreme, either by cramming too much color and detail into the tiny square, or by not providing nearly enough detail to make a minimalist design stand out.

    A few app icons I would consider eye-catching.

    Inspiration in hand, I first made a number of rough pencil sketches, most of which depicted a prominent musical note with some embellishment. Quality was not a concern at this point: I wanted to jot down as many ideas as possible even if they didn’t seem to hold much promise. In the midst of this process, I found myself feeling fairly ambivalent towards most of the designs I came up with, though I knew they could probably be moulded into something that followed my rules. Something about them just didn’t feel right.

    I still didn’t have much of a sense of how far my nascent design sensibilities could take me, and part of me started to give up hope of finding the perfect design. But when I came up with the sketch for the final swirly-tail icon (after running a few ideas by my folks — спасибо, мама — thank you, Mom!), everything suddenly clicked. I knew right then that this particular design would perfectly slot into the narrow niche defined by my requirements. For the first time, I thought that maybe I could pull this off!

    After making a few passes at the basic shape in pencil, I moved to the computer. My first attempts at a colored draft were very static. Doodling in Pixelmator with my Wacom tablet got me effectively nowhere, so I decided to just work in Illustrator directly — my first real stint with the software. As is typical with Adobe, the UI felt like a sprawling, bloated mess, but it also allowed me to do some surprisingly powerful things. The most important discovery was the non-destructive transforms — particularly the Pathfinder effects — in the inconspicuous “fx” menu at the bottom of the Appearance tab. With these tools, I gained the ability to perform boolean operations on sets of shapes, turn strokes into paths, and create complex gradients while still having full control over the constituent parts. Doing this across complex groups of layers wasn’t pretty, but it allowed me to freely experiment with new shapes without having to “bake” a final result and start all over again for minor adjustments.

    I’m sure experienced vector artists can use Illustrator to draft their ideas directly, but my process, as a beginner, was much more methodical. I started with the standard iOS 7 grid and drew a simple circle over the outer part. I typed an 8th note symbol in the center and looked through many fonts to find a pleasing shape for the flag. I rendered the note as a shape, added a scan of my freehand sketch in the background, and started dissecting the circle; it was split into several sections to make joining with the note flag a bit easier. After placing a connecting Bézier curve between the flag and the circle, fiddling with the control points to match my sketch, and adjusting the width to smoothly blend the circle and the flag, I had an outline that roughly matched my paper drawing. For this first pass, the rest of my time involved zooming out and adjusting the widths and tangents to make sure that everything looked smooth and contiguous.

    Some early experiments.

    Designing the colorful swish at the tail end of the circle came next, and it turned out to be the trickiest part of the process. I knew that this segment of the icon had to have flow, levity, and dimensionality without looking too realistic or skeuomorphic — and yet I couldn’t picture it in my head. I started with a simple 3-color gradient at the end of the circle that widened towards the bottom. This looked merely OK, but it felt too static. Adding more colors to the gradient and moving the left side of the tail into the circle helped, but it wasn’t enough.

    The first problem was nailing the outer curve of the tail. I tried many different shapes. Some looked like paintbrushes; some evoked waves; some resembled sand dunes. But none felt perfectly right. My “aha” moment was when I realized that I was subconsciously creating an Archimedean spiral with its origin at the note flag. I borrowed a spiral from Google Images and adjusted my curves to fit it. The shape finally came together.
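    For reference, an Archimedean spiral is simply r = b·θ in polar coordinates: the radius grows linearly with the angle, which is what gives the tail its even, unhurried widening. Below is a purely illustrative sketch of how such a curve can be sampled (the actual icon curve was traced by hand in Illustrator, not generated):

```python
import math

def archimedean_spiral(origin=(0.0, 0.0), b=10.0, turns=2.0, steps=200):
    """Sample points of r = b * theta around `origin`.

    The radius grows linearly with the angle, so successive windings
    are evenly spaced -- the defining property of an Archimedean spiral.
    """
    points = []
    for i in range(steps + 1):
        theta = turns * 2.0 * math.pi * i / steps
        r = b * theta
        points.append((origin[0] + r * math.cos(theta),
                       origin[1] + r * math.sin(theta)))
    return points
```

    Overlaying a curve like this on a sketch is roughly what tracing a spiral image from Google accomplished by hand.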

    Next came the colors. I learned that I could add more control points to the bottom of the gradient envelope, allowing me to roughly specify the curve of each vertical slice of the gradient. The next few iterations involved creating an almost cloth-like shape out of the envelope and fiddling with the blur between the gradient colors. Still, the distribution of the gradient stripes was unsatisfactory. No matter how much I adjusted the gradient distribution or the control points of the envelope, the swirls felt too busy at the origin and too lopsided further towards the bottom.

    Rough drafts closer to the final icon.

    I realized that what I wanted was precise control over the “density” of the gradient envelope, top to bottom. Hoping that Illustrator contained within its multitudes the solution to my problem, I Googled around and was elated to discover that I was correct. The Gradient Mesh tool, though a bit tricky to set up, allowed you to apply a gradient to a flexible rectangle with an inner Bézier-based grid. I could now adjust the precise distribution of color throughout the entire length of my tail!

    There were still some shape-related questions to answer, the most important being: how do I maintain the legibility of the note and circle? The tail was supposed to be in the background; above all else, I didn’t want the shape or colors of the tail to interfere with the appearance of the note. Initially, I assumed that the left edge of the tail (touching the blue stripe) should avoid the note head entirely by going under or above it. However, both options made the tail look misshapen and unattractive, ruining the wave effect. On a whim, I tried intersecting the note head with the edge and it worked! Instead of disrupting the legibility of the note, the line drew the eye to it. I also had concerns that the gradient would make the lower part of the circle hard to see, but this was easy to fix by simply following the shape of the circle with the red stripe.

    Finally, I wanted to make sure that each curve in the tail — the left edge as well as each dividing color line in the gradient — “rhymed” with the overall shape of the icon. The final curves were mostly determined by trial and error. Just as with my initial sketch, I “knew” as soon as I saw the winning arrangement that I had found an inflection point for my design. There was a strong sense of motion originating from the note flag, carrying through the circle, and spiraling back around into a colorful background wave. Even though I couldn’t picture it at the time, it was exactly the effect I was hoping for when I originally came up with the design!

    (I wish there was more to say about color selection, but in truth, it was done quickly and somewhat haphazardly. The background blue was derived from the primary color of my app, while the gradient colors were basically chosen on a whim.)

    The final curves of the gradient mesh.

    For the Lite version, I once again wanted to stick to App Store conventions while defying them just a bit. Most Lite icons have an ugly banner over the default icon that feels out of place with the rest of the design. I still wanted to have the banner for consistency, but I wanted it to work with my diffuse pastel aesthetic.

    First, I had to determine banner placement. I tried several of the usual positions and then quickly rejected them; they blocked off too much of the underlying icon. I then decided to give the diagonals a shot and discovered that the upper-right corner had several benefits: not only did it preserve visibility for the key parts of the icon, but it also complemented the motion of the circle while allowing some interesting colors to peek through. (Assuming some translucency, which felt likely.)

    Next, I had to find a good look for the banner. (This iteration was done in Photoshop, since its raster effects were far better than Illustrator’s.) A simple color fill felt too out-of-place, so I decided to try for an iOS-7-ish Gaussian blur; ideally, I wanted a bit of the white outline and some of the tail colors to show through without compromising legibility. To make it easier to pick the final position, I masked the banner onto a blurred render of the underlying icon, which allowed me to treat the banner as if it were simply a blurry window and move it around freely. It didn’t take long until I found a satisfying result.

    Drafts for the Lite version of the icon. The final icon is on the right.

    That’s about it! Against all my expectations when I started on this journey, I’m still pleased by my icon whenever I catch it on the home screen even half a year later. There are certainly changes I could still make — there’s not enough contrast, the colors aren’t perceptually balanced, the gradient divisions are still a bit lopsided and the origin of the swirl needs some work — but I would consider these nitpicks. The gestalt of the design is right.

    (And as an unforeseen bonus, the icon easily converted into a black-and-white stencil for the promo poster a few months later!)

    If there’s a foremost design lesson I took away from all this, it’s how many moments of inspiration occurred whenever I deviated from incremental adjustments and tried something more extreme. Adding a bit more curvature to a line didn’t yield any new insights, but turning it into a semi-circle gave me a completely new perspective on the shape. Changing the brightness slightly didn’t result in a satisfactory color palette, but ramping the slider all the way made me rethink my initial assumptions about the chromatic balance. It seems that if you’re stuck in a design rut, it can be a good idea to vastly overshoot and then dial down instead of trying to inch towards an optimal design with minor, conservative changes.

    Ultimately, it felt wonderful over the course of this project to engage with my creative side — a part of myself that I still consider a mystery. Every time a design decision “clicked”, it felt like a little miracle. No doubt this will only reinforce my stubborn desire to do all my own art in future projects!


    Indie App Reliance (08/21/16)
    Today, with a single tweet, the note-taking app Vesper has officially been shuttered. At its release, Vesper was widely promoted by the Apple indie developer community as the hot new thing to try. More than anything else, it had an excellent pedigree, with influential blogger John Gruber of Daring Fireball at the helm. Many hopeful users switched to it for their primary note-taking needs, expecting that features like Mac support would arrive in short order. If any app from this circle was destined to be a breakaway hit, it was this one. And now, with barely a mention, it’s all but swept away, after languishing for years with barely an update.

    This is not a post about why Vesper ultimately failed. There are plenty of others who will happily chat about the rusty economics of the App Store. Instead, I want to focus on the other end of this unfortunate feedback loop: the effect that these highly visible app shutdowns might have on App Store customers.

    Several bloggers have expressed curiosity as to why public interest in the App Store has waned so much. I can’t answer for everyone, but at least within myself, I’ve noticed an increasing and persistent reluctance to try new apps. It’s just that I’ve seen the same pattern crop up over and over again. Somebody releases an interesting new app, touting fantastic design and improved productivity. The app gains some (but not overwhelming) traction. The app gets a few updates. The app lingers for a few years. And finally, the app untriumphantly rides off into the sunset, taking entire years of not just developer time, but thousands of users’ ingrained habits with it. The case is clear: most apps — and especially indie apps — cannot be reliably expected to continue operating.

    After being burned so many times by products that have been pulled out from under me, I’ve unconsciously adopted a worrying philosophy for trying new apps: unless the app I’m using is backed by a large corporation or is outright open-source, I’m not going to use it for anything particularly important in my life. (And even then, certain corporations — ahem, Google — are put under further scrutiny.) I hate having to do this because many amazing UX advancements can be found in apps produced by smaller developers. (Apple folks love to talk about how certain categories of apps are design playgrounds.) But at the same time, I know that with these apps, there is an inevitable sunset e-mail waiting for me in the not-too-distant future. It’s gotten so bad that I’m starting to seriously consider switching most of the (snappy, beautiful, well-designed) productivity apps on my phone over to their (ugly, clunky) open-source alternatives, just because I know that OpenWhatever will long outlive the current App Store darling for that category. (1Password is one hot spot that immediately comes to mind. Losing them would be a disaster.) I don’t want to worry every day about whether these proprietary silos will suddenly go up in flames with all my carefully-constructed workflows and data in tow.

    Despite the low prices on the App Store, I now get decision fatigue whenever I go to purchase an app. How long is this product going to be around? How reliable is this developer? How easy is it to export the data? How open are all the underlying formats and APIs? The price might be insignificant, but the commitment implied by my purchase is not trivial at all! Unfortunately, developers don’t seem to care much about the mental toll that pulling an app might cause, even when they were the ones touting life-changing productivity and workflow improvements in the first place. It’s one thing I miss about Windows utility software: so much of it is terribly designed, but at least I know it’ll run more or less forever. (Both on account of the open platform and Windows’ amazing legacy support.)

    It’s understandable why developers shut down their apps, but I wish there were another way out of this dead-end. Maybe apps could certify that all their back-end services are provided by external vendors and can be swapped out if necessary. (This is why I’m not too worried about apps like Reeder and Pocket Casts: I know that if they go away, I can take my precious data and switch right over to another app.) Maybe developers could pledge — even with legal backing! — to open-source their software if they ever decide to stop supporting it. Or going even further into this mythical socialist utopia, how about we finally figure out a way to fund open-source software from the get-go without having to beg for donations? With services like CloudKit, it’s no longer even necessary to spend a single cent of your money on servers. What’s the point of bringing something wonderful into the world if it only lasts for as long as people are willing to buy it? I can’t help but see that as hopelessly cynical.

    To be clear: I’m not saying that developers should be expected to support and add features to their apps indefinitely. That would be a very extreme stance. But on the other end, adopting a scorched earth policy for your app once you tire of it is also pretty extreme and poisons the market to boot.

    Apps — products that encapsulate years of people’s lives — should never outright disappear just because a developer can’t be bothered to support them anymore. If we don’t have that assurance, and if we can’t rely on our tools, all we’re doing is playing with toys.



    I have to admit: I’m an analog kind of fellow. Much as I benefit from our growing roster of digital tools, I’m always on the lookout for software that reminds me of reality’s imperfect grit. Fake mechanical clock faces. Typewriter sounds. Simulated CRT monitors! Some might call them skeuomorphic, clunky, or even fraudulent; but in a world increasingly bent on making things shiny and pristine, I enjoy having a reminder of which side of the screen is the more important one.

    The same even applies to my work notes. No doubt, there are immense benefits to limiting your note-taking to professional software like OneNote or Google Docs, including obvious features like copy & paste and text search that we all rely on yet take completely for granted. But whenever I undertake a major project, any spare pieces of paper lying around (including napkins, envelopes, and candy wrappers) will inevitably become conscripted as scratch paper, despite the vast universe of affordances on the digital side of the divide. As much as I’ve tried to adapt my thinking to software, the cold, hard truth of digital type just doesn’t represent my thoughts very well. On paper, my ideas become non-linear: sometimes visually grouped with related bits of info, sometimes crawling up the sides, sometimes accompanied by quick sketches and diagrams.

    One day, after failing yet again to trace my line of thought in Evernote, I decided to just give in and buy myself a nice paper notebook. Despite my initial concerns about all the features I was giving up, the switch turned out to be remarkably liberating. I loved to dig around in my backpack for my latest set of notes; feel the ever-growing creases in the covers; flip past all the dog-eared pages. And of course, no app in the world could replicate the joy of jotting down a rickety diagram with ink flowing in its wake! It felt significant that I could hold in my hands a tangible artifact representing the course of my project, and I looked forward to the days when I would exhaust my current notebook and have to go shopping for a new one. The tactile pleasures of this simple thing couldn’t be reproduced by any computer.

    The digital world still beckoned, slightly. Largely spurred by my light-packing travels, most of my other media had become digital at this point. My work notes usually served the role of scratch paper, so I didn’t miss the search feature in Evernote too much. Still, it was a terrible shame that once a project was over, all these notebooks had to be thrown in a closet to gather dust in obscurity. I tried scanning bits of them in, but it was too much of a hassle.

    In September 2015, Apple announced the new iPad Pro along with their brand new stylus, the Apple Pencil. This was a necessary purchase for my work anyway, and so a new hope crossed my mind: could I finally reconcile the analog and digital worlds with this tech? Here was a tablet that could finally act like a digital pad of paper, sporting a glass-welded, ambient-light-adapting screen and a digitizer running at 120Hz with barely any latency. Just a few years before, you needed an enormous desk-sized piece of hardware to do the same thing — and people still complained about the lag. With the Apple Pencil, seemingly everyone (artists and reviewers alike) agreed that it was the closest thing to paper they’d ever tried.

    After receiving my new iPad, I started to search for a very specific kind of software. Plenty of great drawing apps were out already, but I wanted more than that. I wanted an app that would let me collect a roster of virtual notebooks, all sporting different shapes and covers. I wanted each notebook to have a variety of pages, customizable with their own type and texture. Most importantly, I wanted every notebook to exist as an open-format file on my Dropbox. Instead of relying on a proprietary app to access my notebooks, it was critical that I be able to leaf through them — and maybe even edit them! — using other software. It seemed that PDF might be suited for the task; I’d routinely used it for book scans to great effect, and I also knew that the underlying rendering technology — PostScript — was more than capable of displaying any manner of graphic. Perhaps there was an app that fused PDF creation and annotation with just the right amount of magic to make it work?

    There were a good handful of contenders, but three names kept coming up: Notability, Noteshelf, and GoodNotes. None were perfect. Notability had many bells and whistles and was clearly the audience favorite, coming up in any thread where people were talking about taking notes. Noteshelf was beautifully designed and seemed to be directly targeting my digital notebook use case, sporting flippable pages and an iBooks-like shelf for your notebooks. Unfortunately, both options felt fairly proprietary. If PDF export was even a feature, it was clearly treated in a throw-it-over-the-fence kind of way: the pristine copies of your notebooks only lived inside their respective app silos. Then there was GoodNotes. This was a subtle app without too much pizzazz and not overflowing with features. It almost resembled an Office-style product more than any of its hipper competitors. But the features it did offer were incredible.

    First, GoodNotes stored your drawings as vectors instead of rasterizing them out. Every line, shape, and highlight you drew was retained as a pristine geometric shape, preserving your work for the ages and offering 100% clarity at any zoom level. Any text that you wrote would automatically get OCR-ed, allowing you to actually search through your notes if your handwriting was legible enough. Covers and pages could be swapped with ease to any image of your choosing, even in the middle of an existing notebook. (Templates such as graph paper and even musical staff paper were included.) The UI and gestures were bog-standard iOS, and accordingly intuitive: it took little effort to figure out how to manage your documents and create new content. You still got all the “parity features” offered by other note-taking apps such as typing, shapes, image support, and more, which weren’t particularly relevant to me but still felt like they could be useful on occasion.

    Then there was the kicker. Even though your notebooks weren’t stored directly as PDF, you could opt to encode them into Dropbox right as you were working. The implementation here was simply remarkable. Each PDF produced was, functionally, a lossless copy of your notebook, harnessing the full power of PDF to reproduce every GoodNotes feature in full. Digging around in the generated file internals, I saw that everything was layered just as it was in the app. The background texture of each page was its own asset. Your writing was stored in its original vector form. Even the OCR text was there, hidden from view but layered on top of the original text and searchable using most PDF software. This was as close to a perfect copy of your notebook as you could get, and I felt confident that my data would be perfectly safe were GoodNotes to ever go out of business.

    (It should be noted that there’s one downside to GoodNotes’ vector drawing approach: performance is proportional to the amount of content on the page. In most cases, this isn’t a problem: loading takes no time at all and drawing is lag-free. But if one of your pages is especially dense with translucent lines and complex shapes, it might take a second for everything to tile in. In practice, the tradeoff of having a lossless copy of your data versus fixed performance is well worth it. I can wait a second if it means that my writing will look as crisp 100 years from now as it does today.)

    It’s hard to deny that you lose something in moving away from the physical world. No amount of code will give back the glide of an ink pen across cream-colored paper or the crinkle of a bound old set of pages. But you only trade one kind of magic for another. I can now switch colors and brush strokes in a snap. Mistakes can be erased and even undone with barely a thought. If I need to make a graph or draw some musical notes, I can switch out the “paper” I’m using to almost any other format — or even provide my own. And covers! Whereas in my paper notebook days, I could spend hours window-shopping for a cover with just the right look and feel, the joy of discovering the perfect vector image for the cover of my digital notebook comes very close indeed. Best of all, I now have a digital archive of all my notes without any extra work on my part. On finishing my work for the day, I can peek inside my Dropbox and leaf through a remarkable document featuring every paragraph, graph, and schematic I’d scribbled.

    Today, all my work-related notes are made directly on my iPad using GoodNotes. And it’s just wonderful!



    The category of static visual art is in a bit of an awkward phase right now. Entertainment in the 21st century has evolved to actively engage our minds and senses, to the point where movies, music, games, and even audiobooks require little more than putting on a pair of headphones or fixing our vision to the nearest screen. Where does the immense body of work from genres such as fine art, photography, and illustration fit into this world? Museums — physical beasts that they are — can hardly be visited on a whim, and as of yet there’s (sadly) no Spotify for visual art. Meanwhile, hundreds of amazing works are posted daily on Instagram, DeviantArt, and Reddit. How do we find the time to fit them into our content-saturated lives? And how do we return to view the works we’ve already enjoyed?

    For several years, I wanted to create a sort of “digital museum” that would give me random, on-demand access to this very important side of the art world. The constraints weren’t complicated. All I needed was a large amount of art along with a mechanism that would randomly show me new works from this collection every fifteen minutes or so. But while acquiring the art was hardly a problem, there were relatively few areas in my life where I could idly display images. Screensavers? Showed up too infrequently and weren’t easily controllable. Wallpapers? Couldn’t deal with arbitrary aspect ratios. I thought I had my solution when I ran the Google Art Project in a full-screen browser tab on a second monitor, but the selection turned out to be too limited and I could no longer rely on the luxury of having more than one display when I set out on my travels.

    (As an aside, Clay Bavor solved this exact problem in hardware by creating a digital photo frame that automatically compensated for ambient light. Amazing solution! But I’m a software guy, so…)

    After discovering Chris Tomkins-Tinch’s Artful app, which turned your desktop wallpaper into a rotating collection of fine art, I realized that I had given the humble desktop too little consideration. With a simple Gaussian blur, a soft drop shadow, and a sprinkle of magic, it was in fact quite simple to create dynamic backgrounds for images at practically any aspect ratio. But Artful was designed to automatically pull images from proprietary sources, whereas I already had a sizable “inspiration” folder of collected art that I wanted to add to the mix. I also wished to keep my system as clean and simple as possible: Artful interfaced directly with your system preferences, but I much preferred to just keep a wallpaper folder that I’d occasionally drop new images into. And so a new app was born: Backgroundifier, a native converter droplet that let you easily turn arbitrary images into lovely desktop wallpapers.
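    The underlying trick is simple arithmetic: scale a copy of the image up until it fills the frame (that copy gets the Gaussian blur), then scale the original down until it fits, centered on top. A minimal sketch of that layout math follows; the function name is my own invention, and Backgroundifier’s actual internals may well differ:

```python
def wallpaper_geometry(img_w, img_h, out_w, out_h):
    """Compute the layout for an arbitrary image inside a fixed wallpaper frame.

    The background layer is scaled to FILL the frame (and would then be
    blurred); the foreground is scaled to FIT, centered over it.
    """
    fill = max(out_w / img_w, out_h / img_h)  # covers the frame, may overflow
    fit = min(out_w / img_w, out_h / img_h)   # fits inside, may letterbox
    fg_w, fg_h = round(img_w * fit), round(img_h * fit)
    return {
        "bg_size": (round(img_w * fill), round(img_h * fill)),
        "fg_size": (fg_w, fg_h),
        "fg_origin": ((out_w - fg_w) // 2, (out_h - fg_h) // 2),
    }
```

    A portrait photo on a 1920×1080 desktop, for instance, ends up pillarboxed over a blurred, frame-filling copy of itself.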

    Just having this app around increased my consumption of art tremendously. But it wasn’t enough. I wanted to bridge the gap between finding an image on the web and having it appear in my desktop rotation, and I also wanted to be able to show new works of art on a whim. Fortunately, macOS is no slouch! Using Backgroundifier’s command-line mode, Automator, and the native power of Mission Control and Spaces, I’ve finally been able to create the digital museum experience I’ve always wanted.

    Naturally, the process begins with finding the art.

    Where’s the Art?

    Some people want their art carefully curated, and there are a number of existing apps and services for that. (See the aforementioned Artful and the Google Art Project.) Not me, though! I want everything in my wallpaper shuffle: the “great artists” of the past; modern digital and concept art; Russian textbook illustrations; architectural photography. Much of my daily discoveries come from Reddit, and though the site is an awful cesspool in many respects, subs like r/imaginarylandscapes, r/cozyplaces, r/specart — and even plain old /r/art and /r/photographs — make it all worthwhile. Whenever I run into an interesting new sub specializing in visual art, I immediately sort by the top posts of all time and pull my favorite images from that list. (Fun tip: if you ever run into an Imgur gallery that you particularly like, you can find a link at the bottom to download the entire collection as a zip! I’ve done this with things like Miyazaki backgrounds.)

    If you’re interested in scouring some of the less savory parts of the web, there are Russian torrent sites featuring comprehensive collections of art from practically any famous artist or museum you could think of. There’s nothing particularly unethical about this approach — a lot of older art is at this point public domain, after all — and it’s quite an experience to drop “The Best of the Louvre” into your background rotation for a week.

    Running every single file through Backgroundifier and plonking it in your wallpaper folder is bound to be a chore. Fortunately, this can be entirely automated using Backgroundifier’s command-line mode and macOS’s native Automator.

    Harnessing the Command Line

    Although Backgroundifier presents a user-friendly GUI, it can also be accessed through the command line. (To see how one can make such a dual-mode app in Swift, you can examine my code here.) One way to do this is to navigate to your Backgroundifier.app bundle in Terminal and run the Backgroundifier executable found in the Contents/MacOS subdirectory. With the standard --usage flag, you can view all the options available to you. (Some of these aren’t even accessible through the GUI!)

    The simplest way to process a file is to run Backgroundifier -i /path/to/input_image.jpg -o /path/to/output_image.jpg -w 1920 -h 1080. Unfortunately, because Backgroundifier is a sandboxed app, you can’t just do this for any random directory. Whereas a sandboxed GUI app can expand its sandbox to include any directories opened through the file picker or dropped directly onto it, command line apps (to my knowledge) have no such ability. You can therefore only process files located in your ~/Pictures directory.

    Fortunately, there’s another way. In the Resources directory of the Backgroundifier.app bundle, there’s a zip file containing a non-sandboxed, Developer ID-signed version of the command line tool. Extract it, and you can use it in any directory you please.

    Magic Folders with Automator

    Automator, macOS’s powerful visual scripting tool, can be used to create so-called “Folder Actions”, or workflows that run whenever the contents of a predetermined directory are changed. As you might expect, this is ideal for file conversion. Below is my Folder Action workflow for automatically “backgroundifying” images into a separate output directory:

    Item 2 contains the path to the output directory and item 3 contains the path to the Backgroundifier command line utility. (They exist as separate items to make the paths easy to modify without having to resort to scripting.) Here’s the full text of the shell script at the end of the workflow:

    # assign paths
    bgify=$1
    output=$2
    
    # remove path arguments
    shift 2
    
    # process images
    for var in "$@"
    do
        filename=$(basename "$var")
        full_output="$output/$filename"
        echo "Processing $full_output ..."
        "$bgify" -i "$var" -o "$full_output" -w 2560 -h 1600
    done
    

    Nothing too complicated! You can find the workflow file here, and I assume you can just drop it into your ~/Library/Workflows/Applications/Folder Actions directory. You can also pretty easily recreate it from scratch: just make a new Automator workflow with a Folder Action document type and copy the items.

    Whenever I find an interesting new image on Reddit, all I now have to do is drag-and-drop it straight from my browser into the designated Input directory on my desktop. macOS and Backgroundifier automatically take care of the rest.

    Dealing with the Desktop

    macOS’s desktop background settings allow us to pick a source directory and change the background to a random image at a set time interval (with 30 minutes being the default). All we really need to do here is drag the output directory from the previous step into our list, select it, check “Change picture” and “Random order”, and set our desired time interval.

    It’s no fun to manually move every window out of the way whenever you want to peek at your wallpaper. Fortunately, there are several macOS-native shortcuts for showing the desktop. One is to use a four-finger trackpad pinch, selectable under Trackpad → More Gestures → Show Desktop in System Preferences. Personally, I prefer binding the action to a keyboard shortcut: Command-Option-down, to go with my assigned Command-Option-left and right shortcuts for switching spaces. You can do this under Keyboard → Shortcuts.

    Some of us are… more messy than others. The desktop can acquire quite a bit of cruft over time, blocking view of the beautiful art below. But why bother cleaning it up when you can just sweep the mess under a rug? If you’re lazy like me, you can toggle visibility for the icons on your desktop by running this simple script:

    #!/bin/sh
    # Toggles desktop icons.
    
    if [[ $(defaults read com.apple.finder CreateDesktop) = false ]]; then
        echo "Showing desktop icons."
        defaults write com.apple.finder CreateDesktop true
    else
        echo "Hiding desktop icons."
        defaults write com.apple.finder CreateDesktop false
    fi
    
    killall Finder
    

    And voilà! Clutter-free art with hardly a fuss.

    Spaces & Showing New Art

    Here’s where it all comes together. One of my favorite macOS features is Spaces, or virtual desktops. Spaces have an extra hidden benefit for our use case: whenever a new Space is created, its desktop background settings are taken from the previous Space. This means that any new Space created in our configuration will automatically arrive with a fresh work of art in tow!

    Whenever you wish to see a new work of art, just pop open Mission Control (in my case, bound to Command-Option-up), create a few new Spaces, and keep switching Spaces to the right. It’s just like leafing through an art book!

    And that’s all it takes to create your own personal art gallery using Backgroundifier. No mysterious system overrides or hacks. No 3rd party tools of unknown provenance. Just a Unix-y converter, an Automator script, and a couple of native macOS features to tie it all together.

    It’s quite a thing knowing that a new, enriching artistic discovery — be it a Picasso, a Van Gogh, or even a Mike From Around The Web — is only a quick peek away!



    My late-2013 15” MacBook Pro’s discrete GPU — an NVIDIA GeForce GT 750M — was pretty good for gaming during the first year of its life. But around the time that the new generation of consoles dropped, AAA games on the PC started becoming unplayable, even at postage-stamp resolutions with the lowest possible settings. I lived on a strict diet of indie games from 2015 to 2016 — thank goodness for well-tuned titles like Overwatch and The Witness! — but the itch to try games like the new Mirror’s Edge and Deus Ex became too great. Initially, I thought it might be time to switch out my MacBook for the upcoming 2016 model, but the winter reveal wasn’t particularly tempting: CPU performance was about the same as mine and the GPU was — at best — 3 times as powerful. (Still need to see the benchmarks on that — educated guess.) Worth it for a few hundred bucks, but $2000? No way!

    Building a gaming PC wasn’t an option due to my mobile lifestyle, and in any case the kind of CPU I could buy for cheap would be comically underpowered compared to the i7 4850HQ I already had in front of me. So I started looking into the scary world of external Thunderbolt GPUs, colloquially known as eGPUs. Modern Thunderbolt 3 (allegedly) supports external GPUs in an official capacity, but older Thunderbolt 2 can get the job done as well, even though it’s unsanctioned by Intel. I’m usually reluctant to pursue these sorts of under-the-radar hobbyist projects, but there was enough prior art to make it worth a shot!

    Unlike many gaming enthusiasts, my goal was to optimize for simplicity over power: the fewer hacks and workarounds I had to use, the better. I already knew I’d have to use an external monitor and do my gaming in BootCamp, which was already the case. I knew there would be some performance loss from the limited bandwidth of TB2. I gathered that there may be timing issues and other problems that would require a bevy of software hacks to fix — mostly on the Windows side of things. But I was most concerned about the hardware hacking required to get the thing up and running in the first place.

    The majority of published eGPU builds involve enormous graphics cards connected to hotwired desktop PSUs, sitting in unseemly, torn-apart Thunderbolt-to-PCI chassis. It was clear that the anointed case for the job was the AKiTiO Thunder2. The Thunder2 wasn’t designed for eGPU use, but dozens of eGPU enthusiasts on forums like TechInferno demonstrated that it ran stable and performed admirably. (AKiTiO engineers even popped in on occasion to offer under-the-table eGPU advice — off-warranty, of course.) It was also one of the cheapest options on the market at around $200: very fair considering that a barebones development Thunderbolt 2 board cost nearly as much!

    Most eGPU builders buy this case to hack up, not to use as-is. Usually, the front panel is bent back or removed to fit larger cards, and then a desktop PSU is made to turn on with a paperclip and adapted to fit the DC plug. There are also arcane startup rituals to get everything powered and running with the right timing. I really didn’t want to have a PSU octopus and a ragged hunk of metal sitting bare on my table, though it sadly seemed inevitable. Then I discovered an alternate route.

    Most GPUs are power hogs that rely on one or two extra power ports on top of the card, but there are a few designed to pull power straight from the PCI slot. These aren’t super-extreme gaming cards, but these days they more than get the job done. For example, the just-released NVIDIA GeForce GTX 1050 Ti can pull 1080p at medium-high settings in many recent games and currently benchmarks as the ~40th best video card on the market! Better yet, many of these single-slot offerings are half the length of the monster enthusiast cards, easily fitting into AKiTiO’s compact case without any modifications. Using this type of card, I’d be able to keep my Thunder2 in one piece and avoid using a PSU entirely. No hacks required!

    At peak, these slot-powered cards can draw 75W from the PCI Express slot. Unfortunately, the AKiTiO Thunder2 only comes with a 60W adaptor, 30W of which is allocated to the circuitry. A dead-end? Not so fast: as stated in the official docs and verified by employees, the Thunder2 can actually pull as much as 120W from a more powerful adaptor. To be compatible, the new power brick needs to sport a 5.5×2.5mm barrel plug, provide 12V output, and have center positive polarity. (Practically every power adaptor has these last two items listed on the back.) My hope was to find a laptop power brick with these same specs, but it turned out that most laptops used chargers with an all-too-high output of 20V.

    Surprisingly, well-reviewed 12V/10A bricks weren’t common at all on Amazon (unless you lived in the UK or Europe), with most of the listings taken up by rebranded versions of a sketchy-looking adaptor with model number CT-1250. Eventually, I discovered one vendor who was selling bricks with model number CD120100A, which had a more confident label and looked identical to a power brick I saw in another successful closed-case AKiTiO build. (The Amazon listing was full of typos and the product photos didn’t match the user photos, but it just so happened that the adaptor in the user photos was exactly the one I was hoping to find — and Prime allowed for painless returns in any case.) If the US 12V/10A adaptor market was really dominated by CT-1250 and CD120100A, the latter just seemed like a better bet.

    For the graphics card, I decided to give the EVGA factory-overclocked version of the 1050 Ti a try, since one eGPU enthusiast mentioned that their EVGA card handled boot timing issues a bit better. (True or not, I’ve also had positive experiences with EVGA warranty and support in the past, so it was an easy decision.) Potentially, the overclock was a problem: the AKiTiO Thunder2 wouldn’t provide more than 75W of power to the slot, and any excess power pulled by the card could destabilize the system or even fry the circuitry (as reported by one user). But from everything I read, factory-overclocked EVGA cards were designed to never exceed the 75W threshold, and any instability could simply be fixed by underclocking the card slightly using EVGA’s (or possibly NVIDIA’s) own tools. Factor in the fact that the non-overclocked version cost exactly the same as overclocked while probably having lower resale value, and it became clear that the SC model was almost certainly the better buy — even if you dropped the clocks right from the get-go.

    (Note: many reviews will point out that the regular 1050 is a much better deal than the 1050 Ti from a price/performance perspective. Still, the Ti is about 20% faster than the non-Ti for just $20 more, and for the sake of future-proofing as well as TB2 performance loss it just makes sense to wring as much power from the purchase as possible.)

    Trawling eGPU forums for installation instructions was quite frustrating. Most users preferred to write about how they got their eGPUs working with their laptop displays (using Optimus drivers — possible with NVIDIA GTX cards) and/or in OSX. Both tasks involved copious scripts and hacks. I was only interested in the bare minimum — BootCamp on an external display — but most guides simply skipped that “easy” part. Would I need to make a custom build of Windows? Edit drivers? Install a custom bootloader? Nothing was clear, so I decided to just jump into it.

    Once I got all the parts assembled, I plugged the Thunder2 into my laptop and my monitor into the Thunder2, crossed my fingers, and turned on the computer while holding down the Alt key (for the boot menu — I already had BootCamp with the latest Windows 10 installed). At first… nothing. Just a black screen and no chime. I tried unplugging the cable, turning the machine on, waiting for the chime, and then plugging it in. The boot menu showed up, but froze when I selected Windows. I tried one more time to boot with the cable plugged in and it worked! Or — at least, it booted into Windows. Nothing showed up on the external display, but the Windows Device Manager had a tempting entry named “Microsoft Basic Display Adapter”. Hopeful, I searched for other eGPU users who had gotten to this step, and it became apparent that all I had to do was install the latest NVIDIA drivers. One reboot later (with no issues this time) and I was seeing “NVIDIA GTX 1050 Ti” in my Device Manager. I gave Overwatch a quick run on the highest settings, but performance didn’t seem particularly great; my suspicion was that the laptop defaulted to the discrete 750M instead of the eGPU. I returned to Device Manager and disabled the 750M, restarted Overwatch, and… 60fps! It actually worked! Holy cow!

    eGPU setup can be daunting depending on your hardware, but I seem to have gotten away with a problem-free configuration. The “hardest” part is getting the computer to chime on boot, presumably indicating that POST went correctly. This involves turning the computer off and on again one or two times in the worst case: if it chimes and the boot menu appears, everything is sure to work fine. (Recently, I’ve been getting the boot menu on first try 100% of the time. Maybe I was just impatient before!) Once booted into Windows, I’ve learned that simply changing the display settings to only use the external monitor, or to extend the desktop and use the external monitor as the main monitor, ensures that the eGPU is used over the discrete chip. (And I believe Windows remembers this preference when you launch with the eGPU connected.)

    Now for some benchmarks! The main bottleneck in this setup is the TB2 connection. TB2 doesn’t allow for the full PCIe x16 throughput, potentially crippling graphics card performance. In practice, this isn’t really that big of a deal: users have reported at most a 20% performance loss over native, and usually a bit less. Let’s see how well we do.

                              GTX 1050 Ti SC    GT 750M    Improvement
    3DMark Fire Strike
        Graphics Score              6993          1911        3.66×
        Graphics FPS 1             32.28          8.74        3.69×
        Graphics FPS 2             28.74          7.96        3.61×
    3DMark Time Spy
        Graphics Score              2040           450        4.53×
        Graphics FPS 1             13.67          3.00        4.56×
        Graphics FPS 2             11.43          2.54        4.50×
    3DMark Sky Diver
        Graphics Score             22564          5602        4.03×
        Graphics FPS 1            102.25         26.41        3.87×
        Graphics FPS 2            103.83         24.80        4.19×
    3DMark11 Free
        Graphics Score              8802          2445        3.60×
        Graphics FPS 1             42.83         11.27        3.80×
        Graphics FPS 2             42.18         11.40        3.70×
        Graphics FPS 3             54.32         15.52        3.50×
        Graphics FPS 4             25.13          7.39        3.40×

    Quite an upgrade! According to Passmark and other benchmark listings, a 1050 Ti should, under normal circumstances, be about 4.5× as powerful as a 750M. Factor in 10%-20% performance loss from the TB link and that’s exactly what we see in our results: a 4x boost on average.
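    The back-of-the-envelope math is easy to check. A quick sketch (the helper function is mine, just for illustration; the 4.5× and 10–20% figures come from the listings and reports mentioned above):

    ```c
    #include <stdio.h>

    /* Expected real-world uplift after the Thunderbolt 2 bandwidth penalty:
       the raw card-vs-card ratio scaled by the reported bandwidth loss. */
    double expected_uplift(double raw_ratio, double tb_loss) {
        return raw_ratio * (1.0 - tb_loss);
    }

    int main(void) {
        /* ~4.5x raw (Passmark-style 1050 Ti vs. 750M), 10-20% TB2 loss */
        printf("pessimistic: %.2fx\n", expected_uplift(4.5, 0.20)); /* 3.60x */
        printf("optimistic:  %.2fx\n", expected_uplift(4.5, 0.10)); /* 4.05x */
        return 0;
    }
    ```

    The predicted 3.6×–4.05× window brackets the ~4× average seen in the table above.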

    Even without any underclocking, stability has not been an issue. I’ve been playing hours of Crysis 3, Far Cry 4, and Mirror’s Edge Catalyst over the past few days and everything’s still working great. I’m keeping the case closed, but I don’t think there’s any real risk of overheating: the GPU fan is designed to funnel heat right out through the back, and there’s an extra front fan built into the case anyway. According to 3DMark, temperature during benchmarking has been stable.

    I’m not interested in running any weird scripts to get Optimus drivers for the internal display working, but I learned something interesting while fiddling with the Windows display settings. If you set the multiple display setting to “Duplicate these displays”, it seems that somehow the eGPU gets used for both the internal and external display! Assuming I’m interpreting this finding correctly, this means that theoretically you could buy something like this HDMI display emulator and use the eGPU on the internal display without an external monitor and without having to go through the hacky process of getting Optimus up and running. Unfortunately, there’s a performance penalty of about 20%-25% (according to my benchmarks) as well as approximately 0.25 seconds of latency, making this approach untenable for first-person shooters and other twitchy games. (I wonder if this is also the case with the Optimus driver route?)

    Another interesting finding: if you keep the discrete GPU enabled, there’s a setting in the NVIDIA control panel to dedicate one of the GPUs to PhysX. I’m not sure if this will make a real difference in performance or cause stability issues, but it might be worth investigating in the future.

    To summarize, using only…

      • an AKiTiO Thunder2 enclosure,
      • a slot-powered graphics card (here, the EVGA GTX 1050 Ti SC), and
      • a 12V/10A replacement power adaptor,

    …you can assemble a painless, hack-less eGPU build and use it with your late-2013 15” dGPU MacBook as a relatively inexpensive graphics upgrade compared to building a PC from scratch or buying a console. (Cheaper still if you wait for rebates or use an older/weaker X50 card.) Caveat emptor: the same build might not work so well — or at all! — on other MacBook models or even with a different driver version. In other words, what worked for me might not work for you! Remember that eGPU on TB2 is not officially supported and mostly works by accident, though clearly it can work very well.

    (Also, there’s some great information in the HN thread for this post about new and upcoming TB3 enclosures. If you can get one working with a TB3-to-TB2 adaptor, it might be the best option of all for upgradability, reliability, and future-proofing. On the other hand, you’ll probably spend more money and the case will be a lot bigger. Do your research!)

    In time, I hope somebody releases a Thunderbolt 3 eGPU the size of one of those Square credit card readers — maybe sporting a GTX 980M caliber chip? — that plugs into a USB-C port and works seamlessly with the internal display. But for now, this lovely little eGPU will do just fine. I’m confident that my trusty MacBook can now serve me for another few years, especially if NVIDIA continues to release excellent and inexpensive PCI-powered cards on the regular.

    Let’s hope that the eGPU revolution is just beginning!



    Last month, I released an unusual little app for iMessage. It’s called MusicMessages!, and it’s a collaborative step sequencer that lets you work on short pieces of music together with your friends. As far as I can tell, it’s the only app of its kind in the iMessage App Store. (Probably for good reason!)

    The app presents you with a grid of buttons, each corresponding to a musical note. Time is horizontal and pitch is vertical, and the entire grid can be panned like any other iOS scroll view. To place a note, simply tap one of the buttons; tap it again to erase the note. (If you have a 3D Touch capable device, you can depress the button using finger pressure. On an iPhone 7, there’s even a bit of haptic feedback at the end.) The tabs on top of the screen represent independent layers of notes, and if you tap their icons, you can pick a new instrument out of 40+ different ones (including percussion). Once you’re happy with your portion of the piece, you can send it off to one or more fellow iMessage users for their contributions. Each participant’s notes show up in their own unique color, making it easy to track the changes to a piece over time.

    Why iMessage? Since releasing Composer’s Sketchpad, I’ve wanted to create a companion app that would make it even easier to play around with simple musical ideas, though at the expense of expressiveness. Initially, I envisioned this as a tabbed, pannable, Minesweeper-like step sequencer for OSX. But when I started investigating the new iMessage frameworks in iOS 10, I realized that iMessage might be as good a place as any to work out this idea. No sync issues, no file I/O, a format that incentivized short experiments, and plus — the social aspect just seemed neat! Wouldn’t it be fun to riff on a melody or percussion line with your friends?

    Total development lasted exactly two months and involved approximately 8000 new lines of Swift code, plus 1000 lines and a bunch of assets borrowed from Composer’s Sketchpad.

    Favorite tech bit? The data format! I hate spinning up and maintaining servers, so my aim was to avoid any outside dependencies by sending data strictly through the iMessage APIs. Unfortunately, iMessage sends data via NSURL, which in this case had a hidden limit of 5120 characters. I hit this limit with plain old NSArchiver after about a dozen notes. To solve the problem, I had to compress all my data — 5+ layers, 5+ participants, and as many notes as possible — into approximately 3.75kb, assuming base64 encoding for the data string. Swift is pretty terrible at dealing with tightly-packed data structures (a 256-element static array can only be represented by a non-iterable 256-member tuple) and so I designed a struct and corresponding helper functions for my data in straight C. Lots of fun counting bits and optimizing for maximum data density… eventually, I settled on a maximum of 12 layers, 8 participants, and 1120 notes, along with a ton of extra data and even some room to spare. Nothing terribly complex, but it’s still fun to optimize within tight constraints.
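    The post doesn’t show the actual layout, but the general bit-packing approach can be sketched in C. All of the field names and bit widths below are my own guesses for illustration — not MusicMessages’ real format:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* A hypothetical bit-packed note. Seven bits cover a MIDI-style pitch
       range, four bits cover 12 layers, three bits cover 8 participants. */
    typedef struct {
        uint32_t pitch       : 7;  /* 0-127 */
        uint32_t position    : 10; /* step index along the time axis */
        uint32_t layer       : 4;  /* up to 12 layers */
        uint32_t participant : 3;  /* up to 8 participants */
    } __attribute__((packed)) PackedNote; /* 24 bits -> 3 bytes on GCC/Clang */

    int main(void) {
        printf("bytes per note: %zu\n", sizeof(PackedNote));
        printf("1120 notes:     %zu bytes\n", 1120 * sizeof(PackedNote));
        return 0;
    }
    ```

    With this kind of packing, 1120 notes fit in 3360 bytes — comfortably inside a ~3.75kb budget, with room left over for layer and participant metadata.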

    Another feature I enjoyed integrating was the perceptually-balanced HSLUV color space for all my user-selected colors. Normally, if you generate colors in the usual HSB color space by varying the hue and keeping saturation and brightness constant, you get colors that are perceived as unequally bright by the human eye. (An artifact of biology, alas.) Perceptually-accurate color spaces like CIELUV attempt to compensate for this, but most of them have large swaths of empty space where impossible colors lie, making it very difficult to create linear ranges of color parametrized by hue. HSLUV goes one step further and stretches the chroma to fill in these gaps. Not perceptually perfect, but just a ton more convenient and usable in practice!
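    To make the HSB problem concrete: pure red and pure green share identical HSB saturation and brightness (both 1.0) and differ only in hue, yet the standard Rec. 709 luminance coefficients rate green as more than three times as bright. A minimal demonstration (assuming linear RGB inputs):

    ```c
    #include <stdio.h>

    /* Relative luminance of a linear RGB color, using the Rec. 709 / WCAG
       coefficients. The two colors below differ only in HSB hue. */
    double luminance(double r, double g, double b) {
        return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    int main(void) {
        printf("pure red:   %.4f\n", luminance(1.0, 0.0, 0.0)); /* 0.2126 */
        printf("pure green: %.4f\n", luminance(0.0, 1.0, 0.0)); /* 0.7152 */
        return 0;
    }
    ```

    Spaces like CIELUV and HSLUV are designed to compensate for exactly this imbalance.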

    Since there’s an element of self-marketing in iMessage apps — recipients of app messages are automatically prompted to download the corresponding apps — it was important to make my app free. As I really didn’t want to plaster my interface with ugly ads, I decided to lock some non-critical features behind an in-app purchase. I’d never dealt with this payment model before, and as a complete novice in cryptography the code samples for receipt decryption and validation seemed quite daunting! Fortunately, I discovered an excellent OSX application called Receigen that generated auto-obfuscated receipt and IAP validation headers for my app. Ended up saving what probably would have been several days of frustrating, unrewarding work for just $30. Highly recommended!

    As before, designing the icon was a lot of fun. Just like last time, there was a long period in the middle where I was sure that the right design — one that would equally hint at the interface, functionality, and ambiance of the app — would elude me. And just as before, after a chain of prototype designs that I wasn’t crazy about, the right pieces suddenly snapped into place all at once. On a lark, I even spent a few days parametrizing and animating the icon for my trailer, adding another 900 lines of code through Swift Playgrounds. (Next time, I should probably use something like After Effects or Flash. Keyframing in code is a huge pain, and performance in Playgrounds is hardly sufficient.) The thrill of creative experimentation and discovery is something I sorely miss in my day-to-day programming and makes me all the more eager to get started on my game project.

    Speaking of Adobe, I finally moved on from iMovie to Premiere Elements for my trailer. What a relief! Although deceptively simple at first, PE conceals enormous power in its effects and keyframing features. In trademark Adobe fashion, the program does its best to infuriate you into almost paying for the full CC; but with some clunky zoomed-in Bézier adjustments and begrudging cut-and-paste alignment of keyframe positions, it’s easy to create a video that moves, changes color, and feels very dynamic. The trailer I saw in my head came together in just a few days, and now iMovie feels like a joke in comparison. Well worth the $50 I paid for it on sale.

    MusicMessages! was an attempt at a speed project, so there are many stones left unturned. The UI takes up too much room. The instrument tabs in horizontal mode are too hard to reach. Transitions are jittery and some of the UI glitches out on rotation. There should probably be a chord option for beginners. Percussion is in MIDI order, which is… a little bit crazy. But overall, I’m quite happy with the result! I hope people get a kick out of this weird project and enjoy sending their oddball musical ideas to each other.

    One more thing. There’s a good chance I’ll be releasing a standalone, file-based version of the app in the future (with MIDI, IAA, Audiobus and all that good stuff). If you’d be interested in using such an app, do let me know!


    An Even Better Travel Gaiwan (01/08/17)

    Previously, I wrote about the Asobu Travel Mug as an excellent (if unintentional) travel gaiwan. Now, there’s a new leader in the not-a-gaiwan-but-almost-better-than-one category: the Klean Kanteen 8oz insulated tumbler.

    This mug is a bit thicker than the Asobu, but in trademark Klean Kanteen fashion the quality is simply superb. Heat is retained perfectly: there are no hot spots around the lip or anywhere on the body. Compared to the flaky finish of the Asobu, the matte black of the Klean Kanteen is slick and feels like it’ll last for ages. The shape is a little odd on first glance but feels great in the hand, and the rounded lip is perfect to drink from.

    Like the Asobu, the Klean Kanteen has a rubber-lined lid that can double as a strainer. For the most part, I use the sipping hole to strain: the lid snaps on very tightly and most loose-leaf teas expand enough to avoid going through the hole. (You might get a few stragglers, but the same thing happens with my regular gaiwan technique anyway.) If that doesn’t work, you can just pop the lid off and use the rubber seal as a makeshift strainer. As with the Asobu, the “lever” on the back of the lid can serve as a stopper while tilting it back. Admittedly, I did prefer the Asobu lid for its looser fit — the Klean Kanteen takes some strength to pop open! — but it’s a very minor ding on an otherwise excellent product. (Also, this might entirely be in the realm of personal preference. The Klean Kanteen lid looks and feels like it was precisely machined to fit the tumbler, which is a far cry from the ramshackle Asobu construction.)

    The mug fits about 7.7 ounces of water when filled right up to the lid, though you’ll get less when factoring in the tea leaves. It’s the ideal size for a single-serving cup of tea and about twice as big as your typical gaiwan. (Of course, there’s no issue using it for smaller steepings.)

    (As an aside: it took me way too long to realize this, but in addition to using a gram scale to measure out the exact amount of tea, you can also use it to measure the precise volume of water desired. This is because 1ml of water normally weighs 1g. Before, I used to eyeball the water; now, I just pour the water into the mug right after weighing the tea. This might seem super-finicky, but I’ve internalized Eco-Cha’s recommendation to use 9g of tea to 175ml of water for oolongs as a starting point, and it’s really nice to have reproducible results when comparing different steepings. The only question is whether to subtract the weight of the tea from the weight of the water, especially as the leaves expand. My hunch is yes.)
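    The scaling itself is one line of arithmetic. A tiny sketch (the helper name is mine), using the 9g : 175ml starting ratio and the fact that 1ml of water weighs about 1g:

    ```c
    #include <stdio.h>

    /* Scale the 9 g tea : 175 ml water starting ratio to any dose of tea.
       Since 1 ml of water weighs ~1 g, the scale should read roughly
       tea grams + water milliliters once both are in the mug. */
    double water_ml_for_tea(double tea_g) {
        return tea_g * 175.0 / 9.0;
    }

    int main(void) {
        double tea = 9.0;
        double water = water_ml_for_tea(tea);
        printf("%.1f g tea -> %.0f ml water (scale reads ~%.0f g total)\n",
               tea, water, tea + water);
        return 0;
    }
    ```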

    As I mentioned in the previous article, one of the major reasons to use an insulated mug as a “gaiwan” is for its heat retention properties. Very little heat escapes the mug while making tea, maintaining the water at a stable temperature for the entire duration of the brew. My understanding is that certain kinds of teaware are especially prized for this property, but it’s almost impossible to beat vacuum-insulated steel in this race!

    Of course, it’s great that you can just throw this mug into your backpack or suitcase and not have to worry about it breaking or weighing you down. And since Klean Kanteen is such an entrenched brand, you can even find a number of accessories for it.

    The Klean Kanteen 8oz insulated tumbler: highly recommended as a surrogate travel gaiwan!



    We all know that Bluetooth has an abundance of flaws, ranging from frustrating latency to arcane pairing rituals. By many measures, it still feels like a technology stuck in the early 90’s. And yet, once you’ve experienced the freedom of going wireless, it’s terribly hard to go back to the old ways. Reaching to unplug your headphones when leaving your desk, only to realize you can simply walk away? Bliss!

    For several years, I’ve been on the lookout for a Bluetooth mouse that could also be used for non-casual gaming. At minimum, the mouse needed to be on par with my trusty MX 518 at 1600 DPI and have little to no latency. Unfortunately, the vast majority of reputable Bluetooth mice capped at around 1000 DPI and had a reputation for being a bit laggy. The Razer Orochi was one of the few models that supported high DPI over Bluetooth, but it was a cramped little thing and felt rather unpleasant to use.

    There were a few wireless gaming peripherals that used proprietary USB adaptors to improve performance, including my latest mouse, the Logitech G602. This model did what it said on the tin, but despite the praise it garnered from gamers, I actually ended up somewhat disappointed with it. The USB receiver was pretty weak and would routinely cut out if you moved more than a few feet from the port. The fact that you had to use the receiver at all meant that you still used up one of your USB ports, which was a significant setback when your main computer was a two-port MacBook. (Hubs helped, but not while trying to use two USB-powered hard drives at the same time!) I was also unimpressed with the design and build in general: the side buttons were unpleasant and hard to press, the sensitivity toggles got in the way of your clicking finger, and the scroll wheel felt a bit mushy. After using it for about a year, I ended up happily switching back to the MX 518.

    Recently, I’ve been working more in cafés, and the endless dance of the wire once again started to irk me. I thought about getting an extra, cheapie Bluetooth mouse for working on the go, but I try to only buy things that can be stuffed into a suitcase even if I’m not actively traveling — and unfortunately, my eGPU was now reserving a large chunk of that real estate. Two mice was just too much, so I decided once again to take a peek at the high end of the Bluetooth peripheral market.

    Logitech had two new headlining models in this category, the MX Master and MX Anywhere 2. These were clearly top-shelf items, sporting sleek designs, several color choices, and Logitech’s free-spinning MicroGear Precision scroll wheel. Interestingly, they also reached 1600 DPI and shared the ability to connect to Bluetooth or a Logitech Unifying USB receiver at the user’s discretion. Based on my experience with the G602, I figured Bluetooth might be handy for everyday use while the USB receiver would work great for lag-free gaming. Were these the first Bluetooth mice that could actually satisfy my criteria? I had to give them a spin!

    Eventually, I got my hands on both models and did some side-by-side testing. The result? The MX Master was love at first touch, fixing almost everything I hated about the G602 and even adding a few extra features to win me over. Meanwhile, the MX Anywhere 2 was marred by one awful design decision and just felt too small for ergonomic comfort.

    Below is a discussion of several aspects of these mice that haven’t been covered in most reviews, including handfeel, clickiness, gaming use, and latency measurements.

    MX Anywhere 2

    The MX Anywhere 2 is a cute little mouse. Some reviewers have been comfortable switching to it as their primary work mouse, but in my testing, I found it just a bit too small. This is definitely a travel mouse in form and function. The weight, however, is great for usability. It’s hefty enough to stick a little to the mousepad without losing its high mobility.

    Click-wise, the two main buttons feel pretty good, while the rest aren’t particularly notable. I was happy that the side navigation buttons were fairly normal sized compared to the scrunched side buttons of the Master. The coating feels grippy but maybe a tiny bit less premium than I’d hoped.

    Clicking every button on the MX Anywhere 2.

In case you’re not aware, many Logitech mice now feature a scroll wheel that can also be clicked side-to-side. In reviews of Logitech mice, I often see praise for this sideways-clicking mouse wheel, going as far as to call it a “premium feature”. But I think I’ve come to realize that most people just don’t use their middle click all that much. Me? I’m a compulsive middle-clicker. I use that button for everything. New links. Closing tabs. Panning. Reloading. In fact, it’s possibly the second most important button on my mouse! Unfortunately, sideways-click cripples this button thoroughly, making it rattle from side to side with every minor push.

    If I otherwise loved the Anywhere, I figured I could get accustomed to this annoying hardware quirk. But Logitech really screwed up the wheel here. Incomprehensibly, there’s no middle click; instead, you get a tiny button right below the wheel that could be rebound to this function. (By default, it serves as the “gesture” button, which lets you show Exposé and whatnot.) The wheel itself, when depressed, mechanically toggles between traditional ratchet and free spin modes for scrolling, resulting in a heavy, chunky “clunk” that feels like you’re squishing something deep inside the mouse’s guts. Is there any other Logitech mouse that behaves this way? The middle-click has been a staple feature on mice since the 70’s, so why is changing scroll wheel modes suddenly more important? Considered together with the usual sideways-click complaints, this scroll wheel disappointed me in practically every respect.

    A demonstration of the janky scroll wheel.

For a while, I tried rebinding the tiny button below the wheel and the sideways-click buttons to middle click. It felt OK… in the sense that I could probably get used to it over time. But I knew I’d never be happy with this compromise, and it’s what ultimately pushed me to give the Master a try.

    MX Master

    I try to surround myself with pretty things, and so I’m delighted that tech companies have started injecting fashion into even their most pragmatic product lines. Both MX models come in black, navy, and white (“stone”). I liked the idea of white in honor of my old favorite Microsoft Intellimouse, and it’s the color I chose for my initial Anywhere purchase. But seeing it in person didn’t impress me as much as I had hoped. It was attractive but a little business casual, and in any case, it didn’t mesh with my recent black Logitech K380 keyboard purchase. (Peripheral matching… I know, I know!) So I decided to seek a different color with the MX Master.

    Between the other two options, navy looked svelte in pictures while black appeared to have some ugly beige accents that screamed “HP peripheral”. And yet… Amazon Prime Now had a promotion going where I could chip $10 off the purchase of just the black model, bringing the price down to a mere $50 and delivering it the very same day. Meanwhile, navy would cost me close to $70 and arrive several days later! Friends, I must admit I did not pass the marshmallow test on that day.

    Fortunately, this turned out to be a great decision: the black model looks fantastic in person. Despite what the photos might show, the accents are actually not beige at all but more along the lines of Apple’s space gray, perfectly complementing the darker matte gray of the body. In addition, the mouse buttons have a slightly different sheen from the rest of the mouse, giving them a pleasing emphasis under certain lighting.

    As most reviews have stated, the ergonomic comfort of this mouse is close to perfect. You lay your hand down and it feels like it was sculpted just for you. What’s more, the main buttons feel incredible to click — perhaps more so than any other mouse I’ve used, including the Anywhere! Seriously, I can’t stop clicking these buttons.

    Clicking every button on the MX Master.

    The Master’s sideclick-less scroll intrigued me when I first saw it. Most Logitech mice either feature sideclicking and free spinning together, or otherwise just throw in a plain old scroll wheel and call it a day. This was the first mouse I saw that omitted the sideways-clicking while still retaining the free spin mode, which was a very desirable feature in place of the inertial scrolling you’d get with the Apple trackpad. Prior to handling the Master, I figured this setup might finally allow me to have an uncompromised middle click while still benefitting from Logitech’s fancy scroll wheel tech. And… that’s exactly what happened! The middle click on this mouse feels excellent, to the point where it’s very nearly as pleasing as the main buttons. (There’s a slight bit of wobble before the click itself, but I don’t think that can be helped on account of the complex scroll wheel mechanism.)

    My main issue with this mouse is the very poor layout of the back and forward buttons. I use these buttons quite frequently for navigation, and I miss the old Intellimouse days when the side buttons were enormous and clicked just as well as the main buttons. Here? The buttons are quiet and super annoying to reach and differentiate. Why couldn’t they have spread them out just a little bit? The horizontal scroll wheel feels nice, but I frankly don’t see myself getting much mileage out of it, especially now that I’ve learned you can simply Shift-scroll in OS X to get native horizontal scrolling.

    There’s one hidden button on this mouse: the “gesture” button, which can be activated by smashing down on the mesh pad next to your thumb. Unlike the other buttons, this button feels mushy and difficult to press, similar to those membrane buttons you find on cheap remotes. I guess they had to design it this way to avoid accidental clicks, but I wish they thought of something else or eliminated it altogether. I’ve been trying to use it as a surrogate back button instead of the tiny default one, but it’s not particularly pleasant or responsive to use. Oh well.

    Weight-wise, this mouse is pretty hefty, but not overbearing. Still, I’ll have to get used to the inertia compared to my MX 518, which barely feels like it has any weight at all.

    The MX Master in regular use.

    I was worried when I was first looking at this mouse that it would just be a minor iteration on the G602, but these fears have been unfounded. The Master fixes every problem I had with the G602 (aside from perhaps the weight) and adds a bunch of great features to boot. I feel immediately at home with this device.

    Common Issues

    There are a few issues common to both mice that should be addressed.

    Both of these mice can be used while charging, but they don’t register as USB devices even when directly connected to your computer. You still have to use them via Bluetooth or the Unifying receiver, which means that there’s no zero-latency mode. In practice, as I demonstrate below, the mice are pretty darn close to lag-free. Most people didn’t consider wireless-only to be an issue with the G602, and I don’t see it as an issue here either. (The feature would have been appreciated, though.)

    Second, there’s some scrolling weirdness, which seems to be a mix of OS issues as well as user habits. On the OS side, when smooth scrolling is enabled in Logitech Options, it doesn’t always seem to work right. Fairly frequently, you get some weird acceleration or momentum before things get going. (Both OS X and Windows have this issue, though manifested in different ways.) Most unfortunately, the wheel in free spin mode doesn’t seem to have a 1:1 mapping to page scrolling, which (ironically) feels a lot less physically correct than using the trackpad. I think I could get used to this behavior, but my ancient MX 518’s scrolling felt more natural. In terms of habits, if you’re used to trackpad momentum scrolling in OS X, you’ll be surprised when you’re free-scrolling a page and then find other pages continuing to scroll upon switching windows! It might take a while to internalize that the mouse has a mechanical component that needs to be stopped before switching tasks.

    These mice worry me a little with their reliance on mechanical trickery. On the MX Master, whenever the lever (or whatever it is) stops the wheel when switching to ratchet mode, I can feel the entire mouse shudder slightly. At least one user has already demonstrated that this part can get stuck. (This has apparently been quietly fixed by Logitech.) How long will it take for the mechanism to break or wear out? Fortunately, Logitech has an exceptional warranty department, so I don’t doubt that they’ll send me a replacement if anything bad happens. Still, I don’t like the idea of having to pamper my mouse.

    The Unifying receiver, unfortunately, tends to have a very short range if there’s any sort of interference nearby. (For example, I can hardly move the mouse a foot away if a Thunderbolt cable is attached to the port next to the receiver. Or maybe it’s the eGPU itself?) As a result, I’ve resorted to plugging the receiver into a USB extender. With Bluetooth, this is not an issue at all, so it comes up fairly infrequently.

    Latency

    Now, for my personal dealbreaker with wireless mice: latency. I had a bit of a misconception when I first set my eyes on these two MX models. My assumption was that the Unifying receiver was identical to the one used by my G602, meaning that the adaptor would be highly optimized for reduced latency. But according to a Logitech representative, only Logitech’s gaming peripherals used the improved, custom-designed adaptor to get the “precision report rate”, whereas Unifying technology was less fancy and reserved for use with the business lineup. My question was: did “precision report rate” only refer to the actual polling rate, or were the gaming adaptors also less laggy? In other words, was I missing out with my Unifying receiver?

    I knew I wouldn’t have peace of mind until I had solid numbers, so I decided to measure the latency myself. There were two data points I needed to capture: the moment the mouse started moving, and the subsequent moment that the computer registered mouse activity. Both actions had to be on the same clock. My iPhone’s camera could record at 240 FPS, so precision wasn’t an issue; the problem was that my laptop display only refreshed at 60 Hz, meaning that I couldn’t rely on a recording of the screen alone to figure out how fast the mouse signal was going through. (There was only one display frame for every four video frames.)

    I ended up writing a small, single-window Mac application to help me along. On the left side, the window has a running millisecond timer, refreshing at the exact frequency of the display. This gave me the precise timestamp of each display cycle. (Well — with a possible delta of 1 frame or ~17ms, depending on how the labels spaced out their updates under the hood. But I was only interested in relative latencies between the mice, not the absolute latency, so the only important detail was that this offset was consistent.) The app also captured the timestamp for the precise moment mouse movement was first detected. This was displayed in a label on the right side. Both timestamps were generated using the same time function, CACurrentMediaTime.

    Next, I placed a mousepad next to my display along with a small box to evenly and consistently push the mouse along. I set up my phone to show both the laptop display (with the timer app running) and a side view of the mouse and box contact point. I filmed three trials each of the MX 518, MX Master with the USB adaptor, and MX Master in Bluetooth mode, resetting the righthand timer between each trial.

    Finally, I went through the videos frame-by-frame in VLC. (The ‘e’ key: highly convenient!) The left timestamp was used to determine the exact moment when the mouse started moving. If the movement occurred between two timestamps, I could simply interpolate the precise value based on which intermediary frame out of four I landed on. After that, I noted the righthand (“mouse was detected”) timestamp and did a bit of math to arrive at the latency value. Perhaps not a perfect system, but as accurate as I could manage with the tools I had at hand!
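The interpolation and subtraction above can be sketched in a few lines of Python. The frame rates match my setup (240 FPS video, 60 Hz display), but the sample timestamps are made up purely for illustration:

```python
# Hypothetical sketch of the latency calculation described above; the
# variable names and sample timestamps are illustrative, not the real data.

DISPLAY_HZ = 60.0   # laptop display refresh rate
VIDEO_FPS = 240.0   # iPhone slow-motion capture rate

def estimate_latency_ms(timer_at_refresh_ms, frames_after_refresh, detect_ms):
    """Interpolate the moment the mouse started moving, then subtract it
    from the timestamp at which the app first registered mouse movement.

    timer_at_refresh_ms: on-screen timer value at the last display refresh
                         before the mouse visibly moved
    frames_after_refresh: how many 240fps video frames after that refresh
                          the movement first appears (0..3 for 60Hz/240fps)
    detect_ms: the app's "mouse was detected" timestamp
    """
    video_frame_ms = 1000.0 / VIDEO_FPS          # ~4.17 ms per video frame
    start_ms = timer_at_refresh_ms + frames_after_refresh * video_frame_ms
    return detect_ms - start_ms

# Example: timer read 1000 ms at the refresh, motion appeared 2 video
# frames later, and the app registered movement at 1063 ms.
print(round(estimate_latency_ms(1000.0, 2, 1063.0), 1))  # 54.7
```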

    The results were: 55ms/58ms/50ms for the wired MX 518; 63ms/74ms/51ms for the MX Master in USB receiver mode; and 70ms/58ms/68ms for the MX Master in Bluetooth mode. (Keep in mind that these values were not a measure of absolute latency and were only meant to be compared to each other, since the test did not deduct OS latency, monitor latency, etc.)
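For reference, here’s a quick sketch averaging the three trials per configuration, using the numbers listed above:

```python
# Mean latency per configuration, from the trial values listed above.
trials = {
    "MX 518 (wired)": [55, 58, 50],
    "MX Master (USB receiver)": [63, 74, 51],
    "MX Master (Bluetooth)": [70, 58, 68],
}
for name, ms in trials.items():
    print(f"{name}: {sum(ms) / len(ms):.1f} ms")
# MX 518 (wired): 54.3 ms
# MX Master (USB receiver): 62.7 ms
# MX Master (Bluetooth): 65.3 ms
```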

    To my great surprise, not only was wireless latency very close to wired (~55ms wired vs. ~65ms wireless), but Bluetooth was practically as performant as the USB receiver! I don’t know how Logitech managed it, but somehow the Bluetooth performance of these mice is nearly flawless, to the point where perhaps the dongle is basically unnecessary. (Except for edge cases like BIOS use.) You could make the argument that wireless performance is less consistent than wired, but I’d need to do more tests to figure this out. (And it’s probably more effort than it’s worth.)

    So is 10ms of lag a dealbreaker when it comes to precision gaming? I strongly suspect it won’t be noticeable — especially given how much latency already exists in the long chain from mouse to display — but I’d love to see some empirical evidence backing this up.

    Gaming

    There’s some mild consternation surrounding these two MX models when it comes to gaming. Whenever people ask, some enthusiast always shows up and levies the following grievances against them:

    • They have no wired mode, and thus always feature some latency.
    • They have built-in acceleration and angle snapping.
    • They only poll at 125 Hz.
    • They only go up to 1600 DPI.

    In contrast, they suggest, Razer and Logitech themselves make gaming-tailored wireless mice (the Logitech G900, or the new Razer Lancehead) that go up to 12000 DPI at 1000 Hz and use proprietary receivers for optimal performance. All technically true! However, the mouse I’ve loved the longest, and gamed the most with, has been my trusty MX 518, a classic model popular with gamers even today. And it turns out that this mouse also has built-in angle snapping, also only goes up to 1600 DPI, and also polls at a mere 125 Hz. The horror!

    In practice, none of these quirks are dealbreakers. 1600 DPI is more than enough for the vast, vast majority of people; it was a high standard a decade ago and accuracy-per-inch demands in humans have not suddenly spiked during that time. (DPI is more of an issue with enormous monitors and insane resolutions, but it doesn’t matter for my use case.) Same goes for 125Hz polling, which is effectively 2x the refresh rate of most monitors. On top of that, you’ll get about 30x less battery life (30 hours vs. 40 days!) with gaming wireless mice — not to mention losing all the benefits of Bluetooth. Unless you’re a pro, I don’t think it’s nearly worth the tradeoff.
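A quick back-of-the-envelope check on the polling claim (my arithmetic, not anything from a spec sheet):

```python
# How many mouse reports arrive per monitor frame at 125 Hz polling?
poll_hz = 125.0      # report rate of the MX Master and MX 518
refresh_hz = 60.0    # typical monitor refresh rate

report_interval_ms = 1000.0 / poll_hz    # 8.0 ms between reports
frame_interval_ms = 1000.0 / refresh_hz  # ~16.7 ms between frames

print(f"{frame_interval_ms / report_interval_ms:.2f} reports per frame")  # 2.08
```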

    However… I have to admit that something about these mice definitely feels off when playing FPS. Side-by-side with the MX 518, the difference is immediately noticeable. With the 518, I feel like I’m directly inside the character’s head. With the Master, there’s a bit of a “cockpit effect”, or a very subtle sense that my movements aren’t perfectly mapped to the camera. Accordingly, things like rapid 180 degree turns and snap-shots feel more hesitant and unnatural. For a while, I assumed this was due to wireless latency, but my experiments showed that this was unlikely to be the case. (Besides, my setup was a mess and there was plenty of latency coursing through the system already.) I also thought it might be the weight of the Master, but no dice: the Anywhere had the same issue at half the weight. So my working hypothesis is that this issue is caused by some subtle differences in mapping of mouse movement between the Master and the 518, meaning that I’ll have to reprogram my brain a little before I’m fully comfortable with it. (I think I could also customize this curve in software using various third-party tools, but this might be too finicky even for my tastes.) I actually remember having this exact response to the G602, so maybe that 10ms does make a critical difference in FPS gameplay after all? Or perhaps the G602 shares its motion curve with the Master? Who knows! Will have to do a bit more digging.

    Still, take the above comments as the nitpicks of a reasonably skilled FPS player who’s been at it since the 90’s. The Master absolutely works for gaming in general, and I spent several fun hours playing Overwatch and Lawbreakers using both the receiver and Bluetooth.

    (I would like to additionally measure the latency of the G602 and see how it compares to the Master. Will eventually post results in this section.)

    Conclusion

    The MX Master is as close to a perfect all-arounder mouse as I’ve used over the past few years. Sure, there are still a few details I’d love to see changed, and I actually think the M720 would be even better if only it went up to 1600 DPI. Nonetheless: if I can get accustomed to competitive-ish gaming on this thing, I think I can finally retire my MX 518 and join the world of wireless for good!

    Finally, if you’ll be traveling with this mouse, I recommend grabbing a Hermitshell case.


    0 0

    Blog comments are out, blog responses are in — and so I thought I’d respond to John Gruber’s recent article titled “Headphone Jacks Are the New Floppy Drives”. Here’s why I think removing the headphone jack would be a bad idea at this moment in time:

    • Poor wireless options and standards. I use Bluetooth headphones and I love them, but they’re a world of compromises. Audio quality is far from lossless, and not just because of the codecs: with the sound off, you are likely to hear noise and static from the radio right next to your ear. (This does not bother me, but would drive many people crazy.) Switching between devices is a pain. Pairing is a pain. You have to remember to charge them. There is unbearable latency for games and occasionally even movies. Few audiophile-level headphone makers bother with Bluetooth headphones, leaving us with just the consumer brands. They can only be as powerful as the battery-powered driver. Might Apple introduce a new wireless codec that tackles all of these pain points? Sure. But then we get:
    • Vendor lock-in. Apple Wireless or Lightning headphones wouldn’t be compatible with much else. Not a problem for cheap earbuds, but definitely a big deal for high-quality, $400+ headphones. After years of freedom, audio would be siloed. As Gruber mentions, this is in Apple’s best interests; but among all our gadgets, headphones have always been among the most universal and independent. They are a true analog path between our disparate electronics — an intuitive and surprisingly error-free technology in a world where devices routinely refuse to talk to each other. You wouldn’t find yourself spending an hour helping your mom troubleshoot the headphone jack. This change would be a major pain point, especially when it comes to:
    • Loss of plug-and-play. I constantly move my headphones between my phone and my laptop. Bluetooth can sort of do this, but it always takes me about a minute with my wireless headphones. With Lightning headphones, it wouldn’t even be a possibility. (Barring Lightning-endowed Macbooks, which would be utterly bizarre. What else would that port be used for? How would it be differentiated from USB-C?) A once-flexible workflow would be completely subverted.
    • Needless complication. Headphones are a very simple thing: just a wire leading to drivers. Very few things can go wrong in this arrangement, as evidenced by the proven durability and versatility of headphones over the past few decades. Headphone makers have gotten really good at working with these few parameters to create truly world-class audio devices. Indeed, some of the most esteemed headphones in the low-end audiophile space (I’m thinking of Grados) are basically glued together by hand in a workshop. If we start shoving more electronics — Lightning circuitry or a DAC, most obviously — into headphones, we make this proven system far more brittle than it needs to be. Headphones will malfunction in frustrating ways. Noise will be introduced. Designs will become more bloated to accommodate the extra circuitry. Every headphone having its own DAC is like every monitor having its own video card: clearly putting technology on the wrong side of the divide.

    What is all this for? What do we gain in return?

    In the past, every time a prominent piece of technology was removed from my Apple hardware — most recently the CD drive and the Ethernet port — my response was ambivalent because I had already been happily using the alternative for a while. Wi-Fi, despite its flaws, offered countless advantages over Ethernet, leading to rapid adoption. Steam, iTunes, and Netflix had made me almost forget that CDs were still a thing by the time I got my Retina Macbook Pro. It almost goes without saying that these technologies were standard and universal — nobody would have accepted them otherwise. But there’s no Next Best Thing in headphones. This is an entirely artificial change.

    Were there an existing high-quality wireless standard for headphones, I’d be somewhat on board, especially if the phone could be waterproofed in exchange. But we’re not there yet, and I fear that in this instance, Apple is looking out for their corporate interests instead of their users. When Apple removes features, I can usually envision the “better tomorrow” they’re striving for. Here, what future can we look forward to if we’re all using bloated, proprietary, and fragile headphones that sound like garbage?

    I can already hear the cry that “the average consumer won’t care”. Sure, maybe not. But their listening experience wouldn’t really be improved by the change, their options for audio hardware would become a lot more limited, and their lives would become riddled with new minor frustrations. The “average consumer” doesn’t care about typography, True Tone displays, or Retina graphics, either. But it all adds up. I respect Apple because they’re internally motivated to strive for quality, and a move towards pointless proprietary standards — towards profit-driven mediocrity with the “average consumer” as a scapegoat — would be a sad blow to that image.

    There’s a good chance I’ll keep buying iPhones without a headphone jack, but also a 100% chance I’ll end up carrying a 3.5mm adaptor wherever I go. One more thing to lose. A permanent ugly tail sticking out of Ive’s immaculately-designed round rect.

    Good work, team?


    0 0

    Composer’s Sketchpad 1.2 is out! This is a major update with several new features, including audio export (via AAC), a new tool for shifting notes along the time axis, and a one-finger drawing mode. I figured this might be a good opportunity to write about something a bit more on the creative side: icon design!

    Having no practical design experience, I am very proud of the icon I created for Composer’s Sketchpad. A good icon is absolutely essential for marketing, so most app developers would recommend contracting out this delicate task to a real designer. But I’m stubborn: one of my higher-level goals in creating Composer’s Sketchpad was to get better at art and design, and I wanted the icon in particular — the thesis of my app! — to be my own invention.

    Going along with the idea that creativity flourishes under harsh constraints, these were the requirements I laid out for the icon:

    • It had to feature a reference to music.
    • It had to hint at the functionality, aesthetics, and interface of the app.
    • It had to roughly align within the iOS 7 icon grid while somehow subverting it.
    • It had to exhibit some dimensionality and flow. I didn’t want it to look flat or overly vectory.
    • It had to be logo-like: symbolic, bold, and simple.
    • But most importantly, it had to immediately catch the eye. As a frequent App Store customer, I knew well enough that even a slightly uninteresting app icon would warrant a pass, while an interesting icon might make people peek at the app description without even knowing anything about it. The icon was absolutely critical to my passive marketing. It was my calling card — the entirety of my app wrapped up in 512×512 pixels. No pressure!

    Weeks before starting work on the icon, I began to keep tabs on other app icons that I found interesting. I was already following musicappblog.com religiously for music app news, so I scoured their archives for inspiration. I also carefully looked through all my home screens as well as the App Store top charts for non-music influences. In truth, even among the cream of the crop, there weren’t many icons that I outright loved. Most of the ones that caught my eye kept things relatively simple — outlines, primary colors, subtle gradients — while preserving the circular motif of the iOS 7 icon grid. (Many of these happened to be Apple icons.) There were also plenty of icons that failed at either extreme, either by cramming too much color and detail into the tiny square, or by not providing nearly enough detail to make a minimalist design stand out.

    A few app icons I would consider eye-catching.

    Inspiration in hand, I first made a number of rough pencil sketches, most of which depicted a prominent musical note with some embellishment. Quality was not a concern at this point: I wanted to jot down as many ideas as possible even if they didn’t seem to hold much promise. In the midst of this process, I found myself feeling fairly ambivalent towards most of the designs I came up with, though I knew they could probably be moulded into something that followed my rules. Something about them just didn’t feel right.

    I still didn’t have much of a sense how far my nascent design sensibilities could take me, and part of me started to give up hope of finding the perfect design. But when I came up with the sketch for the final swirly-tail icon (after running a few ideas by my folks — спасибо, мама!), everything suddenly clicked. I knew right then that this particular design would perfectly slot into the narrow niche defined by my requirements. For the first time, I thought that maybe I could pull this off!

    After making a few passes at the basic shape in pencil, I moved to the computer. My first attempts at a colored draft were very static. Doodling in Pixelmator with my Wacom tablet got me effectively nowhere, so I decided to just work in Illustrator directly — my first real stint with the software. As was typical with Adobe, the UI felt like a sprawling, bloated mess, but it also allowed me to do some surprisingly powerful things. The most important discovery was the non-destructive transforms — particularly for the Pathfinder — in the inconspicuous “fx” menu at the bottom of the Appearance tab. With these tools, I gained the ability to perform boolean operations on sets of shapes, turn strokes into paths, and create complex gradients while still having full control over the constituent parts. Doing this across complex groups of layers wasn’t pretty, but it allowed me to freely experiment with new shapes without having to “bake” a final result and start all over again for minor adjustments.

    I’m sure experienced vector artists can use Illustrator to draft their ideas directly, but my process, as a beginner, was much more methodical. I started with the standard iOS 7 grid and drew a simple circle over the outer part. I typed an 8th note symbol in the center and looked through many fonts to find a pleasing shape for the flag. I rendered the note as a shape, added a scan of my freehand sketch in the background, and started dissecting the circle; it was split into several sections to make joining with the note flag a bit easier. After placing a connecting Bézier curve between the flag and the circle, fiddling with the control points to match my sketch, and adjusting the width to smoothly blend the circle and the flag, I had an outline that roughly matched my paper drawing. For this first pass, the rest of my time involved zooming out and adjusting the widths and tangents to make sure that everything looked smooth and contiguous.

    Some early experiments.

    Designing the colorful swish at the tail end of the circle came next, and it turned out to be the trickiest part of the process. I knew that this segment of the icon had to have flow, levity, and dimensionality without looking too realistic or skeuomorphic — and yet I couldn’t picture it in my head. I started with a simple 3-color gradient at the end of the circle that widened towards the bottom. This looked merely OK, but it felt too static. Adding more colors to the gradient and moving the left side of the tail into the circle helped, but it wasn’t enough.

    The first problem was nailing the outer curve of the tail. I tried many different shapes. Some looked like paintbrushes; some evoked waves; some resembled sand dunes. But none felt perfectly right. My “aha” moment was when I realized that I was subconsciously creating an Archimedean spiral with its origin at the note flag. I borrowed a spiral from Google Images and adjusted my curves to fit it. The shape finally came together.

    Next came the colors. I learned that I could add more control points to the bottom of the gradient envelope, allowing me to roughly specify the curve of each vertical slice of the gradient. The next few iterations involved creating an almost cloth-like shape out of the envelope and fiddling with the blur between the gradient colors. Still, the distribution of the gradient stripes was unsatisfactory. No matter how much I adjusted the gradient distribution or the control points of the envelope, the swirls felt too busy at the origin and too lopsided further towards the bottom.

    Rough drafts closer to the final icon.

    I realized that what I wanted was precise control over the “density” of the gradient envelope, top to bottom. Hoping that Illustrator contained within its multitudes the solution to my problem, I Googled around and was elated to discover that I was correct. The Gradient Mesh tool, though a bit tricky to set up, allowed you to apply a gradient to a flexible rectangle with an inner Bézier-based grid. I could now adjust the precise distribution of color throughout the entire length of my tail!

    There were still some shape-related questions to answer, the most important being: how do I maintain the legibility of the note and circle? The tail was supposed to be in the background; above all else, I didn’t want the shape or colors of the tail to interfere with the appearance of the note. Initially, I assumed that the left edge of the tail (touching the blue stripe) should avoid the note head entirely by going under or above it. However, both options made the tail look misshapen and unattractive, ruining the wave effect. On a whim, I tried intersecting the note head with the edge and it worked! Instead of disrupting the legibility of the note, the line drew the eye to it. I also had concerns that the gradient would make the lower part of the circle hard to see, but this was easy to fix by simply following the shape of the circle with the red stripe.

    Finally, I wanted to make sure that each curve in the tail — the left edge as well as each dividing color line in the gradient — “rhymed” with the overall shape of the icon. The final curves were mostly determined by trial and error. Just as with my initial sketch, I “knew” as soon as I saw the winning arrangement that I had found an inflection point for my design. There was a strong sense of motion originating from the note flag, carrying through the circle, and spiraling back around into a colorful background wave. Even though I couldn’t picture it at the time, it was exactly the effect I was hoping for when I originally came up with the design!

    (I wish there was more to say about color selection, but in truth, it was done quickly and somewhat haphazardly. The background blue was derived from the primary color of my app, while the gradient colors were basically chosen on a whim.)

    The final curves of the gradient mesh.

    For the Lite version, I once again wanted to stick to App Store conventions while defying them just a bit. Most Lite icons have an ugly banner over the default icon that feels out of place with the rest of the design. I still wanted to have the banner for consistency, but I wanted it to work with my diffuse pastel aesthetic.

    First, I had to determine banner placement. I tried several of the usual positions and then quickly rejected them; they blocked off too much of the underlying icon. I then decided to give the diagonals a shot and discovered that the upper-right corner had several benefits: not only did it preserve visibility for the key parts of the icon, but it also complemented the motion of the circle while allowing some interesting colors to peek through. (Assuming some translucency, which felt likely.)

    Next, I had to find a good look for the banner. (This iteration was done in Photoshop, since its raster effects were far better than Illustrator’s.) A simple color fill felt too out-of-place, so I decided to try for an iOS-7-ish Gaussian blur; ideally, I wanted a bit of the white outline and some of the tail colors to show through without compromising legibility. To make it easier to pick the final position, I masked the banner onto a blurred render of the underlying icon, which allowed me to treat the banner as if it were simply a blurry window and move it around freely. It didn’t take long until I found a satisfying result.

    Drafts for the Lite version of the icon. The final icon is on the right.

    That’s about it! Against all my expectations when I started on this journey, I’m still pleased by my icon whenever I catch it on the home screen even half a year later. There are certainly changes I could still make — there’s not enough contrast, the colors aren’t perceptually balanced, the gradient divisions are still a bit lopsided and the origin of the swirl needs some work — but I would consider these nitpicks. The gestalt of the design is right.

    (And as an unforeseen bonus, the icon easily converted into a black-and-white stencil for the promo poster a few months later!)

    If there’s a foremost design lesson I took away from all this, it’s how many moments of inspiration occurred whenever I deviated from incremental adjustments and tried something more extreme. Adding a bit more curvature to a line didn’t yield any new insights, but turning it into a semi-circle gave me a completely new perspective on the shape. Changing the brightness slightly didn’t produce a satisfactory color palette, while ramping the slider all the way made me rethink my initial assumptions about the chromatic balance. It seems that if you’re stuck in a design rut, it can be a good idea to vastly overshoot and then dial back instead of inching towards an optimal design with minor, conservative changes.

    Ultimately, it felt wonderful over the course of this project to engage with my creative side — a part of myself that I still consider a mystery. Every time a design decision “clicked”, it felt like a little miracle. No doubt this will only reinforce my stubborn desire to do all my own art in future projects!


    Indie App Reliance (08/21/16, 17:01)

    Today, with a single tweet, the note-taking app Vesper has officially been shuttered. At its release, Vesper was widely promoted by the Apple indie developer community as the hot new thing to try. More than anything else, it had an excellent pedigree, with influential blogger John Gruber of Daring Fireball at the helm. Many hopeful users switched to it for their primary note-taking needs, expecting that features like Mac support would arrive in short order. If any app from this circle was destined to be a breakaway hit, it was this one. And now, with barely a mention, it’s all but swept away, after languishing for years with barely an update.

    This is not a post about why Vesper ultimately failed. There are plenty of others who will happily chat about the rusty economics of the App Store. Instead, I want to focus on the other end of this unfortunate feedback loop: the effect that these highly visible app shutdowns might have on App Store customers.

    Several bloggers have expressed curiosity as to why public interest in the App Store has waned so much. I can’t answer for everyone, but at least within myself, I’ve noticed an increasing and persistent reluctance to try new apps. It’s just that I’ve seen the same pattern crop up over and over again. Somebody releases an interesting new app, touting fantastic design and improved productivity. The app gains some (but not overwhelming) traction. The app gets a few updates. The app lingers for a few years. And finally, the app untriumphantly rides off into the sunset, taking not just entire years of developer time, but thousands of users’ ingrained habits with it. The case is clear: most apps — and especially indie apps — cannot be reliably expected to continue operating.

    After being burned so many times by products that have been pulled out from under me, I’ve unconsciously adopted a worrying philosophy for trying new apps: unless the app I’m using is backed by a large corporation or is outright open-source, I’m not going to use it for anything particularly important in my life. (And even then, certain corporations — ahem, Google — are put under further scrutiny.) I hate having to do this because many amazing UX advancements can be found in apps produced by smaller developers. (Apple folks love to talk about how certain categories of apps are design playgrounds.) But at the same time, I know that with these apps, there is an inevitable sunset e-mail waiting for me in the not-too-distant future. It’s gotten so bad that I’m starting to seriously consider switching most of the (snappy, beautiful, well-designed) productivity apps on my phone over to their (ugly, clunky) open-source alternatives, just because I know that OpenWhatever will long outlive the current App Store darling for that category. (1Password is one hot spot that immediately comes to mind. Losing them would be a disaster.) I don’t want to worry every day about whether these proprietary silos will suddenly go up in flames with all my carefully-constructed workflows and data in tow.

    Despite the low prices on the App Store, I now get decision fatigue whenever I go to purchase an app. How long is this product going to be around? How reliable is this developer? How easy is it to export the data? How open are all the underlying formats and APIs? The price might be insignificant, but the commitment implied by my purchase is not trivial at all! Unfortunately, developers don’t seem to care much about the mental toll that pulling an app might cause, even when they were the ones touting life-changing productivity and workflow improvements in the first place. It’s one thing I miss about Windows utility software: so much of it is terribly designed, but at least I know it’ll run more or less forever. (Both on account of the open platform and Windows’ amazing legacy support.)

    It’s understandable why developers shut down their apps, but I wish there was another way out of this dead-end. Maybe apps could certify that all their back-end services are provided by external vendors and can be swapped out if necessary. (This is why I’m not too worried about apps like Reeder and Pocket Casts: I know that if they go away, I can take my precious data and switch right over to another app.) Maybe developers could pledge — even with legal backing! — to open-source their software if they ever decide to stop supporting it. Or going even further into this mythical socialist utopia, how about we finally figure out a way to fund open-source software from the get-go without having to beg for donations? With services like CloudKit, it’s no longer even necessary to spend a single cent of your money on servers. What’s the point of bringing something wonderful into the world if it only lasts for as long as people are willing to buy it? I can’t help but see that as hopelessly cynical.

    To be clear: I’m not saying that developers should be expected to support and add features to their apps indefinitely. That would be a very extreme stance. But on the other end, adopting a scorched earth policy for your app once you tire of it is also pretty extreme and poisons the market to boot.

    Apps — products that encapsulate years of people’s lives — should never outright disappear just because a developer can’t be bothered to support them anymore. If we don’t have that assurance, and if we can’t rely on our tools, all we’re doing is playing with toys.



    I have to admit: I’m an analog kind of fellow. Much as I benefit from our growing roster of digital tools, I’m always on the lookout for software that reminds me of reality’s imperfect grit. Fake mechanical clock faces. Typewriter sounds. Simulated CRT monitors! Some might call them skeuomorphic, clunky, or even fraudulent; but in a world increasingly bent on making things shiny and pristine, I enjoy having a reminder of which side of the screen is the more important one.

    The same even applies to my work notes. No doubt, there are immense benefits to limiting your note-taking to professional software like OneNote or Google Docs, starting with obvious features like copy & paste and text search that we all rely on yet take completely for granted. But whenever I undertake a major project, any spare pieces of paper lying around (including napkins, envelopes, and candy wrappers) will inevitably become conscripted as scratch paper, despite the vast universe of affordances on the digital side of the divide. As much as I’ve tried to adapt my thinking to software, the cold, hard truth of digital type just doesn’t represent my thoughts very well. On paper, my ideas become non-linear: sometimes visually grouped with related bits of info, sometimes crawling up the sides, sometimes accompanied by quick sketches and diagrams.

    One day, after failing yet again to trace my line of thought in Evernote, I decided to just give in and buy myself a nice paper notebook. Despite my initial concerns about all the features I was giving up, the switch turned out to be remarkably liberating. I loved to dig around in my backpack for my latest set of notes; feel the ever-growing creases in the covers; flip past all the dog-eared pages. And of course, no app in the world could replicate the joy of jotting down a rickety diagram with ink flowing in its wake! It felt significant that I could hold in my hands a tangible artifact representing the course of my project, and I looked forward to the days when I would exhaust my current notebook and have to go shopping for a new one. The tactile pleasures of this simple thing couldn’t be reproduced by any computer.

    The digital world still beckoned, slightly. Largely spurred by my light-packing travels, most of my other media had become digital at this point. My work notes usually served the role of scratch paper, so I didn’t miss the search feature in Evernote too much. Still, it was a terrible shame that once a project was over, all these notebooks had to be thrown in a closet to gather dust in obscurity. I tried scanning bits of them in, but it was too much of a hassle.

    In September 2015, Apple announced the new iPad Pro along with their brand new stylus, the Apple Pencil. This was a necessary purchase for my work anyway, and so a new hope crossed my mind: could I finally reconcile the analog and digital worlds with this tech? Here was a tablet that could finally act like a digital pad of paper, sporting a glass-welded, ambient-light-adapting screen and a digitizer running at 120Hz with barely any latency. Just a few years before, you needed an enormous desk-sized piece of hardware to do the same thing — and people still complained about the lag. With the Apple Pencil, seemingly everyone (artists and reviewers alike) agreed that it was the closest thing to paper they’d ever tried.

    After receiving my new iPad, I started to search for a very specific kind of software. Plenty of great drawing apps were out already, but I wanted more than that. I wanted an app that would let me collect a roster of virtual notebooks, all sporting different shapes and covers. I wanted each notebook to have a variety of pages, customizable with their own type and texture. Most importantly, I wanted every notebook to exist as an open-format file on my Dropbox. Instead of relying on a proprietary app to access my notebooks, it was critical that I be able to leaf through them — and maybe even edit them! — using other software. It seemed that PDF might be suited for the task; I’d routinely used it for book scans to great effect, and I also knew that the underlying rendering technology — PostScript — was more than capable of displaying any manner of graphic. Perhaps there was an app that fused PDF creation and annotation with just the right amount of magic to make it work?

    There were a good handful of contenders, but three names kept coming up: Notability, Noteshelf, and GoodNotes. None were perfect. Notability had many bells and whistles and was clearly the audience favorite, coming up in any thread where people discussed note-taking. Noteshelf was beautifully designed and seemed to be directly targeting my digital notebook use case, sporting flippable pages and an iBooks-like shelf for your notebooks. Unfortunately, both options felt fairly proprietary. If PDF export was even a feature, it was clearly treated in a throw-it-over-the-fence kind of way: the pristine copies of your notebooks only lived inside their respective app silos. Then there was GoodNotes. This was a subtle app without too much pizzazz and not overflowing with features. It almost resembled an Office-style product more than any of its hipper competitors. But the features it did offer were incredible.

    First, GoodNotes stored your drawings as vectors instead of rasterizing them out. Every line, shape, and highlight you drew was retained as a pristine geometric shape, preserving your work for the ages and offering 100% clarity at any zoom level. Any text that you wrote would automatically get OCR-ed, allowing you to actually search through your notes if your handwriting was legible enough. Covers and pages could be swapped with ease to any image of your choosing, even in the middle of an existing notebook. (Templates such as graph paper and even musical staff paper were included.)

    The UI and gestures were bog-standard iOS, and accordingly intuitive: it took little effort to figure out how to manage your documents and create new content. You still got all the “parity features” offered by other note-taking apps such as typing, shapes, image support, and more, which weren’t particularly relevant to me but still felt like they could be useful on occasion.

    Then there was the kicker. Even though your notebooks weren’t stored directly as PDF, you could opt to encode them into Dropbox right as you were working. The implementation here was simply remarkable. Each PDF produced was, functionally, a lossless copy of your notebook, harnessing the full power of PDF to reproduce every GoodNotes feature in full. Digging around in the generated file internals, I saw that everything was layered just as it was in the app. The background texture of each page was its own asset. Your writing was stored in its original vector form. Even the OCR text was there, hidden from view but layered on top of the original text and searchable using most PDF software. This was as close to a perfect copy of your notebook as you could get, and I felt confident that my data would be perfectly safe were GoodNotes to ever go out of business.

    (It should be noted that there’s one downside to GoodNotes’ vector drawing approach: performance is proportional to the amount of content on the page. In most cases, this isn’t a problem: loading takes no time at all and drawing is lag-free. But if one of your pages is especially dense with translucent lines and complex shapes, it might take a second for everything to tile in. In practice, the tradeoff of having a lossless copy of your data versus fixed performance is well worth it. I can wait a second if it means that my writing will look as crisp 100 years from now as it does today.)

    It’s hard to deny that you lose something in moving away from the physical world. No amount of code will give back the glide of an ink pen across cream-colored paper or the crinkle of a bound old set of pages. But you only trade one kind of magic for another. I can now switch colors and brush strokes in a snap. Mistakes can be erased and even undone with barely a thought. If I need to make a graph or draw some musical notes, I can switch out the “paper” I’m using to almost any other format — or even provide my own. And covers! Whereas in my paper notebook days, I could spend hours window-shopping for a cover with just the right look and feel, the joy of discovering the perfect vector image for the cover of my digital notebook comes very close indeed. Best of all, I now have a digital archive of all my notes without any extra work on my part. On finishing my work for the day, I can peek inside my Dropbox and leaf through a remarkable document featuring every paragraph, graph, and schematic I’d scribbled.

    Today, all my work-related notes are made directly on my iPad using GoodNotes. And it’s just wonderful!



    The category of static visual art is in a bit of an awkward phase right now. Entertainment in the 21st century has evolved to actively engage our minds and senses, to the point where movies, music, games, and even audiobooks require little more than putting on a pair of headphones or fixing our vision to the nearest screen. Where does the immense body of work from genres such as fine art, photography, and illustration fit into this world? Museums — physical beasts that they are — can hardly be visited on a whim, and as of yet there’s (sadly) no Spotify for visual art. Meanwhile, hundreds of amazing works are posted daily on Instagram, DeviantArt, and Reddit. How do we find the time to fit them into our content-saturated lives? And how do we return to view the works we’ve already enjoyed?

    For several years, I wanted to create a sort of “digital museum” that would give me random, on-demand access to this very important side of the art world. The constraints weren’t complicated. All I needed was a large amount of art along with a mechanism that would randomly show me new works from this collection every fifteen minutes or so. But while acquiring the art was hardly a problem, there were relatively few areas in my life where I could idly display images. Screensavers? Showed up too infrequently and weren’t easily controllable. Wallpapers? Couldn’t deal with arbitrary aspect ratios. I thought I had my solution when I ran the Google Art Project in a full-screen browser tab on a second monitor, but the selection turned out to be too limited and I could no longer rely on the luxury of having more than one display when I set out on my travels.

    (As an aside, Clay Bavor solved this exact problem in hardware by creating a digital photo frame that automatically compensated for ambient light. Amazing solution! But I’m a software guy, so…)

    After discovering Chris Tomkins-Tinch’s Artful app which turned your desktop wallpaper into a rotating collection of fine art, I realized that I had given the humble desktop too little consideration. With a simple Gaussian blur, a soft drop shadow, and a sprinkle of magic, it was in fact quite simple to create dynamic backgrounds for images at practically any aspect ratio. But Artful was designed to automatically pull images from proprietary sources, whereas I already had a sizable “inspiration” folder of collected art that I wanted to add to the mix. I also wished to keep my system as clean and simple as possible: Artful interfaced directly with your system preferences, but I much preferred to just keep a wallpaper folder that I’d occasionally drop new images into. And so a new app was born: Backgroundifier, a native converter droplet that let you easily turn arbitrary images into lovely desktop wallpapers.

    Just having this app around increased my consumption of art tremendously. But it wasn’t enough. I wanted to bridge the gap between finding an image on the web and having it appear in my desktop rotation, and I also wanted to be able to show new works of art on a whim. Fortunately, macOS is no slouch! Using Backgroundifier’s command-line mode, Automator, and the native power of Mission Control and Spaces, I’ve finally been able to create the digital museum experience I’ve always wanted.

    Naturally, the process begins with finding the art.

    Where’s the Art?

    Some people want their art carefully curated, and there are a number of existing apps and services for that. (See the aforementioned Artful and the Google Art Project.) Not me, though! I want everything in my wallpaper shuffle: the “great artists” of the past; modern digital and concept art; Russian textbook illustrations; architectural photography. Many of my daily discoveries come from Reddit, and though the site is an awful cesspool in many respects, subs like r/imaginarylandscapes, r/cozyplaces, and r/specart — and even plain old r/art and r/photographs — make it all worthwhile. Whenever I run into an interesting new sub specializing in visual art, I immediately sort by the top posts of all time and pull my favorite images from that list. (Fun tip: if you ever run into an Imgur gallery that you particularly like, you can find a link at the bottom to download the entire collection as a zip! I’ve done this with things like Miyazaki backgrounds.)

    If you’re interested in scouring some of the less savory parts of the web, there are Russian torrent sites featuring comprehensive collections of art from practically any famous artist or museum you could think of. There’s nothing particularly unethical about this approach — a lot of older art is at this point public domain, after all — and it’s quite an experience to drop “The Best of the Louvre” into your background rotation for a week.

    Running every single file through Backgroundifier and plonking it in your wallpaper folder is bound to be a chore. Fortunately, this can be entirely automated using Backgroundifier’s command-line mode and macOS’s native Automator.

    Harnessing the Command Line

    Although Backgroundifier presents a user-friendly GUI, it can also be accessed through the command line. (To see how one can make such a dual-mode app in Swift, you can examine my code here.) One way to do this is to navigate to your Backgroundifier.app bundle in Terminal and run the Backgroundifier executable found in the Contents/MacOS subdirectory. With the standard --usage flag, you can view all the options available to you. (Some of these aren’t even accessible through the GUI!)

    The simplest way to process a file is to run Backgroundifier -i /path/to/input_image.jpg -o /path/to/output_image.jpg -w 1920 -h 1080. Unfortunately, because Backgroundifier is a sandboxed app, you can’t do this for any random directory. Whereas a sandboxed GUI app can expand its sandbox to include any directories opened through the file picker or dropped directly onto it, command line apps (to my knowledge) have no such ability. You can therefore only process files located in your ~/Pictures directory.
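    If you’d like to script this, here’s a sketch of a guarded one-image conversion. (The bundle path, input filename, and output folder below are placeholders for whatever’s on your machine; only the -i/-o/-w/-h and --usage flags come from the app itself.)

```shell
# Assumed default install location of the app bundle.
BGIFY="/Applications/Backgroundifier.app/Contents/MacOS/Backgroundifier"

# Hypothetical input; remember that the sandboxed CLI can only touch ~/Pictures.
input="$HOME/Pictures/starry-night.jpg"
output="$HOME/Pictures/wallpapers/$(basename "$input")"

# Only invoke the executable if the app is actually installed here.
if [ -x "$BGIFY" ]; then
    mkdir -p "$(dirname "$output")"
    "$BGIFY" --usage                                  # list every option
    "$BGIFY" -i "$input" -o "$output" -w 1920 -h 1080 # convert one image
fi

echo "$output"
```

    Wrapping the invocation in a loop gets you a poor man’s batch converter, though the Automator route is nicer for day-to-day use.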

    Fortunately, there’s another way. In the Resources directory of the Backgroundifier.app bundle, there’s a zip file containing a non-sandboxed, Developer ID-signed version of the command line tool. Extract it and you can use it in any directory you please.
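    As a sketch (the zip’s exact name isn’t specified, so it’s globbed rather than hard-coded, and the destination directory is just my own preference), the extraction might look like this:

```shell
# Location of the zipped, non-sandboxed CLI inside the app bundle.
RES="/Applications/Backgroundifier.app/Contents/Resources"
DEST="$HOME/bin"

# Unpack the tool into a personal bin directory, if the app is installed.
if ls "$RES"/*.zip >/dev/null 2>&1; then
    mkdir -p "$DEST"
    unzip -o "$RES"/*.zip -d "$DEST"
fi

echo "$DEST"
```

    From then on, the extracted executable can be pointed at any directory, no sandbox in the way.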

    Magic Folders with Automator

    Automator, macOS’s powerful visual scripting tool, can be used to create so-called “Folder Actions”, or workflows that run whenever the contents of a predetermined directory are changed. As you might expect, this is ideal for file conversion. Below is my Folder Action workflow for automatically “backgroundifying” images into a separate output directory:

    Item 2 contains the path to the output directory and item 3 contains the path to the Backgroundifier command line utility. (They exist as separate items to make the paths easy to modify without having to resort to scripting.) Here’s the full text for the script in item 3:

    # assign paths
    bgify=$1
    output=$2
    
    # remove path arguments
    shift 2
    
    # process images
    for var in "$@"
    do
        filename=$(basename "$var")
        full_output="$output/$filename"
        echo "Processing $full_output ..."
        "$bgify" -i "$var" -o "$full_output" -w 2560 -h 1600
    done
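
    To make the argument handling concrete, here’s the same pattern exercised with a stub standing in for the real Backgroundifier binary (all names and paths here are illustrative). Automator hands the script its two path items first, followed by the changed files:

```shell
# Stub "converter" that just records what it was asked to do.
bgify_stub() {
    echo "convert: $2 -> $4"
}

# Simulate Automator's arguments: tool path, output dir, then input files.
set -- "bgify_stub" "/tmp/out" "/tmp/in/a.jpg" "/tmp/in/b.jpg"

bgify=$1
output=$2
shift 2

# Same loop as the workflow script above, minus the real conversion flags.
for var in "$@"; do
    filename=$(basename "$var")
    full_output="$output/$filename"
    "$bgify" -i "$var" -o "$full_output"
done
```

    The stub prints one "convert" line per input file, each mapping a source path to its mirrored path in the output directory.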
    

    Nothing too complicated! You can find the workflow file here, and I assume you can just drop it into your ~/Library/Workflows/Applications/Folder Actions directory. You can also pretty easily recreate it from scratch: just make a new Automator workflow with a Folder Action document type and copy the items.

    Whenever I find an interesting new image on Reddit, all I now have to do is drag-and-drop it straight from my browser into the designated Input directory on my desktop. macOS and Backgroundifier automatically take care of the rest.

    Dealing with the Desktop

    macOS’s desktop background settings allow us to pick a source directory and change the background to a random image at a set time interval (with 30 minutes being the default). All we really need to do here is drag the output directory from the previous step into our list, select it, check “Change picture” and “Random order”, and set our desired time interval.

    It’s no fun to manually move every window out of the way whenever you want to peek at your wallpaper. Fortunately, there are several macOS-native shortcuts for showing the desktop. One is to use a four-finger trackpad pinch, selectable under Trackpad → More Gestures → Show Desktop in System Preferences. Personally, I prefer binding the action to a keyboard shortcut: Command-Option-down, to go with my assigned Command-Option-left and right shortcuts for switching spaces. You can do this under Keyboard → Shortcuts.

    Some of us are… more messy than others. The desktop can acquire quite a bit of cruft over time, blocking view of the beautiful art below. But why bother cleaning it up when you can just sweep the mess under a rug? If you’re lazy like me, you can toggle visibility for the icons on your desktop by running this simple script:

    #!/bin/sh

    # Toggles desktop icons.
    if [[ $(defaults read com.apple.finder CreateDesktop) = false ]]; then
        echo "Showing desktop icons."
        defaults write com.apple.finder CreateDesktop true
    else
        echo "Hiding desktop icons."
        defaults write com.apple.finder CreateDesktop false
    fi
    
    killall Finder
    

    And voilà! Clutter-free art with hardly a fuss.

    Spaces & Showing New Art

    Here’s where it all comes together. One of my favorite macOS features is Spaces, or virtual desktops. Spaces have an extra hidden benefit for our use case: whenever a new Space is created, its desktop background settings are taken from the previous space. This means that any new Space created in our configuration will automatically arrive with a fresh work of art in tow!

    Whenever you wish to see a new work of art, just pop open Mission Control (in my case, bound to Command-Option-up), create a few new Spaces, and keep switching Spaces to the right. It’s just like leafing through an art book!

    And that’s all it takes to create your own personal art gallery using Backgroundifier. No mysterious system overrides or hacks. No 3rd party tools of unknown provenance. Just a Unix-y converter, an Automator script, and a couple of native macOS features to tie it all together.

    It’s quite a thing knowing that a new, enriching artistic discovery — be it a Picasso, a Van Gogh, or even a Mike From Around The Web — is only a quick peek away!



    My late-2013 15” MacBook Pro’s discrete GPU — an NVIDIA GeForce GT 750M — was pretty good for gaming during the first year of its life. But around the time that the new generation of consoles dropped, AAA games on the PC started becoming unplayable, even at postage-stamp resolutions with the lowest possible settings. I lived on a strict diet of indie games from 2015 to 2016 — thank goodness for well-tuned titles like Overwatch and The Witness! — but the itch to try games like the new Mirror’s Edge and Deus Ex became too great. Initially, I thought it might be time to switch out my MacBook for the upcoming 2016 model, but the winter reveal wasn’t particularly tempting: CPU performance was about the same as mine and the GPU was — at best — 3 times as powerful. (Still need to see the benchmarks on that — educated guess.) Worth it for a few hundred bucks, but $2000? No way!

    Building a gaming PC wasn’t an option due to my mobile lifestyle, and in any case the kind of CPU I could buy for cheap would be comically underpowered compared to the i7 4850HQ I already had in front of me. So I started looking into the scary world of external Thunderbolt GPUs, colloquially known as eGPUs. Modern Thunderbolt 3 (allegedly) supports external GPUs in an official capacity, but older Thunderbolt 2 can get the job done as well, even though it’s unsanctioned by Intel. I’m usually reluctant to pursue these sorts of under-the-radar hobbyist projects, but there was enough prior art to make it worth a shot!

    Unlike many gaming enthusiasts, my goal was to optimize for simplicity over power: the fewer hacks and workarounds I had to use, the better. I knew I’d have to use an external monitor and do my gaming in BootCamp, both of which were already the case for me. I knew there would be some performance loss from the limited bandwidth of TB2. I gathered that there might be timing issues and other problems that would require a bevy of software hacks to fix — mostly on the Windows side of things. But I was most concerned about the hardware hacking required to get the thing up and running in the first place.

    The majority of published eGPU builds involve enormous graphics cards connected to hotwired desktop PSUs, sitting in unseemly, torn-apart Thunderbolt-to-PCI chassis. It was clear that the anointed case for the job was the AKiTiO Thunder2. The Thunder2 wasn’t designed for eGPU use, but dozens of eGPU enthusiasts on forums like TechInferno demonstrated that it ran stably and performed admirably. (AKiTiO engineers even popped in on occasion to offer under-the-table eGPU advice — off-warranty, of course.) It was also one of the cheapest options on the market at around $200: very fair considering that a barebones development Thunderbolt 2 board cost nearly as much!

    Most eGPU builders buy this case to hack up, not to use as-is. Usually, the front panel is bent back or removed to fit larger cards, and then a desktop PSU is made to turn on with a paperclip and adapted to fit the DC plug. There are also arcane startup rituals to get everything powered and running with the right timing. I really didn’t want to have a PSU octopus and a ragged hunk of metal sitting bare on my table, though it sadly seemed inevitable. Then I discovered an alternate route.

    Most GPUs are power hogs that rely on one or two extra power ports on top of the card, but there are a few designed to pull power straight from the PCI slot. These aren’t super-extreme gaming cards, but these days they more than get the job done. For example, the just-released NVIDIA GeForce GTX 1050 Ti can pull 1080p at medium-high settings in many recent games and currently benchmarks as the ~40th best video card on the market! Better yet, many of these single-slot offerings are short, about half the length of the monster enthusiast cards, easily fitting into AKiTiO’s compact case without any modifications. Using this type of card, I’d be able to keep my Thunder2 in one piece and avoid using a PSU entirely. No hacks required!

    At peak, these slot-powered cards can draw 75W from the PCI Express slot. Unfortunately, the AKiTiO Thunder2 only comes with a 60W adaptor, 30W of which is allocated to the circuitry. A dead-end? Not so fast: as stated in the official docs and verified by employees, the Thunder2 can actually pull as much as 120W from a more powerful adaptor. To be compatible, the new power brick needs to sport a 5.5×2.5mm barrel plug, provide 12V output, and have center positive polarity. (Practically every power adaptor has these last two items listed on the back.)

    My hope was to find a laptop power brick with these same specs, but it turned out that most laptops used chargers with an all-too-high output of 20V. Surprisingly, well-reviewed 12V/10A bricks weren’t common at all on Amazon (unless you lived in the UK or Europe), with most of the listings taken up by rebranded versions of a sketchy-looking adaptor with model number CT-1250. Eventually, I discovered one vendor who was selling bricks with model number CD120100A, which had a more confident label and looked identical to a power brick I saw in another successful closed-case AKiTiO build. (The Amazon listing was full of typos and the product photos didn’t match the user photos, but it just so happened that the adaptor in the user photos was exactly the one I was hoping to find — and Prime allowed for painless returns in any case.) If the US 12V/10A adaptor market was really dominated by CT-1250 and CD120100A, the latter just seemed like a better bet.

    For the graphics card, I decided to give the EVGA factory-overclocked version of the 1050 Ti a try, since one eGPU enthusiast mentioned that their EVGA card handled boot timing issues a bit better. (True or not, I’ve also had positive experiences with EVGA warranty and support in the past, so it was an easy decision.) Potentially, the overclock was a problem: the AKiTiO Thunder2 wouldn’t provide more than 75W of power to the slot, and any excess power pulled by the card could destabilize the system or even fry the circuitry (as reported by one user). But from everything I read, factory-overclocked EVGA cards were designed to never exceed the 75W threshold, and any instability could simply be fixed by underclocking the card slightly using EVGA’s (or possibly NVIDIA’s) own tools. Factor in the fact that the non-overclocked version cost exactly the same as overclocked while probably having lower resale value, and it became clear that the SC model was almost certainly the better buy — even if you dropped the clocks right from the get-go.

    (Note: many reviews will point out that the regular 1050 is a much better deal than the 1050 Ti from a price/performance perspective. Still, the Ti is about 20% faster than the non-Ti for just $20 more, and for the sake of future-proofing as well as TB2 performance loss it just makes sense to wring as much power from the purchase as possible.)

    Trawling eGPU forums for installation instructions was quite frustrating. Most users preferred to write about how they got their eGPUs working with their laptop displays (using Optimus drivers — possible with NVIDIA GTX cards) and/or in OSX. Both tasks involved copious scripts and hacks. I was only interested in the bare minimum — BootCamp on an external display — but most guides simply skipped that “easy” part. Would I need to make a custom build of Windows? Edit drivers? Install a custom bootloader? Nothing was clear, so I decided to just jump into it.

    Once I got all the parts assembled, I plugged the Thunder2 into my laptop and my monitor into the Thunder2, crossed my fingers, and turned on the computer while holding down the Alt key (for the boot menu — I already had BootCamp with the latest Windows 10 installed). At first… nothing. Just a black screen and no chime. I tried unplugging the cable, turning the machine on, waiting for the chime, and then plugging it in. The boot menu showed up, but froze when I selected Windows. I tried one more time to boot with the cable plugged in and it worked! Or — at least, it booted into Windows. Nothing showed up on the external display, but the Windows Device Manager had a tempting entry named “Microsoft Basic Display Adapter”. Hopeful, I searched for other eGPU users who had gotten to this step, and it became apparent that all I had to do was install the latest NVIDIA drivers. One reboot later (with no issues this time) and I was seeing “NVIDIA GTX 1050 Ti” in my Device Manager. I gave Overwatch a quick run on the highest settings, but performance didn’t seem particularly great; my suspicion was that the laptop defaulted to the discrete 750M instead of the eGPU. I returned to Device Manager and disabled the 750M, restarted Overwatch, and… 60fps! It actually worked! Holy cow!

    eGPU setup can be daunting depending on your hardware, but I seem to have gotten away with a problem-free configuration. The “hardest” part is getting the computer to chime on boot, presumably indicating that POST went correctly. This involves turning the computer off and on again one or two times in the worst case: if it chimes and the boot menu appears, everything is sure to work fine. (Recently, I’ve been getting the boot menu on first try 100% of the time. Maybe I was just impatient before!) Once booted into Windows, I’ve learned that simply changing the display settings to only use the external monitor, or to extend the desktop and use the external monitor as the main monitor, ensures that the eGPU is used over the discrete chip. (And I believe Windows remembers this preference when you launch with the eGPU connected.)

    Now for some benchmarks! The main bottleneck in this setup is the TB2 connection. TB2 doesn’t allow for the full PCIe x16 throughput, potentially crippling graphics card performance. In practice, this isn’t really that big of a deal: users have reported at most a 20% performance loss over native, and usually a bit less. Let’s see how well we do.

                           GTX 1050 Ti SC   GT 750M   Improvement
    3DMark Fire Strike
      Graphics Score             6993         1911        3.66×
      Graphics FPS 1            32.28         8.74        3.69×
      Graphics FPS 2            28.74         7.96        3.61×
    3DMark Time Spy
      Graphics Score             2040          450        4.53×
      Graphics FPS 1            13.67         3.00        4.56×
      Graphics FPS 2            11.43         2.54        4.50×
    3DMark Sky Diver
      Graphics Score            22564         5602        4.03×
      Graphics FPS 1           102.25        26.41        3.87×
      Graphics FPS 2           103.83        24.80        4.19×
    3DMark11 Free
      Graphics Score             8802         2445        3.60×
      Graphics FPS 1            42.83        11.27        3.80×
      Graphics FPS 2            42.18        11.40        3.70×
      Graphics FPS 3            54.32        15.52        3.50×
      Graphics FPS 4            25.13         7.39        3.40×

    Quite an upgrade! According to PassMark and other benchmark listings, a 1050 Ti should, under normal circumstances, be about 4.5× as powerful as a 750M. Factor in a 10%-20% performance loss from the TB link and that’s exactly what we see in our results: a 4× boost on average.

    Even without any underclocking, stability has not been an issue. I’ve been playing hours of Crysis 3, Far Cry 4, and Mirror’s Edge Catalyst over the past few days and everything’s still working great. I’m keeping the case closed, but I don’t think there’s any real risk of overheating: the GPU fan is designed to funnel heat right out through the back, and there’s an extra front fan built into the case anyway. According to 3DMark, temperatures during benchmarking have been stable.

    I’m not interested in running any weird scripts to get Optimus drivers for the internal display working, but I learned something interesting while fiddling with the Windows display settings. If you set the multiple display setting to “Duplicate these displays”, it seems that somehow the eGPU gets used for both the internal and external display! Assuming I’m interpreting this finding correctly, this means that theoretically you could buy something like this HDMI display emulator and use the eGPU on the internal display without an external monitor and without having to go through the hacky process of getting Optimus up and running. Unfortunately, there’s a performance penalty of about 20%-25% (according to my benchmarks) as well as approximately 0.25 seconds of latency, making this approach untenable for first-person shooters and other twitchy games. (I wonder if this is also the case with the Optimus driver route?)

    Another interesting finding: if you keep the discrete GPU enabled, there’s a setting in the NVIDIA control panel to dedicate one of the GPUs to PhysX. I’m not sure if this will make a real difference in performance or cause stability issues, but it might be worth investigating in the future.

    To summarize, using only…

    …you can assemble a painless, hack-less eGPU build and use it with your late-2013 15” dGPU MacBook as a relatively inexpensive graphics upgrade compared to building a PC from scratch or buying a console. (Cheaper still if you wait for rebates or use an older/weaker X50 card.) Caveat emptor: the same build might not work so well — or at all! — on other MacBook models or even with a different driver version. In other words, what worked for me might not work for you! Remember that eGPU on TB2 is not officially supported and mostly works by accident, though clearly it can work very well.

    (Also, there’s some great information in the HN thread for this post about new and upcoming TB3 enclosures. If you can get one working with a TB3-to-TB2 adaptor, it might be the best option of all for upgradability, reliability, and future-proofing. On the other hand, you’ll probably spend more money and the case will be a lot bigger. Do your research!)

    In time, I hope somebody releases a Thunderbolt 3 eGPU the size of one of those Square credit card readers — maybe sporting a GTX 980M caliber chip? — that plugs into a USB-C port and works seamlessly with the internal display. But for now, this lovely little eGPU will do just fine. I’m confident that my trusty MacBook can now serve me for another few years, especially if NVIDIA continues to release excellent and inexpensive PCI-powered cards on the regular.

    Let’s hope that the eGPU revolution is just beginning!



    Last month, I released an unusual little app for iMessage. It’s called MusicMessages!, and it’s a collaborative step sequencer that lets you work on short pieces of music together with your friends. As far as I can tell, it’s the only app of its kind in the iMessage App Store. (Probably for good reason!)

    The app presents you with a grid of buttons, each corresponding to a musical note. Time is horizontal and pitch is vertical, and the entire grid can be panned like any other iOS scroll view. To place a note, simply tap one of the buttons; tap it again to erase the note. (If you have a 3D Touch capable device, you can depress the button using finger pressure. On an iPhone 7, there’s even a bit of haptic feedback at the end.) The tabs on top of the screen represent independent layers of notes, and if you tap their icons, you can pick a new instrument out of 40+ different ones (including percussion). Once you’re happy with your portion of the piece, you can send it off to one or more fellow iMessage users for their contributions. Each participant’s notes show up in their own unique color, making it easy to track the changes to a piece over time.

    Why iMessage? Since releasing Composer’s Sketchpad, I’ve wanted to create a companion app that would make it even easier to play around with simple musical ideas, though at the expense of expressiveness. Initially, I envisioned this as a tabbed, pannable, Minesweeper-like step sequencer for OSX. But when I started investigating the new iMessage frameworks in iOS 10, I realized that iMessage might be as good a place as any to work out this idea. No sync issues, no file I/O, a format that incentivized short experiments, and plus — the social aspect just seemed neat! Wouldn’t it be fun to riff on a melody or percussion line with your friends?

    Total development lasted exactly two months and involved approximately 8000 new lines of Swift code, plus 1000 lines and a bunch of assets borrowed from Composer’s Sketchpad.

    Favorite tech bit? The data format! I hate spinning up and maintaining servers, so my aim was to avoid any outside dependencies by sending data strictly through the iMessage APIs. Unfortunately, iMessage sends data via NSURL, which in this case had a hidden limit of 5120 characters. I hit this limit with plain old NSArchiver after about a dozen notes. To solve the problem, I had to compress all my data — 5+ layers, 5+ participants, and as many notes as possible — into approximately 3.75 KB, assuming base64 encoding for the data string. Swift is pretty terrible at dealing with tightly-packed data structures (a 256-element static array can only be represented by a non-iterable 256-member tuple), so I designed a struct and corresponding helper functions for my data in straight C. Lots of fun counting bits and optimizing for maximum data density… eventually, I settled on a maximum of 12 layers, 8 participants, and 1120 notes, along with a ton of extra data and even some room to spare. Nothing terribly complex, but it’s satisfying to optimize within tight constraints.

    Another feature I enjoyed integrating was the perceptually-balanced HSLUV color space for all my user-selected colors. Normally, if you generate colors in the usual HSB color space by varying the hue and keeping saturation and brightness constant, you get colors that are perceived as unequally bright by the human eye. (An artifact of biology, alas.) Perceptually-accurate color spaces like CIELUV attempt to compensate for this, but most of them have large swaths of empty space where impossible colors lie, making it very difficult to create linear ranges of color parametrized by hue. HSLUV goes one step further and stretches the chroma to fill in these gaps. Not perceptually perfect, but just a ton more convenient and usable in practice!

    Since there’s an element of self-marketing in iMessage apps — recipients of app messages are automatically prompted to download the corresponding apps — it was important to make my app free. As I really didn’t want to plaster my interface with ugly ads, I decided to lock some non-critical features behind an in-app purchase. I’d never dealt with this payment model before, and as a complete novice in cryptography the code samples for receipt decryption and validation seemed quite daunting! Fortunately, I discovered an excellent OSX application called Receigen that generated auto-obfuscated receipt and IAP validation headers for my app. Ended up saving what probably would have been several days of frustrating, unrewarding work for just $30. Highly recommended!

    As before, designing the icon was a lot of fun. Just like last time, there was a long period in the middle where I was sure that the right design — one that would equally hint at the interface, functionality, and ambiance of the app — would elude me. And just as before, after a chain of prototype designs that I wasn’t crazy about, the right pieces suddenly snapped into place all at once. On a lark, I even spent a few days parametrizing and animating the icon for my trailer, adding another 900 lines of code through Swift Playgrounds. (Next time, I should probably use something like After Effects or Flash. Keyframing in code is a huge pain, and performance in Playgrounds is hardly sufficient.) The thrill of creative experimentation and discovery is something I sorely miss in my day-to-day programming and makes me all the more eager to get started on my game project.

    Speaking of Adobe, I finally moved on from iMovie to Premiere Elements for my trailer. What a relief! Although deceptively simple at first, PE conceals enormous power in its effects and keyframing features. In trademark Adobe fashion, the program does its best to infuriate you into almost paying for the full CC; but with some clunky zoomed-in Bézier adjustments and begrudging cut-and-paste alignment of keyframe positions, it’s easy to create a video that moves, changes color, and feels very dynamic. The trailer I saw in my head came together in just a few days, and now iMovie feels like a joke in comparison. Well worth the $50 I paid for it on sale.

    MusicMessages! was an attempt at a speed project, so there are many stones left unturned. The UI takes up too much room. The instrument tabs in horizontal mode are too hard to reach. Transitions are jittery and some of the UI glitches out on rotation. There should probably be a chord option for beginners. Percussion is in MIDI order, which is… a little bit crazy. But overall, I’m quite happy with the result! I hope people get a kick out of this weird project and enjoy sending their oddball musical ideas to each other.

    One more thing. There’s a good chance I’ll be releasing a standalone, file-based version of the app in the future (with MIDI, IAA, Audiobus and all that good stuff). If you’d be interested in using such an app, do let me know!


    An Even Better Travel Gaiwan (01/08/17)

    Previously, I wrote about the Asobu Travel Mug as an excellent (if unintentional) travel gaiwan. Now, there’s a new leader in the not-a-gaiwan-but-almost-better-than-one category: the Klean Kanteen 8oz insulated tumbler.

    This mug is a bit thicker than the Asobu, but in trademark Klean Kanteen fashion the quality is simply superb. Heat is retained perfectly: there are no hot spots around the lip or anywhere on the body. Compared to the flaky finish of the Asobu, the matte black of the Klean Kanteen is slick and feels like it’ll last for ages. The shape is a little odd on first glance but feels great in the hand, and the rounded lip is perfect to drink from.

    Like the Asobu, the Klean Kanteen has a rubber-lined lid that can double as a strainer. For the most part, I use the sipping hole to strain: the lid snaps on very tightly and most loose-leaf teas expand enough to avoid going through the hole. (You might get a few stragglers, but the same thing happens with my regular gaiwan technique anyway.) If that doesn’t work, you can just pop the lid off and use the rubber seal as a makeshift strainer. As with the Asobu, the “lever” on the back of the lid can serve as a stopper while tilting it back. Admittedly, I did prefer the Asobu lid for its looser fit — the Klean Kanteen takes some strength to pop open! — but it’s a very minor ding on an otherwise excellent product. (Also, this might entirely be in the realm of personal preference. The Klean Kanteen lid looks and feels like it was precisely machined to fit the tumbler, which is a far cry from the ramshackle Asobu construction.)

    The mug fits about 7.7 ounces of water when filled right up to the lid, though you’ll get less when factoring in the tea leaves. It’s the ideal size for a single-serving cup of tea and about twice as big as your typical gaiwan. (Of course, there’s no issue using it for smaller steepings.)

    (As an aside: it took me way too long to realize this, but in addition to using a gram scale to measure out the exact amount of tea, you can also use it to measure the precise volume of water desired. This is because 1ml of water normally weighs 1g. Before, I used to eyeball the water; now, I just pour the water into the mug right after weighing the tea. This might seem super-finicky, but I’ve internalized Eco-Cha’s recommendation to use 9g of tea to 175ml of water for oolongs as a starting point, and it’s really nice to have reproducible results when comparing different steepings. The only question is whether to subtract the weight of the tea from the weight of the water, especially as the leaves expand. My hunch is yes.)

    As I mentioned in the previous article, one of the major reasons to use an insulated mug as a “gaiwan” is for its heat retention properties. Very little heat escapes the mug while making tea, maintaining the water at a stable temperature for the entire duration of the brew. My understanding is that certain kinds of teaware are especially prized for this property, but it’s almost impossible to beat vacuum-insulated steel in this race!

    Of course, it’s great that you can just throw this mug into your backpack or suitcase and not have to worry about it breaking or weighing you down. And since Klean Kanteen is such an entrenched brand, you can even find a number of accessories for it.

    The Klean Kanteen 8oz insulated tumbler: highly recommended as a surrogate travel gaiwan!



    Update: I added a companion article with latency graphs for all three of my mice. I have also revised my conclusion to no longer recommend the MX Master for Bluetooth use or use cases sensitive to latency.

    We all know that Bluetooth has an abundance of flaws, ranging from frustrating latency to arcane pairing rituals. By many measures, it still feels like a technology stuck in the early 90’s. And yet, once you’ve experienced the freedom of going wireless, it’s very hard to go back to the old ways. Reaching to unplug your headphones when leaving your desk, only to realize that you can simply walk away? Bliss!

    For several years, I’ve been on the lookout for a Bluetooth mouse that could also be used for non-casual gaming. At minimum, the mouse needed to be on par with my trusty MX 518 at 1600 DPI and have little to no latency. Unfortunately, the vast majority of reputable Bluetooth mice maxed out at around 1000 DPI and had a reputation for being a bit laggy. The Razer Orochi was one of the few models that supported high DPI over Bluetooth, but it was a cramped little thing that felt rather unpleasant to use.

    There were a few wireless gaming mice that used proprietary USB adaptors to improve performance, including my latest mouse, the Logitech G602. This model did what it said on the tin, but despite the praise it garnered from gamers, I ended up somewhat disappointed with it. The USB receiver was pretty weak and would routinely cut out if you moved more than a few feet from the port. The fact that you had to use the receiver at all meant giving up one of your USB ports, a significant setback on a two-port MacBook. (Hubs helped, but not while trying to use two USB-powered hard drives at the same time.) I was also unimpressed with the design and build in general: the body creaked in several prominent areas (including under the main buttons), the side buttons were unpleasant and hard to press, and the scroll wheel felt a bit mushy. After using it for about a year, I just ended up switching back to the MX 518.

    Recently, I’ve been working more in cafés, and the endless dance of the wire once again started to irk me. At first, I thought about getting an extra, cheapie Bluetooth mouse for use on the go, but then my “optimization sense” kicked in. It’s been 4 years since the G602, and technology moves quickly. Surely, I thought, there now had to be a mouse that could solve my wireless needs and also work for gaming! Besides, I deeply enjoyed finding tools for my life that could optimally solve multiple problems at once.

    Sure enough, Logitech had two new headlining models in the Bluetooth category: the MX Master and MX Anywhere 2. These were clearly top-shelf devices, sporting sleek designs, several color choices, and Logitech’s free-spinning MicroGear Precision scroll wheel. Interestingly, they also reached 1600 DPI and shared the ability to connect to Bluetooth or a Logitech Unifying USB receiver at the user’s discretion. (Update: the newly-released MX Master 2S goes up to 4000 DPI.) Based on my experience with the G602, I figured Bluetooth might be handy for everyday use while the USB receiver would work well for lag-free gaming. Were these the first Bluetooth mice that could actually fit the bill? I had to give them a spin!

    Eventually, I got my hands on both models and did some side-by-side testing. The MX Master was love at first touch, fixing almost everything I hated about the G602 and even adding a few extra features to win me over. Meanwhile, the MX Anywhere 2 was marred by one awful design decision and just felt too small for ergonomic comfort. (Update: unfortunately, I had to eventually give up the Master due to latency and connectivity issues. But the hardware remains spectacular!)

    Below is a discussion of several aspects of these mice that haven’t been covered in most reviews, including handfeel, clickiness, gaming use, and latency measurements.

    MX Anywhere 2

    The MX Anywhere 2 is a cute little mouse. Some reviewers have been comfortable switching to it as their primary work mouse, but in my testing, I found it just a bit too small. This is definitely a travel mouse in form and function. The weight, however, is great for usability, as it’s just hefty enough to stick a little to the mousepad without losing its high mobility.

    Click-wise, the two main buttons feel pretty good while the rest aren’t particularly notable. I was happy that the side navigation buttons were fairly normal sized compared to the scrunched side buttons on the Master. The coating feels grippy but maybe a tiny bit less premium than I’d hoped.

    Clicking every button on the MX Anywhere 2.

    In case you’re not aware, many Logitech mice now feature a scroll wheel that can also be clicked side-to-side. In reviews of Logitech mice, I often see praise for this sideways-clicking mouse wheel, and some go as far as to call it a “premium feature”. But I think I’ve come to realize that most people just don’t use their middle click all that much. Me? I’m a compulsive middle-clicker. I use that button for everything. New links. Closing tabs. Panning. Reloading. In fact, it’s possibly the second most important button on my mouse! Unfortunately, sideways-click cripples this button thoroughly, making it rattle from side to side with every minor push.

    If I otherwise loved the Anywhere, I figured I could get accustomed to this annoying hardware quirk. But Logitech really screwed up the wheel here. Incomprehensibly, there’s no middle click; instead, you get a tiny button right below the wheel that could be rebound to this function. (By default, it serves as the “gesture” button, which lets you show Exposé and whatnot.) The wheel itself, when depressed, mechanically toggles between traditional ratchet and free spin modes for scrolling, resulting in a heavy, chunky “clunk” that feels like you’re squishing something deep inside the mouse’s guts. Is there any other Logitech mouse that behaves this way? The middle-click has been a staple feature on mice since the 70’s, so why is changing scroll wheel modes suddenly more important? Considered together with the usual sideways-click complaints, this scroll wheel disappointed me in practically every respect.

    A demonstration of the janky scroll wheel.

    For a while, I tried rebinding the square button and sideways-click buttons to middle click. It felt OK… in the sense that I could probably get used to it over time. But I knew I’d never be happy with this compromise, and it’s what ultimately pushed me to give the Master a try.

    MX Master

    I’m delighted that tech companies have started to inject fashion into even their most pragmatic products. Both MX models come in black, navy, and white (“stone”). I liked the idea of white in honor of my old favorite Microsoft Intellimouse, and it’s the color I chose for my initial Anywhere purchase. But seeing it in person didn’t impress me as much as I had hoped. It was attractive but a little business casual, and in any case, it didn’t mesh with my recent black Logitech K380 keyboard purchase. (Peripheral matching, whaddaboutit?) So I decided to seek a different color with the MX Master.

    Between the other two options, navy looked svelte in pictures while black appeared to have some ugly beige accents that screamed “HP peripheral”. And yet… Amazon Prime Now had a promotion going where I could chip $10 off the purchase of just the black model, bringing the price down to a mere $50 and delivering it the very same day. Meanwhile, navy would cost me close to $70 and arrive several days later! Friends, I must admit I did not pass the marshmallow test on that day.

    Fortunately, this turned out to be a great decision: the black model looks fantastic in person. Despite what the photos might show, the accents are actually not beige at all but more along the lines of Apple’s space gray, perfectly complementing the darker matte gray of the body. In addition, the buttons have a slightly different coating from the rest of the mouse, giving them a pleasant sheen under certain lighting conditions.

    As most reviews have stated, the ergonomic comfort of this mouse is close to perfect. You lay your hand down and it feels like it was sculpted just for you. What’s more, the main buttons feel incredible to click — perhaps more so than any other mouse I’ve used! Seriously, I can’t stop clicking these buttons.

    Clicking every button on the MX Master.

    The Master’s sideclick-less wheel intrigued me when I first saw it. Most Logitech mice either feature sideclicking and free spinning together, or otherwise just throw in a plain old scroll wheel and call it a day. This was the first mouse I found which omitted sideways-clicking while still retaining the free spin mode, a feature I thought might come in handy as a substitute for the trackpad’s inertial scrolling. Prior to handling the Master, I hoped this setup might finally allow me to have an uncompromised middle click while still benefitting from Logitech’s fancy scroll wheel tech. And… that’s exactly what happened! The middle click on this mouse feels excellent, to the point where it’s very nearly as pleasing as the main buttons. (There’s a slight bit of wobble before the click is triggered, but I don’t think that can be helped on account of the complex mechanism.)

    There’s a subtle issue I noticed with the middle click that might be worth mentioning. When Smooth Scrolling is enabled in Logitech Options, if you click the mouse wheel and then immediately start scrolling, your scroll won’t actually register until several seconds later. This happens in both Windows and OS X. I assume this is some sort of hardware safeguard to prevent accidental scroll triggering, but it’s noticeable on occasion.

    My main issue with the build is the very poor layout of the back and forward buttons. I use these buttons quite frequently for navigation, and I miss the old Intellimouse days when the side buttons were enormous and clicked just as well as the main buttons. Here, the buttons are quiet and hard to tell apart by feel. Why couldn’t they have spread them out just a little bit? The horizontal scroll wheel feels nice, but I don’t see myself getting much mileage out of it, especially now that I’ve learned you can simply Shift-scroll in OS X to get native horizontal scrolling.

    There’s one hidden button on this mouse: the “gesture” button, which can be activated by smashing down on the mesh pad next to your thumb. Unlike the other buttons, this button feels mushy and difficult to press, similar to those membrane buttons you find on cheap remotes. I guess they had to design it this way to avoid accidental clicks, but I wish they thought of something else or eliminated it altogether. I’ve been trying to use it as a surrogate back button instead of the tiny default one, but it’s not particularly pleasant or responsive to use. Oh well.

    Weight-wise, this mouse is pretty hefty, but not overbearing. I’ll have to get used to the inertia compared to my MX 518, which barely feels like it has any weight at all.

    The MX Master in regular use.

    I was worried when I was first looking at this mouse that it would just be a minor iteration on the G602, but these fears have been unfounded. The Master fixes every problem I had with the G602 (aside from perhaps the weight) and adds a bunch of great features to boot. I feel immediately at home with this device.

    Common Issues

    There are a few issues common to both mice that should be addressed.

    Both of these mice can be used while charging, but they don’t register as USB devices even when directly connected to your computer. You still have to use them via Bluetooth or the Unifying receiver, which means that there’s no zero-latency mode. In practice, as I demonstrate below, the mice are pretty darn close to lag-free. Most people didn’t consider wireless-only to be an issue with the G602, and I don’t see it as an issue here either. (The feature would have been appreciated, though.) Update: as it turns out, the latency is still an issue for gaming. Read on below.

    Second, there’s some scrolling weirdness, which seems to be a mix of OS issues as well as user habits. On the OS side, when smooth scrolling is enabled in Logitech Options, it doesn’t always seem to work right. Fairly frequently, you get some weird acceleration or momentum before things get going. (Both OS X and Windows have this issue, though manifested in different ways.) Most unfortunately, the wheel in free spin mode doesn’t seem to have a 1:1 mapping to page scrolling, which feels a lot less physically correct than using the trackpad. I think I could get used to this behavior, but even my ancient MX 518’s scrolling felt more natural. In terms of habits, if you’re used to trackpad momentum scrolling in OS X, you’ll be surprised when you’re free-scrolling a page and then find other pages continuing to scroll when switching windows! It might take a while to internalize the fact that the mouse has a mechanical component that needs to be stopped before switching tasks.

    These mice worry me a little with their reliance on mechanical trickery. On the MX Master, whenever the lever (or whatever it is) stops the wheel when switching to ratchet mode, I can feel the entire mouse shudder slightly. At least one user has already demonstrated that this part can get stuck. (This has apparently been quietly fixed by Logitech.) How long will it take for the mechanism to break or wear out? Fortunately, Logitech has an exceptional warranty department, so I don’t doubt that they’ll send me a replacement if anything bad happens. Still, I don’t like the idea of having to pamper my mouse.

    The Unifying receiver, unfortunately, tends to have a very short range if there’s any sort of interference nearby. (For example, I can hardly move the mouse a foot away if a Thunderbolt cable is attached to the port next to the receiver. Or maybe it’s the eGPU itself?) As a result, I’ve resorted to plugging the receiver into a USB extender. With Bluetooth, this is not an issue at all, so it comes up fairly infrequently.

    Latency

    Now, for my personal dealbreaker with wireless mice: latency. I had a bit of a misconception when I first set my eyes on these two MX models. My assumption was that the Unifying receiver was identical to the one used by my G602, meaning that the adaptor would be highly optimized for reduced latency. But according to a Logitech representative, only Logitech’s gaming peripherals use the improved, custom-designed adaptor to get the “precision report rate”, whereas Unifying technology is less fancy and reserved for use with the business lineup. My question was: did “precision report rate” only refer to the polling rate, or were the gaming adaptors additionally less laggy? In other words, was I missing out with my Unifying receiver?

    I knew I wouldn’t have peace of mind until I had solid numbers, so I decided to measure the latency myself. There were two data points I needed to capture: the moment the mouse started moving, and the subsequent moment that the computer registered mouse activity. Both actions had to be on the same clock. My iPhone’s camera could record at 240 FPS, so precision wasn’t an issue; the problem was that my laptop display only refreshed at 60 Hz, meaning that I couldn’t rely on a recording of the screen alone to figure out how fast the mouse signal was going through. (There was only one display frame for every four video frames.)

    I ended up writing a small, single-window Mac application to help me along. On the left side, the window has a running millisecond timer, refreshing at the exact frequency of the display. This gave me the precise timestamp of each display cycle. (Well — with a possible delta of 1 frame or ~17ms, depending on how the labels spaced out their updates under the hood. But I was only interested in relative latencies between the mice, not the absolute latency, so the only important detail was that this offset was consistent.) The app also captured the timestamp for the precise moment mouse movement was first detected. This was displayed in a label on the right side. Both timestamps were generated using the same time function, CACurrentMediaTime.

    Next, I placed a mousepad next to my display along with a small box to evenly and consistently push the mouse along. I set up my phone to show both the laptop display (with the timer app running) and a side view of the mouse and box contact point. I filmed three trials each of the MX 518, MX Master with the USB adaptor, and MX Master in Bluetooth mode, resetting the righthand timer between each trial.

    Finally, I went through the videos frame-by-frame in VLC. (The ‘e’ key: highly convenient!) The left timestamp was used to determine the exact moment when the mouse started moving. If the movement occurred between two timestamps, I could simply interpolate the precise value based on which intermediary frame out of four I landed on. After that, I noted the righthand (“mouse was detected”) timestamp and did a bit of math to arrive at the latency value. Perhaps not a perfect system, but as accurate as I could manage with the tools I had at hand!
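    As a rough sketch of the arithmetic described above — the function names and the example values are mine, not taken from the helper app — the interpolation works out like this:

    ```python
    # Sketch of the latency bookkeeping (hypothetical example values).
    # The camera records at 240 FPS while the display refreshes at 60 Hz,
    # so each on-screen timestamp persists for 4 video frames.

    VIDEO_FPS = 240
    DISPLAY_HZ = 60
    FRAMES_PER_REFRESH = VIDEO_FPS // DISPLAY_HZ  # 4 video frames per display frame

    def movement_timestamp(prev_display_ts_ms, intermediary_frame):
        """Interpolate when the mouse actually started moving.

        prev_display_ts_ms: the on-screen timer value visible when movement began
        intermediary_frame: which of the 4 video frames (0-3) after that
                            timestamp the movement was first seen on
        """
        frame_ms = 1000.0 / VIDEO_FPS  # ~4.17 ms per video frame
        return prev_display_ts_ms + intermediary_frame * frame_ms

    def latency_ms(detected_ts_ms, prev_display_ts_ms, intermediary_frame):
        """Relative latency: detection timestamp minus interpolated movement time."""
        return detected_ts_ms - movement_timestamp(prev_display_ts_ms, intermediary_frame)

    # Example: movement first visible 2 video frames after the 1000 ms tick,
    # and the app registered mouse input at 1062 ms.
    print(latency_ms(1062.0, 1000.0, 2))  # ≈ 53.7 ms
    ```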

    Update: since publishing this article, I have run a more thorough and accurate suite of tests on all three of my mice. The conclusions are a bit different from the initial ones greyed out below, namely concerning Bluetooth accuracy (notably worse than originally tested) and MX Master latency characteristics (spiky and 10-20ms slower than wired).

    The results were: 55ms/58ms/50ms for the wired MX 518; 63ms/74ms/51ms for the MX Master in USB receiver mode; and 70ms/58ms/68ms for the MX Master in Bluetooth mode. (Keep in mind that these values were not a measure of absolute latency and were only meant to be compared to each other, since the test did not deduct OS latency, monitor latency, etc.)

    To my great surprise, not only was wireless latency very close to wired (~65ms vs. ~55ms), but Bluetooth was practically as performant as the USB receiver! I don’t know how Logitech managed it, but somehow the Bluetooth performance of these mice is nearly flawless, to the point where perhaps the dongle is basically unnecessary. (Except for edge cases like BIOS use.) You could make the argument that wireless performance is less consistent than wired, but I’d need to do more tests to figure this out. (And it’s probably more effort than it’s worth.)

    So is 10ms of lag a dealbreaker when it comes to precision gaming? I strongly suspect it won’t be noticeable — especially given how much latency already exists in the long chain from mouse to display — but I’d love to see some empirical evidence backing this up.

    Gaming

    There’s some mild consternation about these two MX models when it comes to gaming. Whenever people ask, some enthusiast always shows up and levies the following grievances against them:

    • They have no wired mode, and thus always feature some latency.
    • They have built-in acceleration and angle snapping.
    • They only poll at 125 Hz.
    • They only go up to 1600 DPI. (Update: the newly-released MX Master 2S goes up to 4000 DPI.)

    In contrast, they suggest, Razer and Logitech themselves make gaming-tailored wireless mice (the Logitech G900, or the new Razer Lancehead) that go up to 12000 DPI at 1000 Hz and use proprietary receivers for optimal performance. All technically true! However, the mouse I’ve loved the longest, and gamed the most with, has been my trusty MX 518, a classic model popular with gamers even today. And it turns out that this mouse also has built-in angle snapping, also only goes up to 1600 DPI, and also polls at a mere 125 Hz. The horror!

    In practice, none of these quirks are dealbreakers. 1600 DPI is more than enough for the vast, vast majority of people; it was a high standard a decade ago and accuracy-per-inch demands in humans have not suddenly spiked during that time. (DPI is more of an issue with enormous monitors and insane resolutions, but it doesn’t matter for my use case.) Same goes for 125Hz polling, which is effectively 2x the refresh rate of most monitors. On top of that, you’ll get about 30x less battery life (30 hours vs. 40 days!) with gaming wireless mice — not to mention losing all the benefits of Bluetooth. Unless you’re a pro, I don’t think it’s nearly worth the tradeoff.

    However… I have to admit that something about these mice definitely feels off when playing FPS. Side-by-side with the MX 518, the difference is immediately noticeable. With the 518, I feel like I’m directly inside the character’s head. With the Master, there’s a bit of a “cockpit effect”, or a very subtle sense that my movements aren’t perfectly mapped to the camera. Accordingly, things like rapid 180 degree turns and flick shots feel more hesitant and unnatural. For a while, I assumed this was due to wireless latency, but my experiments showed that this was unlikely to be the case. (Besides, my setup was a mess and there was plenty of latency coursing through the system already.) I also thought it might be the weight of the Master, but no dice: the Anywhere had the same issue at half the weight. So my working hypothesis is that this issue is caused by some subtle differences in mapping of mouse movement between the Master and the 518, meaning that I’ll have to reprogram my brain a little before I’m fully comfortable with it. (I think I could also customize this curve in software using various third-party tools, but this might be too finicky even for my tastes.) I actually remember having this exact response to the G602, so maybe that 10ms does make a critical difference in FPS gameplay after all? Or perhaps the G602 shares its motion curve with the Master? Who knows! Update: after using the Master for another week and doing some thorough testing, I have concluded that, unfortunately, latency is almost certainly the culprit here. That’s not something that you can really train yourself to ignore, at least not in fast-paced multiplayer games.

    Conclusion

    The MX Master is so very close to a perfect all-arounder mouse. It supports Bluetooth. It works for gaming. It feels incredible in the hand and even features a free-scrolling mouse wheel with a solid middle click. Unfortunately, despite my initial excitement in the first version of this article, I’ve decided to return it in favor of the gamer-centric G403 Wireless. (The lack of Bluetooth support is a bummer, but this mouse hits it out of the park in every other respect.)

    There are two primary flaws that sealed the Master’s fate for me.

    First, the Bluetooth functionality is lackluster. I don’t know which company is responsible, but compared to my wireless Apple trackpad, the Master’s cursor movement under Bluetooth feels rough and jittery. On occasion, I’ve even seen it stop tracking altogether until the on/off switch is toggled. (Dozens of users have reported the same problem on Logitech forums.) Furthermore, Bluetooth performance seems to vary dramatically depending on software conditions. For example, if there are lots of windows on the screen and I free-spin the scroll wheel, the cursor might only be able to move once every half-second. One of my primary goals with this mouse was to have it immediately start working when setting up at a café, and this doesn’t quite pass my baseline of “working”.

    The other issue is gaming performance. This feels like a minor and nebulous nitpick, but at this point I’m reasonably certain that I’m not imagining it. Presumably on account of the additional 10-20ms of latency, I just don’t feel as in-control with this mouse as I do with my G602. When playing FPS with the G602, I have no problem running down a hallway, executing a precise about-face, and then turning right back in the span of a half-second. With the Master, this gesture simply feels sluggish and unnatural. I often have to make a concerted effort, and I still end up taking longer or missing the target altogether. The effect is immediately noticeable when using the two mice side-by-side over the course of a long gaming session. The G602 just feels much more precise.

    I really wish it was practical for me to keep this mouse, but it’s not quite the master-of-everything I was hoping to get. Still, I have no doubt most people would be very happy with the MX Master, especially now that the 2S revision has been released.

    If you decide to get this mouse, I recommend grabbing a Hermitshell case. It fits the mouse perfectly and keeps it safe for tossing into your backpack or bag.



    In the course of doing latency testing for my previous article on the Logitech MX Master, I discovered a couple of flaws in my helper app, and I also realized that I should have probably recorded a few more sample points. So now, as a followup, I have devised a better testing methodology and run a full suite of tests. Unfortunately, with this new data in hand, I must now retract my original recommendation. The Master is still a good mouse for the average user, but its wireless performance is just too unreliable for precise gaming or Bluetooth use.

    If you’re looking for a great all-arounder, I would instead give my highest recommendation to the G403 Wireless, which I’ve been happily using for several months with zero issues. While this mouse does require a dongle and only has a tenth of the Master’s battery life, its best-in-class performance, non-existent latency, svelte form factor, and incredible clicky side buttons more than make up for these downsides. Better yet, you can routinely find it on sale for $50 or lower on Amazon and at Best Buy. I’ll try to post a fuller account sometime in the near future.

    In the meantime, here are the new test results for the MX Master, G602, and MX 518.

    Data

    Note: I also ran a few tests in microe’s MouseTester to compare the motion graphs of the three mice, but they looked pretty much the same to my eye. So I think the difference in feel of these mice is mostly due to latency and, to a lesser degree, weight and shape.

    Results

    Since last time, my helper app has been revised to explicitly update the left label every frame instead of implicitly relying on AppKit’s timing. I’ve also switched my scrubbing program from VLC to QuickTime, as the latter additionally allows you to step backwards frame-by-frame. (Extremely useful if you happen to overshoot the mouse movement point!) Combined with a Numbers spreadsheet for processing, the sampling process took maybe a minute or two per data point.

    In every test, the cursor begins moving two screen frames ahead of the right label, so there’s on the order of two frames of extra latency (~33ms) in these measurements. Subtracting this amount from the recorded values will get you closer to the absolute latency of the mouse. But again, if you’re only comparing these numbers to each other (which I am) then the extra latency doesn’t really matter. You may as well just subtract the latency of the wired mouse since that’s as close as you’re going to get to zero.
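    A tiny sketch of that normalization, using the average figures quoted in this article (the dictionary and its labels are just for illustration):

    ```python
    # Normalizing measured latencies as described above: subtract the wired
    # mouse's reading so the wireless mice are compared against a near-zero
    # baseline. Values are averages quoted in the article (ms).

    WIRED_BASELINE_MS = 33  # MX 518 on the left USB port

    measured = {
        "MX Master (USB, worst run)": 62,
        "MX Master (USB, right port)": 45,
        "MX Master (Bluetooth)": 65,
        "G602 (best run)": 34,
    }

    # Latency relative to the wired control mouse
    relative = {name: ms - WIRED_BASELINE_MS for name, ms in measured.items()}
    ```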

    I ended up testing both USB ports on my Macbook because I’ve had USB peripherals behave differently depending on which side they used. Not sure if the resulting variance is due to the ports themselves (power issues?) or simply reception.

    OK, on to the results!

    As the wired “control”, the MX 518 showed 33ms of average latency with the left USB port and 38ms with the right. Theoretically, none of the other results should have surpassed this value — though the G602 stood a slight chance with its higher 500Hz polling rate.

    During its worst run, with the adaptor plugged in to the left USB port, the MX Master had 62ms of average latency, or 30ms more than the wired MX 518. However, every subsequent run resulted in significantly quicker average values. Two more tests with the left USB port — one using a USB extender and one while simultaneously charging — gave me a better average of 54ms for both. And with the right port, things got better still, with two runs sporting an average of 45ms (including dips down to the thirties) and the other two responding at a respectable 47ms and 49ms on average.

    With Bluetooth, the Master responded at an average of 65ms. So my conclusion in the original article was overly optimistic: there can be up to 20ms difference between Bluetooth and the USB adaptor.

    During its first trial, the G602 reported with an astounding 34ms of latency — just 1ms more than wired! However, each subsequent run (including one with the very same setup as the first) only gave me 50ms on average.

    What can we conclude from these results?

    The main issue with the Unifying receiver seems to be that the latency is rather inconsistent and spiky. With the G602, regardless of whether it’s averaging 35ms or 50ms, the latency curve is always baby-butt smooth. In contrast, the Unifying receiver needs to be pampered to attain optimal performance.

    It seems that a variety of minute factors can drastically affect the latency of these mice, ranging from adaptor placement to USB port selection. The G602 might have a lower baseline than the MX Master, but the Master can still come within a respectable 10ms of that baseline. And in any case, it seems the G602 can’t be guaranteed to perform in this range. I wish I knew what caused the G602 to spike up to 50ms for all its subsequent trials!

    The first sample point in several of the MX Master runs was much higher than the rest. (A few are omitted since they’re not representative of the average running latency.) I assume this is the result of some energy-saving feature. Doesn’t really matter for games since you’re constantly moving the mouse anyway.

    The battery level in the MX Master doesn’t seem to have much of an effect on performance.

    Conclusion

    I spent another week with the MX Master in daily use, and unfortunately, I had to accept the results: the Master was noticeably 1-2 frames behind my other mice. Frankly, I was really surprised by how much this affected gameplay. With the MX Master in CS:GO and Overwatch, I always felt like I was a little drunk. My cursor would constantly overshoot and I would miss many of my flick shots. Hot-swapping the G602 brought an instant wave of relief: my sense of immersion immediately returned and I felt like I could aim almost twice as well. (Maybe this is what happens when you hammer your synapses with FPS gameplay over the course of two decades!) I tried to account for the placebo effect as best as I could without doing a completely blind test, but I could easily see my performance suffer even when running around and shooting bots in the Overwatch training area.

    I followed up with a few more informal measurements, and all of them continued to show the MX Master trailing the G602 in performance — mostly on account of the lag spikes, but sometimes pretty drastically even on average. I also discovered that Bluetooth performance was quite unreliable on the Mac side, frequently dropping off or disconnecting altogether and requiring a hard mouse reset. Given that the Master was intended as an all-arounder for both gaming and Bluetooth use, this was a huge disappointment. It clearly wasn’t up to snuff in either respect, and I decided to send it back.

    As a last-ditch stop in my mousing hunt, I visited my local Best Buy to take a gander at Logitech’s gaming mice. The lineup had all the problems I was expecting: tacky designs, an overabundance of buttons, horrible tilt-click scroll wheels… except for the lone G403. As soon as I put this mouse in my hand, I knew it was the one. This was the only wireless gaming mouse that had just the five standard buttons in a classic body. Its scroll wheel was the normal kind, not the mushy tilt-wheel kind. Its internal hardware was the same as that of the possibly-best-in-class G900. And most surprising of all, its side buttons were actually clicky! (I know it’s such a small detail, but I hadn’t used a new mouse with clicky side buttons in years.) Before me was a phenomenal gaming mouse in the guise of a business accessory, evocative of the classic Microsoft Intellimouse — and USB dongle or not, this was exactly the mix I was searching for. I took it home and haven’t had a single complaint in the three months since. (Bonus: it fits snugly in my MX Master Hermitshell case.)

    The Master is 80% of the way to being an ideal all-arounder, but sadly, it’s killed for power users by inconsistent performance.



    I am once again in the market for Bluetooth headphones. My last pair, the first edition Sony MDR-1RBT, served me very well for the last four-and-a-half years. I didn’t (and still don’t) have much experience with the high end of the audio market, but when I was picking them out, they were among the best sounding headphones I’d heard. The feeling of space was especially startling: for the first time in a headphone, I felt like the music was all around me instead of being localized to a point between my ears. Now, parts of it were held together with glue and the remaining pleather was flaking all over my jacket. The time was right for an upgrade.

    I started to research comparable, $200 to $400 over-ear wireless replacements. My wishlist included noise cancellation, multi-device pairing, volume controls that interfaced with your device, tactile controls with previous/next buttons, and foldability. The only real technical requirements were audio quality, a zero-latency wired connection, and at least some kind of physical control. But non-technical, quality-of-life attributes were also very important. It’s hard to deny that headphones play many important roles in our lives apart from simply reproducing audio. They are most certainly a fashion item. They serve as earmuffs in cold weather. They block out the outside world when we need to retreat into our work. For many of us, they’ve become one of the most important and frequently used accessories in our wardrobe! Soon, we’ll be seeing health and fitness sensors incorporated right into the earcups. Especially with the cord cut, there’s a lot more to headphones than just audio these days.

    Based on a wide and thorough reading of the field — Head-Fi, InnerFidelity, Les Numériques, /r/headphones, and more — I eventually whittled down my list to four pairs of headphones: the Audio-Technica ATH-DSR7BT, the Sony WH‑1000xM2, the Bowers & Wilkins PX, and the V-MODA Crossfade II Wireless. Since it was impossible to proceed past this point based on stats alone, I decided to get them all together in a room and give them a thorough, comparative workout.

    Technology

    The Audio-Technica ATH-DSR7BT came widely recommended, but my first impressions were sorely disappointing. What I took out of the box was a creaky, plasticky set that clamped down hard on my head. The earpads don’t feel comfortable at all — certainly nothing like the pillows on my old MDR-1RBT. The cover over the USB port will not stay seated. The strange IR play/pause “button” positioned next to the regular buttons screams “budget design” and frequently misfires. The headphone volume controls appear to act independently of your paired device, forcing you to juggle two different sets of volume. There is also a quiet, tape-like hiss audible over silent passages. My 1RBT has occasional buzzing and popping over silence, but not to this degree. Perhaps the only technical advantage here is the fact that the volume rocker performs double duty as previous and next: very useful for jogging back and forward in audiobooks and podcasts.

    The Sony WH‑1000xM2 also didn’t amaze me in the build quality department. Though these are in fact the lightest set of the bunch, the materials here simply feel cheap. The plastic is on the same level I would expect to see in a bundled TV headphone, not a $350 product. Comfort-wise, they are OK but not great. The pads in particular are strictly pragmatic and lack the gentleness of my MDR-1RBT. The earcups are roomy enough and the clamping force is light enough that the headphones slide around on your head when moving around. In terms of fashion, I’m just not fond of the design. It’s hard to describe these headphones as anything but office-grey boring, though the parallel h.ear on 2 models do come in different colors. (On that note, I’ve actually seen some scattered accounts of people who prefer the sound of the h.ear on 2 over the 1000xM2, even though h.ears look identical and cost $50 less. Reviews are still scarce, though, so it’s hard to know for sure.) On the upside, these headphones fold and come with a carrying case.

    Aside from the two buttons for switching power and ANC, the controls here are touch based. You can double-tap on the right earcup to play/pause and swipe in one of four directions for previous/next and volume adjustment. I didn’t expect to prefer these controls over physical buttons, but in practice they worked flawlessly. Buttons are often tricky to find on earcups anyway, so having a wide berth for gestures is a big plus. (No pun intended.) And in any case, it’s certainly worth having for the discrete previous/next gestures. You can also place your hand over the right earcup to pipe sound in during conversations. This feature is not useful to me since I’d prefer to just take my headphones off out of politeness, but maybe handy for noisy offices with ANC enabled.

    Unfortunately, it doesn’t seem that these headphones have the ability to pair with multiple devices simultaneously. (I did not test this myself, but it’s been mentioned in a number of forum posts.) This means you have to do the Bluetooth disconnection dance if you’re switching between laptop and phone, which is rather annoying.

    As reviews have stressed, ANC does not seem to affect sound quality in any major way. I didn’t visit any noisy environments with these headphones so I can’t vouch for the ANC’s effectiveness, though reviewers have reported it to be best-in-class. The Sony Connect app is a sight to behold, featuring several pages of settings mostly related to ANC behavior. There’s even a built-in EQ. (I tried out some of the surround effect settings, but the results were frankly underwhelming.)

    In terms of background noise, there is surprisingly very little here. If you strain very hard you can detect a slight hiss when the amp is on, but it’s basically imperceptible.

    The Bowers & Wilkins PX is certainly the most interesting headphone of the bunch. In terms of build quality, it really can’t be faulted. Every surface features lovely premium materials, including metal and real leather. The design is very attractive too, though I confess that I’m not too fond of the gaps the headband forms with the sides of my head. The pads aren’t cushy, but I don’t find them to be actively uncomfortable like some reviewers have. In fact, coupled with the high clamping force, I can really appreciate just how firmly these headphones sit on my ears. No amount of motion is likely to dislodge them. This also gives the PX the best passive isolation I’ve heard in a headphone. You don’t even need to enable ANC to block out most of the outside world. The earcups are fairly small, so this pair might not be a good choice if you have large ears. Personally, I find them cozy.

    Tech-wise, the PX attempts some very interesting things. First, the headphones pause your music when you take them off and then resume playing when you put them back on. In essence, this works almost the same as with the AirPods. Some people have complained about the sensitivity (which you can adjust in the B&W Control app) but I found it to be tuned just right. The feature works just as well on Mac and Windows, and on a technical level I was surprised by how many different kinds of media it was able to pause. (YouTube videos, for example.) Unfortunately, there are several flaws that might compel you to disable the feature altogether. (This is also possible through the app.) One, if you remove a single earcup to talk to someone, the headphones might pause and then immediately resume after detecting the back of your neck. Two, if you pause your media before taking off your headphones, it will still start playing after you put them back on. This can create some loud and unpleasant surprises.

    Next, these headphones have the ability to connect to multiple devices simultaneously. This doesn’t mean that your laptop and phone can interleave their audio, but it does allow you to switch back and forth between devices without any friction. The execution here is a bit questionable, though. If you’re connected to two devices and one of them produces a sound, the other device will automatically pause its media in the same way as the auto-pausing feature. In practice, this means that you have to put your phone on mute if you’re also connected to your laptop, since any notification sounds will immediately pause your music. If you prefer not to switch devices this way, you also have the option of quickly disconnecting from the currently paired device by way of a brief hold of the power button. This is significantly improved over the usual Bluetooth headphone switching hullabaloo, which involves going to your paired device and manually disconnecting the headphones in the Bluetooth device list. Having a disconnect option right on the earcup is a great convenience.

    ANC has its own switch, and you can adjust the sensitivity from inside the app. Reviewers have pointed out that sound quality suffers when ANC is set to the highest setting, and from cursory testing I have to agree. The loss is diminished when ANC is switched to the lower setting (Office), though the cancellation does seem quite weak there. The highest setting (Flight) surprised me with just how serene it made the surrounding world. Riding on a train felt just like sitting in a quiet room. Plus, there was no sense of pressure at all.

    Battery saving on the PX is fairly aggressive. If there’s no audio stream detected for about a second, you can hear the amplifier turning off. (You can tell because there’s a very, very faint hiss when the amplifier is on. I’d say it’s on par with the 1000xM2 and only really detectable in contrast to absolute silence. These headphones have a very clean sound.) Then, when streaming resumes, the audio takes a second to fade back in. This is much improved over my MDR-1RBT, which simply switches off on audible silence and causes cleaned-up audiobooks and podcasts to glitch out1. Here, even if there’s no sound coming from the drivers, the amplifier will remain on as long as your software has an active audio session going. (I believe this is the case with the other three headphones as well.)

    Very annoyingly, the PX seems to go into “standby” after 2 minutes of inactivity. That’s what the manual says; in practice, they seem to shut off entirely, since I can’t get them to reconnect without hitting the power switch first. None of the other headphones do this: once connected, they stay connected until you turn your computer or headphones off. I ought to be able to leave these on a table, then pick them up after 15 minutes and resume listening immediately.

    The PX, really, has three modes: wireless, USB wired (which turns the headphones into a USB output device), and “analog” 3.5mm. That last mode is in quotes because even though the PX has a 3.5mm output, the headphones still have to be charged in order to use it. A bit annoyingly, when in USB mode (though thankfully not in 3.5mm mode), the headphones apply the same power-saving, amp-switching logic as in wireless mode. In practice it’s not really a problem, but you might be surprised if you hit the volume button expecting a beep and don’t hear anything because the amp hasn’t warmed up yet. I perceive the USB wired mode as having very slightly more latency than the 3.5mm mode — maybe 1 to 2 extra milliseconds — though I only tested this by switching back and forth in a game of Overwatch. I was hoping to gain access to the microphone over USB in order to use the PX as a quick-fix gaming headset, but this functionality is unfortunately locked away. The microphone does technically show up as an input device in both macOS and Windows, but it doesn’t produce any sound.

    Meanwhile, the 3.5mm connection is very audibly noisy — far more than either of the other modes, including wireless. Usually this manifests as a hum, but sometimes you can even hear crackling or static. This interference is amplified twofold when the USB cable is attached simultaneously, making me think that the analog lines inside the headphones haven’t been shielded for some reason. (Other users on Head-Fi have suggested that this might be a grounding issue, and indeed the hiss does change depending on the device and even the workload, but there doesn’t seem to be any way to make it disappear completely. And in any case, plenty of ungrounded devices sound great with my other headphones, so it’s a sorry excuse.) Since 3.5mm is the only analog, near-zero latency connection to this headphone, this strikes me as a pretty big oversight. The quality of the analog connection might not matter for games, but if you’re interested in musical performance, having any sort of noise on that line is simply unacceptable.

    Along with the ANC and power buttons, there are three controls on the right earcup: one multi-use button for play/pause/forward/back and two buttons for volume adjustment. I wish it was possible to make the volume buttons act like previous/next instead, since double- and triple-clicking the multi-use button for navigation can be finicky. (It’s been pretty consistent for me, but navigating in the wrong direction even one time out of twenty is really frustrating.) At least the multi-use button is contoured and very clicky so you can definitely tell when you’ve pressed it.

    The PX earcups turn flat for travel and come with a magnetic pouch, but the space savings are minimal. Actually, I wish the cups turned in the opposite direction, since wearing them over your neck with the earcups facing up is a bit awkward.

    Of the four headphones, the V-MODA Crossfade II Wireless is certainly the most comfortable and (to my taste) the best looking. There’s some plastic here, but it’s the good kind of plastic. You get the sense that you’d be able to throw these headphones around without much fear of damage. The design is fun and a bit 90’s, and I particularly love the subtle gold accents in my color scheme2. The earcups are soft and pillowy and fit snugly over your ears. Many people suggest getting XL earcups for these headphones, but I actually like the default size.

    Much like the PX, you can pair these headphones with multiple devices. Unlike the PX, audio does not pause when sound comes in on a second device. My empirical understanding is that you can only start listening on a different device once the active audio session on your current device finishes. In practice, pausing your media seems to trigger this, or, in the worst case, closing your music or video player. I prefer this behavior to the PX’s, since I don’t have to work around notifications interrupting my playback.

    Much like in the ATH-DSR7BT, there’s a very noticeable hiss in wireless mode over silent passages. In practice you can’t hear it over most music (except classical), but it is always there in the background, subtly shifting the quality of your music. I’ve heard it said that these headphones were designed first and foremost to be great wired cans, and at least two people have corroborated that they do in fact sound terrific in analog mode. (Many other Bluetooth headphones don’t sound nearly as good without their circuitry switched on.) The included 3.5mm braided cable features a microphone and a multi-use button, though the microphone quality is very poor.

    Oddly enough, iOS seems to track the battery level of these headphones in 20% increments. (The PX appears to use 10% increments, but at least you can check the precise battery level from the app.) For some boneheaded reason, V-MODA decided that it would be a great idea to have the headphones beep once a minute for the entirety of the last half-hour of battery life. The effect only serves to shorten the battery life even further since it’s just so difficult to tolerate.

    The physical controls work the same as on the PX: three buttons, two for volume and one multi-use. Here, the multi-use button sits on top of the earcup’s hexagonal inlay so it’s fairly easy to feel out. The buttons are a bit mushier than on the PX, though, making it easier to misclick if you’re going for those double or triple clicks.

    V-MODA has a couple of out-of-band perks that are worth noting. First, you can send in your headphones even if they get completely trashed (as long as the model is still on sale) and receive a 50% coupon for a comparable model in return. (Especially important in the fast-moving world of wireless audio!) Second, the company is closely affiliated with the Head-Fi userbase and seems very much in touch with the audio enthusiast community. This gives me a bit more faith in their product and support than I would otherwise have.

    Sound

    First, I should state for the record that I’m not really an audiophile. To be sure, I enjoy high quality audio and keep a hard drive worth of FLACs on standby. But the most expensive pair of headphones I’ve ever owned is the MDR-1RBT, and I’ve not had an opportunity to test (let alone ABX) any true audiophile cans. My only high-end frame of reference is a Sennheiser HD 800 S I briefly demoed at the SF Sennheiser store. (Incidentally, this experience was an excellent calibrator for all my future audio expectations. Never thought audio could sound so crisp and spacious through a pair of headphones!)

    Second, I’m an audiophilia skeptic. Though I do love my FLACs as a collector, I believe that V2 MP3s (or equivalent) are universally transparent for the vast majority of music. I’m also very wary of magical-sounding terms used to describe audio equipment, since the placebo effect is so darn powerful when it comes to sound. For this test, I tried my best to describe exactly what I heard and to compare headphones as analytically as possible. This required switching sets many times over the course of a single piece of music and sometimes even a single section. I don’t have much hands-on experience with the vocabulary used in audio enthusiast circles (e.g. “dark”, “warm”, “detailed”, “analytical”, etc.) so I tried to describe the sound in my own words. Even as a skeptic, I was surprised by just how unique each headphone sounded on close listen!

    For testing, I got four iOS devices and paired a headphone to each one. I tried to equalize the volumes as best as possible. Then, I would pick the same piece of music on each device and hit play simultaneously, switching between headphones as the piece went along. As I listened, I took notes on each headphone. My playlist featured old and new favorites across a diverse set of genres as well as a few fancy masterings. I tried to select pieces that were well-mastered and featured a variety of instruments, frequencies, and sounds. Here’s what I ended up listening to over the course of several hours:

    • Dire Straits — Sultans of Swing
    • Deep Purple — Space Truckin’
    • The Clash — London Calling
    • Prince — Purple Rain
    • Brian Wilson — Good Vibrations
    • Sufjan Stevens — Decatur
    • Nickel Creek — Reasons Why
    • Punch Brothers — Passepied
    • Acoustic Alchemy — Mr. Chow
    • Acoustic Alchemy — No Messin’
    • Paul Gilbert — Three Times Rana
    • Rebecca Pidgeon — Spanish Harlem
    • Elliott Smith — Bottle Up and Explode
    • Poe — Hey Pretty
    • Tori Amos — A Sorta Fairytale
    • Radiohead — Let Down
    • Radiohead — How to Disappear Completely
    • Radiohead — Kid A
    • The Bug — Poison Dart
    • Porcupine Tree — Mellotron Scratch
    • The Derek Trucks Band — Sahib Teri Bandi/Maki Madni
    • Polyphia — Finale
    • Galneryus — Lament
    • Tipper — Gulch
    • Tipper — It’s Like
    • Ott — Adrift in Hilbert Space
    • Opiuo — Axolotl Throttle
    • Yori Horikawa — Letter
    • Песни Нашего Века — Контрабандисты
    • Песни Нашего Века — Гренада
    • Песни Нашего Века — Купола
    • Pat Metheny — Last Train Home
    • Beethoven, Josef Bulva — Piano Sonata No. 14 in C# Minor, Op. 27/2 “Moonlight”: III. Presto agitato
    • Rachmaninoff — Étude-tableau in B Minor, Op. 39/4
    • David Bowie — Lady Stardust (Ryko Au20)
    • David Bowie — Moonage Daydream (Ryko Au20)
    • Supertramp — Take the Long Way Home (MFSL)
    • Pink Floyd — Time (MFSL)
    • Pink Floyd — Wish You Were Here (Mastersound Gold)
    • Pink Floyd — Hey You (MFSL)
    • Metallica — Master of Puppets (DCC)
    • Miles Davis — All Blues (Japan DSD)
    • Stan Getz & João Gilberto — Doralice (MFSL)
    • Stan Getz & João Gilberto — O Grande Amor (MFSL)

    The Audio-Technica ATH-DSR7BT was the obvious straggler of the four. Although the soundstage was reasonably spacious (more so than the Crossfade) with good instrument separation, crispness, and detail, I frequently noted that the tone was shrill, sibilant, and harsh — even tinny. This was the only pair that was actually unpleasant to listen to in some vocal sections. It often felt like the low end was attenuated, leaving behind jagged, grating remains in the higher registers. Which is not to say that this pair sounded bad, but compared to the competition, it felt like it lagged a few generations behind. Coupled with the hardware issues, bizarre design, and questionable build quality, I would definitely avoid this one.

    Next in ranking was the Sony WH‑1000xM2. The soundstage here was very wide, but it felt, for lack of a better word, spherical. The sound lacked dimensionality and detail. The instruments sounded kind of murky and blended with each other instead of sticking out in space. (I wrote down: rounded, hollow, dull, muted, faded.) The bass was quite boomy and exaggerated in a way that just didn’t feel very realistic. Songs would often sound overly resonant or reverby. I realized while listening to this set that none of my favorite musical moments had any pop or excitement to them. These headphones sounded fine, but they kind of sucked the life out of everything. They simply felt cold. (Surprisingly, even my MDR-1RBT sounded comparatively warm and lively despite having a flatter and much tinnier sound.)

    The last two headphones were just so different in their tone that I couldn’t assign them an order.

    The B&W PX was definitely the odd one of the bunch. The first thing that jumped out at me was its sheer spatiality. Much like my memory of the 800 S, recordings that used to feel relatively flat suddenly sounded almost binaural with this pair. Instruments would hang in the air with a wonderful amount of empty space between them. If you closed your eyes, you almost had the sense of being in the middle of a concert stage. (I wrote down: crisp, balanced, layered, subtly detailed, like a band playing around you.) For genres like jazz and classical (especially piano), this effect was simply transcendent. However, in comparison to the other headphones, there was something odd about the tone of certain instruments. Vocals in particular sounded a bit metallic, compressed, or rounded in a way I couldn’t quite pin down. I don’t know if I would have noticed if I was just listening to this pair in isolation, but in contrast especially to the Crossfade, the difference was quite stark. Maybe it’s simply a matter of preference; maybe it’s a sonic flaw; or maybe some longer burn-in is required. I know people on Head-Fi have complained about a boxy or tunnel-like character to vocals, but I don’t know if they were referring to the same effect, since the soundstage and detail are otherwise amazing. (I couldn’t get the pair to sound any different when pressing down on both earcups, as one user suggested. The earcup seals hold very well for me, even with glasses on.) I also found that the headphones had a very neutral sound. Vocals fell in line with the other instruments. Plucky guitars lost their warmth and reverb. Bass was certainly there, but reserved. Many songs sounded very immersive but lacked “wow” in their most kinetic passages. And especially combined with a hint of harshness in the upper registers (cymbals, occasional vocals), bassy electronic music tended to fall flat and lose its fuzzy, enveloping effect.

    The V-MODA Crossfade II, on the other hand, simply sounded fun. Incredibly fun. Several times, I wrote that it was like sinking into a pillow of sound. Its signature felt intimate and warm. The soundstage was relatively small, but you could hear each instrument plucking away around you. And oh, those plucks! You could almost physically feel every string on the guitar. (My notes mentioned a rich, cozy, lush, plump, and enveloping sound.) Color-wise, bass is clearly emphasized (to the point of occasionally being boomy) and vocals float forward. In terms of detail, I’d say they beat the Sonys but defer to the B&Ws. You can still pick out the smaller details, but it’s not quite as clear and the instruments don’t have their near-binaural separation. (In fact, I noted that the soundstage occasionally felt a bit flat and that the vocals sometimes sounded a bit muddy or fuzzy.) Nonetheless, whereas the PX astounds in certain genres (jazz, classical) but flubs in others (downtempo, trip-hop), the Crossfade excels in music featuring bass, vocals, and acoustic guitars and does a good job with just about everything else. Best of all, this pair is generous with its “wow” moments. Crescendos, bass drops, and other dramatic passages hit you just as hard as the artists intended. At its worst, the Crossfade produces a hyperactive mass of sound. At its best, it grabs you and all but forces you to move your feet along. And unlike the PX, the sound doesn’t demand your attention. You can sit back and just let it wash over you. (Incidentally, I knew going in that these were considered basshead cans, but I tried my best to avoid the trap of conflating bass with sound quality. Indeed, I thought the Sony headphones actually had a bassier sound, but there it felt hollow and reverby, while here it was resonant, impactful, and lively.)

    An Inconclusive Conclusion

    Choosing between the PX and Crossfade has been rather difficult. I love the Icarus-like reach of the PX towards a higher audio plane but feel burned by its glitches and oversights. The aggressive power saving causing the unit to frequently turn off is quite annoying, though the long battery life (30-40 hours) is certainly a huge plus. Meanwhile, the Crossfade keeps things very simple and elegant on a technical level but suffers from a pretty short battery life (10-15 hours), along with incessant beeping towards the tail end. The Crossfade also has that frustrating hiss over silence in wireless mode, though wired performance is quite excellent. The PX stays perfectly silent when used wirelessly or over USB, but has an even worse hiss in 3.5mm wired mode. In terms of materials, the PX feels more premium but I think the Crossfade just looks better when worn. It’s also significantly more comfortable in a direct comparison. But despite the cushier materials, I’ve found the clamping force to be more annoying on the Crossfade than the PX over several hours. The PX also clearly wins out in terms of sound isolation, both passively and with the help of ANC. Sound-wise, the two headphones couldn’t be more different. The PX is spacious, detailed, and sterile, while the Crossfade is enveloping, warm, and resonant. Jazz and classical music spring to life with the PX, while plucky guitars, vocals, and electronic music reverberate through the Crossfade and make you want to get out of your seat. In terms of support, the PX are set to get frequent firmware updates, but V-MODA has an excellent reputation and a lucrative trade-in program. The choice is made even harder by the fact that I got the Crossfade on sale for $240 while the PX is fixed at $400 with no sales in sight.

    I think I’ll have to spend a few more hours listening to each set before I make up my mind!

    1. I even wrote an open source iOS app to help me with this: ios-bluetooth-headphone-unsleeper. All it does is play a near-silent audio stream when you hit the switch, forcing the MDR-1RBT to stay on while the app is running.
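    (The trick behind that app is simple enough to sketch. The idea is to feed the headphones an audio stream that is technically nonzero but far below the threshold of hearing, so they never detect silence and never power down. Below is a hypothetical illustration in Python using only the standard library; the real app is an iOS project, and the amplitude and frequency values here are my own guesses at sensible numbers, not taken from it.)

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # CD-quality sample rate
AMPLITUDE = 3        # out of 32767 for 16-bit PCM: roughly -80 dBFS, inaudible
FREQ_HZ = 1000.0     # any nonzero tone keeps the stream "alive"

def near_silent_samples(n):
    """Generate n 16-bit PCM samples of a barely-nonzero sine tone."""
    return [int(AMPLITUDE * math.sin(2 * math.pi * FREQ_HZ * i / SAMPLE_RATE))
            for i in range(n)]

def write_keepalive_wav(path, seconds=1):
    """Write a short mono WAV file that a player can loop to keep the amp awake."""
    samples = near_silent_samples(SAMPLE_RATE * seconds)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)            # mono
        w.setsampwidth(2)            # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))

if __name__ == "__main__":
    write_keepalive_wav("keepalive.wav")
```

    (Looping a file like this in any player holds the audio session open; at around -80 dBFS, the tone sits well below the headphones’ own noise floor, so you hear nothing.)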

    2. Incidentally, the rose gold model costs $20 more than the others and happens to be the only one that supports the high resolution aptX codec. Why? I have no idea!



    (Sorry about the length! At some point in the distant past, this was supposed to be a short blog post. If you like, you can skip straight to the demo section which will get to the point faster than anything else.)

    Embarrassingly, most of my app development to date has been confined to local devices. Programmers like to gloat about the stupendous mental castles they build of their circuitous, multi-level architectures, but not me. In truth, networks leave me quite perplexed. I start thinking about data serializing to bits, servers performing secret handshakes and negotiating history, merge conflicts pushing into app-space and starting the whole process over again—and it all just turns to mush in my head. For peace of mind, my code needs to be locally provable, and this means things like idempotent functions, immediate mode rendering, contiguous data structures, immutable objects. Networks, unfortunately, throw a giant wrench in the works.

    Sometime last year, after realizing that most of my ideas for document-based apps would probably require CloudKit for sync and collaboration, I decided to finally take a stab at the problem. Granted, there were tons of frameworks that promised to do the hard work of data model replication for me, but I didn’t want to black-box the most important part of my code. My gut told me that there had to be some arcane bit of foundational knowledge that would allow me to network my documents in a more refined and functional way, without the stateful spaghetti of conventional network architectures. Instead of downloading a Github framework and smacking the build button, I wanted to develop a base set of skills that would allow me to easily network any document-based app in the future, even if I was starting from scratch.

