The resilience of simple solutions

2019-06-26 @Lifestyle

The reason I prefer the simplest of tools to address problems concerns not only aesthetics. These solutions tend to also be faster to deliver. They tend to be more flexible to resolution and adaptation across an entire arena of tinkerers. They tend to be pluggable into other solutions, yet not constrained by them. And they tend to be very resilient to unforeseen environmental factors.

Let’s consider some cases.

A tasty beverage

Suppose I want coffee, a practical craving I satisfy two to three times daily.

Sometimes I resort to instant, which demands nothing but pouring boiling water over a teaspoon or two of the specially processed, condensed coffee grains. It consumes about a minute of prep time, something I appreciate at 5:30AM to facilitate a speedy transition to journal writing.

Now suppose I prefer the more glorifying experience of raw coffee grains, as is usually the case. Many rely on coffee machines. Or espresso machines.

These take space, employ mechanics of varying complexity, can require occasional maintenance, cost anywhere from USD 10-500, and come with an instruction manual.

Culture has also invented compound coffee-based cocktails. And those little cylindrical, single-serving plastic coffee tubes.

Coffee has been consumed for centuries, far predating electricity or complex sweeteners. All it demands is boiling water and a container.

Unless you rely on an electric stove or electric tea pot, you need not depend on electricity.

You simply add the ground coffee beans to cold water, set it over sufficiently low heat to brew for 3-4 minutes, and once the water boils, pour it into a cup, slowly, to keep the raw grains from entering. Or wait a couple of minutes to let them settle further before serving. Or add a tiny bit of cold water to expedite the settling. Produce as many portions as you desire at once.

If, on the other hand, you prefer the pour-over method, you need no machine. Simply boil the water separately, then find any small steel filter (or a paper one, if you have a flask to place underneath).

If you grind your own beans as I tend to, you need not even use an electric grinder. Manual designs exist (i.e., the ones with a spinning top lever) which demand only slightly more time.

Any of these methods you can employ anywhere you can boil water. There’s hardly any maintenance aside from disposing of the used coffee grounds (or leveraging them for horticultural/exfoliating/cleaning/cooking/scent purposes).

No specialized coffee machinery is ever necessary, and if you grow accustomed to this basic method of preparation, you will never depend on spare parts, malfunctions, the supply of wasteful plastic coffee tubes, or electricity. You can prepare coffee in a forest over a fire.

The misadventures of the Brave Little Toaster

How about a toaster? A convenient contraption, it takes one or more slices of fresh bread as input and produces toasted bread as output in under a minute.

Now, the mechanics wear out and break down. The toaster takes space, uses electricity, and costs money, which, you may argue, amortizes to much time saved over its expected lifetime.

Have we forgotten how to toast a slice of bread on a skillet?

This process, too, should consume about a minute once the skillet heats up. No mechanical parts, possibly no electricity. Just a quick washing of the skillet.

I don’t think this process adds any substantial prep time after becoming a second-nature ritual. And it eliminates unnecessary space-consuming gadgets for something so pre-ancestrally simple. (It wouldn’t surprise me to discover IoT-ready programmable, remotely activated toasters, but I dare not explore.)

Let’s proceed to something a bit more polemic.

We love to email. Well, I don’t know about love. Email is but a protocol. Only the group society labels as geeks or nerds can feel platonic love for a protocol.

I respect a protocol. I admire a protocol for its simplicity, its effectiveness, its resilience to poor network conditions. I admire it for fashionable syntax, solid semantics, and acceptable timing, the cornerstone ingredients of any attractive protocol. And I most admire it for longevity - the ability to survive the test of time.

The email protocol, dating back to the 70s, fulfills those criteria.

It facilitates what I indeed love, the ability to compose and direct unconstrained, asynchronous communication to the party of my choosing.

Unconstrained because one may produce long, romantic prose without the need for parchment, pen, ink, or parcel.

Alternatively, it facilitates terse, corporate interchange lacking a subject heading, capitalization, words over four letters, and the slightest uncertainty in passing the Turing test.

It’s asynchronous because upon sending, we don’t place any strict technical expectations on the time of receipt or reply, at least insofar as the protocol is concerned, similar to a paper letter.

And while a paper letter allows the careful eye to gauge the authenticity of the sender by keen observation of the handwriting (and other Holmes-esque analog clues), email enables the security measures of public-key cryptography to ensure not only the identity of the sender but also confidentiality. Never mind the infrequent usage of these measures.

Email is simple to drive and maintain, with a 40-year history to solidify its reputation.

Anyone may host an email server or leverage the thousands available. Anyone with an email address corresponding to an email server (the @domain portion of the address) may communicate with any other email recipient across the world, all filtering and blocking aside.

One may transact directly with the email server via a web interface, or by means of an alternate client via IMAP, POP3, or some proprietary email retrieval protocol.

One may also download the email for local storage by similar methods. I download and transact with all my email offline via a series of Linux CLI tools, leveraging basic IMAP and SMTP functionality.
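The offline workflow needs nothing beyond the standard protocols and a standard library. As a minimal sketch (the addresses, host, and credentials are hypothetical), composing a message in Python is entirely host-independent; handing it to any SMTP server is then a single call:

```python
import smtplib
from email.message import EmailMessage

def compose(sender, recipient, subject, body):
    """Build an RFC 5322 message; independent of any particular host."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

msg = compose("me@example.org", "you@example.net",
              "Hello", "Long, romantic prose or terse corporate interchange.")

# Delivery works against any SMTP host; nothing above depends on which.
# (Hypothetical host and credentials -- not executed here.)
# with smtplib.SMTP_SSL("mail.example.org") as s:
#     s.login("me@example.org", "secret")
#     s.send_message(msg)
```

The point of the sketch is the separation: the message format, the transport, and the host are three independent choices.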

One need not feel attached to an email host or address. Further, the two need not have any dependency on each other.

A domain purchased and maintained at one registrar may point to a completely different email host, and be redirected at will. If you grow wary of storing your email in one location, switch email hosts and redirect your address (by a change in the DNS records at the domain registrar).
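In DNS terms, that redirection boils down to the domain’s mail exchanger (MX) records. A hypothetical zone-file sketch, with made-up provider names; switching hosts amounts to editing two lines:

```
; zone for example.org -- mail currently handled by provider A
example.org.   IN  MX  10  mx1.provider-a.example.
example.org.   IN  MX  20  mx2.provider-a.example.

; to switch hosts later, swap the MX targets; the address stays the same:
; example.org. IN  MX  10  mx.provider-b.example.
```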

Likewise, if seeking to change your address (due to overspamming or the yearning for adventure), you may choose a different one, yet leverage the same host, and the same exact mailbox even.

Beautiful abstraction.

It follows that one may combine the address, the host, the retrieval, and the composition methods in any way suitable. Each, at least by nature, is entirely abstracted from the rest.

At some point, the abstraction layers became diluted and symbiotically joined in the presence of mega-corporate cloud solutions.

I exaggerate of course. But over time, the public has forgotten about the simplicity of the original architecture and elected to bundle their solutions into a compound not easily diffused.

The Gmail hosted solution arguably spans the majority of email users today. The majority of email addresses I encounter contain this domain.

This doesn’t even count those other email domains nameserver-directed at the Gmail servers, but I’ll exclude them from this introspection, for those users already find themselves on firmer ground.

I see no reason to store all my mail on the Google servers, especially under the respective Gmail domain. Be it Google or any other dominant email provider, over time I grow skeptical of storing too much information with one too-prevalent entity. This especially applies to providers mining data for ulterior purposes.

Thousands of small to large, free or low-cost email providers exist. Paradoxical as this may appear, I feel more comfortable leveraging an email host run out of a low-redundancy data center in the basement of a 2-man sysop team than the conglomerate of an ultra-dominant cloud solution provider.

This may invite scrutiny, but I showcase the extreme scenario to make a point. Having that extra local vulnerability causes me to employ backup measures and become more conscientious of the big picture as a result.

First, by employing an external domain, I can quickly redirect the email to any other physical host, as I’ve already pointed out.

Second, I hold more trust in a small-scale solution of sufficient competence than in a large-scale solution of questionable motive and infinite ambition that transgresses my direct needs.

Third, the large-scale solution with all the comforts creates emotional dependency. You begin to associate the otherwise simple, decades-old service with the brand, with the interface, with the domain.

You begin to expect the otherwise unnecessary features: the tags, the categories, labels, plugins, integrations, interfaces, themes, et cetera. Ask yourself, is all of this necessary?

A more basic email solution provides the simple means to accomplish largely the same. Email already provides folders, filters for automatic processing and mail redirection, basic spam filtering, security, address-book functionality, etc.

Each feature may not represent the absolute state of the art, but does that really matter?

Rather than rely on the number-one spam-filtering solution on the market, what harm is there in assuming a bit less, and simply being mindful of how you parade your email address(es) to begin with?

The large-scale solution may achieve everything slightly better than the alternatives. But I don’t mind the second, the third, or the 17th greatest.

I can handle (and actually prefer) a more rugged interface that I would hardly use anyway, since I process my email offline via CLI tools and IMAP retrieval (incidentally generating yet another backup layer).

I prefer not to nurture too much comfort and hence fragility in placing all dependence on one leading provider. I prefer to have more transparency, more control, and cultivate a sense of mindfulness.

The alarming cloud formations

For a very long time there has been a trend of transitioning services into the cloud. Many follow suit without real necessity.

“Let’s roll down our sleeves and outsource the task to someone who does it better.”

Yet, for many of these ancient and trivial services, do you really need the better? Do you need to place dependence on the provider and a stable network connection for something relatively basic? Do you want to place all hope and trust in the cloud provider?

False needs spread like weeds. Usually, you don’t really need the cloud services. Many were devised, carefully packaged, and sold, irrespective of need.

I completely acknowledge and respect this, as capitalism couldn’t otherwise propel.

But as an individual, you also have a responsibility, primarily to yourself, to think for yourself, to fend for yourself, to intelligently assess your needs, and to discern the real from the fiction.

You can store your calendar in a cloud solution, which nicely synchronizes across multiple devices. You can share the calendar with others.

How about incorporating external calendars, such as the important milestones in the history of Hogwarts or the German Bauhaus?

Speaking from 10 years of calendar history I finally offloaded last year: I found it substantially (and eerily) revealing, detailed to the likes of a private journal. I now store and access it exclusively offline via a Linux CLI application.

Is synchronization that important? Probably not. You can operate fine via one access point.

And if synchronization were indeed that critical, leverage any of a myriad of calendar solutions with CalDAV support (the widespread protocol for calendar synchronization).

It doesn’t have to originate from a major provider with a state-of-the-art interface and malicious information-mining habits. Again, I would sooner entrust a basement hacker to host a server-side copy of my calendar if I felt the urge.

Now, having a backup is important if you manage your calendar entirely offline. Hence, back it up from time to time.

By being in control of these components, you won’t find yourself debilitated in case of a rare event. Note, being in control usually implies low levels of intervention and maintenance, not impulsive micromanagement.

Come to think of it, I could easily manage by storing all calendar-like entries in a simple text file, which many minimalist Linux calendar tools already resort to underneath.

Readable plain text is the most resilient data storage format you’ll encounter. No need to worry about extinct or legacy formats, incompatibility, or lack of support.
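A minimal sketch of that idea: one event per line in a plain text file, and a throwaway parser. The format here (ISO date, a tab, a description) is my own assumption for illustration, not what any particular tool uses.

```python
from datetime import date

# Hypothetical calendar file contents: ISO date, tab, description.
SAMPLE = """\
2019-07-01\tRenew the domain registration
2019-07-14\tBauhaus centenary lecture
2019-08-02\tBack up the mailbox
"""

def events_on_or_after(text, cutoff):
    """Parse 'YYYY-MM-DD<TAB>description' lines; keep those from cutoff on."""
    out = []
    for line in text.splitlines():
        stamp, _, desc = line.partition("\t")
        if date.fromisoformat(stamp) >= cutoff:
            out.append((stamp, desc))
    return out

upcoming = events_on_or_after(SAMPLE, date(2019, 7, 10))
# Two entries remain: the lecture and the backup reminder.
```

Any text editor, `grep`, or a ten-line script can query such a file; no format can go extinct on you.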

But I wouldn’t dare insinuate that you store (encode) what essentially constitutes plain text in some office-suite product. Never.

I find the cloud document storage and synchronization solutions similarly over-engineered and unnecessary for most cases. For the end user, I sense this started from the need to easily collaborate on and share documents with others.

I acknowledge this as an actual need to circumvent the grotesque tactics of emailing unfashionably large chunks of data back and forth. I struggle to remain calm upon receiving a 30MB email with 3 selfies photographed at 20 megapixels each.

And major email providers deliver emails of this magnitude. So thank you, document sharing solutions, as I can’t expect casual users to toss flash drives around, much less to scp/ssh data to who knows what servers.

It started with document sharing, yes. But with time it degenerated into storing a respectable portion of your documents in the cloud, not encrypted (by you, that is), and without the slightest intent to share or distribute. This, in an era when a 32GB micro-SD card costs the equivalent of a glorified coffee with milk.

You might say cloud storage offers cosmic levels of redundancy, security, and backup measures. That may be true, per a certain definition of security.

Now, I’ve never pretended to be more than amateurishly versed in security practices, but the little I know suggests that in a chain of cascading components, the weakest link breaks the chain.

And the weakest link is rarely the 2048-bit RSA encryption scheme. Where is that private key stored, anyway?

With regard to cosmic redundancy, how much do you really need?

Let’s suppose you store your data at N nodes, each carrying some low probability p of failure/data loss during some sufficiently discrete time interval. Assume independence of failure events. The assumption is not necessarily safe, but plausible, provided the N nodes are housed remotely under sufficiently varying conditions.

The odds of all failing simultaneously are p^N. Without the independence assumption, the odds are greater, but should still be astronomically small for a sufficiently large N, i.e., N > 1.
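Concretely, with hypothetical numbers: three independent nodes, each with a one-in-a-thousand chance of failing within the interval, give a one-in-a-billion chance of total loss.

```python
# Probability that all N independent nodes fail in the same interval: p**N.
p = 0.001   # per-node failure probability (assumed)
N = 3       # number of nodes

p_all_fail = p ** N   # 0.001 ** 3 = 1e-09, i.e. one in a billion
```

Two or three well-separated copies already buy nearly all the redundancy an individual meaningfully needs.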

For the record, I abandoned all SaaS (software-as-a-service) storage/synchronization solutions for all but the most ephemeral of data, i.e., a quick transfer.

I still leverage an IaaS (infrastructure-as-a-service) cloud solution for housing sizable data of mostly multimedia content, as I travel a lot and carrying a backup with me would severely defeat the point. Most of this data I wouldn’t much care to have revealed to the entire internet, but anything slightly more sensitive I encrypt myself.

I could go on indefinitely about the superfluousness of many cloud services, with respect to images, music, etc. But similar reasoning usually applies - fragility, dependence, habituation, opaqueness, complexity.

Regards to the disk jockey

À propos, I’ve been listening to FM radio a lot. It only takes the discovery of one ad-light, eclectic, and ambitious radio station to sustain the majority of my music-listening urges, hardly superseded by internet-streamed radio.

But I find the analog radio receiver preferable. It’s small-scale, old technology, having survived the test of time.

It runs on both AC and DC power. It immediately turns on. No delay, no buffering, no network connection constraints. No accounts or distractions. No caching hundreds of MB on disk or RAM.

Just the occasional static and electromagnetic interference, which, kept within bounds, I find as pleasurable as the LP scratching noise.

A plain vanilla (:

Here’s one I’ve nearly neglected: chat. At its core, chat carries out largely the same function as email - the exchange of text; usually terse and cryptic text steeped in regional dialect. But we all understand the text smiley.

I don’t know at which point this came to be, but the chat application with the greatest user base, at least across the destinations I’ve travelled, is presently the WhatsApp messenger.

In some countries with restricted SMS plans (and for other reasons), WA has become the de facto communication standard in lieu of cellular networks, email, and other internet-based methods I find more plausible, a point I’ll shortly elaborate on.

Now, WA presently, and for all practical purposes, requires a smart mobile/tablet device to operate, and only a device from one of the few leading manufacturers of touch-screen smart devices.

I have low tolerance for such devices, and I’m not the only one. In fact, I heavily prefer to chat on the (stationary) computer.

It follows, if I properly exercise my deductive faculties, that people like myself (never mind the low percentage of users we represent), in order to actively communicate with a respectable portion of mobile users across the world, must use, or at least keep operational and maintain, one such device.

More tersely stated, the application imposes a closed protocol/API, a fixed application/interface, and a very limited set of hardware. By that, I mean the 2% of plausible mobile hardware designs (if we survey the last 20 years of heavily varied phone manufacture) adored by 98% of users. All for the sake of chat.

No thanks.

As I stated, chat, on the surface, is not too distant from email, but with certain glorifying features, mainly with regard to timing.

There need not be a de facto chat application. With proper abstraction, the system decomposes into the protocol/API, the interface, and the hardware it operates on, each layer interoperable with the others.

Let’s consider the old but hardly obsolete XMPP chat protocol. It’s still widely used across the corporate landscape and some specialized domains.

Major chat solutions, including WA itself, Facebook Messenger, and GTalk, leverage a closed variant of XMPP, which, alas, eliminates a major purpose - the openness and flexibility to combine with any interface and hardware. (FB Messenger, as of this writing, still offers the flexibility.)

There is a major but. The traditional XMPP protocol is distributed, like the email protocol.

Anyone may host an XMPP server. Anyone may leverage an XMPP account on any open XMPP domain, to communicate with any other XMPP user from any other open XMPP domain. In fact, an XMPP username resembles an email address: user@domain. You see the similarity in the underlying principle.
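The resemblance is literal enough to sketch. A bare XMPP address (a JID) splits into user and domain exactly like a mail address; full JIDs add an optional /resource suffix naming the device or client. A toy parser, for illustration only:

```python
def split_jid(jid):
    """Split 'user@domain/resource' into its parts (resource optional)."""
    address, _, resource = jid.partition("/")
    user, _, domain = address.partition("@")
    return user, domain, resource or None

# The same shape as an email address, plus the optional resource:
print(split_jid("alice@example.org"))         # ('alice', 'example.org', None)
print(split_jid("alice@example.org/laptop"))  # ('alice', 'example.org', 'laptop')
```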

Naturally, there is no notion of a de facto XMPP application/interface/hardware. Hundreds of desktop/mobile/web variants exist to chat over XMPP via practically any conceivable hardware. And with the protocol open, anyone may write a new client.

XMPP supports the same multimedia, group, and whatever superficial functionality the WhatsApp/FB messengers provide, limited only by the application interfacing with the protocol.

XMPP facilitates an open chat ecosystem. WA imposes a closed chat ecosystem.

A myriad of other chat ecosystems exist, not necessarily leading but prominent, with smaller user bases, exceptions made in certain geographic regions.

Many of these, while not strictly open, relax some of the restrictions WA imposes. Many lift the hardware restriction, allowing one to chat on a far greater variety of phones, or strictly on the computer without so much as a mobile phone in possession. Some provide an open or semi-open API, enabling anyone to write a client for the protocol.


I’ve rejoined the IRC (Internet Relay Chat) network for the first time since probably the year 2000. The user base and activity on IRC have significantly declined. But it continues to live on and be maintained.

IRC, too, has survived the test of time, you see. It has existed since 1988! IRC is just a protocol, and like XMPP, distributed. No central entity issues directives; no single point of failure.

It’s the wild west, with varied IRC networks, divided into redundant world-wide servers to house user accounts and IRC channels, each maintained by one or more channel operators.

Myriad IRC clients exist for any desirable interface and hardware. IRC is about pure textual private/group chat and file exchange, and it operates wickedly fast despite its antiquity, largely due to its openness and lightweight text-terminal footprint.
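That lightness is visible in the wire format itself: every IRC command is a single CRLF-terminated text line, per RFC 1459. A sketch of the lines a client sends after connecting; the nickname and channel are made up:

```python
def irc_line(command, *params, trailing=None):
    """Format one raw IRC protocol line, CRLF-terminated (RFC 1459 style)."""
    parts = [command, *params]
    if trailing is not None:
        # The final parameter may contain spaces; it is prefixed with ':'.
        parts.append(":" + trailing)
    return " ".join(parts) + "\r\n"

# Registration right after connecting, then joining and speaking:
registration = irc_line("NICK", "tinkerer") + \
               irc_line("USER", "tinkerer", "0", "*", trailing="A. Tinkerer")
join = irc_line("JOIN", "#minimalism")
msg  = irc_line("PRIVMSG", "#minimalism", trailing="plain text survives")
```

A few dozen such lines over a plain TCP socket are the whole client-side protocol, which is why even 1988-era hardware runs it comfortably.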

Regretfully, much of what I see on IRC caters more to the specialized groups one might consider nerds, geeks, extremists, gamers, hackers, and intellectuals. Not so much the casual user. No longer.

Questions, comments? Connect.