
Smartphone Cryptogeddon


After yesterday's Senate committee hearing on encryption, wherein both [FBI Director James Comey]( and [New York County District Attorney Cyrus Vance Jr.]( made some pretty nasty comments about strong encryption on smartphones and the potential problems it could bring, I thought it might be a good idea to remind everyone of what [Representative Ted Lieu of California said back in April]( about why some users wanted smartphone encryption in the first place:

> Why do you think Apple and Google are doing this? It's because the public is demanding it. People like me: privacy advocates. A public does not want an out-of-control surveillance state. It is the public that is asking for this. Apple and Google didn't do this because they thought they would make less money. This is a private sector response to government overreach.
> ...
> [T]o me it's very simple to draw a privacy balance when it comes to law enforcement and privacy: just follow the damn Constitution.
> And because the NSA didn't do that and other law enforcement agencies didn't do that, you're seeing a vast public reaction to this. Because the NSA, your colleagues, have essentially violated the Fourth Amendment rights of every American citizen for years by seizing all of our phone records, by collecting our Internet traffic, that is now spilling over to other aspects of law enforcement. And if you want to get this fixed, I suggest you write to NSA: the FBI should tell the NSA, stop violating our rights. And then maybe you might have much more of the public on the side of supporting what law enforcement is asking for.
> Then let me just conclude by saying I do agree with law enforcement that we live in a dangerous world. And that's why our founders put in the Constitution of the United States—that's why they put in the Fourth Amendment. Because they understand that an Orwellian overreaching federal government is one of the most dangerous things that this world can have.

It might be worth pointing out that Rep. Lieu is one of four House members with a computer science degree, is a Lieutenant Colonel in the United States Air Force Reserves, *and* served for four years as a member of the Judge Advocate General’s Corps, making him (IMHO) someone knowledgeable in this area.

And it just so happens that [fourteen of the world's top computer security experts]( agree with him. But who's counting?


Two Hard Things


Came across this little ditty today, via [Martin Fowler](:

> There are only two hard things in Computer Science: cache invalidation and naming things.
> -- Phil Karlton

Personally, though, I prefer the corollary:

> There are only two hard things in Computer Science: cache invalidation, naming things, and off-by-one errors.

Too true.
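For anyone who hasn't been bitten by one yet, here's a minimal sketch of the classic "fencepost" flavor of an off-by-one error (the function names are mine, purely for illustration):

```python
# The classic "fencepost" off-by-one: summing 1..n,
# but forgetting that range() excludes its stop value.

def sum_first_n_buggy(n):
    total = 0
    for i in range(1, n):  # bug: stops at n - 1
        total += i
    return total

def sum_first_n(n):
    total = 0
    for i in range(1, n + 1):  # correct: include n itself
        total += i
    return total

print(sum_first_n_buggy(10))  # 45 -- off by exactly n
print(sum_first_n(10))        # 55
```

The bug is invisible at a glance, which is rather the point of the joke.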


The Art of Authorship and Appropriation


Christopher Sprigman [takes another look]( at Richard Prince's Instagram Exhibit, and draws some bold conclusions:

> Prince’s body of appropriation art is provoking a reassessment of the meaning of authorship at a time when ownership of creative works in our digital world is tenuous. Anyone with access to the Internet can take something made by others, copy it, change it, and distribute it at the click of a mouse. In this context, we can see that authorship is not a stable concept, but rather that it shifts as technology weakens the link between an “originator” and his work. You may like that or hate that; Prince is pointing it out, in the direct way that only art can.

As a would-be artist who's done some "appropriation art" myself, and a longtime fan of perpetual copyright trolls [Negativland](, I find this whole discussion fascinating. However, I have to admit that I'm more than a bit surprised at the sums he's been able to get for his "re"-work, and at the implication that one man's copyright infringement is another man's high-brow art.


An act of freedom, but for whom?


On this most auspicious day, when the [USA FREEDOM Act]( passed through the Senate on its way to the president's desk, I spent the afternoon listening to some of law professor [Eben Moglen's]( excellent talks on [Snowden and the Future](.

One of the things I noticed him mention, which I don't recall hearing anywhere else, is our (the US citizenry's) continued complacency about spying, as long as they aren't spying on Americans.

> Military control ensured absolute command deference with respect to the fundamental principle which made it all "all right," which was: "No Listening Here." The boundary between home and away was the boundary between absolutely permissible and absolutely impermissible—between the world in which those whose job it is to kill people and break things instead stole signals and broke codes, and the constitutional system of ordered liberty.

Of course, we all know how that turned out:

> Not only had circumstances destroyed the simplicity of "no listening inside," not only had fudging with the Foreign Intelligence Surveillance Act carried them into the land where law no longer provided them with useful landmarks, but they wanted to do it—let's be frank, they wanted to do it. Their view of the nature of human power was Augustan if not august. They wanted what it is forbidden to wise people to take unto themselves. And so they fell, and we fell with them.

Nearly every time the USA PATRIOT Act is demonized in the press (even the leftist press), it seems to be only because the NSA dared to spy on *us*. But shouldn't we be questioning why they need to cast such a wide net at all, irrespective of national boundaries?

Or, as professor Moglen so succinctly put it (emphasis mine):

> The empire of the United States, the one that secured itself by listening to everything, was the empire of exported liberty. What we had to offer all around the world was freedom—after colonization, after European theft, after the forms of twentieth-century horror we haven't even talked about yet—we offered liberty; we offered freedom.
> ...
> It is, of course, utterly inconsistent with the American ideal to attempt to fasten the procedures of totalitarianism on American constitutional self-governance... Partly, as I shall suggest next time, because freedom is merely privilege extended unless enjoyed by one and all. But primarily because *there is an even deeper inconsistency between American ideals and the subjection of every other society on earth to the procedures of totalitarianism*.

Something to think about the next time someone talks about "freedom".


The Web is Dead! Long Live the Web!


In browsing through some of the fallout from the arrival of [Facebook's Instant Articles](, I stumbled across a couple of great pieces by Baldur Bjarnason ([@fakebaldur]( that go a long way toward explaining how we got into [the situation we're in](, and why it's us [web developers]( who are responsible.

In the first, he takes on [the ongoing debate about apps vs. the web](, and makes the assertion that it isn't "the web" that's broken, it's how (we) web developers are using it that's broken (emphasis his):

> Here’s an absolute fact that all of these reporters, columnists, and media pundits need to get into their heads:
> The web doesn’t suck. Your websites suck.
> _All of your websites suck._
> You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken. You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.
> The lousy performance of your websites becomes a defensive moat around Facebook.

In other words, if the [mobile web is dead](, it's because we developers killed it.

On a side note, I wonder if this isn't a lot of the reason that millennials increasingly [prefer using apps to browsers]( - because mobile browsing is, for many, a needlessly painful experience.

In the [second piece](, he goes even further, explaining why people can't seem to get on the same page about how "the web" should be: they're all talking about different versions of it:

> Instead of viewing the web as a single platform, it’s more productive to consider it to be a group of competing platforms with competing needs. The mix is becoming messy.
> 1. Services (e.g. forms and ecommerce, requires accessibility, reach, and security)
> 2. Web Publishing (requires typography, responsive design, and reach)
> 3. Media (requires rich design, involved interactivity, and DRM)
> 4. Apps (requires modularity in design, code, and data as well as heavy OS integration)

Just to drive this point home, he makes reference to the Apple Pointer Events issue from [earlier this year](:

> This is just one facet of the core problem with the web as an application platform: we will never have a unified web app platform.
> What Apple, Google, Microsoft, and Mozilla want from web applications is simply too divergent for them to settle on one unified platform. That’s the reason why we’re always going to get Google apps that only work in Chrome, Apple Touch APIs that are modelled on iOS’s native touch model, and Microsoft Pointer APIs that reflect their need to support both touch and mouse events on a single device at the same time. There really isn’t an easy way to solve this because standardisation hinges on a common set of needs and use cases which these organisations just don’t share.

A more conspiracy-minded individual might even believe most of the major vendors would be better off if the standards never quite work out, since that would prevent "native-esque" web apps from cutting into the bottom lines of their respective app stores. But I digress.

Speaking for myself, I know that I had never really considered this point when talking / ranting about "the web". What's more, I wonder if half of our inability to come to agreement on some of these issues is simply a matter of terminology getting in the way of meaningful conversation. I mean, apps aren't "better" than "the web", because they are essentially part of (one form of) it: they use the same web technologies (HTTP / HTML) as the rest of the "browsable" web, they just use them on the back end before glossing everything over with a pretty "native" front end.

In fact, one might argue that this is the reason that the one area of web standards that has actually seen some progress in the past few months is the [HTTP/2 spec]( - an update to how data is transmitted on the wire, which should bring notable speed and security improvements to anyone who uses HTTP (including all of those native apps I mentioned earlier). After all, improving this part of "the web" is the one thing that all of the players involved can agree on.
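To give a sense of how low-friction server-side adoption can be: in a server like nginx, enabling HTTP/2 has historically been a one-directive change (a sketch only - `example.com` and the certificate paths are placeholders, and HTTP/2 in browsers is negotiated over TLS, so the listener needs `ssl` too):

```nginx
server {
    # Browsers only speak HTTP/2 over TLS, so ssl is required here
    listen 443 ssl http2;
    server_name example.com;

    # Placeholder paths - point these at your real certificate files
    ssl_certificate     /path/to/fullchain.pem;
    ssl_certificate_key /path/to/privkey.pem;
}
```

Clients that don't support HTTP/2 simply fall back to HTTP/1.1 during negotiation, which is part of why the upgrade is so uncontroversial.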


Ethics in Shilling Videogames


[David Wolinsky]( has a [great article]( on [Unwinnable]( capturing his thoughts on the whole "ethics in game journalism" thing.

> It’s time we retire the term “videogame journalist.”
> Most writers in the field need to accept that they, too, are marketers unless their approach or something else in the landscape shifts and changes.

Part of the problem, as he sees it, is that videogame companies aren't driven to do PR with journalists who might give them serious criticism (a.k.a. bad reviews). As a result, traditional "videogame journalists" have to choose between being a PR puppet for the game companies, or not existing at all.

Part of the reason for this all-or-nothing attitude is the YouTube streamers, whose undeniable popularity means they are courted more and more often by the game companies in lieu of print / online journalists. For example, look at [Pewdiepie]( and his 36 million subscribers:

> Thirty-six million subscribers means roughly anything he puts online is more popular than Nirvana’s Nevermind (somewhere around 30 million sales) or Michael Jackson’s Bad (also around 30 million).
> Think about it. An audience that size, bigger than the population of Canada (a country), and they are all paying attention to one person’s opinions about videogames. That is staggering on a basic human level.

He hits on a lot of different notes, and the piece does tend to run long, but it's an overall great read for anyone who wants to move beyond the black-and-white in-group / out-group fighting and into a serious discussion about marketing vs. journalism, and what ethics in gaming can (and should) be.


The People vs. John Deere


Over at Wired, [iFixit's]( Kyle Wiens ([@kwiens]( points out that copyright abuse extends well beyond preventing you from [jailbreaking your PS3]( and into the world of... [farm machinery](

> In a particularly spectacular display of corporate delusion, John Deere—the world’s largest agricultural machinery maker—told the Copyright Office that farmers don’t own their tractors. Because computer code snakes through the DNA of modern tractors, farmers receive “an implied license for the life of the vehicle to operate the vehicle.”
> It’s John Deere’s tractor, folks. You’re just driving it.

I find this particularly worrisome with regard to the so-called Internet of Things, and the possibility of forced vendor lock-in on even the most trivial of items ("I'm sorry, sir, you'll have to call a certified Moen plumber to fix your leak.")

Welcome to the future. [Fight to make it better.](


Developing the Web


_Sorry for posting this again, but I accidentally deleted the original when I changed web servers recently, and I thought it was worth reprinting. Let that be a lesson to us all in the [Tao of Backup](._

The great Remy Sharp ([@rem]( wrote a [piece]( about what it means to be a web developer, as opposed to an engineer, and the difference a title does (or doesn't) make. In the end, he settles on the title of "web developer":

> I don't know why I thought it was uncool to be a "web developer". Perhaps because it's utterly vague.
> What "web developer" does mean to me though, is this:
> Someone who writes code for browsers. Likely from the school of view source, is comfortable with drop-in libraries, understands standards and best practice techniques. But mostly, a tinkerer.

I like his definition (especially the part about tinkering), but I think that it's incomplete, being merely functional.

I suggest that the term "web developer", by its very wording, carries a philosophical drive: to develop the web. That is to say, a web developer should envision how they would like the web (as a whole) to be, and build their own projects in a way that reflects that vision.

This is something I've tried to do myself, both in my professional and personal projects (albeit with varying degrees of success). To me, being a web developer means that I should use [responsive design principles](, [ensure accessibility](, and [follow the standards]( wherever possible. It also means using only open source software, be it [in the server stack](, the [service layer](, or even as a [client browser](.
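The responsive-design part of that list, for instance, can start as small as a single mobile-first media query (a sketch - the `.content` class and the `48em` breakpoint are arbitrary values of my own, not from any particular framework):

```css
/* Mobile-first: a fluid single column by default */
.content {
    max-width: 100%;
    padding: 1em;
}

/* At wider viewports, cap the line length and center the column */
@media (min-width: 48em) {
    .content {
        max-width: 40em;
        margin: 0 auto;
    }
}
```

The point being that developing *for* the web you want doesn't have to mean a wholesale rewrite; defaults this small already degrade gracefully on every standards-following browser.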

As a web developer, I want to participate in a decentralized web, and would rather use a self-hosted, fully open [social media platform]( than a corporate data silo. Likewise, I support the use of standards-based communication protocols (IRC, e-mail, etc.) over proprietary solutions. Finally, as a web developer, I believe in a more secure web, and support initiatives like [HTTPS everywhere](.

All in all, I think this definition adds an air of legitimacy to the "web developer" title. As I noted in a [comment](, based on these criteria one could say that Sir Tim Berners-Lee is the definitive Web Developer (a title he himself uses, as @rem pointed out), and that's not bad company to be in. In fact, I think I'm going to go get some business cards with "Web Developer" on them.

_TL;DR - A web developer should "develop the web" by building their projects in accordance with their own vision of how the web should be. For me, that means using open source software to build standards-compliant, accessible, and secure sites and apps._