
Posts tagged: tech

Mutation Vectors: Tech Hell Edition

Klint Finley


Status Update

Up is down and down is up. That’s the default “natural” setting on my new MacBook Pro’s trackpad. As a long-time Windows and Linux user, I find that this perfectly sums up the entirety of the Apple experience for me thus far.

See below for my Apple and Linux rants for more on my current experience of tech-hell. But first, a rundown of why Twitter has started to suck for many people.

Browsing

I’ve got a ton of stuff in Pocket for reading, perhaps over the weekend, but I don’t have much for you today. I did, however, really enjoy Alan Jacobs’s sequence of posts on the state of Twitter, which hits many of my own issues with Twitter right now, and a few others besides:

I’m not so famous or female that I get inundated with harassment on my timeline, but I do find myself yearning for more granularity in terms of what I see and share.

Many of my friends are nostalgic for LiveJournal, which did indeed do a good job of providing that granularity. But I’d hazard a guess that most of us have far more connections on Twitter and Facebook today than we did on LiveJournal in, say, 2005. That makes trying to deal with grouping friends a much more daunting task, especially if you’re starting with a big list of basically everyone you know and need to figure out which groups to put each person in.

Today Google Plus and Facebook offer similar features for publishing posts visible only to pre-defined groups of people, but I don’t know how widely used they are. And the hassle of trying to categorize a couple-few hundred people into neat groups is a big part of what keeps me from bothering with those features.

Still, if we were able to share stuff on Twitter based on Lists (remember those?), maybe that would be something. Though I’m not sure I’d be willing to spend the time to make a bunch of new lists. I pretty much gave up on that idea back in 2010 or 2011, when Twitter hid that functionality and made us worry that it would go away entirely.

Which is another part of the problem: we have no idea which new Facebook or Twitter features will stick around more than a couple months. Why spend time getting used to something when some A/B tester might say “hey, this feature isn’t getting enough traction, let’s hide it to streamline the interface and move those engineering resources elsewhere”?

The indie web can potentially help solve the disappearing feature problem (though most of us will still be at the mercy of what the developers of the software we depend on decide to do). But it could also make granularity more difficult, at least without some widely adopted decentralized authentication system.

(Or we could all just start multiple different e-mail newsletters…)

Watching

On a brighter note: season 8 of Trailer Park Boys just hit Netflix!

Listening

On a darker note, in a good way: Earth’s new album Primitive and Deadly is out!

Apple Rant

This blog post by Alex Payne explains some of the reasons I ended up buying a Mac instead of another PC laptop, though he writes from the perspective of an ardent Windows hater, which I’m not, and from the perspective of someone switching away from the Apple ecosystem, and not to it.

The other thing is that, if you look at the actual specs on offer, Macs aren’t that much more expensive than PCs anymore. The last laptop I bought, an ASUS UL, was about half the price of an equivalently spec’d MacBook. But finding an ultraportable (under four pounds, with a screen smaller than 14 inches) with a Haswell processor, eight gigs of RAM and a solid-state drive for less than a refurbished Mac proved difficult. In many cases, the equivalently spec’d machines from Acer, ASUS and Lenovo were more expensive. Most companies are now building what amount to Mac knock-offs that are just as un-upgradable as the MBP, but have shittier hardware that actually costs more. And just buying something with lower specs and upgrading isn’t much of an option either, since so many of these Apple knockoffs solder the RAM and make the hard drive inaccessible.

Lenovo’s ThinkPad line remains one of the few exceptions, but most of its machines cost way more than their Mac equivalents, and the amount of maintenance you can do yourself is steadily shrinking. Plus you have to deal with Lenovo’s customer support.

So I bit the bullet and bought a Mac, but I’m already thinking of sending it back while I still can.

Part of it’s the UI quirks, though I was able to smooth most of those out with third-party apps listed here and here (or in the trackpad’s case, just turning off so-called “natural” scrolling).

But there are other things. For one, I hate the built in keyboard. The keys feel sludgy, sticky. The battery gets scalding hot.

And although the MBP itself was cheaper than a ThinkPad, there are tons of hidden costs. My external monitor won’t work with it, even after I went through this Linux-esque process to try to make it work. The only monitor officially supported is, apparently, Apple’s own $1,000 Thunderbolt Display, which, if I’m not mistaken, will only work with Macs. Plus I need a new external keyboard, since Macs have their own layout.

Then there’s all the proprietary software I have to buy to either make things work properly (like Witch and Hyperdock), or to replace things that don’t work on Macs (Zim, which only kind of sort of works, and MyLifeOrganized, which works in Wine on Linux but which I’ve not been able to get working on OS X). That means shelling out ~ $40 each for Ulysses and Omnifocus.

So I’m looking at about $1,200 worth of extra stuff to get this $1,200 laptop up to snuff, which I can’t afford right now. That makes a $1,500 or $1,600 ThinkPad look not so bad.*

But the switch is one I feel like I’m going to have to make eventually anyway, because Windows is starting to feel a lot like BlackBerry back in 2010. Microsoft is still a big, rich company with millions of users, but the writing is on the wall in terms of developer support. And Linux is no better off.

*I’m also a bit confused about how PCs got so expensive. The ASUS was about $600. It was just under four pounds, had nine-hour battery life and was just a little behind the bleeding edge of its time. Today’s “budget” laptops are either heavy or majorly lacking in specs or both.

Addendum: Linux Rant

I originally wrote this as a comment on this article about why Linux desktop development has been slow going. But when Wired did a big upgrade and reorganization a few months ago, all the old comments got deleted:

I wrote the article in question, but I’m writing here as a computer user, not as a Wired writer.

I’m not a Mac or iOS user. I spend most of my time in Windows 7 these days, even though I still have a Linux partition (Peppermint right now, but I plan to swap this out for Ubuntu Studio).

The whole Linux/Ubuntu usability thing always comes down to personal opinion and anecdote. In my opinion, the Linux distros, particularly the various flavors of Ubuntu, are quite usable. Getting Linux working on laptops can still be a pain, but this can be mitigated by buying a laptop that comes with Linux pre-installed.

I’ve found that when I get someone to try Linux they can indeed do all the basics with no trouble. But that’s the problem: the basics are easy enough anywhere. There’s no incentive for someone who just uses their computer for webmail and Facebook to go through the trouble of switching operating systems or buying a new computer from a Linux specialist. Better security and the open source ideology just don’t seem to be enough to convince people to make the switch. I don’t think I’ve gotten a single person to switch from Windows to Linux.

On the other hand since the mid 00s I’ve seen lots of people switch from Windows or Linux to OS X. Apple products have a je ne sais quoi that gets people to pay the premium. Linux just doesn’t seem to have that.

Meanwhile, when you get beyond the basics, Linux poses a lot of challenges. I went from using Ubuntu exclusively to using Windows sometimes because I needed a better video editor than was available for Linux a few years ago. Eventually I started booting into Linux less and less.

There are some great multimedia applications for Linux, including GIMP, Inkscape and VLC. There are also some really promising apps like Ardour and LMMS. But for professionals in design, video and audio Linux just doesn’t cut it yet, and most of the really good stuff is also available for Macs and/or Windows (Din, a soft synthesizer, is one interesting exception).

Gaming has often been a non-starter on Linux as well, though that’s steadily been getting better.

Then there are what I guess you could call power user apps: stuff like Evernote, Skitch, Notational Velocity, TweetDeck, OmniFocus and Quicksilver that are available for Macs and/or Windows but not Linux. There are clones and substitutes, but nothing seems to measure up to the originals. I know there was an update to Skype recently, but the Linux version has long been a red-headed stepchild.

Oh, and last I heard Netflix Streaming doesn’t work on Linux yet.

So while I’ve found the underlying guts of the OS/distro to be good, application support has long been lacking. The perfect clone or substitute has perpetually been just around the corner. This is the sort of stuff that keeps me in Windows all day. On Linux I’m constantly having to choose between buggy abandonware that happens to be open source, or dealing with running stuff in WINE. If it were just a matter of running one or two apps in WINE, or of just booting into Windows when I had some audio mangling to do, that’d be one thing. But it just seems easier to swallow my geek pride and use Windows.

So why aren’t more apps available for Linux? I didn’t mention it in the article, but the financial incentives are far better for developers on Windows, OS X, the web, Android and iOS. For most Linux desktop developers it’s a labor of love.

There’s the chicken-and-egg thing: there aren’t enough users to justify developing for, but if there were more apps then there’d be more users. And if developers have been defecting to OS X and the web, then there’s not much hope of that.

I wish it weren’t true, but I’ve been slowly accepting this over the past year. I hope I’m wrong.

[I actually do think the situation has gotten better since I wrote that. I discovered Zim, which helped, and even though I don’t use it, Wunderlist supports Linux, which is nice.]

The Next Stage for Google’s Quantum Computing Efforts

Klint Finley

New from me at Wired:

Google launched its Quantum A.I. Lab last year to test a machine called the D-Wave Two, an intriguing but controversial system that its makers bill as a quantum computer. The company believes quantum computing could play a key role in so many of its future ambitions, from self-driving cars and other robots to better predictive analytics systems for products like Google Now to things we haven’t even dreamed up yet. Thanks to what’s called the superposition principle of quantum mechanics, it could process data for such projects at speeds that are exponentially faster than what you get from today’s machines.

But the scientific community has greeted the D-Wave machine with skepticism, questioning whether it is actually a quantum computer at all, and whether it can actually provide something you can’t get from conventional machines. In joining Google, physicist John Martinis lends new weight to the company’s quantum ambitions.

Full Story: Wired: The Man Who Will Build Google’s Elusive Quantum Computer

4chan Spawns an Open Source, Encrypted Skype Alternative, But Can You Trust It?

Klint Finley

Tox

My latest for Wired:

The web forum 4chan is known mostly as a place to share juvenile and, to put it mildly, politically incorrect images. But it’s also the birthplace of one of the latest attempts to subvert the NSA’s mass surveillance program.

When whistleblower Edward Snowden revealed the full extent of the NSA’s activities last year, members of the site’s tech forum started talking about the need for a more secure alternative to Skype. Soon, they’d opened a chat room to discuss the project, created an account on the code hosting and collaboration site GitHub, and begun uploading code.

Full Story: Wired: Hackers Build a Skype That’s Not Controlled by Microsoft

Turning the Internet of Things Against Slumlords

Klint Finley

My latest for Wired:

To guard the safety and health of tenants, New York and many other cities require landlords to keep inside temperatures above a certain level from October until May. But not all building owners and managers follow the rules. Each year, heating complaints are either the number one or number two most frequent complaint to New York’s government services and information line, 3-1-1, says Tom Hunter, the spokesperson for a volunteer effort called Heat Seek NYC, citing data from the site NYC OpenData.

“Last year alone, 3-1-1 received 200,000 plus heating complaint calls,” he says. “Many more tenants go without heat and don’t call 3-1-1, so we don’t know exactly how many people are directly affected each year.”

Tenants can sue landlords over this, but historically, they’ve had to rely on their own handwritten records of how cold their apartments get. And these records haven’t always held up in court. Heat Seek NYC hopes to solve that problem by building internet-connected heat sensors to monitor the conditions of apartment buildings in order to provide a reliable, objective record that tenants and advocacy groups can use in court.

Full Story: Wired: How to Use the Internet of Things to Fight Slumlords

Previously:

The Dark Side of the Internet of Things

The Internet of Things Could Be Bad for the Environment

Bullshit Jobs and Silicon Valley

Klint Finley


My latest for Wired:

Levels is on a quest to launch 12 “startups” in just 12 months, and he’s a third of the way home now. One, called Play My Inbox, gathers all the music it finds in your e-mail inbox into a single playlist. Another, called Go Fucking Do It, gives you a new way to set personal goals. Basically, if you don’t reach your goal, you have to cough up some cash to Levels. Gifbook, due to launch by the end of the month, is his fifth creation.

Launching one product a month would be a major endeavor for anyone, but Levels has ramped up the degree of difficulty. For one, he’s building all this stuff while traveling the world. He has no fixed address. Instead, he lives out of a single backpack and works from coffee shops and co-working spaces. And two, each of these “startups” is a one-man operation. “I do everything,” he tells WIRED from his current home, the Philippines. “I’m sort of a control freak.”

Depending on who you ask, Levels represents either everything that’s right about the state of the technology industry or everything that’s wrong. He’s self-motivated, ambitious, and resourceful, building each of these projects without any outside investment. But on the flip side, he’s yet another young white male making products that solve what many people see as trivial problems for an already privileged subset of the population, while ignoring larger issues like global warming and wealth disparity.

Worse, as a “digital nomad” who has left the West to create new tech gizmos in places like Thailand and Indonesia, some argue that he’s exploiting wealth disparity to his own benefit. But Levels is no fool. He’s deeply aware of the contradictions in his work, and he’s trying hard to sort through them. He may or may not succeed.

Full Story: Wired: This Guy Is Launching 12 Startups in 12 Months

What I intended — and I’m not sure I succeeded — was to do a meditation/case study on the state of the tech startup ecosystem. We had to cut a lot of material from this article, and there was more that didn’t make it in, but one of the things on my mind was David Graeber’s “bullshit jobs” idea. From an interview in Salon:

Suddenly it became possible to see that if there’s a rule, it’s that the more obviously your work benefits others, the less you’re paid for it. CEOs and financial consultants that are actually making other people’s lives worse were paid millions, useless paper-pushers got handsomely compensated, people fulfilling obviously useful functions like taking care of the sick or teaching children or repairing broken heating systems or picking vegetables were the least rewarded.

But another curious thing that happened after the crash is that people came to see these arrangements as basically justified. You started hearing people say, “well, of course I deserve to be paid more, because I do miserable and alienating work” – by which they meant not that they were forced to go into the sewers or package fish, but exactly the opposite—that they didn’t get to do work that had some obvious social benefit. I’m not sure exactly how it happened. But it’s becoming something of a trend. I saw a very interesting blog by someone named Geoff Shullenberger recently that pointed out that in many companies, there’s now an assumption that if there’s work that anyone might want to do for any reason other than the money, any work that is seen as having intrinsic merit in itself, they assume they shouldn’t have to pay for it. He gave the example of translation work. But it extends to the logic of internships and the like so thoroughly exposed by authors like Sarah Kendzior and Astra Taylor. At the same time, these companies are willing to shell out huge amounts of money to paper-pushers coming up with strategic vision statements who they know perfectly well are doing absolutely nothing.

So as much as we bash on techbros* wasting time building silly apps, there’s a bit more going on here. It’s hard to find a job today, especially if you’re young, and especially one that is “meaningful.” Tech just happens to be one of the few booming industries at the moment, and one of the only ones paying a living wage**. So while many people might rather be curing malaria or fighting poverty or fixing global warming, they end up building apps for Silicon Valley startups instead. And what’s their real alternative? Work for a big company like IBM, or go work for the NSA? They’re probably better off working for Yo or Rap Genius or whatever.

“Get rich writing apps” may be the new “make money from home selling Tupperware,” but it’s the best many people can hope for today, and blaming young programmers, as opposed to the politicians and capitalists who got us into this mess, misses the point.

*Note that I’m not calling Pieter Levels a techbro here.

**Which is part of why it’s important to change tech culture to make it more inclusive, which is another topic entirely. (One covered very well at Model View Culture).

Why Animated GIFs are the New “Hello World”

Klint Finley


New from me at Wired, meet revisit.link, the “Hello World” of web services:

Basically, all the site’s image effects are stored by a community of developers, much like any other open source software. Anyone can not only use these effects, but build their own and share them with the community by way of the code hosting and collaboration site GitHub. “Since everyone likes glitch art and animated GIFs, it’s a creative outlet for developers to create something new that’s outside their usual field,” says Jen Fong-Adwent, the creator of revisit.link. “But it’s also a way for new people to learn basics.”

If you’re building a modern web service, you aren’t just creating a program that will run on one machine. You have to learn how to deploy code to online servers, and teach your programs to talk with other applications. revisit.link is a good way to learn these skills, since the effects servers are simple and lightweight and can be written in any language. And once a server is built, the developer can learn how to use GitHub and how to make small changes to someone else’s code and submit those changes for review—all in a low-pressure environment with a very low barrier to entry.

Full Story: Out in the Open: How Animated GIFs Can Turn You Into a Web Coder

You can play with it here, or view a stream of examples here.
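If you’re curious what one of those little effect servers involves, here’s a minimal sketch in Python using Flask and Pillow. To be clear, this is just an illustration of the general shape of the thing, not the actual revisit.link protocol; the endpoint name and the image handling here are made up.

```python
# Hypothetical sketch of a tiny image-effect server (not the actual
# revisit.link protocol): accept a POSTed image, apply an effect,
# and return the result. Requires Flask and Pillow.
import io

from flask import Flask, request, send_file
from PIL import Image, ImageOps

app = Flask(__name__)

@app.route("/effect", methods=["POST"])  # endpoint name is invented
def effect():
    # Read the uploaded image from the raw request body.
    img = Image.open(io.BytesIO(request.get_data())).convert("RGB")
    # Apply a simple effect: invert the colors.
    out = ImageOps.invert(img)
    buf = io.BytesIO()
    out.save(buf, format="PNG")
    buf.seek(0)
    return send_file(buf, mimetype="image/png")

if __name__ == "__main__":
    app.run(port=5000)
```

You could poke at it with something like curl --data-binary @photo.jpg http://localhost:5000/effect > out.png. The point is that the whole “service” is a few lines of code, which is what makes it a reasonable first web service to build and deploy.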

See also: glitchgifs Tumblr

Imagining a Bitcoin Alternative Built on Reputation Instead of Numbers

Klint Finley

My latest for Wired:

Instead of using pure mathematics to prevent things like the same person spending the same money twice, Document Coin will rely on personal reputation to keep all transactions in order. And each unit of currency created using Document Coin could have different values in different situations. If you use a coin in one place, it might be worth more than if you use it in another. The goal, Anderson says, is to get people to completely rethink the entire idea of money. […]

Unlike with bitcoin—which keeps its currency scarce by rewarding it only to those who participate in what amounts to a race to solve complex cryptographic puzzles—anyone will be able to create a new Document Coin anytime they want. The value of each coin will be completely subjective, depending on who creates the coin and why. “For example, the coin my disco singer friend created and gave me at my barbeque might be what gets me past the rope at the club,” Anderson says. A coin minted by tech pundit Tim O’Reilly might be highly prized in Silicon Valley circles, but of little interest to musicians. “It’s a bit like a combination of a social network with baseball trading.”

Ultimately, he hopes to get developers thinking about the social implications of crypto-currencies, and to get people to question the idea that everything needs to have a set, numeric value. “If bitcoin is the toy version of what we’ll all be using the future, then I want to build the crazy art project version of the future,” he says. Document Coin’s usefulness as a real currency is limited, but Anderson does hope people will eventually want to use it. “If you build something, you don’t want to be disappointed if it succeeds,” he says. “You need to build things that you would be happy to see take off.”

Full Story: Wired: A New Digital Currency Whose Value Is Based on Your Reputation
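For anyone wondering what that “race to solve complex cryptographic puzzles” looks like in practice, here’s a toy proof-of-work sketch in Python. It illustrates the general idea of hashing until you stumble on a lucky nonce; it is not bitcoin’s actual mining algorithm or difficulty scheme.

```python
# Toy proof-of-work: find a nonce such that the SHA-256 hash of the
# block data plus the nonce starts with a given number of zero hex
# digits. A sketch of the general idea, not bitcoin's real algorithm.
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Return a nonce whose hash has `difficulty` leading zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

print(mine("example block"))  # raising difficulty makes this take much longer
```

The scarcity comes from the fact that finding a valid nonce takes real computing work while checking one is instant, which is exactly the property Document Coin throws away in favor of reputation.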

Previously: My interview with Anderson about CouchDB

Did Buckminster Fuller Predict Graphene Computers?

Klint Finley

Above: Fuller’s diagrams. Below: a diagram of graphene molecules.

My friend Trevor, who maintains the largest known archive of R. Buckminster Fuller’s work, has uncovered an unpublished manuscript in which Fuller seems to predict graphene computers:

Fuller’s computer is a layered cuboctahedron, each layer made of spheres. Fuller describes the spheres as glass coated with gold, silver, copper or aluminum, depending on their location in the array. But Fuller also speaks of individual atoms of these materials, predicting by decades the nanocomputers under development today.

These layers of cuboctahedrons would have arrays of hexagons on their equators, and the nanocomputers of today are made of layers of hexagons. The fifth layer of a layered cuboctahedron of spheres would have the potential for a new nucleus sphere.

Full Story: Synchronofile: R. Buckminster Fuller’s Ultra-Micro Computer

I can’t say that I fully understand the material at hand. Fuller’s work seems to be focused on memory and storage, while today’s graphene computation research is focusing on the creation of more efficient transistors for processing. But there does seem to be at least some overlap. The fact that he was even thinking about computing at all in 1968 puts him well ahead of the curve.
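For what it’s worth, the sphere counts in those nested layers follow Fuller’s shell formula from Synergetics: layer f of a cuboctahedron (his “vector equilibrium”) contains 10f² + 2 spheres around the nucleus. A quick sketch, assuming that formula:

```python
# Sphere counts for a layered cuboctahedron ("vector equilibrium"),
# using Fuller's shell formula: layer f contains 10*f**2 + 2 spheres.
def shell(f: int) -> int:
    return 10 * f * f + 2

total = 1  # start with the single nucleus sphere
for f in range(1, 6):
    total += shell(f)
    print(f"layer {f}: {shell(f)} spheres, {total} total")
# Layers hold 12, 42, 92, 162, 252 spheres; cumulative 13, 55, 147, 309, 561.
```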

See also:

Buckminster Fuller

Graphene

Are Online Security and Convenience Fundamentally Incompatible?

Klint Finley

Latest from me at Wired:

Staying secure online is a pain. If you really want to protect yourself, you have to create unique passwords for every web service you use, turn on two-factor authentication at every site that supports it, and then encrypt all your files, e-mails, and instant messages.

At the very least, these are tedious tasks. But sometimes they’re worse than tedious. In 1999, researchers at Carnegie Mellon University found that most users couldn’t figure out how to sign and encrypt messages with PGP, the gold standard in e-mail encryption. In fact, many accidentally sent unencrypted messages that they thought were secured. And follow-up research in 2006 found that the situation hadn’t improved all that much.

As many internet users seek to improve their security in the wake of ex-government contractor Edward Snowden exposing the NSA’s online surveillance programs, these difficulties remain a huge issue. And it’s hard to understand why. Do we really have to sacrifice convenience for security? Is it that security software designers don’t think hard enough about making things easy to use—or is security just inherently a pain? It’s a bit of both, says Lorrie Cranor, an expert in both security and usability and the director of Carnegie Mellon’s CyLab Usable Privacy and Security Laboratory, or CUPS for short. “There isn’t a magic bullet for how to make security usable,” she says. “It’s very much an open research project.”

Full Story: Wired: Online Security Is a Total Pain, But That May Soon Change

(I don’t care for that headline — there’s not really much evidence that this is necessarily going to change anytime soon)

The Internet of Things Could Be Bad for the Environment

Klint Finley


A bit more contrarianism from me at Wired today:

The pitch is that the Internet of Things will make our world a greener place. Environmental sensors can detect pollution, the voices say. Smart thermostats can help us save money on our electric bills. A new breed of agriculture tech can save water by giving crops exactly the amount they need and no more.

But this vast network of new online devices could also end up harming the environment. Manufacturing all those gadgets means expending both energy and raw materials. In many cases, they will replace an older breed of devices, which will need to be disposed of (so long, non-smart thermostat). And eventually, every IoT device you buy–and people are predicting there will be hundreds of thousands–will need to be retired too. Since all these devices will connect to the net, we should even consider the energy used by the data centers that drive them.

Full Story: Wired: The Internet of Things Could Drown Our Environment in Gadgets

Previously: The Dark Side of the Internet of Things