Taking a week off work.
After posting about Warren Ellis’ extinction aesthetic thing on Monday, I figured I should look into the Dark Mountain Project a bit more. The New York Times Magazine profile of Paul Kingsnorth seemed as good a place to start as any.
Reading this led me to wonder what the current worst case scenarios for climate change, ocean acidification and peak soil are, which led me to a long piece from The Nation that, if I understand it correctly, reports that we could see a 3.5°C increase in global temperatures as early as 2035. An increase of 3.5°C would kill off the earth’s remaining plankton, which are already dying quickly thanks to ocean acidification, and that would kick off a series of events leading to the death of most of our food sources.
In other words, we could be facing human extinction in just 21 years.
Actually, I imagine it would take at least a few more years after 2035 for the human species to actually go extinct. Maybe we’ll discover that some people can live on smaller amounts of food, but it doesn’t sound like things will be pretty for the survivors.
And if we don’t hit those numbers by 2035, there’s a ticking time bomb of methane stored in the Arctic permafrost, and that’s going to thaw out sooner or later. When it does, temperatures are likely to go out of control fast.
Even if we make it to 2050, current projections estimate that our soil will only be able to produce about 30 percent of the food it produces today. That’s particularly bad news because new population projections predict that instead of peaking at nine billion around 2050, we’re going to hit 11 billion by 2100 and keep growing (unless, of course, we all starve to death decades before we ever reach that point).
The good news is that these are just the worst case scenarios. Many scientists still think we can turn this around, at least somewhat. The bad news is that the worst case scenarios keep getting worse.
Other cheery subjects:
The police remain a visible presence in the borough’s Brownsville neighborhood, where the vast and violent expanse of public housing had made the neighborhood a proving ground for the department’s use of the tactics as a way to curb gun violence. As part of a new strategy called Omnipresence, the officers now stand on street corners like sentries, only rarely confronting young men and patting them down for weapons. But the residents of Brownsville, conditioned by the years of the stop-and-frisk tactics, still view these officers warily.
This week I watched Hardware, not realizing that human sterilization and population control were subplots. I also finished watching the second season of Utopia (the British drama, not the U.S. reality show). I seem unable to escape the themes of human extinction and involuntary sterilization.
Stray Bullets: Uber Alles Edition, which is the sort of thing that makes you think that humans deserve to go extinct.
Malcolm Harris on Dana Goldstein’s book The Teacher Wars
The tag line to Dana Goldstein’s new book The Teacher Wars is “A history of America’s most embattled profession.” That Goldstein, an education journalist now at the fledgling Marshall Project, can make that claim without ruining her credibility before the first page speaks to the unique role educators play in American society. They’re (mostly) unionized government employees, but they spend their time working alone. We ask that they produce standardized results and demonstrate individualized care at the same time. We say their work is invaluable and pay them as if they were semiskilled. They come under frequent attack from all corners of the political map. Whether that necessarily makes teachers more embattled than psychologists or babysitters or coal miners or housewives I’m not sure, but they are certainly curious.
Full Story: The New Inquiry: Not for Teacher
The Atlantic interviews Julie Norem, a psychology professor at Wellesley College and author of The Positive Power Of Negative Thinking:
Olga Khazan: What is defensive pessimism?
Julie Norem: It’s a strategy for dealing with anxiety and helping to manage anxiety so that it doesn’t negatively influence performance. If you feel anxious in a situation, it doesn’t really matter if it’s realistic or not, you feel how you feel. It’s hard not to feel that particular way. If you feel anxious, you need to do something about it. Usually people try to run away from whatever situation makes you anxious. But there are other ways of dealing with it. Defensive pessimism is one way.
When people are being defensively pessimistic, they set low expectations, but then they take the next step which is to think through in concrete and vivid ways what exactly might go wrong. What we’ve seen in the research is if they do this in a specific, vivid way, it helps them plan to avoid the disaster. They end up performing better than if they didn’t use the strategy. It helps them direct their anxiety toward productive activity.
See also: The Powerlessness of Positive Thinking
Hank’s thing was treading the well-worn path of telling you to fuck your dreams because, hey, your dreams are unrealistic. Well, they’re not unrealistic. But they’re just suggestions. And that you don’t owe any obligation to your former self: they literally don’t exist anymore. But this is the hard part: if you’re trying to work out what it is that you *want* to do, then you kind of have to try a whole bunch of things out. Our education system and culture and economy isn’t set up to do that. We aren’t set up to let people a/b test a whole bunch of vocations or careers. We haven’t built up a society that enables and empowers people to work out what’s best, because hey, we’ve got bills to pay all the time. And if you haven’t noticed, all of this technology that empowers people and enables new forms of success *costs money*.
Extinction Symbol. Dark Extropianism. Apocalyptic Witchcraft. Dark Mountain. Uncivilisation. In The Dust Of This Planet. Health Goth. Accelerationism. After Nature/Dark Ecology/Ecognosis. Early signals: The New Nihilism, Speculative Realism, Neoreaction, Occulture. Cusp: Toxic Internet. Post-Westphalian.
Full Story: morning.computer: Extinction Aesthetic
I spent yesterday afternoon at Maker Faire volunteering at the Tesseract Design booth, where I was lucky enough to watch Crawford 3D scanning people and then printing out little plastic busts of them. Talk about a New Aesthetic experience. I also got to see a real-life Flintstones car and a bunch of Tesla coils.
Spending today recovering from too much heat and not enough water, and catching up on some reading.
“The current struggle for Scottish independence has about as much to do with the events depicted in Braveheart as America’s ongoing racial struggles have to do with the events depicted in Abraham Lincoln: Vampire Hunter,” writes Amanda Taub for Vox. In fact, the movie is outrageously historically inaccurate even by Hollywood standards. Fortunately, Taub also wrote a nice ‘splainer on the whole situation. Meanwhile, Quinn Norton puts it in context with other contemporary independence movements.
Elsewhere in hypothetical geopolitics: if Reddit were a country it would be a failed state.
And for a taste of something completely different, how about the Islamic roots of science fiction?
After binging through the entire new season of Trailer Park Boys, we just started the latest season of Channel 4’s Utopia, which, as I’ve mentioned, was one of my favorite shows of last year.
Continuing the frequent Mutation Vectors motif of me finding out that one of my favorite bands has a new album out months after the fact, this week I learned that Bruxa, whom I’ve raved about before, put out a new album in July on a pay-what-you-want basis.
Food for thought going into the weekend, from Alex Soojung-Kim Pang:
The problem with the “do what you love” mantra is in how we follow it, which is with a single-mindedness that carries unnecessary risk. We interpret “do what you love” to mean “Do only what you love and nothing else,” and the implication of that is that if you don’t practice this kind of creative monogamy, you’re being untrue to yourself. A corollary encourages, “Don’t worry about the details and practicalities.” The universe will reward your passion and belief in yourself. It also means assuming all the financial risk of a risky career move. The reality is that creative work is terribly funded, and the odds of making a steady living from it are very very small. Being fully exposed to that kind of instability can make you less creative, not more so.
See also: Quit Your Passion and Take a Boring Job
Some surprising research from Pew:
Millennials are quite similar to their elders when it comes to the amount of book reading they do, but young adults are more likely to have read a book in the past 12 months. Some 43% report reading a book—in any format—on a daily basis, a rate similar to older adults. Overall, 88% of Americans under 30 read a book in the past year, compared with 79% of those age 30 and older. Young adults have caught up to those in their thirties and forties in e-reading, with 37% of adults ages 18-29 reporting that they have read an e-book in the past year.
Plus: “The number of independent bookstores in the US rose by more than 20% between 2009 and 2014, according to the American Booksellers Association,” Quartz reports.
(both links via NextDraft)
Here’s the second half of our conversation with R.U. Sirius, editor of the late great Mondo 2000 magazine and co-author of the forthcoming Transcendence: The Disinformation Encyclopedia of Transhumanism and the Singularity. This time around R.U. tells us about the state of the Transhumanist movement and what he misses about the 90s, and Chris and I go off on a tangent about algorithms and an app store for identity.
Download and Show Notes: Mindful Cyborgs: Counterculture, The Singularity, and the Amish with R. U. Sirius pt 2
The first half is here.
Up is down and down is up. That’s the default “natural” setting on my new MacBook Pro’s trackpad. As a long-time Windows and Linux user, I find that this perfectly sums up the entirety of the Apple experience for me thus far.
See my Apple and Linux rants below for more on my current experience of tech-hell. But first, a rundown of why Twitter has started to suck for many people.
I’ve got a ton of stuff in Pocket to read, perhaps over the weekend, but I don’t have much for you today. I did, however, really enjoy Alan Jacobs’s sequence of posts on the state of Twitter, which hits many of my own issues with Twitter right now, and a few others besides:
I’m not so famous or female that I get inundated with harassment on my timeline, but I do find myself yearning for more granularity in terms of what I see and share.
Many of my friends are nostalgic for LiveJournal, which did indeed do a good job of providing that granularity. But I’d hazard a guess that most of us have far more connections on Twitter and Facebook today than we did on LiveJournal in, say, 2005. That makes grouping friends a much more daunting task, especially if you’re starting with a big list of basically everyone you know and need to figure out which groups to put each person in.
Today Google Plus and Facebook offer similar features for publishing posts visible only to pre-defined groups of people, but I don’t know how widely used they are. And the hassle of trying to categorize a couple hundred people into neat groups is a big part of what keeps me from bothering with those features.
Still, if we were able to share stuff on Twitter based on Lists (remember those?), maybe that would be something. Though I’m not sure I’d be willing to spend the time to make a bunch of new lists — I pretty much gave up on that idea back in 2010 or 2011, when Twitter hid that functionality and made us worry that it would go away entirely.
Which is another part of the problem: we have no idea which new Facebook or Twitter features will stick around more than a couple months. Why spend time getting used to something when some A/B tester might say “hey, this feature isn’t getting enough traction, let’s hide it to streamline the interface and move those engineering resources elsewhere”?
The indie web can potentially help solve the disappearing feature problem (though most of us will still be at the mercy of what the developers of the software we depend on decide to do). But it could also make granularity more difficult, at least without some widely adopted decentralized authentication system.
(Or we could all just start multiple different e-mail newsletters…)
On a brighter note: season 8 of The Trailer Park Boys just hit Netflix!
On a darker note, in a good way: Earth’s new album Primitive and Deadly is out!
This blog post by Alex Payne explains some of the reasons I ended up buying a Mac instead of another PC laptop, though he writes from the perspective of an ardent Windows hater, which I’m not, and from the perspective of someone switching away from the Apple ecosystem, and not to it.
The other thing is that, if you look at the actual specs on offer, Macs aren’t that much more expensive than PCs anymore. The last laptop I bought, an ASUS UL, was about half the price of an equivalently spec’d MacBook. But finding an ultraportable (under four pounds and less than 14 inches) with a Haswell processor, eight gigs of RAM and a solid state drive cheaper than a refurbished Mac proved difficult. In many cases, the equivalently spec’d machines from Acer, ASUS and Lenovo were more expensive. Most companies are now building what amount to Mac knock-offs that are just as un-upgradable as the MBP, but with shittier hardware that actually costs more. And just buying something with lower specs and upgrading isn’t much of an option either, since so many of these Apple knockoffs solder the RAM and make the hard drive inaccessible.
Lenovo’s ThinkPad line remains one of the few exceptions, but most of its machines cost way more than their Mac equivalents, and the amount of maintenance you can do yourself is steadily shrinking. Plus you have to deal with Lenovo’s customer support.
So I bit the bullet and bought a Mac, but I’m already thinking of sending it back while I still can.
But there are other problems, too. For one, I hate the built-in keyboard: the keys feel sludgy, sticky. And the battery gets scalding hot.
And although the MBP itself was cheaper than a ThinkPad, there are tons of hidden costs. My external monitor won’t work with it, even after I went through a Linux-esque process trying to make it work. The only officially supported monitor is, apparently, Apple’s own $1,000 Thunderbolt monitor, which, if I’m not mistaken, only works with Macs. Plus I need a new external keyboard, since Macs have their own layout.
Then there’s all the proprietary software I have to buy, either to make things work properly (like Witch and Hyperdock) or to replace things that don’t work on Macs (Zim, which only kind of sort of works, and MyLifeOrganized, which works in Wine on Linux but which I’ve not been able to get working on OS X). That means shelling out about $40 each for Ulysses and OmniFocus.
So I’m looking at about $1,200 worth of extra stuff to get this $1,200 laptop up to snuff, which I can’t afford right now. That makes a $1,500 or $1,600 ThinkPad look not so bad.*
But switching is something I feel like I’m going to have to do eventually anyway, because Windows is starting to feel a lot like BlackBerry did back in 2010. Microsoft is still a big, rich company with millions of users, but the writing is on the wall in terms of developer support. And Linux is no better off.
*I’m also a bit confused about how PCs got so expensive. The ASUS was about $600. It was just under four pounds, had nine hour battery life and was just a little behind the bleeding edge of its time. Today’s “budget” laptops are either heavy or majorly lacking in specs or both.
I originally wrote this as a comment on this article about why Linux desktop development has been slow going. But when Wired did a big upgrade and reorganization a few months ago, all the old comments got deleted:
I wrote the article in question, but I’m writing here as a computer user, not as a Wired writer.
I’m not a Mac or iOS user. I spend most of my time in Windows 7 these days, even though I still have a Linux partition (Peppermint right now, but I plan to swap this out for Ubuntu Studio).
The whole Linux/Ubuntu usability thing always comes down to personal opinion and anecdote. In my opinion, the Linux distros, particularly the various flavors of Ubuntu, are quite usable. Getting Linux working on laptops can still be a pain, but that can be mitigated by buying a laptop that comes with Linux pre-installed.
I’ve found that when I get someone to try Linux they can indeed do all the basics with no trouble. But that’s the problem — the basics are easy enough anywhere. There’s no incentive for someone who just uses their computer for webmail and Facebook to go through the trouble of switching operating systems or buying a new computer from a Linux specialist. Better security and the open source ideology just don’t seem to be enough to convince people to make the switch. I don’t think I’ve gotten a single person to switch from Windows to Linux.
On the other hand, since the mid-00s I’ve seen lots of people switch from Windows or Linux to OS X. Apple products have a je ne sais quoi that gets people to pay the premium. Linux just doesn’t seem to have that.
Meanwhile, when you get beyond the basics, Linux poses a lot of challenges. I went from using Ubuntu exclusively to using Windows sometimes because I needed a better video editor than was available for Linux a few years ago. Eventually I started booting into Linux less and less.
There are some great multimedia applications for Linux, including GIMP, Inkscape and VLC. There are also some really promising apps like Ardour and LMMS. But for professionals in design, video and audio Linux just doesn’t cut it yet, and most of the really good stuff is also available for Macs and/or Windows (Din, a soft synthesizer, is one interesting exception).
Gaming has often been a non-starter on Linux as well, though that’s steadily been getting better.
Then there are what I guess you could call power user apps. Stuff like Evernote, Skitch, Notational Velocity, TweetDeck, OmniFocus and Quicksilver that are available for Macs and/or Windows but not Linux. There are clones and substitutes, but nothing seems to measure up to the originals. I know there was an update to Skype recently, but the Linux version has long been a red-headed stepchild.
Oh, and last I heard Netflix Streaming doesn’t work on Linux yet.
So while I’ve found the underlying guts of the OS/distro to be good, application support has long been lacking. The perfect clone or substitute has perpetually been just around the corner. This is the sort of stuff that keeps me in Windows all day. On Linux I’m constantly having to choose between buggy abandonware that happens to be open source and dealing with running stuff in WINE. If it were just a matter of running one or two apps in WINE, or of just booting into Windows when I had some audio mangling to do, that’d be one thing. But it just seems easier to swallow my geek pride and use Windows.
So why aren’t more apps available for Linux? I didn’t mention it in the article, but the financial incentives are far better for developers on Windows, OS X, the web, Android and iOS. For most Linux desktop developers it’s a labor of love.
There’s the chicken-and-egg problem: there aren’t enough users to justify developing for, but if there were more apps there’d be more users. And if developers have been defecting to OS X and the web, there’s not much hope of that changing.
I wish it weren’t true, but I’ve been slowly accepting this over the past year. I hope I’m wrong.
[I actually do think the situation has gotten better since I wrote that. I discovered Zim, which helped, and even though I don’t use it, Wunderlist supports Linux, which is nice.]