If you want a job in media, technology or a related field, make learning basic computer language your goal this summer. There are plenty of services—some free and others affordable—that will set you on your way. Teach yourself just enough of the grammar and the logic of computer languages to be able to see the big picture. Get acquainted with APIs. Dabble in a bit of Python. For most employers, that would be more than enough. Once you can claim familiarity with at least two programming languages, start sending out those resumes.
—
Kirk McDonald: Sorry, College Grads, I Probably Won’t Hire You - WSJ.com
This is great advice. I know it’s great advice because I’ve heard it half a bazillion times in the past year. I’d love to teach my journalism students some of these skills, but first I have to learn them.
If you know what McDonald means by “the grammar and logic of computer languages” (I do not), I could use your help. Got any specific recommendations? Where should I start?
(via kimlisagor)
Hi, Kim, awesome question. I disagree a little with what McDonald is saying here, specifically that knowing a little bit about programming makes one better prepared to allocate resources (I’d say the exact opposite is just as likely true) or that dabbling dilettantism for its own sake is necessarily a good thing.
However! I’m certainly a proponent of code literacy, and of journalists learning more about all aspects of the business, whether it’s how the CMS works or how ads are sold.
The highest bang/buck ratio for your students would be to learn HTML and CSS. They’re going to be publishing on the web, so they need to know what that means, why the CMS is throwing in stray tags, and why copying and pasting from Word is probably going to get screwed up (there’s a small sketch of that last problem below). Here’s what I’d consider a basic level of understanding:
A more intermediate to advanced level of understanding would include:
There are certainly plenty of things I’ve left out or forgotten, but this should keep just about anyone busy for the summer.
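To make the copying-from-Word point concrete, here’s a minimal sketch (mine, not part of the original advice; the sample markup and the strip_word_cruft helper are purely illustrative) of the kind of cruft Word’s HTML drags into a CMS, and a naive cleanup pass using nothing but Python’s standard library:

```python
import re

# Hypothetical sample of what pasting from Word can hand a CMS:
# proprietary <o:p> tags, Mso classes, and inline styles everywhere.
word_paste = (
    '<p class="MsoNormal"><span style="font-family:Calibri;mso-fareast-language:EN-US">'
    'Breaking: city council approves budget.<o:p></o:p></span></p>'
)

def strip_word_cruft(html):
    """Naively remove Word-only tags and inline styling, leaving plain markup behind."""
    html = re.sub(r"</?o:p[^>]*>", "", html)               # Office-only tags like <o:p>
    html = re.sub(r'\s(?:class|style)="[^"]*"', "", html)  # MsoNormal classes, inline styles
    return html

print(strip_word_cruft(word_paste))
# -> <p><span>Breaking: city council approves budget.</span></p>
```

A real CMS sanitizer does far more than this, but even a toy version makes it obvious why pasted copy so often arrives wrapped in mystery tags.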
As for programming (btw, the difference between markup, styling, and programming is another good thing to figure out), it’s tempting to think JavaScript is a good language to start with. It’s ubiquitous, native to the web, and looks good on a resume, especially alongside its more comely cousin jQuery.
There are a few problems with JavaScript as a starter programming language. First, you really need to understand how web pages and browsers work to get what JavaScript is doing; otherwise, it just feels like magic, and a pretty intimate understanding of the DOM really helps. JavaScript, as a language, also has a few truly bad parts that are easy to avoid if you understand programming concepts more broadly but that can be hard to get over if you’re learning the fundamentals of programming at the same time.
I’d recommend starting with a high-level, interpreted language like Python or Ruby. They’ll run on any computer, they’re easy to start with, you can see the results of your programs immediately, and they don’t require anything more than a text editor. My personal preference is Python.
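To give a sense of what “nothing more than a text editor” means in practice, here’s the sort of first program I have in mind (a hypothetical headlines.py, not from any particular curriculum): save it, run it, and you get results immediately.

```python
# headlines.py -- a tiny, hypothetical first program: no compiler, no toolchain,
# just a text editor and the Python interpreter. Run it with `python headlines.py`.
headlines = [
    "Council approves budget after marathon session",
    "Local startup raises round, promises to change everything",
    "Rain expected; commuters unsurprised",
]

# Count the words in each headline and print a short report.
for headline in headlines:
    words = headline.split()
    print("{} words: {}".format(len(words), headline))
```

From there it’s a short hop to reading a real spreadsheet or calling an API, which is where the journalism-specific payoff starts.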
These days, there’s no shortage of places, many of them free and online, to learn all of this stuff. And, true to Sturgeon’s Law, most of them are crap. There are a few bright spots, some of them requiring a little bit of money.
I hope this helps. Happy to add more or answer any questions I can.
Your pal in nerdy journalism,
Jim
Teens aren’t abandoning “social.” They’re just using the word correctly.
My pal Cliff wrote this excellent piece last week; it’s such an incisive view of the social bubble.
Newsbound explains the Chargemaster
One key part of the Affordable Care Act, the provision that makes hospitals’ internal price lists public, went into effect today.
There’s something so boringly obvious about Kara Swisher’s behind-the-scenes look at how Facebook came to own Instagram. To fans of Silicon Valley drama (amongst whom Swisher seems to count herself), it’s a breathless tale of luck, determination, and picking the decisive moment to pivot. From a more critical vantage, it’s the story of a rich white son of privilege selling his company to another rich white son of privilege.
Maddeningly¹, Swisher insists on writing it straight as a rags-to-riches story, glossing over the life of complete safety and entitlement of Instagram’s most prominent co-founder, Kevin Systrom. Prep school, four years at Stanford, startup internships, requisite time at Google – this is a well-worn path in the valley (or a parallel one to Wall St.), and there’s nary a hint of what made Systrom different or interesting. If you read the story hoping to glean some lesson for selling your own zero-revenue company for a cool billion, keep looking, unless that lesson is to pick your parents well.
Perhaps there’s some cause to celebrate the waspy young turks who forsake well-groomed, upper-crust New England lives in finance or medicine or law to strike out to the already tamed frontier of the valley. After all, Systrom and Zuckerberg and Bill Gates all reached further than their fellow prep-school grads to amass unimaginable wealth from silicon and social. Look no further than a Winklevoss or (Randi) Zuckerberg to see how it could have turned out. Ultimately, though, these amount to little more than brave tales of how the 1% become the 0.1%.
The background stories of today’s robber barons amassing users and mining likes are no different than any other generation’s: wealthy scions risking little and being rewarded for their cynicism and ability to network.
1. Maddening, if not exactly surprising, considering the whole thing is in Vanity Fair. The day Swisher’s story was published, the second most popular story on the site was one about a hedge fund manager alongside a slideshow of beautiful people on horses. ↩︎
Jan Chipchase has written a wonderfully thoughtful essay on Google Glass
App.net is a bit of an odd duck. First, there’s the name, which is terrible. Conceptually, it’s kind of hard to wrap your head around and sell to your friends. It’s like Twitter, but you have to pay for it? But there’s also storage, like Dropbox, and people are building apps on the network (“oh, appdotnet, I get it. Wait.”) that are really nothing like Twitter.
I’ve been giving it a go mostly because the “figuring it out” part reminds me of mid-2007 era Twitter, but different. My pal Guy English put it smartly: this is for distilled insight, not quips, and I’ve found that to be mostly right. Partly that’s because you get more characters (bytes, not wisenheimers), but also because it feels different – the only way I can put it is that it feels embarrassing to just crack jokes.
Different, of course, does not mean better. For the most part, the talk is still very nerdy, very meta, the denizens too homogeneous. Kinda like the early days of Twitter.
The thing that’s missing is you. Statistically, if you’re reading this, you’re probably not on app.net, and that’s a shame. I’m a believer in Metcalfe’s Law and I’m interested in seeing if app.net can really branch into something different or if it will always just lie in the shadow of Twitter.
The folks behind app.net are really doing some incredible work. They’ve added new features and changed directions at a remarkable clip; it’s been impressive to watch the growth and change. The ethos just sits right with me – I’m valued not as a faceless “user” to be monetized to brands, but as someone who contributes, in his own small way, to making the sum better.
And, they’ve been kind enough to give me a hundred free invites if you sign up via this link: https://join.app.net/from/jimray. The free tier has some limitations – you can only follow 40 other people, there’s less storage, that kinda thing – but it’s a great way to see if there really is anything to this un-Twitter.
A while back, there was some boastful kvetching about how app.net was better because of its exclusivity – that its country clubbiness kept it nice and pristine – in direct contradiction to what we know about the power of network effects. That kind of thinking is in every way antithetical to what I want from the world, and certainly the internet, so come on and muss it up a bit, won’t you?
How would you save journalism?
The excellent Priceonomics blog has a lengthy summary of some of the attempts to find new business models for journalism. Bonus: they experiment with a model of their own.
Watching Quartz’s launch, growth, and evolution has been fascinating. They’re doing phenomenal work across design, development, and editorial.
Telling the news on Twitter is different than telling the news in a magazine or newspaper. I realize journalists have a difficult job these days. The way mistakes are made and disseminated and the way they are corrected, is utterly different on Twitter than at a magazine like Wired or a newspaper like the New York Times. This places unfamiliar demands on journalists and novel demands on consumers of news. And the bigger burden is on the consumers, which I imagine makes the journalists especially cross. Because if we consumers want to have a real-time account of events–and we do, it really makes us a better informed citizenry–we have to understand how to deal better with ambiguity.
Consumers don’t just have to be “skeptical” or “critical thinkers” of breaking information: but they themselves have to operate as do journalists, by e.g., waiting for at least two independent sources as confirmation, and even then realize a piece of news only has some higher probability of being true. Tweets about older events have a lower threshold for warrant than breaking news, for obvious reasons. The price of timeliness is eternal vigilance.
I can understand the temptation to want to edit some (perceived) egregious fallacy you accidentally helped perpetuate, but that’s not how things work on Twitter. Delete the tweet, tweet a correction, or write an elaborate apology on your blog. It will harm your reputation to make a careless error, but on the other hand the audience should know to expect corrections when who-they-follow switch to the breaking-news game. And the audience wants breaking news, warts and all.
—
Nick Kallen, who was a platform engineer at Twitter, wrote a good technical and philosophical response to Mat Honan’s request for a Twitter edit button. I still think Mat’s idea has merit but Kallen deftly explains why it’s beyond non-trivial.
This quote, about “telling news on Twitter”, though, is where Kallen reaches too far, irresponsibly so. It reads like Frankenstein promising his creation really is the key to eternal life, plus he’s great with kids, probably.
Kallen casually tosses out that a fire hose of real-time news makes for “a better informed citizenry” with absolutely nothing resembling a fact to back this claim up. I’m certainly unaware of anything that suggests the rush of breaking news equates to better democracy. In fact, everyone I know who seriously studies how breaking news affects news comprehension hypothesizes the end result is a net loss.
The fact is, we don’t yet know whether, as Kallen claims, “a real time account of events” actually does make for better citizens (and democracies) and probably won’t for some time. I suspect, though, that the proliferation of “slow news”¹ we’ve seen as a response to the Chinese water torture of news-like updates is an indication that our fellow citizens yearn for, and deserve, better. And, let’s not forget, those stodgy old newspapers often still manage to tell the story best.
Kallen also suggests part of the solution is to shift the cognitive burden of figuring out fact from fiction back to readers², that ambiguity and eternal vigilance are the prices we pay for an 86,400,000-millisecond news cycle. Call me old school, but I preferred when a journalist was someone we could trust to get it first, but first, get it right, instead of simply blasting out what any mope could hear coming across the police scanner.
I’ll note that I’m not laying the blame for the problems of breaking news at Twitter’s feet. These problems really aren’t new; they were with us long before Noah Glass wrote Twitter’s first Rails controller. In fact, I’d suggest that Twitter is perhaps uniquely suited to help solve these problems, beyond just an edit button, by putting all that Big Data to use sorting fact from rumor. Being the heart through which so much of the world beats has got to be useful for something more than telling me the kids still like Justin Bieber.
I’m no luddite and, perhaps surprisingly to my friends who work there, still have love in my heart for Twitter. I want to believe they can crack the secret to helping me know – really know, not just thumb through – the world I live in. Even if it’s not an edit button, I want to believe they’re trying.
1. By “slow news” I’ll (begrudgingly) include both the algorithmic summarizers that seek to distill the news of the day by Hadooping a never-ending supply of reverse pyramid wire copy and (more optimistically) the human touches of sites like The Brief or Evening Edition. And, yes, I had a hand in the genesis of Evening Edition, so there lies my bias. ↩︎
2. A perhaps fussy stylistic point: I truly loathe the word “consumer”, particularly as it’s applied to what we once referred to as “readers”. The word conjures a gaping maw, shoveling in the byproduct of some faceless corporation, barely stopping to chew, let alone think, and its overuse by the wunderkinds of new new media betrays a certain intention, does it not? ↩︎
Twitter’s music app is beautiful, in that now-tiresome way apps from VC-backed companies are required to be, sacrificing utility for aesthetics and looking dated as soon as it hits your neighborhood app store. That’s fine; they’ve got plenty of designers to restyle it every 18 months.
Mechanically, it works quite well, exactly what you’d expect a music app on an iPhone today to be, down to properly responding to the standard library of earbud remote shortcuts. As of 1.0.3, it’s plenty stable, less buggy than you might expect given the general unreliability of networks and streaming media. Linking an Rdio or Spotify account is seamless, and a clever runaround of what would surely be a thorny negotiation with the music biz, to boot.
The Popular pane is useless to anyone over the age of 17. Emerging seems to simply be the inverse of Popular and is therefore equally hopeless. Swipe over to Suggested and we’re finally getting somewhere, save for the fact that the secret sauce of what makes an artist “suggested” is completely opaque. I have no idea¹ what I should do to improve the algorithmic guidance or what the fuck @beth_orton is doing in there.
Tellingly, you can’t get to a musician’s tweets from within the app to decide whether you want to follow them based on the content of their stream; you’re just supposed to follow all of your favorite musicians and be in awe of their celebrity, I guess.
The #NowPlaying pane gets to the heart of what’s really wrong with the app and, may I suggest, Twitter circa 2013. In order for this 25% of the app to be useful, the people I trust and follow must also auto-tweet what they’re listening to, complete with hashtag detritus (or trolls). Perhaps I’m just too far past what Twitter considers cool, but a stream littered with #NowPlaying refuse (or Vines or Foursquare check-ins, for that matter) is a sign that I need to spend some quality time with the unfollow button. Twitter has built an app that requires users to abuse their timelines and followers with machine tags, without any meaningful way of tuning out that noise.
Worse still, a recommendations engine built on top of who I follow on Twitter is not solving the problem of introducing me to new music; it’s reminding me how many of my friends have terrible taste (relative to my obviously awesome library, natch). Context matters; it’s why the intersection of my Rdio and Twitter friends is actually pretty small, and that’s ok. Again, maybe it matters to tweens that your best friends are also totally into @OneDirection or whatever, but that’s no way for anyone past puberty to live.
Sadly, the music app also says plenty about where Twitter is going. They long ago gave up any pretense of subverting the mainstream, cozying up to the likes of MTV and NBC, and are now fully focused on being yet another megaphone for the world’s already over-exposed. Let us welcome our new new media overlords, same as the old overlords, it seems.
You can see how this plays out: more hashtagged “verticals” for #tv #movies #celebrities #gossip #news² #food #etc, more courting of verified b-list celebs, further metastasizing our streams. If you were wondering how Twitter was planning on paying back the more than $1 billion in venture capital they’ve stacked up, while also minting another generation of Silicon Valley [b|m]illionaires, here’s a clue.
Of course, it’s their prerogative³ to be an adjunct to and tool of the mainstream media. Let’s just not confuse the story of what Twitter is today with something that continues to be interesting.
1. It seems like the suggestions algorithm is keyed to the musicians you follow on Twitter, since that’s pretty much the only meaningful metric Twitter has bothered to tap into. I only follow two musicians: Aimee Mann, because I think she’s hilarious and she likes my polititweets; and John Roderick, a pal from Seattle who for some dumb reason doesn’t merit the proper “musician” badge. ↩︎
2. The events of last week and how poorly they were covered on Twitter (and the tired old dogs like CNN being wagged by Twitter’s tail) should disabuse anyone of the notion that a Twitter news app is anything resembling a good idea. ↩︎
3. As of press time, @KingBobbyBrown remains unverified, which is a god damn shame. ↩︎