Friday, April 6, 2018

Repost: Facebook's culture of unaccountability owes a lot to years of credulous coverage from places like the New York Times. (Why we need Gawker part 4,732)

Wednesday, June 15, 2011

"How To Party Your Way Into a Multi-Million Dollar Facebook Job" -- the sad state of business journalism

Andrew Gelman (before his virtual sabbatical) linked to this fascinating Gawker article by Ryan Tate:

If you want Facebook to spend millions of dollars hiring you, it helps to be a talented engineer, as the New York Times today [18 May 2011] suggests. But it also helps to carouse with Facebook honchos, invite them to your dad's Mediterranean party palace, and get them introduced to your father's venture capital pals, like Sam Lessin did. Lessin is the poster boy for today's Times story on Facebook "talent acquisitions." Facebook spent several million dollars to buy Lessin's drop.io, only to shut it down and put Lessin to work on internal projects. To the Times, Lessin is an example of how "the best talent" fetches tons of money these days. "Engineers are worth half a million to one million," a Facebook executive told the paper.
We'll let you in on a few things the Times left out: Lessin is not an engineer, but a Harvard social studies major and a former Bain consultant. His file-sharing startup drop.io was an also-ran competitor to the much more popular Dropbox, and was funded by a chum from Lessin's very rich childhood. Lessin's wealthy investment banker dad provided Facebook founder Mark Zuckerberg crucial access to venture capitalists in Facebook's early days. And Lessin had made a habit of wining and dining with Facebook executives for years before he finally scored a deal, including at a famous party he threw at his father's vacation home in Cyprus with girlfriend and Wall Street Journal tech reporter Jessica Vascellaro. (Lessin is well connected in media, too.) . . .
To get the full impact, you have to read the original New York Times piece by Miguel Helft. It's an almost perfect example of modern business reporting: gushing and wide-eyed, eager to repeat conventional narratives about the next big thing, and showing no interest in digging for the truth.
It is not just that Helft failed to do even the most rudimentary fact-checking (twenty minutes on Google would have uncovered a number of major holes); it is that he failed to check an unconvincing story that blatantly served the interests of the people telling it.

Let's start with the credibility of the story. While computer science may well be the top deck of the Titanic in this economy, has the industry really been driven to cannibalization by a dearth of talented people? There are certainly plenty of people in related fields with overlapping skill sets who are looking for work, and there's no sign that companies like Facebook are making a big push to mine these rich pools of labor. Nor have I seen any extraordinary efforts to go beyond the standard recruiting practices in comp sci departments.

How about self-interest? From a PR standpoint, this is the kind of story these companies want told. It depicts the people behind these companies as strong and decisive, the kind of leaders you'd want when you expect to encounter a large number of Gordian Knots. When the NYT quotes Zuckerberg saying “Someone who is exceptional in their role is not just a little better than someone who is pretty good. They are 100 times better,” they are helping him build a do-what-it-takes-to-be-the-best image.

The dude-throws-awesome-parties criterion for hiring tends to undermine that image, as does the quid pro quo aspect of Facebook's deals with Lessin's father.

Of course, there's more at stake here than corporate vanity. Tech companies have spent a great deal of time and money trying to persuade Congress that the country must increase the number of H-1Bs we issue in order to have a viable tech industry. Without getting into the merits of the case (for that you can check out my reply to Noah Smith on the subject), this article proves once again that one easily impressed NYT reporter is worth any number of highly paid K Street lobbyists.

The New York Times is still, for many people, the paper. I've argued before that I didn't feel the paper deserved its reputation, that you can find better journalism and better newspapers out there, but there's no denying that the paper does have a tremendous brand. People believe things they read in the New York Times. It would be nice if the paper looked at this as an obligation to live up to rather than as laurels to rest on.

Segundo de Chomón and the pushbutton age

Regular readers have noticed we've been spending a lot of time on the history of technology, particularly the explosive changes of the late 19th and early 20th centuries. One of the things I find most fascinating about the period is the number of concepts that didn't exist until then but are now so familiar as to be part of our intuitive view of the world.
Take the idea of remote control: virtually instantaneous, nonmechanical action at any terrestrial distance. You touch a button, you throw a switch, and lights go on, doors open, motors start. This went from being impossible to completely mundane with remarkable speed.

The pushbutton age was still fairly new when Segundo de Chomón made the groundbreaking film The Electric Hotel. Though overshadowed by Georges Méliès, de Chomón was, for my money, probably the better filmmaker, and his work with stop-motion animation would prove more fertile than any of the trick effects for which his contemporary is remembered.

Another piece of new technology.

No stop action, just a personal favorite.

Thursday, April 5, 2018

Monthly repost of the media consolidation piece -- Sinclair edition

From Talking Points Memo:
Local newscasts nationwide last week decried “fake” and “one-sided” reporting by reading from a shared script written by one of the most powerful broadcasters in America.
The so-called “must run” script, which local stations owned by Sinclair Broadcast Group were required to read, according to several reports, blasts “the troubling trend of irresponsible, one sided news stories plaguing our country.” 

“The sharing of biased and false news has become all too common on social media,” the script continues, according to a copy published Friday by the Seattle Post-Intelligencer.

“More alarming, some media outlets publish these same fake stories… stories that just aren’t true, without checking facts first,” an anchor adds, according to the script. Another anchor continues: “Unfortunately, some members of the media use their platforms to push their own personal bias and agenda to control ‘exactly what people think’…This is extremely dangerous to a democracy.”
Sinclair Broadcast Group owns more television stations than any other broadcaster in the country, and stands to spread its influence even more if the Justice Department and Federal Communications Commission approve a massive merger with Tribune Media. 

“At my station, everyone was uncomfortable doing it,” one unnamed television anchor at a Sinclair Broadcast Group-owned station told CNN earlier this month, referring to the “one-sided news” script. 

“They’re certainly not happy about it,” an unnamed KOMO employee told the Post-Intelligencer Friday, referring to their colleagues.

Which leads us back to this...

Tuesday, December 19, 2017


As if you didn't have enough to worry about



[This is another one of those too-topical-to-ignore topics that I don't have nearly enough time to do justice to, but I suppose that's why God invented blogging.]

There's a huge problem that people aren't talking about nearly enough. More troublingly, when it does get discussed, it is usually treated as a series of unrelated problems, much like a cocaine addict who complains about his drug problem, bankruptcy, divorce, and encounters with loan sharks, but who never makes a causal connection between the items on the list.

Think about all of the recent news stories that are about or are a result of concentration/deregulation of media power and the inevitable consequences. Obviously, net neutrality falls under this category. So does the role that Facebook, and, to a lesser extent, Twitter played in the misinformation that influenced the 2016 election. The role of the platform monopolies in the ongoing implosion of digital journalism has been widely discussed by commentators like Josh Marshall. The Time Warner/AT&T merger has gotten coverage primarily due to the ethically questionable involvement of Donald Trump, with very little being said about the numerous other concerns. Outside of a few fan boys excited over the possibility of seeing the X-Men fight the Avengers, almost no one's talking about Disney's Fox acquisition.

It didn't use to be like this. For most of the 20th century, the government kept a vigilant watch for even potential accumulation of media power. Ownership was restricted. Movie studios were forced to sell their theaters (see United States v. Paramount Pictures, Inc.). The largest radio network was effectively forced to split in two (that's why we have ABC broadcasting today). Media companies were tightly regulated, their workforce was heavily unionized, and they were forced to jump through all manner of hoops before expanding into new markets to ensure that the public good was being served.

In short, the companies were subjected to conditions which we have been told prevent growth, stifle innovation, and kill jobs. We can never know what would've happened had the government given these companies a freer hand, but we can say with certainty that, for media, the postwar era was a period of explosive growth, fantastic advances, and incredible successes, both economically and culturally. It's worth noting that the biggest entertainment franchises of the market-worshiping, anything-goes 21st century were mostly created under the yoke of 20th century regulation.

Wednesday, April 4, 2018

Evaluations and zero sum games

This is Joseph

In a discussion of education reform pros and cons, Curmudgucation notes
I'm giving Kraft a bonus point for this one, because too many reformsters refuse to acknowledge that their evaluation systems set up a kind of teacher thunderdome, a system in which I can't collaborate with a colleague because I might just collaborate myself out of a raise or a job. Because a school doesn't make a profit, all teacher merit pay systems must be zero sum, which means in order for you to win, I must lose. This does not build collegiality in a building.
This is a good point and a general problem with "stack ranking" style systems.  They often work well when first deployed, because nobody has actually adapted to them.  But they quickly create perverse incentives.

Imagine a system that said "the bottom 10% of employees need to be let go each year."  Used for the first time, it would remove a lot of dead wood (and maybe some good people as well).  But people would quickly note some obvious downsides -- starting with the question of who the comparison group is.  If you get promoted, could you land in the bottom 10% of the next rank up and thus be promoted into being fired?
Doesn't it also become rational to hire the weakest person you can sneak through the hiring system?  After all, if cash bonuses and job retention are based on relative ranking (and not overall performance), then isn't it best that the competition be as weak as possible?

I wonder how much this counterbalances the quality effects of ranking systems, especially given that churn is itself costly.
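
Purely as an illustration of the zero-sum point above (nothing here comes from Curmudgucation or from any real evaluation system; the skill levels, noise model, and 10% cutoff are all made up), here is a toy simulation of how a forced-ranking rule rewards surrounding yourself with weak colleagues:

```python
# Toy simulation of a "cut the bottom 10%" rule. All numbers are invented
# for illustration; the point is only that survival depends on the
# comparison group, not on absolute performance.
import random

def survival_rate(my_skill, peer_skills, trials=20_000, cut=0.10):
    """Probability of escaping the bottom `cut` fraction of the ranking."""
    survived = 0
    for _ in range(trials):
        # Observed performance = true skill + noisy evaluation
        my_score = my_skill + random.gauss(0, 1)
        scores = sorted([s + random.gauss(0, 1) for s in peer_skills] + [my_score])
        k = max(1, int(len(scores) * cut))   # number of people let go
        survived += my_score > scores[k - 1]
    return survived / trials

random.seed(0)
me = 5.0                      # a solid performer
strong_team = [5.0] * 19      # peers exactly as good as I am
weak_team = [3.5] * 19        # deliberately weaker hires

print("With strong peers:", survival_rate(me, strong_team))  # roughly 0.90
print("With weak peers:  ", survival_rate(me, weak_team))    # close to 1.00
```

Same employee, same performance; the only thing that changes is the comparison group, which is exactly the incentive problem with hiring the weakest person you can sneak past HR.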

Tuesday, April 3, 2018

In other words, stop whining about the goddamn participation trophies


Those complaining that focusing on kids' self-esteem is creating a weak and coddled generation need to remember that this focus has been with us for around four generations now.

From the back pages of the greatest comic book ever, The Spirit 9/24/1944






Monday, April 2, 2018

Muzak has been around for a disturbingly long time.

As previously mentioned, there is a popular narrative among those trying to explain away the apparent failure of a new technology. The story goes that the underperformance is not due to the technology being badly designed or serving no particular useful purpose, but instead to the lack of a "killer app" that will someday appear and save the day. In these accounts, technologies frequently spend years languishing until someone suddenly realizes something like "hey, you could use this to play music."

Having spent a great deal of the past year or so looking at the history of this sort of thing, I've come to the conclusion that people normally hit upon these killer apps very quickly, often before the technology itself is viable. Subscription services for broadcasting music to public places and alarm clocks that woke sleepers with music were being tried long before the tech existed to make either practical.

A couple of side notes on the first story. The evolution of synthesizers is a bit outside of the scope of our ongoing threads but if the subject interests you, definitely check out the history of the telharmonium. Also note the quote from Mark Twain. Twain was fascinated by the new technology of the era and we should probably devote some future posts to his take on the subject.

From Scientific American, March 9, 1907

From Scientific American, April 6, 1907

Friday, March 30, 2018

Ironically, the turn-of-the-century printing technology looks worse because I used images compressed with early 21st century technology.


We've been talking a lot about the late 19th/early 20th centuries and what sets them apart from other periods. We've focused on the magnitude of the changes wrought by technology, but that may be less impressive than their ubiquity. When you start digging into the era, it is remarkably difficult to find an area that was not experiencing explosive change.

Even the "mature" technologies were evolving so quickly as to often be unrecognizable. Take printing.  We've already talked about how, by the end of the 19th century, machines requiring a fraction of the space and manpower could do many times the work of their counterparts from 50 years earlier. What we left out was the spectacular quality of the work. Photographic reproduction, fine details, beautiful color printing.

The comic strip came out of this technology (helped along by the intense competition of press lords like Hearst and Pulitzer). Initially, the strips were printed one per page and the results could be glorious, most of all when the artist was Winsor McCay (also arguably the father of the animated cartoon, but that's a topic for another post).

Thursday, March 29, 2018

As soon as the phrase "cartoonishly evil" pops up, you'll probably know where we're headed.


There are cases where a sober, balanced, and accurate depiction of the facts will leave the subject coming off as cartoonishly evil. In these cases, if the press is doing its job, the personal brands of these subjects should eventually taint any cause or initiative associated with them.

The first sentence certainly applies to the Koch Brothers. Perhaps the second is starting to as well.
The public benefits of jumping on the KentuckyWired offer would be substantial: Not only would West Louisville get a chance at better access for its homes and businesses, but the city could install fiber-controlled traffic signals, create better and cheaper connectivity for public-safety agencies, and ship data around inexpensively to improve its operations. In a nutshell, the city would build the infrastructure and lease capacity to private internet-service providers. "We were looking at this as our smart city foundation," Grace Simrall, Louisville's chief of civic innovation, says. At least half of the new fiber capacity would be reserved for open access leases, to encourage last-mile retail providers to wire homes and businesses. All for just the cost of the fiber lines.

It seemed to be a no-brainer. “I can't think of a more sensible plan," Simrall says. "I just didn't think that we were going to face opposition on this. We thought surely people would understand that this was a way for us to leapfrog where we were for a fraction of the cost."



That's when Simrall learned who had joined the forces determined to block Louisville from spending a dime on fiber for the city's use: Charles and David Koch, the brothers backing environment-hostile fossil fuels and funding politicians who dole out goodies to the super-rich. "It's widely known that they [the Taxpayers Protection Alliance] receive a lot of funding from the Koch brothers," Simrall says.

The connection between the TPA and the Koch brothers emerged from investigative reporting by ProPublica and others. This work has revealed that the Taxpayers Protection Alliance is a front advocacy group, part of a network of dark-money organizations supported in part by the Koch brothers. (The funding seems not to come from the Koch family directly but instead is funneled through other Koch-funded groups.) TPA’s most recent IRS filing shows it received about half a million dollars in contributions in 2016, but the sources of these contributions are blacked out. Tax-exempt organizations are not required to disclose the names of their donors publicly. David Williams, TPA’s president, told the Louisville Courier-Journal earlier this year that the group receives funding from “a lot of different sources," including groups affiliated with the Koch brothers.



Later that month, there were two dramatic public meetings on the city's budget for the fiber project. The first vote went along party lines, with Republicans voting against any city involvement in fiber. Simrall and her team kept fighting, and managed to convince some Republicans that the city plan made a lot of sense—especially the Republicans from districts that have suffered from digital redlining by incumbents. In the end, at the final budget hearing, the council voted unanimously to approve the request. "It was really quite a thrilling thing," Simrall says.

At the end of the day, the Koch-funded campaign backfired. It helped fire up some council members who might not have understood the importance of city fiber; once they knew the Koch brothers were against it, the city's plan got their attention. "That felt pretty good," Simrall says.

Wednesday, March 28, 2018

Yes, if you promise something that you know you will probably never deliver, you are lying. Glad I could clear that up for you.

This Wired piece by Erin Griffith shows how the tech community is starting to come to terms with the damage that hype and magical heuristics have wrought. You should read the whole thing but the following seemed worth singling out.
Historically, the startup world’s “fake it till you make it” culture wasn’t much of a problem; venture investors encouraged startup founders to think big and a high percentage of them fail anyway. So what if someone stretches the truth a little in pursuit of world domination? The nature of technology requires a degree of magical thinking to function. As I wrote in 2016, even the most well-intentioned startup founders have to persuade investors, engineers, and customers to believe in a future where their totally made-up idea will be real:

“That’s not ‘My cola tastes better than yours.’ That’s ‘Let me explain to you how the world’s going to be,’” says Chris Bulger, managing director at Bulger Partners, an investment bank that advises technology companies on acquisitions. “Is that person lying when they turn out to be wrong?”


Tuesday, March 27, 2018

After looking at this 1889 torpedo, you'll find the "Savage" in the name entirely appropriate

Another one of the threads we need to pick up on is how the idea of remote control (both electrical and wireless) changed the way people looked at the world in the late 19th and early 20th centuries.

This very cool-looking prototype torpedo, described in an 1889 issue of Scientific American, is a good example.

Monday, March 26, 2018

One more post on the NIMBY/YIMBY debate

[I realize we've covered a lot of this territory before and I apologize for the redundancy, but I thought it might be nice to sum everything up in one final post.]

Just to have a framework, let's start with some fundamental assumptions of the conventional urbanist wisdom. These are badly oversimplified, but they should be good enough for our purposes here.

The best and easiest way of alleviating the serious externalities associated with commuting (particularly environmental damage) is by having people move near enough to centers of employment that personal transportation (other than bikes) is not necessary.

The best and easiest way of lowering the often exorbitant rents near these centers is by building up.

The best and easiest way of getting high-capacity housing where we most need it is through market forces.

Putting aside arguments for telecommuting (pretty much by definition the fastest and most efficient way to get to work), here are some of my concerns with this model. Ironically, some of them are fairly close to the concerns that urbanists have about suburban sprawl.

Moving is difficult. Buildings are permanent (and they do have an environmental footprint). One of the hidden social costs of home ownership is that it ties the owner to a specific job market. If you are wedded to the idea of making commuting independent of automobiles, this high-density approach faces many of the same challenges, particularly for households with more than one working member. The housing units need to be close enough to a wide enough range of jobs that two people can find housing within easy commute of two different positions and will have a reasonably good chance of staying in the same location in the event of a job change. What's more, that employment center needs to remain relatively stable more or less indefinitely. Booms and busts could play hell with this model.

Actual researchers tend to take a more nuanced and sophisticated view, but in the press, the urban density debate generally treats the choice of where to live as a fairly simple function of two variables: proximity to employment and housing cost. We have reason to believe that the real relationship involves more variables and more complexity, with interactions between proximity to employment and the weighting of other factors. For example, we know that a nontrivial number of people in Los Angeles and the Bay Area will opt for rental options that are both more expensive and farther from work.

Silicon Valley workers living in San Francisco have gotten a lot of coverage, but trendy neighborhoods in LA may be a more useful case for study. "Trendy" is the key word here. We're generally talking about well-paid professionals who are willing to put up with an extra half hour or more of traffic for scenic views, dining and other amenities, and, perhaps most of all, the ability to impress other people with where they live and who their neighbors are. The resulting dynamic can be very much like suburban sprawl, but with the suburb tucked in the middle of a high-density urban area.
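
To make the "more variables, with interactions" point concrete, here is a purely hypothetical scoring function; every coefficient and variable in it is invented for illustration and is not drawn from any of the research alluded to above:

```python
# A made-up location-choice score. The interaction term captures the idea
# that for well-paid professionals who care about trendiness, amenity and
# status effects can swamp the commute and rent penalties.

def location_score(commute_minutes, monthly_rent, amenity_index,
                   income, status_weight):
    """Higher is better; all coefficients are illustrative, not estimated."""
    base = -1.0 * commute_minutes - 0.002 * monthly_rent + 20 * amenity_index
    interaction = status_weight * (income / 1000) * amenity_index
    return base + interaction

# A well-paid professional comparing a close, cheaper unit with a trendy,
# pricier one much farther from work:
close_cheap = location_score(commute_minutes=20, monthly_rent=2200,
                             amenity_index=0.3, income=180_000, status_weight=0.5)
trendy_far = location_score(commute_minutes=50, monthly_rent=3200,
                            amenity_index=0.9, income=180_000, status_weight=0.5)
print(close_cheap, trendy_far)  # the trendy, distant option comes out ahead
```

Under a simple two-variable model the close, cheaper unit wins easily; once the interaction is in play, it doesn't.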

Partially because of the reasons given above, market forces have a very mixed record when it comes to picking the most efficient spot for development. I'll limit my comments to Los Angeles because I know the town, but I believe they could be generalized to a large number of other areas.

A great deal has been written about the NIMBY push against development in Santa Monica. Utopian urbanists like Dave Roberts have gone so far as to suggest that anyone who claims to be an environmentalist and opposes it must be a hypocrite.

The problem with this line of reasoning is that Santa Monica, particularly the extremely expensive section north of the 10 and west of Lincoln, is one of the worst possible places in the county of Los Angeles (and this is a big God damn County) for using high density development to alleviate the impact of commuting and to reduce cost of living.

Geographically, it's bounded on two sides by ocean and mountains, greatly limiting the number of commuting destinations. The constant flow of tourists means that prices will tend to be high and traffic will never, ever be good. The trendiness of the town makes it likely to become an urban suburb and an appealing spot for second homes among the rich. Finally, and perhaps most importantly, the public transportation situation is extremely bad. Other than the buses, which have to deal with the aforementioned traffic, the only option is a single, slow train with a not-that-convenient route. (Don't get me wrong, simply having a train to the ocean is a big step forward for LA, but not nearly big enough to alleviate the traffic woes of a much denser Santa Monica/Venice.)

If the goal really were to create a greener, less car-dependent Los Angeles, Santa Monica developments wouldn't even be on our radar. Instead, we would be focusing on development around transportation hubs, particularly Union Station. There's plenty of room for growth within a two-mile radius, but the best places for development are not in the trendy upscale neighborhoods, and developers know that trendy is where the money is.

Friday, March 23, 2018

Self driving cars

This is Joseph.

A few thoughts on the recent automated car crash.
  1. The cars need to be able to operate without safety drivers to actually do what pundits want (driverless taxis, shared cars).  If they require a safety driver, that is a bad thing.
  2. It sure seems like the failure here was pretty central.  This should have been a case where the car's sensors gave it an advantage over a human driver.
  3. It is a non sequitur to say that the car was following the rules of the road.  Complex urban areas are full of actions that are technically illegal. Ramming rulebreakers at full speed will make traffic much worse and less safe, not better.
  4. There is a hint of catastrophic failure here and in the Tesla crash. This means that we need the rate to be lower than for human-piloted cars, as the severity of incidents may be higher.
  5. Automatic software updates are going to be exciting, as a bad patch is not going to be pretty.
Mike the Mad Biologist did an estimate of the accident rate. Using his figures, the fatal crash rates per billion passenger miles (bpm) are:

Cars 7.28 per bpm
Buses 0.11 per bpm
Motorcycles 213 per bpm

Duncan Black estimates the Uber rate at:

333 per bpm
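
Just to show the arithmetic behind figures like these, here is a minimal sketch (the roughly 3 million autonomous miles assigned to Uber below is an assumption chosen because it reproduces the 333 figure; it is not a number from the post):

```python
# Fatal crashes per billion passenger miles (bpm).
# The Uber mileage below is an assumption, used for illustration only.

def fatal_rate_per_bpm(fatalities, passenger_miles):
    """Fatal crashes per billion passenger miles."""
    return fatalities / (passenger_miles / 1e9)

uber_rate = fatal_rate_per_bpm(fatalities=1, passenger_miles=3e6)
print(f"Uber estimate: {uber_rate:.0f} per bpm")  # ~333

benchmarks = {"Cars": 7.28, "Buses": 0.11, "Motorcycles": 213}
for mode, rate in benchmarks.items():
    print(f"{mode}: {rate} per bpm -> Uber estimate is {uber_rate / rate:,.1f}x higher")
```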

Now, it is true that there has been only one crash so far. But if we assume that crashes are uniformly distributed across driving time, it is worrisome to see the fatal crash happen in the first 5% of the 140 million passenger miles driven.  It certainly could have happened here by chance, but it isn't a reassuring piece of data.

This is doubly true since we'd like self-driving cars to be as safe as buses if we are going to replace public transit with a network of cars.

None of this is to say that making cars smarter is a bad thing.  But it points out the challenges for some of the more extreme applications, like self-driving taxis.  It isn't clear to me that focusing on improved public transit isn't a viable alternative.  

"Adam ruins Facebook"

A bit of a quibble: there is reason to be skeptical about some of these claims of amazing predictive and persuasive power for this kind of targeted marketing (more on that later), but before you start feeling too relieved, there is also reason to believe that this data could be used to do far worse things than encourage a bacon lover to overindulge.






Thursday, March 22, 2018

Tech revisionism and the myth of the killer app

I'm wondering if anyone else out there occasionally has a "blogger moment." It is similar to a "senior moment," but it involves either thinking you posted something that you didn't or failing to remember that you posted something you did. I had one of these this morning when I went looking for what I'd written at the time about this egregious piece of tech revisionism by NPR's Laura Sydell.
Years later, an Edison assistant wrote: "We were sitting around. We'd been working on the telephone — yelling into diaphragms. And Edison turned to me, and he said, 'If we put a needle or a pin on this diaphragm, it'll vibrate, and if we pull a strip of wax paper underneath it, it should leave marks. And then if we pull that piece of paper back, we should hear the talking.' "

Yet, no one knew what to do with this invention. It took 20 years to figure out that music was the killer app.
Even a cursory check of the historical record would show that the ability to record and reproduce (since that's what we mean when we talk about "recording" technology) spoken words, music, etc. was instantly hailed as a major discovery, that people immediately saw the potential, particularly for music, and that there was from day one an enormous push by a wide range of inventors and engineers to make the technology commercially viable.

These illustrations from the October 12, 1889 issue of Scientific American make the point.





Wednesday, March 21, 2018

Repost: Given Facebook's current scandals, this seems like a good time to revisit this argument

I don't know if I've actually come out and said this in so many words, but Facebook should be forced to divest itself of Instagram (along similar lines, Google should be forced to divest itself of YouTube, but that's a topic for another day). As we've previously mentioned, mid-20th-century regulators would never have allowed Facebook to become this large or to achieve this level of monopoly power. They certainly would not have allowed it to hang on to Instagram as well.

Having Instagram in competition with Facebook would not solve the problem, but it would address it in at least a couple of ways. First, to belabor the obvious, competition is good. Second, Facebook has a widely noted aging demographic problem (in my very limited personal experience, the older the friend, the more hours he or she spends on the platform). At this rate, if the company is not allowed to grow through acquisition, the Facebook problem might just take care of itself in time.

Tuesday, December 19, 2017

As if you didn't have enough to worry about






[This is another one of those too-topical-to-ignore topics that I don't have nearly enough time to do justice to, but I suppose that's why God invented blogging.]

There's a huge problem that people aren't talking about nearly enough. More troublingly, when it does get discussed, it is usually treated as a series of unrelated problems, much like a cocaine addict who complains about his drug problem, bankruptcy, divorce, and encounters with loan sharks, but who never makes a causal connection between the items on the list.

Think about all of the recent news stories that are about or are a result of concentration/deregulation of media power and the inevitable consequences. Obviously, net neutrality falls under this category. So does the role that Facebook, and, to a lesser extent, Twitter played in the misinformation that influenced the 2016 election. The role of the platform monopolies in the ongoing implosion of digital journalism has been widely discussed by commentators like Josh Marshall. The Time Warner/AT&T merger has gotten coverage primarily due to the ethically questionable involvement of Donald Trump, with very little being said about the numerous other concerns. Outside of a few fan boys excited over the possibility of seeing the X-Men fight the Avengers, almost no one's talking about Disney's Fox acquisition.

It didn't use to be like this. For most of the 20th century, the government kept a vigilant watch for even potential accumulation of media power. Ownership was restricted. Movie studios were forced to sell their theaters (see United States v. Paramount Pictures, Inc.). The largest radio network was effectively forced to split in two (that's why we have ABC broadcasting today). Media companies were tightly regulated, their workforce was heavily unionized, and they were forced to jump through all manner of hoops before expanding into new markets to ensure that the public good was being served.

In short, the companies were subjected to conditions which we have been told prevent growth, stifle innovation, and kill jobs. We can never know what would've happened had the government given these companies a freer hand, but we can say with certainty that, for media, the postwar era was a period of explosive growth, fantastic advances, and incredible successes, both economically and culturally. It's worth noting that the biggest entertainment franchises of the market-worshiping, anything-goes 21st century were mostly created under the yoke of 20th century regulation.