Ami Angelwings noted that Q chose Picard to test, as a representative specimen of humanity, because his personality illustrates the biggest change from what he would have been in the past (reformatted into paragraph form from @Ami_Angelwings Twitter, May 7, 2015):
I realized that the reason Q tests Picard is that he represents the part of “humanity” that Q needs to see has evolved from the past. While we know him as an intelligent, mature guy, he was a brash womanizer as a kid, who got into fights and loved ’em and left ’em. He left home, was the only freshman to win the marathon, became the youngest captain; he has all these Gary Stu heroic stories. He’s a white straight cis male who shot to the top. He’s not the smartest guy, and even as an explorer he’s still a military commander. Geordi could’ve solved the paradox; Crusher could’ve solved Farpoint. The point is that in other times, Picard would be the privileged oppressor. The test isn’t “can ANY human expand their mind to figure out a paradox?” or “can ANY human have the compassion to solve Farpoint?” The test is “can your white cis straight dude captain with the Gary Stu past, who commands the strongest ship, figure that stuff out?”
[A dialogue with Josh Marsfelder of Vaka Rangi blog, based on a Twitter exchange. My comments are in bold.]
The Federation’s attitude towards treaty obligations in Ensigns of Command, The Vengeance Factor and The Hunted seems to value an interstellar equivalent of the Westphalian nation-state for its own sake. Put that together with the Prime Directive, and the central value of the Federation seems to be stasis. Reminds me a lot of George Bush standing by & letting Saddam suppress the Shia/”Marsh Arab” uprising in Iraq after the war.
Yeah, that sounds about right to me. Do bear in mind where the Enterprise crew’s values stood in relation to that though. And also how Captain Picard’s description of life in the 24th Century in “The Neutral Zone” seems to contradict that. Or, for that matter, the sheer number of times the crew have been put up against Starfleet Command/Federation administration. Once you get to TNG Mark II and DS9, you’ll see how the war with Cardassia might have given the Feds a reason to act that way. A four year war about petty border disputes where entire planetary populations were used as bargaining chips by both sides.
[Note — this is crossposted from C4SS].
These three short stories all come from the same Cory Doctorow collection, Overclocked: Stories of the Future Present (New York: Thunder’s Mouth Press, 2007). Free download here. The three are all set against a background of what I call the “DRM Curtain”: a transnational corporate Empire based on artificial scarcities enforced through a maximalist version of “intellectual property” rights, promoted through trade deals written and lobbied for by the proprietary content industries, and ultimately backed by the military force of the American state. The DRM Curtain’s corporate ruling class is as dependent on police state surveillance and the restriction of information flow as was the bureaucratic oligarchy that ruled the old Soviet empire behind the Iron Curtain.
[Cross-posted from P2P Foundation Blog]
Daniel Suarez. Daemon (Signet, 2009); Freedom(TM) (Dutton, 2010).
I should have known how good these books would be when I saw John Robb of Global Guerrillas listed among Suarez’s advisers on the Acknowledgements page of Daemon. If you’ve been following Robb the last year or so, you know he writes a lot about resilient communities and darknets. Recently, against the backdrop of disruption of the Icelandic volcano, he stated the two principles of resilience:
* Localize production
* Virtualize everything else
Those are, perhaps not coincidentally, the central organizing principles of the new society that emerges in Suarez’s two novels. As Robb describes it in his review of Freedom,
it is a fictional account of the next American revolution (AR 2.0) using resilient communities, open source warfare, systems disruption, individual super-empowerment, parasitic predation, hollow nation-states, etc, (all staples of global guerrilla thinking) as central themes. Very cool.
Any regular reader of this blog, anyone on the P2P Research or Open Manufacturing lists, anyone who follows John Robb, Jeff Vail or David Ronfeldt, should run—not walk—to buy both of these books.
[Reblogged from P2P Foundation Blog]
In the previous installment of this series of review essays, I considered the technological unemployment scenario presented by Martin Ford in The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future.
In this last installment, I will discuss his proposed agenda for dealing with abundance, and then present my own counter-agenda.
[Reblogged from P2P Foundation Blog]
Martin Ford. The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future (Acculant Publishing, 2009).
Of the three works considered in this series of review essays, Ford’s pays by far the most attention to the issue of technological unemployment. It’s the central theme of his book.
Members of the P2P Research and Open Manufacturing lists are probably familiar with the worst-case scenarios for technological unemployment frequently outlined in the posts of member Paul Fernhout. Coupled with draconian social controls and strong IP enforcement, it’s also the scenario of Marshall Brain’s Manna. Still others are surely familiar with similar projections in Jeremy Rifkin’s The End of Work.
Ford writes very much in the same tradition.
But there are significant mitigating features to technological unemployment which Ford fails to address—features which I’ve also raised on-list in debates with Fernhout. Most important is the imploding price of means of production.
[Reblogged from P2P Foundation Blog]
What’s variously called the “cognitive capitalism” model, or Paul Romer’s New Growth Theory, assumes that technological progress and increased efficiency will lead to “economic growth” in the sense of growth in the total volume of monetized economic activity. But this presumes the use of “intellectual property” and other forms of artificial scarcity to capitalize efficiency improvements as a source of rents, rather than allowing market competition to pass reduced costs on to the consumer in the form of lower prices.
But similar assumptions are found, in a weaker form, even among people who aren’t exactly friends of the proprietary content industries. This includes Chris Anderson’s “Freemium” model, and similar arguments by Mike Masnick at Techdirt. Their basic idea, which is great as far as it goes, is to piggyback monetized auxiliary services on free content: Linux distros offering tech support and customization, music companies selling certified authentic copies available at a convenient location, Phish selling concert tickets, etc.
One thing they fail to adequately address, though, is that the total amount of cash available from such auxiliary services is less than what proprietary content brought in. To take Anderson’s example, Encarta sales didn’t bring in money equivalent to the exchange value they destroyed for Britannica et al. And Wikipedia destroyed billions in net monetized value for both hard copy encyclopedias and Encarta.
In Masnick’s and Anderson’s model, though, the total size of the monetized economy overall still continues to increase. A reduction in the total money expenditures (and hence labor) required to obtain a consumer good will simply free up purchasing power and increase demand for new goods in some other sector.
The problem is, this assumes that total demand is infinitely, upwardly elastic.
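The arithmetic behind this objection can be made concrete with a toy sketch (all numbers are invented for illustration, not drawn from Anderson or Masnick):

```python
# Toy model: what happens to total monetized activity when efficiency
# cuts the price of one good. All figures are hypothetical.

def total_spend(price_a, qty_a, spend_elsewhere):
    """Total monetized activity: spending on good A plus all other spending."""
    return price_a * qty_a + spend_elsewhere

# Before: consumers spend $100 on good A and $900 on everything else.
before = total_spend(100, 1, 900)            # 1000

# Efficiency halves A's price. Under the elastic-demand assumption,
# the freed $50 is fully re-spent on new goods, so total spend holds.
elastic = total_spend(50, 1, 900 + 50)       # 1000

# If demand in other sectors is saturated and the $50 is not re-spent,
# the monetized economy shrinks even though consumers are better off.
saturated = total_spend(50, 1, 900)          # 950
```

The "elastic" branch is the assumption the Freemium argument quietly relies on; the "saturated" branch is the case the post argues it neglects.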