By the mid- to late 1960s, however, schizophrenia was a diagnosis disproportionately applied to the hospital’s growing population of African American men from urban Detroit. Perhaps the most shocking evidence I uncovered was that hospital charts “diagnosed” these men in part because of their symptoms, but also because of their connections to the civil rights movement. Many of the men were sent to Ionia after convictions for crimes that ranged from armed robbery to participation in civil rights protests to property destruction during periods of civil unrest, such as the Detroit riot of 1967. Charts stressed how hallucinations and delusions rendered these men threats not only to other patients, but also to clinicians, ward attendants, and society itself. You’d see comments like “Paranoid against his doctors and the police.” Or, “Would be a danger to society were he not in an institution.”
In what sounds like the setup for a bad “psychological thriller” movie, neuroscientist James Fallon discovered that his brain fits the profile of a psychopath’s: low activity in the orbital cortex.
“You see that? I’m 100 percent. I have the pattern, the risky pattern,” he says, then pauses. “In a sense, I’m a born killer.”
Fallon’s being tongue-in-cheek — sort of. He doesn’t believe his fate or anyone else’s is entirely determined by genes. They merely tip you in one direction or another.
And yet: “When I put the two together, it was frankly a little disturbing,” Fallon says with a laugh. “You start to look at yourself and you say, ‘I may be a sociopath.’ I don’t think I am, but this looks exactly like [the brains of] the psychopaths, the sociopaths, that I’ve seen before.”
I asked his wife, Diane, what she thought of the result.
“I wasn’t too concerned,” she says, laughing. “I mean, I’ve known him since I was 12.”
Diane probably does not need to worry, according to scientists who study this area. They believe that brain patterns and genetic makeup are not enough to make anyone a psychopath. You need a third ingredient: abuse or violence in one’s childhood.
Over at ReadWriteWeb I take a look at the controversy surrounding the Lieberman-Collins bill:
It doesn’t sound like a “kill switch.” The bill would require the President to submit a report describing, among other things, “The actions necessary to preserve the reliable operation and mitigate the consequences of the potential disruption of covered critical infrastructure” (pg. 84, lines 1-4). That sounds like the opposite of a kill switch: this legislation describes a process by which the president is expected to take action to ensure access to “critical infrastructure,” including the Internet.
There’s plenty of room to debate the merits of the federal government dictating the security policies of private companies, the ability of the president to continually extend any provisions beyond 30 days, the value of establishing new cyber security departments within the government, and the vagueness of the language in the bill. But this is nothing nearly so radical as some are making it out to be.
In fact, as Senate Committee on Homeland Security and Governmental Affairs’ web site for the bill points out, the President already has a legislative (but of course, not technological) “kill switch.” The Communications Act of 1934 gave the president power to shut down “wire communications.”
University of Illinois Professor Dolores Albarracin and her team’s research on motivation:
Researchers first tested these two different motivational approaches by telling study participants either to spend a minute wondering whether they would complete a task or to spend it telling themselves they would. The participants showed more success on an anagram task (rearranging the letters of a word to create different words) when they asked themselves whether they would complete it than when they told themselves they would.
In another experiment, students were asked to write two seemingly unrelated sentences, starting with either “I Will” or “Will I,” and then work on the same anagram task. Participants did better when they wrote, “Will I” even though they had no idea that the word writing related to the anagram task. A final experiment added the dimension of having participants complete a test designed to gauge motivation levels. Again, the participants who asked themselves whether they would complete the task did better on the task, and scored significantly higher on the motivation test.
Influential cyberpunk author Rudy Rucker is giving away digital copies of his signature tetralogy, the “ware” series – Software, Wetware, Freeware, and Realware. I’ve only read Software, and it’s great.
It seems like so many sites are just getting so bad with road-blocks and screen hogging ads. It’s getting like it was in the late 90s and early 2000s, with pop-ups. You’d go to a page and you’d get 3 or 4 pop-ups. And now pop-up blockers are built into all browsers, basically. So that’s not even a viable form of advertising any more. So I expect ad-blockers will become a standard part of browsers – I just don’t know how companies are going to expect to profit from advertising in the future.
Shortly after that interview was conducted, Apple announced it was integrating the browser plugin Readability into Safari.
Screenshot from the big sloppy blowjob Ad-Age gave Apple for Reader
Apple is simultaneously shipping ad-blocking technology in its browser and entering the advertising market. Some, such as Ars Technica’s Ken Fisher, cried foul:
So the company that has made an advertising platform a major part of its iOS strategy is also hawking an ad-blocking technology for its Web browser, where it has no stake in ads. App Store: use our unblockable ads, developers! They help you get paid for your hard work! Web: hey, block some ads, readers! They’re annoying!
Chris Arkenberg noted “Apple’s iAd/ad-block duality underscores their anti-Flash agenda. If they can’t own & control the platform, they will try to crush it.”
Certainly this is relevant, especially if Apple doesn’t offer the ability to block its own iAds in apps. And readers of this site know I’m no fan of Apple’s strong-arm control tactics. But it’s worth noting that Reader/Readability is a different sort of ad-blocker.
It loads the full page, ads and all, and then gives the reader the option of blocking ads while they read the content. This means, for impression-driven advertising, that sites still get their pageviews – and advertisers still get some exposure, if only for a few seconds.
Fisher notes this, but worries about articles split up into multiple pages. Sites might not be able to get pageviews for each and every single one of those pages. Cry me a fucking river! The content industry (of which I’m now very much a part, being a writer for ReadWriteWeb) is on the verge of having the rug pulled out from under it by ad-blockers, and you’re worried that you might not be able to tack on a few extra pageviews for longer content? (Some sites already let you hit “single page” or view a printer-friendly version if you don’t want to click through multiple pages.) We’re going to be goddamn lucky if readers stick to Readability and don’t go all out with Ad-Block Plus – or both.
The good news for both advertisers and content providers is that only 40% of Facebook readers say they dislike ads. And I’d guess Facebook users are a fairly representative sample of the Internet. That means that unless browsers start shipping with an Ad-Block Plus-like ad blocker, we can expect that at least 60% of Internet users won’t bother to install an ad-blocker more aggressive than Readability.
But Facebook’s ads are pretty unobtrusive, unlike Salon’s monstrous screen hogging ads that try to drive you off the Internet and back onto paper. Considering 40% of Facebook users are annoyed by Facebook’s relatively tiny ads, the number of readers annoyed by “road block” ads and those “Congratulations, you’ve just been selected to receive a free iFuck-in-Ass” ads is probably more in the 98-99% range (the other 1-2% are too stupid to realize those road-blocks and fake contests are ads). Those really obtrusive ads could ruin it for everyone – they’re what finally drove me to install Ad-Block Plus, something I resisted for years.
Advertisers and publishers need to forget about being able to monetize every single pageview and focus on offering unobtrusive and highly targeted advertising – and ostracize anyone who tries to push obnoxious advertising on the web before browser makers start including real ad-blockers by default.
I’ve read before that although the Romans kept meticulous records of crucifixions, there is no surviving record of a Jewish radical from Nazareth being crucified in the claimed time period. I don’t have references handy, but I can dig some up if anyone’s interested. Christian scholars, when presented with this lack of evidence, have sometimes argued the lack of a record is due to the fact that Jesus was crucified by Jews, not by Romans. However, this Christian scholar actually argues that Jesus wasn’t crucified at all:
The legend of his execution is based on the traditions of the Christian church and artistic illustrations rather than antique texts, according to theologian Gunnar Samuelsson.
He claims the Bible has been misinterpreted as there are no explicit references to the use of nails or to crucifixion – only that Jesus bore a “stauros” towards Calvary, which is not necessarily a cross but can also mean a “pole”. […]
The ancient Greek, Latin and Hebrew literature from Homer to the first century AD describe an arsenal of suspension punishments but none mention “crosses” or “crucifixion.”
Mr Samuelsson, of Gothenburg University, said: “Consequently, the contemporary understanding of crucifixion as a punishment is severely challenged.
“And what’s even more challenging is the same can be concluded about the accounts of the crucifixion of Jesus. The New Testament doesn’t say as much as we’d like to believe.”
However, I would expect the Romans would have kept records of all executions, crucifixions or not, though I suppose the “he was executed by Jews” caveat would still apply.
Samuelsson also claims “That a man named Jesus existed in that part of the world and in that time is well-documented. He left a rather good foot-print in the literature of the time.” My understanding is that there are no surviving contemporary accounts of Jesus, but I could be wrong.
(I still subscribe to the “composite character” theory of Jesus – he was based on several historical Jewish radicals, not a single historical person, and later sexed up with Pagan mythology to make Christianity more palatable.)
In recent decades, it began to seem as though America’s proud tradition of venerating great outlaws might have been lost. We’d grown to accept — and secretly admire — corporate thievery. Our renegades and outlaws, once admired for their boldness in the face of tyrants, were recast in popular culture as degenerates. The logic went that by breaking the law — written to protect the powerful and corrupt — they acted against society itself. Now we know nothing could be further from the truth. Thank God Colton Harris-Moore, the “Teenage Jesse James,” is winning back the hearts and minds of all those people who forgot the allure of a goodhearted outlaw.
Sometimes we hate being wrong because of the consequences. Mistakes can cost us time and money, expose us to danger or inflict harm on others, and erode the trust extended to us by our community. Yet even when we are wrong about completely trivial matters — when we mispronounce a word, mistake our neighbor Emily for our co-worker Anne, make the dinner reservation for Tuesday instead of Thursday — we often respond with embarrassment, irritation, defensiveness, denial, and blame. Deep down, it is wrongness itself that we hate. […]
As ashamed as we may feel of our mistakes, they are not a byproduct of all that’s worst about being human. On the contrary: They’re a byproduct of all that’s best about us. We don’t get things wrong because we are uninformed and lazy and stupid and evil. We get things wrong because we get things right. The more scientists understand about cognitive functioning, the more it becomes clear that our capacity to err is utterly inextricable from what makes the human brain so swift, adaptable, and intelligent.