Category: Technology and Regulation

Alan Anderson of the Sydney Morning Herald notes that Ronald Reagan’s joke about the government’s view of the economy has become United Nations policy toward the internet. The Belmont Club blog notes that placing control of the Web into the hands of UN regulators would have far-reaching negative consequences:

The United Nations: Working hard to create a less free and less useful internet!

One of the reasons the Internet has been so successful is that it has so far escaped the restraints of Filipino judges, Tunisian government officials and United Nations bureaucrats. Addresses which are published onto the root servers can be resolved and their content displayed, subject to the restrictions of their publishers. The United States, by refusing to regulate the Internet, has occupied the position of an information central banker maintaining the coin of the realm. If lower court Filipino judges and assorted bureaucrats get their way, the pathways of the Internet will be subject to bureaucratic gatekeeping, conducted in the name of “governance”. But the proper word would be debasement.

The moment the free flow of packets over the Internet is no longer substantially guaranteed, it will cease to be trusted. Companies which are building businesses worth billions over the Internet protocols would stop if they knew a relative of the Tunisian President had to be placated for commerce to continue. Applications such as email, instant messaging, searches, e-commerce, online banking, virtual medicine — to name a few — would be at the mercy of bureaucratic caprice, not just in the United States, but in every swamp and backwater imaginable. In the end, governing the Internet, especially in the United Nations sense, might be indistinguishable from destroying it. But one can see how that would appeal to those who yearn for the bad, bad old days.

Blog author: abradley
Wednesday, November 23, 2005

The newest phase in the fight for digital/intellectual property rights involves the recent Digital Rights Management software from Sony. Apparently, Sony’s “protected” audio CDs have been installing a “rootkit” onto your computer, opening it up to yet more malicious software from the Internet (as if things weren’t bad enough already without a Sony rootkit). There are a couple of things I want to say about this – first, a short description of exactly what the problem is; and second, a look at the ethical/moral implications of the situation. (All you Computer Science professors out there: this is a very good case study if you are teaching a class on software ethics.)

So, what exactly happened? Sony, along with many other music companies, has been brainstorming ways to prevent people from copying audio CDs. This is mostly a reaction to the Napster phenomenon of the turn of the millennium, but also to continued audio piracy. Sony’s solution has been to sell protected CDs that install software on the listener’s computer to identify the CD as legitimate and allow playback. The software that Sony CDs have been installing on computers around the world is flawed and has opened up countless machines to new trojans and other malicious software. Sony has since released patches that “remove” the flawed code, although the updated software appears to be equally flawed.

What are the ethical implications? First and foremost, Sony has been installing software on computers without the informed consent or knowledge of its general user base. While this is bad enough, what Sony has been installing is a “rootkit”: a program that has administrative access to everything on your computer and hides certain files. Even granting Sony the benefit of the doubt, this is simply poor decision-making and poor programming. To make matters worse, they have allegedly used plagiarized code. Sony, as an industry leader, should excel in all of these areas by honest, open, and transparent means. A company such as Sony should be at the forefront of developing software and/or hardware that is easy to use, SAFE, and effective, not software that is deceptive and dangerous.
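To make the file-hiding concrete: researchers reported that the Sony rootkit cloaked any file whose name began with the prefix `$sys$` by hooking Windows kernel calls. The following Python sketch is illustrative only; it mimics the visible effect in user space (the function name is mine, the prefix is the one reported):

```python
import os

CLOAK_PREFIX = "$sys$"  # prefix researchers reported the Sony rootkit cloaked

def visible_entries(path="."):
    """Return a directory listing with 'cloaked' names filtered out,
    mimicking in user space what the kernel-level hook did: the files
    still exist on disk, but ordinary listings never show them."""
    return [name for name in os.listdir(path)
            if not name.startswith(CLOAK_PREFIX)]
```

The point the sketch makes is ethical as much as technical: the files are still present and running, but every tool that relies on an honest directory listing, including antivirus software, is blind to them.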

One more thing to say before I’m done venting, in response to the following statement by RIAA president Cary Sherman at a recent press conference:

“The problem with the SonyBMG situation is that the technology they used contained a security vulnerability of which they were unaware. They have apologized for their mistake, ceased manufacture of CDs with that technology, and pulled CDs with that technology from store shelves. Seems very responsible to me. How many times have software applications created the same problem? Lots. I wonder whether they’ve taken as aggressive steps as SonyBMG has when those vulnerabilities were discovered, or did they just post a patch on the Internet?”

People generally know that software they install may contain bugs, and there is an end-user license agreement that spells out the terms for those situations. An audio CD that you simply want to listen to is not equivalent to a general software installation.

Blog author: jballor
Tuesday, November 22, 2005

Check out this Marketplace story about real money being spent in the virtual world. The commodities of online gaming have real-world value to people, to the extent that a virtual island can cost upwards of $26,000 in the world of Project Entropia.

This leads me to ask with the Matrix’s Morpheus: ‘What is “real”? How do you define “real”? If you’re talking about what you can feel, what you can smell, what you can taste and see, then “real” is simply electrical signals interpreted by your brain…’

Thus the power of imagination makes the virtual world seem real. And perhaps for some lonely souls, even more real than the “real” world.

Theologian John Baillie writes, “I have long been of the opinion that the part played by the imagination in the soul’s dealings with God, though it has always been understood by those skilled in the practice of the Christian cure of souls, has never been given proper place in Christian theology which has too much been ruled by intellectualistic preconceptions.” But perhaps there’s some good reason why the imagination has been so treated.

It is precisely the imaginative element of human thinking that is so often used to create idols in our own image. John Calvin writes, “But as to my statement that some erroneously slip into superstition, I do not mean by this that their ingenuousness should free them from blame. For the blindness under which they labor is almost always mixed with proud vanity and obstinacy. Indeed, vanity joined with pride can be detected in the fact that, in seeking God, miserable men do not rise above themselves as they should, but measure him by the yardstick of their own carnal stupidity, and neglect sound investigation; thus out of curiosity they fly off into empty speculations.” There is no doubt that human creativity and ingenuity are gifts of God. But at the same time, they are fallen gifts, and the source of much error and many corrupt and fallible conceptions.

What might Morpheus say about the man who died after playing video games for 50 hours straight?

Morpheus: ‘Your mind makes it real, Neo. If you’re killed in the Matrix, you die here…. The body cannot live without the mind.’

Blog author: jballor
Friday, November 18, 2005

To expand the “scientist” as “priest” metaphor a bit, you may find it interesting to read what Herman Bavinck has to say on the fundamental place of “faith” with respect to all kinds of knowledge, including not only religious but also scientific:

Believing in general is a very common way in which people gain knowledge and certainty. In all areas of life we start by believing. Our natural inclination is to believe. It is only acquired knowledge and experience that teach us skepticism. Faith is the foundation of society and the basis of science. Ultimately all certainty is rooted in faith.

A little later he writes:

Clement of Alexandria in many places uses πιστις to denote all immediate knowledge and certainty and then says that there is no science without belief, that the first principles, including, for example, the existence of God, are believed, not proven. Especially Augustine highlighted the significance of belief for society and science. Those who do not believe, he says, never arrive at knowledge: “Unless you have believed you will not understand.” Belief is the foundation and bond uniting the whole of human society.

The point essentially is that all of us, scientist, pastor, gardener, or surfer, have presuppositions, first principles or principia that are by definition that “on which all proofs ultimately rest, [and] are not themselves susceptible of being proven: they are certain only by and to faith. Proofs, therefore, are compelling only to those who agree with us in accepting those principles. ‘There is no point in arguing against a person who rejects the first principles’ (Contra principia negantem non est disputandum).”

This final Latin phrase that Bavinck quotes, incidentally, is often traced back to Aristotle’s Rhetoric, but also appears in another form in the Summa Theologica of Thomas Aquinas: quod inferiores scientiae nec probant sua principia, nec contra negantem principia disputant, or “the inferior sciences neither prove their principles nor dispute with those who deny them” (ST 1.1.8).

As a brief aside, there is no relationship between the Greek word for faith (πιστις, or pistis) and epistemology as a “theory of knowledge,” which instead comes from Greek words meaning “to stand over.”

Blog author: abradley
Thursday, November 17, 2005

At the UN net summit in Tunis, MIT’s Nicholas Negroponte has showcased his hundred-dollar computer. The small, durable, lime-colored, rubber-encased laptop is powered by a hand crank, and is designed to make technology more accessible to poor children in countries around the world.

If I may speak of ‘trickle-down’ technology, this is the perfect example. This announcement–an announcement of a tool to help poor countries–may not be the best time to note the virtues of richer ones, and I am not trying to steal the UN’s thunder. But there will be those who, like the BBC, will hail this as a great opportunity to narrow “the technology gap between rich and poor.” Indeed it will. But I would like to note that without this gap–one created by the entrepreneurial minds that invented laptop technology to begin with–there would be no laptops for impoverished children. A necessary precursor to this act of charity (in the traditional sense of self-giving love) is the development of the product, and that development takes place best in a free society.

Here at Acton, it is commonly noted that “you have to create wealth before you can distribute it.” The same goes with the creation of our technology, a particular type of wealth. In order to develop those tools which help us all better combat poverty, disease, and other physical ills, we must have the freedom to enact our creative initiative to create those tools. This means entrepreneurship. Which often means capital. Which commonly means people in suits with briefcases that sometimes vote Republican. But by the time we get to this point, many people are crying “oppression!” as if businessman and tyrant meant the same thing.

The point is this: narrowing the technology gap does not mean bringing society back to some default position. We don’t all go back to the equality of zero. Some have the good fortune or the grace to find themselves with particular tools or means. In freedom, some of these people cultivate those gifts, creating something to make other people’s lives better. The span of time when some have this product and others do not–this is not ipso facto a time of injustice (although injustice can arise in these circumstances). It is as often a time when the good work of entrepreneurs is trickling down to touch everyone. And do not be put off by the phrase “trickle down” as if it implied the inherent superiority of the entrepreneurs; it doesn’t. What trickles down is often that which raises men up. Perhaps we can call it grace.

Blog author: jballor
Thursday, November 17, 2005

Thomas Lessl, Associate Professor in the Department of Speech Communication at the University of Georgia, talks about the “priestly voice” of science. He argues that “scientific culture has responded to the pressures of patronage by trying to construct a priestly ethos — by suggesting that it is the singular mediator of knowledge, or at least of whatever knowledge has real value, and should therefore enjoy a commensurate authority. If it could get the public to believe this, its power would vastly increase.”

Lessl makes an important point about the effect of this on popular perceptions of science: “The priestly character of scientific rhetoric has to do with the need to identify science with the most essential human values by making it a world view — by creating a public culture based in scientism. The best known example of this approach to scientific communication in recent memory would be that taken by Carl Sagan. Perhaps more successfully than any other popular writer of the last century, except perhaps H. G. Wells, Sagan was able to create the sense that history has a scientific destiny.”

Read the rest of the interview with Dr. Lessl here.

Blog author: jballor
Wednesday, November 16, 2005

The AP reports that a deal has been struck to continue primary management of the Internet by the United States, following weeks and months of controversy. The EU had been pushing for control of the web to be turned over to a supra-national body, such as the UN.

The accord was reached at The World Summit on the Information Society, an international gathering to examine the “digital divide” between developed and developing nations. While “the summit was originally conceived to address the digital divide–the gap between information haves and have-nots–by raising both consciousness and funds for projects,” the meeting became the forum in which a different dispute was discussed and resolved: “Instead, it has centered largely around Internet governance: oversight of the main computers that control traffic on the Internet by acting as its master directories so Web browsers and e-mail programs can find other computers.”
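Those “master directories” are the DNS root servers: every ordinary name lookup ultimately bottoms out in the hierarchy they anchor, which is why control over them was worth a summit. A minimal sketch of that everyday dependence, using only Python’s standard system resolver (the function name is mine):

```python
import socket

def resolve_ipv4(hostname):
    """Ask the system's resolver -- and, transitively, the DNS hierarchy
    rooted in those master directories -- for a host's IPv4 addresses."""
    infos = socket.getaddrinfo(hostname, None,
                               socket.AF_INET, socket.SOCK_STREAM)
    # Each entry's sockaddr is an (address, port) pair; collect the addresses.
    return sorted({info[4][0] for info in infos})
```

Whoever controls the root of that hierarchy controls, in principle, what every such lookup means, which is exactly the leverage the governance debate was about.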

U.S. Assistant Secretary of Commerce Michael D. Gallagher said that the new agreement means that the onus now lies with the developing world to bring in not just opinions, but investment to expand the Internet to their benefit.

The fundamental basis for the agreement is the establishment of the Internet Governance Forum, a non-binding advisory body that would bring “its stakeholders to the table to discuss the issues affecting the Internet, and its use.” The formation of the forum essentially follows the recommendations made this past June by the UN’s Working Group on Internet Governance.

For more on the issue of Internet governance, check out the Internet Governance Project, “an interdisciplinary consortium of academics with scholarly and practical expertise in international governance, Internet policy, and information and communication technology.”

A paper issued earlier this year by the project focuses on “the six factors that need to be taken into account in working out the details of a forum mechanism” (Download PDF here).

We’ve discussed textual interpretation a bit on this blog here before (here, here, and here). Paul Ricœur, who is famous for his “attempt to combine phenomenological description with hermeneutic interpretation,” passed away earlier this year.

One of Ricœur’s important contributions involved an observation about the nature of textual interpretation in distinction to personal dialogue. He writes, for example in his book Hermeneutics and the Human Sciences,

Dialogue is an exchange of questions and answers; there is no exchange of this sort between the writer and the reader. The writer does not respond to the reader. Rather, the book divides the act of writing and the act of reading into two sides, between which there is no communication. The reader is absent from the act of writing; the writer is absent from the act of reading. The text thus produces a double eclipse of the reader and writer. It thereby replaces the relation of dialogue, which directly connects the voice of one to the hearing of the other.

Ricœur notes some effects of this “double eclipse” and formulates a theory of the “sense of the text” to norm textual interpretation. In Plato’s Phaedrus, Socrates makes a somewhat similar observation about the nature of writing:

I cannot help feeling, Phaedrus, that writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence. And the same may be said of speeches. You would imagine that they had intelligence, but if you want to know anything and put a question to one of them, the speaker always gives one unvarying answer. And when they have been once written down they are tumbled about anywhere among those who may or may not understand them, and know not to whom they should reply, to whom not: and, if they are maltreated or abused, they have no parent to protect them; and they cannot protect or defend themselves.

Of course, general agreement with Socrates and Ricœur does not entail a necessary acceptance of a kind of “sense of the text” radically disconnected from any authorial intent.

Even so, the inherent limits to written communication form an essential point of reference for articulating any coherent interpretive scheme. Yale philosopher Nicholas Wolterstorff, in his 1993 Wilde lectures, published under the title Divine Discourse, makes a key point in his critique of Ricœur on the pervasiveness of the “double eclipse” problem:

It is not only the temporal endurance of texts but also the spatial transportability of texts which grounds the difficulties of interpretation to which Ricoeur calls attention. But our technological ability to broadcast utterance, as well as record it, has the consequence that we are forced to interpret even “live,” non-recorded, utterance in situations spatially distanced from the originating situation. Thus what Ricoeur attributes to writing is in fact equally true of recorded and broadcast utterance. Ricoeur conducts his discussion as if we were living in a pre-Edisonian age!

Blog author: abradley
Thursday, September 22, 2005

George Orwell published 1984 in 1949, long before the PC came along. Tiny cameras were not available, and Big Brother typically had to be physically watching you (either in person or through a stationary camera) to catch you in a crime (the book was political, of course, not technological). Either way, Big Brother was always watching you. Now we have PCs, the Internet, and tiny cameras everywhere, available to all. And of course, Big Brother wants to see everything.

Although I hate writing about how the modern world reflects more and more what we see described in Orwell’s novel (Wikipedia suggests that “Orwell is reported to have said that the book described what he saw as the actual situation in the United Kingdom in 1948”), it seems fitting to remind people of the dangers of allowing too much access to information. PC PRO published a news item today talking about some ideas the European Commission has:

The European Commission has accepted proposals to log details of all telephone, email and Internet traffic in an attempt to combat terrorism and serious crime.

The proposals, which are designed to harmonise data retention practices across the EU, will need the backing of all 25 member states. However some states believe they have been watered down in response to pressure from telecommunications firms and civil rights groups.

If these proposals are the watered-down version, I shudder to think what the original proposals might have been! Just wait for the proposals that will flow when we all have RFID tags surgically inserted at birth “to make purchasing goods at the grocery store easier.”

Blog author: jballor
Thursday, August 11, 2005

Reading this story about a man who played video games to death, I find it likely that an already existing addiction will be newly documented: Vidiocy.

My mom used to call me a “little vidiot” when I was a kid because I liked watching TV so much, but I submit this as a possible term for video game “addictions.” According to other reports, the man named Lee really was dedicated to the god of technology, as he “recently quit his job to spend more time playing games.”

Of course, maybe he didn’t really die, he just left “The Matrix.”