WearComp and WearCam as nomadic personal infowar machines: Personal empowerment through the use of Reflectionist media as a protective element.

(Adapted from USENIX98 closing keynote speech, at the suggestion of Geert Lovink; text adapted with some suggestions from and interaction with Tom Sherman of Syracuse University)

Prof. Steve Mann, University of Toronto

John S. Quarterman asks the questions: ``Will the U.S. continue to contain more than half of the Internet? Will one company make all the software?''. Similarly we may ask: Will one educational institute manufacture all the world's ideas? Will one security company place us all under surveillance?

That the U.S. government, together with a single company, might control the world's information flow and software, and thus indirectly the world's thoughts and ideas, is troubling to many. One possible solution lies in the tradition of science: disclosure and open peer review, which give anyone the option of acquiring, and thus advancing, the world's knowledge base. A further construct called ``Humanistic Intelligence (HI)'', motivated by the philosophy of science, is proposed. HI provides a new synergy between humans and machines that seeks to involve the human rather than having computers emulate human thought or replace humans. Particular goals of HI are human involvement at the individual level, and providing individuals with tools to challenge society's organizational hegemonies and pre--conceived notions of human--computer relationships. An emphasis in this article is on computational frameworks surrounding ``visual intelligence'' devices, such as video cameras interfaced to computer systems.

Software hegemony, seamlessness of thought, and the building of computer science upon a foundation of secrecy present a series of problems for the individual. The design of advanced computer systems is one area where a single individual can make a tremendous contribution to the advancement of human knowledge. A system that excludes any individual from exploring it fully may prevent that individual from ``thinking outside the box'' and therefore from advancing the state--of--the--art.

Some of the new directions in Human--Computer Interaction (HCI) suggest bringing advanced computing into all aspects of life. Computers everywhere, constantly monitoring our activities and responding ``intelligently'', have the potential to make matters worse from the above perspective, because they may exclude the individual user from knowledge not only of certain aspects of the computer upon his or her desk, but also of the principles of operation and function of everyday things.


The U.S. Department of Agriculture, working together with Delta and Pine Land Company, has engaged in genetic engineering research to develop seeds that produce sterile plants. This effort, they claim, will boost sales of new seeds by preventing farmers from using seeds obtained from their previous year's plants. Unable to re-plant from a previous year's crop, farmers would be forced to buy new seeds each year, thereby increasing Delta and Pine Land Company's revenue.

This proposed invention has created outrage, and critics have pointed out that it could threaten the world's food supply through the spread of sterility. Moreover, there is concern that this single corporation would control the world's food supply. Furthermore, it would end the centuries-old practice of farmers doing their own cross--breeding to adapt plants to local soil and climate conditions.


What do seeds have to do with advanced computing systems? Delta and Pine's ability to genetically turn the reproductive system of seeds on and off is analogous to a compiler's ability to produce executable programs in an intellectually--encrypted form, so that the software vendor can decide whether or not to include source--code and other conceptual disclosure.

Programs distributed with source code are analogous to the traditional seeds that contribute to a healthy and diverse world food supply. Programs that are operationally disclosed (with source code, etc.) will be adapted to local requirements, through modification and sometimes cross--breeding with other program source code. Each disclosed program contributes to the knowledge base of individual users. Not all users will modify the software, but even a small fraction of users doing so can, and often will, improve or contribute to the knowledge base.


A varied supply of ideas, like biodiversity, leads to advancement in thought and to new paradigms in computing. Thus a single entity controlling the world's supply of software, and in some sense its thought (e.g. a single entity providing all of the world's WWW browsers, and thereby influencing the world's Internet content; or a single entity providing all of the world's dictionaries or automatic spell checking programs, thereby influencing the use of language), could reduce the world's diversity of thought. Seamlessness of thought may reduce intellectual diversity, in the same way that monocropping reduces biodiversity.


Science provides us with ever--changing schools of thought, opinions, ideas, and the like, while building upon a foundation of verifiable (and sometimes evolving) truth. The foundations, laws, and theories of science, although true by assumption, may at any time be called into question as new experimental results unfold.

Thus a situation in which one or more of the foundation elements are held in secret is contrary to the principles of science. Although many results in science are treated as a ``black box'' for operational simplicity, there is always the possibility that the evidence may lead us inside that box.

Imagine, for example, conducting an experiment on a chemical reaction between a proprietary solution ``A'', mixed with a secret powder ``B'', brought to a temperature of 325 degrees T. (Top secret temperature scale which you are not allowed to convert to other units).

Now it is quite likely that one could make some new discoveries about the chemical reaction between A and B, without knowing what A and B are, and one might even be able to complete a doctoral dissertation and obtain a PhD for the study of the reaction between A and B (assuming a large enough quantity of A and B were available). However, one might ask where one would publish these findings, except maybe in the Journal of Irreproducible Results.


Imagine a clock designed so that when the cover was lifted off, all the gears would fly out in different directions, making it more difficult for a young child to open up his or her parents' clock and determine how it works. Alternatively, imagine that the clock were loaded with explosives, so that it would completely self--destruct upon opening.

Assuming a child survived such a bad experience, it is still doubtful that devices made in this manner would be good for society, in particular, for the growth and development of young engineers and scientists with a natural curiosity about the world around them.

As the boundary between software and hardware blurs, devices are becoming more and more difficult to understand. This difficulty arises, in part, from deliberate obfuscation on the part of product manufacturers. More and more devices contain general--purpose microprocessors, so that their function depends on software: specificity of function is achieved through specificity of software rather than specificity of physical form. By manufacturing everyday devices that provide only executable code, without source code, manufacturers have provided a first level of obfuscation. Additional obfuscation tools are often used to make the executable task image more difficult to understand. These tools include strippers that remove object link names, and even tools for building encrypted executables containing a dynamic decryption function that generates a narrow sliding window of unencrypted executable, so that only a small fragment of the executable is decrypted at any given time. In this way, not only is the end--user deprived of source code, but the executable itself is encrypted, making it difficult or impossible to examine the code even at the machine--code level.
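As an illustrative sketch only (not any actual vendor's scheme), the sliding--window idea can be demonstrated as follows. The single--byte XOR cipher, key, and window size here are hypothetical stand--ins for whatever a real obfuscation tool would use:

```python
# Sketch of sliding-window executable decryption: only a small window of
# the "machine code" is ever in plaintext at once. The cipher (one-byte
# XOR), key, and window size are hypothetical stand-ins.

KEY = 0x5A      # hypothetical per-binary key
WINDOW = 16     # bytes decrypted at any given time

def encrypt(code: bytes) -> bytes:
    """Produce the encrypted executable image."""
    return bytes(b ^ KEY for b in code)

def run_encrypted(blob: bytes, execute) -> None:
    """Decrypt one narrow window at a time, hand it to 'execute',
    then discard the plaintext fragment before moving on."""
    for offset in range(0, len(blob), WINDOW):
        window = bytes(b ^ KEY for b in blob[offset:offset + WINDOW])
        execute(window)   # only this fragment is ever in the clear
        del window        # plaintext fragment discarded

if __name__ == "__main__":
    plaintext = b"this stands in for machine code"
    blob = encrypt(plaintext)
    recovered = bytearray()
    run_encrypted(blob, recovered.extend)
    assert bytes(recovered) == plaintext
```

The point of the sketch is that a static disassembly of `blob` reveals nothing: the full plaintext exists only as a succession of short-lived fragments inside the running decryption loop.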

Moreover, devices such as Complex Programmable Logic Devices (CPLDs), such as the Altera 7000 series, often have provisions to permanently destroy the data and address lines leading into a device, so that a single--chip device can operate as a finite--state machine yet conceal even its machine--level contents from examination. Devices such as Clipper chips go a step further by incorporating fluorine atoms, so that if the user attempts to put the device into a milling machine, to mill it off layer--by--layer for examination under an electron microscope, the device will self--destruct in a drastic manner that destroys its structure. Thus the Clipper phones could contain a ``trojan horse'', or some other kind of ``back door'', and we might never be able to determine whether or not this is the case. This is yet another example of deliberate obfuscation of the operational principles of everyday things.

Thus we have a growing number of general--purpose devices whose function or purpose depends on software, downloaded code, or microcode. Because this code is intellectually encrypted, so too are the purpose and function of the device. In this way, manufacturers may state one function or purpose, while the actual function or purpose differs, or includes extra features of which we are not aware.


There are a number of researchers who have been proposing new computer user--interfaces based on environmental sensors.

Increasingly we are witnessing the emergence of ``intelligent highways'', ``smart rooms'', ``smart floors'', ``smart ceilings'', ``smart toilets'', ``smart elevators'', ``smart lightswitches'', etc. However, a typical attribute of these ``smart spaces'' is that they were architected by someone other than the occupant. Thus the end--user of the space often does not have full disclosure of the operational characteristics of the sensory apparatus, or of the flow of intelligence data from it.


What is proposed is a computational framework for individual personal empowerment. This framework involves the architecting of a new kind of personal space, through an apparatus that is owned, operated, and controlled by the occupant of that space. In some sense, it is like a building, built for one occupant, and collapsed down around that one occupant. This computational framework for HI is called ``WearComp'', and will now be described.


Typical embodiments of WearComp comprise a body--worn computer system, a visual display over one or both eyes with text and graphics capability, and an input device consisting of typically five or more pushbutton switches that may be operated by one hand. Other input devices typically include a microphone and a video camera positioned so that it provides a view of the same subject matter the wearer sees. While this apparatus may sound, at first, unwieldy, it has evolved, over the last 20 years or so, into a normal--looking (i.e. unobtrusive) form of clothing (http://www.wearcam.org/wearpubs.html).
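The five--switch input device works as a chording keyer: each character is entered by pressing a combination (chord) of buttons at once. As a minimal sketch, with an entirely hypothetical chord table (not the mapping of any actual keyer), the decoding might look like this:

```python
# Sketch of a one-handed chording keyer for a WearComp-style input
# device. Each 5-bit value represents which of the five pushbuttons
# are held down; the chord-to-character table below is hypothetical.

CHORDS = {
    0b00001: 'a',   # thumb only
    0b00010: 'e',   # index finger only
    0b00011: 's',   # thumb + index
    0b00100: 't',   # middle finger only
    0b00110: 'n',   # index + middle
    0b11111: ' ',   # all five buttons: space
}

def decode(chords):
    """Translate a sequence of 5-bit chords into text.
    Chords not in the table are silently skipped."""
    return ''.join(CHORDS.get(c, '') for c in chords)
```

For example, under this made-up table, the chord sequence `[0b00001, 0b00110, 0b00100, 0b00011]` decodes to ``ants''. With 5 buttons there are 31 non-empty chords, enough for a full alphabet plus shift states.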


Personal Imaging is a camera--based computational framework in which the camera behaves as a true extension of the mind and body, after a period of long--term adaptation.


WearCam may function as a personal safety device: Imagine, perhaps as you walk down some quiet street late at night, an assailant wielding a sawn--off shotgun, demanding cash from you. You would likely not have the time or opportunity to pull out a camcorder to record the experience, but since the eyeglasses are worn constantly, you would have captured it. Consider also the implications for human rights violations, police brutality, etc.


While there will no doubt be more environmental intelligence than personal intelligence, there is at least the hope that there might be an end to the drastic imbalance between the two. The individual making a purchase in a department store may have several cameras pointing at him, to ensure that if he removed merchandise without paying there would be evidence. However, in the future, he will have a means of collecting evidence that he did pay for the item, or a recorded statement of a clerk about the refund policy. More extreme examples, such as the case of Latasha Harlins (a customer falsely accused of shoplifting, and fatally shot in the back by a shopkeeper as she attempted to walk out of the shop), also come to mind.


In this sense, WearCam becomes an equalizer, much like the Colt .45 in the Wild West. When there is a standoff, it does not matter whether one person has a big gun and the other a small gun, so long as there is enough ammunition for mutually assured destruction.

In the WearCam case, it is simply a matter of mutually assured accountability, through pictures taken of the other person's activity and conduct. The pictures need not even actually be taken, so long as there is a possibility that they could have been. In this way, the adversary must be on his/her best behaviour at all times. Hence a look-alike apparatus (http://wearcam.org/maybecamera.html) may work as well as a real WearCam.


http://www.wearcam.org/lvac/index.html (retrospective exhibit of various embodiments of WearComp invention as it has evolved over the last 20 years); see also http://wearcam.org/art.html

Steve Mann, ``Reflectionism and Diffusionism: New tactics for deconstructing the video surveillance superhighway'', Leonardo 31(2); draft at http://wearcam.org/leonardo.ps.gz

Geert Lovink, ``The Data Dandy and Sovereign Media: An Introduction to the Media Theory of ADILKNO'', International Symposium on Electronic Art, Helsinki, Finland, 24 August 1994. (alternative anti-hegemonic media strategies)

http://www.linux.org (fully disclosed operating system as alternative to MS-Windows hegemony)

http://www.usenix.org (in particular, the freenix technical session tracks)


Stephen Brown and Jonathan Rose, ``FPGA and CPLD Architectures: A Tutorial'', IEEE Design and Test of Computers (an excellent tutorial on FPGAs and CPLDs); other links at http://wearcam.org/jayarpubs.html