Prof. Steve Mann, University of Toronto
John S. Quarterman asks the questions: ``Will the U.S. continue to contain more than half of the Internet? Will one company make all the software?'' Similarly, we may ask: Will one educational institution manufacture all the world's ideas? Will one security company place us all under surveillance?
That the U.S. government, together with a single company, might control the world's information flow and software, and thus, indirectly, the world's thoughts and ideas, is troubling to many. One possible solution to this problem lies in the tradition of science: the notion of disclosure and open peer review as a basis for allowing anyone the option of acquiring, and thus advancing, the world's knowledge base. A further construct called ``Humanistic Intelligence (HI)'', motivated by the philosophy of science, is proposed. HI provides a new synergy between humans and machines that seeks to involve the human rather than having computers emulate human thought or replace humans. Particular goals of HI are human involvement at the individual level, and providing individuals with tools to challenge society's organizational hegemonies and preconceived notions of human--computer relationships. An emphasis in this article is on computational frameworks surrounding ``visual intelligence'' devices, such as video cameras interfaced to computer systems.
Software hegemony, seamlessness of thought, and the building of computer science upon a foundation of secrecy present a series of problems to the individual. Advanced computer systems are one area where a single individual can make a tremendous contribution to the advancement of human knowledge. A system that excludes any individual from exploring it fully may prevent that individual from ``thinking outside the box'' and therefore from advancing the state--of--the--art.
Some of the new directions in Human--Computer Interaction (HCI) suggest bringing advanced computing into all aspects of life. Computers everywhere, constantly monitoring our activities and responding ``intelligently'', have the potential to make matters worse, from the above perspective, because they may exclude the individual user from knowledge not only of certain aspects of the computer upon his or her desk, but also of the principles of operation and the function of everyday things.
This proposed invention has created outrage in the community, and critics have pointed out that the invention could threaten the world's food supply through the spreading of sterility. Moreover, there is concern that this single corporation would control the world's food supply. Furthermore, this would end the centuries-old practice of farmers doing cross--breeding themselves to adapt plants to local soil and climate conditions.
Programs distributed with source code are analogous to the traditional seeds that contribute to a healthy and diverse world food supply. Programs that are operationally disclosed (with source code, etc.) will be adapted to local requirements, through modification and sometimes cross--breeding with other program source code. Each disclosed program contributes to the knowledge base of individual users. Not all users will modify the software, but even the small fraction who do can, and often will, improve or contribute to the knowledge base.
Thus a situation in which one or more of the foundation elements are held in secret is contrary to the principles of science. Although many results in science are treated as a ``black box'', for operational simplicity, there is always the possibility that the evidence may want to lead us inside that box.
Imagine, for example, conducting an experiment on a chemical reaction between a proprietary solution ``A'', mixed with a secret powder ``B'', brought to a temperature of 325 degrees T. (Top secret temperature scale which you are not allowed to convert to other units).
Now it is quite likely that one could make some new discoveries about the chemical reaction between A and B, without knowing what A and B are, and one might even be able to complete a doctoral dissertation and obtain a PhD for the study of the reaction between A and B (assuming a large enough quantity of A and B were available). However, one might ask where one would publish these findings, except maybe in the Journal of Irreproducible Results.
Assuming a child survived such a bad experience, it is still doubtful that devices made in this manner would be good for society, in particular, for the growth and development of young engineers and scientists with a natural curiosity about the world around them.
As the boundary between software and hardware blurs, devices are becoming more and more difficult to understand. This difficulty arises in part as a result of deliberate obfuscation on the part of product manufacturers. More and more devices contain general--purpose microprocessors, so that their function depends on software: specificity of function is achieved through specificity of software rather than specificity of physical form. By shipping everyday devices with only executable code, without source code, manufacturers have provided a first level of obfuscation. Furthermore, additional obfuscation tools are often used to make the executable task image more difficult to understand. These tools include strippers that remove object link names, etc., and even tools for building encrypted executables containing a dynamic decryption function that generates a narrow sliding window of unencrypted executable, so that only a small fragment of the executable is decrypted at any given time. In this way, not only is the end--user deprived of source code, but the executable code itself is encrypted, making it difficult or impossible to examine the code even at the machine--code level.
Moreover, Complex Programmable Logic Devices (CPLDs), such as the Altera 7000 series, often have provisions to permanently destroy the data and address lines leading into the device, so that a single--chip device can operate as a finite--state machine yet conceal even its machine--level contents from examination. Devices such as Clipper chips go a step further by incorporating fluorine atoms, so that if the user attempts to put the device into a milling machine, to mill off layer--by--layer for examination under an electron microscope, the device will self--destruct in a drastic manner that destroys its structure. Thus the Clipper phones could contain a ``trojan horse'', or some other kind of ``back door'', and we might never be able to determine whether or not this is the case. This is yet another example of deliberate obfuscation of the operational principles of everyday things.
Thus we have a growing number of general--purpose devices whose function or purpose depends on software, downloaded code, or microcode. Because this code is intellectually encrypted, the purpose and function of the device are encrypted as well. In this way, manufacturers may provide us with a stated function or purpose, but the actual function or purpose may differ, or may include extra features, of which we are not aware.
Increasingly we are witnessing the emergence of ``intelligent highways'', ``smart rooms'', ``smart floors'', ``smart ceilings'', ``smart toilets'', ``smart elevators'', ``smart lightswitches'', etc. However, a typical attribute of these ``smart spaces'' is that they were architected by someone other than the occupant. Thus the end--user of the space often does not have full disclosure of the operational characteristics of the sensory apparatus and of the flow of intelligence data from that apparatus.
In the WearCam case, it is simply a matter of mutually assured accountability, through pictures taken of the other person's activity and conduct. The pictures do not even need to actually be taken, so long as there is a possibility that they could have been. In this way, the adversary must be on his or her best behaviour at all times. Hence a look--alike apparatus (http://wearcam.org/maybecamera.html) may work as well as a real WearCam.
http://www.wearcam.org/lvac/index.html (retrospective exhibit of various embodiments of WearComp invention as it has evolved over the last 20 years); see also http://wearcam.org/art.html
``Reflectionism and Diffusionism: New tactics for deconstructing the video surveillance superhighway'', Leonardo 31(2); rough draft at http://wearcam.org/leonardo.ps.gz
Geert Lovink, ``The Data Dandy and Sovereign Media: An Introduction to the Media Theory of ADILKNO'', International Symposium on Electronic Art, Helsinki, Finland, 24 August 1994. (alternative anti-hegemonic media strategies)
http://www.linux.org (fully disclosed operating system as alternative to MS-Windows hegemony)
http://www.usenix.org (in particular, the freenix technical session tracks)
http://hi.eecg.toronto.edu/wearpubs.html
An excellent tutorial on FPGAs and CPLDs may be found in ``FPGA and CPLD Architectures: A Tutorial'', Stephen Brown and Jonathan Rose, IEEE Design and Test of Computers; other links at http://wearcam.org/jayarpubs.html