UBC Theses and Dissertations
Technology and Creativity in the 21st Century: A Philosophical Narrative of an Arts Educator. LaMonde, Anne-Marie R., 2002.

TECHNOLOGY AND CREATIVITY IN THE 21st CENTURY: A PHILOSOPHICAL NARRATIVE OF AN ARTS EDUCATOR

by

ANNE-MARIE R. LAMONDE

B.P.E., The University of Calgary, 1990
B.Ed., The University of Calgary, 1991

A THESIS SUBMITTED IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE STUDIES (Department of Curriculum Studies; Music Education)

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA

April 2002

© Anne-Marie R. LaMonde, 2002

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of Curriculum Studies, The University of British Columbia, Vancouver, Canada
Date: April 2002

ABSTRACT

This thesis is an exploration of the relationships that potentially exist between technology and creativity, with the purpose of addressing one of the greatest conundrums in the classroom, namely, the nurture, assessment, and evaluation of creativity in a technologically rich environment. Addressing those relationships is believed to be the first step toward solving problems inherent in pedagogy, but it must be preceded by a shared (i.e., general) understanding of both phenomena. It is believed, however, that the development of such an understanding is constrained by the theoretical gap that exists between views of the general natures of those phenomena.
Existing studies on technology and creativity, both quantitative and qualitative, have resulted in an increase in knowledge that is principally 'particular' and brimming with variables. The abundance of particular knowledge, however, conceals the possibility of a 'general' theory that may reveal the essential characters of either phenomenon. Touching on historical research and on problems inherent in pedagogical means that focus on technology or creativity, the writer attempts to demonstrate the need for a theoretical understanding. Moreover, a personal narrative is interwoven to reveal several troublesome dialogues (i.e., constructivism and postmodernism) that impede further research in creativity and technology. McLuhan's notions of medium and message offer a new lens through which to view media generally and, as such, may serve to address the classic paradoxes of dualities: mind/body, concrete/abstract, percept/concept, theory/praxis, and figure/ground. Additionally, the significance of the work undertaken by L.S. Vygotsky and his student, A.R. Luria, on language and cognition is reviewed. From that perspective, a metaphorical comparison is made to technology and creativity, respectively. The core relationship between technology and creativity is philosophically interpreted as similar to, if not the same as, the relationships arising from such paradoxes as medium/message and art/science.

TABLE OF CONTENTS

Abstract
Table of Contents
List of Figures
Preface
Acknowledgements
CHAPTER ONE  Introduction & Overview
1.1 An Introduction: The Call to Inquiry
1.2 Origins of Western Paradox: Implications for Discourse
1.3 The General Problem: A Conundrum
1.4 Research on Creativity: Focusing the Research Questions

CHAPTER TWO  Literature Review & Personal Narrative
2.1 The Road to Inquiry: A Journey into History and Philosophy
2.2 Reflections on Teaching: An Uneasy Discovery
2.3 Flashback: The Beginning of the Journey
2.4 The Voice of Reason: A Critical Reflection
2.5 A History of Mind: The Missing Dialectic

CHAPTER THREE  Methodology & Analysis
3.1 Methodology: The Road to Knowledge
3.2 Mysterious Methods: The Paradox Within Means
3.3 Constructivism: Its Claim to Being a Creative Approach
3.3.1 First Incident
3.3.2 Second Incident
3.4 Awakened from Slumber
3.5 Propositional Logic: Tearing the Medium Down
3.6 Teacher as Facilitator: A Question of Whether the Computer Can Do As Well
3.7 Activity for Activity's Sake: Another Misconception
3.8 Authenticity: Another Paradox
3.9 Pragmatism: The Extended Arm of Functionalism
3.10 When the End is the Means
3.11 Pluralism: The Doctrine of Multiple Sign Systems
3.12 Intelligence and Creativity: In Search of a Definition
3.13 Oliver Sacks: Stories that Shatter Descartes' Doubt
3.14 Language: The Bias of Hearing
3.15 Vygotsky's Study: Closing in on Creativity
3.16 Luria's Study: Further Insights into Abstract Reasoning and Creativity
3.16.1 Concept Definitions
3.16.2 Deduction and Inference
3.16.3 Problem-solving and Reasoning
3.16.4 Imagination
3.16.5 Subjects with Formal Instruction

CHAPTER FOUR  Conclusion and Recommendations
4.1 Planes of Consciousness: The Conclusion
4.1.1 Creativity Research
4.1.2 Arts and Sciences
4.1.3 Means and Methods
4.1.4 Medium and Message
4.1.5 Intelligence and Creativity
4.1.6 Language, Technology, and Art
4.1.7 Language and Creativity
4.1.8 Imagery and Abstraction
4.2 The Road Ahead

Endnotes
References

LIST OF FIGURES

Figure 1. Tetrad: Postmodernism
Figure 2. Tetrad: Constructivism
Figure 3. Museum Exhibit
Figure 4. Tetrad: Spoken Word

PREFACE

The passionate investigation of an enigmatic 'thing' will often lead one into boundless metaphysical adventures—all of which create inexhaustible possibilities. Certainly this particular investigation has turned out to be a collection of enthusiastic expeditions rooted deep within the historical and the autobiographical. The autobiographical may prove to be overly presumptuous and not nearly stringent enough, and the criticisms of historical and current theories may prove to be entirely incorrect. In fairness to all who have contributed to the discourse within, myself included, I gratefully acknowledge bold expressions. It is not an easy thing to allow one's thoughts to be judged, a fact confirmed in Arthur Koestler's (1959) writings, which described at least one individual of great fame who warily guarded his thoughts until he could no longer contain them. He was Copernicus, and with much nervousness, he held his thinking at bay. My anxiety has been no less, though unlike Copernicus, I have benefited from 20th-century discourse. In these modern and postmodern days, some four hundred years after the advent of mass literacy, allowances have been made for the limitations of the printed word. Moreover, there is recognition of the extraordinary manner in which the printed word regenerates into new ideas almost the instant the word is scribed. The printed word bears an astonishing potential to lay the foundation for abstract, i.e., symbolic, thought. The irony lies in the fact that the printed word is arguably distant and removed from the sensory, yet within its figural boundary it has the ability to touch us on an emotional and very physical level.
It could be said that the inorganic printed word co-exists with the organic creator in a playful interchange: its 'techne' form altered by the human being, who is conversely altered by its form. Symbolic, metaphoric, figural, sequential, abstract, and logical, its medium affects the systems of the brain's left hemisphere in opposition to the systems of the brain's right hemisphere, which are primarily kinetic, spatial, and visual. Whatever single sensory privation may befall us, the printed word necessarily requires sight and touch—and I daresay 'imagined' sound and silence, rhythm and tempo. And therein lies the inexorable connection: the acoustic space that traverses the two hemispheres and joins the two polar opposites, concept and percept. After all, there are times when we seem to 'hear' what has not been spoken, 'feel' what has not been intimated, and 'see' what has not been pictured. The actualization of that last thought emerged at the very instant the thesis herein, with its various messages and meanings, came to life during its oral defense. For the reconciliation of the paradoxes of medium (i.e., text) and message (i.e., the meanings within) was enabled through additional media, namely, a short film that tried to capture the sensibilities of the many threads of inquiry, as well as dialogue, gesture, and expression. When we are faced with paradox, the struggle to make sense of its contrary and difficult nature may be overwhelming. Any inquiry that proposes to study technology and creativity invariably reveals an overarching paradox formed by a vast matrix of polarities felt in both discourse and practice. Often noted to be the struggle between theory and praxis, inquiry attempts to unravel the axioms embedded in discourse that is both metaphysical and epistemological, i.e., mind/body, abstract/concrete, percept/concept, and art/science.
Like so many maxims that bear the character of being true only if the opposite is false, almost all phenomena bear the weight of contradictory forces. Nevertheless, when we seek to understand a phenomenon, we may choose to carefully attend to oppositions in an effort to reveal the balance to which nature subscribes—not of a 'middle' ground but of the playful interchange between figure and ground.

ACKNOWLEDGMENTS

I would like to thank my children, Natalie, Marci, and Gregory, for their kindness, patience, and love, without which I could never have dedicated my time to such a daunting endeavor. As well, many thanks to my parents, Jean and Paulette, whose love for the arts and natural enthusiasm toward inquiry imbued me with the fortitude to pursue difficult quests throughout my life. And my gratitude also goes out to my partner, Jayson, who provided a joyous and caring context from which to grow. I am forever indebted to my advisor, mentor, and friend, Dr. Peter Gouzouasis. His intelligence, energy, and quick wit enabled my thoughts, while his support, generosity, and trust allowed me to spread my wings and journey into the unknown. I am also deeply grateful for the support of the faculty in the Department of Curriculum Studies at UBC, especially Dr. Stephen Petrina, whose kind words of encouragement, thoughtful questions, and carefully worded rebuttals gave me pause to rethink my position in many instances. Lastly, there are numerous friends, colleagues, and students who have offered a patient and sympathetic ear to my near-constant ramblings on matters pertaining to my thesis. I could not name them all, for there have been too many over the past few years; however, I would be remiss if I did not mention my comrade-in-arts, Bill Gemmill, whose unfailing exchange through e-mail fueled many thought-provoking conversations. His insights on the many directions my thoughts could take me provided a rich ground for decisive action.
CHAPTER ONE

An Introduction: The Call to Inquiry

It is an extraordinary feeling to be able to look back on my life and realize that every intentional pursuit has been both joyful and rewarding. How unfortunate it would be to find displeasure in one's labor. Throughout my life, I have been indebted to several remarkable persons, most of whom, being teachers, inspired me to take my passionate interests into the classroom. And since the start of my teaching venture, it has been a fantastic voyage. What better way to ensure a lifetime of learning, in effect, than to be employed in the pursuit of knowledge? As I now endeavor to instruct beginning teachers in the art and science of Music Education, the challenge has only pushed me toward a new plane of consciousness. My passion for the arts began in very early childhood and, in one form or another, a rainbow of performing arts became the window to my world. My first schooling experience, upon emigrating from France to Canada, unlocked the beauty of English through speech arts—dramatic performances that kept me engaged for over a decade. During those early years, I also developed my stage talents through local theatre, became occupied with piano and dance lessons, and wrote dozens of short stories and poems in my treasured journal. In high school, where the arts were plentiful, I threw myself into drama, music, and English literature, trusting that my destiny lay with the pen, the piano, and the stage. I devoted my freshman and sophomore years of university to the study of theatre and cinematic arts, with a secondary focus in English literature, but graduated some time later with two degrees in dance and education.
Several decades and three children later, I had been a modern dance performer and instructor, a film critic and radio D.J., the producer and host of a local TV show, a lead in an independent Canadian feature film, an amateur filmmaker of several short subject films (shot on 8mm), and an elementary and middle school music teacher. What a journey! Admittedly, when I began my teacher education—in my case, a practical decision—there was a symphony of emotional fugues that were difficult to reconcile, for in becoming an educator it seemed that my own artistic dreams were running out. Nonetheless, while I entreated others to discover a passion for the arts, I took refuge in continued endeavors whenever time allowed, as have many educators who consider that their special knowledge and skills must be exercised daily. My passion for science was also awakened during my junior and senior years, although it hardly appeared as an awakening in the beginning. My degree in dance, which required three years of various biological and anatomical sciences, became a battle of wit and physique as the intensity of the daily physical workout interfered with the intellectual challenges of study. Of all the sciences I took, including biology, zoology, biomechanics, and physiology, it was anatomy that intrigued me the most. Unlike my pre-medical colleagues, who were instructed through Gray's Anatomy tome, we studied, to my great displeasure, rhesus monkeys and cadavers. Working through my initial horror, I became fascinated with organic structures and found great pleasure in studying the skeletal, vascular, and muscular systems of the body. It was particularly absorbing to reflect on the internal and external physical dimensions of the human being. Nonetheless, philosophical study became even more entrancing.
I had enrolled in a few philosophy courses along the way (i.e., formal logic and metaphysics) after failing miserably to convince my philosophy of dance professor that 'nature is the true dancer.' My interests had broadened to include Russian literature, European history, aesthetics, and anthropology. It would seem as though my artistic pastimes grew steadily in competition with my academic studies. As such, it had been quite tempting to join the ranks of generalist teachers, for I wanted to draw from my diverse preoccupations. Instead, I accepted the role of 'specialist' in arts and technology with a strong focus in performance. Ultimately, it was a wise decision, for the arts are filled with the content of life. It was not long, in fact, before I discovered that my knowledge and skills in the arts found equal footing with intellectual undertakings. Through artistic enterprises and curricular development, it became equally fulfilling to be both artist and teacher. In particular, I found it gratifying to encourage children to perform their own compositions—their expressions of delight could not be bought. In the end, upon receiving my permanent teaching status, there came a point of no return. Any wavering, or feeling of regret, was soon dispelled by the thrill of teaching. Teaching Music Education at the university level, as I currently do, requires a very different focus than my years of instruction in music performance and composition. This is because, while performance, composition, improvisation, and all other endeavors are still part of the daily learning, Music Education's main focus, being pedagogy, is a different medium. Music explorations must be met on several planes of thought: one is devoted to music theory and praxis, while the other is devoted to the art and science of pedagogy. What is pedagogy?
It is, simply stated, the art and science of teaching, though this can be amplified to mean: the genius of making comprehensible what is usually transparent to performers in order to transform them into willful partakers. It is the capacity to awaken the mind and, simultaneously, teach the steps that awaken. It is engaging in creative endeavors and, simultaneously, showing the 'tools' of creativity—no small feat for an educator. But in that moment of reckoning, whereupon the delicate processes of art and science are wrought by self-realization, the coupling of theory and praxis becomes an exhilarating accomplishment. We might question whether pedagogy is not more art than science, for anyone who has ever tried to teach a concept from a practical application can attest to the internal wrestling between concrete and abstract thought that ensues, a matter not easily resolved by mere scientific means. It is a point Malaguzzi (1993a) strongly emphasized: "When teaching is monodirectional and rigidly structured according to some science, it becomes intolerable, prejudicial, and damaging to the dignity of both teacher and learner" (p. 77). Though this statement implies that science is without art, neurologist and physician Oliver Sacks would argue that science is more than mere procedure. In fact, while those who fail to understand extraordinary feats in science or medicine frequently credit achievement as arising from mysterious means, Sacks (1973) dismisses this notion: "What is the mystery which passes any method or procedure, and is essentially different from algorithm or strategy? It is art" (p. 247). The same could be said of extraordinary achievement in education.
Every educator, unfortunately, has witnessed teaching dominated by what Malaguzzi (1993a) referred to as "directives, ritualized procedures, systems of evaluation (which Benjamin Bloom believed should be properly guiding models of education), and rigid cognitivistic curriculum packages, complete with ready-made scripts and reinforcement contingencies" (p. 77). Educators will undoubtedly recognize that those words contain some truth and, most likely, will agree that when society comes to the full realization that the human spirit always outwits a science filled with schema, strategy, and probabilities, pedagogy will be reconceptualized to mean both an art and a science. Artistry requires logic for clarity of thought, as much as logic demands artistry so as to provoke thought and passion. If we merely consider art from a Platonic perspective, however, art contains an element of deception, an imitation of reality. Art, in this sense, can be seen as misleading the audience through sophistry or trickery. Science, on the other hand, may be considered to be the practical application of critical inquiry—the pragmatics of philosophy—and, hence, to rest solely on logical proofs. Nonetheless, the applied arts, as the name suggests, have a practical nature and, conversely, scientific inquiry has frequently arisen out of the imagination of artists. Clearly there can be strong assumptions made of the two means of expression and inquiry, but elements of each necessarily exist while in pursuit of epistemological truth. Whatever assumptions are made, however, when a single view excludes the significance of the other, we are inevitably led to discordant actions—very often unjust. The injustice that arises when our thinking is governed by duality, however, has historical precedent.
If our consciousness were released from the iron clutches of the mechanistic views of Newton and the incessant rationalism of Descartes, and, moreover, freed from the rigid notions of scientific positivism, science might become rightfully reunited with art. Although there have been great socio-political and cultural changes since those periods of time, judging from the fact that educators in Canada generally do not recognize the role of the arts, it would appear that little improvement has been made with respect to the coupling of art and science. Although the humanist movement influenced educators of the 20th century to turn their backs on methods of 'objective' detachment that threatened the human spirit, the rise of American pragmatism seemed to influence the educator to merely embrace applied arts and sciences (Weiner, 2000). Thus burdened by the exigency of praxis, education throughout the 20th century has suffered from an unbalanced curriculum where activity and performance supersede abstract endeavor—whether in arts or science, education has been reduced in current practice to the instruction of techniques. That, unfortunately, is especially true of music education. Yet, should pedagogy embrace a balance of means (scientific means when reality is confounded by mystery and artistic means when wonder is derailed by procedure), education could take on new significance for our youth, who are increasingly disenchanted with formal learning. The world of medicine surely would be transformed if science were coupled with art, exclaimed Sacks (1973): "If this were clearly understood, no trouble would arise. Folly enters when we try to 'reduce' metaphysical terms and matters to mechanical ones: worlds to systems, particulars to categories, impressions to analyses, and realities to abstractions. This is the madness of the last three centuries, the madness which so many of us—as individuals—go through, and by which all of us are tempted.
It is this Newtonian-Lockean-Cartesian view—variously paraphrased in medicine, biology, politics, industry, etc.—which reduces men to machines, automata, puppets, dolls, blank tablets, formulae, ciphers, systems and reflexes. It is this, in particular, which has rendered so much of our recent and current medical literature unfruitful, unreadable, inhuman, and unreal" (p. 204). The awakening Sacks eagerly hopes for in the world of medicine may be what we await in the field of education. We may wonder what ingenuity is required of the pedagogue who practices the art and science of coupling theory and praxis. Or, for that matter, whether the process is truly ingenious; after all, every teacher inevitably grapples with this problem. The 'genius' of teaching, that is, pedagogy, may be nothing more than stylish personality mixed with procedural technique. But these are really red herrings because, like any performance art, pedagogy requires so much more. Consider the two dimensions of thought required of theory and praxis: abstract and concrete. Those cognitive means, controlled by disparate apertures, invariably render two different focal points. The brain must toggle mysteriously between the concrete and the abstract and, because the two are paradoxical, the mind cannot focus on both simultaneously without, to use a photographic metaphor, increasing the depth of field by adjusting the aperture and shutter speed. Before expanding on the previous thought, a few words must be said. In the strictest of terms, that description may resemble some peculiar division between the 'camera' of the mind, or the sensory, and the 'visual' processing found in the 'mind.' On the one hand this analogy appears to 'mechanize' the mind and, on the other, it appears to split the mind and body. Nothing in this description, however, is further from the truth.
Although 'sensors' can be placed in robots (i.e., crash test dummies), and although the mind has been compared to a Central Processing Unit (CPU), we have not yet mechanically constructed something that fully captures the essence of the human being with all of its nuances. To begin a conversation about the definition of what it means to be human is yet another metaphysical road that technology and creativity take us down. The discourse on humans and cyborgs has a rich history in literature and film. And for me, it is epitomized in the novel by Philip K. Dick (1980), Do Androids Dream of Electric Sheep? As to splitting the mind and body, it must be possible to come to grips with what Kant (1998) expressed. To paraphrase, he held that the senses are utterly 'blind' without the mind and the mind 'empty' without the senses. Returning, therefore, to the issue of concrete and abstract thought, by revisiting the lifework of Piaget (1969) we can clarify some important points. More specifically, revisiting the work through updated experiments, we can take note of new developments. While re-enacting Piaget's famous water-pouring experiment, Jerome Bruner (1966) discovered that the mind's 'depth of field' may be deepened by changing the parameters. Two odd-shaped containers of water were shown to children, as in Piaget's original experiment. In a slight variation, cards covered the containers from view. As the children watched the water being poured into the hidden containers, they each reasoned correctly that both containers held the same amount of water. But when the cards were removed, each child reverted to what they thought they saw in the original experiment, that is to say, that the tall, thin container held more liquid than the short, fat container.
Here was some indication that the conceptual or abstract was in conflict with the practical view, although upon maturity, the 'depth of view' would eventually increase to allow the abstract and the concrete to 'toggle' back and forth. Maturity, however, does not guarantee that 'abstraction' will occur. Most literate adults will agree that conceptual constructs are not readily apprehended even after prolonged observation. If the study is concrete in nature, abstraction is unlikely to occur without some assistance from a technical aid, e.g., a text or an individual speaking. Even in adulthood, the theoretical will remain at a distance from the "lower plane of consciousness," a term borrowed from Edelman (2000) referring to concrete thought. From this lower plane of consciousness, practical applications will appear to solely require dexterity and coordination, fortitude, a raw will or instinct, memory, and a quick 'wit' that translates into a physicality of action as an interaction between material entities; whereas theoretical knowledge will appear to require a plodding wit, as it nestles furtively in the mind, where it may form a spiritual, mythical, and, arguably, implausible reality of metaphysical magnitude. That distinction had certainly been formed through my impressions during youthful years of training as a dancer. It was evident that when engaged in dance performance, I was obliged to push away any internal monologue or abstractions, and trust that my memory and my body served me well. Yet mature years in dance were only sustained and nurtured with the discovery of a multitude of conceptual views, in particular when engaged in choreographic endeavors, for practical applications wore themselves thin without deeper understanding and philosophical perspective. A theoretically constructed view of reality may, in fact, appear impractical to the layperson—perhaps even delusional.
Yet abstractions are the medium through which concrete phenomena are made comprehensible—except to anyone who thinks solely in practical ways, or thinks that they think in solely practical ways. As Vygotsky (1962) pointed out, even in the strictest method that Piaget undertook to collect his data, "facts are always examined in the light of some theory and therefore cannot be disentangled from philosophy" (p. 11). Although Piaget demonstrated, in effect, that the mind moves from a practical, childlike reality toward the comprehension of theory (i.e., an ontogenetic phenomenon), in no way does this imply that a mature mind continues along the same path. As a matter of fact, if more attention is given to cognition, there is ample evidence to show that both perspectives, equally necessary and accessible in adulthood, tend to leapfrog one over the other: the concrete affecting the abstract, the abstract transforming the concrete. Moreover, an important fact is often overlooked, if understood to exist at all: a stage of complex thinking, neither wholly concrete nor abstract, also plays a significant part in our mental development as children and adults. Taking those thoughts one step further, the concrete outlook may be considered particular in nature, whereas the abstract may be seen as a general principle or schema. Adults come to understand the world from a particular and a general outlook, the former being relative, the latter universal. A universal construct is most often accepted as part of a collective consciousness. Nonetheless, it can only have been conceived, ontogenetically, from the particular, even though concepts, once formed, may further denote, connote, and refer, a priori. Since particulars that lead to universals may be falsely concluded, as induction will sometimes prove, some other factor has to enable the process by which universal truths emerge.
This paragraph, of course, contains the start of the layers and layers of discourse on epistemology that have evolved over the past 400 years of Western thought. Without reaching too far back, our thinking may at once be pushed in Sartre's (1953) direction, for existentialism still has its tentacles wrapped around our thoughts. Insofar as he proposed that I can never know other, and in light of our propensity for relative truth, we are forced to consider knowledge with some doubt. In psychological terms, a particular view is derived from an internal psyche, whether expressed or not. When expressed, however, its meaning becomes interpreted by an external psyche. It is this 'external' psyche that adjudicates whether the particulars fit the larger population. Many will argue that the method of moving from the particular to the general, inductively, cannot be trusted—there is always 'an exception to the rule' to be found. In all likelihood, a flawed conclusion may ensue from inductive logic. Notwithstanding, inquiry drawn from a generalization, as deduction implies, is unlikely to have come to pass without encountering induction in the process. Most cognitive scientists assert that abstraction arose from a 'lower plane of consciousness.' In this sense, therefore, it is possible that generalizations (i.e., abstractions) may correctly arise from a particular instance (at least insofar as triggering a solution to a problem). For the most part, however, the method of deduction avoids the pitfall of arriving at a conclusion based on particulars alone and thereby excluding the exceptions to the rule. In order to fit a social psyche, in other words, a general view that sets us on a common plane of understanding, particulars must be considerably simplified for universal truths to arise. Indeed, Einstein (1961) once stated that everything must be made "simple but not simpler."
Nevertheless, Sartre (1953) contended that absolute truth would not be found, for the paradox of subject and object is irreconcilable, and the object will forever remain unknowable to the subject. The burden of solipsism may be logically avoided by concluding that, given the historical development of humanity, we cannot rule out the capacity for generalities to be abstracted from the concrete. Moreover, despite a general penchant for existentialism, with its reverence for personal revelation and individualism, it is clear that particular views cannot sustain the fabric of society. The subject, in principle, has a need to 'commune' with the object whether it is enveloped in one domain or another, i.e., psychic or physical. All those who followed Sartre were as caught up in subjectivity as those who had followed Descartes had been caught up in objectivity. When Werner Heisenberg (1958) firmly contested 'objectivity' and contended that reality is dependent on the subject, he acknowledged, nevertheless, the interdependence between the internal and external realms. From the following excerpt, it is clear that he considers both the 'measuring device' and the human mind to be responsible for 'finding' the truth:

To what extent, then, have we finally come to an objective description of the world, especially of the atomic world? In classical physics, science started from the belief—or should one say from the illusion—that we could describe the world or at least parts of the world without any reference to ourselves. ... This again emphasizes a subjective element in the description of atomic events, since the measuring device has been constructed by the observer, and we have to remember that what we observe is not nature in itself but nature exposed to our method of questioning (pp. 55, 58). 
Notwithstanding, there are those whose belief in a divine law predicates their outlook on life, making such matters moot, for in the grand scheme of things, whatever means is afforded to us, we can only be acting by proxy according to the mind of God. In other words, all answers to life's ultimate questions are rooted in divine inspiration. Whereas the existentialist eliminates God, the scientist attempts to understand God's mind—if more metaphorically than literally. Whatever the source, the matter rests on the fact that neither the subject nor the object can be wholly ignored in inquiry. Gazing upon the struggles inherent in disparate views is like seeing out calmly from the eye of a tornado into a gigantic whirlwind of energy, formed by the conflict of not one paradox but of a number so great as to cause a tremendous upheaval. Even if we could pull such paradoxes apart, for the sole purpose of examining their unique composition, we ought to know that the phenomenon we witness can only be repeated from the pairing of contradictory states—in the case of the tornado, when hot and cold pressures meet. Above all, we have to keep in mind an important fact about any given phenomenon: it is always made up of layers of phenomena, for as McLuhan (1963) correctly identified, "the content of any medium is always another medium" (p. 23). When content is reduced to political play, it is merely the result of a polemic struggle whereby each side, often holding a personal grudge against the opponent, is moved to act by greed. Machiavelli would serve us well if we chose to pursue this line of political discourse. At any rate, the formation of a general principle (i.e., an abstraction) coupled with praxis (i.e., the concrete) is the very essence of truth. Arguably, generalities may be strongly contested, but when accepted, it is because others (a) share a depth of knowledge sufficient to comprehend their simplicity, and (b) share a breadth of experience. 
History, as a matter of fact, is filled with examples of individuals who have been able to do the one thing Edward Sapir (1949) brilliantly asserted. That is to say, "The world of our experiences must be enormously simplified and generalized before it is possible to make a symbolic inventory of all our experiences of things and relations" (p. 12). Nonetheless, history also shows that brilliant 'ideas' have been reviled until such time as the 'universe' was ready to receive them. Where many fail to apprehend the general principle that may govern a phenomenon, a few extraordinary individuals have captured and expressed it with elegance and simplicity—even if time is needed for full acknowledgement. So often in literate societies, the majority of dedicated crusaders seeking Truth fixate on the perplexing swirl of unintelligible fragments of information—like the bits and pieces that are swept up by the tornado—and, with earnest endeavor, try to analyze the composition of those fragments but overlook a simple fact. In the years preceding Einstein, with Newton's laws firmly entrenched, light was not harnessed until Einstein asserted its constancy. Physicists had circled around and around the 'acceleration' of light, but Einstein merely presented a simple fact, ipso facto. That is to say, light particles have a constancy of speed that, to the present, has held true in every experiment. It literally stared everyone in the face. But truth is unkind to those who observe the tornado of facts from a distance, and even meaner to those who are swept up into the debris. Truth seems to emerge only with those few who manage to wend their way into the calm center. With the help of an ingenious contraption, a few will discover the simple essence of a thing or event. 
Another recounting may serve to describe the difference between 'imagery' that remains closely aligned with experience versus abstractions that require a stepping away from the concrete, and how this 'calm center' allows for greater understanding. While Ptolemy's universe grew increasingly complicated, and neither Copernicus nor Kepler could do much to free it from its particular construct, Galileo eventually asserted a simple matter that allowed the universe to tumble into an intelligible whole (Koestler, 1959). With the assistance of a concrete viewer, the extended telescopic eye—by no means created from concrete thinking—Galileo tried to push forward a general idea that did away with geometric shapes as a means for understanding the way the planets are positioned and travel. Though geometry is a theoretical construct, as anyone who has studied geometry will avow, geometric shapes arranged one inside the other form an image too concrete for such an abstract cosmos (as Ptolemy tried to prove)—even if it suited the unschooled minds of the religious clerics. It could be likened to the naive individual who stated, "The world is really a flat plate supported on the back of a giant tortoise" and, when queried upon what the tortoise stands, replied, "it's turtles all the way down" (Hawking, 1988, p. 1). Those are but a very few examples, in retrospect, that demonstrate our existence has been shaped by fantastic voyages into the abstract. With an exceptional leap during the past century into the realm of light, abstractions have created one of the most remarkably advanced epochs since the origin of Homo sapiens. Though every domain of knowledge has contributed, it is within cosmology that abstract breakthroughs have notably marked our current world. Our world was shaped by the intellect of players such as Planck, Einstein, Poincaré, Heisenberg, Bohr, and so on. 
Their abstractions have resulted in the making and releasing of a very concrete, never before imagined mushroom cloud of energy and, at the same time, became the foundation for the miniature world of silicon with an invisible but equally insidious force. One thing is certain, as far as abstract thinking is concerned: once any individual reaches a degree of theoretical understanding, it is inevitable that the individual—barring any neurological anomalies—cannot remain in this 'closed' state of intellect, for there is always a practical action to be taken when passion calls. It is clear, after all, that logic and emotion infrequently work concurrently—though they must develop a kind of harmony to allow extraordinary insight. To wit, there exist numerous maxims that remind us of the interplay between logic and passion, e.g., 'reason before passion.' For Einstein, reason kept his emotions in check when battling political and religious issues; for Alan Turing—the mastermind behind digital technology—it was the physical challenge of competitive foot racing. Though theories dominated their thoughts, they each carefully carried out their ideas concretely. In view of their 'hot-headed' and practical natures (as many have claimed), in fact, the realization of their dreams could not have arisen without first engaging in 'thought' experiments, followed by 'concrete' applications. Einstein (1961) himself swore that language did not play a role in his imaginings—a claim most often made by musicians, graphic artists, and mathematicians. I venture, however, that language becomes the very center of the arts, mathematics, and the sciences. Moreover, entertaining a 'thought' experiment seems to suggest that language is at play at least part of the time—in the sense that algorithms or syllogisms are word problems—although for those who deal strictly in images, sounds, and theorems, the language aspect may be diminished. 
It is difficult to imagine a language-deprived individual solving the riddle of the cosmos, however. It goes without saying that those who cannot abide the idle character of words would do well to consider the following point: words serve a rather important expressive and intellectual function. A dislike for 'abstract' conversations (i.e., dialectic) led Hume and others to demand action over words (a fair request of the soldier in the field). The attitude one takes regarding 'action' over words will prevent many from considering the very thing that makes us distinctly human. After all, what distinguishes us from the action-oriented world of animals is language—therein a distinction worthy of some careful attention. To McLuhan (1963), language is yet another creative technology—the extension of human minds, ears, eyes, hands and limbs—and its impact on cognition is dramatic. It can be said truthfully that humans cannot exist as human beings without the aid of technology. Of particular interest to linguists (though clearly important to educators), language holds great significance with respect to intellect. More than a century after Humboldt suggested that language undergoes infinite change from within a finite premise, Noam Chomsky (1957, 1968) defined a deep structural language potential innately present in humans. That revolutionary idea formed what is known as the theory of Universal Grammar. His position simply galvanized the field of linguistics—as all generalities have tended to do in their respective domains after having sagged under the weight of particulars. Chomsky (2000) described the tension that existed between linguistic researchers as having arisen from "the search for descriptive adequacy [which] seems to lead to ever greater complexity and variety of rule systems" and the search for "explanatory adequacy [which] requires that [the] structure must be invariant" (p. 7). This description could readily apply to any domain of knowledge, including physics. 
At any rate, his published response to B. F. Skinner's Verbal Behavior was quite effectively the ultimate triumph of Modern Rationalism (i.e., organicism) and the demise of behaviorism—a testament to Chomsky's extraordinary contribution. Likewise, Chomsky's genius proves the manner in which one generalized view is never the final word on the matter. Notwithstanding, in a throwback to their Skinnerian roots, social linguists continue to do battle over varied aspects of language, including taking Chomsky to task for his internalist view. Though the majority of linguists will concede to a language instinct—noticeably present in young infants—there are those who take exception to some of his extended views on such matters as semantics. Linguistics, however, is a field that has imploded in its search for answers that extend much further than the structural or descriptive nature of language. A study of sign systems, and all things connected to the sign, so named semiotics, has evolved into a hotbed of topics: from a debate over the metaphysics of artificial intelligence, to the search for socio-biological determinants, to a plethora of new models of mind and beyond. Strongly influenced by the works of C. S. Peirce (Brent, 1991), semiotics ventures toward paradigmatic constructs where cognition, language, sign, and society all meet to form a "galaxy" of ideas—as Eco (1997) put it. In short, with Frege's (1980) notion of context as its terminus a quo, the entire field, since the advent of new media, has been both prolific and volatile. Semiotics, in effect, is comparable to physics, for it too has imploded from the eye of the storm into a universe of grand ideas. 
Like physicists who continue to utilize Newton's laws and Einstein's theory but ceaselessly labor toward new concepts, semioticians will refer to their 'forefathers' (i.e., Kant, Peirce, and Frege) while they continue to develop new ideas through an incessant exchange on matters of increasingly complex problems. Ironically, much of the semiotic movement has been carried out by pragmatists, not metaphysicians—leaving 'abstract' metaphysical musings on the back burner. The irony is contained in the fact that, in recent years, knowledge (i.e., physics) has been created by the very scope of metaphysical discourse once so quickly dismissed by logical positivists and followers of the pragmatism espoused by William James. The remarkable thing about the unprecedented appearance of a general principle is that it usually goes unnoticed until a practical application ensues. Generalizations are both soothing to society (complex issues may be explained simply) and suspect (simple explanations appear far too 'simple' to be correct). In the former instance, our understanding may be immediately clarified, albeit suspiciously so. And simple explanations allow for an explosion of extraordinary new undertakings—clearly a theory will extend beyond the boundaries for which it was originally intended to apply. This fact was made clear when I recently read the biography of John Nash Jr., the Nobel Prize-winning mathematician whose game theory has been applied to an extremely wide scope of ventures (Nasar, 1998). The suspicion that surrounds general principles comes from the fact that many generalizations are formed a priori. To make this statement is to open an enormous epistemological and metaphysical discourse. Without presently becoming entangled in an argument, it will suffice to say that an intellectual battlefield may inevitably take shape precisely because the nature of a generalization is to offer an unproven theory. 
Sadly, despite the deductive nature of theoretical constructs, many unschooled individuals will accuse a theory of being nothing more than opinion. If only our students could be obliged to read Plato's Republic, there might be some understanding of the distinction between theory and opinion. Nevertheless, once a theory is put into practice (i.e., tested), conflict then arises from the fallout that ensues when new inventions (i.e., media) encroach on established ways of living—so proving the multiple layers of media that shape conflict. A 'watershed' idea, like extraordinary art, will be filled with contradictions and, consequently, be vexed by conflict. Though conflict will need further exploration (as far as it relates to the 'tension' expressed in creative works), two important points need to be stated. Insofar as creative works are concerned, there are two agents always at work: (1) creativity always involves more than the inventor, and (2) both abstract and concrete cognitive processes must be present. In other words, looking at creativity from the pure 'absolute,' creativity (i.e., historically creative) requires some form of 'external' judgment. And, from the same point of view, creativity arises from an advanced cognitive state of 'mind.' Creative works necessarily arise out of an 'idea' (i.e., image or thought)—not excluding the 'senses' that lead to an idea—which must ultimately be 'tested' in the physical world (even paradigmatic thinking is 'tested' against the 'real' world). 'Testing' ideas may or may not be carried out by the inventor. Because ideas frequently have broad applications, 'testing' will take the shape of many individuals in many different fields through many years of study. On the other hand, if the inventor carries out an idea, the 'test' necessarily comes in the form of a 'public reaction.' In either case, there is both an abstract and a concrete state of consciousness at play, i.e., an idea and a practical consequence. 
Perhaps the reader will judge the preceding as being closely aligned with pragmatism. Nonetheless, insofar as the pragmatist generally turns away from 'abstract' notions, the position is not merely practical in nature. It contains notions of 'identity' and social perception that are not 'practical' notions to explore. To suggest that a creative work is judged against social perception is also to invite discourse on many social phenomena and their interplay with the individual. In effect, the fact that many creative individuals long for social acceptance yet, in failing to obtain it, continue to be creatively productive has many interesting ontological and psychological implications. That those creative individuals frequently receive recognition posthumously is also interesting and requires some thoughtful study. In any case, the preceding does not define creativity, but it goes a long way toward presenting two plausible components for creative works to ensue: (1) more than one person's view is involved, and (2) more than one state of consciousness arises. The notion that creativity involves a 'public' was posited by at least one creativity researcher: Csikszentmihalyi (1990, 1996) made this suggestion, alongside his famous treatise on the nature of 'flow.' Nevertheless, the preceding position regarding two plausible components has never been completely formulated, much less applied, as far as I am aware. The idea that there exist two levels of consciousness for creativity to occur may be illustrated by naming individuals whose 'ideas' were ultimately put to practical application. 
The following brief list of historical figures, therefore, exemplifies the genius of those who unleashed a 'generalization' upon our consciousness and whose abstractions were 'practically' and successfully applied: Arendt, Aristotle, Barr, Chomsky, Dickens, Derrida, Descartes, Duncan, Einstein, Gödel, Gould, Heidegger, Hegel, Heisenberg, Hugo, Kant, Luria, McLuhan, Neumann, Piaget, Picasso, Plato, Sacks, Sapir, Sartre, Turing, and Vygotsky. Those few names have been deliberately selected because of the diversity of thought from philosophy to science, and from the Arts to technology, but by no means do they represent a cohesive center. Each of those persons, in fact, shares only two things in common: they brilliantly premiered an abstract idea that became the impetus for a flurry of activity not without controversy, and they each have focused our attention, at one time or another, on technology—whether it be language or otherwise. I have another underlying reason for naming those persons, which is not readily discernible at present. All of them will serve to illustrate ideas that are proposed in this thesis, or they have given birth to ideas that contribute to the shaping of the thoughts contained herein. Nevertheless, before moving on, the choice of the preceding list of individuals needs to be elaborated for social and political reasons. First, those are not the only individuals that the reader will encounter throughout, for many significant players make up the list of characters. And, second, I am well aware that my list is largely drawn from a male population, a fact that might alert the gender-sensitive person. Since this is bound to be a point of concern for certain individuals reading this thesis, I have to give some plausible reason that hopefully will ring true and not detract from the content found herein. 
Fortunately, while there are noteworthy female contributors who have often refocused our attention and sharply outlined particular views, the general principles tied to this thesis have, for better or worse, originated in male-dominated domains of interest, i.e., music, art, philosophy, physics, psychology, technology, etc. We can only surmise what kind of world would have arisen if these domains had been dominated by females—a statement that does not downplay the political, moral, social, and economic barriers that have prevented females from historical significance. As it stands, however, since generalities, as the word implies, are not gender specific, that is to say, particular, I see no disadvantage in extending the 'ideas' that markedly shed light on the vexing issue of creativity and technology. Additionally, it has become evident that gender, according to the new sciences, has fallen into a rather nebulous field of study due to the fact that we are becoming increasingly aware that at least five distinct biological genders, made up of phenotypical and genotypical factors, are in existence. Perhaps, in light of new scientific discoveries, gender concerns will eventually recede, possibly when redress and retribution are made to all parties, male and female. Clearly, it is not an understatement to say that all human beings have been affected, to one degree or another, by dominant forces. Whether or not we can specifically determine those to be solely 'masculine' is suspect given the biological evidence that demonstrates many more conditions are at play in the human psyche. Whatever forces have been at play, it was brilliantly asserted by Simone de Beauvoir that everyone suffers from injustice: "It is a criminal paradox to refuse women all kinds of opportunities for self-realization and then entrust to them the most delicate and serious task there is: the forming of a human being" (Vintges, 1996, p. 31). 
Origins of Western Paradox: Implications for Discourse

Whenever I face a new group of students, I project on a screen a detail of Raphael's painting The School of Athens. It depicts Plato pointing to the heavens while firmly grasping his Timaeus, a work of abstract metaphysics. Standing next to him, Aristotle clutches his Ethics while he gestures toward the earth, as if to suggest we must remain grounded. It is my deliberate way of illustrating that two paradoxical views have masterminded the whole of Western thought, even though on occasion a student will contend that it is strictly a masculine view. At any rate, I suggest to them that we may consider Plato to have shaped our sensibility toward Nature: all material things are merely an extension of our mind, including our body. For our mind is bestowed with innate qualities wisely selected from omnipotent origins, and exists in a purely spiritual dimension. Plato loaned to Descartes a deep conviction that the mind is a divine inheritance. It is separate from the material world and, hence, only the human mind, destined for rational thought, rises above mere instinct. It is not, in short, dependent upon the senses. Though many would deny any connections to Descartes, even in the 20th century our view of the psyche was borrowed from Plato, i.e., social and moral judgment, will, identity, motivation, dreams, and so on. He laid the psychological foundation for a determined, preordained, and whole universe where change was meted out by the hand of God. Like his 'spiritual' progeny, Plato was wary of human artifacts that feigned to raise consciousness to a material reality, for the spirit, after all, should aspire toward a divine truth. Music, poetry, and all manner of persuasion, he surmised, would only lead to false assumptions as to the nature of reality, for an individual could rise from the cave of shadows to peer inside the omniscient mind only by the grace of the Gods. 
Artifacts, he proclaimed, would ultimately mislead human understanding. It was a belief so loudly exhorted that, in an odd backlash, it alerted a few to the possibility that a profound relationship between mind and body actually exists—though many would venture little more than a mechanistic union. Many more individuals, however, developed a disdain toward materialism. Among those inheritors, we find the religious zealots, with hardly a need to understand beyond the will of God. Aristotle, by contrast, was Plato's archetypal nemesis, though I interpret that he was really just a good pupil who challenged his mentor's ideas. In the vernacular of story structure, if Plato was the protagonist, Aristotle was the antagonist. It is always interesting to note, of course, that for both protagonist and antagonist, roles are reversed whenever the story is rendered from the opposite point of view. In other words, everyone is the hero of his/her own story. At any rate, Aristotle was interested in understanding natural phenomena that he could observe, a posteriori, and determined to understand reality as it transformed before his eyes. Though he did not depend on media, or instruments of inquiry, per se, his view of the extensions of the mind (i.e., the body) played an important role in creating an outlook regarding perception. He paved the way for scientific inquiry, especially as it undertook naming, classifying, and categorizing. When in a materialistic frame of mind, Aristotle concluded that we begin to know the world first through experience. Noting that material elements had qualities that could transform into other substances, he ostensibly laid the foundations for the scientific theory of motion (i.e., physics); and he labored to outline the laws and methods by which material things were technically created, such as the law of poetics. As an idealist, by contrast, Plato saw the world through rational, immutable means destined by divine atavism—beings subject to the will of the Gods. 
The Nature perspective, particularly when viewed as divine gifts, could be considered a direct descendant of Platonism and, by contrast, the nurture perspective, of Aristotle. Those two perspectives, ostensibly the start of Western discourses on reality and knowledge, would eventually splinter into two opposing views fuelled by Descartes' argument—multifarious strands that continue unabated in modern society. At this juncture, one might ask: What possible difference could it make to a group of students who aspire to become teachers where Western thought originated? The answer has always been obvious to me, but since students and educators alike have posed this question, I will venture a rationale. There is not one field of inquiry that has refrained from asking at one point or another: What is ultimate truth? What is ultimate reality? And there is not one discourse that has not been affected, to one degree or another, by paradoxical views, such as those held by Aristotle and Plato. In the field of education, where knowledge and reality meet upon all planes of inquiry, nothing could be more important. Before we climb the mountain of obstacles that impede our understanding, we need to find sturdy footholds and handholds to carry our actions forward. What came before is certain to affect what will come after. More importantly, to carry the rock-climbing metaphor further, the lead climber is only assured security by the leverage of the climber (i.e., belayer) beneath, and the climber beneath is only freed from injury if the lead has secured the bolts (i.e., pitons) above. In the end, perhaps this lack of prior knowledge has been the biggest impediment to an extraordinary amount of educational research in the 20th century—and, for that matter, to thoughtful practice. 
The contentious storm that rages among educators in contemporary times is principally attached to a discourse on mind, which is laid precariously on the pinnacle of the Nature versus nurture controversy. What other interest have educators but that of the human mind? Educators will quip that there is concern for the physical, emotional, and intellectual. Do we need to separate those aspects? Have we not reached the point when we can clearly admit that the mind and body are inextricably linked, but that the 'mind' is indeed that to which we refer when we speak of intelligence? How do we define an 'intelligent' body or emotion? As a dancer and musician, I cannot deny the delicate balance between mind and body, but I am not ready to suppose that my 'body' would be able to act in the same way if my mind were not intact, no more than my 'mind' would be able to carry on its capacities without the intactness of my body. At any rate, once ripped open most deliberately by Descartes and patched together with some difficulty by Kant, where once the mind was rooted in metaphysical inquiry, it is now rooted in psychological concerns. Hence, educational psychology is a necessary course of study for pre-service teachers. But psychology is a young field, no more than approximately 125 years in existence, with origins firmly embedded in the philosophical tree of knowledge. To wit, the Nature and nurture paradox, in effect, is decidedly a metaphysical concern, which evolved into a contentious matter over knowledge, a matter that seems to inseparably divide the mind and body. Essentially, on an epistemological note, I speak of the controversy of knowledge acquired either by percept, by concept, or by a combination of both. Those are not simple matters to resolve and, moreover, they cannot be easily brushed away with platitudes or with an appeal to one or another authority. 
They must be earnestly grappled with philosophically, and abstractly, for no concrete 'image' or impression can sufficiently address the issue deeply enough to anchor pedagogy. Though Maxine Greene (1998) once declared that we must know ourselves before we can embark on a pedagogical journey, she would be the first to suggest that we must come to know our collective 'thoughts' as they have been remarkably recorded over the past three millennia. Peirce (1998) claimed that impressions give way to predicates, thereby forming hypothetical inferences, or concepts, and that those abstractions, as species (categories), are not real but rather a 'token' of our impressions. Inevitably, when a person's belief is based on perceptual knowledge, the natural conclusion is to accuse the 'dialectic' (i.e., the entertaining of absolutes) of being impractical. Because abstraction is prone to misinterpretation, practical understanding is called for in situations that require immediate action. If practical understanding is all that is needed, however, why bother to consider absolutes (i.e., concepts)? Peirce admitted that, while there is no satisfactory explanation for the shift from impression to conception, human reason is markedly influenced by the capacity to abstract, or in other words, to think things that have no concrete bearing. Peirce undoubtedly overlooked Socrates, who made it clear that entertaining absolutes forces one to consider matters that reach beyond either common sense (i.e., opinion) or practicality, for "until the person is able to abstract and define rationally the idea of good, and unless he can run the gauntlet of all objections, and is ready to disprove them, not by appeals to opinion, but to absolute truth, never faltering at any step of the arguments—unless he can do all this, you would say that he knows neither the idea of good nor any other good" (Plato, 1999, p. 196). Is it not possible that therein lies a matter that needs our fullest attention? 
Can we fully understand our 'mind' from the foundations of a practical 'science' alone? It would seem that, at the very least, educators would give some thought to such matters. Yet the study of philosophy (i.e., of metaphysics and epistemology) is not demanded of the pre-service teacher by education faculties. The requirement is merely the study of the philosophy of education, a rather euphemistic endeavor. Even in my own teacher education, philosophy was sorely missing, and were it not for the demand of philosophical inquiry in dance I might very well have ignored its importance. The 'philosophy of education' actually involves a spectrum of topics from curriculum development to principles of teaching. The discussion, impressively practical and rooted in the "things" of education, would have undoubtedly pleased Peirce. As Hume once suggested, it is the nature of philosophy to be abandoned in favor of practical considerations and, I suppose, there can be nothing more practical than the practicum year. Pushing my point a bit further, however, how deeply are students usually entreated to explore Western discourse (or any discourse) during their meagre training? From within which well do they draw the waters of knowledge that will quench their critical thirst, a thirst they are encouraged to develop? It turns out that philosophical inquiry seems to be merely a token homage to the past even at the graduate level and, moreover, it seems to have something to do with gender issues. One graduate textbook I was required to purchase for a course on curriculum issues ventured one or two paragraphs on Bacon, Locke, Heidegger, and a brief mention of Pythagoras. Not much more is offered to undergraduates. Other than a glancing mention of propositional logic and of the uses of language as communication, philosophy is treated as synonymous with whimsy and personal belief. 
I will admit that since the inception of philosophical inquiry, worldviews have taken shape along various branches of concern, with physical, social, biological, and cognitive sciences most recently forming a new foundation. All of those domains contribute somewhat to modern educational practice, yet the very same issues that plague human endeavor in contemporary times have been battled among philosophers throughout history. Anyone who chooses to read Plato, for instance, will discover that Socrates had relatively the same concerns over human behavior as we do today and, ironically, offered much the same advice. There has to be some rational connection to the fabric of the past that lends insight into the present. At the very least, we must wonder whether the repetitive themes and patterns of themes do not suggest something of the nature of cognition—a worthy undertaking for the educator whose primary concern is just that. The average individual's distance from philosophical inquiry can be reasonably accounted for, as there are far too many pressing concerns with an equal number of contemporary authorities ready to offer a solution. But for the educator, whose responsibility is to develop pedagogical insight, like any artist or scientist who seeks to rise above the din of the insipid enterprise, it would seem an imperative to lay critical foundations before structures are erected. Is this not an educator's credo? Is the educator not devoted to developing critical thinking, and does this not necessarily require abstract as well as concrete thought? As it so happens, however, most educators would balk at the suggestion that foundations are not concertedly addressed during every stage of educational development from primary to post-secondary, including teacher education. 
Universities might even take offence to comments made by Malaguzzi (1993a) in the following statement: "The preparation of teachers to work with children is, I believe, a sort of legally sanctioned farce, really unspeakable..." (p. 65). A question recently posed by a student—who demanded to know whether there was any practical use for exploring new media in the practicum year, particularly when there seemed precious little time to learn and too few schools equipped to handle the domain of study—makes me believe that if philosophical foundations were addressed, they would perhaps lean too far in the direction of utilitarian and pragmatic sentiments. Admittedly, this takes on a rather presumptuous tone, but it is not intended to offend. I merely mean to point out that the fallout from the conflict between those two ancient paradigms of thought has wreaked havoc among practitioners of religion, science, psychology, sociology, politics, and education. Since the repercussions have had tremendous consequences upon the individual and society at large, it is possible that all of us, educators and researchers alike, keep missing the point.

If 'getting the point' has anything to do with philosophical inquiry, it is probably wildly askew to focus on pragmatics, but, for the sake of enlarging my comments, and at the risk of some irony, I will ask a practical question. Do educators know upon which view they base a pedagogical decision? This question relates directly to the desire to commit good teaching to a planned approach and an observable outcome that can be measured to some degree. It may be the case that good teachers understand the media within which they teach, i.e., discourse, lecture, self, environment, student, new media, and so on. Without such knowledge, teachers are unlikely to determine learning outcomes. 
In other words, unless we understand the medium—self (i.e., mind and body) and extensions of self (i.e., technology)—and its effect on content, there is little chance of understanding teaching and learning. That is what McLuhan argued. It is through understanding media that we can begin to understand the content of individuals, societies, art, science, and events. Understanding would do well to begin simply from an absolute. Without a general theory of medium, for instance, we may become lost in a morass of particulars and irrelevant data—in short, relative perspectives. More importantly, we could begin with historical worldviews and determine whether our actions have been dictated by such discourse. To use the computer as analogy (as distinct from the information-processing model of mind), let us first imagine an educator who holds the opinion that the mind or the spirit is divinely hard-wired (i.e., the brain as hardware). Thus, knowledge is already present to one degree or another. Holding a rather bleak Rousseau-like view, this educator would be resigned to pedagogical acts and artifacts that provide nothing more than a garden (i.e., a stimulus) in which novice minds naturally grow. Locke (1998), in fact, was extremely troubled by such a view, for he demanded to know how one can "say a notion is imprinted on the mind, and yet at the same time to say, that the mind is ignorant of it, and never yet took notice of it" (p. 618). Though most of us balk at Locke's view of the mind as tabula rasa, we might still consider his question. How indeed can we say that a child both possesses and does not possess knowledge? Common sense seems to want both statements to be true despite the paradox that they present. Rather than confronting the contradiction, however, many are effectively forced by the paradox to side with one or the other view. 
An educator who labors under Locke's view (i.e., the mind as a blank slate) must feel the heavy burden of responsibility to ensure that very strict learning experiences are met. Conversely, an educator who sees the mind as pre-determined may, upon judging the quality of Nature's handiwork (i.e., the cognitive state), feel the restrictions imposed on his/her teaching by knowing the learner's limits. Sadly, the former would feel overwhelmed by a sense of duty to fill the child's world with information, and the latter would sense the futility of enticing a child beyond what was determined as the child's threshold of understanding. Neither teacher can understand their actions or sentiments if they lack the ground upon which their beliefs have been shaped. A third, more radical view that extends Locke's premise needs to be addressed. In this instance, the educator considers the mind to be 'soft-wired' (i.e., the brain as software) and thereby 'programmable' through appropriate cues. Presently there is an entirely different view of the brain as software within evolutionary psychology, but the early stimulus-response model placed humans on par with trained rats and dogs, and was at the heart of the absurd assertions made by Watson et al. The behaviorist view would make pedagogy seem like a maniacal game. Unfortunately, although the school of behaviorism has long since been thought abandoned, behavior modification programs and extrinsic reward systems are still actively enforced in schools. Even during their course work, pre-service teachers are required to study classroom management skills that are often tantamount to methods taught from a Skinnerian viewpoint. The preceding shows to what extent educators continue to learn unexamined material. After all, behaviorism was largely discredited in the 1950s by competing theories. In all fairness, the majority of students do not find the views presented above particularly appealing. 
For pre-service teachers, in fact, there are many views to consider. If the mind is viewed as software, for instance, suggesting that it has innate potentials ready to adapt to changes in the environment, including physical changes from within—a more contemporary Darwinian perspective—pre-service teachers can find research to support their practice. If, on the other hand, limitations of the mind are imposed by the brain's hardware, there is research available that outlines the limits of learning. The matter that remains in question is whether assumptions or convictions will carry pre-service teachers to further inquiry. Often led by models of instruction, pre-service teachers will adopt a stance similar to that taken by veteran teachers—which usually means ignoring the matter entirely. When pressed, veteran teachers acknowledge that there are disadvantages to thinking in either extreme and, therefore, are reserved about current research findings—more so because researchers often fail to clarify the discourse that influenced their theoretical underpinnings or methodology. Often teachers end up sitting uneasily in the middle of controversies. One can understand the reticence of pre-service teachers, whose limited teaching experience and limited access to discourse and research findings may prohibit them from taking a personal stance. A fence-sitting position does little to advance critical thinking but, admittedly, as the Beatles once sang, "Living is easy with eyes closed." Since generally I have found that many educational matters are left without scrutiny, the reader is entreated to turn to the topic of philosophical inquiry and its lack of importance in teacher education. Reflecting on the words and experiences of others, and with additional support from personal experience, I am persuaded to believe that if we keep applying ourselves practically we will fail to address issues critically. 
Without engaging critical thought, that is to say, the entertaining of abstract ideas, there can be no real understanding. Instead, we would be like the guardians of Socrates' Utopian society, who can do no more than defend the society from enemies but cannot answer with any degree of precision those questions that are deeply epistemological and metaphysical. Without philosophical inquiry, we will continue to ignore, for instance, the very nature of creativity and what is required of teachers to nurture it. Moreover, we will ignore the role that either technology (i.e., language) or medium plays in this matter. The fact that language is ubiquitous and seemingly instantaneous places it in a precariously invisible plane of consciousness. The fact that language, as a medium of expression, carries with it tremendous distrust places it at odds with the senses. The fact that language is the medium from which philosophy arises has placed philosophical inquiry precariously at odds with science. All those things are pedagogical paradoxes, for language (i.e., the didactic) has been placed at odds with 'good' teaching. Malaguzzi (1983) invited this line of thought when he said, "Piaget warned us that a decision must be made about whether to teach schemes and structures directly or to present the child with rich problem-solving situations in which the active child learns from them in the course of exploration. The objective of education is to increase possibilities for the child to invent and discover. Words should not be used as a shortcut to knowledge" (p. 77). In scientific terms, Bertrand Russell and Gottlob Frege, in effect, were in search of a perfect logic, linguistic and mathematical, that would render science with greater clarity, for neither science nor philosophy (dare I say human beings) can exist without language. 
Sadly, however, metaphysics has taken the harshest blow in education, for it ventures into areas of knowledge that appear impractical and far-fetched, i.e., mysticism and fancy. Scorned and derided for its cryptic means, metaphysics has often been on thin ice with the logical positivist and the pragmatist. In the words of William James (1998), "To attain perfect clearness in our thoughts of an object, then, we need only consider what conceivable effects of a practical kind the object may involve—what sensations we are to expect from it, and what reactions we must prepare" (p. 1071). To be a rational being, it would seem, is a muddied concept. Whether or not the practical thinker is a rational being, and whether the metaphysician is less than practical, in the end, I believe that James took a wrong turn in logic. All such ponderings neglect even the words of Hume (1998), who maintained that a philosopher "must be first a (hu)man." Ultimately, human beings are all rational to varying degrees, insofar as we are capable of thought. The issue, therefore, rests on whether we are afforded greater clarity by thinking practically (i.e., concretely) or by thinking in absolutes (i.e., abstractly). It would seem that if we are only led to tackle a particular problem, the result will have limited, functional application, whereas thinking that operates in both practical and general terms will enable us to apply solutions to an infinite range of problems. Unwittingly (and quite stubbornly), the pragmatist denies the power of abstractions because of a great impatience with words—and words make up the full measure of metaphysical content, i.e., fanciful, impractical illusions. If words were in their 'perfect' Adamic form (as many might hope), then perhaps the mind of God would be revealed. As it stands, we are all 'victims' of the folly of humans who dared build a Tower to Heaven—we are 'lost' in our babble (Eco, 1998). And in the pragmatist's desire to find a tangible solution, words will be circumvented. 
The whole of semiotics, if pragmatism rules its tenets, could be in the same predicament that linguistics was floundering in before the arrival of Chomsky: everyone picking away at language in more and more particular ways, looking for the 'secret' that lies behind, beneath, and between those horribly irritating thoughts of philosophers of language. It is no wonder that James is given such respect, for he suggested that a "pragmatist turns his back resolutely and once for all upon a lot of inveterate habits dear to professional philosophers. He turns away from abstraction and insufficiency, from verbal solutions, from bad a priori reasons, from fixed principles, closed systems, and pretended absolutes and origins. He turns towards concreteness and adequacy, towards facts, towards action and towards power" (p. 1072). I recognize that this statement was in passionate response to Descartes' error, but the one thing I would want to pose to William James, were he still alive, is whether or not "bad a priori reasons" are more detestable than no a priori reasons. As it turns out, James is wrong on more than one count. Words, in effect, are significantly tied to our cognitive development, to our thinking critically, and to finding solutions. But so far, investigators of semiotics—from the cognitive sciences to linguistics—have not fully assimilated this widely accepted, foregone conclusion. What James, and others, cannot reconcile is a troublesome paradox: one that was correctly identified and understood to mean, "the medium is the message" (McLuhan, 1963). However, having said as much, the paradox is not rooted in the medium of language, per se, for those who grapple with language and cognition are on side with Wittgenstein (1998) when he astutely remarked, "What confuses us is the uniform appearance of words when we hear them spoken or meet them in script and print. For their application is not presented to us so clearly" (p. 1158, 11). 
The paradox is the medium, and on this topic James did not understand media at all—nor did most philosophers after him. James did not consider that the content of any medium is always another medium. Perhaps, had he been alive to read McLuhan, he might have awakened from his slumber, as Hume had awakened Kant from his.

In the long run, pragmatism exists because experience is given precedence over rational discourse and, as such, has placed the concrete above the abstract—at least from a popular perspective. A decidedly Aristotelian view, pragmatism is the descendant of a belief that knowledge can only be derived empirically, that is to say, where experience is key to understanding. Thus, knowledge that is 'conceived' in the 'mind' is tantamount to conjecture. That view, of course, stands in direct defiance of the debacle Descartes started and the measures that Kant initiated to put things in order. No one, for the longest time, seemed prepared to acknowledge that concept and percept are necessary cognitive allies. Ironically, William James (1998, p. 1072), like Charles Peirce—considered to be the father of pragmatism—came to the conclusion a priori that "useless questions and metaphysical abstractions" were a waste of time. To be fair, there is reasonable justification for this irony to occur: a paradox is not so easily identified when we are being swept up or ripped apart by a tornado of beliefs. And when the dust settles, and it becomes apparent that a paradox exists, two things may happen: either a person is led to understand a phenomenon with greater perspicuity, or the paradox is rejected. In the case of the paradox between percept and concept, the issue comes down to the chicken-and-egg dilemma. As it turns out, we have enough evidence to suggest that whichever came first, the egg could only be hatched through nurture, whether by another chicken or a chicken-like medium. Paradox is not normally tolerated, however, because of its irrational nature. 
The exception is with comedy, of course, where absurdity has gained some leverage. Aside from the comic genius of the Monty Python members and Steve Martin, philosophy is not usually a comic outlet. That was especially true of James. He had hoped to turn away from abstraction, not poke fun at its precarious position in human beings. Ridding himself of that paradox, however, would not solve the knotty problem he faced. In fact, on a general plane but at the risk of talking myself into a Gödelian loop, facing paradox leads to abstracting a solution and, in turn, implementing a particular action. Leibniz (1998) once declared, "Necessary truths...are founded upon the principle of contradiction" (p. 588). Although that statement is difficult to comprehend, any good storyteller and artist would agree with Leibniz. The contour of any story, whether it is a narrative, a melody, or merely an image, must contain conflict or opposition. The conflict exists between the archetypes—protagonist and antagonist—in other words, between contrasting elements. Without contradictory forces, there would be no fullness of story, melody, image, and so forth (at the very least, none that would provoke the viewer and listener). The fullness of truth, therefore, is founded upon the very essence of paradox, that is to say, forces of contradiction, for a story or even a character fashioned without contrast would remain flat and one-dimensional. The result might be a caricature of the most cartoon-like features. Still, 'art' is actually present in the intentional caricature or cartoon, for the paradox is embedded in the medium of 'childish' lines, which grossly exaggerate (as children's art tends to do) the adult features they so desperately wish to hide. 
Anyone who remembers Dickens' poetic opening and closing from his novel, A Tale of Two Cities—a work very distinct from his others—will remember that the story begins and ends as a tale of irreconcilable differences of the most conflicted of times. The story of humanity, its boundless character and endless intrigue, as it is unveiled and examined by intellectuals, could be compared to an artful endeavor, such as a novel, for what is demanded of art is that it reveal the tension between opposites—or else it will bear an unmistakable flaw. When art, narrative or otherwise, bears no tension, it is often judged to be insipid—or fanatic in its extreme position of depicting only one 'truthful' side. If art is truth, as Heidegger (1971) once declared, it is true because of the existence of paradox and, likewise, if knowledge is truth, it too bears this nature of conflict. The artist and scientist, if they seek truth, must each possess a degree of artistry and clarity, for it is only by artistry that paradox may be revealed and only by clarity that it may be understood. If we but take theory and praxis as an example of a pedagogical paradox, we can readily see that to express the fullest of knowledge, both artistry and clarity must be engaged, but the coupling of these contradictory natures requires ingenious means. Of course, the medium that is pedagogy is itself the content of the medium that is the individual and, in turn, is the content of theory and praxis, which is the medium of knowledge, and so on. As it stands, knowledge must be subject to rigorous criticism; moreover, if possible, it should be subject to experimentation to enable one to judge its claims. However, science cannot fulfill the human faculty without artistry, for in both art and science we find the very thing that is human. Herein lies another paradox: without clarity (as sought by logical positivists) and artistry, the fullness of truth would be contained in neither science, philosophy, art, nor pedagogy. 
Aside from the fate it may suffer from being judged innocuous, trite, or banal, clarity without artistry is dangerously utilitarian minus the human spirit, while artistry without clarity is catharsis minus comprehension. Neither is representative of the fullness of humanity—both will suffer the fate of becoming nothing more than a trifle in the books of history that sit collecting dust on the shelves of the library. Knowledge, with its perplexing character of being both medium and message, however, is not an easy topic to tackle. After Descartes, there have been many who have hoped to renounce the duality of the two historic paradigms: Rationalism and Realism. Kant, for one, tried to quell the dispute over the division of mind and body, for he reasoned that such dualities as subject/object, reason/sense, or percept/concept were doomed to remain irreconcilable without greater clarity in conceptual definition—though this, in itself, became an incongruity in his argument. Ultimately, Kant questioned how a person as subject describes the world as object without taking note that this arises necessarily from both a sense and an idea of its nature. One is led to question whether object can exist without subject, reason without sense, or concept without percept. Alone and one-sided, none satisfies the full measure of epistemological truth. Indeed, Kant's criticism of Cartesians led him to the proposition that neither conception nor perception stands independent of the other, asserting an emptiness in the purely conceptual and a blindness in the purely perceptual—though I would add not necessarily 'blindness' but definitely speechlessness. Thus, it was Kant who began to address the idea that the way to knowledge is both perceptual and conceptual, with the content of either being nominal when separated; or, in other words, the fullness of human knowledge, qua truth, is possible only if concept and percept are coupled. 
As one proposition is made, however, another immediately rises to bear yet another paradox. Those layers of paradox reside within the layers of media. According to McLuhan (1988), characteristic of all media is the interdependence between figure, conceivably the message, and ground or space, conceivably the medium. When we focus on the space or ground (medium), the figure secures a different view. Yet, when we focus on the figure (message), the ground or space may simply vanish. This law of media explains, in part, why philosophy as medium is ignored—it is as transparent as the words it utilizes for expression—media within media. Effectively, when any medium is carefully attended, messages are interpreted with greater perspicuity, though, sadly, not the other way around. Being of French origins, I will use a familiar example that illumines this condition—artfully portrayed by Peter Mayle (1989). In his highly successful book, A Year in Provence, Mayle explained how the French propensity for "speaking with their hands" furnished him with greater insight than the mere words would have allowed. A slight gesture of the hand as it rocked, palm down, level to the ground, spoke volumes. Understanding the primacy the French place on "speaking with their hands," as well as its integral relationship with facial expression, went a long way toward bringing clarity (and humour) to a troublesome situation. The gesture, in this instance the medium, was a concrete ground from which Mayle could begin to understand the figure (i.e., the words that were spoken), for had he allowed the gesture to remain transparent, the words would have been greatly misleading. Hence, like the image of the faces and vase, or of the young and old lady, when we become aware of the medium or ground (we are told the image has two figures), the figures soon enough appear; but when we examine one or another figure, the ground easily vanishes from our consciousness. 
McLuhan presented the view that no matter how carefully content is scrutinized, without understanding the medium it will remain incomprehensible and, to some degree, this corresponds with Frege's contextual sense. Yet, even if the path to knowledge is generally acknowledged as having shaped the content, almost everyone will fall prey to the transparency of the medium. Inviting us to entertain the notion that, conceptually, content is merely another medium, McLuhan suggested that meaning arises from the exploration of the layers of grounds and figures, or networks of media and messages. To accomplish such a feat, we must fly into the eye of the storm and avoid being swept into the melee. If we do not find this vantage point from which we can begin to ponder for ourselves what is real, or what is the nature of knowledge, or to criticize the works of those who questioned likewise, we will unwittingly end up staring squarely at reality and knowledge through the archetypal lens of Plato or Aristotle—defending one extreme that may or may not satisfy our quest. As a matter of fact, the two ancient archetypes stand as a testament to the law of paradox, which is vibrantly illustrated in the paradox of medium and message: each view, if considered primary, has given birth to layers and layers of paradoxes, all with figures and grounds more or less transparent. Mind and body, percept and concept, abstract and concrete, theory and praxis, particular and general, subject and object, chaos and order, being and nothingness, the individual and society—the list is virtually endless as it spans historical discourse. Although many historical figures have shaped their works or argued their point from a dualistic position, splitting the contrary poles, as it were, an exceptional few—often termed geniuses or creative geniuses—have taken paradoxes to task. This is not stated innocently, for it will lead the reader, eventually, to the main problem. 
However, the question I ask my students, as we gaze on the painting, is whether or not both views can be reasonably entertained at any given time. I believe the answer is yes and no. Metaphorically speaking, like the faces and vase, when we understand the media, there is a moment where the two, medium and message, intersect before one must eventually override the other. So, yes, we can hold a paradox in our mind, but it will succumb to the transparency of the medium—this being the paradox of media. To carry the metaphor one step further, however, I believe that it is at that flicker in time, the moment when paradoxes intersect, wherein the revelation of truth, great and small, sweeps through an individual or all of humanity. Provided that we begin to understand media, we may alternately begin to understand any message with greater clarity. For instance, in the case of knowledge as medium, by understanding the source from which all views have stemmed, we can gingerly avoid being caught in extreme positions. The character of overriding one view in favor of another, defined as prescinding in nature, may be described in linguistic terms as the manner in which predicating the blackness of ink cancels out the predicate of its liquidity (Eco, 1997). In the end, however, it is possible to speak of both the blackness and the liquidity of ink as complex realities. In this instance, abstracting qualities of ink is easily reconciled with the senses because ink is tangible. Yet there are physically indiscernible phenomena that are not so easily handled by toggling between the senses and the abstract. The phenomenon of light, for instance, whose unpredictability is reconciled to some degree by Heisenberg's Uncertainty Principle, is frustrated by its 'insubstantial' nature. 
Abstractly speaking, the principle means that an 'observation' will necessarily collapse as another view is taken; that is to say, either the momentum or the position of a light quantum can be observed precisely, but not both—any 'fixed' point will be solely and subjectively determined. Though light quanta can opt to be in more than one place at once, physically, we cannot determine where one 'particle' goes once we fix our senses. It is only when we engage the imagination, either through metaphysical inquiry or the arts, that the possibility of alternate realities appears plausible. To avoid falling into the trap of improbable extremes, reality or truth requires a 'back and forth' approach to observation—a toggling between art and science, theory and praxis, subject and object, medium and message, and so on. Of course, proof of parallel universes may in the future replace Heisenberg's theory or the need to 'toggle' at all. At present, however, insofar as 'mysterious' phenomena that require abstractions are concerned, a Janus-like outlook is all that we can muster to understand all the paradoxical dimensions. Science and art are also paradoxically paired and not easily reconciled. Moreover, as separate media they each carry layers of paradox equally difficult to reconcile. The tension found in every opposing layer, whether it is media-media or medium-message, is nonetheless vitally important to educators. Since my students will soon share an institutional mentality that trembles with fear whenever conflict arises, I encourage them to consider the alternative. First, creating a world free from contradiction ought to be an objectionable matter to the pluralist who, like the artist or scientist, must embrace paradox. Opposition is the emergence of truth, as the story of democracy's struggle shows and, for this reason, we find the principle of opposition standing starkly as the vanguard of a democratic society. 
Second, science without art could no more exist than art without science. The educator, with knowledge of both, is in a remarkable position of strength. Though the science of any single medium is not sufficient to master all media, were scientific principles and processes apprehended, clarity would abound as media were discerned. And though the artistry of any one art is not sufficient to master all of the arts, media and messages, it is enough to have journeyed deeply in one art to call upon artistry. As members of an institution that purports to be democratic, which cherishes both the individual and society, we feel the instability that inevitably arises from the contradiction between art and science, or the distinctiveness between clarity and artistry, or between science and philosophy. The acceptance of paradox, however, may be what Nelson Goodman (1998) called democracy's "unfinished business." No matter what the conflict is about, however, absurdity is not easily embraced except in jest and, being akin to the irrational, paradox is most erroneously brushed aside in favor of doing battle with something far more palpable and far less taxing to the mind. Reality, after all, depends upon 'seriousness' rather than the comic absurd—or does it?

The General Problem: A Conundrum

The central concern of this thesis, being the relationship of creativity and technology, is riddled with absurdities. The outcome is a multitude of paradoxes. Taken apart, each phenomenon suffers equally from internal contradictions. For instance, when I query my students as to the worth of computer technologies in the classroom, the response is invariably mixed. The same is true when students try to address creativity. Put the two together and the task of determining their relationship appears to be an exercise in cracking a code inside a code, inside yet another code. Insofar as creativity is concerned, there seems to be little about it that is 'rational.'
Creativity entails thinking that is both convergent—the only possible solution—and divergent—any number of possible solutions (Guilford, 1950). It is both the deconstruction of a problem (analysis) and the building of a solution (synthesis). It is considered derivative of nature (i.e., imitative of the hand of God) but originally arrived at ex nihilo (i.e., out of nothing). It manifests itself in children and adults, in laypersons and experts, in average wit and genius. It exists as an ideology of individualism wherein autonomy, authorship and self-actualization are held in greatest esteem and highly rewarded, yet it notably resides among traditional societies that downplay individual contributions in favor of elevating group heritage. It is at once a process, but bears no expression or meaning until it is a product. It is recognized in its many products, from 'ideas' to inventions, but, having been observed to be uniquely human, it is unpredictable. Undefined, perplexing and mysterious, it is nonetheless something educators are sworn to nurture. It shifts with public sentiment and inner perceptions, and it seems to embrace everything from the sublime to the kitsch. Creativity is, without doubt, as farcical as any writings produced by Voltaire.

Although technology is judged to be the product of a creative cast of individuals, defining it proves to be as contradictory in nature as defining creativity—not surprising, since technology is the creative outcome of human thought. In effect, technology embodies a concert of paradoxes that seem particularly obnoxious and incomprehensible to most, albeit ones that were clearly outlined by McLuhan (McLuhan & McLuhan, 1988). First, technology renders obsolete what it had once hoped to ameliorate.
A technology such as the Western alphabet, for instance, created for the purpose of enhancing the memory, allowing for private authorship, disseminating the dialectic, and abstracting from the concrete, simultaneously obsolesces perception in the form of concrete, spatial intuition. Second, new technologies retrieve something from the abandoned past. In the case of written language, it could be an ancient tongue no longer in common use or a shift of power and wisdom from the common mass to the elite. That technology extends human actions and products, as well as forces things to obsolesce, is not much of a surprise; yet few suspect that technology awakens the long forgotten past—a statement boldly made by McLuhan that will need further clarification.

Before proceeding, however, germane to this thesis is the notion of intuition. Intuition plays a significant role for those who have ventured an explanation of creative processes. Einstein and Poincaré, for instance, described intuition as the means for arriving at their theories when all other logical explanations failed. As elusive in nature as any concept, intuition may be described as the mind, abstract and concrete, communicating with itself. In short, the portion of 'mind' that interprets and stores the sensory argues with the portion of 'mind' that collapses images into schema (Miller, 1996). Like the individual whose brain has been severed along the corpus callosum, the logic wrought by linguistic means will be 'intuitively' slapped away by the visual-acoustic, visual-tactile or acoustic-tactile 'hand' of reason. Intuition gives way when abstract logic leans upon concrete knowledge for clarity and vice versa. In other words, intuition is felt as an epiphany, a moment when both the abstract and the concrete converge to illuminate truth. The most surprising feature of technological invention, to return to the problem at hand, is that the past is unwittingly and dramatically retrieved.
As far as creativity is concerned, the very notion that the past is retrieved flies in the face of reason, which often demands that creative achievement be something other than derivative—though many would argue this is never conceivably the case in artistic creation. More to the point, that the past factors into innovations in technology appears incongruent with notions regarding technological advancement. After all, technological artifacts of the past are long held to have been obliterated by modernity and, thus, no longer relevant. Notwithstanding, when suddenly placed into an extraordinary world of technological magnitude, as heroes can be in the wake of an inciting incident, there is a striking lack of awareness that the past has been made present. This is the unusual and fantastic assumption proposed in the Laws of media: The new science by McLuhan and McLuhan (1988), who claimed accordingly that the past has been retrieved in all present-day artifacts.

To clarify the idea of artifact: insofar as every artifact is a human creation, whether an idea or a tangible product, an artifact is conceivably a technology. Every technology, hence, is a medium qua artifact, and every medium necessarily arises out of yet another artifact qua medium. To McLuhan, the medium that is created (referred to interchangeably as either artifact or technology) follows a general tetradic law of media. For instance, technology enhances or extends the mind, eye, ear and limb; it retrieves something from the past; reverses into a paradox—the tension between figure and ground; and obsolesces the figure that was once enhanced, returning it to a transparent ground. Figure and ground essentially dance with each other: as figure emerges from ground, the figure will eventually be 'ground' as another is born. To clarify further McLuhan's line of thinking, a figure is an abstraction that is retrieved from the concrete ground.
Hence, technological innovations begin as abstractions (figures) retrieved from the concrete (ground), such as speech from gesture. Notwithstanding, the figure (speech) reverses into ground as another figure (written word) emerges. McLuhan claimed that the tetradic laws—enhance, retrieve, reverse into and obsolesce—represent a simultaneous occurrence, contrary to the linearity of Hegel's dialectical triad, viz., thesis, antithesis, and synthesis. Whether Hegel's triad is actually linear or not, and whether events fit a tetrad or not, every new incident is the start of a new 'book' in the series—always in concert with the introduction of a new medium—which results in a new adventure and, invariably, an evolution of events.

The change in consciousness that is brought about by new media is felt as social content, which becomes the focus of psychological, sociological, and anthropological study. Content alone, however, insufficiently addresses the cataclysmic change that is certain to befall any society. It is one thing to focus on the figure, or the analysis of a figure (i.e., the injustice of a governing policy), but it is another to step away from merely addressing the content to addressing the interlocking events that make up the whole. It is different because, once again, the content of any medium is always the outcome of another medium (McLuhan, 1963). Moreover, the whole picture necessarily includes the change in consciousness that is the outcome of the introduction of any new medium. Ignoring such a perspective, individuals often lose sight of the big picture—from the inception of the medium to the irrevocable change wrought by its ubiquitous presence. To the seasoned storyteller, however, establishing the big picture is requisite to any good story, for all good stories deal with the changes that come about when an inciting incident hurtles the hero from an ordinary world into the unknown.
Unlike the medium of literary text, whereby metaphors can be veiled in poetic form, filmmakers can graphically and concretely explore the clash between the boundaries of the traditional and the novel, with the imposing new medium (i.e., technology or idea) being made obvious to the audience. Pleasantville, a film written and directed by Gary Ross (1998), is the story of the change that overcomes a fictitious community living inside the 'black and white' world of television in the 1950s. When two teenagers from the nineties inadvertently enter their television set and land in the middle of the small town, bringing with them the phenomenon of color, the townspeople experience a cascade of changes. In this instance, Ross imagined that 'color,' both exhilaratingly new and ultimately 'dangerous' to a way of thinking, symbolized new ideas that would eventually shake loose the entrenched beliefs that gripped the town with prejudices and complacency. The introduction of color in a colorless society, cleverly revealed through special effects, was the medium by which consciousness shifted, and its role was not at all transparent to the audience.

From a more historical viewpoint, Once upon a time in China I-VII, a series of films written by Hark Tsui (1991-1997), has brilliantly captured the shocking results of the sudden, brutal, and 'mechanized' invasion of the East through Western innovations. With the introduction of clocks, cameras, pharmaceuticals, guns, automobiles, Western clothing and cutlery, the Chinese were suddenly and jarringly thrust into a modern world. As an attempt to civilize a predominantly oral society that had lived practically unchanged for thousands of years, Western civilization, with its accompanying artifacts, was met with incredulity and distrust. There is a comical co-existence that continues today in the juxtaposition of ancient Chinese culture with the modernity of the West, e.g., rickshaw and automobile.
With the advent of the Gutenberg press on Western civilization, McLuhan (1963) posited that the eye could no longer perceive in concert with the ear or the hand as it had in an oral society. In this change of universal perception, whereby the eyes observed the world through abstract and linear figures (i.e., the alphabet), perception's relationship to acoustic, visual and tactile space (concrete imagery) would simply disappear to accommodate the new medium of order and logic. In so doing, the social fabric was simultaneously decentralized and multiplied by the hundreds, for it created implosions of individual seams, producing patterns of individuals bound together by threads of culture like a patchwork quilt. But all of this changed when the ear and the eye were given new media, such as radio, TV and computer technology, for when acoustic, visual and tactile space was reintroduced into society (as if it had ever truly left), the eye and ear became partners—interdependent as they once were in pre-literate times. McLuhan suspected that the change in consciousness would be felt as a systematic re-shifting toward a central order.

Historically, the clash of two contrasting media, i.e., print and radio, inevitably caused a moment of truth for literate Western society. Witnessed as the 'revolution' of popular arts and the 'dethroning' of authoritative figures and organizations ruled by logic or science, it was the beginning of the end of one consciousness giving way to another. Since the recent terror unleashed on America on September 11th, 2001, we have been given an opportunity to tangibly witness the great divide that arises from the clash of technologies. One society, driven by tribal ways and illiteracy, gathers in numbers around the radio, their perception forever distanced from their literate confreres, whose access to TV and print enables alternate realities.
Even within American society, the reactions to these fantastic events are dependent on the medium of truth—TV, radio and print—each creating a different conceptual reality (save for those who experienced the event first hand). The fear that prevailed in the days of radio technology, as it began its journey into widespread broadcast (not just point-to-point communication), was that it would one day replace the medium of print. It is this fear of 'displacement' or obsolescence that has gripped every generation whenever a new medium is introduced. McLuhan showed that the fear was fatuous because, whether old media are doomed to be supplanted or not, the obsolescence of the medium itself is not the issue that plagues us most. It is the shifting of consciousness that is at risk for, in truth, this is more insidious in the way it is utterly ignored. Though Huxley and Orwell infused a healthy reproach toward technology into the American psyche, both authors supposed that technology's role in a totalitarian society would be in the hands of external agents, i.e., evil persons or empires. Neither conjectured, as McLuhan (1963) had, that technology would serve as its own entity to create change from the inside out. Though many modern filmmakers have tried to address the transformation occasioned by technology, only a few (i.e., Ridley Scott's (1982) Blade Runner and David Cronenberg's (1999) eXistenZ) have succeeded in capturing the medium without becoming overly focussed on the content, i.e., agendas that depict the forces of evil as socio-political and not metaphysical.

It was modern technology's effect on consciousness that transformed the patchwork quilt marking the decentralized societies originally sewn from print technology. In creating cultural diversity, referred to as multiculturalism, cultural 'identities' were contained, classified and divided.
Nonetheless, the introduction of new visual-acoustic-tactile media has unravelled the patchwork seams, so that the boundaries are lost, the classes are blurred and order is randomly dispersed. Western civilization is now a world not unlike pre-literate societies, where life is governed not by abstract logic but by common sense. As one consciousness has been awakened, another has gone to sleep, for the hotly engaging medium of print, the "neutral visual world of lineal organization" (p. 93), has been displaced by cool media which render a virtual reality as concretely dissociated from logic as the perceptual impressions that had ruled the outlook of the non-literate mind (McLuhan, 1963). If a central control re-emerges, it will be because society has been primed to receive it through its own inventions. Those are serious allegations and, if true, could be calamitous for educators, for their struggle to maintain the stature of print in order to fulfill their pedagogical creed may be in vain if they unwittingly succumb to the figural effects of modern technology.

Even if McLuhan had wildly exaggerated the characteristics of technology, however, it is certain that changes in cognition must occur. But relatively few have critically considered the impact of technology from the point of view of cognitive development. This is not very unusual, for fewer still consider the relationship that exists between cognition and language. Despite the importance of this latter relationship, educators wholly ignore all such matters. Not even to those who study semiotics has it become an essential matter, for, though their linguistic inquiry has led them to make deeper links with cognitive scientists, they are most often looking at the issue from the perspective of the content or message. To understand the relationship between language and cognition, one has to eventually come to terms with the two media—not, as many suppose, the messages within messages.
From McLuhan's theoretical view of media, here is an unpalatable paradox for individuals whose beliefs were constructed from the literacy of print technology, and whose sole conscious engagement with any medium has been with content and meaning—the absurdity, of course, is that experts do not know that they do not know. It is only right, therefore, despite very strong evidence to back McLuhan's proposition, that the laws of media should be met with great incredulity. To prove this point, one student remarked, having only 'heard' of the theory, "I'm a product of the 70's; I grew up glued to TV, and I can't say that I've reverted to any 'primitive,' illiterate way of thinking." That kind of non-critical statement usually succeeds in elevating the conversation to the level of opinion where mud slinging is de rigueur, risking an emotional fallout and absolute division. All that because the cool medium of verbal language predominantly engages concrete impressions, in contrast to the hot medium of print that engages the conceptual, abstract mind. In the end, with great irony, one medium usually collapses the possible realities offered by another. It is not enough to 'hear' of the laws of media; it is only from deepened scrutiny that they can be judged.

McLuhan's ideas are more than just the academic ravings of a lunatic obsessed with understanding media. There is considerable evidence drawn from the combined fields of the cognitive sciences and semiotics to support his claims. It is not surprising, however, that the evidence has not been properly matched to fit his theory, for in a flurry of cognitive investigations (i.e., neurological and semiotic), researchers have been preoccupied with practical considerations—the need to find a solution in the face of dramatic changes to geo-political forces, i.e., enemies and terrorist forces.
Philosophy of mind, arising out of the roots of ancient Greece and reinstated with passion by Descartes, is, of course, greatly interesting to both the cognitive scientist and the semiotician. But metaphysics, being of the character of the study of mind, is not sufficiently scientific—at least not to those whose trust is placed in praxis. As one student astutely observed, the remarkable irony is that we have supplanted philosophy with science only to find ourselves having no means but philosophy to understand indiscernible phenomena familiar to cosmologists and mathematicians. From philosophical musings, we have succeeded in creating a fantastic new world, although literature and the film arts have certainly played a role in turning virtual ideas into scientific realities or, conversely, realities into 'art.' Even Hume (1998) recognized that the merit of the artist lies in the fact that he "must be better qualified to succeed in this undertaking, who, besides a delicate taste and quick apprehension, possesses an accurate knowledge of the internal fabric, the operations of the understanding, the workings of the passions, and the various species of sentiment which discriminate vice and virtue" (p. 674).

However, it is neither philosophy nor the arts to which we have turned for increased understanding of the mind. It is psychology that has dominated our sphere of knowledge on those matters. And, born during a time of scientific positivism, much of psychology's research is couched in mechanistic terms. Today, even those who balk at mechanism frequently explain in mechanistic terms any construed relationship between mind and body, or between language and cognition. There have been and continue to be exceptions, of course. For instance, a remarkably forward thinker, Lev Vygotsky began near the start of the 20th century to look for relationships between language and cognition that moved beyond a mechanistic explanation.
His inclination toward an organismic view was later refined by his student A.R. Luria, who became a respected neurologist, and contemporary scientists have notably recognized Vygotsky's contribution toward understanding the neurological implications of linguistic development and function. Although educators regard his psychological theories with some importance, many of his ideas in this scope remain controversial due to their Marxist underpinnings. Nonetheless, his general principles were of significant importance to neurological findings.

With similar interest, Piaget had preceded Vygotsky in showing that maturity affects cognition and that there exist identifiable stages of cognitive development. As with Vygotsky, Piaget's ideas have greatly contributed to our understanding of cognition, in particular in the field of education. Similarly, his ideas have been both refined and overlooked among educators. Notwithstanding, Piaget also contributed to the notion that a medium affects the way a person thinks.

We find a convergence of ideas among those individuals who have sought to understand cognition as an organic medium, and the manner in which it is affected by the creation of human artifacts, i.e., technology as an extension of mind and body. There are quite a number of players who have unwittingly collaborated to bring clarity to the matter of creativity and, at the same time, have permitted us to see its relationship with technology. Although individuals such as Vygotsky, Luria, Piaget, and McLuhan, to name a few, were neither in search of a definition of creativity nor of the relationship between creativity and technology, each of them has contributed in ways that have been entirely overlooked by communities of researchers. Whatever factors have been and will continue to be scrutinized (the list being infinite in scope), creativity is above all a mindful act that is manifested physically.
Technology is an extension of the mind and corresponding bodily actions, enacted as a creative response to human ideas (i.e., abstract imaginations). The perspective I have taken, therefore, is to address those matters through the cognitive lens and, as such, in this thesis I will attempt to demonstrate that creativity is necessarily cognitive. Without recognizing the implications of cognition, we are likely to commit the same error as the many educators who are fixated on perceptual abilities, i.e., visual, auditory, tactile, and kinetic. Though anomalies of any one of the perceptual modes will cause conceptual difficulties, it is not anomalous modes that frustrate our understanding—after all, we can readily understand that blindness and deafness will affect visual and oral imagery. Rather, the misconception lies with the many music educators who, like Murray Schafer (1986), author of The thinking ear, do not account for the fact that, no matter what we may conclude of the sensory modes, the ear does not think. That, in my opinion, is a critical point that must be thoroughly digested, for any other erroneous belief obfuscates the issue of mind and body.

Research on Creativity: Focusing The Research Questions

Having lived the life of a performance artist, heavily reliant on multi-media, the nature of creativity ultimately engaged my curiosity, but the instruction of music pedagogy, as it were, pushed me to greater depths of personal inquiry. As McLuhan (1963) once expressed so well, "the best way to get to the core of a form is to study its effect in some unfamiliar setting" (p. 257). There was nothing more unsettling than shifting from performer to pedagogue (despite the similarities). Though the key that unlocks creativity had always captivated me, it was in becoming an arts educator that carved out the necessity to understand the relationship between the medium, the individual or process, and the product.
Understanding the medium, whether a classroom filled with activity centers, the teacher, an instrument or a technology, became a heightened focus when pedagogy became the subject. Bearing in mind that the content of any medium is always another medium, there were potentially many layers of intriguing phenomena to peel away. From simple curiosity, I was led with increasing momentum toward the investigation of creativity as it became apparent that it deeply affects current educational concerns on multiple levels. My inquiry naturally developed along two objectives: one related to theory and the other to praxis.

Insofar as a theoretical understanding of creativity was concerned, I searched for a light into its dark nature, knowing full well that as our ship of learning sailed, and as I modelled expectations for creative processes and outcomes, our understanding rested without an anchor on the surface of a stormy sea of knowledge. I wanted to know whether I was peering into a cultural ideal that embraces individualism and self-actualization, ostensibly one that stems from the American Humanist movement, or whether I was gazing into the mind's abyss. Creativity loomed over us as we gingerly side-stepped 'cookie-cutter' activities, yet none of us could wrestle with its 'substance,' a factor that became the impetus to understand its essential nature—knowledge that would be instrumental in explaining the mystery that lay beneath the compositional capabilities of children without formal music training. As an educator, I wanted to know what part, if any, a medium played toward nurturing creative outcomes, for when we question whether or not creativity can be nurtured, we enter into an epistemological conundrum that demands a solution. As to praxis, I hoped to resolve the perplexing matter regarding the assessment and evaluation of creativity in the classroom.
It was a matter which troubled me all the more once I began to instruct pre-service teachers, for I lacked an intelligible definition that could legitimately aid them in pinpointing useful tests and measures. I had years of experience and a storehouse of ideas in my favor, but in this new environment, I felt an obligation to move beyond euphemisms or, worse, becoming yet another 'curriculum source.' Being an inventor of sorts, I had a need to help them find that staff which would guide their own journey. Moreover, I wanted to demonstrate that there was a profound relationship between technology as medium and creativity as cognition, as I had witnessed for many years, despite the students' incredulity. The problem was in defining that relationship, and that could not be answered until creativity had some definite form. Since all such issues raised serious debate among education students, in class and in reflections, my objective became a desire to get to the bottom of it all. To do that, I had to follow the thousands of trails that had been left behind in order to comb for one significant clue.

I began, hence, to examine an enormous body of knowledge, all of which had emerged from divergent paths of inquiry. One path was historical-cultural and its content was shaped by hermeneutics with a "focus on internal relations of significance and meaning." Another was philosophical and its content was shaped by conceptual, rational discourse. Still another path was empirical and its content was shaped by "conjectures, hypotheses, predictions, evaluations and interpretations." Though none of the pathways led directly to new insights, each produced a constant change of ideas that resembled "frogs on a wheelbarrow: during the course of being wheeled along the path they leap in all directions" (Peeters, 1996, p. 201). In general, I was fascinated by the divergent paths of investigation and was curiously drawn to each perspective.
The domain of creativity alone, ostensibly dating back to Plato, was more or less rooted in the rainbow of paradigms that swept the 19th and 20th centuries, from Romanticism to Positivism to Postmodernism and beyond. Though there have been a few individuals, such as Charles Darwin (1974), social historian Arthur Koestler (1964), and physicist David Bohm (1998), who have tackled the subject of creativity from unusual perspectives, principally it is psychology that has dominated the inquiry. From within the diverse branches of psychology, therefore, the creative viewpoint could range from psychoanalytical, behaviorist, and Gestalt theories to neurological developments. The result of an intense psychological scrutiny produced hundreds of psychometric tools and measures. Historical studies, known as historiometrics, wherein psychologists scour biographies and personal accounts of famous persons, from the past or present, from one domain to another, measuring and contrasting backgrounds or intellects, and scrutinizing famous works or processes, have also contributed to the field. There are even more than a few psychologists who have been working on historical correlations that hope to determine a numerical link to the phenomenon—a sort of 'numerology' of creativity (Simonton, 2001). Also, from the social sciences and their fusion with biology or evolution, studies have traversed the range from socio-evolution to cultural anthropology. The possible links between disciplines are as numerous as the interests of researchers. Finally, there are studies that border on science fiction, such as Artificial Intelligence, where A.I. researchers such as Margaret Boden (1990) and Avron Barr (Rheingold, 1985) continue to search for ways to build a machine with problem-solving and creative capabilities. There are a few, like physician and historian Phillip Sandblom (1982, 1995), who have even revisited the Freudian notion of creativity and madness.
Initially, the search for journal articles and books was unbearably daunting. A search of the UBC library catalogue on the term 'creativity' produced 1,000 titles, while a search of commercial book dealers on the Internet produced 680,000 titles. And the entry of merely the keyword creativity on ERIC (Educational Resource Information Center database) produced 25,348 journal articles. Eventually, after an exhaustive search of what I perceived as the most salient books and journal publications on creativity, my query was left unsatisfied. Most studies sought to uncover a variable that bore some kind of relationship with creativity. With the hope that it would shed light on its enigmatic character, a large number of factors, from intelligence, personality, and behavior to birth order, family, and social background, were correlated. The infinite list of possible factors, however, seemed to lead in circles and, worse, the studies offered data that was too quickly accepted by educators.

In general, books written on creativity—most often claiming to arise from 'current' research—tended to oversimplify creativity altogether. One such book that I had found being used in our Faculty gave me reason to cringe. It was entitled Up and out: Using creative and critical thinking skills to enhance learning (Johnson, 2000), and it was appallingly naive in its attempt to strategize creativity in the classroom. There were nearly as many theoretical models or 'schema' as research studies—although there was a large overlap of ideas—matched only by the list of qualifiers, or characteristics, used principally to rate creative or cognitive processes and products. Nearly every empirical study, no matter what new term had been employed, returned to the four stages in the creative process coined by Wallas in 1926: preparation, incubation, illumination and verification.2 And, additionally, to J.P. Guilford's product ratings: fluency, flexibility, elaboration, and originality.
Those terms, for instance, were used in experiments designed by music educators (DeLorenzo, 1989; Gordon, 1980; Kratus, 1994; Vaughan, 1973; Webster, 1992). At any rate, one thing was made very clear to me: all such matters pointed indubitably to the mind as a primary force. And, despite the field's fecundity, I could well understand psychologist Teresa Amabile's (1996) sentiments when she said, "researchers are often accused of not knowing what they are talking about. The definition or assessment of creativity has long been a subject of disagreement and dissatisfaction among psychologists, creating a criterion problem that researchers have tried to solve in a variety of ways" (p. 19). No one was plainer about perceiving the flaw in the investigation than poet and novelist Czeslaw Milosz. In refusing an invitation from Csikszentmihalyi (1996) to take part in an extensive study, he sent a communiqué via his secretary making it clear he would not take part due to "some methodological errors at the basis of all discussions about creativity" (p. 13). When the subject (i.e., the creative individual) questions the means and interpretation of the observer (i.e., the objective measure), there is a good chance that someone is missing a valuable point. Perhaps this invisible fact—such as the one that upset Margaret Mead's findings—is to blame for the constant blooming of a research tree that remains barren of fruit. At any rate, following as many leads as possible—a search that traveled to far-flung places in philosophy, physics, psychology, biology, mathematics, and neurology—I hunted for clues. Many ideas began to spring from the particulars of wildly disparate subjects. Several new insights appeared to merge on pathways leading to creativity and technology. Some of those were farfetched, such as those that emerged from 'thermodynamics' and 'entropy.'
Eventually I narrowed the possibilities to a few generalities—a refocus of my energy to complete the task at hand (the pragmatic mind urging the metaphysical to end the wild leaps).3 Insofar as theories were concerned, the laws of media prevailed. It became obvious that it was not just a matter of what is known of creativity—an expansive and unwieldy content in such a brief historical period of time—it was a matter of the means by which we have come to know, for this would undoubtedly influence any theory of creativity. As expected, there was little acknowledgment of that fact in any of the research studies. However, recent work, such as that by Weiner (2000), made a strong effort to trace a socio-historical view of creativity. Since technology had been part of my inquiry from the start, I had found relatively few studies in the domain of music education that dealt with both technology and creativity. An ERIC search produced merely 3 articles with the key phrases music technology and creativity, and slightly more (i.e., 76 articles) with a Boolean search that included all three terms. Just as I had turned to creativity studies that covered a broad range of domains, I now turned to the larger body of literature on technology and creativity. However, merely entering the phrase technology and creativity on the Google web search engine produced 787,000 possible sites. An ERIC search using the same key phrase produced 1,614 possible journal articles. As in the case of my search for books on creativity, I would often peruse the shelves of libraries and bookstores, looking for recent publications rather than attempting to trace the titles found in library or Internet searches. One particular book, entitled Tools for thought: The history and future of mind-expanding technology (Rheingold, 1985/2000), offered many insights. As a journalist, Rheingold pieced together the history of nearly every important player in the development of computer technologies.
He also touched on the most salient ideas, with direct quotes from many of the key players in the personal computer 'revolution.' In the 2000 edition, Rheingold added an 'afterword' that brought several events, interviews, and ideas up to date. Ultimately, I was in search of a general theory of either creativity or technology that would set the stage for a study of the relationship between both phenomena. Overall, however, I noted that most researchers in education, psychology, and the social sciences with an interest in new media or digital technologies focused predominantly on the what (i.e., the message). Influenced by technology pundits Postman (1995) and Toffler (1990), I was thus driven to learn more of what McLuhan (1963) had to say about the how (i.e., the media). While searching for questions to guide my investigation and methodology, I had to abandon many established methods insofar as quantitative studies were concerned. Despite writing a reasonable proposal for investigation early in my research, I found it necessary to discard all of my initial attempts. A description of that attempt and the reasons for its abandonment is outlined in the following paragraphs. As previously stated, I believed my investigation would be an exploration of the relationship between creativity and technology. With the use of MIDI-based (musical instrument digital interface) software, in a fully equipped computer laboratory, I had planned to have intermediate school-aged children participate in a small study. The study intended to examine any observable relationship between creative music ability (i.e., composition) and technology. If a participant had little formal training, I was on the lookout for factors that might influence creativity, such as aptitude, personality, and the use of new versus traditional music media.
Trouble in formulating a good research design began when I tried to establish an operational definition for creativity based on existing studies. Quoting J.P. Guilford (Amabile, 1996), I tried to establish the historical thread: "In its narrow sense, creativity refers to the abilities that are most characteristic of creative people...and is particularly interested in those traits that are manifested in performance, in other words, in behavior traits. Behavior traits come under the broad categories of aptitudes, interests, attitudes, and temperamental qualities...Creative personality is then a matter of those patterns of traits that are characteristic of creative persons" (p. 21). I also established that effort had been made to lay the groundwork in defining creativity. I stated that research was undertaken to (1) identify creative behavior and characteristics, (2) operationalize creativity, (3) determine creativity as a distinctive part of cognition, (4) devise appropriate testing conditions, and (5) investigate the relationship of creativity traits to other selected variables (DeLorenzo, 1989). Despite drawing content from earlier studies, however, I was forced to agree with researchers that an operational definition for the purpose of assessment was not yet established (Amabile, 1996; Boden, 1990; Piirto, 1998). Undaunted, I addressed creativity studies designed by music educators, and I outlined empirical studies that had been published as early as the 1970s. Those studies, like so many in other domains, had been influenced by the psychometric research of Guilford (1967) and Torrance (1966). In my rationale, I had to explain that despite growing contention regarding validity issues and evidence refuting early findings, psychometric designs continued to be the norm for music educators with an interest in creativity. Moreover, aside from Kratus (1989), few studies addressed music aptitude and the aural conception of music known as audiation.
Notwithstanding, most studies (including Kratus) used Wallas and Guilford's process and product definitions. Curiously, although it ignores the aural aspect of music development, the Torrance Test of Creativity (1966) became the model for a number of creative music tasks. Its use was popularized in education due to its ease of administration and scoring, and for that reason, music education researchers may have found it useful. On the supposition that creative music abilities needed to be identified—abilities absent in most creativity tests—music researchers merely added to Torrance's tests rather than redefining their parameters (Vaughan, 1971; Webster, 1992). I was also obliged to outline that in recent years creativity tests had become as hotly contested as IQ tests due to (1) content validity issues, (2) construct validity issues (concurrent and predictive), and (3) issues of generalizations made from the results of assessing narrow ranges of ability (Amabile, 1996; Boden, 1990; Kratus, 1989). Having admitted all those things, it became increasingly apparent that creativity studies were stubbornly following a line of inference that was inductive in nature. There was little discourse to guide a theoretical framework, and studies were accumulating as if they were merely additional bits of data, in the hope that someone would eventually make sense of all the findings. My attempt to ascertain the relationship between creativity and new media contexts for music composition seemed hopeless. It was clear that despite the thousands of journal articles, researchers frequently neglected the obvious: creativity as a cognitive process. If we separated mind and body, respectively, as concept and percept, the two portions would still add up to mindful acts whatever the outcome, i.e., art, drama, music, dance, science, mathematics, literature, etc.
What was most apparent, however, was that researchers who studied creativity within separate branches of psychology (e.g., personality, attitude, behavior, social and moral judgment, motivation, will, temperament, sensory and motor responses, emotion, logic, artistry, intuition, and so on) seemingly neglected the fact that all such matters rest in the mind. Though having utilized the term mind, I do not mean to imply a lack of physicality. Once again, it is not a matter of separating mind and body, or the physical aspect of the brain, or of addressing being as if it were merely housed in a disposable carcass, independent of its physical scope—it is a matter of acknowledging that a symphony of characteristics is conducted by the chef d'orchestre, from hippocampus to dendrite to artifact and back, as a medley of media (human and technologic). Neither mind nor body would exist without the other, but neither is quite so dramatically human in spirit without technology. Moreover, having utilized the term mind to connote both mind and body, it was clear that the socio-cultural context that interplays with the biological aspects of the human to 'create' mind was ignored. There exists a body of work in the field of social cognition that addresses the intimate relationship humans have with their surroundings (Augoustinos & Walker, 1995; Dodwell, 2000; Lave & Wenger, 1991; Vygotsky, 1962). Ontologically speaking, it is reasonable to state that the human 'mind' (i.e., human identity) may be heavily shaped by its relationship with the social and cultural. Moreover, according to McLuhan (1963), it is shaped by its relationship with artifacts (i.e., media). There was no simple solution to the dilemma of designing a research study. No single psychometric assessment tool or social context (i.e., environmental setting) existed that would sufficiently provide the foundation to reveal a relationship between technology and creativity.
In fact, all manner of 'valid' psychometrics and 'contexts' would need to be administered or observed. Needless to say, an experimental design of such magnitude was a conceivably daunting task. I determined that it would yield no more insight than what had already been established over the past fifty-plus years through numerous multi-factorial designs. Setting aside the controversy related to aptitude testing, psychometrics is a means of approaching creative processes by comparing and contrasting them with other mental processes we can reasonably measure—whether addressing visual, aural, or kinetic perception and reason—such as the identification of relationships and patterns. For instance, the Intermediate Measures of Music Audiation (IMMA; Gordon, 1989) is used most readily as a diagnostic tool for determining specifically music-related ability, i.e., the ability to discriminate and identify differences between rhythmic and tonal patterns. Presumably, the higher the audiation scores, the more likely it is that compositional abilities will be demonstrated. Understandably, a person who is unable to discriminate rhythmic and tonal patterns is unlikely to successfully compose a piece of music that could be rated as exceptionally creative—bizarre perhaps, but not creative under musical constructs. Notwithstanding, psychometric testing does not resolve the creative distinction between two equally skilled musicians and thus proves to be of little help in completing a definition of creativity. Second, it is creative products, not merely processes, that have to be considered. After all, no matter how creative the process, it is the product that is judged as creative and, as well, the creative product that is associated with the individual. Hence, the creative genius is only bestowed this title upon demonstrating a particular accomplishment. As with operational definitions, descriptive rating scales (i.e., rubrics), while valid prima facie, are not entirely reliable.
Even with inter-judge reliability scores, the ratings could arguably contain many extraneous influences that color the results. For that matter, we often learn more from studying famous works against their public reception (i.e., box office and billboard ratings) than from contrived exercises in out-of-context endeavors aimed at testing creativity. Third, given that authentic creative products are judged in context, one could reasonably assume that the medium plays a significant role. Because every context paradoxically displays enabling and disabling characteristics, my attention turned to environmental studies that promised to reveal some relevant facts. Based on the three-component model of creativity designed by Sternberg (1977, 1988)—viz., domain-relevant skills, creativity-relevant processes, and task motivation—Amabile and Simonton (1996) attempted to show contextual factors that influence creative products. Unfortunately, this latter attempt to uncover factors presented no major breakthroughs. In the end, researchers who aim to determine social or environmental factors are beset with the same issues psychometric researchers face: too many relevant bits of data to sift through and analyze. It is effectively like trying to find a pinprick in a gigantic tapestry of variables. Placing any additional 'medium' (i.e., technology) as a variable in the experimental design seemed hardly worth undertaking until creativity was more clearly defined. Lastly, some researchers have tried to work around the issue of definition. Amabile (1996), for instance, has used a 'subjective' or implicit rating scale. Judges are told to use their own personal definitions of creativity (not one assigned by the experimenter). Judges often succumb to defining creative products by using the word 'creative' or some other synonym, e.g., novel, unique, unusual, original, etc.
Since the results have indicated strong inter-judge reliability irrespective of the rating scale, definitive descriptions of creativity appear to be of little value. Yet although there may be strong inter-judge reliability when predicate and subject are one and the same, that reliability does not bring any clarity to understanding what is being evaluated. Furthermore, there is a distressing issue in the use of expert judges. There is historical precedent that groundbreaking works are often not understood or acknowledged by equally skilled peers within a domain. Frequently, those products are embraced by persons in a tangential domain and, with some reluctance, may be grudgingly accepted over time by peers from within the same sphere. It became increasingly evident that a philosophical examination of the essential nature of creativity was desperately needed to allow us to overcome dead-ends and impossible knots of confusion. While there were thousands of empirical studies designed to outline a model of creativity, none of the designs had been built on a rational foundation. Other than refuting inspiration from the Muses, there was no philosophical point of departure—no theory to speak of. On the other hand, there was not much to be found within philosophical endeavors that could be seen as directly foundational to creativity. However, the richness of dialogue on matters of mind, intellect, imagination, imagery, spirit, intuition, art, poetry, and so on was undeniably present, especially in the area of metaphysics. In all fairness to philosophy, of course, the creativity phenomenon crept into intense study only as terminus ad quem of the seminal address given by J.P. Guilford.
The term creativity had barely moved into the American vernacular shortly afterward (Weiner, 2000); and philosophers, not in the habit of treading on psychology's ground, preferred to direct their attention to the content of the sign—that is to say, philosophy had turned to matters of language—with developments in semiotics having only recently returned to the matter of cognition. If any connection has been made between the sign and creativity, there has been no definitive work undertaken by philosophers, though more than a few have tackled imagery and creativity, such as Arthur I. Miller (1996). Understanding the genius or creative mind and the extensions of mind (creative products and processes) has been principally left to cognitive scientists. McLuhan's take on media, which includes mind, technology, language, and so on, is one perspective that cannot be dismissed. That is because his focus was to develop a conceptual understanding of media—all media as signs and as signs play themselves out cognitively. Not merely as an interpretation of content, nor of a medium's syntax and grammar interacting on a single plane, but as a collection of media and messages pitted against the individual and society on several conscious levels. Since the thrust of new media implies innovative processes and products—both in the sense of the medium itself being innovative and in its provision of means for innovation to take place—there is a conceptual parallel to be drawn. McLuhan was adamant that his perspectives were not merely theoretical. His concepts have, in fact, been backed by strong evidence taken from the fields of cognitive science and linguistics—evidence that substantially forms a basis for understanding the relationship between language and cognition; evidence, when logically applied, that supports the relationship between creativity and technology.
As I continued to mull over the relationship between technology and creativity, while simultaneously focused on cognition and language, at some point several things began to tumble into place. McLuhan provided two media constructs that are easily proven true: (1) "the content of any medium is always another medium" (p. 23); and (2) the medium as ground is transparent—having a quality of being there but not seen. He proposed that we are unlikely to understand the figure with any degree of perspicuity without fully acknowledging the medium's existence, because the medium is the message. Insofar as he did not imply that medium and message are one and the same—for if they were, he would have had no cause to examine their relationship—the study of this paradox compels one to take a closer look at the mind/body issue, particularly as it relates to speech and thought. While many might believe dualism has long been settled, or that speech-and-thought issues died with Saussure (1949), unfortunately there are many indicators that suggest this is not the case. That unsettled matter, as I discovered, impacts heavily on creativity research, but it has even greater repercussions across all levels of education. As a result of sending newly educated teachers into the field without the philosophical tools to become critical thinkers, not even the knowledge of what has been conceived before, many spurious solutions are drawn up as means of dealing with issues of creativity and technology. Pedagogy, it appears, is increasingly governed by neither art nor science but by popular culture and trade. Teachers, like so much of the public, have developed a growing respect for common sense in lieu of critical thinking. For instance, in light of the amazing developments in digital technologies, educators easily succumb to commercial advertisements selling media products with claims that new media enable innovation.
As one Microsoft ad pointedly admitted, "Bad ideas do not get better on line." To the deep chagrin of anyone who has studied Plato's lament, despite our sophisticated age, there has been little progress vis-à-vis the development of critical thinking among the general populace. It is especially disconcerting that a learned community (i.e., education) falls within the same parameters. Upon reading the studies conducted by L.S. Vygotsky and later extended by his student, A.R. Luria, which demonstrate an undeniable link between language and cognition, I was struck by the similarity between the relationship of language and thought and the seemingly parallel relationship of technology and creativity. As far as creativity research is concerned, since a great number of studies had been conducted within the domain of cognitive science, at least that much could not have been a revelation to the community of creativity researchers. I was surprised to discover, however, that to conceive of language qua technology is not so readily attainable. Given the strong relationship that exists between thought and language, I wondered whether any light could be shed on the matter of the relationship between creativity and technology. It occurred to me that after ten years of teaching in the public school system, with continuous professional development in areas of creative and critical thinking, technology, and language development, I had never encountered either Vygotsky's neurological findings or McLuhan's theory applied in an educational context. Though I had briefly encountered Luria's work during my two years as a practicum student, there was no mention of Vygotsky. And, apart from enrolling in a Canadian communications course where I had the opportunity to examine McLuhan's ideas to some degree, he had made no impact on my education as a pre-service teacher. Thus, I was largely unaware of the impact that either had made on my current inquiry.
Moreover, I had been unaware that Luria had greatly refined Vygotsky's Marxist position and had used his findings as a point of departure for continued neurological study. From an educator's point of view, those findings would have greatly illuminated the area that Piaget had neglected to explain: the leap that must be made from concrete to abstract thought. Vygotsky called the bridge that connects the concrete mind with the abstract the "dialectic leap." Although Vygotsky took a purely material view of Hegelian ideals and could be said to have ignored the dialectic from a metaphysical standpoint, Luria was able to demonstrate in his studies the difference between dialectical inquiry as abstraction and as material exchange. From a creativity standpoint, the dialectic (in its purest sense) has hardly made a ripple among educational researchers. More importantly, insofar as the development of the creative mind is concerned, the role of the dialectic has not been fully considered. Clearly the reasons for this oversight among creativity researchers have been that (1) psychology has dominated the field of inquiry, resulting in a lack of metaphysical grounding; (2) blinded by layers of media, researchers cannot see beyond what has been established in their field; (3) research has followed an inductive method wherein particular data is collected in the hope of arriving at a general conclusion; (4) it is time-consuming to examine multiple avenues of seemingly unrelated sources; and (5) the laws of media in the form of a tetrad may be better equipped than the dialectic to deal with conceptually understanding technology and creativity. I venture that the final reason is the most important. The more I began to juxtapose theories within my classroom teaching experiences, the more I became convinced that the coalescence of that information could lead me directly to understanding the very essence of creativity.
Putting things into perspective, nonetheless, is an important first step. As with seismic readings, the collection and interpretation of all the data makes a significant difference. Although there are hundreds of events that may erupt all around a single event in question, each producing similar shock waves, it is through the coordinated efforts of seismologists that a single event of great magnitude, like a thermonuclear blast, can be distinguished from all other 'blasts.' Metaphorically speaking, there are ample shock waves produced by several paradigms of thought, resulting in similar data that masks the facts regarding cognitive development, the field wherein creativity is most logically defined, just as the shock waves produced by technology have masked the similar waves of linguistic data. In truth, I only took notice of a 'blast' when, in concert with 'unfamiliar' classroom settings, it demanded my special attention. To be fair, many researchers have been blindsided by rapid change and, with some urgency, have sought to find new meanings under stressed conditions. There may be another reason, however, for the Kantian 'blindness,' one that points more directly to human nature. Christine de Pizan wrote in the early fifteenth century that the measure of human desire is to create and discover, "for it is not such a great feat of mastery to study and learn some field of knowledge already discovered by someone else, as it is to discover by oneself some new and unknown thing" (Weiner, 2000, p. 49). Even when historians remind us that the past echoes in the present, the birth of something novel inevitably charms the imagination and blinds us to obtaining all the facts which may lie open to be recovered. When writing about the human condition, in effect, Hannah Arendt (1958) said that as each human being is born, a desire to impart a measure of one's uniqueness is the impetus for departing from "whatever may have happened before" (p. 176).
With word and deed we insert ourselves into the human world, and this insertion is like a second birth, in which we confirm and take upon ourselves the naked fact of original physical appearance. This insertion is not forced upon us by necessity, like labor, and it is not prompted by utility, like work. It may be stimulated by the presence of others whose company we may wish to join, but it is never conditioned by them; its impulse springs from the beginning which came into the world when we were born and to which we respond by beginning something new on our own initiative.... It is in the nature of beginning that something new is started which cannot be expected from whatever may have happened before.

Whatever the cause of this general lack of understanding, it reflects dramatically in a system that faces meteoric changes in technology. And its impact on every decision, from implementation to assessment, reverberates throughout society. Change, in effect, is the unrest brought about by the introduction of novelty into stable conditions. For experienced teachers, the battleground of rapid change demands new understanding. If science and art do not shape pedagogical processes, however, then applying technology and assessing creativity is going to be a colossal absurdity or, in other words, a farce. Paradoxes conspire to make it impossible for educators to carry out their role without developing clarity in theory and artistry in praxis. The educator who instructs novice teachers has an extraordinary responsibility to pedagogy and the layers of media embedded within it. In the end, I suspect that to understand that the medium is the message is to understand relationships as they exist between technology and creativity, language and cognition, mind and body, praxis and theory, and so on. It is herein that those paradoxes may be coherently understood.
Recognizing the futility of conventional methods in creativity research, I decided to employ an approach that was, by nature, a conceptual analysis. Since I sought to achieve synthesis, it could not be arbitrary. It had to have a measure of the philosophical, the historical, and the empirical. Its validity had to be open to verification. I began to entertain the idea of writing autobiographically. I knew that I had in my possession a professional journal that I had maintained for numerous years. Since my journal contains queries on practice juxtaposed with thoughts on readings, I had a record of both my experiences and my musings. Providing supportive data, moreover, I had a portfolio filled with classroom transcripts of text, music, photographs, art, videos, and all kinds of compositions and reflections of former students. Unwittingly, I had collected a rich data source from which to guide my conceptual analysis and arrive at synthetic thought. Later, I found literature on narrative and autobiographical research quite useful and somewhat aligned with my purpose. Concerned, however, that in writing autobiographically readers might view the content as a 'venting' of grievances rather than an analysis of theory, I took some comfort in the fact that "individual experience need not be confessional or focused on the author's feelings (though it may be)" (Griffiths, 1994, p. 76). Moreover, the validity of a narrative or autobiographical approach fell within the parameters of "theory reinterpreted in the light of experience, and experience in the light of theory" (p. 76). In point of fact, Griffiths cites as an example "Jackson's (1990) discussion of masculinity in the light of his remembered life, including his career as an English teacher" (p. 76).
This thesis rests, therefore, on experience juxtaposed with (i.e., analyzed against) facts that, as best as I have determined, have neither been fully applied to understanding creativity nor, for that matter, to its relationship with technology. Though there is perhaps nothing particularly stunning about any one of these facts to the larger community of researchers in semiotics, cognition, and technology, the collective application of experience, reflection, and theory carries great significance. According to Griffiths (1994), "a critical autobiography will add to reliable knowledge if it makes use of individual experience, theory, and a process of reflection and rethinking, which includes attention to politically situated perspectives" (p. 76). I would add to that statement: attention to all socio-culturally and philosophically situated perspectives, or at the very least as many as one can muster. Many contexts and contents tend to be glossed over in the educational study of creativity and technology. This glossing over is a major cause of the fallacies in logic embedded in contemporary arguments that, despite being compelling to educators and researchers alike, upon closer inspection reveal themselves not to be in the least bit useful for pedagogical directions in technology and creativity. Probing into popular arguments will provide the reader a means by which to understand the nature of the oversight, by becoming familiar with propositions that do not hold their ground in practice—information that is absent from the field of research and pedagogical endeavors. Propositional logic, quite rightly, must withstand careful deconstruction; if logic prevails, premises and conclusions must be true. If error exists, therefore, it will only be found through the use of analytical precision. Thus, the method from which this thesis unfolds functions with three objectives in mind.
The first objective is to illuminate the problem through a carefully constructed narrative (i.e., autobiography). Being a record of a personal account, the narrative describes settings, characters, problems, and obstacles that have impeded truth. At the same time, the narrative establishes the parameters of the inquiry at hand—since many threads of conversation could lead into many more discussions (i.e., sub-plots), the narrative serves as the 'architectural tone.' Thus, the narrative wends its way through observations in practice and reflections on practice, historical accounts, empirical evidence, philosophy and scientific theories in order to reach its conclusions. From that perspective, the narrative is the critical ground from which to understand the problem, and it renders visible what heretofore was transparent to me.

Next, insofar as a conceptual synthesis is dependent upon words, a treatise cannot ignore the diverse meanings found therein. To arrive at a common ground of understanding, therefore, definitions are frequently in order. Yet definitions, even those found in the most reliable of sources, carry multiple connotations, references and associations, which, when not anchored to context, obfuscate their sense. The second objective, therefore, is to bring clarity and meaning to concepts by moving beyond mere definitions. For that reason, a narrative's usefulness when dealing with abstractions through analogy, metaphor or parable can allow facts to become more intelligible. In that instance, the descriptive nature of the narrative, being a record of events, also lends support to the central idea by drawing upon the strength of metaphor as a critical method of inquiry (Griffiths, 1994). A third objective is to achieve a balance of knowledge derived from philosophical and empirical means.
Since the particulars and universals expressed in this thesis have been derived through disparate means in distinct but related areas, it will be necessary to collate their truthfulness by reason and logic. However, the means of the narrative may pose a problem for some because narratives are frequently viewed as merely 'descriptive' and relative (i.e., particular) in nature. Usually narratives are considered insufficient ground from which to form general conclusions. Nonetheless, the particulars will be examined under the light of generalizations accepted as true in comparative findings, and any additional generalizations made will be admittedly 'speculative.' A deductive argument cannot gain from additional information, for the conclusion cannot be made any truer. As such, the reader may find that the additional information provided to support the arguments found herein points to an inductive method. Although it is from special cases that generalizations are often made possible, science is, by definition, wary of induction because of its propensity to conclude erroneously from particular data. No amount of additional experiences (i.e., white swan sightings) can logically make anything truer (i.e., all swans are white). A narrative may appear to some to be a collection of particular data added up to arrive at a conclusion. A critical autobiography, however, uses logic that is deductive in nature and, therefore, aligns itself with the scientific approach. From a metaphysical standpoint, the theoretical construct for understanding media, undertaken notably by McLuhan (1963), serves to connect and illuminate many points. To some extent, the tetrad (as previously discussed) will serve to help clarify relevant media such as creativity and technology. The tetrad will also help to triangulate relationships, and to examine the link between language and cognition, and between creativity and technology.
Moreover, it will serve to draw out the paradoxes—mind/body, percept/concept, and concrete/abstract thought—all of which form cracks in the foundation of logic. Generally speaking, educators are caught in a hotbed of rhetoric and research that confounds pedagogical choices. Relying on experts, and neglecting critical inquiry, they willingly accept spurious solutions to resolve urgent practical matters. In recent years, notions stemming from the dialectic of postmodernism, constructivism and pragmatism—such as indeterminacy and chaos; transfer of knowledge and generative learning contexts; authenticity and self-determinacy; multiple forms of representation and intelligences—offer innumerable propositions that deal with complex cognitive learning issues. Although the application of all such concepts toward pedagogical approaches must be carefully weighed, neither the Marxist nor the relativist-oriented discourse has facilitated the comprehension of the phenomenon of creativity. Notwithstanding, the essential natures of creativity and technology have not been given sufficient philosophical consideration and, being visibly missing from research, require some attention.

In summary, faced with a demand to implement technology, and the pursuant design and assessment of creative processes and products, educators must come to terms with what appears to be an insoluble issue—defining, assessing and evaluating creativity with a new media focus. The purpose of this thesis rests upon several matters that are inexorably linked. The first will be to reveal the classroom condition; the second will be to critically examine popular educational arguments relative to creativity issues; and the third will be to reveal supporting evidence that allows for a conceptual definition of creativity.
Thus, in the next chapter, the reader will discover the school setting, its players and the arguments that make up policies in school and the classroom—policies that ultimately depend on the depth of understanding of cognition and sign systems. The following questions have guided this thesis throughout the investigation. (1) What impedes our understanding of creativity and technology? (2) What parallels can be drawn from empirical findings in language and cognition? (3) How can this new knowledge, gained from the exploration of the previous questions, heighten our understanding of creativity and technology so that change is positively effected in classroom pedagogy?

CHAPTER TWO

The Road to Inquiry: A Journey into History and Philosophy

Certainly we talk to ourselves; there is no thinking being who has not experienced that. One could even say that the word is never a more magnificent mystery than when, within a man, it travels from his thought to his conscience and returns from his conscience to his thought (Hugo, 1987, p. 226).

It is possible to read the preceding passage from Les Miserables, nod in agreement and yet, while still admiring its poetic sense, overlook its significance. It is possible never to stop to wonder, as a thinking being, by what manner we talk to ourselves, or to question in what sense our conscience is separate from thought, or even to wonder why Hugo bothered to write of such matters. As Les Miserables unfolds, in fact, Hugo's trifling with the significance of word and thought does not compare to the moral and social paradoxes he so artfully exposed and directed our minds to attend. Indeed, without the skills of an auteur, were we to author the content of our lives, we would not stop to ponder the relationships between word and thought. It would appear that the words merely recorded the thoughts that had swept over events—those victorious moments when we defeated our obstacles or, grimly, when we ourselves were defeated.
For in reflection, it is the magnitude of the events that we focus upon and the lessons we believe we have learned from them; rarely do we focus on the language of our sentiments, or by what means the events unfolded, and, more infrequently still, how our thoughts came to be. Even if we considered, for poetic or philosophic reasons, the words we chose, nothing could compare to what was said or done. Yet, this is not surprising. We understand in story structure that the inciting incident, that which propels the hero from the ordinary world to an extraordinary one, is transparent in the overall scheme of things. In fact, every incident, as large as it may strike our sensibilities, is a medium with a message that is seemingly transparent to everyone except the author. As soon as the archetypal hero is pushed into extraordinary circumstances, however, he/she bravely (often reluctantly) assumes the fight against every obstacle that blocks the path to the goal—the resolution of a very big problem in order to return to the sanctity of the ordinary world. When the inciting incident hurtles the hero into a completely baffling world, as it had Alice in Wonderland, imagine the confusion. Against what or whom is the hero battling? How will the hero return to the ordinary world? What if the ordinary world no longer exists? Could we not say that the startling achievements of the 20th century in quantum mechanics were the inciting incident that transparently sent us reeling into the extraordinary world of the 21st? Was it not cosmology, the study of all that is large and small, that eventually marshaled energy into the palm of our hand? Certainly, all of it can be confusing, particularly when the hero is slow to awaken to the strangeness of a new world and its new messages—and even slower to act. McLuhan (1966, 1967), however, proposed that a hero's confusion could be eased through understanding that the medium is the message.
Moreover, the medium is like a Rosetta stone; it becomes the key to its own deciphering. It is only from understanding the medium that the hero of a story may achieve the aim, and so too can a reader interpret the author's work. By understanding the medium, we are free to decide on a course of thought and action. To continue along McLuhan's (1963) line of thought, to be aware of the medium is to begin to understand the message; but when we neglect to consider that moral, social or political content is locked into a relationship with media, we lose perspective and end up battling what we cannot see, or acting upon what we do not understand. The more intently we focus on the content, in fact, the more we lose touch with the medium and the manner wherein the medium has affected the content. When we lose touch with all but the content, we fail to note the intimacy shared by the medium and message—an intimacy that is as absolutely interdependent as conjoined twins. To wit, digital technology is the inciting incident that has thrust us into the 21st century, an increasingly sensational, fantastic, and strange world. Showing signs of constant acceleration, digital media elicits both admiration and disdain; its contents (i.e., products) have become revered and feared. Why is there a division in sentiment? It cannot be merely due to the paradox of change. Surely few who live in a modern society, though admiring traditional means, can begrudge technological advancements. Rather, it is because few are willing to admit that content exists by virtue of its medium and that content, in fact, reflects its medium. Modern technology has sped up knowledge, just as Gutenberg had hastened it with print, though the cognitive effects of the two media, print and digital, are quite different.
According to McLuhan (1963), the printed word creates a fragmented, sequential and ordered world by virtue of its abstract figures that must be encoded and decoded one meaningless byte of information at a time. By contrast, aural and graphic media are whole and concrete—the eye and the ear are interchangeable. In creating a simultaneous now and informing us beyond print, new media presents a world not much different from the concrete and actual world we live in; effectively, a virtual reality. Those differences in media impact significantly on the way we think and feel, their subtleties barely discernible to the average individual. But to McLuhan (1963), such differences critically demonstrate the power of the medium; and were we never to discern them, keeping the power of the medium at bay, the fate of modern society would resemble that of Plato's cave dwellers, who were sentenced to a world of shadows. There is a comparison to be made between print and digital media that reaches beyond figures and numerals as means for expressing qualitative and quantitative thought. The visible alphabet, of course, requires that one develop a phonetic association, which then may be inscribed with the crudest of instruments. Digital media, by contrast, is dependent upon a numeric logic that is physically imperceptible, except to programmers, and requires sophisticated engineered hardware. Designed to capture traditional media (i.e., symbols, images and sounds), digital media escapes the user's conscious knowledge, which, to varying degrees, is also true of the use of the alphabet over time. That is to say, the alphabet serves to capture the traditional signs of language, namely, speech and gesture. The conventional alphabetic codes must be acquired through education, but once learned, they become transparent to the learner. Both print and digital media serve a similar purpose: extension, storage and communication.
And each originates from meaningless fragments to form sequenced logic. Furthermore, although their strength lies in mass and global communication versus individual and regional usage, neither digital nor print media can become accessible to the masses until hardware is engineered to support their design. The fact is that neither print nor digital technology became an issue of concern until the advent of the printing press and the personal computer, respectively. Since those watershed periods, both print and digital media have engendered grave concern among those whose sensibilities toward moral, social, political and economic matters hang in the balance. For many players in a global arena, the concern is over who will control knowledge, or gain access to knowledge, since hardware (e.g., even one as archaic as the printing press) is always costly. Many socially conscious individuals fear the economic and political disempowerment of those who are without means. A concern principally over content and the message or knowledge contained therein, the fear is based on the belief that what is said or done, as an infringement on the rights of society or the individual, is left to continue with impunity by those who own the instruments (e.g., computers). The same is true of knowledge that may be of benefit to society if it is kept hidden from the individual. Conversely, for those at the helm of society, the fear is based on the possibility of losing authority as the masses become empowered. The notion that knowledge is power stems from the belief that omniscience enables one to act as God. On the other hand, those who strongly advocate innovative hardware and software (i.e., the printing press in the 15th century and conventional spelling as early as 1876 in England and the US) do so with a positive disposition toward the acceleration of culture, firmly believing they are advancing human rights and ideas.
In short, most advocates of modern technology are mainly concerned with the promotion of content and the degree of digital acceleration. When we examine the issues that are certain to arise between two or more polar points of view, we note that the conflict over communication advancement has not changed much in 450 years. When we begin to examine the nature of that conflict in light of a statement made by Werner Heisenberg (1958), we are shown yet another angle: "Pluralism never appeals to those who are wont to think in fundamental principles" (p. 65). Those to whom the term Luddite is applied, those who oppose technological advancement, often fear the unknown—an inevitable cosmic condition. Yet, even among physicists, who understand for the most part the nature of the unknown, uncertainty is hardly palatable. For most physicists at the turn of the century, in effect, the Uncertainty Principle was an unsavory notion because of its unpredictable nature—physics, after all, sought to predict events, not leave them to chance. Moreover, most reasoned that truth is founded upon a determined design that lies in wait to be discovered. From that perspective, chance cannot conceivably play any part in science. As Einstein once declared, "God does not play dice." Yet Heisenberg's proposition, as absurd as it may have seemed, met the rigor of verification. Chance was thus introduced as a matter of course. And thus, despite the fact that chance is paradoxical in nature, it became a new trumpet to sound within and without scientific domains. But even if it were conceptually incorrect to introduce chance into the role of quantum events, the mere positing of such a paradoxical claim to solve a complex problem would not have gone unnoticed. The sensational proposition, thus, would have shaken the ground and every effort to uncover the truth would have ensued.
Umberto Eco (1998) suggested, in fact, that false claims that are taken seriously are often serendipitous in nature, for they frequently lead to truthful discovery. In so doing, Eco argues that false claims contain as much merit as truthful ones. Since, as catalysts, either truthfulness or falseness may stir investigation, we may surmise that reality is dependent upon contradiction, that is, the forces of opposition being the impetus for discovery. In the end, we do not take away the label of creative genius from those whose claims are partially incorrect (i.e., Ptolemy); we merely reassign them to the status of 'naivete.' Their position among geniuses becomes less revered in light of those whose discoveries or theories accurately shifted an entire consciousness. Once a paradox is understood to necessarily exist, the contradiction may be thought of as complementary (if one so desires)—a type of relative condition that allows truth, pitted against opposites, to become an either/or proposition. Light is either a particle or a wave, and either its velocity or its position may be measured, but not both. The point of view of the observer plays a role in determining the outcome. Similarly, in story structure, the protagonist and antagonist necessarily claim truth on their side; if one is true, however, then the other is false, depending upon which side is taken. Truth of a relative nature is correctly deemed particular, applying to one or the other paradoxically.

Yet, from either relative state, a general principle may be abstracted. Provided that all particulars pertain to the general principle, the latter remains in force until such time as it is shown otherwise. Whether paradox is ignored (i.e., regarded as too absurd) or considered complementary as in either/or; whether a claim is true or false, no one can deny that the very nature of opposition has favored us with extraordinary works, whether in science, philosophy or the arts.
That is to say, the fertile ground becomes impregnated with ideas and productivity nearly explodes (or, in the case of quantum media, implodes) when contradictions abound. When we neglect the medium and focus is placed solely on the message, two conditions are at once in force. The first is conflict and the second is the struggle that ensues to resolve the conflict. In the case of all media, conflict is wrought by the medium's tendency to slip by unnoticed. For instance, the story as fiction (medium) becomes second to the story (message). J.K. Rowling (2000), author of the overwhelmingly popular series of books on Harry Potter, has encountered the problem that besets those who ignore the medium. In an outpouring of religious fervour, many devout Christians throughout North America denounced the stories as instruments of witchcraft. Rowling countered with a logical position. She questioned the adult assumption that children are incapable of discerning fact from fiction. The outcome of any 'duality' is usually marked by polemics—insofar as points of reference are contradictory in nature (i.e., mind and body). Suffice it to say that when the relationship between medium and message is overlooked, it creates a permanent divide. Invariably, contradiction and opposition create a struggle between the keepers of Truth (antagonist and protagonist). Since an institution (or the members therein) fears its own overthrow (i.e., religious), as a result of the struggle for control, conflict is often placated (sometimes squelched) at all costs. Even if it means a willful acceptance of spurious solutions, the need to restore order or to resolve urgent practical matters that threaten equilibrium is often embodied in the institution. Unfortunately, the content of such matters, being one-sided, is not much more than propaganda. Though many may recognize it as such, to face paradox squarely may be an impossible task for those who believe they hold a truth that is free from contradiction.
Unless paradox is revealed, however, institutions will continue to be filled with characters such as Hugo's Javert, who hold up the standard of the message until one day they are confronted with a lifetime of flawed behavior. There are many who fight with righteous indignation, with a banner of Truth to support their convictions. But the bigger picture that includes both sides of the opposition can only be revealed by thoughtful inquiry. Truth is not so much relative as it is the revelation of the tension that binds opposition. At any rate, truth can neither emerge from a single point of view nor from refusing to face the tension that comes from contradiction. The institution of contemporary schooling may be restricted by bureaucratic choices that are situated within economic, political and social issues. The messages that are embedded within the social discourse frequently dictate the outcome of action, to the extent that, as a means of inquiry, schools generally lack the combination of artistry and clarity. To borrow a term frequently employed in the field of mathematics, schooling lacks an elegant proof. Though artists, scientists and philosophers have contributed to educational constructs, neither art, science, nor philosophy governs classroom practice. In general, practitioners are buffeted by every blowing doctrine—the bigger the gale, the more widely the doctrine is spread and, thus, its popularity will ensure its position. It is well known among advertising firms that the more frequently a product is seen, the greater the likelihood of selling more product and successfully beating out the competition. Despite the numerous competing theories that ought to provoke critical inquiry, debate is met with opinion rather than research. The transparency of the medium, its play upon the message, reaffirms the juxtaposition of medium and message.
Ultimately, practitioners come to rely on experts who trouble themselves with such matters, even though it is incumbent on all educators to think critically. Educators thoughtfully engaged in critical inquiry try to meet paradoxes with a certain reserve. It is not easy to reconcile theory and praxis, or common sense and science. There may be a few who try in earnest to awaken the masses, but with such a sprawling and established institution as education it is not a simple matter. Critical inquiry is not for everyone, and unless the findings state the obvious, as Hume (1998) once criticized, with a measure of practical application, entertaining absolutes (i.e., abstractions) can be daunting for the average mind. It is for that very reason, I presume, that Marshall McLuhan has been largely overlooked. However, a grassroots type of change can occur at the level of teacher education, provided that the instruction is guided by the knowledge that paradox renders truth and that conceptual understanding is possible when clarity and artistry are applied. By that, I imply the following. (1) Abstractions, or generalizations common to theoretical constructs, are in most instances difficult to apprehend and, thus, teachers will try many practical strategies to purify the waters of theory. Nevertheless, it is only when a teacher has herself clearly understood and applied the general principle that she can ultimately bring clarity. (2) The art of any discipline demands that paradox be revealed. Teacher education, effectively, offers pre-service teachers the only venue in which to initiate the art of inquiry among beginning practitioners and, having said as much, that fact places a great deal more pressure on instructors at higher institutions to move beyond pragmatics. Discourse must become as much a part of education as activity. (3) The education of novice teachers is as caught up in the paradox of theory and practice as that of any discipline found in school.
But admitting that this paradox exists is at least a step in the right direction. All that remains is to 'see' the paradox in order to confront it and then to illustrate its presence with a degree of eloquence so that others may join the struggle. The fact that there exists a continued struggle within traditional disciplines to marry theory and praxis is troubling enough—the fact that there is a novel context imposed on today's educators compounds the situation. Teachers in the 21st century must contend with technology as the new medium and new content of educational practice and theory. Thus, individuals who enter the profession of teaching are required to comprehend and apply a constantly changing body of knowledge to keep up with the demands of a restless society. There are some individuals who would say that art, the Latin translation of the Greek word techne, and technology are bound by the same principles. A paradoxical tension exists between the medium and message, and between the human being and the artifact. For that reason, art and technology are conceptually similar (Gouzouasis, 2001). Conceptually speaking, however, that fact stands at a distance from practice, theory and praxis being what they are. The inexperienced thinker will find the examination of such matters difficult to comprehend and, practically speaking, may find it disconcerting insofar as immediate concerns need to be addressed. After all, anyone faced with the daunting task of integrating technology in a classroom context frequently has more urgent and practical matters to consider than philosophical ones. Nevertheless, most of the discussion to this point in this thesis concerns what is at stake for educators who lead new professionals into the field, particularly when technology is involved. In the end, educational researchers who endeavor to broaden personal knowledge and to apply in practice what is most often theoretical must not run from paradoxes.
Like all teachers in every classroom, they face two paradoxes. The first is the coupling of theory and praxis (a process that is universally misunderstood on a colossal scale). The second is the capacity to disseminate one's findings without squelching competing ideas, knowing that it is only when competing ideas are critically examined that a pre-service or practicing teacher may discover the art of inquiry. The bonus of approaching new knowledge in such a manner (i.e., inquiry) is the fact that discourse enables one to uncover what one seeks. For those who make the investigation of competing theories and practice an imperative, never shying from squarely facing paradox, always willing to examine the essential nature of troubling phenomena (such as technology), there is a chance of 'understanding' with more clarity. The earnest teacher realizes that to follow the militant path that comes of embracing too much certainty is to risk concealing truth. If truth is to be sought, therefore, one thing must remain in the forefront of learning: a flow of questions that probe with incredulity theoretical and practical foundations. Moreover, it is in the critical contemplation of competing ideas wherein truth may emerge. It is only through inquiry that the new professional can begin to honor their role as educator. By that statement I imply that a teacher honors the profession when a critical view is taken rather than mere reliance placed on the knowledge of so-called experts. Without personal inquiry and reflection, teaching would resemble not much more than the transmission of fragmented bits of information that has become the province of television (though not nearly as entertaining). Within the heart of earnest inquiry lie both artistry and clarity. Art necessarily carries a measure of paradox, and it is the revelation of the paradox that is the skill of the artist. Short of personal inquiry, it is better that good research back practice than none at all.
Understandably, after the paradox is revealed with an artist's skill, a sense of 'powerlessness' may ensue. Paradox always reveals both sides of an issue and, if we are careful observers, we begin to find it difficult to take sides (as bystanders we can gratefully abstain from doing so). Even more difficulty arises when we try to balance a curriculum in a limited time frame or with limited resources. Notwithstanding, to ignore the paradox is to limit truth. After the September 11th attacks on the United States, people had a need to 'understand' all sides of the issue. With that came the inevitable irony of the truth of any complex situation. Creating an environment that honors investigation necessitates thoughtful action, and nothing less should be demanded of educators—in particular those who instruct pre-service teachers. The instructor's respectability is founded upon an ability to use a measure of precision in thought and artistry in action, which holds true whether instructing at the university level or at the kindergarten level. Generally speaking, however, it is an arduous task. Anyone engaged in thoughtful inquiry and critical investigation faces the conflict of theory and practice—on the one hand the mind is engaged in entertaining absolutes, and on the other, attending to practical matters. The demand placed on instruction (and instructors) effectively requires the teacher to employ an elegant means to mediate the conundrum of learning (i.e., praxis and theory). Nonetheless, as the instructor is the medium, if awakening the mind is the goal, great clarity and artistry are required in every possible sense. The desire to "insert ourselves into the human world," as suggested by Arendt (1958, p. 176), like an assertive teenager, is something that may prevent us from comprehending with more clarity and, thus, from engaging the art of inquiry or, for that matter, art generally.
We may wish to create something new, to discover new frontiers, but without understanding, we are unwittingly bound to repeat what others have done before us, being neither original nor novel in our endeavors. Moreover, we are certain either to crash or to stumble our way through a network of delicate principles that were constructed, for better or worse, with thoughtful intention. In truth, my own desire to "insert myself into the human world" caused me to pitch forward blindly. I often claimed to advance by intuition and fortitude. I was unaware that intuition is merely the capacity to wrestle with concrete and abstract meanings, and fortitude the willingness to find a solution despite the pains of struggle. Unable to articulate with a degree of certitude, I held firmly to the opinions of experts to support my beliefs. It was only when I encountered opposition from within—a personal wrestling with theory and practice—that I was compelled to seek greater understanding. I realized at the time that, because of a lack of understanding, I would be prone to do the same thing in the same way while expecting different results. That is a condition that surely renders feelings of frustration. Realizing that much was a step in the right direction. The next step toward understanding, it would seem, was to examine all the clues left behind. That meant the examination of both medium and message. In so doing, there was a chance that a discovery could be made.

Reflections on Teaching: An Uneasy Discovery

Thinking in this manner, I pondered over my work of the past ten years spent teaching in arts and technology, the latter three years educating pre-service teachers within the bounds of the discipline of music. As I faced the diversity of knowledge, belief, value, skill, style and temperament of those who would become teachers, desiring a degree of clarity and artistry, I searched for a means to convey all that was necessary.
From the start, the diversity within the university classroom, a reflection of a democratic and pluralistic society, created a challenge: to discover a method and style that would enable me to negotiate between conflicting views while, at the same time, providing thoughtful content that not only reflected the discipline of music with its requisite method and content, but also reflected the enormous body of knowledge that has shaped both the discipline and the profession of teaching. I discovered that it was a great deal to accomplish in the short period of time allotted for instruction. Invariably, conflict arises and emotions flare when confusion reigns in a course designed to teach music to generalists with little formal music background. During those moments of confusion, I reasoned that it was imperative to identify the cause of malcontent if I wished to effectively reach my goal. It was neither a simple matter to locate the origins nor easy to remain cool-headed. Whenever tension arose, I thought of plausible reasons that caused the flare up: the struggle embedded within the profession of teaching, the body of knowledge that questions epistemological concerns generally and those of the discipline of music, and any number of factors that make up the human condition. One day, a frustrated student met my tentative first teaching attempts head on in her journal and shed light on the whole matter:

All I would like, though I do not speak for the others but many will agree, is to have some direction. In order to use this course within the context of our teaching careers we need to know what to do.

It was natural that students would ask me to show them the way—I was their guide, after all. But, what did this entail? How was I failing to guide, to anticipate their need, to offer models of instruction?
Though I appreciated the student's candor, I rested uneasily at night wondering whether I could handle the complexity of teaching a methods course entitled Curriculum and Instruction in Music Education to non-specialists. In the midst of my anguish, for it was very troubling to me, I began to note the questions that were posed throughout their personal reflections. I was struck by their great desire to know, understand, and develop teaching skills. As illustrated in the following examples, many education students are prepared to question established convention. Why must music be taught in school—is it not better taught privately? Why must I be held responsible for music education when specialists are not available—what can I, as a non-specialist, do to address music adequately? What role does technology play in the classroom, particularly in music education? Why must I be held responsible for teaching with technology; are we really prepared to engage all children equitably? What activities should I structure and why? How do I integrate activities with other subjects? Which theory, strategy, technique, or approach achieves which outcome and why? (A natural response to this line of questioning is that pedagogy is matter-of-factly governed by cause and effect). Why is it so difficult to assess and evaluate music performance and creative outcomes? With each query, despite my years of experience, I became increasingly aware of my own lack of precise understanding of my aim as a music educator and, for that matter, the relationship between music education, creativity, and technology. As is so frequently the case, the imprecision of my thoughts bore the signature of a limited conceptual knowledge. Though I could have shown by example innumerable cases and engaged students practically in dozens of strategies, none would suffice to explain the essential quality of their questions that demanded conceptual definition: What is (x) phenomenon?
What is my responsibility to (x)? How should I address, teach, assess (x) and why? There existed no easy responses to those queries for, as I grappled with the immediate issues that troubled them, I struggled to interpret all that lay beneath their objections, just as I struggled with my own understanding. Fundamentally, their troubles arose from being put in a situation of learning to teach a subject that had eluded the majority of them both practically and theoretically as children, and now as adults. Though I ventured to assess and fill their comprehension gap in music and music pedagogy—bearing an intimate relationship with developmental psychology and general learning—I could not fill the gap between classroom experiences and conceptual knowledge marked by matters of a metaphysical and epistemological nature. I simply lacked the answers, and, thus, I reasoned that their understanding would be served best through personal inquiry. Hence, I assigned readings for analysis, presentation and discussion, and I endeavored to relate such abstract matters to concrete classroom work. At the very least, this task forced students to engage in thoughtful discourse as they grappled over the tension that exists between tradition and change as it is marked in the discipline of music, between theories that contradict classroom practice, and between academic study and the lived experience. In general, personal inquiry ameliorated classroom learning. Though it did little to enhance the development of music skill and knowledge, it did, however, raise important issues. And from a desire to gain further understanding, practical experiences were surprisingly sought after, such as playing the guitar or making a movie with an original composition.

Flashback: The Beginning of the Journey

In September of 1997, I had started a third year of teaching in a most extraordinary school named University Elementary (UES), which was situated in Calgary, Alberta.
Though I was not especially involved in early childhood education, I spent the first weekend in October at a conference entitled, "Contemporary Curriculum Dialogues in conjunction with The Hundred Languages of Children Exhibit: Work from the municipal preschools of Reggio Emilia, Italy." The conference had brought together delegates and presenters immersed in transformative and generative curriculum, and featured two keynote speakers: William Doll, a curriculum theorist rooted in postmodernism; and George Forman (1993), the co-author of The hundred languages of children: The Reggio Emilia approach to early childhood education. From the outset, the conference promised to deliver a message of support and affirmation for the direction we had taken in curriculum and pedagogy at UES while, at the same time, offering us new insights from contemporary theories and practice. I was particularly keen because of the changes I had made to my specialist domain, namely music education, which, despite many personal misgivings, I had implemented with enthusiasm and the blessings of two of my administrators. I had departed sufficiently from the core music curriculum to feel nervous about my personal choice to do so, even though it was UES that had set the landscape for this change in my practice, encouraging me to use my broad background in performing and media arts. In short, by challenging my thinking and allowing me the freedom to imagine, innovate and develop a new curriculum, I was encouraged to apply the knowledge and skills I had acquired in music, drama, dance and film. Finally, the curriculum I was encouraged to develop in the arts had to link to core curriculum—a challenge that greatly enabled my understanding of integration. As a member of the UES Creative Application Team whose focus was arts, science and technology, I became part of a new movement that introduced collaborative team teaching between specialists and generalists.
It was a movement that reflected changing times observed in society, business, and institutions, where general and special skills are required for successful systems operation. It would seem that a new worldview had been hatching since the inception of science and industry, and this change was rapidly transforming society through innovations in technology. Our new movement began with a school mission statement that extolled the virtues of democratic learning deemed to be collaborative, constructive, generative and transformative. Those notions had emerged from our readings: the philosophies of Maxine Greene, Jerome Bruner and John Dewey, and the theories of Jean Piaget and Howard Gardner. It was also part of an emergent movement to break with the Modern view with its hold on a central Truth and authority. In brief, those latter notions emerged from the postmodern views of curriculum theorists William Doll, Elliot Eisner and others. As such, the postmodern dialectic influenced our commitment towards tolerance for ambiguity, chaos and indeterminacy over linearity, sequenced learning, and determined outcomes. Those latter views, believed to be traditional in scope, compelled us to reject what Doll (1993) called the clinging to Euclidean logic and positivism. We embraced, instead, the randomness that emerges from negotiated and generative learning. In other words, the subject of study, that is to say the content, was driven by the context. It was another example of the interrelatedness of medium and message. We did not doubt the language used by the new voices of society, for postmodernists, technologists, scientists, humanists and capitalists all appeared to sound the sacred trumpet of democracy. We all felt strongly that democracy, a system that requires openness, freedom, choice, negotiation and tolerance of many beliefs and values, was the cornerstone of learning in our times, for our classrooms were filled with enormous diversity.
This diversity included age, ability, ethnicity, culture, socio-economic status, language, disability and so on. We rejected the educational oligarchy and embraced the notion of 'every teacher as leader,' and we equally dismissed the autocratic rule of the teacher in the class by including children in decision making through negotiated learning. Negotiated learning meant that teachers were flexible in content and approaches, providing student choice and engaging in inquiry that stemmed from both student and teacher interest. To accomplish that, teachers created projects whereby children could choose their focus of learning and select by what means they would express what they discovered. Though there was a kind of randomness to this approach, the content was not entirely random. The overall study (e.g., Native Americans) was mandated by the Ministry of Education and handed down to schools as Curriculum Resource Guides. What could be negotiated, often brainstormed between teacher and pupils, were the relevant tasks, topics, and the means to arrive at the end. That was the 'creative' end of things. On the one hand, the classroom practice hoped to emulate the decision-making processes that were at work in our decentralized system, one that penetrated as far back in administration as the governing school board. That is to say, the school board itself had taken steps to ensure a measure of individual school autonomy. On the other hand, we wanted to be at the 'cutting edge' of educational policies in teaching and learning. While it may seem that there was a certain 'flow' to these events (as one might expect of highly creative activity), just as in most 'democratic' societies (where policy is created as needs are revealed), there existed a perpetual tension that forced an incessant dialogue between members of the staff, between teachers and students, and between teachers and parents.
There were many teachers who adjusted to the openness of negotiated means, as did many students in the classroom, though for some students the expectations that required a good deal of self-direction and self-control proved to be a challenge. For that reason, while there were many teachers out of a staff of twenty-three who had adopted the new language and found ways to match it to practice, there were a few who, in principle, accepted the new ideas but whose practice appeared not to change. More of us, however, who embraced democracy with some sensibility toward its limitations (i.e., the chaos that reigned with the collapse of the U.S.S.R., wars in newly reformed countries, and the confusion wrought by conflicting values in our changing society), proceeded with cautious uncertainty. There existed, nonetheless, a sense of lightness from a feeling of autonomy and from the restrictive boundaries having been lifted. The changes in administrative management were a result of the works of W. Edwards Deming (1986) and Peter Senge (1990), both of whom influenced our immediate and district administrators to reconceptualize quality management and systems thinking. Yet, the ambiguity that was present, the feeling that few boundaries existed, dampened the spirits and caused unease within staff meetings and during Parent Advisory Council (PAC) meetings. Democracy requires a great deal more dialogue than authoritarian rule, and dialogue requires a great deal more thought, as Socrates once exclaimed, than is at first conceived. Most of the unease was dissipated, albeit momentarily, with continued discourse, open forums and mutual encouragement, though, after some time, there were those who simply moved on to a more traditional school—having never quite adjusted to the protean environment.
Indeed, one year after the implementation of the school's new mission statement, I had responded in my journal to questions my principal had posed to the staff that reflected many of those sentiments. The following excerpt, therefore, is a record of the events that I later shared with the staff during a day of professional discourse, a document that bears insight as to the context of the time. In response to the question, "How has your teaching changed since coming to University Elementary?" I replied as follows.

UES, or should I say S , has challenged my intuitions, my assumptions, my beliefs and my practice. Mostly though, because I came here with a reasonable intelligence and a multitude of experiences, I have come to the conclusion that this was orchestrated under the ingenious (or perhaps maddening) guise of forcing me to find my voice, and articulate with clarity and conviction an approach to teaching and learning based in postmodern curriculum theories. It all began with an environment that I perceived as chaotic, ambiguous, filled with incessant dialogue, and persistent inquiry. In an organization that depends on its capacity to prepare, set goals, implement and evaluate its outcome, it was hard to imagine that anything but a carefully laid plan was behind the 'chaos.' On the other hand, I knew that chaos could appear unwittingly even during the best-laid plans. I could not decide whether chaos ought to be left to chance or built into a design, and this issue plagued me for quite some time because of the voices of postmodern curriculum theorists, such as William Doll (1993), who embraced chaos as a means to an end.

At any rate, as a measure of my desire to move toward a new curriculum, I had spent the previous year in a lengthy project involving upper elementary students, aged 9 to 12, in film art and music critique that eventually led to video art compositions.
I began the project with a critical examination of short subject narratives and followed up by implementing the use of varied technologies, i.e., video cameras, still cameras, computers, scanners and related digital music and video editing applications (i.e., MIDI—musical instrument digital interface—and Adobe Premiere). All of the above resulted in a project lasting six months, a rather lengthy period of time for any elementary field of study. The art of film, which is an ongoing and lifelong personal hobby, is a medium that necessarily requires collaborative action and thought, multiple skills and knowledge, generative processes for planning and designing, and dialogue. From start to finish, a film requires a scientific approach to finding and solving problems: observing, inferring, predicting, testing, and evaluating. It also requires a good story, hence, a script. Scriptwriting involves all the elements and processes of storytelling, i.e., identifying the story arc from inciting incident to denouement with accompanying dramatic moments of conflict, rising action and climax. The script has to be typed, usually on a word processor, and the sides have to be memorized by the actors (volunteers in our case). The actors' scenes must then be blocked by the director and sequenced, in accordance with the shooting schedule, in a non-linear fashion; this differs greatly from a theatrical version, which is unrestricted by equipment availability, location and wardrobe. Filmmaking requires developing a vision, a plan, and a procedure, as well as maintaining focus, teasing out details, and making creative decisions that heighten the story beyond the simple plot sequences. Furthermore, it requires technical equipment and supporting media. Camera equipment, electric outlets, lighting, microphones, sets, props, and location all have to be considered and carefully handled.
The computer, useful in many ways, is used to view the rushes (i.e., rough cuts) and to edit, add visual effects, dialogue, and sound or foley (i.e., sound effects). Most important of all, music, considered an integral element of film for enhancing the dramatic text and action, must be composed and requires the use of digital equipment, such as a MIDI synthesizer and computer sound studio. Finally, promotional material (in our case posters) must be made with the use of computers, for text and graphics are assisted through digital art design. Film, from start to finish, needs the special abilities of composers, writers, directors, camera operators, directors of photography, producers and publicists, actors, technicians (gaffers and grips) and continuity, wardrobe and makeup persons, ostensibly, all of whom are creative in their endeavors. The students who took charge of the filmmaking processes, once discussions were underway, assumed each of those tasks. Filmmaking seemed to be an ideal metaphor and an authentic vehicle for the values we espoused, in particular the notions embedded in constructivism. Moreover, I had felt strongly that students would gain invaluable knowledge of media and acquire alternate forms of communication through criticism and composition. I knew it counted for something, aside from authenticity in collaborative processes and products, for I had learned of the possibilities for developing 'literacy' in other sign systems, though I did not know precisely what this meant or how to articulate why this would matter. The most difficult task, in effect, proved to be the evaluation of our work and grading the students. Before one can evaluate, one has to know what is being assessed, whether it is literacy, creativity, skill, critical thinking, or compositional outcome.
When technology is at issue, in particular multimedia structures, the application of the aims embedded within traditional methods and content is seemingly insufficient for such purposes. For instance, though an educator may have a reasonable understanding of traditional forms of literacy (i.e., reading and writing) and the expected outcomes, media 'literacy' does not presently contain the depth of knowledge found in linguistics, nor does it hold the same status as linguistics. Literacy, in fact, not having been previously applied to oral societies, originally pertained to the written word or other such systems with 'literal symbols' such as mathematics. New media, by contrast, use expressive means that far exceed written text and, in effect, deploy the senses of hearing, sight, motion and touch in ways that literary forms could only do by conceptual imagery. Nonetheless, the term 'literacy' is used in today's multimedia society to signify the ability to 'interpret messages,' especially those of a visual nature. Considine and Haley (1992), for instance, explain that TV and film audiences can either perceive and process at a shallow level or have a deep understanding depending on the level of media 'literacy.' What Considine and Haley probably intended was media fluency, a term frequently employed by Gouzouasis (2000, 2001) to distinguish between traditional literacy and the application and comprehension of new media. Many people today broadly interpret all forms of expression as sign systems, i.e., new media, traditional symbol and art forms—including language and math. Ultimately, literacy requires one to encode and decode messages and to understand contexts and cultural meanings, one reason why the term 'literacy' has been broadened to include all sign systems.
Nevertheless, it is less of a literate nature and more reflective of fluency, since not every expression is bound by the skills required of 'literate' individuals who must first learn to 'read' abstract symbols in a linear sequence before interpreting the meaning. Since the brain processes images and sound differently from symbols, therein lies an indication that the term literacy does not serve all media processes and products. I had first encountered the notion of sign systems and media 'literacy' through the writings of Short and Burke (1996) and Elliot Eisner (1997), whose ideas had stemmed, in part, from the philosophy of Susanne Langer (1953). Sign systems, in reality, contain meta-systems such as language, arts, technology, and so on. When broken down, those meta-systems form smaller systems known as icons, codes, symbols, and rules of usage. Generally, those smaller systems embedded within a meta-system are referred to, in the Chomskian sense, as 'deep structures' and bear the terms grammar and syntax, otherwise known as the logical and intrinsic sequence of words and the relationships found between phrasings. Language, for instance, is considered to bear three structures: grammar (i.e., agreements between words), syntax (i.e., hierarchical order of phrases) and semantics (i.e., meanings of words). Music could be said to contain both syntax and grammar (Gordon, 1986), whereas, according to McLuhan (1963), visual images do not have a syntactical structure but arguably contain a visual grammar. The preceding is greatly supported by neurological findings (Sacks, 1995). As a whole, language adheres to a linear, hierarchical syntax and grammar—as does music (Gouzouasis, 1991, 1994c), whereas graphic images, unlike language and music, consist of a non-linear, non-hierarchical 'visual' grammar. Moving images, notwithstanding, are more likely to contain syntactical structures, since the medium of film is primarily based on text (McLuhan, 1963).
The study of sign systems, as a whole, is most likely to be found in the study of semiotics. Semiotic discourse has emerged from many fields of endeavor that have attempted to understand systems of communication and the related sign, symbol, icon, image, and metaphor. Findings in cognitive sciences also influence semiotics insofar as the connectedness of 'sign' and cognition are recognized. All of those systems within systems become part of a larger discussion on meaning and culture, with each one affecting the other. Those matters make sense in light of the fact that McLuhan (1963) said that the 'content of any medium is always another medium.' Like many other educators, I had not investigated any one of those fields to the fullest extent. Nonetheless, I proceeded with limited knowledge and the satisfaction of a notion, which appeared, at best, intuitive. In other words, the undertaking of the project was well intentioned and accompanied by fairly expert domain knowledge and skills. Yet, though bearing the weight of strongly held beliefs, it was not buoyed by a fullness of meaning, particularly with respect to signs and systems of signs. Moreover, teaching and learning wavered on the undefined, and while there was a great deal of activity, thinking and dialoguing, I was uncertain, nonetheless, as to what all of it signified. At best, the project appeared to be grounded in contemporary educational practice and 'philosophy,' and having risen out of a feeling that "something isn't right," it was a step toward fulfilling our school vision (although I later discovered not much beyond). I utilized those thoughts expressed by Short and Burke (1996) to justify the change I believed to be making in my teaching. I thus continued my response to my principal's preceding query by largely quoting from them in the following manner.
As Short and Burke (1996) expressed in the article, Examining our beliefs and practices through inquiry, change began with a "vague feeling of tension (for me it was more like a full-fledged war) that we may not be able to articulate. Something isn't right, and we aren't quite sure what it is (I had many opinions, however), and that leads us to take some kind of action" (p. 97). At any rate, I stayed the course precisely in the way Short and Burke so aptly describe, "What often happens, however, is that our first steps stay within the same paradigm of beliefs and lead to surface changes in our actions in the classroom. Although these first steps toward change are significant ones, they often do not go far enough. If we become self-satisfied with surface changes in our practices and stop searching and asking questions, we are in danger of actually continuing the status quo which we think we are transforming" (p. 97). Those "first steps" toward action, though gratifying, indeed, left me with more questions than answers. I was willing to accept that our process had a measure of heuristics (a postmodern endorsement) that depended on serendipitous approaches and outcomes. Teaching, however, requires more than being satisfied that something was learned; it requires something more specific that counts toward cognitive development in concert with other areas of human development. Hence, the events and outcomes that were frequently unforeseen hardly seemed plausible to include, ex post facto, as part of the planning process, much less evaluations. For instance, one fifth grade student summed up her feelings about the film project, following a much larger discussion on the varied aspects of the process she had come to understand (a reflection I had assigned).
Though this final summation does not reflect the full range of student thought, it contains a measure of the unforeseen outcome that is left unaccounted for and cannot be included effectively as having been part of the original intent of the project.

I was director of my film. I participated with most other jobs too. It was very hard work. I now appreciate why the director gets credit for the film. I think students need to get a clearer picture that the director is in charge. I spent more of my time dealing with discipline problems than with the filming. I also had trouble [because] the assistant-director, many of the actors, and technical and artistic crew, lost interest and quit mid-way through the filming. There ended up being a small group of students that completed the film by doing many jobs each. Some of these are experiences that most professional directors would not have to deal with, though they may have had to deal with some bad attitude actors. I have done some acting myself and found that attitudes are pretty hard to deal with.

The 'feeling that something was not quite right' forced me toward a great deal more pondering. My unrest, in effect, had moved me from a curriculum that was merely activity-based, one that was fun and enriching (as music education tends to be rationalized), to one that seemed more purposeful; yet, I was still far from uncovering the full sense of what I was doing. In the first place, I recognized that 'activity for activity's sake' was not the direction one must take in the classroom, but I did not know why this was so (other than the feeling that it was not enough). In an effort to make learning purposeful, I devised something else to take its place, but this too lacked comprehension. What curricular and pedagogical knowledge I did possess, in fact, later proved to lack precise understanding. How do we account for pedagogical decisions? What can be counted as educative?
All the beliefs and theories that abound in education amount to very little until they are put into practice; this much is common knowledge among practitioners. Yet, all the well-intentioned practice in the world cannot be founded solely upon vague feelings, common sense or intuition if truth is what we seek. If teaching is to contain artistry and clarity, it requires both artistic expression and a scientific approach. Anything less would dull the role of the teacher and cause undesirable damage to the human spirit. A rationale for content and method in teaching requires more than stating a desired outcome, and requires a great deal more than euphemistic expressions. Though I have heard it frequently said throughout my teaching career, it is not enough to say that we wish children to be happy, to have fun, develop intelligence and self-esteem, to be socially well adjusted, to gain leadership traits, and good citizenship qualities. Ideas such as those are indisputably held in the highest regard, but we have not yet found ideal conditions that either identify the cause or predict their end. Put another way, it is difficult to determine either the approach, environment, or content (though many will try) as having caused the outcome—a complexity that cannot be captured by means of mere stimulus-response. In truth, learning outcomes often arise from the foundation of the human condition and the ensuing socio-cultural context in which human beings find one another—none of which can be easily manipulated, controlled and predicted. Hence, it does not seem sufficient to think of curriculum planning as creating a landscape, planting the seeds, nourishing the mind, building the foundation, erecting the pillars, or fine tuning. Those metaphors only give us a vague impression that something worthwhile is afoot, but none precisely define what it signifies.
Curriculum development throughout the latter part of the 19th century until the middle of the 20th century was based principally on a social and economic agenda related to technological influences of that period (Ayers, 1990a, 1990b; Glatthorn, 1987; Liedtke, 1990). Employable skills were largely the impetus for a curricular focus on specific learning outcomes relative to the industrial and agricultural times (Toffler, 1980). From the beginning of the 1970s until the present, largely influenced by the rapid changes in technology, curriculum development necessarily had to shift its rationale to encompass general skill outcomes (Strickland, 1985; Toffler, 1990). The instability present in today's social structure and economy has forced learning institutions to focus primarily on general cognitive skills (i.e., critical thinking and problem solving). Curriculum resources seemingly provide ways to plan, design, assess and evaluate general skills. Nevertheless, since creative thinking and problem-solving processes remain largely undefined, in my experience it would seem that teachers most often depend on vague impressions and the benefit of years of experience, such as the established approach of defining the criteria (i.e., a model) prior to evaluation. Curricular content, in effect, is largely developed within the framework of a convincing rationale arising from the socio-cultural climate and, in particular, advances in technology (Toffler, 1990). Despite the foregoing, a theoretical understanding of such matters is largely missing from classroom practice. When creativity becomes a criterion, for instance, the learner is no clearer on the concept than the teacher is when it comes time to evaluate the product. Contrary to the wisdom of educators in the 21st century, creative processes are extremely complex to evaluate. The best one can muster is to judge the strategies used for designing a product, i.e., brainstorming and exploratory flow.
Though logical processes can follow a simple strategy, with creative processes we are simply at a loss to know which strategy actually leads to a creative product. Therein lies a huge contradiction for those who engage in logical pursuits such as mathematics, for a 'creative' mathematician is expected to arrive at a solution using alternate strategies despite the fact that there are logical steps to be followed. In that instance, we begin to understand why creativity must require both convergent and divergent thinking. All of the preceding fails to answer whether an individual can follow logical steps and achieve an altogether unique outcome. By the same token, it fails to explain why 'creative processes' can be engaged that produce little more than a tasteless or 'kitschy' product. Moreover, there is a problem that arises out of the design of the task and the expected outcome. The assessment and evaluation of a product that correctly conforms to specifications is relatively uncomplicated. A list of requirements necessary for completion is all that is needed for the student to understand the task at hand. Throw in a factor of creativity, and things begin to be problematic. The whole task becomes an exercise in ambiguity—for both the teacher and learner. In the end, I have yet to encounter a teacher, much less a beginning teacher, who can comfortably and with ease assess and evaluate creative thinking without feeling that there is a great deal wrong with the task because of the degree of subjectivity involved. As with any art-based learning experience, the majority of people, with the exception of those raised in artistic pursuits, lament that when it comes to judging the product, there is not enough 'objectivity' involved. What is problematic with assessing and evaluating creativity does not solely rest in ambiguity and subjectivity per se. 
Though any given synonym may either clarify or add to the degree of ambiguity surrounding creativity, there is a paradox that settles into the equation when the sum of the terms used to capture the meaning of creative processes and products is introduced. The following descriptors of creativity are found equally dispersed among educators, researchers and expert critics alike: ground-breaking, pioneering, modern, revolutionary, novel, unusual, unique, innovative, out-of-the-ordinary, out-of-the-box, cutting edge, appropriate, imaginative, inspired, artistic, inventive, original, catchy, surprising, shocking, refreshing, provocative, intuitive, ingenious, elegant and so on. If clarification were the objective for building the list of descriptors, the opposite is certain to ensue because each word implies a contradiction to the next. In my opinion, precision is further compromised in an educational context with popular expressions. In effect, one student went so far as to declare, "Creativity is judged by a factor of Wow!" In other words, when learners meet all the criteria that have been carefully expressed by the teacher, creativity is arbitrarily awarded to the performance or composition that has left us sentimentally speechless. Nothing is as chafing to the beginning teacher, or to any student who seeks precision, as something so illogical as human emotion. Nevertheless, what is most likely to happen when expressing a teaching rationale is to aim for a learning outcome as if it contained a measure of meaning unto itself. For instance, a teacher might reason the following: 'The children will be able to show creative thinking as a result of composing with the use of the synthesized keyboard.' The rationale, in this instance, assumes 'creative thinking' is plainly understood. Moreover, creative thinking becomes part of the measurement scale, i.e., '...will be marked on composition and creativity.' 
Along with the circularity of the rationale, there also exists a certain measure of redundancy. When I had set my elementary students to the task of filmmaking, for instance, I had simply reasoned that playing and composing with media would lead to knowing and expressing with alternate sign systems. That the condition of playing with something leads to knowing of that thing seems to contain a measure of common sense; yet, it does not declare precisely what is known. Although play was poorly defined, since activity ranged from informal to formal structure, I felt confident that 'play' was enabling me to determine and evaluate comprehension and expression with the use of alternate sign systems. In other words, I observed children at play and determined that something was afoot. Thinking along those lines forced educational theory and praxis to remain vague. We may think of children's play as a means to an end. Notwithstanding, the end of informal play is unpredictable and not entirely the measure of pedagogy and curriculum. On the other hand, unlike informal play, structured play such as that found in improvised music allows for the anticipation of certain logical outcomes. Theoretically, therefore, play seemed to constitute a fair context for discovery and invention. Once a theory is firmly rooted, with or without critical study, a teacher may wryly note its failure to take hold in a teaching context. "This method of fixing belief, which may be called the method of tenacity," proclaimed Charles S. Peirce (1998), "will be unable to hold its ground in practice" (p. 1057). That was my precise discovery while teaching at University Elementary. With tenacity my colleagues and I held firmly to the principles of democracy and to the ideal that knowledge could be obtained through a means of negotiated processes in teaching and learning, through personal and shared inquiry, and, as well, play, collaboration, dialogue and expression. 
We held tightly to the humanist principles of self-actualization (though we did not use that highly charged term), and also to the postmodern tenet that no one owns the truth. We desired that students 'construct' their personal knowledge and, as such, considered ourselves to be constructivists. But nowhere did we question the cognitive development of children. Surely, amidst all of the discourse relating to social and political consciousness, the children's development of mental activity would have been of concern. Apparently, it was not. We heeded Loris Malaguzzi (1993a), founder of the Reggio Emilia approach, who exclaimed, "It is important for pedagogy not to be the prisoner of too much certainty." Yet, we neglected the part which added, "but instead to be aware of both the relativity of its powers and the difficulties of translating its ideals into practice" (pp. 51-52). We were absolutely convinced that children were engaged in critical and creative pursuits, and that they were developing, additionally, multiple forms of literacy and expression. Sadly, without precision in understanding, my practice remained cloudy and, for all of its merit lauded by postmodernists, pedagogy was filled with error. As is typical of placing too much weight on either theory or praxis, "the errors and ills of pedagogy come from a lack of balance between scientific data and social application" (Malaguzzi, 1993a, p. 52). It was only during my own reflections on practice that I began to understand that pedagogy was at far greater risk than content. In pursuit of innovative means and ways of knowing and doing, amidst the excitement of postmodern ideas, we remained unaware that our practice held many shortcomings, largely because the medium (indeterminacy) gave way to content (postmodern values). "Indeed, it is only too typical that the content blinds us to the character of the medium" (McLuhan, 1967, p. 24). 
Nevertheless, since we believed our knowledge flourished with discourse (not recognizing that the knowledge was so shaped by discourse), so too did we assume our students' knowledge would develop through 'shared inquiry.' We ignored the warning by Malaguzzi (1993a) who exclaimed, "The conditions and goals of the one who teaches are not identical to the conditions and goals of one who learns" (p. 77). And, in our effort to establish 'quality' learning, our focus turned to the tension that inevitably arose between the players, i.e., parents, teachers and students. Those who valued indeterminacy and negotiation ended up speaking as though there were no other means to the road of self-discovery. Those who opposed that view, valuing instead determinacy and good government, saw ambiguity as a major flaw. Either way, in retrospect, I note that when "teaching is monodirectional and rigidly structured according to some 'science', it becomes intolerable" (p. 77). Whether players hold tenaciously to one belief or another, the result is the same: an approach that is mechanistic, and utterly empty of human quality. Some final thoughts must be offered about postmodernism before moving on to other matters, for its contribution to our method of instruction at UES was pervasive, yet poorly understood. Since postmodern theories, as medium, formed the context from within which our work had sprung, a great many problems resisted solution precisely because of its tenets. For example, the manner in which we applied methods and constructed content was often blind to human conditions (i.e., developmental stages of learning); more importantly, a focus on those means and ends precluded the capacity to understand what we aimed to know and do. As John Dewey once stated, it brought "about a separation of that mode of activity commonly called practice from insight, of imagination from executive doing" (McLuhan, 1967, pp. 54-55). 
In short, I agree with Katherine Hayles (Barth, 1995) insofar as there exists a distinction between the cultural world that is 'post modern' and the philosophy of 'postmodernism.'4 To know how it feels to live in a pluralistic and multi-cultural society is measurably different from theorizing about it. I do not agree with her, however, on the point that it is the youth of our society who understand it best, for undoubtedly we are all affected by a changing era marked by rapid-fire images and sound timed to rapid-fire meanings. In other words, we are caught in a stream of an acceleration of medium and message. (It is peculiar, in fact, that despite the awareness each of us as teachers possessed of a fast-paced and seemingly fragmented world, we rushed to recreate this world in our classrooms.) For most of us, embracing the philosophy of postmodernism was an extension of cultural expressions that had flourished since the early 1970s. Postmodernism mainly represented a break from the past. In fact, to justify our sentiment, we had constructed a large chart that divided teachers into two methodological paradigms. The traditional 'boss' teacher is the authority in the classroom—a caricature of traditional teaching. There are basal readers and textbooks in the classroom as primary sources. In the traditional schooling context, the boss teacher designs units of study from those primary sources (i.e., provincially mandated curricula) that contain sequenced lessons with start and end dates firmly affixed. The boss teacher demands that students memorize decontextualized elements of literary content (i.e., spelling, grammar, punctuation) and mathematical content (e.g., multiplication tables). Teaching is transmitted in didactic style. And 'closed' rather than 'open' problems are given (e.g., "fill in the blank" and "identify the item that does not belong"). 
Finally, the boss teacher uses traditional assessments such as multiple choice exams and essays, and imposes themes and subjects that are academic in nature rather than 'lived' experiences. The modern 'lead' teacher, by contrast, is portrayed as practically a saviour. An expert but not an authority, the lead teacher is believed to direct students to multiple sources found in varied contexts, i.e., literature and other media like video, magazines, the web and the like. The lead teacher designs units of study that contain authentic activity, investigation, and inquiry rather than academic exercises. The object of the lead teacher is to actively engage with children in experiential ways and, hence, to design open problems and creative tasks (e.g., "demonstrate the quantity of 1,000" and "imagine two alternate endings for this story"). Through the use of multiple assessments such as journal writing, performance, portfolio, and oral presentation, the lead teacher avoids 'unfairly' evaluating a student. Finally, the lead teacher introduces a big theme (e.g., patterns) and then examines the theme within a variety of contexts introduced into the classroom as a method of integrating disciplines. Teachers who live by postmodern tenets, we believed with some confusion, are constructivists who make allowances for open-ended projects that begin with student inquiry and end when the topic has been exhausted. They allow students to design their projects, uncover resources, and use diverse means for expression such as technology. Then, in the end, they celebrate both individualization and group unity. 'Saintly' postmodern teachers encourage autonomy and authorship, yet also revel in the undefined, chaotic, ambiguous, indeterminate means and ends. Postmodernism is considered to be the ideal 'conceptual' medium from which to view democracy and freedom. 
Creativity, we believed, could only exist in a democratic environment of this kind—as though there were no creative learners in the classroom prior to the advent of postmodernism. Postmodernism, in effect, seems to exemplify those things that everyone holds up as 'ideal' conditions for creativity to emerge: chance, collaboration, individuation, rigor and productivity. What must be said is that, in many ways, UES was exemplary, but it was not free from encountering many pitfalls and inconsistencies; and it was not anywhere close to the ideal we aspired to reach. Reasons for that are varied but principally point to the error of embracing one view while categorically refuting another. Worse, we were absolutely in the dark about other views that might possibly have enlightened our understanding. Despite the fact that every lucid philosopher, theorist and scientist has noted the need to balance methods—all inquiry faces the same problem, namely, the error of relying on either purely rational or purely empirical means—it would seem that as soon as the acceptance of polarities is suggested and embraced, another event arises to threaten its equilibrium. The world within which we practiced at UES had been imbued with postmodernism, in particular the philosophy of Jacques Derrida (1978). If truth were to exist, thought Derrida, and not merely be annihilated by relative truths (such as the outcome of existentialism), we could possibly endeavor to uncouple the subject from the object so that all structures arrived at by any means and deemed absolute were open for deconstruction. Science was just another narrative to be deconstructed. Yet within that view, obtained from a play on words, rests a pernicious event that resists reconstruction. It would seem that at UES, in effect, we found ourselves caught in the contradictory paradigms of constructivism qua functionalism and deconstruction qua poststructuralism. 
One view opened the door to practical solutions and the other threatened to undo all that was done by abstraction. It was no wonder, therefore, that the ambiguity and indeterminacy so extolled by postmodernists, which shook the very ground upon which we stood, caused a great deal of turmoil. In the end, a tension arose between traditional deterministic views and new views opting for indeterminacy, chance, randomness and chaos. To complete this description one more illustration is needed. It is said that the 'post' modern has appeared at times as "a world of disconnected present moments that jostle one another but never form a continuous (much less logical) progression" (Hayles, in Barth, 1995, p. 306). Indeed, classrooms at UES were constantly buzzing with multiple projects that were themselves multi-strands of various activities, some beginning, some in the midst of unfolding, and others ending—all signs of high degrees of 'creativity.' William Doll (1993, 1998) would have approved of the veritable smorgasbord of knowledge nestled in boundless resources. Students were free to find any space suitable for learning—the Internet (a newly acquired resource in 1995) and teacher librarians were readily available for the asking. With the exception of a few community groupings that were under the supervision of some of the teachers who had not completely embraced the new ideas, nearly the entire school mirrored the frenetic diversity of the outside world. As in the world that existed beyond the scholastic walls, there were individuals (i.e., students and teachers) who exuded leadership, enthusiasm, and industry. There were also individuals who were readily personable but displayed an average demeanor (i.e., neither overtly enthusiastic nor industrious), and there were those whose meager presence slid through unnoticed. The latter did not call attention to either their work or themselves. 
There were also a few students who possessed learning and physical challenges and whose behavior had to be monitored. There were those identified as possessing ADD/ADHD and there were those who were under the care of special needs personnel (e.g., autism and muscular dystrophy). Postmodernists, whatever they hoped would emerge from educational settings that rejected the Modern past and placed all truth in question, could not have envisioned the tension that indeterminacy, ambiguity and chaos would produce—a tension that bore its most compelling expression through the words of a wheelchair-bound twelve-year-old who said through her tears one day, "All I want is for people to realize that I'm sitting in a chair and to stop ignoring this fact and, at the same time, I want them to notice who I am." From an artist's point of view, with a zest for the new, the thrilling, and the edgy, life at UES was often romantic, but looking back, I'd have to agree with Barth (1995): it was frequently incomprehensible, especially in relation to creativity. Herein I begin to feel the impatience of the pragmatist, for language is not only a mine for our thoughts, it can also be exploited for gain. Unquestionably, erudite words are often used to gain leverage in a debate, but when the words employed throw one's own team members into confusion, there may be something wrong with their use. Despite our fondness for Doll's work, his rambling address given at the Contemporary Curriculum Dialogues conference in the fall of 1997 had left my administrator and me greatly perplexed. Upon exchanging cryptic notes, we wondered whether postmodernism held as much promise as we had imagined. Lacking the precise words of Doll's address, I will illustrate what I mean by using similar thoughts expressed by Lyotard, which Barth (1995) quoted with a lead-in commentary on "those passages of theoretical abstraction that makes my eyes glaze over with incomprehension" (p. 302). 
The postmodern would be that which, in the modern, puts forward the unpresentable in presentation itself; that which denies itself the solace of good forms, the consensus of a taste which would make it possible to share collectively the nostalgia for the unattainable; that which searches for new presentations, not in order to enjoy them but in order to impart a stronger sense of the unpresentable. A postmodern artist or writer is in the position of a philosopher: the text he writes, the work he produces are not in principle governed by preestablished rules, and they cannot be judged according to a determining judgment, by applying familiar categories to the text or to the work. Those rules and categories are what the work of art itself is looking for. The artist and the writer, then, are working without rules in order to formulate the rules of what will have been done. Hence the fact that work and text have the characters of an event; hence also, they always come too late for their author, or, what amounts to the same thing, their being put into work, their realization (mise en oeuvre) always begin too soon. Postmodern would have to be understood according to the paradox of the future (post) anterior (modo).

Essentially, Lyotard and others discuss the very central issues that make up the creativity problem, though to the average person it is just 'talk.' Lyotard suggests that postmodernism is creation ex nihilo; a creation that always begins 'too soon' because it will straddle the 'mod' and the 'outdated' simultaneously. It cannot depend on 'rules' but must 'present' the 'unpresentable,' ignoring 'taste' and a collective 'nostalgia.' In short, postmodernism has placed creativity in the realm of the bizarre and the fickle. How did we at UES interpret postmodernism? 
The world, according to some, neither figural and linear nor held by one center fraught with absolute scientific and social rule, had drifted toward fragmented knowledge and beliefs arrived at by chance, chaos and whimsy (Jencks, 1995). This is largely what we thought also, though McLuhan (1967, 1988) had said that the world, thanks to science and technology, was shifting toward a new ground—a return to an 'oral' society that was anything but fragmented. It is a return to the notion that fluency has a role in 'literacy,' and that perhaps fluency is as important as literacy in new media contexts and content (Gouzouasis, 2000, 2001). By virtue of the linearity of text, which is simply a continuous fragment, and prior to the advent of radio, beliefs had been governed by the detached, impersonal figure. The tyranny of text, as some have defined it, divided the world between part and whole. McLuhan (1967) contended, "The principles of continuity, uniformity, and repeatability derived from print technology have, in England and America, long permeated every phase of communal life. A child learns literacy from traffic and street, from every car and toy and garment" (p. 262). In other words, print technology had regoverned our "inner life into segmented visual terms" to follow an order and sequence of events. At least this much is shared with postmodernists, but, according to McLuhan, electronic new media had changed the Western world (if not the whole world) by shifting toward a simultaneous whole, an ever-present now without external sequence or order but with internal mysteries, much as it had done previously in oral society. How were these two disparate views to be interpreted? Postmodernists, in general, spoke of fragmentation to signify a plurality of ideas, a compartmentalization of multi-views resting in units of social diversity and disseminated via electronic media: TV, film, Internet, music, video, etc. (Pinar, 1995). 
McLuhan, by contrast, spoke of fragmentation as belonging principally to 'text' by virtue of its figural 'bits' strung in sequence, so that presently, thanks to new media, the world was reverting to a unified, decentralized, and simultaneous ground. Television and radio, for instance, express a detachment of simultaneous happenings with spontaneous interruptions of whole events—distant from the sequential movement of film, which is still so interconnected to text—apprehended by the eye and ear interchangeably. Along with TV and radio, we find that there are a great number of new media that offer discontinuous (or asynchronous), yet whole "bytes" of information: Internet, music video, CD, and so on. Are digital media and print media so different? From a reductive point of view, both media are similar in organization insofar as they were formed from a sequence of logical fragments. The content of each medium is apprehended with either the eye or the fingertip (although the content of digital media may also be apprehended through the ear) and each is made up from a schema of 'symbols' that may be 'transparent' to the perceiver. However, the similarities may end there. The printed word requires the individual to interact with symbols and phonemes (abstractions) to make meanings, whereas the digital medium delivers content that allows for a 'virtual' reality (i.e., visual, aural, tactile), and that does not require the individual to interact with its silicon atoms. One might view digital media as placing an individual on a virtual ground of consciousness. Although conceptual understanding (i.e., a conceptual end) of printed passages, images, and sound generally requires discursive means, according to McLuhan (1963), the content of digital media will 'neurologically' affect the individual with greater nuance—the tactile-visual-spatial processes needed in quick reactionary frames of reference. 
The printed word, in effect, involves more laborious steps for apprehension to take place, beginning with learning to decode symbols and phonemes. To comprehend the written word, an individual must learn to read the symbolic 'characters' unique to print as well as its 'symbolisms' (i.e., symbol and sign). Digital media's content also requires one to 'read' the symbolisms and hence is not 'immediately' apprehended if an individual has never before viewed visual-acoustic content. McLuhan (1963) offers the example of the tribal African who, in seeing a film for the first time, wonders why Americans live such 'discontinuous' lives. His comments were in direct relation to the manner in which a filmmaker chooses to cut out long passages of continuous time sequences, such as a person walking from one place to another. I question whether that would have been much of an issue for individuals growing up in the West with access to literary forms of writing such as the novel. The film medium, in effect, is very much like the novel in terms of stretching, shrinking, or cutting up time lines. The virtual world of television, film, photography, and even the Internet since the advent of animation, streamed sound, and video, in fact, resembles the 'real' world (and possibly the print world in certain instances). To comprehend the messages contained in photographs, film, TV and the Internet, Western individuals have a rich background from which to decode symbols and signs. Photographic images and sound are closely aligned with the real world, whereas graphic images and animations (i.e., buttons and icons) are designed to align with familiar 'stereotypes' that speed up the interaction between the person and the medium. Critical understandings of either medium are most likely mediated through discourse. One may interpret that postmodernists, like many others, focus merely on content. 
McLuhan, on the other hand, may be interpreted to have focused on both the medium and the message. As dissenters of new media, postmodernists most often deliberate on content, for there is a belief that social fragmentation is an outcome of global information that remains in the hands of a few powerful individuals and organizations. Those same dissenters, however, also accuse text of a similar offence, claiming that through a 'unified' historical narrative the written word has served to shape a collective conscience. McLuhan (1967), in contrast, agreed with Plato, who felt that poetry, with its "rhythm, meter, and music," was more to blame for persuading people than the content of the words that displayed "abstract, speculative reasoning" (p. 116). Again, it is not the content which causes world views to shift, though it would seem that way, but, rather, the medium. It would appear as though the postmodern content, also a medium, promoted the belief that reality is made up of abstract extremes, where creativity is the rebellion of form and knowledge is the momentary pauses in an otherwise unknowable world. But who would believe this message, even if the words had been comprehensible to the average individual? Clearly, there is something else that pushes the shifts in paradigms. Furthermore, the 'post' of "modern" offers something else, and since it is postmodernists who refer to it as such, postmodernism is more than the content that is examined. All evidence points to media. Even though a span of over forty years has passed since McLuhan first made his theories public, the impact of his message still has not found its mark. In truth, those who have paid careful attention are part of the multi-billion dollar industries that make up the sectors of new media use, i.e., film, TV, music, Internet and so on.

The Voice of Reason: A Critical Reflection

Which voice held more sway at UES? 
To be certain, the parts that we understood with some degree of clarity (albeit an interpretation) favored a world of chance, chaos and indeterminacy. Serendipity had become a much-loved word, for chance encounters seemed to place all things such as volition into question—a necessary consideration for those who challenge determinacy. We thought our role, in part, was to create a reunification of a fragmented universe (our interpretation of contemporary society), and we strove to oppose the effects of both text and new media by examining their content and socio-political causes. By siding with Hume, we believed the outcome to point to the cause, and we believed the cause to be chance. In retrospect, we paid lip service to notions like chance, for we constantly battled with the skeptic's dilemma—the denial of anything certain but the acknowledgement that while some things cannot be proven, they can carry a degree of predictability nonetheless. We assumed two things were being accomplished through our beliefs: (1) a positive step forward by turning our back on the past, and (2) a unity in vision. Sadly, we had not determined the role of the past. Moreover, McLuhan's axiom had made little impact on our understanding. Since it was postmodernist thought that influenced us the most at UES, the difficulty in executing its doctrine lay in its contradiction with the very premise of education, as well as that of media, never mind 'good sense'. We were obliged by a sense of accountability toward planning, implementing, and grading—to conform to tradition on the one hand and, on the other, to be spurred toward a new vision. We had a sense of the 'simultaneous' and the non-linear due to the new media that surrounded us, and we wanted to recreate that world. However, we neither comprehended the media nor understood their significance. Instead we embraced chance and chaos as the necessary prerequisite to order. 
That position was taken mainly because of the idea Ilya Prigogine (1984) had presented: 'order out of chaos.' Updating his position, we aimed toward a 'bound chaos'. We failed to note that the re-creation of chaos in the school setting is antithetical to desired teaching and learning outcomes as prescribed by policy. We did not realize that we could not in good conscience assume that 'the students will be able to' accomplish anything we had set out for them if we relied solely on chance as the determinant. Even our school mission statement, which bore a metaphoric image representing our vision of teaching and learning, tried to capture what was incomprehensible. The image did not represent the typical school 'schema' for learning: a Vygotskian scaffold or a Brunerian spiral. Instead, it showed a spinning, amorphous and chaotic vortex, replete with words spewing outward and tumbling inward. Like a black hole, it was an implosion of ideas and events—one that was predicted by McLuhan to be the very essence of new media. Even as we still spoke of scaffolding and spiral learning, we imagined the possibilities inherent in multiple entry points to knowledge, multiple means and ends, all colliding on the wings of chance to form simultaneously independent and dependent 'societies' or blocks of knowledge. In retrospect, we were not inventing anything new. We had fallen into the pattern of a modern literate society with its patchwork quilt sewn together by invisible boundaries, totally unaware that the quilt was systematically being unraveled from the inside out. Why did we give so much energy to chaos, the antithesis of order, so much meaning to the content of postmodernism? We seemed to fly in the face of the Canadian institutional motto: peace, order, and good government. 
It is most likely a gross oversimplification, but it was partly because we were ignorant of some important facts, partly because the educational theorists who sided with postmodernism were mainly American, and partly because the traditions of education seemed so repugnant—at least from our perspective. Had European communities taken postmodernism to heart? We were ignorant of the fact that scientific positivism existed in the first place as a refutation of the absolute confidence placed on knowledge obtained solely by means of rational thought. A reliance on truth that is determined a priori, either by common sense or deductive reasoning, oddly placed what was considerably subjective (i.e., resting in the mind of the individual) as objective by virtue of being detached from the senses—an absolute separation of mind and body. We failed to investigate the fact that to a 17th century rationalist philosopher, knowledge by means of the senses, and only the senses, had reached a state of incredulity. That was partly a result of religious beliefs, wherein definitive proof for the existence of God was not forthcoming, and partly a result of an overreliance by the common populace upon what Nietzsche (1974) referred to as "magicians, alchemists, astrologers, and witches whose promises and pretensions create a thirst, a hunger, a taste for hidden and forbidden powers" (p. 240). Scientific experimentation was suspect because it often contradicted common sense, which in appearance was considered a magician's 'sleight of hand.' Empirical inquiry, therefore, came under attack as readily as common sense approaches because each ultimately depended upon the senses for truth. Moreover, as McLuhan (1967) pointed out, the assuredness of the accuracy of mechanical objects, such as clocks and the printed word, drew attention to the fallacy of the senses.
Descartes' error, as Antonio Damasio (1994) called it, lay not in having distinguished between mind and body; rather, it was the absolute separation of each and the belief that the senses were inferior to reason. The split between mind and body left one with a purely mechanistic means to explain their interaction. In trusting the content of purely rational means over the content of purely empirical means, 'rationalism' dominated the means to truth. The reader will recall, however, that in the first chapter I question the notion that human beings are rational at all costs. Later, Kant comprehended that the two, reason and sense, could not be uncoupled—just as medium and message are interdependent, so too are the mind and body. Cartesian rationalism rested on mechanistic views because Descartes saw nothing but a mechanical relationship between the mind and the body. We carry a vestige of a mechanistic way of thinking whenever we refer to 'mechanisms' when we deal with organic phenomena. Present day organicists (Overton, 1984) are cautious whenever the term 'organic' appears to replace what was once termed 'mechanical.' That is because organismic models are developmental in nature. Those models make reference to the inevitability of evolutionary change found in organic substances, whereas mechanical devices can only demonstrate entropy. Moreover, any relationship between the mind and body that is viewed as mechanical remains static rather than always in a state of flux—the latter view is more consistent with evolutionary and developmental theories. At any rate, when rationalism became intolerable in light of its absolute denial of a material existence, there was an inevitable counterbalance with scientific Positivism. That resulted in another revolution (i.e., post-positivism), and so on. It is the long history of what C.S.
Peirce (1998) described as "the pendulum [that] has swung backward and forward between a more material and a more spiritual philosophy, from the earliest times to the latest" (p. 1058). And that historical precedence continues unabated in the 21st century. Despite the influence of philosophy and science on worldviews, education reflects those views to a lesser degree. That is because there appears to be a lack of coherent critical inquiry among many, if not most, educators. That lack of coherence keeps educational researchers from standing on the same footing as philosophers and scientists. Researchers in education appear to overlook the metaphysical in favor of epistemological concerns. Yet, anyone who seeks to understand the human phenomenon cannot escape the metaphysical. Perhaps to that extent, the UES environment was somewhat exceptional, since postmodernism attempts to address both metaphysical and epistemological issues. Unfortunately, without a stronger foundation in critical inquiry, discourse was open to anything we wanted it to mean. In the end, we applied postmodernist views, more or less, to epistemological considerations and ignored the metaphysical implications. Even when postmodern theorists spoke of the ultimate reality, it would seem that most of us glossed over that message in an effort to apply the content to teaching and learning.

A History of Mind: The Missing Dialectic

To begin a discussion on the mind, that part of the being where cognition rests, is as tricky as trying to find the starting point of a ball of twine. Our fascination with the mind most likely began at the dawn of self-awareness and, remarkably, of the millions of species on earth, the human being is the only one whose mind is fascinated with mind in and of itself. That human feat did not escape Descartes (1976), for he concluded that l'être, being of mind (spirit) and body (extensions of the mind), exists by thought alone.
If thought exists, so must 'I', regardless of whether 'I' co-exists with a body. Any objection to Descartes' logic that we may sensibly make today has been made since the publication of his works. Very little can be adjoined other than evidence that rests in modern science. The objection made by Hobbes (1976), for instance, centered on the state of the unconscious while dreaming: does 'I' still exist while dreaming? Like many theologians and philosophers, he questioned the tenet that perception, sensing or feeling, can be separate from the body. Also, the manner by which thought first comes into being was curiously unanswered by Descartes. Hobbes asked, "whence do you know, that you know, that you know, that you know" (p. 247)? From a materialist's point of view, Descartes' claim was confusing, particularly when he asserted that "activities, which we call thinking, e.g. understanding, willing, imagining, feeling etc.," agree "under the description of thought perception or consciousness" (p. 249). Hobbes pointed out that, insofar as imagining and feeling are not at all the same as understanding or willing, they are imprecisely defined under 'thinking activities' and, worse, as a derivation of the mind and never the body, a complete contradiction. In declaring that he is "a thing which thinks" (p. 173), Descartes was commenting on the fact that he is a conscious being: meaning that to think is to be conscious of self. Of course, making this assertion places humans hierarchically above other beings, also a fond argument for the religious-minded. From the humanist, whose considerate treatment of all beings is paramount, came the objections of Antoine Arnauld (1976), who wondered how thinking is separate from "corporeal organs, since we can believe it to be asleep in infants, extinguished in the case of lunatics" (p. 269).
Arnauld was particularly concerned with the distinction that is made of a being lacking in spirit (consciousness) from another equipped with greater faculty because of "those impious men whose aim is the soul's slaughter" (p. 269). The most Descartes offered was that "thinking is asleep in infants and maniacs—though not extinct, yet troubled" (p. 276). In effect, animals were not considered to have souls and, as such, were subject to unthinkable acts. For the atheist, it is not the 'soul' that is at issue, but rather the ability to reason and to be conscious of self. That sentiment has done little to change the mistreatment of animals in today's society. In any case, there were those who rejected that narrow point of view, like Pierre Gassendi (1976), who pointed out that animals are endowed with "nerves, (animal) spirits, a brain and a conscious principle residing therein, which in similar manner receives the messages sent by the animal spirits and accomplishes the act of sensation" (p. 280). He vehemently opposed Descartes' assertion that "brutes lack reason" by further stating, "while doubtless they are without human reason, they do have a reason of their own" (p. 281). Even if Descartes had allowed that animals have a reason of their own, the pure and abstract form of thought to which Descartes subscribed did not make this a relevant factor, for only the human mind "when communing with itself can experience the fact that it thinks" (p. 281). According to Descartes, any evidence that would prove otherwise could only be ascertained through the senses, a posteriori—since the body could not be trusted, its veracity would be dubious. Despite our objections to Descartes, prima facie, there is something that lies beneath the surface of Cartesian logic that elevates common sense. By common sense, I mean that notions are given credence without logical precision, somewhat akin to popular psychology.
Continued discourse on Cartesian views in the 21st century, after 360 years of deliberation, shows to what extent matters regarding the mind remain unresolved, or at the very least continue to resonate with society. In the first place, Descartes was accused by his critics of restating what Plato had already asserted—a recorded historical precedence being notably established, Descartes simply laid the terms out to the satisfaction of those in power, i.e., the Church. It is possible that, were religion to hold as much sway in today's democratic nations, Descartes' views would dominate over science. The division of mind and body, in effect, continues to be the central canon in religious contexts. Setting religion aside, however, we find evidence of the figurative splitting of mind and body in non-religious institutions and contexts, placing the mind over the body simply as part of a common consciousness not unlike ancient times. What is the cause of that split and reverence for the mind? McLuhan (1967) suggested that the post-literate individual would have reason to distrust the body. With the printed word, individuals no longer relied on their sense-memory to remember events. We find support in Socrates' tale, originally recorded in Plato's Phaedrus (Jowett, 1871, p. 844), of the myth of the alphabet's inventor and of Thamus, King of Egyptian Thebes. Theuth, a famous old god and inventor of many arts, including arithmetic, calculation, geometry, astronomy, draughts and dice, is said to have also invented the alphabet. When Theuth came to Thamus to show his inventions and inquire as to their value, he gave the highest praise to the alphabet, saying, "This will make the Egyptians wiser and give them better memories; for this is the cure of forgetfulness and of folly." To which Thamus replied:
O most ingenious Theuth, he who has the gift of invention is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance a paternal love of your own child has led you to say what is not the fact: for this invention of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. You have found a specific, not for memory but for reminiscence, and you give your disciples only pretense of wisdom; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome, having the reputation of knowledge without the reality.

It is not the body that is the extension of the mind, as Descartes supposed, but rather it is the artifact: all technological inventions, including the printed word. Moreover, all media are extensions of humans. According to McLuhan, the literate individual is the product of technological extensions of the mind and body. It would seem that reliance on technology atrophies portions of the mind (e.g., memory) and the body (e.g., eyes). With extended conceptual powers, the individual can perfectly record images, sounds, movement and thoughts without once relying on the imperfections and inaccuracies of the human body. Glenn Gould (1984), in fact, made that point succinctly when he removed himself from the concert stage to concentrate solely on musical recordings. Preferring the electronic sound recording to the live performance, he countered that the recording artist would be in control of the execution, while the listener would be in control of the sound source—each manipulating the medium to their degree of satisfaction. Of course, the one with the most knowledge of the medium was bound to have the most control.
Interestingly, Gould conducted several 'experiments' that demonstrated, to some extent, that we are bound by the medium we use. Setting up two groups of participants, Gould tried to determine whether individuals could detect 'cuts' or 'edits' in his recordings. One group of participants was made up solely of traditional music performers, composers, and critics. The other group was made up of 'media' related professionals, from radio and TV to sound recording, as well as laypersons without background in either media or music. The participants had to listen to several pieces of recorded music with 'cuts' or 'edits.' They were instructed to identify the 'cuts' that were thought to be unnoticed by the 'average' listener. Typically, the traditionally trained musicians searched for breaks in musical phrasing or dynamics or rhythm or any such thing that related to music grammar and syntax. In contrast, the non-musical group appeared to have relied on their sensitivity to the whole of the recording, as well as their knowledge of music engineering. The non-musicians were able to identify edits with far more accuracy than the musicians could. Moreover, upon learning that one piece was by a particular composer who was known to record without any cuts, one musician asked if he could change his first responses. Clearly, the two media, music and technology, despite their interdependence, operated independently of a traditional medium. Their effect on cognition was of a curious nature. As to the medium of the printed word, McLuhan (p. 117) posited that it is print that has heightened the need to "see for ourselves" and that "all kinds of shorthand systems of notation have been developed to help us see what we hear." Reliance on visual media means that "many people find it difficult to understand purely verbal concepts. They suspect the ear; they do not trust it," for, ultimately, children are admonished to "believe only half of what they see, and nothing of what they hear."
Individuals who study music formally, in fact, come to rely strongly on the manuscript, for written symbols render the illusion of correctness in play, as opposed to those who 'play by ear,' that is to say, those who depend on audiation, memory, and a 'sense' of music. According to McLuhan, it is probable that Plato's views were embraced in Descartes' time as a consequence of the heightened literacy that had penetrated the masses. Insofar as the printed word seems to alienate the sensory, it was felt as a separation of mind (i.e., logic) and body (i.e., sensory). In general, dualisms are best handled through intellectual discourse because speaking of the mind and body as separate entities cannot be eradicated from our actions or from our language. If there were no differences, however, there would be no relationship to examine. Nonetheless, the argument that centers on the absolute division of the two, as though one could exist without the other, forces one to link them by means of a mere mechanism. Once more, humans are organic beings in desperate need of an organismic view. In the end, the mind/body paradox is often a battle of beliefs that have been transformed by one or another medium. To wit, many who take issue with one side or the other will blame words as the fetter from which confusion arises. Hobbes asserted much the same when he accused Descartes of misusing the names of things to support his argument. Yet, Descartes contended that names or words had little to do with understanding, since it is what a name signifies, rather than its form, that matters. I believe that Descartes would have found much support in the opinions of Heidegger (1971), who contended that faulty understanding rests not on the word but on the person. "To be sure, people speak of immortal works of art and of art as an eternal value.
Speaking this way means using that language which does not trouble with precision in all essential matter for fear in the end to be precise would call for—thinking" (p. 79). It is interesting that in whatever direction the accusing finger is pointed, it is always pointed toward a medium: a person, the word, the symbol, the recording device, etc. Nonetheless, none of these media are ever directly dealt with, though their importance with respect to cognition is enormous. As a matter of fact, we must take note that language is never distant from thinking, as is so clearly evident when Descartes (1976) exclaimed, "I am, I exist, is necessarily true each time that I pronounce it, or that I mentally conceive it" (p. 171, italics mine). Language, as Hugo intimated, holds the key to our thoughts. The relationship between language and cognition, however, is poorly understood by those who should have a clear conceptual basis of knowledge, i.e., educators. Though language has been a hot topic for 20th-century philosophers, cognitive scientists and linguists, language is nearly absent from empirical studies on creativity; and, it goes without saying, creativity is nearly absent from language studies, except perhaps in a Chomskian sense of 'infinite possibilities.' Language, however, particularly the use of metaphors, icons, and symbols, is a hot topic among technologists who deal simultaneously with media, semiotics and creativity (i.e., software developers). If we possess the desire to 'make sense' of the 'illogical,' from the 'senseless' human act to the 'incomprehensible' natural phenomenon, we seem to have no other recourse but the dialectic with which to form our understanding. And language, in the form of a 'voice over,' clarifies a story plot in film. Language, in the form of a 'read me' text, provides 'additional' support to computer program use.
Language, through the use of 'absolutes,' provides a means to understand intractable political, economic, and social issues. The dialectic implies an exchange of thought, and educators who know of such things are very effective. Notwithstanding the dialectic method, language, in the form of a tetrad, may also facilitate greater understanding. The tetrad, therefore, will be explored in greater detail in the following chapter.

CHAPTER THREE

For as one may feel sure that a chain will hold when he is assured that each separate link is of good material and that it clasps the two neighboring links, viz., the one preceding and the one following it, so we may be sure of the accuracy of the reasoning when the matter is good, that is to say, when nothing doubtful enters into it, and when the form consists in a perpetual concatenation of truths which allows of no gap (Leibniz, 1982, p. 322).

Methodology: The Road to Knowledge

The following chapter necessarily deals with research methodology. However, the preceding chapters could be viewed as part of the methodology, with the following merely a continuation of the autobiographical method outlined in the first chapter. In effect, the intent to this point in the thesis has been to use "autobiographical writing to aid in the reinterpretation of theory in light of experience," whereas the following chapter may be viewed as the reinterpretation of "experience in light of theory" (Griffiths, 1994, p. 76). The critical method employed hereafter begins with the reinterpretation of two of my teaching experiences in relation to existing theory. Through the use of a tetrad, I attempt to examine the fundamental aspects of both constructivism and postmodernism. And by drawing out key contradictions contained within those two paradigms of thought, I was able to bring more understanding to the two teaching experiences that were examples of approaches to creativity in the classroom.
The retelling of those two experiences set the stage for revisiting the context of the Contemporary Curriculum Dialogues conference (1997), which also held an exhibit on 'creativity and technology.' The impact that the exhibit had on me stayed a part of my thoughts for some time, but became more prominent as I began to re-examine all such matters in preparation for this thesis. At that time, the exhibit had prompted a flurry of 'constructivist' approaches, which lasted throughout the school year. Based on the constructivist methods of Loris Malaguzzi (1993b), the founder of the Reggio Emilia schools in Italy, I had tried to emulate the notion of generative and autonomous learning through the 'atelier' or workshop in my Arts program. Notions such as 'child-centeredness' and 'personal construction of knowledge' imbued the classroom practice. As I retain from that experience a large record of video footage and photographs, I have been able to revisit both the children's work and their reflections from the interviews that I conducted. I was pleased with the varying degrees of success; however, I chose not to detail that account. Instead, I felt it was important to describe and analyze the theoretical influences on our practice. The reader is thus further entreated to follow a critical analysis of theories commonly held about teachers, teaching, and learning: teacher as facilitator, activity-centered learning, authenticity in the classroom, pragmatism, and multiple intelligences. The purpose of that analysis is to draw out the popular arguments that have attached themselves to those terms. I believe that the impact of those particular arguments has tempered both the level of awareness regarding creativity and the degree of critical inquiry. Clearly, ignorance cannot lead to truth; however, many educators and educational researchers would balk at the suggestion that they are ignorant of facts.
On the contrary, there is an abundance of information that has filtered into the schools based on theories and empirical studies conducted on matters of creativity and technology. The trouble, however, lies in the fact that educators do not know that they do not know. Those of us who taught at UES, for instance, were steeped in discourse and investigations. For the most part, professional development days were designed to explore current trends in curricular research. In addition, all of the teachers at UES were committed to carrying out action research—a form of investigation that has grown popular in educational inquiry.5 Yet, the obstacles to knowledge that we faced were more complicated than they appeared, and they hearkened back to the question posed in the first chapter: Do educators know upon which view they base a pedagogical decision? At UES, we felt that we were correctly basing our beliefs and actions on a form of constructivist thinking—a paradigm that had swept through the educational system, replacing 'traditional' teaching methods. In general, it can be said that music education has endured a lack of precision by employing approaches devoid of empirical evidence. Notwithstanding the number of empirical studies on creativity and technology as they relate to music education and other disciplines, there is sufficient information that can be drawn from cognitive and neurological studies refuting many popular assumptions made about either creativity or technology. This chapter, entitled 'Methods,' therefore, is as much a reflection of the 'methods' employed in the classroom as it is about the method of relating theory to experience.

Mysterious Methods: The Paradox Within Means

My colleagues at UES and I committed the gravest of mistakes. We became so preoccupied with philosophical content (i.e., postmodernism) and instructional content (i.e., themes) that we simply forgot the role that medium plays.
We thought little of methods other than to suppose that a method was either mysterious, serendipitous and stylistic (the creative pedagogy) or traditional and moribund (the pedagogy that followed the recipe). The former was intuitive and had a spiritual, organic quality (whatever we understood those things to mean); the latter was rigid, mechanical, robotic, distant and fragmented. All such talk reminded me of the beginnings of modern dance pioneered by Isadora Duncan, the first American modern dancer. Duncan had rejected traditional ballet for its entirely un-organic style. She viewed ballet as having reduced beings to robots: it separated the mind from the body by denying the body's natural gesture, and the body from the ground by imposing hard, wooden toe shoes. Believing herself to be a mystic, she tried to project a spiritual oneness with Nature. In recreating the ancient Grecian mythical forms, barefoot and loosely clad, Duncan succeeded in liberating future dancers from the rigors of ballet to shape what has become known as modern dance. Moreover, coinciding with the suffragettes of the era, female dancers freed themselves from the clutches of the male ballet master. Yet, for all of Duncan's contributions to modern dance, her methods could never be duplicated, for they were elusive and utterly lacking in form. Dancers were simply told to feel the dance. Anyone without formal training could feasibly become a modern dancer, just as anyone between the 1950s and 1980s who knew of experimentalist John Cage could feasibly become a musician. But trained dancers eventually tired of this formlessness and opted, instead, to borrow aspects of the ballet while playing with interpretations of world and ethnic dance forms. Modern dance emerged from under Duncan's shadow of 'pastiche' to produce luminaries such as Ruth St.
Denis, Ted Shawn, Doris Humphrey, Jose Limon, Martha Graham, Merce Cunningham, Twyla Tharp, Katherine Dunham, and Meredith Monk, all of whom have applied rigorous techniques and developed methods of practice that are as arduous and demanding as the ballet. Like other experimentalists, however, Duncan could be considered a groundbreaker. In that case, was Isadora Duncan a 'creative genius'? If groundbreaking work is synonymous with creativity, there is a possibility that the honor must be awarded irrespective of how bizarre the product. To that end, we seem to be left with some cynicism and a need for more clarification.

Constructivism: Its Claim to Being a Creative Approach

Constructivism is a term, not surprisingly, with many connotations, and it has evolved into many different branches of thought. When I speak of constructivism in the following pages, I speak of the assumption that was shaped principally by postmodernists. However, the use of the word 'constructivism' often implied numerous things that originated in numerous perspectives. Missing from our knowledge at UES was the organismic model of constructivism. At any rate, the postmodern perspective, originally rooted in Dewey's theories, evolved as an extension of the post-structuralist, deconstructivist paradigms. Hence, constructivism from a postmodern perspective carries with it a 'construction-deconstruction' model. As a result, it is rooted in a Marxist model (originally stemming from Hegel) whereby a revolution is expected to occur that will tear down established constructs, be they social or otherwise. The organismic model of constructivism, by contrast, involves careful attention to cognitive development and the need for early exposure to natural and fluent activity. For instance, in the case of music development, the organismic model of constructivism would strive to introduce both formal and informal music activities as soon as the child is born.
That approach corresponds with the natural context of linguistic development (Gouzouasis, 1991, 1992). Free from the Marxist perspective, the 'social' aspect of the construction of knowledge would be no more malevolent than the natural socialization that is required of early language development. And it would not require a 'revolution' to undo its constructs. Moreover, the organismic view of constructivism is not weighted down by American pragmatic ideals. The construction of knowledge, per se, is not in response to practical or teleological outcomes. In the following two incidents, I describe the models of constructivism most frequently applied in the classroom. The outcome of their application proved to be disastrous in terms of both the learner and the teacher. What I gained from revisiting those two incidents was an understanding of how profoundly we are affected when we fail to critically examine a belief or a theory. Moreover, I was given a clearer view of notions attached to 'exploratory learning' and the 'personal construction' of knowledge.

First Incident.

I began to platoon into one of the learning communities as a member of the Creative Applications Team. A learning community at UES signified one family grouping of 150 pupils from the first to the sixth grade. The family groups were divided into two smaller groups: Division I consisted of grades 1-3 and Division II consisted of grades 4-6. There were five teachers working together, and with the addition of the specialist, the children could be placed into varying groups according to need. The community metaphor was an important aspect of our process as it tried to emulate a democratic society filled with independent learners working interdependently.
There were three communities at UES, called East, West, and Center (named after their geographical locations) and, though all 450 students engaged in music education during certain intervals in the first four days of the week, Fridays were set aside for three specialists to each join a community engaged in core subject areas. Our task as specialists was to find ways to integrate Arts, science and technology into core topics (e.g., math, language arts) and, at first, we did little more than observe the processes in which the children were engaged with their regular classroom teachers before venturing to undertake a role. There was, in most instances, a standard flow of interaction through 'dialogue' and activity. The first time I stepped into Center Community, I wandered over to one small group of nine-year-old boys. They were obviously having difficulty understanding the meaning of a song written by Canadian songwriter Gordon Lightfoot entitled The Canadian Railroad Trilogy. Though the purpose of the activity, as their teachers had designed it, was to inquire into its meaning (a very postmodern way of approaching text), the boys were doing no more than circling familiar words or tapping and drumming their pencils impatiently. As I queried them, it became evident that the language rose well above their level of comprehension and precluded any meaningful discussion. With that in mind, I began to dramatize some of the events in the song in an attempt to clarify certain passages or to 'concretize' the meaning. It seemed to be working well enough; at the very least, the boys and I were having great fun as we laid the railway tracks, felt the hot sun, swatted the mosquitoes and cringed from the railway bosses. In the midst of our play, I was suddenly met with hostility from one of the regular classroom teachers. Pulling me aside, the teacher chided me for interfering with their critical thinking processes.
Though we argued for a few minutes regarding the complexity of the task, it was clear that the tension between us stemmed from two differing banks of knowledge. I argued that the language was far too difficult without some mediation and the teacher argued that I did not trust the process of critical inquiry, believing that the task was well within the scope of the children's ability. Reflecting on the incident in my journal that evening, I decided to share my writing with the teachers in Center Community the following day. Though it eased some of the tension, not much more was understood of the differences we held. The following excerpt was an expression of my sentiments. Two days have I 'parachuted' into Centre's Humanities studies. The first day was a little daunting. With the brief (or maybe not so brief) dispute I had with B..., I realized, when we talked later, that trust is fragile and difficult to build in a short period of time. Nonetheless, reflecting, and perhaps exchanging reflections, may help the process; hence, I am writing tonight as part of the process of becoming integrated into the work. B has spoken of American schools "packaging" new theories, and I witnessed this to be the case in the satellite broadcast of the Multiple Intelligence Symposium. It is more than just packaging, however; it is the very notion of institutionalization (which springs from isms). Challenged by the philosophies of constructivism, it appears that at times we are racing to push learners into the state of critical, creative thinking. Funny, is it not? We hurry up and push the learners, which is paradoxically unconstructivist in nature. As we deconstruct the old ways, as we construct the new, are we becoming dogmatic? Institutionalizing means pushing an ideal so that it turns onto itself (witness communalism vs. communism).
Yes, yes...we want learners to gather information, to listen to stories, to make connections, to reflect, to respond and then to brilliantly find new meaning. We do all these things because we are in a hurry to respond to the new age. We are revolutionaries. And we do not tolerate it when others put brakes on our convictions. We do not even trust each other when we come together. (We ask, is my way of constructing your way? Will you undo what I have tried so hard to do? Somehow I envision calling myself a deconstructivist!) Do the means (the teacher imposing constructivist methods absolutely) justify the end (enlightened problem solvers and creators)? Is there a place for slow transformation? Is there room for tolerating traditional pedagogy? What of mistakes by learners or teachers? Furthermore, a constructivist [teacher] who does not treat a learner with respect is no more valued than a traditionalist [teacher] who does the same. A constructivist who is quick to anger when a learner has failed to inquire is no more enlightened than a traditionalist who is quick to anger when a learner has failed to recite basic facts. A constructivist who believes too highly in their understanding has no more humility than a traditionalist who guards knowledge. My reflections were filled with doubt, possibly disdain, for constructivism. It had become that 'method' Malaguzzi (1993a) described as "intolerable, prejudicial and damaging." It was conceived as the means to explore issues of 'class' struggle and ended up exploiting individuals in yet another struggle. I was reminded instantly of its central tenet while coincidentally helping my daughter with a school report on the Russian revolution. Hegel influenced Marx to regard competition (i.e., the marketplace) as the cause of social injustice.
Entirely ignoring evolutionary tenets first expressed by Darwin (i.e., competition, cooperation and natural selection), the Marxist view failed to take into account the 'internal' workings of the individual (i.e., from an abstract, mind-based, metaphysical viewpoint). My reflections were thus laid out in the open and that provided an opportunity to discuss our differences. But, though our feelings were assuaged with talk, suspicion had been raised. No matter what my misgivings were, however, I could not in good conscience turn away from constructivism altogether, for I feared the alternate view. It was an alternative described by Alfred Whitehead (1982), wherein he observed that where "all the time pupils are hard at work solving examples, drawing graphs, and making experiments, until they have a thorough hold on the whole subject," they are never made "to feel that they are studying something, [but] merely executing intellectual minuets" (p. 255). Sadly, the alternative had won an unprecedented revival during the back-to-basics movement of the Reagan years of US administration (1981-1989). It appeared to loom upon us in ominous and threatening ways, as it continues to do even in the 21st century. All such sentiment, in retrospect, has led me to note that postmodern constructivism was based upon socio-political struggle, instead of upon a cognitive developmental model. Should the reader feel predisposed to viewing teaching and learning from a Marxist position (i.e., phenomenology), further clarification will be necessary to strengthen the argument that socio-political struggles have little if any bearing on pedagogical choices.
First, I will state the position that it is not culture, race, gender, socio-economic status, or religious or political affiliation that ought to impact the decisions a pedagogue must make with respect to improving the quality of thought—though any one of those areas impacts educational policy as a whole and may, in fact, highly influence teachers' thinking. Rather, it is the degree to which the learner is cognitively and emotionally developed that must be the first and foremost deciding factor. That is not to say we must ignore social factors, either in the context of teacher or student development. The socialization of the child, in fact, is a key factor for cognitive development to arise in the first place. Children raised in isolation will most likely display abnormal cognitive capacity. Barring that extreme, however, the question is always, "Is the child ready and able to understand a particular 'concept?'" It is clear that there are many biological and social factors that make up the whole of the individual, but pedagogy cannot be based entirely upon social factors. It is understandable that social factors affect curricular content insofar as they relate to justice. But those are content issues, not methodological issues. Content merely becomes another medium (i.e., the lens of social justice) and the message. Addressing issues of social justice is an important dimension of education, for as would be expected, those factors contribute to a sense of sorrow and empathy toward the human condition. Nonetheless, to focus solely on those issues is to ignore the essential needs of an individual's development. It cannot be overstated that to provoke and enable cognitive pathways (i.e., intellect, passion, volition, sensory awareness and so on) teachers need to focus on cognition. Though saying as much may appear redundant, it is not. Nowhere has it been shown that knowledge of a critical nature is acquired without direct focus and application.
In effect, most arts educators (including myself) who work with inner-city children hold the view that the ghettoization of the arts has been in direct response to viewing the arts as either a cushion to heal wounds caused by social conditions or a motivation for underachievers, rather than applying the arts as the foundation of critical thinking. That developmental foundation includes a focus on aesthetic and emotional maturity. It is also clear to many educators that technology has replaced the arts as the new means of motivation and social cushion—all of which has been a grave misconception of art and technology. Those matters, however, cannot fully be explored in the scope of this thesis. Moreover, the preceding incident clearly shows a fallacy in teaching. The poetry of Gordon Lightfoot, with its reference to the social injustice committed against the Chinese railway workers, was never going to penetrate the boys' consciousness without first attending to their stage of development (i.e., cognitive and emotional maturity). Irrespective of how that stage of development had been shaped, though not discounting the many factors, the boys depended on a pedagogy that addressed developmental issues before either social discourse or methodologies based on social discourse. And surely educators will agree that what occurred in the first nine years of a child's life is far beyond our role as educators to resolve. While educators can work in tandem with key professionals who can address psychological, medical, and social concerns, the classroom teacher is not equipped to assume all of those functions. The organismic constructivist would venture that the best teachers can do to improve the quality of thought is to assess where a child is situated and to approach learning from that point forward. Next, I can strengthen my argument by enlisting the 'laws of media' that McLuhan theorized would bring clarity to what is frequently transparent to the individual.
The reader will remember that I introduced in the first chapter a theoretical construct for the laws of media posited by McLuhan and McLuhan (1988). Accordingly, "The laws of media, in tetrad form, bring logos and formal cause up to date to reveal analytically the structure of all human artifacts" (p. 127). In other words, McLuhan claimed that any artifact—hence medium—such as an idea or product could be run through a tetrad to uncover its effects. All artifacts (i.e., media) are inventions that extend human thoughts and senses. The tetrad is made up of four laws (i.e., causes): an artifact enhances, retrieves, reverses into, and obsolesces. Thus, the laws of media will most certainly appeal to the philosopher (i.e., postmodernist) who has "engaged in an attempt to get at the hidden properties or hidden effects of language and technology alike" (p. 128). In other words, its dialectical content having been shaped by the artists and philosophers who wished to break free from the tentacles of an established, albeit Modern, discourse (i.e., moral, social, political, and aesthetic), it is a medium intended for analytical thought (i.e., the revelation of hidden meanings such as the identification of paradox, irony, symbol, metaphor, literary device and so on). By running postmodernism through the tetrad (see Figure 1, p. 139), I have analyzed its properties as a medium. According to McLuhan (1988) there is no correct way to 'read' a tetrad, since each law occurs simultaneously. "But when 'read' either left-right or top-bottom (Enhance is to Retrieve as Reverse is to Obsolesce, etc.), or the reverse, the proportions and metaphor- or word-structure should appear" (p. 130). In keeping with the tetrads that McLuhan and McLuhan (1988) explored in their book, additional quotes (called 'glosses') have been added to one or another law to poetically heighten some of the terms captured.

POSTMODERNISM
Enhances: meanings, significance, relationships, rhetoric, grammar, perspective. Gloss: "In a world of media, the contrast between reality and fantasy breaks down and is replaced by a hyperreality, a world of self-referential signs... What remains is signs referring to other signs, texts referring to other texts" (Kvale, 1995, p. 19).
Reverses into: indeterminate anarchy, deconstruction, solipsism, nihilism, entropy, cynicism. Gloss: "Today the very idea of self is under siege" (Zweig, 1995, p. 146).
Retrieves: traditional forms, thesis, construct, socio-political discourse, revolution, irony, scepticism, historical figures. Gloss: "So the ironist thinks of logic as ancillary to dialectic..." (Rorty, 1995, p. 106).
Obsolesces: structures, Euclidean logic, order, sequence, linearity, product, synthesis. Gloss: "A figure is that which is contained by any boundary or boundaries." QED (Quod erat demonstrandum) (Book I, Def. 14, Elements by Euclid).

Figure 1. Tetrad: Postmodernism

Second Incident. When I was given the opportunity as a graduate student to teach education students (pre-service teachers), my enthusiasm got the better of me on the first attempt. I wanted to share the constructivist culture that had been a part of UES, believing that many steps we had taken moved us toward a new frontier of teaching and learning. Notwithstanding, to my surprise, I managed to perpetuate what I had criticized (as described in the first incident above). With the collaborative efforts of a co-teaching partner, we structured an environment steeped in what we understood to be postmodern constructivist ideals. We envisioned a deliciously creative space filled with stimulating activity and sufficient ambiguity, chaos and exploration to foster the imagination. At first, it was pleasant to discover that there were like-minded individuals teaching in other provinces (removed from my point of origin). On the other hand, it was unnerving to discover that our ideal did not appeal to our adult students, most of whom had little to no music background. Gratefully, we had encouraged reflective writing.
If it were not for that feedback strategy, we could never have adjusted our approach and the course would have failed more miserably than it did. With our enthusiastic ideals in tow, we introduced music through singing, moving and playing instruments to a combined number of 78 students from the two course sections (mine and my partner's). On the first day, we outlined the course objectives, modeled the first compositional project, encouraged journal reflections, and finally, expressed the desire to experiment with team-teaching at the undergraduate level. My partner and I concluded after an exciting, albeit exhausting, morning that we had managed relatively well under the circumstances. By the second class, however, I was overwhelmed to find out how far off the mark we had really been. Upon reading the student reflections, my enthusiasm for our course of action was suddenly derailed. In the following samples, extracted from three of my former students, there is a noticeable discomfort in the tone of the responses. The discomfort stems from a combination of the subject matter (i.e., music) and the events that transpired. Student (1): As I walked into my first music class today, I felt overwhelmed. First, by the subject itself and second by the sheer number of students in the class...I have some concerns about this course and these have to do with the instructors' efforts to team teach. Student (2): At the beginning of the lesson, we started singing songs that I had never heard of, reading from a fuzzy overhead screen in a room full of strangers, including the two unfamiliar ladies standing at the front of the room. Student (3): I believe there are many issues within this class that concern the whole group... It is disappointing for me to see a class such as music leave a negative imprint on me. The next class, despite our adjustments, was not much better. Once the negativity had settled in, there appeared to be little that we could do to turn the mood around.
No amount of teasing out 'creative' activities or employing 'constructivist' means could change the fact that the postmodern views simply did not sit well with students. In the following, I gauged the reaction of one of the students who had responded especially negatively to the first day by tracking her sentiments. Student (3): The station travel was a major confusion as far as I was concerned. What I found is that it was movement from one place to another with little time given to understanding the concepts. I especially find the music lab frustrating. It is a worthwhile resource for teaching music, but time is a factor. I would find it more useful if one whole class was dedicated to this. Without the means to turn back time and 'start over,' I opted to try the one thing I knew as an artist I could do to try to 'fix' (or at least ameliorate) the situation. On the fifth day, I arranged for my group of students to enter a dimly lit room where I had placed chairs in a circle. There I invited everyone to 'relax' to the music I had put on in the background. Once everyone was settled, I began to read from the pages of my journal from my early years of public school teaching. I recounted the story of a student whose gifted drumming skills had elevated our classroom experience—and whose own personal struggles with school had improved when he found himself in a leadership role. The student left an indelible impression on my mind during those first years of my teaching career and my view of the power of the arts had been greatly affected because of him. I wanted to share that transformation with them. It was a quiet hour of contemplation and as the music gently soothed ruffled nerves, I too felt a great calmness. The responses reflected positively on that small effort to reach out to the class and rebuild broken bridges from the depths of the human spirit. Student (3): The introduction to the class today was an important form in which to begin the class.
A form of meditation gets us in tune with the surroundings and the goal of a particular course. By using this it transformed myself from this stressed out student to one in which I was able to concentrate and enjoy the purpose of music. Student (4): I found the class today very uplifting and rewarding. The weather was dull and stormy as I left for home this morning (at 7 a.m.). I was hoping that music would lift my spirits, which it did. I appreciated the class outline, the calmness and thoughtfulness in the class today. The readings and quotes were poignant and full of thought. In retrospect, any breakthrough I may have had was accomplished through the medium of artistic expression as a passionate response to the situation. The desire to redeem myself, and the arts, motivated me to do something radical. However, it neither advanced the teaching of the arts nor defined a pedagogical approach for creative endeavor. At best it was a model of what the arts have to offer when they join together in concert to fill the human spirit. Re-visiting my initiation into teaching adults, it had become apparent that postmodernism and, more specifically, a postmodern constructivism as method (i.e., medium) cannot (a) stand up to the power of artistic expression—a far more palatable medium rooted in the human condition; and (b) facilitate the intellectual and kinesthetic operations required of music understanding. Compared to the sensory dynamic of the arts—an explosive awakening of all the senses when sound, images, and movement are interconnected—the abstract voice of postmodernism fails miserably in a teaching and learning context. In addition, careful observation reveals how deeply the arts may become a pedagogical process. I do not mean the arts as curriculum per se, but as pedagogical choice. In other words, I mean a teacher's ability to apply content artistically in ways that can uplift, inspire, motivate or otherwise awaken the spirit.
Artistry is proof that art plays a very profound role. One cannot be scientific without science any more than one can be artistic without art. Moreover, artistry amply proves that we cannot be surprised when children prefer entertaining forms of education, such as TV and computers. We do not yet understand all of the implications that new media have for cognition (i.e., the degree of sophistication) but we are well aware of the boredom that settles into a regular classroom environment. If nothing else, boredom should alert us that the mind is seeking to be stimulated and is finding nothing new. Postmodernism, however, is not the only issue of contention. The notions embedded within what we perceive to be constructivist methods are also of importance insofar as constructivism relates to a creative pedagogy. Post-war educational reforms in North America coincided with the advent of American pragmatism, led by John Dewey, and the Humanist movement, led by Abraham Maslow. As a result of a 're-building' sentiment, North American education began to embrace constructivism. During that period of re-birth, educators abandoned rote methods of learning and approached subjects with the idea that learning was best accomplished when the outcome had a practical application. Maslow's hierarchy influenced educators to address the needs of the individual and to motivate each one to attain self-actualization. The humanist movement pushed for racial tolerance, individualism, multi-culturalism and many other social reforms in the classroom. It is probably self-evident but worth pointing out nonetheless that the middle of the 20th century marked the most significant social, political, scientific and methodological reforms of the century. The accelerated pace of science, in particular physics, astronomy, and mathematics, that reverberated throughout America came as a result of the ensuing post-war temperament.
With the insecurity of the Cold War and a race for political power, Americans did their utmost to recruit some of the best scientists and mathematicians in the world to their universities and other non-profit organizations such as RAND (research and development). Coinciding with this recruitment was a desire to 'develop' indigenous minds with high abilities. To do so, psychologists needed to study intelligence and creativity to try to determine the factors necessary for fostering genius. Thus, it was and has been largely believed that constructivism is the means by which children develop creative ability. After all, to construct is to create. Insofar as children constructed their own knowledge, there was a sense that educators were finally taking a step back from indoctrination to allow the imagination to flourish—a tenet also espoused by postmodern curriculum theorists such as William Doll (1993). Nevertheless, there are a great many paradoxes embedded within, the first of which involves the interpretation and application of Piaget's developmental theory. The area of cognition on which Piaget spent his entire focus was intellectual development (i.e., reason). There is a tremendous tension to be found when constructivism is allied with postmodernism. First, postmodernists as a rule do not concern themselves with issues of intellectual development, as had Piaget, from a biologically developmental point of view. Their concern is rooted in the social psychology of hidden messages. Second, the constructivist applies techniques (i.e., tools) to build a structure whereas the postmodernist seeks out the techniques to undo it. The constructivist has in mind the building of a better 'thing' whereas the postmodernist has in mind showing why it is faulty. The irony in coupling constructivism with postmodernism is that the former is a synthetic medium and the latter is an analytical medium. They are, for all intents and purposes, at cross-purposes.
By running constructivism through a tetrad (see Figure 2, p. 145) we can begin to note the tension. The tetrad serves quite well to demonstrate that every medium contains paradoxes. Paradox is the tension of contradictory elements, and the reader will recall that truth emerges from that tension. That is to say, truth is the revelation of the tension that is paradox. The tetradic laws, in and of themselves brilliantly counterbalanced, reveal the contradictions inherent in any medium. Though the tetrad is not a scientific device (that is, McLuhan did not conceive the tetrad to be philosophical but rather more akin to rhetoric and grammar), the juxtaposition of the laws 'enhances' the lens through which to view any medium.

CONSTRUCTIVISM
Enhances: freedom, creativity, function, self, intellect, awareness, sense of worth, cooperation, independence, invention, product, concept. Gloss: "Constructivism as a concept helps to find solutions to a problem" (Brooks, 1990, p. 3).
Reverses into: ambiguity, post-positivism, post-structuralism, postmodernism, technocracy, authority, ideology, new doctrine, percept. Gloss: "students, teachers, administrators, and parents all evidenced a malaise that is still apparent" (Doll, 1993, p. 278).
Retrieves: idealism, inclusiveness, cultures, realism, process, art, life, community, relationships, democracy, imaginings.
Obsolesces: abstraction, paradox, uncertainty, randomness, metaphysics, complexity, critical analysis. Gloss: "By creating a big tent in which everything that is not 'chalk and talk' masquerades as constructivism it becomes impossible to talk about the differences between enabling, inclusive, student-centered instruction, and forms of teaching that may look student-centered and process-oriented, but are fundamentally authoritarian and traditional underneath" (O'Loughlin, 1995, p. 9).

Figure 2. Tetrad: Constructivism
When media intersect, as they have with the discourses of constructivism and postmodernism, the layers of paradox are dense and convoluted. Add to that the teacher-medium, the classroom-medium, the student-medium, the art-medium, and the whole matter becomes too filled with swirls of dust clouds to see accurately. The mixing of all the media takes on the effect of a tornado, and as I mentioned earlier, it takes a great deal of effort not to be swept up in its powerful force. Awakened From Slumber. The media that had brought me to where I was situated, as an artist and teacher, must have been filled with paradox. Whether they had been concrete life experiences (i.e., performing in the arts) or study (i.e., philosophical, literary, scientific, artistic), all media seemed to incur dramatic tension. The tension we felt at UES, for instance, came from any number of media, and the unresolved conflicts (i.e., pedagogical, philosophical, technological) created pressure that was none too kind or gentle. UES, in fact, was a tornado of events, a virtual implosion, much as our mission statement proudly stated. To make matters worse (if conflict is seen as 'worse'), it did not help that in the midst of revering a postmodern perspective, creativity was demanded from learners and teachers alike. Creativity, however, was encouraged within the context of a postmodern ambiguity. In effect, my administrator made certain that ambiguity was the reality at UES. Any comments to the effect that in the past 'so and so' was done like 'such and such' drew a scornful reply that 'such and such' would not work around 'here.' No matter how much my administrator was pressed, however, she never clearly identified how things were done around 'here.' It was more or less a hit-and-miss operation in which, thankfully, I managed to frequently achieve a bull's-eye success in my second and third year of employ. Others, however, were not so lucky.
Without the benefit of some guidance, people felt nervous about implementing new ideas. In retrospect, I can confidently see the folly of that kind of leadership thanks to a questionnaire I gave to my university students some years later. When asked to identify the degree of ambiguity they would need to be 'creative,' an overwhelming majority responded that they would need established boundaries to be defined, but would want to be given permission to break beyond them. For me, the effect of removing the traditional teaching boundaries at UES was as exasperating as it was thrilling. I practically thrived under the tension (provided it was of an intellectual variety), but others simply crumbled. My administrator eliminated long-term planning, prescribed learning outcomes, scope-and-sequence approaches, special needs programs, and any method that appeared 'traditional' in nature. In music education it meant doing away with either Orff-Schulwerk or Kodaly.6 My administrator's encouragement of new approaches to teaching paved the way for me to implement Laban movement from my dance background, drama from my theatre background, and music and film composition from my technology background. I was lucky to have had so many experiences! It would seem that technology was our ticket to the future. In effect, technology was revered and elevated to prime importance (the last visit to UES in 2000 showed a computer lab with 36 iMacs)—though with no more understanding than our 'constructivist' methods. With increasing curiosity, we pounced on the opportunity technology presented to become leaders in our field. And it was in the midst of all the turmoil I faced at UES that I gratefully accepted the opportunity to attend the Contemporary Curriculum Dialogues conference (1997). Upon entering an exhibit displayed at the Glenbow Museum in Calgary, I was greeted by vivid expressions of children who had been nurtured, so it seemed, under the paradigm of postmodern thinking.
The program read, "The exhibit documents The Reggio Approach through sequences of photographs, children's original work, transcriptions of their dialogues and reflections by educators. This documentation shows how creating a favorable environment, giving careful attention to the children's true interests, and trusting their potential lead the children to represent their ideas and their learning through many languages beyond words." Indeed, placed throughout the surroundings, the children's handiwork was delightfully expressive in multiple forms. There were drawings and paintings, computerized LEGO models of moving structures and gadgets, animations, and other such applications of art and technology. Much of what I saw fascinated me and, while looking upon each of the exhibits, I meditated upon my own practice, pleased that I had kept a record of my students' work (albeit not as detailed but equally revealing). I appreciated the value of documenting and keeping a record of events from the point of view of inquiry, and readily agreed with Malaguzzi (1993a) when he said, "Responding to all of these demands requires from teachers a constant questioning of their teaching. Teachers must leave behind an isolated, silent mode of working that leaves no traces...Instead they must discover ways to communicate and document the children's evolving experiences at school" (p. 63). While reflecting upon the computerized models and gadgetries powered by LEGO LOGO, I recalled the pleasure I had had animating with the use of LOGO. Pioneered by Seymour Papert (1980), LOGO is a computer programming language designed for school children. Although I had used other applications for animation purposes, here were the new developments of LOGO, and I felt regret that I had not extended it in my practice, for it seemed such activity must affect children's thinking in some way.
As an adult, I recalled grasping the essence of Boolean logic, with its conjunctives AND, OR and NOT, as data gateways. I recalled understanding the difference between processes that were (1) top-down and (2) bottom-up, strong metaphors of (1) deductive and (2) inductive reasoning respectively. I doubted those ideas could be formed by children, of course, as I had some understanding that children engage in activity without the fullness of meaning—a vestige of evolutionary development perhaps similar to linguistic development. If children express via "a hundred languages," as the conference statement suggested (i.e., a reference not to language, per se, but to varied media), I wondered about the innate structures (i.e., assuming Chomsky is correct) that allowed the temporary use of complex systems without complete awareness of the function of those systems. In effect, developmentally speaking, children's linguistic use has been shown to do just that. Surely if evolution was a factor in establishing linguistic structures barely discernible to the young mind, it could not be a factor in the establishment of the structures of complex new media. Their fluid use for expression seemed too quickly acquired by comparison to what must have taken millennia to achieve linguistically. But the facts seemed to stare us in the face: children (without much help from adults) manipulate technologies with ease, finding numerous ways to express and create. The questions that seemed to prick at my consciousness at that time were: "What role does computer technology play in the development of a young child's mind?" And "What are we failing to understand with respect to the interplay between the human being and the computer?" The whole matter was unsettling and became even more so upon exiting the room, when I came upon an intriguing quote (author unknown) that had been framed at the entrance of the exhibit (see Figure 3, p. 150).
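The two notions recalled above, Boolean conjunctives acting as gateways on data and top-down versus bottom-up processes, can be made concrete in a minimal sketch. The following is a hypothetical illustration written in Python rather than LOGO; the instrument records and the summing procedures are invented for the example and are not drawn from the exhibit or from Papert's work.

```python
# Hypothetical sketch: Boolean conjunctives as "data gateways,"
# plus a top-down versus a bottom-up computation of the same result.

records = [
    {"name": "flute", "wind": True, "electronic": False},
    {"name": "synthesizer", "wind": False, "electronic": True},
    {"name": "organ", "wind": True, "electronic": True},
]

# AND, OR and NOT each act as a gateway that admits or blocks a record.
both = [r["name"] for r in records if r["wind"] and r["electronic"]]
either = [r["name"] for r in records if r["wind"] or r["electronic"]]
acoustic = [r["name"] for r in records if not r["electronic"]]

# Top-down (deductive flavour): start from the whole problem and break
# it into smaller sub-problems by recursion.
def sum_to_topdown(n):
    return 0 if n == 0 else n + sum_to_topdown(n - 1)

# Bottom-up (inductive flavour): build the answer from the smallest
# cases upward by accumulation.
def sum_to_bottomup(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

print(both)       # ['organ']
print(either)     # ['flute', 'synthesizer', 'organ']
print(acoustic)   # ['flute']
print(sum_to_topdown(10), sum_to_bottomup(10))  # 55 55
```

Both procedures arrive at the same sum; what differs, as in the deductive/inductive metaphor, is the direction in which the structure of the problem is traversed.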
The meeting of the children and computer is, in effect, the meeting of two "intelligences" that need to know each other. The intelligence of the child is fluid, intuitive, curious, and above all able to "decentralize" itself and assimilate new interactive rules, to regulate its performance to find and change constructive communicative proposals and solutions. The "intelligence" of a machine is more linear, rigid, programmed, in many ways an imposition, in others receptive and willing to execute commands, able to listen to children and to invite them (without humiliating them) to rethink their actions, to indicate this way out of a problem, to suggest the means to arrive at a meeting of "forces."

Notably what is recognized is:
• the existence of a plurality of symbolic communicative signs to which one can respond with appropriate codes;
• the compatibility of procedures of trial and error;
• the force of reflection, the flexibility of reasoning, the value of conjecture and of working together with one's companions;
• the usefulness of the logical and temporal breakdown of a complex action;
• the necessity of developing control over spatial movements;
• the joy of verifying one's success;
• the pleasurable sensation of touching the keyboard in order to enter an operative relationship with the machine;
• and the growth of awareness gratified by active participation and achievement of results.

A careful evaluation of the experience allows us to discover talents and resources never before accredited to children...chiefly those regarding their capacity for abstraction and conceptual mastery over praxis and symbolism.

Figure 3. Museum Exhibit

I was drawn to the numerous contradictions that abounded between the intelligences of mind and machine; between algorithmic and heuristic means; between problem solving and creativity. I considered the notion of a "plurality of symbols" and how one must respond; the necessity of collaborative work and of developing "control over spatial movements"; and the usefulness of logic and time with respect to complex actions. If there were "joy" in verification, "pleasure" in sensation, and "gratification" in active participation and achievement, surely this was a result of what Nelson Goodman (Malaguzzi, 1998) suggested, "to understand is to experience desire, drama and conquest" (p. 60), for technology would appear to be the new frontier. Finally, in its summary statement (italics mine), I was drawn to the contradictions embedded within the words "abstract, conceptual mastery over praxis and symbolism." I was puzzled by the dispute over differences between concept and percept; between theory and praxis. Did this dispute over dualities still hang in the balance? From a scientific perspective, can children master abstraction in ways that Piaget and others have never observed? All of these contradictions, all such tensions, could do no less than raise my curiosity but also my wonder over technology and its role, particularly as it related to processes and products deemed creative. Intuitively, I knew that I had stumbled upon an important set of propositions that carried the central dilemma of both technology and creativity, but were the answers bound up in the question? I did not fully seize what the words signified, particularly juxtaposed one with the other. The message seemed to resonate with my spirit, however, for I heard in those words a complex harmony of contradictory beliefs and values that resisted simple analysis and synthesis. In short, I encountered paradox. The words, artfully expressed, lacked only precise meaning, though I had experienced something of what I believed they described, for I had witnessed seemingly extraordinary things emerge from the filmmaking project.
Like a child who hears a word for the first time and utters it in varied ways until its meaning one day fully manifests itself, I eagerly copied down the quote and concocted ways to include it in my teaching rationale, for it seemed to point somehow toward a means of arriving at a 'common understanding.' Some time later, an opportunity arose that required me to summarize the events of the filmmaking project, and thus the quote found its way into my writings. I had spotted an invitation by the Canadian Federation of Teachers to submit classroom work that would be considered innovative, and on a whim, decided to send a summary of our project. To my delight, and that of our school community, I received the Roy C. Hill provincial award, at nearly the same time as our school received the provincial Quality Award in Education. Clearly, both the nature of the project and the ways in which we were engaged in teaching and learning at UES had appealed to the adjudicators (no doubt the uses of technology had played a part). We were certain of ourselves, certain that our work was a 'sign of the times,' and we proudly congratulated each other for carrying through with our convictions. The lyrics of the song From the Air, by Laurie Anderson, perhaps captured our jubilant sentiment: This is the time. And this is the record of the time. Nevertheless, it was Linda Darling-Hammond (1997, Sept.) who most aptly pinpointed the 'record of the time,' for her views presented us with an even more perplexing problem, one that threatened our very sensibility toward contemporary education. The new basics demanded by today's knowledge society require that all students be able to meet requirements previously reserved for only the 'talented tenth.'
They must learn to: understand and use complex materials; communicate clearly and persuasively; plan and organize their own work; access and use resources; solve sophisticated mathematical and scientific problems; create new ideas and products, and use new technologies in all of these pursuits (p. 5). Darling-Hammond's thoughts were both intimidating and eerily haunting. The notion of a former "talented tenth" made me wonder who was being discussed: the intelligent, innovative, affluent, and gifted? And whether by "new basics" we no longer cared about the 'old basics,' or for that matter what in fact 'basics' signified and to whom? I wondered about "new technologies" and their "use in all of these pursuits" and whether we knew how to teach by means of any of those pursuits. Those were but a few of the questions that seemed to arise from the language, most of which pointed toward the main concern for educators: content and method. That is to say, content qua message and method qua medium (context). In truth, the system of education as medium is akin to whatever the artist must manipulate in order to achieve an end. Though as an artist I could intuit such matters, I lacked the capacity to reason with clarity in an academic context. And though I shared common meanings with other artists who also embraced the maxim—the medium is the message—it was when addressing the non-artist that clarity became an issue, one that pointed directly to the lack of clarity within our own understanding as artists. As a musician, dramatist, dancer, and new media artist, I knew well enough that medium and message could no more be uncoupled than mind and body. I became particularly mindful of the comments made by Malaguzzi (1993a) upon the "new consciousness regarding the education of young children" which had emerged in the 1960s.
He said, "We avoided the paralysis that had stalled left political theorists for more than a decade in a debate regarding the relationship between content and method in education. For us that debate was meaningless because it did not take into account differences that were part of our society and ignored the fact that active education involves an inherent alliance between content and method" (pp. 52-53). I wondered if we too were stalled in our debates over technology's means and ends, whether we were still battling the issue of content and method. McLuhan (1963) had long since exposed sign systems in his seminal book, Understanding Media. In that book, entire chapters were devoted to revealing the effects and affects of sign systems on the individual and society. Whether or not one agrees with his analysis, at the very least the discourse points to the inevitable Truth: there exists a profound and intimate relationship between medium and message. That Truth translates to understanding medium and message (not merely the analysis of the latter). He impressed upon us that if the medium remains 'transparent,' this condition precludes any reflective and intelligent escape from its given message. It is similar in scope to the old adage, "Consider the source." During professional discussions at UES, when we debated the implementation of technology, I recall that the striking parallel between technology and art was entirely overlooked. Technology and art are similar in that both aim toward clarifying the mind and heightening the senses, and both are artifacts (i.e., media) that extend a human being's expressions (i.e., vocal, bodily, etc.). To extend that idea one step further, the educational system (another medium bearing another content) is ideally both an art and a technology.
Conceivably that notion is a complicated matter, as if it were not enough for educators to focus on practical matters without having to worry about metaphorical comparisons to artistry and technique. To be truly an artist, one who understands one's medium, demands a level of expertise that rises above the norm. Yet educators in today's society are viewed at a disadvantage, for the pedagogical artistry of the past (if it existed) no longer seems to fit the content of the present. Gayle Long (1997) elaborated on a sentiment that is prevalent in schools and society at large. "Teachers today are seeing a new kind of student enter their classrooms. Many children sat at a computer for the first time shortly after they received their first pair of shoes. They're the Nintendo generation or the screenagers—the first to grow up with personal computers, video games and the Internet. They expect material to be presented to them in a creative and challenging way and are eager to experiment with innovations in technology" (p. 17). Long clearly told educators that the proverbial cart (learner) is well ahead of the horse (teacher), thus implying that the role of the teacher was rather tenuous, particularly in matters concerning technology. The audience (students) appeared more expert than the artist (educator), which could only result in a rather grim devaluation of the artist. Such emotional pleas demonstrate the power of the medium to affect, for better or worse, our perception. Formal language, in fact, is often blamed for its ability to obfuscate context and meaning, though every institution, be it educational, political, or religious, rightfully uses language to define, convey, defend, implore, advocate and persuade. Notwithstanding, the informal use of language, with its conventional banalities, does much the same thing as the erudite and poetic. George Berkeley (1974), for one, contended that language exceeds the mere communication of thoughts.
"There are other ends, as the raising of some passion, the exciting to or deterring from an action, the putting the mind in some particular disposition; to which the former is in many cases barely subservient, and sometimes entirely omitted, when these can be obtained without it, as I think does not infrequently happen in the familiar use of language" (pp. 147-148). Judging from the sentiments expressed by my colleagues, it was hard not to sense the feelings of powerlessness epitomized in Long's comments, which echoed so many thoughts (an echo that reverberates even now, a few years past). At the same time, a feeling of deep resentment at the mere suggestion of such an idea also prevailed. The power of the words, it would seem, ultimately dictated less than rational action, for their seductiveness set emotions in gear. Without belaboring the history of language study, Umberto Eco (1998) pointed out that many people, from ancient times to the present, have been preoccupied with deliberating on the alluring power of language, which fell from grace at the Tower of Babel. Language, in particular poetics, when viewed as a medium may be used to deceive, to alienate, to form secret societies or to conspire and rule others. It was for those reasons that the ancients looked upon philosophy, and, later, logical positivists—as defined by the Vienna Circle—looked upon Logic, as a means to free us from erroneous beliefs wrought from passionate appeals that lack sensibility and, most of all, veracity. In other words, the implication is to attend to the medium in order to assess the message—the reality is that the message (the signified) has become more vital than the sign. That is not because no one aside from McLuhan has studied the medium.
Peirce (Eco, 2000) himself identified semiotics as the study of the sign as referent or object, the ground as the nature of the relationship to the referent, and the interpretant (the relationship between the interpreter and the meaning). In McLuhanesque fashion, every aspect of sign necessarily consists of both medium and message. It is this obvious clue that is often missed in semiotics, a clue that remains buried in works of Derrida that are filled with issues of the sign, the signifier, and the signified. At any rate, an earnest look at educational 'messages' (since it is to these that the educator turns) can be of great value, not least to determine the sense of what Professor Anne Phelan (1995) has described as "buzzwords, bandwagons and banalities." Faced with postulates from constructivism to postmodernism, from philosophy to science, no study can be complete without some critique of the tenets held dear in education. Practical considerations still govern the majority of educational decisions and, for that fact alone, pragmatism remains the discourse du jour. As many former university students of mine have expressed, music education necessarily requires concrete examples and activity. No amount of time spent on either the theoretical or aesthetic constructs of music can bring the patterns of sound and silence (i.e., rhythmic or tonal) to the mind's understanding and the body's awareness. Therein lies the need for music educators to focus principally on the ground—a practical, concrete perspective. Nonetheless, theoretical constructs necessarily shape the figure. It is in abstract thought that patterns, shapes, and relationships are revealed. For that matter, because abstraction is germane to this thesis, I later discuss how it is through abstraction that patterns, shapes, and relationships are thought of at all. The reader may recall that I mentioned 'intuition' in the first chapter.
I defined intuition as the moment when the concrete and abstract meet to allow a moment of clarity. That moment may not necessarily be possible to articulate without further thought and deliberation, but it often leads one to consider an action. Conceivably aesthetics or aesthetic sensibilities may be the 'application' of intuition in a creative work. But this is most likely an oversimplification of a complex issue that has been discussed in Western thought for 3,000 years. Propositional Logic: Breaking down the Medium There is a growing need among educators in the 21st century to find a way to get to the Truth with respect to pedagogical and curricular matters. On the positive side, there are many studies to support and enlighten our understanding of the mind from neurological and psychological points of view. The down side, however, is the fact that many good research findings remain hidden from view. That is most likely due to the overwhelming number of popular arguments that drown out other voices. Many popular arguments (i.e., theories) that seemingly support educational practice, in fact, contain fallacies in logic. There are serious epistemic threats to numerous theories that can be linked to issues of validity (i.e., construct) and issues of causality. Of course, 'construct' issues wind up being another case of medium and message. A particularly onerous argument is one that commits a fallacy of relevance. In instances such as those, premises are logically irrelevant to their conclusions and, hence, cannot establish the conclusions as true or valid. Because the premises are psychologically appealing, however, such an argument may appear perfectly logical, even to someone without an education in formal logic, and we are generally persuaded to accept its correctness irrespective of logic. In those instances popular appeal will uphold arguments until such time as they are discounted through critical examination.
When an argument contains fallacies of relevance but is supported by public appeal, it will usually lead to reversals of opinion when public taste changes—unfortunately that is not always the case. A fallacy of relevance can also occur when the argument is made from ignorance. We find that argument to be frequently favored by religious groups and phenomenologists. When it is posited that we have no means by which to prove the existence or inexistence of certain phenomena (i.e., God), we may accept to some degree the veracity of the argument. In other words, since a proposition has not been proven to be true it must be false or, conversely, if not proven false it must be true. Such an argument, made from ignorance, can neither hope to explain nor define a phenomenon but merely describe it. In much the same way, while we are able to give a reasonable description of a creative process or product, the description itself provides us with neither an explanation nor a definition of creativity. Although the preceding argument regarding ignorance may not hold sway with individuals who are either pragmatically or scientifically oriented, social scientists who support the preceding argument may do so quite convincingly. Noam Chomsky (2000), for instance, stated that the intellectual world is divided into "problems" and "mysteries" and that by virtue of the fact that we are "partially opaque to ourselves" we can only achieve theories within the boundaries of problems, but never in the case of mysteries. In Chomsky's case, the mysteries to which he refers are those found in linguistics, viz., its origins and evolutionary development. In general, Chomsky views language as an innate reflex, and, much like any instinct found within the animal kingdom, explanations of its origins become a matter of conjecture. In other words, since the origins of language are shrouded in mystery and cannot be proved, linguistic description is the most fruitful endeavor.
Researchers in the area of creativity have certainly expressed those sentiments (Boden, 1990; Csikszentmihalyi, 1996; Gardner, 1993). It goes without saying that arguments lacking in logic are not precluded from becoming emotionally riveting. In a similar vein, metaphorically speaking, an illogical and poorly scripted film may win an audience merely by playing a great music soundtrack, or by possessing amazing special effects. An emotionally appealing argument that petitions the public is called argumentum ad populum. That argument, too, constitutes a fallacy in logic. Whether a popular argument is true or false, its fallacy lies in appealing to an unschooled public. The trouble with popular arguments is the manner in which they take hold of social consciousness. It is not until a serious investigation proves them to be false that reversals come into effect. Unfortunately, as with so-called 'old wives' tales' and 'fishing stories,' even scientific fact may prove to be no match for emotionally appealing arguments, for their effects may linger long past many generations. It was Socrates' greatest fear that popular arguments would rule over logic when unleashed on an unschooled public—a fear that was borne out in the end. Nevertheless, Socrates was not aware that when a medium of great significance (e.g., the Gutenberg press) makes its presence felt in society, change is bound to happen. And, according to McLuhan, that change is precipitated most often from the inside out (i.e., from the individual to society). Popular beliefs are sometimes concluded from an argument which has begged the question altogether. This third fallacy occurs when the premises and conclusions contain the same wording. For example, 'it is highly advantageous to society when education provides students the freedom to create using multiple forms of representation; for the freedom to express in multiple ways is highly conducive to the community at large.'
In that instance, if the premise is true then so must be the conclusion, by virtue of the fact that they are one and the same. Nevertheless, the premise fails to prove or establish its conclusion. Once more, as with the first two fallacies in logic, begging the question simply frustrates further thought on the matter. Ultimately, rational arguments with logical fallacies may be found to have shaped a good proportion of educational writings and, hence, create the impetus for continued investigation by diligent philosophers and researchers alike. Unfortunately, for the average educator, that is not the case. It cannot be overstated that, once having been sufficiently swayed to one or another viewpoint, the average educator thinks to investigate no further. That is the fallout of new media, for when false claims are published and produced for the masses, there is no turning back the claim except through a counterclaim that is also published and mass produced. While fallacies in logic, as described, can be found whenever an intellectual argument is made, two fallacies affect scientific work more specifically, viz., hasty generalization and false cause. In the former instance, the particular facts established in one or two cases are hastily generalized to a rule for which there is a fit. For example, linguistics was dominated for many years by structural grammar that, while descriptive of some grammatical cases, failed to explain all cases. Despite the many attempts made by linguists, the field remained bereft of an adequate explanation. Beleaguered by particulars, in fact, linguistics was a mass of opinions most predominantly rooted in Skinnerian thinking until Chomsky developed a theory that, from an innate perspective (i.e., deep structure), explained syntactic and grammatical universals. Again, there is a parallel to be made in creativity research.
As hinted at in the second chapter, the field of creativity suffers from a myriad of particular facts that try to fit a general principle. For the most part this battlefield fuels continued research and may not necessarily be a bad thing. As linguists know too well, even when a remarkably accurate generalization has been asserted, the research continues because there is always another puzzle to be solved. In Chomsky's case, his critics remark that he has excluded semantics from the study of language and thus continue to look at language from a broader base of study known as semiotics. Likewise, if creativity research were to arise out of the quagmire of particulars to reach a general principle satisfactorily, as one might expect, it would not completely close the door on all the issues that surround it. A general principle seems to allow a broad range of possibilities (i.e., content), while at the same time narrowing the frame of reference (i.e., context). Nonetheless, such a definition may not sufficiently carry the full measure of the significance of creativity and, hence, would continue to require thoughtful inquiry. Consider, for instance, the notion that creativity, in an absolute sense, could be defined as 'the expression (i.e., artifact) that arises from a creator's capacity to conceive and visualize a paradox that has gone previously unnoticed, and which causes both a perceptual and conceptual transformation in other individuals or societies.' Presently, that definition lends itself quite well to endeavors in the classroom that are considered 'creative' - whether the creator is 'young' or 'old.' Finally, we must deal with research validity. For the most part, validity issues pertain to quasi-experimental constructs, analyses, and conclusions. Prior to addressing issues of those kinds, however, we can begin to critically examine the theory that later inspires a flurry of research designs.
Those research efforts are necessarily part of the search for knowledge. More to the point, because many theories assert relationships and cause and effect, those claims rightfully need to be corroborated. In general, causality has been of primary concern for scientific, psychological and educational research. Notwithstanding, flaws in some theoretical propositions could be rooted out through formal logic and thereby save the time, energy, and money invested in the research experiment. At the very least, we may have better designed experiments. We find two logical fallacies that most often threaten empirical studies: (a) hasty generalization, and (b) false cause. A hasty generalization, as the name implies, is the error of generalizing too quickly, or applying too broadly, facts based on particular events. That error may stem from either inductive methods or the simple fallibility of human reasoning. Often it is our own sentiment that blinds our judgment. False cause is a logical fallacy whereby neither mere coincidence nor temporal succession can establish a causal connection. All fallacies of relevance (i.e., popular appeal, ignorance, begging the question) emerge from word play. Words, as many will note, contain more than mere definitions and connotations. Words are laden with tone and rhythm that effectively raise sentiments similar to the strains of music. Anyone who has listened to the passionate speech of Martin Luther King, even decades after it was first spoken, will immediately recognize the effect that his tone and rhythm have on the senses. Word play, however, does not merely affect some arguments; words affect all arguments, including the theories that underscore empirical studies. Plainly, fallacies of logic affect empirical study through the epistemic threats they pose to the originating theory.
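The fallacies surveyed in this section can be set out schematically. The renderings below are rough sketches of my own, not standard formalizations (informal fallacies resist strict symbolization); they are offered only to make the faulty inference patterns visible at a glance:

```latex
\begin{align*}
\text{Argument from ignorance:} \quad
  & p \text{ has not been proven} \;\therefore\; \neg p
  \quad (\text{or: } \neg p \text{ has not been proven} \;\therefore\; p) \\
\text{Begging the question:} \quad
  & p \;\therefore\; p
  \quad (\text{premise and conclusion are one and the same}) \\
\text{Hasty generalization:} \quad
  & P(a_1) \wedge P(a_2) \;\therefore\; \forall x\, P(x) \\
\text{False cause:} \quad
  & q \text{ follows } p \;\therefore\; p \text{ causes } q
\end{align*}
```

In each pattern the conclusion outruns anything the premises can establish, which is precisely why such arguments survive on psychological rather than logical appeal.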
The following principal arguments found among educators have been so repeatedly employed, with so few objections, that it is possible they have been entirely overlooked as sources of error. Those arguments, with little research to support the claims, have been made largely to support policies on the implementation of technology or creativity, or both. The following analyses, therefore, have been undertaken to (a) root out fallacies in logic, and (b) attempt to convince the reader that the lack of formal logic stands in the way of much needed research and understanding. Teacher as Facilitator: A Question of Whether the Computer Can Do As Well. A popular position for placating the sentiment expressed by Gayle Long (cited in the previous section) would contend that a teacher is more like a facilitator. To deflect the kind of statement that makes educators fearful of children who are seemingly leaps ahead in technological understanding, the following recommendations were made by Cole and Bruner (1972): "...the teacher should stop laboring under the impression that he must create new intellectual structures...He should start concentrating on how to get the child to transfer skills he already possesses to the task at hand" (p. 176). In other words, teachers could assume a role that merely denotes facilitating the transfer of skills that a child apparently already possesses. Assuming that skill transfer is possible, it is useful to know what it is that a child "already possesses" so that transfer may be mediated. One interpretation of what teachers are facilitating, though it seems illogical, might be the transfer of general abilities that are genetically present. The ability to 'see' shapes, for example, is generally believed to transfer to the skill of painting or solving geometric problems.
However, making an assumption that the perception of 'things' in one instance is innately transferable to other activities confuses the very premises of perception and conception. The ability to 'see' a shape (i.e., square, triangle) requires more than the visual apparatus and more than the potential that resides within the brain's biological makeup to understand the shape. It is illogical to assert that mere perception (i.e., the sensory) is involved when, in fact, to name a thing requires conceptual logic. To notice a thing and to identify it as a shape, a person would have to conceive of that shape. Once a shape has been identified (i.e., named), it has been registered in the brain conceptually. And though there is a good chance that conceptually it will be recognized in other instances, that is not necessarily the case. The potential to conceptualize, therefore, is not the notion in question; rather, it is whether a child has conceptualized or merely reacted on a reflexive, imitative level. And once a 'concept' has formed, it is questionable whether the concept will transfer to other unrelated 'things.' That relationship, in effect, is also the result of conceptual thought. The bone of contention with Cole and Bruner is the assumption that links between knowledge and skill are natural occurrences irrespective of a child's actual development. Despite the controversial aspects of the Montessori approach (i.e., an adherence to scientific objectivism), Maria Montessori did not merely allow pre-school children to develop cognitively through their own devices, precisely because the 'natural' child (as Rousseau envisioned) does not exist. Through her own studies and experiences, she considered that children require specific focuses that could heighten critical thought (i.e., to enable them to develop the 'logic' to make moral decisions) (Montessori, 1967). Many educators have argued that socialization also requires directed focus.
At any rate, within those contexts, whether formally or informally structured, we are entirely unsure of what knowledge and skill a child actually acquires through play. At least, we are uncertain to the degree that we often resist the guidance of developmental theories and practice. We have acquired a fair notion of what children think about at play and can use that information to help them connect disparate concepts. But we must make certain that we do not assume that conceptual knowledge springs from mere exposure to a context. When technology is the thing of play we ignore this fact entirely. Do children learn skills from a traditional medium that then apply to new media? Are there skills that children have acquired through exposure to new media that can transfer to a traditional context? The answers to those questions are still largely unknown. The following will serve somewhat to illustrate the preceding proposition. The use of a baton to conduct music in triple meter (i.e., waltz) will create a triangular pathway that, in turn, relates to measuring the triple meter that is heard and felt in the music. Nonetheless, that triangular pathway will be 'observed' though not understood to be triangular by someone whose notion of 'shapes' in space has not been developed. Moreover, something like the notion of rhythm being organized into three steady beats might never occur to a person who has not taken notice of such a thing. In general, whenever a thing is perceived, if the thing is named or thought about, we also imply that the thing has been formed conceptually in the mind. If not, then the person is unlikely to describe or reproduce what they have 'perceived.' I will argue, in fact, that without a degree of conceptual understanding (i.e., pattern recognition), the ability to reproduce or describe a thing will not take place.
Perception, after all, has a sensory base distributed throughout the body, whereas conception occurs within the mind. Obviously, in consideration of both mind and body in a (re)connected human being, both come into play whenever we think about things. It goes without saying that my eyes must be free from obstruction (i.e., nearsightedness) to visually perceive a thing. I will 'see' light, for instance, but, in addition, my brain will attempt to interpret that thing in some way. If I take notice of the light, my brain will name it, categorize it, and compare it to other experiences (e.g., daylight vs. artificial light). Without that kind of mental activity, the light in and of itself will be of little conscious consequence. The moment the light is no longer part of my consciousness, when it recedes into the transparent background, the light will remain solely at a sensory level (i.e., my nervous system still registers its presence). That fact alone explains why I can play a piece of music on a compact disc player, ask my students to describe what they have heard, and receive multiple answers. Descriptions will depend entirely on what they have heard (i.e., perceived) and what they have interpreted conceptually. Perception is not the issue when something goes amiss. It is obvious that the sounds emanate equally into the air and, barring any physical obstructions, can be heard by any person present. Yet, it is not until a thing pertaining to sound is pointed out that a person may take notice of it. Taking notice of it, therefore, is both a perceptual and conceptual occurrence (Gouzouasis, 1992, 1994a). The difference between them is the difference between conscious awareness (i.e., I know the thing exists) and a subliminal sensory awareness resting in the autonomic nervous system. It cannot be overstated that therein lies the proof that epistemology insufficiently addresses educational matters, for metaphysical issues (i.e., existence) always come into play. 
To sum up the preceding rather dense explanation: for a person to transfer a thought or skill about a particular thing, it would have to have reached a state of consciousness that went beyond the sensory, that is to say, beyond the mere registering of its presence. The thing would have to have been conceived. But children and adults conceive differently. That is what Piaget spent a lifetime observing, viz., the mental activity that happens once a child takes notice of something (i.e., conceives it). Nevertheless, Piaget was able to show that 'conception' in the adult mind is both different from and similar to 'conception' in the child's mind. To that extent, it may be said that there are degrees of conceptual thought. In other words, there is a stage of 'complex' thinking, first identified by Vygotsky, that may appear 'crude' to the highly abstract thinker but is, nonetheless, part of conceptual activity. Complex thinking is most often observed in children who use words that they cannot adequately define but may use correctly in a sentence. Chomsky (1957) showed that this was entirely the case with the youngest of speakers. The nature and nurture debate is always at play when making inferences regarding the transference of knowledge or abilities. We might argue over whether children possess knowledge of a 'thing' without prior experience. We might infer that a baby who 'dog paddles' when instantly thrown into the water 'knew' innately how to swim. But does that instinct to swim mean that a child 'knows' to swim out of necessity? Those problems extend beyond the nature and nurture argument. They extend to language and the manner in which we conceptually think of 'things' in general. We speak of a dog 'knowing' of something, yet we are fully cognizant that a dog's knowledge, per se, does not even compare to that of a toddler learning to speak. 
Thankfully, the debate over tabula rasa, along with the behaviorist argument, has long since been put to rest through scientific knowledge. Nonetheless, the degrees to which nature and nurture make up human development are still hotly debated, in large part because we find it difficult to agree at a conceptual level. The wonderful and terrible thing about conceptual thought is its infinite permutations. We may all agree, however, that humans are born with a set of innate potentials that only develop fully under specific conditions. When potentials are determined, we are more likely to set up the conditions necessary for maximum development. Nonetheless, it is in determining those potentials that we run up against another obstacle. Sadly, the idea of determining potentials has come under fire by purveyors of social justice who deem that information a breeding ground for prejudice. As a pedagogue, however, I find that diagnostic tests significantly contribute to the choices I make in teaching. I compare this to the choices one must make in life generally. The more we know of a 'thing,' the better we can determine learning and teaching options. Of course, we can flounder around and try the 'hit and miss' approach. We might get lucky and hit upon the right 'combination' of teaching, albeit serendipitously. On the other hand, given that probability settles in whenever chance is involved in the equation, we are also likely to miss the best choices altogether. At any rate, let us assume that Cole and Bruner referred to the transfer of conceptual knowledge, since the transfer of mere perception is hardly possible. Should a teacher be conceived of as merely transferring knowledge to a new task? As far as that goes, transference could easily have come about through the child's natural play (i.e., discovery). What is the role of the teacher if discovery is the sole aim? 
After all, a museum, a science center, or any other learning facility could stimulate discovery just as well. For that matter, perhaps the computer software industry is just as capable of providing a context from which children can 'discover.' Undoubtedly, the children's television program Sesame Street has helped millions of children discover the alphabet and numerals without the assistance of a teacher. So it would appear that by using the term facilitator, we place teachers in the dubious role of mere medium. Of course, a teacher is a medium, but preferably a medium that thinks actively and organically. Pedagogy, were it merely facilitative, would not be so much a form of art and science as it would be a kind of passive medium construed from an algorithmic formula, not unlike a computer program or a TV show. To put things into perspective, the following is a description of an algorithm by David Berlinski (2000): "An algorithm is a finite procedure written in a fixed symbolic vocabulary, governed by precise instructions, moving in discrete steps, 1, 2, 3..., whose execution requires no insight, cleverness, intuition, intelligence or perspicuity, and that sooner or later comes to an end" (p. ix). That description helps us to understand that an algorithm is a rather rigid and finite means of achieving an end. Algorithms are the foundation of computer design and, while they were born out of human conception, they do not in any way describe the side of the mind that thinks heuristically, beyond schema and formula. At this juncture, we are faced once more with the issue of creativity in consideration of both divergent and convergent thinking. Although we may find some teaching that lacks creative sensibility (i.e., formulaic teaching), the art and science of pedagogy, as the phrase implies, is far more organic than a mere facilitator, such as a computer, could ever manage. 
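Berlinski's definition can be made tangible with a small illustration of my own (not drawn from his text): Euclid's classical procedure for finding the greatest common divisor of two numbers. Sketched here in Python, it proceeds in discrete, fully specified steps, requires no insight from its executor, and is guaranteed to come to an end.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite procedure in a fixed vocabulary.

    Each pass of the loop follows one precise instruction; no cleverness
    or intuition is required, and because the second number strictly
    decreases, the procedure sooner or later comes to an end.
    """
    while b != 0:
        a, b = b, a % b  # one discrete step: replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 18))  # 6
```

The point of the sketch is precisely Berlinski's: every decision the procedure makes is already contained in its instructions, which is what distinguishes algorithmic execution from the heuristic, organic thinking a teacher must do.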
Notwithstanding, when an educator finds himself or herself like a 'horse behind the cart,' recommendations such as those of Cole and Bruner are greatly appealing. Left unexamined, however, such recommendations lack a certain sensibility, and their conclusions could be argued from ignorance. In effect, any transfer made from 'lived experiences' to school tasks occurs precisely because of pedagogical mediation that requires a great deal more than facilitating—pedagogy requires thought before, during, and after teaching and learning. Activity for Activity's Sake: Another Misconception If the institution of education were not distinct from the world-at-large, it would present a very large problem for educators, placing everything that we do in doubt. There is a distinction, however, between school-based learning and world-at-large learning, by virtue of the media of each being distinctly unique: in a Berkeleian sense, a difference between formal and informal uses of language. Tulvist (1991), for instance, expresses this dissimilarity in problem solving. If the school method of solving syllogistic problems represents a generally higher stage in the development of thinking that makes it possible to solve any problem better, then it would certainly be retained by individuals who had attended school and it might even be subject to further development. Absence of this method under traditional environmental conditions, on the other hand, indicates that we are dealing with a specific method of thinking that is functionally appropriate to solving specifically school or scientific problems and does not have a functional significance in the types of activity that do not require application of scientific information and the solution of corresponding problems (p. 139). 
Not only does Tulvist place in question the notion of transference, but he also points out that school develops a different kind of thinking, and he attributes this to activities that are qualitatively different from those of society at large.7 Luria (1976) was a neurologist who furthered the work begun by Vygotsky. He completed studies that reveal the remarkable differences between oral and literate individuals. Though his work will be detailed later in the thesis, briefly, Luria set out to test individuals in Russia at the start of the Marxist regime in order to compare the results of those who had no formal education with those who had received some formal instruction. He found that non-literates were incapable of thinking within a closed system. Our transcripts thus provide unambiguous evidence that the simple computational operations used in everyday practical affairs presented no special difficulties, although these calculations were carried out by wholly concrete procedures. The difficulties that arose always involved a failure to find the solution within the framework of the formal condition of the problem, that is, the failure to perform a discursive operation (p. 126). When Luria administered similar tests to those who had received some formal education, he concluded that "subjects with at least short-term school instruction" solved problems in entirely different ways than those whose "thought processes operate on the level of graphic and functional practical experience" (p. 132). The importance of these findings will bear significantly on this thesis later; presently, it suffices to note that school, or rather school activities, produce new forms of thought. Peeters (1996) concluded the following: "Differences in forms of thinking do not correspond with education as such, or with different cultures, but with different forms of activity" (p. 180). 
The proposition that different kinds of activity produce different forms of thought supports Piaget's developmental theory considerably. Lest we fall under the spell of Descartes' logic, we must be careful when making the distinction between perception and conception, for it is too easy to value one side over the other and disregard their interrelatedness. For instance, it is common for some to misinterpret Piaget's stages of cognitive development to mean that abstraction, occurring later in maturation, is intellectually superior to concrete thought. While it is true that children under the age of twelve are principally concrete thinkers with restricted adult intellect, to undervalue the significance of concrete thought, or the developmental stages required for abstraction to occur, places undue stress on the older learner. When working with the adult learner, in fact, it becomes readily apparent that thinking, either abstractly or concretely, can never fully disengage one from the other. By esteeming the concrete and experiential, especially play,8 as the sole means of development, sensory activity is favored over logical (i.e., discursive) activity. An isolated folk song game in and of itself, for instance, is very much a sensory activity that employs memory incidental to the physical action. Usually games of that nature require some rote application (i.e., patterns). In the first stages of memorization, the mind is challenged to retain bits of information. When information is logically associated, the mind can retain it with more ease. When there is no link, however, the challenge is to employ a mnemonic device. Once acquired, nonetheless, barring any memory losses, the activity may ensue from a series of applied 'reflexes.' The quality of pedagogy will be undermined if there is no discourse following active, physical engagement. 
Even if that activity necessarily engages the mind (i.e., retrieves information), discursive activity, unlike sensory activity, challenges logic. In effect, taking the above into account, teaching could be described in part as the art of conceptual challenge. For obvious reasons, an activity-based or experiential curriculum is most often what is expected of a music education. That is to say, if we wish to think and perform musically, and not just think about music, it becomes necessary to sing, play, move, and so forth (Gouzouasis, 1994a). In order to solve music-based problems, however, such as those encountered during composition, arrangement, and improvisation, both conceptual and practical understanding is required. Activity alone, such as play, does not guarantee the type or quality of thinking that educative mediating processes can ultimately shape. And "thinking about" music, by describing an activity's non-musical attributes, does not come close to the type of experience necessary to develop music concepts (i.e., thoughts on the music in and of itself). Syllogistic reasoning, as Vygotsky and Luria demonstrated, arises out of the interchange between the child and adult immersed in language-based activities. A socio-cultural theory of cognitive development is not new among educators, for Vygotsky holds a significant position in educational discourse. But often, as it applies to the classroom, the proposition is underplayed for lack of apprehending its full linguistic implication. Many educators simply interpret the socio-cultural aspect of Vygotsky's theory to mean an interchange of social significance, not a 'medium' for linguistic development (a point he made quite strongly). But the deeper issue is that those same educators have little knowledge of cognitive processes, and even less understanding of the gap that exists between concrete and abstract thinking, or the leap that must occur to fill that gap. 
The interchange between child and adult takes on greater importance if it is clear that the adult has a particular duty to bridge the gap linguistically, for language and thought are inextricably linked as medium and message and, moreover, despite linguistic innateness, abstraction does not arise on its own, as experts will avow. That point will be expanded later in the text. I have discovered in practice that children and adults who possess little formal training, in particular those who manifest a high degree of music aptitude (Gordon, 1993; Gouzouasis, 1992), are readily capable of engaging in music composition and improvisation; and, given sufficient time, whatever may be lacking in technical skill (i.e., dexterity, coordination, speed) may be overcome through the use of instrumental computer technology. What has eluded my understanding is how composition is possible, with or without the use of computer technology, by (1) the beginner adult who lacks sufficient concrete music experiences to ground whatever theory they may have acquired in a short period of time, and (2) the child who lacks sufficient experiential knowledge and is not yet mature in conceptual understanding. In point of fact, music education researchers have not yet convinced me that this is at all possible (DeLorenzo, 1989; Gordon, 1980; Webster, 1992; Wiggins, 1994). In general, though we may know a great deal with respect to some developmental stages related to some aspects of learning, we appear to have little knowledge regarding the leap that must occur between concrete and abstract thought. Most important, if we had greater knowledge, the matter regarding the incompatibility of theory and praxis would be illuminated. As it stands, the proposition that children can master abstraction over praxis, presented in the quote by the unknown author, requires a great deal more understanding if it is to be taken seriously, particularly if we have failed to acknowledge this possibility. 
The most difficult aspect of music instruction, whether with children or adults, is in fact moving learners from thinking on concrete levels to conceptual levels or, conversely, from conceptual to concrete levels of learning. The reconciliation of theory and practice is a tension that must be reckoned with, yet it lacks direction and understanding. Why have my colleagues and I floundered on such matters? Even though there is overwhelming evidence to show that activity for activity's sake cannot lead to abstract thinking, we continue to operate activity-based programs; and even though there exists evidence that effectively explains the leap between concrete and abstract thought, we do not apply such notions. In short, if theory as abstraction is not rooted in experience, it becomes meaningless, since language cannot compensate. Conversely, activity alone offers no more compensatory effect for cognitive development. In music education, ignoring that evidence creates an incompatibility between music theory and practice in the early years of training. It is not difficult to understand that in order for complex music development to occur, such as composition, arrangement, and improvisation, a music learning theory is indispensable (Gordon, 1990). How do music educators mediate this knowledge? Through another abstract medium, of course, namely, the printed symbol. Although instruction is principally experiential, aural, and kinesthetic, much theoretical instruction favors pencil-and-paper tasks. Invariably, this keeps conceptual understanding at a distance from practice, for no leap of logic from the concrete to the abstract is likely to occur through the introduction of yet another abstract medium. Even when the learner is steadfast in practice, many conceptual connections are lost without direct oral intervention. 
When practice is filled with tasks that are concrete and practical, such as the development of skill or memory, it is unlikely that the learner will abstract generalizations without linguistic mediation. To my mind, Gouzouasis (1991, 1992) is one of those rare individuals in music education who recognizes that a form of 'back and forth' interaction between concrete and abstract thinking must necessarily take place in the early years of educative development. In the Gouzouasis model, young infants are exposed to several planes of consciousness at once and are encouraged to make whatever observations, inferences, categorizations, and predictions—forms of complex thinking—they can make. By testing those predictions and inferences through active musical play, children are also encouraged to synthesize during improvisational games and compositional tasks—the synthesis resembles linguistic usage, wherein syntax and grammar are manifested long before notions of understanding are fully achieved. Those same principles are, nonetheless, applied quite successfully in the adult music education context of the university teacher education program. All the while engaged, children are given words as a means to refer to certain tasks or objects. Conceptual as each word may be, in the early years words act simply as a means of common communication. The words provided are merely labels used to enable children to recognize and identify music events in the classroom, in their home, and in play environments. Later, those words will form the basis of their understanding of music theory, from sound to symbol and from syntax to grammar, in all manner of related knowledge and skill at a conceptual level of understanding. 
That model of learning, originating in Edwin Gordon's (1986, 1993) Music Learning Sequences theory but reinterpreted with greater clarity and artistry by Gouzouasis (1991, 1992), is the key to high ability in music development, including composition and improvisation—two creative processes and products of music. Essentially, Gouzouasis has made it clear that the common practice of 'naming' or 'labeling' aural objects (e.g., tones) and the theoretical discussion of those matters are categorically different. Music theory traditionally deals with merely 'labeling.' The question c