Type on Wheels. Two Voices on Teaching the Language of Motion.

This paper by Jan Kubasiewicz and Brian Lucid was presented at MOTYF 2013 (Motion Typography Festival) in Warsaw, Poland.

Type on Wheels
The phrase Type on Wheels has a somewhat sarcastic, and perhaps even derogatory, connotation. The term is often applied to the work of programs that teach kinetic typography as a means to an end — typographic animation that lacks the broader context of the language of motion and communication design.

The phrase was coined more or less a decade ago, during a period of explosive demand for motion graphics programs at colleges and universities across the United States. That demand continues to this day. Unfortunately, these curricula do not reflect the current situation in the professional motion graphics industry. The rapid rise of on-demand television services in the United States has left our broadcast world scrambling not simply for ratings, but for an entirely new model to sustain itself. This has had a chilling effect on motion graphics studios. If the industry is fracturing, why does so much demand remain for higher education programs that teach motion graphics and kinetic typography?

In the past few decades the communication design profession and its supporting educational programs have had to shift their focus — and vocabulary — to remain relevant and appropriate in the context of new technologies. Fixed becomes fluid, passive becomes responsive, and what was once composed must now be choreographed. In the spirit of these Platonic dichotomies we should recognize the necessary shift from Kinetic Typography to Dynamic Media Communication.

Many American programs saw kinetic typography and motion graphics as a quick way to extend existing traditional graphic design curricula. Such a quick fix can only produce a solution limited in scope, one that does not address the core reasons why motion is so important within a modern communication design curriculum.

Most design educators, including us, agree that the subject of motion should be taught. The real question is how to relate the language of motion to the wider ecosystem of design education. For those of us who have labored to make graphic design programs broader and more multidisciplinary, the rising prominence of kinetic typography can be seen as a rejection of the very real demand for curricula covering interaction and experience design.

Pedagogy: What Do We Teach?
Considering the above, how is kinetic typography taught at the Massachusetts College of Art and Design? It isn’t! The program does not treat kinetic typography as an end product. We aim, instead, to embed the concepts of motion deeply across the entire design curriculum. Motion is inherent in the media through which we communicate every day; motion defines modern communications. This makes motion a property integral to all design and an element to which all other elements must relate. We have therefore rejected the idea that kinetic typography be taught separately and worked to place it within a wider curricular context. Following the model of our MFA program, the Dynamic Media Institute, in which the language of motion is omnipresent in the graduate curriculum, our BFA Graphic Design Department introduces kinetic typography within several required courses as well as multiple electives at all levels of the undergraduate program.

Defining Dynamism
How do we define dynamic media? Or an even broader question: How do we define dynamism?

1. Dynamism is related to, or caused by, motion
This is perhaps too broad a definition, since we all experience motion almost every moment of every day. In fact, it would be difficult to point to a human experience that does not involve physical motion directly or indirectly; the verb “to experience” itself implies the context of motion. In terms of communicating in the language of motion, the term involves the questions of what is moving and how that something is moving. The how question refers to the kinetic form and its grammar, defined by the dimensions of space and time. Kinetic “behavior” contributes an additional layer of meaning to objects that already convey messages expressed in their own native languages of pictures or words.

2. Dynamism is characterized by continuous change or progress
In a broader social sense, motion can also be understood as a process beyond physics: a process which changes one situation into another with reference to the system of human values. The motion of transferring information in a learning experience takes a person from one point to another on a difficult path to knowledge. In a social context, motion can be seen as transformation.

3. Dynamism is related to interactive systems or processes
Lastly, the term dynamism in the context of communication and dynamic media is related to interactive systems and processes. The term “interaction design” describes an interdisciplinary field encompassing those aspects of design and engineering (and many other disciplines) that are involved in bringing meaningful experiences to people. Interactive systems mediate the process of communication and therefore augment the participant’s experience as well as the environment where communication occurs.

The Language(s) of Communication
Various languages of communication are related to the development of tools and technologies of communication. Let us observe the very brief story of imaging language:

In the beginning was the spoken word. Speech: a complex system of communication that helped us humans build the Tower of Babel and, by extension, our civilizations. A sequence of phonemes carrying the meaning of abstract concepts in the language of sound, articulated by the timbre of voice, pace, and intonation — properties that belong to the language of music as well.

Then came the written language. The script: a symbolic, visual representation of spoken language, expressed by the gesture of a hand holding a stylus or a pen, making a personal, unique message visible and transportable across space and time. Writing combined the linguistic as well as visual idiosyncrasies of the author of the manuscript.

Then typography. Technical text: creating even further distance from spoken language by removing the personality of the person involved (the typesetter or the designer) in favor of a clear system of communication design suitable for mass reproduction.

Then kinetic typography…
What does it bring to the evolution of text communication? Does kinetic typography, with its ability to manipulate time through motion, bring back some property of spoken language? Does kinetic typography, structured on a timeline, merge the properties of the language of text with the properties of the language of music?

Then… The language of sound with its properties:
— Pitch — resulting in the perceived highness or lowness of sound
— Duration — resulting in rhythm
— Amplitude — resulting in loudness/softness
— Timbre — resulting in tone quality or color
— Location — resulting in perceived position in space (after Karlheinz Stockhausen)

Then… Cinematic language with its properties:
— Visual
— Sonic
— Kinetic.

Pedagogy: “Onomatopoeia” Project
The Onomatopoeia project is a MassArt design student’s first experience working with motion. It is assigned in the first year of the major, in a required typography course. But Onomatopoeia is not simply a kinetic typography project; it asks students to consider how the formal characteristics of typography can be systematically applied to interpret the structure and experience of a musical composition.

The project begins with an overview of the history and tradition of the visual representation of sound and music. We focus particularly on the historical moment when avant-garde music began to stretch the limits of traditional musical notation. One example of this genre of work is Aria, the John Cage score from 1958 [Fig. 1]. It consists of a sequence of hand-drawn and colored lines roughly describing the relative pitch path requested for the emission of sound. Colors placed on the curves define timbre. The different timbres used in the work were not predetermined by Cage, but chosen by the performer during the rehearsals.

Once familiar with some historical background, students begin the project by choosing a short sample of instrumental music to work with. No vocals are permitted. They first visualize their listening experience using traditional media. [Fig. 2] Then they are pushed towards more rigorous forms of diagramming, mapping the instruments they hear to pitch on the x-axis and time on the y-axis. [Fig. 3]

Once familiar with these methods, the students reveal the structure of their selection by composing a typographic score using onomatopoeic syllables as substitutes for the sounds they hear. Onomatopoeia are words that imitate the sounds they describe. Students are required to work with onomatopoeia because it forces them to focus upon the meaning generated from the formal and compositional treatment of their letterforms — not the meaning embedded in words and phrases. The only textual content is time and sound, keeping the work closer to the structure of music. Stockhausen’s properties are systematically mapped to the formal characteristics of typographic elements. Letterform size, for example, may be mapped to amplitude while weight is mapped to timbre. Such typographic equivalencies are not prescribed in this project, but are defined individually by each student. Students determine the rules for their system, and are required to explain those rules in detail. [Fig. 4]
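To make this kind of rule system concrete, the sketch below (in Python, purely for illustration) shows one hypothetical mapping of Stockhausen’s properties to typographic attributes. The specific pairings, timbre names, and numeric ranges are our assumptions, not those of any particular student project.

```python
# Hypothetical sketch of the kind of mapping rules a student might define,
# translating Stockhausen's sound properties into typographic attributes.
# The specific pairings and numeric ranges are illustrative only.

from dataclasses import dataclass

@dataclass
class SoundEvent:
    pitch: float      # 0.0 (low) to 1.0 (high)
    duration: float   # seconds
    amplitude: float  # 0.0 (silent) to 1.0 (loudest)
    timbre: str       # e.g. "brass", "strings", "percussion"

# One possible rule set: amplitude drives size, timbre drives weight,
# pitch drives the vertical position of the syllable on the score.
TIMBRE_TO_WEIGHT = {"brass": "bold", "strings": "light", "percussion": "black"}

def typographic_attributes(event: SoundEvent, page_height: float = 600.0) -> dict:
    """Map one sound event to the typographic treatment of its syllable."""
    return {
        "point_size": 8 + event.amplitude * 64,           # louder -> larger
        "weight": TIMBRE_TO_WEIGHT.get(event.timbre, "regular"),
        "y_position": (1.0 - event.pitch) * page_height,  # higher pitch -> higher on the page
        "tracking": event.duration * 20,                  # longer sounds stretch horizontally
    }

print(typographic_attributes(SoundEvent(pitch=0.8, duration=0.5, amplitude=0.9, timbre="brass")))
```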

Once their static score is composed, the students are challenged to take their typographic systems and re-combine them with the original musical selection to create a short motion sequence. [Video Samples: 1–4]

While kinetic typography sits at the core of this project, its focus is really on understanding, defining and applying typographic systems in the context of time-based structures. The language of motion serves as an organizing principle of all the components within a sequential continuum.

Pedagogy: “Code Performs” Project
The complex relationship between type and sound is a topic of choice for many Dynamic Media Institute MFA students. The project by Kate Nazemi entitled Code Performs grew from her personal interest in phonetic poetry, particularly in Kurt Schwitters’ Ursonate. Kate built a system that applied phonetic expression to the arcane keywords of assembly commands, a low-level computer language that was never intended to be spoken out loud. The project was presented as a gallery installation allowing users to interact (via mouse and touch screen) with slowly moving typography representing the assembly strings. By pointing at, stopping, or moving the code in different directions and at different velocities, the user triggered sequences of phonetic repetitions — in effect creating semi-random visual and phonetic compositions. [Video Sample 5]

Pedagogy: “Type & Sound” Project
Because we consider our Master of Fine Arts program “anti-disciplinary,” the Dynamic Media Institute has the opportunity to accept students from a wide variety of backgrounds. Stephanie Dudzic entered our program with an undergraduate degree in computer science and a developing love for typography. The “Type and Sound” project grew from her interest in how the forms of the Roman letter might be mapped to unrelated data properties. She eventually settled on sound, and built a computer-vision platform that allows her to scan letterforms and transform them into sonic experiences.

Her documentation of the project starts with a simple example: performing a sans-serif capital A. [Video Sample 6] She uses this platform as a laboratory for defining how these shapes are mapped to sound properties — pitch, tempo, instrumentation, and so on. Still a work in progress, the project has helped make the formal variations of letterforms more tangible to her. For example, in an early study she was able to sonically compare the same word rendered in a variety of different typefaces and weights, such as Clarendon and Didot.
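The inner workings of Stephanie’s platform are her own; the sketch below only illustrates, under assumed heuristics, one plausible way a scanned letterform could be reduced to a pitch sequence: slice the bitmap into vertical columns and map each column’s ink density to a note. The column count, note range, and density-to-pitch rule are all assumptions.

```python
# A minimal sketch (not Dudzic's actual platform) of one way a scanned
# letterform could become a pitch sequence: read the image, slice it into
# vertical columns, and map each column's ink density to a MIDI-style note.

import numpy as np
from PIL import Image

def letterform_to_pitches(path: str, columns: int = 16,
                          low_note: int = 48, high_note: int = 84) -> list[int]:
    """Scan a letterform image left to right and return one note per column."""
    grayscale = np.asarray(Image.open(path).convert("L"), dtype=float) / 255.0
    ink = 1.0 - grayscale                  # dark pixels count as ink
    slices = np.array_split(ink, columns, axis=1)
    notes = []
    for column in slices:
        density = column.mean()            # how much ink this column carries
        notes.append(int(low_note + density * (high_note - low_note)))
    return notes

# e.g. compare the contour of the same letter set in two typefaces by listening:
# letterform_to_pitches("clarendon_A.png") vs. letterform_to_pitches("didot_A.png")
```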

Stephanie’s system not only allows for the reading of set type, but also captures type within the environment. Any letterform that can be captured with the computer’s camera can be converted to sound. Recently she has been installing the system in galleries and allowing users to cut and paste letterforms freely on paper to create customized musical performances. Letterforms are dismembered and rearranged to create new sonic responses.

The end of her documentation demonstrates a sonic interpretation of the typeface designed by Władysław Strzemiński. This experimental type design was originally published in the second publication of the avant-garde artists’ group a.r., Łódź, 1932.

Three Aspects of Motion Language
1. The Articulation of Motion in Static Media
Static renderings compress our living three-dimensional physical experience onto a fixed two-dimensional surface. One of the most notable effects of this compression is the removal of “verbs.” With the loss of the passage of time we lose the ability to show action. Instead, we must visually allude to the concepts of time and motion so the work connects to our experience and understanding of the natural world. Over time a variety of techniques have been developed to convey motion and time through graphic form: multiplication, sequencing, and the manipulation of images, to name a few. A system of symbols — arrows, lines, and icons — is also often relied upon to convey motion, force, and action with clarity and brevity. All symbols have to be learned, and their meaning varies with culture and context. If not used carefully, they can cause more confusion than clarity.

Pedagogy: “Instructional Diagram” Project
One effective way to challenge students to think deeply about motion is to make them aware of its absence. In the first project of their second-year required curriculum, students are asked to contemplate the limitations of a static medium by using it to represent an action or procedure over time.

This assignment begins with students observing and documenting a complex physical action. Using photography, they break down the movement into important moments in time and capture those moments from many different points of view. This parallax matrix of photographs provides a visual reference that will assist them in starting the process of “flattening” the three-dimensional experience onto the printed page.

All of the content in the final diagram must be conveyed through visual form. Use of words or numbers is forbidden. Composition must guide our eye through the diagram in the correct sequence. Students may choose to use symbols to clarify action or represent force, but each use of these graphic shortcuts is challenged and must be justified during critique. [Fig. 5]

2. Motion as It Relates to Information Design
The integration of motion with information graphics, in many cases, seems to be the only practical solution for managing and understanding the complexity of large-scale information structures. Understanding is the result of a communication process that can only be completed within an individual’s mind through comparison, explanation of structure, and/or causality.

There are many examples that demonstrate the great potential of our brains to process complex sequential information. While following the dynamic diagrams of “Powers of Ten” (Charles and Ray Eames, 1977) or “Remind Me” (Ludovic Houplan & Hervé de Crécy, 2002) [Video Sample 7], analyze your own process of perception. What can you actually perceive? What are you missing due to the pace of the animation, or the lack of appropriate time to reflect upon the information you are processing and learning?

Increasingly sophisticated computational imaging tools require new conventions and strategies for dynamic visualization, since very often the solutions adopted from traditional information design do not work successfully in dynamic and interactive environments.

Pedagogy: “One Hundred Items” Project
One Hundred Items is another second-year required project undertaken by all students in the program.
The first phase of the assignment entails gathering a simple data set and defining methods to bring structure and order to that data. The second phase involves extracting findings from the data and conveying that information clearly (through numeric representation) and engagingly (through design and metaphor) via time-based media.

Students begin the project by identifying and cataloging a collection of their choice. The collection must include at least one hundred items that share similar properties or attributes. A detailed record of the collection is made by creating a custom database that contains unique values for those attributes that are shared by every item in the collection.

Once cataloged, this database is then visualized in the form of a poster. The poster must represent each object in the database while revealing that item’s unique attributes to the viewer using some type of visual system.

As in the “Instructional Diagram” project, this assignment was written specifically to challenge the students’ perception of the medium. The poster format is not the optimal medium for this content; it would be far more appropriate to represent the data dynamically, allowing users to sort and filter. A poster, lacking both a temporal axis and the ability to interact with users, was selected for several reasons. Because content on a poster cannot be sorted and reorganized, students are forced to make difficult editorial decisions to reach a clearly defined and understandable hierarchy for the collection. Additionally, the poster format challenges students to consider ways to keep each attribute visual, separate, and glance-able. With each element needing to display five or six properties, this quickly becomes a challenge.

One example of a visual database is shown above. [Fig. 6] Student Phil Pham surveyed two hundred participants across the United States and India about how they personally define happiness. Every response is included in his poster. Each interviewee’s primary response is grouped by color; lighter shades indicate female respondents, darker shades male respondents, and the position of each mark encodes the respondent’s age.
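As a rough illustration of this kind of record-to-mark encoding, the sketch below maps one survey record to a colored, positioned mark. The category colors, the lightness factors for gender, and the layout numbers are our assumptions, not Phil’s actual design choices.

```python
# Illustrative sketch of the record-to-mark mapping described above:
# color keyed to the primary response, lightness to gender, position to age.
# All category colors, lightness factors, and layout values are assumptions.

from dataclasses import dataclass

@dataclass
class Response:
    primary_answer: str   # e.g. "family", "money", "health"
    gender: str           # "female" or "male"
    age: int

CATEGORY_COLORS = {"family": (220, 60, 60), "money": (60, 160, 60), "health": (60, 90, 200)}

def mark_for(r: Response, min_age: int = 18, max_age: int = 80,
             axis_length: float = 500.0) -> dict:
    """Translate one survey response into the color and position of its mark."""
    base = CATEGORY_COLORS.get(r.primary_answer, (128, 128, 128))
    factor = 1.3 if r.gender == "female" else 0.7        # lighter = female, darker = male
    color = tuple(min(255, int(channel * factor)) for channel in base)
    position = (r.age - min_age) / (max_age - min_age) * axis_length
    return {"color": color, "x": position}

print(mark_for(Response(primary_answer="family", gender="female", age=34)))
```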

Many patterns and findings are revealed through the process of organizing, structuring, and visualizing the database. Once the findings are identified, students are asked to reveal them via an animated sequence. The animation that grew from this data is titled Happiness and the Meaning of Life. [Video Sample 8]

In another example, student Matt Kaiser worked with a collection of clothing he owned. He was interested in how much he paid for each item, where it was made, and the hourly rate of the employee who made it. His animation documents his findings. [Video Sample 9]

3. Motion as It Relates to Interface
Interacting with complex data is a special kind of motion. The visual logic of interfaces creates for the user a unique way of interacting with information. By viewing, reading, and scanning visual patterns — static and dynamic — and by selecting subjective paths through the content, users learn in their own unique way. This unique path to knowledge is a result of action and reaction, a stimulus-response loop repeated ad infinitum.

The history of dynamic visualization seems to be accelerating. It is hard to believe that widely accepted concepts and metaphors of dynamic visualization and interaction with data — such as manipulability, transparent intersecting planes, infinite zoom, and zero-gravity 3-D space — were developed only in the last two decades. [Fig. 7]

Today, the cinematic vocabulary inspires metaphors of user interface as well. Indeed, interface can be considered a tool for narrative. Or, to paraphrase Marshall McLuhan, “the interface is the message.”

Pedagogy: “Inception Interactive” Project
This project challenged students to translate the story of Inception, a 2010 science fiction action film written and directed by Christopher Nolan, into a browsable database and to create a user interface for interacting with that content. This particular solution includes dynamic vertical scrolling that allows the user to navigate through the different levels of dream — from reality to limbo. The act of scrolling, which sets image and type in motion, also triggers short sequences of animated icons referencing particular moments of the film’s narrative. The project was designed and prototyped by Mia Fabbri and Lee House within a second-year required class. [Video Sample 10]
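The prototype itself is documented only on video; the sketch below simply illustrates the logic one might assume behind such a browser: a vertical scroll offset maps to a dream level, and keyed positions trigger icon animations as they are passed. The level boundaries and trigger points are hypothetical.

```python
# A sketch of the assumed core logic of the Inception browser: a scroll offset
# maps to a dream level, and crossing keyed positions triggers icon animations.
# Level names, boundaries, and trigger points are hypothetical placeholders.

import bisect

LEVELS = ["reality", "dream level 1", "dream level 2", "dream level 3", "limbo"]
LEVEL_BOUNDARIES = [0, 1000, 2000, 3000, 4000]    # scroll offsets where each level begins
ICON_TRIGGERS = {1200: "kick", 2500: "zero gravity", 4100: "totem"}

def level_at(scroll_y: float) -> str:
    """Return the dream level for the current scroll position."""
    index = bisect.bisect_right(LEVEL_BOUNDARIES, scroll_y) - 1
    return LEVELS[max(0, min(index, len(LEVELS) - 1))]

def icons_to_play(previous_y: float, new_y: float) -> list[str]:
    """Return icon animations whose trigger points were crossed by this scroll."""
    low, high = sorted((previous_y, new_y))
    return [name for y, name in ICON_TRIGGERS.items() if low < y <= high]

print(level_at(2500))             # "dream level 2"
print(icons_to_play(1100, 2600))  # ["kick", "zero gravity"]
```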

Pedagogy: “Body Machine” Project
Body Machine is an interface prototype designed by Carolin Horn that allows a user to re-sort the human body based on certain parameters. In this short example the elements that make up the human body are re-sorted based upon weight. Heavier items sink downwards while lighter items rise up. [Video Sample 11]
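A minimal sketch of that sorting behavior might look like the following; the element names, weights, and easing step are illustrative assumptions rather than Carolin’s actual data or code.

```python
# Minimal sketch of the re-sorting behavior described above: heavier elements
# sink toward the bottom of the screen, lighter ones rise toward the top.
# Element names, weights, and the easing factor are illustrative assumptions.

elements = {"skeleton": 10.0, "muscle": 28.0, "water": 40.0, "fat": 13.0}  # placeholder weights

def target_positions(weights: dict[str, float], screen_height: float = 800.0) -> dict[str, float]:
    """Heavier items get a larger y (lower on screen); lighter items float up."""
    heaviest = max(weights.values())
    return {name: (w / heaviest) * screen_height for name, w in weights.items()}

def step_toward(current: float, target: float, easing: float = 0.1) -> float:
    """Advance one animation frame of the sink/rise motion."""
    return current + (target - current) * easing

targets = target_positions(elements)
print(sorted(targets.items(), key=lambda item: item[1]))  # lightest at top, heaviest at bottom
```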

Pedagogy: “Jellyfish” Project
Jellyfish is a prototype for a holistic and expressive encyclopedia of the arts and design. Each subject area is represented as a “jellyfish”-shaped structure comprised of nodes of content. Interacting with one subject draws other related subjects slowly towards the user’s mouse. As artists and notable works are selected, relationships are made across disciplines.

Motion and proximity are used to show connections and relationships. The jellyfish appear intelligent and autonomous, and their movement helps place into context the relationships the user is currently examining. [Video Sample 12]

Pedagogy: “Anymails” Project
Anymails, Carolin Horn’s thesis case study, grew from her Jellyfish project. Her research focused upon how natural metaphors could be used in information design and, specifically, how the motion of objects can subtly inform.

In this playful visualization of her email inbox, each message is represented as a small, living organism. The species of the organism is determined by the content of the email — work, social, school, junk, etc. The properties of each animal are defined by the properties of its corresponding email and by its lifespan within the inbox. As an example, organisms lose some of their cilia (or hair) once they have been opened and read, and lose even more once responded to. This not only creates a noticeable formal change in the creature; it also translates to how fast that creature can move within the interface — a property that is far more glance-able when the interface is put to use.
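As a rough illustration of this kind of email-to-creature mapping, the sketch below derives a creature’s species, cilia count, and speed from a message’s metadata. The cilia counts, the speed formula, and the age-based fading are our assumptions, not Carolin’s implementation.

```python
# A sketch of the email-to-creature mapping described above: species comes from
# the message category, cilia are shed as a message is read and replied to, and
# fewer cilia mean a slower creature. All numeric values are assumptions.

from dataclasses import dataclass

@dataclass
class Email:
    category: str     # "work", "social", "school", "junk", ...
    is_read: bool
    is_replied: bool
    days_in_inbox: int

def creature_for(email: Email) -> dict:
    """Translate one email into the visual and kinetic properties of its creature."""
    cilia = 12
    if email.is_read:
        cilia -= 4        # opened messages lose some cilia
    if email.is_replied:
        cilia -= 4        # replied-to messages lose even more
    return {
        "species": email.category,
        "cilia": cilia,
        "speed": cilia / 12.0,                                  # fewer cilia -> slower movement
        "opacity": max(0.2, 1.0 - email.days_in_inbox / 30.0),  # assumed: older messages fade
    }

print(creature_for(Email(category="work", is_read=True, is_replied=False, days_in_inbox=3)))
```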

Sketching and pre-visualizing code-based work is a challenge. While designing Anymails, Carolin built a series of computational “playgrounds” — she used the term “motion-labs” — where she could manipulate and observe the results of different motion properties. Once satisfied with the rules for the motion of a given category, she could cut and paste those motion properties into her larger case study. Her motion tools eventually became a module of her final interface, allowing for user personalization of kinetic behaviors. [Video Sample 13]
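In the same spirit, a toy “motion lab” can be reduced to a motion rule expressed as a handful of adjustable parameters that are tweaked, observed frame by frame, and then reused elsewhere. The parameters and the update rule below are assumptions for illustration only.

```python
# A toy "motion lab": a motion rule is a small set of adjustable parameters
# that can be tweaked and observed frame by frame before being reused.
# The parameters and the update rule are illustrative assumptions.

import math
from dataclasses import dataclass

@dataclass
class MotionRule:
    speed: float      # pixels per frame of upward drift
    wobble: float     # amplitude of side-to-side drift
    damping: float    # how strongly the rule settles the movement

def step(x: float, y: float, frame: int, rule: MotionRule) -> tuple[float, float]:
    """Advance one frame of movement under the given rule."""
    x += rule.wobble * math.sin(frame * 0.2) * rule.damping
    y -= rule.speed * rule.damping
    return x, y

# Tweak the parameters, watch the path, then copy the rule into the larger piece.
rule = MotionRule(speed=2.0, wobble=5.0, damping=0.9)
x, y = 100.0, 400.0
for frame in range(5):
    x, y = step(x, y, frame, rule)
    print(round(x, 1), round(y, 1))
```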

The results of her study have been documented via screen capture. [Video Sample 14] The properties that define each creature are revealed by rolling the mouse over it. Menus at the bottom of the interface allow users to filter and group common properties. Grouping is made visual through a type of graphic conga line. In the linked example the creatures are first grouped by a certain user, then grouped by status.

While not intended as a replacement for the traditional email window, this experimental solution creates an expressive interface that allows users to identify patterns of information while they interact with a playful, living inbox.

Cinematic Language
Cinematic language refers to content on screens. Originally found only in movie theaters, screens today come in all sizes and locations, both private and public, from your smartphone to Times Square’s large-scale displays. The flatness of a screen can be defined by its dimensions: X, Y, and T for time. The perception of time on screen results in kinetic, live, multi-sensory experiences of narrative.

Humankind’s ability to create narratives has always been a powerful communication model. As the mind perceives visual, sonic, and kinetic information over a period of time, it continuously organizes discrete units or messages into a story, however abstract that story might be. The sequence of events in time — images and sounds — is perceived by the human brain as somehow organized.
It is our cerebral cortex — with its neural networks of complex pattern recognition — trying to make sense out of the sequencing of images in motion, changing colors, shapes, sizes, and sound. The result of that process is narrative. As defined by Aristotle, a narrative must have a beginning, a middle, and an end. Though, as Godard proclaimed, it need not necessarily be told in that order. Godard refers here to storytelling — which may bend time and distort chronology in order to deliver the story in a memorable way. A designer’s awareness of these two distinct timelines — one for the story, another for the storytelling — is essential.

In its hundred-year history, the language of cinema has evolved into a complex, nearly universally understood system of communication, capable of translating multi-sensory human experience into a kinetic sequence of audio-visual events in which motion serves to integrate all other channels of communication.

One of the most spectacular historical examples of the process of designing and analyzing narrative structure is a post-production storyboard for the 1938 film Alexander Nevsky by Sergei Eisenstein, the Russian director and one of the first theorists of the medium. [Fig. 8] That storyboard is a timeline in which visual representations of the film’s various components are precisely synchronized into a sequence of “audio-visual correspondences”: film shots, the musical score, a diagram of pictorial composition, and a diagram of camera movement resulting in motion on screen. Choreographed very precisely, in fact to a fraction of a musical measure, this “diagram of movement” attests to how essential the language of motion was for the filmmaker as an integrating principle of all the other elements of his vocabulary.

Pedagogy: “Design for Motion and Sound” Course
Certainly, MassArt does have a place within the curriculum where students are able to focus deeply on issues of motion and storytelling within more traditional applications. The traditional skills of cinematic narrative — writing, editing and storyboarding — remain essential.

Karolina Novitska’s three-and-a-half-minute movie entitled “Crossword” received a 2005 Adobe International Design Achievement Award. [Video Sample 15] The film was based on the personal story of Karolina’s grandmother, who suffered from Alzheimer’s disease. Through a long-form narrative sequence, Karolina re-created an experience of the disease — a sense of cognitive fragmentation, repetition, confusion, and fear. From a pedagogical point of view it is interesting to compare the final movie “Crossword” with the “Sketch for Crossword.” The comparison reveals the struggle of choosing the most appropriate vocabulary and grammar of storytelling in order to deliver a story, which is indeed an essential challenge of cinematic language. [Video Sample 16]

Another example deals with motion’s role in branding. This project, developed by Matt Kaiser, explores traditional television branding and features a few separate sub-brands nested within one channel. Modifications of the main logo create a family of sub-brands defined by kinetic behavior, reinforced by the standard visual and typographical components. [Video Sample 17]

Pedagogy: “Data as Narrative” Project
Data as Narrative is a required collaborative project undertaken by students at the end of their second year. It asks teams to identify a large source of abstract data and work with it to reveal stories. From the number and types of casualties in the war in Afghanistan to local housing costs by neighborhood, students may work with any data they can get access to. Students begin the assignment using data analysis tools to interrogate the data. They clean, transform and visualize it to find patterns. Those patterns will eventually be expressed as stories. Once revealed, students work to tell them in a visual and expressive way. The goal is to reveal the story with clarity and accuracy while making the underlying information visual, relatable and comparable.

One team chose to work with the data within the Congressional Record, a document that shows how active the United States Congress has been at passing laws over the past years. In looking at the data, the team discovered a clear pattern of declining productivity from 1947 to today. This is the story they wanted to tell.

The video begins by introducing viewers to the U.S. Congress and explaining a few important terms and concepts so that the data in the Congressional Record can be understood. Building upon that foundation, the team then shows Congress’s inaction by creating visual comparisons between situations in American history — the housing crisis, wars — and recent events. [Video Sample 18]

As is common in data journalism, the narrative storytelling is coupled with an interactive experience allowing viewers to inspect the original data and come to their own personal understanding of it. Students are challenged to map their data to the most appropriate interface tools to best facilitate exploration and ease of use. Some interfaces become interactive prototypes, others are prototyped via a motion sequence.

When users drag the slider across the time axis within the interactive diagram of the U.S. Congressional Record browser, they are presented with information about the number of measures proposed versus passed at that point in time, the events that were happening at that point in U.S. history, and the party makeup of Congress in that session. [Video Sample 19]
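The sketch below illustrates the lookup such a slider implies: select a year and return the measures proposed versus passed, notable events, and party makeup for that session. All of the data values are placeholders standing in for the students’ actual dataset.

```python
# A sketch of the lookup behind such a time slider: a selected year returns
# measures proposed vs. passed, notable events, and the party makeup of that
# Congress. Every value below is a placeholder, not the students' real data.

RECORD = {
    2007: {"proposed": 9000, "passed": 450, "events": ["(placeholder event)"],
           "makeup": {"D": 240, "R": 195}},
    2009: {"proposed": 11000, "passed": 400, "events": ["(placeholder event)"],
           "makeup": {"D": 255, "R": 180}},
}

def on_slider_change(year: int) -> str:
    """Format the information panel shown when the slider lands on a given year."""
    entry = RECORD.get(year)
    if entry is None:
        return f"No data for {year}"
    rate = entry["passed"] / entry["proposed"] * 100
    return (f"{year}: {entry['passed']} of {entry['proposed']} measures passed "
            f"({rate:.1f}%) | events: {', '.join(entry['events'])} "
            f"| House makeup: {entry['makeup']}")

print(on_slider_change(2007))
```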

Pedagogy: “Service-craft” Project
The service-craft assignment is a collaborative project that grows out of a required third-year design research course. Teams identify a community of users they feel are underserved in the digital landscape. They use a variety of quantitative and qualitative research methods to become better acquainted with the needs of that selected community, and envision a prototypical service that would be beneficial to their selected group.

Once a service concept has been developed, teams use their motion skills to create a short narrative scenario to explain and defend the service they propose. Narrative scenarios are tools for concepting, prototyping, and justifying systems or experiences. A scenario simply visualizes one way that a system is — or is envisaged to be — used by describing the predicted interactions of its users. Time-based narratives are frequently used as part of the system development process. They are successful in this context because when we, as viewers, observe a prototypical character moving from point A to point B within a system, we move further down the path to understanding that system.

Possibly the most influential narrative user scenario ever created was Apple Computer’s 1987 video for an imaginary product named the Knowledge Navigator. [Video Sample 20] The compelling and “realistic” storyline focused our attention not on technology that was outlandish for the time, but on a single user’s experience, and in so doing defined a vision for future computing that would guide Apple’s design decisions for the next 20 years.

In this example of a rough service map [Fig. 9], the students are defining the relationships between different types of users for a community service center that focuses on child welfare and development. By mapping out the service, the team is better able to identify a single path, or story, they can tell to help explain it. Once the service is diagrammed, they are ready to begin writing a script and storyboarding their narrative user scenario.

The first example scenario describes a service that helps college students identify and prepare for possible careers in Mathematics. [Video Sample 21]

The second example envisions a service that brings together creative people from different disciplines to collaborate on projects. [Video Sample 22]

In Closing
There are some concluding thoughts we would like to leave for your consideration and perhaps for future discussion among design educators:

— Motion is a language of communication connecting multiple domains: From text to sound, to sequence, to narrative, to experience.

— The language of motion is inherent in today’s media, and is integral to design. It should be taught at all levels of design curriculum.

— As design products move from passive to increasingly responsive, we wish to see a movement from teaching Kinetic Typography as a “product,” towards seeing it placed in the broader context of Dynamic Media Communication.

Figures
Fig. 1 Cage, John. Aria (1958). New York: Henmar Press, 1960. Print.
Fig. 2 Onomatopoeia Sound Analysis by Marianne Schoucair, 2008.
Fig. 3 Onomatopoeia Sound Analysis by Marianne Schoucair, 2008.
Fig. 4 Song for Dot by Laura Harrington, 2010.
Fig. 5 How to Sharpen a Pencil by Kimber Couzo, 2009.
Fig. 6 Happiness and the Meaning of Life by Phil Pham, 2012.
Fig. 7 Cooper, Muriel. Information Landscape. MIT Center for Advanced Educational Services, VHS tape, 1994.
Fig. 8 Alexander Nevsky, 1938. In: Sergei Eisenstein, The Film Sense, ed. Jay Leyda (San Diego: HBJ, 1975), 175.
Fig. 9 Service Map by Mari Emori, Co Dam, and Kat Dorson, 2011.

Video Samples
1 Hong Kong Mambo by Isadora Williams, 2007. http://vimeo.com/17624837
2 Gutter by Matthew Brimicombe, 2012. https://vimeo.com/68463378
3 Genesis by Alec Sibilia, 2012. https://vimeo.com/68463533
4 George Va Lentin by Nina Lilliebjerg-Heder, 2012. https://vimeo.com/68585200
5 Code Performs by Kate Nazemi, 2006. https://vimeo.com/68440737
6 Sound of Type by Stephanie Dudzic, 2013. https://vimeo.com/68445516
7 Remind Me, dir. Ludovic Houplan and Hervé De Crécy, perf. Röyksopp, 2002. Accessed 14 June 2013. https://vimeo.com/31766357
8 Happiness and the Meaning of Life by Phil Pham, 2012. https://vimeo.com/34100496
9 100 Articles of Clothing by Matt Kaiser, 2012. https://vimeo.com/34100570
10 Inception Interactive (Fragment) by Mia Fabbri and Lee House, 2013. https://vimeo.com/68441864
11 Body Machine (Fragment) by Carolin Horn, 2005. https://vimeo.com/68441255
12 Jellyfish by Carolin Horn, 2005. https://vimeo.com/34112787
13 AnyMails: Motion Lab by Carolin Horn, 2007. https://vimeo.com/68744499
14 AnyMails (Fragment) by Carolin Horn, 2007. http://vimeo.com/7322582
15 Crossword by Karolina Novitska, 2005. https://vimeo.com/12310880
16 Sketch for “Crossword” by Karolina Novitska, 2005. https://vimeo.com/12277871
17 Invi by Matt Kaiser, 2013. https://vimeo.com/68443394
18 Congressional Productivity by Carl Bowman and Chris Skinner, 2013. http://vimeo.com/68438630
19 Congressional Productivity Interface by Carl Bowman and Chris Skinner, 2013. http://vimeo.com/68438531
20 The Knowledge Navigator, Apple Computer, Inc., 1987. Online video clip, YouTube. Accessed 14 June 2013. http://youtu.be/3WdS4TscWH8
21 MathPlus.org by Will Millar, Pernilla Kaiser, Jeremiah Louf, and Will Bruno, 2011. http://vimeo.com/34100717
22 CoPilot by Ryan Boye, Nathan Hass, Matt Kaiser, and Phil Pham, 2012. http://vimeo.com/65614658

