Lisa Jevbratt – The Infome Imager

Mapping Transitions

Infome – noun, from information + -ome (suffix: all, the totality of, as in genome)

The Infome Imager allows the user to create “crawlers” (software robots, which could be thought of as automated Web browsers) that gather data from the Web, and it provides methods for visualizing the collected data. Some of the functionality of the Infome Imager software is similar to that of a search engine such as Google, but with significant differences. The search engine crawler collects data about the intended content of a page, the actual words written by one person, the Web author, in a (questionable) effort to index the Web according to the “meaning”, the semantics, of Web pages. The Infome Imager crawler collects “behind the scenes” data such as the length of a page, when a page was created, what network the page resides on, the colors used in a page, and other design elements. It glances down into the subconscious of the Web in hopes of revealing its inherent structure, in order to create new understandings of its technical and social functionalities. Another difference lies in the way the data is presented to the user. The search engine uses algorithms to sort the data according to one theory or another, presenting the user with pages containing a few selected links each. The user is not allowed to see the actual data, only a subset of it, selected and sorted by a computer. The result of an Infome Imager “search” is an image containing all the collected data, potentially a vast amount of information, presented in a way that puts the human brain, not the computer, to work on what it does so well – creating intuitive understandings of large quantities of information.
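The kind of harvesting described above can be sketched in a few lines. This is a hypothetical illustration in Python, not Jevbratt's code: the field names and the choice of metadata (page length, last-modified header, server string, hex colors in the markup) are assumptions standing in for the “behind the scenes” data the text lists.

```python
import re
from urllib.request import urlopen

def harvest(url):
    """Collect surface metadata from a page rather than indexing its words."""
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
        headers = response.headers
    return {
        "url": url,
        "length": len(html),                             # length of the page
        "modified": headers.get("Last-Modified"),        # when the page last changed
        "server": headers.get("Server"),                 # a hint at the hosting setup
        "colors": re.findall(r"#[0-9a-fA-F]{6}", html),  # design elements: hex colors
    }

print(harvest("http://example.com"))
```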

The Infome Imager interface allows the user to manipulate the crawler’s behavior in several ways. The user decides where it should begin crawling: it could, for example, start on a Web page specified by the user, on a page resulting from a search-engine query, or on a random Web page. The crawler can be set to visit a page either once or every time it encounters a link to it. Data resulting from many revisits will create repetitive patterns in the visualization, revealing the linkage structure of the Web sites, while single visits will generate distinct, non-repeating data. A crawl can take many hours, depending on the number of pages to be visited. The activity and the results of the crawler can be accessed from the “manifestations” page, and the visualizations created by the crawling process function as an interface linking to all the sites the crawler visited.
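A minimal sketch of that visit policy, assuming the crawler keeps a frontier of links to follow; `fetch_links` is a stand-in for whatever extracts the links from a fetched page, and the names are illustrative rather than Jevbratt's.

```python
from collections import deque

def crawl(start_url, fetch_links, visit_once=True, max_visits=1000):
    """Walk the link graph, yielding one event per page visit."""
    frontier, seen = deque([start_url]), set()
    visits = 0
    while frontier and visits < max_visits:
        url = frontier.popleft()
        if visit_once and url in seen:
            continue                        # already sampled; skip the revisit
        seen.add(url)
        visits += 1
        yield url                           # the visualization records data here
        frontier.extend(fetch_links(url))   # queue every link found on the page
```

With `visit_once=True` every page contributes one distinct data point; with `visit_once=False`, heavily linked pages recur in proportion to the links pointing at them, which is what makes the linkage structure visible in the image.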

The crawler and data-mapping software that together form the foundation of the Infome Imager were originally developed for the “Mapping the Web Infome” show, exhibited in conjunction with “Lifelike” at New Langton Arts in San Francisco in July 2001.

Link


02.09.08

Uncategorized
Lisa Jevbratt – Mapping the Web Infome

Mapping the Web Infome is a net art endeavor developed in conjunction with the exhibition LifeLike at New Langton Arts gallery in San Francisco. A group of artists were invited to use software developed for the exhibition. The Infome software enables the creation of web crawlers – automatic processes that access web sites and collect data from them – and the creation of visualizations/mappings of the collected data.

Link


02.09.08

Uncategorized
Lisa Jevbratt – 1:1 (2)

1:1 was a project created in 1999, consisting of a database that would eventually contain the addresses of every Web site in the world, and interfaces through which to view and use the database. Crawlers were sent out on the Web to determine whether there was a Web site at a specific numerical address. If a site existed, whether it was accessible to the public or not, the address was stored in the database. However, the Web was changing faster than the database was updated, and by 2001 it was clear that the database was outdated.
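The sampling idea is simple enough to sketch. The following is a speculative reconstruction, not the project's code: the port, timeout, and sample size are all assumptions, and the real crawlers worked through the numerical address space far more systematically.

```python
import random
import socket

def random_address():
    """Pick a numerical (IPv4) address at random."""
    return ".".join(str(random.randint(0, 255)) for _ in range(4))

def has_website(ip, port=80, timeout=1.0):
    """True if anything answers on the Web port at this address."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

# Addresses that respond are the ones that would be stored in the database.
found = [ip for ip in (random_address() for _ in range(50)) if has_website(ip)]
print(found)
```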

1:1 (2) is a continuation of the project, adding a second database of addresses generated in 2001 and 2002, along with interfaces that show and compare the data from both databases.

Link 


02.09.08

Uncategorized
Mapping Transitions

Mapping Transitions was an exhibition curated by Christiane Paul at the University of Colorado. While distinctly different in their approaches, the art projects commissioned for Mapping Transitions are all concerned with the visualization of various forms of data flow and data sets. Both Mary Flanagan’s and Lisa Jevbratt’s projects explore the ‘search’ as an aesthetic form of mapping the Internet. Flanagan’s [search] examines the search engine as a creator of context and meaning by reconfiguring its content in a way that illustrates semantic levels which usually aren’t obvious to the viewer. Displaying the constant stream of questions that users ask the Internet, a stream that ranges from the ridiculous to the sublime, the project creates a topography of Internet users’ interests and a map of the function that the Internet fulfills in people’s daily lives.

On one level, [search] is the unfiltered stream of consciousness that saturates the network at any given moment. Flanagan imposes a filter on this stream by allowing users to select words from the incoming flow of questions, which in turn triggers a search on the chosen term. Yet another layer is provided by a kind of Visual Thesaurus that reveals synonyms for the words selected by the user, enhancing the contextual network of meaning. While there are Web statistics that illustrate what categories of information are most frequently requested by users (entertainment, business, pornography, art), we seldom get a chance to look at the micro-level of the request, the actual question that induces a search. Flanagan’s project uses a layering of filters for the creation of meaning that fluctuates between the randomness and control inherent in the network itself. [search] is geared towards transparency, visualizing what users want from the network as well as the process of the project itself. Meaning is unveiled as a transitory and continuously fluctuating process.

Link


02.09.08

Exhibitions
Database Imaginary

With Database Imaginary, 33 artists take us on an imaginative and subversive ride, using databases to comment on their established uses and to imagine new ones. The term database was only coined in the 1970s, with the rise of automated office procedures, but the 23 projects in this exhibition – which include wooden sculptures, movies and user-generated telephone guides to the local area – deploy databases in imaginative ways to comment on everyday life in the 21st century. Using newly inflected forms of visual display arising from computerized databases, the works raise questions about authorship, agency, audience participation, control and identity.


02.09.08

Exhibitions
John Klima – ecosystm

ecosystm is a real-time representation of global currency fluctuations and volatility, leading global market indexes, and up-to-the-minute weather reports from JFK airport.

Commissioned in 2000 by Zurich Capital Markets, an investment company based in New York, ecosystm takes data ZCM uses every day and repurposes it to drive a 3D environmental simulation that viewers explore using a joystick.

ecosystm consists of flocks of “birds” (each flock representing a country’s currency) and branching “tree” structures (each tree representing a country’s leading market index). As a market index advances, the tree grows new branches. If the index declines, branches begin to fall off the tree. Similarly, a currency’s current value against the dollar is indicated by an increase or decrease in the population of the flock.

The flocks also exhibit certain behavioral patterns determined by the volatility of their currency. Volatility is a common financial measure of how much a value fluctuates over a given period; a currency is considered volatile when its value changes considerably in a short time. In ecosystm, daily volatility determines the territory a flock occupies. If a currency is stable, the flock has an expansive territory and can fly throughout it in a graceful manner. If, however, the currency is volatile, the flock becomes very “excited” and its available territory is considerably reduced in size. In addition, a currency’s daily volatility is compared to its yearly volatility, which in certain cases produces exceptional behaviors. If the daily volatility exceeds twice the yearly volatility, the flock is “hungry” and it “feeds” on its country’s leading market index (as represented by the trees). If the daily volatility exceeds three times the yearly, the flock becomes “aggressive” and attacks a neighboring flock.
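The thresholds in that last paragraph translate directly into code. The sketch below uses the rules as stated (daily volatility above twice the yearly makes a flock “hungry”, above three times makes it “aggressive”); the function names and the territory formula are illustrative, not Klima's implementation.

```python
def flock_state(daily_volatility, yearly_volatility):
    """Map a currency's volatility profile to a flock behavior."""
    if daily_volatility > 3 * yearly_volatility:
        return "aggressive"   # attacks a neighboring flock
    if daily_volatility > 2 * yearly_volatility:
        return "hungry"       # feeds on its country's market-index tree
    return "calm"             # roams its full territory gracefully

def territory_size(daily_volatility, base=100.0):
    """Stable currencies roam widely; volatile ones are confined."""
    return base / (1.0 + daily_volatility)

print(flock_state(0.9, 0.4))   # daily is more than twice yearly -> "hungry"
print(territory_size(0.9))     # higher volatility -> smaller territory
```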

As an added visual element, the current weather conditions at JFK airport determine the “weather” inside ecosystm. Runway visibility and cloud cover directly affect visibility and cloud layering in ecosystm.

Link


02.09.08

Uncategorized
Lynn Hershman – Synthia

Synthia is a virtual character who represents fluctuations in the stock market online. Her behavior is triggered by the most recent stock price information from the NASDAQ, Dow Jones and Russell 2000 indexes, and her mood depends on the atmosphere at the stock exchange. If prices go up, Synthia dances about; if they drop, she sits anxiously at her desk.

Synthia will be displayed on a plasma screen under the bell glass of an electronic ticker tape inspired by Thomas Edison’s design. Edison made a significant improvement to the electric exchange-rate telegraph by printing Wall Street’s quotations on a paper tape in a legible format. Synthia takes it all a step further by personifying the market as an online set of economic patterns. She is a symbol of the symbiotic relationship between the market and people.
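As a toy illustration of the mood logic, consider the sketch below; the index names come from the text, but the averaging and the state names are assumptions.

```python
def synthia_mood(index_changes):
    """Average the day's moves across the indexes and pick an animation state."""
    average = sum(index_changes.values()) / len(index_changes)
    if average > 0:
        return "dances about"
    if average < 0:
        return "sits anxiously at her desk"
    return "idles"

print(synthia_mood({"NASDAQ": +1.2, "Dow Jones": -0.3, "Russell 2000": +0.5}))
```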

Link


02.09.08

Uncategorized
Nancy Patterson – Stock Market Skirt

A blue taffeta and black velvet party dress is displayed on a dressmaker’s mannequin, or ‘Judy,’ located next to a computer and several monitors of varying sizes. In large type, the ticker symbol and price of the stock being tracked march from right to left across the monitor screens as the price is continuously updated. Large white numbers and letters on a blue background (matching the blue of the taffeta skirt) scroll in simulation of the pixel-board displays used to track stock values on a traditional exchange floor.

Perl scripts (running under Linux) extract and analyze stock prices from quote pages on the internet. These values are sent to a program that determines whether to raise or lower the hemline via a stepper motor and a system of cables, weights and pulleys attached to the underside of the skirt. When the stock price rises, the hemline is raised; when the stock price falls, the hemline is lowered.
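A minimal sketch of that price-to-hemline mapping, assuming the stepper driver is exposed as a callable; the gearing constant is hypothetical, and the original scripts were written in Perl rather than Python.

```python
STEPS_PER_DOLLAR = 40  # assumed gearing of the cables, weights and pulleys

def update_hemline(previous_price, current_price, move_stepper):
    """Raise the hem when the stock rises; lower it when the stock falls."""
    delta = current_price - previous_price
    steps = int(delta * STEPS_PER_DOLLAR)
    if steps:
        move_stepper(steps)    # positive = raise, negative = lower
    return current_price       # becomes previous_price on the next quote

last = 102.50
last = update_hemline(last, 103.25, lambda s: print(f"stepper: {s:+d} steps"))
```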

This mediawork also utilizes a webcam to capture and display real-time images of the hemline as it fluctuates. A website simultaneously displays these images as well as the stock market quotes which are controlling the length of the hemline. This site is made available in conjunction with the exhibition of this installation.

Link


02.09.08

Financial
Martin Wattenberg – Smart Money’s Map of the Market

A visualization that allows users to see the performance of hundreds of stocks at once, within a rich context of industry and value information. A new algorithm lets an existing visualization technique, the treemap, scale more effectively. The resulting transparent view of the market has been widely adopted by financial institutions and investors.
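Wattenberg's layout algorithm is more sophisticated than can be shown here; the sketch below is the classic one-level “slice” treemap, included only to show how market capitalizations can tile a rectangle in proportion to their size. The ticker symbols and sizes are made up.

```python
def slice_treemap(items, x, y, w, h, vertical=True):
    """Split a rectangle into strips proportional to each item's size."""
    total = sum(size for _, size in items)
    rects, offset = [], 0.0
    for name, size in items:
        share = size / total
        if vertical:   # slice the rectangle left to right
            rects.append((name, x + offset * w, y, share * w, h))
        else:          # slice it top to bottom
            rects.append((name, x, y + offset * h, w, share * h))
        offset += share
    return rects

for rect in slice_treemap([("AAPL", 50.0), ("MSFT", 30.0), ("IBM", 20.0)],
                          0, 0, 100, 60):
    print(rect)   # (name, x, y, width, height); nest the calls for sectors
```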

Link


02.09.08

Financial
Bradford Paley – TextArc

A TextArc is a visual representation of a text – the entire text (twice!) on a single page. A funny combination of an index, concordance, and summary, it uses the viewer’s eye to help uncover meaning.
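A hedged sketch of the placement idea as it is commonly described: the text runs around a circle, and each distinct word is drawn at the average position of its occurrences, so words used throughout the text land near the center while localized words stay near the rim. The details (averaging unit vectors, the radius) are assumptions for illustration, not Paley's code.

```python
import math
from collections import defaultdict

def textarc_positions(words, radius=1.0):
    """Map each distinct word to a point inside the circle of the text."""
    angles = defaultdict(list)
    for i, word in enumerate(words):
        angles[word].append(2 * math.pi * i / len(words))
    positions = {}
    for word, occurrences in angles.items():
        # Average the unit vectors of the occurrences of this word.
        x = sum(math.cos(a) for a in occurrences) / len(occurrences)
        y = sum(math.sin(a) for a in occurrences) / len(occurrences)
        positions[word] = (radius * x, radius * y)
    return positions

text = "the quick brown fox jumps over the lazy dog the fox".split()
print(textarc_positions(text)["the"])   # frequent word, pulled toward center
```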

Link


02.09.08

Textmapping