
Different resolutions within the arts, sciences and nature


Daniel Buren

(born 1938 in Paris, France)

lives and works in situ. Since 1965 he has used alternating white and coloured stripes, always 8.7 cm wide, and the place where the work is created is measured with the stripe as a common unit of measurement; the stripes placed in the site thus visually perform the same function as the graduations of a ruler.
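
A minimal sketch of the stripe-as-ruler idea, in Python with a hypothetical wall width; the numbers and names are illustrative, not Buren's procedure:

# Sketch: treating Buren's 8.7 cm stripe as a unit of common measurement.
STRIPE_WIDTH_CM = 8.7

def measure_in_stripes(length_cm: float) -> float:
    """Express a length as a number of 8.7 cm stripes."""
    return length_cm / STRIPE_WIDTH_CM

wall_cm = 435.0  # hypothetical wall width
print(f"{wall_cm} cm = {measure_in_stripes(wall_cm):.1f} stripes")  # 50.0 stripes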

source: http://www.cca-kitakyushu.org/project/burendet.html


Tony Conrad

In 1965, American artist Tony Conrad made The Flicker, a film consisting entirely of black and white frames, alternated according to different arrangements or frequencies.

When projected, The Flicker produces a stroboscopic, or flashing, effect that often leads audiences to ‘see’ images or colored motifs. According to Jonas Mekas, the film actually provokes an epileptic attack in one out of every 15,000 people. Conrad, who studied the physiology of the nervous system at Harvard University, invented with this film a new kind of film image, different from the usual narrative or pictorial ones generally put forward in cinema. By implicating the retina rather than sight—that is, by stimulating physiological rather than psychological impressions—the film displaces the centers of reception from the sensorial to the neural. Upon its release, The Flicker stayed confined to New York’s underground scene, only to be recalled in 1999 when its Japanese counterpart, Pokemon, had similarly vibrating images that triggered epileptic fits among hundreds of children.
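
As a rough illustration only (not Conrad's actual patterns), the perceived flash rate of such a film depends on how many frames make up one black/white cycle at the projection rate. A minimal Python sketch with made-up cycle lengths:

# Sketch: flicker frequencies produced by alternating black and white frames.
# The frame rate and cycle lengths are illustrative, not Conrad's scores.
FRAME_RATE = 24.0  # sound-film projection rate, frames per second

def flicker_frequency(frames_per_cycle: int) -> float:
    """Flashes per second when one black/white cycle spans this many frames."""
    return FRAME_RATE / frames_per_cycle

for cycle in (2, 4, 6, 8):  # 2 = strict frame-by-frame alternation
    print(f"{cycle} frames per cycle -> {flicker_frequency(cycle):.1f} Hz")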

We know cinema can deeply impact organisms. The Ludovico therapy that Stanley Kubrick imagines in Clockwork Orange (1971) is a physiological cinema that provokes nausea. Video films may even provoke heart attacks, as imagined in 1998 by the Japanese filmmaker Hideo Nakata in his film Ring. Television, too, may have an effect, such as causing the viewer to fall asleep. This was recorded in an investigation led by journalist Peter Entell ( http://www.filmtube.com ), which suggested that when watching television the brain produces only alpha waves (flat waves devoid of stimuli), whereas cinema and written text generate beta waves (waves that stimulate the mind).

Indeed, The Flicker could be interpreted as a pornographic film, if one considers the theory that orgasm is a form of reflexive epilepsy initiated at the point of excitement. Consider as well the power of imagination and the placebo effect, as confirmed by the American designers RAW when they exhibited water in association with images of sex, chaos, and speed. Observers were to choose the water they preferred, which was then bottled, labeled, and became a fantastic souvenir to take home. The traumatheque of French artists Berdaguer and Péjus functions on a similar scale: infra-visible and infra-audible videos charged with emotion and narrative. They are homeopathic horror films, non-visual snuff movies.

-- Philippe Rahm (translator Nathalie Angles)

source: http://www.ubu.com/sound/conrad.html // http://www.artbrain.org/journal2/rahm.html


Insects' hearing and sound generation

I heard that some crickets produce sounds so slight they operate on only a single molecule of air. Will write back later on...

cricket ears: http://www.erin.utoronto.ca/~w3bio422/install9.htm


Microsound

Below the level of the musical note lies the realm of microsound, of sound particles lasting less than one-tenth of a second. Recent technological advances allow us to probe and manipulate these pinpoints of sound, dissolving the traditional building blocks of music--notes and their intervals--into a more fluid and supple medium. The sensations of point, pulse (series of points), line (tone), and surface (texture) emerge as particle density increases. Sounds coalesce, evaporate, and mutate into other sounds.
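
A minimal granular-synthesis sketch in Python (numpy only; grain duration, densities and pitch range are made-up values) illustrating how isolated sound particles fuse into a texture as density increases:

# Sketch: scattering short sine "grains" into a buffer (granular synthesis).
import numpy as np

SR = 44100            # sample rate in Hz
DUR = 2.0             # output duration in seconds
GRAIN_DUR = 0.03      # 30 ms grains: well below one-tenth of a second

def make_grain(freq: float) -> np.ndarray:
    """A short sine burst shaped by a Hann window (one sound particle)."""
    t = np.arange(int(SR * GRAIN_DUR)) / SR
    return np.sin(2 * np.pi * freq * t) * np.hanning(t.size)

def grain_cloud(density: float, seed: int = 0) -> np.ndarray:
    """Scatter `density` grains per second at random times and pitches."""
    rng = np.random.default_rng(seed)
    out = np.zeros(int(SR * DUR))
    for _ in range(int(density * DUR)):
        g = make_grain(rng.uniform(200.0, 2000.0))
        start = rng.integers(0, out.size - g.size)
        out[start:start + g.size] += g
    return out

sparse = grain_cloud(density=5)    # isolated points and pulses
dense = grain_cloud(density=500)   # grains coalesce into a surface (texture)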

Composers have used theories of microsound in computer music since the 1950s. Distinguished practitioners include Karlheinz Stockhausen and Iannis Xenakis. Today, with the increased interest in computer and electronic music, many young composers and software synthesis developers are exploring its advantages. Covering all aspects of composition with sound particles, Curtis Roads's book Microsound offers composition theory, historical accounts, technical overviews, acoustical experiments, descriptions of musical works, and aesthetic reflections. The book is accompanied by an audio CD of examples.

source: http://www.ihc.ucsb.edu/events/past/oldersite/roads/

material relating to Granular Synthesis:


Time measures

  • Coordinated universal time: The uniform timescale that forms the basis for most civil timekeeping in the world. UTC is based on atomic clocks, such as the one held by the National Physical Laboratory in Teddington, south-west London. Some 32 extra leap seconds have been added to UTC since it was officially adopted in 1972, to account for the fact that the Earth's rotation is gradually slowing down.
  • International atomic time: A statistical timescale mostly used for scientific reference. The Bureau International des Poids et Mesures in Paris sets TAI time by monitoring the regular vibrations of caesium atoms in atomic clocks around the world. Coordinated universal time is generated from TAI by adding leap seconds. TAI is currently 32 seconds ahead of UTC.
  • GPS time: An atomic timescale used by the US global positioning system. When it was set in 1980, GPS time was based on coordinated universal time, but GPS time is now some 13 seconds ahead because it does not count leap seconds. The Russian counterpart, GLONASS, does account for leap seconds, but adding them has caused technical problems. (The offsets quoted here are sketched just after this list.)
  • Greenwich mean time: A time standard established for British navigation in the mid-19th century. GMT has now been officially replaced by coordinated universal time, so Big Ben, the BT speaking clock and the BBC radio pips all mark UTC, not GMT as some people think, although the two are usually very close. British law still refers to GMT because a 1997 bill that tried to update it to UTC was never passed. It ran out of time.
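
A minimal Python sketch of how the offsets quoted above relate, using the values given in the article (circa 2003); the constant and function names are illustrative:

# Sketch: relations between TAI, UTC and GPS time, with the offsets quoted above.
TAI_MINUS_UTC = 32   # seconds: TAI is currently ahead of UTC
GPS_MINUS_UTC = 13   # seconds: GPS time does not count leap seconds added since 1980

def utc_to_tai(utc_seconds: float) -> float:
    """Convert a UTC timestamp (in seconds) to TAI."""
    return utc_seconds + TAI_MINUS_UTC

def utc_to_gps(utc_seconds: float) -> float:
    """Convert a UTC timestamp (in seconds) to GPS time."""
    return utc_seconds + GPS_MINUS_UTC

# The gap between TAI and GPS time follows from the two offsets and stays fixed:
print(TAI_MINUS_UTC - GPS_MINUS_UTC)  # 19 seconds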

source: http://www.guardian.co.uk/uk_news/story/0,3604,985020,00.html

more GPS related projects:

more TAI related projects:


Television standards PAL/NTSC 25/29.97 fps

The most difficult thing to understand for someone who is new to this industry is why we still have fields:

So consider this: fields are a concept inherited from television/video technology history. The initial reason for fields was a technological difficulty, and a clever kludge, fields, was devised to address two engineering issues. In a nutshell, the scan rate was chosen to be in phase with the power-line frequency (60 Hz in America and 50 Hz in Europe), since it was judged that this minimized AC line interference effects in the reproduced picture. The interlace scheme was selected as a fallback compromise because it was the only way to overcome the technical limitations of the forties (hello! this is year 2001). The writers of the initial television standard proposal actually asked for a minimum of 44 full frames per second to avoid flicker problems, but their proposal was rejected as too expensive for the deployment of television in the 40s, and so came the Fabulous Fields...
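
A minimal sketch of the nominal rates involved (the 29.97 in the heading above is the NTSC colour-era 30/1.001 adjustment; the values used are the standard ones):

# Sketch: nominal frame and field rates for interlaced PAL and NTSC video.
PAL_FRAME_RATE = 25.0            # frames per second, tied to 50 Hz mains
NTSC_FRAME_RATE = 30.0 / 1.001   # ~29.97 fps, the colour-era adjustment of 30 fps

def field_rate(frame_rate: float) -> float:
    """Interlaced video draws two fields (odd/even scan lines) per frame."""
    return 2 * frame_rate

print(f"PAL:  {PAL_FRAME_RATE} fps -> {field_rate(PAL_FRAME_RATE):.0f} fields/s")
print(f"NTSC: {NTSC_FRAME_RATE:.2f} fps -> {field_rate(NTSC_FRAME_RATE):.2f} fields/s")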

Since then, many equipment manufacturers have invented all kinds of reasons to make us believe that such a thing is actually a good thing. Bottom line, we are still stuck with it 60 years or so later.

Now, have you ever complained about strobing when watching a film transferred to video? Of course not. So do we need so many frames per second? It depends. One thing forgotten here is that the film camera process has an additional attribute: a shutter. The shutter opening causes motion blur in moving objects, in effect reducing the number of frames our brain requires to achieve fusion ("critical flicker fusion"). There is no such concept in video, since video is by definition timed to 1/60th (or 1/50th) of a second per field. The only way to do shuttering in video is by dropping resolution (for example, dropping a field).
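
A minimal sketch of that shutter argument, assuming a conventional 180-degree film shutter (an illustrative value, not stated above):

# Sketch: exposure time per film frame versus the fixed field period of video.
def film_exposure(frame_rate: float, shutter_angle_deg: float = 180.0) -> float:
    """Seconds of exposure per frame for a rotary shutter of the given angle."""
    return (shutter_angle_deg / 360.0) / frame_rate

print(f"Film, 24 fps, 180-degree shutter: 1/{1 / film_exposure(24):.0f} s exposure")  # 1/48 s
print("NTSC video field period:          1/60 s")
print("PAL video field period:           1/50 s")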

Fields are really a problem when doing visual effects. This is one reason why so many commercials are still shot on film. Fields also look bad on a computer monitor, and they are extremely hard on compressors because they remove local structure from an image. So the truth is that until we can devise digi-opto-electronic imaging devices that integrate a full-frame capture array (buffer), we may be stuck with three solutions: shoot on film, do high-rate video (60 FPS progressive or more), or do a lot of smart processing in post.
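
A minimal sketch (Python with numpy, made-up frame size) of what the two fields of an interlaced frame are, and why dropping one halves the vertical resolution:

# Sketch: splitting an interlaced frame into its two fields (odd and even scan lines).
import numpy as np

frame = np.arange(480 * 720).reshape(480, 720)   # made-up 480-line frame

upper_field = frame[0::2, :]   # even-numbered scan lines
lower_field = frame[1::2, :]   # odd-numbered scan lines

# Keeping only one field (a crude video "shutter") halves the vertical resolution:
print(frame.shape, upper_field.shape)   # (480, 720) (240, 720)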

And as an additional informational note, within the standard video formats (ATSC), the current solution that creates the fewest post-production problems is 60 FPS progressive video. The reason is that at that frame rate "motion blur" starts to become a less important factor, because there are now enough frames per second for your brain to do the work completely. For example, if you look at a frame of a Showscan film (60 FPS), you will see very sharp images with almost no blur. This is exactly why our computer displays run at 70-75 Hz (and not 24, for example). Also note that in a film theater the same frame is actually projected twice (or sometimes three times); that is, there is a shuttering process in the film projection system itself, and in effect the projector's shutter is closed half the time (for one thing, allowing the film to advance to the next frame).
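
A minimal sketch of the double/triple shuttering mentioned above, showing the resulting flash rates:

# Sketch: effective flash rate of a projector that shows each film frame 2 or 3 times.
FILM_FPS = 24

for blades in (2, 3):   # two- or three-bladed shutter
    print(f"{blades} flashes per frame -> {FILM_FPS * blades} Hz flash rate")
# 48 Hz and 72 Hz: in the same region as the 70-75 Hz refresh of computer displays.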

So, bottom line: whether we like it or not, we have to deal with it. This is about opening another track to let you deal with fields in a way where, hopefully, they come into play only at the very end of your creative thinking process, and help you squeeze the most quality out of your digital video camcorders.

source: http://www.revisionfx.com/

Steina and Woody Vasulka's Arts and Science Laboratory is a great resource for artistic work that explores the technical foundations described above:


Physiology

  • cardiac muscle (frequency in the range of 0.8 - 2.5 Hz)
  • breathing (frequency in the range of 0.2 - 0.5 Hz; see the sketch after this list)
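
A minimal Python sketch converting the ranges above into the more familiar per-minute figures:

# Sketch: converting the physiological frequency ranges above from Hz to events per minute.
def hz_to_per_minute(hz: float) -> float:
    return hz * 60.0

print(f"cardiac muscle: {hz_to_per_minute(0.8):.0f} - {hz_to_per_minute(2.5):.0f} beats per minute")
print(f"breathing:      {hz_to_per_minute(0.2):.0f} - {hz_to_per_minute(0.5):.0f} breaths per minute")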


-- HiazHhzz - 28 Apr 2003