Biomimeticx2

 

This project was carried out on the premises of DiNaCon (Koh Lon, Thailand), on Harakka Island, and at the Maj & Thor Nestling Foundation pop-up workspace (Helsinki, Finland).

A biomimetic case discovered by Biomimeticx2 (Päivi Maunu and Marko Nykänen) at DiNaCon, Koh Lon, June 2018

 

We observed the flying tropical carpenter bee (Xylocopa latipes), which displays a prominent phenotype with shiny, metallic bluish-green wings. The ultrastructure of the wings provides vital opportunities for the insect to persist successfully within its niche. This ultrastructure comprises a meshwork of intercalating, longitudinally and horizontally traversing cytoskeletal fibers, which provide structurally ingenious and efficient solutions for survival in a most challenging climate and environment.

The refined ultrastructure of the wing is analogous to the ultrastructural cytoarchitecture of the brush border (i.e. the terminal web) in the apical part of the human airway epithelial cell, where a comparable meshwork of intercalating cytoskeletal fibers occurs. On the basis of these ultrastructural specifics, we were able to compose a biomimetic solution in which this crucial cytoarchitecture enables efficacious survival both for the tropical carpenter bee and for the human epithelial cell.

However, architectural structures built by humans, such as the pillars of Greek temples, have not persisted in extreme natural conditions as efficaciously as the pillar cell of the organ of Corti in the human inner ear, a cell able to endure intense vibrational forces within the cochlea. The pillar cell also shares homologous and analogous ultrastructural cytoarchitecture with the human epithelial brush border and with the cytoskeletal meshwork in the wings of Xylocopa. Note the similar pattern of ultrastructural cytoskeletal elements in the micrographs captured by transmission electron microscopy (the counterpart images).

 

Human cultural evolution has not been able to produce architecture as refined, flexible, and durable as natural evolution has. The biomimetic solution composed here thus provides an intriguing and encouraging option for tackling more fruitfully the dramatic challenges associated with global climate change, e.g. superstorms.

 

 

The source image displays the tropical carpenter bee (Xylocopa latipes).

 

The dorsal part of the insect's left wing is shown in the magnified view.

 

The counterpart figure portrays vertically traversing cytoskeletal microfibers associated with the cytoskeletal meshwork (the terminal web, i.e. brush border) in the apical part of a human airway epithelial cell (pseudocolored transmission electron micrograph of a freeze-fractured replica coated with platinum/carbon; magnification ×120,000).

 

The source image shows a pillar at the Temple of Olympian Zeus (Athens, Greece) and its counterpart, the pillar cell, found in the organ of Corti of the human inner-ear cochlea. The pillar cell has a giant desmosome analogous to the helix below the architrave of the temple pillar, and the flutes of the pillar bear a visual resemblance to the vertically traversing cytoskeletal fibers in the pillar cell. The scale bars reveal a remarkable discrepancy, expressed here in kilometers: the pillar cell of the organ of Corti is over 330,000 times shorter and a million times thinner than the temple pillar (transmission electron micrograph, magnification ×90,000).
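The stated size ratios can be sanity-checked with rough dimensions. The column height and width below follow commonly published figures for the Temple of Olympian Zeus; the pillar-cell dimensions are illustrative assumptions, not measurements taken from the micrograph:

```python
# Rough sanity check of the pillar-vs-pillar-cell scale ratios.
# All dimensions are assumed/illustrative values.
column_height_m = 17.0    # temple column, ~17 m tall
column_width_m = 1.7      # column diameter, ~1.7 m
cell_length_m = 50e-6     # pillar cell length, assumed ~50 µm
cell_width_m = 1.7e-6     # pillar cell width, assumed ~1.7 µm

length_ratio = column_height_m / cell_length_m
width_ratio = column_width_m / cell_width_m
print(f"{length_ratio:,.0f} times shorter")  # 340,000 times shorter
print(f"{width_ratio:,.0f} times thinner")   # 1,000,000 times thinner
```

With these assumed dimensions, the arithmetic lands close to the "over 330,000 times shorter and a million times thinner" figures in the caption.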

 

 

 

Harakka Island is displayed in winter. The macroworld image pairs (i.e. Koh Lon in summer and Harakka in the opposite season) analogously mirror the extremes of scale depicted between the image pairs (the source and counterpart images).

 

Still in late March, it was possible to walk on the intact ice from the mainland to the island…

 

… However, by April the ice was melting away, isolating the island.

 

Flyer for the DiNaCon Extension, a visual art exhibition on Harakka Island.

 

The visual art exhibition, DiNaCon Extension, was arranged June 12–17, immediately after our flight back from Thailand landed in Helsinki very early in the morning.


Snapshots from DiNaCon


ImmerSea: Digital Naturalism Conference Field Report

Subversive Submersibles, water-adapted augmented reality 

Oya Damla, Kira deCoudres, Adam Zaretsky, Ryan Cotham

Abstract – ImmerSea: Subversive Submersibles is water-adapted augmented reality (AR). The project included aquatic AR goggles, immersive AR environments and AR submersibles.

ImmerSea: Subversive Submersibles comprises installation experiments involving creative, real-time superimposition of re-mashed audiovisual overlays onto everyday audiovisual and other sense data, for experience alteration and tabulation of reactions. The experiments focused on (1) Eye Candy Disruptors, (2) EcoSensual Synesthesia and (3) Gamification of Risk. Centered on our pop-up sustainable miniature golf course, we designed transmissions and communicated odd augmentations to our immersed and submerged co-artists in the Andaman Sea.

Field Report: IMMERSEA: SUBVERSIVE SUBMERSIBLES DETAILS

We had wonderful experiences doubling our sound and vision under the sea. Kira, Oya, Adam and Ryan formed a temporary collective of multimedia artists, bioartists, psycho-geographers, biologists and free thinkers who collaborated in the ImmerSea: Subversive Submersibles node at the Digital Naturalism Conference. We manipulated live feed footage from environmental settings. We designed and built submerged interactive/immersive sound and video wearables for architecturally encasing art installations using site-specific sonic and tactile sources. We ran experiments in psychological experience and altered states of semi-consciousness. We dedicated our time on the island to rugged prototyping, allowing for flexibility that utilized the ready-made technics we brought, borrowed and found. We incorporated naturally occurring sounds from the acoustic environment of Koh Lon and built them into our novel device-laden suits.

In the mists of a lush tropical island environment, we composed six-channel, 'real' surround sound for zorb ball immersive experiences and Video Jockey creative misuse of AR software explorations for a hand-crafted underwater iPad floatie suit. We detuned, crashed out and mashed up with our sound art, performance art and mad interdisciplinary sciart skillz in the DinaCon Cone of Tropical Geekdom (DCCTG). Underscoring our critical inquiry into behavioral, cognitive and queer studies, while relying on our own brand of fringe philosophy, we unpack for you here our experimental designs and the resultant psychoacoustic/psychogeographical analysis of augmented underwater experience.

Yes, we designed submerged aquatic AR environments and underwater AR submergible wearables. ImmerSea: Subversive Submersibles made installations of DIY bio-body art experimentation by building mediated exoskeletons and getting in them, in the water. We were receiving creative, real-time and re-mashed audiovisual overlays while in and on the sea. We had that experience of alteration that mocks, and yet joins in, screen hypnosis through gadget love/hate relations. In this field report we include our own qualitative tabulation of reactions from the audio visual occlusion front of mixed reality superimposition. Working from the vantage points of artistic self-experimentation, tech-no-masochism and begrudgingly admitted tech-titillation, the following is our lab book.

What is Hydro Immersive Augmented Reality (HIAR) to Us?

Augmented Reality is a super-imposition of digital media onto the natural world for artificial intimacy. We study Computer aided User Deprogramming Experience (UDX) as well as push-advertisement styled media coercion. Is immersion in the sea a way towards more easily accessed critique of a common mass obsession? Due to the changing state of the aqueous mind, an undersea, layered-on disruptor of immersive relaxation may be the ticket towards a reveal of the prosthesis between the blinds of both inner and outer worlds. In other words, we theorize that Underwater AR may provide evidence of a cognitive glitch implicit in consciousness (i.e. equating the needless stressor blow out anomie to the thrill of mediated stim seduction excess). Does the fall of alienation through a leisurely addition of social entertainment screens open us to gamified virtual vacation communication as utter surrender?

“We lived once in a world where the realm of the imaginary was governed by the mirror, by dividing one into two, by theatre, by otherness and alienation. Today that realm is the realm of the screen, of interfaces and duplication, of contiguity and networks. All our machines are screens, and the interactivity of humans has been replaced by the interactivity of screens.  … We draw ever closer to the surface of the screen; our gaze is, as it were, strewn across the image. We no longer have the spectator’s distance from the stage — all theatrical conventions are gone. That we fall so easily into the screen’s coma of the imagination is due to the fact that the screen presents a perpetual void that we are invited to fill. Proxemics of images: promiscuity of images: tactile pornography of images. Yet the image is always light years away. It is invariably a tele-image — an image located at a very special kind of distance which can only be described as unbridgeable by the body.  … There is no ambiguity in the traditional relationship between man and machine: the worker is always, in a way, a stranger to the machine he operates, and alienated by it. But at least he retains the precious status of alienated man. The new technologies, with their new machines, new images and interactive screens, do not alienate me. Rather, they form an integrated circuit with me.”

– From Xerox and Infinity, Jean Baudrillard / Translated by James Benedict

This essay was originally published as part of Jean Baudrillard's "La transparence du mal: Essai sur les phénomènes extrêmes" (1990), translated into English in 1993 as "The Transparency of Evil: Essays on Extreme Phenomena", http://insomnia.ac/essays/xerox_and_infinity/

That the near entirety of living human animal populations could be bought off so succinctly with a two thumbed, LCD touch sensitive, tool-being hand to eye sized bauble that bleeps intermittent rewards is a tribute to the era of Homo Rechargerus and the addictive nature of conditioning. Regardless of however trite the reward (the bleep for bleep’s sake or the glow brightness itself as iHearth), regardless how onerous the odds, regardless of how guaranteed the eventual amnesiac economy of loss, this is tech that offers satiation on the installment plan and hence satisfaction for many.

Simply by cataloging the percentage of time spent recharging our handhelds we can tabulate the incremental toll, the hemorrhagic loss of service that our screens take. The moth-like to the light of the screen identity is a relic of the TV years. Identity has now gone beyond the CRT beam hook into the infinite line of LCD glow scrolling (touch sensitive screen) and the audio intermittent reward of Pavlovian ring tones and alerts. The ‘you’ve got mail’ instaSnap Limited Interactivity Extended Reality (ISLIXR) of AR superimposition is the clicker training of the commons.

How and Why are We Using AR?

This project engages with topics of behavioral immersion by connecting environmental and experiential bio-data-tics across human, animal, and technological populations. Here, media manipulation serves to make post-environmental and post-biological post-truth information experientially available, stimulating curiosity and interest previously inaccessible to post-programmed populations (the human, post-human and a-human organismic masses). Initializing User Deprogramming Experience (UDX) for Mashup Remix Mixed Reality (MRMR), we present third- or fourth-level irony in a time where Kitsch has fallen flat. The layers of bullshit detectable have upped the delayed ante on Critique. We are still trying to favor pop-regurg-a-purge hypnogogic options over standard immersion. Into which product orientation seminars shall we build satire? What exactly has been left un-gamified? What target has the naturalism to uncover, to reveal, as opposed to make appear less world wide cobwebby?

Our three experimental designs are titled: Eye Candy Disruptor, EcoSensual Synesthesia and Gamification of Risk. This report includes the materials, methods and results of these three experiments. It is our hope, in the spirit of the Digital Naturalism Open Source Public Lab Aesthetics (DINAOSPLA), that the art over data ratio (A/d) will be equal to or greater than one. This is of course plus or minus a 3% negsanguineous margin of error for the duration of our unstill life studies.

Immersea Experiment One: Eye Candy Disruptors


Goals: Through the mediatization of audiovisual fields with Live VJ Blipvert AR in a wearable AR snorkeling suit, we monitor Exposure Therapy to virtual/physical superimposition. The attempt to approach some semblance of mediated saturation, beyond both utility and entertainment, was our goal. We were able to modulate variables of visual time alterity and applied repetition media regimes. This conforms to our intention to study the effects of the artistic taking over of the augmented visual and audio field with more Crap than is already available while swimming in the open sea.

Materials: DIY Water Proof iPad, wearable AR snorkeling suit

Methods: Much of our 'data' is based on pre-, present- and post-experiment surveys of human performance volunteers focusing on Annoyance, Habituation and Recall as quantitative, qualitative and catatonicative data sets. Qualitatively, we used an all-orificial data set including questionnaires for tabulation. This entails a synopsis of amalgamated qualitative data from all 11-13 of the major collection points or body portals of our voluntary human arts subjects. This survey data was compared to motion analysis of body-language-tracking video documentation. Use of binaural new age meditation audio with 31% translucent clouds as a control was meant to approximate degree-zero samsara experiential authenticity in order to compare the subject reactions to experiential, repetitive, strobic, vaporwavicle, remash noise ratios.

 

 

Results: People had fun! People clamored to be inside the contraption. People craved to swim with a screen in their face, to snorkel in the most lovely sea while looking through a busily augmented screen full of touch-sensitive 3D emojis[1], first-person zombies[2] to shoot and three-dimensional glitter drawings tracked by space and head movement[3]. The idea was that snorkeling through schools of fish or peeking at anemones in coral reefs would be reason enough to dispose of the screen. We were wrong: most people want to augment and peer through the screen as often as they can. Instead of non-virtual fidelity, celibate edutainment is the norm.

Future Research: We hope future experiments will include real-time monitored reactions to live blipvert wireless VJ remashed madvertisements. Superimpose motion-tracked data collection with specific experimental brain-wave-stimulating binaural-sextanaural new age automated samsara control variables, and we may score that mind control research grant we always dreamed of. The plan for now is to expand our repertoire into a wider manipulative media range through: percentage of audiovisual augmentation (loudness and field of superimposed vision), speed of editing/jumpcutting, and strobic/distorted sonic and signal-to-noise ratios, which should be automated to increase whole-brain de-synchronicity and imprint vulnerability. We are still looking for something like real-time giphy world 3D sculpt brush AR jockey software for beaming real time to others in the underwater AR screen spectator faceplant worlding. Perhaps that is something we could seek collaboration for building.

Immersea Experiment Two: EcoSensual Synesthesia

Goals: We are focused on smell, taste, CNS nerve stimulating (touch) and proprioception for this experiment. We created an inhabitable instrument–the semi-submerged wireless human augmented reality zorb ball interface (See image + caption) — to measure sonic arousal in a spherical 360 degree sextanaural sound space.

Materials: We created an inhabitable instrument — the semi-submerged wireless human augmented reality sextanaural zorb ball interface. This included a zorb ball and six waterproof portable MP3 players with six MicroSD cards synched to play our six channel compositions. We composed several six track digital audio compositions. These included six tracks to 6 speakers in the Zorb Ball (Front, Back, Up, Down, Left, Right), as well as experimenting with binaural standing waves in full six channel dimensionality (hence sextanaural sonic space) in a closed and yet N degree of cognitive freedom user movable space.
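The binaural element of the materials above can be sketched minimally: two sine tones, one per stereo channel, offset by the desired beat frequency, so the listener perceives a beat at the difference frequency. The 220 Hz carrier and 10 Hz offset are placeholder values, not frequencies taken from the actual compositions:

```python
import numpy as np

def binaural_beat(carrier_hz=220.0, beat_hz=10.0,
                  seconds=5.0, rate=44100):
    """Return a stereo buffer whose left and right channels are
    detuned by beat_hz, producing a perceived binaural beat."""
    t = np.arange(int(seconds * rate)) / rate
    left = np.sin(2 * np.pi * carrier_hz * t)
    right = np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)
    return np.column_stack([left, right])  # shape (samples, 2)

stereo = binaural_beat()
print(stereo.shape)  # (220500, 2)
```

Written to a WAV file and played over stereo headphones, the two channels interact as the footnoted Monroe Institute literature describes.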

Methods: In order to transmit the sensual sounds of the island into a semi-submerged human augmented reality zorb ball interface, we positioned the waterproof speakers fore, aft, port, starboard, stern and bow. The six-channel sonic compositions are available online, both mixed and as the original six directionally distinct tracks, made from generated, remashed and digital audio field recordings of the local Koh Lon island ecosphere soundscapes.

You can try this yourself! “Music For Zorb Balls” is the result of a Digital Naturalism Conference Project, ImmerSea: Subversive Submersibles. An experiment in sext-o-phonic sound (6-channel 3D omni-directional audio), “Music for Zorb Balls” documents audio tracks created for an immersive media-tastic soundscape while in an inflatable bubble on water. For our purposes, we assigned each directional track to six separate wireless Bluetooth speakers attached to points inside the Zorb ball. The audio can be streamed through any audio-playing device: iPhones, iPods, iPads, iAmASpeaker, Car Stereo Speakers, etc. Try duct taping pillows to friends and slipping their phones on loop into the pillow cases and then modern dance moshing with them while connected to their feet and hands with industrial rubber bands. This may give you the interactive UDX experience that it takes to feel the interface.

The album is free for public access and includes each artist’s contribution to the interactive sound installation. For public replication of this experiment, download each directionally determined track and assign each to individual speaker outputs for the best, most disorienting experience.
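The track-to-speaker assignment for replicating the experiment can be sketched as follows. The speaker order matches the Front/Back/Up/Down/Left/Right assignment described above; the six test tones are hypothetical stand-ins for the actual downloaded tracks:

```python
import numpy as np

SPEAKER_ORDER = ["front", "back", "up", "down", "left", "right"]

def mix_to_six_channels(tracks):
    """Stack six mono tracks into one 6-channel buffer,
    one channel per zorb-ball speaker position."""
    missing = [p for p in SPEAKER_ORDER if p not in tracks]
    if missing:
        raise ValueError(f"missing tracks for: {missing}")
    n = min(len(tracks[p]) for p in SPEAKER_ORDER)  # trim to shortest track
    return np.column_stack([tracks[p][:n] for p in SPEAKER_ORDER])

# stand-in tracks: one second of a distinct test tone per direction
t = np.linspace(0, 1, 44100, endpoint=False)
tracks = {p: np.sin(2 * np.pi * 220 * (i + 1) * t)
          for i, p in enumerate(SPEAKER_ORDER)}
buffer = mix_to_six_channels(tracks)
print(buffer.shape)  # (44100, 6)
```

Each column of the resulting buffer can then be routed to one of the six speakers (or, in the pillow version, to one friend's phone).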

To download the full album from Bandcamp, select “Buy the Full Digital Album” but rest assured, the album and all tracks are free! Name your price at $0 (or more!). To download only specific tracks from the album, select the track(s) of choice and select “Buy Digital Track”. FREE. Enjoy!

https://immerseadinacon.bandcamp.com/album/music-for-zorb-balls

The system of zorb-bodies conformed to the human hamster wheel XYZ audio architecture simplot. Our research subjects floated spherically on the waves of an immersive interface. This six-channel Zorb space is semi-submerged, with user-controlled motion and multi-sense intensities. This is implicit in most full-immersion, media-heavy, seafaring globes of transparent inhabitation. Our interest in binaural sounds stemmed directly from researching the CIA-funded Monroe Institute[4] declassified documents.

Results:

“Being a primary experiment subject contained within the Zorb Ball fulfilled many fetal fantasies of being inside of a techNO-FEAR PROTECTO-SPHERE. Surprisingly, this enclosed experience externalized dissociation in a most dreamy, oxygen-deprived way. The sun beaming down on the Andaman Sea quickly steamed the dream bubble, creating a sauna effect. This, coupled with sonic bombardment from all directions (up, down, left, right, front, back) was immediately disorienting and yet, cacophonously comforting due to cultural familiarity with over-commercialization of experience. It was a clumsy kinetic learning-curve to walk on water, with big Jesus-like footsteps to fill. This tumbling was quite entertaining and meditatively exhausting due to the amount of energy exertion required to move an immeasurable amount in any direction. This cyclical going-nowhere and treading on water must be what hamsters and laborers trapped in capitalist loops feel like. In conclusion, this inverse rebirth was enlightening and recommended for anyone in search of an affixiatingly intense psychophysical re-alignment through mediated reprogramming.”

– Kira deCoudres.

Future Research: We would like to experiment with waterproof, six-channel, acoustic contact microphones that use streaming wifi. Live streaming from six sonic on-ground (or hydrophonic 'in-water') contact sites of interest would be fun and analog enough to have a warming mediatationed effect. With the right psychogeographically positioned field station, we could include instrumentation used to collect data that measures arousal (for instance, a penile or clitoral plethysmograph, an electromyograph for rhythm contraction data, a Doppler velocimetry penis cuff/sleeve, etc.). But there is a limit to the measurement devices available for five-kingdoms-wide, cross-species, real-time arousal. These interspecies types of inclusive all-phylum arousal economies of fetish measurement devices can be prescient techno-predictors of both the chemistry and early reporter weather predictions[5]. Orgonomic coefficients coincident with ecological forecasting may yet prove that incorporating the sensual psychophysiology arts into climate change studies is a novel use value for our currently sparse interspecies, Kinseyesque, sexual response data and… correlated to climate change… we have even more reasons to cum[6].

Immersea Experiment Three: Gamification of Risk

Goals: This experiment is about what level of virtual overtaking of the ‘actual’ is possible by sudden and incremental alarming risk suggestion. We are looking at demonstrating extreme what ifs, neurotic paranoia and underwater fear factors to induce panic. This is to see if reliance on augmentation is such an addictive autopilot that users are capable of downgrading risk assessment or if danger as a concept can be subsumed by the multisensorial eye candy and responsive ISLIXR HIAR environments.

Materials: Zombie Apocalypse AR software, fake shark fin hat, fear itself, and general tropical sea/mangrove/rain-forest fear-mongering information sessions for players.

Methods: The intention is to compare notes on reactions to panic-attack-inducing threatcasts, with or without augmented reality. Working with the generation of psychotic states, or schizogenesis, we combine overlaying the audiovisual experiences of emergencies with environments that are actually dangerous, or convivially able to be perceived as dangerous, such as the Andaman Sea. So, this is predominantly a genre of overlaid media to be explored while at risk of shark attack, drowning, unusual currents, poisonous fish, stinging jellyfish and sea gypsy pirates of the Malay Peninsula. Immersea experimental volunteers were subjected to a media barrage of flashing lights, loud voices giving orders, mass-media-styled endless emergency warnings, faux shark attack and crashed, impeded, glitched audiovisual environments.

Results: Actual dangers encountered were: strong currents that nearly drowned us, coconuts falling from the sky, jellyfish stings and oxygen depletion in the Zorb ball, as well as AR zombies everywhere. There was a distinct inability for the volunteers to conceptualize a difference between the Gamified Eye Candy and the emulation of mind-control fascist depersonalization indoctrination cult tactics. It all seemed like an enjoyable and playful adventure, until the air started running out or the drowning feelings started. At that point, the virtual became secondary to survival, so there is still much to learn before we overcome that urge to stay alive outside of the MRMR XR.

Future Research: The superimposition of a second, generated or algorithmically filtered version of your sense data accruement through your inborn and falsifiably inept orifice input-output economy may be safer than separating the two on competing screens (LCD/Flesh). Commercially available screens are brighter than the in-born filtered (i.e. eyes or nostrils) sense data of the everyday world. Interestingly, we found that digital LCD camera flowthrough screens that are not semi-transparent (like phones or tablets) may be of a higher bodily harm risk than mixed reality helmets or headgear. Semi-immersive gadgets are smaller, brighter, louder, more vibratory and more penile than immersive headgear or wearable environments. Comparing risk assessment artifacts and actual risk of the handheld gadget versus the wearable gadget may be worth looking into further. It does seem that any situation other than life endangering impact crashes or other forms of sudden, potentially fatal, mortal ruin are considered to be most often ignorable in or out of augmented reality. On the other hand, even long, drawn out, miserable pain itself can plausibly always be heightened by the digital.

A final Gamified Risk analysis involves the cognitive effect of the amplification of technological pain along with analog pain. This is a question of the rehabituation of illusory meta-pain in dialogical interplay with brute force. The competition between immersive digital torment and screen based digital torment in relation to bodily (technologically unattached) torment is a Department of Defense (DoD) issue worth pursuing[7]. Perception of risk is a fun and interesting area to analyze when comparing distinct media to the cognitive conception of the unmediated Central Nervous System (CNS). The real money is probably in how much poor risk assessment behavior can be coordinated through UDX AR MRMR DoD & ISLIXR CNS (see Terms Key below) as opposed to other kinds of mind control in terms of fully trained assets and glitching enemy assets.

Sustainable Mini Golf

As a central node for ImmerSea: Subversive Submersibles at Dinacon, we spearheaded the first-ever Sustainable Miniature Golf Course. Nearly on the equator, this might start an equatorial fad, especially with a Biodiversity Banking Theme!

Miniature golf is great low-impact fun especially if it is sourced from local and sustainable materials. Our node hosted a social outpost made from crafted natural golf balls (baby casaba), DIY golf clubs (bamboo and brain coral and conch shells, tied together with coconut twine) and temporary redesign of loosely themed nature. Equatorial Sustainable Miniature Golf Courses are a perfect interface to teach-in about biodiversity, human gene editing and bioethics.

The ImmerSea: Subversive Submersibles Equatorial Biodiversity Bank Themed Sustainable Miniature Golf Course was a home base for our node as well as the source-mixing and wireless-beaming station for our remixed Immersea databank. Beginning with local Koh Lon mangrove, rainforest and coral reef biomes, this minimal-impact portable eco-minigolf course houses the qualitative and quantitative ImmerSea: Subversive Submersibles library of biodiversity. The course is also the AR-VJ immersive transmission station to our underwater ImmerSea: Subversive Submersibles human subjects. Our video, audio, photo and written collection of ImmerSea: Subversive Submersibles biodiversity notes has been kept in a waterproof lab book; notes and data are available by request. We beam our tracks for output into the ImmerSea: Subversive Submersibles Biodiversity Bank Themed Sustainable Miniature Golf Library, which functions as a VJ home base for UDX AR MRMR ISLIXR underwater and semi-submerged transmissions.

Let’s talk about how the genes for the natural excesses of exuberant traits in the tropics might be pasted into the human genome. The Public Lab Book includes a tabulation and interpretation of Natural Excesses, Exuberant Traits and BioBanks for Future Human/Nonhuman Germline Hybrid Experiments in Human Inherited Genetic Modification. What do you think the range of possible future bodies is? Which GMO humans should be included in the Transgenic Human Genome Project (THGP)? What can the biodiversity of Koh Lon and the Andaman Sea tell us about the range of potentials, enhancements and options for future human anatomies?

Future of Artistic Research on the Interspecies Physiology of Immersive ArtSci Environments

This is an emotionally deep research process compared to the usual data-recording pilot studies. Instead of using the general facial recognition software, eye tracking, blink assessment and live cortisol level monitoring (without a funding dollar spent on gestural, full-embodied or any lower-body orificial economy explorations), we would like to propose artistic research to amass fuller-than-standard physiological data spacescapings. Contiguous with our biomediated-popsyche-ecosensual experiential developmental theme, we are striving for an all-body-portal accumulative database. Future research will include anal-tensegrity monitoring as a part of any full physiological observation of our (w)hole organism [assuming this is an organism with an anus]. Along with gene expression patterning over time (including multi-generational epigenetic environmental effects) and further collection of tangible and intangible objects from our subcognitive focus groups, we intend to use AI to data-mine our UX bioinformatics data swarms to provide further MRMR iterative, permutative and pseudo-random artistic nodes for future studies in Artistic Research on the Interspecies Physiology of Immersive DINAOSPLA ISLIXR HIAR ArtSci Environments.

TERMS KEY:

  •  AV = Audio Visual
  •  AR = Augmented Reality
  •  HIAR = Hydro Immersive Augmented Reality
  •  VR = Virtual Reality
  •  VJ = Video Jockey
  •  XR = Extended Reality
  •  ISLIXR = instaSnap Limited Interactivity Extended Reality
  •  UX = User Experience
  •  UDX = User Deprogramming Experience
  •  DCCTG = DinaCon Cone of Tropical Geekdom
  •  UI = User Interface
  •  MR = Mixed Reality
  •  MRMR = Mashup Remix Mixed Reality
  •  DoD CNS = Department of Defense Central Nervous System
  •  DINAOSPLA = Digital Naturalism Open Source Public Lab Aesthetics
  •  A/d = art over data ratio

Thanks to Andy and Tasneem for all the organizational skills and joy of DINACON. We did independent DIY research and fun, wild hacking together, with resultant novel instrumentation for alternate realities. Thanks also to Pat Pataranutaporn and Werasak Surareungchai of the Freak Lab https://freaklab.org KMUTT for generous support and for hosting us for a review, presentation and exhibition of our Artistic Research on the Interspecies Physiology of Immersive ArtSci Environments through Eye Candy Disruptor, EcoSensual Synesthesia and Gamification of Risk. Additional thanks to Tentacles Gallery, Bangkok, Thailand. Thanks to Praba Pilar for designing the initial Underwater AR prototype at our Woodstock grid-free residency in preparation for the NO!!!BOT performances at Grace Space in NYC, May 2018. https://www.prabapilar.com/events/nobotnyc

[1] the 3D AR software Giphy World https://giphy.com/apps/giphyworld

[2] ARZombi iOS11/ARKit game https://www.arzombigame.com/

[3] World Brush https://worldbrush.net/

[4] For instance, Binaural Beats and the Regulation of Arousal Levels by F. Holmes Atwater, BA (Hemi-Sync Journal, Vol. I, Nos. 1 & 2, Winter-Spring 2009) https://www.monroeinstitute.org/article/3002 More information: The Monroe Institute, binaural PTSD recovery and astral projection for military remote viewing and CIA mind control. P.O. Box 505, Lovingston, VA 22949. Founded and directed by Robert Monroe from 1974 until his death in 1995, the Institute held classified contracts with the U.S. Army Intelligence & Security Command (INSCOM) on orders of Gen. Albert Stubblebine. The Institute studied their hemi-sync techniques to see if they could enhance soldiers' performance and concentration. (Emerson, Steven, Secret Warriors, G.P. Putnam's Sons, 1988, pp. 103-4) The primary area of research at the Monroe Institute involves using a binaural beat to cause different psychological effects. A binaural beat is created by using stereo headphones with each speaker emitting a slightly different frequency. The result is a tone at the frequency between the two, which allegedly causes the brain to "entrain" on the frequency, i.e. the brain waves regulate themselves to the same frequency. The National Research Council evaluated the Institute's claims that the method could be used to improve learning. (National Research Council, Enhancing Human Performance, National Academy of Sciences, 1988, pp. 111-4) "..located near Charlottesville, Virginia. Bob Monroe, author of many books on Out of Body experiences, has long and close ties with the C.I.A. James Monroe, Bob's father, if I'm not mistaken, was involved with the Human Ecology Society, a C.I.A. front organization of the late 50's and 60's. The Monroe Institute has done research on accelerated learning and foreign language learning through the use of altered states of consciousness for the C.I.A. and other government organizations. Government interest in the more radical research going on at the institute remains only tantalizing speculation. Official classified document storage boxes have been seen at their mail-order outlet located in Lovingston, VA." – Porter, Tom, Government Research into ESP & Mind Control, March 1996. See also "The Monroe Institute's Hemisync Process" (Document Type: CREST; Collection: STARGATE; declassified December 4, 1998; CIA-RDP96-00788R001700210004-8.pdf, approved for release 2003/09/10): "Hemisync is a patented auditory guidance system which is said to employ the use of sound pulses to induce a frequency following response (FFR) in the human brain. It is reported that the Hemisync process can heighten selected awareness and performance while creating a relaxed state. Hemisync is more than this however, and an extensive evaluation is warranted."

[5] This is why we are seeking an antianthropocentric, anthropo-scenic ecologist/ meteorologist device developer to join our team. Please contact [email protected] if you are interested in applying.

[6] See: http://sexecology.org/ ,  https://www.fuckforforest.org

[7] Neurological Manifestations Among US Government Personnel Reporting Directional Audible and Sensory Phenomena in Havana, Cuba, by Randel L. Swanson II, DO, PhD; Stephen Hampton, MD; Judith Green-McKenzie, MD, MPH; et al.

Wild Behavior – Jonathan Gill

My project was to develop a low-cost, open-source platform for testing the perceptual and cognitive abilities of animals in the wild. As a behavioral and computational neuroscientist, I design experiments and novel technologies to uncover and decode how perceptions guide actions in humans and animals. At Dinacon, I began to create a platform capable of precise stimulus delivery (e.g. sounds and lights in a multimodal game for treats), identification of animal participants (a sound/photo fingerprint), and wireless networking for the collection and sharing of data. The goal of this project is to unite DIY engineering with laboratory neuroscience/psychology to enable an open platform for “field neuroscience”.

With these goals in mind, Wild Behavior was born!

The general idea can be thought of as a “rodent arcade game”, where animals can approach a machine and get treats, or time running on a wheel, in exchange for participating in a simple game. The key is that their choices in the game can tell us about how well each animal can distinguish different sound frequencies (like a hearing test), or how well they can remember the order of different lights and sounds (like a memory game).

To do this I assembled a device from inexpensive off-the-shelf components (listed in the table below) that could be battery-powered for use outside on the island.

Island rodents respond to the stimuli either by sticking their nose across a beam-break or by licking a tube that dispenses tasty liquid if they make the right choice. To start with, the device plays the game “only respond when I play a certain sound or flash a certain light” (programmed using an Arduino), then progresses to more complicated games if the animal is doing well.
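The core trial logic is simple enough to sketch. Below is a minimal, hypothetical Python version of the go/no-go game described above; the actual device runs on an Arduino, and the stimulus names and outcome labels here are illustrative assumptions, not the project's real code.

```python
import random

# Hypothetical sketch of the go/no-go game logic: the device presents one
# of several stimuli, and only responses to the "target" stimulus earn a
# reward. Stimulus names are illustrative.
TARGET = "high_tone"                      # the rewarded stimulus
DISTRACTORS = ["low_tone", "white_light"]

def run_trial(stimulus, animal_responded):
    """Score a single trial: reward only correct responses to the target."""
    if stimulus == TARGET:
        return "reward" if animal_responded else "miss"
    return "false_alarm" if animal_responded else "correct_reject"

def session(n_trials, respond_policy):
    """Run n trials with random stimuli; policy maps stimulus -> respond?"""
    outcomes = []
    for _ in range(n_trials):
        stim = random.choice([TARGET] + DISTRACTORS)
        outcomes.append(run_trial(stim, respond_policy(stim)))
    return outcomes
```

In a design like this, the balance of rewards versus false alarms across many trials is what reveals how well an animal can tell the stimuli apart.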

I was also curious as to whether wild animals might be interested in running on a pet-store running wheel as a reward, i.e. would they even find domesticated toys fun? I spent some time trying to follow up on this fascinating paper which demonstrated that wild mice, frogs and other animals would spend time on a running wheel placed outside “for fun”, even though they could run anywhere they liked. I tried to capture some island creatures in the act at night using IR camera traps and a wheel baited with peanut butter.

 

By the end of my all-too-short stay, I had a prototype!

Now, after returning home to a land of millions of rodents, I’m planning to position new prototypes around the city. Who do you think are smarter, subway rats or park rats?

Jennifer Jacobs

Dates: July 1st – July 6th

Jennifer Jacobs is a Brown Institute Postdoctoral Fellow at Stanford University, working with Professor Maneesh Agrawala in the Computer Science Department. Her research examines ways to diversify participation and practice in computationally-supported creation by building new creative tools, software, and programming languages for creative expression. Jennifer received her PhD from the MIT Media Lab in the Lifelong Kindergarten Research Group. She completed a master of science in the High-Low Tech group and an MFA in Integrated Media Art from Hunter College. Her work has been presented at international venues including CHI, SIGGRAPH, and Ars Electronica.

More of her work is located at jenniferjacobs.co

Dinasaur Illustrations

By Michelle Tan

I was at Dinacon for almost four days in the first week of the conference, and created illustrations and comics about my time on the island.

A Kayaking (Mis)adventure

Featuring Danielle and Shreyasi

 

Singapore Foodscapes Illustration

Created for Foodscape Collective, commissioned by Huiying; as part of a public letter about urban farming and agri-diversity in Singapore

 

I wanted to portray what alternative foodscapes, imaginary or otherwise, there are to standard food practices in Singapore. A little boy picking up a fallen fruit in a supermarket encounters a dream-like glimpse into another world where product and nature are entwined. It is a farmer’s stall in a luscious, colourful setting, glowing in stark contrast to his own sterile surroundings. I was inspired by the lush vegetation on the island.

Procedural Naturalist Drawings- Jennifer Jacobs

As a programmer and a visual artist, I’m fascinated by ways to integrate different forms of drawing. I arrived on the island with paper, pencils, a computer, and a 2-axis drawing plotter. I used these tools to create a series of procedural naturalist drawings: drawings produced through a combination of computer-generated effects and manual illustration. I stayed on the island for 5 days. After getting settled the first day, my goal was to create one new drawing per day. My process consisted of five basic steps:

Step 1: Observation

I spent time walking around the island looking for forms to draw. I particularly focused on trying to find organisms or objects that would be compatible with some form of procedural creation. Complexity, fractal patterns, and symmetry are all relatively easy to represent with code, so I looked for organisms with those properties.

Step 2: Code

After settling on an organism to draw for the day, I used Processing to write a simple program that produced forms representing some aspect of that organism. Because coding is a somewhat anti-social activity, I tried to keep these coding sessions short: no more than a couple of hours at most.

Step 3: Digital Drawing

The programs I wrote in Processing were designed to function essentially as drawing tools: each generated some form of procedural pattern based on mouse input. By hooking up a tablet and stylus to my computer, I could use the stylus to create different procedural drawings that transformed my manual line. I exported these drawings as vector PDFs.

Step 4: Plotting

I experimented with different pencils and a 2-axis plotter to quickly generate physical drawings of the procedural forms I created in Processing. I love the plotter for its speed and ease of use, and for the fact that you can use a wide range of drawing media with it. It works well with pens and pencils, and can even be modified to support paint brushes: all tools that one can also use by hand. Using the plotter is also often a social activity. It’s highly visible without being disruptive or noisy, and people tend to come by and ask about it, or watch it while it works. I really like this quality.

Step 5: Manual Drawing

Once the drawings had been plotted, I used a variety of different pencils to manually finish them, adding in shading and different texture effects. While it would have been possible to add some of these effects with the plotter, the process of manually drawing on top of the plotted drawings gave them a different quality. Unlike coding and plotting, which require a linear, planning-intensive process, drawing by hand enables me to work intuitively and serendipitously. I can quickly try something out and, if I like it, continue in that direction. Each stroke informs the next. In addition, coding and plotting required access to electricity, whereas manual drawing enabled me to work anywhere on the island. By the end of my stay, I had completed 4 drawings.

Drawing 1: Sea foam and waves

My first drawing was an abstract composition largely inspired by the waves and foam on the shore of the island.

Since this was the first drawing of the series, it was the most experimental. I started with a program that repeated and scaled a single stroke drawn with the tablet. I drew with this program using wavy, undulating strokes, then plotted a series of these drawings with a rough charcoal stick.

I spent some time with a 6B pencil darkening the corners of the plotted drawing before placing it under the plotter a second time and plotting a second series of lines (blue) on top of the original. I then shaded and reinforced some lines manually, producing the finished result.

Drawing 2: Fern

Plants are often extremely algorithmic. Inspired by the numerous ferns on the island, I wrote a program in Processing that repeated a simple leaf shape along a hand-drawn line and mirrored it on the other side. I modified the program to scale the leaf shapes so that they grew larger toward the center of the line and smaller toward the ends. This way, I could quickly draw a variety of different fern fronds.
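The frond logic can be sketched in a few lines. This is an illustrative Python version, not the author's Processing code; the sine-based scaling profile is an assumption that matches the described behavior (largest leaves mid-line, smallest at the ends).

```python
import math

# Sketch of the fern-frond idea: place leaf shapes at points along a
# hand-drawn line, mirror each one across the line, and scale leaves
# larger toward the center of the line.

def leaf_scales(n_leaves, max_scale=1.0):
    """Scale factor per leaf: largest at mid-line, smallest at the ends."""
    scales = []
    for i in range(n_leaves):
        t = i / (n_leaves - 1)                 # 0 .. 1 along the line
        scales.append(max_scale * math.sin(math.pi * t))
    return scales

def frond(points):
    """Attach a mirrored pair of leaf scales to each point on the line."""
    scales = leaf_scales(len(points))
    # each entry: (anchor point, left-leaf scale, right-leaf scale)
    return [(p, s, s) for p, s in zip(points, scales)]
```

Because the whole frond is parameterized by one hand-drawn line, every new stroke yields a differently shaped fern, which is what makes the tool fast to draw with.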

Rather than plot multiple ferns, I decided I liked the simplicity of a single frond. I plotted it with a light pencil and shaded the back of it by hand to create a contrast against the white paper.

Drawing 3: Palm and Lichen

By the third drawing, I had started to get more ambitious. I loved the patterns the lichen made on the bark of the palm trees around the island. I wrote a program that repeated palm segments along a hand-drawn path. I then wrote a second program that automatically generated lichen-like shapes by creating an irregular outline, then repeating and scaling that outline outward around the perimeter to create a set of rings. I used Perlin noise to create the variation in the ringed sections. The math to generate the lichen took a little while to figure out, and as a result I spent longer coding this pattern than I would have liked.
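The ring-generation idea can be sketched as follows. This illustrative Python version substitutes plain random jitter for the Perlin noise the author used, so it approximates the described technique rather than reproducing it.

```python
import math
import random

# Sketch of the lichen pattern: build an irregular closed outline, then
# repeat and scale it outward to form a set of concentric rings.
# NOTE: random jitter stands in for Perlin noise here (an assumption).

def irregular_outline(n_points, radius, jitter, rng):
    """An irregular closed outline: a circle with per-point radial noise."""
    pts = []
    for i in range(n_points):
        a = 2 * math.pi * i / n_points
        r = radius + rng.uniform(-jitter, jitter)
        pts.append((r * math.cos(a), r * math.sin(a)))
    return pts

def lichen_rings(n_rings, n_points=40, base_radius=10.0, jitter=1.5, seed=0):
    """Repeat and scale the outline outward to create the set of rings."""
    rng = random.Random(seed)
    return [irregular_outline(n_points, base_radius * (k + 1), jitter, rng)
            for k in range(n_rings)]
```

Swapping the jitter for coherent noise such as Perlin noise would give the rings the smoother, organic variation visible in the finished drawing.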

To take advantage of the plotter’s capabilities, I added a complex hatched background to the vector illustration. Below is the finished plotted result.

I wanted to use manual drawing to create a sense of depth by shading the smaller stalks a darker tone and keeping the larger stalks light. I liked this effect but realized it didn’t work well with the complex hatched background, so I ended up removing the background and shading that area dark. That’s yet another great thing about working by hand after plotting: you can make changes at will, in an improvisational fashion.

Drawing 4: Butterfly

By the final day, I had fallen slightly behind. The shading on the palm and lichen piece took longer than expected, and I had to prepare a presentation for the evening. I therefore decided to rely on a simple but effective technique for the last drawing: bilateral symmetry. I wrote a program that mirrored and repeated whatever I drew by hand across the horizontal axis. I then used this program to draw two insects: a butterfly and a beetle.
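Bilateral symmetry is the simplest of the techniques described here. Below is a minimal Python sketch of the idea; the mirror axis and the point-list data format are assumptions for illustration, not the author's Processing code.

```python
# Sketch of the bilateral-symmetry tool: every hand-drawn point is echoed
# by its mirror image across an axis, so drawing one half of an insect
# produces the whole symmetric body.

def mirror_stroke(points, axis_x=0.0):
    """Return the stroke plus its reflection across the line x = axis_x."""
    mirrored = [(2 * axis_x - x, y) for x, y in points]
    return points + mirrored
```

Mirroring happens live as the stroke is drawn, which is why a quick technique like this still produces a convincing butterfly under time pressure.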

I ended up only having time to plot and shade the butterfly. I left one side of the butterfly plotted but unshaded. I liked this contrast. Sometimes simple is best.

Next morning, I packed, had breakfast, and took the boat off the island. Dinacon was a wonderful, unique, experience, filled with lovely generous people. I can’t thank Andy and Tasneem enough for organizing it and letting me be a part of it.

Smart environments: from natural to digital

In the future, we will live in “smart environments”. A smart environment is filled with smart objects: objects that can presumably react or show some sign of thinking. I have always been frustrated by this classification or trend, as it is redefining the word smart. Does it also imply that some objects are stupid? Have we previously lived in a stupid environment? Etc. I am currently developing a learning platform for the Internet of Things at my company Teknikio, so I should add that these concepts are extremely top of mind.

My original proposal for Dinacon was to create an environment in which the plants and trees could communicate with each other. Upon arriving on the island, I decided to see what else might inspire me in this strange paradise.

I didn’t rule out the idea of a networked project though, and had brought several different bluetooth development boards just in case.

It started with the Mimosa plant. I had never seen one before, and I was instantly fascinated by the way it automatically closes its leaves. How does it decide when to close them? Furthermore, is this a smart plant, an emotional plant, or a robot plant? I decided on the first: this plant is smarter than other plants, which can’t control their leaves and just sit there like nothing happened when touched. What intrigued me most was that it felt like communication. And so I started to look for other elements of non-human natural behavior in this environment that felt like communication, and began exploring how to express this system of harnessing naturally smart things for our own digitally built “smart” environment. I decided to build a prototype of a natural-to-digital communication system in which a sensor would collect data from the mimosa or another “naturally smart thing”. This data would be transmitted via Bluetooth to a human-made device that activates in response to incoming communication signals. Somewhat like this diagram:

I found some neat rip-stop in the scrap-fabric pile, folded it into a herringbone origami shape that could contract and expand, and glued it to a base of woven palm leaves. I then attached a servo motor to the base to pull on the rip-stop, the idea being that the servo would open and close the shape in response to the intensity of the incoming signal. The first signal I used was waves rising and falling. Although this is quite literal, it also communicates the coming of a storm and other environmental information. I used a float sensor that Yannick uses on the Diva and a micro:bit microcontroller to capture the data. Below is the first version of this system, built at Dinacon:
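The sensor-to-servo mapping at the heart of this chain can be sketched simply. This is an illustrative Python version; the raw ADC range, servo angle range, and linear mapping are assumptions for demonstration, not the actual micro:bit code.

```python
# Sketch of the signal chain: a water-level reading from the float sensor
# is scaled to a servo angle, so the origami shape opens as the waves
# rise and closes as they fall. Ranges below are assumed values.

SENSOR_MIN, SENSOR_MAX = 0, 1023     # assumed raw ADC range
SERVO_MIN, SERVO_MAX = 0, 180        # servo angle range in degrees

def sensor_to_angle(reading):
    """Map a raw float-sensor reading linearly onto a servo angle."""
    reading = max(SENSOR_MIN, min(SENSOR_MAX, reading))   # clamp
    span = SENSOR_MAX - SENSOR_MIN
    return SERVO_MIN + (reading - SENSOR_MIN) * (SERVO_MAX - SERVO_MIN) / span
```

On the receiving device, a function like this would run on each incoming Bluetooth packet, turning the transmitted wave intensity directly into motion of the rip-stop shape.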

A bit rough, but it worked as a proof of concept! I would love to build a family of these objects and place them around the island. Looking forward to next time!

Robot Language Music Video-by Albert and Mary (Dezmediah)

with help from Maggie Kane, Tasneem Khan, Mark Lifana, and Andy Quitmeyer

Make a kickass music video, with only the tools at hand; this was the challenge Mary and Albert set for themselves while on Koh Lon at Dinacon.

Spouses Mary and Albert came to Dinacon as the last stop on a seven-month traveling stint, mostly in Southeast Asia. They knew they wanted to make something great at Dinacon, but also knew they wouldn’t be able to bring many supplies, as they’d have to carry everything on their backs for months.

Mary is a musician, and while the thought of being apart from a guitar or piano for seven months made her jittery, she also relished the idea of being forced to learn the ins and outs of the iPad GarageBand app. So, she got her fingers used to the tiny keys of the on-screen keyboards, spent hours combing the built-in samples, and recorded vocals in bathrooms, on quiet beaches, and in backyards and forests in Croatia, Cambodia, Thailand, Malaysia, Indonesia, and Singapore. In May, she finished her EP, entitled Beep Boop. Pretty soon she and Albert realized that Robot Language, the first track on the album, could make a pretty fun music video.

Albert has a filmmaking background, having graduated from NYU Film School in 2005. In addition to filmmaking, Albert is a visual artist and had recently begun exploring making video art using one of the portable, affordable pico projectors that have come on the market in the last few years. It was in Croatia that Albert first showed Mary his technique. He created a feedback loop between the projector and the camera, which resulted in interesting color distortions and multiplication and tilting of image elements. What was captured on the camera was fed into the projector, which projected onto a surface, which the camera filmed, which was fed into the projector, and so on. The result was a fun, trippy “reality distortion” beam, which Mary thought would be perfect for a music video.

They knew what the story of the video would be, and had an idea of how they would film it. Once they got to Dinacon, the challenge was how to make a robot costume using only the tools at hand. Thankfully, they met Maggie Kane (Streetcat), a genius inventor, hacker, and cosplay costume designer. Maggie worked with Mary for several days to make the costume from primarily trash and duct tape.

Albert and Mary also enlisted the help of Mark, a great videographer who could film the scenes where Albert would need to be in costume as the robot. Andy and Tas, being the amazing people they are, offered to stick around for the filming, and brought down to the beach an assortment of bright lights, which they kindly held during filmmaking. They also gave valuable input on shots and angles.

The result is “Robot Language,” the music video. The video can be watched at https://tinyurl.com/dezmediahrobotlanguage

Tree Area Network (TAN) – a private Network for trees and humans

A TAN is a network that uses plant infrastructure to transmit data through trees.
(Ingo Randolf)

Introduction – Personal Area Network (PAN)

In his book “When Things Start to Think” (Owl Books, 1999), Neil Gershenfeld writes about how they stumbled onto a Personal Area Network using the body as a data channel: while trying to track down a “bug” in a system measuring the hand position of violinist Ani Kavafian, he and Thomas Zimmerman found that human bodies can be used as a data channel through capacitive coupling.

“… the source of our problem was immediately clear: part of Ani’s body was in the [electric-] field and part was out; … Tom [Zimmerman] then realized that we should be able to detect the part of the field that was passing through her body. This creates a tiny current. … In other words, we could transmit data through a body. The bug could become quite a feature.”

There are Wide Area Networks (WANs) to link up cities and Local Area Networks (LANs) to link up buildings. Gershenfeld and Zimmerman created a Personal Area Network (PAN) to connect parts of a body.

Thomas Guthrie Zimmerman wrote his master’s thesis (1995) with the title “Personal Area Networks (PAN): Near-Field Intra-Body Communication”: http://www.cba.mit.edu/docs/theses/95.09.zimmerman.pdf

In recent years, research has been conducted into using electrostatic communication and waveguides (galvanic coupling) to transmit data through, or from within, a human body. This research was mostly done in the medical field, for applications that monitor the body and send internal sensor data to a base station outside the body, where the data can then be analyzed. This is also called the wet-net or internal-net.
The focus of this research ranges from the physical layer to the communication layer, and an international standard (IEEE 802.15.6) was developed to standardize the communication.

Around 2005 the Japanese company NTT developed a product named “RedTacton”, but it has since been removed from their webpage and it is unclear what happened to it. When looking for consumer or prosumer devices in 2017, I could find none. The only way to experiment with humans as a data channel was to build a sender and receiver from scratch.

PAN @ Dinacon: TAN (Tree Area Network)

At Dinacon I was interested in experimenting with these devices in the wild: sending data over a tree, or in the best case from one tree to another. Is it possible to send data from one side of the jungle to the other? The jungle as a network. I wanted to tackle these questions by starting out using the device on a single tree, to see if it is possible at all.

In theory it should be possible to use capacitive coupling on plants. Like the human body, a plant consists of an internal wet system and an insulating layer on its outside, the bark. The internal system (the phloem) transports nutrients and food to and from the roots and consists mainly of water. The phloem also acts as a communication system within the plant; see “Electrical signals and their physiological significance in plants”, Jörg Fromm & Silke Lautner, Plant, Cell and Environment (2007).

Experiments

The devices used are the same as in the human-coupling experiments documented here:
https://ingorandolf.info/building-a-near-field-intra-body-communication-device/

The first experiments tested whether it was possible to detect a simple pattern. The pattern was a ~333 kHz carrier wave switched on and off in short pulses of ~200 µs. This is also the preamble used to establish communication before sending data.
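On the receiver side, detecting such a preamble amounts to matching the demodulated on/off slot sequence against a known pattern. The sketch below is illustrative Python; the pattern contents and length are assumptions, and the actual devices may use a different preamble.

```python
# Sketch of preamble detection: the ~333 kHz carrier is gated on and off
# in ~200 us slots, and the receiver scans the demodulated slot stream
# for the known on/off pattern before accepting data.

PREAMBLE = [1, 0, 1, 0, 1, 0, 1, 1]   # carrier on/off per slot (assumed)

def find_preamble(slots, pattern=PREAMBLE):
    """Return the index where the preamble first occurs, or -1 if absent."""
    n = len(pattern)
    for i in range(len(slots) - n + 1):
        if slots[i:i + n] == pattern:
            return i
    return -1
```

Once the preamble index is known, the receiver can align its slot clock to it and decode the payload bits that follow.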

Materials used:
– electrodes (sender and receiver): copper-plated ripstop, a woven textile (Statex Shieldex® Nora)
– sender / receiver: see link above
– amplifier op-amp: from MCP629x family

First experiment:

Setup:
– unmodified PAN sender and receiver on different trees and plants
– 3.4 V input to the resonant tank, resulting in ~30 V peak-to-peak on the transmitter electrode.
– electrodes: sender: 45 x 52 mm, receiver:

Result:
It was possible to pick up the signal a couple of centimeters (~10 cm) away from the transmitter, though some plants worked better than others.
On one plant it was only possible to receive the signal on a branch of the same sub-branch, but not on the branch above it.
Using a steel needle penetrating the bark as the transmitter electrode improved transmission. The needle sat in the bark ~1.5 cm deep. While this gave better results, we moved away from the practice because we did not want to damage the bark of the tree.

Second experiment:

Setup:
– modified transmitter to send with higher voltages
(12–24 V input, ~80–180 V peak-to-peak on the output)
– modified receiver with a third amplification step (MCP6292)
– same electrode configuration as in the first experiment

Results:
As expected, the signal could be picked up better with the stronger transmitter and the more sensitive receiver. It was possible to receive the signal ~1 m away from the sender on the same branch. The signal did not travel across branches.

Third experiment:

Setup:
– sender with higher voltage: approx. 17 V input and ~100 V peak-to-peak output
– circular electrodes wrapped around the branches for both the transmitter and the receiver

Result:
Using electrodes wrapped around the branches, we could pick up the signal unexpectedly well. It seems that encircling a branch with an electrode couples into the phloem well enough to send the signal across branchings, from the top of a tree to its stem close to the ground.
Different input voltages ranging from 12 V to 24 V, with resulting output voltages from ~80 V to more than ~120 V peak-to-peak on the sender, produced different signal strengths at the receiver. All input-voltage configurations could be picked up.

With this setup it was possible to send sensor data measured at the top of the tree to the stem close to the ground. When the receiving electrode was too close to the ground, it was no longer possible to receive the signal (perhaps because the signal was shunted to ground?).
The signal passed 5 branchings and covered a distance of around 5.40 meters.

Costuming TAN

Mika Satomi built an interface for a TAN-enabled tree. This tree-hugging garment is used to receive data from the tree:
https://www.dinacon.org/2018/09/30/costuming-tan/

That Strange Sensation by Dezmediah

My main project at Dinacon was to write a short story inspired by one or more things I saw there. What came out was a story about a marine biologist who finds herself on a tropical island (Dinasaurs will guess it’s Koh Lon) in an unspecified future (Dinasaurs will guess it’s 2561) with a bunch of other scientists and artists. Nobody knows why they’re there and so, in addition to surviving, etc., they’re going to try to figure that out. At Dinacon I wrote about 4,000 words and I realized that I wasn’t nearly finished yet. So, since the thing had a sort of pulpy, classic science-fiction feel to it, I thought I’d serialize it.

Following is Part One. New parts get released on the first of every month and can be read at thatstrangesensation.com. The project will (hopefully) continue until late spring.

Part One

Lately, every time L ascended, she felt on the verge of passing out. About two meters from the surface, she’d find herself needing to grasp onto the inflator nozzle of her BCD in order to remind her body of the task at hand. The water would squeeze her, the churning, womb-like sounds surrounding her and disorienting her. The sun, filtered by the water into individual rays, would hit her like a spotlight, causing her to shield her eyes even as she felt herself hungrily drawn toward it.

And now, once again, she finds herself on the surface, back in her right mind, back on solid ground, which is in fact the choppy surface of the water. The sun steady, the physics standard. Escaped. Just a weird sensation was all.

Ever since she was a beginner diver, she’d felt a whiff of this sensation, but in the past few weeks it’s become stronger with every dive. Glancing around to check that the interns she’s been diving with are well, she actually wonders: if she were to let herself go on autopilot during ascension, allow her mind to wander even just a bit, would she make it? Or would she pass out, sink to the bottom, die immediately?

What an unscientific thought. Likely she was becoming dizzy as a result of a slight physiological malfunction. An inner ear issue. Or maybe it was simply that this feeling mimicked that of not wanting to wake up from a good dream—it was so peaceful under there after all, so cozy, meditative. Your mind couldn’t be scattered. The water directed your focus, plied your attention toward what it wanted to show you.

“My god, I know how you feel,” her colleague, E, tells her as they unsuit back on the boat. E grunts as her tank clinks into its holder. “Sometimes I just don’t want to leave that world.”

“Maybe that’s all it is,” L replies, but still she can’t explain why the sensation is getting stronger, or—could she say—worse?

**

Two hours later she is entering the day’s data into the Thai governmental database. On that morning’s dive, she and her team of interns completed a fish survey and noted this bounty: forty-five butterfly fish, nine bream, five parrot fish, three angel fish, twenty-five wrasse, forty-five cardinal fish, and one soap fish. Still far fewer snapper than she’d like to be seeing, but the other fishes are doing well.

E types away beside her, probably messaging with a prospective intern: an eager undergraduate or beleaguered graduate student, looking for a suitable research site to host them as well as an exciting Southeast Asian experience. A storm has rolled in. L’s nostrils are alerted to a metallic smell as large raindrops begin to fire away on the roof like they mean to put a hole in it. She feels as if the space has become smaller, as if the world would be happy to do them in.

L leans her forehead on her hand, rubs her temples. “I’ve got a bit of a headache now,” she says. E turns toward her and frowns.

“Take a paracetamol,” E says and, sighing, turns back to her computer. Then she groans. “This student wants to bring his girlfriend. But she’s not going to do any research. She just wants to hang out. ‘She won’t take up another bed,’ he says. ‘I don’t see why she has to pay.’” She rolls her eyes.

L gets up and heads to the kitchen to get a drink of water. On her fourth step, a curtain comes over her vision and all she can see is black. “I’m going blind,” she says as she collapses to the floor.

When she wakes up, E is standing over her. Her face looks old, and the geometry of it evokes an ancient math. L is sure, then, that there have been hundreds of people throughout human history that looked exactly like E.

And then she feels her heart beating faster than it should be beating. Her breath is deep and rapid at the same time, as if she can’t get enough air. But her breath moves in and out, her heart beats, and she can see.

“I’m okay,” she says.

“My god, what is wrong with you?” E yells, her Russian accent really coming out now. “Do you want me to call an ambulance?”

“No, no,” L says. “I just stood up too fast I think. Something a little off with my circulation lately, maybe my blood pressure.”

Maybe I’m fucking pregnant. Fucking pregnant, that’s a funny phrase.

“My god, go home,” E says. “Take the day off.”

“But new students are coming, I have to orient them.”

“Honey, you need to take some time off.”

**

A couple hours later L is in her house, in her bed, inside the mosquito net. Her headache has faded and she feels fine. The storm has passed away, leaving behind thin, shifting, planes of air. She’s reading a dense, poetic book about water and how to interpret it. She’s enjoying the language, but can’t process much meaning from it. She puts the book down and looks at her nightstand. Two pregnancy tests rest there, staring up at her with two blank eyes. No results.

How is this possible?

Pregnancy was unlikely, as she and her various partners on the island always used condoms, but you never knew. So she could understand a positive result and she could understand a negative result but a non-result was perplexing to say the least.

Just a little low on iron from my last period. Something, something like that.

It is barely five o’clock. A breeze blows in and a rodent scampers across her roof. The cicadas are quieting down to a low, tired scratching, only needing to cool themselves down a little in this breezy landscape.

“We will look at water as the subject. Mammals and insects are interesting, but they will only earn their place in this book to the extent that they can explain the behavior, the signs and symbols of water.”

She puts the book down and falls asleep. She sleeps 12 hours. At 5 am a gecko lands on the wall of her bungalow just outside her head and calls out, loud and clear, “unh unh, unh unh, unh unh,” and she jolts awake, thinking the gecko is in her bed, that someone put it in her bed to wake her up, but there’s no one in her house, not even a gecko.

She can’t believe she slept 12 hours.

Maybe I am fucking pregnant.

Suddenly she feels tough and lichenous, tucked away inside herself from whatever might be happening outside.

**

On her motorbike drive to work, a rabid dog lunges at her, causing her to swerve sharply. After driving off a safe distance, she stops and looks back at it. It lies in the middle of the road, sunning.

She gets to the lab before E and spends a quiet morning drinking coffee and looking over the data. The coral bleaching is getting worse and what to do, what to do about that. 50% bleached already and it’s only the beginning of the hot season. At some point in her meager little life, she’d decided that the best thing she could do was have this field station and report the data. Tell the authorities. Alert people in power. Bolster the science, strengthen the argument. Not shut up. Perhaps she should do more.

E enters the room with a clanging of bags and various attachments. Her motorbike helmet falls off her arm and rolls toward L. E’s eyes go wide and she feigns anger. “My god, what are you doing here?”

“What do you mean?” L says.

“I thought you’d take the day off.”

“Oh I’m fine. Got a good night’s sleep.”

E tuts and shakes her head reprovingly.

**

Two hours later they’re diving again. It’s been determined L will be divemaster for two of the more experienced students and E will take the newbies. That way, the experienced students can cover some of the more routine data gathering and L can be free to focus on her pet research project, which tests whether smaller solitary corals are less resistant to bleaching than larger solitary corals.

E’s group lays out the transects while L and her interns hang back and look at coral. She breathes out and sinks closer in to some branching coral, the home of twenty or so baby white and yellow butterflyfish, who dart in and out like bees. She wishes she were doing a fish survey so that these lovely, tiny fish could be counted. If only their presence could be felt, could matter in the world. But probably they don’t care either way, probably that doesn’t matter to them.

Now it’s time to go and she motions the students to go ahead of her. With the lab’s underwater camera they take a picture of the transect measuring tape every 50 cm. Back at the lab they will need to go through every one of these 300 pictures and identify the coral just to the left of the transect. She removes her underwater slate from her BCD pocket and begins counting. Everything is slow, deliberate, meditative. She breathes slowly. It’s arduous counting all the solitary corals—there are so many. The students’ frog kicks are too frequent, they are going too fast—almost out of her sight now. No matter, they are safe and experienced. She finishes her survey and meets them at the end of the third transect at 50 minutes into their dive. Together they reel up the transects, spiders drawing the thread of their web back into their abdomens. She directs one of the students to take the transect bag and hook it to her kit. The three of them look each other in the eyes and L makes the hand signal for “let’s ascend”—a thumbs up.

She doesn’t think about that strange sensation. She’s thinking about the data she gathered and about what conclusions she might begin to draw. Slowly, slowly, she swims up, not even needing to think about moving her feet, just willing herself up. And then, at three meters from the surface, once again, it hits.

**

The pressure is more intense this time, the movements of the water like a thousand little flies distracting her attention. The light hits and she feels the heat of the sunrays on her body. The rays form a cone, which twists around her, and she is an unwilling dancer, moving her limbs oddly, floating six inches above an empty stage.

And then she is elsewhere. Her face is naked—no regulator. She feels sand in her nose and on her lips. She sputters, rubs her nose with her index and thumb, sticks out her tongue. Opens her eyes. She’s on the beach. Or a beach, rather. She doesn’t recognize the topography of this beach, with its thick forest, its meters of white sand. All the beaches on her island are short, with sparse, low vegetation and pieces of trash strewn about. This beach is pristine. A breeze tumbles down the white sand, unobstructed by a single other person. She is alone.