July 31, 2014 jmoro

Privacy and Surveillance & The Politics of the Skies, Pt. 2: 29 July, 2014

Part two of a series of meetings on Privacy, Surveillance, and the Sociopolitics of Drones and the Skies.

This Month in Drones

From Jon Caris and the working group:

  • The largest item was a prominent letter to the FAA written by Paul Voss, member of the working group and engineering professor at Smith College, and cosigned by a number of faculty members from colleges and universities across the nation. The letter argued that the FAA’s current rulings on drones are unnecessarily restrictive towards academic institutions. More information is available about Voss’s letter here, and the working group took time to consider the implications for their own work and the AIRLab. The thrust of Voss’s letter, which the working group emphatically supports, was an argument for “attaching the person to the drone” and making clear that drones have utility for on-campus research. In particular, we want to extend this to include humanistic and creative practices as well as traditional STEM applications.
  • Drone news is coming fast and frequently these days, and many of the pieces are collected over on our Corpus page. In particular, we wanted to highlight an instance of a drone being flown around the Space Needle in Seattle by an Amazon employee. It was initially reported that the drone crashed into the Space Needle, but this was later found to be untrue.

Meeting notes

  • Fraser Stables, assistant professor of art at Smith College, talked about the role of regulation and institutional review in artistic practice, and how we need to be mindful of these practices when thinking through making art with drones. How do institutional review boards sometimes misunderstand how art practice functions, and how can we imagine using art practice as a framework for teaching with and about drones?
  • Would the AIRLab be interested in sponsoring a small microgrant for Five College artists interested in doing work with or about drones?
  • The question of permits and licensing emerged again, this time with regards to the projects themselves, rather than operators. The group agreed it could be more useful to license a project to use drones in a particular way, rather than individual operators—just as there are regulations about how to take photographs at archaeological sites, for instance.
  • Jeffrey Moro and Stables spoke briefly about the aesthetics of drone-captured footage. Stables argued that at this stage in the game, any art made with drones is more interesting or provocative because of its means of production, and that we don’t yet have an aesthetic framework through which to read the footage as anything other than an expression of privacy, surveillance, borders, or transgression.
  • We talked about what kinds of sensors we’d like to experiment with—how those sensors extend our abilities to sense and perceive in different ways, and how those sense extensions potentially infringe on others’ privacy. In particular, we mentioned LIDAR, audio capture, and olfactory sensors, as well as using drones to project information outwards in addition to capturing it (projector drones indoors, for instance).
  • Andy Anderson and Jon Caris brought up built-in geofencing as a way for institutions to police and restrict drone use to proper locations. We also imagined a system by which property owners could opt in to allowing drones on their properties, and then submitting those GIS coordinates to a database.
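The opt-in system sketched above could work something like the following. This is a hypothetical illustration, not a working proposal: the parcel names, coordinates, and function names are all invented, and a real system would use a proper GIS library and database rather than an in-memory dictionary.

```python
# Hypothetical sketch of an opt-in geofence registry: property owners
# register a boundary polygon of GIS coordinates, and a flight check
# tests whether a drone's position falls inside any opted-in parcel.

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the polygon of (lat, lon) pairs?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Toggle when a ray cast from the point crosses this edge
        if (lon1 > lon) != (lon2 > lon):
            edge_lat = lat1 + (lon - lon1) / (lon2 - lon1) * (lat2 - lat1)
            if lat < edge_lat:
                inside = not inside
    return inside

# Opt-in registry: parcel name -> boundary polygon (illustrative coordinates)
opted_in_parcels = {
    "campus-field": [(42.316, -72.641), (42.318, -72.641),
                     (42.318, -72.638), (42.316, -72.638)],
}

def flight_allowed(lat, lon):
    """A drone position is permitted only inside an opted-in parcel."""
    return any(point_in_polygon(lat, lon, poly)
               for poly in opted_in_parcels.values())
```

In practice, built-in geofencing of this kind would live in the drone’s firmware or flight-planning software, querying the opt-in database before and during flight.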
  • Eric Poehler taught us the fun phrase “exoschematic adaptation”—thinking of drones as real-time avatars that can access space we’re physically incapable of accessing, and in that way, imagining drones from an almost evolutionary or anthropological standpoint.
  • We also touched on the necessity of a data management plan: where might all of the raw data captured by AIRLab drones go? What kinds of protocols are we going to need to have in place for the long-term storage, management, and possible deletion of that data? Poehler suggested that we need an automated system in which any data that could not be public is flagged for automatic deletion—we won’t keep anything on our servers that we couldn’t make public.
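Poehler’s flag-for-deletion idea could be sketched roughly as follows. This is a minimal illustration under assumed rules—the field names and the thirty-day review window are inventions for the example, not decisions the group made.

```python
# Hypothetical sketch of the flag-for-deletion protocol: every captured
# dataset carries a "public" flag set at review time, and a scheduled
# sweep deletes anything not cleared for public release within a
# (here, assumed) thirty-day review window.

from dataclasses import dataclass

@dataclass
class CapturedDataset:
    name: str
    public: bool     # True only after review clears it for public release
    age_days: int    # days since capture

REVIEW_WINDOW_DAYS = 30  # assumed grace period before the sweep applies

def sweep(datasets):
    """Return (kept, deleted): non-public data past the window is deleted."""
    kept, deleted = [], []
    for d in datasets:
        if not d.public and d.age_days > REVIEW_WINDOW_DAYS:
            deleted.append(d)
        else:
            kept.append(d)
    return kept, deleted
```

The key design point is that deletion is the default: data must be affirmatively cleared as public to survive, rather than affirmatively flagged for removal.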
  • We then started to think about these questions in relationship to the AIRLab as a resource. If we imagine the AIRLab as the drone equivalent of, say, a writing center, then we can more effectively imagine the Lab as an in-house consultancy rather than a body running the projects itself. In that way, we can put the onus of having a concrete data management plan on the individual projects that use the AIRLab—while providing ample regulation and coaching.

Readings

Readings for this meeting were the same as the last.
