Cyborg Classics Call for Papers

April 28, 2017

We are pleased to announce a one-day symposium, sponsored by BIRTHA (The Bristol Institute for Research in the Humanities and Arts), to be held at the University of Bristol on Friday, July 7th, 2017.

Keynote speakers:

  • Dr Kate Devlin (Goldsmiths)
  • Dr Genevieve Liveley (Bristol)
  • Dr Rae Muhlstock (NYU)

The aim of the day is to bring together researchers from different disciplines – scholars in Archaeology & Anthropology, Classics, English, History, and Theology, as well as in AI, Robotics, Ethics, and Medicine – to share their work on automata, robots, and cyborgs. Ultimately, we hope the day will lead to an edited volume and to further collaborative research projects.

Indicative key provocations include:

  • To what extent do myths and narratives about automata, robots, and cyborgs raise questions that are relevant to contemporary debates concerning robot, cyborg, and AI product innovation?
  • To what extent, and how, can contemporary debates concerning robot, cyborg, and AI product innovation rescript ancient myths and narratives about automata, robots, and cyborgs?
  • Can interdisciplinary dialogues between the ‘soft’ humanities and the ‘hard’ sciences of robotics and AI be developed? And to what benefit?
  • How might figures such as Pandora, Pygmalion’s statue, and Talos help inform current polarized debates concerning robot, cyborg, and AI ethics?
  • What are the predominant narrative scripts and frames that shape the public understanding of robotics and AI? How could these be re-coded?

We invite scholars working across the range of Classics and Ancient History (including Classical Reception) and across the Humanities more widely to submit expressions of interest and/or a title and abstract (of no more than 250 words) to the symposium coordinator, Silvie Kilgallon (silvie.kilgallon@bristol.ac.uk). PhD students are warmly encouraged to contribute. The deadline for receipt of abstracts is May 31st, 2017.

Via The Stoa Consortium

"I think immersive role playing is an awesome way to learn a language. I had ..."

Direct and Indirect Learning Through Games
"I never thought about it before, but Paul stressing Jesus was of David's line is ..."

Genealogies and the Age of the ..."
"James said: I've thought that Q might have had some reference to Jesus being born ..."

Genealogies and the Age of the ..."
"That's a great question. That two authors independently decide to add infancy stories and genealogies ..."

Genealogies and the Age of the ..."

Browse Our Archives

Follow Us!


TRENDING AT PATHEOS Progressive Christian
Comments
  • John MacDonald

    I think the Copernican point in AI will be if we can design a system that is self-aware. At that point, if it ever happens, we will have a new and unique form of life.

    • Ian

      How will you know? The problem with all ‘when AI does X’ claims is that we have no objective test for our Xs. Proxies (‘when AI plays chess’, ‘when AI writes poems’, ‘when AI programs itself’) turn out to be rather simple, but relatively unconvincing. We can’t define ‘intelligence’ or ‘consciousness’ or ‘creativity’ in any robust non-biological way, let alone test for them. We rely on the ‘superhuman’ fallacy (it has to be human-style performance beyond any human capability). That’s philosophically dubious at the best of times and won’t help for self-awareness.

      Plenty of programs are already self-modifying, self-regulating, and adaptive.

      • John MacDonald

        How do you know your fellow human being is self-aware if all you have access to is their behaviors?

        • It is an inference from our own experience of our inner life, by analogy. And whether that analogy is possible in the case of non-humans is precisely the question!

          • John MacDonald

            I can’t take credit for that. The philosopher Edmund Husserl noted long ago that we attribute an inner life to other humans by analogy. I think he first noticed this while observing the way people attribute human-like characteristics to their pets (e.g., the dog is drooling when I prepare his dinner, so he must feel the way I would feel in a similar situation).