
The Data Fix with Dr. Mél Hogan

Mirrored, with Kyriaki Goni

Ep. 32

Kyriaki Goni - an artist with a background in social and cultural anthropology - and I start our conversation reflecting on the lockdowns of April 2020 in Athens: what they signified, and how they shaped her art, ultimately manifesting as "The Portal or Let’s Stand Still for the Whales", a reflection on the tension between the darkness of pandemic realities and the quiet restoration of natural things in her surroundings, and beyond. We also talk about "Perfect Love #couplegoals #AIgenerated, 2020, 2022" as an exploration of intimacy, doomscrolling, and isolation. We finish our conversation with "Not Allowed for Algorithmic Audiences, 2021", which focuses more specifically on 'audio assistant' tech and the way algorithms pull audio from social media and various corners of the internet to classify and reorganize the way we're listened to and heard. One of the (MANY) things I love about Kyriaki's work is that it is decidedly not preachy -- instead it holds a mirror to the audience to reflect gently, and in their own time, on the significance of technology in various contexts. Recorded Jan 12, 2024. Released Feb 26, 2024.


KYRIAKI GONI

https://kyriakigoni.com/


ANTHROPOCENE ON HOLD

https://www.pcai.gr/anthroposcene-on-hold


NOT ALLOWED FOR ALGORITHMIC AUDIENCES

March 23, 2023–April 29, 2023

The Breeder Feeder

https://thebreedersystem.com/uncategorized/kyriaki-goni_-not-allowed-for-algorithmic-audiences/


studio international: Kyriaki Goni – interview: ‘For me, technology is an existential discussion’

https://www.studiointernational.com/index.php/kyriaki-goni-interview-for-me-technology-is-existential-discussion-data-garden-blenheim-walk-gallery-leeds-arts-university

More episodes

View all episodes

  • 40. Territorial, with Alina Utrata

    51:42
    Alina Utrata and I have a conversation about billionaires conquering space for personal pleasure, in pursuit of energy sources or minerals, or to push forward a longtermist interplanetary movement. Alina explains how, when we think about outer space as "empty", we unwittingly think territorially -- an incredibly valuable contribution to critical space scholarship. Recorded May 20. Released June 24, 2024.
    Engineering Territory: Space and Colonies in Silicon Valley
    https://www.cambridge.org/core/journals/american-political-science-review/article/engineering-territory-space-and-colonies-in-silicon-valley/5D6EA4D306E8F3E0465F4A05C89454D6
  • 39. Futures, with Lee Vinsel

    51:32
    I invited Lee Vinsel to discuss with me a post he wrote from a workshop on "Politics of Controlling Powerful Technologies". In this episode we discuss how futures are (imagined to be) predicted through data modelling and crunching numbers, and how various alternatives to these statistical imaginaries also come short of knowing what awaits us. Can we stand to not know? And if we don't know what the future holds, how do we plan politically? Recorded April 19. Released June 10, 2024.
    How to Be a Better Reactionary: Time and Knowledge in Technology Regulation
    https://sts-news.medium.com/how-to-be-a-better-reactionary-1630b5098fbc
  • 38. Objective, with Lisa Messeri and M. J. Crockett

    53:42
    In this episode, Lisa Messeri and M. J. Crockett discuss how scientists are in danger of overlooking AI tools’ limitations, and how science is made stronger by questioning its obsession with objectivity. Recorded April 18, 2024. Released May 27, 2024.
    Artificial intelligence and illusions of understanding in scientific research
    Lisa Messeri & M. J. Crockett, Nature, volume 627, pages 49–58 (2024)
    https://www.nature.com/articles/s41586-024-07146-0
  • 37. Thirsty, with Shaolei Ren

    48:22
    In this episode, Shaolei Ren and I discuss the relationship between water and generative AI. We delve into what happens to water in the (very thirsty) data center, what it's used for, and how much fresh water the AI revolution will ask of the planet in the future, and at what cost. Big Tech doesn't yet disclose its water withdrawal or consumption, so researchers like Shaolei Ren take up the work and propose solutions for a more sustainable future for AI. Recorded April 19, 2024. Released May 13, 2024.
  • 36. Diversity, with Catherine Stinson and Sophie Vlaad

    53:32
    With Catherine Stinson and Sophie Vlaad, we discuss what diversity means in the context of AI -- its applications, conceptualizations, teams, institutions, networks, members, and ideals. As they ask in a recent article, "diversity" is often proposed as a solution to ethical problems in artificial intelligence (AI), but what exactly is meant by "diversity", and how can it solve those problems? Recorded March 22, 2024. Released April 22, 2024.
    Stinson, C., & Vlaad, S. (2024). A feeling for the algorithm: Diversity, expertise, and artificial intelligence. Big Data & Society, 11(1). https://doi.org/10.1177/20539517231224247
  • 35. Unsustainable, with Matthew Archer

    47:52
    Listen to my conversation with Matthew Archer, author of Unsustainable: Measurement, Reporting, and the Limits of Corporate Sustainability. In his beautifully written book, Matthew makes a case for being highly skeptical of corporate sustainability initiatives, especially as they've become increasingly grounded in metrics of all kinds that measure just and exactly what the companies themselves determine to be worthy of measuring. Framing sustainability as a technical issue has been and continues to be a failure, and so we ask: what might it mean to take this criticism seriously? Recorded Feb 2, 2024. Released Apr 8, 2024.
    Unsustainable: Measurement, Reporting, and the Limits of Corporate Sustainability (Feb 2024, published by NYU Press)
    https://nyupress.org/9781479822027/unsustainable/
  • 34. Change, with Sireesh Gururaja, Amanda Bertsch and Clara Na

    01:00:28
    Together, Sireesh Gururaja, Amanda Bertsch and Clara Na explain the paradigm shifts in Natural Language Processing that they've noticed themselves, observed in the community, and documented through a series of interviews with NLP researchers. They share their hopes for the NLP field moving forward -- as less focused on benchmarks, and as more self-reflexive and ethically driven. Recorded Jan 19, 2024. Released March 25, 2024.
    To Build Our Future, We Must Know Our Past: Contextualizing Paradigm Shifts in Natural Language Processing
    by Sireesh Gururaja, Amanda Bertsch, Clara Na, David Gray Widder, Emma Strubell
    https://arxiv.org/abs/2310.07715
  • 33. Adversarial, with Steph Maj Swanson

    50:21
    Steph Maj Swanson, aka Supercomposite, and I discuss the spooky Loab phenomenon, generative adversarial networks, negative prompts, and the demons (maybe?) lurking in large datasets. Recorded Jan 19, 2024. Released March 11, 2024.
    What I Learned from Loab: AI as a creative adversary -- the artist behind the viral cryptid "Loab" reflects on her critical relationship to AI art tools
    https://media.ccc.de/v/37c3-12052-what_i_learned_from_loab_ai_as_a_creative_adversary
    Original Twitter thread: https://twitter.com/supercomposite/status/1567162288087470081?lang=en
    Insta: https://www.instagram.com/supercomposite/