
The Foil

Responsible Tech & Human Rights in AI

Ep. 17

In the first Episode of our Responsible Tech Series in the lead-up to the Australian Federal Election, we speak with Edward Santow, Industry Professor for Responsible Tech at the University of Technology Sydney. Prior to his current role, Ed was the Australian Human Rights Commissioner. During his tenure, he led the world’s largest public consultation on human rights and technology and published a public report with recommendations for the development of responsible tech.


In this Episode, we talk with Ed about his early experiences working as a lawyer in community legal services, where he saw first-hand the impact of tech applications gone wrong in policing. We discuss the pivotal moment when public attitudes shifted from complacency to real concern about the responsible use of data and tech: the revelation that Cambridge Analytica had used personal data belonging to millions of Facebook users, collected without their consent, to provide analytical assistance to Donald Trump’s 2016 presidential campaign.


Ed outlines the three key vectors for responsible tech: the law, training and design. We explore regulation and legislation as it currently exists; Ed argues that through enforcement of existing law, “80% of problems would go away.” Ed presents his recommendation for an impact assessment before AI is used in automated decision-making. We discuss the public’s future expectations and the challenge for policymakers.


Ed highlights the application of AI in today’s business and public sector context, noting that 85% of AI projects fail and explaining why this is the case. We discuss facial recognition technology and its risks, and the need to build data capabilities across society in the data and digital age.


https://profiles.uts.edu.au/Edward.Santow

seerdata.ai

More episodes

View all episodes

  • 20. Diversity for AI

    50:12||Ep. 20
    Steve Nouri is the Founder of AI4Diversity, a non-profit global initiative that engages and educates diverse communities about AI to benefit society, and Founder and Chair of Hackmakers, a global hackathon initiative bringing hackers together to collaborate on impactful innovation challenges. Steve is a well-known AI influencer with a social media following of 750k+ people. Steve shares his journey from software developer to becoming the Head of Data Science at CSIRO’s Data61 and the Australian Computer Society. Steve shares what his followers are interested in hearing about right now. We discuss a variety of industries that are being disrupted and the AI technologies driving that disruption, from autonomous drones to large language models. Steve describes the introduction of bias into these systems through the data they are trained on, and highlights this as a risk. Steve discusses the introduction of a data-centric approach to machine learning, as opposed to the more traditional model-centric approach. We ask Steve to weigh in on the trade-off between accuracy and fairness in the application of AI to society. Steve introduces and discusses the efforts of AI4Diversity, and the importance of having diverse teams involved in the development of AI. www.ai4diversity.org www.hackmakers.com seerdata.ai
  • 19. Regulation to defend democracy from Big Tech

    58:53||Ep. 19
    Chris Cooper is a cultural anthropologist and Executive Director at Reset Australia, the Australian affiliate of the international Reset network and think tank working to drive public policy that tackles digital threats to democracy. Chris is also Senior Campaign Director at Purpose, an international social impact agency supporting leading activists and companies to develop strategies that can shift policies and change public narratives. Chris comments on the current state of Australian tech regulation. We discuss how to identify bad actors and bad content online. Chris shares his definitions of misinformation and disinformation, a key focus for Reset: both are false information that is shared, but misinformation is shared without the sharer knowing it is false, whereas disinformation is shared by a sharer who knows it is false. Chris describes efforts at Reset to build on the work of the UK’s “age-appropriate design code” and the “best interests principle”, which requires that digital platforms children are likely to use must prove they are designed and operated with the best interests of those children in mind. Chris relates Reset’s key objectives for policy change: regulation on digital platform accountability and responsibility; regulation to eliminate risks from systems and processes, giving regulators more oversight over the design of systems in use by companies; regulation to address community and societal risks (one person misinformed is not so problematic, but a fragmented society consuming two different versions of the truth is a problem for democracy); establishment of regulatory responsibility in these areas with a new regulator or an existing agency; and equipping regulators with powers to enforce regulation with penalties proportionate to the scale of harms caused. We ask Chris for his thoughts on the issue of foreign interference in Australia’s democratic system. Chris makes the case for increased transparency from digital platforms that are a significant source of information for the Australian citizenry. Chris asserts that polarisation of public opinion on critical issues, as well as the proliferation of hate speech and racism, is exacerbated by social media, and that regulation is required to address this. https://au.reset.tech/ https://www.purpose.com/ https://seerdata.ai
  • 18. Disinformation, Democracy & Elections

    47:01||Ep. 18
    Katie Harbath is a global leader at the intersection of elections, democracy, civics and tech. Katie was the public policy director at Facebook for 10 years and is credited with building out and leading a global team responsible for managing elections. She played a significant role in getting governments and elected officials around the world - at the local, regional and national levels - to use Facebook and Instagram as a way to connect and engage with constituents. In this Episode, we delve into what you need to know on the eve of the Australian Federal Election. Katie helps us understand the dilemmas, hard trade-offs and decisions behind the social media platform products and policies that set the rules for managing the spread of misinformation, disinformation and mal-information. She talks us through the impact of data and digital on elections and democracy. We explore Elon Musk’s announcement of the purchase of Twitter. Katie calls for action and plans to build the guardrails for social media platforms to protect integrity and reduce harm to democracy. We discuss the need for leaders, product owners and campaigners to admit what has and hasn’t worked to reduce bad outcomes that denigrate democracy. Katie discusses product and legislative protections. She gives advice on what we can all do to reduce the spread of misinformation and talks about what we can expect in the future. www.anchorchange.com www.seerdata.ai
  • 16. Quorum Breaker

    01:02:26||Ep. 16
    State Rep. Claudia Ordaz Perez represents Texas’s House District 76 in El Paso County. She is the former Mayor Pro Tempore and City Councilwoman for the City of El Paso, where she was an advocate for working parents and family caregivers. At City Council, she was successful in creating local policies impacting living wages for workers, local park enhancements for children, funding for new infrastructure for municipal police and fire departments, local animal shelter improvements, and promoting investment opportunities to expand job growth in the Borderplex region. In 2021, Rep. Ordaz Perez was among a group of Texas Democrats who broke quorum to halt a legislative session in Texas and fight a controversial voting bill. The law added new identification requirements for voting by mail, banned 24-hour voting and drive-through voting, and established uniform voting hours in the state. Republicans argued it was needed to ensure election integrity; Democrats said the proposed rules disproportionately affected minority voters, and they fled Texas to break quorum as a result. Busting the quorum isn’t unheard of - in fact, it has happened at least two other times in Texas political history. But it is considered a nuclear option, a last resort when debate has shut down and one side believes it’s being railroaded. The Texas Democrats’ quorum-breaking departure captured attention around the world and was hard to ignore. And while they may not have spurred immediate federal change in their favour, this dramatic walkout halfway across the country marked a new inflection point in the national voting rights debate and shaped Texas politics forever. In this Episode, Rep. Ordaz Perez shares the needs of the Borderplex community in El Paso, the changes in legislation that drove her to work with fellow Representatives to break quorum, the development of the “black and brown” movement led by women, the reception in Washington D.C., and the importance of data-informed discussion on critical legislation to protect the democratic process in the United States. https://house.texas.gov/members/member-page/?district=76 www.seerdata.ai
  • 15. Dreaming a Bigger Story

    33:31||Ep. 15
    Joe Couch is a narrative designer, storyteller and Founder of the fast-growing start-up Omelia. Omelia is a narrative engine for developing our future stories. Joe and his Co-founder, Kate Armstrong-Smith, have created a suite of narrative design tools powered by the Omelia Engine to enhance collaboration, creativity and the development of story-based content. Whether for a movie script, game design or interactive experience, Omelia is the narrative technology story makers have been waiting for. Joe talks about how he and Kate saw the need for Omelia. We talk about the challenge writers and producers face creating new, original stories that represent the complexity of today’s world. Joe shares his personal story as a writer and creative, his motivations, and the opportunity Omelia offers creative industries. We explore the opportunities for AI and ML to unleash creativity and offer writers, producers and studios new ways to develop and manage narrative design that aims to enhance the creative process, not detract from it. www.omelia.com www.seerdata.ai
  • 14. Data is Power

    37:42||Ep. 14
    Stefaan Verhulst is Co-Founder and Chief of Research and Development at The GovLab, New York University. Stefaan founded The GovLab with the goal of strengthening the ability of institutions and people to work more openly, collaboratively, effectively, and legitimately to make better decisions and solve public problems. Stefaan says the COVID-19 pandemic has been a watershed moment in which we’ve realised that we don’t have access to a lot of the data we need, and that we need to unlock data assets that could be used to save lives. Stefaan advocates for more institutions to “publish [data] with purpose” by identifying a public interest benefit for which the data is required. Stefaan describes advances in the disciplines of purpose specification, problem specification, and question definition, which require a skillset that many policy professionals assume they have but often don’t. Stefaan emphasises the importance of inclusivity in question formulation, admonishing us to pursue not just data equity but also question equity, so that the questions for which answers are sought and metrics are developed are those that really matter to society. Stefaan observes that power dynamics are determined by asymmetries, such as between the data “haves” and the data “have nots”. Stefaan quotes Sir Francis Bacon, who said “knowledge is power”, asserting that in the 21st century “data is power”. Stefaan describes a variety of data asymmetries, such as between consumers and corporations, between citizens and government, and between business and government, and argues that addressing these asymmetries is essential for achieving “digital self-determination” for individuals and groups. Stefaan acknowledges some tensions between the ideal of data sharing and reuse for public benefit and that of digital self-determination, where these principles interface at the concept of privacy. Stefaan says this balance will not be easy to find, but argues that with data we need to go beyond consent and aim to avoid not just misuses but also missed uses. Stefaan believes legislation will be inadequate for arbitrating all specific circumstances, and that Data Stewards as a profession will need to be skilled in evaluating the appropriateness of the purpose and the fitness of the data for sharing, and empowered to do so. www.seerdata.ai www.thefoil.ai
  • 13. Virtual Reality Sexual Assault, AI Risks to Women & Bias

    41:24||Ep. 13
    On International Women’s Day we celebrate by speaking with Dr Catriona Wallace, who is a mother and a global leader in AI ethics. She sits on numerous boards and educates leaders around the world on mitigating unintended harms from AI. Catriona discusses the recent emergence of metaverses: immersive virtual worlds where users can interact in new and creative ways. Catriona relates a recent incident in which a woman, Nina Jane Patel, was virtually assaulted within Horizon, a metaverse created by Meta, and another incident in which the owner of a virtual residence found that their virtual dwelling was being squatted in, with no clear recourse to justice. We discuss the risks of bias in AI algorithms and how women have historically been under-valued by AI systems tasked with recommending job candidates for Amazon or estimating customer creditworthiness for Goldman Sachs and Apple. Catriona argues that this bias stems from inadequate representation of women in the data used to train the AI systems, and under-representation of women in the field of Data Science. Catriona observes that 85 million jobs will be replaced by AI systems, and that 90% of these jobs are held by women and minorities. Catriona argues that the responsibility for AI-enhanced real-world decisions should remain with business owners, not the technical teams who develop the AI systems. Catriona relates her experience as the Executive Director at the Gradient Institute training boards and executives who have very little understanding of AI. Catriona describes how it is predominantly young men who are creating datasets, for example by manually labelling images, and how this is one way in which bias is introduced into AI systems. Catriona talks about the work of the Gradient Institute training Data Scientists to code ethically and teaching them about tools that are available for assessing whether their work is having unintended consequences.
    Catriona advocates for regular assessments of AI systems by external assessors to give Data Scientists feedback on how they can be more responsible. Catriona shares the recent release of Australia’s first Responsible AI Index by Fifth Quadrant, Ethical AI Advisory, and the Gradient Institute. The research found that only 8% of organisations had any type of Responsible AI maturity. Organisations can measure their own Responsible AI maturity using the Responsible AI Self-Assessment Tool (fifthquadrant.com.au). Catriona observes that many of the entry-level, administrative, and customer service jobs that will be automated by AI systems in the coming years are typically held by women and minorities, and that Australia needs another 160,000 Data Scientists to keep pace with global industry. www.seerdata.ai www.thefoil.ai
  • 12. Australia's Data Front Door

    45:28||Ep. 12
    Andrew Lalor is Assistant Secretary, Data & Digital, at the Department of the Prime Minister and Cabinet (PM&C). In this Episode, Andrew describes his journey through the Australian Public Service (APS) and the development of his passion for data analytics and data governance. At PM&C, Andrew works with his team and others to advance the public data system as a whole. We talk with Andrew about the development of the Australian Data Strategy, which describes why data is important and the opportunity to create enormous value through appropriate sharing and use of data, and paints an expansive vision for Australia as a leading data-driven economy. Andrew describes how the Australian Data Strategy emerged as part of the Australian Government’s broader work on its whole-of-economy Digital Economy Strategy. Andrew outlines two of the key actions described by the Strategy, one of which is the creation of a “Front Door” that will make it easier for people to access data managed by the Australian Government, including improvement and expansion of the data.gov.au website to cover both open data and other data assets that can be made available in controlled, secure environments. The other key action Andrew describes is the creation of the National Disability Data Asset, which aims to provide insight into the experiences of people with a disability and help to develop better and more personalised disability services. Andrew describes the difficulties of building and maintaining public trust in Government, and how the Australian Government is committed to being transparent about the ways in which it collects, stores, and uses data. Andrew echoes the concerns of Australians about the privacy and security of their data and acknowledges that for many these concerns are higher priorities than quality, convenience, or price when considering products and services.
    Andrew observes that the positive experience of many Australians when engaging with Australian Government services is enabled by efficient and effective use of data. We discuss the Data Availability and Transparency Bill, which seeks to provide a preferred pathway for sharing public sector data, ensuring it is accessible and that sharing is safe, consistent and streamlined, and the Intergovernmental Agreement on Data Sharing, which recognises the immense value that can be created when data flows between the Australian Federal Government and the State and Territory Governments, as evidenced throughout the COVID-19 pandemic. Andrew also references the Consumer Data Right. The Australian Data Strategy is open for public consultation until 30 June 2022, and submissions are welcome via smart form. www.seerdata.ai www.thefoil.ai