Macquarie University NSW 2109
Our research strategy
We span a wide range of research topics, reflecting our multidisciplinary membership.
The Macquarie University Ethics and Agency Research Centre (EAC) focuses on developing research networks, fostering new collaborations and mentoring the next generation of researchers.
Centre members can apply for funds across a range of research activities such as:
- conducting research projects
- running workshops
- holding research meetings
- inviting internationally renowned scholars to visit the centre
- supporting higher-degree research students.
Current centre research falls broadly into ethical and legal challenges related to technology, social justice and equity, and health and healthcare.
We have active research on a range of health topics. Some recent activities include:
- Ebony Birchall (Law School) co-hosted a workshop with the Public Interest Advocacy Centre that brought together 25 academic, industry, and non-profit sector experts to share knowledge on healthcare provision within the immigration detention network. The team plans to develop a research program focused on access to justice for immigration detainees.
- Wendy Lipworth (Department of Philosophy) has submitted a Future Fellowship application on philosophical approaches to research integrity. The EAC is central to the application, with the centre forming the core of the new Macquarie University Research Integrity Laboratory, to be established if the project is funded.
- Several members of the EAC (from the Law School, the Australian Institute for Health Innovation and the Department of Philosophy) are joining another application led by Wendy Lipworth, for an Australian Research Council Centre of Excellence on the governance of health technology innovation.
Social justice and equity are central to research in the EAC. For example:
- Jane Johnson (Department of Philosophy) leads work on the care of non-human animals and its impact on both animals and their human carers. In 2023 she ran a workshop with animal technicians and managers from research organisations across Sydney examining the challenges they experience as part of their work. The workshop helped establish a new research team in the EAC that will work towards a collaborative grant application.
- Gender equity is central to research led by Katrina Hutchison (Department of Philosophy). One of her current projects focuses on gender bias in various careers, including surgery, law, philosophy, information technology, biology and astronomy. Her team is investigating the impact of common interactions (such as microaffirmations, microaggressions and microinequities) on career choices. The associated workshop attracted 35 attendees, including prominent professional leaders, leading to plans for a collaborative publication.
- A new project, ‘Towards an Indigenous-led conceptualisation of race’, led by Paul Podosky (Department of Philosophy) brings EAC members with expertise in the philosophy of race into dialogue with scholars from Indigenous studies at Macquarie University and beyond.
Generative AI and creative industries: Ethical, legal and work implications
Discovery project Chief Investigator: Professor Paul Formosa
Generative AI is creating significant new challenges in the creative industries as it consumes the copyrighted outputs of creative workers to generate content that can compete with the outputs of those same workers.
Using an innovative interdisciplinary approach and industry collaborations, this project will generate solutions to the ethical, philosophical, legal, and workplace problems created by Generative AI in the creative industries, a sector contributing $90 billion and more than 700,000 jobs to the economy.
The national benefit of this project will be the design of an innovative framework for responding to this economy-altering technology in a fair and ethical manner, while drawing on the perspectives of impacted creative workers.
This project will directly develop solutions to the problems raised by Generative AI for the nation's vital creative industries, which contribute over $115 billion annually to the economy and employ more than 600,000 people. As Generative AI rapidly transforms creative work, it is essential that Australia develops responsible strategies to harness these technologies for innovation and productivity while mitigating risks to workers and society.
By examining the ethical, philosophical, legal, and work-related implications of Generative AI in creative industries such as literature, software development, and screen writing, this project will deliver actionable research-based recommendations to support Australia's creative professionals in navigating this disruptive technological shift.
Anticipated outcomes include:
- scholarly publications
- popular media pieces
- industry reports
- a multi-stakeholder workshop.
The project will enhance Australia's research capacity in this emerging field while fostering valuable domestic and international collaborations. Crucially, this research will help position Australia as a global leader in responsible AI innovation, safeguarding the nation's creative ecosystem and workforce.
This project directly advances Australia's national interest across multiple domains:
- promoting responsible technology development
- supporting workforce adaptability
- protecting creative professionals' rights
- stimulating economic growth
- enriching cultural life.
No to BlackBox AI: Towards transparent and safe AI in healthcare
Discovery project Chief Investigator: Associate Professor Rita Matulionyte
While Artificial Intelligence (AI) offers immense potential for various sectors, there is little information about how AI applications are developed and tested. This lack of transparency contributes to AI safety issues and undermines trust. In healthcare, these challenges have led to limited adoption of AI in practice, with lost opportunities for patients and healthcare systems.
Based on new empirical and international comparative data, this project will develop an AI Transparency Map that identifies stakeholder transparency needs and current gaps. Outcomes will include a Framework of policy measures to improve AI transparency. Australia will benefit from safer and more effective adoption of AI in healthcare and other high-stakes sectors.
The global healthcare AI market was valued at USD$16.3 billion in 2022 and is expected to grow at a compound annual rate of 40.2 per cent to reach USD$173.55 billion by 2029.
While healthcare AI is expected to improve diagnosis and treatment of patients, decrease healthcare costs, and make healthcare more accessible, the adoption of AI tools in practice has been slow. This is due to a lack of trust in AI and safety issues, which are in turn caused by a lack of transparency around AI functioning and limitations.
Based on new empirical data, this project will develop a first-of-its-kind AI Transparency Map that identifies stakeholders' transparency needs for AI healthcare technologies and current transparency gaps.
The project will then collect best industry practices and international policy approaches to improving AI transparency, and develop a Model Framework proposing legal, policy and governance measures to foster transparency around healthcare AI.
These project outputs will enable governments and stakeholders to improve transparency around healthcare AI, which will lead to increased trust and safer use of AI, and eventually speed up the adoption of these promising technologies in practice.
The interdisciplinary and international project team will leverage their extensive industry contacts and engage with healthcare, AI and policy stakeholders throughout the project, ensuring that the project benefits the targeted stakeholders and society at large.