The University of Manchester home
Faculty of Humanities

AI Trust and Security

We are an interdisciplinary group of researchers interested in the co-shaping of AI and society, with a focus on how these technologies intersect with our political, economic, and emotional lives.

Key research areas

  • The affective and political implications of AI companions
  • AI-driven harms and responsible design
  • AI in finance and corporate compliance
  • The governance of speech on and with GenAI
  • How AI can both drive and ameliorate the informational crisis

Current research projects

Professor Nicholas Lord is currently involved in several projects that analyse the role of AI in the organisation of varied frauds, as well as its role in regulatory, control, and compliance responses to fraud.

This project explores how diverse cultural contexts shape emotional relationships with AI in domains like therapy and companionship.

Using ethnography and theories from science and technology studies (STS), philosophy, and AI ethics, it examines trust, care, and personhood, reconceptualising emotionally responsive AI as relational actors embedded in local moral worlds (Dr Jennifer Cearns).

Grounded in theoretical approaches to responsibly designed Feminist AI and work on cultural discourses of hegemonic masculinity, this research examines how feminised AI companion apps contribute to male supremacist harm, while exploring opportunities to lessen these harms by fostering healthy manifestations of masculinity and communication in AI-human relationships (Dr Allysa Czerwinsky).

This research project seeks to map technical and regulatory issues impacting the implementation of AI-based tools across the compliance functions in the financial services industry.

Moving beyond fraud, we aim to understand the interactions between industry, civil society and regulators navigating AI implementation across customer due diligence (CDD), know-your-customer (KYC) and anti-money-laundering (AML) processes, sanctions, and financial crime compliance (Dr Borja Alvaro Alvarez Martinez).

This project examines online marketplaces for digital objects, focusing on the blurred boundaries between legal and illegal activity, the drivers of these markets, and the role AI plays in shaping human decision-making in these online spaces (Dr Diāna Bērziņa).

This project explores the mechanisms of information dissemination and examines individual decision-making and collective behaviour in digital environments, thereby revealing the co-evolution among information, belief, and behaviour (Dr Tao Wen).

Digital services are increasingly reliant on algorithmic systems to define and control both objectionable and desirable speech.

This project investigates how these technologies embody specific norms and logics of representation and legitimation, and the extent to which they reinforce or challenge democratic principles (Klara Matusewicz).

The research explores the feasibility of applying AI to covert testing as a tool for assessing corporate integrity, examining organisational responses to better understand how integrity is demonstrated or compromised in practice (Shaikh Waheed Mahmood). 

This project examines young Chinese “Dreaming Girls” (梦女) who form romantic relationships with AI-powered virtual lovers.

Based on a combination of digital ethnography and offline observation, it explores how platforms like LoveyDovey and Character AI transform solitary fantasy into interactive intimacy, revealing the paradox of experiencing genuine emotions through acknowledged artificial relationships (Yuqin Zhang).

Priority areas of research interest

  • AI, crime, and accountability
  • AI and intimacy
  • AI, power, and politics
  • AI as a research method

We are interested in collaborating on initiatives connected to these areas and other projects seeking to understand the societal impacts of AI development and implementation.


Meet the team

Cluster Leads: Dr João C. Magalhães and Dr Chloe Jeffries

Members:

  • Prof Nicholas Lord
  • Dr Jennifer Cearns
  • Dr Allysa Czerwinsky
  • Dr Borja Alvaro Alvarez Martinez
  • Dr Diāna Bērziņa
  • Dr Tao Wen
  • Klara Matusewicz
  • Shaikh Waheed Mahmood
  • Yuqin Zhang

Contact us

  • +44 (0)161 306 6000
  • Contact details

Find us

The University of Manchester
Oxford Rd
Manchester
M13 9PL
UK
