The Compliant and Accountable Systems Research Group (CompAcctSys) is a multi-disciplinary team working at the intersection of computer science and law.
Broadly, our research focuses on issues of compliance and accountability as they relate to emerging technologies. We consider how technology can be better designed, engineered and deployed to accord with legal and regulatory concerns, and seek to better ground legal and policy discussions in technical realities.
Some current topics include:
- Auditing complex and automated systems: Exploring requirements and mechanisms for facilitating the meaningful inspection and interrogation of systems and their behaviour. A current focus is on (i) augmented reality systems and the Internet of Things, and (ii) the use of machine learning in various contexts.
- Compliance and rights engineering: How systems can be better built to be (demonstrably) compliant with legal obligations, and to account for the rights of individuals.
- Algorithmic ‘reviewability’: Contrasting with the general focus on ‘explanation’, we consider what is necessary to facilitate the review of algorithmic and automated (ML-driven) decision-making systems.
- Decision provenance: Considering how tracing the flow of data can be leveraged to assist accountability in complex, automated and ML-driven environments.
- Centralised v. decentralised infrastructures: Considering the potential of data management and compute infrastructures, and their legal, regulatory and policy implications. Currently looking at personal data stores and data trusts.
- Platforms and online harms: Considering the design, use and abuse of platforms in perpetuating harms. The current focus is on social media (recommender systems) and cloud services.
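The idea behind decision provenance can be illustrated with a toy sketch: if each processing step records which inputs it consumed, a decision can later be traced back through the pipeline to its source data for audit. This is purely illustrative (all names here are hypothetical), not a description of any system built by the group:

```python
# Toy illustration of decision provenance: each processing step attaches
# a record of its inputs to its output, so the chain of data behind a
# decision can be walked backwards for audit purposes.
from dataclasses import dataclass, field


@dataclass
class Record:
    value: object
    sources: list = field(default_factory=list)  # upstream Records consumed
    step: str = "input"                          # name of the producing step


def apply_step(name, fn, *inputs):
    """Run a processing step and attach provenance to its output."""
    return Record(fn(*(r.value for r in inputs)), list(inputs), name)


def lineage(record):
    """Walk back through the provenance chain of a record."""
    chain = [(record.step, record.value)]
    for src in record.sources:
        chain.extend(lineage(src))
    return chain


raw = Record(42)
cleaned = apply_step("clean", lambda v: v / 2, raw)
decision = apply_step("decide", lambda v: v > 10, cleaned)

print(lineage(decision))
# traces the decision back through "decide" -> "clean" -> "input"
```

Real systems face harder questions — what to record, at what granularity, and how to make the resulting trace meaningful to an auditor — which is where much of the research interest lies.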
The group is involved in a number of projects, supported by a range of funders. These include:
- Realising Accountable Intelligent Systems (RAInS): Exploring issues of accountability, particularly relating to audit, in intelligent systems (AI/ML driven environments). A collaboration with the Universities of Aberdeen and Oxford. Funded by the EPSRC (a TIPS 2.0 project).
- Towards a legally-compliant Internet of Things: Investigating means for addressing compliance and accountability issues in the Internet of Things (pervasive computing). Funded by the EPSRC.
- Detecting and understanding harmful content online: Exploring methods and tooling for detecting harmful content (inc. hate speech), and developing governance regimes. A collaboration with King's College London, QMUL, UCL and the Alan Turing Institute. Funded by the Alan Turing Institute.
- Microsoft Cloud Computing Research Centre (MCCRC): A collaborative project with the QMUL Centre for Commercial Law Studies to perform a tech-legal analysis of issues at the intersection of cutting-edge technology and law. For details see here. Funded by Microsoft.
- Contextual fairness in ML: Exploring context-aware approaches to issues of fairness in machine learning systems. Funded by Aviva.
- Trust & Technology Initiative: A separate but related initiative involving members of the team that works to foster interdisciplinary research on trust & distrust regarding emerging technology. For details see here. Funded by the University of Cambridge.
We are keen to supervise student projects on related topics at all levels (undergraduate and postgraduate). Some project suggestions are available here.
The group also delivers the Advanced Computer Science MPhil unit Technology, Law & Society. The course aims to develop an awareness of the broader societal context of technology, and of how systems can be designed and engineered to facilitate accountability and legal compliance, and generally be better for society.