We offer a range of PhDs funded by different sources, such as research councils, industry, or charities.
To apply for a funded PhD please read the advertised project information carefully as requirements will vary between funders. The project information will include details of funding eligibility, application deadline dates and links to application forms. Only applicants who have a relevant background and meet the funding criteria can be considered.
1. Current PhD Opportunities
Data-Driven Hybrid Motion–Force Control for Robust Human–Manipulator Interaction
Lancaster University, in collaboration with the United Kingdom National Nuclear Laboratory (UKNNL)
We invite applications for a fully funded PhD studentship at Lancaster University’s School of Engineering, in partnership with the United Kingdom National Nuclear Laboratory (UKNNL). This exciting project will develop novel data-driven, robust, and adaptive control methods for human–robot interaction and teleoperation, with direct applications in nuclear robotics, hazardous-environment manipulation, and beyond.
Project Overview
Teleoperation is a critical enabler for safe and efficient operation in hazardous environments such as nuclear decommissioning. However, current industrial solutions suffer from limitations under uncertainty, time delays, and noisy sensing.
This PhD project will design and experimentally validate a hybrid motion–force control framework that ensures precise end-effector positioning while maintaining robust and adaptive force regulation under real-world conditions. Research will include:
Development of nonlinear robust adaptive controllers and disturbance observers.
Design of bilateral teleoperation schemes that enhance transparency and stability under communication delays.
Integration of data-driven approaches for force estimation and safety.
Experimental validation on industrial robotic platforms at the UKNNL Hot Robotics Facility.
The project provides the opportunity to work on cutting-edge robotics challenges with significant industrial impact, supported by state-of-the-art facilities at both Lancaster University and UKNNL.
Supervisory Team
Dr Allahyar Montazeri (Lead Supervisor, School of Engineering, Lancaster University; Data Science Institute Member)
Professor Plamen Angelov (Co-Supervisor, School of Computing and Communications, Lancaster University; Data Science Institute Member)
Training and Development
The successful candidate will receive a tailored training programme including:
Hands-on training with ROS2, MATLAB/Simulink, and CoppeliaSim.
Access to world-class robotics laboratories and facilities.
Opportunities to engage with national and international conferences, workshops, and training events.
Insight into the nuclear sector through industrial collaboration with UKNNL.
Funding
Duration: 4 years (3.5 years EPSRC Doctoral Landscape Award + 0.5 years UKNNL extension)
Coverage: UKRI minimum stipend, tuition fees for Home students, and a research training support grant.
Additional support for consumables, maintenance, and travel.
Eligibility
Open to UK Home students only, due to clearance requirements for UKNNL facilities.
Applicants should have (or expect to obtain) a First or Upper Second-Class degree (or equivalent) in Engineering, Control, Robotics, Computer Science, or a related discipline.
Strong mathematical and programming skills (MATLAB, Python, or C++) are highly desirable.
Application Process
Applicants should submit:
A full CV.
A one-page cover letter outlining their motivation and suitability for the project.
Reference letters from two academics commenting on the candidate’s abilities.
Applications will be considered on a rolling basis until the position is filled, with an expected start date of January 2025.
Closing Date – 1st December
For informal enquiries, please contact Dr Allahyar Montazeri.
Details
Start Date: As soon as possible
Deadline for application: Open (it is recommended you apply as soon as possible)
Interview: Rolling
Description
If you’re interested in protecting AI from rapidly emerging cyber threats and securing a technology that will define the coming decades, this PhD studentship is for you.
We are seeking candidates to join our AI security group at Lancaster University, and become part of this rapidly growing research field.
The adoption of Artificial Intelligence (AI) and prominent technologies such as Generative AI, LLMs, and Agentic AI systems is rapidly accelerating across both research and industry.
While there is considerable research activity on the application of AI for security, far less attention has been paid to the security of AI itself. AI security focuses on addressing cyber security risks to AI systems arising from a wide range of cyber attacks, spanning prompt injection, data leakage, jailbreaking, bypassing guardrails, model backdoors, and more. The emergence of such AI risks has drawn the attention of every nation and major business; however, existing cyber security tools and methods are ineffective within AI systems due to the intrinsically random, complex, and opaque nature of neural networks. To date, how to secure today’s and tomorrow’s AI models and systems remains an open problem.
This project will provide you with the skills and training necessary to become a researcher specializing in AI security – an area that is increasingly sought after in academia and industry.
Research Areas
Topics of interest you could pursue include:
Discover new types of cyber attacks / security vulnerabilities in AI and GenAI
Create defence systems and countermeasures against AI cyber attacks
Design run-time detection systems for prompt injection and jailbreaking
Explore different cyber attack modalities (e.g. malicious instructions embedded in images or audio)
Build and develop cutting-edge LLM guardrails and firewalls
Investigate hidden security characteristics within neural networks
Identify ShadowAI – malicious AI systems hidden within an organization
Uncover backdoor attacks and model hijacking within ML artefacts
What We Offer
A 3.5-year fully funded PhD studentship (including both tuition and stipend).
Access to a large-scale GPU data centre entirely dedicated to our research lab.
Comprehensive training in cutting-edge AI technology and cyber security techniques.
Employment opportunities at Mindgard (https://mindgard.ai/), an award-winning AI security company founded at our lab, and now based in the heart of London.
Collaboration opportunities with Nvidia, Mindgard, GCHQ’s National Cyber Security Centre, and NetSPI, amongst others.
Opportunity to travel to conferences internationally to present your research.
Our Research Lab
We are among the few labs globally specializing in AI security. You will be part of a new cohort of PhD students joining an established team of scientists and engineers. Founded in 2016, the research lab led by Professor Peter Garraghan is internationally renowned in AI systems and security: it has published over 70 research papers, secured over £14M in external grant funding, spun out Mindgard, and seen all of its research students to date secure positions in academia or industry R&D labs upon graduation.
About You
We highly value people who are kind, curious and believe in making a difference.
A good background in Computer Science, ideally a BSc in Computer Science (or equivalent) with a 2:1 classification or above.
Interest in Artificial Intelligence, Cyber Security, Distributed Systems, or a combination of the above.
Highly motivated, and capable of working both independently and as part of a team.
Good communication, technical knowledge, and writing skills.
Get in Touch
These positions are available now, so candidates are strongly encouraged to apply as early as possible.
For informal enquiries about these positions, please contact and share your CV with Professor Peter Garraghan. To apply, please visit our school PhD opportunities page, which includes guidance on submission, and a link to the submission system.
Details
Academic Requirements: First-class or 2.1 (Hons) degree, or master’s degree (or equivalent) in an appropriate subject
Recently, we have seen a transformative change in the use of artificial intelligence (AI) technology in many aspects of our lives. In our personal lives, we have access to services and tools that make use of AI in creative and useful ways and – similarly – in a professional setting, AI is being used to enable major changes to the way business is conducted. Some propose that we are at the beginning of a journey in which AI will fundamentally change the way our societies and businesses function.
The concept of AI has been around for several decades and can take many forms. A recent US National Institute for Standards and Technology (NIST) document (NIST AI 100-2e2023), which examines AI attacks, defines two main classes: (i) predictive; and (ii) generative AI. The former is concerned with predicting classes of data (e.g. for anomaly detection); whereas the latter is used to generate content, often using large language models (LLMs). In general, this is not a new technology. However, the recent rapid acceleration of the use of AI has emerged because of new generative models and abundant access to task-specific compute capabilities.
Inspired by this trend, the nuclear sector is exploring the use of AI and its capabilities to support a variety of functions. For example, it can be used to enable efficiencies in business process execution, supporting staff with a variety of decision-making tasks using AI-enabled assistants. Moreover, AI can be used to support other functions in a nuclear setting such as those related to physical security, materials inspection, and automated and autonomous robotics and control. A comprehensive review of the uses of AI in the nuclear sector has been produced by the International Atomic Energy Agency (IAEA)[1].
An emerging area of application of AI is to support efficient, safe and secure use of operational technology (OT). This can take many forms, including using machine learning models to optimize control strategies without the need to develop mathematical models of a target system, supporting predictive maintenance to ensure maintenance activities are realized in a cost-effective and safe manner, enabling autonomous operations, and using various forms of machine learning to predict and classify anomalous system behaviour. OT systems typically support business and – in some cases – safety critical functions; therefore, the correct operation of OT that incorporates AI is of the utmost importance.
Nuclear is the most heavily regulated sector in the world. This is because of the uniquely severe consequences of failures of functions related to nuclear safety and security. Failures can result in major environmental disasters and loss of life. In this setting, the use of AI should be approached in a consequence- and risk-informed manner. An important way to manage risks that stem from errant AI behaviour is to realize so-called guardrails. Guardrails take many forms and can be described in this context as socio-technical measures to protect the function of systems from the errant behaviour of artificial intelligence. Example guardrails include policies that mandate that humans are integral to decision making that is supported by AI, or physical controls (safety interlocks, etc.) that prevent an AI-supported system from causing an accident. It is worth noting that guardrails will likely play an important role in gaining regulatory approval for the use of AI to support safety-relevant functions in nuclear.
Whilst chosen guardrails may be suitable at the genesis of a system, there are potential longitudinal socio-technical effects that might degrade their performance. These effects emerge because of different forms of “drift” associated with a system and its use. Example types of drift include organizational change (e.g. changes in policy), shifts in the criticality of functions and associated systems, changes in regulatory assurance requirements, and generational shifts in staff experience and knowledge, e.g. caused by AI-supported autonomy. These changes may be slow and occur over extended periods, making them difficult to detect. The result is a failure or sub-optimal use of guardrails to effectively mitigate errant AI behaviour.
The aim of this PhD proposal is to investigate a framework that supports risk-informed decisions about the choice of guardrails for ensuring the safe and secure operation of nuclear functions that include systems with an AI component. Specifically, the project will focus on case studies that incorporate AI for improving the security and efficiency of OT in the nuclear sector. This framework should consider the characteristics of the guardrails (e.g. their cost, flexibility, scrutability, and effectiveness) along with how they are affected by longitudinal drift. The intention is to take a systems view, in line with work by Leveson et al.[2], who argue that traditional models of failure causality (the fault, error, failure chain) are inadequate for understanding the causes of failures. Rather, a more complex view of the system in its context, which includes changes in the way systems are operated over time, is better suited to this task.
Supervisor: Professor Paul Smith, School of Computing and Communications, Lancaster University
This is a 42-month funded project, including fees and an enhanced stipend.
Entry Requirements
Applicants must have a Master’s degree and/or a minimum of a 2:1 in their bachelor’s degree in computer science or a related field.
Applicants must be resident in the UK during the period of study; they may need to travel to collect data during their studies and will need to obtain security clearance. It is expected the primary fieldwork site will be in Cumbria.
You must provide an up-to-date CV and two references. We also request a written statement of purpose explaining why you want to undertake this project and why you have the requisite skills. You are also encouraged to submit a further piece of research or assignment work, a dissertation section, or a publication.
Applicants can contact Professor Paul Smith to discuss their applications.
We invite applications for a fully funded PhD studentship at Lancaster University in collaboration with SP Electricity North West (SP ENWL). This is an exciting opportunity to develop next-generation methods for attack surface mapping, exploring how data science, AI, and cyber security techniques can be used to produce more accurate and reliable tools that support decision-makers in their analysis of large-scale modern digital infrastructure, such as power grids.
PhD Overview
As society becomes increasingly reliant on digital infrastructure, it is critical that decision-makers at organisational and national levels understand the resilience of their systems. Analysts use Attack Surface Mapping (ASM) to identify their internet-connected digital assets and associated vulnerabilities. This allows them to understand how robust the infrastructure is, plan mitigation strategies, and support recovery post-attack.
This PhD will leverage data science, AI, and cyber security techniques to develop the next generation of ASM tools. Research will include:
Fusing multiple ASM tools and pieces of open-source information to give a more accurate understanding of attack surfaces than the current state-of-the-art tools can provide.
Developing techniques to measure and interpret the uncertainty of ASM results, giving practitioners confidence in their analysis.
Investigating how AI automation can safely and effectively improve the ASM process.
This PhD is in collaboration with SP Electricity North West, with a crucial focus on securing digital infrastructure across their network and enabling the secure deployment of innovative new services as they digitise their operations. Furthermore, this project aligns with ongoing work the team are carrying out with the UK’s National Cyber Security Centre (NCSC); as such, there is a real opportunity for your research to make an impact.
Supervisory Team
Dr Edward Austin (School of Computing and Communications)
Professor Nicholas Race (School of Computing and Communications)
Dr Xiandong Ma (School of Engineering)
Training and Development
The successful candidate will receive a tailored training programme including:
Support using, and access to, ASM tools such as Shodan and Censys.
Opportunities to engage with national and international conferences, workshops, and training events.
Insight into the power sector through industrial collaboration with SP ENWL.
Funding
A 3.5-year UKRI-funded studentship, including a stipend (currently £20,780 per year) and full tuition fees for Home students.
An additional research training grant (£1,000 per year) for consumables, maintenance, and travel to events/conferences.
Eligibility
Applicants should have (or expect to obtain) a First or Upper Second-Class degree (or equivalent) in Computer Science, Data Science, or Cyber Security. Applicants from other disciplines with a substantial mathematical component are also encouraged to apply.
There is no expectation that a candidate will be proficient in all areas of data science, cyber security, computer networking and AI tooling. However, candidates should be aware that this PhD will have a substantial cyber security component.
Application Process
Applicants should submit:
A cover letter outlining their motivation and suitability.
A CV outlining skills and experience.
Applications will be considered on a rolling basis until the position is filled. The expected start dates are either April 2026 or October 2026.
The School of Computing and Communications (SCC) at Lancaster University is pleased to invite applications for three funded PhD studentships supported by the EPSRC Doctoral Landscape Award (DLA).
These studentships are not restricted to a specific research field. We welcome applications from outstanding candidates wishing to pursue research across any area aligned with SCC expertise.
Applicants are encouraged to explore the range of research themes within the School of Computing and Communications to identify potential areas and supervisors prior to applying: SCC research themes and expertise.
SCC hosts a diverse and interdisciplinary research environment. Applications are particularly encouraged in areas including (but not limited to):
Artificial Intelligence and Data Science
Cyber Security
Communications and Networks
Distributed Systems
Human-Computer Interaction
Software Engineering
High-Performance and Cloud Computing
Funding
One studentship is available with funding suitable for an International applicant
Two studentships are available with funding at the Home fee rate
A tax-free stipend in line with current UKRI rates
Access to training, research support, and development opportunities
International applicants are encouraged to apply. While two awards are primarily intended for Home candidates, exceptionally strong international applicants may be considered for full funding, subject to approval, with the University potentially covering the fee difference.
In cases where this is not approved, international candidates offered a Home-funded studentship will be required to cover the difference between Home and International tuition fees.
Entry Requirements
Applicants should have:
A first-class or strong upper second-class (2:1) degree (or equivalent) in a relevant subject
1. Find a Supervisor
Contact your chosen academic to discuss your proposed research
Obtain their agreement to support the application
Applications without a confirmed supervisor will not be considered.
2. Submit Your Application
Applications must be submitted via the process outlined in step 2 of this webpage.
Applicants must provide:
A full CV
Degree certificates and transcripts (BSc and/or MSc)
Two academic reference letters
Proof of English language proficiency (if required)
A research vision document (PDF), including:
Research Proposal
Research aims and objectives
Research questions and/or hypotheses
Proposed methodology
Expected outcomes
Consideration of ethical and societal implications (e.g., data privacy, AI bias, environmental impact, human/animal safety)
A responsible approach to research (e.g., integrity, transparency, reproducibility)
Potential impact of the proposed work academically, industrially, or societally
Previous Work and Impact
Provide evidence of any research outputs, publications, or projects that demonstrate your ability to conduct high-quality research.
If you have no publications, reflect on your BSc/MSc thesis or other significant projects, highlighting contributions, innovations, or outcomes.
Focus on impact or significance of your work, e.g., advancing understanding in your field, practical applications, or wider societal benefits.
The aim is to demonstrate your potential to produce impactful research during your PhD.
Publications List (if applicable)
For each publication, please provide the following details:
Full list of authors, in the order shown on the publication
Publication title
Venue (journal, conference, or book) and year of publication
Your contribution to the work (e.g., experimental design, analysis, writing, software)
Link to the online version (e.g., DOI, publisher page, or preprint server)
Applicants with no publications can provide significant project reports, thesis work, or other outputs demonstrating research productivity.
Selection Criteria
Candidates will be assessed based on:
Academic excellence
Strength and feasibility of the research proposal
Alignment with SCC expertise
Research potential
References
Informal Enquiries
Applicants are strongly encouraged to contact potential supervisors within SCC before applying.
2. Create your application
You will need to put in an application to the University's online application system. Please follow the University's guidance regarding the required documentation.
Please make sure to include a CV (mandatory, maximum of two pages) including your previous degrees and graduation grades, as well as any relevant skills. Where it applies, also include awards of excellence, publications, and links to code releases, such as through GitHub.
Please follow all of the requirements. Not adhering to these requirements may at best delay the processing of your application, and at worst might result in immediate rejection. The preferred format for all supporting documents is PDF.
2.1. Your Research Proposal
Please note that even if you are applying for a funded PhD position, you will need to develop a proposal.
At the top of the first page of the Research Proposal, please include the following information:
Mandatory
A clear indication of the SCC research group(s) you want to work with.
A list of two or three works that are similar to your proposal. This list is in addition to any other references you may wish to include.
Optional
The names of the SCC academic(s) you want to work with. Please also indicate if you would like us to consider your application if your preferred supervision team is not possible.
2.2. Your Personal Statement
A personal statement is mandatory and should be a maximum of one page. The document should explain your motivation to work on your chosen project and a little about your background.
Other methods of applying for a PhD
Studying for a research degree is a highly rewarding and challenging process. You’ll work to become a leading expert in your topic area, with regular contact and close individual supervision from your supervisor.
If you have your own research idea, we can help you to develop it. To begin this process you will need to find a PhD Supervisor from one of our research groups, whose research interests align with your own.
You can also apply for a PhD from one of the Doctoral Training Centres and Partnerships that work with the School of Computing and Communications. Details of each of the Training Centres are provided here.