Queen's School of Computing

Artificial Intelligence

Artificial intelligence research is the quest to build machines that can reason, learn, and act intelligently – to understand and reproduce human cognition. With increasingly complex algorithms, growing computing power, and the collection of greater amounts of data, society is on the verge of a fundamental transformation.

AI is already ubiquitous. It is used to recommend what you should buy online, to understand what you say to virtual assistants, to recognize who is standing next to you in a snapshot, to warn you when you are the target of phishing, and to detect when you are the victim of credit card fraud. The application of AI is spreading to more aspects of our everyday lives. AI will drive economic growth, boost innovation and tackle issues of global significance from healthcare delivery to food security to public safety.

Artificial Intelligence Research Topics

  • Machine learning and deep learning: The most significant AI research breakthroughs in recent years have come from the related areas of machine learning and deep learning. With machine learning, an algorithm improves its own performance by learning from data rather than being explicitly programmed for every case. Deep learning is a branch of machine learning that analyzes data with an artificial neural network (ANN), a layered network of simple processing units that loosely mimics the operation of the human brain and is especially effective on unstructured data such as images, audio, and text (a minimal training sketch follows this list). These approaches have a wide spectrum of current and potential applications, from enabling self-driving cars to creating robotic assistants and companions to helping healthcare professionals diagnose patients faster and more accurately.
  • Robotics: AI and robotics have a symbiotic relationship – AI involves automated decision-making while robotics involves manipulating objects. Combining the two creates the ability to take effective or desired action in the real world and to acquire real-world data to adjust or improve performance. Visions of the future include robots and drones transporting goods to market, helping with routine surgical procedures, and providing personalized assistance to people with physical limitations.
  • Cognitive computing: Closely related to AI, cognitive computing involves the simulation of human thought processes in computerized models. It features self-learning systems that use data mining, pattern recognition, and natural language processing to mimic the operation of the human brain. Cognitive computing enables computers to address complex situations characterized by uncertainty and ambiguity. The main aim of cognitive computing is to help humans with decision-making, for example, to enable accurate data analysis, create more efficient business processes, or improve customer experience.
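
As a concrete illustration of the machine learning and deep learning item above, the following is a minimal sketch, assuming only NumPy, of a tiny artificial neural network that adjusts its own weights from labelled data. The network size, learning rate, and XOR training examples are illustrative choices, not a system described in this document.

```python
# A minimal artificial neural network (ANN) trained with gradient descent.
# Everything here (network size, learning rate, XOR data) is an illustrative
# choice for this sketch, not a system described in the text.
import numpy as np

rng = np.random.default_rng(0)

# Four labelled examples of the XOR function: inputs X, desired outputs y.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 units; weights start random and are adjusted from data.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 0.5
for step in range(10_000):
    # Forward pass: compute the network's current predictions.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: gradients of the squared error w.r.t. each parameter.
    d_output = (output - y) * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

    # Gradient-descent update: the model adjusts itself using only the data.
    W2 -= learning_rate * hidden.T @ d_output
    b2 -= learning_rate * d_output.sum(axis=0)
    W1 -= learning_rate * X.T @ d_hidden
    b1 -= learning_rate * d_hidden.sum(axis=0)

# Predictions should approach [0, 1, 1, 0] as training proceeds.
print(np.round(output.ravel(), 2))
```

Production deep learning systems apply the same idea at vastly larger scale, with many layers, far more data, and specialized hardware.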

Biomedical Computing

Biomedical computing research is developing tools that enable the biomedical community to collect, analyze, model, simulate and share data on human health and disease. It does so by uniting the diagnostic, investigative and patient-centered approach of medical science with the power, speed and problem-solving capabilities of leading-edge computing. It already shows the potential to make a substantial impact on the most significant challenges in improving human health and well-being.

Modern biomedical advances – from genomics to imaging to proteomics – rely on computing infrastructure and expertise to make sense of, draw conclusions from, and apply in practice the massive amount of information that is collected on human health.

Biomedical Computing Research Topics

  • Bioinformatics: An interdisciplinary field, bioinformatics combines computer science, biology, pharmacology, information engineering, mathematics and statistics to analyze and interpret biological data. Bioinformatics research leverages vast datasets to solve problems involving gene sequences, protein structures, molecular evolution, disease mechanisms, and drug testing, and plays a key role in the emerging area of ‘personalized medicine.’ A small sequence-analysis sketch follows this list.
  • Image-guided surgery: IGS helps surgeons to perform safer and less-invasive procedures by using surgical instruments in conjunction with medical imaging. IGS systems use cameras or electromagnetic fields to help orient the surgeon by providing three-dimensional images of surgical instruments relative to the patient’s anatomy. IGS technology helps to shorten operating times, decrease incision size and reduce a procedure's invasiveness, all of which contribute to better patient outcomes and quicker recovery. It also provides new options for patients with multiple medical problems, patients who may not be able to tolerate invasive surgeries, and patients whose conditions in the past might have been considered inoperable.
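
To make the bioinformatics item above more concrete, here is a minimal sketch of basic DNA sequence analysis. The sequences are hypothetical fragments invented for illustration; real bioinformatics pipelines operate on far larger datasets with specialized tools.

```python
# A minimal bioinformatics sketch: basic analysis of short DNA sequences.
# The sequences below are made-up illustrative fragments, not data from
# any study mentioned in the text.

def gc_content(seq: str) -> float:
    """Fraction of bases that are G or C, a common summary statistic."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def reverse_complement(seq: str) -> str:
    """Complementary strand read in the opposite direction."""
    pairs = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(pairs[base] for base in reversed(seq.upper()))

def point_mutations(seq_a: str, seq_b: str) -> int:
    """Count positions where two equal-length sequences differ (Hamming distance)."""
    return sum(a != b for a, b in zip(seq_a.upper(), seq_b.upper()))

if __name__ == "__main__":
    reference = "ATGGCGTACCTGAAC"   # hypothetical reference fragment
    sample = "ATGGCGTACCTCAAC"      # hypothetical patient fragment
    print(f"GC content: {gc_content(reference):.2f}")
    print(f"Reverse complement: {reverse_complement(reference)}")
    print(f"Point mutations vs. reference: {point_mutations(reference, sample)}")
```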

Cybersecurity

Cybersecurity is the practice of protecting computer systems, networks, and programs from digital attacks such as malware, ransomware, phishing, denial-of-service attacks, and the exploitation of software vulnerabilities. These attacks have the potential to damage a victim – individual or organization, industry or government – through data theft, downtime, identity theft, reputational damage, and more. Modern cybersecurity is an ‘arms race,’ in which information technology security teams strive to keep ahead of cybercriminal tactics. Cybersecurity research and innovation generate advances that help address evolving cyber risks and build a trustworthy and resilient digital environment.

Cybersecurity Research Topics

  • Software Reliability and Security: Security has emerged as one of the most significant challenges facing developers of software systems. As software systems become increasingly integral to the management of finance, infrastructure, defence and industry, software failures can have catastrophic consequences, directly or indirectly: not only can they lead to property damage and financial loss, but they may cause serious injury or even loss of life. Computer science researchers work closely with computer engineers to assess, predict and improve the reliability, safety, and security of software products.
  • Using AI to Address Cybersecurity Threats: AI, via advanced analytics and machine learning, is redefining cybersecurity today. AI is increasingly effective in detecting cyber threats, such as malware, fraud, and intrusion. It also shows great promise in identifying vulnerabilities in digital platforms and in protecting them against cyber attacks (a toy anomaly-detection sketch follows this list).
  • Adversarial Analytics: The cybersecurity ‘arms race’ – between those who seek to secure a system, network or program from digital attack, and hackers who seek to access, alter or damage it in an unauthorized way – is an ongoing challenge. Technological advances such as AI can be applied not only by defenders but also by attackers. Adversarial analytics seeks to identify hacker tactics and protect against them and has applications in domains including policing, cybercrime, fraud detection, signals intelligence, counterterrorism, and anti-money-laundering.
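
As a toy illustration of the machine-learning approach mentioned in the ‘Using AI to Address Cybersecurity Threats’ item, the sketch below learns a statistical baseline of ‘normal’ traffic and flags large deviations. The traffic figures and threshold are fabricated; production systems use far richer features and models.

```python
# A toy sketch of AI-style anomaly detection for cybersecurity: learn what
# "normal" network traffic looks like from past observations, then flag
# departures from that baseline. All numbers are fabricated for illustration.
import numpy as np

rng = np.random.default_rng(42)

# Historical requests-per-minute for a service, assumed to be mostly benign.
baseline = rng.normal(loc=200, scale=20, size=1000)

# "Training": estimate the statistics of normal behaviour.
mean, std = baseline.mean(), baseline.std()

def is_anomalous(requests_per_minute: float, threshold: float = 4.0) -> bool:
    """Flag observations more than `threshold` standard deviations from normal."""
    z = abs(requests_per_minute - mean) / std
    return z > threshold

# New observations: ordinary load, then a burst that might indicate a
# denial-of-service attempt or other malicious activity.
for observed in [195.0, 230.0, 900.0]:
    status = "ALERT" if is_anomalous(observed) else "ok"
    print(f"{observed:7.1f} req/min -> {status}")
```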

Data Analytics

Data is everywhere in today’s world – from satellite imagery to social media posts to retail sales information to medical records. Data analytics examines large amounts of data to uncover patterns, discover correlations and generate insights. Adoption of data analytics technologies and techniques is transforming the way data is used, reshaping fields including medicine, retail, transportation, construction, and financial services. Data analytics is paving the way towards improved quality of life and increased operational efficiency. It is also the foundation for evidence-based decision making by individuals, firms, and governments.

Data Analytics Research Topics

  • Big Data Analytics: Big data analytics is the use of advanced analytics techniques on very large and diverse datasets that include structured and unstructured data with one or more of the following characteristics: high volume, high velocity, or high variety. Big data comes from sensors, devices, video, audio, the web, social media, and many other sources.
  • Business Intelligence: BI is a technology-driven process to analyze data from internal systems and external sources and present actionable information that helps corporate end-users make informed business decisions. A common way to present BI is through data visualizations – often accessed through data discovery tools and dashboards – that present data in ways that are accessible and user-friendly. Insights gleaned through BI can be used to make strategic business decisions that optimize business practices, identify market trends, increase profitability, enhance productivity, and accelerate growth.
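
A minimal sketch of the business intelligence workflow described above, assuming pandas is available: raw transaction records are aggregated into summaries of the kind a BI dashboard would visualize. The data, column names, and regions are invented for illustration.

```python
# A minimal business-intelligence sketch: turn raw transaction records into
# actionable summaries. The dataset is a tiny fabricated example.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "North"],
    "product": ["A", "B", "A", "A", "B"],
    "revenue": [1200.0, 800.0, 950.0, 400.0, 650.0],
})

# Summaries a dashboard might present: revenue by region and by product.
by_region = sales.groupby("region")["revenue"].sum().sort_values(ascending=False)
by_product = sales.groupby("product")["revenue"].agg(["sum", "mean", "count"])

print(by_region)
print(by_product)
```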

Human-Computer Interaction and Gaming

Human-Computer Interaction (HCI) is a multidisciplinary field of study focusing on the design of computer technology and, in particular, on the interaction and interface between humans (users) and devices and systems that incorporate or embed computation. HCI studies both how people use technological artifacts and how the artifacts can be designed to better facilitate human use. In HCI research, computer scientists collaborate with experts in engineering, psychology, linguistics, and social and system organization, to understand and improve the design, implementation, application, analysis, and evaluation of interactive systems.

Computer games have grown beyond their origins as entertainment activities. Researchers and practitioners have applied games and gaming in innovative applications: educational games, therapeutic games, simulation games, and gamification of utilitarian applications. Gaming researchers are developing games and applying gaming principles for the benefit of individuals and society. HCI plays a critical role in the study of games. By examining player characteristics, interactions during gameplay, and behavioral implications of gameplay, HCI professionals are helping to design and develop better games.

Human-Computer Interaction and Gaming Research Topics

  • Accessibility, Usability, and Inclusion: Technology users are unique and varied, and HCI and gaming best practices consider accessibility, usability, and inclusion throughout all phases of the design process. Accessibility means that all users, including those with disabilities, can equally perceive, understand, navigate and interact with tools, devices, and systems; usability incorporates user experience design (UXD) to ensure that tools are effective and satisfying to all users; and inclusion (also referred to as Universal Design) ensures involvement of diverse users to the greatest extent possible.
  • Virtual Reality/Augmented Reality: VR is the use of computer modeling and simulation to enable a user to interact with an artificial three-dimensional visual or other sensory environment. The user is immersed in a computer-generated environment that simulates reality through interactive devices such as goggles, headsets, gloves or bodysuits. AR, by contrast, enhances the real world: it creates an interactive experience in which objects that reside in the real world are overlaid with computer-generated perceptual information. Computer scientists are working to improve VR/AR hardware and software, to enhance the user experience, and to apply VR/AR not only in the entertainment industry but in many other areas, from education to remote conferencing to industrial design.

Software Engineering

Software Engineering (SE) is the branch of computer science that deals with the design, implementation, and maintenance of complex computer programs, emphasizing the entire software development process from initial idea to final product. The SE process encompasses several stages: from understanding end-user requirements to analyzing the financial, technical and operational aspects of software development, to creating a blueprint for the software, to coding, testing, installing, and maintaining it. Software engineering makes complex systems – from medical devices to the air traffic control network to nuclear power plants – operate effectively, safely, and reliably.

Software Engineering Research Topics

  • Mining Software Repositories: In MSR, software researchers and practitioners use data mining techniques to analyze the data in software repositories – such as source control repositories, bug repositories, archived communications, deployment logs, and code repositories – to extract useful and actionable information produced by developers during the software development process. The extracted data can be used to discover patterns and trends, support development activities, facilitate or improve maintenance of existing systems, and guide decision-making about future software development and evolution. MSR researchers have proposed techniques that augment traditional SE data, methods, and tools to solve important and challenging problems that practitioners face daily, such as identifying bugs and reusing code.
  • Recommender Systems: A recommender system is a type of information filtering system that seeks to predict the rating that a user would assign to an item, for example the star rating that a consumer would give to a product purchased from an online retailer. Recommender systems aim to predict user interests and provide personalized recommendations, thereby increasing sales, retaining customers, and enhancing users’ online experience. Different types of recommender systems have been developed, including content-based, collaborative-filtering, hybrid, demographic, and keyword-based systems; a minimal collaborative-filtering sketch follows.
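
The following is a minimal sketch of the rating-prediction task described in the recommender systems item, using user-based collaborative filtering with cosine similarity. The rating matrix is a tiny fabricated example, not data from any real system.

```python
# A minimal sketch of user-based collaborative filtering: predict the rating
# a user would give an item from the ratings of similar users. The rating
# matrix is fabricated for illustration (0 means "not yet rated").
import numpy as np

# Rows are users, columns are items; values are star ratings 1-5.
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two users, computed over items both have rated."""
    rated = (a > 0) & (b > 0)
    if not rated.any():
        return 0.0
    a, b = a[rated], b[rated]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict_rating(user: int, item: int) -> float:
    """Weighted average of other users' ratings for the item."""
    weights, weighted_sum = 0.0, 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        sim = cosine_similarity(ratings[user], ratings[other])
        weights += sim
        weighted_sum += sim * ratings[other, item]
    return weighted_sum / weights if weights else 0.0

# Predict the missing rating of user 0 for item 2.
print(round(predict_rating(0, 2), 2))
```

Content-based, hybrid, and other recommender types replace or combine the similarity signal used here with item features, demographics, or keywords.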

Systems and Networks

Research on systems and networks focuses on the tangible hardware on which computation occurs and the tangible networks over which information is transmitted, the foundations of computing infrastructure. Researchers in this area develop innovative theories, algorithms, and systems to solve real-world challenges in areas including parallel and high-performance computing, computer networking, cloud computing, computer architecture, and wireless communication.

Systems and Networks Research Topics

  • 5G Wireless Networks: The world of wireless communications has undergone revolutionary changes in the forty years since the introduction of the first cellular network. 5G – the fifth generation of wireless communications – is in the early stages of introduction, and it will be a game-changer. 5G will bring significant advances in three areas: connectivity (the number of devices that can communicate across the network), latency (how long it takes for one device to send information to another across the network) and bandwidth (the amount of data that can be transmitted, allowing for much higher download speeds). 5G will improve many people’s personal and work lives, with impacts including increased access to education and healthcare, improved emergency response, and reduced traffic congestion.
  • Cloud Computing: Rapidly moving from an emerging trend to an established technology, cloud computing is internet-based computing in which shared information, software, and resources are provided to computers and other devices on demand; it offers ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services). The cloud is transformative for organizations in all sectors as data analytics, AI, and other capabilities are made available as services via the cloud.
  • Edge Computing: Edge computing is the practice of processing data near the edge of a network where the data is being generated. Typically, IoT data is collected and transmitted to a cloud or data centre to be processed and analyzed, a reliable but time-consuming approach. In edge computing, sensors, controllers, and other connected devices collect and analyze IoT data themselves, or transmit it to a nearby server or laptop for analysis. When data processing and analysis occurs at the edge, the data can be immediately analyzed—and put into action. Benefits of edge computing include lower operating costs, reduced network traffic, and improved application performance.
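
A toy sketch of the edge computing pattern described in the item above: readings are processed on the device where they are generated, and only a compact summary and any alerts are sent upstream. The sensor values, threshold, and upload function are simulated placeholders.

```python
# A toy edge-computing sketch: analyze sensor readings locally and transmit
# only a small summary, instead of streaming every raw reading to a distant
# data centre. All values and the "cloud" upload are simulated placeholders.
import statistics

raw_readings = [21.1, 21.3, 21.2, 21.4, 35.7, 21.3]  # simulated temperatures
ALERT_THRESHOLD = 30.0  # illustrative limit for flagging unusual readings

def process_at_edge(readings):
    """Local analysis on the edge device: summarize and detect anomalies."""
    summary = {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
    }
    alerts = [r for r in readings if r > ALERT_THRESHOLD]
    return summary, alerts

def send_to_cloud(payload):
    """Placeholder for the (much smaller) upstream transmission."""
    print("uploading:", payload)

summary, alerts = process_at_edge(raw_readings)
send_to_cloud({"summary": summary, "alerts": alerts})
```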

Theory of Computation

The theory of computation, also known as automata theory, is the study of abstract mathematical machines (or models of computation – the most well-known being the Turing machine) and the computational problems that they can be used to solve. Automata enable a researcher to understand how machines compute functions and solve problems. The original motivation for automata theory was to develop methods to describe and analyze the dynamic behaviour of discrete systems. The field is closely related to formal language theory because automata are often classified by the class of formal languages they can recognize. Recent applications of automata theory and formal languages in bioscience and DNA computing have generated renewed interest in this foundational area.
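
To illustrate the kind of abstract machine automata theory studies, here is a minimal sketch of a deterministic finite automaton (DFA) that recognizes the formal language of binary strings containing an even number of 1s; the state names and transition table are illustrative choices.

```python
# A minimal deterministic finite automaton (DFA): it recognizes the formal
# language of binary strings with an even number of 1s.

# Transition table: (current_state, input_symbol) -> next_state
transitions = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
start_state = "even"
accepting_states = {"even"}

def accepts(word: str) -> bool:
    """Run the DFA on `word` and report whether it ends in an accepting state."""
    state = start_state
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting_states

for w in ["", "0", "1", "1010", "111"]:
    print(f"{w!r:8} -> {'accept' if accepts(w) else 'reject'}")
```

More powerful models of computation, up to the Turing machine, extend this same idea of states and transitions with additional memory.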

Theory of Computation Research Topics

  • Computability Theory: Computability theory deals primarily with the question of the extent to which a problem can be solved on a computer. It remains an active field of research and is commonly used to discover the computational expressiveness of new ideas in computer science, mathematics, and physics.
  • Computational Complexity Theory: Computational complexity theory considers not only whether a problem can be solved on a computer, but also how efficiently it can be done. Both in theory and practice, computational complexity theory helps computer scientists determine the limits of what computers can and cannot do and therefore aids in algorithm design and analysis.