
Search Results

Results found for empty search

  • Tara Collingwoode-Williams

    Dr Tara Collingwoode-Williams
    Goldsmiths, iGGi Alum

    Tara is an iGGi PhD student from Goldsmiths, University of London, taking her MPhil/PhD in Intelligent Games/Game Intelligence with a focus on avatar embodiment and interaction within virtual reality. Before this she graduated with a BSc in Creative Computing. Over the years, her interdisciplinary profile has enabled her to work in technical support and research roles with many organisations related to her research, such as UCL, Great Ormond Street Hospital and the George Mason Serious Games Institute in the United States, where she also co-lectured an XR Games module, and, more recently, as an Associate Lecturer at Goldsmiths teaching Unity-based XR experience development. Currently, she is contracting for USTech as an Assistant UX Researcher at Facebook whilst completing her PhD programme.

    With the rise in demand for head-mounted displays (HMDs) comes the need to create Embodied Shared Virtual Environments (ESVEs) in which users can experience authentic social interactions. Tara's research presents an exploratory examination of embodiment, meaning the subjective feeling of owning a virtual representation in VR, and specifically of consistency in embodiment, relating to how we prioritise and synchronise objective attributes of embodiment (i.e. avatar representation) in order to create ESVEs that support more intuitive social interaction. The goal is to understand how different technical setups can have a psychological impact on participants' experiences in an ESVE. This research hopes to inform the development of successful social interaction in a variety of VR applications, ranging from training to gaming. Tara presently holds a position as Lecturer in VR at Goldsmiths, University of London.

    Email: tc.williams@gold.ac.uk
    LinkedIn: https://www.linkedin.com/in/tara-collingwoode-williams-81141776/

    Featured Publication(s):
    Delivering Bad News: VR Embodiment of Self Evaluation in Medical Communication Training
    The impact of self-representation and consistency in collaborative virtual environments
    G487 (P) Is clinician gaze and body language associated with their ability to identify safeguarding cues?
    Evaluating virtual reality experiences through participant choices
    A discussion of the use of virtual reality for training healthcare practitioners to recognize child protection issues
    A study of professional awareness using immersive virtual reality: the responses of general practitioners to child safeguarding concerns
    The effect of lip and arm synchronization on embodiment: a pilot study

    Themes: Applied Games, Player Research

  • How Do We Engage Children and Young People in the Design and Development of Mental Health Games

    How Do We Engage Children and Young People in the Design and Development of Mental Health Games
    Author(s): MJ Saiger
    Abstract: More info TBA

  • No Item Is an Island Entire of Itself: A Statistical Analysis of Individual Player Difference Questionnaires

    No Item Is an Island Entire of Itself: A Statistical Analysis of Individual Player Difference Questionnaires
    Author(s): N Hughes, P Cairns
    Abstract: More info TBA

  • Prof Simon Colton

    Prof. Simon Colton
    Queen Mary University of London
    iGGi Co-Investigator, Supervisor

    Simon Colton is an AI researcher with a particular focus on Computational Creativity, where software is engineered to take on creative responsibilities in art and science projects. He undertakes projects advancing the state of the art in generative technologies such as evolutionary approaches and deep learning, and uses these to help develop software such as The Painting Fool, The WhatIf Machine, the Wevva game designer, the HR3 automated code generator, and the Art Done Quick casual creator for visual art. In turn, these software systems and their output are used in cultural projects such as poetry readings, art exhibitions, game jams, and even the production of a West End musical. This enables Simon to undertake much public engagement, with coverage from the BBC, The Guardian, MIT Tech Review, The New Scientist and many others.

    These practical and cultural projects inform an evolving philosophical discourse around what it means for machines to be creative, and Simon has co-authored numerous essays driving forward our understanding of this important topic. In this way, he has helped to introduce ideas such as the automated framing of products and processes, issues of authenticity, and the notion of the machine condition, i.e., what the lived experience of a machine is and how this could be expressed by that machine through creative production.

    He is particularly interested in supervising students on projects that apply generative technologies to videogame design, visual art, software engineering, music and text generation. One current interest is stretching the boundaries of both what can be achieved by, and our understanding of, generative deep learning methods such as generative adversarial networks (GANs) and autoencoders. Another current interest is the design of casual creators: creativity support tools where the focus is on users having fun rather than on efficient, professional production of artefacts. He is currently developing a casual creator for visual art called Art Done Quick for public release, which employs evolutionary and deep learning techniques to deliver a fun-first experience while users make decorative art pieces. Any project involving generative technologies is of interest to Simon.

    Research Areas: Game AI, Game Audio and Music, Game Design, Computational Creativity, Player Experience, Casual Creators, Generative Deep Learning

    Email: s.colton@qmul.ac.uk
    Website: https://ccg.doc.gold.ac.uk/ccg_old/simoncolton/cv/

    Themes: Accessibility, Creative Computing, Game AI, Game Audio, Player Research

  • Deep visual instruments: realtime continuous, meaningful human control over deep neural networks for creative expression

    Deep visual instruments: realtime continuous, meaningful human control over deep neural networks for creative expression
    Author(s): M Akten
    Abstract: More info TBA

  • Dr Gaetano Dimita

    Dr Gaetano Dimita
    Queen Mary University of London
    Supervisor

    Gaetano Dimita is a Senior Lecturer in International Intellectual Property Law working on games and interactive entertainment law, regulation, transactions and esports law. He is the Director of the Institute for Interactive Entertainment Law and Policy, the founder and editor-in-chief of the Interactive Entertainment Law Review (Edward Elgar), and the organiser of the 'More Than Just a Game' conference series. Gaetano is also the Deputy Director of the Queen Mary Intellectual Property Research Institute (QMIPRI), the Director of eLearning at CCLS, the Deputy Director of Education at CCLS, and the Director of the LLM in Intellectual Property Law.

    Outside of Queen Mary, he serves as an Executive Committee member of the British Literary and Artistic Copyright Association, the UK national group of the Association Litteraire et Artistique Internationale; as a Board Member of the National Video Game Museum; as a member of the British Copyright Council's Copyright and Technology Working Group; as a member of the UK IPO Copyright Advisory Council; and as a member of the UK Department for International Trade's Intellectual Property Expert Trade Advisory Group. He is also a member of the Italian Bar Association (Rome), the Video Game Bar Association, the Fair Play Alliance, and the Higher Education Video Game Association.

    He is particularly interested in supervising interdisciplinary research on games and interactive entertainment law and regulation.

    Research themes: Game AI, Games with a Purpose, Computational Creativity, E-Sports, Player Experience

    Email: g.dimita@qmul.ac.uk
    Website: https://www.qmul.ac.uk/law/people/academic-staff/items/dimita.html
    LinkedIn: https://www.linkedin.com/in/gaetano-dimita-06484544/?originalSubdomain=uk

    Themes: Applied Games, Creative Computing, Esports, Game AI, Player Research

  • iGGi Open Evening at QMUL | iGGi PhD

    iGGi Open Evening at QMUL

    iGGi QMUL is spontaneously running an Open Evening event on 15 December 2021 at 6pm at Empire House (Whitechapel campus): https://goo.gl/maps/dquCpQHtSuTN7YD9A

    We will showcase some of the ongoing research of the QMUL Game AI research group, the iGGi Centre for Doctoral Training and the AIM Centre for Doctoral Training, which are all part of QMUL's School of Electronic Engineering and Computer Science. It will be a great opportunity to speak face-to-face with some of the researchers and staff relevant to iGGi (and you can also consider our "competitor" AIM, which offers fully funded scholarships in a similar way to iGGi). There will also be pizza and drinks!

    If you can/want to attend, fill in this form: https://forms.gle/mGmWeoGUtH4sZmz86

    Note that you will be required to wear a face mask for the duration of the event, and to show proof of vaccination or a negative covid test taken within 48 hours of the event start. We look forward to seeing you there!

    8 Dec 2021

  • When Games Become Inaccessible: A Constructive Grounded Theory on Stuckness in Videogames

    When Games Become Inaccessible: A Constructive Grounded Theory on Stuckness in Videogames
    Author(s): F Foffano
    Abstract: More info TBA

  • Queen Mary University of London (QMUL) | iGGi PhD

    Queen Mary University of London (QMUL)

    iGGi QMUL is located in the heart of East London on Queen Mary, University of London's Whitechapel campus. iGGi QMUL is part of QMUL's School of Electronic Engineering and Computer Science. While QMUL-based iGGi PGRs can belong to more than one research group, they all by default belong to the Game AI Group (GAIG). The iGGi/GAIG office space is situated within the Digital Environment Research Institute (DERI) at Empire House, Whitechapel campus.

    How to reach the iGGi Offices at Empire House, Whitechapel
    The address for the iGGi office space is:
    2nd Floor, Empire House (DERI)
    67-75 New Road
    Whitechapel, London E1 1HH
    Whitechapel campus map
    Accessibility: Empire House access guide

    Arriving by Tube
    The Whitechapel campus is easily accessible via public transport, with Whitechapel Underground station, served by London Underground's Elizabeth line (purple on the Tube map), Hammersmith & City line (pink) and District line (green), just a seven-minute walk away. When you exit the station, turn right and walk along Whitechapel Road until the next larger junction. Turn left into New Road; Empire House will be on your right. Please use the Transport for London Journey Planner to help you plan your journey (https://tfl.gov.uk/plan-a-journey/) or their interactive maps showing Underground, Docklands Light Railway (DLR) and bus information.

    Arriving by Bus
    The Whitechapel campus is based on Whitechapel Road, on the 25 and 205 bus routes, and Empire House is just off Whitechapel Road, on New Road.

    Cycling/Walking
    If you are travelling by bike or walking, please use the postcode above and the campus map to help you navigate to the venue. Bike storage facilities can be found in the Empire House basement.

    Arriving by car
    For both our Mile End and Whitechapel campuses, car parking for visitors is not offered due to our central location. Local parking restrictions also apply on weekdays and weekends. We therefore strongly recommend you use one of the alternative transport methods listed above. If you do need to drive to campus, QMUL's open day pages list offsite parking options within easy reach of Whitechapel, including park-and-ride options. If you are a blue badge holder and require parking on site, please email opendays@qmul.ac.uk.

    iGGi QMUL Gallery
    Map depicting QMUL Mile End campus & the iGGi Con 2023 venue location
    iGGi Con 2023 venue: The Graduate Centre (Mile End campus, QMUL), viewed from Bancroft Road
    iGGi Con 2023 venue: ground floor entrance of the Graduate Centre (Mile End campus, QMUL)
    Mile End campus with the Graduate Centre on the left
    Bird's-eye view of Mile End campus, QMUL
    Map depicting QMUL Whitechapel campus with Empire House, where all of the iGGi office space is located
    Empire House basement, QMUL (Whitechapel)
    iGGi office space, Empire House, QMUL (Whitechapel campus)
    The Blizard Building opposite Empire House, Whitechapel campus (QMUL)

  • Dr Ahmed Sayed

    Dr Ahmed M. A. Sayed
    Queen Mary University of London
    Supervisor

    Ahmed Sayed is a Lecturer (Assistant Professor) in Big Data and Distributed Systems at the School of EECS, QMUL, and leads the Scalable Adaptive Yet Efficient Distributed (SAYED) Systems Lab. He has a PhD in Computer Science and Engineering from the Hong Kong University of Science and Technology. His research interests lie at the intersection of distributed systems, computer networks and machine learning. He is an investigator on several UK and international grants totalling nearly US$1 million in funding. His work appears in top-tier conferences and journals including NeurIPS, AAAI, MLSys, ACM EuroSys, IEEE INFOCOM, IEEE ICDCS, and IEEE/ACM Transactions on Networking.

    He is interested in supervising students with a background in game AI, machine learning, distributed systems, and/or creative computing, and in working with students at the intersection of artificial intelligence, machine learning, and creative computing. He aims to leverage AI/ML methods, game data and player research to design intelligent game agents, creating systems that enable game agents to learn better gaming strategies and thus enhance the gaming experience. He is open to any research proposals in that space and is currently keen on exploring solutions that leverage the emerging distributed, privacy-preserving ML ecosystems on large-scale game data. If you are interested in working with him on this, please reach out to him.

    Email: ahmed.sayed@qmul.ac.uk
    Website: http://eecs.qmul.ac.uk/~ahmed/
    LinkedIn: https://www.linkedin.com/in/ahmedmabdelmoniem/
    GitHub: https://github.com/ahmedcs

    Themes: Creative Computing, Design & Development, Game AI, Game Data, Player Research

  • Understanding ongoing mental states using video games: applications to mental health research. | iGGi PhD

    Understanding ongoing mental states using video games: applications to mental health research.
    Theme: Game Data
    Project proposed & supervised by Alex Wade. To discuss whether this project could become your PhD proposal, please email: alex.wade@york.ac.uk

    Project proposal abstract:
    A player's behaviour in a game is directly linked to their personality and gives detailed information on their decision-making processes, showing how they approach risks, socialisation and problem solving. Analysing these behaviours may also provide information about mental health disorders and indicate how these change over time. Neuroimaging methods (EEG/MEG/fMRI) can be used to examine the neural responses and patterns of ongoing neuronal activity that occur while players are engaged in a game. By linking these data to modern theories of neural economics, we can explore and potentially improve aspects of a player's decision making, such as attention span, focus, risk taking and delayed reward.

    This PhD will use a combination of neuroscience and advanced data analysis methods to examine the link between video game play and the brain. We will combine cutting-edge data analytic techniques, applied to large existing video game telemetry datasets, with neuroimaging experiments designed to measure changes in ongoing mental states while people play simple video games. The PhD would suit a student with good data analytics skills and some experience in neuroscience.

    Supervisor: Alex Wade
    Based at:

  • Dr Tony Stockman

    Dr Tony Stockman
    Queen Mary University of London
    Supervisor

    Dr Stockman is an interaction designer/researcher who investigates how technology can enhance accessibility and improve human performance. He is particularly interested in technology to support spatial cognition and wayfinding, health monitoring, and improved performance levels in sport and music. This includes the role of games in simulating these domains and supporting skill acquisition and enhanced performance. He is a Board member and former president of the International Community for Auditory Display (www.icad.org). He has organised six international workshops on a range of HCI topics and has been on the organising committee of ten international HCI-related conferences. Topics on which he has recently published include participatory design and prototyping, auditory overviews for route guidance, self-monitoring of biological signals, and accessible collaborative working.

    He is particularly interested in supervising students with a Computer Science, Electrical Engineering, HCI, or behavioural sciences background on the following topics:
    Simulation to support accessibility and skill acquisition in team sports
    Intelligent audio-mostly games to support learning
    Intelligent audio or audio-haptic approaches to health monitoring and biofeedback
    Intelligent systems to support individual or collaborative music making

    Research themes: Intelligent simulation systems, Interaction design for simulated sports, Game Audio and Music, Game Design, Games with a Purpose

    Email: t.stockman@qmul.ac.uk

    Themes: Applied Games, Design & Development, Esports, Game AI, Game Audio, Player Research


Copyright © 2023 iGGi


The EPSRC Centre for Doctoral Training in Intelligent Games and Game Intelligence (iGGi) is a leading PhD research programme aimed at the Games and Creative Industries.
