Academic Profile

Dr. Andreas Aristidou

Associate Professor
Department of Computer Science, University of Cyprus,
Senior Research Fellow at CYENS Centre of Excellence

Research interests: character animation (analysis, classification, and synthesis), spanning motion capture, machine learning and generative AI for virtual humans, digital heritage (intangible cultural creations), and VR/AR/XR environments.

Publications: 67 · Citations: 3460 · h-index: 29 (based on Google Scholar)

About

Short professional biography

Andreas Aristidou is an Associate Professor at the Department of Computer Science, University of Cyprus, a member of the Graphics & Extended Reality Lab, and a Senior Research Fellow at the CYENS Centre of Excellence, with special interest in computer graphics and character animation. He completed his PhD as a Cambridge European Trust fellow at the Department of Information Engineering, University of Cambridge, and holds an MSc in Mobile and Personal Communications from King's College London, where he graduated with honors. Dr. Aristidou also obtained a BSc in Informatics and Telecommunications from the National and Kapodistrian University of Athens. He previously worked as a research fellow at Shandong University (China), Reichman University (Israel), and the University of Cyprus, and participated in a number of EU-funded projects. He is currently on the editorial boards of The Visual Computer (TVC) and Heritage journals, and previously served as guest editor-in-chief of the Advances in Applied Clifford Algebras (AACA) journal. He is a senior member of the Association for Computing Machinery (ACM, SM since 2020), the Institute of Electrical and Electronics Engineers (IEEE, SM since 2019), and Eurographics. He is also a member of the Cyprus Scientific and Technical Chamber, where he previously served on the Research Committee, and from 2020 to 2021 he was a member of the Cyprus Parallel Parliament for research, innovation, and digital governance. He has received numerous fellowships, distinctions, and research grants from highly competitive local, EU, and international agencies.

Research

Areas of interest & current projects

Research Interests

His main research interests lie at the intersection of computer graphics, virtual reality, and computer vision. He specializes in character animation (analysis, classification, and synthesis) and motion capture, using machine learning and generative AI models to create virtual humans. His work also extends to digital heritage (intangible creations), virtual performances, VR/AR/MR environments, and the application of Conformal Geometric Algebra in computer graphics. He has participated in a number of EU-funded projects and collaborates with EU creative industries to design innovative algorithms for motion synthesis and character retargeting.

Research Tags

  • Computer Graphics
  • Character Animation
  • Motion Capture & Analysis
  • Machine Learning for Motion
  • GenAI for Virtual Humans
  • Virtual / Augmented / Extended Reality
  • Digital Heritage

Selected Research Projects

  • HAMLET project teaser

    HAMLET: Human-centred generative Ai fraMework for culturaL industriEs’ digital Transition

    Funded by Horizon Europe: Leverage the digital transition for competitive European cultural and creative industries
    HORIZON-CL2-2024-HERITAGE-01-03
    Duration: 01/2025 – 12/2027 Role: PI / Partner

    HAMLET aims to democratize access to Generative AI for Europe's Cultural and Creative Industries (CCIs), enabling entities of all sizes to benefit from AI for digital transition. It integrates a collaborative platform and AI tools to foster shared investments, collaboration and innovation, while promoting sustainable and inclusive frameworks for creative industries.

  • EMBRACE project teaser

    EMBRACE: Empowering Skills for Butterfly Monitoring to Enable Resilience to Climate Change

    Funded by Erasmus+: Small-scale partnerships in vocational education and training (VET)
    Call 2024 Round 1 KA210-VET
    Duration: 12/2024 – 11/2025 Role: PI / Partner

    EMBRACE aims to improve butterfly monitoring, contributing to EU environmental and biodiversity goals, particularly regarding climate change resilience. It engages farmers, agronomists, environmental scientists, and other stakeholders through education, including e-learning, a 3D digital butterfly museum, and replicable training materials. By involving a wide range of groups, from civil servants to citizens, the project fosters awareness and participation in biodiversity conservation efforts across Europe.

  • PREMIERE project teaser

    PREMIERE: Performing arts in a new era: AI and XR tools for better understanding, preservation, enjoyment and accessibility

    Funded by Horizon Europe: Preserving and enhancing cultural heritage with advanced digital technologies
    HORIZON-CL2-2021-HERITAGE-01-04
    Duration: 10/2022 – 09/2025 Role: PI / Partner

    PREMIERE modernizes the performing arts, focusing on dance and theatre, by using advanced digital technologies to support the whole lifecycle of performances. It develops an AI-, XR-, and 3D-powered ecosystem that serves the diverse communities involved in performing arts productions.

  • CLIPE project teaser

    CLIPE: Creating Lively Interactive Populated Environments

    Funded by European Union’s Horizon 2020 research and innovation programme under Marie Skłodowska-Curie Training Networks
    H2020-MSCA-ITN-2019
    Duration: 05/2020 – 04/2024 Role: Partner

    The primary objective of CLIPE is to train a generation of innovators and researchers in the field of virtual character simulation and animation. Advances in technology are pushing towards making VR/AR worlds a daily experience. Whilst virtual characters are an important component of these worlds, bringing them to life and giving them interaction and communication abilities requires highly specialized programming combined with artistic skills, and considerable investment: the millions spent on coders and designers to develop video games are a typical example. The research objective of CLIPE is to design the next generation of VR-ready characters.

  • DEMONSTRATION project teaser

    DEMONSTRATION: DEep MOtioN SynThesis foR character AnimaTION

    Funded by the Research and Innovation center, University of Cyprus
    Duration: 09/2021 – 12/2023 Role: PI

    DEMONSTRATION covers a wide range of multidisciplinary topics in line with recent trends in computer graphics, character animation, and virtual reality. It investigates modern trends in machine (deep, convolutional, adversarial, and reinforcement) learning, with the ultimate goal of overcoming current limitations in character animation and laying the foundations for future improvements in a wide range of ambitious and challenging projects.

  • ALADDIN project teaser

    ALADDIN: Advancing the motion capture technoLogy viA DistributeD wIreless Networks

    Funded by Cyprus Seeds
    Duration: 09/2021 – 12/2023 Role: PI

    Motion capture is a technology used for turning the observations of a moving subject into 3D position and orientation information, stimulating our ability to define and virtually portray complex movements. The ALADDIN project aims at the development of such a technology that goes beyond conventional methods, is cost-effective, and minimizes the risks associated with capturing in dynamic situations. The proposed system is easily scalable and intrinsic, in the sense that measurements are not taken by external devices, enabling efficient capturing in outdoor environments, with state-of-the-art acquisition accuracy.

  • SCHEDAR project teaser

    SCHEDAR: Safeguarding the Cultural HEritage of Dance through Augmented Reality

    Funded by the Cyprus Research Promotion Foundation through JPI on Cultural Heritage
    Duration: 06/2018 – 11/2021 Role: PI

    SCHEDAR aims to contribute to the safeguarding of our Intangible Cultural Heritage (ICH), more specifically folk dancing, by providing novel solutions to the three key challenges of archiving, reusing & repurposing, and ultimately disseminating ICH creations. A comprehensive set of new guidelines will be devised, along with a framework and software tools for leveraging existing ICH motion databases. Data acquisition will be undertaken holistically, encompassing data related to the performance, the performer, the kind of dance, the hidden/untold story, etc. Innovative use of state-of-the-art multisensory Augmented Reality technology will enable direct interaction with the dance, providing new experiences and training in traditional dance.

  • ReTrack project teaser

    ReTrack: Advancing site level management through innovative reptiles' tracking and behavioural decryption

    Funded by the European Regional Development Fund and the Republic of Cyprus through the Research Promotion Foundation
    Duration: 12/2018 – 05/2021 Role: Partner

    ReTrack aims to tackle the problem of animal monitoring and habitat utilization, specifically designed for small-sized reptiles. It will advance scientific knowledge in the fields of (a) reptile locomotion, (b) behavioral analysis, and (c) conservation, through the development of innovative monitoring techniques and approaches. Utilizing state-of-the-art technology in remote sensing, advanced photogrammetry and image pattern recognition, ReTrack aims to create fine-scale micro-habitat utilization maps for two common Cypriot species, a lizard (Stellagama stellio) and a snake (Dolichophis jugularis), thus advancing site-level management through the design of more targeted, species-based management and conservation actions.

  • ITN-DCH project teaser

    ITN-DCH: Initial Training Network for Digital Cultural Heritage

    Funded by the European Union under the FP7 PEOPLE research framework
    Duration: 10/2013 – 09/2017 Role: Project deputy manager

    ITN-DCH aims to analyze, design, research, develop and validate an innovative multi-disciplinary and inter-sectorial research training framework that covers the entire lifecycle of digital CH research for cost-effective preservation, documentation, protection and presentation of cultural heritage. CH is an integral element of Europe, vital for the creation of a common European identity, and one of the greatest assets for steering Europe's social and economic development and job creation. However, current research training activities in CH are fragmented and mostly designed to be single-discipline, failing to cover the whole lifecycle of Digital Cultural Heritage (DCH) research, which is by nature a multi-disciplinary and inter-sectorial research agenda.

  • ViDaPe project teaser

    ViDaPe: Visual Dance Performance for Interactive Characters

    Co-financed by the European Regional Development Fund and the Republic of Cyprus through the Research Promotion Foundation
    Duration: 06/2012 – 01/2015 Role: PI

    ViDaPe aims at generating a virtual animated character that interacts, in real time, with a real dancing performer to compose a contemporary dance show. The proposed research will explore innovative topics of special interest in the area of computer animation, including methods which smoothly combine optical motion capture (mocap) data with kinematic techniques, human figure modelling, a novel methodology for motion classification, and partial-body motion synthesis. The system will adjust dynamically according to the performer's actions and responses, offering the maximum possible interaction between the real and virtual performers.

  • SIM.POL.VR project teaser

    SIM.POL.VR: Virtual Reality Police Simulator

    Funded by the Cyprus Research Promotion Foundation
    Duration: 12/2009 – 11/2011 Role: Research Associate

    SIM.POL.VR focuses on investigating the usability and applicability of Virtual Reality in training special forces. It proposes the development of a platform using Computer Graphics techniques (as opposed to the existing video-based system, which is restrictive) for the creation of a 3D Visualisation Tool enriched with tools suitable for the formation and customisation of dynamic scenarios.

Publications

Selected publications and full list by type

A complete list of publications is also available on Google Scholar.

  • Teaser image for DRUMS: Drummer Reconstruction Using Midi Sequences

    DRUMS: Drummer Reconstruction Using Midi Sequences

    Theodoros Kyriakou, Panayiotis Charalambous, Andreas Aristidou

    Proceedings of the 18th Annual ACM SIGGRAPH Conference on Motion, Interaction and Games, MIG 2025, Zurich, Switzerland, December 2025.

    DRUMS is a MIDI-driven system that generates expressive, full-body drumming performances, combining precise rhythmic accuracy with realistic hand, stick, upper-body, and facial movements. By integrating BiLSTM-based hand motion prediction, phrase-matched upper-body and facial expressions, and procedural foot control through a modular IK framework, our method produces visually convincing and musically aligned drummer animations for applications in digital performance.

  • Teaser image for Disaster evacuation of the old city of Nicosia

    Disaster evacuation of the old city of Nicosia

    Marios Stylianou, Marios Demetriou, Andreas Aristidou

    Presented at the 8th International Disaster and Risk Conference, IDRC 2025, Nicosia, Cyprus, October 22 - 24, 2025

    Best student paper award at IDRC 2025.

    This paper presents a digital twin–based simulation of evacuation scenarios in Nicosia’s historic walled Old Town, showing how dense urban morphology, gate blockages, and infrastructure changes affect evacuation efficiency and highlighting the need for coordinated, adaptive flow-management strategies to reduce disaster risks in historic cities.

  • Teaser image for MPACT

    MPACT: Mesoscopic Profiling and Abstraction of Crowd Trajectories

    Marilena Lemonari, Andreas Panayiotou, Nuria Pelechano, Theodoros Kyriakou, Yiorgos Chrysanthou, Andreas Aristidou, Panayiotis Charalambous

    Computer Graphics Forum, Volume 44, Issue 6, September, 2025.

    MPACT is a framework that transforms unlabelled crowd data into controllable simulation parameters using image-based encoding and a parameter prediction network trained on synthetic image–profile pairs. It enables intuitive crowd authoring and behavior analysis, achieving high scores in believability, plausibility, and behavioral fidelity across evaluations and user studies.

  • Teaser image for Interactive Media for Cultural Heritage

    Interactive Media for Cultural Heritage

    Editors: Fotis Liarokapis, Maria Shehade, Andreas Aristidou, Yiorgos Chrysanthou

    Springer Series on Cultural Computing, Springer Cham, 2025

    This edited book explores the latest advancements in interactive media applied to Digital Cultural Heritage research, covering areas from visual data acquisition to immersive experiences like extended reality and digital storytelling. Structured into four sections, it offers theoretical discussions and diverse case studies, making it a valuable resource for academics, scholars, researchers, and students interested in interdisciplinary approaches to cultural heritage preservation and exploration through emerging technologies.

  • Motion labelling and recognition: A case study on the Zeibekiko dance

    Maria Skublewska-Paszkowska, Pawel Powroznik, Marilena Lemonari, Andreas Aristidou

    Interactive Media for Cultural Heritage, In F. Liarokapis, M. Shehade, A. Aristidou and Y. Chrysanthou (Eds.), Springer Series in Cultural Computing, Springer, 2025

  • Teaser image for DragPoser

    DragPoser: Motion Reconstruction from Variable Sparse Tracking Signals via Latent Space Optimization

    Jose Luis Pontón, Eduard Pujol, Andreas Aristidou, Carlos Andújar, Nuria Pelechano

    Computer Graphics Forum, Volume 44, Issue 2, May, 2025

    Presented at Eurographics 2025, EG'2025 proceedings, May, 2025.

    DragPoser is a deep-learning-based motion reconstruction system that takes variable sparse sensors as input, achieving high end-effector position accuracy in real time through a pose optimization process within a structured latent space. Incorporating a Temporal Predictor network with a Transformer architecture, DragPoser surpasses traditional methods in precision, producing natural and temporally coherent poses and demonstrating robustness and adaptability to dynamic constraints and various input configurations.

  • Teaser image for CEDRL

    CEDRL: Simulating Diverse Crowds with Example-Driven Deep Reinforcement Learning

    Andreas Panayiotou, Andreas Aristidou, Panayiotis Charalambous

    Computer Graphics Forum, Volume 44, Issue 2, May, 2025

    Presented at Eurographics 2025, EG'2025 proceedings, May, 2025.

    This paper introduces CEDRL (Crowds using Example-driven Deep Reinforcement Learning), a framework that models diverse and adaptive crowd behaviors by leveraging multiple datasets and a reward function aligned with real-world observations. The approach enables real-time controllability and generalization across scenarios, showcasing enhanced behavior complexity and adaptability in virtual environments.

  • Teaser image for MMIP

    Multi-Modal Instrument Performance: A musical database

    Andreas Panayiotou, Andreas Aristidou, Panayiotis Charalambous

    Computer Graphics Forum, Volume 44, Issue 2, May, 2025

    Presented at Eurographics 2025, EG'2025 proceedings, May, 2025.

    This paper introduces the Multi-Modal Instrument Performances (MMIP) database, the first dataset to combine synchronized high-quality 3D motion capture, audio, video, and MIDI data for musical performances involving guitar, piano, and drums. It highlights the challenges of capturing and managing such multimodal data, offering an open-access repository with tools for exploration, visualization, and playback.

  • Teaser image for Reptiles

    A novel multidisciplinary approach for reptile movement and behavior analysis

    Savvas Zotos, Marilena Stamatiou, Sofia-Zacharenia Marketaki, Duncan J. Irschick, Jeremy A. Bot, Andreas Aristidou, Emily L. C. Shepard, Mark D. Holton, Ioannis N. Vogiatzakis

    Integrative Zoology, Accepted, January 2025.

    This paper introduces a multidisciplinary approach to studying reptile behavior, combining tri-axial accelerometers, video recordings, motion capture systems, and 3D reconstruction to create detailed digital archives of movements and behaviors. Using two Mediterranean reptiles as case studies, it highlights the potential of this method to advance research on complex and understudied behaviors, offering ecological insights and tools for behavioral analysis.

  • Teaser image for DCGAN_Eye_Analysis

    Deep convolutional generative adversarial networks in retinitis pigmentosa disease images augmentation and detection

    Paweł Powroźnik, Maria Skublewska-Paszkowska, Katarzyna Nowomiejska, Andreas Aristidou, Andreas Panayides, Robert Rejdak

    Advances in Science and Technology Research Journal, Volume 19, no. 2, pages 321-340, 2025.

    This study utilizes Deep Convolutional Generative Adversarial Networks (DCGAN) and hybrid VGG16-XGBoost techniques to enhance medical datasets, focusing on retinitis pigmentosa, a rare eye condition. The proposed method improves image clarity, dataset augmentation, and detection accuracy, achieving over 90% in key performance metrics and a 19% increase in baseline classification accuracy.

  • Teaser image for IdentifyingZeibekiko

    Identifying and Animating Movement of Zeibekiko Sequences by Spatial Temporal Graph Convolutional Network with Multi Attention Modules

    Maria Skublewska-Paszkowska, Paweł Powroźnik, Marcin Barszcz, Krzysztof Dziedzic, Andreas Aristidou

    Advances in Science and Technology Research Journal, Volume 18, no. 8, pages 217-227, 2024.

    This study employs optical motion capture technology to document and translate the Zeibekiko dance into a 3D virtual environment. Using a Spatial Temporal Graph Convolutional Network with Multi Attention Modules (ST-GCN-MAM), the system accurately captures and classifies essential dance sequences by focusing on key body regions, enabling precise, realistic virtual animations with applications in gaming, video production, and digital heritage preservation.

  • Teaser image for VR_Amathus_Diver

    Underwater Virtual Exploration of the Ancient Port of Amathus

    Andreas Alexandrou, Filip Skola, Dimitrios Skarlatos, Stella Demesticha, Fotis Liarokapis, Andreas Aristidou

    Journal of Cultural Heritage, Volume 70, pages 181-193, November–December 2024.

    This work focuses on the digital reconstruction and visualization of underwater cultural heritage, providing a gamified virtual reality (VR) experience of Cyprus' ancient Amathus harbor. Utilizing photogrammetry, our immersive VR environment enables seamless exploration and interaction with this historic site. Advanced features such as guided tours, procedural generation, and machine learning enhance realism and user engagement. User studies validate the quality of our VR experiences, highlighting minimal discomfort and demonstrating promising potential for advancing underwater exploration and conservation efforts.

  • Teaser image for VR_Library

    Design and Implementation of an Interactive Virtual Library based on its Physical Counterpart

    Christina-Georgia Serghides, Giorgos Christoforidis, Nikolas Iakovides, Andreas Aristidou

    Virtual Reality, Springer, Volume 28, article number 124, June, 2024.

    This work explores the creation of a digital replica of a physical Library, using photogrammetry and 3D modelling. A Virtual Reality (VR) platform was developed to immerse users in a virtual library experience, which can also serve as a community and knowledge hub. A perceptual study was conducted to understand the current usage of physical libraries, examine the users’ experience in VR, and identify the requirements and expectations in the development of a virtual library counterpart. Five key usage scenarios were implemented, as a proof-of-concept, with emphasis on 24/7 access, functionality, and interactivity. A user evaluation study endorsed all its key attributes and future viability.

  • Teaser image for Virtual Instrument Performances

    Virtual Instrument Performances (VIP): A Comprehensive Review

    Theodoros Kyriakou, Mercè Álvarez, Andreas Panayiotou, Yiorgos Chrysanthou, Panayiotis Charalambous, Andreas Aristidou

    Computer Graphics Forum, Volume 43, Issue 2, April 2024.

    Presented at Eurographics 2024, EG'24 STAR papers, April, 2024.

    The evolving landscape of performing arts, driven by advancements in Extended Reality (XR) and the Metaverse, presents transformative opportunities for digitizing musical experiences. This comprehensive survey explores the relatively unexplored field of Virtual Instrument Performances (VIP), addressing challenges related to motion capture precision, multi-modal interactions, and the integration of sensory modalities, with a focus on fostering inclusivity, creativity, and live performances in diverse settings.

  • Teaser image for EYODancer

    Digitizing Traditional Dances Under Extreme Clothing: The Case Study of Eyo

    Temi Ami-Williams, Christina-Georgia Serghides, Andreas Aristidou

    Journal of Cultural Heritage, Volume 67, pages 145–157, February, 2024.

    The video has been presented at the International Council for Traditional Music (ICTM) 2023 and the Cyprus Dance Film Festival (CDFF) 2023.

    This work examines the challenges of capturing movements in traditional African masquerade garments, specifically the Eyo masquerade dance from Lagos, Nigeria. By employing a combination of motion capture technologies, the study addresses the limitations posed by "extreme clothing" and offers valuable insights into preserving cultural heritage dances. The findings lead to an efficient pipeline for digitizing and visualizing folk dances with intricate costumes, culminating in a visually captivating animation showcasing an Eyo masquerade dance performance.

  • Teaser image for SparsePoser

    SparsePoser: Real-time Full-body Motion Reconstruction from Sparse Data

    Jose Luis Pontón, Haoran Yun, Andreas Aristidou, Carlos Andújar, Nuria Pelechano

    ACM Transactions on Graphics, Volume 43, Issue 1, Article No.: 5, pages 1–14.

    Presented at SIGGRAPH Asia 2023.

    SparsePoser is a novel deep learning-based approach that reconstructs full-body poses using only six tracking devices. The system uses a convolutional autoencoder to generate high-quality human poses learned from motion capture data and a lightweight feed-forward neural network IK component to adjust hands and feet based on the corresponding trackers.

  • Teaser image for CollaborativeVR

    Collaborative VR: Solving riddles in the concept of escape rooms

    Afxentis Ioannou, Marilena Lemonari, Fotis Liarokapis, Andreas Aristidou

    International Conference on Interactive Media, Smart Systems and Emerging Technologies, IMET 2023.

    This work explores alternative means of communication in collaborative virtual environments (CVEs) and their impact on users' engagement and performance. Through a case study of a collaborative VR escape room, we conduct a user study to evaluate the effects of nontraditional communication methods in computer-supported cooperative work (CSCW). Despite the absence of traditional interactions, our study reveals that users can effectively convey messages and complete tasks, akin to real-life scenarios.

  • Teaser image for VR_Dancer

    Dancing in virtual reality as an inclusive platform for social and physical fitness activities: A survey

    Bhuvaneswari Sarupuri, Richard Kulpa, Andreas Aristidou, Franck Multon

    The Visual Computer, Volume 40, pages 4055–4070, 2024.

    This paper qualitatively evaluates 292 users of a VR dancing platform, exploring their motivations, experiences, and requirements. We employ OpenAI's Artificial Intelligence platform for automatic extraction of response categories. The focus is on VR as an inclusive platform for social and physical dancing activities.

  • Teaser image for Motion Annotation

    Motion-R3: Fast and Accurate Motion Annotation via Representation-based Representativeness Ranking

    Jubo Yu, Tianxiang Ren, Shihui Guo, Fengyi Fang, Kai Wang, Zijiao Zeng, Yazhan Zhang, Andreas Aristidou, Yipeng Qin

    arXiv preprint, arXiv:2304.01672 [cs.CV]

    In this work we present a new method for motion annotation based on the representativeness of motion data in a given dataset. Our method ranks motion data by their representativeness in a learned motion representation space. The paper also introduces a dual-level motion contrastive learning method to learn the motion representation space in a more informative way. The proposed method is efficient and can adapt to frequent requirement changes, enabling agile development of motion annotation models.

  • Teaser image for Museum Heist

    Collaborative Museum Heist with Reinforcement Learning

    Eleni Evripidou, Andreas Aristidou, Panayiotis Charalambous

    Computer Animation and Virtual Worlds, Volume 34, Issue 3-4, May 2023.

    Presented at the 36th International Conference on Computer Animation and Social Agents, CASA'23, May, 2023.

    In this paper, we present our initial findings from applying Reinforcement Learning techniques to a museum heist game, where trained robbers with different skills learn to cooperate and maximize individual and team rewards while avoiding detection by scripted security guards and cameras. The results showcase the feasibility of training both sides concurrently in an adversarial game setting.

  • Teaser image for Let's All Dance

    Let's All Dance: Enhancing Amateur Dance Motions

    Qiu Zhou, Manyi Li, Qiong Zeng, Andreas Aristidou, Xiaojing Zhang, Lin Chen, Changhe Tu

    Computational Visual Media, Vol.9, No.3, September 2023

    In this paper, we present a deep model that enhances amateur dance movements to a professional standard, improving movement quality in both the spatial and temporal domains. We illustrate the effectiveness of our method on real amateur and artificially generated dance movements. We also demonstrate that our method can synchronize 3D dance motions with any reference audio under non-uniform and irregular misalignment.

  • Teaser image for Rhythm is a Dancer

    Rhythm is a Dancer: Music-Driven Motion Synthesis with Global Structure

    Andreas Aristidou, Anastasios Yiannakidis, Kfir Aberman, Daniel Cohen-Or, Ariel Shamir, Yiorgos Chrysanthou

    IEEE Transactions on Visualization and Computer Graphics, Volume 29, Issue 8, August 2023.

    Presented at ACM SIGGRAPH/ Eurographics Symposium on Computer Animation, SCA'22, September, 2022.

    In this work, we present a music-driven neural framework that generates realistic human motions that are rich, avoid repetition, and jointly form a global structure respecting the culture of a specific dance genre. We illustrate examples of various dance genres, demonstrating choreography control and editing in a number of applications.

  • Virtual Library in the concept of digital twins

    Nikolas Iakovides, Andreas Lazarou, Panayiotis Kyriakou, Andreas Aristidou

    In Proceedings of the International Conference on Interactive Media, Smart Systems and Emerging Technologies, IMET 2022, Limassol, Cyprus, October 4 - 7, 2022

  • Teaser image for Pose Representations

    Pose Representations for Deep Skeletal Animation

    Nefeli Andreou, Andreas Aristidou, Yiorgos Chrysanthou

    Computer Graphics Forum, Volume 41, Issue 8, December 2022

    Presented at ACM SIGGRAPH/ Eurographics Symposium on Computer Animation, SCA'22, September, 2022

    In this work we present an efficient method for training neural networks specifically designed for character animation. We use dual quaternions as the mathematical framework and take advantage of the skeletal hierarchy to avoid rotation discontinuities, a common problem when using Euler angle or exponential map parameterizations, and motion ambiguities, a common problem when using positional data.

  • Teaser image for Digitizing Wildlife

    Digitizing Wildlife: The case of reptiles 3D virtual museum

    Savvas Zotos, Marilena Lemonari, Michael Konstantinou, Anastasios Yiannakidis, Georgios Pappas, Panayiotis Kyriakou, Ioannis N. Vogiatzakis, Andreas Aristidou

    IEEE Computer Graphics and Applications, Feature Article, Volume 42, Issue 5, Sept/Oct. 2022

    In this paper, we design and develop a 3D virtual museum with holistic metadata documentation and a variety of captured reptile behaviors and movements. Our main contribution lies in the procedure of rigging, capturing, and animating reptiles, as well as the development of a number of novel educational applications.

  • Teaser image for Safeguarding our Dance Cultural Heritage

    Safeguarding our Dance Cultural Heritage

    Andreas Aristidou, Alan Chalmers, Yiorgos Chrysanthou, Celine Loscos, Franck Multon, Joseph E. Parkins, Bhuvan Sarupuri, Efstathios Stavrakis

    Eurographics Tutorials, April 26, 2022

    In this tutorial, we show how the European project SCHEDAR exploited emerging technologies to digitize, analyze, and holistically document our intangible heritage creations, a critical necessity for the preservation and continuity of our identity as Europeans.

  • Teaser image for Virtual Dance Museums

    Virtual Dance Museums: the case of Greek/Cypriot folk dancing

    Andreas Aristidou, Nefeli Andreou, Loukas Charalambous, Anastasios Yiannakidis, Yiorgos Chrysanthou

    EUROGRAPHICS Workshop on Graphics and Cultural Heritage, GCH'21, Bournemouth, United Kingdom, November 2021

    This paper presents a virtual dance museum developed to educate the wider public, especially the younger generations, about the story, costumes, music, and history of our dances. The museum is publicly accessible and also enables motion data reusability, facilitating dance learning applications through gamification.

  • Teaser image for Adult2Child

    Adult2Child: Motion Style Transfer using CycleGANs

    Yuzhu Dong, Andreas Aristidou, Ariel Shamir, Moshe Mahler, Eakta Jain

    ACM SIGGRAPH Conference on Motion, Interaction, and Games, MIG'20, October 2020

    This paper presents an effective style translation method that transfers adult motion capture data to the style of child motion using CycleGANs. Our method allows training on unpaired data using a relatively small number of sequences of child and adult motions that are not required to be temporally aligned.

  • Teaser image for MotioNet

    MotioNet: 3D Human Motion Reconstruction from Monocular Video with Skeleton Consistency

    Mingyi Shi, Kfir Aberman, Andreas Aristidou, Taku Komura, Dani Lischinski, Daniel Cohen-Or, Baoquan Chen

    ACM Transactions on Graphics, 40(1), Article 1, 2020

    Presented at SIGGRAPH Asia 2020

    MotioNet is a deep neural network that directly reconstructs the motion of a 3D human skeleton from monocular video. It decomposes sequences of 2D joint positions into two separate attributes: a single, symmetric skeleton, encoded by bone lengths, and a sequence of 3D joint rotations associated with global root positions and foot contact labels.

  • Teaser image for Salsa dance learning evaluation

    Salsa dance learning evaluation and motion analysis in gamified virtual reality environment

    Simon Senecal, Niels A. Nijdam, Andreas Aristidou, Nadia Magnenat-Thalmann

    Multimedia Tools and Applications, 79 (33-34): 24621-24643, September 2020

    We propose an interactive learning application, in the form of a virtual reality game, that aims to help users improve their salsa dancing skills. The application consists of three components: a virtual partner to dance with under interactive control, visual and haptic feedback, and a game mechanic with dance tasks.

  • CGA, a mathematical framework for motion continuity in deep neural networks

    Andreas Aristidou

    In Proceedings of the International Conference on Empowering Novel Geometric Algebra for Graphics & Engineering (ENGAGE'20), Geneva, Switzerland, September 2020

  • Virtual Dance Museum: the case of Cypriot folk dancing

    Andreas Aristidou, Anastasios Yiannakidis, Yiorgos Chrysanthou

    In Proceedings of the International Conference on Emerging Technologies and the Digital Transformation of Museums and Heritage Sites, RISE IMET, Nicosia, Cyprus, June 2020

  • Teaser image for Digital Dance Ethnography

    Digital Dance Ethnography: Organizing Large Dance Collections

    Andreas Aristidou, Ariel Shamir, Yiorgos Chrysanthou

    ACM Journal on Computing and Cultural Heritage, 12(4), Article 29, 2019

    This paper presents a method for contextual motion analysis that organizes dance data semantically to form the first digital dance ethnography. The method exploits the contextual correlation between dances and distinguishes fine-grained differences between semantically similar motions.

  • Teaser image for Real-time 3D Human Pose Reconstruction

    Real-time 3D Human Pose and Motion Reconstruction from Monocular RGB Videos

    Anastasios Yiannakides, Andreas Aristidou, Yiorgos Chrysanthou

    Comp. Animation & Virtual Worlds, 30(3-4), 2019

    Proceedings of Computer Animation and Social Agents - CASA'19

    In this paper, we present a method that reconstructs articulated human motion, taken from a monocular RGB camera. Our method fits 2D deep estimated poses of multiple characters, with the 2D multi-view joint projections of 3D motion data, to retrieve the 3D body pose of the tracked character.

  • Adult2Child Age Regression Using CycleGANs

    Thomas Domas, Yuzhu Dong, Brendan John, Ariel Shamir, Andreas Aristidou, Eakta Jain

    In Proceedings of the ACM Symposium on Applied Perception (SAP'19), Barcelona, Spain, September 19-20, 2019

  • Teaser image for Deep Motifs and Motion Signatures

    Deep Motifs and Motion Signatures

    Andreas Aristidou, Daniel Cohen-Or, Jessica K. Hodgins, Yiorgos Chrysanthou, Ariel Shamir

    ACM Transactions on Graphics, 37(6), Article 187, 2018

    Proceedings of SIGGRAPH Asia 2018

    We introduce deep motion signatures, which are time-scale and temporal-order invariant, offering a succinct and descriptive representation of motion sequences. We divide motion sequences into short-term movements and then characterize each sequence based on the distribution of those movements.

  • Teaser image for Self-similarity Analysis

    Self-similarity Analysis for Motion Capture Cleaning

    Andreas Aristidou, Daniel Cohen-Or, Jessica K. Hodgins, Ariel Shamir

    Computer Graphics Forum, 37(2): 297-309, 2018

    Proceedings of Eurographics 2018

    Our method automatically analyzes mocap sequences of closely interacting performers based on self-similarity. We define motion-words consisting of short sequences of joint transformations, and use a time-scale invariant, outlier-tolerant similarity measure to find the k-nearest neighbors.

  • Teaser image for Inverse Kinematics Survey

    Inverse Kinematics Techniques in Computer Graphics: A Survey

    Andreas Aristidou, Joan Lasenby, Yiorgos Chrysanthou, Ariel Shamir

    Computer Graphics Forum, 37(6): 35-58, 2018

    Presented at Eurographics 2018 (STAR paper)

    In this survey, we present a comprehensive review of the IK problem and the solutions developed over the years from the computer graphics point of view. The most popular IK methods are discussed with regard to their performance, computational cost, and the smoothness of their resulting postures.

  • Teaser image for Style-based Motion Analysis

    Style-based Motion Analysis for Dance Composition

    Andreas Aristidou, Efstathios Stavrakis, Margarita Papaefthimiou, George Papagiannakis, Yiorgos Chrysanthou

    The Visual Computer, 34(12), 1725-1737, 2018

    This work presents a motion analysis and synthesis framework, based on Laban Movement Analysis, that respects stylistic variations and thus is suitable for dance motion synthesis. Implemented in the context of Motion Graphs, it is used to eliminate potentially problematic transitions and synthesize style-coherent animation.

  • Teaser image for Emotion Control of Dance Movements

    Emotion Control of Unstructured Dance Movements

    Andreas Aristidou, Qiong Zeng, Efstathios Stavrakis, KangKang Yin, Daniel Cohen-Or, Yiorgos Chrysanthou, Baoquan Chen

    ACM SIGGRAPH / Eurographics Symposium on Computer Animation, SCA'17, Eurographics Association, July 2017

    We present a motion stylization technique suitable for highly expressive mocap data, such as contemporary dances. The method varies the emotion expressed in a motion by modifying its underlying geometric features. Even non-expert users can stylize dance motions by supplying an emotion modification as the single parameter of our algorithm.

  • Teaser image for Hand Tracking

    Hand Tracking with Physiological Constraints

    Andreas Aristidou

    The Visual Computer, 34(2): 213-228, 2018

    We present a simple and efficient methodology for tracking and reconstructing 3D hand poses. Using an optical motion capture system, where markers are positioned at strategic points, we manage to acquire the movement of the hand and establish its orientation using a minimum number of markers.

  • Teaser image for Continuous body emotion recognition

    Continuous body emotion recognition system during theater performances

    Simon Senecal, Louis Cuel, Andreas Aristidou, Nadia Magnenat-Thalmann

    Comp. Animation & Virtual Worlds, 27(3-4): 311-320, 2016

    Proceedings of Computer Animation and Social Agents - CASA'16

    We propose a system for continuous recognition of the emotional behavior people express during communication, based on their gestures and whole-body dynamics. The features used to classify the motion are inspired by Laban Movement Analysis.

  • Teaser image for Extending FABRIK

    Extending FABRIK with Model Constraints

    Andreas Aristidou, Yiorgos Chrysanthou, Joan Lasenby

    Comp. Animation & Virtual Worlds, 27(1): 35-57, 2016

    This paper addresses the problem of manipulating articulated figures in an interactive and intuitive fashion for the design and control of their posture using the FABRIK algorithm; the algorithm has been extended to support a variety of joint types and has been evaluated on a humanoid model.

  • Teaser image for Folk Dance Evaluation

    Folk Dance Evaluation Using Laban Movement Analysis

    Andreas Aristidou, Efstathios Stavrakis, Panayiotis Charalambous, Yiorgos Chrysanthou, Stephania L. Himona

    ACM Journal on Computing and Cultural Heritage, 8(4): 1-19, 2015

    Best paper award at EG GCH 2014

    We present a framework based on the principles of Laban Movement Analysis (LMA) that aims to identify style qualities in dance motions, and can be subsequently used for motion comparison and evaluation. We have designed and implemented a prototype virtual reality simulator for teaching folk dances.

  • Teaser image for Emotion analysis and classification

    Emotion analysis and classification: Understanding the performers' emotions using LMA entities

    Andreas Aristidou, Panayiotis Charalambous, Yiorgos Chrysanthou

    Computer Graphics Forum, 34(6): 262–276, 2015

    Presented at Eurographics 2016

    We propose a variety of features that encode characteristics of motion, in terms of Laban Movement Analysis, for motion classification and indexing purposes. Our framework extracts both body and stylistic characteristics, taking into consideration not only the geometry of the pose but also the qualitative characteristics of the motion.
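    As a toy illustration of the general idea, and emphatically not the paper's actual feature set, two LMA-inspired proxies (average joint speed for Effort/Time, occupied volume for Shape; the function name `lma_inspired_features` is invented for this sketch) can be computed directly from raw joint positions:

    ```python
    import numpy as np

    def lma_inspired_features(positions):
        """Toy LMA-inspired descriptors from a (T, J, 3) array of joint positions
        over T frames and J joints. Effort/Time ~ how fast the body moves;
        Shape ~ how much space the movement occupies."""
        velocity = np.diff(positions, axis=0)                # per-frame displacement, (T-1, J, 3)
        avg_speed = np.linalg.norm(velocity, axis=2).mean()  # Effort/Time proxy
        extents = positions.max(axis=(0, 1)) - positions.min(axis=(0, 1))
        volume = float(np.prod(extents))                     # Shape (bounding volume) proxy
        return {"avg_speed": float(avg_speed), "volume": volume}
    ```

    Descriptors like these, aggregated over a motion clip, can feed a standard classifier for indexing, which is the pipeline shape the paper describes at a much richer feature granularity.
    
    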

  • A Conformal Geometric Algebra framework for Mixed Reality and mobile display

    Margarita Papaefthimiou, George Papagiannakis, Andreas Aristidou, Marinos Ioannides

    In Proceedings of the 6th Conference on Applied Geometric Algebra in Computer Science and Engineering (AGACSE'15), Barcelona, Spain, July 2015

  • Teaser image for Cypriot Intangible Cultural Heritage

    Cypriot Intangible Cultural Heritage: Digitizing Folk Dances

    Andreas Aristidou, Efstathios Stavrakis, Yiorgos Chrysanthou

    Cyprus Computer Society journal, Issue 25, pages 42-49, 2014

    We aim to preserve the Cypriot folk dance heritage, creating a state-of-the-art publicly accessible digital archive of folk dances. Our dance library, apart from the rare video materials that are commonly used to document dance performances, utilises three dimensional motion capture technologies to record and archive high quality motion data of expert dancers.

  • LMA-Based Motion Retrieval for Folk Dance Cultural Heritage

    Andreas Aristidou, Efstathios Stavrakis, Yiorgos Chrysanthou

    In Proceedings of the 5th International Conference on Cultural Heritage (EuroMed'14), LNCS, volume 8740, pages 207-216, Limassol, Cyprus, November 3 – 8, 2014

  • Motion Analysis for Folk Dance Evaluation

    Andreas Aristidou, Efstathios Stavrakis, Yiorgos Chrysanthou

    In Proceedings of the 12th EUROGRAPHICS Workshop on Graphics and Cultural Heritage (GCH'14), pages 55-64, Darmstadt, Germany, October 5 – 8, 2014

    Best Paper Award

  • Feature extraction for human motion indexing of acted dance performances

    Andreas Aristidou, Yiorgos Chrysanthou

    In Proceedings of the 9th International Conference on Computer Graphics Theory and Applications (GRAPP'14), pages 277-287, Portugal, January 05-08, 2014

  • Motion indexing of different emotional states using LMA components

    Andreas Aristidou, Yiorgos Chrysanthou

    In SIGGRAPH Asia Technical Briefs (SA'13), ACM, New York, USA, 21:1-21:4, 2013

  • Teaser image for Emotion Recognition for Exergames

    Emotion Recognition for Exergames using Laban Movement Analysis

    Haris Zacharatos, Christos Gatzoulis, Yiorgos Chrysanthou, Andreas Aristidou

    ACM SIGGRAPH Conference on Motion in Games, MIG'13, Ireland, November 7-9, 2013

    Exergames currently lack the ability to detect and adapt to players' emotional states to enhance the user experience. We propose using body motion features based on Laban Movement Analysis (LMA) to classify four emotional states with high accuracy, as demonstrated by experimental results.

  • Teaser image for Real-time marker prediction

    Real-time marker prediction and CoR estimation in optical motion capture

    Andreas Aristidou, Joan Lasenby

    The Visual Computer, 29 (1): 7-26, 2013

    An integrated framework is presented that predicts occluded marker positions using a Variable Turn Model within an Unscented Kalman filter. Inferred information from neighbouring markers is used as observation states; these constraints are efficient, simple, and real-time implementable.

  • Marker Prediction and Skeletal Reconstruction in Motion Capture Technology

    Andreas Aristidou, Yiorgos Chrysanthou

    Technical Report (UCY-CS-TR-13-2), University of Cyprus, August 2013

  • Digitization of Cypriot Folk Dances

    Efstathios Stavrakis, Andreas Aristidou, Maria Savva, Stephania Loizidou Himona, Yiorgos Chrysanthou

    In Proceedings of the 4th International Conference on Progress in Cultural Heritage Preservation (EuroMed'12), LNCS, Volume 7616, pages 404-413, 2012

  • Teaser image for FABRIK

    FABRIK: a fast, iterative solver for the inverse kinematics problem

    Andreas Aristidou, Joan Lasenby

    Graphical Models, 73(5): 243-260, 2011

    A novel heuristic method, called Forward And Backward Reaching Inverse Kinematics (FABRIK), is described that avoids the use of rotational angles or matrices and instead finds each joint position by locating a point on a line. It thus converges in a few iterations, has a low computational cost, and produces visually realistic poses.
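    The two-phase update described above can be sketched in a few lines. This is a minimal, unconstrained variant for a single serial chain (the paper also covers joint constraints and multiple end effectors), with illustrative names of my own choosing:

    ```python
    import numpy as np

    def fabrik(joints, target, tol=1e-4, max_iter=100):
        """Solve IK for a serial chain by iteratively relocating joints on lines.
        `joints` is an (n, 3) array of positions from root to end effector."""
        p = np.asarray(joints, dtype=float).copy()
        d = np.linalg.norm(np.diff(p, axis=0), axis=1)   # fixed bone lengths
        root = p[0].copy()
        if np.linalg.norm(target - root) > d.sum():      # target unreachable:
            for i in range(len(d)):                      # stretch the chain towards it
                lam = d[i] / np.linalg.norm(target - p[i])
                p[i + 1] = (1 - lam) * p[i] + lam * target
            return p
        for _ in range(max_iter):
            # Backward reaching: pin the end effector to the target, walk to the root
            p[-1] = target
            for i in range(len(p) - 2, -1, -1):
                lam = d[i] / np.linalg.norm(p[i + 1] - p[i])
                p[i] = (1 - lam) * p[i + 1] + lam * p[i]
            # Forward reaching: re-anchor the root, walk back to the end effector
            p[0] = root
            for i in range(len(p) - 1):
                lam = d[i] / np.linalg.norm(p[i + 1] - p[i])
                p[i + 1] = (1 - lam) * p[i] + lam * p[i + 1]
            if np.linalg.norm(p[-1] - target) < tol:
                break
        return p
    ```

    Each joint update places a point on the line between a fixed neighbour and the joint's old position, at the stored bone length, which is why no rotation representation is ever needed.
    
    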

  • Inverse Kinematics solutions using Conformal Geometric Algebra

    Andreas Aristidou, Joan Lasenby

    Guide to Geometric Algebra in Practice, In L. Dorst and J. Lasenby (Eds), pages 47-62, Springer Verlag, 2011

  • Motion Capture with Constrained Inverse Kinematics for Real-Time Hand Tracking

    Andreas Aristidou, Joan Lasenby

    In IEEE Proceedings of the 4th International Symposium on Communications, Control and Signal Processing (ISCCSP'10), Limassol, Cyprus, May 3-5, 2010

  • Tracking and Modelling Motion for Biomechanical Analysis

    Andreas Aristidou

    A dissertation submitted to University of Cambridge for the Degree of Doctor of Philosophy, Cambridge, October 2010

    Supervisor: Dr Joan Lasenby, Examiners: Prof William J. Fitzgerald & Prof Adrian Hilton

  • Inverse Kinematics: a review of existing techniques and introduction of a new fast iterative solver

    Andreas Aristidou, Joan Lasenby

    Technical Report (CUEDF-INFENG, TR-632), University of Cambridge, September 2009

  • Methods for Real-time Restoration and Estimation in Optical Motion Capture

    Andreas Aristidou, Joan Lasenby, Jonathan Cameron

    Technical Report (CUEDF-INFENG, TR-619), University of Cambridge, January 2009

  • Predicting Missing Markers to Drive Real-Time Centre of Rotation Estimation

    Andreas Aristidou, Jonathan Cameron, Joan Lasenby

    In Proceedings of the International Conference on Articulated Motion and Deformable Objects (AMDO'08), LNCS, Vol. 5098, pages 238-247, Mallorca, Spain, July 9-11, 2008

  • Real-Time Estimation of Missing Markers in Human Motion Capture

    Andreas Aristidou, Jonathan Cameron, Joan Lasenby

    In IEEE Proceedings of the International Conference on Bioinformatics and Biomedical Engineering (iCBBE'08), pages 1343-1346, Shanghai, China, May 16-18, 2008

  • Tracking Multiple Sports Players for Mobile Display

    Andreas Aristidou, Paul Pangalos, Hamid Aghvami

    In Proceeding of the International Conference on Image Processing, Computer Vision, and Pattern Recognition (IPCV'07), pages 53-59, Las Vegas, USA, June 25-28, 2007

  • Violent Content Classification using Audio Features

    Theodoros Giannakopoulos, Dimitrios Kosmopoulos, Andreas Aristidou, Sergios Theodoridis

    In Proceedings of the Hellenic Artificial Intelligence Conference SETN-06, LNCS, Volume 3955, pages 502-507, Heraklion, Crete, Greece, May 18-20, 2006

  • A robust method for tracking sports players for web or mobile display

    Andreas Aristidou

    A dissertation submitted to King's College London for the Degree of Master of Science, London, September 2006

    Supervisor: Prof Hamid Aghvami

  • Reliable indexing and recognition of violent scenes using audio features

    Andreas Aristidou

    Annual journal of the Department of Informatics & Telecommunications, Best dissertations of the year, pages 127-136, Athens, Greece, 2006

    Supervisor: Prof Sergios Theodoridis

Databases & Repositories

Motion capture, cultural heritage, and open resources

I curate and maintain five publicly accessible motion-capture and 3D-model repositories, spanning music performance, natural heritage, biodiversity, and dance preservation. These collections provide high-quality research-ready datasets (including motion-capture recordings, 3D models, HD video, audio, and metadata) supporting interdisciplinary work in culture, education, conservation, and digital-humanities research.

  • Virtual Butterfly Exhibition

    Virtual Butterfly Exhibition

    Published: 2025

    A digital collection featuring photorealistic 3D models of native butterfly species, accompanied by taxonomy, host plant data, phenology, and conservation information. Created to support ecological education, scientific study, and biodiversity awareness among the public.

  • Multi-Modal Instrument Performances

    Multi-Modal Instrument Performances

    Published: 2025

    An open-access dataset combining synchronized high-resolution 3D motion capture, MIDI streams, HD video, and studio-quality audio from professional musicians on piano, guitar, and drums. Designed for research on musical expressivity, gesture, skill acquisition, and performance analysis.

  • Cyprus 3D reptiles

    Cyprus 3D reptiles

    Published: 2022

    A repository of animated, high-fidelity 3D models of reptile species native to Cyprus, capturing both biological morphology and live motion for reference, conservation, and educational use. The platform bridges tangible and intangible heritage by documenting color, structure, posture, movement, and behavioral signalling.

  • Virtual Dance Museum

    Virtual Dance Museum

    Published: 2022

    A dynamic 3D archive presenting traditional Cypriot and Greek dances through fully animated characters dressed in authentic attire, accompanied by metadata, photographs, and cultural context. Viewers can explore each performance interactively, experiencing dance traditions as they appear in real festivals.

  • Dance DB

    Dance Motion Capture Database

    Published: 2012

    A publicly accessible digital archive preserving traditional dance through state-of-the-art motion-capture recordings of expert performers. The repository provides high-quality datasets (including FBX, C3D, BVH, SMPL formats, and videos with music), and continues to expand as new performances are documented over time.

Teaching

Courses, supervision & dissertation topics

Current Courses

  • EPL232 – Programming Techniques and Tools

    Fall semester, Level: UG

    The course teaches intermediate and advanced programming concepts, techniques and tools through a language that compiles to machine code. It familiarizes students with advanced constructs for handling memory and files, and covers compilation, debugging, documentation, and optimization of software. Emphasis is placed on methodological aspects of developing large-scale system software that addresses complex problems. Basic UNIX commands for programmers are also introduced.

  • EPL426 – Computer Graphics

    Spring semester, Level: UG/PG

    The course teaches the basic principles of computer graphics. Students become familiar with scene construction, scene hierarchies, camera specification, projections of primitives, clipping, visible surface determination, polygon rasterisation (z-buffer), texture mapping, local and global illumination, shadows, ray tracing, radiosity, and real-time acceleration techniques. Both theoretical foundations and practical skills are provided through industry-standard tools such as OpenGL and the Unity game engine.

  • MAI 645 – Machine Learning for Graphics and Computer Vision

    Spring semester, Level: PG

    This course introduces core machine learning algorithms and their use in computer vision and computer graphics. It also functions as a graduate-level seminar with weekly readings, short summaries, and discussions of recent research papers.

Past Courses

  • Computer Graphics

    Cyprus University of Technology, 2015

    Cyprus College, 2012, 2013, 2016

  • Audio-Visual Foundations

    University of Nicosia, 2012

  • Computer Networks

    Cyprus College, 2012, 2013

  • Digital Logic Design

    Cyprus College, 2012, 2013

  • Information Technology

    Cyprus College, 2012, 2013

  • Information Engineering Laboratories

    University of Cambridge, 2009

Available Dissertation / Thesis Topics

  • Interactive 3D Applications for Education and Cultural Heritage

    Design and development of immersive environments such as educational games, virtual museums, and exploratory learning tools. Focus on user interaction, gamification, and visual storytelling through real-time graphics.

    Requirements: Unity or Unreal Engine, Python or C++, Basic Computer Graphics, 3D asset workflow familiarity

  • Evaluating Motion Quality Through Real + Synthetic Data Fusion

    Systematic study of motion generation quality by blending motion captured data with AI-generated animations. Determine ideal proportions where motion realism begins to degrade and visualize quality metrics.

    Requirements: Machine Learning, Python, Motion Capture datasets, Blender or similar animation software

  • Motion-Driven Sonification and Neural Music Generation

    Explore how human motion can be translated into sound and music in real time or offline. Possible directions include feature extraction from motion data, generative music models, and interactive audio-visual feedback systems.

    Requirements: Python, Machine Learning, Audio processing basics, Blender or motion data handling

  • Laban Movement Analysis for Motion Stylization and Transfer

    Use LMA descriptors to analyze movement expressiveness and employ them as intermediates in animation generation. The goal is to manipulate style, emotion, or intent during retargeting or generative modeling.

    Requirements: Machine Learning, Python, Animation fundamentals, Blender/MoCap tools

  • Mapping Laban Movement Scores into Arousal–Valence Emotional Space

    Investigate how LMA parameters relate to emotional perception and model their projection onto the Arousal/Valence diagram. May include dataset collection, statistical correlation, and visualization or generative movement-emotion mapping.

    Requirements: Machine Learning, Python, Knowledge of LMA or affect modeling, Data visualization

Students & Opportunities

Supervision & prospective students

Current PhD Students & Research Associates

  • Portrait of Andreas Panayiotou

    Andreas Panayiotou

    PhD Student

  • Portrait of Theodoros Kyriakou

    Theodoros Kyriakou

    PhD Student

  • Portrait of Christina-Georgia Serghides

    Christina-Georgia Serghides

    PhD Student

  • Portrait of Ofir Abramovich

    Ofir Abramovich

    PhD Student (co-supervised with Prof. Ariel Shamir)

  • Portrait of Alexios Mylordos

    Alexios Mylordos

    Research Associate

  • Portrait of Zhanyu Tuo

    Zhanyu Tuo

    PhD Student (co-supervised with Prof. Florence d'Artois)

Alumni

  • Portrait of Anastasios Yiannakidis

    Anastasios Yiannakidis

    Alumnus; now Research Software Engineer at the Max Planck Institute for Intelligent Systems

Prospective Students

I am currently welcoming motivated PhD / MSc / Research Associate / internship candidates who are eager to work in a fast-paced, creative research environment. Strong programming skills are essential, while experience in computer graphics, machine learning, or VR/AR is a valuable bonus. If you are passionate, curious, and excited to build innovative digital technologies, I would love to hear from you.

Professional Service

Editorial & community roles

Editorial Boards

Program Committees

Other Service

  • I am continuously engaged in the review of high-impact journal publications and international conference papers, while also serving as a panel member for European and global initiatives. In addition, I contribute to national and EU-level committees focused on creative industries, generative AI, digital heritage, and related areas of strategic development.

Research Grants & Distinctions

Funded projects and selected recognitions

Research Grants

Selected Awards & Distinctions

  • Best student paper award at the 8th International Disaster and Risk Conference (IDRC’25), for the paper “Disaster Evacuation of the Old City of Nicosia”; co-authors: M. Stylianou and M. Demetriou.
  • Our video on EYO dance digitization has been nominated for Best Dance Film at the 2023 Cyprus Dance Film Festival, the premier event for creators and enthusiasts of dance film in Cyprus.
  • Awarded the Erasmus Mundus Grant for visiting scholars (2018) to undertake teaching and research activities in the context of dance as Intangible Cultural Heritage (Choreomundus International Master).
  • Awarded the nVIDIA GPU grant (2017), to investigate deep learning methods for motion synthesis and analysis.
  • Following my research in folk dance digitization, I was part of a consortium awarded the DARIAH-EU Theme 2015 in Open Humanities for organizing a workshop on e-documentation of Intangible Cultural Heritage (€6K).
  • Best paper award at the 12th Eurographics Workshop on Graphics and Cultural Heritage (GCH’14), for the paper “Motion Analysis for Folk Dance Evaluation”; co-authors: E. Stavrakis and Y. Chrysanthou.
  • Awarded the Office of Naval Research Global (ONRG) Visiting Scientist Program (VSP) scholarship in 2013 to visit PhaseSpace Inc. offices in San Leandro, CA, USA, in order to be trained and gain experiences on subjects relative to motion capture (€4.5K).
  • Awarded the prestigious ΔΙΔΑΚΤΩΡ fellowship (2012-2014), by the Cyprus Research Promotion Foundation, to establish research in motion analysis and classification (€125K, with acceptance rate 5%).
  • PhaseSpace Inc. awarded me with a full 8-camera X2 motion capture system (2012), providing the essentials to establish my own motion capture laboratory at the Graphics and Virtual Reality Lab, University of Cyprus. They also donated a multiLED board (hand glove) in order to practise hand reconstruction methodologies on their software (€60K).
  • Cambridge European Trust Bursary (2007-2010) for PhD studies at the University of Cambridge (€13.5K).
  • Best BSc dissertation (2005), included in the annual journal of the Department of Informatics & Telecommunications (University of Athens) and presented in the Hellenic Artificial Intelligence Conference SETN-06, Crete, Greece, 2006.

Contact

Get in touch

Contact Details

  • Email: a.aristidou@ieee.org
  • Office: FST01, Room B113, University of Cyprus
  • Phone: +357 22 89 2698
  • Postal address:
    Department of Computer Science
    University Campus
    P.O. Box 20537
    CY-1678 Nicosia, Cyprus

Find Us