YITONG SUN

About

Dr Yitong Sun is a computer scientist, HCI researcher and design-engineering practitioner working across human-centred simulation, wearable sensing and interactive systems. His research develops computational methods that make complex physical and bodily phenomena measurable, simulable and interactive: from digital twins of disaster environments, to eye- and body-centred sensing systems, to smart textile interfaces and AI-supported research tools.

Yitong's work sits at the intersection of human-computer interaction, real-time simulation, digital twins, wearable electronics, sensor systems, computer graphics and design research. He is particularly interested in systems that connect the digital and physical worlds: simulations that can support training and decision-making, sensing devices that can interpret human and environmental states, and interfaces that help people understand, shape and act on complex data.

His research is grounded in both technical development and human-centred evaluation. Across his projects, he has built real-time simulation environments, wearable sensing prototypes, PCB-based embedded systems, optical and physiological evaluation tools, AI-assisted modelling pipelines, and interactive authoring platforms. His work often moves between algorithm, hardware, interface and user study, reflecting a broader commitment to building systems that are technically rigorous, experientially meaningful and practically usable.

Research Agenda

Yitong's current research can be understood through three connected strands.

Simulation, Digital Twins and Real-Time Modelling

Yitong develops simulation systems that translate complex physical processes into real-time, interactive and visually legible computational environments. His work on earthquake simulation uses game engines, material calibration and physics-based modelling to create high-fidelity disaster scenarios for AI training, robotics, emergency planning and immersive training. More broadly, his simulation research treats virtual environments not only as visualisations, but as experimental systems for testing hypotheses, generating synthetic data and supporting decision-making.

This agenda also extends to biological and ecological simulation. His work on fungal morphology modelling and artificial life systems explores how machine learning, cellular automata and semantic feedback can make growth patterns, collective behaviour and emergent systems more controllable and interpretable for researchers, designers and public audiences.

Wearable Sensing, Smart Textiles and Body-Centred Systems

Yitong's wearable research investigates how bodies can be sensed, modelled and supported through embedded systems. His work includes posture-monitoring wearables, capacitive and EMG-based sensing, BLE PCB prototyping, sensor calibration, edge data capture and time-series modelling for personalised health intervention.

More recently, his work at MIT CSAIL explores smart textile fabrication and conductive flexible materials. His Joule-Flocked Smart Textiles project investigates how multimaterial 3D printing, conductive TPU, in-situ Joule heating and electrostatic flocking can be combined to fabricate programmable textile surfaces and pressure-sensitive interfaces. This work reflects his broader interest in soft, body-adjacent technologies that integrate material behaviour, sensing, electronics and interaction design.

HCI, Immersive Systems and Human-Centred Evaluation

Yitong's HCI research focuses on how people perceive, experience and act through computational systems. His doctoral work examined VR lighting, colour rendering, non-image-forming vision, eye-related sensing and visual comfort in immersive environments. This includes methods for predicting light spectrum exposure in VR scenes, reducing light stimulation while preserving colour fidelity, and estimating periocular metrics from headset-mounted cameras.

Across these projects, Yitong combines computational modelling with user-centred evaluation. He works with both quantitative and qualitative methods, including sensor-data analysis, psychophysical assessment, user studies, design workshops and stakeholder engagement. His goal is to create systems that are not only computationally novel, but also meaningful, interpretable and useful for the people who rely on them.

Current and Recent Roles

Yitong is currently a Visiting Scholar at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), working with Prof. Stefanie Mueller's group on smart textiles, conductive flexible materials and body-centred interactive systems.

He is also the founding Editor-in-Chief and technical founder of interactives, an experimental journal and publishing platform for HCI, game studies and new media. The platform supports interactive academic authoring, embedded media, timeline-based review workflows and AI-assisted editorial processes.

Previously, Yitong was a Postdoctoral Research Associate at the Royal College of Art, where he co-authored and delivered a project funded by the EPSRC XR Network+, in collaboration with Foster + Partners and the University of New South Wales. The project developed a high-fidelity earthquake simulation environment in Unreal Engine for disaster modelling, virtual production and AI training.

He has also worked as a Research Assistant and Wearable Electronics Engineer on AiDLab and RCA projects, contributing to wearable sensing systems for posture monitoring and personalised musculoskeletal health intervention.

Earlier in his career, Yitong was a College Lecturer at the Central Academy of Fine Arts, where he taught interaction design, HCI, programming, design methodology, biomaterials and immersive experience. He also worked with Piaggio's innovation group in Italy on speculative HCI and mobility concepts for the future of the Vespa series.

Publications, Patents and Recognition

Yitong has published and presented work across leading venues in HCI, computer graphics, immersive systems and wearable technologies, including CHI, SIGGRAPH Asia, IEEE Transactions on Visualization and Computer Graphics, Computers & Graphics, ISMAR and IEEE VR.

His work has led to multiple patent applications, including technologies for VR colour adjustment, pupillary oscillation-based brain arousal inference and personalised posture intervention. His projects have received support and recognition through the EPSRC XR Network+ Embedded Research and Development Grant, IRCA Design & Impact S/EIS Research Seed Fund, Innovation RCA Startup Pitch Deck Competition, Japan Media Arts Festival New Face Award, Prix Ars Electronica Honorary Mention, and other international awards.

He is a Fellow of the Royal Society of Arts and a Fellow of the Higher Education Academy.

Teaching, Research Leadership and Community

Yitong has experience across research, teaching, supervision, peer review and public engagement. He has taught design, HCI, programming, biomaterials and immersive technologies, and has supervised undergraduate projects in interaction design and computational media. He has also served as a reviewer for venues including SIGGRAPH, SIGGRAPH Asia, IEEE VR, ISMAR and ACM TOG.

His public and professional engagements include invited talks, workshops and collaborations with universities, research labs, design studios and technology organisations. He has presented to and engaged with communities across the UK, Europe, China, Japan and Australia, in contexts including SIGGRAPH Asia, the University of Tokyo, UCL, UNSW iCinema, Foster + Partners, Sony, the Meta XR Hackathon, Nvidia, HTC and other academic and industry-facing venues.

Collaboration

Yitong is interested in collaborations that connect simulation, sensing, digital twins, wearable interfaces, smart materials, immersive systems and human-centred AI. He is particularly drawn to projects where computational systems must operate across disciplines: translating physical phenomena into models, turning sensor data into actionable feedback, or making complex systems understandable through interaction.

His long-term research ambition is to build technologies that help people perceive, simulate and intervene in environments and bodies with greater precision, responsibility and care.