Currently, I am working on tactile sensors, focusing on wearable tactile sensors and whisker sensors
for the underwater robot OceanOneK.
Previously, I had the privilege of being advised by
Prof. Wei-Shi Zheng at
Sun Yat-sen University, where I worked on dexterous hand manipulation.
Beyond robotics, I also work on interdisciplinary research in music, healthcare, hardware, and AI.
I am fortunate to collaborate with professors from the Music and Medical Schools.
TL;DR: Localize contacts with a causal transformer trained on simulation data from MuJoCo.
Designed with optical fiber, the whisker sensors are well suited to the underwater robot OceanOneK and resistant to salt corrosion.
TL;DR: Synthesize a text-grasping dataset using GPT-4 from low-level hand-object features.
Generate dexterous hand grasps with a diffusion model conditioned on CLIP-encoded text embeddings.
TL;DR: Synthesize a text-to-music dataset using GPT from extracted musical attributes.
A two-stage text-to-music framework: BERT for text understanding and a transformer decoder for music generation.
Misc
I love jazz, J-POP, and AI-generated music
(check out MusicFX DJ!).
You can often find me either playing random music or working out at the gym.