Chung Hyuk Park

Robotic Platforms
The ART-Med lab has multiple robotic platforms for research projects focused on tele-manipulation, mobile navigation, and human-robot interaction. The robotic platforms are listed below:
  • Adept Telepresence robotic platform (1 unit)
    • Two Kinova JACO 2 robotic manipulators (7 DoF each, with a gripper on each arm) for tele-manipulation
    • Adept LX mobile robot (1 unit) with a laser scanner, ultrasound sensors, and differential-drive wheels, capable of SLAM-based navigation and path planning for telepresence research
  • Romo from Romotive (5 units): Interactive robotic platform for our study on music-based interactive robot therapy for children with autism spectrum disorder (ASD)
  • ROBOTIS DARwIn-OP2 (3 units) and ROBOTIS Mini (4 units): Humanoid robot platforms for interacting with children with ASD
  • SoftBank Pepper (3 units) and NAO (1 unit): Programmable humanoid robot platforms for interacting with adolescents with ASD
  • Sony Aibo 2 (1 unit): Sony’s AI-powered companion robot dog
Sensory Systems
The ART-Med lab also provides diverse multi-modal sensors for effective and creative human-robot interaction, including the following:
  • Visual/depth sensors: Microsoft Kinect v1/v2 for Windows (4 units) and Intel RealSense SR300 (3 units)
  • Tobii mobile eye trackers (2 units)
  • Embedded depth sensors: Intel RealSense depth cameras (2 units), ZED stereo-vision depth camera (1 unit), and Azure Kinect RGB-D camera (1 unit)
  • Brain-Computer Interface: Emotiv EPOC+ headsets (2 units), felt sensors, EPOC hydrator pack, and Simulink EEG importer.
  • E4 wrist sensor: Empatica E4 wristband (1 unit) for real-time physiological data acquisition
  • Xsens MVN Wearable Motion Sensor system: Xsens 3D motion tracking system (17 IMUs + 3 wearable suits) with Analyze software (lifetime license).
Haptic and Augmented-Reality (AR) Interfaces
The following haptic interfaces and AR/VR devices are available for interactive multi-modal communication with human users.
  • Haptic devices: Geomagic Touch (1 unit; formerly the SensAble Phantom Omni) and Force Dimension Omega 7 (1 unit)
  • Microsoft HoloLens 1 & 2 augmented-reality (AR) headsets (3 units)
  • HTC Vive VR headset with Lighthouse trackers
  • Oculus Quest 2 VR headset
Computing and Prototyping Facility
The ART-Med lab provides multiple computing platforms spanning Windows, macOS, and Linux, along with prototyping support and embedded systems.
  • High-end PCs: Intel Core i7 quad-core desktops with NVIDIA GPUs (6 units)
  • Deep-learning workstations: Lambda GPU workstation with 4 NVIDIA A5000 GPUs (1 unit) and Lambda GPU workstation with 3 NVIDIA A6000 GPUs (1 unit)
  • Mobile computing: laptops with Core i7 processors and NVIDIA GPUs (4 units), MacBook Pro (1 unit), iMac (1 unit), Mac mini (1 unit), Mac Display (1 unit)
  • Mobile devices: iPod Touch (4 units), Samsung Galaxy Tab 4 (4 units).
  • 3D printer: Ultimaker 2 (2 units), FlashForge Adventurer 4 (1 unit)
  • Other electronic devices and embedded systems.