Building out Sophia 2020: An Integrative Platform for Embodied Cognition

  • The pre-print paper we published outlining this entire platform in more detail.
  • The poster we presented at the AAAS Annual Meeting.


Sophia 2020 combines expressive, human-like robotic faces, arms, and locomotion with neuro-symbolic AI dialog ensembles and NLP/NLG tools, all within an open creative toolset.

Sensors and Touch Perception

In human emulation robotics, touch perception and stimulation are recognized as critical for developing new human-inspired cognitive machine-learning architectures.

Self-assembling porous polysiloxane emulsion artificial skin
Microfluidic pressure sensor, embedded in frubber polysiloxane emulsion
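One common way to turn raw skin-sensor readings into usable touch events is thresholding with hysteresis, so that sensor noise near the threshold does not produce rapid on/off flicker. The sketch below is illustrative only; the function name and threshold values are assumptions, not part of the Sophia 2020 sensor stack.

```python
# Hypothetical sketch: converting raw pressure readings from a skin-embedded
# sensor into binary contact events using hysteresis thresholds.
# Threshold values are illustrative, not from the actual hardware.

def detect_contact(readings, on_threshold=0.6, off_threshold=0.4):
    """Return a list of booleans: True while the skin is in contact."""
    in_contact = False
    events = []
    for p in readings:
        if not in_contact and p > on_threshold:
            in_contact = True        # rising edge: contact begins
        elif in_contact and p < off_threshold:
            in_contact = False       # falling edge: contact released
        events.append(in_contact)
    return events
```

Using separate on/off thresholds means a reading that dips slightly (e.g. 0.5) after contact begins still counts as sustained touch.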

Hands and Arms

For the platform, we developed novel 14-DOF robotic arms with humanlike proportions, position and force feedback in every joint, series elastic actuators in all hand DOFs, and relatively low-cost manufacturing.

Sophia 2020 Hands and Arms with force sensing joint and fingertips
  • PID control (reading the sensor and computing the desired actuator output).
  • Servo motors with 360 degrees of position control
  • URDF models in several motion-control frameworks (Roodle, Gazebo, MoveIt).
  • Force-feedback controls, IK solvers, and PID loops, combining classic motion control with computer animation, wrapped in a ROS API.
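The PID pattern named above (read the joint sensor, compute the desired actuator output) can be sketched as follows. This is a minimal textbook PID, not the SDK's controller; the class name and gains are illustrative.

```python
# Minimal PID controller sketch: error between setpoint and measured joint
# position drives proportional, integral, and derivative terms.
# Gains and the class itself are illustrative, not from the Hanson-AI SDK.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        """One control step: returns the actuator command."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In a joint controller, `update` would be called at a fixed rate with the encoder reading as `measured`, and its output sent to the servo.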

Grasping Control

We tested visual servoing for grasp detection using an iteration of the Generative Grasping Convolutional Neural Network (GG-CNN). The network takes a depth image as input and predicts a grasp pose at every pixel, across different grasping tasks and objects.
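GG-CNN's per-pixel output is typically post-processed by selecting the pixel with the highest predicted grasp quality and reading the grasp angle and gripper width at that location. A minimal sketch of that selection step, with placeholder arrays standing in for real network outputs:

```python
import numpy as np

# Sketch of GG-CNN-style post-processing: given per-pixel output maps
# (grasp quality, angle, gripper width), pick the best grasp pose.
# The arrays here are placeholders for real network outputs.

def best_grasp(quality, angle, width):
    """Return (row, col, angle, width) at the highest-quality pixel."""
    r, c = np.unravel_index(np.argmax(quality), quality.shape)
    return r, c, angle[r, c], width[r, c]
```

In a visual-servoing loop, the selected pixel is back-projected through the depth camera intrinsics to obtain a 3D grasp target for the arm.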

Hanson-AI SDK

The Hanson-AI SDK includes various perception and control commands. Here are a few of the key features:

Perception:
- Face tracking, recognition, expressions, saliency, gestures, STT, SLAM, etc.
- Procedural animation responses to perception: tracking, imitation, saccades
- Gazebo, Blender, and Unity simulations

Robotic controls:
- ROS, IK solver, PID loops, perceptual fusion, logging & debugging tools
- Movie-quality animation, with authoring tools for interactive performances
- Arms & hands: social gestures, rock paper scissors, figure drawing, baccarat
- Dancing, wheeled mobility, walking (KAIST/UNLV DRC Hubo)
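Of the procedural animation responses listed above, saccades are a simple one to illustrate: the gaze jumps between small random offsets around a tracked target, mimicking human eye movement. The sketch below is hypothetical; the function, radius, and units are assumptions rather than SDK values.

```python
import random

# Hypothetical sketch of a procedural saccade generator: produce small
# random (dx, dy) gaze offsets around the current tracking target.
# The radius and normalized units are illustrative, not from the SDK.

def saccade_offsets(n, radius=0.02, seed=None):
    """Generate n small (dx, dy) gaze offsets around the target."""
    rng = random.Random(seed)
    return [(rng.uniform(-radius, radius), rng.uniform(-radius, radius))
            for _ in range(n)]
```

A gaze controller would add each offset to the tracked target position at random intervals, then recenter, layering lifelike micro-motion on top of deliberate tracking.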

Human Emulation Robot & AI Platform Overview

Consciousness Experiments

Consciousness is difficult to describe, especially in robots. Although there is still much debate about whether and how consciousness can be measured, we attempted to conduct studies to measure it.

Looking Ahead

As we move forward, our goal is to test our control system (such as manipulation commands) in more precise environments and applications. We will also participate in more human studies to understand how robotic interactions affect people's mood, behaviour, and actions.

Sophia participating in autism treatment (Pisa FACE), guided meditations (HKPU and Awakening Health), and various R&D studies (HRL & IISc)



Alishba Imran

Machine learning developer working on accelerating automation/hardware and energy storage!