Shelly: A Robotic Tortoise for Child-Robot Interaction

Sep 2017 – Feb 2018

Team Research Project, Naver Labs Robotics Group

Project Leader

* Advisors: Dr. Hae Won Park of the MIT Media Lab Personal Robotics Group, and Dr. Sangok Seok

* ACM/IEEE HRI 2018 (Chicago) Student Design Competition (First Place) & Late-Breaking Report

* Extensive Interview with IEEE Spectrum.


Motivation

The project “Shelly”, which I conducted with other research interns in the Naver Labs robotics group, originated from a minimal objective set by Dr. Seok: ‘a social robot for children.’ We therefore searched extensively for problems we could solve with a robot in ways that would socially benefit children. During field tests in public spaces, we observed that children frequently abuse the service robots developed at Naver Labs, largely out of curiosity. So we decided to develop a robot that would restrain children’s robot-abusing behavior and encourage them to treat robots better. The result is Shelly, a tortoise-like robot.

 

Children like to abuse robots, but few studies have tried to remedy the phenomenon.

Process

1. Development of the prototypes.

Over the span of four months, we developed two prototypes of Shelly. This was possible thanks to our excellent team members, each of whom contributed to a different part of Shelly’s system: mechanical design, the hardware system, and the sensory system. I dedicated myself to designing the graphical interface embedded in Shelly’s shell, as well as the shell’s overall structure.

Our team members: (from left) Sunho Jang, Soomin Lee, Won-Kyung Do, me, and Hyunjin Ku

The first prototype was a proof-of-concept model, which we used to evaluate our core hypothesis. The second prototype received more sophisticated interaction and aesthetic design, and it has since been used to research perception algorithms for understanding children’s touch motions.

Shelly’s first prototype. We used chopsticks for the shell’s frame so we could build it quickly.
Shelly’s second prototype, a more finished version capable of more sophisticated perception and graphical reactions to children’s motions.

Since Shelly’s shell has many faces, each of which needs embedded sensors for tactile interaction, I spent a lot of time considering how to develop it efficiently, both in fabrication and in the resulting system. I therefore took a rapid-prototyping approach, experimenting with numerous tactile module prototypes that combine LEDs with different kinds of sensors.

Structure of Shelly’s tactile modular interface.
One of the working prototypes of the tactile interface module I made.
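To illustrate the modular design described above, here is a minimal sketch of how per-face tactile modules could be aggregated into one touch frame per tick. All names, thresholds, and the simulated sensor values are my own assumptions for illustration, not Shelly's actual firmware.

```python
from dataclasses import dataclass

@dataclass
class TactileModule:
    """One shell face: a touch sensor plus an LED (illustrative sketch)."""
    face_id: int
    threshold: float = 0.5  # hypothetical touch threshold, not Shelly's real value
    led_on: bool = False

    def update(self, raw_value: float) -> bool:
        """Mark the face as touched (and light its LED) when the raw
        sensor value crosses the threshold."""
        touched = raw_value >= self.threshold
        self.led_on = touched
        return touched

class Shell:
    """Aggregates the per-face modules into one boolean touch frame per tick."""
    def __init__(self, n_faces: int):
        self.modules = [TactileModule(i) for i in range(n_faces)]

    def read_frame(self, raw_values):
        return [m.update(v) for m, v in zip(self.modules, raw_values)]

# Simulated readings for a 4-face shell (toy data):
shell = Shell(n_faces=4)
frame = shell.read_frame([0.1, 0.9, 0.4, 0.7])
print(frame)  # [False, True, False, True]
```

The appeal of this kind of modularity is that each face is self-contained, so adding or replacing a face does not disturb the rest of the shell.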

Besides the tactile interface module prototypes, I made form mock-ups to shape the aesthetic shell. See the collection of my prototypes below.

Various prototypes I made, which were eventually combined into the shell of Shelly’s second prototype. The pink ones are form mock-ups made of iso pink (extruded polystyrene foam). (Courtesy: various face prototypes of Shelly were made by Won-Kyung Do)

This is the resulting graphical-tactile interface of Shelly:


2. Research with Shelly.

The core idea of Shelly was that its hiding-in-shell motion, which resembles a real tortoise’s natural behavior, could serve as negative feedback for children’s abusive actions. We first set out to verify this hypothesis with the first prototype, conducting two field tests with children as part of a monthly local children’s recreation event in Seongnam, Korea. In total, 141 people participated in the event.

Field test with the first prototype.
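The hiding-in-shell feedback described above can be sketched as a small state machine: repeated hits drive Shelly into hiding, and it re-emerges only after the children leave it alone for a while. The thresholds and timings below are hypothetical stand-ins, not Shelly's actual parameters.

```python
import enum

class State(enum.Enum):
    ACTIVE = "active"   # Shelly is out of its shell, interacting
    HIDING = "hiding"   # Shelly has retracted into its shell

class HideReaction:
    """Minimal sketch of a hide-on-abuse policy (illustrative numbers)."""
    ABUSE_THRESHOLD = 3  # hits per tick that count as abuse (assumed)
    HIDE_TICKS = 5       # quiet ticks required before re-emerging (assumed)

    def __init__(self):
        self.state = State.ACTIVE
        self.hide_timer = 0

    def step(self, hits_this_tick: int) -> State:
        if self.state is State.ACTIVE:
            if hits_this_tick >= self.ABUSE_THRESHOLD:
                self.state = State.HIDING
                self.hide_timer = self.HIDE_TICKS
        else:
            # While hiding, count down only when Shelly is left alone;
            # any further hit resets the wait.
            if hits_this_tick == 0:
                self.hide_timer -= 1
                if self.hide_timer <= 0:
                    self.state = State.ACTIVE
            else:
                self.hide_timer = self.HIDE_TICKS
        return self.state

r = HideReaction()
print(r.step(5).value)  # "hiding"
```

The key property of this policy is that hiding persists until the abuse stops, which is what makes it work as negative feedback rather than as an amusing reaction.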

We verified our hypothesis and reported the result to the HRI 2018 conference as a late-breaking report.

However, we wanted to go further and build a more sophisticated interaction engine for Shelly. By investigating previous studies in the field of HRI, I outlined conceptual steps toward an interaction engine in which a specific social concept is tied to the engine’s social functionality.


Although we did not reach this final step in the end, these thorough explorations of research directions taught me valuable lessons about how to conduct research on social human-robot interaction.

For the second step of this process – distinguishing users’ data – we studied methods for recognizing children’s various touch patterns from our tactile sensors’ data stream. For the touch pattern classifier, we utilized 3D-CNN, LSTM, and HMM models, which we carefully designed to reflect the spatiotemporal structure of the sensor data.

The framework of social touch pattern recognition we proposed.
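The spatiotemporal structure mentioned above can be made concrete with a small preprocessing sketch: the per-tick sensor stream is sliced into overlapping (time × face) windows, which is the kind of tensor a 3D-CNN or LSTM consumes. The window parameters, toy data, and hand-written features below are illustrative stand-ins for the learned models, not our actual pipeline.

```python
from typing import List

def make_windows(stream: List[List[int]], win: int, hop: int) -> List[List[List[int]]]:
    """Slice a per-tick stream (one binary value per shell face) into
    overlapping windows of shape (win, n_faces). Window length and hop
    are hypothetical parameters."""
    return [stream[i:i + win] for i in range(0, len(stream) - win + 1, hop)]

def simple_features(window: List[List[int]]):
    """Toy features standing in for a learned classifier: total number of
    activations, and how many distinct faces were touched in the window."""
    total = sum(sum(frame) for frame in window)
    faces = sum(any(frame[f] for frame in window) for f in range(len(window[0])))
    return total, faces

# 4 ticks x 3 faces of toy binary sensor data:
stream = [[0, 1, 0], [1, 1, 0], [0, 0, 0], [0, 0, 1]]
wins = make_windows(stream, win=2, hop=2)
print(len(wins))                  # 2
print(simple_features(wins[0]))   # (3, 2)
```

Preserving both axes of the window matters: a stroke and a hit can produce the same total activation but very different spatial-temporal patterns, which is why the classifier was designed around the full (time × face) structure.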

We submitted the results to the UR 2020 conference.

Result & Impact

We reported Shelly’s design process to the HRI 2018 conference and won first place in the Student Design Competition. I attended the conference myself to demonstrate our module prototypes and explain Shelly’s concept and design procedure.

Me at the Student Design Competition poster session at HRI 2018 in Chicago.

The originality and significance of Shelly were also noticed by the media. The major press outlets that covered Shelly are listed below:

Shelly was also exhibited at the Gwacheon National Science Museum in the summer of 2018, and it is now on display at KidZania Seoul until the end of this year.

Here is a video summary of the research with Shelly:

 
