Oh so . . . . Huggable!


It has skin.  It responds to sensation.  It has visual memory and reacts to the expression on your face.  It is lifelike in the truest sense.  But it is not alive.  It serves the same function as a companion animal.  But it is not an animal. 

It is the MIT Media Lab’s latest robot, “Huggable.”  In its current manifestation, Huggable is a teddy bear loaded with physical sensors, and the brain that links them all is an elaborate neural network.  The software that makes Huggable tick is built on the same kind of computational mathematics that powers many operations-management tools, medical-diagnosis classification systems, tricky resource-allocation calculations, and predictive models for many other kinds of decision making under uncertainty.
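To give a flavor of the kind of classifier such a network might implement, here is a minimal sketch in Python: a tiny feed-forward network that maps a vector of skin-sensor readings to one of nine sensation categories (the number Huggable is said to distinguish).  The sensor count, layer sizes, and random weights are purely illustrative assumptions, not Huggable’s actual design.

```python
import numpy as np

# Illustrative sketch only: a tiny feed-forward network that maps raw
# touch-sensor readings to one of nine sensation categories.  The sensor
# count, layer sizes, and weights are assumptions, not Huggable's design.

rng = np.random.default_rng(0)

N_SENSORS = 32       # assumed number of skin-sensor channels
N_HIDDEN = 16        # assumed hidden-layer size
N_SENSATIONS = 9     # the nine distinguishable types of sensation

# Randomly initialized weights stand in for a trained model.
W1 = rng.normal(scale=0.1, size=(N_SENSORS, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_SENSATIONS))
b2 = np.zeros(N_SENSATIONS)

def classify_touch(sensor_readings: np.ndarray) -> int:
    """Forward pass: sensor vector -> index of the most likely sensation."""
    hidden = np.tanh(sensor_readings @ W1 + b1)   # nonlinear feature layer
    logits = hidden @ W2 + b2
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                          # softmax over 9 classes
    return int(np.argmax(probs))

# Example: classify one frame of (simulated) skin-sensor data.
frame = rng.uniform(0.0, 1.0, size=N_SENSORS)
print("Predicted sensation class:", classify_touch(frame))
```

In practice the interesting work is in training such a network on labeled touch data; the sketch above only shows the shape of the computation that turns sensor input into a sensation label.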

But Huggable is not about business or sensible decision making.  Huggable is about feeling.  This sensitive machine, which can distinguish among nine different types of sensation, is designed to produce human sentiment.  According to the robot’s inventors at MIT, the perfect use for Huggable is as a go-between for grandparent and grandchild.  Over a Web link, a grandparent can see and hear a child through the robot’s neural-network perceptions.  Art imitates life, or life imitates art?
