A team from the University of Antwerp (Belgium)
is developing a 3D robotic arm that, they
say, can function as a sign language
interpreter. Really? Deaf people depend on
interpreters' facial expressions and body
language, along with the signs themselves,
to follow a spoken conversation. Can a
robot convey facial expressions and body
language?