We have covered many cool robotic arms here in the past, but not many of them can be controlled with facial gestures. Bolin Gao, Justin Yang, and Robinson Yuan are working on one that can. The control assembly consists of a power supply, a motor, and an Arduino board. The video below shows the robotic arm in action:
The developers relied on an Emotiv EEG headset for this demo. As you can tell, it is not a finished product yet, but the idea is quite intriguing.