This paper presents the development of a system that operates a robotic arm to deliver an object based on the facial expression of a human standing in front of the robot, demonstrating real-time emotion recognition for physical human-robot interaction. To achieve this, a convolutional neural network (CNN)-based model was developed to identify emotions in real time. The robotic arm control was implemented on an embedded NVIDIA Jetson Nano computer with a web camera, using the OpenCV, ROS, and TensorFlow libraries. Trained and validated on a dataset of 26.6k face photos from an emotion detection database, the emotion detection model achieved an accuracy of 93.5% (an error of 6.5%). The final real-time prototype achieved a testing accuracy of 94% (an error of 6%). This proof of concept suggests that more advanced applications harnessing user emotions could be built in the near future.
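For illustration, the real-time recognition loop described here (OpenCV capture, face detection, CNN classification with TensorFlow) could be sketched as below. This is a minimal sketch under assumptions, not the authors' implementation: the model file name "emotion_model.h5", the 48x48 grayscale input shape, the emotion label ordering, and the use of a Haar-cascade face detector are all hypothetical, and the ROS publishing step is only indicated in a comment.

```python
# Minimal sketch of a real-time emotion recognition loop (assumptions:
# a trained Keras model saved as "emotion_model.h5" with 48x48 grayscale
# input, and a label list matching the training classes; both are
# hypothetical, as is the mapping from emotion to an arm command).
import cv2
import numpy as np
import tensorflow as tf

# Assumed label ordering; the paper's actual classes may differ.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

model = tf.keras.models.load_model("emotion_model.h5")  # hypothetical file
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # web camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Crop the face region, resize, and normalize for the CNN.
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        roi = roi.astype("float32") / 255.0
        probs = model.predict(roi[None, :, :, None], verbose=0)[0]
        emotion = EMOTIONS[int(np.argmax(probs))]
        # In the full system, the detected emotion would be mapped to an
        # arm command, e.g. published on a ROS topic for the manipulator.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, emotion, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow("emotion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```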