Gácsi et al. presented humans with short video sequences in which a PeopleBot robot and a dog displayed behaviours corresponding to five emotional states (joy, fear, anger, sadness, and a neutral state) in a neutral environment. The robot's actions were developed on the basis of expressive dog behaviours described in previous studies of dog–human interaction. Participants spontaneously attributed emotional states to both the robot and the dog, and they successfully matched all dog videos and all robot videos with the correct emotional state. The authors argue that this approach is a promising model for developing believable and easily recognisable emotional displays for non-humanoid social robots.