Thursday, April 10, 2014

Both the static and passive emotion modes are complete!

Static Mode: Our robot uses short, impersonal phrases and a 0-based board (squares 0-8) to play. Events are announced, but not reacted to.
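A minimal sketch of what the static mode's announcements might look like; the board representation, function names, and phrases here are illustrative assumptions, not our exact code:

```python
# Sketch of static-mode output on a 0-based board (squares 0-8).
# The phrasing and helpers are illustrative, not the project's code.

def render_board(board):
    """Return the 3x3 board as text; cells are indexed 0-8."""
    rows = [board[i:i + 3] for i in range(0, 9, 3)]
    return "\n".join(" ".join(row) for row in rows)

def announce_move(square, mark):
    """Short, impersonal announcement used in static mode."""
    return "%s placed on square %d." % (mark, square)

board = list("012345678")   # empty board shows its 0-based indices
board[4] = "X"              # human takes the center (square 4)
print(announce_move(4, "X"))
print(render_board(board))
```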

Passive Mode: Our board uses boxes 1-9; the robot reacts to the outcome of the game, uses personalized phrases, and simulates thinking. We are currently writing a bit of strategy to make it play better: block if the opponent has two X's aligned, and make the winning move if the robot gets two O's aligned.
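The block/win strategy can be sketched as follows. This is a simplified version assuming the board is a list of nine cells holding "X", "O", or None; the function names are ours, not the project's:

```python
# Win/block strategy sketch: complete our own line of two O's first,
# otherwise block the opponent's line of two X's.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def find_completing_move(board, mark):
    """Return the index that gives `mark` three in a row, or None."""
    for line in LINES:
        values = [board[i] for i in line]
        if values.count(mark) == 2 and values.count(None) == 1:
            return line[values.index(None)]
    return None

def choose_move(board):
    """Win if the robot (O) has two aligned, else block the human (X),
    else take the first open square."""
    win = find_completing_move(board, "O")
    if win is not None:
        return win
    block = find_completing_move(board, "X")
    if block is not None:
        return block
    return next(i for i, cell in enumerate(board) if cell is None)
```

For example, with X on squares 0 and 1 and everything else open, `choose_move` blocks at square 2; if O already holds 0 and 4, it takes 8 to win instead.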

Active Mode: still needs game-play commentary (workable from the strategy) and a few gestures to seem more human-like.

After this we will begin our IRB approved study!
The speech issue is all fixed! The logic was broken into separate modules, with a global board passed around during game-play. This not only helped separate our logic but also let us use the correct processes to kill the listening process.
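A rough sketch of the fix, assuming the listener runs in its own process that the game loop terminates before the robot speaks; the names and structure here are illustrative, not our actual modules:

```python
import multiprocessing
import time

def listen_loop():
    """Stand-in for the speech-recognition loop (hypothetical)."""
    while True:
        time.sleep(0.05)

def speak_without_self_hearing(say):
    """Terminate the listener before speaking, so the robot cannot
    mistake its own audio for the human player's input."""
    listener = multiprocessing.Process(target=listen_loop)
    listener.start()
    # ... game-play happens here while the listener runs ...
    listener.terminate()        # stop listening before we speak
    listener.join()
    say("It is your turn.")
    return listener.is_alive()  # False: the listener really stopped

spoken = []
alive = speak_without_self_hearing(spoken.append)
```

Keeping a handle on the listener process is what makes this work: once the logic lives in separate modules, the game loop knows exactly which process to kill.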

Wednesday, March 26, 2014

While we continue work on our system, we have also decided on a topic for our research. We will be looking at how the enjoyability of a simple game is affected by the humanness of a robotic opponent, in terms of emotion and personality.

We will have three basic levels of emotion: static, passive, and active. Static emotion will be the generic robotic game-play with no emotion added: the robot will make moves instantly and will not make any additional comments. Passive emotion is the level where the robot passively engages the user; it accomplishes this by taking time to perform actions and by reacting to the end result of the game. Active emotion will mimic human game-play attributes: in addition to the passive-emotion behaviors, the robot will make comments during game-play that reflect the current situation.
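One way to organize the three levels is as a small behavior table the game loop consults. The keys and flag names below are our own sketch, not the project's actual code:

```python
# Illustrative mapping of emotion levels to behaviors.
EMOTION_LEVELS = {
    "static":  {"thinking_delay": False, "react_to_result": False,
                "in_game_comments": False},
    "passive": {"thinking_delay": True,  "react_to_result": True,
                "in_game_comments": False},
    "active":  {"thinking_delay": True,  "react_to_result": True,
                "in_game_comments": True},
}

def behaviors_for(level):
    """Look up the behavior flags for a given emotion level."""
    return EMOTION_LEVELS[level]
```

Each level strictly adds behaviors to the one below it, which matches how active emotion builds on the passive actions.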

Thursday, February 27, 2014

tts.say("It's Something")

Here is a picture of the board output from our debug window:


Playing First Tic Tac Toe Game!


Unfortunately the video was compressed during the upload, and the text is not readable.
Speech recognition was harder to work with in Python than anticipated! Our robot listens, but we cannot find the process to kill so it can stop listening. As a result, it believes its own audio is the human player's input during their turn. We worked around this problem by muting the robot and having the dialogue displayed in a simulator in Choregraphe.