The use of computers for translating human affective communication into symbolic form, and for conveying rudimentary simulated emotions, has been little explored. In this short article we introduce an emotion reasoning engine called the Affective Reasoner, based in part on the ideas of Ortony et al. (1988), along with its recent multi-media extensions, with which we attempt to address this problem. In the paper we suggest that, for certain tasks, users may be most fully engaged with the computer when we take advantage of what appear to be innate human tendencies toward social, and emotional, interchange. We discuss two preliminary areas of exploration, based on speech recognition technology, in which we develop the spoken communication lexicon between user and computer. The first of these has to do with parsing emotional inflection in simple human utterances. The second has to do with interactively extending a base lexicon of spoken phrases that includes 198 emotion words, as well as tokens describing relationship, mood, and emotional intensity, so that users may add simple non-emotion tokens of their own choosing. This allows them to communicate about emotion situations, in diverse domains, without programmer intervention. Lastly we discuss emotionally expressive channels the multi-media computer has at its disposal. Among these are rudimentary emotionally inflected speech; indexed sub-second access to affect-inducing, and enhancing, music; schematic facial expression supporting over 60 expressions, as well as over 3000 dynamically constructed morphs; and explicit emotion content in utterances generated by the underlying emotion engine.
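The user-extensible lexicon described above might be sketched as follows. This is a minimal illustration, not the paper's implementation: the class name, the sample tokens, and the category labels are all hypothetical, standing in for the base lexicon of emotion, relationship, mood, and intensity tokens plus user-added domain tokens.

```python
class SpokenLexicon:
    """Hypothetical sketch of a base spoken-phrase lexicon that users
    can extend with non-emotion tokens of their own choosing."""

    def __init__(self):
        # A few illustrative entries; the actual system's base lexicon
        # includes 198 emotion words plus relationship, mood, and
        # intensity tokens.
        self.tokens = {
            "remorse": "emotion",
            "gloating": "emotion",
            "friend": "relationship",
            "cheerful": "mood",
            "very": "intensity",
        }

    def add_token(self, word, category="domain"):
        """Register a simple non-emotion token interactively, so users
        can talk about emotion situations in a new domain without
        programmer intervention."""
        self.tokens.setdefault(word, category)

    def lookup(self, word):
        """Return the token's category, or None if it is unknown."""
        return self.tokens.get(word)


lex = SpokenLexicon()
lex.add_token("touchdown")        # user extends lexicon to a sports domain
print(lex.lookup("touchdown"))    # domain
print(lex.lookup("remorse"))      # emotion
```

The point of the sketch is only the division of labor: the programmer supplies the affect vocabulary once, and the user supplies domain vocabulary interactively.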
The current implementation runs on an IBM (compatible) PC while maintaining sub-second response times for (spoken) user utterances, for dynamic generation of new content in spoken text, for morphed changes in facial expression, and for retrieval and presentation of affect-bearing music, constraints we consider essential for supporting plausible affective interaction with the user.
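One way to meet such a sub-second budget is to index all expressive assets in memory ahead of time so that responding reduces to a single lookup. The sketch below assumes this approach; the index contents, file paths, and function name are invented for illustration and are not taken from the paper.

```python
import time

# Hypothetical in-memory index from emotion label to expressive assets
# (music clip and facial-expression morph); paths are illustrative only.
MEDIA_INDEX = {
    "joy":    {"music": "clips/joy_theme.wav",  "face": "morphs/smile_042"},
    "sorrow": {"music": "clips/sorrow_cue.wav", "face": "morphs/frown_017"},
}


def respond(emotion):
    """Fetch all assets for an emotion in one dictionary lookup,
    timing the retrieval against the sub-second budget."""
    start = time.perf_counter()
    assets = MEDIA_INDEX.get(emotion, {})
    elapsed = time.perf_counter() - start
    return assets, elapsed


assets, elapsed = respond("joy")
print(assets["music"])     # clips/joy_theme.wav
print(elapsed < 1.0)       # True: lookup is far under the budget
```

Under this assumption the expensive work (digitizing music, building morph targets) happens offline, and interaction time pays only for retrieval.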