Reads a near-real-time stream of tweets filtered by a given keyword. Any emoji characters are extracted from each tweet, and its emotional sentiment is analysed. The emojis are shown to users in a continuous feed, while the sentiment is plotted on a moving graph and converted to piano "music" that suits the mood of each tweet.
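The per-tweet processing step (emoji extraction plus a sentiment score) could be sketched roughly as below. This is a minimal illustration, not the project's actual implementation: the emoji character ranges are a simplified subset of Unicode's pictograph blocks, and the lexicon-based `sentiment_score` is a toy placeholder for whatever sentiment analyser the project really uses.

```python
import re

# Simplified subset of the Unicode emoji/pictograph blocks (illustrative only).
EMOJI_PATTERN = re.compile(
    "["
    "\U0001F300-\U0001F5FF"  # symbols & pictographs
    "\U0001F600-\U0001F64F"  # emoticons
    "\U0001F680-\U0001F6FF"  # transport & map symbols
    "\u2600-\u27BF"          # misc symbols & dingbats
    "]"
)

# Toy sentiment lexicon -- a real analyser would be far richer than this.
POSITIVE = {"love", "great", "happy", "awesome"}
NEGATIVE = {"hate", "awful", "sad", "terrible"}

def extract_emoji(text):
    """Return every emoji character found in a tweet, in order."""
    return EMOJI_PATTERN.findall(text)

def sentiment_score(text):
    """Crude lexicon-based score in [-1, 1]: +1 fully positive, -1 fully negative."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return score / len(words)

tweet = "I love this hackathon \U0001F600\U0001F389"
print(extract_emoji(tweet))   # the two emoji characters
print(sentiment_score(tweet)) # positive score
```

The emoji feed would render the `extract_emoji` results, while the graph and piano mapping would consume the stream of `sentiment_score` values.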
Initially created at MLH Launch Hack by Mark Ormesher, Fares Alaboud, Mustafa al-Bassam and Kristin Kasavetova.