I was interested in how Natural Language Processing works and was trying to learn how an actual workflow comes together. The current social networking sites would be among the best places to get some interesting information from. Then the idea struck me: why don't I just combine them?

The idea is basically simple. As a preprocessing step, create a classifier that labels tweets as positively or negatively biased (basically, do sentiment analysis on the tweets). Then, using the Twitter streaming API, iterate over each incoming tweet and classify it into its respective class. The location details are also taken from the metadata of each incoming tweet. The final step is the visualisation part, which plots the details computed above, such as location and sentiment class.
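A rough sketch of that classify-and-collect loop might look like the following. This is not the actual project code: the tweet dict shape, the `classify_sentiment` helper, and the toy keyword rule standing in for the real trained classifier are all my assumptions for illustration.

```python
# Hypothetical sketch of the classify-and-collect loop described above.
# classify_sentiment() is a stand-in for the real trained model.

def classify_sentiment(text):
    # Toy keyword rule in place of the actual classifier.
    positive_words = {"love", "great", "happy", "awesome"}
    return "positive" if set(text.lower().split()) & positive_words else "negative"

def process_stream(tweets):
    # For each incoming tweet, record its sentiment class and the
    # location pulled from the tweet metadata, for the visualisation step.
    points = []
    for tweet in tweets:
        label = classify_sentiment(tweet["text"])
        coords = tweet.get("coordinates")  # may be None if no geo data
        if coords is not None:
            points.append({"sentiment": label, "coordinates": coords})
    return points

sample = [
    {"text": "I love this city", "coordinates": (9.93, 76.27)},
    {"text": "terrible traffic today", "coordinates": (28.61, 77.21)},
    {"text": "no location here", "coordinates": None},
]
print(process_stream(sample))
```

Tweets without coordinates are simply skipped here, since they cannot be placed on the map anyway.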

In the first prototype I used TfidfVectorizer for feature extraction and MultinomialNB for binary classification; the scikit-learn library was used for both tasks. For accessing the Twitter API, I made use of the Tweepy Python library. The visualisation part was done using the Basemap extension for matplotlib.
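That feature-extraction and classification setup can be sketched with a few lines of scikit-learn. The tiny labelled corpus below is made up for illustration; the real prototype would train on an actual labelled tweet dataset.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy labelled tweets standing in for the real training data.
texts = [
    "I love this so much", "what a great day", "awesome work everyone",
    "I hate waiting", "this is terrible", "worst service ever",
]
labels = ["positive", "positive", "positive",
          "negative", "negative", "negative"]

# TfidfVectorizer turns raw text into TF-IDF features;
# MultinomialNB then does the binary sentiment classification.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["love love awesome", "worst worst terrible"]))
```

Chaining the two in a `Pipeline` keeps the vectorizer's vocabulary and the classifier fitted together, so new tweets can be classified with a single `predict` call.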

Note 1: I am not sure whether this has been done before by someone else. This is only meant as a hobby project.

Note 2: I’ll update this post soon.


I restarted working on this earlier last week. My first aim was to make it into a web app. Since I had a little prior exposure to Flask, and since this will only be a small application, I preferred it for development. In the initial phase, I kept all the other components as they were and just ported the UI from matplotlib to the Flask platform.
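The ported UI could be served by something like the minimal Flask app below. The route name, the JSON shape, and the sample points are my assumptions, not the actual Emolytics code; the idea is just that the front end now fetches the classified, geolocated tweets over HTTP instead of drawing them with matplotlib.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the classified, geolocated tweets the map view renders.
POINTS = [
    {"sentiment": "positive", "lat": 9.93, "lon": 76.27},
    {"sentiment": "negative", "lat": 28.61, "lon": 77.21},
]

@app.route("/points")
def points():
    # The browser-side map fetches this JSON and draws the markers,
    # replacing the old matplotlib rendering.
    return jsonify(POINTS)

if __name__ == "__main__":
    app.run(debug=True)
```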

Later I decided to move from regular threading to Redis queues for the job-processing tasks. And instead of working directly with MongoDB wrappers, this time I went with SQLAlchemy and an SQLite DB. While using PyMongo (the Python wrapper for MongoDB), many things had to be done manually, but with SQLAlchemy the database migrations and so on are handled by the module. All the models were defined with "Model" from the SQLAlchemy module as the base class, which takes care of mapping the model classes to database tables.
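A sketch of how such a model class maps to a table is below. The `Tweet` model and its columns are hypothetical, and I use plain SQLAlchemy's declarative base with an in-memory SQLite database for illustration; the app itself would point at a file-backed SQLite DB.

```python
from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Tweet(Base):
    # Hypothetical model: each instance maps to a row in the "tweets" table.
    __tablename__ = "tweets"
    id = Column(Integer, primary_key=True)
    text = Column(String, nullable=False)
    sentiment = Column(String)
    lat = Column(Float)
    lon = Column(Float)

# In-memory SQLite for illustration only.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)  # turns the model classes into tables

with Session(engine) as session:
    session.add(Tweet(text="I love this", sentiment="positive",
                      lat=9.93, lon=76.27))
    session.commit()
    print(session.query(Tweet).count())
```

`Base.metadata.create_all(engine)` is the step that emits the `CREATE TABLE` statements for every model subclassing the declarative base.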

I am planning on writing a more detailed post on how Emolytics works. I guess writing that post will also help me review what I have done to date.