Designing, Developing and Deploying resilient data pipelines using Python and Docker-Compose
I have developed data pipelines in Python that ingest huge volumes of unstructured events, process them, and store them in NoSQL solutions like Redis, both for fun and profit. Deploying aggressively while maintaining the uptime of the system and its various components was a key metric. Having learned from mistakes and failures, I wish to share the story of how Docker made my life easy and sane, and how it can do the same for yours.

With the advent of Docker, the bridge between development and deployment has been wonderfully shortened. This talk centers on orchestrating different Docker containers using docker-compose and how it facilitates the design of a whole system architecture, taking a data pipeline as the use case. After a first overview of docker-compose, I will share my experience of developing a data pipeline. In the end, a short live demo (a hack session covering design and development) will be presented, so you can grasp how to realize this whole approach and hopefully take it away to your production systems.
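To make the idea concrete, here is a minimal docker-compose sketch of the kind of pipeline the talk describes: a Python worker consuming events and writing results to Redis. The service names, image tags, and `./worker` build directory are illustrative assumptions, not the talk's actual setup.

```yaml
# Hypothetical docker-compose.yml for a small event pipeline.
# Names, versions, and paths are illustrative assumptions.
version: "3.8"

services:
  redis:
    image: redis:7-alpine
    restart: unless-stopped      # keep the store up across failures
    volumes:
      - redis-data:/data         # persist data between restarts

  worker:
    build: ./worker              # assumed directory containing a Dockerfile
    restart: on-failure          # resilience: restart crashed workers
    depends_on:
      - redis
    environment:
      REDIS_URL: redis://redis:6379/0

volumes:
  redis-data:
```

With a file like this, `docker-compose up -d` brings the whole pipeline up in one command, and `docker-compose up -d --scale worker=4` is one way to scale out the consumers.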