
Deploying Python Application on Docker & Kubernetes

by Gursimran | July 14, 2017 |  Categories -  Docker, Kubernetes, Python, Microservices


 

Overview

 

In this post, we'll share how you can develop and deploy a Python application using Docker and Kubernetes, and how to adopt DevOps in existing Python applications.

 

Prerequisites

To follow this guide, you need the following:

 

 

  • Kubernetes is an open source platform that automates container operations, and Minikube is the best option for testing Kubernetes in a local environment.

 


 

  • Kubectl is the command-line interface for managing a Kubernetes cluster, either remotely or locally. To configure kubectl on your machine, follow the kubectl setup documentation.

 

  • Shared Persistent Storage is permanent storage that we attach to Kubernetes containers. We will be using CephFS as a persistent data store for Kubernetes container applications.

 

  • Application Source Code is the source code that we want to run inside a Kubernetes container.

 

  • Dockerfile contains all the instructions used to build the Python application image.

 

  • The Registry is an online store for container images.

 

The below-mentioned options are a few of the most popular registries.

 

1. Private Docker Hub

2. AWS ECR

3. Docker Store

4. Google Container Registry

 

Dockerfile

 

The below-mentioned code is a sample Dockerfile for a Python application, in which we are using a Python 2.7 development environment.
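As a minimal sketch, a Dockerfile along these lines works for a pip-based project; requirements.txt and app.py are placeholder names for your own dependency file and entry point:

FROM python:2.7
# Work inside /app in the container
WORKDIR /app
# Install dependencies first so Docker layer caching can reuse this step
COPY requirements.txt /app/
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the application source code
COPY . /app
# Port the application listens on (adjust to your app)
EXPOSE 5000
# Start the application; app.py is a placeholder entry point
CMD ["python", "app.py"]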

 

 


 

Building Python Docker Image

 

The below-mentioned command will build your application container image.
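For example, assuming the Dockerfile is in the current directory and the image is called python-app (a placeholder name):

docker build -t python-app:latest .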

 

 

Publishing Container Image

 

To publish the Python container image, we can use different private/public registries like Docker Hub, AWS ECR, Google Container Registry, or a private Docker registry.

 

  • Adding Container Registry to Docker Daemon

 

If you are using a Docker registry other than Docker Hub to store images, then we need to add that container registry to our local Docker daemon and to the Docker daemons on the Kubernetes nodes.

 

You must have the following in place to follow the next steps.

 

 

Now we need to create a “daemon.json” file in the below-mentioned location.

 

 

And add the following content to it.
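As a sketch: on most Linux systems the file lives at /etc/docker/daemon.json, and for a private registry served without trusted TLS the usual entry is insecure-registries (replace the host and port with your own registry):

{
  "insecure-registries": ["<registry-host>:<registry-port>"]
}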

 

 

Now run the following commands to reload systemd and restart the Docker daemon.
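On a systemd-based host these are:

sudo systemctl daemon-reload
sudo systemctl restart docker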

 

 

To verify that your container registry has been added to the local Docker daemon, use the below-mentioned steps.
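A simple check is docker info, which prints the daemon configuration, including any registries added above:

docker info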

 

 

In the output of the above command, you will see your container registry listed like this:
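For a registry added under insecure-registries, the relevant part of the docker info output looks roughly like this (host and port are placeholders):

Insecure Registries:
 <registry-host>:<registry-port>
 127.0.0.0/8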

 

 

  • Pushing container Images to Registry

 

I'm using AWS ECR for publishing container images.

 

You must have an AWS account with Amazon ECR permissions. Create an AWS ECR repository using the below-mentioned link.

 

http://docs.aws.amazon.com/AmazonECR/latest/userguide/repository-create.html

 

After creation, you will get the registry URL, username, and password from your own AWS account.

 

Here is a shell script that registers your AWS credentials for Amazon ECR on your local system so that you can push images to AWS ECR.
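A minimal sketch with a current AWS CLI (v2) is shown below; the region and account ID are placeholders you must replace:

#!/bin/bash
# Placeholders: set these to your own AWS region and account ID
AWS_REGION="us-east-1"
AWS_ACCOUNT_ID="<aws-account-id>"

# Fetch a temporary ECR password and log the local Docker client in to the registry
aws ecr get-login-password --region "$AWS_REGION" | \
  docker login --username AWS --password-stdin "${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com"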

 

 

Now we need to retag the Python application image and push it to the container registry (AWS ECR in this example).

 

To retag the application container image:
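For example, assuming the local image is python-app:latest and the ECR repository is also named python-app (both placeholder names):

docker tag python-app:latest <aws-account-id>.dkr.ecr.<region>.amazonaws.com/python-app:latest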

 

 

To push the application container image:
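Push the retagged image to the same repository URL:

docker push <aws-account-id>.dkr.ecr.<region>.amazonaws.com/python-app:latest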

 

 

Configure Persistent Volume (optional)

 

A persistent volume is only required if your application has to save data outside of a database, such as documents, images, or videos. In that case, we need to use a persistent volume type that Kubernetes supports, such as AWS EBS, CephFS, GlusterFS, Azure Disk, or NFS.

 

Today I will be using Ceph RBD for persistent data in Kubernetes containers.

 

We need to create two files, named persistent-volume.yml and persistent-volume-claim.yml.

 

  • Persistent Volume

Below I have added content for persistent-volume.yml
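A minimal RBD-backed PersistentVolume sketch is shown below; the monitor address, pool, image, user, and secret name are placeholders for your own Ceph cluster:

apiVersion: v1
kind: PersistentVolume
metadata:
  name: python-app-pv
spec:
  capacity:
    storage: 10Gi
  accessModes:
    - ReadWriteOnce
  persistentVolumeReclaimPolicy: Retain
  rbd:
    monitors:
      - "10.16.154.78:6789"    # Ceph monitor address (placeholder)
    pool: rbd
    image: python-app-data     # RBD image name (placeholder)
    user: admin
    secretRef:
      name: ceph-secret        # Kubernetes secret holding the Ceph keyring
    fsType: ext4
    readOnly: false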

 

 

  • Persistent Volume Claim

Add the below-mentioned code to persistent-volume-claim.yml.
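A matching claim could look like this (the name and requested size are placeholders, and the size must not exceed the volume above):

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: python-app-pvc
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 10Gi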

 

 

  • Adding Claims to Kubernetes

Use the below-mentioned commands to add the persistent volume and claim to the Kubernetes cluster.
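With the two files created above:

kubectl create -f persistent-volume.yml
kubectl create -f persistent-volume-claim.yml

# Confirm that the claim is bound to the volume
kubectl get pv,pvc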

 

 

Creating Deployment Files for Kubernetes

 

Applications are deployed on Kubernetes with ease using deployment and service files, written in either JSON or YAML format.

 

  • Deployment File

The following content is for the “<name of application>.deployment.yml” file of the Python container application.
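A minimal sketch follows, assuming the application is called python-app, listens on port 5000, and mounts the claim created earlier (drop the volume sections if you skipped persistent storage); the image URL is the one pushed to your registry:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: python-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: python-app
  template:
    metadata:
      labels:
        app: python-app
    spec:
      containers:
        - name: python-app
          image: <aws-account-id>.dkr.ecr.<region>.amazonaws.com/python-app:latest   # placeholder registry URL
          ports:
            - containerPort: 5000
          volumeMounts:
            - name: app-data
              mountPath: /app/data    # where the application writes its files
      volumes:
        - name: app-data
          persistentVolumeClaim:
            claimName: python-app-pvc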

 

 

  • Service File

The following content is for the “<name of application>.service.yml” file of the Python container application.
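Again a sketch, exposing the deployment above as a NodePort service; the nodePort value is optional, and if omitted Kubernetes picks one from its NodePort range:

apiVersion: v1
kind: Service
metadata:
  name: python-app
spec:
  type: NodePort
  selector:
    app: python-app
  ports:
    - port: 5000         # service port inside the cluster
      targetPort: 5000   # container port
      nodePort: 30500    # optional fixed NodePort (must be within the cluster's range)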

 

 

Running Python Application on Kubernetes

 

The Python container application can be deployed either through the Kubernetes Dashboard or with Kubectl (command line).

 

I'm using the command line, which you can also use on a production Kubernetes cluster.
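Assuming the file names used above:

kubectl create -f python-app.deployment.yml
kubectl create -f python-app.service.yml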

 

 

Now we have successfully deployed the Python application on Kubernetes.

 

Verification

 

We can verify the application deployment using either Kubectl or the Kubernetes Dashboard.

 

The below-mentioned command shows the pods of your application along with their status, for example Running, Pending, or Terminating.
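For example:

kubectl get pods

# Or, to also see the node each pod is scheduled on:
kubectl get pods -o wide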

 

 

Result of the above command: information about the Kubernetes pods.

Testing

 

Get the external NodePort using the below-mentioned command. The NodePort is allocated from the cluster's NodePort range, which is 30000 to 32767 by default.
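For example, assuming the service is named python-app as in the sketches above:

kubectl get svc python-app

# The NodePort appears in the PORT(S) column, e.g. 5000:30500/TCP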

 

 

Launch a web browser and open either of the below-mentioned URLs.

 

  • http://<kubernetes master IP address>:<application service NodePort>

  • http://<cluster IP address>:<application service port>

 

Troubleshooting

 

  • Check the status of the pods (see the commands below).

  • Check the logs of the pods/containers.

  • Check the service port status.

  • Check the requirements/dependencies of the application.
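A few kubectl commands that cover the first three checks; the pod name is a placeholder taken from the output of kubectl get pods, and python-app is the service name used in the sketches above:

# Status of pods
kubectl get pods

# Logs of a specific pod/container
kubectl logs <pod-name>

# Events and configuration details, useful for image-pull or scheduling problems
kubectl describe pod <pod-name>

# Service and NodePort status
kubectl describe svc python-app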

 

How Can XenonStack Help You?

 

 

Our DevOps Consulting Services provide DevOps Assessment and Audit of your existing Infrastructure, Development Environment, and Integration.

 

We provide end-to-end Infrastructure Automation, Continuous Integration, and Continuous Deployment with automated Testing and Build processes. Our DevOps Solutions enable a Continuous Delivery Pipeline for Microservices and Serverless Computing on Docker, Kubernetes, and Hybrid and Public Cloud.

 

Our DevOps Professional Services include -

 

  • Single Click Deployment
  • Continuous Integration and Continuous Deployment
  • Support Microservices and Serverless Computing - Docker and Kubernetes
  • Deploy On-Premises, Public or Hybrid Cloud

 

 

Get a 1-hour free assessment for your DevOps strategy. Contact us now.

 

XenonStack Offerings

 

XenonStack is a leading software company in product development and a solution provider for DevOps, Big Data Integration, Real-Time Analytics, and Data Science.

 

Product NexaStack - a Unified DevOps Platform that provides monitoring of Kubernetes, Docker, OpenStack, and Big Data infrastructure, and uses advanced machine learning techniques for Log Mining and Log Analytics.

 

Product ElixirData - a Modern Data Integration Platform that enables enterprises and different agencies to perform Log Analytics and Log Mining.

 

Product Akira.AI is an Automated & Knowledge-Driven Artificial Intelligence Platform that enables you to automate the infrastructure to train and deploy Deep Learning models on Public Cloud as well as On-Premises.


