Django supports using multiple databases in a single project. Let's see how to do that, but first let me describe some use cases where an application might need multiple databases. Why multiple databases? Today we gather a lot of data from users for different purposes; some of it is relational and some is non-relational. A few use cases:

  • Suppose you need to record all the touchpoints of a web page in your application; for this you need a non-relational database to store that data so you can run analytics on it.
  • Read replicas: you set up read replicas of your default database to speed up fetching data from the database.
  • Saving email metadata, like how many emails were sent, the open rate, the error rate, and the links clicked, to measure the engagement of the emails.

Let's see how to set up multiple databases in a Django project.

  1. Add the connection details of the databases to the DATABASES setting in the settings.py of your Django project.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": config.get("database_default", "name"),
        "USER": config.get("database_default", "user"),
        "PASSWORD": config.get("database_default", "password"),
        "HOST": config.get("database_default", "host"),
        "PORT": "3306",
        "CONN_MAX_AGE": 0,
    },
    "replica1": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": config.get("database_replica1", "name"),
        "USER": config.get("database_replica1", "user"),
        "PASSWORD": config.get("database_replica1", "password"),
        "HOST": config.get("database_replica1", "host"),
        "PORT": "3306",
        "CONN_MAX_AGE": 0,
    },
    "mongo": {
        "ENGINE": "djongo",
        "NAME": config.get("mongo_database", "name"),
        "HOST": config.get("mongo_database", "host"),
        "USER": config.get("mongo_database", "user"),
        "PASSWORD": config.get("mongo_database", "password"),
    },
}

Here you can see we define two databases, mongo and replica1, in addition to the default one. After this, you need to tell Django's database routers which app should use which database connection. That is one way to do it; you can also manually decide which database to use while querying.

  2. Now we need to define the router classes that tell Django which database to use. For that we write a class:
class MongoRouter:
    """
    A router to control all database operations on models in the
    analytics and status applications.
    """
    route_app_labels = {'analytics', 'status'}

    def db_for_read(self, model, **hints):
        """Attempts to read analytics and status models go to mongo."""
        if model._meta.app_label in self.route_app_labels:
            return 'mongo'
        return None

    def db_for_write(self, model, **hints):
        """Attempts to write analytics and status models go to mongo."""
        if model._meta.app_label in self.route_app_labels:
            return 'mongo'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        """
        Allow relations if a model in the analytics or status apps is
        involved.
        """
        if (
            obj1._meta.app_label in self.route_app_labels or
            obj2._meta.app_label in self.route_app_labels
        ):
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        Make sure the analytics and status apps only appear in the
        'mongo' database.
        """
        if app_label in self.route_app_labels:
            return db == 'mongo'
        return None

A similar router goes for the replica1 database:

class ReplicaRouter:
    def db_for_read(self, model, **hints):
        """Reads go to replica1."""
        return 'replica1'

    def db_for_write(self, model, **hints):
        """Writes always go to default."""
        return 'default'

    def allow_relation(self, obj1, obj2, **hints):
        """
        Relations between objects are allowed if both objects are
        in the default/replica1 pool.
        """
        db_set = {'default', 'replica1'}
        if obj1._state.db in db_set and obj2._state.db in db_set:
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """All non-mongo models end up in this pool."""
        return True

That's it, now you are ready to use multiple databases in your project, with the queries routed by the router classes you have defined.
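For the routers to take effect they must be registered in settings.py; a minimal sketch, assuming both classes live in a hypothetical myproject/routers.py module:

```python
# settings.py
# Routers are consulted in list order; the first one that returns a
# non-None database alias wins for a given operation.
DATABASE_ROUTERS = [
    "myproject.routers.MongoRouter",
    "myproject.routers.ReplicaRouter",
]
```

You can still bypass the routers for a single query with .using(), for example Document.objects.using('replica1').filter(...).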


#100DaysToOffload #Django #Python

In Django, we can use the concept of abstraction when defining tables that share common columns. We can make any model an abstract model by adding the Meta property abstract = True.

Suppose some column fields are common to all your tables. You can pull them into an abstract class and then simply inherit from it to add those fields to a model, which helps you follow the Don't Repeat Yourself principle. Let's see an example.

class Base(models.Model):
  """Base parent class for all the models"""
  timestamp = models.DateTimeField(blank=True, db_index=True)
  is_active = models.BooleanField(default=True, db_index=True)

  class Meta:
    abstract = True

class OttPlatform(Base):

  name = models.CharField(max_length=200)
  ott_type = models.CharField(max_length=50)

  def __str__(self):
    return self.name

So, this helps you stop duplicating code, but there is one more issue we can handle here. The is_active column is used to mark a row as deleted. In our use case we can't actually delete data from the table, because we need to keep track of changes, so the is_active field helps us with that. But now we have to add an is_active filter to every query.

We can solve this by overriding the manager; let's see how.

# First, define the Manager subclass.
class ActiveOTTManager(models.Manager):
    def get_queryset(self):
        return super().get_queryset().filter(is_active=True)

class OttPlatform(Base):

  name = models.CharField(max_length=200)
  ott_type = models.CharField(max_length=50)

  # The order matters: the first manager defined becomes the default manager.
  objects = models.Manager() # The default manager.
  active_objects = ActiveOTTManager() # The active OTT manager.

  def __str__(self):
    return self.name

# Now you can do OttPlatform.active_objects.all() to get all the active OTT platforms.

So, by overriding the manager we don't have to write an is_active filter in every query.


#100DaysToOffload #django #python

Design by Contract is a software design process that takes a contract-based approach to developing software that does no more and no less than it claims to do.

What is a contract? A contract is a document that defines the rights and responsibilities of both parties involved in an agreement and lists the repercussions if either party fails to abide by it. We have all seen or signed one, like a rental agreement with a landlord or an employment contract that specifies the roles and responsibilities you must fulfill.

We follow a similar process while developing software, where we focus on documenting (and agreeing to) the rights and responsibilities of software modules to ensure program correctness. While writing the contract, these questions will help get things clear:

  • What does the contract expect?
  • What does the contract guarantee?
  • What does the contract maintain?

Software that follows Design by Contract has these conditions specified:

  • Preconditions: conditions that must be true before a routine is called; if they are violated, the routine should never be called.
  • Postconditions: the state the routine guarantees will be achieved once it finishes.
  • Class invariants: conditions the class guarantees are always true from the perspective of a caller, though they may be temporarily broken during the internal processing of a routine.
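A minimal Python sketch of these three conditions, using plain assert statements (the BankAccount class and its rules are hypothetical, just to illustrate the idea):

```python
class BankAccount:
    """A hypothetical account illustrating Design by Contract with asserts."""

    def __init__(self, balance=0):
        self.balance = balance
        self._check_invariant()

    def _check_invariant(self):
        # Class invariant: the balance is never negative.
        assert self.balance >= 0, "invariant violated: negative balance"

    def withdraw(self, amount):
        # Preconditions: the caller must request a positive amount it can afford.
        assert amount > 0, "precondition violated: amount must be positive"
        assert amount <= self.balance, "precondition violated: insufficient funds"
        old_balance = self.balance
        self.balance -= amount
        # Postcondition: exactly `amount` has left the account.
        assert self.balance == old_balance - amount, "postcondition violated"
        self._check_invariant()
        return self.balance
```

The preconditions guard the entry, the postcondition checks the promised result, and the invariant is re-checked whenever control returns to the caller, so a violation crashes early instead of corrupting state silently.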

Why can Design by Contract be a good approach?

  • DBC doesn't require any setup or mocking.
  • With DBC we can define both the success and failure cases.
  • DBC can be used during the design, development, and deployment phases.
  • DBC fits in nicely with the concept of crashing early.

Conclusion: Most of you are probably thinking, do we need another development approach when we already have Test-Driven Development (TDD)? DBC and TDD are different approaches within the broader topic of the software development process. They both have value and both are used in different situations. The DBC approach can be used across design, development, and deployment, and it fits perfectly in a world where we follow the concept of crashing early. So give it a shot; I will be happy to discuss it, you can reach out to me here.


#100DaysToOffload #SoftwareDevelopment #DBC #DesignByContract

Django's Q() object helps define SQL conditions for the database and can be combined with the & (AND) and | (OR) operators. Q() gives flexibility in defining and reusing conditions.

  • Using Q() objects to make AND conditions.
  • Using Q() objects to make OR conditions.
  • Using Q() objects to make reusable conditions.

Using Q() objects to make AND conditions. We can use Q() objects to combine multiple filter conditions into one condition, since chained filter conditions always perform AND operations.

from django.db.models import Q

# Without Q() object
document_obj = Document.objects.filter(created_by=1282).filter(doc_type='purchase_order').filter(edit=0).filter(cancelled=0)

#With Q() object
q_filter_document = Q(created_by=1282) & Q(doc_type='purchase_order') & Q(cancelled=0) & Q(edit=0)

# can also be written as
q_filter_document_another_way = Q(created_by=1282, doc_type='purchase_order', cancelled=0, edit=0)

document_obj = Document.objects.filter(q_filter_document)

Using Q() objects to make OR conditions

from django.db.models import Q

#With Q() object
q_filter_document = Q(created_by=1282) | Q(doc_type='purchase_order')
document_obj = Document.objects.filter(q_filter_document)

Q() to make reusable filter conditions. The best use of Q() objects is reusability: we define a Q() once and can then combine it with different Q() objects with the help of the &, |, and ~ operators.

Let's consider a use case in which the user can generate a report based on certain filters. The user can filter the report based on these values: document_type, is_draft, created_by, document_status.

def get_document_object(document_type, is_draft, created_by, document_status):
    base_query = Q(active=1, cancelled=0, created_by=created_by,
                   document_type=document_type, is_draft=is_draft)

    # based on the condition we combine different Q() objects to filter the table
    if document_status == 'in_progress':
        base_query = base_query & Q(document_status=document_status, completed=0)
    elif document_status == 'completed':
        base_query = base_query & Q(document_status=document_status, completed=1)

    return Documents.objects.filter(base_query)

In Q() objects we can use the same lookups that we use in filter() calls, like the __in, __startswith, and __endswith operators.

Conclusion: Q() objects contribute to clean code and reusability. They help define conditions with the &, |, and ~ relational operators to simplify complex queries.


#django #python #100DaysToOffload

In the previous blog post we discussed the F() Expression; now we will explore more query expressions in Django. To name a few that we will discuss in this post:

  • Func() Expression
  • Subquery Expression
  • Aggregate() Expression

Func() Expression. Func() is the base of all expressions and can be used to create your own custom expression for a database-level function.

# The table that we are using for our query is *Student*, which keeps records of the students for the whole school.

from django.db.models import F, Func
student_obj = Student.objects.annotate(full_name=Func(F('first_name') + F('last_name'), function='UPPER'))

# This will give a student object with a new field, *full_name*, holding the student's name in upper case.

Subquery Expression. Subqueries are like nested conditions in the query filter, and they help you turn a complex query into a clean, concise one. But to use them effectively you need to know the order in which the queries will be executed. While using a Subquery you will also need to know about OuterRef, which is like an F() Expression but points to a value from the parent query. Let's see both Subquery and OuterRef in action.

# You are given a task to get the names of the students whose name starts with *S* and whose fees are due.

from django.db.models import OuterRef, Subquery
fee_objects = Fees.objects.filter(payment_due__gt=0)
student_obj = Student.objects.filter(name__startswith='S').filter(id__in=Subquery(fee_objects.values('student_id')))

# Get the latest remark for each student
remark = Remark.objects.filter(student_id=OuterRef('pk')).order_by('-created_at')
student_obj = Student.objects.annotate(newest_remark=Subquery(remark.values('remark_strl')[:1]))

Aggregate() Expression

An Aggregate expression is a Func expression with a GROUP BY clause applied to the query.

# Get the total number of students enrolled in the *Blind Faith* subject.

from django.db.models import Count
total = Student.objects.filter(subject_name='blind_faith').aggregate(total_count=Count('id'))

Note: All the queries mentioned above in the code are untested. So if you see any typo or a query that does not make sense, feel free to reach out to me at sandeepchoudhary1507[at]gmail[DOT]com.


#100DaysToOffload #django #python

While working from home, one of the issues I faced was that my laptop's charger remained plugged in almost all the time, due to which I had to replace my laptop battery. To deal with the problem I have now written a script that notifies me about the laptop battery charging level if it goes above 85% or below 20%.

#! /bin/bash

while true
do
    battery_level=`acpi -b | grep -o '[0-9]*' | sed -n 2p`
    ac_power=`cat /sys/class/power_supply/AC/online`

    # If the above command raises the error "No such file or directory", try the command below.
    # ac_power=`cat /sys/class/power_supply/ACAD/online`
    if [ $ac_power -gt 0 ]; then
        if [ $battery_level -ge 85 ]; then
            notify-send "Battery Full" "Level: ${battery_level}%"
        fi
    else
        if [ $battery_level -le 20 ]; then
            notify-send --urgency=CRITICAL "Battery Low" "Level: ${battery_level}%"
        fi
    fi
    sleep 120
done


Here are the important commands, which I want to break down to explain what they are actually doing.

  • First is acpi, which tells us about the battery and other ACPI information.
  • grep is used to extract the integer values from the acpi output.
  • sed is used to pick the second value from the grep result.
>> acpi -b
Battery 0: Discharging, 54%, 02:03:37 remaining
>> acpi -b | grep -o '[0-9]*'
0
54
02
03
37
>> acpi -b | grep -o '[0-9]*' | sed -n 2p
54
  • After that, we check whether the charger is plugged in; if it is, we check that the battery level does not exceed the described limit, and if it does, we send a notification.
  • Then we check for the battery-low indication, which sends a notification if the battery level is less than 20%.
  • These conditions are put in a continuous loop that checks again after a sleep time of 120 s.

To make this script run automatically, you have to give it execute permission and specify the execution command in your ~/.profile, then reboot the system.

>> sudo chmod +x /path/to/
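For example, assuming the script is saved at ~/scripts/battery_notify.sh (a hypothetical path, adjust it to yours):

```shell
# give the script execute permission
chmod +x ~/scripts/battery_notify.sh

# then append this line to ~/.profile so the script starts in the
# background on login:
#   ~/scripts/battery_notify.sh &
```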

You can find my notes on shell commands here.

Thanks Shrini for pointing out the issue on Ubuntu 20 with the line ac_power=`cat /sys/class/power_supply/AC/online` :) Cheers!

#100DaysToOffload #automation #scripts

What is the F() Expression? First, let me explain what Query Expressions are: these expressions let you use a value or computation in update, create, filter, order by, annotation, and aggregation operations. An F() object represents the value of a model field or annotated column. It helps you avoid loading the value of the field into Python memory; instead, the value is handled directly in the database query.

How to use the F() Expression? To use the F expression you have to import it with from django.db.models import F and pass the name of the field or annotated column as an argument; it will then refer to the value of that field in the database, without Python ever knowing the value. Let's see some examples.

from django.db.models import F

# Documents is the table which holds the details of the documents submitted by users from the registry portal for GYM membership.

# We need to update the count of the documents submitted by the user with id 10091.

# Without using the F expression
document = Documents.objects.get(user_id=10091)
document.document_counts += 1
document.save()

# Using the F expression
document = Documents.objects.get(user_id=10091)
document.document_counts = F('document_counts') + 1
document.save()

Benefits of the F() Expression.

  • With the help of the F expression we can make our queries clean and concise.
    from django.db.models import F
  Documents.objects.filter(user_id=10091).update(document_counts=F('document_counts') + 1)

   # Here we also gain some performance advantages:
    # 1. All the work is done at the database level, rather than pulling the value from the database into Python memory to do the computation.
    # 2. It saves query hits on the database.
  • The F expression can save you from race conditions. Consider a scenario where multiple users access your database: when both users access the Document object for user 10091, the count value is two. One user increments the value and saves it, and the other user does the same; the value is then saved as three, not four, because both users fetched the value when it was two.

  # User A fetches the document object; the value of document_counts is two.
  document = Documents.objects.get(user_id=10091)
  document.document_counts += 1
  # After the operation the value of document_counts is three.

  # Code running in parallel: User B also fetches the object, and the value of document_counts is two.
  document = Documents.objects.get(user_id=10091)
  document.document_counts += 1
  # After the operation the value of document_counts is three.

  # But the value should actually be four; using the F expression here will save us from this race condition.
  • F expressions are persistent, which means the expression persists after the save operation, so you have to use refresh_from_db() to avoid the persistence.
  document = Documents.objects.get(user_id=10091)
  document.document_counts = F('document_counts') + 1
  document.save()

  document.document_validation = 0
  document.save()

  # This will increase the value of *document_counts* by two rather than one, as the expression persists and calling save() again triggers the increment again.

  • More examples of the F expression in action with filter and annotate queries.
from django.db.models import F

# annotation example
annotate_document = Document.objects.annotate(created_by_full_name=F('created_by_first_name') + F('created_by_last_name'))

# filter example
filter_document = Documents.objects.filter(total_documents_count__gt=F('valid_documents_count'))

That's all about the F expression. It is pretty handy and helps you improve performance by reducing value loading into memory and optimizing query hits, but you have to keep an eye on the persistence issue, which I assume will not be a problem if you write your code in a structured way.


#100DaysToOffload #django #python

Journaling is a great way to keep track of your progress and emotional state, and applying the same journaling principle to managing your finances can be of great help in seeing how money flows in and out of your life :).

So, here we will explore how to journal in Emacs (or any editor of your choice) with the help of the tool Ledger.

Before starting, let's get familiar with the basic terminologies.

  • Assets – It's the money that you have.
  • Liabilities – It's the money that you owe, or you can say debt.

Ledger is double-entry accounting software, which means you have to record both where the money flows in from [such as a savings account or credit card] and where it flows out to [expenditure, investment, shopping], and all these entries should balance each other out, summing to zero; if that's not the case, there is an issue in your entries. The best part of Ledger is that it lets you manage your data in a simple text file, doesn't alter your data, and you keep all your data with you.

So, Ledger reads the simple text file and generates all kinds of reports that you need. Emacs comes into the picture to manage these text files and gives a solid way to manage them with org-mode too, which we will discuss some other time.

Now get ready to set up the journal system.

Installing the Tools

  • Download Ledger on your system based on your OS from here; for the lazy ones on Ubuntu, you can follow the steps given below.
$ sudo add-apt-repository ppa:mbudde/ledger
$ sudo apt-get update
$ sudo apt-get install ledger
  • Open your Emacs editor and then follow these steps.

    • Press Alt-X package-install [Enter Key]
    • Type ledger-mode [Enter Key], this will install the ledger-mode package
    • Open your Emacs config file and paste this snippet. We are telling ledger-mode to activate for files with the .dat extension.

      (use-package ledger-mode
           :ensure t
           :init
           (setq ledger-clear-whole-transactions 1)
           :mode "\\.dat\\'")

Ledger Mode in Action

  • Create a file with the extension .dat and open it in Emacs.
  • Press Ctrl-C Ctrl-A to add an entry to the file.
    • This will ask for the date of the entry; type it and press Enter.
    • Give a nice heading to your ledger entry and add your expense.

Ledger entry example – It's up to you how you want to maintain your journal; here are some example entries to sort out your expenses.
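For instance, a couple of hypothetical entries (the account names and amounts are made up; each transaction's postings must sum to zero, and Ledger infers the elided amount on the last line):

```
2021/03/01 Grocery shopping
    Expenses:Food:Groceries        1500.00 INR
    Assets:Savings

2021/03/05 Mobile recharge
    Expenses:Utilities:Phone        299.00 INR
    Liabilities:CreditCard
```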

You can also plan your budget in Ledger and automate recurring transactions; if you are geeky enough, you can write code that reads the spreadsheet shared by your bank to populate the ledger. Press Ctrl-C Ctrl-O Ctrl-R for report generation; you can find more about reports here.

#100DaysToOffload #financial-freedom #emacs

—date: 2019-07-08 originally posted here

Identity & Access Management lets users manage access control policies for resources by defining who (identity) has what access (roles). Today we will talk about Google Cloud Identity & Access Management: what it is and how to use it.

A Policy in IAM is composed of a binding list, which binds member identities and roles together to limit access to Google Cloud resources.
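A binding list in a policy looks roughly like this (JSON form; the role and member values are illustrative):

```json
{
  "bindings": [
    {
      "role": "roles/pubsub.publisher",
      "members": [
        "user:alice@example.com",
        "serviceAccount:my-app@example-prod.iam.gserviceaccount.com"
      ]
    }
  ]
}
```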

A Member can be of the following types:

  • Google Account: This can be any valid Google account, with gmail.com or any other domain name.
  • Service Account: An account that belongs to an application rather than an individual; you can have as many service accounts as you need for the logical components of your application.
  • Google Group: Google Groups are collections of different Google accounts and service accounts. Every group has a unique email id which can be used to identify members in an IAM policy. The benefit of a group is that if you want to change a user's permissions, you can simply move the user from one group to another rather than changing the permissions of the user directly.
  • G Suite domain: A virtual group of all the accounts created in the organization's G Suite.

Roles, on the other hand, are collections of permissions. A permission is mainly represented as <service>.<resource>.<verb>, for example pubsub.subscriptions.consume. Permissions determine what type of operation can be performed on a resource. Permissions cannot be applied directly to resources; instead, you assign roles, which are groups of different permissions.

In the Google Cloud Platform, roles are of three kinds:

  • Primitive Roles
  • Predefined Roles
  • Custom Roles

Primitive Roles

These are of three types, Owner, Editor, and Viewer, as the names suggest.

  • Viewer has access only to view resources and data.
  • Editor has Viewer permissions + permission to change/edit resources.
  • Owner has Editor permissions + permission to manage all resources and users.

Predefined Roles

These are roles provided by Cloud IAM in addition to the primitive roles. They give more granular access to resources, and they differ based on the different resources in the cloud; you can check these roles over here.

Custom Roles

Cloud IAM lets users define custom roles if the primitive and predefined roles do not fulfill their requirements. There are some pointers to remember while creating custom roles, though: custom roles can be defined at the Organization and Project levels but not at the Folder level, and the user creating them needs a role that includes the iam.roles.create permission.

So now the question is how these roles actually work. As we know, a policy is a binding list that binds members and roles. These policies are attached to resources and enforce access control when the resources are accessed.

Google Cloud policies have a hierarchy: Organization > Folder > Project > Resources. Every resource has exactly one parent and inherits the policy of its parent; any policy assigned to the parent applies to all its children. There is a diagram in the Google Cloud IAM docs which shows what this hierarchy looks like.

Here is an example from the official docs of how the permission hierarchy works:

In the diagram above, topic_a is a Cloud Pub/Sub resource that lives under the project example-prod. If you grant one user the Editor role for example-prod, and grant another user the Publisher role for topic_a, you effectively grant the first user the Editor role for topic_a and the second user the Publisher role.

So here is my effort to explain Google IAM policy in simple words; I hope you find it useful. Please share any feedback or any topic you think I should cover in this post. Till then, Happy Clouding :)


-date: 2019-05-22

Let's say you are working on a project and you depend on code from another repository, which you need in your project.

  • One way is to copy the code from the other repository to yours manually whenever it gets updated, which is not such a good way :sad:
  • Another way is to let the git version control system do it for you, and it's super easy to do :smile:

Let me show you how we can do it.

Fetching from Another Repository

git remote add other <repository_link>
git fetch other
git checkout <your_target_branch>
git checkout -p other/target-branch file_path

If you have multiple files, you just have to change the last checkout statement to

git checkout other/target-branch file_path1 file_path2

But wait, there is one catch here: the path of the file in your repository should mirror (be the same as) the path of the file in the other repository you are fetching from.

Fetching from Same Repository

Now, if you want to fetch files from another branch of the same repository, you just have to do

git checkout other_branch_name file_path1 file_path2

I have to admit that even after three years of working with git, it still excites me that there are a lot of things I do not know about it. If you also have some important time-saving git commands which you feel could save someone else's time, share them in the comments, because sharing is caring :sunglasses: .


Happy Coding