
Python has a built-in function called abs() that is used to find the absolute value of an integer or float value.

The absolute value of a number is its magnitude. To put it in simpler terms, the magnitude of a number tells us how far it is from zero on the number line.

For negative numbers, the minus sign is dropped.


>>> x, y = -10.11, -10
>>> abs(x)
10.11
>>> abs(y)
10

Like most Python built-ins, abs() is backed by a dunder method, __abs__.

>>> y = -10
>>> y.__abs__()
10
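Because abs() delegates to __abs__, your own classes can support it too. A minimal sketch, assuming a made-up 2D Vector class:

```python
import math

class Vector:
    """A hypothetical 2D vector that supports abs()."""

    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __abs__(self):
        # The magnitude of a vector is its distance from the origin
        return math.hypot(self.x, self.y)

print(abs(Vector(3, 4)))  # 5.0
```

abs(Vector(3, 4)) works because Python translates the call into Vector.__abs__().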

Real World Example


Suppose you are given a chart of temperature readings and you need to find the value that is closest to 0.

from typing import List

class TemperatureReader:

    def closest_to_zero(self, readings: List[float]) -> float:
        result = readings[0]
        for reading in readings:
            if abs(reading) < abs(result):
                result = reading
            elif abs(reading) == abs(result):
                # On a tie (e.g. -1.1 and 1.1), prefer the positive value
                result = max(result, reading)
        return result

>>> t = TemperatureReader()
>>> t.closest_to_zero([10.11, -35.11, 22.11, -2.1, -1.1, 1.1])
1.1
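As a side note, the same result can be had with the built-in min() and a key function; the compound key below is my own sketch for reproducing the tie-breaking toward the positive value:

```python
def closest_to_zero(readings):
    # Primary key: distance from zero; secondary key: -reading, so that
    # on a tie (-1.1 vs 1.1) the positive value compares as smaller.
    return min(readings, key=lambda r: (abs(r), -r))

print(closest_to_zero([10.11, -35.11, 22.11, -2.1, -1.1, 1.1]))  # 1.1
```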

Absolute values of other types

  • For complex numbers, abs() returns the magnitude.
>>> num = complex(3, 10)
>>> num
(3+10j)
>>> abs(num)
10.44030650891055
  • For Decimal values it works just like float: it returns the positive value.
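A quick check of that behaviour (Fraction added here as an extra example; it behaves the same way):

```python
from decimal import Decimal
from fractions import Fraction

# abs() simply drops the sign for Decimal and Fraction values
print(abs(Decimal("-10.11")))  # 10.11
print(abs(Fraction(-3, 7)))    # 3/7
```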


How not to do it

If you are doing a fresh setup of UFW, whenever you add a rule like ufw allow 'Nginx HTTP' the status will still be inactive.

$ sudo ufw allow 'Nginx HTTP'
Rule added
$ sudo ufw status
Status: inactive

This might seem odd, right? Why is it inactive?

A quick search on the internet tells you that UFW needs to be enabled first with another command. So you just enable it.

Simple problem, simple solution. :-P

$ sudo ufw enable
Command may disrupt existing ssh connections. Proceed with operation (y|n)? y
Firewall is active and enabled on system startup

If you just answer yes here, close the SSH connection, and go on a break.

Congratulations, you are logged out of your server.


This just happened to me today while setting up a small droplet on DigitalOcean that hosts a static page for a domain with Nginx.

How to do it correctly

The problem lies in the last command

$ sudo ufw enable
Command may disrupt existing ssh connections. Proceed with operation (y|n)? y
Firewall is active and enabled on system startup

There are three ways out of this once you have enabled UFW.

First, don't quit the SSH session, and add a rule to allow SSH connections (any one of the following works):

$ sudo ufw allow OpenSSH
$ sudo ufw allow ssh
$ sudo ufw allow 22

This assumes you use the default port for SSH; if it's some other port you can just allow that port number, e.g. port 333.

$ sudo ufw allow 333

That way you will still have access to SSH when your session ends/times out.

Second, be smart and allow the SSH ports before you enable UFW.

Third, you just go bonkers, reset UFW, and start again. hahahaha!

$ sudo ufw reset

Common ufw commands you might find handy

  • Be sus and deny incoming.
# Set the defaults to deny incoming and allow outgoing connections
$ sudo ufw default deny incoming
$ sudo ufw default allow outgoing

  • Allow specific port ranges
$ sudo ufw allow 6000:6007/tcp
$ sudo ufw allow 6000:6007/udp
  • Allow specific address
$sudo ufw allow from


Decorators 101

  • A decorator may perform some processing with the decorated function and return it, or replace it with another function or callable object.

  • Inspection reveals that target is now a reference to inner.

def deco(func):
    def inner():
        print("running inner()")
    return inner

@deco
def target():
    print("running target()")

target # <function deco.<locals>.inner at 0x10063b598>
  • Decorators have the power to replace the decorated function with a different one.
  • Decorators are executed immediately when a module is loaded.
# Program to check how decorators are executed

registry = []

def register(func):
    print(f"running register({func})")
    return func

@register
def f1():
    print("running f1()")

@register
def f2():
    print("running f2()")

def f3():
    print("running f3()")

def main():
    print("running main()")
    print("registry ->", registry)
    f1()
    f2()
    f3()

if __name__ == '__main__':
    main()

running register(<function f1 at 0x107a25c10>)
running register(<function f2 at 0x1079b0280>)
running main()
registry -> [<function f1 at 0x107a25c10>, <function f2 at 0x1079b0280>]
running f1()
running f2()
running f3()

  • register runs (twice) before any other function in the module.

  • When register is called, it receives as an argument the function object being decorated—for example, <function f1 at 0x100631bf8>.

    • After the module is loaded, the registry list holds references to the two decorated functions: f1 and f2.
  • A real decorator is usually defined in one module and applied to functions in other modules, unlike the example above.

Variable Scoping

b = 6

def smart_function(a):
    print(a)
    print(b)
    b = 9

smart_function(3)

      3 def smart_function(a):
      4     print(a)
----> 5     print(b)
      6     b = 9

UnboundLocalError: local variable 'b' referenced before assignment
  • Python does not require you to declare variables, but assumes that a variable assigned in the body of a function is local.

  • To make the above program work with b=6, you would need to declare b with the keyword global inside the function.
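A sketch of that fix, reusing the smart_function example with the global keyword:

```python
b = 6

def smart_function(a):
    global b  # b now refers to the module-level variable
    print(a)  # prints the argument
    print(b)  # prints 6; no UnboundLocalError this time
    b = 9     # rebinds the global b

smart_function(3)
print(b)  # 9
```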


  • A closure is a function with an extended scope that encompasses nonglobal variables referenced in the body of the function but not defined there.

  • It does not matter whether the function is anonymous or not; what matters is that it can access nonglobal variables that are defined outside of its body.

# Closure Example

def make_averager():
    series = []  # Free variable
    def averager(new_value):
        series.append(new_value)
        total = sum(series)
        return total / len(series)
    return averager  # Returns a function object

avg = make_averager()

avg(10) # 10.0

avg(11) # 10.5

avg(12) # 11.0
  • series is a local variable of make_averager because the assignment series = [] happens in the body of that function. But when avg(10) is called, make_averager has already returned, and its local scope is long gone.

  • Within averager(), series is a free variable, i.e. a variable that is not bound in the local scope.

  • __code__ attribute keeps the names of the local and free variables.

  • The value for series is kept in the __closure__ attribute of the returned function avg.

avg.__code__.co_varnames # ('new_value', 'total')

avg.__code__.co_freevars # ('series',)

avg.__closure__ # (<cell at 0x107a44f78: list object at 0x107a91a48>,) 

avg.__closure__[0].cell_contents # [10, 11, 12]

Formal Definition

A closure is a function that retains the bindings of the free variables that exist when the function is defined, so that they can be used later when the function is invoked and the defining scope is no longer available.

Refactoring averager

def make_averager():
    count = 0
    total = 0
    def averager(new_value):
        count += 1
        total += new_value
        return total/count
    return averager

avg = make_averager()

avg(10)
UnboundLocalError: local variable 'count' referenced before assignment

Why did count not behave like series, i.e. like a free variable?

  • For series we took advantage of the fact that lists are mutable and we never assigned to the series name. We only called series.append and invoked sum and len on it.
  • count and total are immutable types, and all you can do is read them, never update them. If you try to rebind them, as in count = count + 1, then you are implicitly creating a local variable count. It is no longer a free variable, and therefore it is not saved in the closure.

Refactoring averager with nonlocal

def make_averager():
    count = 0
    total = 0
    def averager(new_value):
        nonlocal count, total
        count += 1
        total += new_value
        return total / count
    return averager
  • nonlocal lets you declare a variable as a free variable even when it is assigned within the function.
  • If a new value is assigned to a nonlocal variable, the binding stored in the closure is changed.

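A small sketch of that rebinding, using a stripped-down counter closure to show the cell in __closure__ being updated:

```python
def make_counter():
    count = 0
    def counter():
        nonlocal count
        count += 1  # rebinds count; the closure cell is updated in place
        return count
    return counter

c = make_counter()
print(c(), c(), c())                   # 1 2 3
print(c.__closure__[0].cell_contents)  # 3
```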

Understanding a simple decorator

import time

def clock(func):
    def clocked(*args):
        t0 = time.perf_counter()
        # The closure for clocked encompasses the func free variable
        result = func(*args)
        elapsed = time.perf_counter() - t0
        name = func.__name__
        arg_str = ', '.join(repr(arg) for arg in args)
        print(f'[{elapsed:0.8f}s] {name}({arg_str}) -> {result!r}')
        return result
    return clocked

# clockdeco_demo.py
import time

from clockdeco import clock

@clock
def snooze(seconds):
    time.sleep(seconds)

@clock
def factorial(n):
    return 1 if n < 2 else n * factorial(n - 1)

if __name__ == '__main__':
    print('*' * 40, 'Calling snooze(.123)')
    snooze(.123)
    print('*' * 40, 'Calling factorial(6)')
    print('6! =', factorial(6))

**************************************** Calling snooze(.123)
[0.12331448s] snooze(0.123) -> None
**************************************** Calling factorial(6)
[0.00000119s] factorial(1) -> 1
[0.00004911s] factorial(2) -> 2
[0.00008421s] factorial(3) -> 6
[0.00011835s] factorial(4) -> 24
[0.00015121s] factorial(5) -> 120
[0.00018532s] factorial(6) -> 720
6! = 720
  • @clock on factorial(n) is just syntactic sugar for factorial = clock(factorial).
  • clock(func) gets the factorial function as its func argument.
  • It then creates and returns the clocked function, which the Python interpreter assigns to factorial behind the scenes.
  • The __name__ of factorial would give you clocked:

>>> import clockdeco_demo
>>> clockdeco_demo.factorial.__name__
'clocked'
  • Each time factorial(n) is called, clocked(n) gets executed.

Common Decorators in the standard library through functools

  • @wraps
  • @lru_cache
  • @singledispatch
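As a quick taste of one of them, @lru_cache memoizes calls; the classic example is the naive recursive Fibonacci, which the cache turns from exponential into linear time:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Without the cache this recomputes the same subproblems repeatedly
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040
print(fib.cache_info())  # hit/miss statistics of the cache
```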

#notes #Python #FluentPython #Decorators #Closure

For the last couple of weeks I have been actively job hunting, and at one of the places I interviewed I was asked if I could do the take-home assignment in Go, since I had already mentioned in the interview that I had been learning Go on the side. I agreed to finishing the assignment in Go, thinking this might be a good challenge to take on, and that by doing it I might have a chance to showcase my ability to pick up new things quickly.

Plus having a deadline to complete the take-home assignment in a week could also accelerate my Go learning, as it provides a time based goal to work towards.

Gopher Coffee Gif

So for a week I spent most of my time reading parts of Learning Go, referring to tutorials from Learn Go with Tests, watching old episodes of justforfunc: Programming in Go, a lot of stackoverflowing, and talking to friends who use Go as their primary language at work to get some pointers (no pun intended) on best practices, blogs to follow, etc.

After a week, I finished writing gogrep, a command line program written in Go that implements Unix grep like functionality.

gogrep has a fairly small subset of features: it can search for a pattern in a file or a directory, and it has support for flags like

  • -i : Make the search case insensitive.
  • -c : Count number of matches.
  • -o : Store the search results in a file.
  • -B : Print 'n' lines before the match.

While adding these features I also added a fair bit of tests to the project to see how tests are written in Go.

At the moment the test coverage of the entire project is around 72%, where all the major packages that serve as helpers or hold business logic have coverage greater than 83%, and a few have 100%.

Since gogrep is my first Go project, I do have some loose opinions about the language and the tools it offers, primarily from the perspective of a Pythonista who has been using Python as a primary language for the last four years.

  • Go does not have classes and it's probably good: If you are coming from a background of classes and methods, you can relate some of it to receiver functions. Receiver functions come very close to behaving like methods of a class. Having written a few receiver functions for a struct, the straightforward use case becomes a no-brainer, but it might get a bit complicated once you start refactoring the code a second or third time.

  • Pointers! Oh my!: If you have ever done pointer arithmetic in C/C++ you know the pain. Thank god, in Go you don't have to deal with pointer arithmetic. You still have to be careful since you are dealing with memory addresses, and it can get complicated, but it's definitely not that bad. You get used to it.

  • Tooling: I coded the entire project in VSCode and I loved the Go support it has; the staticcheck suggestions for code snippets are quite helpful, and for tests, the test file turns red the instant the associated function is changed. The whole developer experience is amazing. I have heard even better things about GoLand by JetBrains from folks who write Go daily; can't wait to try that out. Besides that, gofmt, golint, go vet, and go test were some other things that I found really handy. I did not play around much with the debugger, so I can't comment on that.

  • Makefile to the rescue: Since Go gives you a single binary in the end, you have to build it every time to check your code changes. Having some automated way, like a Makefile, makes things really easy. So I would suggest that during your initial project setup you invest in a good automated process for building and checking your changes.

  • Error handling pattern: Go has this pattern for handling errors which is very straightforward, but sometimes it might seem a bit too much in terms of repetitive code.

if err != nil {
    // Do something here to log/return the error
}

This seems to be the only way to handle errors, at least that I know of. So most of my code was sprinkled with a lot of these if err != nil statements in the top layer of the abstraction; for gogrep it was main.go, where I was calling func ParseFlags() for processing arguments.


// main.go

conf, output, err := parseflag.ParseFlags(os.Args[0], os.Args[1:])

if err == flag.ErrHelp {
    // Print the usage of the CLI
} else if err != nil {
    fmt.Println("Error: \n", err)
}

func ParseFlags(searchWord string, args []string) (config *Config, output string, err error) {
    // Suppressed code
    err = flags.Parse(args)
    if err != nil {
        return nil, buf.String(), err
    }

    // Suppressed code
    if len(flags.Args()) == 0 {
        return nil, "", fmt.Errorf("missing argument searchword and filename")
    }

    // Suppressed code
    return &conf, buf.String(), nil
}
  • Types magic: Since Go is statically typed, when defining a variable I have the option to either just declare it with a type and have a zero value in it, or directly initialize it with some value.

Even with functions I had to write a signature of input types and output.

func ParseFlags(searchWord string, args []string) (config *Config, output string, err error)

Having types in the function signature and output became such a great advantage later on, both for the developer experience when refactoring and when re-reading code that I wrote days back.

I became a fan of types in the codebase to such an extent that now I am all in for type annotations in my Python code if they can give me similar advantages.

  • Things I skipped: One week is a very small time to peek into a language's features. So obviously I skipped a lot of Go features too, like interfaces (I might have used them, like io.Reader, but for sure haven't written them) and generics, and I definitely did not touch goroutines and channels, the one thing that most people praise Go for. There might be more, but from a bird's-eye view I cannot think of anything else.

Overall, it was a really fun week of learning something new and not using something that is already in my toolkit.

If I do more Go projects in the future, I would definitely like to do a talk, “Go for Python Developers”, where I talk about my experiences of building a series of projects in both Go and Python, like:

  • CLI tool

  • A key value store

  • A fully featured backend system for an ecommerce website/twitter clone/blogging engine that will include REST APIs, a relational DB, Redis for caching, RabbitMQ as a queue, async workers for processing, etc. As close as I can get to the real thing.

Thus giving out stronger opinions that I can back up, and a much deeper overview of what to expect when working with both languages.

Gopher Gif

References: – Gopher Gifs: Images

2022, in hindsight, was a rough year for me. At the start of the year I was super confident about what I wanted this year to be, but by the end of it I was as clueless as anyone.

Here are some highlights of the year, in no particular order, that summarise 2022 for me:

  • Gave two talks this year: one at PyCascades 2022 and one at PyLadies Berlin.

  • Mentored a person to give her first ever talk at EuroPython 2022.

  • Attended an in-person conference this year after ages. God, how I missed those.

  • Did a two-week workcation in Himachal Pradesh; turns out I'm not that into this workcation thing.

  • Got back into coffee brewing. Aeropress for the win!

  • Picked up Japanese as a language this year and failed at it miserably. I hope I am more consistent with it next year.

  • Bonded with a lot of childhood friends whom I thought I had grown out of.

  • Started seeing someone.

  • Sent my parents on a 10 day vacation, which I think is the highlight of the year since I always wanted to do something like this for my parents the day I got my first paycheck.

  • Paid the initial down payment for a car for my parents.

  • Quit the job that I always tried too hard to fit in at and that was affecting my mental health. Took a big risk here, quitting without a new job in hand; hope it pays off in the longer run.

I hope 2023 is a bit kinder to me and brings some stability.

PyLadies Berlin Logo


PyLadies' Berlin chapter hosted a meetup on 20 November which seemed very interesting to me when I first saw the agenda. It was on writing better talk proposals.

Though I have not attended many online meetups in the last two years I definitely didn't want to miss this one since I had a conference talk coming up early next year and this meetup just seemed like a really good opportunity to get some early feedback on the talk.

The meetup was sponsored by and was hosted on GatherTown. The meetup agenda had only one talk and the rest was all an interactive session.

Once everyone settled in, Maria started her talk, 'The Call for Proposals is open: Tips for writing a conference talk'. The talk resonated with me a lot, because by the end Maria pointed out that most folks don't submit a talk just because they think they lack mastery. Being a similar person who overthinks mastery a lot, I could relate.

Next was the Ideation round. It was the most fun and interactive part of the meetup where everyone pitched their talk idea(s) and after that, folks would give feedback (if any) on how the talk proposal can be improved.

I had initially joined the meetup to get some feedback on the talk that had already been selected for a conference. But later decided that I can utilize this chance to get some feedback on a completely new talk proposal.

So I presented a completely different talk idea 'Up and Flying with the Django Pony in 10 min' that I cooked up during the break.

Django Pony

Pony is Django's unofficial mascot. Source:

I noticed Maria mention Python Pizza in her talk and saw they were accepting 10-min short talks. So that gave me a chance to experiment and present something similar to Go in 100 Seconds.

Two other reasons that made me choose a talk on Django were

  • Most folks, when starting out in web development, run after learning the best framework or constantly switch without actually learning one well and knowing its limitations. But as was rightly pointed out during the feedback, this could be a different talk altogether.

  • This talk could have been a really good starting point for someone who is thinking of picking up Django or moving to Django as it gives a good overview of its batteries included nature.

By the end of the Ideation round, there were only a couple of folks left. So it was decided that everyone would write a documented draft of the proposal they had shared during the Ideation round, and we would brainstorm on how to improve it further.

The brainstorming session helped frame a better proposal and left me with a lot of valuable feedback, some of which I will keep in mind when writing talk proposals in the future.

Jessica, one of the PyLadies Berlin organizers, did a wonderful job with the meetup, keeping it interactive while creating a safe space for everyone to share their views and feedback in a constructive manner.

I think other meetup groups can definitely take pages from PyLadies Berlin's book and host similar meetups where folks help each other write better talk proposals, as that would directly impact the quality of talk proposals, which in turn may impact the quality of talks a conference gets. Who knows, some of these proposals could end up being really good blogposts or meetup talks, if not conference talks.

It might not be possible for organizers of an annual PyCon, like PyCon India or other PyCons, but a local chapter of a Python User Group can usually run a similar meetup in their region just after the CFP opens, which can not only help out beginners but also help folks improve their talk proposals.

That's all for now. Until next time. Stay safe folks.

Last weekend I attended EuroPython's virtual sprints. Though this was my second year of attending the virtual sprints, I was still a bit overwhelmed. This year, not only would I be sprinting on other projects, but I would also be helping other folks contribute to ScanAPI. As I had recently been added to the core team, I thought it would be a good chance for me to dive deep into the codebase while helping others at the sprints.

Unlike last year, where Discord was the communication channel for the sprints, this year folks at EuroPython had a Matrix server. Each sprint project had its own channel, integrated with Jitsi, to help pair program or just have a hallway-like experience.

Similar to last year, sprints were planned for the whole weekend with multiple opening/closing sessions to support different timezones. I liked that they kept this from last year, because the other folks from ScanAPI are from Brazil, and to them the first opening session, at 9 AM CEST or 12:30 PM IST, would be something like 4:30 in the morning. The gap between the multiple openings also gave me some time to just visit other projects and contribute. At least that is what I thought initially, and I moved the sprint timings for ScanAPI to the second day.

On the first day of the sprints, I wasn't expecting to be present at the opening, since we had planned our sprint for the other day. But to my surprise I got an invite to the opening session to present for ScanAPI, as I was online anyway. So in the opening I mostly talked about what the project is, how to run it locally, and how you can pick up issues.

After the opening, since I did not see much activity in the ScanAPI channel right away, I hopped on to EuroPython's website sprint channel, where Raquel and Francesco (web lead for the project) were already present. I had a really fun time talking to them.

By the end of the day, ScanAPI had three new issues, four pull requests to review (two of which got merged), and one contribution to the wiki on how to set up the project on Windows without WSL.

The second day, when I had initially planned the sprint, was a bit slow in general. Since we had very little activity, I wrapped up early after addressing the previous day's pull requests. As there was only one person active on the channel post-lunch, the conversation at the end drifted to bread baking, which seems to me a really fun thing to try in your free time.

In the end, it was a weekend well-spent.

Making sense of docstrings with Python's ast and tokenize module and using recursion to rewrite Python's os.walk()

Last week was among the busy weeks at work, with a release. Sadly, personal goals suffered.

I did manage to put some effort into issues that were pending. Let's walk through each of them one at a time:

  • Lynn Root, maintainer of interrogate, gave a detailed description of the issue. The approach to reaching the solution makes much more sense to me now. Based on the response, I wrote a simple script to check for # nodocqa on docstrings, i.e. do not report coverage for that particular class or function. Now I need to check how to implement that in interrogate.

  • While reading about recursion, I realized recursive algorithms are widely used to explore and manage the file system. This reminded me of Python's os.walk() method. I spent some time reading the implementation; though the code is well documented, I am still not 100% sure how it works. I guess I will learn more when I finish the script that mimics os.walk() in some manner.

That's all for the last week. Until next week. Stay safe folks.


Interrogating docstrings, Django's custom user model, Python's ast and tokenize module.

Oh boy! The week was more fun than I had planned.

Let's walk through each of them one at a time:

  • After finishing one of the courses on Django Celery, I wanted to put my knowledge to the test by building a knockoff version of Buttondown, so I started a project called Hermes. Currently, Hermes has all the models done, the Celery setup done, and basic auth in place. I wanted to implement a custom user model so that the username field can be an email instead of a username; I am currently stuck with a bug related to that, which hopefully will be resolved by next week.

  • I picked up an issue in a library called interrogate. interrogate gives a coverage report of missing docstrings and currently does not have a skip comment like noqa, so I picked up the issue to add that. At a quick glance I saw that the ast module was being used internally, though the ast module doesn't include comments. The tokenize module can give you comments but doesn't provide other program structures. So I guess I need to mix and match both to add the feature.

  • Last year I sprinted on scanapi at the EuroPython 2020 sprints. I moved back to the project this week, starting again by adding some docs, issue templates, and docstring coverage to the project. Also, this is how I stumbled upon interrogate.

I also managed to stay on course with my yearly health goal: meditated daily and spent a decent 40 min doing cardio/core exercises on the working weekdays.

That's it for this week. Until next week. Stay safe folks.


Last weekend I attended EuroPython sprints, which were conducted virtually. The communication platform for the conference was Discord, and it was kept the same for the sprints too. It served as a good platform, as we were able to pair program with the maintainers by sharing our screens.

Day 1

Sprints opened at 12:30 PM IST and started with the first round of project introductions. A total of 12 projects took part in this year's sprints. The project maintainers were from varied timezones, and timezones are difficult to handle, so the first opening of the sprints had only a few maintainers present to talk about their projects.

The project that I started off with on day one of the sprints was terminusdb. I primarily contributed to terminusdb's Python client, which had Cheuk Ting Ho and Kevin Chekov Feeney to help us out. Kevin had coded the JS client of the project and was here to work on the Python client.

The issue I picked up was increasing the test coverage of the project, and while working on it I also discovered some other issues: some deprecated functions were still being used in the client, and the Makefile did not have a command to generate a coverage HTML report for the project.

By the end of day one, I had moved the coverage of terminusdb_client/woqlclient/ from 62% to 70%, with a PR to remove the deprecated functions from the client. Doing that, I learned about graph databases and how terminusdb has git-like features for the database.

Day 2

I started late on the second day and continued to work on the test coverage PR. I fixed some minor flake8 errors in the PR, pushed the coverage to 75%, and created a PR for that Makefile command. A lot of people in the sprints were confused about the project setup, so I opened a documentation issue for writing a wiki with setup instructions and contribution guidelines for new/first-time contributors.

Just an hour before the first closing session I moved to scanapi, which is maintained by Camila Maia. I picked up some good first issues and got them merged in no time. I had seen this project at the closing of day one and found it very interesting.

The other projects that I found really interesting but could not contribute to were Hypothesis, Strawberry GraphQL, and commitizen.

Overall I had a really fun weekend and I am excited to contribute more to those projects.