Linux Readline Keybinds vs Window Manager Keybinds

Well, Linux has spoiled people with options. Alas! The choices people make are often horrendous and backfire on them. But why does that happen in the first place? Looking at the tip of the iceberg and imagining its size and shape is a widespread blunder, and it happens in every walk of life nowadays.

So, as the title suggests, working on a Linux system entices you to use the venerable command line sometimes, hopefully sooner rather than later in your endeavor. And you will find it enthralling when a minimal key press can do a ton with massive impact.

GNU Readline1 comes bundled with most, if not all, of the Linux systems out there. It is such an important library. Wait, did I say library? You read that right.

Now, it comes with a predefined set of keymaps, which can be used in the shell or in the terminal. By default, it uses Emacs’s bindings, although you can easily change that to Vim’s if you want to. How cool is that! Not only that, if you are curious beyond the usability of the key binds it provides, you can program2 with it too.

Every window manager3 has its own key binds, because not everyone follows the same key presses, and their internal workings are quite different from one another. So, in essence, your mind can get overloaded with different keys for the various window managers you might have. However, it is certainly not a bad thing to have a single window manager across distributions with the same key binds. That is what I am used to, and it is my primary mode of operation.

But, having said that, the readline keymaps are window manager agnostic and work similarly with various window managers without any change. Importantly, they can coexist.

Okay, the million-dollar question is, how the heck do I know when to use what? The answer is not that simple. Sometimes you have to ride on your gut feeling to perform a specific action or access specific things in the most convenient way possible. The end user has to put their head down and figure out what works best for them. There is no hard and fast rule defined by anything or anybody.

How to see and set the readline command bindings in your system

Do this:

Open a terminal/shell and type at the prompt:

bind -P    ...that is a capital P. And you will see something like this in your terminal/shell:

2024-09-09-142259_1920x1200_scrot.png

You will notice in the output that the lines which do not start with # are the active readline keybinds on your system. Here are the options you might need to know when using the bind command in the terminal:

2024-09-09-142726_1075x507_scrot.png
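In plain text, the handful of standard bind options I reach for most often are these (all documented options of bash's bind builtin; they only make sense in an interactive shell):

```
bind -P        # list readline functions together with the keys bound to them
bind -p        # the same, but in a format reusable in an inputrc file
bind -l        # list only the names of all readline functions
bind -v        # list readline variables and their current values
bind -q name   # query which keys invoke the named function
```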

Okay, every Linux system has a file /etc/inputrc, and its content looks somewhat like this:

2024-09-09-143521_1920x1200_scrot.png

If you want to override the system inputrc file with your own, create a file named .inputrc under your home directory and put your bindings in it, as I did for a few.

2024-09-09-143915_1920x221_scrot.png

A last word of caution: please do not overburden your .inputrc file with loads of key binds.
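For illustration, here is a minimal, hypothetical ~/.inputrc; the directives are standard readline ones, but the selection is just an example, not my actual file:

```
# Pull in the system-wide defaults first
$include /etc/inputrc

# Case-insensitive tab completion
set completion-ignore-case on

# Switch readline to vi editing mode, if that is your thing
# set editing-mode vi

# Bind Ctrl-x Ctrl-r to re-read this file after edits
"\C-x\C-r": re-read-init-file
```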

Nope, YouTube Doesn’t Like Being Behind A VPN; An Automated Way To Deactivate And Activate It

Hey... hey, don’t you know YouTube doesn’t like to be accessed from behind a VPN or some other means of restriction you may have put in place?

In this post, I show my own way of doing things while adhering to the rule they have imposed.

I have used a specific VPN for a long time, a sheer choice made one fine day in the distant past, and damn! I am sticking with it till now.

Now, the trouble started surfacing a few months back when Google rolled out their forced rule to gain something they think is important. Okay, I use YouTube like everyone else in the wild and do the stuff others do. So the impediment took me aback. Hence, I had to find a way to get on with it.

Here are two little scripts with identical nature to do the jobs for me. One is to get YouTube videos downloaded and converted to mp3, so I can listen to them with my local music player.

The other one downloads YouTube videos without distraction and at the best possible resolution.

I could have combined both of them into one single script, but I refrained for various reasons. One of the prime reasons is the ease of maintaining those damn things.

This is abjectly ordinary stuff, so take it with a pinch of salt.

#!/usr/bin/env bash

echo "Disconnecting the VPN to use this facility...."

"$(command -v piactl)" disconnect
youtube_url="$1"

dl="$(command -v yt-dlp)"
holdsongs="$HOME/Music/downloaded_songs"

cd "$holdsongs" || exit 1

# Download the audio stream and convert it to mp3
if "$dl" --extract-audio --audio-format mp3 "$youtube_url"; then
        find "$holdsongs" -type f -print
        sleep 10
        echo "Turning the VPN back on again!"
        "$(command -v piactl)" connect
fi


Goodness gracious me! What an ordinary way of accomplishing something even simpler!! You probably noticed that the piactl binary is provided by the VPN I use; in a similar way, you might induct your own VPN provider’s CLI tool. Oh, yt-dlp1 is the binary used on the command line.

Alright, the only difference between the video download script and the one above is a single line, and it looks like this:

"$dl" -f 18 "$youtube_url"

That’s it in terms of change.

The crux of the matter is: Turn off the damn VPN and once the job is done, turn it back on. Period.

Linux Kernel: Understanding The Boiling Points

Well, before you foray into the exciting venture of Linux kernel contribution, you are supposed to know the little intricacies it imposes on you. They may sound trivial but play a vital role in forging ahead with contributions.

All I am trying to suggest is that there are different places you should look into, related to different nuances of the Linux kernel development model.

Here are the places you should follow :

Linux Next: This is the place where you are supposed to bake your code through reviews and improvements. Be ready to receive some strong feedback. If you don’t take it to heart and instead treat it as constructive criticism of your work, it will help you get along. But how do you start? Here is what you need to do to begin this phase:

Follow this: Working with linux-next
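In short, the documented workflow boils down to adding linux-next as a remote on top of a mainline clone; a sketch (the example tag name is hypothetical, pick a current one from the fetched tags):

```
git clone https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git
cd linux
git remote add linux-next https://git.kernel.org/pub/scm/linux/kernel/git/next/linux-next.git
git fetch linux-next
git fetch --tags linux-next
# base your work on a recent tag, e.g.:
git checkout -b my-feature next-20240909
```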

Linux Mainline: Oh, this is a hotbed, where things get boiled. All the -rc’s, aka release candidates, are curated and released here. This is a good place to get public exposure to something very bleeding edge. You can play with it and provide your feedback to the maintainers about any abnormal or uncanny behavior of the tree. That will help the maintainers correct it and provide the solutions in the next release candidate.

Follow this: Working with Linux Mainline

Linux Stable: You can think of it as a public testbed. However, the stable Linux process makes sure that you get the stuff as solid as possible. But if it fails somehow, somewhere (after all, it is damn software, right?), then please provide your feedback to the maintainer of the stable tree. Also, this tree will help you figure out something called backporting, which essentially means incorporating a fix into an older version of the kernel. Again, you have to be on top of your solution to let the maintainers know why it is important to get it included in the previous versions of the kernel without breaking anything else.

Follow this: Working with Linux Stable

Gentoo Wireless Deauthentication: Troubleshoot The Problem And Bring Back The Damn Network

Damn! I have been bugged by this for quite some time and finally decided to take a call on it.

Gentoo deauthenticating from wireless by choice (reason=3) is/was a real problem. Oh, haven’t you started googling frantically the moment you saw the title?? 🙂

Anyway, it has been discussed in an ancient thread in the Gentoo forum. And I have curated a mundane script to deal with it every time it occurs. Here is the ugly script at your disposal:

#!/usr/bin/env bash

vpnid=$(pidof pia-daemon)

echo "Checking the common cause...."
echo
grep "Gentoo kernel: wlo1: deauthenticating from" /var/log/messages | tail -n 2

echo
echo "Checking whether anything is blocking the connection devices...."
echo

rfkill list

printf "Is it blocking anything?? [Y/N]: "
read -r provide

if test "$provide" == "Y"; then
        rfkill unblock all

        printf "Did the network come back? [Y/N]: "
        read -r response
fi

if test "$response" == "Y"; then
        echo "Alright..."
        ping -c 10 -i 0.2 google.com
        exit 0
else
        echo "Bouncing the whole network....."
        echo

        kill -9 "$vpnid"

        sudo /etc/init.d/net.lo restart
        sudo /etc/init.d/dhcpcd restart
        sudo /etc/init.d/wpa_supplicant restart

        printf "\nBounced the entire network stack.... Done\n"

        ping -c 10 -i 0.2 google.com
fi


Pretty ordinary. Basically, it bounces the damn network after checking for a blockage of any particular physical device and, if there is one, unblocking it. And if that still fails to bring the network back, it goes the conventional way and bounces the entire network stack.

Now, when the deauthentication happens, the log shows something like this:

2024-07-17-065002_1920x180_scrot.png

So, I check for that pattern as the very first thing in the above script. But wait, it could happen for several other reasons too. One of them could be that when we close the lid of a laptop, it simply disassociates from the wireless to save power, and that is nice of it.

But it happened while I was in a working state, in the middle of something, so I was pissed. In most cases, the first part of the if statement brings the damn network back onto its feet, and very seldom does the last part of the if-else need to be executed.

I have also inducted the bssid value, i.e. the MAC address of the access point, into the wpa_supplicant configuration, to make extra sure it does not miss it by any means.
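For reference, pinning the access point in wpa_supplicant.conf looks something like this (the ssid, psk, and bssid values are placeholders, not my real ones):

```
network={
        ssid="MyHomeAP"                 # placeholder network name
        psk="secret-passphrase"         # placeholder passphrase
        bssid=aa:bb:cc:dd:ee:ff         # MAC address of the access point, pinned
}
```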

Oh, I ignored a large part of the internet search results, because most of them were bogus and delved into something else.

Linux: Ecstasy and Agony Of Living With It

TL;DR This is NOT a pure tech post, nor did I intend it to be.

Well, Linux has been in the mainstream for a couple of decades, in case you failed to realize it. And living with it full-time has some major consequences. I shall be pretty blunt about it (haven’t I been so in various other instances??). But this is to remind you of the path I chose for my own sake, with nothing particular in mind.

Am I the only one? What a vague query! When you see people flocking to it like flies to a honeypot, not everyone has the same intention and motivation. That is quite understandable, because we are different human beings and the gray cells between our ears have radically different things in store. But one common underlying thing that knits us together is using the damn same platform, and probably the other offerings (read as tools) in that space. Some people are even more courageous than most and put in the invaluable and unbearable effort of bringing things from other platforms to this one. The underlying ethos of that kind of work is to see and explore unknown and uncharted territory with the help of something known for ages. That is a good ploy, and kudos to those upbeat folks.

Now, while at it, some people believe that the decision to do something rests solely on their shoulders, as it is often cited: “freedom to do whatever you like”. That statement is sincerely misleading. You just cannot do things in an airy-fairy way and expect them to be used in real-life scenarios, can you? Certainly NOT. The crux is that, whether you like it or not, some people who are core to a project always have the final say in how things get done within the project. Period. That is the harsh truth; the sooner you realize it, the better for you. It certainly does not stop anyone from having their own opinion and, importantly, working to push things in another direction. We have had many such people over the decades, specifically in this space (Linux and its ecosystem), and it augurs well for almost everybody.

One striking, noticeable thing, if you are observant enough, is that we just don’t care for or entertain special cases that much. In fact, there is a strong opinion in favor of eliminating them at the earliest possible interaction. It is a damn good thing for the overall health of the project. Why? Because you have to deal with less special-case baggage, and the code works more seamlessly with other parts of the thing. Plus, a generic implementation makes more sense in 90% of cases or more, so people should be encouraged to pursue it. Oh, I have forgotten to remind you that we just don’t care about heroes of any sort. And that is pretty evident from the people who drive the project, especially the Linux kernel.

While it brings all the freedom to express yourself, in the hope of making a mark on the world stage, oh yes, that is the driving force for a lot of people getting involved in the project. Alas, soon they find it is not such a cakewalk. You are supposed to have an enormous amount of grit and perseverance to stay with the damn thing. Otherwise, you are just a wandering walker and will be forgotten in a jiffy. Then there are those who come and stick with it (yours truly) for a long time, for the sake of pure enjoyment and a certain sense of achievement (how small it is doesn’t really matter). I have cherished every small bit I did with it, with all my mischief and vigor. Moreover, I am kind of over the moon (and stringently NOT complacent) that I did it myself, even though it took a much longer time to do whatever little and less impactful work I did.

Daily driving (a common piece of jargon we use in the tech world) is a way of saying a simple real-life thing: “use it in every walk of life”. And that has a serious impact on our lives, and the people we live with bear the consequences of our involvement with our beloved thing. Oh, it is certainly not a hunky-dory situation to live with. Like life, it has a certain amount of uncertainty around it. But at the heart of most of it is the kernel1 (read it as the important part, not just software).

So, adjusting to its vagaries is the lingua franca of surviving with it. And I have seen people give up too early, especially these days with so much distraction and the ever-growing crush of quick gratification. Meh, what a way to measure life events! Nonetheless, good work takes time, and it is an undeniable truth that perseverance wins over mastery. Embracing the pain to evolve is more meaningful than having things handed to you on a platter, because it gives some sense of accomplishment, no matter the scale. Provided you are not bitten by the idiotic bug of comparing yourself and your work to others for competition’s sake. There will always be people who supersede you in many ways, and that certainly does not mean you give up on your loved thing. I didn’t. Despite being told and shown that I didn’t have the mettle to go any further than what I “deserved to”. Now I look back at that statement and find no logic or proper reasoning behind it. Empty words are easy to throw around and are often thrown by incompetent arseholes for their own benefit. So, ignorance is bliss. Provided you are fully aware of what to ignore and when to pay attention.

Staying on course is an important act of commitment. I have seen stupid people come with a bang, go with a bang, and grab some limelight for themselves in a forgettable, minuscule time. Good for them, with their short span of vigorous vigil over popular software. Also, they come into the limelight to show their mettle and how others fare in the project. These subtleties are often missed by the naked eye. But they are very harmful, and it is highly recommended, if you discover such a feat (yours truly discovered some, unfortunately), that you stay away or ignore it outright for your own sake.

“I do what I love most” is not a buzzword, nor was it said explicitly two and a half decades back (it might be an uproar now on “unsocial media”). Taking the decision to embark wasn’t that easy, considering the environment and the kind of upbringing I had. Many have done better coming out of a similar environment. Every passing day of candor with my liking is a kind of bliss in life, and every little piece takes away the grime of the “not good enough” tag from my mind, bit by bit. No, I am not a person who delves into pessimism for too long. The situation of real life does not allow me to linger in that mode; it ejects me out of it in a jiffy.

I have put forward what I could do best within my limitations, and I am certainly not sitting on it. Linux has given me the wings to express my way of seeing life and to live it in my own way. Yes, there are still steps to be covered, and I am well assisted by “real well-wishers, by their acts” to thrive. I am grateful for their consideration of my life’s input. One more thing plays an important part: just not letting go easily once I cling to something.

The excruciating pain of gaining some understanding in the early years helped my system stabilize and get on with Linux. The lack of formal academic training made the progress slower, and it still does on some occasions. However, I discovered ways to get over the obstacles through a lot of trial and error and, importantly, by not being shy about asking people for help; and I felt (very important!) compelled to provide that help in return. You can sense who can and cannot help when you have been in it a little longer, so you use your time judiciously. I have always been very particular, in life, about my choices. Good or bad, they belong to me.

Betting on Linux was a damn good decision, made in the very first place at a very crucial juncture of life. And I turned a sincere deaf ear to the people I tagged as naysayers. It turned out to be a boon for me. Gaining insight and living life with it is certainly a fascination come true in real life. I couldn’t have dreamt of or done better.

Shortcomings! Who doesn’t have them, and what doesn’t have them?? Getting over them, or at least putting in an honest effort to do so, is a bare minimum requirement, just like building a good, long-lasting relationship with someone of your choice. Aren’t we all doing the same thing in life, knowingly or unknowingly? I have come to terms with the fact that every good thing has some drawbacks, and if you want a long-lasting relationship with it, you have to ignore certain aspects of it. Oftentimes the goods weigh much heavier than the downsides, once you let some materialistic aspects take a backseat. So, my hard-earned horse is letting me ride it for the moment, and I am doing my bit to keep it properly fed.

I never had, nor do I have, any inclination to compare myself with people who are good on other platforms. I have always said I feel at home on Linux, and that’s my answer to their combative and competitive queries. What’s the point of engaging in babble which does not end in a fruitful result? I am more open to people who have the desire to understand, and who allow me to understand, through their non-prejudice-laden take on some common interesting stuff.

My very strong and intentional avoidance of politics of any sort helps me thrive, in my own eyes and understanding. When I sense something of it, I generally withdraw quietly. I am an apolitical person with a serious lack of understanding of what is achieved by it when things are otherwise clear, though they are not always. Like life, the tech world is prone to some politics; after all, humans are still running the show, and it reflects in people’s choices about some technical things. But when the imposing takes precedence, then generally I take a call.

I have had my fair share of what I wanted to do with Linux, and I hope to give back to it in more meaningful ways, for my own wholly selfish reasons.

Footnotes:

1 The important part of anything

Open Source Tools Emacs And Vim Help To Organize

Well, the tools mentioned in the title are cross-platform tools1, which means they can be used on various platforms other than Linux. But I have been exclusively running Linux on my machines for ages (what’s the big deal??), so I had better stick with that platform.

HEADS UP! Neither of these tools is self-sufficient, so they have dependencies on other tools too.

Now, these editors have capabilities beyond trivialities like organizing and reminding (which I have written about in other posts), but this feature is certainly worth highlighting for NON-tech people as well as tech people.

Let me show you the way I keep reminding myself from time to time (and hopefully you have already seen this picture somewhere else I might have shared it, but for the sake of this post, one more time 🙂 ).

While I am sitting in Emacs, I have an org file that keeps track of what I am up to, along with capture templates designed to capture things in a few keystrokes (other posts I have written have more details about that mechanism) and induct them into the file. Looking at it, almost as a ritual, gives me a sense of what to follow up on.

Here is what it looks like :

2024-06-22-025120_1920x1200_scrot.png

See! Not many items bother me too much. That is because I have only inducted things that I can attend probably. Period.
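For the curious, such an org file is nothing exotic; a minimal, hypothetical sketch of what lives in it (the items here are made up):

```
* TODO Reply to the mailing list thread
  SCHEDULED: <2024-06-23 Sun>
* TODO Update the dotfiles repo README
* DONE Read the pending LWN article
```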

Likewise, while I am inside Vim, I keep my knowledge base in it and keep inducting things when I find something interesting (the same mechanism applies in Emacs too, so it is redundant). I might look into these offline and spend prolonged time on them to gain more insight into the topic.

Here is what the KB looks like :

2024-06-22-025540_1920x1200_scrot.png

That’s one section shown in the above screenshot; there is more in that space which is not showing. 🙂

Again, there is an easy method to induct into it, and it should be easy; otherwise, having these tools would defeat their purpose. Essentially, I took advantage of software packages made specifically for those editors and used them as needed.

But why do you need to be organized? (Please, don’t overdo it.)

To make sure the gray cells between the ears are aligned with the steps we have decided to follow. IOW, to keep away from distraction and noise. The more involved you are in the process of the unknown, the more confusion arises. Mitigating that is essential to move forward in a journey. Importantly, it should solidify your understanding of the specifics which interest you the most.

Inside Emacs I take advantage of org-mode to the fullest extent; it is such a beast to know, and it enhances productivity to some degree as well. In Vim I take advantage of Vimwiki to lay the base for note-taking and knowledge-base building.

The invocation of the specific spice (the underlying software), customized2 (out of need), makes these tools available with minimal effort. And I have a sincere allergy to making something more complicated than it needs to be. It also means that it is, after all, a piece of software and bound to misbehave sometime; so, to debug the damn thing, the implementation has to have some sort of easy interface to get at.

Convenience doesn’t mean one has to go overboard to make things happen. But it surely means that before stabbing at things, you should make sure what you expect out of them. This leads to a basic understanding of what they can offer in general. Knowing your requirements is a must. Period.

People generally tend to mold their favorite things to their liking, and I am no exception to that rule. I mold what I lean on every single day for productivity. Writing for the sake of writing and not following up stringently makes no sense. So, knowing the tool but not taking advantage of its offerings is not so useful.

There is an adage: “A fool with a tool is still a fool if he fails to tap its potential.” Importantly, for your betterment.

Oh, before I let you go: my Vim and Emacs dotfiles have a lot to do with this space. You might take a look at them at your leisure. They are both hosted on GitHub in the dotfiles repository.

Footnotes:

1  That means these tools can be run on other platforms, like Windows and Mac machines.
2   You might read or think of it as automation: a kind of process that makes things happen behind the scenes and presents you with the effects or results.

 

Bash Script To Convert PDF To Show Properly In Emacsclient Running On Terminal

Okay, it is ugly, not elegant or easy, but the damn thing works well enough to fit my workflow. Period.

What am I talking about?

The backdrop is important to fathom. The kind of environment I operate in on a day-to-day basis is purely command-line interface, aka CLI, mode. So I have to take some action to make things work that are otherwise not available in that environment, for my choice’s sake.

This is one of the cases out of many I did in the past.

The context: I run Emacs in terminal mode (huh, what’s the big deal?). But running in that mode does not allow us to take full advantage of the power this damn tool brings to the table. I run it this way because my entire workflow is bound coherently to this ecosystem. Hence, a ploy needs to be in place to get over the obstacle it poses for a particular job. And that damn obstacle is that Emacs running in terminal mode does not have the facility, nor does it allow you, to view a PDF inside it.

Oh, yes! I can see the damn PDF in some external tool, which is the default case, and I do use my favorite PDF viewer, i.e. Zathura.

But, sometimes the situation demands other things, and essentially some sort of inexplicable urge to do things in my way, damnit!

So here is what I opted for: I wrote an abjectly ordinary shell script to do the job for me while sitting in that environment. This is for your viewing pleasure, and if you are an “expert” in this field and frown upon the flow, I won’t stop you.

#!/usr/bin/env bash
#===============================================================================
#
#          FILE: convert-pdf-to-org.sh
#
#         USAGE: ./convert-pdf-to-org.sh
#
#   DESCRIPTION: Simple pdf-to-org conversion to view a PDF in Emacs running in terminal mode.
#
#       OPTIONS: ---
#  REQUIREMENTS: pandoc, poppler-utils
#          BUGS: ---
#         NOTES:  A quick and dirty way to convert a pdf to org
#        AUTHOR: Bhaskar Chowdhury (https://about.me/unixbhaskar), unixbhaskar@gmail.com
#  ORGANIZATION: Independent
#       CREATED: 05/24/24 02:01
#      REVISION:  ---
#===============================================================================

# License (GPL)

# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
# set -o nounset                              # Treat unset variables as an error

file="$1" # this is supposed to be a pdf file

if test -z "$file"; then
        echo "You are supposed to provide the pdf file"
        exit 1
fi

file_name="${file%.*}"


if test -n "$(command -v pdftotext)"; then
      sh -c "pdftotext '$file' - > converted.txt"          # extract plain text from the pdf
      sh -c "pandoc -t org converted.txt -o converted.org" # convert the text to org
      sh -c "tr -d '\f' < converted.org > strip.org"       # strip form feeds (page breaks)
      sh -c "sed -i '/^[0-9]/d' strip.org"                 # drop page-number lines
      sh -c "sed -i '/^\*/d' strip.org"                    # drop stray org headings
      mv "strip.org" "$file_name.org"
      emacsclient --tty "$file_name.org"
else
        echo "Oops! The required binary is missing."
fi

Well, your taste may be quite different from mine, and I just don’t care about that. People can take a cue from it and implement it in their own way, most probably a better way. I haven’t put enough time and energy into making it more “elegant and concise”, because I couldn’t.

You have to have poppler-utils installed on the system beforehand, and it is available from the core repositories of all the Linux distributions I know of. So, not a deal-breaker. Pandoc is popular and available too, so it is not a showstopper either.

I have called up several sub-shells to perform intentional steps. These days, the performance cost of this kind of processing affects the system minusculely or not at all.

The blemish the “shell experts” might point out is that I could have used pipes more vigorously, and they would probably be right. I am not making a case for my reluctance to make it more elegant than this ugly-looking stuff. I tried once during the build phase and it didn’t turn out as I was expecting, and damn! Furthermore, I was in a hurry (for some unknown reason; come to think of it, I am making an excuse 🙂 ) to get it done and over with.
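For what it’s worth, the cleanup stage could indeed be collapsed into one pipeline. Here is a hypothetical sketch of just the text-scrubbing part (the pdftotext and pandoc steps are left out; the helper name is my invention):

```shell
#!/usr/bin/env bash

# Hypothetical helper mirroring the tr/sed steps of the script above:
# strip form feeds, then drop page-number lines and stray org headings.
clean_converted() {
    tr -d '\f' | sed -e '/^[0-9]/d' -e '/^\*/d'
}

# prints two lines: 'keep me' and 'also keep'
printf 'keep me\n12 page number\n* stray heading\nalso keep\n' | clean_converted
```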

So, here you go: a minimal way of seeing a PDF in Emacs terminal mode.

Linux Kernel Development Essence In Brief

Alright, I am going to dish out some of the nuances of Linux kernel development in general. If you are not aware of or familiar with these, they might help you map your mental state to approach it in a more concise way.

Here are a few facts right off the bat; note them down:

One, you are supposed to have or inculcate patience about the proceedings (if you don’t have it, the process will teach you how to have some).

Second, you have to have some evidence of your proposed work in the kernel. That means you need to furnish some sort of work/WIP in real form to get people to be interested in your proposal.

Third, you are expected to give a clear-cut (without deviating or derailing) explanation of the queries asked by other people in that parlance. That will eventually build up confidence in people about you.

Fourth, adding new features to the kernel is very tempting, and most people’s inclination toward it is very prominent. But getting rid of the gotchas, old bugs, or stale code from the kernel is valued more (take note of it and print it, so you can stick it to the wall right in front of your workplace).

Fifth, swearing doesn’t help if your work is not in line with your swearing (overall it is a bad ploy and drives actual people away). So, put your head to work before you preach.

Sixth, you should or must be ready (if you aren’t, again, the process will teach you the lesson) to get stern feedback (often right) from a lot of people (consider yourself one lucky bastard that people who matter are paying attention to you). Instead of taking it as an emotional jolt, if you, again, put your thinking cap on and see it from their perspective, then you will be better off.

Seven, mistakes happen, and everyone makes them. You will not be an exception; that is written in stone, so don’t try to defy it. I have seen the best fall. But the smarter ones confess and move on. That will give a fillip to your chances of doing more.

Eight, nobody cares about your brilliance if it is not transferred into some tangible or vivid workable solution to a real-life problem. No matter how arcane your solution may be, the effort might register with the people who decide whether to work with you in the future in this parlance. Period.

Ninth, stick with what you can do best to help. We do not have a place for heroes in this project, and I am not aware of any. Your work should speak louder than your mouth, and that is it.

Ten, don’t try to be a master of everything; it is not possible, and the kernel is big. Nobody cares if you start to fiddle with everything. If you stick with your area of interest, then you might (and I said might) win some favor and get your work merged and used in real-life scenarios.

Eleven, we have more failures in the kernel than successes, believe it or not. So don’t lose heart if your work is not fathomable to the real people who drive the project (there are a handful, very prominent because of their long-lasting associations and involvement with the project).

Twelve, in the kernel, “low-hanging fruits” are rare, and meritocracy is defined differently from your academic way of thinking about it. A very important distinction; if you fail to get it, you might live your entire life with a false notion of “correctness”.

Last, but certainly not least, you are not that important, nor are you doing anyone a favor but yourself. How? Because you get a name by contributing to the best and most successful software project on earth, and some other people get the benefit of your knowledge to get better. If you don’t do it, someone else will, and you will be using their work to get better.

Oh, and it is very obvious: if you do not have the knack to do better for others in the truest sense, then you are doomed in this parlance. In the long run, you might gain that realization.

Gentoo Linux Custom Package Set Advantage

Alright, I am not sure about others who use this particular Linux distribution called Gentoo1. I do, and it has been ages since it became the primary choice of Linux distribution on my machines. The others are, namely, Debian and Slackware.

This is a very distribution-specific post, in which I am going to show you the specific stuff I do once the system is freshly installed on a machine. The trick is to get a certain set of packages that helps me get along with my day-to-day work on it.

Gentoo provides a facility called package sets2, where you can enlist your required packages and then get them all at once by invoking a single command.

For this to work, you *have to have a directory named sets* under /etc/portage/, and it should contain a file listing all the packages needed. Then you can call up that file like this:

emerge -av @must_have_packages

It will enlist the packages you mentioned in the file and ask for your consent to install them. My own custom package-set file looks like this:

[Screenshot: contents of my custom package-set file]
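For illustration only, a set file of this kind is just a plain list of package atoms, one per line; the entries below are assumptions, not my actual list:

```
# /etc/portage/sets/must_have_packages (hypothetical entries)
app-editors/vim
app-misc/tmux
net-misc/rsync
```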

When I want to install all of those, after a fresh install or under some other circumstances, I do:

[Screenshot: invoking emerge -av @must_have_packages in the shell]

...and the output looks like this:

[Screenshot: emerge output listing the set’s packages and asking for confirmation]

Cool! Right?

Now, you can see which package sets are provided by simply invoking this command in the shell:

[Screenshot: listing the available package sets]
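If memory serves, Portage can list the available sets directly with a single option; a sketch (check man emerge on your system):

```
sudo emerge --list-sets
```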

You can query a set’s contents like this; for instance, to find out what is in the system set:

sudo emerge -av @system

You can individually query all the sets listed. You might be interested in what the world set contains, as it is frequently used:

sudo emerge -av @world

You can create other package sets as per your requirements; it saves you from roaming around to find stuff.

Vim Generate PDF and HTML Notes Automatically With The Help Of Pandoc

Alright, this post is all about showing you the trivialities with one of my editors, i.e., Vim. Basically, automating the entire process of building PDF and HTML notes via some rudimentary keystrokes.

Pandoc is a damn good piece of software that does the heavy lifting behind the scenes.

Let me show, little by little, the process involved and, of course, the code that generates those things.

Vimrc configuration, which helps get it done via Vim

[Screenshot: vimrc key bindings that call the generation scripts]

See! I have designated key binds to call up the generation of specific things, and those essentially call up the required scripts.
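For a rough idea only, mappings of this kind could look like the following; the keys and script paths are assumptions for illustration, not my actual configuration:

```vim
" Hypothetical mappings: pass the current file (%) to a shell script
nnoremap <leader>np :!sh ~/scripts/gen_pdf_notes.sh %<CR>
nnoremap <leader>nh :!sh ~/scripts/gen_html_notes.sh %<CR>
```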

Generate PDF Notes

[Screenshot: the script that generates the PDF notes]

Bang! No fuss; a simple Pandoc call does the trick here.

Generate HTML Notes

[Screenshot: the script that generates the HTML notes]

Phew! No surprise here either; utterly ordinary stuff to get the job done.
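Neither step needs anything clever; at their core, both are single Pandoc calls. A minimal sketch, assuming the notes live in a Markdown file (the file names here are assumptions, not my actual setup):

```
pandoc notes.md -o notes.pdf      # PDF via Pandoc's default PDF engine
pandoc notes.md -s -o notes.html  # -s (--standalone) emits a complete HTML page
```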

How To Extract The Latest PDF Build From The Designated Directory

[Screenshot: the script that extracts the latest PDF build from the directory]

How To Extract The Latest HTML Build From The Designated Directory

[Screenshot: the script that extracts the latest HTML build from the directory]
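Both extraction steps boil down to sorting by modification time and taking the first entry. A minimal sketch, assuming a notes build directory (the path and function name are assumptions, not my actual scripts):

```shell
#!/bin/sh
# Hypothetical sketch: print the newest file of a given extension in a
# build directory. ls -t sorts by modification time, newest first.
newest_build() {
    ls -t "$1"/*."$2" 2>/dev/null | head -n 1
}

newest_build "$HOME/notes/build" pdf
newest_build "$HOME/notes/build" html
```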

I have stuck to minimal tools for this, and they do the damn job.

Okay, I call a predefined template file to insert specific fields into the note-taking file, plus a few predefined macros to fill in those fields. Here is what the template file looks like:

[Screenshot: the predefined template file]
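As a rough idea, a Pandoc-style metadata header of this shape would fit the description; the exact field list is an assumption on my part:

```
---
title:
author:
date:
---
```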

Okay, I put the cursor right next to the colon and press some keys (invoking macros) to fill in that field. For instance, if the cursor is on the Author field and I press au, that fills in my name. Likewise, on the date field, if I press dst, it pastes the current date into that field. So, I don’t have to fill in those fields manually; I just press the keystrokes to do the job.

Pretty ordinary, right? My requirements are minimal. However, you can extend the template file any time you wish to add more metadata to it.

Finally, if you are inclined to see how things panned out visually, with live action, then you might take a peek at my YouTube video.