GPT-3 is not an answer to everything.

[Image: Transformer architecture]

While chatting with an acquaintance over WhatsApp, I realised that there is a natural human tendency to reach for the biggest, most complex, or most recent tool wherever possible. I think using GPT-3 anywhere and everywhere falls within this tendency to go overboard.

For those of you who do not know what GPT-3 is: it is a transformer-based language model (Generative Pre-trained Transformer). Transformer architectures have gotten a lot of traction lately due to their performance on natural language tasks, like generating text from a context (writing poems, stories, tweets) or handling generic NLP tasks like translation, summarization, and question answering.
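To make this concrete, here is a minimal sketch of what transformer-based text generation looks like in practice. GPT-3 itself is only reachable through OpenAI's API, so this example uses GPT-2, its openly released predecessor, via the Hugging Face transformers library; the prompt and generation settings are purely illustrative.

```python
# A minimal text-generation sketch using the Hugging Face `transformers` library.
# GPT-3 is only available through OpenAI's API, so we use GPT-2, its openly
# released predecessor, purely for illustration.
from transformers import pipeline

# Load a small, openly available GPT-style model.
generator = pipeline("text-generation", model="gpt2")

# Generate a continuation of a prompt. `max_length` and
# `num_return_sequences` are illustrative knobs, not tuned values.
outputs = generator(
    "Transformers have changed natural language processing because",
    max_length=50,
    num_return_sequences=1,
)
print(outputs[0]["generated_text"])
```

The same pipeline interface covers the other tasks mentioned above (summarization, translation, question answering) by swapping the task name and model.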

There are different transformer architectures suited to different tasks. Over the last few years, a few groups have been trying to build and train ever bigger transformer architectures (not necessarily better ones). GPT-3 is a result of such efforts.

I am not at all saying that GPT-3 is not great. It is. Anyone lucky enough to test it would testify to that. Its text-generation performance is outstanding (even better than a few of my friends!), and new use cases are coming up daily. But the problem is, you shouldn't try to solve everything with this mega architecture. It is very big, and thus very costly to train or to host in the cloud. Even now, when we face so many problems just taking simple deep learning solutions to production, we need to ask whether such big models can be integrated into daily real-life use cases in a feasible manner. GPT-3 contains 175 billion parameters. Let's take a moment to let that fact sink in. Far simpler deep learning models, whose weights take up more than 1 GB, already create both financial and logistical problems when deploying and moving to production. This will seriously hurt adoption once the fad of "using bleeding edge AI" settles.
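To put that parameter count in perspective, here is a rough back-of-envelope calculation (my own, not from the GPT-3 paper): just storing 175 billion weights, before counting activations, optimizer state, or any serving overhead, already runs into hundreds of gigabytes.

```python
# Back-of-envelope memory footprint for storing model weights alone
# (ignores activations, optimizer state, and serving overhead).
def weights_size_gb(num_params: int, bytes_per_param: int) -> float:
    """Approximate size of the weight tensors in gigabytes."""
    return num_params * bytes_per_param / 1e9

GPT3_PARAMS = 175_000_000_000  # 175 billion parameters

print(f"fp32: {weights_size_gb(GPT3_PARAMS, 4):,.0f} GB")  # ~700 GB
print(f"fp16: {weights_size_gb(GPT3_PARAMS, 2):,.0f} GB")  # ~350 GB
```

Compare that with the ~1 GB models mentioned above, which already strain deployments: GPT-3's weights alone are two to three orders of magnitude larger than what most production setups comfortably handle.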

So beware, guys. Don't assume that GPT-3 (or other such megatons) will solve all your problems. At least, not yet.
