
Uttam A. - Fullstack Developer, Artificial Intelligence, Cloud

My journey at Softaims has been defined by curiosity, growth, and collaboration. I’ve always believed that good software is not just built—it’s carefully shaped through understanding, exploration, and iteration. Every project I’ve worked on has taught me something new about how to balance simplicity with depth, and efficiency with creativity. At its core, my work revolves around helping businesses and people achieve more through thoughtful technology. I’ve learned that the most successful projects come from teams that communicate openly and stay adaptable. At Softaims, I’ve had the opportunity to work alongside professionals who challenge assumptions, share knowledge generously, and inspire continuous improvement. I take pride in focusing on the fundamentals—clarity in logic, consistency in design, and empathy in execution. Software is more than a set of features; it’s a reflection of how we think about problems and how we choose to solve them. By maintaining this perspective, I aim to build solutions that are not only effective today but also flexible enough to support the challenges of tomorrow. The culture at Softaims promotes learning as an ongoing process. Every new project feels like a step forward, both personally and professionally. I see each challenge as a chance to refine my skills and contribute to the shared vision of building technology that genuinely improves lives.

Main technologies

  • Fullstack Developer

    11 years

  • Dashboard

    3 years

  • Machine Learning

    1 year

  • Elasticsearch

    1 year

Additional skills

  • Dashboard
  • Machine Learning
  • Elasticsearch
  • Python
  • Node.js
  • Neo4j
  • FastAPI
  • LangChain
  • Generative AI
  • Vector Database
  • Pinecone
  • Grafana
  • LLM Prompt Engineering
  • Artificial Intelligence
  • Cloud

Direct hire

Potentially possible


Experience Highlights

AI-driven analysis tool using LangChain for a leading real estate firm

Client Background

Client: A leading real estate and financing firm with worldwide operations
Industry: Real Estate
Products & Services: Infrastructure development, financing, real estate
Organization Size: 10,000+

The Problem

Create a user-friendly data analysis tool capable of interpreting natural language queries and producing insightful analyses from CSV data. The tool should enable seamless interaction so that users can gain valuable insights without technical expertise. Key functionalities include data exploration, trend identification, pattern recognition, and anomaly detection, all presented in a comprehensible format. The tool must also handle CSV datasets efficiently while maintaining accuracy and reliability in its analyses.

Our Solution

  • Data ingestion and conversion: CSV data is acquired from a source (local file system, cloud storage, etc.) and converted into a pandas DataFrame using read_csv() or a similar pandas method.
  • Data cleaning: the DataFrame is cleaned so that it serves as ideal input for the Pandas Agent, including column data type conversion, handling duplicates, and dropping unnecessary columns.
  • Agent initialization: LangChain's Pandas Agent is initialized with the necessary parameters: a system prompt (custom or application-defined), a temperature controlling the randomness of the model's outputs, the model configuration to be used, and other parameters as required.
  • DataFrame integration: the cleaned DataFrame is passed to the Pandas Agent as its structured data source.
  • Natural language query interpretation: the user poses queries in natural language; the agent interprets them using a GPT-4 backend and converts them into executable operations on the DataFrame.
  • DataFrame operations: the agent executes the required operations, such as filtering rows or columns by criteria, aggregating summary statistics by group, transforming data (adding or removing columns, changing types), joining or merging DataFrames on common keys, and sorting.
  • Delivery to the end user: the processed output is delivered through a Streamlit interface, where the user can review the insights and refine their queries as needed.

Solution Architecture

(Architecture diagram omitted from this export.)

Deliverables

Data analysis tool with a Streamlit frontend.

Tech Stack

Tools used: LangChain, OpenAI GPT-4 API
Language/techniques used: Python
Models used: Pandas Agent, GPT-4
Skills used: Python, Streamlit, Streamlit Cloud deployment, LangChain
Web cloud servers used: Streamlit Cloud

Technical Challenges Faced During Project Execution

Making the tool follow Indian standards for financial-year quarters and currency, and report human-readable values instead of exponential notation.

How the Technical Challenges Were Solved

The challenge was solved by setting the temperature of the Pandas Agent to 0 and writing a custom system prompt that biases the model strongly toward the desired answer formats.

Business Impact

The user could obtain data analysis insights without expertise in Python, pandas, or the other tools involved, in a fraction of the time a manual analysis would have taken.
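As an illustrative footnote, here is a minimal sketch of the pipeline described under Our Solution, assuming the current langchain-openai and langchain-experimental packages (where create_pandas_dataframe_agent lives in recent releases). The file name sales.csv, the date column, and the INDIAN_FORMAT_PREFIX prompt text are assumptions for illustration, not artifacts from the project.

    # Minimal sketch of the CSV-to-agent pipeline; names marked below are assumed.
    import pandas as pd
    from langchain_openai import ChatOpenAI
    from langchain_experimental.agents import create_pandas_dataframe_agent

    # Illustrative system prompt biasing the agent toward Indian conventions
    # (financial-year quarters, INR, human-readable numbers), per the fix above.
    INDIAN_FORMAT_PREFIX = (
        "You analyse a pandas DataFrame. Use Indian financial-year quarters "
        "(Q1 = Apr-Jun), report currency in INR, and format numbers in a "
        "human-readable way (e.g. 12.5 lakh), never exponential notation."
    )

    # 1. Ingestion: load the CSV into a DataFrame (file name is hypothetical).
    df = pd.read_csv("sales.csv")

    # 2. Cleaning: fix dtypes, drop duplicates and unusable rows.
    df["date"] = pd.to_datetime(df["date"], errors="coerce")
    df = df.drop_duplicates().dropna(subset=["date"])

    # 3. Agent initialization: temperature 0 for deterministic answers,
    #    custom prefix to bias output formats, GPT-4 as the backend.
    agent = create_pandas_dataframe_agent(
        ChatOpenAI(model="gpt-4", temperature=0),
        df,
        prefix=INDIAN_FORMAT_PREFIX,
        allow_dangerous_code=True,  # the agent executes generated pandas code
        verbose=True,
    )

    # 4-7. A natural-language query is interpreted, translated into DataFrame
    #      operations, executed, and the answer returned (rendered in
    #      Streamlit in the actual tool).
    print(agent.invoke({"input": "What was total revenue in Q3 FY2023?"})["output"])

Setting the temperature to 0 and front-loading the format rules in the prefix mirrors the fix described above: it removes sampling randomness and maximizes the bias toward the desired output conventions.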

Data from CRM via Zapier to Google Sheets (Dynamic) to PowerBI

Client Background

Client: A leading solar panel firm in the USA
Industry: Energy
Services: Solar panels
Organization Size: 500+

The Problem

The client wants to keep track of sales data across its different sources and see a leadership dashboard for the organization in terms of sales. They also want to track their campaigns and the leads generated from each campaign source.

Our Solution

First, the data is fetched from the CRM into PowerBI. The CRM data is cleaned using DAX, calculations are performed on it, and KPIs are built in PowerBI from the cleaned data.

Solution Architecture

The project follows this data flow pipeline: CRM → Zapier → Google Sheets (dynamic) → PowerBI

Language/techniques used: PowerBI, DAX
Skills used: CRM, Zapier, PowerBI, Google Sheets

Technical Challenges Faced During Project Execution

  • Fetching the data from the CRM
  • Unclean data
  • Merging the data

How the Technical Challenges Were Solved

  • To fetch the data from the CRM, we used Zapier, a connector between two applications that pushes data into the second application whenever a particular event happens in the first. We used it to connect the CRM to Google Sheets, so that whenever a lead is created or modified the data is stored in the sheet.
  • The data in Google Sheets was unclean. We connected the sheet to PowerBI and performed EDA, cleaning the data with DAX.
  • Duplicate entries for a particular lead were resolved in PowerBI by merging the two tables with a one-to-one relationship.

Business Impact

Using this dashboard, the client can make important decisions, such as which campaigns generate the most leads and how many of those leads actually convert to sales. They can also track sales leadership, for example the employee of the month in terms of sales.
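The cleaning and deduplication in this project were done in DAX inside PowerBI; purely as an illustration of the same logic, here is a hedged pandas sketch. The file and column names (leads.csv, sales.csv, lead_id, updated_at, campaign, sale_id) are hypothetical.

    # Pandas illustration of the DAX clean-and-merge described above.
    import pandas as pd

    leads = pd.read_csv("leads.csv")   # rows appended by Zapier, one per update
    sales = pd.read_csv("sales.csv")

    # Zapier appends a row on every lead change, so keep only the latest
    # record per lead before joining.
    leads = (
        leads.sort_values("updated_at")
             .drop_duplicates(subset="lead_id", keep="last")
    )

    # One-to-one merge: each lead matches at most one sale, mirroring the
    # one-to-one relationship used between the two PowerBI tables.
    report = leads.merge(sales, on="lead_id", how="left", validate="one_to_one")

    # Example KPI: leads and conversions per campaign.
    kpi = report.groupby("campaign").agg(
        leads=("lead_id", "count"),
        sales=("sale_id", "count"),  # a non-null sale_id counts as a conversion
    )
    print(kpi)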

Data Management, ETL, and Data Automation

Client Background

Client: A leading tech firm in the USA
Industry: IT
Services: SaaS, products
Organization Size: 100+

Project Description

Businesses have more access to data than ever before in today's digital economy, and this information is used to make key business decisions. Businesses should invest in data management systems that increase visibility, dependability, security, and scalability, so that workers have the data they need for decision-making. The client wanted the data management process automated using a Python tool. Multiple operations such as merging, sorting, and filtering had to be performed on data from various sources, mainly CSV files and the results of SQL queries in PostgreSQL.

Our Solution

The solution consists of two tools for automatic, efficient data storage. The first tool concatenates all of the CSV files and merges them with the data from the SQL query; the resulting Excel file serves as input for the second tool. The second tool sorts, filters, and performs lookups on that file, adding columns useful for the client's analysis. The major goal is to assist the client with data management while requiring as little manual labour as possible: given the proper input files, the tools produce the needed data in an Excel file.

Project Deliverables

  • Excel Tool 1: generates an Excel file with two sheets, RSLTS IN and RSLTS OUT. RSLTS IN is obtained by concatenating all the CSV files in the Output folder; RSLTS OUT is the result of merging the data from the vwr egeas.sql query with RSLTS IN.
  • Excel Tool 2: creates another Excel file with one sheet, RSLTS, from CSV files such as vwr_instructions_new, vwr proto, and INST_RTR. It performs Excel operations such as lookups, arithmetic calculations, and merging of data from multiple sources.

Tools used: Custom Python scripts built for the data management and automation; PostgreSQL, used to merge the client's CSV files with the script output. The automation tool stores its results in Excel sheets.
Language/techniques used: Python (compiled and run in PyCharm), with the os, glob, pandas, numpy, and psycopg2 libraries.
Skills used: Configuration and data movement with PostgreSQL; tool automation; Python exception handling.
Databases used: Google Sheets and PostgreSQL.

Technical Challenges Faced During Project Execution

Some minor challenges were faced, such as data discrepancies generated during the automation process.

How the Technical Challenges Were Solved

The challenges were solved by reworking the automation tool and consulting with the client about their requirements.

Business Impact

Appropriate data management procedures are critical to the smooth running of a firm. Data management must be precise, cost-effective, and timely; failure to handle data well can result in costly consequences and lasting damage to a company's image, so every company needs a robust data management plan. One reason data management is critical to a firm's success is instant availability of information: it makes information easily available for quick access based on company needs, and it is essential for accounting procedures such as auditing and for strategy-based operations such as company planning. The more time spent hunting for misplaced files and missing documents, the less productive the team, and time is money.
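A minimal sketch of Excel Tool 1's concatenate-and-merge step, as described in the deliverables. The connection string, the join key id, and the output file name are assumptions for illustration; the query file name is taken verbatim from the write-up.

    # Sketch of Excel Tool 1: concatenate CSVs, merge with SQL data, write Excel.
    import glob

    import pandas as pd
    import psycopg2

    # RSLTS IN: concatenate every CSV in the Output folder.
    csv_paths = sorted(glob.glob("Output/*.csv"))
    rslts_in = pd.concat((pd.read_csv(p) for p in csv_paths), ignore_index=True)

    # Pull the SQL side of the merge from PostgreSQL; connection details
    # are placeholders.
    conn = psycopg2.connect("dbname=client user=etl password=...")
    with open("vwr egeas.sql") as f:
        sql_df = pd.read_sql_query(f.read(), conn)
    conn.close()

    # RSLTS OUT: merge the SQL data with the concatenated CSVs. The join
    # key ("id" here) is an assumption; the write-up does not name it.
    rslts_out = rslts_in.merge(sql_df, on="id", how="left")

    # Write both sheets to one Excel workbook, as Tool 1 does.
    with pd.ExcelWriter("tool1_output.xlsx") as xw:
        rslts_in.to_excel(xw, sheet_name="RSLTS IN", index=False)
        rslts_out.to_excel(xw, sheet_name="RSLTS OUT", index=False)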

Data Studio Dashboard with a data pipeline tool

The Problem

The client needs a consolidated KPI dashboard that aggregates data from various applications and SaaS products. The data is currently scattered across different platforms, making it difficult to track key performance indicators (KPIs) effectively. The client wants a dashboard that updates automatically with new data, eliminating the need for manual updates, with separate tabs for current-week sales, tickets, customer satisfaction, leads, conversion, company records, and finances.

CRM (Monday.com, Make.com) to Data Warehouse to Klipfolio Dashboard

Client Background

Client: A leading marketing firm in the USA
Industry: IT
Services: Marketing, promotions, campaigns, consulting, business growth
Organization Size: 100+

The Problem

The client requires a dashboard covering a "week in review" and "human resources". The dashboard should be dynamic: whenever the client opens it, it should show the current week, and it should also offer a dropdown for different time periods. The client requires meaningful KPIs on the dashboard.

Research Objectives

  • Objective 1: Get access to Monday.com, Make.com, Google Sheets, and Klipfolio.
  • Objective 2: Connect Monday.com data to the Google Sheet.
  • Objective 3: Integrate the data using Make.com.
  • Objective 4: Build KPIs using calculations and formulas to get meaningful insights.
  • Objective 5: Create a dashboard from the insights driven by the KPIs.

Solution Architecture

  • Data integration (Fig. 3.4; diagram omitted from this export)
  • Overall architecture (Fig. 3.4.2; diagram omitted from this export)

Tools used: Klipfolio, Make.com
Language/techniques used: Klip formulas
Skills used: Data integration, data processing, data visualization
Web cloud servers used: Google Sheets

Technical Challenges Faced During Project Execution

  • Mapping the values in Make.com from Monday.com
  • Every update on Monday.com adds a new row to the Google Sheet
  • Extracting insights from the data

How the Technical Challenges Were Solved

  • To map the values from Monday.com into Make.com, we obtained admin access so we could reach the column IDs on Monday.com.
  • In Make.com, we created multiple modules linked to each other by the row ID in the Google Sheet.
  • After completing the data integration, we used calculations to extract meaningful insights from the data.

Business Impact

Using this dashboard, the client can keep track of each employee's work in progress and analyze the nature of their workflow.
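The production logic here lives in Make.com modules and Klip formulas; as an illustration of the two recurring pieces of logic (keeping only the latest row per Monday.com item, and defaulting the "week in review" to the current week), here is a hedged pandas sketch with hypothetical file and column names (monday_export.csv, row_id, updated_at).

    # Pandas illustration of the dedup and current-week default described above.
    from datetime import date, timedelta

    import pandas as pd

    rows = pd.read_csv("monday_export.csv", parse_dates=["updated_at"])

    # Each Monday.com update appends a new row to the sheet, so keep only
    # the most recent row per item id.
    latest = (
        rows.sort_values("updated_at")
            .drop_duplicates(subset="row_id", keep="last")
    )

    # "Week in review" defaults to the current week (Monday through Sunday).
    today = date.today()
    week_start = today - timedelta(days=today.weekday())
    week_end = week_start + timedelta(days=6)
    mask = latest["updated_at"].dt.date.between(week_start, week_end)
    this_week = latest[mask]

    print(f"Items updated this week: {len(this_week)}")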

Education

  • National Institute of Technology, Bhopal, India

    Bachelor of Technology (BTech) in Data Science, Analytics, Dashboard, NLP, and Machine Learning

Languages

  • English