My name is Gaurav T., and I have over 4 years of experience in the tech industry. I specialize in Automation, Make.com, Zapier, n8n, and API Integration. I hold a Bachelor of Engineering (BEng) degree. Notable projects I've worked on include: GHL - Wordpress Integration using Zapier, Scalable AWS ETL/ELT Pipeline for Real-Time & Batch Analytics, End-to-End Snowflake Data Warehouse for Customer 360 Analytics, Tuskr - Asana integration using Make.com, and Tuskr - Jira integration using Make.com. I am based in Pune, India, and have successfully completed 5 projects as a developer at Softaims.
I am a business-driven professional; my technical decisions are consistently guided by the principle of maximizing business value and achieving measurable ROI for the client. I view technical expertise as a tool for creating competitive advantages and solving commercial problems, not just as a technical exercise.
I actively participate in defining key performance indicators (KPIs) and ensuring that the features I build directly contribute to improving those metrics. My commitment to Softaims is to deliver solutions that are not only technically excellent but also strategically impactful.
I maintain a strong focus on the end-goal: delivering a product that solves a genuine market need. I am committed to a development cycle that is fast, focused, and aligned with the ultimate success of the client's business.
Our client was struggling to connect their LMS with user communications. With over 1,400 users enrolled in education courses, manually tracking progress and sending the appropriate emails had become impossible, and the client was unable to notify students based on where they were in their learning journey.
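The core of the fix was routing each learner to the right email based on their progress. A minimal sketch of that routing logic in Python is below; the template names and thresholds are assumptions for illustration (in the real project this branching lived in Zapier filter steps, not custom code):

```python
def pick_email_template(progress: dict) -> str:
    """Choose an email template from a learner's course progress.

    Template names and thresholds are hypothetical; the production
    routing was implemented as Zapier filters rather than code.
    """
    pct = progress["percent_complete"]
    if pct == 0:
        return "welcome_nudge"          # enrolled but never started
    if pct >= 100:
        return "completion_certificate" # finished the course
    if progress.get("days_inactive", 0) >= 7:
        return "re_engagement"          # started, then went quiet
    return "progress_update"            # actively learning
```

Expressing the journey as a single pure function keeps each branch easy to test, which is the same property the per-step Zap filters provide.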
Architected a hybrid AWS data pipeline to process both batch and real-time streams for analytics. For batch loads, I used Apache Airflow to orchestrate Spark ETL jobs, with dbt handling data modeling. Real-time data was ingested via EventBridge and Lambda, then processed using Spark Streaming. The final, analytics-ready data was centralized in Amazon Redshift, providing a single source of truth for BI tools like Tableau and enabling immediate, data-driven business decisions.
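On the real-time path, the Lambda between EventBridge and the streaming layer mainly validates and normalizes each event. A self-contained sketch of that handler is below; the field names (`user_id`, `event_type`) and return shape are assumptions, and the production function would write to a stream target rather than return the record:

```python
import json
from datetime import datetime, timezone

def lambda_handler(event: dict, context=None) -> dict:
    """Validate an incoming EventBridge record and shape it for the
    downstream Spark Streaming job (illustrative sketch only).
    """
    detail = event.get("detail", {})
    required = {"user_id", "event_type"}
    missing = required - detail.keys()
    if missing:
        # Reject malformed events early so bad rows never reach Redshift.
        return {"status": "rejected", "missing": sorted(missing)}
    record = {
        "user_id": detail["user_id"],
        "event_type": detail["event_type"],
        # Stamp ingestion time in UTC for consistent event-time analytics.
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
    # In production this record would be put onto the stream consumed
    # by Spark Streaming; here we return it for inspection.
    return {"status": "accepted", "record": record}
```

Keeping validation in the Lambda means the streaming job only ever sees well-formed records, which simplifies the Spark code downstream.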
Architected a central data warehouse in Snowflake to create a unified Customer 360 view from 5 siloed data sources. I designed an optimized star schema for high-performance analytics and built robust data ingestion pipelines using Apache Spark for batch ETL and Kinesis for real-time streams. This solution eliminated data fragmentation, providing the business with a single source of truth for integrated reporting and deep insights into the complete customer journey. The result was improved data quality and more effective decision-making.
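The star-schema idea above can be illustrated with a small Python sketch that splits denormalized source rows into a customer dimension and a fact table keyed by surrogate ids. The column names are assumptions; in the real project this shaping was done in Snowflake SQL and Spark:

```python
def build_star_schema(rows: list) -> tuple:
    """Split denormalized customer-event rows into a customer
    dimension and an event fact table (illustrative sketch).

    Each unique customer gets one surrogate key (`customer_sk`);
    facts reference the dimension only by that key.
    """
    dim_customer = {}
    fact_events = []
    for row in rows:
        key = row["customer_email"]
        if key not in dim_customer:
            dim_customer[key] = {
                "customer_sk": len(dim_customer) + 1,
                "email": key,
                "name": row["customer_name"],
            }
        fact_events.append({
            "customer_sk": dim_customer[key]["customer_sk"],
            "event_type": row["event_type"],
            "amount": row.get("amount", 0.0),
        })
    return list(dim_customer.values()), fact_events
```

Because descriptive attributes live only in the dimension, the fact table stays narrow, which is what makes star schemas fast to scan for analytics.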
Our client was struggling with their bug tracking process. They needed a better way for testers to report bugs to developers and to get notified when fixes were ready for retesting. The manual process was slow, and bugs were falling through the cracks.
Our client's QA and development teams were struggling with disconnected systems. When tests failed in Tuskr, testers had to manually create Jira tickets with all the relevant information. When developers fixed issues in Jira, testers had to manually update test statuses in Tuskr. This disjointed process led to inconsistent documentation, delayed bug fixes, and a lack of visibility across teams.
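The two-way sync described above boils down to two mappings: a failed Tuskr test becomes a Jira issue, and a "Done" Jira issue becomes a Tuskr status update. A minimal Python sketch of both directions follows; the field names, project key, and status values are assumptions (the real integration performed these mappings inside Make.com scenarios):

```python
def tuskr_failure_to_jira(test_result: dict) -> dict:
    """Map a failed Tuskr test result to a Jira issue payload.
    Project key and field names are hypothetical."""
    return {
        "fields": {
            "project": {"key": "QA"},
            # Embed the Tuskr case id in the summary so the reverse
            # mapping can recover it later.
            "summary": f"[Tuskr #{test_result['case_id']}] {test_result['title']}",
            "description": test_result.get("steps", ""),
            "issuetype": {"name": "Bug"},
        }
    }

def jira_done_to_tuskr(issue: dict):
    """When a Jira issue reaches Done, emit a Tuskr status update
    so the tester knows the case is ready for retest."""
    if issue["fields"]["status"]["name"] != "Done":
        return None
    # Recover the Tuskr case id embedded in the summary.
    case_id = issue["fields"]["summary"].split("#")[1].split("]")[0]
    return {"case_id": int(case_id), "status": "retest"}
```

Embedding the case id in the issue summary is one simple way to correlate records across the two tools without a separate lookup table; Make.com can equally store the mapping in a data store.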
Bachelor of Engineering (BEng) in Electrical Engineering
January 2012 to January 2016