Does Aamir S. look like a good fit?

We can arrange an interview with Aamir S. or any of our 25,000 available candidates within 48 hours. How would you like to proceed?

Schedule Interview Now

Aamir S. - Fullstack Developer, ETL Pipeline, Frontend

At Softaims, I have been fortunate to work in an environment that values creativity, precision, and long-term thinking. Each project presents a unique opportunity to transform abstract ideas into meaningful digital experiences that create real impact. I approach every challenge with curiosity and commitment, ensuring that every solution I design aligns not just with technical requirements, but also with human needs and business objectives.

One of the most rewarding aspects of my journey here has been learning how to bridge the gap between innovation and practicality. I believe technology should simplify complexity, enhance efficiency, and empower people to do more with less friction. Whether building internal systems, optimizing workflows, or helping bring client visions to life, my focus remains on developing solutions that stand the test of time.

Softaims has encouraged me to grow beyond coding and to think about design, communication, and sustainability in technology. I see every project as part of a larger ecosystem, where small details contribute to long-lasting results. My daily motivation comes from collaborating with people who share the same passion for doing meaningful work, and from seeing the tangible difference our efforts make for clients around the world.

More than anything, I value the culture of learning and improvement that defines Softaims. It’s a place where ideas evolve through teamwork and constructive feedback. My goal is to continue refining my craft, exploring new approaches, and contributing to solutions that are not only efficient but also elegant in their simplicity.

Main technologies

  • Fullstack Developer

    5 years

  • Amazon API Gateway

    3 years

  • Amazon Virtual Private Cloud

    4 years

  • Amazon DynamoDB

    3 years

Additional skills

  • Amazon API Gateway
  • Amazon Virtual Private Cloud
  • Amazon DynamoDB
  • DevOps
  • Amazon ECS
  • AWS Fargate
  • AWS CloudFormation
  • Infrastructure as Code
  • AWS Lambda
  • Amazon S3
  • EventBus
  • Amazon CloudWatch
  • Cloud Architecture
  • CI/CD
  • ETL Pipeline
  • Frontend

Direct hire

Possible

Ready to get matched with vetted developers fast?

Let’s get started today!

Hire Aamir S.

Experience Highlights

Nebula X Gaming: Real‑Time Session Tracker & Events Platform

I built NXG’s cloud-based Session Tracker and Events platform, featuring a Next.js front end with MUI and NestJS microservices that stream data via Kafka and WebSockets. We store data in MongoDB, with Apache Pinot for live analytics, EventStoreDB for game streams, and Redis with BullMQ for job management. Deployed on Google Kubernetes Engine using Docker, Terraform, and Argo CD, the platform delivers real-time updates in under 300 milliseconds, supports over 50,000 users, maintains 99.97% uptime, and increased organic traffic by 120% through server-side rendering for SEO.
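The core of that real-time path, Kafka events fanned out to browser clients over WebSockets, can be sketched roughly as follows. This is a minimal illustration rather than the actual NXG code: the topic name, event shape, ports, and room scheme are assumptions, and it uses kafkajs with socket.io directly in place of the full NestJS gateway.

```typescript
// Minimal Kafka-to-WebSocket relay sketch. Topic name, event shape,
// and ports are illustrative assumptions, not the actual NXG code.
import { Kafka } from "kafkajs";
import { Server } from "socket.io";

interface SessionEvent {
  sessionId: string;
  playerId: string;
  type: "start" | "heartbeat" | "end";
  timestamp: number;
}

const io = new Server(3001, { cors: { origin: "*" } });

// Clients join a room per session so they only receive relevant updates.
io.on("connection", (socket) => {
  socket.on("watch-session", (sessionId: string) => {
    socket.join(`session:${sessionId}`);
  });
});

const kafka = new Kafka({ clientId: "session-relay", brokers: ["localhost:9092"] });
const consumer = kafka.consumer({ groupId: "session-relay" });

async function main() {
  await consumer.connect();
  await consumer.subscribe({ topics: ["session-events"], fromBeginning: false });
  await consumer.run({
    eachMessage: async ({ message }) => {
      if (!message.value) return;
      const event: SessionEvent = JSON.parse(message.value.toString());
      // Fan the event out only to clients watching this session.
      io.to(`session:${event.sessionId}`).emit("session-event", event);
    },
  });
}

main().catch(console.error);
```

Room-scoped broadcasting keeps per-client traffic proportional to the sessions each client watches, which is one way a system like this can stay within a sub-300 ms update budget at 50,000+ users.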

ETL Pipeline

A complete ETL pipeline built for a custom integration with a management system. Each service runs inside a VPC, fronted by API Gateway and a Web Application Firewall. All data arrives as webhook events through API Gateway, after which the extraction, transformation, and loading stages are applied.
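A minimal sketch of the ingestion step under stated assumptions: API Gateway invokes a Lambda per webhook event, which extracts and normalizes the payload and then enqueues it for the downstream loading stage. The field names, header, and LOAD_QUEUE_URL environment variable are hypothetical.

```typescript
// Hypothetical webhook ingestion Lambda behind API Gateway.
// Field names and the destination queue are assumptions for illustration.
import { APIGatewayProxyHandler } from "aws-lambda";
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({});

export const handler: APIGatewayProxyHandler = async (event) => {
  // Extract: parse the raw webhook payload delivered through API Gateway.
  const payload = JSON.parse(event.body ?? "{}");

  // Transform: normalize the record into the pipeline's internal shape.
  const record = {
    externalId: payload.id,
    source: event.headers["x-webhook-source"] ?? "unknown",
    receivedAt: new Date().toISOString(),
    data: payload,
  };

  // Load: enqueue for the loading stage instead of writing to the
  // destination synchronously, so the webhook can respond quickly.
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: process.env.LOAD_QUEUE_URL, // assumed env var
      MessageBody: JSON.stringify(record),
    })
  );

  return { statusCode: 202, body: JSON.stringify({ accepted: true }) };
};
```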

AWS Cloud for SaaS

A custom SaaS application acting as an ETL pipeline: it periodically fetches data from multiple sources, processes it inside the AWS cloud, and loads it into a CRM. The pipeline is efficient and reusable, and can serve as a connector for building integrations across various SaaS applications.
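One plausible way to wire the periodic fetch, sketched with the AWS CDK in TypeScript. The stack and construct names, schedule, asset path, and CRM URL are assumptions for illustration, not the project's actual infrastructure code.

```typescript
// CDK sketch of the scheduled-fetch pattern: an EventBridge rule
// periodically triggers a Lambda that pulls from the sources and
// loads the CRM. Names, schedule, and paths are assumptions.
import { Stack, StackProps, Duration } from "aws-cdk-lib";
import { Construct } from "constructs";
import * as lambda from "aws-cdk-lib/aws-lambda";
import * as events from "aws-cdk-lib/aws-events";
import * as targets from "aws-cdk-lib/aws-events-targets";

export class EtlConnectorStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Lambda that fetches from the configured sources and loads the CRM.
    const fetcher = new lambda.Function(this, "SourceFetcher", {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: "index.handler",
      code: lambda.Code.fromAsset("lambda/fetcher"), // assumed asset path
      timeout: Duration.minutes(5),
      environment: { CRM_API_URL: "https://example-crm.invalid/api" }, // placeholder
    });

    // EventBridge rule drives the periodic fetch; every 15 minutes here.
    new events.Rule(this, "FetchSchedule", {
      schedule: events.Schedule.rate(Duration.minutes(15)),
      targets: [new targets.LambdaFunction(fetcher)],
    });
  }
}
```

Keeping the schedule in infrastructure code rather than in the application makes the same fetcher reusable across integrations by swapping the environment and schedule per deployment.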

AWS Cloud

A data connector built for an MMS system, providing secure access through API Gateway and processing raw data in an S3 bucket. The raw data is then loaded into DynamoDB, whose stream events feed an event bus that drives multiple types of transformations. The transformed data is then delivered to its respective destinations.
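The stream-to-bus hop might look roughly like this: a Lambda attached to the DynamoDB stream forwards newly inserted records onto an EventBridge bus, where rules route them to the individual transformation consumers. The bus name, source, and detail type are illustrative assumptions.

```typescript
// Sketch of the DynamoDB-stream-to-event-bus step. Bus name, source,
// and detail type are assumptions, not taken from the actual project.
import { DynamoDBStreamHandler } from "aws-lambda";
import { unmarshall } from "@aws-sdk/util-dynamodb";
import {
  EventBridgeClient,
  PutEventsCommand,
} from "@aws-sdk/client-eventbridge";

const eventBridge = new EventBridgeClient({});

export const handler: DynamoDBStreamHandler = async (event) => {
  const entries = event.Records.filter((r) => r.eventName === "INSERT").map(
    (r) => ({
      EventBusName: "mms-data-bus", // assumed bus name
      Source: "mms.connector",
      DetailType: "RawRecordInserted",
      // Cast needed: aws-lambda and @aws-sdk/util-dynamodb use
      // structurally identical but distinct AttributeValue types.
      Detail: JSON.stringify(unmarshall(r.dynamodb!.NewImage as any)),
    })
  );

  if (entries.length > 0) {
    // PutEvents accepts at most 10 entries per call; chunking is
    // omitted here for brevity.
    await eventBridge.send(new PutEventsCommand({ Entries: entries }));
  }
};
```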

Data Lake infrastructure

A data lake pipeline built around an EventBridge cron schedule that periodically triggers a Lambda to export data from DynamoDB to S3. Stream events generated from S3 are processed through an event bus and mapped into Glue catalogs by Glue jobs; a Glue job then compresses the data and stores it in separate S3 partitions. Once the data is in S3, Amazon Athena is used to query it in SQL, and the query results feed QuickSight, where Power BI-style dashboarding can be applied.
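A minimal sketch of the scheduled export step, assuming the Lambda uses DynamoDB's native point-in-time export to S3; the table ARN, bucket, and prefix are placeholders.

```typescript
// Sketch of the cron-triggered export: the EventBridge rule invokes
// this Lambda, which starts a DynamoDB point-in-time export to the
// data-lake bucket. Table ARN, bucket, and prefix are placeholders;
// the Glue/Athena stages pick the exported files up from S3.
import { ScheduledHandler } from "aws-lambda";
import {
  DynamoDBClient,
  ExportTableToPointInTimeCommand,
} from "@aws-sdk/client-dynamodb";

const ddb = new DynamoDBClient({});

export const handler: ScheduledHandler = async () => {
  const result = await ddb.send(
    new ExportTableToPointInTimeCommand({
      TableArn: process.env.TABLE_ARN, // assumed env var
      S3Bucket: "example-data-lake-raw", // placeholder bucket
      S3Prefix: `exports/${new Date().toISOString().slice(0, 10)}/`,
      ExportFormat: "DYNAMODB_JSON",
    })
  );
  console.log("Started export:", result.ExportDescription?.ExportArn);
};
```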

Education

  • Government College University, Lahore

Bachelor of Science (BS) in Computer Science

2018–2022

  • Punjab Group of Colleges

    Associate's degree in ICS

2015–2017

Languages

  • English
  • Urdu