Does Aamir S. look like a good fit?

We can organize an interview with Aamir or any of our 25,000 available candidates within 48 hours. How would you like to proceed?

Schedule Interview Now

Aamir S. Cloud, DevOps and Infrastructure Platforms

My name is Aamir S., and I have over 5 years of experience in the tech industry. I specialize in technologies including Amazon API Gateway, Amazon Virtual Private Cloud, Amazon DynamoDB, DevOps, and Amazon ECS. I hold a Bachelor of Science (BS) and an Associate's degree. Notable projects I've worked on include Nebula X Gaming: Real‑Time Session Tracker & Events Platform, an ETL Pipeline, AWS Cloud for SaaS, AWS Cloud, and Data Lake infrastructure. I am based in Lahore, Pakistan, and have successfully completed 9 projects while developing at Softaims.

I thrive on project diversity, possessing the adaptability to seamlessly transition between different technical stacks, industries, and team structures. This wide-ranging experience allows me to bring unique perspectives and proven solutions from one domain to another, significantly enhancing the problem-solving process.

I quickly become proficient in new technologies as required, focusing on delivering immediate, high-quality value. At Softaims, I leverage this adaptability to ensure project continuity and success, regardless of the evolving technical landscape.

My work philosophy centers on being a resilient and resourceful team member. I prioritize finding pragmatic, scalable solutions that not only meet the current needs but also provide a flexible foundation for future development and changes.

Main technologies

  • Cloud, DevOps and Infrastructure Platforms

    5 years

  • Amazon API Gateway

    4 years

  • Amazon Virtual Private Cloud

    3 years

  • Amazon DynamoDB

    4 years

Additional skills

  • Amazon API Gateway
  • Amazon Virtual Private Cloud
  • Amazon DynamoDB
  • DevOps
  • Amazon ECS
  • AWS Fargate
  • AWS CloudFormation
  • Infrastructure as Code
  • AWS Lambda
  • Amazon S3
  • EventBus
  • Amazon CloudWatch
  • Cloud Architecture
  • CI/CD
  • ETL Pipeline

Direct hire

Possible

Previous Company

Techlogix

Ready to get matched with vetted developers fast?

Let's get started today!

Hire Remote Developer

Experience Highlights

Nebula X Gaming: Real‑Time Session Tracker & Events Platform

I built NXG’s cloud-based Session Tracker and Events platform, featuring a Next.js front end with MUI and NestJS microservices that stream data via Kafka and WebSockets. We store data in MongoDB, with Apache Pinot for live analytics, EventStoreDB for game streams, and Redis with BullMQ for job management. Deployed on Google Kubernetes Engine using Docker, Terraform, and Argo CD, it offers real-time updates under 300 milliseconds, supports over 50,000 users, maintains 99.97% uptime, and increased organic traffic by 120% through server-side rendering for SEO.

ETL Pipeline

A complete ETL pipeline built for custom integration with a management system. Each service runs inside a VPC, fronted by API Gateway and a Web Application Firewall. All data arrives as webhook events at API Gateway, after which Extraction, Transformation, and Loading stages are applied.
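The webhook ingestion step can be sketched as a Lambda handler behind API Gateway. The field names and the load step below are illustrative assumptions, not the project's actual schema:

```python
import json

def transform(record):
    """Normalize a raw webhook record into a target shape
    (field names here are hypothetical)."""
    return {
        "id": record["event_id"],
        "type": record["event_type"].lower(),
        "payload": record.get("data", {}),
    }

def handler(event, context):
    """API Gateway proxy handler: extract the webhook body,
    transform it, and acknowledge receipt."""
    raw = json.loads(event["body"])   # Extraction
    clean = transform(raw)            # Transformation
    # Loading (e.g. a DynamoDB put or queue publish) would go here.
    return {"statusCode": 200, "body": json.dumps({"accepted": clean["id"]})}
```

Keeping the transform pure and separate from the handler makes each stage testable on its own.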

AWS Cloud for SAAS

A custom SaaS application acting as an ETL pipeline: it periodically fetches data from multiple sources, processes it inside the AWS cloud, and loads it into a CRM. The pipeline is highly efficient and reusable, serving as a connector for building integrations across various SaaS applications.
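The reusable-connector idea can be sketched as a generic run loop where each integration plugs in its own fetch and transform steps while the load step stays shared; all names here are illustrative, not the project's actual API:

```python
def run_connector(fetch, transform, load):
    """Generic ETL run: pull records from a source, normalize them,
    and push them into the CRM. Returns the number of records loaded."""
    loaded = 0
    for raw in fetch():
        load(transform(raw))
        loaded += 1
    return loaded
```

A new integration then only supplies its own `fetch` and `transform`, which is what makes the pipeline reusable across SaaS sources.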

AWS Cloud

A data connector built for an MMS system, allowing secure connections through API Gateway and processing raw data in an S3 bucket. The raw data is then uploaded to DynamoDB, where stream events are routed through an event bus to perform multiple types of transformations. Each transformed record is then delivered to its respective destination.
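Handling DynamoDB stream events typically means iterating over typed record images; a minimal sketch of that step, assuming hypothetical attribute names since the real MMS schema isn't given:

```python
def process_stream(event):
    """Collect transformed outputs from INSERT/MODIFY stream records;
    in the real pipeline these would be routed onward via the event bus."""
    results = []
    for record in event.get("Records", []):
        if record.get("eventName") not in ("INSERT", "MODIFY"):
            continue  # skip REMOVE and other event types
        image = record["dynamodb"]["NewImage"]
        # DynamoDB stream images carry typed values, e.g. {"S": "abc"}
        results.append({
            "id": image["pk"]["S"],  # "pk" is a hypothetical key name
            "kind": image.get("kind", {}).get("S", "unknown"),
        })
    return results
```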

Data Lake infrastructure

A Data Lake pipeline built with an EventBridge cron scheduler, which periodically triggers a Lambda that exports data from DynamoDB to S3. S3 generates stream events that are processed through a bus and mapped into Glue catalogs using Glue jobs. The Glue job then compresses the data and stores it in separate S3 partitions. Once the data is in S3, Amazon Athena is used to query it in SQL, and the query results feed QuickSight, where BI dashboards and reports can be built.
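Partitioning the compressed exports by date is what lets Glue and Athena prune data when querying. A minimal sketch of building such a Hive-style prefix, with placeholder bucket and table names:

```python
from datetime import datetime, timezone

def s3_partition_key(table_name, now=None):
    """Build a Hive-style partition prefix (year=/month=/day=) so Glue
    and Athena can prune partitions when scanning exported data.
    The 'exports/' prefix and table name are placeholders."""
    now = now or datetime.now(timezone.utc)
    return (f"exports/{table_name}/"
            f"year={now.year}/month={now.month:02d}/day={now.day:02d}/")
```

Each day's export then lands under its own prefix (e.g. `exports/orders/year=2024/month=03/day=05/`), so a date-filtered Athena query only scans the matching partitions.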

Education

  • Government College University, Lahore

    Bachelor of Science (BS) in Computer Science

    2018–2022

  • Punjab Group of Colleges

    Associate's degree in ICS

    2015–2017

Languages

  • English
  • Urdu