
Eman Ali M. - Cloud, Big Data and ETL Platforms

My name is Eman Ali M. and I have over 17 years of experience in the tech industry. I specialize in technologies including Project Management, Amazon Web Services, ETL pipelines, Apache NiFi, and AWS Lambda. I hold a Master's degree. Notable projects I've worked on include a Medical Data Platform and AI Strategy Implementation, a Telecom Data Lake Platform Modernization, an Advanced Data Pipeline Monitoring and Optimization System, a Multi-Source Big Data Lake Architecture, and an E-commerce Analytics and Intelligence Platform. I am based in Islamabad, Pakistan, and have successfully completed 24 projects as a developer at Softaims.

My expertise lies in deeply understanding and optimizing solution performance. I have a proven ability to profile systems, analyze data access methods, and implement caching strategies that dramatically reduce latency and improve responsiveness under load. I turn slow systems into high-speed performers.
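As a minimal illustration of the kind of caching strategy described above (a sketch with hypothetical names, not code from any of the listed projects), Python's `functools.lru_cache` can memoize an expensive lookup so repeated requests skip the slow backend:

```python
from functools import lru_cache

# Hypothetical call counter so we can observe cache hits; in a real
# system the body of customer_profile would query a database or service.
CALLS = {"count": 0}

@lru_cache(maxsize=1024)
def customer_profile(customer_id: int) -> dict:
    CALLS["count"] += 1  # incremented only on a cache miss
    # Simulated costly backend fetch (illustrative data only).
    return {"id": customer_id, "tier": "gold" if customer_id % 2 else "silver"}

# Two identical requests: the second is served from the cache.
customer_profile(42)
customer_profile(42)
# CALLS["count"] is 1 - the backend was hit only once.
```

The trade-off is the usual one for caching: `maxsize` bounds memory, while stale entries must be tolerable or explicitly invalidated (`customer_profile.cache_clear()`).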

I focus on writing highly efficient, clean, and well-documented code that minimizes resource consumption without sacrificing functionality. This dedication to efficiency is how I contribute measurable value to Softaims’ clients by reducing infrastructure costs and improving user satisfaction.

I approach every project with a critical eye for potential bottlenecks, proactively designing systems that are efficient from the ground up. I am committed to delivering software that sets the standard for speed and reliability.

Main technologies

  • Cloud, Big Data and ETL Platforms

    17 years

  • Project Management

    4 years

  • Amazon Web Services

    15 years

  • ETL Pipeline

    11 years

Additional skills

  • Project Management
  • Amazon Web Services
  • ETL Pipeline
  • Apache NiFi
  • AWS Lambda
  • Python
  • Business Intelligence
  • Scala
  • Big Data
  • Apache Airflow
  • Databricks Platform
  • Google Cloud
  • Apache Spark
  • Snowflake
  • PySpark

Direct hire

Potentially possible

Previous Company

Techlogix


Experience Highlights

Medical Data Platform and AI Strategy Implementation

  • Designed a comprehensive Data and AI Strategy for the medical domain, implementing advanced document digitization using the Vision API and Gemini, GPT, and Claude models.
  • Led the Data Team in architecting data-pipeline infrastructure, enabling the creation of datasets for dashboards and analytics.
  • Implemented a comprehensive Data Governance Strategy tailored to the medical domain, with a focus on data security.
  • Developed multi-agent systems using RAG, graph databases, and vector databases.
  • Architected ingestion pipelines for unstructured medical data, including PDF documents.

Telecom Data Lake Platform Modernization

  • Architected and managed an enterprise-level data lake platform for major telecommunications infrastructure, handling petabyte-scale data processing.
  • Led cost optimization initiatives for the Azure Databricks and Synapse platforms.
  • Introduced advanced data architecture patterns including Data Mesh, Data Fabric, and Data Vault.
  • Led the implementation of data governance policies and data lineage using Microsoft Purview and Databricks Unity Catalog.
  • Architected real-time streaming solutions for telecom data platform requirements.

Advanced Data Pipeline Monitoring and Optimization System

  • Designed a comprehensive monitoring and trigger system for multiple big data pipelines across diverse technology stacks including Hadoop, Spark, Airflow, Snowflake, and AWS EMR.
  • Architected a high-performance ELT pipeline in Snowflake using JavaScript stored procedures, processing 1.4 billion records and 1 TB of ingested data.
  • Deployed an advanced data profiling framework using Spark and Amazon Deequ for comprehensive data quality assessment.
  • Worked extensively with AWS services including SNS, SQS, S3, and Lambda via the boto3 Python library for seamless cloud integration.

Multi-Source Big Data Lake Architecture

  • Architected a comprehensive big data lake solution ingesting data from 200+ sources with an annual volume of 500+ TB using advanced big data technologies.
  • Designed and delivered ingestion pipelines and analytical workloads in Snowflake, Databricks, Delta Lake, Airflow, NiFi, AWS, and S3.
  • Implemented an enterprise monitoring framework for ingestion and analytics using Elasticsearch, Logstash, and Kibana (the ELK stack).
  • Deployed a production-grade data profiling framework using Spark and Amazon Deequ for continuous data quality monitoring.
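The data-quality checks a Deequ-based framework computes can be sketched in miniature. This is a pure-Python stand-in for two common metrics (the real framework mentioned above ran on Spark with Amazon Deequ); the function and metric names are illustrative.

```python
def profile(rows: list[dict], column: str) -> dict:
    """Toy data-profiling pass: completeness (fraction non-null) and
    distinctness (fraction of unique non-null values) for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    if not values:
        return {"completeness": 0.0, "distinctness": 0.0}
    return {
        "completeness": len(non_null) / len(values),
        "distinctness": len(set(non_null)) / len(values),
    }

# Tiny illustrative dataset: one null and one duplicate id.
rows = [{"id": 1}, {"id": 2}, {"id": 2}, {"id": None}]
stats = profile(rows, "id")
# completeness = 3/4, distinctness = 2/4
```

Deequ itself adds constraint suggestion, anomaly detection, and incremental metric computation on top of metrics like these.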

E-commerce Analytics and Intelligence Platform

  • Integrated the Amazon and eBay APIs with advanced shopping cart applications for comprehensive e-commerce data integration.
  • Created a sophisticated ETL pipeline transferring product, invoice, and customer data into Amazon S3 in optimized Parquet format for advanced reporting and analytics.
  • Conceived, developed, and deployed a comprehensive big data analytics engine for e-commerce applications, enabling advanced customer insights and business intelligence.
  • Implemented real-time data processing capabilities for e-commerce transaction analysis and customer behavior tracking.
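An ETL job landing Parquet files in S3, as described above, typically writes to Hive-style partitioned keys so query engines can prune by date. The path scheme below is an assumed convention for illustration, not taken from the actual pipeline:

```python
from datetime import date

def s3_parquet_key(entity: str, d: date, part: int) -> str:
    """Build a Hive-style partitioned S3 key, e.g.
    invoices/dt=2024-01-31/part-00000.parquet (hypothetical helper)."""
    return f"{entity}/dt={d.isoformat()}/part-{part:05d}.parquet"

key = s3_parquet_key("invoices", date(2024, 1, 31), 0)
# -> "invoices/dt=2024-01-31/part-00000.parquet"
```

With this layout, a query filtered on `dt` reads only the matching date prefixes instead of scanning every object under `invoices/`.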

Education

  • Barani Institute of Information Technology (BIIT)

    Master's in Computer Sciences

    1998-08-10 to 2002-03-10

Languages

  • English