Does Demeritius G. look like a good fit?

We can organize an interview with Demeritius G. or any of our 25,000 available candidates within 48 hours. How would you like to proceed?

Schedule Interview Now

Demeritius G.

Cloud, Data Engineering and AI Platforms

My name is Demeritius G., and I have over 9 years of experience in the tech industry. I specialize in Python, SQL, Amazon Web Services, data warehousing, and ETL pipelines, among other technologies. I hold an Associate of Arts and Sciences (AAS) and a Bachelor of Arts (BA). Notable projects I have worked on include an Airflow full ETL pipeline, data manipulation using PySpark, data wrangling projects, and a data pipeline automated with Apache Airflow. I am based in Spokane, United States, and have successfully completed 4 projects at Softaims.

I value a collaborative environment where shared knowledge leads to superior outcomes. I actively mentor junior team members, conduct thorough quality reviews, and champion engineering best practices across the team. I believe that the quality of the final product is a direct reflection of the team's cohesion and skill.

My experience at Softaims has refined my ability to effectively communicate complex technical concepts to non-technical stakeholders, ensuring project alignment from the outset. I am a strong believer in transparent processes and iterative delivery.

My main objective is to foster a culture of quality and accountability. I am motivated to contribute my expertise to projects that require not just technical skill, but also strong organizational and leadership abilities to succeed.

Main technologies

  • Cloud, Data Engineering and AI Platforms

    9 years

  • Python

    2 years

  • SQL

    3 years

  • Amazon Web Services

    6 years

Additional skills

  • Python
  • SQL
  • Amazon Web Services
  • Data Warehousing
  • ETL Pipeline
  • Apache Spark
  • Data Analysis
  • Apache Airflow
  • Data Mining
  • AWS Glue
  • CI/CD
  • Governance, Risk Management & Compliance
  • Information Security Governance
  • Data Management
  • AWS Lambda
  • Data Engineering
  • Data Science

Direct hire

Possible

Previous Company

Amazon


Experience Highlights

Airflow Full ETL Pipeline

Built a complex Airflow pipeline integrating API + S3 data into a data warehouse. Applied transformations, data quality checks, and warehouse modeling with automated scheduling, retries, and monitoring.
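The data-quality checks this pipeline applied can be sketched in plain Python. This is a minimal illustration only; the field names and validation rules below are assumptions, not details from the actual project.

```python
# Minimal sketch of a row-level data-quality gate, as a pipeline might
# run before loading records into the warehouse. Field names ("order_id",
# "amount") and rules are illustrative assumptions.

def validate_row(row):
    """Return a list of quality violations for one record (empty = clean)."""
    errors = []
    if not row.get("order_id"):
        errors.append("missing order_id")
    if row.get("amount") is None or row["amount"] < 0:
        errors.append("amount must be a non-negative number")
    return errors

def split_clean_and_rejected(rows):
    """Partition rows into (clean, rejected) ahead of the load step."""
    clean, rejected = [], []
    for row in rows:
        (clean if not validate_row(row) else rejected).append(row)
    return clean, rejected
```

In a scheduled Airflow run, a check like this would typically sit between the transform and load tasks, with rejected rows routed to monitoring rather than silently dropped.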

Data manipulation using PySpark

This project showcases my ability to perform scalable data manipulation and analysis using PySpark, a key skill for modern data engineering. The goal was to process and analyze flight performance data to identify airlines and flights with the most significant delays, using best practices in distributed data processing.
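The core computation here, ranking carriers by average delay, can be illustrated in plain Python. In the project itself this would run as a distributed PySpark groupBy/aggregate over the full dataset; the column shape below is an assumption for illustration.

```python
from collections import defaultdict

# Illustrative stand-in for the distributed aggregation: compute the
# average delay per airline and rank carriers, worst first. In PySpark
# the same result would come from a groupBy("airline") with avg("delay").

def average_delay_by_airline(flights):
    """flights: iterable of (airline, delay_minutes) pairs."""
    totals = defaultdict(lambda: [0.0, 0])  # airline -> [sum, count]
    for airline, delay in flights:
        totals[airline][0] += delay
        totals[airline][1] += 1
    return {a: s / c for a, (s, c) in totals.items()}

def worst_airlines(flights, n=3):
    """Top-n airlines by average delay, worst first."""
    avgs = average_delay_by_airline(flights)
    return sorted(avgs.items(), key=lambda kv: kv[1], reverse=True)[:n]
```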

Data Wrangling Projects

These projects include pipelines that clean, enrich, and transform raw sales and customer data, delivering validated, analytics-ready datasets with quality checks, feature engineering, and aggregated reports to support business insights.
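The clean, enrich, and aggregate pattern these projects describe can be sketched in plain Python. The fields ("region", "units") and the derived "revenue" feature are illustrative assumptions, not details from the actual datasets.

```python
def clean(records):
    """Drop records missing required fields and normalize text casing."""
    return [
        {**r, "region": r["region"].strip().title()}
        for r in records
        if r.get("region") and r.get("units") is not None
    ]

def enrich(records, unit_price):
    """Feature engineering: derive a revenue field from unit counts."""
    return [{**r, "revenue": r["units"] * unit_price} for r in records]

def aggregate_by_region(records):
    """Aggregated report: total revenue per region."""
    report = {}
    for r in records:
        report[r["region"]] = report.get(r["region"], 0) + r["revenue"]
    return report
```

Chaining the three steps turns messy raw records into one validated, analytics-ready summary, which is the shape of output these projects deliver.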

Data Pipeline automated with Apache Airflow

Designed Airflow DAGs to orchestrate ETL pipelines: one modular ETL demo (extract → transform → load) and one production-style workflow integrating AWS S3, Pandas, and Postgres for automated data processing and analytics.
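The extract → transform → load shape of such a DAG can be sketched in plain Python. In Airflow, each function below would be wrapped in its own task with scheduling and retries handled by the scheduler; the sample data and parsing step are illustrative assumptions.

```python
def extract():
    """Extract step: in the real workflow this would read from S3 or an API."""
    return [{"id": 1, "value": "10"}, {"id": 2, "value": "oops"}]

def transform(rows):
    """Transform step: cast values, dropping rows that fail to parse."""
    out = []
    for row in rows:
        try:
            out.append({**row, "value": int(row["value"])})
        except ValueError:
            continue  # a production DAG might route these to a dead-letter table
    return out

def load(rows, sink):
    """Load step: in the real workflow this would write to Postgres."""
    sink.extend(rows)
    return len(rows)

# Sequential run; an Airflow DAG would express the same dependency chain
# as extract >> transform >> load.
warehouse = []
loaded = load(transform(extract()), warehouse)
```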

Education

  • Spokane Community College

    Associate of Arts and Sciences (AAS) in Architectural Technology

    2004-2006

  • DeVry University

    Bachelor of Arts (BA) in Information Systems Security

    2010-2014

Languages

  • English