My name is Demeritius G. and I have over 9 years of experience in the tech industry. I specialize in Python, SQL, Amazon Web Services, data warehousing, and ETL pipelines. I hold an Associate of Arts and Sciences (AAS) and a Bachelor of Arts (BA). Notable projects I have worked on include a full ETL pipeline in Airflow, data manipulation with PySpark, data wrangling projects, and a data pipeline automated with Apache Airflow. I am based in Spokane, United States, and have completed 4 projects as a developer at Softaims.
I value a collaborative environment where shared knowledge leads to superior outcomes. I actively mentor junior team members, conduct thorough quality reviews, and champion engineering best practices across the team. I believe that the quality of the final product is a direct reflection of the team's cohesion and skill.
My experience at Softaims has refined my ability to effectively communicate complex technical concepts to non-technical stakeholders, ensuring project alignment from the outset. I am a strong believer in transparent processes and iterative delivery.
My main objective is to foster a culture of quality and accountability. I am motivated to contribute my expertise to projects that require not just technical skill, but also strong organizational and leadership abilities to succeed.
Main technologies: Python, SQL, Amazon Web Services, Data Warehousing, ETL pipelines (9 years of overall experience).
Built a complex Airflow pipeline integrating API + S3 data into a data warehouse. Applied transformations, data quality checks, and warehouse modeling with automated scheduling, retries, and monitoring.
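The automated retries mentioned above can be sketched in plain Python; this is a simplified stand-in for what Airflow's `retries` and `retry_delay` task settings provide, with a hypothetical flaky extract step for illustration:

```python
import time

def run_with_retries(task, max_retries=3, delay_seconds=0.01):
    """Run a pipeline task, retrying on failure (a simplified version of
    Airflow's per-task retry behavior)."""
    for attempt in range(1, max_retries + 1):
        try:
            return task()
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(delay_seconds)  # back off before the next attempt

# Hypothetical flaky extract step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient API error")
    return [{"id": 1}, {"id": 2}]

rows = run_with_retries(flaky_extract)
print(len(rows), calls["n"])  # → 2 3
```

In a real DAG the scheduler handles this, so task code stays free of retry loops; the sketch only shows the underlying idea.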
This project showcases my ability to perform scalable data manipulation and analysis using PySpark, a key skill for modern data engineering. The goal was to process and analyze flight performance data to identify airlines and flights with the most significant delays, using best practices in distributed data processing.
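The core aggregation behind that analysis, finding the airline with the worst average delay, can be sketched in plain Python; in PySpark it would be roughly `df.groupBy("airline").agg(avg("delay"))`. The flight records here are toy data for illustration:

```python
from collections import defaultdict

# Toy flight records; the real project read these into a PySpark DataFrame.
flights = [
    {"airline": "AA", "delay": 30},
    {"airline": "AA", "delay": 10},
    {"airline": "DL", "delay": 5},
    {"airline": "UA", "delay": 60},
]

# Plain-Python equivalent of groupBy("airline").agg(avg("delay")).
totals = defaultdict(lambda: [0, 0])  # airline -> [delay_sum, count]
for f in flights:
    t = totals[f["airline"]]
    t[0] += f["delay"]
    t[1] += 1

avg_delay = {a: s / c for a, (s, c) in totals.items()}
worst = max(avg_delay, key=avg_delay.get)
print(worst, avg_delay[worst])  # → UA 60.0
```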
These projects show: (1) Pipelines to clean, enrich, and transform raw sales & customer data. (2) Delivering validated, analytics-ready datasets with quality checks, feature engineering, and aggregated reports to support business insights.
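A minimal sketch of the validation-plus-enrichment step described in (2), assuming hypothetical field names and an assumed 8% tax rate purely for illustration:

```python
def validate_and_enrich(records):
    """Drop rows failing basic quality checks, then add a derived field
    (a simplified version of the validation/feature-engineering steps)."""
    clean = []
    for r in records:
        # Quality checks: required id present, non-negative amount.
        if r.get("customer_id") is None or r.get("amount", -1) < 0:
            continue
        enriched = dict(r)
        enriched["amount_with_tax"] = round(r["amount"] * 1.08, 2)  # assumed 8% tax
        clean.append(enriched)
    return clean

raw = [
    {"customer_id": 1, "amount": 100.0},
    {"customer_id": None, "amount": 50.0},  # fails check: missing id
    {"customer_id": 2, "amount": -5.0},     # fails check: negative amount
]
print(validate_and_enrich(raw))
# → [{'customer_id': 1, 'amount': 100.0, 'amount_with_tax': 108.0}]
```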
Designed Airflow DAGs to orchestrate ETL pipelines: one modular ETL demo (extract → transform → load) and one production-style workflow integrating AWS S3, Pandas, and Postgres for automated data processing and analytics.
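The modular extract → transform → load shape can be sketched as three plain functions; the data and the in-memory sink are stand-ins for the S3 source and Postgres target, not the actual project code:

```python
def extract():
    # Stand-in for pulling rows from S3 or an API; hypothetical data.
    return [{"name": "alice", "score": "10"}, {"name": "bob", "score": "7"}]

def transform(rows):
    # Normalize names and cast types, as a transform task would.
    return [{"name": r["name"].title(), "score": int(r["score"])} for r in rows]

def load(rows, sink):
    # Stand-in for a Postgres insert; appends to an in-memory sink.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0])  # → 2 {'name': 'Alice', 'score': 10}
```

In Airflow, each function would become its own task so the scheduler can retry and monitor the stages independently.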
Associate of Arts and Sciences (AAS) in Architectural Technology
2004 – 2006
Bachelor of Arts (BA) in Information Systems Security
2010 – 2014