My name is Ambrose A., and I have over 5 years of experience in the tech industry. I specialize in Scrapy, Python, Selenium, data scraping, and data extraction. I hold a Bachelor of Science (BS) degree. Notable projects I've worked on include a jobs site scraper (Indeed, Glassdoor), a real estate scraper, a Yellow Pages scraper, and a WorldCity DB scraper. I am based in Nairobi, Kenya, and have successfully completed 4 projects while developing at Softaims.
I employ a methodical and structured approach to solution development, prioritizing deep domain understanding before execution. I excel at systems analysis, creating precise technical specifications, and ensuring that the final solution perfectly maps to the complex business logic it is meant to serve.
My tenure at Softaims has reinforced the importance of careful planning and risk mitigation. I am skilled at breaking down massive, ambiguous problems into manageable, iterative development tasks, ensuring consistent progress and predictable delivery schedules.
I strive for clarity and simplicity in both my technical outputs and my communication. I believe that the most powerful solutions are often the simplest ones, and I am committed to finding those elegant answers for our clients.
Main technologies
Scrapy: 5 years
Python: 3 years
Data Scraping: 3 years
Selenium: 3 years
The main goal here was to collect job posts from Indeed and Glassdoor, filtered by category and specified titles. The data was saved in a PostgreSQL database.
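A compressed sketch of that pipeline using only Python's standard library. The markup is a hypothetical job-board snippet (real Indeed/Glassdoor pages are fetched over HTTP and are far more complex), and an in-memory SQLite database stands in for the project's PostgreSQL instance:

```python
import sqlite3
from html.parser import HTMLParser

# Hypothetical sample of a job-board listing page; a real scraper would
# fetch this HTML over HTTP and handle much messier markup.
SAMPLE_HTML = """
<div class="job" data-category="engineering"><h2>Python Developer</h2></div>
<div class="job" data-category="data"><h2>Data Scraper</h2></div>
"""

class JobParser(HTMLParser):
    """Collects (category, title) pairs from <div class="job"> containers."""
    def __init__(self):
        super().__init__()
        self._category = None
        self._in_title = False
        self.jobs = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div" and attrs.get("class") == "job":
            self._category = attrs.get("data-category")
        elif tag == "h2" and self._category is not None:
            self._in_title = True

    def handle_data(self, data):
        if self._in_title:
            self.jobs.append((self._category, data.strip()))
            self._in_title = False
            self._category = None

def save_jobs(jobs, conn):
    """Persist scraped (category, title) rows into the database."""
    conn.execute("CREATE TABLE IF NOT EXISTS jobs (category TEXT, title TEXT)")
    conn.executemany("INSERT INTO jobs VALUES (?, ?)", jobs)
    conn.commit()

parser = JobParser()
parser.feed(SAMPLE_HTML)
conn = sqlite3.connect(":memory:")  # stand-in for the project's PostgreSQL DB
save_jobs(parser.jobs, conn)
```

Swapping SQLite for PostgreSQL is a one-line change to the connection (e.g. via a driver such as psycopg2); the parse-then-insert structure stays the same.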
The client's goal was to extract real estate data from myproperty.ph/apartment/buy/?q=Philppines and save it to an Excel spreadsheet. To achieve this, I implemented an automated Python script that: 1) fetched the web pages via the requests library, together with other supporting libraries; 2) singled out the HTML containers holding the relevant data; 3) scraped the fields into separate variables or lists of related information; and 4) saved the data in the desired format. I also integrated a third-party service, ScrapeOps, for IP rotation, making the scraping faster and safer.
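The four steps above can be sketched as follows. The markup, CSS class names, and fields are simplified stand-ins, and the network fetch is shown but not executed; the production script worked against the live site with a rotation service in front of it:

```python
import csv
import io
import re

# Step 1 (network, shown but not executed in this sketch):
# import requests
# html = requests.get("https://www.myproperty.ph/apartment/buy/?q=Philppines").text

# Hypothetical stand-in markup for two result containers.
html = """
<div class="listing"><span class="title">2BR Apartment</span>
<span class="price">PHP 5,000,000</span></div>
<div class="listing"><span class="title">Studio Unit</span>
<span class="price">PHP 2,300,000</span></div>
"""

# Steps 2-3: single out each container, then scrape its fields into tuples.
container_re = re.compile(r'<div class="listing">(.*?)</div>', re.S)
field_re = re.compile(r'<span class="(\w+)">(.*?)</span>', re.S)
rows = []
for block in container_re.findall(html):
    fields = dict(field_re.findall(block))
    rows.append((fields.get("title"), fields.get("price")))

# Step 4: save in the desired format (CSV here; a true Excel .xlsx export
# would use a library such as openpyxl instead).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["title", "price"])
writer.writerows(rows)
```

Regexes are adequate for a fixed, simple layout like this sample; on real pages an HTML parser (Scrapy selectors, BeautifulSoup) is more robust.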
The client's goal was to extract information about businesses in the US and save it to an Excel spreadsheet. To achieve this, I implemented an automated Python script that: 1) fetched the web pages via the requests library, together with other supporting libraries; 2) singled out the HTML containers holding the relevant data; 3) scraped the fields into separate variables or lists of related information; and 4) saved the data in the desired format. I also integrated third-party services for IP rotation, making the scraping faster and safer, and to bypass Cloudflare's bot detection.
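A minimal sketch of the IP-rotation idea on the client side, assuming a pool of proxy URLs (the addresses below are placeholders; in the actual project, rotation and Cloudflare handling were delegated to a third-party service behind a single endpoint):

```python
from itertools import cycle

# Placeholder proxy pool; a rotation service would supply real endpoints.
PROXIES = [
    "http://203.0.113.10:8080",
    "http://203.0.113.11:8080",
    "http://203.0.113.12:8080",
]

def rotating_proxies(pool):
    """Yield requests-style proxy dicts, cycling through the pool so
    successive requests go out through different IPs."""
    for proxy in cycle(pool):
        yield {"http": proxy, "https": proxy}

gen = rotating_proxies(PROXIES)
# With requests, each call would then use the next proxy in the cycle:
#   requests.get(url, proxies=next(gen), timeout=10)
first = next(gen)
```

Cycling through a pool like this spreads requests across IPs; a managed service adds the harder parts (proxy health, geolocation, anti-bot challenges) behind one URL.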
The client's goal was to extract cities from all around the world and save the data to an Excel spreadsheet. To achieve this, I implemented an automated Python script that: 1) fetched the web pages via the requests library, together with other supporting libraries; 2) singled out the HTML containers holding the relevant data; 3) scraped the fields into separate variables or lists of related information; and 4) saved the data in the desired format. I also integrated third-party services for IP rotation, making the scraping faster and safer, and to bypass Cloudflare's bot detection.
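The final "save to spreadsheet" step can be sketched like this. The city rows and field names are illustrative stand-ins for scraped data, and CSV (which Excel opens directly) stands in for the delivered spreadsheet:

```python
import csv
import os
import tempfile

# Illustrative rows; the real script built these from scraped pages.
cities = [
    {"city": "Nairobi", "country": "Kenya"},
    {"city": "Manila", "country": "Philippines"},
]

def save_spreadsheet(rows, path):
    """Write dict rows to a CSV file with a header; a native .xlsx export
    would use a library such as openpyxl instead."""
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["city", "country"])
        writer.writeheader()
        writer.writerows(rows)

path = os.path.join(tempfile.gettempdir(), "world_cities.csv")
save_spreadsheet(cities, path)
```

Using `csv.DictWriter` keeps the column order explicit and tolerant of scraped records arriving as dicts rather than positional tuples.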
Bachelor of Science (BS) in Information Technology (Informatics)
January 2016 – January 2020