Does Bohdan P. look like a good fit?

We can organize an interview with Bohdan P. or any of our 25,000 available candidates within 48 hours. How would you like to proceed?

Schedule Interview Now

Bohdan P. Python, API and Cloud Platforms

My name is Bohdan P., and I have over 9 years of experience in the tech industry. I specialize in Python, Flask, Django, API development, and Python scripting. I hold a Bachelor of Engineering (BEng) and a Master's degree in Software Engineering, and I am currently pursuing a PhD in Computer Science. Notable projects I've worked on include data scraping, processing of product data, an API for a data visualization dashboard, a website for a local newspaper, and an API for a support platform. I am based in Chernihiv, Ukraine, and I've successfully completed 8 projects while developing at Softaims.

My expertise lies in deeply understanding and optimizing solution performance. I have a proven ability to profile systems, analyze data access methods, and implement caching strategies that dramatically reduce latency and improve responsiveness under load. I turn slow systems into high-speed performers.
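As a minimal illustration of the kind of caching strategy described above (a hypothetical sketch, not code from any project listed here; `fetch_report` and its latency are made up), an in-process memoization layer in Python could look like:

```python
from functools import lru_cache
import time

@lru_cache(maxsize=1024)
def fetch_report(product_id: int) -> tuple:
    # Placeholder for an expensive lookup (database query, remote API call, ...).
    time.sleep(0.01)  # simulate latency
    return (product_id, product_id * 100)

# The first call pays the full cost; repeated calls are served from the cache.
fetch_report(42)
fetch_report(42)
print(fetch_report.cache_info().hits)  # at least one cache hit
```

In a real system the cache would more likely live in Redis or Memcached so it can be shared across processes, but the profiling-then-caching workflow is the same.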

I focus on writing highly efficient, clean, and well-documented code that minimizes resource consumption without sacrificing functionality. This dedication to efficiency is how I contribute measurable value to Softaims’ clients by reducing infrastructure costs and improving user satisfaction.

I approach every project with a critical eye for potential bottlenecks, proactively designing systems that are efficient from the ground up. I am committed to delivering software that sets the standard for speed and reliability.

Main technologies

  • Python, API and Cloud Platforms

    9 years

  • Python

8 years

  • Flask

1 year

  • Django

5 years

Additional skills

Direct hire: Potentially possible

Previous company: SoftServe

Ready to get matched with vetted developers fast?

Let's get started today!

Hire Remote Developer

Experience Highlights

Data scraping

Overview: This project centered on the development of a Python-based data scraping system designed to extract information from multiple online sources. The extracted data was then processed, cleaned, and structured to meet the specific requirements of the client, ensuring it was ready for immediate analysis and integration into their systems.

Key Features:

  • Data Extraction: Utilized Python libraries such as Scrapy and Beautiful Soup to efficiently scrape data from various websites, forums, and online databases.
  • Dynamic Adaptability: Designed the scraping system to be adaptable, allowing for the easy addition of new data sources or modifications to existing ones as client needs evolved.
  • Data Cleaning: Employed Pandas and NumPy to preprocess the scraped data, addressing issues like missing values, duplicates, and inconsistencies to ensure the highest data quality.
  • Data Transformation: Transformed the raw data into a structured format that aligned with the client's specifications, making it easier for them to integrate and analyze.
  • Error Handling: Implemented robust error handling mechanisms to manage potential scraping issues, such as changes in website structures or access restrictions, ensuring continuous data flow.
  • Data Delivery: Set up an automated delivery system that packaged and sent the prepared data to the client's desired destination, be it a database, cloud storage, or an API endpoint.

Outcome: The project successfully delivered a continuous stream of clean, structured, and up-to-date data from various sources directly to the client. This streamlined data acquisition process empowered the client to focus on data-driven decision-making without the overhead of data collection and preparation.
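The cleaning step described above can be sketched in plain Python (a hypothetical, simplified version for illustration; the project itself used Pandas and NumPy, and the field names here are invented):

```python
def clean_records(records):
    """Drop records with missing required fields, normalize whitespace
    in string values, and remove duplicates."""
    required = {"name", "price"}
    seen = set()
    cleaned = []
    for rec in records:
        # Skip records missing a required field or carrying a None value.
        if any(rec.get(field) is None for field in required):
            continue
        normalized = {
            k: v.strip() if isinstance(v, str) else v
            for k, v in rec.items()
        }
        # De-duplicate on the full normalized record contents.
        key = tuple(sorted(normalized.items()))
        if key in seen:
            continue
        seen.add(key)
        cleaned.append(normalized)
    return cleaned

raw = [
    {"name": " Widget ", "price": 9.99},
    {"name": "Widget", "price": 9.99},   # duplicate after normalization
    {"name": "Gadget", "price": None},   # missing required value
]
print(clean_records(raw))  # [{'name': 'Widget', 'price': 9.99}]
```

With Pandas the same pass would typically be `df.dropna(subset=[...])` followed by `df.drop_duplicates()`, but the logic is the same.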

Data processing of products data

Overview: This project involved the development of a comprehensive data processing pipeline using Python to analyze a vast dataset of product information. The primary objective was to derive actionable insights that could guide product development, marketing strategies, and inventory management.

Key Features:

  • Data Cleaning: Utilized libraries like Pandas and NumPy to preprocess the data, handling missing values, outliers, and inconsistencies to ensure data quality.
  • Feature Engineering: Enhanced the dataset by creating new features that captured the underlying patterns more effectively, aiding in subsequent analyses.
  • Trend Analysis: Identified historical trends in product sales, popularity, and customer feedback, providing a clear picture of product performance over time.
  • Reporting: Created an automated reporting system that generated periodic insights and recommendations for the product team, ensuring data-driven strategies.

Outcome: The project successfully transformed raw product data into meaningful insights, leading to more informed business decisions. The analyses played a pivotal role in optimizing product offerings, refining marketing campaigns, and improving inventory management practices.
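A trend-analysis step like the one above often reduces to smoothing a time series, for example with a trailing moving average. This is a hypothetical stdlib sketch (the actual pipeline was Pandas-based, and the sample figures are invented):

```python
def moving_average(values, window=3):
    """Trailing moving average used to smooth noisy monthly sales figures."""
    if window < 1 or window > len(values):
        raise ValueError("window must be between 1 and len(values)")
    return [
        sum(values[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(values))
    ]

monthly_sales = [90, 120, 150, 60, 120, 180]
print(moving_average(monthly_sales))  # [120.0, 110.0, 110.0, 120.0]
```

In Pandas the equivalent is `series.rolling(window=3).mean()`, which also handles the leading NaNs for you.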

API for data visualization dashboard

Overview: Developed a robust and scalable API to power a dynamic dashboard for tracking and visualizing finance data. This project was initiated to provide stakeholders with real-time insights, ensuring timely responses and strategic decision-making.

Key Features:

  • Data Integration: The API seamlessly integrates with various data sources, including customer feedback platforms, CRM systems, and sales databases, to fetch and consolidate relevant data.
  • Real-time Analytics: Implemented real-time data processing so that the dashboard displays the most up-to-date client feedback and requirements, enabling stakeholders to make informed decisions promptly.
  • Customizable Views: The API supports customizable dashboard views, allowing users to filter and drill down into specific metrics, timeframes, or client segments.
  • Security: Ensured that the API adheres to security best practices, including data encryption, authentication, and authorization mechanisms, to protect sensitive client information.
  • Scalability: Designed the API with scalability in mind, allowing it to handle increasing data volumes and user requests without compromising performance.
  • Feedback Loop Integration: Incorporated a mechanism for stakeholders to address client feedback or requirements directly from the dashboard, streamlining communication.

Outcome: The API has been instrumental in enhancing the company's client-centric approach. Stakeholders can now proactively address client needs, leading to improved client satisfaction rates and stronger business relationships.
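Behind a "customizable views" endpoint like the one described, the drill-down typically reduces to a pure filtering function over metric records. This is a hypothetical sketch (the record schema, field names, and sample data are invented, not taken from the project):

```python
from datetime import date

def filter_metrics(records, start=None, end=None, segment=None):
    """Restrict metric records to a date range and client segment,
    mirroring the filters a dashboard API would expose as query params."""
    result = []
    for rec in records:
        if start is not None and rec["date"] < start:
            continue
        if end is not None and rec["date"] > end:
            continue
        if segment is not None and rec["segment"] != segment:
            continue
        result.append(rec)
    return result

metrics = [
    {"date": date(2023, 1, 5), "segment": "retail", "value": 10},
    {"date": date(2023, 2, 9), "segment": "enterprise", "value": 25},
    {"date": date(2023, 3, 1), "segment": "retail", "value": 40},
]
print(filter_metrics(metrics, start=date(2023, 2, 1), segment="retail"))
```

In the real API such a function would sit behind a view that parses `?start=...&end=...&segment=...` from the request and paginates the result.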

Website for a local newspaper

A website for a long-running local newspaper with the ability to post news and articles.

Responsibilities:

  • Django backend development
  • Bootstrap 4 frontend development
  • Deployment

API for support platform

My client wanted a support platform that could relay messages from a Telegram bot chat to a web interface. I developed the API for this platform using Django REST Framework.
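A relay like this needs a mapping from Telegram's webhook payload to the platform's internal message model. The sketch below follows the shape of a Telegram Bot API `Update` object, but the internal record schema is hypothetical and simplified; the real project implemented this with Django REST Framework serializers:

```python
def telegram_update_to_message(update):
    """Map a Telegram Bot API update payload to a simplified internal
    message record for the support platform's web interface."""
    msg = update.get("message")
    if not msg or "text" not in msg:
        return None  # ignore non-text updates (stickers, member joins, ...)
    return {
        "external_id": msg["message_id"],
        "chat_id": msg["chat"]["id"],
        "author": msg["from"].get("username", "anonymous"),
        "text": msg["text"],
        "sent_at": msg["date"],  # Unix timestamp, as sent by Telegram
    }

update = {
    "update_id": 1001,
    "message": {
        "message_id": 7,
        "chat": {"id": 555},
        "from": {"username": "client42"},
        "date": 1700000000,
        "text": "My order hasn't arrived yet",
    },
}
print(telegram_update_to_message(update)["text"])  # My order hasn't arrived yet
```

The webhook view would call this on each incoming update, persist the record, and push it to the web interface.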

Education

  • Chernihiv State Technological University

    Doctor of Philosophy (PhD) in Computer Science

    2022-2026

  • Chernihiv State Technological University

    Bachelor of Engineering (BEng) in Software Engineering

    2014-2018

  • Chernihiv State Technological University

    Master's degree in Software Engineering

    2018-2020

Languages

  • English
  • Ukrainian