My name is Bishal N. and I have over 5 years of experience in the tech industry. I specialize in the following technologies: Data Scraping, Scrapy, Python, Selenium, and API Integration. I hold a Bachelor's degree in Mathematics. Some of the notable projects I've worked on include: TikTok Music Chart Display and Ranking, Google Maps Details Extraction, Flight Details Extraction from an API, a Daily Scrape Crawler Bot, and a LinkedIn URL Finder for 1,000+ URLs. I am based in Pokhara, Nepal, and have successfully completed 6 projects as a developer at Softaims.
I approach every technical challenge with a mindset geared toward engineering excellence and robust solution architecture. I thrive on translating complex business requirements into elegant, efficient, and maintainable solutions. My expertise lies in diagnosing and optimizing system performance, ensuring that deliverables are fast, reliable, and future-proof.
The core of my work involves adopting best practices and a disciplined methodology, focusing on meticulous planning and thorough verification. I believe that sustainable solution development requires discipline and a deep commitment to quality from inception to deployment. At Softaims, I leverage these skills daily to build resilient systems that stand the test of time.
I am dedicated to making a tangible difference in client success. I prioritize clear communication and transparency throughout the development lifecycle to ensure every deliverable exceeds expectations.
Main technologies: Data Scraping, Scrapy, Python, Selenium, API Integration (between 1 and 5 years of experience per technology).

Previous employer: Leapfrog Technology.
Goal: fetch view counts for popular songs daily, store them in our database, calculate rankings with data-analytics logic, and display them in a table. I used two Python frameworks side by side: Scrapy for scraping and data processing, and Django for storage. The whole pipeline was orchestrated through Django management commands, and the table was rendered with Jinja2 and jQuery by calling an API built with Django REST Framework (DRF). I handled everything from scratch through deployment on AWS, using EC2, AWS Lambda, and an AWS database.
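The ranking step of a pipeline like this can be sketched as follows; the record shape and the function name are illustrative assumptions, not the project's actual code.

```python
def rank_songs(daily_counts):
    """Rank songs by view count, highest first.

    daily_counts: dict mapping song title -> views fetched that day
    (a stand-in for the rows stored by the Scrapy/Django pipeline).
    Returns a list of (rank, title, views) tuples ready for the table API.
    """
    ordered = sorted(daily_counts.items(), key=lambda kv: kv[1], reverse=True)
    return [(i + 1, title, views) for i, (title, views) in enumerate(ordered)]
```

In the real project a function like this would run inside a scheduled Django management command, with the result exposed through a DRF endpoint.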
I have completed several projects involving Google Maps extraction. Google Maps is one of the best places to find up-to-date leads. In these projects I used tools such as Python with Selenium, along with browser extensions, to grab the details required. The project was delivered ahead of the deadline.
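After Selenium pulls the raw text of a business panel, the details still need to be structured into a lead record. A minimal parsing sketch is below; the field layout and regex patterns are assumptions about what a scraped listing looks like, not the project's actual parser.

```python
import re

def parse_listing(text):
    """Pull a phone number and star rating out of raw listing text
    scraped from a Google Maps business panel (layout is assumed)."""
    phone = re.search(r"\+?\d[\d\s().-]{7,}\d", text)
    rating = re.search(r"(\d\.\d)\s*stars?", text, re.IGNORECASE)
    return {
        "phone": phone.group(0).strip() if phone else None,
        "rating": float(rating.group(1)) if rating else None,
    }
```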
In this project I extracted data from an API using OAuth2 authentication.
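A typical OAuth2 client-credentials token request looks like the sketch below; the endpoint URL and response field names are illustrative assumptions, and the `opener` parameter is only there so the HTTP call can be swapped out.

```python
import json
import urllib.parse
import urllib.request

def fetch_token(token_url, client_id, client_secret,
                opener=urllib.request.urlopen):
    """Perform an OAuth2 client-credentials token request and return
    the access token; subsequent API calls would send it as a
    'Authorization: Bearer <token>' header."""
    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    with opener(urllib.request.Request(token_url, data=data)) as resp:
        return json.loads(resp.read())["access_token"]
```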
I was first hired to set up a weekly cron job in the cloud, but, likely because of my fast turnaround, the client extended the engagement to cover more job-hunting sites and switch to daily scraping. The project scrapes several popular job-hunting sites for specific keywords. It runs every day at exactly 9 AM AEDT, and 15 minutes later it automatically uploads the results to a file named after the corresponding job site. The Google spreadsheet contains two worksheets: one for the daily scrape and one for all data scraped to date.

Technical details: I wrote the script in Python, using Scrapy, Selenium, BS4, pandas, and requests for crawling the sites. I used proxies and captcha handling to prevent unexpected script breakdowns, gspread for working with Google Sheets, and AWS EC2 to run the script in the cloud.
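Keeping the "all data" worksheet in sync with the daily scrape boils down to appending only unseen rows. A minimal sketch of that merge step, assuming rows are dicts keyed by a job URL (in the real project they would come from gspread):

```python
def merge_daily(all_rows, daily_rows, key="url"):
    """Append today's scraped jobs to the cumulative worksheet data,
    skipping rows whose URL was already captured on a previous run."""
    seen = {row[key] for row in all_rows}
    merged = list(all_rows)
    for row in daily_rows:
        if row[key] not in seen:
            merged.append(row)
            seen.add(row[key])
    return merged
```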
In this project I extracted the data with Python, using Scrapy and pandas to process it, and delivered the results as a JSON file per the client's requirement.
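The final delivery step can be sketched with the standard library alone; the function name and record shape are assumptions for illustration.

```python
import json

def to_json(records, path=None):
    """Serialize scraped records (a list of dicts) to pretty-printed
    JSON; returns the string and also writes it to `path` if given."""
    payload = json.dumps(records, indent=2, ensure_ascii=False)
    if path:
        with open(path, "w", encoding="utf-8") as fh:
            fh.write(payload)
    return payload
```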
Bachelor's degree in Mathematics