My name is Nehuen V. and I have over 3 years of experience in the tech industry. I specialize in the following technologies: Scrapy, Python, Data Scraping, Beautiful Soup, and Selenium, among others. I hold a high school degree. Some of the notable projects I’ve worked on include: a Client Management AI Agent built with n8n; a LinkedIn Post Generator Automation with Google Gemini AI & n8n; an n8n Notion Finance Tracker AI Agent connected to Telegram; Large Scale Video Download and Upload Automation Using Python; and a Full Data Extraction and Automation Project for 4 Real Estate Websites. I am based in Lomas de Zamora, Argentina, and have successfully completed 20 projects while working at Softaims.
I thrive on project diversity, possessing the adaptability to seamlessly transition between different technical stacks, industries, and team structures. This wide-ranging experience allows me to bring unique perspectives and proven solutions from one domain to another, significantly enhancing the problem-solving process.
I quickly become proficient in new technologies as required, focusing on delivering immediate, high-quality value. At Softaims, I leverage this adaptability to ensure project continuity and success, regardless of the evolving technical landscape.
My work philosophy centers on being a resilient and resourceful team member. I prioritize finding pragmatic, scalable solutions that not only meet the current needs but also provide a flexible foundation for future development and changes.
Main technologies
Scrapy, Python, Beautiful Soup, Selenium — between 1 and 3 years of experience in each.
An AI-powered Client Management Assistant built with n8n, Slack, Gmail, and Google Sheets. It automates weekly client updates, generates polished summaries, and instantly answers project queries in Slack. Designed for agencies, SaaS teams, and freelancers, it saves 6–11+ hours weekly by removing manual reporting, reducing context switching, and keeping communication consistent. Fully customizable to fit any client operations workflow.
A Google Gemini LinkedIn Post Generator that helps professionals create polished posts from articles on a given topic in seconds. Built with n8n, it generates tailored content based on three articles on any topic (in this case, in the field of AI) and saves them to a Google Sheet for reviewing. Saves marketers, founders, and busy professionals 3–6 hours weekly by eliminating writer’s block, ensuring consistent posting, and maintaining a professional brand voice. Fully customizable to adapt to any industry or style.
A smart Notion Finance AI Assistant that automates expense tracking via Telegram. Add new expenses, convert currencies, and get instant spending summaries without opening Notion. Built with n8n, Telegram, and Notion APIs, it saves freelancers and small businesses 2–5 hours weekly by removing manual entry, reducing context switching, and delivering fast, accurate financial insights. Fully customizable to fit unique workflows and platforms.
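The core of a workflow like this is parsing a free-form Telegram message into a structured expense and normalizing its currency before the row is written to Notion. Below is a minimal sketch of that parsing step, assuming a hypothetical message format like `"coffee 4.50 USD"` and a hard-coded rate table; in the actual n8n workflow the rates would come from an exchange-rate API node, and the function names here are illustrative, not taken from the project.

```python
import re

# Hypothetical fixed conversion rates to USD; a real workflow would
# fetch live rates from an exchange-rate API node in n8n.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.08, "ARS": 0.0011}

def parse_expense(message: str) -> dict:
    """Parse a Telegram message like 'coffee 4.50 USD' into a structured expense."""
    match = re.fullmatch(r"(.+?)\s+(\d+(?:\.\d+)?)\s+([A-Z]{3})", message.strip())
    if not match:
        raise ValueError(f"Unrecognized expense format: {message!r}")
    description, amount, currency = match.groups()
    if currency not in RATES_TO_USD:
        raise ValueError(f"Unsupported currency: {currency}")
    return {
        "description": description,
        "amount": float(amount),
        "currency": currency,
        # Normalized amount, ready to be written as a Notion database row.
        "amount_usd": round(float(amount) * RATES_TO_USD[currency], 2),
    }
```

For example, `parse_expense("lunch 1000 ARS")` yields a record with the original amount plus its USD-normalized value, which is what makes cross-currency spending summaries possible.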
In this project, I developed a Python program to automate the downloading of around 13,000 videos from the Gong API and upload them to Google Drive. This solution was created under a tight deadline of less than 4 days due to the imminent expiration of the Gong API. The program efficiently handled large-scale data transfer, integrated multiple APIs, and ensured seamless automation of the entire process. This project highlights my ability to deliver high-quality solutions quickly and effectively under time constraints.
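When moving ~13,000 videos on a deadline, the critical piece is an orchestration loop with retries and a record of failures for a second pass. The sketch below shows that loop with the Gong download and Google Drive upload injected as callables, since the actual API clients are not shown in the project description; the signatures and names are assumptions for illustration.

```python
import time
from typing import Callable, Iterable, List

def run_transfer(
    video_ids: Iterable[str],
    download: Callable[[str], bytes],      # stand-in for a Gong API download call
    upload: Callable[[str, bytes], None],  # stand-in for a Google Drive upload call
    max_retries: int = 3,
) -> List[str]:
    """Transfer each video with simple retry logic; return the IDs that failed."""
    failed = []
    for vid in video_ids:
        for attempt in range(max_retries):
            try:
                upload(vid, download(vid))
                break
            except Exception:
                if attempt == max_retries - 1:
                    failed.append(vid)      # record for a later retry pass
                else:
                    time.sleep(2 ** attempt)  # exponential backoff before retrying
    return failed
```

Running the returned `failed` list through the same function again gives a cheap resume mechanism, which matters when an API (like the expiring Gong endpoint here) can disappear mid-run.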
In this project, over the course of several months, I developed an automated system to scrape four Italian real estate websites (Immobiliare, Subito, Casa, and Idealista) using BeautifulSoup and Selenium in Python. The scraped data is uploaded to a BigQuery database and to Google Sheets through their respective APIs, while the Telegram API notifies the client both about properties whose price drops below a threshold set through Google Sheets and about the state of the scraping process. All scripts run on an AWS cloud server that I set up from scratch.
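The decision of when to send a Telegram alert comes down to comparing each property's new price against its last known price and the client-configured threshold. Here is a minimal sketch of that check, assuming the threshold is a percentage drop; the exact threshold semantics in the real project are whatever the client configured in the Google Sheet, and the function name is illustrative.

```python
def should_notify(previous_price: float, current_price: float, threshold_pct: float) -> bool:
    """Return True when the price dropped by at least threshold_pct percent.

    Assumed semantics: threshold_pct is a minimum percentage drop,
    e.g. 5 means "alert on drops of 5% or more".
    """
    # Ignore new listings with no prior price and any price increases.
    if previous_price <= 0 or current_price >= previous_price:
        return False
    drop_pct = (previous_price - current_price) / previous_price * 100
    return drop_pct >= threshold_pct
```

In the full pipeline, a `True` result would trigger a Telegram message with the listing URL and the old and new prices, so the client only hears about properties that actually crossed their threshold.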
High school degree in Electrical Engineering
2014 – 2021