Part 1: This program finds flight deals to save us money on our next trip. A Google Sheet tracks the locations we want to visit and a price cutoff (the historical low price) for each. We feed this location and price data into a flight search API, which searches all the locations for the cheapest flight in the next six months. When it finds a flight deal, it sends the flight dates and price to our phone via the Twilio SMS module so we can book it right away.
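The SMS alert step looks roughly like this - a minimal sketch, assuming Twilio credentials and phone numbers live in placeholder environment variables:

```python
import os
from twilio.rest import Client

def send_deal_alert(city: str, price: float, out_date: str, return_date: str) -> None:
    """Text a flight deal to my phone via Twilio (placeholder env var names)."""
    client = Client(os.environ["TWILIO_SID"], os.environ["TWILIO_AUTH_TOKEN"])
    client.messages.create(
        body=f"Low price alert! Only ${price} to fly to {city}, from {out_date} to {return_date}.",
        from_=os.environ["TWILIO_FROM_NUMBER"],
        to=os.environ["MY_PHONE_NUMBER"],
    )
```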
Part 2: This project turned into a product that lets users sign up for our service. We send subscribers an email notifying them of the best flight deals.
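The email notification step is a short smtplib call - a minimal sketch, assuming Gmail's SMTP server and placeholder environment variables for the credentials:

```python
import os
import smtplib

def email_deal(to_address: str, message: str) -> None:
    """Send a flight-deal notification email over SMTP (Gmail host assumed as an example)."""
    with smtplib.SMTP("smtp.gmail.com", 587) as connection:
        connection.starttls()  # upgrade the connection to TLS before logging in
        connection.login(os.environ["EMAIL_ADDRESS"], os.environ["EMAIL_PASSWORD"])
        connection.sendmail(
            from_addr=os.environ["EMAIL_ADDRESS"],
            to_addrs=to_address,
            msg=f"Subject: New low price flight deal!\n\n{message}",
        )
```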
Building a personal CV website using HTML and CSS. I watched the lectures on intermediate CSS for Day 45 but opted out of the project since I'm already practicing my HTML/CSS skills through Frontend Mentor.

This project involved using Beautiful Soup for web scraping in order to compile a list of the 100 greatest movies to watch.

I used Python to create a Spotify playlist of the top 100 songs that were playing on a particular date. To accomplish this, I used Beautiful Soup to scrape the Billboard Hot 100 chart for that date and the Spotify API to create a new playlist made up of those songs.

This program alerts me with an email when an Amazon product's price drops below my target price. I used Beautiful Soup to scrape the product price from the Amazon page, compared it to the target amount, and set up the email alert with smtplib.

I used Selenium to build a bot that automatically clicks the cookie in the Cookie Clicker game and purchases upgrades at timed intervals. The goal is to max out the cookies per second after playing the game for 5 minutes. I learned a lot from this project about error handling and how to use try and except to ensure that the program continues to run (and click cookies!). I also became more familiar with the time module, which I used to limit the duration of the while loop to 5 minutes and to time the intervals at which the bot purchases upgrades. Reading the Selenium documentation helped me use explicit waits and expected conditions to address potential errors when accessing or clicking elements (see the sketch below). (I managed to get 107.8 cookies/second with my algorithm - higher than Angela's score, but not one I could hit consistently - and I decided to stop tweaking after spending 10 hours on the project!)

This project involved using Selenium WebDriver to open LinkedIn, log in, and apply for all jobs that meet my criteria (including Easy Apply). While this is an interesting strategy for automating the job application process, coding the bot proved far more complicated than I anticipated: Selenium threw exceptions and errors at nearly every step. I implemented WebDriverWait and expected conditions, as well as try and except statements, to catch the most common errors, NoSuchElementException and ElementClickInterceptedException. I also used the sleep function throughout the program to give pages time to load and to keep LinkedIn from flagging me as a bot. I also tried the variation of the project that saves the job and follows the company instead of submitting an application (the code is at the bottom of the main.py file).
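The wait-and-click pattern I leaned on in both the Cookie Clicker and LinkedIn bots looks roughly like this - a minimal sketch with placeholder URL and selectors rather than the ones from the actual projects:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import (
    NoSuchElementException,
    ElementClickInterceptedException,
    TimeoutException,
)

driver = webdriver.Chrome()
driver.get("https://example.com")  # placeholder URL

def safe_click(css_selector: str, timeout: int = 10) -> bool:
    """Wait until the element is clickable, then click it; return False instead of crashing."""
    try:
        element = WebDriverWait(driver, timeout).until(
            EC.element_to_be_clickable((By.CSS_SELECTOR, css_selector))
        )
        element.click()
        return True
    except (NoSuchElementException, ElementClickInterceptedException, TimeoutException):
        return False
```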
I attempted to use Selenium to build an auto-swiping Tinder bot. Unfortunately, this project was unsuccessful since the login flow continued to demand verification via phone and email and/or some kind of CAPTCHA puzzle. This made it impossible to automate the login process, and I couldn't find a workable solution to these security measures. (I uploaded the code I wrote up to this point.)

This project uses Selenium to automatically check your internet speed before tweeting at your service provider to ask why it's lower than their guaranteed download and upload speeds. It also let me review my OOP skills by creating an InternetSpeedTwitterBot class with two methods, get_internet_speed and tweet_at_provider.

I built a follower bot to help grow my Instagram brand by following users who follow accounts that post content similar to mine. Ideally, this will get the attention of users who could potentially follow me back. The bot does, however, get restricted by Instagram after following a certain number of users (even with the sleep function slowing down the button clicks to imitate a human).

I researched house prices that fit a client's criteria on the Zillow website and then transferred that data into a Google Form, which I used to create a spreadsheet in Google Sheets. Essentially, I automated a data entry job using everything I have learned so far about web scraping with Selenium. The project actually called for BeautifulSoup to scrape the Zillow website, but I was unable to do so since the page's listings wouldn't all load. Instead, I used Selenium and various keyboard keys to scroll through all the listings, which let me access the prices, addresses, and links for each one. Unfortunately, the results of the web scraping were inconsistent - sometimes I was able to collect the data for all 40 listings, and at other times the first nine loaded before the program crashed with NoSuchElementException errors. In the course's comments, other students mentioned encountering similar issues and noted that the website's front-end code had changed since Angela developed this course. Zillow also discourages bots by requiring a CAPTCHA after the program accesses it multiple times, so once I had successfully scraped the data I needed, I hardcoded the results as lists. I then looped through those lists to automatically fill out the Google Form.
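The form-filling loop looked roughly like this - a minimal sketch with made-up listing data, a placeholder form URL, and guessed-at selectors (the real ones come from inspecting the form):

```python
from time import sleep
from selenium import webdriver
from selenium.webdriver.common.by import By

# Hardcoded results from the scraping step (placeholder values).
addresses = ["123 Main St, San Francisco, CA", "456 Oak Ave, Oakland, CA"]
prices = ["$2,895", "$3,100"]
links = ["https://www.zillow.com/homedetails/example-1", "https://www.zillow.com/homedetails/example-2"]

driver = webdriver.Chrome()
for address, price, link in zip(addresses, prices, links):
    driver.get("https://docs.google.com/forms/d/e/FORM_ID/viewform")  # placeholder form URL
    sleep(2)  # give the form time to load
    # Assumed selectors: the three short-answer boxes render as text inputs.
    answer_boxes = driver.find_elements(By.XPATH, '//input[@type="text"]')
    answer_boxes[0].send_keys(address)
    answer_boxes[1].send_keys(price)
    answer_boxes[2].send_keys(link)
    driver.find_element(By.XPATH, '//span[text()="Submit"]').click()
```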
This set of lectures covered backend web development, the command line, Python decorator functions, and using Flask for web development. Today's lectures covered rendering and parsing HTML in Flask and advanced decorators in order to build a 'Guess the Number' website. The user types a number into the URL after the forward slash and finds out whether the guess is too high, too low, or correct (a sketch of this routing pattern is at the end of this section). I learned more about web development with Flask, specifically serving static files and rendering HTML & CSS files on my website. I modified a pre-built HTML template to create a personal namecard website with social media links and a short bio, which was then served up using Flask. I learned how to use Jinja to produce dynamic HTML pages and build URLs, and set up the blog template for the first part of the capstone project. I learned how to use Bootstrap and reviewed advanced CSS by building a doggy dating website called Tindog.
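A minimal sketch of the 'Guess the Number' routing pattern, with a hardcoded answer standing in for the random number the real site would pick:

```python
from flask import Flask

app = Flask(__name__)
ANSWER = 7  # stand-in for a randomly chosen number

@app.route("/")
def home():
    return "<h1>Guess a number between 0 and 9</h1>"

@app.route("/<int:guess>")
def check(guess):
    # Dynamic route: whatever the user types after the slash is parsed as an int.
    if guess > ANSWER:
        return "<h1>Too high, try again!</h1>"
    elif guess < ANSWER:
        return "<h1>Too low, try again!</h1>"
    return "<h1>You found me!</h1>"

if __name__ == "__main__":
    app.run(debug=True)
```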