Machine Learning Fundamentals
Predict Bike Sharing Demand with AutoGluon Project
This is the first project for the Udacity AIML course, which I attended through an AWS AIML scholarship. Each year, the top 500 graduates globally from the AI Programming with Python course are eligible to complete the Machine Learning Fundamentals course.
This project involves the complete development workflow of a predictive model to forecast bike-sharing demand. It includes exploratory data analysis, model training, testing, and validation, utilizing the AutoGluon framework within SageMaker Studio. The project aims to demonstrate the practical application of machine learning in urban planning and transportation.
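A typical early step in that workflow was feature engineering during EDA. The sketch below, with column names assumed from the Kaggle bike-sharing dataset, derives calendar features that AutoGluon's `TabularPredictor` can then train on; it is a minimal illustration, not the project's actual notebook code.

```python
import pandas as pd

# Minimal EDA sketch: derive calendar features from the raw timestamp.
# The "datetime" and "count" column names are assumed from the Kaggle dataset.
train = pd.DataFrame({
    "datetime": pd.to_datetime(["2011-01-01 00:00", "2011-01-01 01:00"]),
    "count": [16, 40],
})
train["hour"] = train["datetime"].dt.hour
train["dayofweek"] = train["datetime"].dt.dayofweek  # Monday=0 ... Sunday=6
train["month"] = train["datetime"].dt.month

# With features in place, AutoGluon can fit a predictor, e.g.:
# from autogluon.tabular import TabularPredictor
# predictor = TabularPredictor(label="count").fit(train)
```

Splitting the timestamp out this way matters because hour-of-day and day-of-week carry most of the demand signal in bike-sharing data.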
GitHub Repository: Udacity AIML Scholarship Programme
Developing a Handwritten Digits Classifier with PyTorch Project
This is the second project for the Udacity AIML course, which I attended through an AWS AIML scholarship.
This project focuses on the essentials of deep learning, creating a classifier for handwritten digits using PyTorch. It covers feedforward and backpropagation techniques, perceptrons, gradient descent, logistic regression, and ML fine-tuning parameters, offering a comprehensive understanding of neural networks.
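As a flavour of the building blocks listed above, here is a hedged sketch of logistic regression trained with batch gradient descent in NumPy. It illustrates the concepts the course covers before full networks; it is not the project's actual PyTorch code, and the toy data is invented for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lr=0.5, epochs=500):
    """Logistic regression via batch gradient descent on cross-entropy loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)            # forward pass
        grad_w = X.T @ (p - y) / len(y)   # gradient w.r.t. weights
        grad_b = np.mean(p - y)           # gradient w.r.t. bias
        w -= lr * grad_w                  # gradient descent update
        b -= lr * grad_b
    return w, b

# Toy separable data: class is 1 when the first feature is positive.
X = np.array([[1.0, 0.2], [2.0, -0.5], [-1.5, 0.3], [-2.0, -0.1]])
y = np.array([1, 1, 0, 0])
w, b = train_logreg(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(int)
```

The same loop structure (forward pass, loss gradient, parameter update) is what PyTorch's autograd and optimizers automate at scale.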
GitHub Repository: Udacity AIML Scholarship Programme
Landmark Classification and Tagging for Social Media Employing CNN Fundamentals Project
This is the third project for the Udacity AIML course, which I attended through an AWS AIML scholarship.
This project explores transfer learning, autoencoders, object detection, and segmentation to classify and tag landmarks in images for social media platforms. It illustrates the application of deep learning in enhancing social media content and engagement.
Project URL: mybinder.org
GitHub Repository: Udacity AIML Scholarship Programme
Build an ML Workflow for Scones Unlimited on Amazon SageMaker Project
This is the final project for the Udacity AIML course, which I attended through an AWS AIML scholarship.
This project involves developing a machine learning workflow for a fictional company, Scones Unlimited, using Amazon SageMaker. It encompasses SageMaker Studio, AWS Lambda, and Step Functions for workflow management, as well as Amazon API Gateway for integrating ML models into production environments.
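A central piece of such a workflow is a Lambda function that a Step Functions state invokes to classify an image via a SageMaker endpoint. The sketch below is illustrative: the endpoint name, event shape, and stub client are assumptions, and the deferred `boto3` import lets the handler be exercised offline.

```python
import json

def lambda_handler(event, context, client=None):
    """Step Functions task sketch: classify one image via a SageMaker endpoint."""
    if client is None:
        import boto3  # deferred so the handler can be unit-tested offline
        client = boto3.client("sagemaker-runtime")

    image_data = event["body"]["image_data"]
    response = client.invoke_endpoint(
        EndpointName="scones-image-classifier",  # assumed endpoint name
        ContentType="image/png",
        Body=image_data,
    )
    inferences = json.loads(response["Body"].read())
    event["body"]["inferences"] = inferences
    return {"statusCode": 200, "body": event["body"]}

# Offline usage example with a stub client (no AWS credentials needed):
class _StubClient:
    def invoke_endpoint(self, **kwargs):
        import io
        return {"Body": io.BytesIO(b"[0.93, 0.07]")}

result = lambda_handler({"body": {"image_data": b"..."}}, None,
                        client=_StubClient())
```

Passing the inferences back through the event lets the next Step Functions state (for example, a confidence-threshold check) act on them.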
GitHub Repository: Udacity AIML Scholarship Programme

AI Programming with Python
Pet Image Analysis - Python Application Project
I completed this project for the Udacity AI Programming with Python course, which I attended through an AWS DeepRacer scholarship. Each year, 2,000 of these scholarships are awarded globally.
In the "Pet Image Analysis" project, I developed a Python application that distinguishes images of dogs from those of other pets, and flags cases where a breed label has been misapplied. For images identified as dogs, the application also detects the specific breed. A significant component of the project was a comparative analysis of several CNN architectures, including ResNet, AlexNet, and VGG, to determine the most effective model for these tasks. I also measured each model's runtime, weighing classification accuracy against computational cost.
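The runtime comparison boils down to a small timing harness. This is a hedged sketch of the idea with stand-in classifiers; in the project the callables wrapped torchvision's ResNet, AlexNet, and VGG models, and the function name is my own.

```python
import time

def time_inference(predict_fn, inputs, repeats=3):
    """Return the best-of-N wall-clock time to classify a batch of inputs."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        for item in inputs:
            predict_fn(item)
        best = min(best, time.perf_counter() - start)
    return best

# Stand-ins for the real classifiers (assumed names, trivial bodies):
classifiers = {
    "resnet": lambda img: "dog",
    "alexnet": lambda img: "dog",
    "vgg": lambda img: "dog",
}
images = list(range(100))  # placeholder for a folder of pet images
timings = {name: time_inference(fn, images) for name, fn in classifiers.items()}
```

Taking the best of several repeats reduces noise from OS scheduling, which matters when the architectures being compared differ by only tens of milliseconds per image.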
GitHub Repository: Pet Image Analysis
Image Classifier with Deep Learning Project
I completed the final project for the Udacity AI Programming with Python course, which I attended through an AWS DeepRacer scholarship.
The project's objective was to develop a Python application that trains an image classifier on a dataset and then uses the trained model to predict the class of new images. To make it more user-friendly, I also deployed the application on Streamlit. If you're curious to see it in action, just give me a shout! Just a heads-up: if the app has been idle for a while, it might need a quick reboot.
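The prediction step reduces to turning the model's raw outputs into the top-k most likely labels. Here is a minimal NumPy sketch of that step; the function name, logits, and class names are invented for illustration, not taken from the app.

```python
import numpy as np

def predict_topk(logits, class_names, k=5):
    """Convert raw model outputs into the top-k (probability, label) pairs."""
    exps = np.exp(logits - np.max(logits))  # numerically stable softmax
    probs = exps / exps.sum()
    top = np.argsort(probs)[::-1][:k]       # indices sorted by probability
    return [(float(probs[i]), class_names[i]) for i in top]

# Example with assumed classes and made-up logits:
logits = np.array([2.0, 0.5, 1.0, -1.0])
names = ["rose", "tulip", "daisy", "sunflower"]
top2 = predict_topk(logits, names, k=2)
```

Subtracting the max logit before exponentiating keeps the softmax stable even when a model emits large raw scores.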
Project URL: Streamlit App
GitHub Repository: Udacity AI Python Project

AWS Cloud Resume Challenge
The Project
The AWS Cloud Resume Challenge was an exciting project that allowed me to create a cloud-based resume website using AWS services. I implemented best practices for security, scalability, and cost optimization while documenting my process to showcase my expertise. This challenge enhanced my skills and demonstrated my proficiency with AWS cloud-based technologies, providing a valuable learning experience.
- HTML & CSS
- S3
- CloudFront
- Route53
- DynamoDB
- Lambda
- Serverless Application Model (SAM)
- Python & Boto3
- GitHub
- GitHub Actions
Details
| Step | Task |
|------|------|
| 1 | Obtain the AWS Cloud Practitioner certification |
| 2 | Create a Resume in HTML |
| 3 | Style the resume with CSS |
| 4 | Deploy the resume as a static website on AWS S3 |
| 5 | Use CloudFront with the S3 website URL to serve the site over HTTPS |
| 6 | Use Route53 to point a custom DNS domain name to the CloudFront distribution |
| 7 | Use JavaScript to include a view counter that displays how many people have accessed the resume website |
| 8 | Use DynamoDB to store and retrieve view count data |
| 9 | Use API Gateway and Lambda to record view counts and store them in DynamoDB |
| 10 | Write the Lambda function in Python using Boto3 |
| 11 | Test your Python code |
| 12 | Create an Infrastructure-as-Code template - use SAM to deploy a template that creates the S3 bucket and policy, Route53 record set group, CloudFront distribution, SSL certificate, Lambda function, and DynamoDB table |
| 13 | Create a GitHub repository for your front and backend code for source control and use CI/CD to automatically update your website. |
| 14 | CI/CD (Backend) - Set up GitHub Actions so that when you push an update to your SAM template or Python code, your Python tests run. If the tests pass, the SAM application is packaged and deployed to AWS |
| 15 | CI/CD (Frontend) - Create a second GitHub repository for your website code. Create GitHub Actions so that when you push new website code, the S3 bucket automatically gets updated. |
| 16 | Blog Post for AWS Cloud Resume Challenge - Mastering Through Application: How Hands-On Practice Solidified My Understanding |
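Steps 7-10 above can be sketched as a single Lambda handler that atomically increments the view count in DynamoDB. This is an illustrative sketch, not the challenge's reference code: the table name, key, and attribute names are assumptions, and the deferred `boto3` import plus stub table let it run without AWS credentials.

```python
def lambda_handler(event, context, table=None):
    """Atomically increment and return the resume view count."""
    if table is None:
        import boto3  # deferred so the handler can be unit-tested offline
        table = boto3.resource("dynamodb").Table("resume-views")  # assumed name

    response = table.update_item(
        Key={"id": "resume"},                       # assumed partition key
        UpdateExpression="ADD #v :inc",             # atomic counter update
        ExpressionAttributeNames={"#v": "views"},
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    views = int(response["Attributes"]["views"])
    return {"statusCode": 200, "body": str(views)}

# Offline usage example with a stub table (no AWS credentials needed):
class _StubTable:
    def update_item(self, **kwargs):
        return {"Attributes": {"views": 42}}

result = lambda_handler({}, None, table=_StubTable())
```

Using DynamoDB's `ADD` update expression makes the increment atomic, so concurrent page loads never lose a count - a read-then-write approach would.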