
Disaster Response Pipeline Project

Introduction

In this project, I use my data engineering skills to analyze disaster data and build a model that classifies disaster messages. The backend has two pipelines: an ETL pipeline that prepares the data and an ML pipeline that trains and improves the classifier. There is also a web app with a UI where users can input a new message and get classification results across several categories. The app also displays visualizations of the data.

File Descriptions

app

|- template
| |- master.html # main page of web app
| |- go.html # classification result page of web app
|- run.py # Flask file that runs the app (sketched below)
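
The sketch below outlines what run.py typically does for a project like this: load the cleaned database and the trained classifier at startup, then serve the two templates. The database path, table name, category-column layout, and template variables are assumptions for illustration, not taken from the repo.

```python
# Minimal sketch of app/run.py (assumed structure; paths, table name, and
# template variables are illustrations, not taken from the repo).
import pickle

import pandas as pd
from flask import Flask, render_template, request
from sqlalchemy import create_engine

# The templates live in app/template (folder name as listed above).
app = Flask(__name__, template_folder="template")

# Load the cleaned data and the trained classifier once at startup.
engine = create_engine("sqlite:///../data/DisasterResponse.db")
df = pd.read_sql_table("DisasterMessages", engine)  # table name is an assumption
with open("../models/classifier.pkl", "rb") as f:
    model = pickle.load(f)


@app.route("/")
@app.route("/index")
def index():
    # master.html shows the data visualizations and the message input form.
    return render_template("master.html")


@app.route("/go")
def go():
    # go.html shows one 0/1 prediction per disaster category.
    query = request.args.get("query", "")
    labels = model.predict([query])[0]
    results = dict(zip(df.columns[4:], labels))  # first 4 columns assumed to be non-labels
    return render_template("go.html", query=query, classification_result=results)


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=3001, debug=True)
```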

data

|- disaster_categories.csv # data to process
|- disaster_messages.csv # data to process
|- process_data.py # data cleaning pipeline (sketched below)
|- DisasterResponse.db # database the cleaned data is saved to (name used in the commands below)
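
A minimal sketch of the ETL steps in process_data.py, assuming the usual load, merge, clean, and save flow for this dataset. The table name DisasterMessages and the column handling are assumptions for illustration.

```python
# Minimal sketch of the ETL steps in data/process_data.py (assumed structure;
# the table name "DisasterMessages" is an illustration, not taken from the repo).
import sys

import pandas as pd
from sqlalchemy import create_engine


def load_data(messages_filepath, categories_filepath):
    """Load the two CSVs and merge them on the shared `id` column."""
    messages = pd.read_csv(messages_filepath)
    categories = pd.read_csv(categories_filepath)
    return messages.merge(categories, on="id")


def clean_data(df):
    """Split the semicolon-separated `categories` column into one 0/1 column per category."""
    categories = df["categories"].str.split(";", expand=True)
    # Values look like "related-1"; use the prefix of the first row as column names ...
    categories.columns = categories.iloc[0].str.rsplit("-", n=1).str[0]
    # ... and keep only the trailing digit as the label value.
    categories = categories.apply(lambda col: col.str.rsplit("-", n=1).str[-1].astype(int))
    df = pd.concat([df.drop(columns="categories"), categories], axis=1)
    return df.drop_duplicates()


def save_data(df, database_filepath):
    """Write the cleaned frame to a SQLite table."""
    engine = create_engine(f"sqlite:///{database_filepath}")
    df.to_sql("DisasterMessages", engine, index=False, if_exists="replace")


if __name__ == "__main__":
    messages_fp, categories_fp, db_fp = sys.argv[1], sys.argv[2], sys.argv[3]
    save_data(clean_data(load_data(messages_fp, categories_fp)), db_fp)
```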

models

|- train_classifier.py # machine learning pipeline (sketched below)
|- classifier.pkl # saved model

README.md
cleaned_tel_pipeline.db
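
A minimal sketch of the ML pipeline in train_classifier.py: load the cleaned table, build a TF-IDF plus multi-output classifier pipeline, train it, and pickle the result. The table name, column layout, and choice of RandomForestClassifier are assumptions for illustration; a multi-output wrapper fits this project because each message can belong to several categories at once.

```python
# Minimal sketch of the ML pipeline in models/train_classifier.py (assumed
# structure; table and column names match the ETL sketch above).
import pickle
import sys

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier
from sklearn.pipeline import Pipeline
from sqlalchemy import create_engine


def load_data(database_filepath):
    engine = create_engine(f"sqlite:///{database_filepath}")
    df = pd.read_sql_table("DisasterMessages", engine)
    X = df["message"]
    # Everything except the non-label columns is a 0/1 category (assumed layout).
    y = df.drop(columns=["id", "message", "original", "genre"])
    return X, y


def build_model():
    # TF-IDF text features feeding a multi-output classifier,
    # one binary output per disaster category.
    return Pipeline([
        ("tfidf", TfidfVectorizer(stop_words="english")),
        ("clf", MultiOutputClassifier(RandomForestClassifier(n_estimators=100))),
    ])


if __name__ == "__main__":
    database_filepath, model_filepath = sys.argv[1], sys.argv[2]
    X, y = load_data(database_filepath)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = build_model()
    model.fit(X_train, y_train)
    # Average per-category accuracy as a quick sanity check.
    print("accuracy:", (model.predict(X_test) == y_test.values).mean())
    with open(model_filepath, "wb") as f:
        pickle.dump(model, f)
```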

Instructions:

  1. Run the following commands in the project's root directory to set up your database and model.

    • To run the ETL pipeline that cleans the data and stores it in the database:
      python data/process_data.py data/disaster_messages.csv data/disaster_categories.csv data/DisasterResponse.db
    • To run the ML pipeline that trains the classifier and saves it:
      python models/train_classifier.py data/DisasterResponse.db models/classifier.pkl

  2. Go to the app directory: cd app

  3. Run your web app: python run.py

  4. Click the PREVIEW button to open the homepage.
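
Once both pipelines have run, the saved model can also be used directly from Python, without the web UI. A minimal sketch, assuming classifier.pkl is a fitted scikit-learn pipeline saved with pickle as in the sketch above:

```python
# Classify a single message from Python, bypassing the web app.
# Assumes classifier.pkl is a fitted scikit-learn pipeline saved with pickle.
import pickle

with open("models/classifier.pkl", "rb") as f:
    model = pickle.load(f)

message = "We need water and medical supplies after the storm"
labels = model.predict([message])[0]  # one 0/1 flag per disaster category
print(labels)
```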

About

This repo is used for project submission.
