I recently ate at Whole Foods. As usual, I finished eating, got up to toss out my garbage, and then confronted this:
This time, instead of having my typical mini-anxiety attack over where to toss out each of my pieces of garbage, I got an idea: What if my phone's camera could just tell me where each of these items should go?
Sure, there are signs posted that are, in theory, supposed to tell me how to do this. But I don't have time to read. I already decided it's time to leave the store, and now I'm supposed to stand here and interpret a verbose chart listing what's recyclable/compostable/trashable/landfillable and what's not? No thanks. I want to help save the planet, but I've got other places on it to be right now.
Thesis: Less thinking; more blindly obeying my phone on how to sort my garbage.
Not only could the app identify the appropriate bin for your garbage items, but it could also keep track of how many objects you've correctly thrown away. This could be used to start an incentive program by which businesses could reward customers who contribute to the effort of properly sorting their garbage, thereby reducing the need for waste management facilities to correctly sort items downstream. It could also collect data on total waste disposed of in a given store and what kinds of waste predominate.
- OpenCV v3.1.0
- Keras (TensorFlow backend); the prototype uses the pre-trained VGG-16 network for classification (a rough sketch follows this list)
- NumPy
Descriptions forthcoming.
None at this time; this is still a prototype.
I currently have the image classification program running on my laptop. The near-term goal is to get this into a mobile app. I'm also working on assessing and improving the classification performance, which will likely require additions to the training image dataset.