Advanced Driving Assistance on a Mobile is a driver-assistance solution that runs on a commodity mobile device.
The system is designed to operate within city and overland traffic.
Driving
- Show a warning if the following distance is below the average braking distance (including human reaction time and the actual braking time); this may require further calibration or knowledge of brake effectiveness
- Show a warning if a lane departure is expected to happen within the next x (e.g. 5) seconds
- Show which lane is about to be crossed
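The following-distance warning above can be sketched with the standard stopping-distance model (reaction distance plus braking distance, v²/2a). This is a minimal illustration: the reaction time and deceleration constants are assumptions, not the project's calibrated values.

```java
// Sketch of the following-distance warning. REACTION_TIME_S and
// DECELERATION_MS2 are illustrative assumptions, not project constants.
public class StoppingDistance {
    static final double REACTION_TIME_S = 1.0;   // assumed average driver reaction time
    static final double DECELERATION_MS2 = 7.0;  // assumed mean deceleration, dry asphalt

    /** Total stopping distance: reaction distance + braking distance (v^2 / 2a). */
    public static double stoppingDistanceMeters(double speedKmh) {
        double v = speedKmh / 3.6;                          // km/h -> m/s
        double reaction = v * REACTION_TIME_S;              // covered while reacting
        double braking = (v * v) / (2 * DECELERATION_MS2);  // covered while braking
        return reaction + braking;
    }

    /** Warn when the measured gap to the car in front is shorter than that. */
    public static boolean shouldWarn(double gapMeters, double speedKmh) {
        return gapMeters < stoppingDistanceMeters(speedKmh);
    }
}
```

At 50 km/h this yields roughly 28 m, so a 20 m gap would trigger the warning while a 40 m gap would not.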
Traffic and flow information
- Give an indication if the cars in front are about to brake by detecting their brake lights
- Detect and show speed signs (including city signs, end of speed limits) and show the detected speed limit
- Detect right-of-way situations and show which direction has priority
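The brake-light cue can be approximated as a color heuristic: flag a region of the car in front as "braking" when the fraction of bright-red pixels crosses a threshold. This sketch works on a plain ARGB pixel array; the pixel format and all thresholds are assumptions, and the real pipeline would run on OpenCV camera frames.

```java
// Hedged sketch of a brake-light heuristic on an ARGB pixel region.
// Thresholds (180 / 90 / 0.15) are illustrative assumptions.
public class BrakeLightHeuristic {
    /** Fraction of pixels that look like lit brake lights (bright red). */
    public static double redFraction(int[] argbPixels) {
        if (argbPixels.length == 0) return 0;
        int red = 0;
        for (int p : argbPixels) {
            int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
            if (r > 180 && g < 90 && b < 90) red++;  // "bright red" test
        }
        return (double) red / argbPixels.length;
    }

    /** True when enough of the region is lit red to suggest braking. */
    public static boolean isBraking(int[] roiPixels) {
        return redFraction(roiPixels) > 0.15;
    }
}
```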
Dashcam
- Record video footage in case of a major incident
- Display last detected information such as speed, cars in front, detected signs and lanes
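The dashcam behavior pairs with the privacy requirement below (footage must be discarded after processing): frames live only in a short in-memory ring buffer, old frames are dropped as new ones arrive, and a copy is exported solely when a major incident is flagged. The frame representation and capacity here are assumptions for illustration.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;

// Sketch of the incident ring buffer: nothing is persisted unless an
// incident is flagged, and export clears the buffer afterwards.
public class IncidentBuffer {
    private final ArrayDeque<byte[]> frames = new ArrayDeque<>();
    private final int capacity;

    public IncidentBuffer(int capacityFrames) { this.capacity = capacityFrames; }

    /** Add a frame; the oldest frame is discarded once the buffer is full. */
    public void push(byte[] frame) {
        if (frames.size() == capacity) frames.removeFirst();
        frames.addLast(frame);
    }

    /** Export the buffered footage on a major incident, then retain nothing. */
    public List<byte[]> exportOnIncident() {
        List<byte[]> clip = new ArrayList<>(frames);
        frames.clear();
        return clip;
    }

    public int size() { return frames.size(); }
}
```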
- The solution should run on a commodity Android device with at least 4 CPU cores, a GPU with OpenGL ES 2.x support, OpenCV runtime support, at least 512 MB RAM, a back-facing camera with at least Full HD resolution, and sustainable charging (it runs on a power source that keeps the battery charged)
- The solution should run in near real-time (no longer than the average human reaction time) when detecting objects/situations
- The system shall not retain any data and must discard video footage after processing
- No license plates should be disclosed, and any detected plates should be anonymized when showing the captured video
- The system shall reach a detection rate of at least 50% for lanes, 75% for traffic signs, 50% for brake lights, and 75% for lane departures
- The solution requires a steady power connection to sustain longer driving sessions
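The near real-time requirement can be enforced as a per-frame latency budget: if processing a frame takes longer than the assumed average human reaction time, the overrun is counted so the pipeline can degrade gracefully (e.g. skip frames) instead of falling behind. The 1000 ms budget below is an illustrative assumption.

```java
// Sketch of a per-frame latency budget tied to the reaction-time bound.
// BUDGET_MS (~average human reaction time) is an assumed value.
public class LatencyBudget {
    static final long BUDGET_MS = 1000;
    private int overruns = 0;

    /** Record one frame's processing window; true if it met the budget. */
    public boolean record(long startMs, long endMs) {
        boolean ok = (endMs - startMs) <= BUDGET_MS;
        if (!ok) overruns++;
        return ok;
    }

    /** Number of frames that exceeded the budget so far. */
    public int overrunCount() { return overruns; }
}
```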
Tests that run in the local JVM require a local OpenCV installation. On OS X you can install OpenCV with brew:
brew install homebrew/science/opencv --with-contrib --with-cuda --with-ffmpeg --with-tbb --with-java --with-opengl --with-qt5
Run local tests with:
./gradlew test
Tests can also be executed on a real device or emulator using:
./gradlew connectedAndroidTest