
Wells System

Description

An innovative software prototype that gauges a user's toxicity level by checking a new user's public information. An AI model then determines the toxicity level from that information and, if the user is toxic enough, informs the proper authority. No administrative tasks are given.
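The described flow can be sketched as follows. This is a minimal illustration only: every name here (`score_toxicity`, `review_new_user`, `notify_authority`, `THRESHOLD`, and the keyword-based scorer) is a hypothetical stand-in for the actual AI model and reporting hook, which are not shown in this README.

```python
# Illustrative sketch of the described flow; all names and the
# threshold value are assumptions, not taken from the repository.
THRESHOLD = 0.8  # assumed toxicity cutoff


def score_toxicity(public_info: str) -> float:
    """Stand-in for the AI model: returns a toxicity score in [0, 1].

    A real implementation would run the trained model instead of
    this trivial keyword ratio.
    """
    toxic_words = {"hate", "attack", "slur"}
    words = public_info.lower().split()
    if not words:
        return 0.0
    return sum(w in toxic_words for w in words) / len(words)


def notify_authority(score: float) -> None:
    """Hypothetical reporting hook for flagged users."""
    print(f"Flagged user with toxicity score {score:.2f}")


def review_new_user(public_info: str) -> bool:
    """Score the user's public information and flag them if toxic enough."""
    score = score_toxicity(public_info)
    if score >= THRESHOLD:
        notify_authority(score)
        return True
    return False
```

In this sketch the system only reads public information and emits a notification; consistent with the description, no administrative action is taken against the user.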

Getting Started

Dependencies

  • NuGet packages: Rystem.OpenAI, DropboxSign
  • React.js

Installing

Node

  • npm i react
  • npm i dotenv
  • npm i discord

Django

  • pip install virtualenv
  • py -m pip install Django

TensorFlow

  • pip install numpy
  • pip install pandas
  • pip install tensorflow
  • pip install scikit-learn
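Once the packages above are installed, a toxicity classifier along these lines can be built. This is a toy sketch, not the project's actual model: the four-row dataset, the labels, and the `predict_toxicity` helper are invented for illustration, and it uses pandas and scikit-learn from the list above (a production model might use TensorFlow instead).

```python
# Toy toxicity classifier using the installed packages.
# The dataset and all labels are invented for illustration only.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny made-up training set: 1 = toxic, 0 = not toxic.
data = pd.DataFrame({
    "text": ["you are awful", "have a great day",
             "I hate you", "thanks so much"],
    "toxic": [1, 0, 1, 0],
})

# Turn the text into TF-IDF features and fit a simple classifier.
vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(data["text"])
classifier = LogisticRegression().fit(features, data["toxic"])


def predict_toxicity(text: str) -> float:
    """Probability that `text` is toxic under this toy model."""
    return float(classifier.predict_proba(vectorizer.transform([text]))[0, 1])
```

The returned probability could then feed the threshold-and-notify step described above.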

Authors

Guevara Torres @gtkt.dev

License

This project is licensed under the MIT License
