AbuCTF/SBC

Contains resources from the Winter Internship '23.

Concepts learned as of December 2023 in the domain of Artificial Intelligence.

Artificial Intelligence:

--Search Algorithms like Depth-First Search (DFS), Breadth-First Search (BFS), A*, and Greedy Best-First Search.

--Adversarial Search with the Minimax algorithm for decision-making in games.

--Alpha-Beta Pruning and Depth-Limited Minimax (see the sketch after this list).

--Uncertainty and Bayesian networks for probabilistic modeling.

--The Markov Assumption in Markov models, and Markov Chains.

--Bayes' Theorem for probability calculation; Unconditional and Joint Probability.

--Optimization Algorithms.

--Hill Climbing variants like Stochastic, Random-Restart, Steepest-Ascent, and First-Choice; Local Beam Search.

--Simulated Annealing, and Backtracking Search applied to problems such as the Travelling Salesman Problem.

--Constraint Satisfaction, Node and Arc Consistency.

--Heuristics for guiding search, such as Manhattan Distance and Minimum Remaining Values (MRV).

--Knowledge Engineering, Inference Rules, and inference by Sampling.
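
To make the adversarial-search items above concrete, here is a minimal sketch of Depth-Limited Minimax with Alpha-Beta Pruning. The `game` object and its `is_terminal`, `utility`, `actions`, and `result` methods are hypothetical names used only for illustration; they are not defined in this repository.

```python
# Minimal sketch: depth-limited Minimax with Alpha-Beta Pruning.
# `game` is a hypothetical object with is_terminal/utility/actions/result.
def alphabeta(game, state, depth, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    if depth == 0 or game.is_terminal(state):
        return game.utility(state)
    if maximizing:
        value = float("-inf")
        for action in game.actions(state):
            child = game.result(state, action)
            value = max(value, alphabeta(game, child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:   # prune: MIN would never allow this branch
                break
        return value
    else:
        value = float("inf")
        for action in game.actions(state):
            child = game.result(state, action)
            value = min(value, alphabeta(game, child, depth - 1, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:   # prune: MAX would never allow this branch
                break
        return value
```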

Machine Learning:

--Supervised Learning, Classification, and k-Nearest Neighbors (see the k-NN sketch after this list).

--Perceptron learning rule for binary classification.

--Activation functions like ReLU, sigmoid, and tanh.

--Regression and Loss Functions.

--Overfitting and techniques to mitigate it.

--Q-Learning for reinforcement learning.
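
As a small example of the supervised-learning items above, the following uses scikit-learn's k-Nearest Neighbors classifier (scikit-learn appears under Tools below); the toy points and labels are invented for illustration only.

```python
# Minimal k-Nearest Neighbors sketch with scikit-learn (toy data).
from sklearn.neighbors import KNeighborsClassifier

X_train = [[0, 0], [0, 1], [1, 0], [1, 1]]  # toy feature vectors
y_train = [0, 0, 1, 1]                      # toy class labels

clf = KNeighborsClassifier(n_neighbors=3)   # vote among the 3 nearest points
clf.fit(X_train, y_train)
print(clf.predict([[0.9, 0.2]]))            # majority class of its 3 nearest neighbours
```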

Deep Learning:

--Artificial Neural Networks (ANN) and Deep Neural Networks (DNN).

--Gradient Descent, Stochastic Gradient Descent (SGD), Weight Initialization.

--Backpropagation for updating weights (see the training-loop sketch after this list).

--Dropout layer for regularization.

--Large Language Models (LLMs).

--PyTorch and TensorFlow frameworks.

--Epochs in training.

--Convolutional Neural Networks (CNN).

--Pooling layer for spatial reduction.

--Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM) for sequential data.
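
The short PyTorch sketch below ties several of the items above together: a small network with ReLU and a Dropout layer, MSELoss, SGD, backpropagation, and a loop over epochs. The layer sizes, learning rate, and random data are arbitrary choices made purely for illustration.

```python
# Minimal PyTorch training-loop sketch: Dropout, MSELoss, SGD, backprop, epochs.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Dropout(p=0.2),              # dropout layer for regularization
    nn.Linear(32, 1),
)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

X = torch.randn(100, 10)            # synthetic inputs
y = torch.randn(100, 1)             # synthetic targets

for epoch in range(5):              # epochs in training
    optimizer.zero_grad()
    loss = criterion(model(X), y)   # forward pass and loss
    loss.backward()                 # backpropagation computes gradients via autograd
    optimizer.step()                # SGD updates the weights
```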

Natural Language Processing (NLP):

--Parsing and understanding language structures.

--Bag-of-Words and n-gram models for text representation.

--Naive Bayes and Additive Smoothing for text classification (see the sketch after this list).

--NLTK Library, n-grams.

--Word2Vec for word embeddings.

--Word Tokenization.

--Transformer Architecture for sequence-to-sequence tasks.
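
As a minimal illustration of Bag-of-Words, n-grams, Naive Bayes, and Additive Smoothing, the scikit-learn sketch below classifies a two-sentence corpus; the sentences and labels are invented for illustration.

```python
# Minimal Bag-of-Words + n-grams + Naive Bayes sketch with scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

corpus = ["the movie was great", "the movie was terrible"]  # toy corpus
labels = [1, 0]                                             # 1 = positive, 0 = negative

vectorizer = CountVectorizer(ngram_range=(1, 2))  # unigram and bigram counts
X = vectorizer.fit_transform(corpus)

clf = MultinomialNB(alpha=1.0)                    # alpha is the additive (Laplace) smoothing term
clf.fit(X, labels)
print(clf.predict(vectorizer.transform(["great movie"])))
```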

Tools and Libraries:

--Python programming language.

--PyTorch and TensorFlow.

--Pygame, Atari games.

--NumPy for numerical operations.

--Matplotlib for data visualization.

--scikit-learn for machine learning tasks.

--Autograd for automatic differentiation.

--MSELoss() for mean squared error loss in regression tasks.

--Linear Regression for predicting a continuous outcome.

--GloVe technique for word embeddings.

--The Softmax function for multiclass classification (see the sketch after this list).
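
A minimal NumPy sketch of the Softmax function referenced above; the example logits are arbitrary.

```python
# Minimal NumPy softmax sketch for turning class scores into probabilities.
import numpy as np

def softmax(logits):
    shifted = logits - np.max(logits)  # subtract the max for numerical stability
    exps = np.exp(shifted)
    return exps / exps.sum()

print(softmax(np.array([2.0, 1.0, 0.1])))  # ~[0.659, 0.242, 0.099], sums to 1
```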

Datasets:

--The MNIST and CIFAR-10 datasets for image classification (see the loading sketch after this list).

--GiNZA and pykakasi for Japanese text processing.
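
A minimal sketch of loading both image datasets with torchvision, assuming torchvision is installed alongside the PyTorch setup listed under Tools; the ./data path is an arbitrary choice.

```python
# Minimal sketch: download MNIST and CIFAR-10 with torchvision.
import torchvision
import torchvision.transforms as transforms

to_tensor = transforms.ToTensor()

mnist = torchvision.datasets.MNIST(root="./data", train=True,
                                   download=True, transform=to_tensor)
cifar10 = torchvision.datasets.CIFAR10(root="./data", train=True,
                                       download=True, transform=to_tensor)

print(len(mnist), len(cifar10))   # 60000 and 50000 training images
```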
