  • Stars: 113
  • Rank: 310,115 (Top 7%)
  • License: MIT License
  • Created: about 3 years ago
  • Updated: about 1 year ago

Repository Details

I will share what I learn about Machine Learning and Deep Learning.

MACHINE LEARNING & DEEP LEARNING

Image

Books and Resources (Status of Completion)
1. Deep Learning for Computer Vision with Python I ✅
2. Pattern Recognition and Machine Learning - Bishop ✅
3. Hugging Face ✅
4. Transformers for Natural Language Processing ✅
5. Natural Language Processing with Transformers ✅
Projects and Notebooks
1. Image Classification
2. Linear Classifier
3. Gradient Descent
4. Stochastic Gradient Descent
5. Neural Networks
6. Convolutional Layers II
7. LeNet Architecture
8. VGGNet Architecture
9. Pretrained CNNs
10. Object Detection
11. DCGANs
12. Hugging Face: Transformer Models
13. Hugging Face: Pipeline Function
14. Hugging Face: Models & Tokenizers
15. Hugging Face: Pretrained Models
16. Fine-Tuning BERT Model
17. Machine Translation
18. Text Classification
19. Named Entity Recognition
20. Text Generation
21. Transformers & Production

Day1 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading about the History of Deep Learning, Image Fundamentals, Pixels, Scaling and Aspect Ratios, Image Classification, Semantic Gap, Feature Extraction, Viewpoint Variation, Scale Variation, Deformation, Occlusions, Illumination, Background Clutter, Intra-class Variation, Supervised and Unsupervised Learning, and a few more topics related to the same. I have presented the notes about Image Classification, Semantic Gap, Feature Extraction, Supervised and Unsupervised Learning here in the snapshot. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day2 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Image Classification, OpenCV, Animals Dataset, Raw Pixel Intensities, Convolutional Neural Networks, Dataset Loader and Preprocessing Modules, Aspect Ratio, Resizing and Scaling, and a few more topics related to the same from here. I have presented the implementation of the Image Preprocessor and Dataset Loader here in the snapshot, with a small sketch of the same pattern after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image
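
A minimal sketch of the preprocessor and dataset-loader pattern described above, assuming OpenCV and NumPy; the class names and the path-based labeling scheme are illustrative, not the book's exact code:

```python
import cv2
import numpy as np

class SimplePreprocessor:
    def __init__(self, width, height, inter=cv2.INTER_AREA):
        # Target spatial dimensions and interpolation method for resizing.
        self.width, self.height, self.inter = width, height, inter

    def preprocess(self, image):
        # Resize to a fixed size, ignoring the aspect ratio.
        return cv2.resize(image, (self.width, self.height), interpolation=self.inter)

class SimpleDatasetLoader:
    def __init__(self, preprocessors=None):
        self.preprocessors = preprocessors or []

    def load(self, image_paths):
        data, labels = [], []
        for path in image_paths:
            # Assumes paths shaped like .../{label}/{image}.jpg
            image = cv2.imread(path)
            label = path.split("/")[-2]
            for p in self.preprocessors:
                image = p.preprocess(image)
            data.append(image)
            labels.append(label)
        return np.array(data), np.array(labels)
```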

Day3 of MachineLearningDeepLearning

  • K-Nearest Neighbor: The K-Nearest Neighbor Classifier doesn't actually learn anything; it directly relies on the distance between feature vectors. On my Journey of Machine Learning and Deep Learning, I have been reading the book "Deep Learning for Computer Vision with Python". Here, I have been reading about Image Classification, K-Nearest Neighbor Classifier, Partitioning Dataset, Preprocessing Images, Model Evaluation and Classification Report, Label Encoder, Hyperparameters, and a few more topics related to the same from here. I have presented the implementation of the K-Nearest Neighbor Classifier and Model Evaluation here in the snapshot, with a scikit-learn sketch after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image
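
A hedged sketch of the k-NN workflow from this entry using scikit-learn; the random stand-in data, array shapes, and class names are assumptions:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import LabelEncoder
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Assume `data` is (num_images, 32, 32, 3) raw pixels and `labels` holds class strings.
data = np.random.randint(0, 255, (300, 32, 32, 3), dtype=np.uint8)
labels = np.random.choice(["cat", "dog", "panda"], size=300)

X = data.reshape((data.shape[0], -1))      # flatten images to feature vectors
y = LabelEncoder().fit_transform(labels)   # encode string labels as integers

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

model = KNeighborsClassifier(n_neighbors=3)  # k is a hyperparameter
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```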

Day4 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Parameterized Learning, Cross Entropy Loss and Softmax Classifiers, Weights and Biases, Squared Hinge Loss, Scoring Function and Optimization, Linear Classification, and a few more topics related to the same from here. I have presented the notes about K-Nearest Neighbor and Parameterized Learning here in the snapshot. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day5 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I am reading about Optimization Methods and Regularization, Parameterized Learning and Optimization, Gradient Descent, Loss Landscape and Optimization Surface, Local and Global Minimum, Loss Function, Partial Derivative, Classification Accuracy, and a few more topics related to the same from here. I have also read about Data Pipeline, Meta-Data, Data Provenance and Lineage, and Label Consistency from the Introduction to Machine Learning in Production course on Coursera. I have presented the notes about Gradient Descent and Optimization here in the snapshot. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day6 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Optimization Methods and Regularization, Gradient Descent, Sigmoid Activation Function, Weights and Learning Rate, Iterative Algorithm, Classification Report, Stochastic Gradient Descent, Mini-batch SGD, and a few more topics related to the same from here. I have presented the implementation of the Sigmoid Activation Function and Gradient Descent here in the snapshot, with a NumPy sketch after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image
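
A small NumPy sketch of vanilla gradient descent with a sigmoid activation, in the spirit of this entry; the toy data, learning rate, and epoch count are assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
# Bias trick: append a column of ones so the bias lives inside W.
X = np.hstack([rng.normal(size=(100, 2)), np.ones((100, 1))])
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

W = rng.normal(size=(3, 1))
lr = 0.1
for epoch in range(100):
    preds = sigmoid(X.dot(W))           # forward pass over the whole dataset
    error = preds - y                   # prediction error
    gradient = X.T.dot(error) / len(X)  # average gradient of the loss
    W -= lr * gradient                  # step opposite the gradient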

Day7 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Stochastic Gradient Descent, Mini-batch SGD, Sigmoid Activation Function, Weight Matrix and Losses, Momentum, Nesterov's Acceleration, Regularization, Overfitting and Underfitting, and a few more topics related to the same from here. I have presented the implementation of the Sigmoid Activation Function and Stochastic Gradient Descent, and notes about Regularization, here in the snapshots, with a mini-batch sketch after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image Image
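
A sketch of mini-batch SGD along the lines described above: the weights are updated once per mini-batch rather than once per full pass. The generator and its toy data are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def next_batch(X, y, batch_size):
    # Yield successive mini-batches over the (pre-shuffled) dataset.
    for i in range(0, X.shape[0], batch_size):
        yield X[i:i + batch_size], y[i:i + batch_size]

rng = np.random.default_rng(1)
X = np.hstack([rng.normal(size=(256, 2)), np.ones((256, 1))])  # bias trick
y = (X[:, 0] - X[:, 1] > 0).astype(float).reshape(-1, 1)
W = rng.normal(size=(3, 1))

for epoch in range(10):
    for bx, by in next_batch(X, y, batch_size=32):
        preds = sigmoid(bx.dot(W))
        W -= 0.1 * bx.T.dot(preds - by) / len(bx)  # one update per mini-batch
```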

Day8 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Regularization, Cross Entropy Loss Function, Updating Loss and Weight, L2 Regularization and Weight Decay, Elastic Net Regularization, Image Classification, SGD Classifier, Label Encoder, and a few more topics related to the same from here. I have presented the implementation of Preprocessing the Dataset, Encoding Labels, the SGD Classifier and Regularization here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day9 of MachineLearningDeepLearning

  • Neural Networks: Neural networks are the building blocks of deep learning systems. A system is called a neural network if it contains a labeled, directed graph structure where each node in the graph performs some computation. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Neural Networks, Human Neuron Anatomy, Artificial Models, Weights and Gradients, and a few more topics related to the same from here. I have also spent time using Fastai on Sequences of Images & Video. I have presented the implementation of Preparing the Dataset, Decoding Videos and Extracting Images using Fastai & PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day10 of MachineLearningDeepLearning

  • Neural Networks: Neural networks are the building blocks of deep learning systems. A system is called a neural network if it contains a labeled, directed graph structure where each node in the graph performs some computation. Rectified Linear Unit: ReLU is zero for negative inputs but increases linearly for positive inputs. The ReLU function is not saturable and is also extremely computationally efficient. ReLU is the most popular activation function used in deep learning and has stronger biological motivations. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Activation Functions: Sigmoid, Tanh, ReLU, Feedforward Neural Network Architecture, Neural Learning, The Perceptron Algorithm, AND, OR and XOR Datasets, the Perceptron Training Procedure and the Delta Rule, and a few more topics related to the same from here. I have presented the notes about the Sigmoid Function, ReLU and Feedforward Networks here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day11 of MachineLearningDeepLearning

  • Rectified Linear Unit: ReLU is zero for negative inputs but increases linearly for positive inputs. The ReLU function is not saturable and is also extremely computationally efficient. ReLU is the most popular activation function used in deep learning and has stronger biological motivations. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Neural Networks, the Perceptron Algorithm, Learning Rate, Weight Matrix and Bias, Dot Product, Linear and Non-linear Datasets, Backpropagation and Multilayer Networks, Forward Pass and Backward Pass, and a few more topics related to the same from here. I have presented the implementation of the Perceptron Algorithm here in the snapshot, with a compact sketch after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image
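
A compact Perceptron sketch (step activation, bias trick) in the spirit of the chapter; the learning rate and epoch count are assumptions. On a linearly separable dataset such as OR, it converges:

```python
import numpy as np

class Perceptron:
    def __init__(self, n_features, lr=0.1):
        self.W = np.random.randn(n_features + 1) / np.sqrt(n_features)
        self.lr = lr

    def step(self, x):
        return 1 if x > 0 else 0

    def fit(self, X, y, epochs=20):
        X = np.c_[X, np.ones(X.shape[0])]           # append bias column
        for _ in range(epochs):
            for xi, target in zip(X, y):
                pred = self.step(np.dot(xi, self.W))
                if pred != target:                   # update only on mistakes
                    self.W += -self.lr * (pred - target) * xi

    def predict(self, X):
        X = np.c_[np.atleast_2d(X), np.ones(np.atleast_2d(X).shape[0])]
        return np.array([self.step(np.dot(xi, self.W)) for xi in X])

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # OR dataset
y = np.array([0, 1, 1, 1])
p = Perceptron(n_features=2)
p.fit(X, y)
print(p.predict(X))  # expected: [0 1 1 1]
```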

Day12 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about the Nonlinear XOR Dataset, Learning Rate and Weight Initializations, Neural Networks Architecture, Squared Loss, Backpropagation, Sigmoid Activation Function, Gradient Descent and Weight Updates, Derivatives and the Chain Rule, Dot Product, and a few more topics related to the same from here. I have presented the implementation of a Neural Network and Backpropagation here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image Image

Day13 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been learning about Multi-layer Neural Networks, Backpropagation, Min-max Normalization, One Hot Encoding and Feature Vectors, Probabilities, Gradient Descent, Label Binarizer, Classification Report, SGD Optimizer and Cross Entropy Loss Function, and a few more topics related to the same from here. I have presented the implementation of a Neural Network and Backpropagation using Keras here in the snapshot, with a short Keras sketch after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image
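
A hedged Keras sketch of a small multi-layer network trained with SGD and cross-entropy, as in this entry; the layer sizes, dummy data, and optimizer settings are assumptions rather than the book's exact values:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD

model = Sequential([
    Dense(256, input_shape=(784,), activation="sigmoid"),
    Dense(128, activation="sigmoid"),
    Dense(10, activation="softmax"),   # output class probabilities
])
model.compile(loss="categorical_crossentropy",
              optimizer=SGD(learning_rate=0.01),
              metrics=["accuracy"])

# Dummy one-hot encoded data stands in for e.g. min-max normalized MNIST.
X = np.random.rand(512, 784).astype("float32")
y = np.eye(10)[np.random.randint(0, 10, 512)]
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```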

Day14 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been learning about Convolutional Neural Networks, Convolutions versus Cross-correlation, Kernels, CNN Building Blocks, Layer Types, Depth, Stride, Zero-padding, Filters and Receptive Field, and a few more topics related to the same from here. I have presented the notes about Backpropagation and Convolutional Layers here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day15 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Activation Layers, Pooling Layers, ReLU, Fully Connected Layers, Batch Normalization Layer, Dropout Layer, Convolutional Neural Network Patterns, Image To Array Preprocessor, Resizing and Shallow Network, Sequential Model, and a few more topics related to the same from here. I have presented the implementation of the Image to Array Preprocessor and Shallow Network here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day16 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about the LeNet Architecture, Convolutional Layers, ReLU Activation Function, Max Pooling Layer, Fully Connected Dense Layer, Softmax Activation Function, Input Data Format and Channels, Flatten Layer, Label Binarizer and Encoding, SGD, Classification Report, and a few more topics related to the same from here. I have presented the implementation of the LeNet Architecture, Training and Model Evaluation here in the snapshot, with a Keras sketch of the architecture after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image Image
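
A LeNet-style architecture in Keras following the layer pattern listed above (CONV => POOL => CONV => POOL => FC => softmax); the filter counts are representative values, not necessarily the book's:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

def build_lenet(width=28, height=28, depth=1, classes=10):
    return Sequential([
        Conv2D(20, (5, 5), padding="same", activation="relu",
               input_shape=(height, width, depth)),
        MaxPooling2D(pool_size=(2, 2), strides=(2, 2)),
        Conv2D(50, (5, 5), padding="same", activation="relu"),
        MaxPooling2D(pool_size=(2, 2), strides=(2, 2)),
        Flatten(),
        Dense(500, activation="relu"),
        Dense(classes, activation="softmax"),  # class probabilities
    ])

model = build_lenet()
model.summary()
```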

Day17 of MachineLearningDeepLearning

  • Logistic Regression: When an unnecessary or excessive number of variables is used in a logistic regression model, peculiarities, i.e. special attributes of the underlying dataset, disproportionately affect the coefficients of the model, a phenomenon commonly known as overfitting. So, it is important that the logistic regression model doesn't train on more variables than is justified for the given number of observations. On my Journey of Machine Learning and Deep Learning, I have been reading the book "Deep Learning for Computer Vision with Python". Here, I have been reading about VGG Networks, Batch Normalization, Max Pooling and Activations, Fully Connected Layers, Classification Report, Learning Rate and Decay Parameters, and a few more topics related to the same from here. I have presented the implementation of the VGGNet Architecture here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day18 of MachineLearningDeepLearning

  • Logistic Regression: In the case of logistic regression, the response variable is the log of the odds of being classified in a group of binary or multi-class responses. This definition essentially demonstrates that odds can take the form of a vector. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Learning Rate Schedulers, Step-based Decay, Spotting Overfitting and Underfitting, Training Error and Generalization Error, Effects of Learning Rates, Loss and Accuracy Curves, VGG Network Architectures, and a few more topics related to the same from here. I have presented the notes on the VGGNet Architecture here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day19 of MachineLearningDeepLearning

  • Convolutional Neural Networks: Convolutions are just a type of matrix multiplication with two constraints on the weight matrix: some elements are always zero and some elements are tied or forced to always have the same value. Batch Normalization adds some extra randomness to the training process. Larger batches have gradients that are more accurate since they are calculated from more data, but a larger batch size means fewer batches per epoch, which means fewer opportunities for the model to update its weights. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Pretrained Convolutional Neural Networks for Classification, VGG Neural Networks, ResNet Architectures, Inception V3 and GoogLeNet, Xception, Processing Images and ImageNet, and a few more topics related to the same from here. I have presented the implementation of the pretrained VGGNet and Xception modules here in the snapshot, with a VGG16 inference sketch after this entry. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image
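
A sketch of ImageNet inference with a pretrained VGG16 via the standard Keras applications API; the image path is illustrative and the weights download on first use:

```python
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input, decode_predictions
from tensorflow.keras.preprocessing.image import load_img, img_to_array

model = VGG16(weights="imagenet")

image = load_img("example.jpg", target_size=(224, 224))  # path is illustrative
x = img_to_array(image)
x = preprocess_input(np.expand_dims(x, axis=0))          # mean subtraction, etc.

preds = model.predict(x)
for (_, label, prob) in decode_predictions(preds, top=3)[0]:
    print(f"{label}: {prob:.4f}")
```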

Day20 of MachineLearningDeepLearning

  • Object Detection: Object Detection is a computer technology related to computer vision and image processing that deals with detecting instances of semantic objects of a certain class. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Object Detection with Pretrained Networks, the COCO Dataset, Preprocessing Images and Video, Real Time Object Detection, and a few more topics related to the same from here. I have presented the implementation of Object Detection here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning for Computer Vision with Python

Image Image

Day21 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have started reading the book Deep Learning - Ian Goodfellow. Here, I have read about the Introduction to DL, Scalars, Vectors, Matrices and Tensors, Random Variables and Probability Distributions, Overflow and Underflow, Gradient Based Optimization, Learning Algorithms, and many more topics related to the same. I have presented the implementation of The Generator and The Discriminator using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead!!
  • Book:
    • Deep Learning - Ian Goodfellow

Image Image

Day22 of MachineLearningDeepLearning

  • Pattern Recognition: The field of pattern recognition is concerned with the automatic discovery of regularities in data through the use of computer algorithms and with the use of these regularities to take actions such as classifying the data into different categories. Generalization, the ability to correctly categorize new examples that differ from those used for training, is a central goal in pattern recognition. On my Journey of Machine Learning and Deep Learning, I have started reading the book Pattern Recognition and Machine Learning - Bishop. Here, I have read about the Training and Learning Phase, Generalization, Pattern Recognition and Feature Extraction, Supervised Learning, Unsupervised Learning, Clustering and Density Estimation, Reinforcement Learning, and a few more topics related to the same. I have shared the notes about Pattern Recognition, Supervised and Unsupervised Learning and Reinforcement Learning here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs

Image

Day23 of MachineLearningDeepLearning

  • Reinforcement Learning: The technique of reinforcement learning is concerned with the problem of finding suitable actions to take in a given situation in order to maximize a reward. A general feature of reinforcement learning is the trade-off between exploration, in which the system tries out new kinds of actions to see how effective they are, and exploitation, in which the system makes use of actions that are known to yield a high reward. On my Journey of Machine Learning and Deep Learning, I have read about Deep Convolutional Generative Adversarial Networks, Image Segmentation, The Generator and The Discriminator, Weights Initialization, the ReLU Function, Convolutional Layers, Batch Normalization Layer, Cross Entropy Loss Function, Data Loader, Gradients, and a few more topics related to the same from here. I have presented the implementation of Training DCGANs here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs

Image

Day24 of MachineLearningDeepLearning

  • Unsupervised Learning: The pattern recognition problems in which the training data consists of a set of input vectors x without any corresponding target values are called unsupervised learning problems. The goal in such unsupervised learning problems may be to discover groups of similar examples within the data, which is called clustering; to determine the distribution of data within the input space, known as density estimation; or to project the data from a high-dimensional space down to two or three dimensions for the purpose of visualization. On my Journey of Machine Learning and Deep Learning, I have started reading the book Pattern Recognition and Machine Learning - Bishop. Here, I have read about Probability Theory, The Rules of Probability, Bayes' Theorem, Probability Densities, Expectations and Covariances, Bayesian Probabilities, The Gaussian Distribution, Maximum Likelihood, and a few more topics related to the same from here. I have shared the notes about Probability Theory, Bayes' Theorem and The Rules of Probability here in the snapshot, with the key identities restated after this entry. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs

Image
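
For reference, a restatement of the sum and product rules covered above, and the Bayes' theorem that follows from them:

```latex
% The two rules of probability:
\text{sum rule: } p(X) = \sum_{Y} p(X, Y)
\qquad
\text{product rule: } p(X, Y) = p(Y \mid X)\, p(X)
% Combining the two gives Bayes' theorem, with the denominator
% acting as a normalization constant:
p(Y \mid X) = \frac{p(X \mid Y)\, p(Y)}{p(X)},
\qquad
p(X) = \sum_{Y} p(X \mid Y)\, p(Y)
```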

Day25 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have started reading the book "Pattern Recognition and Machine Learning - Bishop". Here, I am reading about Model Selection, The Curse of Dimensionality, Decision Theory, Minimizing the Misclassification Rate, Minimizing the Expected Loss, The Reject Option, Joint Probability Distribution, and a few more topics related to the same from here. I have shared the notes about Model Selection, Decision Theory and Loss Function here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs

Image

Day26 of MachineLearningDeepLearning

  • Natural Language Processing: NLP is a field of linguistics and machine learning focused on understanding everything related to human language. The aim of NLP tasks is not only to understand single words individually, but to be able to understand the context of those words. On my journey of Machine Learning and Deep Learning, I have started reading about Hugging Face. I have read about Natural Language Processing and Challenges, Sentiment Analysis, Zero Shot Classification, Text Generation, Mask Filling, Named Entity Recognition, Summarization, and Translation, and a few more topics related to the same from here. I have presented the implementation of Transformer Models here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the resource mentioned below. Excited about the days ahead!!
  • Resource:
    • Hugging Face

Image

Day27 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have started reading about Hugging Face. I have read about Transformers, Transfer Learning, Attention Layers, Encoder Models, Decoder Models, Sequence2Sequence Models, Bias & Limitations, the Pipeline Function, Tokenizer, Model Head, and a few more topics related to the same from here. I have presented the implementation of the Pipeline Function here in the snapshot, with a small pipeline() example after this entry. I hope you will gain some insights and you will also spend some time learning the topics from the resource mentioned below. Excited about the days ahead!!
  • Resource:
    • Hugging Face

Image
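
A minimal illustration of the pipeline() function discussed above; checkpoints download on first call, and the default models may change between transformers versions, so the explicit model id is an assumption:

```python
from transformers import pipeline

# Task-only usage: the library picks a default checkpoint.
classifier = pipeline("sentiment-analysis")
print(classifier("I've been waiting for a HuggingFace course my whole life."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Explicit checkpoint usage (gpt2 chosen for illustration).
generator = pipeline("text-generation", model="gpt2")
print(generator("Transformers are", max_length=20, num_return_sequences=1))
```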

Day28 of MachineLearningDeepLearning

  • Sequential Learning: If the data set is sufficiently large, it may be worthwhile to use sequential algorithms, also known as on-line algorithms, in which the data points are considered one at a time and the model parameters are updated after each such presentation. Sequential learning is also appropriate for applications in which the data observations arrive in a continuous stream. It can be achieved by applying the technique of stochastic gradient descent, also known as sequential gradient descent. On my journey of Machine Learning and Deep Learning, I am reading the book Pattern Recognition and Machine Learning - Bishop. Here, I have read about Sequential Learning, the Geometry of Least Squares, Maximum Likelihood, Linear Basis Function Models, Linear Regression, Binary Variables and Multinomial Variables, and a few more topics related to the same from here.
  • Book:
    • Pattern Recognition and Machine Learning

Image

Day29 of MachineLearningDeepLearning

  • Attention Masks: Attention masks are tensors with the same shapes as the input IDs tensors, filled with 0s and 1s: 1s indicate the corresponding tokens should be attended to, and 0s indicate the corresponding tokens should not be attended to, i.e. they should be ignored by the attention layers of the model. On my journey of Machine Learning and Deep Learning, I have started reading from Hugging Face. Here, I have read about Creating a Transformer, Tokenizers, Word-based, Character-based and Subword Tokenization, Encoding, Decoding, Padding and Attention Masks, and a few more topics related to the same from here. I have presented the implementation of Tokenizers, Padding and Attention Masks here in the snapshot, with a tokenizer example after this entry. I hope you will gain some insights and you will also spend some time learning the topics from the resource mentioned below. Excited about the days ahead!!
  • Resource:
    • Hugging Face

Image
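
A sketch of padding and attention masks with a Hugging Face tokenizer; the checkpoint and sentences are illustrative:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(
    ["Short sentence.",
     "A noticeably longer sentence that needs no padding at all."],
    padding=True,            # pad to the longest sequence in the batch
    truncation=True,
    return_tensors="pt",
)
print(batch["input_ids"].shape)
print(batch["attention_mask"])   # 1 = attend to token, 0 = ignore padding
```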

Day30 of MachineLearningDeepLearning

  • Subword Tokenization: Subword tokenization relies on the principle that frequently used words should not be split into smaller subwords, but rare words should be decomposed into meaningful subwords. On my journey of Machine Learning and Deep Learning, I have started reading from Hugging Face. I have read about Configurable Tokenizer Methods, Attention Masks, the Tokenization Pipeline, Vocabulary, Tensors and Arrays, Padding and Truncation, and a few more topics related to the same from here. I have presented the implementation of the Tokenization Pipeline here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the resource mentioned below. Excited about the days ahead!!
  • Resource:
    • Hugging Face

Image

Day31 of MachineLearningDeepLearning

  • Dynamic Padding: The function that is responsible for putting together samples inside a batch is called a collate function. Dynamic Padding means the samples in the batch should all be padded to the maximum length inside the batch. On my journey of Machine Learning and Deep Learning, I have been reading from Hugging Face. I have read about Loading & Processing the Data, Tokenization, Dynamic Padding, the Collate Function, Fine Tuning and the Trainer API, Training Arguments, Transformer Model, and a few more topics related to the same from here. I have presented the implementation of the Tokenization Pipeline & Training here in the snapshot, with a dynamic-padding sketch after this entry. I hope you will gain some insights and you will also spend some time learning the topics from the resource mentioned below. Excited about the days ahead!!
  • Resource:
    • Hugging Face

Image Image
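
A sketch of dynamic padding with a collate function, as described above: each batch is padded only to the longest sample it contains. The checkpoint and sentences are illustrative:

```python
from transformers import AutoTokenizer, DataCollatorWithPadding

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)

# Tokenize without padding; the collator pads per batch at collation time.
samples = [tokenizer(s) for s in
           ["One.", "Two tokens?", "Three little tokens here."]]
batch = data_collator(samples)
print(batch["input_ids"].shape)  # padded to the longest sequence in this batch
```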

Day32 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been reading the book Transformers for Natural Language Processing. Here, I have read about Transformers, Encoder & Decoder, Positional Encoding, Multi-head Attention, BERT Architecture, Fine-tuning BERT, Optimizer & Hyperparameters, Masked Language Modeling, Next Sentence Prediction, Matthews Correlation Coefficient, and a few more topics related to the same from here. I have presented the implementation of Training the BERT Model here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Transformers for Natural Language Processing

Image

Day33 of MachineLearningDeepLearning

  • Machine Translation: Machine translation is the process of reproducing human translation by machine transductions and outputs. The transduction process of the original Transformer architecture uses the encoder, the decoder stack, and all of the model's parameters to represent a reference sequence. On my journey of Machine Learning and Deep Learning, I have been reading the book Transformers for Natural Language Processing. Here, I have read about Pretraining the RoBERTa Model, Machine Transduction & Transformers, Machine Translation, BLEU and Trax, and a few more topics related to the same from here. I have presented the implementation of Processing the WMT Dataset here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Transformers for Natural Language Processing

Image

Day34 of MachineLearningDeepLearning

  • Encoder-Decoder Architectures: The numerical representation computed for a given token in an encoder-only transformer architecture depends both on the left (before the token) and the right (after the token) context, which is called bidirectional attention. The numerical representation computed for a given token in a decoder-only transformer architecture depends only on the left context, which is called autoregressive attention. On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Encoder-Decoder Architectures, Attention Mechanisms, Transfer Learning, the 🤗 Ecosystem, Text Classification, Class Distributions, Tokenization, Fine-Tuning Transformers and Feature Extraction, and many more topics related to Transformers.
  • Book:
    • Natural Language Processing with Transformers

Image

Day35 of MachineLearningDeepLearning

  • Named Entity Recognition: NER is a common NLP task that identifies entities like people, organizations or locations in text. These entities can be used for various applications such as gaining insights from documents, augmenting the quality of search engines, or building a structured database from a corpus. On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Multilingual Named Entity Recognition, Cross-Lingual Transfer, Text Generation, Greedy Search Decoding, Beam Search Decoding, Sampling Methods, Top-k and Nucleus Sampling, Fine-Tuning XLM-RoBERTa, Error Analysis, and many more topics related to the same from here. I have presented the implementation of Text Generation here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Natural Language Processing with Transformers

Image

Day36 of MachineLearningDeepLearning

  • Knowledge Distillation: Knowledge distillation is a general-purpose method for training a smaller student model to mimic the behavior of a slower, larger, but better performing teacher model. The KL divergence expects the inputs in the form of log probabilities and the labels as normal probabilities. So, we have used log softmax to normalize the student's logits, while the teacher's logits are converted to probabilities with a standard softmax. On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Performance Benchmarking, Knowledge Distillation for Fine-Tuning, the Distillation Trainer, Text Summarization and Question Answering Pipelines, and many more topics related to the same. I have presented the implementation of the Distillation Training Arguments, Trainer and Computing Metrics using Transformers here in the snapshot, with a sketch of the distillation loss after this entry. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Natural Language Processing with Transformers

Image
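
A hedged sketch of the distillation loss described above: KL divergence between temperature-scaled teacher and student distributions, with log-softmax on the student side since PyTorch's kl_div expects log-probabilities as input. The temperature value is an assumption:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # T^2 rescales the gradients so they stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher,
                    reduction="batchmean") * temperature ** 2

student = torch.randn(4, 10)   # dummy logits: batch of 4, 10 classes
teacher = torch.randn(4, 10)
print(distillation_loss(student, teacher))
```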

Day37 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Dealing with Few Labels, Zero-Shot Classification, Multilabel Text Classification & Multilabel Binarizer, Training Slices, Naive Bayes Classifier, Natural Language Inference, Data Augmentation and Embeddings, Mean Pooling, and many more topics related to the same from here. I have presented the implementation of Training the Naive Bayes Classifier and Mean Pooling using Transformers here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!!
  • Book:
    • Natural Language Processing with Transformers

Image

Day38 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have started taking the course Stanford CS224N: NLP with Deep Learning. Here, I am learning about Word Vectors, Conditional Probability Distribution, Distributional Semantics, Word2Vec Embedding Model, Softmax Function, Human Language and Word meaning, and many more topics related to the same. I have shared the notes about Word Vectors, Distributional Semantics, and Word2Vec Model here in the snapshot. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • Stanford CS224N: NLP with Deep Learning

Image

Day39 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been taking the course Stanford CS224N: NLP with Deep Learning. Here, I am learning about Word Vectors, Word2Vec Model, Training Methods and Algorithms, Skip Grams and Continuous Bag of Words Models, Gradient Descent and Problems, Stochastic Gradient Descent, and many more topics related to the same. I have shared the notes about Word2Vec Model, and Gradient Descent here in the snapshot. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • Stanford CS224N: NLP with Deep Learning

Image

Day40 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been taking the course UMass CS685: Advanced Natural Language Processing. Here, I am learning about Natural Language Processing, Supervised & Self-supervised Learning, Representation Learning, Sentiment Analysis, Language Modeling & Importances, Chain Rule & Markov Rule, Unigram & Bigram Models, Log Probabilities, Perplexity, and many more topics related to the same. I have shared the notes about Language Modeling & Perplexity here in the snapshot. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • UMass CS685: Advanced Natural Language Processing

Image

Day41 of MachineLearningDeepLearning

  • Batch size generally tells us how many training examples we use to estimate the derivative of the loss with respect to the parameters before taking a step. On my journey of Machine Learning and Deep Learning, I have been taking the course UMass CS685: Advanced Natural Language Processing. Here, I am learning about N-gram Models, Recurrent Neural Networks, Batch Size, Cross-entropy Loss Function, Gradient Descent & Backpropagation, Composition Functions, Forward Propagation, One-hot Vectors & Vocabulary, and many more topics related to the same. I have shared the notes about Language Modeling & Gradient Descent here in the snapshot. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • UMass CS685: Advanced Natural Language Processing

Image

Day42 of MachineLearningDeepLearning

  • Text to Speech: Text-to-speech (TTS), also known as speech synthesis, aims to synthesize intelligible and natural speech from a given text and has broad applications in human communication. Developing a TTS system requires knowledge of languages and human speech production. Transformer-based Acoustic Models: TransformerTTS leverages a Transformer-based encoder-attention-decoder architecture to generate mel-spectrograms from phonemes. TransformerTTS adopts the basic model structure of the Transformer and absorbs some designs from Tacotron 2, such as the decoder pre-net and post-net and stop token prediction. It achieves similar voice quality to Tacotron 2 but enjoys faster training time. I have shared the notes about Text-to-Speech & Text Analysis here in the snapshot. I hope you will gain some insights and spend time learning the topics mentioned below. I am excited about the days ahead.
  • A Survey on Neural Speech Synthesis

Image

Day43 of MachineLearningDeepLearning

  • The Module class contains four methods: The __init__ method stores the learnable parameters. The training_step method accepts a data batch to return the loss value. The configure_optimizers method returns the optimization method that is used to update the learnable parameters. The validation_step method reports the evaluation measures. On my journey of Machine Learning and Deep Learning, I am reading the book "Dive into Deep Learning". I have read about Object Oriented Design Implementation & Modules, Linear Regression, Vectorization, Normal Distribution & Squared Loss, Linear Algebra & Calculus, Probability & Statistics, Data Manipulation, Data Preprocessing, and many more topics related to the same. I have shared the implementation of the Module here in the snapshot. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning

Image

Day44 of MachineLearningDeepLearning

  • DataLoaders: DataLoaders are a convenient way of abstracting out the process of loading and manipulating data so that the same machine-learning algorithm is capable of processing many different types and sources of data without the need for modification. On my journey of Machine Learning and Deep Learning, I am reading the book "Dive into Deep Learning". I have read about Data Module & DataLoader, Trainer Class & Training, Synthetic Regression Data, Object Oriented Design, Loss & Minibatches, and I have also read about Greedy Algorithms, Dynamic Programming, Backtracking, and many other topics related to the same. I have shared the implementation of the Data Module here in the snapshot. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning

Image

Day45 of MachineLearningDeepLearning

  • Attention & Transformers: The idea behind the Transformer model is the attention mechanism, an innovation that was originally envisioned as an enhancement for encoder-decoder RNNs applied to sequence-to-sequence models. The intuition behind attention is that rather than compressing the input, it might be better for the decoder to revisit the input sequence at every step. On my journey of Machine Learning and Deep Learning, I continued reading the book Dive into Deep Learning. I have read about Linear Regression, Training Errors & Generalization Errors, Normalization & Weight Decay, Attention Mechanisms & Transformers, Queries, Keys & Values, Attention Pooling & Nadaraya Watson Regression, and many other topics related to the same. I have shared the implementation of the Nadaraya Watson and Kernels here in the snapshot. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning

Image

Day46 of MachineLearningDeepLearning

  • Attention & Transformers: The idea behind the Transformer model is the attention mechanism, an innovation that was originally envisioned as an enhancement for encoder-decoder RNNs applied to sequence-to-sequence models. The intuition behind attention is that rather than compressing the input, it might be better for the decoder to revisit the input sequence at every step. On my journey of Machine Learning and Deep Learning, I continued reading the book Dive into Deep Learning. I have been reading about Attention Mechanisms & Transformers, Attention Scoring Functions, Dot Product Attention, the Masked Softmax Operation, Batch Matrix Multiplication, Scaled Dot Product Attention, Additive Attention, and many other topics related to the same. I have shared the implementation of the Scaled Dot Product Attention here in the snapshot, with a compact PyTorch sketch after this entry. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning

Image
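
Scaled dot product attention as covered above, in a few lines of PyTorch; the optional mask branch mirrors the masked softmax operation mentioned in the notes, and the tensor shapes are illustrative:

```python
import math
import torch

def scaled_dot_product_attention(queries, keys, values, mask=None):
    # queries: (B, n_q, d), keys: (B, n_k, d), values: (B, n_k, d_v)
    d = queries.shape[-1]
    scores = queries @ keys.transpose(-2, -1) / math.sqrt(d)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))  # masked softmax
    weights = torch.softmax(scores, dim=-1)   # attention weights sum to 1
    return weights @ values

q, k, v = torch.randn(2, 3, 8), torch.randn(2, 5, 8), torch.randn(2, 5, 8)
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 3, 8])
```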

Day47 of MachineLearningDeepLearning

  • GPT-based Transformers: GPT-based Transformers are pre-trained on a large corpus of text data, which gives them a general understanding of natural language and allows them to perform well on various downstream tasks with minimal fine-tuning. They predict the next token in a sequence based on the previous tokens, which enables them to generate coherent and fluent text. On my journey of Machine Learning and Deep Learning, I am following the materials of Andrej Karpathy on implementing GPT (ChatGPT) from scratch. I learned about character-level tokenization, encoding and decoding characters, vocabulary tables, the data loader, the Bigram Language Model, training pipelines and architecture implementation, optimization, and many other topics related to the same. I have presented the implementation of the Bigram Language Model here in the snapshot, with a minimal version after this entry. I hope you will gain some insights and spend time learning the topics from the materials mentioned above. I am excited about the days ahead.

Image
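
A minimal bigram language model in the spirit of Karpathy's lecture: each token's embedding row directly holds the logits for the next token. The vocabulary size is illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        # Row i of this table is the logit vector for "what follows token i".
        self.token_embedding = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding(idx)             # (B, T, vocab_size)
        loss = None
        if targets is not None:
            B, T, C = logits.shape
            loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            probs = F.softmax(logits[:, -1, :], dim=-1)  # last time step only
            idx = torch.cat([idx, torch.multinomial(probs, num_samples=1)], dim=1)
        return idx

model = BigramLanguageModel(vocab_size=65)
print(model.generate(torch.zeros((1, 1), dtype=torch.long), max_new_tokens=10))
```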

Day48 of MachineLearningDeepLearning

  • GPT-based Transformers: GPT-based Transformers are pre-trained on a large corpus of text data, which gives them a general understanding of natural language and allows them to perform well on various downstream tasks with minimal fine-tuning. They predict the next token in a sequence based on the previous tokens, which enables them to generate coherent and fluent text. On my journey of Machine Learning and Deep Learning, I am following the materials of Andrej Karpathy on implementing GPT (ChatGPT) from scratch. I continued working on GPT and learned about Multi-head Self-Attention, Feedforward Networks with nonlinearity, Token and Positional Embeddings, Layer Normalization, Dropout, and many other topics related to the same. I have presented the implementation of the GPTLanguageModel, i.e. the pre-training stage of ChatGPT, here in the snapshot. I hope you will gain some insights and spend time learning the topics from the materials mentioned above. I am excited about the days ahead.

Image

Day49 of MachineLearningDeepLearning

  • Stable Diffusion: Stable Diffusion is a deep learning text-to-image model which is primarily used to generate detailed images conditioned on text descriptions, though it can also be applied to other tasks such as inpainting, outpainting, and generating image-to-image translations guided by a text prompt. It is a latent diffusion model, which is a kind of deep generative neural network. On my journey of Machine Learning and Deep Learning, I am learning about GAN Models, Large Language Models, Stable Diffusion, and Whisper. In Stable Diffusion, I learned about the Stable Diffusion Pipeline, Classifier-Free Guidance, Negative Prompts, Image to Image Diffusion, and many other topics related to the same. I have presented the implementation of Image to Image Diffusion here in the snapshot, with a hedged text-to-image sketch after this entry. I hope you will gain some insights and spend time learning the topics mentioned above. I am excited about the days ahead.

Image
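
A hedged sketch of generation with the diffusers library, showing the pipeline, negative prompts, and classifier-free guidance mentioned above (the text-to-image pipeline rather than the image-to-image variant); the model id, prompt, and guidance scale are assumptions, and a GPU is assumed for reasonable speed:

```python
import torch
from diffusers import StableDiffusionPipeline

# Model id is illustrative; weights download on first use.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    "a watercolor painting of a mountain village at dawn",
    negative_prompt="blurry, low quality",   # steer generation away from these traits
    guidance_scale=7.5,                      # classifier-free guidance strength
    num_inference_steps=30,
).images[0]
image.save("village.png")
```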

More Repositories

1. 300Days__MachineLearningDeepLearning (520 stars): I am sharing my Journey of 300DaysOfData in Machine Learning and Deep Learning.
2. 66Days__NaturalLanguageProcessing (186 stars): I am sharing my Journey of 66DaysofData in Natural Language Processing.
3. ML..Interview..Preparation (168 stars): Preparation for Machine Learning Interview.
4. MLOps (Jupyter Notebook, 80 stars): The repository contains a list of projects which I will work on while learning and implementing MLOps.
5. Transformers_NLP (Jupyter Notebook, 65 stars): The repository will contain a list of projects which we will work on while reading the books of Natural Language Processing & Transformers.
6. Fastai (Jupyter Notebook, 64 stars): I will implement Fastai in each project present in this repository.
7. Meta-llama (Python, 40 stars): Complete implementation of Llama2 with/without KV cache & inference 🚀
8. ComputerVision (Jupyter Notebook, 25 stars): The repository is dedicated to the implementation of Computer Vision.
9. HuggingFace (Jupyter Notebook, 22 stars): I am learning from Hugging Face.
10. CaliforniaHousing__Prices (HTML, 14 stars): I have built a Model using a Random Forest Regressor on the California Housing Prices Dataset to predict the price of Houses in California.
11. MachineLearning__Algorithms (Jupyter Notebook, 13 stars): I am working on implementing Machine Learning Algorithms from scratch.
12. NeuralNetworks__SentimentAnalysis (Jupyter Notebook, 8 stars): In this repository, I have worked on the Large Movie Review Dataset for the Sentiment Analysis of Text. I have implemented CNN, RNN and LSTM to predict the Sentiment of the Text Data.
13. Github-Actions (Makefile, 7 stars): This is a repo for building out Github Actions and Tricks.
14. GAN (Jupyter Notebook, 7 stars): I have prepared a Deep Convolutional Generative Adversarial Network here.
15. ThinamXx (6 stars): Updating.........
16. MachineLearning_with_Python (Jupyter Notebook, 6 stars): In this repository, you will gain insights about various Supervised and Unsupervised Machine Learning Algorithms implemented on real datasets along with Visualizations.
17. ApproachingAnyMachineLearning (Jupyter Notebook, 6 stars): The repository contains a list of projects which I have worked on while reading the book Approaching Any Machine Learning Problem.
18. Covid19..DataAnalysis (Jupyter Notebook, 5 stars): In this Project, I have worked with Covid19 Data and HDI Data to analyze the relationships between the increasing rate of Covid19 patients and various HDI factors like Healthy Life, Freedom and so on.
19. D2L (Jupyter Notebook, 5 stars): The repository contains a list of projects & notebooks which I have worked on while reading the book Dive into Deep Learning.
20. DuplicateQuestions__Recognition (Jupyter Notebook, 5 stars): I have built an LSTM Model using Trax which can identify Duplicate or Similar Questions, which is useful when we have to work with several versions of the same Question.
21. FacialExpression_Classification (Jupyter Notebook, 5 stars): In this repository, I have created a model which recognizes Happy Face, Sad Face, Surprise Face, Angry Face and Laughing Face using Python, the Fastai API and Convolutional Neural Networks.
22. DogBreedClassification (Jupyter Notebook, 5 stars): I have prepared a pretrained Neural Network Model which helps in Image Classification. The Model can classify 120 different breeds of dogs.
23. SemanticAnalysis__LDA..LDIA (Jupyter Notebook, 5 stars): In this repository, I have included a Notebook which contains the Implementation of LSA, LDA and LDIA in Semantic Analysis. I have presented all the Code Snippets with proper Documentation, and I hope you will gain Insights about Semantic Analysis with LSA and LDIA.
24. reinforcement-learning (Python, 4 stars): You will learn about RLHF from this repository 🤖.
25. Horse.vs.Human_Classification (Jupyter Notebook, 4 stars): In this repository, I have used a Convolutional Neural Network for the binary classification between a Human and a Horse. You will gain insights about the Implementation of Convolutional Neural Networks in image classification.
26. Chatbot__Sequence2Sequence (Jupyter Notebook, 4 stars): Here, I have prepared a Chatbot using Sequence to Sequence Neural Networks. I hope you will gain insights about the Implementation of Sequence to Sequence Learning in a Chatbot.
27. YelpReviews__Analysis (Jupyter Notebook, 3 stars): In this repository, I have worked on the Sentiment Analysis of the YELP Reviews Dataset using PyTorch. I hope you will gain some insights about the Implementation of PyTorch in Natural Language Processing.
28. SGD_from_Scratch (Python, 3 stars): In this repository, you will gain insights about the Preparation of Linear Regression and SGD using PyTorch.
29. MusicGeneration_with_DeepLearning (Jupyter Notebook, 3 stars): In this repository, RNN and LSTM models compose folk music. You will gain a lot of insights about Deep Learning and RNNs.
30. transformer-pytorch (Python, 3 stars): Implementation of the Transformer model from the paper Attention Is All You Need in PyTorch.
31. blog (CSS, 3 stars): Personal website for Thinam Tamang.
32. AmazonReviews__Analysis (Jupyter Notebook, 3 stars): In this repository, I have performed the Text Analysis of the Amazon Reviews Dataset using only TensorFlow and TensorBoard. I hope you will gain some insights about the Implementation of TensorFlow here.
33. CIFAR10__Recognition (Jupyter Notebook, 3 stars): I have prepared a Neural Network Model which helps in recognizing the objects in an image.
34. Bitcoin_CryptocurrencyMarket..Analysis (Jupyter Notebook, 3 stars): Hey there, in this repository I will be Analyzing the Bitcoin Cryptocurrency Market. I have applied simple Data Manipulation and Data Visualization techniques. I have done this analysis using Jupyter Notebooks and the Python Programming Language. Hope you get some insights about Data Manipulation!!
35. Ups.and.Downs_with_Kardashians (Jupyter Notebook, 3 stars): While I'm not a fan nor a hater of the Kardashians and Jenners, the polarizing family intrigues me. Why? Their marketing prowess. Say what you will about them and what they stand for, they are great at the hype game. Everything they touch turns to content.
36. Twitter..Sentiment..Analysis (Jupyter Notebook, 3 stars): Twitter Sentiment Analysis with the Naive Bayes Algorithm. You can gain insights about the Naive Bayes Algorithm and various Exploratory Data Analysis techniques.
37. ImageClassification_DataAugmentation_CNN (Python, 2 stars): This repository presents the Implementation of Convolutional Neural Networks along with ResNet by using Fast.ai from scratch.
38. TextMining_in_Python (Jupyter Notebook, 2 stars): In this repository, you will gain insights about Natural Language Processing in Python using NLTK.
39. thinam.ai (HTML, 2 stars): Personal website built with Fastpages.
40. RecurrentNeuralNetworks_RNN (Jupyter Notebook, 2 stars): In this repository, you will gain insights about the implementation of Recurrent Neural Networks in real-world problems.
41. Natural_Language__Inference (Jupyter Notebook, 2 stars): I have prepared an Attention Model for Natural Language Inference.
42. TextSentiment_Analysis (Jupyter Notebook, 2 stars): In this repository, you will gain insights about the Sentiment Analysis of Text documents with the Implementation of NLTK, TF-IDF, and LogisticRegression.
43. AutonomousDriving..YOLO (Jupyter Notebook, 2 stars): Here, you will gain insights about the implementation of the popular YOLO model in Object Detection, which ultimately helps in Autonomous Driving.
44. ImageSegmentation_with_CamVid (Jupyter Notebook, 2 stars): In this repository, I have used Fastai Datasets to perform Image Segmentation and I have checked the accuracy of the Model. For this project, I have used the Fastai Library to create the model and to check the accuracy.
45. TV_Halftime_Shows..BigGame (Jupyter Notebook, 2 stars): Hey there!! In this Project I have tried to answer simple Questions about the TV, Halftime Shows and the Big Games. It will give you some insights about Pandas as well. I have done this Project in a Jupyter Notebook using the Python Programming Language!!
46. BreastCancer_Wisconsin_Diagnostic (Jupyter Notebook, 2 stars): In this repository, I have implemented KNeighborsClassifier to classify Breast Cancer: Malignant or Benign.
47. C.Programs (C, 2 stars): Hey there!! Here, I have included the basic programs of the C Programming Language.
48. TheDiscovery_of_Handwashing (Jupyter Notebook, 2 stars): Hey there!! In this notebook, I am going to reanalyze the data that made Semmelweis discover the importance of handwashing.
49. ImageProcessing_and_Manipulation (Jupyter Notebook, 2 stars): In this repository, I have tried to process the image of a Honeybee and point out its difference from a Bumblebee. The question at hand is: can a machine identify a bee as a honey bee or a bumble bee? These bees have different behaviors and appearances, but given the variety of backgrounds, positions, and image resolutions it can be a challenge for machines to tell them apart.
50. PredictFutureSales_XGBoost (Jupyter Notebook, 2 stars): In this repository, I have worked on one of the Kaggle Competition Datasets, "Predict Future Sales". I have used XGBoostRegressor. I have also used the Fastai API for Feature Preprocessing to enhance the Model accuracy. You can get insights about the Fastai Implementation as well.
51. TimeSeries_Sunspots (Jupyter Notebook, 2 stars): In this repository, you will gain insights about the Implementation of Deep Neural Networks along with CNN, RNN and LSTM on Time-Series data.
52. LanguageTranslator_EnglishFrench (Jupyter Notebook, 2 stars): I have prepared a Model which can translate English to French with above 80% accuracy. I have used TensorFlow and Keras for this Project.
53. ConvolutionalNeuralNetworks_CNN (Jupyter Notebook, 2 stars): In this repository, you will see various implementations of CNNs in real world problems and gain useful insights from those problems.
54. FaceDetection_and_PrivacyProtection (Jupyter Notebook, 2 stars): In this repository, I used Python and Keras to detect Faces and Blur them for Privacy Protection. I hope you will gain some insights about Face Detection with Python and Keras.
55. GitHub_History.of.ScalaLanguage (Jupyter Notebook, 2 stars): Hey there!! With almost 30k commits and a history spanning over ten years, Scala is a mature programming language. It is a general-purpose programming language that has recently become another prominent language for data scientists. I'm going to read in, clean up, and visualize the real world project repository of Scala.
56. Machine_Learning..Stanford (MATLAB, 2 stars): Hey there!! In this repository I have included all the Programming exercises of Machine Learning offered by Stanford on Coursera. It was taught by Andrew Ng. It has several interesting Projects like Spam Filtering, Handwritten Digits Recognition and many more. You can access them from here!!
57. NeuralNetworks_and_DeepLearning (Jupyter Notebook, 2 stars): In this repository, you will gain insights about Neural Networks and Deep Learning.
58. NEURAL_STYLE_TRANSFER (Jupyter Notebook, 2 stars): Here, I have prepared a Neural Style Transfer Model.
59. MNIST_CNN (Jupyter Notebook, 2 stars): In this repository, I have applied a CNN to the MNIST Digit Classification Dataset. You can get insights about Convolutional Neural Networks and their implementation for Handwritten Digit Classification.
60. TrafficSigns..Classification (Jupyter Notebook, 2 stars): In this Project, I have prepared a Deep Convolutional Neural Network Model which can classify the 43 different classes of Traffic Sign Images with above 90% accuracy. You can get insights about the Implementation of Convolutional Neural Networks in Image Classification and so on. Thank you!!
61. CollaborativeFiltering..MovieRecommendation (Jupyter Notebook, 2 stars): In this repository, I have built a Movie Recommendation System using the Fast.ai approach, which was built on top of PyTorch. You can get insights about the Implementation of Fastai in building a Recommendation System easily with high accuracy.
62. VisualHistory_of_NobelPrizeWinners (Jupyter Notebook, 2 stars): Hey there!! In this Project I have analyzed the Nobel Prize Winners and its dominance by great scientists of the "United States of America" over past decades. I have also figured out the oldest and youngest Nobel Prize Winners from this Dataset. I have done this project in a Jupyter Notebook using the Python Programming Language!!
63. ImagePrediction_with_ResNet50 (Jupyter Notebook, 2 stars): In this repository, I am going to predict images using the pretrained ResNet50 modules. ResNet-50 is a convolutional neural network that is 50 layers deep. ResNet50 is a pretrained version of the network trained on more than a million images from the ImageNet database.
64. SurnameClassification__PyTorch (Jupyter Notebook, 2 stars): I have worked on a Surname Classification Model Inferring Demographic Information, which has applications from Product Recommendations to ensuring fair outcomes for users across different Demographics. I hope you will gain some insights.
65. BlueBook_for_Bulldozers (Jupyter Notebook, 1 star): Here, in this repository, I have solved one of the problems of the Kaggle Competition "Blue Book for Bulldozers".
66. orpo-demo (Python, 1 star): Working on the implementation of ORPO.
67. Cats.vs.Dogs_Classification (Jupyter Notebook, 1 star): In this repository, I have used a Convolutional Neural Network with InceptionV3 trained on ImageNet to classify Cats vs Dogs. It is one of the competitions from Kaggle. Hope you can get insights about the implementation of CNNs.
68. InternetMovieDatabase__NLP (Jupyter Notebook, 1 star): In this repository, I have used the Fast.ai Library to classify IMDb reviews and measure the accuracy of the Model. You can get insights about the implementation of the Fast.ai Library in Natural Language Processing.
69. ASLRecognition_with_DeepLearning (Jupyter Notebook, 1 star): In this notebook, I will train a convolutional neural network to classify images of American Sign Language (ASL) letters. After loading, examining, and preprocessing the data, I will train the network and test its performance.
70. ABTesting_with_CookieCats (Jupyter Notebook, 1 star): In this repository, I have used the Cookie Cats Game to analyze its popularity and tried to answer some common questions like: when does a new player tend to play the game more?
71. MachineLearning_in_WeatherData (Jupyter Notebook, 1 star): In this project, we analyzed the Weather Pattern of Kathmandu, basically focusing on Temperature and Precipitation. We have used Machine Learning, viz. a Random Forest Regressor, and Plotly & Seaborn for Data Visualization.
72. TopicModeling__NLP (Jupyter Notebook, 1 star): In this repository, I have performed Topic Modeling using Singular Value Decomposition and Non-Negative Matrix Factorization along with TF-IDF Count Vectors.
73. Planet..Understanding_Amazon_from_Space (Jupyter Notebook, 1 star): In this repository, I have worked on Image Data from one of the Kaggle Competitions. Hope you will gain insights about Image Data analysis.
74. WordFrequency_using_NLTK (HTML, 1 star): In this repository, I have used NLP to determine: what are the most frequent words in Herman Melville's novel Moby Dick, and how often do they occur?
75. TheBattle_of_Neighborhood..Toronto (Jupyter Notebook, 1 star): In this repository I have performed the Analysis using Python, APIs, basic Machine Learning Algorithms and a Jupyter Notebook to get insights about the Toronto Neighborhoods. Here I have also included the Analyzed Report for better understanding for the reader!!
76. TheAndroidAppMarket_on_GooglePlay (Jupyter Notebook, 1 star): Hey there!! Mobile apps are everywhere. They are easy to create and can be lucrative. Because of these two factors, more and more apps are being developed. In this Project, I have done a comprehensive analysis of the Android app market by comparing over ten thousand apps in Google Play across different categories.
77. cuda-mode (1 star): Making of CUDA kernels.
78. PredictingSales_PandasBasics (Jupyter Notebook, 1 star): In this repository, I have made a clear presentation of Pandas on a Sales Dataset. I have used a real world Dataset to inspect the ground truth of the Data and I have tried to answer the hidden information from the Data.
79. build-GPT (1 star): Building GPT ...
80. PredictingCreditCard_Approval (Jupyter Notebook, 1 star): Hey there!! In this repository I have applied Supervised Machine Learning techniques to predict the Approval and Denial of Credit Card Applications. This repository will give you insights into Supervised Machine Learning Techniques.
81. NIPS_Papers..NLP (Jupyter Notebook, 1 star): The NIPS conference (Neural Information Processing Systems) is one of the most prestigious yearly events in the machine learning community. At each NIPS conference, a large number of research papers are published. In this repository I am going to analyze the hottest topics in Machine Learning using a Jupyter Notebook & Python. I have used Natural Language Processing techniques for the analysis.
82. Rossmann_Store_Sales (Jupyter Notebook, 1 star): In this repository, I have worked out one of the problems of the Kaggle Competition "Rossmann Store Sales". You can get insights about the use of the Fast.ai Library on a Time Series Dataset.
83. Tabular_Data..AustinAnimalCenter (Jupyter Notebook, 1 star): In this repository, I will include Projects or Notebooks which basically focus on the Implementation of Machine Learning and Deep Learning on Tabular Data.