SpaCy – Introduction for NLP | Combining NLP Models and Custom Rules

Combining NLP Models and Creation of Custom Rules using spaCy. Objective: In this article, we are going to create some custom rules for our requirements and add them to our pipeline, such as expanding named entities and identifying a person’s organization name from a given text. For example, the Read more…
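The full article builds these rules step by step; as a rough sketch of the idea (not the article's exact code), here is how a custom entity pattern can be attached to a spaCy pipeline with the EntityRuler, assuming spaCy 3.x and the en_core_web_sm model. The organization name is just an illustrative example.

```python
# Minimal sketch: add a rule-based entity on top of the statistical NER.
import spacy

nlp = spacy.load("en_core_web_sm")

# Insert an EntityRuler before the NER so our patterns take priority.
ruler = nlp.add_pipe("entity_ruler", before="ner")
ruler.add_patterns([
    {"label": "ORG", "pattern": "KGP Talkie"},   # hypothetical organization name
])

doc = nlp("Alice joined KGP Talkie last year.")
print([(ent.text, ent.label_) for ent in doc.ents])
# e.g. [('Alice', 'PERSON'), ('KGP Talkie', 'ORG'), ('last year', 'DATE')]
```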

Word Embeddings using GloVe Vectors

NLP Tutorial – GloVe Vectors Embedding with TF2.0 and Keras GloVe stands for global vectors for word representation. It is an unsupervised learning algorithm developed by Stanford for generating word embeddings by aggregating a global word-word co-occurrence matrix from a corpus. The resulting embeddings show interesting linear substructures of the word in Read more…
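As a quick taste of the approach covered in the tutorial, the sketch below loads pre-trained GloVe vectors into a frozen Keras Embedding layer. The GloVe file name, vocabulary, and dimensions are illustrative assumptions; the file must be downloaded separately from the Stanford GloVe page.

```python
# Minimal sketch: build an embedding matrix from GloVe and wrap it in Keras.
import numpy as np
from tensorflow.keras.layers import Embedding
from tensorflow.keras.initializers import Constant

embedding_dim = 100
word_index = {"movie": 1, "great": 2}      # hypothetical tokenizer vocabulary

# Parse the GloVe file: each line is a word followed by its vector.
embeddings_index = {}
with open("glove.6B.100d.txt", encoding="utf-8") as f:
    for line in f:
        values = line.split()
        embeddings_index[values[0]] = np.asarray(values[1:], dtype="float32")

# Fill the embedding matrix row by row (row 0 is reserved for padding).
embedding_matrix = np.zeros((len(word_index) + 1, embedding_dim))
for word, i in word_index.items():
    vector = embeddings_index.get(word)
    if vector is not None:
        embedding_matrix[i] = vector

embedding_layer = Embedding(
    input_dim=len(word_index) + 1,
    output_dim=embedding_dim,
    embeddings_initializer=Constant(embedding_matrix),
    trainable=False,          # keep the GloVe weights frozen
)
```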

Multi-step Time Series Prediction using RNN LSTM

Household Power Consumption Prediction using RNN-LSTM Power outages cause huge losses to the economy, so it is very important to predict power consumption. Given the rise of smart electricity meters and the wide adoption of electricity generation technology like solar panels, there is a wealth of Read more…
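For readers who want the shape of the model before opening the article, here is a minimal sketch of a multi-step LSTM forecaster in tf.keras. The window length (24 readings in) and horizon (7 steps out) are illustrative assumptions, and dummy data stands in for the household power consumption series.

```python
# Minimal sketch: predict several future steps at once from a fixed window.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

n_in, n_out = 24, 7

model = Sequential([
    LSTM(64, input_shape=(n_in, 1)),
    Dense(n_out),                      # all future steps predicted in one shot
])
model.compile(optimizer="adam", loss="mse")

# Dummy data in place of the real consumption series.
X = np.random.rand(100, n_in, 1)
y = np.random.rand(100, n_out)
model.fit(X, y, epochs=2, verbose=0)
print(model.predict(X[:1]).shape)      # (1, 7): seven steps ahead
```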

SpaCy Introduction for NLP | Linguistic Feature Extraction

Getting Started with spaCy This tutorial is a crisp and effective introduction to spaCy and the various NLP linguistic features it offers. We will perform several NLP-related tasks, such as tokenization, part-of-speech tagging, named entity recognition, dependency parsing, and visualization using displaCy. spaCy is a free, open-source library for advanced Natural Read more…
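The snippet below is a minimal sketch of the features the tutorial walks through, assuming the en_core_web_sm model is installed. The example sentence is illustrative.

```python
# Minimal sketch: tokens, POS tags, dependencies, entities, and displaCy.
import spacy
from spacy import displacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

for token in doc:
    print(token.text, token.pos_, token.dep_, token.head.text)

for ent in doc.ents:
    print(ent.text, ent.label_)

# Uncomment to render the dependency parse in a browser or notebook.
# displacy.serve(doc, style="dep")
```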

Deep Learning with TensorFlow 2.0 Tutorial – Building Your First ANN with TensorFlow 2.0

Deep Learning with TensorFlow # pip install tensorflow==2.0.0-rc0 # pip install tensorflow-gpu==2.0.0-rc0 Watch Full Lesson Here: Objective Our objective for this code is to build an artificial neural network for a classification problem using the TensorFlow and Keras libraries. We will try to learn how to build a neural network model Read more…
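As a quick preview of what the lesson builds, here is a minimal sketch of a small classification ANN in tf.keras. The layer sizes and the synthetic data are illustrative assumptions, not the lesson's dataset.

```python
# Minimal sketch: a dense feed-forward network for binary classification.
import numpy as np
import tensorflow as tf

X = np.random.rand(500, 10)                 # 10 toy input features
y = (X.sum(axis=1) > 5).astype(int)         # toy binary target

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))      # [loss, accuracy]
```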

Complete Seaborn Python Tutorial for Data Visualization in Python

Visualizing statistical relationships Seaborn is a Python data visualization library based on matplotlib. It provides a high-level interface for drawing attractive and informative statistical graphics. Statistical analysis is a process of understanding how variables in a dataset relate to each other and how those relationships depend on other variables. Visualization Read more…
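A minimal sketch of the kind of relational plot the tutorial starts with, using seaborn's built-in "tips" dataset (downloaded by load_dataset); the column choices are illustrative.

```python
# Minimal sketch: a scatter plot of one statistical relationship with seaborn.
import seaborn as sns
import matplotlib.pyplot as plt

sns.set()
tips = sns.load_dataset("tips")

# Tip amount vs. total bill, colored by smoker status.
sns.relplot(x="total_bill", y="tip", hue="smoker", data=tips)
plt.show()
```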

Sentiment Analysis Using Scikit-learn

Sentiment Analysis Objective In this notebook we are going to perform a binary classification, i.e. we will classify the sentiment as positive or negative according to the ‘Reviews’ column data of the IMDB dataset. We will use TF-IDF for text data vectorization and a Linear Support Vector Machine for classification. Natural Read more…
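The core of that approach fits in a few lines of scikit-learn; the sketch below uses two toy reviews standing in for the IMDB ‘Reviews’ column.

```python
# Minimal sketch: TF-IDF vectorization + Linear SVM in one pipeline.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

reviews = ["A wonderful, moving film.", "Dull plot and terrible acting."]
labels = [1, 0]                      # 1 = positive, 0 = negative

clf = Pipeline([
    ("tfidf", TfidfVectorizer()),    # text -> TF-IDF vectors
    ("svm", LinearSVC()),            # linear support vector classifier
])
clf.fit(reviews, labels)
print(clf.predict(["What a great movie!"]))
```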

Multi-Label Text Classification on Stack Overflow Tag Prediction

Multi-Label Text Classification In this notebook, we will use the “StackSample: 10% of Stack Overflow Q&A” dataset, specifically its questions and tags data. We will be developing a text classification model that analyzes the text of a question and predicts the multiple labels (tags) associated with it. We will implement a Read more…
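A minimal sketch of the multi-label setup: binarize the tag lists and wrap a linear classifier in a one-vs-rest strategy. The questions and tags here are toy examples standing in for the StackSample data.

```python
# Minimal sketch: multi-label tag prediction with scikit-learn.
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

questions = ["How do I parse JSON in Python?",
             "Centering a div with CSS flexbox"]
tags = [["python", "json"], ["css", "html"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(tags)              # one binary column per tag

clf = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("ovr", OneVsRestClassifier(LogisticRegression())),
])
clf.fit(questions, Y)
print(mlb.inverse_transform(clf.predict(["flexbox div centering"])))
```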

IMDB Review Sentiment Classification using RNN LSTM

Sentiment Classification in Python In this notebook we are going to implement an LSTM model to perform classification of reviews. We are going to perform binary classification, i.e. we will classify the reviews as positive or negative according to the sentiment. Recurrent Neural Network Neural networks are a set of algorithms Read more…
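A minimal sketch of such a classifier on the IMDB dataset that ships with tf.keras; the vocabulary size, sequence length, and layer sizes are illustrative choices, not the notebook's exact settings.

```python
# Minimal sketch: Embedding -> LSTM -> sigmoid for IMDB review sentiment.
from tensorflow.keras.datasets import imdb
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size, max_len = 10000, 200
(X_train, y_train), (X_test, y_test) = imdb.load_data(num_words=vocab_size)
X_train = pad_sequences(X_train, maxlen=max_len)
X_test = pad_sequences(X_test, maxlen=max_len)

model = Sequential([
    Embedding(vocab_size, 32, input_length=max_len),
    LSTM(64),
    Dense(1, activation="sigmoid"),      # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=2, batch_size=128,
          validation_data=(X_test, y_test))
```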

Feature Selection Based on Univariate ROC_AUC for Classification and MSE for Regression | Machine Learning | KGP talkie

Feature Selection Based on Univariate ROC_AUC for Classification and MSE for Regression Watch Full Playlist: https://www.youtube.com/playlist?list=PLc2rvfiptPSQYzmDIFuq2PqN2n28ZjxDH What is ROC_AUC The Receiver Operating Characteristic (ROC) curve is well known for evaluating classification performance. Owing to its superiority in dealing with imbalanced and cost-sensitive data, the ROC curve has been exploited as a Read more…
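The univariate ROC_AUC idea boils down to scoring each feature on its own: fit a tiny model on one column at a time and rank the columns by AUC. The sketch below uses synthetic data and a shallow decision tree as the per-feature model, purely for illustration.

```python
# Minimal sketch: rank features by the ROC-AUC of a one-feature model.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X = pd.DataFrame(np.random.rand(300, 5), columns=[f"f{i}" for i in range(5)])
y = (X["f0"] + 0.1 * np.random.rand(300) > 0.5).astype(int)   # f0 is informative

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scores = {}
for col in X_train.columns:
    clf = DecisionTreeClassifier(max_depth=2, random_state=0)
    clf.fit(X_train[[col]], y_train)
    proba = clf.predict_proba(X_test[[col]])[:, 1]
    scores[col] = roc_auc_score(y_test, proba)

# Features with ROC-AUC near 0.5 carry little information on their own.
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```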

Feature Selection using Fisher Score and Chi2 (χ2) Test | Titanic Dataset | Machine Learning | KGP Talkie

Feature Selection using Fisher Score and Chi2 (χ2) Test Watch Full Playlist: https://www.youtube.com/playlist?list=PLc2rvfiptPSQYzmDIFuq2PqN2n28ZjxDH What is Fisher Score and Chi2 (χ2) Test Fisher score is one of the most widely used supervised feature selection methods. However, it selects each feature independently according to its score under the Fisher criterion, which Read more…
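In scikit-learn the chi-square part of this test is a one-liner with SelectKBest; the sketch below runs it on synthetic, non-negative features standing in for the Titanic columns.

```python
# Minimal sketch: keep the k features with the highest chi2 scores.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

# chi2 requires non-negative features (e.g. counts or one-hot encodings).
X = np.random.randint(0, 3, size=(200, 4))
y = np.random.randint(0, 2, size=200)

selector = SelectKBest(score_func=chi2, k=2)
X_selected = selector.fit_transform(X, y)

print("chi2 scores:", selector.scores_)
print("p-values:   ", selector.pvalues_)
print("kept shape: ", X_selected.shape)    # (200, 2)
```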

Feature Selection Based on Univariate (ANOVA) Test for Classification | Machine Learning | KGP Talkie

Feature Selection Based on Univariate (ANOVA) Test for Classification Watch Full Playlist: https://www.youtube.com/playlist?list=PLc2rvfiptPSQYzmDIFuq2PqN2n28ZjxDH What is Univariate (ANOVA) Test The elimination process aims to reduce the size of the input feature set and at the same time to retain the class discriminatory information for classification problems. An F-test is any statistical Read more…
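The ANOVA F-test version of this elimination step is available in scikit-learn as f_classif; the sketch below runs it on synthetic data for illustration.

```python
# Minimal sketch: univariate ANOVA (F-test) feature selection.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

X = np.random.rand(200, 6)
y = np.random.randint(0, 2, size=200)

selector = SelectKBest(score_func=f_classif, k=3)
X_new = selector.fit_transform(X, y)

# Higher F-scores (lower p-values) indicate stronger class discrimination.
print("F-scores:", selector.scores_)
print("selected columns:", selector.get_support(indices=True))
```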

Feature Selection Based on Mutual Information (Entropy) Gain for Classification and Regression | Machine Learning | KGP Talkie

Feature Selection Based on Mutual Information (Entropy) Gain Watch Full Playlist: https://www.youtube.com/playlist?list=PLc2rvfiptPSQYzmDIFuq2PqN2n28ZjxDH What is Mutual Information The elimination process aims to reduce the size of the input feature set and at the same time to retain the class discriminatory information for classification problems. Mutual information (MI) is a measure of Read more…
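Mutual information between each feature and the target can likewise be estimated with scikit-learn (mutual_info_classif for classification, mutual_info_regression for regression); the data below is synthetic.

```python
# Minimal sketch: mutual-information-based feature selection.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X = np.random.rand(200, 6)
y = np.random.randint(0, 2, size=200)

mi = mutual_info_classif(X, y)
print("MI per feature:", mi)      # 0 means the feature is independent of y

selector = SelectKBest(score_func=mutual_info_classif, k=3)
X_new = selector.fit_transform(X, y)
print("kept shape:", X_new.shape)
```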

Feature Selection with Filtering Method | Constant, Quasi Constant and Duplicate Feature Removal

Filtering method Watch Full Playlist: https://www.youtube.com/playlist?list=PLc2rvfiptPSQYzmDIFuq2PqN2n28ZjxDH Unnecessary and redundant features not only slow down the training time of an algorithm, but they also affect its performance. There are several advantages of performing feature selection before training machine learning models: models with fewer features have higher Read more…
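A minimal sketch of the three filtering steps on a synthetic DataFrame: drop constant and quasi-constant columns with VarianceThreshold, then drop duplicated columns by de-duplicating the transposed frame.

```python
# Minimal sketch: constant, quasi-constant, and duplicate feature removal.
import numpy as np
import pandas as pd
from sklearn.feature_selection import VarianceThreshold

rng = np.random.RandomState(0)
useful = rng.rand(100)
df = pd.DataFrame({
    "constant":  np.zeros(100),
    "quasi":     np.r_[np.zeros(99), [1]],   # 99% identical values
    "useful":    useful,
    "duplicate": useful,                     # exact copy of "useful"
})

# Constant / quasi-constant removal: variance below a small threshold.
vt = VarianceThreshold(threshold=0.01)
vt.fit(df)
df = df[df.columns[vt.get_support()]]

# Duplicate removal: identical columns become identical rows after transpose.
df = df.T.drop_duplicates().T
print(df.columns.tolist())                   # only "useful" survives
```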