Neural networks are a set of algorithms, loosely modeled on the human brain, that are designed to recognize patterns. They interpret sensory data through a kind of machine perception, labelling or clustering raw input. The patterns they recognize are numerical, contained in vectors, into which all real-world data (images, sound, text, or time series) must be translated. Artificial neural networks are composed of a large number of highly interconnected processing elements (neurons) working together to solve a problem.
An ANN usually involves a large number of processors operating in parallel and arranged in tiers. The first tier receives the raw input information —…
We have all been through a bizarre time over the last one and a half years, thanks to COVID-19. When the second wave of the virus hit us like a storm, India saw one of its worst phases of all time. It took a toll on all of us, both physically and emotionally.
With thousands of people posting every day asking for help, and only a handful able to respond with correct sources, we were upset to see that there was no common platform that everyone was aware of and where verified information could be posted and kept up to date.
The second wave resulted…
SQL stands for Structured Query Language and is used to access and manipulate databases.
RDBMS stands for Relational Database Management System. It is the basis for SQL, and for database systems such as MS SQL Server, Oracle, MySQL, and Microsoft Access.
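As a quick, hedged illustration of what accessing and manipulating a database with SQL looks like, here is a sketch using Python's built-in sqlite3 module; the users table and its columns are invented purely for this example:

```python
import sqlite3

# In-memory SQLite database; the "users" table and its columns
# are made up for this illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("INSERT INTO users (name) VALUES (?)", ("Alice",))
conn.commit()

for row in cur.execute("SELECT id, name FROM users"):
    print(row)  # (1, 'Alice')

conn.close()
```

The same CREATE, INSERT, and SELECT statements work (with minor dialect differences) on any of the RDBMS products listed above.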
MongoDB is a feature-rich, open-source, document-oriented database and one of the most widely recognised NoSQL databases. It is written in the C++ programming language.
Today, I want to share some of the basics of MongoDB concepts and commands.
A database is a physical container for collections. Each database gets its own set of files on the file system. A single MongoDB server typically hosts multiple databases.
A collection is a group of documents and is similar to an RDBMS table. A collection exists within a single database. Collections do not enforce a schema, so documents within a collection can have different fields.
A document is…
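To make these concepts concrete, here is a small, hedged sketch using the pymongo driver. It assumes a MongoDB server running on localhost:27017, and the database, collection, and field names are invented:

```python
from pymongo import MongoClient

# Assumes a local MongoDB server; "blogdb" and "posts" are example names.
client = MongoClient("localhost", 27017)
db = client["blogdb"]     # database: physical container for collections
posts = db["posts"]       # collection: group of documents, no fixed schema

# Documents in the same collection may have different fields.
posts.insert_one({"title": "Hello", "tags": ["mongodb"]})
posts.insert_one({"title": "Second", "views": 42})

for doc in posts.find({}):
    print(doc)

client.close()
```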
Learning can be defined as acquiring knowledge or skills through experience, study, or by being taught. So, machine learning can be defined as a machine acquiring knowledge or skills on its own, from data, without being explicitly programmed.
Wikipedia defines deep learning as:
Deep learning is a class of machine learning algorithms that uses multiple layers to progressively extract higher level features from the raw input. For example, in image processing, lower layers may identify edges, while higher layers may identify the concepts relevant to a human such as digits or letters or faces.
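As a rough sketch of that layered idea (not part of the definition itself), here is a tiny Keras model; the 784-dimensional input, the layer widths, and the 10-class output are arbitrary assumptions for illustration:

```python
from tensorflow import keras

# Each Dense layer transforms the previous layer's output into
# higher-level features; sizes here are arbitrary choices.
model = keras.Sequential([
    keras.Input(shape=(784,)),                    # raw input, e.g. a flattened image
    keras.layers.Dense(128, activation="relu"),   # lower-level features
    keras.layers.Dense(64, activation="relu"),    # higher-level features
    keras.layers.Dense(10, activation="softmax"), # e.g. ten digit classes
])
model.summary()
```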
Have you ever wondered how Facebook stores and serves thousands of petabytes of user content such as photos, videos, and likes? Have you ever been intrigued by how you can upload a photo or video to Instagram and your followers can view it instantly in their feeds? The answer to these questions is distributed systems!
So now you are wondering: what exactly is a distributed system? You must have heard the term many times in MOOC videos, or read it in articles or books, without quite knowing what it means.
Wikipedia defines distributed systems…
Clustering is an unsupervised learning technique used to group similar objects together. In clustering, we first partition the data into groups based on similarity and then assign labels to those groups. Clustering also helps us find useful features that distinguish one group from another.
The most common categories of clustering are:
The partitioning method divides a set of n objects into groups based on their features and…
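The best-known partitioning method is k-means. As a hedged sketch, here is how it looks with scikit-learn; the toy points and the choice of two clusters are assumptions made purely for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans

# Four toy 2-D points forming two obvious groups; k=2 is an assumption.
X = np.array([[1.0, 1.1], [0.9, 1.0], [8.0, 8.2], [8.1, 7.9]])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)           # group label assigned to each point
print(kmeans.cluster_centers_)  # centroid of each group
```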
Hadoop? The heartbeat of big data? Yeah, you read that right.
Hadoop is an open-source framework, written in Java, that is used to store and process big data in a distributed environment using simple programming models.
Before diving into Hadoop, let's discuss what big data actually is.
Big data refers to collections of datasets so large that they cannot be processed using traditional computing techniques. But what is the reason for such large volumes of data? …
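As for the "simple programming models" Hadoop builds on, here is a hedged, single-machine sketch of the MapReduce word-count idea. The sample lines are made up, and a real Hadoop job would distribute the map and reduce steps across a cluster over HDFS rather than run them in one process:

```python
from collections import Counter

# Map step: emit (word, 1) pairs from each input line.
lines = ["big data needs big clusters", "hadoop stores big data"]
pairs = [(word, 1) for line in lines for word in line.split()]

# Reduce step: sum the counts per word, as the reducer would
# after Hadoop's shuffle phase groups pairs by key.
counts = Counter()
for word, one in pairs:
    counts[word] += one

print(counts)  # Counter({'big': 3, 'data': 2, ...})
```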
The central limit theorem states that, for a dataset with an unknown distribution, the distribution of sample means will approximate a normal distribution.
In other words, as the sample size increases, the distribution of the mean across repeated samples approaches a Gaussian distribution. For the theorem to hold, though, the samples must be sufficiently large: the distribution of sample means, calculated from repeated sampling, tends toward normality as the size of those samples grows.
To understand this theorem more clearly, let’s cover the basics first. …
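A quick simulation also makes the theorem visible. This is a hedged sketch: the exponential population, the sample sizes, and the repetition count are all arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# A clearly non-normal population: exponential with mean 2 and std 2.
population = rng.exponential(scale=2.0, size=100_000)

for n in (2, 30, 200):
    # Draw 2,000 samples of size n and look at the distribution of means.
    means = rng.choice(population, size=(2_000, n)).mean(axis=1)
    print(f"n={n:>3}  mean of means={means.mean():.3f}  std of means={means.std():.3f}")
```

The spread of the sample means shrinks roughly as the population standard deviation divided by √n, and their histogram looks increasingly Gaussian as n grows, just as the theorem predicts.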
Neural networks are the essence of deep learning. They are multi-layer networks of neurons that we use to classify things, make predictions, and so on. There are three parts to any neural network: an input layer, one or more hidden layers, and an output layer.
The arrows connecting the neurons show how they are interconnected and how data travels from the input layer all the way through to the output layer.
Every neuron in a layer takes its inputs, multiplies them by some weights, adds a bias, applies an activation function, and passes the result on to the next…
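Here is a minimal NumPy sketch of that single-neuron step; the input values, weights, bias, and the choice of a sigmoid activation are all made-up assumptions for illustration:

```python
import numpy as np

# One neuron's computation: weighted sum of inputs plus a bias,
# passed through an activation function.
x = np.array([0.5, -1.2, 3.0])   # inputs from the previous layer
w = np.array([0.4, 0.1, -0.6])   # weights
b = 0.2                          # bias

z = np.dot(w, x) + b             # weighted sum + bias
a = 1.0 / (1.0 + np.exp(-z))     # sigmoid activation
print(a)                         # ≈ 0.18, the value passed to the next layer
```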