Industrial Training Presentation
On
ARTIFICIAL INTELLIGENCE: MACHINE LEARNING & DEEP LEARNING
At
Centre for Advanced Studies (CAS-AKTU)
New Campus
Sec-11, Jankipuram, Vistar Yojna, Lucknow, Uttar Pradesh-226031
TRAINING DURATION: 15/07/2020 to 14/08/2020
PRESENTED BY:
NAME: VARUN SHARMA
ROLL NUMBER: 18DLEE006
7th SEMESTER, EE-B (G2- BATCH)
Electrical Engineering Department, JSS Academy of Technical
Education, Noida, U.P, India
Details of the Industry/ Background of the company.
Theoretical background of the work undertaken.
Subdividing the training program into modules:
Module-1 {Things which have been done in week-1}
Module-2 {Things which have been done in week-2}
Module-3 {Things which have been done in week-3}
Module-4 {Things which have been done in week-4}
Conclusion/ Important learnings from the training period.
Certificate Page
Details of the Research Institute
The Government of India has invested heavily in research establishments in fields such as nuclear energy, defense, space, ocean science, environment, and biotechnology. All these
establishments require quality manpower.
So quality institutions are needed to develop a cadre of manpower of good standard for
carrying out scientific research.
The UP Government issued a notification to start a Centre for Advanced Studies to upgrade
the standards of academics, quality of staff, and research in the state.
Also, to make Dr. APJ Abdul Kalam Technical University an institution of higher learning of
international repute, the Centre for Advanced Studies (CAS-AKTU) is being set up as an
in-campus academic and research institution.
Details of the Research Institute
Centre for Advanced Studies (CAS-AKTU) is a research institute located at
Sec-11, Jankipuram Vistar Yojna, New Campus, Lucknow, Uttar Pradesh-226031.
The institute envisages starting courses and research in areas such as
Mechatronics, Nanotechnology, Biotechnology, 3D Printing, Climate Control,
Market and Business Analytics, Big Data, Energy Studies, and Manufacturing Science
and Technology.
The institute has been functioning since July 2017.
In its first phase, the institute is beginning with two M.Tech. programs in the specialised
areas of Cyber Security and Information and Communication Technologies (ICT).
Theoretical Background
Artificial Intelligence:
Artificial intelligence was born in the 1950s, when scientists from different fields
(mathematics, psychology, engineering, and economics) began to discuss the
possibility of creating an artificial brain.
Further research on AI and neurology found that the brain is an
electrical network of neurons firing pulses, which could be described digitally.
AI reached its peak popularity during the 1980s, when research showed that symbolic AI could solve
well-defined, logical problems such as playing chess, but struggled with more complex, fuzzy
problems such as image classification, speech recognition, and language translation.
A new approach arose to take symbolic AI’s place: Machine Learning.
Through this research it became clear that AI is a general field that encompasses both
machine learning and deep learning.
Theoretical Background
Machine Learning:
Charles Babbage, the inventor of the Analytical Engine (the first known
general-purpose mechanical computer), raised the question of whether general-purpose computers
could be capable of learning and originality at a specified task.
His question opened the door to a new programming paradigm and gave rise to the
term machine learning.
Machine learning started to flourish in the 1990s and has quickly become the most popular and most
successful subfield of AI, thanks to the availability of faster hardware and larger
datasets.
As a result, machine learning tends to deal with large, complex datasets (such as a
dataset of millions of images, each consisting of tens of thousands of pixels)
Theoretical Background
Deep Learning:
The history of deep learning dates back to 1943 when Warren McCulloch and Walter
Pitts created a computer model based on the neural networks of the human brain.
They used a combination of mathematics and algorithms they called threshold logic
to mimic the thought process.
In 1979, Kunihiko Fukushima developed an artificial neural network called the
Neocognitron, which used a multi-layered, hierarchical design.
The Neocognitron allowed the computer to learn to recognize visual patterns.
Further research established that deep learning uses layers of algorithms to process
data, understand human speech, and visually recognize objects.
Module-1 {Week-1}
What is Machine Learning?
It is an application of Artificial Intelligence (AI) that focuses on the development of computer programs,
giving systems the ability to learn and improve automatically from experience.
Supervised learning: we have a dataset which acts as a teacher, and its role is to train
the model or the machine. Once the model is trained, it can start making a prediction
or decision when new data is given to it. Ex: face and speaker verification.
Unsupervised learning: the model learns through observation and finds structures in the data.
Once the model is given a dataset, it automatically finds patterns and relationships; it cannot
say "this is a group of apples" or "this is a group of mangoes", but it will separate all the
apples from the mangoes. Ex: feature selection.
Reinforcement learning: follows a trial-and-error approach. The model makes a prediction and
gets rewarded or penalized with a point for a correct or a wrong answer.
Ex: autonomous vehicles, or playing a game against a human opponent.
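As a rough sketch of the supervised workflow described above (a labelled dataset acting as the teacher, followed by predictions on new data), the snippet below uses scikit-learn with a tiny made-up dataset; the feature values and labels are purely illustrative.

```python
# Minimal supervised-learning sketch (made-up data, for illustration only).
from sklearn.linear_model import LogisticRegression

# Labelled training data: each row is [feature_1, feature_2] with a class label.
X_train = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y_train = [0, 0, 1, 1]

model = LogisticRegression()
model.fit(X_train, y_train)           # the labelled dataset acts as the "teacher"

# Once trained, the model can make a prediction when new data is given to it.
print(model.predict([[0.85, 0.75]]))  # expected: [1]
```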
Classification vs Clustering
Definition: Classification assigns the data to one of several already-defined classes; there are pre-defined labels. Clustering groups data with respect to their similarities; there are no pre-defined labels.
Similarity: Both are used for the categorization of objects into one or more groups based on their features.
Involved in: Classification is part of supervised learning; clustering is part of unsupervised learning.
Need: Classification has labels, so training and testing datasets are needed to verify the model created. Clustering has no need of training and testing datasets.
Complexity: Classification is more complex as compared to clustering; clustering is less complex as compared to classification.
Examples:
Classification: fraud detection. Entities can classify transactions as legitimate or fraudulent using historical data on customer behaviour to detect fraud very accurately.
Clustering: Netflix. It uses clusters to refine its knowledge of viewers' tastes and thus make better decisions in the creation of new original series.
Let's consider the example of the Indian Cricket World Cup squad of 2011.
Each player is characterized by name, weight (kg), and height (cm).
(Fig-1 to Fig-5: plots of the squad data; images not reproduced.)
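The figures themselves are not reproduced here, but the grouping they illustrate can be sketched with k-means clustering on the players' weight and height; the names and measurements below are placeholders, not the actual 2011 squad data.

```python
# Unsupervised clustering sketch for the squad example (placeholder data only).
from sklearn.cluster import KMeans

players = ["Player A", "Player B", "Player C", "Player D", "Player E", "Player F"]
# Each row: [weight in kg, height in cm]; values are illustrative, not real.
measurements = [[60, 165], [62, 168], [75, 180], [78, 183], [90, 172], [88, 170]]

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
groups = kmeans.fit_predict(measurements)

# Players are grouped purely by similarity of weight and height; no labels are used.
for name, group in zip(players, groups):
    print(name, "-> cluster", group)
```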
Application Of Machine Learning In Real World
1). Traffic Alerts (location, average speed, prediction, learning algorithms)
2). Image Recognition (digital images, videos, pattern recognition, phone unlock using facial nodal points)
3). Video Surveillance (accuracy, object detection)
4). Online Video Streaming Applications (day, time and type of content, browsing pattern, many more activities)
Module-2 {Week-2}
INTRODUCTION TO DEEP-LEARNING
AI - means getting a computer to mimic human behavior in some way.
Machine learning - is a subset of AI, and it consists of the techniques that enable
computers to figure things out from the data and deliver AI applications.
Deep learning - meanwhile, is a subset of machine learning that enables computers to
solve more complex problems.
What is Deep Learning?
• Deep learning is a type of machine learning
that mimics the neurons of the neural
networks present in the human brain.
• Deep learning algorithms learn
progressively about the input data as it passes
through each neural network layer.
• If the system is provided with tons of
information, deep learning begins to
understand it and respond in useful ways.
Why Use Deep Learning Now?
• As we know, deep learning works on the basis of the structure and functions of the human brain.
• The human brain has a neural network of interconnected neurons which process
information and transmit signals among each other.
1. Big Data
a. Larger Datasets
b. Easier Collection and storage
2. Hardware
a. Graphics Processing Units (GPUs)
b. Massively parallel processing (a single computer with many networked processors)
3. Software
a. Improved Techniques
b. New Models
c. Toolboxes
Deep learning libraries and tools: TensorFlow, PyTorch, MATLAB, Google Colab
• The human brain has a neural network of interconnected neurons which process
information and transmit signals among each other.
Neural Networks: the Core of Deep Learning
• Based on this idea, Geoffrey Hinton, the father of Deep Learning, built an Artificial Neural
Network comprising artificial neurons capable of performing operations and processing
information.
• An Artificial Neural Network commonly has 3 layers.
Input Layer: This layer accepts various inputs in the form of text, numbers, image pixels, or audio.
Hidden Layer: Hidden layers are used to perform mathematical operations, data manipulation,
and feature extraction.
Output Layer: The output layer produces the final desired output.
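As a concrete illustration of these three layers, here is a minimal sketch of such a network in Keras (one of the TensorFlow-based tools listed earlier); the layer sizes and activations below are arbitrary choices for illustration, not part of the training material.

```python
# Minimal three-layer network sketch in Keras (arbitrary sizes, illustration only).
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),                      # input layer: 4 input features
    tf.keras.layers.Dense(8, activation="relu"),     # hidden layer: math operations / feature extraction
    tf.keras.layers.Dense(1, activation="sigmoid"),  # output layer: final desired output
])
model.summary()  # prints the layer structure and parameter counts
```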
Module-3 {Week-3}
But how does an Artificial Neural Network(ANN) actually learn?
• An ANN (Artificial Neural Network) can be created from three layers of “neurons”: the input layer,
the hidden layer, and the output layer.
• Information flows from the input layer, through the hidden layer, to the output layer, and then out.
Simple Artificial Neural Network
Learning of ANNs (Artificial Neural Networks)
• Each of the connections has a number associated with it called the connection weight, and
each of the neurons has a number and a special formula associated with it, called a threshold
value and an activation function respectively.
• The neurons are provided with a set of inputs, which pass through the neurons on each of the
layers of the network.
• Each neuron transforms the input in some way and forwards it to the next layer, and so on.
• The result received at the output layer helps the network learn to adjust its weights and
threshold values (peak or maximum limits), to arrive at the correct output and produce outputs
as close as possible to the expected ones.
• This process is repeated a (very high) number of times until the produced and expected
outputs are as close as possible.
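To give a rough feel for this repeated adjustment of weights, here is a toy sketch of a single artificial neuron trained by gradient descent in NumPy; the data is synthetic and the setup is greatly simplified compared with how real networks are trained.

```python
# Toy sketch: one neuron learns to output 1 when its input exceeds 0.5.
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(100)                 # 100 random inputs in [0, 1)
y = (x > 0.5).astype(float)         # expected outputs

w, b = 0.0, 0.0                     # weight and bias start untrained
lr = 0.5                            # learning rate

for step in range(2000):            # repeated a (very high) number of times
    pred = 1.0 / (1.0 + np.exp(-(w * x + b)))   # sigmoid activation
    grad = (pred - y) * pred * (1.0 - pred)     # gradient of squared error w.r.t. pre-activation
    w -= lr * np.mean(grad * x)                 # adjust the weight...
    b -= lr * np.mean(grad)                     # ...and the bias toward the expected outputs

pred = 1.0 / (1.0 + np.exp(-(w * x + b)))       # predictions after training
print(float(np.mean((pred > 0.5) == y)))        # accuracy should be close to 1.0
```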
What is an activation function and why to use them?
Definition of activation function:-
• An activation function decides whether a neuron should be activated or not. The purpose of the
activation function is to introduce non-linearity into the output of a neuron.
Why do we need non-linear activation functions:-
• A neural network without an activation function is essentially just a linear regression model.
The activation function performs the non-linear transformation on the input that makes the network
capable of learning and performing more complex tasks.
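This point can be checked directly: without a non-linear activation, stacking two layers is mathematically equivalent to a single linear layer. The NumPy sketch below, with arbitrary made-up weights, is only meant to illustrate that idea.

```python
# Without a non-linear activation, two stacked linear layers collapse into one.
import numpy as np

x = np.array([1.0, 2.0])
W1 = np.array([[0.5, -0.3], [0.8, 0.1]])   # first-layer weights (arbitrary)
W2 = np.array([[1.0, 2.0], [-1.0, 0.5]])   # second-layer weights (arbitrary)

two_layers = W2 @ (W1 @ x)                 # layer 2 applied after layer 1
one_layer = (W2 @ W1) @ x                  # a single linear layer with combined weights
print(np.allclose(two_layers, one_layer))  # True: the network is still just linear

# Inserting a non-linear activation (here ReLU) between the layers breaks this
# equivalence, which is what lets the network learn more complex mappings.
relu = lambda v: np.maximum(v, 0.0)
print(W2 @ relu(W1 @ x))
```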
Example Of Activation Function
• Consider the graph in Fig-1, which consists of plotted red and green points; the task is to separate
the green points from the red points.
• Fig-2 shows a linear decision boundary, with the points still mixed.
• Fig-3 shows how an activation function helps in separating the points.
(Fig-1, Fig-2, Fig-3: images not reproduced.)
Module-4 {Week-4}
Why are neural networks important?
• Neural networks are also ideally suited to help people solve complex problems in real-life
situations.
• They reveal hidden relationships, patterns, and predictions, and can also predict rare events (such as
fraud).
• As a result, neural networks can improve decision processes in areas such as:
credit card and Medicare fraud detection, medical and disease diagnosis, ecosystem evaluation,
robotic control systems, and electrical load and energy demand forecasting.
AI vs ML vs DL
AI: e.g. video games, email spam filters
ML: e.g. speech recognition, self-driving cars
DL: e.g. virtual assistants, translations
Conclusion
• Artificial intelligence and its technology are one side of life that always interests and
surprises us with new ideas, topics, innovations, products, etc.
• If AI were to develop to the point that it can do everything better than humans, it would
mean that it would also do better in science and technology.
• AI has a big involvement in finance, national security, health care, transportation, and
smart cities.
• We can make these systems learn advanced topics and use them to solve classical problems.
• A major thrust of AI is in the development of computer functions associated with human
intelligence, such as reasoning, learning, and problem solving.
• Artificial intelligence is impacting the future of virtually every industry and every human
being. Artificial intelligence has acted as the main driver of emerging technologies like big data,
robotics and IoT, and it will continue to act as a technological innovator for the foreseeable
future.
Certificate Page
THANK YOU
