DATA REDUNDANCY REDUCTION IN LARGE DIMENSIONAL DATASETS USING DEEP LEARNING
Abstract
In this paper, we note that the DBN pretraining procedure is not the only one that allows effective initialization of DNNs. An alternative unsupervised approach that performs equally well is to pretrain DNNs layer by layer, treating each pair of layers as a de-noising auto-encoder regularized by setting a random subset of the inputs to zero. Another alternative is to use contractive auto-encoders for the same purpose, favoring models that are less sensitive to input variations, i.e., penalizing the gradient of the activities of the hidden units with respect to the inputs. Further, the Sparse Encoding Symmetric Machine (SESM) has been developed; its architecture is very similar to that of the RBMs used as building blocks of a DBN, so in principle SESM may also be used to effectively initialize DNN training. Besides unsupervised pretraining, supervised pretraining, sometimes called discriminative pretraining, has also been shown to be effective and, in cases where labeled training data are abundant, performs better than the unsupervised pretraining techniques. The idea of discriminative pretraining is to start from a one-hidden-layer MLP trained with the BP algorithm. Each time a new hidden layer is to be added, the output layer is replaced with a randomly initialized new hidden layer and output layer, and the whole new MLP (or DNN) is trained using the BP algorithm. Unlike the unsupervised pretraining techniques, the discriminative pretraining technique requires labels.
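
As an illustration of the two unsupervised alternatives mentioned above, the following is a minimal sketch of layer-by-layer pretraining with de-noising auto-encoders, with an optional contractive penalty on the Jacobian of the hidden activities with respect to the inputs. The framework (PyTorch) and all function names are our own assumptions for illustration, not taken from the paper:

import torch
import torch.nn as nn

def dae_pretrain_layer(encoder, inputs, noise_prob=0.3, contractive_weight=0.0,
                       epochs=10, lr=1e-3):
    """Pretrain one encoder layer as a de-noising auto-encoder: a random
    subset of the inputs is set to zero and the layer must reconstruct the
    clean input through a throwaway decoder."""
    decoder = nn.Linear(encoder.out_features, encoder.in_features)
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        for x in inputs:
            corrupted = x * (torch.rand_like(x) > noise_prob).float()  # zero a random subset
            h = torch.sigmoid(encoder(corrupted))
            loss = nn.functional.mse_loss(decoder(h), x)
            if contractive_weight > 0:
                # Contractive penalty: squared Frobenius norm of the Jacobian of
                # the hidden activities w.r.t. the inputs; for sigmoid units this
                # has the closed form sum_j (h_j * (1 - h_j))^2 * ||W_j||^2.
                jac = ((h * (1 - h)) ** 2) @ (encoder.weight ** 2).sum(dim=1)
                loss = loss + contractive_weight * jac.mean()
            opt.zero_grad()
            loss.backward()
            opt.step()

def stack_pretrain(layer_dims, inputs):
    """Greedily pretrain a stack of encoder layers; each layer's (detached)
    hidden activities become the training data for the next layer."""
    encoders = []
    for d_in, d_out in zip(layer_dims[:-1], layer_dims[1:]):
        enc = nn.Linear(d_in, d_out)
        dae_pretrain_layer(enc, inputs)
        with torch.no_grad():
            inputs = [torch.sigmoid(enc(x)) for x in inputs]
        encoders.append(enc)
    return encoders  # used to initialize the DNN before supervised fine-tuning

No labels are consumed anywhere above; the reconstruction error alone drives the initialization, which is what makes these procedures unsupervised.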
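The discriminative pretraining procedure can be sketched in the same hypothetical setting: start from a one-hidden-layer MLP trained with BP, and at each growth step replace the output layer with a randomly initialized new hidden layer and output layer, then retrain the whole network. Unlike the sketch above, this one consumes labeled pairs (x, y):

import torch
import torch.nn as nn

def train_bp(model, labeled_data, epochs=5, lr=0.1):
    """Plain back-propagation training with a cross-entropy objective."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in labeled_data:
            loss = nn.functional.cross_entropy(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()

def discriminative_pretrain(in_dim, hidden_dims, n_classes, labeled_data):
    hidden_stack = []  # trained hidden layers are kept across growth steps
    prev = in_dim
    for width in hidden_dims:
        # Discard the old output layer; append a fresh hidden + output layer
        # and retrain the whole new MLP (or DNN) with BP.
        hidden_stack += [nn.Linear(prev, width), nn.Sigmoid()]
        model = nn.Sequential(*hidden_stack, nn.Linear(width, n_classes))
        train_bp(model, labeled_data)
        prev = width
    return model

# Example with random labeled batches (the labels y are what distinguish
# this procedure from the unsupervised variants):
data = [(torch.randn(32, 100), torch.randint(0, 10, (32,))) for _ in range(8)]
dnn = discriminative_pretrain(100, [256, 256, 256], 10, data)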

Authors
M Nagavignesh
Vels Institute of Science, Technology and Advanced Studies, India

Keywords
Deep Networks, Data Reduction, Redundancy, High Dimensional Datasets
Published By :
ICTACT
Published In :
ICTACT Journal on Data Science and Machine Learning
(Volume: 2, Issue: 4, Pages: 219-222)
Date of Publication :
September 2021

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.