
Survey on Performance Analysis of Hadoop ETL for Disaster Management


Affiliations
Department of Information Technology, Madurai Sivakasi Nadars Pioneer Meenakshi Women’s College, Poovanthi, India
     



The term big data refers to data sets whose volume, variety, and velocity make them difficult to capture, manage, process, or analyze. Hadoop can be used to examine this huge amount of data. Hadoop is an open-source software project that enables the distributed processing of large data sets across clusters of commodity servers. ETL tools extract important information from various data sources; various transformations of the data are carried out in the transformation phase, and the data are then loaded into the big data store. HDFS (the Hadoop Distributed File System) is a distributed file system designed to hold very large amounts of data (petabytes or even zettabytes) and to provide high-throughput access to this information. The MapReduce method, which is required to implement big data analysis on top of HDFS, is studied in this paper. The related topics of big data analytics, Hadoop, ETL, and MapReduce are reviewed.
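The MapReduce model described above can be illustrated with a minimal sketch. The example below simulates the three phases of a word-count job (map, shuffle, reduce) in plain Python; the sample records and function names are illustrative only. In a real Hadoop deployment the framework would distribute these phases across cluster nodes reading from HDFS.

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (word, 1) pairs for every word in every input record."""
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Shuffle: group intermediate values by key, as Hadoop does
    between the map and reduce stages."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Hypothetical disaster-management log lines used as input records.
records = ["flood alert issued", "flood warning lifted", "alert cancelled"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
print(counts["flood"])  # 2
print(counts["alert"])  # 2
```

The same map/shuffle/reduce structure scales to petabyte-sized inputs on a cluster, because each phase operates on independent key groups that can be processed in parallel.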


Keywords

Big Data, Hadoop, ETL, Map Reduce, HDFS.

Authors

M. Saranya
Department of Information Technology, Madurai Sivakasi Nadars Pioneer Meenakshi Women’s College, Poovanthi, India


DOI: https://doi.org/10.36039/ciitaas%2F10%2F4%2F2018%2F172808.80-83