Bimonthly Since 1986
ISSN 1004-9037
Publication Details
Edited by: Editorial Board of Journal of Data Acquisition and Processing
P.O. Box 2704, Beijing 100190, P.R. China
Sponsored by: Institute of Computing Technology, CAS & China Computer Federation
Undertaken by: Institute of Computing Technology, CAS
Published by: SCIENCE PRESS, BEIJING, CHINA
Distributed by:
China: All Local Post Offices
Abstract
Today the term big data enjoys wide recognition. Agencies store huge amounts of data and extract useful information from these databases to discover patterns and interrelationships that the human brain alone cannot grasp. The volume of data generated from divergent structured, semi-structured, and unstructured sources exceeds our ability to process it. The main focus here is on the big data framework Hadoop, the MapReduce environment, and the various related tools that play a vital role in handling huge data volumes. Social networking sites such as Facebook and Twitter produce large volumes of data that will become unmanageable within a few years. To manage these data sets, the proposed method applies various algorithms for processing this huge amount of data. The main purpose of the MapReduce programming model is to process and produce large data sets on clusters, supporting a variety of real-world tasks. It has two main functions, map and reduce; its runtime system performs parallel processing across machines and also handles the other networking jobs. It focuses on the analysis of data, tailored to the user's needs. For this purpose MapReduce performs mapping, combining, partitioning, joining, and reducing, running highly scalable programs over large data sets on different machines. Many MapReduce programs have been executed on Google's data sets every day for many years.
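The map and reduce phases described above can be sketched in a few lines of Python. This is a minimal single-machine illustration of the model, not the Hadoop API: a map function emits (key, value) pairs, the runtime groups ("shuffles") the pairs by key, and a reduce function combines each group. The function names (`map_phase`, `shuffle`, `reduce_phase`, `word_count`) are illustrative assumptions; a real Hadoop job would run these phases in parallel across many machines.

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in the document."""
    for word in document.lower().split():
        yield (word, 1)

def shuffle(pairs):
    """Group intermediate values by key, as the MapReduce runtime does
    between the map and reduce phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: combine all values for one key (here, sum word counts)."""
    return (key, sum(values))

def word_count(documents):
    """Run the three phases in sequence over a list of documents."""
    pairs = (pair for doc in documents for pair in map_phase(doc))
    grouped = shuffle(pairs)
    return dict(reduce_phase(k, v) for k, v in grouped.items())

# Example: word_count(["big data", "big data cluster"])
# yields counts {"big": 2, "data": 2, "cluster": 1}
```

The combining, partitioning, and joining steps mentioned in the abstract are optimizations layered on this same pattern: a combiner pre-reduces map output locally, and a partitioner decides which reducer receives each key.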
Keywords
Big Data, Machine Learning, MapReduce, Virtualization, MapReduce Algorithms, MapReduce Framework.