

Hadoop: what to expect in the near future?

CyberVision has analyzed expert predictions for the future of Hadoop, and below we outline three trends that are likely to be hot in the Big Data world going forward.


An IDC study of over 200 IT professionals reveals that 32% of companies polled have existing Apache Hadoop deployments, 31% indicated having plans to deploy it within 12 months and 36% said their Hadoop deployment schedule would go beyond 12 months. Of those organizations that already use Hadoop, nearly 39% reported using Hadoop for service innovation including analysis of secondary data for modeling of “if-then scenarios” for products and services [source: RedHat.Com, 2013].

In 2013, Hadoop took a great leap forward with Hadoop 2.0, which converted Hadoop from a batch-only, file-based stack into a set of interactive capabilities with multiple optional databases.

But as the Hadoop technology embarks on its second stage of evolution, how will it influence the Big Data world going forward?

We’ve analyzed a variety of Big Data expert opinions and forecasts, and we’ve compiled a list of trends that are likely to dominate in the Hadoop world in the months to come.

Hadoop for sensor data analytics

Hadoop is expected to drive business efficiency by slicing, dicing and analyzing sensor data coming from the Internet of Things and providing correlations to give business owners a better understanding of what is happening with their processes and tools.

In today’s world, almost everything that can be tracked has a sensor attached (e.g. for temperature, speed, or location). Considering the rapid evolution of wearable technology and the growth of the smart products market (e.g. smart watches, cars, and houses), Hadoop is likely to become one of the best tools for monitoring the moment-to-moment status of performance and operations based on the analysis of massive amounts of historical data, helping businesses increase operational efficiency, minimize downtime, and cut costs. As a result, in the near future Hadoop has a good chance of becoming an attractive candidate for sensor data storage solutions.
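The slicing and aggregation described above is essentially Hadoop's MapReduce pattern applied to sensor streams. Here is a minimal, runnable sketch of that pattern in plain Python; the sensor IDs, metrics, and readings are hypothetical, and a real deployment would run the same map and reduce logic across an HDFS-scale dataset rather than an in-memory list.

```python
from collections import defaultdict

# Hypothetical sensor readings: (sensor_id, metric, value) tuples, the kind
# of stream an IoT-facing Hadoop job would ingest.
readings = [
    ("pump-1", "temperature", 71.0),
    ("pump-1", "temperature", 75.0),
    ("pump-2", "temperature", 68.0),
    ("pump-2", "temperature", 90.0),
]

def map_phase(records):
    # Emit (key, value) pairs keyed by sensor and metric,
    # as a MapReduce mapper would.
    for sensor_id, metric, value in records:
        yield (sensor_id, metric), value

def reduce_phase(pairs):
    # Group values by key and compute a per-sensor average,
    # standing in for the reducer.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

averages = reduce_phase(map_phase(readings))
print(averages[("pump-1", "temperature")])  # 73.0
```

Historical averages like these are what a business would compare live readings against to spot anomalies before they become downtime.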

More SQL-on-Hadoop initiatives

Innovative Hadoop platforms such as Pivotal HD (by Greenplum), Impala (by Cloudera), and Presto (by Facebook) are revamping the Hadoop technology to operate more like a relational database, enabling users to rapidly ask data-related questions using SQL (until recently, most databases within the Hadoop ecosystem, such as MongoDB, Cassandra, and HBase, have been non-relational, NoSQL-based). These platforms let you query the same data interactively, i.e. in seconds [source: Gigaom.Com, 2013].
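To make the interactive-SQL idea concrete, here is a runnable sketch using Python's built-in sqlite3 purely as a local stand-in: engines like Impala or Presto accept similar ANSI-style SQL, but against tables backed by HDFS-scale data rather than an in-memory database. The table and column names are invented for illustration.

```python
import sqlite3

# sqlite3 plays the role of an SQL-on-Hadoop engine so the query runs locally.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_views (country TEXT, views INTEGER)")
conn.executemany(
    "INSERT INTO page_views VALUES (?, ?)",
    [("US", 120), ("US", 80), ("DE", 50)],
)

# The ad-hoc, seconds-scale aggregation pattern the article describes:
rows = conn.execute(
    "SELECT country, SUM(views) AS total "
    "FROM page_views GROUP BY country ORDER BY total DESC"
).fetchall()
print(rows)  # [('US', 200), ('DE', 50)]
```

The point of SQL-on-Hadoop is that analysts can issue exactly this kind of familiar query without writing MapReduce jobs or waiting on batch turnaround.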

David Menninger, Head of Business Development and Strategy at Pivotal Initiative, thinks relational databases will “embrace Hadoop as a way to compete at scale” [source: Information Week, 2013].

Five years ago, a company had to pay $100K per terabyte of data for a perpetual software license, plus $20K per annum for support and maintenance. Now the Hadoop technology enables businesses to store, manage, and analyze the same amount of data with a $1,000-per-year subscription [source: Information Week, 2013]. Hadoop will make Big Data storage and management pretty low-cost, which will inevitably lead to vendor market fragmentation.


Hadoop to become the de-facto standard for building enterprise-level big data applications

According to Hadoop creator Doug Cutting, “more and more types of workloads will be supported on top of Hadoop” in the future [source: DataNami, 2013]. Besides network monitoring, fraud prevention, and targeting or risk modeling, the Hadoop technology is expected to be used in eCommerce for processing purchasing transactions and to provide new tools for data aggregation, analysis, and interpretation. As such, it is likely to become an operating system kernel for any data-centric platform, i.e. the de-facto standard on which developers build their big data applications.

Christophe Bisciglia, Founder at Wibidata, believes this is still a couple of years away, but he already sees good opportunities for billion-dollar businesses selling Hadoop-based ERP and CRM solutions in the near future [source: Gigaom, 2013].

These and other Hadoop trends, which we will highlight in future blog posts, suggest that robust SQL capabilities integrated into the Hadoop infrastructure will expand the market for scalable big data management solutions and will gradually move Hadoop from the specialist domain toward becoming the default data platform for enterprise-level big data projects.

