
Why Big Data Projects Fail And How To Make 2017 Different

16 February, 2017


As companies strive to become fully data-driven in the digital era, big data projects still face significant challenges

Major shifts are happening in our ecosystem as companies strive to become fully data-driven in the digital era. Gathering insights from data has become increasingly complicated due to a tidal wave of enterprise applications generating data.

In addition, billions of users and trillions of connected “things” generate exponentially more data outside the enterprise. At the center, enterprises deploy cloud, mobile, and analytics technologies in the hope of turning that data into insight. Unfortunately, Gartner predicts that 60 percent of big data projects will fail in 2017: they won’t move beyond the piloting and experimentation phases and will eventually be abandoned.

Where is the disconnect for companies trying to link data assets to strategic value?

In my experience, the two main obstacles are a lack of skills or expertise, and a mismatch between the technology strategy and the company’s overall needs.

The expertise gap

When big data was in its infancy, the available technology was immature, and adopting it was a trial-by-fire experience. Companies with very deep pockets, such as Google, Yahoo and Facebook, had to build infrastructure from the ground up to handle these problems. Excited by their success, many enterprises have tried to emulate them with their own Hadoop-based big data projects.
