10 Pitfalls That Can Undermine Big Data Analytics' Potential

 
 
By Darryl K. Taft  |  Posted 2015-11-06
Due to shortcuts, questionable collection methods and a rush to be first, some organizations may not be gaining all they could through big data analytics.

The honeymoon phase of big data analytics is coming to an end. After the initial whirlwind romance with flashy demos and sleek visualization tools, businesses are coming to terms with the reality of implementation, and in many cases it isn't going well. Why? No specialty analytics tool can fix underlying problems with data quality and mismanagement. Because enterprise data environments are scattered, analytics teams have trouble accessing and analyzing data in a timely manner, particularly the "human" content of unstructured data. Enterprises need to unlock the value of this unstructured "people data" proactively to improve business decision-making and performance. eWEEK interviewed officials at ZL Technologies, a provider of information governance solutions, to get an update on the analytics landscape. "The mainstream analytics paradigm is broken," said Kon Leong, CEO and co-founder of ZL Technologies. "Tedious sampling and use of point tools undermine big data's potential." This eWEEK slide show examines some of the biggest pitfalls of today's analytics projects.

Data That Has Been Mismanaged

Big data does not necessarily mean managed data. Many organizations trying to implement analytics projects find that their data is scattered across the business, often in stand-alone systems. Data stuck in these silos is difficult to locate consistently, let alone pool, cleanse and leverage.
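
To make the "pool" step concrete, here is a minimal Python sketch that pulls per-silo exports into one table; the directory name, file layout and column mappings are illustrative assumptions, not details from the article:

    # Hypothetical sketch: pool per-department CSV exports into one table.
    from pathlib import Path
    import pandas as pd

    # Different silos often name the same field differently; normalize them.
    COLUMN_MAP = {"cust_id": "customer_id", "CustomerID": "customer_id"}

    frames = []
    for path in Path("exports").glob("*.csv"):    # one export per silo
        df = pd.read_csv(path).rename(columns=COLUMN_MAP)
        df["source_system"] = path.stem           # keep provenance for auditing
        frames.append(df)

    pooled = pd.concat(frames, ignore_index=True)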

Data That Is Too Dirty

Duplicate copies, corrupted files, missing information and data held beyond appropriate retention periods all converge to clutter data sets. A common situation, such as two related departments using two different file management tools, can make their data extremely difficult to combine and analyze.
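
A quick audit can quantify how dirty a data set is before anyone builds analysis on top of it. A minimal pandas sketch, assuming a customer table with a customer_id column (both hypothetical):

    # Minimal data-quality audit; file and column names are assumptions.
    import pandas as pd

    df = pd.read_csv("customers.csv")

    print("exact duplicate rows: ", df.duplicated().sum())
    print("duplicate customer IDs:", df["customer_id"].duplicated().sum())
    print("missing values per column:")
    print(df.isna().sum())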

Data Samples That Are Too Small

In an era of "big" data, small data can still be a problem. The amount of data in the world is far larger than the amount that is reasonably accessible; think of private social posts discussing a specific product, or records of infrequent internal business issues. Consider the realistic availability of data before committing to an analysis.
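
A rough margin-of-error check makes the risk concrete. This uses the standard normal-approximation formula for a proportion at 95 percent confidence (textbook statistics, not a method from the article):

    # Margin of error for an estimated proportion: z * sqrt(p*(1-p)/n).
    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        return z * math.sqrt(p * (1 - p) / n)

    print(margin_of_error(0.5, 100))    # ~0.098: +/- 10 points from 100 posts
    print(margin_of_error(0.5, 10000))  # ~0.010: +/- 1 point from 10,000 posts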

Data That Is Outdated

Most organizations have policies to dispose of data once it reaches the end of its useful life, but policies don't ensure consistent practice, and duplicates and silos often let outdated copies linger. Compounding the issue, the time required to collect and cleanse data means the resulting sample is nowhere near real time.
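
One defense is to enforce the retention window inside the analysis pipeline itself. A sketch, assuming each record carries a last_modified timestamp and a seven-year retention policy (both assumptions for illustration):

    # Drop records older than the retention window before analysis.
    import pandas as pd

    RETENTION = pd.Timedelta(days=7 * 365)

    df = pd.read_csv("records.csv", parse_dates=["last_modified"])
    cutoff = pd.Timestamp.now() - RETENTION
    current = df[df["last_modified"] >= cutoff]
    print(f"dropped {len(df) - len(current)} stale records")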

Data That Is Skewed

All the analytics in the world won't fix data that was gathered in a biased manner or that is naturally skewed by systematic business practices. Duplicate copies can make a single record count many times, creating a false sense of weight, and convenience samples are common, especially with customer data. Be cognizant of how data was obtained and what it actually represents.
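
How duplicates create that false weight is easy to demonstrate with a toy example (the numbers here are invented purely for illustration):

    # Toy example: duplicate rows drag an average toward one customer.
    import pandas as pd

    scores = pd.DataFrame({
        "customer_id":  [1, 2, 3, 3, 3, 3],   # customer 3 appears four times
        "satisfaction": [9, 8, 2, 2, 2, 2],
    })

    print(scores["satisfaction"].mean())      # ~4.17, skewed by duplicates
    deduped = scores.drop_duplicates(subset="customer_id")
    print(deduped["satisfaction"].mean())     # ~6.33, one vote per customer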

Data That Looks Too Good to Be True

Great visualization tools can make even the most skewed and dirty data sets look stunning, but don't be blinded by beauty. Upstream mismanagement can render trend visualizations deeply misleading: garbage in, garbage out, no matter how sleek the final product is. That matters because business leaders often take reports at face value.

Data That Reveals the Bad Stuff

Analytics can open Pandora's box. Because practically all enterprise data is potentially discoverable in litigation, problematic findings can later harm the business if they are not addressed by the proper internal teams. Trends pointing to employee harassment, compliance violations or brewing public outcry should be acknowledged and acted on quickly.

Data That Is Analyzed by the Wrong Stakeholders

In regulated industries, the law doesn't care who is analyzing data; if red-flag trends are found, the business is assumed to have "known" as an organization. If a pharmaceutical marketing team analyzes public social posts and finds widespread complaints about adverse effects, it must escalate those findings immediately to internal risk and compliance teams.

Data Used to Analyze the Wrong Things

Built-in algorithms are not one-size-fits-all; they require human judgment to be applied correctly. Email metrics might help indicate the performance of a salesperson but say little about a member of the development team. Results from analytics tools must be interpreted in context.

Big Data's Need for Speed

In the big data gold rush, firms are scrambling to extract insight before competitors do, and methodology is being left in the dust. Specialty analytics tools are proliferating, yet they often fail to deliver ROI because they run on poorly governed data. For long-term analytics success, it pays to slow down and focus on governance first.
 
