
The Rise of Self-Service Business Intelligence

Smart businesses, not just big businesses, are harnessing Big Data. The rise of self-service business intelligence promises to make IT bottlenecks a thing of the past. See how your company can benefit.

Business intelligence (BI) is, in many ways, a new name for technology that has existed in some form for a long time. The number of data analytics companies in the marketplace has grown in recent years, particularly among firms that serve small-to-medium organizations. Enterprise-class solutions from Oracle, SAP, IBM, and others are also quite mature, with hundreds of releases and thousands of use cases behind them; they are stable, powerful, and expensive. Traditionally, BI implementation has been limited by two factors: the cost of equipment and the cost of labor.

Historically, equipment costs were the greatest barrier to implementing data analysis initiatives. In 1976, when Honeywell began offering its Multics Relational Data Store, the first commercial relational database, the price of random access memory (RAM) was roughly $49 million per gigabyte (GB). The cost of storage was lower, but still steep: roughly $7 million per GB. By the time SAP purchased its first server in 1979, memory cost roughly $6.7 million per GB, a price drop of over 86 percent in just three years, and storage costs had fallen around 14 percent to roughly $6 million per GB. In 2005, when Oracle acquired Siebel, the precursor to its Oracle Business Intelligence Enterprise Edition (OBIEE) software, the average cost of storage was $1.34 per GB, and one GB of RAM cost $189. Today, the average cost of RAM is $9 per GB and the average cost of storage is around $0.04 per GB. The hardware for data analysis has become a commodity; the constraint now is labor. Companies can afford the equipment to support a data analysis initiative, but they often can't find employees to perform the work.
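
The percentage declines cited above follow directly from the per-gigabyte figures. A quick back-of-the-envelope check in Python, using only the prices quoted in this paragraph:

```python
# Sanity check on the price drops cited above, using the
# per-gigabyte figures quoted in this paragraph.
def percent_drop(old_price, new_price):
    """Return the percentage decrease from old_price to new_price."""
    return (old_price - new_price) / old_price * 100

# RAM: $49M/GB in 1976 -> $6.7M/GB in 1979
print(f"RAM: {percent_drop(49_000_000, 6_700_000):.0f}% drop")      # ~86%

# Storage: $7M/GB in 1976 -> $6M/GB in 1979
print(f"Storage: {percent_drop(7_000_000, 6_000_000):.0f}% drop")   # ~14%
```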

The lack of skilled data analysts is now the biggest barrier to unlocking the value of Big Data. According to a recent study by McKinsey & Company, “the United States alone faces a shortage of 140,000 to 190,000 people with analytical expertise and 1.5 million managers and analysts with the skills to understand and make decisions based on the analysis of big data.” According to a recent Wall Street Journal article, “80% of new data scientist jobs created between 2010 and 2011 have not been filled.” While enrollments in statistical analysis courses are on the rise, the shortage isn't going away anytime soon: new graduates still need on-the-job training in addition to college coursework. Nor is this merely an on-paper shortage. You've likely experienced longer waits for reports generated by IT, long lead times on the integration of new software systems and data sources, and delays on anything else requiring specialized data skills. These waits can be directly attributed to a lack of skilled data analysts at a time when those skills are needed more than ever.

According to a 2012 report from IDC, all the data created, replicated, and consumed in 2012 totaled 2,837 exabytes, or 2.8 trillion GB. That figure is forecast to grow to 40,000 exabytes by 2020, roughly 5,200 GB for every person on earth. To glean insights from that data efficiently, organizations can try to hire an elusive data scientist at an above-market price (the median salary of a data scientist is $115,000 per year, according to Glassdoor), assemble a team of people whose combined skills approximate one data scientist, or adopt one of the new self-service business intelligence tools that let non-technical employees create their own reports.
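
The per-person figure is simple arithmetic. Here is a minimal sketch of the calculation, assuming a projected 2020 world population of about 7.7 billion (the population figure is an assumption, not from the IDC report):

```python
# Rough arithmetic behind the "5,200 GB per person" figure.
EXABYTE_IN_GB = 1_000_000_000      # 10^9 GB per exabyte (decimal units)

forecast_2020_eb = 40_000          # IDC forecast: 40,000 exabytes by 2020
population_2020 = 7_700_000_000    # assumed 2020 world population

gb_per_person = forecast_2020_eb * EXABYTE_IN_GB / population_2020
print(f"{gb_per_person:,.0f} GB per person")   # ~5,195 GB
```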

It would be a mistake to assume that 'self-service BI' tools eliminate the need for the data wizards in your IT department and allow any employee with modest skills to simply drag and drop data sources into a report builder or visualization engine. For such non-technical tools to run smoothly, links must first be created between them and traditional data infrastructure, such as data warehouses and legacy systems. That initial work, and the ongoing work of maintaining those connections, needs to be handled by a proficient coder. These tools are designed to reduce the day-to-day demand on IT resources for reporting and analysis, not to eliminate IT involvement entirely.
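
To make that division of labor concrete, here is a minimal sketch of the kind of one-time plumbing IT might handle: exposing a warehouse table to a self-service tool through a read-only account. It assumes a PostgreSQL-backed warehouse and Python with the SQLAlchemy library; the hostname, credentials, and table name are hypothetical placeholders:

```python
# Hypothetical example: wiring a self-service BI tool to a data
# warehouse via a read-only database account. All names are placeholders.
from sqlalchemy import create_engine, text

# Read-only credentials created for the BI tool (hypothetical DSN).
engine = create_engine(
    "postgresql://bi_readonly:secret@warehouse.example.com:5432/sales_dw"
)

# Smoke-test the connection before handing it to business users.
with engine.connect() as conn:
    row_count = conn.execute(text("SELECT COUNT(*) FROM fact_sales")).scalar()
    print(f"Rows visible to the BI tool: {row_count}")
```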

Once a technical expert has created the links between your self-service BI tool and your company's data resources, non-technical users can easily run their own reports and analyses. Sales managers can pull past sales data and examine it against industry reports to create their own forecasts. Marketers can analyze the results of their latest campaign, drilling down by traffic source, date, location, and more, and even generate elaborate interactive visualizations or dashboards for reporting to clients or company leadership. Call center managers can see which agents have the highest first-call resolution and why, making training and development truly data-driven.
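
As an illustration of the marketer's drill-down described above, here is a minimal sketch in Python with pandas; the CSV export and its column names (date, traffic_source, conversions) are hypothetical:

```python
# Illustrative only: the kind of drill-down a marketer might run once
# the data links exist. The file and column names are hypothetical.
import pandas as pd

campaign = pd.read_csv("campaign_results.csv")   # hypothetical export

# Total conversions by traffic source and month.
summary = (
    campaign
    .assign(month=pd.to_datetime(campaign["date"]).dt.to_period("M"))
    .groupby(["traffic_source", "month"])["conversions"]
    .sum()
    .reset_index()
)
print(summary)
```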

In short, self-service business intelligence tools don't eliminate the need for data scientists or your IT department. While they put powerful analytical capabilities in the hands of non-analysts, self-service BI platforms must still be integrated and maintained by experts. Even with that limitation, every company should at least explore the options: the competitive advantage gained by faster data insights can be significant.
