Across industries, companies use data analytics as part of their strategic outlook and their decision-making processes. “Listen to customers” and “innovate” have become the popular watchwords of the business world. Big data and the management of data assets offer many opportunities to improve and optimize every business area. According to IDC, a market intelligence and advisory firm, this market will grow more than fivefold, from $3.2 billion in 2010 to $16.9 billion by 2015. Analytics not only extracts valuable knowledge from data, but also uses those insights to recommend action or provide guidance. Beyond strategic insights, raising marketing IQ, and improving CRM (customer relationship management), data analytics tools can help solve business and operational problems, provided they deliver proven results and are cost effective.
How can you optimize customer satisfaction and users’ interaction with your products?
At SVForum’s Big Data and Analytics conference this week, Mike Gualtieri from Forrester clarified the meaning of big data and how it relates to business intelligence. He introduced the acronym SPA: Store, Process, and Access. According to Mr. Gualtieri, “Big data is the frontier of a firm’s ability to store, process, and access (SPA) all the data it needs to operate effectively, make decisions, reduce risks, and serve customers.”
He presented several critical indicators: “Can you capture and store the data? Can you cleanse, enrich, and analyze the data? Can you retrieve, search, integrate, and visualize the data?” He talked about the pragmatic approach to big data, which must acknowledge that “Exponential data growth makes it continuously difficult to manage — store, process, and access.” Data contains nonobvious information that companies can discover to improve business outcomes, although measures of data are relative to each business. Furthermore, Gualtieri said that a pragmatic definition must be actionable for both IT and business professionals.
Today, companies are flush with data but extract little insight from it. As data volumes grow, companies will increasingly need to prepare for that growth, define data governance, secure the data, and address privacy concerns. Managing and integrating data from a variety of sources is one of the key challenges.
Predictive analytics is the next trend: it employs various algorithms to support strategic decision-making and help with risk management. Predictive algorithms challenge IT because they bring constant disruption through ever-evolving applications and must be agile enough to adapt to changing predictive models. For example, Google Now detects users’ in-the-moment intent based on their behavior, such as search habits, and delivers information to them. Google Now is available for Google’s Android operating system.
It takes considerable automation to collect data across the entire business ecosystem. Traditional database management tools and processing applications are insufficient for the complex data sets that must be captured, aggregated, stored, searched, shared, analyzed, and visualized. Analytics relies on the concurrent application of statistical models, software, mathematical frameworks, and operations research to quantify performance; most solutions today also offer data visualization tools (dashboards, graphs, charts, etc.) to present insights.
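The capture–process–access flow described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s product: the sensor readings are invented, and the text printout stands in for the dashboards and charts that real visualization tools would provide.

```python
import statistics

# Capture/store stage: hypothetical sensor readings (illustrative values only).
readings = [102.0, 98.5, 101.2, 110.3, 97.8, 105.1, 99.4, 103.6]

# Process stage: apply simple statistical models to quantify performance.
summary = {
    "count": len(readings),
    "mean": statistics.mean(readings),
    "median": statistics.median(readings),
    "stdev": statistics.stdev(readings),
}

# Access stage: a minimal text "dashboard" in place of charting tools.
for metric, value in summary.items():
    print(f"{metric:>6}: {value:.2f}")
```

In a production pipeline, each stage would be a separate system (ingestion, warehouse, BI front end); the point here is only that analytics layers statistical models on top of stored data before presenting the result.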
Big data can also have societal implications: in the healthcare industry, for example, there is potential to advance the medical field and improve public health by collecting, studying, and analyzing volumes of disease data and symptom findings.
By applying analytics models and predictive algorithms, we can find uses that were not available before, such as alleviating transportation problems in urban areas while increasing safety. As cities become smarter, the amount of data a city produces grows exponentially. Information is constantly being generated by traffic lights, sensors, meters, computers, and more. Currently, that information is sent to different locations and organizations; however, utilities have successfully applied advanced analytics programs to add a level of intelligence and predictive value. For example, the city of Edmonton in Canada used the data generated by traffic and other sensors to reduce the number of traffic accidents. Edmonton deployed an incident management system, along with traffic congestion and prediction tools, to derive key insights and develop a plan of action. Pittsburgh, PA also analyzed the abundant data it received from all areas of the city to automate processes ranging from parking to bus scheduling and traffic lights.
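A city program like Edmonton’s rests on predictive models fitted to historical incident data. The sketch below, with invented monthly incident counts, shows the simplest possible such model: an ordinary least-squares trend line used to forecast the next month. Real systems would use far richer models and live sensor feeds.

```python
# Hypothetical monthly traffic-incident counts (illustrative values only).
incidents = [120, 115, 118, 110, 108, 104]

n = len(incidents)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(incidents) / n

# Ordinary least-squares slope and intercept for the trend line.
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, incidents)) \
        / sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

# Predict next month's incident count from the fitted trend.
forecast = intercept + slope * n
print(f"trend: {slope:.2f} incidents/month, forecast: {forecast:.1f}")
```

The negative slope here would indicate that interventions are working; a planner could compare the forecast against actual counts to judge whether the trend holds.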
Take the power utility sector: with the deployment of smart meters and growing commercial and residential customer demand, utilities are already flooded with big data from millions of smart meters, thousands of grid sensor systems, and other smart grid control devices. Capturing and collecting the data is one side of the challenge; making business sense of it all and extracting value that can guide future investments and focus areas is another. Many new big data tools, both web-based and mobile, are available today.
When one utility company collected and analyzed EV charging data to understand, as part of its business planning, how EV charging affects grid operations, it found a gap in the residential market: to assess the impact of home-based charging of plug-in electric vehicles on the grid, it had only smart meter data reported in watt-hours at fifteen-minute intervals. In this sector, therefore, real-time visibility is usually not attainable.
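To make the fifteen-minute granularity concrete, here is a minimal sketch of the kind of first-pass aggregation an analyst might run on such meter data. The readings are invented; the exercise simply rolls four fifteen-minute watt-hour readings into hourly kWh, the resolution at which a home EV charger’s load becomes visible.

```python
# Hypothetical smart-meter stream: one watt-hour reading per 15-minute
# interval (the granularity described above). Values are illustrative.
interval_wh = [400, 420, 410, 430,       # hour 0: EV charger idle
               1800, 1850, 1820, 1830]   # hour 1: EV charging under way

# Roll four 15-minute intervals into one hour and convert Wh -> kWh.
hourly_kwh = [
    sum(interval_wh[i:i + 4]) / 1000.0
    for i in range(0, len(interval_wh), 4)
]
print(hourly_kwh)
```

Even at this coarse resolution, the jump between the two hours makes the charging load obvious; what the interval data cannot show is the second-by-second behavior a true real-time feed would capture.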
Bringing science into the picture helps develop the various algorithms behind predictive analytics. Indeed, algorithm models are one of the key differentiators among analytics offerings.
Is ever more data the right solution? Big data can overwhelm the ability to actually use it. At what point does the data paralyze the decision makers? What is the right solution for your company?
When evaluating offerings, define data integration in the context of your business. Technical challenges include assessing the infrastructure needed (public cloud? your own infrastructure and networks?), data storage needs, the architecture (web-based? proprietary? integration and embeddability), features (plugins and more), processing response time, governance decisions (for example, how long will your business need the data?), data security, etc.
1. SVForum conference: Big Data Analytics – Between Now and 2020
According to IDC, the big data market will grow from $3.2 billion in 2010 to $16.9 billion by 2015. This growth represents a massive opportunity for startups and large IT vendors alike. The development of data science, more powerful analysis tools, virtualization, and falling costs are changing the face of data. The event covered the investment landscape and trends, business opportunities, and the enterprise approach. The conference was held at SAP in Palo Alto, CA.
2. Startup workshops, investor programs and pitching opportunities by SVForum: check http://svforum.org
3. IDC press release:
IDC Releases First Worldwide Big Data Technology and Services Market Forecast, Shows Big Data as the Next Essential Capability and a Foundation for the Intelligent Economy